Unnamed: 0 int64 0 832k | id float64 2.49B 32.1B | type stringclasses 1 value | created_at stringlengths 19 19 | repo stringlengths 5 112 | repo_url stringlengths 34 141 | action stringclasses 3 values | title stringlengths 1 844 | labels stringlengths 4 721 | body stringlengths 1 261k | index stringclasses 12 values | text_combine stringlengths 96 261k | label stringclasses 2 values | text stringlengths 96 248k | binary_label int64 0 1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
278,350 | 24,147,549,564 | IssuesEvent | 2022-09-21 20:15:53 | mozilla-mobile/mobile-test-eng | https://api.github.com/repos/mozilla-mobile/mobile-test-eng | closed | [META] GCP Project Maintenance - Document and implement best practices | android infra:mobile infra:ui-test maintenance META | DESCRIPTION
We need to establish best practices for our GCP projects
STEPS
1. Investigate all best practices and document
e.g. key rotation, etc.
2. Create issues to implement enhancements
NOTES
- Create tooling script to convert a JSON key into a TC-formatted YAML key
- Remove all old keys
- Set (and document!) data retention threshold for Focus-android (and all projects)
- Revisit the 9-month data retention for fenix - would 3 or 6 months of Firebase data be enough?
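The key-conversion tooling mentioned in the notes could be sketched roughly as below. This is a hypothetical illustration: the Taskcluster (TC) secret layout, the `jsonKeyToTcYaml` name, and the secret name are assumptions, not a documented TC format.

```javascript
// Hypothetical sketch: wrap a GCP service-account JSON key into YAML
// suitable for pasting into a Taskcluster (TC) secret. The secret layout
// used here is an assumption, not a documented TC format.
function jsonKeyToTcYaml(keyJson, secretName = 'gcp-service-account') {
  const key = JSON.parse(keyJson);
  const lines = ['secret:', `  ${secretName}:`];
  for (const [field, value] of Object.entries(key)) {
    // JSON.stringify yields a double-quoted scalar that is also valid YAML,
    // so embedded newlines in private keys stay correctly escaped.
    lines.push(`    ${field}: ${JSON.stringify(value)}`);
  }
  return lines.join('\n') + '\n';
}

// Example with a fake, truncated service-account key:
const fakeKey = '{"type": "service_account", "project_id": "demo"}';
console.log(jsonKeyToTcYaml(fakeKey));
```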
| 1.0 | [META] GCP Project Maintenance - Document and implement best practices - DESCRIPTION
We need to establish best practices for our GCP projects
STEPS
1. Investigate all best practices and document
e.g. key rotation, etc.
2. Create issues to implement enhancements
NOTES
- Create tooling script to convert a JSON key into a TC-formatted YAML key
- Remove all old keys
- Set (and document!) data retention threshold for Focus-android (and all projects)
- Revisit the 9-month data retention for fenix - would 3 or 6 months of Firebase data be enough?
| non_priority | gcp project maintenance document and implement best practices description we need to establish best practices for our gcp projects steps investigate all best practices and document i e key rotation etc create issues to implement enhancements notes create tooling script to json key into a tc formatted yaml key remove all old keys set and document data retention threshold for focus android and all projects revisit mo data retention for fenix is mos of firebase data enough | 0 |
4,645 | 6,741,639,884 | IssuesEvent | 2017-10-20 02:17:21 | wesnoth/wesnoth | https://api.github.com/repos/wesnoth/wesnoth | opened | high contrast icon | enhancement Graphics linux Services | similar to #2109, if someone would create a high contrast icon it would be beneficial too. | 1.0 | high contrast icon - similar to #2109, if someone would create a high contrast icon it would be beneficial too. | non_priority | high contrast icon similar to if someone would create a high contrast icon it would be beneficial too | 0 |
76,648 | 15,496,159,429 | IssuesEvent | 2021-03-11 02:10:10 | jinuem/Shopping-Cart-POC | https://api.github.com/repos/jinuem/Shopping-Cart-POC | opened | CVE-2019-10747 (High) detected in set-value-0.4.3.tgz, set-value-2.0.0.tgz | security vulnerability | ## CVE-2019-10747 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>set-value-0.4.3.tgz</b>, <b>set-value-2.0.0.tgz</b></p></summary>
<p>
<details><summary><b>set-value-0.4.3.tgz</b></p></summary>
<p>Create nested values and any intermediaries using dot notation (`'a.b.c'`) paths.</p>
<p>Library home page: <a href="https://registry.npmjs.org/set-value/-/set-value-0.4.3.tgz">https://registry.npmjs.org/set-value/-/set-value-0.4.3.tgz</a></p>
<p>Path to dependency file: /Shopping-Cart-POC/rejsx/package.json</p>
<p>Path to vulnerable library: Shopping-Cart-POC/rejsx/node_modules/union-value/node_modules/set-value/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-2.1.5.tgz (Root Library)
  - fork-ts-checker-webpack-plugin-alt-0.4.14.tgz
    - micromatch-3.1.10.tgz
      - snapdragon-0.8.2.tgz
        - base-0.11.2.tgz
          - cache-base-1.0.1.tgz
            - union-value-1.0.0.tgz
              - :x: **set-value-0.4.3.tgz** (Vulnerable Library)
</details>
<details><summary><b>set-value-2.0.0.tgz</b></p></summary>
<p>Create nested values and any intermediaries using dot notation (`'a.b.c'`) paths.</p>
<p>Library home page: <a href="https://registry.npmjs.org/set-value/-/set-value-2.0.0.tgz">https://registry.npmjs.org/set-value/-/set-value-2.0.0.tgz</a></p>
<p>Path to dependency file: /Shopping-Cart-POC/rejsx/package.json</p>
<p>Path to vulnerable library: Shopping-Cart-POC/rejsx/node_modules/set-value/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-2.1.5.tgz (Root Library)
  - fork-ts-checker-webpack-plugin-alt-0.4.14.tgz
    - micromatch-3.1.10.tgz
      - snapdragon-0.8.2.tgz
        - base-0.11.2.tgz
          - cache-base-1.0.1.tgz
            - :x: **set-value-2.0.0.tgz** (Vulnerable Library)
</details>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
set-value is vulnerable to Prototype Pollution in versions lower than 3.0.1. The function mixin-deep could be tricked into adding or modifying properties of Object.prototype using any of the `constructor`, `prototype`, and `__proto__` payloads.
<p>Publish Date: 2019-08-23
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-10747>CVE-2019-10747</a></p>
</p>
</details>
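For context, the pollution class described above can be reproduced with a minimal deep-set function. This is an illustration of the bug class only, not set-value's actual source; `naiveSet` and `safeSet` are hypothetical names.

```javascript
// Minimal vulnerable deep-set illustrating the bug class (NOT set-value's code).
function naiveSet(obj, path, value) {
  const keys = path.split('.');
  let cur = obj;
  for (let i = 0; i < keys.length - 1; i++) {
    if (typeof cur[keys[i]] !== 'object' || cur[keys[i]] === null) {
      cur[keys[i]] = {};
    }
    cur = cur[keys[i]]; // '__proto__' walks into Object.prototype here
  }
  cur[keys[keys.length - 1]] = value;
  return obj;
}

// An attacker-controlled path reaches the prototype chain:
naiveSet({}, '__proto__.polluted', 'yes');
console.log({}.polluted); // "yes" -- every object now inherits the property

// A patched setter refuses to traverse the dangerous keys:
function safeSet(obj, path, value) {
  const BLOCKED = new Set(['__proto__', 'constructor', 'prototype']);
  const keys = path.split('.');
  if (keys.some(k => BLOCKED.has(k))) return obj;
  return naiveSet(obj, path, value);
}
safeSet({}, '__proto__.alsoPolluted', 'no');
console.log({}.alsoPolluted); // undefined
```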
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: None
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: High
  - Integrity Impact: High
  - Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
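The 9.8 above follows directly from the CVSS v3.0 base-score arithmetic for that metric vector; a sketch of the computation, with metric weights taken from the CVSS v3.0 specification:

```javascript
// CVSS v3.0 base score for AV:N / AC:L / PR:N / UI:N / S:U / C:H / I:H / A:H
// (metric weights from the CVSS v3.0 specification)
const AV = 0.85, AC = 0.77, PR = 0.85, UI = 0.85; // Network, Low, None, None
const C = 0.56, I = 0.56, A = 0.56;               // High / High / High

// CVSS "Round up" to one decimal place.
const roundup = x => Math.ceil(x * 10) / 10;

const iss = 1 - (1 - C) * (1 - I) * (1 - A); // impact sub-score base
const impact = 6.42 * iss;                   // Scope: Unchanged
const exploitability = 8.22 * AV * AC * PR * UI;
const base = impact <= 0 ? 0 : roundup(Math.min(impact + exploitability, 10));
console.log(base); // 9.8
```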
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/jonschlinkert/set-value/commit/95e9d9923f8a8b4a01da1ea138fcc39ec7b6b15f">https://github.com/jonschlinkert/set-value/commit/95e9d9923f8a8b4a01da1ea138fcc39ec7b6b15f</a></p>
<p>Release Date: 2019-07-24</p>
<p>Fix Resolution: 2.0.1,3.0.1</p>
</p>
</details>
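Since both vulnerable copies arrive transitively through react-scripts, one common remediation (an assumption here, not part of the WhiteSource suggestion) is a dependency override rather than waiting on upstream: the `overrides` field in package.json for npm >= 8.3, or the equivalent `resolutions` field under Yarn. Whether `^2.0.1` is API-compatible with both pinned ranges should be verified before adopting this.

```json
{
  "overrides": {
    "set-value": "^2.0.1"
  }
}
```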
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2019-10747 (High) detected in set-value-0.4.3.tgz, set-value-2.0.0.tgz - ## CVE-2019-10747 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>set-value-0.4.3.tgz</b>, <b>set-value-2.0.0.tgz</b></p></summary>
<p>
<details><summary><b>set-value-0.4.3.tgz</b></p></summary>
<p>Create nested values and any intermediaries using dot notation (`'a.b.c'`) paths.</p>
<p>Library home page: <a href="https://registry.npmjs.org/set-value/-/set-value-0.4.3.tgz">https://registry.npmjs.org/set-value/-/set-value-0.4.3.tgz</a></p>
<p>Path to dependency file: /Shopping-Cart-POC/rejsx/package.json</p>
<p>Path to vulnerable library: Shopping-Cart-POC/rejsx/node_modules/union-value/node_modules/set-value/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-2.1.5.tgz (Root Library)
  - fork-ts-checker-webpack-plugin-alt-0.4.14.tgz
    - micromatch-3.1.10.tgz
      - snapdragon-0.8.2.tgz
        - base-0.11.2.tgz
          - cache-base-1.0.1.tgz
            - union-value-1.0.0.tgz
              - :x: **set-value-0.4.3.tgz** (Vulnerable Library)
</details>
<details><summary><b>set-value-2.0.0.tgz</b></p></summary>
<p>Create nested values and any intermediaries using dot notation (`'a.b.c'`) paths.</p>
<p>Library home page: <a href="https://registry.npmjs.org/set-value/-/set-value-2.0.0.tgz">https://registry.npmjs.org/set-value/-/set-value-2.0.0.tgz</a></p>
<p>Path to dependency file: /Shopping-Cart-POC/rejsx/package.json</p>
<p>Path to vulnerable library: Shopping-Cart-POC/rejsx/node_modules/set-value/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-2.1.5.tgz (Root Library)
  - fork-ts-checker-webpack-plugin-alt-0.4.14.tgz
    - micromatch-3.1.10.tgz
      - snapdragon-0.8.2.tgz
        - base-0.11.2.tgz
          - cache-base-1.0.1.tgz
            - :x: **set-value-2.0.0.tgz** (Vulnerable Library)
</details>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
set-value is vulnerable to Prototype Pollution in versions lower than 3.0.1. The function mixin-deep could be tricked into adding or modifying properties of Object.prototype using any of the `constructor`, `prototype`, and `__proto__` payloads.
<p>Publish Date: 2019-08-23
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-10747>CVE-2019-10747</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: None
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: High
  - Integrity Impact: High
  - Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/jonschlinkert/set-value/commit/95e9d9923f8a8b4a01da1ea138fcc39ec7b6b15f">https://github.com/jonschlinkert/set-value/commit/95e9d9923f8a8b4a01da1ea138fcc39ec7b6b15f</a></p>
<p>Release Date: 2019-07-24</p>
<p>Fix Resolution: 2.0.1,3.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in set value tgz set value tgz cve high severity vulnerability vulnerable libraries set value tgz set value tgz set value tgz create nested values and any intermediaries using dot notation a b c paths library home page a href path to dependency file shopping cart poc rejsx package json path to vulnerable library shopping cart poc rejsx node modules union value node modules set value package json dependency hierarchy react scripts tgz root library fork ts checker webpack plugin alt tgz micromatch tgz snapdragon tgz base tgz cache base tgz union value tgz x set value tgz vulnerable library set value tgz create nested values and any intermediaries using dot notation a b c paths library home page a href path to dependency file shopping cart poc rejsx package json path to vulnerable library shopping cart poc rejsx node modules set value package json dependency hierarchy react scripts tgz root library fork ts checker webpack plugin alt tgz micromatch tgz snapdragon tgz base tgz cache base tgz x set value tgz vulnerable library vulnerability details set value is vulnerable to prototype pollution in versions lower than the function mixin deep could be tricked into adding or modifying properties of object prototype using any of the constructor prototype and proto payloads publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource | 0 |
104,864 | 9,011,638,006 | IssuesEvent | 2019-02-05 15:09:10 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | closed | teamcity: failed test: TestImportCSVStmt | C-test-failure O-robot | The following tests appear to have failed on master (testrace): TestImportCSVStmt/empty-file, TestImportCSVStmt/empty-with-files, TestImportCSVStmt, TestImportCSVStmt/schema-in-file-auto-gzip, TestImportCSVStmt/schema-in-file-implicit-gzip, TestImportCSVStmt/schema-in-query-transform-only, TestImportCSVStmt/schema-in-file-auto-decompress, TestImportCSVStmt/schema-in-query-opts, TestImportCSVStmt/schema-in-file-no-decompress, TestImportCSVStmt/schema-in-file-explicit-gzip, TestImportCSVStmt/schema-in-file-sstsize
You may want to check [for open issues](https://github.com/cockroachdb/cockroach/issues?q=is%3Aissue+is%3Aopen+TestImportCSVStmt).
[#1124489](https://teamcity.cockroachdb.com/viewLog.html?buildId=1124489):
```
TestImportCSVStmt/schema-in-query-transform-only
...ca_command.go:244 [n1,s1,r187/1:/Table/63/2/{NULL/4…-"Q"/33…}] initiating a split of this range at key /Table/63/2/"A"/2522 [r190] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 02:44:42.219889 16571 storage/replica_command.go:244 [n1,s1,r184/1:/Table/63{-/1/2945}] initiating a split of this range at key /Table/63/1/783 [r201] (manual)
I190205 02:44:42.226917 16571 storage/replica_command.go:244 [n1,s1,r201/1:/Table/63/1/{783-2945}] initiating a split of this range at key /Table/63/1/1538 [r202] (manual)
I190205 02:44:42.234432 16571 storage/replica_command.go:244 [n1,s1,r202/1:/Table/63/1/{1538-2945}] initiating a split of this range at key /Table/63/1/1734 [r203] (manual)
I190205 02:44:42.239344 16568 storage/replica_command.go:244 [n1,s1,r187/1:/Table/63/2/{NULL/4…-"Q"/33…}] initiating a split of this range at key /Table/63/2/"A"/2522 [r204] (manual)
I190205 02:44:42.248291 16568 storage/replica_command.go:244 [n1,s1,r204/1:/Table/63/2/"{A"/2522-Q"/3319}] initiating a split of this range at key /Table/63/2/"F"/1124 [r205] (manual)
I190205 02:44:42.249538 16571 storage/replica_command.go:244 [n1,s1,r203/1:/Table/63/1/{1734-2945}] initiating a split of this range at key /Table/63/1/2190 [r206] (manual)
I190205 02:44:42.299483 16568 storage/replica_command.go:244 [n1,s1,r205/1:/Table/63/2/"{F"/1124-Q"/3319}] initiating a split of this range at key /Table/63/2/"J"/3494 [r207] (manual)
I190205 02:44:42.315691 16568 storage/replica_command.go:244 [n1,s1,r207/1:/Table/63/2/"{J"/3494-Q"/3319}] initiating a split of this range at key /Table/63/2/"L"/4223 [r208] (manual)
I190205 02:44:43.280904 8065 server/status/runtime.go:464 [n1] runtime stats: 208 MiB RSS, 642 goroutines, 44 MiB/58 MiB/137 MiB GO alloc/idle/total, 61 MiB/86 MiB CGO alloc/total, 0.0 CGO/sec, 0.0/0.0 %(u/s)time, 0.0 %gc (96x), 58 MiB/58 MiB (r/w)net
W190205 02:44:43.369682 8067 server/node.go:869 [n1,summaries] health alerts detected: {Alerts:[{StoreID:1 Category:METRICS Description:queue.replicate.process.failure Value:17 XXX_NoUnkeyedLiteral:{} XXX_sizecache:0}] XXX_NoUnkeyedLiteral:{} XXX_sizecache:0}
I190205 02:44:43.535308 8351 server/status/runtime.go:464 [n2] runtime stats: 209 MiB RSS, 642 goroutines, 52 MiB/51 MiB/137 MiB GO alloc/idle/total, 61 MiB/87 MiB CGO alloc/total, 0.0 CGO/sec, 0.0/0.0 %(u/s)time, 0.0 %gc (96x), 58 MiB/58 MiB (r/w)net
I190205 02:44:43.624748 8596 server/status/runtime.go:464 [n3] runtime stats: 215 MiB RSS, 643 goroutines, 0 B/0 B/0 B GO alloc/idle/total, 61 MiB/87 MiB CGO alloc/total, 0.0 CGO/sec, 0.0/0.0 %(u/s)time, 0.0 %gc (96x), 58 MiB/58 MiB (r/w)net
I190205 02:44:44.086973 16565 storage/replica_command.go:244 [n1,s1,r195/1:/{Table/63/3/6…-Max}] initiating a split of this range at key /Table/63/3/1392/!NULL [r199] (manual); delayed split for 2.0s to avoid Raft snapshot
I190205 02:44:44.094590 16565 storage/replica_command.go:244 [n1,s1,r199/1:/{Table/63/3/1…-Max}] initiating a split of this range at key /Table/63/3/4902/"O"/PrefixEnd [r209] (manual)
I190205 02:44:44.102980 16566 storage/replica_command.go:244 [n1,s1,r199/1:/{Table/63/3/1…-Max}] initiating a split of this range at key /Table/63/3/2094/"O"/PrefixEnd [r210] (manual)
I190205 02:44:44.183923 16566 storage/replica_command.go:244 [n1,s1,r199/1:/Table/63/3/{1392/!…-4902/"…}] initiating a split of this range at key /Table/63/3/2094/"O"/PrefixEnd [r211] (manual)
I190205 02:44:44.191541 16566 storage/replica_command.go:244 [n1,s1,r211/1:/Table/63/3/{2094/"…-4902/"…}] initiating a split of this range at key /Table/63/3/2796/!NULL [r212] (manual)
I190205 02:44:44.208181 16566 storage/replica_command.go:244 [n1,s1,r212/1:/Table/63/3/{2796/!…-4902/"…}] initiating a split of this range at key /Table/63/3/3498/"O"/PrefixEnd [r213] (manual)
I190205 02:44:44.223863 16566 storage/replica_command.go:244 [n1,s1,r213/1:/Table/63/3/{3498/"…-4902/"…}] initiating a split of this range at key /Table/63/3/4200/!NULL [r214] (manual)
TestImportCSVStmt/schema-in-file-implicit-gzip
...removing replica r175/1
I190205 06:17:27.727535 41669 storage/replica_command.go:244 [n1,s1,r395/1:/{Table/76/3/4…-Max}] initiating a split of this range at key /Table/77 [r396] (manual)
I190205 06:17:27.780961 9893 storage/store.go:2669 [n2,s2,r348/2:/Table/74/3/{2012/"…-3346/"…}] removing replica r351/2
I190205 06:17:27.834434 9623 storage/store.go:2669 [n1,s1,r348/1:/Table/74/3/{2012/"…-3346/"…}] removing replica r351/1
I190205 06:17:27.848176 41565 storage/replica_command.go:383 [n3,merge,s3,r346/3:/Table/74/3/{821/"P…-2012/"…}] initiating a merge of r348:/Table/74/3/{2012/"K"/PrefixEnd-4013/"J"/PrefixEnd} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=3] into this range (lhs+rhs has (size=33 KiB+36 KiB qps=6.86+4.45 --> 11.31qps) below threshold (size=69 KiB, qps=11.31))
I190205 06:17:27.877922 10067 storage/store.go:2669 [n3,s3,r348/3:/Table/74/3/{2012/"…-3346/"…}] removing replica r351/3
I190205 06:17:27.902864 9642 storage/store.go:2669 [n1,s1,r342/1:/Table/74/2/"{A"/4004-E"/1357}] removing replica r336/1
I190205 06:17:27.906450 10063 storage/store.go:2669 [n3,s3,r342/3:/Table/74/2/"{A"/4004-E"/1357}] removing replica r336/3
I190205 06:17:27.938244 9879 storage/store.go:2669 [n2,s2,r342/2:/Table/74/2/"{A"/4004-E"/1357}] removing replica r336/2
I190205 06:17:28.013754 41645 storage/replica_command.go:383 [n2,merge,s2,r348/2:/Table/74/3/{2012/"…-4013/"…}] initiating a merge of r352:/Table/7{4/3/4013/"J"/PrefixEnd-6/1/741} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=2] into this range (lhs+rhs has (size=55 KiB+46 KiB qps=4.45+0.00 --> 4.45qps) below threshold (size=101 KiB, qps=4.45))
I190205 06:17:28.177833 41675 storage/replica_command.go:383 [n1,merge,s1,r342/1:/Table/74/2/"{A"/4004-H"/3726}] initiating a merge of r340:/Table/74/2/"{H"/3726-R"/980} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=3] into this range (lhs+rhs has (size=37 KiB+50 KiB qps=4.44+0.00 --> 4.44qps) below threshold (size=86 KiB, qps=4.44))
I190205 06:17:28.711245 9629 storage/store.go:2669 [n1,s1,r342/1:/Table/74/2/"{A"/4004-H"/3726}] removing replica r340/1
I190205 06:17:28.720885 9881 storage/store.go:2669 [n2,s2,r342/2:/Table/74/2/"{A"/4004-H"/3726}] removing replica r340/2
I190205 06:17:28.779575 10182 storage/store.go:2669 [n3,s3,r342/3:/Table/74/2/"{A"/4004-H"/3726}] removing replica r340/3
I190205 06:17:29.141754 41776 storage/replica_command.go:383 [n1,merge,s1,r347/1:/Table/74/{1/3119-2/"A"/4…}] initiating a merge of r342:/Table/74/2/"{A"/4004-R"/980} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=2] into this range (lhs+rhs has (size=54 KiB+86 KiB qps=0.00+8.02 --> 8.02qps) below threshold (size=140 KiB, qps=8.02))
I190205 06:17:29.146539 10201 storage/store.go:2669 [n3,s3,r348/3:/Table/74/3/{2012/"…-4013/"…}] removing replica r352/3
I190205 06:17:29.151097 9933 storage/store.go:2669 [n2,s2,r348/2:/Table/74/3/{2012/"…-4013/"…}] removing replica r352/2
I190205 06:17:29.158310 9612 storage/store.go:2669 [n1,s1,r348/1:/Table/74/3/{2012/"…-4013/"…}] removing replica r352/1
I190205 06:17:29.529919 9901 storage/store.go:2669 [n2,s2,r347/2:/Table/74/{1/3119-2/"A"/4…}] removing replica r342/2
I190205 06:17:29.540478 9641 storage/store.go:2669 [n1,s1,r347/1:/Table/74/{1/3119-2/"A"/4…}] removing replica r342/1
I190205 06:17:29.559750 10197 storage/store.go:2669 [n3,s3,r347/3:/Table/74/{1/3119-2/"A"/4…}] removing replica r342/3
import_stmt_test.go:1180: job 12 did not match:
Description: "IMPORT TABLE csv12.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:////csv/data-0.gz', 'nodelocal:////csv/data-1.gz', 'nodelocal:////csv/data-2.gz', 'nodelocal:////csv/data-3.gz', 'nodelocal:////csv/data-4.gz')" != "IMPORT TABLE csv8.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///csv/data-0', 'nodelocal:///csv/data-1', 'nodelocal:///csv/data-2', 'nodelocal:///csv/data-3', 'nodelocal:///csv/data-4') WITH decompress = 'auto'"
TestImportCSVStmt/schema-in-file-explicit-gzip
...key /Table/73/1/2171 [r301] (manual)
I190205 02:44:48.365998 22181 storage/replica_command.go:244 [n1,s1,r301/1:/{Table/73/1/2…-Max}] initiating a split of this range at key /Table/73/2/"E"/1955 [r302] (manual)
I190205 02:44:48.376512 22165 storage/replica_command.go:244 [n1,s1,r301/1:/{Table/73/1/2…-Max}] initiating a split of this range at key /Table/73/1/2886 [r303] (manual)
I190205 02:44:48.404602 22141 storage/replica_command.go:244 [n1,s1,r302/1:/{Table/73/2/"…-Max}] initiating a split of this range at key /Table/73/2/"H"/4324 [r304] (manual)
I190205 02:44:48.420369 22169 storage/replica_command.go:244 [n1,s1,r304/1:/{Table/73/2/"…-Max}] initiating a split of this range at key /Table/73/2/"L"/1780 [r305] (manual)
I190205 02:44:48.432983 22232 storage/replica_command.go:244 [n1,s1,r305/1:/{Table/73/2/"…-Max}] initiating a split of this range at key /Table/73/2/"O"/4227 [r306] (manual)
I190205 02:44:48.437248 22157 storage/replica_command.go:244 [n2,s2,r301/2:/Table/73/{1/2171-2/"E"/1…}] initiating a split of this range at key /Table/73/1/2886 [r243] (manual)
I190205 02:44:48.457784 22217 storage/replica_command.go:244 [n1,s1,r306/1:/{Table/73/2/"…-Max}] initiating a split of this range at key /Table/73/2/"S"/1344 [r307] (manual)
I190205 02:44:48.458206 22243 storage/replica_command.go:244 [n2,s2,r243/2:/Table/73/{1/2886-2/"E"/1…}] initiating a split of this range at key /Table/73/1/2990 [r244] (manual)
I190205 02:44:48.501713 22266 storage/replica_command.go:244 [n1,s1,r307/1:/{Table/73/2/"…-Max}] initiating a split of this range at key /Table/73/2/"V"/3766 [r308] (manual)
I190205 02:44:48.515978 22372 storage/replica_command.go:244 [n1,s1,r308/1:/{Table/73/2/"…-Max}] initiating a split of this range at key /Table/73/2/"Z"/1248 [r309] (manual)
I190205 02:44:48.531405 22394 storage/replica_command.go:244 [n1,s1,r309/1:/{Table/73/2/"…-Max}] initiating a split of this range at key /Table/73/3/547/"B"/PrefixEnd [r310] (manual)
I190205 02:44:48.540569 22381 storage/replica_command.go:244 [n2,s2,r244/2:/Table/73/{1/2990-2/"E"/1…}] initiating a split of this range at key /Table/73/1/3705 [r245] (manual)
I190205 02:44:48.549296 22399 storage/replica_command.go:244 [n1,s1,r310/1:/{Table/73/3/5…-Max}] initiating a split of this range at key /Table/73/3/1214/"S"/PrefixEnd [r321] (manual)
I190205 02:44:48.561372 22369 storage/replica_command.go:244 [n2,s2,r245/2:/Table/73/{1/3705-2/"E"/1…}] initiating a split of this range at key /Table/73/1/4420 [r246] (manual)
I190205 02:44:48.599985 22533 storage/replica_command.go:244 [n2,s2,r246/2:/Table/73/{1/4420-2/"E"/1…}] initiating a split of this range at key /Table/73/2/"A"/3277 [r247] (manual)
I190205 02:44:48.619614 22483 storage/replica_command.go:244 [n2,s2,r247/2:/Table/73/2/"{A"/3277-E"/1955}] initiating a split of this range at key /Table/73/2/"A"/4602 [r248] (manual)
I190205 02:44:48.651991 22495 storage/replica_command.go:244 [n1,s1,r321/1:/{Table/73/3/1…-Max}] initiating a split of this range at key /Table/73/3/3020/"E"/PrefixEnd [r322] (manual)
I190205 02:44:48.666364 22495 storage/replica_command.go:244 [n1,s1,r322/1:/{Table/73/3/3…-Max}] initiating a split of this range at key /Table/73/3/3687/"V"/PrefixEnd [r323] (manual)
I190205 02:44:48.680834 22495 storage/replica_command.go:244 [n1,s1,r323/1:/{Table/73/3/3…-Max}] initiating a split of this range at key /Table/73/3/4354/"M"/PrefixEnd [r324] (manual)
I190205 02:44:48.700074 22495 storage/replica_command.go:244 [n1,s1,r324/1:/{Table/73/3/4…-Max}] initiating a split of this range at key /Table/74 [r325] (manual)
I190205 02:44:48.743623 22603 storage/replica_command.go:244 [n2,s2,r321/2:/Table/73/3/{1214/"…-3020/"…}] initiating a split of this range at key /Table/73/3/1881/"J"/PrefixEnd [r249] (manual)
I190205 02:44:48.787819 22610 storage/replica_command.go:244 [n2,s2,r249/2:/Table/73/3/{1881/"…-3020/"…}] initiating a split of this range at key /Table/73/3/2354/"O" [r250] (manual)
TestImportCSVStmt/schema-in-file-sstsize
...90205 02:44:41.059222 14944 storage/replica_command.go:244 [n1,s1,r160/1:/{Table/61/2/"…-Max}] initiating a split of this range at key /Table/61/2/"X"/543 [r161] (manual)
I190205 02:44:41.069685 15014 storage/replica_command.go:244 [n1,s1,r152/1:/Table/61/2/"{G"/3152-P"/3655}] initiating a split of this range at key /Table/61/2/"K"/557 [r162] (manual)
I190205 02:44:41.070503 14990 storage/replica_command.go:244 [n1,s1,r152/1:/Table/61/2/"{G"/3152-P"/3655}] initiating a split of this range at key /Table/61/2/"N"/3263 [r163] (manual)
I190205 02:44:41.083949 14993 storage/replica_command.go:244 [n1,s1,r162/1:/Table/61/2/"{K"/557-P"/3655}] initiating a split of this range at key /Table/61/2/"N"/299 [r164] (manual)
I190205 02:44:41.086502 15035 storage/replica_command.go:244 [n1,s1,r161/1:/{Table/61/2/"…-Max}] initiating a split of this range at key /Table/61/3/1602/"Q" [r165] (manual)
I190205 02:44:41.125360 14990 storage/replica_command.go:244 [n1,s1,r164/1:/Table/61/2/"{N"/299-P"/3655}] initiating a split of this range at key /Table/61/2/"N"/3263 [r166] (manual)
I190205 02:44:41.127459 15025 storage/replica_command.go:244 [n1,s1,r157/1:/Table/61/2/"{P"/3655-V"/4338}] initiating a split of this range at key /Table/61/2/"R"/4515 [r167] (manual)
I190205 02:44:41.158751 15154 storage/replica_command.go:244 [n1,s1,r165/1:/{Table/61/3/1…-Max}] initiating a split of this range at key /Table/61/3/2689/"L" [r168] (manual)
I190205 02:44:41.161988 15222 storage/replica_command.go:244 [n1,s1,r167/1:/Table/61/2/"{R"/4515-V"/4338}] initiating a split of this range at key /Table/61/2/"S"/1916 [r169] (manual)
I190205 02:44:41.209103 15246 storage/replica_command.go:244 [n1,s1,r161/1:/Table/61/{2/"X"/5…-3/1602/…}] initiating a split of this range at key /Table/61/3/1275/"B" [r170] (manual)
I190205 02:44:41.258801 15365 storage/replica_command.go:244 [n1,s1,r168/1:/{Table/61/3/2…-Max}] initiating a split of this range at key /Table/61/3/3745/"B" [r171] (manual)
I190205 02:44:41.279624 15218 storage/replica_command.go:244 [n1,s1,r165/1:/Table/61/3/{1602/"…-2689/"…}] initiating a split of this range at key /Table/61/3/2128/"W" [r172] (manual)
I190205 02:44:41.303325 15417 storage/replica_command.go:244 [n1,s1,r161/1:/Table/61/{2/"X"/5…-3/1275/…}] initiating a split of this range at key /Table/61/3/128/"Y"/PrefixEnd [r173] (manual)
I190205 02:44:41.308783 15447 storage/replica_command.go:244 [n1,s1,r168/1:/Table/61/3/{2689/"…-3745/"…}] initiating a split of this range at key /Table/61/3/3349/"V" [r174] (manual)
I190205 02:44:41.343164 15493 storage/replica_command.go:244 [n1,s1,r173/1:/Table/61/3/12{8/"Y"…-75/"B"}] initiating a split of this range at key /Table/61/3/804/"Y"/PrefixEnd [r175] (manual)
I190205 02:44:41.358217 15556 storage/replica_command.go:244 [n1,s1,r175/1:/Table/61/3/{804/"Y…-1275/"…}] initiating a split of this range at key /Table/61/3/859/"B" [r176] (manual)
I190205 02:44:41.362697 15542 storage/replica_command.go:244 [n1,s1,r171/1:/{Table/61/3/3…-Max}] initiating a split of this range at key /Table/61/3/4534/"K" [r177] (manual)
I190205 02:44:41.366433 15469 storage/replica_command.go:244 [n1,s1,r171/1:/{Table/61/3/3…-Max}] initiating a split of this range at key /Table/62 [r178] (manual)
I190205 02:44:41.391946 15564 storage/replica_command.go:244 [n1,s1,r165/1:/Table/61/3/{1602/"…-2128/"…}] initiating a split of this range at key /Table/61/3/1994/"S" [r179] (manual)
I190205 02:44:41.425975 15597 storage/replica_command.go:244 [n1,s1,r177/1:/{Table/61/3/4…-Max}] initiating a split of this range at key /Table/62 [r180] (manual)
I190205 02:44:41.433226 15485 storage/replica_command.go:244 [n1,s1,r168/1:/Table/61/3/{2689/"…-3349/"…}] initiating a split of this range at key /Table/61/3/2894/"I" [r181] (manual)
I190205 02:44:41.505441 15766 storage/replica_command.go:244 [n1,s1,r171/1:/Table/61/3/{3745/"…-4534/"…}] initiating a split of this range at key /Table/61/3/4360/"S" [r182] (manual)
TestImportCSVStmt/schema-in-file-auto-gzip
...paction for range /Table/70/1/3601 - /Table/72 that contains live data
I190205 06:16:56.107860 37907 storage/replica_command.go:244 [n1,s1,r344/1:/Table/74/1/{2886-3834}] initiating a split of this range at key /Table/74/1/3119 [r347] (manual)
I190205 06:16:56.138421 38184 storage/replica_command.go:244 [n1,s1,r345/1:/{Table/74/2/"…-Max}] initiating a split of this range at key /Table/74/3/821/"P"/PrefixEnd [r346] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 06:16:56.701051 38298 storage/replica_command.go:244 [n1,s1,r346/1:/{Table/74/3/8…-Max}] initiating a split of this range at key /Table/74/3/2012/"K"/PrefixEnd [r348] (manual)
I190205 06:16:56.920504 38328 storage/replica_command.go:244 [n3,s3,r345/3:/Table/74/{2/"N"/3…-3/821/"…}] initiating a split of this range at key /Table/74/2/"R"/980 [r174] (manual)
I190205 06:16:57.119035 38347 storage/replica_command.go:244 [n1,s1,r346/1:/{Table/74/3/8…-Max}] initiating a split of this range at key /Table/74/3/1346/"U" [r349] (manual)
I190205 06:16:57.454460 38322 storage/replica_command.go:244 [n1,s1,r348/1:/{Table/74/3/2…-Max}] initiating a split of this range at key /Table/74/3/2679/"B"/PrefixEnd [r350] (manual)
I190205 06:16:57.562145 38321 storage/replica_command.go:244 [n3,s3,r346/3:/Table/74/3/{821/"P…-2012/"…}] initiating a split of this range at key /Table/74/3/1346/"U" [r175] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 06:16:57.793672 38459 storage/replica_command.go:244 [n3,s3,r174/3:/Table/74/{2/"R"/9…-3/821/"…}] initiating a split of this range at key /Table/74/2/"U"/3427 [r176] (manual)
I190205 06:16:58.115230 38518 storage/replica_command.go:244 [n1,s1,r350/1:/{Table/74/3/2…-Max}] initiating a split of this range at key /Table/74/3/3346/"S"/PrefixEnd [r351] (manual)
I190205 06:16:58.919268 38480 storage/replica_command.go:244 [n1,s1,r351/1:/{Table/74/3/3…-Max}] initiating a split of this range at key /Table/74/3/4013/"J"/PrefixEnd [r352] (manual)
I190205 06:16:58.973432 38588 storage/replica_command.go:244 [n3,s3,r176/3:/Table/74/{2/"U"/3…-3/821/"…}] initiating a split of this range at key /Table/74/2/"Y"/909 [r177] (manual)
I190205 06:16:59.555441 38593 storage/replica_command.go:244 [n1,s1,r352/1:/{Table/74/3/4…-Max}] initiating a split of this range at key /Table/74/3/4680/"A"/PrefixEnd [r353] (manual)
I190205 06:16:59.845337 38627 storage/replica_command.go:244 [n3,s3,r177/3:/Table/74/{2/"Y"/9…-3/821/"…}] initiating a split of this range at key /Table/74/3/148/"S" [r178] (manual)
E190205 06:17:00.002268 38647 storage/queue.go:826 [n1,replicate,s1,r352/1:/Table/74/3/4{013/"J…-680/"A…}] [n1,s1,r352/1:/Table/74/3/4{013/"J…-680/"A…}]: unable to transfer lease to s3: [NotLeaseHolderError] r352: replica (n1,s1):1 not lease holder; current lease is repl=(n2,s2):2 seq=4 start=1549347419.980479043,0 epo=1 pro=1549347419.980503825,0
I190205 06:17:00.245105 9706 server/status/runtime.go:464 [n1] runtime stats: 1.4 GiB RSS, 670 goroutines, 92 MiB/18 MiB/137 MiB GO alloc/idle/total, 284 MiB/325 MiB CGO alloc/total, 10676.3 CGO/sec, 163.9/16.5 %(u/s)time, 1.5 %gc (10x), 3.7 MiB/3.7 MiB (r/w)net
I190205 06:17:00.335597 38701 storage/replica_command.go:244 [n1,s1,r353/1:/{Table/74/3/4…-Max}] initiating a split of this range at key /Table/75 [r354] (manual); delayed split for 0.2s to avoid Raft snapshot
import_stmt_test.go:1180: job 11 did not match:
Description: "IMPORT TABLE csv11.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:////csv/data-0.gz', 'nodelocal:////csv/data-1.gz', 'nodelocal:////csv/data-2.gz', 'nodelocal:////csv/data-3.gz', 'nodelocal:////csv/data-4.gz') WITH decompress = 'auto'" != "IMPORT TABLE csv7.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///empty.csv', 'nodelocal:///csv/data-0', 'nodelocal:///csv/data-1', 'nodelocal:///csv/data-2', 'nodelocal:///csv/data-3', 'nodelocal:///csv/data-4')"
TestImportCSVStmt
...e/replica_command.go:798 [n1,replicate,s1,r16/1:/Table/2{0-1}] change replicas (ADD_REPLICA (n2,s2):3): read existing descriptor r16:/Table/2{0-1} [(n1,s1):1, (n3,s3):2, next=3, gen=0]
I190205 06:12:27.853010 9443 storage/replica_raft.go:372 [n1,s1,r16/1:/Table/2{0-1}] proposing ADD_REPLICA((n2,s2):3): updated=[(n1,s1):1 (n3,s3):2 (n2,s2):3] next=4
I190205 06:12:27.896422 9443 storage/store_snapshot.go:762 [n1,replicate,s1,r2/1:/System/NodeLiveness{-Max}] sending preemptive snapshot b4a997e1 at applied index 23
I190205 06:12:27.897529 9443 storage/store_snapshot.go:805 [n1,replicate,s1,r2/1:/System/NodeLiveness{-Max}] streamed snapshot to (n3,s3):?: kv pairs: 13, log entries: 13, rate-limit: 8.0 MiB/sec, 0.01s
I190205 06:12:27.902561 10749 storage/replica_raftstorage.go:805 [n3,s3,r2/?:{-}] applying preemptive snapshot at index 23 (id=b4a997e1, encoded size=2786, 1 rocksdb batches, 13 log entries)
I190205 06:12:27.906412 10749 storage/replica_raftstorage.go:811 [n3,s3,r2/?:/System/NodeLiveness{-Max}] applied preemptive snapshot in 4ms [clear=0ms batch=0ms entries=2ms commit=0ms]
I190205 06:12:27.909283 9443 storage/replica_command.go:798 [n1,replicate,s1,r2/1:/System/NodeLiveness{-Max}] change replicas (ADD_REPLICA (n3,s3):3): read existing descriptor r2:/System/NodeLiveness{-Max} [(n1,s1):1, (n2,s2):2, next=3, gen=0]
I190205 06:12:27.999034 9443 storage/replica_raft.go:372 [n1,s1,r2/1:/System/NodeLiveness{-Max}] proposing ADD_REPLICA((n3,s3):3): updated=[(n1,s1):1 (n2,s2):2 (n3,s3):3] next=4
I190205 06:12:28.014223 9443 storage/store_snapshot.go:762 [n1,replicate,s1,r3/1:/System/{NodeLive…-tsd}] sending preemptive snapshot 467b39e0 at applied index 41
I190205 06:12:28.053716 9443 storage/store_snapshot.go:805 [n1,replicate,s1,r3/1:/System/{NodeLive…-tsd}] streamed snapshot to (n2,s2):?: kv pairs: 35, log entries: 5, rate-limit: 8.0 MiB/sec, 0.05s
I190205 06:12:28.071331 10732 storage/replica_raftstorage.go:805 [n2,s2,r3/?:{-}] applying preemptive snapshot at index 41 (id=467b39e0, encoded size=85233, 1 rocksdb batches, 5 log entries)
I190205 06:12:28.096342 10732 storage/replica_raftstorage.go:811 [n2,s2,r3/?:/System/{NodeLive…-tsd}] applied preemptive snapshot in 25ms [clear=0ms batch=0ms entries=22ms commit=1ms]
I190205 06:12:28.099642 9443 storage/replica_command.go:798 [n1,replicate,s1,r3/1:/System/{NodeLive…-tsd}] change replicas (ADD_REPLICA (n2,s2):3): read existing descriptor r3:/System/{NodeLivenessMax-tsd} [(n1,s1):1, (n3,s3):2, next=3, gen=0]
I190205 06:12:28.190357 9443 storage/replica_raft.go:372 [n1,s1,r3/1:/System/{NodeLive…-tsd}] proposing ADD_REPLICA((n2,s2):3): updated=[(n1,s1):1 (n3,s3):2 (n2,s2):3] next=4
I190205 06:12:28.316951 9443 storage/store_snapshot.go:762 [n1,replicate,s1,r13/1:/Table/1{7-8}] sending preemptive snapshot 5226e91d at applied index 16
I190205 06:12:28.317964 9443 storage/store_snapshot.go:805 [n1,replicate,s1,r13/1:/Table/1{7-8}] streamed snapshot to (n3,s3):?: kv pairs: 7, log entries: 6, rate-limit: 8.0 MiB/sec, 0.03s
I190205 06:12:28.319889 10674 storage/replica_raftstorage.go:805 [n3,s3,r13/?:{-}] applying preemptive snapshot at index 16 (id=5226e91d, encoded size=1255, 1 rocksdb batches, 6 log entries)
I190205 06:12:28.322027 10674 storage/replica_raftstorage.go:811 [n3,s3,r13/?:/Table/1{7-8}] applied preemptive snapshot in 2ms [clear=0ms batch=0ms entries=1ms commit=0ms]
I190205 06:12:28.326302 9443 storage/replica_command.go:798 [n1,replicate,s1,r13/1:/Table/1{7-8}] change replicas (ADD_REPLICA (n3,s3):3): read existing descriptor r13:/Table/1{7-8} [(n1,s1):1, (n2,s2):2, next=3, gen=0]
I190205 06:12:28.460757 9443 storage/replica_raft.go:372 [n1,s1,r13/1:/Table/1{7-8}] proposing ADD_REPLICA((n3,s3):3): updated=[(n1,s1):1 (n2,s2):2 (n3,s3):3] next=4
I190205 06:12:28.865291 10779 sql/event_log.go:135 [n1,client=127.0.0.1:51924,user=root] Event: "set_cluster_setting", target: 0, info: {SettingName:kv.import.batch_size Value:10KB User:root}
TestImportCSVStmt/schema-in-file-implicit-gzip
... storage/replica_command.go:244 [n1,s1,r357/1:/{Table/77/3/6…-Max}] initiating a split of this range at key /Table/77/3/1346/"U"/PrefixEnd [r358] (manual)
I190205 02:44:50.723950 24925 storage/replica_command.go:244 [n3,s3,r355/3:/Table/77/{1/3049-3/679/"…}] initiating a split of this range at key /Table/77/2/"B"/28 [r312] (manual)
I190205 02:44:50.725605 24896 storage/replica_command.go:244 [n1,s1,r358/1:/{Table/77/3/1…-Max}] initiating a split of this range at key /Table/77/3/2013/"L"/PrefixEnd [r359] (manual)
I190205 02:44:50.729618 24929 storage/replica_command.go:244 [n3,s3,r355/3:/Table/77/{1/3049-3/679/"…}] initiating a split of this range at key /Table/77/1/3764 [r313] (manual)
I190205 02:44:50.746192 25033 storage/replica_command.go:244 [n1,s1,r359/1:/{Table/77/3/2…-Max}] initiating a split of this range at key /Table/77/3/2680/"C"/PrefixEnd [r360] (manual)
I190205 02:44:50.757342 25041 storage/replica_command.go:244 [n3,s3,r312/3:/Table/77/{2/"B"/28-3/679/"…}] initiating a split of this range at key /Table/77/2/"E"/2423 [r314] (manual)
I190205 02:44:50.784391 25051 storage/replica_command.go:244 [n3,s3,r314/3:/Table/77/{2/"E"/2…-3/679/"…}] initiating a split of this range at key /Table/77/2/"H"/4792 [r315] (manual)
I190205 02:44:50.804090 25066 storage/replica_command.go:244 [n3,s3,r315/3:/Table/77/{2/"H"/4…-3/679/"…}] initiating a split of this range at key /Table/77/2/"L"/2248 [r316] (manual)
I190205 02:44:50.806716 24929 storage/replica_command.go:244 [n3,s3,r355/3:/Table/77/{1/3049-2/"B"/28}] initiating a split of this range at key /Table/77/1/3764 [r317] (manual)
I190205 02:44:50.825779 25155 storage/replica_command.go:244 [n3,s3,r316/3:/Table/77/{2/"L"/2…-3/679/"…}] initiating a split of this range at key /Table/77/2/"O"/4695 [r318] (manual)
I190205 02:44:50.832413 25236 storage/replica_command.go:244 [n3,s3,r317/3:/Table/77/{1/3764-2/"B"/28}] initiating a split of this range at key /Table/77/1/4479 [r319] (manual)
I190205 02:44:50.854347 25192 storage/replica_command.go:244 [n3,s3,r318/3:/Table/77/{2/"O"/4…-3/679/"…}] initiating a split of this range at key /Table/77/2/"S"/2177 [r320] (manual)
I190205 02:44:50.859771 25225 storage/replica_command.go:244 [n3,s3,r319/3:/Table/77/{1/4479-2/"B"/28}] initiating a split of this range at key /Table/77/1/4493 [r391] (manual)
I190205 02:44:50.904434 25313 storage/replica_command.go:244 [n3,s3,r320/3:/Table/77/{2/"S"/2…-3/679/"…}] initiating a split of this range at key /Table/77/2/"V"/4624 [r392] (manual)
I190205 02:44:50.912669 25317 storage/replica_command.go:244 [n1,s1,r360/1:/{Table/77/3/2…-Max}] initiating a split of this range at key /Table/77/3/3347/"T"/PrefixEnd [r381] (manual)
I190205 02:44:50.915201 25264 storage/replica_command.go:244 [n1,s1,r360/1:/{Table/77/3/2…-Max}] initiating a split of this range at key /Table/77/3/4922/"I"/PrefixEnd [r382] (manual)
I190205 02:44:50.924512 25202 storage/replica_command.go:244 [n1,s1,r381/1:/{Table/77/3/3…-Max}] initiating a split of this range at key /Table/77/3/4014/"K"/PrefixEnd [r383] (manual)
I190205 02:44:50.930099 25296 storage/replica_command.go:244 [n3,s3,r392/3:/Table/77/{2/"V"/4…-3/679/"…}] initiating a split of this range at key /Table/77/2/"Z"/2106 [r393] (manual)
I190205 02:44:50.956583 25418 storage/replica_command.go:244 [n3,s3,r393/3:/Table/77/{2/"Z"/2…-3/679/"…}] initiating a split of this range at key /Table/77/2/"Z"/4705 [r394] (manual)
I190205 02:44:50.977081 25264 storage/replica_command.go:244 [n1,s1,r383/1:/{Table/77/3/4…-Max}] initiating a split of this range at key /Table/77/3/4922/"I"/PrefixEnd [r384] (manual)
I190205 02:44:50.987986 25264 storage/replica_command.go:244 [n1,s1,r384/1:/{Table/77/3/4…-Max}] initiating a split of this range at key /Table/78 [r385] (manual)
I190205 02:44:51.091411 25499 storage/replica_command.go:244 [n2,s2,r383/2:/Table/77/3/4{014/"K…-922/"I…}] initiating a split of this range at key /Table/77/3/4256/"S" [r370] (manual)
TestImportCSVStmt/schema-in-file-auto-decompress
...size=110 KiB, qps=14.02))
I190205 06:15:46.909131 31002 storage/replica_command.go:383 [n1,merge,s1,r199/1:/Table/66/3/3{625/"L…-818/"W"}] initiating a merge of r242:/Table/66/3/{3818/"W"-4484/"M"/PrefixEnd} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=1] into this range (lhs+rhs has (size=5.3 KiB+18 KiB qps=0.00+0.00 --> 0.00qps) below threshold (size=24 KiB, qps=0.00))
I190205 06:15:46.938597 10065 storage/store.go:2669 [n3,s3,r250/3:/Table/66/{2/"Y"/4…-3/3625/…}] removing replica r199/3
I190205 06:15:46.938957 9795 storage/store.go:2669 [n2,s2,r250/2:/Table/66/{2/"Y"/4…-3/3625/…}] removing replica r199/2
I190205 06:15:46.939772 9653 storage/store.go:2669 [n1,s1,r250/1:/Table/66/{2/"Y"/4…-3/3625/…}] removing replica r199/1
I190205 06:15:46.985353 30989 storage/replica_command.go:244 [n2,s2,r240/2:/Table/68/2/"{O"/4435-X"/2338}] initiating a split of this range at key /Table/68/2/"S"/1917 [r271] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 06:15:47.128989 30995 storage/replica_command.go:383 [n2,merge,s2,r250/2:/Table/66/{2/"Y"/4…-3/3818/…}] initiating a merge of r242:/Table/66/3/{3818/"W"-4484/"M"/PrefixEnd} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=1] into this range (lhs+rhs has (size=110 KiB+18 KiB qps=12.61+0.00 --> 12.61qps) below threshold (size=128 KiB, qps=12.61))
I190205 06:15:47.533019 9667 storage/store.go:2669 [n1,s1,r250/1:/Table/66/{2/"Y"/4…-3/3818/…}] removing replica r242/1
I190205 06:15:47.538280 9910 storage/store.go:2669 [n2,s2,r250/2:/Table/66/{2/"Y"/4…-3/3818/…}] removing replica r242/2
I190205 06:15:47.543624 10192 storage/store.go:2669 [n3,s3,r250/3:/Table/66/{2/"Y"/4…-3/3818/…}] removing replica r242/3
I190205 06:15:47.660378 31042 storage/replica_command.go:383 [n2,merge,s2,r250/2:/Table/66/{2/"Y"/4…-3/4484/…}] initiating a merge of r249:/Table/6{6/3/4484/"M"/PrefixEnd-7} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=1] into this range (lhs+rhs has (size=128 KiB+14 KiB qps=12.61+0.00 --> 12.61qps) below threshold (size=142 KiB, qps=12.61))
I190205 06:15:47.883622 9932 storage/store.go:2669 [n2,s2,r250/2:/Table/66/{2/"Y"/4…-3/4484/…}] removing replica r249/2
I190205 06:15:47.884414 10060 storage/store.go:2669 [n3,s3,r250/3:/Table/66/{2/"Y"/4…-3/4484/…}] removing replica r249/3
I190205 06:15:47.889679 9670 storage/store.go:2669 [n1,s1,r250/1:/Table/66/{2/"Y"/4…-3/4484/…}] removing replica r249/1
I190205 06:15:47.907822 31116 storage/replica_command.go:244 [n2,s2,r271/2:/Table/68/2/"{S"/1917-X"/2338}] initiating a split of this range at key /Table/68/2/"T"/4881 [r272] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 06:15:47.988830 31190 storage/replica_command.go:383 [n2,merge,s2,r250/2:/Table/6{6/2/"Y"/…-7}] initiating a merge of r251:/Table/6{7-8/1/741} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=2] into this range (lhs+rhs has (size=142 KiB+19 KiB qps=16.88+0.00 --> 16.88qps) below threshold (size=162 KiB, qps=16.88))
I190205 06:15:48.239456 9890 storage/store.go:2669 [n2,s2,r250/2:/Table/6{6/2/"Y"/…-7}] removing replica r251/2
I190205 06:15:48.242316 10198 storage/store.go:2669 [n3,s3,r250/3:/Table/6{6/2/"Y"/…-7}] removing replica r251/3
I190205 06:15:48.249160 9632 storage/store.go:2669 [n1,s1,r250/1:/Table/6{6/2/"Y"/…-7}] removing replica r251/1
import_stmt_test.go:1180: job 8 did not match:
Description: "IMPORT TABLE csv8.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///csv/data-0', 'nodelocal:///csv/data-1', 'nodelocal:///csv/data-2', 'nodelocal:///csv/data-3', 'nodelocal:///csv/data-4') WITH decompress = 'auto'" != "IMPORT TABLE \"\".\"\".t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///csv/data-0-opts', 'nodelocal:///csv/data-1-opts', 'nodelocal:///csv/data-2-opts', 'nodelocal:///csv/data-3-opts', 'nodelocal:///csv/data-4-opts') WITH comment = '#', delimiter = '|', \"nullif\" = '', skip = '2', transform = 'nodelocal:///5'"
TestImportCSVStmt
...I190205 02:44:33.953772 7743 storage/replica_command.go:798 [n1,replicate,s1,r6/1:/Table/{SystemCon…-11}] change replicas (ADD_REPLICA (n2,s2):3): read existing descriptor r6:/Table/{SystemConfigSpan/Start-11} [(n1,s1):1, (n3,s3):2, next=3, gen=0]
I190205 02:44:33.959200 7743 storage/replica_raft.go:372 [n1,s1,r6/1:/Table/{SystemCon…-11}] proposing ADD_REPLICA((n2,s2):3): updated=[(n1,s1):1 (n3,s3):2 (n2,s2):3] next=4
I190205 02:44:33.961860 7743 storage/store_snapshot.go:762 [n1,replicate,s1,r20/1:/{Table/24-Max}] sending preemptive snapshot 7d0c6543 at applied index 16
I190205 02:44:33.962046 7743 storage/store_snapshot.go:805 [n1,replicate,s1,r20/1:/{Table/24-Max}] streamed snapshot to (n3,s3):?: kv pairs: 7, log entries: 6, rate-limit: 8.0 MiB/sec, 0.00s
I190205 02:44:33.963024 9109 storage/replica_raftstorage.go:805 [n3,s3,r20/?:{-}] applying preemptive snapshot at index 16 (id=7d0c6543, encoded size=1244, 1 rocksdb batches, 6 log entries)
I190205 02:44:33.963296 9109 storage/replica_raftstorage.go:811 [n3,s3,r20/?:/{Table/24-Max}] applied preemptive snapshot in 0ms [clear=0ms batch=0ms entries=0ms commit=0ms]
I190205 02:44:33.963663 7743 storage/replica_command.go:798 [n1,replicate,s1,r20/1:/{Table/24-Max}] change replicas (ADD_REPLICA (n3,s3):3): read existing descriptor r20:/{Table/24-Max} [(n1,s1):1, (n2,s2):2, next=3, gen=0]
I190205 02:44:33.969041 7743 storage/replica_raft.go:372 [n1,s1,r20/1:/{Table/24-Max}] proposing ADD_REPLICA((n3,s3):3): updated=[(n1,s1):1 (n2,s2):2 (n3,s3):3] next=4
I190205 02:44:33.971517 7743 storage/store_snapshot.go:762 [n1,replicate,s1,r12/1:/Table/1{6-7}] sending preemptive snapshot 373f58e7 at applied index 16
I190205 02:44:33.971686 7743 storage/store_snapshot.go:805 [n1,replicate,s1,r12/1:/Table/1{6-7}] streamed snapshot to (n3,s3):?: kv pairs: 7, log entries: 6, rate-limit: 8.0 MiB/sec, 0.00s
I190205 02:44:33.971993 9089 storage/replica_raftstorage.go:805 [n3,s3,r12/?:{-}] applying preemptive snapshot at index 16 (id=373f58e7, encoded size=1239, 1 rocksdb batches, 6 log entries)
I190205 02:44:33.973440 9089 storage/replica_raftstorage.go:811 [n3,s3,r12/?:/Table/1{6-7}] applied preemptive snapshot in 1ms [clear=0ms batch=0ms entries=0ms commit=1ms]
I190205 02:44:33.973794 7743 storage/replica_command.go:798 [n1,replicate,s1,r12/1:/Table/1{6-7}] change replicas (ADD_REPLICA (n3,s3):3): read existing descriptor r12:/Table/1{6-7} [(n1,s1):1, (n2,s2):2, next=3, gen=0]
I190205 02:44:33.976839 7743 storage/replica_raft.go:372 [n1,s1,r12/1:/Table/1{6-7}] proposing ADD_REPLICA((n3,s3):3): updated=[(n1,s1):1 (n2,s2):2 (n3,s3):3] next=4
I190205 02:44:33.978485 7743 storage/store_snapshot.go:762 [n1,replicate,s1,r3/1:/System/{NodeLive…-tsd}] sending preemptive snapshot a27e865b at applied index 41
I190205 02:44:33.979056 7743 storage/store_snapshot.go:805 [n1,replicate,s1,r3/1:/System/{NodeLive…-tsd}] streamed snapshot to (n2,s2):?: kv pairs: 35, log entries: 5, rate-limit: 8.0 MiB/sec, 0.00s
I190205 02:44:33.979429 9055 storage/replica_raftstorage.go:805 [n2,s2,r3/?:{-}] applying preemptive snapshot at index 41 (id=a27e865b, encoded size=85224, 1 rocksdb batches, 5 log entries)
I190205 02:44:33.980315 9055 storage/replica_raftstorage.go:811 [n2,s2,r3/?:/System/{NodeLive…-tsd}] applied preemptive snapshot in 1ms [clear=0ms batch=0ms entries=0ms commit=1ms]
I190205 02:44:33.980675 7743 storage/replica_command.go:798 [n1,replicate,s1,r3/1:/System/{NodeLive…-tsd}] change replicas (ADD_REPLICA (n2,s2):3): read existing descriptor r3:/System/{NodeLivenessMax-tsd} [(n1,s1):1, (n3,s3):2, next=3, gen=0]
I190205 02:44:33.983674 7743 storage/replica_raft.go:372 [n1,s1,r3/1:/System/{NodeLive…-tsd}] proposing ADD_REPLICA((n2,s2):3): updated=[(n1,s1):1 (n3,s3):2 (n2,s2):3] next=4
I190205 02:44:34.094919 9160 sql/event_log.go:135 [n1,client=127.0.0.1:35936,user=root] Event: "set_cluster_setting", target: 0, info: {SettingName:kv.import.batch_size Value:10KB User:root}
TestImportCSVStmt/schema-in-file-auto-decompress
...741 [r54] (manual)
I190205 02:44:46.269688 19388 storage/replica_command.go:244 [n3,s3,r54/3:/Table/69/1/{741-3915}] initiating a split of this range at key /Table/69/1/1456 [r55] (manual)
I190205 02:44:46.291830 19232 storage/replica_command.go:244 [n3,s3,r55/3:/Table/69/1/{1456-3915}] initiating a split of this range at key /Table/69/1/2171 [r56] (manual)
I190205 02:44:46.328957 19508 storage/replica_command.go:244 [n3,s3,r56/3:/Table/69/1/{2171-3915}] initiating a split of this range at key /Table/69/1/2886 [r57] (manual)
I190205 02:44:46.336499 19525 storage/replica_command.go:244 [n1,s1,r255/1:/{Table/69/1/4…-Max}] initiating a split of this range at key /Table/69/2/"G"/4505 [r256] (manual)
I190205 02:44:46.355570 19556 storage/replica_command.go:244 [n1,s1,r256/1:/{Table/69/2/"…-Max}] initiating a split of this range at key /Table/69/2/"K"/1935 [r257] (manual)
I190205 02:44:46.358942 19442 storage/replica_command.go:244 [n3,s3,r57/3:/Table/69/1/{2886-3915}] initiating a split of this range at key /Table/69/1/3200 [r58] (manual)
I190205 02:44:46.381063 19501 storage/replica_command.go:244 [n1,s1,r257/1:/{Table/69/2/"…-Max}] initiating a split of this range at key /Table/69/2/"N"/4382 [r258] (manual)
I190205 02:44:46.406300 19473 storage/replica_command.go:244 [n1,s1,r258/1:/{Table/69/2/"…-Max}] initiating a split of this range at key /Table/69/2/"R"/1864 [r259] (manual)
I190205 02:44:46.425865 19600 storage/replica_command.go:244 [n1,s1,r259/1:/{Table/69/2/"…-Max}] initiating a split of this range at key /Table/69/2/"U"/4311 [r260] (manual)
I190205 02:44:46.428159 19653 storage/replica_command.go:244 [n1,s1,r259/1:/{Table/69/2/"…-Max}] initiating a split of this range at key /Table/69/3/3530/"U"/PrefixEnd [r261] (manual)
I190205 02:44:46.432343 19655 storage/replica_command.go:244 [n2,s2,r255/2:/Table/69/{1/4630-2/"G"/4…}] initiating a split of this range at key /Table/69/2/"B"/3382 [r237] (manual)
I190205 02:44:46.458341 19617 storage/replica_command.go:244 [n1,s1,r260/1:/{Table/69/2/"…-Max}] initiating a split of this range at key /Table/69/2/"Y"/1793 [r262] (manual)
I190205 02:44:46.466008 19619 storage/replica_command.go:244 [n2,s2,r237/2:/Table/69/2/"{B"/3382-G"/4505}] initiating a split of this range at key /Table/69/2/"D"/2161 [r238] (manual)
I190205 02:44:46.478881 19750 storage/replica_command.go:244 [n1,s1,r262/1:/{Table/69/2/"…-Max}] initiating a split of this range at key /Table/69/3/376/"M"/PrefixEnd [r263] (manual)
I190205 02:44:46.536816 19784 storage/replica_command.go:244 [n1,s1,r263/1:/{Table/69/3/3…-Max}] initiating a split of this range at key /Table/69/3/3530/"U"/PrefixEnd [r264] (manual)
I190205 02:44:46.589741 19797 storage/replica_command.go:244 [n1,s1,r264/1:/{Table/69/3/3…-Max}] initiating a split of this range at key /Table/69/3/4197/"L"/PrefixEnd [r265] (manual)
I190205 02:44:46.601867 19801 storage/replica_command.go:244 [n1,s1,r265/1:/{Table/69/3/4…-Max}] initiating a split of this range at key /Table/69/3/4864/"C"/PrefixEnd [r266] (manual)
I190205 02:44:46.616931 19794 storage/replica_command.go:244 [n1,s1,r266/1:/{Table/69/3/4…-Max}] initiating a split of this range at key /Table/70 [r267] (manual)
I190205 02:44:46.670397 19819 storage/replica_command.go:244 [n2,s2,r263/2:/Table/69/3/3{76/"M"…-530/"U…}] initiating a split of this range at key /Table/69/3/1043/"D"/PrefixEnd [r239] (manual)
I190205 02:44:46.697864 19924 storage/replica_command.go:244 [n2,s2,r239/2:/Table/69/3/{1043/"…-3530/"…}] initiating a split of this range at key /Table/69/3/1710/"U"/PrefixEnd [r240] (manual)
I190205 02:44:46.734502 19940 storage/replica_command.go:244 [n2,s2,r240/2:/Table/69/3/{1710/"…-3530/"…}] initiating a split of this range at key /Table/69/3/2377/"L"/PrefixEnd [r241] (manual)
I190205 02:44:46.769686 19961 storage/replica_command.go:244 [n2,s2,r241/2:/Table/69/3/{2377/"…-3530/"…}] initiating a split of this range at key /Table/69/3/2864/"E" [r242] (manual)
TestImportCSVStmt/schema-in-query-transform-only
... 9654 storage/store.go:2669 [n1,s1,r212/1:/Table/61/2/"{O"/1314-Q"/2226}] removing replica r205/1
I190205 06:14:57.866125 9924 storage/store.go:2669 [n2,s2,r190/2:/Table/61/{1/4554-2/"E"/4…}] removing replica r189/2
I190205 06:14:57.866611 10059 storage/store.go:2669 [n3,s3,r190/3:/Table/61/{1/4554-2/"E"/4…}] removing replica r189/3
I190205 06:14:57.883059 9641 storage/store.go:2669 [n1,s1,r190/1:/Table/61/{1/4554-2/"E"/4…}] removing replica r189/1
I190205 06:14:58.077630 26078 storage/replica_command.go:383 [n3,merge,s3,r212/3:/Table/61/2/"{O"/1314-S"/4568}] initiating a merge of r216:/Table/61/2/"{S"/4568-W"/2025} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=0] into this range (lhs+rhs has (size=24 KiB+18 KiB qps=3.73+0.00 --> 3.73qps) below threshold (size=43 KiB, qps=3.73))
I190205 06:14:58.223857 26134 storage/replica_command.go:383 [n2,merge,s2,r207/2:/Table/61/2/"{G"/266-I"/892}] initiating a merge of r202:/Table/61/2/"{I"/892-L"/3314} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=1] into this range (lhs+rhs has (size=11 KiB+18 KiB qps=0.00+0.00 --> 0.00qps) below threshold (size=30 KiB, qps=0.00))
I190205 06:14:58.501033 9899 storage/store.go:2669 [n2,s2,r212/2:/Table/61/2/"{O"/1314-S"/4568}] removing replica r216/2
I190205 06:14:58.503815 10058 storage/store.go:2669 [n3,s3,r212/3:/Table/61/2/"{O"/1314-S"/4568}] removing replica r216/3
I190205 06:14:58.520903 9662 storage/store.go:2669 [n1,s1,r212/1:/Table/61/2/"{O"/1314-S"/4568}] removing replica r216/1
I190205 06:14:58.758322 9895 storage/store.go:2669 [n2,s2,r207/2:/Table/61/2/"{G"/266-I"/892}] removing replica r202/2
I190205 06:14:58.760517 9631 storage/store.go:2669 [n1,s1,r207/1:/Table/61/2/"{G"/266-I"/892}] removing replica r202/1
I190205 06:14:58.770387 10224 storage/store.go:2669 [n3,s3,r207/3:/Table/61/2/"{G"/266-I"/892}] removing replica r202/3
I190205 06:14:59.067032 26164 storage/replica_command.go:383 [n3,merge,s3,r185/3:/Table/61/1/{3179-4224}] initiating a merge of r167:/Table/61/1/42{24-64} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=1] into this range (lhs+rhs has (size=28 KiB+1.1 KiB qps=0.00+0.00 --> 0.00qps) below threshold (size=29 KiB, qps=0.00))
I190205 06:14:59.249000 26231 storage/replica_command.go:383 [n2,merge,s2,r207/2:/Table/61/2/"{G"/266-L"/3314}] initiating a merge of r203:/Table/61/2/"{L"/3314-O"/1314} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=3] into this range (lhs+rhs has (size=30 KiB+14 KiB qps=7.89+0.00 --> 7.89qps) below threshold (size=43 KiB, qps=7.89))
I190205 06:14:59.505789 10108 storage/store.go:2669 [n3,s3,r185/3:/Table/61/1/{3179-4224}] removing replica r167/3
I190205 06:14:59.510108 9895 storage/store.go:2669 [n2,s2,r185/2:/Table/61/1/{3179-4224}] removing replica r167/2
I190205 06:14:59.535740 9608 storage/store.go:2669 [n1,s1,r185/1:/Table/61/1/{3179-4224}] removing replica r167/1
I190205 06:14:59.705067 9910 storage/store.go:2669 [n2,s2,r207/2:/Table/61/2/"{G"/266-L"/3314}] removing replica r203/2
I190205 06:14:59.708682 10186 storage/store.go:2669 [n3,s3,r207/3:/Table/61/2/"{G"/266-L"/3314}] removing replica r203/3
I190205 06:14:59.713006 9662 storage/store.go:2669 [n1,s1,r207/1:/Table/61/2/"{G"/266-L"/3314}] removing replica r203/1
import_stmt_test.go:1180: job 5 did not match:
Description: "IMPORT TABLE \"\".\"\".t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///csv/data-0-opts', 'nodelocal:///csv/data-1-opts', 'nodelocal:///csv/data-2-opts', 'nodelocal:///csv/data-3-opts', 'nodelocal:///csv/data-4-opts') WITH comment = '#', delimiter = '|', \"nullif\" = '', skip = '2', transform = 'nodelocal:///5'" != "IMPORT TABLE csv3.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///csv/data-0-opts', 'nodelocal:///csv/data-1-opts', 'nodelocal:///csv/data-2-opts', 'nodelocal:///csv/data-3-opts', 'nodelocal:///csv/data-4-opts') WITH comment = '#', delimiter = '|', \"nullif\" = '', skip = '2'"
TestImportCSVStmt/empty-file
...e/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/2/"P"/1055 - /Table/61/2/"Q"/2226 that contains live data
I190205 06:15:06.741291 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/2/"Q"/2226 - /Table/61/2/"S"/4568 that contains live data
I190205 06:15:06.742447 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/2/"S"/4568 - /Table/61/2/"W"/2025 that contains live data
I190205 06:15:06.742810 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/2/"X"/4157 - /Table/61/3/264/"E" that contains live data
I190205 06:15:06.743494 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/3/731/"D" - /Table/61/3/1365/"N" that contains live data
I190205 06:15:06.744680 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/3/1365/"N" - /Table/61/3/2031/"D"/PrefixEnd that contains live data
I190205 06:15:06.745919 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/3/2522/"A" - /Table/61/3/3188/"Q"/PrefixEnd that contains live data
I190205 06:15:06.747192 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/3/3188/"Q"/PrefixEnd - /Table/61/3/3565/"D" that contains live data
I190205 06:15:06.747938 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/3/4117/"J" - /Table/61/3/4674/"U" that contains live data
I190205 06:15:06.749116 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/3/4359/"R" - /Table/61/3/4674/"U" that contains live data
I190205 06:15:06.799087 10193 storage/store.go:2669 [n3,s3,r212/3:/Table/61/2/"{O"/1314-W"/2025}] removing replica r213/3
I190205 06:15:06.805015 9883 storage/store.go:2669 [n2,s2,r212/2:/Table/61/2/"{O"/1314-W"/2025}] removing replica r213/2
I190205 06:15:06.806335 9620 storage/store.go:2669 [n1,s1,r212/1:/Table/61/2/"{O"/1314-W"/2025}] removing replica r213/1
I190205 06:15:06.849427 27174 storage/replica_command.go:383 [n2,merge,s2,r166/2:/Table/61/1/2{146-861}] initiating a merge of r170:/Table/61/1/{2861-4554} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=3] into this range (lhs+rhs has (size=19 KiB+45 KiB qps=0.00+0.00 --> 0.00qps) below threshold (size=64 KiB, qps=0.00))
import_stmt_test.go:1180: job 6 did not match:
Description: "IMPORT TABLE csv6.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///empty.csv')" != "IMPORT TABLE csv4.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///csv/data-0', 'nodelocal:///csv/data-1', 'nodelocal:///csv/data-2', 'nodelocal:///csv/data-3', 'nodelocal:///csv/data-4') WITH sstsize = '10K'"
------- Stdout: -------
I190205 02:44:44.320484 9160 sql/event_log.go:135 [n1,client=127.0.0.1:35936,user=root] Event: "create_database", target: 64, info: {DatabaseName:csv6 Statement:CREATE DATABASE csv6 User:root}
I190205 02:44:44.327135 16970 storage/replica_consistency.go:139 [n1,consistencyChecker,s1,r4/1:/System/ts{d-e}] triggering stats recomputation to resolve delta of {ContainsEstimates:true LastUpdateNanos:1549334683671481933 IntentAge:0 GCBytesAge:0 LiveBytes:-14745 LiveCount:-1554 KeyBytes:-73641 KeyCount:-1554 ValBytes:58896 ValCount:-1554 IntentBytes:0 IntentCount:0 SysBytes:0 SysCount:0 XXX_NoUnkeyedLiteral:{} XXX_sizecache:0}
I190205 02:44:44.445887 16927 ccl/importccl/read_import_proc.go:75 [n2] could not fetch file size; falling back to per-file progress: <nil>
I190205 02:44:44.498623 17163 ccl/importccl/read_import_proc.go:75 [n2] could not fetch file size; falling back to per-file progress: <nil>
TestImportCSVStmt/schema-in-query-opts
..., 1.2 %gc (11x), 3.4 MiB/3.4 MiB (r/w)net
I190205 06:13:52.874143 18985 storage/replica_command.go:244 [n2,s2,r150/2:/{Table/59/2/"…-Max}] initiating a split of this range at key /Table/59/3/497/"D"/PrefixEnd [r152] (manual)
I190205 06:13:53.124901 18968 storage/replica_command.go:244 [n2,s2,r150/2:/{Table/59/2/"…-Max}] initiating a split of this range at key /Table/59/2/"F"/2579 [r153] (manual)
I190205 06:13:54.010856 19127 storage/replica_command.go:244 [n3,s3,r150/3:/Table/59/{2/"B"/3…-3/497/"…}] initiating a split of this range at key /Table/59/2/"J"/4924 [r103] (manual)
I190205 06:13:54.123120 19068 storage/replica_command.go:244 [n2,s2,r152/2:/{Table/59/3/4…-Max}] initiating a split of this range at key /Table/59/3/1199/"D"/PrefixEnd [r154] (manual)
I190205 06:13:54.233251 18493 storage/replica_command.go:244 [n3,s3,r150/3:/Table/59/{2/"B"/3…-3/497/"…}] initiating a split of this range at key /Table/59/2/"F"/2579 [r104] (manual)
I190205 06:13:54.902201 18493 storage/replica_command.go:244 [n3,s3,r150/3:/Table/59/2/"{B"/3954-J"/4924}] initiating a split of this range at key /Table/59/2/"F"/2579 [r105] (manual)
I190205 06:13:55.128472 19105 storage/replica_command.go:244 [n2,s2,r154/2:/{Table/59/3/1…-Max}] initiating a split of this range at key /Table/59/3/1901/"D"/PrefixEnd [r155] (manual)
I190205 06:13:55.238082 19150 storage/replica_command.go:244 [n3,s3,r103/3:/Table/59/{2/"J"/4…-3/497/"…}] initiating a split of this range at key /Table/59/2/"O"/4747 [r106] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 06:13:56.090108 19252 storage/replica_command.go:244 [n2,s2,r155/2:/{Table/59/3/1…-Max}] initiating a split of this range at key /Table/59/3/2603/"D"/PrefixEnd [r156] (manual)
I190205 06:13:56.307559 19318 storage/replica_command.go:244 [n3,s3,r106/3:/Table/59/{2/"O"/4…-3/497/"…}] initiating a split of this range at key /Table/59/2/"T"/2360 [r107] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 06:13:56.809474 19311 storage/replica_command.go:244 [n2,s2,r156/2:/{Table/59/3/2…-Max}] initiating a split of this range at key /Table/59/3/3305/"D"/PrefixEnd [r158] (manual)
I190205 06:13:56.894569 19331 storage/replica_command.go:244 [n2,s2,r156/2:/{Table/59/3/2…-Max}] initiating a split of this range at key /Table/59/3/4645/"R"/PrefixEnd [r157] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 06:13:57.093354 19418 storage/replica_command.go:244 [n3,s3,r107/3:/Table/59/{2/"T"/2…-3/497/"…}] initiating a split of this range at key /Table/59/2/"X"/4808 [r108] (manual)
I190205 06:13:57.482919 19438 storage/replica_command.go:244 [n2,s2,r158/2:/{Table/59/3/3…-Max}] initiating a split of this range at key /Table/59/3/4645/"R"/PrefixEnd [r160] (manual)
I190205 06:13:57.623473 19478 storage/replica_command.go:244 [n3,s3,r108/3:/Table/59/{2/"X"/4…-3/497/"…}] initiating a split of this range at key /Table/59/2/"Y"/3638 [r109] (manual)
I190205 06:13:57.657277 19437 storage/replica_command.go:244 [n2,s2,r158/2:/{Table/59/3/3…-Max}] initiating a split of this range at key /Table/59/3/3944/NULL [r159] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 06:13:57.966350 19437 storage/replica_command.go:244 [n2,s2,r158/2:/Table/59/3/{3305/"…-4645/"…}] initiating a split of this range at key /Table/59/3/3944/NULL [r161] (manual)
I190205 06:13:58.041480 19529 storage/replica_command.go:244 [n2,s2,r160/2:/{Table/59/3/4…-Max}] initiating a split of this range at key /Table/60 [r162] (manual)
import_stmt_test.go:1180: job 3 did not match:
Description: "IMPORT TABLE csv3.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///csv/data-0-opts', 'nodelocal:///csv/data-1-opts', 'nodelocal:///csv/data-2-opts', 'nodelocal:///csv/data-3-opts', 'nodelocal:///csv/data-4-opts') WITH comment = '#', delimiter = '|', \"nullif\" = '', skip = '2'" != "CREATE STATISTICS __auto__ FROM [53] AS OF SYSTEM TIME '-30s'"
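For context, each "job N did not match" failure above is a plain string comparison: the description the test expects for the N-th job is compared against the description actually found, and here an auto-statistics job (`CREATE STATISTICS __auto__ ...`) appears where the `IMPORT` job was expected. A minimal sketch of that kind of check (hypothetical helper name; the real harness at import_stmt_test.go:1180 reads the jobs table) might look like:

```go
package main

import "fmt"

// verifyJobDescription mimics the shape of the check that produces the
// failures above: the description recorded for job n must equal the
// statement the subtest ran. This is an illustrative sketch, not the
// actual test harness.
func verifyJobDescription(jobID int, expected, actual string) error {
	if expected != actual {
		return fmt.Errorf("job %d did not match:\n  Description: %q != %q",
			jobID, expected, actual)
	}
	return nil
}

func main() {
	// In the failure above, an auto-stats job occupies the slot the
	// test expected the IMPORT job to be in.
	expected := `IMPORT TABLE csv3.public.t (...) CSV DATA (...)`
	actual := `CREATE STATISTICS __auto__ FROM [53] AS OF SYSTEM TIME '-30s'`
	if err := verifyJobDescription(3, expected, actual); err != nil {
		fmt.Println(err)
	}
}
```

A mismatch like this is consistent with background jobs interleaving with the test's jobs, shifting the expected job ordering.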
TestImportCSVStmt/empty-with-files
...15:21.871779 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/61/3/264/"E" - /Table/61/3/3565/"D" that contains live data
I190205 06:15:21.872210 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/61/3/2031/"D"/PrefixEnd - /Table/61/3/3565/"D" that contains live data
I190205 06:15:21.872472 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/61/3/3565/"D" - /Table/64 that contains live data
I190205 06:15:21.872645 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/61/3/4674/"U" - /Table/64 that contains live data
I190205 06:15:21.876206 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/62 - /Max that contains live data
I190205 06:15:21.928277 28727 storage/replica_command.go:244 [n1,s1,r247/1:/Table/66/{2/"R"/4…-3/1624/…}] initiating a split of this range at key /Table/66/2/"V"/1790 [r248] (manual)
I190205 06:15:21.970677 9874 server/status/runtime.go:464 [n2] runtime stats: 1.1 GiB RSS, 677 goroutines, 52 MiB/55 MiB/137 MiB GO alloc/idle/total, 197 MiB/240 MiB CGO alloc/total, 6496.3 CGO/sec, 144.6/13.9 %(u/s)time, 0.8 %gc (10x), 3.7 MiB/3.7 MiB (r/w)net
I190205 06:15:22.362644 10240 gossip/gossip.go:557 [n3] gossip status (ok, 3 nodes)
gossip client (1/3 cur/max conns)
1: 127.0.0.1:35391 (3m0s: infos 594/1204 sent/received, bytes 166167B/419400B sent/received)
gossip server (0/3 cur/max conns, infos 594/1204 sent/received, bytes 166167B/419400B sent/received)
I190205 06:15:22.522772 28008 storage/replica_command.go:383 [n3,merge,s3,r110/3:/Table/61{-/1/4554}] initiating a merge of r190:/Table/6{1/1/4554-4} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=6] into this range (lhs+rhs has (size=120 KiB+284 KiB qps=1.20+0.00 --> 1.20qps) below threshold (size=404 KiB, qps=1.20))
I190205 06:15:22.793524 10171 server/status/runtime.go:464 [n3] runtime stats: 1.1 GiB RSS, 680 goroutines, 53 MiB/53 MiB/137 MiB GO alloc/idle/total, 197 MiB/240 MiB CGO alloc/total, 6776.9 CGO/sec, 143.8/13.7 %(u/s)time, 1.2 %gc (10x), 3.8 MiB/3.8 MiB (r/w)net
I190205 06:15:22.795696 28721 storage/replica_command.go:244 [n1,s1,r242/1:/{Table/66/3/3…-Max}] initiating a split of this range at key /Table/66/3/4484/"M"/PrefixEnd [r249] (manual)
I190205 06:15:22.878177 28776 storage/replica_command.go:244 [n1,s1,r248/1:/Table/66/{2/"V"/1…-3/1624/…}] initiating a split of this range at key /Table/66/2/"Y"/4237 [r250] (manual)
I190205 06:15:23.610461 10212 storage/store.go:2669 [n3,s3,r110/3:/Table/61{-/1/4554}] removing replica r190/3
I190205 06:15:23.615939 9913 storage/store.go:2669 [n2,s2,r110/2:/Table/61{-/1/4554}] removing replica r190/2
I190205 06:15:23.623197 9662 storage/store.go:2669 [n1,s1,r110/1:/Table/61{-/1/4554}] removing replica r190/1
I190205 06:15:23.694907 28008 storage/queue.go:912 [n3,merge] purgatory is now empty
I190205 06:15:23.713115 28802 storage/replica_command.go:244 [n1,s1,r249/1:/{Table/66/3/4…-Max}] initiating a split of this range at key /Table/67 [r251] (manual)
I190205 06:15:23.862117 28783 storage/replica_command.go:244 [n1,s1,r250/1:/Table/66/{2/"Y"/4…-3/1624/…}] initiating a split of this range at key /Table/66/3/470/"C"/PrefixEnd [r252] (manual)
I190205 06:15:25.100854 28859 storage/replica_command.go:244 [n1,s1,r252/1:/Table/66/3/{470/"C…-1624/"…}] initiating a split of this range at key /Table/66/3/958/"W" [r253] (manual)
import_stmt_test.go:1180: job 7 did not match:
Description: "IMPORT TABLE csv7.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///empty.csv', 'nodelocal:///csv/data-0', 'nodelocal:///csv/data-1', 'nodelocal:///csv/data-2', 'nodelocal:///csv/data-3', 'nodelocal:///csv/data-4')" != "CREATE STATISTICS __auto__ FROM [57] AS OF SYSTEM TIME '-30s'"
TestImportCSVStmt/empty-file
...r.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/2/"B"/1119 - /Table/61/2/"E"/3463 that contains live data
I190205 06:15:06.731776 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/2/"E"/3463 - /Table/61/2/"E"/4190 that contains live data
I190205 06:15:06.732818 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/2/"E"/4190 - /Table/61/2/"G"/266 that contains live data
I190205 06:15:06.736565 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/2/"I"/892 - /Table/61/2/"L"/3314 that contains live data
I190205 06:15:06.738047 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/2/"L"/3314 - /Table/61/2/"O"/1314 that contains live data
I190205 06:15:06.740642 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/2/"P"/1055 - /Table/61/2/"Q"/2226 that contains live data
I190205 06:15:06.741291 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/2/"Q"/2226 - /Table/61/2/"S"/4568 that contains live data
I190205 06:15:06.742447 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/2/"S"/4568 - /Table/61/2/"W"/2025 that contains live data
I190205 06:15:06.742810 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/2/"X"/4157 - /Table/61/3/264/"E" that contains live data
I190205 06:15:06.743494 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/3/731/"D" - /Table/61/3/1365/"N" that contains live data
I190205 06:15:06.744680 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/3/1365/"N" - /Table/61/3/2031/"D"/PrefixEnd that contains live data
I190205 06:15:06.745919 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/3/2522/"A" - /Table/61/3/3188/"Q"/PrefixEnd that contains live data
I190205 06:15:06.747192 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/3/3188/"Q"/PrefixEnd - /Table/61/3/3565/"D" that contains live data
I190205 06:15:06.747938 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/3/4117/"J" - /Table/61/3/4674/"U" that contains live data
I190205 06:15:06.749116 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/3/4359/"R" - /Table/61/3/4674/"U" that contains live data
I190205 06:15:06.799087 10193 storage/store.go:2669 [n3,s3,r212/3:/Table/61/2/"{O"/1314-W"/2025}] removing replica r213/3
I190205 06:15:06.805015 9883 storage/store.go:2669 [n2,s2,r212/2:/Table/61/2/"{O"/1314-W"/2025}] removing replica r213/2
I190205 06:15:06.806335 9620 storage/store.go:2669 [n1,s1,r212/1:/Table/61/2/"{O"/1314-W"/2025}] removing replica r213/1
I190205 06:15:06.849427 27174 storage/replica_command.go:383 [n2,merge,s2,r166/2:/Table/61/1/2{146-861}] initiating a merge of r170:/Table/61/1/{2861-4554} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=3] into this range (lhs+rhs has (size=19 KiB+45 KiB qps=0.00+0.00 --> 0.00qps) below threshold (size=64 KiB, qps=0.00))
import_stmt_test.go:1180: job 6 did not match:
Description: "IMPORT TABLE csv6.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///empty.csv')" != "IMPORT TABLE csv4.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///csv/data-0', 'nodelocal:///csv/data-1', 'nodelocal:///csv/data-2', 'nodelocal:///csv/data-3', 'nodelocal:///csv/data-4') WITH sstsize = '10K'"
TestImportCSVStmt/schema-in-file-no-decompress
...58316 32648 storage/replica_command.go:244 [n1,s1,r287/1:/{Table/70/3/4…-Max}] initiating a split of this range at key /Table/70/3/2182/"Y"/PrefixEnd [r290] (manual)
I190205 06:16:02.177970 32643 storage/replica_command.go:244 [n1,s1,r288/1:/Table/70/{1/2171-2/"B"/3…}] initiating a split of this range at key /Table/70/1/2886 [r291] (manual)
I190205 06:16:02.609808 32468 storage/replica_command.go:244 [n1,s1,r289/1:/{Table/70/3/1…-Max}] initiating a split of this range at key /Table/70/3/1516/"I" [r292] (manual)
I190205 06:16:02.871254 10171 server/status/runtime.go:464 [n3] runtime stats: 1.3 GiB RSS, 690 goroutines, 57 MiB/48 MiB/137 MiB GO alloc/idle/total, 232 MiB/276 MiB CGO alloc/total, 4243.4 CGO/sec, 136.5/12.6 %(u/s)time, 1.3 %gc (10x), 3.7 MiB/3.7 MiB (r/w)net
I190205 06:16:02.932635 32700 storage/replica_command.go:244 [n1,s1,r289/1:/{Table/70/3/1…-Max}] initiating a split of this range at key /Table/70/3/2182/"Y"/PrefixEnd [r293] (manual)
W190205 06:16:03.166227 9876 server/node.go:869 [n2,summaries] health alerts detected: {Alerts:[{StoreID:2 Category:METRICS Description:queue.replicagc.process.failure Value:1 XXX_NoUnkeyedLiteral:{} XXX_sizecache:0}] XXX_NoUnkeyedLiteral:{} XXX_sizecache:0}
I190205 06:16:03.726772 32671 storage/replica_command.go:244 [n1,s1,r292/1:/{Table/70/3/1…-Max}] initiating a split of this range at key /Table/70/3/2182/"Y"/PrefixEnd [r294] (manual)
I190205 06:16:03.739747 32805 storage/replica_command.go:244 [n1,s1,r291/1:/Table/70/{1/2886-2/"B"/3…}] initiating a split of this range at key /Table/70/1/3601 [r295] (manual)
I190205 06:16:04.194676 32792 storage/replica_command.go:244 [n1,s1,r294/1:/{Table/70/3/2…-Max}] initiating a split of this range at key /Table/70/3/2776/"U" [r296] (manual)
I190205 06:16:04.806200 32853 storage/replica_command.go:244 [n1,s1,r295/1:/Table/70/{1/3601-2/"B"/3…}] initiating a split of this range at key /Table/70/1/4316 [r297] (manual)
I190205 06:16:05.615978 32866 storage/replica_command.go:244 [n1,s1,r297/1:/Table/70/{1/4316-2/"B"/3…}] initiating a split of this range at key /Table/70/1/4617 [r298] (manual)
I190205 06:16:06.168390 32946 storage/replica_command.go:244 [n1,s1,r281/1:/Table/70/2/"{H"/3101-Y"/3795}] initiating a split of this range at key /Table/70/2/"L"/532 [r299] (manual)
I190205 06:16:06.403008 32892 storage/replica_command.go:244 [n1,s1,r299/1:/Table/70/2/"{L"/532-Y"/3795}] initiating a split of this range at key /Table/70/2/"O"/2979 [r300] (manual)
I190205 06:16:06.645417 33000 storage/replica_command.go:244 [n1,s1,r300/1:/Table/70/2/"{O"/2979-Y"/3795}] initiating a split of this range at key /Table/70/2/"S"/461 [r301] (manual)
I190205 06:16:07.070317 33003 storage/replica_command.go:244 [n1,s1,r301/1:/Table/70/2/"{S"/461-Y"/3795}] initiating a split of this range at key /Table/70/2/"V"/1373 [r302] (manual)
I190205 06:16:07.567661 33066 storage/replica_command.go:244 [n1,s1,r296/1:/{Table/70/3/2…-Max}] initiating a split of this range at key /Table/70/3/3442/"K"/PrefixEnd [r303] (manual)
I190205 06:16:07.882628 33109 storage/replica_command.go:244 [n1,s1,r303/1:/{Table/70/3/3…-Max}] initiating a split of this range at key /Table/70/3/4109/"B"/PrefixEnd [r304] (manual)
I190205 06:16:08.188667 33028 storage/replica_command.go:244 [n1,s1,r304/1:/{Table/70/3/4…-Max}] initiating a split of this range at key /Table/70/3/4776/"S"/PrefixEnd [r305] (manual)
I190205 06:16:08.501875 33173 storage/replica_command.go:244 [n1,s1,r305/1:/{Table/70/3/4…-Max}] initiating a split of this range at key /Table/71 [r306] (manual)
import_stmt_test.go:1180: job 9 did not match:
Description: "IMPORT TABLE csv9.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///csv/data-0', 'nodelocal:///csv/data-1', 'nodelocal:///csv/data-2', 'nodelocal:///csv/data-3', 'nodelocal:///csv/data-4') WITH decompress = 'none'" != "CREATE STATISTICS __auto__ FROM [59] AS OF SYSTEM TIME '-30s'"
TestImportCSVStmt/schema-in-file-auto-gzip
...)
I190205 02:44:49.403935 23281 storage/replica_command.go:244 [n2,s2,r331/2:/Table/75/1/{741-4898}] initiating a split of this range at key /Table/75/1/1456 [r333] (manual)
I190205 02:44:49.404000 23432 storage/replica_command.go:244 [n1,s1,r343/1:/{Table/75/3/3…-Max}] initiating a split of this range at key /Table/75/3/3927/"B"/PrefixEnd [r344] (manual)
I190205 02:44:49.422037 23432 storage/replica_command.go:244 [n1,s1,r344/1:/{Table/75/3/3…-Max}] initiating a split of this range at key /Table/75/3/4594/"S"/PrefixEnd [r345] (manual)
I190205 02:44:49.433638 23620 storage/replica_command.go:244 [n2,s2,r333/2:/Table/75/1/{1456-4898}] initiating a split of this range at key /Table/75/1/2171 [r334] (manual)
I190205 02:44:49.434460 23432 storage/replica_command.go:244 [n1,s1,r345/1:/{Table/75/3/4…-Max}] initiating a split of this range at key /Table/75/3/4829/"T" [r346] (manual)
I190205 02:44:49.514207 23539 storage/replica_command.go:244 [n2,s2,r334/2:/Table/75/1/{2171-4898}] initiating a split of this range at key /Table/75/1/2886 [r335] (manual)
I190205 02:44:49.552268 23614 storage/replica_command.go:244 [n2,s2,r335/2:/Table/75/1/{2886-4898}] initiating a split of this range at key /Table/75/1/3601 [r336] (manual)
I190205 02:44:49.575151 23562 storage/replica_command.go:244 [n2,s2,r336/2:/Table/75/1/{3601-4898}] initiating a split of this range at key /Table/75/1/4183 [r337] (manual)
I190205 02:44:49.603039 23403 storage/replica_command.go:244 [n2,s2,r341/2:/Table/75/{2/"H"/4…-3/3260/…}] initiating a split of this range at key /Table/75/2/"L"/2196 [r332] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 02:44:49.627746 23403 storage/replica_command.go:244 [n2,s2,r332/2:/Table/75/{2/"L"/2…-3/3260/…}] initiating a split of this range at key /Table/75/2/"O"/4643 [r338] (manual)
I190205 02:44:49.630246 23736 storage/replica_command.go:244 [n2,s2,r332/2:/Table/75/{2/"L"/2…-3/3260/…}] initiating a split of this range at key /Table/75/3/1144/"A"/PrefixEnd [r339] (manual)
I190205 02:44:49.654969 23403 storage/replica_command.go:244 [n2,s2,r338/2:/Table/75/{2/"O"/4…-3/3260/…}] initiating a split of this range at key /Table/75/2/"S"/2125 [r340] (manual)
I190205 02:44:49.669974 23403 storage/replica_command.go:244 [n2,s2,r340/2:/Table/75/{2/"S"/2…-3/3260/…}] initiating a split of this range at key /Table/75/2/"V"/4572 [r361] (manual)
I190205 02:44:49.706106 23403 storage/replica_command.go:244 [n2,s2,r361/2:/Table/75/{2/"V"/4…-3/3260/…}] initiating a split of this range at key /Table/75/2/"Z"/2054 [r362] (manual)
I190205 02:44:49.909931 23892 storage/replica_command.go:244 [n2,s2,r361/2:/Table/75/2/"{V"/4572-Z"/2054}] initiating a split of this range at key /Table/75/3/1144/"A"/PrefixEnd [r363] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 02:44:49.945311 23403 storage/replica_command.go:244 [n2,s2,r362/2:/Table/75/{2/"Z"/2…-3/3260/…}] initiating a split of this range at key /Table/75/3/478/"K" [r364] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 02:44:49.968551 23932 storage/replica_command.go:244 [n2,s2,r364/2:/Table/75/3/{478/"K"-3260/"…}] initiating a split of this range at key /Table/75/3/1144/"A"/PrefixEnd [r365] (manual)
I190205 02:44:49.981685 23852 storage/replica_command.go:244 [n1,s1,r346/1:/{Table/75/3/4…-Max}] initiating a split of this range at key /Table/76 [r347] (manual)
I190205 02:44:49.993520 24000 storage/replica_command.go:244 [n2,s2,r365/2:/Table/75/3/{1144/"…-3260/"…}] initiating a split of this range at key /Table/75/3/1811/"R"/PrefixEnd [r366] (manual)
I190205 02:44:50.029724 24068 storage/replica_command.go:244 [n2,s2,r366/2:/Table/75/3/{1811/"…-3260/"…}] initiating a split of this range at key /Table/75/3/2478/"I"/PrefixEnd [r367] (manual)
I190205 02:44:50.057544 24041 storage/replica_command.go:244 [n2,s2,r367/2:/Table/75/3/{2478/"…-3260/"…}] initiating a split of this range at key /Table/75/3/2594/"U" [r368] (manual)
TestImportCSVStmt/schema-in-file-explicit-gzip
....910569 9879 storage/store.go:2669 [n2,s2,r273/2:/Table/68{-/1/3052}] removing replica r233/2
I190205 06:16:40.913259 10222 storage/store.go:2669 [n3,s3,r273/3:/Table/68{-/1/3052}] removing replica r233/3
I190205 06:16:40.936269 9637 storage/store.go:2669 [n1,s1,r273/1:/Table/68{-/1/3052}] removing replica r233/1
I190205 06:16:41.383515 36625 storage/replica_command.go:383 [n2,merge,s2,r273/2:/Table/68{-/2/"L"/1…}] initiating a merge of r239:/Table/{68/2/"L"/1988-70} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=3] into this range (lhs+rhs has (size=191 KiB+213 KiB qps=0.00+0.00 --> 0.00qps) below threshold (size=404 KiB, qps=0.00))
I190205 06:16:41.559355 36762 storage/replica_command.go:244 [n1,s1,r318/1:/{Table/72/3/3…-Max}] initiating a split of this range at key /Table/72/3/3992/"O"/PrefixEnd [r319] (manual)
I190205 06:16:41.789422 9883 storage/store.go:2669 [n2,s2,r273/2:/Table/68{-/2/"L"/1…}] removing replica r239/2
I190205 06:16:41.799294 9634 storage/store.go:2669 [n1,s1,r273/1:/Table/68{-/2/"L"/1…}] removing replica r239/1
I190205 06:16:41.800309 10065 storage/store.go:2669 [n3,s3,r273/3:/Table/68{-/2/"L"/1…}] removing replica r239/3
I190205 06:16:42.108469 9874 server/status/runtime.go:464 [n2] runtime stats: 1.3 GiB RSS, 669 goroutines, 68 MiB/36 MiB/137 MiB GO alloc/idle/total, 276 MiB/314 MiB CGO alloc/total, 6377.7 CGO/sec, 151.5/13.8 %(u/s)time, 1.8 %gc (10x), 3.1 MiB/3.1 MiB (r/w)net
I190205 06:16:42.169137 36762 storage/replica_command.go:244 [n1,s1,r319/1:/{Table/72/3/3…-Max}] initiating a split of this range at key /Table/72/3/4659/"F"/PrefixEnd [r320] (manual)
I190205 06:16:42.194380 36299 sql/event_log.go:135 [n1] Event: "create_statistics", target: 61, info: {StatisticName:__auto__ Statement:CREATE STATISTICS __auto__ FROM [61] AS OF SYSTEM TIME '-30s'}
I190205 06:16:42.596801 36762 storage/replica_command.go:244 [n1,s1,r320/1:/{Table/72/3/4…-Max}] initiating a split of this range at key /Table/73 [r331] (manual)
I190205 06:16:42.945888 10171 server/status/runtime.go:464 [n3] runtime stats: 1.3 GiB RSS, 667 goroutines, 87 MiB/21 MiB/137 MiB GO alloc/idle/total, 275 MiB/315 MiB CGO alloc/total, 6345.5 CGO/sec, 154.0/14.3 %(u/s)time, 1.8 %gc (10x), 3.1 MiB/3.1 MiB (r/w)net
I190205 06:16:43.285212 36839 storage/replica_command.go:383 [n1,merge,s1,r286/1:/Table/70/1/{1456-2171}] initiating a merge of r288:/Table/70/1/2{171-886} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=1] into this range (lhs+rhs has (size=19 KiB+19 KiB qps=0.00+0.00 --> 0.00qps) below threshold (size=38 KiB, qps=0.00))
W190205 06:16:43.399742 10173 server/node.go:869 [n3,summaries] health alerts detected: {Alerts:[{StoreID:3 Category:METRICS Description:queue.replicagc.process.failure Value:1 XXX_NoUnkeyedLiteral:{} XXX_sizecache:0}] XXX_NoUnkeyedLiteral:{} XXX_sizecache:0}
I190205 06:16:43.883124 9624 storage/store.go:2669 [n1,s1,r286/1:/Table/70/1/{1456-2171}] removing replica r288/1
I190205 06:16:43.896340 36935 storage/replica_command.go:383 [n3,merge,s3,r288/3:/Table/70/1/2{171-886}] initiating a merge of r291:/Table/70/1/{2886-3601} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=1] into this range (lhs+rhs has (size=19 KiB+19 KiB qps=0.00+0.00 --> 0.00qps) below threshold (size=38 KiB, qps=0.00))
I190205 06:16:43.907335 9891 storage/store.go:2669 [n2,s2,r286/2:/Table/70/1/{1456-2171}] removing replica r288/2
I190205 06:16:43.909580 10068 storage/store.go:2669 [n3,s3,r286/3:/Table/70/1/{1456-2171}] removing replica r288/3
import_stmt_test.go:1180: job 10 did not match:
Description: "IMPORT TABLE csv10.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:////csv/data-0.gz', 'nodelocal:////csv/data-1.gz', 'nodelocal:////csv/data-2.gz', 'nodelocal:////csv/data-3.gz', 'nodelocal:////csv/data-4.gz') WITH decompress = 'gzip'" != "IMPORT TABLE csv6.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///empty.csv')"
TestImportCSVStmt/schema-in-file-sstsize
...2,s2,compactor] purging suggested compaction for range /Table/59/3/3305/"D"/PrefixEnd - /Table/59/3/3944/NULL that contains live data
I190205 06:14:34.574468 9868 storage/compactor/compactor.go:329 [n2,s2,compactor] purging suggested compaction for range /Table/59/3/3944/NULL - /Table/59/3/4645/"R"/PrefixEnd that contains live data
I190205 06:14:34.574774 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/55/1/3601 - /Table/57 that contains live data
I190205 06:14:34.574913 9868 storage/compactor/compactor.go:329 [n2,s2,compactor] purging suggested compaction for range /Table/59/3/4645/"R"/PrefixEnd - /Table/60 that contains live data
I190205 06:14:34.575401 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/57/1/4360 - /Table/57/3/1122/"E"/PrefixEnd that contains live data
I190205 06:14:34.575921 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/57/3/1122/"E"/PrefixEnd - /Table/59 that contains live data
I190205 06:14:34.576341 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/59/2/"X"/4808 - /Table/59/2/"Y"/3638 that contains live data
I190205 06:14:34.576856 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/59/2/"Y"/3638 - /Table/59/3/497/"D"/PrefixEnd that contains live data
I190205 06:14:34.577325 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/59/3/497/"D"/PrefixEnd - /Table/59/3/1199/"D"/PrefixEnd that contains live data
I190205 06:14:34.577894 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/59/3/1199/"D"/PrefixEnd - /Table/59/3/1901/"D"/PrefixEnd that contains live data
I190205 06:14:34.578378 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/59/3/1901/"D"/PrefixEnd - /Table/59/3/2603/"D"/PrefixEnd that contains live data
I190205 06:14:34.578865 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/59/3/3305/"D"/PrefixEnd - /Table/59/3/3944/NULL that contains live data
I190205 06:14:34.579422 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/59/3/3944/NULL - /Table/59/3/4645/"R"/PrefixEnd that contains live data
I190205 06:14:34.579908 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/59/3/4645/"R"/PrefixEnd - /Table/60 that contains live data
I190205 06:14:34.580286 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/60 - /Table/61/1/741 that contains live data
I190205 06:14:34.575420 9868 storage/compactor/compactor.go:329 [n2,s2,compactor] purging suggested compaction for range /Table/60 - /Table/61/1/741 that contains live data
I190205 06:14:35.014982 9911 storage/store.go:2669 [n2,s2,r107/2:/Table/59/{2/"T"/2…-3/2603/…}] removing replica r156/2
I190205 06:14:35.017567 10111 storage/store.go:2669 [n3,s3,r107/3:/Table/59/{2/"T"/2…-3/2603/…}] removing replica r156/3
I190205 06:14:35.092082 9649 storage/store.go:2669 [n1,s1,r107/1:/Table/59/{2/"T"/2…-3/2603/…}] removing replica r156/1
I190205 06:14:36.634347 23897 storage/replica_command.go:244 [n1,s1,r220/1:/Table/6{1/3/4359…-2}] initiating a split of this range at key /Table/61/3/4674/"U" [r115] (manual)
import_stmt_test.go:1180: job 4 did not match:
Description: "IMPORT TABLE csv4.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///csv/data-0', 'nodelocal:///csv/data-1', 'nodelocal:///csv/data-2', 'nodelocal:///csv/data-3', 'nodelocal:///csv/data-4') WITH sstsize = '10K'" != "CREATE STATISTICS __auto__ FROM [55] AS OF SYSTEM TIME '-30s'"
TestImportCSVStmt/schema-in-file-no-decompress
... storage/replica_command.go:244 [n1,s1,r282/1:/{Table/71/1/3…-Max}] initiating a split of this range at key /Table/71/1/4413 [r284] (manual)
I190205 02:44:47.427740 20802 storage/replica_command.go:244 [n3,s3,r60/3:/Table/71/1/{741-2462}] initiating a split of this range at key /Table/71/1/1456 [r271] (manual)
I190205 02:44:47.433088 20821 storage/replica_command.go:244 [n1,s1,r283/1:/{Table/71/2/"…-Max}] initiating a split of this range at key /Table/71/2/"E"/473 [r285] (manual)
I190205 02:44:47.443421 20821 storage/replica_command.go:244 [n1,s1,r285/1:/{Table/71/2/"…-Max}] initiating a split of this range at key /Table/71/2/"H"/2842 [r286] (manual)
I190205 02:44:47.443774 20802 storage/replica_command.go:244 [n3,s3,r271/3:/Table/71/1/{1456-2462}] initiating a split of this range at key /Table/71/1/1747 [r272] (manual)
I190205 02:44:47.463767 20821 storage/replica_command.go:244 [n1,s1,r286/1:/{Table/71/2/"…-Max}] initiating a split of this range at key /Table/71/2/"L"/298 [r287] (manual)
I190205 02:44:47.478618 20973 storage/replica_command.go:244 [n3,s3,r282/3:/Table/71/{1/3892-2/"A"/3…}] initiating a split of this range at key /Table/71/1/4413 [r273] (manual)
I190205 02:44:47.480787 20821 storage/replica_command.go:244 [n1,s1,r287/1:/{Table/71/2/"…-Max}] initiating a split of this range at key /Table/71/2/"O"/2745 [r288] (manual)
I190205 02:44:47.500570 20821 storage/replica_command.go:244 [n1,s1,r288/1:/{Table/71/2/"…-Max}] initiating a split of this range at key /Table/71/2/"S"/227 [r289] (manual)
I190205 02:44:47.513324 20821 storage/replica_command.go:244 [n1,s1,r289/1:/{Table/71/2/"…-Max}] initiating a split of this range at key /Table/71/2/"V"/2674 [r291] (manual)
I190205 02:44:47.538534 20821 storage/replica_command.go:244 [n1,s1,r291/1:/{Table/71/2/"…-Max}] initiating a split of this range at key /Table/71/2/"V"/2751 [r292] (manual)
I190205 02:44:47.566995 21138 storage/replica_command.go:244 [n1,s1,r292/1:/{Table/71/2/"…-Max}] initiating a split of this range at key /Table/71/3/3498/"O"/PrefixEnd [r293] (manual)
I190205 02:44:47.584995 21177 storage/replica_command.go:244 [n1,s1,r293/1:/{Table/71/3/3…-Max}] initiating a split of this range at key /Table/71/3/4165/"F"/PrefixEnd [r294] (manual)
I190205 02:44:47.602691 21122 storage/replica_command.go:244 [n1,s1,r294/1:/{Table/71/3/4…-Max}] initiating a split of this range at key /Table/71/3/4832/"W"/PrefixEnd [r295] (manual)
I190205 02:44:47.669114 21198 storage/replica_command.go:244 [n1,s1,r295/1:/{Table/71/3/4…-Max}] initiating a split of this range at key /Table/72 [r296] (manual)
I190205 02:44:47.705378 21124 storage/replica_command.go:244 [n1,s1,r288/1:/Table/71/2/"{O"/2745-S"/227}] initiating a split of this range at key /Table/71/2/"Z"/208 [r290] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 02:44:47.764472 21076 storage/replica_command.go:244 [n3,s3,r292/3:/Table/71/{2/"V"/2…-3/3498/…}] initiating a split of this range at key /Table/71/2/"Z"/208 [r274] (manual)
I190205 02:44:47.782285 21076 storage/replica_command.go:244 [n3,s3,r274/3:/Table/71/{2/"Z"/2…-3/3498/…}] initiating a split of this range at key /Table/71/3/507/"N"/PrefixEnd [r275] (manual)
I190205 02:44:47.801201 21076 storage/replica_command.go:244 [n3,s3,r275/3:/Table/71/3/{507/"N…-3498/"…}] initiating a split of this range at key /Table/71/3/1174/"E"/PrefixEnd [r276] (manual)
I190205 02:44:47.825166 21076 storage/replica_command.go:244 [n3,s3,r276/3:/Table/71/3/{1174/"…-3498/"…}] initiating a split of this range at key /Table/71/3/1841/"V"/PrefixEnd [r277] (manual)
I190205 02:44:47.842974 21076 storage/replica_command.go:244 [n3,s3,r277/3:/Table/71/3/{1841/"…-3498/"…}] initiating a split of this range at key /Table/71/3/2508/"M"/PrefixEnd [r278] (manual)
I190205 02:44:47.857598 21076 storage/replica_command.go:244 [n3,s3,r278/3:/Table/71/3/{2508/"…-3498/"…}] initiating a split of this range at key /Table/71/3/2832/"Y" [r279] (manual)
TestImportCSVStmt/schema-in-query-opts
...I190205 02:44:40.146743 13411 storage/replica_command.go:244 [n1,s1,r119/1:/{Table/59/1/2…-Max}] initiating a split of this range at key /Table/59/1/3048 [r120] (manual)
I190205 02:44:40.157367 13311 storage/replica_command.go:244 [n1,s1,r120/1:/{Table/59/1/3…-Max}] initiating a split of this range at key /Table/59/1/3803 [r121] (manual)
I190205 02:44:40.166763 13299 storage/replica_command.go:244 [n1,s1,r121/1:/{Table/59/1/3…-Max}] initiating a split of this range at key /Table/59/1/4033 [r122] (manual)
I190205 02:44:40.172444 13337 storage/replica_command.go:244 [n1,s1,r122/1:/{Table/59/1/4…-Max}] initiating a split of this range at key /Table/59/1/4788 [r124] (manual)
I190205 02:44:40.184371 13337 storage/replica_command.go:244 [n1,s1,r124/1:/{Table/59/1/4…-Max}] initiating a split of this range at key /Table/59/2/NULL/2425 [r125] (manual)
I190205 02:44:40.192974 13337 storage/replica_command.go:244 [n1,s1,r125/1:/{Table/59/2/N…-Max}] initiating a split of this range at key /Table/59/2/"B"/1484 [r126] (manual)
I190205 02:44:40.194893 13514 storage/replica_command.go:244 [n1,s1,r125/1:/{Table/59/2/N…-Max}] initiating a split of this range at key /Table/59/3/962/"A"/PrefixEnd [r127] (manual)
I190205 02:44:40.201550 13337 storage/replica_command.go:244 [n1,s1,r126/1:/{Table/59/2/"…-Max}] initiating a split of this range at key /Table/59/2/"D"/2031 [r128] (manual)
I190205 02:44:40.250733 13547 storage/replica_command.go:244 [n1,s1,r128/1:/{Table/59/2/"…-Max}] initiating a split of this range at key /Table/59/3/962/"A"/PrefixEnd [r129] (manual)
I190205 02:44:40.261575 13530 storage/replica_command.go:244 [n1,s1,r129/1:/{Table/59/3/9…-Max}] initiating a split of this range at key /Table/59/3/1664/!NULL [r130] (manual)
I190205 02:44:40.271881 13443 storage/replica_command.go:244 [n1,s1,r130/1:/{Table/59/3/1…-Max}] initiating a split of this range at key /Table/59/3/2366/"A"/PrefixEnd [r131] (manual)
I190205 02:44:40.283266 13604 storage/replica_command.go:244 [n1,s1,r131/1:/{Table/59/3/2…-Max}] initiating a split of this range at key /Table/59/3/3068/!NULL [r132] (manual)
I190205 02:44:40.302432 13565 storage/replica_command.go:244 [n1,s1,r132/1:/{Table/59/3/3…-Max}] initiating a split of this range at key /Table/59/3/3770/"A"/PrefixEnd [r133] (manual)
I190205 02:44:40.312445 13630 storage/replica_command.go:244 [n1,s1,r133/1:/{Table/59/3/3…-Max}] initiating a split of this range at key /Table/59/3/4472/!NULL [r134] (manual)
I190205 02:44:40.323078 13581 storage/replica_command.go:244 [n1,s1,r134/1:/{Table/59/3/4…-Max}] initiating a split of this range at key /Table/60 [r135] (manual)
I190205 02:44:40.369520 13347 storage/replica_command.go:244 [n1,s1,r121/1:/Table/59/1/{3803-4033}] initiating a split of this range at key /Table/59/2/"H"/4376 [r123] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 02:44:40.430777 13571 storage/replica_command.go:244 [n1,s1,r128/1:/Table/59/{2/"D"/2…-3/962/"…}] initiating a split of this range at key /Table/59/2/"H"/4376 [r136] (manual)
I190205 02:44:40.441593 13599 storage/replica_command.go:244 [n1,s1,r136/1:/Table/59/{2/"H"/4…-3/962/"…}] initiating a split of this range at key /Table/59/2/"M"/3627 [r137] (manual)
I190205 02:44:40.451421 13686 storage/replica_command.go:244 [n1,s1,r137/1:/Table/59/{2/"M"/3…-3/962/"…}] initiating a split of this range at key /Table/59/2/"R"/1786 [r138] (manual)
I190205 02:44:40.460791 13643 storage/replica_command.go:244 [n1,s1,r138/1:/Table/59/{2/"R"/1…-3/962/"…}] initiating a split of this range at key /Table/59/2/"V"/4234 [r139] (manual)
I190205 02:44:40.470715 13662 storage/replica_command.go:244 [n1,s1,r139/1:/Table/59/{2/"V"/4…-3/962/"…}] initiating a split of this range at key /Table/59/3/78/"A"/PrefixEnd [r140] (manual)
I190205 02:44:40.480338 13708 storage/replica_command.go:244 [n1,s1,r140/1:/Table/59/3/{78/"A"…-962/"A…}] initiating a split of this range at key /Table/59/3/261/"B" [r141] (manual)
TestImportCSVStmt/empty-with-files
... 18047 storage/replica_command.go:244 [n2,s2,r215/2:/Table/6{5-7/1/4079}] initiating a split of this range at key /Table/67/1/741 [r231] (manual)
I190205 02:44:45.162106 18018 storage/replica_command.go:244 [n2,s2,r231/2:/Table/67/1/{741-4079}] initiating a split of this range at key /Table/67/1/1456 [r232] (manual)
I190205 02:44:45.198677 18072 storage/replica_command.go:244 [n2,s2,r232/2:/Table/67/1/{1456-4079}] initiating a split of this range at key /Table/67/1/2171 [r233] (manual)
I190205 02:44:45.217220 18064 storage/replica_command.go:244 [n3,s3,r216/3:/Table/67/{1/4079-2/"Q"/5…}] initiating a split of this range at key /Table/67/1/4794 [r49] (manual)
I190205 02:44:45.239743 18109 storage/replica_command.go:244 [n2,s2,r233/2:/Table/67/1/{2171-4079}] initiating a split of this range at key /Table/67/1/2886 [r234] (manual)
I190205 02:44:45.243343 18172 storage/replica_command.go:244 [n3,s3,r49/3:/Table/67/{1/4794-2/"Q"/5…}] initiating a split of this range at key /Table/67/2/"C"/2369 [r50] (manual)
I190205 02:44:45.261406 18263 storage/replica_command.go:244 [n3,s3,r50/3:/Table/67/2/"{C"/2369-Q"/563}] initiating a split of this range at key /Table/67/2/"F"/4738 [r51] (manual)
I190205 02:44:45.264000 18232 storage/replica_command.go:244 [n2,s2,r234/2:/Table/67/1/{2886-4079}] initiating a split of this range at key /Table/67/1/3364 [r235] (manual)
I190205 02:44:45.289094 18253 storage/replica_command.go:244 [n3,s3,r51/3:/Table/67/2/"{F"/4738-Q"/563}] initiating a split of this range at key /Table/67/2/"J"/2142 [r52] (manual)
I190205 02:44:45.298437 18359 storage/replica_command.go:244 [n1,s1,r220/1:/{Table/67/2/"…-Max}] initiating a split of this range at key /Table/67/2/"Y"/4158 [r222] (manual)
I190205 02:44:45.298446 18303 storage/replica_command.go:244 [n1,s1,r220/1:/{Table/67/2/"…-Max}] initiating a split of this range at key /Table/67/3/466/"Y"/PrefixEnd [r221] (manual)
I190205 02:44:45.318375 18377 storage/replica_command.go:244 [n3,s3,r52/3:/Table/67/2/"{J"/2142-Q"/563}] initiating a split of this range at key /Table/67/2/"M"/3106 [r53] (manual)
I190205 02:44:45.360776 18303 storage/replica_command.go:244 [n1,s1,r222/1:/{Table/67/2/"…-Max}] initiating a split of this range at key /Table/67/3/466/"Y"/PrefixEnd [r223] (manual)
I190205 02:44:45.384558 18303 storage/replica_command.go:244 [n1,s1,r223/1:/{Table/67/3/4…-Max}] initiating a split of this range at key /Table/67/3/1133/"P"/PrefixEnd [r224] (manual)
I190205 02:44:45.401951 18303 storage/replica_command.go:244 [n1,s1,r224/1:/{Table/67/3/1…-Max}] initiating a split of this range at key /Table/67/3/1253/"F" [r225] (manual)
I190205 02:44:45.610614 18519 storage/replica_command.go:244 [n1,s1,r224/1:/Table/67/3/1{133/"P…-253/"F"}] initiating a split of this range at key /Table/67/3/1919/"V"/PrefixEnd [r226] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 02:44:45.663376 18462 storage/replica_command.go:244 [n1,s1,r225/1:/{Table/67/3/1…-Max}] initiating a split of this range at key /Table/67/3/1919/"V"/PrefixEnd [r227] (manual)
I190205 02:44:45.680611 18527 storage/replica_command.go:244 [n1,s1,r227/1:/{Table/67/3/1…-Max}] initiating a split of this range at key /Table/67/3/2586/"M"/PrefixEnd [r228] (manual)
I190205 02:44:45.709568 18480 storage/replica_command.go:244 [n1,s1,r228/1:/{Table/67/3/2…-Max}] initiating a split of this range at key /Table/67/3/3253/"D"/PrefixEnd [r229] (manual)
I190205 02:44:45.723287 18505 storage/replica_command.go:244 [n1,s1,r229/1:/{Table/67/3/3…-Max}] initiating a split of this range at key /Table/67/3/3920/"U"/PrefixEnd [r230] (manual)
I190205 02:44:45.760119 18601 storage/replica_command.go:244 [n1,s1,r230/1:/{Table/67/3/3…-Max}] initiating a split of this range at key /Table/67/3/4587/"L"/PrefixEnd [r251] (manual)
I190205 02:44:45.777528 18540 storage/replica_command.go:244 [n1,s1,r251/1:/{Table/67/3/4…-Max}] initiating a split of this range at key /Table/68 [r252] (manual)
```
Please assign, take a look and update the issue accordingly.
```
I190205 06:15:47.128989 30995 storage/replica_command.go:383 [n2,merge,s2,r250/2:/Table/66/{2/"Y"/4…-3/3818/…}] initiating a merge of r242:/Table/66/3/{3818/"W"-4484/"M"/PrefixEnd} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=1] into this range (lhs+rhs has (size=110 KiB+18 KiB qps=12.61+0.00 --> 12.61qps) below threshold (size=128 KiB, qps=12.61))
I190205 06:15:47.533019 9667 storage/store.go:2669 [n1,s1,r250/1:/Table/66/{2/"Y"/4…-3/3818/…}] removing replica r242/1
I190205 06:15:47.538280 9910 storage/store.go:2669 [n2,s2,r250/2:/Table/66/{2/"Y"/4…-3/3818/…}] removing replica r242/2
I190205 06:15:47.543624 10192 storage/store.go:2669 [n3,s3,r250/3:/Table/66/{2/"Y"/4…-3/3818/…}] removing replica r242/3
I190205 06:15:47.660378 31042 storage/replica_command.go:383 [n2,merge,s2,r250/2:/Table/66/{2/"Y"/4…-3/4484/…}] initiating a merge of r249:/Table/6{6/3/4484/"M"/PrefixEnd-7} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=1] into this range (lhs+rhs has (size=128 KiB+14 KiB qps=12.61+0.00 --> 12.61qps) below threshold (size=142 KiB, qps=12.61))
I190205 06:15:47.883622 9932 storage/store.go:2669 [n2,s2,r250/2:/Table/66/{2/"Y"/4…-3/4484/…}] removing replica r249/2
I190205 06:15:47.884414 10060 storage/store.go:2669 [n3,s3,r250/3:/Table/66/{2/"Y"/4…-3/4484/…}] removing replica r249/3
I190205 06:15:47.889679 9670 storage/store.go:2669 [n1,s1,r250/1:/Table/66/{2/"Y"/4…-3/4484/…}] removing replica r249/1
I190205 06:15:47.907822 31116 storage/replica_command.go:244 [n2,s2,r271/2:/Table/68/2/"{S"/1917-X"/2338}] initiating a split of this range at key /Table/68/2/"T"/4881 [r272] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 06:15:47.988830 31190 storage/replica_command.go:383 [n2,merge,s2,r250/2:/Table/6{6/2/"Y"/…-7}] initiating a merge of r251:/Table/6{7-8/1/741} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=2] into this range (lhs+rhs has (size=142 KiB+19 KiB qps=16.88+0.00 --> 16.88qps) below threshold (size=162 KiB, qps=16.88))
I190205 06:15:48.239456 9890 storage/store.go:2669 [n2,s2,r250/2:/Table/6{6/2/"Y"/…-7}] removing replica r251/2
I190205 06:15:48.242316 10198 storage/store.go:2669 [n3,s3,r250/3:/Table/6{6/2/"Y"/…-7}] removing replica r251/3
I190205 06:15:48.249160 9632 storage/store.go:2669 [n1,s1,r250/1:/Table/6{6/2/"Y"/…-7}] removing replica r251/1
import_stmt_test.go:1180: job 8 did not match:
Description: "IMPORT TABLE csv8.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///csv/data-0', 'nodelocal:///csv/data-1', 'nodelocal:///csv/data-2', 'nodelocal:///csv/data-3', 'nodelocal:///csv/data-4') WITH decompress = 'auto'" != "IMPORT TABLE \"\".\"\".t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///csv/data-0-opts', 'nodelocal:///csv/data-1-opts', 'nodelocal:///csv/data-2-opts', 'nodelocal:///csv/data-3-opts', 'nodelocal:///csv/data-4-opts') WITH comment = '#', delimiter = '|', \"nullif\" = '', skip = '2', transform = 'nodelocal:///5'"
TestImportCSVStmt
...I190205 02:44:33.953772 7743 storage/replica_command.go:798 [n1,replicate,s1,r6/1:/Table/{SystemCon…-11}] change replicas (ADD_REPLICA (n2,s2):3): read existing descriptor r6:/Table/{SystemConfigSpan/Start-11} [(n1,s1):1, (n3,s3):2, next=3, gen=0]
I190205 02:44:33.959200 7743 storage/replica_raft.go:372 [n1,s1,r6/1:/Table/{SystemCon…-11}] proposing ADD_REPLICA((n2,s2):3): updated=[(n1,s1):1 (n3,s3):2 (n2,s2):3] next=4
I190205 02:44:33.961860 7743 storage/store_snapshot.go:762 [n1,replicate,s1,r20/1:/{Table/24-Max}] sending preemptive snapshot 7d0c6543 at applied index 16
I190205 02:44:33.962046 7743 storage/store_snapshot.go:805 [n1,replicate,s1,r20/1:/{Table/24-Max}] streamed snapshot to (n3,s3):?: kv pairs: 7, log entries: 6, rate-limit: 8.0 MiB/sec, 0.00s
I190205 02:44:33.963024 9109 storage/replica_raftstorage.go:805 [n3,s3,r20/?:{-}] applying preemptive snapshot at index 16 (id=7d0c6543, encoded size=1244, 1 rocksdb batches, 6 log entries)
I190205 02:44:33.963296 9109 storage/replica_raftstorage.go:811 [n3,s3,r20/?:/{Table/24-Max}] applied preemptive snapshot in 0ms [clear=0ms batch=0ms entries=0ms commit=0ms]
I190205 02:44:33.963663 7743 storage/replica_command.go:798 [n1,replicate,s1,r20/1:/{Table/24-Max}] change replicas (ADD_REPLICA (n3,s3):3): read existing descriptor r20:/{Table/24-Max} [(n1,s1):1, (n2,s2):2, next=3, gen=0]
I190205 02:44:33.969041 7743 storage/replica_raft.go:372 [n1,s1,r20/1:/{Table/24-Max}] proposing ADD_REPLICA((n3,s3):3): updated=[(n1,s1):1 (n2,s2):2 (n3,s3):3] next=4
I190205 02:44:33.971517 7743 storage/store_snapshot.go:762 [n1,replicate,s1,r12/1:/Table/1{6-7}] sending preemptive snapshot 373f58e7 at applied index 16
I190205 02:44:33.971686 7743 storage/store_snapshot.go:805 [n1,replicate,s1,r12/1:/Table/1{6-7}] streamed snapshot to (n3,s3):?: kv pairs: 7, log entries: 6, rate-limit: 8.0 MiB/sec, 0.00s
I190205 02:44:33.971993 9089 storage/replica_raftstorage.go:805 [n3,s3,r12/?:{-}] applying preemptive snapshot at index 16 (id=373f58e7, encoded size=1239, 1 rocksdb batches, 6 log entries)
I190205 02:44:33.973440 9089 storage/replica_raftstorage.go:811 [n3,s3,r12/?:/Table/1{6-7}] applied preemptive snapshot in 1ms [clear=0ms batch=0ms entries=0ms commit=1ms]
I190205 02:44:33.973794 7743 storage/replica_command.go:798 [n1,replicate,s1,r12/1:/Table/1{6-7}] change replicas (ADD_REPLICA (n3,s3):3): read existing descriptor r12:/Table/1{6-7} [(n1,s1):1, (n2,s2):2, next=3, gen=0]
I190205 02:44:33.976839 7743 storage/replica_raft.go:372 [n1,s1,r12/1:/Table/1{6-7}] proposing ADD_REPLICA((n3,s3):3): updated=[(n1,s1):1 (n2,s2):2 (n3,s3):3] next=4
I190205 02:44:33.978485 7743 storage/store_snapshot.go:762 [n1,replicate,s1,r3/1:/System/{NodeLive…-tsd}] sending preemptive snapshot a27e865b at applied index 41
I190205 02:44:33.979056 7743 storage/store_snapshot.go:805 [n1,replicate,s1,r3/1:/System/{NodeLive…-tsd}] streamed snapshot to (n2,s2):?: kv pairs: 35, log entries: 5, rate-limit: 8.0 MiB/sec, 0.00s
I190205 02:44:33.979429 9055 storage/replica_raftstorage.go:805 [n2,s2,r3/?:{-}] applying preemptive snapshot at index 41 (id=a27e865b, encoded size=85224, 1 rocksdb batches, 5 log entries)
I190205 02:44:33.980315 9055 storage/replica_raftstorage.go:811 [n2,s2,r3/?:/System/{NodeLive…-tsd}] applied preemptive snapshot in 1ms [clear=0ms batch=0ms entries=0ms commit=1ms]
I190205 02:44:33.980675 7743 storage/replica_command.go:798 [n1,replicate,s1,r3/1:/System/{NodeLive…-tsd}] change replicas (ADD_REPLICA (n2,s2):3): read existing descriptor r3:/System/{NodeLivenessMax-tsd} [(n1,s1):1, (n3,s3):2, next=3, gen=0]
I190205 02:44:33.983674 7743 storage/replica_raft.go:372 [n1,s1,r3/1:/System/{NodeLive…-tsd}] proposing ADD_REPLICA((n2,s2):3): updated=[(n1,s1):1 (n3,s3):2 (n2,s2):3] next=4
I190205 02:44:34.094919 9160 sql/event_log.go:135 [n1,client=127.0.0.1:35936,user=root] Event: "set_cluster_setting", target: 0, info: {SettingName:kv.import.batch_size Value:10KB User:root}
TestImportCSVStmt/schema-in-file-auto-decompress
...741 [r54] (manual)
I190205 02:44:46.269688 19388 storage/replica_command.go:244 [n3,s3,r54/3:/Table/69/1/{741-3915}] initiating a split of this range at key /Table/69/1/1456 [r55] (manual)
I190205 02:44:46.291830 19232 storage/replica_command.go:244 [n3,s3,r55/3:/Table/69/1/{1456-3915}] initiating a split of this range at key /Table/69/1/2171 [r56] (manual)
I190205 02:44:46.328957 19508 storage/replica_command.go:244 [n3,s3,r56/3:/Table/69/1/{2171-3915}] initiating a split of this range at key /Table/69/1/2886 [r57] (manual)
I190205 02:44:46.336499 19525 storage/replica_command.go:244 [n1,s1,r255/1:/{Table/69/1/4…-Max}] initiating a split of this range at key /Table/69/2/"G"/4505 [r256] (manual)
I190205 02:44:46.355570 19556 storage/replica_command.go:244 [n1,s1,r256/1:/{Table/69/2/"…-Max}] initiating a split of this range at key /Table/69/2/"K"/1935 [r257] (manual)
I190205 02:44:46.358942 19442 storage/replica_command.go:244 [n3,s3,r57/3:/Table/69/1/{2886-3915}] initiating a split of this range at key /Table/69/1/3200 [r58] (manual)
I190205 02:44:46.381063 19501 storage/replica_command.go:244 [n1,s1,r257/1:/{Table/69/2/"…-Max}] initiating a split of this range at key /Table/69/2/"N"/4382 [r258] (manual)
I190205 02:44:46.406300 19473 storage/replica_command.go:244 [n1,s1,r258/1:/{Table/69/2/"…-Max}] initiating a split of this range at key /Table/69/2/"R"/1864 [r259] (manual)
I190205 02:44:46.425865 19600 storage/replica_command.go:244 [n1,s1,r259/1:/{Table/69/2/"…-Max}] initiating a split of this range at key /Table/69/2/"U"/4311 [r260] (manual)
I190205 02:44:46.428159 19653 storage/replica_command.go:244 [n1,s1,r259/1:/{Table/69/2/"…-Max}] initiating a split of this range at key /Table/69/3/3530/"U"/PrefixEnd [r261] (manual)
I190205 02:44:46.432343 19655 storage/replica_command.go:244 [n2,s2,r255/2:/Table/69/{1/4630-2/"G"/4…}] initiating a split of this range at key /Table/69/2/"B"/3382 [r237] (manual)
I190205 02:44:46.458341 19617 storage/replica_command.go:244 [n1,s1,r260/1:/{Table/69/2/"…-Max}] initiating a split of this range at key /Table/69/2/"Y"/1793 [r262] (manual)
I190205 02:44:46.466008 19619 storage/replica_command.go:244 [n2,s2,r237/2:/Table/69/2/"{B"/3382-G"/4505}] initiating a split of this range at key /Table/69/2/"D"/2161 [r238] (manual)
I190205 02:44:46.478881 19750 storage/replica_command.go:244 [n1,s1,r262/1:/{Table/69/2/"…-Max}] initiating a split of this range at key /Table/69/3/376/"M"/PrefixEnd [r263] (manual)
I190205 02:44:46.536816 19784 storage/replica_command.go:244 [n1,s1,r263/1:/{Table/69/3/3…-Max}] initiating a split of this range at key /Table/69/3/3530/"U"/PrefixEnd [r264] (manual)
I190205 02:44:46.589741 19797 storage/replica_command.go:244 [n1,s1,r264/1:/{Table/69/3/3…-Max}] initiating a split of this range at key /Table/69/3/4197/"L"/PrefixEnd [r265] (manual)
I190205 02:44:46.601867 19801 storage/replica_command.go:244 [n1,s1,r265/1:/{Table/69/3/4…-Max}] initiating a split of this range at key /Table/69/3/4864/"C"/PrefixEnd [r266] (manual)
I190205 02:44:46.616931 19794 storage/replica_command.go:244 [n1,s1,r266/1:/{Table/69/3/4…-Max}] initiating a split of this range at key /Table/70 [r267] (manual)
I190205 02:44:46.670397 19819 storage/replica_command.go:244 [n2,s2,r263/2:/Table/69/3/3{76/"M"…-530/"U…}] initiating a split of this range at key /Table/69/3/1043/"D"/PrefixEnd [r239] (manual)
I190205 02:44:46.697864 19924 storage/replica_command.go:244 [n2,s2,r239/2:/Table/69/3/{1043/"…-3530/"…}] initiating a split of this range at key /Table/69/3/1710/"U"/PrefixEnd [r240] (manual)
I190205 02:44:46.734502 19940 storage/replica_command.go:244 [n2,s2,r240/2:/Table/69/3/{1710/"…-3530/"…}] initiating a split of this range at key /Table/69/3/2377/"L"/PrefixEnd [r241] (manual)
I190205 02:44:46.769686 19961 storage/replica_command.go:244 [n2,s2,r241/2:/Table/69/3/{2377/"…-3530/"…}] initiating a split of this range at key /Table/69/3/2864/"E" [r242] (manual)
TestImportCSVStmt/schema-in-query-transform-only
... 9654 storage/store.go:2669 [n1,s1,r212/1:/Table/61/2/"{O"/1314-Q"/2226}] removing replica r205/1
I190205 06:14:57.866125 9924 storage/store.go:2669 [n2,s2,r190/2:/Table/61/{1/4554-2/"E"/4…}] removing replica r189/2
I190205 06:14:57.866611 10059 storage/store.go:2669 [n3,s3,r190/3:/Table/61/{1/4554-2/"E"/4…}] removing replica r189/3
I190205 06:14:57.883059 9641 storage/store.go:2669 [n1,s1,r190/1:/Table/61/{1/4554-2/"E"/4…}] removing replica r189/1
I190205 06:14:58.077630 26078 storage/replica_command.go:383 [n3,merge,s3,r212/3:/Table/61/2/"{O"/1314-S"/4568}] initiating a merge of r216:/Table/61/2/"{S"/4568-W"/2025} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=0] into this range (lhs+rhs has (size=24 KiB+18 KiB qps=3.73+0.00 --> 3.73qps) below threshold (size=43 KiB, qps=3.73))
I190205 06:14:58.223857 26134 storage/replica_command.go:383 [n2,merge,s2,r207/2:/Table/61/2/"{G"/266-I"/892}] initiating a merge of r202:/Table/61/2/"{I"/892-L"/3314} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=1] into this range (lhs+rhs has (size=11 KiB+18 KiB qps=0.00+0.00 --> 0.00qps) below threshold (size=30 KiB, qps=0.00))
I190205 06:14:58.501033 9899 storage/store.go:2669 [n2,s2,r212/2:/Table/61/2/"{O"/1314-S"/4568}] removing replica r216/2
I190205 06:14:58.503815 10058 storage/store.go:2669 [n3,s3,r212/3:/Table/61/2/"{O"/1314-S"/4568}] removing replica r216/3
I190205 06:14:58.520903 9662 storage/store.go:2669 [n1,s1,r212/1:/Table/61/2/"{O"/1314-S"/4568}] removing replica r216/1
I190205 06:14:58.758322 9895 storage/store.go:2669 [n2,s2,r207/2:/Table/61/2/"{G"/266-I"/892}] removing replica r202/2
I190205 06:14:58.760517 9631 storage/store.go:2669 [n1,s1,r207/1:/Table/61/2/"{G"/266-I"/892}] removing replica r202/1
I190205 06:14:58.770387 10224 storage/store.go:2669 [n3,s3,r207/3:/Table/61/2/"{G"/266-I"/892}] removing replica r202/3
I190205 06:14:59.067032 26164 storage/replica_command.go:383 [n3,merge,s3,r185/3:/Table/61/1/{3179-4224}] initiating a merge of r167:/Table/61/1/42{24-64} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=1] into this range (lhs+rhs has (size=28 KiB+1.1 KiB qps=0.00+0.00 --> 0.00qps) below threshold (size=29 KiB, qps=0.00))
I190205 06:14:59.249000 26231 storage/replica_command.go:383 [n2,merge,s2,r207/2:/Table/61/2/"{G"/266-L"/3314}] initiating a merge of r203:/Table/61/2/"{L"/3314-O"/1314} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=3] into this range (lhs+rhs has (size=30 KiB+14 KiB qps=7.89+0.00 --> 7.89qps) below threshold (size=43 KiB, qps=7.89))
I190205 06:14:59.505789 10108 storage/store.go:2669 [n3,s3,r185/3:/Table/61/1/{3179-4224}] removing replica r167/3
I190205 06:14:59.510108 9895 storage/store.go:2669 [n2,s2,r185/2:/Table/61/1/{3179-4224}] removing replica r167/2
I190205 06:14:59.535740 9608 storage/store.go:2669 [n1,s1,r185/1:/Table/61/1/{3179-4224}] removing replica r167/1
I190205 06:14:59.705067 9910 storage/store.go:2669 [n2,s2,r207/2:/Table/61/2/"{G"/266-L"/3314}] removing replica r203/2
I190205 06:14:59.708682 10186 storage/store.go:2669 [n3,s3,r207/3:/Table/61/2/"{G"/266-L"/3314}] removing replica r203/3
I190205 06:14:59.713006 9662 storage/store.go:2669 [n1,s1,r207/1:/Table/61/2/"{G"/266-L"/3314}] removing replica r203/1
import_stmt_test.go:1180: job 5 did not match:
Description: "IMPORT TABLE \"\".\"\".t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///csv/data-0-opts', 'nodelocal:///csv/data-1-opts', 'nodelocal:///csv/data-2-opts', 'nodelocal:///csv/data-3-opts', 'nodelocal:///csv/data-4-opts') WITH comment = '#', delimiter = '|', \"nullif\" = '', skip = '2', transform = 'nodelocal:///5'" != "IMPORT TABLE csv3.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///csv/data-0-opts', 'nodelocal:///csv/data-1-opts', 'nodelocal:///csv/data-2-opts', 'nodelocal:///csv/data-3-opts', 'nodelocal:///csv/data-4-opts') WITH comment = '#', delimiter = '|', \"nullif\" = '', skip = '2'"
TestImportCSVStmt/empty-file
...e/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/2/"P"/1055 - /Table/61/2/"Q"/2226 that contains live data
I190205 06:15:06.741291 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/2/"Q"/2226 - /Table/61/2/"S"/4568 that contains live data
I190205 06:15:06.742447 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/2/"S"/4568 - /Table/61/2/"W"/2025 that contains live data
I190205 06:15:06.742810 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/2/"X"/4157 - /Table/61/3/264/"E" that contains live data
I190205 06:15:06.743494 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/3/731/"D" - /Table/61/3/1365/"N" that contains live data
I190205 06:15:06.744680 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/3/1365/"N" - /Table/61/3/2031/"D"/PrefixEnd that contains live data
I190205 06:15:06.745919 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/3/2522/"A" - /Table/61/3/3188/"Q"/PrefixEnd that contains live data
I190205 06:15:06.747192 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/3/3188/"Q"/PrefixEnd - /Table/61/3/3565/"D" that contains live data
I190205 06:15:06.747938 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/3/4117/"J" - /Table/61/3/4674/"U" that contains live data
I190205 06:15:06.749116 10238 storage/compactor/compactor.go:329 [n3,s3,compactor] purging suggested compaction for range /Table/61/3/4359/"R" - /Table/61/3/4674/"U" that contains live data
I190205 06:15:06.799087 10193 storage/store.go:2669 [n3,s3,r212/3:/Table/61/2/"{O"/1314-W"/2025}] removing replica r213/3
I190205 06:15:06.805015 9883 storage/store.go:2669 [n2,s2,r212/2:/Table/61/2/"{O"/1314-W"/2025}] removing replica r213/2
I190205 06:15:06.806335 9620 storage/store.go:2669 [n1,s1,r212/1:/Table/61/2/"{O"/1314-W"/2025}] removing replica r213/1
I190205 06:15:06.849427 27174 storage/replica_command.go:383 [n2,merge,s2,r166/2:/Table/61/1/2{146-861}] initiating a merge of r170:/Table/61/1/{2861-4554} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=3] into this range (lhs+rhs has (size=19 KiB+45 KiB qps=0.00+0.00 --> 0.00qps) below threshold (size=64 KiB, qps=0.00))
import_stmt_test.go:1180: job 6 did not match:
Description: "IMPORT TABLE csv6.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///empty.csv')" != "IMPORT TABLE csv4.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///csv/data-0', 'nodelocal:///csv/data-1', 'nodelocal:///csv/data-2', 'nodelocal:///csv/data-3', 'nodelocal:///csv/data-4') WITH sstsize = '10K'"
------- Stdout: -------
I190205 02:44:44.320484 9160 sql/event_log.go:135 [n1,client=127.0.0.1:35936,user=root] Event: "create_database", target: 64, info: {DatabaseName:csv6 Statement:CREATE DATABASE csv6 User:root}
I190205 02:44:44.327135 16970 storage/replica_consistency.go:139 [n1,consistencyChecker,s1,r4/1:/System/ts{d-e}] triggering stats recomputation to resolve delta of {ContainsEstimates:true LastUpdateNanos:1549334683671481933 IntentAge:0 GCBytesAge:0 LiveBytes:-14745 LiveCount:-1554 KeyBytes:-73641 KeyCount:-1554 ValBytes:58896 ValCount:-1554 IntentBytes:0 IntentCount:0 SysBytes:0 SysCount:0 XXX_NoUnkeyedLiteral:{} XXX_sizecache:0}
I190205 02:44:44.445887 16927 ccl/importccl/read_import_proc.go:75 [n2] could not fetch file size; falling back to per-file progress: <nil>
I190205 02:44:44.498623 17163 ccl/importccl/read_import_proc.go:75 [n2] could not fetch file size; falling back to per-file progress: <nil>
TestImportCSVStmt/schema-in-query-opts
..., 1.2 %gc (11x), 3.4 MiB/3.4 MiB (r/w)net
I190205 06:13:52.874143 18985 storage/replica_command.go:244 [n2,s2,r150/2:/{Table/59/2/"…-Max}] initiating a split of this range at key /Table/59/3/497/"D"/PrefixEnd [r152] (manual)
I190205 06:13:53.124901 18968 storage/replica_command.go:244 [n2,s2,r150/2:/{Table/59/2/"…-Max}] initiating a split of this range at key /Table/59/2/"F"/2579 [r153] (manual)
I190205 06:13:54.010856 19127 storage/replica_command.go:244 [n3,s3,r150/3:/Table/59/{2/"B"/3…-3/497/"…}] initiating a split of this range at key /Table/59/2/"J"/4924 [r103] (manual)
I190205 06:13:54.123120 19068 storage/replica_command.go:244 [n2,s2,r152/2:/{Table/59/3/4…-Max}] initiating a split of this range at key /Table/59/3/1199/"D"/PrefixEnd [r154] (manual)
I190205 06:13:54.233251 18493 storage/replica_command.go:244 [n3,s3,r150/3:/Table/59/{2/"B"/3…-3/497/"…}] initiating a split of this range at key /Table/59/2/"F"/2579 [r104] (manual)
I190205 06:13:54.902201 18493 storage/replica_command.go:244 [n3,s3,r150/3:/Table/59/2/"{B"/3954-J"/4924}] initiating a split of this range at key /Table/59/2/"F"/2579 [r105] (manual)
I190205 06:13:55.128472 19105 storage/replica_command.go:244 [n2,s2,r154/2:/{Table/59/3/1…-Max}] initiating a split of this range at key /Table/59/3/1901/"D"/PrefixEnd [r155] (manual)
I190205 06:13:55.238082 19150 storage/replica_command.go:244 [n3,s3,r103/3:/Table/59/{2/"J"/4…-3/497/"…}] initiating a split of this range at key /Table/59/2/"O"/4747 [r106] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 06:13:56.090108 19252 storage/replica_command.go:244 [n2,s2,r155/2:/{Table/59/3/1…-Max}] initiating a split of this range at key /Table/59/3/2603/"D"/PrefixEnd [r156] (manual)
I190205 06:13:56.307559 19318 storage/replica_command.go:244 [n3,s3,r106/3:/Table/59/{2/"O"/4…-3/497/"…}] initiating a split of this range at key /Table/59/2/"T"/2360 [r107] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 06:13:56.809474 19311 storage/replica_command.go:244 [n2,s2,r156/2:/{Table/59/3/2…-Max}] initiating a split of this range at key /Table/59/3/3305/"D"/PrefixEnd [r158] (manual)
I190205 06:13:56.894569 19331 storage/replica_command.go:244 [n2,s2,r156/2:/{Table/59/3/2…-Max}] initiating a split of this range at key /Table/59/3/4645/"R"/PrefixEnd [r157] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 06:13:57.093354 19418 storage/replica_command.go:244 [n3,s3,r107/3:/Table/59/{2/"T"/2…-3/497/"…}] initiating a split of this range at key /Table/59/2/"X"/4808 [r108] (manual)
I190205 06:13:57.482919 19438 storage/replica_command.go:244 [n2,s2,r158/2:/{Table/59/3/3…-Max}] initiating a split of this range at key /Table/59/3/4645/"R"/PrefixEnd [r160] (manual)
I190205 06:13:57.623473 19478 storage/replica_command.go:244 [n3,s3,r108/3:/Table/59/{2/"X"/4…-3/497/"…}] initiating a split of this range at key /Table/59/2/"Y"/3638 [r109] (manual)
I190205 06:13:57.657277 19437 storage/replica_command.go:244 [n2,s2,r158/2:/{Table/59/3/3…-Max}] initiating a split of this range at key /Table/59/3/3944/NULL [r159] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 06:13:57.966350 19437 storage/replica_command.go:244 [n2,s2,r158/2:/Table/59/3/{3305/"…-4645/"…}] initiating a split of this range at key /Table/59/3/3944/NULL [r161] (manual)
I190205 06:13:58.041480 19529 storage/replica_command.go:244 [n2,s2,r160/2:/{Table/59/3/4…-Max}] initiating a split of this range at key /Table/60 [r162] (manual)
import_stmt_test.go:1180: job 3 did not match:
Description: "IMPORT TABLE csv3.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///csv/data-0-opts', 'nodelocal:///csv/data-1-opts', 'nodelocal:///csv/data-2-opts', 'nodelocal:///csv/data-3-opts', 'nodelocal:///csv/data-4-opts') WITH comment = '#', delimiter = '|', \"nullif\" = '', skip = '2'" != "CREATE STATISTICS __auto__ FROM [53] AS OF SYSTEM TIME '-30s'"
TestImportCSVStmt/empty-with-files
...15:21.871779 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/61/3/264/"E" - /Table/61/3/3565/"D" that contains live data
I190205 06:15:21.872210 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/61/3/2031/"D"/PrefixEnd - /Table/61/3/3565/"D" that contains live data
I190205 06:15:21.872472 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/61/3/3565/"D" - /Table/64 that contains live data
I190205 06:15:21.872645 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/61/3/4674/"U" - /Table/64 that contains live data
I190205 06:15:21.876206 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/62 - /Max that contains live data
I190205 06:15:21.928277 28727 storage/replica_command.go:244 [n1,s1,r247/1:/Table/66/{2/"R"/4…-3/1624/…}] initiating a split of this range at key /Table/66/2/"V"/1790 [r248] (manual)
I190205 06:15:21.970677 9874 server/status/runtime.go:464 [n2] runtime stats: 1.1 GiB RSS, 677 goroutines, 52 MiB/55 MiB/137 MiB GO alloc/idle/total, 197 MiB/240 MiB CGO alloc/total, 6496.3 CGO/sec, 144.6/13.9 %(u/s)time, 0.8 %gc (10x), 3.7 MiB/3.7 MiB (r/w)net
I190205 06:15:22.362644 10240 gossip/gossip.go:557 [n3] gossip status (ok, 3 nodes)
gossip client (1/3 cur/max conns)
1: 127.0.0.1:35391 (3m0s: infos 594/1204 sent/received, bytes 166167B/419400B sent/received)
gossip server (0/3 cur/max conns, infos 594/1204 sent/received, bytes 166167B/419400B sent/received)
I190205 06:15:22.522772 28008 storage/replica_command.go:383 [n3,merge,s3,r110/3:/Table/61{-/1/4554}] initiating a merge of r190:/Table/6{1/1/4554-4} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=6] into this range (lhs+rhs has (size=120 KiB+284 KiB qps=1.20+0.00 --> 1.20qps) below threshold (size=404 KiB, qps=1.20))
I190205 06:15:22.793524 10171 server/status/runtime.go:464 [n3] runtime stats: 1.1 GiB RSS, 680 goroutines, 53 MiB/53 MiB/137 MiB GO alloc/idle/total, 197 MiB/240 MiB CGO alloc/total, 6776.9 CGO/sec, 143.8/13.7 %(u/s)time, 1.2 %gc (10x), 3.8 MiB/3.8 MiB (r/w)net
I190205 06:15:22.795696 28721 storage/replica_command.go:244 [n1,s1,r242/1:/{Table/66/3/3…-Max}] initiating a split of this range at key /Table/66/3/4484/"M"/PrefixEnd [r249] (manual)
I190205 06:15:22.878177 28776 storage/replica_command.go:244 [n1,s1,r248/1:/Table/66/{2/"V"/1…-3/1624/…}] initiating a split of this range at key /Table/66/2/"Y"/4237 [r250] (manual)
I190205 06:15:23.610461 10212 storage/store.go:2669 [n3,s3,r110/3:/Table/61{-/1/4554}] removing replica r190/3
I190205 06:15:23.615939 9913 storage/store.go:2669 [n2,s2,r110/2:/Table/61{-/1/4554}] removing replica r190/2
I190205 06:15:23.623197 9662 storage/store.go:2669 [n1,s1,r110/1:/Table/61{-/1/4554}] removing replica r190/1
I190205 06:15:23.694907 28008 storage/queue.go:912 [n3,merge] purgatory is now empty
I190205 06:15:23.713115 28802 storage/replica_command.go:244 [n1,s1,r249/1:/{Table/66/3/4…-Max}] initiating a split of this range at key /Table/67 [r251] (manual)
I190205 06:15:23.862117 28783 storage/replica_command.go:244 [n1,s1,r250/1:/Table/66/{2/"Y"/4…-3/1624/…}] initiating a split of this range at key /Table/66/3/470/"C"/PrefixEnd [r252] (manual)
I190205 06:15:25.100854 28859 storage/replica_command.go:244 [n1,s1,r252/1:/Table/66/3/{470/"C…-1624/"…}] initiating a split of this range at key /Table/66/3/958/"W" [r253] (manual)
import_stmt_test.go:1180: job 7 did not match:
Description: "IMPORT TABLE csv7.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///empty.csv', 'nodelocal:///csv/data-0', 'nodelocal:///csv/data-1', 'nodelocal:///csv/data-2', 'nodelocal:///csv/data-3', 'nodelocal:///csv/data-4')" != "CREATE STATISTICS __auto__ FROM [57] AS OF SYSTEM TIME '-30s'"
TestImportCSVStmt/schema-in-file-no-decompress
...58316 32648 storage/replica_command.go:244 [n1,s1,r287/1:/{Table/70/3/4…-Max}] initiating a split of this range at key /Table/70/3/2182/"Y"/PrefixEnd [r290] (manual)
I190205 06:16:02.177970 32643 storage/replica_command.go:244 [n1,s1,r288/1:/Table/70/{1/2171-2/"B"/3…}] initiating a split of this range at key /Table/70/1/2886 [r291] (manual)
I190205 06:16:02.609808 32468 storage/replica_command.go:244 [n1,s1,r289/1:/{Table/70/3/1…-Max}] initiating a split of this range at key /Table/70/3/1516/"I" [r292] (manual)
I190205 06:16:02.871254 10171 server/status/runtime.go:464 [n3] runtime stats: 1.3 GiB RSS, 690 goroutines, 57 MiB/48 MiB/137 MiB GO alloc/idle/total, 232 MiB/276 MiB CGO alloc/total, 4243.4 CGO/sec, 136.5/12.6 %(u/s)time, 1.3 %gc (10x), 3.7 MiB/3.7 MiB (r/w)net
I190205 06:16:02.932635 32700 storage/replica_command.go:244 [n1,s1,r289/1:/{Table/70/3/1…-Max}] initiating a split of this range at key /Table/70/3/2182/"Y"/PrefixEnd [r293] (manual)
W190205 06:16:03.166227 9876 server/node.go:869 [n2,summaries] health alerts detected: {Alerts:[{StoreID:2 Category:METRICS Description:queue.replicagc.process.failure Value:1 XXX_NoUnkeyedLiteral:{} XXX_sizecache:0}] XXX_NoUnkeyedLiteral:{} XXX_sizecache:0}
I190205 06:16:03.726772 32671 storage/replica_command.go:244 [n1,s1,r292/1:/{Table/70/3/1…-Max}] initiating a split of this range at key /Table/70/3/2182/"Y"/PrefixEnd [r294] (manual)
I190205 06:16:03.739747 32805 storage/replica_command.go:244 [n1,s1,r291/1:/Table/70/{1/2886-2/"B"/3…}] initiating a split of this range at key /Table/70/1/3601 [r295] (manual)
I190205 06:16:04.194676 32792 storage/replica_command.go:244 [n1,s1,r294/1:/{Table/70/3/2…-Max}] initiating a split of this range at key /Table/70/3/2776/"U" [r296] (manual)
I190205 06:16:04.806200 32853 storage/replica_command.go:244 [n1,s1,r295/1:/Table/70/{1/3601-2/"B"/3…}] initiating a split of this range at key /Table/70/1/4316 [r297] (manual)
I190205 06:16:05.615978 32866 storage/replica_command.go:244 [n1,s1,r297/1:/Table/70/{1/4316-2/"B"/3…}] initiating a split of this range at key /Table/70/1/4617 [r298] (manual)
I190205 06:16:06.168390 32946 storage/replica_command.go:244 [n1,s1,r281/1:/Table/70/2/"{H"/3101-Y"/3795}] initiating a split of this range at key /Table/70/2/"L"/532 [r299] (manual)
I190205 06:16:06.403008 32892 storage/replica_command.go:244 [n1,s1,r299/1:/Table/70/2/"{L"/532-Y"/3795}] initiating a split of this range at key /Table/70/2/"O"/2979 [r300] (manual)
I190205 06:16:06.645417 33000 storage/replica_command.go:244 [n1,s1,r300/1:/Table/70/2/"{O"/2979-Y"/3795}] initiating a split of this range at key /Table/70/2/"S"/461 [r301] (manual)
I190205 06:16:07.070317 33003 storage/replica_command.go:244 [n1,s1,r301/1:/Table/70/2/"{S"/461-Y"/3795}] initiating a split of this range at key /Table/70/2/"V"/1373 [r302] (manual)
I190205 06:16:07.567661 33066 storage/replica_command.go:244 [n1,s1,r296/1:/{Table/70/3/2…-Max}] initiating a split of this range at key /Table/70/3/3442/"K"/PrefixEnd [r303] (manual)
I190205 06:16:07.882628 33109 storage/replica_command.go:244 [n1,s1,r303/1:/{Table/70/3/3…-Max}] initiating a split of this range at key /Table/70/3/4109/"B"/PrefixEnd [r304] (manual)
I190205 06:16:08.188667 33028 storage/replica_command.go:244 [n1,s1,r304/1:/{Table/70/3/4…-Max}] initiating a split of this range at key /Table/70/3/4776/"S"/PrefixEnd [r305] (manual)
I190205 06:16:08.501875 33173 storage/replica_command.go:244 [n1,s1,r305/1:/{Table/70/3/4…-Max}] initiating a split of this range at key /Table/71 [r306] (manual)
import_stmt_test.go:1180: job 9 did not match:
Description: "IMPORT TABLE csv9.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///csv/data-0', 'nodelocal:///csv/data-1', 'nodelocal:///csv/data-2', 'nodelocal:///csv/data-3', 'nodelocal:///csv/data-4') WITH decompress = 'none'" != "CREATE STATISTICS __auto__ FROM [59] AS OF SYSTEM TIME '-30s'"
TestImportCSVStmt/schema-in-file-auto-gzip
...)
I190205 02:44:49.403935 23281 storage/replica_command.go:244 [n2,s2,r331/2:/Table/75/1/{741-4898}] initiating a split of this range at key /Table/75/1/1456 [r333] (manual)
I190205 02:44:49.404000 23432 storage/replica_command.go:244 [n1,s1,r343/1:/{Table/75/3/3…-Max}] initiating a split of this range at key /Table/75/3/3927/"B"/PrefixEnd [r344] (manual)
I190205 02:44:49.422037 23432 storage/replica_command.go:244 [n1,s1,r344/1:/{Table/75/3/3…-Max}] initiating a split of this range at key /Table/75/3/4594/"S"/PrefixEnd [r345] (manual)
I190205 02:44:49.433638 23620 storage/replica_command.go:244 [n2,s2,r333/2:/Table/75/1/{1456-4898}] initiating a split of this range at key /Table/75/1/2171 [r334] (manual)
I190205 02:44:49.434460 23432 storage/replica_command.go:244 [n1,s1,r345/1:/{Table/75/3/4…-Max}] initiating a split of this range at key /Table/75/3/4829/"T" [r346] (manual)
I190205 02:44:49.514207 23539 storage/replica_command.go:244 [n2,s2,r334/2:/Table/75/1/{2171-4898}] initiating a split of this range at key /Table/75/1/2886 [r335] (manual)
I190205 02:44:49.552268 23614 storage/replica_command.go:244 [n2,s2,r335/2:/Table/75/1/{2886-4898}] initiating a split of this range at key /Table/75/1/3601 [r336] (manual)
I190205 02:44:49.575151 23562 storage/replica_command.go:244 [n2,s2,r336/2:/Table/75/1/{3601-4898}] initiating a split of this range at key /Table/75/1/4183 [r337] (manual)
I190205 02:44:49.603039 23403 storage/replica_command.go:244 [n2,s2,r341/2:/Table/75/{2/"H"/4…-3/3260/…}] initiating a split of this range at key /Table/75/2/"L"/2196 [r332] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 02:44:49.627746 23403 storage/replica_command.go:244 [n2,s2,r332/2:/Table/75/{2/"L"/2…-3/3260/…}] initiating a split of this range at key /Table/75/2/"O"/4643 [r338] (manual)
I190205 02:44:49.630246 23736 storage/replica_command.go:244 [n2,s2,r332/2:/Table/75/{2/"L"/2…-3/3260/…}] initiating a split of this range at key /Table/75/3/1144/"A"/PrefixEnd [r339] (manual)
I190205 02:44:49.654969 23403 storage/replica_command.go:244 [n2,s2,r338/2:/Table/75/{2/"O"/4…-3/3260/…}] initiating a split of this range at key /Table/75/2/"S"/2125 [r340] (manual)
I190205 02:44:49.669974 23403 storage/replica_command.go:244 [n2,s2,r340/2:/Table/75/{2/"S"/2…-3/3260/…}] initiating a split of this range at key /Table/75/2/"V"/4572 [r361] (manual)
I190205 02:44:49.706106 23403 storage/replica_command.go:244 [n2,s2,r361/2:/Table/75/{2/"V"/4…-3/3260/…}] initiating a split of this range at key /Table/75/2/"Z"/2054 [r362] (manual)
I190205 02:44:49.909931 23892 storage/replica_command.go:244 [n2,s2,r361/2:/Table/75/2/"{V"/4572-Z"/2054}] initiating a split of this range at key /Table/75/3/1144/"A"/PrefixEnd [r363] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 02:44:49.945311 23403 storage/replica_command.go:244 [n2,s2,r362/2:/Table/75/{2/"Z"/2…-3/3260/…}] initiating a split of this range at key /Table/75/3/478/"K" [r364] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 02:44:49.968551 23932 storage/replica_command.go:244 [n2,s2,r364/2:/Table/75/3/{478/"K"-3260/"…}] initiating a split of this range at key /Table/75/3/1144/"A"/PrefixEnd [r365] (manual)
I190205 02:44:49.981685 23852 storage/replica_command.go:244 [n1,s1,r346/1:/{Table/75/3/4…-Max}] initiating a split of this range at key /Table/76 [r347] (manual)
I190205 02:44:49.993520 24000 storage/replica_command.go:244 [n2,s2,r365/2:/Table/75/3/{1144/"…-3260/"…}] initiating a split of this range at key /Table/75/3/1811/"R"/PrefixEnd [r366] (manual)
I190205 02:44:50.029724 24068 storage/replica_command.go:244 [n2,s2,r366/2:/Table/75/3/{1811/"…-3260/"…}] initiating a split of this range at key /Table/75/3/2478/"I"/PrefixEnd [r367] (manual)
I190205 02:44:50.057544 24041 storage/replica_command.go:244 [n2,s2,r367/2:/Table/75/3/{2478/"…-3260/"…}] initiating a split of this range at key /Table/75/3/2594/"U" [r368] (manual)
TestImportCSVStmt/schema-in-file-explicit-gzip
....910569 9879 storage/store.go:2669 [n2,s2,r273/2:/Table/68{-/1/3052}] removing replica r233/2
I190205 06:16:40.913259 10222 storage/store.go:2669 [n3,s3,r273/3:/Table/68{-/1/3052}] removing replica r233/3
I190205 06:16:40.936269 9637 storage/store.go:2669 [n1,s1,r273/1:/Table/68{-/1/3052}] removing replica r233/1
I190205 06:16:41.383515 36625 storage/replica_command.go:383 [n2,merge,s2,r273/2:/Table/68{-/2/"L"/1…}] initiating a merge of r239:/Table/{68/2/"L"/1988-70} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=3] into this range (lhs+rhs has (size=191 KiB+213 KiB qps=0.00+0.00 --> 0.00qps) below threshold (size=404 KiB, qps=0.00))
I190205 06:16:41.559355 36762 storage/replica_command.go:244 [n1,s1,r318/1:/{Table/72/3/3…-Max}] initiating a split of this range at key /Table/72/3/3992/"O"/PrefixEnd [r319] (manual)
I190205 06:16:41.789422 9883 storage/store.go:2669 [n2,s2,r273/2:/Table/68{-/2/"L"/1…}] removing replica r239/2
I190205 06:16:41.799294 9634 storage/store.go:2669 [n1,s1,r273/1:/Table/68{-/2/"L"/1…}] removing replica r239/1
I190205 06:16:41.800309 10065 storage/store.go:2669 [n3,s3,r273/3:/Table/68{-/2/"L"/1…}] removing replica r239/3
I190205 06:16:42.108469 9874 server/status/runtime.go:464 [n2] runtime stats: 1.3 GiB RSS, 669 goroutines, 68 MiB/36 MiB/137 MiB GO alloc/idle/total, 276 MiB/314 MiB CGO alloc/total, 6377.7 CGO/sec, 151.5/13.8 %(u/s)time, 1.8 %gc (10x), 3.1 MiB/3.1 MiB (r/w)net
I190205 06:16:42.169137 36762 storage/replica_command.go:244 [n1,s1,r319/1:/{Table/72/3/3…-Max}] initiating a split of this range at key /Table/72/3/4659/"F"/PrefixEnd [r320] (manual)
I190205 06:16:42.194380 36299 sql/event_log.go:135 [n1] Event: "create_statistics", target: 61, info: {StatisticName:__auto__ Statement:CREATE STATISTICS __auto__ FROM [61] AS OF SYSTEM TIME '-30s'}
I190205 06:16:42.596801 36762 storage/replica_command.go:244 [n1,s1,r320/1:/{Table/72/3/4…-Max}] initiating a split of this range at key /Table/73 [r331] (manual)
I190205 06:16:42.945888 10171 server/status/runtime.go:464 [n3] runtime stats: 1.3 GiB RSS, 667 goroutines, 87 MiB/21 MiB/137 MiB GO alloc/idle/total, 275 MiB/315 MiB CGO alloc/total, 6345.5 CGO/sec, 154.0/14.3 %(u/s)time, 1.8 %gc (10x), 3.1 MiB/3.1 MiB (r/w)net
I190205 06:16:43.285212 36839 storage/replica_command.go:383 [n1,merge,s1,r286/1:/Table/70/1/{1456-2171}] initiating a merge of r288:/Table/70/1/2{171-886} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=1] into this range (lhs+rhs has (size=19 KiB+19 KiB qps=0.00+0.00 --> 0.00qps) below threshold (size=38 KiB, qps=0.00))
W190205 06:16:43.399742 10173 server/node.go:869 [n3,summaries] health alerts detected: {Alerts:[{StoreID:3 Category:METRICS Description:queue.replicagc.process.failure Value:1 XXX_NoUnkeyedLiteral:{} XXX_sizecache:0}] XXX_NoUnkeyedLiteral:{} XXX_sizecache:0}
I190205 06:16:43.883124 9624 storage/store.go:2669 [n1,s1,r286/1:/Table/70/1/{1456-2171}] removing replica r288/1
I190205 06:16:43.896340 36935 storage/replica_command.go:383 [n3,merge,s3,r288/3:/Table/70/1/2{171-886}] initiating a merge of r291:/Table/70/1/{2886-3601} [(n1,s1):1, (n2,s2):2, (n3,s3):3, next=4, gen=1] into this range (lhs+rhs has (size=19 KiB+19 KiB qps=0.00+0.00 --> 0.00qps) below threshold (size=38 KiB, qps=0.00))
I190205 06:16:43.907335 9891 storage/store.go:2669 [n2,s2,r286/2:/Table/70/1/{1456-2171}] removing replica r288/2
I190205 06:16:43.909580 10068 storage/store.go:2669 [n3,s3,r286/3:/Table/70/1/{1456-2171}] removing replica r288/3
import_stmt_test.go:1180: job 10 did not match:
Description: "IMPORT TABLE csv10.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:////csv/data-0.gz', 'nodelocal:////csv/data-1.gz', 'nodelocal:////csv/data-2.gz', 'nodelocal:////csv/data-3.gz', 'nodelocal:////csv/data-4.gz') WITH decompress = 'gzip'" != "IMPORT TABLE csv6.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///empty.csv')"
TestImportCSVStmt/schema-in-file-sstsize
...2,s2,compactor] purging suggested compaction for range /Table/59/3/3305/"D"/PrefixEnd - /Table/59/3/3944/NULL that contains live data
I190205 06:14:34.574468 9868 storage/compactor/compactor.go:329 [n2,s2,compactor] purging suggested compaction for range /Table/59/3/3944/NULL - /Table/59/3/4645/"R"/PrefixEnd that contains live data
I190205 06:14:34.574774 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/55/1/3601 - /Table/57 that contains live data
I190205 06:14:34.574913 9868 storage/compactor/compactor.go:329 [n2,s2,compactor] purging suggested compaction for range /Table/59/3/4645/"R"/PrefixEnd - /Table/60 that contains live data
I190205 06:14:34.575401 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/57/1/4360 - /Table/57/3/1122/"E"/PrefixEnd that contains live data
I190205 06:14:34.575921 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/57/3/1122/"E"/PrefixEnd - /Table/59 that contains live data
I190205 06:14:34.576341 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/59/2/"X"/4808 - /Table/59/2/"Y"/3638 that contains live data
I190205 06:14:34.576856 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/59/2/"Y"/3638 - /Table/59/3/497/"D"/PrefixEnd that contains live data
I190205 06:14:34.577325 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/59/3/497/"D"/PrefixEnd - /Table/59/3/1199/"D"/PrefixEnd that contains live data
I190205 06:14:34.577894 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/59/3/1199/"D"/PrefixEnd - /Table/59/3/1901/"D"/PrefixEnd that contains live data
I190205 06:14:34.578378 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/59/3/1901/"D"/PrefixEnd - /Table/59/3/2603/"D"/PrefixEnd that contains live data
I190205 06:14:34.578865 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/59/3/3305/"D"/PrefixEnd - /Table/59/3/3944/NULL that contains live data
I190205 06:14:34.579422 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/59/3/3944/NULL - /Table/59/3/4645/"R"/PrefixEnd that contains live data
I190205 06:14:34.579908 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/59/3/4645/"R"/PrefixEnd - /Table/60 that contains live data
I190205 06:14:34.580286 9684 storage/compactor/compactor.go:329 [n1,s1,compactor] purging suggested compaction for range /Table/60 - /Table/61/1/741 that contains live data
I190205 06:14:34.575420 9868 storage/compactor/compactor.go:329 [n2,s2,compactor] purging suggested compaction for range /Table/60 - /Table/61/1/741 that contains live data
I190205 06:14:35.014982 9911 storage/store.go:2669 [n2,s2,r107/2:/Table/59/{2/"T"/2…-3/2603/…}] removing replica r156/2
I190205 06:14:35.017567 10111 storage/store.go:2669 [n3,s3,r107/3:/Table/59/{2/"T"/2…-3/2603/…}] removing replica r156/3
I190205 06:14:35.092082 9649 storage/store.go:2669 [n1,s1,r107/1:/Table/59/{2/"T"/2…-3/2603/…}] removing replica r156/1
I190205 06:14:36.634347 23897 storage/replica_command.go:244 [n1,s1,r220/1:/Table/6{1/3/4359…-2}] initiating a split of this range at key /Table/61/3/4674/"U" [r115] (manual)
import_stmt_test.go:1180: job 4 did not match:
Description: "IMPORT TABLE csv4.public.t (a INT8 PRIMARY KEY, b STRING, INDEX (b), INDEX (a, b)) CSV DATA ('nodelocal:///csv/data-0', 'nodelocal:///csv/data-1', 'nodelocal:///csv/data-2', 'nodelocal:///csv/data-3', 'nodelocal:///csv/data-4') WITH sstsize = '10K'" != "CREATE STATISTICS __auto__ FROM [55] AS OF SYSTEM TIME '-30s'"
TestImportCSVStmt/schema-in-file-no-decompress
... storage/replica_command.go:244 [n1,s1,r282/1:/{Table/71/1/3…-Max}] initiating a split of this range at key /Table/71/1/4413 [r284] (manual)
I190205 02:44:47.427740 20802 storage/replica_command.go:244 [n3,s3,r60/3:/Table/71/1/{741-2462}] initiating a split of this range at key /Table/71/1/1456 [r271] (manual)
I190205 02:44:47.433088 20821 storage/replica_command.go:244 [n1,s1,r283/1:/{Table/71/2/"…-Max}] initiating a split of this range at key /Table/71/2/"E"/473 [r285] (manual)
I190205 02:44:47.443421 20821 storage/replica_command.go:244 [n1,s1,r285/1:/{Table/71/2/"…-Max}] initiating a split of this range at key /Table/71/2/"H"/2842 [r286] (manual)
I190205 02:44:47.443774 20802 storage/replica_command.go:244 [n3,s3,r271/3:/Table/71/1/{1456-2462}] initiating a split of this range at key /Table/71/1/1747 [r272] (manual)
I190205 02:44:47.463767 20821 storage/replica_command.go:244 [n1,s1,r286/1:/{Table/71/2/"…-Max}] initiating a split of this range at key /Table/71/2/"L"/298 [r287] (manual)
I190205 02:44:47.478618 20973 storage/replica_command.go:244 [n3,s3,r282/3:/Table/71/{1/3892-2/"A"/3…}] initiating a split of this range at key /Table/71/1/4413 [r273] (manual)
I190205 02:44:47.480787 20821 storage/replica_command.go:244 [n1,s1,r287/1:/{Table/71/2/"…-Max}] initiating a split of this range at key /Table/71/2/"O"/2745 [r288] (manual)
I190205 02:44:47.500570 20821 storage/replica_command.go:244 [n1,s1,r288/1:/{Table/71/2/"…-Max}] initiating a split of this range at key /Table/71/2/"S"/227 [r289] (manual)
I190205 02:44:47.513324 20821 storage/replica_command.go:244 [n1,s1,r289/1:/{Table/71/2/"…-Max}] initiating a split of this range at key /Table/71/2/"V"/2674 [r291] (manual)
I190205 02:44:47.538534 20821 storage/replica_command.go:244 [n1,s1,r291/1:/{Table/71/2/"…-Max}] initiating a split of this range at key /Table/71/2/"V"/2751 [r292] (manual)
I190205 02:44:47.566995 21138 storage/replica_command.go:244 [n1,s1,r292/1:/{Table/71/2/"…-Max}] initiating a split of this range at key /Table/71/3/3498/"O"/PrefixEnd [r293] (manual)
I190205 02:44:47.584995 21177 storage/replica_command.go:244 [n1,s1,r293/1:/{Table/71/3/3…-Max}] initiating a split of this range at key /Table/71/3/4165/"F"/PrefixEnd [r294] (manual)
I190205 02:44:47.602691 21122 storage/replica_command.go:244 [n1,s1,r294/1:/{Table/71/3/4…-Max}] initiating a split of this range at key /Table/71/3/4832/"W"/PrefixEnd [r295] (manual)
I190205 02:44:47.669114 21198 storage/replica_command.go:244 [n1,s1,r295/1:/{Table/71/3/4…-Max}] initiating a split of this range at key /Table/72 [r296] (manual)
I190205 02:44:47.705378 21124 storage/replica_command.go:244 [n1,s1,r288/1:/Table/71/2/"{O"/2745-S"/227}] initiating a split of this range at key /Table/71/2/"Z"/208 [r290] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 02:44:47.764472 21076 storage/replica_command.go:244 [n3,s3,r292/3:/Table/71/{2/"V"/2…-3/3498/…}] initiating a split of this range at key /Table/71/2/"Z"/208 [r274] (manual)
I190205 02:44:47.782285 21076 storage/replica_command.go:244 [n3,s3,r274/3:/Table/71/{2/"Z"/2…-3/3498/…}] initiating a split of this range at key /Table/71/3/507/"N"/PrefixEnd [r275] (manual)
I190205 02:44:47.801201 21076 storage/replica_command.go:244 [n3,s3,r275/3:/Table/71/3/{507/"N…-3498/"…}] initiating a split of this range at key /Table/71/3/1174/"E"/PrefixEnd [r276] (manual)
I190205 02:44:47.825166 21076 storage/replica_command.go:244 [n3,s3,r276/3:/Table/71/3/{1174/"…-3498/"…}] initiating a split of this range at key /Table/71/3/1841/"V"/PrefixEnd [r277] (manual)
I190205 02:44:47.842974 21076 storage/replica_command.go:244 [n3,s3,r277/3:/Table/71/3/{1841/"…-3498/"…}] initiating a split of this range at key /Table/71/3/2508/"M"/PrefixEnd [r278] (manual)
I190205 02:44:47.857598 21076 storage/replica_command.go:244 [n3,s3,r278/3:/Table/71/3/{2508/"…-3498/"…}] initiating a split of this range at key /Table/71/3/2832/"Y" [r279] (manual)
TestImportCSVStmt/schema-in-query-opts
...I190205 02:44:40.146743 13411 storage/replica_command.go:244 [n1,s1,r119/1:/{Table/59/1/2…-Max}] initiating a split of this range at key /Table/59/1/3048 [r120] (manual)
I190205 02:44:40.157367 13311 storage/replica_command.go:244 [n1,s1,r120/1:/{Table/59/1/3…-Max}] initiating a split of this range at key /Table/59/1/3803 [r121] (manual)
I190205 02:44:40.166763 13299 storage/replica_command.go:244 [n1,s1,r121/1:/{Table/59/1/3…-Max}] initiating a split of this range at key /Table/59/1/4033 [r122] (manual)
I190205 02:44:40.172444 13337 storage/replica_command.go:244 [n1,s1,r122/1:/{Table/59/1/4…-Max}] initiating a split of this range at key /Table/59/1/4788 [r124] (manual)
I190205 02:44:40.184371 13337 storage/replica_command.go:244 [n1,s1,r124/1:/{Table/59/1/4…-Max}] initiating a split of this range at key /Table/59/2/NULL/2425 [r125] (manual)
I190205 02:44:40.192974 13337 storage/replica_command.go:244 [n1,s1,r125/1:/{Table/59/2/N…-Max}] initiating a split of this range at key /Table/59/2/"B"/1484 [r126] (manual)
I190205 02:44:40.194893 13514 storage/replica_command.go:244 [n1,s1,r125/1:/{Table/59/2/N…-Max}] initiating a split of this range at key /Table/59/3/962/"A"/PrefixEnd [r127] (manual)
I190205 02:44:40.201550 13337 storage/replica_command.go:244 [n1,s1,r126/1:/{Table/59/2/"…-Max}] initiating a split of this range at key /Table/59/2/"D"/2031 [r128] (manual)
I190205 02:44:40.250733 13547 storage/replica_command.go:244 [n1,s1,r128/1:/{Table/59/2/"…-Max}] initiating a split of this range at key /Table/59/3/962/"A"/PrefixEnd [r129] (manual)
I190205 02:44:40.261575 13530 storage/replica_command.go:244 [n1,s1,r129/1:/{Table/59/3/9…-Max}] initiating a split of this range at key /Table/59/3/1664/!NULL [r130] (manual)
I190205 02:44:40.271881 13443 storage/replica_command.go:244 [n1,s1,r130/1:/{Table/59/3/1…-Max}] initiating a split of this range at key /Table/59/3/2366/"A"/PrefixEnd [r131] (manual)
I190205 02:44:40.283266 13604 storage/replica_command.go:244 [n1,s1,r131/1:/{Table/59/3/2…-Max}] initiating a split of this range at key /Table/59/3/3068/!NULL [r132] (manual)
I190205 02:44:40.302432 13565 storage/replica_command.go:244 [n1,s1,r132/1:/{Table/59/3/3…-Max}] initiating a split of this range at key /Table/59/3/3770/"A"/PrefixEnd [r133] (manual)
I190205 02:44:40.312445 13630 storage/replica_command.go:244 [n1,s1,r133/1:/{Table/59/3/3…-Max}] initiating a split of this range at key /Table/59/3/4472/!NULL [r134] (manual)
I190205 02:44:40.323078 13581 storage/replica_command.go:244 [n1,s1,r134/1:/{Table/59/3/4…-Max}] initiating a split of this range at key /Table/60 [r135] (manual)
I190205 02:44:40.369520 13347 storage/replica_command.go:244 [n1,s1,r121/1:/Table/59/1/{3803-4033}] initiating a split of this range at key /Table/59/2/"H"/4376 [r123] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 02:44:40.430777 13571 storage/replica_command.go:244 [n1,s1,r128/1:/Table/59/{2/"D"/2…-3/962/"…}] initiating a split of this range at key /Table/59/2/"H"/4376 [r136] (manual)
I190205 02:44:40.441593 13599 storage/replica_command.go:244 [n1,s1,r136/1:/Table/59/{2/"H"/4…-3/962/"…}] initiating a split of this range at key /Table/59/2/"M"/3627 [r137] (manual)
I190205 02:44:40.451421 13686 storage/replica_command.go:244 [n1,s1,r137/1:/Table/59/{2/"M"/3…-3/962/"…}] initiating a split of this range at key /Table/59/2/"R"/1786 [r138] (manual)
I190205 02:44:40.460791 13643 storage/replica_command.go:244 [n1,s1,r138/1:/Table/59/{2/"R"/1…-3/962/"…}] initiating a split of this range at key /Table/59/2/"V"/4234 [r139] (manual)
I190205 02:44:40.470715 13662 storage/replica_command.go:244 [n1,s1,r139/1:/Table/59/{2/"V"/4…-3/962/"…}] initiating a split of this range at key /Table/59/3/78/"A"/PrefixEnd [r140] (manual)
I190205 02:44:40.480338 13708 storage/replica_command.go:244 [n1,s1,r140/1:/Table/59/3/{78/"A"…-962/"A…}] initiating a split of this range at key /Table/59/3/261/"B" [r141] (manual)
TestImportCSVStmt/empty-with-files
... 18047 storage/replica_command.go:244 [n2,s2,r215/2:/Table/6{5-7/1/4079}] initiating a split of this range at key /Table/67/1/741 [r231] (manual)
I190205 02:44:45.162106 18018 storage/replica_command.go:244 [n2,s2,r231/2:/Table/67/1/{741-4079}] initiating a split of this range at key /Table/67/1/1456 [r232] (manual)
I190205 02:44:45.198677 18072 storage/replica_command.go:244 [n2,s2,r232/2:/Table/67/1/{1456-4079}] initiating a split of this range at key /Table/67/1/2171 [r233] (manual)
I190205 02:44:45.217220 18064 storage/replica_command.go:244 [n3,s3,r216/3:/Table/67/{1/4079-2/"Q"/5…}] initiating a split of this range at key /Table/67/1/4794 [r49] (manual)
I190205 02:44:45.239743 18109 storage/replica_command.go:244 [n2,s2,r233/2:/Table/67/1/{2171-4079}] initiating a split of this range at key /Table/67/1/2886 [r234] (manual)
I190205 02:44:45.243343 18172 storage/replica_command.go:244 [n3,s3,r49/3:/Table/67/{1/4794-2/"Q"/5…}] initiating a split of this range at key /Table/67/2/"C"/2369 [r50] (manual)
I190205 02:44:45.261406 18263 storage/replica_command.go:244 [n3,s3,r50/3:/Table/67/2/"{C"/2369-Q"/563}] initiating a split of this range at key /Table/67/2/"F"/4738 [r51] (manual)
I190205 02:44:45.264000 18232 storage/replica_command.go:244 [n2,s2,r234/2:/Table/67/1/{2886-4079}] initiating a split of this range at key /Table/67/1/3364 [r235] (manual)
I190205 02:44:45.289094 18253 storage/replica_command.go:244 [n3,s3,r51/3:/Table/67/2/"{F"/4738-Q"/563}] initiating a split of this range at key /Table/67/2/"J"/2142 [r52] (manual)
I190205 02:44:45.298437 18359 storage/replica_command.go:244 [n1,s1,r220/1:/{Table/67/2/"…-Max}] initiating a split of this range at key /Table/67/2/"Y"/4158 [r222] (manual)
I190205 02:44:45.298446 18303 storage/replica_command.go:244 [n1,s1,r220/1:/{Table/67/2/"…-Max}] initiating a split of this range at key /Table/67/3/466/"Y"/PrefixEnd [r221] (manual)
I190205 02:44:45.318375 18377 storage/replica_command.go:244 [n3,s3,r52/3:/Table/67/2/"{J"/2142-Q"/563}] initiating a split of this range at key /Table/67/2/"M"/3106 [r53] (manual)
I190205 02:44:45.360776 18303 storage/replica_command.go:244 [n1,s1,r222/1:/{Table/67/2/"…-Max}] initiating a split of this range at key /Table/67/3/466/"Y"/PrefixEnd [r223] (manual)
I190205 02:44:45.384558 18303 storage/replica_command.go:244 [n1,s1,r223/1:/{Table/67/3/4…-Max}] initiating a split of this range at key /Table/67/3/1133/"P"/PrefixEnd [r224] (manual)
I190205 02:44:45.401951 18303 storage/replica_command.go:244 [n1,s1,r224/1:/{Table/67/3/1…-Max}] initiating a split of this range at key /Table/67/3/1253/"F" [r225] (manual)
I190205 02:44:45.610614 18519 storage/replica_command.go:244 [n1,s1,r224/1:/Table/67/3/1{133/"P…-253/"F"}] initiating a split of this range at key /Table/67/3/1919/"V"/PrefixEnd [r226] (manual); delayed split for 0.2s to avoid Raft snapshot
I190205 02:44:45.663376 18462 storage/replica_command.go:244 [n1,s1,r225/1:/{Table/67/3/1…-Max}] initiating a split of this range at key /Table/67/3/1919/"V"/PrefixEnd [r227] (manual)
I190205 02:44:45.680611 18527 storage/replica_command.go:244 [n1,s1,r227/1:/{Table/67/3/1…-Max}] initiating a split of this range at key /Table/67/3/2586/"M"/PrefixEnd [r228] (manual)
I190205 02:44:45.709568 18480 storage/replica_command.go:244 [n1,s1,r228/1:/{Table/67/3/2…-Max}] initiating a split of this range at key /Table/67/3/3253/"D"/PrefixEnd [r229] (manual)
I190205 02:44:45.723287 18505 storage/replica_command.go:244 [n1,s1,r229/1:/{Table/67/3/3…-Max}] initiating a split of this range at key /Table/67/3/3920/"U"/PrefixEnd [r230] (manual)
I190205 02:44:45.760119 18601 storage/replica_command.go:244 [n1,s1,r230/1:/{Table/67/3/3…-Max}] initiating a split of this range at key /Table/67/3/4587/"L"/PrefixEnd [r251] (manual)
I190205 02:44:45.777528 18540 storage/replica_command.go:244 [n1,s1,r251/1:/{Table/67/3/4…-Max}] initiating a split of this range at key /Table/68 [r252] (manual)
```
Please assign, take a look and update the issue accordingly.
testimportcsvstmt schema in file auto gzip paction for range table table that contains live data storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table p prefixend manual delayed split for to avoid raft snapshot storage replica command go initiating a split of this range at key table k prefixend manual storage replica command go initiating a split of this range at key table r manual storage replica command go initiating a split of this range at key table u manual storage replica command go initiating a split of this range at key table b prefixend manual storage replica command go initiating a split of this range at key table u manual delayed split for to avoid raft snapshot storage replica command go initiating a split of this range at key table u manual storage replica command go initiating a split of this range at key table s prefixend manual storage replica command go initiating a split of this range at key table j prefixend manual storage replica command go initiating a split of this range at key table y manual storage replica command go initiating a split of this range at key table a prefixend manual storage replica command go initiating a split of this range at key table s manual storage queue go unable to transfer lease to replica not lease holder current lease is repl seq start epo pro server status runtime go runtime stats gib rss goroutines mib mib mib go alloc idle total mib mib cgo alloc total cgo sec u s time gc mib mib r w net storage replica command go initiating a split of this range at key table manual delayed split for to avoid raft snapshot import stmt test go job did not match description import table public t a primary key b string index b index a b csv data nodelocal csv data gz nodelocal csv data gz nodelocal csv data gz nodelocal csv data gz nodelocal csv data gz with decompress auto import table public t a primary key b string index b index a b 
csv data nodelocal empty csv nodelocal csv data nodelocal csv data nodelocal csv data nodelocal csv data nodelocal csv data testimportcsvstmt e replica command go change replicas add replica read existing descriptor table storage replica raft go proposing add replica updated next storage store snapshot go sending preemptive snapshot at applied index storage store snapshot go streamed snapshot to kv pairs log entries rate limit mib sec storage replica raftstorage go applying preemptive snapshot at index id encoded size rocksdb batches log entries storage replica raftstorage go applied preemptive snapshot in storage replica command go change replicas add replica read existing descriptor system nodeliveness max storage replica raft go proposing add replica updated next storage store snapshot go sending preemptive snapshot at applied index storage store snapshot go streamed snapshot to kv pairs log entries rate limit mib sec storage replica raftstorage go applying preemptive snapshot at index id encoded size rocksdb batches log entries storage replica raftstorage go applied preemptive snapshot in storage replica command go change replicas add replica read existing descriptor system nodelivenessmax tsd storage replica raft go proposing add replica updated next storage store snapshot go sending preemptive snapshot at applied index storage store snapshot go streamed snapshot to kv pairs log entries rate limit mib sec storage replica raftstorage go applying preemptive snapshot at index id encoded size rocksdb batches log entries storage replica raftstorage go applied preemptive snapshot in storage replica command go change replicas add replica read existing descriptor table storage replica raft go proposing add replica updated next sql event log go event set cluster setting target info settingname kv import batch size value user root testimportcsvstmt schema in file implicit gzip storage replica command go initiating a split of this range at key table u prefixend manual 
storage replica command go initiating a split of this range at key table b manual storage replica command go initiating a split of this range at key table l prefixend manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table c prefixend manual storage replica command go initiating a split of this range at key table e manual storage replica command go initiating a split of this range at key table h manual storage replica command go initiating a split of this range at key table l manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table o manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table s manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table v manual storage replica command go initiating a split of this range at key table t prefixend manual storage replica command go initiating a split of this range at key table i prefixend manual storage replica command go initiating a split of this range at key table k prefixend manual storage replica command go initiating a split of this range at key table z manual storage replica command go initiating a split of this range at key table z manual storage replica command go initiating a split of this range at key table i prefixend manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table s manual testimportcsvstmt schema in file auto decompress size kib qps storage replica command go initiating a merge of table w m prefixend into this range lhs rhs has size kib kib qps below threshold size kib qps storage store go 
removing replica storage store go removing replica storage store go removing replica storage replica command go initiating a split of this range at key table s manual delayed split for to avoid raft snapshot storage replica command go initiating a merge of table w m prefixend into this range lhs rhs has size kib kib qps below threshold size kib qps storage store go removing replica storage store go removing replica storage store go removing replica storage replica command go initiating a merge of table m prefixend into this range lhs rhs has size kib kib qps below threshold size kib qps storage store go removing replica storage store go removing replica storage store go removing replica storage replica command go initiating a split of this range at key table t manual delayed split for to avoid raft snapshot storage replica command go initiating a merge of table into this range lhs rhs has size kib kib qps below threshold size kib qps storage store go removing replica storage store go removing replica storage store go removing replica import stmt test go job did not match description import table public t a primary key b string index b index a b csv data nodelocal csv data nodelocal csv data nodelocal csv data nodelocal csv data nodelocal csv data with decompress auto import table t a primary key b string index b index a b csv data nodelocal csv data opts nodelocal csv data opts nodelocal csv data opts nodelocal csv data opts nodelocal csv data opts with comment delimiter nullif skip transform nodelocal testimportcsvstmt storage replica command go change replicas add replica read existing descriptor table systemconfigspan start storage replica raft go proposing add replica updated next storage store snapshot go sending preemptive snapshot at applied index storage store snapshot go streamed snapshot to kv pairs log entries rate limit mib sec storage replica raftstorage go applying preemptive snapshot at index id encoded size rocksdb batches log entries storage 
replica raftstorage go applied preemptive snapshot in storage replica command go change replicas add replica read existing descriptor table max storage replica raft go proposing add replica updated next storage store snapshot go sending preemptive snapshot at applied index storage store snapshot go streamed snapshot to kv pairs log entries rate limit mib sec storage replica raftstorage go applying preemptive snapshot at index id encoded size rocksdb batches log entries storage replica raftstorage go applied preemptive snapshot in storage replica command go change replicas add replica read existing descriptor table storage replica raft go proposing add replica updated next storage store snapshot go sending preemptive snapshot at applied index storage store snapshot go streamed snapshot to kv pairs log entries rate limit mib sec storage replica raftstorage go applying preemptive snapshot at index id encoded size rocksdb batches log entries storage replica raftstorage go applied preemptive snapshot in storage replica command go change replicas add replica read existing descriptor system nodelivenessmax tsd storage replica raft go proposing add replica updated next sql event log go event set cluster setting target info settingname kv import batch size value user root testimportcsvstmt schema in file auto decompress manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table g manual storage replica command go initiating a split of this range at key table k manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table n manual storage replica command go initiating a split of this range at key table r manual storage 
replica command go initiating a split of this range at key table u manual storage replica command go initiating a split of this range at key table u prefixend manual storage replica command go initiating a split of this range at key table b manual storage replica command go initiating a split of this range at key table y manual storage replica command go initiating a split of this range at key table d manual storage replica command go initiating a split of this range at key table m prefixend manual storage replica command go initiating a split of this range at key table u prefixend manual storage replica command go initiating a split of this range at key table l prefixend manual storage replica command go initiating a split of this range at key table c prefixend manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table d prefixend manual storage replica command go initiating a split of this range at key table u prefixend manual storage replica command go initiating a split of this range at key table l prefixend manual storage replica command go initiating a split of this range at key table e manual testimportcsvstmt schema in query transform only storage store go removing replica storage store go removing replica storage store go removing replica storage store go removing replica storage replica command go initiating a merge of table s w into this range lhs rhs has size kib kib qps below threshold size kib qps storage replica command go initiating a merge of table i l into this range lhs rhs has size kib kib qps below threshold size kib qps storage store go removing replica storage store go removing replica storage store go removing replica storage store go removing replica storage store go removing replica storage store go removing replica storage replica command go initiating a merge of table into this range lhs rhs has size kib kib qps below threshold size kib 
qps storage replica command go initiating a merge of table l o into this range lhs rhs has size kib kib qps below threshold size kib qps storage store go removing replica storage store go removing replica storage store go removing replica storage store go removing replica storage store go removing replica storage store go removing replica import stmt test go job did not match description import table t a primary key b string index b index a b csv data nodelocal csv data opts nodelocal csv data opts nodelocal csv data opts nodelocal csv data opts nodelocal csv data opts with comment delimiter nullif skip transform nodelocal import table public t a primary key b string index b index a b csv data nodelocal csv data opts nodelocal csv data opts nodelocal csv data opts nodelocal csv data opts nodelocal csv data opts with comment delimiter nullif skip testimportcsvstmt empty file e compactor compactor go purging suggested compaction for range table p table q that contains live data storage compactor compactor go purging suggested compaction for range table q table s that contains live data storage compactor compactor go purging suggested compaction for range table s table w that contains live data storage compactor compactor go purging suggested compaction for range table x table e that contains live data storage compactor compactor go purging suggested compaction for range table d table n that contains live data storage compactor compactor go purging suggested compaction for range table n table d prefixend that contains live data storage compactor compactor go purging suggested compaction for range table a table q prefixend that contains live data storage compactor compactor go purging suggested compaction for range table q prefixend table d that contains live data storage compactor compactor go purging suggested compaction for range table j table u that contains live data storage compactor compactor go purging suggested compaction for range table r table u that 
contains live data storage store go removing replica storage store go removing replica storage store go removing replica storage replica command go initiating a merge of table into this range lhs rhs has size kib kib qps below threshold size kib qps import stmt test go job did not match description import table public t a primary key b string index b index a b csv data nodelocal empty csv import table public t a primary key b string index b index a b csv data nodelocal csv data nodelocal csv data nodelocal csv data nodelocal csv data nodelocal csv data with sstsize stdout sql event log go event create database target info databasename statement create database user root storage replica consistency go triggering stats recomputation to resolve delta of containsestimates true lastupdatenanos intentage gcbytesage livebytes livecount keybytes keycount valbytes valcount intentbytes intentcount sysbytes syscount xxx nounkeyedliteral xxx sizecache ccl importccl read import proc go could not fetch file size falling back to per file progress ccl importccl read import proc go could not fetch file size falling back to per file progress testimportcsvstmt schema in query opts gc mib mib r w net storage replica command go initiating a split of this range at key table d prefixend manual storage replica command go initiating a split of this range at key table f manual storage replica command go initiating a split of this range at key table j manual storage replica command go initiating a split of this range at key table d prefixend manual storage replica command go initiating a split of this range at key table f manual storage replica command go initiating a split of this range at key table f manual storage replica command go initiating a split of this range at key table d prefixend manual storage replica command go initiating a split of this range at key table o manual delayed split for to avoid raft snapshot storage replica command go initiating a split of this range at key table 
d prefixend manual storage replica command go initiating a split of this range at key table t manual delayed split for to avoid raft snapshot storage replica command go initiating a split of this range at key table d prefixend manual storage replica command go initiating a split of this range at key table r prefixend manual delayed split for to avoid raft snapshot storage replica command go initiating a split of this range at key table x manual storage replica command go initiating a split of this range at key table r prefixend manual storage replica command go initiating a split of this range at key table y manual storage replica command go initiating a split of this range at key table null manual delayed split for to avoid raft snapshot storage replica command go initiating a split of this range at key table null manual storage replica command go initiating a split of this range at key table manual import stmt test go job did not match description import table public t a primary key b string index b index a b csv data nodelocal csv data opts nodelocal csv data opts nodelocal csv data opts nodelocal csv data opts nodelocal csv data opts with comment delimiter nullif skip create statistics auto from as of system time testimportcsvstmt empty with files storage compactor compactor go purging suggested compaction for range table e table d that contains live data storage compactor compactor go purging suggested compaction for range table d prefixend table d that contains live data storage compactor compactor go purging suggested compaction for range table d table that contains live data storage compactor compactor go purging suggested compaction for range table u table that contains live data storage compactor compactor go purging suggested compaction for range table max that contains live data storage replica command go initiating a split of this range at key table v manual server status runtime go runtime stats gib rss goroutines mib mib mib go alloc idle total mib 
mib cgo alloc total cgo sec u s time gc mib mib r w net gossip gossip go gossip status ok nodes gossip client cur max conns infos sent received bytes sent received gossip server cur max conns infos sent received bytes sent received storage replica command go initiating a merge of table into this range lhs rhs has size kib kib qps below threshold size kib qps server status runtime go runtime stats gib rss goroutines mib mib mib go alloc idle total mib mib cgo alloc total cgo sec u s time gc mib mib r w net storage replica command go initiating a split of this range at key table m prefixend manual storage replica command go initiating a split of this range at key table y manual storage store go removing replica storage store go removing replica storage store go removing replica storage queue go purgatory is now empty storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table c prefixend manual storage replica command go initiating a split of this range at key table w manual import stmt test go job did not match description import table public t a primary key b string index b index a b csv data nodelocal empty csv nodelocal csv data nodelocal csv data nodelocal csv data nodelocal csv data nodelocal csv data create statistics auto from as of system time testimportcsvstmt empty file r go purging suggested compaction for range table b table e that contains live data storage compactor compactor go purging suggested compaction for range table e table e that contains live data storage compactor compactor go purging suggested compaction for range table e table g that contains live data storage compactor compactor go purging suggested compaction for range table i table l that contains live data storage compactor compactor go purging suggested compaction for range table l table o that contains live data storage compactor compactor go purging suggested compaction for range table p 
table q that contains live data storage compactor compactor go purging suggested compaction for range table q table s that contains live data storage compactor compactor go purging suggested compaction for range table s table w that contains live data storage compactor compactor go purging suggested compaction for range table x table e that contains live data storage compactor compactor go purging suggested compaction for range table d table n that contains live data storage compactor compactor go purging suggested compaction for range table n table d prefixend that contains live data storage compactor compactor go purging suggested compaction for range table a table q prefixend that contains live data storage compactor compactor go purging suggested compaction for range table q prefixend table d that contains live data storage compactor compactor go purging suggested compaction for range table j table u that contains live data storage compactor compactor go purging suggested compaction for range table r table u that contains live data storage store go removing replica storage store go removing replica storage store go removing replica storage replica command go initiating a merge of table into this range lhs rhs has size kib kib qps below threshold size kib qps import stmt test go job did not match description import table public t a primary key b string index b index a b csv data nodelocal empty csv import table public t a primary key b string index b index a b csv data nodelocal csv data nodelocal csv data nodelocal csv data nodelocal csv data nodelocal csv data with sstsize testimportcsvstmt schema in file no decompress storage replica command go initiating a split of this range at key table y prefixend manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table i manual server status runtime go runtime stats gib rss goroutines mib mib mib go alloc idle total mib 
mib cgo alloc total cgo sec u s time gc mib mib r w net storage replica command go initiating a split of this range at key table y prefixend manual server node go health alerts detected alerts xxx nounkeyedliteral xxx sizecache storage replica command go initiating a split of this range at key table y prefixend manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table u manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table l manual storage replica command go initiating a split of this range at key table o manual storage replica command go initiating a split of this range at key table s manual storage replica command go initiating a split of this range at key table v manual storage replica command go initiating a split of this range at key table k prefixend manual storage replica command go initiating a split of this range at key table b prefixend manual storage replica command go initiating a split of this range at key table s prefixend manual storage replica command go initiating a split of this range at key table manual import stmt test go job did not match description import table public t a primary key b string index b index a b csv data nodelocal csv data nodelocal csv data nodelocal csv data nodelocal csv data nodelocal csv data with decompress none create statistics auto from as of system time testimportcsvstmt schema in file auto gzip storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table b prefixend manual storage replica command go initiating a split of this range at key table s prefixend manual storage replica command go initiating a split of this range at key table manual 
storage replica command go initiating a split of this range at key table t manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table l manual delayed split for to avoid raft snapshot storage replica command go initiating a split of this range at key table o manual storage replica command go initiating a split of this range at key table a prefixend manual storage replica command go initiating a split of this range at key table s manual storage replica command go initiating a split of this range at key table v manual storage replica command go initiating a split of this range at key table z manual storage replica command go initiating a split of this range at key table a prefixend manual delayed split for to avoid raft snapshot storage replica command go initiating a split of this range at key table k manual delayed split for to avoid raft snapshot storage replica command go initiating a split of this range at key table a prefixend manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table r prefixend manual storage replica command go initiating a split of this range at key table i prefixend manual storage replica command go initiating a split of this range at key table u manual testimportcsvstmt schema in file explicit gzip storage store go removing replica storage store go removing replica storage store go removing replica storage replica command go initiating a merge of table l into this range lhs rhs has size kib kib qps below threshold size kib qps storage replica command go initiating a split of this range at key table o prefixend manual storage store go removing replica storage store go removing replica 
storage store go removing replica server status runtime go runtime stats gib rss goroutines mib mib mib go alloc idle total mib mib cgo alloc total cgo sec u s time gc mib mib r w net storage replica command go initiating a split of this range at key table f prefixend manual sql event log go event create statistics target info statisticname auto statement create statistics auto from as of system time storage replica command go initiating a split of this range at key table manual server status runtime go runtime stats gib rss goroutines mib mib mib go alloc idle total mib mib cgo alloc total cgo sec u s time gc mib mib r w net storage replica command go initiating a merge of table into this range lhs rhs has size kib kib qps below threshold size kib qps server node go health alerts detected alerts xxx nounkeyedliteral xxx sizecache storage store go removing replica storage replica command go initiating a merge of table into this range lhs rhs has size kib kib qps below threshold size kib qps storage store go removing replica storage store go removing replica import stmt test go job did not match description import table public t a primary key b string index b index a b csv data nodelocal csv data gz nodelocal csv data gz nodelocal csv data gz nodelocal csv data gz nodelocal csv data gz with decompress gzip import table public t a primary key b string index b index a b csv data nodelocal empty csv testimportcsvstmt schema in file sstsize compactor purging suggested compaction for range table d prefixend table null that contains live data storage compactor compactor go purging suggested compaction for range table null table r prefixend that contains live data storage compactor compactor go purging suggested compaction for range table table that contains live data storage compactor compactor go purging suggested compaction for range table r prefixend table that contains live data storage compactor compactor go purging suggested compaction for range table table e 
prefixend that contains live data storage compactor compactor go purging suggested compaction for range table e prefixend table that contains live data storage compactor compactor go purging suggested compaction for range table x table y that contains live data storage compactor compactor go purging suggested compaction for range table y table d prefixend that contains live data storage compactor compactor go purging suggested compaction for range table d prefixend table d prefixend that contains live data storage compactor compactor go purging suggested compaction for range table d prefixend table d prefixend that contains live data storage compactor compactor go purging suggested compaction for range table d prefixend table d prefixend that contains live data storage compactor compactor go purging suggested compaction for range table d prefixend table null that contains live data storage compactor compactor go purging suggested compaction for range table null table r prefixend that contains live data storage compactor compactor go purging suggested compaction for range table r prefixend table that contains live data storage compactor compactor go purging suggested compaction for range table table that contains live data storage compactor compactor go purging suggested compaction for range table table that contains live data storage store go removing replica storage store go removing replica storage store go removing replica storage replica command go initiating a split of this range at key table u manual import stmt test go job did not match description import table public t a primary key b string index b index a b csv data nodelocal csv data nodelocal csv data nodelocal csv data nodelocal csv data nodelocal csv data with sstsize create statistics auto from as of system time testimportcsvstmt schema in file no decompress storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at 
key table manual storage replica command go initiating a split of this range at key table e manual storage replica command go initiating a split of this range at key table h manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table l manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table o manual storage replica command go initiating a split of this range at key table s manual storage replica command go initiating a split of this range at key table v manual storage replica command go initiating a split of this range at key table v manual storage replica command go initiating a split of this range at key table o prefixend manual storage replica command go initiating a split of this range at key table f prefixend manual storage replica command go initiating a split of this range at key table w prefixend manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table z manual delayed split for to avoid raft snapshot storage replica command go initiating a split of this range at key table z manual storage replica command go initiating a split of this range at key table n prefixend manual storage replica command go initiating a split of this range at key table e prefixend manual storage replica command go initiating a split of this range at key table v prefixend manual storage replica command go initiating a split of this range at key table m prefixend manual storage replica command go initiating a split of this range at key table y manual testimportcsvstmt schema in query opts storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating 
a split of this range at key table manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table null manual storage replica command go initiating a split of this range at key table b manual storage replica command go initiating a split of this range at key table a prefixend manual storage replica command go initiating a split of this range at key table d manual storage replica command go initiating a split of this range at key table a prefixend manual storage replica command go initiating a split of this range at key table null manual storage replica command go initiating a split of this range at key table a prefixend manual storage replica command go initiating a split of this range at key table null manual storage replica command go initiating a split of this range at key table a prefixend manual storage replica command go initiating a split of this range at key table null manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table h manual delayed split for to avoid raft snapshot storage replica command go initiating a split of this range at key table h manual storage replica command go initiating a split of this range at key table m manual storage replica command go initiating a split of this range at key table r manual storage replica command go initiating a split of this range at key table v manual storage replica command go initiating a split of this range at key table a prefixend manual storage replica command go initiating a split of this range at key table b manual testimportcsvstmt empty with files storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table manual storage replica command 
go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table c manual storage replica command go initiating a split of this range at key table f manual storage replica command go initiating a split of this range at key table manual storage replica command go initiating a split of this range at key table j manual storage replica command go initiating a split of this range at key table y manual storage replica command go initiating a split of this range at key table y prefixend manual storage replica command go initiating a split of this range at key table m manual storage replica command go initiating a split of this range at key table y prefixend manual storage replica command go initiating a split of this range at key table p prefixend manual storage replica command go initiating a split of this range at key table f manual storage replica command go initiating a split of this range at key table v prefixend manual delayed split for to avoid raft snapshot storage replica command go initiating a split of this range at key table v prefixend manual storage replica command go initiating a split of this range at key table m prefixend manual storage replica command go initiating a split of this range at key table d prefixend manual storage replica command go initiating a split of this range at key table u prefixend manual storage replica command go initiating a split of this range at key table l prefixend manual storage replica command go initiating a split of this range at key table manual please assign take a look and update the issue accordingly | 0 |
110,196 | 9,438,917,751 | IssuesEvent | 2019-04-14 05:22:18 | alkemann/stone_soup_cosplay | https://api.github.com/repos/alkemann/stone_soup_cosplay | closed | Allow paragraphs in descriptions fields. | testing | Descriptions for some challenges etc. sometimes benefit from multiple paragraphs, or even bullets. See the first challenge in http://dcsscosplay.herokuapp.com/challenges/details?id=51 vs https://www.reddit.com/r/dcss/comments/alzylu/crawl_cosplay_challenge_set_3_week_3_grum_the/
For the Set 4 challenges I went and added some HTML to the descriptions where needed. However, you said in another comment that "having html in the db can quickly become a risk", so I gather that's not the best solution. (I can undo that immediately if you think it is best to do so.)
Other basic formatting (bullets, italics, etc.) would be *nice* but are considerably less important than separate paragraphs. | 1.0 | Allow paragraphs in descriptions fields. - Descriptions for some challenges etc. sometimes benefit from multiple paragraphs, or even bullets. See the first challenge in http://dcsscosplay.herokuapp.com/challenges/details?id=51 vs https://www.reddit.com/r/dcss/comments/alzylu/crawl_cosplay_challenge_set_3_week_3_grum_the/
For the Set 4 challenges I went and added some HTML to the descriptions where needed. However, you said in another comment that "having html in the db can quickly become a risk", so I gather that's not the best solution. (I can undo that immediately if you think it is best to do so.)
Other basic formatting (bullets, italics, etc.) would be *nice* but are considerably less important than separate paragraphs. | non_priority | allow paragraphs in descriptions fields descriptions for some challenges etc sometimes benefit from multiple paragraphs or even bullets see the first challenge in vs for the set challenges i went and added some html to the descriptions where needed however you said in another comment that having html in the db can quickly become a risk so i gather that s not the best solution i can undo that immediately if you think it is best to do so other basic formatting bullets italics etc would be nice but are considerably less important than separate paragraphs | 0 |
43,668 | 9,478,707,022 | IssuesEvent | 2019-04-20 00:44:13 | arades79/hyperdome | https://api.github.com/repos/arades79/hyperdome | opened | add TOR and counselor indicators | enhancement vestige code | onionshare used a status indicator for if the server was online. This should be repurposed to show that a user is properly connected to tor, and another similar indicator should show when a user is successfully connected to counsel. | 1.0 | add TOR and counselor indicators - onionshare used a status indicator for if the server was online. This should be repurposed to show that a user is properly connected to tor, and another similar indicator should show when a user is successfully connected to counsel. | non_priority | add tor and counselor indicators onionshare used a status indicator for if the server was online this should be repurposed to show that a user is properly connected to tor and another similar indicator should show when a user is successfully connected to counsel | 0 |
187,610 | 14,428,636,463 | IssuesEvent | 2020-12-06 10:45:40 | kalexmills/github-vet-tests-dec2020 | https://api.github.com/repos/kalexmills/github-vet-tests-dec2020 | closed | liambarkley/jenkins: jenkins_home/tools/org.jenkinsci.plugins.golang.GolangInstallation/go_1.8.3/src/sync/atomic/atomic_test.go; 21 LoC | fresh small test |
Found a possible issue in [liambarkley/jenkins](https://www.github.com/liambarkley/jenkins) at [jenkins_home/tools/org.jenkinsci.plugins.golang.GolangInstallation/go_1.8.3/src/sync/atomic/atomic_test.go](https://github.com/liambarkley/jenkins/blob/fe753e9b5089c2c8f088b5549490c1f4c5a8e9a6/jenkins_home/tools/org.jenkinsci.plugins.golang.GolangInstallation/go_1.8.3/src/sync/atomic/atomic_test.go#L886-L906)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> range-loop variable testf used in defer or goroutine at line 897
[Click here to see the code in its original context.](https://github.com/liambarkley/jenkins/blob/fe753e9b5089c2c8f088b5549490c1f4c5a8e9a6/jenkins_home/tools/org.jenkinsci.plugins.golang.GolangInstallation/go_1.8.3/src/sync/atomic/atomic_test.go#L886-L906)
<details>
<summary>Click here to show the 21 line(s) of Go which triggered the analyzer.</summary>
```go
for name, testf := range hammer32 {
c := make(chan int)
var val uint32
for i := 0; i < p; i++ {
go func() {
defer func() {
if err := recover(); err != nil {
t.Error(err.(string))
}
c <- 1
}()
testf(&val, n)
}()
}
for i := 0; i < p; i++ {
<-c
}
if !strings.HasPrefix(name, "Swap") && val != uint32(n)*p {
t.Fatalf("%s: val=%d want %d", name, val, n*p)
}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: fe753e9b5089c2c8f088b5549490c1f4c5a8e9a6
| 1.0 | liambarkley/jenkins: jenkins_home/tools/org.jenkinsci.plugins.golang.GolangInstallation/go_1.8.3/src/sync/atomic/atomic_test.go; 21 LoC -
Found a possible issue in [liambarkley/jenkins](https://www.github.com/liambarkley/jenkins) at [jenkins_home/tools/org.jenkinsci.plugins.golang.GolangInstallation/go_1.8.3/src/sync/atomic/atomic_test.go](https://github.com/liambarkley/jenkins/blob/fe753e9b5089c2c8f088b5549490c1f4c5a8e9a6/jenkins_home/tools/org.jenkinsci.plugins.golang.GolangInstallation/go_1.8.3/src/sync/atomic/atomic_test.go#L886-L906)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> range-loop variable testf used in defer or goroutine at line 897
[Click here to see the code in its original context.](https://github.com/liambarkley/jenkins/blob/fe753e9b5089c2c8f088b5549490c1f4c5a8e9a6/jenkins_home/tools/org.jenkinsci.plugins.golang.GolangInstallation/go_1.8.3/src/sync/atomic/atomic_test.go#L886-L906)
<details>
<summary>Click here to show the 21 line(s) of Go which triggered the analyzer.</summary>
```go
for name, testf := range hammer32 {
c := make(chan int)
var val uint32
for i := 0; i < p; i++ {
go func() {
defer func() {
if err := recover(); err != nil {
t.Error(err.(string))
}
c <- 1
}()
testf(&val, n)
}()
}
for i := 0; i < p; i++ {
<-c
}
if !strings.HasPrefix(name, "Swap") && val != uint32(n)*p {
t.Fatalf("%s: val=%d want %d", name, val, n*p)
}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: fe753e9b5089c2c8f088b5549490c1f4c5a8e9a6
| non_priority | liambarkley jenkins jenkins home tools org jenkinsci plugins golang golanginstallation go src sync atomic atomic test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message range loop variable testf used in defer or goroutine at line click here to show the line s of go which triggered the analyzer go for name testf range c make chan int var val for i i p i go func defer func if err recover err nil t error err string c testf val n for i i p i c if strings hasprefix name swap val n p t fatalf s val d want d name val n p leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id | 0 |
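The range-loop capture flagged in the record above is an instance of a general closure-in-loop pitfall, not something unique to Go. As an editorial illustration only (hypothetical code, not from the repository in the record), the same late-binding behavior can be shown in Python, where every closure created in a loop sees the loop variable's final value unless a per-iteration copy is made:

```python
def closures_buggy(n):
    # Every lambda closes over the same loop variable, so each call sees
    # the final value: the Python analogue of goroutines reading a shared
    # Go range variable after the loop has moved on.
    fns = [lambda: i for i in range(n)]
    return [f() for f in fns]

def closures_fixed(n):
    # Binding a default argument snapshots the loop variable per iteration,
    # the same idea as the Go idiom `name, testf := name, testf` inside the
    # loop body that the analyzer's finding calls for.
    fns = [lambda i=i: i for i in range(n)]
    return [f() for f in fns]

print(closures_buggy(3))  # [2, 2, 2]
print(closures_fixed(3))  # [0, 1, 2]
```

In the Go snippet quoted in the record, the equivalent fix is shadowing `testf` (and `name`) with per-iteration copies before starting the goroutines.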
72,482 | 7,299,375,994 | IssuesEvent | 2018-02-26 19:55:13 | eclipse/jetty.project | https://api.github.com/repos/eclipse/jetty.project | closed | JDK9 Test failure: org.eclipse.jetty.server.NotAcceptingTest.testLocalConnector | Java 9 Test | `jetty-9.4.x` branch.
Linux build.
```
Error Details
/four
Stack Trace
java.lang.AssertionError: /four
at org.eclipse.jetty.server.NotAcceptingTest.testLocalConnector(NotAcceptingTest.java:186)
Standard Error
[AdvancedRunner] Running org.eclipse.jetty.server.NotAcceptingTest.testLocalConnector()
2018-02-22 21:50:52.277:INFO:oejs.Server:main: jetty-9.4.9-SNAPSHOT; built: 2018-02-22T21:43:49Z; git: c751a8b1bbfdc73549bc9136f091b67270d9859e; jvm 1.8.0_152-b16
2018-02-22 21:50:52.278:INFO:oejs.AbstractConnector:main: Started LocalConnector@109fff4a{HTTP/1.1,[http/1.1]}
2018-02-22 21:50:52.279:INFO:oejs.Server:main: Started @172210ms
2018-02-22 21:50:52.281:INFO:oejs.AbstractConnector:main: Stopped LocalConnector@109fff4a{HTTP/1.1,[http/1.1]}
``` | 1.0 | JDK9 Test failure: org.eclipse.jetty.server.NotAcceptingTest.testLocalConnector - `jetty-9.4.x` branch.
Linux build.
```
Error Details
/four
Stack Trace
java.lang.AssertionError: /four
at org.eclipse.jetty.server.NotAcceptingTest.testLocalConnector(NotAcceptingTest.java:186)
Standard Error
[AdvancedRunner] Running org.eclipse.jetty.server.NotAcceptingTest.testLocalConnector()
2018-02-22 21:50:52.277:INFO:oejs.Server:main: jetty-9.4.9-SNAPSHOT; built: 2018-02-22T21:43:49Z; git: c751a8b1bbfdc73549bc9136f091b67270d9859e; jvm 1.8.0_152-b16
2018-02-22 21:50:52.278:INFO:oejs.AbstractConnector:main: Started LocalConnector@109fff4a{HTTP/1.1,[http/1.1]}
2018-02-22 21:50:52.279:INFO:oejs.Server:main: Started @172210ms
2018-02-22 21:50:52.281:INFO:oejs.AbstractConnector:main: Stopped LocalConnector@109fff4a{HTTP/1.1,[http/1.1]}
``` | non_priority | test failure org eclipse jetty server notacceptingtest testlocalconnector jetty x branch linux build error details four stack trace java lang assertionerror four at org eclipse jetty server notacceptingtest testlocalconnector notacceptingtest java standard error running org eclipse jetty server notacceptingtest testlocalconnector info oejs server main jetty snapshot built git jvm info oejs abstractconnector main started localconnector http info oejs server main started info oejs abstractconnector main stopped localconnector http | 0 |
161,008 | 20,120,382,530 | IssuesEvent | 2022-02-08 01:13:33 | arohablue/BlockDockServer | https://api.github.com/repos/arohablue/BlockDockServer | closed | CVE-2018-8014 (High) detected in tomcat-embed-core-8.5.15.jar - autoclosed | security vulnerability | ## CVE-2018-8014 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tomcat-embed-core-8.5.15.jar</b></p></summary>
<p>Core Tomcat implementation</p>
<p>Library home page: <a href="http://tomcat.apache.org/">http://tomcat.apache.org/</a></p>
<p>Path to dependency file: /BlockDockServer/build.gradle</p>
<p>Path to vulnerable library: /root/.gradle/caches/modules-2/files-2.1/org.apache.tomcat.embed/tomcat-embed-core/8.5.15/f197a93ae66212767b004fd93d7a1a8ea62bc3fa/tomcat-embed-core-8.5.15.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-tomcat-1.5.4.RELEASE.jar (Root Library)
- :x: **tomcat-embed-core-8.5.15.jar** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The defaults settings for the CORS filter provided in Apache Tomcat 9.0.0.M1 to 9.0.8, 8.5.0 to 8.5.31, 8.0.0.RC1 to 8.0.52, 7.0.41 to 7.0.88 are insecure and enable 'supportsCredentials' for all origins. It is expected that users of the CORS filter will have configured it appropriately for their environment rather than using it in the default configuration. Therefore, it is expected that most users will not be impacted by this issue.
<p>Publish Date: 2018-05-16
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-8014>CVE-2018-8014</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-8014">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-8014</a></p>
<p>Release Date: 2018-05-16</p>
<p>Fix Resolution: org.apache.tomcat.embed:tomcat-embed-core:9.0.10,8.5.32,8.0.53,7.0.90,org.apache.tomcat:tomcat-catalina:9.0.10,8.5.32,8.0.53,7.0.90</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2018-8014 (High) detected in tomcat-embed-core-8.5.15.jar - autoclosed - ## CVE-2018-8014 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tomcat-embed-core-8.5.15.jar</b></p></summary>
<p>Core Tomcat implementation</p>
<p>Library home page: <a href="http://tomcat.apache.org/">http://tomcat.apache.org/</a></p>
<p>Path to dependency file: /BlockDockServer/build.gradle</p>
<p>Path to vulnerable library: /root/.gradle/caches/modules-2/files-2.1/org.apache.tomcat.embed/tomcat-embed-core/8.5.15/f197a93ae66212767b004fd93d7a1a8ea62bc3fa/tomcat-embed-core-8.5.15.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-tomcat-1.5.4.RELEASE.jar (Root Library)
- :x: **tomcat-embed-core-8.5.15.jar** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The defaults settings for the CORS filter provided in Apache Tomcat 9.0.0.M1 to 9.0.8, 8.5.0 to 8.5.31, 8.0.0.RC1 to 8.0.52, 7.0.41 to 7.0.88 are insecure and enable 'supportsCredentials' for all origins. It is expected that users of the CORS filter will have configured it appropriately for their environment rather than using it in the default configuration. Therefore, it is expected that most users will not be impacted by this issue.
<p>Publish Date: 2018-05-16
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-8014>CVE-2018-8014</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-8014">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-8014</a></p>
<p>Release Date: 2018-05-16</p>
<p>Fix Resolution: org.apache.tomcat.embed:tomcat-embed-core:9.0.10,8.5.32,8.0.53,7.0.90,org.apache.tomcat:tomcat-catalina:9.0.10,8.5.32,8.0.53,7.0.90</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in tomcat embed core jar autoclosed cve high severity vulnerability vulnerable library tomcat embed core jar core tomcat implementation library home page a href path to dependency file blockdockserver build gradle path to vulnerable library root gradle caches modules files org apache tomcat embed tomcat embed core tomcat embed core jar dependency hierarchy spring boot starter tomcat release jar root library x tomcat embed core jar vulnerable library vulnerability details the defaults settings for the cors filter provided in apache tomcat to to to to are insecure and enable supportscredentials for all origins it is expected that users of the cors filter will have configured it appropriately for their environment rather than using it in the default configuration therefore it is expected that most users will not be impacted by this issue publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org apache tomcat embed tomcat embed core org apache tomcat tomcat catalina step up your open source security game with whitesource | 0 |
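The CVE in the record above concerns a CORS filter whose default enabled `supportsCredentials` for all origins. As a hedged, framework-agnostic sketch (a hypothetical helper, not Tomcat's actual filter code), the dangerous combination is reflecting arbitrary request origins while also allowing credentials:

```python
def cors_response_headers(request_origin, allowed_origins, supports_credentials):
    """Return the CORS headers a server would emit for `request_origin`.

    Reflecting any origin while also sending Access-Control-Allow-Credentials
    is the insecure default class described in CVE-2018-8014: any site can
    then make credentialed requests and read the responses.
    """
    if "*" in allowed_origins:
        # A literal "*" cannot be combined with credentials, so a permissive
        # filter reflects the caller's origin instead, trusting everyone.
        origin = request_origin if supports_credentials else "*"
    elif request_origin in allowed_origins:
        origin = request_origin
    else:
        return {}  # origin not allowed: emit no CORS headers at all

    headers = {"Access-Control-Allow-Origin": origin}
    if supports_credentials:
        headers["Access-Control-Allow-Credentials"] = "true"
    return headers
```

With `allowed_origins={"*"}` and `supports_credentials=True`, every caller gets its own origin reflected plus the credentials header, which is the configuration the advisory warns against; an explicit allow-list avoids it.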
60,891 | 14,595,994,034 | IssuesEvent | 2020-12-20 14:02:25 | dotnet/aspnetcore | https://api.github.com/repos/dotnet/aspnetcore | closed | Extend OpenIdConnect middleware with an option to skip TokenValidation, or alternative solutions. | area-security multi-tenancy | ### Is your feature request related to a problem? Please describe.
I had a specific scenario, in which I needed to modify on a per-connection basis the parameters being sent to the identity provider for every OIDC request. I tackled that problem by using one of the provided events - `OpenIdConnectOptions.Events.OnRedirectToIdentityProvider`.
Naturally, that required to also subscribe to another event, which was `OpenIdConnectOptions.Events.OnAuthorizationCodeReceived` as I was using the authorization code flow.
The issue, then, was that the `TokenValidationParameters` initially supplied in `OpenIdConnectOptions` were always used for token validation, but in my case those parameters were invalid: since every request could be modified for a possibly different authority/client, the issuer initially saved in those `TokenValidationParameters` was wrong.
Hence, I always got an exception stating that validation failed.
I fixed the problem by extracting the logic found in lines [691](https://github.com/dotnet/aspnetcore/blob/8a81194f372fa6fe63ded2d932d379955854d080/src/Security/Authentication/OpenIdConnect/src/OpenIdConnectHandler.cs#L691) to [758](https://github.com/dotnet/aspnetcore/blob/8a81194f372fa6fe63ded2d932d379955854d080/src/Security/Authentication/OpenIdConnect/src/OpenIdConnectHandler.cs#L758) of `OpenIdConnectHandler` with slight alterations to some of the private methods used in that segment, and making all of that, part of the method I used for `OpenIdConnectOptions.Events.OnAuthorizationCodeReceived`, and then building the `AuthenticationTicket` via a call to `AuthorizationCodeReceivedContext.Success()`, which short-circuits all of the logic in the handler below that point, as seen [here](https://github.com/dotnet/aspnetcore/blob/8a81194f372fa6fe63ded2d932d379955854d080/src/Security/Authentication/OpenIdConnect/src/OpenIdConnectHandler.cs#L657).
### Describe the solution you'd like
The core issue here is that the call to [ValidateToken](https://github.com/dotnet/aspnetcore/blob/8a81194f372fa6fe63ded2d932d379955854d080/src/Security/Authentication/OpenIdConnect/src/OpenIdConnectHandler.cs#L691) in `OpenIdConnectHandler` is mandatory, which in my specific use-case results in an invalid state.
There are various solutions to this issue:
- We could make the method `ClaimsPrincipal ValidateToken(string idToken, AuthenticationProperties properties, TokenValidationParameters validationParameters, out JwtSecurityToken jwt)` in [OpenIdConnectHandler](https://github.com/dotnet/aspnetcore/blob/master/src/Security/Authentication/OpenIdConnect/src/OpenIdConnectHandler.cs) be `protected virtual`, which would allow us to modify the passed in `validationParameters` by deriving from that class.
- We could add a `bool` property in `OpenIdConnectOptions`, which can be used to skip validating the token, or instead create a new event - `OnTokenValidation`, which similarly to other events, accepts an object that has a `HandledTokenValidation` property that signals if validation was handled, and validate it for the user if it wasn't.
### Additional context
If it comes down to actually fixing this issue, I'd be happy to create a pull request, be it my first time doing so! | True | Extend OpenIdConnect middleware with an option to skip TokenValidation, or alternative solutions. - ### Is your feature request related to a problem? Please describe.
I had a specific scenario, in which I needed to modify on a per-connection basis the parameters being sent to the identity provider for every OIDC request. I tackled that problem by using one of the provided events - `OpenIdConnectOptions.Events.OnRedirectToIdentityProvider`.
Naturally, that required to also subscribe to another event, which was `OpenIdConnectOptions.Events.OnAuthorizationCodeReceived` as I was using the authorization code flow.
The issue, then, was that the `TokenValidationParameters` initially supplied in `OpenIdConnectOptions` were always used for token validation, but in my case those parameters were invalid: since every request could be modified for a possibly different authority/client, the issuer initially saved in those `TokenValidationParameters` was wrong.
Hence, I always got an exception stating that validation failed.
I fixed the problem by extracting the logic found in lines [691](https://github.com/dotnet/aspnetcore/blob/8a81194f372fa6fe63ded2d932d379955854d080/src/Security/Authentication/OpenIdConnect/src/OpenIdConnectHandler.cs#L691) to [758](https://github.com/dotnet/aspnetcore/blob/8a81194f372fa6fe63ded2d932d379955854d080/src/Security/Authentication/OpenIdConnect/src/OpenIdConnectHandler.cs#L758) of `OpenIdConnectHandler` with slight alterations to some of the private methods used in that segment, and making all of that, part of the method I used for `OpenIdConnectOptions.Events.OnAuthorizationCodeReceived`, and then building the `AuthenticationTicket` via a call to `AuthorizationCodeReceivedContext.Success()`, which short-circuits all of the logic in the handler below that point, as seen [here](https://github.com/dotnet/aspnetcore/blob/8a81194f372fa6fe63ded2d932d379955854d080/src/Security/Authentication/OpenIdConnect/src/OpenIdConnectHandler.cs#L657).
### Describe the solution you'd like
The core issue here is that the call to [ValidateToken](https://github.com/dotnet/aspnetcore/blob/8a81194f372fa6fe63ded2d932d379955854d080/src/Security/Authentication/OpenIdConnect/src/OpenIdConnectHandler.cs#L691) in `OpenIdConnectHandler` is mandatory, which in my specific use-case results in an invalid state.
There are various solutions to this issue:
- We could make the method `ClaimsPrincipal ValidateToken(string idToken, AuthenticationProperties properties, TokenValidationParameters validationParameters, out JwtSecurityToken jwt)` in [OpenIdConnectHandler](https://github.com/dotnet/aspnetcore/blob/master/src/Security/Authentication/OpenIdConnect/src/OpenIdConnectHandler.cs) be `protected virtual`, which would allow us to modify the passed in `validationParameters` by deriving from that class.
- We could add a `bool` property in `OpenIdConnectOptions`, which can be used to skip validating the token, or instead create a new event - `OnTokenValidation`, which similarly to other events, accepts an object that has a `HandledTokenValidation` property that signals if validation was handled, and validate it for the user if it wasn't.
### Additional context
If it comes down to actually fixing this issue, I'd be happy to create a pull request, be it my first time doing so! | non_priority | extend openidconnect middleware with an option to skip tokenvalidation or alternative solutions is your feature request related to a problem please describe i had a specific scenario in which i needed to modify on a per connection basis the parameters being sent to the identity provider for every oidc request i tackled that problem by using one of the provided events openidconnectoptions events onredirecttoidentityprovider naturally that required to also subscribe to another event which was openidconnectoptions events onauthorizationcodereceived as i was using the authorization code flow the issue came then that the initially supplied configuration in openidconnectoptions which contained tokenvalidationparameters was always being used for token validations but in my case those parameters were invalid as every request was being modified for possibly a different authority client the issuer saved initially in those tokenvalidationparameters was invalid hence i always got an exception stating that validation failed i fixed the problem by extracting the logic found in lines to of openidconnecthandler with slight alterations to some of the private methods used in that segment and making all of that part of the method i used for openidconnectoptions events onauthorizationcodereceived and then building the authenticationticket via a call to authorizationcodereceivedcontext success which short circuits all of the logic in the handler below that point as seen describe the solution you d like the core issue here is that the call to in openidconnecthandler is mandatory which in my specific use case results in an invalid state there are various solutions to this issue we could make the method claimsprincipal validatetoken string idtoken authenticationproperties properties tokenvalidationparameters validationparameters out jwtsecuritytoken jwt in 
be protected virtual which would allow us to modify the passed in validationparameters by deriving from that class we could add a bool property in openidconnectoptions which can be used to skip validating the token or instead create a new event ontokenvalidation which similarly to other events accepts an object that has a handledtokenvalidation property that signals if validation was handled and validate it for the user if it wasn t additional context if it comes down to actually fixing this issue i d be happy to create a pull request be it my first time doing so | 0 |
62,651 | 15,306,632,696 | IssuesEvent | 2021-02-24 19:45:11 | tensorflow/tensorflow | https://api.github.com/repos/tensorflow/tensorflow | closed | TensorFlow 2.4 Contains References to nocopts, which is No Longer Compatible with Bazel | TF 2.4 comp:lite comp:micro stat:awaiting tensorflower type:build/install | In e5f8043742f927ed0e1711bb48f3e1a153b7a997 and ddde447e792231cdf83b435b0eeb59dd59bf4044, among other commits, support for `nocopts` was removed to enable the upgrade of Bazel to 1.0 and above.
However, there are still a few references to `nocopts` in the r2.4 branch:
```
$ git grep nocopts
tensorflow/lite/micro/testing/micro_test.bzl: nocopts = "",
tensorflow/lite/micro/testing/micro_test.bzl: nocopts: list of gcc compilation flags to remove for this rule
tensorflow/lite/micro/testing/micro_test.bzl: nocopts = nocopts,
tensorflow/tensorflow.bzl: # -fno-exceptions in nocopts breaks compilation if header modules are enabled.
tensorflow/tensorflow.bzl: # -fno-exceptions in nocopts breaks compilation if header modules are enabled.
```
These should probably be removed for consistency and to avoid confusion. | 1.0 | TensorFlow 2.4 Contains References to nocopts, which is No Longer Compatible with Bazel - In e5f8043742f927ed0e1711bb48f3e1a153b7a997 and ddde447e792231cdf83b435b0eeb59dd59bf4044, among other commits, support for `nocopts` was removed to enable the upgrade of Bazel to 1.0 and above.
However, there are still a few references to `nocopts` in the r2.4 branch:
```
$ git grep nocopts
tensorflow/lite/micro/testing/micro_test.bzl: nocopts = "",
tensorflow/lite/micro/testing/micro_test.bzl: nocopts: list of gcc compilation flags to remove for this rule
tensorflow/lite/micro/testing/micro_test.bzl: nocopts = nocopts,
tensorflow/tensorflow.bzl: # -fno-exceptions in nocopts breaks compilation if header modules are enabled.
tensorflow/tensorflow.bzl: # -fno-exceptions in nocopts breaks compilation if header modules are enabled.
```
These should probably be removed for consistency and to avoid confusion. | non_priority | tensorflow contains references to nocopts which is no longer compatible with bazel in and among other commits support for nocopts was removed to enable the upgrade of bazel to and above however there are still a few references to nocopts in the branch git grep nocopts tensorflow lite micro testing micro test bzl nocopts tensorflow lite micro testing micro test bzl nocopts list of gcc compilation flags to remove for this rule tensorflow lite micro testing micro test bzl nocopts nocopts tensorflow tensorflow bzl fno exceptions in nocopts breaks compilation if header modules are enabled tensorflow tensorflow bzl fno exceptions in nocopts breaks compilation if header modules are enabled these should probably be removed for consistency and to avoid confusion | 0 |
214,078 | 16,557,125,677 | IssuesEvent | 2021-05-28 15:06:39 | microsoft/vscode | https://api.github.com/repos/microsoft/vscode | opened | Test: file upload in web | testplan-item | Refs
- [ ] web
- [ ] web
Complexity: 2
---
A new "Upload..." command was added to the right-click menu of folders in the explorer.
**Test:**
* you can select 1-N files for upload
* the files appear after the upload has finished
* for large files you see progress in the status bar | 1.0 | Test: file upload in web - Refs
- [ ] web
- [ ] web
Complexity: 2
---
A new "Upload..." command was added to the right-click menu of folders in the explorer.
**Test:**
* you can select 1-N files for upload
* the files appear after the upload has finished
* for large files you see progress in the status bar | non_priority | test file upload in web refs web web complexity a new upload command was added to the right click menu of folders in the explorer test you can select n files for upload the files appear after the upload has finished for large files you see progress in the status bar | 0 |
130,552 | 12,439,603,811 | IssuesEvent | 2020-05-26 10:24:06 | luc-github/ESP3D | https://api.github.com/repos/luc-github/ESP3D | opened | [Question]Video capture/ editor for documentation / youtube | Documentation Feedback Welcome Help Welcome question | Hi,
With ESP3D 3.0 coming I am thinking to do some videos to present ESP3D 3.0 features, New webUI and how to configure them to post them on youtube.
I am not good at documentation, and even I do some, people do not read it - they prefere to watch videos, so I will give a try to videos support ...
I do not plan to do super fancy videos, create title for video, add text to give precisions, add background music (may be?) and do simple video editing, same for video capture, need to capture desktop area or some windows.
I do not plan to talk ( I have strong english accent ^_^) that is why I need to type text and show actions to be done, eventually split UI when action is done and show effect on printer/cnc and phone - like 2 videos side by side.
I am new to video world so I really need some suggestions for a free/handy solution to do video capture and video editor.
I have found VSCD Video editor (http://www.videosoftdev.com/) which seems pretty complete (desktop capture. webcam capture, video editor) but I do not manage it yet. It also has several tutorials to demonstrate how to create video, which for me is really necessary as neewbie
I am open to any suggestion from users who already do such things 😸
Thank you
| 1.0 | [Question]Video capture/ editor for documentation / youtube - Hi,
With ESP3D 3.0 coming I am thinking to do some videos to present ESP3D 3.0 features, New webUI and how to configure them to post them on youtube.
I am not good at documentation, and even I do some, people do not read it - they prefere to watch videos, so I will give a try to videos support ...
I do not plan to do super fancy videos, create title for video, add text to give precisions, add background music (may be?) and do simple video editing, same for video capture, need to capture desktop area or some windows.
I do not plan to talk ( I have strong english accent ^_^) that is why I need to type text and show actions to be done, eventually split UI when action is done and show effect on printer/cnc and phone - like 2 videos side by side.
I am new to video world so I really need some suggestions for a free/handy solution to do video capture and video editor.
I have found VSCD Video editor (http://www.videosoftdev.com/) which seems pretty complete (desktop capture. webcam capture, video editor) but I do not manage it yet. It also has several tutorials to demonstrate how to create video, which for me is really necessary as neewbie
I am open to any suggestion from users who already do such things 😸
Thank you
| non_priority | video capture editor for documentation youtube hi with coming i am thinking to do some videos to present features new webui and how to configure them to post them on youtube i am not good at documentation and even i do some people do not read it they prefere to watch videos so i will give a try to videos support i do not plan to do super fancy videos create title for video add text to give precisions add background music may be and do simple video editing same for video capture need to capture desktop area or some windows i do not plan to talk i have strong english accent that is why i need to type text and show actions to be done eventually split ui when action is done and show effect on printer cnc and phone like videos side by side i am new to video world so i really need some suggestions for a free handy solution to do video capture and video editor i have found vscd video editor which seems pretty complete desktop capture webcam capture video editor but i do not manage it yet it also has several tutorials to demonstrate how to create video which for me is really necessary as neewbie i am open to any suggestion from users who already do such things 😸 thank you | 0 |
312,709 | 23,440,023,719 | IssuesEvent | 2022-08-15 14:03:46 | nathan-rabet/linux-cryptFS-module | https://api.github.com/repos/nathan-rabet/linux-cryptFS-module | closed | `cryptFS` filesystem + VFS driver | documentation critical | # `cryptFS` project (+ Linux driver)
The goal of the project is now to create an **encrypted filesystem** that a **kernel module** decrypts at runtime if the user is authorized (if they provide the right password or hold _the right public key_).
## Part 1: `cryptFS` encrypted filesystem
The idea is to design a filesystem (like FAT32, NTFS, or AFS) that stores "file" and "directory" data just as ordinary filesystems do, with the sole difference that these files are **encrypted on write** and **decrypted on read** (by our kernel module).
We therefore need to design a **data structure** that supports this.
### Idea 1: symmetric-key scheme
A first idea is a symmetric-key scheme: the user enters a password through a third-party program which, if the password is correct, mounts our filesystem so the user can access it.
### Idea 2: asymmetric-key scheme
In this second idea, we could imagine an _RSA-like_ scheme where public keys stored in the filesystem are tied to the authentication of one or more users, letting a user decrypt the filesystem with their private key (the session password could, for example, serve as the _private key_).
### WARNING: CHECK what Linux offers for encryption
This sounds obvious... but it would be foolish to design an entire symmetric (or asymmetric) scheme if Linux only exposes asymmetric (or symmetric) encryption primitives.
### ANTI-WARNING: re-implementing an encryption scheme
This is a bit _overkill_, but worth mentioning: if the Linux primitives do not let us do what we want, we will have to reimplement encryption functions anyway (which I would rather avoid).
## Part 2: Kernel module bridging `cryptFS` <=> Linux Filesystem
Our module must be able to handle writes (encrypt) and reads (decrypt) once our filesystem is mounted.
## Part 3: Third-party program to mount our filesystem
This program would be an equivalent of the `mount` program and must ask the user for the information needed to mount the filesystem.
> This program might not exist if we choose an asymmetric encryption scheme (with keys tied to user authentication). | 1.0 | `cryptFS` filesystem + VFS driver - # `cryptFS` project (+ Linux driver)
The goal of the project is now to create an **encrypted filesystem** that a **kernel module** decrypts at runtime if the user is authorized (if they provide the right password or hold _the right public key_).
## Part 1: `cryptFS` encrypted filesystem
The idea is to design a filesystem (like FAT32, NTFS, or AFS) that stores "file" and "directory" data just as ordinary filesystems do, with the sole difference that these files are **encrypted on write** and **decrypted on read** (by our kernel module).
We therefore need to design a **data structure** that supports this.
### Idea 1: symmetric-key scheme
A first idea is a symmetric-key scheme: the user enters a password through a third-party program which, if the password is correct, mounts our filesystem so the user can access it.
### Idea 2: asymmetric-key scheme
In this second idea, we could imagine an _RSA-like_ scheme where public keys stored in the filesystem are tied to the authentication of one or more users, letting a user decrypt the filesystem with their private key (the session password could, for example, serve as the _private key_).
### WARNING: CHECK what Linux offers for encryption
This sounds obvious... but it would be foolish to design an entire symmetric (or asymmetric) scheme if Linux only exposes asymmetric (or symmetric) encryption primitives.
### ANTI-WARNING: re-implementing an encryption scheme
This is a bit _overkill_, but worth mentioning: if the Linux primitives do not let us do what we want, we will have to reimplement encryption functions anyway (which I would rather avoid).
## Part 2: Kernel module bridging `cryptFS` <=> Linux Filesystem
Our module must be able to handle writes (encrypt) and reads (decrypt) once our filesystem is mounted.
## Part 3: Third-party program to mount our filesystem
This program would be an equivalent of the `mount` program and must ask the user for the information needed to mount the filesystem.
> This program might not exist if we choose an asymmetric encryption scheme (with keys tied to user authentication). | non_priority | cryptfs filesystem vfs driver cryptfs project linux driver the goal of the project is now to create an encrypted filesystem that a kernel module decrypts at runtime if the user is authorized if they provide the right password or hold the right public key part cryptfs encrypted filesystem the idea is to design a filesystem like ntfs or afs that stores file and directory data just as ordinary filesystems do with the sole difference that these files are encrypted on write and decrypted on read by our kernel module we therefore need to design a data structure that supports this idea symmetric key scheme a first idea is a symmetric key scheme the user enters a password through a third party program which if the password is correct mounts our filesystem so the user can access it idea asymmetric key scheme in this second idea we could imagine an rsa like scheme where public keys stored in the filesystem are tied to the authentication of one or more users letting a user decrypt the filesystem with their private key the session password could for example serve as the private key warning check what linux offers for encryption this sounds obvious but it would be foolish to design an entire symmetric or asymmetric scheme if linux only exposes asymmetric or symmetric encryption primitives anti warning re implementing an encryption scheme this is a bit overkill but worth mentioning if the linux primitives do not let us do what we want we will have to reimplement encryption functions anyway which i would rather avoid part kernel module bridging cryptfs linux filesystem our module must be able to handle writes encrypt and reads decrypt once our filesystem is mounted part third party program to mount our filesystem this program would be an equivalent of the mount program and must ask the user for the information needed to mount the filesystem this program might not exist if we choose an asymmetric encryption scheme with keys tied to user authentication | 0
167,631 | 13,037,979,088 | IssuesEvent | 2020-07-28 14:34:42 | elastic/kibana | https://api.github.com/repos/elastic/kibana | closed | Failing test: UI Functional Tests.test/functional/apps/visualize/_tsvb_chart·js - visualize app visual builder Time Series should show the correct count in the legend | Team:KibanaApp failed-test | A test failed on a tracked branch
```
Error: expected '0' to equal '156'
at Assertion.assert (node_modules/expect.js/index.js:96:13)
at Assertion.be.Assertion.equal (node_modules/expect.js/index.js:216:10)
at Assertion.(anonymous function) [as be] (node_modules/expect.js/index.js:69:24)
at Context.<anonymous> (test/functional/apps/visualize/_tsvb_chart.js:38:32)
at process._tickCallback (internal/process/next_tick.js:68:7)
```
First failure: [Jenkins Build](https://kibana-ci.elastic.co/job/elastic+kibana+6.7/JOB=kibana-ciGroup12,node=immutable/88/)
<!-- kibanaCiData = {"failed-test":{"test.class":"UI Functional Tests.test/functional/apps/visualize/_tsvb_chart·js","test.name":"visualize app visual builder Time Series should show the correct count in the legend","test.failCount":29}} --> | 1.0 | Failing test: UI Functional Tests.test/functional/apps/visualize/_tsvb_chart·js - visualize app visual builder Time Series should show the correct count in the legend - A test failed on a tracked branch
```
Error: expected '0' to equal '156'
at Assertion.assert (node_modules/expect.js/index.js:96:13)
at Assertion.be.Assertion.equal (node_modules/expect.js/index.js:216:10)
at Assertion.(anonymous function) [as be] (node_modules/expect.js/index.js:69:24)
at Context.<anonymous> (test/functional/apps/visualize/_tsvb_chart.js:38:32)
at process._tickCallback (internal/process/next_tick.js:68:7)
```
First failure: [Jenkins Build](https://kibana-ci.elastic.co/job/elastic+kibana+6.7/JOB=kibana-ciGroup12,node=immutable/88/)
<!-- kibanaCiData = {"failed-test":{"test.class":"UI Functional Tests.test/functional/apps/visualize/_tsvb_chart·js","test.name":"visualize app visual builder Time Series should show the correct count in the legend","test.failCount":29}} --> | non_priority | failing test ui functional tests test functional apps visualize tsvb chart·js visualize app visual builder time series should show the correct count in the legend a test failed on a tracked branch error expected to equal at assertion assert node modules expect js index js at assertion be assertion equal node modules expect js index js at assertion anonymous function node modules expect js index js at context test functional apps visualize tsvb chart js at process tickcallback internal process next tick js first failure | 0 |
91,855 | 10,730,226,251 | IssuesEvent | 2019-10-28 17:00:37 | opentargets/platform | https://api.github.com/repos/opentargets/platform | closed | tractability buckets in the API response | Kind: Documentation Kind: Enhancement Topic: API Topic: JSON | We need to make the API response below more intelligible and change the bucket numbers (e.g. 4, 5, 7, 8) to something else or at least add an explanation following the numbers.
https://api.opentargets.io/v3/platform/private/target/ENSG00000142192
We could use the explanation that is available below (or a modified version of it):
https://docs.targetvalidation.org/getting-started/target-tractability
e.g.
Buckets | Small molecule | Monoclonal antibody
1 | Targets with drugs in phase IV | Targets with drugs in phase IV
etc...
<img width="553" alt="screen shot 2018-12-05 at 10 44 50" src="https://user-images.githubusercontent.com/6472381/49508614-0ac2bb00-f87b-11e8-80ea-8e60bffcd7a6.png">
| 1.0 | tractability buckets in the API response - We need to make the API response below more intelligible and change the bucket numbers (e.g. 4, 5, 7, 8) to something else or at least add an explanation following the numbers.
https://api.opentargets.io/v3/platform/private/target/ENSG00000142192
We could use the explanation that is available below (or a modified version of it):
https://docs.targetvalidation.org/getting-started/target-tractability
e.g.
Buckets | Small molecule | Monoclonal antibody
1 | Targets with drugs in phase IV | Targets with drugs in phase IV
etc...
<img width="553" alt="screen shot 2018-12-05 at 10 44 50" src="https://user-images.githubusercontent.com/6472381/49508614-0ac2bb00-f87b-11e8-80ea-8e60bffcd7a6.png">
| non_priority | tractability buckets in the api response we need to make the api response below more intelligible and change the bucket numbers e g to something else or at least add an explanation following the numbers we could use the explanation that is available below or a modified version of it e g buckets small molecule monoclonal antibody targets with drugs in phase iv targets with drugs in phase iv etc img width alt screen shot at src | 0 |
318,957 | 27,334,925,419 | IssuesEvent | 2023-02-26 04:07:28 | CLADevs/VanillaX | https://api.github.com/repos/CLADevs/VanillaX | reopened | Enchanting table & Anvil | bug not tested | Enchantement table ans anvil doesn’t work, when we try to use it, items will removed | 1.0 | Enchanting table & Anvil - Enchantement table ans anvil doesn’t work, when we try to use it, items will removed | non_priority | enchanting table anvil enchantement table ans anvil doesn’t work when we try to use it items will removed | 0 |
103,952 | 13,011,732,206 | IssuesEvent | 2020-07-25 00:58:05 | microsoft/vscode-remote-release | https://api.github.com/repos/microsoft/vscode-remote-release | closed | `build.args` not passed to `docker-compose` build | *as-designed containers | <!-- Please search existing issues to avoid creating duplicates. -->
<!-- Also please test using the latest insiders build to make sure your issue has not already been fixed: https://code.visualstudio.com/insiders/ -->
- VSCode Version: 1.47.2 17299e413d5590b14ab0340ea477cdd86ff13daf x64
- Local OS Version: macOS 10.15.5 (19F101)
- Remote OS Version: python:2.7.18-slim (Debian)
- Remote Extension/Connection Type: Docker
Steps to Reproduce:
1. Create `docker-compose.yml` file with a service that has a local build
2. Create `Dockerfile` with an `ARG`, for that build
3. Add a check for the `ARG` e.g. `RUN : "${FOO?Missing foo}"`
4. Add variable to global shell environment for user e.g. `~/.zshrc`
5. In `.devcontainer/devcontainer.json` add `build.args` - `"build":{"args":{"FOO":"${localEnv:FOO}"}},`
6. Open project folder in a container, specifying an existing `docker-compose.yml`
<!-- Check to see if the problem is general, with a specific extension, or only happens when remote -->
Does this issue occur when you try this locally?: Yes
Does this issue occur when you try this locally and all extensions are disabled?: Yes
The logs show that the `docker-compose` build is done incidentally via the `up` command - the logs show the command:
```
~/Library/Application Support/Code/logs/20200101T000000/exthost1/ms-vscode-remote.remote-containers/remoteContainers.log:[2020-01-01T00:00:00.000Z] [PID 1] [20 ms] Start: Run: docker-compose --project-name myproject -f ~/myproject/docker-compose.yml -f ~/myproject/.devcontainer/docker-compose.yml up -d --build
```
The `build.args` don't seem to be passed to `docker-compose` and this may be because the build phase is not separated - the `up` command doesn't support the `--build-arg` flag. | 1.0 | `build.args` not passed to `docker-compose` build - <!-- Please search existing issues to avoid creating duplicates. -->
<!-- Also please test using the latest insiders build to make sure your issue has not already been fixed: https://code.visualstudio.com/insiders/ -->
- VSCode Version: 1.47.2 17299e413d5590b14ab0340ea477cdd86ff13daf x64
- Local OS Version: macOS 10.15.5 (19F101)
- Remote OS Version: python:2.7.18-slim (Debian)
- Remote Extension/Connection Type: Docker
Steps to Reproduce:
1. Create `docker-compose.yml` file with a service that has a local build
2. Create `Dockerfile` with an `ARG`, for that build
3. Add a check for the `ARG` e.g. `RUN : "${FOO?Missing foo}"`
4. Add variable to global shell environment for user e.g. `~/.zshrc`
5. In `.devcontainer/devcontainer.json` add `build.args` - `"build":{"args":{"FOO":"${localEnv:FOO}"}},`
6. Open project folder in a container, specifying an existing `docker-compose.yml`
<!-- Check to see if the problem is general, with a specific extension, or only happens when remote -->
Does this issue occur when you try this locally?: Yes
Does this issue occur when you try this locally and all extensions are disabled?: Yes
The logs show that the `docker-compose` build is done incidentally via the `up` command - the logs show the command:
```
~/Library/Application Support/Code/logs/20200101T000000/exthost1/ms-vscode-remote.remote-containers/remoteContainers.log:[2020-01-01T00:00:00.000Z] [PID 1] [20 ms] Start: Run: docker-compose --project-name myproject -f ~/myproject/docker-compose.yml -f ~/myproject/.devcontainer/docker-compose.yml up -d --build
```
The `build.args` don't seem to be passed to `docker-compose` and this may be because the build phase is not separated - the `up` command doesn't support the `--build-arg` flag. | non_priority | build args not passed to docker compose build vscode version local os version macos remote os version python slim debian remote extension connection type docker steps to reproduce create docker compose yml file with a service that has a local build create dockerfile with an arg for that build add a check for the arg e g run foo missing foo add variable to global shell environment for user e g zshrc in devcontainer devcontainer json add build args build args foo localenv foo open project folder in a container specifying an existing docker compose yml does this issue occur when you try this locally yes does this issue occur when you try this locally and all extensions are disabled yes the logs show that the docker compose build is done incidentally via the up command the logs show the command library application support code logs ms vscode remote remote containers remotecontainers log start run docker compose project name myproject f myproject docker compose yml f myproject devcontainer docker compose yml up d build the build args don t seem to be passed to docker compose and this may be because the build phase is not separated the up command doesn t support the build arg flag | 0 |
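The failure mode in the record above comes down to the Dockerfile guard `RUN : "${FOO?Missing foo}"`. As a small sketch (reusing the record's own `FOO` name, outside of any Docker build), plain shell shows how that parameter expansion behaves with and without the build arg, which is why the image build aborts when `build.args` never reaches `docker-compose`:

```shell
# Demonstrates the ${VAR?message} guard the record's Dockerfile uses
# (RUN : "${FOO?Missing foo}"). When the build arg is never forwarded,
# FOO is unset inside the build and the guard aborts the RUN step.
if (unset FOO; : "${FOO?Missing foo}") 2>/dev/null; then
  echo "arg present"
else
  echo "build would fail: Missing foo"   # runs when FOO is unset
fi

FOO=bar                                  # simulates the build arg arriving
if (: "${FOO?Missing foo}") 2>/dev/null; then
  echo "arg present"                     # runs when FOO is set
fi
```

As the record notes, `docker-compose up` does not accept `--build-arg`; a separate `docker-compose build` step does, which is one way to pass such values explicitly.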
84,680 | 10,416,170,576 | IssuesEvent | 2019-09-14 11:06:37 | perfect-matching/perfectmatching-backend | https://api.github.com/repos/perfect-matching/perfectmatching-backend | closed | Comment 도메인 post, put, delete 요청 로직 작성. | documentation enhancement | * [x] Comment 도메인 post 요청 처리 로직 작성.
* [x] Comment 도메인 put 요청 처리 로직 작성.
* [x] Comment 도메인 delete 요청 처리 로직 작성.
* [x] 추가로 작성된 로직에 대해서 api 명세 추가. | 1.0 | Comment 도메인 post, put, delete 요청 로직 작성. - * [x] Comment 도메인 post 요청 처리 로직 작성.
* [x] Comment 도메인 put 요청 처리 로직 작성.
* [x] Comment 도메인 delete 요청 처리 로직 작성.
* [x] 추가로 작성된 로직에 대해서 api 명세 추가. | non_priority | comment 도메인 post put delete 요청 로직 작성 comment 도메인 post 요청 처리 로직 작성 comment 도메인 put 요청 처리 로직 작성 comment 도메인 delete 요청 처리 로직 작성 추가로 작성된 로직에 대해서 api 명세 추가 | 0 |
205,394 | 23,337,688,057 | IssuesEvent | 2022-08-09 11:28:31 | elastic/integrations | https://api.github.com/repos/elastic/integrations | closed | Remove non-working EKS rules templates | 8.4-candidate Team:Cloud Security Posture | **Background**
We shouldn't display non-working EKS rules in our benchmarks.
**In-Depth**
In order to not shows non-working rules, we need to delete their rule template from our integration.
___
#### Related
- https://github.com/elastic/security-team/issues/4471
| True | Remove non-working EKS rules templates - **Background**
We shouldn't display non-working EKS rules in our benchmarks.
**In-Depth**
In order to not shows non-working rules, we need to delete their rule template from our integration.
___
#### Related
- https://github.com/elastic/security-team/issues/4471
| non_priority | remove non working eks rules templates background we shouldn t display non working eks rules in our benchmarks in depth in order to not shows non working rules we need to delete their rule template from our integration related | 0 |
226,731 | 24,996,540,733 | IssuesEvent | 2022-11-03 01:13:36 | hiagorios/charlib | https://api.github.com/repos/hiagorios/charlib | opened | CVE-2022-2421 (High) detected in socket.io-parser-4.0.4.tgz | security vulnerability | ## CVE-2022-2421 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>socket.io-parser-4.0.4.tgz</b></p></summary>
<p>socket.io protocol parser</p>
<p>Library home page: <a href="https://registry.npmjs.org/socket.io-parser/-/socket.io-parser-4.0.4.tgz">https://registry.npmjs.org/socket.io-parser/-/socket.io-parser-4.0.4.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/socket.io-parser/package.json</p>
<p>
Dependency Hierarchy:
- karma-6.0.3.tgz (Root Library)
- socket.io-3.1.0.tgz
- :x: **socket.io-parser-4.0.4.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/hiagorios/charlib/commit/8ee6e7aabcbf9c99ef1a7ed2ea015f0e838e7f1d">8ee6e7aabcbf9c99ef1a7ed2ea015f0e838e7f1d</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Due to improper type validation in attachment parsing the Socket.io js library, it is possible to overwrite the _placeholder object which allows an attacker to place references to functions at arbitrary places in the resulting query object.
<p>Publish Date: 2022-10-26
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-2421>CVE-2022-2421</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://csirt.divd.nl/cases/DIVD-2022-00045/">https://csirt.divd.nl/cases/DIVD-2022-00045/</a></p>
<p>Release Date: 2022-10-26</p>
<p>Fix Resolution: socket.io-parser - 4.0.5,4.2.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2022-2421 (High) detected in socket.io-parser-4.0.4.tgz - ## CVE-2022-2421 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>socket.io-parser-4.0.4.tgz</b></p></summary>
<p>socket.io protocol parser</p>
<p>Library home page: <a href="https://registry.npmjs.org/socket.io-parser/-/socket.io-parser-4.0.4.tgz">https://registry.npmjs.org/socket.io-parser/-/socket.io-parser-4.0.4.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/socket.io-parser/package.json</p>
<p>
Dependency Hierarchy:
- karma-6.0.3.tgz (Root Library)
- socket.io-3.1.0.tgz
- :x: **socket.io-parser-4.0.4.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/hiagorios/charlib/commit/8ee6e7aabcbf9c99ef1a7ed2ea015f0e838e7f1d">8ee6e7aabcbf9c99ef1a7ed2ea015f0e838e7f1d</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Due to improper type validation in attachment parsing the Socket.io js library, it is possible to overwrite the _placeholder object which allows an attacker to place references to functions at arbitrary places in the resulting query object.
<p>Publish Date: 2022-10-26
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-2421>CVE-2022-2421</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://csirt.divd.nl/cases/DIVD-2022-00045/">https://csirt.divd.nl/cases/DIVD-2022-00045/</a></p>
<p>Release Date: 2022-10-26</p>
<p>Fix Resolution: socket.io-parser - 4.0.5,4.2.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in socket io parser tgz cve high severity vulnerability vulnerable library socket io parser tgz socket io protocol parser library home page a href path to dependency file package json path to vulnerable library node modules socket io parser package json dependency hierarchy karma tgz root library socket io tgz x socket io parser tgz vulnerable library found in head commit a href found in base branch main vulnerability details due to improper type validation in attachment parsing the socket io js library it is possible to overwrite the placeholder object which allows an attacker to place references to functions at arbitrary places in the resulting query object publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution socket io parser step up your open source security game with mend | 0 |
47,248 | 13,056,077,523 | IssuesEvent | 2020-07-30 03:35:19 | icecube-trac/tix2 | https://api.github.com/repos/icecube-trac/tix2 | closed | cmake incorrectly sets NCURSES_INCLUDE_DIR on suse (Trac #209) | Migrated from Trac combo core defect | dataio-shovel uses the curses header file form.h which is normally in /usr/include. But suse puts curses header files in /usr/include/ncurses and provides a soft link for curses.h and ncurses.h in /usr/include. See: http://rpm.pbone.net/index.php3/stat/4/idpl/14281986/dir/opensuse/com/ncurses-devel-5.7-3.39.i586.rpm.html
cmake finds /usr/include/ncurses.h and sets:
NCURSES_INCLUDE_DIR:STRING=/usr/include
when it should be set to:
NCURSES_INCLUDE_DIR:STRING=/usr/include/ncurses
Migrated from https://code.icecube.wisc.edu/ticket/209
```json
{
"status": "closed",
"changetime": "2010-09-03T01:48:38",
"description": "dataio-shovel uses the curses header file form.h which is normally in /usr/include. But suse puts curses header files in /usr/include/ncurses and provides a soft link for curses.h and ncurses.h in /usr/include. See: http://rpm.pbone.net/index.php3/stat/4/idpl/14281986/dir/opensuse/com/ncurses-devel-5.7-3.39.i586.rpm.html\n\ncmake finds /usr/include/ncurses.h and sets:\nNCURSES_INCLUDE_DIR:STRING=/usr/include\n\nwhen it should be set to:\nNCURSES_INCLUDE_DIR:STRING=/usr/include/ncurses\n\n\n",
"reporter": "kjmeagher",
"cc": "",
"resolution": "fixed",
"_ts": "1283478518000000",
"component": "combo core",
"summary": "cmake incorrectly sets NCURSES_INCLUDE_DIR on suse",
"priority": "normal",
"keywords": "",
"time": "2010-08-11T16:42:14",
"milestone": "",
"owner": "blaufuss",
"type": "defect"
}
```
| 1.0 | cmake incorrectly sets NCURSES_INCLUDE_DIR on suse (Trac #209) - dataio-shovel uses the curses header file form.h which is normally in /usr/include. But suse puts curses header files in /usr/include/ncurses and provides a soft link for curses.h and ncurses.h in /usr/include. See: http://rpm.pbone.net/index.php3/stat/4/idpl/14281986/dir/opensuse/com/ncurses-devel-5.7-3.39.i586.rpm.html
cmake finds /usr/include/ncurses.h and sets:
NCURSES_INCLUDE_DIR:STRING=/usr/include
when it should be set to:
NCURSES_INCLUDE_DIR:STRING=/usr/include/ncurses
Migrated from https://code.icecube.wisc.edu/ticket/209
```json
{
"status": "closed",
"changetime": "2010-09-03T01:48:38",
"description": "dataio-shovel uses the curses header file form.h which is normally in /usr/include. But suse puts curses header files in /usr/include/ncurses and provides a soft link for curses.h and ncurses.h in /usr/include. See: http://rpm.pbone.net/index.php3/stat/4/idpl/14281986/dir/opensuse/com/ncurses-devel-5.7-3.39.i586.rpm.html\n\ncmake finds /usr/include/ncurses.h and sets:\nNCURSES_INCLUDE_DIR:STRING=/usr/include\n\nwhen it should be set to:\nNCURSES_INCLUDE_DIR:STRING=/usr/include/ncurses\n\n\n",
"reporter": "kjmeagher",
"cc": "",
"resolution": "fixed",
"_ts": "1283478518000000",
"component": "combo core",
"summary": "cmake incorrectly sets NCURSES_INCLUDE_DIR on suse",
"priority": "normal",
"keywords": "",
"time": "2010-08-11T16:42:14",
"milestone": "",
"owner": "blaufuss",
"type": "defect"
}
```
| non_priority | cmake incorrectly sets ncurses include dir on suse trac dataio shovel uses the curses header file form h which is normally in usr include but suse puts curses header files in usr include ncurses and provides a soft link for curses h and ncurses h in usr include see cmake finds usr include ncurses h and sets ncurses include dir string usr include when it should be set to ncurses include dir string usr include ncurses migrated from json status closed changetime description dataio shovel uses the curses header file form h which is normally in usr include but suse puts curses header files in usr include ncurses and provides a soft link for curses h and ncurses h in usr include see finds usr include ncurses h and sets nncurses include dir string usr include n nwhen it should be set to nncurses include dir string usr include ncurses n n n reporter kjmeagher cc resolution fixed ts component combo core summary cmake incorrectly sets ncurses include dir on suse priority normal keywords time milestone owner blaufuss type defect | 0 |
103,589 | 8,922,766,035 | IssuesEvent | 2019-01-21 13:56:00 | khartec/waltz | https://api.github.com/repos/khartec/waltz | closed | Logical Data Elements: should be linked up to Data Types | DDL change fixed (test & close) noteworthy | Currently, NWM are loading the logical data elements as part of the Data Types hierarchy, which makes it look messy.
The LDEs should ideally link up to Data Types and be displayed on the parent data type's page, showing links to physical specs using these. | 1.0 | Logical Data Elements: should be linked up to Data Types - Currently, NWM are loading the logical data elements as part of the Data Types hierarchy, which makes it look messy.
The LDEs should ideally link up to Data Types and be displayed on the parent data type's page, showing links to physical specs using these. | non_priority | logical data elements should be linked up to data types currently nwm are loading the logical data elements as part of the data types hierarchy which makes it look messy the ldes should ideally link up to data types and be displayed on the parent data type s page showing links to physical specs using these | 0 |
75,441 | 14,448,417,675 | IssuesEvent | 2020-12-08 06:12:14 | SecretFoundation/SecretWebsite | https://api.github.com/repos/SecretFoundation/SecretWebsite | reopened | Improvements on Newsletter section. | bug dev / code | - [x] Adjust padding as shown in Figma.
- [x] Adjust title's line-height as shown in Figma.

| 1.0 | Improvements on Newsletter section. - - [x] Adjust padding as shown in Figma.
- [x] Adjust title's line-height as shown in Figma.

| non_priority | improvements on newsletter section adjust padding as shown in figma adjust title s line height as shown in figma | 0 |
173,870 | 27,535,424,942 | IssuesEvent | 2023-03-07 02:51:33 | GAME-2334/vrgame | https://api.github.com/repos/GAME-2334/vrgame | opened | Lab 1 Staging | designers | Please add props to this area of the station utilizing the props from the art department as well as from the asset packs in the store. This area of the space station will be in wrecked condition. This room should have plenty of debris, blood, flickering lights, and desks flipped over. Make it seem like this is the origin of the outbreak on the station if possible | 1.0 | Lab 1 Staging - Please add props to this area of the station utilizing the props from the art department as well as from the asset packs in the store. This area of the space station will be in wrecked condition. This room should have plenty of debris, blood, flickering lights, and desks flipped over. Make it seem like this is the origin of the outbreak on the station if possible | non_priority | lab staging please add props to this area of the station utilizing the props from the art department as well as from the asset packs in the store this area of the space station will be in wrecked condition this room should have plenty of debris blood flickering lights and desks flipped over make it seem like this is the origin of the outbreak on the station if possible | 0 |
40,340 | 9,962,408,529 | IssuesEvent | 2019-07-07 14:19:52 | primefaces/primefaces | https://api.github.com/repos/primefaces/primefaces | closed | MenuItem: NullPointerException | defect | All menus seems to be broken currently with an NPE.
See: https://www.primefaces.org/showcase/ui/menu/menu.xhtml and click on Save menu.
```
java.lang.NullPointerException
at org.primefaces.component.menu.BaseMenuRenderer.findMenuitem(BaseMenuRenderer.java:86)
at org.primefaces.component.menu.BaseMenuRenderer.decode(BaseMenuRenderer.java:67)
``` | 1.0 | MenuItem: NullPointerException - All menus seems to be broken currently with an NPE.
See: https://www.primefaces.org/showcase/ui/menu/menu.xhtml and click on Save menu.
```
java.lang.NullPointerException
at org.primefaces.component.menu.BaseMenuRenderer.findMenuitem(BaseMenuRenderer.java:86)
at org.primefaces.component.menu.BaseMenuRenderer.decode(BaseMenuRenderer.java:67)
``` | non_priority | menuitem nullpointerexception all menus seems to be broken currently with an npe see and click on save menu java lang nullpointerexception at org primefaces component menu basemenurenderer findmenuitem basemenurenderer java at org primefaces component menu basemenurenderer decode basemenurenderer java | 0 |
135,473 | 12,685,033,925 | IssuesEvent | 2020-06-20 01:38:43 | gregoranders/ts-react-playground | https://api.github.com/repos/gregoranders/ts-react-playground | closed | SSL support | documentation enhancement | **Is your feature request related to a problem? Please describe.**
SSL support for self signed certificates.
**Describe the solution you'd like**
Include a Root-CA and certificate for `localhost`. | 1.0 | SSL support - **Is your feature request related to a problem? Please describe.**
SSL support for self signed certificates.
**Describe the solution you'd like**
Include a Root-CA and certificate for `localhost`. | non_priority | ssl support is your feature request related to a problem please describe ssl support for self signed certificates describe the solution you d like include a root ca and certificate for localhost | 0 |
26,210 | 7,802,658,657 | IssuesEvent | 2018-06-10 15:02:41 | minetest/minetest | https://api.github.com/repos/minetest/minetest | closed | Android: ogg build error | @ Build Blocker Bug | ##### Issue type
<!-- Pick one below and delete others -->
- Build issue
##### Minetest version
<!--
Paste Minetest version between quotes below
If you are on a devel version, please add git commit hash
You can use `minetest --version` to find it.
-->
```
0.4.17
```
##### OS / Hardware
<!-- General information about your hardware and operating system -->
Operating system: Archlinux
CPU: AMD FX-8350
<!-- For graphical issues only -->
GPU model: Radeon Vega RX580
OpenGL version: 4.5
##### Summary
<!-- Describe your problem here -->
Android build doesn't work when trying to build libogg
```
[armeabi-v7a] SharedLibrary : libvorbis-jni.so
xxx/android-sdk/ndk-bundle/toolchains/arm-linux-androideabi-4.9/prebuilt/linux-x86_64/bin/../lib/gcc/arm-linux-androideabi/4.9.x/../../../../arm-linux-androideabi/bin/ld: warning: skipping incompatible /usr/lib/libatomic.so while searching for atomic
xxx/android-sdk/ndk-bundle/toolchains/arm-linux-androideabi-4.9/prebuilt/linux-x86_64/bin/../lib/gcc/arm-linux-androideabi/4.9.x/../../../../arm-linux-androideabi/bin/ld: error: treating warnings as errors
collect2: error: ld returned 1 exit status
```
This blocks the 0.4.17 release for the play store
##### Steps to reproduce
<!-- For bug reports or build issues, explain how the problem happened -->
| 1.0 | Android: ogg build error - ##### Issue type
<!-- Pick one below and delete others -->
- Build issue
##### Minetest version
<!--
Paste Minetest version between quotes below
If you are on a devel version, please add git commit hash
You can use `minetest --version` to find it.
-->
```
0.4.17
```
##### OS / Hardware
<!-- General information about your hardware and operating system -->
Operating system: Archlinux
CPU: AMD FX-8350
<!-- For graphical issues only -->
GPU model: Radeon Vega RX580
OpenGL version: 4.5
##### Summary
<!-- Describe your problem here -->
Android build doesn't work when trying to build libogg
```
[armeabi-v7a] SharedLibrary : libvorbis-jni.so
xxx/android-sdk/ndk-bundle/toolchains/arm-linux-androideabi-4.9/prebuilt/linux-x86_64/bin/../lib/gcc/arm-linux-androideabi/4.9.x/../../../../arm-linux-androideabi/bin/ld: warning: skipping incompatible /usr/lib/libatomic.so while searching for atomic
xxx/android-sdk/ndk-bundle/toolchains/arm-linux-androideabi-4.9/prebuilt/linux-x86_64/bin/../lib/gcc/arm-linux-androideabi/4.9.x/../../../../arm-linux-androideabi/bin/ld: error: treating warnings as errors
collect2: error: ld returned 1 exit status
```
This blocks the 0.4.17 release for the play store
##### Steps to reproduce
<!-- For bug reports or build issues, explain how the problem happened -->
| non_priority | android ogg build error issue type build issue minetest version paste minetest version between quotes below if you are on a devel version please add git commit hash you can use minetest version to find it os hardware operating system archlinux cpu amd fx gpu model radeon vega opengl version summary android build doesn t work when trying to build libogg sharedlibrary libvorbis jni so xxx android sdk ndk bundle toolchains arm linux androideabi prebuilt linux bin lib gcc arm linux androideabi x arm linux androideabi bin ld warning skipping incompatible usr lib libatomic so while searching for atomic xxx android sdk ndk bundle toolchains arm linux androideabi prebuilt linux bin lib gcc arm linux androideabi x arm linux androideabi bin ld error treating warnings as errors error ld returned exit status this blocks the release for the play store steps to reproduce | 0 |
195,981 | 22,386,362,613 | IssuesEvent | 2022-06-17 01:05:02 | turkdevops/electron-api-demos | https://api.github.com/repos/turkdevops/electron-api-demos | opened | CVE-2022-29247 (Low) detected in electron-13.6.6.tgz | security vulnerability | ## CVE-2022-29247 - Low Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>electron-13.6.6.tgz</b></p></summary>
<p>Build cross platform desktop apps with JavaScript, HTML, and CSS</p>
<p>Library home page: <a href="https://registry.npmjs.org/electron/-/electron-13.6.6.tgz">https://registry.npmjs.org/electron/-/electron-13.6.6.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/electron/package.json</p>
<p>
Dependency Hierarchy:
- :x: **electron-13.6.6.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Electron is a framework for writing cross-platform desktop applications using JavaScript (JS), HTML, and CSS. A vulnerability in versions prior to 18.0.0-beta.6, 17.2.0, 16.2.6, and 15.5.5 allows a renderer with JS execution to obtain access to a new renderer process with `nodeIntegrationInSubFrames` enabled which in turn allows effective access to `ipcRenderer`. The `nodeIntegrationInSubFrames` option does not implicitly grant Node.js access. Rather, it depends on the existing sandbox setting. If an application is sandboxed, then `nodeIntegrationInSubFrames` just gives access to the sandboxed renderer APIs, which include `ipcRenderer`. If the application then additionally exposes IPC messages without IPC `senderFrame` validation that perform privileged actions or return confidential data this access to `ipcRenderer` can in turn compromise your application / user even with the sandbox enabled. Electron versions 18.0.0-beta.6, 17.2.0, 16.2.6, and 15.5.5 contain a fix for this issue. As a workaround, ensure that all IPC message handlers appropriately validate `senderFrame`.
<p>Publish Date: 2022-06-13
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-29247>CVE-2022-29247</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>2.2</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-29247">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-29247</a></p>
<p>Release Date: 2022-06-13</p>
<p>Fix Resolution: 15.5.5</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2022-29247 (Low) detected in electron-13.6.6.tgz - ## CVE-2022-29247 - Low Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>electron-13.6.6.tgz</b></p></summary>
<p>Build cross platform desktop apps with JavaScript, HTML, and CSS</p>
<p>Library home page: <a href="https://registry.npmjs.org/electron/-/electron-13.6.6.tgz">https://registry.npmjs.org/electron/-/electron-13.6.6.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/electron/package.json</p>
<p>
Dependency Hierarchy:
- :x: **electron-13.6.6.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Electron is a framework for writing cross-platform desktop applications using JavaScript (JS), HTML, and CSS. A vulnerability in versions prior to 18.0.0-beta.6, 17.2.0, 16.2.6, and 15.5.5 allows a renderer with JS execution to obtain access to a new renderer process with `nodeIntegrationInSubFrames` enabled which in turn allows effective access to `ipcRenderer`. The `nodeIntegrationInSubFrames` option does not implicitly grant Node.js access. Rather, it depends on the existing sandbox setting. If an application is sandboxed, then `nodeIntegrationInSubFrames` just gives access to the sandboxed renderer APIs, which include `ipcRenderer`. If the application then additionally exposes IPC messages without IPC `senderFrame` validation that perform privileged actions or return confidential data this access to `ipcRenderer` can in turn compromise your application / user even with the sandbox enabled. Electron versions 18.0.0-beta.6, 17.2.0, 16.2.6, and 15.5.5 contain a fix for this issue. As a workaround, ensure that all IPC message handlers appropriately validate `senderFrame`.
<p>Publish Date: 2022-06-13
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-29247>CVE-2022-29247</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>2.2</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-29247">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-29247</a></p>
<p>Release Date: 2022-06-13</p>
<p>Fix Resolution: 15.5.5</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve low detected in electron tgz cve low severity vulnerability vulnerable library electron tgz build cross platform desktop apps with javascript html and css library home page a href path to dependency file package json path to vulnerable library node modules electron package json dependency hierarchy x electron tgz vulnerable library found in base branch master vulnerability details electron is a framework for writing cross platform desktop applications using javascript js html and css a vulnerability in versions prior to beta and allows a renderer with js execution to obtain access to a new renderer process with nodeintegrationinsubframes enabled which in turn allows effective access to ipcrenderer the nodeintegrationinsubframes option does not implicitly grant node js access rather it depends on the existing sandbox setting if an application is sandboxed then nodeintegrationinsubframes just gives access to the sandboxed renderer apis which include ipcrenderer if the application then additionally exposes ipc messages without ipc senderframe validation that perform privileged actions or return confidential data this access to ipcrenderer can in turn compromise your application user even with the sandbox enabled electron versions beta and contain a fix for this issue as a workaround ensure that all ipc message handlers appropriately validate senderframe publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required high user interaction none scope unchanged impact metrics confidentiality impact low integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend | 0 |
457 | 2,540,862,176 | IssuesEvent | 2015-01-28 01:11:38 | Cockatrice/Cockatrice | https://api.github.com/repos/Cockatrice/Cockatrice | closed | [RELEASE REGRESSIONS] - Library Access, Hand Reveal, Sideboard Access. | Cockatrice Defect OSX | Using the F3 function, or menu selection to view entire library causes client to crash.
Example using menu option and then shortcut.http://screencast.com/t/8pVfXDYXZO
Platform: OSX
Version: Jan,23rd, 2015.
Update: Regression appears to also effect accessing the sideboard. Behaviour is same when selecting view sideboard, or using keyboard shortcuts to access it.
Update: When revealing your entire hand it also causes a full client crash. http://screencast.com/t/ohelseUXX1a | 1.0 | [RELEASE REGRESSIONS] - Library Access, Hand Reveal, Sideboard Access. - Using the F3 function, or menu selection to view entire library causes client to crash.
Example using menu option and then shortcut.http://screencast.com/t/8pVfXDYXZO
Platform: OSX
Version: Jan,23rd, 2015.
Update: Regression appears to also effect accessing the sideboard. Behaviour is same when selecting view sideboard, or using keyboard shortcuts to access it.
Update: When revealing your entire hand it also causes a full client crash. http://screencast.com/t/ohelseUXX1a | non_priority | library access hand reveal sideboard access using the function or menu selection to view entire library causes client to crash example using menu option and then shortcut platform osx version jan update regression appears to also effect accessing the sideboard behaviour is same when selecting view sideboard or using keyboard shortcuts to access it update when revealing your entire hand it also causes a full client crash | 0 |
51,590 | 6,536,680,934 | IssuesEvent | 2017-08-31 19:06:51 | simonjaeger/azure-bot-pack | https://api.github.com/repos/simonjaeger/azure-bot-pack | opened | Move cursor mode and click handlers from CommandBar | designer | Should be placed in Designer component or separate components. | 1.0 | Move cursor mode and click handlers from CommandBar - Should be placed in Designer component or separate components. | non_priority | move cursor mode and click handlers from commandbar should be placed in designer component or separate components | 0 |
309,689 | 23,303,256,212 | IssuesEvent | 2022-08-07 16:40:54 | metonym/svelte-highlight | https://api.github.com/repos/metonym/svelte-highlight | closed | Add dynamic import example | documentation | This is less of an issue for SvelteKit which will already code-split your app.
However, for other SPA set-ups, it would be nice to have a quick primer on dynamically importing a language grammar, which can be quite heavy. | 1.0 | Add dynamic import example - This is less of an issue for SvelteKit which will already code-split your app.
However, for other SPA set-ups, it would be nice to have a quick primer on dynamically importing a language grammar, which can be quite heavy. | non_priority | add dynamic import example this is less of an issue for sveltekit which will already code split your app however for other spa set ups it would be nice to have a quick primer on dynamically importing a language grammar which can be quite heavy | 0 |
2,627 | 2,699,030,385 | IssuesEvent | 2015-04-03 13:48:01 | mkdocs/mkdocs | https://api.github.com/repos/mkdocs/mkdocs | closed | Document contributor guideline and a developement primer. | Documentation | I'm not familiar with Python at all so I'd like to see minimal guide how to build the project.
Currently I'm using mkdocs through pip and that doesn't allow me to fork into pull requests thus preventing my contribution. | 1.0 | Document contributor guideline and a developement primer. - I'm not familiar with Python at all so I'd like to see minimal guide how to build the project.
Currently I'm using mkdocs through pip and that doesn't allow me to fork into pull requests thus preventing my contribution. | non_priority | document contributor guideline and a developement primer i m not familiar with python at all so i d like to see minimal guide how to build the project currently i m using mkdocs through pip and that doesn t allow me to fork into pull requests thus preventing my contribution | 0 |
47,499 | 7,329,506,119 | IssuesEvent | 2018-03-05 05:31:36 | bignamehere/fry-calculator | https://api.github.com/repos/bignamehere/fry-calculator | closed | Develop Process Flow | Documentation P2 UX | Create versions of possible process flows that a User could take to perform the below tasks:
1. Discover tool
2. Select Insurance as partial payment
3. Select Other Payment criteria (?)
4. Set Down payment Amount
5. Select monthly payment (dynamically controlled)
6. Select number of months (dynamically controlled)
7. Select other options to reveal savings (retainers lost, etc)
? - Should the User be taken thru a simple Q&A of discovery to set the initial values of the Calculator?
? - | 1.0 | Develop Process Flow - Create versions of possible process flows that a User could take to perform the below tasks:
1. Discover tool
2. Select Insurance as partial payment
3. Select Other Payment criteria (?)
4. Set Down payment Amount
5. Select monthly payment (dynamically controlled)
6. Select number of months (dynamically controlled)
7. Select other options to reveal savings (retainers lost, etc)
? - Should the User be taken thru a simple Q&A of discovery to set the initial values of the Calculator?
? - | non_priority | develop process flow create versions of possible process flows that a user could take to perform the below tasks discover tool select insurance as partial payment select other payment criteria set down payment amount select monthly payment dynamically controlled select number of months dynamically controlled select other options to reveal savings retainers lost etc should the user be taken thru a simple q a of discovery to set the initial values of the calculator | 0 |
121,524 | 17,659,486,266 | IssuesEvent | 2021-08-21 07:30:29 | LaudateCorpus1/vscode-main | https://api.github.com/repos/LaudateCorpus1/vscode-main | opened | CVE-2021-21366 (Medium) detected in xmldom-0.1.31.tgz | security vulnerability | ## CVE-2021-21366 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>xmldom-0.1.31.tgz</b></p></summary>
<p>A W3C Standard XML DOM(Level2 CORE) implementation and parser(DOMParser/XMLSerializer).</p>
<p>Library home page: <a href="https://registry.npmjs.org/xmldom/-/xmldom-0.1.31.tgz">https://registry.npmjs.org/xmldom/-/xmldom-0.1.31.tgz</a></p>
<p>Path to dependency file: vscode-main/vscode-main/build/package.json</p>
<p>Path to vulnerable library: vscode-main/vscode-main/build/node_modules/xmldom,vscode-main/vscode-main/node_modules/xmldom</p>
<p>
Dependency Hierarchy:
- plist-3.0.1.tgz (Root Library)
- :x: **xmldom-0.1.31.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/LaudateCorpus1/vscode-main/commit/1506b403cd49b2e313bb1af8edb964123510881e">1506b403cd49b2e313bb1af8edb964123510881e</a></p>
<p>Found in base branch: <b>dev1</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
xmldom is a pure JavaScript W3C standard-based (XML DOM Level 2 Core) DOMParser and XMLSerializer module. xmldom versions 0.4.0 and older do not correctly preserve system identifiers, FPIs or namespaces when repeatedly parsing and serializing maliciously crafted documents. This may lead to unexpected syntactic changes during XML processing in some downstream applications. This is fixed in version 0.5.0. As a workaround downstream applications can validate the input and reject the maliciously crafted documents.
<p>Publish Date: 2021-03-12
<p>URL: <a href="https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-21366">CVE-2021-21366</a></p>
</p>
</details>
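The round-trip behaviour at issue can be sketched in a few lines. Note that xmldom is a JavaScript library; the snippet below uses Python's stdlib `xml.dom.minidom` purely as an analogue to show the fixed-point property a faithful DOM round-trip should have:

```python
# Conceptual illustration of the parse -> serialize round-trip behind
# CVE-2021-21366: a DOM implementation must preserve namespaces and system
# identifiers exactly, or repeated round-trips can alter a document's meaning.
# NOTE: xmldom is a JavaScript library; Python's stdlib minidom is used here
# only as an analogue of what a correct round-trip looks like.
from xml.dom.minidom import parseString

doc = '<root xmlns:a="urn:example"><a:child attr="v"/></root>'
once = parseString(doc).toxml()
twice = parseString(once).toxml()

# A correct round-trip is a fixed point: serializing again changes nothing.
assert once == twice
```

The vulnerability is precisely a failure of this fixed-point property for crafted inputs in xmldom ≤ 0.4.0.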
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
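For reference, the 4.3 base score follows mechanically from the metric values above via the CVSS v3.0 base equations; a minimal sketch of that computation:

```python
import math

# CVSS v3.0 base score for the vector listed above:
# AV:N/AC:L/PR:N/UI:R/S:U/C:N/I:L/A:N
AV, AC, PR, UI = 0.85, 0.77, 0.85, 0.62   # Network / Low / None / Required
C, I, A = 0.0, 0.22, 0.0                  # None / Low / None

def roundup(x):
    """CVSS 'Roundup': smallest number with one decimal place >= x."""
    n = int(round(x * 100000))
    return n / 100000.0 if n % 10000 == 0 else (math.floor(n / 10000) + 1) / 10.0

iss = 1 - (1 - C) * (1 - I) * (1 - A)     # impact sub-score = 0.22
impact = 6.42 * iss                        # scope is Unchanged
exploitability = 8.22 * AV * AC * PR * UI
base = 0.0 if impact <= 0 else roundup(min(impact + exploitability, 10))
print(base)  # 4.3
```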
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/xmldom/xmldom/security/advisories/GHSA-h6q6-9hqw-rwfv">https://github.com/xmldom/xmldom/security/advisories/GHSA-h6q6-9hqw-rwfv</a></p>
<p>Release Date: 2021-03-12</p>
<p>Fix Resolution: 0.5.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2021-21366 (Medium) detected in xmldom-0.1.31.tgz | non_priority | cve medium detected in xmldom tgz | 0
307,086 | 26,518,379,493 | IssuesEvent | 2023-01-18 23:07:12 | pytorch/pytorch | https://api.github.com/repos/pytorch/pytorch | closed | DISABLED test_torch_cuda_is_available_dynamic_shapes (torch._dynamo.testing.make_test_cls_with_patches.<locals>.DummyTestClass) | module: flaky-tests skipped module: unknown | Platforms: linux
This test was disabled because it is failing in CI. See [recent examples](https://hud.pytorch.org/failure/test_torch_cuda_is_available_dynamic_shapes) and the most recent trunk [workflow logs](https://github.com/pytorch/pytorch/runs/10713308472).
Over the past 72 hours, it has flakily failed in 2 workflow(s).
**Debugging instructions (after clicking on the recent samples link):**
To find relevant log snippets:
1. Click on the workflow logs linked above
2. Grep for `test_torch_cuda_is_available_dynamic_shapes`
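For long workflow logs, step 2 can be scripted; a minimal, hypothetical triage sketch (the sample log lines are made up for illustration):

```python
# Minimal triage sketch for step 2: find the failing test in a downloaded
# workflow log and keep a little context around each hit. The sample log
# lines below are made up for illustration -- real logs are far longer.
TEST_NAME = "test_torch_cuda_is_available_dynamic_shapes"

def grep_context(lines, needle, context=2):
    """Return (line_number, snippet) pairs for every line containing `needle`."""
    hits = []
    for i, line in enumerate(lines):
        if needle in line:
            lo, hi = max(0, i - context), min(len(lines), i + context + 1)
            hits.append((i + 1, lines[lo:hi]))
    return hits

sample_log = [
    "2023-01-18 running dynamo test suite",
    "FAILED test_torch_cuda_is_available_dynamic_shapes - AssertionError",
    "1 failed, 120 passed",
]
hits = grep_context(sample_log, TEST_NAME, context=1)
for lineno, snippet in hits:
    print(f"--- match at line {lineno} ---")
    print("\n".join(snippet))
```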
Error retrieving /opt/conda/lib/python3.10/site-packages/torch/_dynamo/testing.py: Error: Statuscode 301 | 1.0 | DISABLED test_torch_cuda_is_available_dynamic_shapes (torch._dynamo.testing.make_test_cls_with_patches.<locals>.DummyTestClass) | non_priority | disabled test torch cuda is available dynamic shapes | 0
70,086 | 18,014,859,269 | IssuesEvent | 2021-09-16 12:55:10 | tensorflow/tensorflow | https://api.github.com/repos/tensorflow/tensorflow | closed | Issue in building Tensorflow library on Windows machine | stat:awaiting response type:build/install stalled subtype:windows |
**System information**
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Windows 10 (64-bit)
- Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device: No
- TensorFlow installed from (source or binary): Source
- TensorFlow version: 1.11
- Python version: 3.6
- Installed using virtualenv? pip? conda?: Git
- Bazel version (if compiling from source): No
- GCC/Compiler version (if compiling from source): CMake: 3.10.1
- CUDA/cuDNN version: No
- GPU model and memory: No
**Describe the problem**
I am trying to build TensorFlow on Windows. After checking out the 1.11 branch from Git, I tried to build it following one of the guides available online. The CMake step completed successfully and a Visual Studio solution file was generated.
When I tried to build the solution in Visual Studio 2017, the build failed with "file not found" errors.
Link followed to build the library:
https://joe-antognini.github.io/machine-learning/build-windows-tf
**Provide the exact sequence of commands / steps that you executed before running into the problem**
1. Created all the required folders as described in the link.
2. Ran the build command:
cmake .. -A x64 -DCMAKE_BUILD_TYPE=Release -DSWIG_EXECUTABLE=C:\swigwin-3.0.12\swig.exe -DPYTHON_EXECUTABLE=C:\Python365\python.exe -DPYTHON_LIBRARIES=C:\Python365\libs\python36.lib
3. Opened the solution in Visual Studio 2017 and built it in Release mode.
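The cascading errors in the log under "Any other info / logs" trace back to the proto_text link failure (unresolved absl symbols); the later C1083 "pb_text.h not found" errors are consequences of proto_text.exe never being built. When triaging interleaved MSBuild logs like this, a small, hypothetical helper can surface the first error per project:

```python
import re

# MSBuild logs interleave output from many projects, and the first error per
# project is usually the root cause; later ones (e.g. the C1083 'pb_text.h
# not found' errors below) are cascades of the initial proto_text link failure.
# This is a hypothetical triage helper, not part of the TensorFlow build.
ERROR_RE = re.compile(r"error (LNK\d+|C\d+|MSB\d+)")

def first_errors(log_lines):
    """Map each MSBuild project prefix ('42', '43', ...) to its first error line."""
    seen = {}
    for line in log_lines:
        if ERROR_RE.search(line):
            project = line.split(">", 1)[0]
            seen.setdefault(project, line.strip())
    return seen

# A few lines excerpted from the log below:
sample = [
    "42>gen_proto_text_functions.cc",
    "42>path.obj : error LNK2019: unresolved external symbol ...",
    '43>Microsoft.CppCommon.targets(171,5): error MSB6006: "cmd.exe" exited with code 9009.',
    "47>python_op_gen.cc(23): fatal error C1083: Cannot open include file ...",
]
roots = first_errors(sample)
print(roots["42"])  # the LNK2019 line -- the true root cause
```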
**Any other info / logs**
Include any logs or source code that would be helpful to diagnose the problem. If including tracebacks, please include the full traceback. Large logs and files should be attached.
********************************************
Visual Studio 2017 Logs:
********************************************
1>------ Build started: Project: zlib, Configuration: Release x64 ------
2>------ Build started: Project: farmhash, Configuration: Release x64 ------
3>------ Build started: Project: gif, Configuration: Release x64 ------
4>------ Build started: Project: sqlite, Configuration: Release x64 ------
5>------ Build started: Project: highwayhash, Configuration: Release x64 ------
6>------ Build started: Project: jpeg, Configuration: Release x64 ------
7>------ Build started: Project: lmdb, Configuration: Release x64 ------
8>------ Build started: Project: nsync, Configuration: Release x64 ------
1>Performing update step for 'zlib'
9>------ Build started: Project: farmhash_create_destination_dir, Configuration: Release x64 ------
10>------ Build started: Project: gif_create_destination_dir, Configuration: Release x64 ------
11>------ Build started: Project: sqlite_create_destination_dir, Configuration: Release x64 ------
5>Performing update step for 'highwayhash'
12>------ Build started: Project: png, Configuration: Release x64 ------
13>------ Build started: Project: protobuf, Configuration: Release x64 ------
14>------ Build started: Project: zlib_create_destination_dir, Configuration: Release x64 ------
15>------ Build started: Project: eigen, Configuration: Release x64 ------
16>------ Build started: Project: jpeg_create_destination_dir, Configuration: Release x64 ------
17>------ Build started: Project: highwayhash_create_destination_dir, Configuration: Release x64 ------
18>------ Build started: Project: png_create_destination_dir, Configuration: Release x64 ------
19>------ Build started: Project: lmdb_create_destination_dir, Configuration: Release x64 ------
13>Performing update step for 'protobuf'
20>------ Build started: Project: re2, Configuration: Release x64 ------
21>------ Build started: Project: double_conversion, Configuration: Release x64 ------
8>Performing update step for 'nsync'
22>------ Build started: Project: snappy, Configuration: Release x64 ------
23>------ Build started: Project: jpeg_copy_headers_to_destination, Configuration: Release x64 ------
24>------ Build started: Project: cub, Configuration: Release x64 ------
20>Performing update step for 're2'
21>Performing update step for 'double_conversion'
25>------ Build started: Project: nsync_create_destination_dir, Configuration: Release x64 ------
22>Performing update step for 'snappy'
26>------ Build started: Project: highwayhash_copy_headers_to_destination, Configuration: Release x64 ------
27>------ Build started: Project: grpc, Configuration: Release x64 ------
28>------ Build started: Project: jsoncpp, Configuration: Release x64 ------
29>------ Build started: Project: nsync_copy_headers_to_destination, Configuration: Release x64 ------
30>------ Build started: Project: lmdb_copy_headers_to_destination, Configuration: Release x64 ------
31>------ Build started: Project: png_copy_headers_to_destination, Configuration: Release x64 ------
32>------ Build started: Project: gif_copy_headers_to_destination, Configuration: Release x64 ------
27>Performing update step for 'grpc'
28>Performing update step for 'jsoncpp'
33>------ Build started: Project: sqlite_copy_headers_to_destination, Configuration: Release x64 ------
34>------ Build started: Project: gemmlowp, Configuration: Release x64 ------
35>------ Build started: Project: tf_protos_cc, Configuration: Release x64 ------
36>------ Build started: Project: fft2d, Configuration: Release x64 ------
37>------ Build started: Project: farmhash_copy_headers_to_destination, Configuration: Release x64 ------
38>------ Build started: Project: zlib_copy_headers_to_destination, Configuration: Release x64 ------
39>------ Build started: Project: create_cc_ops_header_dir, Configuration: Release x64 ------
40>------ Build started: Project: force_rebuild_target, Configuration: Release x64 ------
41>------ Build started: Project: tf_python_copy_scripts_to_destination, Configuration: Release x64 ------
40>Generating __force_rebuild
40>
40>Generating C:/Users/john/tensorflow/tensorflow/core/util/version_info.cc
35>debug_service.pb.cc
35>debugger_event_metadata.pb.cc
35>example.pb.cc
35>example_parser_configuration.pb.cc
35>feature.pb.cc
35>allocation_description.pb.cc
35>api_def.pb.cc
35>attr_value.pb.cc
40>fatal: Not a git repository: 'C:/Users/john/tensorflow/.git'
35>cost_graph.pb.cc
35>device_attributes.pb.cc
35>function.pb.cc
35>graph.pb.cc
35>graph_transfer_info.pb.cc
35>iterator.pb.cc
35>kernel_def.pb.cc
35>log_memory.pb.cc
35>node_def.pb.cc
35>op_def.pb.cc
35>reader_base.pb.cc
35>remote_fused_graph_execute_info.pb.cc
35>resource_handle.pb.cc
35>step_stats.pb.cc
35>summary.pb.cc
35>tensor.pb.cc
35>tensor_description.pb.cc
35>tensor_shape.pb.cc
35>tensor_slice.pb.cc
35>types.pb.cc
35>variable.pb.cc
35>versions.pb.cc
35>op_performance_data.pb.cc
35>boosted_trees.pb.cc
35>error_codes.pb.cc
35>profile.pb.cc
35>tfprof_log.pb.cc
35>tfprof_options.pb.cc
35>tfprof_output.pb.cc
35>checkpointable_object_graph.pb.cc
35>cluster.pb.cc
35>config.pb.cc
35>control_flow.pb.cc
35>critical_section.pb.cc
35>debug.pb.cc
35>device_properties.pb.cc
35>eager_service.pb.cc
35>master.pb.cc
35>master_service.pb.cc
35>meta_graph.pb.cc
35>named_tensor.pb.cc
35>queue_runner.pb.cc
35>rewriter_config.pb.cc
35>saved_model.pb.cc
35>saver.pb.cc
35>tensor_bundle.pb.cc
35>tensorflow_server.pb.cc
35>transport_options.pb.cc
35>worker.pb.cc
35>worker_service.pb.cc
35>event.pb.cc
35>example_proto_fast_parsing_test.pb.cc
35>memmapped_file_system.pb.cc
35>saved_tensor_slice.pb.cc
35>test_log.pb.cc
35>xla_service.pb.cc
35>backend_configs.pb.cc
35>hlo.pb.cc
35>hlo_profile_printer_data.pb.cc
35>xla.pb.cc
35>xla_data.pb.cc
35>learner.pb.cc
35>quantiles.pb.cc
35>split_info.pb.cc
35>tree_config.pb.cc
35>compilation_result.pb.cc
35>optimization_parameters.pb.cc
35>topology.pb.cc
35>tpu_embedding_config.pb.cc
35>tf_protos_cc.vcxproj -> C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\Release\tf_protos_cc.lib
42>------ Build started: Project: proto_text, Configuration: Release x64 ------
42>gen_proto_text_functions.cc
42>gen_proto_text_functions_lib.cc
42>path.obj : error LNK2019: unresolved external symbol "void __cdecl absl::base_internal::ThrowStdOutOfRange(char const *)" (?ThrowStdOutOfRange@base_internal@absl@@YAXPEBD@Z) referenced in function "class std::basic_string<char,struct std::char_traits<char>,class std::allocator<char> > __cdecl tensorflow::io::internal::JoinPathImpl(class std::initializer_list<class absl::string_view>)" (?JoinPathImpl@internal@io@tensorflow@@YA?AV?$basic_string@DU?$char_traits@D@std@@V?$allocator@D@2@@std@@V?$initializer_list@Vstring_view@absl@@@5@@Z)
42>path.obj : error LNK2019: unresolved external symbol "public: unsigned __int64 __cdecl absl::string_view::rfind(char,unsigned __int64)const " (?rfind@string_view@absl@@QEBA_KD_K@Z) referenced in function "class absl::string_view __cdecl tensorflow::io::Extension(class absl::string_view)" (?Extension@io@tensorflow@@YA?AVstring_view@absl@@V34@@Z)
42>collection_registry.obj : error LNK2019: unresolved external symbol "class std::basic_ostream<char,struct std::char_traits<char> > & __cdecl absl::operator<<(class std::basic_ostream<char,struct std::char_traits<char> > &,class absl::string_view)" (??6absl@@YAAEAV?$basic_ostream@DU?$char_traits@D@std@@@std@@AEAV12@Vstring_view@0@@Z) referenced in function "public: class std::unique_ptr<class tensorflow::monitoring::CollectionRegistry::RegistrationHandle,struct std::default_delete<class tensorflow::monitoring::CollectionRegistry::RegistrationHandle> > __cdecl tensorflow::monitoring::CollectionRegistry::Register(class tensorflow::monitoring::AbstractMetricDef const *,class std::function<void __cdecl(class tensorflow::monitoring::MetricCollectorGetter)> const &)" (?Register@CollectionRegistry@monitoring@tensorflow@@QEAA?AV?$unique_ptr@VRegistrationHandle@CollectionRegistry@monitoring@tensorflow@@U?$default_delete@VRegistrationHandle@CollectionRegistry@monitoring@tensorflow@@@std@@@std@@PEBVAbstractMetricDef@23@AEBV?$function@$$A6AXVMetricCollectorGetter@monitoring@tensorflow@@@Z@5@@Z)
42>str_util.obj : error LNK2019: unresolved external symbol "public: unsigned __int64 __cdecl absl::string_view::find(char,unsigned __int64)const " (?find@string_view@absl@@QEBA_KD_K@Z) referenced in function "class std::vector<class std::basic_string<char,struct std::char_traits<char>,class std::allocator<char> >,class std::allocator<class std::basic_string<char,struct std::char_traits<char>,class std::allocator<char> > > > __cdecl tensorflow::str_util::Split<struct tensorflow::str_util::AllowEmpty>(class absl::string_view,class absl::string_view,struct tensorflow::str_util::AllowEmpty)" (??$Split@UAllowEmpty@str_util@tensorflow@@@str_util@tensorflow@@YA?AV?$vector@V?$basic_string@DU?$char_traits@D@std@@V?$allocator@D@2@@std@@V?$allocator@V?$basic_string@DU?$char_traits@D@std@@V?$allocator@D@2@@std@@@2@@std@@Vstring_view@absl@@0UAllowEmpty@01@@Z)
42>C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\Release\proto_text.exe : fatal error LNK1120: 4 unresolved externals
42>Done building project "proto_text.vcxproj" -- FAILED.
43>------ Build started: Project: tf_core_framework, Configuration: Release x64 ------
43>Generating __force_rebuild
43>
43>Running C++ protocol buffer text compiler (proto_text) on tensorflow/core/example/example.proto
43>'Release\proto_text.exe' is not recognized as an internal or external command,
43>operable program or batch file.
43>C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\VC\VCTargets\Microsoft.CppCommon.targets(171,5): error MSB6006: "cmd.exe" exited with code 9009.
43>Done building project "tf_core_framework.vcxproj" -- FAILED.
44>------ Build started: Project: tf_cc_op_gen_main, Configuration: Release x64 ------
45>------ Build started: Project: tf_core_cpu, Configuration: Release x64 ------
46>------ Build started: Project: tf_cc_framework, Configuration: Release x64 ------
47>------ Build started: Project: tf_python_op_gen_main, Configuration: Release x64 ------
44>cc_op_gen.cc
44>cc_op_gen_main.cc
46>ops.cc
46>scope.cc
47>python_op_gen.cc
47>python_op_gen_internal.cc
47>python_op_gen_main.cc
45>loader.cc
45>reader.cc
45>accumulate_n_optimizer.cc
45>allocator_retry.cc
45>base_collective_executor.cc
45>bfc_allocator.cc
45>buf_rendezvous.cc
45>build_graph_options.cc
47>c:\users\john\tensorflow\tensorflow\python\framework\python_op_gen_internal.cc(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/op_def.pb_text.h': No such file or directory
47>c:\users\john\tensorflow\tensorflow\python\framework\python_op_gen.cc(23): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/op_def.pb_text.h': No such file or directory
45>collective_executor_mgr.cc
44>c:\users\john\tensorflow\tensorflow\cc\framework\cc_op_gen.cc(28): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/types.pb_text.h': No such file or directory
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\base_collective_executor.cc)
45>collective_param_resolver_local.cc
44>Done building project "tf_cc_op_gen_main.vcxproj" -- FAILED.
48>------ Build started: Project: control_flow_ops_gen_cc, Configuration: Release x64 ------
49>------ Build started: Project: ctc_ops_gen_cc, Configuration: Release x64 ------
50>------ Build started: Project: cudnn_rnn_ops_gen_cc, Configuration: Release x64 ------
51>------ Build started: Project: data_flow_ops_gen_cc, Configuration: Release x64 ------
52>------ Build started: Project: image_ops_gen_cc, Configuration: Release x64 ------
45>collective_rma_local.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\accumulate_n_optimizer.cc)
45>collective_util.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\buf_rendezvous.cc)
45>copy_tensor.cc
46>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\cc\framework\scope.cc)
45>costmodel_manager.cc
47>Done building project "tf_python_op_gen_main.vcxproj" -- FAILED.
53>------ Build started: Project: random_ops_gen_cc, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\collective_executor_mgr.cc)
45>debugger_state_interface.cc
45>device.cc
46>Done building project "tf_cc_framework.vcxproj" -- FAILED.
54>------ Build started: Project: io_ops_gen_cc, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\collective_param_resolver_local.cc)
45>device_factory.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\collective_rma_local.cc)
45>device_mgr.cc
48>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
51>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
48>Done building project "control_flow_ops_gen_cc.vcxproj" -- FAILED.
49>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
55>------ Build started: Project: summary_ops_gen_cc, Configuration: Release x64 ------
52>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
51>Done building project "data_flow_ops_gen_cc.vcxproj" -- FAILED.
56>------ Build started: Project: sendrecv_ops_gen_cc, Configuration: Release x64 ------
52>Done building project "image_ops_gen_cc.vcxproj" -- FAILED.
49>Done building project "ctc_ops_gen_cc.vcxproj" -- FAILED.
57>------ Build started: Project: decode_proto_ops_gen_cc, Configuration: Release x64 ------
50>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
58>------ Build started: Project: remote_fused_graph_ops_gen_cc, Configuration: Release x64 ------
50>Done building project "cudnn_rnn_ops_gen_cc.vcxproj" -- FAILED.
59>------ Build started: Project: sdca_ops_gen_cc, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\collective_util.cc)
45>device_resolver_local.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\copy_tensor.cc)
45>device_set.cc
53>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
58>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
57>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
53>Done building project "random_ops_gen_cc.vcxproj" -- FAILED.
60>------ Build started: Project: linalg_ops_gen_cc, Configuration: Release x64 ------
58>Done building project "remote_fused_graph_ops_gen_cc.vcxproj" -- FAILED.
61>------ Build started: Project: string_ops_gen_cc, Configuration: Release x64 ------
57>Done building project "decode_proto_ops_gen_cc.vcxproj" -- FAILED.
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\debugger_state_interface.cc)
62>------ Build started: Project: list_ops_gen_cc, Configuration: Release x64 ------
45>attr_builder.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\device.cc)
45>context.cc
45>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\eager\attr_builder.cc)
59>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
45>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\eager\attr_builder.cc)
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\device_mgr.cc)
45>eager_executor.cc
45>eager_operation.cc
59>Done building project "sdca_ops_gen_cc.vcxproj" -- FAILED.
63>------ Build started: Project: stateless_random_ops_gen_cc, Configuration: Release x64 ------
54>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
54>Done building project "io_ops_gen_cc.vcxproj" -- FAILED.
45>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\eager\eager_operation.cc)
64>------ Build started: Project: resource_variable_ops_gen_cc, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\eager\eager_operation.cc)
55>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
62>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
45>execute.cc
60>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
55>Done building project "summary_ops_gen_cc.vcxproj" -- FAILED.
62>Done building project "list_ops_gen_cc.vcxproj" -- FAILED.
65>------ Build started: Project: encode_proto_ops_gen_cc, Configuration: Release x64 ------
66>------ Build started: Project: script_ops_gen_cc, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\device_set.cc)
45>kernel_and_device.cc
60>Done building project "linalg_ops_gen_cc.vcxproj" -- FAILED.
67>------ Build started: Project: logging_ops_gen_cc, Configuration: Release x64 ------
61>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
61>Done building project "string_ops_gen_cc.vcxproj" -- FAILED.
68>------ Build started: Project: state_ops_gen_cc, Configuration: Release x64 ------
45>tensor_handle.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\device_resolver_local.cc)
45>eval_const_tensor.cc
66>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
56>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\eager\attr_builder.cc)
66>Done building project "script_ops_gen_cc.vcxproj" -- FAILED.
69>------ Build started: Project: lookup_ops_gen_cc, Configuration: Release x64 ------
64>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
45>executor.cc
56>Done building project "sendrecv_ops_gen_cc.vcxproj" -- FAILED.
70>------ Build started: Project: array_ops_gen_cc, Configuration: Release x64 ------
64>Done building project "resource_variable_ops_gen_cc.vcxproj" -- FAILED.
71>------ Build started: Project: manip_ops_gen_cc, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\eager\context.cc)
45>executor_factory.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\eager\eager_executor.cc)
45>function.cc
67>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
63>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
67>Done building project "logging_ops_gen_cc.vcxproj" -- FAILED.
72>------ Build started: Project: rpc_ops_gen_cc, Configuration: Release x64 ------
63>Done building project "stateless_random_ops_gen_cc.vcxproj" -- FAILED.
73>------ Build started: Project: math_ops_gen_cc, Configuration: Release x64 ------
70>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\eager\eager_operation.cc)
70>Done building project "array_ops_gen_cc.vcxproj" -- FAILED.
74>------ Build started: Project: audio_ops_gen_cc, Configuration: Release x64 ------
45>gpu_id_manager.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\eager\execute.cc)
45>graph_execution_state.cc
68>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\eager\kernel_and_device.cc)
45>graph_optimizer.cc
68>Done building project "state_ops_gen_cc.vcxproj" -- FAILED.
75>------ Build started: Project: nn_ops_gen_cc, Configuration: Release x64 ------
65>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
71>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
65>Done building project "encode_proto_ops_gen_cc.vcxproj" -- FAILED.
76>------ Build started: Project: spectral_ops_gen_cc, Configuration: Release x64 ------
71>Done building project "manip_ops_gen_cc.vcxproj" -- FAILED.
77>------ Build started: Project: no_op_gen_cc, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\eager\tensor_handle.cc)
45>graph_runner.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\executor.cc)
45>hierarchical_tree_broadcaster.cc
72>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
72>Done building project "rpc_ops_gen_cc.vcxproj" -- FAILED.
78>------ Build started: Project: batch_ops_gen_cc, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\function.cc)
73>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
45>local_device.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\eval_const_tensor.cc)
45>lower_if_op.cc
73>Done building project "math_ops_gen_cc.vcxproj" -- FAILED.
79>------ Build started: Project: user_ops_gen_cc, Configuration: Release x64 ------
45>lower_while_op.cc
74>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
74>Done building project "audio_ops_gen_cc.vcxproj" -- FAILED.
80>------ Build started: Project: bitwise_ops_gen_cc, Configuration: Release x64 ------
77>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
77>Done building project "no_op_gen_cc.vcxproj" -- FAILED.
81>------ Build started: Project: parsing_ops_gen_cc, Configuration: Release x64 ------
75>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
75>Done building project "nn_ops_gen_cc.vcxproj" -- FAILED.
82>------ Build started: Project: sparse_ops_gen_cc, Configuration: Release x64 ------
69>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
69>Done building project "lookup_ops_gen_cc.vcxproj" -- FAILED.
83>------ Build started: Project: functional_ops_gen_cc, Configuration: Release x64 ------
76>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
78>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\graph_execution_state.cc)
76>Done building project "spectral_ops_gen_cc.vcxproj" -- FAILED.
45>memory_types.cc
84>------ Build started: Project: dataset_ops_gen_cc, Configuration: Release x64 ------
78>Done building project "batch_ops_gen_cc.vcxproj" -- FAILED.
85>------ Build started: Project: training_ops_gen_cc, Configuration: Release x64 ------
45>mkl_cpu_allocator.cc
45>optimization_registry.cc
79>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
79>Done building project "user_ops_gen_cc.vcxproj" -- FAILED.
86>------ Build started: Project: candidate_sampling_ops_gen_cc, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\graph_runner.cc)
45>parallel_concat_optimizer.cc
85>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
81>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
82>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
85>Done building project "training_ops_gen_cc.vcxproj" -- FAILED.
81>Done building project "parsing_ops_gen_cc.vcxproj" -- FAILED.
87>------ Build started: Project: checkpoint_ops_gen_cc, Configuration: Release x64 ------
88>------ Build started: Project: boosted_trees_ops_gen_cc, Configuration: Release x64 ------
82>Done building project "sparse_ops_gen_cc.vcxproj" -- FAILED.
80>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
89>------ Build started: Project: set_ops_gen_cc, Configuration: Release x64 ------
80>Done building project "bitwise_ops_gen_cc.vcxproj" -- FAILED.
90>------ Build started: Project: remote_fused_graph_ops_gen_python, Configuration: Release x64 ------
83>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\graph_optimizer.cc)
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\local_device.cc)
45>placer.cc
45>pool_allocator.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\hierarchical_tree_broadcaster.cc)
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\lower_if_op.cc)
83>Done building project "functional_ops_gen_cc.vcxproj" -- FAILED.
45>process_function_library_runtime.cc
45>process_state.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\lower_while_op.cc)
45>process_util.cc
91>------ Build started: Project: sdca_ops_gen_python, Configuration: Release x64 ------
84>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
84>Done building project "dataset_ops_gen_cc.vcxproj" -- FAILED.
92>------ Build started: Project: random_ops_gen_python, Configuration: Release x64 ------
87>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
88>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
86>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
87>Done building project "checkpoint_ops_gen_cc.vcxproj" -- FAILED.
93>------ Build started: Project: set_ops_gen_python, Configuration: Release x64 ------
88>Done building project "boosted_trees_ops_gen_cc.vcxproj" -- FAILED.
86>Done building project "candidate_sampling_ops_gen_cc.vcxproj" -- FAILED.
94>------ Build started: Project: parsing_ops_gen_python, Configuration: Release x64 ------
95>------ Build started: Project: sparse_ops_gen_python, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\optimization_registry.cc)
45>renamed_device.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\process_function_library_runtime.cc)
45>rendezvous_mgr.cc
90>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
92>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
90>Done building project "remote_fused_graph_ops_gen_python.vcxproj" -- FAILED.
95>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
91>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
96>------ Build started: Project: nn_ops_gen_python, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\placer.cc)
91>Done building project "sdca_ops_gen_python.vcxproj" -- FAILED.
97>------ Build started: Project: math_ops_gen_python, Configuration: Release x64 ------
45>rendezvous_util.cc
95>Done building project "sparse_ops_gen_python.vcxproj" -- FAILED.
92>Done building project "random_ops_gen_python.vcxproj" -- FAILED.
98>------ Build started: Project: manip_ops_gen_python, Configuration: Release x64 ------
99>------ Build started: Project: lookup_ops_gen_python, Configuration: Release x64 ------
94>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
94>Done building project "parsing_ops_gen_python.vcxproj" -- FAILED.
100>------ Build started: Project: logging_ops_gen_python, Configuration: Release x64 ------
89>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
89>Done building project "set_ops_gen_cc.vcxproj" -- FAILED.
101>------ Build started: Project: tf_cc_ops, Configuration: Release x64 ------
100>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
100>Done building project "logging_ops_gen_python.vcxproj" -- FAILED.
102>------ Build started: Project: state_ops_gen_python, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\parallel_concat_optimizer.cc)
45>ring_reducer.cc
45>scoped_allocator.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\renamed_device.cc)
45>scoped_allocator_mgr.cc
98>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
45>session_ref.cc
45>session_state.cc
102>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
98>Done building project "manip_ops_gen_python.vcxproj" -- FAILED.
103>------ Build started: Project: list_ops_gen_python, Configuration: Release x64 ------
102>Done building project "state_ops_gen_python.vcxproj" -- FAILED.
104>------ Build started: Project: stateless_random_ops_gen_python, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\rendezvous_mgr.cc)
45>shape_refiner.cc
93>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
93>Done building project "set_ops_gen_python.vcxproj" -- FAILED.
105>------ Build started: Project: linalg_ops_gen_python, Configuration: Release x64 ------
103>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
103>Done building project "list_ops_gen_python.vcxproj" -- FAILED.
106>------ Build started: Project: string_ops_gen_python, Configuration: Release x64 ------
99>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
97>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
99>Done building project "lookup_ops_gen_python.vcxproj" -- FAILED.
107>------ Build started: Project: io_ops_gen_python, Configuration: Release x64 ------
96>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
97>Done building project "math_ops_gen_python.vcxproj" -- FAILED.
104>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
108>------ Build started: Project: summary_ops_gen_python, Configuration: Release x64 ------
45>stats_publisher_interface.cc
96>Done building project "nn_ops_gen_python.vcxproj" -- FAILED.
109>------ Build started: Project: image_ops_gen_python, Configuration: Release x64 ------
104>Done building project "stateless_random_ops_gen_python.vcxproj" -- FAILED.
110>------ Build started: Project: functional_ops_gen_python, Configuration: Release x64 ------
105>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
105>Done building project "linalg_ops_gen_python.vcxproj" -- FAILED.
111>------ Build started: Project: encode_proto_ops_gen_python, Configuration: Release x64 ------
106>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
107>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
106>Done building project "string_ops_gen_python.vcxproj" -- FAILED.
112>------ Build started: Project: rpc_ops_gen_python, Configuration: Release x64 ------
107>Done building project "io_ops_gen_python.vcxproj" -- FAILED.
113>------ Build started: Project: debug_ops_gen_python, Configuration: Release x64 ------
109>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
109>Done building project "image_ops_gen_python.vcxproj" -- FAILED.
114>------ Build started: Project: resource_variable_ops_gen_python, Configuration: Release x64 ------
108>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
112>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
108>Done building project "summary_ops_gen_python.vcxproj" -- FAILED.
115>------ Build started: Project: dataset_ops_gen_python, Configuration: Release x64 ------
112>Done building project "rpc_ops_gen_python.vcxproj" -- FAILED.
116>------ Build started: Project: spectral_ops_gen_python, Configuration: Release x64 ------
114>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
113>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
114>Done building project "resource_variable_ops_gen_python.vcxproj" -- FAILED.
117>------ Build started: Project: data_flow_ops_gen_python, Configuration: Release x64 ------
45>step_stats_collector.cc
113>Done building project "debug_ops_gen_python.vcxproj" -- FAILED.
118>------ Build started: Project: cudnn_rnn_ops_gen_python, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\ring_reducer.cc)
45>sycl_allocator.cc
45>sycl_device.cc
45>sycl_device_context.cc
45>sycl_device_factory.cc
45>threadpool_device.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\shape_refiner.cc)
45>threadpool_device_factory.cc
45>debug.cc
115>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
115>Done building project "dataset_ops_gen_python.vcxproj" -- FAILED.
119>------ Build started: Project: script_ops_gen_python, Configuration: Release x64 ------
45>debug_callback_registry.cc
116>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
116>Done building project "spectral_ops_gen_python.vcxproj" -- FAILED.
111>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
120>------ Build started: Project: ctc_ops_gen_python, Configuration: Release x64 ------
110>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
111>Done building project "encode_proto_ops_gen_python.vcxproj" -- FAILED.
121>------ Build started: Project: control_flow_ops_gen_python, Configuration: Release x64 ------
110>Done building project "functional_ops_gen_python.vcxproj" -- FAILED.
122>------ Build started: Project: contrib_text_skip_gram_ops_gen_python, Configuration: Release x64 ------
117>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
119>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
117>Done building project "data_flow_ops_gen_python.vcxproj" -- FAILED.
119>Done building project "script_ops_gen_python.vcxproj" -- FAILED.
123>------ Build started: Project: contrib_tensor_forest_stats_ops_gen_python, Configuration: Release x64 ------
124>------ Build started: Project: contrib_tensor_forest_ops_gen_python, Configuration: Release x64 ------
45>debug_graph_utils.cc
125>------ Build started: Project: contrib_tensor_forest_model_ops_gen_python, Configuration: Release x64 ------
120>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
122>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
120>Done building project "ctc_ops_gen_python.vcxproj" -- FAILED.
126>------ Build started: Project: contrib_tensor_forest_hybrid_ops_gen_python, Configuration: Release x64 ------
122>Done building project "contrib_text_skip_gram_ops_gen_python.vcxproj" -- FAILED.
127>------ Build started: Project: contrib_seq2seq_beam_search_ops_gen_python, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\threadpool_device_factory.cc)
45>debug_io_utils.cc
123>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
45>debug_node_key.cc
45>debugger_state_impl.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\threadpool_device.cc)
45>server_lib.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\debug\debug.cc)
45>algorithm.cc
45>colors.cc
123>Done building project "contrib_tensor_forest_stats_ops_gen_python.vcxproj" -- FAILED.
128>------ Build started: Project: contrib_rnn_lstm_ops_gen_python, Configuration: Release x64 ------
121>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
126>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
127>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
121>Done building project "control_flow_ops_gen_python.vcxproj" -- FAILED.
129>------ Build started: Project: contrib_rnn_gru_ops_gen_python, Configuration: Release x64 ------
45>control_flow.cc
126>Done building project "contrib_tensor_forest_hybrid_ops_gen_python.vcxproj" -- FAILED.
124>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
130>------ Build started: Project: contrib_resampler_ops_gen_python, Configuration: Release x64 ------
127>Done building project "contrib_seq2seq_beam_search_ops_gen_python.vcxproj" -- FAILED.
131>------ Build started: Project: contrib_periodic_resample_ops_gen_python, Configuration: Release x64 ------
124>Done building project "contrib_tensor_forest_ops_gen_python.vcxproj" -- FAILED.
132>------ Build started: Project: contrib_nearest_neighbor_ops_gen_python, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\debug\debug_graph_utils.cc)
45>costmodel.cc
45>gradients.cc
125>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
125>Done building project "contrib_tensor_forest_model_ops_gen_python.vcxproj" -- FAILED.
133>------ Build started: Project: contrib_nccl_ops_gen_python, Configuration: Release x64 ------
131>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
131>Done building project "contrib_periodic_resample_ops_gen_python.vcxproj" -- FAILED.
134>------ Build started: Project: contrib_memory_stats_ops_gen_python, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\debug\debugger_state_impl.cc)
130>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
130>Done building project "contrib_resampler_ops_gen_python.vcxproj" -- FAILED.
45>graph_constructor.cc
135>------ Build started: Project: contrib_layers_sparse_feature_cross_ops_gen_python, Configuration: Release x64 ------
128>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
128>Done building project "contrib_rnn_lstm_ops_gen_python.vcxproj" -- FAILED.
136>------ Build started: Project: contrib_input_pipeline_ops_gen_python, Configuration: Release x64 ------
101>Generating tensorflow/cc/ops/audio_ops.h, tensorflow/cc/ops/audio_ops.cc, tensorflow/cc/ops/audio_ops_internal.h, tensorflow/cc/ops/audio_ops_internal.cc
45>graph_def_builder_util.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\graph\gradients.cc)
45>graph_partition.cc
45>mkl_layout_pass.cc
101>'Release\audio_ops_gen_cc.exe' is not recognized as an internal or external command,
45>mkl_tfconversion_pass.cc
101>operable program or batch file.
133>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
45>optimizer_cse.cc
133>Done building project "contrib_nccl_ops_gen_python.vcxproj" -- FAILED.
137>------ Build started: Project: contrib_image_sirds_ops_gen_python, Configuration: Release x64 ------
45>quantize_training.cc
135>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
101>C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\VC\VCTargets\Microsoft.CppCommon.targets(171,5): error MSB6006: "cmd.exe" exited with code 9009.
101>Done building project "tf_cc_ops.vcxproj" -- FAILED.
138>------ Build started: Project: tf_cc_while_loop, Configuration: Release x64 ------
135>Done building project "contrib_layers_sparse_feature_cross_ops_gen_python.vcxproj" -- FAILED.
139>------ Build started: Project: tf_cc, Configuration: Release x64 ------
132>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
136>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
132>Done building project "contrib_nearest_neighbor_ops_gen_python.vcxproj" -- FAILED.
140>------ Build started: Project: contrib_image_ops_gen_python, Configuration: Release x64 ------
136>Done building project "contrib_input_pipeline_ops_gen_python.vcxproj" -- FAILED.
134>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
141>------ Build started: Project: contrib_image_distort_image_ops_gen_python, Configuration: Release x64 ------
134>Done building project "contrib_memory_stats_ops_gen_python.vcxproj" -- FAILED.
142>------ Build started: Project: contrib_framework_variable_ops_gen_python, Configuration: Release x64 ------
129>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
129>Done building project "contrib_rnn_gru_ops_gen_python.vcxproj" -- FAILED.
143>------ Build started: Project: contrib_gcs_config_ops_gen_python, Configuration: Release x64 ------
137>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
137>Done building project "contrib_image_sirds_ops_gen_python.vcxproj" -- FAILED.
144>------ Build started: Project: contrib_factorization_factorization_ops_gen_python, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\graph\graph_constructor.cc)
45>validate.cc
45>cluster.cc
138>while_loop.cc
45>virtual_cluster.cc
45>analytical_cost_estimator.cc
139>client_session.cc
139>array_grad.cc
139>data_flow_grad.cc
139>c:\users\john\tensorflow\tensorflow\cc\gradients\data_flow_grad.cc(16): fatal error C1083: Cannot open include file: 'tensorflow/cc/ops/data_flow_ops.h': No such file or directory
139>image_grad.cc
139>math_grad.cc
139>nn_grad.cc
139>c:\users\john\tensorflow\tensorflow\cc\gradients\nn_grad.cc(16): fatal error C1083: Cannot open include file: 'tensorflow/cc/ops/nn_ops.h': No such file or directory
139>coordinator.cc
139>c:\users\john\tensorflow\tensorflow\cc\gradients\math_grad.cc(19): fatal error C1083: Cannot open include file: 'tensorflow/cc/ops/array_ops_internal.h': No such file or directory
139>queue_runner.cc
139>grad_op_registry.cc
139>gradient_checker.cc
139>gradients.cc
139>c:\users\john\tensorflow\tensorflow\cc\gradients\array_grad.cc(18): fatal error C1083: Cannot open include file: 'tensorflow/cc/ops/array_ops_internal.h': No such file or directory
139>while_gradients.cc
142>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\graph\quantize_training.cc)
45>graph_memory.cc
142>Done building project "contrib_framework_variable_ops_gen_python.vcxproj" -- FAILED.
145>------ Build started: Project: contrib_factorization_clustering_ops_gen_python, Configuration: Release x64 ------
45>graph_properties.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\clusters\cluster.cc)
45>measuring_cost_estimator.cc
141>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
144>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
141>Done building project "contrib_image_distort_image_ops_gen_python.vcxproj" -- FAILED.
144>Done building project "contrib_factorization_factorization_ops_gen_python.vcxproj" -- FAILED.
146>------ Build started: Project: contrib_data_dataset_ops_gen_python, Configuration: Release x64 ------
147>------ Build started: Project: contrib_coder_ops_gen_python, Configuration: Release x64 ------
140>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
143>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
138>c:\users\john\tensorflow\tensorflow\cc\ops\while_loop.cc(19): fatal error C1083: Cannot open include file: 'tensorflow/cc/ops/control_flow_ops_internal.h': No such file or directory
140>Done building project "contrib_image_ops_gen_python.vcxproj" -- FAILED.
143>Done building project "contrib_gcs_config_ops_gen_python.vcxproj" -- FAILED.
148>------ Build started: Project: contrib_boosted_trees_training_ops_gen_python, Configuration: Release x64 ------
149>------ Build started: Project: contrib_boosted_trees_stats_accumulator_ops_gen_python, Configuration: Release x64 ------
139>c:\users\john\tensorflow\tensorflow\cc\gradients\image_grad.cc(19): fatal error C1083: Cannot open include file: 'tensorflow/cc/ops/image_ops_internal.h': No such file or directory
139>c:\users\john\tensorflow\tensorflow\cc\ops\standard_ops.h(19): fatal error C1083: Cannot open include file: 'tensorflow/cc/ops/array_ops.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\cc\framework\gradients.cc)
139>c:\users\john\tensorflow\tensorflow\cc\framework\while_gradients.cc(20): fatal error C1083: Cannot open include file: 'tensorflow/cc/ops/control_flow_ops_internal.h': No such file or directory
139>c:\users\john\tensorflow\tensorflow\cc\ops\standard_ops.h(19): fatal error C1083: Cannot open include file: 'tensorflow/cc/ops/array_ops.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\cc\framework\gradient_checker.cc)
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\clusters\virtual_cluster.cc)
45>op_level_cost_estimator.cc
138>Done building project "tf_cc_while_loop.vcxproj" -- FAILED.
150>------ Build started: Project: tf_c, Configuration: Release x64 ------
45>robust_stats.cc
45>virtual_placer.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\costs\analytical_cost_estimator.cc)
45>virtual_scheduler.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\costs\graph_memory.cc)
45>devices.cc
45>gen_node.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\costs\graph_properties.cc)
45>graph_analyzer.cc
45>c:\users\john\tensorflow\tensorflow\core\grappler\graph_analyzer\graph_analyzer.cc(20): fatal error C1083: Cannot open include file: 'absl/strings/str_format.h': No such file or directory
45>graph_analyzer_tool.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\costs\measuring_cost_estimator.cc)
45>sig_node.cc
45>c:\users\john\tensorflow\tensorflow\core\grappler\graph_analyzer\gen_node.cc(18): fatal error C1083: Cannot open include file: 'absl/strings/str_format.h': No such file or directory
45>graph_view.cc
150>c_api.cc
147>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
150>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C
150>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams'
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\costs\virtual_placer.cc)
147>Done building project "contrib_coder_ops_gen_python.vcxproj" -- FAILED.
45>grappler_item.cc
151>------ Build started: Project: contrib_boosted_trees_split_handler_ops_gen_python, Configuration: Release x64 ------
45>grappler_item_builder.cc
139>Done building project "tf_cc.vcxproj" -- FAILED.
152>------ Build started: Project: decode_proto_ops_gen_python, Configuration: Release x64 ------
151>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
151>Done building project "contrib_boosted_trees_split_handler_ops_gen_python.vcxproj" -- FAILED.
146>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
153>------ Build started: Project: contrib_boosted_trees_prediction_ops_gen_python, Configuration: Release x64 ------
146>Done building project "contrib_data_dataset_ops_gen_python.vcxproj" -- FAILED.
154>------ Build started: Project: contrib_boosted_trees_model_ops_gen_python, Configuration: Release x64 ------
149>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
149>Done building project "contrib_boosted_trees_stats_accumulator_ops_gen_python.vcxproj" -- FAILED.
155>------ Build started: Project: contrib_bigquery_reader_ops_gen_python, Configuration: Release x64 ------
148>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
152>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
148>Done building project "contrib_boosted_trees_training_ops_gen_python.vcxproj" -- FAILED.
156>------ Build started: Project: checkpoint_ops_gen_python, Configuration: Release x64 ------
152>Done building project "decode_proto_ops_gen_python.vcxproj" -- FAILED.
157>------ Build started: Project: array_ops_gen_python, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\costs\virtual_scheduler.cc)
145>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
154>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
150>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory
145>Done building project "contrib_factorization_clustering_ops_gen_python.vcxproj" -- FAILED.
45>file_input_yielder.cc
45>c:\users\john\tensorflow\tensorflow\core\grappler\graph_analyzer\sig_node.cc(20): fatal error C1083: Cannot open include file: 'absl/strings/str_format.h': No such file or directory
45>mutable_graph_view.cc
45>op_types.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\grappler_item_builder.cc)
45>arithmetic_optimizer.cc
158>------ Build started: Project: candidate_sampling_ops_gen_python, Configuration: Release x64 ------
154>Done building project "contrib_boosted_trees_model_ops_gen_python.vcxproj" -- FAILED.
45>auto_parallel.cc
159>------ Build started: Project: boosted_trees_ops_gen_python, Configuration: Release x64 ------
150>Done building project "tf_c.vcxproj" -- FAILED.
153>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
160>------ Build started: Project: tf_c_python_api, Configuration: Release x64 ------
153>Done building project "contrib_boosted_trees_prediction_ops_gen_python.vcxproj" -- FAILED.
161>------ Build started: Project: bitwise_ops_gen_python, Configuration: Release x64 ------
45>custom_graph_optimizer_registry.cc
158>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
158>Done building project "candidate_sampling_ops_gen_python.vcxproj" -- FAILED.
162>------ Build started: Project: training_ops_gen_python, Configuration: Release x64 ------
155>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
159>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
155>Done building project "contrib_bigquery_reader_ops_gen_python.vcxproj" -- FAILED.
159>Done building project "boosted_trees_ops_gen_python.vcxproj" -- FAILED.
163>------ Build started: Project: user_ops_gen_python, Configuration: Release x64 ------
164>------ Build started: Project: batch_ops_gen_python, Configuration: Release x64 ------
156>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
156>Done building project "checkpoint_ops_gen_python.vcxproj" -- FAILED.
165>------ Build started: Project: audio_ops_gen_python, Configuration: Release x64 ------
157>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
161>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
157>Done building project "array_ops_gen_python.vcxproj" -- FAILED.
166>------ Build started: Project: contrib_boosted_trees_quantiles_ops_gen_python, Configuration: Release x64 ------
161>Done building project "bitwise_ops_gen_python.vcxproj" -- FAILED.
163>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
163>Done building project "user_ops_gen_python.vcxproj" -- FAILED.
45>filter_fusion.cc
45>fusion_utils.cc
45>graph_utils.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\arithmetic_optimizer.cc)
45>latency_all_edges.cc
162>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
45>map_and_batch_fusion.cc
162>Done building project "training_ops_gen_python.vcxproj" -- FAILED.
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\auto_parallel.cc)
45>map_and_filter_fusion.cc
45>map_fusion.cc
45>map_vectorization.cc
164>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
165>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
164>Done building project "batch_ops_gen_python.vcxproj" -- FAILED.
165>Done building project "audio_ops_gen_python.vcxproj" -- FAILED.
166>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
166>Done building project "contrib_boosted_trees_quantiles_ops_gen_python.vcxproj" -- FAILED.
167>------ Build started: Project: tf_python_ops, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\data\filter_fusion.cc)
45>noop_elimination.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\data\latency_all_edges.cc)
45>shuffle_and_repeat_fusion.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\data\map_and_batch_fusion.cc)
45>debug_stripper.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\data\map_fusion.cc)
45>dependency_optimizer.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\data\map_and_filter_fusion.cc)
45>evaluation_utils.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\data\map_vectorization.cc)
45>function_optimizer.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\data\noop_elimination.cc)
45>gpu_swapping_kernels.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\data\shuffle_and_repeat_fusion.cc)
45>gpu_swapping_ops.cc
45>graph_optimizer_stage.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\debug_stripper.cc)
45>graph_rewriter.cc
45>layout_optimizer.cc
167>Generating tf_python/tensorflow/python/ops/gen_audio_ops.py
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\dependency_optimizer.cc)
45>loop_optimizer.cc
167>'Release\audio_ops_gen_python.exe' is not recognized as an internal or external command,
167>operable program or batch file.
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\function_optimizer.cc)
45>memory_optimizer.cc
167>C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\VC\VCTargets\Microsoft.CppCommon.targets(171,5): error MSB6006: "cmd.exe" exited with code 9009.
167>Done building project "tf_python_ops.vcxproj" -- FAILED.
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\gpu_swapping_kernels.cc)
45>meta_optimizer.cc
45>model_pruner.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\graph_optimizer_stage.cc)
45>remapper.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\loop_optimizer.cc)
45>scoped_allocator_optimizer.cc
45>shape_optimizer.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\layout_optimizer.cc)
45>static_schedule.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\memory_optimizer.cc)
45>symbolic_shapes.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\meta_optimizer.cc)
45>colocation.cc
45>frame.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\shape_optimizer.cc)
45>functions.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\remapper.cc)
45>scc.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\scoped_allocator_optimizer.cc)
45>topological_sort.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\static_schedule.cc)
45>traversal.cc
45>Done building project "tf_core_cpu.vcxproj" -- FAILED.
168>------ Build started: Project: tf_core_kernels, Configuration: Release x64 ------
169>------ Build started: Project: tf_grappler, Configuration: Release x64 ------
170>------ Build started: Project: tf_core_distributed_runtime, Configuration: Release x64 ------
171>------ Build started: Project: tf_core_direct_session, Configuration: Release x64 ------
171>direct_session.cc
169>single_machine.cc
169>cost_analyzer.cc
169>model_analyzer.cc
170>base_rendezvous_mgr.cc
170>call_options.cc
170>cluster_function_library_runtime.cc
170>collective_param_resolver_distributed.cc
170>collective_rma_distributed.cc
170>device_resolver_distributed.cc
170>eager_service_impl.cc
170>graph_mgr.cc
170>local_master.cc
168>adjust_contrast_op.cc
168>adjust_hue_op.cc
168>adjust_saturation_op.cc
168>aggregate_ops.cc
168>argmax_op.cc
168>as_string_op.cc
168>attention_ops.cc
168>avgpooling_op.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\collective_rma_distributed.cc)
170>master.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\base_rendezvous_mgr.cc)
170>master_session.cc
169>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\python\grappler\cost_analyzer.cc)
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\eager\eager_service_impl.cc)
170>message_wrappers.cc
170>c:\users\john\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.h(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/tensor.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\cluster_function_library_runtime.cc)
170>partial_run_mgr.cc
169>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\clusters\single_machine.cc)
169>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\python\grappler\model_analyzer.cc)
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\device_resolver_distributed.cc)
170>recent_request_ids.cc
171>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\graph_mgr.cc)
170>remote_device.cc
170>c:\users\john\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.h(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/tensor.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\collective_param_resolver_distributed.cc)
170>request_id.cc
169>Done building project "tf_grappler.vcxproj" -- FAILED.
171>Done building project "tf_core_direct_session.vcxproj" -- FAILED.
170>c:\users\john\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.h(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/tensor.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\local_master.cc)
170>grpc_eager_client.cc
170>grpc_eager_service.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\master_session.cc)
170>grpc_eager_service_impl.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\master.cc)
170>grpc_channel.cc
170>c:\users\john\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.h(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/tensor.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.cc)
170>grpc_master_service.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\remote_device.cc)
170>grpc_master_service_impl.cc
168>barrier_ops.cc
168>base64_ops.cc
170>c:\users\john\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.h(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/tensor.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\partial_run_mgr.cc)
170>grpc_remote_master.cc
168>batch_kernels.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\rpc\eager\grpc_eager_service_impl.cc)
170>grpc_remote_worker.cc
170>grpc_rpc_factory.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\rpc\grpc_master_service.cc)
170>grpc_rpc_factory_registration.cc
170>grpc_server_lib.cc
170>c:\users\john\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.h(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/tensor.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\rpc\grpc_remote_master.cc)
168>batch_matmul_op_complex.cc
170>grpc_session.cc
168>batch_matmul_op_real.cc
170>grpc_tensor_coding.cc
170>grpc_util.cc
170>c:\users\john\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.h(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/tensor.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\rpc\grpc_remote_worker.cc)
170>grpc_worker_cache.cc
170>c:\users\john\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.h(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/tensor.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\rpc\grpc_session.cc)
170>grpc_worker_service.cc
170>c:\users\john\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.h(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/tensor.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\rpc\grpc_server_lib.cc)
170>grpc_worker_service_impl.cc
168>batch_norm_op.cc
170>rpc_rendezvous_mgr.cc
170>rpc_collective_executor_mgr.cc
170>c:\users\john\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.h(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/tensor.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\rpc\grpc_worker_cache.cc)
170>scheduler.cc
170>session_mgr.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\rpc\rpc_rendezvous_mgr.cc)
170>tensor_coding.cc
170>worker.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\rpc\grpc_worker_service.cc)
170>worker_cache_logger.cc
170>worker_cache_partial.cc
168>fake_clock_env.cc
168>periodic_function.cc
168>batchtospace_op.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\session_mgr.cc)
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\scheduler.cc)
170>worker_session.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\rpc_collective_executor_mgr.cc)
168>bcast_ops.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\tensor_coding.cc)
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\worker.cc)
170>c:\users\john\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.h(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/tensor.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\worker_cache_partial.cc)
168>betainc_op.cc
168>bias_op.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\worker_session.cc)
168>c:\users\john\tensorflow\tensorflow\core\kernels\batchtospace_op.cc(198): warning C4002: too many actual parameters for macro 'TF_BATCHTOSPACE_BLOCK_DIMS_CASE'
170>Done building project "tf_core_distributed_runtime.vcxproj" -- FAILED.
168>bincount_op.cc
168>bitcast_op.cc
168>resource_ops.cc
168>resources.cc
168>stats_ops.cc
168>broadcast_to_op.cc
168>bucketize_op.cc
168>candidate_sampler_ops.cc
168>cast_op.cc
168>cast_op_impl_bfloat.cc
168>cast_op_impl_bool.cc
168>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\kernels\cast_op.cc)
168>cast_op_impl_complex128.cc
168>cast_op_impl_complex64.cc
168>cast_op_impl_double.cc
168>cast_op_impl_float.cc
168>cast_op_impl_half.cc
168>cast_op_impl_int16.cc
168>cast_op_impl_int32.cc
168>cast_op_impl_int64.cc
168>cast_op_impl_int8.cc
168>cast_op_impl_uint16.cc
168>cast_op_impl_uint32.cc
168>cast_op_impl_uint64.cc
168>cast_op_impl_uint8.cc
168>check_numerics_op.cc
168>cholesky_grad.cc
168>cholesky_op.cc
168>collective_ops.cc
168>colorspace_op.cc
168>compare_and_bitpack_op.cc
168>concat_lib_cpu.cc
168>concat_lib_gpu.cc
168>c:\users\john\tensorflow\tensorflow\core\kernels\compare_and_bitpack_op.cc(99): warning C4805: '|': unsafe mix of type 'int' and type 'bool' in operation
168>c:\users\john\tensorflow\tensorflow\core\kernels\compare_and_bitpack_op.cc(92): note: while compiling class template member function 'void tensorflow::functor::ComputeShard<T,void,void>::Compute(Eigen::TensorMap<Eigen::Tensor<const T,2,1,IndexType>,16,Eigen::MakePointer>,Eigen::TensorMap<Eigen::Tensor<unsigned char,2,1,IndexType>,16,Eigen::MakePointer>,const T &,tensorflow::int64,tensorflow::int64)'
168> with
168> [
168> T=tensorflow::bfloat16,
168> IndexType=Eigen::DenseIndex
168> ]
168>c:\users\john\tensorflow\tensorflow\core\kernels\compare_and_bitpack_op.cc(149): note: see reference to function template instantiation 'void tensorflow::functor::ComputeShard<T,void,void>::Compute(Eigen::TensorMap<Eigen::Tensor<const T,2,1,IndexType>,16,Eigen::MakePointer>,Eigen::TensorMap<Eigen::Tensor<unsigned char,2,1,IndexType>,16,Eigen::MakePointer>,const T &,tensorflow::int64,tensorflow::int64)' being compiled
168> with
168> [
168> T=tensorflow::bfloat16,
168> IndexType=Eigen::DenseIndex
168> ]
168>c:\users\john\tensorflow\tensorflow\core\kernels\compare_and_bitpack_op.cc(149): note: see reference to class template instantiation 'tensorflow::functor::ComputeShard<T,void,void>' being compiled
168> with
168> [
168> T=tensorflow::bfloat16
168> ]
168>c:\users\john\tensorflow\tensorflow\core\kernels\compare_and_bitpack_op.cc(146): note: while compiling class template member function 'void tensorflow::functor::CompareAndBitpack<Device,T>::operator ()(tensorflow::OpKernelContext *,Eigen::TensorMap<Eigen::Tensor<const T,2,1,IndexType>,16,Eigen::MakePointer>,Eigen::TensorMap<Eigen::TensorFixedSize<const T,Eigen::Sizes<>,1,IndexType>,16,Eigen::MakePointer>,Eigen::TensorMap<Eigen::Tensor<unsigned char,2,1,IndexType>,16,Eigen::MakePointer>)'
168> with
168> [
168> Device=tensorflow::CPUDevice,
168> T=tensorflow::bfloat16,
168> IndexType=Eigen::DenseIndex
168> ]
168>c:\users\john\tensorflow\tensorflow\core\kernels\compare_and_bitpack_op.cc(71): note: see reference to function template instantiation 'void tensorflow::functor::CompareAndBitpack<Device,T>::operator ()(tensorflow::OpKernelContext *,Eigen::TensorMap<Eigen::Tensor<const T,2,1,IndexType>,16,Eigen::MakePointer>,Eigen::TensorMap<Eigen::TensorFixedSize<const T,Eigen::Sizes<>,1,IndexType>,16,Eigen::MakePointer>,Eigen::TensorMap<Eigen::Tensor<unsigned char,2,1,IndexType>,16,Eigen::MakePointer>)' being compiled
168> with
168> [
168> Device=tensorflow::CPUDevice,
168> T=tensorflow::bfloat16,
168> IndexType=Eigen::DenseIndex
168> ]
168>c:\users\john\tensorflow\tensorflow\core\kernels\compare_and_bitpack_op.cc(70): note: see reference to class template instantiation 'tensorflow::functor::CompareAndBitpack<Device,T>' being compiled
168> with
168> [
168> Device=tensorflow::CPUDevice,
168> T=tensorflow::bfloat16
168> ]
168>c:\users\john\tensorflow\tensorflow\core\kernels\compare_and_bitpack_op.cc(42): note: while compiling class template member function 'void tensorflow::CompareAndBitpackOp<tensorflow::CPUDevice,tensorflow::bfloat16>::Compute(tensorflow::OpKernelContext *)'
168>c:\users\john\tensorflow\tensorflow\core\kernels\compare_and_bitpack_op.cc(80): note: see reference to class template instantiation 'tensorflow::CompareAndBitpackOp<tensorflow::CPUDevice,tensorflow::bfloat16>' being compiled
168>concat_op.cc
168>conditional_accumulator_base.cc
168>conditional_accumulator_base_op.cc
168>conditional_accumulator_op.cc
168>constant_op.cc
168>control_flow_ops.cc
168>conv_grad_filter_ops.cc
168>conv_grad_input_ops.cc
168>conv_grad_ops.cc
168>conv_grad_ops_3d.cc
168>conv_ops.cc
168>conv_ops_3d.cc
168>conv_ops_fused.cc
168>conv_ops_using_gemm.cc
168>count_up_to_op.cc
168>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\kernels\conv_ops_fused.cc)
168>crop_and_resize_op.cc
168>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\kernels\conv_ops_using_gemm.cc)
168>cross_op.cc
168>ctc_decoder_ops.cc
168>ctc_loss_op.cc
168>cuda_solvers.cc
168>cudnn_pooling_gpu.cc
168>cudnn_rnn_ops.cc
168>cwise_op_abs.cc
168>cwise_op_acos.cc
168>cwise_op_acosh.cc
168>cwise_op_add_1.cc
168>cwise_op_add_2.cc
168>cwise_op_arg.cc
168>cwise_op_asin.cc
168>cwise_op_asinh.cc
168>cwise_op_atan.cc
168>cwise_op_atan2.cc
168>cwise_op_atanh.cc
168>cwise_op_bessel.cc
168>cwise_op_bitwise_and.cc
168>cwise_op_bitwise_or.cc
168>cwise_op_bitwise_xor.cc
168>cwise_op_ceil.cc
168>cwise_op_clip.cc
168>cwise_op_complex.cc
168>cwise_op_conj.cc
168>cwise_op_cos.cc
168>cwise_op_cosh.cc
168>cwise_op_digamma.cc
168>cwise_op_div.cc
168>cwise_op_equal_to_1.cc
168>cwise_op_equal_to_2.cc
168>cwise_op_erf.cc
168>cwise_op_erfc.cc
168>cwise_op_exp.cc
168>cwise_op_expm1.cc
168>cwise_op_floor.cc
168>cwise_op_floor_div.cc
168>cwise_op_floor_mod.cc
168>cwise_op_greater.cc
168>cwise_op_greater_equal.cc
168>cwise_op_igammas.cc
168>cwise_op_imag.cc
168>cwise_op_invert.cc
168>cwise_op_isfinite.cc
168>cwise_op_isinf.cc
168>cwise_op_isnan.cc
168>cwise_op_left_shift.cc
168>c:\users\john\tensorflow\tensorflow\contrib\cmake\build\external\eigen_archive\eigen\src\core\products\generalblockpanelkernel.h(1902): fatal error C1002: compiler is out of heap space in pass 2
168>cwise_op_less.cc
168>cl : Command line error D8040: error creating or communicating with child process
168>Done building project "tf_core_kernels.vcxproj" -- FAILED.
172>------ Build started: Project: tf_tools_transform_graph_lib, Configuration: Release x64 ------
173>------ Build started: Project: tf_label_image_example, Configuration: Release x64 ------
174>------ Build started: Project: grpc_tensorflow_server, Configuration: Release x64 ------
175>------ Build started: Project: tf_tutorials_example_trainer, Configuration: Release x64 ------
176>------ Build started: Project: benchmark_model, Configuration: Release x64 ------
173>main.cc
174>grpc_tensorflow_server.cc
175>example_trainer.cc
176>benchmark_model.cc
172>add_default_attributes.cc
172>backports.cc
172>file_utils.cc
172>flatten_atrous.cc
172>fold_batch_norms.cc
176>benchmark_model_main.cc
172>fold_constants_lib.cc
172>fold_old_batch_norms.cc
172>freeze_requantization_ranges.cc
175>c:\users\john\tensorflow\tensorflow\cc\ops\standard_ops.h(19): fatal error C1083: Cannot open include file: 'tensorflow/cc/ops/array_ops.h': No such file or directory
175>Done building project "tf_tutorials_example_trainer.vcxproj" -- FAILED.
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\fold_constants_lib.cc)
172>fuse_convolutions.cc
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\backports.cc)
172>insert_logging.cc
172>obfuscate_names.cc
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\fold_old_batch_norms.cc)
172>quantize_nodes.cc
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\fold_batch_norms.cc)
172>quantize_weights.cc
173>c:\users\john\tensorflow\tensorflow\examples\label_image\main.cc(42): fatal error C1083: Cannot open include file: 'tensorflow/cc/ops/image_ops.h': No such file or directory
173>Done building project "tf_label_image_example.vcxproj" -- FAILED.
172>remove_attribute.cc
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\fuse_convolutions.cc)
172>remove_control_dependencies.cc
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\insert_logging.cc)
172>remove_device.cc
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\obfuscate_names.cc)
172>remove_nodes.cc
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\quantize_weights.cc)
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\quantize_nodes.cc)
172>rename_attribute.cc
172>rename_op.cc
172>round_weights.cc
172>set_device.cc
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\remove_attribute.cc)
172>sort_by_execution_order.cc
174>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_core_cpu.dir\Release\accumulate_n_optimizer.obj'
174>Done building project "grpc_tensorflow_server.vcxproj" -- FAILED.
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\rename_attribute.cc)
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\rename_op.cc)
172>sparsify_gather.cc
172>strip_unused_nodes.cc
172>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\sparsify_gather.cc)
172>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\sparsify_gather.cc)
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\remove_device.cc)
172>transform_graph.cc
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\remove_nodes.cc)
172>transform_utils.cc
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\round_weights.cc)
176>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_core_cpu.dir\Release\accumulate_n_optimizer.obj'
176>Done building project "benchmark_model.vcxproj" -- FAILED.
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\sort_by_execution_order.cc)
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\strip_unused_nodes.cc)
172>Done building project "tf_tools_transform_graph_lib.vcxproj" -- FAILED.
177>------ Build started: Project: pywrap_tensorflow_internal_static, Configuration: Release x64 ------
178>------ Build started: Project: summarize_graph, Configuration: Release x64 ------
179>------ Build started: Project: compare_graphs, Configuration: Release x64 ------
180>------ Build started: Project: transform_graph, Configuration: Release x64 ------
179>compare_graphs.cc
178>summarize_graph_main.cc
180>transform_graph_main.cc
177>Generating __force_rebuild
177>
177>Running SWIG to generate Python wrappers
177>print_model_analysis.cc
177>pywrap_tensor.cc
177>pywrap_tfe_src.cc
177>tf_session_helper.cc
177>cpp_shape_inference.cc
177>python_op_gen.cc
177>python_op_gen_internal.cc
177>bfloat16.cc
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\eager\pywrap_tfe_src.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\eager\pywrap_tfe_src.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\client\tf_session_helper.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\client\tf_session_helper.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\eager\pywrap_tensor.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\eager\pywrap_tensor.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\framework\cpp_shape_inference.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\framework\cpp_shape_inference.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\bfloat16.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\bfloat16.cc)
177>c:\users\john\tensorflow\tensorflow\python\framework\python_op_gen.cc(23): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/op_def.pb_text.h': No such file or directory
177>c:\users\john\tensorflow\tensorflow\python\framework\python_op_gen_internal.cc(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/op_def.pb_text.h': No such file or directory
177>numpy.cc
177>ndarray_tensor.cc
177>ndarray_tensor_bridge.cc
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\ndarray_tensor.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\ndarray_tensor.cc)
177>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\python\eager\pywrap_tfe_src.cc)
177>py_func.cc
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\ndarray_tensor_bridge.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\ndarray_tensor_bridge.cc)
177>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\python\client\tf_session_helper.cc)
177>py_exception_registry.cc
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\core\profiler\internal\print_model_analysis.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\core\profiler\internal\print_model_analysis.cc)
177>py_seq_tensor.cc
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_exception_registry.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_exception_registry.cc)
180>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_tools_transform_graph_lib.dir\Release\backports.obj'
179>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_tools_transform_graph_lib.dir\Release\backports.obj'
180>Done building project "transform_graph.vcxproj" -- FAILED.
177>py_util.cc
179>Done building project "compare_graphs.vcxproj" -- FAILED.
177>safe_ptr.cc
177>py_record_reader.cc
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\io\py_record_reader.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\io\py_record_reader.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\safe_ptr.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\safe_ptr.cc)
177>py_record_writer.cc
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\io\py_record_writer.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\io\py_record_writer.cc)
178>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_tools_transform_graph_lib.dir\Release\backports.obj'
178>Done building project "summarize_graph.vcxproj" -- FAILED.
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_func.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_func.cc)
177>kernel_registry.cc
177>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_func.cc)
177>util.cc
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_seq_tensor.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_seq_tensor.cc)
177>ops.cc
177>scope.cc
177>pywrap_tensorflow_internal.cc
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\util\util.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\util\util.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\pywrap_tensorflow_internal.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\pywrap_tensorflow_internal.cc)
177>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\cc\framework\scope.cc)
177>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\pywrap_tensorflow_internal.cc)
177>Done building project "pywrap_tensorflow_internal_static.vcxproj" -- FAILED.
181>------ Build started: Project: pywrap_tensorflow_internal, Configuration: Release x64 ------
181>Generating __force_rebuild
181>
181>Running SWIG to generate Python wrappers
181>print_model_analysis.cc
181>pywrap_tensor.cc
181>pywrap_tfe_src.cc
181>tf_session_helper.cc
181>cpp_shape_inference.cc
181>python_op_gen.cc
181>python_op_gen_internal.cc
181>bfloat16.cc
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\eager\pywrap_tfe_src.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\eager\pywrap_tfe_src.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\client\tf_session_helper.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\client\tf_session_helper.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\eager\pywrap_tensor.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\eager\pywrap_tensor.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\framework\cpp_shape_inference.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\framework\cpp_shape_inference.cc)
181>c:\users\john\tensorflow\tensorflow\python\framework\python_op_gen.cc(23): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/op_def.pb_text.h': No such file or directory
181>numpy.cc
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\bfloat16.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\bfloat16.cc)
181>c:\users\john\tensorflow\tensorflow\python\framework\python_op_gen_internal.cc(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/op_def.pb_text.h': No such file or directory
181>ndarray_tensor.cc
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\core\profiler\internal\print_model_analysis.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\core\profiler\internal\print_model_analysis.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\ndarray_tensor.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\ndarray_tensor.cc)
181>ndarray_tensor_bridge.cc
181>py_func.cc
181>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\python\client\tf_session_helper.cc)
181>py_exception_registry.cc
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\ndarray_tensor_bridge.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\ndarray_tensor_bridge.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_exception_registry.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_exception_registry.cc)
181>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\python\eager\pywrap_tfe_src.cc)
181>py_seq_tensor.cc
181>py_util.cc
181>safe_ptr.cc
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\safe_ptr.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\safe_ptr.cc)
181>py_record_reader.cc
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\io\py_record_reader.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\io\py_record_reader.cc)
181>py_record_writer.cc
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\io\py_record_writer.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\io\py_record_writer.cc)
181>kernel_registry.cc
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_func.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_func.cc)
181>util.cc
181>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_func.cc)
181>ops.cc
181>scope.cc
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_seq_tensor.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_seq_tensor.cc)
181>pywrap_tensorflow_internal.cc
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\util\util.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\util\util.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\pywrap_tensorflow_internal.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\pywrap_tensorflow_internal.cc)
181>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\cc\framework\scope.cc)
181>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\pywrap_tensorflow_internal.cc)
181>Done building project "pywrap_tensorflow_internal.vcxproj" -- FAILED.
182>------ Build started: Project: _nearest_neighbor_ops, Configuration: Release x64 ------
183>------ Build started: Project: _gru_ops, Configuration: Release x64 ------
184>------ Build started: Project: _beam_search_ops, Configuration: Release x64 ------
185>------ Build started: Project: _lstm_ops, Configuration: Release x64 ------
186>------ Build started: Project: _periodic_resample_op, Configuration: Release x64 ------
182>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
184>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
183>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
183>blas_gemm.cc
185>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
186>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
184>beam_search_ops.cc
182>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
182>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
182>hyperplane_lsh_probes.cc
182>nearest_neighbor_ops.cc
186>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
186>periodic_resample_op.cc
186>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
185>blas_gemm.cc
186>array_ops.cc
183>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
183>gru_ops.cc
185>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
185>lstm_ops.cc
184>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
184>beam_search_ops.cc
186>LINK : fatal error LNK1181: cannot open input file 'Release\pywrap_tensorflow_internal.lib'
186>Done building project "_periodic_resample_op.vcxproj" -- FAILED.
182>LINK : fatal error LNK1181: cannot open input file 'Release\pywrap_tensorflow_internal.lib'
182>Done building project "_nearest_neighbor_ops.vcxproj" -- FAILED.
184>LINK : fatal error LNK1181: cannot open input file 'Release\pywrap_tensorflow_internal.lib'
184>Done building project "_beam_search_ops.vcxproj" -- FAILED.
183>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
183>gru_ops.cc
185>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
185>lstm_ops.cc
183>LINK : fatal error LNK1181: cannot open input file 'Release\pywrap_tensorflow_internal.lib'
183>Done building project "_gru_ops.vcxproj" -- FAILED.
185>LINK : fatal error LNK1181: cannot open input file 'Release\pywrap_tensorflow_internal.lib'
185>Done building project "_lstm_ops.vcxproj" -- FAILED.
187>------ Build started: Project: tf_python_api, Configuration: Release x64 ------
188>------ Skipped Build: Project: INSTALL, Configuration: Release x64 ------
188>Project not selected to build for this solution configuration
187>Generating __init__.py files for Python API.
187>The parameter is incorrect
187>C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\VC\VCTargets\Microsoft.CppCommon.targets(171,5): error MSB6006: "cmd.exe" exited with code 1.
187>Done building project "tf_python_api.vcxproj" -- FAILED.
189>------ Skipped Build: Project: tf_python_build_pip_package, Configuration: Release x64 ------
189>Project not selected to build for this solution configuration
========== Build: 41 succeeded, 146 failed, 84 up-to-date, 2 skipped ==========
Please let me know if any other details are required.
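With 146 failed projects, the first real error is easy to lose in a log this long. A minimal triage sketch (not part of the original issue; the sample lines are excerpted from the log above) groups MSVC/linker error codes in first-seen order, which surfaces the root failure (the `LNK2019` unresolved externals in `proto_text`, whose absence then cascades into the `C1083` missing `.pb_text.h` headers and the `LNK1181` missing-input errors):

```python
import re

# Sample lines excerpted from the MSBuild log in this issue.
SAMPLE_LOG = """\
42>path.obj : error LNK2019: unresolved external symbol ThrowStdOutOfRange
42>proto_text.exe : fatal error LNK1120: 4 unresolved externals
43>fatal error C1083: Cannot open include file: 'op_def.pb_text.h'
48>LINK : fatal error LNK1181: cannot open input file 'cc_op_gen.obj'
"""

# Matches MSVC-style codes such as C1083, LNK2019, MSB6006.
ERROR_RE = re.compile(r"\berror\s+([A-Z]+\d+)")

def triage(log_text):
    """Return (error_code, count) pairs in first-seen order."""
    counts, order = {}, []
    for line in log_text.splitlines():
        m = ERROR_RE.search(line)
        if m:
            code = m.group(1)
            if code not in counts:
                order.append(code)
            counts[code] = counts.get(code, 0) + 1
    return [(code, counts[code]) for code in order]

if __name__ == "__main__":
    for code, n in triage(SAMPLE_LOG):
        print(f"{code}: {n} occurrence(s)")
```

Run against the full log, the earliest code reported is the one worth fixing first; everything after it here is a knock-on failure.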
**Issue in building Tensorflow library on Windows machine**
**System information**
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Windows 10 (64-bit)
- Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device: No
- TensorFlow installed from (source or binary): Source
- TensorFlow version: 1.11
- Python version: 3.6
- Installed using virtualenv? pip? conda?: Git
- Bazel version (if compiling from source):No
- GCC/Compiler version (if compiling from source): CMake 3.10.1
- CUDA/cuDNN version: No
- GPU model and memory: No
**Describe the problem**
I am trying to build TensorFlow on Windows. After checking out the 1.11 branch from Git, I tried to build it following one of the guides available online. The CMake step completed successfully and the Visual Studio solution file was generated.
When I tried to build the solution in Visual Studio 2017, it started failing with "file not found" errors.
Link followed to build the library:
https://joe-antognini.github.io/machine-learning/build-windows-tf
**Provide the exact sequence of commands / steps that you executed before running into the problem**
1. Created all the required folders as described in the link.
2. Ran the CMake command to generate the solution:
cmake .. -A x64 -DCMAKE_BUILD_TYPE=Release -DSWIG_EXECUTABLE=C:\swigwin-3.0.12\swig.exe -DPYTHON_EXECUTABLE=C:\Python365\python.exe -DPYTHON_LIBRARIES=C:\Python365\libs\python36.lib
3. Opened the solution in Visual Studio 2017 and built it in Release mode.
**Any other info / logs**
Include any logs or source code that would be helpful to diagnose the problem. If including tracebacks, please include the full traceback. Large logs and files should be attached.
********************************************
Visual Studio 2017 Logs:
********************************************
1>------ Build started: Project: zlib, Configuration: Release x64 ------
2>------ Build started: Project: farmhash, Configuration: Release x64 ------
3>------ Build started: Project: gif, Configuration: Release x64 ------
4>------ Build started: Project: sqlite, Configuration: Release x64 ------
5>------ Build started: Project: highwayhash, Configuration: Release x64 ------
6>------ Build started: Project: jpeg, Configuration: Release x64 ------
7>------ Build started: Project: lmdb, Configuration: Release x64 ------
8>------ Build started: Project: nsync, Configuration: Release x64 ------
1>Performing update step for 'zlib'
9>------ Build started: Project: farmhash_create_destination_dir, Configuration: Release x64 ------
10>------ Build started: Project: gif_create_destination_dir, Configuration: Release x64 ------
11>------ Build started: Project: sqlite_create_destination_dir, Configuration: Release x64 ------
5>Performing update step for 'highwayhash'
12>------ Build started: Project: png, Configuration: Release x64 ------
13>------ Build started: Project: protobuf, Configuration: Release x64 ------
14>------ Build started: Project: zlib_create_destination_dir, Configuration: Release x64 ------
15>------ Build started: Project: eigen, Configuration: Release x64 ------
16>------ Build started: Project: jpeg_create_destination_dir, Configuration: Release x64 ------
17>------ Build started: Project: highwayhash_create_destination_dir, Configuration: Release x64 ------
18>------ Build started: Project: png_create_destination_dir, Configuration: Release x64 ------
19>------ Build started: Project: lmdb_create_destination_dir, Configuration: Release x64 ------
13>Performing update step for 'protobuf'
20>------ Build started: Project: re2, Configuration: Release x64 ------
21>------ Build started: Project: double_conversion, Configuration: Release x64 ------
8>Performing update step for 'nsync'
22>------ Build started: Project: snappy, Configuration: Release x64 ------
23>------ Build started: Project: jpeg_copy_headers_to_destination, Configuration: Release x64 ------
24>------ Build started: Project: cub, Configuration: Release x64 ------
20>Performing update step for 're2'
21>Performing update step for 'double_conversion'
25>------ Build started: Project: nsync_create_destination_dir, Configuration: Release x64 ------
22>Performing update step for 'snappy'
26>------ Build started: Project: highwayhash_copy_headers_to_destination, Configuration: Release x64 ------
27>------ Build started: Project: grpc, Configuration: Release x64 ------
28>------ Build started: Project: jsoncpp, Configuration: Release x64 ------
29>------ Build started: Project: nsync_copy_headers_to_destination, Configuration: Release x64 ------
30>------ Build started: Project: lmdb_copy_headers_to_destination, Configuration: Release x64 ------
31>------ Build started: Project: png_copy_headers_to_destination, Configuration: Release x64 ------
32>------ Build started: Project: gif_copy_headers_to_destination, Configuration: Release x64 ------
27>Performing update step for 'grpc'
28>Performing update step for 'jsoncpp'
33>------ Build started: Project: sqlite_copy_headers_to_destination, Configuration: Release x64 ------
34>------ Build started: Project: gemmlowp, Configuration: Release x64 ------
35>------ Build started: Project: tf_protos_cc, Configuration: Release x64 ------
36>------ Build started: Project: fft2d, Configuration: Release x64 ------
37>------ Build started: Project: farmhash_copy_headers_to_destination, Configuration: Release x64 ------
38>------ Build started: Project: zlib_copy_headers_to_destination, Configuration: Release x64 ------
39>------ Build started: Project: create_cc_ops_header_dir, Configuration: Release x64 ------
40>------ Build started: Project: force_rebuild_target, Configuration: Release x64 ------
41>------ Build started: Project: tf_python_copy_scripts_to_destination, Configuration: Release x64 ------
40>Generating __force_rebuild
40>
40>Generating C:/Users/john/tensorflow/tensorflow/core/util/version_info.cc
35>debug_service.pb.cc
35>debugger_event_metadata.pb.cc
35>example.pb.cc
35>example_parser_configuration.pb.cc
35>feature.pb.cc
35>allocation_description.pb.cc
35>api_def.pb.cc
35>attr_value.pb.cc
40>fatal: Not a git repository: 'C:/Users/john/tensorflow/.git'
35>cost_graph.pb.cc
35>device_attributes.pb.cc
35>function.pb.cc
35>graph.pb.cc
35>graph_transfer_info.pb.cc
35>iterator.pb.cc
35>kernel_def.pb.cc
35>log_memory.pb.cc
35>node_def.pb.cc
35>op_def.pb.cc
35>reader_base.pb.cc
35>remote_fused_graph_execute_info.pb.cc
35>resource_handle.pb.cc
35>step_stats.pb.cc
35>summary.pb.cc
35>tensor.pb.cc
35>tensor_description.pb.cc
35>tensor_shape.pb.cc
35>tensor_slice.pb.cc
35>types.pb.cc
35>variable.pb.cc
35>versions.pb.cc
35>op_performance_data.pb.cc
35>boosted_trees.pb.cc
35>error_codes.pb.cc
35>profile.pb.cc
35>tfprof_log.pb.cc
35>tfprof_options.pb.cc
35>tfprof_output.pb.cc
35>checkpointable_object_graph.pb.cc
35>cluster.pb.cc
35>config.pb.cc
35>control_flow.pb.cc
35>critical_section.pb.cc
35>debug.pb.cc
35>device_properties.pb.cc
35>eager_service.pb.cc
35>master.pb.cc
35>master_service.pb.cc
35>meta_graph.pb.cc
35>named_tensor.pb.cc
35>queue_runner.pb.cc
35>rewriter_config.pb.cc
35>saved_model.pb.cc
35>saver.pb.cc
35>tensor_bundle.pb.cc
35>tensorflow_server.pb.cc
35>transport_options.pb.cc
35>worker.pb.cc
35>worker_service.pb.cc
35>event.pb.cc
35>example_proto_fast_parsing_test.pb.cc
35>memmapped_file_system.pb.cc
35>saved_tensor_slice.pb.cc
35>test_log.pb.cc
35>xla_service.pb.cc
35>backend_configs.pb.cc
35>hlo.pb.cc
35>hlo_profile_printer_data.pb.cc
35>xla.pb.cc
35>xla_data.pb.cc
35>learner.pb.cc
35>quantiles.pb.cc
35>split_info.pb.cc
35>tree_config.pb.cc
35>compilation_result.pb.cc
35>optimization_parameters.pb.cc
35>topology.pb.cc
35>tpu_embedding_config.pb.cc
35>tf_protos_cc.vcxproj -> C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\Release\tf_protos_cc.lib
42>------ Build started: Project: proto_text, Configuration: Release x64 ------
42>gen_proto_text_functions.cc
42>gen_proto_text_functions_lib.cc
42>path.obj : error LNK2019: unresolved external symbol "void __cdecl absl::base_internal::ThrowStdOutOfRange(char const *)" (?ThrowStdOutOfRange@base_internal@absl@@YAXPEBD@Z) referenced in function "class std::basic_string<char,struct std::char_traits<char>,class std::allocator<char> > __cdecl tensorflow::io::internal::JoinPathImpl(class std::initializer_list<class absl::string_view>)" (?JoinPathImpl@internal@io@tensorflow@@YA?AV?$basic_string@DU?$char_traits@D@std@@V?$allocator@D@2@@std@@V?$initializer_list@Vstring_view@absl@@@5@@Z)
42>path.obj : error LNK2019: unresolved external symbol "public: unsigned __int64 __cdecl absl::string_view::rfind(char,unsigned __int64)const " (?rfind@string_view@absl@@QEBA_KD_K@Z) referenced in function "class absl::string_view __cdecl tensorflow::io::Extension(class absl::string_view)" (?Extension@io@tensorflow@@YA?AVstring_view@absl@@V34@@Z)
42>collection_registry.obj : error LNK2019: unresolved external symbol "class std::basic_ostream<char,struct std::char_traits<char> > & __cdecl absl::operator<<(class std::basic_ostream<char,struct std::char_traits<char> > &,class absl::string_view)" (??6absl@@YAAEAV?$basic_ostream@DU?$char_traits@D@std@@@std@@AEAV12@Vstring_view@0@@Z) referenced in function "public: class std::unique_ptr<class tensorflow::monitoring::CollectionRegistry::RegistrationHandle,struct std::default_delete<class tensorflow::monitoring::CollectionRegistry::RegistrationHandle> > __cdecl tensorflow::monitoring::CollectionRegistry::Register(class tensorflow::monitoring::AbstractMetricDef const *,class std::function<void __cdecl(class tensorflow::monitoring::MetricCollectorGetter)> const &)" (?Register@CollectionRegistry@monitoring@tensorflow@@QEAA?AV?$unique_ptr@VRegistrationHandle@CollectionRegistry@monitoring@tensorflow@@U?$default_delete@VRegistrationHandle@CollectionRegistry@monitoring@tensorflow@@@std@@@std@@PEBVAbstractMetricDef@23@AEBV?$function@$$A6AXVMetricCollectorGetter@monitoring@tensorflow@@@Z@5@@Z)
42>str_util.obj : error LNK2019: unresolved external symbol "public: unsigned __int64 __cdecl absl::string_view::find(char,unsigned __int64)const " (?find@string_view@absl@@QEBA_KD_K@Z) referenced in function "class std::vector<class std::basic_string<char,struct std::char_traits<char>,class std::allocator<char> >,class std::allocator<class std::basic_string<char,struct std::char_traits<char>,class std::allocator<char> > > > __cdecl tensorflow::str_util::Split<struct tensorflow::str_util::AllowEmpty>(class absl::string_view,class absl::string_view,struct tensorflow::str_util::AllowEmpty)" (??$Split@UAllowEmpty@str_util@tensorflow@@@str_util@tensorflow@@YA?AV?$vector@V?$basic_string@DU?$char_traits@D@std@@V?$allocator@D@2@@std@@V?$allocator@V?$basic_string@DU?$char_traits@D@std@@V?$allocator@D@2@@std@@@2@@std@@Vstring_view@absl@@0UAllowEmpty@01@@Z)
42>C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\Release\proto_text.exe : fatal error LNK1120: 4 unresolved externals
42>Done building project "proto_text.vcxproj" -- FAILED.
43>------ Build started: Project: tf_core_framework, Configuration: Release x64 ------
43>Generating __force_rebuild
43>
43>Running C++ protocol buffer text compiler (proto_text) on tensorflow/core/example/example.proto
43>'Release\proto_text.exe' is not recognized as an internal or external command,
43>operable program or batch file.
43>C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\VC\VCTargets\Microsoft.CppCommon.targets(171,5): error MSB6006: "cmd.exe" exited with code 9009.
43>Done building project "tf_core_framework.vcxproj" -- FAILED.
44>------ Build started: Project: tf_cc_op_gen_main, Configuration: Release x64 ------
45>------ Build started: Project: tf_core_cpu, Configuration: Release x64 ------
46>------ Build started: Project: tf_cc_framework, Configuration: Release x64 ------
47>------ Build started: Project: tf_python_op_gen_main, Configuration: Release x64 ------
44>cc_op_gen.cc
44>cc_op_gen_main.cc
46>ops.cc
46>scope.cc
47>python_op_gen.cc
47>python_op_gen_internal.cc
47>python_op_gen_main.cc
45>loader.cc
45>reader.cc
45>accumulate_n_optimizer.cc
45>allocator_retry.cc
45>base_collective_executor.cc
45>bfc_allocator.cc
45>buf_rendezvous.cc
45>build_graph_options.cc
47>c:\users\john\tensorflow\tensorflow\python\framework\python_op_gen_internal.cc(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/op_def.pb_text.h': No such file or directory
47>c:\users\john\tensorflow\tensorflow\python\framework\python_op_gen.cc(23): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/op_def.pb_text.h': No such file or directory
45>collective_executor_mgr.cc
44>c:\users\john\tensorflow\tensorflow\cc\framework\cc_op_gen.cc(28): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/types.pb_text.h': No such file or directory
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\base_collective_executor.cc)
45>collective_param_resolver_local.cc
44>Done building project "tf_cc_op_gen_main.vcxproj" -- FAILED.
48>------ Build started: Project: control_flow_ops_gen_cc, Configuration: Release x64 ------
49>------ Build started: Project: ctc_ops_gen_cc, Configuration: Release x64 ------
50>------ Build started: Project: cudnn_rnn_ops_gen_cc, Configuration: Release x64 ------
51>------ Build started: Project: data_flow_ops_gen_cc, Configuration: Release x64 ------
52>------ Build started: Project: image_ops_gen_cc, Configuration: Release x64 ------
45>collective_rma_local.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\accumulate_n_optimizer.cc)
45>collective_util.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\buf_rendezvous.cc)
45>copy_tensor.cc
46>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\cc\framework\scope.cc)
45>costmodel_manager.cc
47>Done building project "tf_python_op_gen_main.vcxproj" -- FAILED.
53>------ Build started: Project: random_ops_gen_cc, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\collective_executor_mgr.cc)
45>debugger_state_interface.cc
45>device.cc
46>Done building project "tf_cc_framework.vcxproj" -- FAILED.
54>------ Build started: Project: io_ops_gen_cc, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\collective_param_resolver_local.cc)
45>device_factory.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\collective_rma_local.cc)
45>device_mgr.cc
48>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
51>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
48>Done building project "control_flow_ops_gen_cc.vcxproj" -- FAILED.
49>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
55>------ Build started: Project: summary_ops_gen_cc, Configuration: Release x64 ------
52>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
51>Done building project "data_flow_ops_gen_cc.vcxproj" -- FAILED.
56>------ Build started: Project: sendrecv_ops_gen_cc, Configuration: Release x64 ------
52>Done building project "image_ops_gen_cc.vcxproj" -- FAILED.
49>Done building project "ctc_ops_gen_cc.vcxproj" -- FAILED.
57>------ Build started: Project: decode_proto_ops_gen_cc, Configuration: Release x64 ------
50>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
58>------ Build started: Project: remote_fused_graph_ops_gen_cc, Configuration: Release x64 ------
50>Done building project "cudnn_rnn_ops_gen_cc.vcxproj" -- FAILED.
59>------ Build started: Project: sdca_ops_gen_cc, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\collective_util.cc)
45>device_resolver_local.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\copy_tensor.cc)
45>device_set.cc
53>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
58>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
57>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
53>Done building project "random_ops_gen_cc.vcxproj" -- FAILED.
60>------ Build started: Project: linalg_ops_gen_cc, Configuration: Release x64 ------
58>Done building project "remote_fused_graph_ops_gen_cc.vcxproj" -- FAILED.
61>------ Build started: Project: string_ops_gen_cc, Configuration: Release x64 ------
57>Done building project "decode_proto_ops_gen_cc.vcxproj" -- FAILED.
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\debugger_state_interface.cc)
62>------ Build started: Project: list_ops_gen_cc, Configuration: Release x64 ------
45>attr_builder.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\device.cc)
45>context.cc
45>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\eager\attr_builder.cc)
59>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
45>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\eager\attr_builder.cc)
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\device_mgr.cc)
45>eager_executor.cc
45>eager_operation.cc
59>Done building project "sdca_ops_gen_cc.vcxproj" -- FAILED.
63>------ Build started: Project: stateless_random_ops_gen_cc, Configuration: Release x64 ------
54>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
54>Done building project "io_ops_gen_cc.vcxproj" -- FAILED.
45>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\eager\eager_operation.cc)
64>------ Build started: Project: resource_variable_ops_gen_cc, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\eager\eager_operation.cc)
55>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
62>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
45>execute.cc
60>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
55>Done building project "summary_ops_gen_cc.vcxproj" -- FAILED.
62>Done building project "list_ops_gen_cc.vcxproj" -- FAILED.
65>------ Build started: Project: encode_proto_ops_gen_cc, Configuration: Release x64 ------
66>------ Build started: Project: script_ops_gen_cc, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\device_set.cc)
45>kernel_and_device.cc
60>Done building project "linalg_ops_gen_cc.vcxproj" -- FAILED.
67>------ Build started: Project: logging_ops_gen_cc, Configuration: Release x64 ------
61>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
61>Done building project "string_ops_gen_cc.vcxproj" -- FAILED.
68>------ Build started: Project: state_ops_gen_cc, Configuration: Release x64 ------
45>tensor_handle.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\device_resolver_local.cc)
45>eval_const_tensor.cc
66>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
56>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\eager\attr_builder.cc)
66>Done building project "script_ops_gen_cc.vcxproj" -- FAILED.
69>------ Build started: Project: lookup_ops_gen_cc, Configuration: Release x64 ------
64>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
45>executor.cc
56>Done building project "sendrecv_ops_gen_cc.vcxproj" -- FAILED.
70>------ Build started: Project: array_ops_gen_cc, Configuration: Release x64 ------
64>Done building project "resource_variable_ops_gen_cc.vcxproj" -- FAILED.
71>------ Build started: Project: manip_ops_gen_cc, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\eager\context.cc)
45>executor_factory.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\eager\eager_executor.cc)
45>function.cc
67>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
63>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
67>Done building project "logging_ops_gen_cc.vcxproj" -- FAILED.
72>------ Build started: Project: rpc_ops_gen_cc, Configuration: Release x64 ------
63>Done building project "stateless_random_ops_gen_cc.vcxproj" -- FAILED.
73>------ Build started: Project: math_ops_gen_cc, Configuration: Release x64 ------
70>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\eager\eager_operation.cc)
70>Done building project "array_ops_gen_cc.vcxproj" -- FAILED.
74>------ Build started: Project: audio_ops_gen_cc, Configuration: Release x64 ------
45>gpu_id_manager.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\eager\execute.cc)
45>graph_execution_state.cc
68>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\eager\kernel_and_device.cc)
45>graph_optimizer.cc
68>Done building project "state_ops_gen_cc.vcxproj" -- FAILED.
75>------ Build started: Project: nn_ops_gen_cc, Configuration: Release x64 ------
65>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
71>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
65>Done building project "encode_proto_ops_gen_cc.vcxproj" -- FAILED.
76>------ Build started: Project: spectral_ops_gen_cc, Configuration: Release x64 ------
71>Done building project "manip_ops_gen_cc.vcxproj" -- FAILED.
77>------ Build started: Project: no_op_gen_cc, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\eager\tensor_handle.cc)
45>graph_runner.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\executor.cc)
45>hierarchical_tree_broadcaster.cc
72>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
72>Done building project "rpc_ops_gen_cc.vcxproj" -- FAILED.
78>------ Build started: Project: batch_ops_gen_cc, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\function.cc)
73>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
45>local_device.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\eval_const_tensor.cc)
45>lower_if_op.cc
73>Done building project "math_ops_gen_cc.vcxproj" -- FAILED.
79>------ Build started: Project: user_ops_gen_cc, Configuration: Release x64 ------
45>lower_while_op.cc
74>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
74>Done building project "audio_ops_gen_cc.vcxproj" -- FAILED.
80>------ Build started: Project: bitwise_ops_gen_cc, Configuration: Release x64 ------
77>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
77>Done building project "no_op_gen_cc.vcxproj" -- FAILED.
81>------ Build started: Project: parsing_ops_gen_cc, Configuration: Release x64 ------
75>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
75>Done building project "nn_ops_gen_cc.vcxproj" -- FAILED.
82>------ Build started: Project: sparse_ops_gen_cc, Configuration: Release x64 ------
69>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
69>Done building project "lookup_ops_gen_cc.vcxproj" -- FAILED.
83>------ Build started: Project: functional_ops_gen_cc, Configuration: Release x64 ------
76>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
78>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\graph_execution_state.cc)
76>Done building project "spectral_ops_gen_cc.vcxproj" -- FAILED.
45>memory_types.cc
84>------ Build started: Project: dataset_ops_gen_cc, Configuration: Release x64 ------
78>Done building project "batch_ops_gen_cc.vcxproj" -- FAILED.
85>------ Build started: Project: training_ops_gen_cc, Configuration: Release x64 ------
45>mkl_cpu_allocator.cc
45>optimization_registry.cc
79>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
79>Done building project "user_ops_gen_cc.vcxproj" -- FAILED.
86>------ Build started: Project: candidate_sampling_ops_gen_cc, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\graph_runner.cc)
45>parallel_concat_optimizer.cc
85>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
81>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
82>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
85>Done building project "training_ops_gen_cc.vcxproj" -- FAILED.
81>Done building project "parsing_ops_gen_cc.vcxproj" -- FAILED.
87>------ Build started: Project: checkpoint_ops_gen_cc, Configuration: Release x64 ------
88>------ Build started: Project: boosted_trees_ops_gen_cc, Configuration: Release x64 ------
82>Done building project "sparse_ops_gen_cc.vcxproj" -- FAILED.
80>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
89>------ Build started: Project: set_ops_gen_cc, Configuration: Release x64 ------
80>Done building project "bitwise_ops_gen_cc.vcxproj" -- FAILED.
90>------ Build started: Project: remote_fused_graph_ops_gen_python, Configuration: Release x64 ------
83>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\graph_optimizer.cc)
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\local_device.cc)
45>placer.cc
45>pool_allocator.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\hierarchical_tree_broadcaster.cc)
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\lower_if_op.cc)
83>Done building project "functional_ops_gen_cc.vcxproj" -- FAILED.
45>process_function_library_runtime.cc
45>process_state.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\lower_while_op.cc)
45>process_util.cc
91>------ Build started: Project: sdca_ops_gen_python, Configuration: Release x64 ------
84>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
84>Done building project "dataset_ops_gen_cc.vcxproj" -- FAILED.
92>------ Build started: Project: random_ops_gen_python, Configuration: Release x64 ------
87>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
88>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
86>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
87>Done building project "checkpoint_ops_gen_cc.vcxproj" -- FAILED.
93>------ Build started: Project: set_ops_gen_python, Configuration: Release x64 ------
88>Done building project "boosted_trees_ops_gen_cc.vcxproj" -- FAILED.
86>Done building project "candidate_sampling_ops_gen_cc.vcxproj" -- FAILED.
94>------ Build started: Project: parsing_ops_gen_python, Configuration: Release x64 ------
95>------ Build started: Project: sparse_ops_gen_python, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\optimization_registry.cc)
45>renamed_device.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\process_function_library_runtime.cc)
45>rendezvous_mgr.cc
90>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
92>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
90>Done building project "remote_fused_graph_ops_gen_python.vcxproj" -- FAILED.
95>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
91>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
96>------ Build started: Project: nn_ops_gen_python, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\placer.cc)
91>Done building project "sdca_ops_gen_python.vcxproj" -- FAILED.
97>------ Build started: Project: math_ops_gen_python, Configuration: Release x64 ------
45>rendezvous_util.cc
95>Done building project "sparse_ops_gen_python.vcxproj" -- FAILED.
92>Done building project "random_ops_gen_python.vcxproj" -- FAILED.
98>------ Build started: Project: manip_ops_gen_python, Configuration: Release x64 ------
99>------ Build started: Project: lookup_ops_gen_python, Configuration: Release x64 ------
94>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
94>Done building project "parsing_ops_gen_python.vcxproj" -- FAILED.
100>------ Build started: Project: logging_ops_gen_python, Configuration: Release x64 ------
89>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
89>Done building project "set_ops_gen_cc.vcxproj" -- FAILED.
101>------ Build started: Project: tf_cc_ops, Configuration: Release x64 ------
100>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
100>Done building project "logging_ops_gen_python.vcxproj" -- FAILED.
102>------ Build started: Project: state_ops_gen_python, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\parallel_concat_optimizer.cc)
45>ring_reducer.cc
45>scoped_allocator.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\renamed_device.cc)
45>scoped_allocator_mgr.cc
98>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
45>session_ref.cc
45>session_state.cc
102>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
98>Done building project "manip_ops_gen_python.vcxproj" -- FAILED.
103>------ Build started: Project: list_ops_gen_python, Configuration: Release x64 ------
102>Done building project "state_ops_gen_python.vcxproj" -- FAILED.
104>------ Build started: Project: stateless_random_ops_gen_python, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\rendezvous_mgr.cc)
45>shape_refiner.cc
93>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
93>Done building project "set_ops_gen_python.vcxproj" -- FAILED.
105>------ Build started: Project: linalg_ops_gen_python, Configuration: Release x64 ------
103>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
103>Done building project "list_ops_gen_python.vcxproj" -- FAILED.
106>------ Build started: Project: string_ops_gen_python, Configuration: Release x64 ------
99>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
97>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
99>Done building project "lookup_ops_gen_python.vcxproj" -- FAILED.
107>------ Build started: Project: io_ops_gen_python, Configuration: Release x64 ------
96>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
97>Done building project "math_ops_gen_python.vcxproj" -- FAILED.
104>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
108>------ Build started: Project: summary_ops_gen_python, Configuration: Release x64 ------
45>stats_publisher_interface.cc
96>Done building project "nn_ops_gen_python.vcxproj" -- FAILED.
109>------ Build started: Project: image_ops_gen_python, Configuration: Release x64 ------
104>Done building project "stateless_random_ops_gen_python.vcxproj" -- FAILED.
110>------ Build started: Project: functional_ops_gen_python, Configuration: Release x64 ------
105>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
105>Done building project "linalg_ops_gen_python.vcxproj" -- FAILED.
111>------ Build started: Project: encode_proto_ops_gen_python, Configuration: Release x64 ------
106>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
107>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
106>Done building project "string_ops_gen_python.vcxproj" -- FAILED.
112>------ Build started: Project: rpc_ops_gen_python, Configuration: Release x64 ------
107>Done building project "io_ops_gen_python.vcxproj" -- FAILED.
113>------ Build started: Project: debug_ops_gen_python, Configuration: Release x64 ------
109>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
109>Done building project "image_ops_gen_python.vcxproj" -- FAILED.
114>------ Build started: Project: resource_variable_ops_gen_python, Configuration: Release x64 ------
108>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
112>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
108>Done building project "summary_ops_gen_python.vcxproj" -- FAILED.
115>------ Build started: Project: dataset_ops_gen_python, Configuration: Release x64 ------
112>Done building project "rpc_ops_gen_python.vcxproj" -- FAILED.
116>------ Build started: Project: spectral_ops_gen_python, Configuration: Release x64 ------
114>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
113>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
114>Done building project "resource_variable_ops_gen_python.vcxproj" -- FAILED.
117>------ Build started: Project: data_flow_ops_gen_python, Configuration: Release x64 ------
45>step_stats_collector.cc
113>Done building project "debug_ops_gen_python.vcxproj" -- FAILED.
118>------ Build started: Project: cudnn_rnn_ops_gen_python, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\ring_reducer.cc)
45>sycl_allocator.cc
45>sycl_device.cc
45>sycl_device_context.cc
45>sycl_device_factory.cc
45>threadpool_device.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\shape_refiner.cc)
45>threadpool_device_factory.cc
45>debug.cc
115>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
115>Done building project "dataset_ops_gen_python.vcxproj" -- FAILED.
119>------ Build started: Project: script_ops_gen_python, Configuration: Release x64 ------
45>debug_callback_registry.cc
116>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
116>Done building project "spectral_ops_gen_python.vcxproj" -- FAILED.
111>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
120>------ Build started: Project: ctc_ops_gen_python, Configuration: Release x64 ------
110>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
111>Done building project "encode_proto_ops_gen_python.vcxproj" -- FAILED.
121>------ Build started: Project: control_flow_ops_gen_python, Configuration: Release x64 ------
110>Done building project "functional_ops_gen_python.vcxproj" -- FAILED.
122>------ Build started: Project: contrib_text_skip_gram_ops_gen_python, Configuration: Release x64 ------
117>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
119>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
117>Done building project "data_flow_ops_gen_python.vcxproj" -- FAILED.
119>Done building project "script_ops_gen_python.vcxproj" -- FAILED.
123>------ Build started: Project: contrib_tensor_forest_stats_ops_gen_python, Configuration: Release x64 ------
124>------ Build started: Project: contrib_tensor_forest_ops_gen_python, Configuration: Release x64 ------
45>debug_graph_utils.cc
125>------ Build started: Project: contrib_tensor_forest_model_ops_gen_python, Configuration: Release x64 ------
120>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
122>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
120>Done building project "ctc_ops_gen_python.vcxproj" -- FAILED.
126>------ Build started: Project: contrib_tensor_forest_hybrid_ops_gen_python, Configuration: Release x64 ------
122>Done building project "contrib_text_skip_gram_ops_gen_python.vcxproj" -- FAILED.
127>------ Build started: Project: contrib_seq2seq_beam_search_ops_gen_python, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\threadpool_device_factory.cc)
45>debug_io_utils.cc
123>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
45>debug_node_key.cc
45>debugger_state_impl.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\common_runtime\threadpool_device.cc)
45>server_lib.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\debug\debug.cc)
45>algorithm.cc
45>colors.cc
123>Done building project "contrib_tensor_forest_stats_ops_gen_python.vcxproj" -- FAILED.
128>------ Build started: Project: contrib_rnn_lstm_ops_gen_python, Configuration: Release x64 ------
121>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
126>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
127>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
121>Done building project "control_flow_ops_gen_python.vcxproj" -- FAILED.
129>------ Build started: Project: contrib_rnn_gru_ops_gen_python, Configuration: Release x64 ------
45>control_flow.cc
126>Done building project "contrib_tensor_forest_hybrid_ops_gen_python.vcxproj" -- FAILED.
124>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
130>------ Build started: Project: contrib_resampler_ops_gen_python, Configuration: Release x64 ------
127>Done building project "contrib_seq2seq_beam_search_ops_gen_python.vcxproj" -- FAILED.
131>------ Build started: Project: contrib_periodic_resample_ops_gen_python, Configuration: Release x64 ------
124>Done building project "contrib_tensor_forest_ops_gen_python.vcxproj" -- FAILED.
132>------ Build started: Project: contrib_nearest_neighbor_ops_gen_python, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\debug\debug_graph_utils.cc)
45>costmodel.cc
45>gradients.cc
125>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
125>Done building project "contrib_tensor_forest_model_ops_gen_python.vcxproj" -- FAILED.
133>------ Build started: Project: contrib_nccl_ops_gen_python, Configuration: Release x64 ------
131>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
131>Done building project "contrib_periodic_resample_ops_gen_python.vcxproj" -- FAILED.
134>------ Build started: Project: contrib_memory_stats_ops_gen_python, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\debug\debugger_state_impl.cc)
130>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
130>Done building project "contrib_resampler_ops_gen_python.vcxproj" -- FAILED.
45>graph_constructor.cc
135>------ Build started: Project: contrib_layers_sparse_feature_cross_ops_gen_python, Configuration: Release x64 ------
128>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
128>Done building project "contrib_rnn_lstm_ops_gen_python.vcxproj" -- FAILED.
136>------ Build started: Project: contrib_input_pipeline_ops_gen_python, Configuration: Release x64 ------
101>Generating tensorflow/cc/ops/audio_ops.h, tensorflow/cc/ops/audio_ops.cc, tensorflow/cc/ops/audio_ops_internal.h, tensorflow/cc/ops/audio_ops_internal.cc
45>graph_def_builder_util.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\graph\gradients.cc)
45>graph_partition.cc
45>mkl_layout_pass.cc
101>'Release\audio_ops_gen_cc.exe' is not recognized as an internal or external command,
45>mkl_tfconversion_pass.cc
101>operable program or batch file.
133>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
45>optimizer_cse.cc
133>Done building project "contrib_nccl_ops_gen_python.vcxproj" -- FAILED.
137>------ Build started: Project: contrib_image_sirds_ops_gen_python, Configuration: Release x64 ------
45>quantize_training.cc
135>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
101>C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\VC\VCTargets\Microsoft.CppCommon.targets(171,5): error MSB6006: "cmd.exe" exited with code 9009.
101>Done building project "tf_cc_ops.vcxproj" -- FAILED.
138>------ Build started: Project: tf_cc_while_loop, Configuration: Release x64 ------
135>Done building project "contrib_layers_sparse_feature_cross_ops_gen_python.vcxproj" -- FAILED.
139>------ Build started: Project: tf_cc, Configuration: Release x64 ------
132>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
136>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
132>Done building project "contrib_nearest_neighbor_ops_gen_python.vcxproj" -- FAILED.
140>------ Build started: Project: contrib_image_ops_gen_python, Configuration: Release x64 ------
136>Done building project "contrib_input_pipeline_ops_gen_python.vcxproj" -- FAILED.
134>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
141>------ Build started: Project: contrib_image_distort_image_ops_gen_python, Configuration: Release x64 ------
134>Done building project "contrib_memory_stats_ops_gen_python.vcxproj" -- FAILED.
142>------ Build started: Project: contrib_framework_variable_ops_gen_python, Configuration: Release x64 ------
129>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
129>Done building project "contrib_rnn_gru_ops_gen_python.vcxproj" -- FAILED.
143>------ Build started: Project: contrib_gcs_config_ops_gen_python, Configuration: Release x64 ------
137>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
137>Done building project "contrib_image_sirds_ops_gen_python.vcxproj" -- FAILED.
144>------ Build started: Project: contrib_factorization_factorization_ops_gen_python, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\graph\graph_constructor.cc)
45>validate.cc
45>cluster.cc
138>while_loop.cc
45>virtual_cluster.cc
45>analytical_cost_estimator.cc
139>client_session.cc
139>array_grad.cc
139>data_flow_grad.cc
139>c:\users\john\tensorflow\tensorflow\cc\gradients\data_flow_grad.cc(16): fatal error C1083: Cannot open include file: 'tensorflow/cc/ops/data_flow_ops.h': No such file or directory
139>image_grad.cc
139>math_grad.cc
139>nn_grad.cc
139>c:\users\john\tensorflow\tensorflow\cc\gradients\nn_grad.cc(16): fatal error C1083: Cannot open include file: 'tensorflow/cc/ops/nn_ops.h': No such file or directory
139>coordinator.cc
139>c:\users\john\tensorflow\tensorflow\cc\gradients\math_grad.cc(19): fatal error C1083: Cannot open include file: 'tensorflow/cc/ops/array_ops_internal.h': No such file or directory
139>queue_runner.cc
139>grad_op_registry.cc
139>gradient_checker.cc
139>gradients.cc
139>c:\users\john\tensorflow\tensorflow\cc\gradients\array_grad.cc(18): fatal error C1083: Cannot open include file: 'tensorflow/cc/ops/array_ops_internal.h': No such file or directory
139>while_gradients.cc
142>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\graph\quantize_training.cc)
45>graph_memory.cc
142>Done building project "contrib_framework_variable_ops_gen_python.vcxproj" -- FAILED.
145>------ Build started: Project: contrib_factorization_clustering_ops_gen_python, Configuration: Release x64 ------
45>graph_properties.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\clusters\cluster.cc)
45>measuring_cost_estimator.cc
141>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
144>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
141>Done building project "contrib_image_distort_image_ops_gen_python.vcxproj" -- FAILED.
144>Done building project "contrib_factorization_factorization_ops_gen_python.vcxproj" -- FAILED.
146>------ Build started: Project: contrib_data_dataset_ops_gen_python, Configuration: Release x64 ------
147>------ Build started: Project: contrib_coder_ops_gen_python, Configuration: Release x64 ------
140>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
143>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
138>c:\users\john\tensorflow\tensorflow\cc\ops\while_loop.cc(19): fatal error C1083: Cannot open include file: 'tensorflow/cc/ops/control_flow_ops_internal.h': No such file or directory
140>Done building project "contrib_image_ops_gen_python.vcxproj" -- FAILED.
143>Done building project "contrib_gcs_config_ops_gen_python.vcxproj" -- FAILED.
148>------ Build started: Project: contrib_boosted_trees_training_ops_gen_python, Configuration: Release x64 ------
149>------ Build started: Project: contrib_boosted_trees_stats_accumulator_ops_gen_python, Configuration: Release x64 ------
139>c:\users\john\tensorflow\tensorflow\cc\gradients\image_grad.cc(19): fatal error C1083: Cannot open include file: 'tensorflow/cc/ops/image_ops_internal.h': No such file or directory
139>c:\users\john\tensorflow\tensorflow\cc\ops\standard_ops.h(19): fatal error C1083: Cannot open include file: 'tensorflow/cc/ops/array_ops.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\cc\framework\gradients.cc)
139>c:\users\john\tensorflow\tensorflow\cc\framework\while_gradients.cc(20): fatal error C1083: Cannot open include file: 'tensorflow/cc/ops/control_flow_ops_internal.h': No such file or directory
139>c:\users\john\tensorflow\tensorflow\cc\ops\standard_ops.h(19): fatal error C1083: Cannot open include file: 'tensorflow/cc/ops/array_ops.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\cc\framework\gradient_checker.cc)
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\clusters\virtual_cluster.cc)
45>op_level_cost_estimator.cc
138>Done building project "tf_cc_while_loop.vcxproj" -- FAILED.
150>------ Build started: Project: tf_c, Configuration: Release x64 ------
45>robust_stats.cc
45>virtual_placer.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\costs\analytical_cost_estimator.cc)
45>virtual_scheduler.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\costs\graph_memory.cc)
45>devices.cc
45>gen_node.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\costs\graph_properties.cc)
45>graph_analyzer.cc
45>c:\users\john\tensorflow\tensorflow\core\grappler\graph_analyzer\graph_analyzer.cc(20): fatal error C1083: Cannot open include file: 'absl/strings/str_format.h': No such file or directory
45>graph_analyzer_tool.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\costs\measuring_cost_estimator.cc)
45>sig_node.cc
45>c:\users\john\tensorflow\tensorflow\core\grappler\graph_analyzer\gen_node.cc(18): fatal error C1083: Cannot open include file: 'absl/strings/str_format.h': No such file or directory
45>graph_view.cc
150>c_api.cc
147>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
150>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C
150>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams'
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\costs\virtual_placer.cc)
147>Done building project "contrib_coder_ops_gen_python.vcxproj" -- FAILED.
45>grappler_item.cc
151>------ Build started: Project: contrib_boosted_trees_split_handler_ops_gen_python, Configuration: Release x64 ------
45>grappler_item_builder.cc
139>Done building project "tf_cc.vcxproj" -- FAILED.
152>------ Build started: Project: decode_proto_ops_gen_python, Configuration: Release x64 ------
151>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
151>Done building project "contrib_boosted_trees_split_handler_ops_gen_python.vcxproj" -- FAILED.
146>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
153>------ Build started: Project: contrib_boosted_trees_prediction_ops_gen_python, Configuration: Release x64 ------
146>Done building project "contrib_data_dataset_ops_gen_python.vcxproj" -- FAILED.
154>------ Build started: Project: contrib_boosted_trees_model_ops_gen_python, Configuration: Release x64 ------
149>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
149>Done building project "contrib_boosted_trees_stats_accumulator_ops_gen_python.vcxproj" -- FAILED.
155>------ Build started: Project: contrib_bigquery_reader_ops_gen_python, Configuration: Release x64 ------
148>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
152>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
148>Done building project "contrib_boosted_trees_training_ops_gen_python.vcxproj" -- FAILED.
156>------ Build started: Project: checkpoint_ops_gen_python, Configuration: Release x64 ------
152>Done building project "decode_proto_ops_gen_python.vcxproj" -- FAILED.
157>------ Build started: Project: array_ops_gen_python, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\costs\virtual_scheduler.cc)
145>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
154>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
150>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory
145>Done building project "contrib_factorization_clustering_ops_gen_python.vcxproj" -- FAILED.
45>file_input_yielder.cc
45>c:\users\john\tensorflow\tensorflow\core\grappler\graph_analyzer\sig_node.cc(20): fatal error C1083: Cannot open include file: 'absl/strings/str_format.h': No such file or directory
45>mutable_graph_view.cc
45>op_types.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\grappler_item_builder.cc)
45>arithmetic_optimizer.cc
158>------ Build started: Project: candidate_sampling_ops_gen_python, Configuration: Release x64 ------
154>Done building project "contrib_boosted_trees_model_ops_gen_python.vcxproj" -- FAILED.
45>auto_parallel.cc
159>------ Build started: Project: boosted_trees_ops_gen_python, Configuration: Release x64 ------
150>Done building project "tf_c.vcxproj" -- FAILED.
153>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
160>------ Build started: Project: tf_c_python_api, Configuration: Release x64 ------
153>Done building project "contrib_boosted_trees_prediction_ops_gen_python.vcxproj" -- FAILED.
161>------ Build started: Project: bitwise_ops_gen_python, Configuration: Release x64 ------
45>custom_graph_optimizer_registry.cc
158>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
158>Done building project "candidate_sampling_ops_gen_python.vcxproj" -- FAILED.
162>------ Build started: Project: training_ops_gen_python, Configuration: Release x64 ------
155>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
159>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
155>Done building project "contrib_bigquery_reader_ops_gen_python.vcxproj" -- FAILED.
159>Done building project "boosted_trees_ops_gen_python.vcxproj" -- FAILED.
163>------ Build started: Project: user_ops_gen_python, Configuration: Release x64 ------
164>------ Build started: Project: batch_ops_gen_python, Configuration: Release x64 ------
156>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
156>Done building project "checkpoint_ops_gen_python.vcxproj" -- FAILED.
165>------ Build started: Project: audio_ops_gen_python, Configuration: Release x64 ------
157>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
161>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
157>Done building project "array_ops_gen_python.vcxproj" -- FAILED.
166>------ Build started: Project: contrib_boosted_trees_quantiles_ops_gen_python, Configuration: Release x64 ------
161>Done building project "bitwise_ops_gen_python.vcxproj" -- FAILED.
163>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
163>Done building project "user_ops_gen_python.vcxproj" -- FAILED.
45>filter_fusion.cc
45>fusion_utils.cc
45>graph_utils.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\arithmetic_optimizer.cc)
45>latency_all_edges.cc
162>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
45>map_and_batch_fusion.cc
162>Done building project "training_ops_gen_python.vcxproj" -- FAILED.
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\auto_parallel.cc)
45>map_and_filter_fusion.cc
45>map_fusion.cc
45>map_vectorization.cc
164>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
165>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
164>Done building project "batch_ops_gen_python.vcxproj" -- FAILED.
165>Done building project "audio_ops_gen_python.vcxproj" -- FAILED.
166>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
166>Done building project "contrib_boosted_trees_quantiles_ops_gen_python.vcxproj" -- FAILED.
167>------ Build started: Project: tf_python_ops, Configuration: Release x64 ------
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\data\filter_fusion.cc)
45>noop_elimination.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\data\latency_all_edges.cc)
45>shuffle_and_repeat_fusion.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\data\map_and_batch_fusion.cc)
45>debug_stripper.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\data\map_fusion.cc)
45>dependency_optimizer.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\data\map_and_filter_fusion.cc)
45>evaluation_utils.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\data\map_vectorization.cc)
45>function_optimizer.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\data\noop_elimination.cc)
45>gpu_swapping_kernels.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\data\shuffle_and_repeat_fusion.cc)
45>gpu_swapping_ops.cc
45>graph_optimizer_stage.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\debug_stripper.cc)
45>graph_rewriter.cc
45>layout_optimizer.cc
167>Generating tf_python/tensorflow/python/ops/gen_audio_ops.py
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\dependency_optimizer.cc)
45>loop_optimizer.cc
167>'Release\audio_ops_gen_python.exe' is not recognized as an internal or external command,
167>operable program or batch file.
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\function_optimizer.cc)
45>memory_optimizer.cc
167>C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\VC\VCTargets\Microsoft.CppCommon.targets(171,5): error MSB6006: "cmd.exe" exited with code 9009.
167>Done building project "tf_python_ops.vcxproj" -- FAILED.
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\gpu_swapping_kernels.cc)
45>meta_optimizer.cc
45>model_pruner.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\graph_optimizer_stage.cc)
45>remapper.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\loop_optimizer.cc)
45>scoped_allocator_optimizer.cc
45>shape_optimizer.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\layout_optimizer.cc)
45>static_schedule.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\memory_optimizer.cc)
45>symbolic_shapes.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\meta_optimizer.cc)
45>colocation.cc
45>frame.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\shape_optimizer.cc)
45>functions.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\remapper.cc)
45>scc.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\scoped_allocator_optimizer.cc)
45>topological_sort.cc
45>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\optimizers\static_schedule.cc)
45>traversal.cc
45>Done building project "tf_core_cpu.vcxproj" -- FAILED.
168>------ Build started: Project: tf_core_kernels, Configuration: Release x64 ------
169>------ Build started: Project: tf_grappler, Configuration: Release x64 ------
170>------ Build started: Project: tf_core_distributed_runtime, Configuration: Release x64 ------
171>------ Build started: Project: tf_core_direct_session, Configuration: Release x64 ------
171>direct_session.cc
169>single_machine.cc
169>cost_analyzer.cc
169>model_analyzer.cc
170>base_rendezvous_mgr.cc
170>call_options.cc
170>cluster_function_library_runtime.cc
170>collective_param_resolver_distributed.cc
170>collective_rma_distributed.cc
170>device_resolver_distributed.cc
170>eager_service_impl.cc
170>graph_mgr.cc
170>local_master.cc
168>adjust_contrast_op.cc
168>adjust_hue_op.cc
168>adjust_saturation_op.cc
168>aggregate_ops.cc
168>argmax_op.cc
168>as_string_op.cc
168>attention_ops.cc
168>avgpooling_op.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\collective_rma_distributed.cc)
170>master.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\base_rendezvous_mgr.cc)
170>master_session.cc
169>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\python\grappler\cost_analyzer.cc)
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\eager\eager_service_impl.cc)
170>message_wrappers.cc
170>c:\users\john\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.h(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/tensor.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\cluster_function_library_runtime.cc)
170>partial_run_mgr.cc
169>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\grappler\clusters\single_machine.cc)
169>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\python\grappler\model_analyzer.cc)
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\device_resolver_distributed.cc)
170>recent_request_ids.cc
171>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\graph_mgr.cc)
170>remote_device.cc
170>c:\users\john\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.h(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/tensor.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\collective_param_resolver_distributed.cc)
170>request_id.cc
169>Done building project "tf_grappler.vcxproj" -- FAILED.
171>Done building project "tf_core_direct_session.vcxproj" -- FAILED.
170>c:\users\john\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.h(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/tensor.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\local_master.cc)
170>grpc_eager_client.cc
170>grpc_eager_service.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\master_session.cc)
170>grpc_eager_service_impl.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\master.cc)
170>grpc_channel.cc
170>c:\users\john\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.h(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/tensor.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.cc)
170>grpc_master_service.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\remote_device.cc)
170>grpc_master_service_impl.cc
168>barrier_ops.cc
168>base64_ops.cc
170>c:\users\john\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.h(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/tensor.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\partial_run_mgr.cc)
170>grpc_remote_master.cc
168>batch_kernels.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\rpc\eager\grpc_eager_service_impl.cc)
170>grpc_remote_worker.cc
170>grpc_rpc_factory.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\rpc\grpc_master_service.cc)
170>grpc_rpc_factory_registration.cc
170>grpc_server_lib.cc
170>c:\users\john\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.h(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/tensor.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\rpc\grpc_remote_master.cc)
168>batch_matmul_op_complex.cc
170>grpc_session.cc
168>batch_matmul_op_real.cc
170>grpc_tensor_coding.cc
170>grpc_util.cc
170>c:\users\john\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.h(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/tensor.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\rpc\grpc_remote_worker.cc)
170>grpc_worker_cache.cc
170>c:\users\john\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.h(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/tensor.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\rpc\grpc_session.cc)
170>grpc_worker_service.cc
170>c:\users\john\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.h(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/tensor.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\rpc\grpc_server_lib.cc)
170>grpc_worker_service_impl.cc
168>batch_norm_op.cc
170>rpc_rendezvous_mgr.cc
170>rpc_collective_executor_mgr.cc
170>c:\users\john\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.h(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/tensor.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\rpc\grpc_worker_cache.cc)
170>scheduler.cc
170>session_mgr.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\rpc\rpc_rendezvous_mgr.cc)
170>tensor_coding.cc
170>worker.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\rpc\grpc_worker_service.cc)
170>worker_cache_logger.cc
170>worker_cache_partial.cc
168>fake_clock_env.cc
168>periodic_function.cc
168>batchtospace_op.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\session_mgr.cc)
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\scheduler.cc)
170>worker_session.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\rpc_collective_executor_mgr.cc)
168>bcast_ops.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\tensor_coding.cc)
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\worker.cc)
170>c:\users\john\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.h(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/tensor.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\worker_cache_partial.cc)
168>betainc_op.cc
168>bias_op.cc
170>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\distributed_runtime\worker_session.cc)
168>c:\users\john\tensorflow\tensorflow\core\kernels\batchtospace_op.cc(198): warning C4002: too many actual parameters for macro 'TF_BATCHTOSPACE_BLOCK_DIMS_CASE'
170>Done building project "tf_core_distributed_runtime.vcxproj" -- FAILED.
168>bincount_op.cc
168>bitcast_op.cc
168>resource_ops.cc
168>resources.cc
168>stats_ops.cc
168>broadcast_to_op.cc
168>bucketize_op.cc
168>candidate_sampler_ops.cc
168>cast_op.cc
168>cast_op_impl_bfloat.cc
168>cast_op_impl_bool.cc
168>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\kernels\cast_op.cc)
168>cast_op_impl_complex128.cc
168>cast_op_impl_complex64.cc
168>cast_op_impl_double.cc
168>cast_op_impl_float.cc
168>cast_op_impl_half.cc
168>cast_op_impl_int16.cc
168>cast_op_impl_int32.cc
168>cast_op_impl_int64.cc
168>cast_op_impl_int8.cc
168>cast_op_impl_uint16.cc
168>cast_op_impl_uint32.cc
168>cast_op_impl_uint64.cc
168>cast_op_impl_uint8.cc
168>check_numerics_op.cc
168>cholesky_grad.cc
168>cholesky_op.cc
168>collective_ops.cc
168>colorspace_op.cc
168>compare_and_bitpack_op.cc
168>concat_lib_cpu.cc
168>concat_lib_gpu.cc
168>c:\users\john\tensorflow\tensorflow\core\kernels\compare_and_bitpack_op.cc(99): warning C4805: '|': unsafe mix of type 'int' and type 'bool' in operation
168>c:\users\john\tensorflow\tensorflow\core\kernels\compare_and_bitpack_op.cc(92): note: while compiling class template member function 'void tensorflow::functor::ComputeShard<T,void,void>::Compute(Eigen::TensorMap<Eigen::Tensor<const T,2,1,IndexType>,16,Eigen::MakePointer>,Eigen::TensorMap<Eigen::Tensor<unsigned char,2,1,IndexType>,16,Eigen::MakePointer>,const T &,tensorflow::int64,tensorflow::int64)'
168> with
168> [
168> T=tensorflow::bfloat16,
168> IndexType=Eigen::DenseIndex
168> ]
168>c:\users\john\tensorflow\tensorflow\core\kernels\compare_and_bitpack_op.cc(149): note: see reference to function template instantiation 'void tensorflow::functor::ComputeShard<T,void,void>::Compute(Eigen::TensorMap<Eigen::Tensor<const T,2,1,IndexType>,16,Eigen::MakePointer>,Eigen::TensorMap<Eigen::Tensor<unsigned char,2,1,IndexType>,16,Eigen::MakePointer>,const T &,tensorflow::int64,tensorflow::int64)' being compiled
168> with
168> [
168> T=tensorflow::bfloat16,
168> IndexType=Eigen::DenseIndex
168> ]
168>c:\users\john\tensorflow\tensorflow\core\kernels\compare_and_bitpack_op.cc(149): note: see reference to class template instantiation 'tensorflow::functor::ComputeShard<T,void,void>' being compiled
168> with
168> [
168> T=tensorflow::bfloat16
168> ]
168>c:\users\john\tensorflow\tensorflow\core\kernels\compare_and_bitpack_op.cc(146): note: while compiling class template member function 'void tensorflow::functor::CompareAndBitpack<Device,T>::operator ()(tensorflow::OpKernelContext *,Eigen::TensorMap<Eigen::Tensor<const T,2,1,IndexType>,16,Eigen::MakePointer>,Eigen::TensorMap<Eigen::TensorFixedSize<const T,Eigen::Sizes<>,1,IndexType>,16,Eigen::MakePointer>,Eigen::TensorMap<Eigen::Tensor<unsigned char,2,1,IndexType>,16,Eigen::MakePointer>)'
168> with
168> [
168> Device=tensorflow::CPUDevice,
168> T=tensorflow::bfloat16,
168> IndexType=Eigen::DenseIndex
168> ]
168>c:\users\john\tensorflow\tensorflow\core\kernels\compare_and_bitpack_op.cc(71): note: see reference to function template instantiation 'void tensorflow::functor::CompareAndBitpack<Device,T>::operator ()(tensorflow::OpKernelContext *,Eigen::TensorMap<Eigen::Tensor<const T,2,1,IndexType>,16,Eigen::MakePointer>,Eigen::TensorMap<Eigen::TensorFixedSize<const T,Eigen::Sizes<>,1,IndexType>,16,Eigen::MakePointer>,Eigen::TensorMap<Eigen::Tensor<unsigned char,2,1,IndexType>,16,Eigen::MakePointer>)' being compiled
168> with
168> [
168> Device=tensorflow::CPUDevice,
168> T=tensorflow::bfloat16,
168> IndexType=Eigen::DenseIndex
168> ]
168>c:\users\john\tensorflow\tensorflow\core\kernels\compare_and_bitpack_op.cc(70): note: see reference to class template instantiation 'tensorflow::functor::CompareAndBitpack<Device,T>' being compiled
168> with
168> [
168> Device=tensorflow::CPUDevice,
168> T=tensorflow::bfloat16
168> ]
168>c:\users\john\tensorflow\tensorflow\core\kernels\compare_and_bitpack_op.cc(42): note: while compiling class template member function 'void tensorflow::CompareAndBitpackOp<tensorflow::CPUDevice,tensorflow::bfloat16>::Compute(tensorflow::OpKernelContext *)'
168>c:\users\john\tensorflow\tensorflow\core\kernels\compare_and_bitpack_op.cc(80): note: see reference to class template instantiation 'tensorflow::CompareAndBitpackOp<tensorflow::CPUDevice,tensorflow::bfloat16>' being compiled
168>concat_op.cc
168>conditional_accumulator_base.cc
168>conditional_accumulator_base_op.cc
168>conditional_accumulator_op.cc
168>constant_op.cc
168>control_flow_ops.cc
168>conv_grad_filter_ops.cc
168>conv_grad_input_ops.cc
168>conv_grad_ops.cc
168>conv_grad_ops_3d.cc
168>conv_ops.cc
168>conv_ops_3d.cc
168>conv_ops_fused.cc
168>conv_ops_using_gemm.cc
168>count_up_to_op.cc
168>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\kernels\conv_ops_fused.cc)
168>crop_and_resize_op.cc
168>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\core\kernels\conv_ops_using_gemm.cc)
168>cross_op.cc
168>ctc_decoder_ops.cc
168>ctc_loss_op.cc
168>cuda_solvers.cc
168>cudnn_pooling_gpu.cc
168>cudnn_rnn_ops.cc
168>cwise_op_abs.cc
168>cwise_op_acos.cc
168>cwise_op_acosh.cc
168>cwise_op_add_1.cc
168>cwise_op_add_2.cc
168>cwise_op_arg.cc
168>cwise_op_asin.cc
168>cwise_op_asinh.cc
168>cwise_op_atan.cc
168>cwise_op_atan2.cc
168>cwise_op_atanh.cc
168>cwise_op_bessel.cc
168>cwise_op_bitwise_and.cc
168>cwise_op_bitwise_or.cc
168>cwise_op_bitwise_xor.cc
168>cwise_op_ceil.cc
168>cwise_op_clip.cc
168>cwise_op_complex.cc
168>cwise_op_conj.cc
168>cwise_op_cos.cc
168>cwise_op_cosh.cc
168>cwise_op_digamma.cc
168>cwise_op_div.cc
168>cwise_op_equal_to_1.cc
168>cwise_op_equal_to_2.cc
168>cwise_op_erf.cc
168>cwise_op_erfc.cc
168>cwise_op_exp.cc
168>cwise_op_expm1.cc
168>cwise_op_floor.cc
168>cwise_op_floor_div.cc
168>cwise_op_floor_mod.cc
168>cwise_op_greater.cc
168>cwise_op_greater_equal.cc
168>cwise_op_igammas.cc
168>cwise_op_imag.cc
168>cwise_op_invert.cc
168>cwise_op_isfinite.cc
168>cwise_op_isinf.cc
168>cwise_op_isnan.cc
168>cwise_op_left_shift.cc
168>c:\users\john\tensorflow\tensorflow\contrib\cmake\build\external\eigen_archive\eigen\src\core\products\generalblockpanelkernel.h(1902): fatal error C1002: compiler is out of heap space in pass 2
168>cwise_op_less.cc
168>cl : Command line error D8040: error creating or communicating with child process
168>Done building project "tf_core_kernels.vcxproj" -- FAILED.
172>------ Build started: Project: tf_tools_transform_graph_lib, Configuration: Release x64 ------
173>------ Build started: Project: tf_label_image_example, Configuration: Release x64 ------
174>------ Build started: Project: grpc_tensorflow_server, Configuration: Release x64 ------
175>------ Build started: Project: tf_tutorials_example_trainer, Configuration: Release x64 ------
176>------ Build started: Project: benchmark_model, Configuration: Release x64 ------
173>main.cc
174>grpc_tensorflow_server.cc
175>example_trainer.cc
176>benchmark_model.cc
172>add_default_attributes.cc
172>backports.cc
172>file_utils.cc
172>flatten_atrous.cc
172>fold_batch_norms.cc
176>benchmark_model_main.cc
172>fold_constants_lib.cc
172>fold_old_batch_norms.cc
172>freeze_requantization_ranges.cc
175>c:\users\john\tensorflow\tensorflow\cc\ops\standard_ops.h(19): fatal error C1083: Cannot open include file: 'tensorflow/cc/ops/array_ops.h': No such file or directory
175>Done building project "tf_tutorials_example_trainer.vcxproj" -- FAILED.
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\fold_constants_lib.cc)
172>fuse_convolutions.cc
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\backports.cc)
172>insert_logging.cc
172>obfuscate_names.cc
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\fold_old_batch_norms.cc)
172>quantize_nodes.cc
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\fold_batch_norms.cc)
172>quantize_weights.cc
173>c:\users\john\tensorflow\tensorflow\examples\label_image\main.cc(42): fatal error C1083: Cannot open include file: 'tensorflow/cc/ops/image_ops.h': No such file or directory
173>Done building project "tf_label_image_example.vcxproj" -- FAILED.
172>remove_attribute.cc
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\fuse_convolutions.cc)
172>remove_control_dependencies.cc
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\insert_logging.cc)
172>remove_device.cc
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\obfuscate_names.cc)
172>remove_nodes.cc
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\quantize_weights.cc)
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\quantize_nodes.cc)
172>rename_attribute.cc
172>rename_op.cc
172>round_weights.cc
172>set_device.cc
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\remove_attribute.cc)
172>sort_by_execution_order.cc
174>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_core_cpu.dir\Release\accumulate_n_optimizer.obj'
174>Done building project "grpc_tensorflow_server.vcxproj" -- FAILED.
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\rename_attribute.cc)
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\rename_op.cc)
172>sparsify_gather.cc
172>strip_unused_nodes.cc
172>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\sparsify_gather.cc)
172>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\sparsify_gather.cc)
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\remove_device.cc)
172>transform_graph.cc
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\remove_nodes.cc)
172>transform_utils.cc
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\round_weights.cc)
176>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_core_cpu.dir\Release\accumulate_n_optimizer.obj'
176>Done building project "benchmark_model.vcxproj" -- FAILED.
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\sort_by_execution_order.cc)
172>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\tools\graph_transforms\strip_unused_nodes.cc)
172>Done building project "tf_tools_transform_graph_lib.vcxproj" -- FAILED.
177>------ Build started: Project: pywrap_tensorflow_internal_static, Configuration: Release x64 ------
178>------ Build started: Project: summarize_graph, Configuration: Release x64 ------
179>------ Build started: Project: compare_graphs, Configuration: Release x64 ------
180>------ Build started: Project: transform_graph, Configuration: Release x64 ------
179>compare_graphs.cc
178>summarize_graph_main.cc
180>transform_graph_main.cc
177>Generating __force_rebuild
177>
177>Running SWIG to generate Python wrappers
177>print_model_analysis.cc
177>pywrap_tensor.cc
177>pywrap_tfe_src.cc
177>tf_session_helper.cc
177>cpp_shape_inference.cc
177>python_op_gen.cc
177>python_op_gen_internal.cc
177>bfloat16.cc
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\eager\pywrap_tfe_src.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\eager\pywrap_tfe_src.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\client\tf_session_helper.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\client\tf_session_helper.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\eager\pywrap_tensor.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\eager\pywrap_tensor.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\framework\cpp_shape_inference.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\framework\cpp_shape_inference.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\bfloat16.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\bfloat16.cc)
177>c:\users\john\tensorflow\tensorflow\python\framework\python_op_gen.cc(23): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/op_def.pb_text.h': No such file or directory
177>c:\users\john\tensorflow\tensorflow\python\framework\python_op_gen_internal.cc(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/op_def.pb_text.h': No such file or directory
177>numpy.cc
177>ndarray_tensor.cc
177>ndarray_tensor_bridge.cc
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\ndarray_tensor.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\ndarray_tensor.cc)
177>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\python\eager\pywrap_tfe_src.cc)
177>py_func.cc
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\ndarray_tensor_bridge.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\ndarray_tensor_bridge.cc)
177>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\python\client\tf_session_helper.cc)
177>py_exception_registry.cc
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\core\profiler\internal\print_model_analysis.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\core\profiler\internal\print_model_analysis.cc)
177>py_seq_tensor.cc
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_exception_registry.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_exception_registry.cc)
180>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_tools_transform_graph_lib.dir\Release\backports.obj'
179>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_tools_transform_graph_lib.dir\Release\backports.obj'
180>Done building project "transform_graph.vcxproj" -- FAILED.
177>py_util.cc
179>Done building project "compare_graphs.vcxproj" -- FAILED.
177>safe_ptr.cc
177>py_record_reader.cc
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\io\py_record_reader.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\io\py_record_reader.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\safe_ptr.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\safe_ptr.cc)
177>py_record_writer.cc
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\io\py_record_writer.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\io\py_record_writer.cc)
178>LINK : fatal error LNK1181: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_tools_transform_graph_lib.dir\Release\backports.obj'
178>Done building project "summarize_graph.vcxproj" -- FAILED.
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_func.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_func.cc)
177>kernel_registry.cc
177>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_func.cc)
177>util.cc
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_seq_tensor.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_seq_tensor.cc)
177>ops.cc
177>scope.cc
177>pywrap_tensorflow_internal.cc
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\util\util.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\util\util.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\pywrap_tensorflow_internal.cc)
177>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\pywrap_tensorflow_internal.cc)
177>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\cc\framework\scope.cc)
177>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\pywrap_tensorflow_internal.cc)
177>Done building project "pywrap_tensorflow_internal_static.vcxproj" -- FAILED.
181>------ Build started: Project: pywrap_tensorflow_internal, Configuration: Release x64 ------
181>Generating __force_rebuild
181>
181>Running SWIG to generate Python wrappers
181>print_model_analysis.cc
181>pywrap_tensor.cc
181>pywrap_tfe_src.cc
181>tf_session_helper.cc
181>cpp_shape_inference.cc
181>python_op_gen.cc
181>python_op_gen_internal.cc
181>bfloat16.cc
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\eager\pywrap_tfe_src.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\eager\pywrap_tfe_src.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\client\tf_session_helper.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\client\tf_session_helper.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\eager\pywrap_tensor.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\eager\pywrap_tensor.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\framework\cpp_shape_inference.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\framework\cpp_shape_inference.cc)
181>c:\users\john\tensorflow\tensorflow\python\framework\python_op_gen.cc(23): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/op_def.pb_text.h': No such file or directory
181>numpy.cc
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\bfloat16.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\bfloat16.cc)
181>c:\users\john\tensorflow\tensorflow\python\framework\python_op_gen_internal.cc(24): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/op_def.pb_text.h': No such file or directory
181>ndarray_tensor.cc
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\core\profiler\internal\print_model_analysis.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\core\profiler\internal\print_model_analysis.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\ndarray_tensor.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\ndarray_tensor.cc)
181>ndarray_tensor_bridge.cc
181>py_func.cc
181>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\python\client\tf_session_helper.cc)
181>py_exception_registry.cc
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\ndarray_tensor_bridge.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\ndarray_tensor_bridge.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_exception_registry.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_exception_registry.cc)
181>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\python\eager\pywrap_tfe_src.cc)
181>py_seq_tensor.cc
181>py_util.cc
181>safe_ptr.cc
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\safe_ptr.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\safe_ptr.cc)
181>py_record_reader.cc
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\io\py_record_reader.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\io\py_record_reader.cc)
181>py_record_writer.cc
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\io\py_record_writer.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\io\py_record_writer.cc)
181>kernel_registry.cc
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_func.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_func.cc)
181>util.cc
181>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_func.cc)
181>ops.cc
181>scope.cc
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_seq_tensor.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\lib\core\py_seq_tensor.cc)
181>pywrap_tensorflow_internal.cc
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\python\util\util.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\python\util\util.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1113): warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C (compiling source file C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\pywrap_tensorflow_internal.cc)
181>c:\users\john\tensorflow\tensorflow\c\c_api.h(1069): note: see declaration of 'TF_WhileParams' (compiling source file C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\pywrap_tensorflow_internal.cc)
181>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\cc\framework\scope.cc)
181>c:\users\john\tensorflow\tensorflow\core\common_runtime\device.h(37): fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory (compiling source file C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\pywrap_tensorflow_internal.cc)
181>Done building project "pywrap_tensorflow_internal.vcxproj" -- FAILED.
182>------ Build started: Project: _nearest_neighbor_ops, Configuration: Release x64 ------
183>------ Build started: Project: _gru_ops, Configuration: Release x64 ------
184>------ Build started: Project: _beam_search_ops, Configuration: Release x64 ------
185>------ Build started: Project: _lstm_ops, Configuration: Release x64 ------
186>------ Build started: Project: _periodic_resample_op, Configuration: Release x64 ------
182>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
184>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
183>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
183>blas_gemm.cc
185>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
186>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
184>beam_search_ops.cc
182>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
182>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
182>hyperplane_lsh_probes.cc
182>nearest_neighbor_ops.cc
186>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
186>periodic_resample_op.cc
186>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
185>blas_gemm.cc
186>array_ops.cc
183>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
183>gru_ops.cc
185>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
185>lstm_ops.cc
184>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
184>beam_search_ops.cc
186>LINK : fatal error LNK1181: cannot open input file 'Release\pywrap_tensorflow_internal.lib'
186>Done building project "_periodic_resample_op.vcxproj" -- FAILED.
182>LINK : fatal error LNK1181: cannot open input file 'Release\pywrap_tensorflow_internal.lib'
182>Done building project "_nearest_neighbor_ops.vcxproj" -- FAILED.
184>LINK : fatal error LNK1181: cannot open input file 'Release\pywrap_tensorflow_internal.lib'
184>Done building project "_beam_search_ops.vcxproj" -- FAILED.
183>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
183>gru_ops.cc
185>cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY'
185>lstm_ops.cc
183>LINK : fatal error LNK1181: cannot open input file 'Release\pywrap_tensorflow_internal.lib'
183>Done building project "_gru_ops.vcxproj" -- FAILED.
185>LINK : fatal error LNK1181: cannot open input file 'Release\pywrap_tensorflow_internal.lib'
185>Done building project "_lstm_ops.vcxproj" -- FAILED.
187>------ Build started: Project: tf_python_api, Configuration: Release x64 ------
188>------ Skipped Build: Project: INSTALL, Configuration: Release x64 ------
188>Project not selected to build for this solution configuration
187>Generating __init__.py files for Python API.
187>The parameter is incorrect
187>C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\VC\VCTargets\Microsoft.CppCommon.targets(171,5): error MSB6006: "cmd.exe" exited with code 1.
187>Done building project "tf_python_api.vcxproj" -- FAILED.
189>------ Skipped Build: Project: tf_python_build_pip_package, Configuration: Release x64 ------
189>Project not selected to build for this solution configuration
========== Build: 41 succeeded, 146 failed, 84 up-to-date, 2 skipped ==========
Please let me know if any other details are required.
| non_priority | issue in building tensorflow library on windows machine system information os platform and distribution e g linux ubuntu windows bit mobile device e g iphone pixel samsung galaxy if the issue happens on mobile device no tensorflow installed from source or binary source tensorflow version python version installed using virtualenv pip conda git bazel version if compiling from source no gcc compiler version if compiling from source cmake cuda cudnn version no gpu model and memory no describe the problem i am trying to build tensorflow in windows after downloading from the git branch i tried to build it using one of the link avilable online cmake step was executed successfully and visual studion solution file was also generated when i tried to build it in visual studion it started giving me error of file not found link followed to build the library provide the exact sequence of commands steps that you executed before running into the problem created all the required folder as described in the link run the build command cmake a dcmake build type release dswig executable c swigwin swig exe dpython executable c python exe dpython libraries c libs lib open the solution in visualstudion and build in release mode any other info logs include any logs or source code that would be helpful to diagnose the problem if including tracebacks please include the full traceback large logs and files should be attached visual studio logs build started project zlib configuration release build started project farmhash configuration release build started project gif configuration release build started project sqlite configuration release build started project highwayhash configuration release build started project jpeg configuration release build started project lmdb configuration release build started project nsync configuration release performing update step for zlib build started project farmhash create destination dir configuration release build started project gif 
create destination dir configuration release build started project sqlite create destination dir configuration release performing update step for highwayhash build started project png configuration release build started project protobuf configuration release build started project zlib create destination dir configuration release build started project eigen configuration release build started project jpeg create destination dir configuration release build started project highwayhash create destination dir configuration release build started project png create destination dir configuration release build started project lmdb create destination dir configuration release performing update step for protobuf build started project configuration release build started project double conversion configuration release performing update step for nsync build started project snappy configuration release build started project jpeg copy headers to destination configuration release build started project cub configuration release performing update step for performing update step for double conversion build started project nsync create destination dir configuration release performing update step for snappy build started project highwayhash copy headers to destination configuration release build started project grpc configuration release build started project jsoncpp configuration release build started project nsync copy headers to destination configuration release build started project lmdb copy headers to destination configuration release build started project png copy headers to destination configuration release build started project gif copy headers to destination configuration release performing update step for grpc performing update step for jsoncpp build started project sqlite copy headers to destination configuration release build started project gemmlowp configuration release build started project tf protos cc configuration release build started project configuration release 
build started project farmhash copy headers to destination configuration release build started project zlib copy headers to destination configuration release build started project create cc ops header dir configuration release build started project force rebuild target configuration release build started project tf python copy scripts to destination configuration release generating force rebuild generating c users john tensorflow tensorflow core util version info cc debug service pb cc debugger event metadata pb cc example pb cc example parser configuration pb cc feature pb cc allocation description pb cc api def pb cc attr value pb cc fatal not a git repository c users john tensorflow git cost graph pb cc device attributes pb cc function pb cc graph pb cc graph transfer info pb cc iterator pb cc kernel def pb cc log memory pb cc node def pb cc op def pb cc reader base pb cc remote fused graph execute info pb cc resource handle pb cc step stats pb cc summary pb cc tensor pb cc tensor description pb cc tensor shape pb cc tensor slice pb cc types pb cc variable pb cc versions pb cc op performance data pb cc boosted trees pb cc error codes pb cc profile pb cc tfprof log pb cc tfprof options pb cc tfprof output pb cc checkpointable object graph pb cc cluster pb cc config pb cc control flow pb cc critical section pb cc debug pb cc device properties pb cc eager service pb cc master pb cc master service pb cc meta graph pb cc named tensor pb cc queue runner pb cc rewriter config pb cc saved model pb cc saver pb cc tensor bundle pb cc tensorflow server pb cc transport options pb cc worker pb cc worker service pb cc event pb cc example proto fast parsing test pb cc memmapped file system pb cc saved tensor slice pb cc test log pb cc xla service pb cc backend configs pb cc hlo pb cc hlo profile printer data pb cc xla pb cc xla data pb cc learner pb cc quantiles pb cc split info pb cc tree config pb cc compilation result pb cc optimization parameters pb cc topology pb cc tpu 
embedding config pb cc tf protos cc vcxproj c users john tensorflow tensorflow contrib cmake build release tf protos cc lib build started project proto text configuration release gen proto text functions cc gen proto text functions lib cc path obj error unresolved external symbol void cdecl absl base internal throwstdoutofrange char const throwstdoutofrange base internal absl yaxpebd z referenced in function class std basic string class std allocator cdecl tensorflow io internal joinpathimpl class std initializer list joinpathimpl internal io tensorflow ya av basic string du char traits d std v allocator d std v initializer list vstring view absl z path obj error unresolved external symbol public unsigned cdecl absl string view rfind char unsigned const rfind string view absl qeba kd k z referenced in function class absl string view cdecl tensorflow io extension class absl string view extension io tensorflow ya avstring view absl z collection registry obj error unresolved external symbol class std basic ostream cdecl absl operator class absl string view yaaeav basic ostream du char traits d std std vstring view z referenced in function public class std unique ptr cdecl tensorflow monitoring collectionregistry register class tensorflow monitoring abstractmetricdef const class std function const register collectionregistry monitoring tensorflow qeaa av unique ptr vregistrationhandle collectionregistry monitoring tensorflow u default delete vregistrationhandle collectionregistry monitoring tensorflow std std pebvabstractmetricdef aebv function monitoring tensorflow z z str util obj error unresolved external symbol public unsigned cdecl absl string view find char unsigned const find string view absl qeba kd k z referenced in function class std vector class std allocator class std allocator class std allocator cdecl tensorflow str util split class absl string view class absl string view struct tensorflow str util allowempty split uallowempty str util tensorflow str util 
[log truncated at start: MSVC decorated-name symbol fragments for unresolved externals involving std::vector<std::basic_string<char, std::char_traits<char>, std::allocator<char>>> and an absl string_view type]
C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\Release\proto_text.exe : fatal error: unresolved externals
Done building project "proto_text.vcxproj" -- FAILED.

Build started: Project: tf_core_framework, Configuration: Release
  Generating (force rebuild): Running C++ protocol buffer text compiler (proto_text) on tensorflow/core/example/example.proto
  'Release\proto_text.exe' is not recognized as an internal or external command, operable program or batch file.
  C:\Program Files\Microsoft Visual Studio\Enterprise\IDE\VC\VCTargets\Microsoft.CppCommon.targets : error : cmd.exe exited with a non-zero code.
Done building project "tf_core_framework.vcxproj" -- FAILED.

Build started: Projects: tf_cc_op_gen_main, tf_core_cpu, tf_cc_framework, tf_python_op_gen_main, Configuration: Release
  [per-file compile progress lines omitted]
  C:\Users\john\tensorflow\tensorflow\cc\framework\cc_op_gen.cc : fatal error: Cannot open include file: 'tensorflow/core/framework/types.pb_text.h': No such file or directory
  C:\Users\john\tensorflow\tensorflow\python\framework\python_op_gen.cc : fatal error: Cannot open include file: 'tensorflow/core/framework/op_def.pb_text.h': No such file or directory
  C:\Users\john\tensorflow\tensorflow\python\framework\python_op_gen_internal.cc : fatal error: Cannot open include file: 'tensorflow/core/framework/op_def.pb_text.h': No such file or directory
  C:\Users\john\tensorflow\tensorflow\core\common_runtime\device.h : fatal error: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory
  [the device.h error repeats for dozens of source files under tensorflow/core/common_runtime, common_runtime/eager, core/debug, and core/graph compiled into tf_core_cpu]
  C:\Users\john\tensorflow\tensorflow\c\c_api.h : warning: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C
  C:\Users\john\tensorflow\tensorflow\c\c_api.h : note: see declaration of 'TF_WhileParams'
Done building project "tf_cc_op_gen_main.vcxproj" -- FAILED.
Done building project "tf_cc_framework.vcxproj" -- FAILED.
Done building project "tf_python_op_gen_main.vcxproj" -- FAILED.

LINK : fatal error: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_cc_op_gen_main.dir\Release\cc_op_gen.obj'
  [repeats for, and fails, every *_ops_gen_cc project: control_flow, ctc, cudnn_rnn, data_flow, image, random, io, summary, sendrecv, decode_proto, remote_fused_graph, sdca, linalg, string, list, stateless_random, resource_variable, encode_proto, script, logging, state, lookup, array, manip, rpc, math, audio, nn, spectral, no_op, batch, user, bitwise, parsing, sparse, functional, dataset, training, candidate_sampling, checkpoint, boosted_trees, set]
LINK : fatal error: cannot open input file 'C:\Users\john\tensorflow\tensorflow\contrib\cmake\build\tf_python_op_gen_main.dir\Release\python_op_gen.obj'
  [repeats for, and fails, every *_ops_gen_python project, including the contrib generators: tensor_forest (stats/model/hybrid), text_skip_gram, beam_search, rnn_lstm, rnn_gru, resampler, periodic_resample, nearest_neighbor, nccl, memory_stats, layers_sparse_feature_cross, input_pipeline, image_sirds, ...]

Build started: Project: tf_cc_ops, Configuration: Release
  Generating tensorflow/cc/ops/audio_ops.h, tensorflow/cc/ops/audio_ops.cc, tensorflow/cc/ops/audio_ops_internal.h, tensorflow/cc/ops/audio_ops_internal.cc
  'Release\audio_ops_gen_cc.exe' is not recognized as an internal or external command, operable program or batch file.
[log continues with further repetitions of the same cc_op_gen.obj / python_op_gen.obj link errors; truncated]
python op gen obj c program files microsoft visual studio enterprise ide vc vctargets microsoft cppcommon targets error cmd exe exited with code done building project tf cc ops vcxproj failed build started project tf cc while loop configuration release done building project contrib layers sparse feature cross ops gen python vcxproj failed build started project tf cc configuration release link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj done building project contrib nearest neighbor ops gen python vcxproj failed build started project contrib image ops gen python configuration release done building project contrib input pipeline ops gen python vcxproj failed link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj build started project contrib image distort image ops gen python configuration release done building project contrib memory stats ops gen python vcxproj failed build started project contrib framework variable ops gen python configuration release link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj done building project contrib rnn gru ops gen python vcxproj failed build started project contrib gcs config ops gen python configuration release link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj done building project contrib image sirds ops gen python vcxproj failed build started project contrib factorization factorization ops gen python configuration release c users john tensorflow tensorflow core common runtime device h fatal error cannot open 
include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core graph graph constructor cc validate cc cluster cc while loop cc virtual cluster cc analytical cost estimator cc client session cc array grad cc data flow grad cc c users john tensorflow tensorflow cc gradients data flow grad cc fatal error cannot open include file tensorflow cc ops data flow ops h no such file or directory image grad cc math grad cc nn grad cc c users john tensorflow tensorflow cc gradients nn grad cc fatal error cannot open include file tensorflow cc ops nn ops h no such file or directory coordinator cc c users john tensorflow tensorflow cc gradients math grad cc fatal error cannot open include file tensorflow cc ops array ops internal h no such file or directory queue runner cc grad op registry cc gradient checker cc gradients cc c users john tensorflow tensorflow cc gradients array grad cc fatal error cannot open include file tensorflow cc ops array ops internal h no such file or directory while gradients cc link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core graph quantize training cc graph memory cc done building project contrib framework variable ops gen python vcxproj failed build started project contrib factorization clustering ops gen python configuration release graph properties cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler clusters cluster cc 
measuring cost estimator cc link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj done building project contrib image distort image ops gen python vcxproj failed done building project contrib factorization factorization ops gen python vcxproj failed build started project contrib data dataset ops gen python configuration release build started project contrib coder ops gen python configuration release link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj c users john tensorflow tensorflow cc ops while loop cc fatal error cannot open include file tensorflow cc ops control flow ops internal h no such file or directory done building project contrib image ops gen python vcxproj failed done building project contrib gcs config ops gen python vcxproj failed build started project contrib boosted trees training ops gen python configuration release build started project contrib boosted trees stats accumulator ops gen python configuration release c users john tensorflow tensorflow cc gradients image grad cc fatal error cannot open include file tensorflow cc ops image ops internal h no such file or directory c users john tensorflow tensorflow cc ops standard ops h fatal error cannot open include file tensorflow cc ops array ops h no such file or directory compiling source file c users john tensorflow tensorflow cc framework gradients cc c users john tensorflow tensorflow cc framework while gradients cc fatal error cannot open include file tensorflow cc ops control flow ops internal h no such file or 
directory c users john tensorflow tensorflow cc ops standard ops h fatal error cannot open include file tensorflow cc ops array ops h no such file or directory compiling source file c users john tensorflow tensorflow cc framework gradient checker cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler clusters virtual cluster cc op level cost estimator cc done building project tf cc while loop vcxproj failed build started project tf c configuration release robust stats cc virtual placer cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler costs analytical cost estimator cc virtual scheduler cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler costs graph memory cc devices cc gen node cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler costs graph properties cc graph analyzer cc c users john tensorflow tensorflow core grappler graph analyzer graph analyzer cc fatal error cannot open include file absl strings str format h no such file or directory graph analyzer tool cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users 
john tensorflow tensorflow core grappler costs measuring cost estimator cc sig node cc c users john tensorflow tensorflow core grappler graph analyzer gen node cc fatal error cannot open include file absl strings str format h no such file or directory graph view cc c api cc link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj c users john tensorflow tensorflow c c api h warning tf newwhile has c linkage specified but returns udt tf whileparams which is incompatible with c c users john tensorflow tensorflow c c api h note see declaration of tf whileparams c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler costs virtual placer cc done building project contrib coder ops gen python vcxproj failed grappler item cc build started project contrib boosted trees split handler ops gen python configuration release grappler item builder cc done building project tf cc vcxproj failed build started project decode proto ops gen python configuration release link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj done building project contrib boosted trees split handler ops gen python vcxproj failed link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj build started project contrib boosted trees prediction ops gen python configuration release done building project contrib data dataset ops gen python vcxproj failed build started project contrib boosted trees model ops gen python configuration release link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main 
dir release python op gen obj done building project contrib boosted trees stats accumulator ops gen python vcxproj failed build started project contrib bigquery reader ops gen python configuration release link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj done building project contrib boosted trees training ops gen python vcxproj failed build started project checkpoint ops gen python configuration release done building project decode proto ops gen python vcxproj failed build started project array ops gen python configuration release c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler costs virtual scheduler cc link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory done building project contrib factorization clustering ops gen python vcxproj failed file input yielder cc c users john tensorflow tensorflow core grappler graph analyzer sig node cc fatal error cannot open include file absl strings str format h no such file or directory mutable graph view cc op types cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no 
such file or directory compiling source file c users john tensorflow tensorflow core grappler grappler item builder cc arithmetic optimizer cc build started project candidate sampling ops gen python configuration release done building project contrib boosted trees model ops gen python vcxproj failed auto parallel cc build started project boosted trees ops gen python configuration release done building project tf c vcxproj failed link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj build started project tf c python api configuration release done building project contrib boosted trees prediction ops gen python vcxproj failed build started project bitwise ops gen python configuration release custom graph optimizer registry cc link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj done building project candidate sampling ops gen python vcxproj failed build started project training ops gen python configuration release link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj done building project contrib bigquery reader ops gen python vcxproj failed done building project boosted trees ops gen python vcxproj failed build started project user ops gen python configuration release build started project batch ops gen python configuration release link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj done building project checkpoint ops gen python vcxproj failed build started project audio ops gen python configuration release link fatal error cannot open input file c 
users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj done building project array ops gen python vcxproj failed build started project contrib boosted trees quantiles ops gen python configuration release done building project bitwise ops gen python vcxproj failed link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj done building project user ops gen python vcxproj failed filter fusion cc fusion utils cc graph utils cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler optimizers arithmetic optimizer cc latency all edges cc link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj map and batch fusion cc done building project training ops gen python vcxproj failed c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler optimizers auto parallel cc map and filter fusion cc map fusion cc map vectorization cc link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj done building project batch ops gen python vcxproj failed done building project audio ops gen python vcxproj 
failed link fatal error cannot open input file c users john tensorflow tensorflow contrib cmake build tf python op gen main dir release python op gen obj done building project contrib boosted trees quantiles ops gen python vcxproj failed build started project tf python ops configuration release c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler optimizers data filter fusion cc noop elimination cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler optimizers data latency all edges cc shuffle and repeat fusion cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler optimizers data map and batch fusion cc debug stripper cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler optimizers data map fusion cc dependency optimizer cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler optimizers data map and filter fusion cc evaluation utils cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb 
text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler optimizers data map vectorization cc function optimizer cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler optimizers data noop elimination cc gpu swapping kernels cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler optimizers data shuffle and repeat fusion cc gpu swapping ops cc graph optimizer stage cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler optimizers debug stripper cc graph rewriter cc layout optimizer cc generating tf python tensorflow python ops gen audio ops py c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler optimizers dependency optimizer cc loop optimizer cc release audio ops gen python exe is not recognized as an internal or external command operable program or batch file c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler optimizers function optimizer cc memory optimizer cc c program files microsoft visual studio enterprise ide vc vctargets 
microsoft cppcommon targets error cmd exe exited with code done building project tf python ops vcxproj failed c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler optimizers gpu swapping kernels cc meta optimizer cc model pruner cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler optimizers graph optimizer stage cc remapper cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler optimizers loop optimizer cc scoped allocator optimizer cc shape optimizer cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler optimizers layout optimizer cc static schedule cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler optimizers memory optimizer cc symbolic shapes cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler optimizers meta optimizer cc colocation cc frame cc c users john tensorflow 
tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler optimizers shape optimizer cc functions cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler optimizers remapper cc scc cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler optimizers scoped allocator optimizer cc topological sort cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler optimizers static schedule cc traversal cc done building project tf core cpu vcxproj failed build started project tf core kernels configuration release build started project tf grappler configuration release build started project tf core distributed runtime configuration release build started project tf core direct session configuration release direct session cc single machine cc cost analyzer cc model analyzer cc base rendezvous mgr cc call options cc cluster function library runtime cc collective param resolver distributed cc collective rma distributed cc device resolver distributed cc eager service impl cc graph mgr cc local master cc adjust contrast op cc adjust hue op cc adjust saturation op cc aggregate ops cc argmax op cc as string op cc attention ops cc avgpooling op cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open 
include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core distributed runtime collective rma distributed cc master cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core distributed runtime base rendezvous mgr cc master session cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow python grappler cost analyzer cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core distributed runtime eager eager service impl cc message wrappers cc c users john tensorflow tensorflow core distributed runtime message wrappers h fatal error cannot open include file tensorflow core framework tensor pb text h no such file or directory compiling source file c users john tensorflow tensorflow core distributed runtime cluster function library runtime cc partial run mgr cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow core grappler clusters single machine cc c users john tensorflow tensorflow core common runtime device h fatal error cannot open include file tensorflow core framework device attributes pb text h no such file or directory compiling source file c users john tensorflow tensorflow python grappler model analyzer cc c users john tensorflow 
C:\Users\John\tensorflow\tensorflow\core\common_runtime\device.h: fatal error C1083: Cannot open include file: 'tensorflow/core/framework/device_attributes.pb_text.h': No such file or directory
  (repeated while compiling, among others: distributed_runtime\device_resolver_distributed.cc, graph_mgr.cc, master_session.cc, master.cc, remote_device.cc, scheduler.cc, session_mgr.cc, tensor_coding.cc, worker.cc, worker_session.cc, rpc_collective_executor_mgr.cc, rpc\eager\grpc_eager_service_impl.cc, rpc\grpc_master_service.cc, rpc\grpc_worker_service.cc, rpc\rpc_rendezvous_mgr.cc, kernels\cast_op.cc, kernels\conv_ops_fused.cc, kernels\conv_ops_using_gemm.cc, the tools\graph_transforms sources, python\client\tf_session_helper.cc, python\eager\pywrap_tfe_src.cc, python\lib\core\py_func.cc, cc\framework\scope.cc, and pywrap_tensorflow_internal.cc)

C:\Users\John\tensorflow\tensorflow\core\distributed_runtime\message_wrappers.h: fatal error C1083: Cannot open include file: 'tensorflow/core/framework/tensor.pb_text.h': No such file or directory
  (repeated while compiling, among others: collective_param_resolver_distributed.cc, local_master.cc, message_wrappers.cc, partial_run_mgr.cc, worker_cache_partial.cc, rpc\grpc_remote_master.cc, rpc\grpc_remote_worker.cc, rpc\grpc_session.cc, rpc\grpc_server_lib.cc, rpc\grpc_worker_cache.cc)

C:\Users\John\tensorflow\tensorflow\python\framework\python_op_gen.cc: fatal error C1083: Cannot open include file: 'tensorflow/core/framework/op_def.pb_text.h': No such file or directory
C:\Users\John\tensorflow\tensorflow\python\framework\python_op_gen_internal.cc: fatal error C1083: Cannot open include file: 'tensorflow/core/framework/op_def.pb_text.h': No such file or directory

C:\Users\John\tensorflow\tensorflow\cc\ops\standard_ops.h: fatal error C1083: Cannot open include file: 'tensorflow/cc/ops/array_ops.h': No such file or directory
C:\Users\John\tensorflow\tensorflow\examples\label_image\main.cc: fatal error C1083: Cannot open include file: 'tensorflow/cc/ops/image_ops.h': No such file or directory

C:\Users\John\tensorflow\tensorflow\core\kernels\batchtospace_op.cc: warning: too many actual parameters for macro 'TF_BATCHTOSPACE_BLOCK_DIMS_CASE'
C:\Users\John\tensorflow\tensorflow\core\kernels\compare_and_bitpack_op.cc: warning C4805: unsafe mix of type 'int' and type 'bool' in operation
  (while compiling the class template member functions tensorflow::functor::ComputeShard<T>::Compute, tensorflow::functor::CompareAndBitpack<CPUDevice, T>::operator(), and tensorflow::CompareAndBitpackOp<CPUDevice, T>::Compute, with T = bool, IndexType = Eigen::DenseIndex)
C:\Users\John\tensorflow\tensorflow\c\c_api.h: warning C4190: 'TF_NewWhile' has C-linkage specified, but returns UDT 'TF_WhileParams' which is incompatible with C
  (see declaration of 'TF_WhileParams'; repeated while compiling tf_session_helper.cc, pywrap_tensor.cc, pywrap_tfe_src.cc, cpp_shape_inference.cc, print_model_analysis.cc, the python\lib\core and python\lib\io sources, python\util\util.cc, sparsify_gather.cc, and pywrap_tensorflow_internal.cc)
cl : Command line warning D9025: overriding '/DTF_COMPILE_LIBRARY' with '/UTF_COMPILE_LIBRARY' (repeated for the op-library projects)

C:\Users\John\tensorflow\tensorflow\contrib\cmake\build\external\eigen_archive\eigen\src\Core\products\GeneralBlockPanelKernel.h: fatal error C1060: compiler is out of heap space
cl : Command line error D8040: error creating or communicating with child process

Done building project "tf_grappler.vcxproj" -- FAILED.
Done building project "tf_core_direct_session.vcxproj" -- FAILED.
Done building project "tf_core_distributed_runtime.vcxproj" -- FAILED.
Done building project "tf_core_kernels.vcxproj" -- FAILED.

Build started: tf_tools_transform_graph_lib, tf_label_image_example, grpc_tensorflow_server, tf_tutorials_example_trainer, benchmark_model (Configuration: Release)
Done building project "tf_tutorials_example_trainer.vcxproj" -- FAILED.
Done building project "tf_label_image_example.vcxproj" -- FAILED.
LINK : fatal error LNK1181: cannot open input file 'C:\Users\John\tensorflow\tensorflow\contrib\cmake\build\tf_core_cpu.dir\Release\accumulate_n_optimizer.obj'
Done building project "grpc_tensorflow_server.vcxproj" -- FAILED.
Done building project "benchmark_model.vcxproj" -- FAILED.
Done building project "tf_tools_transform_graph_lib.vcxproj" -- FAILED.

Build started: pywrap_tensorflow_internal_static, summarize_graph, compare_graphs, transform_graph (Configuration: Release)
Generating __force_rebuild; running SWIG to generate Python wrappers
LINK : fatal error LNK1181: cannot open input file 'C:\Users\John\tensorflow\tensorflow\contrib\cmake\build\tf_tools_transform_graph_lib.dir\Release\backports.obj'
Done building project "transform_graph.vcxproj" -- FAILED.
Done building project "compare_graphs.vcxproj" -- FAILED.
Done building project "summarize_graph.vcxproj" -- FAILED.
Done building project "pywrap_tensorflow_internal_static.vcxproj" -- FAILED.

Build started: pywrap_tensorflow_internal (Configuration: Release)
Generating __force_rebuild; running SWIG to generate Python wrappers
  (same C1083 errors on device_attributes.pb_text.h and op_def.pb_text.h, and the same C4190 warnings, as for the static variant above)
Done building project "pywrap_tensorflow_internal.vcxproj" -- FAILED.

Build started: nearest_neighbor_ops, gru_ops, beam_search_ops, lstm_ops, periodic_resample_op (Configuration: Release)
LINK : fatal error LNK1181: cannot open input file 'Release\pywrap_tensorflow_internal.lib'
Done building project "periodic_resample_op.vcxproj" -- FAILED.
Done building project "nearest_neighbor_ops.vcxproj" -- FAILED.
Done building project "beam_search_ops.vcxproj" -- FAILED.
Done building project "gru_ops.vcxproj" -- FAILED.
Done building project "lstm_ops.vcxproj" -- FAILED.

Build started: tf_python_api (Configuration: Release)
Skipped Build: INSTALL (Configuration: Release): project not selected to build for this solution configuration
generating init py files for python api the parameter is incorrect c program files microsoft visual studio enterprise ide vc vctargets microsoft cppcommon targets error cmd exe exited with code done building project tf python api vcxproj failed skipped build project tf python build pip package configuration release project not selected to build for this solution configuration build succeeded failed up to date skipped please let me know of any other details required | 0 |
31,453 | 14,970,563,595 | IssuesEvent | 2021-01-27 19:48:18 | flutter/flutter | https://api.github.com/repos/flutter/flutter | opened | My app is slow or missing frames (metabug) | created via performance template macos-metal perf: speed severe: performance | This is a meta-issue to track reproducible reports of jank in Flutter apps.
If you are experiencing jank in your app:
1. Try to reproduce the problem in a test app. Either run `flutter create janktest` and recreate the situation you are experiencing in that app, or clone your app and delete code until you have the jank reproducing with a single .dart file.
2. [File a bug](https://github.com/flutter/flutter/issues/new?assignees=&labels=created+via+performance+template&template=5_performance_speed.md&title=) and include your .dart file demonstrating the problem. If you need more than just a .dart file (for example, assets are needed to reproduce the issue, or plugins/packages are needed to reproduce the issue) then create a GitHub repository and upload the app there.
Make sure to include the `flutter doctor -v` output and any logs from `flutter run` and `flutter analyze`.
3. Switch flutter to master channel and run this app on a physical device using profile mode with Skia tracing enabled, as follows:
`flutter channel master`
`flutter run --profile --trace-skia`
Then press ‘P’ to enable the performance overlay.
The bleeding edge master channel is encouraged here because Flutter is constantly fixing bugs and improving its performance. Your problem in an older Flutter version may have already been solved in the master channel.
4. Record a video of the performance issue using another phone so we can have an intuitive understanding of what happened. Don’t use "adb screenrecord", as that affects the performance of the profile run. Attach the video to your bug.
5. Open Observatory and save a timeline trace of the performance issue so we know which functions might be causing it. See "How to Collect and Read Timeline Traces" on this blog post:
https://medium.com/flutter/profiling-flutter-applications-using-the-timeline-a1a434964af3#a499
Attach the JSON file containing your trace to your bug. You may also wish to include a screenshot of the part of the trace showing the problem you are seeing, just so that people can see at a glance what kind of performance issue the bug is about.
6. Mention _this_ bug in your bug, so that GitHub includes a link to it here.
Please avoid commenting on this bug. Keep each issue separate so that we can examine each specific problem individually. Having one issue that contains comments about multiple problems makes the issue intractable. | True | My app is slow or missing frames (metabug) - This is a meta-issue to track reproducible reports of jank in Flutter apps.
If you are experiencing jank in your app:
1. Try to reproduce the problem in a test app. Either run `flutter create janktest` and recreate the situation you are experiencing in that app, or clone your app and delete code until you have the jank reproducing with a single .dart file.
2. [File a bug](https://github.com/flutter/flutter/issues/new?assignees=&labels=created+via+performance+template&template=5_performance_speed.md&title=) and include your .dart file demonstrating the problem. If you need more than just a .dart file (for example, assets are needed to reproduce the issue, or plugins/packages are needed to reproduce the issue) then create a GitHub repository and upload the app there.
Make sure to include the `flutter doctor -v` output and any logs from `flutter run` and `flutter analyze`.
3. Switch flutter to master channel and run this app on a physical device using profile mode with Skia tracing enabled, as follows:
`flutter channel master`
`flutter run --profile --trace-skia`
Then press ‘P’ to enable the performance overlay.
The bleeding edge master channel is encouraged here because Flutter is constantly fixing bugs and improving its performance. Your problem in an older Flutter version may have already been solved in the master channel.
4. Record a video of the performance issue using another phone so we can have an intuitive understanding of what happened. Don’t use "adb screenrecord", as that affects the performance of the profile run. Attach the video to your bug.
5. Open Observatory and save a timeline trace of the performance issue so we know which functions might be causing it. See "How to Collect and Read Timeline Traces" on this blog post:
https://medium.com/flutter/profiling-flutter-applications-using-the-timeline-a1a434964af3#a499
Attach the JSON file containing your trace to your bug. You may also wish to include a screenshot of the part of the trace showing the problem you are seeing, just so that people can see at a glance what kind of performance issue the bug is about.
6. Mention _this_ bug in your bug, so that GitHub includes a link to it here.
Please avoid commenting on this bug. Keep each issue separate so that we can examine each specific problem individually. Having one issue that contains comments about multiple problems make the issue intractable. | non_priority | my app is slow or missing frames metabug this is a meta issue to track reproducible reports of jank in flutter apps if you are experiencing jank in your app try to reproduce the problem in a test app either run flutter create janktest and recreate the situation you are experiencing in that app or clone your app and delete code until you have the jank reproducing with a single dart file and include your dart file demonstrating the problem if you need more than just a dart file for example assets are needed to reproduce the issue or plugins packages are needed to reproduce the issue then create a github repository and upload the app there make sure to include the flutter doctor v output and any logs from flutter run and flutter analyze switch flutter to master channel and run this app on a physical device using profile mode with skia tracing enabled as follows flutter channel master flutter run profile trace skia then press ‘p’ to enable the performance overlay the bleeding edge master channel is encouraged here because flutter is constantly fixing bugs and improving its performance your problem in an older flutter version may have already been solved in the master channel record a video of the performance issue using another phone so we can have an intuitive understanding of what happened don’t use adb screenrecord as that affects the performance of the profile run attach the video to your bug open observatory and save a timeline trace of the performance issue so we know which functions might be causing it see how to collect and read timeline traces on this blog post attach the json file containing your trace to your bug you may also wish to include a screenshot of the part of the trace showing the problem you are seeing just so that people 
can see at a glance what kind of performance issue the bug is about mention this bug in your bug so that github includes a link to it here please avoid commenting on this bug keep each issue separate so that we can examine each specific problem individually having one issue that contains comments about multiple problems make the issue intractable | 0 |
14,331 | 3,256,854,116 | IssuesEvent | 2015-10-20 15:26:34 | healthsites/healthsites | https://api.github.com/repos/healthsites/healthsites | opened | establish country data section | country data homepage-design | https://redpen.io/ed70300c042811388e
Normal course:
The user will scroll down the page and find the map and the graphs showing how the data breaks down.
The map will show data that relates to the graphs.
Default view:
See all data for all countries.
Map will be zoomed out.
When a user clicks on the map, zoom in so that the locations can de-cluster.
Allow a user to drag and move the map.
When the user clicks on an individual location open the map page and location sidebar for that location.
Country view
The user wants to look at data for a specific country.
Type country name.
When the user starts typing present the top 5 auto-completed options in a drop down so that they can select. See google.
Once selected open the map for selected country and display corresponding data in the graphs. | 1.0 | establish country data section - https://redpen.io/ed70300c042811388e
Normal course:
The user will scroll down the page and find the map and the graphs showing how the data breaks down.
The map will show data that relates to the graphs.
Default view:
See all data for all countries.
Map will be zoomed out.
When a user clicks on the map, zoom in so that the locations can de-cluster.
Allow a user to drag and move the map.
When the user clicks on an individual location open the map page and location sidebar for that location.
Country view
The user wants to look at data for a specific country.
Type country name.
When the user starts typing present the top 5 auto-completed options in a drop down so that they can select. See google.
Once selected open the map for selected country and display corresponding data in the graphs. | non_priority | establish country data section normal course the user will scroll down the page and find the map and the graphs showing how the data breaks down the map will show data that relates to the graphs default view see all data for all countries map will be doomed out when a user clicks on the map zoom in so that the locations can de cluster all a user to drag and move the map when the user clicks on an individual location open the map page and location sidebar for that location country view the user wants to look at data for a specific country type country name when the user starts typing present the top auto completed options in a drop down so that they can select see google once selected open the map for selected country and display corresponding data in the graphs | 0 |
27,348 | 5,341,037,463 | IssuesEvent | 2017-02-17 01:08:47 | pvlib/pvlib-python | https://api.github.com/repos/pvlib/pvlib-python | closed | pvlib python communication forums | development_workflow documentation | What do people think about these new communication forums?
- pvlib tag on stackoverflow: http://stackoverflow.com/questions/tagged/pvlib
- pvlib python google group: https://groups.google.com/forum/#!forum/pvlib-python
- pvlib python slack organization: https://pvlib-python.slack.com/
Here are my thoughts:
First, all credit goes to @mikofski for pushing this forward and setting up some things for us to try! Thanks!
I am concerned that too many forums could be counter productive, but I think we're big enough now that we could benefit from more than GitHub issues. Watching the project on GitHub does allow people to stay up to date on every little thing in the pvlib python world, but that is way too much information for many people. Spreading the communication over multiple platforms lets people focus on what they really care about at the cost of making things more complicated and potentially diluting the viewership of any given item.
I think it would be nice if we would use GitHub for bug tracking and development discussions, people would ask usage questions on Stack Overflow, and we’d use a mailing list for new version announcements.
I know that a few people have tried to ask pvlib questions on stack overflow in the past with varying degrees of success. I am going to start watching Mark's new pvlib tag. I don't have much experience with google groups, but at this point I prefer stackoverflow for usage questions. In any case, while we're getting started with stack overflow and/or google groups, I think that people should feel free to open and then close a GitHub issue that links to the new post.
On a related note, [Sibbell](https://sibbell.com/about/) is a nice tool that will send you an email when there are new releases of any of your starred repositories.
I like Slack for quick, informal conversations in which I don’t have to stress about if I’m going to come across the wrong way in a public forum. I don't like the constant din of interruptions on Slack, though.
I also get requests for help via email from time to time. I don't like answering questions via email because only a handful of people have an opportunity to learn from the exchange.
I don’t know what the ideal approach is, but I think we should try something new and see if it catches on!
Finally, we should add some notes to the documentation once we come to a consensus. | 1.0 | pvlib python communication forums - What do people think about these new communication forums?
- pvlib tag on stackoverflow: http://stackoverflow.com/questions/tagged/pvlib
- pvlib python google group: https://groups.google.com/forum/#!forum/pvlib-python
- pvlib python slack organization: https://pvlib-python.slack.com/
Here are my thoughts:
First, all credit goes to @mikofski for pushing this forward and setting up some things for us to try! Thanks!
I am concerned that too many forums could be counter productive, but I think we're big enough now that we could benefit from more than GitHub issues. Watching the project on GitHub does allow people to stay up to date on every little thing in the pvlib python world, but that is way too much information for many people. Spreading the communication over multiple platforms lets people focus on what they really care about at the cost of making things more complicated and potentially diluting the viewership of any given item.
I think it would be nice if we would use GitHub for bug tracking and development discussions, people would ask usage questions on Stack Overflow, and we’d use a mailing list for new version announcements.
I know that a few people have tried to ask pvlib questions on stack overflow in the past with varying degrees of success. I am going to start watching Mark's new pvlib tag. I don't have much experience with google groups, but at this point I prefer stackoverflow for usage questions. In any case, while we're getting started with stack overflow and/or google groups, I think that people should feel free to open and then close a GitHub issue that links to the new post.
On a related note, [Sibbell](https://sibbell.com/about/) is a nice tool that will send you an email when there are new releases of any of your starred repositories.
I like Slack for quick, informal conversations in which I don’t have to stress about if I’m going to come across the wrong way in a public forum. I don't like the constant din of interruptions on Slack, though.
I also get requests for help via email from time to time. I don't like answering questions via email because only a handful of people have an opportunity to learn from the exchange.
I don’t know what the ideal approach is, but I think we should try something new and see if it catches on!
Finally, we should add some notes to the documentation once we come to a consensus. | non_priority | pvlib python communication forums what do people think about these new communication forums pvlib tag on stackoverflow pvlib python google group pvlib python slack organization here are my thoughts first all credit goes to mikofski for pushing this forward and setting up some things for us to try thanks i am concerned that too many forums could be counter productive but i think we re big enough now that we could benefit from more than github issues watching the project on github does allow people to stay up to date on every little thing in the pvlib python world but that is way too much information for many people spreading the communication over multiple platforms lets people focus on what they really care about at the cost of making things more complicated and potentially diluting the viewership of any given item i think it would be nice if we would use github for bug tracking and development discussions people would ask usage questions on stack overflow and we’d use a mailing list for new version announcements i know that a few people have tried to ask pvlib questions on stack overflow in the past with varying degrees of success i am going to start watching mark s new pvlib tag i don t have much experience with google groups but at this point i prefer stackoverflow for usage questions in any case while we re getting started with stack overflow and or google groups i think that people should feel free to open and then close a github issue that links to the new post on a related note is a nice tool that will send you an email when there are new releases of any of your starred repositories i like slack for quick informal conversations in which i don’t have to stress about if i’m going to come across the wrong way in a public forum i don t like the constant din of interruptions on slack though i also get requests for help via email from time to time i don t like 
answering questions via email because only a handful of people have an opportunity to learn from the exchange i don’t know what the ideal approach is but i think we should try something new and see if it catches on finally we should add some notes to the documentation once we come to a consensus | 0 |
202,561 | 15,286,998,538 | IssuesEvent | 2021-02-23 15:18:13 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | closed | roachtest: kv95/enc=false/nodes=3/batch=16 failed | C-test-failure O-roachtest O-robot branch-release-20.2 release-blocker | [(roachtest).kv95/enc=false/nodes=3/batch=16 failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=2657159&tab=buildLog) on [release-20.2@8c79e2bc4b35d36c8527f4c40c974f03d9034f46](https://github.com/cockroachdb/cockroach/commits/8c79e2bc4b35d36c8527f4c40c974f03d9034f46):
```
| runtime.goexit
| /usr/local/go/src/runtime/asm_amd64.s:1374
Wraps: (2) output in run_080937.638_n4_workload_run_kv
Wraps: (3) /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod run teamcity-2657159-1612856700-19-n4cpu8:4 -- ./workload run kv --init --histograms=perf/stats.json --concurrency=192 --splits=1000 --duration=10m0s --read-percent=95 --batch=16 {pgurl:1-3} returned
| stderr:
| ./workload: /lib/x86_64-linux-gnu/libm.so.6: version `GLIBC_2.29' not found (required by ./workload)
| Error: COMMAND_PROBLEM: exit status 1
| (1) COMMAND_PROBLEM
| Wraps: (2) Node 4. Command with error:
| | ```
| | ./workload run kv --init --histograms=perf/stats.json --concurrency=192 --splits=1000 --duration=10m0s --read-percent=95 --batch=16 {pgurl:1-3}
| | ```
| Wraps: (3) exit status 1
| Error types: (1) errors.Cmd (2) *hintdetail.withDetail (3) *exec.ExitError
|
| stdout:
Wraps: (4) exit status 20
Error types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *main.withCommandDetails (4) *exec.ExitError
cluster.go:2654,kv.go:97,kv.go:184,test_runner.go:755: monitor failure: monitor task failed: t.Fatal() was called
(1) attached stack trace
-- stack trace:
| main.(*monitor).WaitE
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2642
| main.(*monitor).Wait
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2650
| main.registerKV.func2
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/kv.go:97
| main.registerKV.func3
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/kv.go:184
| main.(*testRunner).runTest.func2
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/test_runner.go:755
Wraps: (2) monitor failure
Wraps: (3) attached stack trace
-- stack trace:
| main.(*monitor).wait.func2
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2698
Wraps: (4) monitor task failed
Wraps: (5) attached stack trace
-- stack trace:
| main.init
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2612
| runtime.doInit
| /usr/local/go/src/runtime/proc.go:5652
| runtime.main
| /usr/local/go/src/runtime/proc.go:191
| runtime.goexit
| /usr/local/go/src/runtime/asm_amd64.s:1374
Wraps: (6) t.Fatal() was called
Error types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *withstack.withStack (4) *errutil.withPrefix (5) *withstack.withStack (6) *errutil.leafError
```
<details><summary>More</summary><p>
Artifacts: [/kv95/enc=false/nodes=3/batch=16](https://teamcity.cockroachdb.com/viewLog.html?buildId=2657159&tab=artifacts#/kv95/enc=false/nodes=3/batch=16)
Related:
- #59921 roachtest: kv95/enc=false/nodes=3/batch=16 failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-master](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-master) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2Akv95%2Fenc%3Dfalse%2Fnodes%3D3%2Fbatch%3D16.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
| 2.0 | roachtest: kv95/enc=false/nodes=3/batch=16 failed - [(roachtest).kv95/enc=false/nodes=3/batch=16 failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=2657159&tab=buildLog) on [release-20.2@8c79e2bc4b35d36c8527f4c40c974f03d9034f46](https://github.com/cockroachdb/cockroach/commits/8c79e2bc4b35d36c8527f4c40c974f03d9034f46):
```
| runtime.goexit
| /usr/local/go/src/runtime/asm_amd64.s:1374
Wraps: (2) output in run_080937.638_n4_workload_run_kv
Wraps: (3) /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod run teamcity-2657159-1612856700-19-n4cpu8:4 -- ./workload run kv --init --histograms=perf/stats.json --concurrency=192 --splits=1000 --duration=10m0s --read-percent=95 --batch=16 {pgurl:1-3} returned
| stderr:
| ./workload: /lib/x86_64-linux-gnu/libm.so.6: version `GLIBC_2.29' not found (required by ./workload)
| Error: COMMAND_PROBLEM: exit status 1
| (1) COMMAND_PROBLEM
| Wraps: (2) Node 4. Command with error:
| | ```
| | ./workload run kv --init --histograms=perf/stats.json --concurrency=192 --splits=1000 --duration=10m0s --read-percent=95 --batch=16 {pgurl:1-3}
| | ```
| Wraps: (3) exit status 1
| Error types: (1) errors.Cmd (2) *hintdetail.withDetail (3) *exec.ExitError
|
| stdout:
Wraps: (4) exit status 20
Error types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *main.withCommandDetails (4) *exec.ExitError
cluster.go:2654,kv.go:97,kv.go:184,test_runner.go:755: monitor failure: monitor task failed: t.Fatal() was called
(1) attached stack trace
-- stack trace:
| main.(*monitor).WaitE
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2642
| main.(*monitor).Wait
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2650
| main.registerKV.func2
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/kv.go:97
| main.registerKV.func3
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/kv.go:184
| main.(*testRunner).runTest.func2
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/test_runner.go:755
Wraps: (2) monitor failure
Wraps: (3) attached stack trace
-- stack trace:
| main.(*monitor).wait.func2
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2698
Wraps: (4) monitor task failed
Wraps: (5) attached stack trace
-- stack trace:
| main.init
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2612
| runtime.doInit
| /usr/local/go/src/runtime/proc.go:5652
| runtime.main
| /usr/local/go/src/runtime/proc.go:191
| runtime.goexit
| /usr/local/go/src/runtime/asm_amd64.s:1374
Wraps: (6) t.Fatal() was called
Error types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *withstack.withStack (4) *errutil.withPrefix (5) *withstack.withStack (6) *errutil.leafError
```
<details><summary>More</summary><p>
Artifacts: [/kv95/enc=false/nodes=3/batch=16](https://teamcity.cockroachdb.com/viewLog.html?buildId=2657159&tab=artifacts#/kv95/enc=false/nodes=3/batch=16)
Related:
- #59921 roachtest: kv95/enc=false/nodes=3/batch=16 failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-master](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-master) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2Akv95%2Fenc%3Dfalse%2Fnodes%3D3%2Fbatch%3D16.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
| non_priority | roachtest enc false nodes batch failed on runtime goexit usr local go src runtime asm s wraps output in run workload run kv wraps home agent work go src github com cockroachdb cockroach bin roachprod run teamcity workload run kv init histograms perf stats json concurrency splits duration read percent batch pgurl returned stderr workload lib linux gnu libm so version glibc not found required by workload error command problem exit status command problem wraps node command with error workload run kv init histograms perf stats json concurrency splits duration read percent batch pgurl wraps exit status error types errors cmd hintdetail withdetail exec exiterror stdout wraps exit status error types withstack withstack errutil withprefix main withcommanddetails exec exiterror cluster go kv go kv go test runner go monitor failure monitor task failed t fatal was called attached stack trace stack trace main monitor waite home agent work go src github com cockroachdb cockroach pkg cmd roachtest cluster go main monitor wait home agent work go src github com cockroachdb cockroach pkg cmd roachtest cluster go main registerkv home agent work go src github com cockroachdb cockroach pkg cmd roachtest kv go main registerkv home agent work go src github com cockroachdb cockroach pkg cmd roachtest kv go main testrunner runtest home agent work go src github com cockroachdb cockroach pkg cmd roachtest test runner go wraps monitor failure wraps attached stack trace stack trace main monitor wait home agent work go src github com cockroachdb cockroach pkg cmd roachtest cluster go wraps monitor task failed wraps attached stack trace stack trace main init home agent work go src github com cockroachdb cockroach pkg cmd roachtest cluster go runtime doinit usr local go src runtime proc go runtime main usr local go src runtime proc go runtime goexit usr local go src runtime asm s wraps t fatal was called error types withstack withstack errutil withprefix withstack withstack 
errutil withprefix withstack withstack errutil leaferror more artifacts related roachtest enc false nodes batch failed powered by | 0 |
169,471 | 14,222,358,291 | IssuesEvent | 2020-11-17 16:47:25 | pulibrary/dspace-cli | https://api.github.com/repos/pulibrary/dspace-cli | closed | Define the best practices for deploying upgraded DSpace installations | documentation | This was created in response to a request to host a meeting addressing this from Atmire | 1.0 | Define the best practices for deploying upgraded DSpace installations - This was created in response to a request to host a meeting addressing this from Atmire | non_priority | define the best practices for deploying upgraded dspace installations this was created in response to a request to host a meeting addressing this from atmire | 0 |
98,671 | 16,387,800,066 | IssuesEvent | 2021-05-17 12:48:26 | fitzinbox/Exomiser | https://api.github.com/repos/fitzinbox/Exomiser | opened | CVE-2019-0221 (Medium) detected in tomcat-embed-core-9.0.16.jar | security vulnerability | ## CVE-2019-0221 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tomcat-embed-core-9.0.16.jar</b></p></summary>
<p>Core Tomcat implementation</p>
<p>Path to dependency file: Exomiser/exomiser-web/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/apache/tomcat/embed/tomcat-embed-core/9.0.16/tomcat-embed-core-9.0.16.jar,/home/wss-scanner/.m2/repository/org/apache/tomcat/embed/tomcat-embed-core/9.0.16/tomcat-embed-core-9.0.16.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-2.1.3.RELEASE.jar (Root Library)
- spring-boot-starter-tomcat-2.1.3.RELEASE.jar
- :x: **tomcat-embed-core-9.0.16.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/fitzinbox/Exomiser/commit/3a0ae5a0b72ae7a7e59a638af862c28aa80dcdf6">3a0ae5a0b72ae7a7e59a638af862c28aa80dcdf6</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The SSI printenv command in Apache Tomcat 9.0.0.M1 to 9.0.0.17, 8.5.0 to 8.5.39 and 7.0.0 to 7.0.93 echoes user provided data without escaping and is, therefore, vulnerable to XSS. SSI is disabled by default. The printenv command is intended for debugging and is unlikely to be present in a production website.
<p>Publish Date: 2019-05-28
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-0221>CVE-2019-0221</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-0221">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-0221</a></p>
<p>Release Date: 2019-05-28</p>
<p>Fix Resolution: 9.0.0.18,8.5.40,7.0.94</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2019-0221 (Medium) detected in tomcat-embed-core-9.0.16.jar - ## CVE-2019-0221 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tomcat-embed-core-9.0.16.jar</b></p></summary>
<p>Core Tomcat implementation</p>
<p>Path to dependency file: Exomiser/exomiser-web/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/apache/tomcat/embed/tomcat-embed-core/9.0.16/tomcat-embed-core-9.0.16.jar,/home/wss-scanner/.m2/repository/org/apache/tomcat/embed/tomcat-embed-core/9.0.16/tomcat-embed-core-9.0.16.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-2.1.3.RELEASE.jar (Root Library)
- spring-boot-starter-tomcat-2.1.3.RELEASE.jar
- :x: **tomcat-embed-core-9.0.16.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/fitzinbox/Exomiser/commit/3a0ae5a0b72ae7a7e59a638af862c28aa80dcdf6">3a0ae5a0b72ae7a7e59a638af862c28aa80dcdf6</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The SSI printenv command in Apache Tomcat 9.0.0.M1 to 9.0.0.17, 8.5.0 to 8.5.39 and 7.0.0 to 7.0.93 echoes user provided data without escaping and is, therefore, vulnerable to XSS. SSI is disabled by default. The printenv command is intended for debugging and is unlikely to be present in a production website.
<p>Publish Date: 2019-05-28
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-0221>CVE-2019-0221</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-0221">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-0221</a></p>
<p>Release Date: 2019-05-28</p>
<p>Fix Resolution: 9.0.0.18,8.5.40,7.0.94</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve medium detected in tomcat embed core jar cve medium severity vulnerability vulnerable library tomcat embed core jar core tomcat implementation path to dependency file exomiser exomiser web pom xml path to vulnerable library home wss scanner repository org apache tomcat embed tomcat embed core tomcat embed core jar home wss scanner repository org apache tomcat embed tomcat embed core tomcat embed core jar dependency hierarchy spring boot starter web release jar root library spring boot starter tomcat release jar x tomcat embed core jar vulnerable library found in head commit a href found in base branch master vulnerability details the ssi printenv command in apache tomcat to to and to echoes user provided data without escaping and is therefore vulnerable to xss ssi is disabled by default the printenv command is intended for debugging and is unlikely to be present in a production website publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource | 0 |
28,634 | 23,408,260,361 | IssuesEvent | 2022-08-12 14:49:31 | solidjs/solid-docs-next | https://api.github.com/repos/solidjs/solid-docs-next | closed | Improved Nav Bar | infrastructure | - Ability to nest "indefinitely"
- The root of a nest must be a page
- Divider lines with titles can be placed anywhere
- Mobile view that doesn't expand the entire TOC | 1.0 | Improved Nav Bar - - Ability to nest "indefinitely"
- The root of a nest must be a page
- Divider lines with titles can be placed anywhere
- Mobile view that doesn't expand the entire TOC | non_priority | improved nav bar ability to nest indefinitely the root of a nest must be a page divider lines with titles can be placed anywhere mobile view that doesn t expand the entire toc | 0 |
20,597 | 4,578,922,998 | IssuesEvent | 2016-09-18 00:28:31 | nozavroni/csvelte | https://api.github.com/repos/nozavroni/csvelte | opened | Implement a ``streamize()`` function | documentation feature testing | This simply pulls streamize() out of the Stream class and puts it in its own namespaced function so instead of this:
```php
$stream = Stream::streamize("all your base are belong to us");
```
you do this
```php
$stream = streamize("all your base are belong to us");
```
Also, add support for accepting anything that implements ``Iterator`` by instantiating an ``IO\IteratorStream`` object. | 1.0 | Implement a ``streamize()`` function - This simply pulls streamize() out of the Stream class and puts it in its own namespaced function so instead of this:
```php
$stream = Stream::streamize("all your base are belong to us");
```
you do this
```php
$stream = streamize("all your base are belong to us");
```
Also, add support for accepting anything that implements ``Iterator`` by instantiating an ``IO\IteratorStream`` object. | non_priority | implement a streamize function this simply pulls streamize out of the stream class and puts it in its own namespaced function so instead of this php stream stream streamize all your base are belong to us you do this php stream streamize all your base are belong to us also add support for accepting anything that implements iterator by instantiating an io iteratorstream object | 0 |
20,886 | 16,133,273,694 | IssuesEvent | 2021-04-29 08:32:12 | ClickHouse/ClickHouse | https://api.github.com/repos/ClickHouse/ClickHouse | closed | Usability of clickhouse-benchmark: allow to pass query with --query parameter instead of stdin. | easy task usability | `clickhouse-benchmark --query "..."`
To make it more similar to `clickhouse-client`. | True | Usability of clickhouse-benchmark: allow to pass query with --query parameter instead of stdin. - `clickhouse-benchmark --query "..."`
To make it more similar to `clickhouse-client`. | non_priority | usability of clickhouse benchmark allow to pass query with query parameter instead of stdin clickhouse benchmark query to make it more similar to clickhouse client | 0 |
229,904 | 18,452,222,863 | IssuesEvent | 2021-10-15 12:18:04 | WordPress/twentytwentytwo | https://api.github.com/repos/WordPress/twentytwentytwo | closed | Query pagination appears incorrectly positioned | [Type] Bug [Status] Needs Testing | The default query pagination appears to behave very oddly. Lots of misaligned items:
<img width="1286" alt="Screen Shot 2021-10-13 at 12 43 57 PM" src="https://user-images.githubusercontent.com/1202812/137177430-04b8d77b-35bf-4527-ac99-069389988801.png">
<img width="1178" alt="Screen Shot 2021-10-13 at 12 44 04 PM" src="https://user-images.githubusercontent.com/1202812/137177434-78930d59-cd70-4a05-a3bc-0f22b663b018.png">
<img width="1092" alt="Screen Shot 2021-10-13 at 12 44 12 PM" src="https://user-images.githubusercontent.com/1202812/137177436-6b6a424e-f46f-4fc5-9fee-635355695ab8.png">
Here's how that's supposed to look, according to the comps:
<img width="1074" alt="Screen Shot 2021-10-13 at 12 48 09 PM" src="https://user-images.githubusercontent.com/1202812/137177726-1d81629c-6bf2-4ac7-9d7c-c4f881f66238.png">
| 1.0 | Query pagination appears incorrectly positioned - The default query pagination appears to behave very oddly. Lots of misaligned items:
<img width="1286" alt="Screen Shot 2021-10-13 at 12 43 57 PM" src="https://user-images.githubusercontent.com/1202812/137177430-04b8d77b-35bf-4527-ac99-069389988801.png">
<img width="1178" alt="Screen Shot 2021-10-13 at 12 44 04 PM" src="https://user-images.githubusercontent.com/1202812/137177434-78930d59-cd70-4a05-a3bc-0f22b663b018.png">
<img width="1092" alt="Screen Shot 2021-10-13 at 12 44 12 PM" src="https://user-images.githubusercontent.com/1202812/137177436-6b6a424e-f46f-4fc5-9fee-635355695ab8.png">
Here's how that's supposed to look, according to the comps:
<img width="1074" alt="Screen Shot 2021-10-13 at 12 48 09 PM" src="https://user-images.githubusercontent.com/1202812/137177726-1d81629c-6bf2-4ac7-9d7c-c4f881f66238.png">
| non_priority | query pagination appears incorrectly positioned the default query pagination appears to behave very oddly lots of misaligned items img width alt screen shot at pm src img width alt screen shot at pm src img width alt screen shot at pm src here s how that s supposed to look according to the comps img width alt screen shot at pm src | 0 |
337,106 | 30,239,643,447 | IssuesEvent | 2023-07-06 12:43:56 | open-telemetry/opentelemetry-collector-contrib | https://api.github.com/repos/open-telemetry/opentelemetry-collector-contrib | closed | [receiver/riak] Flaky integration test: TestRiakIntegration | bug flaky test receiver/riak | ### Component(s)
receiver/riak
### What happened?
Flaky test: https://github.com/open-telemetry/opentelemetry-collector-contrib/actions/runs/3907699006/jobs/6677193729
```
=== RUN TestRiakIntegration
2023/01/13 02:15:26 github.com/testcontainers/testcontainers-go - Connected to docker:
Server Version: 20.10.22+azure-1
API Version: 1.41
Operating System: Ubuntu 22.04.1 LTS
Total Memory: 6943 MB
2023/01/13 02:15:26 Starting container id: 3ed94d340f42 image: docker.io/testcontainers/ryuk:0.3.4
2023/01/13 02:15:26 Waiting for container id 3ed94d340f42 image: docker.io/testcontainers/ryuk:0.3.4
2023/01/13 02:15:26 Container is ready id: 3ed94d340f42 image: docker.io/testcontainers/ryuk:0.3.4
integration_test.go:56:
Error Trace: /home/runner/work/opentelemetry-collector-contrib/opentelemetry-collector-contrib/receiver/riakreceiver/integration_test.go:56
/home/runner/work/opentelemetry-collector-contrib/opentelemetry-collector-contrib/receiver/riakreceiver/integration_test.go:62
Error: Received unexpected error:
Error response from daemon: No such image: a21c129d-83b2-4b1d-8263-f3ba560e76f9:cc5a9dce-dfaa-418e-8320-45d6d6f277dd: failed to create container
Test: TestRiakIntegration
--- FAIL: TestRiakIntegration (51.11s)
```
### Collector version
0.69.0
### Environment information
## Environment
OS: (e.g., "Ubuntu 20.04")
Compiler(if manually compiled): (e.g., "go 14.2")
### OpenTelemetry Collector configuration
_No response_
### Log output
_No response_
### Additional context
_No response_ | 1.0 | [receiver/riak] Flaky integration test: TestRiakIntegration - ### Component(s)
receiver/riak
### What happened?
Flaky test: https://github.com/open-telemetry/opentelemetry-collector-contrib/actions/runs/3907699006/jobs/6677193729
```
=== RUN TestRiakIntegration
2023/01/13 02:15:26 github.com/testcontainers/testcontainers-go - Connected to docker:
Server Version: 20.10.22+azure-1
API Version: 1.41
Operating System: Ubuntu 22.04.1 LTS
Total Memory: 6943 MB
2023/01/13 02:15:26 Starting container id: 3ed94d340f42 image: docker.io/testcontainers/ryuk:0.3.4
2023/01/13 02:15:26 Waiting for container id 3ed94d340f42 image: docker.io/testcontainers/ryuk:0.3.4
2023/01/13 02:15:26 Container is ready id: 3ed94d340f42 image: docker.io/testcontainers/ryuk:0.3.4
integration_test.go:56:
Error Trace: /home/runner/work/opentelemetry-collector-contrib/opentelemetry-collector-contrib/receiver/riakreceiver/integration_test.go:56
/home/runner/work/opentelemetry-collector-contrib/opentelemetry-collector-contrib/receiver/riakreceiver/integration_test.go:62
Error: Received unexpected error:
Error response from daemon: No such image: a21c129d-83b2-4b1d-8263-f3ba560e76f9:cc5a9dce-dfaa-418e-8320-45d6d6f277dd: failed to create container
Test: TestRiakIntegration
--- FAIL: TestRiakIntegration (51.11s)
```
### Collector version
0.69.0
### Environment information
## Environment
OS: (e.g., "Ubuntu 20.04")
Compiler(if manually compiled): (e.g., "go 14.2")
### OpenTelemetry Collector configuration
_No response_
### Log output
_No response_
### Additional context
_No response_ | non_priority | flaky integration test testriakintegration component s receiver riak what happened flaky test run testriakintegration github com testcontainers testcontainers go connected to docker server version azure api version operating system ubuntu lts total memory mb starting container id image docker io testcontainers ryuk waiting for container id image docker io testcontainers ryuk container is ready id image docker io testcontainers ryuk integration test go error trace home runner work opentelemetry collector contrib opentelemetry collector contrib receiver riakreceiver integration test go home runner work opentelemetry collector contrib opentelemetry collector contrib receiver riakreceiver integration test go error received unexpected error error response from daemon no such image dfaa failed to create container test testriakintegration fail testriakintegration collector version environment information environment os e g ubuntu compiler if manually compiled e g go opentelemetry collector configuration no response log output no response additional context no response | 0 |
161,196 | 20,120,447,042 | IssuesEvent | 2022-02-08 01:19:59 | Killy85/game_ai_trainer | https://api.github.com/repos/Killy85/game_ai_trainer | opened | CVE-2022-23557 (Medium) detected in tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl | security vulnerability | ## CVE-2022-23557 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl</b></p></summary>
<p>TensorFlow is an open source machine learning framework for everyone.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/d2/ea/ab2c8c0e81bd051cc1180b104c75a865ab0fc66c89be992c4b20bbf6d624/tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl">https://files.pythonhosted.org/packages/d2/ea/ab2c8c0e81bd051cc1180b104c75a865ab0fc66c89be992c4b20bbf6d624/tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl</a></p>
<p>
Dependency Hierarchy:
- :x: **tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Tensorflow is an Open Source Machine Learning Framework. An attacker can craft a TFLite model that would trigger a division by zero in `BiasAndClamp` implementation. There is no check that the `bias_size` is non zero. The fix will be included in TensorFlow 2.8.0. We will also cherrypick this commit on TensorFlow 2.7.1, TensorFlow 2.6.3, and TensorFlow 2.5.3, as these are also affected and still in supported range.
<p>Publish Date: 2022-02-04
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-23557>CVE-2022-23557</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/tensorflow/tensorflow/security/advisories/GHSA-gf2j-f278-xh4v">https://github.com/tensorflow/tensorflow/security/advisories/GHSA-gf2j-f278-xh4v</a></p>
<p>Release Date: 2022-02-04</p>
<p>Fix Resolution: tensorflow - 2.5.3,2.6.3,2.7.1,2.8.0;tensorflow-cpu - 2.5.3,2.6.3,2.7.1,2.8.0;tensorflow-gpu - 2.5.3,2.6.3,2.7.1,2.8.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2022-23557 (Medium) detected in tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl - ## CVE-2022-23557 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl</b></p></summary>
<p>TensorFlow is an open source machine learning framework for everyone.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/d2/ea/ab2c8c0e81bd051cc1180b104c75a865ab0fc66c89be992c4b20bbf6d624/tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl">https://files.pythonhosted.org/packages/d2/ea/ab2c8c0e81bd051cc1180b104c75a865ab0fc66c89be992c4b20bbf6d624/tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl</a></p>
<p>
Dependency Hierarchy:
- :x: **tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Tensorflow is an Open Source Machine Learning Framework. An attacker can craft a TFLite model that would trigger a division by zero in `BiasAndClamp` implementation. There is no check that the `bias_size` is non zero. The fix will be included in TensorFlow 2.8.0. We will also cherrypick this commit on TensorFlow 2.7.1, TensorFlow 2.6.3, and TensorFlow 2.5.3, as these are also affected and still in supported range.
<p>Publish Date: 2022-02-04
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-23557>CVE-2022-23557</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/tensorflow/tensorflow/security/advisories/GHSA-gf2j-f278-xh4v">https://github.com/tensorflow/tensorflow/security/advisories/GHSA-gf2j-f278-xh4v</a></p>
<p>Release Date: 2022-02-04</p>
<p>Fix Resolution: tensorflow - 2.5.3,2.6.3,2.7.1,2.8.0;tensorflow-cpu - 2.5.3,2.6.3,2.7.1,2.8.0;tensorflow-gpu - 2.5.3,2.6.3,2.7.1,2.8.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve medium detected in tensorflow whl cve medium severity vulnerability vulnerable library tensorflow whl tensorflow is an open source machine learning framework for everyone library home page a href dependency hierarchy x tensorflow whl vulnerable library vulnerability details tensorflow is an open source machine learning framework an attacker can craft a tflite model that would trigger a division by zero in biasandclamp implementation there is no check that the bias size is non zero the fix will be included in tensorflow we will also cherrypick this commit on tensorflow tensorflow and tensorflow as these are also affected and still in supported range publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution tensorflow tensorflow cpu tensorflow gpu step up your open source security game with whitesource | 0 |
302,758 | 22,841,367,727 | IssuesEvent | 2022-07-12 22:23:05 | udistrital/financiera_documentacion | https://api.github.com/repos/udistrital/financiera_documentacion | opened | Pruebas procesos de plan de adquisiciones y necesidades | Documentation | Se realizan hoy 12/07/2022, pruebas a los siguientes subsistemas de kronos
**Presupuesto**
- Están creados los rubros presupuestales
- Están asignadas las apropiaciones de cada rubro
- Aparentemente la apropiación inicial no se encuentra balanceada de acuerdo a la alerta que muestra el sistema. Es preciso aclarar que en los dos primeros valores se refleja que están iguales, pero en la parte de abajo de acuerdo con la imagen presentan una diferencia. (esto ya se había observado en mesa de trabajo anterior)
- Se quizo revisar la apropiación para la vigencia 2022 pero nunca desplegó el listado

**Plan de adquisiciones**
- Se demora el cargue de los rubros para la selección correspondiente
- Se crean metas - Sin inconvenientes
- Se crean actividades - sin inconvenientes
- Se observa que los rubros presupuestales están creados
- Se observa que las fuentes de financiamiento están creadas
- Se observa que los productos (rubros del gasto) están creados
- Se crea un renglón con un rubro de funcionamiento
- Cuando se quiere seleccionar otro rubro, no se limpia el renglón sino trae el antes seleccionado
- Se crea un renglón con un rubro de inversión
- Se realiza la publicación sin inconvenientes
- Se genera el respectivo pdf - con inconsistencia (se compara el plan de adquisiciones publicado actualmente en la página web de la Universidad frente al pdf que se genera en Kronos, encontrando que hace falta la columna de "Fecha de estimación de presentación de las ofertas"
**Necesidades**
- Se genera una necesidad con un rubro de funcionamiento (con el renglón creado en el plan de adquisiciones) pero no fue posible aprobarla por que genera el siguiente error

- Se genera una necesidad con un rubro de inversión (con el renglón creado en el plan de adquisiciones) pero no fue posible aprobarla por que genera el siguiente error

A mi parecer hay un error en el tema de los saldos por que inicialmente me salió otra alerta en esta última aprobación que decía algo parecida a la imagen de funcionamiento, pero ya después salió que era error,
Por favor revisar las observaciones acá planteadas
@AlexFBP @DanKazuky @brayanpasa99
| 1.0 | Pruebas procesos de plan de adquisiciones y necesidades - Se realizan hoy 12/07/2022, pruebas a los siguientes subsistemas de kronos
**Presupuesto**
- Están creados los rubros presupuestales
- Están asignadas las apropiaciones de cada rubro
- Aparentemente la apropiación inicial no se encuentra balanceada de acuerdo a la alerta que muestra el sistema. Es preciso aclarar que en los dos primeros valores se refleja que están iguales, pero en la parte de abajo de acuerdo con la imagen presentan una diferencia. (esto ya se había observado en mesa de trabajo anterior)
- Se quizo revisar la apropiación para la vigencia 2022 pero nunca desplegó el listado

**Plan de adquisiciones**
- Se demora el cargue de los rubros para la selección correspondiente
- Se crean metas - Sin inconvenientes
- Se crean actividades - sin inconvenientes
- Se observa que los rubros presupuestales están creados
- Se observa que las fuentes de financiamiento están creadas
- Se observa que los productos (rubros del gasto) están creados
- Se crea un renglón con un rubro de funcionamiento
- Cuando se quiere seleccionar otro rubro, no se limpia el renglón sino trae el antes seleccionado
- Se crea un renglón con un rubro de inversión
- Se realiza la publicación sin inconvenientes
- Se genera el respectivo pdf - con inconsistencia (se compara el plan de adquisiciones publicado actualmente en la página web de la Universidad frente al pdf que se genera en Kronos, encontrando que hace falta la columna de "Fecha de estimación de presentación de las ofertas"
**Necesidades**
- Se genera una necesidad con un rubro de funcionamiento (con el renglón creado en el plan de adquisiciones) pero no fue posible aprobarla por que genera el siguiente error

- Se genera una necesidad con un rubro de inversión (con el renglón creado en el plan de adquisiciones) pero no fue posible aprobarla por que genera el siguiente error

A mi parecer hay un error en el tema de los saldos por que inicialmente me salió otra alerta en esta última aprobación que decía algo parecida a la imagen de funcionamiento, pero ya después salió que era error,
Por favor revisar las observaciones acá planteadas
@AlexFBP @DanKazuky @brayanpasa99
| non_priority | pruebas procesos de plan de adquisiciones y necesidades se realizan hoy pruebas a los siguientes subsistemas de kronos presupuesto están creados los rubros presupuestales están asignadas las apropiaciones de cada rubro aparentemente la apropiación inicial no se encuentra balanceada de acuerdo a la alerta que muestra el sistema es preciso aclarar que en los dos primeros valores se refleja que están iguales pero en la parte de abajo de acuerdo con la imagen presentan una diferencia esto ya se había observado en mesa de trabajo anterior se quizo revisar la apropiación para la vigencia pero nunca desplegó el listado plan de adquisiciones se demora el cargue de los rubros para la selección correspondiente se crean metas sin inconvenientes se crean actividades sin inconvenientes se observa que los rubros presupuestales están creados se observa que las fuentes de financiamiento están creadas se observa que los productos rubros del gasto están creados se crea un renglón con un rubro de funcionamiento cuando se quiere seleccionar otro rubro no se limpia el renglón sino trae el antes seleccionado se crea un renglón con un rubro de inversión se realiza la publicación sin inconvenientes se genera el respectivo pdf con inconsistencia se compara el plan de adquisiciones publicado actualmente en la página web de la universidad frente al pdf que se genera en kronos encontrando que hace falta la columna de fecha de estimación de presentación de las ofertas necesidades se genera una necesidad con un rubro de funcionamiento con el renglón creado en el plan de adquisiciones pero no fue posible aprobarla por que genera el siguiente error se genera una necesidad con un rubro de inversión con el renglón creado en el plan de adquisiciones pero no fue posible aprobarla por que genera el siguiente error a mi parecer hay un error en el tema de los saldos por que inicialmente me salió otra alerta en esta última aprobación que decía algo parecida a la imagen de funcionamiento 
pero ya después salió que era error por favor revisar las observaciones acá planteadas alexfbp dankazuky | 0 |
82,556 | 15,962,492,590 | IssuesEvent | 2021-04-16 01:29:05 | JuliaLang/julia | https://api.github.com/repos/JuliaLang/julia | closed | Feature request: isemoji function | unicode | Hi all
Has any thought been given to including an isemoji function, similar to isascii, which returns true if a given character is an emoji, or if a given string consists of only emojis or emojis joined by zero-width joiners?
I wrote one for my own use, which I've reproduced below. It runs fairly quickly without allocations, but I'm sure it could be improved.
```julia
const EMOJI_RANGES = [
0x1F600:0x1F64F, # Emoticons
0x1F300:0x1F5FF, # Misc Symbols and Pictographs
0x1F680:0x1F6FF, # Transport and Map
0x2600:0x26FF, # Misc symbols
0x2700:0x27BF, # Dingbats
0xFE00:0xFE0F, # Variation Selectors
];
const ZWJ = Char(0x200d) # Zero-width joiner
"""
isemoji(Union{AbstractChar, AbstractString}) -> Bool
Test whether a character is an emoji, or whether all elements in a given string are emoji. Includes identifying composite emoji.
"""
function isemoji(c::AbstractChar)
    u = UInt32(c)
    @inbounds for emojiset in EMOJI_RANGES
        u in emojiset && return true
    end
    return false
end
function isemoji(s::AbstractString)
    ZWJ_allowed = false
    s[end] == ZWJ && return false
    @inbounds for c in s
        if c == ZWJ
            !ZWJ_allowed && return false
            ZWJ_allowed = false
        else
            !isemoji(c) && return false
            ZWJ_allowed = true
        end
    end
    return true
end
``` | 1.0 | Feature request: isemoji function - Hi all
Has any thought been given to including an isemoji function, similar to isascii, which returns true if a given character is an emoji, or if a given string consists of only emojis or emojis joined by zero-width joiners?
I wrote one for my own use, which I've reproduced below. It runs fairly quickly without allocations, but I'm sure it could be improved.
```julia
const EMOJI_RANGES = [
    0x1F600:0x1F64F, # Emoticons
    0x1F300:0x1F5FF, # Misc Symbols and Pictographs
    0x1F680:0x1F6FF, # Transport and Map
    0x2600:0x26FF, # Misc symbols
    0x2700:0x27BF, # Dingbats
    0xFE00:0xFE0F, # Variation Selectors
];
const ZWJ = Char(0x200d) # Zero-width joiner
"""
isemoji(Union{AbstractChar, AbstractString}) -> Bool
Test whether a character is an emoji, or whether all elements in a given string are emoji. Includes identifying composite emoji.
"""
function isemoji(c::AbstractChar)
    u = UInt32(c)
    @inbounds for emojiset in EMOJI_RANGES
        u in emojiset && return true
    end
    return false
end
function isemoji(s::AbstractString)
    ZWJ_allowed = false
    s[end] == ZWJ && return false
    @inbounds for c in s
        if c == ZWJ
            !ZWJ_allowed && return false
            ZWJ_allowed = false
        else
            !isemoji(c) && return false
            ZWJ_allowed = true
        end
    end
    return true
end
``` | non_priority | feature request isemoji function hi all has any thought been given to including an isemoji function similar to isascii which returns true if a given character is an emoji or if a given string consists of only emojis or emojis joined by zero width joiners i wrote one for my own use which i ve reproduced below it runs fairly quickly without allocations but i m sure it could be improved julia const emoji ranges emoticons misc symbols and pictographs transport and map misc symbols dingbats variation selectors const zwj char zero width joiner isemoji union abstractchar abstractstring bool test whether a character is an emoji or whether all elements in a given string are emoji includes identifying composite emoji function isemoji c abstractchar u c inbounds for emojiset in emoji ranges u in emojiset return true end return false end function isemoji s abstractstring zwj allowed false s zwj return false inbounds for c in s if c zwj zwj allowed return false zwj allowed false else isemoji c return false zwj allowed true end end return true end | 0 |
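The range-check approach in the Julia snippet above ports directly to other languages. As a rough, hedged sketch (the ranges below are copied from the issue and are not an exhaustive emoji list), a Python equivalent might look like:

```python
# Rough Python port of the range-check logic from the Julia issue above.
# The ranges mirror the issue's EMOJI_RANGES and are illustrative only.
EMOJI_RANGES = [
    range(0x1F600, 0x1F650),  # Emoticons
    range(0x1F300, 0x1F600),  # Misc Symbols and Pictographs
    range(0x1F680, 0x1F700),  # Transport and Map
    range(0x2600, 0x2700),    # Misc symbols
    range(0x2700, 0x27C0),    # Dingbats
    range(0xFE00, 0xFE10),    # Variation Selectors
]
ZWJ = "\u200d"  # zero-width joiner


def char_is_emoji(c: str) -> bool:
    """True if the single character c falls in one of the ranges."""
    return any(ord(c) in r for r in EMOJI_RANGES)


def is_emoji(s: str) -> bool:
    """True if s is entirely emoji, allowing ZWJ only between emoji."""
    if not s or s[-1] == ZWJ:
        return False
    zwj_allowed = False
    for c in s:
        if c == ZWJ:
            if not zwj_allowed:
                return False
            zwj_allowed = False
        else:
            if not char_is_emoji(c):
                return False
            zwj_allowed = True
    return True
```

The two-state ZWJ walk matches the Julia version: a joiner is only legal between two emoji, never at the start or end of the string.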
44,017 | 13,047,865,329 | IssuesEvent | 2020-07-29 11:30:10 | rsoreq/zaproxy | https://api.github.com/repos/rsoreq/zaproxy | opened | CVE-2019-9518 (High) detected in http2-common-9.4.20.v20190813.jar | security vulnerability | ## CVE-2019-9518 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>http2-common-9.4.20.v20190813.jar</b></p></summary>
<p>The Eclipse Jetty Project</p>
<p>Library home page: <a href="https://eclipse.org/jetty/http2-parent/http2-common">https://eclipse.org/jetty/http2-parent/http2-common</a></p>
<p>Path to dependency file: /tmp/ws-scm/zaproxy</p>
<p>Path to vulnerable library: /tmp/ws-ua_20200729112444_WCAEYA/downloadResource_JMENZF/20200729112922/http2-common-9.4.20.v20190813.jar</p>
<p>
Dependency Hierarchy:
- wiremock-jre8-2.25.1.jar (Root Library)
- http2-server-9.4.20.v20190813.jar
- :x: **http2-common-9.4.20.v20190813.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/rsoreq/zaproxy/commit/faf0234fff2dbd2142cc463fc90d7e58bcf20cd0">faf0234fff2dbd2142cc463fc90d7e58bcf20cd0</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Some HTTP/2 implementations are vulnerable to a flood of empty frames, potentially leading to a denial of service. The attacker sends a stream of frames with an empty payload and without the end-of-stream flag. These frames can be DATA, HEADERS, CONTINUATION and/or PUSH_PROMISE. The peer spends time processing each frame disproportionate to attack bandwidth. This can consume excess CPU.
<p>Publish Date: 2019-08-13
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-9518>CVE-2019-9518</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://netty.io/news/2019/08/13/4-1-39-Final.html">https://netty.io/news/2019/08/13/4-1-39-Final.html</a></p>
<p>Release Date: 2019-08-13</p>
<p>Fix Resolution: io.netty:netty-codec-http2:4.1.39.Final,io.netty:netty-all:4.1.39.Final</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.eclipse.jetty.http2","packageName":"http2-common","packageVersion":"9.4.20.v20190813","isTransitiveDependency":true,"dependencyTree":"com.github.tomakehurst:wiremock-jre8:2.25.1;org.eclipse.jetty.http2:http2-server:9.4.20.v20190813;org.eclipse.jetty.http2:http2-common:9.4.20.v20190813","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-codec-http2:4.1.39.Final,io.netty:netty-all:4.1.39.Final"}],"vulnerabilityIdentifier":"CVE-2019-9518","vulnerabilityDetails":"Some HTTP/2 implementations are vulnerable to a flood of empty frames, potentially leading to a denial of service. The attacker sends a stream of frames with an empty payload and without the end-of-stream flag. These frames can be DATA, HEADERS, CONTINUATION and/or PUSH_PROMISE. The peer spends time processing each frame disproportionate to attack bandwidth. This can consume excess CPU.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-9518","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | True | CVE-2019-9518 (High) detected in http2-common-9.4.20.v20190813.jar - ## CVE-2019-9518 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>http2-common-9.4.20.v20190813.jar</b></p></summary>
<p>The Eclipse Jetty Project</p>
<p>Library home page: <a href="https://eclipse.org/jetty/http2-parent/http2-common">https://eclipse.org/jetty/http2-parent/http2-common</a></p>
<p>Path to dependency file: /tmp/ws-scm/zaproxy</p>
<p>Path to vulnerable library: /tmp/ws-ua_20200729112444_WCAEYA/downloadResource_JMENZF/20200729112922/http2-common-9.4.20.v20190813.jar</p>
<p>
Dependency Hierarchy:
- wiremock-jre8-2.25.1.jar (Root Library)
- http2-server-9.4.20.v20190813.jar
- :x: **http2-common-9.4.20.v20190813.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/rsoreq/zaproxy/commit/faf0234fff2dbd2142cc463fc90d7e58bcf20cd0">faf0234fff2dbd2142cc463fc90d7e58bcf20cd0</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Some HTTP/2 implementations are vulnerable to a flood of empty frames, potentially leading to a denial of service. The attacker sends a stream of frames with an empty payload and without the end-of-stream flag. These frames can be DATA, HEADERS, CONTINUATION and/or PUSH_PROMISE. The peer spends time processing each frame disproportionate to attack bandwidth. This can consume excess CPU.
<p>Publish Date: 2019-08-13
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-9518>CVE-2019-9518</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://netty.io/news/2019/08/13/4-1-39-Final.html">https://netty.io/news/2019/08/13/4-1-39-Final.html</a></p>
<p>Release Date: 2019-08-13</p>
<p>Fix Resolution: io.netty:netty-codec-http2:4.1.39.Final,io.netty:netty-all:4.1.39.Final</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.eclipse.jetty.http2","packageName":"http2-common","packageVersion":"9.4.20.v20190813","isTransitiveDependency":true,"dependencyTree":"com.github.tomakehurst:wiremock-jre8:2.25.1;org.eclipse.jetty.http2:http2-server:9.4.20.v20190813;org.eclipse.jetty.http2:http2-common:9.4.20.v20190813","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-codec-http2:4.1.39.Final,io.netty:netty-all:4.1.39.Final"}],"vulnerabilityIdentifier":"CVE-2019-9518","vulnerabilityDetails":"Some HTTP/2 implementations are vulnerable to a flood of empty frames, potentially leading to a denial of service. The attacker sends a stream of frames with an empty payload and without the end-of-stream flag. These frames can be DATA, HEADERS, CONTINUATION and/or PUSH_PROMISE. The peer spends time processing each frame disproportionate to attack bandwidth. This can consume excess CPU.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-9518","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | non_priority | cve high detected in common jar cve high severity vulnerability vulnerable library common jar the eclipse jetty project library home page a href path to dependency file tmp ws scm zaproxy path to vulnerable library tmp ws ua wcaeya downloadresource jmenzf common jar dependency hierarchy wiremock jar root library server jar x common jar vulnerable library found in head commit a href vulnerability details some http implementations are vulnerable to a flood of empty frames potentially leading to a denial of service the attacker sends a stream of frames with an empty payload and without the end of stream flag these frames can be data headers continuation and or push promise the peer 
spends time processing each frame disproportionate to attack bandwidth this can consume excess cpu publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution io netty netty codec final io netty netty all final isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails some http implementations are vulnerable to a flood of empty frames potentially leading to a denial of service the attacker sends a stream of frames with an empty payload and without the end of stream flag these frames can be data headers continuation and or push promise the peer spends time processing each frame disproportionate to attack bandwidth this can consume excess cpu vulnerabilityurl | 0 |
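The attack described in this CVE hinges on the fixed 9-octet HTTP/2 frame header (RFC 7540 §4.1): a 24-bit payload length, 8-bit type, 8-bit flags, and a 31-bit stream identifier. A hedged Python sketch of what a single empty DATA frame without the END_STREAM flag looks like on the wire (for understanding the report, not a working client):

```python
import struct

# HTTP/2 frame header per RFC 7540 section 4.1:
# 24-bit payload length, 8-bit type, 8-bit flags,
# 1 reserved bit + 31-bit stream identifier.
DATA_FRAME_TYPE = 0x0
END_STREAM_FLAG = 0x1


def h2_frame_header(length: int, ftype: int, flags: int, stream_id: int) -> bytes:
    """Serialize a 9-octet HTTP/2 frame header."""
    # Big-endian 32-bit length with the high byte dropped gives the 24-bit field.
    return struct.pack(">I", length)[1:] + struct.pack(
        ">BBI", ftype, flags, stream_id & 0x7FFFFFFF
    )


# An "empty frame" as described in the CVE: zero-length DATA payload,
# END_STREAM not set, so the stream never completes.
empty_data_frame = h2_frame_header(0, DATA_FRAME_TYPE, 0, stream_id=1)
```

Flooding a server with many such frames is what the advisory describes: each one costs the peer parsing work while carrying no data and never ending the stream.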
100,860 | 8,757,307,215 | IssuesEvent | 2018-12-14 20:46:28 | rust-lang-nursery/stdsimd | https://api.github.com/repos/rust-lang-nursery/stdsimd | closed | assert_instr does not work on some travis windows targets | A-testing | Panics with unimplemented - probably just need to either make them pick the right branch in stdsimd-test, or add support for whatever objdump they should use. | 1.0 | assert_instr does not work on some travis windows targets - Panics with unimplemented - probably just need to either make them pick the right branch in stdsimd-test, or add support for whatever objdump they should use. | non_priority | assert instr does not work on some travis windows targets panics with unimplemented probably just need to either make them pick the right branch in stdsimd test or add support for whatever objdump they should use | 0 |
84,576 | 16,518,482,904 | IssuesEvent | 2021-05-26 12:17:47 | ices-eg/DIG | https://api.github.com/repos/ices-eg/DIG | closed | Monitoring method | Approved-WithChanges Impact: low vocab: CodeValue | Monitoring method: need to add a “self-sampling/reporting” option to the vocabulary
Reason: last year, due to the pandemic, many surveys could not be conducted with an observer on board, so the fishermen were asked to do the bycatch monitoring themselves.
| 1.0 | Monitoring method - Monitoring method: need to add a “self-sampling/reporting” option to the vocabulary
Reason: last year, due to the pandemic, many surveys could not be conducted with an observer on board, so the fishermen were asked to do the bycatch monitoring themselves.
| non_priority | monitoring method monitoring method need to add a “self sampling reporting” option to the vocabulary reason last year due to the pandemic many surveys were not able to be conducted with a observer on board so it was requested to the fishermen to do the bycatch monitoring themselves | 0 |
91,280 | 8,302,317,257 | IssuesEvent | 2018-09-21 14:12:26 | DynamoRIO/dynamorio | https://api.github.com/repos/DynamoRIO/dynamorio | closed | Add full-state-checking detach tests | Component-Tests Type-Feature | In light of bugs like #3116, we should add detach tests that explicitly check that every piece of state is properly restored: every register, segment state, signal state. The current detach tests just ensure that execution continues without crashing, but we could mess up several registers and not crash these simple test apps.
For #3116 I am adding some signal state testing to existing detach tests. This issue covers writing new tests with threads that place specific values into each register and segment and check that they are restored. | 1.0 | Add full-state-checking detach tests - In light of bugs like #3116, we should add detach tests that explicitly check that every piece of state is properly restored: every register, segment state, signal state. The current detach tests just ensure that execution continues without crashing, but we could mess up several registers and not crash these simple test apps.
For #3116 I am adding some signal state testing to existing detach tests. This issue covers writing new tests with threads that place specific values into each register and segment and check that they are restored. | non_priority | add full state checking detach tests in light of bugs like we should add detach tests that explicitly check that every piece of state is properly restored every register segment state signal state the current detach tests just ensure that execution continues without crashing but we could mess up several registers and not crash these simple test apps for i am adding some signal state testing to existing detach tests this issue covers writing new tests with threads that place specific values into each register and segment and check that they are restored | 0 |
39,678 | 5,116,539,524 | IssuesEvent | 2017-01-07 04:50:15 | caseyg/knutepunkt2017 | https://api.github.com/repos/caseyg/knutepunkt2017 | opened | 4. Harviainen, A Tale of Knutepunkt Theorycrafting | design | Novel. Think about sidebars here, or maybe embrace the sparseness for this one? Could scale the novel ratio slightly bigger if margins feel too big. | 1.0 | 4. Harviainen, A Tale of Knutepunkt Theorycrafting - Novel. Think about sidebars here, or maybe embrace the sparseness for this one? Could scale the novel ratio slightly bigger if margins feel too big. | non_priority | harviainen a tale of knutepunkt theorycrafting novel think about sidebars here or maybe embrace the sparseness for this one could scale the novel ratio slightly bigger if margins feel too big | 0 |
439,973 | 30,724,397,729 | IssuesEvent | 2023-07-27 18:24:50 | kubeshop/tracetest | https://api.github.com/repos/kubeshop/tracetest | opened | document local environment variables injection in CLI | documentation enhancement | **Describe the enhancement you'd like to see**
Our CLI can substitute values into the definition file when you run `tracetest run test`. For example:
```yaml
type: Test
spec:
name: POST import pokemon
description: Import a pokemon using its ID
trigger:
type: http
httpRequest:
url: http://pokemon-demo.tracetest.io/pokemon/import
method: POST
headers:
- key: Content-Type
value: application/json
authentication:
type: apiKey
apiKey:
key: X-Key
# This is a reference to a local environment variable
# This is different than when we reference environments
# (soon-to-be variablesets), which we prefix with `env:`
value: ${POKEMON_APP_API_KEY}
in: header
body: '{ "id": 52 }'
```
Our CLI will detect the placeholder `${VAR_NAME}` and replace the string `${VAR_NAME}` with the content
of the environment variable with the same name. | 1.0 | document local environment variables injection in CLI - **Describe the enhancement you'd like to see**
Our CLI can substitute values into the definition file when you run `tracetest run test`. For example:
```yaml
type: Test
spec:
name: POST import pokemon
description: Import a pokemon using its ID
trigger:
type: http
httpRequest:
url: http://pokemon-demo.tracetest.io/pokemon/import
method: POST
headers:
- key: Content-Type
value: application/json
authentication:
type: apiKey
apiKey:
key: X-Key
# This is a reference to a local environment variable
# This is different than when we reference environments
# (soon-to-be variablesets), which we prefix with `env:`
value: ${POKEMON_APP_API_KEY}
in: header
body: '{ "id": 52 }'
```
Our CLI will detect the placeholder `${VAR_NAME}` and replace the string `${VAR_NAME}` with the content
of the environment variable with the same name. | non_priority | document local environment variables injection in cli describe the enhancement you d like to see our cli is able to replace values into the definition file when you run tracetest run test for example yaml type test spec name post import pokemon description import a pokemon using its id trigger type http httprequest url method post headers key content type value application json authentication type apikey apikey key x key this is a reference to a local environment variable this is different than when we reference environments soon to be variablesets which we prefix with env value pokemon app api key in header body id our cli will detect the placeholder var name and replace the string var name with the content of the environment variable with the same name | 0 |
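The substitution behavior described above can be illustrated with a short sketch. This is not the tracetest implementation — just the general `${VAR_NAME}` expansion pattern; the regex and the leave-unknown-placeholders-untouched fallback are assumptions:

```python
import os
import re

# Matches ${VAR_NAME} where VAR_NAME is a conventional env-var identifier.
_PLACEHOLDER = re.compile(r"\$\{([A-Za-z_][A-Za-z0-9_]*)\}")


def expand_env_placeholders(text: str, env=None) -> str:
    """Replace ${VAR_NAME} markers with values from the given environment.

    Unknown variables are left untouched here; the real CLI may behave
    differently (this fallback is an assumption).
    """
    env = os.environ if env is None else env

    def _sub(match: re.Match) -> str:
        name = match.group(1)
        return env.get(name, match.group(0))

    return _PLACEHOLDER.sub(_sub, text)
```

Against the definition file shown above, `expand_env_placeholders("value: ${POKEMON_APP_API_KEY}")` would inject the key from the local environment before the test is triggered.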
82,911 | 15,680,647,994 | IssuesEvent | 2021-03-25 03:23:19 | berviantoleo/indonesia-os-list | https://api.github.com/repos/berviantoleo/indonesia-os-list | opened | CVE-2021-23362 (Medium) detected in hosted-git-info-2.8.8.tgz | security vulnerability | ## CVE-2021-23362 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>hosted-git-info-2.8.8.tgz</b></p></summary>
<p>Provides metadata and conversions from repository urls for Github, Bitbucket and Gitlab</p>
<p>Library home page: <a href="https://registry.npmjs.org/hosted-git-info/-/hosted-git-info-2.8.8.tgz">https://registry.npmjs.org/hosted-git-info/-/hosted-git-info-2.8.8.tgz</a></p>
<p>Path to dependency file: indonesia-os-list/package.json</p>
<p>Path to vulnerable library: indonesia-os-list/node_modules/hosted-git-info/package.json</p>
<p>
Dependency Hierarchy:
- node-sass-4.14.1.tgz (Root Library)
- meow-3.7.0.tgz
- normalize-package-data-2.5.0.tgz
- :x: **hosted-git-info-2.8.8.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/berviantoleo/indonesia-os-list/commit/cf5452c8ad1574a90c507b6c753280672bd33af5">cf5452c8ad1574a90c507b6c753280672bd33af5</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The package hosted-git-info before 3.0.8 are vulnerable to Regular Expression Denial of Service (ReDoS) via shortcutMatch in fromUrl().
<p>Publish Date: 2021-03-23
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23362>CVE-2021-23362</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/hosted-git-info/releases/tag/v3.0.8">https://github.com/npm/hosted-git-info/releases/tag/v3.0.8</a></p>
<p>Release Date: 2021-03-23</p>
<p>Fix Resolution: hosted-git-info - 3.0.8</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2021-23362 (Medium) detected in hosted-git-info-2.8.8.tgz - ## CVE-2021-23362 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>hosted-git-info-2.8.8.tgz</b></p></summary>
<p>Provides metadata and conversions from repository urls for Github, Bitbucket and Gitlab</p>
<p>Library home page: <a href="https://registry.npmjs.org/hosted-git-info/-/hosted-git-info-2.8.8.tgz">https://registry.npmjs.org/hosted-git-info/-/hosted-git-info-2.8.8.tgz</a></p>
<p>Path to dependency file: indonesia-os-list/package.json</p>
<p>Path to vulnerable library: indonesia-os-list/node_modules/hosted-git-info/package.json</p>
<p>
Dependency Hierarchy:
- node-sass-4.14.1.tgz (Root Library)
- meow-3.7.0.tgz
- normalize-package-data-2.5.0.tgz
- :x: **hosted-git-info-2.8.8.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/berviantoleo/indonesia-os-list/commit/cf5452c8ad1574a90c507b6c753280672bd33af5">cf5452c8ad1574a90c507b6c753280672bd33af5</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The package hosted-git-info before 3.0.8 are vulnerable to Regular Expression Denial of Service (ReDoS) via shortcutMatch in fromUrl().
<p>Publish Date: 2021-03-23
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23362>CVE-2021-23362</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/hosted-git-info/releases/tag/v3.0.8">https://github.com/npm/hosted-git-info/releases/tag/v3.0.8</a></p>
<p>Release Date: 2021-03-23</p>
<p>Fix Resolution: hosted-git-info - 3.0.8</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve medium detected in hosted git info tgz cve medium severity vulnerability vulnerable library hosted git info tgz provides metadata and conversions from repository urls for github bitbucket and gitlab library home page a href path to dependency file indonesia os list package json path to vulnerable library indonesia os list node modules hosted git info package json dependency hierarchy node sass tgz root library meow tgz normalize package data tgz x hosted git info tgz vulnerable library found in head commit a href found in base branch master vulnerability details the package hosted git info before are vulnerable to regular expression denial of service redos via shortcutmatch in fromurl publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution hosted git info step up your open source security game with whitesource | 0 |
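A quick way to act on a report like this is to compare the installed version against the fixed version (3.0.8 per the advisory above). A minimal, hedged sketch of that comparison — it assumes plain `x.y.z` versions with no pre-release tags:

```python
def parse_semver(v: str) -> tuple:
    """Parse a plain x.y.z version string into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))


FIXED_VERSION = "3.0.8"  # fix resolution from the advisory above


def is_affected(installed: str) -> bool:
    """True if the installed hosted-git-info version predates the fix."""
    return parse_semver(installed) < parse_semver(FIXED_VERSION)
```

Tuple comparison handles the multi-digit components correctly (e.g. `2.8.8` sorts below `3.0.8`), which naive string comparison would not.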
254,706 | 19,268,228,118 | IssuesEvent | 2021-12-10 00:23:22 | Azure-Samples/modern-data-warehouse-dataops | https://api.github.com/repos/Azure-Samples/modern-data-warehouse-dataops | closed | Parking Sensor Synapse: Instructions to wire up github integration with Synapse is outdated. | documentation | ## Description
## What
- Readme doc on how to wire up github integration between Synapse and Data Factory is outdated.
## Acceptance Criteria
- [ ] Readme document is updated to reflect changes in the Github integration between Synapse and Data Factory. | 1.0 | Parking Sensor Synapse: Instructions to wire up github integration with Synapse is outdated. - ## Description
## What
- Readme doc on how to wire up github integration between Synapse and Data Factory is outdated.
## Acceptance Criteria
- [ ] Readme document is updated to reflect changes in the Github integration between Synapse and Data Factory. | non_priority | parking sensor synapse instructions to wire up github integration with synapse is outdated description what readme doc on how to wire up github integration between synapse and data factory is outdated acceptance criteria readme document is updated to reflect changes in the github integration between synapse and data factory | 0 |
108,190 | 13,566,523,179 | IssuesEvent | 2020-09-18 13:24:45 | department-of-veterans-affairs/va.gov-cms | https://api.github.com/repos/department-of-veterans-affairs/va.gov-cms | closed | Build content listing pages in Drupal as "dashboards". | All products Content proofing Dashboards and UI Design Drupal engineering Epic Needs refining Research UX writing XL shirt ⭐️ CMS UX | ## User Story or Problem Statement
As a manager of a listing page (events, stories, news releases, health services, locations, leadership) i can see all my listed content *within the CMS*, published and unpublished, so that i can manage the content effectively.
### Hypothesis
We believe that custom task pages by content type and context, which loosely mimic the front end, will help content editors manage their content more effectively.
### Assumptions
That the information architecture used on the front end is also the the right IA for content managers, in most cases.
### Existing research
There may be some helpful data in the dashboard research performed in Summer 2019.
## Acceptance Criteria
- [x] On an event listing node, (eg /outreach-and-events/events, /pittsburgh-health-care/events),
- [x] i can see all events in my listing, including published/unpublished, past and future.
- [x] events are filtered to show only upcoming, by default, but i can filter to see all events
- [x] events are displayed in a table with these columns:
- [x] Rendered content teaser display mode, much like on the front end. (title, teaser, date)
- [x] Moderation state
- [x] Last updated
- [x] Owner
- [ ] On a stories listing node,
- [ ] i can see all stories.
- [ ] they are sorted by most recent, descending
- [ ] featured stories are pinned to the top
- [ ] stories are displayed in teaser display mode, which includes _TBD_fields_
- [ ] On a news releases listing node, i can see all news releases.
- [ ] they are sorted by most recent, descending
- [ ] featured news releases are pinned to the top
- [ ] news releases are displayed in teaser display mode, which includes _TBD_fields_
- [ ] On a locations listing node, i can see all facilities.
- [ ] they are sorted by "Main location" (boolean), descending, then by alphabetical, ASC.
- [ ] facilities are displayed in teaser display mode, which includes _TBD_fields_
- [ ] On a health services listing node, i can see all health services
- [ ] they are sorted and grouped by parent taxonomy term and alpha, [just like on the front end](https://www.va.gov/pittsburgh-health-care/health-services)
- [ ] Build on the work done in Proofing epic to display health service content
- [ ] health services are displayed in teaser display mode, which includes _TBD_fields_
- [ ] On a leadership listing page, i can see my leadership staff profiles
- [ ] they are sorted by the entity reference field for leadership.
- [ ] health services are displayed in teaser display mode, which includes _TBD_fields_
In each of these listings
- [ ] i can see some kind of "teaser" display mode (fields TBD), as well as some important metadata and actions, including
- [ ] moderation state
- [ ] "last edited by"?
- [ ] a subset of fields and actions seen on /admin/content, _TBD_fields_
- [ ] i can filter by
- [ ] moderation state
- [ ] _TBD_fields_
- [ ] I have a link to add new content of this type.
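Several of the listing behaviors in the criteria above boil down to the same sort: featured items pinned to the top, then newest first. An illustrative sketch of that ordering (the `featured` and `updated` keys are assumptions, not the actual Drupal field machine names):

```python
from datetime import date


def sort_listing(items: list) -> list:
    """Featured items pinned to the top, then newest first.

    Each item is a dict with 'featured' (bool) and 'updated' (date);
    these keys are illustrative, not real Drupal field names.
    """
    # reverse=True puts featured (True > False) first, and within each
    # group sorts dates descending, i.e. most recent first.
    return sorted(items, key=lambda i: (i["featured"], i["updated"]), reverse=True)
```

In a Drupal view this would typically be two sort criteria (featured flag descending, then post date descending) rather than code, but the resulting order is the same.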
## Research / Design
* Design review/crit events dashboard, then a small iteration
* Quick usability review of /pittsburgh-health-care/events with events managers in PGH
* Apply lessons to the other 5 types, which could lead to help text or other needs
## Implementation steps
* Figure out the page layout architecture. Layout builder? Panelizer? Panels/Page manager? etc.
| 1.0 | Build content listing pages in Drupal as "dashboards". - ## User Story or Problem Statement
As a manager of a listing page (events, stories, news releases, health services, locations, leadership) i can see all my listed content *within the CMS*, published and unpublished, so that i can manage the content effectively.
### Hypothesis
We believe that custom task pages by content type and context, which loosely mimic the front end, will help content editors manage their content more effectively.
### Assumptions
That the information architecture used on the front end is also the right IA for content managers, in most cases.
### Existing research
There may be some helpful data in the dashboard research performed in Summer 2019.
## Acceptance Criteria
- [x] On an event listing node, (eg /outreach-and-events/events, /pittsburgh-health-care/events),
- [x] i can see all events in my listing, including published/unpublished, past and future.
- [x] events are filtered to show only upcoming, by default, but i can filter to see all events
- [x] events are displayed in a table with these columns:
- [x] Rendered content teaser display mode, much like on the front end. (title, teaser, date)
- [x] Moderation state
- [x] Last updated
- [x] Owner
- [ ] On a stories listing node,
- [ ] i can see all stories.
- [ ] they are sorted by most recent, descending
- [ ] featured stories are pinned to the top
- [ ] stories are displayed in teaser display mode, which includes _TBD_fields_
- [ ] On a news releases listing node, i can see all news releases.
- [ ] they are sorted by most recent, descending
- [ ] featured news releases are pinned to the top
- [ ] news releases are displayed in teaser display mode, which includes _TBD_fields_
- [ ] On a locations listing node, i can see all facilities.
- [ ] they are sorted by "Main location" (boolean), descending, then by alphabetical, ASC.
- [ ] facilities are displayed in teaser display mode, which includes _TBD_fields_
- [ ] On a health services listing node, i can see all health services
- [ ] they are sorted and grouped by parent taxonomy term and alpha, [just like on the front end](https://www.va.gov/pittsburgh-health-care/health-services)
- [ ] Build on the work done in Proofing epic to display health service content
- [ ] health services are displayed in teaser display mode, which includes _TBD_fields_
- [ ] On a leadership listing page, i can see my leadership staff profiles
- [ ] they are sorted by the entity reference field for leadership.
- [ ] health services are displayed in teaser display mode, which includes _TBD_fields_
In each of these listings
- [ ] i can see some kind of "teaser" display mode (fields TBD), as well as some important metadata and actions, including
- [ ] moderation state
- [ ] "last edited by"?
- [ ] a subset of fields and actions seen on /admin/content, _TBD_fields_
- [ ] i can filter by
- [ ] moderation state
- [ ] _TBD_fields_
- [ ] I have a link to add new content of this type.
## Research/ Design
* Design review/crit events dashboard, then a small iteration
* Quick usability review of /pittsburgh-health-care/events with events managers in PGH
* Apply lessons to the other 5 types, which could lead to help text or other needs
## Implementation steps
* Figure out the page layout architecture. Layout builder? Panelizer? Panels/Page manager? etc.
| non_priority | build content listing pages in drupal as dashboards user story or problem statement as a manager of a listing page events stories news releases health services locations leadership i can see all my listed content within the cms published and unpublished so that i can manage the content effectively hypothesis we believe that custom task pages by content type and context which loosely mimic the front end will help content editors manage their content more effectively assumptions that the information architecture used on the front end is also the the right ia for content managers in most cases existing research there may be some helpful data in the dashboard research performed in summer acceptance criteria on an event listing node eg outreach and events events pittsburgh health care events i can see all events in my listing including published unpublished past and future events are filtered to show only upcoming by default but i can filter to see all events events are displayed in a table with these columns rendered content teaser display mode much like on the front end title teaser date moderation state last updated owner on a stories listing node i can see all stories they are sorted by most recent descending featured stories are pinned to the top stories are displayed in teaser display mode which includes tbd fields on a news releases listing node i can see all news releases they are sorted by most recent descending featured news releases are pinned to the top news releases are displayed in teaser display mode which includes tbd fields on a locations listing node i can see all facilities they are sorted by main location boolean descending then by alphabetical asc facilities are displayed in teaser display mode which includes tbd fields on a health services listing node i can see all health services they are sorted and grouped by parent taxonomy term and alpha build on the work done in proofing epic to display health service content health services 
are displayed in teaser display mode which includes tbd fields on a leadership listing page i can see my leadership staff profiles they are sorted by the entity reference field for leadership health services are displayed in teaser display mode which includes tbd fields in each of these listings i can see some kind of teaser display mode fields tbd as well as some important metadata and actions including moderation state last edited by a subset of fields and actions seen on admin content tbd fields i can filter by moderation state tbd fields i have a link to add new content of this type research design design review crit events dashboard then a small iteration quick usability review of pittsburgh health care events with events managers in pgh apply lessons to the other types which could lead to help text or other needs implementation steps figure out the page layout architecture layout builder panelizer panels page manager etc | 0 |
199,131 | 15,025,064,963 | IssuesEvent | 2021-02-01 20:31:48 | DPigeon/Money-Tree | https://api.github.com/repos/DPigeon/Money-Tree | closed | Search for a stock by name or symbol | acceptance test | As a user I want to be able to search for a particular stock by entering the name or symbol of the stock
**Acceptance Criteria**
- Able to view a list of stocks that correspond with the entered query (either by name or symbol)
- Able to view the particular stock when searching for it by name or symbol
- No result found if an invalid query is entered
<img width="402" alt="Screen Shot 2020-10-15 at 5 26 06 PM" src="https://user-images.githubusercontent.com/18631060/96187770-9263c100-0f0b-11eb-929c-6054a61547af.png">
<img width="576" alt="Screen Shot 2020-10-15 at 5 28 29 PM" src="https://user-images.githubusercontent.com/18631060/96188078-174eda80-0f0c-11eb-956d-052766e01351.png">
| 1.0 | Search for a stock by name or symbol - As a user I want to be able to search for a particular stock by entering the name or symbol of the stock
**Acceptance Criteria**
- Able to view a list of stocks that correspond with the entered query (either by name or symbol)
- Able to view the particular stock when searching for it by name or symbol
- No result found if an invalid query is entered
<img width="402" alt="Screen Shot 2020-10-15 at 5 26 06 PM" src="https://user-images.githubusercontent.com/18631060/96187770-9263c100-0f0b-11eb-929c-6054a61547af.png">
<img width="576" alt="Screen Shot 2020-10-15 at 5 28 29 PM" src="https://user-images.githubusercontent.com/18631060/96188078-174eda80-0f0c-11eb-956d-052766e01351.png">
| non_priority | search for a stock by name or symbol as a user i want to be able to search for a particular stock by entering the name or symbol of the stock acceptance criteria able to view a list of stocks that correspond with the entered query either by name or symbol able to view the particular stock when searching for it by name or symbol not result found if an invalid query is entered img width alt screen shot at pm src img width alt screen shot at pm src | 0 |
236,640 | 26,033,716,150 | IssuesEvent | 2022-12-22 01:09:55 | pazhanivel07/frameworks_av-CVE-2020-0242_CVE-2020-0243 | https://api.github.com/repos/pazhanivel07/frameworks_av-CVE-2020-0242_CVE-2020-0243 | closed | CVE-2017-0731 (High) detected in avandroid-10.0.0_r37, avandroid-10.0.0_r37 - autoclosed | security vulnerability | ## CVE-2017-0731 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>avandroid-10.0.0_r37</b>, <b>avandroid-10.0.0_r37</b></p></summary>
<p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A elevation of privilege vulnerability in the Android media framework (mpeg4 encoder). Product: Android. Versions: 4.4.4, 5.0.2, 5.1.1, 6.0, 6.0.1, 7.0, 7.1.1, 7.1.2. Android ID: A-36075363.
<p>Publish Date: 2017-08-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-0731>CVE-2017-0731</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://source.android.com/security/bulletin/2017-08-01">https://source.android.com/security/bulletin/2017-08-01</a></p>
<p>Release Date: 2017-08-09</p>
<p>Fix Resolution: android-7.1.2_r18</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2017-0731 (High) detected in avandroid-10.0.0_r37, avandroid-10.0.0_r37 - autoclosed - ## CVE-2017-0731 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>avandroid-10.0.0_r37</b>, <b>avandroid-10.0.0_r37</b></p></summary>
<p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A elevation of privilege vulnerability in the Android media framework (mpeg4 encoder). Product: Android. Versions: 4.4.4, 5.0.2, 5.1.1, 6.0, 6.0.1, 7.0, 7.1.1, 7.1.2. Android ID: A-36075363.
<p>Publish Date: 2017-08-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-0731>CVE-2017-0731</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://source.android.com/security/bulletin/2017-08-01">https://source.android.com/security/bulletin/2017-08-01</a></p>
<p>Release Date: 2017-08-09</p>
<p>Fix Resolution: android-7.1.2_r18</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in avandroid avandroid autoclosed cve high severity vulnerability vulnerable libraries avandroid avandroid vulnerability details a elevation of privilege vulnerability in the android media framework encoder product android versions android id a publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution android step up your open source security game with mend | 0 |
16,416 | 22,155,496,030 | IssuesEvent | 2022-06-03 22:03:43 | OneSignal/react-native-onesignal | https://api.github.com/repos/OneSignal/react-native-onesignal | closed | Unable to build on real device | Compatibility Issue | <!--
1. IF YOU DON'T FILL OUT THE FOLLOWING INFORMATION WE MAY CLOSE YOUR ISSUE WITHOUT INVESTIGATION
2. SEARCH EXISTING ISSUES FOR AN ANSWER: https://goo.gl/pspQNY
3. See our Common Issues documentation: https://goo.gl/BDcfZZ
4. See our contributing guidelines: https://goo.gl/h19DnX
-->
**Description:**
<!-- (write below this line) -->
Unable to run my app on a real device, but it runs fine on the simulator.
**Environment**
<!-- Example:
1. What version of the OneSignal React-Native SDK are you using?
2. How did you add the SDK to your project (eg. npm)
-->
"react-native": "0.63.4"
"react-native-onesignal": "^4.3.1",
**Steps to Reproduce Issue:**
<!--
Example:
1. Install the OneSignal SDK using npm into your project
2. Initialize the SDK in the iOS AppDelegate
3. Attempt to receive a push notification
(write below this line) -->
1. Connect real device
2. Run build
**Error Log:**
ld: building for iOS, but linking in dylib file (/Users/kheang/Library/Developer/Xcode/DerivedData/elevator-fqjpztlxeridfdgsnznclijzmwny/Build/Products/Release-iphoneos/OneSignal.framework/OneSignal) built for Mac Catalyst, file '/Users/kheang/Library/Developer/Xcode/DerivedData/elevator-fqjpztlxeridfdgsnznclijzmwny/Build/Products/Release-iphoneos/OneSignal.framework/OneSignal' for architecture arm64
<!--
SEARCH EXISTING ISSUES FOR AN ANSWER: https://goo.gl/pspQNY
-->
| True | Unable to build on real device - <!--
1. IF YOU DON'T FILL OUT THE FOLLOWING INFORMATION WE MAY CLOSE YOUR ISSUE WITHOUT INVESTIGATION
2. SEARCH EXISTING ISSUES FOR AN ANSWER: https://goo.gl/pspQNY
3. See our Common Issues documentation: https://goo.gl/BDcfZZ
4. See our contributing guidelines: https://goo.gl/h19DnX
-->
**Description:**
<!-- (write below this line) -->
Unable to run my app on a real device, but it runs fine on the simulator.
**Environment**
<!-- Example:
1. What version of the OneSignal React-Native SDK are you using?
2. How did you add the SDK to your project (eg. npm)
-->
"react-native": "0.63.4"
"react-native-onesignal": "^4.3.1",
**Steps to Reproduce Issue:**
<!--
Example:
1. Install the OneSignal SDK using npm into your project
2. Initialize the SDK in the iOS AppDelegate
3. Attempt to receive a push notification
(write below this line) -->
1. Connect real device
2. Run build
**Error Log:**
ld: building for iOS, but linking in dylib file (/Users/kheang/Library/Developer/Xcode/DerivedData/elevator-fqjpztlxeridfdgsnznclijzmwny/Build/Products/Release-iphoneos/OneSignal.framework/OneSignal) built for Mac Catalyst, file '/Users/kheang/Library/Developer/Xcode/DerivedData/elevator-fqjpztlxeridfdgsnznclijzmwny/Build/Products/Release-iphoneos/OneSignal.framework/OneSignal' for architecture arm64
<!--
SEARCH EXISTING ISSUES FOR AN ANSWER: https://goo.gl/pspQNY
-->
| non_priority | unable to build on real device if you don t fill out the following information we may close your issue without investigation search existing issues for an answer see our common issues documentation see our contributing guidelines description unable to run my app on real device but run fine on simulator environment example what version of the onesignal react native sdk are you using how did you add the sdk to your project eg npm react native react native onesignal steps to reproduce issue example install the onesignal sdk using npm into your project initialize the sdk in the ios appdelegate attempt to receive a push notification write below this line connect real device run build error log ld building for ios but linking in dylib file users kheang library developer xcode deriveddata elevator fqjpztlxeridfdgsnznclijzmwny build products release iphoneos onesignal framework onesignal built for mac catalyst file users kheang library developer xcode deriveddata elevator fqjpztlxeridfdgsnznclijzmwny build products release iphoneos onesignal framework onesignal for architecture search existing issues for an answer | 0 |
10,693 | 7,278,807,384 | IssuesEvent | 2018-02-22 00:43:03 | archesproject/arches | https://api.github.com/repos/archesproject/arches | closed | Improve loading speed of Card Manager page | DISCO - Task 8 - Bugs and performance | It took 10 seconds to load this page on my machine (using Chrome):
http://localhost:8000/graph/3af584c3-fd4d-11e6-9e3e-026d961c88e6/card_manager
The download payload is 8.8 MB. This seems excessive considering what is presented on the page.
As a point of reference, the menu page for this resource (http://localhost:8000/graph/3af584c3-fd4d-11e6-9e3e-026d961c88e6/form_manager) loads in 2.5 seconds and seems very similar in content payload to the card manager page. | True | Improve loading speed of Card Manager page - It took 10 seconds to load this page on my machine (using Chrome):
http://localhost:8000/graph/3af584c3-fd4d-11e6-9e3e-026d961c88e6/card_manager
The download payload is 8.8 Mb. This seems excessive considering what is presented on the page.
As a point of reference, the menu page for this resource (http://localhost:8000/graph/3af584c3-fd4d-11e6-9e3e-026d961c88e6/form_manager) loads in 2.5 seconds and seems very similar in content payload to the card manager page. | non_priority | improve loading speed of card manager page it took seconds to load this page on my machine using chrome the download payload is mb this seems excessive considering what is presented on the page as a point of reference the menu page for this resource loads in seconds and seems very similar in content payload to the card manager page | 0 |
167,941 | 26,569,636,416 | IssuesEvent | 2023-01-21 01:32:35 | BG-Tourism/frontend-nuxt | https://api.github.com/repos/BG-Tourism/frontend-nuxt | opened | [Design] Page: Place View - Reviews | area:design | ### Description
Should contain `title`, `description` elements. Also a `reviews list` containing a list of up to six `review card` elements. At the bottom - a `button` element **Tell your story** which opens a popup where the user will be able to leave their own review about the place.
A `review card` should contain:
- author photo
- author names
- rating points
- comment
- image _(optional)_
- date added
### Level of completion
- [ ] Wireframe
- [ ] Prototype
- [ ] Presentation
- [ ] Final (includes desktop and responsive versions)
### Related tasks
- #10
- | 1.0 | [Design] Page: Place View - Reviews - ### Description
Should contain `title`, `description` elements. Also a `reviews list` containing a list of up to six `review card` elements. At the bottom - a `button` element **Tell your story** which opens a popup where the user will be able to leave their own review about the place.
A `review card` should contain:
- author photo
- author names
- rating points
- comment
- image _(optional)_
- date added
### Level of completion
- [ ] Wireframe
- [ ] Prototype
- [ ] Presentation
- [ ] Final (includes desktop and responsive versions)
### Related tasks
- #10
- | non_priority | page place view reviews description should contain title description elements also a reviews list containing a list of up to six review card elements at the bottom a button element tell your story which opens a popup where the user will be able to leave their own review about the place a review card should contain author photo author names rating points comment image optional date added level of completion wireframe prototype presentation final includes desktop and responsive versions related tasks | 0 |
411,649 | 27,827,166,596 | IssuesEvent | 2023-03-19 22:09:00 | Dans-Plugins/dpc-website | https://api.github.com/repos/Dans-Plugins/dpc-website | closed | Write documentation on compiling and running the project. | documentation | Developers should be able to successfully compile & run
the project by following this documentation. | 1.0 | Write documentation on compiling and running the project. - Developers should be able to successfully compile & run
the project by following this documentation. | non_priority | write documentation on compiling and running the project developers should be able to successfully compile run the project by following this documentation | 0 |
290,476 | 21,881,466,011 | IssuesEvent | 2022-05-19 14:40:03 | Qiskit/qiskit-tutorials | https://api.github.com/repos/Qiskit/qiskit-tutorials | opened | Update documentation building section on README | enhancement documentation | Update documentation building section on README https://github.com/Qiskit/qiskit-tutorials#building-documentation with the newly added requirements-dev.txt https://github.com/Qiskit/qiskit-tutorials/pull/1323 and other non python dependencies (pandoc and graphviz) | 1.0 | Update documentation building section on README - Update documentation building section on README https://github.com/Qiskit/qiskit-tutorials#building-documentation with the newly added requirements-dev.txt https://github.com/Qiskit/qiskit-tutorials/pull/1323 and other non python dependencies (pandoc and graphviz) | non_priority | update documentation building section on readme update documentation building section on readme with the newly added requirements dev txt and other non python dependencies pandoc and graphviz | 0 |
157,591 | 13,697,323,985 | IssuesEvent | 2020-10-01 02:41:56 | BornRiot/Python.Udemy.Complete_Python_BootCamp | https://api.github.com/repos/BornRiot/Python.Udemy.Complete_Python_BootCamp | closed | Web Scraping - Exercise Overview | chore documentation enhancement good first issue help wanted | Complete lecture and code along exercise on topic: Web Scraping - Exercise Overview | 1.0 | Web Scraping - Exercise Overview - Complete lecture and code along exercise on topic: Web Scraping - Exercise Overview | non_priority | web scraping exercise overview complete lecture and code along exercise on topic web scraping exercise overview | 0 |
36,956 | 15,106,257,327 | IssuesEvent | 2021-02-08 14:05:33 | microsoft/botbuilder-python | https://api.github.com/repos/microsoft/botbuilder-python | closed | Python 81.skills-skilldialog throwing error: [on_turn_error] unhandled error: Cannot deserialize content-type: text/plain | Bot Services bug customer-replied-to customer-reported | ## Sample information
1. Sample type: \samples\
2. Sample language: python
3. Sample name: 81.skills-skilldialog
## Describe the bug
When you run the sample as per the instructions, the skill bot is throwing the following error:
======== Running on http://localhost:39783 ========
(Press CTRL+C to quit)
[on_turn_error] unhandled error: Cannot deserialize content-type: text/plain
Traceback (most recent call last):
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/core/bot_adapter.py", line 128, in run_pipeline
context, callback
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/core/middleware_set.py", line 69, in receive_activity_with_status
return await self.receive_activity_internal(context, callback)
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/core/middleware_set.py", line 79, in receive_activity_internal
return await callback(context)
File "/Users/tim/Documents/Sourcetree/BotBuilderSamples/samples/python/81.skills-skilldialog/dialog-skill-bot/bots/skill_bot.py", line 21, in on_turn
self._conversation_state.create_property("DialogState"),
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/dialogs/dialog_extensions.py", line 68, in run_dialog
result = await dialog_context.begin_dialog(dialog.id)
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/dialogs/dialog_context.py", line 91, in begin_dialog
return await dialog.begin_dialog(self, options)
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/dialogs/component_dialog.py", line 67, in begin_dialog
turn_result = await self.on_begin_dialog(inner_dc, options)
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/dialogs/component_dialog.py", line 221, in on_begin_dialog
return await inner_dc.begin_dialog(self.initial_dialog_id, options)
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/dialogs/dialog_context.py", line 91, in begin_dialog
return await dialog.begin_dialog(self, options)
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/dialogs/waterfall_dialog.py", line 65, in begin_dialog
return await self.run_step(dialog_context, 0, DialogReason.BeginCalled, None)
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/dialogs/waterfall_dialog.py", line 156, in run_step
return await self.on_step(step_context)
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/dialogs/waterfall_dialog.py", line 132, in on_step
return await self._steps[step_context.index](step_context)
File "/Users/tim/Documents/Sourcetree/BotBuilderSamples/samples/python/81.skills-skilldialog/dialog-skill-bot/dialogs/activity_router_dialog.py", line 50, in process_activity
return await self._on_event_activity(step_context)
File "/Users/tim/Documents/Sourcetree/BotBuilderSamples/samples/python/81.skills-skilldialog/dialog-skill-bot/dialogs/activity_router_dialog.py", line 77, in _on_event_activity
return await self._begin_get_weather(step_context)
File "/Users/tim/Documents/Sourcetree/BotBuilderSamples/samples/python/81.skills-skilldialog/dialog-skill-bot/dialogs/activity_router_dialog.py", line 156, in _begin_get_weather
get_weather_message, get_weather_message, InputHints.ignoring_input,
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/core/turn_context.py", line 174, in send_activity
result = await self.send_activities([activity_or_text])
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/core/turn_context.py", line 226, in send_activities
return await self._emit(self._on_send_activities, output, logic())
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/core/turn_context.py", line 304, in _emit
return await logic
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/core/turn_context.py", line 221, in logic
responses = await self.adapter.send_activities(self, output)
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/core/bot_framework_adapter.py", line 729, in send_activities
raise error
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/core/bot_framework_adapter.py", line 715, in send_activities
activity.conversation.id, activity.reply_to_id, activity
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botframework/connector/aio/operations_async/_conversations_operations_async.py", line 529, in reply_to_activity
request, stream=False, **operation_config
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/msrest/async_client.py", line 115, in async_send
pipeline_response = await self.config.pipeline.run(request, **kwargs)
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/msrest/pipeline/async_abc.py", line 159, in run
return await first_node.send(pipeline_request, **kwargs) # type: ignore
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/msrest/pipeline/async_abc.py", line 79, in send
response = await self.next.send(request, **kwargs) # type: ignore
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/msrest/pipeline/async_requests.py", line 106, in send
return await self.next.send(request, **kwargs)
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/msrest/pipeline/async_abc.py", line 84, in send
self._policy.on_response(request, response, **kwargs)
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/msrest/pipeline/universal.py", line 252, in on_response
http_response.headers
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/msrest/pipeline/universal.py", line 226, in deserialize_from_http_generics
return cls.deserialize_from_text(body_bytes, content_type)
File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/msrest/pipeline/universal.py", line 203, in deserialize_from_text
raise DeserializationError("Cannot deserialize content-type: {}".format(content_type))
msrest.exceptions.DeserializationError: Cannot deserialize content-type: text/plain
## To Reproduce
Steps to reproduce the behavior:
1. Run the root & skill bots as per the instructions from the sample readme
2. Start the bot framework emulator & connect
3. Choose the DialogSkillBot
4. Enter activity 3
## Expected behavior
Error not returned
| 1.0 | Python 81.skills-skilldialog throwing error: [on_turn_error] unhandled error: Cannot deserialize content-type: text/plain - ## Sample information
1. Sample type: \samples\
2. Sample language: python
3. Sample name: 81.skills-skilldialog
## Describe the bug
When you run the sample as per the instructions, the skill bot is throwing the following error:
```
======== Running on http://localhost:39783 ========
(Press CTRL+C to quit)
[on_turn_error] unhandled error: Cannot deserialize content-type: text/plain
Traceback (most recent call last):
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/core/bot_adapter.py", line 128, in run_pipeline
    context, callback
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/core/middleware_set.py", line 69, in receive_activity_with_status
    return await self.receive_activity_internal(context, callback)
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/core/middleware_set.py", line 79, in receive_activity_internal
    return await callback(context)
  File "/Users/tim/Documents/Sourcetree/BotBuilderSamples/samples/python/81.skills-skilldialog/dialog-skill-bot/bots/skill_bot.py", line 21, in on_turn
    self._conversation_state.create_property("DialogState"),
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/dialogs/dialog_extensions.py", line 68, in run_dialog
    result = await dialog_context.begin_dialog(dialog.id)
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/dialogs/dialog_context.py", line 91, in begin_dialog
    return await dialog.begin_dialog(self, options)
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/dialogs/component_dialog.py", line 67, in begin_dialog
    turn_result = await self.on_begin_dialog(inner_dc, options)
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/dialogs/component_dialog.py", line 221, in on_begin_dialog
    return await inner_dc.begin_dialog(self.initial_dialog_id, options)
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/dialogs/dialog_context.py", line 91, in begin_dialog
    return await dialog.begin_dialog(self, options)
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/dialogs/waterfall_dialog.py", line 65, in begin_dialog
    return await self.run_step(dialog_context, 0, DialogReason.BeginCalled, None)
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/dialogs/waterfall_dialog.py", line 156, in run_step
    return await self.on_step(step_context)
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/dialogs/waterfall_dialog.py", line 132, in on_step
    return await self._steps[step_context.index](step_context)
  File "/Users/tim/Documents/Sourcetree/BotBuilderSamples/samples/python/81.skills-skilldialog/dialog-skill-bot/dialogs/activity_router_dialog.py", line 50, in process_activity
    return await self._on_event_activity(step_context)
  File "/Users/tim/Documents/Sourcetree/BotBuilderSamples/samples/python/81.skills-skilldialog/dialog-skill-bot/dialogs/activity_router_dialog.py", line 77, in _on_event_activity
    return await self._begin_get_weather(step_context)
  File "/Users/tim/Documents/Sourcetree/BotBuilderSamples/samples/python/81.skills-skilldialog/dialog-skill-bot/dialogs/activity_router_dialog.py", line 156, in _begin_get_weather
    get_weather_message, get_weather_message, InputHints.ignoring_input,
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/core/turn_context.py", line 174, in send_activity
    result = await self.send_activities([activity_or_text])
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/core/turn_context.py", line 226, in send_activities
    return await self._emit(self._on_send_activities, output, logic())
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/core/turn_context.py", line 304, in _emit
    return await logic
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/core/turn_context.py", line 221, in logic
    responses = await self.adapter.send_activities(self, output)
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/core/bot_framework_adapter.py", line 729, in send_activities
    raise error
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botbuilder/core/bot_framework_adapter.py", line 715, in send_activities
    activity.conversation.id, activity.reply_to_id, activity
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/botframework/connector/aio/operations_async/_conversations_operations_async.py", line 529, in reply_to_activity
    request, stream=False, **operation_config
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/msrest/async_client.py", line 115, in async_send
    pipeline_response = await self.config.pipeline.run(request, **kwargs)
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/msrest/pipeline/async_abc.py", line 159, in run
    return await first_node.send(pipeline_request, **kwargs)  # type: ignore
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/msrest/pipeline/async_abc.py", line 79, in send
    response = await self.next.send(request, **kwargs)  # type: ignore
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/msrest/pipeline/async_requests.py", line 106, in send
    return await self.next.send(request, **kwargs)
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/msrest/pipeline/async_abc.py", line 84, in send
    self._policy.on_response(request, response, **kwargs)
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/msrest/pipeline/universal.py", line 252, in on_response
    http_response.headers
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/msrest/pipeline/universal.py", line 226, in deserialize_from_http_generics
    return cls.deserialize_from_text(body_bytes, content_type)
  File "/Users/tim/.pyenv/versions/bot379/lib/python3.7/site-packages/msrest/pipeline/universal.py", line 203, in deserialize_from_text
    raise DeserializationError("Cannot deserialize content-type: {}".format(content_type))
msrest.exceptions.DeserializationError: Cannot deserialize content-type: text/plain
```
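The last few frames point at msrest's content-type gate rather than at the bot code itself. A minimal sketch of that behavior, assuming nothing about msrest beyond what the traceback shows (the helper below is illustrative, not the library's source):

```python
import json

# Illustrative sketch (not msrest's actual source) of the failing check in
# msrest.pipeline.universal: only JSON/XML bodies are deserializable, so a
# text/plain error page from the service raises DeserializationError and
# masks the underlying HTTP failure.
class DeserializationError(Exception):
    pass

def deserialize_from_text(body, content_type):
    if content_type and "json" in content_type:
        return json.loads(body)
    if content_type and "xml" in content_type:
        raise NotImplementedError("XML handling elided in this sketch")
    raise DeserializationError(
        "Cannot deserialize content-type: {}".format(content_type))

# A plain-text body, e.g. from a proxy or an auth error, reproduces it:
try:
    deserialize_from_text("Unauthorized", "text/plain")
except DeserializationError as exc:
    print(exc)
```

In other words, the text/plain response is a symptom: some upstream hop returned a non-JSON error body, and the deserializer surfaces that instead of the real failure.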
## To Reproduce
Steps to reproduce the behavior:
1. Run the root & skill bots as per the instructions from the sample readme
2. Start the bot framework emulator & connect
3. Choose the DialogSkillBot
4. Enter activity 3
## Expected behavior
Error not returned
156,983 | 19,909,585,340 | IssuesEvent | 2022-01-25 15:55:59 | Recidiviz/covid19-dashboard | https://api.github.com/repos/Recidiviz/covid19-dashboard | closed | Security Alert - Package: trim; Severity: HIGH; Vuln ID: GHSA-w5p7-h5w8-2hfq | Subject: Security Subject: Vulnerability Severity: HIGH Due this month. |
---
due: 2022-02-24
---
A new vulnerability has been reported by Dependabot. The criticality of this vulnerability is HIGH.
HIGH vulnerabilities have an SLA of 30 days according to our policy.
Affected package: trim
Ecosystem: NPM
Affected version range: < 0.0.3
Summary: Regular Expression Denial of Service in trim
Description: All versions of package trim lower than 0.0.3 are vulnerable to Regular Expression Denial of Service (ReDoS) via trim().
identifiers: [{'type': 'GHSA', 'value': 'GHSA-w5p7-h5w8-2hfq'}, {'type': 'CVE', 'value': 'CVE-2020-7753'}]
Fixed Version: 0.0.3
Created Date = January 25, 2022
***Additional Context***
https://github.com/Recidiviz/covid19-dashboard/security/dependabot?q=is%3Aopen+sort%3Anewest
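The ReDoS class named in the advisory comes from nested quantifiers over whitespace: once the overall match is forced to fail, the engine tries exponentially many ways to split the run. The regex below is a classic illustration of that shape, not the exact pattern from `trim`:

```python
import re
import time

# Illustrative shape of the ReDoS described in the advisory. This is NOT
# the exact regex from the `trim` package: nested quantifiers like (\s+)+
# force the engine to try exponentially many ways to split a whitespace
# run once the overall match is doomed to fail.
evil = re.compile(r"(\s+)+x")

def probe(n):
    """Time one failing search against n whitespace characters."""
    s = " " * n + "!"  # no trailing 'x', so every split attempt fails
    start = time.perf_counter()
    assert evil.search(s) is None
    return time.perf_counter() - start

# Work roughly doubles per extra character; n around 25-30 can already
# hang a process. Built-ins such as str.strip() run in linear time.
print(probe(5), probe(18))
```

The fixed release replaces the pathological pattern; the general mitigation is to avoid backtracking regexes for trimming entirely and use linear-time built-ins.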
128,219 | 10,520,190,106 | IssuesEvent | 2019-09-29 23:27:05 | SpongePowered/SpongeCommon | https://api.github.com/repos/SpongePowered/SpongeCommon | closed | Canceling ChangeBlockEvent.Break causes block to lose custom data | status: needs testing system: data system: event type: bug version: 1.12 | **I am currently running**
<!-- If you don't use the latest version, please tell us why. -->
- SpongeVanilla version: 1.12.2-7.1.7-RC272
- Java version: Java(TM) SE Runtime Environment (build 1.8.0_181-b13)
- Plugins/Mods: Just my plugin
**Issue Description**
I have added custom data to a mobspawner block and it gets persisted properly.
But, when I setup an event listener of `ChangeBlockEvent.Break` that cancels the break of the block and have it cancel the event and mark all transactions with invalid, it restores the block but there's no data attached anymore.
With the help of the debugger, I was able to see that the data was there when I first attempted to break the block.
After attempting to break it a second time, it was no longer there and was just lost.
4,166 | 19,981,517,139 | IssuesEvent | 2022-01-30 00:43:43 | thumbor/thumbor-bootcamp | https://api.github.com/repos/thumbor/thumbor-bootcamp | opened | [Bootcamp Task] change thumbor main command to click | task L2 python maintainability | ## Areas of Expertise
Python
## Summary
Change the `thumbor` command to use [click](https://click.palletsprojects.com/en/8.0.x/).
## Involved Modules
* [thumbor](https://github.com/thumbor/thumbor/) - [server](https://github.com/thumbor/thumbor/blob/master/thumbor/server.py)
## Task Relevance
By upgrading the handling of the cli parts of thumbor to a more established library like click we ensure maintainability for the future.
## Further Details
Completing this task means the `thumbor` command in thumbor is handled by click and not by optparse (deprecated module in Python 3).
## How to complete this task?
To complete this task, follow this workflow:
1. [Fork the involved repositories](http://help.github.com/fork-a-repo/)
2. In each repository there's documentation on how to install dependencies and initialize your environment
3. Hack, in no particular order:
- Write code & tests
- Write new tests
- Write docs
- Improve design
- Check that all tests pass
- Repeat until you're satisfied
4. [Submit a pull request](https://docs.github.com/en/github/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/creating-a-pull-request).
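A minimal sketch of what the migrated entry point could look like with click; the option names, defaults, and help texts are illustrative, not thumbor's actual flags:

```python
import click

# Hedged sketch of the migration target: a thumbor-style entry point
# expressed with click instead of the deprecated optparse module.
@click.command()
@click.option("--port", "-p", default=8888, show_default=True,
              help="TCP port to listen on.")
@click.option("--conf", "-c", default=None,
              help="Path to a configuration file.")
@click.option("--log-level", "-l", default="warning", show_default=True,
              help="Logging verbosity.")
def main(port, conf, log_level):
    """Run the server."""
    click.echo(f"starting on port {port} (conf={conf}, log={log_level})")
```

click generates `--help` output, type conversion, and testability (via `click.testing.CliRunner`) for free, which is most of the motivation for leaving optparse behind.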
279,387 | 24,221,266,565 | IssuesEvent | 2022-09-26 11:05:09 | wpfoodmanager/wp-food-manager | https://api.github.com/repos/wpfoodmanager/wp-food-manager | closed | Display invalid listing message on the page | In Testing | Food Dashboard > Edit any food
Without making any change, click on the Save Changes button.
An invalid listing message is then displayed on the page.

152,720 | 13,465,453,335 | IssuesEvent | 2020-09-09 20:54:12 | golang/go | https://api.github.com/repos/golang/go | reopened | x/tools/gopls: autogenerate documentation for supported configurations | Documentation NeedsInvestigation Tools gopls help wanted | `gopls` has a number of configurations, some of which are meant for users to set manually. The rest are used primarily to "hide" experimental features and are only meant for use by `gopls` developers and testers.
We should formalize the set of `gopls` configurations that we want to make public for users, and we should consolidate these configurations with the settings of `gopls`'s clients, such as VSCode-Go, vim-go, govim, etc.
Current settings include:
```json5
"gopls": {
// user-facing
"env": {}, // the environment to use when loading packages
"buildFlags": [], // the build flags to use when loading packages
"usePlaceholders": true, // if the user wants placeholders when they complete an item
"hoverKind": "FullDocumentation", // the amount of documentation to show on hover, one of NoDocumentation, Synopsis, SingleLine, FullDocumentation
// experimental configurations
"wantCompletionDocumentation": true, // if the user wants to see documentation in completion items
"wantSuggestedFixes": true, // offer fixes to apply along with diagnostics
"experimentalDisabledAnalyses": [], // list of analyses to disable in diagnostics
"useDeepCompletions": true, // show completions for symbols in imported packages
}
```
We should add the settings used by clients to this issue to get a better understanding of how we can provide a more unified experience.
We should autogenerate this documentation from the comments in the internal/lsp/source/options.go.
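One way to sketch that autogeneration step is to scrape the `//` comments next to each field of the options struct and emit markdown. The struct below is a stand-in, not the real internal/lsp/source/options.go:

```python
import re

# Hedged sketch of the autogeneration idea: extract `Field Type // doc`
# comments from a Go options struct and print a markdown table row per
# setting. The field names below are invented for illustration.
GO_SRC = """
type Options struct {
    UsePlaceholders bool   // if the user wants placeholders on completion
    HoverKind       string // the amount of documentation to show on hover
}
"""

def extract_settings(src):
    pattern = re.compile(r"^\s*(\w+)\s+[\w\[\]\*\.]+\s*//\s*(.+)$", re.M)
    return [(m.group(1), m.group(2).strip()) for m in pattern.finditer(src)]

for name, doc in extract_settings(GO_SRC):
    print(f"| `{name}` | {doc} |")
```

A real generator would parse the Go AST instead of regexing source text, but the pipeline shape (read options.go, emit docs) is the same.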
35,320 | 30,987,823,729 | IssuesEvent | 2023-08-09 00:14:31 | grafana/agent | https://api.github.com/repos/grafana/agent | closed | Update github.com/prometheus-community/postgres_exporter to v0.13.2 | outdated-dependency type/infrastructure | An update for `github.com/prometheus-community/postgres_exporter` (version `v0.13.2`) is now available. Version `v0.10.0` is currently in use.
319,000 | 23,750,289,857 | IssuesEvent | 2022-08-31 19:56:03 | spacepy/spacepy | https://api.github.com/repos/spacepy/spacepy | closed | 'nan' for T96 external field model | documentation | <!--
Thank you for contributing to the SpacePy community by
taking the time to report a SpacePy issue. Please
describe the issue in detail, and for bug reports
fill in the fields below.
You can delete the sections that don't apply to your
issue. For example, if a feature is inadequately
described, simply delete all sections below and
describe how the documentation is lacking. If you
think you've found a bug that produces unwanted or
incorrect behavior then delete the "Error Message"
section and include a description of what the code
along with a description of what you think it should
do. For new feature requests, please describe the
feature and at least one possible use case (none of
the sections below would be required in that case).
You can view the final output by clicking the preview
button above.
-->
I am receiving 'nan' for all values corresponding to the 'Blocal' and 'Bvec' keys from spacepy.irbempy.get_Bfield() when I set the external magnetic field model to T96.
### My code:
<!--
If you place your code between the triple backticks below,
it will be marked as a code block automatically.
If possible, please provide a minimal example that succinctly
illustrate the issue.
-->
Here is my code:
```
import spacepy.time as spt
import spacepy.coordinates as spc
import spacepy.irbempy as ir
t = spt.Ticktock(['2003-10-29T07:00:00'],'ISO')
y = spc.Coords([2,0,0],'GEO','car')
print(ir.get_Bfield(t,y,extMag = 'T96'))
```
### Output:
<!-- If any, paste the *full* error message inside a code block
as above (starting from line Traceback)
-->
Here is the output:
```
{'Blocal': array([nan]), 'Bvec': array([[nan, nan, nan]])}
```
### OS, Python version, and dependency version information:
<!-- You can run the following and paste the result in a code
block.
```
import platform
import sys
import numpy
import scipy
import matplotlib
print(platform.platform())
print(sys.version_info)
print('numpy={0}'.format(numpy.__version__))
print('scipy={0}'.format(scipy.__version__))
print('matplotlib={0}'.format(matplotlib.__version__))
```
-->
Linux-3.10.0-1160.31.1.el7.x86_64-x86_64-with-glibc2.10
sys.version_info(major=3, minor=8, micro=5, releaselevel='final', serial=0)
numpy=1.19.2
scipy=1.5.2
matplotlib=3.3.2
### Version of SpacePy
<!-- What version of SpacePy are you using and where did you
download it from?
-->
My spacepy version is 0.2.2, and I cloned it from the git repository.
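For T96-class external models, NaN output is often a symptom of missing driver data (Dst, solar-wind dynamic pressure, IMF By/Bz) for the requested epoch rather than a bad position; that interpretation is an assumption worth verifying against the spacepy docs. A small, spacepy-independent guard makes the failure explicit instead of letting NaNs propagate:

```python
import math

# Small guard (not part of spacepy) that fails loudly when get_Bfield
# output contains NaN, instead of letting the NaNs propagate silently.
def check_bfield(result):
    bad = [i for i, v in enumerate(result['Blocal']) if math.isnan(v)]
    if bad:
        raise ValueError(
            f"NaN Blocal at indices {bad}; check the model's driver "
            "inputs (e.g. OMNI data availability) for those times")
    return result

# Shapes mirror spacepy's return value:
check_bfield({'Blocal': [31000.0], 'Bvec': [[1.0, 2.0, 3.0]]})
```

Wrapping calls this way at least localizes which epochs are missing inputs, which narrows the bug report.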
184,389 | 21,784,896,152 | IssuesEvent | 2022-05-14 01:43:11 | yhuangsh/50pm | https://api.github.com/repos/yhuangsh/50pm | closed | WS-2019-0019 Medium Severity Vulnerability detected by WhiteSource - autoclosed | security vulnerability | ## WS-2019-0019 - Medium Severity Vulnerability
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>braces-1.8.5.tgz</b></p></summary>
<p>Fastest brace expansion for node.js, with the most complete support for the Bash 4.3 braces specification.</p>
<p>Library home page: <a href="https://registry.npmjs.org/braces/-/braces-1.8.5.tgz">https://registry.npmjs.org/braces/-/braces-1.8.5.tgz</a></p>
<p>Path to dependency file: /50pm/frontend/50pm/package.json</p>
<p>Path to vulnerable library: /tmp/git/50pm/frontend/50pm/node_modules/braces/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-2.1.8.tgz (Root Library)
- jest-23.6.0.tgz
- jest-cli-23.6.0.tgz
- micromatch-2.3.11.tgz
- :x: **braces-1.8.5.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Version of braces prior to 2.3.1 are vulnerable to Regular Expression Denial of Service (ReDoS). Untrusted input may cause catastrophic backtracking while matching regular expressions. This can cause the application to be unresponsive leading to Denial of Service.
<p>Publish Date: 2019-03-25
<p>URL: <a href=https://www.npmjs.com/advisories/786>WS-2019-0019</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/786">https://www.npmjs.com/advisories/786</a></p>
<p>Release Date: 2019-02-21</p>
<p>Fix Resolution: 2.3.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isOpenPROnNewVersion":false,"isPackageBased":true,"packages":[{"packageType":"javascript/Node.js","packageName":"braces","packageVersion":"1.8.5","isTransitiveDependency":true,"dependencyTree":"react-scripts:2.1.8;jest:23.6.0;jest-cli:23.6.0;micromatch:2.3.11;braces:1.8.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.3.1"}],"vulnerabilityIdentifier":"WS-2019-0019","vulnerabilityDetails":"Version of braces prior to 2.3.1 are vulnerable to Regular Expression Denial of Service (ReDoS). Untrusted input may cause catastrophic backtracking while matching regular expressions. This can cause the application to be unresponsive leading to Denial of Service.","cvss2Severity":"medium","cvss2Score":"5.0","extraData":{}}</REMEDIATE> --> | True | non_priority | 0 |
121,570 | 25,992,900,763 | IssuesEvent | 2022-12-20 09:12:37 | Regalis11/Barotrauma | https://api.github.com/repos/Regalis11/Barotrauma | closed | Grid Maintainer repair speed bonus doesn't work | Bug Code Design | ### Disclaimers
- [X] I have searched the issue tracker to check if the issue has already been reported.
- [ ] My issue happened while using mods.
### What happened?
Repairing electrical devices with the talent Grid Maintainer doesn't seem to be any quicker than the usual repair speed.
### Reproduction steps
_No response_
### Bug prevalence
Happens every time I play
### Version
0.20.15.0
### -
_No response_
### Which operating system did you encounter this bug on?
Windows
### Relevant error messages and crash reports
_No response_ | 1.0 | non_priority | 0 |
189,816 | 15,207,565,842 | IssuesEvent | 2021-02-17 00:29:41 | SACGF/variantgrid | https://api.github.com/repos/SACGF/variantgrid | opened | Tips on loading screens | documentation | While grids are loading etc., we might as well display some useful tips or new features to bring to users' attention
Have a button show tips, so users can turn them off (can also change in user settings)
| 1.0 | non_priority | 0 |
75,578 | 20,883,197,630 | IssuesEvent | 2022-03-23 00:08:28 | spacetelescope/jwst | https://api.github.com/repos/spacetelescope/jwst | closed | 1.4.0 source_catalog | ins_build_tasks | _Issue [JP-2422](https://jira.stsci.edu/browse/JP-2422) was created on JIRA by [JWST PARSER](https://jira.stsci.edu/secure/ViewProfile.jspa?name=jpparser):_
## source\_catalog
* Fixed issue with non-finite positions for aperture photometry. [#⁠6206]
* Fixed the documentation for bkg\_boxsize to reflect that its data type should be integer. [#⁠6300]
* Renamed filter\_kernel to kernel in the call to detect\_sources to match the new name of the argument in photutils. [#⁠6527]
| 1.0 | non_priority | 0 |
283,442 | 24,547,922,664 | IssuesEvent | 2022-10-12 10:13:20 | Rdatatable/data.table | https://api.github.com/repos/Rdatatable/data.table | opened | encoding related test 2194.7 fails on a local machine | tests encoding | Looks like specific to my workstation, encoding related.
Following test, when run interactively, passes fine. But when running from `cc(T)` or `test.data.table()` or `R CMD check` it fails
```r
txt = readLines(testDir("issue_563_fread.txt"))
test(2194.7, endsWithAny(txt, 'B'), error="Internal error.*types or lengths incorrect") # txt is length 5
```
```
Running test id 2194.7 Test 1 produced 0 errors but expected 1
Expected: Internal error.*types or lengths incorrect
Observed:
```
Debugging leads me to the following
```r
txt = readLines(testDir("issue_563_fread.txt"))
print(txt)
print(endsWithAny(txt, 'B'))
test(1, endsWithAny(txt, 'B'), error="Internal error.*types or lengths incorrect") # txt is length 5
```
```
[1] "A,B"
[1] 1
Running test id 1 Test 1 produced 0 errors but expected 1
Expected: Internal error.*types or lengths incorrect
Observed:
...
In addition: Warning messages:
1: In readLines(testDir("issue_563_fread.txt")) :
invalid input found on input connection '~/git/data.table/inst/tests/issue_563_fread.txt'
```
locale and l10n are the same in both situations (failure and pass)
```
Wed Oct 12 11:06:49 2022 endian==little, sizeof(long double)==16, longdouble.digits==64, sizeof(pointer)==8, TZ==unset, Sys.timezone()=='Europe/Lisbon', Sys.getlocale()=='C', l10n_info()=='MBCS=FALSE; UTF-8=FALSE; Latin-1=FALSE; codeset=ANSI_X3.4-1968', getDTthreads()=='OpenMP version (_OPENMP)==201511; omp_get_num_procs()==8; R_DATATABLE_NUM_PROCS_PERCENT==unset (default 50); R_DATATABLE_NUM_THREADS==unset; R_DATATABLE_THROTTLE==unset (default 1024); omp_get_thread_limit()==2147483647; omp_get_max_threads()==8; OMP_THREAD_LIMIT==unset; OMP_NUM_THREADS==unset; RestoreAfterFork==true; data.table is using 4 threads with throttle==1024. See ?setDTthreads.', zlibVersion()==1.2.11 ZLIB_VERSION==1.2.11
``` | 1.0 | non_priority | 0 |
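The warning in the record above ("invalid input found on input connection") under a C/ANSI locale typically means the file contains bytes that are not valid in the encoding the reader assumes. A small Python sketch of the same failure mode — this is not data.table's actual code, just an illustration:

```python
raw = b"A,B\xb5"  # 0xB5 is a Latin-1 micro sign; it is not valid UTF-8

try:
    # Strict decoding fails, analogous to readLines() warning about invalid input
    raw.decode("utf-8")
except UnicodeDecodeError as exc:
    print("decode failed:", exc.reason)

# Declaring the encoding the file actually uses succeeds
print(raw.decode("latin-1"))  # -> A,Bµ
```

This is why the same test can pass interactively but fail under `R CMD check`: the two environments can start with different locale settings, changing which byte sequences count as valid input.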
18,614 | 25,921,255,252 | IssuesEvent | 2022-12-15 22:12:53 | Astropilot/ValheimTooler | https://api.github.com/repos/Astropilot/ValheimTooler | closed | Mistlands Update.. | accepted game-compatibility | Hi,
I know this is too much to ask haha! But are you going to update this for the mistlands patch?
Thanks! | True | non_priority | 0 |
77,863 | 14,924,923,229 | IssuesEvent | 2021-01-24 02:56:02 | pyqtgraph/pyqtgraph | https://api.github.com/repos/pyqtgraph/pyqtgraph | closed | [CI-fail] Segfault with pytest --cov | test-code | Moving some notes over from slack:
We're seeing a segmentation fault when running `pytest --cov pyqtgraph` both locally and in CI. It occurs right after the tests are completed (just before the coverage report is produced).
Environment:
- PySide2 5.12
- Python 3.7
- Either venv or conda
- pytest-cov 2.7
I noticed this occurs when using bash but *not* fish. @j9ac9k reported the segfault with zsh as well.
Running `coverage run -m pytest` does *not* result in a segfault.
`pytest-cov`'s README says this:
> Consistent pytest behavior. If you run `coverage run -m pytest` you will have slightly different `sys.path` (CWD will be in it, unlike when running `pytest`).
Edit: added pytest-cov version to list | 1.0 | non_priority | 0 |
6,040 | 2,806,287,783 | IssuesEvent | 2015-05-15 00:45:11 | rancherio/rancher | https://api.github.com/repos/rancherio/rancher | closed | Default container - ubuntu:14.04.2 fails to start | area/container area/ui bug status/blocker status/to-test | Server version - V0.20.1
Steps to reproduce the problem:
Start container with default image - ubuntu:14.04.2 without changing any other parameters.
Container starts and then stops.
This is because of the interactive & terminal options being defaulted to false. | 1.0 | non_priority | 0 |
183,091 | 14,928,458,737 | IssuesEvent | 2021-01-24 19:18:29 | danicianuro/IncidenciasCiberseguridad | https://api.github.com/repos/danicianuro/IncidenciasCiberseguridad | opened | Connection to the Metaexploid 2 machine | documentation | As a team task, you must all work together to get access to the virtual machine where Metaexploid 2 is installed from the Kali Linux machine. | 1.0 | non_priority | 0 |
101,818 | 8,799,423,440 | IssuesEvent | 2018-12-24 14:09:41 | status-im/status-react | https://api.github.com/repos/status-im/status-react | closed | Add sanity test suite for desktop app | desktop stale tests | [comment]: # (Please replace ... with your information. Remove < and >)
### Description
[comment]: # (Feature or Bug? i.e Type: Bug)
*Type*: Feature
[comment]: # (Describe the feature you would like, or briefly summarise the bug and what you did, what you expected to happen, and what actually happens. Sections below)
*Summary*: We need to define a sanity suite of tests to run against the desktop app
### Solution
[comment]: # (Please summarise the solution and provide a task list on what needs to be fixed.)
*Summary*:
- [x] Test suite is created in TestRail
- [ ] All tests from the suite are covered by e2e test
| 1.0 | non_priority | 0 |
365,275 | 25,527,051,973 | IssuesEvent | 2022-11-29 03:54:17 | AndrewCerveny/flashCards | https://api.github.com/repos/AndrewCerveny/flashCards | closed | Turn | documentation enhancement | - [x] Turn should be instantiated with two arguments.
- [x] 1. usersGuess, 2. a card object for the currentCard
Methods:
returnGuess()
- [x] - return the user's guess
returnCard()
- [x] - returns the current Card
EvaluateGuess()
- [x] returns boolean (true || false) if the users guess matches the correctAnswer for the card
giveFeedBack()
- [x] methods that returns either incorrect or correct based on whether the guess is correct or not. | 1.0 | non_priority | 0 |
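The checklist above reads as a class spec. A minimal Python sketch of that `Turn` class — the original project is JavaScript, so the names and the card-as-dict shape here are illustrative assumptions:

```python
class Turn:
    """One guess made against the current flash card."""

    def __init__(self, users_guess, current_card):
        # Instantiated with two arguments: the user's guess and the current card
        self.users_guess = users_guess
        self.current_card = current_card

    def return_guess(self):
        return self.users_guess

    def return_card(self):
        return self.current_card

    def evaluate_guess(self):
        # True when the guess matches the card's correct answer
        return self.users_guess == self.current_card["correctAnswer"]

    def give_feedback(self):
        return "correct!" if self.evaluate_guess() else "incorrect!"

card = {"id": 1, "question": "What is 2 + 2?", "answers": ["3", "4"], "correctAnswer": "4"}
turn = Turn("4", card)
print(turn.give_feedback())  # -> correct!
```

`give_feedback` delegates to `evaluate_guess` so the match logic lives in exactly one place.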
11,959 | 2,672,256,986 | IssuesEvent | 2015-03-24 13:12:22 | acardona/CATMAID | https://api.github.com/repos/acardona/CATMAID | closed | Circuit graph plot chooses wrong size | type: defect | The plots in the circuit graph widget resize to fill the widget size. However, they seem to choose a size slightly larger than will fit in the widget and thus will generate scroll bars. This is particularly a bother because the bottom scroll bar largely covers the actual numbers of the x-axis. | 1.0 | non_priority | 0 |
256,262 | 22,042,409,346 | IssuesEvent | 2022-05-29 15:00:30 | SAA-SDT/eac-cpf-schema | https://api.github.com/repos/SAA-SDT/eac-cpf-schema | closed | <targetEntity> | Element Best Practice Guide Tested by Schema Team | ## Relation Target Entity
- add new mandatory, not repeatable element `<targetEntity>` as child element of `<relation>`
- add mandatory and repeatable child element `<part>` to add the name or term of the related entity
- add **mandatory attribute**
`@targetType` (limited values: corporateBody, person, family, resource, function)
- add **optional attributes**
`@audience`
`@conventionDeclarationReference`
`@id`
`@languageOfElement`
`@maintenanceEventReference`
`@scriptOfElement`
`@sourceReference`
`@valueURI`
`@vocabularySource`
`@vocabularySourceURI`
## Creator of issue
1. Silke Jagodzinski
2. TS-EAS: EAC-CPF subgroup
3. silkejagodzinski@gmail.com
## Related issues / documents
[Paper on Relation](https://github.com/SAA-SDT/TS-EAS-subteam-notes/tree/master/eaccpf-subteam/working-documents/topics/relations)
## EAD3 Reconciliation
EAC-CPF specific element
## Context
new EAC-CPF element
- was `<relationEntry>` before
## Solution documentation: agreed solution for TL and guidelines
_Summary_, _Description and Usage_ and _Attribute usage_ needed
**May contain**: `<part>` (1..n)
**May occur within**: `<relation>`
**Attributes**:
`@audience` - optional (values limited to: external, internal)
`@conventionDeclarationReference` - optional
`@id` - optional
`@languageOfElement` - optional
`@maintenanceEventReference` - optional
`@targetType` - mandatory (values limited to: corporateBody, person, family, resource, function)
`@scriptOfElement` - optional
`@sourceReference` - optional
`@valueURI` - optional
`@vocabularySource` - optional
`@vocabularySourceURI` - optional
**Availability**: mandatory, not repeatable
- New or other example needed
- Topic for Best Practice Guide
## Example encoding
```
<relation>
<targetEntity audience="external" conventionDeclarationReference="conventiondeclaration1" id="targetEntity1" languageOfElement="en" maintenanceEventReference="maintenancevent1" scriptOfElement="lat" sourceReference="source1" valueURI="http://entity.uri" vocabularySource="sourceofentityvoc" vocabularySourceURI="http://sourceofentityvoc.org">
<part>name or part of the name or term of the related entity</part>
</targetEntity>
</relation>
``` | 1.0 | non_priority | 0 |
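The record above states two hard constraints on `<targetEntity>`: a mandatory `@targetType` with a closed value list, and at least one `<part>` child. A small sketch of checking those two rules with Python's stdlib `xml.etree.ElementTree` — this is not part of any official EAC-CPF tooling, just an illustration of the constraints:

```python
import xml.etree.ElementTree as ET

# Closed value list from the proposal above
ALLOWED_TARGET_TYPES = {"corporateBody", "person", "family", "resource", "function"}

def check_target_entity(xml_text):
    """Return a list of constraint violations for a <targetEntity> element."""
    elem = ET.fromstring(xml_text)
    problems = []
    target_type = elem.get("targetType")
    if target_type is None:
        problems.append("missing mandatory @targetType")
    elif target_type not in ALLOWED_TARGET_TYPES:
        problems.append("invalid @targetType value: %r" % target_type)
    if elem.find("part") is None:
        problems.append("at least one <part> child is required")
    return problems

print(check_target_entity('<targetEntity targetType="person"><part>Ada</part></targetEntity>'))  # -> []
print(check_target_entity('<targetEntity><part>Ada</part></targetEntity>'))  # -> ['missing mandatory @targetType']
```

In practice these constraints would live in the XSD/Schematron for the schema; an ad-hoc check like this is only useful for quick experiments with candidate encodings.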
204,068 | 23,203,424,736 | IssuesEvent | 2022-08-02 01:07:10 | jgeraigery/Singularity | https://api.github.com/repos/jgeraigery/Singularity | closed | CVE-2020-15168 (Medium) detected in node-fetch-1.7.3.tgz - autoclosed | security vulnerability | ## CVE-2020-15168 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-fetch-1.7.3.tgz</b></p></summary>
<p>A light-weight module that brings window.fetch to node.js and io.js</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-fetch/-/node-fetch-1.7.3.tgz">https://registry.npmjs.org/node-fetch/-/node-fetch-1.7.3.tgz</a></p>
<p>
Dependency Hierarchy:
- isomorphic-fetch-2.2.1.tgz (Root Library)
- :x: **node-fetch-1.7.3.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/jgeraigery/Singularity/commit/385a9096f36f0aad5c901ba5dc79ed0487bbdafc">385a9096f36f0aad5c901ba5dc79ed0487bbdafc</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
node-fetch before versions 2.6.1 and 3.0.0-beta.9 did not honor the size option after following a redirect, which means that when a content size was over the limit, a FetchError would never get thrown and the process would end without failure. For most people, this fix will have a little or no impact. However, if you are relying on node-fetch to gate files above a size, the impact could be significant, for example: If you don't double-check the size of the data after fetch() has completed, your JS thread could get tied up doing work on a large file (DoS) and/or cost you money in computing.
<p>Publish Date: 2020-09-10
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-15168>CVE-2020-15168</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/node-fetch/node-fetch/security/advisories/GHSA-w7rc-rwvf-8q5r">https://github.com/node-fetch/node-fetch/security/advisories/GHSA-w7rc-rwvf-8q5r</a></p>
<p>Release Date: 2020-09-17</p>
<p>Fix Resolution (node-fetch): 2.6.1</p>
<p>Direct dependency fix Resolution (isomorphic-fetch): 3.0.0</p>
</p>
</details>
<p></p>
| True | CVE-2020-15168 (Medium) detected in node-fetch-1.7.3.tgz - autoclosed - ## CVE-2020-15168 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-fetch-1.7.3.tgz</b></p></summary>
<p>A light-weight module that brings window.fetch to node.js and io.js</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-fetch/-/node-fetch-1.7.3.tgz">https://registry.npmjs.org/node-fetch/-/node-fetch-1.7.3.tgz</a></p>
<p>
Dependency Hierarchy:
- isomorphic-fetch-2.2.1.tgz (Root Library)
- :x: **node-fetch-1.7.3.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/jgeraigery/Singularity/commit/385a9096f36f0aad5c901ba5dc79ed0487bbdafc">385a9096f36f0aad5c901ba5dc79ed0487bbdafc</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
node-fetch before versions 2.6.1 and 3.0.0-beta.9 did not honor the size option after following a redirect, which means that when a content size was over the limit, a FetchError would never get thrown and the process would end without failure. For most people, this fix will have a little or no impact. However, if you are relying on node-fetch to gate files above a size, the impact could be significant, for example: If you don't double-check the size of the data after fetch() has completed, your JS thread could get tied up doing work on a large file (DoS) and/or cost you money in computing.
<p>Publish Date: 2020-09-10
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-15168>CVE-2020-15168</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/node-fetch/node-fetch/security/advisories/GHSA-w7rc-rwvf-8q5r">https://github.com/node-fetch/node-fetch/security/advisories/GHSA-w7rc-rwvf-8q5r</a></p>
<p>Release Date: 2020-09-17</p>
<p>Fix Resolution (node-fetch): 2.6.1</p>
<p>Direct dependency fix Resolution (isomorphic-fetch): 3.0.0</p>
</p>
</details>
<p></p>
| non_priority | cve medium detected in node fetch tgz autoclosed cve medium severity vulnerability vulnerable library node fetch tgz a light weight module that brings window fetch to node js and io js library home page a href dependency hierarchy isomorphic fetch tgz root library x node fetch tgz vulnerable library found in head commit a href found in base branch master vulnerability details node fetch before versions and beta did not honor the size option after following a redirect which means that when a content size was over the limit a fetcherror would never get thrown and the process would end without failure for most people this fix will have a little or no impact however if you are relying on node fetch to gate files above a size the impact could be significant for example if you don t double check the size of the data after fetch has completed your js thread could get tied up doing work on a large file dos and or cost you money in computing publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution node fetch direct dependency fix resolution isomorphic fetch | 0 |
142,512 | 19,094,249,211 | IssuesEvent | 2021-11-29 15:10:37 | symfony/symfony | https://api.github.com/repos/symfony/symfony | closed | [Security] User is loaded on every request | Security Bug Status: Needs Review | **Symfony version(s) affected**: 5.3.9
**Description**
After upgrading my application to Symfony 5.3 I tried to enable the new security system. Everything seems to work well. However, I discovered that a fresh user object is loaded by the UserProvider on **every request** (although it is still beeing loaded from the session).
I'm using REMOTE_USER authentication and a custom User and UserProvider implementation.
**How to reproduce**
I created a small dummy project to reproduce the problem: https://github.com/stlrnz/test-new-symfony-security
As you can see, there is nothing special in my implementation/configuration:
```php
<?php
declare(strict_types=1);
namespace App\Security;
use Symfony\Component\Security\Core\User\UserInterface;
class User implements UserInterface
{
private $username;
public function __construct(string $username)
{
$this->username = $username;
}
public function getRoles()
{
return ['ROLE_USER'];
}
public function getPassword()
{
return null;
}
public function getSalt()
{
return null;
}
public function eraseCredentials()
{
}
public function getUsername()
{
return $this->username;
}
public function getUserIdentifier()
{
return $this->getUsername();
}
}
```
```php
<?php
declare(strict_types=1);
namespace App\Security;
use Psr\Log\LoggerInterface;
use Symfony\Component\Security\Core\User\UserInterface;
use Symfony\Component\Security\Core\User\UserProviderInterface;
final class UserProvider implements UserProviderInterface
{
private $logger;
public function __construct(LoggerInterface $logger)
{
$this->logger = $logger;
}
public function refreshUser(UserInterface $user)
{
$this->logger->debug('Refreshing user ' . $user->getUserIdentifier());
return $user;
}
public function supportsClass(string $class)
{
return $class === User::class;
}
public function loadUserByUsername(string $username)
{
$this->logger->debug('Loading user ' . $username);
return new User($username);
}
public function loadUserByIdentifier(string $username)
{
return $this->loadUserByUsername($username);
}
}
```
```yaml
security:
enable_authenticator_manager: true
# https://symfony.com/doc/current/security.html#registering-the-user-hashing-passwords
password_hashers:
Symfony\Component\Security\Core\User\PasswordAuthenticatedUserInterface: 'auto'
# https://symfony.com/doc/current/security.html#loading-the-user-the-user-provider
providers:
custom_remote_user_provider:
id: App\Security\UserProvider
firewalls:
dev:
pattern: ^/(_(profiler|wdt)|css|images|js)/
security: false
main:
pattern: ^/
remote_user:
provider: custom_remote_user_provider
# activate different ways to authenticate
# https://symfony.com/doc/current/security.html#the-firewall
# https://symfony.com/doc/current/security/impersonating_user.html
# switch_user: true
# Easy way to control access for large sections of your site
# Note: Only the *first* access control that matches will be used
access_control:
# - { path: ^/admin, roles: ROLE_ADMIN }
# - { path: ^/profile, roles: ROLE_USER }
```
**Additional context**
As you can see the UserProvider writes two log messages for debugging.
When using the new security system the user is loaded on the first request:
> Loading user stlrnz
and refreshed **and** loaded on the following:
> Refreshing user stlrnz
> Loading user stlrnz
When using the old system by configuring
```yaml
security:
enable_authenticator_manager: false
```
the user is still loaded on the first request
> Loading user stlrnz
and refreshed on the following (no loading as expected).
> Refreshing user stlrnz
Im my real application, loading a user is a very complex operation (requires some webservice calls etc.). And therfore it should not be done on every request. Is there a way to achive this?
I tried to understand why this happens in the new system. It seems that the Authenticator always triggers the load of the user through the passport to create an Authenticated Token.
https://github.com/symfony/symfony/blob/9f43121357bfdba3209de6bb974384ff36d8506f/src/Symfony/Component/Security/Http/Authenticator/AbstractPreAuthenticatedAuthenticator.php#L105 | True | [Security] User is loaded on every request - **Symfony version(s) affected**: 5.3.9
**Description**
After upgrading my application to Symfony 5.3 I tried to enable the new security system. Everything seems to work well. However, I discovered that a fresh user object is loaded by the UserProvider on **every request** (although it is still beeing loaded from the session).
I'm using REMOTE_USER authentication and a custom User and UserProvider implementation.
**How to reproduce**
I created a small dummy project to reproduce the problem: https://github.com/stlrnz/test-new-symfony-security
As you can see, there is nothing special in my implementation/configuration:
```php
<?php
declare(strict_types=1);
namespace App\Security;
use Symfony\Component\Security\Core\User\UserInterface;
class User implements UserInterface
{
private $username;
public function __construct(string $username)
{
$this->username = $username;
}
public function getRoles()
{
return ['ROLE_USER'];
}
public function getPassword()
{
return null;
}
public function getSalt()
{
return null;
}
public function eraseCredentials()
{
}
public function getUsername()
{
return $this->username;
}
public function getUserIdentifier()
{
return $this->getUsername();
}
}
```
```php
<?php
declare(strict_types=1);
namespace App\Security;
use Psr\Log\LoggerInterface;
use Symfony\Component\Security\Core\User\UserInterface;
use Symfony\Component\Security\Core\User\UserProviderInterface;
final class UserProvider implements UserProviderInterface
{
private $logger;
public function __construct(LoggerInterface $logger)
{
$this->logger = $logger;
}
public function refreshUser(UserInterface $user)
{
$this->logger->debug('Refreshing user ' . $user->getUserIdentifier());
return $user;
}
public function supportsClass(string $class)
{
return $class === User::class;
}
public function loadUserByUsername(string $username)
{
$this->logger->debug('Loading user ' . $username);
return new User($username);
}
public function loadUserByIdentifier(string $username)
{
return $this->loadUserByUsername($username);
}
}
```
```yaml
security:
enable_authenticator_manager: true
# https://symfony.com/doc/current/security.html#registering-the-user-hashing-passwords
password_hashers:
Symfony\Component\Security\Core\User\PasswordAuthenticatedUserInterface: 'auto'
# https://symfony.com/doc/current/security.html#loading-the-user-the-user-provider
providers:
custom_remote_user_provider:
id: App\Security\UserProvider
firewalls:
dev:
pattern: ^/(_(profiler|wdt)|css|images|js)/
security: false
main:
pattern: ^/
remote_user:
provider: custom_remote_user_provider
# activate different ways to authenticate
# https://symfony.com/doc/current/security.html#the-firewall
# https://symfony.com/doc/current/security/impersonating_user.html
# switch_user: true
# Easy way to control access for large sections of your site
# Note: Only the *first* access control that matches will be used
access_control:
# - { path: ^/admin, roles: ROLE_ADMIN }
# - { path: ^/profile, roles: ROLE_USER }
```
**Additional context**
As you can see the UserProvider writes two log messages for debugging.
When using the new security system the user is loaded on the first request:
> Loading user stlrnz
and refreshed **and** loaded on the following:
> Refreshing user stlrnz
> Loading user stlrnz
When using the old system by configuring
```yaml
security:
enable_authenticator_manager: false
```
the user is still loaded on the first request
> Loading user stlrnz
and refreshed on the following (no loading as expected).
> Refreshing user stlrnz
Im my real application, loading a user is a very complex operation (requires some webservice calls etc.). And therfore it should not be done on every request. Is there a way to achive this?
I tried to understand why this happens in the new system. It seems that the Authenticator always triggers the load of the user through the passport to create an Authenticated Token.
https://github.com/symfony/symfony/blob/9f43121357bfdba3209de6bb974384ff36d8506f/src/Symfony/Component/Security/Http/Authenticator/AbstractPreAuthenticatedAuthenticator.php#L105 | non_priority | user is loaded on every request symfony version s affected description after upgrading my application to symfony i tried to enable the new security system everything seems to work well however i discovered that a fresh user object is loaded by the userprovider on every request although it is still beeing loaded from the session i m using remote user authentication and a custom user and userprovider implementation how to reproduce i created a small dummy project to reproduce the problem as you can see there is nothing special in my implementation configuration php php declare strict types namespace app security use symfony component security core user userinterface class user implements userinterface private username public function construct string username this username username public function getroles return public function getpassword return null public function getsalt return null public function erasecredentials public function getusername return this username public function getuseridentifier return this getusername php php declare strict types namespace app security use psr log loggerinterface use symfony component security core user userinterface use symfony component security core user userproviderinterface final class userprovider implements userproviderinterface private logger public function construct loggerinterface logger this logger logger public function refreshuser userinterface user this logger debug refreshing user user getuseridentifier return user public function supportsclass string class return class user class public function loaduserbyusername string username this logger debug loading user username return new user username public function loaduserbyidentifier string username return this loaduserbyusername username yaml security enable authenticator manager true password hashers symfony component security core user passwordauthenticateduserinterface auto providers custom remote user provider id app security userprovider firewalls dev pattern profiler wdt css images js security false main pattern remote user provider custom remote user provider activate different ways to authenticate switch user true easy way to control access for large sections of your site note only the first access control that matches will be used access control path admin roles role admin path profile roles role user additional context as you can see the userprovider writes two log messages for debugging when using the new security system the user is loaded on the first request loading user stlrnz and refreshed and loaded on the following refreshing user stlrnz loading user stlrnz when using the old system by configuring yaml security enable authenticator manager false the user is still loaded on the first request loading user stlrnz and refreshed on the following no loading as expected refreshing user stlrnz im my real application loading a user is a very complex operation requires some webservice calls etc and therfore it should not be done on every request is there a way to achive this i tried to understand why this happens in the new system it seems that the authenticator always triggers the load of the user through the passport to create an authenticated token
48,870 | 6,110,015,791 | IssuesEvent | 2017-06-21 14:15:39 | HellFirePvP/AstralSorcery | https://api.github.com/repos/HellFirePvP/AstralSorcery | closed | Potential Cross Mod Issue With Ender Lillys | design issue Fixed/Added next update mod interaction | This video shows the issue better, but basically Ender Lillys are meant to take a long time to grow, however I can grow them almost instantly with a ritual running the Aevitas constellation. I'm going to report this to Extra Utilities as well as I don't know which side this is an issue on, or if it's considered an issue at all.
https://www.twitch.tv/videos/153244516
https://github.com/rwtema/extrautilities/issues/1541
Minecraft 1.10.4
Astral Sorcery 1.4.1
Extra Utilities 2 1.4.1
Playing on my private server sole, using the All The Mods pack. | 1.0 | Potential Cross Mod Issue With Ender Lillys - This video shows the issue better, but basically Ender Lillys are meant to take a long time to grow, however I can grow them almost instantly with a ritual running the Aevitas constellation. I'm going to report this to Extra Utilities as well as I don't know which side this is an issue on, or if it's considered an issue at all.
https://www.twitch.tv/videos/153244516
https://github.com/rwtema/extrautilities/issues/1541
Minecraft 1.10.4
Astral Sorcery 1.4.1
Extra Utilities 2 1.4.1
Playing on my private server sole, using the All The Mods pack. | non_priority | potential cross mod issue with ender lillys this video shows the issue better but basically ender lillys are meant to take a long time to grow however i can grow them almost instantly with a ritual running the aevitas constellation i m going to report this to extra utilities as well as i don t know which side this is an issue on or if it s considered an issue at all minecraft astral sorcery extra utilities playing on my private server sole using the all the mods pack | 0 |
10,450 | 12,402,987,981 | IssuesEvent | 2020-05-21 13:07:15 | ProgVal/Limnoria | https://api.github.com/repos/ProgVal/Limnoria | closed | KeyError in urllib when using ~google translate with interpolation | Bug Python 2 compatibility | I'm running the latest git version, with `~x` aliased to `~google translate`, and using `{}` for interpolation.
This works:
```
<&The-Compiler> ~x en ja Hello
<foobar> The-Compiler: こんにちは
<&The-Compiler> ~x ja en こんにちは
<foobar> The-Compiler: good afternoon
```
but this doesn't:
```
<&The-Compiler> ~x ja en {x en ja Hello}
<foobar> The-Compiler: Error: KeyError: u'\u3053'
```
In the log:
```
ERROR 2016-01-05T17:21:55 Uncaught exception in ['google', 'translate'].
Traceback (most recent call last):
File ".../supybot/callbacks.py", line 1271, in _callCommand
self.callCommand(command, irc, msg, *args, **kwargs)
File ".../supybot/utils/python.py", line 90, in g
f(self, *args, **kwargs)
File ".../supybot/callbacks.py", line 1248, in callCommand
method(irc, msg, *args, **kwargs)
File ".../supybot/commands.py", line 1080, in newf
f(self, irc, msg, args, *state.args, **state.kwargs)
File ".../supybot/plugins/Google/plugin.py", line 283, in translate
(text, language) = self._translate(sourceLang, targetLang, text)
File ".../supybot/plugins/Google/plugin.py", line 252, in _translate
text = utils.web.urlquote(text)
File "/usr/lib/python2.7/urllib.py", line 1303, in quote
return ''.join(map(quoter, s))
KeyError: u'\u3053'
```
| True | KeyError in urllib when using ~google translate with interpolation - I'm running the latest git version, with `~x` aliased to `~google translate`, and using `{}` for interpolation.
This works:
```
<&The-Compiler> ~x en ja Hello
<foobar> The-Compiler: こんにちは
<&The-Compiler> ~x ja en こんにちは
<foobar> The-Compiler: good afternoon
```
but this doesn't:
```
<&The-Compiler> ~x ja en {x en ja Hello}
<foobar> The-Compiler: Error: KeyError: u'\u3053'
```
In the log:
```
ERROR 2016-01-05T17:21:55 Uncaught exception in ['google', 'translate'].
Traceback (most recent call last):
File ".../supybot/callbacks.py", line 1271, in _callCommand
self.callCommand(command, irc, msg, *args, **kwargs)
File ".../supybot/utils/python.py", line 90, in g
f(self, *args, **kwargs)
File ".../supybot/callbacks.py", line 1248, in callCommand
method(irc, msg, *args, **kwargs)
File ".../supybot/commands.py", line 1080, in newf
f(self, irc, msg, args, *state.args, **state.kwargs)
File ".../supybot/plugins/Google/plugin.py", line 283, in translate
(text, language) = self._translate(sourceLang, targetLang, text)
File ".../supybot/plugins/Google/plugin.py", line 252, in _translate
text = utils.web.urlquote(text)
File "/usr/lib/python2.7/urllib.py", line 1303, in quote
return ''.join(map(quoter, s))
KeyError: u'\u3053'
```
| non_priority | keyerror in urllib when using google translate with interpolation i m running the latest git version with x aliased to google translate and using for interpolation this works x en ja hello the compiler こんにちは x ja en こんにちは the compiler good afternoon but this doesn t x ja en x en ja hello the compiler error keyerror u in the log error uncaught exception in traceback most recent call last file supybot callbacks py line in callcommand self callcommand command irc msg args kwargs file supybot utils python py line in g f self args kwargs file supybot callbacks py line in callcommand method irc msg args kwargs file supybot commands py line in newf f self irc msg args state args state kwargs file supybot plugins google plugin py line in translate text language self translate sourcelang targetlang text file supybot plugins google plugin py line in translate text utils web urlquote text file usr lib urllib py line in quote return join map quoter s keyerror u | 0 |
43,902 | 11,879,566,167 | IssuesEvent | 2020-03-27 09:02:03 | contao/contao | https://api.github.com/repos/contao/contao | closed | Übersetzung tl_calendar.jumpTo.1, tl_calendar_events.jumpTo.1 und tl_calendar_events.articleId.1 | defect | Verbesserungsvorschläge für die Übersetzungen ``tl_calendar.jumpTo.1``, ``tl_calendar_events.jumpTo.1`` und ``tl_calendar_events.articleId.1``:
https://github.com/contao/contao/blob/bf87609767e5623d9da8b98920683bef2d651997/calendar-bundle/src/Resources/contao/languages/de/tl_calendar.xlf#L16-L19
``Bitte wählen Sie die Eventleser-Seite aus, zu der Besucher weitergeleitet werden, wenn sie ein Event anklicken.``
https://github.com/contao/contao/blob/bf87609767e5623d9da8b98920683bef2d651997/calendar-bundle/src/Resources/contao/languages/de/tl_calendar_events.xlf#L212-L215
``Bitte wählen Sie die Seite aus, zu der Besucher weitergeleitet werden, wenn sie das Event anklicken.``
https://github.com/contao/contao/blob/bf87609767e5623d9da8b98920683bef2d651997/calendar-bundle/src/Resources/contao/languages/de/tl_calendar_events.xlf#L220-L223
``Bitte wählen Sie den Artikel aus, zu dem Besucher weitergeleitet werden, wenn sie das Event anklicken.`` | 1.0 | Übersetzung tl_calendar.jumpTo.1, tl_calendar_events.jumpTo.1 und tl_calendar_events.articleId.1 - Verbesserungsvorschläge für die Übersetzungen ``tl_calendar.jumpTo.1``, ``tl_calendar_events.jumpTo.1`` und ``tl_calendar_events.articleId.1``:
https://github.com/contao/contao/blob/bf87609767e5623d9da8b98920683bef2d651997/calendar-bundle/src/Resources/contao/languages/de/tl_calendar.xlf#L16-L19
``Bitte wählen Sie die Eventleser-Seite aus, zu der Besucher weitergeleitet werden, wenn sie ein Event anklicken.``
https://github.com/contao/contao/blob/bf87609767e5623d9da8b98920683bef2d651997/calendar-bundle/src/Resources/contao/languages/de/tl_calendar_events.xlf#L212-L215
``Bitte wählen Sie die Seite aus, zu der Besucher weitergeleitet werden, wenn sie das Event anklicken.``
https://github.com/contao/contao/blob/bf87609767e5623d9da8b98920683bef2d651997/calendar-bundle/src/Resources/contao/languages/de/tl_calendar_events.xlf#L220-L223
``Bitte wählen Sie den Artikel aus, zu dem Besucher weitergeleitet werden, wenn sie das Event anklicken.`` | non_priority | übersetzung tl calendar jumpto tl calendar events jumpto und tl calendar events articleid verbesserungsvorschläge für die übersetzungen tl calendar jumpto tl calendar events jumpto und tl calendar events articleid bitte wählen sie die eventleser seite aus zu der besucher weitergeleitet werden wenn sie ein event anklicken bitte wählen sie die seite aus zu der besucher weitergeleitet werden wenn sie das event anklicken bitte wählen sie den artikel aus zu dem besucher weitergeleitet werden wenn sie das event anklicken | 0 |
127,843 | 18,024,374,162 | IssuesEvent | 2021-09-17 01:08:43 | kapseliboi/nehan | https://api.github.com/repos/kapseliboi/nehan | opened | CVE-2021-3777 (Medium) detected in tmpl-1.0.4.tgz | security vulnerability | ## CVE-2021-3777 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tmpl-1.0.4.tgz</b></p></summary>
<p>JavaScript micro templates.</p>
<p>Library home page: <a href="https://registry.npmjs.org/tmpl/-/tmpl-1.0.4.tgz">https://registry.npmjs.org/tmpl/-/tmpl-1.0.4.tgz</a></p>
<p>Path to dependency file: nehan/package.json</p>
<p>Path to vulnerable library: nehan/node_modules/tmpl/package.json</p>
<p>
Dependency Hierarchy:
- jest-27.0.6.tgz (Root Library)
- core-27.0.6.tgz
- jest-haste-map-27.0.6.tgz
- walker-1.0.7.tgz
- makeerror-1.0.11.tgz
- :x: **tmpl-1.0.4.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
nodejs-tmpl is vulnerable to Inefficient Regular Expression Complexity
<p>Publish Date: 2021-09-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3777>CVE-2021-3777</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: N/A
- Attack Complexity: N/A
- Privileges Required: N/A
- User Interaction: N/A
- Scope: N/A
- Impact Metrics:
- Confidentiality Impact: N/A
- Integrity Impact: N/A
- Availability Impact: N/A
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/daaku/nodejs-tmpl/releases/tag/v1.0.5">https://github.com/daaku/nodejs-tmpl/releases/tag/v1.0.5</a></p>
<p>Release Date: 2021-09-15</p>
<p>Fix Resolution: tmpl - 1.0.5</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2021-3777 (Medium) detected in tmpl-1.0.4.tgz - ## CVE-2021-3777 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tmpl-1.0.4.tgz</b></p></summary>
<p>JavaScript micro templates.</p>
<p>Library home page: <a href="https://registry.npmjs.org/tmpl/-/tmpl-1.0.4.tgz">https://registry.npmjs.org/tmpl/-/tmpl-1.0.4.tgz</a></p>
<p>Path to dependency file: nehan/package.json</p>
<p>Path to vulnerable library: nehan/node_modules/tmpl/package.json</p>
<p>
Dependency Hierarchy:
- jest-27.0.6.tgz (Root Library)
- core-27.0.6.tgz
- jest-haste-map-27.0.6.tgz
- walker-1.0.7.tgz
- makeerror-1.0.11.tgz
- :x: **tmpl-1.0.4.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
nodejs-tmpl is vulnerable to Inefficient Regular Expression Complexity
<p>Publish Date: 2021-09-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3777>CVE-2021-3777</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: N/A
- Attack Complexity: N/A
- Privileges Required: N/A
- User Interaction: N/A
- Scope: N/A
- Impact Metrics:
- Confidentiality Impact: N/A
- Integrity Impact: N/A
- Availability Impact: N/A
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/daaku/nodejs-tmpl/releases/tag/v1.0.5">https://github.com/daaku/nodejs-tmpl/releases/tag/v1.0.5</a></p>
<p>Release Date: 2021-09-15</p>
<p>Fix Resolution: tmpl - 1.0.5</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
# Fail to load SVG files on hover
Repo: [microsoft/vscode](https://api.github.com/repos/microsoft/vscode) | Action: closed | Created: 2021-03-11 08:09:46 | Labels: *as-designed

Version: 1.55.0-insider
Commit: 5d80c30e5b6ce8b2f5336ed55ad043490b0b818f
Date: 2021-03-08T05:24:34.127Z
Electron: 11.3.0
Chrome: 87.0.4280.141
Node.js: 12.18.3
V8: 8.7.220.31-electron.0
OS: Darwin x64 18.7.0
Steps to Reproduce:
1. Execute the following extension sample.
2. Move a mouse cursor to any text.
3. SVG files outside the extension path, and SVG files on remote servers, fail to load.
We want to display dynamically generated SVG files on hover: rendered math equations and thumbnails of PDF files. They are typically generated in a temporary directory outside the extension directory.
Console says:
```
tiger.svg:1 Failed to load resource: net::ERR_BLOCKED_BY_CLIENT
```
Does this issue occur when all extensions are disabled?: Yes
```ts
import * as vscode from 'vscode';
import * as path from 'path';
export function activate(context: vscode.ExtensionContext) {
console.log('Congratulations, your extension "helloworld-sample" is now active!');
context.subscriptions.push(
vscode.languages.registerHoverProvider({ scheme: '*' }, new HoverProvider(context.extensionPath))
)
const disposable = vscode.commands.registerCommand('extension.helloWorld', () => {
vscode.window.showInformationMessage('Hello World!');
});
}
class HoverProvider implements vscode.HoverProvider {
path: string
constructor(path: string) {
this.path = path
}
provideHover(): vscode.Hover {
const insideExtensionUri = vscode.Uri.file(path.join(this.path, 'tiger.svg'))
const fileUri = vscode.Uri.file('/Users/tamura/Downloads/tiger.svg')
const serverUri = vscode.Uri.parse('https://dev.w3.org/SVG/tools/svgweb/samples/svg-files/tiger.svg')
    const md = `ext:  local: , server: `
const mdString = new vscode.MarkdownString(md)
const hover = new vscode.Hover(mdString)
return hover
}
}
```

# Create Issue Tickets
Repo: [thedigitalmenagerie/mesi](https://api.github.com/repos/thedigitalmenagerie/mesi) | Action: closed | Created: 2021-12-05 20:13:14 | Labels: documentation

## Feature Summary
As an architect, create the tickets for each milestone with as much detail as possible.
## Acceptance Criteria
All features are covered.
## Technical Requirements
- [x] All tickets specific to Database creation with Sample Data
- [x] All tickets specific to back-end functionality
- [x] All tickets associated with front-end functionality.
# jquery-1.6.2.min.js: 5 vulnerabilities (highest severity is: 6.1)
Repo: [lukebrogan-mend/WebGoat.NET](https://api.github.com/repos/lukebrogan-mend/WebGoat.NET) | Action: opened | Created: 2023-01-23 15:24:42 | Labels: security vulnerability

<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.6.2.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js</a></p>
<p>Path to vulnerable library: /WebGoat/Resources/client-scripts/jquery-1.6.2.min.js</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/lukebrogan-mend/WebGoat.NET/commit/d7fa166c5e606ca761a83d8d10286b068972feb1">d7fa166c5e606ca761a83d8d10286b068972feb1</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (jquery version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2020-11022](https://www.mend.io/vulnerability-database/CVE-2020-11022) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | jquery-1.6.2.min.js | Direct | jQuery - 3.5.0 | ❌ |
| [CVE-2015-9251](https://www.mend.io/vulnerability-database/CVE-2015-9251) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | jquery-1.6.2.min.js | Direct | jQuery - 3.0.0 | ❌ |
| [CVE-2019-11358](https://www.mend.io/vulnerability-database/CVE-2019-11358) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | jquery-1.6.2.min.js | Direct | jquery - 3.4.0 | ❌ |
| [CVE-2020-7656](https://www.mend.io/vulnerability-database/CVE-2020-7656) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | jquery-1.6.2.min.js | Direct | jquery - 1.9.0 | ❌ |
| [CVE-2012-6708](https://www.mend.io/vulnerability-database/CVE-2012-6708) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | jquery-1.6.2.min.js | Direct | jQuery - v1.9.0 | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-11022</summary>
### Vulnerable Library - <b>jquery-1.6.2.min.js</b></p>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js</a></p>
<p>Path to vulnerable library: /WebGoat/Resources/client-scripts/jquery-1.6.2.min.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.6.2.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/lukebrogan-mend/WebGoat.NET/commit/d7fa166c5e606ca761a83d8d10286b068972feb1">d7fa166c5e606ca761a83d8d10286b068972feb1</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In jQuery versions greater than or equal to 1.2 and before 3.5.0, passing HTML from untrusted sources - even after sanitizing it - to one of jQuery's DOM manipulation methods (i.e. .html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0.
<p>Publish Date: 2020-04-29
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-11022>CVE-2020-11022</a></p>
</p>
<p></p>
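Independent of upgrading, a common defence-in-depth step against this class of issue is to escape untrusted text before it ever reaches an HTML-manipulation API. A minimal sketch (not part of jQuery):

```typescript
// Defence-in-depth sketch: escape untrusted text before it reaches
// .html()/.append(), so it renders as text rather than as markup.
function escapeHtml(text: string): string {
  return text
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&#39;");
}
```

Note that escaping is a mitigation, not a substitute for the upgrade: the CVE above shows that even sanitized HTML could be re-mutated by jQuery before 3.5.0.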
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11022">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11022</a></p>
<p>Release Date: 2020-04-29</p>
<p>Fix Resolution: jQuery - 3.5.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2015-9251</summary>
### Vulnerable Library - <b>jquery-1.6.2.min.js</b></p>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js</a></p>
<p>Path to vulnerable library: /WebGoat/Resources/client-scripts/jquery-1.6.2.min.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.6.2.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/lukebrogan-mend/WebGoat.NET/commit/d7fa166c5e606ca761a83d8d10286b068972feb1">d7fa166c5e606ca761a83d8d10286b068972feb1</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed.
<p>Publish Date: 2018-01-18
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2015-9251>CVE-2015-9251</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2015-9251">https://nvd.nist.gov/vuln/detail/CVE-2015-9251</a></p>
<p>Release Date: 2018-01-18</p>
<p>Fix Resolution: jQuery - 3.0.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2019-11358</summary>
### Vulnerable Library - <b>jquery-1.6.2.min.js</b></p>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js</a></p>
<p>Path to vulnerable library: /WebGoat/Resources/client-scripts/jquery-1.6.2.min.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.6.2.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/lukebrogan-mend/WebGoat.NET/commit/d7fa166c5e606ca761a83d8d10286b068972feb1">d7fa166c5e606ca761a83d8d10286b068972feb1</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
jQuery before 3.4.0, as used in Drupal, Backdrop CMS, and other products, mishandles jQuery.extend(true, {}, ...) because of Object.prototype pollution. If an unsanitized source object contained an enumerable __proto__ property, it could extend the native Object.prototype.
<p>Publish Date: 2019-04-20
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-11358>CVE-2019-11358</a></p>
</p>
<p></p>
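The pollution pattern described above can be sketched with a naive deep merge (hypothetical code illustrating the bug class, not jQuery's actual source):

```typescript
// Hypothetical sketch of the bug class, not jQuery's implementation. A naive
// deep merge that copies a "__proto__" key writes through the Object.prototype
// accessor, so attacker-controlled JSON adds properties to every plain object.
function naiveDeepMerge(target: any, source: any): any {
  for (const key of Object.keys(source)) {
    const value = source[key];
    if (value !== null && typeof value === "object") {
      // For key === "__proto__", target[key] resolves to Object.prototype,
      // so the recursive call writes properties onto it.
      target[key] = naiveDeepMerge(target[key] ?? {}, value);
    } else {
      target[key] = value;
    }
  }
  return target;
}
```

The fix in jQuery 3.4.0, like most merge utilities since, is to skip `__proto__` (and `constructor`) keys during the copy.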
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11358">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11358</a></p>
<p>Release Date: 2019-04-20</p>
<p>Fix Resolution: jquery - 3.4.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-7656</summary>
### Vulnerable Library - <b>jquery-1.6.2.min.js</b></p>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js</a></p>
<p>Path to vulnerable library: /WebGoat/Resources/client-scripts/jquery-1.6.2.min.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.6.2.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/lukebrogan-mend/WebGoat.NET/commit/d7fa166c5e606ca761a83d8d10286b068972feb1">d7fa166c5e606ca761a83d8d10286b068972feb1</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
jquery prior to 1.9.0 allows Cross-site Scripting attacks via the load method. The load method fails to recognize and remove "<script>" HTML tags that contain a whitespace character, i.e: "</script >", which results in the enclosed script logic to be executed.
<p>Publish Date: 2020-05-19
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-7656>CVE-2020-7656</a></p>
</p>
<p></p>
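The whitespace bypass described above can be shown with a naive filter (hypothetical code, not jQuery's source):

```typescript
// Hypothetical sketch: a filter that only recognises the exact "<script>"
// spelling. "<script >" (note the space before ">") is still a valid script
// tag to the HTML parser, so the payload survives the filter — the bypass
// behind CVE-2020-7656.
function naiveStripScripts(html: string): string {
  return html.replace(/<\/?script>/gi, "");
}
```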
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-q4m3-2j7h-f7xw">https://github.com/advisories/GHSA-q4m3-2j7h-f7xw</a></p>
<p>Release Date: 2020-05-19</p>
<p>Fix Resolution: jquery - 1.9.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2012-6708</summary>
### Vulnerable Library - <b>jquery-1.6.2.min.js</b></p>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js</a></p>
<p>Path to vulnerable library: /WebGoat/Resources/client-scripts/jquery-1.6.2.min.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.6.2.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/lukebrogan-mend/WebGoat.NET/commit/d7fa166c5e606ca761a83d8d10286b068972feb1">d7fa166c5e606ca761a83d8d10286b068972feb1</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
jQuery before 1.9.0 is vulnerable to Cross-site Scripting (XSS) attacks. The jQuery(strInput) function does not differentiate selectors from HTML in a reliable fashion. In vulnerable versions, jQuery determined whether the input was HTML by looking for the '<' character anywhere in the string, giving attackers more flexibility when attempting to construct a malicious payload. In fixed versions, jQuery only deems the input to be HTML if it explicitly starts with the '<' character, limiting exploitability only to attackers who can control the beginning of a string, which is far less common.
<p>Publish Date: 2018-01-18
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2012-6708>CVE-2012-6708</a></p>
</p>
<p></p>
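The two heuristics described above can be sketched as follows (hypothetical code, not jQuery's implementation):

```typescript
// Before 1.9.0: any "<" anywhere in the string flagged it as HTML, so a
// selector with an appended payload reached the HTML parser.
function looksLikeHtmlVulnerable(input: string): boolean {
  return input.includes("<");
}

// After the fix: only strings that start with "<" are treated as HTML, so
// an attacker must control the beginning of the string.
function looksLikeHtmlFixed(input: string): boolean {
  return input.trimStart().startsWith("<");
}
```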
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2012-6708">https://nvd.nist.gov/vuln/detail/CVE-2012-6708</a></p>
<p>Release Date: 2018-01-18</p>
<p>Fix Resolution: jQuery - v1.9.0</p>
</p>
<p></p>
</details> | True | jquery-1.6.2.min.js: 5 vulnerabilities (highest severity is: 6.1) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.6.2.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js</a></p>
<p>Path to vulnerable library: /WebGoat/Resources/client-scripts/jquery-1.6.2.min.js</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/lukebrogan-mend/WebGoat.NET/commit/d7fa166c5e606ca761a83d8d10286b068972feb1">d7fa166c5e606ca761a83d8d10286b068972feb1</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (jquery version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2020-11022](https://www.mend.io/vulnerability-database/CVE-2020-11022) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | jquery-1.6.2.min.js | Direct | jQuery - 3.5.0 | ❌ |
| [CVE-2015-9251](https://www.mend.io/vulnerability-database/CVE-2015-9251) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | jquery-1.6.2.min.js | Direct | jQuery - 3.0.0 | ❌ |
| [CVE-2019-11358](https://www.mend.io/vulnerability-database/CVE-2019-11358) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | jquery-1.6.2.min.js | Direct | jquery - 3.4.0 | ❌ |
| [CVE-2020-7656](https://www.mend.io/vulnerability-database/CVE-2020-7656) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | jquery-1.6.2.min.js | Direct | jquery - 1.9.0 | ❌ |
| [CVE-2012-6708](https://www.mend.io/vulnerability-database/CVE-2012-6708) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | jquery-1.6.2.min.js | Direct | jQuery - v1.9.0 | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-11022</summary>
### Vulnerable Library - <b>jquery-1.6.2.min.js</b></p>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js</a></p>
<p>Path to vulnerable library: /WebGoat/Resources/client-scripts/jquery-1.6.2.min.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.6.2.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/lukebrogan-mend/WebGoat.NET/commit/d7fa166c5e606ca761a83d8d10286b068972feb1">d7fa166c5e606ca761a83d8d10286b068972feb1</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In jQuery versions greater than or equal to 1.2 and before 3.5.0, passing HTML from untrusted sources - even after sanitizing it - to one of jQuery's DOM manipulation methods (i.e. .html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0.
<p>Publish Date: 2020-04-29
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-11022>CVE-2020-11022</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11022">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11022</a></p>
<p>Release Date: 2020-04-29</p>
<p>Fix Resolution: jQuery - 3.5.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2015-9251</summary>
### Vulnerable Library - <b>jquery-1.6.2.min.js</b></p>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js</a></p>
<p>Path to vulnerable library: /WebGoat/Resources/client-scripts/jquery-1.6.2.min.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.6.2.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/lukebrogan-mend/WebGoat.NET/commit/d7fa166c5e606ca761a83d8d10286b068972feb1">d7fa166c5e606ca761a83d8d10286b068972feb1</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed.
<p>Publish Date: 2018-01-18
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2015-9251>CVE-2015-9251</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2015-9251">https://nvd.nist.gov/vuln/detail/CVE-2015-9251</a></p>
<p>Release Date: 2018-01-18</p>
<p>Fix Resolution: jQuery - 3.0.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2019-11358</summary>
### Vulnerable Library - <b>jquery-1.6.2.min.js</b></p>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js</a></p>
<p>Path to vulnerable library: /WebGoat/Resources/client-scripts/jquery-1.6.2.min.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.6.2.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/lukebrogan-mend/WebGoat.NET/commit/d7fa166c5e606ca761a83d8d10286b068972feb1">d7fa166c5e606ca761a83d8d10286b068972feb1</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
jQuery before 3.4.0, as used in Drupal, Backdrop CMS, and other products, mishandles jQuery.extend(true, {}, ...) because of Object.prototype pollution. If an unsanitized source object contained an enumerable __proto__ property, it could extend the native Object.prototype.
<p>Publish Date: 2019-04-20
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-11358>CVE-2019-11358</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11358">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11358</a></p>
<p>Release Date: 2019-04-20</p>
<p>Fix Resolution: jquery - 3.4.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-7656</summary>
### Vulnerable Library - <b>jquery-1.6.2.min.js</b></p>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js</a></p>
<p>Path to vulnerable library: /WebGoat/Resources/client-scripts/jquery-1.6.2.min.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.6.2.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/lukebrogan-mend/WebGoat.NET/commit/d7fa166c5e606ca761a83d8d10286b068972feb1">d7fa166c5e606ca761a83d8d10286b068972feb1</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
jquery prior to 1.9.0 allows Cross-site Scripting attacks via the load method. The load method fails to recognize and remove "<script>" HTML tags that contain a whitespace character, i.e: "</script >", which results in the enclosed script logic to be executed.
<p>Publish Date: 2020-05-19
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-7656>CVE-2020-7656</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-q4m3-2j7h-f7xw">https://github.com/advisories/GHSA-q4m3-2j7h-f7xw</a></p>
<p>Release Date: 2020-05-19</p>
<p>Fix Resolution: jquery - 1.9.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2012-6708</summary>
### Vulnerable Library - <b>jquery-1.6.2.min.js</b></p>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.6.2/jquery.min.js</a></p>
<p>Path to vulnerable library: /WebGoat/Resources/client-scripts/jquery-1.6.2.min.js</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.6.2.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/lukebrogan-mend/WebGoat.NET/commit/d7fa166c5e606ca761a83d8d10286b068972feb1">d7fa166c5e606ca761a83d8d10286b068972feb1</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
jQuery before 1.9.0 is vulnerable to Cross-site Scripting (XSS) attacks. The jQuery(strInput) function does not differentiate selectors from HTML in a reliable fashion. In vulnerable versions, jQuery determined whether the input was HTML by looking for the '<' character anywhere in the string, giving attackers more flexibility when attempting to construct a malicious payload. In fixed versions, jQuery only deems the input to be HTML if it explicitly starts with the '<' character, limiting exploitability only to attackers who can control the beginning of a string, which is far less common.
<p>Publish Date: 2018-01-18
<p>URL: <a href="https://www.mend.io/vulnerability-database/CVE-2012-6708">CVE-2012-6708</a></p>
</p>
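The selector-vs-HTML ambiguity described above can be illustrated with a small self-contained sketch. The two predicates below only mimic the pre-/post-1.9 behaviour described in the advisory (they are simplifications, not jQuery's actual code), and the tainted string is a hypothetical example:

```javascript
// Stand-ins for how jQuery(strInput) decides "selector or HTML?".
// These mimic the behaviour described in the advisory; they are NOT
// jQuery's real implementation.

// Before 1.9: a '<...>' sequence ANYWHERE in the string marked it as HTML.
function looksLikeHtmlPre19(input) {
  return /<[\w\W]+>/.test(input);
}

// From 1.9 on: only strings that START with '<' are treated as HTML.
function looksLikeHtmlPost19(input) {
  return input.charAt(0) === "<";
}

// A "selector" assembled from attacker-controlled text (hypothetical):
const tainted = "#results <img src=x onerror=alert(1)>";

console.log(looksLikeHtmlPre19(tainted));  // true  -> parsed as HTML, script can run
console.log(looksLikeHtmlPost19(tainted)); // false -> treated as a plain selector
```

This is why the fixed versions limit exploitability to attackers who can control the very beginning of the string.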
<p></p>
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
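The 6.1 figure can be reproduced from the metrics listed above. The following is a minimal sketch of the CVSS v3.0 base-score equations, with metric weights as published in the FIRST specification:

```javascript
// CVSS v3.0 base score for AV:N / AC:L / PR:N / UI:R / S:C / C:L / I:L / A:N.
// Numeric weights come from the FIRST CVSS v3.0 specification.
const AV = 0.85, AC = 0.77, PR = 0.85, UI = 0.62; // PR:None is 0.85 for either scope
const C = 0.22, I = 0.22, A = 0.0;
const scopeChanged = true;

// CVSS "Roundup": round up to one decimal place.
const roundup = (x) => Math.ceil(x * 10) / 10;

const iss = 1 - (1 - C) * (1 - I) * (1 - A); // impact sub-score base
const impact = scopeChanged
  ? 7.52 * (iss - 0.029) - 3.25 * Math.pow(iss - 0.02, 15)
  : 6.42 * iss;
const exploitability = 8.22 * AV * AC * PR * UI;

let base;
if (impact <= 0) {
  base = 0;
} else if (scopeChanged) {
  base = roundup(Math.min(1.08 * (impact + exploitability), 10));
} else {
  base = roundup(Math.min(impact + exploitability, 10));
}

console.log(base); // 6.1 -- matches the score reported above
```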
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2012-6708">https://nvd.nist.gov/vuln/detail/CVE-2012-6708</a></p>
<p>Release Date: 2018-01-18</p>
<p>Fix Resolution: jQuery - v1.9.0</p>
</p>
<p></p>
</details>
---

**Issue:** [Infra] Embedded resources are not included in Rules.csproj in clean command line builds
**Repo:** SonarSource/sonarlint-visualstudio (https://api.github.com/repos/SonarSource/sonarlint-visualstudio)
**Event:** IssuesEvent 26,698,739,363, opened 2023-01-27 12:41:25. **Labels:** Area: VS2019 Infrastructure

### Description
The `ProcessPluginJars` project extracts rule description files and adds them to a folder in `Rules.csproj`, which should then embed them as resource files.
This works when building inside VS. However, the embedded files are not included when building from the command line.
It is only a problem with clean builds, i.e. when the `ProcessPluginJars` project is being built for the first time.
Unfortunately, this means it is a problem on the CI machine: the pipeline YAML has a hack to work around the problem, i.e. it explicitly builds the `ProcessPluginJars` project and its dependency first.
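For readers unfamiliar with the failure mode: this is consistent with MSBuild expanding item globs when the project is evaluated, before referenced projects have run. A hypothetical sketch (the item path and pattern below are illustrative assumptions, not taken from the repository):

```xml
<!-- Rules.csproj (hypothetical sketch) -->
<ItemGroup>
  <!-- The glob is expanded at evaluation time. On a clean command-line
       build, the files extracted by ProcessPluginJars do not exist yet,
       so the glob matches nothing and the build completes without
       embedding any rule descriptions. -->
  <EmbeddedResource Include="EmbeddedRuleDescriptions\**\*" />
</ItemGroup>
```

This would explain why a second (incremental) build, or a VS build where the files already exist on disk, embeds them correctly.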
---

**Issue:** Document new approach to multi lingual sites
**Repo:** torchbox/wagtail (https://api.github.com/repos/torchbox/wagtail)
**Event:** IssuesEvent 3,091,746,610, closed 2015-08-26 14:39:18. **Labels:** Documentation

Currently, the documentation contains a tutorial about how to create multilingual sites by duplicating your site tree.
Wagtail 0.6+ lets you use a different approach using Django's ``i18n_patterns`` and ``LocaleMiddleware``, which is better in some ways, as you can duplicate the fields instead of duplicating all pages. We should document this approach as well.
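For reference, the proposed setup looks roughly like the following. This is a hedged configuration sketch for the Django/Wagtail versions of that era; the exact module paths (e.g. ``wagtail.wagtailcore.urls``) are assumptions and should be checked against the target release:

```python
# settings.py (sketch)
USE_I18N = True
LANGUAGES = [("en", "English"), ("fr", "French")]
MIDDLEWARE_CLASSES = [
    # ...
    "django.middleware.locale.LocaleMiddleware",  # selects the active language
    # ...
]

# urls.py (sketch)
from django.conf.urls import include, url
from django.conf.urls.i18n import i18n_patterns

urlpatterns = i18n_patterns(
    # One shared page tree served under /en/..., /fr/..., etc.;
    # translated content lives in per-language fields on each page.
    url(r"", include("wagtail.wagtailcore.urls")),
)
```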