| column | dtype | summary |
|---|---|---|
| Unnamed: 0 | int64 | 0 to 832k |
| id | float64 | 2.49B to 32.1B |
| type | stringclasses | 1 value |
| created_at | stringlengths | 19 to 19 |
| repo | stringlengths | 7 to 112 |
| repo_url | stringlengths | 36 to 141 |
| action | stringclasses | 3 values |
| title | stringlengths | 1 to 744 |
| labels | stringlengths | 4 to 574 |
| body | stringlengths | 9 to 211k |
| index | stringclasses | 10 values |
| text_combine | stringlengths | 96 to 211k |
| label | stringclasses | 2 values |
| text | stringlengths | 96 to 188k |
| binary_label | int64 | 0 to 1 |
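In the sample rows of this dump, `label` (process / non_process) appears to map one-to-one onto `binary_label` (1 / 0). A minimal pandas sketch, built from the sample rows rather than the full file (no file name or location is given in the dump), checks that correspondence:

```python
import pandas as pd

# Scalar fields copied from the sample rows below; in practice the full dump
# would be loaded with pd.read_csv(...) from wherever it is stored.
rows = [
    {"repo": "zoot-hq/zoot",            "action": "closed", "label": "non_process", "binary_label": 0},
    {"repo": "rammatzkvosky/789",       "action": "opened", "label": "non_process", "binary_label": 0},
    {"repo": "DataDog/dd-trace-dotnet", "action": "closed", "label": "non_process", "binary_label": 0},
    {"repo": "shirou/gopsutil",         "action": "closed", "label": "process",     "binary_label": 1},
]
df = pd.DataFrame(rows)

# In these samples, label maps one-to-one onto binary_label:
# process -> 1, non_process -> 0.
mapped = df["label"].map({"process": 1, "non_process": 0})
assert (mapped == df["binary_label"]).all()

print(df["label"].value_counts().to_dict())
```

Whether the mapping holds over all 832k rows cannot be verified from this excerpt; it is only consistent across the samples shown here.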
Unnamed: 0: 417,909
id: 12,189,832,614
type: IssuesEvent
created_at: 2020-04-29 08:15:02
repo: zoot-hq/zoot
repo_url: https://api.github.com/repos/zoot-hq/zoot
action: closed
title: user-page
labels: good first issue high priority
body: - accessed via navbar 2nd button icon - features username, email, and password (ONLY MOCK PASSWORD "****") and a way to update each item in the DB - contact button (opens email editor, subject line "User Contact" - logout button - delete button (purges user's UN, email, and password from DB) - delete button must feature a popup confirming user wants to delete their account please refer to wireframes for more details
index: 1.0
text_combine: title + " - " + body
label: non_process
text: user page accessed via navbar button icon features username email and password only mock password and a way to update each item in the db contact button opens email editor subject line user contact logout button delete button purges user s un email and password from db delete button must feature a popup confirming user wants to delete their account please refer to wireframes for more details
binary_label: 0
Unnamed: 0: 103,941
id: 16,613,270,015
type: IssuesEvent
created_at: 2021-06-02 13:58:48
repo: rammatzkvosky/789
repo_url: https://api.github.com/repos/rammatzkvosky/789
action: opened
title: CVE-2018-20225 (High) detected in pip-19.1.1-py2.py3-none-any.whl, pip-19.3.1-py2.py3-none-any.whl
labels: security vulnerability
body: ## CVE-2018-20225 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>pip-19.1.1-py2.py3-none-any.whl</b>, <b>pip-19.3.1-py2.py3-none-any.whl</b></p></summary> <p> <details><summary><b>pip-19.1.1-py2.py3-none-any.whl</b></p></summary> <p>The PyPA recommended tool for installing Python packages.</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/5c/e0/be401c003291b56efc55aeba6a80ab790d3d4cece2778288d65323009420/pip-19.1.1-py2.py3-none-any.whl">https://files.pythonhosted.org/packages/5c/e0/be401c003291b56efc55aeba6a80ab790d3d4cece2778288d65323009420/pip-19.1.1-py2.py3-none-any.whl</a></p> <p>Path to vulnerable library: canner/.poetry/lib/poetry/_vendor/py2.7/virtualenv_support/pip-19.1.1-py2.py3-none-any.whl</p> <p> Dependency Hierarchy: - :x: **pip-19.1.1-py2.py3-none-any.whl** (Vulnerable Library) </details> <details><summary><b>pip-19.3.1-py2.py3-none-any.whl</b></p></summary> <p>The PyPA recommended tool for installing Python packages.</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/00/b6/9cfa56b4081ad13874b0c6f96af8ce16cfbc1cb06bedf8e9164ce5551ec1/pip-19.3.1-py2.py3-none-any.whl">https://files.pythonhosted.org/packages/00/b6/9cfa56b4081ad13874b0c6f96af8ce16cfbc1cb06bedf8e9164ce5551ec1/pip-19.3.1-py2.py3-none-any.whl</a></p> <p>Path to vulnerable library: canner/.poetry/lib/poetry/_vendor/py2.7/virtualenv_support/pip-19.3.1-py2.py3-none-any.whl</p> <p> Dependency Hierarchy: - :x: **pip-19.3.1-py2.py3-none-any.whl** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/rammatzkvosky/789/commit/a94ab1af7b954e06163acb325d4b035831f88835">a94ab1af7b954e06163acb325d4b035831f88835</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> ** DISPUTED ** An issue was discovered in pip (all versions) because it installs the version with the highest version number, even if the user had intended to obtain a private package from a private index. This only affects use of the --extra-index-url option, and exploitation requires that the package does not already exist in the public index (and thus the attacker can put the package there with an arbitrary version number). NOTE: it has been reported that this is intended functionality and the user is responsible for using --extra-index-url securely. <p>Publish Date: 2020-05-08 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-20225>CVE-2018-20225</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Python","packageName":"pip","packageVersion":"19.1.1","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"pip:19.1.1","isMinimumFixVersionAvailable":false},{"packageType":"Python","packageName":"pip","packageVersion":"19.3.1","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"pip:19.3.1","isMinimumFixVersionAvailable":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2018-20225","vulnerabilityDetails":"** DISPUTED ** An issue was discovered in pip (all versions) because it installs the version with the highest version number, even if the user had intended to obtain a private package from a private index. This only affects use of the --extra-index-url option, and exploitation requires that the package does not already exist in the public index (and thus the attacker can put the package there with an arbitrary version number). NOTE: it has been reported that this is intended functionality and the user is responsible for using --extra-index-url securely.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-20225","cvss3Severity":"high","cvss3Score":"7.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"Required","AV":"Local","I":"High"},"extraData":{}}</REMEDIATE> -->
index: True
text_combine: title + " - " + body
label: non_process
text: cve high detected in pip none any whl pip none any whl cve high severity vulnerability vulnerable libraries pip none any whl pip none any whl pip none any whl the pypa recommended tool for installing python packages library home page a href path to vulnerable library canner poetry lib poetry vendor virtualenv support pip none any whl dependency hierarchy x pip none any whl vulnerable library pip none any whl the pypa recommended tool for installing python packages library home page a href path to vulnerable library canner poetry lib poetry vendor virtualenv support pip none any whl dependency hierarchy x pip none any whl vulnerable library found in head commit a href found in base branch master vulnerability details disputed an issue was discovered in pip all versions because it installs the version with the highest version number even if the user had intended to obtain a private package from a private index this only affects use of the extra index url option and exploitation requires that the package does not already exist in the public index and thus the attacker can put the package there with an arbitrary version number note it has been reported that this is intended functionality and the user is responsible for using extra index url securely publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree pip isminimumfixversionavailable false packagetype python packagename pip packageversion packagefilepaths istransitivedependency false dependencytree pip isminimumfixversionavailable false basebranches vulnerabilityidentifier cve vulnerabilitydetails disputed an issue was discovered in pip all versions because it installs the version with the highest version number even if the user had intended to obtain a private package from a private index this only affects use of the extra index url option and exploitation requires that the package does not already exist in the public index and thus the attacker can put the package there with an arbitrary version number note it has been reported that this is intended functionality and the user is responsible for using extra index url securely vulnerabilityurl
binary_label: 0
Unnamed: 0: 424,523
id: 12,312,460,612
type: IssuesEvent
created_at: 2020-05-12 13:58:04
repo: DataDog/dd-trace-dotnet
repo_url: https://api.github.com/repos/DataDog/dd-trace-dotnet
action: closed
title: Setting LoaderOptimization reg key prevents IIS from starting
labels: area:integrations/aspnet priority:high type:bug
body: **Describe the bug** With APM >= 1.12 you can no longer set the `LoaderOptimization` registry key to work around #475/#592 because you get error ``` Failed to initialize the AppDomain:/LM/W3SVC/1/ROOT Exception: System.InvalidOperationException Message: The configuration system has already been initialized. StackTrace: at System.Configuration.ConfigurationManager.SetConfigurationSystem(IInternalConfigSystem configSystem, Boolean initComplete) at System.Web.Configuration.HttpConfigurationSystem.EnsureInit(IConfigMapPath configMapPath, Boolean listenToFileChanges, Boolean initComplete) at System.Web.Hosting.HostingEnvironment.Initialize(ApplicationManager appManager, IApplicationHost appHost, IConfigMapPathFactory configMapPathFactory, HostingEnvironmentParameters hostingParameters, PolicyLevel policyLevel, Exception appDomainCreationException) at System.Web.Hosting.HostingEnvironment.Initialize(ApplicationManager appManager, IApplicationHost appHost, IConfigMapPathFactory configMapPathFactory, HostingEnvironmentParameters hostingParameters, PolicyLevel policyLevel) at System.Web.Hosting.HostingEnvironment.Initialize(ApplicationManager appManager, IApplicationHost appHost, IConfigMapPathFactory configMapPathFactory, HostingEnvironmentParameters hostingParameters, PolicyLevel policyLevel) at System.Web.Hosting.ApplicationManager.CreateAppDomainWithHostingEnvironment(String appId, IApplicationHost appHost, HostingEnvironmentParameters hostingParameters) at System.Web.Hosting.ApplicationManager.CreateAppDomainWithHostingEnvironmentAndReportErrors(String appId, IApplicationHost appHost, HostingEnvironmentParameters hostingParameters) ``` **To Reproduce** 1. Install IIS with ASP.Net 1. Install APM/DD Agent 1. Set `HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\LoaderOptimization` to `1` 1. Access http://localhost You'll get a 500 server error. Check app event log for details **Expected behavior** The web app starts **Runtime environment (please complete the following information):** - Instrumentation mode: MSI - Tracer version: 1.13.2 - OS: Windows 2016 - CLR: Default (4.6.2) **Additional context** As an exercise I manually added `Datadog.Trace.AspNet` to the GAC and installed the module into IIS the normal way. Disabling the profiler allowed me to get ASP.Net traces. Attempts to have profiling enabled but the aspnet integration disabled (by adding `DD_DISABLED_INTEGRATIONS=AspNet` to the WAS environment reg key) failed.
index: 1.0
text_combine: title + " - " + body
label: non_process
text: setting loaderoptimization reg key prevents iis from starting describe the bug with apm you can no longer set the loaderoptimization registry key to work around because you get error failed to initialize the appdomain lm root exception system invalidoperationexception message the configuration system has already been initialized stacktrace at system configuration configurationmanager setconfigurationsystem iinternalconfigsystem configsystem boolean initcomplete at system web configuration httpconfigurationsystem ensureinit iconfigmappath configmappath boolean listentofilechanges boolean initcomplete at system web hosting hostingenvironment initialize applicationmanager appmanager iapplicationhost apphost iconfigmappathfactory configmappathfactory hostingenvironmentparameters hostingparameters policylevel policylevel exception appdomaincreationexception at system web hosting hostingenvironment initialize applicationmanager appmanager iapplicationhost apphost iconfigmappathfactory configmappathfactory hostingenvironmentparameters hostingparameters policylevel policylevel at system web hosting hostingenvironment initialize applicationmanager appmanager iapplicationhost apphost iconfigmappathfactory configmappathfactory hostingenvironmentparameters hostingparameters policylevel policylevel at system web hosting applicationmanager createappdomainwithhostingenvironment string appid iapplicationhost apphost hostingenvironmentparameters hostingparameters at system web hosting applicationmanager createappdomainwithhostingenvironmentandreporterrors string appid iapplicationhost apphost hostingenvironmentparameters hostingparameters to reproduce install iis with asp net install apm dd agent set hkey local machine software microsoft netframework loaderoptimization to access you ll get a server error check app event log for details expected behavior the web app starts runtime environment please complete the following information instrumentation mode msi tracer version os windows clr default additional context as an exercise i manually added datadog trace aspnet to the gac and installed the module into iis the normal way disabling the profiler allowed me to get asp net traces attempts to have profiling enabled but the aspnet integration disabled by adding dd disabled integrations aspnet to the was environment reg key failed
binary_label: 0
Unnamed: 0: 14,774
id: 18,049,787,085
type: IssuesEvent
created_at: 2021-09-19 14:44:54
repo: shirou/gopsutil
repo_url: https://api.github.com/repos/shirou/gopsutil
action: closed
title: Does not compile for Arm Windows due to go-ole
labels: os:windows package:process package:cpu
body: **Describe the bug** Compiling for Windows Arm fails due to indirect dependency on go-ole, which appears to be win32-only. **To Reproduce** ```go package main import ( "fmt" "os" "github.com/shirou/gopsutil/process" ) func main() { currentPid := os.Getpid() myself, _ := process.NewProcess(int32(currentPid)) _, err := myself.CPUPercent() if err != nil { fmt.Printf("CPU Percent: %s\n", err) } } ``` ```bash $ GOOS=windows GOARCH=arm go build . go: downloading github.com/shirou/gopsutil v2.20.7+incompatible go: extracting github.com/shirou/gopsutil v2.20.7+incompatible go: downloading github.com/StackExchange/wmi v0.0.0-20190523213315-cbe66965904d go: downloading golang.org/x/sys v0.0.0-20200808120158-1030fc2bf1d9 go: extracting github.com/StackExchange/wmi v0.0.0-20190523213315-cbe66965904d go: downloading github.com/go-ole/go-ole v1.2.4 go: extracting github.com/go-ole/go-ole v1.2.4 go: extracting golang.org/x/sys v0.0.0-20200808120158-1030fc2bf1d9 go: finding github.com/shirou/gopsutil v2.20.7+incompatible go: finding golang.org/x/sys v0.0.0-20200808120158-1030fc2bf1d9 go: finding github.com/StackExchange/wmi v0.0.0-20190523213315-cbe66965904d go: finding github.com/go-ole/go-ole v1.2.4 # github.com/go-ole/go-ole ../go/pkg/mod/github.com/go-ole/go-ole@v1.2.4/com.go:238:21: undefined: VARIANT ../go/pkg/mod/github.com/go-ole/go-ole@v1.2.4/com.go:247:22: undefined: VARIANT ../go/pkg/mod/github.com/go-ole/go-ole@v1.2.4/connect.go:79:72: undefined: VARIANT ../go/pkg/mod/github.com/go-ole/go-ole@v1.2.4/connect.go:90:76: undefined: VARIANT ../go/pkg/mod/github.com/go-ole/go-ole@v1.2.4/connect.go:105:69: undefined: VARIANT ../go/pkg/mod/github.com/go-ole/go-ole@v1.2.4/connect.go:115:73: undefined: VARIANT ../go/pkg/mod/github.com/go-ole/go-ole@v1.2.4/connect.go:129:69: undefined: VARIANT ../go/pkg/mod/github.com/go-ole/go-ole@v1.2.4/connect.go:139:73: undefined: VARIANT ../go/pkg/mod/github.com/go-ole/go-ole@v1.2.4/connect.go:173:84: undefined: VARIANT ../go/pkg/mod/github.com/go-ole/go-ole@v1.2.4/idispatch.go:26:90: undefined: VARIANT ../go/pkg/mod/github.com/go-ole/go-ole@v1.2.4/idispatch.go:26:90: too many errors ``` **Expected behavior** A clean compilation **Environment (please complete the following information):** I'm cross-compiling, as I don't have a straightforward way to test the build on Arm directly, and our toolchain builds all cross-compile on Linux at the moment. - [ ] Windows: [paste the result of `ver`] - [x] Linux: ``` "Ubuntu 20.04.1 LTS" Linux myhostname 5.4.0-42-generic #46-Ubuntu SMP Fri Jul 10 00:24:02 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux ``` - [x] Mac OS: [paste the result of `sw_vers` and `uname -a` ``` ProductName: Mac OS X ProductVersion: 10.15.6 BuildVersion: 19G73 Darwin myhostname.local 19.6.0 Darwin Kernel Version 19.6.0: Sun Jul 5 00:43:10 PDT 2020; root:xnu-6153.141.1~9/RELEASE_X86_64 x86_64 ``` - [ ] FreeBSD: [paste the result of `freebsd-version -k -r -u` and `uname -a`] - [ ] OpenBSD: [paste the result of `uname -a`] **Additional context** [Cross-compiling? Paste the command you are using to cross-compile and the result of the corresponding `go env`] Linux `go env` ``` GO111MODULE="" GOARCH="amd64" GOBIN="" GOCACHE="/home/simon/.cache/go-build" GOENV="/home/simon/.config/go/env" GOEXE="" GOFLAGS="" GOHOSTARCH="amd64" GOHOSTOS="linux" GONOPROXY="" GONOSUMDB="" GOOS="linux" GOPATH="/home/simon/go" GOPRIVATE="" GOPROXY="https://proxy.golang.org,direct" GOROOT="/usr/lib/go-1.13" GOSUMDB="sum.golang.org" GOTMPDIR="" GOTOOLDIR="/usr/lib/go-1.13/pkg/tool/linux_amd64" GCCGO="gccgo" AR="ar" CC="gcc" CXX="g++" CGO_ENABLED="1" GOMOD="/home/simon/gopsutil-mvp/go.mod" CGO_CFLAGS="-g -O2" CGO_CPPFLAGS="" CGO_CXXFLAGS="-g -O2" CGO_FFLAGS="-g -O2" CGO_LDFLAGS="-g -O2" PKG_CONFIG="pkg-config" GOGCCFLAGS="-fPIC -m64 -pthread -fmessage-length=0 -fdebug-prefix-map=/tmp/go-build453655880=/tmp/go-build -gno-record-gcc-switches" ``` Mac `go env`: ``` GO111MODULE="" GOARCH="amd64" GOBIN="" GOCACHE="/Users/sfraser/Library/Caches/go-build" GOENV="/Users/sfraser/Library/Application Support/go/env" GOEXE="" GOFLAGS="" GOHOSTARCH="amd64" GOHOSTOS="darwin" GOINSECURE="" GONOPROXY="" GONOSUMDB="" GOOS="darwin" GOPATH="/Users/sfraser/gosrc" GOPRIVATE="" GOPROXY="https://proxy.golang.org,direct" GOROOT="/usr/local/Cellar/go/1.14.6/libexec" GOSUMDB="sum.golang.org" GOTMPDIR="" GOTOOLDIR="/usr/local/Cellar/go/1.14.6/libexec/pkg/tool/darwin_amd64" GCCGO="gccgo" AR="ar" CC="clang" CXX="clang++" CGO_ENABLED="1" GOMOD="/Users/sfraser/github/srfraser/gopsutil-mvp/go.mod" CGO_CFLAGS="-g -O2" CGO_CPPFLAGS="" CGO_CXXFLAGS="-g -O2" CGO_FFLAGS="-g -O2" CGO_LDFLAGS="-g -O2" PKG_CONFIG="pkg-config" GOGCCFLAGS="-fPIC -m64 -pthread -fno-caret-diagnostics -Qunused-arguments -fmessage-length=0 -fdebug-prefix-map=/var/folders/g8/rv_wx46d5w10phdnqyw2fvt80000gn/T/go-build183372748=/tmp/go-build -gno-record-gcc-switches -fno-common" ```
index: 1.0
text_combine: title + " - " + body
label: process
text: does not compile for arm windows due to go ole describe the bug compiling for windows arm fails due to indirect dependency on go ole which appears to be only to reproduce go package main import fmt os github com shirou gopsutil process func main currentpid os getpid myself process newprocess currentpid err myself cpupercent if err nil fmt printf cpu percent s n err bash goos windows goarch arm go build go downloading github com shirou gopsutil incompatible go extracting github com shirou gopsutil incompatible go downloading github com stackexchange wmi go downloading golang org x sys go extracting github com stackexchange wmi go downloading github com go ole go ole go extracting github com go ole go ole go extracting golang org x sys go finding github com shirou gopsutil incompatible go finding golang org x sys go finding github com stackexchange wmi go finding github com go ole go ole github com go ole go ole go pkg mod github com go ole go ole com go undefined variant go pkg mod github com go ole go ole com go undefined variant go pkg mod github com go ole go ole connect go undefined variant go pkg mod github com go ole go ole connect go undefined variant go pkg mod github com go ole go ole connect go undefined variant go pkg mod github com go ole go ole connect go undefined variant go pkg mod github com go ole go ole connect go undefined variant go pkg mod github com go ole go ole connect go undefined variant go pkg mod github com go ole go ole connect go undefined variant go pkg mod github com go ole go ole idispatch go undefined variant go pkg mod github com go ole go ole idispatch go too many errors expected behavior a clean compilation environment please complete the following information i m cross compiling as i don t have a straightforward way to test the build on arm directly and our toolchain builds all cross compile on linux at the moment windows linux ubuntu lts linux myhostname generic ubuntu smp fri jul utc gnu linux mac os paste the result of sw vers and uname a productname mac os x productversion buildversion darwin myhostname local darwin kernel version sun jul pdt root xnu release freebsd openbsd additional context linux go env goarch gobin gocache home simon cache go build goenv home simon config go env goexe goflags gohostarch gohostos linux gonoproxy gonosumdb goos linux gopath home simon go goprivate goproxy goroot usr lib go gosumdb sum golang org gotmpdir gotooldir usr lib go pkg tool linux gccgo gccgo ar ar cc gcc cxx g cgo enabled gomod home simon gopsutil mvp go mod cgo cflags g cgo cppflags cgo cxxflags g cgo fflags g cgo ldflags g pkg config pkg config gogccflags fpic pthread fmessage length fdebug prefix map tmp go tmp go build gno record gcc switches mac go env goarch gobin gocache users sfraser library caches go build goenv users sfraser library application support go env goexe goflags gohostarch gohostos darwin goinsecure gonoproxy gonosumdb goos darwin gopath users sfraser gosrc goprivate goproxy goroot usr local cellar go libexec gosumdb sum golang org gotmpdir gotooldir usr local cellar go libexec pkg tool darwin gccgo gccgo ar ar cc clang cxx clang cgo enabled gomod users sfraser github srfraser gopsutil mvp go mod cgo cflags g cgo cppflags cgo cxxflags g cgo fflags g cgo ldflags g pkg config pkg config gogccflags fpic pthread fno caret diagnostics qunused arguments fmessage length fdebug prefix map var folders rv t go tmp go build gno record gcc switches fno common
binary_label: 1
636,564
20,603,098,564
IssuesEvent
2022-03-06 15:21:32
chrisreddington/CV
https://api.github.com/repos/chrisreddington/CV
closed
Refactor skills section
Type/Enhancement Community/HelpWanted Community/GoodFirstIssue Priority/Critical Size/Large
- [x] Create skills data section - [x] Separate skills into its own partial
1.0
Refactor skills section - - [x] Create skills data section - [x] Separate skills into its own partial
non_process
refactor skills section create skills data section separate skills into its own partial
0
9,400
12,399,017,255
IssuesEvent
2020-05-21 03:45:22
googleapis/nodejs-spanner
https://api.github.com/repos/googleapis/nodejs-spanner
opened
Run the system tests against the Cloud Spanner Emulator
api: spanner type: process
We want to run the system tests against the [Cloud Spanner Emulator](https://cloud.google.com/spanner/docs/emulator) on presubmits. Reasons for doing so are summarized nicely [here](https://github.com/googleapis/google-cloud-php/pull/2930#issuecomment-622212337).
1.0
Run the system tests against the Cloud Spanner Emulator - We want to run the system tests against the [Cloud Spanner Emulator](https://cloud.google.com/spanner/docs/emulator) on presubmits. Reasons for doing so are summarized nicely [here](https://github.com/googleapis/google-cloud-php/pull/2930#issuecomment-622212337).
process
run the system tests against the cloud spanner emulator we want to run the system tests against the on presubmits reasons for doing so are summarized nicely
1
248,941
26,867,478,162
IssuesEvent
2023-02-04 03:10:58
kxxt/kxxt-website
https://api.github.com/repos/kxxt/kxxt-website
closed
CVE-2022-25881 (Medium) detected in http-cache-semantics-4.1.0.tgz - autoclosed
security vulnerability
## CVE-2022-25881 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>http-cache-semantics-4.1.0.tgz</b></p></summary> <p>Parses Cache-Control and other headers. Helps building correct HTTP caches and proxies</p> <p>Library home page: <a href="https://registry.npmjs.org/http-cache-semantics/-/http-cache-semantics-4.1.0.tgz">https://registry.npmjs.org/http-cache-semantics/-/http-cache-semantics-4.1.0.tgz</a></p> <p> Dependency Hierarchy: - gatsby-5.3.3.tgz (Root Library) - got-11.8.6.tgz - cacheable-request-7.0.2.tgz - :x: **http-cache-semantics-4.1.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/kxxt/kxxt-website/commit/37f8543da5164a1a7ef318756aa0eac1c5e89a09">37f8543da5164a1a7ef318756aa0eac1c5e89a09</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> This affects versions of the package http-cache-semantics before 4.1.1. The issue can be exploited via malicious request header values sent to a server, when that server reads the cache policy from the request using this library. 
<p>Publish Date: 2023-01-31 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-25881>CVE-2022-25881</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.cve.org/CVERecord?id=CVE-2022-25881">https://www.cve.org/CVERecord?id=CVE-2022-25881</a></p> <p>Release Date: 2023-01-31</p> <p>Fix Resolution: http-cache-semantics - 4.1.1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2022-25881 (Medium) detected in http-cache-semantics-4.1.0.tgz - autoclosed - ## CVE-2022-25881 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>http-cache-semantics-4.1.0.tgz</b></p></summary> <p>Parses Cache-Control and other headers. Helps building correct HTTP caches and proxies</p> <p>Library home page: <a href="https://registry.npmjs.org/http-cache-semantics/-/http-cache-semantics-4.1.0.tgz">https://registry.npmjs.org/http-cache-semantics/-/http-cache-semantics-4.1.0.tgz</a></p> <p> Dependency Hierarchy: - gatsby-5.3.3.tgz (Root Library) - got-11.8.6.tgz - cacheable-request-7.0.2.tgz - :x: **http-cache-semantics-4.1.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/kxxt/kxxt-website/commit/37f8543da5164a1a7ef318756aa0eac1c5e89a09">37f8543da5164a1a7ef318756aa0eac1c5e89a09</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> This affects versions of the package http-cache-semantics before 4.1.1. The issue can be exploited via malicious request header values sent to a server, when that server reads the cache policy from the request using this library. 
<p>Publish Date: 2023-01-31 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-25881>CVE-2022-25881</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.cve.org/CVERecord?id=CVE-2022-25881">https://www.cve.org/CVERecord?id=CVE-2022-25881</a></p> <p>Release Date: 2023-01-31</p> <p>Fix Resolution: http-cache-semantics - 4.1.1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in http cache semantics tgz autoclosed cve medium severity vulnerability vulnerable library http cache semantics tgz parses cache control and other headers helps building correct http caches and proxies library home page a href dependency hierarchy gatsby tgz root library got tgz cacheable request tgz x http cache semantics tgz vulnerable library found in head commit a href found in base branch master vulnerability details this affects versions of the package http cache semantics before the issue can be exploited via malicious request header values sent to a server when that server reads the cache policy from the request using this library publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution http cache semantics step up your open source security game with mend
0
28,573
13,751,717,266
IssuesEvent
2020-10-06 13:41:39
cartographer-project/cartographer
https://api.github.com/repos/cartographer-project/cartographer
closed
Cached precomputation grids of map are problematic for RAM efficiency in pure localization mode
performance
_[This is a ticket to explain the motivation behind 2 upcoming PRs.]_ Construction of branch-and-bound submap scan matchers used for constraint search is dispatched dynamically, i.e. only when needed for individual submaps. They are kept afterwards. This is efficient because the precomputed grids at different resolutions only need to be computed once. However, in localization mode this means that we construct and collect more and more of those precomputed grids - proportional to the explored area and bounded by the max. number of submaps in the frozen map. (driving over the same submaps multiple times doesn’t add new ones). What I did to tackle this: * metric for counting these matchers #1738 * garbage collection of matchers that aren't needed anymore #1745 The garbage collector has been saving us a large chunk of memory during 24/7 operation in large maps since we deployed it at Magazino last year. Here's a comparison of two simulation runs we did, where the robot explores the whole area of a large map in 2D localization mode. ![image](https://user-images.githubusercontent.com/8985495/90880616-07ea5f80-e3a9-11ea-99b0-b9878238dbdd.png) (time axis don't match, but roughly corresponds to explored area in both cases)
True
Cached precomputation grids of map are problematic for RAM efficiency in pure localization mode - _[This is a ticket to explain the motivation behind 2 upcoming PRs.]_ Construction of branch-and-bound submap scan matchers used for constraint search is dispatched dynamically, i.e. only when needed for individual submaps. They are kept afterwards. This is efficient because the precomputed grids at different resolutions only need to be computed once. However, in localization mode this means that we construct and collect more and more of those precomputed grids - proportional to the explored area and bounded by the max. number of submaps in the frozen map. (driving over the same submaps multiple times doesn’t add new ones). What I did to tackle this: * metric for counting these matchers #1738 * garbage collection of matchers that aren't needed anymore #1745 The garbage collector has been saving us a large chunk of memory during 24/7 operation in large maps since we deployed it at Magazino last year. Here's a comparison of two simulation runs we did, where the robot explores the whole area of a large map in 2D localization mode. ![image](https://user-images.githubusercontent.com/8985495/90880616-07ea5f80-e3a9-11ea-99b0-b9878238dbdd.png) (time axis don't match, but roughly corresponds to explored area in both cases)
non_process
cached precomputation grids of map are problematic for ram efficiency in pure localization mode construction of branch and bound submap scan matchers used for constraint search is dispatched dynamically i e only when needed for individual submaps they are kept afterwards this is efficient because the precomputed grids at different resolutions only need to be computed once however in localization mode this means that we construct and collect more and more of those precomputed grids proportional to the explored area and bounded by the max number of submaps in the frozen map driving over the same submaps multiple times doesn’t add new ones what i did to tackle this metric for counting these matchers garbage collection of matchers that aren t needed anymore the garbage collector has been saving us a large chunk of memory during operation in large maps since we deployed it at magazino last year here s a comparison of two simulation runs we did where the robot explores the whole area of a large map in localization mode time axis don t match but roughly corresponds to explored area in both cases
0
14,800
18,090,403,465
IssuesEvent
2021-09-22 00:36:16
yandali-damian/LIM015-social-network
https://api.github.com/repos/yandali-damian/LIM015-social-network
closed
HU-001
Process
Como usuario nuevo tengo que crear una cuenta con correo electrónico y contraseña (válidos) para iniciar sesión. - Prototipado > - [x] Definición de colores > - [x] Selección de imágenes > - [x] Tipo de letra > - [x] Posicionamiento > - [x] Tamaño - Tarea > - [x] Maquetar la primera vista > - [x] Botones de ingresar y registrarse > - [x] Inputs para ingresar de correo y contraseña > - [x] Inputs para registrase nombre, correo, contraseña y confirma contraseña > - [x] Radio para sexo > - [ ] Función del login > - [ ] Interacción del botón ingresar con la vista de inicio > - [ ] Función del registro de datos > - [x] Interacción del botón registrarse con la vista de registro > - [ ] Agregar avatar
1.0
HU-001 - Como usuario nuevo tengo que crear una cuenta con correo electrónico y contraseña (válidos) para iniciar sesión. - Prototipado > - [x] Definición de colores > - [x] Selección de imágenes > - [x] Tipo de letra > - [x] Posicionamiento > - [x] Tamaño - Tarea > - [x] Maquetar la primera vista > - [x] Botones de ingresar y registrarse > - [x] Inputs para ingresar de correo y contraseña > - [x] Inputs para registrase nombre, correo, contraseña y confirma contraseña > - [x] Radio para sexo > - [ ] Función del login > - [ ] Interacción del botón ingresar con la vista de inicio > - [ ] Función del registro de datos > - [x] Interacción del botón registrarse con la vista de registro > - [ ] Agregar avatar
process
hu como usuario nuevo tengo que crear una cuenta con correo electrónico y contraseña válidos para iniciar sesión prototipado definición de colores selección de imágenes tipo de letra posicionamiento tamaño tarea maquetar la primera vista botones de ingresar y registrarse inputs para ingresar de correo y contraseña inputs para registrase nombre correo contraseña y confirma contraseña radio para sexo función del login interacción del botón ingresar con la vista de inicio función del registro de datos interacción del botón registrarse con la vista de registro agregar avatar
1
192,366
6,849,084,129
IssuesEvent
2017-11-13 20:48:21
minio/minio-dotnet
https://api.github.com/repos/minio/minio-dotnet
closed
Implement RemoveObjects API to remove objects in batches
priority: medium
Refer #467 of Minio-Java. Content-Md5 needs to be set for multi object deletes
1.0
Implement RemoveObjects API to remove objects in batches - Refer #467 of Minio-Java. Content-Md5 needs to be set for multi object deletes
non_process
implement removeobjects api to remove objects in batches refer of minio java content needs to be set for multi object deletes
0
17,730
23,637,557,968
IssuesEvent
2022-08-25 14:24:22
threefoldtech/tfgrid_dashboard
https://api.github.com/repos/threefoldtech/tfgrid_dashboard
closed
'More Details' of DAO proposal systematically gives spinning screens
process_wontfix type_bug
I click on 'More Details' in a DAO proposal, I get a screen with permanent spinning wheel (both qa and testnet, and for all proposals made). <img width="1512" alt="Screenshot 2022-07-28 at 16 29 32" src="https://user-images.githubusercontent.com/30384423/181556861-28b54a3d-501a-4464-9cdc-485e8fe56ca4.png">
1.0
'More Details' of DAO proposal systematically gives spinning screens - I click on 'More Details' in a DAO proposal, I get a screen with permanent spinning wheel (both qa and testnet, and for all proposals made). <img width="1512" alt="Screenshot 2022-07-28 at 16 29 32" src="https://user-images.githubusercontent.com/30384423/181556861-28b54a3d-501a-4464-9cdc-485e8fe56ca4.png">
process
more details of dao proposal systematically gives spinning screens i click on more details in a dao proposal i get a screen with permanent spinning wheel both qa and testnet and for all proposals made img width alt screenshot at src
1
454,963
13,109,790,942
IssuesEvent
2020-08-04 19:22:35
carbon-design-system/carbon-addons-iot-react
https://api.github.com/repos/carbon-design-system/carbon-addons-iot-react
opened
[Dashboard cards] Need "pop out" expand icon in card header
offering: health offering: predict status: needs priority :inbox_tray: status: needs triage :mag: type: enhancement :bulb:
<!-- Use this template if you want to request a new feature, or a change to an existing feature. If you'd like to request an entirely new component, please use the component request template instead. If you are reporting a bug or problem, please use the bug template instead. --> ### Summary Need "expand" option in dashboard card header. ![image](https://user-images.githubusercontent.com/51330315/89335161-a6ff1f80-d665-11ea-8b9a-932911685c53.png) **Additional context** Health & Predict are following the pattern that Monitor set for expanding cards to show a larger view of the data, but it looks like the expand button isn't available in the IoT cards. We need to expand a value card first, but also need this option on chart cards. Click on the expand icon in the upper right ![image](https://user-images.githubusercontent.com/51330315/89335270-cac26580-d665-11ea-8a5c-81962368a483.png) to open a dialog that has additional info ![image](https://user-images.githubusercontent.com/51330315/89335311-d7df5480-d665-11ea-8a87-aa91a52bbc42.png) ### Specific timeline issues / requests Do you want this work within a specific time period? Is it related to an upcoming release? Health needs this for our October 2020 release. Lack of this button blocks creation of the dialog that is invoked when the user clicks the button. ### Available extra resources What resources do you have to assist this effort? The Predict team hard-coded this into their chart cards, copying the Monitor team, and should hopefully be able to contribute it back.
1.0
[Dashboard cards] Need "pop out" expand icon in card header - <!-- Use this template if you want to request a new feature, or a change to an existing feature. If you'd like to request an entirely new component, please use the component request template instead. If you are reporting a bug or problem, please use the bug template instead. --> ### Summary Need "expand" option in dashboard card header. ![image](https://user-images.githubusercontent.com/51330315/89335161-a6ff1f80-d665-11ea-8b9a-932911685c53.png) **Additional context** Health & Predict are following the pattern that Monitor set for expanding cards to show a larger view of the data, but it looks like the expand button isn't available in the IoT cards. We need to expand a value card first, but also need this option on chart cards. Click on the expand icon in the upper right ![image](https://user-images.githubusercontent.com/51330315/89335270-cac26580-d665-11ea-8a5c-81962368a483.png) to open a dialog that has additional info ![image](https://user-images.githubusercontent.com/51330315/89335311-d7df5480-d665-11ea-8a87-aa91a52bbc42.png) ### Specific timeline issues / requests Do you want this work within a specific time period? Is it related to an upcoming release? Health needs this for our October 2020 release. Lack of this button blocks creation of the dialog that is invoked when the user clicks the button. ### Available extra resources What resources do you have to assist this effort? The Predict team hard-coded this into their chart cards, copying the Monitor team, and should hopefully be able to contribute it back.
non_process
need pop out expand icon in card header use this template if you want to request a new feature or a change to an existing feature if you d like to request an entirely new component please use the component request template instead if you are reporting a bug or problem please use the bug template instead summary need expand option in dashboard card header additional context health predict are following the pattern that monitor set for expanding cards to show a larger view of the data but it looks like the expand button isn t available in the iot cards we need to expand a value card first but also need this option on chart cards click on the expand icon in the upper right to open a dialog that has additional info specific timeline issues requests do you want this work within a specific time period is it related to an upcoming release health needs this for our october release lack of this button blocks creation of the dialog that is invoked when the user clicks the button available extra resources what resources do you have to assist this effort the predict team hard coded this into their chart cards copying the monitor team and should hopefully be able to contribute it back
0
348,429
10,442,325,301
IssuesEvent
2019-09-18 12:51:57
input-output-hk/jormungandr
https://api.github.com/repos/input-output-hk/jormungandr
closed
Chain head storage tag not kept up to date
Priority - High bug
Currently, there is a tag in storage for the main branch, but it is only ever set to the block0's hash. The tag: https://github.com/input-output-hk/jormungandr/blob/646b07116e2f40be66aee51df5e353c56b4fdd9e/jormungandr/src/blockchain/chain.rs#L106 Expected behavior: The tag to be kept up to date when the chain tip changes. comments from discussion with @NicolasDP about it: > Every time the tip is updated it should update the storage tag for this too. > It’s only done for the block0 hash.
1.0
Chain head storage tag not kept up to date - Currently, there is a tag in storage for the main branch, but it is only ever set to the block0's hash. The tag: https://github.com/input-output-hk/jormungandr/blob/646b07116e2f40be66aee51df5e353c56b4fdd9e/jormungandr/src/blockchain/chain.rs#L106 Expected behavior: The tag to be kept up to date when the chain tip changes. comments from discussion with @NicolasDP about it: > Every time the tip is updated it should update the storage tag for this too. > It’s only done for the block0 hash.
non_process
chain head storage tag not kept up to date currently there is a tag in storage for the main branch but it is only ever set to the s hash the tag expected behavior the tag to be kept up to date when the chain tip changes comments from discussion with nicolasdp about it every time the tip is updated it should update the storage tag for this too it’s only done for the hash
0
74,646
20,259,789,910
IssuesEvent
2022-02-15 05:42:08
GoogleCloudPlatform/fda-mystudies
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
opened
[SB] Eligibility section can not be marked as complete in the below scenario
Bug P1 Study builder
**Steps:** 1. Edit the study. 2. Naviagte to eligibility section. 3. In token validation > Remove the instruction text and then click on save button. 4. Now navigate to eligibility test > add eligibility test questions. 5. Click on mark as complete button and Observe. **AR:** Eligibility section can not be marked as complete in the above scenario. **ER:** Eligibility section should be marked as complete in the above scenario.
1.0
[SB] Eligibility section can not be marked as complete in the below scenario - **Steps:** 1. Edit the study. 2. Naviagte to eligibility section. 3. In token validation > Remove the instruction text and then click on save button. 4. Now navigate to eligibility test > add eligibility test questions. 5. Click on mark as complete button and Observe. **AR:** Eligibility section can not be marked as complete in the above scenario. **ER:** Eligibility section should be marked as complete in the above scenario.
non_process
eligibility section can not be marked as complete in the below scenario steps edit the study naviagte to eligibility section in token validation remove the instruction text and then click on save button now navigate to eligibility test add eligibility test questions click on mark as complete button and observe ar eligibility section can not be marked as complete in the above scenario er eligibility section should be marked as complete in the above scenario
0
2,464
5,242,913,515
IssuesEvent
2017-01-31 19:18:35
opentrials/opentrials
https://api.github.com/repos/opentrials/opentrials
closed
Introduce `database.identifiers` table
API Processors refactoring
# Overview For now we handle identifiers in `database.records.identifiers` jsonb dict and it's the most important part of our deduplication system. We have following problems with it: - can't have a few identifiers from one source (because it's dict) [#299] - should search using query like `identifiers @> '{"nct": "NCT124123423"}' (we need key+value) because of the way how Postgres GIN indexes works. - can't make it array because we need `source_id` keys Introducing normalized `database.identifiers` table will solve this problems and open some other possibilities to work with identifiers.
1.0
Introduce `database.identifiers` table - # Overview For now we handle identifiers in `database.records.identifiers` jsonb dict and it's the most important part of our deduplication system. We have following problems with it: - can't have a few identifiers from one source (because it's dict) [#299] - should search using query like `identifiers @> '{"nct": "NCT124123423"}' (we need key+value) because of the way how Postgres GIN indexes works. - can't make it array because we need `source_id` keys Introducing normalized `database.identifiers` table will solve this problems and open some other possibilities to work with identifiers.
process
introduce database identifiers table overview for now we handle identifiers in database records identifiers jsonb dict and it s the most important part of our deduplication system we have following problems with it can t have a few identifiers from one source because it s dict should search using query like identifiers nct we need key value because of the way how postgres gin indexes works can t make it array because we need source id keys introducing normalized database identifiers table will solve this problems and open some other possibilities to work with identifiers
1
20,068
26,557,884,178
IssuesEvent
2023-01-20 13:36:12
NixOS/nix
https://api.github.com/repos/NixOS/nix
closed
Pull request checklist
developer-experience process
**Is your feature request related to a problem? Please describe.** Checklists help with guiding a process to completion. They ensure quality and make their users confident, because they won't forget certain aspects of the process. **Describe the solution you'd like** Add a pull requests checklist to the pull request template and/or "maintainer handbook" (the manual or `maintainers/*.md`) <!-- Some items: - [ ] is the idea good? has it been discussed by the Nix team? - [ ] unit tests - [ ] functional tests (`tests/**.sh`) - [ ] documentation in the manual - [ ] documentation in the code (if necessary; ideally code is already clear) - [ ] documentation in the commit message (why was this change made? for future reference when maintaining the code) - [ ] documentation in the changelog (to announce features and fixes to existing users who might have to do something to finally solve their problem, and to summarize the development history) --> **Describe alternatives you've considered** - Keep forgetting to do certain things leading to - bloat - regressions - forgotten documentation - uncertainty as to why changes to the code were made - prs getting slowed by delays, reminders and more delays Have a github action that posts the checklist to each PR. This is a bit more robust as contributors may mess with the checklist. **Additional context** - I forgot to ask for documentation in #7314. **Priorities** Add :+1: to [issues you find important](https://github.com/NixOS/nix/issues?q=is%3Aissue+is%3Aopen+sort%3Areactions-%2B1-desc).
1.0
Pull request checklist - **Is your feature request related to a problem? Please describe.** Checklists help with guiding a process to completion. They ensure quality and make their users confident, because they won't forget certain aspects of the process. **Describe the solution you'd like** Add a pull requests checklist to the pull request template and/or "maintainer handbook" (the manual or `maintainers/*.md`) <!-- Some items: - [ ] is the idea good? has it been discussed by the Nix team? - [ ] unit tests - [ ] functional tests (`tests/**.sh`) - [ ] documentation in the manual - [ ] documentation in the code (if necessary; ideally code is already clear) - [ ] documentation in the commit message (why was this change made? for future reference when maintaining the code) - [ ] documentation in the changelog (to announce features and fixes to existing users who might have to do something to finally solve their problem, and to summarize the development history) --> **Describe alternatives you've considered** - Keep forgetting to do certain things leading to - bloat - regressions - forgotten documentation - uncertainty as to why changes to the code were made - prs getting slowed by delays, reminders and more delays Have a github action that posts the checklist to each PR. This is a bit more robust as contributors may mess with the checklist. **Additional context** - I forgot to ask for documentation in #7314. **Priorities** Add :+1: to [issues you find important](https://github.com/NixOS/nix/issues?q=is%3Aissue+is%3Aopen+sort%3Areactions-%2B1-desc).
process
pull request checklist is your feature request related to a problem please describe checklists help with guiding a process to completion they ensure quality and make their users confident because they won t forget certain aspects of the process describe the solution you d like add a pull requests checklist to the pull request template and or maintainer handbook the manual or maintainers md some items is the idea good has it been discussed by the nix team unit tests functional tests tests sh documentation in the manual documentation in the code if necessary ideally code is already clear documentation in the commit message why was this change made for future reference when maintaining the code documentation in the changelog to announce features and fixes to existing users who might have to do something to finally solve their problem and to summarize the development history describe alternatives you ve considered keep forgetting to do certain things leading to bloat regressions forgotten documentation uncertainty as to why changes to the code were made prs getting slowed by delays reminders and more delays have a github action that posts the checklist to each pr this is a bit more robust as contributors may mess with the checklist additional context i forgot to ask for documentation in priorities add to
1
3,439
6,537,316,309
IssuesEvent
2017-08-31 21:51:39
pburns96/Revature-VenderBender
https://api.github.com/repos/pburns96/Revature-VenderBender
closed
As a customer, I can view CDs
High Priority Work In Process
Requirements: I can sort CDs by: Year, Artist, Title, Price. I can navigate to the pages of results,
1.0
As a customer, I can view CDs - Requirements: I can sort CDs by: Year, Artist, Title, Price. I can navigate to the pages of results,
process
as a customer i can view cds requirements i can sort cds by year artist title price i can navigate to the pages of results
1
98,938
12,379,601,540
IssuesEvent
2020-05-19 12:46:19
spotify/backstage
https://api.github.com/repos/spotify/backstage
opened
Tabs (primary level navigation)
component design storybook
## 🗒 General Hi! Would love if someone could help us build our primary level navigation, which comes in the form of tabs! ✨Please add this to our [Storybook](http://storybook.backstage.io/) as well. ![tabs](https://user-images.githubusercontent.com/61153904/82327275-6f75c500-99de-11ea-99e0-330d66a83981.gif) <!--- Write a nice note to the community requesting the creation of a new component! --> <!--- Include an image of your component. Bonus points for a GIF! --> ## 💻 Usage ![Tabs](https://user-images.githubusercontent.com/61153904/82327390-9e8c3680-99de-11ea-8ed0-5c9115e8e7c3.png) Horizontal tabs are the first level of navigation within a plugin or entity page on Backstage. First level navigation should be broader, and we recommend keeping the number of first level tabs to less than 8. However, if there are more tabs, we have designed a way for users to view additional tabs by clicking on the caret icon at the very right of the tab area. We recommend that you follow the [usage](https://material.io/components/tabs#usage) and [interaction](https://material-ui.com/components/tabs/) guidelines for tabs that Material has listed. <!--- Tell us what the point of this component/pattern is! How does it help? How should it work? Any rules? --> ## 📐 Specs ![Tabs Specs](https://user-images.githubusercontent.com/61153904/82327412-a77d0800-99de-11ea-9e0b-d5cadcbfdc31.png) <!--- Include images that detail the redlines for your component.--> <!--- Once we get our Figma workspace set up, we'll be posting the Figma files rather than doing specs by hand.--> ## 🔮 Future We're currently brainstorming secondary and tertiary level navigation as well! We'll keep you all posted! 🎉 <!-- Any upcoming, exciting functionality for this component in the future? List that out here. -->
1.0
Tabs (primary level navigation) - ## 🗒 General Hi! Would love if someone could help us build our primary level navigation, which comes in the form of tabs! ✨Please add this to our [Storybook](http://storybook.backstage.io/) as well. ![tabs](https://user-images.githubusercontent.com/61153904/82327275-6f75c500-99de-11ea-99e0-330d66a83981.gif) <!--- Write a nice note to the community requesting the creation of a new component! --> <!--- Include an image of your component. Bonus points for a GIF! --> ## 💻 Usage ![Tabs](https://user-images.githubusercontent.com/61153904/82327390-9e8c3680-99de-11ea-8ed0-5c9115e8e7c3.png) Horizontal tabs are the first level of navigation within a plugin or entity page on Backstage. First level navigation should be broader, and we recommend keeping the number of first level tabs to less than 8. However, if there are more tabs, we have designed a way for users to view additional tabs by clicking on the caret icon at the very right of the tab area. We recommend that you follow the [usage](https://material.io/components/tabs#usage) and [interaction](https://material-ui.com/components/tabs/) guidelines for tabs that Material has listed. <!--- Tell us what the point of this component/pattern is! How does it help? How should it work? Any rules? --> ## 📐 Specs ![Tabs Specs](https://user-images.githubusercontent.com/61153904/82327412-a77d0800-99de-11ea-9e0b-d5cadcbfdc31.png) <!--- Include images that detail the redlines for your component.--> <!--- Once we get our Figma workspace set up, we'll be posting the Figma files rather than doing specs by hand.--> ## 🔮 Future We're currently brainstorming secondary and tertiary level navigation as well! We'll keep you all posted! 🎉 <!-- Any upcoming, exciting functionality for this component in the future? List that out here. -->
non_process
tabs primary level navigation 🗒 general hi would love if someone could help us build our primary level navigation which comes in the form of tabs ✨please add this to our as well 💻 usage horizontal tabs are the first level of navigation within a plugin or entity page on backstage first level navigation should be broader and we recommend keeping the number of first level tabs to less than however if there are more tabs we have designed a way for users to view additional tabs by clicking on the caret icon at the very right of the tab area we recommend that you follow the and guidelines for tabs that material has listed 📐 specs 🔮 future we re currently brainstorming secondary and tertiary level navigation as well we ll keep you all posted 🎉
0
37,316
15,240,861,768
IssuesEvent
2021-02-19 07:28:11
Azure/azure-cli
https://api.github.com/repos/Azure/azure-cli
reopened
Can subnet of kubernetes cluster be used for cosmos db subnet?
AKS Cosmos Service Attention
Can subnet of kubernetes cluster be used for cosmos db subnet? Or do I need to create a new second subnet? [Enter feedback here] --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: ceafa346-5ac6-9a5b-b36b-fb6a91449ae0 * Version Independent ID: 12dc1540-3a61-aece-0ef0-0d4c192d930e * Content: [az cosmosdb network-rule](https://docs.microsoft.com/en-us/cli/azure/cosmosdb/network-rule?view=azure-cli-latest) * Content Source: [latest/docs-ref-autogen/cosmosdb/network-rule.yml](https://github.com/MicrosoftDocs/azure-docs-cli/blob/master/latest/docs-ref-autogen/cosmosdb/network-rule.yml) * Service: **cosmos-db** * GitHub Login: @rloutlaw * Microsoft Alias: **routlaw**
1.0
Can subnet of kubernetes cluster be used for cosmos db subnet? - Can subnet of kubernetes cluster be used for cosmos db subnet? Or do I need to create a new second subnet? [Enter feedback here] --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: ceafa346-5ac6-9a5b-b36b-fb6a91449ae0 * Version Independent ID: 12dc1540-3a61-aece-0ef0-0d4c192d930e * Content: [az cosmosdb network-rule](https://docs.microsoft.com/en-us/cli/azure/cosmosdb/network-rule?view=azure-cli-latest) * Content Source: [latest/docs-ref-autogen/cosmosdb/network-rule.yml](https://github.com/MicrosoftDocs/azure-docs-cli/blob/master/latest/docs-ref-autogen/cosmosdb/network-rule.yml) * Service: **cosmos-db** * GitHub Login: @rloutlaw * Microsoft Alias: **routlaw**
non_process
can subnet of kubernetes cluster be used for cosmos db subnet can subnet of kubernetes cluster be used for cosmos db subnet or do i need to create a new second subnet document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id aece content content source service cosmos db github login rloutlaw microsoft alias routlaw
0
4,482
7,137,825,064
IssuesEvent
2018-01-23 12:27:57
AdguardTeam/AdguardForAndroid
https://api.github.com/repos/AdguardTeam/AdguardForAndroid
closed
com.kbstar.kbbank app - HTTPS filtering issue
SSL compatibility
https://play.google.com/store/apps/details?id=com.kbstar.kbbank > When I use 'kbstar bank' which is korean bank app, the app said 'the > network is not stable please try later' I use https filtering no root.
True
com.kbstar.kbbank app - HTTPS filtering issue - https://play.google.com/store/apps/details?id=com.kbstar.kbbank > When I use 'kbstar bank' which is korean bank app, the app said 'the > network is not stable please try later' I use https filtering no root.
non_process
com kbstar kbbank app https filtering issue when i use kbstar bank which is korean bank app the app said the network is not stable please try later i use https filtering no root
0
6,556
9,648,087,864
IssuesEvent
2019-05-17 15:23:51
MicrosoftDocs/azure-docs
https://api.github.com/repos/MicrosoftDocs/azure-docs
closed
Update wording
assigned-to-author automation/svc doc-bug process-automation/subsvc triaged
Hi. The "From your Automation account, select Workspace from the left page." entry under https://docs.microsoft.com/en-us/azure/automation/automation-solution-vm-management#remove-the-solution is now out-of-date. It should say something like: "In your Automation account, click on "Linked workspace under Related Resources", or something like that, as there isn't a Workspace button on the left page inside an automated account. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 225c9d05-83dd-b006-0025-3753f5ab25bf * Version Independent ID: 9eecef0c-b1cb-1136-faf7-542214492096 * Content: [Start/Stop VMs during off-hours solution](https://docs.microsoft.com/en-us/azure/automation/automation-solution-vm-management#remove-the-solution) * Content Source: [articles/automation/automation-solution-vm-management.md](https://github.com/Microsoft/azure-docs/blob/master/articles/automation/automation-solution-vm-management.md) * Service: **automation** * Sub-service: **process-automation** * GitHub Login: @georgewallace * Microsoft Alias: **gwallace**
1.0
Update wording - Hi. The "From your Automation account, select Workspace from the left page." entry under https://docs.microsoft.com/en-us/azure/automation/automation-solution-vm-management#remove-the-solution is now out-of-date. It should say something like: "In your Automation account, click on "Linked workspace under Related Resources", or something like that, as there isn't a Workspace button on the left page inside an automated account. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 225c9d05-83dd-b006-0025-3753f5ab25bf * Version Independent ID: 9eecef0c-b1cb-1136-faf7-542214492096 * Content: [Start/Stop VMs during off-hours solution](https://docs.microsoft.com/en-us/azure/automation/automation-solution-vm-management#remove-the-solution) * Content Source: [articles/automation/automation-solution-vm-management.md](https://github.com/Microsoft/azure-docs/blob/master/articles/automation/automation-solution-vm-management.md) * Service: **automation** * Sub-service: **process-automation** * GitHub Login: @georgewallace * Microsoft Alias: **gwallace**
process
update wording hi the from your automation account select workspace from the left page entry under is now out of date it should say something like in your automation account click on linked workspace under related resources or something like that as there isn t a workspace button on the left page inside an automated account document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service automation sub service process automation github login georgewallace microsoft alias gwallace
1
107,422
23,410,806,122
IssuesEvent
2022-08-12 17:17:43
objectos/objectos
https://api.github.com/repos/objectos/objectos
reopened
AsciiDoc: support section block
t:feature c:code a:objectos-asciidoc
Dependencies: - [x] #62 ## Test cases - [x] tc04: sections starts after a UL - [ ] tc05: level 'reduction'
1.0
AsciiDoc: support section block - Dependencies: - [x] #62 ## Test cases - [x] tc04: sections starts after a UL - [ ] tc05: level 'reduction'
non_process
asciidoc support section block dependencies test cases sections starts after a ul level reduction
0
4,108
7,056,824,064
IssuesEvent
2018-01-04 14:20:35
pwittchen/swipe
https://api.github.com/repos/pwittchen/swipe
opened
Release 0.2.0
release process
**Initial release notes**: - migrated library to RxJava2.x as a separate artifact on a separate Git branch - updated project dependencies - updated Gradle to 3.x - added Retrolambda to sample Java app **Things to do**: - [ ] `RxJava1.x` branch - [ ] bump library version to 0.2.0 - [ ] upload Archives to Maven Central Repository - [ ] close and release artifact on Nexus - [ ] update JavaDoc on gh-pages - [ ] update `CHANGELOG.md` after Maven Sync - [ ] update download section in `README.md` after Maven Sync - [ ] create new GitHub release - [ ] `RxJava2.x` branch - [ ] bump library version to 0.2.0 - [ ] upload Archives to Maven Central Repository - [ ] close and release artifact on Nexus - [ ] update JavaDoc on gh-pages - [ ] update `CHANGELOG.md` after Maven Sync - [ ] update download section in `README.md` after Maven Sync - [ ] create new GitHub release
1.0
Release 0.2.0 - **Initial release notes**: - migrated library to RxJava2.x as a separate artifact on a separate Git branch - updated project dependencies - updated Gradle to 3.x - added Retrolambda to sample Java app **Things to do**: - [ ] `RxJava1.x` branch - [ ] bump library version to 0.2.0 - [ ] upload Archives to Maven Central Repository - [ ] close and release artifact on Nexus - [ ] update JavaDoc on gh-pages - [ ] update `CHANGELOG.md` after Maven Sync - [ ] update download section in `README.md` after Maven Sync - [ ] create new GitHub release - [ ] `RxJava2.x` branch - [ ] bump library version to 0.2.0 - [ ] upload Archives to Maven Central Repository - [ ] close and release artifact on Nexus - [ ] update JavaDoc on gh-pages - [ ] update `CHANGELOG.md` after Maven Sync - [ ] update download section in `README.md` after Maven Sync - [ ] create new GitHub release
process
release initial release notes migrated library to x as a separate artifact on a separate git branch updated project dependencies updated gradle to x added retrolambda to sample java app things to do x branch bump library version to upload archives to maven central repository close and release artifact on nexus update javadoc on gh pages update changelog md after maven sync update download section in readme md after maven sync create new github release x branch bump library version to upload archives to maven central repository close and release artifact on nexus update javadoc on gh pages update changelog md after maven sync update download section in readme md after maven sync create new github release
1
751,883
26,262,881,334
IssuesEvent
2023-01-06 09:40:45
Public-Health-Scotland/source-linkage-files
https://api.github.com/repos/Public-Health-Scotland/source-linkage-files
closed
Convert (year specific) care homes into a function for processing
Priority: High New Function
The care home scripts found here should follow the format for other functions: Source-Linkage-Files/Production_scripts/Process_extracts - [x] extract_care_homes - [x] extract_care_homes_tests
1.0
Convert (year specific) care homes into a function for processing - The care home scripts found here should follow the format for other functions: Source-Linkage-Files/Production_scripts/Process_extracts - [x] extract_care_homes - [x] extract_care_homes_tests
non_process
convert year specific care homes into a function for processing the care home scripts found here should follow the format for other functions source linkage files production scripts process extracts extract care homes extract care homes tests
0
89,747
10,616,331,638
IssuesEvent
2019-10-12 10:52:50
AjayGoel1830/Election-Dapp-CFD-2019
https://api.github.com/repos/AjayGoel1830/Election-Dapp-CFD-2019
closed
Create a PR Template for Pull Requests
documentation enhancement good first issue hacktoberfest help wanted
Since Issue Template will be created in #5 , Create a nice PR template which may give us sufficient information for a patch.
1.0
Create a PR Template for Pull Requests - Since Issue Template will be created in #5 , Create a nice PR template which may give us sufficient information for a patch.
non_process
create a pr template for pull requests since issue template will be created in create a nice pr template which may give us sufficient information for a patch
0
202,522
15,833,658,791
IssuesEvent
2021-04-06 15:52:53
AY2021S2-CS2103T-T10-1/tp
https://api.github.com/repos/AY2021S2-CS2103T-T10-1/tp
closed
[PE-D] Session date and timing not displayed when a program is added.
documentation
Session date and timing not displayed when a program is added. I believe this is a functionality bug as the UI showed on the User Guide displays the session date and timing, ![image.png](https://raw.githubusercontent.com/github-amanda/ped/main/files/a0394022-6c69-4186-953f-d5fa359d8873.png) From User Guide: ![image.png](https://raw.githubusercontent.com/github-amanda/ped/main/files/cff47104-8835-4ce3-a374-f142aee8e351.png) <!--session: 1617430004446-7140fde4-6e04-4799-86b1-cdc7dd8de126--> ------------- Labels: `severity.High` `type.FunctionalityBug` original: github-amanda/ped#4
1.0
[PE-D] Session date and timing not displayed when a program is added. - Session date and timing not displayed when a program is added. I believe this is a functionality bug as the UI showed on the User Guide displays the session date and timing, ![image.png](https://raw.githubusercontent.com/github-amanda/ped/main/files/a0394022-6c69-4186-953f-d5fa359d8873.png) From User Guide: ![image.png](https://raw.githubusercontent.com/github-amanda/ped/main/files/cff47104-8835-4ce3-a374-f142aee8e351.png) <!--session: 1617430004446-7140fde4-6e04-4799-86b1-cdc7dd8de126--> ------------- Labels: `severity.High` `type.FunctionalityBug` original: github-amanda/ped#4
non_process
session date and timing not displayed when a program is added session date and timing not displayed when a program is added i believe this is a functionality bug as the ui showed on the user guide displays the session date and timing from user guide labels severity high type functionalitybug original github amanda ped
0
10,438
13,220,149,346
IssuesEvent
2020-08-17 11:52:38
bisq-network/proposals
https://api.github.com/repos/bisq-network/proposals
closed
Introduce recurring "Bisq-Calls"
a:proposal re:processes was:approved
> _This is a Bisq Network proposal. Please familiarize yourself with the [submission and review process](https://docs.bisq.network/proposals.html)._ <!-- Please do not remove the text above. --> With 2020 and its necessary changes, attempts have been made and structures have been installed to address issues around how to craft and maintain Bisqs roadmap. The budget has been introduced, the concept of projects has been introduced, and an entity, namely the project management committee, has been installed to triage projects and thus, shape the roadmap of Bisq. However, this committee did not gain members to actually perform as intended, reasons being, among others, the initial overhead of creating project proposals, big projects, a limited work force and, hence, slow progress. All of the above and the lack of communication started to paint a false picture of how things are handled and threatens to result in people loosing interest. By installing a new recurring Bisq call format, we hope to clear things up. # Proposal We propose to revive the ol' dev calls but with a slightly changed format. The new calls are to follow a tight agenda. The agenda is designed to keep it rather compact and focused: - **Review ongoing/completed projects** Get a feeling of what is going on in the project, make ourselves aware of what is being done currently and how things are moving along. Summarize every project by recapping the written project updates - **Adjust priorities of ongoing and new projects** Discuss the priorities of all projects (ongoing and new) and come to a rough consensus what is important and urgent to do next. - **Assign people** Ask for help on the projects to be taken over and driven forward. Find people that are willing to take over project leads as well as people which are willing to support bigger project by eg. help testing stuff if the time comes. 
Note that when a healthy number of high-priority tasks are taken care of already, we can fill our pipelines with less urgent but equally important stuff. In a way, we propose to let the the call participants be the formerly installed project management committee. Thus, we propose the call to follow these ancillary characteristics: - **Open to everyone** Literally everyone is welcome to join the calls. These calls should serve as a brewing ground of _THE_ Bisq team, everyone can just passively suck up the information that is provided, people can join the discussion and bring their arguments into the mix and last but not least, people can stand up and commit themselves to certain projects or tasks. - **Weekly** We propose to start with a weekly call and see how it goes. We can adjust the frequency as needed. - **1h** We aim for a call duration of 1h max. That is to not take up too much time for everybody but have some reserves when settling on priorities just takes a little longer. However, when nothing changed to the previous call (ie. no new projects, priorities stay the same, ...), such a call can be completed in 10min or less. Again, we always can adjust if it seems necessary. - **Meeting minutes** There will be _NO_ recording of the calls, however, there will be meeting minutes. - **Infrastructure** We are to try Jitsi meet as an infrastructure, so everyone with a browser can participate easily. The calls will be tracked in our [event list](https://github.com/bisq-network/events/issues), where you can also find the minutes of past calls and agendas of upcoming calls. There will also be a calendar entry in our [calendar](https://bisq.network/calendar) and participation links will be distributed via the event issue, the calendar and the #general keybase channel. # Open Questions - time/date. maybe thursdays, 13:00 UTC? - host? a team maybe, in case someone cannot participate. 
# Closing words These calls will be quite a challenge as we attempt to honor a lot of different opinions and reach a rough consensus over what is important and urgent for Bisq. We, however, are eager to try it and maybe, just maybe, it might just work out as expected. :rocket:
1.0
Introduce recurring "Bisq-Calls" - > _This is a Bisq Network proposal. Please familiarize yourself with the [submission and review process](https://docs.bisq.network/proposals.html)._ <!-- Please do not remove the text above. --> With 2020 and its necessary changes, attempts have been made and structures have been installed to address issues around how to craft and maintain Bisqs roadmap. The budget has been introduced, the concept of projects has been introduced, and an entity, namely the project management committee, has been installed to triage projects and thus, shape the roadmap of Bisq. However, this committee did not gain members to actually perform as intended, reasons being, among others, the initial overhead of creating project proposals, big projects, a limited work force and, hence, slow progress. All of the above and the lack of communication started to paint a false picture of how things are handled and threatens to result in people loosing interest. By installing a new recurring Bisq call format, we hope to clear things up. # Proposal We propose to revive the ol' dev calls but with a slightly changed format. The new calls are to follow a tight agenda. The agenda is designed to keep it rather compact and focused: - **Review ongoing/completed projects** Get a feeling of what is going on in the project, make ourselves aware of what is being done currently and how things are moving along. Summarize every project by recapping the written project updates - **Adjust priorities of ongoing and new projects** Discuss the priorities of all projects (ongoing and new) and come to a rough consensus what is important and urgent to do next. - **Assign people** Ask for help on the projects to be taken over and driven forward. Find people that are willing to take over project leads as well as people which are willing to support bigger project by eg. help testing stuff if the time comes. 
Note that when a healthy number of high-priority tasks are taken care of already, we can fill our pipelines with less urgent but equally important stuff. In a way, we propose to let the the call participants be the formerly installed project management committee. Thus, we propose the call to follow these ancillary characteristics: - **Open to everyone** Literally everyone is welcome to join the calls. These calls should serve as a brewing ground of _THE_ Bisq team, everyone can just passively suck up the information that is provided, people can join the discussion and bring their arguments into the mix and last but not least, people can stand up and commit themselves to certain projects or tasks. - **Weekly** We propose to start with a weekly call and see how it goes. We can adjust the frequency as needed. - **1h** We aim for a call duration of 1h max. That is to not take up too much time for everybody but have some reserves when settling on priorities just takes a little longer. However, when nothing changed to the previous call (ie. no new projects, priorities stay the same, ...), such a call can be completed in 10min or less. Again, we always can adjust if it seems necessary. - **Meeting minutes** There will be _NO_ recording of the calls, however, there will be meeting minutes. - **Infrastructure** We are to try Jitsi meet as an infrastructure, so everyone with a browser can participate easily. The calls will be tracked in our [event list](https://github.com/bisq-network/events/issues), where you can also find the minutes of past calls and agendas of upcoming calls. There will also be a calendar entry in our [calendar](https://bisq.network/calendar) and participation links will be distributed via the event issue, the calendar and the #general keybase channel. # Open Questions - time/date. maybe thursdays, 13:00 UTC? - host? a team maybe, in case someone cannot participate. 
# Closing words These calls will be quite a challenge as we attempt to honor a lot of different opinions and reach a rough consensus over what is important and urgent for Bisq. We, however, are eager to try it and maybe, just maybe, it might just work out as expected. :rocket:
process
introduce recurring bisq calls this is a bisq network proposal please familiarize yourself with the with and its necessary changes attempts have been made and structures have been installed to address issues around how to craft and maintain bisqs roadmap the budget has been introduced the concept of projects has been introduced and an entity namely the project management committee has been installed to triage projects and thus shape the roadmap of bisq however this committee did not gain members to actually perform as intended reasons being among others the initial overhead of creating project proposals big projects a limited work force and hence slow progress all of the above and the lack of communication started to paint a false picture of how things are handled and threatens to result in people loosing interest by installing a new recurring bisq call format we hope to clear things up proposal we propose to revive the ol dev calls but with a slightly changed format the new calls are to follow a tight agenda the agenda is designed to keep it rather compact and focused review ongoing completed projects get a feeling of what is going on in the project make ourselves aware of what is being done currently and how things are moving along summarize every project by recapping the written project updates adjust priorities of ongoing and new projects discuss the priorities of all projects ongoing and new and come to a rough consensus what is important and urgent to do next assign people ask for help on the projects to be taken over and driven forward find people that are willing to take over project leads as well as people which are willing to support bigger project by eg help testing stuff if the time comes note that when a healthy number of high priority tasks are taken care of already we can fill our pipelines with less urgent but equally important stuff in a way we propose to let the the call participants be the formerly installed project management committee thus we 
propose the call to follow these ancillary characteristics open to everyone literally everyone is welcome to join the calls these calls should serve as a brewing ground of the bisq team everyone can just passively suck up the information that is provided people can join the discussion and bring their arguments into the mix and last but not least people can stand up and commit themselves to certain projects or tasks weekly we propose to start with a weekly call and see how it goes we can adjust the frequency as needed we aim for a call duration of max that is to not take up too much time for everybody but have some reserves when settling on priorities just takes a little longer however when nothing changed to the previous call ie no new projects priorities stay the same such a call can be completed in or less again we always can adjust if it seems necessary meeting minutes there will be no recording of the calls however there will be meeting minutes infrastructure we are to try jitsi meet as an infrastructure so everyone with a browser can participate easily the calls will be tracked in our where you can also find the minutes of past calls and agendas of upcoming calls there will also be a calendar entry in our and participation links will be distributed via the event issue the calendar and the general keybase channel open questions time date maybe thursdays utc host a team maybe in case someone cannot participate closing words these calls will be quite a challenge as we attempt to honor a lot of different opinions and reach a rough consensus over what is important and urgent for bisq we however are eager to try it and maybe just maybe it might just work out as expected rocket
1
125,251
4,954,879,087
IssuesEvent
2016-12-01 18:51:45
nexusformat/definitions
https://api.github.com/repos/nexusformat/definitions
closed
NXlog has reST syntax warning
bug low priority
Sphinx reports this: .../build/manual/source/classes/base_classes/NXlog.rst:27: WARNING: Bullet list ends without a blank line; unexpected unindent.
1.0
NXlog has reST syntax warning - Sphinx reports this: .../build/manual/source/classes/base_classes/NXlog.rst:27: WARNING: Bullet list ends without a blank line; unexpected unindent.
non_process
nxlog has rest syntax warning sphinx reports this build manual source classes base classes nxlog rst warning bullet list ends without a blank line unexpected unindent
0
17,245
23,031,303,217
IssuesEvent
2022-07-22 14:10:05
scikit-learn/scikit-learn
https://api.github.com/repos/scikit-learn/scikit-learn
closed
Gaussian Process Regressor future forecast
Question module:gaussian_process Needs Decision - Close
I am trying to replicate this example [http://scikit-learn.org/stable/auto_examples/gaussian_process/plot_gpr_co2.html](url) to understand GPR for time series analysis. However, after training the data when i forecast for future, i do not get the result as shown in the example, my future forecast is straight line, with no seasonality what so ever. Please see below chart ![gpr forecast](https://cloud.githubusercontent.com/assets/5044483/24213209/aa745578-0f57-11e7-8164-ce95ad02c6d1.png) I am not sure, what I am doing wrong. The only difference between the example and my implementation- 1. As mldata.org is down i could not download the data, but got it from web which is for period of 1979 to 2008, unlike example data which is till 1997. 2. As, I am not sure what 'x' is in example, due to reason given in (1), I take linearly increasing time value as 'x'. My 'x' and 'y' are given in the snippet below x | y 2 | 315.70 3 | 317.45 4 | 317.50 5 | 317.26 6 | 315.86 Other than above two everything is as given in the example. I am even observing similar behaviour when trying this example [http://scikit-learn.org/stable/auto_examples/gaussian_process/plot_gpr_noisy_targets.html](url) Any pointer on what is going wrong would be really helpful, as I have been breaking my head on this from last two days. I searched stack over flow and googled a lot before posting the issue here. Details about my platform - Anaconda - python 2.7.12 - sklearn 0.18.1
1.0
Gaussian Process Regressor future forecast - I am trying to replicate this example [http://scikit-learn.org/stable/auto_examples/gaussian_process/plot_gpr_co2.html](url) to understand GPR for time series analysis. However, after training the data when i forecast for future, i do not get the result as shown in the example, my future forecast is straight line, with no seasonality what so ever. Please see below chart ![gpr forecast](https://cloud.githubusercontent.com/assets/5044483/24213209/aa745578-0f57-11e7-8164-ce95ad02c6d1.png) I am not sure, what I am doing wrong. The only difference between the example and my implementation- 1. As mldata.org is down i could not download the data, but got it from web which is for period of 1979 to 2008, unlike example data which is till 1997. 2. As, I am not sure what 'x' is in example, due to reason given in (1), I take linearly increasing time value as 'x'. My 'x' and 'y' are given in the snippet below x | y 2 | 315.70 3 | 317.45 4 | 317.50 5 | 317.26 6 | 315.86 Other than above two everything is as given in the example. I am even observing similar behaviour when trying this example [http://scikit-learn.org/stable/auto_examples/gaussian_process/plot_gpr_noisy_targets.html](url) Any pointer on what is going wrong would be really helpful, as I have been breaking my head on this from last two days. I searched stack over flow and googled a lot before posting the issue here. Details about my platform - Anaconda - python 2.7.12 - sklearn 0.18.1
process
gaussian process regressor future forecast i am trying to replicate this example url to understand gpr for time series analysis however after training the data when i forecast for future i do not get the result as shown in the example my future forecast is straight line with no seasonality what so ever please see below chart i am not sure what i am doing wrong the only difference between the example and my implementation as mldata org is down i could not download the data but got it from web which is for period of to unlike example data which is till as i am not sure what x is in example due to reason given in i take linearly increasing time value as x my x and y are given in the snippet below x y other than above two everything is as given in the example i am even observing similar behaviour when trying this example url any pointer on what is going wrong would be really helpful as i have been breaking my head on this from last two days i searched stack over flow and googled a lot before posting the issue here details about my platform anaconda python sklearn
1
20,217
13,763,343,216
IssuesEvent
2020-10-07 10:23:18
spring1944/spring1944
https://api.github.com/repos/spring1944/spring1944
closed
Move to CC-BY-SA
Status:new infrastructure
Everything except lua scripts are licensed as Creative Commons BY-NC 3.0, which is not considered a free cultural license. That has a number of drawbacks. The most important, we cannot announce the game neither in free-software forums nor in commercial platforms. Actually commercial platforms are not that clear, but it is blocking some plans to announce the game. @specing can give more details. I think moving to Creative Commons BY-SA 3.0 will solve many problems, and it is indeed a harmless change. I'm pretty sure @FLOZi and @yuritch are "assets" copyright owners. I think any of you are entitled to change the license.
1.0
Move to CC-BY-SA - Everything except lua scripts are licensed as Creative Commons BY-NC 3.0, which is not considered a free cultural license. That has a number of drawbacks. The most important, we cannot announce the game neither in free-software forums nor in commercial platforms. Actually commercial platforms are not that clear, but it is blocking some plans to announce the game. @specing can give more details. I think moving to Creative Commons BY-SA 3.0 will solve many problems, and it is indeed a harmless change. I'm pretty sure @FLOZi and @yuritch are "assets" copyright owners. I think any of you are entitled to change the license.
non_process
move to cc by sa everything except lua scripts are licensed as creative commons by nc which is not considered a free cultural license that has a number of drawbacks the most important we cannot announce the game neither in free software forums nor in commercial platforms actually commercial platforms are not that clear but it is blocking some plans to announce the game specing can give more details i think moving to creative commons by sa will solve many problems and it is indeed a harmless change i m pretty sure flozi and yuritch are assets copyright owners i think any of you are entitled to change the license
0
58,562
24,482,327,061
IssuesEvent
2022-10-09 01:34:28
anime-skip/player
https://api.github.com/repos/anime-skip/player
opened
Space bar scrolls window
service: vrv service: crunchyroll
In addition to playing/pausing, if you press the spacebar on a player like VRV in window mode (NOT fullscreen), it will scroll down. This happens both with VRV and Crunchyroll.
2.0
Space bar scrolls window - In addition to playing/pausing, if you press the spacebar on a player like VRV in window mode (NOT fullscreen), it will scroll down. This happens both with VRV and Crunchyroll.
non_process
space bar scrolls window in addition to playing pausing if you press the spacebar on a player like vrv in window mode not fullscreen it will scroll down this happens both with vrv and crunchyroll
0
2,530
5,289,877,581
IssuesEvent
2017-02-08 18:28:14
MikePopoloski/slang
https://api.github.com/repos/MikePopoloski/slang
closed
Handle remaining preprocessor directives
area-preprocessor medium
There are a few preprocessor directives that aren't yet handled. Implement them in Preprocessor::next().
1.0
Handle remaining preprocessor directives - There are a few preprocessor directives that aren't yet handled. Implement them in Preprocessor::next().
process
handle remaining preprocessor directives there are a few preprocessor directives that aren t yet handled implement them in preprocessor next
1
278,064
21,058,031,895
IssuesEvent
2022-04-01 06:38:51
medajet/ped
https://api.github.com/repos/medajet/ped
opened
"upt" command has incorrect format in UG
severity.Medium type.DocumentationBug
For the updt command, both fields are listed as optional even though at least 1 is mandatory. ![Picture14.png](https://raw.githubusercontent.com/medajet/ped/main/files/bce0f5c7-9c4c-499c-a1ff-c2d946008a39.png) <!--session: 1648792880809-a0d404c0-a5ce-4319-b3e0-6ef579fd4865--> <!--Version: Web v3.4.2-->
1.0
"upt" command has incorrect format in UG - For the updt command, both fields are listed as optional even though at least 1 is mandatory. ![Picture14.png](https://raw.githubusercontent.com/medajet/ped/main/files/bce0f5c7-9c4c-499c-a1ff-c2d946008a39.png) <!--session: 1648792880809-a0d404c0-a5ce-4319-b3e0-6ef579fd4865--> <!--Version: Web v3.4.2-->
non_process
upt command has incorrect format in ug for the updt command both fields are listed as optional even though at least is mandatory
0
18,239
24,308,470,929
IssuesEvent
2022-09-29 19:42:25
googleapis/google-cloud-go
https://api.github.com/repos/googleapis/google-cloud-go
closed
storage: fix use of Bucket_RetentionPolicy.RetentionPeriod
api: storage type: process
`Bucket_RetentionPolicy.RetentionPeriod` will get `proto3_optional` soon, changing it to a pointer. It is commented out right now, but should be fixed once regen lands the new change. Only places that assign a value to `Bucket_RetentionPolicy.RetentionPeriod` need to be fixed. Places that use the `GetRetentionPeriod` accessor are unaffected (which is part of why we use the accessors to read values 😉). As of right now, none of the integration tests enabled for gRPC use the `RetentionPeriod` field so we should see no disruptions there.
1.0
storage: fix use of Bucket_RetentionPolicy.RetentionPeriod - `Bucket_RetentionPolicy.RetentionPeriod` will get `proto3_optional` soon, changing it to a pointer. It is commented out right now, but should be fixed once regen lands the new change. Only places that assign a value to `Bucket_RetentionPolicy.RetentionPeriod` need to be fixed. Places that use the `GetRetentionPeriod` accessor are unaffected (which is part of why we use the accessors to read values 😉). As of right now, none of the integration tests enabled for gRPC use the `RetentionPeriod` field so we should see no disruptions there.
process
storage fix use of bucket retentionpolicy retentionperiod bucket retentionpolicy retentionperiod will get optional soon changing it to a pointer it is commented out right now but should be fixed once regen lands the new change only places that assign a value to bucket retentionpolicy retentionperiod need to be fixed places that use the getretentionperiod accessor are unaffected which is part of why we use the accessors to read values 😉 as of right now none of the integration tests enabled for grpc use the retentionperiod field so we should see no disruptions there
1
314,615
27,013,832,607
IssuesEvent
2023-02-10 17:29:34
elastic/kibana
https://api.github.com/repos/elastic/kibana
closed
Failing test: Jest Tests.x-pack/plugins/cases/public/components/all_cases - AllCasesListGeneric Actions Row actions should disable row actions when bulk selecting all cases
failed-test Team:ResponseOps Feature:Cases
A test failed on a tracked branch ``` TestingLibraryElementError: Unable to find an element by: [data-test-subj="checkboxSelectAll"] Ignored nodes: comments, script, style <body class="" > <div /> </body> at Object.getElementError (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/@testing-library/dom/dist/config.js:40:19) at /var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/@testing-library/dom/dist/query-helpers.js:90:38 at /var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/@testing-library/dom/dist/query-helpers.js:62:17 at /var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/@testing-library/dom/dist/query-helpers.js:111:19 at getByTestId (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/x-pack/plugins/cases/public/components/all_cases/all_cases_list.test.tsx:995:31) at batchedUpdates$1 (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/react-dom/cjs/react-dom.development.js:22380:12) at act (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/react-dom/cjs/react-dom-test-utils.development.js:1042:14) at Object.<anonymous> (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/x-pack/plugins/cases/public/components/all_cases/all_cases_list.test.tsx:994:12) at Promise.then.completed (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/utils.js:289:28) at new Promise (<anonymous>) at callAsyncCircusFn (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/utils.js:222:10) at _callCircusTest (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:248:40) at runMicrotasks (<anonymous>) at runNextTicks (node:internal/process/task_queues:61:5) at listOnTimeout (node:internal/timers:528:9) at processTimers (node:internal/timers:502:7) at _runTest (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:184:3) at _runTestsForDescribeBlock (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:86:9) at _runTestsForDescribeBlock (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:81:9) at _runTestsForDescribeBlock (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:81:9) at _runTestsForDescribeBlock (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:81:9) at run (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:26:3) at runAndTransformResultsToJestFormat (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/legacy-code-todo-rewrite/jestAdapterInit.js:120:21) at jestAdapter (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/legacy-code-todo-rewrite/jestAdapter.js:79:19) at runTestInternal (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-runner/build/runTest.js:367:16) at runTest (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-runner/build/runTest.js:444:34) ``` First failure: [CI Build - main](https://buildkite.com/elastic/kibana-on-merge/builds/25182#01854e8e-ad88-4aeb-95ef-5e047b0791fa) <!-- kibanaCiData = {"failed-test":{"test.class":"Jest Tests.x-pack/plugins/cases/public/components/all_cases","test.name":"AllCasesListGeneric Actions Row actions should disable row actions when bulk selecting all cases","test.failCount":2}} -->
1.0
Failing test: Jest Tests.x-pack/plugins/cases/public/components/all_cases - AllCasesListGeneric Actions Row actions should disable row actions when bulk selecting all cases - A test failed on a tracked branch ``` TestingLibraryElementError: Unable to find an element by: [data-test-subj="checkboxSelectAll"] Ignored nodes: comments, script, style <body class="" > <div /> </body> at Object.getElementError (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/@testing-library/dom/dist/config.js:40:19) at /var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/@testing-library/dom/dist/query-helpers.js:90:38 at /var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/@testing-library/dom/dist/query-helpers.js:62:17 at /var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/@testing-library/dom/dist/query-helpers.js:111:19 at getByTestId (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/x-pack/plugins/cases/public/components/all_cases/all_cases_list.test.tsx:995:31) at batchedUpdates$1 (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/react-dom/cjs/react-dom.development.js:22380:12) at act (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/react-dom/cjs/react-dom-test-utils.development.js:1042:14) at Object.<anonymous> (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/x-pack/plugins/cases/public/components/all_cases/all_cases_list.test.tsx:994:12) at Promise.then.completed (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/utils.js:289:28) at new Promise (<anonymous>) at callAsyncCircusFn (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/utils.js:222:10) at _callCircusTest (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:248:40) at runMicrotasks (<anonymous>) at runNextTicks (node:internal/process/task_queues:61:5) at listOnTimeout (node:internal/timers:528:9) at processTimers (node:internal/timers:502:7) at _runTest (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:184:3) at _runTestsForDescribeBlock (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:86:9) at _runTestsForDescribeBlock (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:81:9) at _runTestsForDescribeBlock (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:81:9) at _runTestsForDescribeBlock (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:81:9) at run (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/run.js:26:3) at runAndTransformResultsToJestFormat (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/legacy-code-todo-rewrite/jestAdapterInit.js:120:21) at jestAdapter (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/legacy-code-todo-rewrite/jestAdapter.js:79:19) at runTestInternal (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-runner/build/runTest.js:367:16) at runTest (/var/lib/buildkite-agent/builds/kb-n2-4-spot-1154c761d1f6f4cb/elastic/kibana-on-merge/kibana/node_modules/jest-runner/build/runTest.js:444:34) ``` First failure: [CI Build - main](https://buildkite.com/elastic/kibana-on-merge/builds/25182#01854e8e-ad88-4aeb-95ef-5e047b0791fa) <!-- kibanaCiData = {"failed-test":{"test.class":"Jest Tests.x-pack/plugins/cases/public/components/all_cases","test.name":"AllCasesListGeneric Actions Row actions should disable row actions when bulk selecting all cases","test.failCount":2}} -->
non_process
failing test jest tests x pack plugins cases public components all cases allcaseslistgeneric actions row actions should disable row actions when bulk selecting all cases a test failed on a tracked branch testinglibraryelementerror unable to find an element by ignored nodes comments script style body class at object getelementerror var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules testing library dom dist config js at var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules testing library dom dist query helpers js at var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules testing library dom dist query helpers js at var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules testing library dom dist query helpers js at getbytestid var lib buildkite agent builds kb spot elastic kibana on merge kibana x pack plugins cases public components all cases all cases list test tsx at batchedupdates var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules react dom cjs react dom development js at act var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules react dom cjs react dom test utils development js at object var lib buildkite agent builds kb spot elastic kibana on merge kibana x pack plugins cases public components all cases all cases list test tsx at promise then completed var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build utils js at new promise at callasynccircusfn var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build utils js at callcircustest var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build run js at runmicrotasks at runnextticks node internal process task queues at listontimeout node internal timers at processtimers node internal timers at runtest var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build run js at runtestsfordescribeblock var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build run js at runtestsfordescribeblock var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build run js at runtestsfordescribeblock var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build run js at runtestsfordescribeblock var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build run js at run var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build run js at runandtransformresultstojestformat var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build legacy code todo rewrite jestadapterinit js at jestadapter var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build legacy code todo rewrite jestadapter js at runtestinternal var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest runner build runtest js at runtest var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest runner build runtest js first failure
0
216,043
24,206,831,689
IssuesEvent
2022-09-25 10:56:46
ghc-dev/3046595_1990
https://api.github.com/repos/ghc-dev/3046595_1990
opened
PyYAML-5.3.1.tar.gz: 1 vulnerabilities (highest severity is: 9.8)
security vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>PyYAML-5.3.1.tar.gz</b></p></summary> <p>YAML parser and emitter for Python</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/64/c2/b80047c7ac2478f9501676c988a5411ed5572f35d1beff9cae07d321512c/PyYAML-5.3.1.tar.gz">https://files.pythonhosted.org/packages/64/c2/b80047c7ac2478f9501676c988a5411ed5572f35d1beff9cae07d321512c/PyYAML-5.3.1.tar.gz</a></p> <p>Path to dependency file: /requirements.txt</p> <p>Path to vulnerable library: /requirements.txt</p> <p> <p>Found in HEAD commit: <a href="https://github.com/ghc-dev/3046595_1990/commit/1d1b99f0229024c5b4a722bed0c176d624e0824d">1d1b99f0229024c5b4a722bed0c176d624e0824d</a></p></details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | --- | --- | | [CVE-2020-14343](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-14343) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.8 | PyYAML-5.3.1.tar.gz | Direct | 5.4 | &#9989; | ## Details <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-14343</summary> ### Vulnerable Library - <b>PyYAML-5.3.1.tar.gz</b></p> <p>YAML parser and emitter for Python</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/64/c2/b80047c7ac2478f9501676c988a5411ed5572f35d1beff9cae07d321512c/PyYAML-5.3.1.tar.gz">https://files.pythonhosted.org/packages/64/c2/b80047c7ac2478f9501676c988a5411ed5572f35d1beff9cae07d321512c/PyYAML-5.3.1.tar.gz</a></p> <p>Path to dependency file: /requirements.txt</p> <p>Path to vulnerable library: /requirements.txt</p> <p> Dependency Hierarchy: - :x: **PyYAML-5.3.1.tar.gz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/ghc-dev/3046595_1990/commit/1d1b99f0229024c5b4a722bed0c176d624e0824d">1d1b99f0229024c5b4a722bed0c176d624e0824d</a></p> <p>Found in base branch: <b>main</b></p> </p> <p></p> ### Vulnerability Details <p> A vulnerability was discovered in the PyYAML library in versions before 5.4, where it is susceptible to arbitrary code execution when it processes untrusted YAML files through the full_load method or with the FullLoader loader. Applications that use the library to process untrusted input may be vulnerable to this flaw. This flaw allows an attacker to execute arbitrary code on the system by abusing the python/object/new constructor. This flaw is due to an incomplete fix for CVE-2020-1747. <p>Publish Date: 2021-02-09 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-14343>CVE-2020-14343</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>9.8</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-14343">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-14343</a></p> <p>Release Date: 2021-02-09</p> <p>Fix Resolution: 5.4</p> </p> <p></p> :rescue_worker_helmet: Automatic Remediation is available for this issue </details> *** <p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
True
PyYAML-5.3.1.tar.gz: 1 vulnerabilities (highest severity is: 9.8) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>PyYAML-5.3.1.tar.gz</b></p></summary> <p>YAML parser and emitter for Python</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/64/c2/b80047c7ac2478f9501676c988a5411ed5572f35d1beff9cae07d321512c/PyYAML-5.3.1.tar.gz">https://files.pythonhosted.org/packages/64/c2/b80047c7ac2478f9501676c988a5411ed5572f35d1beff9cae07d321512c/PyYAML-5.3.1.tar.gz</a></p> <p>Path to dependency file: /requirements.txt</p> <p>Path to vulnerable library: /requirements.txt</p> <p> <p>Found in HEAD commit: <a href="https://github.com/ghc-dev/3046595_1990/commit/1d1b99f0229024c5b4a722bed0c176d624e0824d">1d1b99f0229024c5b4a722bed0c176d624e0824d</a></p></details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | --- | --- | | [CVE-2020-14343](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-14343) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.8 | PyYAML-5.3.1.tar.gz | Direct | 5.4 | &#9989; | ## Details <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-14343</summary> ### Vulnerable Library - <b>PyYAML-5.3.1.tar.gz</b></p> <p>YAML parser and emitter for Python</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/64/c2/b80047c7ac2478f9501676c988a5411ed5572f35d1beff9cae07d321512c/PyYAML-5.3.1.tar.gz">https://files.pythonhosted.org/packages/64/c2/b80047c7ac2478f9501676c988a5411ed5572f35d1beff9cae07d321512c/PyYAML-5.3.1.tar.gz</a></p> <p>Path to dependency file: /requirements.txt</p> <p>Path to vulnerable library: /requirements.txt</p> <p> Dependency Hierarchy: - :x: **PyYAML-5.3.1.tar.gz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/ghc-dev/3046595_1990/commit/1d1b99f0229024c5b4a722bed0c176d624e0824d">1d1b99f0229024c5b4a722bed0c176d624e0824d</a></p> <p>Found in base branch: <b>main</b></p> </p> <p></p> ### Vulnerability Details <p> A vulnerability was discovered in the PyYAML library in versions before 5.4, where it is susceptible to arbitrary code execution when it processes untrusted YAML files through the full_load method or with the FullLoader loader. Applications that use the library to process untrusted input may be vulnerable to this flaw. This flaw allows an attacker to execute arbitrary code on the system by abusing the python/object/new constructor. This flaw is due to an incomplete fix for CVE-2020-1747. <p>Publish Date: 2021-02-09 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-14343>CVE-2020-14343</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>9.8</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-14343">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-14343</a></p> <p>Release Date: 2021-02-09</p> <p>Fix Resolution: 5.4</p> </p> <p></p> :rescue_worker_helmet: Automatic Remediation is available for this issue </details> *** <p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
non_process
pyyaml tar gz vulnerabilities highest severity is vulnerable library pyyaml tar gz yaml parser and emitter for python library home page a href path to dependency file requirements txt path to vulnerable library requirements txt found in head commit a href vulnerabilities cve severity cvss dependency type fixed in remediation available high pyyaml tar gz direct details cve vulnerable library pyyaml tar gz yaml parser and emitter for python library home page a href path to dependency file requirements txt path to vulnerable library requirements txt dependency hierarchy x pyyaml tar gz vulnerable library found in head commit a href found in base branch main vulnerability details a vulnerability was discovered in the pyyaml library in versions before where it is susceptible to arbitrary code execution when it processes untrusted yaml files through the full load method or with the fullloader loader applications that use the library to process untrusted input may be vulnerable to this flaw this flaw allows an attacker to execute arbitrary code on the system by abusing the python object new constructor this flaw is due to an incomplete fix for cve publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution rescue worker helmet automatic remediation is available for this issue rescue worker helmet automatic remediation is available for this issue
0
14,702
17,874,734,230
IssuesEvent
2021-09-07 00:30:22
lynnandtonic/nestflix.fun
https://api.github.com/repos/lynnandtonic/nestflix.fun
closed
Add Kill It Before It Dies
suggested title in process
Please add as much of the following info as you can: Title: Kill It Before It Dies Type (film/tv show): film Film or show in which it appears: Charmed Is the parent film/show streaming anywhere? https://www.amazon.com/gp/video/detail/B0093SJGSM About when in the parent film/show does it appear? Chick Flick (s2e18) Actual footage of the film/show can be seen (yes/no)? yes https://charmed.fandom.com/wiki/Kill_It_Before_It_Dies
1.0
Add Kill It Before It Dies - Please add as much of the following info as you can: Title: Kill It Before It Dies Type (film/tv show): film Film or show in which it appears: Charmed Is the parent film/show streaming anywhere? https://www.amazon.com/gp/video/detail/B0093SJGSM About when in the parent film/show does it appear? Chick Flick (s2e18) Actual footage of the film/show can be seen (yes/no)? yes https://charmed.fandom.com/wiki/Kill_It_Before_It_Dies
process
add kill it before it dies please add as much of the following info as you can title kill it before it dies type film tv show film film or show in which it appears charmed is the parent film show streaming anywhere about when in the parent film show does it appear chick flick actual footage of the film show can be seen yes no yes
1
32,116
6,033,461,218
IssuesEvent
2017-06-09 08:23:23
paritytech/parity
https://api.github.com/repos/paritytech/parity
closed
Apparent Babel show-stopper in Tutorial Part 1
F5-documentation M3-docs
Hi all I'm trying the tutorial here: https://github.com/paritytech/parity/wiki/Tutorial-Part-I All good - but when I get to: webpack It didn't work (because it isn't installed). When I try to install it it says: `npm WARN optional Skipping failed optional dependency /chokidar/fsevents: npm WARN notsup Not compatible with your operating system or architecture: fsevents@1.1.1` I don't know if that's important or not... but when I try to run webpack it says `ERROR in Entry module not found: Error: Can't resolve 'babel-loader' in '/home/nick/mydapp'` when I try to install babel-cli -g, I get the same warnings as above (I don't know if they're important or not), and still get `ERROR in Entry module not found: Error: Can't resolve 'babel-loader' in '/home/nick/mydapp'` when I try to run webpack Any ideas as to what I need to do next?
1.0
Apparent Babel show-stopper in Tutorial Part 1 - Hi all I'm trying the tutorial here: https://github.com/paritytech/parity/wiki/Tutorial-Part-I All good - but when I get to: webpack It didn't work (because it isn't installed). When I try to install it it says: `npm WARN optional Skipping failed optional dependency /chokidar/fsevents: npm WARN notsup Not compatible with your operating system or architecture: fsevents@1.1.1` I don't know if that's important or not... but when I try to run webpack it says `ERROR in Entry module not found: Error: Can't resolve 'babel-loader' in '/home/nick/mydapp'` when I try to install babel-cli -g, I get the same warnings as above (I don't know if they're important or not), and still get `ERROR in Entry module not found: Error: Can't resolve 'babel-loader' in '/home/nick/mydapp'` when I try to run webpack Any ideas as to what I need to do next?
non_process
apparent babel show stopper in tutorial part hi all i m trying the tutorial here all good but when i get to webpack it didn t work because it isn t installed when i try to install it it says npm warn optional skipping failed optional dependency chokidar fsevents npm warn notsup not compatible with your operating system or architecture fsevents i don t know if that s important or not but when i try to run webpack it says error in entry module not found error can t resolve babel loader in home nick mydapp when i try to install babel cli g i get the same warnings as above i don t know if they re important or not and still get error in entry module not found error can t resolve babel loader in home nick mydapp when i try to run webpack any ideas as to what i need to do next
0
2,859
5,824,284,394
IssuesEvent
2017-05-07 11:32:22
QCoDeS/Qcodes
https://api.github.com/repos/QCoDeS/Qcodes
closed
session crash when calling function on remote instrument
bug mulitprocessing p1
The following is a simple example ``` import qcodes import logging from qcodes.instrument.base import Instrument from qcodes.instrument.parameter import ManualParameter from qcodes.utils.validators import Numbers class DummyInstrument(Instrument): def __init__(self, name='dummy', gates=['dac1', 'dac2', 'dac3'], **kwargs): ''' Create a dummy instrument that can be used for testing Args: name (string): name for the instrument gates (list): list of names that is used to create parameters for the instrument ''' super().__init__(name, **kwargs) logging.info('create DummyInstrument') # make gates for i, g in enumerate(gates): self.add_parameter(g, parameter_class=ManualParameter, initial_value=0, label='Gate {} (arb. units)'.format(g), vals=Numbers(-800, 400)) self.value=0 def dummyget(self): return self.value def dummyset(self, val): self.value=val d =dict({'a': 1, 'b': 2}) x = DummyInstrument(name='test', server_name='test') x.dummyset( list(d.keys() ) ) # good x.dummyset( d.keys() )# fail ``` ### Expected behaviour When calling a remote function with arguments that cannot be pickled the system should give a warning or error. ### Actual behaviour The session crashes (ha ### System Ubuntu 16.04 Running with spyder
1.0
session crash when calling function on remote instrument - The following is a simple example ``` import qcodes import logging from qcodes.instrument.base import Instrument from qcodes.instrument.parameter import ManualParameter from qcodes.utils.validators import Numbers class DummyInstrument(Instrument): def __init__(self, name='dummy', gates=['dac1', 'dac2', 'dac3'], **kwargs): ''' Create a dummy instrument that can be used for testing Args: name (string): name for the instrument gates (list): list of names that is used to create parameters for the instrument ''' super().__init__(name, **kwargs) logging.info('create DummyInstrument') # make gates for i, g in enumerate(gates): self.add_parameter(g, parameter_class=ManualParameter, initial_value=0, label='Gate {} (arb. units)'.format(g), vals=Numbers(-800, 400)) self.value=0 def dummyget(self): return self.value def dummyset(self, val): self.value=val d =dict({'a': 1, 'b': 2}) x = DummyInstrument(name='test', server_name='test') x.dummyset( list(d.keys() ) ) # good x.dummyset( d.keys() )# fail ``` ### Expected behaviour When calling a remote function with arguments that cannot be pickled the system should give a warning or error. ### Actual behaviour The session crashes (ha ### System Ubuntu 16.04 Running with spyder
process
session crash when calling function on remote instrument the following is a simple example import qcodes import logging from qcodes instrument base import instrument from qcodes instrument parameter import manualparameter from qcodes utils validators import numbers class dummyinstrument instrument def init self name dummy gates kwargs create a dummy instrument that can be used for testing args name string name for the instrument gates list list of names that is used to create parameters for the instrument super init name kwargs logging info create dummyinstrument make gates for i g in enumerate gates self add parameter g parameter class manualparameter initial value label gate arb units format g vals numbers self value def dummyget self return self value def dummyset self val self value val d dict a b x dummyinstrument name test server name test x dummyset list d keys good x dummyset d keys fail expected behaviour when calling a remote function with arguments that cannot be pickled the system should give a warning or error actual behaviour the session crashes ha system ubuntu running with spyder
1
204,109
15,894,020,161
IssuesEvent
2021-04-11 08:38:02
brotkrueml/typo3-matomo-widgets
https://api.github.com/repos/brotkrueml/typo3-matomo-widgets
opened
Add widget for device plugins
documentation feature
- [ ] A widget for displaying the device plugins is available. - [ ] The widget can be enabled/disabled in the site configuration. - [ ] The result is displayed as a table in the widget with the columns: logo, label, nb_visits_percentage. The displayed columns can be overriden in the Services file. - [ ] The rows are sorted by nb_visits_percentage. The sorting can be overriden in the Services file. - [ ] A maximum of 30 rows is displayed. This value can be overridden in a Services file. - [ ] The documentation is adjusted. - [ ] A feature entry to the changelog is added. Example for an API call: https://demo.matomo.cloud/?module=API&method=DevicePlugins.getPlugin&idSite=1&period=day&date=yesterday&format=JSON&token_auth=anonymous
1.0
Add widget for device plugins - - [ ] A widget for displaying the device plugins is available. - [ ] The widget can be enabled/disabled in the site configuration. - [ ] The result is displayed as a table in the widget with the columns: logo, label, nb_visits_percentage. The displayed columns can be overriden in the Services file. - [ ] The rows are sorted by nb_visits_percentage. The sorting can be overriden in the Services file. - [ ] A maximum of 30 rows is displayed. This value can be overridden in a Services file. - [ ] The documentation is adjusted. - [ ] A feature entry to the changelog is added. Example for an API call: https://demo.matomo.cloud/?module=API&method=DevicePlugins.getPlugin&idSite=1&period=day&date=yesterday&format=JSON&token_auth=anonymous
non_process
add widget for device plugins a widget for displaying the device plugins is available the widget can be enabled disabled in the site configuration the result is displayed as a table in the widget with the columns logo label nb visits percentage the displayed columns can be overriden in the services file the rows are sorted by nb visits percentage the sorting can be overriden in the services file a maximum of rows is displayed this value can be overridden in a services file the documentation is adjusted a feature entry to the changelog is added example for an api call
0
179,077
30,113,592,727
IssuesEvent
2023-06-30 09:40:06
code4romania/asistent-medical-comunitar
https://api.github.com/repos/code4romania/asistent-medical-comunitar
opened
[Activități comunitare] replace 'Acțiuni rapide' button label with 'Adaugă activitate comunitară' label
design
**Describe the issue** As described in issues #84 #85 #86 the buttons for adding each individual community activity is displayed only in 'Acțiuni rapide' button. In the design, the button for adding an individual community activity is also displayed in the top right corner of the table. Because in the current version of the app, the button is hidden in 'Acțiuni rapide', the user may have a hard time finding it. **To Reproduce** Steps to reproduce the behavior: 1. Go to 'Activități comunitare' 2. Go to 'Activități administrative' tab 3. Click on 'Acțiuni rapide' button 4. Notice the 'Adaugă campanie de sănătate', 'Adaugă activitate de mediu' and 'Adaugă activitate administrative' buttons **Expected behavior** If there is no possibility of having the buttons both displayed in 'Acțiuni rapide' and individually on the top right corner of each table, according to the design, I propose to replace the 'Acțiuni rapide' button label with 'Adaugă activitate comunitară' label. In this way, the user will know that if they want to add a new community activity they have to click there, then choose which type of activity they want to add. Also, make button color #0E7490 (main color) to be the same color as all other 'Adaugă...'' buttons.
1.0
[Activități comunitare] replace 'Acțiuni rapide' button label with 'Adaugă activitate comunitară' label - **Describe the issue** As described in issues #84 #85 #86 the buttons for adding each individual community activity is displayed only in 'Acțiuni rapide' button. In the design, the button for adding an individual community activity is also displayed in the top right corner of the table. Because in the current version of the app, the button is hidden in 'Acțiuni rapide', the user may have a hard time finding it. **To Reproduce** Steps to reproduce the behavior: 1. Go to 'Activități comunitare' 2. Go to 'Activități administrative' tab 3. Click on 'Acțiuni rapide' button 4. Notice the 'Adaugă campanie de sănătate', 'Adaugă activitate de mediu' and 'Adaugă activitate administrative' buttons **Expected behavior** If there is no possibility of having the buttons both displayed in 'Acțiuni rapide' and individually on the top right corner of each table, according to the design, I propose to replace the 'Acțiuni rapide' button label with 'Adaugă activitate comunitară' label. In this way, the user will know that if they want to add a new community activity they have to click there, then choose which type of activity they want to add. Also, make button color #0E7490 (main color) to be the same color as all other 'Adaugă...'' buttons.
non_process
replace acțiuni rapide button label with adaugă activitate comunitară label describe the issue as described in issues the buttons for adding each individual community activity is displayed only in acțiuni rapide button in the design the button for adding an individual community activity is also displayed in the top right corner of the table because in the current version of the app the button is hidden in acțiuni rapide the user may have a hard time finding it to reproduce steps to reproduce the behavior go to activități comunitare go to activități administrative tab click on acțiuni rapide button notice the adaugă campanie de sănătate adaugă activitate de mediu and adaugă activitate administrative buttons expected behavior if there is no possibility of having the buttons both displayed in acțiuni rapide and individually on the top right corner of each table according to the design i propose to replace the acțiuni rapide button label with adaugă activitate comunitară label in this way the user will know that if they want to add a new community activity they have to click there then choose which type of activity they want to add also make button color main color to be the same color as all other adaugă buttons
0
14,103
16,991,808,861
IssuesEvent
2021-06-30 21:41:36
jonblatho/covid-19
https://api.github.com/repos/jonblatho/covid-19
closed
refactor calculation functions into separate script(s)
enhancement processing
The processing script should be split up and refactored such that generic calculation functions exist in a separate script or series of scripts, and other automation scripts should be updated accordingly. ### Progress #### Refactoring - [x] Refactor scripts #### Reimplementation - [x] Date-relative new cases data file (`data/relative.json`) - [x] Monthly new cases data file (`data/monthly.json`) - [x] Estimated active cases by town dats file (`data/active_town.json`) - [x] Dashboard data file (`data/summary.json`) - [x] Risk level - [x] 7-day new cases - [x] Active cases - [x] 14-day positivity rate - [x] Active hospitalizations - [x] Cumulative deaths - [x] 7-day new tests - [x] Table data file (`data/daily.json`) - [x] Chart data file (`assets/js/chart-data.json`)
1.0
refactor calculation functions into separate script(s) - The processing script should be split up and refactored such that generic calculation functions exist in a separate script or series of scripts, and other automation scripts should be updated accordingly. ### Progress #### Refactoring - [x] Refactor scripts #### Reimplementation - [x] Date-relative new cases data file (`data/relative.json`) - [x] Monthly new cases data file (`data/monthly.json`) - [x] Estimated active cases by town dats file (`data/active_town.json`) - [x] Dashboard data file (`data/summary.json`) - [x] Risk level - [x] 7-day new cases - [x] Active cases - [x] 14-day positivity rate - [x] Active hospitalizations - [x] Cumulative deaths - [x] 7-day new tests - [x] Table data file (`data/daily.json`) - [x] Chart data file (`assets/js/chart-data.json`)
process
refactor calculation functions into separate script s the processing script should be split up and refactored such that generic calculation functions exist in a separate script or series of scripts and other automation scripts should be updated accordingly progress refactoring refactor scripts reimplementation date relative new cases data file data relative json monthly new cases data file data monthly json estimated active cases by town dats file data active town json dashboard data file data summary json risk level day new cases active cases day positivity rate active hospitalizations cumulative deaths day new tests table data file data daily json chart data file assets js chart data json
1
2,082
4,912,237,914
IssuesEvent
2016-11-23 08:12:34
matz-e/lobster
https://api.github.com/repos/matz-e/lobster
closed
Parrot cache task specific?
bug processing
I see this on the worker currently: ```shell d12chas109: du -hs cache 39M cache d12chas109: du -hs t.839/cctools-temp-t.839.DPqpnx/* 1.9G t.839/cctools-temp-t.839.DPqpnx/parrot.187567 8.6M t.839/cctools-temp-t.839.DPqpnx/tmpSAK_CQ ``` which seems to indicate to me that the parrot cache is currently not shared. Is this desired? Did I miss some transition there?
1.0
Parrot cache task specific? - I see this on the worker currently: ```shell d12chas109: du -hs cache 39M cache d12chas109: du -hs t.839/cctools-temp-t.839.DPqpnx/* 1.9G t.839/cctools-temp-t.839.DPqpnx/parrot.187567 8.6M t.839/cctools-temp-t.839.DPqpnx/tmpSAK_CQ ``` which seems to indicate to me that the parrot cache is currently not shared. Is this desired? Did I miss some transition there?
process
parrot cache task specific i see this on the worker currently shell du hs cache cache du hs t cctools temp t dpqpnx t cctools temp t dpqpnx parrot t cctools temp t dpqpnx tmpsak cq which seems to indicate to me that the parrot cache is currently not shared is this desired did i miss some transition there
1
14,072
16,919,827,450
IssuesEvent
2021-06-25 02:43:18
ankidroid/Anki-Android
https://api.github.com/repos/ankidroid/Anki-Android
closed
Regression test for startup: new user granting permissions
Dev Stale Test process
It'd be great to get a regression test for this path in another PR _Originally posted by @david-allison-1 in https://github.com/ankidroid/Anki-Android/issues/8561#issuecomment-817213318_
1.0
Regression test for startup: new user granting permissions - It'd be great to get a regression test for this path in another PR _Originally posted by @david-allison-1 in https://github.com/ankidroid/Anki-Android/issues/8561#issuecomment-817213318_
process
regression test for startup new user granting permissions it d be great to get a regression test for this path in another pr originally posted by david allison in
1
491,670
14,168,928,971
IssuesEvent
2020-11-12 12:29:29
aau-giraf/api_client
https://api.github.com/repos/aau-giraf/api_client
closed
Fundamental bug in the offline repository
Offline mode group 9 point: 13 priority: highest type: bug
**Describe the bug** Models are stored in plain `.json` format in the offline repository which can produce unexpected behavior when reference models are updated or deleted. Say an `Activity` with `offline_id = 45` is saved in the offline repository with a pictogram with `offline_id = 34`: ``` { "id": "1", "order": "20", "pictogram": { "id": "2" } } ``` If then the pictogram were to be deleted with ``` PictogramModel.offline().delete(34) ``` Then calling `ActivityModel.offline().get(45)` will still return the activity with the deleted picogram. **To Reproduce** Steps to reproduce the behavior: 1. Insert an activity to the repository 2. Delete its pictogram 3. Get the activity again 4. Notice that the pictogram is still there **Expected behavior** Activity should not get the deleted pictogram with it. **Actual behavior** Activity gets the deleted pictogram with it.
1.0
Fundamental bug in the offline repository - **Describe the bug** Models are stored in plain `.json` format in the offline repository which can produce unexpected behavior when reference models are updated or deleted. Say an `Activity` with `offline_id = 45` is saved in the offline repository with a pictogram with `offline_id = 34`: ``` { "id": "1", "order": "20", "pictogram": { "id": "2" } } ``` If then the pictogram were to be deleted with ``` PictogramModel.offline().delete(34) ``` Then calling `ActivityModel.offline().get(45)` will still return the activity with the deleted picogram. **To Reproduce** Steps to reproduce the behavior: 1. Insert an activity to the repository 2. Delete its pictogram 3. Get the activity again 4. Notice that the pictogram is still there **Expected behavior** Activity should not get the deleted pictogram with it. **Actual behavior** Activity gets the deleted pictogram with it.
non_process
fundamental bug in the offline repository describe the bug models are stored in plain json format in the offline repository which can produce unexpected behavior when reference models are updated or deleted say an activity with offline id is saved in the offline repository with a pictogram with offline id id order pictogram id if then the pictogram were to be deleted with pictogrammodel offline delete then calling activitymodel offline get will still return the activity with the deleted picogram to reproduce steps to reproduce the behavior insert an activity to the repository delete its pictogram get the activity again notice that the pictogram is still there expected behavior activity should not get the deleted pictogram with it actual behavior activity gets the deleted pictogram with it
0
10,240
13,099,033,149
IssuesEvent
2020-08-03 20:43:24
googleapis/google-cloud-ruby
https://api.github.com/repos/googleapis/google-cloud-ruby
closed
Migrate google-cloud-bigtable to the microgenerator
api: bigtable type: process
Migrate google-cloud-bigtable to the microgenerator. This involves the following steps: * [x] Write synth file and generate `google-cloud-bigtable-v2` * [x] Write synth file and generate `google-cloud-bigtable-admin-v2` * [x] Make sure the new libraries are configured in kokoro * [x] Release `google-cloud-bigtable-v2` * [x] Release `google-cloud-bigtable-admin-v2` * [x] Switch `google-cloud-bigtable` backend to the versioned gems. That is: * Rip out synth and all the generated code * Add `google-cloud-bigtable-v2` and `google-cloud-bigtable-admin-v2` as dependencies * Update the veneer code to the microgenerator usage * [ ] Release `google-cloud-bigtable` update I do not believe samples need to be updated, unless they invoke the low-level interface directly.
1.0
Migrate google-cloud-bigtable to the microgenerator - Migrate google-cloud-bigtable to the microgenerator. This involves the following steps: * [x] Write synth file and generate `google-cloud-bigtable-v2` * [x] Write synth file and generate `google-cloud-bigtable-admin-v2` * [x] Make sure the new libraries are configured in kokoro * [x] Release `google-cloud-bigtable-v2` * [x] Release `google-cloud-bigtable-admin-v2` * [x] Switch `google-cloud-bigtable` backend to the versioned gems. That is: * Rip out synth and all the generated code * Add `google-cloud-bigtable-v2` and `google-cloud-bigtable-admin-v2` as dependencies * Update the veneer code to the microgenerator usage * [ ] Release `google-cloud-bigtable` update I do not believe samples need to be updated, unless they invoke the low-level interface directly.
process
migrate google cloud bigtable to the microgenerator migrate google cloud bigtable to the microgenerator this involves the following steps write synth file and generate google cloud bigtable write synth file and generate google cloud bigtable admin make sure the new libraries are configured in kokoro release google cloud bigtable release google cloud bigtable admin switch google cloud bigtable backend to the versioned gems that is rip out synth and all the generated code add google cloud bigtable and google cloud bigtable admin as dependencies update the veneer code to the microgenerator usage release google cloud bigtable update i do not believe samples need to be updated unless they invoke the low level interface directly
1
22,013
30,519,117,388
IssuesEvent
2023-07-19 06:43:46
fkdl0048/BookReview
https://api.github.com/repos/fkdl0048/BookReview
closed
Chapter 12: The SOLID Principles of Object-Oriented Design
2023 The Object-Oriented Thought Process
## 12. The SOLID Principles of Object-Oriented Design

One of the most common claims developers make about object-oriented programming is that "the main advantage of object orientation is that it can model the real world." The statement is popular in classical object-oriented thinking, but according to Robert Martin, saying that object orientation mirrors the way we think is mere marketing. Instead, he says, object orientation is about managing dependencies: inverting the key dependencies so that code does not become rigid, fragile, or impossible to reuse.

Classical object-oriented courses often model code directly on real-world situations. For example, if a dog is a kind of mammal (is-a), inheritance is the obvious choice for representing that relationship. Strict has-a and is-a litmus tests were part of the object-oriented mindset for years. As this book shows throughout, however, forcing an inheritance relationship can create design problems. Do you really think a well-chosen inheritance design alone can distinguish dogs that bark from dogs that do not, or birds that fly from birds that cannot? Concentrating on a strict has-a versus is-a decision is not necessarily the best approach; we should focus more on decoupling classes.

Uncle Bob (Robert Martin) defines three terms for code that cannot be reused:

- Rigidity: changing one part of the program forces changes in other parts
- Fragility: errors appear in unrelated places
- Immobility: code cannot be reused outside its original context

SOLID was introduced to solve these problems and achieve these goals. SOLID names the five design principles Robert Martin introduced to "**make software designs more understandable, more flexible, and more maintainable**." According to Uncle Bob, the principles apply to any object-oriented design, but they can also form the core philosophy of methodologies such as **agile** development and adaptive software development.

- SRP: Single Responsibility Principle
- OCP: Open/Closed Principle
- LSP: Liskov Substitution Principle
- ISP: Interface Segregation Principle
- DIP: Dependency Inversion Principle

We will work through these principles while relating them to the classical object-oriented principles that have been around for decades.

### 12.1 The SOLID Principles of Object-Oriented Design

#### 12.1.1 Single Responsibility Principle (SRP)

The Single Responsibility Principle states that a class should have only one reason to change. Each class and module in a program should focus on a single task, so do not put methods that change for different reasons into the same class. If the word "and" shows up when you describe a class, the SRP may be broken. In other words, every module or class should be responsible for a single part of the functionality the software provides, and it should encapsulate that responsibility completely.

Any class that inherits this abstract class must implement the calcArea() method.

```java
abstract class Shape {

    protected String name;
    protected double area;

    public abstract double calcArea();
}
```

The Circle class, which inherits the abstract class, implements calcArea() itself.
```java
class Circle extends Shape {

    private double radius;

    public Circle(double radius) {
        this.radius = radius;
    }

    public double calcArea() {
        area = Math.PI * radius * radius;
        return area;
    }
}
```

> **Note**
> The Circle class was written to keep the focus on the Single Responsibility Principle and to make the example as simple as possible.

A class named CalculaterAreas sums the areas of the different shapes contained in a Shape array.

```java
class CalculaterAreas {

    Shape[] shapes;
    double sumTotal = 0;

    public CalculaterAreas(Shape[] shapes) {
        this.shapes = shapes;
    }

    public double sumAreas() {
        sumTotal = 0;
        for (Shape shape : shapes) {
            sumTotal += shape.calcArea();
        }
        return sumTotal;
    }

    public void output() {
        System.out.println("Sum of the areas of provided shapes: " + sumTotal);
    }
}
```

The CalculaterAreas class is problematic because it also handles output: the area-summing behavior and the output behavior are contained in the same class.

```java
public class TestShape {

    public static void main(String[] args) {
        Circle circle = new Circle(1);

        Shape[] shapeArray = new Shape[1];
        shapeArray[0] = circle;

        CalculaterAreas calculator = new CalculaterAreas(shapeArray);
        calculator.sumAreas();
        calculator.output();
    }
}
```

With this test application in place, we can focus on the Single Responsibility Principle problem. Again, this class contains both summing behavior and output behavior. The fundamental problem is this: changing the functionality of either method requires changing the CalculaterAreas class, whether or not the area-summing method itself changes.

**For example, what if at some point we want HTML output instead of plain text?** Because the responsibilities are coupled, the code that sums the areas has to be recompiled and redeployed as well. The goal of the Single Responsibility Principle is that changing one method does not affect the other, so no unnecessary compilation occurs.

> "A class should have one, and only one, reason to change. In other words, it should have a single responsibility for change."

To solve this, we can put the two methods into separate classes: one for the original console output and one for HTML output.
```java
class CalculaterAreas {

    Shape[] shapes;
    double sumTotal = 0;

    public CalculaterAreas(Shape[] shapes) {
        this.shapes = shapes;
    }

    public double sumAreas() {
        sumTotal = 0;
        for (Shape shape : shapes) {
            sumTotal += shape.calcArea();
        }
        return sumTotal;
    }
}

class OutputAreas {

    double areas = 0;

    public OutputAreas(double areas) {
        this.areas = areas;
    }

    public void console() {
        System.out.println("Sum of the areas of provided shapes: " + areas);
    }

    public void html() {
        System.out.println("<HTML>");
        System.out.println("Sum of the areas of provided shapes: " + areas);
        System.out.println("</HTML>");
    }
}
```

Now let's rewrite the test application.

```java
public class TestShape {

    public static void main(String[] args) {
        Circle circle = new Circle(1);

        Shape[] shapeArray = new Shape[1];
        shapeArray[0] = circle;

        CalculaterAreas calculator = new CalculaterAreas(shapeArray);

        OutputAreas output = new OutputAreas(calculator.sumAreas());
        output.console();
        output.html();
    }
}
```

The point here is that output can be sent to different targets depending on the requirements. If JSON is needed later, a JSON output method can be added to the OutputAreas class without changing the CalculaterAreas class. As a result, CalculaterAreas can be redeployed independently without affecting any other class at all.

```
Thought:
Personally, even though the OutputAreas class now performs only one kind of behavior, I think it would
be better to go a step further and split console output and HTML output into separate classes. It seems
preferable to group them under an interface and have each output implementation implement that interface.
Classes should be split as small as they reasonably can be, and I think it is better to use DI so that
the caller receives the dependency by injection instead of calling new itself.
```

#### 12.1.2 Open/Closed Principle (OCP)

The Open/Closed Principle states that you should be able to extend a class's behavior without modifying the class.
```java
class Rectangle {

    public double length;
    public double width;

    public Rectangle(double length, double width) {
        this.length = length;
        this.width = width;
    }
}

class CalculateAreas {

    private double area;

    public double calculateRectangleArea(Rectangle rectangle) {
        area = rectangle.length * rectangle.width;
        return area;
    }
}

public class OpenClosed {

    public static void main(String[] args) {
        Rectangle rectangle = new Rectangle(1, 2);

        CalculateAreas calculator = new CalculateAreas();
        System.out.println("Area of rectangle: " + calculator.calculateRectangleArea(rectangle));
    }
}
```

The fact that this application works only for rectangles provides the constraint that illustrates the Open/Closed Principle: to add Circle support, the CalculateAreas module itself must be changed. This clearly conflicts with the principle, which states that a module should not have to be modified in order to extend its behavior.

We can create an abstract class called Shape and use inheritance to force subclasses to implement getArea(). Then we can add as many different classes as we like without changing the Shape class itself, and we can now say the Shape class is closed.

```java
abstract class Shape {
    public abstract double getArea();
}

class Rectangle extends Shape {

    public double length;
    public double width;

    public Rectangle(double length, double width) {
        this.length = length;
        this.width = width;
    }

    @Override
    public double getArea() {
        return length * width;
    }
}

class Circle extends Shape {

    public double radius;

    public Circle(double radius) {
        this.radius = radius;
    }

    @Override
    public double getArea() {
        return Math.PI * radius * radius;
    }
}

class CalculateAreas {

    private double area;

    public double calculateArea(Shape shape) {
        area = shape.getArea();
        return area;
    }
}

public class OpenClosed {

    public static void main(String[] args) {
        Shape rectangle = new Rectangle(1, 2);
        Shape circle = new Circle(1);

        CalculateAreas calculator = new CalculateAreas();
        System.out.println("Area of rectangle: " + calculator.calculateArea(rectangle));
        System.out.println("Area of circle: " + calculator.calculateArea(circle));
    }
}
```

Implemented this way, the CalculateAreas class does not have to change when a new Shape is added, which means the code can be extended without worrying about legacy code. The key point of the Open/Closed Principle is that code should be extended through child classes, while the original class should not need to change.
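As a quick check of that claim, here is a minimal sketch in which a brand-new shape slots in with no change to the existing calculator. The Triangle class is my own addition for illustration, not from the chapter, and the chapter's Shape and CalculateAreas are restated only so the sketch compiles on its own:

```java
// Sketch: extending the OCP example with a new shape. Shape and CalculateAreas
// are unchanged restatements of the chapter's classes; Triangle is hypothetical.
abstract class Shape {
    public abstract double getArea();
}

class CalculateAreas {
    public double calculateArea(Shape shape) {
        return shape.getArea();
    }
}

// The only new code: a child class. Nothing above is modified or recompiled.
class Triangle extends Shape {

    private final double base;
    private final double height;

    Triangle(double base, double height) {
        this.base = base;
        this.height = height;
    }

    @Override
    public double getArea() {
        return 0.5 * base * height;
    }
}

public class OpenClosedSketch {
    public static void main(String[] args) {
        CalculateAreas calculator = new CalculateAreas();
        System.out.println("Area of triangle: " + calculator.calculateArea(new Triangle(3, 4)));
    }
}
```

Only the child class is written; the calculator stays closed to modification while the system stays open to extension.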
However, the word *extension* is a sticking point in several discussions of SOLID. If we favor composition over inheritance, how does that affect the Open/Closed Principle? Following one SOLID principle can lead code to follow another as well. For example, you might design code to follow the Open/Closed Principle and find that it also complies with the Single Responsibility Principle!

#### 12.1.3 Liskov Substitution Principle (LSP)

The Liskov Substitution Principle states that a design must allow an instance of a parent class to be replaced with an instance of one of its child classes. If the parent class can do something, the child class must be able to do it too. This may sound reasonable, but let's first look at a case that violates the Liskov Substitution Principle.

```java
abstract class Shape {

    protected double area;

    public abstract double getArea();
}

class Rectangle extends Shape {

    private double length;
    private double width;

    public Rectangle(double length, double width) {
        this.length = length;
        this.width = width;
    }

    @Override
    public double getArea() {
        area = length * width;
        return area;
    }
}

class Square extends Rectangle {

    public Square(double side) {
        super(side, side);
    }
}

public class LiskovSubstitution {

    public static void main(String[] args) {
        Rectangle rectangle = new Rectangle(1, 2);
        System.out.println("Area of rectangle: " + rectangle.getArea());

        Square square = new Square(2);
        System.out.println("Area of square: " + square.getArea());
    }
}
```

The code works fine. A rectangle is a kind of shape (is-a), so everything looks good, and a square is a kind of rectangle (is-a), so there seems to be no problem so far. But what if the relationship does not actually hold? Now things get somewhat philosophical: is a square really a kind of rectangle? Many people say so, but while a square may be a special type of shape, its properties differ from a rectangle's. A rectangle is a quadrilateral, and it is also a parallelogram (its opposite sides are equal). A square, meanwhile, is a rhombus, while a rectangle is not. So there are subtle differences between rectangles and squares.

In practice, the geometry is not what matters in object-oriented design. What matters is how we construct the rectangle and the square.

```java
public Rectangle(double l, double w) {
    length = l;
    width = w;
}
```

The constructor clearly requires two parameters. The parent class Rectangle therefore expects the Square constructor to supply two, yet Square needs only one. In fact, the area-calculating behavior differs subtly between the two classes: Square tricks Rectangle by passing the same parameter twice.
이것은 받아들일 수 있는 해결책처럼 보일지 모르지만,**실제로는 코드를 혼란스럽게 할 수 있고 의도하지 않은 유지보수 문제를 초래할 수 있다.** 이것은 일관성이 없는 설계 결정이며, 아마도 의심스러운 설계 결정일 것이다. 생성자가 다른 생성자를 호출하는 것을 보게 된다면 설계를 일시 중지하고 다시 생각해보는 게 좋다. *적절하지 않은 자식 클래스가 아니여서 그럴 수 있기 때문* 이런 딜레마를 해결하기 위해선 정사각형과 직사각형을 다음과 같이 설계해야 한다. ```java abstract class Shape{ protected double area; public abstract double getArea(); } class Rectangle extends Shape{ private double length; private double width; public Rectangle(double length, double width){ this.length = length; this.width = width; } @Override public double getArea(){ area = length * width; return area; } } class Square extends Shape{ private double side; public Square(double side){ this.side = side; } @Override public double getArea(){ area = side * side; return area; } } public class LiskovSubstitution{ public static void main(String[] args){ Rectangle rectangle = new Rectangle(1, 2); System.out.println("Area of rectangle: " + rectangle.getArea()); Square square = new Square(2); System.out.println("Area of square: " + square.getArea()); } } ``` #### 12.1.4. 인터페이스 분리 원칙(ISP) 인터페이스 분리 원칙(Interface Segregation Principle)에 따르면 몇 개의 큰 인터페이스가 있는 편보다는 작은 인터페이스가 많은 편이 바람직하다. *클래스도 마찬가지* 이번 예제는 Mammal, eat() 및 makeNoise()에 대한 여러 행위를 포함하는 단일 인터페이스를 작성한다. ```java interface Mammal{ public void eat(); public void makeNoise(); } class Dog implements Mammal{ @Override public void eat(){ System.out.println("Dog eats"); } @Override public void makeNoise(){ System.out.println("Dog barks"); } } public class myClass{ public static void main(String[] args){ Dog dog = new Dog(); dog.eat(); dog.makeNoise(); } } ``` 이것은 잘 작동하지만, Mammal 인터페이스는 Dog 클래스에 대해 너무 많은 것을 요구한다. Mammal에 대한 단일 인터페이스를 만드는 대신에 모든 행위에 대해 별도의 인터페이스를 만들 수 있다. 
```java interface Eater{ public void eat(); } interface NoiseMaker{ public void makeNoise(); } class Dog implements Eater, NoiseMaker{ @Override public void eat(){ System.out.println("Dog eats"); } @Override public void makeNoise(){ System.out.println("Dog barks"); } } public class myClass{ public static void main(String[] args){ Dog dog = new Dog(); dog.eat(); dog.makeNoise(); } } ``` 실제로 우리는 Mammal 클래스에서 **행위를 분리**한다. 따라서 상속을 통해 단일 Mammal 엔터티를 만드는 대신에 이전 장에서 취한 전략과 비슷한 **합성 기반 설계**로 이동한다. *즉, 이 원칙을 사용하면 자연스럽게 합성 기반으로 설계가 이동된다.* 이 접근법을 사용하면 단일 Mammal 클래스에 포함된 행위를 강요하지 않고 합성으로 Mammal들을 만들 수 있다. 예를 들어 누군가가 먹지 않고 대신에 피부를 통해 영양분을 흡수하는 Mammal을 발견했다고 가정해보자. eat()이라는 행위가 포함된 단일 Mammal 클래스에서 상속받으면 새 포유류에는 이 행위가 필요하지 않다. 그러나 모든 행위를 별도의 단일 인터페이스로 분리하면 각 포유동물을 정확하게 제시하는 방식으로 구축할 수 있다. #### 12.1.5. 의존성 역전 원칙(DIP) 의존성 역전 원칙(Dependency Inversion Principle)은 코드가 추상화에 의존해야 한다고 명시하고 있다. 종종 의존성 역전과 의존성 주입이라는 용어가 서로 교환해서 쓸 수 있는 말처럼 들리겠지만, 이 원칙을 논의할 때 이해해야 할 몇 가지 핵심 용어는 다음과 같다. - **의존성 역전**: 의존체들을 역전시키는 **원칙** - **의존성 주입**: 의존체들을 역전시키는 **행위** - **생성자 주입**: 생성자를 통해서 의존성을 주입 - **파라미터 주입**: 메서드의 파라미터를 통해서 의존성을 주입 의존성 역전의 목표는 구상적인 것에 결합하기보다는 추상적인 것에 결합하는 것이다. **어떤 시점에서는 분명히 구상적인 것을 만들어야 하지만, 우리는 main() 메서드에서와 같이 가능한 한 사슬을 멀리 뻗어 구상 객체를 만들려고 노력한다.** ``` 생각 가능한 한 사슬을 멀리 뻗어 구상 객체를 만들려고 한다.. 우리는 최대한 추상적으로 다뤄야 코드의 유연성이 높아진다는 것을 알지만, 그 시점이 어디쯤인지 몰라서 망설이는 경우도 있을 것 이라고 생각한다. 멀리 뻗어 구상 객체를 만들려고 한다는 말 자체가 앞 장을 읽고 경험을 해보니 이해가 되는 말이라 되게 와닿는다. ``` 의존성 역전 원칙의 목표 중 하나는 컴파일타임이 아니라 **런타임에 객체를 선택**하는 것이다. 우리는 이전 클래스를 다시 컴파일하지 않고도 새 클래스를 작성할 수도 있다.(새 클래스를 작성해 주입) ##### 1단계: 초기 예제 Mammal클래스의 예제를 다시 살펴본다. ```java abstract class Mammal{ public abstract String makeNoise(); } ``` Cat과 같은 자식 클래스는 상속을 사용해 포유류의 행위인 makeNoise()를 활용한다. 
```java class Cat extends Mammal{ @Override public String makeNoise(){ return "Meow"; } } class Dog extends Mammal{ @Override public String makeNoise(){ return "Bark"; } } ``` ```java public class TestMammal{ public static void main(String[] args){ Mammal cat = new Cat(); Mammal dog = new Dog(); System.out.println(cat.makeNoise()); System.out.println(dog.makeNoise()); } } ``` ##### 2단계: 행위를 분리해 내기 앞의 코드에는 잠재적으로 심각한 결함이 있다. 코드는 포유류와 행위를 연결한다. 포유류의 행위를 포유동물 자체로부터 분리하면 상당한 이점을 얻을 수 있다. 이렇게 하기 위해 우리는 포유류뿐만 아니라 포유류가 아닌 것들도 모두 사용할 수 있는 MakingNoise라는 클래스를 만든다. ```java abstract class MakingNoise{ public abstract String makeNoise(); } class CatNoise extends MakingNoise{ @Override public String makeNoise(){ return "Meow"; } } ``` Cat 클래스와 분리된 MakingNoise 행위를 사용하면 다음 코드 조각과 같이 Cat 클래스 자체의 하드코딩된 행위 대신에 CatNoise 클래스를 사용할 수 있다. ```java abstract class Mammal{ public abstract String makeNoise(); } class Cat extends Mammal{ private MakingNoise noise = new CatNoise(); @Override public String makeNoise(){ return noise.makeNoise(); } } ``` 두 번째 단계까지 적용된 전체 애플리케이션이다. ```java public class TestMammal{ public static void main(String[] args){ Mammal cat = new Cat(); Mammal dog = new Dog(); System.out.println(cat.makeNoise()); System.out.println(dog.makeNoise()); } } ``` 문제는 명확하다. 주요 부분은 분리했지만, Cat은 여전히 Cat울음 소리내기 행위를 인스턴스화 하기 때문에 우리는 의존성 역전이라는 목표에 도달하지 못했다는 것이다. `private CatNoise noise = new CatNoise();` Cat은 저수준 모듈인 CatNoise에 결합된다. 다시 말해서 Cat은 CatNoise와 연결되서는 안 되며, 울음 생성을 위한 추상화에 연결되어야 한다. 실제로 Cat 클래스는 울음 생성 동작을 인스턴스화하지 말고 대신 주입을 통해 받아야 한다. *이는 처음 설명한 구상적인 부분이 아닌 런타임에 추상적으로 처리 가능하기 때문에 Cat이 메인이 아니다.* ##### 3단계: 의존성 주입 이 마지막 단계에서 우리는 설계의 상속 측면을 완전히 버리고 합성을 통한 의존성 주입을 활용하는 방법을 조사한다. 상속보다는 합성이라는 개념이 탄력을 받는 주요 이유 중 하나는 **상속 위계구조가 필요하지 않다는 점**이다. 초기 구현엔 Cat과 Dog가 기본적으로 정확히 같은 코드를 포함하고 있다. 서로 다른 울음소리만 돌려줄 뿐이다. 결과적으로는 상당한 중복이 발생한다. 따라서 포유동물이 많으면 울음소리를 유발하는 코드가 많을 것이다. 아마도 더 나은 설계는 포유류가 울음소리를 내도록 코드를 취하는 것이다. 한 단계 나아가서 특정 포유류를 버리고 다음과 같이 간단히 Mammal클래스를 사용하는 것이다. 
```java class Mammal{ MakingNoise noise; public Mammal(MakingNoise noise){ this.noise = noise; } public String makeNoise(){ return noise.makeNoise(); } } ``` 이제는 Cat 울음 소리 생성 행위를 인스턴스화하고 이를 Animal 클래스에 제공하여 Cat처럼 행위를 하는 포유류를 만들 수 있다. 실제로, 전통적인 클래스 구축 기술을 사용하는 대신에 행위를 주입하여 언제든 Cat을 조립할 수 있다. ```java Mammal cat = new Mammal(new CatNoise()); ``` 이제는 의존성 주입을 논의할 때 **객체를 실제로 인스턴스화하는 시점**이 중요한 고려 사항이다. 목표는 주입을 통해 객체를 작성하는 것이지만, 어느 시점에서는 객체를 인스턴스화해야 한다. 결과적으로 설계 결정은 이 인스턴스화를 수행할 시기를 중심으로 이루어진다. 의존성 역전의 목표는 특정 시점에서 구체적으로 무언가를 만들어야 하지만, 무언가 구상적인 게 아닌 추상적인 것에 결합하는 것이다. **따라서 간단한 목표 중 하나는(new 키워드를 사용함으로써) 메서드에서 그러는 것처럼 최대한 멀리까지 이어지게 구상 객체를 만드는 것이다.** new 키워드를 볼 때 마다 언제나 그 대상의 값을 평가하자. ### 12.2. 결론 SOLID원칙은 오는날 사용되는 가장 영향력 있는 객체지향 지침 중 하나이다. 필자는 SOLID의 가장 흥미로운 점을 상투적인 부분이 없다는 것을 강조하는데 각각의 특성이 캡슐화, 상속, 다형성 그리고 합성까지 연결되는 부분에 집중한다. 특히, 상속 대 합성.. 개인적인 생각 + 많은 말들이 그러하듯 정말 좋은 원칙이고 따라야 하는 것은 맞지만 신봉하지는 말자 주의이다. 팀 협업에 있어 다른 인원이 이해하기 쉽고 유지보수가 쉬운 코드를 작성하는 것이 더 중요하다고 생각된다. 물론 이런 원칙에 맞춰 짜여진 코드가 읽기 쉽겠지만 다른 사람의 코드가 그러지 못하다 해서 비난하는게 맞을까? 코드리뷰를 통해 교정은 해야하겠지만 모른다고 무시하는 것은 아닌 것 같다. 최근 친구와 대화에서 코틀린 관련 public에 대한 이야기를 나눴는데 친구는 객체지향 빠돌이에 대한 부정적인 인식이 강했다. 개인적으로 객체지향을 공부하는 입장에선 쉽게 이해가지 않았지만 친구의 자바 롬복 라이브러리, 코틀린의 성격에 대한 이야기를 들어보니 그럴 수 있겠다는 생각이 들었다.. 너무 개인적인 이야기까지 책 리뷰에 적은 것 같지만 이번 책은 여기서 마무리.
1.0
## 12. The SOLID Principles of Object-Oriented Design

One of the most common claims developers make about object-oriented programming is that its main advantage is the ability to model the real world. The line comes up often in classic object-oriented courses, but according to Robert Martin, saying that object orientation mirrors the way we think is just marketing. Instead, he says, object orientation is about managing dependencies: inverting the key dependencies so that code does not become rigid, fragile, and impossible to reuse.

Classic object-oriented programming courses often model code on real-world situations. For example, if a dog is a kind of mammal (is-a), representing that relationship with inheritance seems like the obvious choice. Strict has-a and is-a litmus tests have been part of the object-oriented mindset for years. As this book shows throughout, however, forcing inheritance relationships can cause design problems. Do you really believe that choosing the right inheritance design is all it takes to distinguish dogs that bark from dogs that don't, or birds that fly from birds that can't? Focusing strictly on has-a versus is-a decisions is not the best approach; we should focus more on decoupling classes.

Uncle Bob (Robert Martin) defines three terms to describe code that cannot be reused:

- Rigidity: changing one part of the program forces changes in other parts
- Fragility: errors appear in unrelated places
- Immobility: code cannot be reused outside its original context

SOLID was introduced to solve these problems and achieve those goals. SOLID is the name of the five design principles Robert Martin introduced to '**make software designs easier to understand, more flexible, and more maintainable**'. According to Uncle Bob, they apply to every object-oriented design, but the SOLID principles can also form the core philosophy of methodologies such as **agile** or adaptive software development.

- SRP: Single Responsibility Principle
- OCP: Open-Closed Principle
- LSP: Liskov Substitution Principle
- ISP: Interface Segregation Principle
- DIP: Dependency Inversion Principle

We will work through these principles while relating them to the classic object-oriented principles that have been around for decades.

### 12.1. The SOLID Principles of Object-Oriented Design

#### 12.1.1. Single Responsibility Principle (SRP)

The Single Responsibility Principle (SRP) states that a class should have only a single reason to change. Each class and module in a program should focus on a single task, so do not put methods that change for different reasons into the same class. If the sentence describing a class contains the word 'and', the SRP may be broken. In other words, every module or class should be responsible for only a single part of the functionality the software provides, and it should encapsulate that responsibility completely.

A class that inherits from this abstract class must implement the calcArea() method.

```java
abstract class Shape {

    protected String name;
    protected double area;

    public abstract double calcArea();
}
```

The Circle class, which inherits from the abstract class, implements calcArea() itself.
```java
class Circle extends Shape {

    private double radius;

    public Circle(double radius) {
        this.radius = radius;
    }

    public double calcArea() {
        area = Math.PI * radius * radius;
        return area;
    }
}
```

> **Note**
> The Circle class is written not only to focus on the Single Responsibility Principle but also to keep the example as simple as possible.

A class named CalculatorAreas sums the areas of the various shapes contained in a Shape array.

```java
class CalculatorAreas {

    Shape[] shapes;
    double sumTotal = 0;

    public CalculatorAreas(Shape[] shapes) {
        this.shapes = shapes;
    }

    public double sumAreas() {
        sumTotal = 0;
        for (Shape shape : shapes) {
            sumTotal += shape.calcArea();
        }
        return sumTotal;
    }

    public void output() {
        System.out.println("Sum of the areas of provided shapes: " + sumTotal);
    }
}
```

The CalculatorAreas class is problematic because it also handles output: the area-calculation behavior and the output behavior are contained in the same class.

```java
public class TestShape {

    public static void main(String[] args) {
        Circle circle = new Circle(1);
        Shape[] shapeArray = new Shape[1];
        shapeArray[0] = circle;

        CalculatorAreas calculator = new CalculatorAreas(shapeArray);
        calculator.sumAreas();
        calculator.output();
    }
}
```

With the test application in place, we can focus on the Single Responsibility Principle problem. Again, this class contains both the output behavior and the summing behavior. The fundamental problem is this: to change what one of the methods does, we must change the CalculatorAreas class regardless of whether the area-summing method is the one changing. **For example, what if at some point we want to output HTML instead of simple text?** Because the responsibilities are coupled, the code that sums the areas must be recompiled and redeployed. According to the Single Responsibility Principle, the goal is that changing one method does not affect the other, so that no unnecessary compilation occurs.

> 'A class should have one, and only one, reason to change. That is, its responsibility for change should be single.'

To solve this, we can put the two methods into separate classes: one for the original console output and one for HTML output.
```java
class CalculatorAreas {

    Shape[] shapes;
    double sumTotal = 0;

    public CalculatorAreas(Shape[] shapes) {
        this.shapes = shapes;
    }

    public double sumAreas() {
        sumTotal = 0;
        for (Shape shape : shapes) {
            sumTotal += shape.calcArea();
        }
        return sumTotal;
    }
}

class OutputAreas {

    double areas = 0;

    public OutputAreas(double areas) {
        this.areas = areas;
    }

    public void console() {
        System.out.println("Sum of the areas of provided shapes: " + areas);
    }

    public void html() {
        System.out.println("<HTML>");
        System.out.println("Sum of the areas of provided shapes: " + areas);
        System.out.println("</HTML>");
    }
}
```

Now let's rewrite the test application.

```java
public class TestShape {

    public static void main(String[] args) {
        Circle circle = new Circle(1);
        Shape[] shapeArray = new Shape[1];
        shapeArray[0] = circle;

        CalculatorAreas calculator = new CalculatorAreas(shapeArray);
        OutputAreas output = new OutputAreas(calculator.sumAreas());
        output.console();
        output.html();
    }
}
```

The point here is that output can be sent to various targets depending on the requirements. Beyond that, if JSON is needed, a JSON output method can be added to the OutputAreas class without changing the CalculatorAreas class. As a result, the CalculatorAreas class can be redeployed independently without affecting any other class.

```
Thoughts
Personally, even though the OutputAreas class now performs only one kind of behavior, I think it would be better to go a step further and split the console output and HTML output into separate classes. Grouping them under an interface and having each class implement the output behavior seems preferable. Classes should be split small whenever they can be, and with DI it is better for the user to receive an injected instance rather than call new.
```

#### 12.1.2. Open/Closed Principle (OCP)

The Open/Closed Principle (OCP) states that you should be able to extend a class's behavior without modifying the class.
```java
class Rectangle {

    public double length;
    public double width;

    public Rectangle(double length, double width) {
        this.length = length;
        this.width = width;
    }
}

class CalculateAreas {

    private double area;

    public double calculateRectangleArea(Rectangle rectangle) {
        area = rectangle.length * rectangle.width;
        return area;
    }
}

public class OpenClosed {

    public static void main(String[] args) {
        Rectangle rectangle = new Rectangle(1, 2);
        CalculateAreas calculator = new CalculateAreas();
        System.out.println("Area of rectangle: " + calculator.calculateRectangleArea(rectangle));
    }
}
```

The fact that this application works only for rectangles provides the constraint that illustrates the Open/Closed Principle. Adding Circle to the CalculateAreas class requires changing the module itself. That clearly conflicts with the Open/Closed Principle, which states that a module should not have to be modified in order to be extended. If we create an abstract class called Shape and force subclasses to implement getArea(), we can add as many different classes as we want without changing the Shape class itself. We can now say the Shape class is closed.

```java
abstract class Shape {

    public abstract double getArea();
}

class Rectangle extends Shape {

    public double length;
    public double width;

    public Rectangle(double length, double width) {
        this.length = length;
        this.width = width;
    }

    @Override
    public double getArea() {
        return length * width;
    }
}

class Circle extends Shape {

    public double radius;

    public Circle(double radius) {
        this.radius = radius;
    }

    @Override
    public double getArea() {
        return Math.PI * radius * radius;
    }
}

class CalculateAreas {

    private double area;

    public double calculateArea(Shape shape) {
        area = shape.getArea();
        return area;
    }
}

public class OpenClosed {

    public static void main(String[] args) {
        Shape rectangle = new Rectangle(1, 2);
        Shape circle = new Circle(1);
        CalculateAreas calculator = new CalculateAreas();
        System.out.println("Area of rectangle: " + calculator.calculateArea(rectangle));
        System.out.println("Area of circle: " + calculator.calculateArea(circle));
    }
}
```

With this implementation, the CalculateAreas class does not need to change when a new Shape is added, which means the code can be extended without worrying about legacy code. The key point of the Open/Closed Principle is that code should be extended through subclasses, and the original class should not need to change.
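To make that point concrete, here is a minimal sketch of extending the design with one more shape. The Triangle class is my own example rather than one from the book, and Shape and CalculateAreas are repeated here so the snippet is self-contained; notice that nothing in the existing module changes.

```java
// Sketch: extending the design with a new shape touches no existing class.
// Shape and CalculateAreas mirror the example above; Triangle is a
// hypothetical addition used only for illustration.
abstract class Shape {

    public abstract double getArea();
}

class CalculateAreas {

    public double calculateArea(Shape shape) {
        return shape.getArea();
    }
}

// The only new code required: one subclass.
class Triangle extends Shape {

    public double base;
    public double height;

    public Triangle(double base, double height) {
        this.base = base;
        this.height = height;
    }

    @Override
    public double getArea() {
        return 0.5 * base * height;
    }
}

public class OpenClosedTriangle {

    public static void main(String[] args) {
        CalculateAreas calculator = new CalculateAreas();
        // CalculateAreas knows nothing about Triangle, yet handles it
        // through the Shape abstraction.
        System.out.println("Area of triangle: " + calculator.calculateArea(new Triangle(3, 4)));
    }
}
```

Whether you would actually name the class Triangle is beside the point; the mechanism is that CalculateAreas depends only on the Shape abstraction, so new shapes arrive as subclasses rather than as edits.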
However, the word extension becomes a sticking point in many discussions of SOLID. If we prefer composition over inheritance, how does that affect the Open/Closed Principle? When code follows one of the SOLID principles, it may need to follow another one as well. For example, code designed to follow the Open/Closed Principle may turn out to comply with the Single Responsibility Principle too!

```
Thoughts
The Single Responsibility Principle section implemented polymorphism with Shape, and so does this one. The difference seems to be the interface design for the behavior of the CalculateAreas class. An example will probably follow that keeps that interface open and splits classes for the sake of polymorphism and encapsulation.
```

#### 12.1.3. Liskov Substitution Principle (LSP)

The Liskov Substitution Principle (LSP) states that a design must allow an instance of a parent class to be replaced with an instance of one of its subclasses. If the parent class can do something, the subclass must be able to do it too. That may sound reasonable, but let's first look at an example that violates the Liskov Substitution Principle.

```java
abstract class Shape {

    protected double area;

    public abstract double getArea();
}

class Rectangle extends Shape {

    private double length;
    private double width;

    public Rectangle(double length, double width) {
        this.length = length;
        this.width = width;
    }

    @Override
    public double getArea() {
        area = length * width;
        return area;
    }
}

class Square extends Rectangle {

    public Square(double side) {
        super(side, side);
    }
}

public class LiskovSubstitution {

    public static void main(String[] args) {
        Rectangle rectangle = new Rectangle(1, 2);
        System.out.println("Area of rectangle: " + rectangle.getArea());

        Square square = new Square(2);
        System.out.println("Area of square: " + square.getArea());
    }
}
```

This code has no obvious problem. A rectangle is a kind of shape (is-a), so everything looks fine. A square is also a kind of rectangle (is-a), so there still seems to be no problem. But what if the relationship isn't really like that? Now the discussion turns somewhat philosophical: is a square actually a kind of rectangle? Many people say so, but while a square may be a special type of shape, its properties differ from a rectangle's. A rectangle is a quadrilateral and also a parallelogram (its opposite sides are equal). A square, meanwhile, is a rhombus, which a rectangle is not. So there are subtle differences between rectangles and squares. In practice, though, geometry is not the problem in object-oriented design; the problem is how we construct rectangles and squares.

```java
public Rectangle(double l, double w) {
    length = l;
    width = w;
}
```

The constructor clearly requires two parameters. The parent class Rectangle expects two constructor arguments, but Square needs only one. In fact, the area-calculating functionality differs subtly between the two classes: Square fools Rectangle by passing the same parameter twice.
This may look like an acceptable solution, but **in practice it can make the code confusing and cause unintended maintenance problems.** It is an inconsistent, and probably questionable, design decision. Whenever you see a constructor calling another constructor, it is worth pausing and rethinking the design, because the subclass may simply not be an appropriate subclass. To resolve this dilemma, the square and the rectangle should be designed as follows.

```java
abstract class Shape {

    protected double area;

    public abstract double getArea();
}

class Rectangle extends Shape {

    private double length;
    private double width;

    public Rectangle(double length, double width) {
        this.length = length;
        this.width = width;
    }

    @Override
    public double getArea() {
        area = length * width;
        return area;
    }
}

class Square extends Shape {

    private double side;

    public Square(double side) {
        this.side = side;
    }

    @Override
    public double getArea() {
        area = side * side;
        return area;
    }
}

public class LiskovSubstitution {

    public static void main(String[] args) {
        Rectangle rectangle = new Rectangle(1, 2);
        System.out.println("Area of rectangle: " + rectangle.getArea());

        Square square = new Square(2);
        System.out.println("Area of square: " + square.getArea());
    }
}
```

#### 12.1.4. Interface Segregation Principle (ISP)

The Interface Segregation Principle (ISP) states that many small interfaces are preferable to a few large ones. (The same goes for classes.) This example writes a single Mammal interface containing multiple behaviors, eat() and makeNoise().

```java
interface Mammal {

    public void eat();
    public void makeNoise();
}

class Dog implements Mammal {

    @Override
    public void eat() {
        System.out.println("Dog eats");
    }

    @Override
    public void makeNoise() {
        System.out.println("Dog barks");
    }
}

public class MyClass {

    public static void main(String[] args) {
        Dog dog = new Dog();
        dog.eat();
        dog.makeNoise();
    }
}
```

This works fine, but the Mammal interface demands too much of the Dog class. Instead of creating a single interface for Mammal, we can create a separate interface for each behavior.
```java
interface Eater {

    public void eat();
}

interface NoiseMaker {

    public void makeNoise();
}

class Dog implements Eater, NoiseMaker {

    @Override
    public void eat() {
        System.out.println("Dog eats");
    }

    @Override
    public void makeNoise() {
        System.out.println("Dog barks");
    }
}

public class MyClass {

    public static void main(String[] args) {
        Dog dog = new Dog();
        dog.eat();
        dog.makeNoise();
    }
}
```

In effect, we **separate the behaviors** out of the Mammal class. So instead of creating a single Mammal entity through inheritance, we move toward a **composition-based design**, similar to the strategy taken in the previous chapter. (In other words, using this principle naturally moves the design toward composition.) With this approach, we can build mammals through composition without forcing on them the behaviors contained in a single Mammal class. For example, suppose someone discovers a mammal that does not eat and instead absorbs nutrients through its skin. If it inherits from a single Mammal class that includes the eat() behavior, the new mammal is stuck with a behavior it does not need. But if every behavior is separated into its own interface, each mammal can be built in a way that represents it accurately.

#### 12.1.5. Dependency Inversion Principle (DIP)

The Dependency Inversion Principle (DIP) states that code should depend on abstractions. The terms dependency inversion and dependency injection often sound interchangeable, but here are a few key terms to understand when discussing this principle:

- **Dependency inversion**: the **principle** of inverting dependencies
- **Dependency injection**: the **act** of inverting dependencies
- **Constructor injection**: injecting a dependency through a constructor
- **Parameter injection**: injecting a dependency through a method parameter

The goal of dependency inversion is to couple to abstractions rather than to concretions. **At some point something concrete obviously has to be created, but we try to push the creation of concrete objects as far down the chain as possible, as in the main() method.**

```
Thoughts
"Push the creation of concrete objects as far down the chain as possible"... We know that handling things as abstractly as possible makes code more flexible, but I think we sometimes hesitate because we don't know where that point is. Having read the previous chapters and gained some experience, that phrase really resonates with me now.
```

One goal of the Dependency Inversion Principle is to **select objects at runtime** rather than at compile time. We can even write new classes without recompiling the old ones (write a new class and inject it).

##### Step 1: Initial example

Let's revisit the Mammal class example.

```java
abstract class Mammal {

    public abstract String makeNoise();
}
```

Subclasses such as Cat use inheritance to take advantage of the mammal behavior makeNoise().
```java
class Cat extends Mammal {

    @Override
    public String makeNoise() {
        return "Meow";
    }
}

class Dog extends Mammal {

    @Override
    public String makeNoise() {
        return "Bark";
    }
}
```

```java
public class TestMammal {

    public static void main(String[] args) {
        Mammal cat = new Cat();
        Mammal dog = new Dog();
        System.out.println(cat.makeNoise());
        System.out.println(dog.makeNoise());
    }
}
```

##### Step 2: Separating out the behavior

The preceding code has a potentially serious flaw: it ties mammals to their behaviors. Separating a mammal's behavior from the mammal itself yields a significant benefit. To do this, we create a class called MakingNoise that can be used by mammals and non-mammals alike.

```java
abstract class MakingNoise {

    public abstract String makeNoise();
}

class CatNoise extends MakingNoise {

    @Override
    public String makeNoise() {
        return "Meow";
    }
}
```

With the MakingNoise behavior separated from the Cat class, we can use the CatNoise class in place of behavior hard-coded into the Cat class itself, as in the following snippet.

```java
abstract class Mammal {

    public abstract String makeNoise();
}

class Cat extends Mammal {

    private MakingNoise noise = new CatNoise();

    @Override
    public String makeNoise() {
        return noise.makeNoise();
    }
}
```

Here is the full application with the second step applied.

```java
public class TestMammal {

    public static void main(String[] args) {
        Mammal cat = new Cat();
        Mammal dog = new Dog();
        System.out.println(cat.makeNoise());
        System.out.println(dog.makeNoise());
    }
}
```

The problem is clear. We have separated out the major pieces, but Cat still instantiates the cat-noise-making behavior, so we have not reached the goal of dependency inversion.

`private MakingNoise noise = new CatNoise();`

Cat is coupled to CatNoise, a low-level module. In other words, Cat should not be wired to CatNoise; it should be wired to an abstraction for noise making. The Cat class should not instantiate the noise-making behavior at all but should receive it through injection instead. (As explained at the start, this can be handled abstractly at runtime rather than concretely, so Cat is not the place for it.)

##### Step 3: Dependency injection

In this final step, we abandon the inheritance side of the design entirely and look at using dependency injection through composition. One of the main reasons the idea of composition over inheritance has gained momentum is that **no inheritance hierarchy is required.** In the initial implementation, Cat and Dog contain essentially identical code; they just return different noises. The result is considerable duplication: with many mammals, there would be a lot of noise-triggering code. A better design is probably to factor out the code that has a mammal make its noise. Taking it one step further, we can drop the specific mammals and simply use the Mammal class, as follows.
```java
class Mammal {

    MakingNoise noise;

    public Mammal(MakingNoise noise) {
        this.noise = noise;
    }

    public String makeNoise() {
        return noise.makeNoise();
    }
}
```

Now we can create a mammal that behaves like a cat by instantiating the cat-noise-making behavior and supplying it to the Mammal class. In effect, instead of using traditional class-building techniques, we can assemble a Cat at any time by injecting the behavior.

```java
Mammal cat = new Mammal(new CatNoise());
```

When discussing dependency injection, **the point at which an object is actually instantiated** becomes an important consideration. The goal is to build objects through injection, but at some point objects have to be instantiated, so the design decisions revolve around when to perform that instantiation. The goal of dependency inversion is that, even though something concrete must be created at some point, we couple to something abstract rather than something concrete. **So one simple goal is to push the creation of concrete objects (via the new keyword) as far down the chain as possible, as we do in the main() method.** Whenever you see the new keyword, always evaluate what it is creating.

### 12.2. Conclusion

The SOLID principles are among the most influential object-oriented guidelines in use today. What the author emphasizes as the most interesting thing about SOLID is that none of it is formulaic; he focuses on how each principle connects back to encapsulation, inheritance, polymorphism, and even composition. Especially inheritance versus composition...

Personal thoughts: as with many such sayings, these really are good principles that should be followed, but my position is not to worship them. In team collaboration, I think it matters more to write code that other members find easy to understand and maintain. Code written to these principles is of course easier to read, but is it right to criticize someone just because their code falls short of them? Corrections should come through code review, but dismissing people for not knowing the principles doesn't seem right. I recently talked with a friend about `public` in Kotlin, and he had a strongly negative view of object-orientation zealots. As someone studying object orientation I couldn't relate at first, but after hearing his take on Java's Lombok library and the character of Kotlin, I could see how he got there. This has gotten rather personal for a book review, so I'll wrap up this book here.
makenoise 두 번째 단계까지 적용된 전체 애플리케이션이다 java public class testmammal public static void main string args mammal cat new cat mammal dog new dog system out println cat makenoise system out println dog makenoise 문제는 명확하다 주요 부분은 분리했지만 cat은 여전히 cat울음 소리내기 행위를 인스턴스화 하기 때문에 우리는 의존성 역전이라는 목표에 도달하지 못했다는 것이다 private catnoise noise new catnoise cat은 저수준 모듈인 catnoise에 결합된다 다시 말해서 cat은 catnoise와 연결되서는 안 되며 울음 생성을 위한 추상화에 연결되어야 한다 실제로 cat 클래스는 울음 생성 동작을 인스턴스화하지 말고 대신 주입을 통해 받아야 한다 이는 처음 설명한 구상적인 부분이 아닌 런타임에 추상적으로 처리 가능하기 때문에 cat이 메인이 아니다 의존성 주입 이 마지막 단계에서 우리는 설계의 상속 측면을 완전히 버리고 합성을 통한 의존성 주입을 활용하는 방법을 조사한다 상속보다는 합성이라는 개념이 탄력을 받는 주요 이유 중 하나는 상속 위계구조가 필요하지 않다는 점 이다 초기 구현엔 cat과 dog가 기본적으로 정확히 같은 코드를 포함하고 있다 서로 다른 울음소리만 돌려줄 뿐이다 결과적으로는 상당한 중복이 발생한다 따라서 포유동물이 많으면 울음소리를 유발하는 코드가 많을 것이다 아마도 더 나은 설계는 포유류가 울음소리를 내도록 코드를 취하는 것이다 한 단계 나아가서 특정 포유류를 버리고 다음과 같이 간단히 mammal클래스를 사용하는 것이다 java class mammal makingnoise noise public mammal makingnoise noise this noise noise public string makenoise return noise makenoise 이제는 cat 울음 소리 생성 행위를 인스턴스화하고 이를 animal 클래스에 제공하여 cat처럼 행위를 하는 포유류를 만들 수 있다 실제로 전통적인 클래스 구축 기술을 사용하는 대신에 행위를 주입하여 언제든 cat을 조립할 수 있다 java mammal cat new mammal new catnoise 이제는 의존성 주입을 논의할 때 객체를 실제로 인스턴스화하는 시점 이 중요한 고려 사항이다 목표는 주입을 통해 객체를 작성하는 것이지만 어느 시점에서는 객체를 인스턴스화해야 한다 결과적으로 설계 결정은 이 인스턴스화를 수행할 시기를 중심으로 이루어진다 의존성 역전의 목표는 특정 시점에서 구체적으로 무언가를 만들어야 하지만 무언가 구상적인 게 아닌 추상적인 것에 결합하는 것이다 따라서 간단한 목표 중 하나는 new 키워드를 사용함으로써 메서드에서 그러는 것처럼 최대한 멀리까지 이어지게 구상 객체를 만드는 것이다 new 키워드를 볼 때 마다 언제나 그 대상의 값을 평가하자 결론 solid원칙은 오는날 사용되는 가장 영향력 있는 객체지향 지침 중 하나이다 필자는 solid의 가장 흥미로운 점을 상투적인 부분이 없다는 것을 강조하는데 각각의 특성이 캡슐화 상속 다형성 그리고 합성까지 연결되는 부분에 집중한다 특히 상속 대 합성 개인적인 생각 많은 말들이 그러하듯 정말 좋은 원칙이고 따라야 하는 것은 맞지만 신봉하지는 말자 주의이다 팀 협업에 있어 다른 인원이 이해하기 쉽고 유지보수가 쉬운 코드를 작성하는 것이 더 중요하다고 생각된다 물론 이런 원칙에 맞춰 짜여진 코드가 읽기 쉽겠지만 다른 사람의 코드가 그러지 못하다 해서 비난하는게 맞을까 코드리뷰를 통해 교정은 해야하겠지만 모른다고 무시하는 것은 아닌 것 같다 최근 친구와 대화에서 코틀린 관련 public에 대한 이야기를 나눴는데 친구는 객체지향 빠돌이에 대한 부정적인 인식이 강했다 개인적으로 객체지향을 공부하는 입장에선 쉽게 이해가지 않았지만 친구의 자바 롬복 라이브러리 코틀린의 성격에 대한 이야기를 들어보니 그럴 수 
있겠다는 생각이 들었다 너무 개인적인 이야기까지 책 리뷰에 적은 것 같지만 이번 책은 여기서 마무리
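The runtime-selection point in the dependency-inversion discussion above can be sketched as one small, self-contained Java program. This is my own illustration, not code from the book: the `MakingNoise`, `CatNoise`, and `Mammal` names follow the text, but `DogNoise` and the command-line switch in `main` are assumed additions.

```java
// Minimal dependency-inversion sketch: the concrete noise behavior is chosen
// at runtime in main(), so Mammal itself never names a concrete class.
abstract class MakingNoise {
    public abstract String makeNoise();
}

class CatNoise extends MakingNoise {
    @Override
    public String makeNoise() { return "Meow"; }
}

// DogNoise is an addition for this demo; the text only builds CatNoise.
class DogNoise extends MakingNoise {
    @Override
    public String makeNoise() { return "Bark"; }
}

class Mammal {
    private final MakingNoise noise;

    public Mammal(MakingNoise noise) { // constructor injection
        this.noise = noise;
    }

    public String makeNoise() {
        return noise.makeNoise();
    }
}

public class DipDemo {
    public static void main(String[] args) {
        // The only 'new' of a concrete behavior sits here, at the end of the chain.
        MakingNoise behavior = (args.length > 0 && args[0].equals("dog"))
                ? new DogNoise()
                : new CatNoise();
        System.out.println(new Mammal(behavior).makeNoise()); // prints "Meow" by default
    }
}
```

Run with no arguments it assembles a cat; run with `dog` it assembles a dog, all without touching or recompiling `Mammal`.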
1
355,487
25,175,949,508
IssuesEvent
2022-11-11 09:16:37
o-ohst/pe
https://api.github.com/repos/o-ohst/pe
opened
Confusing description for command format in UG
severity.Low type.DocumentationBug
![Screenshot 2022-11-11 at 5.15.32 PM.png](https://raw.githubusercontent.com/o-ohst/pe/main/files/283cd0f4-b0b8-421d-94a0-80a708d8f8eb.png) Here, ` ` is given as an example for tag, which can be confusing.
1.0
Confusing description for command format in UG - ![Screenshot 2022-11-11 at 5.15.32 PM.png](https://raw.githubusercontent.com/o-ohst/pe/main/files/283cd0f4-b0b8-421d-94a0-80a708d8f8eb.png) Here, ` ` is given as an example for tag, which can be confusing.
non_process
confusing description for command format in ug here is given as an example for tag which can be confusing
0
21,177
28,145,880,146
IssuesEvent
2023-04-02 13:29:13
firebase/firebase-cpp-sdk
https://api.github.com/repos/firebase/firebase-cpp-sdk
closed
[C++] Nightly Integration Testing Report
type: process nightly-testing
Note: This report excludes firestore. Please also check **[the report for firestore](https://github.com/firebase/firebase-cpp-sdk/issues/1178)** *** <hidden value="integration-test-status-comment"></hidden> ### ✅&nbsp; [build against repo] Integration test succeeded! Requested by @DellaBitta on commit 73ce6feb70d3e830676aafa1d0ded64a57f07fb8 Last updated: Sun Apr 2 04:24 PDT 2023 **[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/4587871270)** <hidden value="integration-test-status-comment"></hidden> *** ### ❌&nbsp; [build against SDK] Integration test FAILED Requested by @firebase-workflow-trigger[bot] on commit 73ce6feb70d3e830676aafa1d0ded64a57f07fb8 Last updated: Sat Apr 1 07:03 PDT 2023 **[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/4583102181)** | Failures | Configs | |----------|---------| | missing_log | [BUILD] [ERROR] [MacOS] [1/2 ssl_lib: arm64] [boringssl]<br/>[TEST] [ERROR] [MacOS] [1/2 ssl_lib: arm64] [boringssl]<br/> | Add flaky tests to **[go/fpl-cpp-flake-tracker](http://go/fpl-cpp-flake-tracker)** <hidden value="integration-test-status-comment"></hidden>
1.0
[C++] Nightly Integration Testing Report - Note: This report excludes firestore. Please also check **[the report for firestore](https://github.com/firebase/firebase-cpp-sdk/issues/1178)** *** <hidden value="integration-test-status-comment"></hidden> ### ✅&nbsp; [build against repo] Integration test succeeded! Requested by @DellaBitta on commit 73ce6feb70d3e830676aafa1d0ded64a57f07fb8 Last updated: Sun Apr 2 04:24 PDT 2023 **[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/4587871270)** <hidden value="integration-test-status-comment"></hidden> *** ### ❌&nbsp; [build against SDK] Integration test FAILED Requested by @firebase-workflow-trigger[bot] on commit 73ce6feb70d3e830676aafa1d0ded64a57f07fb8 Last updated: Sat Apr 1 07:03 PDT 2023 **[View integration test log & download artifacts](https://github.com/firebase/firebase-cpp-sdk/actions/runs/4583102181)** | Failures | Configs | |----------|---------| | missing_log | [BUILD] [ERROR] [MacOS] [1/2 ssl_lib: arm64] [boringssl]<br/>[TEST] [ERROR] [MacOS] [1/2 ssl_lib: arm64] [boringssl]<br/> | Add flaky tests to **[go/fpl-cpp-flake-tracker](http://go/fpl-cpp-flake-tracker)** <hidden value="integration-test-status-comment"></hidden>
process
nightly integration testing report note this report excludes firestore please also check ✅ nbsp integration test succeeded requested by dellabitta on commit last updated sun apr pdt ❌ nbsp integration test failed requested by firebase workflow trigger on commit last updated sat apr pdt failures configs missing log add flaky tests to
1
61,490
17,023,706,440
IssuesEvent
2021-07-03 03:24:32
tomhughes/trac-tickets
https://api.github.com/repos/tomhughes/trac-tickets
closed
source=OS_OpenData_* tags should not contain spaces when using "Assign Source" shortcut
Component: potlatch2 Priority: minor Resolution: fixed Type: defect
**[Submitted to the original trac issue database at 1.15pm, Wednesday, 27th April 2011]** The B shortcut sets the source or source:name tag based on the current background layer. In Potlatch 2.0-14-gc925264 this maps as: "OS OpenData StreetView" background --> source=OS OpenData StreetView "OS OpenData Locator" background --> source:name=OS OpenData Locator However the wiki at http://wiki.openstreetmap.org/wiki/Ordnance_Survey_Opendata#Attributing_OS suggests these values should contain underscores instead of spaces. Taginfo shows the community is largely following the wiki: 166,945 use source=OS_OpenData_StreetView versus 79,636 with OS OpenData StreetView 21,934 use source:name=OS_OpenData_Locator versus 3,327 with OS OpenData Locator
1.0
source=OS_OpenData_* tags should not contain spaces when using "Assign Source" shortcut - **[Submitted to the original trac issue database at 1.15pm, Wednesday, 27th April 2011]** The B shortcut sets the source or source:name tag based on the current background layer. In Potlatch 2.0-14-gc925264 this maps as: "OS OpenData StreetView" background --> source=OS OpenData StreetView "OS OpenData Locator" background --> source:name=OS OpenData Locator However the wiki at http://wiki.openstreetmap.org/wiki/Ordnance_Survey_Opendata#Attributing_OS suggests these values should contain underscores instead of spaces. Taginfo shows the community is largely following the wiki: 166,945 use source=OS_OpenData_StreetView versus 79,636 with OS OpenData StreetView 21,934 use source:name=OS_OpenData_Locator versus 3,327 with OS OpenData Locator
non_process
source os opendata tags should not contain spaces when using assign source shortcut the b shortcut sets the source or source name tag based on the current background layer in potlatch this maps as os opendata streetview background source os opendata streetview os opendata locator background source name os opendata locator however the wiki at suggests these values should contain underscores instead of spaces taginfo shows the community is largely following the wiki use source os opendata streetview versus with os opendata streetview use source name os opendata locator versus with os opendata locator
0
11,819
14,633,535,799
IssuesEvent
2020-12-24 02:15:43
TheUltimateC0der/Listrr
https://api.github.com/repos/TheUltimateC0der/Listrr
closed
Edit Name Created Lists
feature-request processing:api-side
I would like it if the name-based generated lists could include the ability to edit. My use case is generating lists of all my media and would like to be able to grow the list as my media does without recreating all the items in the list each time.
1.0
Edit Name Created Lists - I would like it if the name-based generated lists could include the ability to edit. My use case is generating lists of all my media and would like to be able to grow the list as my media does without recreating all the items in the list each time.
process
edit name created lists i would like it if the name based generated lists could include the ability to edit my use case is generating lists of all my media and would like to be able to grow the list as my media does without recreating all the items in the list each time
1
18,405
24,543,585,255
IssuesEvent
2022-10-12 06:57:46
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
Algorithm can't be executed if `.` in filename
Processing Bug
### What is the bug or the crash? ![image](https://user-images.githubusercontent.com/39594821/191698921-3fed35fa-40ee-40bb-8c26-e0ce48d70f88.png) It fails to determine the extension ### Steps to reproduce the issue Try to output to a file with a `.` in filename ### Versions 3.22.11 ### Supported QGIS version - [X] I'm running a supported QGIS version according to the roadmap. ### New profile - [X] I tried with a new QGIS profile ### Additional context _No response_
1.0
Algorithm can't be executed if `.` in filename - ### What is the bug or the crash? ![image](https://user-images.githubusercontent.com/39594821/191698921-3fed35fa-40ee-40bb-8c26-e0ce48d70f88.png) It fails to determine the extension ### Steps to reproduce the issue Try to output to a file with a `.` in filename ### Versions 3.22.11 ### Supported QGIS version - [X] I'm running a supported QGIS version according to the roadmap. ### New profile - [X] I tried with a new QGIS profile ### Additional context _No response_
process
algorithm can t be executed if in filename what is the bug or the crash it fails to determine the extension steps to reproduce the issue try to output to a file with a in filename versions supported qgis version i m running a supported qgis version according to the roadmap new profile i tried with a new qgis profile additional context no response
1
11,007
13,794,633,651
IssuesEvent
2020-10-09 16:38:35
googleapis/releasetool
https://api.github.com/repos/googleapis/releasetool
closed
Automatic publish is broken
type: process
Autorelease currently chooses the wrong path for the release job. ``` Release is at https://github.com/googleapis/docuploader/releases/tag/v0.2.0 Kokoro build URL: https://fusion.corp.google.com/projectanalysis/current/KOKORO/prod%3Acloud-devrel%2Fclient-libraries%2FNone%2Frelease ```
1.0
Automatic publish is broken - Autorelease currently chooses the wrong path for the release job. ``` Release is at https://github.com/googleapis/docuploader/releases/tag/v0.2.0 Kokoro build URL: https://fusion.corp.google.com/projectanalysis/current/KOKORO/prod%3Acloud-devrel%2Fclient-libraries%2FNone%2Frelease ```
process
automatic publish is broken autorelease currently chooses the wrong path for the release job release is at kokoro build url
1
53,814
13,883,692,593
IssuesEvent
2020-10-18 13:08:39
ioana-nicolae/first
https://api.github.com/repos/ioana-nicolae/first
opened
CVE-2020-15250 (Medium) detected in junit-4.11.jar
security vulnerability
## CVE-2020-15250 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>junit-4.11.jar</b></p></summary> <p>JUnit is a regression testing framework written by Erich Gamma and Kent Beck. It is used by the developer who implements unit tests in Java.</p> <p>Library home page: <a href="http://junit.org">http://junit.org</a></p> <p>Path to vulnerable library: first/junit-4.11.jar</p> <p> Dependency Hierarchy: - :x: **junit-4.11.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/ioana-nicolae/first/commit/1c9d7be4fe6f4e84a87886fbd1a88c91a41efd0e">1c9d7be4fe6f4e84a87886fbd1a88c91a41efd0e</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In JUnit4 before version 4.13.1, the test rule TemporaryFolder contains a local information disclosure vulnerability. On Unix like systems, the system's temporary directory is shared between all users on that system. Because of this, when files and directories are written into this directory they are, by default, readable by other users on that same system. This vulnerability does not allow other users to overwrite the contents of these directories or files. This is purely an information disclosure vulnerability. This vulnerability impacts you if the JUnit tests write sensitive information, like API keys or passwords, into the temporary folder, and the JUnit tests execute in an environment where the OS has other untrusted users. Because certain JDK file system APIs were only added in JDK 1.7, this this fix is dependent upon the version of the JDK you are using. For Java 1.7 and higher users: this vulnerability is fixed in 4.13.1. 
For Java 1.6 and lower users: no patch is available, you must use the workaround below. If you are unable to patch, or are stuck running on Java 1.6, specifying the `java.io.tmpdir` system environment variable to a directory that is exclusively owned by the executing user will fix this vulnerability. For more information, including an example of vulnerable code, see the referenced GitHub Security Advisory. <p>Publish Date: 2020-07-21 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-15250>CVE-2020-15250</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.4</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: High - Privileges Required: Low - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/junit-team/junit4/security/advisories/GHSA-269g-pwp5-87pp">https://github.com/junit-team/junit4/security/advisories/GHSA-269g-pwp5-87pp</a></p> <p>Release Date: 2020-07-21</p> <p>Fix Resolution: junit:junit:r4.13.1</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END --> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"junit","packageName":"junit","packageVersion":"4.11","isTransitiveDependency":false,"dependencyTree":"junit:junit:4.11","isMinimumFixVersionAvailable":true,"minimumFixVersion":"junit:junit:r4.13.1"}],"vulnerabilityIdentifier":"CVE-2020-15250","vulnerabilityDetails":"In JUnit4 before version 4.13.1, the test rule TemporaryFolder contains a local information disclosure vulnerability. On Unix like systems, the system\u0027s temporary directory is shared between all users on that system. Because of this, when files and directories are written into this directory they are, by default, readable by other users on that same system. This vulnerability does not allow other users to overwrite the contents of these directories or files. This is purely an information disclosure vulnerability. This vulnerability impacts you if the JUnit tests write sensitive information, like API keys or passwords, into the temporary folder, and the JUnit tests execute in an environment where the OS has other untrusted users. Because certain JDK file system APIs were only added in JDK 1.7, this this fix is dependent upon the version of the JDK you are using. For Java 1.7 and higher users: this vulnerability is fixed in 4.13.1. 
For Java 1.6 and lower users: no patch is available, you must use the workaround below. If you are unable to patch, or are stuck running on Java 1.6, specifying the `java.io.tmpdir` system environment variable to a directory that is exclusively owned by the executing user will fix this vulnerability. For more information, including an example of vulnerable code, see the referenced GitHub Security Advisory.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-15250","cvss3Severity":"medium","cvss3Score":"4.4","cvss3Metrics":{"A":"None","AC":"High","PR":"Low","S":"Unchanged","C":"High","UI":"Required","AV":"Local","I":"None"},"extraData":{}}</REMEDIATE> -->
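The `java.io.tmpdir` workaround described in the advisory above can be sketched as a short shell sequence. The directory path is illustrative; any directory owned exclusively by the executing user works.

```shell
# CVE-2020-15250 workaround sketch for setups that cannot upgrade to JUnit 4.13.1:
# give the JVM a temp directory that only the current user can access.
mkdir -p "$HOME/private-tmp"
chmod 700 "$HOME/private-tmp"

# Point any JVM invocation at it via java.io.tmpdir (shown here with -version
# as a stand-in for the real test run, and guarded in case no JVM is installed):
if command -v java >/dev/null 2>&1; then
    java -Djava.io.tmpdir="$HOME/private-tmp" -version
fi
```

With this in place, files written by `TemporaryFolder` land under a directory other users cannot read.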
True
CVE-2020-15250 (Medium) detected in junit-4.11.jar - ## CVE-2020-15250 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>junit-4.11.jar</b></p></summary> <p>JUnit is a regression testing framework written by Erich Gamma and Kent Beck. It is used by the developer who implements unit tests in Java.</p> <p>Library home page: <a href="http://junit.org">http://junit.org</a></p> <p>Path to vulnerable library: first/junit-4.11.jar</p> <p> Dependency Hierarchy: - :x: **junit-4.11.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/ioana-nicolae/first/commit/1c9d7be4fe6f4e84a87886fbd1a88c91a41efd0e">1c9d7be4fe6f4e84a87886fbd1a88c91a41efd0e</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In JUnit4 before version 4.13.1, the test rule TemporaryFolder contains a local information disclosure vulnerability. On Unix like systems, the system's temporary directory is shared between all users on that system. Because of this, when files and directories are written into this directory they are, by default, readable by other users on that same system. This vulnerability does not allow other users to overwrite the contents of these directories or files. This is purely an information disclosure vulnerability. This vulnerability impacts you if the JUnit tests write sensitive information, like API keys or passwords, into the temporary folder, and the JUnit tests execute in an environment where the OS has other untrusted users. Because certain JDK file system APIs were only added in JDK 1.7, this this fix is dependent upon the version of the JDK you are using. For Java 1.7 and higher users: this vulnerability is fixed in 4.13.1. 
For Java 1.6 and lower users: no patch is available, you must use the workaround below. If you are unable to patch, or are stuck running on Java 1.6, specifying the `java.io.tmpdir` system environment variable to a directory that is exclusively owned by the executing user will fix this vulnerability. For more information, including an example of vulnerable code, see the referenced GitHub Security Advisory. <p>Publish Date: 2020-07-21 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-15250>CVE-2020-15250</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.4</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: High - Privileges Required: Low - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/junit-team/junit4/security/advisories/GHSA-269g-pwp5-87pp">https://github.com/junit-team/junit4/security/advisories/GHSA-269g-pwp5-87pp</a></p> <p>Release Date: 2020-07-21</p> <p>Fix Resolution: junit:junit:r4.13.1</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END --> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"junit","packageName":"junit","packageVersion":"4.11","isTransitiveDependency":false,"dependencyTree":"junit:junit:4.11","isMinimumFixVersionAvailable":true,"minimumFixVersion":"junit:junit:r4.13.1"}],"vulnerabilityIdentifier":"CVE-2020-15250","vulnerabilityDetails":"In JUnit4 before version 4.13.1, the test rule TemporaryFolder contains a local information disclosure vulnerability. On Unix like systems, the system\u0027s temporary directory is shared between all users on that system. Because of this, when files and directories are written into this directory they are, by default, readable by other users on that same system. This vulnerability does not allow other users to overwrite the contents of these directories or files. This is purely an information disclosure vulnerability. This vulnerability impacts you if the JUnit tests write sensitive information, like API keys or passwords, into the temporary folder, and the JUnit tests execute in an environment where the OS has other untrusted users. Because certain JDK file system APIs were only added in JDK 1.7, this this fix is dependent upon the version of the JDK you are using. For Java 1.7 and higher users: this vulnerability is fixed in 4.13.1. 
For Java 1.6 and lower users: no patch is available, you must use the workaround below. If you are unable to patch, or are stuck running on Java 1.6, specifying the `java.io.tmpdir` system environment variable to a directory that is exclusively owned by the executing user will fix this vulnerability. For more information, including an example of vulnerable code, see the referenced GitHub Security Advisory.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-15250","cvss3Severity":"medium","cvss3Score":"4.4","cvss3Metrics":{"A":"None","AC":"High","PR":"Low","S":"Unchanged","C":"High","UI":"Required","AV":"Local","I":"None"},"extraData":{}}</REMEDIATE> -->
non_process
cve medium detected in junit jar cve medium severity vulnerability vulnerable library junit jar junit is a regression testing framework written by erich gamma and kent beck it is used by the developer who implements unit tests in java library home page a href path to vulnerable library first junit jar dependency hierarchy x junit jar vulnerable library found in head commit a href found in base branch master vulnerability details in before version the test rule temporaryfolder contains a local information disclosure vulnerability on unix like systems the system s temporary directory is shared between all users on that system because of this when files and directories are written into this directory they are by default readable by other users on that same system this vulnerability does not allow other users to overwrite the contents of these directories or files this is purely an information disclosure vulnerability this vulnerability impacts you if the junit tests write sensitive information like api keys or passwords into the temporary folder and the junit tests execute in an environment where the os has other untrusted users because certain jdk file system apis were only added in jdk this this fix is dependent upon the version of the jdk you are using for java and higher users this vulnerability is fixed in for java and lower users no patch is available you must use the workaround below if you are unable to patch or are stuck running on java specifying the java io tmpdir system environment variable to a directory that is exclusively owned by the executing user will fix this vulnerability for more information including an example of vulnerable code see the referenced github security advisory publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity high privileges required low user interaction required scope unchanged impact metrics confidentiality impact high integrity impact none availability impact 
none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution junit junit check this box to open an automated fix pr isopenpronvulnerability false ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails in before version the test rule temporaryfolder contains a local information disclosure vulnerability on unix like systems the system temporary directory is shared between all users on that system because of this when files and directories are written into this directory they are by default readable by other users on that same system this vulnerability does not allow other users to overwrite the contents of these directories or files this is purely an information disclosure vulnerability this vulnerability impacts you if the junit tests write sensitive information like api keys or passwords into the temporary folder and the junit tests execute in an environment where the os has other untrusted users because certain jdk file system apis were only added in jdk this this fix is dependent upon the version of the jdk you are using for java and higher users this vulnerability is fixed in for java and lower users no patch is available you must use the workaround below if you are unable to patch or are stuck running on java specifying the java io tmpdir system environment variable to a directory that is exclusively owned by the executing user will fix this vulnerability for more information including an example of vulnerable code see the referenced github security advisory vulnerabilityurl
0
2,186
5,036,904,484
IssuesEvent
2016-12-17 10:27:14
paulkornikov/Pragonas
https://api.github.com/repos/paulkornikov/Pragonas
closed
Improve process trace
a-enhancement processus workload III
Put the detail either directly in the link or at the bottom of the list. Check the performance.
1.0
Improve process trace - Put the detail either directly in the link or at the bottom of the list. Check the performance.
process
improve process trace put the detail either directly in the link or at the bottom of the list check the performance
1
14,295
17,269,844,654
IssuesEvent
2021-07-22 18:14:17
googleapis/python-bigtable
https://api.github.com/repos/googleapis/python-bigtable
opened
metricscaler sample: 'test_scale_bigtable' fails in setup with NotFound
type: process
From this [Kokoro build](https://source.cloud.google.com/results/invocations/89e0bb49-0df6-47d9-a9b4-bfd8790f6e8a/targets/github%2Fpython-bigtable%2Fsamples%2Fmetricscaler/tests): ``` def test_scale_bigtable(instance): bigtable_client = bigtable.Client(admin=True) instance = bigtable_client.instance(BIGTABLE_INSTANCE) > instance.reload() metricscaler_test.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../google/cloud/bigtable/instance.py:364: in reload instance_pb = self._client.instance_admin_client.get_instance( ../../google/cloud/bigtable_admin_v2/services/bigtable_instance_admin/client.py:605: in get_instance response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) .nox/py-3-8/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py:145: in __call__ return wrapped_func(*args, **kwargs) .nox/py-3-8/lib/python3.8/site-packages/google/api_core/retry.py:285: in retry_wrapped_func return retry_target( .nox/py-3-8/lib/python3.8/site-packages/google/api_core/retry.py:188: in retry_target return target() .nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:69: in error_remapped_callable six.raise_from(exceptions.from_grpc_error(exc), exc) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ value = None from_value = <_InactiveRpcError of RPC that terminated with: status = StatusCode.NOT_FOUND details = "Instance projects/python-do...ge":"Instance projects/python-docs-samples-tests/instances/metric-scale-test-a60739a9-3 not found.","grpc_status":5}" > > ??? E google.api_core.exceptions.NotFound: 404 Instance projects/python-docs-samples-tests/instances/metric-scale-test-a60739a9-3 not found. <string>:3: NotFound ``` Likely just needs an EC check.
1.0
metricscaler sample: 'test_scale_bigtable' fails in setup with NotFound - From this [Kokoro build](https://source.cloud.google.com/results/invocations/89e0bb49-0df6-47d9-a9b4-bfd8790f6e8a/targets/github%2Fpython-bigtable%2Fsamples%2Fmetricscaler/tests): ``` def test_scale_bigtable(instance): bigtable_client = bigtable.Client(admin=True) instance = bigtable_client.instance(BIGTABLE_INSTANCE) > instance.reload() metricscaler_test.py:109: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../../google/cloud/bigtable/instance.py:364: in reload instance_pb = self._client.instance_admin_client.get_instance( ../../google/cloud/bigtable_admin_v2/services/bigtable_instance_admin/client.py:605: in get_instance response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,) .nox/py-3-8/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py:145: in __call__ return wrapped_func(*args, **kwargs) .nox/py-3-8/lib/python3.8/site-packages/google/api_core/retry.py:285: in retry_wrapped_func return retry_target( .nox/py-3-8/lib/python3.8/site-packages/google/api_core/retry.py:188: in retry_target return target() .nox/py-3-8/lib/python3.8/site-packages/google/api_core/grpc_helpers.py:69: in error_remapped_callable six.raise_from(exceptions.from_grpc_error(exc), exc) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ value = None from_value = <_InactiveRpcError of RPC that terminated with: status = StatusCode.NOT_FOUND details = "Instance projects/python-do...ge":"Instance projects/python-docs-samples-tests/instances/metric-scale-test-a60739a9-3 not found.","grpc_status":5}" > > ??? E google.api_core.exceptions.NotFound: 404 Instance projects/python-docs-samples-tests/instances/metric-scale-test-a60739a9-3 not found. <string>:3: NotFound ``` Likely just needs an EC check.
process
metricscaler sample test scale bigtable fails in setup with notfound from this def test scale bigtable instance bigtable client bigtable client admin true instance bigtable client instance bigtable instance instance reload metricscaler test py google cloud bigtable instance py in reload instance pb self client instance admin client get instance google cloud bigtable admin services bigtable instance admin client py in get instance response rpc request retry retry timeout timeout metadata metadata nox py lib site packages google api core gapic method py in call return wrapped func args kwargs nox py lib site packages google api core retry py in retry wrapped func return retry target nox py lib site packages google api core retry py in retry target return target nox py lib site packages google api core grpc helpers py in error remapped callable six raise from exceptions from grpc error exc exc value none from value inactiverpcerror of rpc that terminated with status statuscode not found details instance projects python do ge instance projects python docs samples tests instances metric scale test not found grpc status e google api core exceptions notfound instance projects python docs samples tests instances metric scale test not found notfound likely just needs an ec check
1
15,932
20,158,524,886
IssuesEvent
2022-02-09 18:51:15
pytorch/pytorch
https://api.github.com/repos/pytorch/pytorch
closed
In windows, DataLoader with num_workers > 0 is extremely slow (50 times slower)
module: windows module: multiprocessing module: dataloader triaged has workaround
## 🐛 Bug In windows, DataLoader with num_workers > 0 is extremely slow (pytorch=0.41) ## To Reproduce Step 1: create two loader, one with num_workers and one without. import torch.utils.data as Data train_loader = Data.DataLoader(dataset=train_dataset, batch_size=batch_size, shuffle=True) train_loader2 = Data.DataLoader(dataset=train_dataset, batch_size=batch_size, shuffle=True, num_workers=1) Step 2: time it %%time for _ in range(200): for x in train_loader: pass %%time for _ in range(200): for x in train_loader2: pass The first one took only 2.5 s The second one took 1min 47s --------------------------------------------------------------------------- Collecting environment information... PyTorch version: 0.4.1 Is debug build: No CUDA used to build PyTorch: None OS: Microsoft Windows 10 Enterprise GCC version: Could not collect CMake version: version 3.11.0-rc3 Python version: 3.6 Is CUDA available: No CUDA runtime version: No CUDA GPU models and configuration: No CUDA Nvidia driver version: No CUDA cuDNN version: No CUDA cc @peterjc123 @mszhanyi @skyline75489 @nbcsm @VitalyFedyunin @SsnL @ejguan @NivekT
1.0
In windows, DataLoader with num_workers > 0 is extremely slow (50 times slower) - ## 🐛 Bug In windows, DataLoader with num_workers > 0 is extremely slow (pytorch=0.41) ## To Reproduce Step 1: create two loader, one with num_workers and one without. import torch.utils.data as Data train_loader = Data.DataLoader(dataset=train_dataset, batch_size=batch_size, shuffle=True) train_loader2 = Data.DataLoader(dataset=train_dataset, batch_size=batch_size, shuffle=True, num_workers=1) Step 2: time it %%time for _ in range(200): for x in train_loader: pass %%time for _ in range(200): for x in train_loader2: pass The first one took only 2.5 s The second one took 1min 47s --------------------------------------------------------------------------- Collecting environment information... PyTorch version: 0.4.1 Is debug build: No CUDA used to build PyTorch: None OS: Microsoft Windows 10 Enterprise GCC version: Could not collect CMake version: version 3.11.0-rc3 Python version: 3.6 Is CUDA available: No CUDA runtime version: No CUDA GPU models and configuration: No CUDA Nvidia driver version: No CUDA cuDNN version: No CUDA cc @peterjc123 @mszhanyi @skyline75489 @nbcsm @VitalyFedyunin @SsnL @ejguan @NivekT
process
in windows dataloader with num workers is extremely slow times slower 🐛 bug in windows dataloader with num workers is extremely slow pytorch to reproduce step create two loader one with num workers and one without import torch utils data as data train loader data dataloader dataset train dataset batch size batch size shuffle true train data dataloader dataset train dataset batch size batch size shuffle true num workers step time it time for in range for x in train loader pass time for in range for x in train pass the first one took only s the second one took collecting environment information pytorch version is debug build no cuda used to build pytorch none os microsoft windows enterprise gcc version could not collect cmake version version python version is cuda available no cuda runtime version no cuda gpu models and configuration no cuda nvidia driver version no cuda cudnn version no cuda cc mszhanyi nbcsm vitalyfedyunin ssnl ejguan nivekt
1
5,593
8,445,189,747
IssuesEvent
2018-10-18 20:43:09
bazelbuild/bazel
https://api.github.com/repos/bazelbuild/bazel
opened
Deprecate extra values for `srcs_version`.
P2 team-Rules-Python type: process
Current allowed `srcs_version` values are: - `PY2ONLY`: errors out if built in PY3 mode - `PY2`: originally intended to mean "do 2to3 conversion", but it errors out (badly) if built in PY3 mode - `PY2AND3`: works for both PY2 and PY3 (no 2to3 conversion) - `PY3ONLY`: errors out if built in PY2 mode - `PY3`: same as `PY3ONLY` We should reduce this to something like just: - `PY2` (behaving like `PY2ONLY` does today) - `PY2AND3` - `PY3`
1.0
Deprecate extra values for `srcs_version`. - Current allowed `srcs_version` values are: - `PY2ONLY`: errors out if built in PY3 mode - `PY2`: originally intended to mean "do 2to3 conversion", but it errors out (badly) if built in PY3 mode - `PY2AND3`: works for both PY2 and PY3 (no 2to3 conversion) - `PY3ONLY`: errors out if built in PY2 mode - `PY3`: same as `PY3ONLY` We should reduce this to something like just: - `PY2` (behaving like `PY2ONLY` does today) - `PY2AND3` - `PY3`
process
deprecate extra values for srcs version current allowed srcs version values are errors out if built in mode originally intended to mean do conversion but it errors out badly if built in mode works for both and no conversion errors out if built in mode same as we should reduce this to something like just behaving like does today
1
4,607
7,452,876,862
IssuesEvent
2018-03-29 09:52:11
nodejs/node
https://api.github.com/repos/nodejs/node
closed
child_process: confusing error message from functions with empty string argument
child_process errors
* **Version**: master (10.0.0) * **Platform**: Windows 7 x64 * **Subsystem**: child_process All these function calls produce a bit confusing error message: ```js child_process.execFile(''); child_process.exec(''); child_process.spawn(''); child_process.execFileSync(''); child_process.execSync(''); child_process.spawnSync(''); ``` ```console TypeError [ERR_INVALID_ARG_TYPE]: The "file" argument must be of type string. Received type string at normalizeSpawnArguments (child_process.js:389:11) ... ``` Compare with previous versions: v4–v6: `TypeError: Bad argument` v8–v9: `TypeError: "file" argument must be a non-empty string` See https://github.com/nodejs/node/blob/74ff743289dc750a9171e5e6a6aaaef73cb3b912/lib/child_process.js#L388-L389
1.0
child_process: confusing error message from functions with empty string argument - * **Version**: master (10.0.0) * **Platform**: Windows 7 x64 * **Subsystem**: child_process All these function calls produce a bit confusing error message: ```js child_process.execFile(''); child_process.exec(''); child_process.spawn(''); child_process.execFileSync(''); child_process.execSync(''); child_process.spawnSync(''); ``` ```console TypeError [ERR_INVALID_ARG_TYPE]: The "file" argument must be of type string. Received type string at normalizeSpawnArguments (child_process.js:389:11) ... ``` Compare with previous versions: v4–v6: `TypeError: Bad argument` v8–v9: `TypeError: "file" argument must be a non-empty string` See https://github.com/nodejs/node/blob/74ff743289dc750a9171e5e6a6aaaef73cb3b912/lib/child_process.js#L388-L389
process
child process confusing error message from functions with empty string argument version master platform windows subsystem child process all these function calls produce a bit confusing error message js child process execfile child process exec child process spawn child process execfilesync child process execsync child process spawnsync console typeerror the file argument must be of type string received type string at normalizespawnarguments child process js compare with previous versions – typeerror bad argument – typeerror file argument must be a non empty string see
1
18,087
24,110,840,424
IssuesEvent
2022-09-20 11:12:41
dotnet/runtime
https://api.github.com/repos/dotnet/runtime
closed
.NET 6 reading standard output blocks on Linux
area-System.Diagnostics.Process
### Description I have a .NET 6 process **prog1** which starts a .NET 6 process **prog2**, redirects its standard output and waits for all output. **prog2** starts a .NET 6 process **prog3** without redirecting its output and shuts down. **prog3** runs forever. That works just fine on Windows. First **prog2** stops, then **prog1** stops, while **prog3** keeps running. On Linux however, **prog1** hangs forever while waiting for all output. It only stops when **prog3** stops. ### Reproduction Steps ```csharp // prog1: using System.Diagnostics; var process = new Process { StartInfo = { FileName = "prog2", // Or the full path, depending if prog2 is in the PATH or not RedirectStandardOutput = true, UseShellExecute = false } }; process.Start(); process.WaitForExit(); Console.WriteLine(process.StandardOutput.ReadToEnd()); // prog2: using System.Diagnostics; using var process = new Process { StartInfo = { FileName = "prog3", // Or the full path, depending if prog3 is in the PATH or not UseShellExecute = true } }; process.Start(); Console.WriteLine("prog3 started successfully."); // prog3: Thread.Sleep(Timeout.Infinite); ``` ### Expected behavior Since **prog2** does not redirect the output of **prog3**, **prog1** is supposed to stop after **prog2** stops. ### Actual behavior **prog1** stops on Windows but not on Linux. ### Regression? I observed the same issue with .net core 3.1. ### Known Workarounds _No response_ ### Configuration _No response_ ### Other information _No response_
1.0
.NET 6 reading standard output blocks on Linux - ### Description I have a .NET 6 process **prog1** which starts a .NET 6 process **prog2**, redirects its standard output and waits for all output. **prog2** starts a .NET 6 process **prog3** without redirecting its output and shuts down. **prog3** runs forever. That works just fine on Windows. First **prog2** stops, then **prog1** stops, while **prog3** keeps running. On Linux however, **prog1** hangs forever while waiting for all output. It only stops when **prog3** stops. ### Reproduction Steps ```csharp // prog1: using System.Diagnostics; var process = new Process { StartInfo = { FileName = "prog2", // Or the full path, depending if prog2 is in the PATH or not RedirectStandardOutput = true, UseShellExecute = false } }; process.Start(); process.WaitForExit(); Console.WriteLine(process.StandardOutput.ReadToEnd()); // prog2: using System.Diagnostics; using var process = new Process { StartInfo = { FileName = "prog3", // Or the full path, depending if prog3 is in the PATH or not UseShellExecute = true } }; process.Start(); Console.WriteLine("prog3 started successfully."); // prog3: Thread.Sleep(Timeout.Infinite); ``` ### Expected behavior Since **prog2** does not redirect the output of **prog3**, **prog1** is supposed to stop after **prog2** stops. ### Actual behavior **prog1** stops on Windows but not on Linux. ### Regression? I observed the same issue with .net core 3.1. ### Known Workarounds _No response_ ### Configuration _No response_ ### Other information _No response_
process
net reading standard output blocks on linux description i have a net process which starts a net process redirects its standard output and waits for all output starts a net process without redirecting its output and shuts down runs forever that works just fine on windows first stops then stops while keeps running on linux however hangs forever while waiting for all output it only stops when stops reproduction steps csharp using system diagnostics var process new process startinfo filename or the full path depending if is in the path or not redirectstandardoutput true useshellexecute false process start process waitforexit console writeline process standardoutput readtoend using system diagnostics using var process new process startinfo filename or the full path depending if is in the path or not useshellexecute true process start console writeline started successfully thread sleep timeout infinite expected behavior since does not redirect the output of is supposed to stop after stops actual behavior stops on windows but not on linux regression i observed the same issue with net core known workarounds no response configuration no response other information no response
1
453,287
13,067,591,371
IssuesEvent
2020-07-31 00:57:18
shatadru/simpletools
https://api.github.com/repos/shatadru/simpletools
opened
[otpgen.sh] RHEL7.x does not support -pbkdf2 or -iter in openssl
Priority: High bug
- Required for older openssl version - To fix check if these options are supported in installed openssl version else do not use them
1.0
[otpgen.sh] RHEL7.x does not support -pbkdf2 or -iter in openssl - - Required for older openssl version - To fix check if these options are supported in installed openssl version else do not use them
non_process
x does not support or iter in openssl required for older openssl version to fix check if these options are supported in installed openssl version else do not use them
0
299,964
9,206,015,527
IssuesEvent
2019-03-08 12:26:14
qissue-bot/QGIS
https://api.github.com/repos/qissue-bot/QGIS
closed
"Actions" in qgis do not work on MS windows
Category: Symbology Component: Affected QGIS version Component: Crashes QGIS or corrupts data Component: Easy fix? Component: Operating System Component: Pull Request or Patch supplied Component: Regression? Component: Resolution Priority: Low Project: QGIS Application Status: Closed Tracker: Bug report
--- Author Name: **rap -** (rap -) Original Redmine Issue: 1315, https://issues.qgis.org/issues/1315 Original Assignee: Gavin Macaulay - --- *qGIS (0.11/windows) crashes if you want to use actions (e. g. to link images or websites)* This problem only occurs on the windows version, on ubuntu actions seems to work properly. Original Task: link photos to point shape files using "actions". An action is defined: calling "firefox" as action The Identify Feature is used to open the identify window A click on the action is used to define the action... result: qgis crashes Regards, Gerhard --- - [qgiscrash_actions.pdf](https://issues.qgis.org/attachments/download/2148/qgiscrash_actions.pdf) (rap -)
1.0
"Actions" in qgis do not work on MS windows - --- Author Name: **rap -** (rap -) Original Redmine Issue: 1315, https://issues.qgis.org/issues/1315 Original Assignee: Gavin Macaulay - --- *qGIS (0.11/windows) crashes if you want to use actions (e. g. to link images or websites)* This problem only occurs on the windows version, on ubuntu actions seems to work properly. Original Task: link photos to point shape files using "actions". An action is defined: calling "firefox" as action The Identify Feature is used to open the identify window A click on the action is used to define the action... result: qgis crashes Regards, Gerhard --- - [qgiscrash_actions.pdf](https://issues.qgis.org/attachments/download/2148/qgiscrash_actions.pdf) (rap -)
non_process
actions in qgis do not work on ms windows author name rap rap original redmine issue original assignee gavin macaulay qgis windows crashes if you want to use actions e g to link images or websites this problem only occurs on the windows version on ubuntu actions seems to work properly original task link photos to point shape files using actions an action is defined calling firefox as action the identify feature is used to open the identify window a click on the action is used to define the action result qgis crashes regards gerhard rap
0
1,676
4,313,014,276
IssuesEvent
2016-07-22 08:45:12
CGAL/cgal
https://api.github.com/repos/CGAL/cgal
closed
VCM Features Estimation
CGAL 3D demo enhancement Pkg::Point_set_processing
It is applicable to point sets, but it is not clear what it does. The popup widget mentions "Normals" what it does not do at all. The parameters should be explained. Do the parameters have reasonable initial values. @sgiraudot Please fix that before Siggraph.
1.0
VCM Features Estimation - It is applicable to point sets, but it is not clear what it does. The popup widget mentions "Normals" what it does not do at all. The parameters should be explained. Do the parameters have reasonable initial values. @sgiraudot Please fix that before Siggraph.
process
vcm features estimation it is applicable to point sets but it is not clear what it does the popup widget mentions normals what it does not do at all the parameters should be explained do the parameters have reasonable initial values sgiraudot please fix that before siggraph
1
50,344
26,592,460,025
IssuesEvent
2023-01-23 09:50:31
getsentry/sentry-cocoa
https://api.github.com/repos/getsentry/sentry-cocoa
closed
Rename SentryTracedView transactionName to view name
Component: Performance Platform: Cocoa Type: Refactoring
### Description The parameter `transactionName` is only the transaction name if the view is the outermost view. If it's a nested view, the `transactionName` is a span description. Therefore the parameter doesn't correctly reflect what it does. Let's rename it to `viewName` or something similar. We should also update the code docs of `SentryTracedView` accordingly. https://github.com/getsentry/sentry-cocoa/blob/8.0.0/Sources/SentrySwiftUI/SentryTracedView.swift#L42 https://github.com/getsentry/sentry-cocoa/blob/8.0.0/Sources/SentrySwiftUI/SentryTracedView.swift#L74
True
Rename SentryTracedView transactionName to view name - ### Description The parameter `transactionName` is only the transaction name if the view is the outermost view. If it's a nested view, the `transactionName` is a span description. Therefore the parameter doesn't correctly reflect what it does. Let's rename it to `viewName` or something similar. We should also update the code docs of `SentryTracedView` accordingly. https://github.com/getsentry/sentry-cocoa/blob/8.0.0/Sources/SentrySwiftUI/SentryTracedView.swift#L42 https://github.com/getsentry/sentry-cocoa/blob/8.0.0/Sources/SentrySwiftUI/SentryTracedView.swift#L74
non_process
rename sentrytracedview transactionname to view name description the parameter transactionname is only the transaction name if the view is the outermost view if it s a nested view the transactionname is a span description therefore the parameter doesn t correctly reflect what it does let s rename it to viewname or something similar we should also update the code docs of sentrytracedview accordingly
0
186,434
14,394,694,465
IssuesEvent
2020-12-03 01:54:28
github-vet/rangeclosure-findings
https://api.github.com/repos/github-vet/rangeclosure-findings
closed
evergreen-ci/evergreen: vendor/github.com/mongodb/amboy/vendor/github.com/evergreen-ci/poplar/vendor/github.com/mongodb/mongo-go-driver/mongo/read_write_concern_spec_test.go; 3 LoC
fresh test tiny vendored
Found a possible issue in [evergreen-ci/evergreen](https://www.github.com/evergreen-ci/evergreen) at [vendor/github.com/mongodb/amboy/vendor/github.com/evergreen-ci/poplar/vendor/github.com/mongodb/mongo-go-driver/mongo/read_write_concern_spec_test.go](https://github.com/evergreen-ci/evergreen/blob/335f9ee3f78248268784741d34bd447b17227d8d/vendor/github.com/mongodb/amboy/vendor/github.com/evergreen-ci/poplar/vendor/github.com/mongodb/mongo-go-driver/mongo/read_write_concern_spec_test.go#L90-L92) The below snippet of Go code triggered static analysis which searches for goroutines and/or defer statements which capture loop variables. [Click here to see the code in its original context.](https://github.com/evergreen-ci/evergreen/blob/335f9ee3f78248268784741d34bd447b17227d8d/vendor/github.com/mongodb/amboy/vendor/github.com/evergreen-ci/poplar/vendor/github.com/mongodb/mongo-go-driver/mongo/read_write_concern_spec_test.go#L90-L92) <details> <summary>Click here to show the 3 line(s) of Go which triggered the analyzer.</summary> ```go for _, testCase := range container.Tests { runConnectionStringTest(t, fmt.Sprintf("%s/%s/%s", connStringTestsDir, filename, testCase.Description), &testCase) } ``` Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > function call which takes a reference to testCase at line 91 may start a goroutine </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 335f9ee3f78248268784741d34bd447b17227d8d
1.0
evergreen-ci/evergreen: vendor/github.com/mongodb/amboy/vendor/github.com/evergreen-ci/poplar/vendor/github.com/mongodb/mongo-go-driver/mongo/read_write_concern_spec_test.go; 3 LoC - Found a possible issue in [evergreen-ci/evergreen](https://www.github.com/evergreen-ci/evergreen) at [vendor/github.com/mongodb/amboy/vendor/github.com/evergreen-ci/poplar/vendor/github.com/mongodb/mongo-go-driver/mongo/read_write_concern_spec_test.go](https://github.com/evergreen-ci/evergreen/blob/335f9ee3f78248268784741d34bd447b17227d8d/vendor/github.com/mongodb/amboy/vendor/github.com/evergreen-ci/poplar/vendor/github.com/mongodb/mongo-go-driver/mongo/read_write_concern_spec_test.go#L90-L92) The below snippet of Go code triggered static analysis which searches for goroutines and/or defer statements which capture loop variables. [Click here to see the code in its original context.](https://github.com/evergreen-ci/evergreen/blob/335f9ee3f78248268784741d34bd447b17227d8d/vendor/github.com/mongodb/amboy/vendor/github.com/evergreen-ci/poplar/vendor/github.com/mongodb/mongo-go-driver/mongo/read_write_concern_spec_test.go#L90-L92) <details> <summary>Click here to show the 3 line(s) of Go which triggered the analyzer.</summary> ```go for _, testCase := range container.Tests { runConnectionStringTest(t, fmt.Sprintf("%s/%s/%s", connStringTestsDir, filename, testCase.Description), &testCase) } ``` Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. 
> function call which takes a reference to testCase at line 91 may start a goroutine </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 335f9ee3f78248268784741d34bd447b17227d8d
non_process
evergreen ci evergreen vendor github com mongodb amboy vendor github com evergreen ci poplar vendor github com mongodb mongo go driver mongo read write concern spec test go loc found a possible issue in at the below snippet of go code triggered static analysis which searches for goroutines and or defer statements which capture loop variables click here to show the line s of go which triggered the analyzer go for testcase range container tests runconnectionstringtest t fmt sprintf s s s connstringtestsdir filename testcase description testcase below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message function call which takes a reference to testcase at line may start a goroutine leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id
0
16,767
21,943,370,786
IssuesEvent
2022-05-23 20:41:07
OpenDataScotland/the_od_bods
https://api.github.com/repos/OpenDataScotland/the_od_bods
closed
Add in dataset owner aliasing
bug data processing back end
When scraping some orgs at the moment, the org name for the dataset owner does not match up with the org we have in JKAN e.g. the DCAT publisher field for Highland Council is "Highland Council GIS organisation". We should have an alias lookup table for these so we can convert where required during the scraping process.
1.0
Add in dataset owner aliasing - When scraping some orgs at the moment, the org name for the dataset owner does not match up with the org we have in JKAN e.g. the DCAT publisher field for Highland Council is "Highland Council GIS organisation". We should have an alias lookup table for these so we can convert where required during the scraping process.
process
add in dataset owner aliasing when scraping some orgs at the moment the org name for the dataset owner does not match up with the org we have in jkan e g the dcat publisher field for highland council is highland council gis organisation we should have an alias lookup table for these so we can convert where required during the scraping process
1
13,218
15,687,976,804
IssuesEvent
2021-03-25 14:13:51
ORNL-AMO/AMO-Tools-Desktop
https://api.github.com/repos/ORNL-AMO/AMO-Tools-Desktop
closed
New PH calcs small tweaks
Calculator Process Heating Quick Fix
Air heating using flue Exhaust gas temp leaving HX result has a $ in it Heat Cascading Heat transferred - recovered ... result has a $ in it Fuel Heating Value should have a default based on the fuel type. It is available from the fuel composition (drop down or "add new fuel" ![image.png](https://images.zenhubusercontent.com/5cd48a2af8cffa5a19122d27/acdc3abb-4ba6-438e-8ba1-b8ca5b388311)
1.0
New PH calcs small tweaks - Air heating using flue Exhaust gas temp leaving HX result has a $ in it Heat Cascading Heat transferred - recovered ... result has a $ in it Fuel Heating Value should have a default based on the fuel type. It is available from the fuel composition (drop down or "add new fuel" ![image.png](https://images.zenhubusercontent.com/5cd48a2af8cffa5a19122d27/acdc3abb-4ba6-438e-8ba1-b8ca5b388311)
process
new ph calcs small tweaks air heating using flue exhaust gas temp leaving hx result has a in it heat cascading heat transferred recovered result has a in it fuel heating value should have a default based on the fuel type it is available from the fuel composition drop down or add new fuel
1
11,983
14,737,114,609
IssuesEvent
2021-01-07 00:54:46
kdjstudios/SABillingGitlab
https://api.github.com/repos/kdjstudios/SABillingGitlab
closed
Getting an error when trying to get into an account
anc-ops anc-process anp-0.5 ant-bug ant-support
In GitLab by @kdjstudios on Apr 17, 2018, 13:37 **Submitted by:** "Amecia Snelling" <amecia.snelling@answernet.com> **Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2018-04-17-12951/conversation **Server:** Internal **Client/Site:** Chattanooga **Account:** ALL **Issue:** When trying to get into an account we are getting the following error. We’re sorry but something went wrong. This is happening on all accounts that we try to get into. Please let me know if you have any questions.
1.0
Getting an error when trying to get into an account - In GitLab by @kdjstudios on Apr 17, 2018, 13:37 **Submitted by:** "Amecia Snelling" <amecia.snelling@answernet.com> **Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2018-04-17-12951/conversation **Server:** Internal **Client/Site:** Chattanooga **Account:** ALL **Issue:** When trying to get into an account we are getting the following error. We’re sorry but something went wrong. This is happening on all accounts that we try to get into. Please let me know if you have any questions.
process
getting an error when trying to get into an account in gitlab by kdjstudios on apr submitted by amecia snelling helpdesk server internal client site chattanooga account all issue when trying to get into an account we are getting the following error we’re sorry but something went wrong this is happening on all accounts that we try to get into please let me know if you have any questions
1
97,990
8,673,896,931
IssuesEvent
2018-11-30 04:53:58
humera987/FXLabs-Test-Automation
https://api.github.com/repos/humera987/FXLabs-Test-Automation
closed
FXLabs Testing : ApiV1IssueTrackersSkillTypeTypeGetQueryParamPagesizeNegativeNumber
FXLabs Testing
Project : FXLabs Testing Job : UAT Env : UAT Region : US_WEST Result : fail Status Code : 404 Headers : {X-Content-Type-Options=[nosniff], X-XSS-Protection=[1; mode=block], Cache-Control=[no-cache, no-store, max-age=0, must-revalidate], Pragma=[no-cache], Expires=[0], X-Frame-Options=[DENY], Set-Cookie=[SESSION=Njg3ZmU0YTYtMGQyNy00NzhjLWFlYzctNTk1MTU3YWRlOTNk; Path=/; HttpOnly], Content-Type=[application/json;charset=UTF-8], Transfer-Encoding=[chunked], Date=[Fri, 30 Nov 2018 04:29:52 GMT]} Endpoint : http://13.56.210.25/api/v1/api/v1/issue-trackers/skill-type/NLWViVux?pageSize=-1 Request : Response : { "timestamp" : "2018-11-30T04:29:53.420+0000", "status" : 404, "error" : "Not Found", "message" : "No message available", "path" : "/api/v1/api/v1/issue-trackers/skill-type/NLWViVux" } Logs : Assertion [@StatusCode != 401] resolved-to [404 != 401] result [Passed]Assertion [@StatusCode != 404] resolved-to [404 != 404] result [Failed] --- FX Bot ---
1.0
FXLabs Testing : ApiV1IssueTrackersSkillTypeTypeGetQueryParamPagesizeNegativeNumber - Project : FXLabs Testing Job : UAT Env : UAT Region : US_WEST Result : fail Status Code : 404 Headers : {X-Content-Type-Options=[nosniff], X-XSS-Protection=[1; mode=block], Cache-Control=[no-cache, no-store, max-age=0, must-revalidate], Pragma=[no-cache], Expires=[0], X-Frame-Options=[DENY], Set-Cookie=[SESSION=Njg3ZmU0YTYtMGQyNy00NzhjLWFlYzctNTk1MTU3YWRlOTNk; Path=/; HttpOnly], Content-Type=[application/json;charset=UTF-8], Transfer-Encoding=[chunked], Date=[Fri, 30 Nov 2018 04:29:52 GMT]} Endpoint : http://13.56.210.25/api/v1/api/v1/issue-trackers/skill-type/NLWViVux?pageSize=-1 Request : Response : { "timestamp" : "2018-11-30T04:29:53.420+0000", "status" : 404, "error" : "Not Found", "message" : "No message available", "path" : "/api/v1/api/v1/issue-trackers/skill-type/NLWViVux" } Logs : Assertion [@StatusCode != 401] resolved-to [404 != 401] result [Passed]Assertion [@StatusCode != 404] resolved-to [404 != 404] result [Failed] --- FX Bot ---
non_process
fxlabs testing project fxlabs testing job uat env uat region us west result fail status code headers x content type options x xss protection cache control pragma expires x frame options set cookie content type transfer encoding date endpoint request response timestamp status error not found message no message available path api api issue trackers skill type nlwvivux logs assertion resolved to result assertion resolved to result fx bot
0
119,085
12,014,169,302
IssuesEvent
2020-04-10 10:41:29
saunders94/JarChess
https://api.github.com/repos/saunders94/JarChess
closed
Update Use Case Survey
documentation
Once changes to the Diagram are approved, the use case survey needs to be updated to be consistent with the diagram. - [ ] Remove the friend secondary actor summary. - [ ] Remove and update the use cases listed in the use case summary. - [ ] Replace the diagram with the new one
1.0
Update Use Case Survey - Once changes to the Diagram are approved, the use case survey needs to be updated to be consistent with the diagram. - [ ] Remove the friend secondary actor summary. - [ ] Remove and update the use cases listed in the use case summary. - [ ] Replace the diagram with the new one
non_process
update use case survey once changes to the diagram are approved the use case survey needs to be updated to be consistent with the diagram remove the friend secondary actor summary remove and update the use cases listed in the use case summary replace the diagram with the new one
0
41,963
5,412,055,345
IssuesEvent
2017-03-01 13:35:01
allure-framework/allure1
https://api.github.com/repos/allure-framework/allure1
closed
The problem is TestNG and finding steps error installed using soft asserts
TestNG
Hi. Got a question. Would it be possible to implement the support for TestNG into Allure reports in order to display the bugs fixed with the help of Soft-Asserts correctly? In my opinion right now it looks incorrectly because the errors are not binding to the steps in which they have occurred. ![2015-10-30 16-01-10](https://cloud.githubusercontent.com/assets/10957765/10845289/71808a68-7f28-11e5-97bd-0df2151e2f40.png)
1.0
The problem is TestNG and finding steps error installed using soft asserts - Hi. Got a question. Would it be possible to implement the support for TestNG into Allure reports in order to display the bugs fixed with the help of Soft-Asserts correctly? In my opinion right now it looks incorrectly because the errors are not binding to the steps in which they have occurred. ![2015-10-30 16-01-10](https://cloud.githubusercontent.com/assets/10957765/10845289/71808a68-7f28-11e5-97bd-0df2151e2f40.png)
non_process
the problem is testng and finding steps error installed using soft asserts hi got a question would it be possible to implement the support for testng into allure reports in order to display the bugs fixed with the help of soft asserts correctly in my opinion right now it looks incorrectly because the errors are not binding to the steps in which they have occurred
0
13,228
15,701,017,294
IssuesEvent
2021-03-26 10:36:57
e4exp/paper_manager_abstract
https://api.github.com/repos/e4exp/paper_manager_abstract
opened
BERT: A Review of Applications in Natural Language Processing and Understanding
2020 BERT Natural Language Processing Review _read_later
- https://arxiv.org/abs/2103.11943 - 2021 このレビューでは、最も人気のある深層学習ベースの言語モデルの1つであるBERTの適用について説明します。 このモデルの動作メカニズム、テキスト分析のタスクへの主な適用分野、各タスクにおける類似モデルとの比較、さらにいくつかの独自モデルについても説明しています。 このレビューを作成するにあたり、科学界で最も注目を集めた過去数年間に発表された数十本のオリジナル科学論文のデータを体系化しました。 この調査は、自然言語テキスト分析の分野における最新の進歩を知りたいと思っているすべての学生や研究者にとって有益なものとなるでしょう。
1.0
BERT: A Review of Applications in Natural Language Processing and Understanding - - https://arxiv.org/abs/2103.11943 - 2021 このレビューでは、最も人気のある深層学習ベースの言語モデルの1つであるBERTの適用について説明します。 このモデルの動作メカニズム、テキスト分析のタスクへの主な適用分野、各タスクにおける類似モデルとの比較、さらにいくつかの独自モデルについても説明しています。 このレビューを作成するにあたり、科学界で最も注目を集めた過去数年間に発表された数十本のオリジナル科学論文のデータを体系化しました。 この調査は、自然言語テキスト分析の分野における最新の進歩を知りたいと思っているすべての学生や研究者にとって有益なものとなるでしょう。
process
bert a review of applications in natural language processing and understanding このレビューでは、 。 このモデルの動作メカニズム、テキスト分析のタスクへの主な適用分野、各タスクにおける類似モデルとの比較、さらにいくつかの独自モデルについても説明しています。 このレビューを作成するにあたり、科学界で最も注目を集めた過去数年間に発表された数十本のオリジナル科学論文のデータを体系化しました。 この調査は、自然言語テキスト分析の分野における最新の進歩を知りたいと思っているすべての学生や研究者にとって有益なものとなるでしょう。
1
64,354
3,210,850,485
IssuesEvent
2015-10-06 07:15:58
fallenswordhelper/fallenswordhelper
https://api.github.com/repos/fallenswordhelper/fallenswordhelper
closed
Hide Buff Selected doesn't do anything
bug minor priority:2
Looks like the preference was written in but the functionality never made it. It's fairly simple to do.
1.0
Hide Buff Selected doesn't do anything - Looks like the preference was written in but the functionality never made it. It's fairly simple to do.
non_process
hide buff selected doesn t do anything looks like the preference was written in but the functionality never made it it s fairly simple to do
0
9,990
13,038,538,536
IssuesEvent
2020-07-28 15:21:08
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
Refactor Fields tool ignoring user selection for field types on macOS
Bug Feedback MacOS Processing
**Describe the bug** <!-- A clear and concise description of what the bug is. --> The Refactor Fields processing tool ignore’s the user’s selection of a field’s type, instead using the existing type for fields loaded from an existing layer, or always using String for new fields. **How to Reproduce** <!-- Steps, sample datasets and qgis project file to reproduce the behavior. Screencasts or screenshots welcome--> Sample geojson: ```geojson {"type":"FeatureCollection","features":[{"type":"Feature","geometry":{"type":"LineString","coordinates":[[102,0],[103,1],[104,0],[105,1]]},"properties":{"prop0":"value0","prop1":0}}]} ``` Can’t change the type of an existing field: 1. Load a vector layer, such as the one above. If prompted to reset the field mapping, accept 2. In the Processing toolbox, open the **Refactor Fields** tool 3. Select the vector layer as the input 4. Change the type of one of the fields 5. Click Run 6. Open the Properties of the new layer 7. Select the **Fields** tab 8. Note that the field type has not changed The following can be found in the output log. On line 3, the type should be ’10’, as string was selected, but instead the type is ‘6’, showing that the user’s selection is not being used ``` { 'FIELDS_MAPPING' : [ {'expression': '"prop0"', 'length': 0, 'name': 'prop0', 'precision': 0, 'type': 10}, {'expression': '"prop1"', 'length': 0, 'name': 'prop1', 'precision': 0, 'type': 6} ], 'INPUT' : 'sample.geojson', 'OUTPUT' : 'TEMPORARY_OUTPUT' } ``` Type of new field always defaults to String: 1. Load a vector layer, such as the one above. If prompted to reset the field mapping, accept 2. In the Processing toolbox, open the **Refactor Fields** tool 3. Select the vector layer as the input 4. Click the add field button. 5. Add a source expression such as `prop1 + 4` , and a field name such as `prop2` 6. Select a type other than String, such as double, and a length and precision, such as 4 and 1 7. Click Run 8. Open the Properties of the new layer 9. Select the **Fields** tab 10. Note that the new field's type is String The output log shows that the user’s selection for the new field’s type is not being used. Line 4 should have a type of 6, but instead has a type of 0: ``` { 'FIELDS_MAPPING' : [ {'expression': '"prop0"', 'length': 0, 'name': 'prop0', 'precision': 0, 'type': 10}, {'expression': '"prop1"', 'length': 0, 'name': 'prop1', 'precision': 0, 'type': 6}, {'expression': 'prop1 + 4', 'length': 10, 'name': 'prop2', 'precision': 1, 'type': 0} ], 'INPUT' : '/Users/primaryuser/Downloads/sample.geojson', 'OUTPUT' : 'TEMPORARY_OUTPUT' } ``` These problems persist even when casting the source to the desired type within the source expression, with a function such as `to_real()`, and when saving the output to file formats such as geojson or geopackage. **QGIS and OS versions** <!-- In the QGIS menu help/about, click in the dialog, Ctrl+A and then Ctrl+C. Finally paste here --> QGIS Version 3.10 (also present in 3.8 and 3.4) Only the default .dmg installers from qgis.org have been tested. (Have not tested the KyngChaos or osgeo4mac distributions) macOS version 10.14 Mojave
1.0
Refactor Fields tool ignoring user selection for field types on macOS - **Describe the bug** <!-- A clear and concise description of what the bug is. --> The Refactor Fields processing tool ignore’s the user’s selection of a field’s type, instead using the existing type for fields loaded from an existing layer, or always using String for new fields. **How to Reproduce** <!-- Steps, sample datasets and qgis project file to reproduce the behavior. Screencasts or screenshots welcome--> Sample geojson: ```geojson {"type":"FeatureCollection","features":[{"type":"Feature","geometry":{"type":"LineString","coordinates":[[102,0],[103,1],[104,0],[105,1]]},"properties":{"prop0":"value0","prop1":0}}]} ``` Can’t change the type of an existing field: 1. Load a vector layer, such as the one above. If prompted to reset the field mapping, accept 2. In the Processing toolbox, open the **Refactor Fields** tool 3. Select the vector layer as the input 4. Change the type of one of the fields 5. Click Run 6. Open the Properties of the new layer 7. Select the **Fields** tab 8. Note that the field type has not changed The following can be found in the output log. On line 3, the type should be ’10’, as string was selected, but instead the type is ‘6’, showing that the user’s selection is not being used ``` { 'FIELDS_MAPPING' : [ {'expression': '"prop0"', 'length': 0, 'name': 'prop0', 'precision': 0, 'type': 10}, {'expression': '"prop1"', 'length': 0, 'name': 'prop1', 'precision': 0, 'type': 6} ], 'INPUT' : 'sample.geojson', 'OUTPUT' : 'TEMPORARY_OUTPUT' } ``` Type of new field always defaults to String: 1. Load a vector layer, such as the one above. If prompted to reset the field mapping, accept 2. In the Processing toolbox, open the **Refactor Fields** tool 3. Select the vector layer as the input 4. Click the add field button. 5. Add a source expression such as `prop1 + 4` , and a field name such as `prop2` 6. Select a type other than String, such as double, and a length and precision, such as 4 and 1 7. Click Run 8. Open the Properties of the new layer 9. Select the **Fields** tab 10. Note that the new field's type is String The output log shows that the user’s selection for the new field’s type is not being used. Line 4 should have a type of 6, but instead has a type of 0: ``` { 'FIELDS_MAPPING' : [ {'expression': '"prop0"', 'length': 0, 'name': 'prop0', 'precision': 0, 'type': 10}, {'expression': '"prop1"', 'length': 0, 'name': 'prop1', 'precision': 0, 'type': 6}, {'expression': 'prop1 + 4', 'length': 10, 'name': 'prop2', 'precision': 1, 'type': 0} ], 'INPUT' : '/Users/primaryuser/Downloads/sample.geojson', 'OUTPUT' : 'TEMPORARY_OUTPUT' } ``` These problems persist even when casting the source to the desired type within the source expression, with a function such as `to_real()`, and when saving the output to file formats such as geojson or geopackage. **QGIS and OS versions** <!-- In the QGIS menu help/about, click in the dialog, Ctrl+A and then Ctrl+C. Finally paste here --> QGIS Version 3.10 (also present in 3.8 and 3.4) Only the default .dmg installers from qgis.org have been tested. (Have not tested the KyngChaos or osgeo4mac distributions) macOS version 10.14 Mojave
process
refactor fields tool ignoring user selection for field types on macos describe the bug the refactor fields processing tool ignore’s the user’s selection of a field’s type instead using the existing type for fields loaded from an existing layer or always using string for new fields how to reproduce sample geojson geojson type featurecollection features properties can’t change the type of an existing field load a vector layer such as the one above if prompted to reset the field mapping accept in the processing toolbox open the refactor fields tool select the vector layer as the input change the type of one of the fields click run open the properties of the new layer select the fields tab note that the field type has not changed the following can be found in the output log on line the type should be ’ ’ as string was selected but instead the type is ‘ ’ showing that the user’s selection is not being used fields mapping expression length name precision type expression length name precision type input sample geojson output temporary output type of new field always defaults to string load a vector layer such as the one above if prompted to reset the field mapping accept in the processing toolbox open the refactor fields tool select the vector layer as the input click the add field button add a source expression such as and a field name such as select a type other than string such as double and a length and precision such as and click run open the properties of the new layer select the fields tab note that the new field s type is string the output log shows that the user’s selection for the new field’s type is not being used line should have a type of but instead has a type of fields mapping expression length name precision type expression length name precision type expression length name precision type input users primaryuser downloads sample geojson output temporary output these problems persist even when casting the source to the desired type within the source expression with a function such as to real and when saving the output to file formats such as geojson or geopackage qgis and os versions qgis version also present in and only the default dmg installers from qgis org have been tested have not tested the kyngchaos or distributions macos version mojave
1
135,978
30,455,182,518
IssuesEvent
2023-07-16 19:59:04
languagetool-org/languagetool
https://api.github.com/repos/languagetool-org/languagetool
closed
[en] Case conversion does not give the correct result for UPPER CASE text
bug English code/java
"Note: LT automatically adjusts the case of suggestions if they are added as plain text." (https://dev.languagetool.org/tips-and-tricks#changing-the-case-of-matched-word) ``` <rule id="CASE_CONVERSION_TEST" name="Case conversion test"> <pattern> <token>cat</token> </pattern> <message>Use '<suggestion>dog</suggestion>'.</message> <example correction="dog">My <marker>cat</marker> is happy.</example> <example correction="Dog">My <marker>Cat</marker> is fat.</example> <example correction="DOG">MY <marker>CAT</marker> SAT ON THE MAT.</example> </rule> ``` Testrules gives this message (LanguageTool-20220919-snapshot.zip): `Exception in thread "main" org.languagetool.rules.patterns.PatternRuleTest$PatternRuleTestFailure: Test failure for rule CASE_CONVERSION_TEST[1] in file /org/languagetool/rules/en/grammar.xml: Incorrect suggestions: Expected 'DOG', got: 'Dog' on input: 'MY CAT SAT ON THE MAT.'`
1.0
[en] Case conversion does not give the correct result for UPPER CASE text - "Note: LT automatically adjusts the case of suggestions if they are added as plain text." (https://dev.languagetool.org/tips-and-tricks#changing-the-case-of-matched-word) ``` <rule id="CASE_CONVERSION_TEST" name="Case conversion test"> <pattern> <token>cat</token> </pattern> <message>Use '<suggestion>dog</suggestion>'.</message> <example correction="dog">My <marker>cat</marker> is happy.</example> <example correction="Dog">My <marker>Cat</marker> is fat.</example> <example correction="DOG">MY <marker>CAT</marker> SAT ON THE MAT.</example> </rule> ``` Testrules gives this message (LanguageTool-20220919-snapshot.zip): `Exception in thread "main" org.languagetool.rules.patterns.PatternRuleTest$PatternRuleTestFailure: Test failure for rule CASE_CONVERSION_TEST[1] in file /org/languagetool/rules/en/grammar.xml: Incorrect suggestions: Expected 'DOG', got: 'Dog' on input: 'MY CAT SAT ON THE MAT.'`
non_process
case conversion does not give the correct result for upper case text note lt automatically adjusts the case of suggestions if they are added as plain text cat use dog my cat is happy my cat is fat my cat sat on the mat testrules gives this message languagetool snapshot zip exception in thread main org languagetool rules patterns patternruletest patternruletestfailure test failure for rule case conversion test in file org languagetool rules en grammar xml incorrect suggestions expected dog got dog on input my cat sat on the mat
0
2,250
5,088,650,072
IssuesEvent
2017-01-01 00:01:31
sw4j-org/tool-jpa-processor
https://api.github.com/repos/sw4j-org/tool-jpa-processor
opened
Handle @OneToOne Annotation
annotation processor task
Handle the `@OneToOne` annotation for a property or field. See [JSR 338: Java Persistence API, Version 2.1](http://download.oracle.com/otn-pub/jcp/persistence-2_1-fr-eval-spec/JavaPersistence.pdf) - 11.1.41 OneToOne Annotation
1.0
Handle @OneToOne Annotation - Handle the `@OneToOne` annotation for a property or field. See [JSR 338: Java Persistence API, Version 2.1](http://download.oracle.com/otn-pub/jcp/persistence-2_1-fr-eval-spec/JavaPersistence.pdf) - 11.1.41 OneToOne Annotation
process
handle onetoone annotation handle the onetoone annotation for a property or field see onetoone annotation
1
19,302
25,466,614,928
IssuesEvent
2022-11-25 05:29:40
GoogleCloudPlatform/fda-mystudies
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
closed
[IDP] [PM] Getting 'An internal error has occurred' message in participant manager when tried to sign in
Bug Blocker P0 Participant manager Process: Fixed Process: Tested QA Process: Tested dev
**Pre-condition:** idp and mfa should be enabled in the PM **Steps:** 1. Add organizational user in the participant manager 2. Complete set up your account process 3. Sign in with registered credentials and Verify **AR:** Getting an internal error has occurred message when tried to sign in **ER:** Admin's should be able to sign without any error's ![Untitled](https://user-images.githubusercontent.com/86007179/178441102-f1d47479-b1ad-4661-b31b-5abde48b9b46.png)
3.0
[IDP] [PM] Getting 'An internal error has occurred' message in participant manager when tried to sign in - **Pre-condition:** idp and mfa should be enabled in the PM **Steps:** 1. Add organizational user in the participant manager 2. Complete set up your account process 3. Sign in with registered credentials and Verify **AR:** Getting an internal error has occurred message when tried to sign in **ER:** Admin's should be able to sign without any error's ![Untitled](https://user-images.githubusercontent.com/86007179/178441102-f1d47479-b1ad-4661-b31b-5abde48b9b46.png)
process
getting an internal error has occurred message in participant manager when tried to sign in pre condition idp and mfa should be enabled in the pm steps add organizational user in the participant manager complete set up your account process sign in with registered credentials and verify ar getting an internal error has occurred message when tried to sign in er admin s should be able to sign without any error s
1
15,111
18,847,210,369
IssuesEvent
2021-11-11 16:11:59
prisma/prisma
https://api.github.com/repos/prisma/prisma
opened
Decide on the default referential types in a single place
process/candidate engines/data model parser team/migrations topic: referential actions
Right now we define referential types two times if they're not explicitly defined in the data model, first here: https://github.com/prisma/prisma-engines/blob/master/libs/datamodel/core/src/transform/ast_to_dml/db/walkers/relation.rs#L360-L395 Then we do not use these functions in the validations: [thttps://github.com/prisma/prisma-engine](https://github.com/prisma/prisma-engines/blob/master/libs/datamodel/core/src/transform/ast_to_dml/lift.rs#L70-L71) And, again, in the dml part we have another set of defaults: thttps://github.com/prisma/prisma-engines/blob/master/libs/datamodel/core/src/transform/ast_to_dml/lift.rs#L70-L71 Now if any of these change without the other, there might be quite interesting bugs happening in the future. The correct thing would be to just not use `Option<ReferentialAction>` in the dml. All the engines need some action defined, so it doesn't really need to be optional. There are some caveats for this. First is how we use dml to render the datamodel string back to the file. We should somehow mark in the data structures so we know to not render a value. This decision needs a lot of data, so at least we could write the default values to the dml and just not render them. The second one is how dml structs hold a `RelationInfo` on both sides of the relation, and it is not the same structure. Should we have the actions in the back relation side too and if so should they be the same in there too? In the case of many to many relations, we don't really define any referential actions, should we just have `NoAction` in these?
1.0
Decide on the default referential types in a single place - Right now we define referential types two times if they're not explicitly defined in the data model, first here: https://github.com/prisma/prisma-engines/blob/master/libs/datamodel/core/src/transform/ast_to_dml/db/walkers/relation.rs#L360-L395 Then we do not use these functions in the validations: [thttps://github.com/prisma/prisma-engine](https://github.com/prisma/prisma-engines/blob/master/libs/datamodel/core/src/transform/ast_to_dml/lift.rs#L70-L71) And, again, in the dml part we have another set of defaults: thttps://github.com/prisma/prisma-engines/blob/master/libs/datamodel/core/src/transform/ast_to_dml/lift.rs#L70-L71 Now if any of these change without the other, there might be quite interesting bugs happening in the future. The correct thing would be to just not use `Option<ReferentialAction>` in the dml. All the engines need some action defined, so it doesn't really need to be optional. There are some caveats for this. First is how we use dml to render the datamodel string back to the file. We should somehow mark in the data structures so we know to not render a value. This decision needs a lot of data, so at least we could write the default values to the dml and just not render them. The second one is how dml structs hold a `RelationInfo` on both sides of the relation, and it is not the same structure. Should we have the actions in the back relation side too and if so should they be the same in there too? In the case of many to many relations, we don't really define any referential actions, should we just have `NoAction` in these?
process
decide on the default referential types in a single place right now we define referential types two times if they re not explicitly defined in the data model first here then we do not use these functions in the validations and again in the dml part we have another set of defaults t now if any of these change without the other there might be quite interesting bugs happening in the future the correct thing would be to just not use option in the dml all the engines need some action defined so it doesn t really need to be optional there are some caveats for this first is how we use dml to render the datamodel string back to the file we should somehow mark in the data structures so we know to not render a value this decision needs a lot of data so at least we could write the default values to the dml and just not render them the second one is how dml structs hold a relationinfo on both sides of the relation and it is not the same structure should we have the actions in the back relation side too and if so should they be the same in there too in the case of many to many relations we don t really define any referential actions should we just have noaction in these
1
58,583
11,890,765,569
IssuesEvent
2020-03-28 19:53:05
herricane/herricane.github.io
https://api.github.com/repos/herricane/herricane.github.io
opened
LeetCode 练习(1) | 阿辉's Blog
/post/leetcode-lian-xi-1/ Gitalk
https://herricane.github.io/post/leetcode-lian-xi-1/ 为了找工作,LeetCode 刷起来。记录一下有意思的题目和相关知识。(要恰饭的嘛) 20. Valid Parentheses Given a string containing just the characters '(', ')...
1.0
LeetCode 练习(1) | 阿辉's Blog - https://herricane.github.io/post/leetcode-lian-xi-1/ 为了找工作,LeetCode 刷起来。记录一下有意思的题目和相关知识。(要恰饭的嘛) 20. Valid Parentheses Given a string containing just the characters '(', ')...
non_process
leetcode 练习( ) 阿辉 s blog 为了找工作,leetcode 刷起来。记录一下有意思的题目和相关知识。(要恰饭的嘛) valid parentheses given a string containing just the characters
0
21,815
30,316,614,110
IssuesEvent
2023-07-10 15:59:28
tdwg/dwc
https://api.github.com/repos/tdwg/dwc
closed
Change term - institutionID and collectionID
Term - change Class - Record-level non-normative Process - complete
## Term change Note: Similar changes for two terms are proposed here. The justifications below apply to both terms. * Submitter: Steven Ginzbarg * Efficacy Justification (why is this change necessary?): The service recommended in the comments has been replaced. * Demand Justification (if the change is semantic in nature, name at least two organizations that independently need this term): No semantic change. * Stability Justification (what concerns are there that this might affect existing implementations?): None * Implications for dwciri: namespace (does this change affect a dwciri term version)?: None Current Term definition: https://dwc.tdwg.org/list/#dwc_institutionID Current Term definition: https://dwc.tdwg.org/list/#dwc_collectionID Proposed attributes of the new term version (Please put actual changes to be implemented in **bold** and ~strikethrough~): * Term name (in lowerCamelCase for properties, UpperCamelCase for classes): institutionID * Organized in Class (e.g., Occurrence, Event, Location, Taxon): Record-level * Definition of the term (normative): An identifier for the institution having custody of the object(s) or information referred to in the record. * Usage comments (recommendations regarding content, etc., not normative): For physical specimens, the recommended best practice is to use an identifier from a collections registry such as the ~Global Registry of Biodiversity Repositories (http://grbio.org/)~ **Research Organization Registry (ROR) or the GBIF Registry of Scientific Collections (https://www.gbif.org/grscicoll)**. * Examples (not normative): ~`http://biocol.org/urn:lsid:biocol.org:col:34777`, http://grbio.org/cool/km06-gtbn~ **`https://ror.org/015hz7p22`, `https://www.gbif.org/grscicoll/institution/e3d4dcc4-81e2-444c-8a5c-41d1044b5381`** * Refines (identifier of the broader term this term refines; normative): None * Replaces (identifier of the existing term that would be deprecated and replaced by this term; normative): http://rs.tdwg.org/dwc/terms/version/institutionID-2017-10-06 * ABCD 2.06 (XPATH of the equivalent term in ABCD or EFG; not normative): DataSets/DataSet/Units/Unit/SourceInstitutionID <hr> * Term name (in lowerCamelCase for properties, UpperCamelCase for classes): collectionID * Organized in Class (e.g., Occurrence, Event, Location, Taxon): Record-level * Definition of the term (normative): An identifier for the collection or dataset from which the record was derived. * Usage comments (recommendations regarding content, etc., not normative): For physical specimens, the recommended best practice is to use an identifier from a collections registry such as the ~Global Registry of Biodiversity Repositories (http://grbio.org/)~ **GBIF Registry of Scientific Collections (https://www.gbif.org/grscicoll)**. * Examples (not normative): ~`http://biocol.org/urn:lsid:biocol.org:col:1001`, `http://grbio.org/cool/p5fp-c036`~ **https://www.gbif.org/grscicoll/collection/fbd3ed74-5a21-4e01-b86a-33d36f032d9c** * Refines (identifier of the broader term this term refines; normative): None * Replaces (identifier of the existing term that would be deprecated and replaced by this term; normative): http://rs.tdwg.org/dwc/terms/version/collectionID-2017-10-06 * ABCD 2.06 (XPATH of the equivalent term in ABCD or EFG; not normative): DataSets/DataSet/Units/Unit/SourceID
1.0
Change term - institutionID and collectionID - ## Term change Note: Similar changes for two terms are proposed here. The justifications below apply to both terms. * Submitter: Steven Ginzbarg * Efficacy Justification (why is this change necessary?): The service recommended in the comments has been replaced. * Demand Justification (if the change is semantic in nature, name at least two organizations that independently need this term): No semantic change. * Stability Justification (what concerns are there that this might affect existing implementations?): None * Implications for dwciri: namespace (does this change affect a dwciri term version)?: None Current Term definition: https://dwc.tdwg.org/list/#dwc_institutionID Current Term definition: https://dwc.tdwg.org/list/#dwc_collectionID Proposed attributes of the new term version (Please put actual changes to be implemented in **bold** and ~strikethrough~): * Term name (in lowerCamelCase for properties, UpperCamelCase for classes): institutionID * Organized in Class (e.g., Occurrence, Event, Location, Taxon): Record-level * Definition of the term (normative): An identifier for the institution having custody of the object(s) or information referred to in the record. * Usage comments (recommendations regarding content, etc., not normative): For physical specimens, the recommended best practice is to use an identifier from a collections registry such as the ~Global Registry of Biodiversity Repositories (http://grbio.org/)~ **Research Organization Registry (ROR) or the GBIF Registry of Scientific Collections (https://www.gbif.org/grscicoll)**. * Examples (not normative): ~`http://biocol.org/urn:lsid:biocol.org:col:34777`, http://grbio.org/cool/km06-gtbn~ **`https://ror.org/015hz7p22`, `https://www.gbif.org/grscicoll/institution/e3d4dcc4-81e2-444c-8a5c-41d1044b5381`** * Refines (identifier of the broader term this term refines; normative): None * Replaces (identifier of the existing term that would be deprecated and replaced by this term; normative): http://rs.tdwg.org/dwc/terms/version/institutionID-2017-10-06 * ABCD 2.06 (XPATH of the equivalent term in ABCD or EFG; not normative): DataSets/DataSet/Units/Unit/SourceInstitutionID <hr> * Term name (in lowerCamelCase for properties, UpperCamelCase for classes): collectionID * Organized in Class (e.g., Occurrence, Event, Location, Taxon): Record-level * Definition of the term (normative): An identifier for the collection or dataset from which the record was derived. * Usage comments (recommendations regarding content, etc., not normative): For physical specimens, the recommended best practice is to use an identifier from a collections registry such as the ~Global Registry of Biodiversity Repositories (http://grbio.org/)~ **GBIF Registry of Scientific Collections (https://www.gbif.org/grscicoll)**. * Examples (not normative): ~`http://biocol.org/urn:lsid:biocol.org:col:1001`, `http://grbio.org/cool/p5fp-c036`~ **https://www.gbif.org/grscicoll/collection/fbd3ed74-5a21-4e01-b86a-33d36f032d9c** * Refines (identifier of the broader term this term refines; normative): None * Replaces (identifier of the existing term that would be deprecated and replaced by this term; normative): http://rs.tdwg.org/dwc/terms/version/collectionID-2017-10-06 * ABCD 2.06 (XPATH of the equivalent term in ABCD or EFG; not normative): DataSets/DataSet/Units/Unit/SourceID
process
change term institutionid and collectionid term change note similar changes for two terms are proposed here the justifications below apply to both terms submitter steven ginzbarg efficacy justification why is this change necessary the service recommended in the comments has been replaced demand justification if the change is semantic in nature name at least two organizations that independently need this term no semantic change stability justification what concerns are there that this might affect existing implementations none implications for dwciri namespace does this change affect a dwciri term version none current term definition current term definition proposed attributes of the new term version please put actual changes to be implemented in bold and strikethrough term name in lowercamelcase for properties uppercamelcase for classes institutionid organized in class e g occurrence event location taxon record level definition of the term normative an identifier for the institution having custody of the object s or information referred to in the record usage comments recommendations regarding content etc not normative for physical specimens the recommended best practice is to use an identifier from a collections registry such as the global registry of biodiversity repositories research organization registry ror or the gbif registry of scientific collections examples not normative refines identifier of the broader term this term refines normative none replaces identifier of the existing term that would be deprecated and replaced by this term normative abcd xpath of the equivalent term in abcd or efg not normative datasets dataset units unit sourceinstitutionid term name in lowercamelcase for properties uppercamelcase for classes collectionid organized in class e g occurrence event location taxon record level definition of the term normative an identifier for the collection or dataset from which the record was derived usage comments recommendations regarding content etc not normative for physical specimens the recommended best practice is to use an identifier from a collections registry such as the global registry of biodiversity repositories gbif registry of scientific collections examples not normative refines identifier of the broader term this term refines normative none replaces identifier of the existing term that would be deprecated and replaced by this term normative abcd xpath of the equivalent term in abcd or efg not normative datasets dataset units unit sourceid
1
61,897
3,158,430,286
IssuesEvent
2015-09-18 00:48:46
leo-project/leofs
https://api.github.com/repos/leo-project/leofs
closed
[NFS] du command is broken when the path of avs file is specified as a relative path
Bug Priority-LOW _nfs
Currently works only when the path is specified as an absolute path.
1.0
[NFS] du command is broken when the path of avs file is specified as a relative path - Currently works only when the path is specified as an absolute path.
non_process
du command is broken when the path of avs file is specified as a relative path currently works only when the path is specified as an absolute path
0
8,858
11,956,473,385
IssuesEvent
2020-04-04 10:36:17
shogun-toolbox/shogun
https://api.github.com/repos/shogun-toolbox/shogun
opened
port GP code to be compatible with new API
ML: Gaussian Process Tag: Cleanup Tag: Meta Examples
The goal of this issue is to be able to use the Gaussian Process framework of Shogun from the new API (factories, base classes). I.e. the issue is a sub-issue of #4463. In a nutshell, we want to replace all explicit constructor calls around GP objects `GaussianProcessRegression, ProbitLikelihood, ZeroMeanFunction, ExactInferenceMethod`, etc, with factory calls. And then fix everything downstream from there. A good starting point is to try to port the current meta examples for GPs to the new API. This will bring up a number of issues * missing base classes for GPs -> add them * some specialized methods called in GPs -> refactor so they can be exposed via `Machine`, or potentially add a GP baseclass that has the method * there will be cases where it is not 100% clear what the best solution is -> try to come up with a draft and send a PR, ask in irc/mailing-list. Once the meta examples are ported, continue with the GP notebook
1.0
port GP code to be compatible with new API - The goal of this issue is to be able to use the Gaussian Process framework of Shogun from the new API (factories, base classes). I.e. the issue is a sub-issue of #4463. In a nutshell, we want to replace all explicit constructor calls around GP objects `GaussianProcessRegression, ProbitLikelihood, ZeroMeanFunction, ExactInferenceMethod`, etc, with factory calls. And then fix everything downstream from there. A good starting point is to try to port the current meta examples for GPs to the new API. This will bring up a number of issues * missing base classes for GPs -> add them * some specialized methods called in GPs -> refactor so they can be exposed via `Machine`, or potentially add a GP baseclass that has the method * there will be cases where it is not 100% clear what the best solution is -> try to come up with a draft and send a PR, ask in irc/mailing-list. Once the meta examples are ported, continue with the GP notebook
process
port gp code to be compatible with new api the goal of this issue is to be able to use the gaussian process framework of shogun from the new api factories base classes i e the issue is a sub issue of in a nutshell we want to replace all explicit constructor calls around gp objects gaussianprocessregression probitlikelihood zeromeanfunction exactinferencemethod etc with factory calls and then fix everything downstream from there a good starting point is to try to port the current meta examples for gps to the new api this will bring up a number of issues missing base classes for gps add them some specialized methods called in gps refactor so they can be exposed via machine or potentially add a gp baseclass that has the method there will be cases where it is not clear what the best solution is try to come up with a draft and send a pr ask in irc mailing list once the meta examples are ported continue with the gp notebook
1
12,137
14,741,023,366
IssuesEvent
2021-01-07 09:59:05
kdjstudios/SABillingGitlab
https://api.github.com/repos/kdjstudios/SABillingGitlab
closed
Missing Accounts Exceptions - NOT WORKING
anc-process anp-0.5 ant-bug has attachment
In GitLab by @kdjstudios on Dec 20, 2018, 16:56 **Submitted by:** Kyle **Helpdesk:** NA **Server:** Dev **Client/Site:** Demo **Account:** NA **Issue:** While testing the ability to turn on and off the usage line item invoicing, I believe I found that the missing accounts and exceptions is not working correctly. As I was uploading a file ([Demo_Site_VCC_Usage_Internal.csv](/uploads/e8e3c07cdcd48139ab03b39099e103fb/Demo_Site_VCC_Usage_Internal.csv)) that had resource IDs that did not match any account but the last two billing cycles it only showed the first account as being missing. This was done on the Dev server and as you can see in the last exception[exceptions_09_01_2022_Master.pdf](/uploads/8cafe7c52f635d796b11c939c301114e/exceptions_09_01_2022_Master.pdf) report only one account shows. Please look into this and provide feedback and an estimate.
1.0
Missing Accounts Exceptions - NOT WORKING - In GitLab by @kdjstudios on Dec 20, 2018, 16:56 **Submitted by:** Kyle **Helpdesk:** NA **Server:** Dev **Client/Site:** Demo **Account:** NA **Issue:** While testing the ability to turn on and off the usage line item invoicing, I believe I found that the missing accounts and exceptions is not working correctly. As I was uploading a file ([Demo_Site_VCC_Usage_Internal.csv](/uploads/e8e3c07cdcd48139ab03b39099e103fb/Demo_Site_VCC_Usage_Internal.csv)) that had resource IDs that did not match any account but the last two billing cycles it only showed the first account as being missing. This was done on the Dev server and as you can see in the last exception[exceptions_09_01_2022_Master.pdf](/uploads/8cafe7c52f635d796b11c939c301114e/exceptions_09_01_2022_Master.pdf) report only one account shows. Please look into this and provide feedback and an estimate.
process
missing accounts exceptions not working in gitlab by kdjstudios on dec submitted by kyle helpdesk na server dev client site demo account na issue while testing the ability to turn on and off the usage line item invoicing i believe i found that the missing accounts and exceptions is not working correctly as i was uploading a file uploads demo site vcc usage internal csv that had resource ids that did not match any account but the last two billing cycles it only showed the first account as being missing this was done on the dev server and as you can see in the last exception uploads exceptions master pdf report only one account shows please look into this and provide feedback and an estimate
1
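The billing system's source is not shown in the ticket, but the reported symptom (only the first unmatched account appears in the exceptions report, even though several resource IDs had no match) is the classic return-on-first-miss bug. A minimal sketch of the intended accumulate-all behavior, with hypothetical field names since the real schema isn't given:

```python
def find_missing_accounts(usage_rows, known_resource_ids):
    """Collect every resource ID with no matching account.

    A buggy variant would stop at the first miss; the exceptions
    report should list all of them. `resource_id` is a hypothetical
    field name standing in for the real CSV column.
    """
    missing = []
    seen = set()
    for row in usage_rows:
        rid = row["resource_id"]
        if rid not in known_resource_ids and rid not in seen:
            seen.add(rid)          # report each unmatched ID once
            missing.append(rid)
    return missing
```

The key point is that the loop never returns early, so every unmatched ID from the upload reaches the report.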
237,094
7,755,767,278
IssuesEvent
2018-05-31 11:21:41
MichaelHillcox/Echelon
https://api.github.com/repos/MichaelHillcox/Echelon
opened
Implement a complete CSRF token system with verification
enhancement feature optimisation priority high security
- [ ] Remove old system - [ ] Implement system based on standard prevention https://www.owasp.org/index.php/Cross-Site_Request_Forgery_(CSRF) Related to #43
1.0
Implement a complete CSRF token system with verification - - [ ] Remove old system - [ ] Implement system based on standard prevention https://www.owasp.org/index.php/Cross-Site_Request_Forgery_(CSRF) Related to #43
non_process
implement a complete csrf token system with verification remove old system implement system based on standard prevention related to
0
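The OWASP page linked in the ticket describes, among others, an HMAC-based token pattern. As a hedged illustration (not Echelon's actual code; the secret handling and session-ID binding are assumptions), a token can be derived from the session ID server-side and verified in constant time:

```python
import hashlib
import hmac

# Assumption: a per-deployment secret kept server-side, never sent to clients.
SECRET_KEY = b"change-me-per-deployment"

def issue_csrf_token(session_id: str) -> str:
    """Derive a CSRF token bound to the user's session ID."""
    return hmac.new(SECRET_KEY, session_id.encode(), hashlib.sha256).hexdigest()

def verify_csrf_token(session_id: str, token: str) -> bool:
    """Recompute the expected token and compare in constant time."""
    expected = issue_csrf_token(session_id)
    return hmac.compare_digest(expected, token)
```

Because the token is recomputable from the session ID plus the secret, nothing needs to be stored per request, and `hmac.compare_digest` avoids timing leaks during verification.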
881
3,343,584,091
IssuesEvent
2015-11-15 16:36:42
pwittchen/kirai
https://api.github.com/repos/pwittchen/kirai
closed
Release 1.2.0
release process
**Initial release notes**: - changed library type from `aar` (Android packaging) to `jar` (pure Java packaging) **Things to do**: - [x] bump library version - [x] upload archives to Maven Central - [x] close and release artifact on Maven Central - [x] update `CHANGELOG.md` - [x] bump library version in `README.md` - [x] create new GitHub release
1.0
Release 1.2.0 - **Initial release notes**: - changed library type from `aar` (Android packaging) to `jar` (pure Java packaging) **Things to do**: - [x] bump library version - [x] upload archives to Maven Central - [x] close and release artifact on Maven Central - [x] update `CHANGELOG.md` - [x] bump library version in `README.md` - [x] create new GitHub release
process
release initial release notes changed library type from aar android packaging to jar pure java packaging things to do bump library version upload archives to maven central close and release artifact on maven central update changelog md bump library version in readme md create new github release
1
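Each record in this dump carries both the raw title/body (`text_combine`) and a lower-cased, de-punctuated version (`text`). The actual preprocessing code is not included here; a plausible sketch that reproduces the visible transformation (lowercase, URLs dropped, non-letters stripped, whitespace collapsed) is:

```python
import re

def normalize(text: str) -> str:
    """Approximate the dataset's `text` field from `text_combine`.

    Lowercases, drops URLs, replaces every non-letter run with a
    space, and collapses whitespace. An approximation inferred from
    the visible field pairs, not the dataset's actual pipeline.
    """
    text = text.lower()
    text = re.sub(r"https?://\S+", " ", text)   # strip URLs first
    text = re.sub(r"[^a-z]+", " ", text)        # digits, punctuation -> space
    return " ".join(text.split())
```

Applied to the "Release 1.2.0" title above, this yields `release initial release notes ...`, matching the record's normalized field.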
283,489
8,719,724,242
IssuesEvent
2018-12-08 03:39:57
aowen87/BAR
https://api.github.com/repos/aowen87/BAR
closed
Add a config site file for surface.
enhancement priority reviewed
Add a config site file for surface. -----------------------REDMINE MIGRATION----------------------- This ticket was migrated from Redmine. As such, not all information was able to be captured in the transition. Below is a complete record of the original redmine ticket. Ticket number: 2040 Status: Resolved Project: VisIt Tracker: Feature Priority: High Subject: Add a config site file for surface. Assigned to: Eric Brugger Category: Target version: 2.8.2 Author: Eric Brugger Start: 10/27/2014 Due date: % Done: 100 Estimated time: 1.0 Created: 10/27/2014 07:09 pm Updated: 10/29/2014 12:04 pm Likelihood: Severity: Found in version: Impact: 3 - Medium Expected Use: 3 - Occasional OS: All Support Group: Any Description: Add a config site file for surface. Comments: I committed revisions 24917 and 24919 to the 2.8 RC and trunk with the following change: 1) I added a config site file for surface and deleted the one for edge. This resolves #2040. A config-site/surface86.cmake D config-site/edge83.cmake
1.0
Add a config site file for surface. - Add a config site file for surface. -----------------------REDMINE MIGRATION----------------------- This ticket was migrated from Redmine. As such, not all information was able to be captured in the transition. Below is a complete record of the original redmine ticket. Ticket number: 2040 Status: Resolved Project: VisIt Tracker: Feature Priority: High Subject: Add a config site file for surface. Assigned to: Eric Brugger Category: Target version: 2.8.2 Author: Eric Brugger Start: 10/27/2014 Due date: % Done: 100 Estimated time: 1.0 Created: 10/27/2014 07:09 pm Updated: 10/29/2014 12:04 pm Likelihood: Severity: Found in version: Impact: 3 - Medium Expected Use: 3 - Occasional OS: All Support Group: Any Description: Add a config site file for surface. Comments: I committed revisions 24917 and 24919 to the 2.8 RC and trunk with the following change: 1) I added a config site file for surface and deleted the one for edge. This resolves #2040. A config-site/surface86.cmake D config-site/edge83.cmake
non_process
add a config site file for surface add a config site file for surface redmine migration this ticket was migrated from redmine as such not all information was able to be captured in the transition below is a complete record of the original redmine ticket ticket number status resolved project visit tracker feature priority high subject add a config site file for surface assigned to eric brugger category target version author eric brugger start due date done estimated time created pm updated pm likelihood severity found in version impact medium expected use occasional os all support group any description add a config site file for surface comments i committed revisions and to the rc and trunk with the following change i added a config site file for surface and deleted the one for edge this resolves a config site cmake d config site cmake
0
15,203
5,906,932,863
IssuesEvent
2017-05-19 16:19:10
maxild/Prelude
https://api.github.com/repos/maxild/Prelude
closed
Investigate git submodules
build
Is it possible to use git submodules when doing p2p development? And maybe later substitute the vendor modules with nupkgs.
1.0
Investigate git submodules - Is it possible to use git submodules when doing p2p development? And maybe later substitute the vendor modules with nupkgs.
non_process
investigate git submodules is it possible to use git submodules when doing development and maybe later substitute the vendor modules with nupkgs
0
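For the p2p (project-to-project) workflow the issue asks about, the standard submodule commands are below as a command fragment; the vendor URL and path are hypothetical examples, not Prelude's actual dependencies. Substituting a submodule with a nupkg later would mean removing the submodule entry and adding a package reference instead.

```shell
# Add a vendor module as a submodule (URL and path are hypothetical)
git submodule add https://github.com/example/vendor-lib.git vendor/vendor-lib

# After cloning the parent repo, fetch submodule contents
git submodule update --init --recursive
```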
21,333
29,041,235,684
IssuesEvent
2023-05-13 01:33:11
bazelbuild/bazel
https://api.github.com/repos/bazelbuild/bazel
closed
How to make bazel generate temporary static libs when bazel build
P4 type: support / not a bug (process) team-Rules-CPP stale
I'm using bazel to build tensorflow and tensorflow/serving, usually when I built tensorflow/serving with bazel, bazel generated some static library in ```bazel-bin/tensorflow_serving/core/``` for me. for example, after I successfully built tensorflow/serving, I got ```bazel-bin/tensorflow_serving/core/libaspired_version_policy.a``` and so on. But things changed when building tensorflow/serving on another docker instance, under ```bazel-bin/tensorflow_serving/core/``` I only got ```_objs```, no static libraries anymore, how come? What should I do to get these static libraries? Here is my Dockerfile: FROM tensorflow/tensorflow:1.15.5-py3 LABEL version="1.0.0" WORKDIR /root/ ENV TMP=/tmp RUN ln -s /usr/local/bin/gfortran /usr/bin/gfortran RUN apt-get update -yq ; exit 0 RUN apt-get install -yq software-properties-common apt-utils && apt-get update -yq; exit 0 RUN apt-get install -yq vim tree clang gdb make git RUN apt-get install -yq automake bison flex libboost-all-dev libevent-dev RUN apt-get install -yq libssl-dev ssh libtool pkg-config RUN apt-get install -yq default-jdk default-jre libunwind8-dev libc-ares-dev RUN apt-get install -yq python-numpy python-future libleveldb-dev libsnappy-dev libgoogle-perftools-dev RUN apt-get install -yq librdkafka-dev libapr1-dev libaprutil1-dev texinfo unzip zip RUN unset TF_NEED_CUDA ## To install go, uncomment below #RUN add-apt-repository ppa:longsleep/golang-backports -y #RUN apt-get install -yq golang-go RUN add-apt-repository ppa:ubuntu-toolchain-r/test -y RUN apt-get install -yq gcc-8 g++-8 RUN update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-8 80 \ --slave /usr/bin/g++ g++ /usr/bin/g++-8 \ --slave /usr/bin/gcc-ar gcc-ar /usr/bin/gcc-ar-8 \ --slave /usr/bin/gcc-nm gcc-nm /usr/bin/gcc-nm-8 \ --slave /usr/bin/gcc-ranlib gcc-ranlib /usr/bin/gcc-ranlib-8 RUN export https_proxy=http://220.181.102.178:8118 && wget https://github.com/Kitware/CMake/releases/download/v3.18.5/cmake-3.18.5-Linux-x86_64.sh && \ bash 
cmake-3.18.5-Linux-x86_64.sh --prefix=/usr/ --skip-license && \ rm -rf cmake-3.18.5-Linux-x86_64.sh RUN export https_proxy=http://220.181.102.178:8118 && wget https://github.com/bazelbuild/bazel/releases/download/0.24.1/bazel-0.24.1-installer-linux-x86_64.sh && \ bash bazel-0.24.1-installer-linux-x86_64.sh && rm -rf bazel-0.24.1-installer-linux-x86_64.sh RUN mkdir .ssh/ ADD .ssh/ .ssh/ ADD .vimrc . ADD .bashrc . RUN export PATH=/usr/local/cuda/bin:$PATH && export LD_LIBRARY_PATH=/usr/local/cuda/lib64:/lib:/lib64:/usr/lib:/usr/lib64:/usr/local/lib:/usr/local/lib64:$LD_LIBRARY_PATH RUN git config --global credential.helper store bazel version : 0.24.1 tensorflow/serving version : 1.15.0 downloaded from github release page. build command : export https_proxy=http://10.130.48.179:3128 unset TF_NEED_CUDA # Build tensorflow_model_server bazel build -c opt --copt=-msse4.1 --copt=-msse4.2 --copt=-mavx --copt=-mavx2 --copt=-mfma --copt=-O3 --copt=-march=native --cxxopt="-fexceptions" --verbose_failures //tensorflow_serving/model_servers:tensorflow_model_server Please help. thanks.
1.0
How to make bazel generate temporary static libs when bazel build - I'm using bazel to build tensorflow and tensorflow/serving, usually when I built tensorflow/serving with bazel, bazel generated some static library in ```bazel-bin/tensorflow_serving/core/``` for me. for example, after I successfully built tensorflow/serving, I got ```bazel-bin/tensorflow_serving/core/libaspired_version_policy.a``` and so on. But things changed when building tensorflow/serving on another docker instance, under ```bazel-bin/tensorflow_serving/core/``` I only got ```_objs```, no static libraries anymore, how come? What should I do to get these static libraries? Here is my Dockerfile: FROM tensorflow/tensorflow:1.15.5-py3 LABEL version="1.0.0" WORKDIR /root/ ENV TMP=/tmp RUN ln -s /usr/local/bin/gfortran /usr/bin/gfortran RUN apt-get update -yq ; exit 0 RUN apt-get install -yq software-properties-common apt-utils && apt-get update -yq; exit 0 RUN apt-get install -yq vim tree clang gdb make git RUN apt-get install -yq automake bison flex libboost-all-dev libevent-dev RUN apt-get install -yq libssl-dev ssh libtool pkg-config RUN apt-get install -yq default-jdk default-jre libunwind8-dev libc-ares-dev RUN apt-get install -yq python-numpy python-future libleveldb-dev libsnappy-dev libgoogle-perftools-dev RUN apt-get install -yq librdkafka-dev libapr1-dev libaprutil1-dev texinfo unzip zip RUN unset TF_NEED_CUDA ## To install go, uncomment below #RUN add-apt-repository ppa:longsleep/golang-backports -y #RUN apt-get install -yq golang-go RUN add-apt-repository ppa:ubuntu-toolchain-r/test -y RUN apt-get install -yq gcc-8 g++-8 RUN update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-8 80 \ --slave /usr/bin/g++ g++ /usr/bin/g++-8 \ --slave /usr/bin/gcc-ar gcc-ar /usr/bin/gcc-ar-8 \ --slave /usr/bin/gcc-nm gcc-nm /usr/bin/gcc-nm-8 \ --slave /usr/bin/gcc-ranlib gcc-ranlib /usr/bin/gcc-ranlib-8 RUN export https_proxy=http://220.181.102.178:8118 && wget 
https://github.com/Kitware/CMake/releases/download/v3.18.5/cmake-3.18.5-Linux-x86_64.sh && \ bash cmake-3.18.5-Linux-x86_64.sh --prefix=/usr/ --skip-license && \ rm -rf cmake-3.18.5-Linux-x86_64.sh RUN export https_proxy=http://220.181.102.178:8118 && wget https://github.com/bazelbuild/bazel/releases/download/0.24.1/bazel-0.24.1-installer-linux-x86_64.sh && \ bash bazel-0.24.1-installer-linux-x86_64.sh && rm -rf bazel-0.24.1-installer-linux-x86_64.sh RUN mkdir .ssh/ ADD .ssh/ .ssh/ ADD .vimrc . ADD .bashrc . RUN export PATH=/usr/local/cuda/bin:$PATH && export LD_LIBRARY_PATH=/usr/local/cuda/lib64:/lib:/lib64:/usr/lib:/usr/lib64:/usr/local/lib:/usr/local/lib64:$LD_LIBRARY_PATH RUN git config --global credential.helper store bazel version : 0.24.1 tensorflow/serving version : 1.15.0 downloaded from github release page. build command : export https_proxy=http://10.130.48.179:3128 unset TF_NEED_CUDA # Build tensorflow_model_server bazel build -c opt --copt=-msse4.1 --copt=-msse4.2 --copt=-mavx --copt=-mavx2 --copt=-mfma --copt=-O3 --copt=-march=native --cxxopt="-fexceptions" --verbose_failures //tensorflow_serving/model_servers:tensorflow_model_server Please help. thanks.
process
how to make bazel generate temporary static libs when bazel build i m using bazel to build tensorflow and tensorflow serving usually when i built tensorflow serving with bazel bazel generated some static library in bazel bin tensorflow serving core for me for example after i successfully built tensorflow serving i got bazel bin tensorflow serving core libaspired version policy a and so on but things changed when building tensorflow serving on another docker instance under bazel bin tensorflow serving core i only got objs no static libraries anymore how come what should i do to get these static libraries here is my dockerfile from tensorflow tensorflow label version workdir root env tmp tmp run ln s usr local bin gfortran usr bin gfortran run apt get update yq exit run apt get install yq software properties common apt utils apt get update yq exit run apt get install yq vim tree clang gdb make git run apt get install yq automake bison flex libboost all dev libevent dev run apt get install yq libssl dev ssh libtool pkg config run apt get install yq default jdk default jre dev libc ares dev run apt get install yq python numpy python future libleveldb dev libsnappy dev libgoogle perftools dev run apt get install yq librdkafka dev dev dev texinfo unzip zip run unset tf need cuda to install go uncomment below run add apt repository ppa longsleep golang backports y run apt get install yq golang go run add apt repository ppa ubuntu toolchain r test y run apt get install yq gcc g run update alternatives install usr bin gcc gcc usr bin gcc slave usr bin g g usr bin g slave usr bin gcc ar gcc ar usr bin gcc ar slave usr bin gcc nm gcc nm usr bin gcc nm slave usr bin gcc ranlib gcc ranlib usr bin gcc ranlib run export https proxy wget bash cmake linux sh prefix usr skip license rm rf cmake linux sh run export https proxy wget bash bazel installer linux sh rm rf bazel installer linux sh run mkdir ssh add ssh ssh add vimrc add bashrc run export path usr local cuda bin path export 
ld library path usr local cuda lib usr lib usr usr local lib usr local ld library path run git config global credential helper store bazel version tensorflow serving version downloaded from github release page build command export https proxy unset tf need cuda build tensorflow model server bazel build c opt copt copt copt mavx copt copt mfma copt copt march native cxxopt fexceptions verbose failures tensorflow serving model servers tensorflow model server please help thanks
1
20,863
27,645,580,293
IssuesEvent
2023-03-10 22:34:35
cse442-at-ub/project_s23-iweatherify
https://api.github.com/repos/cse442-at-ub/project_s23-iweatherify
closed
Create demo Vue application that connects to database and allows registering and login
Processing Task Sprint 2
*Task Tests* *Test 1* 1) Go to the repo and clone the app if not already done so: https://github.com/cse442-at-ub/project_s23-iweatherify 2) Open up the terminal and cd to the root of the cloned project 3) Run `npm i npm@6.14.6` 4) Run `npm install node@12.22.12` 5) Run `npm install` to install the node modules 6) Run `npm start` 7) Open up your browser and type in the given address of the locally running app 8) Type "/register" at the end of the URL and verify that the page redirects to another page with a registration form 9) In another tab, go to https://www-student.cse.buffalo.edu/tools/db/phpmyadmin/ 10) Type in your UBIT for the username 11) Type in your UB Person number for the password 12) Ensure that the server choice is Oceanus 13) Click "Go" 14) On the left-hand panel, click on "cse442_2023_spring_team_a_db" 15) Go back to the webpage of the locally running application 16) For the email type in: test999@gmail.com 17) For the username type in: test999 18) For the password type in test999 19) Click on register 20) Navigate to the tab with the myPHPAdmin and verify that a row entry matches the newly registered account information that you inputted from step 16 *Test 2* 1) Go to the repo and clone the app if not already done so: https://github.com/cse442-at-ub/project_s23-iweatherify 2) Open up the terminal and cd to the root of the cloned project 3) Run `npm i npm@6.14.6` 4) Run `npm install node@12.22.12` 5) Run `npm install` to install the node modules 6) Run `npm start` 7) Open up your browser and type in the given address of the locally running app 8) Type "/login" at the end of the URL and verify that the page redirects to another page with a login form 9) For the email type in: test@gmail.com 10) For the username type in: test 11) For the password type in: brokenpassword 12) Click on login 13) Verify that you are still on the login page *Test 3* 1) Go to the repo and clone the app if not already done so: 
https://github.com/cse442-at-ub/project_s23-iweatherify 2) Open up the terminal and cd to the root of the cloned project 3) Run `npm i npm@6.14.6` 4) Run `npm install node@12.22.12` 5) Run `npm install` to install the node modules 6) Run `npm start` 7) Open up your browser and type in the given address of the locally running app 8) Type "/login" at the end of the URL and verify that the page redirects to another page with a login form 9) For the email type in: test@gmail.com 10) For the username type in: test 11) For the password type in: test 12) Click on login 13) Verify that you are no longer on the login page and that you are on the feed page
1.0
Create demo Vue application that connects to database and allows registering and login - *Task Tests* *Test 1* 1) Go to the repo and clone the app if not already done so: https://github.com/cse442-at-ub/project_s23-iweatherify 2) Open up the terminal and cd to the root of the cloned project 3) Run `npm i npm@6.14.6` 4) Run `npm install node@12.22.12` 5) Run `npm install` to install the node modules 6) Run `npm start` 7) Open up your browser and type in the given address of the locally running app 8) Type "/register" at the end of the URL and verify that the page redirects to another page with a registration form 9) In another tab, go to https://www-student.cse.buffalo.edu/tools/db/phpmyadmin/ 10) Type in your UBIT for the username 11) Type in your UB Person number for the password 12) Ensure that the server choice is Oceanus 13) Click "Go" 14) On the left-hand panel, click on "cse442_2023_spring_team_a_db" 15) Go back to the webpage of the locally running application 16) For the email type in: test999@gmail.com 17) For the username type in: test999 18) For the password type in test999 19) Click on register 20) Navigate to the tab with the myPHPAdmin and verify that a row entry matches the newly registered account information that you inputted from step 16 *Test 2* 1) Go to the repo and clone the app if not already done so: https://github.com/cse442-at-ub/project_s23-iweatherify 2) Open up the terminal and cd to the root of the cloned project 3) Run `npm i npm@6.14.6` 4) Run `npm install node@12.22.12` 5) Run `npm install` to install the node modules 6) Run `npm start` 7) Open up your browser and type in the given address of the locally running app 8) Type "/login" at the end of the URL and verify that the page redirects to another page with a login form 9) For the email type in: test@gmail.com 10) For the username type in: test 11) For the password type in: brokenpassword 12) Click on login 13) Verify that you are still on the login page *Test 3* 1) Go to the repo 
and clone the app if not already done so: https://github.com/cse442-at-ub/project_s23-iweatherify 2) Open up the terminal and cd to the root of the cloned project 3) Run `npm i npm@6.14.6` 4) Run `npm install node@12.22.12` 5) Run `npm install` to install the node modules 6) Run `npm start` 7) Open up your browser and type in the given address of the locally running app 8) Type "/login" at the end of the URL and verify that the page redirects to another page with a login form 9) For the email type in: test@gmail.com 10) For the username type in: test 11) For the password type in: test 12) Click on login 13) Verify that you are no longer on the login page and that you are on the feed page
process
create demo vue application that connects to database and allows registering and login task tests test go to the repo and clone the app if not already done so open up the terminal and cd to the root of the cloned project run npm i npm run npm install node run npm install to install the node modules run npm start open up your browser and type in the given address of the locally running app type register at the end of the url and verify that the page redirects to another page with a registration form in another tab go to type in your ubit for the username type in your ub person number for the password ensure that the server choice is oceanus click go on the left hand panel click on spring team a db go back to the webpage of the locally running application for the email type in gmail com for the username type in for the password type in click on register navigate to the tab with the myphpadmin and verify that a row entry matches the newly registered account information that you inputted from step test go to the repo and clone the app if not already done so open up the terminal and cd to the root of the cloned project run npm i npm run npm install node run npm install to install the node modules run npm start open up your browser and type in the given address of the locally running app type login at the end of the url and verify that the page redirects to another page with a login form for the email type in test gmail com for the username type in test for the password type in brokenpassword click on login verify that you are still on the login page test go to the repo and clone the app if not already done so open up the terminal and cd to the root of the cloned project run npm i npm run npm install node run npm install to install the node modules run npm start open up your browser and type in the given address of the locally running app type login at the end of the url and verify that the page redirects to another page with a login form for the email type in test gmail 
com for the username type in test for the password type in test click on login verify that you are no longer on the login page and that you are on the feed page
1
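Steps 1 through 7, repeated verbatim in each of the three tests above, collapse into one setup fragment (repo URL and Node/npm versions exactly as pinned in the ticket):

```shell
# Clone and start the app locally (steps 1-7 of every test above)
git clone https://github.com/cse442-at-ub/project_s23-iweatherify
cd project_s23-iweatherify
npm i npm@6.14.6
npm install node@12.22.12
npm install   # install the node modules
npm start     # serve the app at the address printed in the terminal
```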
71,322
8,644,464,348
IssuesEvent
2018-11-26 02:59:29
ValerioLyndon/MAL-Public-List-Designs
https://api.github.com/repos/ValerioLyndon/MAL-Public-List-Designs
opened
C| Change method used for displaying header background.
bug problem with design
As it is, it can be laggy in Chrome and is very laggy in Firefox, where the background moves slower than the actual content. Shouldn't be too difficult to change how this works, although it may mean using a different manner of header shadow.
1.0
C| Change method used for displaying header background. - As it is, it can be laggy in Chrome and is very laggy in Firefox, where the background moves slower than the actual content. Shouldn't be too difficult to change how this works, although it may mean using a different manner of header shadow.
non_process
c change method used for displaying header background as it is it can be laggy in chrome and is very laggy in firefox where the background moves slower than the actual content shouldn t be too difficult to change how this works although it may mean using a different manner of header shadow
0
20,090
26,605,410,272
IssuesEvent
2023-01-23 18:55:20
altillimity/SatDump
https://api.github.com/repos/altillimity/SatDump
closed
Decode rest of NOAA TIP Contents
enhancement Processing
AFAIK there's more in the DSB signal than just HIRS data... (see @nebarnix/project-desert-tortoise) would be cool to have Satdump unpack this data as well!
1.0
Decode rest of NOAA TIP Contents - AFAIK there's more in the DSB signal than just HIRS data... (see @nebarnix/project-desert-tortoise) would be cool to have Satdump unpack this data as well!
process
decode rest of noaa tip contents afaik there s more in the dsb signal than just hirs data see nebarnix project desert tortoise would be cool to have satdump unpack this data as well
1
1,653
4,282,582,902
IssuesEvent
2016-07-15 09:44:40
geneontology/go-ontology
https://api.github.com/repos/geneontology/go-ontology
opened
NTR: maintenance of differentiated cell state
BHF-UCL miRNA New term request RNA processes
Hello, I recently annotated paper PMID:24804980, in which Figure 5 showed that the silencing of the BDNF gene by miR-1 and miR-206 was required for maintenance of the differentiated state of myotubes (however, this gene silencing was not required for induction of the differentiation). There is a myotube differentiation term (GO:0014902) in gene ontology, however, the definition does not really fit what the paper/Figure 5 describes. And there seems to be no GO terms in the ontology for cell type differentiation maintenance at all. After I shared this enquiry among the UCL curators, @rebeccafoulger has suggested referring to this review http://www.ncbi.nlm.nih.gov/pubmed/22596319 for other examples of similar processes. The new term could be: ‘maintenance of differentiated cell state' and the definition would need to specify that this maintenance of differentiation/cell identity occurs once the cell has reached 'maturity'. @ukemi would you mind looking at this issue? Rebecca mentioned that you were originally involved in work on the differentiation/development hierarchy. Thanks, Barbara DbxREFs: GOC:BHF, GOC:BHF_miRNA, GOC:bc cc: @RLovering cc: @rachhuntley
1.0
NTR: maintenance of differentiated cell state - Hello, I recently annotated paper PMID:24804980, in which Figure 5 showed that the silencing of the BDNF gene by miR-1 and miR-206 was required for maintenance of the differentiated state of myotubes (however, this gene silencing was not required for induction of the differentiation). There is a myotube differentiation term (GO:0014902) in gene ontology, however, the definition does not really fit what the paper/Figure 5 describes. And there seems to be no GO terms in the ontology for cell type differentiation maintenance at all. After I shared this enquiry among the UCL curators, @rebeccafoulger has suggested referring to this review http://www.ncbi.nlm.nih.gov/pubmed/22596319 for other examples of similar processes. The new term could be: ‘maintenance of differentiated cell state' and the definition would need to specify that this maintenance of differentiation/cell identity occurs once the cell has reached 'maturity'. @ukemi would you mind looking at this issue? Rebecca mentioned that you were originally involved in work on the differentiation/development hierarchy. Thanks, Barbara DbxREFs: GOC:BHF, GOC:BHF_miRNA, GOC:bc cc: @RLovering cc: @rachhuntley
process
ntr maintenance of differentiated cell state hello i recently annotated paper pmid in which figure showed that the silencing of the bdnf gene by mir and mir was required for maintenance of the differentiated state of myotubes however this gene silencing was not required for induction of the differentiation there is a myotube differentiation term go in gene ontology however the definition does not really fit what the paper figure describes and there seems to be no go terms in the ontology for cell type differentiation maintenance at all after i shared this enquiry among the ucl curators rebeccafoulger has suggested referring to this review for other examples of similar processes the new term could be ‘maintenance of differentiated cell state and the definition would need to specify that this maintenance of differentiation cell identity occurs once the cell has reached maturity ukemi would you mind looking at this issue rebecca mentioned that you were originally involved in work on the differentiation development hierarchy thanks barbara dbxrefs goc bhf goc bhf mirna goc bc cc rlovering cc rachhuntley
1
65,306
6,955,168,713
IssuesEvent
2017-12-07 06:09:50
bskinn/sphobjinv
https://api.github.com/repos/bskinn/sphobjinv
opened
Refactor test/sphobjinv_base.py into api and cli submodules
needs tests refactor
At 1000 lines, it's probably time. Should probably keep `/sphobjinv_base.py` (possibly renamed) for defining common functions and the `SuperSphobjinv` class.
1.0
Refactor test/sphobjinv_base.py into api and cli submodules - At 1000 lines, it's probably time. Should probably keep `/sphobjinv_base.py` (possibly renamed) for defining common functions and the `SuperSphobjinv` class.
non_process
refactor test sphobjinv base py into api and cli submodules at lines it s probably time should probably keep sphobjinv base py possibly renamed for defining common functions and the supersphobjinv class
0
15,931
20,151,896,278
IssuesEvent
2022-02-09 13:13:18
syncfusion/ej2-angular-ui-components
https://api.github.com/repos/syncfusion/ej2-angular-ui-components
closed
Auto capital After full stop in Rich text and Word Editor
enhancement rich-text-editor word-processor
Dear Team, Kindly advise how to enable Auto capitalization after full stop in a sentence for Rich text editor and Word editor
1.0
Auto capital After full stop in Rich text and Word Editor - Dear Team, Kindly advise how to enable Auto capitalization after full stop in a sentence for Rich text editor and Word editor
process
auto capital after full stop in rich text and word editor dear team kindly advise how to enable auto capitalization after full stop in a sentence for rich text editor and word editor
1
175,664
13,578,472,448
IssuesEvent
2020-09-20 07:56:51
TEAMMATES/teammates
https://api.github.com/repos/TEAMMATES/teammates
closed
GUI tests: Check for js/css errors
a-Testing p.Low
It is nice if our UI tests can check for js/css errors automatically. i.e. We can manually check for css/js errors using firefox/chrome developer console. Can we do that using selenium?
1.0
GUI tests: Check for js/css errors - It is nice if our UI tests can check for js/css errors automatically. i.e. We can manually check for css/js errors using firefox/chrome developer console. Can we do that using selenium?
non_process
gui tests check for js css errors it is nice if our ui tests can check for js css errors automatically i e we can manually check for css js errors using firefox chrome developer console can we do that using selenium
0
66,005
6,986,199,654
IssuesEvent
2017-12-14 01:56:30
AffiliateWP/AffiliateWP
https://api.github.com/repos/AffiliateWP/AffiliateWP
closed
WooCommerce: when a referral gets updated, no note is generated
bug Has PR needs testing
When a referral gets updated, no note gets generated, showing apparently wrong information in the order.
1.0
WooCommerce: when a referral gets updated, no note is generated - When a referral gets updated, no note gets generated, showing apparently wrong information in the order.
non_process
woocommerce when a referral gets updated no note is generated when a referral gets updated no note gets generated showing apparently wrong information in the order
0
21,132
28,103,598,635
IssuesEvent
2023-03-30 21:41:04
bazelbuild/bazel
https://api.github.com/repos/bazelbuild/bazel
closed
[Mirror] bazelbuild/bazel-gazelle/releases/tag/v0.30.0
P2 type: process team-OSS mirror request
### Please list the URLs of the archives you'd like to mirror: https://github.com/bazelbuild/bazel-gazelle/releases/tag/v0.30.0
1.0
[Mirror] bazelbuild/bazel-gazelle/releases/tag/v0.30.0 - ### Please list the URLs of the archives you'd like to mirror: https://github.com/bazelbuild/bazel-gazelle/releases/tag/v0.30.0
process
bazelbuild bazel gazelle releases tag please list the urls of the archives you d like to mirror
1
6,141
9,012,896,476
IssuesEvent
2019-02-05 18:02:23
olange/code-retreat
https://api.github.com/repos/olange/code-retreat
opened
Contamines-Montjoie 02–03.2019 · Organization
process
### Actions - [x] Reserve the chalet for weeks 1, 2, 4 and 5 `DONE` 03.02 OL - [ ] Determine where to go on week 3 `TODO`
1.0
Contamines-Montjoie 02–03.2019 · Organization - ### Actions - [x] Reserve the chalet for weeks 1, 2, 4 and 5 `DONE` 03.02 OL - [ ] Determine where to go on week 3 `TODO`
process
contamines montjoie – · organization actions reserve the chalet for weeks and done ol determine where to go on week todo
1
2,308
5,120,468,011
IssuesEvent
2017-01-09 03:41:46
Jarvvski/CavTools
https://api.github.com/repos/Jarvvski/CavTools
closed
S2 Quarterly Checks Automation
Process Flow Request
<h2>Problem</h2><br />Investigators wasting time on GUID checks Automate the system using code from RRD Enlistment tracker Boom; 50 hours + annually saved<br /><hr><h2>Reason</h2><br />Automation Duh<br><br>-Second Lieutenant Collins.G
1.0
S2 Quarterly Checks Automation - <h2>Problem</h2><br />Investigators wasting time on GUID checks Automate the system using code from RRD Enlistment tracker Boom; 50 hours + annually saved<br /><hr><h2>Reason</h2><br />Automation Duh<br><br>-Second Lieutenant Collins.G
process
quarterly checks automation problem investigators wasting time on guid checks automate the system using code from rrd enlistment tracker boom hours annually saved reason automation duh second lieutenant collins g
1
14,952
18,434,438,524
IssuesEvent
2021-10-14 11:26:21
opensafely-core/job-server
https://api.github.com/repos/opensafely-core/job-server
opened
Import approved project applications
application-process
As an admin user, I want to import already approved (email) applications, so that they can be linked to the relevant project. - [Approved Projects page](https://www.opensafely.org/approved-projects/)
1.0
Import approved project applications - As an admin user, I want to import already approved (email) applications, so that they can be linked to the relevant project. - [Approved Projects page](https://www.opensafely.org/approved-projects/)
process
import approved project applications as an admin user i want to import already approved email applications so that they can be linked to the relevant project
1
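Each record above pairs a raw `text_combine` string with a lowercased, punctuation-stripped `text` string, but the normalization pipeline itself is not shown. A minimal sketch of that transformation follows; the function name and exact rules are assumptions reverse-engineered from the pairs above (it reproduces the WooCommerce and S2 Quarterly Checks records exactly, but does not reconstruct every rule — e.g. markdown links and bracketed tags appear to be removed by additional steps, and some non-ASCII punctuation is preserved in the data while this sketch only handles ASCII):

```python
import re

def normalize(text_combine: str) -> str:
    """Approximate the text_combine -> text normalization seen in the records above."""
    t = re.sub(r"<[^>]+>", " ", text_combine)   # strip HTML tag residue, e.g. <h2>, <br />
    t = re.sub(r"https?://\S+", " ", t)         # strip URLs
    t = t.lower()
    tokens = re.split(r"[^a-z0-9]+", t)         # split on ASCII punctuation/whitespace
    # drop empty tokens and tokens containing digits ("S2", "50", "1000" all vanish above)
    tokens = [tok for tok in tokens if tok and not any(c.isdigit() for c in tok)]
    return " ".join(tokens)
```

Note that single-letter tokens are kept ("it's" becomes "it s", "Collins.G" becomes "collins g"), which matches the normalized columns above.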