Column schema (name, dtype, value/length range):

    Unnamed: 0     int64          0 … 832k
    id             float64        2.49B … 32.1B
    type           stringclasses  1 value
    created_at     stringlengths  19 … 19
    repo           stringlengths  7 … 112
    repo_url       stringlengths  36 … 141
    action         stringclasses  3 values
    title          stringlengths  1 … 744
    labels         stringlengths  4 … 574
    body           stringlengths  9 … 211k
    index          stringclasses  10 values
    text_combine   stringlengths  96 … 211k
    label          stringclasses  2 values
    text           stringlengths  96 … 188k
    binary_label   int64          0 … 1
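The schema above can be sanity-checked programmatically. Below is a minimal sketch with pandas: the column names and dtypes come from the schema, but the two sample rows are condensed from the records further down (not a real loader for this dataset).

```python
import pandas as pd

# Toy frame mirroring a subset of the columns in the schema above;
# values are taken from the sample records shown below.
rows = [
    {"id": 29922573987.0, "type": "IssuesEvent",
     "created_at": "2023-06-22 00:38:10",
     "repo": "devssa/onde-codar-em-salvador",
     "action": "closed", "label": "process", "binary_label": 1},
    {"id": 23112208487.0, "type": "IssuesEvent",
     "created_at": "2022-07-27 13:54:14",
     "repo": "amlight/kytos-end-to-end-tests",
     "action": "closed", "label": "non_process", "binary_label": 0},
]
df = pd.DataFrame(rows)

# Reproduce the kind of summary the schema reports: class counts for
# stringclasses columns, length ranges for stringlengths columns,
# min/max for numeric columns.
print(df["label"].nunique())                            # 2 values
print(df["created_at"].str.len().agg(["min", "max"]))   # 19 … 19
print(df["binary_label"].agg(["min", "max"]))           # 0 … 1
```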
Unnamed: 0: 21,564
id: 29,922,573,987
type: IssuesEvent
created_at: 2023-06-22 00:38:10
repo: devssa/onde-codar-em-salvador
repo_url: https://api.github.com/repos/devssa/onde-codar-em-salvador
action: closed
title: [Remoto] Cloud Infrastructure Specialist na Coodesh
labels: SALVADOR UX PJ INFRAESTRUTURA AGILE MOBILE REQUISITOS REMOTO PROCESSOS GITHUB CI SEGURANÇA UMA ESPECIALISTA ENGENHARIA DE SOFTWARE UI Stale
body:
## Descrição da vaga: Esta é uma vaga de um parceiro da plataforma Coodesh, ao candidatar-se você terá acesso as informações completas sobre a empresa e benefícios. Fique atento ao redirecionamento que vai te levar para uma url [https://coodesh.com](https://coodesh.com/jobs/especialista-de-infraestrutura-cloud-143023743?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) com o pop-up personalizado de candidatura. 👋 <p>A Loginfo está em busca de Cloud Infrastructure Specialist para compor seu time!</p> <p>Como especialistas, escolhemos entregar ao mercado uma solução totalmente mobile, capaz de cobrir de ponta a ponta a intralogística e o dia a dia de nossos clientes. Somos movidos a tecnologia e temos fome de resultados. Nosso principal objetivo é transformar o mercado entregando logística aprimorada, apoiando a logística e o comércio exterior com a otimização de processos, redução de custos e ganho de produtividade no recebimento, armazenagem e expedição.</p> <p>Responsabilidades:</p> <ul> <li>Projetar, implementar e gerenciar soluções de infraestrutura em nuvem para garantir a disponibilidade, desempenho e segurança de sistemas e aplicativos baseados em cloud.</li> </ul> ## Loginfo Tecnologia da Informação LTDA: <p>Desde 2014, ano de nossa fundação, escolhemos conectar processos operacionais e comunicação. Decidimos tornar cada vez mais digital, ágil e intuitivo o mercado dos setores logísticos, portuários e de armazéns gerais. Somos inovadores em tudo que nos propomos a fazer. Como especialistas, escolhemos entregar ao mercado uma solução totalmente mobile, capaz de cobrir de ponta a ponta a intralogística e o dia a dia de nossos clientes. Somos movidos a tecnologia e temos fome de resultados. 
Nosso principal objetivo é transformar o mercado entregando logística aprimorada, apoiando a logística e o comércio exterior com a otimização de processos, redução de custos e ganho de produtividade no recebimento, armazenagem e expedição.</p></p> ## Habilidades: - Agile - .NET Framework - UX/UI ## Local: 100% Remoto ## Requisitos: - Experiência como Cloud Infrastructure Specialist; - Graduação completa em Ciências da Computação, Engenharia de Software, Sistemas da Informação e/ou afins. ## Benefícios: - Gympass; - Alura; - Seguro de vida; - Horários flexíveis; ## Como se candidatar: Candidatar-se exclusivamente através da plataforma Coodesh no link a seguir: [Cloud Infrastructure Specialist na Loginfo Tecnologia da Informação LTDA](https://coodesh.com/jobs/especialista-de-infraestrutura-cloud-143023743?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) Após candidatar-se via plataforma Coodesh e validar o seu login, você poderá acompanhar e receber todas as interações do processo por lá. Utilize a opção **Pedir Feedback** entre uma etapa e outra na vaga que se candidatou. Isso fará com que a pessoa **Recruiter** responsável pelo processo na empresa receba a notificação. ## Labels #### Alocação Remoto #### Regime PJ #### Categoria Gestão em TI
index: 1.0
text_combine:
[Remoto] Cloud Infrastructure Specialist na Coodesh - ## Descrição da vaga: Esta é uma vaga de um parceiro da plataforma Coodesh, ao candidatar-se você terá acesso as informações completas sobre a empresa e benefícios. Fique atento ao redirecionamento que vai te levar para uma url [https://coodesh.com](https://coodesh.com/jobs/especialista-de-infraestrutura-cloud-143023743?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) com o pop-up personalizado de candidatura. 👋 <p>A Loginfo está em busca de Cloud Infrastructure Specialist para compor seu time!</p> <p>Como especialistas, escolhemos entregar ao mercado uma solução totalmente mobile, capaz de cobrir de ponta a ponta a intralogística e o dia a dia de nossos clientes. Somos movidos a tecnologia e temos fome de resultados. Nosso principal objetivo é transformar o mercado entregando logística aprimorada, apoiando a logística e o comércio exterior com a otimização de processos, redução de custos e ganho de produtividade no recebimento, armazenagem e expedição.</p> <p>Responsabilidades:</p> <ul> <li>Projetar, implementar e gerenciar soluções de infraestrutura em nuvem para garantir a disponibilidade, desempenho e segurança de sistemas e aplicativos baseados em cloud.</li> </ul> ## Loginfo Tecnologia da Informação LTDA: <p>Desde 2014, ano de nossa fundação, escolhemos conectar processos operacionais e comunicação. Decidimos tornar cada vez mais digital, ágil e intuitivo o mercado dos setores logísticos, portuários e de armazéns gerais. Somos inovadores em tudo que nos propomos a fazer. Como especialistas, escolhemos entregar ao mercado uma solução totalmente mobile, capaz de cobrir de ponta a ponta a intralogística e o dia a dia de nossos clientes. Somos movidos a tecnologia e temos fome de resultados. 
Nosso principal objetivo é transformar o mercado entregando logística aprimorada, apoiando a logística e o comércio exterior com a otimização de processos, redução de custos e ganho de produtividade no recebimento, armazenagem e expedição.</p></p> ## Habilidades: - Agile - .NET Framework - UX/UI ## Local: 100% Remoto ## Requisitos: - Experiência como Cloud Infrastructure Specialist; - Graduação completa em Ciências da Computação, Engenharia de Software, Sistemas da Informação e/ou afins. ## Benefícios: - Gympass; - Alura; - Seguro de vida; - Horários flexíveis; ## Como se candidatar: Candidatar-se exclusivamente através da plataforma Coodesh no link a seguir: [Cloud Infrastructure Specialist na Loginfo Tecnologia da Informação LTDA](https://coodesh.com/jobs/especialista-de-infraestrutura-cloud-143023743?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) Após candidatar-se via plataforma Coodesh e validar o seu login, você poderá acompanhar e receber todas as interações do processo por lá. Utilize a opção **Pedir Feedback** entre uma etapa e outra na vaga que se candidatou. Isso fará com que a pessoa **Recruiter** responsável pelo processo na empresa receba a notificação. ## Labels #### Alocação Remoto #### Regime PJ #### Categoria Gestão em TI
label: process
text:
cloud infrastructure specialist na coodesh descrição da vaga esta é uma vaga de um parceiro da plataforma coodesh ao candidatar se você terá acesso as informações completas sobre a empresa e benefícios fique atento ao redirecionamento que vai te levar para uma url com o pop up personalizado de candidatura 👋 a loginfo está em busca de cloud infrastructure specialist para compor seu time como especialistas escolhemos entregar ao mercado uma solução totalmente mobile capaz de cobrir de ponta a ponta a intralogística e o dia a dia de nossos clientes somos movidos a tecnologia e temos fome de resultados nosso principal objetivo é transformar o mercado entregando logística aprimorada apoiando a logística e o comércio exterior com a otimização de processos redução de custos e ganho de produtividade no recebimento armazenagem e expedição responsabilidades projetar implementar e gerenciar soluções de infraestrutura em nuvem para garantir a disponibilidade desempenho e segurança de sistemas e aplicativos baseados em cloud loginfo tecnologia da informação ltda desde ano de nossa fundação escolhemos conectar processos operacionais e comunicação decidimos tornar cada vez mais digital ágil e intuitivo o mercado dos setores logísticos portuários e de armazéns gerais somos inovadores em tudo que nos propomos a fazer como especialistas escolhemos entregar ao mercado uma solução totalmente mobile capaz de cobrir de ponta a ponta a intralogística e o dia a dia de nossos clientes somos movidos a tecnologia e temos fome de resultados nosso principal objetivo é transformar o mercado entregando logística aprimorada apoiando a logística e o comércio exterior com a otimização de processos redução de custos e ganho de produtividade no recebimento armazenagem e expedição habilidades agile net framework ux ui local remoto requisitos experiência como cloud infrastructure specialist graduação completa em ciências da computação engenharia de software sistemas da informação e ou afins benefícios 
gympass alura seguro de vida horários flexíveis como se candidatar candidatar se exclusivamente através da plataforma coodesh no link a seguir após candidatar se via plataforma coodesh e validar o seu login você poderá acompanhar e receber todas as interações do processo por lá utilize a opção pedir feedback entre uma etapa e outra na vaga que se candidatou isso fará com que a pessoa recruiter responsável pelo processo na empresa receba a notificação labels alocação remoto regime pj categoria gestão em ti
binary_label: 1
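The `text` field in the record above looks like a normalized version of `text_combine`: lowercased, with URLs, HTML tags, punctuation, and standalone numbers removed. The exact pipeline is not given in this dump, so the following is only a plausible sketch of that cleaning step.

```python
import re

def clean_text(text_combine: str) -> str:
    """Hypothetical normalization approximating the `text` column:
    lowercase, drop URLs, HTML tags, punctuation, and bare numbers,
    then collapse whitespace. The real pipeline may differ."""
    t = text_combine.lower()
    t = re.sub(r"https?://\S+", " ", t)                # drop URLs
    t = re.sub(r"<[^>]+>", " ", t)                     # drop HTML tags
    t = re.sub(r"[^\w\s]", " ", t, flags=re.UNICODE)   # drop punctuation
    t = re.sub(r"\b\d+\b", " ", t)                     # drop bare numbers
    return re.sub(r"\s+", " ", t).strip()

print(clean_text("Add an e2e test - see "
                 "https://github.com/kytos-ng/of_lldp/issues/42 <p>Hi</p>"))
```

Note this does not reproduce every detail of the samples (for example, the second record's `text` also dropped the token "e2e", suggesting an extra rule for digit-containing tokens).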
Unnamed: 0: 264,334
id: 23,112,208,487
type: IssuesEvent
created_at: 2022-07-27 13:54:14
repo: amlight/kytos-end-to-end-tests
repo_url: https://api.github.com/repos/amlight/kytos-end-to-end-tests
action: closed
title: Add an e2e test for enabling and disabling LLDP liveness detection
labels: good first issue epic_e2e_tests_coverage 2022.2
body: This is for adding an e2e test for enabling and disabling LLDP liveness detection, it depends on this PR landing: https://github.com/kytos-ng/of_lldp/issues/42
index: 1.0
text_combine:
Add an e2e test for enabling and disabling LLDP liveness detection - This is for adding an e2e test for enabling and disabling LLDP liveness detection, it depends on this PR landing: https://github.com/kytos-ng/of_lldp/issues/42
label: non_process
text:
add an test for enabling and disabling lldp liveness detection this is for adding an test for enabling and disabling lldp liveness detection it depends on this pr landing
binary_label: 0
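The records above pair the two-class `label` column with the integer `binary_label` column, and the rows shown are consistent with a fixed mapping: `process` → 1, `non_process` → 0. A minimal sketch of that encoding, assuming the mapping holds across the dataset:

```python
# Mapping suggested by the sample rows: label (stringclasses, 2 values)
# -> binary_label (int64 in {0, 1}).
LABEL_TO_BINARY = {"process": 1, "non_process": 0}

def to_binary(label: str) -> int:
    """Encode the string label as the binary_label integer."""
    return LABEL_TO_BINARY[label]

print(to_binary("process"))      # 1
print(to_binary("non_process"))  # 0
```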
Unnamed: 0: 16,134
id: 20,385,178,358
type: IssuesEvent
created_at: 2022-02-22 05:42:25
repo: linuxdeepin/developer-center
repo_url: https://api.github.com/repos/linuxdeepin/developer-center
action: closed
title: [Feature request] Change transparency of every Deepin applications
labels: suggest | functional behavior suggest | interface style other | delay processing
body: We can now change the transparency of Deepin-dock but not the transparency of other Deepin programs. It would be really great if we could.
index: 1.0
text_combine:
[Feature request] Change transparency of every Deepin applications - We can now change the transparency of Deepin-dock but not the transparency of other Deepin programs. It would be really great if we could.
label: process
text:
change transparency of every deepin applications we can now change the transparency of deepin dock but not the transparency of other deepin programs it would be really great if we could
binary_label: 1
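The schema lists `created_at` as a string with a fixed length of 19 (min = max = 19), which matches the `YYYY-MM-DD HH:MM:SS` timestamps in every record. Parsing it is then a fixed-format `strptime`:

```python
from datetime import datetime

# created_at is a fixed-width 19-character string per the schema,
# consistent with the "YYYY-MM-DD HH:MM:SS" values in the sample rows.
def parse_created_at(s: str) -> datetime:
    return datetime.strptime(s, "%Y-%m-%d %H:%M:%S")

print(parse_created_at("2022-02-22 05:42:25").year)  # 2022
```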
Unnamed: 0: 11,921
id: 14,703,064,185
type: IssuesEvent
created_at: 2021-01-04 14:31:58
repo: prisma/prisma
repo_url: https://api.github.com/repos/prisma/prisma
action: closed
title: Issue with migrate on SQL Server
labels: bug/2-confirmed kind/bug process/candidate team/migrations topic: migrate topic: sql server
body:
## Bug description Running migrate after making changes to my schema triggered a weird issue which seems related to drift detection/shadow database. ## How to reproduce Steps to reproduce the behavior: 1. Introspected my database which you could create as follows: ```sql -- master.dbo.[User] definition -- Drop table -- DROP TABLE master.dbo.[User] GO CREATE TABLE [User] ( email varchar(100) COLLATE SQL_Latin1_General_CP1_CI_AS NULL, name varchar(100) COLLATE SQL_Latin1_General_CP1_CI_AS NULL, id int IDENTITY(0,1) NOT NULL, CONSTRAINT User_PK2 PRIMARY KEY (id) ) GO; -- master.dbo.Post definition -- Drop table -- DROP TABLE master.dbo.Post GO CREATE TABLE Post ( authorId int NULL, content varchar(100) COLLATE SQL_Latin1_General_CP1_CI_AS NULL, published int NULL, title varchar(100) COLLATE SQL_Latin1_General_CP1_CI_AS NULL, id int IDENTITY(0,1) NOT NULL, CONSTRAINT Post_pk PRIMARY KEY (id), CONSTRAINT Post_FK_2 FOREIGN KEY (authorId) REFERENCES [User](id) ) GO CREATE UNIQUE NONCLUSTERED INDEX Post_id_uindex ON dbo.Post ( id ASC ) WITH ( PAD_INDEX = OFF ,FILLFACTOR = 100 ,SORT_IN_TEMPDB = OFF , IGNORE_DUP_KEY = OFF , STATISTICS_NORECOMPUTE = OFF , ONLINE = OFF , ALLOW_ROW_LOCKS = ON , ALLOW_PAGE_LOCKS = ON ) ON [PRIMARY ] GO; ``` The schema I got from that after introspecting: ```prisma generator client { provider = "prisma-client-js" previewFeatures = ["microsoftSqlServer"] } datasource db { provider = "sqlserver" url = env("DATABASE_URL") } model User { email String? name String? id Int @id @default(autoincrement()) posts Post[] } model Post { authorId Int? content String? published Int? title String? id Int @id @default(autoincrement()) author User? @relation(fields: [authorId], references: [id]) } ``` 2. Make changes to the schema as follows: ```prisma generator client { provider = "prisma-client-js" previewFeatures = ["microsoftSqlServer"] } datasource db { provider = "sqlserver" url = env("DATABASE_URL") } model User { email String? @unique name String? 
id Int @id @default(autoincrement()) posts Post[] comments Comment[] } model Post { authorId Int? content String? published Boolean title String? id Int @id @default(autoincrement()) author User? @relation(fields: [authorId], references: [id]) comments Comment[] } model Comment { id Int @id @default(autoincrement()) title String body String post Post @relation(fields: [postId], references: [id]) author User? @relation(fields: [authorId], references: [id]) authorId Int? postId Int } ``` 3. Run `yarn prisma migrate dev --preview-feature` 4. See error ``` MigrateEngine:rpc starting migration engine with binary: /Users/hervelabas/Dev/studio-qa/prisma-mssql-openssl/migration-engine +0ms MigrateEngine:rpc SENDING RPC CALL {"id":1,"jsonrpc":"2.0","method":"diagnoseMigrationHistory","params":{"migrationsDirectoryPath":"/path/to/my/project/prisma/migrations","optInToShadowDatabase":true}} +3ms MigrateEngine:stderr Dec 10 18:50:02.790 INFO migration_engine: Starting migration engine RPC server git_hash="626fcb87e24db886bb885c976d49debe7af90acc" +0ms MigrateEngine:stderr Dec 10 18:50:02.795 INFO tiberius::client::connection: Performing a TLS handshake +4ms MigrateEngine:stderr Dec 10 18:50:02.795 WARN tiberius::client::connection: Trusting the server certificate without validation. +1ms MigrateEngine:stderr Dec 10 18:50:02.808 INFO tiberius::client::connection: TLS handshake successful +12ms MigrateEngine:stderr Dec 10 18:50:02.813 INFO tiberius::tds::stream::token: Database change from 'master' to 'master' +5ms MigrateEngine:stderr Dec 10 18:50:02.813 INFO tiberius::tds::stream::token: Changed database context to 'master'. 
+0ms MigrateEngine:stderr Dec 10 18:50:02.813 INFO tiberius::tds::stream::token: SQL collation change from None to windows-1252/windows-1252 +0ms MigrateEngine:stderr Dec 10 18:50:02.813 INFO tiberius::tds::stream::token: Microsoft SQL Server version 3742302223 +0ms MigrateEngine:stderr Dec 10 18:50:02.813 INFO tiberius::tds::stream::token: Packet size change from '4096' to '4096' +0ms MigrateEngine:stderr Dec 10 18:50:02.813 INFO quaint::single: Starting a mssql connection. +0ms MigrateEngine:stderr Dec 10 18:50:02.820 ERROR DiagnoseMigrationHistory:list_migrations: tiberius::tds::stream::token: Invalid object name 'dbo._prisma_migrations'. code=208 +7ms MigrateEngine:stderr Dec 10 18:50:03.289 INFO DiagnoseMigrationHistory:calculate_drift: tiberius::client::connection: Performing a TLS handshake +469ms MigrateEngine:stderr Dec 10 18:50:03.289 WARN DiagnoseMigrationHistory:calculate_drift: tiberius::client::connection: Trusting the server certificate without validation. +1ms MigrateEngine:stderr Dec 10 18:50:03.301 INFO DiagnoseMigrationHistory:calculate_drift: tiberius::client::connection: TLS handshake successful +11ms MigrateEngine:stderr Dec 10 18:50:03.306 INFO DiagnoseMigrationHistory:calculate_drift: tiberius::tds::stream::token: Database change from 'prisma_shadow_dbb5b76c9a-7091-4eb5-8422-9975ac30b62b' to 'master' +5ms MigrateEngine:stderr Dec 10 18:50:03.306 INFO DiagnoseMigrationHistory:calculate_drift: tiberius::tds::stream::token: Changed database context to 'prisma_shadow_dbb5b76c9a-7091-4eb5-8422-9975ac30b62b'. 
+0ms MigrateEngine:stderr Dec 10 18:50:03.306 INFO DiagnoseMigrationHistory:calculate_drift: tiberius::tds::stream::token: SQL collation change from None to windows-1252/windows-1252 +0ms MigrateEngine:stderr Dec 10 18:50:03.306 INFO DiagnoseMigrationHistory:calculate_drift: tiberius::tds::stream::token: Microsoft SQL Server version 3742302223 +0ms MigrateEngine:stderr Dec 10 18:50:03.306 INFO DiagnoseMigrationHistory:calculate_drift: tiberius::tds::stream::token: Packet size change from '4096' to '4096' +1s MigrateEngine:stderr Dec 10 18:50:03.306 INFO DiagnoseMigrationHistory:calculate_drift: quaint::single: Starting a mssql connection. +1ms MigrateEngine:stderr Dec 10 18:50:04.101 INFO DiagnoseMigrationHistory:validate_migrations: tiberius::client::connection: Performing a TLS handshake +0ms MigrateEngine:stderr Dec 10 18:50:04.101 WARN DiagnoseMigrationHistory:validate_migrations: tiberius::client::connection: Trusting the server certificate without validation. +0ms MigrateEngine:stderr Dec 10 18:50:04.112 INFO DiagnoseMigrationHistory:validate_migrations: tiberius::client::connection: TLS handshake successful +0ms MigrateEngine:stderr Dec 10 18:50:04.116 INFO DiagnoseMigrationHistory:validate_migrations: tiberius::tds::stream::token: Database change from 'prisma_shadow_db5090846c-bb0d-40a0-80bc-560435248178' to 'master' +3ms MigrateEngine:stderr Dec 10 18:50:04.116 INFO DiagnoseMigrationHistory:validate_migrations: tiberius::tds::stream::token: Changed database context to 'prisma_shadow_db5090846c-bb0d-40a0-80bc-560435248178'. 
+0ms MigrateEngine:stderr Dec 10 18:50:04.116 INFO DiagnoseMigrationHistory:validate_migrations: tiberius::tds::stream::token: SQL collation change from None to windows-1252/windows-1252 +0ms MigrateEngine:stderr Dec 10 18:50:04.116 INFO DiagnoseMigrationHistory:validate_migrations: tiberius::tds::stream::token: Microsoft SQL Server version 3742302223 +0ms MigrateEngine:stderr Dec 10 18:50:04.116 INFO DiagnoseMigrationHistory:validate_migrations: tiberius::tds::stream::token: Packet size change from '4096' to '4096' +0ms MigrateEngine:stderr Dec 10 18:50:04.116 INFO DiagnoseMigrationHistory:validate_migrations: quaint::single: Starting a mssql connection. +4ms migrate:dev { migrate:dev diagnoseResult: { migrate:dev drift: { migrate:dev diagnostic: 'driftDetected', migrate:dev rollback: '/*\n' + migrate:dev ' Warnings:\n' + migrate:dev '\n' + migrate:dev ' - You are about to drop the column `name` on the `User` table. All the data in the column will be lost.\n' + migrate:dev ' - You are about to drop the `Post` table. If the table is not empty, all the data it contains will be lost.\n' + migrate:dev ' - The migration will add a unique constraint covering the columns `[email]` on the table `User`. If there are existing duplicate values, the migration will fail.\n' + migrate:dev ' - Made the column `email` on table `User` required. 
The migration will fail if there are existing NULL values in that column.\n' + migrate:dev '\n' + migrate:dev '*/\n' + migrate:dev '-- DropForeignKey\n' + migrate:dev 'ALTER TABLE [dbo].[Post] DROP CONSTRAINT [Post_FK_2];\n' + migrate:dev '\n' + migrate:dev '-- AlterTable\n' + migrate:dev 'ALTER TABLE [dbo].[User] DROP CONSTRAINT [UQ__email];\n' + migrate:dev 'ALTER TABLE [dbo].[User] ALTER COLUMN [email] varchar(150) NOT NULL;\n' + migrate:dev 'ALTER TABLE [dbo].[User] DROP COLUMN [name];\n' + migrate:dev 'ALTER TABLE [dbo].[User] ADD CONSTRAINT PK__User__email UNIQUE ([email]);\n' + migrate:dev 'ALTER TABLE [dbo].[User] ADD [firstName] varchar(100),\n' + migrate:dev '[lastName] varchar(150),\n' + migrate:dev '[social] nvarchar(max),\n' + migrate:dev '[isAdmin] bit NOT NULL CONSTRAINT [DF__User__isAdmin] DEFAULT 0;\n' + migrate:dev '\n' + migrate:dev '-- CreateTable\n' + migrate:dev 'CREATE TABLE [dbo].[Course] (\n' + migrate:dev ' [id] INT IDENTITY(1,1),\n' + migrate:dev ' [name] varchar(255) NOT NULL,\n' + migrate:dev ' [courseDetails] nvarchar(max),\n' + migrate:dev ' CONSTRAINT [PK__Course__id] PRIMARY KEY ([id])\n' + migrate:dev ');\n' + migrate:dev '\n' + migrate:dev '-- CreateTable\n' + migrate:dev 'CREATE TABLE [dbo].[CourseEnrollment] (\n' + migrate:dev ' [userId] int NOT NULL,\n' + migrate:dev ' [courseId] int NOT NULL,\n' + migrate:dev ' [role] varchar(20) NOT NULL,\n' + migrate:dev ' [createdAt] datetime NOT NULL CONSTRAINT [DF__CourseEnrollment__createdAt] DEFAULT CURRENT_TIMESTAMP,\n' + migrate:dev ' CONSTRAINT [PK__CourseEnrollment__userId_courseId] PRIMARY KEY ([userId],[courseId])\n' + migrate:dev ');\n' + migrate:dev '\n' + migrate:dev '-- CreateTable\n' + migrate:dev 'CREATE TABLE [dbo].[Test] (\n' + migrate:dev ' [id] INT IDENTITY(1,1),\n' + migrate:dev ' [updatedAt] datetime NOT NULL,\n' + migrate:dev ' [courseId] int NOT NULL,\n' + migrate:dev ' [name] varchar(255) NOT NULL,\n' + migrate:dev ' [date] datetime NOT NULL,\n' + migrate:dev ' 
CONSTRAINT [PK__Test__id] PRIMARY KEY ([id])\n' + migrate:dev ');\n' + migrate:dev '\n' + migrate:dev '-- CreateTable\n' + migrate:dev 'CREATE TABLE [dbo].[TestResult] (\n' + migrate:dev ' [id] INT IDENTITY(1,1),\n' + migrate:dev ' [createdAt] datetime NOT NULL CONSTRAINT [DF__TestResult__createdAt] DEFAULT CURRENT_TIMESTAMP,\n' + migrate:dev ' [result] int NOT NULL,\n' + migrate:dev ' [studentId] int NOT NULL,\n' + migrate:dev ' [graderId] int NOT NULL,\n' + migrate:dev ' [testId] int NOT NULL,\n' + migrate:dev ' CONSTRAINT [PK__TestResult__id] PRIMARY KEY ([id])\n' + migrate:dev ');\n' + migrate:dev '\n' + migrate:dev '-- CreateTable\n' + migrate:dev 'CREATE TABLE [dbo].[Token] (\n' + migrate:dev ' [id] INT IDENTITY(1,1),\n' + migrate:dev ' [createdAt] datetime NOT NULL CONSTRAINT [DF__Token__createdAt] DEFAULT CURRENT_TIMESTAMP,\n' + migrate:dev ' [updatedAt] datetime NOT NULL,\n' + migrate:dev ' [emailToken] varchar(255),\n' + migrate:dev ' [valid] bit NOT NULL CONSTRAINT [DF__Token__valid] DEFAULT 1,\n' + migrate:dev ' [expiration] datetime NOT NULL,\n' + migrate:dev ' [userId] int NOT NULL,\n' + migrate:dev ' [type] varchar(10) NOT NULL,\n' + migrate:dev ' CONSTRAINT [PK__Token__id] PRIMARY KEY ([id]),\n' + migrate:dev ' CONSTRAINT [UQ__emailToken] UNIQUE ([emailToken])\n' + migrate:dev ');\n' + migrate:dev '\n' + migrate:dev '-- DropTable\n' + migrate:dev 'DROP TABLE [dbo].[Post];\n' + migrate:dev '\n' + migrate:dev '-- CreateIndex\n' + migrate:dev 'CREATE UNIQUE INDEX [UQ__email] ON [dbo].[User]([email]);\n' + migrate:dev '\n' + migrate:dev '-- AddForeignKey\n' + migrate:dev 'ALTER TABLE [dbo].[CourseEnrollment] ADD CONSTRAINT [FK__CourseEnr__courseId] FOREIGN KEY ([courseId]) REFERENCES [dbo].[Course]([id]) ON UPDATE CASCADE;\n' + migrate:dev '\n' + migrate:dev '-- AddForeignKey\n' + migrate:dev 'ALTER TABLE [dbo].[CourseEnrollment] ADD CONSTRAINT [FK__CourseEnr__userId] FOREIGN KEY ([userId]) REFERENCES [dbo].[User]([id]) ON UPDATE CASCADE;\n' + 
migrate:dev '\n' + migrate:dev '-- AddForeignKey\n' + migrate:dev 'ALTER TABLE [dbo].[Test] ADD CONSTRAINT [FK__Test__courseId] FOREIGN KEY ([courseId]) REFERENCES [dbo].[Course]([id]) ON UPDATE CASCADE;\n' + migrate:dev '\n' + migrate:dev '-- AddForeignKey\n' + migrate:dev 'ALTER TABLE [dbo].[TestResult] ADD CONSTRAINT [FK__TestResult_grader] FOREIGN KEY ([graderId]) REFERENCES [dbo].[User]([id]) ON UPDATE CASCADE;\n' + migrate:dev '\n' + migrate:dev '-- AddForeignKey\n' + migrate:dev 'ALTER TABLE [dbo].[TestResult] ADD CONSTRAINT [FK__TestResult_student] FOREIGN KEY ([studentId]) REFERENCES [dbo].[User]([id]) ON UPDATE CASCADE;\n' + migrate:dev '\n' + migrate:dev '-- AddForeignKey\n' + migrate:dev 'ALTER TABLE [dbo].[TestResult] ADD CONSTRAINT [FK__TestResult_test] FOREIGN KEY ([testId]) REFERENCES [dbo].[Test]([id]) ON UPDATE CASCADE;\n' + migrate:dev '\n' + migrate:dev '-- AddForeignKey\n' + migrate:dev 'ALTER TABLE [dbo].[Token] ADD CONSTRAINT [FK__Token__userId] FOREIGN KEY ([userId]) REFERENCES [dbo].[User]([id]) ON UPDATE CASCADE;\n' migrate:dev }, migrate:dev history: null, migrate:dev failedMigrationNames: [], migrate:dev editedMigrationNames: [], migrate:dev errorInUnappliedMigration: null, migrate:dev hasMigrationsTable: false migrate:dev } migrate:dev } +0ms MigrateEngine:rpc SENDING RPC CALL {"id":2,"jsonrpc":"2.0","method":"evaluateDataLoss","params":{"migrationsDirectoryPath":"/path/to/my/project/prisma/migrations","prismaSchema":"generator client {\n provider = \"prisma-client-js\"\n previewFeatures = [\"microsoftSqlServer\"]\n}\n\ndatasource db {\n provider = \"sqlserver\"\n url = env(\"DATABASE_URL\")\n}\n\nmodel User {\n email String? @unique\n name String?\n id Int @id @default(autoincrement())\n posts Post[]\n Comment Comment[]\n}\n\nmodel Post {\n authorId Int?\n content String?\n published Boolean\n title String?\n id Int @id @default(autoincrement())\n author User? 
@relation(fields: [authorId], references: [id])\n Comment Comment[]\n}\n\nmodel Comment {\n id Int @id @default(autoincrement())\n title String\n body String\n post Post @relation(fields: [postId], references: [id])\n author User? @relation(fields: [authorId], references: [id])\n authorId Int?\n postId Int\n}\n"}} +2s MigrateEngine:stderr Dec 10 18:50:05.113 INFO EvaluateDataLoss:infer_next_migration: tiberius::client::connection: Performing a TLS handshake +502ms MigrateEngine:stderr Dec 10 18:50:05.113 WARN EvaluateDataLoss:infer_next_migration: tiberius::client::connection: Trusting the server certificate without validation. +0ms MigrateEngine:stderr Dec 10 18:50:05.124 INFO EvaluateDataLoss:infer_next_migration: tiberius::client::connection: TLS handshake successful +10ms MigrateEngine:stderr Dec 10 18:50:05.128 INFO EvaluateDataLoss:infer_next_migration: tiberius::tds::stream::token: Database change from 'prisma_shadow_db9b24c130-0ce1-43c4-9093-02b4c75c8978' to 'master' +4ms MigrateEngine:stderr Dec 10 18:50:05.128 INFO EvaluateDataLoss:infer_next_migration: tiberius::tds::stream::token: Changed database context to 'prisma_shadow_db9b24c130-0ce1-43c4-9093-02b4c75c8978'. +0ms MigrateEngine:stderr Dec 10 18:50:05.128 INFO EvaluateDataLoss:infer_next_migration: tiberius::tds::stream::token: SQL collation change from None to windows-1252/windows-1252 +0ms MigrateEngine:stderr Dec 10 18:50:05.128 INFO EvaluateDataLoss:infer_next_migration: tiberius::tds::stream::token: Microsoft SQL Server version 3742302223 +0ms MigrateEngine:stderr Dec 10 18:50:05.128 INFO EvaluateDataLoss:infer_next_migration: tiberius::tds::stream::token: Packet size change from '4096' to '4096' +5ms MigrateEngine:stderr Dec 10 18:50:05.128 INFO EvaluateDataLoss:infer_next_migration: quaint::single: Starting a mssql connection. +0ms MigrateEngine:stderr Dec 10 18:50:05.374 ERROR EvaluateDataLoss: tiberius::tds::stream::token: Invalid column name 'firstName'. 
code=207 +241ms MigrateEngine:rpc { MigrateEngine:rpc jsonrpc: '2.0', MigrateEngine:rpc error: { MigrateEngine:rpc code: 4466, MigrateEngine:rpc message: 'An error happened. Check the data field for details.', MigrateEngine:rpc data: { MigrateEngine:rpc is_panic: false, MigrateEngine:rpc message: 'Database error: Error accessing result set, column not found: firstName\n' + MigrateEngine:rpc ' 0: migration_core::api::EvaluateDataLoss\n' + MigrateEngine:rpc ' at migration-engine/core/src/api.rs:154', MigrateEngine:rpc backtrace: null MigrateEngine:rpc } MigrateEngine:rpc }, MigrateEngine:rpc id: 2 MigrateEngine:rpc } +758ms Error: Error: Database error: Error accessing result set, column not found: firstName 0: migration_core::api::EvaluateDataLoss at migration-engine/core/src/api.rs:154 at Object.<anonymous> (/path/to/my/project/node_modules/@prisma/cli/build/index.js:53983:26) at MigrateEngine.handleResponse (/path/to/my/project/node_modules/@prisma/cli/build/index.js:53856:38) at LineStream.<anonymous> (/path/to/my/project/node_modules/@prisma/cli/build/index.js:53941:18) at LineStream.emit (events.js:315:20) at LineStream.EventEmitter.emit (domain.js:486:12) at addChunk (_stream_readable.js:309:12) at readableAddChunk (_stream_readable.js:284:9) at LineStream.Readable.push (_stream_readable.js:223:10) at LineStream.Transform.push (_stream_transform.js:166:32) at LineStream._pushBuffer (/path/to/my/project/node_modules/@prisma/cli/build/index.js:53674:19) error Command failed with exit code 1. ``` ## Expected behavior I would have expected migrate to detect a drift and offer to reset the dev database or to successfully migrate. ## Prisma information ```json { "devDependencies": { "@prisma/cli": "^2.14.0-dev.13" }, "dependencies": { "@prisma/client": "^2.14.0-dev.13" } } ``` Using custom binaries built to enable TLS to work on macOS. 
## Environment & setup <!-- In which environment does the problem occur --> - OS: macOS Big Sur (11.0.1) - Database: Docker SQL Server 2019 - Node.js version: v14.15.1 - Prisma version: ``` Environment variables loaded from prisma/.env @prisma/cli : 2.14.0-dev.13 @prisma/client : 2.14.0-dev.13 Current platform : darwin Query Engine : query-engine 626fcb87e24db886bb885c976d49debe7af90acc (at ../prisma-mssql-openssl/query-engine, resolved by PRISMA_QUERY_ENGINE_BINARY) Migration Engine : migration-engine-cli 626fcb87e24db886bb885c976d49debe7af90acc (at ../prisma-mssql-openssl/migration-engine, resolved by PRISMA_MIGRATION_ENGINE_BINARY) Introspection Engine : introspection-core 626fcb87e24db886bb885c976d49debe7af90acc (at ../prisma-mssql-openssl/introspection-engine, resolved by PRISMA_INTROSPECTION_ENGINE_BINARY) Format Binary : prisma-fmt 626fcb87e24db886bb885c976d49debe7af90acc (at ../prisma-mssql-openssl/prisma-fmt, resolved by PRISMA_FMT_BINARY) Studio : 0.331.0 Preview Features : microsoftSqlServer ```
index: 1.0
text_combine:
Issue with migrate on SQL Server - ## Bug description Running migrate after making changes to my schema triggered a weird issue which seems related to drift detection/shadow database. ## How to reproduce Steps to reproduce the behavior: 1. Introspected my database which you could create as follows: ```sql -- master.dbo.[User] definition -- Drop table -- DROP TABLE master.dbo.[User] GO CREATE TABLE [User] ( email varchar(100) COLLATE SQL_Latin1_General_CP1_CI_AS NULL, name varchar(100) COLLATE SQL_Latin1_General_CP1_CI_AS NULL, id int IDENTITY(0,1) NOT NULL, CONSTRAINT User_PK2 PRIMARY KEY (id) ) GO; -- master.dbo.Post definition -- Drop table -- DROP TABLE master.dbo.Post GO CREATE TABLE Post ( authorId int NULL, content varchar(100) COLLATE SQL_Latin1_General_CP1_CI_AS NULL, published int NULL, title varchar(100) COLLATE SQL_Latin1_General_CP1_CI_AS NULL, id int IDENTITY(0,1) NOT NULL, CONSTRAINT Post_pk PRIMARY KEY (id), CONSTRAINT Post_FK_2 FOREIGN KEY (authorId) REFERENCES [User](id) ) GO CREATE UNIQUE NONCLUSTERED INDEX Post_id_uindex ON dbo.Post ( id ASC ) WITH ( PAD_INDEX = OFF ,FILLFACTOR = 100 ,SORT_IN_TEMPDB = OFF , IGNORE_DUP_KEY = OFF , STATISTICS_NORECOMPUTE = OFF , ONLINE = OFF , ALLOW_ROW_LOCKS = ON , ALLOW_PAGE_LOCKS = ON ) ON [PRIMARY ] GO; ``` The schema I got from that after introspecting: ```prisma generator client { provider = "prisma-client-js" previewFeatures = ["microsoftSqlServer"] } datasource db { provider = "sqlserver" url = env("DATABASE_URL") } model User { email String? name String? id Int @id @default(autoincrement()) posts Post[] } model Post { authorId Int? content String? published Int? title String? id Int @id @default(autoincrement()) author User? @relation(fields: [authorId], references: [id]) } ``` 2. 
Make changes to the schema as follows: ```prisma generator client { provider = "prisma-client-js" previewFeatures = ["microsoftSqlServer"] } datasource db { provider = "sqlserver" url = env("DATABASE_URL") } model User { email String? @unique name String? id Int @id @default(autoincrement()) posts Post[] comments Comment[] } model Post { authorId Int? content String? published Boolean title String? id Int @id @default(autoincrement()) author User? @relation(fields: [authorId], references: [id]) comments Comment[] } model Comment { id Int @id @default(autoincrement()) title String body String post Post @relation(fields: [postId], references: [id]) author User? @relation(fields: [authorId], references: [id]) authorId Int? postId Int } ``` 3. Run `yarn prisma migrate dev --preview-feature` 4. See error ``` MigrateEngine:rpc starting migration engine with binary: /Users/hervelabas/Dev/studio-qa/prisma-mssql-openssl/migration-engine +0ms MigrateEngine:rpc SENDING RPC CALL {"id":1,"jsonrpc":"2.0","method":"diagnoseMigrationHistory","params":{"migrationsDirectoryPath":"/path/to/my/project/prisma/migrations","optInToShadowDatabase":true}} +3ms MigrateEngine:stderr Dec 10 18:50:02.790 INFO migration_engine: Starting migration engine RPC server git_hash="626fcb87e24db886bb885c976d49debe7af90acc" +0ms MigrateEngine:stderr Dec 10 18:50:02.795 INFO tiberius::client::connection: Performing a TLS handshake +4ms MigrateEngine:stderr Dec 10 18:50:02.795 WARN tiberius::client::connection: Trusting the server certificate without validation. +1ms MigrateEngine:stderr Dec 10 18:50:02.808 INFO tiberius::client::connection: TLS handshake successful +12ms MigrateEngine:stderr Dec 10 18:50:02.813 INFO tiberius::tds::stream::token: Database change from 'master' to 'master' +5ms MigrateEngine:stderr Dec 10 18:50:02.813 INFO tiberius::tds::stream::token: Changed database context to 'master'. 
+0ms MigrateEngine:stderr Dec 10 18:50:02.813 INFO tiberius::tds::stream::token: SQL collation change from None to windows-1252/windows-1252 +0ms MigrateEngine:stderr Dec 10 18:50:02.813 INFO tiberius::tds::stream::token: Microsoft SQL Server version 3742302223 +0ms MigrateEngine:stderr Dec 10 18:50:02.813 INFO tiberius::tds::stream::token: Packet size change from '4096' to '4096' +0ms MigrateEngine:stderr Dec 10 18:50:02.813 INFO quaint::single: Starting a mssql connection. +0ms MigrateEngine:stderr Dec 10 18:50:02.820 ERROR DiagnoseMigrationHistory:list_migrations: tiberius::tds::stream::token: Invalid object name 'dbo._prisma_migrations'. code=208 +7ms MigrateEngine:stderr Dec 10 18:50:03.289 INFO DiagnoseMigrationHistory:calculate_drift: tiberius::client::connection: Performing a TLS handshake +469ms MigrateEngine:stderr Dec 10 18:50:03.289 WARN DiagnoseMigrationHistory:calculate_drift: tiberius::client::connection: Trusting the server certificate without validation. +1ms MigrateEngine:stderr Dec 10 18:50:03.301 INFO DiagnoseMigrationHistory:calculate_drift: tiberius::client::connection: TLS handshake successful +11ms MigrateEngine:stderr Dec 10 18:50:03.306 INFO DiagnoseMigrationHistory:calculate_drift: tiberius::tds::stream::token: Database change from 'prisma_shadow_dbb5b76c9a-7091-4eb5-8422-9975ac30b62b' to 'master' +5ms MigrateEngine:stderr Dec 10 18:50:03.306 INFO DiagnoseMigrationHistory:calculate_drift: tiberius::tds::stream::token: Changed database context to 'prisma_shadow_dbb5b76c9a-7091-4eb5-8422-9975ac30b62b'. 
+0ms MigrateEngine:stderr Dec 10 18:50:03.306 INFO DiagnoseMigrationHistory:calculate_drift: tiberius::tds::stream::token: SQL collation change from None to windows-1252/windows-1252 +0ms MigrateEngine:stderr Dec 10 18:50:03.306 INFO DiagnoseMigrationHistory:calculate_drift: tiberius::tds::stream::token: Microsoft SQL Server version 3742302223 +0ms MigrateEngine:stderr Dec 10 18:50:03.306 INFO DiagnoseMigrationHistory:calculate_drift: tiberius::tds::stream::token: Packet size change from '4096' to '4096' +1s MigrateEngine:stderr Dec 10 18:50:03.306 INFO DiagnoseMigrationHistory:calculate_drift: quaint::single: Starting a mssql connection. +1ms MigrateEngine:stderr Dec 10 18:50:04.101 INFO DiagnoseMigrationHistory:validate_migrations: tiberius::client::connection: Performing a TLS handshake +0ms MigrateEngine:stderr Dec 10 18:50:04.101 WARN DiagnoseMigrationHistory:validate_migrations: tiberius::client::connection: Trusting the server certificate without validation. +0ms MigrateEngine:stderr Dec 10 18:50:04.112 INFO DiagnoseMigrationHistory:validate_migrations: tiberius::client::connection: TLS handshake successful +0ms MigrateEngine:stderr Dec 10 18:50:04.116 INFO DiagnoseMigrationHistory:validate_migrations: tiberius::tds::stream::token: Database change from 'prisma_shadow_db5090846c-bb0d-40a0-80bc-560435248178' to 'master' +3ms MigrateEngine:stderr Dec 10 18:50:04.116 INFO DiagnoseMigrationHistory:validate_migrations: tiberius::tds::stream::token: Changed database context to 'prisma_shadow_db5090846c-bb0d-40a0-80bc-560435248178'. 
+0ms MigrateEngine:stderr Dec 10 18:50:04.116 INFO DiagnoseMigrationHistory:validate_migrations: tiberius::tds::stream::token: SQL collation change from None to windows-1252/windows-1252 +0ms MigrateEngine:stderr Dec 10 18:50:04.116 INFO DiagnoseMigrationHistory:validate_migrations: tiberius::tds::stream::token: Microsoft SQL Server version 3742302223 +0ms MigrateEngine:stderr Dec 10 18:50:04.116 INFO DiagnoseMigrationHistory:validate_migrations: tiberius::tds::stream::token: Packet size change from '4096' to '4096' +0ms MigrateEngine:stderr Dec 10 18:50:04.116 INFO DiagnoseMigrationHistory:validate_migrations: quaint::single: Starting a mssql connection. +4ms migrate:dev { migrate:dev diagnoseResult: { migrate:dev drift: { migrate:dev diagnostic: 'driftDetected', migrate:dev rollback: '/*\n' + migrate:dev ' Warnings:\n' + migrate:dev '\n' + migrate:dev ' - You are about to drop the column `name` on the `User` table. All the data in the column will be lost.\n' + migrate:dev ' - You are about to drop the `Post` table. If the table is not empty, all the data it contains will be lost.\n' + migrate:dev ' - The migration will add a unique constraint covering the columns `[email]` on the table `User`. If there are existing duplicate values, the migration will fail.\n' + migrate:dev ' - Made the column `email` on table `User` required. 
The migration will fail if there are existing NULL values in that column.\n' + migrate:dev '\n' + migrate:dev '*/\n' + migrate:dev '-- DropForeignKey\n' + migrate:dev 'ALTER TABLE [dbo].[Post] DROP CONSTRAINT [Post_FK_2];\n' + migrate:dev '\n' + migrate:dev '-- AlterTable\n' + migrate:dev 'ALTER TABLE [dbo].[User] DROP CONSTRAINT [UQ__email];\n' + migrate:dev 'ALTER TABLE [dbo].[User] ALTER COLUMN [email] varchar(150) NOT NULL;\n' + migrate:dev 'ALTER TABLE [dbo].[User] DROP COLUMN [name];\n' + migrate:dev 'ALTER TABLE [dbo].[User] ADD CONSTRAINT PK__User__email UNIQUE ([email]);\n' + migrate:dev 'ALTER TABLE [dbo].[User] ADD [firstName] varchar(100),\n' + migrate:dev '[lastName] varchar(150),\n' + migrate:dev '[social] nvarchar(max),\n' + migrate:dev '[isAdmin] bit NOT NULL CONSTRAINT [DF__User__isAdmin] DEFAULT 0;\n' + migrate:dev '\n' + migrate:dev '-- CreateTable\n' + migrate:dev 'CREATE TABLE [dbo].[Course] (\n' + migrate:dev ' [id] INT IDENTITY(1,1),\n' + migrate:dev ' [name] varchar(255) NOT NULL,\n' + migrate:dev ' [courseDetails] nvarchar(max),\n' + migrate:dev ' CONSTRAINT [PK__Course__id] PRIMARY KEY ([id])\n' + migrate:dev ');\n' + migrate:dev '\n' + migrate:dev '-- CreateTable\n' + migrate:dev 'CREATE TABLE [dbo].[CourseEnrollment] (\n' + migrate:dev ' [userId] int NOT NULL,\n' + migrate:dev ' [courseId] int NOT NULL,\n' + migrate:dev ' [role] varchar(20) NOT NULL,\n' + migrate:dev ' [createdAt] datetime NOT NULL CONSTRAINT [DF__CourseEnrollment__createdAt] DEFAULT CURRENT_TIMESTAMP,\n' + migrate:dev ' CONSTRAINT [PK__CourseEnrollment__userId_courseId] PRIMARY KEY ([userId],[courseId])\n' + migrate:dev ');\n' + migrate:dev '\n' + migrate:dev '-- CreateTable\n' + migrate:dev 'CREATE TABLE [dbo].[Test] (\n' + migrate:dev ' [id] INT IDENTITY(1,1),\n' + migrate:dev ' [updatedAt] datetime NOT NULL,\n' + migrate:dev ' [courseId] int NOT NULL,\n' + migrate:dev ' [name] varchar(255) NOT NULL,\n' + migrate:dev ' [date] datetime NOT NULL,\n' + migrate:dev ' 
CONSTRAINT [PK__Test__id] PRIMARY KEY ([id])\n' + migrate:dev ');\n' + migrate:dev '\n' + migrate:dev '-- CreateTable\n' + migrate:dev 'CREATE TABLE [dbo].[TestResult] (\n' + migrate:dev ' [id] INT IDENTITY(1,1),\n' + migrate:dev ' [createdAt] datetime NOT NULL CONSTRAINT [DF__TestResult__createdAt] DEFAULT CURRENT_TIMESTAMP,\n' + migrate:dev ' [result] int NOT NULL,\n' + migrate:dev ' [studentId] int NOT NULL,\n' + migrate:dev ' [graderId] int NOT NULL,\n' + migrate:dev ' [testId] int NOT NULL,\n' + migrate:dev ' CONSTRAINT [PK__TestResult__id] PRIMARY KEY ([id])\n' + migrate:dev ');\n' + migrate:dev '\n' + migrate:dev '-- CreateTable\n' + migrate:dev 'CREATE TABLE [dbo].[Token] (\n' + migrate:dev ' [id] INT IDENTITY(1,1),\n' + migrate:dev ' [createdAt] datetime NOT NULL CONSTRAINT [DF__Token__createdAt] DEFAULT CURRENT_TIMESTAMP,\n' + migrate:dev ' [updatedAt] datetime NOT NULL,\n' + migrate:dev ' [emailToken] varchar(255),\n' + migrate:dev ' [valid] bit NOT NULL CONSTRAINT [DF__Token__valid] DEFAULT 1,\n' + migrate:dev ' [expiration] datetime NOT NULL,\n' + migrate:dev ' [userId] int NOT NULL,\n' + migrate:dev ' [type] varchar(10) NOT NULL,\n' + migrate:dev ' CONSTRAINT [PK__Token__id] PRIMARY KEY ([id]),\n' + migrate:dev ' CONSTRAINT [UQ__emailToken] UNIQUE ([emailToken])\n' + migrate:dev ');\n' + migrate:dev '\n' + migrate:dev '-- DropTable\n' + migrate:dev 'DROP TABLE [dbo].[Post];\n' + migrate:dev '\n' + migrate:dev '-- CreateIndex\n' + migrate:dev 'CREATE UNIQUE INDEX [UQ__email] ON [dbo].[User]([email]);\n' + migrate:dev '\n' + migrate:dev '-- AddForeignKey\n' + migrate:dev 'ALTER TABLE [dbo].[CourseEnrollment] ADD CONSTRAINT [FK__CourseEnr__courseId] FOREIGN KEY ([courseId]) REFERENCES [dbo].[Course]([id]) ON UPDATE CASCADE;\n' + migrate:dev '\n' + migrate:dev '-- AddForeignKey\n' + migrate:dev 'ALTER TABLE [dbo].[CourseEnrollment] ADD CONSTRAINT [FK__CourseEnr__userId] FOREIGN KEY ([userId]) REFERENCES [dbo].[User]([id]) ON UPDATE CASCADE;\n' + 
migrate:dev '\n' + migrate:dev '-- AddForeignKey\n' + migrate:dev 'ALTER TABLE [dbo].[Test] ADD CONSTRAINT [FK__Test__courseId] FOREIGN KEY ([courseId]) REFERENCES [dbo].[Course]([id]) ON UPDATE CASCADE;\n' + migrate:dev '\n' + migrate:dev '-- AddForeignKey\n' + migrate:dev 'ALTER TABLE [dbo].[TestResult] ADD CONSTRAINT [FK__TestResult_grader] FOREIGN KEY ([graderId]) REFERENCES [dbo].[User]([id]) ON UPDATE CASCADE;\n' + migrate:dev '\n' + migrate:dev '-- AddForeignKey\n' + migrate:dev 'ALTER TABLE [dbo].[TestResult] ADD CONSTRAINT [FK__TestResult_student] FOREIGN KEY ([studentId]) REFERENCES [dbo].[User]([id]) ON UPDATE CASCADE;\n' + migrate:dev '\n' + migrate:dev '-- AddForeignKey\n' + migrate:dev 'ALTER TABLE [dbo].[TestResult] ADD CONSTRAINT [FK__TestResult_test] FOREIGN KEY ([testId]) REFERENCES [dbo].[Test]([id]) ON UPDATE CASCADE;\n' + migrate:dev '\n' + migrate:dev '-- AddForeignKey\n' + migrate:dev 'ALTER TABLE [dbo].[Token] ADD CONSTRAINT [FK__Token__userId] FOREIGN KEY ([userId]) REFERENCES [dbo].[User]([id]) ON UPDATE CASCADE;\n' migrate:dev }, migrate:dev history: null, migrate:dev failedMigrationNames: [], migrate:dev editedMigrationNames: [], migrate:dev errorInUnappliedMigration: null, migrate:dev hasMigrationsTable: false migrate:dev } migrate:dev } +0ms MigrateEngine:rpc SENDING RPC CALL {"id":2,"jsonrpc":"2.0","method":"evaluateDataLoss","params":{"migrationsDirectoryPath":"/path/to/my/project/prisma/migrations","prismaSchema":"generator client {\n provider = \"prisma-client-js\"\n previewFeatures = [\"microsoftSqlServer\"]\n}\n\ndatasource db {\n provider = \"sqlserver\"\n url = env(\"DATABASE_URL\")\n}\n\nmodel User {\n email String? @unique\n name String?\n id Int @id @default(autoincrement())\n posts Post[]\n Comment Comment[]\n}\n\nmodel Post {\n authorId Int?\n content String?\n published Boolean\n title String?\n id Int @id @default(autoincrement())\n author User? 
@relation(fields: [authorId], references: [id])\n Comment Comment[]\n}\n\nmodel Comment {\n id Int @id @default(autoincrement())\n title String\n body String\n post Post @relation(fields: [postId], references: [id])\n author User? @relation(fields: [authorId], references: [id])\n authorId Int?\n postId Int\n}\n"}} +2s MigrateEngine:stderr Dec 10 18:50:05.113 INFO EvaluateDataLoss:infer_next_migration: tiberius::client::connection: Performing a TLS handshake +502ms MigrateEngine:stderr Dec 10 18:50:05.113 WARN EvaluateDataLoss:infer_next_migration: tiberius::client::connection: Trusting the server certificate without validation. +0ms MigrateEngine:stderr Dec 10 18:50:05.124 INFO EvaluateDataLoss:infer_next_migration: tiberius::client::connection: TLS handshake successful +10ms MigrateEngine:stderr Dec 10 18:50:05.128 INFO EvaluateDataLoss:infer_next_migration: tiberius::tds::stream::token: Database change from 'prisma_shadow_db9b24c130-0ce1-43c4-9093-02b4c75c8978' to 'master' +4ms MigrateEngine:stderr Dec 10 18:50:05.128 INFO EvaluateDataLoss:infer_next_migration: tiberius::tds::stream::token: Changed database context to 'prisma_shadow_db9b24c130-0ce1-43c4-9093-02b4c75c8978'. +0ms MigrateEngine:stderr Dec 10 18:50:05.128 INFO EvaluateDataLoss:infer_next_migration: tiberius::tds::stream::token: SQL collation change from None to windows-1252/windows-1252 +0ms MigrateEngine:stderr Dec 10 18:50:05.128 INFO EvaluateDataLoss:infer_next_migration: tiberius::tds::stream::token: Microsoft SQL Server version 3742302223 +0ms MigrateEngine:stderr Dec 10 18:50:05.128 INFO EvaluateDataLoss:infer_next_migration: tiberius::tds::stream::token: Packet size change from '4096' to '4096' +5ms MigrateEngine:stderr Dec 10 18:50:05.128 INFO EvaluateDataLoss:infer_next_migration: quaint::single: Starting a mssql connection. +0ms MigrateEngine:stderr Dec 10 18:50:05.374 ERROR EvaluateDataLoss: tiberius::tds::stream::token: Invalid column name 'firstName'. 
code=207 +241ms MigrateEngine:rpc { MigrateEngine:rpc jsonrpc: '2.0', MigrateEngine:rpc error: { MigrateEngine:rpc code: 4466, MigrateEngine:rpc message: 'An error happened. Check the data field for details.', MigrateEngine:rpc data: { MigrateEngine:rpc is_panic: false, MigrateEngine:rpc message: 'Database error: Error accessing result set, column not found: firstName\n' + MigrateEngine:rpc ' 0: migration_core::api::EvaluateDataLoss\n' + MigrateEngine:rpc ' at migration-engine/core/src/api.rs:154', MigrateEngine:rpc backtrace: null MigrateEngine:rpc } MigrateEngine:rpc }, MigrateEngine:rpc id: 2 MigrateEngine:rpc } +758ms Error: Error: Database error: Error accessing result set, column not found: firstName 0: migration_core::api::EvaluateDataLoss at migration-engine/core/src/api.rs:154 at Object.<anonymous> (/path/to/my/project/node_modules/@prisma/cli/build/index.js:53983:26) at MigrateEngine.handleResponse (/path/to/my/project/node_modules/@prisma/cli/build/index.js:53856:38) at LineStream.<anonymous> (/path/to/my/project/node_modules/@prisma/cli/build/index.js:53941:18) at LineStream.emit (events.js:315:20) at LineStream.EventEmitter.emit (domain.js:486:12) at addChunk (_stream_readable.js:309:12) at readableAddChunk (_stream_readable.js:284:9) at LineStream.Readable.push (_stream_readable.js:223:10) at LineStream.Transform.push (_stream_transform.js:166:32) at LineStream._pushBuffer (/path/to/my/project/node_modules/@prisma/cli/build/index.js:53674:19) error Command failed with exit code 1.
```

## Expected behavior

I would have expected migrate to detect a drift and offer to reset the dev database, or to migrate successfully.

## Prisma information

```json
{
  "devDependencies": { "@prisma/cli": "^2.14.0-dev.13" },
  "dependencies": { "@prisma/client": "^2.14.0-dev.13" }
}
```

Using custom binaries built to enable TLS to work on macOS.
## Environment & setup

- OS: macOS Big Sur (11.0.1)
- Database: Docker SQL Server 2019
- Node.js version: v14.15.1
- Prisma version:

```
Environment variables loaded from prisma/.env
@prisma/cli          : 2.14.0-dev.13
@prisma/client       : 2.14.0-dev.13
Current platform     : darwin
Query Engine         : query-engine 626fcb87e24db886bb885c976d49debe7af90acc (at ../prisma-mssql-openssl/query-engine, resolved by PRISMA_QUERY_ENGINE_BINARY)
Migration Engine     : migration-engine-cli 626fcb87e24db886bb885c976d49debe7af90acc (at ../prisma-mssql-openssl/migration-engine, resolved by PRISMA_MIGRATION_ENGINE_BINARY)
Introspection Engine : introspection-core 626fcb87e24db886bb885c976d49debe7af90acc (at ../prisma-mssql-openssl/introspection-engine, resolved by PRISMA_INTROSPECTION_ENGINE_BINARY)
Format Binary        : prisma-fmt 626fcb87e24db886bb885c976d49debe7af90acc (at ../prisma-mssql-openssl/prisma-fmt, resolved by PRISMA_FMT_BINARY)
Studio               : 0.331.0
Preview Features     : microsoftSqlServer
```
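Conceptually, the drift check that fails above replays the recorded migration history into a throwaway shadow database and diffs the result against the dev database. The following is a minimal sketch of that idea only, using stdlib sqlite3 rather than SQL Server, and it is not Prisma's actual implementation; the table and migration here are hypothetical.

```python
import sqlite3

def schema_of(conn):
    # Use the stored CREATE statements as a crude proxy for the schema
    rows = conn.execute(
        "SELECT name, sql FROM sqlite_master WHERE type='table' ORDER BY name"
    ).fetchall()
    return {name: sql for name, sql in rows}

migrations = [
    "CREATE TABLE User (id INTEGER PRIMARY KEY, email TEXT)",
]

# Shadow database: replay the recorded migrations from scratch
shadow = sqlite3.connect(":memory:")
for m in migrations:
    shadow.execute(m)

# Dev database: same migrations, plus an out-of-band change (simulated drift)
dev = sqlite3.connect(":memory:")
for m in migrations:
    dev.execute(m)
dev.execute("ALTER TABLE User ADD COLUMN name TEXT")

drift_detected = schema_of(shadow) != schema_of(dev)
print(drift_detected)  # True: the dev database has drifted from the migration history
```

In the report above there is no `_prisma_migrations` table yet, so the whole introspected schema is treated as drift; the bug is that the generated rollback/migration then references columns (e.g. `firstName`) that do not match the shadow database state.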
[BUG]: time series test transformation not working as expected

IssuesEvent · closed · 2023-04-15 14:18:29 · pycaret/pycaret (https://api.github.com/repos/pycaret/pycaret)
Labels: question, time_series, preprocessing
### pycaret version checks

- [X] I have checked that this issue has not already been reported [here](https://github.com/pycaret/pycaret/issues).
- [X] I have confirmed this bug exists on the [latest version](https://github.com/pycaret/pycaret/releases) of pycaret.
- [X] I have confirmed this bug exists on the master branch of pycaret (pip install -U git+https://github.com/pycaret/pycaret.git@master).

### Issue Description

I am using `transform_target='box-cox'` in the setup. The transformation applied to `y_train` seems fine, but the transformation applied to the test set does not make sense. To verify, I took `y_train_transformed` and `y_test_transformed` from pycaret; then, outside pycaret, I applied the sktime Box-Cox transformation. The transformations of `y_train` are exactly the same; however, the test transformations are different. Here is the link to the notebook: https://colab.research.google.com/drive/1tGy-K0RnqNmFH-vkjBA3xRrgcLxoAPRH?usp=sharing

### Reproducible Example

```python
from pycaret.time_series import TSForecastingExperiment

exp = TSForecastingExperiment()
exp.setup(data=df, index='Date', fh=[1, 2], session_id=42, transform_target='box-cox')

# Box-Cox with sktime
from sktime.transformations.series.boxcox import BoxCoxTransformer

bc = BoxCoxTransformer()
bc.fit_transform(exp.y_train.to_numpy().reshape(-1, 1))

# Compare transformations
import pandas as pd

df = pd.DataFrame(bc.transform(exp.y_test.to_numpy().reshape(-1, 1)), exp.y_test_transformed).reset_index()
df.columns = ['sktime_test_transformed', 'pycaret_test_transformed']
df['y_test'] = exp.y_test.values
df
```

### Expected Behavior

The transformations of the test data produced by pycaret and sktime should be the same.

### Actual Results

```python-traceback
   sktime_test_transformed  pycaret_test_transformed  y_test
0                42.674049                 53.379534     403
1                37.583487                 46.517722     320
```

### Installed Versions

<details>

System:
    python: 3.8.10 (default, Nov 14 2022, 12:59:47) [GCC 9.4.0]
    executable: /usr/bin/python3
    machine:
Linux-5.10.147+-x86_64-with-glibc2.29

PyCaret required dependencies:
    pip: 22.0.4
    setuptools: 57.4.0
    pycaret: 3.0.0rc8
    IPython: 7.9.0
    ipywidgets: 7.7.1
    tqdm: 4.64.1
    numpy: 1.21.6
    pandas: 1.3.5
    jinja2: 2.11.3
    scipy: 1.7.3
    joblib: 1.2.0
    sklearn: 1.0.2
    pyod: 1.0.7
    imblearn: 0.8.1
    category_encoders: 2.6.0
    lightgbm: 3.3.5
    numba: 0.56.4
    requests: 2.28.2
    matplotlib: 3.6.3
    scikitplot: 0.3.7
    yellowbrick: 1.5
    plotly: 5.5.0
    kaleido: 0.2.1
    statsmodels: 0.13.5
    sktime: 0.16.0
    tbats: 1.1.2
    pmdarima: 2.0.2
    psutil: 5.9.4

PyCaret optional dependencies:
    shap: Not installed
    interpret: Not installed
    umap: Not installed
    pandas_profiling: 1.4.1
    explainerdashboard: Not installed
    autoviz: Not installed
    fairlearn: Not installed
    xgboost: 0.90
    catboost: Not installed
    kmodes: Not installed
    mlxtend: 0.14.0
    statsforecast: Not installed
    tune_sklearn: Not installed
    ray: Not installed
    hyperopt: 0.1.2
    optuna: Not installed
    skopt: Not installed
    mlflow: Not installed
    gradio: Not installed
    fastapi: Not installed
    uvicorn: Not installed
    m2cgen: Not installed
    evidently: Not installed
    nltk: 3.7
    pyLDAvis: Not installed
    gensim: 3.6.0
    spacy: 3.4.4
    wordcloud: 1.8.2.2
    textblob: 0.15.3
    fugue: Not installed
    streamlit: Not installed
    prophet: 1.1.2

</details>
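One way a mismatch like the one reported can arise is if the test series is transformed with a Box-Cox lambda fitted on something other than the training split. A minimal sketch of the expected pattern using scipy directly (the numbers here are hypothetical stand-ins for `y_train`/`y_test`, not the issue's real data): fit the lambda on the training data once, then reuse that same lambda for the test data.

```python
import numpy as np
from scipy import stats

# Hypothetical stand-ins for y_train / y_test
y_train = np.array([112.0, 118.0, 132.0, 129.0, 121.0, 135.0, 148.0, 136.0, 119.0, 104.0])
y_test = np.array([403.0, 320.0])

# Fit the Box-Cox lambda on the training data only
y_train_transformed, lmbda = stats.boxcox(y_train)

# Correct: reuse the train-fitted lambda for the test data
y_test_transformed = stats.boxcox(y_test, lmbda=lmbda)

# Box-Cox is monotone increasing for any lambda, so the ordering
# of the raw test values is preserved after the transform
print(y_test_transformed.shape)  # (2,)
```

Refitting the lambda on the test series (or on the full series) would generally give different transformed values, which is the kind of discrepancy shown in the Actual Results table.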
1.0
[BUG]: time series test transformation not working as expected - ### pycaret version checks - [X] I have checked that this issue has not already been reported [here](https://github.com/pycaret/pycaret/issues). - [X] I have confirmed this bug exists on the [latest version](https://github.com/pycaret/pycaret/releases) of pycaret. - [X] I have confirmed this bug exists on the master branch of pycaret (pip install -U git+https://github.com/pycaret/pycaret.git@master). ### Issue Description I am using transform_target='box-cox' in the setup. transformation applied to y_train seems fine, but the transformation applied to the test does not make sense. to verify, I took the y_train_transformed and y_test_transformed from pycaret. Then outside pycaret, I applied sktime boxcox transformation. transformation on the y_train are exactly the same, however, test transformations are different. Here is the link to the notebook https://colab.research.google.com/drive/1tGy-K0RnqNmFH-vkjBA3xRrgcLxoAPRH?usp=sharing ### Reproducible Example ```python from pycaret.time_series import TSForecastingExperiment exp = TSForecastingExperiment() exp.setup(data=df, index='Date', fh=[1,2], session_id=42,transform_target='box-cox') # box cox with sktime from sktime.transformations.series.boxcox import BoxCoxTransformer bc = BoxCoxTransformer() bc.fit_transform(exp.y_train.to_numpy().reshape(-1, 1)) # compare transformation import pandas as pd df = pd.DataFrame(bc.transform(exp.y_test.to_numpy().reshape(-1, 1)),exp.y_test_transformed).reset_index() df.columns = ['sktime_test_transformed','pycaret_test_transformed'] df['y_test'] = exp.y_test.values df ``` ### Expected Behavior transformation on test data produced by pycaret and sktime should be the same ### Actual Results ```python-traceback sktime_test_transformed pycaret_test_transformed y_test 0 42.674049 53.379534 403 1 37.583487 46.517722 320 ``` ### Installed Versions <details> System: python: 3.8.10 (default, Nov 14 2022, 12:59:47) [GCC 9.4.0] 
executable: /usr/bin/python3 machine: Linux-5.10.147+-x86_64-with-glibc2.29 PyCaret required dependencies: pip: 22.0.4 setuptools: 57.4.0 pycaret: 3.0.0rc8 IPython: 7.9.0 ipywidgets: 7.7.1 tqdm: 4.64.1 numpy: 1.21.6 pandas: 1.3.5 jinja2: 2.11.3 scipy: 1.7.3 joblib: 1.2.0 sklearn: 1.0.2 pyod: 1.0.7 imblearn: 0.8.1 category_encoders: 2.6.0 lightgbm: 3.3.5 numba: 0.56.4 requests: 2.28.2 matplotlib: 3.6.3 scikitplot: 0.3.7 yellowbrick: 1.5 plotly: 5.5.0 kaleido: 0.2.1 statsmodels: 0.13.5 sktime: 0.16.0 tbats: 1.1.2 pmdarima: 2.0.2 psutil: 5.9.4 PyCaret optional dependencies: shap: Not installed interpret: Not installed umap: Not installed pandas_profiling: 1.4.1 explainerdashboard: Not installed autoviz: Not installed fairlearn: Not installed xgboost: 0.90 catboost: Not installed kmodes: Not installed mlxtend: 0.14.0 statsforecast: Not installed tune_sklearn: Not installed ray: Not installed hyperopt: 0.1.2 optuna: Not installed skopt: Not installed mlflow: Not installed gradio: Not installed fastapi: Not installed uvicorn: Not installed m2cgen: Not installed evidently: Not installed nltk: 3.7 pyLDAvis: Not installed gensim: 3.6.0 spacy: 3.4.4 wordcloud: 1.8.2.2 textblob: 0.15.3 fugue: Not installed streamlit: Not installed prophet: 1.1.2 </details>
process
time series test transformation not working as expected pycaret version checks i have checked that this issue has not already been reported i have confirmed this bug exists on the of pycaret i have confirmed this bug exists on the master branch of pycaret pip install u git issue description i am using transform target box cox in the setup transformation applied to y train seems fine but the transformation applied to the test does not make sense to verify i took the y train transformed and y test transformed from pycaret then outside pycaret i applied sktime boxcox transformation transformation on the y train are exactly the same however test transformations are different here is the link to the notebook reproducible example python from pycaret time series import tsforecastingexperiment exp tsforecastingexperiment exp setup data df index date fh session id transform target box cox box cox with sktime from sktime transformations series boxcox import boxcoxtransformer bc boxcoxtransformer bc fit transform exp y train to numpy reshape compare transformation import pandas as pd df pd dataframe bc transform exp y test to numpy reshape exp y test transformed reset index df columns df exp y test values df expected behavior transformation on test data produced by pycaret and sktime should be the same actual results python traceback sktime test transformed pycaret test transformed y test installed versions system python default nov executable usr bin machine linux with pycaret required dependencies pip setuptools pycaret ipython ipywidgets tqdm numpy pandas scipy joblib sklearn pyod imblearn category encoders lightgbm numba requests matplotlib scikitplot yellowbrick plotly kaleido statsmodels sktime tbats pmdarima psutil pycaret optional dependencies shap not installed interpret not installed umap not installed pandas profiling explainerdashboard not installed autoviz not installed fairlearn not installed xgboost catboost not installed kmodes not installed mlxtend 
statsforecast not installed tune sklearn not installed ray not installed hyperopt optuna not installed skopt not installed mlflow not installed gradio not installed fastapi not installed uvicorn not installed not installed evidently not installed nltk pyldavis not installed gensim spacy wordcloud textblob fugue not installed streamlit not installed prophet
1
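The record above hinges on the Box-Cox lambda being estimated on the training split and then reused, unchanged, on the test split; re-fitting on the test data is exactly what produces mismatched values. A minimal sketch of that fit-on-train / transform-on-test pattern, using scipy directly (the numbers are illustrative, not the series from the report):

```python
# Fit-on-train, transform-on-test Box-Cox, sketched with scipy.
# Data values are made up for illustration.
import numpy as np
from scipy import stats
from scipy.special import inv_boxcox

y_train = np.array([120.0, 150.0, 200.0, 260.0, 310.0])
y_test = np.array([403.0, 320.0])

# Fit phase: estimate lambda on the training data only.
y_train_t, lam = stats.boxcox(y_train)

# Transform phase: apply the SAME lambda to the test data.
# Estimating a fresh lambda on the test split yields inconsistent values.
y_test_t = stats.boxcox(y_test, lmbda=lam)

# Round-trip check: inverting with the training lambda recovers y_test.
recovered = inv_boxcox(y_test_t, lam)
print(np.allclose(recovered, y_test))
```

The round-trip check is the useful diagnostic here: if two libraries disagree on the transformed test values while agreeing on the train values, one of them is almost certainly refitting lambda.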
727,576
25,040,378,473
IssuesEvent
2022-11-04 20:03:41
gsanders5/Group_5
https://api.github.com/repos/gsanders5/Group_5
closed
User Story: Friend Profile Interaction Story
enhancement backend frontend database medium-priority user-story
**Story:** I want to be able to access friend's profiles and view their public information and posts. See the [user stories wiki page](https://github.com/gsanders5/Group_5/wiki/User-Stories).
1.0
User Story: Friend Profile Interaction Story - **Story:** I want to be able to access friend's profiles and view their public information and posts. See the [user stories wiki page](https://github.com/gsanders5/Group_5/wiki/User-Stories).
non_process
user story friend profile interaction story story i want to be able to access friend s profiles and view their public information and posts see the
0
249,842
18,858,242,579
IssuesEvent
2021-11-12 09:32:40
zognin/pe
https://api.github.com/repos/zognin/pe
opened
[DG] Documentation guide still uses AB-3
severity.VeryLow type.DocumentationBug
It still references AB-3 (note the last line in the screenshot), even though the repo files have been updated to the team's product name. ![Screenshot 2021-11-12 at 5.31.28 PM.png](https://raw.githubusercontent.com/zognin/pe/main/files/4c3d9a9b-26ae-48b8-9ba2-f6afc91541d9.png) ![Screenshot 2021-11-12 at 5.32.29 PM.png](https://raw.githubusercontent.com/zognin/pe/main/files/a87eff4b-2263-4a4b-a7b3-acd118aaa212.png) <!--session: 1636702513413-0a17c380-ab5a-44ea-9f84-58df1e0fefb4--> <!--Version: Web v3.4.1-->
1.0
[DG] Documentation guide still uses AB-3 - It still references AB-3 (note the last line in the screenshot), even though the repo files have been updated to the team's product name. ![Screenshot 2021-11-12 at 5.31.28 PM.png](https://raw.githubusercontent.com/zognin/pe/main/files/4c3d9a9b-26ae-48b8-9ba2-f6afc91541d9.png) ![Screenshot 2021-11-12 at 5.32.29 PM.png](https://raw.githubusercontent.com/zognin/pe/main/files/a87eff4b-2263-4a4b-a7b3-acd118aaa212.png) <!--session: 1636702513413-0a17c380-ab5a-44ea-9f84-58df1e0fefb4--> <!--Version: Web v3.4.1-->
non_process
documentation guide still uses ab it still references ab note the last line in the screenshot even though the repo files have been updated to the team s product name
0
20,923
27,767,206,259
IssuesEvent
2023-03-16 12:16:19
camunda/issues
https://api.github.com/repos/camunda/issues
opened
Delete selected version of a process definition
component:connectors component:operate component:zeebe component:zeebe-process-automation public kind:epic feature-parity potential:8.3 riskAssessment:completed
### Value Proposition Statement Delete Process Definition to free up storage, declutter user interface and prevent errors. ### User Problem - I have multiple versions of Process Definitions that are not being used anymore. I cannot delete them what can cause accidentally start process in the older version. - A bloated database of process data, takes too much space - I cannot stop creation of the new process instances, when there is a timer start event (cycle) - Currently, there is no way to delete process definition form Zeebe cluster - During development, I have multiple versions with bugs as I was testing the process ### User Stories - As a Developer, I can delete a selected version of deployed process definition and all process instances of this version of process definition (1 operation) via Operate UI. - As a Developer, while using this feature, I can read basic information, what will be deleted - As a Developer, I can see the progress of deletion in Operations Panel - As a Developer, I can read the documentation, explaining what is going to happen when I delete a version of process definition ### Implementation Notes - Together with Process Definition deletion, all dependent data of this process definition version like tasks and decision instances will be deleted - only instances, not definitions. It will affect only child processes, call activity and business rule tasks - nothing in parent instances. - With this iteration, we'll deliver simplified frontend - the enhancements will be covered in the 4th iteration. **Why this scope:** - Delete Process Definition has dependencies with Decisions and Tasks objects so it's more complex to build that - Frontend part is split into 2 iterations, so we're able to ship it with the simplified frontend and then enhance it to final version This is the second, out of 4 iterations, to implement the whole feature of **Delete Process and Decision Definition**. 
More details in [Miro](https://miro.com/app/board/uXjVPNlXkFg=/) **Iterations:** 1. https://github.com/camunda/product-hub/issues/94 2. https://github.com/camunda/product-hub/issues/615 3. https://github.com/camunda/product-hub/issues/619 4. https://github.com/camunda/product-hub/issues/620 ### Breakdown > This section links to various sub-issues / -tasks contributing to respective epic phase or phase results where appropriate. #### Discovery phase ## <!-- Example: link to "Conduct customer interview with xyz" --> #### Define phase ## #### Additional security testing * Test for security login and monitoring failure * Ensure current permission management remains in effect * <!-- Consider: UI, UX, technical design, documentation design --> <!-- Example: link to "Define User-Journey Flow" or "Define target architecture" --> Design Planning * Designer assigned: yes * Assignee: @gastonpillet01 * Design Brief - https://docs.google.com/document/d/1yFi75aIIUAw6aMcGNGsxGGgkCRD2Sq4IvZxbJOLCXmI/edit#heading=h.c4qtk4282c6g * Research Brief - https://docs.google.com/document/d/1yFi75aIIUAw6aMcGNGsxGGgkCRD2Sq4IvZxbJOLCXmI/edit Design Deliverables * [Design handover](https://www.figma.com/file/S0VWipy5r3KlZ0H1zR1mkb/Delete-Definition?node-id=0%3A1) * Figma File for HFW - https://www.figma.com/file/S0VWipy5r3KlZ0H1zR1mkb/Delete-Definition?node-id=402%3A2459 * Still to add iterative breakdown Documentation Planning <!-- Complex changes must be reviewed during the Define phase by the DRI of Documentation or technical writer. --> <!-- Briefly describe the anticipated impact to documentation. --> <!-- Example: "Creates structural changes in docs as UX is reworked." _Add docs reviewer to Epic for feedback._ --> Risk Management <!-- add link to risk management issue --> * Risk Class: <!-- e.g. very low | low | medium | high | very high --> * Risk Treatment: <!-- e.g. avoid | mitigate | transfer | accept --> #### Implement phase ## <!-- Example: link to "Implement User Story xyz". 
Should not only include core implementation, but also documentation. --> Zeebe Epic * DRI: @remcowesterhoud * https://github.com/camunda/zeebe/issues/9576 Operate Epic * DRI: @pedesen * https://github.com/camunda/operate/issues/3971
1.0
Delete selected version of a process definition - ### Value Proposition Statement Delete Process Definition to free up storage, declutter user interface and prevent errors. ### User Problem - I have multiple versions of Process Definitions that are not being used anymore. I cannot delete them what can cause accidentally start process in the older version. - A bloated database of process data, takes too much space - I cannot stop creation of the new process instances, when there is a timer start event (cycle) - Currently, there is no way to delete process definition form Zeebe cluster - During development, I have multiple versions with bugs as I was testing the process ### User Stories - As a Developer, I can delete a selected version of deployed process definition and all process instances of this version of process definition (1 operation) via Operate UI. - As a Developer, while using this feature, I can read basic information, what will be deleted - As a Developer, I can see the progress of deletion in Operations Panel - As a Developer, I can read the documentation, explaining what is going to happen when I delete a version of process definition ### Implementation Notes - Together with Process Definition deletion, all dependent data of this process definition version like tasks and decision instances will be deleted - only instances, not definitions. It will affect only child processes, call activity and business rule tasks - nothing in parent instances. - With this iteration, we'll deliver simplified frontend - the enhancements will be covered in the 4th iteration. **Why this scope:** - Delete Process Definition has dependencies with Decisions and Tasks objects so it's more complex to build that - Frontend part is split into 2 iterations, so we're able to ship it with the simplified frontend and then enhance it to final version This is the second, out of 4 iterations, to implement the whole feature of **Delete Process and Decision Definition**. 
More details in [Miro](https://miro.com/app/board/uXjVPNlXkFg=/) **Iterations:** 1. https://github.com/camunda/product-hub/issues/94 2. https://github.com/camunda/product-hub/issues/615 3. https://github.com/camunda/product-hub/issues/619 4. https://github.com/camunda/product-hub/issues/620 ### Breakdown > This section links to various sub-issues / -tasks contributing to respective epic phase or phase results where appropriate. #### Discovery phase ## <!-- Example: link to "Conduct customer interview with xyz" --> #### Define phase ## #### Additional security testing * Test for security login and monitoring failure * Ensure current permission management remains in effect * <!-- Consider: UI, UX, technical design, documentation design --> <!-- Example: link to "Define User-Journey Flow" or "Define target architecture" --> Design Planning * Designer assigned: yes * Assignee: @gastonpillet01 * Design Brief - https://docs.google.com/document/d/1yFi75aIIUAw6aMcGNGsxGGgkCRD2Sq4IvZxbJOLCXmI/edit#heading=h.c4qtk4282c6g * Research Brief - https://docs.google.com/document/d/1yFi75aIIUAw6aMcGNGsxGGgkCRD2Sq4IvZxbJOLCXmI/edit Design Deliverables * [Design handover](https://www.figma.com/file/S0VWipy5r3KlZ0H1zR1mkb/Delete-Definition?node-id=0%3A1) * Figma File for HFW - https://www.figma.com/file/S0VWipy5r3KlZ0H1zR1mkb/Delete-Definition?node-id=402%3A2459 * Still to add iterative breakdown Documentation Planning <!-- Complex changes must be reviewed during the Define phase by the DRI of Documentation or technical writer. --> <!-- Briefly describe the anticipated impact to documentation. --> <!-- Example: "Creates structural changes in docs as UX is reworked." _Add docs reviewer to Epic for feedback._ --> Risk Management <!-- add link to risk management issue --> * Risk Class: <!-- e.g. very low | low | medium | high | very high --> * Risk Treatment: <!-- e.g. avoid | mitigate | transfer | accept --> #### Implement phase ## <!-- Example: link to "Implement User Story xyz". 
Should not only include core implementation, but also documentation. --> Zeebe Epic * DRI: @remcowesterhoud * https://github.com/camunda/zeebe/issues/9576 Operate Epic * DRI: @pedesen * https://github.com/camunda/operate/issues/3971
process
delete selected version of a process definition value proposition statement delete process definition to free up storage declutter user interface and prevent errors user problem i have multiple versions of process definitions that are not being used anymore i cannot delete them what can cause accidentally start process in the older version a bloated database of process data takes too much space i cannot stop creation of the new process instances when there is a timer start event cycle currently there is no way to delete process definition form zeebe cluster during development i have multiple versions with bugs as i was testing the process user stories as a developer i can delete a selected version of deployed process definition and all process instances of this version of process definition operation via operate ui as a developer while using this feature i can read basic information what will be deleted as a developer i can see the progress of deletion in operations panel as a developer i can read the documentation explaining what is going to happen when i delete a version of process definition implementation notes together with process definition deletion all dependent data of this process definition version like tasks and decision instances will be deleted only instances not definitions it will affect only child processes call activity and business rule tasks nothing in parent instances with this iteration we ll deliver simplified frontend the enhancements will be covered in the iteration why this scope delete process definition has dependencies with decisions and tasks objects so it s more complex to build that frontend part is split into iterations so we re able to ship it with the simplified frontend and then enhance it to final version this is the second out of iterations to implement the whole feature of delete process and decision definition more details in iterations breakdown this section links to various sub issues tasks contributing to respective epic 
phase or phase results where appropriate discovery phase define phase additional security testing test for security login and monitoring failure ensure current permission management remains in effect design planning designer assigned yes assignee design brief research brief design deliverables figma file for hfw still to add iterative breakdown documentation planning risk management risk class risk treatment implement phase zeebe epic dri remcowesterhoud operate epic dri pedesen
1
797,410
28,145,937,800
IssuesEvent
2023-04-02 13:37:00
berkeli/My-Coursework-Planner
https://api.github.com/repos/berkeli/My-Coursework-Planner
opened
[TECH ED] title goes here
🐂 Size Medium 🏕 Priority Mandatory
### Link to the coursework https://github.com/CodeYourFuture/HTML-CSS-Coursework-Week1 ### Why are we doing this? mandatory ### Maximum time in hours (Tech has max 16 per week total) 1 ### How to get help _No response_ ### How to submit Fork and clone then PR ### How to review _No response_ ### Anything else? _No response_
1.0
[TECH ED] title goes here - ### Link to the coursework https://github.com/CodeYourFuture/HTML-CSS-Coursework-Week1 ### Why are we doing this? mandatory ### Maximum time in hours (Tech has max 16 per week total) 1 ### How to get help _No response_ ### How to submit Fork and clone then PR ### How to review _No response_ ### Anything else? _No response_
non_process
title goes here link to the coursework why are we doing this mandatory maximum time in hours tech has max per week total how to get help no response how to submit fork and clone then pr how to review no response anything else no response
0
13,801
16,554,195,000
IssuesEvent
2021-05-28 12:09:06
cmux/koot
https://api.github.com/repos/cmux/koot
closed
Update Webpack 5
bundling process overhaul
- [x] Use the first-party caching mechanism (https://github.com/webpack/webpack/issues/6527) - [x] _Client_ - [x] New config option: `webpackAssetModule` - `builtIn` (new default) - `false` (default for old projects) - `['png', 'jpg', 'gif', 'svg']` - Waiting for Asset Modules to support an option to not emit files https://github.com/webpack/webpack/issues/12474#issuecomment-765258062 - [x] [Asset Module](https://webpack.js.org/guides/asset-modules/) - change behavior based on the config option
1.0
Update Webpack 5 - - [x] Use the first-party caching mechanism (https://github.com/webpack/webpack/issues/6527) - [x] _Client_ - [x] New config option: `webpackAssetModule` - `builtIn` (new default) - `false` (default for old projects) - `['png', 'jpg', 'gif', 'svg']` - Waiting for Asset Modules to support an option to not emit files https://github.com/webpack/webpack/issues/12474#issuecomment-765258062 - [x] [Asset Module](https://webpack.js.org/guides/asset-modules/) - change behavior based on the config option
process
update webpack use the first party caching mechanism client new config option webpackassetmodule builtin new default value false default for old projects waiting for asset modules to support option to not emit files change behavior based on the config option
1
60,859
17,023,541,525
IssuesEvent
2021-07-03 02:33:19
tomhughes/trac-tickets
https://api.github.com/repos/tomhughes/trac-tickets
closed
City bounding boxes incorrect
Component: nominatim Priority: major Resolution: invalid Type: defect
**[Submitted to the original trac issue database at 10.02pm, Thursday, 21st January 2010]** The boundaries for many cities appear to be wrong, e.g. Greater London, UK extends too far east, and Paris extends too far west. Looking at details reveals the boundaries are incorrect.
1.0
City bounding boxes incorrect - **[Submitted to the original trac issue database at 10.02pm, Thursday, 21st January 2010]** The boundaries for many cities appear to be wrong, e.g. Greater London, UK extends too far east, and Paris extends too far west. Looking at details reveals the boundaries are incorrect.
non_process
city bounding boxes incorrect the boundaries for many cities appear to be wrong e g greater london uk extends too far east and paris extends too far west looking at details reveals the boundaries are incorrect
0
104,963
4,227,043,317
IssuesEvent
2016-07-02 22:14:43
Jaeger2305/werewolves-site
https://api.github.com/repos/Jaeger2305/werewolves-site
opened
Matchmaking should be done through the game manager instead of through the player
cleanup enhancement Medium priority
currently too much reliance on find_game and leave_game Player() methods
1.0
Matchmaking should be done through the game manager instead of through the player - currently too much reliance on find_game and leave_game Player() methods
non_process
matchmaking should be done through the game manager instead of through the player currently too much reliance on find game and leave game player methods
0
10,584
13,393,582,033
IssuesEvent
2020-09-03 04:39:35
threefoldtech/js-sdk
https://api.github.com/repos/threefoldtech/js-sdk
closed
3BOT ADMIN: POOL - screen #4 should be removed. why should users choose which wallet they wanna use?
process_wontfix
<img width="848" alt="Screenshot 2020-09-02 at 16 30 33" src="https://user-images.githubusercontent.com/43240801/91998886-14bd6a80-ed3c-11ea-9f72-f8414134b58f.png"> not everyone has a testnet / staging 3bot wallet app. we don't even have a manual on how to get a staging 3bot wallet. at the moment people pay with solar wallet or interstellar wallet testnet... can we delete this page altogether? its a repetition for the following page. #906
1.0
3BOT ADMIN: POOL - screen #4 should be removed. why should users choose which wallet they wanna use? - <img width="848" alt="Screenshot 2020-09-02 at 16 30 33" src="https://user-images.githubusercontent.com/43240801/91998886-14bd6a80-ed3c-11ea-9f72-f8414134b58f.png"> not everyone has a testnet / staging 3bot wallet app. we don't even have a manual on how to get a staging 3bot wallet. at the moment people pay with solar wallet or interstellar wallet testnet... can we delete this page altogether? its a repetition for the following page. #906
process
admin pool screen should be removed why should users choose which wallet they wanna use img width alt screenshot at src not everyone has a testnet staging wallet app we dont even have a manual on how to get a staging wallet at the moment people pay with solar wallet or interstellar wallet testnet can we delete this page altogether its a repetition for the following page
1
52,571
27,628,995,766
IssuesEvent
2023-03-10 09:24:18
stoneatom/stonedb
https://api.github.com/repos/stoneatom/stonedb
closed
feature: support semi-join in stonedb, include EXISTS and IN subqueries
A-feature B-performance prio: high B-SQL
**Is your feature request related to a problem? Please describe.** <!-- A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] --> The reason why our tpch Q4 query is slow is because the execution strategy of subquery is only one kind of nested loop, here we want to do an optimization is to convert the subquery expansion to semi-join, can use hash join, so that we can use hash-join to speed up. **Describe the solution you'd like** <!-- A clear and concise description of what you want to happen. --> **Describe alternatives you've considered** <!-- A clear and concise description of any alternative solutions or features you've considered. --> **Additional context** <!-- Add any other context or screenshots about the feature request here. -->
True
feature: support semi-join in stonedb, include EXISTS and IN subqueries - **Is your feature request related to a problem? Please describe.** <!-- A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] --> The reason why our tpch Q4 query is slow is because the execution strategy of subquery is only one kind of nested loop, here we want to do an optimization is to convert the subquery expansion to semi-join, can use hash join, so that we can use hash-join to speed up. **Describe the solution you'd like** <!-- A clear and concise description of what you want to happen. --> **Describe alternatives you've considered** <!-- A clear and concise description of any alternative solutions or features you've considered. --> **Additional context** <!-- Add any other context or screenshots about the feature request here. -->
non_process
feature support semi join in stonedb include exists and in subqueries is your feature request related to a problem please describe the reason why our tpch query is slow is because the execution strategy of subquery is only one kind of nested loop here we want to do an optimization is to convert the subquery expansion to semi join can use hash join so that we can use hash join to speed up describe the solution you d like describe alternatives you ve considered additional context
0
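The request in the record above is to execute EXISTS/IN subqueries as a semi-join backed by a hash join instead of a nested loop. A minimal sketch of a hash semi-join (hypothetical rows and column names, not StoneDB internals): emit each left row at most once if any right-side match exists, which is the relational meaning of EXISTS/IN.

```python
# Minimal hash semi-join sketch. Data and column names are illustrative,
# not StoneDB internals.

def hash_semi_join(left_rows, right_rows, left_key, right_key):
    # Build phase: hash the right-side join keys once.
    right_keys = {row[right_key] for row in right_rows}
    # Probe phase: one hash lookup per left row, instead of rescanning
    # the right side for every left row as a nested loop would.
    return [row for row in left_rows if row[left_key] in right_keys]

orders = [
    {"o_id": 1, "o_custkey": 10},
    {"o_id": 2, "o_custkey": 11},
    {"o_id": 3, "o_custkey": 10},
]
lineitems = [{"l_orderkey": 1}, {"l_orderkey": 3}]

# Roughly: SELECT * FROM orders o WHERE EXISTS
#   (SELECT 1 FROM lineitem l WHERE l.l_orderkey = o.o_id)
matched = hash_semi_join(orders, lineitems, "o_id", "l_orderkey")
print([r["o_id"] for r in matched])  # → [1, 3]
```

Because only key membership matters, the build side stores a set rather than full rows, and duplicate right-side keys cost nothing extra — one reason the semi-join rewrite tends to beat a nested-loop subquery plan.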
202,251
15,278,176,642
IssuesEvent
2021-02-23 00:52:24
microsoft/vscode
https://api.github.com/repos/microsoft/vscode
opened
Test: test explorer UI
testplan-item
Refs: https://github.com/microsoft/vscode/issues/107467 - [ ] anyOS - [ ] anyOS Complexity: 3 Authors: @connor4312 --- You have likely seen or heard of the test explorer built as part of the testing work. Try it out! 1. Install the [selfhost test provider](https://github.com/microsoft/vscode-selfhost-test-provider) 2. Open the VS Code repo 3. Play around in the test explorer view indicated by the beaker icon. Try out functionality and let me know if you run into problems or have questions 🙂
1.0
Test: test explorer UI - Refs: https://github.com/microsoft/vscode/issues/107467 - [ ] anyOS - [ ] anyOS Complexity: 3 Authors: @connor4312 --- You have likely seen or heard of the test explorer built as part of the testing work. Try it out! 1. Install the [selfhost test provider](https://github.com/microsoft/vscode-selfhost-test-provider) 2. Open the VS Code repo 3. Play around in the test explorer view indicated by the beaker icon. Try out functionality and let me know if you run into problems or have questions 🙂
non_process
test test explorer ui refs anyos anyos complexity authors you have likely seen or heard of the test explorer built as part of the testing work try it out install the open the vs code repo play around in the test explorer view indicated by the beaker icon try out functionality and let me know if you run into problems or have questions 🙂
0
20,540
27,192,398,919
IssuesEvent
2023-02-19 23:24:22
pentium3/sys_reading
https://api.github.com/repos/pentium3/sys_reading
opened
Fries: Fast and Consistent Runtime Reconfiguration in Dataflow Systems with Transactional Guarantees
stream processing
https://www.vldb.org/pvldb/vol16/p256-wang.pdf
1.0
Fries: Fast and Consistent Runtime Reconfiguration in Dataflow Systems with Transactional Guarantees - https://www.vldb.org/pvldb/vol16/p256-wang.pdf
process
fries fast and consistent runtime reconfiguration in dataflow systems with transactional guarantees
1
390,645
11,551,225,299
IssuesEvent
2020-02-19 00:46:33
GoogleContainerTools/skaffold
https://api.github.com/repos/GoogleContainerTools/skaffold
closed
`skaffold dev` breaks on kustomize new `patches:` field format
area/dev area/watch deploy/kustomize good first issue help wanted kind/bug priority/p2
### Actual behavior Running `skaffold dev` with a kustomization.yaml file containing the `patches:` field to target multiple objects (https://github.com/kubernetes-sigs/kustomize/blob/master/docs/plugins/builtins.md#field-name-patches) fails with: ``` FATA[0000] watching files for deployer: listing files: listing files: yaml: unmarshal errors: line xx: cannot unmarshal !!map into string ``` In the skaffold kustomize example (https://github.com/GoogleContainerTools/skaffold/blob/master/examples/kustomize/kustomization.yaml) it is clear that skaffold expects the "old" format of defining patches just by filename: ``` yaml patches: - file.yml - another-file.yml ``` The new format is more elaborate defining targets and allows for inline patches and can look like this: ```yaml # from https://github.com/kubernetes-sigs/kustomize/blob/master/docs/plugins/builtins.md#field-name-patches patches: - path: patch.yaml target: group: apps version: v1 kind: Deployment name: deploy.* labelSelector: "env=dev" annotationSelector: "zone=west" - patch: |- - op: replace path: /some/existing/path value: new value target: kind: MyKind labelSelector: "env=dev" ``` Running `skaffold run` or `skaffold deploy` strangely **works** and produces valid yaml. Running `kustomize` directly works and produces valid yaml as well. ### Expected behavior `skaffold dev` with a kustomization target is expected not to break on usage of the new patches field format. ### Information - Skaffold version: v0.38.0 - Kustomize version: v3.2.0 - Operating system: macOS
1.0
`skaffold dev` breaks on kustomize new `patches:` field format - ### Actual behavior Running `skaffold dev` with a kustomization.yaml file containing the `patches:` field to target multiple objects (https://github.com/kubernetes-sigs/kustomize/blob/master/docs/plugins/builtins.md#field-name-patches) fails with: ``` FATA[0000] watching files for deployer: listing files: listing files: yaml: unmarshal errors: line xx: cannot unmarshal !!map into string ``` In the skaffold kustomize example (https://github.com/GoogleContainerTools/skaffold/blob/master/examples/kustomize/kustomization.yaml) it is clear that skaffold expects the "old" format of defining patches just by filename: ``` yaml patches: - file.yml - another-file.yml ``` The new format is more elaborate defining targets and allows for inline patches and can look like this: ```yaml # from https://github.com/kubernetes-sigs/kustomize/blob/master/docs/plugins/builtins.md#field-name-patches patches: - path: patch.yaml target: group: apps version: v1 kind: Deployment name: deploy.* labelSelector: "env=dev" annotationSelector: "zone=west" - patch: |- - op: replace path: /some/existing/path value: new value target: kind: MyKind labelSelector: "env=dev" ``` Running `skaffold run` or `skaffold deploy` strangely **works** and produces valid yaml. Running `kustomize` directly works and produces valid yaml as well. ### Expected behavior `skaffold dev` with a kustomization target is expected not to break on usage of the new patches field format. ### Information - Skaffold version: v0.38.0 - Kustomize version: v3.2.0 - Operating system: macOS
non_process
skaffold dev breaks on kustomize new patches field format actual behavior running skaffold dev with a kustomization yaml file containing the patches field to target multiple objects fails with fata watching files for deployer listing files listing files yaml unmarshal errors line xx cannot unmarshal map into string in the skaffold kustomize example it is clear that skaffold expects the old format of defining patches just by filename yaml patches file yml another file yml the new format is more elaborate defining targets and allows for inline patches and can look like this yaml from patches path patch yaml target group apps version kind deployment name deploy labelselector env dev annotationselector zone west patch op replace path some existing path value new value target kind mykind labelselector env dev running skaffold run or skaffold deploy strangely works and produces valid yaml running kustomize directly works and produces valid yaml as well expected behavior skaffold dev with a kustomization target is expected not to break on usage of the new patches field format information skaffold version kustomize version operating system macos
0
9,388
12,392,969,375
IssuesEvent
2020-05-20 14:45:32
prisma/specs
https://api.github.com/repos/prisma/specs
closed
Spec spec
area/process kind/spec spec/change
We need to define what a spec is, what goals it has, who its intended consumers are, what level of depth we expect, how it should be structured etc.
1.0
Spec spec - We need to define what a spec is, what goals it has, who its intended consumers are, what level of depth we expect, how it should be structured etc.
process
spec spec we need to define what a spec is what goals it has who its intended consumers are what level of depth we expect how it should be structured etc
1
4,357
7,260,477,749
IssuesEvent
2018-02-18 10:23:54
geneontology/go-ontology
https://api.github.com/repos/geneontology/go-ontology
reopened
mitosis and meiosis I & II/ nuclear division of cell cycle
auto-migrated cell cycle and DNA processes
mitosis and meiosis I & II outstanding question: mitosis has a nuclear division parent, meiosis meiosis I and meiosis II don't Either meiosis terms are a little broader and encompass MORE than nuclear division ? OR meiosis I and meiosis II should both have a nuclear division parent Q If meiosis is "a specialised type of nuclear division", when people refer to "meiosis" in the broader sense, are they really talking about the meiotic cell cycle, or meiotic development (i.e which would include sporulation in fission yeast). Q Should we define meiosis and mitosis as a type of nuclear division (qualified by the cell cycle in which it takes place (meiotic or mitotic). each should have 2 parents the "x cell cycle" & "nuclear division". Note: If meiosis I and meiosis II are the specialised nuclear divisions there are lots of child processes which are not part of nuclear division (premeiotic DNA replication processes for example). We need a term which represents any of the cell-cycle events which occur during (meiosis I + interphase) "meiosis I cell cycle process" and "meiosis II cell cycle process" (would that work ?) to capture all the events which happen during interphase AND meiosis of those specific divisions and terms to represent the nulcear division part of meiosis I and meiosis II David to consult meiosis experts Reported by: ValWood Original Ticket: [geneontology/ontology-requests/10110](https://sourceforge.net/p/geneontology/ontology-requests/10110)
1.0
mitosis and meiosis I & II/ nuclear division of cell cycle - mitosis and meiosis I & II outstanding question: mitosis has a nuclear division parent, meiosis meiosis I and meiosis II don't Either meiosis terms are a little broader and encompass MORE than nuclear division ? OR meiosis I and meiosis II should both have a nuclear division parent Q If meiosis is "a specialised type of nuclear division", when people refer to "meiosis" in the broader sense, are they really talking about the meiotic cell cycle, or meiotic development (i.e which would include sporulation in fission yeast). Q Should we define meiosis and mitosis as a type of nuclear division (qualified by the cell cycle in which it takes place (meiotic or mitotic). each should have 2 parents the "x cell cycle" & "nuclear division". Note: If meiosis I and meiosis II are the specialised nuclear divisions there are lots of child processes which are not part of nuclear division (premeiotic DNA replication processes for example). We need a term which represents any of the cell-cycle events which occur during (meiosis I + interphase) "meiosis I cell cycle process" and "meiosis II cell cycle process" (would that work ?) to capture all the events which happen during interphase AND meiosis of those specific divisions and terms to represent the nulcear division part of meiosis I and meiosis II David to consult meiosis experts Reported by: ValWood Original Ticket: [geneontology/ontology-requests/10110](https://sourceforge.net/p/geneontology/ontology-requests/10110)
process
mitosis and meiosis i ii nuclear division of cell cycle mitosis and meiosis i ii outstanding question mitosis has a nuclear division parent meiosis meiosis i and meiosis ii don t either meiosis terms are a little broader and encompass more than nuclear division or meiosis i and meiosis ii should both have a nuclear division parent q if meiosis is a specialised type of nuclear division when people refer to meiosis in the broader sense are they really talking about the meiotic cell cycle or meiotic development i e which would include sporulation in fission yeast q should we define meiosis and mitosis as a type of nuclear division qualified by the cell cycle in which it takes place meiotic or mitotic each should have parents the x cell cycle nuclear division note if meiosis i and meiosis ii are the specialised nuclear divisions there are lots of child processes which are not part of nuclear division premeiotic dna replication processes for example we need a term which represents any of the cell cycle events which occur during meiosis i interphase meiosis i cell cycle process and meiosis ii cell cycle process would that work to capture all the events which happen during interphase and meiosis of those specific divisions and terms to represent the nulcear division part of meiosis i and meiosis ii david to consult meiosis experts reported by valwood original ticket
1
6,169
9,082,070,296
IssuesEvent
2019-02-17 08:46:08
linnovate/root
https://api.github.com/repos/linnovate/root
opened
in search, in the task template, the related project and discussion arent shown, and no sub tasks are shown
Process bug Search Visual bug
create a new task and name it assign a project and discussion to it search for it click on it in order to enter the task's template no sub tasks are shown(like in #1484 ) and the related project and discussion appear as empty circles: ![image](https://user-images.githubusercontent.com/38312178/52910495-f3ec0c00-32a0-11e9-8780-09700ac040ed.png) if you edit the task, the project and discussion return
1.0
in search, in the task template, the related project and discussion arent shown, and no sub tasks are shown - create a new task and name it assign a project and discussion to it search for it click on it in order to enter the task's template no sub tasks are shown(like in #1484 ) and the related project and discussion appear as empty circles: ![image](https://user-images.githubusercontent.com/38312178/52910495-f3ec0c00-32a0-11e9-8780-09700ac040ed.png) if you edit the task, the project and discussion return
process
in search in the task template the related project and discussion arent shown and no sub tasks are shown create a new task and name it assign a project and discussion to it search for it click on it in order to enter the task s template no sub tasks are shown like in and the related project and discussion appear as empty circles if you edit the task the project and discussion return
1
34,768
4,953,966,810
IssuesEvent
2016-12-01 16:22:07
mozilla-sensorweb/sensorthings
https://api.github.com/repos/mozilla-sensorweb/sensorthings
closed
Observation does not have mandatory relation: FeatureOfInterest
bug OGC compliance tests
OGC Tests are failing with this error: _Test method readEntitiesAndCheckResponse: Entity type "OBSERVATION" does not have mandatory relation: "FeatureOfInterest". Test method readEntityAndCheckResponse: Entity type "OBSERVATION" does not have mandatory relation: "FeatureOfInterest"._
1.0
Observation does not have mandatory relation: FeatureOfInterest - OGC Tests are failing with this error: _Test method readEntitiesAndCheckResponse: Entity type "OBSERVATION" does not have mandatory relation: "FeatureOfInterest". Test method readEntityAndCheckResponse: Entity type "OBSERVATION" does not have mandatory relation: "FeatureOfInterest"._
non_process
observation does not have mandatory relation featureofinterest ogc tests are failing with this error test method readentitiesandcheckresponse entity type observation does not have mandatory relation featureofinterest test method readentityandcheckresponse entity type observation does not have mandatory relation featureofinterest
0
14,956
18,436,314,587
IssuesEvent
2021-10-14 13:24:35
googleapis/python-vision
https://api.github.com/repos/googleapis/python-vision
closed
chore: snippet-bot full scan
api: vision type: process
<!-- probot comment [11299897]--> ## snippet-bot scan result Life is too short to manually check unmatched region tags. Here is the result: Great job! No unmatching region tags found! --- Report generated by [snippet-bot](https://github.com/apps/snippet-bot). If you find problems with this result, please file an issue at: https://github.com/googleapis/repo-automation-bots/issues.
1.0
chore: snippet-bot full scan - <!-- probot comment [11299897]--> ## snippet-bot scan result Life is too short to manually check unmatched region tags. Here is the result: Great job! No unmatching region tags found! --- Report generated by [snippet-bot](https://github.com/apps/snippet-bot). If you find problems with this result, please file an issue at: https://github.com/googleapis/repo-automation-bots/issues.
process
chore snippet bot full scan snippet bot scan result life is too short to manually check unmatched region tags here is the result great job no unmatching region tags found report generated by if you find problems with this result please file an issue at
1
223,986
17,650,722,993
IssuesEvent
2021-08-20 12:52:22
ColoredCow/portal
https://api.github.com/repos/ColoredCow/portal
closed
Unit test cases entities in HR module
module : hr Testing
- [x] Job - [x] Round - [x] Applicant - [ ] Application - [ ] Application round
1.0
Unit test cases entities in HR module - - [x] Job - [x] Round - [x] Applicant - [ ] Application - [ ] Application round
non_process
unit test cases entities in hr module job round applicant application application round
0
15,347
19,511,818,631
IssuesEvent
2021-12-29 00:22:40
jonblatho/covid-19
https://api.github.com/repos/jonblatho/covid-19
closed
update to use Python 3.10
enhancement data processing
Python 3.10 has been released, and processing code should be updated to use it as soon as feasible. Numerous packages are not yet compatible with Python 3.10, though.
1.0
update to use Python 3.10 - Python 3.10 has been released, and processing code should be updated to use it as soon as feasible. Numerous packages are not yet compatible with Python 3.10, though.
process
update to use python python has been released and processing code should be updated to use it as soon as feasible numerous packages are not yet compatible with python though
1
418,662
28,122,997,206
IssuesEvent
2023-03-31 15:25:30
nramapurath/ped
https://api.github.com/repos/nramapurath/ped
opened
Inconsistent command keyword in UG
severity.Low type.DocumentationBug
In the command summary, both `kill` and `delete` have been listed as the keyword to trigger a deletion. However, I understand that `delete` is the only accepted keyword for this command. ![image.png](https://raw.githubusercontent.com/nramapurath/ped/main/files/1f9ef87c-64d6-48cc-a3e8-0401bb246465.png) <!--session: 1680276105898-1d31ae29-36a8-4c79-a45b-a2c90abf8ee2--> <!--Version: Web v3.4.7-->
1.0
Inconsistent command keyword in UG - In the command summary, both `kill` and `delete` have been listed as the keyword to trigger a deletion. However, I understand that `delete` is the only accepted keyword for this command. ![image.png](https://raw.githubusercontent.com/nramapurath/ped/main/files/1f9ef87c-64d6-48cc-a3e8-0401bb246465.png) <!--session: 1680276105898-1d31ae29-36a8-4c79-a45b-a2c90abf8ee2--> <!--Version: Web v3.4.7-->
non_process
inconsistent command keyword in ug in the command summary both kill and delete have been listed as the keyword to trigger a deletion however i understand that delete is the only accepted keyword for this command
0
9,735
12,731,649,732
IssuesEvent
2020-06-25 09:10:18
MicrosoftDocs/azure-docs
https://api.github.com/repos/MicrosoftDocs/azure-docs
closed
Minor copy editing feedback
Pri2 cxp doc-bug machine-learning/svc team-data-science-process/subsvc triaged
"Outlines" in "Next Steps" should be lowercase. "comprises of" should just be "comprises": https://www.merriam-webster.com/words-at-play/can-you-use-comprised-of-grammar --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 094e56c7-d8dc-1e80-1154-3a369d1a1c82 * Version Independent ID: 768d235a-9571-329b-1195-62a02ff1ed87 * Content: [What is the Team Data Science Process?](https://docs.microsoft.com/en-us/azure/machine-learning/team-data-science-process/overview) * Content Source: [articles/machine-learning/team-data-science-process/overview.md](https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/machine-learning/team-data-science-process/overview.md) * Service: **machine-learning** * Sub-service: **team-data-science-process** * GitHub Login: @marktab * Microsoft Alias: **tdsp**
1.0
Minor copy editing feedback - "Outlines" in "Next Steps" should be lowercase. "comprises of" should just be "comprises": https://www.merriam-webster.com/words-at-play/can-you-use-comprised-of-grammar --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 094e56c7-d8dc-1e80-1154-3a369d1a1c82 * Version Independent ID: 768d235a-9571-329b-1195-62a02ff1ed87 * Content: [What is the Team Data Science Process?](https://docs.microsoft.com/en-us/azure/machine-learning/team-data-science-process/overview) * Content Source: [articles/machine-learning/team-data-science-process/overview.md](https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/machine-learning/team-data-science-process/overview.md) * Service: **machine-learning** * Sub-service: **team-data-science-process** * GitHub Login: @marktab * Microsoft Alias: **tdsp**
process
minor copy editing feedback outlines in next steps should be lowercase comprises of should just be comprises document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service machine learning sub service team data science process github login marktab microsoft alias tdsp
1
5,930
8,753,125,022
IssuesEvent
2018-12-14 07:04:53
nodejs/node
https://api.github.com/repos/nodejs/node
opened
Investigate flaky test-child-process-exit-code on ubuntu1604_sharedlibs_debug_x64
CI / flaky test child_process test
https://ci.nodejs.org/job/node-test-commit-linux-containered/9393/nodes=ubuntu1604_sharedlibs_debug_x64/consoleText test-digitalocean-ubuntu1604_sharedlibs_container-x64-7 ```console not ok 267 parallel/test-child-process-exit-code --- duration_ms: 1.127 severity: fail exitcode: 1 stack: |- assert.js:86 throw new AssertionError(obj); ^ AssertionError [ERR_ASSERTION]: Expected values to be strictly equal: null !== 23 at ChildProcess.<anonymous> (/home/iojs/build/workspace/node-test-commit-linux-containered/test/parallel/test-child-process-exit-code.js:31:10) at ChildProcess.<anonymous> (/home/iojs/build/workspace/node-test-commit-linux-containered/test/common/index.js:335:15) at ChildProcess.emit (events.js:189:13) at Process.ChildProcess._handle.onexit (internal/child_process.js:254:12) ... ```
1.0
Investigate flaky test-child-process-exit-code on ubuntu1604_sharedlibs_debug_x64 - https://ci.nodejs.org/job/node-test-commit-linux-containered/9393/nodes=ubuntu1604_sharedlibs_debug_x64/consoleText test-digitalocean-ubuntu1604_sharedlibs_container-x64-7 ```console not ok 267 parallel/test-child-process-exit-code --- duration_ms: 1.127 severity: fail exitcode: 1 stack: |- assert.js:86 throw new AssertionError(obj); ^ AssertionError [ERR_ASSERTION]: Expected values to be strictly equal: null !== 23 at ChildProcess.<anonymous> (/home/iojs/build/workspace/node-test-commit-linux-containered/test/parallel/test-child-process-exit-code.js:31:10) at ChildProcess.<anonymous> (/home/iojs/build/workspace/node-test-commit-linux-containered/test/common/index.js:335:15) at ChildProcess.emit (events.js:189:13) at Process.ChildProcess._handle.onexit (internal/child_process.js:254:12) ... ```
process
investigate flaky test child process exit code on sharedlibs debug test digitalocean sharedlibs container console not ok parallel test child process exit code duration ms severity fail exitcode stack assert js throw new assertionerror obj assertionerror expected values to be strictly equal null at childprocess home iojs build workspace node test commit linux containered test parallel test child process exit code js at childprocess home iojs build workspace node test commit linux containered test common index js at childprocess emit events js at process childprocess handle onexit internal child process js
1
34,562
7,844,000,100
IssuesEvent
2018-06-19 08:19:15
An-Sar/PrimalCore
https://api.github.com/repos/An-Sar/PrimalCore
closed
Major lag on every new world
Code Review World Gen duplicate
Everytime I create a new world, the world generates normally, but after a few minutes it starts to lag a lot, eventually leading to Minecraft stopping responding. https://paste.ee/p/OPsLu
1.0
Major lag on every new world - Everytime I create a new world, the world generates normally, but after a few minutes it starts to lag a lot, eventually leading to Minecraft stopping responding. https://paste.ee/p/OPsLu
non_process
major lag on every new world everytime i create a new world the world generates normally but after a few minutes it starts to lag a lot eventually leading to minecraft stopping responding
0
597,566
18,166,668,009
IssuesEvent
2021-09-27 15:14:50
nunit/nunit-v2-framework-driver
https://api.github.com/repos/nunit/nunit-v2-framework-driver
closed
Exploring tests with filter returns all tests
Bug High Priority
`NUnit.Engine.ITestRunner.Explore(...)` does not obey the filter that is passed as argument to it and returns a list of all the tests within the assembly. This behaviour is inconsistent with calling `NUnit.Engine.ITestRunner.Run(...)`, which obeys the filter as expected and only executes the tests which pass the filter.
1.0
Exploring tests with filter returns all tests - `NUnit.Engine.ITestRunner.Explore(...)` does not obey the filter that is passed as argument to it and returns a list of all the tests within the assembly. This behaviour is inconsistent with calling `NUnit.Engine.ITestRunner.Run(...)`, which obeys the filter as expected and only executes the tests which pass the filter.
non_process
exploring tests with filter returns all tests nunit engine itestrunner explore does not obey the filter that is passed as argument to it and returns a list of all the tests within the assembly this behaviour is inconsistent with calling nunit engine itestrunner run which obeys the filter as expected and only executes the tests which pass the filter
0
306
2,741,467,472
IssuesEvent
2015-04-21 11:18:04
cliffparnitzky/TriathlonResultsManager
https://api.github.com/repos/cliffparnitzky/TriathlonResultsManager
closed
Replace singleStarter-freetext with input unit field type
Improvement ⚙ - Processed
Use an input unit field with options (female, male) to ensure a gender
1.0
Replace singleStarter-freetext with input unit field type - Use an input unit field with options (female, male) to ensure a gender
process
replace singlestarter freetext with input unit field type use an input unit field with options female male to ensure a gender
1
243,003
18,675,550,513
IssuesEvent
2021-10-31 13:54:32
synyx/buchungsstreber
https://api.github.com/repos/synyx/buchungsstreber
closed
Documentation: Time Granularity
documentation good first issue
The documentation for YAML/Buch Format still specifies, that the time granularity is fixed to a quarter hour. This hasn't been true for quite some time, being configurable via config file. This paragraph should be revised/rewritten.
1.0
Documentation: Time Granularity - The documentation for YAML/Buch Format still specifies, that the time granularity is fixed to a quarter hour. This hasn't been true for quite some time, being configurable via config file. This paragraph should be revised/rewritten.
non_process
documentation time granularity the documentation for yaml buch format still specifies that the time granularity is fixed to a quarter hour this hasn t been true for quite some time being configurable via config file this paragraph should be revised rewritten
0
9,505
12,493,928,151
IssuesEvent
2020-06-01 10:11:07
shresthaprince/InfoSystems
https://api.github.com/repos/shresthaprince/InfoSystems
closed
Outbound Calls
process
Using user stories for above process, make use cases, activity diagrams, class diagrams and collaboration diagrams to illustrate it.
1.0
Outbound Calls - Using user stories for above process, make use cases, activity diagrams, class diagrams and collaboration diagrams to illustrate it.
process
outbound calls using user stories for above process make use cases activity diagrams class diagrams and collaboration diagrams to illustrate it
1
215,260
24,156,874,587
IssuesEvent
2022-09-22 08:26:53
elastic/kibana
https://api.github.com/repos/elastic/kibana
opened
[Security Solution] Not able to create a custom query rule after editing one of the creation steps
bug triage_needed Team: SecuritySolution
**Describe the bug:** - The user is not able to create a custom query rule if during the creation process, once it is placed in the `Actions` section, modifies one of the previous steps. **Kibana/Elasticsearch Stack version:** **Steps to reproduce:** 1. Navigate to the rule creation flow `app/security/rules/create` 2. Enter a valid custom query, i.e: `*:*` 3. Click on `Continue` 4. Fill Name and description fields 5. Click on `Continue` 6. Click on `Continue` for the Schedule rule section 7. Click on `Edit` option of any of the previous steps, i.e. about rule 8. Click again on the `Continue` buttons until you arrive again to the `Rule actions section` 9. Click on `Create & Enable rule` **Current behavior:** <img width="1257" alt="Screenshot 2022-09-22 at 10 22 41" src="https://user-images.githubusercontent.com/17427073/191696391-cc09df18-794c-469f-b8a1-b6f7fc57b4de.png"> **Expected behavior:** - The rule should be properly created - No actions should be displayed on the actions section **Additional information:** <details> <summary>Error message</summary> ``` Failed to add Rule [request body]: Invalid value "" supplied to "response_actions,action_type_id",Invalid value "undefined" supplied to "response_actions,params" (400) ``` </details> <details> <summary>Full eror message</summary> ``` Failed to add Rule [request body]: Invalid value "" supplied to "response_actions,action_type_id",Invalid value "undefined" supplied to "response_actions,params" (400) { "name": "Error", "body": { "statusCode": 400, "error": "Bad Request", "message": "[request body]: Invalid value \"\" supplied to \"response_actions,action_type_id\",Invalid value \"undefined\" supplied to \"response_actions,params\"" }, "message": "Bad Request", "stack": "Error: Bad Request\n at Fetch._callee3$ (http://localhost:5601/9007199254740991/bundles/core/core.entry.js:11729:23)\n at tryCatch (http://localhost:5601/9007199254740991/bundles/kbn-ui-shared-deps-npm/kbn-ui-shared-deps-npm.dll.js:216709:17)\n at Generator._invoke (http://localhost:5601/9007199254740991/bundles/kbn-ui-shared-deps-npm/kbn-ui-shared-deps-npm.dll.js:216689:24)\n at Generator.next (http://localhost:5601/9007199254740991/bundles/kbn-ui-shared-deps-npm/kbn-ui-shared-deps-npm.dll.js:216740:21)\n at asyncGeneratorStep (http://localhost:5601/9007199254740991/bundles/kbn-ui-shared-deps-npm/kbn-ui-shared-deps-npm.dll.js:77323:24)\n at _next (http://localhost:5601/9007199254740991/bundles/kbn-ui-shared-deps-npm/kbn-ui-shared-deps-npm.dll.js:77345:9)" } ``` </details>
True
[Security Solution] Not able to create a custom query rule after editing one of the creation steps - **Describe the bug:** - The user is not able to create a custom query rule if during the creation process, once it is placed in the `Actions` section, modifies one of the previous steps. **Kibana/Elasticsearch Stack version:** **Steps to reproduce:** 1. Navigate to the rule creation flow `app/security/rules/create` 2. Enter a valid custom query, i.e: `*:*` 3. Click on `Continue` 4. Fill Name and description fields 5. Click on `Continue` 6. Click on `Continue` for the Schedule rule section 7. Click on `Edit` option of any of the previous steps, i.e. about rule 8. Click again on the `Continue` buttons until you arrive again to the `Rule actions section` 9. Click on `Create & Enable rule` **Current behavior:** <img width="1257" alt="Screenshot 2022-09-22 at 10 22 41" src="https://user-images.githubusercontent.com/17427073/191696391-cc09df18-794c-469f-b8a1-b6f7fc57b4de.png"> **Expected behavior:** - The rule should be properly created - No actions should be displayed on the actions section **Additional information:** <details> <summary>Error message</summary> ``` Failed to add Rule [request body]: Invalid value "" supplied to "response_actions,action_type_id",Invalid value "undefined" supplied to "response_actions,params" (400) ``` </details> <details> <summary>Full eror message</summary> ``` Failed to add Rule [request body]: Invalid value "" supplied to "response_actions,action_type_id",Invalid value "undefined" supplied to "response_actions,params" (400) { "name": "Error", "body": { "statusCode": 400, "error": "Bad Request", "message": "[request body]: Invalid value \"\" supplied to \"response_actions,action_type_id\",Invalid value \"undefined\" supplied to \"response_actions,params\"" }, "message": "Bad Request", "stack": "Error: Bad Request\n at Fetch._callee3$ (http://localhost:5601/9007199254740991/bundles/core/core.entry.js:11729:23)\n at tryCatch (http://localhost:5601/9007199254740991/bundles/kbn-ui-shared-deps-npm/kbn-ui-shared-deps-npm.dll.js:216709:17)\n at Generator._invoke (http://localhost:5601/9007199254740991/bundles/kbn-ui-shared-deps-npm/kbn-ui-shared-deps-npm.dll.js:216689:24)\n at Generator.next (http://localhost:5601/9007199254740991/bundles/kbn-ui-shared-deps-npm/kbn-ui-shared-deps-npm.dll.js:216740:21)\n at asyncGeneratorStep (http://localhost:5601/9007199254740991/bundles/kbn-ui-shared-deps-npm/kbn-ui-shared-deps-npm.dll.js:77323:24)\n at _next (http://localhost:5601/9007199254740991/bundles/kbn-ui-shared-deps-npm/kbn-ui-shared-deps-npm.dll.js:77345:9)" } ``` </details>
non_process
not able to create a custom query rule after editing one of the creation steps describe the bug the user is not able to create a custom query rule if during the creation process once it is placed in the actions section modifies one of the previous steps kibana elasticsearch stack version steps to reproduce navigate to the rule creation flow app security rules create enter a valid custom query i e click on continue fill name and description fields click on continue click on continue for the schedule rule section click on edit option of any of the previous steps i e about rule click again on the continue buttons until you arrive again to the rule actions section click on create enable rule current behavior img width alt screenshot at src expected behavior the rule should be properly created no actions should be displayed on the actions section additional information error message failed to add rule invalid value supplied to response actions action type id invalid value undefined supplied to response actions params full eror message failed to add rule invalid value supplied to response actions action type id invalid value undefined supplied to response actions params name error body statuscode error bad request message invalid value supplied to response actions action type id invalid value undefined supplied to response actions params message bad request stack error bad request n at fetch at trycatch at generator invoke at generator next at asyncgeneratorstep at next
0
95,451
27,511,586,016
IssuesEvent
2023-03-06 09:12:16
expo/eas-cli
https://api.github.com/repos/expo/eas-cli
closed
BUG! exception in phase 'semantic analysis' in source unit '_BuildScript_' Unsupported class file major version 63
incomplete issue: missing or invalid repro eas build invalid issue: project specific issue
### Build/Submit details page URL _No response_ ### Summary When i try to build with eas locally i have the error written below. I've install the latest version of Gradle (8.0.1) and Java (19.0.2) but the error persist MacOS with M2 ### Managed or bare? Managed ### Environment WARNING: The legacy expo-cli does not support Node +17. Migrate to the versioned Expo CLI (npx expo). 🎉 Didn't find any issues with the project! ### Error output `[PREPARE_CREDENTIALS] Writing secrets to the project's directory [PREPARE_CREDENTIALS] Injecting signing config into build.gradle [RUN_GRADLEW] Running 'gradlew :app:assembleRelease' in /var/folders/mj/1cc59x_d49sgz9h8j9w50xnm0000gn/T/eas-build-local-nodejs/6c09b46b-4fd5-4068-86f2-9382c13e0ac0/build/android [RUN_GRADLEW] FAILURE: Build failed with an exception. [RUN_GRADLEW] * What went wrong: [RUN_GRADLEW] Could not open settings generic class cache for settings file '/private/var/folders/mj/1cc59x_d49sgz9h8j9w50xnm0000gn/T/eas-build-local-nodejs/6c09b46b-4fd5-4068-86f2-9382c13e0ac0/build/android/settings.gradle' (/Users/mattiarainieri/.gradle/caches/7.5.1/scripts/4m73o5hy0wcdtw9kuua74cbm8). [RUN_GRADLEW] > [RUN_GRADLEW] BUG! exception in phase 'semantic analysis' in source unit '_BuildScript_' Unsupported class file major version 63 [RUN_GRADLEW] * Try: [RUN_GRADLEW] > Run with [RUN_GRADLEW] --stacktrace option to get the stack trace. [RUN_GRADLEW] > Run with --info or --debug option to get more log output. [RUN_GRADLEW] > [RUN_GRADLEW] Run with --scan to get full insights. [RUN_GRADLEW] * Get more help at https://help.gradle.org [RUN_GRADLEW] BUILD FAILED in 351ms [RUN_GRADLEW] Error: Gradle build failed with unknown error. See logs for the "Run gradlew" phase for more information. Build failed Gradle build failed with unknown error. See logs for the "Run gradlew" phase for more information.` ### Reproducible demo or steps to reproduce from a blank project **eas.json file** `{ "cli": { "version": ">= 3.6.2" }, "build": { "development": { "developmentClient": true, "distribution": "internal", "ios": { "resourceClass": "m1-medium" } }, "preview": { "android": { "buildType": "apk" }, "distribution": "internal", "ios": { "resourceClass": "m1-medium" } }, "production": { "ios": { "resourceClass": "m1-medium" } } }, "submit": { "production": {} } } `
1.0
BUG! exception in phase 'semantic analysis' in source unit '_BuildScript_' Unsupported class file major version 63 - ### Build/Submit details page URL _No response_ ### Summary When i try to build with eas locally i have the error written below. I've install the latest version of Gradle (8.0.1) and Java (19.0.2) but the error persist MacOS with M2 ### Managed or bare? Managed ### Environment WARNING: The legacy expo-cli does not support Node +17. Migrate to the versioned Expo CLI (npx expo). 🎉 Didn't find any issues with the project! ### Error output `[PREPARE_CREDENTIALS] Writing secrets to the project's directory [PREPARE_CREDENTIALS] Injecting signing config into build.gradle [RUN_GRADLEW] Running 'gradlew :app:assembleRelease' in /var/folders/mj/1cc59x_d49sgz9h8j9w50xnm0000gn/T/eas-build-local-nodejs/6c09b46b-4fd5-4068-86f2-9382c13e0ac0/build/android [RUN_GRADLEW] FAILURE: Build failed with an exception. [RUN_GRADLEW] * What went wrong: [RUN_GRADLEW] Could not open settings generic class cache for settings file '/private/var/folders/mj/1cc59x_d49sgz9h8j9w50xnm0000gn/T/eas-build-local-nodejs/6c09b46b-4fd5-4068-86f2-9382c13e0ac0/build/android/settings.gradle' (/Users/mattiarainieri/.gradle/caches/7.5.1/scripts/4m73o5hy0wcdtw9kuua74cbm8). [RUN_GRADLEW] > [RUN_GRADLEW] BUG! exception in phase 'semantic analysis' in source unit '_BuildScript_' Unsupported class file major version 63 [RUN_GRADLEW] * Try: [RUN_GRADLEW] > Run with [RUN_GRADLEW] --stacktrace option to get the stack trace. [RUN_GRADLEW] > Run with --info or --debug option to get more log output. [RUN_GRADLEW] > [RUN_GRADLEW] Run with --scan to get full insights. [RUN_GRADLEW] * Get more help at https://help.gradle.org [RUN_GRADLEW] BUILD FAILED in 351ms [RUN_GRADLEW] Error: Gradle build failed with unknown error. See logs for the "Run gradlew" phase for more information. Build failed Gradle build failed with unknown error. See logs for the "Run gradlew" phase for more information.` ### Reproducible demo or steps to reproduce from a blank project **eas.json file** `{ "cli": { "version": ">= 3.6.2" }, "build": { "development": { "developmentClient": true, "distribution": "internal", "ios": { "resourceClass": "m1-medium" } }, "preview": { "android": { "buildType": "apk" }, "distribution": "internal", "ios": { "resourceClass": "m1-medium" } }, "production": { "ios": { "resourceClass": "m1-medium" } } }, "submit": { "production": {} } } `
non_process
bug exception in phase semantic analysis in source unit buildscript unsupported class file major version build submit details page url no response summary when i try to build with eas locally i have the error written below i ve install the latest version of gradle and java but the error persist macos with managed or bare managed environment warning the legacy expo cli does not support node migrate to the versioned expo cli npx expo 🎉 didn t find any issues with the project error output writing secrets to the project s directory injecting signing config into build gradle running gradlew app assemblerelease in var folders mj t eas build local nodejs build android failure build failed with an exception what went wrong could not open settings generic class cache for settings file private var folders mj t eas build local nodejs build android settings gradle users mattiarainieri gradle caches scripts bug exception in phase semantic analysis in source unit buildscript unsupported class file major version try run with stacktrace option to get the stack trace run with info or debug option to get more log output run with scan to get full insights get more help at build failed in error gradle build failed with unknown error see logs for the run gradlew phase for more information build failed gradle build failed with unknown error see logs for the run gradlew phase for more information reproducible demo or steps to reproduce from a blank project eas json file cli version build development developmentclient true distribution internal ios resourceclass medium preview android buildtype apk distribution internal ios resourceclass medium production ios resourceclass medium submit production
0
129,037
5,085,120,951
IssuesEvent
2016-12-30 09:54:56
lyuich/brain
https://api.github.com/repos/lyuich/brain
opened
API GatewayでサーバレスなAPIのバックエンドを作る
aws category: engineer priority: normal status: new type: feature
### Write a summary of this issue - マネージドサービスだけでスケーラブルなアプリケーションバックエンドを作る ### List the details of what to do as a list - [ ] Amazon API Gatewayとは - [ ] デモシステムの構築 - [ ] API Gatewayの高度な機能 - [ ] 有用なツール ### Any references (Describe URL of documents or webpages) - なし
1.0
API GatewayでサーバレスなAPIのバックエンドを作る - ### Write a summary of this issue - マネージドサービスだけでスケーラブルなアプリケーションバックエンドを作る ### List the details of what to do as a list - [ ] Amazon API Gatewayとは - [ ] デモシステムの構築 - [ ] API Gatewayの高度な機能 - [ ] 有用なツール ### Any references (Describe URL of documents or webpages) - なし
non_process
api gatewayでサーバレスなapiのバックエンドを作る write a summary of this issue マネージドサービスだけでスケーラブルなアプリケーションバックエンドを作る list the details of what to do as a list amazon api gatewayとは デモシステムの構築 api gatewayの高度な機能 有用なツール any references describe url of documents or webpages なし
0
1,347
3,907,936,514
IssuesEvent
2016-04-19 14:27:13
dita-ot/dita-ot
https://api.github.com/repos/dita-ot/dita-ot
closed
Toolkit Log Should List Referenced But Undefined Keys
enhancement P2 preprocess/keyref
Although there might be situations where it is expected that some keyrefs point to keys that are not defined, it is probably far more frequent that content creators need to know about undefined key references. They can't rely on authoring tools to provide this information because the authoring scope could easily differ from that of publication. Some closed issues suggest that in the past, the toolkit logged such instances, but recent versions (1.8, 2.0) do not. It might be desirable to have the option to suppress such logging by a build parameter or to log these as information-only rather than warnings.
1.0
Toolkit Log Should List Referenced But Undefined Keys - Although there might be situations where it is expected that some keyrefs point to keys that are not defined, it is probably far more frequent that content creators need to know about undefined key references. They can't rely on authoring tools to provide this information because the authoring scope could easily differ from that of publication. Some closed issues suggest that in the past, the toolkit logged such instances, but recent versions (1.8, 2.0) do not. It might be desirable to have the option to suppress such logging by a build parameter or to log these as information-only rather than warnings.
process
toolkit log should list referenced but undefined keys although there might be situations where it is expected that some keyrefs point to keys that are not defined it is probably far more frequent that content creators need to know about undefined key references they can t rely on authoring tools to provide this information because the authoring scope could easily differ from that of publication some closed issues suggest that in the past the toolkit logged such instances but recent versions do not it might be desirable to have the option to suppress such logging by a build parameter or to log these as information only rather than warnings
1
6,717
9,769,361,388
IssuesEvent
2019-06-06 08:22:38
RRZE-Webteam/rrze-xliff
https://api.github.com/repos/RRZE-Webteam/rrze-xliff
closed
Export/Import - Berücksichtigung von Medien
Requirement
Bei einigen Webseiten werden Galerien oder Bilder im Content, aber auch als Artikelbild eingebunden. Diese Bilder verfügen ebenfalls über Title und Captiontexte. Es ist ein Verfahren zu ermöglichen, daß diese Bilder ebenfalls bei der Übersetzung (XLIFF-Export und Import) mit berücksichtigt werden.
1.0
Export/Import - Berücksichtigung von Medien - Bei einigen Webseiten werden Galerien oder Bilder im Content, aber auch als Artikelbild eingebunden. Diese Bilder verfügen ebenfalls über Title und Captiontexte. Es ist ein Verfahren zu ermöglichen, daß diese Bilder ebenfalls bei der Übersetzung (XLIFF-Export und Import) mit berücksichtigt werden.
non_process
export import berücksichtigung von medien bei einigen webseiten werden galerien oder bilder im content aber auch als artikelbild eingebunden diese bilder verfügen ebenfalls über title und captiontexte es ist ein verfahren zu ermöglichen daß diese bilder ebenfalls bei der übersetzung xliff export und import mit berücksichtigt werden
0
18,783
13,102,957,718
IssuesEvent
2020-08-04 07:43:16
SonarSource/sonarlint-visualstudio
https://api.github.com/repos/SonarSource/sonarlint-visualstudio
opened
Automate post-release pipeline
Infrastructure Type: Improvement Type: Task
### Description Automate some manual steps that we do after each release: [ ] Bump version script [ ] Asm ref update [ ] PR for the changes
1.0
Automate post-release pipeline - ### Description Automate some manual steps that we do after each release: [ ] Bump version script [ ] Asm ref update [ ] PR for the changes
non_process
automate post release pipeline description automate some manual steps that we do after each release bump version script asm ref update pr for the changes
0
41,931
5,409,929,330
IssuesEvent
2017-03-01 06:40:54
Promact/trappist
https://api.github.com/repos/Promact/trappist
closed
Test Duplicate
Test Creation and Management
while duplicating the test, the name of the test would be different or same..? as mentioned in SRS that Test names should be Unique...but duplicating means the copy of particular test.
1.0
Test Duplicate - while duplicating the test, the name of the test would be different or same..? as mentioned in SRS that Test names should be Unique...but duplicating means the copy of particular test.
non_process
test duplicate while duplicating the test the name of the test would be different or same as mentioned in srs that test names should be unique but duplicating means the copy of particular test
0
18,703
24,598,377,017
IssuesEvent
2022-10-14 10:15:06
GoogleCloudPlatform/fda-mystudies
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
closed
[Consent API] [Mobile apps] [Standalone app] Consent API Enabled > Getting 'No data available' in the data sharing image screen in the following scenario
Bug P0 iOS Android Process: Fixed Process: Tested dev
**Steps:** 1. Sign in / Sign up 2. Enroll to the study 3. Leave the study / Delete the app account 4. Again , create account with same email 5. Enroll to the study 6. Navigate to 'Resources' screen 7. Click on 'Data sharing image' and Verify **AR:** Getting 'No data available' in the data sharing image screen in the following scenario **ER:** 'Data sharing image' should get displayed **Note:** Issue is observed in the consent pdf also ![image](https://user-images.githubusercontent.com/86007179/180159092-9f092e73-168c-4e48-a0d6-9b12015ab345.png)
2.0
[Consent API] [Mobile apps] [Standalone app] Consent API Enabled > Getting 'No data available' in the data sharing image screen in the following scenario - **Steps:** 1. Sign in / Sign up 2. Enroll to the study 3. Leave the study / Delete the app account 4. Again , create account with same email 5. Enroll to the study 6. Navigate to 'Resources' screen 7. Click on 'Data sharing image' and Verify **AR:** Getting 'No data available' in the data sharing image screen in the following scenario **ER:** 'Data sharing image' should get displayed **Note:** Issue is observed in the consent pdf also ![image](https://user-images.githubusercontent.com/86007179/180159092-9f092e73-168c-4e48-a0d6-9b12015ab345.png)
process
consent api enabled getting no data available in the data sharing image screen in the following scenario steps sign in sign up enroll to the study leave the study delete the app account again create account with same email enroll to the study navigate to resources screen click on data sharing image and verify ar getting no data available in the data sharing image screen in the following scenario er data sharing image should get displayed note issue is observed in the consent pdf also
1
7,178
10,319,455,470
IssuesEvent
2019-08-30 17:34:02
googlemaps/google-maps-services-python
https://api.github.com/repos/googlemaps/google-maps-services-python
closed
automate test of distribution
priority: p2 type: process
A simple test of the distribution will help prevent issues such as #307.
1.0
automate test of distribution - A simple test of the distribution will help prevent issues such as #307.
process
automate test of distribution a simple test of the distribution will help prevent issues such as
1
379,765
26,385,777,410
IssuesEvent
2023-01-12 12:10:12
PBWDeformations/PBWDeformations.jl
https://api.github.com/repos/PBWDeformations/PBWDeformations.jl
closed
add docs entry for functions used in tutorial notebook
documentation
In GitLab by @johannesflake on Jan 13, 2022, 10:12
1.0
add docs entry for functions used in tutorial notebook - In GitLab by @johannesflake on Jan 13, 2022, 10:12
non_process
add docs entry for functions used in tutorial notebook in gitlab by johannesflake on jan
0
2,010
4,834,086,741
IssuesEvent
2016-11-08 13:18:07
AllenFang/react-bootstrap-table
https://api.github.com/repos/AllenFang/react-bootstrap-table
opened
Issue with giving the paginationShowsTotal as function when table is empty
bug inprocess
https://github.com/AllenFang/react-bootstrap-table/issues/719#issuecomment-258975952 self-explanatory
1.0
Issue with giving the paginationShowsTotal as function when table is empty - https://github.com/AllenFang/react-bootstrap-table/issues/719#issuecomment-258975952 self-explanatory
process
issue with giving the paginationshowstotal as function when table is empty self explanatory
1
12,005
14,738,162,537
IssuesEvent
2021-01-07 03:56:43
kdjstudios/SABillingGitlab
https://api.github.com/repos/kdjstudios/SABillingGitlab
closed
Billing Cycles Process - Improvments and Fixes
anc-process anp-not prioritized ant-enhancement
In GitLab by @kdjstudios on May 11, 2018, 12:40 Hello Team, This is going to be a list of all the concerns and improvements to insure an easy to use and functional Billing cycle Process - Add staged fees to accounts that will get applied to the same billing cycle. (This is due to ops now having to create the next billing cycle after finalizing in order to create manual invoices) #890 - Add the ability to process a billing cycle with no usage file. This is due to the increased amount of billing cycles that do not need a usage file. #893 - Add the step between create next cycle and upload to allow for manual invoice creation without confusion. - Reverting/Deleting a cycle concerns - When performing a revert, Will this revert manually created invoices or just one created by the upload? - Is the delete functionality shown to what permission level? - Does this delete manually created invoices or just ones created by the upload?
1.0
Billing Cycles Process - Improvments and Fixes - In GitLab by @kdjstudios on May 11, 2018, 12:40 Hello Team, This is going to be a list of all the concerns and improvements to insure an easy to use and functional Billing cycle Process - Add staged fees to accounts that will get applied to the same billing cycle. (This is due to ops now having to create the next billing cycle after finalizing in order to create manual invoices) #890 - Add the ability to process a billing cycle with no usage file. This is due to the increased amount of billing cycles that do not need a usage file. #893 - Add the step between create next cycle and upload to allow for manual invoice creation without confusion. - Reverting/Deleting a cycle concerns - When performing a revert, Will this revert manually created invoices or just one created by the upload? - Is the delete functionality shown to what permission level? - Does this delete manually created invoices or just ones created by the upload?
process
billing cycles process improvments and fixes in gitlab by kdjstudios on may hello team this is going to be a list of all the concerns and improvements to insure an easy to use and functional billing cycle process add staged fees to accounts that will get applied to the same billing cycle this is due to ops now having to create the next billing cycle after finalizing in order to create manual invoices add the ability to process a billing cycle with no usage file this is due to the increased amount of billing cycles that do not need a usage file add the step between create next cycle and upload to allow for manual invoice creation without confusion reverting deleting a cycle concerns when performing a revert will this revert manually created invoices or just one created by the upload is the delete functionality shown to what permission level does this delete manually created invoices or just ones created by the upload
1
12,921
15,294,981,648
IssuesEvent
2021-02-24 03:43:45
topcoder-platform/community-app
https://api.github.com/repos/topcoder-platform/community-app
closed
Highlighting Matched Skills: Clarification
ShapeupProcess challenge- recommender-tool question
@Oanh-and-only-Oanh , The Open for registration list shows the copilot entered tags below the challenge. When the recommended challenges toggle is on, the matched skills are displayed. these are extracted skills from the spec, that also match the user's skills. Is this behaviour fine, where the user sees additional skills to the challenge once recommended toggle is on? <img width="983" alt="Screenshot 2021-02-19 at 6 04 51 PM" src="https://user-images.githubusercontent.com/58783823/108505443-6b643400-72dd-11eb-8b45-7319a630de91.png"> <img width="978" alt="Screenshot 2021-02-19 at 6 05 46 PM" src="https://user-images.githubusercontent.com/58783823/108505445-6c956100-72dd-11eb-8b79-30d4c9d8dda9.png">
1.0
Highlighting Matched Skills: Clarification - @Oanh-and-only-Oanh , The Open for registration list shows the copilot entered tags below the challenge. When the recommended challenges toggle is on, the matched skills are displayed. these are extracted skills from the spec, that also match the user's skills. Is this behaviour fine, where the user sees additional skills to the challenge once recommended toggle is on? <img width="983" alt="Screenshot 2021-02-19 at 6 04 51 PM" src="https://user-images.githubusercontent.com/58783823/108505443-6b643400-72dd-11eb-8b45-7319a630de91.png"> <img width="978" alt="Screenshot 2021-02-19 at 6 05 46 PM" src="https://user-images.githubusercontent.com/58783823/108505445-6c956100-72dd-11eb-8b79-30d4c9d8dda9.png">
process
highlighting matched skills clarification oanh and only oanh the open for registration list shows the copilot entered tags below the challenge when the recommended challenges toggle is on the matched skills are displayed these are extracted skills from the spec that also match the user s skills is this behaviour fine where the user sees additional skills to the challenge once recommended toggle is on img width alt screenshot at pm src img width alt screenshot at pm src
1
60,074
14,518,871,282
IssuesEvent
2020-12-14 01:11:04
jgeraigery/datasite
https://api.github.com/repos/jgeraigery/datasite
opened
CVE-2020-7788 (High) detected in ini-1.3.4.tgz
security vulnerability
## CVE-2020-7788 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ini-1.3.4.tgz</b></p></summary> <p>An ini encoder/decoder for node</p> <p>Library home page: <a href="https://registry.npmjs.org/ini/-/ini-1.3.4.tgz">https://registry.npmjs.org/ini/-/ini-1.3.4.tgz</a></p> <p>Path to dependency file: datasite/package.json</p> <p>Path to vulnerable library: datasite/node_modules/bower/lib/node_modules/ini/package.json</p> <p> Dependency Hierarchy: - protractor-5.4.1.tgz (Root Library) - webdriver-manager-12.1.0.tgz - :x: **ini-1.3.4.tgz** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> This affects the package ini before 1.3.6. If an attacker submits a malicious INI file to an application that parses it with ini.parse, they will pollute the prototype on the application. This can be exploited further depending on the context. <p>Publish Date: 2020-12-11 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7788>CVE-2020-7788</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7788">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7788</a></p> <p>Release Date: 2020-12-11</p> <p>Fix Resolution: v1.3.6</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"ini","packageVersion":"1.3.4","isTransitiveDependency":true,"dependencyTree":"protractor:5.4.1;webdriver-manager:12.1.0;ini:1.3.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"v1.3.6"}],"vulnerabilityIdentifier":"CVE-2020-7788","vulnerabilityDetails":"This affects the package ini before 1.3.6. If an attacker submits a malicious INI file to an application that parses it with ini.parse, they will pollute the prototype on the application. This can be exploited further depending on the context.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7788","cvss3Severity":"high","cvss3Score":"7.3","cvss3Metrics":{"A":"Low","AC":"Low","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
True
CVE-2020-7788 (High) detected in ini-1.3.4.tgz - ## CVE-2020-7788 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ini-1.3.4.tgz</b></p></summary> <p>An ini encoder/decoder for node</p> <p>Library home page: <a href="https://registry.npmjs.org/ini/-/ini-1.3.4.tgz">https://registry.npmjs.org/ini/-/ini-1.3.4.tgz</a></p> <p>Path to dependency file: datasite/package.json</p> <p>Path to vulnerable library: datasite/node_modules/bower/lib/node_modules/ini/package.json</p> <p> Dependency Hierarchy: - protractor-5.4.1.tgz (Root Library) - webdriver-manager-12.1.0.tgz - :x: **ini-1.3.4.tgz** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> This affects the package ini before 1.3.6. If an attacker submits a malicious INI file to an application that parses it with ini.parse, they will pollute the prototype on the application. This can be exploited further depending on the context. <p>Publish Date: 2020-12-11 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7788>CVE-2020-7788</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7788">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7788</a></p> <p>Release Date: 2020-12-11</p> <p>Fix Resolution: v1.3.6</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"ini","packageVersion":"1.3.4","isTransitiveDependency":true,"dependencyTree":"protractor:5.4.1;webdriver-manager:12.1.0;ini:1.3.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"v1.3.6"}],"vulnerabilityIdentifier":"CVE-2020-7788","vulnerabilityDetails":"This affects the package ini before 1.3.6. If an attacker submits a malicious INI file to an application that parses it with ini.parse, they will pollute the prototype on the application. This can be exploited further depending on the context.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7788","cvss3Severity":"high","cvss3Score":"7.3","cvss3Metrics":{"A":"Low","AC":"Low","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
non_process
cve high detected in ini tgz cve high severity vulnerability vulnerable library ini tgz an ini encoder decoder for node library home page a href path to dependency file datasite package json path to vulnerable library datasite node modules bower lib node modules ini package json dependency hierarchy protractor tgz root library webdriver manager tgz x ini tgz vulnerable library found in base branch master vulnerability details this affects the package ini before if an attacker submits a malicious ini file to an application that parses it with ini parse they will pollute the prototype on the application this can be exploited further depending on the context publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails this affects the package ini before if an attacker submits a malicious ini file to an application that parses it with ini parse they will pollute the prototype on the application this can be exploited further depending on the context vulnerabilityurl
0
14,139
17,031,314,773
IssuesEvent
2021-07-04 16:14:17
ankidroid/Anki-Android
https://api.github.com/repos/ankidroid/Anki-Android
closed
Regression tests: 2.15.3 crash on startup
Help Wanted Priority-High Test process
2.15.4 is rolling out now and is already available here @burbilog please confirm it works for you? It should - if not there is something else going on: https://github.com/ankidroid/Anki-Android/releases/tag/v2.15.4 Action plan for 2.15.5 on this one: - There must be a test that fails on 2.15.3 (and it should fail on master as well?) via starting the study options fragment from deck picker via the review counts (to prove this can't happen again - this has been problematic in the past) - then fix the navigation drawer preference PR on master and enable the test - merge that fix PR - cherry-pick the navigation drawer gesture preference back to release-2.15 and the fix PR for it as well Sound about right @david-allison-1 ? _Originally posted by @mikehardy in https://github.com/ankidroid/Anki-Android/issues/9052#issuecomment-853378139_
1.0
Regression tests: 2.15.3 crash on startup - 2.15.4 is rolling out now and is already available here @burbilog please confirm it works for you? It should - if not there is something else going on: https://github.com/ankidroid/Anki-Android/releases/tag/v2.15.4 Action plan for 2.15.5 on this one: - There must be a test that fails on 2.15.3 (and it should fail on master as well?) via starting the study options fragment from deck picker via the review counts (to prove this can't happen again - this has been problematic in the past) - then fix the navigation drawer preference PR on master and enable the test - merge that fix PR - cherry-pick the navigation drawer gesture preference back to release-2.15 and the fix PR for it as well Sound about right @david-allison-1 ? _Originally posted by @mikehardy in https://github.com/ankidroid/Anki-Android/issues/9052#issuecomment-853378139_
process
regression tests crash on startup is rolling out now and is already available here burbilog please confirm it works for you it should if not there is something else going on action plan for on this one there must be a test that fails on and it should fail on master as well via starting the study options fragment from deck picker via the review counts to prove this can t happen again this has been problematic in the past then fix the navigation drawer preference pr on master and enable the test merge that fix pr cherry pick the navigation drawer gesture preference back to release and the fix pr for it as well sound about right david allison originally posted by mikehardy in
1
84,350
10,519,505,728
IssuesEvent
2019-09-29 18:27:38
pulumi/pulumi
https://api.github.com/repos/pulumi/pulumi
closed
Add a `transform` escape hatch callback to Component options
kind/design
The Kubernetes provider exposes an optional `transform` callback on some components, which allows the set of resources managed by the component to be transformed by the caller before being created. This is a generally useful capability, to allow an escape hatch for cases where a component author did not expose a desired capability on some underlying resources it creates, but the caller wants to break the black box abstraction of the component and work around this at least until the component author exposes first class support. This could be offered as a general feature for all Pulumi components by adding an `transform` callback on ComponentResourceOptions that gets a chance to modify the resource inputs of any resource prior to them being sent to the Pulumi engine. As we do elsewhere, all parents up the parent chain would be consulted, potentially doing transformations at each layer up the hierarchy. The caller would need to switch on `type` and potentially `name` to try to find the child resource they care about - which would be brittle when there are changes in the component, but not too much so given this would be intentionally breaking the black box abstraction. This may require resolving the inputs to ground values (not Outputs) prior to handing them to the callback, so the callback would receive `Unwrap<Args>` instead of `Args` as a parameter (though it could not be strongly typed in general since the callback would need to handle all child resources).
1.0
Add a `transform` escape hatch callback to Component options - The Kubernetes provider exposes an optional `transform` callback on some components, which allows the set of resources managed by the component to be transformed by the caller before being created. This is a generally useful capability, to allow an escape hatch for cases where a component author did not expose a desired capability on some underlying resources it creates, but the caller wants to break the black box abstraction of the component and work around this at least until the component author exposes first class support. This could be offered as a general feature for all Pulumi components by adding an `transform` callback on ComponentResourceOptions that gets a chance to modify the resource inputs of any resource prior to them being sent to the Pulumi engine. As we do elsewhere, all parents up the parent chain would be consulted, potentially doing transformations at each layer up the hierarchy. The caller would need to switch on `type` and potentially `name` to try to find the child resource they care about - which would be brittle when there are changes in the component, but not too much so given this would be intentionally breaking the black box abstraction. This may require resolving the inputs to ground values (not Outputs) prior to handing them to the callback, so the callback would receive `Unwrap<Args>` instead of `Args` as a parameter (though it could not be strongly typed in general since the callback would need to handle all child resources).
non_process
add a transform escape hatch callback to component options the kubernetes provider exposes an optional transform callback on some components which allows the set of resources managed by the component to be transformed by the caller before being created this is a generally useful capability to allow an escape hatch for cases where a component author did not expose a desired capability on some underlying resources it creates but the caller wants to break the black box abstraction of the component and work around this at least until the component author exposes first class support this could be offered as a general feature for all pulumi components by adding an transform callback on componentresourceoptions that gets a chance to modify the resource inputs of any resource prior to them being sent to the pulumi engine as we do elsewhere all parents up the parent chain would be consulted potentially doing transformations at each layer up the hierarchy the caller would need to switch on type and potentially name to try to find the child resource they care about which would be brittle when there are changes in the component but not too much so given this would be intentionally breaking the black box abstraction this may require resolving the inputs to ground values not outputs prior to handing them to the callback so the callback would receive unwrap instead of args as a parameter though it could not be strongly typed in general since the callback would need to handle all child resources
0
16,298
20,947,742,330
IssuesEvent
2022-03-26 05:28:58
lynnandtonic/nestflix.fun
https://api.github.com/repos/lynnandtonic/nestflix.fun
closed
Add Big Shot
suggested title in process
Please add as much of the following info as you can: Title: Big Shot Type (film/tv show): Both Film or show in which it appears: Cowboy Bebop Is the parent film/show streaming anywhere? About when in the parent film/show does it appear? Multiple times Actual footage of the film/show can be seen (yes/no)? Yes https://cowboybebop.fandom.com/wiki/Big_Shot#articleComments
1.0
Add Big Shot - Please add as much of the following info as you can: Title: Big Shot Type (film/tv show): Both Film or show in which it appears: Cowboy Bebop Is the parent film/show streaming anywhere? About when in the parent film/show does it appear? Multiple times Actual footage of the film/show can be seen (yes/no)? Yes https://cowboybebop.fandom.com/wiki/Big_Shot#articleComments
process
add big shot please add as much of the following info as you can title big shot type film tv show both film or show in which it appears cowboy bebop is the parent film show streaming anywhere about when in the parent film show does it appear multiple times actual footage of the film show can be seen yes no yes
1
7,092
10,239,466,172
IssuesEvent
2019-08-19 18:19:11
RIOT-OS/RIOT
https://api.github.com/repos/RIOT-OS/RIOT
closed
core: API: RTC interface should not use struct tm
Discussion: RFC Process: API change State: stale
While collecting implementation ideas for a new timer subsystem, I stumbled about the fact that our real time clock interface can only be used using `struct tm` time representation. While that might sound natural for a RTC, it seems inefficient: - `struct tm` is defined using at least 9 integers in newlib (-> 36bytes), where 8 would be enough for any conceivable use case if RTCs would be used not in calendar mode, but in counting mode (just counting seconds). Also, while inefficient, it is very easy to convert an epoch (or something similar) to `struct tm`, should it be needed for presenting a date to the user. I propose changing the low level interface to work with a single integer representing (epoch) seconds, maybe keeping the `struct tm` functions for convenience and backwards compatibility. This would essentially reduce the RTCs to timers counting seconds, but we'd be able to use their capability of waking up MCUs from deep power down modes. A quick survey of our rtc implementations and the capabilities of RTCs : cc430: uses calendar mode. chip offers counting mode, but without alarm. can probably be easily worked around lpc2387: only calendar mode capable native: uses posix system calls, so it's already epoch based sam3x8e: using calendar mode, but MCU offers a real time timer (RTT) matching counter mode with alarm. samd21: using calendar mode, but RTC offers counter mode with alarm kinetis: already uses counting mode I say, let's keep the notion of days, years, (leap years) in software. What do you think?
1.0
core: API: RTC interface should not use struct tm - While collecting implementation ideas for a new timer subsystem, I stumbled about the fact that our real time clock interface can only be used using `struct tm` time representation. While that might sound natural for a RTC, it seems inefficient: - `struct tm` is defined using at least 9 integers in newlib (-> 36bytes), where 8 would be enough for any conceivable use case if RTCs would be used not in calendar mode, but in counting mode (just counting seconds). Also, while inefficient, it is very easy to convert an epoch (or something similar) to `struct tm`, should it be needed for presenting a date to the user. I propose changing the low level interface to work with a single integer representing (epoch) seconds, maybe keeping the `struct tm` functions for convenience and backwards compatibility. This would essentially reduce the RTCs to timers counting seconds, but we'd be able to use their capability of waking up MCUs from deep power down modes. A quick survey of our rtc implementations and the capabilities of RTCs : cc430: uses calendar mode. chip offers counting mode, but without alarm. can probably be easily worked around lpc2387: only calendar mode capable native: uses posix system calls, so it's already epoch based sam3x8e: using calendar mode, but MCU offers a real time timer (RTT) matching counter mode with alarm. samd21: using calendar mode, but RTC offers counter mode with alarm kinetis: already uses counting mode I say, let's keep the notion of days, years, (leap years) in software. What do you think?
process
core api rtc interface should not use struct tm while collecting implementation ideas for a new timer subsystem i stumbled about the fact that our real time clock interface can only be used using struct tm time representation while that might sound natural for a rtc it seems inefficient struct tm is defined using at least integers in newlib where would be enough for any conceivable use case if rtcs would be used not in calendar mode but in counting mode just counting seconds also while inefficient it is very easy to convert an epoch or something similar to struct tm should it be needed for presenting a date to the user i propose changing the low level interface to work with a single integer representing epoch seconds maybe keeping the struct tm functions for convenience and backwards compatibility this would essentially reduce the rtcs to timers counting seconds but we d be able to use their capability of waking up mcus from deep power down modes a quick survey of our rtc implementations and the capabilities of rtcs uses calendar mode chip offers counting mode but without alarm can probably be easily worked around only calendar mode capable native uses posix system calls so it s already epoch based using calendar mode but mcu offers a real time timer rtt matching counter mode with alarm using calendar mode but rtc offers counter mode with alarm kinetis already uses counting mode i say let s keep the notion of days years leap years in software what do you think
1
38,177
8,685,274,951
IssuesEvent
2018-12-03 07:03:11
ShaikASK/Testing
https://api.github.com/repos/ShaikASK/Testing
opened
Activities : Rehire : New Hire text is being displayed instead Rehire text
Defect HR Admin Module HR User Module P3 RE-Hire
Issue 01 : Activities : Rehire : New Hire text is being displayed instead Rehire text Steps To Replicate : 1.Launch the URL 2.Sign in as HR admin user 3.Create Rehire 4.Click on Save 5.Initiate the above created Rehire 6.Sign in as candidate for above created rehire 7.Click on Get Started 8.Reject the Offer Letter Experienced Behavior : Observed that New Hire text is being displayed instead Rehire text in the activities list Expected Behavior : Ensure that it should should display Rehire text in the activities tab when rehire is created Issue 02 : Activities : candidate Reject the offer text is being displayed instead of candidate Rejected the Offer Letter Steps To Replicate : 1.Launch the URL 2.sign in as Candidate 3.click on Get Started 4.Reject the Offer Letter 5.Sign in as HR admin user 6.Check the activities tab Experienced Behavior : Observed that candidate Reject the offer text is being displayed instead of candidate Rejected the Offer Letter in the activities list (Refer Screen Shot) Expected Behavior : Ensure that application should displayed as candidate rejected the offer letter
1.0
Activities : Rehire : New Hire text is being displayed instead Rehire text - Issue 01 : Activities : Rehire : New Hire text is being displayed instead Rehire text Steps To Replicate : 1.Launch the URL 2.Sign in as HR admin user 3.Create Rehire 4.Click on Save 5.Initiate the above created Rehire 6.Sign in as candidate for above created rehire 7.Click on Get Started 8.Reject the Offer Letter Experienced Behavior : Observed that New Hire text is being displayed instead Rehire text in the activities list Expected Behavior : Ensure that it should should display Rehire text in the activities tab when rehire is created Issue 02 : Activities : candidate Reject the offer text is being displayed instead of candidate Rejected the Offer Letter Steps To Replicate : 1.Launch the URL 2.sign in as Candidate 3.click on Get Started 4.Reject the Offer Letter 5.Sign in as HR admin user 6.Check the activities tab Experienced Behavior : Observed that candidate Reject the offer text is being displayed instead of candidate Rejected the Offer Letter in the activities list (Refer Screen Shot) Expected Behavior : Ensure that application should displayed as candidate rejected the offer letter
non_process
activities rehire new hire text is being displayed instead rehire text issue activities rehire new hire text is being displayed instead rehire text steps to replicate launch the url sign in as hr admin user create rehire click on save initiate the above created rehire sign in as candidate for above created rehire click on get started reject the offer letter experienced behavior observed that new hire text is being displayed instead rehire text in the activities list expected behavior ensure that it should should display rehire text in the activities tab when rehire is created issue activities candidate reject the offer text is being displayed instead of candidate rejected the offer letter steps to replicate launch the url sign in as candidate click on get started reject the offer letter sign in as hr admin user check the activities tab experienced behavior observed that candidate reject the offer text is being displayed instead of candidate rejected the offer letter in the activities list refer screen shot expected behavior ensure that application should displayed as candidate rejected the offer letter
0
8,522
11,701,218,606
IssuesEvent
2020-03-06 19:12:01
tueit/it_management
https://api.github.com/repos/tueit/it_management
closed
IT Checklist
Please check & close enhancement process usability
The possibilty to create a new DocType IT Checklist from a IT Management Table. Case: We create a checklist in Issue which shows a customer onboarding. Once this is done and approved it would be nice if I were able to merge the created IT Management Table which could then be a submittable DocType which I can use via "get from" for future reference. We would use this for customer onboard, install configurations etc.
1.0
IT Checklist - The possibilty to create a new DocType IT Checklist from a IT Management Table. Case: We create a checklist in Issue which shows a customer onboarding. Once this is done and approved it would be nice if I were able to merge the created IT Management Table which could then be a submittable DocType which I can use via "get from" for future reference. We would use this for customer onboard, install configurations etc.
process
it checklist the possibilty to create a new doctype it checklist from a it management table case we create a checklist in issue which shows a customer onboarding once this is done and approved it would be nice if i were able to merge the created it management table which could then be a submittable doctype which i can use via get from for future reference we would use this for customer onboard install configurations etc
1
3,361
6,490,664,080
IssuesEvent
2017-08-21 07:59:28
openvstorage/framework
https://api.github.com/repos/openvstorage/framework
closed
Unable to change the storagerouter when creating a vpool
process_duplicate
I'm trying to create a new vpool with another storagerouter but i'm not able to click on 3 other storagerouters. The error we found: ``` gather_fragment_cache.js?version=2.9.5:135 Uncaught TypeError: Cannot read property 'features' of undefined at gather_fragment_cache.js?version=2.9.5:135 at l (knockout-3.3.0.js?version=2.9.5:44) at a.Ub.h [as La] (knockout-3.3.0.js?version=2.9.5:43) at Function.notifySubscribers (knockout-3.3.0.js?version=2.9.5:33) at Function.d.W (knockout-3.3.0.js?version=2.9.5:36) at Object.d [as target] (knockout-3.3.0.js?version=2.9.5:36) at Object.self.set (viewmodel.js?version=2.9.5:72) at Object.eval (eval at parseBindingsString (knockout-3.3.0.js?version=2.9.5:61), <anonymous>:6:565) at HTMLLIElement.<anonymous> (knockout-3.3.0.js?version=2.9.5:83) at HTMLLIElement.dispatch (jquery-1.9.1.js?version=2.9.5:3074) ```
1.0
Unable to change the storagerouter when creating a vpool - I'm trying to create a new vpool with another storagerouter but i'm not able to click on 3 other storagerouters. The error we found: ``` gather_fragment_cache.js?version=2.9.5:135 Uncaught TypeError: Cannot read property 'features' of undefined at gather_fragment_cache.js?version=2.9.5:135 at l (knockout-3.3.0.js?version=2.9.5:44) at a.Ub.h [as La] (knockout-3.3.0.js?version=2.9.5:43) at Function.notifySubscribers (knockout-3.3.0.js?version=2.9.5:33) at Function.d.W (knockout-3.3.0.js?version=2.9.5:36) at Object.d [as target] (knockout-3.3.0.js?version=2.9.5:36) at Object.self.set (viewmodel.js?version=2.9.5:72) at Object.eval (eval at parseBindingsString (knockout-3.3.0.js?version=2.9.5:61), <anonymous>:6:565) at HTMLLIElement.<anonymous> (knockout-3.3.0.js?version=2.9.5:83) at HTMLLIElement.dispatch (jquery-1.9.1.js?version=2.9.5:3074) ```
process
unable to change the storagerouter when creating a vpool i m trying to create a new vpool with another storagerouter but i m not able to click on other storagerouters the error we found gather fragment cache js version uncaught typeerror cannot read property features of undefined at gather fragment cache js version at l knockout js version at a ub h knockout js version at function notifysubscribers knockout js version at function d w knockout js version at object d knockout js version at object self set viewmodel js version at object eval eval at parsebindingsstring knockout js version at htmllielement knockout js version at htmllielement dispatch jquery js version
1
22,719
3,689,855,096
IssuesEvent
2016-02-25 17:51:34
zerogods/phpwebsocket
https://api.github.com/repos/zerogods/phpwebsocket
closed
Use on own Webspace on Strato
auto-migrated Priority-Medium Type-Defect
``` What steps will reproduce the problem? 1. Works Fine on localhost by using Xampp 2. on my strato Webspace i am Not able to connect my Client to the Server 3. i also Followed the Instructions mentioned in One of the previous issues (how to bind to external Adress) But it did Not help What is the expected output? What do you see instead? Websocket - Status 0 Disconnected - Status 3 What version of the product are you using? On what operating system? I use the newest Version My System is Mountain Lion The phpwebsocket is on my strato webspace Please provide any additional information below. Is There are Problem by using it in a Shared Hosting Solution ? ``` Original issue reported on code.google.com by `Wesel...@gmail.com` on 8 Dec 2012 at 3:36
1.0
Use on own Webspace on Strato - ``` What steps will reproduce the problem? 1. Works Fine on localhost by using Xampp 2. on my strato Webspace i am Not able to connect my Client to the Server 3. i also Followed the Instructions mentioned in One of the previous issues (how to bind to external Adress) But it did Not help What is the expected output? What do you see instead? Websocket - Status 0 Disconnected - Status 3 What version of the product are you using? On what operating system? I use the newest Version My System is Mountain Lion The phpwebsocket is on my strato webspace Please provide any additional information below. Is There are Problem by using it in a Shared Hosting Solution ? ``` Original issue reported on code.google.com by `Wesel...@gmail.com` on 8 Dec 2012 at 3:36
non_process
use on own webspace on strato what steps will reproduce the problem works fine on localhost by using xampp on my strato webspace i am not able to connect my client to the server i also followed the instructions mentioned in one of the previous issues how to bind to external adress but it did not help what is the expected output what do you see instead websocket status disconnected status what version of the product are you using on what operating system i use the newest version my system is mountain lion the phpwebsocket is on my strato webspace please provide any additional information below is there are problem by using it in a shared hosting solution original issue reported on code google com by wesel gmail com on dec at
0
9,750
12,736,172,429
IssuesEvent
2020-06-25 16:27:20
unicode-org/icu4x
https://api.github.com/repos/unicode-org/icu4x
opened
Integrate README.md and CONTRIBUTING.md
C-process T-docs
After #147, instructions for contributing will be in two places. We should keep the executive summary in README.md and link to CONTRIBUTING.md for more details. @zbraniecki
1.0
Integrate README.md and CONTRIBUTING.md - After #147, instructions for contributing will be in two places. We should keep the executive summary in README.md and link to CONTRIBUTING.md for more details. @zbraniecki
process
integrate readme md and contributing md after instructions for contributing will be in two places we should keep the executive summary in readme md and link to contributing md for more details zbraniecki
1
20,437
27,099,331,561
IssuesEvent
2023-02-15 07:12:13
bazelbuild/bazel
https://api.github.com/repos/bazelbuild/bazel
closed
Use rules_jvm_external to manage Bazel's third_party Java dependencies
P3 type: process team-ExternalDeps stale
Once https://github.com/bazelbuild/rules_jvm_external/pull/169 is merged, rules_jvm_external will be able to resolve Maven dependencies transitively using `maven_install.json`, a pinned list of transitive artifacts. This file is generated with an initial call to Coursier via rules_jvm_external. Subsequent calls will use `http_file` with sha256 checksums to verify integrity and take advantage of Bazel's repository cache. Benefit 1: **No need to vendor jars in third_party**. Instead, one can just update the `maven_install` declaration in the WORKSPACE file with new artifacts / versions, and check in the updated `maven_install.json` file. Contributions touching `third_party` no longer need to be split up into two changes (third_party and non-third_party). Benefit 2: **No need for Bazel maintainers to manually wire up transitive dependencies in third_party.** `rules_jvm_external` will resolve and fetch artifacts from Maven into `$output_base/external/@maven` with a fully wired up BUILD file. The missing piece is a post-build step to vendor that directory back into Bazel's source tree. Benefit 3: **Maintaining capability for fully offline builds**, since artifact downloads will be managed by Bazel and can be integrated with the distdir in a straightforward manner.
1.0
Use rules_jvm_external to manage Bazel's third_party Java dependencies - Once https://github.com/bazelbuild/rules_jvm_external/pull/169 is merged, rules_jvm_external will be able to resolve Maven dependencies transitively using `maven_install.json`, a pinned list of transitive artifacts. This file is generated with an initial call to Coursier via rules_jvm_external. Subsequent calls will use `http_file` with sha256 checksums to verify integrity and take advantage of Bazel's repository cache. Benefit 1: **No need to vendor jars in third_party**. Instead, one can just update the `maven_install` declaration in the WORKSPACE file with new artifacts / versions, and check in the updated `maven_install.json` file. Contributions touching `third_party` no longer need to be split up into two changes (third_party and non-third_party). Benefit 2: **No need for Bazel maintainers to manually wire up transitive dependencies in third_party.** `rules_jvm_external` will resolve and fetch artifacts from Maven into `$output_base/external/@maven` with a fully wired up BUILD file. The missing piece is a post-build step to vendor that directory back into Bazel's source tree. Benefit 3: **Maintaining capability for fully offline builds**, since artifact downloads will be managed by Bazel and can be integrated with the distdir in a straightforward manner.
process
use rules jvm external to manage bazel s third party java dependencies once is merged rules jvm external will be able to resolve maven dependencies transitively using maven install json a pinned list of transitive artifacts this file is generated with an initial call to coursier via rules jvm external subsequent calls will use http file with checksums to verify integrity and take advantage of bazel s repository cache benefit no need to vendor jars in third party instead one can just update the maven install declaration in the workspace file with new artifacts versions and check in the updated maven install json file contributions touching third party no longer need to be split up into two changes third party and non third party benefit no need for bazel maintainers to manually wire up transitive dependencies in third party rules jvm external will resolve and fetch artifacts from maven into output base external maven with a fully wired up build file the missing piece is a post build step to vendor that directory back into bazel s source tree benefit maintaining capability for fully offline builds since artifact downloads will be managed by bazel and can be integrated with the distdir in a straightforward manner
1
14,575
17,702,941,955
IssuesEvent
2021-08-25 01:56:17
tdwg/dwc
https://api.github.com/repos/tdwg/dwc
closed
Change term - dcterms:bibliographicCitation
Term - change Class - Record-level non-normative Process - complete
## Change term * Submitter: John Wieczorek @tucotuco * Justification (why is this change necessary?): Clarity * Proponents (who needs this change): Everyone Current Term definition: https://dwc.tdwg.org/terms/#dcterms:bibliographicCitation, https://dublincore.org/specifications/dublin-core/dcmi-terms/#bibliographicCitation Proposed new attributes of the term: * Term name (in lowerCamelCase): dcterms:bibliographicCitation * Organized in Class (e.g. Location, Taxon): Record-level * Definition of the term: (unchanged): **A bibliographic reference for the resource.** * Usage comments (recommendations regarding content, etc.): **From Dublin Core, "Recommended practice is to include sufficient bibliographic detail to identify the resource as unambiguously as possible." The intended usage of this term in Darwin Core is to provide the preferred way to cite the resource itself - "how to cite this record". Note that the intended usage of dcterms:references in Darwin Core, by contrast, is to point to the definitive source representation of the resource - "where to find the as-close-to-original reference", if one is available.** * Examples: Occurrence example: `Museum of Vertebrate Zoology, UC Berkeley. MVZ Mammal Collection (Arctos). Record ID: http://arctos.database.museum/guid/MVZ:Mamm:165861?seid=101356. Source: http://ipt.vertnet.org:8080/ipt/resource.do?r=mvz_mammal.` **Taxon example: `https://www.gbif.org/species/2439608 Source: GBIF Taxonomic Backbone`, Event example: `Rand, K.M., Logerwell, E.A. The first demersal trawl survey of benthic fish and invertebrates in the Beaufort Sea since the late 1970s. Polar Biol 34, 475–488 (2011). 
https://doi.org/10.1007/s00300-010-0900-2`** * Refines (identifier of the broader term this term refines, if applicable): None * Replaces (identifier of the existing term that would be deprecated and replaced by this term, if applicable): None * ABCD 2.06 (XPATH of the equivalent term in ABCD or EFG, if applicable): Not in ABCD Note that the definition of this term is governed by the Dublin Core Metadata Initiative.
1.0
Change term - dcterms:bibliographicCitation - ## Change term * Submitter: John Wieczorek @tucotuco * Justification (why is this change necessary?): Clarity * Proponents (who needs this change): Everyone Current Term definition: https://dwc.tdwg.org/terms/#dcterms:bibliographicCitation, https://dublincore.org/specifications/dublin-core/dcmi-terms/#bibliographicCitation Proposed new attributes of the term: * Term name (in lowerCamelCase): dcterms:bibliographicCitation * Organized in Class (e.g. Location, Taxon): Record-level * Definition of the term: (unchanged): **A bibliographic reference for the resource.** * Usage comments (recommendations regarding content, etc.): **From Dublin Core, "Recommended practice is to include sufficient bibliographic detail to identify the resource as unambiguously as possible." The intended usage of this term in Darwin Core is to provide the preferred way to cite the resource itself - "how to cite this record". Note that the intended usage of dcterms:references in Darwin Core, by contrast, is to point to the definitive source representation of the resource - "where to find the as-close-to-original reference", if one is available.** * Examples: Occurrence example: `Museum of Vertebrate Zoology, UC Berkeley. MVZ Mammal Collection (Arctos). Record ID: http://arctos.database.museum/guid/MVZ:Mamm:165861?seid=101356. Source: http://ipt.vertnet.org:8080/ipt/resource.do?r=mvz_mammal.` **Taxon example: `https://www.gbif.org/species/2439608 Source: GBIF Taxonomic Backbone`, Event example: `Rand, K.M., Logerwell, E.A. The first demersal trawl survey of benthic fish and invertebrates in the Beaufort Sea since the late 1970s. Polar Biol 34, 475–488 (2011). 
https://doi.org/10.1007/s00300-010-0900-2`** * Refines (identifier of the broader term this term refines, if applicable): None * Replaces (identifier of the existing term that would be deprecated and replaced by this term, if applicable): None * ABCD 2.06 (XPATH of the equivalent term in ABCD or EFG, if applicable): Not in ABCD Note that the definition of this term is governed by the Dublin Core Metadata Initiative.
process
change term dcterms bibliographiccitation change term submitter john wieczorek tucotuco justification why is this change necessary clarity proponents who needs this change everyone current term definition proposed new attributes of the term term name in lowercamelcase dcterms bibliographiccitation organized in class e g location taxon record level definition of the term unchanged a bibliographic reference for the resource usage comments recommendations regarding content etc from dublin core recommended practice is to include sufficient bibliographic detail to identify the resource as unambiguously as possible the intended usage of this term in darwin core is to provide the preferred way to cite the resource itself how to cite this record note that the intended usage of dcterms references in darwin core by contrast is to point to the definitive source representation of the resource where to find the as close to original reference if one is available examples occurrence example museum of vertebrate zoology uc berkeley mvz mammal collection arctos record id source taxon example source gbif taxonomic backbone event example rand k m logerwell e a the first demersal trawl survey of benthic fish and invertebrates in the beaufort sea since the late polar biol – refines identifier of the broader term this term refines if applicable none replaces identifier of the existing term that would be deprecated and replaced by this term if applicable none abcd xpath of the equivalent term in abcd or efg if applicable not in abcd note that the definition of this term is governed by the dublin core metadata initiative
1
15,398
19,589,438,977
IssuesEvent
2022-01-05 11:09:06
pystatgen/sgkit
https://api.github.com/repos/pystatgen/sgkit
closed
Fix doc build
process + tools upstream
The doc build is failing with the new 1.13.0 release of sphinx-autodoc-typehints.
1.0
Fix doc build - The doc build is failing with the new 1.13.0 release of sphinx-autodoc-typehints.
process
fix doc build the doc build is failing with the new release of sphinx autodoc typehints
1
443,348
30,886,090,335
IssuesEvent
2023-08-03 22:00:36
ubc-library-rc/dataverse_utils
https://api.github.com/repos/ubc-library-rc/dataverse_utils
closed
dv_record_copy has confusing help
documentation
`dv_record_copy` shows LDC catalogue numbers instead of PIDs in the help function.
1.0
dv_record_copy has confusing help - `dv_record_copy` shows LDC catalogue numbers instead of PIDs in the help function.
non_process
dv record copy has confusing help dv record copy shows ldc catalogue numbers instead of pids in the help function
0
423,948
28,966,627,997
IssuesEvent
2023-05-10 08:22:09
AttractorSchool/ESDP-AP-10-1
https://api.github.com/repos/AttractorSchool/ESDP-AP-10-1
closed
Добавление в wiki проекта - Правила работы с проектом
documentation
## Добавление в wiki проекта - Правила работы с проектом (разработать правила) ### Конечный результат: Список правил работы с проектом ### План решения: Обозначить список тем и описать в виде инструкции каждое правило работы с проектом. ### Мотивация: Информация необходима для успешного вовлечения каждого члена команды в проект (для работы по единым правилам и методам). ### Критерии приёмки: Раздел размещен и наполнен в wiki проекта. ### Планируемое время работы: 4 часа ### Фактически затраченное время:
1.0
Добавление в wiki проекта - Правила работы с проектом - ## Добавление в wiki проекта - Правила работы с проектом (разработать правила) ### Конечный результат: Список правил работы с проектом ### План решения: Обозначить список тем и описать в виде инструкции каждое правило работы с проектом. ### Мотивация: Информация необходима для успешного вовлечения каждого члена команды в проект (для работы по единым правилам и методам). ### Критерии приёмки: Раздел размещен и наполнен в wiki проекта. ### Планируемое время работы: 4 часа ### Фактически затраченное время:
non_process
добавление в wiki проекта правила работы с проектом добавление в wiki проекта правила работы с проектом разработать правила конечный результат список правил работы с проектом план решения обозначить список тем и описать в виде инструкции каждое правило работы с проектом мотивация информация необходима для успешного вовлечения каждого члена команды в проект для работы по единым правилам и методам критерии приёмки раздел размещен и наполнен в wiki проекта планируемое время работы часа фактически затраченное время
0
130,646
18,167,011,095
IssuesEvent
2021-09-27 15:33:47
mTvare6/hello-world.rs
https://api.github.com/repos/mTvare6/hello-world.rs
opened
CVE-2020-36205 (Medium) detected in xcb-0.8.2.crate
security vulnerability
## CVE-2020-36205 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>xcb-0.8.2.crate</b></p></summary> <p>Rust bindings and wrappers for XCB</p> <p>Library home page: <a href="https://crates.io/api/v1/crates/xcb/0.8.2/download">https://crates.io/api/v1/crates/xcb/0.8.2/download</a></p> <p> Dependency Hierarchy: - amethyst-0.15.3.crate (Root Library) - amethyst_ui-0.15.3.crate - clipboard-0.5.0.crate - x11-clipboard-0.3.3.crate - :x: **xcb-0.8.2.crate** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/mTvare6/hello-world.rs/commit/a5a175063bd51fcbbce0eaba88d1b9b6ad315911">a5a175063bd51fcbbce0eaba88d1b9b6ad315911</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> An issue was discovered in the xcb crate through 2020-12-10 for Rust. base::Error does not have soundness. Because of the public ptr field, a use-after-free or double-free can occur. <p>Publish Date: 2021-01-26 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36205>CVE-2020-36205</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-36205 (Medium) detected in xcb-0.8.2.crate - ## CVE-2020-36205 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>xcb-0.8.2.crate</b></p></summary> <p>Rust bindings and wrappers for XCB</p> <p>Library home page: <a href="https://crates.io/api/v1/crates/xcb/0.8.2/download">https://crates.io/api/v1/crates/xcb/0.8.2/download</a></p> <p> Dependency Hierarchy: - amethyst-0.15.3.crate (Root Library) - amethyst_ui-0.15.3.crate - clipboard-0.5.0.crate - x11-clipboard-0.3.3.crate - :x: **xcb-0.8.2.crate** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/mTvare6/hello-world.rs/commit/a5a175063bd51fcbbce0eaba88d1b9b6ad315911">a5a175063bd51fcbbce0eaba88d1b9b6ad315911</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> An issue was discovered in the xcb crate through 2020-12-10 for Rust. base::Error does not have soundness. Because of the public ptr field, a use-after-free or double-free can occur. <p>Publish Date: 2021-01-26 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36205>CVE-2020-36205</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in xcb crate cve medium severity vulnerability vulnerable library xcb crate rust bindings and wrappers for xcb library home page a href dependency hierarchy amethyst crate root library amethyst ui crate clipboard crate clipboard crate x xcb crate vulnerable library found in head commit a href found in base branch master vulnerability details an issue was discovered in the xcb crate through for rust base error does not have soundness because of the public ptr field a use after free or double free can occur publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href step up your open source security game with whitesource
0
129,473
18,102,526,635
IssuesEvent
2021-09-22 15:32:49
gms-ws-demo/JS-Demo-Sep2021
https://api.github.com/repos/gms-ws-demo/JS-Demo-Sep2021
opened
CVE-2021-32803 (High) detected in tar-4.4.8.tgz
security vulnerability
## CVE-2021-32803 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tar-4.4.8.tgz</b></p></summary> <p>tar for node</p> <p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-4.4.8.tgz">https://registry.npmjs.org/tar/-/tar-4.4.8.tgz</a></p> <p> Dependency Hierarchy: - nodemon-1.19.1.tgz (Root Library) - chokidar-2.1.6.tgz - fsevents-1.2.9.tgz - node-pre-gyp-0.12.0.tgz - :x: **tar-4.4.8.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/gms-ws-demo/JS-Demo-Sep2021/commit/e8cd219daa23fb09c60a7e7095b13c9e8372f529">e8cd219daa23fb09c60a7e7095b13c9e8372f529</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The npm package "tar" (aka node-tar) before versions 6.1.2, 5.0.7, 4.4.15, and 3.2.3 has an arbitrary File Creation/Overwrite vulnerability via insufficient symlink protection. `node-tar` aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary `stat` calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory. This order of operations resulted in the directory being created and added to the `node-tar` directory cache. When a directory is present in the directory cache, subsequent calls to mkdir for that directory are skipped. However, this is also where `node-tar` checks for symlinks occur. 
By first creating a directory, and then replacing that directory with a symlink, it was thus possible to bypass `node-tar` symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. This issue was addressed in releases 3.2.3, 4.4.15, 5.0.7 and 6.1.2. <p>Publish Date: 2021-08-03 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32803>CVE-2021-32803</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-r628-mhmh-qjhw">https://github.com/npm/node-tar/security/advisories/GHSA-r628-mhmh-qjhw</a></p> <p>Release Date: 2021-08-03</p> <p>Fix Resolution: tar - 3.2.3, 4.4.15, 5.0.7, 6.1.2</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"tar","packageVersion":"4.4.8","packageFilePaths":[],"isTransitiveDependency":true,"dependencyTree":"nodemon:1.19.1;chokidar:2.1.6;fsevents:1.2.9;node-pre-gyp:0.12.0;tar:4.4.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"tar - 3.2.3, 4.4.15, 5.0.7, 6.1.2"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-32803","vulnerabilityDetails":"The npm package \"tar\" (aka node-tar) before versions 6.1.2, 5.0.7, 4.4.15, and 3.2.3 has an arbitrary File Creation/Overwrite vulnerability via insufficient symlink protection. `node-tar` aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary `stat` calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory. This order of operations resulted in the directory being created and added to the `node-tar` directory cache. When a directory is present in the directory cache, subsequent calls to mkdir for that directory are skipped. However, this is also where `node-tar` checks for symlinks occur. 
By first creating a directory, and then replacing that directory with a symlink, it was thus possible to bypass `node-tar` symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. This issue was addressed in releases 3.2.3, 4.4.15, 5.0.7 and 6.1.2.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32803","cvss3Severity":"high","cvss3Score":"8.1","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"Required","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
True
CVE-2021-32803 (High) detected in tar-4.4.8.tgz - ## CVE-2021-32803 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tar-4.4.8.tgz</b></p></summary> <p>tar for node</p> <p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-4.4.8.tgz">https://registry.npmjs.org/tar/-/tar-4.4.8.tgz</a></p> <p> Dependency Hierarchy: - nodemon-1.19.1.tgz (Root Library) - chokidar-2.1.6.tgz - fsevents-1.2.9.tgz - node-pre-gyp-0.12.0.tgz - :x: **tar-4.4.8.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/gms-ws-demo/JS-Demo-Sep2021/commit/e8cd219daa23fb09c60a7e7095b13c9e8372f529">e8cd219daa23fb09c60a7e7095b13c9e8372f529</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The npm package "tar" (aka node-tar) before versions 6.1.2, 5.0.7, 4.4.15, and 3.2.3 has an arbitrary File Creation/Overwrite vulnerability via insufficient symlink protection. `node-tar` aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary `stat` calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory. This order of operations resulted in the directory being created and added to the `node-tar` directory cache. When a directory is present in the directory cache, subsequent calls to mkdir for that directory are skipped. However, this is also where `node-tar` checks for symlinks occur. 
By first creating a directory, and then replacing that directory with a symlink, it was thus possible to bypass `node-tar` symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. This issue was addressed in releases 3.2.3, 4.4.15, 5.0.7 and 6.1.2. <p>Publish Date: 2021-08-03 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32803>CVE-2021-32803</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-r628-mhmh-qjhw">https://github.com/npm/node-tar/security/advisories/GHSA-r628-mhmh-qjhw</a></p> <p>Release Date: 2021-08-03</p> <p>Fix Resolution: tar - 3.2.3, 4.4.15, 5.0.7, 6.1.2</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"tar","packageVersion":"4.4.8","packageFilePaths":[],"isTransitiveDependency":true,"dependencyTree":"nodemon:1.19.1;chokidar:2.1.6;fsevents:1.2.9;node-pre-gyp:0.12.0;tar:4.4.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"tar - 3.2.3, 4.4.15, 5.0.7, 6.1.2"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-32803","vulnerabilityDetails":"The npm package \"tar\" (aka node-tar) before versions 6.1.2, 5.0.7, 4.4.15, and 3.2.3 has an arbitrary File Creation/Overwrite vulnerability via insufficient symlink protection. `node-tar` aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary `stat` calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory. This order of operations resulted in the directory being created and added to the `node-tar` directory cache. When a directory is present in the directory cache, subsequent calls to mkdir for that directory are skipped. However, this is also where `node-tar` checks for symlinks occur. 
By first creating a directory, and then replacing that directory with a symlink, it was thus possible to bypass `node-tar` symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. This issue was addressed in releases 3.2.3, 4.4.15, 5.0.7 and 6.1.2.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32803","cvss3Severity":"high","cvss3Score":"8.1","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"Required","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
non_process
cve high detected in tar tgz cve high severity vulnerability vulnerable library tar tgz tar for node library home page a href dependency hierarchy nodemon tgz root library chokidar tgz fsevents tgz node pre gyp tgz x tar tgz vulnerable library found in head commit a href found in base branch master vulnerability details the npm package tar aka node tar before versions and has an arbitrary file creation overwrite vulnerability via insufficient symlink protection node tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted this is in part achieved by ensuring that extracted directories are not symlinks additionally in order to prevent unnecessary stat calls to determine whether a given path is a directory paths are cached when directories are created this logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory this order of operations resulted in the directory being created and added to the node tar directory cache when a directory is present in the directory cache subsequent calls to mkdir for that directory are skipped however this is also where node tar checks for symlinks occur by first creating a directory and then replacing that directory with a symlink it was thus possible to bypass node tar symlink checks on directories essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location thus allowing arbitrary file creation and overwrite this issue was addressed in releases and publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date 
fix resolution tar isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree nodemon chokidar fsevents node pre gyp tar isminimumfixversionavailable true minimumfixversion tar basebranches vulnerabilityidentifier cve vulnerabilitydetails the npm package tar aka node tar before versions and has an arbitrary file creation overwrite vulnerability via insufficient symlink protection node tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted this is in part achieved by ensuring that extracted directories are not symlinks additionally in order to prevent unnecessary stat calls to determine whether a given path is a directory paths are cached when directories are created this logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory this order of operations resulted in the directory being created and added to the node tar directory cache when a directory is present in the directory cache subsequent calls to mkdir for that directory are skipped however this is also where node tar checks for symlinks occur by first creating a directory and then replacing that directory with a symlink it was thus possible to bypass node tar symlink checks on directories essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location thus allowing arbitrary file creation and overwrite this issue was addressed in releases and vulnerabilityurl
0
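The symlink bypass described in the CVE-2021-32803 record above can be illustrated outside of node-tar. The following is a minimal Python sketch (not node-tar's actual code): instead of trusting a "directory already created" cache, every existing path component is re-checked with a symlink-aware (`lstat`-style) test before anything is written beneath it.

```python
import os
import tempfile
from pathlib import Path

def is_safe_to_write(dest_root, relpath):
    """Return False if any component of relpath under dest_root is a symlink.

    Mirrors the idea behind the node-tar fix: never trust a cached
    "this directory exists" answer, because a later archive entry may
    have replaced the directory with a symlink pointing elsewhere.
    """
    current = Path(dest_root)
    for part in Path(relpath).parts:
        current = current / part
        if current.is_symlink():  # lstat-based: does not follow the link
            return False
    return True

# Simulate the attack: an archive first creates "pkg/", then replaces it
# with a symlink to an arbitrary location, then writes "pkg/evil.txt".
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "pkg"))
assert is_safe_to_write(root, "pkg/evil.txt")      # real directory: fine
os.rmdir(os.path.join(root, "pkg"))
os.symlink(tempfile.gettempdir(), os.path.join(root, "pkg"))
assert not is_safe_to_write(root, "pkg/evil.txt")  # symlink: refuse to write
```

The re-check per component is exactly what the cached-mkdir shortcut skipped; the suggested fix versions (3.2.3, 4.4.15, 5.0.7, 6.1.2) restore that check.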
15,603
19,726,738,520
IssuesEvent
2022-01-13 20:46:34
googleapis/google-cloud-cpp
https://api.github.com/repos/googleapis/google-cloud-cpp
closed
Use -Wthread-safety and thread safety annotations where available.
static-analysis type: process
If the compiler supports them (like Clang does), we should use the thread safety analysis annotations: https://clang.llvm.org/docs/ThreadSafetyAnalysis.html
1.0
Use -Wthread-safety and thread safety annotations where available. - If the compiler supports them (like Clang does), we should use the thread safety analysis annotations: https://clang.llvm.org/docs/ThreadSafetyAnalysis.html
process
use wthread safety and thread safety annotations where available if the compiler supports them like clang does we should use the thread safe analysis annotations
1
13,400
15,874,107,537
IssuesEvent
2021-04-09 04:10:08
e4exp/paper_manager_abstract
https://api.github.com/repos/e4exp/paper_manager_abstract
opened
Revisiting Simple Neural Probabilistic Language Models
2021 Language Model Natural Language Processing Transformer
- https://arxiv.org/abs/2104.03474 - 2021 Recent progress in language modeling has come not only from advances in neural architectures, but also from improvements in hardware and optimization. In this paper, we revisit the neural probabilistic language model (NPLM) of ~\\{Bengio2003ANP}. This model simply concatenates word embeddings within a fixed window and passes the result through a feed-forward network to predict the next word. When scaled up to modern hardware, this model performs far better than expected on word-level language modeling benchmarks, despite its many limitations. Our analysis shows that the NPLM achieves lower perplexity than a baseline Transformer on short input contexts, but struggles to handle long-range dependencies. Inspired by this result, we modify the Transformer by replacing its first self-attention layer with the NPLM's local concatenation layer, yielding small but consistent perplexity reductions on three word-level language modeling datasets.
1.0
Revisiting Simple Neural Probabilistic Language Models - - https://arxiv.org/abs/2104.03474 - 2021 Recent progress in language modeling has come not only from advances in neural architectures, but also from improvements in hardware and optimization. In this paper, we revisit the neural probabilistic language model (NPLM) of ~\\{Bengio2003ANP}. This model simply concatenates word embeddings within a fixed window and passes the result through a feed-forward network to predict the next word. When scaled up to modern hardware, this model performs far better than expected on word-level language modeling benchmarks, despite its many limitations. Our analysis shows that the NPLM achieves lower perplexity than a baseline Transformer on short input contexts, but struggles to handle long-range dependencies. Inspired by this result, we modify the Transformer by replacing its first self-attention layer with the NPLM's local concatenation layer, yielding small but consistent perplexity reductions on three word-level language modeling datasets.
process
revisiting simple neural probabilistic language models recent progress in language modeling has come not only from advances in neural architectures but also from improvements in hardware and optimization in this paper we revisited the neural probabilistic language model nplm this model simply concatenates word embeddings within a fixed window and passes the result through a feed forward network to predict the next word when scaled up to modern hardware this model performs far better than expected on word level language modeling benchmarks despite its many limitations our analysis shows that the nplm achieves lower perplexity than a baseline transformer on short input contexts but struggles to handle long range dependencies inspired by this result we modified the transformer by replacing its first self attention layer with the nplm s local concatenation layer yielding small but consistent perplexity reductions
1
9,178
12,227,034,803
IssuesEvent
2020-05-03 13:38:53
jyn514/rcc
https://api.github.com/repos/jyn514/rcc
opened
[ICE] preprocessor needs a recursion guard
ICE fuzz preprocessor
### Code <!-- The code that caused the panic goes here. This should also include the error message you got. --> Modified from the gcc torture suite. ```c /* Add a few "extern int Xxxxxx ();" declarations. */ #define DEF(x) extern int x; #define LIM1(x) DEF(x##0); DEF(x##1); DEF(x##2); DEF(x##3); DEF(x##4); \ DEF(x##5); DEF(x##6); DEF(x##7); DEF(x##8); DEF(x##9); #define LIM2(x) LIM1(x##0) LIM1(x##1) LIM1(x##2) LIM1(x##3) LIM1(x##4) \ LIM1(x##5) LIM1(x##6) LIM1(x##7) LIM1(x##8) LIM1(x##9) #define LIM3(x) LIM2(x##0) LIM2(x##1) LIM2(x##2) LIM2(x##3) LIM2(x##4) \ LIM2(x##5) LIM2(x##6) LIM2(x##7) LIM2(x##8) LIM2(x##9) #define LIM4(x) LIM3(x##0) LIM3(x##1) LIM3(x##2) LIM3(x##3) LIM3(x##4) \ LIM3(x##5) LIM3(x##6) LIM3(x##7) LIM3(x##8) LIM3(x##9) LIM4 (X); thread 'main' has overflowed its stack fatal runtime error: stack overflow Aborted (core dumped) ``` ### Expected behavior <!-- A description of what you expected to happen. If you're not sure (e.g. this is invalid code), paste the output of another compiler (I like `clang -x c - -Wall -Wextra -pedantic`) --> It should give a fatal error and exit normally, like for nested expressions. <details><summary>Backtrace</summary> <!-- The output of `RUST_BACKTRACE=1 cargo run` goes here. --> ``` Program received signal SIGSEGV, Segmentation fault. 0x0000555555a16c41 in rcc::lex::cpp::PreProcessor::next_replacement_token ( self=0x7fffffff9760) at src/lex/cpp.rs:281 281 if let Some(replacement) = self.pending.pop_front() { (gdb) where #0 0x0000555555a16c41 in rcc::lex::cpp::PreProcessor::next_replacement_token ( self=0x7fffffff9760) at src/lex/cpp.rs:281 #1 0x0000555555a1707b in rcc::lex::cpp::PreProcessor::match_next ( self=0x7fffffff9760, token=...) at src/lex/cpp.rs:292 #2 0x0000555555a1aece in rcc::lex::cpp::PreProcessor::replace_function ( self=0x7fffffff9760, name=..., start=613) at src/lex/cpp.rs:708 #3 0x0000555555a1aba2 in rcc::lex::cpp::PreProcessor::replace_id ( self=0x7fffffff9760, name=..., location=...) 
at src/lex/cpp.rs:696 #4 0x0000555555a17a1c in rcc::lex::cpp::PreProcessor::handle_token ( self=0x7fffffff9760, token=..., location=...) at src/lex/cpp.rs:342 #5 0x0000555555a16533 in <rcc::lex::cpp::PreProcessor as core::iter::traits::iterator::Iterator>::next (self=0x7fffffff9760) at src/lex/cpp.rs:240 #6 0x0000555555a1b85d in rcc::lex::cpp::PreProcessor::replace_function ( self=0x7fffffff9760, name=..., start=613) at src/lex/cpp.rs:775 #7 0x0000555555a1aba2 in rcc::lex::cpp::PreProcessor::replace_id ( self=0x7fffffff9760, name=..., location=...) at src/lex/cpp.rs:696 #8 0x0000555555a17a1c in rcc::lex::cpp::PreProcessor::handle_token ( self=0x7fffffff9760, token=..., location=...) at src/lex/cpp.rs:342 ... several thousand more lines ... ``` </details>
1.0
[ICE] preprocessor needs a recursion guard - ### Code <!-- The code that caused the panic goes here. This should also include the error message you got. --> Modified from the gcc torture suite. ```c /* Add a few "extern int Xxxxxx ();" declarations. */ #define DEF(x) extern int x; #define LIM1(x) DEF(x##0); DEF(x##1); DEF(x##2); DEF(x##3); DEF(x##4); \ DEF(x##5); DEF(x##6); DEF(x##7); DEF(x##8); DEF(x##9); #define LIM2(x) LIM1(x##0) LIM1(x##1) LIM1(x##2) LIM1(x##3) LIM1(x##4) \ LIM1(x##5) LIM1(x##6) LIM1(x##7) LIM1(x##8) LIM1(x##9) #define LIM3(x) LIM2(x##0) LIM2(x##1) LIM2(x##2) LIM2(x##3) LIM2(x##4) \ LIM2(x##5) LIM2(x##6) LIM2(x##7) LIM2(x##8) LIM2(x##9) #define LIM4(x) LIM3(x##0) LIM3(x##1) LIM3(x##2) LIM3(x##3) LIM3(x##4) \ LIM3(x##5) LIM3(x##6) LIM3(x##7) LIM3(x##8) LIM3(x##9) LIM4 (X); thread 'main' has overflowed its stack fatal runtime error: stack overflow Aborted (core dumped) ``` ### Expected behavior <!-- A description of what you expected to happen. If you're not sure (e.g. this is invalid code), paste the output of another compiler (I like `clang -x c - -Wall -Wextra -pedantic`) --> It should give a fatal error and exit normally, like for nested expressions. <details><summary>Backtrace</summary> <!-- The output of `RUST_BACKTRACE=1 cargo run` goes here. --> ``` Program received signal SIGSEGV, Segmentation fault. 0x0000555555a16c41 in rcc::lex::cpp::PreProcessor::next_replacement_token ( self=0x7fffffff9760) at src/lex/cpp.rs:281 281 if let Some(replacement) = self.pending.pop_front() { (gdb) where #0 0x0000555555a16c41 in rcc::lex::cpp::PreProcessor::next_replacement_token ( self=0x7fffffff9760) at src/lex/cpp.rs:281 #1 0x0000555555a1707b in rcc::lex::cpp::PreProcessor::match_next ( self=0x7fffffff9760, token=...) 
at src/lex/cpp.rs:292 #2 0x0000555555a1aece in rcc::lex::cpp::PreProcessor::replace_function ( self=0x7fffffff9760, name=..., start=613) at src/lex/cpp.rs:708 #3 0x0000555555a1aba2 in rcc::lex::cpp::PreProcessor::replace_id ( self=0x7fffffff9760, name=..., location=...) at src/lex/cpp.rs:696 #4 0x0000555555a17a1c in rcc::lex::cpp::PreProcessor::handle_token ( self=0x7fffffff9760, token=..., location=...) at src/lex/cpp.rs:342 #5 0x0000555555a16533 in <rcc::lex::cpp::PreProcessor as core::iter::traits::iterator::Iterator>::next (self=0x7fffffff9760) at src/lex/cpp.rs:240 #6 0x0000555555a1b85d in rcc::lex::cpp::PreProcessor::replace_function ( self=0x7fffffff9760, name=..., start=613) at src/lex/cpp.rs:775 #7 0x0000555555a1aba2 in rcc::lex::cpp::PreProcessor::replace_id ( self=0x7fffffff9760, name=..., location=...) at src/lex/cpp.rs:696 #8 0x0000555555a17a1c in rcc::lex::cpp::PreProcessor::handle_token ( self=0x7fffffff9760, token=..., location=...) at src/lex/cpp.rs:342 ... several thousand more lines ... ``` </details>
process
preprocessor needs a recursion guard code the code that caused the panic goes here this should also include the error message you got modified from the gcc torture suite c add a few extern int xxxxxx declarations define def x extern int x define x def x def x def x def x def x def x def x def x def x def x define x x x x x x x x x x x define x x x x x x x x x x x define x x x x x x x x x x x x thread main has overflowed its stack fatal runtime error stack overflow aborted core dumped expected behavior a description of what you expected to happen if you re not sure e g this is invalid code paste the output of another compiler i like clang x c wall wextra pedantic it should give a fatal error and exit normally like for nested expressions backtrace program received signal sigsegv segmentation fault in rcc lex cpp preprocessor next replacement token self at src lex cpp rs if let some replacement self pending pop front gdb where in rcc lex cpp preprocessor next replacement token self at src lex cpp rs in rcc lex cpp preprocessor match next self token at src lex cpp rs in rcc lex cpp preprocessor replace function self name start at src lex cpp rs in rcc lex cpp preprocessor replace id self name location at src lex cpp rs in rcc lex cpp preprocessor handle token self token location at src lex cpp rs in next self at src lex cpp rs in rcc lex cpp preprocessor replace function self name start at src lex cpp rs in rcc lex cpp preprocessor replace id self name location at src lex cpp rs in rcc lex cpp preprocessor handle token self token location at src lex cpp rs several thousand more lines
1
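The fix the rcc issue above asks for (fail with a fatal diagnostic instead of overflowing the stack) amounts to threading a depth counter through the expansion path. A toy Python sketch, not rcc's Rust code; the macro model is deliberately simplified (object-like macros only, no C "blue paint" repainting rules), just enough to show where the guard goes:

```python
MAX_EXPANSION_DEPTH = 128  # hypothetical cap; a real preprocessor picks its own limit

class FatalExpansionError(Exception):
    """Raised instead of letting runaway expansion overflow the stack."""

def expand(name, macros, depth=0):
    """Expand a macro name into tokens, guarding against deep nesting."""
    if depth > MAX_EXPANSION_DEPTH:
        raise FatalExpansionError(
            f"macro expansion nested too deeply while expanding {name!r}")
    result = []
    for token in macros.get(name, [name]):
        if token != name and token in macros:
            result.extend(expand(token, macros, depth + 1))
        else:
            result.append(token)
    return result

# A finite chain expands normally...
assert expand("B", {"B": ["A", "A"], "A": ["x"]}) == ["x", "x"]
# ...while a pathologically deep chain fails with a diagnostic, not SIGSEGV.
deep = {f"M{i}": [f"M{i + 1}"] for i in range(1000)}
try:
    expand("M0", deep)
    raise AssertionError("guard did not trigger")
except FatalExpansionError:
    pass
```

In rcc the analogous guard would live in the mutually recursive `replace_id`/`replace_function` path visible in the backtrace, returning a fatal error just like the existing nested-expression limit.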
17,930
23,928,562,385
IssuesEvent
2022-09-10 07:00:43
GregTechCEu/gt-ideas
https://api.github.com/repos/GregTechCEu/gt-ideas
opened
Purification of silicon for electronics
processing chain
## Details Electronic components that use silicon (in real life) need silicon with a purity above 99.99999% (I forgot the number of 9s there were, but it is basically a lot), and all raw silicon that has come fresh out of ore processing has too many impurities to be used in electronics. It is therefore unrealistic to simply make silicon boules and other electronic-related things with normal silicon that you just pulled out of an electrolyzer/ore washer/whatever you used. It is time to make silicon in gregtech more painful. ## Products Main Product: Purified Silicon Dust Side Product(s): N/A ## Steps Silicon Dust -> Pulverizer -> Fine Silicon Dust Fine Silicon Dust + 0.1 b Ethanol + 0.5 b Hydrofluoric Acid + 5 b Nitration Solution -> Chemical Reactor -> 1 b Silicon Solution 1 b Silicon Solution -> Sonicator -> 1 Purified Silicon Dust ## Yield 1 Purified Silicon Dust will be made per 1 Normal Silicon Dust ## Sources https://www.pnas.org/doi/10.1073/pnas.1513012112
1.0
Purification of silicon for electronics - ## Details Electronic components that use silicon (in real life) need silicon with a purity above 99.99999% (I forgot the number of 9s there were, but it is basically a lot), and all raw silicon that has come fresh out of ore processing has too many impurities to be used in electronics. It is therefore unrealistic to simply make silicon boules and other electronic-related things with normal silicon that you just pulled out of an electrolyzer/ore washer/whatever you used. It is time to make silicon in gregtech more painful. ## Products Main Product: Purified Silicon Dust Side Product(s): N/A ## Steps Silicon Dust -> Pulverizer -> Fine Silicon Dust Fine Silicon Dust + 0.1 b Ethanol + 0.5 b Hydrofluoric Acid + 5 b Nitration Solution -> Chemical Reactor -> 1 b Silicon Solution 1 b Silicon Solution -> Sonicator -> 1 Purified Silicon Dust ## Yield 1 Purified Silicon Dust will be made per 1 Normal Silicon Dust ## Sources https://www.pnas.org/doi/10.1073/pnas.1513012112
process
purification of silicon for electronics details electronic components that use silicon in real life need silicon with a purity above i forgot the amount of s there were but it is basically a lot and all raw silicon that has come fresh out of ore processing has too many impurities to be used in electronics it is therefore unrealistic to simply make silicon boules and other electronic related things with normal silicon that you just pulled out of an electrolyzer ore washer whatever you used it is time to make silicon in gregtech more painful products main product purified silicon dust side product s n a steps silicon dust pulverizer fine silicon dust fine silicon dust b ethanol b hydrofluoric acid b nitration solution chemical reactor b silicon solution b silicon solution sonicator purified silicon dust yield purified silicon dust will be made per normal silicon dust sources
1
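The steps in the record above fix the per-dust fluid costs, so total inputs scale linearly with the yield. A small sketch, assuming the quantities listed (0.1 b ethanol, 0.5 b hydrofluoric acid, 5 b nitration solution per dust, 1 dust in, 1 dust out):

```python
# Buckets of each fluid consumed per 1 Purified Silicon Dust,
# taken from the recipe steps in the record above.
RECIPE_PER_DUST = {
    "ethanol_b": 0.1,
    "hydrofluoric_acid_b": 0.5,
    "nitration_solution_b": 5.0,
}

def inputs_for(purified_dust):
    """Total fluid inputs needed to make N purified silicon dust."""
    return {fluid: amount * purified_dust
            for fluid, amount in RECIPE_PER_DUST.items()}

# e.g. a stack of 64 dust needs 6.4 b ethanol, 32 b HF, 320 b nitration solution
batch = inputs_for(64)
assert abs(batch["ethanol_b"] - 6.4) < 1e-9
assert batch["nitration_solution_b"] == 320.0
```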
15,004
18,719,710,541
IssuesEvent
2021-11-03 10:21:28
geneontology/go-ontology
https://api.github.com/repos/geneontology/go-ontology
opened
Fix 'envenomation' terms
multi-species process
Hello,

After much discussion with the multi-org group @geneontology/multiorganism-working-group, @Antonialock, and @FJungo, we propose to change all terms of the pattern 'envenomation resulting in x' to 'toxin-mediated x', as shown below:

Children of envenomation | New term label
-- | --
envenomation resulting in modulation of vasoactive intestinal polypeptide receptor activity in other organism | toxin-mediated modulation of vasoactive intestinal polypeptide receptor activity in another organism
envenomation resulting in modulation of cell migration in other organism | toxin-mediated modulation of cell migration in another organism
envenomation resulting in modulation of blood pressure in other organism | toxin-mediated modulation of blood pressure in another organism
envenomation resulting in hemolysis in other organism | toxin-mediated hemolysis in another organism
envenomation resulting in modulation of blood coagulation in other organism | toxin-mediated modulation of blood coagulation in another organism
envenomation resulting in modulation of complement activation, alternative pathway in other organism | toxin-mediated modulation of complement activation, alternative pathway in another organism
envenomation resulting in hemorrhagic damage to other organism | toxin-mediated hemorrhagic damage to other organism
envenomation resulting in fibrinolysis in other organism | toxin-mediated fibrinolysis in another organism
envenomation resulting in modulation of acid-sensing ion channel activity in other organism | toxin-mediated modulation of acid-sensing ion channel activity in another organism
envenomation resulting in modulation of calcium channel activity in other organism | toxin-mediated modulation of calcium channel activity in another organism
envenomation resulting in modulation of ion channel activity in other organism | toxin-mediated modulation of ion channel activity in another organism
envenomation resulting in modulation of transmission of nerve impulse in other organism | toxin-mediated modulation of transmission of nerve impulse in another organism
envenomation resulting in modulation of complement activation in other organism | toxin-mediated modulation of complement activation in another organism
envenomation resulting in modulation of signal transduction in other organism | toxin-mediated modulation of signal transduction in another organism
envenomation resulting in modulation of glucagon-like peptide receptor 1 activity in other organism | toxin-mediated modulation of glucagon-like peptide receptor 1 activity in another organism
envenomation resulting in modulation of G protein-coupled receptor activity in other organism | toxin-mediated modulation of G protein-coupled receptor activity in another organism
envenomation resulting in modulation of receptor activity in other organism | toxin-mediated modulation of receptor activity in another organism
envenomation resulting in modulation of apoptotic process in other organism | toxin-mediated modulation of apoptotic process in another organism
envenomation resulting in modulation of complement activation, classical pathway in other organism | toxin-mediated modulation of complement activation, classical pathway in another organism
envenomation resulting in modulation of complement activation, lectin pathway in other organism | toxin-mediated modulation of complement activation, lectin pathway in another organism
envenomation resulting in fibrinogenolysis in other organism | toxin-mediated fibrinogenolysis in another organism
envenomation resulting in blood agglutination in other organism | toxin-mediated blood agglutination in another organism
envenomation resulting in depletion of circulating fibrinogen in other organism | toxin-mediated depletion of circulating fibrinogen in another organism
envenomation resulting in cytolysis in other organism | toxin-mediated cytolysis in another organism
envenomation resulting in blood vessel extracellular matrix damage, causing hemorrhagic damage in other organism | toxin-mediated blood vessel extracellular matrix damage, causing hemorrhagic damage in another organism
envenomation resulting in modulation of mast cell degranulation in other organism | toxin-mediated modulation of mast cell degranulation in another organism
envenomation resulting in modulation of platelet aggregation in other organism | toxin-mediated modulation of platelet aggregation in another organism
envenomation resulting in induction of edema in other organism | toxin-mediated induction of edema in another organism
envenomation resulting in impairment of hemostasis in other organism | toxin-mediated impairment of hemostasis in another organism
envenomation resulting in modulation of sensory perception of pain in other organism | toxin-mediated modulation of sensory perception of pain in another organism
envenomation resulting in damage of muscle extracellular matrix in other organism | toxin-mediated damage of muscle extracellular matrix in another organism
envenomation resulting in modulation of voltage-gated potassium channel activity in other organism | toxin-mediated modulation of voltage-gated potassium channel activity in another organism
envenomation resulting in modulation of voltage-gated sodium channel activity in other organism | toxin-mediated modulation of voltage-gated sodium channel activity in another organism
envenomation resulting in muscle damage in other organism | toxin-mediated muscle damage in another organism
envenomation resulting in myocyte killing in other organism | toxin-mediated myocyte killing in another organism
envenomation resulting in negative regulation of acid-sensing ion channel activity in other organism | toxin-mediated negative regulation of acid-sensing ion channel activity in another organism
envenomation resulting in negative regulation of blood coagulation in other organism | toxin-mediated negative regulation of blood coagulation in another organism
envenomation resulting in negative regulation of calcium channel activity in other organism | toxin-mediated negative regulation of calcium channel activity in another organism
envenomation resulting in negative regulation of cell migration in other organism | toxin-mediated negative regulation of cell migration in another organism
envenomation resulting in negative regulation of complement activation, alternative pathway in other organism | toxin-mediated negative regulation of complement activation, alternative pathway in another organism
envenomation resulting in negative regulation of complement activation, classical pathway in other organism | toxin-mediated negative regulation of complement activation, classical pathway in another organism
envenomation resulting in negative regulation of complement activation, lectin pathway in other organism | toxin-mediated negative regulation of complement activation, lectin pathway in another organism
envenomation resulting in negative regulation of heart rate of other organism | toxin-mediated negative regulation of heart rate of other organism
envenomation resulting in negative regulation of high voltage-gated calcium channel activity in other organism | toxin-mediated negative regulation of high voltage-gated calcium channel activity in another organism
envenomation resulting in negative regulation of low voltage-gated calcium channel activity in other organism | toxin-mediated negative regulation of low voltage-gated calcium channel activity in another organism
envenomation resulting in negative regulation of platelet aggregation in other organism | toxin-mediated negative regulation of platelet aggregation in another organism
envenomation resulting in negative regulation of sensory perception of pain in other organism | toxin-mediated negative regulation of sensory perception of pain in another organism
envenomation resulting in negative regulation of voltage-gated calcium channel activity in other organism | toxin-mediated negative regulation of voltage-gated calcium channel activity in another organism
envenomation resulting in negative regulation of voltage-gated potassium channel activity in other organism | toxin-mediated negative regulation of voltage-gated potassium channel activity in another organism
envenomation resulting in negative regulation of voltage-gated sodium channel activity in other organism | toxin-mediated negative regulation of voltage-gated sodium channel activity in another organism
envenomation resulting in occlusion of the pore of voltage-gated potassium channel in other organism | toxin-mediated occlusion of the pore of voltage-gated potassium channel in another organism
envenomation resulting in plasminogen activation in other organism | toxin-mediated plasminogen activation in another organism
envenomation resulting in pore formation in membrane of other organism | toxin-mediated pore formation in membrane of other organism
envenomation resulting in positive regulation of G protein-coupled receptor activity in other organism | toxin-mediated positive regulation of G protein-coupled receptor activity in another organism
envenomation resulting in positive regulation of acid-sensing ion channel activity in other organism | toxin-mediated positive regulation of acid-sensing ion channel activity in another organism
envenomation resulting in positive regulation of argininosuccinate synthase activity in other organism | toxin-mediated positive regulation of argininosuccinate synthase activity in another organism
envenomation resulting in positive regulation of blood coagulation in other organism | toxin-mediated positive regulation of blood coagulation in another organism
envenomation resulting in positive regulation of blood pressure in other organism | toxin-mediated positive regulation of blood pressure in another organism
envenomation resulting in positive regulation of cell migration in other organism | toxin-mediated positive regulation of cell migration in another organism
envenomation resulting in positive regulation of complement activation, alternative pathway in other organism | toxin-mediated positive regulation of complement activation, alternative pathway in another organism
envenomation resulting in positive regulation of complement activation, classical pathway in other organism | toxin-mediated positive regulation of complement activation, classical pathway in another organism
envenomation resulting in positive regulation of complement activation, lectin pathway in other organism | toxin-mediated positive regulation of complement activation, lectin pathway in another organism
envenomation resulting in positive regulation of glucagon-like peptide receptor 1 activity in other organism | toxin-mediated positive regulation of glucagon-like peptide receptor 1 activity in another organism
envenomation resulting in positive regulation of mast cell degranulation in other organism | toxin-mediated positive regulation of mast cell degranulation in another organism
envenomation resulting in positive regulation of platelet aggregation in other organism | toxin-mediated positive regulation of platelet aggregation in another organism
envenomation resulting in positive regulation of signal transduction in other organism | toxin-mediated positive regulation of signal transduction in another organism
envenomation resulting in positive regulation of vasoactive intestinal polypeptide receptor activity in other organism | toxin-mediated positive regulation of vasoactive intestinal polypeptide receptor activity in another organism
envenomation resulting in positive regulation of voltage-gated sodium channel activity in other organism | toxin-mediated positive regulation of voltage-gated sodium channel activity in another organism
envenomation resulting in proteolysis in other organism | toxin-mediated proteolysis in another organism
envenomation resulting in slowing of activation kinetics of voltage-gated potassium channel in other organism | toxin-mediated slowing of activation kinetics of voltage-gated potassium channel in another organism
envenomation resulting in vasodilation in other organism | toxin-mediated vasodilation in another organism
envenomation resulting in zymogen activation in other organism | toxin-mediated zymogen activation in another organism
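The relabeling above is mechanical, so it can be sketched as a simple string transformation. This is a hypothetical helper for illustration only, not part of any GO tooling: the leading 'envenomation resulting in' becomes 'toxin-mediated', and a trailing 'in other organism' becomes 'in another organism', while labels ending in 'to other organism' or 'of other organism' keep that phrase, as in the table.

```python
import re

def rename_envenomation_term(label: str) -> str:
    """Apply the proposed 'envenomation resulting in x' -> 'toxin-mediated x'
    relabeling (illustrative sketch of the pattern in the table above).

    Only a trailing 'in other organism' switches to 'in another organism';
    labels ending in 'to other organism' or 'of other organism' are kept.
    """
    new = re.sub(r"^envenomation resulting in ", "toxin-mediated ", label)
    new = re.sub(r"in other organism$", "in another organism", new)
    return new
```

For example, 'envenomation resulting in hemolysis in other organism' maps to 'toxin-mediated hemolysis in another organism', while 'envenomation resulting in negative regulation of heart rate of other organism' keeps its 'of other organism' ending.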
organism toxin mediated vasodilation in another organism envenomation resulting in zymogen activation in other organism toxin mediated zymogen activation in another organism
1
674,228
23,043,589,524
IssuesEvent
2022-07-23 14:40:44
deckhouse/deckhouse
https://api.github.com/repos/deckhouse/deckhouse
closed
Simplify bash scripts for node bootstrap.
status/rotten priority/backlog
### Preflight Checklist - [X] I agree to follow the [Code of Conduct](https://github.com/deckhouse/deckhouse/blob/main/CODE_OF_CONDUCT.md) that this project adheres to. - [X] I have searched the [issue tracker](https://github.com/deckhouse/deckhouse/issues) for an issue that matches the one I want to file, without success. ### Use case. Why is this important? Currently the bootstrap scripts install a lot of unnecessary software, such as bashible-completion-extras, various kernel packages, etc. This makes the scripts more complicated than necessary. ### Proposed Solution Use NodeGroupConfiguration custom scripts to install additional software. ### Additional Information _No response_
1.0
Simplify bash scripts for node bootstrap. - ### Preflight Checklist - [X] I agree to follow the [Code of Conduct](https://github.com/deckhouse/deckhouse/blob/main/CODE_OF_CONDUCT.md) that this project adheres to. - [X] I have searched the [issue tracker](https://github.com/deckhouse/deckhouse/issues) for an issue that matches the one I want to file, without success. ### Use case. Why is this important? Currently the bootstrap scripts install a lot of unnecessary software, such as bashible-completion-extras, various kernel packages, etc. This makes the scripts more complicated than necessary. ### Proposed Solution Use NodeGroupConfiguration custom scripts to install additional software. ### Additional Information _No response_
non_process
simplify bash scripts for node bootstrap preflight checklist i agree to follow the that this project adheres to i have searched the for an issue that matches the one i want to file without success use case why is this important now bootstrap scripts contains many unnecessary software install like install bashible completion extras various kernel packages etc this makes the scripts more complicated than necessary proposed solution use nodegroupconfiguration custom scripts to install additional software additional information no response
0
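Each record in this dump ends with a lowercase, punctuation-free copy of the record's combined title and body. The preprocessing code that produced that column is not part of the dump; the function name and the exact regexes below are assumptions, and since the real column keeps some non-ASCII characters (e.g. the 🚀 in a record further down), the actual pipeline clearly differs in detail. A minimal sketch of such a normalization step:

```python
import re

def normalize_issue_text(text: str) -> str:
    """Rough sketch of the normalization seen in this dump (not the
    dataset's actual code): drop inline code, markdown links, and URLs,
    strip ASCII punctuation and digits, lowercase, collapse whitespace.
    """
    text = re.sub(r"`[^`]*`", " ", text)              # inline code spans
    text = re.sub(r"\[[^\]]*\]\([^)]*\)", " ", text)  # markdown links, dropped whole
    text = re.sub(r"https?://\S+", " ", text)         # bare URLs
    text = re.sub(r"[^A-Za-z\s]", " ", text)          # ASCII punctuation and digits
    return re.sub(r"\s+", " ", text).strip().lower()
```

Applied to the deckhouse body above, this yields a string close to the stored column (e.g. the `[Code of Conduct](...)` link disappears entirely, matching "i agree to follow the that this project adheres to"), though not guaranteed to be identical.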
256,790
22,099,983,028
IssuesEvent
2022-06-01 13:06:17
vaadin/testbench
https://api.github.com/repos/vaadin/testbench
closed
GridWrap get column order by caption
UITest
GridWrap should have an `int getColumn(String)` method. This will help in getting the column index for a Bean Grid, where the actual column number for a field is anybody's guess if it was not set using `setColumns()`.
1.0
GridWrap get column order by caption - GridWrap should have an `int getColumn(String)` method. This will help in getting the column index for a Bean Grid, where the actual column number for a field is anybody's guess if it was not set using `setColumns()`.
non_process
gridwrap get column order by caption gridwrap should have a int getcolumn string this will help getting the column for a bean grid where the actual column number for a field is anybodys guess if not set using the setcolumns
0
50,295
10,475,618,223
IssuesEvent
2019-09-23 16:43:38
scottbass47/gsts
https://api.github.com/repos/scottbass47/gsts
opened
Shield drone hit animation
Code Enhancement Meeting Topic
Do we want a shader effect for when the shield drone gets hit by a bullet with its shield up?
1.0
Shield drone hit animation - Do we want a shader effect for when the shield drone gets hit by a bullet with its shield up?
non_process
shield drone hit animation do we want a shader effect for when the shield drone gets hit by a bullet with its shield up
0
7,909
11,090,082,548
IssuesEvent
2019-12-14 23:34:31
qgis/QGIS-Documentation
https://api.github.com/repos/qgis/QGIS-Documentation
closed
[FEATURE] explode hstore algorithm (#8212)
3.6 Automatic new feature Processing Alg
Original commit: https://github.com/qgis/QGIS/commit/6e16651d96f56c829a0a667d6b14969f39dace86 by nirvn Unfortunately this naughty coder did not write a description... :-(
1.0
[FEATURE] explode hstore algorithm (#8212) - Original commit: https://github.com/qgis/QGIS/commit/6e16651d96f56c829a0a667d6b14969f39dace86 by nirvn Unfortunately this naughty coder did not write a description... :-(
process
explode hstore algorithm original commit by nirvn unfortunately this naughty coder did not write a description
1
8,145
11,353,792,994
IssuesEvent
2020-01-24 16:14:26
pytorch/pytorch
https://api.github.com/repos/pytorch/pytorch
closed
CUDA MPS system
module: cuda module: multiprocessing triaged
## 🚀 Feature <!-- A clear and concise description of the feature proposal --> Enable PyTorch to work with MPS in multiple processes. Currently the program just crashes if you start a second one. ## Motivation Certain shared clusters have CUDA exclusive mode turned on and must use MPS for full system utilization. If there is an easy way to make PyTorch work with MPS, it would be great. <!-- Please outline the motivation for the proposal. Is your feature request related to a problem? e.g., I'm always frustrated when [...]. If this is related to another GitHub issue, please link here too --> ## Pitch https://docs.nvidia.com/deploy/pdf/CUDA_Multi_Process_Service_Overview.pdf cc @ngimel
1.0
CUDA MPS system - ## 🚀 Feature <!-- A clear and concise description of the feature proposal --> Enable PyTorch to work with MPS in multiple processes. Currently program just crashes if you start a second one. ## Motivation Certain shared clusters have CUDA exclusive mode turned on and must use MPS for full system utilization. If there is an easy way to make PyTorch work with MPS, would be great. <!-- Please outline the motivation for the proposal. Is your feature request related to a problem? e.g., I'm always frustrated when [...]. If this is related to another GitHub issue, please link here too --> ## Pitch https://docs.nvidia.com/deploy/pdf/CUDA_Multi_Process_Service_Overview.pdf cc @ngimel
process
cuda mps system 🚀 feature enable pytorch to work with mps in multiple processes currently program just crashes if you start a second one motivation certain shared clusters have cuda exclusive mode turned on and must use mps for full system utilization if there is an easy way to make pytorch work with mps would be great pitch cc ngimel
1
14,097
3,377,485,042
IssuesEvent
2015-11-25 03:51:19
nodejs/node
https://api.github.com/repos/nodejs/node
closed
test: parallel/test-net-socket-local-address frequently fails on freebsd
freebsd test
It seems to time out, e.g. https://jenkins-iojs.nodesource.com/job/node-test-commit-other/198/nodes=freebsd101-64/console - grep for 'not ok'. ``` not ok 522 - test-net-socket-local-address.js --- duration_ms: 60.74 ... ``` Refs #2095, /cc @rmg
1.0
test: parallel/test-net-socket-local-address frequently fails on freebsd - It seems to time out, e.g. https://jenkins-iojs.nodesource.com/job/node-test-commit-other/198/nodes=freebsd101-64/console - grep for 'not ok'. ``` not ok 522 - test-net-socket-local-address.js --- duration_ms: 60.74 ... ``` Refs #2095, /cc @rmg
non_process
test parallel test net socket local address frequently fails on freebsd it seems to time out e g grep for not ok not ok test net socket local address js duration ms refs cc rmg
0
43,282
17,500,541,193
IssuesEvent
2021-08-10 08:54:22
hashicorp/terraform-provider-azurerm
https://api.github.com/repos/hashicorp/terraform-provider-azurerm
closed
Provider crash on frontdoor destruction
bug crash upstream-microsoft service/frontdoor
<!--- Please keep this note for the community ---> ### Community Note * Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request * Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request * If you are interested in working on this issue or have submitted a pull request, please leave a comment <!--- Thank you for keeping this note for the community ---> ### Terraform (and AzureRM Provider) Version ``` terraform -v Terraform v1.0.4 on darwin_amd64 + provider registry.terraform.io/hashicorp/azurerm v2.70.0 ``` ### Affected Resource(s) * `azurerm_frontdoor` ### Terraform Configuration Files <!--- Information about code formatting: https://help.github.com/articles/basic-writing-and-formatting-syntax/#quoting-code ---> ```hcl # Copy-paste your Terraform configurations here - for large Terraform configs, # please use a service like Dropbox and share a link to the ZIP file. For # security, you can also encrypt the files using our GPG public key: https://keybase.io/hashicorp ``` ### Debug Output ### Panic Output ``` azurerm_frontdoor.this: Destroying... [id=/subscriptions/4e094144-afbc-420a-8d74-a4a0da618a3b/resourceGroups/fxs-ops-frontdoor-poc/providers/Microsoft.Network/frontDoors/fxs-ops-poc] ╷ │ Error: Plugin did not respond │ │ The plugin encountered an error, and failed to respond to the plugin.(*GRPCProvider).ApplyResourceChange call. The plugin logs may contain more details. 
╵ Stack trace from the terraform-provider-azurerm_v2.70.0_x5 plugin: panic: runtime error: invalid memory address or nil pointer dereference [signal SIGSEGV: segmentation violation code=0x1 addr=0x48 pc=0x58ba3c7] goroutine 31 [running]: github.com/terraform-providers/terraform-provider-azurerm/azurerm/internal/services/frontdoor.resourceFrontDoorDelete(0xc001820a80, 0x6376e00, 0xc0002ea700, 0x0, 0x0) /opt/teamcity-agent/work/5d79fe75d4460a2f/src/github.com/terraform-providers/terraform-provider-azurerm/azurerm/internal/services/frontdoor/frontdoor_resource.go:810 +0x247 github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema.(*Resource).delete(0xc000c02d20, 0x73098c8, 0xc0012d6240, 0xc001820a80, 0x6376e00, 0xc0002ea700, 0x0, 0x0, 0x0) /opt/teamcity-agent/work/5d79fe75d4460a2f/src/github.com/terraform-providers/terraform-provider-azurerm/vendor/github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema/resource.go:369 +0x1ee github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema.(*Resource).Apply(0xc000c02d20, 0x73098c8, 0xc0012d6240, 0xc0018302a0, 0xc0011ddf80, 0x6376e00, 0xc0002ea700, 0x0, 0x0, 0x0, ...) 
/opt/teamcity-agent/work/5d79fe75d4460a2f/src/github.com/terraform-providers/terraform-provider-azurerm/vendor/github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema/resource.go:428 +0x8f7 github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema.(*GRPCProviderServer).ApplyResourceChange(0xc0002855d8, 0x73098c8, 0xc0012d6240, 0xc00138e3c0, 0xc0012d6240, 0x6857f00, 0xc001b44400) /opt/teamcity-agent/work/5d79fe75d4460a2f/src/github.com/terraform-providers/terraform-provider-azurerm/vendor/github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema/grpc_provider.go:955 +0x8ef github.com/hashicorp/terraform-plugin-go/tfprotov5/server.(*server).ApplyResourceChange(0xc000743c20, 0x7309970, 0xc0012d6240, 0xc001830000, 0xc000743c20, 0xc001b444e0, 0xc0002c4ba0) /opt/teamcity-agent/work/5d79fe75d4460a2f/src/github.com/terraform-providers/terraform-provider-azurerm/vendor/github.com/hashicorp/terraform-plugin-go/tfprotov5/server/server.go:332 +0xb5 github.com/hashicorp/terraform-plugin-go/tfprotov5/internal/tfplugin5._Provider_ApplyResourceChange_Handler(0x6857f00, 0xc000743c20, 0x7309970, 0xc001b444e0, 0xc0010841e0, 0x0, 0x7309970, 0xc001b444e0, 0xc000847800, 0x176b) /opt/teamcity-agent/work/5d79fe75d4460a2f/src/github.com/terraform-providers/terraform-provider-azurerm/vendor/github.com/hashicorp/terraform-plugin-go/tfprotov5/internal/tfplugin5/tfplugin5_grpc.pb.go:380 +0x214 google.golang.org/grpc.(*Server).processUnaryRPC(0xc00049c700, 0x7355a18, 0xc000299c80, 0xc0017ac5a0, 0xc0010664e0, 0xa8d0140, 0x0, 0x0, 0x0) /opt/teamcity-agent/work/5d79fe75d4460a2f/src/github.com/terraform-providers/terraform-provider-azurerm/vendor/google.golang.org/grpc/server.go:1292 +0x52b google.golang.org/grpc.(*Server).handleStream(0xc00049c700, 0x7355a18, 0xc000299c80, 0xc0017ac5a0, 0x0) /opt/teamcity-agent/work/5d79fe75d4460a2f/src/github.com/terraform-providers/terraform-provider-azurerm/vendor/google.golang.org/grpc/server.go:1617 +0xd0c 
google.golang.org/grpc.(*Server).serveStreams.func1.2(0xc000ac2e30, 0xc00049c700, 0x7355a18, 0xc000299c80, 0xc0017ac5a0) /opt/teamcity-agent/work/5d79fe75d4460a2f/src/github.com/terraform-providers/terraform-provider-azurerm/vendor/google.golang.org/grpc/server.go:940 +0xab created by google.golang.org/grpc.(*Server).serveStreams.func1 /opt/teamcity-agent/work/5d79fe75d4460a2f/src/github.com/terraform-providers/terraform-provider-azurerm/vendor/google.golang.org/grpc/server.go:938 +0x1fd Error: The terraform-provider-azurerm_v2.70.0_x5 plugin crashed! This is always indicative of a bug within the plugin. It would be immensely helpful if you could report the crash with the plugin's maintainers so that it can be fixed. The output above should help diagnose the issue. ```
1.0
Provider crash on frontdoor destruction - <!--- Please keep this note for the community ---> ### Community Note * Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request * Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request * If you are interested in working on this issue or have submitted a pull request, please leave a comment <!--- Thank you for keeping this note for the community ---> ### Terraform (and AzureRM Provider) Version ``` terraform -v Terraform v1.0.4 on darwin_amd64 + provider registry.terraform.io/hashicorp/azurerm v2.70.0 ``` ### Affected Resource(s) * `azurerm_frontdoor` ### Terraform Configuration Files <!--- Information about code formatting: https://help.github.com/articles/basic-writing-and-formatting-syntax/#quoting-code ---> ```hcl # Copy-paste your Terraform configurations here - for large Terraform configs, # please use a service like Dropbox and share a link to the ZIP file. For # security, you can also encrypt the files using our GPG public key: https://keybase.io/hashicorp ``` ### Debug Output ### Panic Output ``` azurerm_frontdoor.this: Destroying... [id=/subscriptions/4e094144-afbc-420a-8d74-a4a0da618a3b/resourceGroups/fxs-ops-frontdoor-poc/providers/Microsoft.Network/frontDoors/fxs-ops-poc] ╷ │ Error: Plugin did not respond │ │ The plugin encountered an error, and failed to respond to the plugin.(*GRPCProvider).ApplyResourceChange call. The plugin logs may contain more details. 
╵ Stack trace from the terraform-provider-azurerm_v2.70.0_x5 plugin: panic: runtime error: invalid memory address or nil pointer dereference [signal SIGSEGV: segmentation violation code=0x1 addr=0x48 pc=0x58ba3c7] goroutine 31 [running]: github.com/terraform-providers/terraform-provider-azurerm/azurerm/internal/services/frontdoor.resourceFrontDoorDelete(0xc001820a80, 0x6376e00, 0xc0002ea700, 0x0, 0x0) /opt/teamcity-agent/work/5d79fe75d4460a2f/src/github.com/terraform-providers/terraform-provider-azurerm/azurerm/internal/services/frontdoor/frontdoor_resource.go:810 +0x247 github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema.(*Resource).delete(0xc000c02d20, 0x73098c8, 0xc0012d6240, 0xc001820a80, 0x6376e00, 0xc0002ea700, 0x0, 0x0, 0x0) /opt/teamcity-agent/work/5d79fe75d4460a2f/src/github.com/terraform-providers/terraform-provider-azurerm/vendor/github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema/resource.go:369 +0x1ee github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema.(*Resource).Apply(0xc000c02d20, 0x73098c8, 0xc0012d6240, 0xc0018302a0, 0xc0011ddf80, 0x6376e00, 0xc0002ea700, 0x0, 0x0, 0x0, ...) 
/opt/teamcity-agent/work/5d79fe75d4460a2f/src/github.com/terraform-providers/terraform-provider-azurerm/vendor/github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema/resource.go:428 +0x8f7 github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema.(*GRPCProviderServer).ApplyResourceChange(0xc0002855d8, 0x73098c8, 0xc0012d6240, 0xc00138e3c0, 0xc0012d6240, 0x6857f00, 0xc001b44400) /opt/teamcity-agent/work/5d79fe75d4460a2f/src/github.com/terraform-providers/terraform-provider-azurerm/vendor/github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema/grpc_provider.go:955 +0x8ef github.com/hashicorp/terraform-plugin-go/tfprotov5/server.(*server).ApplyResourceChange(0xc000743c20, 0x7309970, 0xc0012d6240, 0xc001830000, 0xc000743c20, 0xc001b444e0, 0xc0002c4ba0) /opt/teamcity-agent/work/5d79fe75d4460a2f/src/github.com/terraform-providers/terraform-provider-azurerm/vendor/github.com/hashicorp/terraform-plugin-go/tfprotov5/server/server.go:332 +0xb5 github.com/hashicorp/terraform-plugin-go/tfprotov5/internal/tfplugin5._Provider_ApplyResourceChange_Handler(0x6857f00, 0xc000743c20, 0x7309970, 0xc001b444e0, 0xc0010841e0, 0x0, 0x7309970, 0xc001b444e0, 0xc000847800, 0x176b) /opt/teamcity-agent/work/5d79fe75d4460a2f/src/github.com/terraform-providers/terraform-provider-azurerm/vendor/github.com/hashicorp/terraform-plugin-go/tfprotov5/internal/tfplugin5/tfplugin5_grpc.pb.go:380 +0x214 google.golang.org/grpc.(*Server).processUnaryRPC(0xc00049c700, 0x7355a18, 0xc000299c80, 0xc0017ac5a0, 0xc0010664e0, 0xa8d0140, 0x0, 0x0, 0x0) /opt/teamcity-agent/work/5d79fe75d4460a2f/src/github.com/terraform-providers/terraform-provider-azurerm/vendor/google.golang.org/grpc/server.go:1292 +0x52b google.golang.org/grpc.(*Server).handleStream(0xc00049c700, 0x7355a18, 0xc000299c80, 0xc0017ac5a0, 0x0) /opt/teamcity-agent/work/5d79fe75d4460a2f/src/github.com/terraform-providers/terraform-provider-azurerm/vendor/google.golang.org/grpc/server.go:1617 +0xd0c 
google.golang.org/grpc.(*Server).serveStreams.func1.2(0xc000ac2e30, 0xc00049c700, 0x7355a18, 0xc000299c80, 0xc0017ac5a0) /opt/teamcity-agent/work/5d79fe75d4460a2f/src/github.com/terraform-providers/terraform-provider-azurerm/vendor/google.golang.org/grpc/server.go:940 +0xab created by google.golang.org/grpc.(*Server).serveStreams.func1 /opt/teamcity-agent/work/5d79fe75d4460a2f/src/github.com/terraform-providers/terraform-provider-azurerm/vendor/google.golang.org/grpc/server.go:938 +0x1fd Error: The terraform-provider-azurerm_v2.70.0_x5 plugin crashed! This is always indicative of a bug within the plugin. It would be immensely helpful if you could report the crash with the plugin's maintainers so that it can be fixed. The output above should help diagnose the issue. ```
non_process
provider crash on frontdoor destruction community note please vote on this issue by adding a 👍 to the original issue to help the community and maintainers prioritize this request please do not leave or me too comments they generate extra noise for issue followers and do not help prioritize the request if you are interested in working on this issue or have submitted a pull request please leave a comment terraform and azurerm provider version terraform v terraform on darwin provider registry terraform io hashicorp azurerm affected resource s azurerm frontdoor terraform configuration files hcl copy paste your terraform configurations here for large terraform configs please use a service like dropbox and share a link to the zip file for security you can also encrypt the files using our gpg public key debug output panic output azurerm frontdoor this destroying ╷ │ error plugin did not respond │ │ the plugin encountered an error and failed to respond to the plugin grpcprovider applyresourcechange call the plugin logs may contain more details ╵ stack trace from the terraform provider azurerm plugin panic runtime error invalid memory address or nil pointer dereference goroutine github com terraform providers terraform provider azurerm azurerm internal services frontdoor resourcefrontdoordelete opt teamcity agent work src github com terraform providers terraform provider azurerm azurerm internal services frontdoor frontdoor resource go github com hashicorp terraform plugin sdk helper schema resource delete opt teamcity agent work src github com terraform providers terraform provider azurerm vendor github com hashicorp terraform plugin sdk helper schema resource go github com hashicorp terraform plugin sdk helper schema resource apply opt teamcity agent work src github com terraform providers terraform provider azurerm vendor github com hashicorp terraform plugin sdk helper schema resource go github com hashicorp terraform plugin sdk helper schema grpcproviderserver 
applyresourcechange opt teamcity agent work src github com terraform providers terraform provider azurerm vendor github com hashicorp terraform plugin sdk helper schema grpc provider go github com hashicorp terraform plugin go server server applyresourcechange opt teamcity agent work src github com terraform providers terraform provider azurerm vendor github com hashicorp terraform plugin go server server go github com hashicorp terraform plugin go internal provider applyresourcechange handler opt teamcity agent work src github com terraform providers terraform provider azurerm vendor github com hashicorp terraform plugin go internal grpc pb go google golang org grpc server processunaryrpc opt teamcity agent work src github com terraform providers terraform provider azurerm vendor google golang org grpc server go google golang org grpc server handlestream opt teamcity agent work src github com terraform providers terraform provider azurerm vendor google golang org grpc server go google golang org grpc server servestreams opt teamcity agent work src github com terraform providers terraform provider azurerm vendor google golang org grpc server go created by google golang org grpc server servestreams opt teamcity agent work src github com terraform providers terraform provider azurerm vendor google golang org grpc server go error the terraform provider azurerm plugin crashed this is always indicative of a bug within the plugin it would be immensely helpful if you could report the crash with the plugin s maintainers so that it can be fixed the output above should help diagnose the issue
0
17,023
22,392,194,678
IssuesEvent
2022-06-17 08:49:23
qgis/QGIS-Documentation
https://api.github.com/repos/qgis/QGIS-Documentation
closed
[processing] overlay tools with multiple overlay layers support (Request in QGIS)
Processing Alg Vector 3.26
### Request for documentation From pull request QGIS/qgis#46877 Author: @alexbruy QGIS version: 3.26 (Feature) **[processing] overlay tools with multiple overlay layers support** ### PR Description: ## Description Add new Intersect, Union and Difference tools which support multiple "overlay" inputs instead of a single overlay layer. The tools work in the following way: pick the input and the first overlay layer, produce a result, and use that result as the input at the next step together with the second overlay layer, and so on. For example, given this set of overlapping polygons ![image](https://user-images.githubusercontent.com/776954/154613862-a7058810-3852-4628-9122-d0f379364d7e.png) Difference with multiple overlays will produce this output ![image](https://user-images.githubusercontent.com/776954/154613930-9d1bdd93-0819-402a-b3a8-5198d7bd8cd4.png) which is the result of consecutive difference operations between the overlays and the result of the previous difference. Similarly, union will produce a single layer ![image](https://user-images.githubusercontent.com/776954/154614064-103b4e6a-1234-49a9-825a-bc2ba0390d27.png) This is quite useful in models, where we cannot use loops. ### Commits tagged with [need-docs] or [FEATURE]
1.0
[processing] overlay tools with multiple overlay layers support (Request in QGIS) - ### Request for documentation From pull request QGIS/qgis#46877 Author: @alexbruy QGIS version: 3.26 (Feature) **[processing] overlay tools with multiple overlay layers support** ### PR Description: ## Description Add new Intersect, Union and Difference tools which support multiple "overlay" inputs instead of single overlay layer. Tools are working in the following way: pick input and first overlay layer, produce result and use that result as an input at the next step together with second overlay layer and so on. For example, given this set of overlapping polygons ![image](https://user-images.githubusercontent.com/776954/154613862-a7058810-3852-4628-9122-d0f379364d7e.png) Difference with multiple overlays will produce this output ![image](https://user-images.githubusercontent.com/776954/154613930-9d1bdd93-0819-402a-b3a8-5198d7bd8cd4.png) which is a result of consecutive difference operations between overlays and result of the previous difference Similarly union will produce single layer ![image](https://user-images.githubusercontent.com/776954/154614064-103b4e6a-1234-49a9-825a-bc2ba0390d27.png) This is quite useful in models where we can not use loops. ### Commits tagged with [need-docs] or [FEATURE]
process
overlay tools with multiple overlay layers support request in qgis request for documentation from pull request qgis qgis author alexbruy qgis version feature overlay tools with multiple overlay layers support pr description description add new intersect union and difference tools which support multiple overlay inputs instead of single overlay layer tools are working in the following way pick input and first overlay layer produce result and use that result as an input at the next step together with second overlay layer and so on for example given this set of overlapping polygons difference with multiple overlays will produce this output which is a result of consecutive difference operations between overlays and result of the previous difference similarly union will produce single layer this is quite useful in models where we can not use loops commits tagged with or
1
8,788
11,908,147,726
IssuesEvent
2020-03-31 00:07:07
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
Processing : Dynamic dropdown list depending on selected layer
Feature Request Processing
Hi, I'm not sure if it's a new feature to add or if it already exists and I didn't manage to find out. **Feature description.** Creating a processing algorithm, I would like to have a field that can be filled depending on the layer selected. Like when using a ParameterField, but with custom data inside that would be different depending on the layer (not like a ParameterEnum, which always has the same list). **Additional context** In my case, for example, I would like to have the list of the schemas contained in the database where the layer comes from. I have a layer: **Layer1**, from a database: **Database1**, and a layer: **LayerA**, from a database: **DatabaseA**. My **Database1** has the schemas **[schema1, schema2, schema3]** and my **DatabaseA** has the schemas **[schemaA, schemaB, schemaC].** In my processing interface, I want to have a field that displays the list of schemas when I select a layer. If **Layer1** is selected, then my field displays the values **[schema1, schema2, schema3]**. If I switch to **LayerA**, then it displays **[schemaA, schemaB, schemaC]**. Thanks
1.0
Processing : Dynamic dropdown list depending on selected layer - Hi, I'm not sure if it's a new feature to add or if it already exists and I didn't manage to find out. **Feature description.** Creating a processing algorithm, I would like to have a field that can be filled depending on the layer selected. Like when using a ParameterField, but with custom data inside that would be different depending on the layer (not like a ParameterEnum, which always has the same list). **Additional context** In my case, for example, I would like to have the list of the schemas contained in the database where the layer comes from. I have a layer: **Layer1**, from a database: **Database1**, and a layer: **LayerA**, from a database: **DatabaseA**. My **Database1** has the schemas **[schema1, schema2, schema3]** and my **DatabaseA** has the schemas **[schemaA, schemaB, schemaC].** In my processing interface, I want to have a field that displays the list of schemas when I select a layer. If **Layer1** is selected, then my field displays the values **[schema1, schema2, schema3]**. If I switch to **LayerA**, then it displays **[schemaA, schemaB, schemaC]**. Thanks
process
processing dynamic dropdown list depending on selected layer hi i m not sure if it s a new feature to add or if it already exists and i didn t managed to find out feature description creating a processing algorithm i would like to have a field that can feel depending to the layer selected like when using a parameterfield but with custom data inside that would be different depending on the layer not like a parameterenum that has always the same list additional context in my case for example i would like to have the list of the schema contained on the database where the layer come from i have a layer from a database and a layer layera from a database databasea my got the schemas and my databasea got the schemas in my processing interface i want to have a field that display the list of schema when i select a layer if is selected then my field displays the values if i switch to layera then it displays thanks
1
57,927
14,244,901,888
IssuesEvent
2020-11-19 07:47:29
gitpod-io/gitpod
https://api.github.com/repos/gitpod-io/gitpod
closed
Prebuild logs are hanging
bug prebuilds staging
When running a prebuild (e.g. http://gitpod-staging.com/#prebuild/https://github.com/gitpod-io/spring-petclinic) the logs are pushed to the frontend, but eventually there are no more updates. Reloading the page enables this again for a bit before it stops updating again.
1.0
Prebuild logs are hanging - When running a prebuild (e.g. http://gitpod-staging.com/#prebuild/https://github.com/gitpod-io/spring-petclinic) the logs are pushed to the frontend but eventually there are no more updates. Reloading the page enabled this again for a bit before it again stops to update.
non_process
prebuild logs are hanging when running a prebuild e g the logs are pushed to the frontend but eventually there are no more updates reloading the page enabled this again for a bit before it again stops to update
0
11,007
4,864,864,485
IssuesEvent
2016-11-14 19:12:42
mapbox/mapbox-gl-native
https://api.github.com/repos/mapbox/mapbox-gl-native
closed
ios-tests fails to build — No such file or directory
build iOS tests
The ios-tests project that contains KIF tests fails to build due to file references that went stale following #6150: ``` No such file or directory: '/path/to/mapbox-gl-native/platform/ios/uitest/Mapbox GL Tests/OHHTTPStubs/OHHTTPStubs/Sources/JSON/OHHTTPStubsResponse+JSON.m' No input files No such file or directory: '/path/to/mapbox-gl-native/platform/ios/uitest/Mapbox GL Tests/OHHTTPStubs/OHHTTPStubs/Sources/OHHTTPStubs.m' No such file or directory: '/path/to/mapbox-gl-native/platform/ios/uitest/Mapbox GL Tests/OHHTTPStubs/OHHTTPStubs/Sources/HTTPMessage/OHHTTPStubsResponse+HTTPMessage.m' No such file or directory: '/path/to/mapbox-gl-native/platform/ios/uitest/Mapbox GL Tests/OHHTTPStubs/OHHTTPStubs/Sources/NSURLSession/OHHTTPStubs+NSURLSessionConfiguration.m' No such file or directory: '/path/to/mapbox-gl-native/platform/ios/uitest/Mapbox GL Tests/OHHTTPStubs/OHHTTPStubs/Sources/OHPathHelpers/OHPathHelpers.m' No such file or directory: '/path/to/mapbox-gl-native/platform/ios/uitest/Mapbox GL Tests/OHHTTPStubs/OHHTTPStubs/Sources/OHHTTPStubsResponse.m' clang: error: linker command failed with exit code 1 (use -v to see invocation) (null): error: cannot parse the debug map for "/path/to/mapbox-gl-native/build/ios/Debug-iphonesimulator/Mapbox GL Tests.app/PlugIns/Test Bundle.xctest/Test Bundle": No such file or directory ``` /cc @incanus @friedbunny
1.0
ios-tests fails to build — No such file or directory - The ios-tests project that contains KIF tests fails to build due to file references that went stale following #6150: ``` No such file or directory: '/path/to/mapbox-gl-native/platform/ios/uitest/Mapbox GL Tests/OHHTTPStubs/OHHTTPStubs/Sources/JSON/OHHTTPStubsResponse+JSON.m' No input files No such file or directory: '/path/to/mapbox-gl-native/platform/ios/uitest/Mapbox GL Tests/OHHTTPStubs/OHHTTPStubs/Sources/OHHTTPStubs.m' No such file or directory: '/path/to/mapbox-gl-native/platform/ios/uitest/Mapbox GL Tests/OHHTTPStubs/OHHTTPStubs/Sources/HTTPMessage/OHHTTPStubsResponse+HTTPMessage.m' No such file or directory: '/path/to/mapbox-gl-native/platform/ios/uitest/Mapbox GL Tests/OHHTTPStubs/OHHTTPStubs/Sources/NSURLSession/OHHTTPStubs+NSURLSessionConfiguration.m' No such file or directory: '/path/to/mapbox-gl-native/platform/ios/uitest/Mapbox GL Tests/OHHTTPStubs/OHHTTPStubs/Sources/OHPathHelpers/OHPathHelpers.m' No such file or directory: '/path/to/mapbox-gl-native/platform/ios/uitest/Mapbox GL Tests/OHHTTPStubs/OHHTTPStubs/Sources/OHHTTPStubsResponse.m' clang: error: linker command failed with exit code 1 (use -v to see invocation) (null): error: cannot parse the debug map for "/path/to/mapbox-gl-native/build/ios/Debug-iphonesimulator/Mapbox GL Tests.app/PlugIns/Test Bundle.xctest/Test Bundle": No such file or directory ``` /cc @incanus @friedbunny
non_process
ios tests fails to build — no such file or directory the ios tests project that contains kif tests fails to build due to file references that went stale following no such file or directory path to mapbox gl native platform ios uitest mapbox gl tests ohhttpstubs ohhttpstubs sources json ohhttpstubsresponse json m no input files no such file or directory path to mapbox gl native platform ios uitest mapbox gl tests ohhttpstubs ohhttpstubs sources ohhttpstubs m no such file or directory path to mapbox gl native platform ios uitest mapbox gl tests ohhttpstubs ohhttpstubs sources httpmessage ohhttpstubsresponse httpmessage m no such file or directory path to mapbox gl native platform ios uitest mapbox gl tests ohhttpstubs ohhttpstubs sources nsurlsession ohhttpstubs nsurlsessionconfiguration m no such file or directory path to mapbox gl native platform ios uitest mapbox gl tests ohhttpstubs ohhttpstubs sources ohpathhelpers ohpathhelpers m no such file or directory path to mapbox gl native platform ios uitest mapbox gl tests ohhttpstubs ohhttpstubs sources ohhttpstubsresponse m clang error linker command failed with exit code use v to see invocation null error cannot parse the debug map for path to mapbox gl native build ios debug iphonesimulator mapbox gl tests app plugins test bundle xctest test bundle no such file or directory cc incanus friedbunny
0
16,424
21,262,260,531
IssuesEvent
2022-04-13 06:18:32
camunda/zeebe
https://api.github.com/repos/camunda/zeebe
opened
Automate manual QA instructions for OAuth integration tests
kind/toil team/process-automation area/maintainability
**Description** When we initially implemented the OAuth credentials provider (both for Go and Java), me and Miguel added some helpers to run integration tests against real OAuth2 providers under `clients/oauth2`, with some instructions on how to use them. These would setup what we expected would be similar to a production set up - a reverse proxy in front of Zeebe performing the authentication, and an OAuth2 provider. This helped us catch two bugs after the initial release, so it was useful, but it hasn't really be used ever since and most likely is outdated at this point. It also doesn't help us prevent further bugs since we don't run this as part of our normal QA. I would propose replacing the tests with just a simple integration test for each client that tests a plain `OAuthCredentialsProvider` (in Java and in Go) against a real provider, likely hydra as it's the lighter one (Keycloak takes a good 30s to just start up versus hydra which takes a couple).
1.0
Automate manual QA instructions for OAuth integration tests - **Description** When we initially implemented the OAuth credentials provider (both for Go and Java), me and Miguel added some helpers to run integration tests against real OAuth2 providers under `clients/oauth2`, with some instructions on how to use them. These would setup what we expected would be similar to a production set up - a reverse proxy in front of Zeebe performing the authentication, and an OAuth2 provider. This helped us catch two bugs after the initial release, so it was useful, but it hasn't really be used ever since and most likely is outdated at this point. It also doesn't help us prevent further bugs since we don't run this as part of our normal QA. I would propose replacing the tests with just a simple integration test for each client that tests a plain `OAuthCredentialsProvider` (in Java and in Go) against a real provider, likely hydra as it's the lighter one (Keycloak takes a good 30s to just start up versus hydra which takes a couple).
process
automate manual qa instructions for oauth integration tests description when we initially implemented the oauth credentials provider both for go and java me and miguel added some helpers to run integration tests against real providers under clients with some instructions on how to use them these would setup what we expected would be similar to a production set up a reverse proxy in front of zeebe performing the authentication and an provider this helped us catch two bugs after the initial release so it was useful but it hasn t really be used ever since and most likely is outdated at this point it also doesn t help us prevent further bugs since we don t run this as part of our normal qa i would propose replacing the tests with just a simple integration test for each client that tests a plain oauthcredentialsprovider in java and in go against a real provider likely hydra as it s the lighter one keycloak takes a good to just start up versus hydra which takes a couple
1
21,115
14,365,851,189
IssuesEvent
2020-12-01 02:50:56
Opentrons/opentrons
https://api.github.com/repos/Opentrons/opentrons
closed
Chore: remove e2e tests from travis
:spider: SPDDRS chore infrastructure
# Overview After moving cypress e2e tests over to github actions (#6642 and #6978) we don't need e2e tests to run in Travis anymore, so lets remove them.
1.0
Chore: remove e2e tests from travis - # Overview After moving cypress e2e tests over to github actions (#6642 and #6978) we don't need e2e tests to run in Travis anymore, so lets remove them.
non_process
chore remove tests from travis overview after moving cypress tests over to github actions and we don t need tests to run in travis anymore so lets remove them
0
20,028
10,580,265,644
IssuesEvent
2019-10-08 06:10:06
golang/go
https://api.github.com/repos/golang/go
closed
cmd/compile: for small objects, use constants directly instead of copying from statictmp
Performance
``` func f() []int { return []int{7} } ``` generates: ``` 0x001d 00029 (tmp1.go:3) LEAQ type.[1]int(SB), AX 0x0024 00036 (tmp1.go:4) MOVQ AX, (SP) 0x0028 00040 (tmp1.go:4) CALL runtime.newobject(SB) 0x002d 00045 (tmp1.go:4) MOVQ 8(SP), AX 0x0032 00050 (tmp1.go:4) MOVQ "".statictmp_0(SB), CX 0x0039 00057 (tmp1.go:4) MOVQ CX, (AX) ``` Note that the `7` is written by loading from a readonly global and writing to the slice. We should just use a ``` MOVQ $7, (AX) ``` for the last 2 lines. Up to some small constant in size, emitting the constants explicitly instead of copying them from a statictmp is better.
True
cmd/compile: for small objects, use constants directly instead of copying from statictmp - ``` func f() []int { return []int{7} } ``` generates: ``` 0x001d 00029 (tmp1.go:3) LEAQ type.[1]int(SB), AX 0x0024 00036 (tmp1.go:4) MOVQ AX, (SP) 0x0028 00040 (tmp1.go:4) CALL runtime.newobject(SB) 0x002d 00045 (tmp1.go:4) MOVQ 8(SP), AX 0x0032 00050 (tmp1.go:4) MOVQ "".statictmp_0(SB), CX 0x0039 00057 (tmp1.go:4) MOVQ CX, (AX) ``` Note that the `7` is written by loading from a readonly global and writing to the slice. We should just use a ``` MOVQ $7, (AX) ``` for the last 2 lines. Up to some small constant in size, emitting the constants explicitly instead of copying them from a statictmp is better.
non_process
cmd compile for small objects use constants directly instead of copying from statictmp func f int return int generates go leaq type int sb ax go movq ax sp go call runtime newobject sb go movq sp ax go movq statictmp sb cx go movq cx ax note that the is written by loading from a readonly global and writing to the slice we should just use a movq ax for the last lines up to some small constant in size emitting the constants explicitly instead of copying them from a statictmp is better
0
7,630
10,730,387,978
IssuesEvent
2019-10-28 17:19:24
prisma/photonjs
https://api.github.com/repos/prisma/photonjs
closed
Need to call `photon.disconnect()` to make script terminate
kind/discussion process/candidate
I have the following simple TypeScript script based on Photon.js: ``` import { Photon } from '@generated/photon' const photon = new Photon() async function main() { const people = await photon.people.findMany({ first: 20 }) console.log(people) // await photon.disconnect() } main() ``` When not including the `photon.disconnect()` call, my script won't terminate. Is that the expected behavior or should Photon.js autoclose the DB connection?
1.0
Need to call `photon.disconnect()` to make script terminate - I have the following simple TypeScript script based on Photon.js: ``` import { Photon } from '@generated/photon' const photon = new Photon() async function main() { const people = await photon.people.findMany({ first: 20 }) console.log(people) // await photon.disconnect() } main() ``` When not including the `photon.disconnect()` call, my script won't terminate. Is that the expected behavior or should Photon.js autoclose the DB connection?
process
need to call photon disconnect to make script terminate i have the following simple typescript script based on photon js import photon from generated photon const photon new photon async function main const people await photon people findmany first console log people await photon disconnect main when not including the photon disconnect call my script won t terminate is that the expected behavior or should photon js autoclose the db connection
1
17,856
23,802,761,254
IssuesEvent
2022-09-03 15:03:48
gobuffalo/github_flavored_markdown
https://api.github.com/repos/gobuffalo/github_flavored_markdown
closed
NOTE: the reason of forking this repository
process
This repository was forked from https://github.com/shurcooL/github_flavored_markdown in late 2018 to fix the dependency issue. Currently, there is no significant functional fragmentation so far and it could be fine to keep it as is. The related PRs are: * https://github.com/gobuffalo/buffalo/pull/1216 * https://github.com/gobuffalo/buffalo/pull/1216/files#diff-ac069b752eca30067e98020f2707cd0f30063fe516596330ce41a384ad62f340 * https://github.com/gobuffalo/plush/pull/68
1.0
NOTE: the reason of forking this repository - This repository was forked from https://github.com/shurcooL/github_flavored_markdown in late 2018 to fix the dependency issue. Currently, there is no significant functional fragmentation so far and it could be fine to keep it as is. The related PRs are: * https://github.com/gobuffalo/buffalo/pull/1216 * https://github.com/gobuffalo/buffalo/pull/1216/files#diff-ac069b752eca30067e98020f2707cd0f30063fe516596330ce41a384ad62f340 * https://github.com/gobuffalo/plush/pull/68
process
note the reason of forking this repository this repository was forked from in late to fix the dependency issue currently there is no significant functional fragmentation so far and it could be fine to keep it as is the related prs are
1
96,465
20,021,939,946
IssuesEvent
2022-02-01 17:09:59
danieljharvey/mimsa
https://api.github.com/repos/danieljharvey/mimsa
opened
TS output error with infix
bug codegen
This code: ```haskell let apply a f = f a; infix |> = apply; \a -> a |> (and False) |> not ``` Outputs this: ```typescript const apply = <A, C>(a: A) => (f: (arg: A) => C) => f(a); export const main = (a: boolean) => apply(apply(a)(and(false)))(not); ``` This breaks because the `C` generic is introduced for `a` rather than for `f`. Fixing it to this fixes typechecking: ```typescript const apply = <A>(a: A) => <C>(f: (arg: A) => C) => f(a); export const main = (a: boolean) => apply(apply(a)(and(false)))(not); ```
1.0
TS output error with infix - This code: ```haskell let apply a f = f a; infix |> = apply; \a -> a |> (and False) |> not ``` Outputs this: ```typescript const apply = <A, C>(a: A) => (f: (arg: A) => C) => f(a); export const main = (a: boolean) => apply(apply(a)(and(false)))(not); ``` This breaks because the `C` generic is introduced for `a` rather than for `f`. Fixing it to this fixes typechecking: ```typescript const apply = <A>(a: A) => <C>(f: (arg: A) => C) => f(a); export const main = (a: boolean) => apply(apply(a)(and(false)))(not); ```
non_process
ts output error with infix this code haskell let apply a f f a infix apply a a and false not outputs this typescript const apply a a f arg a c f a export const main a boolean apply apply a and false not this breaks because the c generic is introduced for a rather than for f fixing it to this fixes typechecking typescript const apply a a f arg a c f a export const main a boolean apply apply a and false not
0
51,711
13,642,667,326
IssuesEvent
2020-09-25 15:52:07
panther-labs/panther
https://api.github.com/repos/panther-labs/panther
closed
API for Most Active Rules
p1 team:security engineering team:web
The log analysis overview page needs to show a chart with the top rules ranked by the number of alerts they sent in a certain time frame https://app.abstract.com/projects/558a8de6-7134-4c8b-91c5-62074bb1279b/branches/master/commits/d0570e02d9fb58bb4e79266e5e7ff6e3978973a8/files/c2bbc1a8-4778-4cd6-93a5-4630d8868201/layers/49B988DE-947E-4204-AA7B-5B48AE7E50E4 ### Acceptance Criteria - The API accepts a date range as input - The API returns a list of rule information (as designated by the needs of the chart above) along with the number of alerts sent - If it's helpful, the API can optionally return only a fixed number of results (10 or 20) since the chart will only showcase 5 or 10
True
API for Most Active Rules - The log analysis overview page needs to show a chart with the top rules ranked by the number of alerts they sent in a certain time frame https://app.abstract.com/projects/558a8de6-7134-4c8b-91c5-62074bb1279b/branches/master/commits/d0570e02d9fb58bb4e79266e5e7ff6e3978973a8/files/c2bbc1a8-4778-4cd6-93a5-4630d8868201/layers/49B988DE-947E-4204-AA7B-5B48AE7E50E4 ### Acceptance Criteria - The API accepts a date range as input - The API returns a list of rule information (as designated by the needs of the chart above) along with the number of alerts sent - If it's helpful, the API can optionally return only a fixed number of results (10 or 20) since the chart will only showcase 5 or 10
non_process
api for most active rules the log analysis overview page needs to show a chart with the top rules ranked by the number of alerts they sent in a certain time frame acceptance criteria the api accepts a date range as input the api returns a list of rule information as designated by the needs of the chart above along with the number of alerts sent if it s helpful the api can optionally return only a fixed number of results or since the chart will only showcase or
0
19,248
25,414,720,066
IssuesEvent
2022-11-22 22:30:14
ORNL-AMO/AMO-Tools-Desktop
https://api.github.com/repos/ORNL-AMO/AMO-Tools-Desktop
closed
Badge for Energy Input in EAF
bug Process Heating
... doesn't revert to green/blue after fixing error regarding electricity input (importing old assessments).[https://app.zenhub.com/files/80439269/b7cb001b-575a-4ee4-938d-f8a2d384d200/download](https://app.zenhub.com/files/80439269/b7cb001b-575a-4ee4-938d-f8a2d384d200/download) will also need to fix several costs. Carbon = 0.06 electrode = 3 other fuel = 4 ![image.png](https://images.zenhubusercontent.com/5cd48a2af8cffa5a19122d27/b86d80d6-5ff8-4e7a-a445-004a0b1c72e5)
1.0
Badge for Energy Input in EAF - ... doesn't revert to green/blue after fixing error regarding electricity input (importing old assessments).[https://app.zenhub.com/files/80439269/b7cb001b-575a-4ee4-938d-f8a2d384d200/download](https://app.zenhub.com/files/80439269/b7cb001b-575a-4ee4-938d-f8a2d384d200/download) will also need to fix several costs. Carbon = 0.06 electrode = 3 other fuel = 4 ![image.png](https://images.zenhubusercontent.com/5cd48a2af8cffa5a19122d27/b86d80d6-5ff8-4e7a-a445-004a0b1c72e5)
process
badge for energy input in eaf doesn t revert to green blue after fixing error regarding electricity input importing old assessments will also need to fix several costs carbon electrode other fuel
1
176,760
6,564,785,736
IssuesEvent
2017-09-08 04:14:26
zero-os/0-Disk
https://api.github.com/repos/zero-os/0-Disk
opened
Work out how offline (vdisk data) shard repairing could be done for
priority_critical
Currently we only plan for restoring of primary data (as in copying it over from the slave cluster), for running vdisks which are handled on the fly by the nbdserver when needed. However the shards that went down probably also contain data from vdisks that weren't mounted at the time. It would be great if somehow we could restore the data (if available) for those vdisks as well, from an upper layer. This would probably be done by calling some new zeroctl command, but the details for this have to be worked out. That is what this issue is about.
1.0
Work out how offline (vdisk data) shard repairing could be done for - Currently we only plan for restoring of primary data (as in copying it over from the slave cluster), for running vdisks which are handled on the fly by the nbdserver when needed. However the shards that went down probably also contain data from vdisks that weren't mounted at the time. It would be great if somehow we could restore the data (if available) for those vdisks as well, from an upper layer. This would probably be done by calling some new zeroctl command, but the details for this have to be worked out. That is what this issue is about.
non_process
work out how offline vdisk data shard repairing could be done for currently we only plan for restoring of primary data as in copying it over from the slave cluster for running vdisks which are handled on the fly by the nbdserver when needed however the shards that went down probably also contain data from vdisks that weren t mounted at the time it would be great if somehow we could restore the data if available for those vdisks as well from an upper layer this would probably be done by calling some new zeroctl command but the details for this have to be worked out that is what this issue is about
0
506,921
14,675,971,134
IssuesEvent
2020-12-30 18:57:28
prometheus/prometheus
https://api.github.com/repos/prometheus/prometheus
closed
enable TLS (HTTPS) on prometheus
component/api component/ui kind/feature priority/P3
I'm proposing to add TLS support in Prometheus to provide secure data transfer.
1.0
enable TLS (HTTPS) on prometheus - I'm proposing to add TLS support in Prometheus to provide secure data transfer.
non_process
enable tls https on prometheus i m proposing to add tls support in prometheus to provide secure data transfer
0
4,432
7,308,485,855
IssuesEvent
2018-02-28 08:32:42
UKHomeOffice/dq-aws-transition
https://api.github.com/repos/UKHomeOffice/dq-aws-transition
opened
Configure Data Transfer route OAG to S3 Archive
DQ Data Ingest Production SSM processing
Pre requisites: - [ ] `sftp_oag_client_maytech.py` downloads files successfully. ## Acceptance Criteria - [ ] PM2 logs show no errors for OAG to S3 Archive configuration - [ ] OAG data is moved to S3 Archive bucket
1.0
Configure Data Transfer route OAG to S3 Archive - Pre requisites: - [ ] `sftp_oag_client_maytech.py` downloads files successfully. ## Acceptance Criteria - [ ] PM2 logs show no errors for OAG to S3 Archive configuration - [ ] OAG data is moved to S3 Archive bucket
process
configure data transfer route oag to archive pre requisites sftp oag client maytech py downloads files successfully acceptance criteria logs show no errors for oag to archive configuration oag data is moved to archive bucket
1
25,190
7,647,322,101
IssuesEvent
2018-05-09 03:13:50
eclipse/openj9
https://api.github.com/repos/eclipse/openj9
closed
Create a dockerfile for OpenJDK 10 builds and update v8 & v9 dockerfiles
comp:build enhancement jdk10
/buildenv/docker/* dockerfiles for Java 8 & 9 pull OpenJDK builds from Ubuntu as the bootjdk images. These should be replaced with Adopt builds. We also need a dockerfile for Java 10 which we can link to from the OpenJ9 website build page: https://www.eclipse.org/openj9/oj9_build.html
1.0
Create a dockerfile for OpenJDK 10 builds and update v8 & v9 dockerfiles - /buildenv/docker/* dockerfiles for Java 8 & 9 pull OpenJDK builds from Ubuntu as the bootjdk images. These should be replaced with Adopt builds. We also need a dockerfile for Java 10 which we can link to from the OpenJ9 website build page: https://www.eclipse.org/openj9/oj9_build.html
non_process
create a dockerfile for openjdk builds and update dockerfiles buildenv docker dockerfiles for java pull openjdk builds from ubuntu as the bootjdk images these should be replaced with adopt builds we also need a dockerfile for java which we can link to from the website build page
0
262,214
22,824,086,534
IssuesEvent
2022-07-12 06:56:26
mozilla-mobile/fenix
https://api.github.com/repos/mozilla-mobile/fenix
closed
Intermittent UI test failure - < SettingsAboutTest.verifyAboutFirefoxPreview>
eng:intermittent-test eng:disabled-test eng:ui-test
### Firebase Test Run: [Firebase link](https://console.firebase.google.com/u/0/project/moz-fenix/testlab/histories/bh.66b7091e15d53d45/matrices/6881175624710471990/executions/bs.e52f9c7a5db3bc59/test-cases) ### Stacktrace: 05-23 18:07:30.764: I/WindowManager(477): Input event dispatching timed out sending to org.mozilla.fenix.debug/org.mozilla.fenix.HomeActivity. Reason: b9a2d0b org.mozilla.fenix.debug/org.mozilla.fenix.HomeActivity (server) is not responding. Waited 5002ms for FocusEvent(hasFocus=true) 05-23 18:07:30.932: I/ActivityManager(477): Force stopping org.mozilla.fenix.debug appid=10152 user=0: finished inst 05-23 18:07:30.932: I/ActivityManager(477): Killing 12969:org.mozilla.fenix.debug/u0a152 (adj 0): stop org.mozilla.fenix.debug due to finished inst 05-23 18:07:30.937: W/ActivityTaskManager(477): Force removing ActivityRecord{27abe61 u0 org.mozilla.fenix.debug/org.mozilla.fenix.HomeActivity t9 f}}: app died, no saved state 05-23 18:07:30.940: D/AndroidRuntime(12952): Shutting down VM 05-23 18:07:30.940: I/ServiceChildProcess(13786): Destroying GeckoServiceChildProcess 05-23 18:07:30.941: I/ServiceChildProcess(13621): Destroying GeckoServiceChildProcess 05-23 18:07:30.976: I/ActivityManager(477): Killing 13621:org.mozilla.fenix.debug:tab33/u0a152 (adj 0): stop org.mozilla.fenix.debug due to finished inst 05-23 18:07:30.978: I/ActivityManager(477): Killing 13786:org.mozilla.fenix.debug:gpu/u0a152 (adj 0): stop org.mozilla.fenix.debug due to finished inst 05-23 18:07:30.983: D/WindowManager(477): notifyANR took 231ms ### Build: 5/24 ### Notes: Similar with #24047 ┆Issue is synchronized with this [Jira Task](https://mozilla-hub.atlassian.net/browse/FNXV2-20541)
3.0
Intermittent UI test failure - < SettingsAboutTest.verifyAboutFirefoxPreview> - ### Firebase Test Run: [Firebase link](https://console.firebase.google.com/u/0/project/moz-fenix/testlab/histories/bh.66b7091e15d53d45/matrices/6881175624710471990/executions/bs.e52f9c7a5db3bc59/test-cases) ### Stacktrace: 05-23 18:07:30.764: I/WindowManager(477): Input event dispatching timed out sending to org.mozilla.fenix.debug/org.mozilla.fenix.HomeActivity. Reason: b9a2d0b org.mozilla.fenix.debug/org.mozilla.fenix.HomeActivity (server) is not responding. Waited 5002ms for FocusEvent(hasFocus=true) 05-23 18:07:30.932: I/ActivityManager(477): Force stopping org.mozilla.fenix.debug appid=10152 user=0: finished inst 05-23 18:07:30.932: I/ActivityManager(477): Killing 12969:org.mozilla.fenix.debug/u0a152 (adj 0): stop org.mozilla.fenix.debug due to finished inst 05-23 18:07:30.937: W/ActivityTaskManager(477): Force removing ActivityRecord{27abe61 u0 org.mozilla.fenix.debug/org.mozilla.fenix.HomeActivity t9 f}}: app died, no saved state 05-23 18:07:30.940: D/AndroidRuntime(12952): Shutting down VM 05-23 18:07:30.940: I/ServiceChildProcess(13786): Destroying GeckoServiceChildProcess 05-23 18:07:30.941: I/ServiceChildProcess(13621): Destroying GeckoServiceChildProcess 05-23 18:07:30.976: I/ActivityManager(477): Killing 13621:org.mozilla.fenix.debug:tab33/u0a152 (adj 0): stop org.mozilla.fenix.debug due to finished inst 05-23 18:07:30.978: I/ActivityManager(477): Killing 13786:org.mozilla.fenix.debug:gpu/u0a152 (adj 0): stop org.mozilla.fenix.debug due to finished inst 05-23 18:07:30.983: D/WindowManager(477): notifyANR took 231ms ### Build: 5/24 ### Notes: Similar with #24047 ┆Issue is synchronized with this [Jira Task](https://mozilla-hub.atlassian.net/browse/FNXV2-20541)
non_process
intermittent ui test failure firebase test run stacktrace i windowmanager input event dispatching timed out sending to org mozilla fenix debug org mozilla fenix homeactivity reason org mozilla fenix debug org mozilla fenix homeactivity server is not responding waited for focusevent hasfocus true i activitymanager force stopping org mozilla fenix debug appid user finished inst i activitymanager killing org mozilla fenix debug adj stop org mozilla fenix debug due to finished inst w activitytaskmanager force removing activityrecord org mozilla fenix debug org mozilla fenix homeactivity f app died no saved state d androidruntime shutting down vm i servicechildprocess destroying geckoservicechildprocess i servicechildprocess destroying geckoservicechildprocess i activitymanager killing org mozilla fenix debug adj stop org mozilla fenix debug due to finished inst i activitymanager killing org mozilla fenix debug gpu adj stop org mozilla fenix debug due to finished inst d windowmanager notifyanr took build notes similar with ┆issue is synchronized with this
0
9,084
12,152,655,251
IssuesEvent
2020-04-24 22:56:36
dotnet/runtime
https://api.github.com/repos/dotnet/runtime
closed
Process Start fails with URL on .NET Core 3
area-System.Diagnostics.Process untriaged
The following code works on .NET Framework but throws on .NET Core 3: `Process.Start("https://github.com");`

```
System.ComponentModel.Win32Exception
  HResult=0x80004005
  Message=The system cannot find the file specified.
  Source=System.Diagnostics.Process
  StackTrace:
   at System.Diagnostics.Process.StartWithCreateProcess(ProcessStartInfo startInfo)
   at System.Diagnostics.Process.Start()
   at System.Diagnostics.Process.Start(ProcessStartInfo startInfo)
   at System.Diagnostics.Process.Start(String fileName)
   at NetCoreProcessStartBug.MainWindow.ButtonBase_OnClick(Object sender, RoutedEventArgs e) in c:\dev\NetCoreProcessStartBug\NetCoreProcessStartBug\MainWindow.xaml.cs:line 31
   at System.Windows.RoutedEventHandlerInfo.InvokeHandler(Object target, RoutedEventArgs routedEventArgs)
   at System.Windows.RouteItem.InvokeHandler(RoutedEventArgs routedEventArgs)
   at System.Windows.EventRoute.InvokeHandlersImpl(Object source, RoutedEventArgs args, Boolean reRaised)
   at System.Windows.EventRoute.InvokeHandlers(Object source, RoutedEventArgs args)
   at System.Windows.UIElement.RaiseEventImpl(DependencyObject sender, RoutedEventArgs args)
   at System.Windows.UIElement.RaiseEvent(RoutedEventArgs e)
   at System.Windows.Controls.Primitives.ButtonBase.OnClick()
   at System.Windows.Controls.Button.OnClick()
   at System.Windows.Controls.Primitives.ButtonBase.OnMouseLeftButtonUp(MouseButtonEventArgs e)
   at System.Windows.UIElement.OnMouseLeftButtonUpThunk(Object sender, MouseButtonEventArgs e)
   at System.Windows.Input.MouseButtonEventArgs.InvokeEventHandler(Delegate genericHandler, Object genericTarget)
   at System.Windows.RoutedEventArgs.InvokeHandler(Delegate handler, Object target)
   at System.Windows.RoutedEventHandlerInfo.InvokeHandler(Object target, RoutedEventArgs routedEventArgs)
   at System.Windows.RouteItem.InvokeHandler(RoutedEventArgs routedEventArgs)
   at System.Windows.EventRoute.InvokeHandlersImpl(Object source, RoutedEventArgs args, Boolean reRaised)
   at System.Windows.EventRoute.ReInvokeHandlers(Object source, RoutedEventArgs args)
   at System.Windows.UIElement.ReRaiseEventAs(DependencyObject sender, RoutedEventArgs args, RoutedEvent newEvent)
   at System.Windows.UIElement.CrackMouseButtonEventAndReRaiseEvent(DependencyObject sender, MouseButtonEventArgs e)
   at System.Windows.UIElement.OnMouseUpThunk(Object sender, MouseButtonEventArgs e)
   at System.Windows.Input.MouseButtonEventArgs.InvokeEventHandler(Delegate genericHandler, Object genericTarget)
   at System.Windows.RoutedEventArgs.InvokeHandler(Delegate handler, Object target)
   at System.Windows.RoutedEventHandlerInfo.InvokeHandler(Object target, RoutedEventArgs routedEventArgs)
   at System.Windows.RouteItem.InvokeHandler(RoutedEventArgs routedEventArgs)
   at System.Windows.EventRoute.InvokeHandlersImpl(Object source, RoutedEventArgs args, Boolean reRaised)
   at System.Windows.EventRoute.InvokeHandlers(Object source, RoutedEventArgs args)
   at System.Windows.UIElement.RaiseEventImpl(DependencyObject sender, RoutedEventArgs args)
   at System.Windows.UIElement.RaiseTrustedEvent(RoutedEventArgs args)
   at System.Windows.UIElement.RaiseEvent(RoutedEventArgs args, Boolean trusted)
   at System.Windows.Input.InputManager.ProcessStagingArea()
   at System.Windows.Input.InputManager.ProcessInput(InputEventArgs input)
   at System.Windows.Input.InputProviderSite.ReportInput(InputReport inputReport)
   at System.Windows.Interop.HwndMouseInputProvider.ReportInput(IntPtr hwnd, InputMode mode, Int32 timestamp, RawMouseActions actions, Int32 x, Int32 y, Int32 wheel)
   at System.Windows.Interop.HwndMouseInputProvider.FilterMessage(IntPtr hwnd, WindowMessage msg, IntPtr wParam, IntPtr lParam, Boolean& handled)
   at System.Windows.Interop.HwndSource.InputFilterMessage(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled)
   at MS.Win32.HwndWrapper.WndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled)
   at MS.Win32.HwndSubclass.DispatcherCallbackOperation(Object o)
   at System.Windows.Threading.ExceptionWrapper.InternalRealCall(Delegate callback, Object args, Int32 numArgs)
   at System.Windows.Threading.ExceptionWrapper.TryCatchWhen(Object source, Delegate callback, Object args, Int32 numArgs, Delegate catchHandler)
   at System.Windows.Threading.Dispatcher.WrappedInvoke(Delegate callback, Object args, Int32 numArgs, Delegate catchHandler)
   at System.Windows.Threading.Dispatcher.LegacyInvokeImpl(DispatcherPriority priority, TimeSpan timeout, Delegate method, Object args, Int32 numArgs)
   at System.Windows.Threading.Dispatcher.Invoke(DispatcherPriority priority, Delegate method, Object arg)
   at MS.Win32.HwndSubclass.SubclassWndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam)
   at MS.Win32.UnsafeNativeMethods.DispatchMessage(MSG& msg)
   at System.Windows.Threading.Dispatcher.TranslateAndDispatchMessage(MSG& msg)
   at System.Windows.Threading.Dispatcher.PushFrameImpl(DispatcherFrame frame)
   at System.Windows.Threading.Dispatcher.PushFrame(DispatcherFrame frame)
   at System.Windows.Threading.Dispatcher.Run()
   at System.Windows.Application.RunDispatcher(Object ignore)
   at System.Windows.Application.RunInternal(Window window)
   at System.Windows.Application.Run(Window window)
   at System.Windows.Application.Run()
   at NetCoreProcessStartBug.App.Main()
```

GitHub repro: https://github.com/onovotny/NetCoreProcessStartBug
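Editorial note: on .NET Core, `ProcessStartInfo.UseShellExecute` defaults to `false` (it defaulted to `true` on .NET Framework), so the URL is handed directly to `CreateProcess` rather than the shell, which produces exactly this `Win32Exception`. A minimal sketch of the commonly used workaround, in the same C# as the repro (illustrative snippet, not taken from the linked repo):

```csharp
using System.Diagnostics;

// Opt back into shell execution so the OS resolves the URL with the
// user's default browser, as .NET Framework did implicitly.
Process.Start(new ProcessStartInfo
{
    FileName = "https://github.com",
    UseShellExecute = true
});
```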
1.0
Process Start fails with URL on .NET Core 3 - The following code works on .NET Framework but throws on .NET Core 3: `Process.Start("https://github.com");` ``` System.ComponentModel.Win32Exception HResult=0x80004005 Message=The system cannot find the file specified. Source=System.Diagnostics.Process StackTrace: at System.Diagnostics.Process.StartWithCreateProcess(ProcessStartInfo startInfo) at System.Diagnostics.Process.Start() at System.Diagnostics.Process.Start(ProcessStartInfo startInfo) at System.Diagnostics.Process.Start(String fileName) at NetCoreProcessStartBug.MainWindow.ButtonBase_OnClick(Object sender, RoutedEventArgs e) in c:\dev\NetCoreProcessStartBug\NetCoreProcessStartBug\MainWindow.xaml.cs:line 31 at System.Windows.RoutedEventHandlerInfo.InvokeHandler(Object target, RoutedEventArgs routedEventArgs) at System.Windows.RouteItem.InvokeHandler(RoutedEventArgs routedEventArgs) at System.Windows.EventRoute.InvokeHandlersImpl(Object source, RoutedEventArgs args, Boolean reRaised) at System.Windows.EventRoute.InvokeHandlers(Object source, RoutedEventArgs args) at System.Windows.UIElement.RaiseEventImpl(DependencyObject sender, RoutedEventArgs args) at System.Windows.UIElement.RaiseEvent(RoutedEventArgs e) at System.Windows.Controls.Primitives.ButtonBase.OnClick() at System.Windows.Controls.Button.OnClick() at System.Windows.Controls.Primitives.ButtonBase.OnMouseLeftButtonUp(MouseButtonEventArgs e) at System.Windows.UIElement.OnMouseLeftButtonUpThunk(Object sender, MouseButtonEventArgs e) at System.Windows.Input.MouseButtonEventArgs.InvokeEventHandler(Delegate genericHandler, Object genericTarget) at System.Windows.RoutedEventArgs.InvokeHandler(Delegate handler, Object target) at System.Windows.RoutedEventHandlerInfo.InvokeHandler(Object target, RoutedEventArgs routedEventArgs) at System.Windows.RouteItem.InvokeHandler(RoutedEventArgs routedEventArgs) at System.Windows.EventRoute.InvokeHandlersImpl(Object source, RoutedEventArgs args, Boolean reRaised) at 
System.Windows.EventRoute.ReInvokeHandlers(Object source, RoutedEventArgs args) at System.Windows.UIElement.ReRaiseEventAs(DependencyObject sender, RoutedEventArgs args, RoutedEvent newEvent) at System.Windows.UIElement.CrackMouseButtonEventAndReRaiseEvent(DependencyObject sender, MouseButtonEventArgs e) at System.Windows.UIElement.OnMouseUpThunk(Object sender, MouseButtonEventArgs e) at System.Windows.Input.MouseButtonEventArgs.InvokeEventHandler(Delegate genericHandler, Object genericTarget) at System.Windows.RoutedEventArgs.InvokeHandler(Delegate handler, Object target) at System.Windows.RoutedEventHandlerInfo.InvokeHandler(Object target, RoutedEventArgs routedEventArgs) at System.Windows.RouteItem.InvokeHandler(RoutedEventArgs routedEventArgs) at System.Windows.EventRoute.InvokeHandlersImpl(Object source, RoutedEventArgs args, Boolean reRaised) at System.Windows.EventRoute.InvokeHandlers(Object source, RoutedEventArgs args) at System.Windows.UIElement.RaiseEventImpl(DependencyObject sender, RoutedEventArgs args) at System.Windows.UIElement.RaiseTrustedEvent(RoutedEventArgs args) at System.Windows.UIElement.RaiseEvent(RoutedEventArgs args, Boolean trusted) at System.Windows.Input.InputManager.ProcessStagingArea() at System.Windows.Input.InputManager.ProcessInput(InputEventArgs input) at System.Windows.Input.InputProviderSite.ReportInput(InputReport inputReport) at System.Windows.Interop.HwndMouseInputProvider.ReportInput(IntPtr hwnd, InputMode mode, Int32 timestamp, RawMouseActions actions, Int32 x, Int32 y, Int32 wheel) at System.Windows.Interop.HwndMouseInputProvider.FilterMessage(IntPtr hwnd, WindowMessage msg, IntPtr wParam, IntPtr lParam, Boolean& handled) at System.Windows.Interop.HwndSource.InputFilterMessage(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled) at MS.Win32.HwndWrapper.WndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled) at MS.Win32.HwndSubclass.DispatcherCallbackOperation(Object o) at 
System.Windows.Threading.ExceptionWrapper.InternalRealCall(Delegate callback, Object args, Int32 numArgs) at System.Windows.Threading.ExceptionWrapper.TryCatchWhen(Object source, Delegate callback, Object args, Int32 numArgs, Delegate catchHandler) at System.Windows.Threading.Dispatcher.WrappedInvoke(Delegate callback, Object args, Int32 numArgs, Delegate catchHandler) at System.Windows.Threading.Dispatcher.LegacyInvokeImpl(DispatcherPriority priority, TimeSpan timeout, Delegate method, Object args, Int32 numArgs) at System.Windows.Threading.Dispatcher.Invoke(DispatcherPriority priority, Delegate method, Object arg) at MS.Win32.HwndSubclass.SubclassWndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam) at MS.Win32.UnsafeNativeMethods.DispatchMessage(MSG& msg) at System.Windows.Threading.Dispatcher.TranslateAndDispatchMessage(MSG& msg) at System.Windows.Threading.Dispatcher.PushFrameImpl(DispatcherFrame frame) at System.Windows.Threading.Dispatcher.PushFrame(DispatcherFrame frame) at System.Windows.Threading.Dispatcher.Run() at System.Windows.Application.RunDispatcher(Object ignore) at System.Windows.Application.RunInternal(Window window) at System.Windows.Application.Run(Window window) at System.Windows.Application.Run() at NetCoreProcessStartBug.App.Main() ``` GitHub repro: https://github.com/onovotny/NetCoreProcessStartBug
process
process start fails with url on net core the following code works on net framework but throws on net core process start system componentmodel hresult message the system cannot find the file specified source system diagnostics process stacktrace at system diagnostics process startwithcreateprocess processstartinfo startinfo at system diagnostics process start at system diagnostics process start processstartinfo startinfo at system diagnostics process start string filename at netcoreprocessstartbug mainwindow buttonbase onclick object sender routedeventargs e in c dev netcoreprocessstartbug netcoreprocessstartbug mainwindow xaml cs line at system windows routedeventhandlerinfo invokehandler object target routedeventargs routedeventargs at system windows routeitem invokehandler routedeventargs routedeventargs at system windows eventroute invokehandlersimpl object source routedeventargs args boolean reraised at system windows eventroute invokehandlers object source routedeventargs args at system windows uielement raiseeventimpl dependencyobject sender routedeventargs args at system windows uielement raiseevent routedeventargs e at system windows controls primitives buttonbase onclick at system windows controls button onclick at system windows controls primitives buttonbase onmouseleftbuttonup mousebuttoneventargs e at system windows uielement onmouseleftbuttonupthunk object sender mousebuttoneventargs e at system windows input mousebuttoneventargs invokeeventhandler delegate generichandler object generictarget at system windows routedeventargs invokehandler delegate handler object target at system windows routedeventhandlerinfo invokehandler object target routedeventargs routedeventargs at system windows routeitem invokehandler routedeventargs routedeventargs at system windows eventroute invokehandlersimpl object source routedeventargs args boolean reraised at system windows eventroute reinvokehandlers object source routedeventargs args at system windows uielement 
reraiseeventas dependencyobject sender routedeventargs args routedevent newevent at system windows uielement crackmousebuttoneventandreraiseevent dependencyobject sender mousebuttoneventargs e at system windows uielement onmouseupthunk object sender mousebuttoneventargs e at system windows input mousebuttoneventargs invokeeventhandler delegate generichandler object generictarget at system windows routedeventargs invokehandler delegate handler object target at system windows routedeventhandlerinfo invokehandler object target routedeventargs routedeventargs at system windows routeitem invokehandler routedeventargs routedeventargs at system windows eventroute invokehandlersimpl object source routedeventargs args boolean reraised at system windows eventroute invokehandlers object source routedeventargs args at system windows uielement raiseeventimpl dependencyobject sender routedeventargs args at system windows uielement raisetrustedevent routedeventargs args at system windows uielement raiseevent routedeventargs args boolean trusted at system windows input inputmanager processstagingarea at system windows input inputmanager processinput inputeventargs input at system windows input inputprovidersite reportinput inputreport inputreport at system windows interop hwndmouseinputprovider reportinput intptr hwnd inputmode mode timestamp rawmouseactions actions x y wheel at system windows interop hwndmouseinputprovider filtermessage intptr hwnd windowmessage msg intptr wparam intptr lparam boolean handled at system windows interop hwndsource inputfiltermessage intptr hwnd msg intptr wparam intptr lparam boolean handled at ms hwndwrapper wndproc intptr hwnd msg intptr wparam intptr lparam boolean handled at ms hwndsubclass dispatchercallbackoperation object o at system windows threading exceptionwrapper internalrealcall delegate callback object args numargs at system windows threading exceptionwrapper trycatchwhen object source delegate callback object args numargs delegate 
catchhandler at system windows threading dispatcher wrappedinvoke delegate callback object args numargs delegate catchhandler at system windows threading dispatcher legacyinvokeimpl dispatcherpriority priority timespan timeout delegate method object args numargs at system windows threading dispatcher invoke dispatcherpriority priority delegate method object arg at ms hwndsubclass subclasswndproc intptr hwnd msg intptr wparam intptr lparam at ms unsafenativemethods dispatchmessage msg msg at system windows threading dispatcher translateanddispatchmessage msg msg at system windows threading dispatcher pushframeimpl dispatcherframe frame at system windows threading dispatcher pushframe dispatcherframe frame at system windows threading dispatcher run at system windows application rundispatcher object ignore at system windows application runinternal window window at system windows application run window window at system windows application run at netcoreprocessstartbug app main github repro
1
176,390
14,580,556,814
IssuesEvent
2020-12-18 09:21:19
CompositionalIT/farmer
https://api.github.com/repos/CompositionalIT/farmer
closed
Document Secure parameters
documentation
Document how to manage secure parameters from an author point of view.
1.0
Document Secure parameters - Document how to manage secure parameters from an author point of view.
non_process
document secure parameters document how to manage secure parameters from an author point of view
0
585,843
17,536,418,200
IssuesEvent
2021-08-12 07:02:01
hackforla/design-systems
https://api.github.com/repos/hackforla/design-systems
opened
Documentation Tools Initial Research - Fractal
research Development priority: medium size: small
### Overview

Explore Fractal and figure out how easy the documentation tool's docs are to understand. Figure out how easy configuration is for a React app.

### Action Items

- [ ] How long did exploring take, from research to base template?
- [ ] Log time
- [ ] Is the tool's documentation easy to understand?
- [ ] Did you have to use other devs' GitHub repos to understand the docs?
- [ ] Figure out settings:
- [ ] What is the out-of-the-box setting?
- [ ] What addons are available?
- [ ] What are the pros?
- [ ] What do you like about this tool?
- [ ] What are the cons?
- [ ] What are your struggles with this tool?

### Resources/Instructions

* [Fractal Docs](https://fractal.build/)
1.0
Documentation Tools Initial Research - Fractal - ### Overview Explore Fractal, figure out how easy it is to understand the documentation tool's doc. Figure out how easy configuration is for a React App. ### Action Items - [ ] How long did exploring take, from research to base template? - [ ] Log time - [ ] Is the tool’s documentation easy to understand? - [ ] Did you have to use other dev’s GitHub repo to understand the docs - [ ] Figure out settings: - [ ] What is the out of the box setting? - [ ] What addons are available? - [ ] What are the pros? - [ ] What do you like about this tool? - [ ] What are the cons? - [ ] What are your struggles with this tool? ### Resources/Instructions * [Fractal Docs](https://fractal.build/)
non_process
documentation tools initial research fractal overview explore fractal figure out how easy it is to understand the documentation tool s doc figure out how easy is configuration for a react app action items how long did exploring take from research to base template log time is the tool’s documentation easy to understand did you have to use other dev’s github repo to understand the docs figure out settings what is the out of the box setting what addons are available what are the pros what do you like about this tool what are the cons what are your struggles with this tool resources instructions
0
19,385
25,520,933,687
IssuesEvent
2022-11-28 20:24:33
biapy/biapy-bashlings
https://api.github.com/repos/biapy/biapy-bashlings
closed
Unbound variable error in process-options.
bug unit-test macos process-options
### Description

Unit tests fail for process-options on Mac OS with an unbound variable error, probably due to an empty array.

```bash
not ok 128 Allow for no arguments beside allowed option list.
# (from function `assert_success' in file test/test_helper/bats-assert/src/assert_success.bash, line 42,
#  in test file test/process-options.bats, line 127)
#   `assert_success' failed
#
# -- command failed --
# status : 1
# output : /Users/runner/work/biapy-bashlings/biapy-bashlings/src/process-options.bash: line 96: allowed_options[@]: unbound variable
# --
#
```
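A minimal, self-contained sketch of the usual guard for this failure mode (illustrative names, not the actual biapy-bashlings code): on bash versions before 4.4, including the 3.2 that macOS ships, expanding an empty array with `"${arr[@]}"` under `set -u` raises exactly this "unbound variable" error, while the `${arr[@]+...}` idiom expands to nothing instead:

```shell
#!/usr/bin/env bash
set -u

# Empty array, as when no options are allowed.
allowed_options=()

# On bash < 4.4, `for option in "${allowed_options[@]}"` would abort here
# with "allowed_options[@]: unbound variable". The +-expansion guard
# yields zero words for an empty array on every bash version.
collected=""
for option in ${allowed_options[@]+"${allowed_options[@]}"}; do
  collected="${collected} ${option}"
done

# Length expansion is safe even for an empty array.
echo "count=${#allowed_options[@]}"
```

With the array empty, the loop body never runs and the script prints `count=0`.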
1.0
Unbound variable error in process-options. - ### Description Unit tests fail for process-options on Mac OS with an unbound variable error, probably due to an empty array. ```bash not ok 128 Allow for no arguments beside allowed option list. # (from function `assert_success' in file test/test_helper/bats-assert/src/assert_success.bash, line 42, # in test file test/process-options.bats, line 127) # `assert_success' failed # # -- command failed -- # status : 1 # output : /Users/runner/work/biapy-bashlings/biapy-bashlings/src/process-options.bash: line 96: allowed_options[@]: unbound variable # -- # ```
process
unbound variable error in process options description unit tests fails for process options on mac os with unbound variable error probably due to empty array bash not ok allow for no arguments beside allowed option list from function assert success in file test test helper bats assert src assert success bash line in test file test process options bats line assert success failed command failed status output users runner work biapy bashlings biapy bashlings src process options bash line allowed options unbound variable
1
10,207
13,067,025,679
IssuesEvent
2020-07-30 23:11:17
nion-software/nionswift
https://api.github.com/repos/nion-software/nionswift
opened
Assigning variables and switching projects can result in invalid mapped items
f - organization f - processing priority - critical type - bug
This results in not being able to open the Console window due to an exception. Not sure if this is the exact circumstance under which this occurs. Need to try to reproduce the issue.
1.0
Assigning variables and switching projects can result in invalid mapped items - This results in not being able to open the Console window due to an exception. Not sure if this is the exact circumstance under which this occurs. Need to try to reproduce the issue.
process
assigning variables and switching projects can result in invalid mapped items this results in not being able to open the console window due to an exception not sure if this is the exact circumstance under which this occurs need to try to reproduce the issue
1
772
3,255,773,983
IssuesEvent
2015-10-20 10:21:56
luc-github/Repetier-Firmware-0.92
https://api.github.com/repos/luc-github/Repetier-Firmware-0.92
closed
Add Support for ESP8266 wifi module
Feature Request help welcome Processing
As this device is very cheap (~$5) and only needs 2 pins, and the +/- hardware changes are not hard to implement; thanks to g mcclean for pointing this out.

Connection can be done using the non-populated UART0 connector on the main board, or any other available pins using softserial.

The request is already done in the Repetier tracker but no response so far. The goal is:

1 - to see if it can allow a TCP/IP connection with Repetier-Host
2 - to get a replica of the LCD panel on a web page
1.0
Add Support for ESP8266 wifi module - As this device is very cheap (~$5) and only needs 2 pins, and the +/- hardware changes are not hard to implement; thanks to g mcclean for pointing this out. Connection can be done using the non-populated UART0 connector on the main board or any other available pins using softserial. The request is already done in the Repetier tracker but no response so far. The goal is: 1 - to see if it can allow a TCP/IP connection with Repetier-Host 2 - to get a replica of the LCD panel on a web page
process
add support for wifi module as this device is very cheap and only need pins and hardware changes are not hard to implement thank g mcclean for pointing this out connection can be done using non populated connector on main board or any other available pins using softserial request is already done in repetier tracker but no response so far the goal is to see if can allow tcp ip connection with repetier host to get a replica of lcd panel on a web page
1
15,923
20,142,123,606
IssuesEvent
2022-02-09 01:03:33
brucemiller/LaTeXML
https://api.github.com/repos/brucemiller/LaTeXML
closed
HTML subfigures via flexbox rather than tables?
enhancement postprocessing
I was inspecting an arXiv document (1802.06832) which has 4 subfigures a)-d) in its Results section and noticed they don't reflow in the HTML. They are currently marked as 4 `td` table cells with class `ltx_subfigure` in the same `tr` row. The fixed single-row display looks quite bad even on large displays, as it is significantly wider than the main article width. And on small displays you end up with a huge horizontal scroll, which is just as bad. As I recently did some flexbox work with the moderncv binding, I'm wondering if @brucemiller would see that as a better mechanism for presenting the subfigures. That would give us decent reflow, and we can contain the max-width to a reasonable responsive width.
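One way to sketch the suggested flexbox presentation (the `ltx_subfigure` class is from the issue; the wrapper class and the specific rules are assumptions, not existing LaTeXML output):

```css
/* Hypothetical flex wrapper emitted in place of the current <table>/<tr>. */
.ltx_flex_figure {
  display: flex;
  flex-wrap: wrap;          /* subfigures reflow onto new rows */
  justify-content: center;
}
.ltx_flex_figure .ltx_subfigure {
  flex: 1 1 20em;           /* grow/shrink around a ~20em basis */
  max-width: 100%;          /* never wider than the article column */
}
```

With four roughly 20em items, a wide viewport shows a single row while a phone-width viewport stacks them, avoiding both the oversized fixed row and the horizontal scroll described above.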
1.0
HTML subfigures via flexbox rather than tables? - I was inspecting an arXiv document (1802.06832) which has 4 subfigures a)-d) in its Results section and noticed they don't reflow in the HTML. They are currently marked as 4 `td` table cells with class `ltx_subfigure` in the same `tr` row. The fixed single-row display looks quite bad even on large displays, as it is significantly wider than the main article width. And on small displays you end up with a huge horizontal scroll, which is just as bad. As I recently did some flexbox work with the moderncv binding, I'm wondering if @brucemiller would see that as a better mechanism for presenting the subfigures. That would give us decent reflow, and we can contain the max-width to a reasonable responsive width.
process
html subfigures via flexbox rather than tables i was inspecting an arxiv document which has subfigures a d in its results section and noticed they don t reflow in the html they are currently marked as td table cells with class ltx subfigure in the same tr row the fixed single row display looks quite bad even on large displays as it is significantly wider than the main article width and on small displays you end up with a huge horizontal scroll which is just as bad as i recently did some flexbox work with the moderncv binding i m wondering if brucemiller would see that as a better mechanism for presenting the subfigures that would give us decent reflow and we can contain the max width to a reasonable responsive width
1
9,636
12,600,603,878
IssuesEvent
2020-06-11 08:26:22
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
QGIS native algorithms hanging connections on inputs after ran
Bug Feedback Processing
<!-- Bug fixing and feature development is a community responsibility, and not the responsibility of the QGIS project alone. If this bug report or feature request is high-priority for you, we suggest engaging a QGIS developer or support organisation and financially sponsoring a fix

Checklist before submitting
- [ ] Search through existing issue reports and gis.stackexchange.com to check whether the issue already exists
- [ ] Test with a [clean new user profile](https://docs.qgis.org/testing/en/docs/user_manual/introduction/qgis_configuration.html?highlight=profile#working-with-user-profiles).
- [ ] Create a light and self-contained sample dataset and project file which demonstrates the issue -->

**Describe the bug**
<!-- A clear and concise description of what the bug is. -->
After I ran "native:extractbyattribute" or "native:pointsalonglines" using the python processing.run interface, the inputs of the processing become "locked" as if the source is still being used. I'm using GeoPackage and this causes 2 problems:

1: I can't delete temporary inputs until the process finishes
2: The maximum number of connections on the underlying SQLite exceeds 64 connections over time, because my processes iterate over and over using these functions

**How to Reproduce**
<!-- Steps, sample datasets and qgis project file to reproduce the behavior. Screencasts or screenshots welcome 1. Go to '...' 2. Click on '....' 3. Scroll down to '....' 4. See error -->
Just run one of those algorithms, like "native:extractbyattribute" or "native:pointsalonglines", and try to delete the file via the system while python has not finished running.

processing.run("native:extractbyattribute", {'INPUT': "my.gpkg", 'FIELD': "id", 'OPERATOR': 0, 'VALUE': 1, 'OUTPUT': "my2.gpkg"})
os.remove("my.gpkg")

**QGIS and OS versions**
<!-- In the QGIS Help menu -> About, click in the table, Ctrl+A and then Ctrl+C. Finally paste here -->
Using Windows 10 build 18362, QGIS 3.12.3 (I tried to copy the About content, but the About window just showed blank).

**Additional context**
By taking a deep dive into the QGIS code, I think this may be related to the function QgsProcessingFeatureSource *QgsProcessingUtils::variantToSource defined in src/core/processing/qgsprocessingutils.cpp:347. At lines 373 and 389, when it creates this temporary source just for use in this processing run, it passes false to the QgsProcessingFeatureSource constructor in the ownsOriginalSource parameter. This way the destructor of QgsProcessingFeatureSource will not kill the source when the QgsProcessingFeatureSource gets destroyed. I haven't tested this possible solution, as I don't have a dev env for QGIS here and don't know what other problems this path could lead to.

<!-- Add any other context about the problem here. -->
1.0
QGIS native algorithms hanging connections on inputs after ran - <!-- Bug fixing and feature development is a community responsibility, and not the responsibility of the QGIS project alone. If this bug report or feature request is high-priority for you, we suggest engaging a QGIS developer or support organisation and financially sponsoring a fix Checklist before submitting - [ ] Search through existing issue reports and gis.stackexchange.com to check whether the issue already exists - [ ] Test with a [clean new user profile](https://docs.qgis.org/testing/en/docs/user_manual/introduction/qgis_configuration.html?highlight=profile#working-with-user-profiles). - [ ] Create a light and self-contained sample dataset and project file which demonstrates the issue --> **Describe the bug** <!-- A clear and concise description of what the bug is. --> After I ran "native:extractbyattribute" or "native:pointsalonglines" using the python processing.run interface, the inputs of the processing become "locked" as if the source is still being used. I'm using GeoPackage and this causes 2 problems: 1: I can't delete temporary inputs until the process finishes 2: The maximum number of connections on the underlying SQLite exceeds 64 connections over time, because my processes iterate over and over using these functions **How to Reproduce** <!-- Steps, sample datasets and qgis project file to reproduce the behavior. Screencasts or screenshots welcome 1. Go to '...' 2. Click on '....' 3. Scroll down to '....' 4. See error --> Just run one of those algorithms, like "native:extractbyattribute" or "native:pointsalonglines", and try to delete the file via the system while python has not finished running. processing.run("native:extractbyattribute", {'INPUT': "my.gpkg", 'FIELD': "id", 'OPERATOR': 0, 'VALUE': 1, 'OUTPUT': "my2.gpkg"}) os.remove("my.gpkg") **QGIS and OS versions** <!-- In the QGIS Help menu -> About, click in the table, Ctrl+A and then Ctrl+C. Finally paste here --> Using Windows 10 build 18362, QGIS 3.12.3 (I tried to copy the About content, but the About window just showed blank). **Additional context** By taking a deep dive into the QGIS code, I think this may be related to the function QgsProcessingFeatureSource *QgsProcessingUtils::variantToSource defined in src/core/processing/qgsprocessingutils.cpp:347. At lines 373 and 389, when it creates this temporary source just for use in this processing run, it passes false to the QgsProcessingFeatureSource constructor in the ownsOriginalSource parameter. This way the destructor of QgsProcessingFeatureSource will not kill the source when the QgsProcessingFeatureSource gets destroyed. I haven't tested this possible solution, as I don't have a dev env for QGIS here and don't know what other problems this path could lead to. <!-- Add any other context about the problem here. -->
process
qgis native algorithms hanging connections on inputs after ran bug fixing and feature development is a community responsibility and not the responsibility of the qgis project alone if this bug report or feature request is high priority for you we suggest engaging a qgis developer or support organisation and financially sponsoring a fix checklist before submitting search through existing issue reports and gis stackexchange com to check whether the issue already exists test with a create a light and self contained sample dataset and project file which demonstrates the issue describe the bug after i ran the native extractbyattribute or native pointsalonglines using python processing run interface the inputs of the processing becomes locked as if the source is still being used i m using geopackage and this causes problems i can t delete temporary inputs until the process finishes the maximun connections on the underlying sqlite execedes connections over time becouse my processes iterates over and over using these functions how to reproduce steps sample datasets and qgis project file to reproduce the behavior screencasts or screenshots welcome go to click on scroll down to see error just run one of those algorithms like native extractbyattribute or native pointsalonglines and try to delete the file via system while python has not finished to run processing run native extractbyattribute input my gpkg field id operator value output gpkg os remove my gpkg qgis and os versions about click in the table ctrl a and then ctrl c finally paste here using windows build qgs i tried to copy the about content but the about window just showed blank additional context by taking a deep dive into qgs code i think this may be related to the function qgsprocessingfeaturesource qgsprocessingutils varianttosource defined in src core processing qgsprocessingutils cpp in line e when it creates this temporary source just for use into this processing it passes false to the 
qgsprocessingfeaturesource constructor in the ownsoriginalsource parameter this way the distructor from the qgsprocessingfeaturesource will not kill the source when the qgsprocessingfeaturesource gets distructed i haven t tested this possible solution as i don t have a dev env for qgis here and don t know if there is any other problem this path can lead
1
3,247
6,313,729,881
IssuesEvent
2017-07-24 08:58:30
itsyouonline/identityserver
https://api.github.com/repos/itsyouonline/identityserver
closed
Implement contracts
process_duplicate type_feature
- Separate page on the website (possibly needing some UI design work)
- Need to decide:
- what are the required fields?
- what are optional fields?
- only allow validated info to be used, or do we allow non-validated info as well in some cases?
- how to represent an organization in a contract
- do we allow simply uploading a photo/PDF of an (existing) paper contract and signing that?
1.0
Implement contracts - - Separate page on the website (possibly needing some UI design work) - Need to decide: - what are the required fields? - what are optional fields? - only allow validated info to be used, or do we allow non-validated info as well in some cases? - how to represent an organization in a contract - do we allow simply uploading a photo/PDF of an (existing) paper contract and signing that?
process
implement contracts separate page on the website possibly needing some ui design work need to decide what are the required fields what are optional fields only allow validated info to be used or do we allow non validated info as well in some cases how to represent an organization in a contract do we allow to simply upload a photo pdf of an existing paper contract and signing that
1