Unnamed: 0
int64 0
832k
| id
float64 2.49B
32.1B
| type
stringclasses 1
value | created_at
stringlengths 19
19
| repo
stringlengths 7
112
| repo_url
stringlengths 36
141
| action
stringclasses 3
values | title
stringlengths 1
744
| labels
stringlengths 4
574
| body
stringlengths 9
211k
| index
stringclasses 10
values | text_combine
stringlengths 96
211k
| label
stringclasses 2
values | text
stringlengths 96
188k
| binary_label
int64 0
1
|
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
196,137
| 14,814,368,337
|
IssuesEvent
|
2021-01-14 04:39:07
|
Thy-Vipe/BeastsOfBermuda-issues
|
https://api.github.com/repos/Thy-Vipe/BeastsOfBermuda-issues
|
opened
|
[Bug] cave jam location
|
Map bug tester-team
|
_Originally written by **Cleafspear | 76561198077984700**_
Game Version: 1.1.1085
*===== System Specs =====
CPU Brand: Intel(R) Core(TM) i7-8700K CPU @ 3.70GHz
Vendor: GenuineIntel
GPU Brand: NVIDIA GeForce GTX 1080
GPU Driver Info: Unknown
Num CPU Cores: 6
===================*
Context: **velo 1.2**
Map: Rival_Shores
the rock on the wall can jam a player when walked into, effects small creatures.
Location: X=-190842.766 Y=16915.342 Z=25901.346
|
1.0
|
[Bug] cave jam location - _Originally written by **Cleafspear | 76561198077984700**_
Game Version: 1.1.1085
*===== System Specs =====
CPU Brand: Intel(R) Core(TM) i7-8700K CPU @ 3.70GHz
Vendor: GenuineIntel
GPU Brand: NVIDIA GeForce GTX 1080
GPU Driver Info: Unknown
Num CPU Cores: 6
===================*
Context: **velo 1.2**
Map: Rival_Shores
the rock on the wall can jam a player when walked into, effects small creatures.
Location: X=-190842.766 Y=16915.342 Z=25901.346
|
non_process
|
cave jam location originally written by cleafspear game version system specs cpu brand intel r core tm cpu vendor genuineintel gpu brand nvidia geforce gtx gpu driver info unknown num cpu cores context velo map rival shores the rock on the wall can jam a player when walked into effects small creatures location x y z
| 0
|
86,263
| 10,479,262,708
|
IssuesEvent
|
2019-09-24 03:27:28
|
Naoghuman/naoghuman.github.io
|
https://api.github.com/repos/Naoghuman/naoghuman.github.io
|
opened
|
[web] Add content to the 'DSGVO' site.
|
documentation
|
[web] Add content to the 'DSGVO' site.
* Generate the stuff with recht24.de.
|
1.0
|
[web] Add content to the 'DSGVO' site. - [web] Add content to the 'DSGVO' site.
* Generate the stuff with recht24.de.
|
non_process
|
add content to the dsgvo site add content to the dsgvo site generate the stuff with de
| 0
|
271,416
| 23,603,339,487
|
IssuesEvent
|
2022-08-24 05:41:46
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
opened
|
ccl/changefeedccl: TestChangefeedInitialScan failed
|
C-test-failure O-robot branch-master
|
ccl/changefeedccl.TestChangefeedInitialScan [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_StressBazel/6213858?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_StressBazel/6213858?buildTab=artifacts#/) on master @ [003c0360de8b64319b5f0f127b99be91dbdca8a3](https://github.com/cockroachdb/cockroach/commits/003c0360de8b64319b5f0f127b99be91dbdca8a3):
```
=== RUN TestChangefeedInitialScan
test_log_scope.go:162: test logs captured to: /artifacts/tmp/_tmp/a77002d7c9453d7cd2d382f907780e13/logTestChangefeedInitialScan4243794510
test_log_scope.go:80: use -show-logs to present logs inline
=== CONT TestChangefeedInitialScan
changefeed_test.go:869: -- test log scope end --
--- FAIL: TestChangefeedInitialScan (7.01s)
=== RUN TestChangefeedInitialScan/sinkless/no_cursor_-_no_initial_scan
changefeed_test.go:834: ERROR: context canceled (SQLSTATE XXUUU)
--- FAIL: TestChangefeedInitialScan/sinkless/no_cursor_-_no_initial_scan (1.40s)
=== RUN TestChangefeedInitialScan/sinkless
helpers_test.go:716: making server as system tenant
helpers_test.go:803: making sinkless feed factory
--- FAIL: TestChangefeedInitialScan/sinkless (6.81s)
```
<p>Parameters: <code>TAGS=bazel,gss,deadlock</code>
</p>
<details><summary>Help</summary>
<p>
See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM)
</p>
</details>
/cc @cockroachdb/cdc
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestChangefeedInitialScan.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
|
1.0
|
ccl/changefeedccl: TestChangefeedInitialScan failed - ccl/changefeedccl.TestChangefeedInitialScan [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_StressBazel/6213858?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_StressBazel/6213858?buildTab=artifacts#/) on master @ [003c0360de8b64319b5f0f127b99be91dbdca8a3](https://github.com/cockroachdb/cockroach/commits/003c0360de8b64319b5f0f127b99be91dbdca8a3):
```
=== RUN TestChangefeedInitialScan
test_log_scope.go:162: test logs captured to: /artifacts/tmp/_tmp/a77002d7c9453d7cd2d382f907780e13/logTestChangefeedInitialScan4243794510
test_log_scope.go:80: use -show-logs to present logs inline
=== CONT TestChangefeedInitialScan
changefeed_test.go:869: -- test log scope end --
--- FAIL: TestChangefeedInitialScan (7.01s)
=== RUN TestChangefeedInitialScan/sinkless/no_cursor_-_no_initial_scan
changefeed_test.go:834: ERROR: context canceled (SQLSTATE XXUUU)
--- FAIL: TestChangefeedInitialScan/sinkless/no_cursor_-_no_initial_scan (1.40s)
=== RUN TestChangefeedInitialScan/sinkless
helpers_test.go:716: making server as system tenant
helpers_test.go:803: making sinkless feed factory
--- FAIL: TestChangefeedInitialScan/sinkless (6.81s)
```
<p>Parameters: <code>TAGS=bazel,gss,deadlock</code>
</p>
<details><summary>Help</summary>
<p>
See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM)
</p>
</details>
/cc @cockroachdb/cdc
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestChangefeedInitialScan.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
|
non_process
|
ccl changefeedccl testchangefeedinitialscan failed ccl changefeedccl testchangefeedinitialscan with on master run testchangefeedinitialscan test log scope go test logs captured to artifacts tmp tmp test log scope go use show logs to present logs inline cont testchangefeedinitialscan changefeed test go test log scope end fail testchangefeedinitialscan run testchangefeedinitialscan sinkless no cursor no initial scan changefeed test go error context canceled sqlstate xxuuu fail testchangefeedinitialscan sinkless no cursor no initial scan run testchangefeedinitialscan sinkless helpers test go making server as system tenant helpers test go making sinkless feed factory fail testchangefeedinitialscan sinkless parameters tags bazel gss deadlock help see also cc cockroachdb cdc
| 0
|
409,207
| 27,726,758,199
|
IssuesEvent
|
2023-03-15 03:16:43
|
omarperez77/IS2022
|
https://api.github.com/repos/omarperez77/IS2022
|
closed
|
Subir al repositorio el documento de especificación de requisitos
|
documentation
|
- [x] Subir el documento SRS.
- [x] Añadir los últimos cambios.
- [x] Dar formato acorde a las reglas APA
|
1.0
|
Subir al repositorio el documento de especificación de requisitos - - [x] Subir el documento SRS.
- [x] Añadir los últimos cambios.
- [x] Dar formato acorde a las reglas APA
|
non_process
|
subir al repositorio el documento de especificación de requisitos subir el documento srs añadir los últimos cambios dar formato acorde a las reglas apa
| 0
|
5,201
| 7,976,109,886
|
IssuesEvent
|
2018-07-17 11:35:26
|
pingcap/tidb
|
https://api.github.com/repos/pingcap/tidb
|
opened
|
read the encoded Chunk returned by TiKV
|
component/coprocessor status/WIP type/enhancement
|
If the `EncodeType` is 'Arrow' in the KVRequest, TiKV will return the encoded `Chunk`, we should handle it.
|
1.0
|
read the encoded Chunk returned by TiKV - If the `EncodeType` is 'Arrow' in the KVRequest, TiKV will return the encoded `Chunk`, we should handle it.
|
process
|
read the encoded chunk returned by tikv if the encodetype is arrow in the kvrequest tikv will return the encoded chunk we should handle it
| 1
|
9,471
| 12,466,684,851
|
IssuesEvent
|
2020-05-28 15:50:57
|
prisma/prisma
|
https://api.github.com/repos/prisma/prisma
|
opened
|
Disable update check
|
process/candidate
|

It should be possible to disable this update check, preferably using an environment variable.
|
1.0
|
Disable update check - 
It should be possible to disable this update check, preferably using an environment variable.
|
process
|
disable update check it should be possible to disable this update check preferably using an environment variable
| 1
|
4,512
| 7,358,862,667
|
IssuesEvent
|
2018-03-10 00:08:55
|
MetaMask/metamask-extension
|
https://api.github.com/repos/MetaMask/metamask-extension
|
closed
|
Migrate Circle CI Config Files To 2.0
|
L6-tests L9-process P2-sooner T2-refactor
|
"We’re Sunsetting CircleCI 1.0: August 31, 2018 is the Final Day for 1.0 Builds"
https://circleci.com/blog/sunsetting-1-0/
Requires us to upgrade the config files to CircleCI 2.0 syntax.
|
1.0
|
Migrate Circle CI Config Files To 2.0 - "We’re Sunsetting CircleCI 1.0: August 31, 2018 is the Final Day for 1.0 Builds"
https://circleci.com/blog/sunsetting-1-0/
Requires us to upgrade the config files to CircleCI 2.0 syntax.
|
process
|
migrate circle ci config files to we’re sunsetting circleci august is the final day for builds requires us to upgrade the config files to circleci syntax
| 1
|
18,993
| 13,536,007,514
|
IssuesEvent
|
2020-09-16 08:25:57
|
topcoder-platform/qa-fun
|
https://api.github.com/repos/topcoder-platform/qa-fun
|
closed
|
[TCO20-Southern Asia][Mobile-Android]No Checkout & add Coupon option
|
UX/Usability
|
Steps:-
1. Login to app in android
2 .Add a item to cart
3. Verify to add Checkout & Coupon option
Issue : No Checkout & Coupon option
Expected Result : Should have an option to Checkout & add Coupon
Device : Samsing A20

|
True
|
[TCO20-Southern Asia][Mobile-Android]No Checkout & add Coupon option - Steps:-
1. Login to app in android
2 .Add a item to cart
3. Verify to add Checkout & Coupon option
Issue : No Checkout & Coupon option
Expected Result : Should have an option to Checkout & add Coupon
Device : Samsing A20

|
non_process
|
no checkout add coupon option steps login to app in android add a item to cart verify to add checkout coupon option issue no checkout coupon option expected result should have an option to checkout add coupon device samsing
| 0
|
65,983
| 6,980,926,836
|
IssuesEvent
|
2017-12-13 04:57:00
|
LSSTDESC/descqa
|
https://api.github.com/repos/LSSTDESC/descqa
|
reopened
|
N(z) test
|
validation test
|
I believe @evevkovacs is already working on this. This issue is to track progress.
- [x] code to reduce mock data
- [x] code that works within DESCQA framework
- [x] validation data
- [ ] validation criteria
|
1.0
|
N(z) test - I believe @evevkovacs is already working on this. This issue is to track progress.
- [x] code to reduce mock data
- [x] code that works within DESCQA framework
- [x] validation data
- [ ] validation criteria
|
non_process
|
n z test i believe evevkovacs is already working on this this issue is to track progress code to reduce mock data code that works within descqa framework validation data validation criteria
| 0
|
6,108
| 8,966,980,815
|
IssuesEvent
|
2019-01-29 01:14:40
|
pawn-lang/compiler
|
https://api.github.com/repos/pawn-lang/compiler
|
closed
|
Hangs when a string define including an extended ascii table character (Windows-1252) gets processed inside an include
|
area: pre-processor state: cannot reproduce type: bug
|
**Is this a BUG REPORT, FEATURE REQUEST or QUESTION?**:
* [x] Bug Report
* [ ] Feature Request
* [ ] Question
**What happened**:
Compiler hangs when trying to compile this:
test.pwn:
```pawn
#define crash "ñ"
#include "crash.pwn"
```
crash.pwn:
```pawn
crash
```
(yeah, only one line including "crash")
**What you expected to happen**:
Not hang, give an relevant error
**How to reproduce it (as minimally and precisely as possible)**:
Try to compile the files above encoded as Windows-1252
**Anything else we need to know?**:
The hanging happens whenever the compiler find a match for the `#define`, and starts eating up a good chunk of CPU (up to 28% on my PC).
`ñ` is a character outside of the normal ASCII table, just an example. It also crashes with `á`, `ö` and probably all other characters outside of the normal ASCII spec.
**Environment**:
* Operating System
Windows 10 Pro Build 17682
* Compiler version
Original SA-MP Compiler (2007), 3.10.8
* How are you invoking the compiler? Pawno, Sublime, vscode, sampctl or command-line?
`pawncc test.pwn`
|
1.0
|
Hangs when a string define including an extended ascii table character (Windows-1252) gets processed inside an include - **Is this a BUG REPORT, FEATURE REQUEST or QUESTION?**:
* [x] Bug Report
* [ ] Feature Request
* [ ] Question
**What happened**:
Compiler hangs when trying to compile this:
test.pwn:
```pawn
#define crash "ñ"
#include "crash.pwn"
```
crash.pwn:
```pawn
crash
```
(yeah, only one line including "crash")
**What you expected to happen**:
Not hang, give an relevant error
**How to reproduce it (as minimally and precisely as possible)**:
Try to compile the files above encoded as Windows-1252
**Anything else we need to know?**:
The hanging happens whenever the compiler find a match for the `#define`, and starts eating up a good chunk of CPU (up to 28% on my PC).
`ñ` is a character outside of the normal ASCII table, just an example. It also crashes with `á`, `ö` and probably all other characters outside of the normal ASCII spec.
**Environment**:
* Operating System
Windows 10 Pro Build 17682
* Compiler version
Original SA-MP Compiler (2007), 3.10.8
* How are you invoking the compiler? Pawno, Sublime, vscode, sampctl or command-line?
`pawncc test.pwn`
|
process
|
hangs when a string define including an extended ascii table character windows gets processed inside an include is this a bug report feature request or question bug report feature request question what happened compiler hangs when trying to compile this test pwn pawn define crash ñ include crash pwn crash pwn pawn crash yeah only one line including crash what you expected to happen not hang give an relevant error how to reproduce it as minimally and precisely as possible try to compile the files above encoded as windows anything else we need to know the hanging happens whenever the compiler find a match for the define and starts eating up a good chunk of cpu up to on my pc ñ is a character outside of the normal ascii table just an example it also crashes with á ö and probably all other characters outside of the normal ascii spec environment operating system windows pro build compiler version original sa mp compiler how are you invoking the compiler pawno sublime vscode sampctl or command line pawncc test pwn
| 1
|
156,691
| 24,625,095,093
|
IssuesEvent
|
2022-10-16 12:17:21
|
dotnet/efcore
|
https://api.github.com/repos/dotnet/efcore
|
closed
|
OwnedType with same property name as primary key causes error on migration
|
closed-by-design
|
I cannot use owned types that have an Id property. It will trigger an error on migrations. This is using dotnet core 2.1 preview1. I'm not sure if it works on previous versions.
```
Exception message: The keys {'Id'} on 'OwnedType' and {'Id'} on 'Entity' are both mapped to 'Entity.PK_Entity' but with different columns ({'OwnedTypeId'} and {'Id'}).
Stack trace:System.InvalidOperationException: The keys {'Id'} on 'OwnedType' and {'Id'} on 'Entity' are both mapped to 'Entity.PK_Entity' but with different columns ({'OwnedTypeId'} and {'Id'}).
at Microsoft.EntityFrameworkCore.Infrastructure.RelationalModelValidator.ValidateSharedKeysCompatibility(IReadOnlyList`1 mappedTypes, String tableName)
at Microsoft.EntityFrameworkCore.Infrastructure.RelationalModelValidator.ValidateSharedTableCompatibility(IModel model)
at Microsoft.EntityFrameworkCore.Infrastructure.RelationalModelValidator.Validate(IModel model)
at Microsoft.EntityFrameworkCore.Internal.SqlServerModelValidator.Validate(IModel model)
at Microsoft.EntityFrameworkCore.Infrastructure.ModelSource.CreateModel(DbContext context, IConventionSetBuilder conventionSetBuilder, IModelValidator validator)
at System.Collections.Concurrent.ConcurrentDictionary`2.GetOrAdd(TKey key, Func`2 valueFactory)
at Microsoft.EntityFrameworkCore.Internal.DbContextServices.CreateModel()
at Microsoft.EntityFrameworkCore.Internal.DbContextServices.get_Model()
at Microsoft.Extensions.DependencyInjection.ServiceLookup.CallSiteRuntimeResolver.VisitScoped(ScopedCallSite scopedCallSite, ServiceProviderEngineScope scope)
at Microsoft.Extensions.DependencyInjection.ServiceLookup.CallSiteRuntimeResolver.VisitConstructor(ConstructorCallSite constructorCallSite, ServiceProviderEngineScope scope)
at Microsoft.Extensions.DependencyInjection.ServiceLookup.CallSiteRuntimeResolver.VisitScoped(ScopedCallSite scopedCallSite, ServiceProviderEngineScope scope)
at Microsoft.Extensions.DependencyInjection.ServiceProviderServiceExtensions.GetRequiredService(IServiceProvider provider, Type serviceType)
at Microsoft.Extensions.DependencyInjection.ServiceProviderServiceExtensions.GetRequiredService[T](IServiceProvider provider)
at Microsoft.EntityFrameworkCore.DbContext.get_DbContextDependencies()
at Microsoft.EntityFrameworkCore.DbContext.get_InternalServiceProvider()
at Microsoft.EntityFrameworkCore.Internal.InternalAccessorExtensions.GetService[TService](IInfrastructure`1 accessor)
at Microsoft.EntityFrameworkCore.Design.Internal.DbContextOperations.CreateContext(Func`1 factory)
at Microsoft.EntityFrameworkCore.Design.Internal.DbContextOperations.CreateContext(String contextType)
at Microsoft.EntityFrameworkCore.Design.Internal.MigrationsOperations.AddMigration(String name, String outputDir, String contextType)
at Microsoft.EntityFrameworkCore.Design.OperationExecutor.AddMigrationImpl(String name, String outputDir, String contextType)
at Microsoft.EntityFrameworkCore.Design.OperationExecutor.OperationBase.<>c__DisplayClass3_0`1.<Execute>b__0()
at Microsoft.EntityFrameworkCore.Design.OperationExecutor.OperationBase.Execute(Action action)
The keys {'Id'} on 'OwnedType' and {'Id'} on 'Entity' are both mapped to 'Entity.PK_Entity' but with different columns ({'OwnedTypeId'} and {'Id'}).
```
### Steps to reproduce
Create new console app with the following code.
Run command `dotnet ef migrations add InitialCreate` from commandline
```c#
using Microsoft.EntityFrameworkCore;
using System;
namespace EfCore2_1MigrationError
{
class Program
{
static void Main(string[] args)
{
Console.WriteLine("Hello World!");
}
}
public class ApplicationDbContext : DbContext
{
protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
{
optionsBuilder.UseSqlServer("Server=(local);Database=EfCore2_1Error;Trusted_Connection=True;");
}
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
modelBuilder.Entity<Entity>(builder =>
{
builder.HasKey(e => e.Id);
builder.OwnsOne(e => e.OwnedType, o =>
{
o.Property(e => e.Id).HasColumnName("OwnedTypeId");
});
});
}
}
public class Entity
{
public Guid Id { get; set; }
public OwnedType OwnedType { get; set; }
}
public class OwnedType
{
public Guid Id { get; set; }
}
}
```
csproj
```
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>netcoreapp2.1</TargetFramework>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="2.1.0-preview1-final" />
<PackageReference Include="Microsoft.EntityFrameworkCore.SqlServer" Version="2.1.0-preview1-final" />
</ItemGroup>
<ItemGroup>
<DotNetCliToolReference Include="Microsoft.EntityFrameworkCore.Tools.DotNet" Version="2.1.0-preview1-final" />
</ItemGroup>
</Project>
```
### Further technical details
EF Core version: 2.1.0-preview1-final
Database Provider: Microsoft.EntityFrameworkCore.SqlServer
Operating system: Windows 10 Pro
IDE: Visual Studio 2017 15.7.0 Preview 2
|
1.0
|
OwnedType with same property name as primary key causes error on migration - I cannot use owned types that have an Id property. It will trigger an error on migrations. This is using dotnet core 2.1 preview1. I'm not sure if it works on previous versions.
```
Exception message: The keys {'Id'} on 'OwnedType' and {'Id'} on 'Entity' are both mapped to 'Entity.PK_Entity' but with different columns ({'OwnedTypeId'} and {'Id'}).
Stack trace:System.InvalidOperationException: The keys {'Id'} on 'OwnedType' and {'Id'} on 'Entity' are both mapped to 'Entity.PK_Entity' but with different columns ({'OwnedTypeId'} and {'Id'}).
at Microsoft.EntityFrameworkCore.Infrastructure.RelationalModelValidator.ValidateSharedKeysCompatibility(IReadOnlyList`1 mappedTypes, String tableName)
at Microsoft.EntityFrameworkCore.Infrastructure.RelationalModelValidator.ValidateSharedTableCompatibility(IModel model)
at Microsoft.EntityFrameworkCore.Infrastructure.RelationalModelValidator.Validate(IModel model)
at Microsoft.EntityFrameworkCore.Internal.SqlServerModelValidator.Validate(IModel model)
at Microsoft.EntityFrameworkCore.Infrastructure.ModelSource.CreateModel(DbContext context, IConventionSetBuilder conventionSetBuilder, IModelValidator validator)
at System.Collections.Concurrent.ConcurrentDictionary`2.GetOrAdd(TKey key, Func`2 valueFactory)
at Microsoft.EntityFrameworkCore.Internal.DbContextServices.CreateModel()
at Microsoft.EntityFrameworkCore.Internal.DbContextServices.get_Model()
at Microsoft.Extensions.DependencyInjection.ServiceLookup.CallSiteRuntimeResolver.VisitScoped(ScopedCallSite scopedCallSite, ServiceProviderEngineScope scope)
at Microsoft.Extensions.DependencyInjection.ServiceLookup.CallSiteRuntimeResolver.VisitConstructor(ConstructorCallSite constructorCallSite, ServiceProviderEngineScope scope)
at Microsoft.Extensions.DependencyInjection.ServiceLookup.CallSiteRuntimeResolver.VisitScoped(ScopedCallSite scopedCallSite, ServiceProviderEngineScope scope)
at Microsoft.Extensions.DependencyInjection.ServiceProviderServiceExtensions.GetRequiredService(IServiceProvider provider, Type serviceType)
at Microsoft.Extensions.DependencyInjection.ServiceProviderServiceExtensions.GetRequiredService[T](IServiceProvider provider)
at Microsoft.EntityFrameworkCore.DbContext.get_DbContextDependencies()
at Microsoft.EntityFrameworkCore.DbContext.get_InternalServiceProvider()
at Microsoft.EntityFrameworkCore.Internal.InternalAccessorExtensions.GetService[TService](IInfrastructure`1 accessor)
at Microsoft.EntityFrameworkCore.Design.Internal.DbContextOperations.CreateContext(Func`1 factory)
at Microsoft.EntityFrameworkCore.Design.Internal.DbContextOperations.CreateContext(String contextType)
at Microsoft.EntityFrameworkCore.Design.Internal.MigrationsOperations.AddMigration(String name, String outputDir, String contextType)
at Microsoft.EntityFrameworkCore.Design.OperationExecutor.AddMigrationImpl(String name, String outputDir, String contextType)
at Microsoft.EntityFrameworkCore.Design.OperationExecutor.OperationBase.<>c__DisplayClass3_0`1.<Execute>b__0()
at Microsoft.EntityFrameworkCore.Design.OperationExecutor.OperationBase.Execute(Action action)
The keys {'Id'} on 'OwnedType' and {'Id'} on 'Entity' are both mapped to 'Entity.PK_Entity' but with different columns ({'OwnedTypeId'} and {'Id'}).
```
### Steps to reproduce
Create new console app with the following code.
Run command `dotnet ef migrations add InitialCreate` from commandline
```c#
using Microsoft.EntityFrameworkCore;
using System;
namespace EfCore2_1MigrationError
{
class Program
{
static void Main(string[] args)
{
Console.WriteLine("Hello World!");
}
}
public class ApplicationDbContext : DbContext
{
protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
{
optionsBuilder.UseSqlServer("Server=(local);Database=EfCore2_1Error;Trusted_Connection=True;");
}
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
modelBuilder.Entity<Entity>(builder =>
{
builder.HasKey(e => e.Id);
builder.OwnsOne(e => e.OwnedType, o =>
{
o.Property(e => e.Id).HasColumnName("OwnedTypeId");
});
});
}
}
public class Entity
{
public Guid Id { get; set; }
public OwnedType OwnedType { get; set; }
}
public class OwnedType
{
public Guid Id { get; set; }
}
}
```
csproj
```
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>netcoreapp2.1</TargetFramework>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="Microsoft.EntityFrameworkCore.Design" Version="2.1.0-preview1-final" />
<PackageReference Include="Microsoft.EntityFrameworkCore.SqlServer" Version="2.1.0-preview1-final" />
</ItemGroup>
<ItemGroup>
<DotNetCliToolReference Include="Microsoft.EntityFrameworkCore.Tools.DotNet" Version="2.1.0-preview1-final" />
</ItemGroup>
</Project>
```
### Further technical details
EF Core version: 2.1.0-preview1-final
Database Provider: Microsoft.EntityFrameworkCore.SqlServer
Operating system: Windows 10 Pro
IDE: Visual Studio 2017 15.7.0 Preview 2
|
non_process
|
ownedtype with same property name as primary key causes error on migration i cannot use owned types that have an id property it will trigger an error on migrations this is using dotnet core i m not sure if it works on previous versions exception message the keys id on ownedtype and id on entity are both mapped to entity pk entity but with different columns ownedtypeid and id stack trace system invalidoperationexception the keys id on ownedtype and id on entity are both mapped to entity pk entity but with different columns ownedtypeid and id at microsoft entityframeworkcore infrastructure relationalmodelvalidator validatesharedkeyscompatibility ireadonlylist mappedtypes string tablename at microsoft entityframeworkcore infrastructure relationalmodelvalidator validatesharedtablecompatibility imodel model at microsoft entityframeworkcore infrastructure relationalmodelvalidator validate imodel model at microsoft entityframeworkcore internal sqlservermodelvalidator validate imodel model at microsoft entityframeworkcore infrastructure modelsource createmodel dbcontext context iconventionsetbuilder conventionsetbuilder imodelvalidator validator at system collections concurrent concurrentdictionary getoradd tkey key func valuefactory at microsoft entityframeworkcore internal dbcontextservices createmodel at microsoft entityframeworkcore internal dbcontextservices get model at microsoft extensions dependencyinjection servicelookup callsiteruntimeresolver visitscoped scopedcallsite scopedcallsite serviceproviderenginescope scope at microsoft extensions dependencyinjection servicelookup callsiteruntimeresolver visitconstructor constructorcallsite constructorcallsite serviceproviderenginescope scope at microsoft extensions dependencyinjection servicelookup callsiteruntimeresolver visitscoped scopedcallsite scopedcallsite serviceproviderenginescope scope at microsoft extensions dependencyinjection serviceproviderserviceextensions getrequiredservice iserviceprovider provider type servicetype at microsoft extensions dependencyinjection serviceproviderserviceextensions getrequiredservice iserviceprovider provider at microsoft entityframeworkcore dbcontext get dbcontextdependencies at microsoft entityframeworkcore dbcontext get internalserviceprovider at microsoft entityframeworkcore internal internalaccessorextensions getservice iinfrastructure accessor at microsoft entityframeworkcore design internal dbcontextoperations createcontext func factory at microsoft entityframeworkcore design internal dbcontextoperations createcontext string contexttype at microsoft entityframeworkcore design internal migrationsoperations addmigration string name string outputdir string contexttype at microsoft entityframeworkcore design operationexecutor addmigrationimpl string name string outputdir string contexttype at microsoft entityframeworkcore design operationexecutor operationbase c b at microsoft entityframeworkcore design operationexecutor operationbase execute action action the keys id on ownedtype and id on entity are both mapped to entity pk entity but with different columns ownedtypeid and id steps to reproduce create new console app with the following code run command dotnet ef migrations add initialcreate from commandline c using microsoft entityframeworkcore using system namespace class program static void main string args console writeline hello world public class applicationdbcontext dbcontext protected override void onconfiguring dbcontextoptionsbuilder optionsbuilder optionsbuilder usesqlserver server local database trusted connection true protected override void onmodelcreating modelbuilder modelbuilder modelbuilder entity builder builder haskey e e id builder ownsone e e ownedtype o o property e e id hascolumnname ownedtypeid public class entity public guid id get set public ownedtype ownedtype get set public class ownedtype public guid id get set csproj exe further technical details ef core version final database provider microsoft entityframeworkcore sqlserver operating system windows pro ide visual studio preview
| 0
|
10,504
| 13,262,368,700
|
IssuesEvent
|
2020-08-20 21:41:15
|
googleapis/python-firestore
|
https://api.github.com/repos/googleapis/python-firestore
|
closed
|
Systests are being skipped on Kokoro
|
api: firestore type: process
|
The systests expect to have `FIRESTORE_APPLICATION_CREDENTIALS` set, but the templated `.kokoro/build.sh` is setting only the more normal `GOOGLE_APPLICATION_CREDENTIALS`. The [monorepo `.kokoro/build.sh` set both](https://github.com/googleapis/google-cloud-python/blob/0d8b70a2bd6a279fa5e172b182006c7038491521/.kokoro/build.sh#L27-L31).
|
1.0
|
Systests are being skipped on Kokoro - The systests expect to have `FIRESTORE_APPLICATION_CREDENTIALS` set, but the templated `.kokoro/build.sh` is setting only the more normal `GOOGLE_APPLICATION_CREDENTIALS`. The [monorepo `.kokoro/build.sh` set both](https://github.com/googleapis/google-cloud-python/blob/0d8b70a2bd6a279fa5e172b182006c7038491521/.kokoro/build.sh#L27-L31).
|
process
|
systests are being skipped on kokoro the systests expect to have firestore application credentials set but the templated kokoro build sh is setting only the more normal google application credentials the
| 1
|
17,508
| 23,319,041,110
|
IssuesEvent
|
2022-08-08 14:49:46
|
GoogleCloudPlatform/spring-cloud-gcp
|
https://api.github.com/repos/GoogleCloudPlatform/spring-cloud-gcp
|
closed
|
Please document release schedule
|
p2 process
|
**Is your feature request related to a problem? Please describe.**
The release schedule isn't documented anywhere.
I wonder when Spring Cloud GCP becomes Spring Boot 2.7 compatible.
**Describe the solution you'd like**
Something like https://calendar.spring.io/
**Describe alternatives you've considered**
Googling, but I found nothing.
|
1.0
|
Please document release schedule - **Is your feature request related to a problem? Please describe.**
The release schedule isn't documented anywhere.
I wonder when Spring Cloud GCP becomes Spring Boot 2.7 compatible.
**Describe the solution you'd like**
Something like https://calendar.spring.io/
**Describe alternatives you've considered**
Googling, but I found nothing.
|
process
|
please document release schedule is your feature request related to a problem please describe the release schedule isn t documented anywhere i wonder when spring cloud gcp becomes spring boot compatible describe the solution you d like something like describe alternatives you ve considered googling but i found nothing
| 1
|
6,190
| 9,104,165,534
|
IssuesEvent
|
2019-02-20 17:28:52
|
dita-ot/dita-ot
|
https://api.github.com/repos/dita-ot/dita-ot
|
closed
|
xref via keyref to external HTML page doesn't work
|
bug preprocess preprocess/keyref priority/medium
|
Hi,
There was a similar issue reported in the past, but the case is a bit different https://github.com/dita-ot/dita-ot/issues/1174.
DITA-OT version tested (direct from download): 1.6.3, 1.7.1 both have the issue.
In my understanding, either navtitle element or linktext are not honored in the map resolution (Not sure I use the right vocabulary - sorry - I mean when the map is built in the preprocess stage), when a keyref reference an external html page.
The link text is honored in a context of a topic resolution.
## Code
Here is a map using linktext and navtitle
``` xml
<map>
<title>Keydef on exernal content</title>
<keydef keys="ncbi"
href="http://www.ncbi.nlm.nih.gov/"
scope="external"
format="html">
<topicmeta>
<navtitle>NCBI web site</navtitle>
<linktext>NCBI web site</linktext>
</topicmeta>
</keydef>
<topicref href="concept.dita" type="concept"/>
<topicref keyref="ncbi"/>
</map>
```
After preprocess, the @href value seems to replace the element navtitle and link text
``` xml
<!-- (...) -->
<topicref class="- map/topicref "
format="html"
href="http://www.ncbi.nlm.nih.gov/"
keyref="ncbi"
scope="external"
xtrc="topicref:2;12:30"
xtrf="/Users/blefort/DITA-OT1.7.1/bugs/DITA-OT/keyref-html/main.ditamap">
<topicmeta class="- map/topicmeta ">
<navtitle class="- topic/navtitle ">http://www.ncbi.nlm.nih.gov/</navtitle>
<?ditaot gentext?>
<linktext class="- map/linktext ">http://www.ncbi.nlm.nih.gov/</linktext>
</topicmeta>
</topicref>
```
## Result
In the navigation, we will see http://www.ncbi.nlm.nih.gov/
In the concept: NCBI web site
## Samples
https://dl.dropbox.com/u/14980268/issues/keyref-html.zip
```
```
|
2.0
|
xref via keyref to external HTML page doesn't work - Hi,
There was a similar issue reported in the past, but the case is a bit different https://github.com/dita-ot/dita-ot/issues/1174.
DITA-OT version tested (direct from download): 1.6.3, 1.7.1 both have the issue.
In my understanding, either navtitle element or linktext are not honored in the map resolution (Not sure I use the right vocabulary - sorry - I mean when the map is built in the preprocess stage), when a keyref reference an external html page.
The link text is honored in a context of a topic resolution.
## Code
Here is a map using linktext and navtitle
``` xml
<map>
<title>Keydef on exernal content</title>
<keydef keys="ncbi"
href="http://www.ncbi.nlm.nih.gov/"
scope="external"
format="html">
<topicmeta>
<navtitle>NCBI web site</navtitle>
<linktext>NCBI web site</linktext>
</topicmeta>
</keydef>
<topicref href="concept.dita" type="concept"/>
<topicref keyref="ncbi"/>
</map>
```
After preprocess, the @href value seems to replace the element navtitle and link text
``` xml
<!-- (...) -->
<topicref class="- map/topicref "
format="html"
href="http://www.ncbi.nlm.nih.gov/"
keyref="ncbi"
scope="external"
xtrc="topicref:2;12:30"
xtrf="/Users/blefort/DITA-OT1.7.1/bugs/DITA-OT/keyref-html/main.ditamap">
<topicmeta class="- map/topicmeta ">
<navtitle class="- topic/navtitle ">http://www.ncbi.nlm.nih.gov/</navtitle>
<?ditaot gentext?>
<linktext class="- map/linktext ">http://www.ncbi.nlm.nih.gov/</linktext>
</topicmeta>
</topicref>
```
## Result
In the navigation, we will see http://www.ncbi.nlm.nih.gov/
In the concept: NCBI web site
## Samples
https://dl.dropbox.com/u/14980268/issues/keyref-html.zip
```
```
|
process
|
xref via keyref to external html page doesn t work hi there was a similar issue reported in the past but the case is a bit different dita ot version tested direct from download both have the issue in my understanding either navtitle element or linktext are not honored in the map resolution not sure i use the right vocabulary sorry i mean when the map is built in the preprocess stage when a keyref reference an external html page the link text is honored in a context of a topic resolution code here is a map using linktext and navtitle xml keydef on exernal content keydef keys ncbi href scope external format html ncbi web site ncbi web site after preprocess the href value seems to replace the element navtitle and link text xml topicref class map topicref format html href keyref ncbi scope external xtrc topicref xtrf users blefort dita bugs dita ot keyref html main ditamap result in the navigation we will see in the concept ncbi web site samples
| 1
|
19,521
| 25,832,291,397
|
IssuesEvent
|
2022-12-12 16:55:12
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
Cannot filter in Mongo on a nested key named just `_` (underscore)
|
Type:Bug Priority:P3 Database/Mongo Querying/Processor Querying/Parameters & Variables .Backend
|
Hi, I'm trying to filter a JSON/MongoDB. It works with some fields, but, for some arrays, I receive this error:
```
There was a problem with your question
Most of the time this is caused by an invalid selection or bad input value. Double check your inputs and retry your query.
`Show error details
Here's the full error message
(not ("Non-blank string" ""))
```
And in the debug:
```
03-30 15:07:16 WARN metabase.query-processor :: {:status :failed,
:class clojure.lang.ExceptionInfo,
:error (not ("Non-blank string" "")),
:stacktrace
["query_processor.resolve$fn__21338.invokeStatic(resolve.clj:136)"
"query_processor.resolve$fn__21338.invoke(resolve.clj:133)"
"query_processor.resolve$fn__21325$G__21320__21332.invoke(resolve.clj:129)"
"query_processor.resolve$value_ph_resolve_field.invokeStatic(resolve.clj:162)"
"query_processor.resolve$value_ph_resolve_field.invoke(resolve.clj:158)"
"query_processor.resolve$fn__21244$G__21226__21251.invoke(resolve.clj:40)"
"util$rpartial$fn__7447.doInvoke(util.clj:449)"
"query_processor.resolve$resolve_fields.invokeStatic(resolve.clj:211)"
"query_processor.resolve$resolve_fields.invoke(resolve.clj:188)"
"query_processor.resolve$resolve.invokeStatic(resolve.clj:269)"
"query_processor.resolve$resolve.invoke(resolve.clj:266)"
"query_processor.middleware.expand_resolve$expand_resolve_STAR_.invokeStatic(expand_resolve.clj:21)"
"query_processor.middleware.expand_resolve$expand_resolve_STAR_.invoke(expand_resolve.clj:16)"
"query_processor.middleware.add_row_count_and_status$add_row_count_and_status$fn__21473.invoke(add_row_count_and_status.clj:14)"
"driver.mongo$process_query_in_context$fn__39419$f__38894__auto____39421.invoke(mongo.clj:48)"
"driver.mongo.util$_with_mongo_connection.invokeStatic(util.clj:70)"
"driver.mongo.util$_with_mongo_connection.invoke(util.clj:42)"
"driver.mongo$process_query_in_context$fn__39419.invoke(mongo.clj:47)"
"query_processor.middleware.driver_specific$process_query_in_context$fn__22978.invoke(driver_specific.clj:12)"
"query_processor.middleware.resolve_driver$resolve_driver$fn__24064.invoke(resolve_driver.clj:14)"
"query_processor.middleware.catch_exceptions$catch_exceptions$fn__22900.invoke(catch_exceptions.clj:51)"
"query_processor$process_query.invokeStatic(query_processor.clj:64)"
"query_processor$process_query.invoke(query_processor.clj:59)"
"query_processor$run_and_save_query_BANG_.invokeStatic(query_processor.clj:180)"
"query_processor$run_and_save_query_BANG_.invoke(query_processor.clj:175)"
"query_processor$fn__24095$dataset_query__24100$fn__24101.invoke(query_processor.clj:212)"
"query_processor$fn__24095$dataset_query__24100.invoke(query_processor.clj:199)"
"api.dataset$fn__24866$fn__24869.invoke(dataset.clj:36)"
"api.common.internal$do_with_caught_api_exceptions.invokeStatic(internal.clj:229)"
"api.common.internal$do_with_caught_api_exceptions.invoke(internal.clj:224)"
"api.dataset$fn__24866.invokeStatic(dataset.clj:30)"
"api.dataset$fn__24866.invoke(dataset.clj:30)"
"middleware$enforce_authentication$fn__34556.invoke(middleware.clj:119)"
"api.routes$fn__34682.invokeStatic(routes.clj:57)"
"api.routes$fn__34682.invoke(routes.clj:57)"
"routes$fn__35755.invokeStatic(routes.clj:44)"
"routes$fn__35755.invoke(routes.clj:44)"
"middleware$log_api_call$fn__34655$fn__34657.invoke(middleware.clj:331)"
"middleware$log_api_call$fn__34655.invoke(middleware.clj:330)"
"middleware$add_security_headers$fn__34605.invoke(middleware.clj:246)"
"middleware$bind_current_user$fn__34560.invoke(middleware.clj:139)"
"middleware$maybe_set_site_url$fn__34609.invoke(middleware.clj:268)"],
:query
{:type "query",
:query {:source_table 47, :filter ["AND" ["CONTAINS" ["field-id" 1099] "casa"]]},
:parameters [],
:constraints {:max-results 10000, :max-results-bare-rows 2000},
:info
{:executed-by 1,
:context :ad-hoc,
:query-hash [37, 97, -56, -4, 102, 110, -18, 103, 82, -92, 97, 49, -79, 118, 71, -36, -85, 93, 75, 35, 106, 56, 96, 115, -40, 20, -24, -102, -44, -19, 91, 10],
:query-type "MBQL"}},
:expanded-query nil,
:ex-data
{:type :schema.core/error,
:schema metabase.query_processor.interface.Value,
:value
{:value "casa",
:field
{:field-id 1099,
:field-name "_",
:field-display-name "",
:base-type :type/Text,
:special-type nil,
:visibility-type :normal,
:table-id 47,
:schema-name nil,
:table-name nil,
:position nil,
:fk-field-id nil,
:description nil,
:parent-id 1098,
:parent {:field-id 1098, :fk-field-id nil, :datetime-unit nil}}},
:error {:field (named {:field-display-name (not ("Non-blank string" ""))} "field or expression reference")}}}
```
```
03-30 15:07:16 WARN metabase.query-processor :: Query failure: (not ("Non-blank string" ""))
["query_processor$assert_query_status_successful.invokeStatic(query_processor.clj:149)"
"query_processor$assert_query_status_successful.invoke(query_processor.clj:142)"
"query_processor$run_and_save_query_BANG_.invokeStatic(query_processor.clj:181)"
"query_processor$run_and_save_query_BANG_.invoke(query_processor.clj:175)"
"query_processor$fn__24095$dataset_query__24100$fn__24101.invoke(query_processor.clj:212)"
"query_processor$fn__24095$dataset_query__24100.invoke(query_processor.clj:199)"
"api.dataset$fn__24866$fn__24869.invoke(dataset.clj:36)"
"api.common.internal$do_with_caught_api_exceptions.invokeStatic(internal.clj:229)"
"api.common.internal$do_with_caught_api_exceptions.invoke(internal.clj:224)"
"api.dataset$fn__24866.invokeStatic(dataset.clj:30)"
"api.dataset$fn__24866.invoke(dataset.clj:30)"
"middleware$enforce_authentication$fn__34556.invoke(middleware.clj:119)"
"api.routes$fn__34682.invokeStatic(routes.clj:57)"
"api.routes$fn__34682.invoke(routes.clj:57)"
"routes$fn__35755.invokeStatic(routes.clj:44)"
"routes$fn__35755.invoke(routes.clj:44)"
"middleware$log_api_call$fn__34655$fn__34657.invoke(middleware.clj:331)"
"middleware$log_api_call$fn__34655.invoke(middleware.clj:330)"
"middleware$add_security_headers$fn__34605.invoke(middleware.clj:246)"
"middleware$bind_current_user$fn__34560.invoke(middleware.clj:139)"
"middleware$maybe_set_site_url$fn__34609.invoke(middleware.clj:268)"]
```
And here are the problematic fields
```
"v99": {
"_": "Rolos 1 a 4 com pista dupla. Rolo 5 com pista simples"
}
"v2": {
"_": "B2-035-F-III"
},
```
-------
Metabase version: 0.23
Database: MongoDB - 3.4
|
1.0
|
Cannot filter in Mongo on a nested key named just `_` (underscore) - Hi, I'm trying to filter a JSON/MongoDB. It works with some fields, but, for some arrays, I receive this error:
```
There was a problem with your question
Most of the time this is caused by an invalid selection or bad input value. Double check your inputs and retry your query.
`Show error details
Here's the full error message
(not ("Non-blank string" ""))
```
And in the debug:
```
03-30 15:07:16 WARN metabase.query-processor :: {:status :failed,
:class clojure.lang.ExceptionInfo,
:error (not ("Non-blank string" "")),
:stacktrace
["query_processor.resolve$fn__21338.invokeStatic(resolve.clj:136)"
"query_processor.resolve$fn__21338.invoke(resolve.clj:133)"
"query_processor.resolve$fn__21325$G__21320__21332.invoke(resolve.clj:129)"
"query_processor.resolve$value_ph_resolve_field.invokeStatic(resolve.clj:162)"
"query_processor.resolve$value_ph_resolve_field.invoke(resolve.clj:158)"
"query_processor.resolve$fn__21244$G__21226__21251.invoke(resolve.clj:40)"
"util$rpartial$fn__7447.doInvoke(util.clj:449)"
"query_processor.resolve$resolve_fields.invokeStatic(resolve.clj:211)"
"query_processor.resolve$resolve_fields.invoke(resolve.clj:188)"
"query_processor.resolve$resolve.invokeStatic(resolve.clj:269)"
"query_processor.resolve$resolve.invoke(resolve.clj:266)"
"query_processor.middleware.expand_resolve$expand_resolve_STAR_.invokeStatic(expand_resolve.clj:21)"
"query_processor.middleware.expand_resolve$expand_resolve_STAR_.invoke(expand_resolve.clj:16)"
"query_processor.middleware.add_row_count_and_status$add_row_count_and_status$fn__21473.invoke(add_row_count_and_status.clj:14)"
"driver.mongo$process_query_in_context$fn__39419$f__38894__auto____39421.invoke(mongo.clj:48)"
"driver.mongo.util$_with_mongo_connection.invokeStatic(util.clj:70)"
"driver.mongo.util$_with_mongo_connection.invoke(util.clj:42)"
"driver.mongo$process_query_in_context$fn__39419.invoke(mongo.clj:47)"
"query_processor.middleware.driver_specific$process_query_in_context$fn__22978.invoke(driver_specific.clj:12)"
"query_processor.middleware.resolve_driver$resolve_driver$fn__24064.invoke(resolve_driver.clj:14)"
"query_processor.middleware.catch_exceptions$catch_exceptions$fn__22900.invoke(catch_exceptions.clj:51)"
"query_processor$process_query.invokeStatic(query_processor.clj:64)"
"query_processor$process_query.invoke(query_processor.clj:59)"
"query_processor$run_and_save_query_BANG_.invokeStatic(query_processor.clj:180)"
"query_processor$run_and_save_query_BANG_.invoke(query_processor.clj:175)"
"query_processor$fn__24095$dataset_query__24100$fn__24101.invoke(query_processor.clj:212)"
"query_processor$fn__24095$dataset_query__24100.invoke(query_processor.clj:199)"
"api.dataset$fn__24866$fn__24869.invoke(dataset.clj:36)"
"api.common.internal$do_with_caught_api_exceptions.invokeStatic(internal.clj:229)"
"api.common.internal$do_with_caught_api_exceptions.invoke(internal.clj:224)"
"api.dataset$fn__24866.invokeStatic(dataset.clj:30)"
"api.dataset$fn__24866.invoke(dataset.clj:30)"
"middleware$enforce_authentication$fn__34556.invoke(middleware.clj:119)"
"api.routes$fn__34682.invokeStatic(routes.clj:57)"
"api.routes$fn__34682.invoke(routes.clj:57)"
"routes$fn__35755.invokeStatic(routes.clj:44)"
"routes$fn__35755.invoke(routes.clj:44)"
"middleware$log_api_call$fn__34655$fn__34657.invoke(middleware.clj:331)"
"middleware$log_api_call$fn__34655.invoke(middleware.clj:330)"
"middleware$add_security_headers$fn__34605.invoke(middleware.clj:246)"
"middleware$bind_current_user$fn__34560.invoke(middleware.clj:139)"
"middleware$maybe_set_site_url$fn__34609.invoke(middleware.clj:268)"],
:query
{:type "query",
:query {:source_table 47, :filter ["AND" ["CONTAINS" ["field-id" 1099] "casa"]]},
:parameters [],
:constraints {:max-results 10000, :max-results-bare-rows 2000},
:info
{:executed-by 1,
:context :ad-hoc,
:query-hash [37, 97, -56, -4, 102, 110, -18, 103, 82, -92, 97, 49, -79, 118, 71, -36, -85, 93, 75, 35, 106, 56, 96, 115, -40, 20, -24, -102, -44, -19, 91, 10],
:query-type "MBQL"}},
:expanded-query nil,
:ex-data
{:type :schema.core/error,
:schema metabase.query_processor.interface.Value,
:value
{:value "casa",
:field
{:field-id 1099,
:field-name "_",
:field-display-name "",
:base-type :type/Text,
:special-type nil,
:visibility-type :normal,
:table-id 47,
:schema-name nil,
:table-name nil,
:position nil,
:fk-field-id nil,
:description nil,
:parent-id 1098,
:parent {:field-id 1098, :fk-field-id nil, :datetime-unit nil}}},
:error {:field (named {:field-display-name (not ("Non-blank string" ""))} "field or expression reference")}}}
```
```
03-30 15:07:16 WARN metabase.query-processor :: Query failure: (not ("Non-blank string" ""))
["query_processor$assert_query_status_successful.invokeStatic(query_processor.clj:149)"
"query_processor$assert_query_status_successful.invoke(query_processor.clj:142)"
"query_processor$run_and_save_query_BANG_.invokeStatic(query_processor.clj:181)"
"query_processor$run_and_save_query_BANG_.invoke(query_processor.clj:175)"
"query_processor$fn__24095$dataset_query__24100$fn__24101.invoke(query_processor.clj:212)"
"query_processor$fn__24095$dataset_query__24100.invoke(query_processor.clj:199)"
"api.dataset$fn__24866$fn__24869.invoke(dataset.clj:36)"
"api.common.internal$do_with_caught_api_exceptions.invokeStatic(internal.clj:229)"
"api.common.internal$do_with_caught_api_exceptions.invoke(internal.clj:224)"
"api.dataset$fn__24866.invokeStatic(dataset.clj:30)"
"api.dataset$fn__24866.invoke(dataset.clj:30)"
"middleware$enforce_authentication$fn__34556.invoke(middleware.clj:119)"
"api.routes$fn__34682.invokeStatic(routes.clj:57)"
"api.routes$fn__34682.invoke(routes.clj:57)"
"routes$fn__35755.invokeStatic(routes.clj:44)"
"routes$fn__35755.invoke(routes.clj:44)"
"middleware$log_api_call$fn__34655$fn__34657.invoke(middleware.clj:331)"
"middleware$log_api_call$fn__34655.invoke(middleware.clj:330)"
"middleware$add_security_headers$fn__34605.invoke(middleware.clj:246)"
"middleware$bind_current_user$fn__34560.invoke(middleware.clj:139)"
"middleware$maybe_set_site_url$fn__34609.invoke(middleware.clj:268)"]
```
And here are the problematic fields
```
"v99": {
"_": "Rolos 1 a 4 com pista dupla. Rolo 5 com pista simples"
}
"v2": {
"_": "B2-035-F-III"
},
```
-------
Metabase version: 0.23
Database: MongoDB - 3.4
|
process
|
cannot filter in mongo on a nested key named just underscore hi i m trying to filter a json mongodb it works with some fields but for some arrays i receive this error there was a problem with your question most of the time this is caused by an invalid selection or bad input value double check your inputs and retry your query show error details here s the full error message not non blank string and in the debug warn metabase query processor status failed class clojure lang exceptioninfo error not non blank string stacktrace query processor resolve fn invokestatic resolve clj query processor resolve fn invoke resolve clj query processor resolve fn g invoke resolve clj query processor resolve value ph resolve field invokestatic resolve clj query processor resolve value ph resolve field invoke resolve clj query processor resolve fn g invoke resolve clj util rpartial fn doinvoke util clj query processor resolve resolve fields invokestatic resolve clj query processor resolve resolve fields invoke resolve clj query processor resolve resolve invokestatic resolve clj query processor resolve resolve invoke resolve clj query processor middleware expand resolve expand resolve star invokestatic expand resolve clj query processor middleware expand resolve expand resolve star invoke expand resolve clj query processor middleware add row count and status add row count and status fn invoke add row count and status clj driver mongo process query in context fn f auto invoke mongo clj driver mongo util with mongo connection invokestatic util clj driver mongo util with mongo connection invoke util clj driver mongo process query in context fn invoke mongo clj query processor middleware driver specific process query in context fn invoke driver specific clj query processor middleware resolve driver resolve driver fn invoke resolve driver clj query processor middleware catch exceptions catch exceptions fn invoke catch exceptions clj query processor process query invokestatic query processor clj query processor process query invoke query processor clj query processor run and save query bang invokestatic query processor clj query processor run and save query bang invoke query processor clj query processor fn dataset query fn invoke query processor clj query processor fn dataset query invoke query processor clj api dataset fn fn invoke dataset clj api common internal do with caught api exceptions invokestatic internal clj api common internal do with caught api exceptions invoke internal clj api dataset fn invokestatic dataset clj api dataset fn invoke dataset clj middleware enforce authentication fn invoke middleware clj api routes fn invokestatic routes clj api routes fn invoke routes clj routes fn invokestatic routes clj routes fn invoke routes clj middleware log api call fn fn invoke middleware clj middleware log api call fn invoke middleware clj middleware add security headers fn invoke middleware clj middleware bind current user fn invoke middleware clj middleware maybe set site url fn invoke middleware clj query type query query source table filter casa parameters constraints max results max results bare rows info executed by context ad hoc query hash query type mbql expanded query nil ex data type schema core error schema metabase query processor interface value value value casa field field id field name field display name base type type text special type nil visibility type normal table id schema name nil table name nil position nil fk field id nil description nil parent id parent field id fk field id nil datetime unit nil error field named field display name not non blank string field or expression reference warn metabase query processor query failure not non blank string query processor assert query status successful invokestatic query processor clj query processor assert query status successful invoke query processor clj query processor run and save query bang invokestatic query processor clj query processor run and save query bang invoke query processor clj query processor fn dataset query fn invoke query processor clj query processor fn dataset query invoke query processor clj api dataset fn fn invoke dataset clj api common internal do with caught api exceptions invokestatic internal clj api common internal do with caught api exceptions invoke internal clj api dataset fn invokestatic dataset clj api dataset fn invoke dataset clj middleware enforce authentication fn invoke middleware clj api routes fn invokestatic routes clj api routes fn invoke routes clj routes fn invokestatic routes clj routes fn invoke routes clj middleware log api call fn fn invoke middleware clj middleware log api call fn invoke middleware clj middleware add security headers fn invoke middleware clj middleware bind current user fn invoke middleware clj middleware maybe set site url fn invoke middleware clj and here are the problematic fields rolos a com pista dupla rolo com pista simples f iii metabase version database mongodb
| 1
|
8,068
| 11,251,340,538
|
IssuesEvent
|
2020-01-11 00:00:28
|
googleapis/java-irm
|
https://api.github.com/repos/googleapis/java-irm
|
opened
|
Promote to Beta
|
type: process
|
Package name: **google-cloud-irm**
Current release: **alpha**
Proposed release: **beta**
## Instructions
Check the lists below, adding tests / documentation as required. Once all the "required" boxes are ticked, please create a release and close this issue.
## Required
- [ ] Server API is beta or GA
- [ ] Service API is public
- [ ] Client surface is mostly stable (no known issues that could significantly change the surface)
- [ ] All manual types and methods have comment documentation
- [ ] Package name is idiomatic for the platform
- [ ] At least one integration/smoke test is defined and passing
- [ ] Central GitHub README lists and points to the per-API README
- [ ] Per-API README links to product page on cloud.google.com
- [ ] Manual code has been reviewed for API stability by repo owner
## Optional
- [ ] Most common / important scenarios have descriptive samples
- [ ] Public manual methods have at least one usage sample each (excluding overloads)
- [ ] Per-API README includes a full description of the API
- [ ] Per-API README contains at least one “getting started” sample using the most common API scenario
- [ ] Manual code has been reviewed by API producer
- [ ] Manual code has been reviewed by a DPE responsible for samples
- [ ] 'Client LIbraries' page is added to the product documentation in 'APIs & Reference' section of the product's documentation on Cloud Site
|
1.0
|
Promote to Beta - Package name: **google-cloud-irm**
Current release: **alpha**
Proposed release: **beta**
## Instructions
Check the lists below, adding tests / documentation as required. Once all the "required" boxes are ticked, please create a release and close this issue.
## Required
- [ ] Server API is beta or GA
- [ ] Service API is public
- [ ] Client surface is mostly stable (no known issues that could significantly change the surface)
- [ ] All manual types and methods have comment documentation
- [ ] Package name is idiomatic for the platform
- [ ] At least one integration/smoke test is defined and passing
- [ ] Central GitHub README lists and points to the per-API README
- [ ] Per-API README links to product page on cloud.google.com
- [ ] Manual code has been reviewed for API stability by repo owner
## Optional
- [ ] Most common / important scenarios have descriptive samples
- [ ] Public manual methods have at least one usage sample each (excluding overloads)
- [ ] Per-API README includes a full description of the API
- [ ] Per-API README contains at least one “getting started” sample using the most common API scenario
- [ ] Manual code has been reviewed by API producer
- [ ] Manual code has been reviewed by a DPE responsible for samples
- [ ] 'Client LIbraries' page is added to the product documentation in 'APIs & Reference' section of the product's documentation on Cloud Site
|
process
|
promote to beta package name google cloud irm current release alpha proposed release beta instructions check the lists below adding tests documentation as required once all the required boxes are ticked please create a release and close this issue required server api is beta or ga service api is public client surface is mostly stable no known issues that could significantly change the surface all manual types and methods have comment documentation package name is idiomatic for the platform at least one integration smoke test is defined and passing central github readme lists and points to the per api readme per api readme links to product page on cloud google com manual code has been reviewed for api stability by repo owner optional most common important scenarios have descriptive samples public manual methods have at least one usage sample each excluding overloads per api readme includes a full description of the api per api readme contains at least one “getting started” sample using the most common api scenario manual code has been reviewed by api producer manual code has been reviewed by a dpe responsible for samples client libraries page is added to the product documentation in apis reference section of the product s documentation on cloud site
| 1
|
130,252
| 27,636,284,307
|
IssuesEvent
|
2023-03-10 14:41:02
|
alvgonfri/dp2-acme-l3
|
https://api.github.com/repos/alvgonfri/dp2-acme-l3
|
opened
|
R24: Authenticated principals operations on offers
|
code optional
|
Operations by authenticated principals on offers:
- List the offers in the system and show their details.
|
1.0
|
R24: Authenticated principals operations on offers - Operations by authenticated principals on offers:
- List the offers in the system and show their details.
|
non_process
|
authenticated principals operations on offers operations by authenticated principals on offers list the offers in the system and show their details
| 0
|
7,131
| 10,278,005,442
|
IssuesEvent
|
2019-08-25 10:37:14
|
pwittchen/ReactiveNetwork
|
https://api.github.com/repos/pwittchen/ReactiveNetwork
|
closed
|
Release 3.0.6
|
release process
|
**Release notes**:
- added new method for creating `HttpsUrlConnection` (`HttpsURLConnection createHttpsUrlConnection(final String host, final int port,
final int timeoutInMs)`) in `WalledGardenInternetObservingStrategy`, appropriate method is chosen automatically basing on the protocol (`http` or `https`) - solves #323
- note: version 3.0.5 was skipped due to sonatype issues
**Things to do**:
- [x] update javadocs
- [x] bump version
- [x] release library
- [x] update changelog
- [x] create github release
|
1.0
|
Release 3.0.6 - **Release notes**:
- added new method for creating `HttpsUrlConnection` (`HttpsURLConnection createHttpsUrlConnection(final String host, final int port,
final int timeoutInMs)`) in `WalledGardenInternetObservingStrategy`, appropriate method is chosen automatically basing on the protocol (`http` or `https`) - solves #323
- note: version 3.0.5 was skipped due to sonatype issues
**Things to do**:
- [x] update javadocs
- [x] bump version
- [x] release library
- [x] update changelog
- [x] create github release
|
process
|
release release notes added new method for creating httpsurlconnection httpsurlconnection createhttpsurlconnection final string host final int port final int timeoutinms in walledgardeninternetobservingstrategy appropriate method is chosen automatically basing on the protocol http or https solves note version was skipped due to sonatype issues things to do update javadocs bump version release library update changelog create github release
| 1
|
21,325
| 28,961,593,112
|
IssuesEvent
|
2023-05-10 03:17:57
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
Ability to control the order of processing model output layers
|
Processing Feature Request Modeller
|
**Feature description.**
There are a lot of use cases where a specific layer order for processing model output layers is wanted, e.g. so that a layer with point features is not hidden by a layer with polygon features.
Unless I'm missing something, there's currently no way to control the order of processing model output layers, so it could be that you have to rearrange the layers manually after the model is finished.
Here's a [test_model_output.zip](https://github.com/qgis/QGIS/files/6824112/test_model_output.zip) where it looks like the output order is totally random, despite the algorithm `2` is always executed after `1` by setting a dependency.


Maybe one option to control the layer order would be to make the output order dependent on the order in which the model algorithms are executed. This way it would be possible to control the layer order by setting algorithm dependencies, e.g. for the sample model above layer `2` would be always positioned on top of layer `1`.
**Additional context**
This is a recurring question on SE:
- https://gis.stackexchange.com/questions/157702/processing-modeler-output-order
- https://gis.stackexchange.com/questions/278256/control-rendering-layer-order-of-outputs-from-qgis-3-processing-modeler
- https://gis.stackexchange.com/questions/387284/setting-layer-order-for-output-in-qgis-graphical-modeler
- https://gis.stackexchange.com/questions/377868/output-layers-order-in-qgis-graphic-modeller-algorithm-execution-order-depend
|
1.0
|
Ability to control the order of processing model output layers - **Feature description.**
There are a lot of use cases where a specific layer order for processing model output layers is wanted, e.g. so that a layer with point features is not hidden by a layer with polygon features.
Unless I'm missing something, there's currently no way to control the order of processing model output layers, so it could be that you have to rearrange the layers manually after the model is finished.
Here's a [test_model_output.zip](https://github.com/qgis/QGIS/files/6824112/test_model_output.zip) where it looks like the output order is totally random, despite the algorithm `2` is always executed after `1` by setting a dependency.


Maybe one option to control the layer order would be to make the output order dependent on the order in which the model algorithms are executed. This way it would be possible to control the layer order by setting algorithm dependencies, e.g. for the sample model above layer `2` would be always positioned on top of layer `1`.
**Additional context**
This is a recurring question on SE:
- https://gis.stackexchange.com/questions/157702/processing-modeler-output-order
- https://gis.stackexchange.com/questions/278256/control-rendering-layer-order-of-outputs-from-qgis-3-processing-modeler
- https://gis.stackexchange.com/questions/387284/setting-layer-order-for-output-in-qgis-graphical-modeler
- https://gis.stackexchange.com/questions/377868/output-layers-order-in-qgis-graphic-modeller-algorithm-execution-order-depend
|
process
|
ability to control the order of processing model output layers feature description there are a lot of use cases where a specific layer order for processing model output layers is wanted e g so that a layer with point features is not hidden by a layer with polygon features unless i m missing something there s currently no way to control the order of processing model output layers so it could be that you have to rearrange the layers manually after the model is finished here s a where it looks like the output order is totally random despite the algorithm is always executed after by setting a dependency maybe one option to control the layer order would be to make the output order dependent on the order in which the model algorithms are executed this way it would be possible to control the layer order by setting algorithm dependencies e g for the sample model above layer would be always positioned on top of layer additional context this is a recurring question on se
| 1
|
15,808
| 20,009,182,869
|
IssuesEvent
|
2022-02-01 02:45:41
|
elastic/beats
|
https://api.github.com/repos/elastic/beats
|
closed
|
[libbeat] add_network_direction creates field name that contains a dot
|
bug libbeat :Processors good first issue Team:Security-External Integrations
|
The `add_network_direction` processor creates field names that contain dots. By default it will create a document that contains
What is does:
```json
{
"network.direction": "outbound"
}
```
What is SHOULD do:
```json
{
"network": {
"direction": "outbound"
}
}
```
The problem is this line that directly uses the target key without considering that it could contain dots.
https://github.com/elastic/beats/blob/ce73772c4ef4be4577d0bb268524f0d96d31cc33/libbeat/processors/actions/add_network_direction.go#L103-L105
It looks to me like it should use:
` event.PutValue(m.Target, networkDirection(internalSource, internalDestination))`
|
1.0
|
[libbeat] add_network_direction creates field name that contains a dot - The `add_network_direction` processor creates field names that contain dots. By default it will create a document that contains
What is does:
```json
{
"network.direction": "outbound"
}
```
What is SHOULD do:
```json
{
"network": {
"direction": "outbound"
}
}
```
The problem is this line that directly uses the target key without considering that it could contain dots.
https://github.com/elastic/beats/blob/ce73772c4ef4be4577d0bb268524f0d96d31cc33/libbeat/processors/actions/add_network_direction.go#L103-L105
It looks to me like it should use:
` event.PutValue(m.Target, networkDirection(internalSource, internalDestination))`
|
process
|
add network direction creates field name that contains a dot the add network direction processor creates field names that contain dots by default it will create a document that contains what is does json network direction outbound what is should do json network direction outbound the problem is this line that directly uses the target key without considering that it could contain dots it looks to me like it should use event putvalue m target networkdirection internalsource internaldestination
| 1
|
19,937
| 26,405,513,481
|
IssuesEvent
|
2023-01-13 07:33:44
|
geneontology/go-ontology
|
https://api.github.com/repos/geneontology/go-ontology
|
reopened
|
Change label GO:0052151 positive regulation by symbiont of host apoptotic process
|
multi-species process term name
|
GO:0052151 positive regulation by symbiont of host apoptotic process
should be 'induction'
+ have taxon constraints to prevent host proteins being annotated.
@genegodbold
|
1.0
|
Change label GO:0052151 positive regulation by symbiont of host apoptotic process - GO:0052151 positive regulation by symbiont of host apoptotic process
should be 'induction'
+ have taxon constraints to prevent host proteins being annotated.
@genegodbold
|
process
|
change label go positive regulation by symbiont of host apoptotic process go positive regulation by symbiont of host apoptotic process should be induction have taxon constraints to prevent host proteins being annotated genegodbold
| 1
|
13,898
| 16,657,576,408
|
IssuesEvent
|
2021-06-05 20:14:08
|
laugharn/link
|
https://api.github.com/repos/laugharn/link
|
closed
|
Persist Front-End User Data
|
kind/improvement process/selected size/sm team/front
|
Waiting for the user response from a cold lambda is visually annoying. We should persist that data with some kind of local storage that is more expansive than a cookie but more secure than normal local storage.
- [x] Use localforage to persist user data
- [x] Clear the persisted data when a user is not authenticated or logs out.
|
1.0
|
Persist Front-End User Data - Waiting for the user response from a cold lambda is visually annoying. We should persist that data with some kind of local storage that is more expansive than a cookie but more secure than normal local storage.
- [x] Use localforage to persist user data
- [x] Clear the persisted data when a user is not authenticated or logs out.
|
process
|
persist front end user data waiting for the user response from a cold lambda is visually annoying we should persist that data with some kind of local storage that is more expansive than a cookie but more secure than normal local storage use localforage to persist user data clear the persisted data when a user is not authenticated or logs out
| 1
|
19,011
| 13,536,101,930
|
IssuesEvent
|
2020-09-16 08:33:58
|
topcoder-platform/qa-fun
|
https://api.github.com/repos/topcoder-platform/qa-fun
|
closed
|
"Back to topcoder.com" button opens a new tab instead of redirecting the user within the current tab
|
UX/Usability
|
Step to reproduce:
1. Go to https://www.topcoder.com/openassemblyreport
2. Click the gray button on top right corner
3. Observe how it redirects user to topcoder.com
Expected Result: User should be redirected to topcoder.com within the current tab
Actual Result: A new tab opened and redirected user to topcoder.com
Device/OS/Browser Information: Asus K401U/Windows 10 Home/Google Chrome Version 81.0.4044.138 (Official Build) (64-bit)
Screenshot:


|
True
|
"Back to topcoder.com" button opens a new tab instead of redirecting the user within the current tab - Step to reproduce:
1. Go to https://www.topcoder.com/openassemblyreport
2. Click the gray button on top right corner
3. Observe how it redirects user to topcoder.com
Expected Result: User should be redirected to topcoder.com within the current tab
Actual Result: A new tab opened and redirected user to topcoder.com
Device/OS/Browser Information: Asus K401U/Windows 10 Home/Google Chrome Version 81.0.4044.138 (Official Build) (64-bit)
Screenshot:


|
non_process
|
back to topcoder com button opens a new tab instead of redirecting the user within the current tab step to reproduce go to click the gray button on top right corner observe how it redirects user to topcoder com expected result user should be redirected to topcoder com within the current tab actual result a new tab opened and redirected user to topcoder com device os browser information asus windows home google chrome version official build bit screenshot
| 0
|
12,348
| 9,742,110,926
|
IssuesEvent
|
2019-06-02 14:34:46
|
teambit/bit
|
https://api.github.com/repos/teambit/bit
|
opened
|
[Feature] Autocomplete in Bit
|
area/infrastructure type/feature
|
Terminal commands must be exact and correct. This means a typo breaks an entire command. Commands in Bit tend to be rather long due to that fact that one has to type the full ID of a component. Additionally if one has to know the exact ID of the component they need, it splits the discovery of components from the import process. An autocomplete mechanism can merge both flows into one.
## Suggesting remotes
A Bit workspace has a local content store of remotes and the components it uses and tracks. When a developer starts typing the name of a remote, Bit should start suggesting completions to the remotes that are already "known" to the workspace (ie - the workspace's local store has components that are linked to these remotes).
For example:
```sh
$ bit import itaym
itaymendel.framer-demo
itaymendel.bit-docs
itaymendel.public-tests
itaymendel.test-glitch
```
### Handling "new" remotes
It's possible that there are other remotes that are not used by a workspace at the moment but are still available to use.
#### 'bit remote'
One can link remotes to their server using the `bit remote` command. Bit's autocomplete should use all the globally and locally defined remotes as possible candidates for completions.
#### Default resolver
Bit can connect with a remote service provider to act as a default resolver for remotes. We should extend that API to allow for suggestions for remotes.
## Suggesting components
After the developer is done typing the name of a remote, Bit's autocomplete mechanism should assist in finding the correct component. Here Bit should flatten the entire components tree in a remote collection as well as suggest completions to the different namespaces.
### Local components
The autocomplete should give priority to components and namespaces that are already a part of a workspace.
```js
$ bit import bit.movie-app/co
components [namespace]
components/button [imported]
componetns/item
```
### Remote components
The autocomplete should be able to fetch the list of components and namespaces from a remote and suggest them as well. There should be a distinction between components that are already a part of the workspace.
## Open questions
1. Handling autocomplete for components without remotes (components that are not exported yet).
1. UX for "fetching remote content" (when there are no local completions)
1. UX for autocomplete suggestions that are a dependency of a local component (not sure if we should even care about it)
1. Handling content in `package.json` and local `node_modules` folder (not only in the workspace store).
|
1.0
|
[Feature] Autocomplete in Bit - Terminal commands must be exact and correct. This means a typo breaks an entire command. Commands in Bit tend to be rather long due to that fact that one has to type the full ID of a component. Additionally if one has to know the exact ID of the component they need, it splits the discovery of components from the import process. An autocomplete mechanism can merge both flows into one.
## Suggesting remotes
A Bit workspace has a local content store of remotes and the components it uses and tracks. When a developer starts typing the name of a remote, Bit should start suggesting completions to the remotes that are already "known" to the workspace (ie - the workspace's local store has components that are linked to these remotes).
For example:
```sh
$ bit import itaym
itaymendel.framer-demo
itaymendel.bit-docs
itaymendel.public-tests
itaymendel.test-glitch
```
### Handling "new" remotes
It's possible that there are other remotes that are not used by a workspace at the moment but are still available to use.
#### 'bit remote'
One can link remotes to their server using the `bit remote` command. Bit's autocomplete should use all the globally and locally defined remotes as possible candidates for completions.
#### Default resolver
Bit can connect with a remote service provider to act as a default resolver for remotes. We should extend that API to allow for suggestions for remotes.
## Suggesting components
After the developer is done typing the name of a remote, Bit's autocomplete mechanism should assist in finding the correct component. Here Bit should flatten the entire components tree in a remote collection as well as suggest completions to the different namespaces.
### Local components
The autocomplete should give priority to components and namespaces that are already a part of a workspace.
```js
$ bit import bit.movie-app/co
components [namespace]
components/button [imported]
componetns/item
```
### Remote components
The autocomplete should be able to fetch the list of components and namespaces from a remote and suggest them as well. There should be a distinction between components that are already a part of the workspace.
## Open questions
1. Handling autocomplete for components without remotes (components that are not exported yet).
1. UX for "fetching remote content" (when there are no local completions)
1. UX for autocomplete suggestions that are a dependency of a local component (not sure if we should even care about it)
1. Handling content in `package.json` and local `node_modules` folder (not only in the workspace store).
|
non_process
|
autocomplete in bit terminal commands must be exact and correct this means a typo breaks an entire command commands in bit tend to be rather long due to that fact that one has to type the full id of a component additionally if one has to know the exact id of the component they need it splits the discovery of components from the import process an autocomplete mechanism can merge both flows into one suggesting remotes a bit workspace has a local content store of remotes and the components it uses and tracks when a developer starts typing the name of a remote bit should start suggesting completions to the remotes that are already known to the workspace ie the workspace s local store has components that are linked to these remotes for example sh bit import itaym itaymendel framer demo itaymendel bit docs itaymendel public tests itaymendel test glitch handling new remotes it s possible that there are other remotes that are not used by a workspace at the moment but are still available to use bit remote one can link remotes to their server using the bit remote command bit s autocomplete should use all the globally and locally defined remotes as possible candidates for completions default resolver bit can connect with a remote service provider to act as a default resolver for remotes we should extend that api to allow for suggestions for remotes suggesting components after the developer is done typing the name of a remote bit s autocomplete mechanism should assist in finding the correct component here bit should flatten the entire components tree in a remote collection as well as suggest completions to the different namespaces local components the autocomplete should give priority to components and namespaces that are already a part of a workspace js bit import bit movie app co components components button componetns item remote components the autocomplete should be able to fetch the list of components and namespaces from a remote and suggest them as well there should be a distinction between components that are already a part of the workspace open questions handling autocomplete for components without remotes components that are not exported yet ux for fetching remote content when there are no local completions ux for autocomplete suggestions that are a dependency of a local component not sure if we should even care about it handling content in package json and local node modules folder not only in the workspace store
| 0
|
22,093
| 30,613,671,962
|
IssuesEvent
|
2023-07-23 23:01:46
|
AvaloniaUI/Avalonia
|
https://api.github.com/repos/AvaloniaUI/Avalonia
|
closed
|
[11.0.0] TextBox input broken on Android since RC2
|
bug os-android area-textprocessing
|
**Describe the bug**
Input using device keyboard on a few tested devices (Samsung S21 android 13, Zebra TC21 android 11, Datalogic Skorpio x5 android 10, all using their default keyboards) is broken with digit input (maybe other inputs as well).
For example on the Zebra TC21 this is the result, when typing "1 2 3 4 5":
In the default TextBox character '5' is doubled.
In the password TextBox '3', '4' and '5' are doubled.
Results differ per type of device (the Samsung S21 in password textbox shows "234" as result).

(label shows content of password textbox)
**To Reproduce**
```xaml
<StackPanel Margin="20" Spacing="20">
<TextBox
x:Name="text"
HorizontalAlignment="Stretch"
VerticalAlignment="Center" />
<TextBox
x:Name="pass"
HorizontalAlignment="Stretch"
VerticalAlignment="Center"
PasswordChar="*" />
<TextBlock HorizontalAlignment="Stretch" Text="{Binding #pass.Text}" />
</StackPanel>
```
- OS: Android
- Version 10, 11 and 13 tested
- - Avalonia 11.0.0 RC2.2 and 11.0.0 -- problem does not exist on RC1 and before (it was a problem back in a preview in november last year)
|
1.0
|
[11.0.0] TextBox input broken on Android since RC2 - **Describe the bug**
Input using device keyboard on a few tested devices (Samsung S21 android 13, Zebra TC21 android 11, Datalogic Skorpio x5 android 10, all using their default keyboards) is broken with digit input (maybe other inputs as well).
For example on the Zebra TC21 this is the result, when typing "1 2 3 4 5":
In the default TextBox character '5' is doubled.
In the password TextBox '3', '4' and '5' are doubled.
Results differ per type of device (the Samsung S21 in password textbox shows "234" as result).

(label shows content of password textbox)
**To Reproduce**
```xaml
<StackPanel Margin="20" Spacing="20">
<TextBox
x:Name="text"
HorizontalAlignment="Stretch"
VerticalAlignment="Center" />
<TextBox
x:Name="pass"
HorizontalAlignment="Stretch"
VerticalAlignment="Center"
PasswordChar="*" />
<TextBlock HorizontalAlignment="Stretch" Text="{Binding #pass.Text}" />
</StackPanel>
```
- OS: Android
- Version 10, 11 and 13 tested
- - Avalonia 11.0.0 RC2.2 and 11.0.0 -- problem does not exist on RC1 and before (it was a problem back in a preview in november last year)
|
process
|
textbox input broken on android since describe the bug input using device keyboard on a few tested devices samsung android zebra android datalogic skorpio android all using their default keyboards is broken with digit input maybe other inputs as well for example on the zebra this is the result when typing in the default textbox character is doubled in the password textbox and are doubled results differ per type of device the samsung in password textbox shows as result label shows content of password textbox to reproduce xaml textbox x name text horizontalalignment stretch verticalalignment center textbox x name pass horizontalalignment stretch verticalalignment center passwordchar os android version and tested avalonia and problem does not exist on and before it was a problem back in a preview in november last year
| 1
|
52,316
| 12,949,234,993
|
IssuesEvent
|
2020-07-19 08:15:39
|
BEEmod/BEE2-items
|
https://api.github.com/repos/BEEmod/BEE2-items
|
closed
|
Placement Helper model overridden by older versions
|
Bug Development Build Fixed
|
Editor model is misplaced and is also missing the arrow entirely. The in-game results are bugged too.

In-game the portal is supposed to be placed on the side but gets placed like this instead:

|
1.0
|
Placement Helper model overridden by older versions - Editor model is misplaced and is also missing the arrow entirely. The in-game results are bugged too.

In-game the portal is supposed to be placed on the side but gets placed like this instead:

|
non_process
|
placement helper model overridden by older versions editor model is misplaced and is also missing the arrow entirely the in game results are bugged too in game the portal is supposed to be placed on the side but gets placed like this instead
| 0
|
114,225
| 24,568,837,507
|
IssuesEvent
|
2022-10-13 06:55:21
|
ballerina-platform/ballerina-lang
|
https://api.github.com/repos/ballerina-platform/ballerina-lang
|
closed
|
[Bug]: ClassCastException in ExtractToFunctionCodeAction
|
Type/Bug Team/LanguageServer Points/1 Area/CodeAction Reason/EngineeringMistake userCategory/Editor
|
### Description
Placing the cursor at the following cursor position gives a ClassCastException,
<img width="1198" alt="Screenshot 2022-10-03 at 13 02 16" src="https://user-images.githubusercontent.com/61020198/193523423-b495998e-5ad4-4a13-9a77-40ecdfa8e044.png">
```
[Error - 1:00:51 PM] CodeAction 'ExtractToFunctionCodeAction' failed! {, error: 'class io.ballerina.compiler.syntax.tree.BasicLiteralNode cannot be cast to class io.ballerina.compiler.syntax.tree.FieldAccessExpressionNode (io.ballerina.compiler.syntax.tree.BasicLiteralNode and io.ballerina.compiler.syntax.tree.FieldAccessExpressionNode are in unnamed module of loader 'app')'}
java.lang.ClassCastException: class io.ballerina.compiler.syntax.tree.BasicLiteralNode cannot be cast to class io.ballerina.compiler.syntax.tree.FieldAccessExpressionNode (io.ballerina.compiler.syntax.tree.BasicLiteralNode and io.ballerina.compiler.syntax.tree.FieldAccessExpressionNode are in unnamed module of loader 'app')
at org.ballerinalang.langserver.codeaction.providers.ExtractToFunctionCodeAction.isExpressionExtractable(ExtractToFunctionCodeAction.java:549)
at org.ballerinalang.langserver.codeaction.providers.ExtractToFunctionCodeAction.validate(ExtractToFunctionCodeAction.java:86)
at org.ballerinalang.langserver.codeaction.CodeActionRouter.lambda$getAvailableCodeActions$1(CodeActionRouter.java:94)
at java.base/java.util.ArrayList.forEach(ArrayList.java:1540)
at org.ballerinalang.langserver.codeaction.CodeActionRouter.getAvailableCodeActions(CodeActionRouter.java:88)
at org.ballerinalang.langserver.codeaction.BallerinaCodeActionExtension.execute(BallerinaCodeActionExtension.java:49)
at org.ballerinalang.langserver.codeaction.BallerinaCodeActionExtension.execute(BallerinaCodeActionExtension.java:37)
at org.ballerinalang.langserver.LangExtensionDelegator.codeActions(LangExtensionDelegator.java:169)
at org.ballerinalang.langserver.BallerinaTextDocumentService.lambda$codeAction$8(BallerinaTextDocumentService.java:334)
at java.base/java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:642)
at java.base/java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:479)
at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:177)
```
### Steps to Reproduce
```
type Student record {
string fname;
string lname;
int age;
};
public function main() {
Student st = {fname: "<cursor>",};
}
```
### Affected Version(s)
2201.2.x
### Related area
-> Editor
|
1.0
|
[Bug]: ClassCastException in ExtractToFunctionCodeAction - ### Description
Placing the cursor at the following cursor position gives a ClassCastException,
<img width="1198" alt="Screenshot 2022-10-03 at 13 02 16" src="https://user-images.githubusercontent.com/61020198/193523423-b495998e-5ad4-4a13-9a77-40ecdfa8e044.png">
```
[Error - 1:00:51 PM] CodeAction 'ExtractToFunctionCodeAction' failed! {, error: 'class io.ballerina.compiler.syntax.tree.BasicLiteralNode cannot be cast to class io.ballerina.compiler.syntax.tree.FieldAccessExpressionNode (io.ballerina.compiler.syntax.tree.BasicLiteralNode and io.ballerina.compiler.syntax.tree.FieldAccessExpressionNode are in unnamed module of loader 'app')'}
java.lang.ClassCastException: class io.ballerina.compiler.syntax.tree.BasicLiteralNode cannot be cast to class io.ballerina.compiler.syntax.tree.FieldAccessExpressionNode (io.ballerina.compiler.syntax.tree.BasicLiteralNode and io.ballerina.compiler.syntax.tree.FieldAccessExpressionNode are in unnamed module of loader 'app')
at org.ballerinalang.langserver.codeaction.providers.ExtractToFunctionCodeAction.isExpressionExtractable(ExtractToFunctionCodeAction.java:549)
at org.ballerinalang.langserver.codeaction.providers.ExtractToFunctionCodeAction.validate(ExtractToFunctionCodeAction.java:86)
at org.ballerinalang.langserver.codeaction.CodeActionRouter.lambda$getAvailableCodeActions$1(CodeActionRouter.java:94)
at java.base/java.util.ArrayList.forEach(ArrayList.java:1540)
at org.ballerinalang.langserver.codeaction.CodeActionRouter.getAvailableCodeActions(CodeActionRouter.java:88)
at org.ballerinalang.langserver.codeaction.BallerinaCodeActionExtension.execute(BallerinaCodeActionExtension.java:49)
at org.ballerinalang.langserver.codeaction.BallerinaCodeActionExtension.execute(BallerinaCodeActionExtension.java:37)
at org.ballerinalang.langserver.LangExtensionDelegator.codeActions(LangExtensionDelegator.java:169)
at org.ballerinalang.langserver.BallerinaTextDocumentService.lambda$codeAction$8(BallerinaTextDocumentService.java:334)
at java.base/java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:642)
at java.base/java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:479)
at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:177)
```
### Steps to Reproduce
```
type Student record {
string fname;
string lname;
int age;
};
public function main() {
Student st = {fname: "<cursor>",};
}
```
### Affected Version(s)
2201.2.x
### Related area
-> Editor
|
non_process
|
classcastexception in extracttofunctioncodeaction description placing the cursor at the following cursor position gives a classcastexception img width alt screenshot at src codeaction extracttofunctioncodeaction failed error class io ballerina compiler syntax tree basicliteralnode cannot be cast to class io ballerina compiler syntax tree fieldaccessexpressionnode io ballerina compiler syntax tree basicliteralnode and io ballerina compiler syntax tree fieldaccessexpressionnode are in unnamed module of loader app java lang classcastexception class io ballerina compiler syntax tree basicliteralnode cannot be cast to class io ballerina compiler syntax tree fieldaccessexpressionnode io ballerina compiler syntax tree basicliteralnode and io ballerina compiler syntax tree fieldaccessexpressionnode are in unnamed module of loader app at org ballerinalang langserver codeaction providers extracttofunctioncodeaction isexpressionextractable extracttofunctioncodeaction java at org ballerinalang langserver codeaction providers extracttofunctioncodeaction validate extracttofunctioncodeaction java at org ballerinalang langserver codeaction codeactionrouter lambda getavailablecodeactions codeactionrouter java at java base java util arraylist foreach arraylist java at org ballerinalang langserver codeaction codeactionrouter getavailablecodeactions codeactionrouter java at org ballerinalang langserver codeaction ballerinacodeactionextension execute ballerinacodeactionextension java at org ballerinalang langserver codeaction ballerinacodeactionextension execute ballerinacodeactionextension java at org ballerinalang langserver langextensiondelegator codeactions langextensiondelegator java at org ballerinalang langserver ballerinatextdocumentservice lambda codeaction ballerinatextdocumentservice java at java base java util concurrent completablefuture uniapply tryfire completablefuture java at java base java util concurrent completablefuture completion exec completablefuture java at java base java util concurrent forkjointask doexec forkjointask java at java base java util concurrent forkjoinpool workqueue toplevelexec forkjoinpool java at java base java util concurrent forkjoinpool scan forkjoinpool java at java base java util concurrent forkjoinpool runworker forkjoinpool java at java base java util concurrent forkjoinworkerthread run forkjoinworkerthread java steps to reproduce type student record string fname string lname int age public function main student st fname affected version s x related area editor
| 0
|
197,774
| 6,963,710,461
|
IssuesEvent
|
2017-12-08 18:31:37
|
AmpersandTarski/Ampersand
|
https://api.github.com/repos/AmpersandTarski/Ampersand
|
closed
|
Fatal: after checking everything an unhandled singleton value found!
|
bug priority:high xlsx populations
|
# Symptom
@ellekebaecke showed me a bug, which I can reproduce. This bug is urgent, because this bug is in a distributed version of the compiler. So it also exposes a flaw in our automated testing procedure.
I compiled `D:\git\ampersand-models\MirrorMe\MiniMirrorMe.adl` using Ampersand-v3.9.0 [master:fb042c8], build time: 28-Nov-17 09:05:15
I expected the compiler to work as intended. However, I got a compiler error:

|
1.0
|
Fatal: after checking everything an unhandled singleton value found! - # Symptom
@ellekebaecke showed me a bug, which I can reproduce. This bug is urgent, because this bug is in a distributed version of the compiler. So it also exposes a flaw in our automated testing procedure.
I compiled `D:\git\ampersand-models\MirrorMe\MiniMirrorMe.adl` using Ampersand-v3.9.0 [master:fb042c8], build time: 28-Nov-17 09:05:15
I expected the compiler to work as intended. However, I got a compiler error:

|
non_process
|
fatal after checking everything an unhandled singleton value found symptom ellekebaecke showed me a bug which i can reproduce this bug is urgent because this bug is in a distributed version of the compiler so it also exposes a flaw in our automated testing procedure i compiled d git ampersand models mirrorme minimirrorme adl using ampersand build time nov i expected the compiler to work as intended however i got a compiler error
| 0
|
12,530
| 14,972,326,713
|
IssuesEvent
|
2021-01-27 22:39:01
|
BootBlock/FileSieve
|
https://api.github.com/repos/BootBlock/FileSieve
|
opened
|
Add YieldNoPreScan to Get Files Mode in the Source Item Editor
|
backend-core processing ui
|
Or add a checkbox to the Source Item advanced settings that say “Do not perform a pre-scan” as it’s not actually mutually-exclusive with the current **Get Files Mode**.
This needs to be thought out better from a code point-of-view.
Leaving for now.
|
1.0
|
Add YieldNoPreScan to Get Files Mode in the Source Item Editor - Or add a checkbox to the Source Item advanced settings that say “Do not perform a pre-scan” as it’s not actually mutually-exclusive with the current **Get Files Mode**.
This needs to be thought out better from a code point-of-view.
Leaving for now.
|
process
|
add yieldnoprescan to get files mode in the source item editor or add a checkbox to the source item advanced settings that say “do not perform a pre scan” as it’s not actually mutually exclusive with the current get files mode this needs to be thought out better from a code point of view leaving for now
| 1
|
263,142
| 28,021,942,798
|
IssuesEvent
|
2023-03-28 06:22:04
|
elastic/kibana
|
https://api.github.com/repos/elastic/kibana
|
closed
|
[Cloud Security][Findings] Runtime field error
|
bug Team:Cloud Security 8.7 candidate
|
The findings page fails after upgrade with pre-existing KSPM data, the error happens because some documents don't have the `cluster_id` field

|
True
|
[Cloud Security][Findings] Runtime field error - The findings page fails after upgrade with pre-existing KSPM data, the error happens because some documents don't have the `cluster_id` field

|
non_process
|
runtime field error the findings page fails after upgrade with pre existing kspm data the error happens because some documents don t have the cluster id field
| 0
|
11,932
| 9,533,134,722
|
IssuesEvent
|
2019-04-29 20:27:50
|
MicrosoftDocs/azure-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-docs
|
closed
|
Volume [resourceId('Microsoft.ServiceFabricMesh/volumes','testVolume')] does not exist
|
assigned-to-author product-question service-fabric-mesh/svc triaged
|
Following the code above exactly, putting in my storage account info and I get the error above when trying to run the code.
Why would it think my volume does not exist?
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: bb141910-2325-da04-e131-e58487fb6a43
* Version Independent ID: 8e4e14c0-b1ff-eb7f-0241-f781060862d0
* Content: [Use an Azure Files based volume in a Service Fabric Mesh application](https://docs.microsoft.com/en-us/azure/service-fabric-mesh/service-fabric-mesh-howto-deploy-app-azurefiles-volume#feedback)
* Content Source: [articles/service-fabric-mesh/service-fabric-mesh-howto-deploy-app-azurefiles-volume.md](https://github.com/Microsoft/azure-docs/blob/master/articles/service-fabric-mesh/service-fabric-mesh-howto-deploy-app-azurefiles-volume.md)
* Service: **service-fabric-mesh**
* GitHub Login: @dkkapur
* Microsoft Alias: **dekapur**
|
1.0
|
Volume [resourceId('Microsoft.ServiceFabricMesh/volumes','testVolume')] does not exist - Following the code above exactly, putting in my storage account info and I get the error above when trying to run the code.
Why would it think my volume does not exist?
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: bb141910-2325-da04-e131-e58487fb6a43
* Version Independent ID: 8e4e14c0-b1ff-eb7f-0241-f781060862d0
* Content: [Use an Azure Files based volume in a Service Fabric Mesh application](https://docs.microsoft.com/en-us/azure/service-fabric-mesh/service-fabric-mesh-howto-deploy-app-azurefiles-volume#feedback)
* Content Source: [articles/service-fabric-mesh/service-fabric-mesh-howto-deploy-app-azurefiles-volume.md](https://github.com/Microsoft/azure-docs/blob/master/articles/service-fabric-mesh/service-fabric-mesh-howto-deploy-app-azurefiles-volume.md)
* Service: **service-fabric-mesh**
* GitHub Login: @dkkapur
* Microsoft Alias: **dekapur**
|
non_process
|
volume does not exist following the code above exactly putting in my storage account info and i get the error above when trying to run the code why would it think my volume does not exist document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service service fabric mesh github login dkkapur microsoft alias dekapur
| 0
|
777,571
| 27,286,156,800
|
IssuesEvent
|
2023-02-23 13:38:56
|
UCL/TDMS
|
https://api.github.com/repos/UCL/TDMS
|
closed
|
Update to `iteratefdtd_matrix`: Main Loop Alterations
|
enhancement priority:1 subtask
|
Subtask of #208 |
### Main Loop Alterations
These changes are rather self contained, and do not require any major reworking of the current test framework or how `tdms` behaves on the command line. _Where_ in the codebase these changes occur will be in either `execute_simulation` or `simulation.execute`, depending on post or pre #215.
- [ ] Whenever `{I,J,K}_source` is non-empty, the corresponding field update lines in the main loop need to be executed. These are a subsection of lines [2121 through 2231](https://github.com/UCL/TDMS/blob/384a91ab8d0b24bc2d48233d710d2f4e4cf14f64/tdms/src/simulation_manager/execute_simulation.cpp#L2121-L2231).
- [ ] If all `{I,J,K}_source` are non-empty, then outputs should not be normalised. The functions listed in the issue have been superseded by class methods, with the lines not to be executed being [here](https://github.com/UCL/TDMS/blob/1924201bdad21465f6aa05677b4294eed7993832/tdms/src/iterator.cpp#L4222-L4242) (or in the `post_loop_processing()` method if [PR215](https://github.com/UCL/TDMS/pull/215) is merged).
#### To clarify:
- There is already conditional logic surrounding each part of the codebase to be edited - is the new condition of `{I,J,K}_source` all being empty a replacement, an additional constraint, or an alternative?
|
1.0
|
Update to `iteratefdtd_matrix`: Main Loop Alterations - Subtask of #208 |
### Main Loop Alterations
These changes are rather self contained, and do not require any major reworking of the current test framework or how `tdms` behaves on the command line. _Where_ in the codebase these changes occur will be in either `execute_simulation` or `simulation.execute`, depending on post or pre #215.
- [ ] Whenever `{I,J,K}_source` is non-empty, the corresponding field update lines in the main loop need to be executed. These are a subsection of lines [2121 through 2231](https://github.com/UCL/TDMS/blob/384a91ab8d0b24bc2d48233d710d2f4e4cf14f64/tdms/src/simulation_manager/execute_simulation.cpp#L2121-L2231).
- [ ] If all `{I,J,K}_source` are non-empty, then outputs should not be normalised. The functions listed in the issue have been superseded by class methods, with the lines not to be executed being [here](https://github.com/UCL/TDMS/blob/1924201bdad21465f6aa05677b4294eed7993832/tdms/src/iterator.cpp#L4222-L4242) (or in the `post_loop_processing()` method if [PR215](https://github.com/UCL/TDMS/pull/215) is merged).
#### To clarify:
- There is already conditional logic surrounding each part of the codebase to be edited - is the new condition of `{I,J,K}_source` all being empty a replacement, an additional constraint, or an alternative?
|
non_process
|
update to iteratefdtd matrix main loop alterations subtask of main loop alterations these changes are rather self contained and do not require any major reworking of the current test framework or how tdms behaves on the command line where in the codebase these changes occur will be in either execute simulation or simulation execute depending on post or pre whenever i j k source is non empty the corresponding field update lines in the main loop need to be executed these are a subsection of lines if all i j k source are non empty then outputs should not be normalised the functions listed in the issue have been superseded by class methods with the lines not to be executed being or in the post loop processing method if is merged to clarify there is already conditional logic surrounding each part of the codebase to be edited is the new condition of i j k source all being empty a replacement an additional constraint or an alternative
| 0
|
790,500
| 27,827,776,125
|
IssuesEvent
|
2023-03-19 23:28:39
|
ChainSafe/Multix
|
https://api.github.com/repos/ChainSafe/Multix
|
closed
|
Make the subscription to new calls more selective
|
Priority: 🟠 P1 Type: 🫶 Enhancement
|
We should subscribe to the current multisigs/proxy rather than all.
we should unsub and subscribe again when switching.
|
1.0
|
Make the subscription to new calls more selective - We should subscribe to the current multisigs/proxy rather than all.
we should unsub and subscribe again when switching.
|
non_process
|
make the subscription to new calls more selective we should subscribe to the current multisigs proxy rather than all we should unsub and subscribe again when switching
| 0
|
240,079
| 7,800,379,508
|
IssuesEvent
|
2018-06-09 08:42:07
|
tine20/Tine-2.0-Open-Source-Groupware-and-CRM
|
https://api.github.com/repos/tine20/Tine-2.0-Open-Source-Groupware-and-CRM
|
closed
|
0008990:
update RELEASENOTES
|
Mantis Other high priority
|
**Reported by pschuele on 2 Oct 2013 09:30**
update RELEASENOTES
- php module intl is now required
- php module hash is now required
- ...
|
1.0
|
0008990:
update RELEASENOTES - **Reported by pschuele on 2 Oct 2013 09:30**
update RELEASENOTES
- php module intl is now required
- php module hash is now required
- ...
|
non_process
|
update releasenotes reported by pschuele on oct update releasenotes php module intl is now required php module hash is now required
| 0
|
78,523
| 3,510,777,981
|
IssuesEvent
|
2016-01-09 19:14:54
|
jo-source/jo-client-platform
|
https://api.github.com/repos/jo-source/jo-client-platform
|
closed
|
Allow to add custom buttons additional to custom actions on bean forms
|
auto-migrated Priority-Medium Type-Enhancement
|
```
Allow to add custom buttons additional to custom actions on bean forms
```
Original issue reported on code.google.com by `herr.gro...@gmx.de` on 28 Mar 2013 at 4:03
|
1.0
|
Allow to add custom buttons additional to custom actions on bean forms - ```
Allow to add custom buttons additional to custom actions on bean forms
```
Original issue reported on code.google.com by `herr.gro...@gmx.de` on 28 Mar 2013 at 4:03
|
non_process
|
allow to add custom buttons additional to custom actions on bean forms allow to add custom buttons additional to custom actions on bean forms original issue reported on code google com by herr gro gmx de on mar at
| 0
|
10,434
| 8,949,502,957
|
IssuesEvent
|
2019-01-25 07:52:04
|
Azure/azure-service-bus-node
|
https://api.github.com/repos/Azure/azure-service-bus-node
|
closed
|
API Pruning for Preview 1
|
Client Service Bus
|
Reviewing our current set of APIs for Preview 1, below are some changes that I am proposing we look at:
- **`maxConcurrentCalls` in the streaming receiver & `maxConcurrentCallsPerSession` in the session streaming receiver are options exposed to the user, but dont work as documented**
- Since we are not sure how this is supposed to work, there are no tests to cover this feature.
- We can keep the default values for now and not expose this option to the user.
- Once we understand the feature and there is user ask for the same, we can expose this to the users.
- **Promise returned by Send and SendBatch shouldn't resolve to any object**
- These promises currently resolve to a `Delivery` object which is an internal implementation of the object that was sent. The user has no use for it and so should not be returned.
- **listMessageSessions has skip as the first parameter making it mandatory**
- Remove the skip parameter. It can be added as an optional parameter later if there is user feedback.
- The only reason this exists is because the underlying api allows you to skip. .Net SDK hasnt implemented this feature, so its not like there has been existing user feedback to have this feature.
- **SessionManager**
- We don't have tests for the various options the session manager takes as input.
- Python has a simple sample for this and doesnt have this as a built-in feature.
- I would suggest to pull back on this and not ship this in Preview 1. If we can simplify this and share it as a sample instead of a feature, then that would be 1 less complicated feature to support.
- Else, understand all its options, have test cases and then ship.
- **acceptSession()**
- The name acceptSession() is not intuitive to receive an object that is then used to receive messages from sessions. Suggested name: `getSessionClient`
- **Input options to acceptSession(), session streaming receiver and session manager**
- Clean up needed. There is all kinds of overlap resulting in user getting exposed to options that they dont need
- Streaming Receiver takes in parameter `maxConcurrentSessions` & `maxMessageWaitTimeoutInSeconds` which only makes sense when it is called from Session Manager and not when called by user
- Streaming Receiver takes in `receiveMode` which is redundant as the same is passed to `acceptSesion()`
- `acceptSession()` takes in `autoComplete` and `maxAutoRenewDurationInSeconds` which it has no need for. Both are needed for streaming receiver and Session Manager.
- **Options that have the words "inSeconds"**
- **Sugggestion**: Strip the word "inSeconds", ensure documentation captures the fact that it is in seconds. Ensure we deal with seconds in all user facing APIs
- The name of these options are not self-explainatory. Removing "inSeconds" gives us some extra letters to have more meaningful names.
- **ReceiveBatch: Consider removing the third parameter**
- This third parameter is the time for which we would wait for a new message after receiving the current message
- The second parameter is for total time to wait to get the n messages asked for.
- The third parameter is complicating what would be a simple api.
- .Net SDK doesnt hav this feature, meaning there was no strong user ask for the same.
- If we do get user feedback about this, then we can always add it back
This one we might want to take up for Preview 2
- **The handler returned by `receive()` function whose sole purpose is to stop receiving messages**
- **Sugggestion**: Dont return any handler.
- None of the other SDKs have this model of using a handler to stop receiving messages. Instead, the
receiver client is expected to be closed by the user when they want to stop receiving messages
- Our streaming receiver for sessions doesnt have return such a handler either. The messageSession is expected to be closed by the user when they want to stop receiving messages
- Having a timeout instead of the "stop" behavior would be much simpler
|
1.0
|
API Pruning for Preview 1 - Reviewing our current set of APIs for Preview 1, below are some changes that I am proposing we look at:
- **`maxConcurrentCalls` in the streaming receiver & `maxConcurrentCallsPerSession` in the session streaming receiver are options exposed to the user, but dont work as documented**
- Since we are not sure how this is supposed to work, there are no tests to cover this feature.
- We can keep the default values for now and not expose this option to the user.
- Once we understand the feature and there is user ask for the same, we can expose this to the users.
- **Promise returned by Send and SendBatch shouldn't resolve to any object**
- These promises currently resolve to a `Delivery` object which is an internal implementation of the object that was sent. The user has no use for it and so should not be returned.
- **listMessageSessions has skip as the first parameter making it mandatory**
- Remove the skip parameter. It can be added as an optional parameter later if there is user feedback.
- The only reason this exists is because the underlying api allows you to skip. .Net SDK hasnt implemented this feature, so its not like there has been existing user feedback to have this feature.
- **SessionManager**
- We don't have tests for the various options the session manager takes as input.
- Python has a simple sample for this and doesnt have this as a built-in feature.
- I would suggest to pull back on this and not ship this in Preview 1. If we can simplify this and share it as a sample instead of a feature, then that would be 1 less complicated feature to support.
- Else, understand all its options, have test cases and then ship.
- **acceptSession()**
- The name acceptSession() is not intuitive to receive an object that is then used to receive messages from sessions. Suggested name: `getSessionClient`
- **Input options to acceptSession(), session streaming receiver and session manager**
- Clean up needed. There is all kinds of overlap resulting in user getting exposed to options that they dont need
- Streaming Receiver takes in parameter `maxConcurrentSessions` & `maxMessageWaitTimeoutInSeconds` which only makes sense when it is called from Session Manager and not when called by user
- Streaming Receiver takes in `receiveMode` which is redundant as the same is passed to `acceptSesion()`
- `acceptSession()` takes in `autoComplete` and `maxAutoRenewDurationInSeconds` which it has no need for. Both are needed for streaming receiver and Session Manager.
- **Options that have the words "inSeconds"**
- **Sugggestion**: Strip the word "inSeconds", ensure documentation captures the fact that it is in seconds. Ensure we deal with seconds in all user facing APIs
- The name of these options are not self-explainatory. Removing "inSeconds" gives us some extra letters to have more meaningful names.
- **ReceiveBatch: Consider removing the third parameter**
- This third parameter is the time for which we would wait for a new message after receiving the current message
- The second parameter is for total time to wait to get the n messages asked for.
- The third parameter is complicating what would be a simple api.
- .Net SDK doesnt hav this feature, meaning there was no strong user ask for the same.
- If we do get user feedback about this, then we can always add it back
This one we might want to take up for Preview 2
- **The handler returned by `receive()` function whose sole purpose is to stop receiving messages**
- **Sugggestion**: Dont return any handler.
- None of the other SDKs have this model of using a handler to stop receiving messages. Instead, the
receiver client is expected to be closed by the user when they want to stop receiving messages
- Our streaming receiver for sessions doesnt have return such a handler either. The messageSession is expected to be closed by the user when they want to stop receiving messages
- Having a timeout instead of the "stop" behavior would be much simpler
|
non_process
|
api pruning for preview reviewing our current set of apis for preview below are some changes that i am proposing we look at maxconcurrentcalls in the streaming receiver maxconcurrentcallspersession in the session streaming receiver are options exposed to the user but dont work as documented since we are not sure how this is supposed to work there are no tests to cover this feature we can keep the default values for now and not expose this option to the user once we understand the feature and there is user ask for the same we can expose this to the users promise returned by send and sendbatch shouldn t resolve to any object these promises currently resolve to a delivery object which is an internal implementation of the object that was sent the user has no use for it and so should not be returned listmessagesessions has skip as the first parameter making it mandatory remove the skip parameter it can be added as an optional parameter later if there is user feedback the only reason this exists is because the underlying api allows you to skip net sdk hasnt implemented this feature so its not like there has been existing user feedback to have this feature sessionmanager we don t have tests for the various options the session manager takes as input python has a simple sample for this and doesnt have this as a built in feature i would suggest to pull back on this and not ship this in preview if we can simplify this and share it as a sample instead of a feature then that would be less complicated feature to support else understand all its options have test cases and then ship acceptsession the name acceptsession is not intuitive to receive an object that is then used to receive messages from sessions suggested name getsessionclient input options to acceptsession session streaming receiver and session manager clean up needed there is all kinds of overlap resulting in user getting exposed to options that they dont need streaming receiver takes in parameter maxconcurrentsessions maxmessagewaittimeoutinseconds which only makes sense when it is called from session manager and not when called by user streaming receiver takes in receivemode which is redundant as the same is passed to acceptsesion acceptsession takes in autocomplete and maxautorenewdurationinseconds which it has no need for both are needed for streaming receiver and session manager options that have the words inseconds sugggestion strip the word inseconds ensure documentation captures the fact that it is in seconds ensure we deal with seconds in all user facing apis the name of these options are not self explainatory removing inseconds gives us some extra letters to have more meaningful names receivebatch consider removing the third parameter this third parameter is the time for which we would wait for a new message after receiving the current message the second parameter is for total time to wait to get the n messages asked for the third parameter is complicating what would be a simple api net sdk doesnt hav this feature meaning there was no strong user ask for the same if we do get user feedback about this then we can always add it back this one we might want to take up for preview the handler returned by receive function whose sole purpose is to stop receiving messages sugggestion dont return any handler none of the other sdks have this model of using a handler to stop receiving messages instead the receiver client is expected to be closed by the user when they want to stop receiving messages our streaming receiver for sessions doesnt have return such a handler either the messagesession is expected to be closed by the user when they want to stop receiving messages having a timeout instead of the stop behavior would be much simpler
| 0
|
47,086
| 6,041,911,890
|
IssuesEvent
|
2017-06-11 07:18:36
|
nextcloud/server
|
https://api.github.com/repos/nextcloud/server
|
opened
|
Show 3 latest contacts/circles or groups
|
design enhancement feature: sharing
|
The sharing with user dropdown should be expanded by default and show the latest and/or often used contacts, circles and groups.
@schiessle @jancborchardt
|
1.0
|
Show 3 latest contacts/circles or groups - The sharing with user dropdown should be expanded by default and show the latest and/or often used contacts, circles and groups.
@schiessle @jancborchardt
|
non_process
|
show latest contacts circles or groups the sharing with user dropdown should be expanded by default and show the latest and or often used contacts circles and groups schiessle jancborchardt
| 0
|
7,522
| 10,597,728,000
|
IssuesEvent
|
2019-10-10 01:52:53
|
kubeflow/manifests
|
https://api.github.com/repos/kubeflow/manifests
|
closed
|
Update centraldashboard image in master
|
area/front-end kind/process priority/p0
|
The central dashboard image on kubeflow/manifests master is outdated. Its still using a v0.5.0 tag which looks like its older then the image on v0.6-branch.
|
1.0
|
Update centraldashboard image in master - The central dashboard image on kubeflow/manifests master is outdated. Its still using a v0.5.0 tag which looks like its older then the image on v0.6-branch.
|
process
|
update centraldashboard image in master the central dashboard image on kubeflow manifests master is outdated its still using a tag which looks like its older then the image on branch
| 1
|
13,187
| 15,613,086,788
|
IssuesEvent
|
2021-03-19 16:02:16
|
bridgetownrb/bridgetown
|
https://api.github.com/repos/bridgetownrb/bridgetown
|
closed
|
The Great Content Re-alignment
|
enhancement high priority process question
|
**January 2021 Update:** work has begun on this in earnest. Using the term "resource" and not "content". (Thanks @andrewmcodes!) Development of `Bridgetown::Resource::Base` and supporting classes is now underway! [Check out the "diary"…](https://github.com/bridgetownrb/bridgetown/issues/187#issuecomment-770068092)
----
Looking ahead to what I hope to accomplish for an official Bridgetown 1.0 release and beyond, I think we need to take a hard and painful look at the distinctions between `Page` & `Document` and also between the `posts` collection and other collections.
➡️ Bridgetown's heritage comes from Jekyll and Jekyll comes from the idea that you have blog posts and you have standalone pages (home page, about page, etc.), and anything else is a "static file". Then the concept of collections emerged, with posts being a kind of builtin collection, but posts behave differently in some respects compared to other collections and are read off the filesystem completely differently.
➡️ There's also confusion around how permalinks work and how to configure them, because the top-level `permalink` config value affects both pages and blog pots, but there's also a `permalink` config possible at the collection level, plus you can add `permalink` configs via front-matter defaults which could affect anything potentially, so it's clear as mud.
➡️ At the code level, I've done what I can using mixins/concerns to get the `Page` and `Document` classes to act more alike and work similarly in various respects using duck typing, but it can still be frustrating. `Layout` is yet another similar-but-not-really sort of enigma.
➡️ There's also the question of when to use data files and when to use collections, and in fact you can add YAML files in a collection folder and they are treated as documents with front matter and a blank content field! 🤪 Also wacky, until a recent bug fix, static files saved within collection folders were processed as "collection documents" even though they were the `StaticFile` class and were missing from the site's overall static files array.
➡️ Another problem is currently categories and tags are post-specific. If you add categories and/or tags to other collections, or pages for that matter, they're invisible from any typical searching/filtering of categories/tags.
➡️ Yet another obscure problem is you currently don't have any control over the order in which content is processed on a file-by-file basis, so you can occasionally run into issues where File A is trying to display content from File B, C, D, etc. but the content for those files haven't actually been processed, so File A shows the _raw markup/template_ string instead of the processed content. Oops! It's a non-trivial problem, because you could potentially run into circular dependencies. File A displays content from File B, but File B wants to display content from File A. Yikes. That happens virtually never in a typical site design, but you never know.
➡️ But wait, there's more! Right now there's no concrete way to determine the "source" of a particular file/piece of content if it came from an API/headless CMS—you only know if it came from an actual file on the filesystem, otherwise it's just "virtual". In addition, after it gets rendered at a particular URL, you can't backtrack—in other words, you can't determine that /a/b/c corresponds to this one object and, say, re-render that particular object.
❓ (There's also the outstanding question of how all this relates to `ActiveModel` objects that can be used to load/validate/save content in a Rails CMS-context—a project I have underway—but I think I'll save that for a future issue.)
❤️ All that to say…I'll always love Jekyll to pieces, but its content modeling situation is kind of screwball and it's time for us to fix this in Bridgetown once and for all so we have a sane platform to build on for the next ten years.
----
So, how do we fix this? 😂
I propose creating a new namespace under `Bridgetown` called `Bridgetown::Content`. Inside we'd define several classes:
* `Bridgetown::Content::Base` — this represents a single piece of content. This is any kind of content that isn't simply a "static file" like an image or PDF. So that means page, blog post, collection document, YAML/JSON/CSV/etc. data file, whatever.
* `Bridgetown::Content::Source` — this is attached to the content object and represents where the content came from…filesystem, third-party API, generator, etc.
* `Bridgetown::Content::Destination` — this is attached to the content object and represents the URL/filepath where the content will be generated.
* `Bridgetown::Content::Transformer` — this is an auxiliary object that is responsible for transforming the object data from raw input to final converted output
* `Bridgetown::Content::Dependencies` — this would determine the dependencies required for each piece of content and use that to facilitate both the correct order of processing and also to cache in the future so a piece of content could be quickly rerendered along with just its dependencies. There'd be some default heuristics along these lines but you could manually specify dependencies on a per-object basis. (Like a product template could specifically require "products" to be a dependency and maybe just the products in its own category.)
* `Bridgetown::Content::Taxonomy` — this would represent a particular way to classify a content item. A category would be a Taxonomy of type "category", a tag would be a Taxonomy of type "tag", etc. Site owners could easily configure any sort of Taxonomy. Looking at Hugo for example, it comes out of the box configured like so:
```yaml
taxonomies:
category: categories
tag: tags
```
but you could adjust that however you like.
* `Bridgetown::Content::Relations` — this is how a content object could be thought of as "related" to another type of object…parent-child relationships, belongs-to/has-many, etc. So you could have `author: janedoe` in a post's frontmatter and then maybe `post.relations.author` would automatically resolve to the content object for `janedoe`. The relations themselves would probably be defined in the yml where collections are currently configured.
After doing all this, we'd refactor `Bridgetown::Page` and `Bridgetown::Document` so they're just child subclasses of `Bridgetown::Content::Base`, and we'd probably add `Bridgetown::StructuredData` as well to represent a YAML/JSON/etc. data structure. In addition, we get rid of separate file readers for pages, collections, and posts, and unify everything into a single file reader. I also like the idea of letting front matter itself override directory locations, so you could potentially have everything all in a top-level folder and just add `collection: posts`, `collection: recipes`, etc. That would be dumb, but it would also be immensely flexible and eliminate any hard requirements for folders like `_posts`, `_recipes`, etc.
The special behavior of posts would be basic configuration options of a collection, so any collection could potentially behave in that manner if configured. Pages would just be collection-less documents, essentially—or alternatively, create a `pages` or `default` or `unfiled` collection and use that.
I'd also like to make sure we get good-quality content graphs out of all this so menus, breadcrumbs, etc. would be a piece of cake once the collections/taxonomies/relations are properly configured. (Again, Hugo leads the way on this stuff!)
In terms of ecosystem impact, my hope is that after doing all this, most existing sites would work "as is" from the user's perspective, and any external Bridgetown plugins would only need slight tweaks to work with the new Page/Document classes…not entirely backwards-compatible unfortunately, but since we're still pre-1.0, the time for breaking changes is really now if ever. Once we do this, we break free from Jekyll's gravitational pull and get to define the future of Bridgetown on our terms. Very exciting!
Please note all the above class names are purely theoretical at this point and subject to deliberation and further brainstorming, so please let me know what you think and if I'm missing any important aspects of quality content modeling. We shouldn't shy away from looking at how other CMSes and site generators do this stuff and aim for providing as much power and flexibility as we can right out-of-the-box.
|
1.0
|
The Great Content Re-alignment - **January 2021 Update:** work has begun on this in earnest. Using the term "resource" and not "content". (Thanks @andrewmcodes!) Development of `Bridgetown::Resource::Base` and supporting classes is now underway! [Check out the "diary"…](https://github.com/bridgetownrb/bridgetown/issues/187#issuecomment-770068092)
----
Looking ahead to what I hope to accomplish for an official Bridgetown 1.0 release and beyond, I think we need to take a hard and painful look at the distinctions between `Page` & `Document` and also between the `posts` collection and other collections.
➡️ Bridgetown's heritage comes from Jekyll and Jekyll comes from the idea that you have blog posts and you have standalone pages (home page, about page, etc.), and anything else is a "static file". Then the concept of collections emerged, with posts being a kind of builtin collection, but posts behave differently in some respects compared to other collections and are read off the filesystem completely differently.
➡️ There's also confusion around how permalinks work and how to configure them, because the top-level `permalink` config value affects both pages and blog pots, but there's also a `permalink` config possible at the collection level, plus you can add `permalink` configs via front-matter defaults which could affect anything potentially, so it's clear as mud.
➡️ At the code level, I've done what I can using mixins/concerns to get the `Page` and `Document` classes to act more alike and work similarly in various respects using duck typing, but it can still be frustrating. `Layout` is yet another similar-but-not-really sort of enigma.
➡️ There's also the question of when to use data files and when to use collections, and in fact you can add YAML files in a collection folder and they are treated as documents with front matter and a blank content field! 🤪 Also wacky, until a recent bug fix, static files saved within collection folders were processed as "collection documents" even though they were the `StaticFile` class and were missing from the site's overall static files array.
➡️ Another problem is currently categories and tags are post-specific. If you add categories and/or tags to other collections, or pages for that matter, they're invisible from any typical searching/filtering of categories/tags.
➡️ Yet another obscure problem is you currently don't have any control over the order in which content is processed on a file-by-file basis, so you can occasionally run into issues where File A is trying to display content from File B, C, D, etc. but the content for those files haven't actually been processed, so File A shows the _raw markup/template_ string instead of the processed content. Oops! It's a non-trivial problem, because you could potentially run into circular dependencies. File A displays content from File B, but File B wants to display content from File A. Yikes. That happens virtually never in a typical site design, but you never know.
➡️ But wait, there's more! Right now there's no concrete way to determine the "source" of a particular file/piece of content if it came from an API/headless CMS—you only know if it came from an actual file on the filesystem, otherwise it's just "virtual". In addition, after it gets rendered at a particular URL, you can't backtrack—in other words, you can't determine that /a/b/c corresponds to this one object and, say, re-render that particular object.
❓ (There's also the outstanding question of how all this relates to `ActiveModel` objects that can be used to load/validate/save content in a Rails CMS-context—a project I have underway—but I think I'll save that for a future issue.)
❤️ All that to say…I'll always love Jekyll to pieces, but its content modeling situation is kind of screwball and it's time for us to fix this in Bridgetown once and for all so we have a sane platform to build on for the next ten years.
----
So, how do we fix this? 😂
I propose creating a new namespace under `Bridgetown` called `Bridgetown::Content`. Inside we'd define several classes:
* `Bridgetown::Content::Base` — this represents a single piece of content. This is any kind of content that isn't simply a "static file" like an image or PDF. So that means page, blog post, collection document, YAML/JSON/CSV/etc. data file, whatever.
* `Bridgetown::Content::Source` — this is attached to the content object and represents where the content came from…filesystem, third-party API, generator, etc.
* `Bridgetown::Content::Destination` — this is attached to the content object and represents the URL/filepath where the content will be generated.
* `Bridgetown::Content::Transformer` — this is an auxiliary object that is responsible for transforming the object data from raw input to final converted output
* `Bridgetown::Content::Dependencies` — this would determine the dependencies required for each piece of content and use that to facilitate both the correct order of processing and also to cache in the future so a piece of content could be quickly rerendered along with just its dependencies. There'd be some default heuristics along these lines but you could manually specify dependencies on a per-object basis. (Like a product template could specifically require "products" to be a dependency and maybe just the products in its own category.)
* `Bridgetown::Content::Taxonomy` — this would represent a particular way to classify a content item. A category would be a Taxonomy of type "category", a tag would be a Taxonomy of type "tag", etc. Site owners could easily configure any sort of Taxonomy. Looking at Hugo for example, it comes out of the box configured like so:
```yaml
taxonomies:
category: categories
tag: tags
```
but you could adjust that however you like.
* `Bridgetown::Content::Relations` — this is how a content object could be thought of as "related" to another type of object…parent-child relationships, belongs-to/has-many, etc. So you could have `author: janedoe` in a post's frontmatter and then maybe `post.relations.author` would automatically resolve to the content object for `janedoe`. The relations themselves would probably be defined in the yml where collections are currently configured.
After doing all this, we'd refactor `Bridgetown::Page` and `Bridgetown::Document` so they're just child subclasses of `Bridgetown::Content::Base`, and we'd probably add `Bridgetown::StructuredData` as well to represent a YAML/JSON/etc. data structure. In addition, we get rid of separate file readers for pages, collections, and posts, and unify everything into a single file reader. I also like the idea of letting front matter itself override directory locations, so you could potentially have everything all in a top-level folder and just add `collection: posts`, `collection: recipes`, etc. That would be dumb, but it would also be immensely flexible and eliminate any hard requirements for folders like `_posts`, `_recipes`, etc.
The special behavior of posts would be basic configuration options of a collection, so any collection could potentially behave in that manner if configured. Pages would just be collection-less documents, essentially—or alternatively, create a `pages` or `default` or `unfiled` collection and use that.
I'd also like to make sure we get good-quality content graphs out of all this so menus, breadcrumbs, etc. would be a piece of cake once the collections/taxonomies/relations are properly configured. (Again, Hugo leads the way on this stuff!)
In terms of ecosystem impact, my hope is that after doing all this, most existing sites would work "as is" from the user's perspective, and any external Bridgetown plugins would only need slight tweaks to work with the new Page/Document classes…not entirely backwards-compatible unfortunately, but since we're still pre-1.0, the time for breaking changes is really now if ever. Once we do this, we break free from Jekyll's gravitational pull and get to define the future of Bridgetown on our terms. Very exciting!
Please note all the above class names are purely theoretical at this point and subject to deliberation and further brainstorming, so please let me know what you think and if I'm missing any important aspects of quality content modeling. We shouldn't shy away from looking at how other CMSes and site generators do this stuff and aim for providing as much power and flexibility as we can right out-of-the-box.
|
process
|
the great content re alignment january update work has begun on this in earnest using the term resource and not content thanks andrewmcodes development of bridgetown resource base and supporting classes is now underway looking ahead to what i hope to accomplish for an official bridgetown release and beyond i think we need to take a hard and painful look at the distinctions between page document and also between the posts collection and other collections ➡️ bridgetown s heritage comes from jekyll and jekyll comes from the idea that you have blog posts and you have standalone pages home page about page etc and anything else is a static file then the concept of collections emerged with posts being a kind of builtin collection but posts behave differently in some respects compared to other collections and are read off the filesystem completely differently ➡️ there s also confusion around how permalinks work and how to configure them because the top level permalink config value affects both pages and blog pots but there s also a permalink config possible at the collection level plus you can add permalink configs via front matter defaults which could affect anything potentially so it s clear as mud ➡️ at the code level i ve done what i can using mixins concerns to get the page and document classes to act more alike and work similarly in various respects using duck typing but it can still be frustrating layout is yet another similar but not really sort of enigma ➡️ there s also the question of when to use data files and when to use collections and in fact you can add yaml files in a collection folder and they are treated as documents with front matter and a blank content field 🤪 also wacky until a recent bug fix static files saved within collection folders were processed as collection documents even though they were the staticfile class and were missing from the site s overall static files array ➡️ another problem is currently categories and tags are post specific if you add categories and or tags to other collections or pages for that matter they re invisible from any typical searching filtering of categories tags ➡️ yet another obscure problem is you currently don t have any control over the order in which content is processed on a file by file basis so you can occasionally run into issues where file a is trying to display content from file b c d etc but the content for those files haven t actually been processed so file a shows the raw markup template string instead of the processed content oops it s a non trivial problem because you could potentially run into circular dependencies file a displays content from file b but file b wants to display content from file a yikes that happens virtually never in a typical site design but you never know ➡️ but wait there s more right now there s no concrete way to determine the source of a particular file piece of content if it came from an api headless cms—you only know if it came from an actual file on the filesystem otherwise it s just virtual in addition after it gets rendered at a particular url you can t backtrack—in other words you can t determine that a b c corresponds to this one object and say re render that particular object ❓ there s also the outstanding question of how all this relates to activemodel objects that can be used to load validate save content in a rails cms context—a project i have underway—but i think i ll save that for a future issue ❤️ all that to say…i ll always love jekyll to pieces but its content modeling situation is kind of screwball and it s time for us to fix this in bridgetown once and for all so we have a sane platform to build on for the next ten years so how do we fix this 😂 i propose creating a new namespace under bridgetown called bridgetown content inside we d define several classes bridgetown content base — this represents a single piece of content this is any kind of content that isn t simply a static file like an image or pdf so that means page blog post collection document yaml json csv etc data file whatever bridgetown content source — this is attached to the content object and represents where the content came from…filesystem third party api generator etc bridgetown content destination — this is attached to the content object and represents the url filepath where the content will be generated bridgetown content transformer — this is an auxiliary object that is responsible for transforming the object data from raw input to final converted output bridgetown content dependencies — this would determine the dependencies required for each piece of content and use that to facilitate both the correct order of processing and also to cache in the future so a piece of content could be quickly rerendered along with just its dependencies there d be some default heuristics along these lines but you could manually specify dependencies on a per object basis like a product template could specifically require products to be a dependency and maybe just the products in its own category bridgetown content taxonomy — this would represent a particular way to classify a content item a category would be a taxonomy of type category a tag would be a taxonomy of type tag etc site owners could easily configure any sort of taxonomy looking at hugo for example it comes out of the box configured like so yaml taxonomies category categories tag tags but you could adjust that however you like bridgetown content relations — this is how a content object could be thought of as related to another type of object…parent child relationships belongs to has many etc so you could have author janedoe in a post s frontmatter and then maybe post relations author would automatically resolve to the content object for janedoe the relations themselves would probably be defined in the yml where collections are currently configured after doing all this we d refactor bridgetown page and bridgetown document so they re just child subclasses of bridgetown content base and we d probably add bridgetown structureddata as well to represent a yaml json etc data structure in addition we get rid of separate file readers for pages collections and posts and unify everything into a single file reader i also like the idea of letting front matter itself override directory locations so you could potentially have everything all in a top level folder and just add collection posts collection recipes etc that would be dumb but it would also be immensely flexible and eliminate any hard requirements for folders like posts recipes etc the special behavior of posts would be basic configuration options of a collection so any collection could potentially behave in that manner if configured pages would just be collection less documents essentially—or alternatively create a pages or default or unfiled collection and use that i d also like to make sure we get good quality content graphs out of all this so menus breadcrumbs etc would be a piece of cake once the collections taxonomies relations are properly configured again hugo leads the way on this stuff in terms of ecosystem impact my hope is that after doing all this most existing sites would work as is from the user s perspective and any external bridgetown plugins would only need slight tweaks to work with the new page document classes…not entirely backwards compatible unfortunately but since we re still pre the time for breaking changes is really now if ever once we do this we break free from jekyll s gravitational pull and get to define the future of bridgetown on our terms very exciting please note all the above class names are purely theoretical at this point and subject to deliberation and further brainstorming so please let me know what you think and if i m missing any important aspects of quality content modeling we shouldn t shy away from looking at how other cmses and site generators do this stuff and aim for providing as much power and flexibility as we can right out of the box
| 1
|
26,869
| 2,685,706,678
|
IssuesEvent
|
2015-03-30 05:04:14
|
cs2103jan2015-t09-4j/main
|
https://api.github.com/repos/cs2103jan2015-t09-4j/main
|
closed
|
undo/redo: user can undo and redo commands a couple of times.
|
Hard priority.high
|
so that I would not need to suffer the consequence of mistyping
|
1.0
|
undo/redo: user can undo and redo commands a couple of times. - so that I would not need to suffer the consequence of mistyping
|
non_process
|
undo redo user can undo and redo commands a couple of times so that i would not need to suffer the consequence of mistyping
| 0
|
244,936
| 18,769,563,986
|
IssuesEvent
|
2021-11-06 15:32:22
|
dkagramanyan/WC-Co_computer_vision
|
https://api.github.com/repos/dkagramanyan/WC-Co_computer_vision
|
closed
|
Отчетность 8.11-12.11
|
documentation
|
https://miem.hse.ru/project_office/announcements/519270360.html
Дедлайн загрузки файлов - 4 ноября включительно. После дедлайна - всем боньк по голове. Наработки прошлого года находятся на гугл диске.
Требуется:
- [x] написать сценарий ролика
- [x] написать сценарий презентации
- [x] записать ролик
- [x] сделать презентацию
|
1.0
|
Отчетность 8.11-12.11 - https://miem.hse.ru/project_office/announcements/519270360.html
Дедлайн загрузки файлов - 4 ноября включительно. После дедлайна - всем боньк по голове. Наработки прошлого года находятся на гугл диске.
Требуется:
- [x] написать сценарий ролика
- [x] написать сценарий презентации
- [x] записать ролик
- [x] сделать презентацию
|
non_process
|
отчетность дедлайн загрузки файлов ноября включительно после дедлайна всем боньк по голове наработки прошлого года находятся на гугл диске требуется написать сценарий ролика написать сценарий презентации записать ролик сделать презентацию
| 0
|
20,574
| 6,900,272,276
|
IssuesEvent
|
2017-11-24 17:32:34
|
craigbarnes/dte
|
https://api.github.com/repos/craigbarnes/dte
|
closed
|
Generate PDF manual as a single file
|
build-system documentation
|
The current [PDF manual](https://github.com/craigbarnes/dte#documentation) is split into 3 separate files in the same way as the man pages, but by convention, PDF manuals are almost always distributed as large, single documents containing everything.
The `dte` PDF manual should follow this convention and should be a concatenation of all 3 man pages. This might require some extra post-processing and/or adjustments to [`docs/ttman.c`](https://github.com/craigbarnes/dte/blob/master/docs/ttman.c).
|
1.0
|
Generate PDF manual as a single file - The current [PDF manual](https://github.com/craigbarnes/dte#documentation) is split into 3 separate files in the same way as the man pages, but by convention, PDF manuals are almost always distributed as large, single documents containing everything.
The `dte` PDF manual should follow this convention and should be a concatenation of all 3 man pages. This might require some extra post-processing and/or adjustments to [`docs/ttman.c`](https://github.com/craigbarnes/dte/blob/master/docs/ttman.c).
|
non_process
|
generate pdf manual as a single file the current is split into separate files in the same way as the man pages but by convention pdf manuals are almost always distributed as large single documents containing everything the dte pdf manual should follow this convention and should be a concatenation of all man pages this might require some extra post processing and or adjustments to
| 0
|
9,315
| 12,335,406,072
|
IssuesEvent
|
2020-05-14 11:55:46
|
tikv/tikv
|
https://api.github.com/repos/tikv/tikv
|
closed
|
UCP: Migrate scalar function `Password` from TiDB
|
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
|
## Description
Port the scalar function `Password` from TiDB to coprocessor.
## Score
* 50
## Mentor(s)
* @andylokandy
## Recommended Skills
* Rust programming
## Learning Materials
Already implemented expressions ported from TiDB
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr)
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
|
2.0
|
UCP: Migrate scalar function `Password` from TiDB -
## Description
Port the scalar function `Password` from TiDB to coprocessor.
## Score
* 50
## Mentor(s)
* @andylokandy
## Recommended Skills
* Rust programming
## Learning Materials
Already implemented expressions ported from TiDB
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr)
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
|
process
|
ucp migrate scalar function password from tidb description port the scalar function password from tidb to coprocessor score mentor s andylokandy recommended skills rust programming learning materials already implemented expressions ported from tidb
| 1
|
91,378
| 8,303,877,541
|
IssuesEvent
|
2018-09-21 19:05:46
|
tschottdorf/cockroach
|
https://api.github.com/repos/tschottdorf/cockroach
|
closed
|
teamcity: failed test: TestImportPgDump
|
C-test-failure O-robot
|
The following tests appear to have failed on release-banana.
You may want to check [for open issues](https://github.com/cockroachdb/cockroach/issues?q=is%3Aissue+is%3Aopen+TestImportPgDump).
[#864629](https://teamcity.cockroachdb.com/viewLog.html?buildId=864629):
```
```
Please assign, take a look and update the issue accordingly.
|
1.0
|
teamcity: failed test: TestImportPgDump - The following tests appear to have failed on release-banana.
You may want to check [for open issues](https://github.com/cockroachdb/cockroach/issues?q=is%3Aissue+is%3Aopen+TestImportPgDump).
[#864629](https://teamcity.cockroachdb.com/viewLog.html?buildId=864629):
```
```
Please assign, take a look and update the issue accordingly.
|
non_process
|
teamcity failed test testimportpgdump the following tests appear to have failed on release banana you may want to check please assign take a look and update the issue accordingly
| 0
|
121,838
| 17,662,819,763
|
IssuesEvent
|
2021-08-21 21:37:54
|
ghc-dev/David-Jones
|
https://api.github.com/repos/ghc-dev/David-Jones
|
opened
|
CVE-2020-10744 (Medium) detected in ansible-2.9.9.tar.gz
|
security vulnerability
|
## CVE-2020-10744 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ansible-2.9.9.tar.gz</b></p></summary>
<p>Radically simple IT automation</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/00/5d/e10b83e0e6056dbd5b4809b451a191395175a57e3175ce04e35d9c5fc2a0/ansible-2.9.9.tar.gz">https://files.pythonhosted.org/packages/00/5d/e10b83e0e6056dbd5b4809b451a191395175a57e3175ce04e35d9c5fc2a0/ansible-2.9.9.tar.gz</a></p>
<p>Path to dependency file: David-Jones/requirements.txt</p>
<p>Path to vulnerable library: /requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **ansible-2.9.9.tar.gz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/ghc-dev/David-Jones/commit/b878066a2e3b45b24b6dd8269681b51c8ba2112b">b878066a2e3b45b24b6dd8269681b51c8ba2112b</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An incomplete fix was found for the fix of the flaw CVE-2020-1733 ansible: insecure temporary directory when running become_user from become directive. The provided fix is insufficient to prevent the race condition on systems using ACLs and FUSE filesystems. Ansible Engine 2.7.18, 2.8.12, and 2.9.9 as well as previous versions are affected and Ansible Tower 3.4.5, 3.5.6 and 3.6.4 as well as previous versions are affected.
<p>Publish Date: 2020-05-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-10744>CVE-2020-10744</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Python","packageName":"ansible","packageVersion":"2.9.9","packageFilePaths":["/requirements.txt"],"isTransitiveDependency":false,"dependencyTree":"ansible:2.9.9","isMinimumFixVersionAvailable":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-10744","vulnerabilityDetails":"An incomplete fix was found for the fix of the flaw CVE-2020-1733 ansible: insecure temporary directory when running become_user from become directive. The provided fix is insufficient to prevent the race condition on systems using ACLs and FUSE filesystems. Ansible Engine 2.7.18, 2.8.12, and 2.9.9 as well as previous versions are affected and Ansible Tower 3.4.5, 3.5.6 and 3.6.4 as well as previous versions are affected.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-10744","cvss3Severity":"medium","cvss3Score":"5.0","cvss3Metrics":{"A":"Low","AC":"High","PR":"Low","S":"Changed","C":"Low","UI":"Required","AV":"Local","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2020-10744 (Medium) detected in ansible-2.9.9.tar.gz - ## CVE-2020-10744 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ansible-2.9.9.tar.gz</b></p></summary>
<p>Radically simple IT automation</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/00/5d/e10b83e0e6056dbd5b4809b451a191395175a57e3175ce04e35d9c5fc2a0/ansible-2.9.9.tar.gz">https://files.pythonhosted.org/packages/00/5d/e10b83e0e6056dbd5b4809b451a191395175a57e3175ce04e35d9c5fc2a0/ansible-2.9.9.tar.gz</a></p>
<p>Path to dependency file: David-Jones/requirements.txt</p>
<p>Path to vulnerable library: /requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **ansible-2.9.9.tar.gz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/ghc-dev/David-Jones/commit/b878066a2e3b45b24b6dd8269681b51c8ba2112b">b878066a2e3b45b24b6dd8269681b51c8ba2112b</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An incomplete fix was found for the fix of the flaw CVE-2020-1733 ansible: insecure temporary directory when running become_user from become directive. The provided fix is insufficient to prevent the race condition on systems using ACLs and FUSE filesystems. Ansible Engine 2.7.18, 2.8.12, and 2.9.9 as well as previous versions are affected and Ansible Tower 3.4.5, 3.5.6 and 3.6.4 as well as previous versions are affected.
<p>Publish Date: 2020-05-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-10744>CVE-2020-10744</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Python","packageName":"ansible","packageVersion":"2.9.9","packageFilePaths":["/requirements.txt"],"isTransitiveDependency":false,"dependencyTree":"ansible:2.9.9","isMinimumFixVersionAvailable":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-10744","vulnerabilityDetails":"An incomplete fix was found for the fix of the flaw CVE-2020-1733 ansible: insecure temporary directory when running become_user from become directive. The provided fix is insufficient to prevent the race condition on systems using ACLs and FUSE filesystems. Ansible Engine 2.7.18, 2.8.12, and 2.9.9 as well as previous versions are affected and Ansible Tower 3.4.5, 3.5.6 and 3.6.4 as well as previous versions are affected.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-10744","cvss3Severity":"medium","cvss3Score":"5.0","cvss3Metrics":{"A":"Low","AC":"High","PR":"Low","S":"Changed","C":"Low","UI":"Required","AV":"Local","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve medium detected in ansible tar gz cve medium severity vulnerability vulnerable library ansible tar gz radically simple it automation library home page a href path to dependency file david jones requirements txt path to vulnerable library requirements txt dependency hierarchy x ansible tar gz vulnerable library found in head commit a href found in base branch master vulnerability details an incomplete fix was found for the fix of the flaw cve ansible insecure temporary directory when running become user from become directive the provided fix is insufficient to prevent the race condition on systems using acls and fuse filesystems ansible engine and as well as previous versions are affected and ansible tower and as well as previous versions are affected publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity high privileges required low user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact low for more information on scores click a href isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree ansible isminimumfixversionavailable false basebranches vulnerabilityidentifier cve vulnerabilitydetails an incomplete fix was found for the fix of the flaw cve ansible insecure temporary directory when running become user from become directive the provided fix is insufficient to prevent the race condition on systems using acls and fuse filesystems ansible engine and as well as previous versions are affected and ansible tower and as well as previous versions are affected vulnerabilityurl
| 0
|
21,432
| 29,368,589,369
|
IssuesEvent
|
2023-05-29 00:34:31
|
devssa/onde-codar-em-salvador
|
https://api.github.com/repos/devssa/onde-codar-em-salvador
|
closed
|
[Hibrido / São Paulo, São Paulo, Brazil] SAP ABAP (Híbrido São Paulo/SP) na Coodesh
|
SALVADOR SENIOR REQUISITOS SAP PROCESSOS GITHUB INGLÊS UMA QUALIDADE MODELAGEM DE DADOS QA HIBRIDO ALOCADO Stale
|
## Descrição da vaga:
Esta é uma vaga de um parceiro da plataforma Coodesh, ao candidatar-se você terá acesso as informações completas sobre a empresa e benefícios.
Fique atento ao redirecionamento que vai te levar para uma url [https://coodesh.com](https://coodesh.com/jobs/sap-abap-hibrido-sao-paulosp-130415961?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) com o pop-up personalizado de candidatura. 👋
<p>A KLB Group está em busca de Senior SAP ABAP para compor seu time!</p>
<p>A KLB Group é especializada na implementação de projetos em empresas públicas e privadas. Seja em projetos de desenvolvimento, produção ou transformação, a KLB Group assegura a implementação eficaz, mobilizando rapidamente uma equipe de especialistas de várias funções (compras, cadeia de suprimentos, qualidade, engenharia, TI, finanças, etc.), com uma combinação única de experiência em design, implementação e operação. A KLB Group tem mais de 500 funcionários na Europa, América e Ásia.</p>
<p>Atividades e Atribuições:</p>
<ul>
<li>Analisar e levantar requisitos, mapear processos e realizar modelagem de dados, com objetivo de estudar e implementar sistemas de acordo com as regras de negócio;</li>
<li>Analisar o desempenho de sistemas implantados, soluciona problemas técnicos e elabora manuais;</li>
<li>Análise e Desenvolvimento de programas;</li>
<li>Desenhar e implementar melhorias de processo relacionado ao projeto;</li>
<li>Atuar no framework de aplicação específica do negócio dentro do SAP suportando melhorias;</li>
<li>Garantir a qualidade de entrega das demandas e efetuar processos de QA nos ambientes de Testes e Produção.</li>
</ul>
<p>Cliente final: Samsung.</p>
<p>Atuação híbrida 03 vezes por semana no bairro do Morumbi, zona sul de São Paulo/SP. Horário de Trabalho: das 09hs às 18hs.</p>
<p>Tempo de Projeto: inicialmente 06 meses.</p>
## KLB GROUP BRASIL :
<p>A KLB Group é especializada na implementação de projetos em empresas públicas e privadas. Seja em projetos de desenvolvimento, produção ou transformação, a KLB Group assegura a implementação eficaz, mobilizando rapidamente uma equipe de especialistas de várias funções (compras, cadeia de suprimentos, qualidade, engenharia, TI, finanças, etc.), com uma combinação única de experiência em design, implementação e operação. A KLB Group tem mais de 500 funcionários na Europa, América e Ásia.</p>
</p>
## Habilidades:
- SAP
- RPA
- Inglês
## Local:
São Paulo, São Paulo, Brazil
## Requisitos:
- Experiência de sólida na área;
- Superior completo na área de Tecnologia;
- Experiência com ABAP, SAP Scripts, Smartforms, Idocs, ABAP Objects, Dialog programming, User-exits, Smart Forms, PDF Forms, ALV, RFCs and other SAP development tools;
- Inglês Avançado;
- Conhecimentos com processos Funcionais SAP MM (Inbound Process) e GRC (PI/PO);
- Sólida experiência em projetos de Inbound;
- Habilidades de comunicação, interação com os usuários;
- Conhecimento do conceito de RPA;
- Responsabilidade avançada, envolvendo tarefas que exigem um grau maior de conhecimento e experiência;
- Raciocínio analítico e afinidade com processos de SAP MM.
## Como se candidatar:
Candidatar-se exclusivamente através da plataforma Coodesh no link a seguir: [SAP ABAP (Híbrido São Paulo/SP) na KLB GROUP BRASIL ](https://coodesh.com/jobs/sap-abap-hibrido-sao-paulosp-130415961?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open)
Após candidatar-se via plataforma Coodesh e validar o seu login, você poderá acompanhar e receber todas as interações do processo por lá. Utilize a opção **Pedir Feedback** entre uma etapa e outra na vaga que se candidatou. Isso fará com que a pessoa **Recruiter** responsável pelo processo na empresa receba a notificação.
## Labels
#### Alocação
Alocado
#### Regime
CLT
#### Categoria
Gestão em TI
|
1.0
|
[Hibrido / São Paulo, São Paulo, Brazil] SAP ABAP (Híbrido São Paulo/SP) na Coodesh - ## Descrição da vaga:
Esta é uma vaga de um parceiro da plataforma Coodesh, ao candidatar-se você terá acesso as informações completas sobre a empresa e benefícios.
Fique atento ao redirecionamento que vai te levar para uma url [https://coodesh.com](https://coodesh.com/jobs/sap-abap-hibrido-sao-paulosp-130415961?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) com o pop-up personalizado de candidatura. 👋
<p>A KLB Group está em busca de Senior SAP ABAP para compor seu time!</p>
<p>A KLB Group é especializada na implementação de projetos em empresas públicas e privadas. Seja em projetos de desenvolvimento, produção ou transformação, a KLB Group assegura a implementação eficaz, mobilizando rapidamente uma equipe de especialistas de várias funções (compras, cadeia de suprimentos, qualidade, engenharia, TI, finanças, etc.), com uma combinação única de experiência em design, implementação e operação. A KLB Group tem mais de 500 funcionários na Europa, América e Ásia.</p>
<p>Atividades e Atribuições:</p>
<ul>
<li>Analisar e levantar requisitos, mapear processos e realizar modelagem de dados, com objetivo de estudar e implementar sistemas de acordo com as regras de negócio;</li>
<li>Analisar o desempenho de sistemas implantados, soluciona problemas técnicos e elabora manuais;</li>
<li>Análise e Desenvolvimento de programas;</li>
<li>Desenhar e implementar melhorias de processo relacionado ao projeto;</li>
<li>Atuar no framework de aplicação específica do negócio dentro do SAP suportando melhorias;</li>
<li>Garantir a qualidade de entrega das demandas e efetuar processos de QA nos ambientes de Testes e Produção.</li>
</ul>
<p>Cliente final: Samsung.</p>
<p>Atuação híbrida 03 vezes por semana no bairro do Morumbi, zona sul de São Paulo/SP. Horário de Trabalho: das 09hs às 18hs.</p>
<p>Tempo de Projeto: inicialmente 06 meses.</p>
## KLB GROUP BRASIL :
<p>A KLB Group é especializada na implementação de projetos em empresas públicas e privadas. Seja em projetos de desenvolvimento, produção ou transformação, a KLB Group assegura a implementação eficaz, mobilizando rapidamente uma equipe de especialistas de várias funções (compras, cadeia de suprimentos, qualidade, engenharia, TI, finanças, etc.), com uma combinação única de experiência em design, implementação e operação. A KLB Group tem mais de 500 funcionários na Europa, América e Ásia.</p>
</p>
## Habilidades:
- SAP
- RPA
- Inglês
## Local:
São Paulo, São Paulo, Brazil
## Requisitos:
- Experiência de sólida na área;
- Superior completo na área de Tecnologia;
- Experiência com ABAP, SAP Scripts, Smartforms, Idocs, ABAP Objects, Dialog programming, User-exits, Smart Forms, PDF Forms, ALV, RFCs and other SAP development tools;
- Inglês Avançado;
- Conhecimentos com processos Funcionais SAP MM (Inbound Process) e GRC (PI/PO);
- Sólida experiência em projetos de Inbound;
- Habilidades de comunicação, interação com os usuários;
- Conhecimento do conceito de RPA;
- Responsabilidade avançada, envolvendo tarefas que exigem um grau maior de conhecimento e experiência;
- Raciocínio analítico e afinidade com processos de SAP MM.
## Como se candidatar:
Candidatar-se exclusivamente através da plataforma Coodesh no link a seguir: [SAP ABAP (Híbrido São Paulo/SP) na KLB GROUP BRASIL ](https://coodesh.com/jobs/sap-abap-hibrido-sao-paulosp-130415961?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open)
Após candidatar-se via plataforma Coodesh e validar o seu login, você poderá acompanhar e receber todas as interações do processo por lá. Utilize a opção **Pedir Feedback** entre uma etapa e outra na vaga que se candidatou. Isso fará com que a pessoa **Recruiter** responsável pelo processo na empresa receba a notificação.
## Labels
#### Alocação
Alocado
#### Regime
CLT
#### Categoria
Gestão em TI
|
process
|
sap abap híbrido são paulo sp na coodesh descrição da vaga esta é uma vaga de um parceiro da plataforma coodesh ao candidatar se você terá acesso as informações completas sobre a empresa e benefícios fique atento ao redirecionamento que vai te levar para uma url com o pop up personalizado de candidatura 👋 a klb group está em busca de senior sap abap para compor seu time a klb group é especializada na implementação de projetos em empresas públicas e privadas seja em projetos de desenvolvimento produção ou transformação a klb group assegura a implementação eficaz mobilizando rapidamente uma equipe de especialistas de várias funções compras cadeia de suprimentos qualidade engenharia ti finanças etc com uma combinação única de experiência em design implementação e operação a klb group tem mais de funcionários na europa américa e ásia atividades e atribuições analisar e levantar requisitos mapear processos e realizar modelagem de dados com objetivo de estudar e implementar sistemas de acordo com as regras de negócio analisar o desempenho de sistemas implantados soluciona problemas técnicos e elabora manuais análise e desenvolvimento de programas desenhar e implementar melhorias de processo relacionado ao projeto atuar no framework de aplicação específica do negócio dentro do sap suportando melhorias garantir a qualidade de entrega das demandas e efetuar processos de qa nos ambientes de testes e produção cliente final samsung atuação híbrida vezes por semana no bairro do morumbi zona sul de são paulo sp horário de trabalho das às tempo de projeto inicialmente meses klb group brasil a klb group é especializada na implementação de projetos em empresas públicas e privadas seja em projetos de desenvolvimento produção ou transformação a klb group assegura a implementação eficaz mobilizando rapidamente uma equipe de especialistas de várias funções compras cadeia de suprimentos qualidade engenharia ti finanças etc com uma combinação única de experiência em design implementação e operação a klb group tem mais de funcionários na europa américa e ásia habilidades sap rpa inglês local são paulo são paulo brazil requisitos experiência de sólida na área superior completo na área de tecnologia experiência com abap sap scripts smartforms idocs abap objects dialog programming user exits smart forms pdf forms alv rfcs and other sap development tools inglês avançado conhecimentos com processos funcionais sap mm inbound process e grc pi po sólida experiência em projetos de inbound habilidades de comunicação interação com os usuários conhecimento do conceito de rpa responsabilidade avançada envolvendo tarefas que exigem um grau maior de conhecimento e experiência raciocínio analítico e afinidade com processos de sap mm como se candidatar candidatar se exclusivamente através da plataforma coodesh no link a seguir após candidatar se via plataforma coodesh e validar o seu login você poderá acompanhar e receber todas as interações do processo por lá utilize a opção pedir feedback entre uma etapa e outra na vaga que se candidatou isso fará com que a pessoa recruiter responsável pelo processo na empresa receba a notificação labels alocação alocado regime clt categoria gestão em ti
| 1
|
8,833
| 11,944,917,686
|
IssuesEvent
|
2020-04-03 04:03:50
|
googleapis/nodejs-bigtable
|
https://api.github.com/repos/googleapis/nodejs-bigtable
|
closed
|
enable tslint.json rules
|
api: bigtable type: process
|
Clean up synth.py.
reference: ##631
Ideally we should only have below as `tslint.json` as generated from Microgenerator.
```
{
"extends": "gts/tslint.json",
}
```
|
1.0
|
enable tslint.json rules - Clean up synth.py.
reference: ##631
Ideally we should only have below as `tslint.json` as generated from Microgenerator.
```
{
"extends": "gts/tslint.json",
}
```
|
process
|
enable tslint json rules clean up synth py reference ideally we should only have below as tslint json as generated from microgenerator extends gts tslint json
| 1
|
411,832
| 27,833,210,320
|
IssuesEvent
|
2023-03-20 07:24:52
|
royshil/obs-backgroundremoval
|
https://api.github.com/repos/royshil/obs-backgroundremoval
|
closed
|
Update documentation to reflect that the filter is now an effect filter and not an audio/video filter
|
documentation
|
EDIT: Thanks @umireon . The filter was under the "Effect filters" an not under "Audio/Video" Filters as the old screenshots may suggest. I think it could be cool to update the documentation to avoid people thinking it's not working like me.
Hi. I'm using a fresh install of ubuntu 22. I did everything I could. I tried the .deb, I tried to build it myself, I tried with obs from ubuntu, obs from ppa, obs from flathub, 25, 27, 29. It never showed up. The best I could get was:
```
warning: Failed to load 'en-US' text for module: 'obs-backgroundremoval.so'
info: [obs-backgroundremoval] plugin loaded successfully (version 0.5.13)
```
But even when I got this it did not show up in obs in the list of filters in Audio/Video filters.
I had no success before on Ubuntu 20, I was never able to make it work yet.
|
1.0
|
Update documentation to reflect that the filter is now an effect filter and not an audio/video filter -
EDIT: Thanks @umireon . The filter was under the "Effect filters" an not under "Audio/Video" Filters as the old screenshots may suggest. I think it could be cool to update the documentation to avoid people thinking it's not working like me.
Hi. I'm using a fresh install of ubuntu 22. I did everything I could. I tried the .deb, I tried to build it myself, I tried with obs from ubuntu, obs from ppa, obs from flathub, 25, 27, 29. It never showed up. The best I could get was:
```
warning: Failed to load 'en-US' text for module: 'obs-backgroundremoval.so'
info: [obs-backgroundremoval] plugin loaded successfully (version 0.5.13)
```
But even when I got this it did not show up in obs in the list of filters in Audio/Video filters.
I had no success before on Ubuntu 20, I was never able to make it work yet.
|
non_process
|
update documentation to reflect that the filter is now an effect filter and not an audio video filter edit thanks umireon the filter was under the effect filters an not under audio video filters as the old screenshots may suggest i think it could be cool to update the documentation to avoid people thinking it s not working like me hi i m using a fresh install of ubuntu i did everything i could i tried the deb i tried to build it myself i tried with obs from ubuntu obs from ppa obs from flathub it never showed up the best i could get was warning failed to load en us text for module obs backgroundremoval so info plugin loaded successfully version but even when i got this it did not show up in obs in the list of filters in audio video filters i had no success before on ubuntu i was never able to make it work yet
| 0
|
66,065
| 16,532,334,189
|
IssuesEvent
|
2021-05-27 07:45:49
|
rticommunity/rticonnextdds-examples
|
https://api.github.com/repos/rticommunity/rticonnextdds-examples
|
closed
|
analyze-build warnings
|
build ci
|
<!-- :warning: Please, try to follow the template -->
### Information
- **RTI Product**: Connext DDS
- **Version**: 6.1.0
- **Operating system**: Linux
- **Compiler**: GCC
- **Compiler version**: 7.5.0
- **Additional information**:
### What is the current behavior?
There are a few warnings during static analysis:
```plaintext
analyze-build: INFO: /home/mnunez/rticonnextdds-examples/examples/connext_dds/custom_transport/c/FileTransport.c:1504:5: warning: Dereference of null pointer (loaded from variable 'dest_address_in')
analyze-build: INFO: NDDS_Transport_Address_copy(&sendResourceStruct->_address, dest_address_in);
analyze-build: INFO: ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
analyze-build: INFO: /home/mnunez/rti_connext_dds-6.1.0/include/ndds/transport/transport_common_impl.h:68:15: note: expanded from macro 'NDDS_Transport_Address_copy'
analyze-build: INFO: ( *(dst) = *(src) )
analyze-build: INFO: ^~~~~~
analyze-build: INFO: /home/mnunez/rticonnextdds-examples/examples/connext_dds/custom_transport/c/FileTransport.c:1553:14: warning: Null pointer passed as an argument to a 'nonnull' parameter
analyze-build: INFO: || (!NDDS_Transport_Address_is_equal(
analyze-build: INFO: ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
analyze-build: INFO: /home/mnunez/rti_connext_dds-6.1.0/include/ndds/transport/transport_common_impl.h:63:7: note: expanded from macro 'NDDS_Transport_Address_is_equal'
analyze-build: INFO: ( !RTIOsapiMemory_compare(l, r, sizeof(NDDS_Transport_Address_t)) )
analyze-build: INFO: ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
analyze-build: INFO: /home/mnunez/rti_connext_dds-6.1.0/include/ndds/osapi/osapi_bufferUtils_impl.h:44:30: note: expanded from macro 'RTIOsapiMemory_compare'
analyze-build: INFO: (((size) == 0) ? 0 : memcmp(l, r, (size_t) (size)))
analyze-build: INFO: ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~
analyze-build: INFO: 2 warnings generated.
analyze-build: INFO: /home/mnunez/rticonnextdds-examples/examples/connext_dds/flat_data_api/c++98/CameraImage_subscriber.cxx:162:45: warning: Division by zero
analyze-build: INFO: std::cout << "Avg. pixel: (" << red_sum / pixel_count << ", "
analyze-build: INFO: ~~~~~~~~^~~~~~~~~~~~~
analyze-build: INFO: 1 warning generated.
analyze-build: INFO: /home/mnunez/rticonnextdds-examples/examples/connext_dds/logging_config/c/logging_publisher.c:94:27: warning: Value stored to 'send_period' during its initialization is never read
analyze-build: INFO: struct DDS_Duration_t send_period = {4,0};
analyze-build: INFO: ^~~~~~~~~~~ ~~~~~
analyze-build: INFO: 1 warning generated.
```
### Steps to reproduce the issue
Run `resources/ci_cd/linux_static_analysis.py`.
|
1.0
|
analyze-build warnings - <!-- :warning: Please, try to follow the template -->
### Information
- **RTI Product**: Connext DDS
- **Version**: 6.1.0
- **Operating system**: Linux
- **Compiler**: GCC
- **Compiler version**: 7.5.0
- **Additional information**:
### What is the current behavior?
There are a few warnings during static analysis:
```plaintext
analyze-build: INFO: /home/mnunez/rticonnextdds-examples/examples/connext_dds/custom_transport/c/FileTransport.c:1504:5: warning: Dereference of null pointer (loaded from variable 'dest_address_in')
analyze-build: INFO: NDDS_Transport_Address_copy(&sendResourceStruct->_address, dest_address_in);
analyze-build: INFO: ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
analyze-build: INFO: /home/mnunez/rti_connext_dds-6.1.0/include/ndds/transport/transport_common_impl.h:68:15: note: expanded from macro 'NDDS_Transport_Address_copy'
analyze-build: INFO: ( *(dst) = *(src) )
analyze-build: INFO: ^~~~~~
analyze-build: INFO: /home/mnunez/rticonnextdds-examples/examples/connext_dds/custom_transport/c/FileTransport.c:1553:14: warning: Null pointer passed as an argument to a 'nonnull' parameter
analyze-build: INFO: || (!NDDS_Transport_Address_is_equal(
analyze-build: INFO: ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
analyze-build: INFO: /home/mnunez/rti_connext_dds-6.1.0/include/ndds/transport/transport_common_impl.h:63:7: note: expanded from macro 'NDDS_Transport_Address_is_equal'
analyze-build: INFO: ( !RTIOsapiMemory_compare(l, r, sizeof(NDDS_Transport_Address_t)) )
analyze-build: INFO: ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
analyze-build: INFO: /home/mnunez/rti_connext_dds-6.1.0/include/ndds/osapi/osapi_bufferUtils_impl.h:44:30: note: expanded from macro 'RTIOsapiMemory_compare'
analyze-build: INFO: (((size) == 0) ? 0 : memcmp(l, r, (size_t) (size)))
analyze-build: INFO: ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~
analyze-build: INFO: 2 warnings generated.
analyze-build: INFO: /home/mnunez/rticonnextdds-examples/examples/connext_dds/flat_data_api/c++98/CameraImage_subscriber.cxx:162:45: warning: Division by zero
analyze-build: INFO: std::cout << "Avg. pixel: (" << red_sum / pixel_count << ", "
analyze-build: INFO: ~~~~~~~~^~~~~~~~~~~~~
analyze-build: INFO: 1 warning generated.
analyze-build: INFO: /home/mnunez/rticonnextdds-examples/examples/connext_dds/logging_config/c/logging_publisher.c:94:27: warning: Value stored to 'send_period' during its initialization is never read
analyze-build: INFO: struct DDS_Duration_t send_period = {4,0};
analyze-build: INFO: ^~~~~~~~~~~ ~~~~~
analyze-build: INFO: 1 warning generated.
```
### Steps to reproduce the issue
Run `resources/ci_cd/linux_static_analysis.py`.
|
non_process
|
analyze build warnings information rti product connext dds version operating system linux compiler gcc compiler version additional information what is the current behavior there are a few warnings during static analysis plaintext analyze build info home mnunez rticonnextdds examples examples connext dds custom transport c filetransport c warning dereference of null pointer loaded from variable dest address in analyze build info ndds transport address copy sendresourcestruct address dest address in analyze build info analyze build info home mnunez rti connext dds include ndds transport transport common impl h note expanded from macro ndds transport address copy analyze build info dst src analyze build info analyze build info home mnunez rticonnextdds examples examples connext dds custom transport c filetransport c warning null pointer passed as an argument to a nonnull parameter analyze build info ndds transport address is equal analyze build info analyze build info home mnunez rti connext dds include ndds transport transport common impl h note expanded from macro ndds transport address is equal analyze build info rtiosapimemory compare l r sizeof ndds transport address t analyze build info analyze build info home mnunez rti connext dds include ndds osapi osapi bufferutils impl h note expanded from macro rtiosapimemory compare analyze build info size memcmp l r size t size analyze build info analyze build info warnings generated analyze build info home mnunez rticonnextdds examples examples connext dds flat data api c cameraimage subscriber cxx warning division by zero analyze build info std cout avg pixel red sum pixel count analyze build info analyze build info warning generated analyze build info home mnunez rticonnextdds examples examples connext dds logging config c logging publisher c warning value stored to send period during its initialization is never read analyze build info struct dds duration t send period analyze build info analyze build info warning generated steps to reproduce the issue run resources ci cd linux static analysis py
| 0
|
26,035
| 4,552,894,468
|
IssuesEvent
|
2016-09-13 01:21:27
|
MDAnalysis/mdanalysis
|
https://api.github.com/repos/MDAnalysis/mdanalysis
|
opened
|
duplicate resids hide residues in atom.residues
|
Component-Core defect
|
I used the PDB from https://github.com/MDAnalysis/mdanalysis/issues/975#issuecomment-246074022 `asdf.pdb` which has no chain ID but duplicated resids.
### Expected behaviour
All residues should be separately resolved in `atoms.residues`.
### Actual behaviour
Instead, only one residue for each resid is visible.
All atoms are present.
### Code to reproduce the behaviour
Download [asdf.py](https://dl.dropboxusercontent.com/u/49250897/asdf.pdb) from dropbox.
``` python
import MDAnalysis as mda
u = mda.Universe("asdf.pdb")
print(len(u.atoms))
# 29879
print(len(u.residues))
431
# the correct number of residues:
print(u.atoms.CA.n_atoms)
# 1849
```
### Currently version of MDAnalysis:
(run `python -c "import MDAnalysis as mda; print(mda.__version__)"`)
0.15.0, 0.15.1-dev
|
1.0
|
duplicate resids hide residues in atom.residues - I used the PDB from https://github.com/MDAnalysis/mdanalysis/issues/975#issuecomment-246074022 `asdf.pdb` which has no chain ID but duplicated resids.
### Expected behaviour
All residues should be separately resolved in `atoms.residues`.
### Actual behaviour
Instead, only one residue for each resid is visible.
All atoms are present.
### Code to reproduce the behaviour
Download [asdf.py](https://dl.dropboxusercontent.com/u/49250897/asdf.pdb) from dropbox.
``` python
import MDAnalysis as mda
u = mda.Universe("asdf.pdb")
print(len(u.atoms))
# 29879
print(len(u.residues))
431
# the correct number of residues:
print(u.atoms.CA.n_atoms)
# 1849
```
### Currently version of MDAnalysis:
(run `python -c "import MDAnalysis as mda; print(mda.__version__)"`)
0.15.0, 0.15.1-dev
|
non_process
|
duplicate resids hide residues in atom residues i used the pdb from asdf pdb which has no chain id but duplicated resids expected behaviour all residues should be separately resolved in atoms residues actual behaviour instead only one residue for each resid is visible all atoms are present code to reproduce the behaviour download from dropbox python import mdanalysis as mda u mda universe asdf pdb print len u atoms print len u residues the correct number of residues print u atoms ca n atoms currently version of mdanalysis run python c import mdanalysis as mda print mda version dev
| 0
|
9,185
| 12,228,615,044
|
IssuesEvent
|
2020-05-03 20:11:35
|
chfor183/data_science_articles
|
https://api.github.com/repos/chfor183/data_science_articles
|
opened
|
Feature Scaling
|
Data Preprocessing Machine Learning
|
## TL;DR
Yes !
## Key Takeaways
- 1
- 2
## Useful Code Snippets
```
function test() {
console.log("notice the blank line before this function?");
}
```
## Articles/Ressources
|
1.0
|
Feature Scaling - ## TL;DR
Yes !
## Key Takeaways
- 1
- 2
## Useful Code Snippets
```
function test() {
console.log("notice the blank line before this function?");
}
```
## Articles/Ressources
|
process
|
feature scaling tl dr yes key takeaways useful code snippets function test console log notice the blank line before this function articles ressources
| 1
|
125,395
| 12,259,625,316
|
IssuesEvent
|
2020-05-06 16:54:39
|
dotnet/runtime
|
https://api.github.com/repos/dotnet/runtime
|
closed
|
Documentation needed on System.Collections.Immutable (what exist is seriously lacking)
|
area-System.Collections documentation
|
The documentation for how to use System.Collections.Immutable is the worst I have seen of any general microsoft libraries. No general introduction and very few examples. + Most stuff that google can find on the net is outdated and may no longer work. I need quite some detective work just to find out how to instantiate an instance of an immutable list with specific values, how to convert a IEnumerable etc.
|
1.0
|
Documentation needed on System.Collections.Immutable (what exist is seriously lacking) - The documentation for how to use System.Collections.Immutable is the worst I have seen of any general microsoft libraries. No general introduction and very few examples. + Most stuff that google can find on the net is outdated and may no longer work. I need quite some detective work just to find out how to instantiate an instance of an immutable list with specific values, how to convert a IEnumerable etc.
|
non_process
|
documentation needed on system collections immutable what exist is seriously lacking the documentation for how to use system collections immutable is the worst i have seen of any general microsoft libraries no general introduction and very few examples most stuff that google can find on the net is outdated and may no longer work i need quite some detective work just to find out how to instantiate an instance of an immutable list with specific values how to convert a ienumerable etc
| 0
|
9,162
| 8,552,531,356
|
IssuesEvent
|
2018-11-07 21:22:07
|
edgexfoundry/edgex-go
|
https://api.github.com/repos/edgexfoundry/edgex-go
|
closed
|
Scheduler/Metadata Interaction Error During "Make Run"
|
support-services
|
When I execute "make run" from the command line, I usually see errors indicating that the scheduler is trying to call metadata for default schedule information, but metadata isn't ready to serve requests yet. Example:
```
ERROR: 2018/10/29 16:41:14 error connecting to metadata and retrieving schedules Get http://localhost:48081/api/v1/schedule: dial tcp [::1]:48081: connect: connection refused
ERROR: 2018/10/29 16:41:14 error connecting to metadata and retrieving schedule events: Get http://localhost:48081/api/v1/scheduleevent: dial tcp [::1]:48081: connect: connection refused
INFO: 2018/10/29 16:41:14 scheduler could not find schedule id with schedule with name : midnight
ERROR: 2018/10/29 16:41:14 error trying to add schedule to core-metadata service: Post http://localhost:48081/api/v1/schedule: dial tcp [::1]:48081: connect: connection refused
ERROR: 2018/10/29 16:41:14 scheduler could not find schedule id with schedule event name : scrub-pushed-events
```
Do we need retries here?
|
1.0
|
Scheduler/Metadata Interaction Error During "Make Run" - When I execute "make run" from the command line, I usually see errors indicating that the scheduler is trying to call metadata for default schedule information, but metadata isn't ready to serve requests yet. Example:
```
ERROR: 2018/10/29 16:41:14 error connecting to metadata and retrieving schedules Get http://localhost:48081/api/v1/schedule: dial tcp [::1]:48081: connect: connection refused
ERROR: 2018/10/29 16:41:14 error connecting to metadata and retrieving schedule events: Get http://localhost:48081/api/v1/scheduleevent: dial tcp [::1]:48081: connect: connection refused
INFO: 2018/10/29 16:41:14 scheduler could not find schedule id with schedule with name : midnight
ERROR: 2018/10/29 16:41:14 error trying to add schedule to core-metadata service: Post http://localhost:48081/api/v1/schedule: dial tcp [::1]:48081: connect: connection refused
ERROR: 2018/10/29 16:41:14 scheduler could not find schedule id with schedule event name : scrub-pushed-events
```
Do we need retries here?
|
non_process
|
scheduler metadata interaction error during make run when i execute make run from the command line i usually see errors indicating that the scheduler is trying to call metadata for default schedule information but metadata isn t ready to serve requests yet example error error connecting to metadata and retrieving schedules get dial tcp connect connection refused error error connecting to metadata and retrieving schedule events get dial tcp connect connection refused info scheduler could not find schedule id with schedule with name midnight error error trying to add schedule to core metadata service post dial tcp connect connection refused error scheduler could not find schedule id with schedule event name scrub pushed events do we need retries here
| 0
|
11,180
| 13,957,695,399
|
IssuesEvent
|
2020-10-24 08:11:32
|
alexanderkotsev/geoportal
|
https://api.github.com/repos/alexanderkotsev/geoportal
|
opened
|
SE: Questions on Harvesting frequency and the new Validator
|
Geoportal Harvesting process SE - Sweden
|
From: Persäter Fredrik
Sent: 17 December 2018 14:56
To: QUAGLIA Angelo (JRC-ISPRA-EXT)
Subject: Questions on Harvesting frequency and the new Validator
Dear Angelo,
We are currently doing a targeted action in Sweden trying to get a better result for our priority datasets in the European Geoportal. To make the feedback on result in the EU-Geoportal for the Swedish data providers quicker, I have increased the harvesting frequency to be daily. That was done Friday, December 14. When I look for the harvesting status at the Geoportal the frequency is set to daily, but the date for the latest harvesting is Thursday, Dec 6 – why?
Another question we have discussed but not got any clear answer on is the possibility in the new Validator to validate links according to TG. This was an existing functionality in the old validator (INSPIRE Validator 2) which is very important when we now trying to solve issues with download services. Should we use the old or is it available even in the new validator?
Best Regards
______________________
Fredrik Persäter
Swedish Mapping, Cadastre and Land Registration authority
|
1.0
|
SE: Questions on Harvesting frequency and the new Validator - From: Persäter Fredrik
Sent: 17 December 2018 14:56
To: QUAGLIA Angelo (JRC-ISPRA-EXT)
Subject: Questions on Harvesting frequency and the new Validator
Dear Angelo,
We are currently doing a targeted action in Sweden trying to get a better result for our priority datasets in the European Geoportal. To make the feedback on result in the EU-Geoportal for the Swedish data providers quicker, I have increased the harvesting frequency to be daily. That was done Friday, December 14. When I look for the harvesting status at the Geoportal the frequency is set to daily, but the date for the latest harvesting is Thursday, Dec 6 – why?
Another question we have discussed but not got any clear answer on is the possibility in the new Validator to validate links according to TG. This was an existing functionality in the old validator (INSPIRE Validator 2) which is very important when we now trying to solve issues with download services. Should we use the old or is it available even in the new validator?
Best Regards
______________________
Fredrik Persäter
Swedish Mapping, Cadastre and Land Registration authority
|
process
|
se questions on harvesting frequency and the new validator from pers auml ter fredrik sent december to quaglia angelo jrc ispra ext subject questions on harvesting frequency and the new validator dear angelo we are currently doing a targeted action in sweden trying to get a better result for our priority datasets in the european geoportal to make the feedback on result in the eu geoportal for the swedish data providers quicker i have increased the harvesting frequency to be daily that was done friday december when i look for the harvesting status at the geoportal the frequency is set to daily but the date for the latest harvesting is thursday dec ndash why another question we have discussed but not got any clear answer on is the possibility in the new validator to validate links according to tg this was an existing functionality in the old validator inspire validator which is very important when we now trying to solve issues with download services should we use the old or is it available even in the new validator best regards fredrik pers auml ter swedish mapping cadastre and land registration authority
| 1
|
336,225
| 10,173,530,543
|
IssuesEvent
|
2019-08-08 13:19:13
|
AxonIQ/reference-guide
|
https://api.github.com/repos/AxonIQ/reference-guide
|
closed
|
Broken Links in the Reference Guide
|
Priority 2: Should Status: Resolved Type: Bug
|
e.g. in here https://docs.axoniq.io/reference-guide/implementing-domain-logic/command-handling.
Maybe a job in CI that seeks all links and pings them for 200 HTTP status would fix this issue.
|
1.0
|
Broken Links in the Reference Guide - e.g. in here https://docs.axoniq.io/reference-guide/implementing-domain-logic/command-handling.
Maybe a job in CI that seeks all links and pings them for 200 HTTP status would fix this issue.
|
non_process
|
broken links in the reference guide e g in here maybe a job in ci that seeks all links and pings them for http status would fix this issue
| 0
|
6,338
| 9,379,364,343
|
IssuesEvent
|
2019-04-04 14:47:30
|
pwittchen/ReactiveNetwork
|
https://api.github.com/repos/pwittchen/ReactiveNetwork
|
opened
|
Improve release process
|
release process
|
Add the following improvements to the release process:
- optionally, remove steps related to `RxJava1.x` from the `RELEASING.md` file (they're no longer needed)
- create `release.sh` script, which will do the following things:
- update docs (see `update_docs.sh` script and https://stackoverflow.com/a/12363366/1150795)
- update javadocs (see `update_javadocs.sh`)
- upload artifact to the maven central repository (see `uploadArchives` gradle task)
- close and release the artifact (see `closeAndReleaseRepository` gradle task)
- print message about the success in the end
**Note**: basically everything is ready to automate right now and I just need to glue it toghether
As a final effect I'd like to have a single script, which will do all the work mentioned above automatically within a single call like:
```
./release.sh
```
|
1.0
|
Improve release process - Add the following improvements to the release process:
- optionally, remove steps related to `RxJava1.x` from the `RELEASING.md` file (they're no longer needed)
- create `release.sh` script, which will do the following things:
- update docs (see `update_docs.sh` script and https://stackoverflow.com/a/12363366/1150795)
- update javadocs (see `update_javadocs.sh`)
- upload artifact to the maven central repository (see `uploadArchives` gradle task)
- close and release the artifact (see `closeAndReleaseRepository` gradle task)
- print message about the success in the end
**Note**: basically everything is ready to automate right now and I just need to glue it toghether
As a final effect I'd like to have a single script, which will do all the work mentioned above automatically within a single call like:
```
./release.sh
```
|
process
|
improve release process add the following improvements to the release process optionally remove steps related to x from the releasing md file they re no longer needed create release sh script which will do the following things update docs see update docs sh script and update javadocs see update javadocs sh upload artifact to the maven central repository see uploadarchives gradle task close and release the artifact see closeandreleaserepository gradle task print message about the success in the end note basically everything is ready to automate right now and i just need to glue it toghether as a final effect i d like to have a single script which will do all the work mentioned above automatically within a single call like release sh
| 1
|
22,674
| 31,898,628,613
|
IssuesEvent
|
2023-09-18 05:43:42
|
pydata/pydata-sphinx-theme
|
https://api.github.com/repos/pydata/pydata-sphinx-theme
|
opened
|
Can I be granted admin right on the repository ?
|
tag: team process
|
We have a long lasting error on Codecov and from time to time small issues with Nodes that can only be fixed by admin user.
I assume @choldgraf you were more or less in charge until now but you are less available and we don't want to bother you all the time.
For the time being I have time and I'm willing to take care of this issues when it's needed.
|
1.0
|
Can I be granted admin right on the repository ? - We have a long lasting error on Codecov and from time to time small issues with Nodes that can only be fixed by admin user.
I assume @choldgraf you were more or less in charge until now but you are less available and we don't want to bother you all the time.
For the time being I have time and I'm willing to take care of this issues when it's needed.
|
process
|
can i be granted admin right on the repository we have a long lasting error on codecov and from time to time small issues with nodes that can only be fixed by admin user i assume choldgraf you were more or less in charge until now but you are less available and we don t want to bother you all the time for the time being i have time and i m willing to take care of this issues when it s needed
| 1
|
13,451
| 15,896,632,317
|
IssuesEvent
|
2021-04-11 18:06:57
|
hasura/ask-me-anything
|
https://api.github.com/repos/hasura/ask-me-anything
|
opened
|
What are the supported character data types in Hasura?
|
data type question series next-up-for-ama processing-for-shortvid question
|
## The following is the chart of Hasura supported Postgres data types.
<table border="1" class="colwidths-given docutils">
<colgroup>
<col width="24%">
<col width="20%">
<col width="45%">
<col width="11%">
</colgroup>
<thead valign="bottom">
<tr class="row-odd"><th class="head">Name</th>
<th class="head">Aliases</th>
<th class="head">Description</th>
<th class="head">Hasura Type</th>
</tr>
</thead>
<tbody valign="top">
<tr class="row-even"><td>bigint</td>
<td>int8</td>
<td>signed eight-byte integer</td>
<td><a class="reference internal" href="#string">String</a></td>
</tr>
<tr class="row-odd"><td>bigserial</td>
<td>serial8</td>
<td>autoincrementing eight-byte integer</td>
<td><a class="reference internal" href="#string">String</a></td>
</tr>
<tr class="row-even"><td>bit [ (n) ]</td>
<td> </td>
<td>fixed-length bit string</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-odd"><td>bit varying [ (n) ]</td>
<td>varbit [ (n) ]</td>
<td>variable-length bit string</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-even"><td>boolean</td>
<td>bool</td>
<td>logical Boolean (true/false)</td>
<td><a class="reference internal" href="#bool">Bool</a></td>
</tr>
<tr class="row-odd"><td>box</td>
<td> </td>
<td>rectangular box on a plane</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-even"><td>bytea</td>
<td> </td>
<td>binary data (“byte array”)</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-odd"><td>character [ (n) ]</td>
<td>char [ (n) ]</td>
<td>fixed-length character string</td>
<td><a class="reference internal" href="#char">Char</a></td>
</tr>
<tr class="row-even"><td>character varying [ (n) ]</td>
<td>varchar [ (n) ]</td>
<td>variable-length character string</td>
<td><a class="reference internal" href="#string">String</a></td>
</tr>
<tr class="row-odd"><td>cidr</td>
<td> </td>
<td>IPv4 or IPv6 network address</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-even"><td>circle</td>
<td> </td>
<td>circle on a plane</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-odd"><td>date</td>
<td> </td>
<td>calendar date (year, month, day)</td>
<td><a class="reference internal" href="#date">Date</a></td>
</tr>
<tr class="row-even"><td>double precision</td>
<td>float8</td>
<td>double precision floating-point number (8 bytes)</td>
<td><a class="reference internal" href="#float">Float</a></td>
</tr>
<tr class="row-odd"><td>inet</td>
<td> </td>
<td>IPv4 or IPv6 host address</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-even"><td>integer</td>
<td>int, int4</td>
<td>signed four-byte integer</td>
<td><a class="reference internal" href="#int">Int</a></td>
</tr>
<tr class="row-odd"><td>interval [ fields ] [ (p) ]</td>
<td> </td>
<td>time span</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-even"><td>json</td>
<td> </td>
<td>textual JSON data</td>
<td><a class="reference internal" href="#json">JSON</a></td>
</tr>
<tr class="row-odd"><td>jsonb</td>
<td> </td>
<td>binary JSON data, decomposed</td>
<td><a class="reference internal" href="#jsonb">JSONB</a></td>
</tr>
<tr class="row-even"><td>line</td>
<td> </td>
<td>infinite line on a plane</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-odd"><td>lseg</td>
<td> </td>
<td>line segment on a plane</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-even"><td>ltree</td>
<td> </td>
<td>labels of data stored in a hierarchical tree-like structure</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-odd"><td>geometry</td>
<td> </td>
<td>PostGIS Geometry type</td>
<td><a class="reference internal" href="#geometry">Geometry</a></td>
</tr>
<tr class="row-even"><td>geography</td>
<td> </td>
<td>PostGIS Geography type</td>
<td><a class="reference internal" href="#geography">Geography</a></td>
</tr>
<tr class="row-odd"><td>macaddr</td>
<td> </td>
<td>MAC (Media Access Control) address</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-even"><td>macaddr8</td>
<td> </td>
<td>MAC (Media Access Control) address (EUI-64 format)</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-odd"><td>money</td>
<td> </td>
<td>currency amount</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-even"><td>numeric [ (p, s) ]</td>
<td>decimal [ (p, s) ]</td>
<td>exact numeric of selectable precision</td>
<td><a class="reference internal" href="#numeric">Numeric</a></td>
</tr>
<tr class="row-odd"><td>path</td>
<td> </td>
<td>geometric path on a plane</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-even"><td>pg_lsn</td>
<td> </td>
<td>PostgreSQL Log Sequence Number</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-odd"><td>point</td>
<td> </td>
<td>geometric point on a plane</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-even"><td>polygon</td>
<td> </td>
<td>closed geometric path on a plane</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-odd"><td>real</td>
<td>float4</td>
<td>single precision floating-point number (4 bytes)</td>
<td><a class="reference internal" href="#float">Float</a></td>
</tr>
<tr class="row-even"><td>smallint</td>
<td>int2</td>
<td>signed two-byte integer</td>
<td><a class="reference internal" href="#int">Int</a></td>
</tr>
<tr class="row-odd"><td>smallserial</td>
<td>serial2</td>
<td>autoincrementing two-byte integer</td>
<td><a class="reference internal" href="#int">Int</a></td>
</tr>
<tr class="row-even"><td>serial</td>
<td>serial4</td>
<td>autoincrementing four-byte integer</td>
<td><a class="reference internal" href="#int">Int</a></td>
</tr>
<tr class="row-odd"><td>text</td>
<td> </td>
<td>variable-length character string</td>
<td><a class="reference internal" href="#string">String</a></td>
</tr>
<tr class="row-even"><td>time [ (p) ] [ without time zone ]</td>
<td> </td>
<td>time of day (no time zone)</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-odd"><td>time [ (p) ] with time zone</td>
<td>timetz</td>
<td>time of day, including time zone</td>
<td><a class="reference internal" href="#timetz">Timetz</a></td>
</tr>
<tr class="row-even"><td>timestamp [ (p) ] [ without time zone ]</td>
<td> </td>
<td>date and time (no time zone)</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-odd"><td>timestamp [ (p) ] with time zone</td>
<td>timestamptz</td>
<td>date and time, including time zone</td>
<td><a class="reference internal" href="#timestamptz">Timestamptz</a></td>
</tr>
<tr class="row-even"><td>tsquery</td>
<td> </td>
<td>text search query</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-odd"><td>tsvector</td>
<td> </td>
<td>text search document</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-even"><td>txid_snapshot</td>
<td> </td>
<td>user-level transaction ID snapshot</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-odd"><td>uuid</td>
<td> </td>
<td>universally unique identifier</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-even"><td>xml</td>
<td> </td>
<td>XML data</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
</tbody>
</table>
|
1.0
|
What are the supported character data types in Hasura? - ## The following is the chart of Hasura supported Postgres data types.
<table border="1" class="colwidths-given docutils">
<colgroup>
<col width="24%">
<col width="20%">
<col width="45%">
<col width="11%">
</colgroup>
<thead valign="bottom">
<tr class="row-odd"><th class="head">Name</th>
<th class="head">Aliases</th>
<th class="head">Description</th>
<th class="head">Hasura Type</th>
</tr>
</thead>
<tbody valign="top">
<tr class="row-even"><td>bigint</td>
<td>int8</td>
<td>signed eight-byte integer</td>
<td><a class="reference internal" href="#string">String</a></td>
</tr>
<tr class="row-odd"><td>bigserial</td>
<td>serial8</td>
<td>autoincrementing eight-byte integer</td>
<td><a class="reference internal" href="#string">String</a></td>
</tr>
<tr class="row-even"><td>bit [ (n) ]</td>
<td> </td>
<td>fixed-length bit string</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-odd"><td>bit varying [ (n) ]</td>
<td>varbit [ (n) ]</td>
<td>variable-length bit string</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-even"><td>boolean</td>
<td>bool</td>
<td>logical Boolean (true/false)</td>
<td><a class="reference internal" href="#bool">Bool</a></td>
</tr>
<tr class="row-odd"><td>box</td>
<td> </td>
<td>rectangular box on a plane</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-even"><td>bytea</td>
<td> </td>
<td>binary data (“byte array”)</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-odd"><td>character [ (n) ]</td>
<td>char [ (n) ]</td>
<td>fixed-length character string</td>
<td><a class="reference internal" href="#char">Char</a></td>
</tr>
<tr class="row-even"><td>character varying [ (n) ]</td>
<td>varchar [ (n) ]</td>
<td>variable-length character string</td>
<td><a class="reference internal" href="#string">String</a></td>
</tr>
<tr class="row-odd"><td>cidr</td>
<td> </td>
<td>IPv4 or IPv6 network address</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-even"><td>circle</td>
<td> </td>
<td>circle on a plane</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-odd"><td>date</td>
<td> </td>
<td>calendar date (year, month, day)</td>
<td><a class="reference internal" href="#date">Date</a></td>
</tr>
<tr class="row-even"><td>double precision</td>
<td>float8</td>
<td>double precision floating-point number (8 bytes)</td>
<td><a class="reference internal" href="#float">Float</a></td>
</tr>
<tr class="row-odd"><td>inet</td>
<td> </td>
<td>IPv4 or IPv6 host address</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-even"><td>integer</td>
<td>int, int4</td>
<td>signed four-byte integer</td>
<td><a class="reference internal" href="#int">Int</a></td>
</tr>
<tr class="row-odd"><td>interval [ fields ] [ (p) ]</td>
<td> </td>
<td>time span</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-even"><td>json</td>
<td> </td>
<td>textual JSON data</td>
<td><a class="reference internal" href="#json">JSON</a></td>
</tr>
<tr class="row-odd"><td>jsonb</td>
<td> </td>
<td>binary JSON data, decomposed</td>
<td><a class="reference internal" href="#jsonb">JSONB</a></td>
</tr>
<tr class="row-even"><td>line</td>
<td> </td>
<td>infinite line on a plane</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-odd"><td>lseg</td>
<td> </td>
<td>line segment on a plane</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-even"><td>ltree</td>
<td> </td>
<td>labels of data stored in a hierarchical tree-like structure</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-odd"><td>geometry</td>
<td> </td>
<td>PostGIS Geometry type</td>
<td><a class="reference internal" href="#geometry">Geometry</a></td>
</tr>
<tr class="row-even"><td>geography</td>
<td> </td>
<td>PostGIS Geography type</td>
<td><a class="reference internal" href="#geography">Geography</a></td>
</tr>
<tr class="row-odd"><td>macaddr</td>
<td> </td>
<td>MAC (Media Access Control) address</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-even"><td>macaddr8</td>
<td> </td>
<td>MAC (Media Access Control) address (EUI-64 format)</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-odd"><td>money</td>
<td> </td>
<td>currency amount</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-even"><td>numeric [ (p, s) ]</td>
<td>decimal [ (p, s) ]</td>
<td>exact numeric of selectable precision</td>
<td><a class="reference internal" href="#numeric">Numeric</a></td>
</tr>
<tr class="row-odd"><td>path</td>
<td> </td>
<td>geometric path on a plane</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-even"><td>pg_lsn</td>
<td> </td>
<td>PostgreSQL Log Sequence Number</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-odd"><td>point</td>
<td> </td>
<td>geometric point on a plane</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-even"><td>polygon</td>
<td> </td>
<td>closed geometric path on a plane</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-odd"><td>real</td>
<td>float4</td>
<td>single precision floating-point number (4 bytes)</td>
<td><a class="reference internal" href="#float">Float</a></td>
</tr>
<tr class="row-even"><td>smallint</td>
<td>int2</td>
<td>signed two-byte integer</td>
<td><a class="reference internal" href="#int">Int</a></td>
</tr>
<tr class="row-odd"><td>smallserial</td>
<td>serial2</td>
<td>autoincrementing two-byte integer</td>
<td><a class="reference internal" href="#int">Int</a></td>
</tr>
<tr class="row-even"><td>serial</td>
<td>serial4</td>
<td>autoincrementing four-byte integer</td>
<td><a class="reference internal" href="#int">Int</a></td>
</tr>
<tr class="row-odd"><td>text</td>
<td> </td>
<td>variable-length character string</td>
<td><a class="reference internal" href="#string">String</a></td>
</tr>
<tr class="row-even"><td>time [ (p) ] [ without time zone ]</td>
<td> </td>
<td>time of day (no time zone)</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-odd"><td>time [ (p) ] with time zone</td>
<td>timetz</td>
<td>time of day, including time zone</td>
<td><a class="reference internal" href="#timetz">Timetz</a></td>
</tr>
<tr class="row-even"><td>timestamp [ (p) ] [ without time zone ]</td>
<td> </td>
<td>date and time (no time zone)</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-odd"><td>timestamp [ (p) ] with time zone</td>
<td>timestamptz</td>
<td>date and time, including time zone</td>
<td><a class="reference internal" href="#timestamptz">Timestamptz</a></td>
</tr>
<tr class="row-even"><td>tsquery</td>
<td> </td>
<td>text search query</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-odd"><td>tsvector</td>
<td> </td>
<td>text search document</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-even"><td>txid_snapshot</td>
<td> </td>
<td>user-level transaction ID snapshot</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-odd"><td>uuid</td>
<td> </td>
<td>universally unique identifier</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
<tr class="row-even"><td>xml</td>
<td> </td>
<td>XML data</td>
<td><a class="reference internal" href="#implicit">Implicit</a></td>
</tr>
</tbody>
</table>
|
process
|
what are the supported character data types in hasura the following is the chart of hasura supported postgres data types name aliases description hasura type bigint signed eight byte integer string bigserial autoincrementing eight byte integer string bit nbsp fixed length bit string implicit bit varying varbit variable length bit string implicit boolean bool logical boolean true false bool box nbsp rectangular box on a plane implicit bytea nbsp binary data “byte array” implicit character char fixed length character string char character varying varchar variable length character string string cidr nbsp or network address implicit circle nbsp circle on a plane implicit date nbsp calendar date year month day date double precision double precision floating point number bytes float inet nbsp or host address implicit integer int signed four byte integer int interval nbsp time span implicit json nbsp textual json data json jsonb nbsp binary json data decomposed jsonb line nbsp infinite line on a plane implicit lseg nbsp line segment on a plane implicit ltree nbsp labels of data stored in a hierarchical tree like structure implicit geometry nbsp postgis geometry type geometry geography nbsp postgis geography type geography macaddr nbsp mac media access control address implicit nbsp mac media access control address eui format implicit money nbsp currency amount implicit numeric decimal exact numeric of selectable precision numeric path nbsp geometric path on a plane implicit pg lsn nbsp postgresql log sequence number implicit point nbsp geometric point on a plane implicit polygon nbsp closed geometric path on a plane implicit real single precision floating point number bytes float smallint signed two byte integer int smallserial autoincrementing two byte integer int serial autoincrementing four byte integer int text nbsp variable length character string string time nbsp time of day no time zone implicit time with time zone timetz time of day including time zone timetz timestamp nbsp date and time no time zone implicit timestamp with time zone timestamptz date and time including time zone timestamptz tsquery nbsp text search query implicit tsvector nbsp text search document implicit txid snapshot nbsp user level transaction id snapshot implicit uuid nbsp universally unique identifier implicit xml nbsp xml data implicit
| 1
|
8,809
| 11,908,300,077
|
IssuesEvent
|
2020-03-31 00:33:13
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
Add support for undo/redo actions, such as deleting an object in modeler
|
Feature Request Processing
|
Author Name: **Magnus Nilsson** (Magnus Nilsson)
Original Redmine Issue: [5471](https://issues.qgis.org/issues/5471)
Redmine category:processing/modeller
Assignee: Victor Olaya
---
Add support for undo/redo actions, such as deleting an object in modeler
---
Related issue(s): #24172 (duplicates)
Redmine related issue(s): [16262](https://issues.qgis.org/issues/16262)
---
|
1.0
|
Add support for undo/redo actions, such as deleting an object in modeler - Author Name: **Magnus Nilsson** (Magnus Nilsson)
Original Redmine Issue: [5471](https://issues.qgis.org/issues/5471)
Redmine category:processing/modeller
Assignee: Victor Olaya
---
Add support for undo/redo actions, such as deleting an object in modeler
---
Related issue(s): #24172 (duplicates)
Redmine related issue(s): [16262](https://issues.qgis.org/issues/16262)
---
|
process
|
add support for undo redo actions such as deleting an object in modeler author name magnus nilsson magnus nilsson original redmine issue redmine category processing modeller assignee victor olaya add support for undo redo actions such as deleting an object in modeler related issue s duplicates redmine related issue s
| 1
|
20,180
| 26,738,133,185
|
IssuesEvent
|
2023-01-30 10:57:39
|
GoogleCloudPlatform/dotnet-docs-samples
|
https://api.github.com/repos/GoogleCloudPlatform/dotnet-docs-samples
|
opened
|
[Storage]: Compose object test is flaky
|
type: process priority: p2 api: storage
|
Log [here](https://source.cloud.google.com/results/invocations/778557d3-7adf-4b9c-8bff-fd3227a719c5/log)
```
Failed ComposeObjectTest.ComposeObject [611 ms]
Error Message:
Google.GoogleApiException : The service storage has thrown an exception. HttpStatusCode is BadRequest. Cannot compose object with destination storage class COLDLINE from source components that have different storage classes (STANDARD)
Stack Trace:
at Google.Apis.Requests.ClientServiceRequest`1.ParseResponse(HttpResponseMessage response)
at Google.Apis.Requests.ClientServiceRequest`1.Execute()
at ComposeObjectSample.ComposeObject(String bucketName, String firstObjectName, String secondObjectName, String targetObjectName) in /tmpfs/src/github/dotnet-docs-samples/storage/api/Storage.Samples/ComposeObject.cs:line 39
```
|
1.0
|
[Storage]: Compose object test is flaky - Log [here](https://source.cloud.google.com/results/invocations/778557d3-7adf-4b9c-8bff-fd3227a719c5/log)
```
Failed ComposeObjectTest.ComposeObject [611 ms]
Error Message:
Google.GoogleApiException : The service storage has thrown an exception. HttpStatusCode is BadRequest. Cannot compose object with destination storage class COLDLINE from source components that have different storage classes (STANDARD)
Stack Trace:
at Google.Apis.Requests.ClientServiceRequest`1.ParseResponse(HttpResponseMessage response)
at Google.Apis.Requests.ClientServiceRequest`1.Execute()
at ComposeObjectSample.ComposeObject(String bucketName, String firstObjectName, String secondObjectName, String targetObjectName) in /tmpfs/src/github/dotnet-docs-samples/storage/api/Storage.Samples/ComposeObject.cs:line 39
```
|
process
|
compose object test is flaky log failed composeobjecttest composeobject error message google googleapiexception the service storage has thrown an exception httpstatuscode is badrequest cannot compose object with destination storage class coldline from source components that have different storage classes standard stack trace at google apis requests clientservicerequest parseresponse httpresponsemessage response at google apis requests clientservicerequest execute at composeobjectsample composeobject string bucketname string firstobjectname string secondobjectname string targetobjectname in tmpfs src github dotnet docs samples storage api storage samples composeobject cs line
| 1
|
15,887
| 20,075,034,240
|
IssuesEvent
|
2022-02-04 11:43:24
|
climatepolicyradar/navigator
|
https://api.github.com/repos/climatepolicyradar/navigator
|
opened
|
Navigator should store submitted PDF documents in repository
|
Document processing
|
Documents submitted by a third party system, during initial loading from CCLW, or manually by an internal/external user should be stored in a document repository. This is to allow a corpus of documents to be created, and ensure that they are maintained even if the source of that document is removed.
The URL to the document in the Navigator repository should be stored against the document in the database.
|
1.0
|
Navigator should store submitted PDF documents in repository - Documents submitted by a third party system, during initial loading from CCLW, or manually by an internal/external user should be stored in a document repository. This is to allow a corpus of documents to be created, and ensure that they are maintained even if the source of that document is removed.
The URL to the document in the Navigator repository should be stored against the document in the database.
|
process
|
navigator should store submitted pdf documents in repository documents submitted by a third party system during initial loading from cclw or manually by an internal external user should be stored in a document repository this is to allow a corpus of documents to be created and ensure that they are maintained even if the source of that document is removed the url to the document in the navigator repository should be stored against the document in the database
| 1
|
15,409
| 19,598,429,494
|
IssuesEvent
|
2022-01-05 21:01:03
|
goodboy/tractor
|
https://api.github.com/repos/goodboy/tractor
|
opened
|
Deprecate `ActorNursery.run_in_actor()` and offer as part of a *wrapper* cluster API
|
help wanted testing api supervision cancellation process_spawning
|
I've rambled on wayyy too long in #287 but thanks to attempts to resolve that issue *I think* this is the way I'd like to go on a so called *one shot task per actor*-style "worker pool" API.
#### TDLR premises
- `Portal.result()` + `ActorNursery.run_in_actor()` couples a *future*-like API into our nursery's actor spawning interface at the *wrong level of abstraction*
- `.run_in_actor()` can be implemented as a syntactic sugar on top of `ActorNursery.start_actor()` + `Portal.run()` using `trio` tasks in a similar way to our [`concurrent.futures` worker pool example](https://github.com/goodboy/tractor/blob/master/examples/parallelism/concurrent_actors_primes.py#L108)
- conducting [error collection and propagation in our `ActorNursery`'s teardown machinery](https://github.com/goodboy/tractor/blob/master/tractor/_supervise.py#L431) is entirely superfluous and a *path dependent* legacy design which has a few pretty awful side effects:
- it makes the nursery need [to be aware of `.run_in_actor()` portals](https://github.com/goodboy/tractor/blob/master/tractor/_supervise.py#L217)
- it enforces duplication between `.run_in_actor()` and `Portal.run()`
- it complicates [spawn machinery with special cases](https://github.com/goodboy/tractor/blob/master/tractor/_spawn.py#L348)
- it results in [hard to maintain tests]() due to indeterminacy in cancelled vs. errored child results (which btw won't go away if we change the API layering but at least we can leverage std `trio` nursery machinery instead of rolling our own 😂)
#### ToDO
- [ ] write up a version of `.run_in_actor()` style on top of the rest of the nursery API
- [ ] convert the `nested_multierrors()` test to use the `.open_context()` / `.run()` style and ensure we can still get as reliable of performance
- [ ] figure out how to offer the higher level / new `.run_in_actor()` API as a cluster helper
|
1.0
|
Deprecate `ActorNursery.run_in_actor()` and offer as part of a *wrapper* cluster API - I've rambled on wayyy too long in #287 but thanks to attempts to resolve that issue *I think* this is the way I'd like to go on a so called *one shot task per actor*-style "worker pool" API.
#### TDLR premises
- `Portal.result()` + `ActorNursery.run_in_actor()` couples a *future*-like API into our nursery's actor spawning interface at the *wrong level of abstraction*
- `.run_in_actor()` can be implemented as a syntactic sugar on top of `ActorNursery.start_actor()` + `Portal.run()` using `trio` tasks in a similar way to our [`concurrent.futures` worker pool example](https://github.com/goodboy/tractor/blob/master/examples/parallelism/concurrent_actors_primes.py#L108)
- conducting [error collection and propagation in our `ActorNursery`'s teardown machinery](https://github.com/goodboy/tractor/blob/master/tractor/_supervise.py#L431) is entirely superfluous and a *path dependent* legacy design which has a few pretty awful side effects:
- it makes the nursery need [to be aware of `.run_in_actor()` portals](https://github.com/goodboy/tractor/blob/master/tractor/_supervise.py#L217)
- it enforces duplication between `.run_in_actor()` and `Portal.run()`
- it complicates [spawn machinery with special cases](https://github.com/goodboy/tractor/blob/master/tractor/_spawn.py#L348)
- it results in [hard to maintain tests]() due to indeterminacy in cancelled vs. errored child results (which btw won't go away if we change the API layering but at least we can leverage std `trio` nursery machinery instead of rolling our own 😂)
#### ToDO
- [ ] write up a version of `.run_in_actor()` style on top of the rest of the nursery API
- [ ] convert the `nested_multierrors()` test to use the `.open_context()` / `.run()` style and ensure we can still get as reliable of performance
- [ ] figure out how to offer the higher level / new `.run_in_actor()` API as a cluster helper
|
process
|
deprecate actornursery run in actor and offer as part of a wrapper cluster api i ve rambled on wayyy too long in but thanks to attempts to resolve that issue i think this is the way i d like to go on a so called one shot task per actor style worker pool api tdlr premises portal result actornursery run in actor couples a future like api into our nursery s actor spawning interface at the wrong level of abstraction run in actor can be implemented as a syntactic sugar on top of actornursery start actor portal run using trio tasks in a similar way to our conducting is entirely superfluous and a path dependent legacy design which has a few pretty awful side effects it makes the nursery need it enforces duplication between run in actor and portal run it complicates it results in due to indeterminacy in cancelled vs errored child results which btw won t go away if we change the api layering but at least we can leverage std trio nursery machinery instead of rolling our own 😂 todo write up a version of run in actor style on top of the rest of the nursery api convert the nested multierrors test to use the open context run style and ensure we can still get as reliable of performance figure out how to offer the higher level new run in actor api as a cluster helper
| 1
|
349,099
| 24,933,750,988
|
IssuesEvent
|
2022-10-31 13:42:41
|
equinor/ert
|
https://api.github.com/repos/equinor/ert
|
closed
|
Improved documentation of parallel runs
|
user request documentation
|
The documentation does not seem to have a very good description of how parallel runs are configured in ERT. The NUM_CPU kw is briefly explained, but that is only a part of the story. In reality ERT, when the DATA_FILE kw is set, parses the DATA file for kws PARALLEL for MPI and SLAVE for RC runs to determine how many CPUs are needed. The NUM_CPU kw is hence optional. I would suggest some prosaic description of how it works and what the options are. I know that it might change with ert3, but think it is worth the effort :)
|
1.0
|
Improved documentation of parallel runs - The documentation does not seem to have a very good description of how parallel runs are configured in ERT. The NUM_CPU kw is briefly explained, but that is only a part of the story. In reality ERT, when the DATA_FILE kw is set, parses the DATA file for kws PARALLEL for MPI and SLAVE for RC runs to determine how many CPUs are needed. The NUM_CPU kw is hence optional. I would suggest some prosaic description of how it works and what the options are. I know that it might change with ert3, but think it is worth the effort :)
|
non_process
|
improved documentation of parallel runs the documentation does not seem to have a very good description of how parallel runs are configured in ert the num cpu kw is briefly explained but that is only a part of the story in reality ert when the data file kw is set parses the data file for kws parallel for mpi and slave for rc runs to determine how many cpus are needed the num cpu kw is hence optional i would suggest some prosaic description of how it works and what the options are i know that it might change with but think it is worth the effort
| 0
|
29,337
| 5,654,066,161
|
IssuesEvent
|
2017-04-09 04:41:03
|
Quantum64/ArcadeAPI
|
https://api.github.com/repos/Quantum64/ArcadeAPI
|
opened
|
Fix market on most servers
|
core defect
|
This issue is caused by a shaded library conflict involving Apache DBUtils, it can probably be fixed with Shade Relocations and a new external version.
https://maven.apache.org/plugins/maven-shade-plugin/examples/class-relocation.html
|
1.0
|
Fix market on most servers - This issue is caused by a shaded library conflict involving Apache DBUtils, it can probably be fixed with Shade Relocations and a new external version.
https://maven.apache.org/plugins/maven-shade-plugin/examples/class-relocation.html
|
non_process
|
fix market on most servers this issue is caused by a shaded library conflict involving apache dbutils it can probably be fixed with shade relocations and a new external version
| 0
|
16,652
| 21,716,780,961
|
IssuesEvent
|
2022-05-10 18:45:19
|
googleapis/java-storage
|
https://api.github.com/repos/googleapis/java-storage
|
opened
|
test: add `storage.notification.*` conformance tests
|
type: process api: storage
|
Now that we have notifications integrated into the api, we should ensure our retries are following the prescribed best practice.
* [ ] Add setup steps to RpcMethodMapping
* [ ] Add mappings into RpcMethodMappings
* [ ] Update StorageRetryStrategy and associated classes to account for notification operations
|
1.0
|
test: add `storage.notification.*` conformance tests - Now that we have notifications integrated into the api, we should ensure our retries are following the prescribed best practice.
* [ ] Add setup steps to RpcMethodMapping
* [ ] Add mappings into RpcMethodMappings
* [ ] Update StorageRetryStrategy and associated classes to account for notification operations
|
process
|
test add storage notification conformance tests now that we have notifications integrated into the api we should ensure our retries are following the prescribed best practice add setup steps to rpcmethodmapping add mappings into rpcmethodmappings update storageretrystrategy and associated classes to account for notification operations
| 1
|
10,718
| 13,521,408,050
|
IssuesEvent
|
2020-09-15 06:57:58
|
panther-labs/panther
|
https://api.github.com/repos/panther-labs/panther
|
closed
|
Backend should store errors that occurred while running rules over incoming events
|
p1 story team:data processing
|
### Description
Backend should store errors that occurred while running rules over incoming events
### Acceptance Criteria
- Backend should store errors that occurred while running rules over incoming events
|
1.0
|
Backend should store errors that occurred while running rules over incoming events - ### Description
Backend should store errors that occurred while running rules over incoming events
### Acceptance Criteria
- Backend should store errors that occurred while running rules over incoming events
|
process
|
backend should store errors that occurred while running rules over incoming events description backend should store errors that occurred while running rules over incoming events acceptance criteria backend should store errors that occurred while running rules over incoming events
| 1
|
6,334
| 9,377,816,130
|
IssuesEvent
|
2019-04-04 11:18:46
|
Maximus5/ConEmu
|
https://api.github.com/repos/Maximus5/ConEmu
|
closed
|
Question: The system cannot find the path specified
|
processes
|
First off, everything is running great with this application. **Very nice work here!** If you had a forum I would post this there instead. This is more of a question on how to track something down.
(_Apologies if there is a better place to post this -- I did not see anything in the contributions template that offered a preferred location for questions._)
On Windows 10 boot, I am getting windows that popup in ConEmu on load that say "The system cannot find the path specified."
Example:

I am wondering if there is a way to see which files it is trying to load and which processes are executing this command? I tried poking around in Task Manager but did not see anything obvious.
Thank you in advance for any assistance!
(Please feel free to close this issue since it is a question and not a bug.)
|
1.0
|
Question: The system cannot find the path specified - First off, everything is running great with this application. **Very nice work here!** If you had a forum I would post this there instead. This is more of a question on how to track something down.
(_Apologies if there is a better place to post this -- I did not see anything in the contributions template that offered a preferred location for questions._)
On Windows 10 boot, I am getting windows that popup in ConEmu on load that say "The system cannot find the path specified."
Example:

I am wondering if there is a way to see which files it is trying to load and which processes are executing this command? I tried poking around in Task Manager but did not see anything obvious.
Thank you in advance for any assistance!
(Please feel free to close this issue since it is a question and not a bug.)
|
process
|
question the system cannot find the path specified first off everything is running great with this application very nice work here if you had a forum i would post this there instead this is more of a question on how to track something down apologies if there is a better place to post this i did not see anything in the contributions template that offered a preferred location for questions on windows boot i am getting windows that popup in conemu on load that say the system cannot find the path specified example i am wondering if there is a way to see which files it is trying to load and which processes are executing this command i tried poking around in task manager but did not see anything obvious thank you in advance for any assistance please feel free to close this issue since it is a question and not a bug
| 1
|
6,982
| 10,131,466,533
|
IssuesEvent
|
2019-08-01 19:37:07
|
toggl/mobileapp
|
https://api.github.com/repos/toggl/mobileapp
|
closed
|
Create issue template to request translations fixes/improvements
|
process
|
Following more or the same format as whatever is defined by #4835, but having in mind that the issue will be used for users and ourselves to submit requests that will fix bad/wrong translations.
|
1.0
|
Create issue template to request translations fixes/improvements - Following more or the same format as whatever is defined by #4835, but having in mind that the issue will be used for users and ourselves to submit requests that will fix bad/wrong translations.
|
process
|
create issue template to request translations fixes improvements following more or the same format as whatever is defined by but having in mind that the issue will be used for users and ourselves to submit requests that will fix bad wrong translations
| 1
|
10,328
| 13,162,279,796
|
IssuesEvent
|
2020-08-10 21:15:29
|
elastic/beats
|
https://api.github.com/repos/elastic/beats
|
closed
|
Add a minimum TTL to auditbeat reverse DNS resolution
|
:Processors Team:SIEM enhancement good first issue libbeat
|
**Describe the enhancement:**
I am having an issue with auditbeat performing reverse DNS resolution too often. However, it doesn't look like a bug, as the responses seem to have `TTL: 0`, and based on the documentation it appears as though that response value is honoured and the responses are never cached by design:
https://www.elastic.co/guide/en/beats/auditbeat/master/processor-dns.html
**Describe a specific use case for the enhancement or feature:**
It would be very useful for me to be able to configure a minimum TTL for caching successful auditbeat reverse DNS resolutions to allow `TTL: 0` responses to be cached for the minimum configured time rather than never cached.
Reverse DNS requests are generally out-of-system requests, so there is also the potential for DoS if malicious users are aware of the use of auditbeat reverse DNS resolution and this issue.
|
1.0
|
Add a minimum TTL to auditbeat reverse DNS resolution - **Describe the enhancement:**
I am having an issue with auditbeat performing reverse DNS resolution too often. However, it doesn't look like a bug, as the responses seem to have `TTL: 0`, and based on the documentation it appears as though that response value is honoured and the responses are never cached by design:
https://www.elastic.co/guide/en/beats/auditbeat/master/processor-dns.html
**Describe a specific use case for the enhancement or feature:**
It would be very useful for me to be able to configure a minimum TTL for caching successful auditbeat reverse DNS resolutions to allow `TTL: 0` responses to be cached for the minimum configured time rather than never cached.
Reverse DNS requests are generally out-of-system requests, so there is also the potential for DoS if malicious users are aware of the use of auditbeat reverse DNS resolution and this issue.
|
process
|
add a minimum ttl to auditbeat reverse dns resolution describe the enhancement i am having an issue with auditbeat performing reverse dns resolution too often however it doesn t look like a bug as the responses seem to have ttl and based on the documentation it appears as though that response value is honoured and the responses are never cached by design describe a specific use case for the enhancement or feature it would be very useful for me to be able to configure a minimum ttl for caching successful auditbeat reverse dns resolutions to allow ttl responses to be cached for the minimum configured time rather than never cached reverse dns requests are generally out of system requests so there is also the potential for dos if malicious users are aware of the use of auditbeat reverse dns resolution and this issue
| 1
|
1,654
| 4,283,836,447
|
IssuesEvent
|
2016-07-15 14:46:31
|
openvstorage/framework
|
https://api.github.com/repos/openvstorage/framework
|
closed
|
Framework should cleanup MDS slaves of removed volumes
|
priority_urgent process_wontfix type_bug
|
Framework should cleanup MDS slaves of removed (non-existing) volumes.
|
1.0
|
Framework should cleanup MDS slaves of removed volumes - Framework should cleanup MDS slaves of removed (non-existing) volumes.
|
process
|
framework should cleanup mds slaves of removed volumes framework should cleanup mds slaves of removed non existing volumes
| 1
|
8,537
| 11,713,497,131
|
IssuesEvent
|
2020-03-09 10:26:13
|
python-trio/trio
|
https://api.github.com/repos/python-trio/trio
|
closed
|
Intermittent test failure in test_subprocess.py:test_signals: SIGINT causes startup failure in child Python and unexpected return value
|
subprocesses
|
Observed here: https://dev.azure.com/python-trio/trio/_build/results?buildId=827&view=logs&jobId=300ae31b-e5d9-5bcb-b319-aee10d9d83f3
```
=================================== FAILURES ===================================
_________________________________ test_signals _________________________________
async def test_signals():
async def test_one_signal(send_it, signum):
with move_on_after(1.0) as scope:
async with await open_process(SLEEP(3600)) as proc:
send_it(proc)
assert not scope.cancelled_caught
if posix:
assert proc.returncode == -signum
else:
assert proc.returncode != 0
await test_one_signal(Process.kill, SIGKILL)
await test_one_signal(Process.terminate, SIGTERM)
if posix:
> await test_one_signal(lambda proc: proc.send_signal(SIGINT), SIGINT)
/t/venv/lib/python3.7/site-packages/trio/tests/test_subprocess.py:366:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
send_it = <function test_signals.<locals>.<lambda> at 0x7f7e2d578840>
signum = <Signals.SIGINT: 2>
async def test_one_signal(send_it, signum):
with move_on_after(1.0) as scope:
async with await open_process(SLEEP(3600)) as proc:
send_it(proc)
assert not scope.cancelled_caught
if posix:
> assert proc.returncode == -signum
E assert -6 == -2
E --6
E +-2
/t/venv/lib/python3.7/site-packages/trio/tests/test_subprocess.py:359: AssertionError
----------------------------- Captured stderr call -----------------------------
Fatal Python error: init_sys_streams: can't initialize sys standard streams
Traceback (most recent call last):
File "/t/venv/lib/python3.7/encodings/latin_1.py", line 13, in <module>
class Codec(codecs.Codec):
KeyboardInterrupt
```
In this test we're sending SIGINT to a child process, and the child process happens to be Python, and apparently Python's startup logic is flaky if a SIGINT arrives at the wrong time.
I think the simplest fix would be to make the test use a different signal, one that Python doesn't catch? Maybe SIGKILL + SIGTERM, if we're worried about making sure that `send_signal` isn't accidentally doing the same thing as `kill` or `terminate`?
|
1.0
|
Intermittent test failure in test_subprocess.py:test_signals: SIGINT causes startup failure in child Python and unexpected return value - Observed here: https://dev.azure.com/python-trio/trio/_build/results?buildId=827&view=logs&jobId=300ae31b-e5d9-5bcb-b319-aee10d9d83f3
```
=================================== FAILURES ===================================
_________________________________ test_signals _________________________________
async def test_signals():
async def test_one_signal(send_it, signum):
with move_on_after(1.0) as scope:
async with await open_process(SLEEP(3600)) as proc:
send_it(proc)
assert not scope.cancelled_caught
if posix:
assert proc.returncode == -signum
else:
assert proc.returncode != 0
await test_one_signal(Process.kill, SIGKILL)
await test_one_signal(Process.terminate, SIGTERM)
if posix:
> await test_one_signal(lambda proc: proc.send_signal(SIGINT), SIGINT)
/t/venv/lib/python3.7/site-packages/trio/tests/test_subprocess.py:366:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
send_it = <function test_signals.<locals>.<lambda> at 0x7f7e2d578840>
signum = <Signals.SIGINT: 2>
async def test_one_signal(send_it, signum):
with move_on_after(1.0) as scope:
async with await open_process(SLEEP(3600)) as proc:
send_it(proc)
assert not scope.cancelled_caught
if posix:
> assert proc.returncode == -signum
E assert -6 == -2
E --6
E +-2
/t/venv/lib/python3.7/site-packages/trio/tests/test_subprocess.py:359: AssertionError
----------------------------- Captured stderr call -----------------------------
Fatal Python error: init_sys_streams: can't initialize sys standard streams
Traceback (most recent call last):
File "/t/venv/lib/python3.7/encodings/latin_1.py", line 13, in <module>
class Codec(codecs.Codec):
KeyboardInterrupt
```
In this test we're sending SIGINT to a child process, and the child process happens to be Python, and apparently Python's startup logic is flaky if a SIGINT arrives at the wrong time.
I think the simplest fix would be to make the test use a different signal, one that Python doesn't catch? Maybe SIGKILL + SIGTERM, if we're worried about making sure that `send_signal` isn't accidentally doing the same thing as `kill` or `terminate`?
|
process
|
intermittent test failure in test subprocess py test signals sigint causes startup failure in child python and unexpected return value observed here failures test signals async def test signals async def test one signal send it signum with move on after as scope async with await open process sleep as proc send it proc assert not scope cancelled caught if posix assert proc returncode signum else assert proc returncode await test one signal process kill sigkill await test one signal process terminate sigterm if posix await test one signal lambda proc proc send signal sigint sigint t venv lib site packages trio tests test subprocess py send it at signum async def test one signal send it signum with move on after as scope async with await open process sleep as proc send it proc assert not scope cancelled caught if posix assert proc returncode signum e assert e e t venv lib site packages trio tests test subprocess py assertionerror captured stderr call fatal python error init sys streams can t initialize sys standard streams traceback most recent call last file t venv lib encodings latin py line in class codec codecs codec keyboardinterrupt in this test we re sending sigint to a child process and the child process happens to be python and apparently python s startup logic is flaky if a sigint arrives at the wrong time i think the simplest fix would be to make the test use a different signal one that python doesn t catch maybe sigkill sigterm if we re worried about making sure that send signal isn t accidentally doing the same thing as kill or terminate
| 1
|
11,119
| 13,957,684,816
|
IssuesEvent
|
2020-10-24 08:08:25
|
alexanderkotsev/geoportal
|
https://api.github.com/repos/alexanderkotsev/geoportal
|
opened
|
AT: Missing Download link during Harvest - Linkage Checker say all is good
|
AT - Austria Geoportal Harvesting process
|
Dear Geoportal Team,
we did a Harvesting on Friday and there were several missing Download Links within 1 specific download service (WFS)
Download Service metadata: https://geometadaten.lfrz.at/at.lfrz.discoveryservices/srv/ger/csw?service=CSW&version=2.0.2&request=GetRecordById&outputschema=http://www.isotc211.org/2005/gmd&elementSetName=full&id=9c00f140-a51a-11e3-be40-425861b86ab6
example of Dataset metadata: https://geometadaten.lfrz.at/at.lfrz.discoveryservices/srv/ger/csw?service=CSW&version=2.0.2&request=GetRecordById&outputschema=http://www.isotc211.org/2005/gmd&elementSetName=full&id=ecc6f3f6-d4ba-4d30-8efb-ed907963e0f8
All Datasets of the sevrice mentioned above are not downloadable regarding to harvesting / geoportal.
We tried the linkage checker for a specific dataset of this service and the result is, that all linkage is correct (see attachment respectively https://inspire-geoportal.ec.europa.eu/resources/sandbox/INSPIRE-ae21ae4d-09fd-11ea-8bf3-0050563f01ec_20191118-131934/)
Can you tell us what is the problem during harvest process?
Thanks, Manuel
|
1.0
|
AT: Missing Download link during Harvest - Linkage Checker say all is good - Dear Geoportal Team,
we did a Harvesting on Friday and there were several missing Download Links within 1 specific download service (WFS)
Download Service metadata: https://geometadaten.lfrz.at/at.lfrz.discoveryservices/srv/ger/csw?service=CSW&version=2.0.2&request=GetRecordById&outputschema=http://www.isotc211.org/2005/gmd&elementSetName=full&id=9c00f140-a51a-11e3-be40-425861b86ab6
example of Dataset metadata: https://geometadaten.lfrz.at/at.lfrz.discoveryservices/srv/ger/csw?service=CSW&version=2.0.2&request=GetRecordById&outputschema=http://www.isotc211.org/2005/gmd&elementSetName=full&id=ecc6f3f6-d4ba-4d30-8efb-ed907963e0f8
All Datasets of the sevrice mentioned above are not downloadable regarding to harvesting / geoportal.
We tried the linkage checker for a specific dataset of this service and the result is, that all linkage is correct (see attachment respectively https://inspire-geoportal.ec.europa.eu/resources/sandbox/INSPIRE-ae21ae4d-09fd-11ea-8bf3-0050563f01ec_20191118-131934/)
Can you tell us what is the problem during harvest process?
Thanks, Manuel
|
process
|
at missing download link during harvest linkage checker say all is good dear geoportal team we did a harvesting on friday and there were several missing download links within specific download service wfs download service metadata example of dataset metadata all datasets of the sevrice mentioned above are not downloadable regarding to harvesting geoportal we tried the linkage checker for a specific dataset of this service and the result is that all linkage is correct see attachment respectively can you tell us what is the problem during harvest process thanks manuel
| 1
|
207,614
| 23,465,380,659
|
IssuesEvent
|
2022-08-16 16:16:37
|
temporalio/graphql
|
https://api.github.com/repos/temporalio/graphql
|
opened
|
cli-2.11.1.tgz: 4 vulnerabilities (highest severity is: 7.1)
|
security vulnerability
|
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>cli-2.11.1.tgz</b></p></summary>
<p></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/undici/package.json</p>
<p>
</details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2022-35948](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-35948) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.1 | undici-5.5.1.tgz | Transitive | N/A | ❌ |
| [CVE-2022-31151](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-31151) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.5 | undici-5.5.1.tgz | Transitive | 2.11.2 | ✅ |
| [CVE-2022-31150](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-31150) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.5 | undici-5.5.1.tgz | Transitive | 2.11.2 | ✅ |
| [CVE-2022-35949](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-35949) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.4 | undici-5.5.1.tgz | Transitive | N/A | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2022-35948</summary>
### Vulnerable Library - <b>undici-5.5.1.tgz</b></p>
<p>An HTTP/1.1 client, written from scratch for Node.js</p>
<p>Library home page: <a href="https://registry.npmjs.org/undici/-/undici-5.5.1.tgz">https://registry.npmjs.org/undici/-/undici-5.5.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/undici/package.json</p>
<p>
Dependency Hierarchy:
- cli-2.11.1.tgz (Root Library)
- fetch-0.0.2.tgz
- :x: **undici-5.5.1.tgz** (Vulnerable Library)
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
undici is an HTTP/1.1 client, written from scratch for Node.js.`=< undici@5.8.0` users are vulnerable to _CRLF Injection_ on headers when using unsanitized input as request headers, more specifically, inside the `content-type` header. Example: ``` import { request } from 'undici' const unsanitizedContentTypeInput = 'application/json\r\n\r\nGET /foo2 HTTP/1.1' await request('http://localhost:3000, { method: 'GET', headers: { 'content-type': unsanitizedContentTypeInput }, }) ``` The above snippet will perform two requests in a single `request` API call: 1) `http://localhost:3000/` 2) `http://localhost:3000/foo2` This issue was patched in Undici v5.8.1. Sanitize input when sending content-type headers using user input as a workaround.
<p>Publish Date: 2022-08-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-35948>CVE-2022-35948</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: Low
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-35948">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-35948</a></p>
<p>Release Date: 2022-08-13</p>
<p>Fix Resolution: undici - 5.8.2</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-31151</summary>
### Vulnerable Library - <b>undici-5.5.1.tgz</b></p>
<p>An HTTP/1.1 client, written from scratch for Node.js</p>
<p>Library home page: <a href="https://registry.npmjs.org/undici/-/undici-5.5.1.tgz">https://registry.npmjs.org/undici/-/undici-5.5.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/undici/package.json</p>
<p>
Dependency Hierarchy:
- cli-2.11.1.tgz (Root Library)
- fetch-0.0.2.tgz
- :x: **undici-5.5.1.tgz** (Vulnerable Library)
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Authorization headers are cleared on cross-origin redirect. However, cookie headers which are sensitive headers and are official headers found in the spec, remain uncleared. There are active users using cookie headers in undici. This may lead to accidental leakage of cookie to a 3rd-party site or a malicious attacker who can control the redirection target (ie. an open redirector) to leak the cookie to the 3rd party site. This was patched in v5.7.1. By default, this vulnerability is not exploitable. Do not enable redirections, i.e. `maxRedirections: 0` (the default).
<p>Publish Date: 2022-07-21
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-31151>CVE-2022-31151</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/nodejs/undici/security/advisories/GHSA-q768-x9m6-m9qp">https://github.com/nodejs/undici/security/advisories/GHSA-q768-x9m6-m9qp</a></p>
<p>Release Date: 2022-07-21</p>
<p>Fix Resolution (undici): 5.8.0</p>
<p>Direct dependency fix Resolution (@graphql-codegen/cli): 2.11.2</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-31150</summary>
### Vulnerable Library - <b>undici-5.5.1.tgz</b></p>
<p>An HTTP/1.1 client, written from scratch for Node.js</p>
<p>Library home page: <a href="https://registry.npmjs.org/undici/-/undici-5.5.1.tgz">https://registry.npmjs.org/undici/-/undici-5.5.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/undici/package.json</p>
<p>
Dependency Hierarchy:
- cli-2.11.1.tgz (Root Library)
- fetch-0.0.2.tgz
- :x: **undici-5.5.1.tgz** (Vulnerable Library)
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
undici is an HTTP/1.1 client, written from scratch for Node.js. It is possible to inject CRLF sequences into request headers in undici in versions less than 5.7.1. A fix was released in version 5.8.0. Sanitizing all HTTP headers from untrusted sources to eliminate `\r\n` is a workaround for this issue.
<p>Publish Date: 2022-07-19
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-31150>CVE-2022-31150</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-31150">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-31150</a></p>
<p>Release Date: 2022-07-19</p>
<p>Fix Resolution (undici): 5.8.0</p>
<p>Direct dependency fix Resolution (@graphql-codegen/cli): 2.11.2</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-35949</summary>
### Vulnerable Library - <b>undici-5.5.1.tgz</b></p>
<p>An HTTP/1.1 client, written from scratch for Node.js</p>
<p>Library home page: <a href="https://registry.npmjs.org/undici/-/undici-5.5.1.tgz">https://registry.npmjs.org/undici/-/undici-5.5.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/undici/package.json</p>
<p>
Dependency Hierarchy:
- cli-2.11.1.tgz (Root Library)
- fetch-0.0.2.tgz
- :x: **undici-5.5.1.tgz** (Vulnerable Library)
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
undici is an HTTP/1.1 client, written from scratch for Node.js.`undici` is vulnerable to SSRF (Server-side Request Forgery) when an application takes in **user input** into the `path/pathname` option of `undici.request`. If a user specifies a URL such as `http://127.0.0.1` or `//127.0.0.1` ```js const undici = require("undici") undici.request({origin: "http://example.com", pathname: "//127.0.0.1"}) ``` Instead of processing the request as `http://example.org//127.0.0.1` (or `http://example.org/http://127.0.0.1` when `http://127.0.0.1 is used`), it actually processes the request as `http://127.0.0.1/` and sends it to `http://127.0.0.1`. If a developer passes in user input into `path` parameter of `undici.request`, it can result in an _SSRF_ as they will assume that the hostname cannot change, when in actual fact it can change because the specified path parameter is combined with the base URL. This issue was fixed in `undici@5.8.1`. The best workaround is to validate user input before passing it to the `undici.request` call.
<p>Publish Date: 2022-08-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-35949>CVE-2022-35949</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.4</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-35949">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-35949</a></p>
<p>Release Date: 2022-08-12</p>
<p>Fix Resolution: undici - 5.8.2</p>
</p>
<p></p>
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
|
True
|
cli-2.11.1.tgz: 4 vulnerabilities (highest severity is: 7.1) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>cli-2.11.1.tgz</b></p></summary>
<p></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/undici/package.json</p>
<p>
</details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2022-35948](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-35948) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.1 | undici-5.5.1.tgz | Transitive | N/A | ❌ |
| [CVE-2022-31151](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-31151) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.5 | undici-5.5.1.tgz | Transitive | 2.11.2 | ✅ |
| [CVE-2022-31150](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-31150) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.5 | undici-5.5.1.tgz | Transitive | 2.11.2 | ✅ |
| [CVE-2022-35949](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-35949) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.4 | undici-5.5.1.tgz | Transitive | N/A | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2022-35948</summary>
### Vulnerable Library - <b>undici-5.5.1.tgz</b></p>
<p>An HTTP/1.1 client, written from scratch for Node.js</p>
<p>Library home page: <a href="https://registry.npmjs.org/undici/-/undici-5.5.1.tgz">https://registry.npmjs.org/undici/-/undici-5.5.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/undici/package.json</p>
<p>
Dependency Hierarchy:
- cli-2.11.1.tgz (Root Library)
- fetch-0.0.2.tgz
- :x: **undici-5.5.1.tgz** (Vulnerable Library)
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
undici is an HTTP/1.1 client, written from scratch for Node.js.`=< undici@5.8.0` users are vulnerable to _CRLF Injection_ on headers when using unsanitized input as request headers, more specifically, inside the `content-type` header. Example: ``` import { request } from 'undici' const unsanitizedContentTypeInput = 'application/json\r\n\r\nGET /foo2 HTTP/1.1' await request('http://localhost:3000, { method: 'GET', headers: { 'content-type': unsanitizedContentTypeInput }, }) ``` The above snippet will perform two requests in a single `request` API call: 1) `http://localhost:3000/` 2) `http://localhost:3000/foo2` This issue was patched in Undici v5.8.1. Sanitize input when sending content-type headers using user input as a workaround.
<p>Publish Date: 2022-08-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-35948>CVE-2022-35948</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: Low
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-35948">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-35948</a></p>
<p>Release Date: 2022-08-13</p>
<p>Fix Resolution: undici - 5.8.2</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-31151</summary>
### Vulnerable Library - <b>undici-5.5.1.tgz</b></p>
<p>An HTTP/1.1 client, written from scratch for Node.js</p>
<p>Library home page: <a href="https://registry.npmjs.org/undici/-/undici-5.5.1.tgz">https://registry.npmjs.org/undici/-/undici-5.5.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/undici/package.json</p>
<p>
Dependency Hierarchy:
- cli-2.11.1.tgz (Root Library)
- fetch-0.0.2.tgz
- :x: **undici-5.5.1.tgz** (Vulnerable Library)
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Authorization headers are cleared on cross-origin redirect. However, cookie headers which are sensitive headers and are official headers found in the spec, remain uncleared. There are active users using cookie headers in undici. This may lead to accidental leakage of cookie to a 3rd-party site or a malicious attacker who can control the redirection target (ie. an open redirector) to leak the cookie to the 3rd party site. This was patched in v5.7.1. By default, this vulnerability is not exploitable. Do not enable redirections, i.e. `maxRedirections: 0` (the default).
<p>Publish Date: 2022-07-21
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-31151>CVE-2022-31151</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/nodejs/undici/security/advisories/GHSA-q768-x9m6-m9qp">https://github.com/nodejs/undici/security/advisories/GHSA-q768-x9m6-m9qp</a></p>
<p>Release Date: 2022-07-21</p>
<p>Fix Resolution (undici): 5.8.0</p>
<p>Direct dependency fix Resolution (@graphql-codegen/cli): 2.11.2</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-31150</summary>
### Vulnerable Library - <b>undici-5.5.1.tgz</b></p>
<p>An HTTP/1.1 client, written from scratch for Node.js</p>
<p>Library home page: <a href="https://registry.npmjs.org/undici/-/undici-5.5.1.tgz">https://registry.npmjs.org/undici/-/undici-5.5.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/undici/package.json</p>
<p>
Dependency Hierarchy:
- cli-2.11.1.tgz (Root Library)
- fetch-0.0.2.tgz
- :x: **undici-5.5.1.tgz** (Vulnerable Library)
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
undici is an HTTP/1.1 client, written from scratch for Node.js. It is possible to inject CRLF sequences into request headers in undici in versions less than 5.7.1. A fix was released in version 5.8.0. Sanitizing all HTTP headers from untrusted sources to eliminate `\r\n` is a workaround for this issue.
<p>Publish Date: 2022-07-19
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-31150>CVE-2022-31150</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-31150">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-31150</a></p>
<p>Release Date: 2022-07-19</p>
<p>Fix Resolution (undici): 5.8.0</p>
<p>Direct dependency fix Resolution (@graphql-codegen/cli): 2.11.2</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2022-35949</summary>
### Vulnerable Library - <b>undici-5.5.1.tgz</b></p>
<p>An HTTP/1.1 client, written from scratch for Node.js</p>
<p>Library home page: <a href="https://registry.npmjs.org/undici/-/undici-5.5.1.tgz">https://registry.npmjs.org/undici/-/undici-5.5.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/undici/package.json</p>
<p>
Dependency Hierarchy:
- cli-2.11.1.tgz (Root Library)
- fetch-0.0.2.tgz
- :x: **undici-5.5.1.tgz** (Vulnerable Library)
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
undici is an HTTP/1.1 client, written from scratch for Node.js.`undici` is vulnerable to SSRF (Server-side Request Forgery) when an application takes in **user input** into the `path/pathname` option of `undici.request`. If a user specifies a URL such as `http://127.0.0.1` or `//127.0.0.1` ```js const undici = require("undici") undici.request({origin: "http://example.com", pathname: "//127.0.0.1"}) ``` Instead of processing the request as `http://example.org//127.0.0.1` (or `http://example.org/http://127.0.0.1` when `http://127.0.0.1 is used`), it actually processes the request as `http://127.0.0.1/` and sends it to `http://127.0.0.1`. If a developer passes in user input into `path` parameter of `undici.request`, it can result in an _SSRF_ as they will assume that the hostname cannot change, when in actual fact it can change because the specified path parameter is combined with the base URL. This issue was fixed in `undici@5.8.1`. The best workaround is to validate user input before passing it to the `undici.request` call.
<p>Publish Date: 2022-08-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-35949>CVE-2022-35949</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.4</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-35949">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-35949</a></p>
<p>Release Date: 2022-08-12</p>
<p>Fix Resolution: undici - 5.8.2</p>
</p>
<p></p>
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
|
non_process
|
cli tgz vulnerabilities highest severity is vulnerable library cli tgz path to dependency file package json path to vulnerable library node modules undici package json vulnerabilities cve severity cvss dependency type fixed in remediation available high undici tgz transitive n a medium undici tgz transitive medium undici tgz transitive medium undici tgz transitive n a details cve vulnerable library undici tgz an http client written from scratch for node js library home page a href path to dependency file package json path to vulnerable library node modules undici package json dependency hierarchy cli tgz root library fetch tgz x undici tgz vulnerable library found in base branch main vulnerability details undici is an http client written from scratch for node js undici users are vulnerable to crlf injection on headers when using unsanitized input as request headers more specifically inside the content type header example import request from undici const unsanitizedcontenttypeinput application json r n r nget http await request method get headers content type unsanitizedcontenttypeinput the above snippet will perform two requests in a single request api call this issue was patched in undici sanitize input when sending content type headers using user input as a workaround publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact low availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution undici cve vulnerable library undici tgz an http client written from scratch for node js library home page a href path to dependency file package json path to vulnerable library node modules undici package json dependency hierarchy cli tgz root library fetch tgz x undici tgz vulnerable library found in base branch main vulnerability details authorization headers are cleared on cross origin redirect however cookie headers which are sensitive headers and are official headers found in the spec remain uncleared there are active users using cookie headers in undici this may lead to accidental leakage of cookie to a party site or a malicious attacker who can control the redirection target ie an open redirector to leak the cookie to the party site this was patched in by default this vulnerability is not exploitable do not enable redirections i e maxredirections the default publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution undici direct dependency fix resolution graphql codegen cli rescue worker helmet automatic remediation is available for this issue cve vulnerable library undici tgz an http client written from scratch for node js library home page a href path to dependency file package json path to vulnerable library node modules undici package json dependency hierarchy cli tgz root library fetch tgz x undici tgz vulnerable library found in base branch main vulnerability details undici is an http client written from scratch for node js it is possible to inject crlf sequences into request headers in undici in versions less than a fix was released in version sanitizing all http headers from untrusted sources to eliminate r n is a workaround for this issue publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution undici direct dependency fix resolution graphql codegen cli rescue worker helmet automatic remediation is available for this issue cve vulnerable library undici tgz an http client written from scratch for node js library home page a href path to dependency file package json path to vulnerable library node modules undici package json dependency hierarchy cli tgz root library fetch tgz x undici tgz vulnerable library found in base branch main vulnerability details undici is an http client written from scratch for node js undici is vulnerable to ssrf server side request forgery when an application takes in user input into the path pathname option of undici request if a user specifies a url such as or js const undici require undici undici request origin pathname instead of processing the request as or when is used it actually processes the request as and sends it to if a developer passes in user input into path parameter of undici request it can result in an ssrf as they will assume that the hostname cannot change when in actual fact it can change because the specified path parameter is combined with the base url this issue was fixed in undici the best workaround is to validate user input before passing it to the undici request call publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required low user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution undici rescue worker helmet automatic remediation is available for this issue
| 0
|
19,968
| 26,448,066,856
|
IssuesEvent
|
2023-01-16 09:07:51
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
Processing dialogs don't export data-defined overrides in qgis_process and JSON syntax
|
Processing Bug
|
### What is the bug or the crash?
Several processing algorithms provide a [data-defined override](https://docs.qgis.org/testing/en/docs/user_manual/introduction/general_tools.html#the-data-defined-override-widget) button next to a parameter, so that an attribute field or an expression can be inserted instead of a fixed value. Examples are the numerical `DISTANCE` parameter of `native:buffer`, and the boolean `ALL_PARTS` parameter of `native:centroids`.
The processing dialogs (tool dialog and history dialog) provide a means to export the chosen parameters as `qgis_process` command syntax or as a JSON string ('Advanced' button, or right clicking a history item). Both options are used in geospatial scripting outside of QGIS and Python, e.g. in bash or R.
However, the value of a data-overridden parameter is currently not captured in the exported string; instead it is empty (`null` in the case of JSON). E.g. we get below result – look at the **`DISTANCE`** value:
`qgis_process run native:buffer --ellipsoid=EPSG:7030 --INPUT='/usr/share/qgis/resources/data/world_map.gpkg|layername=countries' --DISTANCE= --SEGMENTS=5 --END_CAP_STYLE=0 --JOIN_STYLE=0 --MITER_LIMIT=2 --DISSOLVE=false --OUTPUT=TEMPORARY_OUTPUT
`
and
```
{
"ellipsoid": "EPSG:7030",
"inputs": {
"DISSOLVE": false,
"DISTANCE": null,
"END_CAP_STYLE": 0,
"INPUT": "/usr/share/qgis/resources/data/world_map.gpkg|layername=countries",
"JOIN_STYLE": 0,
"MITER_LIMIT": 2.0,
"OUTPUT": "TEMPORARY_OUTPUT",
"SEGMENTS": 5
}
}
```
### Steps to reproduce the issue
1. Make a QGIS project with at least 1 layer. E.g. a polygon layer.
2. From the toolbox, choose an algorithm that provides a parameter with the data-defined override widget, e.g. `native:buffer` (to control a numerical value) or `native:centroids` (to control a boolean value).
3. Apply a data-defined override by clicking the corresponding button next to the parameter and select an appropriate attribute field or provide an appropriate expression – that expression may even just be a fixed value.
4. In the processing dialog, click 'Advanced' and choose either 'Copy as qgis_process command' or 'Copy as JSON'.
5. Inspect the content of your clipboard: the value of the parameter is empty (`null` in the case of JSON).
### Versions
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd">
<html><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8" /></head><body>
QGIS version | 3.26.3-Buenos Aires | QGIS code revision | 65e4edfdad
-- | -- | -- | --
Qt version | 5.12.8
Python version | 3.8.10
GDAL/OGR version | 3.4.3
PROJ version | 8.2.0
EPSG Registry database version | v10.038 (2021-10-21)
GEOS version | 3.10.2-CAPI-1.16.0
SQLite version | 3.31.1
PDAL version | 2.2.0
PostgreSQL client version | 12.12 (Ubuntu 12.12-0ubuntu0.20.04.1)
SpatiaLite version | 5.0.1
QWT version | 6.1.4
QScintilla2 version | 2.11.2
OS version | Linux Mint 20
| | |
Active Python plugins
geopunt4Qgis | 2.2.4
ViewshedAnalysis | 1.7
cartography_tools | 1.2.1
quick_map_services | 0.19.27
grassprovider | 2.12.99
processing | 2.12.99
sagaprovider | 2.12.99
db_manager | 0.1.20
MetaSearch | 0.3.6
otbprovider | 2.12.99
</body></html>
### Supported QGIS version
- [X] I'm running a supported QGIS version according to the roadmap.
### New profile
- [ ] I tried with a new QGIS profile
### Additional context
_No response_
|
1.0
|
Processing dialogs don't export data-defined overrides in qgis_process and JSON syntax - ### What is the bug or the crash?
Several processing algorithms provide a [data-defined override](https://docs.qgis.org/testing/en/docs/user_manual/introduction/general_tools.html#the-data-defined-override-widget) button next to a parameter, so that an attribute field or an expression can be inserted instead of a fixed value. Examples are the numerical `DISTANCE` parameter of `native:buffer`, and the boolean `ALL_PARTS` parameter of `native:centroids`.
The processing dialogs (tool dialog and history dialog) provide a means to export the chosen parameters as `qgis_process` command syntax or as a JSON string ('Advanced' button, or right clicking a history item). Both options are used in geospatial scripting outside of QGIS and Python, e.g. in bash or R.
However, the value of a data-overridden parameter is currently not captured in the exported string; instead it is empty (`null` in the case of JSON). E.g. we get below result – look at the **`DISTANCE`** value:
`qgis_process run native:buffer --ellipsoid=EPSG:7030 --INPUT='/usr/share/qgis/resources/data/world_map.gpkg|layername=countries' --DISTANCE= --SEGMENTS=5 --END_CAP_STYLE=0 --JOIN_STYLE=0 --MITER_LIMIT=2 --DISSOLVE=false --OUTPUT=TEMPORARY_OUTPUT
`
and
```
{
"ellipsoid": "EPSG:7030",
"inputs": {
"DISSOLVE": false,
"DISTANCE": null,
"END_CAP_STYLE": 0,
"INPUT": "/usr/share/qgis/resources/data/world_map.gpkg|layername=countries",
"JOIN_STYLE": 0,
"MITER_LIMIT": 2.0,
"OUTPUT": "TEMPORARY_OUTPUT",
"SEGMENTS": 5
}
}
```
### Steps to reproduce the issue
1. Make a QGIS project with at least 1 layer. E.g. a polygon layer.
2. From the toolbox, choose an algorithm that provides a parameter with the data-defined override widget, e.g. `native:buffer` (to control a numerical value) or `native:centroids` (to control a boolean value).
3. Apply a data-defined override by clicking the corresponding button next to the parameter and select an appropriate attribute field or provide an appropriate expression – that expression may even just be a fixed value.
4. In the processing dialog, click 'Advanced' and choose either 'Copy as qgis_process command' or 'Copy as JSON'.
5. Inspect the content of your clipboard: the value of the parameter is empty (`null` in the case of JSON).
### Versions
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0//EN" "http://www.w3.org/TR/REC-html40/strict.dtd">
<html><head><meta http-equiv="Content-Type" content="text/html; charset=utf-8" /></head><body>
QGIS version | 3.26.3-Buenos Aires | QGIS code revision | 65e4edfdad
-- | -- | -- | --
Qt version | 5.12.8
Python version | 3.8.10
GDAL/OGR version | 3.4.3
PROJ version | 8.2.0
EPSG Registry database version | v10.038 (2021-10-21)
GEOS version | 3.10.2-CAPI-1.16.0
SQLite version | 3.31.1
PDAL version | 2.2.0
PostgreSQL client version | 12.12 (Ubuntu 12.12-0ubuntu0.20.04.1)
SpatiaLite version | 5.0.1
QWT version | 6.1.4
QScintilla2 version | 2.11.2
OS version | Linux Mint 20
| | |
Active Python plugins
geopunt4Qgis | 2.2.4
ViewshedAnalysis | 1.7
cartography_tools | 1.2.1
quick_map_services | 0.19.27
grassprovider | 2.12.99
processing | 2.12.99
sagaprovider | 2.12.99
db_manager | 0.1.20
MetaSearch | 0.3.6
otbprovider | 2.12.99
</body></html>
### Supported QGIS version
- [X] I'm running a supported QGIS version according to the roadmap.
### New profile
- [ ] I tried with a new QGIS profile
### Additional context
_No response_
|
process
|
processing dialogs don t export data defined overrides in qgis process and json syntax what is the bug or the crash several processing algorithms provide a button next to a parameter so that an attribute field or an expression can be inserted instead of a fixed value examples are the numerical distance parameter of native buffer and the boolean all parts parameter of native centroids the processing dialogs tool dialog and history dialog provide a means to export the chosen parameters as qgis process command syntax or as a json string advanced button or right clicking a history item both options are used in geospatial scripting outside of qgis and python e g in bash or r however the value of a data overridden parameter is currently not captured in the exported string instead it is empty null in the case of json e g we get below result – look at the distance value qgis process run native buffer ellipsoid epsg input usr share qgis resources data world map gpkg layername countries distance segments end cap style join style miter limit dissolve false output temporary output and ellipsoid epsg inputs dissolve false distance null end cap style input usr share qgis resources data world map gpkg layername countries join style miter limit output temporary output segments steps to reproduce the issue make a qgis project with at least layer e g a polygon layer from the toolbox choose an algorithm that provides a parameter with the data defined override widget e g native buffer to control a numerical value or native centroids to control a boolean value apply a data defined override by clicking the corresponding button next to the parameter and select an appropriate attribute field or provide an appropriate expression – that expression may even just be a fixed value in the processing dialog click advanced and choose either copy as qgis process command or copy as json inspect the content of your clipboard the value of the parameter is empty null in the case of json versions doctype html public dtd html en qgis version buenos aires qgis code revision qt version python version gdal ogr version proj version epsg registry database version geos version capi sqlite version pdal version postgresql client version ubuntu spatialite version qwt version version os version linux mint active python plugins viewshedanalysis cartography tools quick map services grassprovider processing sagaprovider db manager metasearch otbprovider supported qgis version i m running a supported qgis version according to the roadmap new profile i tried with a new qgis profile additional context no response
| 1
|
342
| 2,793,253,578
|
IssuesEvent
|
2015-05-11 09:41:44
|
ecodistrict/IDSSDashboard
|
https://api.github.com/repos/ecodistrict/IDSSDashboard
|
reopened
|
Case example / tutorial
|
enhancement form feedback 09102014 process step: assess alternatives
|
It would be good if we could export the buildup of a case study which gives an overview of all the settings, data sets used, calculations applied, etc, etc, not sure if this is already the case.
|
1.0
|
Case example / tutorial - It would be good if we could export the buildup of a case study which gives an overview of all the settings, data sets used, calculations applied, etc, etc, not sure if this is already the case.
|
process
|
case example tutorial it would be good if we could export the buildup of a case study which gives an overview of all the settings data sets used calculations applied etc etc not sure if this is already the case
| 1
|
180,202
| 30,465,490,251
|
IssuesEvent
|
2023-07-17 10:03:36
|
nacht-falter/aikido-course-website-django
|
https://api.github.com/repos/nacht-falter/aikido-course-website-django
|
opened
|
USER STORY: Messages
|
THEME: Website UX PRIORITY: Should-Have SIZE: S EPIC: UI Design
|
As a **visitor or logged-in user** I can **see messages with different colors** so that **I can get feedback for my actions.**
### Acceptance Criteria
- [ ] There are messages displayed, when the user submits forms.
- [ ] The messages have different colors depending on their type.
### Tasks
- [ ] Add bootstrap alert class to messages
- [ ] Set up messages typse in settings.py
- [ ] Add styling to alerts types
|
1.0
|
USER STORY: Messages - As a **visitor or logged-in user** I can **see messages with different colors** so that **I can get feedback for my actions.**
### Acceptance Criteria
- [ ] There are messages displayed, when the user submits forms.
- [ ] The messages have different colors depending on their type.
### Tasks
- [ ] Add bootstrap alert class to messages
- [ ] Set up messages typse in settings.py
- [ ] Add styling to alerts types
|
non_process
|
user story messages as a visitor or logged in user i can see messages with different colors so that i can get feedback for my actions acceptance criteria there are messages displayed when the user submits forms the messages have different colors depending on their type tasks add bootstrap alert class to messages set up messages typse in settings py add styling to alerts types
| 0
|
3,717
| 6,732,869,371
|
IssuesEvent
|
2017-10-18 13:08:04
|
lockedata/rcms
|
https://api.github.com/repos/lockedata/rcms
|
opened
|
Submit talks
|
odoo processes speaker
|
## Detailed task
- Submit a session
- Edit sessions
- See status of each talk
## Assessing the task
Try to perform the task. Use google and the system documentation to help - part of what we're trying to assess how easy it is for people to work out how to do tasks.
Use a 👍 (`:+1:`) reaction to this task if you were able to perform the task. Use a 👎 (`:-1:`) reaction to the task if you could not complete it. Add a reply with any comments or feedback.
## Extra Info
- Site: [odoo](//http://188.166.159.192:8069)
- System documentation: [odoo docs](https://www.odoo.com/page/docs)
- Role: Speaker
- Area: Processes
|
1.0
|
Submit talks - ## Detailed task
- Submit a session
- Edit sessions
- See status of each talk
## Assessing the task
Try to perform the task. Use google and the system documentation to help - part of what we're trying to assess how easy it is for people to work out how to do tasks.
Use a 👍 (`:+1:`) reaction to this task if you were able to perform the task. Use a 👎 (`:-1:`) reaction to the task if you could not complete it. Add a reply with any comments or feedback.
## Extra Info
- Site: [odoo](//http://188.166.159.192:8069)
- System documentation: [odoo docs](https://www.odoo.com/page/docs)
- Role: Speaker
- Area: Processes
|
process
|
submit talks detailed task submit a session edit sessions see status of each talk assessing the task try to perform the task use google and the system documentation to help part of what we re trying to assess how easy it is for people to work out how to do tasks use a 👍 reaction to this task if you were able to perform the task use a 👎 reaction to the task if you could not complete it add a reply with any comments or feedback extra info site system documentation role speaker area processes
| 1
|
271,751
| 29,659,375,974
|
IssuesEvent
|
2023-06-10 01:23:31
|
Trinadh465/device_renesas_kernel_AOSP10_r33_CVE-2021-33034
|
https://api.github.com/repos/Trinadh465/device_renesas_kernel_AOSP10_r33_CVE-2021-33034
|
closed
|
WS-2021-0596 (High) detected in linuxlinux-4.19.239 - autoclosed
|
Mend: dependency security vulnerability
|
## WS-2021-0596 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.239</b></p></summary>
<p>
<p>The Linux Kernel</p>
<p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p>
<p>Found in HEAD commit: <a href="https://github.com/Trinadh465/device_renesas_kernel_AOSP10_r33_CVE-2021-33034/commit/19525e8c58fe9ba0d7cb0f7a1a87d31d30380de6">19525e8c58fe9ba0d7cb0f7a1a87d31d30380de6</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
Linux Kernel before 5.15.12 avoid double free in tun_free_netdev
<p>Publish Date: 2021-12-30
<p>URL: <a href=https://github.com/gregkh/linux/commit/158b515f703e75e7d68289bf4d98c664e1d632df>WS-2021-0596</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://osv.dev/vulnerability/GSD-2021-1002847">https://osv.dev/vulnerability/GSD-2021-1002847</a></p>
<p>Release Date: 2021-12-30</p>
<p>Fix Resolution: v5.15.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
WS-2021-0596 (High) detected in linuxlinux-4.19.239 - autoclosed - ## WS-2021-0596 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.239</b></p></summary>
<p>
<p>The Linux Kernel</p>
<p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p>
<p>Found in HEAD commit: <a href="https://github.com/Trinadh465/device_renesas_kernel_AOSP10_r33_CVE-2021-33034/commit/19525e8c58fe9ba0d7cb0f7a1a87d31d30380de6">19525e8c58fe9ba0d7cb0f7a1a87d31d30380de6</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
Linux Kernel before 5.15.12 avoid double free in tun_free_netdev
<p>Publish Date: 2021-12-30
<p>URL: <a href=https://github.com/gregkh/linux/commit/158b515f703e75e7d68289bf4d98c664e1d632df>WS-2021-0596</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://osv.dev/vulnerability/GSD-2021-1002847">https://osv.dev/vulnerability/GSD-2021-1002847</a></p>
<p>Release Date: 2021-12-30</p>
<p>Fix Resolution: v5.15.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
ws high detected in linuxlinux autoclosed ws high severity vulnerability vulnerable library linuxlinux the linux kernel library home page a href found in head commit a href found in base branch master vulnerable source files vulnerability details linux kernel before avoid double free in tun free netdev publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
| 0
|
2,565
| 5,316,101,615
|
IssuesEvent
|
2017-02-13 19:01:25
|
jlm2017/jlm-video-subtitles
|
https://api.github.com/repos/jlm2017/jlm-video-subtitles
|
closed
|
[Subtitles] [EN] "COMMENT FINANCER LE REMBOURSEMENT À 100% DES SOINS DE SANTÉ PRESCRITS ?"
|
Language: English Process: [6] Approved
|
# Video title
"COMMENT FINANCER LE REMBOURSEMENT À 100% DES SOINS DE SANTÉ PRESCRITS ?"
# URL
https://www.youtube.com/watch?v=p4WrCjgCbaA&t=149s
# Youtube subtitles language
Anglais
# Duration
4:36
# Subtitles URL
https://www.youtube.com/timedtext_editor?lang=en&v=p4WrCjgCbaA&ui=hd&ref=player&action_mde_edit_form=1&tab=captions&bl=vmp
|
1.0
|
[Subtitles] [EN] "COMMENT FINANCER LE REMBOURSEMENT À 100% DES SOINS DE SANTÉ PRESCRITS ?" - # Video title
"COMMENT FINANCER LE REMBOURSEMENT À 100% DES SOINS DE SANTÉ PRESCRITS ?"
# URL
https://www.youtube.com/watch?v=p4WrCjgCbaA&t=149s
# Youtube subtitles language
Anglais
# Duration
4:36
# Subtitles URL
https://www.youtube.com/timedtext_editor?lang=en&v=p4WrCjgCbaA&ui=hd&ref=player&action_mde_edit_form=1&tab=captions&bl=vmp
|
process
|
comment financer le remboursement à des soins de santé prescrits video title comment financer le remboursement à des soins de santé prescrits url youtube subtitles language anglais duration subtitles url
| 1
|
272,747
| 20,755,140,081
|
IssuesEvent
|
2022-03-15 11:26:55
|
TsubakiBotPad/pad-cogs
|
https://api.github.com/repos/TsubakiBotPad/pad-cogs
|
closed
|
figure out some way to upload awakening images so we can include them in docs
|
documentation id3 discussion
|
figure out some way to upload awakening images so we can include them in docs
I'm thinking an assets folder in dadguide cog similar to the one in padbuildimg cog?
|
1.0
|
figure out some way to upload awakening images so we can include them in docs - figure out some way to upload awakening images so we can include them in docs
I'm thinking an assets folder in dadguide cog similar to the one in padbuildimg cog?
|
non_process
|
figure out some way to upload awakening images so we can include them in docs figure out some way to upload awakening images so we can include them in docs i m thinking an assets folder in dadguide cog similar to the one in padbuildimg cog
| 0
|
16,804
| 22,047,994,391
|
IssuesEvent
|
2022-05-30 05:26:48
|
camunda/zeebe
|
https://api.github.com/repos/camunda/zeebe
|
closed
|
User can't deploy two different DMN models in one deployment
|
kind/bug severity/mid area/ux team/process-automation
|
**Describe the bug**
> Hi,
>
> I’m having an issue trying to write a process test using Zeebe-process-test-extension in Camunda Cloud 8. (v.8.0.2)
> The following code runs fine if it’s just 1 bpmn and 1 dmn. But fails when i try adding second dmn:
> return client.newDeployResourceCommand()
> .addResourceFromClasspath(“test.bpmn”)
> .addResourceFile("~/dmn/test1.dmn")
> .addResourceFile("~/dmn/test2.dmn")
> .send()
> .join();
>
> Here is the error: io.camunda.zeebe.db.ZeebeDbInconsistentException: Key DbLong{2251799813685250} in ColumnFamily DMN_DECISION_REQUIREMENTS already exists
>
> I tried using addResourceFromClasspath and AddResourceFile - both methods result in the same error.
https://forum.camunda.io/t/fail-to-load-multiple-dmns-in-zeebe-process-test-extension-test-in-camunda-8/37483/4?u=zelldon
<!-- A clear and concise description of what the bug is. -->
**To Reproduce**
```
return client.newDeployResourceCommand()
.addResourceFromClasspath(“dummy_bpmn.bpmn”)
.addResourceFromClasspath(“dummy1.dmn”)
.addResourceFromClasspath(“dummy2.dmn”)
.send()
.join();
```
[dummy1.dmn.txt](https://github.com/camunda/zeebe/files/8650472/dummy1.dmn.txt)
[dummy2.dmn.txt](https://github.com/camunda/zeebe/files/8650473/dummy2.dmn.txt)
[dummy-bpmn.bpmn.txt](https://github.com/camunda/zeebe/files/8650474/dummy-bpmn.bpmn.txt)
<!--
Steps to reproduce the behavior
If possible add a minimal reproducer code sample
- when using the Java client: https://github.com/zeebe-io/zeebe-test-template-java
-->
**Expected behavior**
No error on deploying two different dmn models.
<!-- A clear and concise description of what you expected to happen. -->
**Log/Stacktrace**
<!-- If possible add the full stacktrace or Zeebe log which contains the issue. -->
<details><summary>Full Stacktrace</summary>
<p>
```
Error:
io.camunda.zeebe.client.api.command.ClientStatusException: Command 'CREATE' rejected with code 'INVALID_ARGUMENT': Expected to deploy new resources, but encountered the following errors:
'dummy2.dmn': Key DbLong{2251799813685250} in ColumnFamily DMN_DECISION_REQUIREMENTS already exists
at io.camunda.zeebe.client.impl.ZeebeClientFutureImpl.transformExecutionException(ZeebeClientFutureImpl.java:93)
at io.camunda.zeebe.client.impl.ZeebeClientFutureImpl.join(ZeebeClientFutureImpl.java:50)
at io.camunda.c8.test.DeterminationProcessTest.initDeployment(DeterminationProcessTest.java:48)
at io.camunda.c8.test.DeterminationProcessTest.testDeployment(DeterminationProcessTest.java:53)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:568)
at org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:725)
at org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
at org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
at org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149)
at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestableMethod(TimeoutExtension.java:140)
at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestMethod(TimeoutExtension.java:84)
at org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115)
at org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)
at org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
at org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
at org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
at org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
at org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104)
at org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98)
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.lambda$invokeTestMethod$7(TestMethodTestDescriptor.java:214)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.invokeTestMethod(TestMethodTestDescriptor.java:210)
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:135)
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:66)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141)
at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95)
at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141)
at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95)
at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141)
at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95)
at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:54)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:107)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:88)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:54)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:67)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:52)
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:114)
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:86)
at org.junit.platform.launcher.core.DefaultLauncherSession$DelegatingLauncher.execute(DefaultLauncherSession.java:86)
at org.junit.platform.launcher.core.SessionPerRequestLauncher.execute(SessionPerRequestLauncher.java:53)
at com.intellij.junit5.JUnit5IdeaTestRunner.startRunnerWithArgs(JUnit5IdeaTestRunner.java:71)
at com.intellij.rt.junit.IdeaTestRunner$Repeater$1.execute(IdeaTestRunner.java:38)
at com.intellij.rt.execution.junit.TestsRepeater.repeat(TestsRepeater.java:11)
at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:35)
at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:235)
at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:54)
Caused by: java.util.concurrent.ExecutionException: io.grpc.StatusRuntimeException: INVALID_ARGUMENT: Command 'CREATE' rejected with code 'INVALID_ARGUMENT': Expected to deploy new resources, but encountered the following errors:
'dummy2.dmn': Key DbLong{2251799813685250} in ColumnFamily DMN_DECISION_REQUIREMENTS already exists
at java.base/java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396)
at java.base/java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073)
at io.camunda.zeebe.client.impl.ZeebeClientFutureImpl.join(ZeebeClientFutureImpl.java:48)
... 71 more
```
</p>
</details>
**Environment:**
- zeebe-process-test:8.0.1
|
1.0
|
User can't deploy two different DMN models in one deployment - **Describe the bug**
> Hi,
>
> I’m having an issue trying to write a process test using Zeebe-process-test-extension in Camunda Cloud 8. (v.8.0.2)
> The following code runs fine if it’s just 1 bpmn and 1 dmn. But fails when i try adding second dmn:
> return client.newDeployResourceCommand()
> .addResourceFromClasspath(“test.bpmn”)
> .addResourceFile("~/dmn/test1.dmn")
> .addResourceFile("~/dmn/test2.dmn")
> .send()
> .join();
>
> Here is the error: io.camunda.zeebe.db.ZeebeDbInconsistentException: Key DbLong{2251799813685250} in ColumnFamily DMN_DECISION_REQUIREMENTS already exists
>
> I tried using addResourceFromClasspath and AddResourceFile - both methods result in the same error.
https://forum.camunda.io/t/fail-to-load-multiple-dmns-in-zeebe-process-test-extension-test-in-camunda-8/37483/4?u=zelldon
<!-- A clear and concise description of what the bug is. -->
**To Reproduce**
```
return client.newDeployResourceCommand()
.addResourceFromClasspath(“dummy_bpmn.bpmn”)
.addResourceFromClasspath(“dummy1.dmn”)
.addResourceFromClasspath(“dummy2.dmn”)
.send()
.join();
```
[dummy1.dmn.txt](https://github.com/camunda/zeebe/files/8650472/dummy1.dmn.txt)
[dummy2.dmn.txt](https://github.com/camunda/zeebe/files/8650473/dummy2.dmn.txt)
[dummy-bpmn.bpmn.txt](https://github.com/camunda/zeebe/files/8650474/dummy-bpmn.bpmn.txt)
<!--
Steps to reproduce the behavior
If possible add a minimal reproducer code sample
- when using the Java client: https://github.com/zeebe-io/zeebe-test-template-java
-->
**Expected behavior**
No error on deploying two different dmn models.
<!-- A clear and concise description of what you expected to happen. -->
**Log/Stacktrace**
<!-- If possible add the full stacktrace or Zeebe log which contains the issue. -->
<details><summary>Full Stacktrace</summary>
<p>
```
Error:
io.camunda.zeebe.client.api.command.ClientStatusException: Command 'CREATE' rejected with code 'INVALID_ARGUMENT': Expected to deploy new resources, but encountered the following errors:
'dummy2.dmn': Key DbLong{2251799813685250} in ColumnFamily DMN_DECISION_REQUIREMENTS already exists
at io.camunda.zeebe.client.impl.ZeebeClientFutureImpl.transformExecutionException(ZeebeClientFutureImpl.java:93)
at io.camunda.zeebe.client.impl.ZeebeClientFutureImpl.join(ZeebeClientFutureImpl.java:50)
at io.camunda.c8.test.DeterminationProcessTest.initDeployment(DeterminationProcessTest.java:48)
at io.camunda.c8.test.DeterminationProcessTest.testDeployment(DeterminationProcessTest.java:53)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:568)
at org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:725)
at org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
at org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
at org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149)
at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestableMethod(TimeoutExtension.java:140)
at org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestMethod(TimeoutExtension.java:84)
at org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115)
at org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)
at org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
at org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
at org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
at org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
at org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104)
at org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98)
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.lambda$invokeTestMethod$7(TestMethodTestDescriptor.java:214)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.invokeTestMethod(TestMethodTestDescriptor.java:210)
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:135)
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:66)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141)
at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95)
at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141)
at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95)
at java.base/java.util.ArrayList.forEach(ArrayList.java:1511)
at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141)
at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139)
at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138)
at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95)
at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:54)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:107)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:88)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:54)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:67)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:52)
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:114)
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:86)
at org.junit.platform.launcher.core.DefaultLauncherSession$DelegatingLauncher.execute(DefaultLauncherSession.java:86)
at org.junit.platform.launcher.core.SessionPerRequestLauncher.execute(SessionPerRequestLauncher.java:53)
at com.intellij.junit5.JUnit5IdeaTestRunner.startRunnerWithArgs(JUnit5IdeaTestRunner.java:71)
at com.intellij.rt.junit.IdeaTestRunner$Repeater$1.execute(IdeaTestRunner.java:38)
at com.intellij.rt.execution.junit.TestsRepeater.repeat(TestsRepeater.java:11)
at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:35)
at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:235)
at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:54)
Caused by: java.util.concurrent.ExecutionException: io.grpc.StatusRuntimeException: INVALID_ARGUMENT: Command 'CREATE' rejected with code 'INVALID_ARGUMENT': Expected to deploy new resources, but encountered the following errors:
'dummy2.dmn': Key DbLong{2251799813685250} in ColumnFamily DMN_DECISION_REQUIREMENTS already exists
at java.base/java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:396)
at java.base/java.util.concurrent.CompletableFuture.get(CompletableFuture.java:2073)
at io.camunda.zeebe.client.impl.ZeebeClientFutureImpl.join(ZeebeClientFutureImpl.java:48)
... 71 more
```
</p>
</details>
**Environment:**
- zeebe-process-test:8.0.1
|
process
|
user can t deploy two different dmn models in one deployment describe the bug hi i’m having an issue trying to write a process test using zeebe process test extension in camunda cloud v the following code runs fine if it’s just bpmn and dmn but fails when i try adding second dmn return client newdeployresourcecommand addresourcefromclasspath “test bpmn” addresourcefile dmn dmn addresourcefile dmn dmn send join here is the error io camunda zeebe db zeebedbinconsistentexception key dblong in columnfamily dmn decision requirements already exists i tried using addresourcefromclasspath and addresourcefile both methods result in the same error to reproduce return client newdeployresourcecommand addresourcefromclasspath “dummy bpmn bpmn” addresourcefromclasspath “ dmn” addresourcefromclasspath “ dmn” send join steps to reproduce the behavior if possible add a minimal reproducer code sample when using the java client expected behavior no error on deploying two different dmn models log stacktrace full stacktrace error io camunda zeebe client api command clientstatusexception command create rejected with code invalid argument expected to deploy new resources but encountered the following errors dmn key dblong in columnfamily dmn decision requirements already exists at io camunda zeebe client impl zeebeclientfutureimpl transformexecutionexception zeebeclientfutureimpl java at io camunda zeebe client impl zeebeclientfutureimpl join zeebeclientfutureimpl java at io camunda test determinationprocesstest initdeployment determinationprocesstest java at io camunda test determinationprocesstest testdeployment determinationprocesstest java at java base jdk internal reflect nativemethodaccessorimpl native method at java base jdk internal reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at java base jdk internal reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java base java lang reflect method invoke method java at org junit platform commons util reflectionutils invokemethod reflectionutils java at org junit jupiter engine execution methodinvocation proceed methodinvocation java at org junit jupiter engine execution invocationinterceptorchain validatinginvocation proceed invocationinterceptorchain java at org junit jupiter engine extension timeoutextension intercept timeoutextension java at org junit jupiter engine extension timeoutextension intercepttestablemethod timeoutextension java at org junit jupiter engine extension timeoutextension intercepttestmethod timeoutextension java at org junit jupiter engine execution executableinvoker reflectiveinterceptorcall lambda ofvoidmethod executableinvoker java at org junit jupiter engine execution executableinvoker lambda invoke executableinvoker java at org junit jupiter engine execution invocationinterceptorchain interceptedinvocation proceed invocationinterceptorchain java at org junit jupiter engine execution invocationinterceptorchain proceed invocationinterceptorchain java at org junit jupiter engine execution invocationinterceptorchain chainandinvoke invocationinterceptorchain java at org junit jupiter engine execution invocationinterceptorchain invoke invocationinterceptorchain java at org junit jupiter engine execution executableinvoker invoke executableinvoker java at org junit jupiter engine execution executableinvoker invoke executableinvoker java at org junit jupiter engine descriptor testmethodtestdescriptor lambda invoketestmethod testmethodtestdescriptor java at org junit platform engine support hierarchical throwablecollector execute throwablecollector java at org junit jupiter engine descriptor testmethodtestdescriptor invoketestmethod testmethodtestdescriptor java at org junit jupiter engine descriptor testmethodtestdescriptor execute testmethodtestdescriptor java at org junit jupiter engine descriptor testmethodtestdescriptor execute testmethodtestdescriptor java at org junit platform engine support hierarchical nodetesttask lambda executerecursively nodetesttask java at org junit platform engine support hierarchical throwablecollector execute throwablecollector java at org junit platform engine support hierarchical nodetesttask lambda executerecursively nodetesttask java at org junit platform engine support hierarchical node around node java at org junit platform engine support hierarchical nodetesttask lambda executerecursively nodetesttask java at org junit platform engine support hierarchical throwablecollector execute throwablecollector java at org junit platform engine support hierarchical nodetesttask executerecursively nodetesttask java at org junit platform engine support hierarchical nodetesttask execute nodetesttask java at java base java util arraylist foreach arraylist java at org junit platform engine support hierarchical samethreadhierarchicaltestexecutorservice invokeall samethreadhierarchicaltestexecutorservice java at org junit platform engine support hierarchical nodetesttask lambda executerecursively nodetesttask java at org junit platform engine support hierarchical throwablecollector execute throwablecollector java at org junit platform engine support hierarchical nodetesttask lambda executerecursively nodetesttask java at org junit platform engine support hierarchical node around node java at org junit platform engine support hierarchical nodetesttask lambda executerecursively nodetesttask java at org junit platform engine support hierarchical throwablecollector execute throwablecollector java at org junit platform engine support hierarchical nodetesttask executerecursively nodetesttask java at org junit platform engine support hierarchical nodetesttask execute nodetesttask java at java base java util arraylist foreach arraylist java at org junit platform engine support hierarchical samethreadhierarchicaltestexecutorservice invokeall samethreadhierarchicaltestexecutorservice java at org junit platform engine support hierarchical nodetesttask lambda executerecursively nodetesttask java at org junit platform engine support hierarchical throwablecollector execute throwablecollector java at org junit platform engine support hierarchical nodetesttask lambda executerecursively nodetesttask java at org junit platform engine support hierarchical node around node java at org junit platform engine support hierarchical nodetesttask lambda executerecursively nodetesttask java at org junit platform engine support hierarchical throwablecollector execute throwablecollector java at org junit platform engine support hierarchical nodetesttask executerecursively nodetesttask java at org junit platform engine support hierarchical nodetesttask execute nodetesttask java at org junit platform engine support hierarchical samethreadhierarchicaltestexecutorservice submit samethreadhierarchicaltestexecutorservice java at org junit platform engine support hierarchical hierarchicaltestexecutor execute hierarchicaltestexecutor java at org junit platform engine support hierarchical hierarchicaltestengine execute hierarchicaltestengine java at org junit platform launcher core engineexecutionorchestrator execute engineexecutionorchestrator java at org junit platform launcher core engineexecutionorchestrator execute engineexecutionorchestrator java at org junit platform launcher core engineexecutionorchestrator lambda execute engineexecutionorchestrator java at org junit platform launcher core engineexecutionorchestrator withinterceptedstreams engineexecutionorchestrator java at org junit platform launcher core engineexecutionorchestrator execute engineexecutionorchestrator java at org junit platform launcher core defaultlauncher execute defaultlauncher java at org junit platform launcher core defaultlauncher execute defaultlauncher java at org junit platform launcher core defaultlaunchersession delegatinglauncher execute defaultlaunchersession java at org junit platform launcher core sessionperrequestlauncher execute sessionperrequestlauncher java at com intellij startrunnerwithargs java at com intellij rt junit ideatestrunner repeater execute ideatestrunner java at com intellij rt execution junit testsrepeater repeat testsrepeater java at com intellij rt junit ideatestrunner repeater startrunnerwithargs ideatestrunner java at com intellij rt junit junitstarter preparestreamsandstart junitstarter java at com intellij rt junit junitstarter main junitstarter java caused by java util concurrent executionexception io grpc statusruntimeexception invalid argument command create rejected with code invalid argument expected to deploy new resources but encountered the following errors dmn key dblong in columnfamily dmn decision requirements already exists at java base java util concurrent completablefuture reportget completablefuture java at java base java util concurrent completablefuture get completablefuture java at io camunda zeebe client impl zeebeclientfutureimpl join zeebeclientfutureimpl java more environment zeebe process test
| 1
|
17,955
| 23,959,966,322
|
IssuesEvent
|
2022-09-12 18:09:44
|
influxdata/telegraf
|
https://api.github.com/repos/influxdata/telegraf
|
closed
|
Enum Processor Plugin not working after upgrade from 1.14.4 to 1.18.3
|
bug plugin/processor waiting for response
|
We are testing a Telegraph update from 1.14.4 to 1.18.3 and we have some issues with the Enum Processor Plugin.
We have the following config:
```
name_prefix = "snmp_defensepro"
[[inputs.snmp.table]]
name = "_fans"
oid = "RADWARE-MIB::rsSystemFansTable"
inherit_tags = [ "sysname" ]
#Processors
[[processors.enum]]
namepass = ["snmp_defensepro_fans"]
order = 1
[[processors.enum.mapping]]
## Name of the field to map
field = "rsSystemFansStatus"
default = 0
## Table of mappings
[processors.enum.mapping.value_mappings]
OK = 1
```
After the upgrade to v1.18.3 the fan metrics are always '0' set by the default rule. While the result of the snmpwalk gives:
RADWARE-MIB::rsSystemFanIndex.1 = INTEGER: 1
RADWARE-MIB::rsSystemFanIndex.2 = INTEGER: 2
RADWARE-MIB::rsSystemFanIndex.3 = INTEGER: 3
RADWARE-MIB::rsSystemFansStatus.1 = STRING: "OK"
RADWARE-MIB::rsSystemFansStatus.2 = STRING: "OK"
RADWARE-MIB::rsSystemFansStatus.3 = STRING: "OK"
So after the upgrade the "OK" is no longer mapped to '1'.
Is this a know issue and is this fixed in later versions?
Has the working of the plugin changed?
|
1.0
|
Enum Processor Plugin not working after upgrade from 1.14.4 to 1.18.3 - We are testing a Telegraph update from 1.14.4 to 1.18.3 and we have some issues with the Enum Processor Plugin.
We have the following config:
```
name_prefix = "snmp_defensepro"
[[inputs.snmp.table]]
name = "_fans"
oid = "RADWARE-MIB::rsSystemFansTable"
inherit_tags = [ "sysname" ]
#Processors
[[processors.enum]]
namepass = ["snmp_defensepro_fans"]
order = 1
[[processors.enum.mapping]]
## Name of the field to map
field = "rsSystemFansStatus"
default = 0
## Table of mappings
[processors.enum.mapping.value_mappings]
OK = 1
```
After the upgrade to v1.18.3 the fan metrics are always '0' set by the default rule. While the result of the snmpwalk gives:
RADWARE-MIB::rsSystemFanIndex.1 = INTEGER: 1
RADWARE-MIB::rsSystemFanIndex.2 = INTEGER: 2
RADWARE-MIB::rsSystemFanIndex.3 = INTEGER: 3
RADWARE-MIB::rsSystemFansStatus.1 = STRING: "OK"
RADWARE-MIB::rsSystemFansStatus.2 = STRING: "OK"
RADWARE-MIB::rsSystemFansStatus.3 = STRING: "OK"
So after the upgrade the "OK" is no longer mapped to '1'.
Is this a know issue and is this fixed in later versions?
Has the working of the plugin changed?
|
process
|
enum processor plugin not working after upgrade from to we are testing a telegraph update from to and we have some issues with the enum processor plugin we have the following config name prefix snmp defensepro name fans oid radware mib rssystemfanstable inherit tags processors namepass order name of the field to map field rssystemfansstatus default table of mappings ok after the upgrade to the fan metrics are always set by the default rule while the result of the snmpwalk gives radware mib rssystemfanindex integer radware mib rssystemfanindex integer radware mib rssystemfanindex integer radware mib rssystemfansstatus string ok radware mib rssystemfansstatus string ok radware mib rssystemfansstatus string ok so after the upgrade the ok is no longer mapped to is this a know issue and is this fixed in later versions has the working of the plugin changed
| 1
|
19,913
| 26,375,222,580
|
IssuesEvent
|
2023-01-12 01:28:33
|
quark-engine/quark-engine
|
https://api.github.com/repos/quark-engine/quark-engine
|
closed
|
Add quark script case for CWE-295
|
issue-processing-state-03
|
# Detect CWE-295 in Android Application (InsecureShop.apk)
This scenario seeks to find **Improper Certificate Validation**. See [CWE-295](https://cwe.mitre.org/data/definitions/295.html) for more details.
Let’s use this [APK](https://github.com/hax0rgb/InsecureShop) and the above APIs to show how the Quark script finds this vulnerability.
We use the API `findMethodInAPK` to locate all `SslErrorHandler.proceed` methods. Then we need to identify whether the method `WebViewClient.onReceivedSslError` is overridden by its subclass.
First, we check and make sure that the `MethodInstance.name` is `onReceivedSslError`, and the `MethodInstance.descriptor` is `(Landroid/webkit/WebView; Landroid/webkit/SslErrorHandler; Landroid/net/http/SslError;)V`.
Then we use the method API `MethodInstance.findSuperclassHierarchy`to get the supclass list of the method's caller class.
Finally, we check the `Landroid/webkit/WebViewClient;` is on the supclass list. If **YES** , that may cause CWE-295 vulnerability.
## API Spec
**MethodInstance.findSuperclassHierarchy()**
* **Description:** Find all superclass hierarchy of this method object.
* **params:** None
* **Return:** Python list contains all superclas's name of the this method.
## Quark Script CWE-295.py
```python
from quark.script import findMethodInAPK
SAMPLE_PATH = "insecureShop.apk"
TARGET_METHOD = [
"Landroid/webkit/SslErrorHandler;", # class name
"proceed", # method name
"()V" # descriptor
]
OVERRIDE_METHOD = [
"Landroid/webkit/WebViewClient;", # class name
"onReceivedSslError", # method name
# descriptor
"(Landroid/webkit/WebView; Landroid/webkit/SslErrorHandler; Landroid/net/http/SslError;)V"
]
for sslProceedCaller in findMethodInAPK(SAMPLE_PATH, TARGET_METHOD):
if (sslProceedCaller.name == OVERRIDE_METHOD[1] and
sslProceedCaller.descriptor == OVERRIDE_METHOD[2] and
OVERRIDE_METHOD[0] in sslProceedCaller.findSuperclassHierarchy()):
print(f"CWE-295 is detected in method, {sslProceedCaller.fullName}")
```
## Quark Script Result
```
$python3 CWE-295.py
Requested API level 29 is larger than maximum we have, returning API level 28 instead.
CWE-295 is detected in method, Lcom/insecureshop/util/CustomWebViewClient; onReceivedSslError (Landroid/webkit/WebView; Landroid/webkit/SslErrorHandler; Landroid/net/http/SslError;)V
```
|
1.0
|
Add quark script case for CWE-295 - # Detect CWE-295 in Android Application (InsecureShop.apk)
This scenario seeks to find **Improper Certificate Validation**. See [CWE-295](https://cwe.mitre.org/data/definitions/295.html) for more details.
Let’s use this [APK](https://github.com/hax0rgb/InsecureShop) and the above APIs to show how the Quark script finds this vulnerability.
We use the API `findMethodInAPK` to locate all `SslErrorHandler.proceed` methods. Then we need to identify whether the method `WebViewClient.onReceivedSslError` is overridden by its subclass.
First, we check and make sure that the `MethodInstance.name` is `onReceivedSslError`, and the `MethodInstance.descriptor` is `(Landroid/webkit/WebView; Landroid/webkit/SslErrorHandler; Landroid/net/http/SslError;)V`.
Then we use the method API `MethodInstance.findSuperclassHierarchy`to get the supclass list of the method's caller class.
Finally, we check the `Landroid/webkit/WebViewClient;` is on the supclass list. If **YES** , that may cause CWE-295 vulnerability.
## API Spec
**MethodInstance.findSuperclassHierarchy()**
* **Description:** Find all superclass hierarchy of this method object.
* **params:** None
* **Return:** Python list contains all superclas's name of the this method.
## Quark Script CWE-295.py
```python
from quark.script import findMethodInAPK
SAMPLE_PATH = "insecureShop.apk"
TARGET_METHOD = [
"Landroid/webkit/SslErrorHandler;", # class name
"proceed", # method name
"()V" # descriptor
]
OVERRIDE_METHOD = [
"Landroid/webkit/WebViewClient;", # class name
"onReceivedSslError", # method name
# descriptor
"(Landroid/webkit/WebView; Landroid/webkit/SslErrorHandler; Landroid/net/http/SslError;)V"
]
for sslProceedCaller in findMethodInAPK(SAMPLE_PATH, TARGET_METHOD):
if (sslProceedCaller.name == OVERRIDE_METHOD[1] and
sslProceedCaller.descriptor == OVERRIDE_METHOD[2] and
OVERRIDE_METHOD[0] in sslProceedCaller.findSuperclassHierarchy()):
print(f"CWE-295 is detected in method, {sslProceedCaller.fullName}")
```
## Quark Script Result
```
$python3 CWE-295.py
Requested API level 29 is larger than maximum we have, returning API level 28 instead.
CWE-295 is detected in method, Lcom/insecureshop/util/CustomWebViewClient; onReceivedSslError (Landroid/webkit/WebView; Landroid/webkit/SslErrorHandler; Landroid/net/http/SslError;)V
```
|
process
|
add quark script case for cwe detect cwe in android application insecureshop apk this scenario seeks to find improper certificate validation see for more details let’s use this and the above apis to show how the quark script finds this vulnerability we use the api findmethodinapk to locate all sslerrorhandler proceed methods then we need to identify whether the method webviewclient onreceivedsslerror is overridden by its subclass first we check and make sure that the methodinstance name is onreceivedsslerror and the methodinstance descriptor is landroid webkit webview landroid webkit sslerrorhandler landroid net http sslerror v then we use the method api methodinstance findsuperclasshierarchy to get the supclass list of the method s caller class finally we check the landroid webkit webviewclient is on the supclass list if yes that may cause cwe vulnerability api spec methodinstance findsuperclasshierarchy description find all superclass hierarchy of this method object params none return python list contains all superclas s name of the this method quark script cwe py python from quark script import findmethodinapk sample path insecureshop apk target method landroid webkit sslerrorhandler class name proceed method name v descriptor override method landroid webkit webviewclient class name onreceivedsslerror method name descriptor landroid webkit webview landroid webkit sslerrorhandler landroid net http sslerror v for sslproceedcaller in findmethodinapk sample path target method if sslproceedcaller name override method and sslproceedcaller descriptor override method and override method in sslproceedcaller findsuperclasshierarchy print f cwe is detected in method sslproceedcaller fullname quark script result cwe py requested api level is larger than maximum we have returning api level instead cwe is detected in method lcom insecureshop util customwebviewclient onreceivedsslerror landroid webkit webview landroid webkit sslerrorhandler landroid net http sslerror v
| 1
|
20,349
| 27,009,792,185
|
IssuesEvent
|
2023-02-10 14:34:06
|
sysflow-telemetry/sysflow
|
https://api.github.com/repos/sysflow-telemetry/sysflow
|
opened
|
Add Sigma rules support
|
enhancement sf-processor
|
**Indicate project**
Processor
**Overview**
We want to enable Sigma rules evaluation in the SysFlow Processor, using our policy engine architecture as the base framework.
**Tasks**
- [x] Refactoring to enable multi-language support
- [x] Upgrade to Go 1.19
- [x] Separation between policy front and backend compilation
- [x] Generics (parametric record, to enable different backends for different input sources)
- [ ] Sigma frontend
- [x] Conditions
- [x] Search evaluation (event matching)
- [x] Field modifiers
- [ ] Field transformers
- [ ] Keywords
- [x] Field mapping support
- [x] Sigma backend support
- [ ] Field mapping for SysFlow
- [ ] Unit tests
- [ ] Sigma rule curation
- [ ] Performance tests
- [ ] Documentation
**Additional context**
Working branch: https://github.com/sysflow-telemetry/sf-processor/tree/go1.19-sigma
|
1.0
|
Add Sigma rules support - **Indicate project**
Processor
**Overview**
We want to enable Sigma rules evaluation in the SysFlow Processor, using our policy engine architecture as the base framework.
**Tasks**
- [x] Refactoring to enable multi-language support
- [x] Upgrade to Go 1.19
- [x] Separation between policy front and backend compilation
- [x] Generics (parametric record, to enable different backends for different input sources)
- [ ] Sigma frontend
- [x] Conditions
- [x] Search evaluation (event matching)
- [x] Field modifiers
- [ ] Field transformers
- [ ] Keywords
- [x] Field mapping support
- [x] Sigma backend support
- [ ] Field mapping for SysFlow
- [ ] Unit tests
- [ ] Sigma rule curation
- [ ] Performance tests
- [ ] Documentation
**Additional context**
Working branch: https://github.com/sysflow-telemetry/sf-processor/tree/go1.19-sigma
|
process
|
add sigma rules support indicate project processor overview we want to enable sigma rules evaluation in the sysflow processor using our policy engine architecture as the base framework tasks refactoring to enable multi language support upgrade to go separation between policy front and backend compilation generics parametric record to enable different backends for different input sources sigma frontend conditions search evaluation event matching field modifiers field transformers keywords field mapping support sigma backend support field mapping for sysflow unit tests sigma rule curation performance tests documentation additional context working branch
| 1
|
419,682
| 28,150,481,307
|
IssuesEvent
|
2023-04-03 00:11:09
|
Dorps/aiboardgame
|
https://api.github.com/repos/Dorps/aiboardgame
|
closed
|
Team[33]-Review: Ambiguous Phase-in Plan
|
documentation Review-SRS-Team33
|
Phase-In Plan section is ambiguous and merely based on deliverable dates as well as priority. It would be more effective to highlight the subsections within the priorities in which the order of these requirements will be completed (i.e. although a plethora of requirements labeled as high priority must be completed before your listed date, a lot of those requirements must be completed in some orderly fashion). It would be best practice to state the order of which those requirements would be completed in, in order to narrow down your tasks list.
Note: Please create a label for Team[33]-Review
|
1.0
|
Team[33]-Review: Ambiguous Phase-in Plan - Phase-In Plan section is ambiguous and merely based on deliverable dates as well as priority. It would be more effective to highlight the subsections within the priorities in which the order of these requirements will be completed (i.e. although a plethora of requirements labeled as high priority must be completed before your listed date, a lot of those requirements must be completed in some orderly fashion). It would be best practice to state the order of which those requirements would be completed in, in order to narrow down your tasks list.
Note: Please create a label for Team[33]-Review
|
non_process
|
team review ambiguous phase in plan phase in plan section is ambiguous and merely based on deliverable dates as well as priority it would be more effective to highlight the subsections within the priorities in which the order of these requirements will be completed i e although a plethora of requirements labeled as high priority must be completed before your listed date a lot of those requirements must be completed in some orderly fashion it would be best practice to state the order of which those requirements would be completed in in order to narrow down your tasks list note please create a label for team review
| 0
|
69,887
| 8,469,579,440
|
IssuesEvent
|
2018-10-23 23:39:59
|
guylepage3/lepage.cc
|
https://api.github.com/repos/guylepage3/lepage.cc
|
closed
|
About page design and content development v2.2.0
|
copywriting design
|
**Description**
About page design and content development v2.2.0.
**Features**
- [x] Intro https://github.com/guylepage3/lepage.cc/issues/1#issuecomment-432450364
- [x] Resume
- [x] Bio https://github.com/guylepage3/lepage.cc/issues/1#issuecomment-432422664
- [x] Highlights https://github.com/guylepage3/lepage.cc/issues/1#issuecomment-432423170
- [x] Work experience
- Link to LinkedIn.
**Reference**
- About Preethi Kasireddy – https://www.hashhack.it/about
- [How to Create Magnetic Content](https://www.webdesignerdepot.com/2018/10/how-to-create-magnetic-content/)
**Pushing to v3** #51
- Resume
- ~~Strengths~~ (pushing to v3)
- ~~Skills~~ (pushing to v3)
- ~~Education~~ (pushing to v3)
- Awards
- ~~Lotus award logo~~ (pushing to v3)
- ~~New York Festivals award logo~~ (pushing to v3)
- ~~2017 INDEX award runner up~~ (pushing to v3)
- ~~BC Home and garden show environmental design rookie award winner~~ (pushing to v3)
- ~~Organizations~~ (pushing to v3)
|
1.0
|
About page design and content development v2.2.0 - **Description**
About page design and content development v2.2.0.
**Features**
- [x] Intro https://github.com/guylepage3/lepage.cc/issues/1#issuecomment-432450364
- [x] Resume
- [x] Bio https://github.com/guylepage3/lepage.cc/issues/1#issuecomment-432422664
- [x] Highlights https://github.com/guylepage3/lepage.cc/issues/1#issuecomment-432423170
- [x] Work experience
- Link to LinkedIn.
**Reference**
- About Preethi Kasireddy – https://www.hashhack.it/about
- [How to Create Magnetic Content](https://www.webdesignerdepot.com/2018/10/how-to-create-magnetic-content/)
**Pushing to v3** #51
- Resume
- ~~Strengths~~ (pushing to v3)
- ~~Skills~~ (pushing to v3)
- ~~Education~~ (pushing to v3)
- Awards
- ~~Lotus award logo~~ (pushing to v3)
- ~~New York Festivals award logo~~ (pushing to v3)
- ~~2017 INDEX award runner up~~ (pushing to v3)
- ~~BC Home and garden show environmental design rookie award winner~~ (pushing to v3)
- ~~Organizations~~ (pushing to v3)
|
non_process
|
about page design and content development description about page design and content development features intro resume bio highlights work experience link to linkedin reference about preethi kasireddy – pushing to resume strengths pushing to skills pushing to education pushing to awards lotus award logo pushing to new york festivals award logo pushing to index award runner up pushing to bc home and garden show environmental design rookie award winner pushing to organizations pushing to
| 0
|
3,650
| 6,688,805,696
|
IssuesEvent
|
2017-10-08 18:49:14
|
econtoolkit/continuous_time_methods
|
https://api.github.com/repos/econtoolkit/continuous_time_methods
|
closed
|
Update description in discretization to point out system is overdetermined but not full rank
|
Stochastic Processes theory
|
In particular, the X matrices defined in the document are of size $(I+1)\times I$ and $(2 I + 1)\times 2I$, but the rank of them is $I$ and $2I$. This means that the overdetermined system can be solved as a linear system. The rank can be checked with an SVD or the `rank` function of matlab. The fact that this is really a system should be pointed out in the document prior to describing it as a LLS problem.
On the other hand, this doesn't necessarily change the solution technique for the system. For dense setups, the LLS approach can be compared to doing a pseudo-inverse or something similar in speed - but I suspect that LLS is close enough.
For sparse systems, we could investigate whether there are any sparse direct solvers that can solve rectangular systems. Iterative methods wouldn't work, but gaussian elimination should. Alternatively it is possible that sparse LLS is the best approach even if the rank is known.
|
1.0
|
Update description in discretization to point out system is overdetermined but not full rank - In particular, the X matrices defined in the document are of size $(I+1)\times I$ and $(2 I + 1)\times 2I$, but the rank of them is $I$ and $2I$. This means that the overdetermined system can be solved as a linear system. The rank can be checked with an SVD or the `rank` function of matlab. The fact that this is really a system should be pointed out in the document prior to describing it as a LLS problem.
On the other hand, this doesn't necessarily change the solution technique for the system. For dense setups, the LLS approach can be compared to doing a pseudo-inverse or something similar in speed - but I suspect that LLS is close enough.
For sparse systems, we could investigate whether there are any sparse direct solvers that can solve rectangular systems. Iterative methods wouldn't work, but gaussian elimination should. Alternatively it is possible that sparse LLS is the best approach even if the rank is known.
|
process
|
update description in discretization to point out system is overdetermined but not full rank in particular the x matrices defined in the document are of size i times i and i times but the rank of them is i and this means that the overdetermined system can be solved as a linear system the rank can be checked with an svd or the rank function of matlab the fact that this is really a system should be pointed out in the document prior to describing it as a lls problem on the other hand this doesn t necessarily change the solution technique for the system for dense setups the lls approach can be compared to doing a pseudo inverse or something similar in speed but i suspect that lls is close enough for sparse systems we could investigate whether there are any sparse direct solvers that can solve rectangular systems iterative methods wouldn t work but gaussian elimination should alternatively it is possible that sparse lls is the best approach even if the rank is known
| 1
|
50,735
| 13,187,702,415
|
IssuesEvent
|
2020-08-13 04:17:25
|
icecube-trac/tix3
|
https://api.github.com/repos/icecube-trac/tix3
|
closed
|
[millipede] low test coverage in some parts (Trac #1250)
|
Migrated from Trac combo reconstruction defect
|
The following files/directories need more test coverage:
millipede/private/millipede/MillipedeFisherMatrixCalculator.cxx
millipede/private/millipede/MuMillipede.cxx
millipede/private/millipede/TauMillipede.cxx
millipede/private/millipede/converter
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1250">https://code.icecube.wisc.edu/ticket/1250</a>, reported by hdembinski and owned by jbraun</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2016-02-25T20:36:43",
"description": "The following files/directories need more test coverage:\n\nmillipede/private/millipede/MillipedeFisherMatrixCalculator.cxx\nmillipede/private/millipede/MuMillipede.cxx\nmillipede/private/millipede/TauMillipede.cxx\n\nmillipede/private/millipede/converter",
"reporter": "hdembinski",
"cc": "",
"resolution": "fixed",
"_ts": "1456432603769641",
"component": "combo reconstruction",
"summary": "[millipede] low test coverage in some parts",
"priority": "normal",
"keywords": "",
"time": "2015-08-20T17:12:24",
"milestone": "",
"owner": "jbraun",
"type": "defect"
}
```
</p>
</details>
|
1.0
|
[millipede] low test coverage in some parts (Trac #1250) - The following files/directories need more test coverage:
millipede/private/millipede/MillipedeFisherMatrixCalculator.cxx
millipede/private/millipede/MuMillipede.cxx
millipede/private/millipede/TauMillipede.cxx
millipede/private/millipede/converter
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1250">https://code.icecube.wisc.edu/ticket/1250</a>, reported by hdembinski and owned by jbraun</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2016-02-25T20:36:43",
"description": "The following files/directories need more test coverage:\n\nmillipede/private/millipede/MillipedeFisherMatrixCalculator.cxx\nmillipede/private/millipede/MuMillipede.cxx\nmillipede/private/millipede/TauMillipede.cxx\n\nmillipede/private/millipede/converter",
"reporter": "hdembinski",
"cc": "",
"resolution": "fixed",
"_ts": "1456432603769641",
"component": "combo reconstruction",
"summary": "[millipede] low test coverage in some parts",
"priority": "normal",
"keywords": "",
"time": "2015-08-20T17:12:24",
"milestone": "",
"owner": "jbraun",
"type": "defect"
}
```
</p>
</details>
|
non_process
|
low test coverage in some parts trac the following files directories need more test coverage millipede private millipede millipedefishermatrixcalculator cxx millipede private millipede mumillipede cxx millipede private millipede taumillipede cxx millipede private millipede converter migrated from json status closed changetime description the following files directories need more test coverage n nmillipede private millipede millipedefishermatrixcalculator cxx nmillipede private millipede mumillipede cxx nmillipede private millipede taumillipede cxx n nmillipede private millipede converter reporter hdembinski cc resolution fixed ts component combo reconstruction summary low test coverage in some parts priority normal keywords time milestone owner jbraun type defect
| 0
|
3,013
| 6,020,529,239
|
IssuesEvent
|
2017-06-07 16:38:37
|
dotnet/corefx
|
https://api.github.com/repos/dotnet/corefx
|
closed
|
System.IO.FileLoadException - Exception from HRESULT: 0x80131040
|
area-System.Diagnostics.Process question
|
Folks,
I added System.Diagnostics.Process in a netstandard 1.6 library and I get this exception:
'System.IO.FileLoadException: 'Could not load file or assembly 'System.Diagnostics.Process, Version=4.1.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference. (Exception from HRESULT: 0x80131040)'
Before add, lib worked fine. I have added this in App.config (I'm into a WPF 4.7 app) and I get same exception:
<dependentAssembly >
<assemblyIdentity name="System.Diagnostics.Process" publicKeyToken="b03f5f7f11d50a3a" culture="neutral" />
<bindingRedirect oldVersion="0.0.0.0-4.1.0.0" newVersion="4.1.0.0" />
</dependentAssembly>
Other test I did:
1. Upgrade System.Diagnostics.Process to 4.3.0 and same app.config
2. Downgrade to netstandard 1.5 both v4.3.0 and 4.1.0
Every step I run nuget locals all -clear and deleted bin and obj folders (all in solution). Desperated I removed System.Diagnostics.Process and tested with System.Console and same results with same tests...
WPF has netstandard 1.6 lib. Tested too netstandard 1.6.1 and same results with same tests ...
Finally I have tested this class library with a new console app (FW 4.7) and same result...
I forgot to tell you, class library and WPF app and console test app are compiled as x64.
[https://drive.google.com/open?id=0B3M-4-kZkv4FSktIbUluVGV1RGs](source with libs)
Please can you help me?
|
1.0
|
System.IO.FileLoadException - Exception from HRESULT: 0x80131040 - Folks,
I added System.Diagnostics.Process in a netstandard 1.6 library and I get this exception:
'System.IO.FileLoadException: 'Could not load file or assembly 'System.Diagnostics.Process, Version=4.1.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference. (Exception from HRESULT: 0x80131040)'
Before add, lib worked fine. I have added this in App.config (I'm into a WPF 4.7 app) and I get same exception:
<dependentAssembly >
<assemblyIdentity name="System.Diagnostics.Process" publicKeyToken="b03f5f7f11d50a3a" culture="neutral" />
<bindingRedirect oldVersion="0.0.0.0-4.1.0.0" newVersion="4.1.0.0" />
</dependentAssembly>
Other test I did:
1. Upgrade System.Diagnostics.Process to 4.3.0 and same app.config
2. Downgrade to netstandard 1.5 both v4.3.0 and 4.1.0
Every step I run nuget locals all -clear and deleted bin and obj folders (all in solution). Desperated I removed System.Diagnostics.Process and tested with System.Console and same results with same tests...
WPF has netstandard 1.6 lib. Tested too netstandard 1.6.1 and same results with same tests ...
Finally I have tested this class library with a new console app (FW 4.7) and same result...
I forgot to tell you, class library and WPF app and console test app are compiled as x64.
[https://drive.google.com/open?id=0B3M-4-kZkv4FSktIbUluVGV1RGs](source with libs)
Please can you help me?
|
process
|
system io fileloadexception exception from hresult folks i added system diagnostics process in a netstandard library and i get this exception system io fileloadexception could not load file or assembly system diagnostics process version culture neutral publickeytoken or one of its dependencies the located assembly s manifest definition does not match the assembly reference exception from hresult before add lib worked fine i have added this in app config i m into a wpf app and i get same exception other test i did upgrade system diagnostics process to and same app config downgrade to netstandard both and every step i run nuget locals all clear and deleted bin and obj folders all in solution desperated i removed system diagnostics process and tested with system console and same results with same tests wpf has netstandard lib tested too netstandard and same results with same tests finally i have tested this class library with a new console app fw and same result i forgot to tell you class library and wpf app and console test app are compiled as source with libs please can you help me
| 1
|
273,567
| 8,536,684,369
|
IssuesEvent
|
2018-11-05 16:00:28
|
rucio/rucio
|
https://api.github.com/repos/rucio/rucio
|
opened
|
Primary Key for replicas table is in wrong order in models.py
|
Core & Internals Priority: High bug
|
Motivation
----------
https://github.com/rucio/rucio/blob/a7b6fef66687f4a673b705eb738dd4d88113edac/lib/rucio/db/sqla/models.py#L736
The PK for the replicas table is in `order rse_id, scope, name` while it actually should be in order `scope, name, rse_id`
This needs to be changed and we need a proper upgrade script so this can be adapted at all installations.
|
1.0
|
Primary Key for replicas table is in wrong order in models.py - Motivation
----------
https://github.com/rucio/rucio/blob/a7b6fef66687f4a673b705eb738dd4d88113edac/lib/rucio/db/sqla/models.py#L736
The PK for the replicas table is in `order rse_id, scope, name` while it actually should be in order `scope, name, rse_id`
This needs to be changed and we need a proper upgrade script so this can be adapted at all installations.
|
non_process
|
primary key for replicas table is in wrong order in models py motivation the pk for the replicas table is in order rse id scope name while it actually should be in order scope name rse id this needs to be changed and we need a proper upgrade script so this can be adapted at all installations
| 0
|
7,645
| 10,738,045,658
|
IssuesEvent
|
2019-10-29 14:10:24
|
allinurl/goaccess
|
https://api.github.com/repos/allinurl/goaccess
|
closed
|
Podcast-related browser list
|
add log-processing
|
I run a Wordpress blog where I also host a podcast. The statistics shows more the 50% of unknown browsers. So I collected a browser list which contains most user agents which are podcast related, i.e. podcast apps, media players, and podcast directories/crawlers (Itunes, Stitcher,...). This highly improved to result.
Maybe you want to integrate this into your repository, list is attached.
Best regards,
Bernhard
[podcast.list.txt](https://github.com/allinurl/goaccess/files/3667422/podcast.list.txt)
|
1.0
|
Podcast-related browser list - I run a Wordpress blog where I also host a podcast. The statistics shows more the 50% of unknown browsers. So I collected a browser list which contains most user agents which are podcast related, i.e. podcast apps, media players, and podcast directories/crawlers (Itunes, Stitcher,...). This highly improved to result.
Maybe you want to integrate this into your repository, list is attached.
Best regards,
Bernhard
[podcast.list.txt](https://github.com/allinurl/goaccess/files/3667422/podcast.list.txt)
|
process
|
podcast related browser list i run a wordpress blog where i also host a podcast the statistics shows more the of unknown browsers so i collected a browser list which contains most user agents which are podcast related i e podcast apps media players and podcast directories crawlers itunes stitcher this highly improved to result maybe you want to integrate this into your repository list is attached best regards bernhard
| 1
|
14,913
| 2,831,390,311
|
IssuesEvent
|
2015-05-24 15:55:10
|
nobodyguy/dslrdashboard
|
https://api.github.com/repos/nobodyguy/dslrdashboard
|
closed
|
HTC DNA Jellybean support with Nikon D7100
|
auto-migrated Priority-Medium Type-Defect
|
```
Not detecting camera with version .18
What steps will reproduce the problem?
1.
2.
3.
What is the expected output? What do you see instead?
What version of the product are you using? On what operating system?
Please provide any additional information below.
```
Original issue reported on code.google.com by `darinsug...@gmail.com` on 23 Mar 2013 at 12:52
|
1.0
|
HTC DNA Jellybean support with Nikon D7100 - ```
Not detecting camera with version .18
What steps will reproduce the problem?
1.
2.
3.
What is the expected output? What do you see instead?
What version of the product are you using? On what operating system?
Please provide any additional information below.
```
Original issue reported on code.google.com by `darinsug...@gmail.com` on 23 Mar 2013 at 12:52
|
non_process
|
htc dna jellybean support with nikon not detecting camera with version what steps will reproduce the problem what is the expected output what do you see instead what version of the product are you using on what operating system please provide any additional information below original issue reported on code google com by darinsug gmail com on mar at
| 0
|
11,735
| 14,576,762,901
|
IssuesEvent
|
2020-12-18 00:16:16
|
pacificclimate/quail
|
https://api.github.com/repos/pacificclimate/quail
|
closed
|
Monthly Maximum 5-day Consecutive Precipitation
|
process
|
## Description
This function takes a climdexInput object as input and computes the climdex index Rx5day: monthly or annual maximum 5-day consecutive precipitation.
## Function to wrap
[`climdex.rx5day`](https://github.com/pacificclimate/climdex.pcic/blob/master/R/climdex.r#L1112)
|
1.0
|
Monthly Maximum 5-day Consecutive Precipitation - ## Description
This function takes a climdexInput object as input and computes the climdex index Rx5day: monthly or annual maximum 5-day consecutive precipitation.
## Function to wrap
[`climdex.rx5day`](https://github.com/pacificclimate/climdex.pcic/blob/master/R/climdex.r#L1112)
|
process
|
monthly maximum day consecutive precipitation description this function takes a climdexinput object as input and computes the climdex index monthly or annual maximum day consecutive precipitation function to wrap
| 1
|
18,930
| 24,884,672,189
|
IssuesEvent
|
2022-10-28 06:32:11
|
scikit-learn/scikit-learn
|
https://api.github.com/repos/scikit-learn/scikit-learn
|
closed
|
LabelEncoder not transforming nans as expected.
|
Bug module:preprocessing
|
### Describe the bug
When fitting a LabelEncoder with a pandas Series that contains a nan, transforming an array containing nans fails, even though nan is one of the LabelEncoder classes.
### Steps/Code to Reproduce
```zsh
>>> from sklearn import preprocessing
>>> import pandas as pd
>>> import numpy as np
>>> le = preprocessing.LabelEncoder()
>>> le.fit(pd.Series(["a", "a", "b", np.nan]))
LabelEncoder()
>>> le.classes_
array(['a', 'b', nan], dtype=object)
>>> le.transform(["a"])
array([0])
>>> le.transform([np.nan])
```
### Expected Results
Instead of the error, I'd expect the output to be `array([2])`
### Actual Results
```zsh
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/vladimirvargas/.local/share/virtualenvs/real-estate-fGt6EZlF/lib/python3.7/site-packages/sklearn/preprocessing/_label.py", line 138, in transform
return _encode(y, uniques=self.classes_)
File "/Users/vladimirvargas/.local/share/virtualenvs/real-estate-fGt6EZlF/lib/python3.7/site-packages/sklearn/utils/_encode.py", line 187, in _encode
diff = _check_unknown(values, uniques)
File "/Users/vladimirvargas/.local/share/virtualenvs/real-estate-fGt6EZlF/lib/python3.7/site-packages/sklearn/utils/_encode.py", line 261, in _check_unknown
if np.isnan(known_values).any():
TypeError: ufunc 'isnan' not supported for the input types, and the inputs could not be safely coerced to any supported types according to the casting rule ''safe''
```
### Versions
```shell
System:
python: 3.7.12 (default, Oct 13 2021, 06:53:03) [Clang 13.0.0 (clang-1300.0.29.3)]
executable: /Users/vladimirvargas/.local/share/virtualenvs/real-estate-fGt6EZlF/bin/python
machine: Darwin-20.6.0-x86_64-i386-64bit
Python dependencies:
pip: 21.2.4
setuptools: 57.4.0
sklearn: 1.0.2
numpy: 1.21.5
scipy: 1.7.3
Cython: None
pandas: 1.3.5
matplotlib: None
joblib: 1.1.0
threadpoolctl: 3.1.0
Built with OpenMP: True
```
|
1.0
|
LabelEncoder not transforming nans as expected. - ### Describe the bug
When fitting a LabelEncoder with a pandas Series that contains a nan, transforming an array containing nans fails, even though nan is one of the LabelEncoder classes.
### Steps/Code to Reproduce
```zsh
>>> from sklearn import preprocessing
>>> import pandas as pd
>>> import numpy as np
>>> le = preprocessing.LabelEncoder()
>>> le.fit(pd.Series(["a", "a", "b", np.nan]))
LabelEncoder()
>>> le.classes_
array(['a', 'b', nan], dtype=object)
>>> le.transform(["a"])
array([0])
>>> le.transform([np.nan])
```
### Expected Results
Instead of the error, I'd expect the output to be `array([2])`
### Actual Results
```zsh
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/vladimirvargas/.local/share/virtualenvs/real-estate-fGt6EZlF/lib/python3.7/site-packages/sklearn/preprocessing/_label.py", line 138, in transform
return _encode(y, uniques=self.classes_)
File "/Users/vladimirvargas/.local/share/virtualenvs/real-estate-fGt6EZlF/lib/python3.7/site-packages/sklearn/utils/_encode.py", line 187, in _encode
diff = _check_unknown(values, uniques)
File "/Users/vladimirvargas/.local/share/virtualenvs/real-estate-fGt6EZlF/lib/python3.7/site-packages/sklearn/utils/_encode.py", line 261, in _check_unknown
if np.isnan(known_values).any():
TypeError: ufunc 'isnan' not supported for the input types, and the inputs could not be safely coerced to any supported types according to the casting rule ''safe''
```
### Versions
```shell
System:
python: 3.7.12 (default, Oct 13 2021, 06:53:03) [Clang 13.0.0 (clang-1300.0.29.3)]
executable: /Users/vladimirvargas/.local/share/virtualenvs/real-estate-fGt6EZlF/bin/python
machine: Darwin-20.6.0-x86_64-i386-64bit
Python dependencies:
pip: 21.2.4
setuptools: 57.4.0
sklearn: 1.0.2
numpy: 1.21.5
scipy: 1.7.3
Cython: None
pandas: 1.3.5
matplotlib: None
joblib: 1.1.0
threadpoolctl: 3.1.0
Built with OpenMP: True
```
|
process
|
labelencoder not transforming nans as expected describe the bug when fitting a labelencoder with a pandas series that contains a nan transforming an array containing nans fails even though nan is one of the labelencoder classes steps code to reproduce zsh from sklearn import preprocessing import pandas as pd import numpy as np le preprocessing labelencoder le fit pd series labelencoder le classes array dtype object le transform array le transform expected results instead of the error i d expect the output to be array actual results zsh traceback most recent call last file line in file users vladimirvargas local share virtualenvs real estate lib site packages sklearn preprocessing label py line in transform return encode y uniques self classes file users vladimirvargas local share virtualenvs real estate lib site packages sklearn utils encode py line in encode diff check unknown values uniques file users vladimirvargas local share virtualenvs real estate lib site packages sklearn utils encode py line in check unknown if np isnan known values any typeerror ufunc isnan not supported for the input types and the inputs could not be safely coerced to any supported types according to the casting rule safe versions shell system python default oct executable users vladimirvargas local share virtualenvs real estate bin python machine darwin python dependencies pip setuptools sklearn numpy scipy cython none pandas matplotlib none joblib threadpoolctl built with openmp true
| 1
|
10,049
| 13,044,161,654
|
IssuesEvent
|
2020-07-29 03:47:25
|
tikv/tikv
|
https://api.github.com/repos/tikv/tikv
|
closed
|
UCP: Migrate scalar function `SubDateStringInt` from TiDB
|
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
|
## Description
Port the scalar function `SubDateStringInt` from TiDB to coprocessor.
## Score
* 50
## Mentor(s)
* @sticnarf
## Recommended Skills
* Rust programming
## Learning Materials
Already implemented expressions ported from TiDB
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr)
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
|
2.0
|
UCP: Migrate scalar function `SubDateStringInt` from TiDB -
## Description
Port the scalar function `SubDateStringInt` from TiDB to coprocessor.
## Score
* 50
## Mentor(s)
* @sticnarf
## Recommended Skills
* Rust programming
## Learning Materials
Already implemented expressions ported from TiDB
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr)
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
|
process
|
ucp migrate scalar function subdatestringint from tidb description port the scalar function subdatestringint from tidb to coprocessor score mentor s sticnarf recommended skills rust programming learning materials already implemented expressions ported from tidb
| 1
|
2,468
| 5,243,527,228
|
IssuesEvent
|
2017-01-31 20:57:36
|
jlm2017/jlm-video-subtitles
|
https://api.github.com/repos/jlm2017/jlm-video-subtitles
|
closed
|
[Subtitles] [FR] "COMMENT FINANCER LE REMBOURSEMENT À 100% DES SOINS DE SANTÉ PRESCRITS ?"
|
Language: French Process: [6] Approved
|
# Video title
COMMENT FINANCER LE REMBOURSEMENT À 100% DES SOINS DE SANTÉ PRESCRITS ?
# URL
https://youtu.be/p4WrCjgCbaA
# Youtube subtitles language
FRANÇAIS
# Duration
4'36
# Subtitles URL
https://www.youtube.com/timedtext_editor?bl=watch&action_mde_edit_form=1&ui=hd&v=p4WrCjgCbaA&ref=wt&lang=fr&tab=captions
|
1.0
|
[Subtitles] [FR] "COMMENT FINANCER LE REMBOURSEMENT À 100% DES SOINS DE SANTÉ PRESCRITS ?" - # Video title
COMMENT FINANCER LE REMBOURSEMENT À 100% DES SOINS DE SANTÉ PRESCRITS ?
# URL
https://youtu.be/p4WrCjgCbaA
# Youtube subtitles language
FRANÇAIS
# Duration
4'36
# Subtitles URL
https://www.youtube.com/timedtext_editor?bl=watch&action_mde_edit_form=1&ui=hd&v=p4WrCjgCbaA&ref=wt&lang=fr&tab=captions
|
process
|
comment financer le remboursement à des soins de santé prescrits video title comment financer le remboursement à des soins de santé prescrits url youtube subtitles language français duration subtitles url
| 1
|
166,182
| 6,292,464,980
|
IssuesEvent
|
2017-07-20 05:46:30
|
pytorch/pytorch
|
https://api.github.com/repos/pytorch/pytorch
|
closed
|
.multinomial parameter names differ between torch and torch.autograd.Variable
|
bug medium priority
|
Hello,
the parameter name for with replacement differs between `torch.multinomial` and `torch.autograd.Variable.multinomial` (the former has `replacement`, the latter `with_replacement`).
This could be harmonized.
Internally some bits of the python and C code also seem to use with_replacement, but I don't know whether that is important.
Thank you!
Best regards
Thomas
|
1.0
|
.multinomial parameter names differ between torch and torch.autograd.Variable - Hello,
the parameter name for with replacement differs between `torch.multinomial` and `torch.autograd.Variable.multinomial` (the former has `replacement`, the latter `with_replacement`).
This could be harmonized.
Internally some bits of the python and C code also seem to use with_replacement, but I don't know whether that is important.
Thank you!
Best regards
Thomas
|
non_process
|
multinomial parameter names differ between torch and torch autograd variable hello the parameter name for with replacement differs between torch multinomial and torch autograd variable multinomial the former has replacement the latter with replacement this could be harmonized internally some bits of the python and c code also seem to use with replacement but i don t know whether that is important thank you best regards thomas
| 0
|
20,402
| 27,061,900,459
|
IssuesEvent
|
2023-02-13 20:28:44
|
cse442-at-ub/project_s23-one-belt-one-road
|
https://api.github.com/repos/cse442-at-ub/project_s23-one-belt-one-road
|
opened
|
Sample: Create a randomized URL generator
|
Processing Task
|
**Task Tests**
*Test1*
1) Call function in 'utils.php' named 'getURL' with arguments ....
|
1.0
|
Sample: Create a randomized URL generator - **Task Tests**
*Test1*
1) Call function in 'utils.php' named 'getURL' with arguments ....
|
process
|
sample create a randomized url generator task tests call function in utils php named geturl with arguments
| 1
|
21,753
| 30,271,382,455
|
IssuesEvent
|
2023-07-07 15:36:53
|
USGS-WiM/StreamStats
|
https://api.github.com/repos/USGS-WiM/StreamStats
|
opened
|
BP: add stream grids for download
|
Batch Processor
|
The current Batch Processor has a list of downloadable stream grids: https://streamstatsags.cr.usgs.gov/StreamGrids/directoryBrowsing.asp
We need to provide these stream grids in the new BP.
|
1.0
|
BP: add stream grids for download - The current Batch Processor has a list of downloadable stream grids: https://streamstatsags.cr.usgs.gov/StreamGrids/directoryBrowsing.asp
We need to provide these stream grids in the new BP.
|
process
|
bp add stream grids for download the current batch processor has a list of downloadable stream grids we need to provide these stream grids in the new bp
| 1
|
21,683
| 30,174,321,190
|
IssuesEvent
|
2023-07-04 02:10:47
|
hsmusic/hsmusic-wiki
|
https://api.github.com/repos/hsmusic/hsmusic-wiki
|
opened
|
Certain listings don't perform an initial non-arbitrary list
|
type: bug (user-facing) scope: data processing thing: listings
|
I need to go over the current listing-spec and identify which, but in general, if a listing sorts by some arbitrary value wherein two data objects may be associated with the same value, then their relative positioning is going to be retained from whatever came before. If we don't perform an initial, *non-arbitrary* sort (such as `sortAlphabetically`), then this sorting is arbitrary and exposes the internal sorting order, which is a no-go.
|
1.0
|
Certain listings don't perform an initial non-arbitrary list - I need to go over the current listing-spec and identify which, but in general, if a listing sorts by some arbitrary value wherein two data objects may be associated with the same value, then their relative positioning is going to be retained from whatever came before. If we don't perform an initial, *non-arbitrary* sort (such as `sortAlphabetically`), then this sorting is arbitrary and exposes the internal sorting order, which is a no-go.
|
process
|
certain listings don t perform an initial non arbitrary list i need to go over the current listing spec and identify which but in general if a listing sorts by some arbitrary value wherein two data objects may be associated with the same value then their relative positioning is going to be retained from whatever came before if we don t perform an initial non arbitrary sort such as sortalphabetically then this sorting is arbitrary and exposes the internal sorting order which is a no go
| 1
|
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.