631897515
Chapter 05 (dev): MEI examples to be rendered in Verovio are not shown

For some reason, in the development version of the guidelines, the MEI examples that I included in the "mei/dev/" folder are not showing. Only the first example, "notes_rests.mei", is shown; all following ones are not displayed (neither the MEI encoding nor the Verovio rendition of it). On my local machine, all examples do render (see link). I don't know why this is happening. I have cloned this exact same repo, and while the examples are displayed on my local machine, they are not displayed on the website. Also, the font for the examples is different; the one shown on my local machine makes the MEI encoding easier to read (see the text in bold and the different colors).

Without having had a closer look at the problem itself: the CSS for the Guidelines is provided in the https://github.com/music-encoding/music-encoding.github.io repo, which admittedly is a nightmare to debug and should be changed ASAP. This probably doesn't explain why examples don't show up, but it could explain why things look different, assuming that you have a local copy of that other repo running locally… (@kepper I feel worse every day about the guidelines setup, which was my proposal. Please get rid of it as soon as you can.)

I'll have a look at the Verovio stuff. We can go back to static images if this is preferred.

@kepper, that could be the case for the fonts, since I do have the website's GitHub repo also cloned (well, a fork of it, but same thing).

@lpugin, I love the Verovio stuff! I certainly prefer this over static images because it is good to see that the encodings can actually be rendered somewhere (here, Verovio). It would also be good to have the same thing as in the tutorials (or on the Verovio website), where you can click on "show full encoding" as well.

@lpugin, no worries.
You had good arguments, which apparently convinced us all at some point ;-) But after a couple of years of operating this setup, we have a better picture now, and see that the benefits come at a cost we didn't foresee back then. We'll change that situation now, but it's already clear that every setup will have its own issues…

It should be fixed. The problem came from the capital letters in the file names. GitHub servers make the distinction between upper and lower case. Another reason to move away from the current setup ;-) Until then, we need to always use lowercase filenames.

Thank you, Laurent!

Could you please document somewhere which files have to stick to the lowercase naming scheme?
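The lowercase-only workaround above is easy to regress on. As an illustration (not part of the guidelines repo), a minimal Python check that walks a folder and flags filenames containing uppercase letters, which would break on a case-sensitive server:

```python
import os

def find_mixed_case_files(root):
    """Return file paths under `root` whose names contain uppercase letters.

    On a case-sensitive server, a file committed as 'Notes_Rests.mei' will
    not be found when referenced as 'notes_rests.mei', so flagging these
    early avoids the breakage described above.
    """
    offenders = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name != name.lower():
                offenders.append(os.path.join(dirpath, name))
    return sorted(offenders)
```

Running this over the examples folder before pushing would catch any filename that violates the lowercase rule.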
gharchive/issue
2020-06-05T19:49:00
2025-04-01T06:45:05.367830
{ "authors": [ "bwbohl", "kepper", "lpugin", "martha-thomae" ], "repo": "music-encoding/guidelines", "url": "https://github.com/music-encoding/guidelines/issues/175", "license": "ECL-2.0", "license_type": "permissive", "license_source": "github-api" }
82061716
staffDef allowed as child of measure:

From zolae...@gmail.com on July 04, 2014 09:04:26

staffDef is actually allowed as a child of measure:

<measure>
  <staffDef label="B" n="1"/>
  <staff n="1"/>
</measure>

Is this intentional? The Guidelines ("is allowed only within <staffGrp>, <staff> and <layer>.") and also Issue 86 suggest it's not.

Original issue: http://code.google.com/p/music-encoding/issues/detail?id=195

From pd...@virginia.edu on July 09, 2014 14:08:01

It is intentional. <staffDef> is also allowed inside <staff>. It occurs in so many places so that MEI can offer multiple encoding possibilities (and therefore accommodate multiple software designs). If this is a feature that isn't needed in a particular notational repertoire, say CMN, then the CMN module should be redesigned to enforce this restriction. The "generic" MEI model shouldn't be changed until it can be determined that a particular feature is never needed. Along with modifying the CMN (or mensural or neumes) module, we should also mount a campaign to encourage broader use of the RNG schema associated with the repertoire and discourage the use of mei-all. There are probably several other instances in which this general approach could/should be employed.

From andrew.hankinson on September 12, 2014 03:38:20

Status: WontFix

From zolae...@gmail.com on September 12, 2014 03:47:10

My question targets the discrepancy between the documentation and the actual behaviour (I didn't try to suggest that it shouldn't be allowed somewhere). In this case I think the relevant passage of the documentation should be updated to reflect that staffDef isn't only allowed "within <staffGrp>, <staff> and <layer>" but also in other places.
gharchive/issue
2015-05-28T18:12:47
2025-04-01T06:45:05.374290
{ "authors": [ "ahankinson" ], "repo": "music-encoding/music-encoding", "url": "https://github.com/music-encoding/music-encoding/issues/195", "license": "ECL-2.0", "license_type": "permissive", "license_source": "github-api" }
1794592319
Cannot upload files whose path contains emoji

As the title says.

[23-07-08 09:16:18] ERROR:main.py:<module>:155 | [358] Upload failed '/pic1/Pictures/1/Happy weekend _ Ahri🦊_89946644.jpg'
Traceback (most recent call last):
  File "/app/main.py", line 146, in <module>
    fs_id, md5, server_filename, category, rpath, isdir = bdy.finall_upload_file(ept_file_path, path_conf['remote'])
  File "/app/utils/bdyUpd.py", line 290, in finall_upload_file
    return self.create(remote_path, size, block_list, uploadid)
  File "/app/utils/bdyUpd.py", line 226, in create
    raise Exception(f"err! {res_data}")
Exception: err! {'errno': -7, 'path': '', 'request_id': <removed to avoid privacy issues>}

According to Baidu Netdisk's error codes, -7 means an invalid file or directory name, so the emoji is to blame.

https://github.com/musnows/encrypt2bdy/commit/7b541ed08c4ccd047ca4c1a42073f98dcd9826c4
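One workaround for this class of failure is to strip emoji from the remote path before calling the upload API. A sketch (this is not the project's actual fix, and the character ranges covered here are illustrative, not exhaustive):

```python
import re

# Code points outside the Basic Multilingual Plane plus a couple of common
# emoji-related ranges; real emoji detection would need a fuller table.
EMOJI_RE = re.compile(r"[\U00010000-\U0010FFFF\u2600-\u27BF\uFE0F]")

def sanitize_remote_path(path, replacement=""):
    """Remove emoji from a path destined for an API that rejects them
    with errno -7 (invalid file/directory name)."""
    return EMOJI_RE.sub(replacement, path)
```

Applied to the failing path above, the fox emoji is dropped and the rest of the filename survives unchanged.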
gharchive/issue
2023-07-08T01:25:53
2025-04-01T06:45:05.377515
{ "authors": [ "musnows" ], "repo": "musnows/encrypt2bdy", "url": "https://github.com/musnows/encrypt2bdy/issues/1", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2153103102
How to set the vault location? How do I set the "extension options"? Is the "vault name" a path? I tested the settings, but they didn't work. For example, my vault path is "D:\work\ObsidianLib\tech", and I want to clip articles to the directory "09-inbox" under the vault path. How should I set the "extension options"? Can it be used on an Android phone?

The name of your Obsidian vault should be tech, while in the directory input you should put 09-inbox/{title}. It can't be used on an Android phone.

The extension doesn't save the page's content to your vault. If you wish to add content to the note, you should do it manually. That's its intended usage. If you wish to save a full page of content by clicking somewhere, you should probably try some other extension.
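The directory input works as a template, with {title} replaced by the clipped page's title. A rough Python sketch of how such an expansion might behave (the sanitization rule here is an assumption, not the extension's documented behavior):

```python
def build_note_path(directory_template, title):
    """Expand a clipper-style directory template such as '09-inbox/{title}'.

    Characters that are illegal in filenames on common platforms are
    dropped from the title; '{title}' is the only placeholder assumed here.
    """
    safe_title = "".join(c for c in title if c not in '\\/:*?"<>|')
    return directory_template.replace("{title}", safe_title)
```

So with the vault named tech and the directory input 09-inbox/{title}, a clip titled "My Article" would land at 09-inbox/My Article inside the vault.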
gharchive/issue
2024-02-26T02:04:41
2025-04-01T06:45:05.416199
{ "authors": [ "murphychengdu", "mvavassori" ], "repo": "mvavassori/obsidian-web-clipper", "url": "https://github.com/mvavassori/obsidian-web-clipper/issues/10", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1287369843
--rm added twice

When I run the command:

x11docker -it --xc=no --debug alpine:latest

Part of the output is:

DEBUGNOTE[13:32:14,687]: podman command (rootless yes): /usr/bin/podman run \
  --rm \
  --pull never \
  --rm \
  --interactive \
  --tty \
  --name x11docker_X142_alpine-latest_23131369578 \
  --user 1000:1000 \
  --userns=keep-id \
  --group-add 1000 \
  --runtime='crun' \
  --network none \
  --cap-drop ALL \
  --cap-add CHOWN \
  --security-opt no-new-privileges \
  --security-opt label=type:container_runtime_t \
  --mount type=bind,source='/usr/bin/catatonit',target='/usr/local/bin/init',readonly \
  --tmpfs /run:exec \
  --tmpfs /run/lock \
  --tmpfs /tmp \
  --mount type=bind,source='/home/user/.cache/x11docker/23131369578-alpine-latest/share',target='/x11docker' \
  --workdir '/tmp' \
  --entrypoint env \
  --env 'container=podman' \
  --env 'WAYLAND_DISPLAY=' \
  --env 'USER=user' \
  -- alpine:latest /usr/local/bin/init -- /bin/sh - /x11docker/containerrc

Notice that --rm is added twice to the generated podman run command. It doesn't break anything right now; it's a purely cosmetic issue. Still, it's odd that it happens.

Indeed, thank you for pointing this out! I've removed one --rm in the latest commit.
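An alternative to removing the duplicated flag at its source is to de-duplicate when assembling the argument list. A naive Python sketch (not x11docker's actual fix; note it only handles valueless flags, and flags that may legitimately repeat, like --security-opt, would need an allowlist):

```python
def dedupe_flags(args):
    """Drop repeated boolean flags (e.g. '--rm') while preserving order.

    Tokens that don't start with '--', or that carry an inline '=value',
    are passed through untouched in this simple sketch.
    """
    seen = set()
    result = []
    for arg in args:
        if arg.startswith("--") and "=" not in arg:
            if arg in seen:
                continue  # skip the duplicate flag
            seen.add(arg)
        result.append(arg)
    return result
```

Run against the podman command above, the second --rm is silently dropped while every other argument keeps its position.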
gharchive/issue
2022-06-28T13:40:14
2025-04-01T06:45:05.418376
{ "authors": [ "a1346054", "mviereck" ], "repo": "mviereck/x11docker", "url": "https://github.com/mviereck/x11docker/issues/451", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
2224299774
Python API/CLI does not include precmz or SMILES/compound name in results_df

Hey, I was wondering why the results_df from the Python API does not include precursor masses or SMILES like in the sandbox, even though they are defined in the .mgf file. I tested it with a GNPS library dataset as well as with an in-house dataset and followed the documentation for running a query. The query works fine.

The sandbox does extra enrichment that the basic querying infrastructure does not.
gharchive/issue
2024-04-04T02:16:26
2025-04-01T06:45:05.419766
{ "authors": [ "j-a-dietrich", "mwang87" ], "repo": "mwang87/MassQueryLanguage", "url": "https://github.com/mwang87/MassQueryLanguage/issues/244", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
243132728
Problem with JS not working on Heroku

When including gem 'gentelella-rails', JS works fine locally but stops working on Heroku. I followed all the instructions. Using Rails 5.1.2. Works fine if I remove the gentelella-custom.js file.

I have the same problem; I opened an issue because on Heroku I cannot explore the files.

Your Rails version?

I got it working fine on Rails 5.0.1, but it crashes on 5.1.2.

Hmmm... this may be a 5.1 issue. It may be that some of the gems utilized are not 5.1 ready, or it may be that this gem's not 5.1 ready. I'll try to find some time to dig in and see. In any event, I need to update the README to clarify the role of gentelella-custom.js -- this really is just where you'd load anything else you need for your specific project, but it ships (perhaps erroneously) with an example of what gentelella-custom can be... which is all the JS specific to making your own project work. In this case, it was the custom JS supplied by the Gentelella theme itself.

@Talha5 5.1! I'll keep in touch on this issue to get any news about it. If you can fix that, it will be awesome. Thanks.

@Miccighel and @Talha5 I set up a demo project that loads Rails 5.0.1 and Rails 5.1.2 and both work locally. Can either of you try deploying the demo to Heroku after confirming it works locally for you? The demo with working examples: https://github.com/mwlang/gentelella-rails-demo

I'm using Rails 5.1 and Ruby 2.3.3. I had to use the following setting for the precompile to work:

Rails.application.config.assets.precompile += %w( *.js *.css )

But now, I'm having another kind of error:

I, [2017-07-21T03:12:29.041916 #6274] INFO -- : Writing /Users/bruno/Projects/indeva/public/assets/date/da-DK-6c551b4dc9297b663c23dc2337fed6edbd7caea67b3869843e4fefa8dd93aeb3.js.gz
rake aborted!
Encoding::InvalidByteSequenceError: "\xE7" followed by "a" on UTF-8

Thanks!
@mwlang I was also successfully running it on localhost by adding Rails.application.config.assets.paths << Rails.root.join('node_modules'), but the JS wasn't working when deployed to Heroku.

@b-mandelbrot: That date/da-DK.js failing on an invalid byte sequence seems to be a clue that there's a problem encoding and compressing the JS. However, I'm able to follow your steps and precompile everything on the demo Rails project. Can you try to reproduce this with the demo?

@Talha5 node_modules could contain anything. Are you getting the same error writing da-DK-[fingerprint].js.gz? That is, the Encoding::InvalidByteSequenceError that @b-mandelbrot reports?

@mwlang I've tested with the Rails 5.1.x demo application, but it also doesn't work in production (local/Heroku), and when adding the line (config.assets.precompile) I have the same encoding error. I think there is a problem with datejs-rails; see: https://github.com/datejs-utf8/datejs-rails

@b-mandelbrot I'm able to reproduce now locally. It's the DateJS library that's included via the datejs-rails gem. Trying to see if I can resolve the issue.

Turned out there were a total of three separate issues to resolve along the way.

1st: moment.js is required for fullcalendar. This has been added to the gem's asset pipeline. Why it didn't cause issues in development mode is a mystery!

2nd: It wasn't enough to add just "gentelella.js" to the gem's precompile list to fully build out the asset pipeline. This was prompting everyone to add *.js and *.css to their project's precompile list.

3rd: datejs-rails is dormant and has encoding issues that have been unresolved since 2014, so I switched to Bower to get the latest sources.

To fix your project: run bundle update gentelella-rails to get to the latest release of the gem (0.1.8 at the time of this note), and REMOVE Rails.application.config.assets.precompile += %w( *.js *.css ) from your Rails project.
If this line is left in, you'll start to see a whole new set of issues when it gets to the bootstrap-sass gem. The long and short of it: adding that line to the gem pulled in exactly the right *.js and *.css that the gem depends on to work. Adding it to the Rails project pulls in every *.js and *.css of every gem in the Rails app, and this gets messy quick!

Here's a working Heroku deploy: https://stormy-shore-66376.herokuapp.com/index.html

This is just the [Rails 5.1.x example](https://github.com/mwlang/gentelella-rails-demo/tree/master/examples/rails.5.1.x) tweaked to use the 'pg' gem. I will likely delete the Heroku dyno in a couple of weeks, so check it out while you can.

@mwlang Perfect! I tested on my project and everything is working fine, including Heroku. Thank you very much!
gharchive/issue
2017-07-14T23:07:44
2025-04-01T06:45:05.443595
{ "authors": [ "Miccighel", "Talha5", "b-mandelbrot", "mwlang" ], "repo": "mwlang/gentelella-rails", "url": "https://github.com/mwlang/gentelella-rails/issues/9", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
121983848
Failed to install Boxstarter

The error message I get is:

Boxstarter not installed. An error occurred during installation: The remote name could not be resolved: 'packages.chocolatey.org'
The install of Boxstarter was NOT successful.
Boxstarter not installed. An error occurred during installation: The remote name could not be resolved: 'packages.chocolatey.org'

Interesting. This is the endpoint that hosts Chocolatey packages. It seems like your system may be having DNS resolution issues. You might try it again later. Are you able to ping the name and resolve it to an IP?
gharchive/issue
2015-12-14T06:39:01
2025-04-01T06:45:05.446391
{ "authors": [ "mwrock", "woshichuanqilz" ], "repo": "mwrock/boxstarter", "url": "https://github.com/mwrock/boxstarter/issues/129", "license": "apache-2.0", "license_type": "permissive", "license_source": "bigquery" }
891539293
What are we sending in response to !tech command? Should we just be sending the title? Is there something more we can add to make things more interesting? Example title: Aircraft Engine Icing Event Avoidance and Mitigation Unfortunately Twitch does not seem to allow sending images (even though they are really cool). I might send the link to the image. The only problem with that is the length. Maybe I could make it tiny. Example with title and image tiny URL: Aircraft Engine Icing Event Avoidance and Mitigation https://tinyurl.com/yf4oa2ab Are there any hard requirements about which details I return? Closing this, because it is not consequential enough to warrant a question.
gharchive/issue
2021-05-14T02:24:39
2025-04-01T06:45:05.459561
{ "authors": [ "mxkay" ], "repo": "mxkay/techtrans-bot", "url": "https://github.com/mxkay/techtrans-bot/issues/16", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1041944813
chore: add verbose mode to downloader Fixes https://github.com/mxschmitt/playwright-go/issues/144 Not sure if we should add verbose: bool vs. Logger: someLoggerInterface Not sure if we should add verbose: bool vs. Logger: someLoggerInterface Verbose is enough for now, logging can be added later on. @mxschmitt merge Unrelated to this PR, the following test is flaky: === RUN TestShouldReportResponseHeadersArray network_test.go:161: Error Trace: network_test.go:161 Error: Not equal: expected: "a=b\nc=d" actual : "c=d\na=b" Diff: --- Expected +++ Actual @@ -1,2 +1,2 @@ +c=d a=b -c=d Test: TestShouldReportResponseHeadersArray --- FAIL: TestShouldReportResponseHeadersArray (0.24s)
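The flaky assertion above fails only because the two multi-value header strings list the same pairs in a different order. A small Python sketch of the order-insensitive comparison that would make such a test deterministic (an illustration, not the playwright-go test code):

```python
def headers_equal(a, b):
    """Compare newline-joined multi-value header strings ignoring order.

    'a=b\nc=d' and 'c=d\na=b' carry the same header values, so the test
    should treat them as equal rather than comparing the raw strings.
    """
    return sorted(a.split("\n")) == sorted(b.split("\n"))
```

With this normalization, both orderings seen in the flaky run compare equal, while a genuinely different value still fails.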
gharchive/pull-request
2021-11-02T07:00:29
2025-04-01T06:45:05.462054
{ "authors": [ "kumaraditya303", "mxschmitt" ], "repo": "mxschmitt/playwright-go", "url": "https://github.com/mxschmitt/playwright-go/pull/231", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
133260651
Tagline The tagline of this boilerplate up to v2 was "Quick setup for performance orientated, offline-first React.js applications featuring Redux, hot-reloading, PostCSS, react-router, ServiceWorker, AppCache, FontFaceObserver and Mocha" That's a nice, short description, which is reinforced in the README and other documentation. With v3 though, we've made some big changes and the above tagline doesn't really describe the focus of this boilerplate anymore. We need a new one. The main ideological change is that the boilerplate now focusses on being the best foundation to start your big applications with. We've adopted a new structure and made a few other improvements that help massive applications manage their scale: We've added tools such as reselect, ImmutableJS and react-pure-render to help manage frontend performance for large-scale applications. We've added improvements to the usability of the boilerplate itself, which includes remote testing with npm run serve, generating containers, components, routes etc. with npm run generate, etc. We made the boilerplate bulletproof, fixing dozens of issues that can only be found with real world usage. We've made the webpack configuration even better with chunking (code splitting) and more fixes. (see #123 for the full list of changes) I'm really not sure what the focus of our "branding" of v3 should be. What do you think? We should probably also highlight our focus on giving the users the best development experience possible with hot-reloading, generators etc. This boilerplate makes it easy and fast to build great applications, maybe that should be the main focus of the tagline? First possible idea: Kickstart your new project with the best development experience, a highly scalable and solid foundation and a focus on performance and offline access. That's not very catchy though... I love where this is going! Your bold description really caught my attention: ...the best foundation to start your big applications with... 
I personally prefer something more succinct, the tagline doesn't need to over-sell. Maybe something like: A foundation for big applications A foundation for developers writing big applications A performance-oriented, developer-friendly foundation for serious apps Next-level developer experience for great applications :smile: We need to show people through the tagline why they should choose react-boilerplate over any of the other starter projects. It's the first (and maybe only) thing they see of it, before they see anything else. Saying "A foundation for big applications" is fair and probably the most honest tagline one could have, but does it really intrigue you enough to click on the link to it? I don't think so... What about this: Start your next react project in seconds A scalable foundation with the best DX and a focus on performance and best practices Adding offline-first back in, slight wording change... Start your next react project in seconds A highly scalable, offline-first foundation with the best DX and a focus on performance and best practices Liking it: punchy, succinct, appealing. I can't think of a way to make this better, if nobody has any suggestions I'll close this issue and add the tagline everywhere tomorrow! My only reservation is around the "scalable" statement: what makes it so? How is it highly so? Can the value be more explicitly/clearly expressed? Hmm, good point. It is highly scalable because it should be able to scale to a massively sized frontend. (i.e. 1000's of components and pages) The scalability mostly refers to the architecture we chose, so maybe adding that in makes it better? It seems very long now though: An offline-first foundation with a highly scalable architecture, the best DX and a focus on performance and best practices I guess my issue with "scalable" is that it's a term more often used in the context of infrastructure than architecture... 
What about "An offline-first foundation for apps of any size, with a focus on DX, performance and best practice" I kind of like the "A highly scalable, offline first foundation..." though, it has a really nice ring and sounds very appealing – though it might just be my non-native ears. Added to the README!
gharchive/issue
2016-02-12T14:54:53
2025-04-01T06:45:05.473027
{ "authors": [ "mxstbr", "oliverturner", "willklein" ], "repo": "mxstbr/react-boilerplate", "url": "https://github.com/mxstbr/react-boilerplate/issues/161", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
174644334
Incorrect debug line in stacktrace when running tests

Issue Type

[x] Bug (https://github.com/mxstbr/react-boilerplate/blob/master/.github/CONTRIBUTING.md#bug-reports)
[ ] Feature (https://github.com/mxstbr/react-boilerplate/blob/master/.github/CONTRIBUTING.md#feature-requests)

Description

I set up a really simple component with a test to demonstrate this issue. I've also seen this issue when running tests on other components, and the common theme is that it always seems to point at line 9: http://imgur.com/a/PCpZq

I've done some googling and thought the issue might be related to using Istanbul code coverage with Karma, but turning off code coverage doesn't seem to affect this at all. https://github.com/karma-runner/karma/issues/1141

Steps to reproduce

git clone --depth=1 https://github.com/mxstbr/react-boilerplate.git
npm run setup
npm run clean
Create a new component via npm run generate (code below)
Remove the linting step from pretest in package.json for convenience
Run the test via npm run test:watch -- --grep "TestComponent"

TestComponent:

/**
 *
 * TestComponent
 *
 */

import React from 'react';

function TestComponent() {
  garbageToTriggerAnError
  return (
    <h1>hi</h1>
  );
}

export default TestComponent;

TestComponent test:

import TestComponent from '../index';
import expect from 'expect';
import { shallow } from 'enzyme';
import React from 'react';

describe('<TestComponent />', () => {
  it('renders a <TestComponent>', () => {
    console.log('here');
    const renderedComponent = shallow(
      <TestComponent />
    );
  });
});

Versions

Node/NPM: 6.5 / 3.10.3
Browser: Chrome 51.0.2704
Kubuntu 16.04

We switched to a new code coverage method in the dev branch. Please try it there, see if the issue still exists, and comment again here. Thanks!

Yep, this appears to be fixed on the latest dev branch (e83b1b29a39f5746a038bd6f8cd47f795cab8420)! Many thanks for the quick reply and for making a great boilerplate!
On a slightly unrelated note, if the build is passing for the dev branch, can we reasonably assume it's stable enough for use? @jwyuen That is my assumption. But then I'm prepared to do a lot of testing and deal with issues by rolling back commits, etc. So far, I haven't encountered too many issues. There have been some, unfortunately I have not had the time to distill them down an issue I felt good about posting. So sure, use the dev branch, but understand that, if you do, you are one of the beta testers :wink:
gharchive/issue
2016-09-01T22:53:03
2025-04-01T06:45:05.481057
{ "authors": [ "gihrig", "jwyuen", "mxstbr" ], "repo": "mxstbr/react-boilerplate", "url": "https://github.com/mxstbr/react-boilerplate/issues/945", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
1837817546
Do not change dir - 221 [Auto generated] Number: [#221] github-webhook-server[PR 221]: Closing issue for PR: Do not change dir. PR was closed.
gharchive/issue
2023-08-05T15:48:26
2025-04-01T06:45:05.496885
{ "authors": [ "myakove" ], "repo": "myakove/github-webhook-server", "url": "https://github.com/myakove/github-webhook-server/issues/222", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
2739928485
Potential Expression Injection Vulnerability

Problem Statement

In the MyBatis framework, the ExpressionEvaluator module uses OGNL to evaluate the values of expressions and handle the returned results. However, this functionality may be exploited by attackers to inject carefully crafted malicious expressions, enabling attacks such as remote code execution.

A vulnerable code example:

package org.apache.ibatis.scripting.xmltags;

public class ExpressionEvaluator {
  public boolean evaluateBoolean(String expression, Object parameterObject) {
    Object value = OgnlCache.getValue(expression, parameterObject);
    if (value instanceof Boolean) {
      return (Boolean) value;
    }
    if (value instanceof Number) {
      return new BigDecimal(String.valueOf(value)).compareTo(BigDecimal.ZERO) != 0;
    }
    return value != null;
  }
}

MyBatis version: <= 3.5.17

Steps to reproduce

Considering the security implications, I just provide the following test case as an example to reproduce the attack.

class ExpressionEvaluatorTest {
  private final ExpressionEvaluator evaluator = new ExpressionEvaluator();

  @Test
  void shouldReturnFalseIfZero() {
    String query = "@javax.naming.InitialContext@doLookup('ldap://127.0.0.1:8087/Evil')";
    boolean value = evaluator.evaluateBoolean(query, new Author(0, "cbegin", null, "cbegin@apache.org", "N/A", Section.NEWS));
    assertFalse(value);
  }
}

Vulnerability Impact

Remote Command Execution (RCE), such as the invocation of the calculator application.

Hello @qxyuan853,

So, this also assumes that MyBatis is used in an application that has a vulnerability allowing an attacker to execute arbitrary code using any class in the app's dependencies, am I right? I honestly am not sure if it's possible to protect such applications comprehensively, but what do you propose? FYI, OGNL provides a custom security manager that prevents some risky calls.
For JNDI lookup particularly, the JDK provides NamingManager to prevent such attacks, it seems (I've never tried it, so please correct me if this statement is inaccurate). We also encourage developers to set up a JEP-290 serialization filter. #2079

Hi harawata, I fully agree that achieving complete protection can be quite challenging. That said, would it be possible to provide an interface/optional configuration that simplifies security customization for users? For example, allowing them to define custom blacklists or whitelists to restrict the methods that can be invoked during expression parsing. In addition, from what we've observed, most users tend to stick with the default configuration. Therefore, adding default blacklist and whitelist restrictions might be a helpful enhancement. Otherwise, the OGNL parsing functionality in MyBatis could be more easily exploited by attackers to carry out malicious actions. Thank you for your continued attention to this matter.

Hi harawata, I noticed that a similar vulnerability also previously existed in Thymeleaf (https://github.com/thymeleaf/thymeleaf), where they addressed it by using a combination of blacklists and whitelists to restrict methods that can be called on Class objects from expressions.
Perhaps you could refer to their defense strategy, e.g., https://github.com/thymeleaf/thymeleaf/commit/2f1298367fd73caa0f5121803d682bfc3a77c3ce

The commit adds restriction to Thymeleaf's internal expression language. If you want OGNL to provide similar restriction, you need to propose it to OGNL, not to MyBatis. If the attacker can call OgnlCache#getValue(), they should be able to call Ognl#getValue(), so I don't see the point in protecting OgnlCache, to be honest. And, please educate the developers you know about security measures like JEP-290 if they haven't used them. If developers use user input directly, for example, there is not much we (= framework/library developers) can do to protect their app.

Hi harawata, it seems like there might be some misunderstanding. Thymeleaf also uses Ognl#getValue to evaluate expressions, and the issue lies in restricting the methods that can be invoked during the evaluation process. Furthermore, in practice, the security measures of JEP-290 can also be bypassed in various ways.

In MyBatis, a risky string is not evaluated by OGNL unless 1) the app developer passes the risky string directly to an OGNL expression in their app, or 2) there is some library that has a vulnerability allowing an attacker to execute arbitrary code. Are you suggesting that we should take care of 1)?

However, it seems that MyBatis-3 includes scenarios where OGNL may evaluate expressions that involve user input. The dynamic SQL features in MyBatis-3 (such as <if>, <choose>, <when>, and <foreach>) rely on OGNL to evaluate conditional expressions. External user input, passed through the parameterObject, can be included in these expressions and dynamically evaluated by OGNL. If my understanding is incorrect, I would greatly appreciate your clarification.

It might be. Let us discuss using a concrete example. Here is the POJO:

public class User {
  private Integer id;
  private String name;
  // getters, setters
}

The mapper and the method to search users by the properties of the User:
public interface UserMapper {
  List<User> search(User criteria);
}

The mapper statement may look something like this:

<select id="search" resultType="pkg.User">
  select id, name from user
  <where>
    <if test="name != null">
      and name = #{name}
    </if>
  </where>
</select>

Now, if the developer sets an external user's input (e.g. a request parameter) to User.name, is it safe? The answer is "yes, it is safe". Even if the value of name is "@javax.naming.InitialContext@doLookup('ldap://127.0.0.1:8087/Evil')", OGNL does not call the method. The expression OGNL evaluates, in this case, is always name != null, and external users cannot change that. Hope this clarifies your concern. If I am misunderstanding what you mean, please show us an example like the above.
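The key point in the reply above is that the expression is fixed by the developer while user input only ever flows in as a value. A toy Python analogue (this is an illustration of the separation, not how MyBatis actually evaluates OGNL):

```python
def evaluate_condition(expression, params):
    """Toy analogue of a <if test="..."> check.

    The expression string is a developer-chosen template looked up in an
    allowlist; user input only appears as a value inside `params` and is
    never parsed as part of the expression itself.
    """
    allowed = {"name != null": lambda p: p.get("name") is not None}
    try:
        check = allowed[expression]  # unknown expressions are rejected outright
    except KeyError:
        raise ValueError("expression not allowlisted: %r" % expression)
    return check(params)
```

Even a malicious-looking string passed as the name parameter is just data here; only attempting to use it as the expression itself is rejected.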
gharchive/issue
2024-12-14T14:53:43
2025-04-01T06:45:05.516967
{ "authors": [ "BFionaccept", "harawata", "qxyuan853" ], "repo": "mybatis/mybatis-3", "url": "https://github.com/mybatis/mybatis-3/issues/3316", "license": "apache-2.0", "license_type": "permissive", "license_source": "bigquery" }
179961526
Why does DefaultSqlSession's method wrapCollection call both map.put("collection", object) and map.put("list", object)? This map may be put into twice.

MyBatis version 3.4.2

```java
if (object instanceof Collection) {
    StrictMap map = new StrictMap();
    map.put("collection", object);
    if (object instanceof List) {
        map.put("list", object);
    }
    return map;
}
```

It is intentional, not a bug. If it causes a problem, please post a test case or an example project to reproduce the problem. As stated in the issue template, we use GitHub Issues only to track bugs and feature requests. Please post your question to the mailing list. Thank you!
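A standalone sketch of why the double put is intentional (plain HashMap instead of MyBatis' StrictMap, which is an assumption for the demo): the same object is registered under two names, so mapper expressions can refer to the parameter as either collection or list:

```java
import java.util.Arrays;
import java.util.Collection;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class WrapCollectionSketch {
    // Mirrors the shape of DefaultSqlSession#wrapCollection for the demo.
    static Object wrapCollection(Object object) {
        if (object instanceof Collection) {
            Map<String, Object> map = new HashMap<>();
            map.put("collection", object);
            if (object instanceof List) {
                map.put("list", object); // same reference, second alias
            }
            return map;
        }
        return object;
    }

    public static void main(String[] args) {
        List<Integer> ids = Arrays.asList(1, 2, 3);
        Map<?, ?> wrapped = (Map<?, ?>) wrapCollection(ids);
        // Both keys resolve to the very same list: two names, one object.
        System.out.println(wrapped.get("collection") == wrapped.get("list"));
    }
}
```

Nothing is duplicated; the second put only adds an alias for List parameters.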
gharchive/issue
2016-09-29T06:15:42
2025-04-01T06:45:05.520922
{ "authors": [ "IAMTJW", "harawata" ], "repo": "mybatis/mybatis-3", "url": "https://github.com/mybatis/mybatis-3/issues/794", "license": "apache-2.0", "license_type": "permissive", "license_source": "bigquery" }
141844702
Rename ObjectFactory -> ObjectManager. The old name hints at a factory pattern, which it isn't.

@pgh70 Thank you for your PR. Can you change ObjectFactory -> McObjectFactory? Please pull recent changes from the development branch and make your change there.

Changing ObjectFactory to McObjectFactory still hints at a factory pattern (http://www.tutorialspoint.com/design_pattern/factory_pattern.htm), which it isn't. It is simply a class which holds references to other objects, so a Manager is a better name for it.

@pgh70 Yes, McObjectManager could be a better name.

Is there a reason for using this class for holding the references? For instance, RawMessageQueue could be a singleton class; then there is no need to let ObjectFactory (or McObjectManager) keep track of a reference to the class. I can create a PR to demonstrate this.

@pgh70 I do not have any specific reason to hold references. I will check your changes in 3 days. Right now I'm on vacation, sorry for the delay. Thank you.

@pgh70 Can you satisfy checkstyle?

```
[INFO] --- maven-checkstyle-plugin:2.17:checkstyle (check-style) @ mycontroller-core ---
[INFO] Starting audit...
/home/travis/build/mycontroller-org/mycontroller/modules/core/src/main/java/org/mycontroller/standalone/db/DataBaseUtils.java:30: Wrong order for 'org.mycontroller.standalone.McObjectManager' import.
/home/travis/build/mycontroller-org/mycontroller/modules/core/src/main/java/org/mycontroller/standalone/db/SensorUtils.java:25: Wrong order for 'org.mycontroller.standalone.McObjectManager' import.
/home/travis/build/mycontroller-org/mycontroller/modules/core/src/main/java/org/mycontroller/standalone/gateway/serialport/SerialDataListenerjSerialComm.java:21: Wrong order for 'org.mycontroller.standalone.McObjectManager' import.
/home/travis/build/mycontroller-org/mycontroller/modules/core/src/main/java/org/mycontroller/standalone/metrics/MetricsAggregationBase.java:360: Line is longer than 120 characters (found 121).
/home/travis/build/mycontroller-org/mycontroller/modules/core/src/main/java/org/mycontroller/standalone/metrics/MetricsAggregationBase.java:369: Line is longer than 120 characters (found 121).
/home/travis/build/mycontroller-org/mycontroller/modules/core/src/main/java/org/mycontroller/standalone/mqttbroker/BrokerConfiguration.java:53: Line is longer than 120 characters (found 121).
/home/travis/build/mycontroller-org/mycontroller/modules/core/src/main/java/org/mycontroller/standalone/rule/McConditionScript.java:24: Wrong order for 'org.mycontroller.standalone.McObjectManager' import.
Audit done.
```

Satisfy checkstyle done (after some effort in IntelliJ) :)

@pgh70 merged your changes, thank you!

@jkandasa Did you also look at the conversion to singleton as shown in https://github.com/pgh70/mycontroller/tree/rmq-singleton (commit: a4d2c6e516ddb8795a3f405140dc9ace267f075)? This demonstrates the conversion of RawMessageQueue to a singleton. If you like this approach, I can make a PR with RawMessageQueue and the other classes that McObjectManager keeps a reference to changed to singleton classes, which renders McObjectManager obsolete.

@pgh70 Thank you! Let me have a look and update you soon.

@pgh70 Your singleton approach looks good; we can merge it. Kindly create a PR. Thank you very much for your contribution.
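The singleton conversion proposed above can be sketched with the initialization-on-demand holder idiom; the queue field and methods below are placeholders for illustration, not the real RawMessageQueue API from the linked branch:

```java
import java.util.ArrayDeque;
import java.util.Queue;

public class RawMessageQueue {
    // Holder idiom: INSTANCE is created lazily and thread-safely by the JVM
    // on first access to the Holder class, with no explicit locking.
    private static final class Holder {
        static final RawMessageQueue INSTANCE = new RawMessageQueue();
    }

    public static RawMessageQueue getInstance() {
        return Holder.INSTANCE;
    }

    private final Queue<String> messages = new ArrayDeque<>();

    private RawMessageQueue() { } // no outside instantiation

    public synchronized void put(String raw) { messages.add(raw); }

    public synchronized String take() { return messages.poll(); }

    public static void main(String[] args) {
        RawMessageQueue.getInstance().put("hello");
        // Every caller sees the same instance, so a central manager class
        // no longer needs to hold and hand out the reference.
        System.out.println(RawMessageQueue.getInstance().take());
    }
}
```

With each such class owning its single instance, the holder-of-references role of McObjectManager disappears, which is exactly the argument made in the thread.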
gharchive/pull-request
2016-03-18T11:45:17
2025-04-01T06:45:05.531950
{ "authors": [ "jkandasa", "pgh70" ], "repo": "mycontroller-org/mycontroller", "url": "https://github.com/mycontroller-org/mycontroller/pull/167", "license": "apache-2.0", "license_type": "permissive", "license_source": "bigquery" }
1899188611
Submission to Julia General Registry

@JuliaRegistrator register()

Registration pull request created: JuliaRegistries/General/91515

After the above pull request is merged, it is recommended that a tag is created on this repository for the registered package version. This will be done automatically if the Julia TagBot GitHub Action is installed, or can be done manually through the github interface, or via:

```
git tag -a v1.0.1 -m "<description of version>" a62bda6dbf6c8921b155575499ed11419160f542
git push origin v1.0.1
```

Also, note the warning: This looks like a new registration that registers version 1.0.1. Ideally, you should register an initial release with 0.0.1, 0.1.0 or 1.0.0 version numbers. This can be safely ignored. However, if you want to fix this you can do so. Call register() again after making the fix. This will update the Pull request.
gharchive/issue
2023-09-15T23:18:00
2025-04-01T06:45:05.535225
{ "authors": [ "JuliaRegistrator", "myersm0" ], "repo": "myersm0/Cifti.jl", "url": "https://github.com/myersm0/Cifti.jl/issues/12", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
350431327
Pool connection pool

Codecov Report

Merging #13 into master will decrease coverage by 13.2%. The diff coverage is 61.87%.

```
@@            Coverage Diff             @@
##           master      #13       +/-   ##
===========================================
- Coverage     100%   86.79%   -13.21%
===========================================
  Files          10       13        +3
  Lines        375      462       +87
  Branches      36       50       +14
===========================================
+ Hits         375      401       +26
- Misses         0       59       +59
- Partials       0        2        +2
```

```
Impacted Files          Coverage Δ
src/core/protocol.ts    100% <ø> (ø)
src/api/zset.ts         100% <100%> (ø)
src/core/tedis.ts       100% <100%> (ø)
src/api/string.ts       100% <100%> (ø)
src/api/hash.ts         100% <100%> (ø)
src/api/key.ts          100% <100%> (ø)
src/main.ts             100% <100%> (ø)
src/api/list.ts         100% <100%> (ø)
tools/index.ts          100% <100%> (ø) :arrow_up:
src/api/set.ts          100% <100%> (ø)
... and 9 more
```

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

Powered by Codecov. Last update aefb8af...377d80e.
gharchive/pull-request
2018-08-14T13:36:15
2025-04-01T06:45:05.562202
{ "authors": [ "codecov-io", "dasoncheng" ], "repo": "myour-cc/tedis", "url": "https://github.com/myour-cc/tedis/pull/13", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2185776
com.mysema.maven:maven-apt-plugin does not support AJDT/aspectj style domain classes

I was trying to use the code generator to generate query support classes for my domain objects. My domain objects use roo-style separation, adding the entity attribute using aspectj:

```java
@MyDomainEntityMarker
public class MyDomainClass {
}
```

then I have aspectj slice in the @Entity annotation and a few others related to my domain class.

So you have a custom annotation @MyDomainEntityMarker and you want support for that? I am not sure if I understand correctly. I am not familiar with Roo.

I have that custom annotation, but I use aspectj to slice in the @Entity as part of the aspectj weaving. Essentially, think of the annotation as being expanded into a set of annotations. Hence, the annotations are only present on the compiled class that has been woven by aspectj. Perhaps the real issue is an enhancement request to work with aspectj, or that the apt plugin should think about generation from compiled classes instead of direct source. Your comment about supporting a custom annotation would mostly solve this problem without having to fully integrate into aspectj; that is, make the @Entity annotation a parameter, with the default being @Entity of course.

Maybe the following class might work for you: http://www.querydsl.com/static/querydsl/2.2.3/apidocs/com/mysema/query/codegen/GenericExporter.html The APT processor works on sources, so aspectj-woven classes are probably not handled well with this.

As a test, I put the @Entity back into the domain class directly but still had aspectj augmenting my class with some other content. It still looks like the apt querydsl plugin is failing to run correctly. I'll try to track down if there is something more complicated with aspects in general going on re: standard apt processing.

I was looking at GenericExporter and started looking into the code. Is there a set of classes that are used to create an in-memory model of a class that is then used to generate the querydsl code? I was wondering if it's possible to populate that class from binary class files (that include the JPA annotations) and generate the querydsl code from binary files instead of just source. It kind of looks that way with the EntityType class.

As near as I can tell, since my use of aspectj slices in not only the @Entity annotation but also a few standard entity properties (such as the id and version field), the only way I can continue to use this domain class pattern with querydsl is to code-generate from the .class files and not source.

That works. Many thanks. I think it would be nice to be able to set the export package name explicitly, though. The target folder gets me to the top of my source tree, but then the original domain library's package name is appended; I would need to change that. I'll open another request on that issue.

Strangely, the exporter picks up two fields for those fields that have both a setter/getter and the field definition in an aspect. For example, since I slice in both the id and version field, I actually have two fields in the query class: one for the id using its short name, "id", and one for the really long aspectj-mangled field name. I'll just ignore the aspectj field for now.

I will see what I can do about the field issue

You can now skip field handling via exporter.setHandleFields(false)

Released in 2.2.5
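The core idea behind generating from compiled classes rather than sources can be illustrated with plain reflection. The marker annotation, the sample class, and the `$`-based filter for woven field names below are all assumptions for the demo, not querydsl's actual GenericExporter implementation:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.List;

public class BytecodeScanSketch {
    // Hypothetical stand-in for an annotation that aspectj weaves into the class.
    @Retention(RetentionPolicy.RUNTIME)
    @interface Entity { }

    @Entity
    static class MyDomainClass {
        Long id;
        String name;
        // Simulates an aspectj-introduced field with a mangled name.
        int ajc$interField$version;
    }

    // Reflection sees woven members because it reads the compiled class, so
    // filtering mangled names roughly mirrors exporter.setHandleFields(false).
    static List<String> exportableFields(Class<?> type) {
        List<String> names = new ArrayList<>();
        if (!type.isAnnotationPresent(Entity.class)) {
            return names;
        }
        for (Field f : type.getDeclaredFields()) {
            if (!f.isSynthetic() && !f.getName().contains("$")) {
                names.add(f.getName());
            }
        }
        names.sort(null); // deterministic order for the printout
        return names;
    }

    public static void main(String[] args) {
        System.out.println(exportableFields(MyDomainClass.class));
    }
}
```

Because the scan runs against bytecode, it sees the woven @Entity annotation that an APT source processor never would, which is exactly why the exporter route worked here where the apt plugin did not.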
gharchive/issue
2011-11-09T12:48:00
2025-04-01T06:45:05.583596
{ "authors": [ "aappddeevv", "timowest" ], "repo": "mysema/querydsl", "url": "https://github.com/mysema/querydsl/issues/43", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
2074611081
How do I change the language? (Multilingual options)

I want to know how to change the language. Let's say the text is in English but I want the output audio in some other language? Let's say I want to give the text input in another language. How do I go about these cases? (preferably without using OpenAI APIs)

https://github.com/myshell-ai/OpenVoice/blob/main/QA.md#issues-with-languages

https://github.com/myshell-ai/OpenVoice/blob/main/docs/QA.md#issues-with-languages here is a working link
gharchive/issue
2024-01-10T15:25:11
2025-04-01T06:45:05.592930
{ "authors": [ "Dev-startupagile", "Zengyi-Qin", "hrishi-04" ], "repo": "myshell-ai/OpenVoice", "url": "https://github.com/myshell-ai/OpenVoice/issues/96", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
273630448
Negation and not Handles negation of both Integer and Float and not for Boolean Looks great! Thank you for the contribution :) oops, meant to mark this as closes #35. Hopefully GitHub picks that up automatically? If not, I'll go do it manually. Awesome, thanks
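The PR's rules, arithmetic negation for Integer/Float and logical not for Boolean, could be modeled like this; it is a hypothetical interpreter fragment for illustration, not Myst's actual implementation:

```java
public class UnaryOps {
    // Hypothetical model of the PR's semantics: "-" on numbers, "!" on booleans.
    static Object applyUnary(String op, Object value) {
        if ("-".equals(op) && value instanceof Integer) {
            return -((Integer) value);
        }
        if ("-".equals(op) && value instanceof Double) {
            return -((Double) value);
        }
        if ("!".equals(op) && value instanceof Boolean) {
            return !((Boolean) value);
        }
        // Any other operator/type pairing is a semantic error.
        throw new IllegalArgumentException(
                "No unary '" + op + "' for " + value.getClass().getSimpleName());
    }

    public static void main(String[] args) {
        System.out.println(applyUnary("-", 42));
        System.out.println(applyUnary("-", 1.5));
        System.out.println(applyUnary("!", true));
    }
}
```

The key design point is that each operator is only defined for the types the PR names; everything else falls through to an error rather than a silent coercion.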
gharchive/pull-request
2017-11-14T00:50:27
2025-04-01T06:45:05.621511
{ "authors": [ "faultyserver", "rainbru" ], "repo": "myst-lang/myst", "url": "https://github.com/myst-lang/myst/pull/46", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
166928308
Question: Parallel with serial

Is it possible to combine serial and parallel tasks? Ex:

```
run-s lint run-p build
```

In this example, I'm assuming that build will run after lint completes.

Thank you for this issue. Though I might not understand this issue, I think there are some solutions:

```
npm-run-all -s lint -p build:*
npm run lint && run-p build:*
```

```json
"prebuild": "npm run lint",
"build": "run-p build:*"
```

@mysticatea I think @amilajack meant nested tasks, example:

```
           /- favicons -> imagemin (serial)
          /-- webfont (serial)
parallels -- copy (serial)
          \- another-task -> another-task-too -> another-task-too-too (serial)
```

Yeah, nested tasks within a parallel block would be useful. I guess it's possible if you added another script for the nested block, but that doesn't scale well.
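The `npm run lint && run-p build:*` suggestion, one serial step gating a parallel block, has the same shape as this generic sketch (the task names are placeholders, not real scripts):

```java
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class SerialThenParallel {
    public static void main(String[] args) throws Exception {
        // Serial stage: must finish before the parallel block starts,
        // like `npm run lint && run-p build:*`.
        System.out.println("lint done");

        ExecutorService pool = Executors.newFixedThreadPool(3);
        List<Callable<String>> buildTasks = List.of(
                () -> "build:js done",
                () -> "build:css done",
                () -> "build:html done");

        // invokeAll blocks until every parallel task has completed,
        // and returns the futures in the same order as the tasks.
        for (Future<String> f : pool.invokeAll(buildTasks)) {
            System.out.println(f.get());
        }
        pool.shutdown();
    }
}
```

Whether in a task runner or a thread pool, the pattern is the same: the sequential step completes first, then the independent steps fan out together.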
gharchive/issue
2016-07-21T22:02:31
2025-04-01T06:45:05.642829
{ "authors": [ "amilajack", "evilebottnawi", "kentor", "mysticatea" ], "repo": "mysticatea/npm-run-all", "url": "https://github.com/mysticatea/npm-run-all/issues/58", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1197122694
Cannot load module Basic information My operating system: OS Name: Microsoft Windows 11 Education Insider Preview OS Version: 10.0.22483 N/A Build 22483 My MZmine version: MZmine 3.0.21-beta What happened Trying raw-data-export function but it fails with "Cannot load module" error 2022-04-08 10:21:02 FINE io.github.mzmine.main.MZmineCore init Loading core classes.. 2022-04-08 10:21:03 FINE io.github.mzmine.main.MZmineCore init Initializing core classes.. 2022-04-08 10:21:03 FINEST io.github.mzmine.taskcontrol.impl.TaskControllerImpl initModule Starting task controller thread 2022-04-08 10:21:03 INFO io.github.mzmine.main.MZmineCore main Starting MZmine 3.0.21-beta 2022-04-08 10:21:03 FINEST io.github.mzmine.main.MZmineCore main MZmine arguments: 2022-04-08 10:21:03 FINEST io.github.mzmine.main.MZmineCore main Java VM arguments: -Djpackage.app-version=3.0.21 -XX:MaxHeapFreeRatio=100 -XX:InitialRAMPercentage=30 -XX:MinRAMPercentage=80 -XX:MaxRAMPercentage=80 -enableassertions -Djava.util.logging.config.class=io.github.mzmine.main.MZmineLoggingConfiguration -Djpackage.app-path=F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\MZmine.exe 2022-04-08 10:21:03 FINEST io.github.mzmine.main.MZmineCore main Java class path: F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\mzmine3-3.0.21.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\activation-1.1.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\adap-4.1.10.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\aird-sdk-1.1.6.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\annotations-2.0.3.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\annotations-22.0.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\arpack_combined_all-0.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-all-1.13.jar;F:\OneDrive - 
NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-anim-1.14.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-awt-util-1.14.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-bridge-1.14.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-codec-1.14.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-constants-1.14.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-css-1.14.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-dom-1.14.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-ext-1.14.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-extension-1.14.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-gui-util-1.14.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-gvt-1.14.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-i18n-1.14.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-parser-1.14.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-rasterizer-1.14.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-rasterizer-ext-1.13.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-script-1.14.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-shared-resources-1.14.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-slideshow-1.13.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-squiggle-1.13.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-squiggle-ext-1.13.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-svg-dom-1.14.jar;F:\OneDrive - 
NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-svgbrowser-1.13.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-svggen-1.14.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-svgpp-1.13.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-svgrasterizer-1.14.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-swing-1.14.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-transcoder-1.14.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-ttf2svg-1.14.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-util-1.14.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\batik-xml-1.14.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\bcpkix-jdk15on-1.68.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\bcprov-jdk15on-1.68.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\beam-core-1.3.4.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\beam-func-1.3.4.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\bounce-0.18.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\brotli4j-1.6.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\bzip2-0.9.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\c3p0-0.9.1.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\canopus_predict_oss-1.1.3.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-atomtype-2.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-charges-2.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-core-2.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-ctab-2.5.jar;F:\OneDrive - 
NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-data-2.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-dict-2.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-fingerprint-2.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-forcefield-2.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-formula-2.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-fragment-2.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-hash-2.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-inchi-2.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-interfaces-2.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-io-2.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-ioformats-2.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-isomorphism-2.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-qsar-2.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-qsaratomic-2.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-qsarmolecular-2.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-reaction-2.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-render-2.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-renderawt-2.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-renderbasic-2.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-sdg-2.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-silent-2.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-smarts-2.2.jar;F:\OneDrive - 
NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-smiles-2.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-standard-2.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdk-valencycheck-2.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cdm-4.5.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\chardet-1.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\chemdb_file_oss-1.1.3.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\chemdb_rest_oss-1.1.3.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\chemdb_sql_oss-1.1.3.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\chemical_db_oss-1.1.3.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\chemistry_base-4.0.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\chemspider-api-1.0.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\clustering-20210422.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\colt-1.2.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\commons-beanutils-1.9.4.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\commons-cli-1.4.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\commons-codec-1.15.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\commons-collections-3.2.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\commons-collections4-4.4.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\commons-compress-1.20.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\commons-csv-1.3.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\commons-email-1.4.jar;F:\OneDrive - 
NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\commons-graph-0.8.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\commons-io-2.6.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\commons-lang3-3.12.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\commons-logging-1.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\commons-math-2.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\commons-math3-3.6.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\commons-pool2-2.4.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\commons-text-1.8.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\concurrent-1.3.4.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\controlsfx-11.1.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\core-1.1.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\cpdetector-1.0.7.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\csparsej-1.1.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\curvesapi-1.06.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\decomp_cli-4.0.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\ehcache-core-2.6.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\ejml-all-0.38.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\ejml-cdense-0.38.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\ejml-core-0.38.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\ejml-ddense-0.38.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\ejml-dsparse-0.38.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\ejml-fdense-0.38.jar;F:\OneDrive - 
NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\ejml-simple-0.38.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\ejml-zdense-0.38.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\epsgraphics-1.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\error-reporter-0.9.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\error_prone_annotations-2.3.4.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\failureaccess-1.0.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\fastjson-1.2.79.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\fastutil-8.5.6.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\fingerblast_oss-1.1.3.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\fingerid_utils_oss-1.1.3.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\fingerprinter_oss-1.1.3.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\fingerprint_pvalues_oss-1.1.3.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\fontbox-2.0.22.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\fontchooser-2.4.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\fop-2.6.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\fop-core-2.6.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\fop-events-2.6.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\fop-util-2.6.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\fragmentation_tree_construction-4.0.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\freehep-graphics2d-2.4.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\freehep-graphicsbase-2.4.jar;F:\OneDrive - 
NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\freehep-graphicsio-2.4.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\freehep-graphicsio-emf-2.4.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\freehep-graphicsio-tests-2.4.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\freehep-io-2.2.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\gibbs_sampling-4.0.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\glpk-java-1.7.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\graphics2d-0.30.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\GraphUtils-1.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\gs-algo-2.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\gs-core-45504f632f.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\gs-ui-javafx-2.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\gslibml-0.0.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\gson-2.8.9.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\gson-fire-1.8.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\guava-30.1-jre.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\hamcrest-2.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\hamcrest-core-1.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\httpclient-4.5.6.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\httpcore-4.4.10.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\httpmime-4.2.6.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\httpservices-4.5.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\io-4.0.1.jar;F:\OneDrive - 
NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\isogen_cli-4.0.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\isotope_pattern_analysis-4.0.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\istack-commons-runtime-3.0.11.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\itextpdf-5.5.13.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\j2objc-annotations-1.3.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jackson-annotations-2.10.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jackson-core-2.10.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jackson-databind-2.10.5.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jackson-dataformat-csv-2.10.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jackson-dataformat-xml-2.10.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jackson-datatype-joda-2.10.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jackson-module-jaxb-annotations-2.10.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jakarta.activation-1.2.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jakarta.activation-api-1.2.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jakarta.xml.bind-api-2.3.3.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jama-1.0.3.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\java-cup-11b-20160615.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\java-cup-runtime-11b-20160615.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\java-utilities-1.0-SNAPSHOT.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\JavaFastPFOR-0.1.12.jar;F:\OneDrive - 
NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\javafx-base-16-win.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\javafx-base-16.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\javafx-controls-16-win.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\javafx-controls-16.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\javafx-fxml-16-win.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\javafx-fxml-16.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\javafx-graphics-16-win.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\javafx-graphics-16.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\javafx-media-16-win.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\javafx-media-16.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\javafx-swing-16-win.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\javafx-swing-16.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\javafx-web-16-win.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\javanmf-0.2.3.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\JavaPlot-1.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\javatuples-1.3-SNAPSHOT.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\javax.activation-api-1.2.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\javax.json-1.1.4.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\javax.json-api-1.1.4.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\javax.mail-1.5.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\javax.mail-api-1.6.2.jar;F:\OneDrive - 
NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\javolution-core-java-msftbx-6.11.8.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jaxb-api-2.3.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jaxb-core-2.3.0.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jaxb-impl-2.3.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jaxb-runtime-2.3.3.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jaxb-xjc-2.2.7.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jblas-1.2.4.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jcip-annotations-1.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jcl-over-slf4j-1.7.30.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jclipboardhelper-0.1.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jcommander-1.35.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jdom2-2.0.4.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jewelcli-0.8.9.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jfilechooser-bookmarks-0.1.6.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jfreechart-1.5.3.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jgrapht-core-0.9.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jimzmlparser-1.0.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jjobs-core-0.9.6.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jmol-14.31.10.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jmprojection-1.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jmzml-1.7.11.jar;F:\OneDrive - 
NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jmztab-modular-model-3.0.9.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jmztab-modular-util-3.0.9.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jmztabm-api-1.0.6.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jmztabm-io-1.0.6.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jna-5.5.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jna-platform-5.5.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jnati-core-0.4.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jnati-deploy-0.4.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jni-inchi-0.8.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jniloader-1.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jocl-2.0.4.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\joda-time-2.9.9.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\joptimizer-3.5.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\json-20211205.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jsparsehc-0.3.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\jsr305-3.0.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\junit-4.10.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\log4j-1.2.14.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\logging-interceptor-2.7.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\lz4-java-1.8.0.jar;F:\OneDrive - 
NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\lzo-commons-1.0.6.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\lzo-core-1.0.6.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\mass_decomposer-4.0.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\mbox2-1.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\migbase64-2.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\miglayout-3.7.4.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\msdk-datamodel-0.0.27.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\msdk-featuredetection-adap3d-0.0.27.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\msdk-id-sirius-0.0.27.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\msdk-io-msp-0.0.27.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\msdk-io-mzml-0.0.27.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\msdk-io-netcdf-0.0.27.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\msdk-spectra-centroidprofiledetection-0.0.27.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\MsNumpress-0.1.18.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\mtj-1.0.4.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\mysql-connector-java-5.1.34.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\native_ref-java-1.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\native_system-java-1.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\netcdf4-4.5.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\netlib-java-1.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\netlib-native_ref-linux-armhf-1.1-natives.jar;F:\OneDrive - 
NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\netlib-native_ref-linux-i686-1.1-natives.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\netlib-native_ref-linux-x86_64-1.1-natives.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\netlib-native_ref-osx-x86_64-1.1-natives.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\netlib-native_ref-win-i686-1.1-natives.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\netlib-native_ref-win-x86_64-1.1-natives.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\netlib-native_system-linux-armhf-1.1-natives.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\netlib-native_system-linux-i686-1.1-natives.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\netlib-native_system-linux-x86_64-1.1-natives.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\netlib-native_system-osx-x86_64-1.1-natives.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\netlib-native_system-win-i686-1.1-natives.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\netlib-native_system-win-x86_64-1.1-natives.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\okhttp-2.7.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\okio-1.6.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\opencsv-5.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\org.jfree.chart.fx-2.0.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\org.jfree.fxgraphics2d-2.1.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\org.jfree.pdf-2.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\org.jfree.svg-4.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\oshi-core-4.5.2.jar;F:\OneDrive - 
NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\pdfbox-2.0.22.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\pherd-1.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\poi-5.0.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\poi-ooxml-5.0.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\poi-ooxml-lite-5.0.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\postgresql-42.2.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\protobuf-java-2.5.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\py4j-0.8.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\qdox-1.12.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\quartz-2.2.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\RCaller-4.0.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\Recalibration-1.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\REngine-1.8-5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\Rserve-1.8-5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\semver4j-3.1.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\serializer-2.7.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\sirius_api-4.0.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\slf4j-api-1.7.32.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\slf4j-jdk14-1.7.32.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\smile-core-1.3.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\smile-data-1.3.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\smile-graph-1.3.0.jar;F:\OneDrive - 
NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\smile-math-1.3.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\snappy-java-1.1.8.4.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\SparseBitSet-1.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\sqlite-jdbc-3.8.11.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\stax2-api-4.2.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\swagger-annotations-1.5.24.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\threetenbp-1.3.8.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\trove4j-3.0.3.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\txw2-2.3.3.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\udunits-4.5.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\utils-1.07.00.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\validation-api-2.0.0.Final.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\vecmath-1.5.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\weka-stable-3.8.5.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\woodstox-core-6.2.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\xalan-2.7.2.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\xmlbeans-4.0.0.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\xmlgraphics-commons-2.6.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\xmlsec-2.2.1.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\xxindex-0.23.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\xz-1.9.jar;F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta\app\zstd-jni-1.3.4-10.jar 2022-04-08 10:21:03 FINEST 
io.github.mzmine.main.MZmineCore main Working directory is F:\OneDrive - NTNU\Downloads\MZmine_Windows_portable_3.0.21-beta
2022-04-08 10:21:03 FINEST io.github.mzmine.main.MZmineCore main Default temporary directory is C:\Users\animeshs\tmp\
2022-04-08 10:21:03 FINE io.github.mzmine.main.TmpFileCleanup run Checking for old temporary files...
2022-04-08 10:21:03 FINEST io.github.mzmine.main.impl.MZmineConfigurationImpl loadConfiguration Loading desktop configuration
2022-04-08 10:21:03 FINEST io.github.mzmine.main.impl.MZmineConfigurationImpl loadConfiguration Loading last projects
2022-04-08 10:21:03 FINEST io.github.mzmine.main.impl.MZmineConfigurationImpl loadConfiguration Loading modules configuration
2022-04-08 10:21:03 FINEST io.github.mzmine.main.MZmineCore getModuleInstance Creating an instance of the module io.github.mzmine.modules.dataprocessing.featdet_massdetection.auto.AutoMassDetector
2022-04-08 10:21:03 FINEST io.github.mzmine.main.MZmineCore getModuleInstance Creating an instance of the module io.github.mzmine.modules.dataprocessing.featdet_massdetection.wavelet.WaveletMassDetector
2022-04-08 10:21:03 FINEST io.github.mzmine.main.MZmineCore getModuleInstance Creating an instance of the module io.github.mzmine.modules.tools.batchwizard.BatchWizardModule
2022-04-08 10:21:03 FINEST io.github.mzmine.main.MZmineCore getModuleInstance Creating an instance of the module io.github.mzmine.modules.dataprocessing.featdet_massdetection.localmaxima.LocalMaxMassDetector
2022-04-08 10:21:03 FINEST io.github.mzmine.main.MZmineCore getModuleInstance Creating an instance of the module io.github.mzmine.modules.dataprocessing.featdet_massdetection.centroid.CentroidMassDetector
2022-04-08 10:21:03 FINEST io.github.mzmine.main.MZmineCore getModuleInstance Creating an instance of the module io.github.mzmine.modules.dataprocessing.featdet_massdetection.exactmass.ExactMassDetector
2022-04-08 10:21:03 FINEST io.github.mzmine.main.MZmineCore getModuleInstance Creating an instance of the module io.github.mzmine.modules.dataprocessing.featdet_massdetection.recursive.RecursiveMassDetector
2022-04-08 10:21:03 INFO io.github.mzmine.main.impl.MZmineConfigurationImpl loadConfiguration Loaded configuration from file C:\Users\animeshs\.mzmine3.conf
2022-04-08 10:21:03 FINEST io.github.mzmine.main.MZmineCore setTempDirToPreference Working temporary directory is C:\Users\animeshs\tmp
2022-04-08 10:21:03 FINE io.github.mzmine.main.TmpFileCleanup run Checking for old temporary files...
2022-04-08 10:21:03 INFO io.github.mzmine.main.MZmineCore main Starting MZmine GUI
2022-04-08 10:21:03 WARNING com.sun.javafx.application.PlatformImpl startup Unsupported JavaFX configuration: classes were loaded from 'unnamed module @720f09d3'
2022-04-08 10:21:04 FINEST io.github.mzmine.gui.MZmineGUI start Initializing MZmine main window
2022-04-08 10:21:06 FINEST io.github.mzmine.gui.DesktopSetup run Configuring desktop settings
2022-04-08 10:21:07 FINEST io.github.mzmine.util.InetUtils retrieveData Retrieving data from URL http://mzmine.github.io/version.txt
2022-04-08 10:21:07 FINEST io.github.mzmine.util.InetUtils retrieveData Retrieved 12 characters from http://mzmine.github.io/version.txt
2022-04-08 10:58:49 INFO io.github.mzmine.gui.mainwindow.MainMenuController runModule Menu item activated for module io.github.mzmine.modules.io.import_rawdata_thermo_raw.ThermoRawImportModule
2022-04-08 10:58:49 FINEST io.github.mzmine.main.MZmineCore getModuleInstance Creating an instance of the module io.github.mzmine.modules.io.import_rawdata_thermo_raw.ThermoRawImportModule
2022-04-08 10:58:49 INFO io.github.mzmine.gui.mainwindow.MainMenuController runModule Setting parameters for module Thermo RAW file import
2022-04-08 10:59:09 FINEST io.github.mzmine.gui.mainwindow.MainMenuController runModule Starting module Thermo RAW file import with parameters File names: [Z:\220402_beads_test3\plasma_samples\220402_Alessandro_test3_Plasma24h-2.raw]
2022-04-08 10:59:09 FINEST io.github.mzmine.main.MZmineCore runMZmineModule Module Thermo RAW file import called at 2022-04-08T08:59:09.259621100Z
2022-04-08 10:59:09 FINEST io.github.mzmine.taskcontrol.impl.TaskQueue addWrappedTask Adding task "Opening file Z:\220402_beads_test3\plasma_samples\220402_Alessandro_test3_Plasma24h-2.raw" to the task controller queue
2022-04-08 10:59:09 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Starting processing of task Opening file Z:\220402_beads_test3\plasma_samples\220402_Alessandro_test3_Plasma24h-2.raw
2022-04-08 10:59:09 INFO io.github.mzmine.modules.io.import_rawdata_thermo_raw.ThermoRawImportTask run Opening file Z:\220402_beads_test3\plasma_samples\220402_Alessandro_test3_Plasma24h-2.raw
2022-04-08 10:59:09 FINEST io.github.mzmine.modules.io.import_rawdata_thermo_raw.ThermoRawImportTask unzipThermoRawFileParser Unpacking ThermoRawFileParser to folder C:\Users\animeshs\tmp\mzmine_thermo_raw_parser
2022-04-08 10:59:10 FINEST io.github.mzmine.modules.io.import_rawdata_mzml.msdk.MzMLFileImportMethod execute Began parsing file from stream
2022-04-08 11:00:24 INFO io.github.mzmine.gui.mainwindow.MainMenuController runModule Menu item activated for module io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFImportModule
2022-04-08 11:00:24 FINEST io.github.mzmine.main.MZmineCore getModuleInstance Creating an instance of the module io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFImportModule
2022-04-08 11:00:24 INFO io.github.mzmine.gui.mainwindow.MainMenuController runModule Setting parameters for module Bruker TDF file import
2022-04-08 11:00:56 FINEST io.github.mzmine.gui.mainwindow.MainMenuController runModule Starting module Bruker TDF file import with parameters File names: [Z:\220402_beads_test3\timsTOF\220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d]
2022-04-08 11:00:56 FINEST io.github.mzmine.main.MZmineCore runMZmineModule Module Bruker TDF file import called at 2022-04-08T09:00:56.715160500Z
2022-04-08 11:00:56 FINEST io.github.mzmine.taskcontrol.impl.TaskQueue addWrappedTask Adding task "null" to the task controller queue
2022-04-08 11:00:56 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Starting processing of task null
2022-04-08 11:00:56 FINEST io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFImportTask readMetadata Initialising SQL...
2022-04-08 11:00:56 FINEST io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFImportTask readMetadata SQl initialised.
2022-04-08 11:00:56 FINEST io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFImportTask readMetadata Establishing SQL connection to analysis.tdf
2022-04-08 11:00:56 FINEST io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFImportTask readMetadata Connection established. org.sqlite.SQLiteConnection@33df7c70
2022-04-08 11:00:57 INFO io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFImportTask readMetadata Metadata read successfully for null
2022-04-08 11:00:57 FINEST io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFImportTask run Opening tdf file Z:\220402_beads_test3\timsTOF\220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d\analysis.tdf_bin
2022-04-08 11:00:57 FINEST io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFUtils loadLibrary Initialising tdf library.
2022-04-08 11:00:58 INFO io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFUtils loadLibrary Native TDF library initialised Proxy interface to Native Library <C:\Users\animeshs\tmp\jna--795084854\jna2865131544765511507.dll@140714733142016>
2022-04-08 11:00:58 FINEST io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFUtils setNumThreads Setting number of threads per file to 24
2022-04-08 11:00:58 FINEST io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFUtils openFile Opening tdf file Z:\220402_beads_test3\timsTOF\220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d\analysis.tdf_bin
2022-04-08 11:00:58 FINEST io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFUtils openFile File analysis.tdf_bin hasReacalibratedState = 0
2022-04-08 11:00:58 FINEST io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFImportTask run Starting frame import.
2022-04-08 11:00:58 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine6495102123580427346.tmp
2022-04-08 11:01:46 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine14927564702407177126.tmp
2022-04-08 11:02:57 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine9276996119853586438.tmp
2022-04-08 11:05:40 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine12296041720103823897.tmp
2022-04-08 11:05:49 FINEST io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFImportTask setFinishedPercentage Loading mobility scans of 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d: Frame 2/32042 - 10%
2022-04-08 11:06:13 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine3567316044266902886.tmp
2022-04-08 11:06:39 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine15434875942009947729.tmp
2022-04-08 11:07:05 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine14261944412051621411.tmp
2022-04-08 11:07:30 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine2658883360009417032.tmp
2022-04-08 11:07:54 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine12320986480936919871.tmp
2022-04-08 11:07:58 FINEST io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFImportTask setFinishedPercentage Loading mobility scans of 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d: Frame 3563/32042 - 20%
2022-04-08 11:08:18 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine18322156714395607572.tmp
2022-04-08 11:08:41 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine15146253727559368832.tmp
2022-04-08 11:09:05 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine18238357387711134951.tmp
2022-04-08 11:09:31 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine9056744129023248056.tmp
2022-04-08 11:09:52 FINEST io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFImportTask setFinishedPercentage Loading mobility scans of 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d: Frame 7124/32042 - 30%
2022-04-08 11:10:03 FINE io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFUtils printLastError Last TDF import error: <no error> length: 11. Required buffer size: 303424 actual size: 300000
2022-04-08 11:10:03 FINE io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFUtils loadDataPointsForFrame Could not read scans 600-650 for frame 7883. Increasing buffer size to 400000 and reloading.
2022-04-08 11:10:17 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine18275821231200674906.tmp
2022-04-08 11:11:03 FINEST io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFImportTask setFinishedPercentage Loading mobility scans of 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d: Frame 10685/32042 - 40%
2022-04-08 11:11:35 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine15489476581432322703.tmp
2022-04-08 11:12:13 FINEST io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFImportTask setFinishedPercentage Loading mobility scans of 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d: Frame 14246/32042 - 50%
2022-04-08 11:12:33 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine948246095733687252.tmp
2022-04-08 11:13:18 FINEST io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFImportTask setFinishedPercentage Loading mobility scans of 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d: Frame 17807/32042 - 60%
2022-04-08 11:13:32 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine11570905847283319696.tmp
2022-04-08 11:14:24 FINEST io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFImportTask setFinishedPercentage Loading mobility scans of 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d: Frame 21368/32042 - 70%
2022-04-08 11:14:31 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine14051204624066137058.tmp
2022-04-08 11:15:22 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine4088378038076152010.tmp
2022-04-08 11:15:35 FINEST io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFImportTask setFinishedPercentage Loading mobility scans of 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d: Frame 24929/32042 - 80%
2022-04-08 11:16:01 FINE io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFUtils printLastError Last TDF import error: <no error> length: 11. Required buffer size: 410312 actual size: 400000
2022-04-08 11:16:01 FINE io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFUtils loadDataPointsForFrame Could not read scans 550-600 for frame 26231. Increasing buffer size to 500000 and reloading.
2022-04-08 11:16:14 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine377747158027701214.tmp
2022-04-08 11:16:50 FINEST io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFImportTask setFinishedPercentage Loading mobility scans of 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d: Frame 28490/32042 - 90%
2022-04-08 11:17:05 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine13661924559005267153.tmp
2022-04-08 11:17:31 FINEST io.github.mzmine.modules.io.import_rawdata_mzml.msdk.MzMLFileImportMethod execute Parsing Complete
2022-04-08 11:17:31 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine16449784675199716358.tmp
2022-04-08 11:17:35 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine6233957650623788853.tmp
2022-04-08 11:17:39 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine15284232830419620800.tmp
2022-04-08 11:17:43 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine517144791032713350.tmp
2022-04-08 11:17:47 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine16252811333156657725.tmp
2022-04-08 11:17:50 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine6080521014099319874.tmp
2022-04-08 11:17:54 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine16946976893789780675.tmp
2022-04-08 11:17:55 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine2158702779178666462.tmp
2022-04-08 11:17:58 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine12944601320804647051.tmp
2022-04-08 11:18:02 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine2950072157245239988.tmp
2022-04-08 11:18:06 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine17393006667927393324.tmp
2022-04-08 11:18:09 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine13522175424589333339.tmp
2022-04-08 11:18:11 FINEST io.github.mzmine.util.MemoryMapStorage createNewMappedFile Created a temporary file C:\Users\animeshs\tmp\mzmine8502898768522720037.tmp
2022-04-08 11:18:13 FINEST io.github.mzmine.project.impl.MZmineProjectImpl addFile Adding a new file to the project: 220402_Alessandro_test3_Plasma24h-2.raw
2022-04-08 11:18:13 INFO io.github.mzmine.modules.io.import_rawdata_thermo_raw.ThermoRawImportTask run Finished parsing Z:\220402_beads_test3\plasma_samples\220402_Alessandro_test3_Plasma24h-2.raw, parsed 98525 scans
2022-04-08 11:18:13 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Processing of task Importing 220402_Alessandro_test3_Plasma24h-2.raw, parsed 98525/98525 scans done, status FINISHED
2022-04-08 11:18:48 INFO io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFImportTask constructMsMsInfo Construced 173053 ImsMsMsInfos for 32042 in 38825 ms
2022-04-08 11:18:48 INFO io.github.mzmine.modules.io.import_rawdata_bruker_tdf.TDFImportTask run Imported 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d. Loaded 32042 scans and 32042 frames.
2022-04-08 11:18:48 FINEST io.github.mzmine.project.impl.MZmineProjectImpl addFile Adding a new file to the project: 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d
2022-04-08 11:18:48 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Processing of task Importing 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d: Writing raw data file... done, status FINISHED
2022-04-08 11:26:30 FINEST io.github.mzmine.gui.mainwindow.MainWindowController handleShowRawDataOverview Activated Show raw data overview menu item
2022-04-08 11:26:30 FINEST io.github.mzmine.main.MZmineCore getModuleInstance Creating an instance of the module io.github.mzmine.modules.visualization.rawdataoverview.RawDataOverviewModule
2022-04-08 11:26:30 FINEST io.github.mzmine.main.MZmineCore runMZmineModule Module Raw data overview called at 2022-04-08T09:26:30.965581100Z
2022-04-08 11:26:30 FINEST io.github.mzmine.parameters.parametertypes.selectors.RawDataFilesSelection getMatchingRawDataFiles Setting file selection. Evaluated files: [220402_Alessandro_test3_Plasma24h-2.raw]
2022-04-08 11:26:30 FINEST io.github.mzmine.taskcontrol.impl.TaskQueue addWrappedTask Adding task "Create raw data overview for 1 raw data files" to the task controller queue
2022-04-08 11:26:30 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Starting processing of task Create raw data overview for 1 raw data files
2022-04-08 11:26:31 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Processing of task Create raw data overview for 1 raw data files done, status FINISHED
2022-04-08 11:26:31 FINE io.github.mzmine.modules.visualization.rawdataoverview.RawDataFileInfoPaneController populate Populating table for raw data file 220402_Alessandro_test3_Plasma24h-2.raw
2022-04-08 11:26:31 FINEST io.github.mzmine.taskcontrol.impl.TaskQueue addWrappedTask Adding task "Loading scan info of 220402_Alessandro_test3_Plasma24h-2.raw" to the task controller queue
2022-04-08 11:26:31 FINE io.github.mzmine.modules.visualization.rawdataoverview.RawDataOverviewWindowController addRawDataFileTab Added raw data file tab for 220402_Alessandro_test3_Plasma24h-2.raw
2022-04-08 11:26:31 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Starting processing of task Loading scan info of 220402_Alessandro_test3_Plasma24h-2.raw
2022-04-08 11:26:31 FINEST io.github.mzmine.taskcontrol.impl.TaskQueue addWrappedTask Adding task "Updating TIC visualizer of 220402_Alessandro_test3_Plasma24h-2.raw" to the task controller queue
2022-04-08 11:26:31 FINEST io.github.mzmine.modules.visualization.chromatogramandspectra.ChromatogramAndSpectraVisualizer addRawDataFile Added raw data file 220402_Alessandro_test3_Plasma24h-2.raw
2022-04-08 11:26:31 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Starting processing of task Updating TIC visualizer of 220402_Alessandro_test3_Plasma24h-2.raw
2022-04-08 11:26:31 INFO io.github.mzmine.modules.visualization.chromatogram.TICDataSet run TIC data calculated for 220402_Alessandro_test3_Plasma24h-2.raw
2022-04-08 11:26:31 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Processing of task Updating TIC visualizer of 220402_Alessandro_test3_Plasma24h-2.raw done, status FINISHED
2022-04-08 11:26:32 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Processing of task Loading scan info of 220402_Alessandro_test3_Plasma24h-2.raw done, status FINISHED
2022-04-08 11:26:48 FINEST io.github.mzmine.gui.mainwindow.MainWindowController handleShowIMSDataOverview Activated Show ion mobility raw data overview
2022-04-08 11:26:48 FINEST io.github.mzmine.main.MZmineCore getModuleInstance Creating an instance of the module io.github.mzmine.modules.visualization.rawdataoverviewims.IMSRawDataOverviewModule
2022-04-08 11:26:48 FINEST io.github.mzmine.main.MZmineCore runMZmineModule Module Ion mobility raw data overview called at 2022-04-08T09:26:48.755324600Z
2022-04-08 11:26:48 FINEST io.github.mzmine.parameters.parametertypes.selectors.RawDataFilesSelection getMatchingRawDataFiles Setting file selection. Evaluated files: [220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d]
2022-04-08 11:26:48 FINEST io.github.mzmine.datamodel.data_access.BinningMobilogramDataAccess <init> Bin width set to 1 scans. (approximately 0.0011185838621645546 Vs/cm^2)
2022-04-08 11:26:48 FINEST io.github.mzmine.datamodel.data_access.BinningMobilogramDataAccess <init> Bin width set to 1 scans. (approximately 0.0011185838621645546 Vs/cm^2)
2022-04-08 11:26:48 FINEST io.github.mzmine.taskcontrol.impl.TaskQueue addWrappedTask Adding task "Updating TIC visualizer of 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d" to the task controller queue
2022-04-08 11:26:48 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Starting processing of task Updating TIC visualizer of 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d
2022-04-08 11:26:48 INFO io.github.mzmine.modules.visualization.chromatogram.TICDataSet run TIC data calculated for 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d
2022-04-08 11:26:48 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Processing of task Updating TIC visualizer of 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d done, status FINISHED
2022-04-08 11:26:48 FINEST io.github.mzmine.taskcontrol.impl.TaskQueue addWrappedTask Adding task "Computing values for dataset 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d - Frame 1 0.01 min" to the task controller queue
2022-04-08 11:26:48 FINEST io.github.mzmine.taskcontrol.impl.TaskQueue addWrappedTask Adding task "Computing values for dataset Total ion mobilogram for 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d - Frame 1 0.01 min" to the task controller queue
2022-04-08 11:26:48 FINEST io.github.mzmine.taskcontrol.impl.TaskQueue addWrappedTask Adding task "Computing values for dataset 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d - Frame 1 0.01 min" to the task controller queue
2022-04-08 11:26:48 FINEST io.github.mzmine.taskcontrol.impl.TaskQueue addWrappedTask Adding task "Building mobilograms and tic data sets for Ims raw data overview." to the task controller queue
2022-04-08 11:27:04 FINEST io.github.mzmine.gui.mainwindow.MainWindowController handleShowIMSDataOverview Activated Show ion mobility raw data overview
2022-04-08 11:27:04 FINEST io.github.mzmine.parameters.parametertypes.selectors.RawDataFilesSelection resetSelection Resetting file selection.
Previously evaluated files: [220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d] 2022-04-08 11:27:04 FINEST io.github.mzmine.main.MZmineCore runMZmineModule Module Ion mobility raw data overview called at 2022-04-08T09:27:04.081990200Z 2022-04-08 11:27:04 FINEST io.github.mzmine.parameters.parametertypes.selectors.RawDataFilesSelection getMatchingRawDataFiles Setting file selection. Evaluated files: [220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d] 2022-04-08 11:27:04 FINEST io.github.mzmine.datamodel.data_access.BinningMobilogramDataAccess <init> Bin width set to 1 scans. (approximately 0.0011185838621645546 Vs/cm^2) 2022-04-08 11:27:04 FINEST io.github.mzmine.datamodel.data_access.BinningMobilogramDataAccess <init> Bin width set to 1 scans. (approximately 0.0011185838621645546 Vs/cm^2) 2022-04-08 11:27:04 FINEST io.github.mzmine.taskcontrol.impl.TaskQueue addWrappedTask Adding task "Updating TIC visualizer of 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d" to the task controller queue 2022-04-08 11:27:04 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Starting processing of task Updating TIC visualizer of 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d 2022-04-08 11:27:04 FINEST io.github.mzmine.taskcontrol.impl.TaskQueue addWrappedTask Adding task "Computing values for dataset 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d - Frame 1 0.01 min" to the task controller queue 2022-04-08 11:27:04 FINEST io.github.mzmine.taskcontrol.impl.TaskQueue addWrappedTask Adding task "Computing values for dataset Total ion mobilogram for 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d - Frame 1 0.01 min" to the task controller queue 2022-04-08 11:27:04 FINEST io.github.mzmine.taskcontrol.impl.TaskQueue addWrappedTask Adding task "Computing values for dataset 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d - Frame 1 0.01 min" to the task controller queue 2022-04-08 11:27:04 FINEST io.github.mzmine.taskcontrol.impl.TaskQueue addWrappedTask Adding task "Building mobilograms 
and tic data sets for Ims raw data overview." to the task controller queue 2022-04-08 11:27:04 INFO io.github.mzmine.modules.visualization.chromatogram.TICDataSet run TIC data calculated for 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d 2022-04-08 11:27:04 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Processing of task Updating TIC visualizer of 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d done, status FINISHED 2022-04-08 11:27:04 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Starting processing of task Computing values for dataset 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d - Frame 1 0.01 min 2022-04-08 11:27:04 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Starting processing of task Computing values for dataset Total ion mobilogram for 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d - Frame 1 0.01 min 2022-04-08 11:27:04 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Starting processing of task Building mobilograms and tic data sets for Ims raw data overview. 2022-04-08 11:27:04 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Starting processing of task Computing values for dataset 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d - Frame 1 0.01 min 2022-04-08 11:27:04 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Processing of task Computing values for dataset 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d - Frame 1 0.01 min done, status FINISHED 2022-04-08 11:27:04 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Processing of task Computing values for dataset Total ion mobilogram for 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d - Frame 1 0.01 min done, status FINISHED 2022-04-08 11:27:04 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Processing of task Building mobilograms and tic data sets for Ims raw data overview. 
done, status FINISHED 2022-04-08 11:27:04 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Processing of task Computing values for dataset 220402_Alessandro_test3_24h_2_Slot2-54_1_1512.d - Frame 1 0.01 min done, status FINISHED 2022-04-08 12:02:57 FINEST io.github.mzmine.taskcontrol.impl.TaskQueue addWrappedTask Adding task "Calculating base peak chromatogram(s) of m/z 274.2745 in 1 file(s)." to the task controller queue 2022-04-08 12:02:57 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Starting processing of task Calculating base peak chromatogram(s) of m/z 274.2745 in 1 file(s). 2022-04-08 12:02:57 FINEST io.github.mzmine.modules.visualization.chromatogramandspectra.ChromatogramAndSpectraVisualizer lambda$delayedFeatureDataUpdate$18 FeatureUpdate status changed from WAITING to PROCESSING 2022-04-08 12:02:57 FINEST io.github.mzmine.taskcontrol.impl.TaskQueue addWrappedTask Adding task "Calculating scan data sets for 1 raw data file(s)." to the task controller queue 2022-04-08 12:02:58 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Starting processing of task Calculating scan data sets for 1 raw data file(s). 2022-04-08 12:02:58 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Processing of task Calculating scan data sets for 1 raw data file(s). done, status FINISHED 2022-04-08 12:03:00 FINEST io.github.mzmine.modules.visualization.chromatogramandspectra.ChromatogramAndSpectraVisualizer lambda$delayedFeatureDataUpdate$18 FeatureUpdate status changed from PROCESSING to FINISHED 2022-04-08 12:03:00 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Processing of task Calculating base peak chromatogram(s) of m/z 274.2745 in 1 file(s). done, status FINISHED 2022-04-08 12:03:20 FINEST io.github.mzmine.taskcontrol.impl.TaskQueue addWrappedTask Adding task "Calculating base peak chromatogram(s) of m/z 286.1762 in 1 file(s)." 
to the task controller queue 2022-04-08 12:03:20 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Starting processing of task Calculating base peak chromatogram(s) of m/z 286.1762 in 1 file(s). 2022-04-08 12:03:20 FINEST io.github.mzmine.modules.visualization.chromatogramandspectra.ChromatogramAndSpectraVisualizer lambda$delayedFeatureDataUpdate$18 FeatureUpdate status changed from WAITING to PROCESSING 2022-04-08 12:03:20 FINEST io.github.mzmine.taskcontrol.impl.TaskQueue addWrappedTask Adding task "Calculating scan data sets for 1 raw data file(s)." to the task controller queue 2022-04-08 12:03:21 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Starting processing of task Calculating scan data sets for 1 raw data file(s). 2022-04-08 12:03:21 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Processing of task Calculating scan data sets for 1 raw data file(s). done, status FINISHED 2022-04-08 12:03:24 FINEST io.github.mzmine.taskcontrol.impl.TaskQueue addWrappedTask Adding task "Calculating base peak chromatogram(s) of m/z 136.0759 in 1 file(s)." to the task controller queue 2022-04-08 12:03:24 FINEST io.github.mzmine.taskcontrol.impl.TaskQueue addWrappedTask Adding task "Calculating scan data sets for 1 raw data file(s)." to the task controller queue 2022-04-08 12:03:24 FINEST io.github.mzmine.modules.visualization.chromatogramandspectra.ChromatogramAndSpectraVisualizer lambda$delayedFeatureDataUpdate$18 FeatureUpdate status changed from PROCESSING to FINISHED 2022-04-08 12:03:24 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Processing of task Calculating base peak chromatogram(s) of m/z 286.1762 in 1 file(s). done, status FINISHED 2022-04-08 12:03:24 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Starting processing of task Calculating scan data sets for 1 raw data file(s). 
2022-04-08 12:03:24 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Starting processing of task Calculating base peak chromatogram(s) of m/z 136.0759 in 1 file(s). 2022-04-08 12:03:24 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Processing of task Calculating scan data sets for 1 raw data file(s). done, status FINISHED 2022-04-08 12:03:24 FINEST io.github.mzmine.modules.visualization.chromatogramandspectra.ChromatogramAndSpectraVisualizer lambda$delayedFeatureDataUpdate$18 FeatureUpdate status changed from WAITING to PROCESSING 2022-04-08 12:03:24 FINEST io.github.mzmine.modules.visualization.chromatogramandspectra.FeatureDataSetCalc run No scans found for 220402_Alessandro_test3_Plasma24h-2.raw 2022-04-08 12:03:24 FINEST io.github.mzmine.modules.visualization.chromatogramandspectra.ChromatogramAndSpectraVisualizer lambda$delayedFeatureDataUpdate$18 FeatureUpdate status changed from PROCESSING to FINISHED 2022-04-08 12:03:24 INFO io.github.mzmine.taskcontrol.impl.WorkerThread run Processing of task Calculating base peak chromatogram(s) of m/z 136.0759 in 1 file(s). done, status FINISHED 2022-04-08 12:04:26 INFO io.github.mzmine.gui.mainwindow.MainWindowController runModule Menu item activated for module io.github.mzmine.modules.io.exportscans.ExportScansFromRawFilesModule 2022-04-08 12:07:23 FINEST io.github.mzmine.gui.helpwindow.HelpWindow <init> Loading help file jar:file:/F:/OneDrive%20-%20NTNU/Downloads/MZmine_Windows_portable_3.0.21-beta/app/mzmine3-3.0.21.jar!/aboutpage/AboutMZmine.html 2022-04-08 12:07:29 FINEST io.github.mzmine.gui.helpwindow.HelpWindowController handleClose Closing help window Thanks for the report! Will be fixed in the next release. For now, you should be able to achieve the same thing via "Raw data methods -> Raw data export -> mzML" Thanks @SteffenHeu , just to update, i tried the mzML export and see this warning I guess it is safe to ignore? 
Hi animesh, this basically means that the module was not designed with IMS data in mind. It will only export the merged frame spectra and not IMS-resolved scans. OK, so it just concatenates the IMS dimension? I am assuming it will be the same when the MGF module gets functional? Yes. If you want access to individual scans, we recommend MSConvert.
gharchive/issue
2022-04-08T10:10:17
2025-04-01T06:45:05.695178
{ "authors": [ "SteffenHeu", "animesh" ], "repo": "mzmine/mzmine3", "url": "https://github.com/mzmine/mzmine3/issues/658", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
95664112
i can't load linkedin home page? Hey everyone; I want to capture a PNG of the LinkedIn home page but I get an empty file. How can I do this?

var casper = require("casper").create();

casper.start('https://www.linkedin.com/nhome/');

casper.then(function() {
    this.echo('Title : ' + this.getTitle());
    this.capture('image.png');
});

casper.run();

thx for your help :)
I don't know if that could be the issue, but LinkedIn is blocking some servers from accessing their content, for example the Heroku ones: see here or here
I don't think so, because when I use the URL with https it's not working, but with http it works fine.
Oh, so maybe you could try playing with the --ssl-protocol and --ignore-ssl-errors CL options: http://phantomjs.org/api/command-line.html
I tried [sslv3|sslv2|tlsv1|any] but it's the same; I think it could work with --ssl-certificates-path= but I don't have any idea how to use it :/
up
As part of a cleanup effort: Looks to be a stale help request. Assuming you've moved on to better and cooler things. Please feel free to re-open if this is still an active issue for you and we'll try our best to help out.
gharchive/issue
2015-07-17T14:12:21
2025-04-01T06:45:05.788355
{ "authors": [ "BIGjuevos", "idir-ait", "ngotchac" ], "repo": "n1k0/casperjs", "url": "https://github.com/n1k0/casperjs/issues/1272", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
2206285301
🛑 m.n1l.dev is down In 52a0754, m.n1l.dev (https://m.n1l.dev) was down: HTTP code: 502 Response time: 611 ms Resolved: m.n1l.dev is back up in 69408ef after 16 minutes.
gharchive/issue
2024-03-25T17:18:45
2025-04-01T06:45:05.791152
{ "authors": [ "n1lsqn" ], "repo": "n1lsqn/status", "url": "https://github.com/n1lsqn/status/issues/125", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2324641239
🛑 rasp.n1l.dev is down In 5a25b50, rasp.n1l.dev (https://rasp.n1l.dev) was down: HTTP code: 0 Response time: 0 ms Resolved: rasp.n1l.dev is back up in 499caf1 after 45 minutes.
gharchive/issue
2024-05-30T03:56:03
2025-04-01T06:45:05.793596
{ "authors": [ "n1lsqn" ], "repo": "n1lsqn/status", "url": "https://github.com/n1lsqn/status/issues/306", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2331917877
🛑 rasp.n1l.dev is down In 601f8bb, rasp.n1l.dev (https://rasp.n1l.dev) was down: HTTP code: 0 Response time: 0 ms Resolved: rasp.n1l.dev is back up in f4b476c after 1 hour, 46 minutes.
gharchive/issue
2024-06-03T19:50:33
2025-04-01T06:45:05.796018
{ "authors": [ "n1lsqn" ], "repo": "n1lsqn/status", "url": "https://github.com/n1lsqn/status/issues/351", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2183483646
🛑 rasp.n1l.dev is down In 32a5871, rasp.n1l.dev (https://rasp.n1l.dev) was down: HTTP code: 0 Response time: 0 ms Resolved: rasp.n1l.dev is back up in 0894361 after 7 minutes.
gharchive/issue
2024-03-13T09:16:42
2025-04-01T06:45:05.798379
{ "authors": [ "n1lsqn" ], "repo": "n1lsqn/status", "url": "https://github.com/n1lsqn/status/issues/94", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1629697764
Added Popup window
In this milestone the following functionalities were added:
When the user clicks (or taps) the button to check project details, the popup with details about the project appears.
When the user clicks (or taps) the close (X) button, the popup disappears.
The popup window was implemented. Popups for both mobile and desktop screens were implemented.
As per the suggestions of @AlphaNtihinduka I have fixed the overflow using overflow: scroll;
gharchive/pull-request
2023-03-17T17:41:50
2025-04-01T06:45:05.849668
{ "authors": [ "naanmohammed" ], "repo": "naanmohammed/my-portfolio", "url": "https://github.com/naanmohammed/my-portfolio/pull/25", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
434330083
Covariance matrix of the bi-variate Gaussians It seems that the covariance matrix of the bivariate Gaussian is assumed to be diagonal, isn't it? Isn't that a strong assumption? It isn't, the model outputs the correlation coefficient in addition to the means and standard deviations Ok, you're right, I mixed it up with your other publication "Multi-Modal Trajectory Prediction of Surrounding Vehicles with Maneuver based LSTMs"
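For reference, with the predicted correlation coefficient the full (non-diagonal) covariance of the bivariate Gaussian follows directly from the output parameters; a quick numpy sketch (variable names are illustrative, not from the paper's code):

```python
import numpy as np

def bivariate_cov(sigma_x, sigma_y, rho):
    # Full 2x2 covariance from standard deviations and correlation:
    # the off-diagonal term rho * sigma_x * sigma_y makes it non-diagonal.
    off = rho * sigma_x * sigma_y
    return np.array([
        [sigma_x**2, off],
        [off, sigma_y**2],
    ])

cov = bivariate_cov(2.0, 3.0, 0.5)
print(cov[0, 1])  # 3.0 = 0.5 * 2.0 * 3.0
```

Only when rho is zero does this reduce to the diagonal case.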
gharchive/issue
2019-04-17T14:56:47
2025-04-01T06:45:05.856413
{ "authors": [ "nachiket92", "stratomaster31" ], "repo": "nachiket92/conv-social-pooling", "url": "https://github.com/nachiket92/conv-social-pooling/issues/10", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1517164714
Array of strings syntax sugar Maybe [ "hello", "world", "this", "is", "an", "array", "of", "strings" ] and "\,hello\,world\,this\,is\,an\,array\,of\,strings" could be equivalent, and an easy way to concatenate string arrays or append/prepend individual items would be useful. Add a str PathJoin(str a, str b) function (checks b is relative and removes an optional trailing "/" from a). Add syntax sugar using the divide operator on strings: that is, a/b and PathJoin(a, b) would be equivalent.
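The PathJoin semantics described (b must be relative, and an optional trailing "/" is dropped from a) could be sketched like this, illustrated in Python since the behavior is language-neutral (the teak implementation would differ):

```python
def path_join(a: str, b: str) -> str:
    # Reject absolute right-hand sides, per the proposal.
    if b.startswith("/"):
        raise ValueError("b must be a relative path")
    # Drop an optional trailing "/" from the left-hand side.
    if a.endswith("/"):
        a = a[:-1]
    return a + "/" + b

print(path_join("/usr/local/", "lib"))  # /usr/local/lib
print(path_join("/usr/local", "lib"))   # /usr/local/lib
```

With the proposed sugar, a/b would desugar to exactly this call.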
gharchive/issue
2023-01-03T09:59:57
2025-04-01T06:45:05.891330
{ "authors": [ "nakst" ], "repo": "nakst/teak", "url": "https://github.com/nakst/teak/issues/1", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1651055390
Problem installing extensions into DB Browser for SQLite
Version: (Mac M1, macOS 13.2) DB Browser for SQLite Version 3.12.2 Built for arm64-little_endian-lp64, running on arm64 Qt Version 5.15.6 SQLCipher Version 4.5.1 community (based on SQLite)

select load_extension('/usr/local/lib/sqlean/stats');
select median(value) from generate_series(1, 99);

Results in:

Execution finished with errors. Result: dlopen(/usr/local/lib/sqlean/stats.dylib, 0x000A): tried: '/usr/local/lib/sqlean/stats.dylib' (code signature in <9D31DCA1-36E5-34B2-9FF0-BE51FE7B72E7> '/usr/local/lib/sqlean/stats.dylib' not valid for use in process: mapped file has no Team ID and is not a platform binary (signed with custom identity or adhoc?)), '/System/ At line 1: select load_extension('/usr/local/lib/sqlean/stats');

Works from the command line:

sqlite> .load /usr/local/lib/sqlean/stats.dylib
sqlite> select median(value) from generate_series(1, 99);
50.0
sqlite>

Even adding it to the macOS Applications folder produced the same results: Execution finished with errors.
Result: dlopen(/Applications/DB Browser for SQLite.app/Contents/PlugIns/sqlean/stats.dylib, 0x000A): tried: '/Applications/DB Browser for SQLite.app/Contents/PlugIns/sqlean/stats.dylib' (code signature in <9D31DCA1-36E5-34B2-9FF0-BE51FE7B72E7> '/Applications/DB Browser for SQLite.app/Contents/PlugIns/sqlean/stats.dylib' not valid for use in process: mapped file has no Team At line 1: select load_extension('/Applications/DB Browser for SQLite.app/Contents/PlugIns/sqlean/stats');
PS: I am sure the issue is related to M1 code signing:

codesign -dv -r- stats.dylib
Executable=/Applications/DB Browser for SQLite.app/Contents/PlugIns/sqlean/stats.dylib
Identifier=stats.dylib
Format=Mach-O thin (arm64)
CodeDirectory v=20400 size=516 flags=0x20002(adhoc,linker-signed) hashes=13+0 location=embedded
Signature=adhoc
Info.plist=not bound
TeamIdentifier=not set
Sealed Resources=none
# designated => cdhash H"0fd9e8853f07bf426f3388a50c006ec45e588684"

The TeamIdentifier seems to be the issue: https://wiki.lazarus.freepascal.org/Code_Signing_for_macOS
Have you tried removing the extension binary from the quarantine?

xattr -d com.apple.quarantine "/Applications/DB Browser for SQLite.app/Contents/PlugIns/sqlean/stats.dylib"

I saw that and removed that file and replaced it with a new one.
——————————————————— Colin A. Bitterfield Mailto: @.*** Mobile: (571) 533-4700 ———————————————————
Sorry, I don't see how that answers my question 🤔 I'm going to close the issue as I haven't received an answer. Let me know when you are ready to resume the conversation.
I removed the quarantine and tried to call Apple. The issue seems to be related to being opened from within an application. I tried moving the DB Browser app to user/Applications, no change. Using spctl --assess it shows the library as rejected. Did you run the exact command I wrote earlier? What was the output? Oh, I see. Thanks for the detailed explanation! I'll investigate further. I requested the SQLiteBrowser people add your libraries in their build. They thought it was a good idea. Thank you very much! I haven't looked into the issue yet, but I'll come back when I have new information.
gharchive/issue
2023-04-02T18:54:17
2025-04-01T06:45:05.913965
{ "authors": [ "cbitterfield", "nalgeon" ], "repo": "nalgeon/sqlean", "url": "https://github.com/nalgeon/sqlean/issues/76", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1063615496
I think we should add these 2 hooks usePrevious and useComparingProp The usePrevious would look like

function usePrevious(value: any) {
  const ref = useRef();
  useEffect(() => {
    ref.current = value;
  }, [value]);
  return ref.current;
}

The useComparingProp would look like

type FnOnChange<T> = (o: { currentValue: T, prevValue: T }) => void;
type FnCompare<T> = (prev: T, next: T) => boolean;

function useComparingProp<T>(currentValue: T, compare: FnCompare<T>, onChange: FnOnChange<T>, deps: any[] = []) {
  const prevValue = usePrevious(currentValue);
  useEffect(() => {
    if (compare(prevValue, currentValue)) {
      onChange({ currentValue, prevValue });
    }
  }, [currentValue, prevValue, onChange].concat(deps));
}

Hi @finnng, thanks for your idea. These two hooks look great to me. In my opinion, in the useComparingProp, we should cache the onChange handler to make sure that we don't have any unnecessary rerenders.
gharchive/issue
2021-11-25T13:51:27
2025-04-01T06:45:05.916148
{ "authors": [ "finnng", "namKolo" ], "repo": "namKolo/hookdafish", "url": "https://github.com/namKolo/hookdafish/issues/3", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
664345719
Update for Twig 2 This PR is based on PR https://github.com/namics/twig-nitro-library/pull/8 Using "null" for the value of node "data" of "Namics\Terrific\Twig\Node\ComponentNode" is deprecated since version 1.25 and will be removed in 2.0. Closing in favor of https://github.com/namics/twig-nitro-library/pull/9 Created new branch to trigger Packagist build
gharchive/pull-request
2020-07-23T09:59:54
2025-04-01T06:45:05.945091
{ "authors": [ "marcosimbuerger", "orlandothoeny" ], "repo": "namics/twig-nitro-library", "url": "https://github.com/namics/twig-nitro-library/pull/9", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1881729051
🛑 aaaaa.ooo is down In f957b69, aaaaa.ooo (https://aaaaa.ooo) was down: HTTP code: 502 Response time: 4003 ms Resolved: aaaaa.ooo is back up in d74d12a after 27 days, 11 hours, 48 minutes.
gharchive/issue
2023-09-05T10:58:14
2025-04-01T06:45:05.965319
{ "authors": [ "Ishydo" ], "repo": "nanhosting/monitoring", "url": "https://github.com/nanhosting/monitoring/issues/921", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2256482585
🛑 Personal Website is down In 55a9f34, Personal Website (https://nanthakumaran.com) was down: HTTP code: 0 Response time: 0 ms Resolved: Personal Website is back up in e97d3ab after 21 minutes.
gharchive/issue
2024-04-22T13:00:36
2025-04-01T06:45:06.079037
{ "authors": [ "nanthakumaran-s" ], "repo": "nanthakumaran-s/upptime", "url": "https://github.com/nanthakumaran-s/upptime/issues/14", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1221795696
How to distinguish between 2 layers selected, and 1 layer selection changing? 🐛 Bug I am writing a plugin that needs to do something when two layers are selected. To do this I'm connecting to the viewer.layers.selection.events.changed event. When two layers are selected by the user this works fine. However, when the user has one layer selected and switches to another single layer, the changed event is triggered twice: the first time when the new layer is added to the selection, and the second time when the old layer is removed. This is annoying, because I have no way to distinguish between:
The first event when changing between single layers, where a new layer is present alongside an old layer.
When two layers are "permanently" selected
I need a way to distinguish between the two cases above. Perhaps I've missed another way of doing this? Or perhaps this is impossible to solve due to the way SelectableEventedList is implemented?
To Reproduce Steps to reproduce the behavior:
Run the code below
Select individual layers. Note how the callback triggers with 2 layers selected, then 1 selected.
Select two layers (using ctrl to select the second). Note how the callback triggers with 2 layers selected.

import napari

viewer = napari.Viewer()
viewer.open_sample("napari", "kidney")

def active_changed():
    print(f"{len(viewer.layers.selection)} layers selected")

viewer.layers.selection.events.changed.connect(active_changed)

if __name__ == "__main__":
    napari.run()

Environment Current main branch of napari. Interestingly, this is just with the GUI. Programmatically the opposite occurs, e.g. using:

viewer.layers.selection = {viewer.layers[1]}

followed by:

viewer.layers.selection = {viewer.layers[0]}

I get:

viewer.layers.selection = {viewer.layers[0]}
0 layers selected
1 layers selected

I think that way is better? First deselect, then select? But yeah, the deselect in this case should probably be swallowed?
This is related to this question I had: https://napari.zulipchat.com/#narrow/channel/212875-general/topic/layer.20selection.20events
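A napari-free sketch of one workaround: coalesce the notifications and only inspect the selection once the burst of mutations settles (in the real viewer the settling point could be a zero-delay Qt timer; here it is an explicit flush() call, and a plain set stands in for viewer.layers.selection; both are illustrative assumptions, not napari API):

```python
class CoalescingWatcher:
    def __init__(self, selection, on_settled):
        self.selection = selection
        self.on_settled = on_settled
        self._dirty = False

    def notify_changed(self, *args, **kwargs):
        # Connected to selection.events.changed; just marks work pending.
        self._dirty = True

    def flush(self):
        # Called after the burst of mutations settles.
        if self._dirty:
            self._dirty = False
            self.on_settled(set(self.selection))

selection = set()
seen = []
watcher = CoalescingWatcher(selection, lambda s: seen.append(len(s)))

# GUI-style switch from layer "a" to layer "b": two changed events,
# with a transient {"a", "b"} state in between.
selection.add("a"); watcher.notify_changed(); watcher.flush()
selection.add("b"); watcher.notify_changed()     # transient: 2 layers selected
selection.discard("a"); watcher.notify_changed()
watcher.flush()
print(seen)  # [1, 1]: the transient 2-layer state never reaches on_settled
```

A real two-layer ctrl-click selection would settle at 2 and trigger the plugin as intended.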
gharchive/issue
2022-04-30T10:16:46
2025-04-01T06:45:06.103670
{ "authors": [ "dstansby", "psobolewskiPhD" ], "repo": "napari/napari", "url": "https://github.com/napari/napari/issues/4455", "license": "BSD-3-Clause", "license_type": "permissive", "license_source": "github-api" }
479316129
Hosted docs 📚 Documentation I think now would be a good time for us to start hosting our API docs on a public site. I view this as complementary to, but connected to, our current tutorials page Using readthedocs is fine with me, I don't know if CZI will pay to remove ads, but that would be reasonable. Alternatively there is https://github.com/freeman-lab/minidocs made by our very own @freeman-lab or direct integration with our tutorials page. Thoughts @kne42 @AhmetCanSolak @jni? There was a recent discussion around hosted docs on twitter that might be of interest too. I use doctr for the skan docs and really like it: https://jni.github.io/skan/ That would be my recommendation. The only thing I haven't set up yet is multiple hosted versions, but I think that's not hard to do.
gharchive/issue
2019-08-10T23:42:36
2025-04-01T06:45:06.109649
{ "authors": [ "jni", "kne42", "sofroniewn" ], "repo": "napari/napari", "url": "https://github.com/napari/napari/issues/473", "license": "BSD-3-Clause", "license_type": "permissive", "license_source": "github-api" }
933457771
addressing case wher argument to get_default_shape_type is empty list, addresses issue #2960 Description Type of change bugfix, see #2960 checks for empty list. I also changed the docstring to be more accurate and changed the default type that is returned to the type "polygon" that was mentioned in the docstring. How has this been tested? manually checked whether #2960 has been fixed. Final checklist: [x] My PR is the minimum possible work for the desired functionality [x] I have commented my code, particularly in hard-to-understand areas [ ] I have made corresponding changes to the documentation [x] I have added tests that prove my fix is effective or that my feature works [ ] If I included new strings, I have used trans. to make them localizable. For more information see our translations guide. Looking at the tests I could see that the tests were expecting "rectangle" as shape type when the shape type list contained a mix of shape types. However, as every rectangle is a polygon but not every polygon is a rectangle, I thought it would be more appropriate here to return "polygon" as the default. This was also the behaviour described in the original docstring (but not the implemented behaviour). However, I can only guess what the original intention was, so I am not sure this change is desired. This change also consitutes a bit of scope creep for this PR beyond fixing the empty list problem #2960. Thanks for catching this one @VolkerH! However, as every rectangle is a polygon but not every polygon is a rectangle, I thought it would be more appropriate here to return "polygon" as the default. This was also the behaviour described in the original docstring (but not the implemented behaviour). However, I can only guess what the original intention was, so I am not sure this change is desired. I agree re. the polygon being a superset of the rectangle and therefore likely a better choice. The original plan was indeed for polygon to become the default (hence the docstring). 
@sofroniewn thought it would be best to wait for 0.4.9 to make this change. I don't see an issue with changing it now while we're here, but of course will defer to @sofroniewn and @jni and what they think is best.
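The fix discussed above can be sketched in isolation (a simplified stand-in, not napari's actual implementation — the function name matches the issue, but the body here is illustrative):

```python
def get_default_shape_type(current_type_list):
    """Return a default shape type given the types already in the layer.

    Falls back to "polygon" for an empty or mixed list, since every
    rectangle is a polygon but not every polygon is a rectangle.
    """
    if not current_type_list:
        return "polygon"  # empty list: the crash reported in #2960
    first = current_type_list[0]
    if all(t == first for t in current_type_list):
        return first      # homogeneous list keeps its type
    return "polygon"      # mixed list: most general type wins
```

The mixed-list branch is where the behaviour changed: the old tests expected "rectangle" here, while the docstring always promised "polygon".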
gharchive/pull-request
2021-06-30T08:50:57
2025-04-01T06:45:06.116218
{ "authors": [ "DragaDoncila", "VolkerH" ], "repo": "napari/napari", "url": "https://github.com/napari/napari/pull/2961", "license": "BSD-3-Clause", "license_type": "permissive", "license_source": "github-api" }
2761970998
feat: expressify lower_bound and upper_bound in is_between What type of PR is this? (check all applicable) closes #1659 [ ] 💾 Refactor [ ] ✨ Feature [ ] 🐛 Bug Fix [ ] 🔧 Optimization [ ] 📝 Documentation [ ] ✅ Test [ ] 🐳 Other Related issues Related issue #<issue number> Closes #<issue number> Checklist [ ] Code follows style guide (ruff) [ ] Tests added [ ] Documented the changes If you have comments or can explain your changes, please do so below On second thought, we are creating some asymmetry between what we can pass to Series.is_between vs Expr.is_between, aren't we? I would expect to be able to pass Series to Series.is_between, e.g. some_series.is_between(lower_bound_series, upper_bound_series) Apologies if this is already possible, if so we may want to add a test case for it thanks for your review! yup, this is what we do in __eq__ to accept either scalars or expressions / series, we can almost certainly do it in more places (and probably reuse more code)
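The scalar-or-series question above can be illustrated with a small stand-alone sketch (plain Python lists stand in for Narwhals Series/Expr; the element-wise vs. broadcast dispatch is the point, not the real API):

```python
def is_between(values, lower_bound, upper_bound, closed="both"):
    """Element-wise between-check where each bound may be a scalar
    (broadcast to every element) or a sequence of equal length."""
    def bound(b, i):
        # sequences are indexed element-wise, scalars broadcast
        return b[i] if isinstance(b, (list, tuple)) else b

    out = []
    for i, v in enumerate(values):
        lo, hi = bound(lower_bound, i), bound(upper_bound, i)
        if closed == "both":
            out.append(lo <= v <= hi)
        elif closed == "left":
            out.append(lo <= v < hi)
        elif closed == "right":
            out.append(lo < v <= hi)
        else:  # "none"
            out.append(lo < v < hi)
    return out
```

With this shape, `some_series.is_between(lower_bound_series, upper_bound_series)` and the scalar form share one code path, which is the symmetry the review comment asks for.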
gharchive/pull-request
2024-12-28T21:31:30
2025-04-01T06:45:06.167676
{ "authors": [ "FBruzzesi", "MarcoGorelli" ], "repo": "narwhals-dev/narwhals", "url": "https://github.com/narwhals-dev/narwhals/pull/1672", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
355755190
4-pin headers 0.1 not 0.05" The specs call for common 0.1" headers on the PCBs. The master parts list, however, has a Digi-Key link to 0.05" headers. (as well as 40-pin 0.1" headers, which I see no reference to; perhaps to break into 4-pin units?) I searched the docs and found no reference to 0.05" anything. Error? If so, will need to swap out with Digi-Key. There should only be 0.1 inch standard pin headers and connectors. I'll check through for the digikey parts and fix error if necessary
gharchive/issue
2018-08-30T21:09:17
2025-04-01T06:45:06.173326
{ "authors": [ "JHPHELAN", "ericjunkins" ], "repo": "nasa-jpl/open-source-rover", "url": "https://github.com/nasa-jpl/open-source-rover/issues/34", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
2236215579
Feature/issue 147 empty detection GitHub Issue: #147 Description This change modifies the logic of the check for empty files, so that a netCDF file containing only singleton null-valued arrays is considered empty. Local test steps Added three unit tests: two check datasets that should be considered empty, one checks a dataset that should be considered not empty. Tests passed locally. Overview of integration done Explain how this change was integration tested. Provide screenshots or logs if appropriate. An example of this would be a local Harmony deployment. PR Acceptance Checklist [x] Unit tests added/updated and passing. [ ] Integration testing [x] CHANGELOG.md updated [ ] Documentation updated (if needed). I think this is ready for you to test @ank1m! Has passed my several local tests. Thanks @ank1m!  And it's great to see the visual of the results 😄
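The modified emptiness check can be sketched in miniature (plain Python dicts and lists stand in for netCDF variables here; the real code walks actual netCDF groups, so this is illustrative only):

```python
import math

def is_empty(dataset):
    """Treat a dataset as empty when every variable holds at most one
    value and that value is null (None or NaN)."""
    def is_null(x):
        return x is None or (isinstance(x, float) and math.isnan(x))

    for values in dataset.values():
        if len(values) > 1:
            return False  # a real array: file is not empty
        if values and not is_null(values[0]):
            return False  # a singleton holding actual data
    return True
```

The subtlety the issue addresses is the middle case: a file that is not byte-empty but carries only singleton null values should still be skipped.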
gharchive/pull-request
2024-04-10T18:51:01
2025-04-01T06:45:06.223076
{ "authors": [ "danielfromearth" ], "repo": "nasa/stitchee", "url": "https://github.com/nasa/stitchee/pull/152", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
1867218453
Odd repetitive behaviour in interactive "nat" Hey there Tim, Every now and then I test natalie to see how usable it has become. :) In the interactive nat, aka bin/natalie, we can type in ruby code and it is evaluated. The evaluation is a bit slow, so if you have time in the future, perhaps the parser/lexer or whatever else is evaluating the code could become faster. But this is an aside; the main gist here is the odd repetitive behaviour I see. First, here is the image, a partial screenshot of two separate natalie sessions ("irb" sessions): https://i.imgur.com/66IIVDB.png In particular in the middle of the image, I was typing "def foobar", and oddly enough, tons of extra "def foobar" lines appeared, but I only typed it once. I believe something in the interactive natalie repeats this, probably to help with spacing and indent (or this is what I think may be going on). The awkward thing is that sometimes it does NOT repeat it, and sometimes it repeats it like those 8 times. I am not sure how to easily reproduce the bug, but I believe that in whatever routine is used to determine repetition, something may be wrong since it seems to repeat it when it really should not (or, at the least, not repeat it more than once, not sure why it is ever repeated more than once ... I could perhaps wrap my head around repeating it once, if you want to show things with some indent, but it is like repeated 5 times sometimes, and I don't understand why or the rationale behind this, so I believe this is most likely a small bug). Anyway, just reporting this in the event it comes to your attention in the future if you re-visit the irb-like part of natalie again. Onwards to 100% compatibility! \o/ Oh yes, what may not be instantly obvious - at first the "foobar" is white, which I think comes from when I type it, but the subsequent repetitions are yellow in colour, so probably some re-parsing and redrawing or something occurred, because the colour changed. 
And also perhaps a way can be added to disable this, in the event it causes oddities. The evaluation is a bit slow, so if you may have time in the future, perhaps the parser/lexer or whatever else is evaluating the code, could become faster. Yes, this is expected. Natalie will not (in the near future at least) be very fast at evaluating a single line of code in the interactive REPL, because Natalie is an AOT compiler. Natalie must compile each line to machine code in order to execute it. So that probably won't get a lot better until we implement the rest of our VM, which is not the top of my priorities right now. In particular in the middle of the image, I was typing "def foobar", and oddly enough, tons of extra "def foobar" lines appeared, but I only typed it once. Thank you for the bug report. Our REPL was rewritten to support autocomplete and some other interactive features, but I have noticed there are some bugs with the repetition, as you have mentioned. We can leave this issue open as a tracker for the buggy REPL. Thanks.
gharchive/issue
2023-08-25T15:04:23
2025-04-01T06:45:06.247625
{ "authors": [ "rubyFeedback", "seven1m" ], "repo": "natalie-lang/natalie", "url": "https://github.com/natalie-lang/natalie/issues/1160", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
370070096
config.toml to show list post and show icon ? help My config.toml:

baseurl = "https://example.com"
languageCode = "en-us"
title = "Desawarna"
theme = "charaka-hugo-theme"
copyright = "&copy; <a href=\"https://github.com/desawarna\">Desawarna</a> 2018"
disqusShortname = ""
googleAnalytics = ""

[params]

[params.highlight]
style = "zenburn"

[[params.social]]
url = "about.html"
fa _icon = "fas fa-info"

[[params.social]]
url = "https://github.com/desawarna"
fa_icon = "fab fa-github"

[[params.social]]
url = "https://www.linkedin.com/in/desawarna/"
fa_icon = "fab fa-linkedin-in"

[[params.social]]
url = "https://twitter.com/desawarna"
fa_icon = "fab fa-twitter"

And error in terminal :smile:

ERROR 2018/10/15 16:09:33 Failed to reload config: While parsing config: (16, 6): was expecting token =, but got "_icon" instead
ERROR 2018/10/15 16:13:06 Failed to reload config: While parsing config: (16, 6): was expecting token =, but got "_icon" instead
ERROR 2018/10/15 16:13:30 Failed to reload config: While parsing config: (16, 6): was expecting token =, but got "_icon" instead
ERROR 2018/10/15 16:13:38 Failed to reload config: While parsing config: (16, 6): was expecting token =, but got "_icon" instead
ERROR 2018/10/15 16:14:03 Failed to reload config: While parsing config: (16, 6): was expecting token =, but got "_icon" instead

You have a space in "fa _icon" under the about.html entry Thanks Mr @natarajmb 😁
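For reference, the fix is just removing the stray space so the key parses as a single bare key (standard TOML syntax; only this social entry needs to change):

```toml
[[params.social]]
url = "about.html"
fa_icon = "fas fa-info"
```

The parser error pointed at exactly this: at (16, 6) it read the bare key `fa`, then expected `=`, but found `_icon` instead.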
gharchive/issue
2018-10-15T09:16:41
2025-04-01T06:45:06.250251
{ "authors": [ "desawarna", "natarajmb" ], "repo": "natarajmb/charaka-hugo-theme", "url": "https://github.com/natarajmb/charaka-hugo-theme/issues/2", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
339243409
new logo/icon Hi, I am a graphic designer and I want to help others with graphic design. After reviewing your project, I noticed it has no logo, so I would like to contribute to this project by creating a new logo/icon. What do you think? Feel free to do it! We could use it as the sample app icon I have designed a logo for this project. What do you think? Do you like it? It is a half-circle design shaped like the letter C, taken from the initial of the word Complete, combined with the letter A from the initial of the word Auto; the green color of the icon conveys completeness. What about the logo? I really hope for a positive response from you.
gharchive/issue
2018-07-08T17:39:32
2025-04-01T06:45:06.254436
{ "authors": [ "mansya", "natario1" ], "repo": "natario1/Autocomplete", "url": "https://github.com/natario1/Autocomplete/issues/18", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
244215737
Differences between scriptTemplate.sh and simpleScriptTemplate.sh? Hello there. Thanks for those templates! I'm wondering what the differences are between scriptTemplate.sh and simpleScriptTemplate.sh? I don't see anything in the readme about simpleScriptTemplate.sh. Thanks! Same question! no longer relevant
gharchive/issue
2017-07-20T00:54:33
2025-04-01T06:45:06.266437
{ "authors": [ "mtx-z", "natelandau", "pascalandy" ], "repo": "natelandau/shell-scripting-templates", "url": "https://github.com/natelandau/shell-scripting-templates/issues/2", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
795054263
Feat/ep 3709 create custom hooks location All boxes should be checked before the PR can be accepted. Exceptions can be made, but they need to be justified and discussed with the reviewer STORY Required [ ] Added QA classes on main containers and all actionable items like Buttons, Inputs, etc [ ] Translated all the user facing text (static & dynamic) [ ] Ran Unit Tests and they pass successfully [ ] Added unit tests for the code in earth-shared [ ] Ran the linter with yarn lint [ ] No TypeScript warnings are introduced [ ] I have made corresponding changes to the documentation Not Needed [ ] Drag here BUGFIX Required [ ] Translated all the user facing text (static & dynamic) [ ] Ran Unit Tests and they pass successfully [ ] Ran the linter with yarn lint [ ] No TypeScript warnings are introduced [ ] I have made corresponding changes to the documentation Not Needed [ ] Drag here References Include any links supporting this change such as a: GitHub Issue/PR number addressed or fixed StackOverflow post Support forum thread Related pull requests/issues from other repos If there are no references, simply delete this section. By submitting a PR to this repository, you agree to the terms within the Code of Conduct. Please see the contributing guidelines for how to create and submit a high-quality PR for this repo. :tada: This PR is included in version 1.17.0 :tada: The release is available on GitHub release Your semantic-release bot :package::rocket:
gharchive/pull-request
2021-01-27T12:21:39
2025-04-01T06:45:06.319667
{ "authors": [ "dan-qc" ], "repo": "natgeosociety/marapp-frontend", "url": "https://github.com/natgeosociety/marapp-frontend/pull/456", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
1669372330
Fix issues around modbus TCP connections/reconnections Disable Nagle's algorithm after a reconnection. The socket is recreated on reconnection, and we weren't re-disabling Nagle's. Remove ModbusClient.connect. It turns out that pymodbus automatically connects the first time that you try and read/write something anyway. I think that the initial auto_connect connection was messing up my W610 as well, as I often got a failed read on the first read attempt, and would have to wait for the automatic reconnection to happen. Removing the auto_connect seems to fix this. Have you tested reconnections - i.e. disconnecting the adapter/485 for ~5 mins then reconnecting? The native HA modbus integration also maintains the connection in a background thread -> https://github.com/home-assistant/core/blob/dev/homeassistant/components/modbus/modbus.py#L339 I tested disconnecting for long enough that the connection was dropped, then watched it reconnect. I'll take another look at the modbus integration again tomorrow, try and work out why it's doing what it's doing. It might be related to the need to add a delay after a connection before using it? I don't immediately see anything in there to handle a reconnection as well, although I am on mobile. I wanted to spend some time looking into what actually happens if you tell the modbus integration to poll a sensor with the W610, which we don't think can handle that - whether it does something different which makes it all work, or whether it just silently falls over. I'll debug into (re-) connections as part of that. I do suspect that we are telling pymodbus to connect on one thread, then doing a read which is triggering a connection on another thread, and the two are interfering somehow. I very often get an error saying the read failed with 0 bytes read some minuscule amount of time into the transaction when HA first starts, and this PR appears to resolve it. I did some archaeology... 
That connect() call was first introduced here, as part of the move to async: https://github.com/home-assistant/core/pull/34043 . It just slipped in - no one commented on it, and it wasn't justified. After that it went through various changes as other things were refactored and improved, but it seems that the assumption was that it was always needed (and maybe it is for the async client, I'm not sure). This is the reconnect call, FWIW: https://github.com/pymodbus-dev/pymodbus/blob/dev/pymodbus/client/base.py#L180 Good digging! I'll take a deeper look and take it for a test drive when I'm back on the laptop but seems to make sense. I think the connect can also be done as part of the class initialisation (sync/async) -> https://github.com/pymodbus-dev/pymodbus/blob/dev/pymodbus/client/base.py#L346 The docs for pymodbus say that it'll automatically reconnect after a successful initial connection. Well 1) that doesn't appear to be accurate - you don't actually need the initial connection, and 2) neither we nor the HA integration make sure that the initial connection is successful - we just give it a punt, and if it fails fall back to the built-in reconnection logic. Also, the HA integration calls connect with the lock held, and we don't. I suspect that's important and avoids the issue I've been seeing, but I'll need to do some more tests with some extra logging to be sure. Hmm OK I added some logging to connect, and it doesn't look like the two calls (one during setup, one when we read the first register) overlap in practice (that's just because there's a 10s gap between the initial connection and the first poll).
But I still get: 2023-04-16 10:27:55.314 DEBUG (MainThread) [custom_components.foxess_modbus.modbus_controller] Modbus exception when polling - Modbus Error: [Connection] ModbusTcpClient(192.168.10.8:502): Connection unexpectedly closed 0.00450444221496582 seconds into read of 8 bytes without response from unit before it closed connection Also, the HA integration calls connect with the lock held, and we don't. I suspect that's important and avoids the issue I've been seeing, but I'll need to do some more tests with some extra logging to be sure. I'm wrong about that. We call _async_pymodbus_call, which does take the lock. However, the error above does seem to stop appearing when I remove the auto_connect logic -- I get the error about 1 time in 2 with the auto_connect present, and I've yet to see it with it absent. So, I'm more confused this morning than I was last night -- I'm not sure what's going on. Thinking today, I wonder whether the W610 simply didn't like the large gap between the first connection and the first request. Now that we've reduced that time, however, we might be affecting direct LAN connections. There's talk that people using HA-foxess-modbus needed a delay: 1 when using a direct LAN connection, and we now don't have any delay here.
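The Nagle fix itself is a one-liner at the socket level; a minimal sketch (the helper name and where you hook it in are assumptions here — pymodbus exposes the underlying socket differently across versions):

```python
import socket

def disable_nagle(sock: socket.socket) -> None:
    # Nagle's algorithm batches small writes; Modbus frames are tiny,
    # so we want each request on the wire immediately.
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
```

Because the socket is recreated on reconnection, this has to run after every successful (re)connect rather than only once at startup — which is exactly the bug this PR fixes.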
gharchive/pull-request
2023-04-15T12:53:43
2025-04-01T06:45:06.336185
{ "authors": [ "canton7", "nathanmarlor" ], "repo": "nathanmarlor/foxess_modbus", "url": "https://github.com/nathanmarlor/foxess_modbus/pull/153", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1817142158
HeroMove Transition for larger views Hi @nathantannar4! Hope you're doing well, I just stumbled across this repo and it's exactly what I've needed for a long time, thanks a lot for doing this! Just one thing I noticed, for the hero move transition, it only really seems to work well if the source view is fairly small (like the blue square example or the pictures in the grid in your example app). I have some cards in my app that I'd like to animate an expanding transition for (a bit like the cards in the App Store), but it doesn't seem to expand as smoothly (e.g. the height doesn't really animate), and sort of fades instead, like this: For reference, https://github.com/sebjvidal/UIViewControllerAnimatedTransitioning-Demo has some cards that are almost full-width that seem to expand quite nicely, or the below example: If I am doing something wrong, or if there's any way to achieve something like this please let me know! Thanks a lot for your help :) The hero move transition is not feature complete. It's just there as a demonstration of what is possible and so one could adapt it to how they would like it to behave. That's why it's not in the library itself, just the example project. You'll need to change the transition code. Currently when it the interaction ends, it fades between the two views. Ahh I see, thanks for clarifying. I don't actually use UIKit myself, would you be able to point me to whereabouts it says the views should fade? And I can try and figure something out :) You'll need to learn about UIViewControllerInteractiveTransitioning and UIViewControllerAnimatedTransitioning. Look at the source code for the heroMove transition in the example and you'll see it. Perfect, could you point me to whereabouts in the source code the fade happens? 
To be honest it's already very close to what I would like to happen anyway; as I said, it just needs to expand and shrink properly even with the source view being the size it is (as it does when the source view is smaller, as in the GIF above). As I am not too familiar with UIKit myself, would you be able to highlight how I could do this? I was thinking maybe there's some logic in your code that checks the sizing and behaves accordingly? If not, if you could point me in the right direction I would really appreciate it, as long as it doesn't take too much of your time!
gharchive/issue
2023-07-23T14:06:29
2025-04-01T06:45:06.348541
{ "authors": [ "Saim-Khan1", "nathantannar4" ], "repo": "nathantannar4/Transmission", "url": "https://github.com/nathantannar4/Transmission/issues/5", "license": "BSD-2-Clause", "license_type": "permissive", "license_source": "github-api" }
1834274785
error when building fresh project I get the following error TypeError: The 'compilation' argument must be an instance of Compilation when running ns run android after a fresh git clone and pnpm install: full log: TypeError: The 'compilation' argument must be an instance of Compilation at NormalModule.getCompilationHooks (/Users/bigmistqke/Documents/GitHub/solid-native2/node_modules/webpack/lib/NormalModule.js:227:10) at /Users/bigmistqke/Documents/GitHub/solid-native2/node_modules/webpack/lib/HotModuleReplacementPlugin.js:767:18 at Hook.eval [as call] (eval at create (/Users/bigmistqke/Documents/GitHub/solid-native2/node_modules/.pnpm/node_modules/tapable/lib/HookCodeFactory.js:19:10), :102:1) at Hook.CALL_DELEGATE [as _call] (/Users/bigmistqke/Documents/GitHub/solid-native2/node_modules/.pnpm/node_modules/tapable/lib/Hook.js:14:14) at Compiler.newCompilation (/Users/bigmistqke/Documents/GitHub/solid-native2/node_modules/.pnpm/@nativescript+webpack@5.0.16_typescript@5.1.6/node_modules/webpack/lib/Compiler.js:1126:26) at /Users/bigmistqke/Documents/GitHub/solid-native2/node_modules/.pnpm/@nativescript+webpack@5.0.16_typescript@5.1.6/node_modules/webpack/lib/Compiler.js:1170:29 at Hook.eval [as callAsync] (eval at create (/Users/bigmistqke/Documents/GitHub/solid-native2/node_modules/.pnpm/node_modules/tapable/lib/HookCodeFactory.js:33:10), :6:1) at Hook.CALL_ASYNC_DELEGATE [as _callAsync] (/Users/bigmistqke/Documents/GitHub/solid-native2/node_modules/.pnpm/node_modules/tapable/lib/Hook.js:18:14) at Compiler.compile (/Users/bigmistqke/Documents/GitHub/solid-native2/node_modules/.pnpm/@nativescript+webpack@5.0.16_typescript@5.1.6/node_modules/webpack/lib/Compiler.js:1165:28) at /Users/bigmistqke/Documents/GitHub/solid-native2/node_modules/.pnpm/@nativescript+webpack@5.0.16_typescript@5.1.6/node_modules/webpack/lib/Watching.js:218:19 at eval (eval at create 
(/Users/bigmistqke/Documents/GitHub/solid-native2/node_modules/.pnpm/node_modules/tapable/lib/HookCodeFactory.js:33:10), :11:1) at /Users/bigmistqke/Documents/GitHub/solid-native2/node_modules/@nativescript/webpack/dist/plugins/WatchStatePlugin.js:22:13 at Hook.eval [as callAsync] (eval at create (/Users/bigmistqke/Documents/GitHub/solid-native2/node_modules/.pnpm/node_modules/tapable/lib/HookCodeFactory.js:33:10), :7:1) at Hook.CALL_ASYNC_DELEGATE [as _callAsync] (/Users/bigmistqke/Documents/GitHub/solid-native2/node_modules/.pnpm/node_modules/tapable/lib/Hook.js:18:14) at run (/Users/bigmistqke/Documents/GitHub/solid-native2/node_modules/.pnpm/@nativescript+webpack@5.0.16_typescript@5.1.6/node_modules/webpack/lib/Watching.js:172:33) at /Users/bigmistqke/Documents/GitHub/solid-native2/node_modules/.pnpm/@nativescript+webpack@5.0.16_typescript@5.1.6/node_modules/webpack/lib/Watching.js:167:6 Hey, which example app are you trying to run? Here's a stackblitz with everything latest that works for me: https://stackblitz.com/edit/nativescript-solid-lqmrxv?file=webpack.config.js,package.json,app%2Fcomponent.jsx,app%2Fapp.jsx You can also run this locally by just downloading it. for documentation purposes: the issue was using pnpm, with npm install --legacy-peer-deps it works.
gharchive/issue
2023-08-03T05:00:27
2025-04-01T06:45:06.364494
{ "authors": [ "ammarahm-ed", "bigmistqke" ], "repo": "nativescript-community/solid-js", "url": "https://github.com/nativescript-community/solid-js/issues/6", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1965366471
Issue in JPEG image sequences I am seeing some issues while taking JPEG image sequences using the VideoKitRecorder component. The JPEG images have some patches at the bottom of the images (Sample shared below). The mp4 video is coming fine, only the JPEG sequences bear the issue. When tried in a separate UI, the image sequences are not all saved in the target folder when taken using the VideoKitRecordButton prefab. Unity version: 2023.1.15f1 Build platform: Android Color space: Linear I use almost standard settings here except for custom resolution of 1080 x 1920. This has been fixed in the upcoming VideoKit 0.0.18 update. Feel free to test the current VideoKit alpha.
gharchive/issue
2023-10-27T11:55:26
2025-04-01T06:45:06.367094
{ "authors": [ "olokobayusuf", "thesanketkale" ], "repo": "natmlx/videokit", "url": "https://github.com/natmlx/videokit/issues/66", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
223524431
[ADDED] Handler invoked when a new server joins the NATS Cluster If DiscoveredServersCB is set, the client library will invoke this callback when it is notified by the server it connects to that a new server has joined the cluster. This callback is not invoked in the initial Connect(). A ConnHandler is used. The user can invoke nc.DiscoveredServers() to get the list of discovered servers. But in some cases, it is just enough to know that a new server has joined. A use case is in NATS Streaming server with the channels partitioning/sharding feature where a server needs to know if the cluster has changed in which case it should resend its list of channels. Coverage decreased (-0.05%) to 95.078% when pulling 5d5b5ba6a06723e17310f17ffea3155958e97e9c on add_discovered_servers_cb into 9c2ce0b1f855da8981394f9fb2ccca2bbf55e798 on master.
gharchive/pull-request
2017-04-21T23:55:06
2025-04-01T06:45:06.394494
{ "authors": [ "coveralls", "kozlovic" ], "repo": "nats-io/go-nats", "url": "https://github.com/nats-io/go-nats/pull/282", "license": "apache-2.0", "license_type": "permissive", "license_source": "bigquery" }
485355027
Code review Code review

[ ] Problem: hard-coding paths in many locations. The same paths are hard-coded in very many places, independently of each other. (Example: /data/GIS/layers/... recurs constantly in the code.) That is a problem because it ties us to particular locations that can then hardly be changed. Moreover, the code is inconsistent: sometimes it is gis, other times GIS. That works on OSX and Windows but not on Linux (which is case sensitive). This is a problem not only for ourselves, but even more so for potential users (=Wouter), who would have to put their TIF layers in exactly the right (undocumented) place, with the right name, in a specific nesting. They will never manage that. Hard-coded paths and duplication are a symptom of a fragile, hard-to-maintain design: http://wiki.c2.com/?OnceAndOnlyOnce A slightly better option would be for such a path to appear in at most one place, namely inside the code of a configuration object (or other abstraction) that manages paths. The knowledge about the paths is then concentrated in one place and only needs to be changed there. Better still, that knowledge appears in zero places in the code and depends entirely on a configuration file. Here are examples of the problem: https://github.com/naturalis/sdmdl/search?l=Python&q='%2Fdata%2FGIS%2F' https://github.com/naturalis/sdmdl/blob/master/script/python/train.py#L3 Related to this is the problem that the code hard-codes a test for .tif, while in reality the file extension need not be case sensitive (so .TIF is also a valid name) and could even be something else (namely .tiff or .TIFF). Solution: manage paths via a configuration abstraction

[ ] Problem: methods that are too long. Every method (or function) should be about one screenful: ±40 lines. If that seems impossible, the method is trying to do too much and must be split up: http://wiki.c2.com/?LongMethodSmell Solution: split up the methods

[ ] Problem: hard-coded dictionary keys. In a number of places in the code, the column names of the DarwinCore archives are hard-coded, showing a mix of CamelCase and underscore_case. This is a problem because it ties us to very specific (and custom) structures for our data. If Wouter wants to work with the data from observation.org, he will surely have different names for the columns. What needs to happen is an Occurrence class (or something similar) that manages this internally, so that the rest of the code does not have to know the specific details of the data files. If it then turns out that observation.org uses decimal_latitude instead of decimalLatitude, only one place needs to be changed. An additional advantage is that this also rules out typos in the code (for example, accidentally writing data["decima1Latitude"]). Solution: a separate class for DarwinCore records, ideally as a subclass of an existing DarwinCore library

[ ] Problem: code that is not (well) testable. A collection of loose scripts is not the same as a reusable, testable API. It must be possible to verify every piece of the code with a unit test. That is really only possible (or at least far more straightforward) if the code consists of classes that the unit test suite can import, so that the functionality of every method can be tested. The goal is for coverage to end up at or near 100%. Solution: a genuine OO API after all. The goal is thus to go from the current 0% coverage (https://coveralls.io/github/naturalis/trait-geo-diverse-angiosperms?branch=master) to a comfortably sufficient/good score

[ ] Problem: no good documentation of the methods. Ultimately it must be possible to generate good documentation automatically (for example on readthedocs). We will do that through a combination of good docstrings in the code and good [.rst](https://en.wikipedia.org/wiki/ReStructuredText) documentation, starting at /docs/index.rst Example of well-documented code: https://github.com/BelgianBiodiversityPlatform/python-dwca-reader/blob/master/dwca/read.py Automatically generated documentation: https://python-dwca-reader.readthedocs.io/en/latest/api.html Solution: document the code so that the readthedocs badge turns green: https://readthedocs.org/projects/sdmdl/

[ ] Problem: hard-coded 'magic numbers'. Examples: random number seed ("42"), number of pseudo-absences ("2000"), parameters for plots (palette etc.). Solution: these must all be managed via the configuration (with defaults)

[ ] Problem: print() statements for debugging. Examples: https://github.com/naturalis/sdmdl/search?q=print&unscoped_q=print Solution: replace with logging

Suggestions for the API: From the collections of scripts you can see in broad outline what the API should look like. The names I use here serve only as examples:

[ ] config - a class that reads the configuration files (e.g. YAML) and makes the values available to the rest through methods. It is responsible for supplying defaults (so not via magic numbers elsewhere in the code) and for constructing the locations of input and output files.

[ ] model_trainer - a class that drives the training of the model, i.e. fetches the input parameters from the configuration and hands them to keras, then evaluates the progress/success of the training, and can save the model

[ ] model_predictor - given a trained model (possibly read from a file), this class makes sure predictions are produced

[ ] gis_handler - knows everything about reading, cropping, rescaling, plotting, and writing GIS data

[ ] occurrence_handler - knows everything about reading, filtering, converting, and writing DarwinCore(-like) data

Further code review points:

[ ] make all class names consistently CamelCase (because: PEP8)

[ ] remove Helper from all names and remove all verbs from class names, such as Create. In short: not CreatePresenceMapHelper but PresenceMap. Objects are nouns, not verbs - and '...helper' is utterly vague. In fact, so is '...handler'. Here are some tips

[ ] I don't believe for a second that every object needs to carry its own instance of oh, gh, and ch. To me those look like singleton objects that those classes might just as well create internally. Does every object have its own configuration that potentially differs from all other objects? No, because they work together. So they don't need their own instance. So all the constructors should not look the way they do now.

[ ] if multiple objects do end up with the same constructor, with the same arguments, then you should apply inheritance
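The config abstraction proposed above could look roughly like this (a sketch, not the sdmdl implementation; the key names and defaults here are invented for illustration):

```python
from pathlib import Path

class Config:
    """Single owner of all paths and tunable defaults."""

    DEFAULTS = {
        "random_seed": 42,
        "pseudo_absences": 2000,
        "gis_dir": "data/GIS/layers",
        "raster_extensions": (".tif", ".tiff"),
    }

    def __init__(self, root, overrides=None):
        # overrides would normally come from a YAML/JSON file
        self.root = Path(root)
        self.settings = {**self.DEFAULTS, **(overrides or {})}

    def get(self, key):
        return self.settings[key]

    def gis_layers(self):
        """All raster layers, matching extensions case-insensitively
        (so .TIF and .TIFF are accepted too)."""
        exts = self.settings["raster_extensions"]
        base = self.root / self.settings["gis_dir"]
        return sorted(p for p in base.rglob("*")
                      if p.suffix.lower() in exts)
```

Every magic number and every path then has exactly one home, and the rest of the code asks the Config object instead of hard-coding /data/GIS/layers/... or the seed 42 itself.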
gharchive/issue
2019-08-26T17:41:43
2025-04-01T06:45:06.443079
{ "authors": [ "rvosa" ], "repo": "naturalis/sdmdl", "url": "https://github.com/naturalis/sdmdl/issues/11", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2616529239
🛑 BarefootEdu is down In 23d2be1, BarefootEdu (https://barefootedu.com) was down: HTTP code: 0 Response time: 0 ms Resolved: BarefootEdu is back up in 3da419b after 36 minutes.
gharchive/issue
2024-10-27T11:31:17
2025-04-01T06:45:06.460476
{ "authors": [ "arunjose1995" ], "repo": "navadhiti/upptime", "url": "https://github.com/navadhiti/upptime/issues/2077", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1270367892
🛑 Harmony Bot Website is down In 1985c6a, Harmony Bot Website ($HARMONY_WEB) was down: HTTP code: 0 Response time: 0 ms Resolved: Harmony Bot Website is back up in 878a415.
gharchive/issue
2022-06-14T07:20:16
2025-04-01T06:45:06.462836
{ "authors": [ "navaneethkm004" ], "repo": "navaneethkm004/uptime", "url": "https://github.com/navaneethkm004/uptime/issues/10778", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1435377249
🛑 Harmony Bot Website is down In 48ff61d, Harmony Bot Website ($HARMONY_WEB) was down: HTTP code: 0 Response time: 0 ms Resolved: Harmony Bot Website is back up in d21ebc1.
gharchive/issue
2022-11-03T23:43:59
2025-04-01T06:45:06.464971
{ "authors": [ "navaneethkm004" ], "repo": "navaneethkm004/uptime", "url": "https://github.com/navaneethkm004/uptime/issues/13026", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1455993487
🛑 Harmony Bot Website is down In ebd478b, Harmony Bot Website ($HARMONY_WEB) was down: HTTP code: 0 Response time: 0 ms Resolved: Harmony Bot Website is back up in 30c3d5d.
gharchive/issue
2022-11-18T23:46:11
2025-04-01T06:45:06.467133
{ "authors": [ "navaneethkm004" ], "repo": "navaneethkm004/uptime", "url": "https://github.com/navaneethkm004/uptime/issues/13294", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1499146786
🛑 Harmony Bot Website is down In 1101cb6, Harmony Bot Website ($HARMONY_WEB) was down: HTTP code: 0 Response time: 0 ms Resolved: Harmony Bot Website is back up in 84608ab.
gharchive/issue
2022-12-15T21:43:53
2025-04-01T06:45:06.469272
{ "authors": [ "navaneethkm004" ], "repo": "navaneethkm004/uptime", "url": "https://github.com/navaneethkm004/uptime/issues/13895", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
866783933
🛑 Harmony Bot Website is down In cade032, Harmony Bot Website ($HARMONY_WEB) was down: HTTP code: 0 Response time: 0 ms Resolved: Harmony Bot Website is back up in a47e168.
gharchive/issue
2021-04-24T16:25:32
2025-04-01T06:45:06.471641
{ "authors": [ "navaneethkm004" ], "repo": "navaneethkm004/uptime", "url": "https://github.com/navaneethkm004/uptime/issues/1711", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
930967991
🛑 Harmony Bot Website is down In 7d090d2, Harmony Bot Website ($HARMONY_WEB) was down: HTTP code: 0 Response time: 0 ms Resolved: Harmony Bot Website is back up in 42e6275.
gharchive/issue
2021-06-27T16:18:45
2025-04-01T06:45:06.473781
{ "authors": [ "navaneethkm004" ], "repo": "navaneethkm004/uptime", "url": "https://github.com/navaneethkm004/uptime/issues/3565", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
936235243
🛑 Harmony Bot Website is down In 5fa25c4, Harmony Bot Website ($HARMONY_WEB) was down: HTTP code: 0 Response time: 0 ms Resolved: Harmony Bot Website is back up in 9be1717.
gharchive/issue
2021-07-03T11:26:35
2025-04-01T06:45:06.475961
{ "authors": [ "navaneethkm004" ], "repo": "navaneethkm004/uptime", "url": "https://github.com/navaneethkm004/uptime/issues/3674", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1020343906
🛑 Harmony Bot Website is down In b5b4d47, Harmony Bot Website ($HARMONY_WEB) was down: HTTP code: 0 Response time: 0 ms Resolved: Harmony Bot Website is back up in 1c646bf.
gharchive/issue
2021-10-07T18:37:13
2025-04-01T06:45:06.478113
{ "authors": [ "navaneethkm004" ], "repo": "navaneethkm004/uptime", "url": "https://github.com/navaneethkm004/uptime/issues/5703", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1097283280
🛑 Harmony Bot Website is down In db5d160, Harmony Bot Website ($HARMONY_WEB) was down: HTTP code: 0 Response time: 0 ms Resolved: Harmony Bot Website is back up in 7c8a983.
gharchive/issue
2022-01-09T19:39:50
2025-04-01T06:45:06.480257
{ "authors": [ "navaneethkm004" ], "repo": "navaneethkm004/uptime", "url": "https://github.com/navaneethkm004/uptime/issues/7631", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1111872322
🛑 Harmony Bot Website is down In 8b2a044, Harmony Bot Website ($HARMONY_WEB) was down: HTTP code: 0 Response time: 0 ms Resolved: Harmony Bot Website is back up in b88b53f.
gharchive/issue
2022-01-23T13:32:31
2025-04-01T06:45:06.482570
{ "authors": [ "navaneethkm004" ], "repo": "navaneethkm004/uptime", "url": "https://github.com/navaneethkm004/uptime/issues/7897", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
606079721
FEATURE: apply warning violation severity in checkstyle There was a problem where checkstyle was not actually enforced on mvn install. The violation level is currently set to warning, but the check only runs from the error level, so mvn install succeeds even when violation warnings occur. As a result, the current state does not integrate with CI. It seems this setting was dropped during an earlier rebase. I have fixed this so that the check also covers the warning level. @MinWooJin A checkstyle failure is occurring on Travis CI. If you confirm it, I will fix it and commit. @hjyun328 I confirmed the failure. There are cases where a violation error occurs only on Travis even though none occurs locally. I will look into what the problem is. @jhpark816 To remove the violations currently occurring, please merge https://github.com/naver/arcus-java-client/pull/240 first. [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/bulkoperation/BulkSetVariousTypeTest.java:41:5: Definition of 'equals()' without corresponding definition of 'hashCode()'. [EqualsHashCode] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/bulkoperation/BulkDeleteTest.java:81:67: '.' should be on a new line. [SeparatorWrapDot] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/bulkoperation/BulkDeleteTest.java:132:67: '.' should be on a new line. [SeparatorWrapDot] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/bulkoperation/BopPipeUpdateTest.java:1: File does not end with a newline. [NewlineAtEndOfFile] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/LongKeyTest/BaseLongKeyTest.java:1:9: Package name 'net.spy.memcached.LongKeyTest' must match pattern '^[a-z]+(\.[a-z][a-z0-9]*)*$'. [PackageName] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/collection/map/MopGetTest.java:1: File does not end with a newline. [NewlineAtEndOfFile] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/collection/map/MopInsertWhenKeyNotExist.java:1: File does not end with a newline. 
[NewlineAtEndOfFile] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/collection/map/MopInsertWhenKeyNotExist.java:29:59: ',' is not followed by whitespace. [WhitespaceAfter] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/collection/set/SopPipedExistTest.java:52:3: ';' is preceded with whitespace. [NoWhitespaceBefore] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/collection/CollectionMaxElementSize.java:1: File does not end with a newline. [NewlineAtEndOfFile] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/collection/btree/longbkey/BopUpsertTest.java:53:3: ';' is preceded with whitespace. [NoWhitespaceBefore] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/collection/btree/longbkey/BopInsertAndGetWithElementFlagTest.java:55:3: ';' is preceded with whitespace. [NoWhitespaceBefore] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/collection/btree/BopInsertAndGetWithElementFlagTest.java:50:3: ';' is preceded with whitespace. [NoWhitespaceBefore] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/collection/btree/BopMutateTest.java:116:36: Literal Strings should be compared using equals(), not '=='. [StringLiteralEquality] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/collection/btree/BopMutateTest.java:123:37: Literal Strings should be compared using equals(), not '=='. [StringLiteralEquality] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/collection/btree/BopInsertWhenKeyNotExist.java:29:59: ',' is not followed by whitespace. [WhitespaceAfter] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/emptycollection/BTreeDeleteWithFilterTest.java:45:3: ';' is preceded with whitespace. 
[NoWhitespaceBefore] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/emptycollection/VariousTypeTest.java:338:5: Definition of 'equals()' without corresponding definition of 'hashCode()'. [EqualsHashCode] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/emptycollection/BTreeGetWithFilterTest.java:51:3: ';' is preceded with whitespace. [NoWhitespaceBefore] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/emptycollection/ProtocolMapDeleteTest.java:1: File does not end with a newline. [NewlineAtEndOfFile] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/MultibyteKeyTest.java:133:15: Method name 'CASOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [MethodName] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/MultibyteKeyTest.java:142:15: Method name 'ConcatenationOperationTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [MethodName] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/MultibyteKeyTest.java:152:15: Method name 'BTreeFindPositionWithGetOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [MethodName] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/MultibyteKeyTest.java:163:15: Method name 'SetExistOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [MethodName] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/MultibyteKeyTest.java:179:15: Method name 'CollectionMutateOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [MethodName] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/MultibyteKeyTest.java:220:15: Method name 'MutatorOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. 
[MethodName] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/MultibyteKeyTest.java:229:15: Method name 'BTreeSortMergeGetOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [MethodName] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/MultibyteKeyTest.java:259:15: Method name 'BTreeSortMergeGetOperationOldImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [MethodName] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/MultibyteKeyTest.java:285:15: Method name 'CollectionPipedInsertOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [MethodName] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/MultibyteKeyTest.java:353:15: Method name 'DeleteOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [MethodName] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/MultibyteKeyTest.java:362:15: Method name 'ExtendedBTreeGetOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [MethodName] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/MultibyteKeyTest.java:387:15: Method name 'CollectionDeleteOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [MethodName] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/MultibyteKeyTest.java:406:15: Method name 'CollectionBulkInsertOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [MethodName] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/MultibyteKeyTest.java:449:15: Method name 'BTreeGetBulkOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. 
[MethodName] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/MultibyteKeyTest.java:475:15: Method name 'BTreeGetByPositionOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [MethodName] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/MultibyteKeyTest.java:498:15: Method name 'CollectionUpdateOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [MethodName] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/MultibyteKeyTest.java:509:15: Method name 'CollectionPipedUpdateOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [MethodName] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/MultibyteKeyTest.java:536:15: Method name 'CollectionCreateOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [MethodName] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/MultibyteKeyTest.java:547:15: Method name 'CollectionCountOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [MethodName] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/MultibyteKeyTest.java:558:15: Method name 'CollectionPipedExistOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [MethodName] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/MultibyteKeyTest.java:585:15: Method name 'BTreeInsertAndGetOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [MethodName] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/MultibyteKeyTest.java:609:15: Method name 'SetAttrOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. 
[MethodName] [WARN] /home/travis/build/naver/arcus-java-client/src/test/manual/net/spy/memcached/MultibyteKeyTest.java:618:15: Method name 'BTreeFindPositionOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [MethodName] Audit done. [WARNING] src/test/manual/net/spy/memcached/bulkoperation/BulkSetVariousTypeTest.java:[41,5] (coding) EqualsHashCode: Definition of 'equals()' without corresponding definition of 'hashCode()'. [WARNING] src/test/manual/net/spy/memcached/bulkoperation/BulkDeleteTest.java:[81,67] (extension) SeparatorWrapDot: '.' should be on a new line. [WARNING] src/test/manual/net/spy/memcached/bulkoperation/BulkDeleteTest.java:[132,67] (extension) SeparatorWrapDot: '.' should be on a new line. [WARNING] src/test/manual/net/spy/memcached/bulkoperation/BopPipeUpdateTest.java:[1] (misc) NewlineAtEndOfFile: File does not end with a newline. [WARNING] src/test/manual/net/spy/memcached/LongKeyTest/BaseLongKeyTest.java:[1,9] (naming) PackageName: Package name 'net.spy.memcached.LongKeyTest' must match pattern '^[a-z]+(\.[a-z][a-z0-9]*)*$'. [WARNING] src/test/manual/net/spy/memcached/collection/map/MopGetTest.java:[1] (misc) NewlineAtEndOfFile: File does not end with a newline. [WARNING] src/test/manual/net/spy/memcached/collection/map/MopInsertWhenKeyNotExist.java:[1] (misc) NewlineAtEndOfFile: File does not end with a newline. [WARNING] src/test/manual/net/spy/memcached/collection/map/MopInsertWhenKeyNotExist.java:[29,59] (whitespace) WhitespaceAfter: ',' is not followed by whitespace. [WARNING] src/test/manual/net/spy/memcached/collection/set/SopPipedExistTest.java:[52,3] (whitespace) NoWhitespaceBefore: ';' is preceded with whitespace. [WARNING] src/test/manual/net/spy/memcached/collection/CollectionMaxElementSize.java:[1] (misc) NewlineAtEndOfFile: File does not end with a newline. 
[WARNING] src/test/manual/net/spy/memcached/collection/btree/longbkey/BopUpsertTest.java:[53,3] (whitespace) NoWhitespaceBefore: ';' is preceded with whitespace. [WARNING] src/test/manual/net/spy/memcached/collection/btree/longbkey/BopInsertAndGetWithElementFlagTest.java:[55,3] (whitespace) NoWhitespaceBefore: ';' is preceded with whitespace. [WARNING] src/test/manual/net/spy/memcached/collection/btree/BopInsertAndGetWithElementFlagTest.java:[50,3] (whitespace) NoWhitespaceBefore: ';' is preceded with whitespace. [WARNING] src/test/manual/net/spy/memcached/collection/btree/BopMutateTest.java:[116,36] (coding) StringLiteralEquality: Literal Strings should be compared using equals(), not '=='. [WARNING] src/test/manual/net/spy/memcached/collection/btree/BopMutateTest.java:[123,37] (coding) StringLiteralEquality: Literal Strings should be compared using equals(), not '=='. [WARNING] src/test/manual/net/spy/memcached/collection/btree/BopInsertWhenKeyNotExist.java:[29,59] (whitespace) WhitespaceAfter: ',' is not followed by whitespace. [WARNING] src/test/manual/net/spy/memcached/emptycollection/BTreeDeleteWithFilterTest.java:[45,3] (whitespace) NoWhitespaceBefore: ';' is preceded with whitespace. [WARNING] src/test/manual/net/spy/memcached/emptycollection/VariousTypeTest.java:[338,5] (coding) EqualsHashCode: Definition of 'equals()' without corresponding definition of 'hashCode()'. [WARNING] src/test/manual/net/spy/memcached/emptycollection/BTreeGetWithFilterTest.java:[51,3] (whitespace) NoWhitespaceBefore: ';' is preceded with whitespace. [WARNING] src/test/manual/net/spy/memcached/emptycollection/ProtocolMapDeleteTest.java:[1] (misc) NewlineAtEndOfFile: File does not end with a newline. [WARNING] src/test/manual/net/spy/memcached/MultibyteKeyTest.java:[133,15] (naming) MethodName: Method name 'CASOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. 
[WARNING] src/test/manual/net/spy/memcached/MultibyteKeyTest.java:[142,15] (naming) MethodName: Method name 'ConcatenationOperationTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [WARNING] src/test/manual/net/spy/memcached/MultibyteKeyTest.java:[152,15] (naming) MethodName: Method name 'BTreeFindPositionWithGetOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [WARNING] src/test/manual/net/spy/memcached/MultibyteKeyTest.java:[163,15] (naming) MethodName: Method name 'SetExistOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [WARNING] src/test/manual/net/spy/memcached/MultibyteKeyTest.java:[179,15] (naming) MethodName: Method name 'CollectionMutateOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [WARNING] src/test/manual/net/spy/memcached/MultibyteKeyTest.java:[220,15] (naming) MethodName: Method name 'MutatorOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [WARNING] src/test/manual/net/spy/memcached/MultibyteKeyTest.java:[229,15] (naming) MethodName: Method name 'BTreeSortMergeGetOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [WARNING] src/test/manual/net/spy/memcached/MultibyteKeyTest.java:[259,15] (naming) MethodName: Method name 'BTreeSortMergeGetOperationOldImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [WARNING] src/test/manual/net/spy/memcached/MultibyteKeyTest.java:[285,15] (naming) MethodName: Method name 'CollectionPipedInsertOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [WARNING] src/test/manual/net/spy/memcached/MultibyteKeyTest.java:[353,15] (naming) MethodName: Method name 'DeleteOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [WARNING] src/test/manual/net/spy/memcached/MultibyteKeyTest.java:[362,15] (naming) MethodName: Method name 'ExtendedBTreeGetOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. 
[WARNING] src/test/manual/net/spy/memcached/MultibyteKeyTest.java:[387,15] (naming) MethodName: Method name 'CollectionDeleteOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [WARNING] src/test/manual/net/spy/memcached/MultibyteKeyTest.java:[406,15] (naming) MethodName: Method name 'CollectionBulkInsertOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [WARNING] src/test/manual/net/spy/memcached/MultibyteKeyTest.java:[449,15] (naming) MethodName: Method name 'BTreeGetBulkOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [WARNING] src/test/manual/net/spy/memcached/MultibyteKeyTest.java:[475,15] (naming) MethodName: Method name 'BTreeGetByPositionOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [WARNING] src/test/manual/net/spy/memcached/MultibyteKeyTest.java:[498,15] (naming) MethodName: Method name 'CollectionUpdateOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [WARNING] src/test/manual/net/spy/memcached/MultibyteKeyTest.java:[509,15] (naming) MethodName: Method name 'CollectionPipedUpdateOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [WARNING] src/test/manual/net/spy/memcached/MultibyteKeyTest.java:[536,15] (naming) MethodName: Method name 'CollectionCreateOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [WARNING] src/test/manual/net/spy/memcached/MultibyteKeyTest.java:[547,15] (naming) MethodName: Method name 'CollectionCountOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [WARNING] src/test/manual/net/spy/memcached/MultibyteKeyTest.java:[558,15] (naming) MethodName: Method name 'CollectionPipedExistOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [WARNING] src/test/manual/net/spy/memcached/MultibyteKeyTest.java:[585,15] (naming) MethodName: Method name 'BTreeInsertAndGetOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. 
[WARNING] src/test/manual/net/spy/memcached/MultibyteKeyTest.java:[609,15] (naming) MethodName: Method name 'SetAttrOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. [WARNING] src/test/manual/net/spy/memcached/MultibyteKeyTest.java:[618,15] (naming) MethodName: Method name 'BTreeFindPositionOperationImplTest' must match pattern '^[a-z][a-z0-9][a-zA-Z0-9_]*$'. @jhpark816 Rebase complete. The checkstyle tests now pass properly.
gharchive/pull-request
2020-04-24T06:40:58
2025-04-01T06:45:06.490355
{ "authors": [ "MinWooJin", "hjyun328" ], "repo": "naver/arcus-java-client", "url": "https://github.com/naver/arcus-java-client/pull/239", "license": "apache-2.0", "license_type": "permissive", "license_source": "bigquery" }
544930403
PHP Dev App With Pinpoint APM We have prepared the entire code and installed the PHP agent outside of Docker, and also configured the collector config file with the collector IP and port for the application as required, but the app still does not show any kind of PHP app tracking. How can the PHP app be tracked? Hello, @anilvaja I think this issue should be posted at https://github.com/naver/pinpoint-c-agent, which is the PHP-agent repository
gharchive/issue
2020-01-03T10:13:03
2025-04-01T06:45:06.507890
{ "authors": [ "RoySRose", "anilvaja" ], "repo": "naver/pinpoint", "url": "https://github.com/naver/pinpoint/issues/6384", "license": "apache-2.0", "license_type": "permissive", "license_source": "bigquery" }
682315654
pinpoint 2.0.3: The sampling rate was found to be incorrect after my tests. Is it a configuration error on my side? Can anyone tell me why? Thanks!!! When I set profiler.sampling.enable=true and profiler.sampling.rate=1 or 5 or 10 or 20, ten transactions were sent and only one transaction was sampled. profiler.sampling.rate: 1 = 100%, 2 = 50%, 4 = 25%, 10 = 10%, 20 = 5%. Change the "sampling.rate" in the two configuration files respectively and recompile. The two configuration files are "pinpoint.config" in "profiles\release" and "profiles\local": https://github.com/naver/pinpoint/blob/master/agent/src/main/resources/profiles/local/pinpoint.config#L60 https://github.com/naver/pinpoint/blob/master/agent/src/main/resources/profiles/release/pinpoint.config#L60 Now the configuration works. And I have another question: the server map is App1 -> App2 -> App3. If I set App1's profiler.sampling.rate=1 and set App2's profiler.sampling.rate=1 or 20, my test result is that App2's sampling rate is the same. hi @caicha According to the transaction tracking policy, if there is a trace that is already being traced, the request will be traced. For that reason, if the front application's sampling policy is 100%, then all applications that receive requests from it will also track the request. thanks :) Thank you very much!
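The rate semantics discussed above (rate=1 → 100%, rate=10 → 10%, rate=20 → 5%) can be illustrated with a simple counting sampler. This is only a sketch of the arithmetic; Pinpoint's actual sampler implementation may differ.

```python
from itertools import count

def make_sampler(rate):
    """Return a function that samples every `rate`-th transaction:
    rate=1 -> 100%, rate=10 -> 10%, rate=20 -> 5%."""
    counter = count()
    def is_sampled():
        # Sample whenever the running transaction counter hits a multiple of rate.
        return next(counter) % rate == 0
    return is_sampled

sampler = make_sampler(20)
sampled = sum(sampler() for _ in range(100))  # 5 of 100 transactions sampled
```

Note that this only covers the per-application decision; as the answer above explains, a request that is already part of a traced transaction is traced regardless of the downstream application's rate.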
gharchive/issue
2020-08-20T02:28:30
2025-04-01T06:45:06.514660
{ "authors": [ "Zoey-dot", "caicha", "koo-taejin" ], "repo": "naver/pinpoint", "url": "https://github.com/naver/pinpoint/issues/7141", "license": "apache-2.0", "license_type": "permissive", "license_source": "bigquery" }
1600940830
Feedback In case-handling environments with a tight UI, HelpText is very widespread, so I thought I'd be a bit proactive here. On Aksel it says: "HelpText is a component that will be phased out over time. We recommend using ReadMore instead, or linking to another page." I don't quite see why there is a desire to phase out a component that is used this much when no upgraded version or suitable alternative exists. Linking to another page is almost always a hard break with the case flow. ReadMore does not replace HelpText in any way and serves entirely different usage contexts. It is one of the components where we see the greatest chance of incorrect use. That is why we are so direct about recommending another component. We could change the text so it specifies that it is at least being considered for a move to internal surfaces? @sjur-gr Not quite sure what you mean by incorrect use, but I remember from my time in the DS that we wanted to convey that HelpText should not become a "catch-all component" for every kind of information. But here I think the DS should simply write a usage recommendation, as with other components, and it should be the various teams' need to use HelpText that trumps. When I observe it in the context of Pesys, it is used correctly and in a solution-oriented way, and that is my impression of it when I see it in other case-handling systems as well. There is no real replacement; I think it fills a function and reduces the risk of people building other, less fortunate improvised solutions. Yes, we can change the text here. I don't think we will get to an adjustment or an alternative any time soon. There are surely good arguments and situations where it is perfectly fine to use this on open/logged-in surfaces as well. 
The docs are quite clear about the amount of content. We can highlight more things that should be part of the assessment of whether it should be used, e.g. little control over placement and closing, and challenges with scrolling on mobile if the popover becomes large. Thinking out loud, a modal or a dynamic side panel could be alternatives that give the user and us more control. The documentation has been changed. The component is now described as it is, and we elaborate on what to watch out for when using the component.
gharchive/issue
2023-02-27T10:56:17
2025-04-01T06:45:06.519465
{ "authors": [ "Trent-NAV", "olejorgenbakken", "sjur-gr" ], "repo": "navikt/aksel", "url": "https://github.com/navikt/aksel/issues/1824", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1970700444
TFP-5463 first version of infotrygd search The link will be 'infotrygd-søk'. Takes an input of 11 or 13 characters and returns a list of grunnlag. Will get the PL to test it with swagger before we land the DTOs. The Grunnlag class lives in fp-felles. @mrsladek will you help get fp-infotrygd-foreldrepenger and fp-infotrygd-svangerskapspenger up in prod and dev? Using AzureCC and leaning on ABAC on the search endpoint We probably need to extend fp-infotrygd with AzureCC support, otherwise it won't work. We probably need to extend fp-infotrygd with AzureCC support, otherwise it won't work. We use AzureCC against infotrygd-sykepenger-fp.infotrygd and k9-infotrygd-grunnlag-paaroerende-sykdom.k9saksbehandling - can we borrow some setup from them? I'd rather not reintroduce STS :-) Yes, I remember that I worked on introducing azure there as well. But I'd like to make it a bit more similar to all our other services. Right now it's a bit cowboy style there. And as those services currently stand, they allow no communication with either azure or sts since nothing is whitelisted. @mrsladek shall we try to get the services up with incoming AzureCC first? Then we can work in parallel on a) studying data to create the DTOs, b) working on the GUI, c) improving the infotrygd apps.
gharchive/pull-request
2023-10-31T15:15:19
2025-04-01T06:45:06.523245
{ "authors": [ "jolarsen", "mrsladek" ], "repo": "navikt/fp-sak", "url": "https://github.com/navikt/fp-sak/pull/6057", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1872934824
🛑 ISTAT is down In ce49cd7, ISTAT (https://esploradati.istat.it/SDMXWS/rest/dataflow/all/all/latest) was down: HTTP code: 0 Response time: 0 ms Resolved: ISTAT is back up in 3c8c13e after 4 minutes.
gharchive/issue
2023-08-30T05:32:45
2025-04-01T06:45:06.541327
{ "authors": [ "charphi" ], "repo": "nbbrd/sdmx-upptime", "url": "https://github.com/nbbrd/sdmx-upptime/issues/14442", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1904401291
🛑 ESTAT is down In c2d8d78, ESTAT (https://ec.europa.eu/eurostat/api/dissemination/sdmx/2.1/dataflow/ESTAT/all/latest) was down: HTTP code: 0 Response time: 0 ms Resolved: ESTAT is back up in 73e329e after 30 minutes.
gharchive/issue
2023-09-20T07:57:05
2025-04-01T06:45:06.543955
{ "authors": [ "charphi" ], "repo": "nbbrd/sdmx-upptime", "url": "https://github.com/nbbrd/sdmx-upptime/issues/15853", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1921103172
🛑 ESTAT is down In 2a52b07, ESTAT (https://ec.europa.eu/eurostat/api/dissemination/sdmx/2.1/dataflow/ESTAT/all/latest) was down: HTTP code: 0 Response time: 0 ms Resolved: ESTAT is back up in 8cfaf99 after 29 minutes.
gharchive/issue
2023-10-02T01:37:42
2025-04-01T06:45:06.546396
{ "authors": [ "charphi" ], "repo": "nbbrd/sdmx-upptime", "url": "https://github.com/nbbrd/sdmx-upptime/issues/16630", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1951139450
🛑 ESTAT is down In d15a6bc, ESTAT (https://ec.europa.eu/eurostat/api/dissemination/sdmx/2.1/dataflow/ESTAT/all/latest) was down: HTTP code: 0 Response time: 0 ms Resolved: ESTAT is back up in 05a650f after 42 minutes.
gharchive/issue
2023-10-19T03:44:18
2025-04-01T06:45:06.548871
{ "authors": [ "charphi" ], "repo": "nbbrd/sdmx-upptime", "url": "https://github.com/nbbrd/sdmx-upptime/issues/17609", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1962674394
🛑 ISTAT is down In b1ff7c9, ISTAT (https://esploradati.istat.it/SDMXWS/rest/dataflow/all/all/latest) was down: HTTP code: 0 Response time: 0 ms Resolved: ISTAT is back up in f22cd3a after 9 minutes.
gharchive/issue
2023-10-26T04:19:59
2025-04-01T06:45:06.551232
{ "authors": [ "charphi" ], "repo": "nbbrd/sdmx-upptime", "url": "https://github.com/nbbrd/sdmx-upptime/issues/18047", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1288390445
🛑 ESCAP is down In 2b7c274, ESCAP (https://api-dataexplorer.unescap.org/rest/dataflow/all/all/latest) was down: HTTP code: 0 Response time: 0 ms Resolved: ESCAP is back up in cd5fec1.
gharchive/issue
2022-06-29T08:57:53
2025-04-01T06:45:06.553671
{ "authors": [ "charphi" ], "repo": "nbbrd/sdmx-upptime", "url": "https://github.com/nbbrd/sdmx-upptime/issues/2147", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1301330466
🛑 OECD is down In 9f5f4a4, OECD (https://stats.oecd.org/restsdmx/sdmx.ashx/GetDataStructure/ALL) was down: HTTP code: 0 Response time: 0 ms Resolved: OECD is back up in 570a1cf.
gharchive/issue
2022-07-11T23:00:34
2025-04-01T06:45:06.556304
{ "authors": [ "charphi" ], "repo": "nbbrd/sdmx-upptime", "url": "https://github.com/nbbrd/sdmx-upptime/issues/2306", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2123581091
🛑 NB is down In 7565df4, NB (https://data.norges-bank.no/api/dataflow/all/all/latest) was down: HTTP code: 0 Response time: 0 ms Resolved: NB is back up in 06483ad after 14 minutes.
gharchive/issue
2024-02-07T17:58:40
2025-04-01T06:45:06.558698
{ "authors": [ "charphi" ], "repo": "nbbrd/sdmx-upptime", "url": "https://github.com/nbbrd/sdmx-upptime/issues/24559", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2129408726
🛑 ISTAT is down In e36d570, ISTAT (https://esploradati.istat.it/SDMXWS/rest/dataflow/all/all/latest) was down: HTTP code: 0 Response time: 0 ms Resolved: ISTAT is back up in 8618159 after 11 minutes.
gharchive/issue
2024-02-12T04:39:24
2025-04-01T06:45:06.561096
{ "authors": [ "charphi" ], "repo": "nbbrd/sdmx-upptime", "url": "https://github.com/nbbrd/sdmx-upptime/issues/24834", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2165622044
🛑 ISTAT is down In 9f4b661, ISTAT (https://esploradati.istat.it/SDMXWS/rest/dataflow/all/all/latest) was down: HTTP code: 0 Response time: 0 ms Resolved: ISTAT is back up in 1dd1d92 after 5 minutes.
gharchive/issue
2024-03-03T23:31:41
2025-04-01T06:45:06.563483
{ "authors": [ "charphi" ], "repo": "nbbrd/sdmx-upptime", "url": "https://github.com/nbbrd/sdmx-upptime/issues/26235", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1336785898
🛑 ESTAT is down In 1bbc583, ESTAT (http://ec.europa.eu/eurostat/SDMX/diss-web/rest/dataflow/ESTAT/all/latest/) was down: HTTP code: 0 Response time: 0 ms Resolved: ESTAT is back up in ea2b781.
gharchive/issue
2022-08-12T05:59:08
2025-04-01T06:45:06.565904
{ "authors": [ "charphi" ], "repo": "nbbrd/sdmx-upptime", "url": "https://github.com/nbbrd/sdmx-upptime/issues/2735", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2202283634
🛑 ISTAT is down In ce778f7, ISTAT (https://esploradati.istat.it/SDMXWS/rest/dataflow/all/all/latest) was down: HTTP code: 0 Response time: 0 ms Resolved: ISTAT is back up in 4dfca7f after 6 minutes.
gharchive/issue
2024-03-22T11:27:59
2025-04-01T06:45:06.568565
{ "authors": [ "charphi" ], "repo": "nbbrd/sdmx-upptime", "url": "https://github.com/nbbrd/sdmx-upptime/issues/27459", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2219767194
🛑 IMF is down In db9554b, IMF (http://dataservices.imf.org/REST/SDMX_XML.svc/Dataflow) was down: HTTP code: 0 Response time: 0 ms Resolved: IMF is back up in c02621e after 8 minutes.
gharchive/issue
2024-04-02T07:29:28
2025-04-01T06:45:06.570937
{ "authors": [ "charphi" ], "repo": "nbbrd/sdmx-upptime", "url": "https://github.com/nbbrd/sdmx-upptime/issues/28106", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2329148280
🛑 ISTAT is down In 8ae8c33, ISTAT (https://esploradati.istat.it/SDMXWS/rest/dataflow/all/all/latest) was down: HTTP code: 0 Response time: 0 ms Resolved: ISTAT is back up in f429168 after 4 minutes.
gharchive/issue
2024-06-01T13:32:01
2025-04-01T06:45:06.573406
{ "authors": [ "charphi" ], "repo": "nbbrd/sdmx-upptime", "url": "https://github.com/nbbrd/sdmx-upptime/issues/31975", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2338012358
🛑 ESTAT is down In 431b7f7, ESTAT (https://ec.europa.eu/eurostat/api/dissemination/sdmx/2.1/dataflow/ESTAT/all/latest) was down: HTTP code: 0 Response time: 0 ms Resolved: ESTAT is back up in 00415b0 after 5 minutes.
gharchive/issue
2024-06-06T11:23:10
2025-04-01T06:45:06.575832
{ "authors": [ "charphi" ], "repo": "nbbrd/sdmx-upptime", "url": "https://github.com/nbbrd/sdmx-upptime/issues/32289", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2554701951
🛑 ISTAT is down In 03de98a, ISTAT (https://esploradati.istat.it/SDMXWS/rest/dataflow/all/all/latest) was down: HTTP code: 0 Response time: 0 ms Resolved: ISTAT is back up in 0e80458 after 6 minutes.
gharchive/issue
2024-09-29T04:39:09
2025-04-01T06:45:06.578264
{ "authors": [ "charphi" ], "repo": "nbbrd/sdmx-upptime", "url": "https://github.com/nbbrd/sdmx-upptime/issues/39032", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
2645391177
🛑 ISTAT is down In bfdb099, ISTAT (https://esploradati.istat.it/SDMXWS/rest/dataflow/all/all/latest) was down: HTTP code: 0 Response time: 0 ms Resolved: ISTAT is back up in 70f14b4 after 6 minutes.
gharchive/issue
2024-11-08T23:47:24
2025-04-01T06:45:06.580822
{ "authors": [ "charphi" ], "repo": "nbbrd/sdmx-upptime", "url": "https://github.com/nbbrd/sdmx-upptime/issues/41490", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1631081107
🛑 ESTAT is down In dd7d7c4, ESTAT (https://ec.europa.eu/eurostat/api/dissemination/sdmx/2.1/dataflow/ESTAT/all/latest) was down: HTTP code: 0 Response time: 0 ms Resolved: ESTAT is back up in 88a2e34.
gharchive/issue
2023-03-19T19:13:47
2025-04-01T06:45:06.583218
{ "authors": [ "charphi" ], "repo": "nbbrd/sdmx-upptime", "url": "https://github.com/nbbrd/sdmx-upptime/issues/6948", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1690960721
🛑 ESTAT is down In b3498ab, ESTAT (https://ec.europa.eu/eurostat/api/dissemination/sdmx/2.1/dataflow/ESTAT/all/latest) was down: HTTP code: 0 Response time: 0 ms Resolved: ESTAT is back up in 4dd097d.
gharchive/issue
2023-05-01T15:28:55
2025-04-01T06:45:06.585666
{ "authors": [ "charphi" ], "repo": "nbbrd/sdmx-upptime", "url": "https://github.com/nbbrd/sdmx-upptime/issues/8656", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1691215756
🛑 NBB is down In 12762f7, NBB (https://stat.nbb.be/restsdmx/sdmx.ashx/GetDataStructure/ALL) was down: HTTP code: 429 Response time: 641 ms Resolved: NBB is back up in b489b0b.
gharchive/issue
2023-05-01T19:07:55
2025-04-01T06:45:06.587965
{ "authors": [ "charphi" ], "repo": "nbbrd/sdmx-upptime", "url": "https://github.com/nbbrd/sdmx-upptime/issues/8661", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1704894586
🛑 ESTAT is down In 5a7aa27, ESTAT (https://ec.europa.eu/eurostat/api/dissemination/sdmx/2.1/dataflow/ESTAT/all/latest) was down: HTTP code: 0 Response time: 0 ms Resolved: ESTAT is back up in c90234a.
gharchive/issue
2023-05-11T02:02:42
2025-04-01T06:45:06.590422
{ "authors": [ "charphi" ], "repo": "nbbrd/sdmx-upptime", "url": "https://github.com/nbbrd/sdmx-upptime/issues/9022", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1379277429
wsld does not return an error if X11 forward fails
$ wsld && echo OK
Failed to listen: X0 is current in use
OK
I have this problem when I restart WSL, and it's cumbersome to have to check the output for an error. Fixed
gharchive/issue
2022-09-20T11:41:45
2025-04-01T06:45:06.592608
{ "authors": [ "lmartelli", "nbdd0121" ], "repo": "nbdd0121/wsld", "url": "https://github.com/nbdd0121/wsld/issues/24", "license": "Apache-2.0", "license_type": "permissive", "license_source": "github-api" }
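The wsld record above hinges on a shell detail: `&&` looks only at the exit status, so a tool that prints an error message but still exits 0 is treated as a success. A small Python sketch (using a stand-in command, not wsld itself) makes the pre-fix behavior concrete:

```python
import subprocess

# Stand-in for the pre-fix behavior (hypothetical command, not wsld): the
# process prints an error to stderr but still exits with status 0, so shell
# chains like `wsld && echo OK` treat it as a success.
proc = subprocess.run(
    ["sh", "-c", "echo 'Failed to listen: X0 is current in use' >&2; exit 0"],
    capture_output=True,
    text=True,
)

# The exit status is all that `&&` sees; the error text on stderr is ignored.
success_for_shell = proc.returncode == 0

print(success_for_shell)        # True, despite the error message
print(repr(proc.stderr))        # the error text that `&&` never looks at
```

The fix on wsld's side is the mirror image: exit non-zero when the X11 forward fails, so the chained `echo OK` no longer runs.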
529230411
how to work with fixed region inpainting hello, how does one work with fixed-region inpainting, such as a logo at a fixed position in the video? I followed your instruction:
CUDA_VISIBLE_DEVICES=0 python tools/video_inpaint.py --frame_dir ./demo/lady-running/frames \
  --MASK_ROOT ./demo/lady-running/mask_bbox.png \
  --img_size 448 896 --DFC --FlowNet2 --Propagation \
  --PRETRAINED_MODEL_1 ./pretrained_models/resnet50_stage1.pth \
  --PRETRAINED_MODEL_2 ./pretrained_models/DAVIS_model/davis_stage2.pth \
  --PRETRAINED_MODEL_3 ./pretrained_models/DAVIS_model/davis_stage3.pth \
  --MS --th_warp 3 --FIX_MAS
But it still iterates for many minutes. Why is it so slow... I provide several ways for reducing the running time:
- use the single-stage model
- set th_warp much higher, like 10
- decrease the resolution of the input
However, these tricks will harm the final performance. The running speed depends on many things, like CPUs and the memory bus. I'm still working on a more efficient version and welcome any kind of suggestions.
gharchive/issue
2019-11-27T09:44:20
2025-04-01T06:45:06.594957
{ "authors": [ "IvyGongoogle", "nbei" ], "repo": "nbei/Deep-Flow-Guided-Video-Inpainting", "url": "https://github.com/nbei/Deep-Flow-Guided-Video-Inpainting/issues/54", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
1339949578
Unable to install nodejs specific version Hello, thank you for this awesome project! I use my custom Dockerfile and I need a specific version of nodejs to run the nodejs scripts. I tried these methods in the Dockerfile without success.
Method 1 refuses to install nodejs (nodejs-current-17.9.0-r0: breaks: world[nodejs=14.19.0]):
FROM ncarlier/webhookd:1.15.0
RUN apk add --update nodejs=14.19.0
Method 2 installs nvm but node is not available to run the scripts (env: can't execute 'node': No such file or directory):
FROM ncarlier/webhookd:1.15.0
RUN curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.1/install.sh | bash
RUN source $HOME/.nvm/nvm.sh && nvm install 14
I would love to use webhookd in silex v3 :100: Have a nice day
Hello, thank you for your interest. Using NVM with Alpine is a bit complicated. Here is a working example:
FROM ncarlier/webhookd:1.15.0
SHELL ["/bin/bash", "-c"]
RUN apk add --update alpine-sdk gcompat coreutils
RUN curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.1/install.sh | bash
RUN cd $HOME && source $HOME/.nvm/nvm.sh && nvm install 14
RUN cd $HOME && source $HOME/.nvm/nvm.sh && nvm use 14 && node --version
You may have to call nvm in your script to set up the PATH variable. Have a nice day too
Hello again. Thanks so much for the quick answer. It is indeed what I did more or less, and as you said, I have to "call nvm in your script to setup the PATH variable". But then I cannot call nodejs scripts directly, right? You're saying I need to call a shell script which calls my nodejs script, right?
You can call your nodejs script directly as long as the script is executable (chmod +x) and the file header is #!/usr/bin/env node. But in this case, node must be in the PATH. Using nvm, you may have to set the path manually (with the node version). Otherwise you create a script that sources nvm and uses the correct version.
gharchive/issue
2022-08-16T07:46:25
2025-04-01T06:45:06.618217
{ "authors": [ "lexoyo", "ncarlier" ], "repo": "ncarlier/webhookd", "url": "https://github.com/ncarlier/webhookd/issues/62", "license": "mit", "license_type": "permissive", "license_source": "bigquery" }
57503314
Anticheat for second door doesn't seem to work correctly I can still go back the way I came with the second door, forming only half a heart. Huh, didn't even think people'd be looking at this repo. Thanks, it's been fixed! The full minigame is out now - http://ncase.me/door/ - thanks again!
gharchive/issue
2015-02-12T18:57:40
2025-04-01T06:45:06.620486
{ "authors": [ "ncase", "rubenwardy" ], "repo": "ncase/door", "url": "https://github.com/ncase/door/issues/1", "license": "CC0-1.0", "license_type": "permissive", "license_source": "github-api" }
1176835143
fix apply encoder method Previously, in the _apply_encoder method, new labels were recursively added to encoder.classes_ if they had not been encountered before. Now, at the very beginning, we go through the current column and add all the labels that have not been seen before. Can the correctness of the behavior be checked with some test? this But this test doesn't fail even without the fix.
gharchive/pull-request
2022-03-22T14:02:44
2025-04-01T06:45:06.625771
{ "authors": [ "maypink", "nicl-nno" ], "repo": "nccr-itmo/FEDOT", "url": "https://github.com/nccr-itmo/FEDOT/pull/609", "license": "BSD-3-Clause", "license_type": "permissive", "license_source": "github-api" }
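The fix described in the encoder record above can be sketched in a few lines. This is a hypothetical minimal encoder, not FEDOT's actual code (which wraps scikit-learn's LabelEncoder): the point is the single up-front pass that registers every unseen label before encoding, instead of patching classes_ lazily during transform.

```python
# Minimal sketch of the fix described above (hypothetical encoder class,
# not FEDOT's actual implementation): before encoding a column, extend the
# list of known classes with every label that has not been seen before,
# in one pass over the column.

class SimpleLabelEncoder:
    def __init__(self):
        self.classes_ = []  # labels in the order they were first registered

    def _register_unseen(self, column):
        # One pass over the column up front, instead of adding unknown
        # labels one at a time while encoding.
        seen = set(self.classes_)
        for label in column:
            if label not in seen:
                seen.add(label)
                self.classes_.append(label)

    def transform(self, column):
        self._register_unseen(column)
        index = {label: i for i, label in enumerate(self.classes_)}
        return [index[label] for label in column]

enc = SimpleLabelEncoder()
print(enc.transform(["a", "b", "a"]))   # [0, 1, 0]
print(enc.transform(["b", "c"]))        # [1, 2]; "c" was registered up front
```

Previously-seen labels keep their codes across calls, which is why registering unseen labels only appends to classes_ rather than rebuilding it.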
1277067866
Extract fit initial assumption #717 This PR addresses issue https://github.com/nccr-itmo/FEDOT/issues/717 and separates the logic for the initial assumption from the API composer. Hello @valer1435! Thanks for updating this PR. We checked the lines you've touched for PEP 8 issues, and found: In the file fedot/api/api_utils/predefined_model.py: Line 18:1: W293 blank line contains whitespace Line 43:1: W391 blank line at end of file
gharchive/pull-request
2022-06-20T15:07:24
2025-04-01T06:45:06.629173
{ "authors": [ "pep8speaks", "valer1435" ], "repo": "nccr-itmo/FEDOT", "url": "https://github.com/nccr-itmo/FEDOT/pull/736", "license": "BSD-3-Clause", "license_type": "permissive", "license_source": "github-api" }
1065355534
Make mcmc.PMMH work with any Feynman-Kac class By default, PMMH runs the bootstrap filter associated to the considered state-space model. It is possible to use a different algorithm, by setting argument fk_cls to another Feynman-Kac class; however, this works only for FK classes that have the same structure as state_space_models.Bootstrap or state_space_models.GuidedPF (taking as arguments data and ssm, for the state-space model). What if the user wants to specify a FK class with a different structure? Hi, I have a question regarding the PMMH algorithm. I'm mostly interested in filtering/smoothing, so I'm not up to date on every theoretical part of the parameter estimation. I only wanted to try it out to see if I get any useful results from the get-go. My model is a bit more complex but to boil down my problem: When picking the new 'guess' for theta in the step method of the GenericRWHM class (starting in line 231 in mcmc) any parameter is changed by a value determined via a random Gaussian sample. Here it gets problematic in my case: How does this not 'overshoot' in many small scenarios, say your theta is just one parameter and your prior distribution is a uniform distribution on [0.4, 0.45]. In almost every step this value is missed and the step gets a weight of -inf, making it obsolete. I changed the new choice to a random sample from the prior distribution in my case (as a quick fix), but I don't know if that is the right (theoretical) way to choose the next guess for theta. When trying out the example of the documentation (Bayesian Inference, PMMH) this also happens with the parameter 'rho' as it is uniform distributed in [-1,1] (I simply printed out the value for 'rho' and the value self.prop.lpost[0] to see if it is outside / -inf). 
Regards, David
Hi, to fix your problem (proposed values often fall outside the support of the prior distribution), you may: 1) change the value of the parameter rw_cov (default is Identity), which gives the covariance matrix of the random walk proposal. You could take something like tau * Identity, with tau=10^{-1} or 10^{-2}. This will make the steps of the random walk much smaller, so it's more likely you stay in the support of the prior. 2) consider using an unconstrained prior, say a Gaussian with a small variance. Regarding 1, I don't know whether you use the adaptive version (adaptive=True by default) or not. In the adaptive case, the value of rw_cov is used only at the beginning, since the adaptive version learns this covariance matrix sequentially from the chain. Hope this helps. I guess your question shows the documentation is not so clear about these points; I will try to improve it. Closing this, as the initial issue has been addressed (experimental branch, module mcmc now better explains what to do).
gharchive/issue
2021-11-28T15:05:05
2025-04-01T06:45:06.634563
{ "authors": [ "P3ngwings", "nchopin" ], "repo": "nchopin/particles", "url": "https://github.com/nchopin/particles/issues/44", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
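The "overshooting" described in the particles record above is easy to reproduce numerically. The sketch below is illustrative pure Python, not the particles library API: a Gaussian random walk over the narrow uniform support [0.4, 0.45], showing how the fraction of in-support proposals depends on the step size (the role the rw_cov covariance plays in the library).

```python
import math
import random

# Illustrative sketch (not the particles API): a random-walk proposal over
# the narrow uniform prior support [0.4, 0.45] from the discussion above.
# Proposals outside the support get log-prior -inf and are always rejected,
# which is the "overshooting" problem.

LOW, HIGH = 0.4, 0.45

def log_prior(theta):
    # Uniform prior on [LOW, HIGH], up to an additive constant.
    return 0.0 if LOW <= theta <= HIGH else -math.inf

def inside_fraction(step_sd, n=10_000, seed=1):
    """Fraction of random-walk proposals that land inside the support."""
    rng = random.Random(seed)
    theta = 0.425  # start in the middle of the support
    inside = 0
    for _ in range(n):
        prop = theta + rng.gauss(0.0, step_sd)  # random-walk proposal
        if log_prior(prop) > -math.inf:
            inside += 1
            theta = prop  # with a flat prior, every in-support move is kept here
    return inside / n

print(inside_fraction(1.0))    # small fraction: a unit step overshoots the support
print(inside_fraction(0.005))  # large fraction: a step matched to the support width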
361376637
[feature request] Elm support I'd love to try my hand at porting ncm-elm-oracle to ncm2, failing that, do any of you have suggested configs for using elm-vim + elm-oracle to get completion to ncm2? I tested https://github.com/megalithic/ncm2-elm, but I'm not familiar with elm and I cannot get elm-oracle to work. FYI, Here's some modifications I've mode. You could enable logging by export NVIM_PYTHON_LOG_FILE="/tmp/nvim_log" and export NVIM_PYTHON_LOG_LEVEL="DEBUG" before start neovim. I also noticed that elm-oracle doesn't seem to work with elm 0.19.0. diff --git a/autoload/ncm2_elm.vim b/autoload/ncm2_elm.vim index 925b1c7..23feb43 100644 --- a/autoload/ncm2_elm.vim +++ b/autoload/ncm2_elm.vim @@ -19,7 +19,7 @@ let g:ncm2_elm#source = extend( \ 'scope': ['elm'], \ 'word_pattern': ['[\w/]+', '[\w\-]+'], \ 'complete_pattern': ['\.', '::', ':\s*'], - \ 'on_complete': ['ncm2_elm#on_complete', 'ncm2#on_complete#omni', 'elm#Complete'], + \ 'on_complete': 'ncm2_elm#on_complete', \ 'on_warmup': 'ncm2_elm#on_warmup', \ }, 'keep') diff --git a/pythonx/ncm2_elm.py b/pythonx/ncm2_elm.py index fb0c401..a7fc989 100644 --- a/pythonx/ncm2_elm.py +++ b/pythonx/ncm2_elm.py @@ -5,6 +5,8 @@ from ncm2 import Ncm2Source, getLogger, Popen import os import glob import subprocess +from os import path +import json logger = getLogger(__name__) @@ -44,10 +46,7 @@ class Source(Ncm2Source): args = [ 'elm-oracle', - str(lnum), - str(bcol - 1), filepath, - '-', query ] @@ -56,14 +55,12 @@ class Source(Ncm2Source): args=args, stdin=subprocess.PIPE, stdout=subprocess.PIPE, - stderr=subprocess.DEVNULL cwd=proj_dir) result, errs = proc.communicate(src, timeout=30) - result = json.loads(result.decode('utf-8')) - - logger.debug("args: %s, result: [%s]", args, result.decode()) + logger.debug("args: %s, result: [%s]", args, result) + result = json.loads(result.decode('utf-8')) if not result: return
gharchive/issue
2018-09-18T16:03:14
2025-04-01T06:45:06.638008
{ "authors": [ "megalithic", "roxma" ], "repo": "ncm2/ncm2", "url": "https://github.com/ncm2/ncm2/issues/84", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
866967819
Release on F-Droid It would be really nice if you could release the app on F-Droid, where most of the open source Android apps are. It could get the community more involved, as your app would be findable there. Thank you. I am planning to release Cuppa via F-Droid starting with the next version! Submitted: https://gitlab.com/fdroid/rfp/-/issues/1725 Cuppa is available on F-Droid now! https://f-droid.org/en/packages/com.nathanatos.Cuppa/ 😃 Awesome! I wish you many downloads! :D
gharchive/issue
2021-04-25T09:55:49
2025-04-01T06:45:06.676115
{ "authors": [ "aha999", "ncosgray" ], "repo": "ncosgray/cuppa_mobile", "url": "https://github.com/ncosgray/cuppa_mobile/issues/3", "license": "BSD-3-Clause", "license_type": "permissive", "license_source": "github-api" }
213525455
embed rclone ? Hi, I'd like to know how you think a third party should proceed to embed rclone within another Go project. Should I use the internal API, or am I better off parsing the CLI output? Currently I'm working on a small GUI, using rclone under the hood; its server part is written in Go. I dug into the source a bit, but could not find a quick and easy way to reuse rclone as an API. On the other hand, I found I could simply parse the output (with -vv) to get a whole detailed set of information about the progress. So I made a CLI output parser to transform the raw text into JSON objects, and yes, that works, but it has some little drawbacks I'd like to get rid of: I will have to pack the rclone executable into my app (working, just a little weird IMHO), and I will have 1+[all transfers] processes running on the target system (some waste there). So yeah, I'd just like to know if you have any remarks or recommendations. Thanks! I haven't designed the rclone API to be re-used, except from within rclone. It is possible though - I often write little scripts embedding rclone. What I'd like to do is fix #633 - you can see some code there. That would enable you to call rclone from within your binary. With a bit of thought, it would be possible to make an API so you could do what you want... Thanks! I'll give it a check.
gharchive/issue
2017-03-11T12:13:59
2025-04-01T06:45:06.686068
{ "authors": [ "mh-cbon", "ncw" ], "repo": "ncw/rclone", "url": "https://github.com/ncw/rclone/issues/1229", "license": "MIT", "license_type": "permissive", "license_source": "github-api" }
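The "parse the -vv output" approach from the rclone record above can be sketched as follows. This is Python for brevity (the poster's server is in Go), and the stats-line format, regex, and field names below are assumptions for illustration only, not rclone's actual, version-stable output contract.

```python
import json
import re
import subprocess

# Sketch of the "parse rclone's verbose output" approach described above.
# The stats-line format assumed here is illustrative only; real rclone log
# output varies by version, so this regex is an assumption, not a contract.
STATS_RE = re.compile(
    r"Transferred:\s+(?P<done>.+?)\s*/\s*(?P<total>.+?),\s*(?P<percent>\d+)%"
)

def parse_stats_line(line):
    """Turn one (assumed) progress line into a JSON-serialisable dict."""
    m = STATS_RE.search(line)
    if not m:
        return None
    return {"done": m.group("done"),
            "total": m.group("total"),
            "percent": int(m.group("percent"))}

def stream_progress(args):
    """Run rclone as a child process and yield progress dicts as they appear."""
    proc = subprocess.Popen(args, stderr=subprocess.PIPE, text=True)
    for line in proc.stderr:  # -vv log output goes to stderr
        stats = parse_stats_line(line)
        if stats is not None:
            yield stats
    proc.wait()

# Exercise the parser on a made-up line (no rclone binary needed):
print(json.dumps(parse_stats_line("Transferred:   1.2 MBytes / 10 MBytes, 12%")))
```

An embedded API (the direction discussed in #633) would make this scraping unnecessary; parsing human-readable logs like this is inherently brittle across rclone versions.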