Dataset schema (column, dtype, value range or class count):

Unnamed: 0    int64          0 to 832k
id            float64        2.49B to 32.1B
type          stringclasses  1 value
created_at    stringlengths  19 to 19
repo          stringlengths  7 to 112
repo_url      stringlengths  36 to 141
action        stringclasses  3 values
title         stringlengths  1 to 744
labels        stringlengths  4 to 574
body          stringlengths  9 to 211k
index         stringclasses  10 values
text_combine  stringlengths  96 to 211k
label         stringclasses  2 values
text          stringlengths  96 to 188k
binary_label  int64          0 to 1
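A record under this schema can be sketched as a plain Python structure. The dataclass and the label mapping below are illustrative assumptions inferred from the preview rows, not part of the dataset itself:

```python
from dataclasses import dataclass

# Inferred from the example rows: `label` has two classes that appear to map
# onto `binary_label` as non_process -> 0, process -> 1. This mapping is an
# assumption based on the preview, not documented by the dataset.
LABEL_TO_BINARY = {"non_process": 0, "process": 1}

@dataclass
class IssueRecord:
    id: float            # GitHub event id
    type: str            # always "IssuesEvent" (1 class)
    created_at: str      # fixed-width timestamp, "YYYY-MM-DD HH:MM:SS" (19 chars)
    repo: str            # "owner/name"
    repo_url: str        # GitHub API URL for the repo
    action: str          # one of 3 classes, e.g. "opened" / "closed"
    title: str
    labels: str          # space-joined issue labels
    body: str
    label: str           # "process" or "non_process"
    text_combine: str    # appears to be title + " - " + body
    text: str            # lowercased, cleaned version of text_combine

    @property
    def binary_label(self) -> int:
        return LABEL_TO_BINARY[self.label]
```

Instantiating one of the rows below and reading `.binary_label` reproduces the 0/1 column shown in the preview.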
----
Unnamed: 0: 17,684
id: 9,858,878,150
type: IssuesEvent
created_at: 2019-06-20 08:16:14
repo: galaxyproject/galaxy
repo_url: https://api.github.com/repos/galaxyproject/galaxy
action: opened
title: Sharing history is slow with a lot of items
labels: area/performance help wanted kind/enhancement
body: Maybe we could optimise the query. For a history with 70k items, attempting to share (and make all objects accessible) took ~200 seconds. My user gave up initially and I ended up doing it in the database: ``` delete from dataset_permissions where dataset_id in (select dataset_id from history_dataset_association where history_id = 129258) and action = 'access'; ``` which was quite fast. I'm guessing it iterates over all items? Perhaps we could do something more efficient here?
index: True
text_combine: Sharing history is slow with a lot of items - Maybe we could optimise the query. For a history with 70k items, attempting to share (and make all objects accessible) took ~200 seconds. My user gave up initially and I ended up doing it in the database: ``` delete from dataset_permissions where dataset_id in (select dataset_id from history_dataset_association where history_id = 129258) and action = 'access'; ``` which was quite fast. I'm guessing it iterates over all items? Perhaps we could do something more efficient here?
label: non_process
text: sharing history is slow with a lot of items maybe we could optimise the query for a history with items attempting to share and make all objects accessible took seconds my user gave up initially and i ended up doing it in the database delete from dataset permissions where dataset id in select dataset id from history dataset association where history id and action access which was quite fast i m guessing it iterates over all items perhaps we could do something more efficient here
binary_label: 0
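Across the records, the `text` field reads as a mechanically cleaned version of `text_combine`: lowercased, with URLs, code, digits, and most punctuation stripped, and whitespace collapsed. The function below is a rough, hypothetical reconstruction of that cleaning step, not the dataset's actual preprocessing code (which evidently also preserves some non-ASCII characters):

```python
import re

def clean_text(s: str) -> str:
    """Approximate the `text` column: lowercase, drop URLs and
    non-letter characters, collapse whitespace. Illustrative sketch
    only; the real pipeline is not shown in this preview."""
    s = re.sub(r"https?://\S+", " ", s)   # drop bare URLs
    s = s.lower()
    s = re.sub(r"[^a-z\s]", " ", s)       # keep ASCII letters and whitespace only
    return re.sub(r"\s+", " ", s).strip() # collapse runs of whitespace
```

Applied to a title like "Sharing history is slow with a lot of items", this yields "sharing history is slow with a lot of items", matching the start of the record's `text` field.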
----
Unnamed: 0: 19,205
id: 25,338,864,056
type: IssuesEvent
created_at: 2022-11-18 19:26:10
repo: bazelbuild/bazel
repo_url: https://api.github.com/repos/bazelbuild/bazel
action: closed
title: [bazel.build] Problem with /install/ubuntu
labels: type: support / not a bug (process) untriaged team-OSS
body: Trying to install a fresh install on both Debian and Ubuntu I'm unable to install Bazel onto the stock image of a cloud VM from Google. When I run `sudo apt update && sudo apt install bazel` from the instructions I get: Hit:1 http://us-west1.gce.archive.ubuntu.com/ubuntu bionic InRelease Get:2 http://us-west1.gce.archive.ubuntu.com/ubuntu bionic-updates InRelease [88.7 kB] Get:3 http://us-west1.gce.archive.ubuntu.com/ubuntu bionic-backports InRelease [83.3 kB] Hit:4 http://security.ubuntu.com/ubuntu bionic-security InRelease Fetched 172 kB in 0s (499 kB/s) Reading package lists... Done Building dependency tree Reading state information... Done 19 packages can be upgraded. Run 'apt list --upgradable' to see them. Reading package lists... Done Building dependency tree Reading state information... Done E: Unable to locate package bazel
index: 1.0
text_combine: [bazel.build] Problem with /install/ubuntu - Trying to install a fresh install on both Debian and Ubuntu I'm unable to install Bazel onto the stock image of a cloud VM from Google. When I run `sudo apt update && sudo apt install bazel` from the instructions I get: Hit:1 http://us-west1.gce.archive.ubuntu.com/ubuntu bionic InRelease Get:2 http://us-west1.gce.archive.ubuntu.com/ubuntu bionic-updates InRelease [88.7 kB] Get:3 http://us-west1.gce.archive.ubuntu.com/ubuntu bionic-backports InRelease [83.3 kB] Hit:4 http://security.ubuntu.com/ubuntu bionic-security InRelease Fetched 172 kB in 0s (499 kB/s) Reading package lists... Done Building dependency tree Reading state information... Done 19 packages can be upgraded. Run 'apt list --upgradable' to see them. Reading package lists... Done Building dependency tree Reading state information... Done E: Unable to locate package bazel
label: process
text: problem with install ubuntu trying to install a fresh install on both debian and ubuntu i m unable to install bazel onto the stock image of a cloud vm from google when i run sudo apt update sudo apt install bazel from the instructions i get hit bionic inrelease get bionic updates inrelease get bionic backports inrelease hit bionic security inrelease fetched kb in kb s reading package lists done building dependency tree reading state information done packages can be upgraded run apt list upgradable to see them reading package lists done building dependency tree reading state information done e unable to locate package bazel
binary_label: 1
----
Unnamed: 0: 20,828
id: 27,585,433,402
type: IssuesEvent
created_at: 2023-03-08 19:21:51
repo: bazelbuild/bazel
repo_url: https://api.github.com/repos/bazelbuild/bazel
action: opened
title: Release 6.2.0 - June 2022
labels: P1 type: process release team-OSS
body: # Status of Bazel 6.2.0 - Expected release date: 2020-06-01 - [List of release blockers](https://github.com/bazelbuild/bazel/milestone/48) To report a release-blocking bug, please add a comment with the text `@bazel-io flag` to the issue. A release manager will triage it and add it to the milestone. To cherry-pick a mainline commit into 6.2.0, simply send a PR against the `release-6.2.0` branch. Task list: - [x] [Create draft release announcement](https://docs.google.com/document/d/1pu2ARPweOCTxPsRR8snoDtkC9R51XWRyBXeiC6Ql5so/edit?usp=sharing) - [x] Send for review the release announcement PR - [x] Push the release, notify package maintainers - [x] Update the documentation - [x] Update the [release page](https://github.com/bazelbuild/bazel/releases/)
index: 1.0
text_combine: Release 6.2.0 - June 2022 - # Status of Bazel 6.2.0 - Expected release date: 2020-06-01 - [List of release blockers](https://github.com/bazelbuild/bazel/milestone/48) To report a release-blocking bug, please add a comment with the text `@bazel-io flag` to the issue. A release manager will triage it and add it to the milestone. To cherry-pick a mainline commit into 6.2.0, simply send a PR against the `release-6.2.0` branch. Task list: - [x] [Create draft release announcement](https://docs.google.com/document/d/1pu2ARPweOCTxPsRR8snoDtkC9R51XWRyBXeiC6Ql5so/edit?usp=sharing) - [x] Send for review the release announcement PR - [x] Push the release, notify package maintainers - [x] Update the documentation - [x] Update the [release page](https://github.com/bazelbuild/bazel/releases/)
label: process
text: release june status of bazel expected release date to report a release blocking bug please add a comment with the text bazel io flag to the issue a release manager will triage it and add it to the milestone to cherry pick a mainline commit into simply send a pr against the release branch task list send for review the release announcement pr push the release notify package maintainers update the documentation update the
binary_label: 1
----
Unnamed: 0: 601,228
id: 18,391,868,148
type: IssuesEvent
created_at: 2021-10-12 06:56:01
repo: magento/magento2
repo_url: https://api.github.com/repos/magento/magento2
action: opened
title: [Issue] Replace repetitive actions with Action Groups in StorefrontProductNameWithDoubleQuoteTest
labels: Priority: P2
body: This issue is automatically created based on existing pull request: magento/magento2#34256: Replace repetitive actions with Action Groups in StorefrontProductNameWithDoubleQuoteTest --------- <!--- Thank you for contributing to Magento. To help us process this pull request we recommend that you add the following information: - Summary of the pull request, - Issue(s) related to the changes made, - Manual testing scenarios Fields marked with (*) are required. Please don't remove the template. --> <!--- Please provide a general summary of the Pull Request in the Title above --> ### Description (*) <!--- Please provide a description of the changes proposed in the pull request. Letting us know what has changed and why it needed changing will help us validate this pull request. --> ### Related Pull Requests <!-- related pull request placeholder --> ### Fixed Issues (if relevant) Test is refactored according to the best practices followed by MFTF. ### Manual testing scenarios (*) ### Questions or comments <!--- If relevant, here you can ask questions or provide comments on your pull request for the reviewer For example if you need assistance with writing tests or would like some feedback on one of your development ideas --> ### Contribution checklist (*) - [ ] Pull request has a meaningful description of its purpose - [ ] All commits are accompanied by meaningful commit messages - [ ] All new or changed code is covered with unit/integration tests (if applicable) - [ ] README.md files for modified modules are updated and included in the pull request if any [README.md predefined sections](https://github.com/magento/devdocs/wiki/Magento-module-README.md) require an update - [ ] All automated tests passed successfully (all builds are green)
index: 1.0
text_combine: [Issue] Replace repetitive actions with Action Groups in StorefrontProductNameWithDoubleQuoteTest - This issue is automatically created based on existing pull request: magento/magento2#34256: Replace repetitive actions with Action Groups in StorefrontProductNameWithDoubleQuoteTest --------- <!--- Thank you for contributing to Magento. To help us process this pull request we recommend that you add the following information: - Summary of the pull request, - Issue(s) related to the changes made, - Manual testing scenarios Fields marked with (*) are required. Please don't remove the template. --> <!--- Please provide a general summary of the Pull Request in the Title above --> ### Description (*) <!--- Please provide a description of the changes proposed in the pull request. Letting us know what has changed and why it needed changing will help us validate this pull request. --> ### Related Pull Requests <!-- related pull request placeholder --> ### Fixed Issues (if relevant) Test is refactored according to the best practices followed by MFTF. ### Manual testing scenarios (*) ### Questions or comments <!--- If relevant, here you can ask questions or provide comments on your pull request for the reviewer For example if you need assistance with writing tests or would like some feedback on one of your development ideas --> ### Contribution checklist (*) - [ ] Pull request has a meaningful description of its purpose - [ ] All commits are accompanied by meaningful commit messages - [ ] All new or changed code is covered with unit/integration tests (if applicable) - [ ] README.md files for modified modules are updated and included in the pull request if any [README.md predefined sections](https://github.com/magento/devdocs/wiki/Magento-module-README.md) require an update - [ ] All automated tests passed successfully (all builds are green)
label: non_process
text: replace repetitive actions with action groups in storefrontproductnamewithdoublequotetest this issue is automatically created based on existing pull request magento replace repetitive actions with action groups in storefrontproductnamewithdoublequotetest thank you for contributing to magento to help us process this pull request we recommend that you add the following information summary of the pull request issue s related to the changes made manual testing scenarios fields marked with are required please don t remove the template description please provide a description of the changes proposed in the pull request letting us know what has changed and why it needed changing will help us validate this pull request related pull requests fixed issues if relevant test is refactored according to the best practices followed by mftf manual testing scenarios questions or comments if relevant here you can ask questions or provide comments on your pull request for the reviewer for example if you need assistance with writing tests or would like some feedback on one of your development ideas contribution checklist pull request has a meaningful description of its purpose all commits are accompanied by meaningful commit messages all new or changed code is covered with unit integration tests if applicable readme md files for modified modules are updated and included in the pull request if any require an update all automated tests passed successfully all builds are green
binary_label: 0
----
Unnamed: 0: 209,376
id: 16,018,449,748
type: IssuesEvent
created_at: 2021-04-20 19:08:25
repo: totvs/tds-vscode
repo_url: https://api.github.com/repos/totvs/tds-vscode
action: closed
title: Apply Patch ocasiona erro DBGCpyFile error
labels: awaiting user test
body: **Describe the bug** Ao tentar aplicar Patch por via da extensão no VS Code, o erro DBGCpyFile error aparece. Foi testado aplicando o patch 10827288_dlogwmsmsp-11876_12.1.27_tttp120_lg e com o patch P12_SetupRobo_por_lobo_guara **To Reproduce** Steps to reproduce the behavior: 1. Vá na extensão aplicar patch selecionar o patch desejado. 2. Clique em Aply **Expected behavior** A aplicação do patch ser executada e o êxito no terminal do app server **Screenshots** App Server ![image](https://user-images.githubusercontent.com/81650410/113049217-c0675580-9179-11eb-8cbc-294ef5a1c0c7.png) Ao selecionar Patch Info ![image](https://user-images.githubusercontent.com/81650410/113049639-41265180-917a-11eb-90cc-92ac5d533f66.png) **Desktop (please complete the following information):** - OS/Architecture: Windows 10 64 bit **Appserver (please complete the following information):** - Build (with date): Build 7.00.191205P - Feb 20 2020 - 17:32:02 - OS/Architecture: Windows 10 64 bit - Build Version 19.3.0.2
index: 1.0
text_combine: Apply Patch ocasiona erro DBGCpyFile error - **Describe the bug** Ao tentar aplicar Patch por via da extensão no VS Code, o erro DBGCpyFile error aparece. Foi testado aplicando o patch 10827288_dlogwmsmsp-11876_12.1.27_tttp120_lg e com o patch P12_SetupRobo_por_lobo_guara **To Reproduce** Steps to reproduce the behavior: 1. Vá na extensão aplicar patch selecionar o patch desejado. 2. Clique em Aply **Expected behavior** A aplicação do patch ser executada e o êxito no terminal do app server **Screenshots** App Server ![image](https://user-images.githubusercontent.com/81650410/113049217-c0675580-9179-11eb-8cbc-294ef5a1c0c7.png) Ao selecionar Patch Info ![image](https://user-images.githubusercontent.com/81650410/113049639-41265180-917a-11eb-90cc-92ac5d533f66.png) **Desktop (please complete the following information):** - OS/Architecture: Windows 10 64 bit **Appserver (please complete the following information):** - Build (with date): Build 7.00.191205P - Feb 20 2020 - 17:32:02 - OS/Architecture: Windows 10 64 bit - Build Version 19.3.0.2
label: non_process
text: apply patch ocasiona erro dbgcpyfile error describe the bug ao tentar aplicar patch por via da extensão no vs code o erro dbgcpyfile error aparece foi testado aplicando o patch dlogwmsmsp lg e com o patch setuprobo por lobo guara to reproduce steps to reproduce the behavior vá na extensão aplicar patch selecionar o patch desejado clique em aply expected behavior a aplicação do patch ser executada e o êxito no terminal do app server screenshots app server ao selecionar patch info desktop please complete the following information os architecture windows bit appserver please complete the following information build with date build feb os architecture windows bit build version
binary_label: 0
----
Unnamed: 0: 218,626
id: 7,331,906,956
type: IssuesEvent
created_at: 2018-03-05 14:55:37
repo: enviroCar/enviroCar-app
repo_url: https://api.github.com/repos/enviroCar/enviroCar-app
action: closed
title: User Story: app produces correct measurements
labels: 0 - Backlog Priority - 1 - High User Story
body: We need to make sure that the measurements produced by the app and uploaded to the server are correct. There seem to be wrong fuel consumption / CO2 measurements; see: https://github.com/enviroCar/enviroCar-www/issues/41 The process of creating the measurements needs to be understood and well documented (also on website). <!--- @huboard:{"order":1.94580078125} -->
index: 1.0
text_combine: User Story: app produces correct measurements - We need to make sure that the measurements produced by the app and uploaded to the server are correct. There seem to be wrong fuel consumption / CO2 measurements; see: https://github.com/enviroCar/enviroCar-www/issues/41 The process of creating the measurements needs to be understood and well documented (also on website). <!--- @huboard:{"order":1.94580078125} -->
label: non_process
text: user story app produces correct measurements we need to make sure that the measurements produced by the app and uploaded to the server are correct there seem to be wrong fuel consumption measurements see the process of creating the measurements needs to be understood and well documented also on website huboard order
binary_label: 0
----
Unnamed: 0: 3,953
id: 6,892,281,480
type: IssuesEvent
created_at: 2017-11-22 20:20:54
repo: PWRFLcreative/Lightwork-Mapper
repo_url: https://api.github.com/repos/PWRFLcreative/Lightwork-Mapper
action: opened
title: Network Connect button stops working on failed connection
labels: Processing
body: Reviewed code, looks like it disposes of the network object correctly. Need to investigate the case that's breaking it further.
index: 1.0
text_combine: Network Connect button stops working on failed connection - Reviewed code, looks like it disposes of the network object correctly. Need to investigate the case that's breaking it further.
label: process
text: network connect button stops working on failed connection reviewed code looks like it disposes of the network object correctly need to investigate the case that s breaking it further
binary_label: 1
----
Unnamed: 0: 22,094
id: 30,614,491,553
type: IssuesEvent
created_at: 2023-07-24 00:58:01
repo: pytorch/pytorch
repo_url: https://api.github.com/repos/pytorch/pytorch
action: closed
title: DISABLED test_fd_pool (__main__.TestMultiprocessing)
labels: high priority triage review module: multiprocessing module: flaky-tests skipped
body: This test has been determined flaky through reruns in CI and its instances are reported in our flaky_tests table here https://metrics.pytorch.org/d/L0r6ErGnk/github-status?orgId=1&from=1636426818307&to=1639018818307&viewPanel=57. It is our second flakiest test in the last 15 days with 28 failed instances. ``` ====================================================================== FAIL [4.720s]: test_fd_pool (__main__.TestMultiprocessing) ---------------------------------------------------------------------- Traceback (most recent call last): File "test_multiprocessing.py", line 341, in test_fd_pool self._test_pool(repeat=TEST_REPEATS) File "test_multiprocessing.py", line 327, in _test_pool do_test() File "test_multiprocessing.py", line 206, in __exit__ self.test_case.assertFalse(self.has_shm_files()) AssertionError: True is not false ``` Please look at the table for details from the past 30 days such as * number of failed instances * an example url * which platforms it failed on * the number of times it failed on trunk vs on PRs. cc @ezyang @gchanan @zou3519 @bdhirsh @jbschlosser @VitalyFedyunin
index: 1.0
text_combine: DISABLED test_fd_pool (__main__.TestMultiprocessing) - This test has been determined flaky through reruns in CI and its instances are reported in our flaky_tests table here https://metrics.pytorch.org/d/L0r6ErGnk/github-status?orgId=1&from=1636426818307&to=1639018818307&viewPanel=57. It is our second flakiest test in the last 15 days with 28 failed instances. ``` ====================================================================== FAIL [4.720s]: test_fd_pool (__main__.TestMultiprocessing) ---------------------------------------------------------------------- Traceback (most recent call last): File "test_multiprocessing.py", line 341, in test_fd_pool self._test_pool(repeat=TEST_REPEATS) File "test_multiprocessing.py", line 327, in _test_pool do_test() File "test_multiprocessing.py", line 206, in __exit__ self.test_case.assertFalse(self.has_shm_files()) AssertionError: True is not false ``` Please look at the table for details from the past 30 days such as * number of failed instances * an example url * which platforms it failed on * the number of times it failed on trunk vs on PRs. cc @ezyang @gchanan @zou3519 @bdhirsh @jbschlosser @VitalyFedyunin
label: process
text: disabled test fd pool main testmultiprocessing this test has been determined flaky through reruns in ci and its instances are reported in our flaky tests table here it is our second flakiest test in the last days with failed instances fail test fd pool main testmultiprocessing traceback most recent call last file test multiprocessing py line in test fd pool self test pool repeat test repeats file test multiprocessing py line in test pool do test file test multiprocessing py line in exit self test case assertfalse self has shm files assertionerror true is not false please look at the table for details from the past days such as number of failed instances an example url which platforms it failed on the number of times it failed on trunk vs on prs cc ezyang gchanan bdhirsh jbschlosser vitalyfedyunin
binary_label: 1
----
Unnamed: 0: 21,529
id: 29,810,891,808
type: IssuesEvent
created_at: 2023-06-16 14:58:55
repo: bazelbuild/bazel
repo_url: https://api.github.com/repos/bazelbuild/bazel
action: closed
title: Release X.Y.Z - $MONTH $YEAR
labels: P1 type: process release team-OSS
body: # Status of Bazel X.Y.Z - Expected first release candidate date: [date] - Expected release date: [date] - [List of release blockers](link-to-milestone) To report a release-blocking bug, please add a comment with the text `@bazel-io flag` to the issue. A release manager will triage it and add it to the milestone. To cherry-pick a mainline commit into X.Y.Z, simply send a PR against the `release-X.Y.Z` branch. **Task list:** <!-- The first item is only needed for major releases (X.0.0) --> - [ ] Pick release baseline: [link to base commit] - [ ] Create release candidate: X.Y.Zrc1 - [ ] Check downstream projects - [ ] Create [draft release announcement](https://docs.google.com/document/d/1pu2ARPweOCTxPsRR8snoDtkC9R51XWRyBXeiC6Ql5so/edit) <!-- Note that there should be a new Bazel Release Announcement document for every major release. For minor and patch releases, use the latest open doc. --> - [ ] Send the release announcement PR for review: [link to bazel-blog PR] <!-- Only for major releases. --> - [ ] Push the release and notify package maintainers: [link to comment notifying package maintainers] - [ ] Update the documentation - [ ] Push the blog post: [link to blog post] <!-- Only for major releases. --> - [ ] Update the [release page](https://github.com/bazelbuild/bazel/releases/)
index: 1.0
text_combine: Release X.Y.Z - $MONTH $YEAR - # Status of Bazel X.Y.Z - Expected first release candidate date: [date] - Expected release date: [date] - [List of release blockers](link-to-milestone) To report a release-blocking bug, please add a comment with the text `@bazel-io flag` to the issue. A release manager will triage it and add it to the milestone. To cherry-pick a mainline commit into X.Y.Z, simply send a PR against the `release-X.Y.Z` branch. **Task list:** <!-- The first item is only needed for major releases (X.0.0) --> - [ ] Pick release baseline: [link to base commit] - [ ] Create release candidate: X.Y.Zrc1 - [ ] Check downstream projects - [ ] Create [draft release announcement](https://docs.google.com/document/d/1pu2ARPweOCTxPsRR8snoDtkC9R51XWRyBXeiC6Ql5so/edit) <!-- Note that there should be a new Bazel Release Announcement document for every major release. For minor and patch releases, use the latest open doc. --> - [ ] Send the release announcement PR for review: [link to bazel-blog PR] <!-- Only for major releases. --> - [ ] Push the release and notify package maintainers: [link to comment notifying package maintainers] - [ ] Update the documentation - [ ] Push the blog post: [link to blog post] <!-- Only for major releases. --> - [ ] Update the [release page](https://github.com/bazelbuild/bazel/releases/)
label: process
text: release x y z month year status of bazel x y z expected first release candidate date expected release date link to milestone to report a release blocking bug please add a comment with the text bazel io flag to the issue a release manager will triage it and add it to the milestone to cherry pick a mainline commit into x y z simply send a pr against the release x y z branch task list pick release baseline create release candidate x y check downstream projects create send the release announcement pr for review push the release and notify package maintainers update the documentation push the blog post update the
binary_label: 1
----
Unnamed: 0: 3,506
id: 6,559,856,291
type: IssuesEvent
created_at: 2017-09-07 06:52:07
repo: inasafe/inasafe-realtime
repo_url: https://api.github.com/repos/inasafe/inasafe-realtime
action: closed
title: Realtime earthquake translation in Bahasa Indonesia
labels: earthquake feature request in progress realtime processor web page
body: Problem Similar issue with InaSAFE Realtime flood translation (#3291), currently InaSAFE Realtime Earthquake in Bahasa Indonesia is not 100% translated yet. See original ticket at https://github.com/inasafe/inasafe/issues/3714 for further discussion.
index: 1.0
text_combine: Realtime earthquake translation in Bahasa Indonesia - Problem Similar issue with InaSAFE Realtime flood translation (#3291), currently InaSAFE Realtime Earthquake in Bahasa Indonesia is not 100% translated yet. See original ticket at https://github.com/inasafe/inasafe/issues/3714 for further discussion.
label: process
text: realtime earthquake translation in bahasa indonesia problem similar issue with inasafe realtime flood translation currently inasafe realtime earthquake in bahasa indonesia is not translated yet see original ticket at for further discussion
binary_label: 1
----
Unnamed: 0: 519,822
id: 15,057,636,451
type: IssuesEvent
created_at: 2021-02-03 22:00:36
repo: zephyrproject-rtos/zephyr
repo_url: https://api.github.com/repos/zephyrproject-rtos/zephyr
action: closed
title: intel_adsp_cavs15:running tests/kernel/sched/schedule_api failed
labels: bug priority: medium
body: Describe the bug running tests/kernel/sched/schedule_api error, it showed assertion failed. To Reproduce Steps to reproduce the behavior: sanitycheck -W -p intel_adsp_cavs15 --device-testing -T tests/kernel/sched/ --west-flash="/home/ztest/work/zephyrproject/zephyr/boards/xtensa/up_squared_adsp/tools/up_squared_adsp_flash.sh, /home/ztest/work/sof/rimage/keys/otc_private_key.pem" --device-serial-pty="/home/ztest/work/zephyrproject/zephyr/boards/xtensa/intel_adsp_cavs15/tools/dump_trace.py" -x=CONFIG_IPM=y -x=CONFIG_CONSOLE=y -x=CONFIG_LOG_PRINTK=n see error: START - test_slice_reset Assertion failed at WEST_TOPDIR/zephyr/tests/kernel/sched/schedule_api/src/test_sched_timeslice_reset.c:91: thread_time_slice: (t <= expected_slice_max is false) timeslice too big, expected 3840384 got 3840934 Assertion failed at WEST_TOPDIR/zephyr/tests/kernel/sched/schedule_api/src/test_sched_timeslice_reset.c:88: thread_time_slice: (t >= expected_slice_min is false) timeslice too small, expected 3839616 got 15295 FAIL - test_slice_reset Environment (please complete the following information): OS: Fedora28 Toolchain: Zephyr-sdk-0.11.4 Commit ID: 90d06cff
index: 1.0
text_combine: intel_adsp_cavs15:running tests/kernel/sched/schedule_api failed - Describe the bug running tests/kernel/sched/schedule_api error, it showed assertion failed. To Reproduce Steps to reproduce the behavior: sanitycheck -W -p intel_adsp_cavs15 --device-testing -T tests/kernel/sched/ --west-flash="/home/ztest/work/zephyrproject/zephyr/boards/xtensa/up_squared_adsp/tools/up_squared_adsp_flash.sh, /home/ztest/work/sof/rimage/keys/otc_private_key.pem" --device-serial-pty="/home/ztest/work/zephyrproject/zephyr/boards/xtensa/intel_adsp_cavs15/tools/dump_trace.py" -x=CONFIG_IPM=y -x=CONFIG_CONSOLE=y -x=CONFIG_LOG_PRINTK=n see error: START - test_slice_reset Assertion failed at WEST_TOPDIR/zephyr/tests/kernel/sched/schedule_api/src/test_sched_timeslice_reset.c:91: thread_time_slice: (t <= expected_slice_max is false) timeslice too big, expected 3840384 got 3840934 Assertion failed at WEST_TOPDIR/zephyr/tests/kernel/sched/schedule_api/src/test_sched_timeslice_reset.c:88: thread_time_slice: (t >= expected_slice_min is false) timeslice too small, expected 3839616 got 15295 FAIL - test_slice_reset Environment (please complete the following information): OS: Fedora28 Toolchain: Zephyr-sdk-0.11.4 Commit ID: 90d06cff
label: non_process
text: intel adsp running tests kernel sched schedule api failed describe the bug running tests kernel sched schedule api error it showed assertion failed to reproduce steps to reproduce the behavior sanitycheck w p intel adsp device testing t tests kernel sched west flash home ztest work zephyrproject zephyr boards xtensa up squared adsp tools up squared adsp flash sh home ztest work sof rimage keys otc private key pem device serial pty home ztest work zephyrproject zephyr boards xtensa intel adsp tools dump trace py x config ipm y x config console y x config log printk n see error start test slice reset assertion failed at west topdir zephyr tests kernel sched schedule api src test sched timeslice reset c thread time slice t expected slice max is false timeslice too big expected got assertion failed at west topdir zephyr tests kernel sched schedule api src test sched timeslice reset c thread time slice t expected slice min is false timeslice too small expected got fail test slice reset environment please complete the following information os toolchain zephyr sdk commit id
binary_label: 0
----
Unnamed: 0: 405,999
id: 11,885,873,617
type: IssuesEvent
created_at: 2020-03-27 20:34:11
repo: microsoftgraph/microsoft-graph-toolkit
repo_url: https://api.github.com/repos/microsoftgraph/microsoft-graph-toolkit
action: closed
title: Update all components to leverage new features from mgt-flyout
labels: Priority: 0 State: In Review feature-request
body: This issue tracks the progress of updating components to leverage the new updates to mgt-flyout in #351 and #344. The following components need to be updated: - [x] person - [x] login - [x] people picker - [x] tasks - [x] person card Updates include (might not apply to all component): * moving the anchor element inside the flyout so the flyout can properly render from bottom * using light-dismiss logic from the flyout and remove logic from each component * leverage the new `--mgt-flyout-set-width` property to resize the flyout to fir the screen when window is too narrow
index: 1.0
text_combine: Update all components to leverage new features from mgt-flyout - This issue tracks the progress of updating components to leverage the new updates to mgt-flyout in #351 and #344. The following components need to be updated: - [x] person - [x] login - [x] people picker - [x] tasks - [x] person card Updates include (might not apply to all component): * moving the anchor element inside the flyout so the flyout can properly render from bottom * using light-dismiss logic from the flyout and remove logic from each component * leverage the new `--mgt-flyout-set-width` property to resize the flyout to fir the screen when window is too narrow
label: non_process
text: update all components to leverage new features from mgt flyout this issue tracks the progress of updating components to leverage the new updates to mgt flyout in and the following components need to be updated person login people picker tasks person card updates include might not apply to all component moving the anchor element inside the flyout so the flyout can properly render from bottom using light dismiss logic from the flyout and remove logic from each component leverage the new mgt flyout set width property to resize the flyout to fir the screen when window is too narrow
binary_label: 0
----
Unnamed: 0: 8,222
id: 11,410,592,725
type: IssuesEvent
created_at: 2020-02-01 00:03:15
repo: parcel-bundler/parcel
repo_url: https://api.github.com/repos/parcel-bundler/parcel
action: closed
title: Postcss-modules config broken with plugins array syntax
labels: :bug: Bug CSS Preprocessing Stale
body: # 🐛 bug report Postcss config support plugins set as an array (`{plugins: []}`), but doing so prevents being able to set custom `postcss-modules` config. The trouble here is that I use a shared postcss-config file, but even if I didn't, the plugins array syntax is more explicit, not depending on js's unsupported object key order. ## 🤔 Expected Behavior `postcss-modules` config should be detected. ## 😯 Current Behavior `postcss-modules` config is not detected. ## 💁 Possible Solution Either parcel should search the array for module names matching `postcss-modules`, or you should be able to use the top-level modules key to a config object (`{modules: {}, plugins: []}`).
index: 1.0
text_combine: Postcss-modules config broken with plugins array syntax - # 🐛 bug report Postcss config support plugins set as an array (`{plugins: []}`), but doing so prevents being able to set custom `postcss-modules` config. The trouble here is that I use a shared postcss-config file, but even if I didn't, the plugins array syntax is more explicit, not depending on js's unsupported object key order. ## 🤔 Expected Behavior `postcss-modules` config should be detected. ## 😯 Current Behavior `postcss-modules` config is not detected. ## 💁 Possible Solution Either parcel should search the array for module names matching `postcss-modules`, or you should be able to use the top-level modules key to a config object (`{modules: {}, plugins: []}`).
label: process
text: postcss modules config broken with plugins array syntax 🐛 bug report postcss config support plugins set as an array plugins but doing so prevents being able to set custom postcss modules config the trouble here is that i use a shared postcss config file but even if i didn t the plugins array syntax is more explicit not depending on js s unsupported object key order 🤔 expected behavior postcss modules config should be detected 😯 current behavior postcss modules config is not detected 💁 possible solution either parcel should search the array for module names matching postcss modules or you should be able to use the top level modules key to a config object modules plugins
binary_label: 1
424,582
12,313,293,988
IssuesEvent
2020-05-12 15:06:47
mozilla/addons-server
https://api.github.com/repos/mozilla/addons-server
closed
Handle moved/deleted files better during git extraction
component: git priority: p3
We should do something like we did for code-search.
1.0
Handle moved/deleted files better during git extraction - We should do something like we did for code-search.
non_process
handle moved deleted files better during git extraction we should do something like we did for code search
0
3,016
6,023,158,656
IssuesEvent
2017-06-07 23:04:38
dotnet/corefx
https://api.github.com/repos/dotnet/corefx
closed
Avoid HWND functions in UWP Process class
area-System.Diagnostics.Process
HWND related functions are not meaningful in an app. When bringing back API for the Process class, anything that relies on these should throw PNSE with a nice message. ``` EnumWindows GetWindow GetWindowLong GetWindowText GetWindowTextLength GetWindowThreadProcessId IsWindowVisible PostMessage SendMessageTimeout WaitForInputIdle GetKeyState ```
1.0
Avoid HWND functions in UWP Process class - HWND related functions are not meaningful in an app. When bringing back API for the Process class, anything that relies on these should throw PNSE with a nice message. ``` EnumWindows GetWindow GetWindowLong GetWindowText GetWindowTextLength GetWindowThreadProcessId IsWindowVisible PostMessage SendMessageTimeout WaitForInputIdle GetKeyState ```
process
avoid hwnd functions in uwp process class hwnd related functions are not meaningful in an app when bringing back api for the process class anything that relies on these should throw pnse with a nice message enumwindows getwindow getwindowlong getwindowtext getwindowtextlength getwindowthreadprocessid iswindowvisible postmessage sendmessagetimeout waitforinputidle getkeystate
1
334,932
10,147,379,119
IssuesEvent
2019-08-05 10:24:58
ahmedkaludi/accelerated-mobile-pages
https://api.github.com/repos/ahmedkaludi/accelerated-mobile-pages
closed
If title is loading then only its markup should load otherwise not
NEED FAST REVIEW [Priority: HIGH] bug
Help Scout Link: https://secure.helpscout.net/conversation/908191818/74764?folderId=2770545 If the title is loading then its markup (`<h1>` tag) should load, else not. User is waiting for this so need to push it soon.
1.0
If title is loading then only its markup should load otherwise not - Help Scout Link: https://secure.helpscout.net/conversation/908191818/74764?folderId=2770545 If the title is loading then its markup (`<h1>` tag) should load, else not. User is waiting for this so need to push it soon.
non_process
if title is loading then only its markup should load otherwise not help scout link if the title is loading then its markup tag should load else not user is waiting for this so need to push it soon
0
12,744
15,106,934,298
IssuesEvent
2021-02-08 14:52:21
apache/incubator-flagon-useralejs
https://api.github.com/repos/apache/incubator-flagon-useralejs
opened
Update Readme to be consistent with Incubator NPM Release Policies
Process
https://incubator.apache.org/guides/distribution.html#npm our release process is compliant, save the "full disclaimer" at the bottom of ReadME
1.0
Update Readme to be consistent with Incubator NPM Release Policies - https://incubator.apache.org/guides/distribution.html#npm our release process is compliant, save the "full disclaimer" at the bottom of ReadME
process
update readme to be consistent with incubator npm release policies our release process is compliant save the full disclaimer at the bottom of readme
1
4,595
7,433,361,689
IssuesEvent
2018-03-26 07:13:17
pwittchen/ReactiveBus
https://api.github.com/repos/pwittchen/ReactiveBus
closed
release 0.0.5
release process
**Initial release notes**: - Improved builder pattern in the `Event` class - PR #10 **things to do**: - [x] bump library version - [x] publish artifact to sonatype - [x] close and release artifact on sonatype - [x] update changelog after maven sync - [x] update download section after maven sync - [x] publish new release on GitHub
1.0
release 0.0.5 - **Initial release notes**: - Improved builder pattern in the `Event` class - PR #10 **things to do**: - [x] bump library version - [x] publish artifact to sonatype - [x] close and release artifact on sonatype - [x] update changelog after maven sync - [x] update download section after maven sync - [x] publish new release on GitHub
process
release initial release notes improved builder pattern in the event class pr things to do bump library version publish artifact to sonatype close and release artifact on sonatype update changelog after maven sync update download section after maven sync publish new release on github
1
22,246
30,801,443,677
IssuesEvent
2023-08-01 02:00:08
lizhihao6/get-daily-arxiv-noti
https://api.github.com/repos/lizhihao6/get-daily-arxiv-noti
opened
New submissions for Tue, 1 Aug 23
event camera white balance isp compression image signal processing image signal process raw raw image events camera color contrast events AWB
## Keyword: events ### Seeing Behind Dynamic Occlusions with Event Cameras - **Authors:** Rong Zou, Manasi Muglikar, Niko Messikommer, Davide Scaramuzza - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2307.15829 - **Pdf link:** https://arxiv.org/pdf/2307.15829 - **Abstract** Unwanted camera occlusions, such as debris, dust, rain-drops, and snow, can severely degrade the performance of computer-vision systems. Dynamic occlusions are particularly challenging because of the continuously changing pattern. Existing occlusion-removal methods currently use synthetic aperture imaging or image inpainting. However, they face issues with dynamic occlusions as these require multiple viewpoints or user-generated masks to hallucinate the background intensity. We propose a novel approach to reconstruct the background from a single viewpoint in the presence of dynamic occlusions. Our solution relies for the first time on the combination of a traditional camera with an event camera. When an occlusion moves across a background image, it causes intensity changes that trigger events. These events provide additional information on the relative intensity changes between foreground and background at a high temporal resolution, enabling a truer reconstruction of the background content. We present the first large-scale dataset consisting of synchronized images and event sequences to evaluate our approach. We show that our method outperforms image inpainting methods by 3dB in terms of PSNR on our dataset. 
### CMDA: Cross-Modality Domain Adaptation for Nighttime Semantic Segmentation - **Authors:** Ruihao Xia, Chaoqiang Zhao, Meng Zheng, Ziyan Wu, Qiyu Sun, Yang Tang - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2307.15942 - **Pdf link:** https://arxiv.org/pdf/2307.15942 - **Abstract** Most nighttime semantic segmentation studies are based on domain adaptation approaches and image input. However, limited by the low dynamic range of conventional cameras, images fail to capture structural details and boundary information in low-light conditions. Event cameras, as a new form of vision sensors, are complementary to conventional cameras with their high dynamic range. To this end, we propose a novel unsupervised Cross-Modality Domain Adaptation (CMDA) framework to leverage multi-modality (Images and Events) information for nighttime semantic segmentation, with only labels on daytime images. In CMDA, we design the Image Motion-Extractor to extract motion information and the Image Content-Extractor to extract content information from images, in order to bridge the gap between different modalities (Images to Events) and domains (Day to Night). Besides, we introduce the first image-event nighttime semantic segmentation dataset. Extensive experiments on both the public image dataset and the proposed image-event dataset demonstrate the effectiveness of our proposed approach. We open-source our code, models, and dataset at https://github.com/XiaRho/CMDA. 
### Fully $1\times1$ Convolutional Network for Lightweight Image Super-Resolution - **Authors:** Gang Wu, Junjun Jiang, Kui Jiang, Xianming Liu - **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI) - **Arxiv link:** https://arxiv.org/abs/2307.16140 - **Pdf link:** https://arxiv.org/pdf/2307.16140 - **Abstract** Deep models have achieved significant process on single image super-resolution (SISR) tasks, in particular large models with large kernel ($3\times3$ or more). However, the heavy computational footprint of such models prevents their deployment in real-time, resource-constrained environments. Conversely, $1\times1$ convolutions bring substantial computational efficiency, but struggle with aggregating local spatial representations, an essential capability to SISR models. In response to this dichotomy, we propose to harmonize the merits of both $3\times3$ and $1\times1$ kernels, and exploit a great potential for lightweight SISR tasks. Specifically, we propose a simple yet effective fully $1\times1$ convolutional network, named Shift-Conv-based Network (SCNet). By incorporating a parameter-free spatial-shift operation, it equips the fully $1\times1$ convolutional network with powerful representation capability while impressive computational efficiency. Extensive experiments demonstrate that SCNets, despite its fully $1\times1$ convolutional structure, consistently matches or even surpasses the performance of existing lightweight SR models that employ regular convolutions. 
## Keyword: event camera ### Seeing Behind Dynamic Occlusions with Event Cameras - **Authors:** Rong Zou, Manasi Muglikar, Niko Messikommer, Davide Scaramuzza - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2307.15829 - **Pdf link:** https://arxiv.org/pdf/2307.15829 - **Abstract** Unwanted camera occlusions, such as debris, dust, rain-drops, and snow, can severely degrade the performance of computer-vision systems. Dynamic occlusions are particularly challenging because of the continuously changing pattern. Existing occlusion-removal methods currently use synthetic aperture imaging or image inpainting. However, they face issues with dynamic occlusions as these require multiple viewpoints or user-generated masks to hallucinate the background intensity. We propose a novel approach to reconstruct the background from a single viewpoint in the presence of dynamic occlusions. Our solution relies for the first time on the combination of a traditional camera with an event camera. When an occlusion moves across a background image, it causes intensity changes that trigger events. These events provide additional information on the relative intensity changes between foreground and background at a high temporal resolution, enabling a truer reconstruction of the background content. We present the first large-scale dataset consisting of synchronized images and event sequences to evaluate our approach. We show that our method outperforms image inpainting methods by 3dB in terms of PSNR on our dataset. 
### CMDA: Cross-Modality Domain Adaptation for Nighttime Semantic Segmentation - **Authors:** Ruihao Xia, Chaoqiang Zhao, Meng Zheng, Ziyan Wu, Qiyu Sun, Yang Tang - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2307.15942 - **Pdf link:** https://arxiv.org/pdf/2307.15942 - **Abstract** Most nighttime semantic segmentation studies are based on domain adaptation approaches and image input. However, limited by the low dynamic range of conventional cameras, images fail to capture structural details and boundary information in low-light conditions. Event cameras, as a new form of vision sensors, are complementary to conventional cameras with their high dynamic range. To this end, we propose a novel unsupervised Cross-Modality Domain Adaptation (CMDA) framework to leverage multi-modality (Images and Events) information for nighttime semantic segmentation, with only labels on daytime images. In CMDA, we design the Image Motion-Extractor to extract motion information and the Image Content-Extractor to extract content information from images, in order to bridge the gap between different modalities (Images to Events) and domains (Day to Night). Besides, we introduce the first image-event nighttime semantic segmentation dataset. Extensive experiments on both the public image dataset and the proposed image-event dataset demonstrate the effectiveness of our proposed approach. We open-source our code, models, and dataset at https://github.com/XiaRho/CMDA. ## Keyword: events camera There is no result ## Keyword: white balance There is no result ## Keyword: color contrast There is no result ## Keyword: AWB There is no result ## Keyword: ISP ### Implementing Edge Based Object Detection For Microplastic Debris - **Authors:** Amardeep Singh, Prof. Charles Jia, Prof. 
Donald Kirk - **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI) - **Arxiv link:** https://arxiv.org/abs/2307.16289 - **Pdf link:** https://arxiv.org/pdf/2307.16289 - **Abstract** Plastic has imbibed itself as an indispensable part of our day to day activities, becoming a source of problems due to its non-biodegradable nature and cheaper production prices. With these problems, comes the challenge of mitigating and responding to the aftereffects of disposal or the lack of proper disposal which leads to waste concentrating in locations and disturbing ecosystems for both plants and animals. As plastic debris levels continue to rise with the accumulation of waste in garbage patches in landfills and more hazardously in natural water bodies, swift action is necessary to plug or cease this flow. While manual sorting operations and detection can offer a solution, they can be augmented using highly advanced computer imagery linked with robotic appendages for removing wastes. The primary application of focus in this report are the much-discussed Computer Vision and Open Vision which have gained novelty for their light dependence on internet and ability to relay information in remote areas. These applications can be applied to the creation of edge-based mobility devices that can as a counter to the growing problem of plastic debris in oceans and rivers, demanding little connectivity and still offering the same results with reasonably timed maintenance. The principal findings of this project cover the various methods that were tested and deployed to detect waste in images, as well as comparing them against different waste types. The project has been able to produce workable models that can perform on time detection of sampled images using an augmented CNN approach. 
Latter portions of the project have also achieved a better interpretation of the necessary preprocessing steps required to arrive at the best accuracies, including the best hardware for expanding waste detection studies to larger environments. ### DRAW: Defending Camera-shooted RAW against Image Manipulation - **Authors:** Xiaoxiao Hu, Qichao Ying, Zhenxing Qian, Sheng Li, Xinpeng Zhang - **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Multimedia (cs.MM); Image and Video Processing (eess.IV) - **Arxiv link:** https://arxiv.org/abs/2307.16418 - **Pdf link:** https://arxiv.org/pdf/2307.16418 - **Abstract** RAW files are the initial measurement of scene radiance widely used in most cameras, and the ubiquitously-used RGB images are converted from RAW data through Image Signal Processing (ISP) pipelines. Nowadays, digital images are risky of being nefariously manipulated. Inspired by the fact that innate immunity is the first line of body defense, we propose DRAW, a novel scheme of defending images against manipulation by protecting their sources, i.e., camera-shooted RAWs. Specifically, we design a lightweight Multi-frequency Partial Fusion Network (MPF-Net) friendly to devices with limited computing resources by frequency learning and partial feature fusion. It introduces invisible watermarks as protective signal into the RAW data. The protection capability can not only be transferred into the rendered RGB images regardless of the applied ISP pipeline, but also is resilient to post-processing operations such as blurring or compression. Once the image is manipulated, we can accurately identify the forged areas with a localization network. Extensive experiments on several famous RAW datasets, e.g., RAISE, FiveK and SIDD, indicate the effectiveness of our method. We hope that this technique can be used in future cameras as an option for image protection, which could effectively restrict image manipulation at the source. 
### Digging Into Uncertainty-based Pseudo-label for Robust Stereo Matching - **Authors:** Zhelun Shen, Xibin Song, Yuchao Dai, Dingfu Zhou, Zhibo Rao, Liangjun Zhang - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2307.16509 - **Pdf link:** https://arxiv.org/pdf/2307.16509 - **Abstract** Due to the domain differences and unbalanced disparity distribution across multiple datasets, current stereo matching approaches are commonly limited to a specific dataset and generalize poorly to others. Such domain shift issue is usually addressed by substantial adaptation on costly target-domain ground-truth data, which cannot be easily obtained in practical settings. In this paper, we propose to dig into uncertainty estimation for robust stereo matching. Specifically, to balance the disparity distribution, we employ a pixel-level uncertainty estimation to adaptively adjust the next stage disparity searching space, in this way driving the network progressively prune out the space of unlikely correspondences. Then, to solve the limited ground truth data, an uncertainty-based pseudo-label is proposed to adapt the pre-trained model to the new domain, where pixel-level and area-level uncertainty estimation are proposed to filter out the high-uncertainty pixels of predicted disparity maps and generate sparse while reliable pseudo-labels to align the domain gap. Experimentally, our method shows strong cross-domain, adapt, and joint generalization and obtains \textbf{1st} place on the stereo task of Robust Vision Challenge 2020. Additionally, our uncertainty-based pseudo-labels can be extended to train monocular depth estimation networks in an unsupervised way and even achieves comparable performance with the supervised methods. The code will be available at https://github.com/gallenszl/UCFNet. 
### Multi-Spectral Image Stitching via Spatial Graph Reasoning - **Authors:** Zhiying Jiang, Zengxi Zhang, Jinyuan Liu, Xin Fan, Risheng Liu - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2307.16741 - **Pdf link:** https://arxiv.org/pdf/2307.16741 - **Abstract** Multi-spectral image stitching leverages the complementarity between infrared and visible images to generate a robust and reliable wide field-of-view (FOV) scene. The primary challenge of this task is to explore the relations between multi-spectral images for aligning and integrating multi-view scenes. Capitalizing on the strengths of Graph Convolutional Networks (GCNs) in modeling feature relationships, we propose a spatial graph reasoning based multi-spectral image stitching method that effectively distills the deformation and integration of multi-spectral images across different viewpoints. To accomplish this, we embed multi-scale complementary features from the same view position into a set of nodes. The correspondence across different views is learned through powerful dense feature embeddings, where both inter- and intra-correlations are developed to exploit cross-view matching and enhance inner feature disparity. By introducing long-range coherence along spatial and channel dimensions, the complementarity of pixel relations and channel interdependencies aids in the reconstruction of aligned multi-view features, generating informative and reliable wide FOV scenes. Moreover, we release a challenging dataset named ChaMS, comprising both real-world and synthetic sets with significant parallax, providing a new option for comprehensive evaluation. Extensive experiments demonstrate that our method surpasses the state-of-the-arts. 
## Keyword: image signal processing ### DRAW: Defending Camera-shooted RAW against Image Manipulation - **Authors:** Xiaoxiao Hu, Qichao Ying, Zhenxing Qian, Sheng Li, Xinpeng Zhang - **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Multimedia (cs.MM); Image and Video Processing (eess.IV) - **Arxiv link:** https://arxiv.org/abs/2307.16418 - **Pdf link:** https://arxiv.org/pdf/2307.16418 - **Abstract** RAW files are the initial measurement of scene radiance widely used in most cameras, and the ubiquitously-used RGB images are converted from RAW data through Image Signal Processing (ISP) pipelines. Nowadays, digital images are risky of being nefariously manipulated. Inspired by the fact that innate immunity is the first line of body defense, we propose DRAW, a novel scheme of defending images against manipulation by protecting their sources, i.e., camera-shooted RAWs. Specifically, we design a lightweight Multi-frequency Partial Fusion Network (MPF-Net) friendly to devices with limited computing resources by frequency learning and partial feature fusion. It introduces invisible watermarks as protective signal into the RAW data. The protection capability can not only be transferred into the rendered RGB images regardless of the applied ISP pipeline, but also is resilient to post-processing operations such as blurring or compression. Once the image is manipulated, we can accurately identify the forged areas with a localization network. Extensive experiments on several famous RAW datasets, e.g., RAISE, FiveK and SIDD, indicate the effectiveness of our method. We hope that this technique can be used in future cameras as an option for image protection, which could effectively restrict image manipulation at the source. 
## Keyword: image signal process ### DRAW: Defending Camera-shooted RAW against Image Manipulation - **Authors:** Xiaoxiao Hu, Qichao Ying, Zhenxing Qian, Sheng Li, Xinpeng Zhang - **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Multimedia (cs.MM); Image and Video Processing (eess.IV) - **Arxiv link:** https://arxiv.org/abs/2307.16418 - **Pdf link:** https://arxiv.org/pdf/2307.16418 - **Abstract** RAW files are the initial measurement of scene radiance widely used in most cameras, and the ubiquitously-used RGB images are converted from RAW data through Image Signal Processing (ISP) pipelines. Nowadays, digital images are risky of being nefariously manipulated. Inspired by the fact that innate immunity is the first line of body defense, we propose DRAW, a novel scheme of defending images against manipulation by protecting their sources, i.e., camera-shooted RAWs. Specifically, we design a lightweight Multi-frequency Partial Fusion Network (MPF-Net) friendly to devices with limited computing resources by frequency learning and partial feature fusion. It introduces invisible watermarks as protective signal into the RAW data. The protection capability can not only be transferred into the rendered RGB images regardless of the applied ISP pipeline, but also is resilient to post-processing operations such as blurring or compression. Once the image is manipulated, we can accurately identify the forged areas with a localization network. Extensive experiments on several famous RAW datasets, e.g., RAISE, FiveK and SIDD, indicate the effectiveness of our method. We hope that this technique can be used in future cameras as an option for image protection, which could effectively restrict image manipulation at the source. 
## Keyword: compression ### InfoStyler: Disentanglement Information Bottleneck for Artistic Style Transfer - **Authors:** Yueming Lyu, Yue Jiang, Bo Peng, Jing Dong - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2307.16227 - **Pdf link:** https://arxiv.org/pdf/2307.16227 - **Abstract** Artistic style transfer aims to transfer the style of an artwork to a photograph while maintaining its original overall content. Many prior works focus on designing various transfer modules to transfer the style statistics to the content image. Although effective, ignoring the clear disentanglement of the content features and the style features from the first beginning, they have difficulty in balancing between content preservation and style transferring. To tackle this problem, we propose a novel information disentanglement method, named InfoStyler, to capture the minimal sufficient information for both content and style representations from the pre-trained encoding network. InfoStyler formulates the disentanglement representation learning as an information compression problem by eliminating style statistics from the content image and removing the content structure from the style image. Besides, to further facilitate disentanglement learning, a cross-domain Information Bottleneck (IB) learning strategy is proposed by reconstructing the content and style domains. Extensive experiments demonstrate that our InfoStyler can synthesize high-quality stylized images while balancing content structure preservation and style pattern richness. 
### DRAW: Defending Camera-shooted RAW against Image Manipulation - **Authors:** Xiaoxiao Hu, Qichao Ying, Zhenxing Qian, Sheng Li, Xinpeng Zhang - **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Multimedia (cs.MM); Image and Video Processing (eess.IV) - **Arxiv link:** https://arxiv.org/abs/2307.16418 - **Pdf link:** https://arxiv.org/pdf/2307.16418 - **Abstract** RAW files are the initial measurement of scene radiance widely used in most cameras, and the ubiquitously-used RGB images are converted from RAW data through Image Signal Processing (ISP) pipelines. Nowadays, digital images are risky of being nefariously manipulated. Inspired by the fact that innate immunity is the first line of body defense, we propose DRAW, a novel scheme of defending images against manipulation by protecting their sources, i.e., camera-shooted RAWs. Specifically, we design a lightweight Multi-frequency Partial Fusion Network (MPF-Net) friendly to devices with limited computing resources by frequency learning and partial feature fusion. It introduces invisible watermarks as protective signal into the RAW data. The protection capability can not only be transferred into the rendered RGB images regardless of the applied ISP pipeline, but also is resilient to post-processing operations such as blurring or compression. Once the image is manipulated, we can accurately identify the forged areas with a localization network. Extensive experiments on several famous RAW datasets, e.g., RAISE, FiveK and SIDD, indicate the effectiveness of our method. We hope that this technique can be used in future cameras as an option for image protection, which could effectively restrict image manipulation at the source. 
## Keyword: RAW ### DRAW: Defending Camera-shooted RAW against Image Manipulation - **Authors:** Xiaoxiao Hu, Qichao Ying, Zhenxing Qian, Sheng Li, Xinpeng Zhang - **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Multimedia (cs.MM); Image and Video Processing (eess.IV) - **Arxiv link:** https://arxiv.org/abs/2307.16418 - **Pdf link:** https://arxiv.org/pdf/2307.16418 - **Abstract** RAW files are the initial measurement of scene radiance widely used in most cameras, and the ubiquitously-used RGB images are converted from RAW data through Image Signal Processing (ISP) pipelines. Nowadays, digital images are risky of being nefariously manipulated. Inspired by the fact that innate immunity is the first line of body defense, we propose DRAW, a novel scheme of defending images against manipulation by protecting their sources, i.e., camera-shooted RAWs. Specifically, we design a lightweight Multi-frequency Partial Fusion Network (MPF-Net) friendly to devices with limited computing resources by frequency learning and partial feature fusion. It introduces invisible watermarks as protective signal into the RAW data. The protection capability can not only be transferred into the rendered RGB images regardless of the applied ISP pipeline, but also is resilient to post-processing operations such as blurring or compression. Once the image is manipulated, we can accurately identify the forged areas with a localization network. Extensive experiments on several famous RAW datasets, e.g., RAISE, FiveK and SIDD, indicate the effectiveness of our method. We hope that this technique can be used in future cameras as an option for image protection, which could effectively restrict image manipulation at the source. 
# New submissions for Tue, 1 Aug 23

## Keyword: events

### Seeing Behind Dynamic Occlusions with Event Cameras

- **Authors:** Rong Zou, Manasi Muglikar, Niko Messikommer, Davide Scaramuzza
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2307.15829
- **Pdf link:** https://arxiv.org/pdf/2307.15829
- **Abstract** Unwanted camera occlusions, such as debris, dust, rain-drops, and snow, can severely degrade the performance of computer-vision systems. Dynamic occlusions are particularly challenging because of the continuously changing pattern. Existing occlusion-removal methods currently use synthetic aperture imaging or image inpainting. However, they face issues with dynamic occlusions as these require multiple viewpoints or user-generated masks to hallucinate the background intensity. We propose a novel approach to reconstruct the background from a single viewpoint in the presence of dynamic occlusions. Our solution relies for the first time on the combination of a traditional camera with an event camera. When an occlusion moves across a background image, it causes intensity changes that trigger events. These events provide additional information on the relative intensity changes between foreground and background at a high temporal resolution, enabling a truer reconstruction of the background content. We present the first large-scale dataset consisting of synchronized images and event sequences to evaluate our approach. We show that our method outperforms image inpainting methods by 3dB in terms of PSNR on our dataset.
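For readers unfamiliar with event cameras, the "relative intensity changes" this abstract relies on come from the standard event-generation model: a pixel fires a signed event whenever its log-intensity changes by more than a contrast threshold since the last event. The sketch below is textbook background, not the paper's method; the threshold value and single-pixel setting are illustrative assumptions.

```python
import math

# Standard event-generation model (background illustration, not the paper's
# method): a pixel fires an event each time its log-intensity moves by more
# than a contrast threshold. Summing the signed events recovers the
# log-intensity trajectory up to quantization and the starting offset.

def simulate_events(intensities, threshold=0.2):
    """Emit (sample_index, polarity) events for one pixel's intensity samples."""
    events = []
    ref = math.log(intensities[0])
    for i, value in enumerate(intensities[1:], start=1):
        log_i = math.log(value)
        # Fire one event per threshold crossing, updating the reference level.
        while log_i - ref >= threshold:
            events.append((i, +1))
            ref += threshold
        while ref - log_i >= threshold:
            events.append((i, -1))
            ref -= threshold
    return events

def integrate_events(events, start_log, threshold=0.2):
    """Reconstruct the final log-intensity by accumulating signed events."""
    return start_log + threshold * sum(polarity for _, polarity in events)
```

Because events carry only relative changes, the integration needs the starting log-intensity (here supplied by the frame camera in the paper's hybrid setup) to pin down the absolute level.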
### CMDA: Cross-Modality Domain Adaptation for Nighttime Semantic Segmentation

- **Authors:** Ruihao Xia, Chaoqiang Zhao, Meng Zheng, Ziyan Wu, Qiyu Sun, Yang Tang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2307.15942
- **Pdf link:** https://arxiv.org/pdf/2307.15942
- **Abstract** Most nighttime semantic segmentation studies are based on domain adaptation approaches and image input. However, limited by the low dynamic range of conventional cameras, images fail to capture structural details and boundary information in low-light conditions. Event cameras, as a new form of vision sensors, are complementary to conventional cameras with their high dynamic range. To this end, we propose a novel unsupervised Cross-Modality Domain Adaptation (CMDA) framework to leverage multi-modality (Images and Events) information for nighttime semantic segmentation, with only labels on daytime images. In CMDA, we design the Image Motion-Extractor to extract motion information and the Image Content-Extractor to extract content information from images, in order to bridge the gap between different modalities (Images to Events) and domains (Day to Night). Besides, we introduce the first image-event nighttime semantic segmentation dataset. Extensive experiments on both the public image dataset and the proposed image-event dataset demonstrate the effectiveness of our proposed approach. We open-source our code, models, and dataset at https://github.com/XiaRho/CMDA.
### Fully $1\times1$ Convolutional Network for Lightweight Image Super-Resolution

- **Authors:** Gang Wu, Junjun Jiang, Kui Jiang, Xianming Liu
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI)
- **Arxiv link:** https://arxiv.org/abs/2307.16140
- **Pdf link:** https://arxiv.org/pdf/2307.16140
- **Abstract** Deep models have achieved significant progress on single image super-resolution (SISR) tasks, in particular large models with large kernel ($3\times3$ or more). However, the heavy computational footprint of such models prevents their deployment in real-time, resource-constrained environments. Conversely, $1\times1$ convolutions bring substantial computational efficiency, but struggle with aggregating local spatial representations, an essential capability for SISR models. In response to this dichotomy, we propose to harmonize the merits of both $3\times3$ and $1\times1$ kernels, and exploit a great potential for lightweight SISR tasks. Specifically, we propose a simple yet effective fully $1\times1$ convolutional network, named Shift-Conv-based Network (SCNet). By incorporating a parameter-free spatial-shift operation, it equips the fully $1\times1$ convolutional network with powerful representation capability while retaining impressive computational efficiency. Extensive experiments demonstrate that SCNets, despite their fully $1\times1$ convolutional structure, consistently match or even surpass the performance of existing lightweight SR models that employ regular convolutions.
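The parameter-free spatial-shift operation mentioned above is simple enough to sketch. The toy below is our own illustration of the general shift-then-1x1-mix idea (not the SCNet code; the group-of-five shift pattern and zero padding are assumptions): shifting different channel groups by one pixel in different directions lets a subsequent $1\times1$ convolution, which is just a per-pixel linear mix over channels, see a pixel's neighbours.

```python
# Toy spatial-shift + 1x1 convolution on plain nested lists (illustration of
# the general idea, not the SCNet implementation).

SHIFTS = [(0, 0), (0, 1), (0, -1), (1, 0), (-1, 0)]  # identity + 4 directions

def spatial_shift(fmap):
    """fmap: list of C channels, each an HxW grid of floats.
    Channel c is shifted by SHIFTS[c % 5] with zero padding at the borders."""
    h, w = len(fmap[0]), len(fmap[0][0])
    out = []
    for c, channel in enumerate(fmap):
        dy, dx = SHIFTS[c % len(SHIFTS)]
        shifted = [[0.0] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                sy, sx = y - dy, x - dx  # source pixel before the shift
                if 0 <= sy < h and 0 <= sx < w:
                    shifted[y][x] = channel[sy][sx]
        out.append(shifted)
    return out

def conv1x1(fmap, weights):
    """1x1 convolution: weights is [C_out][C_in]; mixes channels per pixel,
    with no spatial extent of its own."""
    h, w = len(fmap[0]), len(fmap[0][0])
    return [[[sum(wt[c] * fmap[c][y][x] for c in range(len(fmap)))
              for x in range(w)] for y in range(h)]
            for wt in weights]
```

Summing five identity-weighted shifted channels of an impulse image produces a cross-shaped response, i.e. the composite behaves like a small spatial kernel even though the learned operation is purely $1\times1$.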
## Keyword: event camera

### Seeing Behind Dynamic Occlusions with Event Cameras

- **Authors:** Rong Zou, Manasi Muglikar, Niko Messikommer, Davide Scaramuzza
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2307.15829
- **Pdf link:** https://arxiv.org/pdf/2307.15829
- **Abstract** Unwanted camera occlusions, such as debris, dust, rain-drops, and snow, can severely degrade the performance of computer-vision systems. Dynamic occlusions are particularly challenging because of the continuously changing pattern. Existing occlusion-removal methods currently use synthetic aperture imaging or image inpainting. However, they face issues with dynamic occlusions as these require multiple viewpoints or user-generated masks to hallucinate the background intensity. We propose a novel approach to reconstruct the background from a single viewpoint in the presence of dynamic occlusions. Our solution relies for the first time on the combination of a traditional camera with an event camera. When an occlusion moves across a background image, it causes intensity changes that trigger events. These events provide additional information on the relative intensity changes between foreground and background at a high temporal resolution, enabling a truer reconstruction of the background content. We present the first large-scale dataset consisting of synchronized images and event sequences to evaluate our approach. We show that our method outperforms image inpainting methods by 3dB in terms of PSNR on our dataset.
### CMDA: Cross-Modality Domain Adaptation for Nighttime Semantic Segmentation

- **Authors:** Ruihao Xia, Chaoqiang Zhao, Meng Zheng, Ziyan Wu, Qiyu Sun, Yang Tang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2307.15942
- **Pdf link:** https://arxiv.org/pdf/2307.15942
- **Abstract** Most nighttime semantic segmentation studies are based on domain adaptation approaches and image input. However, limited by the low dynamic range of conventional cameras, images fail to capture structural details and boundary information in low-light conditions. Event cameras, as a new form of vision sensors, are complementary to conventional cameras with their high dynamic range. To this end, we propose a novel unsupervised Cross-Modality Domain Adaptation (CMDA) framework to leverage multi-modality (Images and Events) information for nighttime semantic segmentation, with only labels on daytime images. In CMDA, we design the Image Motion-Extractor to extract motion information and the Image Content-Extractor to extract content information from images, in order to bridge the gap between different modalities (Images to Events) and domains (Day to Night). Besides, we introduce the first image-event nighttime semantic segmentation dataset. Extensive experiments on both the public image dataset and the proposed image-event dataset demonstrate the effectiveness of our proposed approach. We open-source our code, models, and dataset at https://github.com/XiaRho/CMDA.

## Keyword: events camera

There is no result

## Keyword: white balance

There is no result

## Keyword: color contrast

There is no result

## Keyword: AWB

There is no result

## Keyword: ISP

### Implementing Edge Based Object Detection For Microplastic Debris

- **Authors:** Amardeep Singh, Prof. Charles Jia, Prof. Donald Kirk
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Artificial Intelligence (cs.AI)
- **Arxiv link:** https://arxiv.org/abs/2307.16289
- **Pdf link:** https://arxiv.org/pdf/2307.16289
- **Abstract** Plastic has imbibed itself as an indispensable part of our day to day activities, becoming a source of problems due to its non-biodegradable nature and cheaper production prices. With these problems comes the challenge of mitigating and responding to the aftereffects of disposal, or the lack of proper disposal, which leads to waste concentrating in locations and disturbing ecosystems for both plants and animals. As plastic debris levels continue to rise with the accumulation of waste in garbage patches in landfills and, more hazardously, in natural water bodies, swift action is necessary to plug or cease this flow. While manual sorting operations and detection can offer a solution, they can be augmented using highly advanced computer imagery linked with robotic appendages for removing wastes. The primary applications of focus in this report are the much-discussed Computer Vision and Open Vision, which have gained novelty for their light dependence on internet and ability to relay information in remote areas. These applications can be applied to the creation of edge-based mobility devices that can act as a counter to the growing problem of plastic debris in oceans and rivers, demanding little connectivity and still offering the same results with reasonably timed maintenance. The principal findings of this project cover the various methods that were tested and deployed to detect waste in images, as well as comparing them against different waste types. The project has been able to produce workable models that can perform on-time detection of sampled images using an augmented CNN approach. Latter portions of the project have also achieved a better interpretation of the necessary preprocessing steps required to arrive at the best accuracies, including the best hardware for expanding waste detection studies to larger environments.

### DRAW: Defending Camera-shooted RAW against Image Manipulation

- **Authors:** Xiaoxiao Hu, Qichao Ying, Zhenxing Qian, Sheng Li, Xinpeng Zhang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Multimedia (cs.MM); Image and Video Processing (eess.IV)
- **Arxiv link:** https://arxiv.org/abs/2307.16418
- **Pdf link:** https://arxiv.org/pdf/2307.16418
- **Abstract** RAW files are the initial measurement of scene radiance widely used in most cameras, and the ubiquitously-used RGB images are converted from RAW data through Image Signal Processing (ISP) pipelines. Nowadays, digital images are risky of being nefariously manipulated. Inspired by the fact that innate immunity is the first line of body defense, we propose DRAW, a novel scheme of defending images against manipulation by protecting their sources, i.e., camera-shooted RAWs. Specifically, we design a lightweight Multi-frequency Partial Fusion Network (MPF-Net) friendly to devices with limited computing resources by frequency learning and partial feature fusion. It introduces invisible watermarks as protective signal into the RAW data. The protection capability can not only be transferred into the rendered RGB images regardless of the applied ISP pipeline, but also is resilient to post-processing operations such as blurring or compression. Once the image is manipulated, we can accurately identify the forged areas with a localization network. Extensive experiments on several famous RAW datasets, e.g., RAISE, FiveK and SIDD, indicate the effectiveness of our method. We hope that this technique can be used in future cameras as an option for image protection, which could effectively restrict image manipulation at the source.
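To make the watermark-then-localize workflow of DRAW concrete, here is a deliberately simple illustration of the *general* idea of hiding a protective signal in raw sensor values and flagging pixels whose signal no longer verifies. This is plain least-significant-bit embedding, NOT the paper's learned MPF-Net watermark; a real scheme must also survive the ISP pipeline and compression, which LSB embedding does not.

```python
# Toy invisible-watermark illustration (LSB embedding, not DRAW's MPF-Net).

def embed(raw_values, signal_bits):
    """Write one protective signal bit into the least-significant bit of each
    raw sample, changing each value by at most 1 count (visually invisible)."""
    return [(v & ~1) | b for v, b in zip(raw_values, signal_bits)]

def locate_tampering(protected, signal_bits):
    """Return indices of samples whose embedded bit no longer matches the
    expected signal -- a crude stand-in for DRAW's localization network."""
    return [i for i, (v, b) in enumerate(zip(protected, signal_bits))
            if (v & 1) != b]
```

A sample whose value is overwritten by an editor has roughly a 50% chance per pixel of breaking the expected bit, which is why real schemes spread redundant signal over regions rather than single samples.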
### Digging Into Uncertainty-based Pseudo-label for Robust Stereo Matching

- **Authors:** Zhelun Shen, Xibin Song, Yuchao Dai, Dingfu Zhou, Zhibo Rao, Liangjun Zhang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2307.16509
- **Pdf link:** https://arxiv.org/pdf/2307.16509
- **Abstract** Due to the domain differences and unbalanced disparity distribution across multiple datasets, current stereo matching approaches are commonly limited to a specific dataset and generalize poorly to others. Such domain shift issue is usually addressed by substantial adaptation on costly target-domain ground-truth data, which cannot be easily obtained in practical settings. In this paper, we propose to dig into uncertainty estimation for robust stereo matching. Specifically, to balance the disparity distribution, we employ a pixel-level uncertainty estimation to adaptively adjust the next stage disparity searching space, in this way driving the network progressively prune out the space of unlikely correspondences. Then, to solve the limited ground truth data, an uncertainty-based pseudo-label is proposed to adapt the pre-trained model to the new domain, where pixel-level and area-level uncertainty estimation are proposed to filter out the high-uncertainty pixels of predicted disparity maps and generate sparse while reliable pseudo-labels to align the domain gap. Experimentally, our method shows strong cross-domain, adapt, and joint generalization and obtains **1st** place on the stereo task of Robust Vision Challenge 2020. Additionally, our uncertainty-based pseudo-labels can be extended to train monocular depth estimation networks in an unsupervised way and even achieves comparable performance with the supervised methods. The code will be available at https://github.com/gallenszl/UCFNet.
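The core pseudo-labelling recipe in this abstract, keeping a predicted disparity only where per-pixel uncertainty is low so the target-domain labels are sparse but reliable, can be sketched in a few lines. This is our paraphrase of the idea, not the UCFNet code; the threshold value and the `None` masking convention are assumptions.

```python
# Sketch of uncertainty-filtered pseudo-labels (paraphrase of the idea above,
# not the UCFNet implementation).

def sparse_pseudo_labels(disparity, uncertainty, max_uncertainty=0.3):
    """disparity, uncertainty: HxW grids of floats for one predicted map.
    Returns an HxW grid keeping a pixel's disparity only where the model is
    confident; unreliable pixels are masked out with None."""
    return [[d if u <= max_uncertainty else None
             for d, u in zip(d_row, u_row)]
            for d_row, u_row in zip(disparity, uncertainty)]
```

The surviving pixels then serve as supervision for fine-tuning on the new domain, trading label density for label reliability.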
### Multi-Spectral Image Stitching via Spatial Graph Reasoning

- **Authors:** Zhiying Jiang, Zengxi Zhang, Jinyuan Liu, Xin Fan, Risheng Liu
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2307.16741
- **Pdf link:** https://arxiv.org/pdf/2307.16741
- **Abstract** Multi-spectral image stitching leverages the complementarity between infrared and visible images to generate a robust and reliable wide field-of-view (FOV) scene. The primary challenge of this task is to explore the relations between multi-spectral images for aligning and integrating multi-view scenes. Capitalizing on the strengths of Graph Convolutional Networks (GCNs) in modeling feature relationships, we propose a spatial graph reasoning based multi-spectral image stitching method that effectively distills the deformation and integration of multi-spectral images across different viewpoints. To accomplish this, we embed multi-scale complementary features from the same view position into a set of nodes. The correspondence across different views is learned through powerful dense feature embeddings, where both inter- and intra-correlations are developed to exploit cross-view matching and enhance inner feature disparity. By introducing long-range coherence along spatial and channel dimensions, the complementarity of pixel relations and channel interdependencies aids in the reconstruction of aligned multi-view features, generating informative and reliable wide FOV scenes. Moreover, we release a challenging dataset named ChaMS, comprising both real-world and synthetic sets with significant parallax, providing a new option for comprehensive evaluation. Extensive experiments demonstrate that our method surpasses the state-of-the-arts.
## Keyword: image signal processing

### DRAW: Defending Camera-shooted RAW against Image Manipulation

- **Authors:** Xiaoxiao Hu, Qichao Ying, Zhenxing Qian, Sheng Li, Xinpeng Zhang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Multimedia (cs.MM); Image and Video Processing (eess.IV)
- **Arxiv link:** https://arxiv.org/abs/2307.16418
- **Pdf link:** https://arxiv.org/pdf/2307.16418
- **Abstract** RAW files are the initial measurement of scene radiance widely used in most cameras, and the ubiquitously-used RGB images are converted from RAW data through Image Signal Processing (ISP) pipelines. Nowadays, digital images are risky of being nefariously manipulated. Inspired by the fact that innate immunity is the first line of body defense, we propose DRAW, a novel scheme of defending images against manipulation by protecting their sources, i.e., camera-shooted RAWs. Specifically, we design a lightweight Multi-frequency Partial Fusion Network (MPF-Net) friendly to devices with limited computing resources by frequency learning and partial feature fusion. It introduces invisible watermarks as protective signal into the RAW data. The protection capability can not only be transferred into the rendered RGB images regardless of the applied ISP pipeline, but also is resilient to post-processing operations such as blurring or compression. Once the image is manipulated, we can accurately identify the forged areas with a localization network. Extensive experiments on several famous RAW datasets, e.g., RAISE, FiveK and SIDD, indicate the effectiveness of our method. We hope that this technique can be used in future cameras as an option for image protection, which could effectively restrict image manipulation at the source.
## Keyword: image signal process

### DRAW: Defending Camera-shooted RAW against Image Manipulation

- **Authors:** Xiaoxiao Hu, Qichao Ying, Zhenxing Qian, Sheng Li, Xinpeng Zhang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Multimedia (cs.MM); Image and Video Processing (eess.IV)
- **Arxiv link:** https://arxiv.org/abs/2307.16418
- **Pdf link:** https://arxiv.org/pdf/2307.16418
- **Abstract** RAW files are the initial measurement of scene radiance widely used in most cameras, and the ubiquitously-used RGB images are converted from RAW data through Image Signal Processing (ISP) pipelines. Nowadays, digital images are risky of being nefariously manipulated. Inspired by the fact that innate immunity is the first line of body defense, we propose DRAW, a novel scheme of defending images against manipulation by protecting their sources, i.e., camera-shooted RAWs. Specifically, we design a lightweight Multi-frequency Partial Fusion Network (MPF-Net) friendly to devices with limited computing resources by frequency learning and partial feature fusion. It introduces invisible watermarks as protective signal into the RAW data. The protection capability can not only be transferred into the rendered RGB images regardless of the applied ISP pipeline, but also is resilient to post-processing operations such as blurring or compression. Once the image is manipulated, we can accurately identify the forged areas with a localization network. Extensive experiments on several famous RAW datasets, e.g., RAISE, FiveK and SIDD, indicate the effectiveness of our method. We hope that this technique can be used in future cameras as an option for image protection, which could effectively restrict image manipulation at the source.
## Keyword: compression

### InfoStyler: Disentanglement Information Bottleneck for Artistic Style Transfer

- **Authors:** Yueming Lyu, Yue Jiang, Bo Peng, Jing Dong
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2307.16227
- **Pdf link:** https://arxiv.org/pdf/2307.16227
- **Abstract** Artistic style transfer aims to transfer the style of an artwork to a photograph while maintaining its original overall content. Many prior works focus on designing various transfer modules to transfer the style statistics to the content image. Although effective, ignoring the clear disentanglement of the content features and the style features from the first beginning, they have difficulty in balancing between content preservation and style transferring. To tackle this problem, we propose a novel information disentanglement method, named InfoStyler, to capture the minimal sufficient information for both content and style representations from the pre-trained encoding network. InfoStyler formulates the disentanglement representation learning as an information compression problem by eliminating style statistics from the content image and removing the content structure from the style image. Besides, to further facilitate disentanglement learning, a cross-domain Information Bottleneck (IB) learning strategy is proposed by reconstructing the content and style domains. Extensive experiments demonstrate that our InfoStyler can synthesize high-quality stylized images while balancing content structure preservation and style pattern richness.
### DRAW: Defending Camera-shooted RAW against Image Manipulation

- **Authors:** Xiaoxiao Hu, Qichao Ying, Zhenxing Qian, Sheng Li, Xinpeng Zhang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Multimedia (cs.MM); Image and Video Processing (eess.IV)
- **Arxiv link:** https://arxiv.org/abs/2307.16418
- **Pdf link:** https://arxiv.org/pdf/2307.16418
- **Abstract** RAW files are the initial measurement of scene radiance widely used in most cameras, and the ubiquitously-used RGB images are converted from RAW data through Image Signal Processing (ISP) pipelines. Nowadays, digital images are risky of being nefariously manipulated. Inspired by the fact that innate immunity is the first line of body defense, we propose DRAW, a novel scheme of defending images against manipulation by protecting their sources, i.e., camera-shooted RAWs. Specifically, we design a lightweight Multi-frequency Partial Fusion Network (MPF-Net) friendly to devices with limited computing resources by frequency learning and partial feature fusion. It introduces invisible watermarks as protective signal into the RAW data. The protection capability can not only be transferred into the rendered RGB images regardless of the applied ISP pipeline, but also is resilient to post-processing operations such as blurring or compression. Once the image is manipulated, we can accurately identify the forged areas with a localization network. Extensive experiments on several famous RAW datasets, e.g., RAISE, FiveK and SIDD, indicate the effectiveness of our method. We hope that this technique can be used in future cameras as an option for image protection, which could effectively restrict image manipulation at the source.
## Keyword: RAW

### DRAW: Defending Camera-shooted RAW against Image Manipulation

- **Authors:** Xiaoxiao Hu, Qichao Ying, Zhenxing Qian, Sheng Li, Xinpeng Zhang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Multimedia (cs.MM); Image and Video Processing (eess.IV)
- **Arxiv link:** https://arxiv.org/abs/2307.16418
- **Pdf link:** https://arxiv.org/pdf/2307.16418
- **Abstract** RAW files are the initial measurement of scene radiance widely used in most cameras, and the ubiquitously-used RGB images are converted from RAW data through Image Signal Processing (ISP) pipelines. Nowadays, digital images are risky of being nefariously manipulated. Inspired by the fact that innate immunity is the first line of body defense, we propose DRAW, a novel scheme of defending images against manipulation by protecting their sources, i.e., camera-shooted RAWs. Specifically, we design a lightweight Multi-frequency Partial Fusion Network (MPF-Net) friendly to devices with limited computing resources by frequency learning and partial feature fusion. It introduces invisible watermarks as protective signal into the RAW data. The protection capability can not only be transferred into the rendered RGB images regardless of the applied ISP pipeline, but also is resilient to post-processing operations such as blurring or compression. Once the image is manipulated, we can accurately identify the forged areas with a localization network. Extensive experiments on several famous RAW datasets, e.g., RAISE, FiveK and SIDD, indicate the effectiveness of our method. We hope that this technique can be used in future cameras as an option for image protection, which could effectively restrict image manipulation at the source.
### Towards General Low-Light Raw Noise Synthesis and Modeling

- **Authors:** Feng Zhang, Bin Xu, Zhiqiang Li, Xinran Liu, Qingbo Lu, Changxin Gao, Nong Sang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Multimedia (cs.MM); Image and Video Processing (eess.IV)
- **Arxiv link:** https://arxiv.org/abs/2307.16508
- **Pdf link:** https://arxiv.org/pdf/2307.16508
- **Abstract** Modeling and synthesizing low-light raw noise is a fundamental problem for computational photography and image processing applications. Although most recent works have adopted physics-based models to synthesize noise, the signal-independent noise in low-light conditions is far more complicated and varies dramatically across camera sensors, which is beyond the description of these models. To address this issue, we introduce a new perspective to synthesize the signal-independent noise by a generative model. Specifically, we synthesize the signal-dependent and signal-independent noise in a physics- and learning-based manner, respectively. In this way, our method can be considered as a general model, that is, it can simultaneously learn different noise characteristics for different ISO levels and generalize to various sensors. Subsequently, we present an effective multi-scale discriminator termed Fourier transformer discriminator (FTD) to distinguish the noise distribution accurately. Additionally, we collect a new low-light raw denoising (LRD) dataset for training and benchmarking. Qualitative validation shows that the noise generated by our proposed noise model can be highly similar to the real noise in terms of distribution. Furthermore, extensive denoising experiments demonstrate that our method performs favorably against state-of-the-art methods on different sensors. The source code and dataset can be found at https://github.com/fengzhang427/LRD.
### Echoes Beyond Points: Unleashing the Power of Raw Radar Data in Multi-modality Fusion

- **Authors:** Yang Liu, Feng Wang, Naiyan Wang, Zhaoxiang Zhang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2307.16532
- **Pdf link:** https://arxiv.org/pdf/2307.16532
- **Abstract** Radar is ubiquitous in autonomous driving systems due to its low cost and good adaptability to bad weather. Nevertheless, the radar detection performance is usually inferior because its point cloud is sparse and not accurate due to the poor azimuth and elevation resolution. Moreover, point cloud generation algorithms already drop weak signals to reduce the false targets, which may be suboptimal for the use of deep fusion. In this paper, we propose a novel method named EchoFusion to skip the existing radar signal processing pipeline and then incorporate the radar raw data with other sensors. Specifically, we first generate the Bird's Eye View (BEV) queries and then take corresponding spectrum features from radar to fuse with other sensors. By this approach, our method could utilize both rich and lossless distance and speed clues from radar echoes and rich semantic clues from images, making our method surpass all existing methods on the RADIal dataset, and approach the performance of LiDAR. Codes will be available upon acceptance.

## Keyword: raw image

There is no result
limited computing resources by frequency learning and partial feature fusion it introduces invisible watermarks as protective signal into the raw data the protection capability can not only be transferred into the rendered rgb images regardless of the applied isp pipeline but also is resilient to post processing operations such as blurring or compression once the image is manipulated we can accurately identify the forged areas with a localization network extensive experiments on several famous raw datasets e g raise fivek and sidd indicate the effectiveness of our method we hope that this technique can be used in future cameras as an option for image protection which could effectively restrict image manipulation at the source digging into uncertainty based pseudo label for robust stereo matching authors zhelun shen xibin song yuchao dai dingfu zhou zhibo rao liangjun zhang subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract due to the domain differences and unbalanced disparity distribution across multiple datasets current stereo matching approaches are commonly limited to a specific dataset and generalize poorly to others such domain shift issue is usually addressed by substantial adaptation on costly target domain ground truth data which cannot be easily obtained in practical settings in this paper we propose to dig into uncertainty estimation for robust stereo matching specifically to balance the disparity distribution we employ a pixel level uncertainty estimation to adaptively adjust the next stage disparity searching space in this way driving the network progressively prune out the space of unlikely correspondences then to solve the limited ground truth data an uncertainty based pseudo label is proposed to adapt the pre trained model to the new domain where pixel level and area level uncertainty estimation are proposed to filter out the high uncertainty pixels of predicted disparity maps and generate sparse while reliable 
pseudo labels to align the domain gap experimentally our method shows strong cross domain adapt and joint generalization and obtains textbf place on the stereo task of robust vision challenge additionally our uncertainty based pseudo labels can be extended to train monocular depth estimation networks in an unsupervised way and even achieves comparable performance with the supervised methods the code will be available at multi spectral image stitching via spatial graph reasoning authors zhiying jiang zengxi zhang jinyuan liu xin fan risheng liu subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract multi spectral image stitching leverages the complementarity between infrared and visible images to generate a robust and reliable wide field of view fov scene the primary challenge of this task is to explore the relations between multi spectral images for aligning and integrating multi view scenes capitalizing on the strengths of graph convolutional networks gcns in modeling feature relationships we propose a spatial graph reasoning based multi spectral image stitching method that effectively distills the deformation and integration of multi spectral images across different viewpoints to accomplish this we embed multi scale complementary features from the same view position into a set of nodes the correspondence across different views is learned through powerful dense feature embeddings where both inter and intra correlations are developed to exploit cross view matching and enhance inner feature disparity by introducing long range coherence along spatial and channel dimensions the complementarity of pixel relations and channel interdependencies aids in the reconstruction of aligned multi view features generating informative and reliable wide fov scenes moreover we release a challenging dataset named chams comprising both real world and synthetic sets with significant parallax providing a new option for comprehensive evaluation extensive 
experiments demonstrate that our method surpasses the state of the arts keyword image signal processing draw defending camera shooted raw against image manipulation authors xiaoxiao hu qichao ying zhenxing qian sheng li xinpeng zhang subjects computer vision and pattern recognition cs cv multimedia cs mm image and video processing eess iv arxiv link pdf link abstract raw files are the initial measurement of scene radiance widely used in most cameras and the ubiquitously used rgb images are converted from raw data through image signal processing isp pipelines nowadays digital images are risky of being nefariously manipulated inspired by the fact that innate immunity is the first line of body defense we propose draw a novel scheme of defending images against manipulation by protecting their sources i e camera shooted raws specifically we design a lightweight multi frequency partial fusion network mpf net friendly to devices with limited computing resources by frequency learning and partial feature fusion it introduces invisible watermarks as protective signal into the raw data the protection capability can not only be transferred into the rendered rgb images regardless of the applied isp pipeline but also is resilient to post processing operations such as blurring or compression once the image is manipulated we can accurately identify the forged areas with a localization network extensive experiments on several famous raw datasets e g raise fivek and sidd indicate the effectiveness of our method we hope that this technique can be used in future cameras as an option for image protection which could effectively restrict image manipulation at the source keyword image signal process draw defending camera shooted raw against image manipulation authors xiaoxiao hu qichao ying zhenxing qian sheng li xinpeng zhang subjects computer vision and pattern recognition cs cv multimedia cs mm image and video processing eess iv arxiv link pdf link abstract raw files are the initial 
measurement of scene radiance widely used in most cameras and the ubiquitously used rgb images are converted from raw data through image signal processing isp pipelines nowadays digital images are risky of being nefariously manipulated inspired by the fact that innate immunity is the first line of body defense we propose draw a novel scheme of defending images against manipulation by protecting their sources i e camera shooted raws specifically we design a lightweight multi frequency partial fusion network mpf net friendly to devices with limited computing resources by frequency learning and partial feature fusion it introduces invisible watermarks as protective signal into the raw data the protection capability can not only be transferred into the rendered rgb images regardless of the applied isp pipeline but also is resilient to post processing operations such as blurring or compression once the image is manipulated we can accurately identify the forged areas with a localization network extensive experiments on several famous raw datasets e g raise fivek and sidd indicate the effectiveness of our method we hope that this technique can be used in future cameras as an option for image protection which could effectively restrict image manipulation at the source keyword compression infostyler disentanglement information bottleneck for artistic style transfer authors yueming lyu yue jiang bo peng jing dong subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract artistic style transfer aims to transfer the style of an artwork to a photograph while maintaining its original overall content many prior works focus on designing various transfer modules to transfer the style statistics to the content image although effective ignoring the clear disentanglement of the content features and the style features from the first beginning they have difficulty in balancing between content preservation and style transferring to tackle this problem we propose 
a novel information disentanglement method named infostyler to capture the minimal sufficient information for both content and style representations from the pre trained encoding network infostyler formulates the disentanglement representation learning as an information compression problem by eliminating style statistics from the content image and removing the content structure from the style image besides to further facilitate disentanglement learning a cross domain information bottleneck ib learning strategy is proposed by reconstructing the content and style domains extensive experiments demonstrate that our infostyler can synthesize high quality stylized images while balancing content structure preservation and style pattern richness draw defending camera shooted raw against image manipulation authors xiaoxiao hu qichao ying zhenxing qian sheng li xinpeng zhang subjects computer vision and pattern recognition cs cv multimedia cs mm image and video processing eess iv arxiv link pdf link abstract raw files are the initial measurement of scene radiance widely used in most cameras and the ubiquitously used rgb images are converted from raw data through image signal processing isp pipelines nowadays digital images are risky of being nefariously manipulated inspired by the fact that innate immunity is the first line of body defense we propose draw a novel scheme of defending images against manipulation by protecting their sources i e camera shooted raws specifically we design a lightweight multi frequency partial fusion network mpf net friendly to devices with limited computing resources by frequency learning and partial feature fusion it introduces invisible watermarks as protective signal into the raw data the protection capability can not only be transferred into the rendered rgb images regardless of the applied isp pipeline but also is resilient to post processing operations such as blurring or compression once the image is manipulated we can accurately identify 
the forged areas with a localization network extensive experiments on several famous raw datasets e g raise fivek and sidd indicate the effectiveness of our method we hope that this technique can be used in future cameras as an option for image protection which could effectively restrict image manipulation at the source keyword raw draw defending camera shooted raw against image manipulation authors xiaoxiao hu qichao ying zhenxing qian sheng li xinpeng zhang subjects computer vision and pattern recognition cs cv multimedia cs mm image and video processing eess iv arxiv link pdf link abstract raw files are the initial measurement of scene radiance widely used in most cameras and the ubiquitously used rgb images are converted from raw data through image signal processing isp pipelines nowadays digital images are risky of being nefariously manipulated inspired by the fact that innate immunity is the first line of body defense we propose draw a novel scheme of defending images against manipulation by protecting their sources i e camera shooted raws specifically we design a lightweight multi frequency partial fusion network mpf net friendly to devices with limited computing resources by frequency learning and partial feature fusion it introduces invisible watermarks as protective signal into the raw data the protection capability can not only be transferred into the rendered rgb images regardless of the applied isp pipeline but also is resilient to post processing operations such as blurring or compression once the image is manipulated we can accurately identify the forged areas with a localization network extensive experiments on several famous raw datasets e g raise fivek and sidd indicate the effectiveness of our method we hope that this technique can be used in future cameras as an option for image protection which could effectively restrict image manipulation at the source towards general low light raw noise synthesis and modeling authors feng zhang bin xu 
zhiqiang li xinran liu qingbo lu changxin gao nong sang subjects computer vision and pattern recognition cs cv multimedia cs mm image and video processing eess iv arxiv link pdf link abstract modeling and synthesizing low light raw noise is a fundamental problem for computational photography and image processing applications although most recent works have adopted physics based models to synthesize noise the signal independent noise in low light conditions is far more complicated and varies dramatically across camera sensors which is beyond the description of these models to address this issue we introduce a new perspective to synthesize the signal independent noise by a generative model specifically we synthesize the signal dependent and signal independent noise in a physics and learning based manner respectively in this way our method can be considered as a general model that is it can simultaneously learn different noise characteristics for different iso levels and generalize to various sensors subsequently we present an effective multi scale discriminator termed fourier transformer discriminator ftd to distinguish the noise distribution accurately additionally we collect a new low light raw denoising lrd dataset for training and benchmarking qualitative validation shows that the noise generated by our proposed noise model can be highly similar to the real noise in terms of distribution furthermore extensive denoising experiments demonstrate that our method performs favorably against state of the art methods on different sensors the source code and dataset can be found at url echoes beyond points unleashing the power of raw radar data in multi modality fusion authors yang liu feng wang naiyan wang zhaoxiang zhang subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract radar is ubiquitous in autonomous driving systems due to its low cost and good adaptability to bad weather nevertheless the radar detection performance is usually 
inferior because its point cloud is sparse and not accurate due to the poor azimuth and elevation resolution moreover point cloud generation algorithms already drop weak signals to reduce the false targets which may be suboptimal for the use of deep fusion in this paper we propose a novel method named echofusion to skip the existing radar signal processing pipeline and then incorporate the radar raw data with other sensors specifically we first generate the bird s eye view bev queries and then take corresponding spectrum features from radar to fuse with other sensors by this approach our method could utilize both rich and lossless distance and speed clues from radar echoes and rich semantic clues from images making our method surpass all existing methods on the radial dataset and approach the performance of lidar codes will be available upon acceptance keyword raw image there is no result
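The parameter-free spatial-shift operation mentioned in the SCNet abstract above is simple enough to sketch. The NumPy toy below is illustrative only: the group layout, shift directions, and zero-padding are assumptions of this sketch, not SCNet's exact configuration. It shows how shifting channel groups lets a purely pointwise (1×1) network see neighboring pixels:

```python
import numpy as np

def spatial_shift(x, groups=4):
    """Parameter-free spatial shift on a (C, H, W) feature map.

    Channels are split into `groups` slices; each slice is shifted one
    pixel in a different direction (left, right, up, down), with the
    vacated border zero-filled. Leftover channels stay unshifted.
    """
    out = np.zeros_like(x)
    c = x.shape[0] // groups
    out[0 * c:1 * c, :, :-1] = x[0 * c:1 * c, :, 1:]   # shift left
    out[1 * c:2 * c, :, 1:]  = x[1 * c:2 * c, :, :-1]  # shift right
    out[2 * c:3 * c, :-1, :] = x[2 * c:3 * c, 1:, :]   # shift up
    out[3 * c:4 * c, 1:, :]  = x[3 * c:4 * c, :-1, :]  # shift down
    out[4 * c:] = x[4 * c:]                            # remainder unchanged
    return out
```

Stacking such a shift before a 1×1 convolution gives each output pixel access to a cross-shaped neighborhood at zero parameter cost, which is the intuition behind shift-conv super-resolution networks.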
1
533
2,578,268,334
IssuesEvent
2015-02-12 22:07:53
ptsochantaris/trailer
https://api.github.com/repos/ptsochantaris/trailer
closed
Extraordinarily high memory usage
bug in next build
*Version:* 1.1.5 I'm sharing this knowing that it probably falls squarely into the `wontfix` category, but I wanted to at least bring it to your attention. ![screen shot 2015-02-06 at 12 12 30 pm](https://cloud.githubusercontent.com/assets/80459/6084780/91664314-adf9-11e4-9d26-12580e3fa742.png) For the last few weeks, when my Late 2011 Mac mini starts slowing down in the afternoon, I go to Activity Monitor and see that Trailer.app has claimed a rather sizable chunk of available memory. I have hidden all but two of the dozens of repositories which I can access to see if that helps, but it has not. ### Process sample ``` Sampling process 509 for 3 seconds with 1 millisecond of run time between samples Sampling completed, processing symbols... Analysis of sampling Trailer (pid 509) every 1 millisecond Process: Trailer [509] Path: /Applications/Trailer.app/Contents/MacOS/Trailer Load Address: 0x10a0a1000 Identifier: com.housetrip.Trailer Version: 1.1.5 (1164) Code Type: X86-64 Parent Process: ???
[1] Date/Time: 2015-02-06 12:20:09.106 -0600 OS Version: Mac OS X 10.10.2 (14C109) Report Version: 7 Analysis Tool: /usr/bin/sample ---- Call graph: 2667 Thread_3162 DispatchQueue_1: com.apple.main-thread (serial) + 2667 start (in libdyld.dylib) + 1 [0x7fff99e4f5c9] + 2667 NSApplicationMain (in AppKit) + 1832 [0x7fff934c4a14] + 2667 -[NSApplication run] (in AppKit) + 594 [0x7fff934d9593] + 2667 -[NSApplication nextEventMatchingMask:untilDate:inMode:dequeue:] (in AppKit) + 194 [0x7fff934e5730] + 2667 _DPSNextEvent (in AppKit) + 964 [0x7fff934e5f81] + 2667 _BlockUntilNextEventMatchingListInModeWithFilter (in HIToolbox) + 71 [0x7fff90c826ab] + 2667 ReceiveNextEventCommon (in HIToolbox) + 431 [0x7fff90c8286a] + 2667 RunCurrentEventLoopInMode (in HIToolbox) + 235 [0x7fff90c82aef] + 2667 CFRunLoopRunSpecific (in CoreFoundation) + 296 [0x7fff90650858] + 2667 __CFRunLoopRun (in CoreFoundation) + 1371 [0x7fff90650ffb] + 2667 __CFRunLoopServiceMachPort (in CoreFoundation) + 212 [0x7fff90651b34] + 2667 mach_msg (in libsystem_kernel.dylib) + 55 [0x7fff920e664f] + 2667 mach_msg_trap (in libsystem_kernel.dylib) + 10 [0x7fff920e74de] 2667 Thread_3187 DispatchQueue_2: com.apple.libdispatch-manager (serial) + 2667 _dispatch_mgr_thread (in libdispatch.dylib) + 52 [0x7fff9829ea6a] + 2667 kevent64 (in libsystem_kernel.dylib) + 10 [0x7fff920ed232] 2667 Thread_3292: com.apple.NSURLConnectionLoader + 2667 thread_start (in libsystem_pthread.dylib) + 13 [0x7fff8f7b641d] + 2667 _pthread_start (in libsystem_pthread.dylib) + 176 [0x7fff8f7b81e5] + 2667 _pthread_body (in libsystem_pthread.dylib) + 131 [0x7fff8f7b8268] + 2667 __NSThread__main__ (in Foundation) + 1345 [0x7fff9690c90a] + 2667 +[NSURLConnection(Loader) _resourceLoadLoop:] (in CFNetwork) + 434 [0x7fff93283c80] + 2667 CFRunLoopRunSpecific (in CoreFoundation) + 296 [0x7fff90650858] + 2667 __CFRunLoopRun (in CoreFoundation) + 1371 [0x7fff90650ffb] + 2667 __CFRunLoopServiceMachPort (in CoreFoundation) + 212 [0x7fff90651b34] + 2667 
mach_msg (in libsystem_kernel.dylib) + 55 [0x7fff920e664f] + 2667 mach_msg_trap (in libsystem_kernel.dylib) + 10 [0x7fff920e74de] 2667 Thread_3311 + 2667 thread_start (in libsystem_pthread.dylib) + 13 [0x7fff8f7b641d] + 2667 _pthread_start (in libsystem_pthread.dylib) + 176 [0x7fff8f7b81e5] + 2667 _pthread_body (in libsystem_pthread.dylib) + 131 [0x7fff8f7b8268] + 2667 _NSEventThread (in AppKit) + 137 [0x7fff9364933b] + 2667 CFRunLoopRunSpecific (in CoreFoundation) + 296 [0x7fff90650858] + 2667 __CFRunLoopRun (in CoreFoundation) + 1371 [0x7fff90650ffb] + 2667 __CFRunLoopServiceMachPort (in CoreFoundation) + 212 [0x7fff90651b34] + 2667 mach_msg (in libsystem_kernel.dylib) + 55 [0x7fff920e664f] + 2667 mach_msg_trap (in libsystem_kernel.dylib) + 10 [0x7fff920e74de] 2667 Thread_3317: com.apple.CFSocket.private + 2667 thread_start (in libsystem_pthread.dylib) + 13 [0x7fff8f7b641d] + 2667 _pthread_start (in libsystem_pthread.dylib) + 176 [0x7fff8f7b81e5] + 2667 _pthread_body (in libsystem_pthread.dylib) + 131 [0x7fff8f7b8268] + 2667 __select (in libsystem_kernel.dylib) + 10 [0x7fff920ec3fa] 2667 Thread_1569702 + 2667 start_wqthread (in libsystem_pthread.dylib) + 13 [0x7fff8f7b640d] + 2667 __workq_kernreturn (in libsystem_kernel.dylib) + 10 [0x7fff920ec94a] 2667 Thread_1569709 + 2667 start_wqthread (in libsystem_pthread.dylib) + 13 [0x7fff8f7b640d] + 2667 __workq_kernreturn (in libsystem_kernel.dylib) + 10 [0x7fff920ec94a] 2667 Thread_1569813 + 2667 start_wqthread (in libsystem_pthread.dylib) + 13 [0x7fff8f7b640d] + 2667 __workq_kernreturn (in libsystem_kernel.dylib) + 10 [0x7fff920ec94a] 2667 Thread_1569814 + 2667 start_wqthread (in libsystem_pthread.dylib) + 13 [0x7fff8f7b640d] + 2667 __workq_kernreturn (in libsystem_kernel.dylib) + 10 [0x7fff920ec94a] 2667 Thread_1569815 2667 start_wqthread (in libsystem_pthread.dylib) + 13 [0x7fff8f7b640d] 2667 __workq_kernreturn (in libsystem_kernel.dylib) + 10 [0x7fff920ec94a] Total number in stack (recursive counted multiple, when 
>=5): 5 __workq_kernreturn (in libsystem_kernel.dylib) + 0 [0x7fff920ec940] 5 start_wqthread (in libsystem_pthread.dylib) + 13 [0x7fff8f7b640d] Sort by top of stack, same collapsed (when >= 5): __workq_kernreturn (in libsystem_kernel.dylib) 13335 mach_msg_trap (in libsystem_kernel.dylib) 8001 __select (in libsystem_kernel.dylib) 2667 kevent64 (in libsystem_kernel.dylib) 2667 Binary Images: 0x10a0a1000 - 0x10a0e8fff +com.housetrip.Trailer (1.1.5 - 1164) <36CF8E7E-2026-3369-B458-A47BF13C9BFD> /Applications/Trailer.app/Contents/MacOS/Trailer 0x10a102000 - 0x10a130ff7 +org.andymatuschak.Sparkle (1.8.0 - 0b186dc) <3E8C44BB-5A29-3577-AC71-89DFAB4ECD19> /Applications/Trailer.app/Contents/Frameworks/Sparkle.framework/Versions/A/Sparkle 0x10ccbf000 - 0x10ccbfffe +cl_kernels (???) <25634DE5-9D71-402E-9F80-8774D220E196> cl_kernels 0x10f183000 - 0x10f269fef unorm8_bgra.dylib (2.4.5) <9423FFD4-6EF3-31BF-9DE9-6D55BA76D59E> /System/Library/Frameworks/OpenCL.framework/Versions/A/Libraries/ImageFormats/unorm8_bgra.dylib 0x10f2be000 - 0x10f2befef +cl_kernels (???) <C38B13A9-43C5-4D35-92C4-CAC23363F685> cl_kernels 0x11070f000 - 0x110789ff7 com.apple.xquery (1.3.1 - 30) <9D868AE3-C5A0-34BD-8C33-EB5F8EDD7ACA> /System/Library/PrivateFrameworks/XQuery.framework/XQuery 0x1153b0000 - 0x1153b0ff5 +cl_kernels (???) <A37A3DF7-5951-4991-A19B-4C2CDD754F15> cl_kernels 0x7fff6db6f000 - 0x7fff6dba5837 dyld (0.0 - ???) 
<65DCCB06-339C-3E25-9702-600A28291D0E> /usr/lib/dyld 0x7fff8d7ae000 - 0x7fff8d7c7ff7 com.apple.CFOpenDirectory (10.10 - 187) <0F9747EF-12A3-3694-984D-0B8352CA6C0F> /System/Library/Frameworks/OpenDirectory.framework/Versions/A/Frameworks/CFOpenDirectory.framework/Versions/A/CFOpenDirectory 0x7fff8d7f5000 - 0x7fff8dac4ff3 com.apple.CoreImage (10.0.33) <6E3DDA29-718B-3BDB-BFAF-F8C201BF93A4> /System/Library/Frameworks/QuartzCore.framework/Versions/A/Frameworks/CoreImage.framework/Versions/A/CoreImage 0x7fff8dd45000 - 0x7fff8dd46fff libDiagnosticMessagesClient.dylib (100) <2EE8E436-5CDC-34C5-9959-5BA218D507FB> /usr/lib/libDiagnosticMessagesClient.dylib 0x7fff8dd50000 - 0x7fff8de8dfff com.apple.ImageIO.framework (3.3.0 - 1232) <D7AF3CD2-FAB2-3798-9C26-914886852DCD> /System/Library/Frameworks/ImageIO.framework/Versions/A/ImageIO 0x7fff8de8e000 - 0x7fff8de99fff libGL.dylib (11.1.1) <1F0EB9FB-4B0F-349B-80DD-93FD3F45B9C7> /System/Library/Frameworks/OpenGL.framework/Versions/A/Libraries/libGL.dylib 0x7fff8de9a000 - 0x7fff8de9efff com.apple.TCC (1.0 - 1) <61F36A72-B983-3A2D-9D37-A2F194D31E7D> /System/Library/PrivateFrameworks/TCC.framework/Versions/A/TCC 0x7fff8de9f000 - 0x7fff8dea1fff com.apple.EFILogin (2.0 - 2) <39895ACB-E756-342C-ABE5-DB7100EF0A69> /System/Library/PrivateFrameworks/EFILogin.framework/Versions/A/EFILogin 0x7fff8dea2000 - 0x7fff8dec1fff com.apple.CoreDuet (1.0 - 1) <36AA9FD5-2685-314D-B364-3FA4688D86BD> /System/Library/PrivateFrameworks/CoreDuet.framework/Versions/A/CoreDuet 0x7fff8dec2000 - 0x7fff8df1cff7 com.apple.LanguageModeling (1.0 - 1) <ACA93FE0-A0E3-333E-AE3C-8EB7DE5F362F> /System/Library/PrivateFrameworks/LanguageModeling.framework/Versions/A/LanguageModeling 0x7fff8df40000 - 0x7fff8e068ff7 com.apple.coreui (2.1 - 305.6.1) <B56EC212-73C1-326F-B78C-EB856386296E> /System/Library/PrivateFrameworks/CoreUI.framework/Versions/A/CoreUI 0x7fff8e06c000 - 0x7fff8e17afff com.apple.desktopservices (1.9.2 - 1.9.2) <8670FD3B-8A5B-3D84-B21E-DF21140545A2> 
/System/Library/PrivateFrameworks/DesktopServicesPriv.framework/Versions/A/DesktopServicesPriv 0x7fff8e17b000 - 0x7fff8e1b6fff com.apple.Symbolication (1.4 - 56045) <D64571B1-4483-3FE2-BD67-A91360F79727> /System/Library/PrivateFrameworks/Symbolication.framework/Versions/A/Symbolication 0x7fff8e1c4000 - 0x7fff8e1ccff7 com.apple.AppleSRP (5.0 - 1) <01EC5144-D09A-3D6A-AE35-F6D48585F154> /System/Library/PrivateFrameworks/AppleSRP.framework/Versions/A/AppleSRP 0x7fff8e1cd000 - 0x7fff8e1f8fff libc++abi.dylib (125) <88A22A0F-87C6-3002-BFBA-AC0F2808B8B9> /usr/lib/libc++abi.dylib 0x7fff8e723000 - 0x7fff8e7a0fff com.apple.CoreServices.OSServices (640.3 - 640.3) <84A91B00-0ED4-350C-B30A-AEAE437AE02A> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/OSServices.framework/Versions/A/OSServices 0x7fff8e7a1000 - 0x7fff8e7a3ff7 libsystem_sandbox.dylib (358.1.1) <95312E09-DA28-324A-A084-F3E574D0210E> /usr/lib/system/libsystem_sandbox.dylib 0x7fff8e7e7000 - 0x7fff8e80eff7 com.apple.shortcut (2.13 - 2.13) <0BA7C57A-C2FC-3DFC-83B2-CE6C33770B52> /System/Library/PrivateFrameworks/Shortcut.framework/Versions/A/Shortcut 0x7fff8e80f000 - 0x7fff8e901fff libxml2.2.dylib (26) <B834E7C8-EC3E-3382-BC5A-DA38DC4D720C> /usr/lib/libxml2.2.dylib 0x7fff8eb6d000 - 0x7fff8ebaefff libGLU.dylib (11.1.1) <E9ADAD30-0133-320D-A60E-D1A7F91A7795> /System/Library/Frameworks/OpenGL.framework/Versions/A/Libraries/libGLU.dylib 0x7fff8ebaf000 - 0x7fff8ee5bfff com.apple.GeoServices (1.0 - 982.4.10) <8A7FE04A-2785-30E7-A6E2-DC15D170DAF5> /System/Library/PrivateFrameworks/GeoServices.framework/Versions/A/GeoServices 0x7fff8ee5c000 - 0x7fff8eefeff7 com.apple.Bluetooth (4.3.2 - 4.3.2f6) <95676652-21AB-3FFA-B53D-EBC8BF4E913E> /System/Library/Frameworks/IOBluetooth.framework/Versions/A/IOBluetooth 0x7fff8f16d000 - 0x7fff8f178ff7 libcsfde.dylib (471.10.6) <E1BF5816-3CE6-30CE-B3EE-F68CB6BA1378> /usr/lib/libcsfde.dylib 0x7fff8f179000 - 0x7fff8f1cdfff libc++.1.dylib (120) 
<1B9530FD-989B-3174-BB1C-BDC159501710> /usr/lib/libc++.1.dylib 0x7fff8f346000 - 0x7fff8f36efff libxpc.dylib (559.10.3) <876216DC-D5D3-381E-8AF9-49AE464E5107> /usr/lib/system/libxpc.dylib 0x7fff8f373000 - 0x7fff8f375fff libRadiance.dylib (1232) <E670DDEF-60F8-3AEB-B6A2-B20A1340634C> /System/Library/Frameworks/ImageIO.framework/Versions/A/Resources/libRadiance.dylib 0x7fff8f376000 - 0x7fff8f44cff3 com.apple.DiskImagesFramework (10.10.1 - 396) <E7478685-E829-372A-A945-A512730D3312> /System/Library/PrivateFrameworks/DiskImages.framework/Versions/A/DiskImages 0x7fff8f44d000 - 0x7fff8f44dfff com.apple.Cocoa (6.8 - 21) <EAC0EA1E-3C62-3B28-A941-5D8B1E085FF8> /System/Library/Frameworks/Cocoa.framework/Versions/A/Cocoa 0x7fff8f44e000 - 0x7fff8f46ffff com.apple.framework.Apple80211 (10.1 - 1010.64) <A7378C4B-FFD3-35B9-93E8-0534A2A7B51F> /System/Library/PrivateFrameworks/Apple80211.framework/Versions/A/Apple80211 0x7fff8f470000 - 0x7fff8f4b0ff7 com.apple.CloudDocs (1.0 - 280.6) <C1179CEF-E058-3E16-BF90-C059FE7CDE77> /System/Library/PrivateFrameworks/CloudDocs.framework/Versions/A/CloudDocs 0x7fff8f4b1000 - 0x7fff8f53dff7 libsystem_c.dylib (1044.10.1) <199ED5EB-77A1-3D43-AA51-81779CE0A742> /usr/lib/system/libsystem_c.dylib 0x7fff8f5a3000 - 0x7fff8f634ff7 com.apple.cloudkit.CloudKit (259.2.5 - 259.2.5) <241EB647-C917-32F7-956A-6E505827048C> /System/Library/Frameworks/CloudKit.framework/Versions/A/CloudKit 0x7fff8f681000 - 0x7fff8f69dff7 libsystem_malloc.dylib (53.1.1) <19BCC257-5717-3502-A71F-95D65AFA861B> /usr/lib/system/libsystem_malloc.dylib 0x7fff8f6ec000 - 0x7fff8f71cfff com.apple.GSS (4.0 - 2.0) <FD154E62-F4CF-339D-B66C-AF4AED6A94A6> /System/Library/Frameworks/GSS.framework/Versions/A/GSS 0x7fff8f7b5000 - 0x7fff8f7befff libsystem_pthread.dylib (105.10.1) <3103AA7F-3BAE-3673-9649-47FFD7E15C97> /usr/lib/system/libsystem_pthread.dylib 0x7fff8f800000 - 0x7fff8f838ffb libsystem_network.dylib (411.1) <2EC3A005-473F-3C36-A665-F88B5BACC7F0> /usr/lib/system/libsystem_network.dylib 
0x7fff8f839000 - 0x7fff8f843ff7 com.apple.NetAuth (5.0 - 5.0) <B9EC5425-D38D-308C-865F-207E0A98BAC7> /System/Library/PrivateFrameworks/NetAuth.framework/Versions/A/NetAuth 0x7fff8f9e5000 - 0x7fff8fafdffb com.apple.CoreText (352.0 - 454.3) <B3B8C775-14FA-38F3-9CD5-830422AE9C49> /System/Library/Frameworks/CoreText.framework/Versions/A/CoreText 0x7fff8fed6000 - 0x7fff8fedeffb libcopyfile.dylib (118.1.2) <0C68D3A6-ACDD-3EF3-991A-CC82C32AB836> /usr/lib/system/libcopyfile.dylib 0x7fff8fedf000 - 0x7fff8feeefff com.apple.LangAnalysis (1.7.0 - 1.7.0) <D1E527E4-C561-352F-9457-E8C50232793C> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/LangAnalysis.framework/Versions/A/LangAnalysis 0x7fff8ff41000 - 0x7fff90126ff3 libicucore.A.dylib (531.31) <B08E00D5-13C6-3391-AB3A-8DE693D3B42E> /usr/lib/libicucore.A.dylib 0x7fff90136000 - 0x7fff90161ff3 libarchive.2.dylib (30) <8CBB4416-EBE9-3574-8ADC-44655D245F39> /usr/lib/libarchive.2.dylib 0x7fff9027b000 - 0x7fff90295ff7 liblzma.5.dylib (7) <1D03E875-A7C0-3028-814C-3C27F7B7C079> /usr/lib/liblzma.5.dylib 0x7fff9029e000 - 0x7fff902b1ff7 com.apple.CoreBluetooth (1.0 - 1) <FA9B43B3-E183-3040-AE25-66EF9870CF35> /System/Library/Frameworks/CoreBluetooth.framework/Versions/A/CoreBluetooth 0x7fff9034e000 - 0x7fff9054846f libobjc.A.dylib (647) <759E155D-BC42-3D4E-869B-6F57D477177C> /usr/lib/libobjc.A.dylib 0x7fff90575000 - 0x7fff90582fff com.apple.ProtocolBuffer (1 - 225.1) <2D502FBB-D2A0-3937-A5C5-385FA65B3874> /System/Library/PrivateFrameworks/ProtocolBuffer.framework/Versions/A/ProtocolBuffer 0x7fff90583000 - 0x7fff90583ff7 libunc.dylib (29) <5676F7EA-C1DF-329F-B006-D2C3022B7D70> /usr/lib/system/libunc.dylib 0x7fff90584000 - 0x7fff90591fff com.apple.SpeechRecognitionCore (2.0.32 - 2.0.32) <87F0C88D-502D-3217-8B4A-8388288568BA> /System/Library/PrivateFrameworks/SpeechRecognitionCore.framework/Versions/A/SpeechRecognitionCore 0x7fff905d4000 - 0x7fff905d6fff com.apple.CoreDuetDebugLogging (1.0 - 1) 
<9A6E5710-EA99-366E-BF40-9A65EC1B46A1> /System/Library/PrivateFrameworks/CoreDuetDebugLogging.framework/Versions/A/CoreDuetDebugLogging 0x7fff905d7000 - 0x7fff905defff com.apple.NetFS (6.0 - 4.0) <1581D25F-CC07-39B0-90E8-5D4F3CF84EBA> /System/Library/Frameworks/NetFS.framework/Versions/A/NetFS 0x7fff905df000 - 0x7fff90975fff com.apple.CoreFoundation (6.9 - 1152) <CBD1591C-405E-376E-87E9-B264610EBF49> /System/Library/Frameworks/CoreFoundation.framework/Versions/A/CoreFoundation 0x7fff909dc000 - 0x7fff90a01ff7 libJPEG.dylib (1232) <09466709-4742-3418-A0AC-116EF9714E2D> /System/Library/Frameworks/ImageIO.framework/Versions/A/Resources/libJPEG.dylib 0x7fff90a09000 - 0x7fff90a10ff7 libcompiler_rt.dylib (35) <BF8FC133-EE10-3DA6-9B90-92039E28678F> /usr/lib/system/libcompiler_rt.dylib 0x7fff90a11000 - 0x7fff90b03ff7 libiconv.2.dylib (42) <2A06D02F-8B76-3864-8D96-64EF5B40BC6C> /usr/lib/libiconv.2.dylib 0x7fff90b04000 - 0x7fff90bc4fff com.apple.backup.framework (1.6.2 - 1.6.2) <63E8CA47-B7B8-3A63-B505-D1622CE52527> /System/Library/PrivateFrameworks/Backup.framework/Versions/A/Backup 0x7fff90bc5000 - 0x7fff90bd6ff7 libz.1.dylib (55) <88C7C7DE-04B8-316F-8B74-ACD9F3DE1AA1> /usr/lib/libz.1.dylib 0x7fff90bd7000 - 0x7fff90bd8ff7 com.apple.print.framework.Print (10.0 - 265) <3BC4FE7F-78A0-3E57-8F4C-520E7EFD36FA> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/Print.framework/Versions/A/Print 0x7fff90bd9000 - 0x7fff90bdefff com.apple.DiskArbitration (2.6 - 2.6) <0DFF4D9B-2AC3-3B82-B5C5-30F4EFBD2DB9> /System/Library/Frameworks/DiskArbitration.framework/Versions/A/DiskArbitration 0x7fff90bdf000 - 0x7fff90bf6ff7 libLinearAlgebra.dylib (1128) <E78CCBAA-A999-3B65-8EC9-06DB15E67C37> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libLinearAlgebra.dylib 0x7fff90bf7000 - 0x7fff90c2ffff com.apple.RemoteViewServices (2.0 - 99) <C9A62691-B0D9-34B7-B71C-A48B5F4DC553> 
/System/Library/PrivateFrameworks/RemoteViewServices.framework/Versions/A/RemoteViewServices 0x7fff90c30000 - 0x7fff90c4afff com.apple.AppleVPAFramework (1.2.10 - 1.2.10) <DC3D5A44-AB1E-32A9-9D22-FC922B52346A> /System/Library/PrivateFrameworks/AppleVPA.framework/Versions/A/AppleVPA 0x7fff90c4b000 - 0x7fff90c53fff libsystem_dnssd.dylib (561.1.1) <62B70ECA-E40D-3C63-896E-7F00EC386DDB> /usr/lib/system/libsystem_dnssd.dylib 0x7fff90c54000 - 0x7fff90f58ffb com.apple.HIToolbox (2.1.1 - 757.3) <D827FC03-5668-3AA4-AF0E-46EEF7358EEA> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/HIToolbox.framework/Versions/A/HIToolbox 0x7fff90f59000 - 0x7fff90f5bfff com.apple.OAuth (25 - 25) <EE765AF0-2BB6-3689-9EAA-689BF1F02A0D> /System/Library/PrivateFrameworks/OAuth.framework/Versions/A/OAuth 0x7fff90f5c000 - 0x7fff90f5ffff com.apple.IOSurface (97 - 97) <D4B4D2B2-7B16-3174-9EA6-55E0A10B452D> /System/Library/Frameworks/IOSurface.framework/Versions/A/IOSurface 0x7fff90f60000 - 0x7fff90f66fff com.apple.speech.recognition.framework (5.0.9 - 5.0.9) <BB2D573F-0A01-379F-A2BA-3C454EDCB111> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/SpeechRecognition.framework/Versions/A/SpeechRecognition 0x7fff90f67000 - 0x7fff90f73ff7 com.apple.OpenDirectory (10.10 - 187) <8B98ECCB-7EFA-3A58-BD2B-A0835D869B1A> /System/Library/Frameworks/OpenDirectory.framework/Versions/A/OpenDirectory 0x7fff90fce000 - 0x7fff91063ff7 com.apple.ColorSync (4.9.0 - 4.9.0) <F06733BD-A10C-3DB3-B050-825351130392> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/ColorSync.framework/Versions/A/ColorSync 0x7fff91118000 - 0x7fff911a9ff7 libCoreStorage.dylib (471.10.6) <892DEEE7-C8C7-35EA-931D-FF9862BDEB2B> /usr/lib/libCoreStorage.dylib 0x7fff911aa000 - 0x7fff9129efff libFontParser.dylib (134.1) <EA8452DB-9221-3608-95BF-496F58106313> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/ATS.framework/Versions/A/Resources/libFontParser.dylib 
0x7fff9129f000 - 0x7fff912c5ff7 com.apple.ChunkingLibrary (2.1 - 163.1) <3514F2A4-38BD-3849-9286-B3B991057742> /System/Library/PrivateFrameworks/ChunkingLibrary.framework/Versions/A/ChunkingLibrary 0x7fff912ef000 - 0x7fff9145aff7 com.apple.audio.toolbox.AudioToolbox (1.12 - 1.12) <5C6DBEB4-F2EA-3262-B9FC-AFB89404C1DA> /System/Library/Frameworks/AudioToolbox.framework/Versions/A/AudioToolbox 0x7fff9145b000 - 0x7fff916c3ff3 com.apple.security (7.0 - 57031.10.10) <79C37E73-271B-3BEF-A96E-CDB83FF12CF0> /System/Library/Frameworks/Security.framework/Versions/A/Security 0x7fff916db000 - 0x7fff91753ff7 com.apple.SystemConfiguration (1.14 - 1.14) <E0495F7D-5624-3EF7-B7E5-DA0EE708B6E4> /System/Library/Frameworks/SystemConfiguration.framework/Versions/A/SystemConfiguration 0x7fff9176e000 - 0x7fff918b0fff libsqlite3.dylib (168) <7B580EB9-9260-35FE-AE2F-276A2C242BAB> /usr/lib/libsqlite3.dylib 0x7fff918b1000 - 0x7fff91dc4ff3 com.apple.JavaScriptCore (10600 - 10600.3.13) <C0C3246C-D26F-3440-AC75-81CFFA4F9C91> /System/Library/Frameworks/JavaScriptCore.framework/Versions/A/JavaScriptCore 0x7fff91dc5000 - 0x7fff91e3bfe7 libcorecrypto.dylib (233.1.2) <E1789801-3985-3949-B736-6B3378873301> /usr/lib/system/libcorecrypto.dylib 0x7fff91e3c000 - 0x7fff91e4eff7 com.apple.ImageCapture (9.0 - 9.0) <7FB65DD4-56B5-35C4-862C-7A2DED991D1F> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/ImageCapture.framework/Versions/A/ImageCapture 0x7fff91e52000 - 0x7fff91e52fff com.apple.CoreServices (62 - 62) <9E4577CA-3FC3-300D-AB00-87ADBDDA2E37> /System/Library/Frameworks/CoreServices.framework/Versions/A/CoreServices 0x7fff91e5c000 - 0x7fff91f7eff7 com.apple.LaunchServices (644.12.4 - 644.12.4) <59E909E8-ED4A-33EA-B85D-D409BADDF854> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/LaunchServices.framework/Versions/A/LaunchServices 0x7fff91f7f000 - 0x7fff92013fff com.apple.ink.framework (10.9 - 213) <8E029630-1530-3734-A446-13353F0E7AC5> 
/System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/Ink.framework/Versions/A/Ink 0x7fff9201d000 - 0x7fff9204fff3 com.apple.frameworks.CoreDaemon (1.3 - 1.3) <C6DB0A07-F8E4-3837-BCA9-225F460EDA81> /System/Library/PrivateFrameworks/CoreDaemon.framework/Versions/B/CoreDaemon 0x7fff92066000 - 0x7fff9206afff libcache.dylib (69) <45E9A2E7-99C4-36B2-BEE3-0C4E11614AD1> /usr/lib/system/libcache.dylib 0x7fff9206b000 - 0x7fff92078ff7 libbz2.1.0.dylib (36) <2DF83FBC-5C08-39E1-94F5-C28653791B5F> /usr/lib/libbz2.1.0.dylib 0x7fff920d6000 - 0x7fff920f3fff libsystem_kernel.dylib (2782.10.72) <97CD7ACD-EA0C-3434-BEFC-FCD013D6BB73> /usr/lib/system/libsystem_kernel.dylib 0x7fff9266f000 - 0x7fff9266ffff com.apple.audio.units.AudioUnit (1.12 - 1.12) <76EF1C9D-DEA4-3E55-A134-4099B2FD2CF2> /System/Library/Frameworks/AudioUnit.framework/Versions/A/AudioUnit 0x7fff92670000 - 0x7fff926a3ff7 com.apple.MediaKit (16 - 757) <345EDAFE-3E39-3B0F-8D84-54657EC4396D> /System/Library/PrivateFrameworks/MediaKit.framework/Versions/A/MediaKit 0x7fff926a4000 - 0x7fff926a6ff7 libquarantine.dylib (76) <DC041627-2D92-361C-BABF-A869A5C72293> /usr/lib/system/libquarantine.dylib 0x7fff92fec000 - 0x7fff92fecff7 libkeymgr.dylib (28) <77845842-DE70-3CC5-BD01-C3D14227CED5> /usr/lib/system/libkeymgr.dylib 0x7fff9309a000 - 0x7fff9317efff libcrypto.0.9.8.dylib (52.10.1) <2A2924DE-63FB-37F6-B102-84D69240675B> /usr/lib/libcrypto.0.9.8.dylib 0x7fff93181000 - 0x7fff93182fff libSystem.B.dylib (1213) <90B107BC-FF74-32CC-B1CF-4E02F544D957> /usr/lib/libSystem.B.dylib 0x7fff93184000 - 0x7fff93189ff7 libmacho.dylib (862) <126CA2ED-DE91-308F-8881-B9DAEC3C63B6> /usr/lib/system/libmacho.dylib 0x7fff9318a000 - 0x7fff9318bfff com.apple.TrustEvaluationAgent (2.0 - 25) <2D61A2C3-C83E-3A3F-8EC1-736DBEC250AB> /System/Library/PrivateFrameworks/TrustEvaluationAgent.framework/Versions/A/TrustEvaluationAgent 0x7fff9318c000 - 0x7fff93190fff libspindump.dylib (182) <085978DC-A34D-3B72-BC7B-025C35A0A373> 
/usr/lib/libspindump.dylib 0x7fff93191000 - 0x7fff931b9ffb libRIP.A.dylib (775.16) <7711F7A7-1813-3024-AE42-75CA7C5422B7> /System/Library/Frameworks/CoreGraphics.framework/Versions/A/Resources/libRIP.A.dylib 0x7fff931d3000 - 0x7fff931dafff libCGCMS.A.dylib (775.16) <8A173E74-7123-35F1-B160-853528C144ED> /System/Library/Frameworks/CoreGraphics.framework/Versions/A/Resources/libCGCMS.A.dylib 0x7fff931db000 - 0x7fff931ddfff libCVMSPluginSupport.dylib (11.1.1) <DA0706C5-F02A-3F3D-8EBA-18C04313CA2C> /System/Library/Frameworks/OpenGL.framework/Versions/A/Libraries/libCVMSPluginSupport.dylib 0x7fff931e3000 - 0x7fff933e6ff3 com.apple.CFNetwork (720.2.4 - 720.2.4) <E550C671-930F-3B12-8798-23898473E179> /System/Library/Frameworks/CFNetwork.framework/Versions/A/CFNetwork 0x7fff93451000 - 0x7fff93457ff7 libsystem_networkextension.dylib (167.1.10) <29AB225B-D7FB-30ED-9600-65D44B9A9442> /usr/lib/system/libsystem_networkextension.dylib 0x7fff93471000 - 0x7fff93472ff7 libsystem_blocks.dylib (65) <9615D10A-FCA7-3BE4-AA1A-1B195DACE1A1> /usr/lib/system/libsystem_blocks.dylib 0x7fff93473000 - 0x7fff93473fff com.apple.Accelerate.vecLib (3.10 - vecLib 3.10) <B92888D0-ED3F-3430-8F3A-6E56FD16C5F1> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/vecLib 0x7fff93474000 - 0x7fff93476ff7 com.apple.securityhi (9.0 - 55006) <1F40ECF1-6AEF-3E64-9DAD-ADC646CCEA98> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/SecurityHI.framework/Versions/A/SecurityHI 0x7fff934c2000 - 0x7fff9400cff7 com.apple.AppKit (6.9 - 1344.72) <44EF7DEB-3072-3515-9F34-2857D557E828> /System/Library/Frameworks/AppKit.framework/Versions/C/AppKit 0x7fff9400d000 - 0x7fff94017ff7 com.apple.CrashReporterSupport (10.10 - 629) <4BCAA6B5-EC7F-365F-9D3F-BC483B7E956C> /System/Library/PrivateFrameworks/CrashReporterSupport.framework/Versions/A/CrashReporterSupport 0x7fff94018000 - 0x7fff9408cfff com.apple.ApplicationServices.ATS (360 - 375) 
<2824D38D-460D-353C-9D18-499B4BEEABB7> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/ATS.framework/Versions/A/ATS 0x7fff9408d000 - 0x7fff948e4ffb com.apple.CoreGraphics (1.600.0 - 775.16) <864C1845-C41E-314C-A3B4-438DC39E5FBC> /System/Library/Frameworks/CoreGraphics.framework/Versions/A/CoreGraphics 0x7fff948e5000 - 0x7fff94932ff3 com.apple.print.framework.PrintCore (10.0 - 451) <3CA58254-D14F-3913-9DFB-CAC499570CC7> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/PrintCore.framework/Versions/A/PrintCore 0x7fff94990000 - 0x7fff949f7ffb com.apple.datadetectorscore (6.0 - 396.1.1) <80379385-A4EC-3F9B-AFED-9B1DF781943D> /System/Library/PrivateFrameworks/DataDetectorsCore.framework/Versions/A/DataDetectorsCore 0x7fff94c3c000 - 0x7fff94c44ff7 com.apple.icloud.FindMyDevice (1.0 - 1) <D198E170-3610-3727-BC87-73AD249CA097> /System/Library/PrivateFrameworks/FindMyDevice.framework/Versions/A/FindMyDevice 0x7fff94c45000 - 0x7fff94c56ff7 libsystem_coretls.dylib (35.1.2) <BC691CD1-17B6-39A5-BD02-AF973695FD1D> /usr/lib/system/libsystem_coretls.dylib 0x7fff94d38000 - 0x7fff94d3efff libsystem_trace.dylib (72.1.3) <A9E6B7D8-C327-3742-AC54-86C94218B1DF> /usr/lib/system/libsystem_trace.dylib 0x7fff94d41000 - 0x7fff94d5bff3 com.apple.Ubiquity (1.3 - 313) <DF56A657-CC6E-3BE2-86A0-71F07127724C> /System/Library/PrivateFrameworks/Ubiquity.framework/Versions/A/Ubiquity 0x7fff94e76000 - 0x7fff94ee8ff7 com.apple.framework.IOKit (2.0.2 - 1050.10.8) <FDFB1FBE-6A0E-3D63-828C-CD53500FCB0F> /System/Library/Frameworks/IOKit.framework/Versions/A/IOKit 0x7fff94ee9000 - 0x7fff94f58fff com.apple.SearchKit (1.4.0 - 1.4.0) <BFD6D876-36BA-3A3B-9F15-3E2F7DE6E89D> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/SearchKit.framework/Versions/A/SearchKit 0x7fff94f66000 - 0x7fff94f83ffb libresolv.9.dylib (57) <26B38E61-298A-3C3A-82C1-3B5E98AD5E29> /usr/lib/libresolv.9.dylib 0x7fff95321000 - 0x7fff9532cff7 
com.apple.speech.synthesis.framework (5.3.3 - 5.3.3) <7DF3C68C-B219-3E13-AE72-24B8606A1560> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/SpeechSynthesis.framework/Versions/A/SpeechSynthesis 0x7fff95389000 - 0x7fff953a3ff7 com.apple.Kerberos (3.0 - 1) <7760E0C2-A222-3709-B2A6-B692D900CEB1> /System/Library/Frameworks/Kerberos.framework/Versions/A/Kerberos 0x7fff953a4000 - 0x7fff953f0ff7 com.apple.corelocation (1486.17 - 1615.21.1) <B81BC475-E215-3491-A750-8B23F05ABF5B> /System/Library/Frameworks/CoreLocation.framework/Versions/A/CoreLocation 0x7fff953f1000 - 0x7fff955a1ff7 com.apple.QuartzCore (1.10 - 361.15) <72A78C43-30DF-3748-9015-4B28119DB27B> /System/Library/Frameworks/QuartzCore.framework/Versions/A/QuartzCore 0x7fff955a7000 - 0x7fff9655effb com.apple.WebCore (10600 - 10600.3.15) <59A28076-26E4-3CE2-B6FC-AF59308C0B95> /System/Library/Frameworks/WebKit.framework/Versions/A/Frameworks/WebCore.framework/Versions/A/WebCore 0x7fff965ad000 - 0x7fff96668ff7 com.apple.DiscRecording (9.0 - 9000.4.2) <9BB46993-311A-3F2E-BD77-3CBEFB71C1F0> /System/Library/Frameworks/DiscRecording.framework/Versions/A/DiscRecording 0x7fff96769000 - 0x7fff967a3ffb com.apple.DebugSymbols (115 - 115) <6F03761D-7C3A-3C80-8031-AA1C1AD7C706> /System/Library/PrivateFrameworks/DebugSymbols.framework/Versions/A/DebugSymbols 0x7fff967a4000 - 0x7fff96867ff7 libvMisc.dylib (516) <A84F3A3B-D349-3FBC-B5A6-E0F572734073> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libvMisc.dylib 0x7fff96868000 - 0x7fff9688dff7 libPng.dylib (1232) <6E72AE55-AFB0-3FC4-80B2-EBC3353436B7> /System/Library/Frameworks/ImageIO.framework/Versions/A/Resources/libPng.dylib 0x7fff968a4000 - 0x7fff96bd2fff com.apple.Foundation (6.9 - 1152.14) <E3746EDD-DFB1-3ECB-88ED-A91AC0EF3AAA> /System/Library/Frameworks/Foundation.framework/Versions/C/Foundation 0x7fff96bd3000 - 0x7fff96bdbfe7 libcldcpuengine.dylib (2.4.5) <F9EF8060-5E40-3E88-BC38-7452649672B2> 
/System/Library/Frameworks/OpenCL.framework/Versions/A/Libraries/libcldcpuengine.dylib 0x7fff96c01000 - 0x7fff96c03ffb libCGXType.A.dylib (775.16) <B2DC78CA-179F-39A7-8D0B-873DC0ACFE96> /System/Library/Frameworks/CoreGraphics.framework/Versions/A/Resources/libCGXType.A.dylib 0x7fff96c04000 - 0x7fff96c05fff libsystem_secinit.dylib (18) <581DAD0F-6B63-3A48-B63B-917AF799ABAA> /usr/lib/system/libsystem_secinit.dylib 0x7fff96c39000 - 0x7fff96c67fff com.apple.CoreServicesInternal (221.2.2 - 221.2.2) <16F7A7F1-CF1D-35AD-A91F-690A814048DF> /System/Library/PrivateFrameworks/CoreServicesInternal.framework/Versions/A/CoreServicesInternal 0x7fff96c68000 - 0x7fff96d06fff com.apple.Metadata (10.7.0 - 917.1) <46BE997C-B1F4-3BED-9332-FAC87297C87A> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/Metadata.framework/Versions/A/Metadata 0x7fff96d33000 - 0x7fff96d37fff libCoreVMClient.dylib (79) <FC4E08E3-749E-32FF-B5E9-211F29864831> /System/Library/Frameworks/OpenGL.framework/Versions/A/Libraries/libCoreVMClient.dylib 0x7fff96e59000 - 0x7fff96e5cfff com.apple.help (1.3.3 - 46) <CA4541F4-CEF5-355C-8F1F-EA65DC1B400F> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/Help.framework/Versions/A/Help 0x7fff96e5d000 - 0x7fff96e73ff7 libsystem_asl.dylib (267) <F153AC5B-0542-356E-88C8-20A62CA704E2> /usr/lib/system/libsystem_asl.dylib 0x7fff96e74000 - 0x7fff96e9cfff libsystem_info.dylib (459) <B85A85D5-8530-3A93-B0C3-4DEC41F79478> /usr/lib/system/libsystem_info.dylib 0x7fff96f0b000 - 0x7fff97022fe7 libvDSP.dylib (516) <DFEDB210-49D1-3803-88A2-C61DB6A45C3D> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libvDSP.dylib 0x7fff97023000 - 0x7fff9703dff7 libextension.dylib (55.1) <6D0CF094-85E8-3F5B-A3F1-25ECF60F80D9> /usr/lib/libextension.dylib 0x7fff9722a000 - 0x7fff97291ff7 com.apple.framework.CoreWiFi (3.0 - 300.4) <19269C1D-EB29-384A-83F3-7DDDEB7D9DAD> 
/System/Library/PrivateFrameworks/CoreWiFi.framework/Versions/A/CoreWiFi 0x7fff972a8000 - 0x7fff972acff7 libGIF.dylib (1232) <061D5354-FE4F-3C7E-B563-99DC0198062D> /System/Library/Frameworks/ImageIO.framework/Versions/A/Resources/libGIF.dylib 0x7fff972ae000 - 0x7fff972beff7 libbsm.0.dylib (34) <A3A2E56C-2B65-37C7-B43A-A1F926E1A0BB> /usr/lib/libbsm.0.dylib 0x7fff97e9c000 - 0x7fff981cffff libmecabra.dylib (666.2) <F757CABA-3EDB-3ABA-A378-A7C574EA233B> /usr/lib/libmecabra.dylib 0x7fff981d0000 - 0x7fff981dbfff libcommonCrypto.dylib (60061) <D381EBC6-69D8-31D3-8084-5A80A32CB748> /usr/lib/system/libcommonCrypto.dylib 0x7fff98291000 - 0x7fff98299fff libsystem_platform.dylib (63) <64E34079-D712-3D66-9CE2-418624A5C040> /usr/lib/system/libsystem_platform.dylib 0x7fff9829a000 - 0x7fff982c4ff7 libdispatch.dylib (442.1.4) <502CF32B-669B-3709-8862-08188225E4F0> /usr/lib/system/libdispatch.dylib 0x7fff982f8000 - 0x7fff982f8fff com.apple.Carbon (154 - 157) <0DF27AD6-ED64-34D7-825D-65297D276652> /System/Library/Frameworks/Carbon.framework/Versions/A/Carbon 0x7fff98621000 - 0x7fff9889cff7 com.apple.CoreData (111 - 526.1) <DC4F037B-B7F4-381A-B939-4414489D76BF> /System/Library/Frameworks/CoreData.framework/Versions/A/CoreData 0x7fff9889d000 - 0x7fff989e5ff7 com.apple.WebKitLegacy (10600 - 10600.3.18) <91B3E705-1378-3F73-B079-3223E838B629> /System/Library/Frameworks/WebKit.framework/Versions/A/Frameworks/WebKitLegacy.framework/Versions/A/WebKitLegacy 0x7fff989fb000 - 0x7fff98a16ff7 com.apple.aps.framework (4.0 - 4.0) <F3C3C246-101E-3E81-9608-D2D6E9352532> /System/Library/PrivateFrameworks/ApplePushService.framework/Versions/A/ApplePushService 0x7fff98bd2000 - 0x7fff98bd2fff com.apple.ApplicationServices (48 - 48) <5BF7910B-C328-3BF8-BA4F-CE52B574CE01> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/ApplicationServices 0x7fff98bd3000 - 0x7fff98bd5fff com.apple.loginsupport (1.0 - 1) <21DBC18C-F260-39FC-B52F-04A5AA84523A> 
/System/Library/PrivateFrameworks/login.framework/Versions/A/Frameworks/loginsupport.framework/Versions/A/loginsupport 0x7fff98c31000 - 0x7fff98f18ffb com.apple.CoreServices.CarbonCore (1108.2 - 1108.2) <FD87F83F-301A-3BD6-8262-5692FC1B4457> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/CarbonCore.framework/Versions/A/CarbonCore 0x7fff99071000 - 0x7fff99085ff7 com.apple.MultitouchSupport.framework (262.33.1 - 262.33.1) <62DF9340-01A1-3E12-A604-C90F6361FD9E> /System/Library/PrivateFrameworks/MultitouchSupport.framework/Versions/A/MultitouchSupport 0x7fff9911e000 - 0x7fff99164ffb libFontRegistry.dylib (134) <01B8034A-45FD-3360-A347-A1896F591363> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/ATS.framework/Versions/A/Resources/libFontRegistry.dylib 0x7fff99174000 - 0x7fff99267ff7 libJP2.dylib (1232) <10B78725-0B8A-3D87-B2E3-8FEED0C07F21> /System/Library/Frameworks/ImageIO.framework/Versions/A/Resources/libJP2.dylib 0x7fff99273000 - 0x7fff99275ff7 libutil.dylib (38) <471AD65E-B86E-3C4A-8ABD-B8665A2BCE3F> /usr/lib/libutil.dylib 0x7fff992ac000 - 0x7fff992e7fff com.apple.QD (301 - 301) <C4D2AD03-B839-350A-AAF0-B4A08F8BED77> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/QD.framework/Versions/A/QD 0x7fff99365000 - 0x7fff99365ff7 liblaunch.dylib (559.10.3) <DFCDEBDF-8247-3DC7-9879-E7E497DDA4B4> /usr/lib/system/liblaunch.dylib 0x7fff99c0f000 - 0x7fff99c17ffb com.apple.CoreServices.FSEvents (1210 - 1210) <782A9C69-7A45-31A7-8960-D08A36CBD0A7> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/FSEvents.framework/Versions/A/FSEvents 0x7fff99c18000 - 0x7fff99c48fff libsystem_m.dylib (3086.1) <1E12AB45-6D96-36D0-A226-F24D9FB0D9D6> /usr/lib/system/libsystem_m.dylib 0x7fff99c52000 - 0x7fff99de0fff libBLAS.dylib (1128) <497912C1-A98E-3281-BED7-E9C751552F61> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libBLAS.dylib 
0x7fff99e4c000 - 0x7fff99e4fff7 libdyld.dylib (353.2.1) <4E33E416-F1D8-3598-B8CC-6863E2ECD0E6> /usr/lib/system/libdyld.dylib 0x7fff99e7d000 - 0x7fff99e8ffff libsasl2.2.dylib (193) <E523DD05-544B-3430-8AA9-672408A5AF8B> /usr/lib/libsasl2.2.dylib 0x7fff99e90000 - 0x7fff99eebfff libTIFF.dylib (1232) <29A5C7F7-D50B-35B3-8FA2-A55A47E497A6> /System/Library/Frameworks/ImageIO.framework/Versions/A/Resources/libTIFF.dylib 0x7fff99eec000 - 0x7fff99f0cfff com.apple.IconServices (47.1 - 47.1) <E83DFE3B-6541-3736-96BB-26DC5D0100F1> /System/Library/PrivateFrameworks/IconServices.framework/Versions/A/IconServices 0x7fff99f0d000 - 0x7fff99f5eff7 com.apple.audio.CoreAudio (4.3.0 - 4.3.0) <AF72B06E-C6C1-3FAE-8B47-AF461CAE0E22> /System/Library/Frameworks/CoreAudio.framework/Versions/A/CoreAudio 0x7fff99fc3000 - 0x7fff9a3d0ff7 libLAPACK.dylib (1128) <F9201AE7-B031-36DB-BCF8-971E994EF7C1> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libLAPACK.dylib 0x7fff9a3d1000 - 0x7fff9a3ecff7 libCRFSuite.dylib (34) <D64842BE-7BD4-3D0C-9842-1D202F7C2A51> /usr/lib/libCRFSuite.dylib 0x7fff9a3f7000 - 0x7fff9a3f7fff libOpenScriptingUtil.dylib (162) <EFD79173-A9DA-3AE6-BE15-3948938204A6> /usr/lib/libOpenScriptingUtil.dylib 0x7fff9a3f8000 - 0x7fff9a40aff7 com.apple.CoreDuetDaemonProtocol (1.0 - 1) <CE9FABB4-1C5D-3F9B-9BB8-5CC50C3E5E31> /System/Library/PrivateFrameworks/CoreDuetDaemonProtocol.framework/Versions/A/CoreDuetDaemonProtocol 0x7fff9a40b000 - 0x7fff9a477fff com.apple.framework.CoreWLAN (5.0 - 500.35.2) <37551DDD-C07C-31EB-923A-9721F03D7E29> /System/Library/Frameworks/CoreWLAN.framework/Versions/A/CoreWLAN 0x7fff9a478000 - 0x7fff9a47afff libsystem_configuration.dylib (699.1.5) <5E14864E-089A-3D84-85A4-980B776427A8> /usr/lib/system/libsystem_configuration.dylib 0x7fff9ab72000 - 0x7fff9ab7fff7 libxar.1.dylib (254) <CE10EFED-3066-3749-838A-6A15AC0DBCB6> /usr/lib/libxar.1.dylib 0x7fff9ab80000 - 0x7fff9abdfff3 com.apple.AE (681 - 681) 
<7F544183-A515-31A8-B45F-89A167F56216> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/AE.framework/Versions/A/AE 0x7fff9abe0000 - 0x7fff9ac2cff7 libcups.2.dylib (408) <9CECCDE3-51D7-3028-830C-F58BD36E3317> /usr/lib/libcups.2.dylib 0x7fff9ac2d000 - 0x7fff9ac36fff libGFXShared.dylib (11.1.1) <7AE7D152-597E-3B27-A52C-8DA76760B61C> /System/Library/Frameworks/OpenGL.framework/Versions/A/Libraries/libGFXShared.dylib 0x7fff9ac37000 - 0x7fff9ac37fff com.apple.Accelerate (1.10 - Accelerate 1.10) <F1B96A61-7E4B-31BD-A35B-BA7EF1F16EF4> /System/Library/Frameworks/Accelerate.framework/Versions/A/Accelerate 0x7fff9ac3f000 - 0x7fff9ad6ffff com.apple.UIFoundation (1.0 - 1) <8E030D93-441C-3997-9CD2-55C8DFAC8B84> /System/Library/PrivateFrameworks/UIFoundation.framework/Versions/A/UIFoundation 0x7fff9ad70000 - 0x7fff9ad7bff7 libkxld.dylib (2782.10.72) <68E07A32-28F5-3FBB-9D74-00B4F53C2FD4> /usr/lib/system/libkxld.dylib 0x7fff9ad7c000 - 0x7fff9ad7eff7 libsystem_coreservices.dylib (9) <41B7C578-5A53-31C8-A96F-C73E030B0938> /usr/lib/system/libsystem_coreservices.dylib 0x7fff9add5000 - 0x7fff9ae23fff libcurl.4.dylib (83.1.2) <337A1FF8-E8B1-3173-9F29-C0D4C851D8E1> /usr/lib/libcurl.4.dylib 0x7fff9aea3000 - 0x7fff9aed0fff com.apple.Accounts (113 - 113) <990F0F61-6AC5-3076-932E-02A9A7F75AC4> /System/Library/Frameworks/Accounts.framework/Versions/A/Accounts 0x7fff9aeeb000 - 0x7fff9af5fff3 com.apple.securityfoundation (6.0 - 55126) <DEC91795-7754-334A-8CDA-B429F41B922D> /System/Library/Frameworks/SecurityFoundation.framework/Versions/A/SecurityFoundation 0x7fff9af60000 - 0x7fff9af89ffb libxslt.1.dylib (13) <AED1143F-B848-3E73-81ED-71356F25F084> /usr/lib/libxslt.1.dylib 0x7fff9af8a000 - 0x7fff9afc1ffb com.apple.LDAPFramework (2.4.28 - 194.5) <D22234AA-8B30-3010-8CF0-67516D52CC33> /System/Library/Frameworks/LDAP.framework/Versions/A/LDAP 0x7fff9afc2000 - 0x7fff9b030ffb com.apple.Heimdal (4.0 - 2.0) <3E5DA653-A343-3257-ADE1-BA879BAE280F> 
/System/Library/PrivateFrameworks/Heimdal.framework/Versions/A/Heimdal 0x7fff9b320000 - 0x7fff9b323fff com.apple.xpc.ServiceManagement (1.0 - 1) <5EFD45BF-B0CD-39F2-8232-6BA33E63E5D4> /System/Library/Frameworks/ServiceManagement.framework/Versions/A/ServiceManagement 0x7fff9b458000 - 0x7fff9b459fff liblangid.dylib (117) <B54A4AA0-2E53-3671-90F5-AFF711C0EB9E> /usr/lib/liblangid.dylib 0x7fff9b4bb000 - 0x7fff9b90efc7 com.apple.vImage (8.0 - 8.0) <33BE7B31-72DB-3364-B37E-C322A32748C5> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vImage.framework/Versions/A/vImage 0x7fff9b90f000 - 0x7fff9b91dff7 com.apple.opengl (11.1.1 - 11.1.1) <F79F5FFF-372E-329E-81FB-EE9BD6A2A7A7> /System/Library/Frameworks/OpenGL.framework/Versions/A/OpenGL 0x7fff9b91e000 - 0x7fff9b94ffff libtidy.A.dylib (15.15) <37FC944D-271A-386A-9ADD-FA33AD48F96D> /usr/lib/libtidy.A.dylib 0x7fff9b975000 - 0x7fff9b9f7fff com.apple.PerformanceAnalysis (1.0 - 1) <94F08B1A-F6AF-38D5-BE92-4FED34742966> /System/Library/PrivateFrameworks/PerformanceAnalysis.framework/Versions/A/PerformanceAnalysis 0x7fff9bd93000 - 0x7fff9bdd3ff7 libGLImage.dylib (11.1.1) <3986BFA3-4F55-380F-B01D-91BA9785D70C> /System/Library/Frameworks/OpenGL.framework/Versions/A/Libraries/libGLImage.dylib 0x7fff9bdd4000 - 0x7fff9bdddff3 com.apple.CommonAuth (4.0 - 2.0) <BA9F5A09-D200-3D18-9F4A-20C789291A30> /System/Library/PrivateFrameworks/CommonAuth.framework/Versions/A/CommonAuth 0x7fff9be2a000 - 0x7fff9be2fff7 libunwind.dylib (35.3) <BE7E51A0-B6EA-3A54-9CCA-9D88F683A6D6> /usr/lib/system/libunwind.dylib 0x7fff9be30000 - 0x7fff9c0f6fff com.apple.WebKit (10600 - 10600.3.18) <F8E36318-4F4C-348B-B1DE-D4BE035036AD> /System/Library/Frameworks/WebKit.framework/Versions/A/WebKit 0x7fff9c175000 - 0x7fff9c198fff com.apple.Sharing (328.3.2 - 328.3.2) <F555679F-1CD1-3EB2-8E01-FCB80EF07330> /System/Library/PrivateFrameworks/Sharing.framework/Versions/A/Sharing 0x7fff9c199000 - 0x7fff9c415ff3 com.apple.RawCamera.bundle (6.02 - 769) 
<1F0F0047-682F-39E3-BE26-2467BF5F0E22> /System/Library/CoreServices/RawCamera.bundle/Contents/MacOS/RawCamera 0x7fff9c416000 - 0x7fff9c45cff7 libauto.dylib (186) <A260789B-D4D8-316A-9490-254767B8A5F1> /usr/lib/libauto.dylib 0x7fff9c641000 - 0x7fff9c690ff7 com.apple.opencl (2.4.2 - 2.4.2) <D16CFDE6-B5F7-301A-995E-8B583D8C675A> /System/Library/Frameworks/OpenCL.framework/Versions/A/OpenCL 0x7fff9c78c000 - 0x7fff9c815fff com.apple.CoreSymbolication (3.1 - 57020) <FDF8F348-164D-38F9-90EB-F42585DD2C77> /System/Library/PrivateFrameworks/CoreSymbolication.framework/Versions/A/CoreSymbolication 0x7fff9c816000 - 0x7fff9c832fff com.apple.GenerationalStorage (2.0 - 209.11) <9FF8DD11-25FB-3047-A5BF-9415339B3EEC> /System/Library/PrivateFrameworks/GenerationalStorage.framework/Versions/A/GenerationalStorage 0x7fff9c833000 - 0x7fff9c83cff7 libsystem_notify.dylib (133.1.1) <61147800-F320-3DAA-850C-BADF33855F29> /usr/lib/system/libsystem_notify.dylib 0x7fff9c931000 - 0x7fff9c945ff7 com.apple.ProtectedCloudStorage (1.0 - 1) <52CFE68A-0663-3756-AB5B-B42195026052> /System/Library/PrivateFrameworks/ProtectedCloudStorage.framework/Versions/A/ProtectedCloudStorage 0x7fff9c946000 - 0x7fff9c95ffff com.apple.openscripting (1.4 - 162) <80DFF366-B950-3F79-903F-99DA0FFDB570> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/OpenScripting.framework/Versions/A/OpenScripting 0x7fff9c985000 - 0x7fff9c986ffb libremovefile.dylib (35) <3485B5F4-6CE8-3C62-8DFD-8736ED6E8531> /usr/lib/system/libremovefile.dylib 0x7fff9c9a3000 - 0x7fff9cdd3fff com.apple.vision.FaceCore (3.1.6 - 3.1.6) <C3B823AA-C261-37D3-B4AC-C59CE91C8241> /System/Library/PrivateFrameworks/FaceCore.framework/Versions/A/FaceCore 0x7fff9ce9e000 - 0x7fff9cea2fff libpam.2.dylib (20) <E805398D-9A92-31F8-8005-8DC188BD8B6E> /usr/lib/libpam.2.dylib 0x7fff9d026000 - 0x7fff9d0c5df7 com.apple.AppleJPEG (1.0 - 1) <9BB3D7DF-630A-3E1C-A124-12D6C4D0DE70> /System/Library/PrivateFrameworks/AppleJPEG.framework/Versions/A/AppleJPEG 
0x7fff9d0c6000 - 0x7fff9d0cbffb libheimdal-asn1.dylib (398.10.1) <A7B6447A-6680-3625-83C3-993B58D5C43F> /usr/lib/libheimdal-asn1.dylib 0x7fff9d116000 - 0x7fff9d15fff3 com.apple.HIServices (1.22 - 520.12) <8EAC82AB-6A7D-3606-AF6F-60A9410D1278> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/HIServices.framework/Versions/A/HIServices 0x7fff9d160000 - 0x7fff9d16cff7 com.apple.HelpData (2.1.4 - 90) <471200E4-1D51-3D8C-A956-A52F8EB7B552> /System/Library/PrivateFrameworks/HelpData.framework/Versions/A/HelpData 0x7fff9d16d000 - 0x7fff9d17efff libcmph.dylib (1) <46EC3997-DB5E-38AE-BBBB-A035A54AD3C0> /usr/lib/libcmph.dylib 0x7fff9d182000 - 0x7fff9d187ff7 libsystem_stats.dylib (163.10.18) <9B8CCF24-DDDB-399A-9237-4BEC225D2E8C> /usr/lib/system/libsystem_stats.dylib 0x7fff9d188000 - 0x7fff9d18cfff com.apple.CommonPanels (1.2.6 - 96) <F9ECC8AF-D9CA-3350-AFB4-5113A9B789A5> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/CommonPanels.framework/Versions/A/CommonPanels 0x7fff9d18d000 - 0x7fff9d1b8fff com.apple.DictionaryServices (1.2 - 229) <6789EC43-CADA-394D-8FE8-FC3A2DD136B9> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/DictionaryServices.framework/Versions/A/DictionaryServices 0x7fff9d1b9000 - 0x7fff9d1e6fff com.apple.CoreVideo (1.8 - 145.1) <18DB07E0-B927-3260-A234-636F298D1917> /System/Library/Frameworks/CoreVideo.framework/Versions/A/CoreVideo Sample analysis of process 509 written to file /dev/stdout ```
Extraordinarily high memory usage

*Version:* 1.1.5

I'm sharing this knowing that it probably falls squarely into the `wontfix` category, but I wanted to at least bring it to your attention.

![screen shot 2015-02-06 at 12 12 30 pm](https://cloud.githubusercontent.com/assets/80459/6084780/91664314-adf9-11e4-9d26-12580e3fa742.png)

For the last few weeks, when my Late 2011 Mac mini starts slowing down in the afternoon, I go to Activity Monitor and see that Trailer.app has claimed a rather sizable chunk of available memory. I have hidden all but two of the dozens of repositories which I can access to see if that helps, but it has not.

### Process sample

```
Sampling process 509 for 3 seconds with 1 millisecond of run time between samples
Sampling completed, processing symbols...
Analysis of sampling Trailer (pid 509) every 1 millisecond

Process:       Trailer [509]
Path:          /Applications/Trailer.app/Contents/MacOS/Trailer
Load Address:  0x10a0a1000
Identifier:    com.housetrip.Trailer
Version:       1.1.5 (1164)
Code Type:     X86-64
Parent Process: ???
[1] Date/Time: 2015-02-06 12:20:09.106 -0600 OS Version: Mac OS X 10.10.2 (14C109) Report Version: 7 Analysis Tool: /usr/bin/sample ---- Call graph: 2667 Thread_3162 DispatchQueue_1: com.apple.main-thread (serial) + 2667 start (in libdyld.dylib) + 1 [0x7fff99e4f5c9] + 2667 NSApplicationMain (in AppKit) + 1832 [0x7fff934c4a14] + 2667 -[NSApplication run] (in AppKit) + 594 [0x7fff934d9593] + 2667 -[NSApplication nextEventMatchingMask:untilDate:inMode:dequeue:] (in AppKit) + 194 [0x7fff934e5730] + 2667 _DPSNextEvent (in AppKit) + 964 [0x7fff934e5f81] + 2667 _BlockUntilNextEventMatchingListInModeWithFilter (in HIToolbox) + 71 [0x7fff90c826ab] + 2667 ReceiveNextEventCommon (in HIToolbox) + 431 [0x7fff90c8286a] + 2667 RunCurrentEventLoopInMode (in HIToolbox) + 235 [0x7fff90c82aef] + 2667 CFRunLoopRunSpecific (in CoreFoundation) + 296 [0x7fff90650858] + 2667 __CFRunLoopRun (in CoreFoundation) + 1371 [0x7fff90650ffb] + 2667 __CFRunLoopServiceMachPort (in CoreFoundation) + 212 [0x7fff90651b34] + 2667 mach_msg (in libsystem_kernel.dylib) + 55 [0x7fff920e664f] + 2667 mach_msg_trap (in libsystem_kernel.dylib) + 10 [0x7fff920e74de] 2667 Thread_3187 DispatchQueue_2: com.apple.libdispatch-manager (serial) + 2667 _dispatch_mgr_thread (in libdispatch.dylib) + 52 [0x7fff9829ea6a] + 2667 kevent64 (in libsystem_kernel.dylib) + 10 [0x7fff920ed232] 2667 Thread_3292: com.apple.NSURLConnectionLoader + 2667 thread_start (in libsystem_pthread.dylib) + 13 [0x7fff8f7b641d] + 2667 _pthread_start (in libsystem_pthread.dylib) + 176 [0x7fff8f7b81e5] + 2667 _pthread_body (in libsystem_pthread.dylib) + 131 [0x7fff8f7b8268] + 2667 __NSThread__main__ (in Foundation) + 1345 [0x7fff9690c90a] + 2667 +[NSURLConnection(Loader) _resourceLoadLoop:] (in CFNetwork) + 434 [0x7fff93283c80] + 2667 CFRunLoopRunSpecific (in CoreFoundation) + 296 [0x7fff90650858] + 2667 __CFRunLoopRun (in CoreFoundation) + 1371 [0x7fff90650ffb] + 2667 __CFRunLoopServiceMachPort (in CoreFoundation) + 212 [0x7fff90651b34] + 2667 
mach_msg (in libsystem_kernel.dylib) + 55 [0x7fff920e664f] + 2667 mach_msg_trap (in libsystem_kernel.dylib) + 10 [0x7fff920e74de] 2667 Thread_3311 + 2667 thread_start (in libsystem_pthread.dylib) + 13 [0x7fff8f7b641d] + 2667 _pthread_start (in libsystem_pthread.dylib) + 176 [0x7fff8f7b81e5] + 2667 _pthread_body (in libsystem_pthread.dylib) + 131 [0x7fff8f7b8268] + 2667 _NSEventThread (in AppKit) + 137 [0x7fff9364933b] + 2667 CFRunLoopRunSpecific (in CoreFoundation) + 296 [0x7fff90650858] + 2667 __CFRunLoopRun (in CoreFoundation) + 1371 [0x7fff90650ffb] + 2667 __CFRunLoopServiceMachPort (in CoreFoundation) + 212 [0x7fff90651b34] + 2667 mach_msg (in libsystem_kernel.dylib) + 55 [0x7fff920e664f] + 2667 mach_msg_trap (in libsystem_kernel.dylib) + 10 [0x7fff920e74de] 2667 Thread_3317: com.apple.CFSocket.private + 2667 thread_start (in libsystem_pthread.dylib) + 13 [0x7fff8f7b641d] + 2667 _pthread_start (in libsystem_pthread.dylib) + 176 [0x7fff8f7b81e5] + 2667 _pthread_body (in libsystem_pthread.dylib) + 131 [0x7fff8f7b8268] + 2667 __select (in libsystem_kernel.dylib) + 10 [0x7fff920ec3fa] 2667 Thread_1569702 + 2667 start_wqthread (in libsystem_pthread.dylib) + 13 [0x7fff8f7b640d] + 2667 __workq_kernreturn (in libsystem_kernel.dylib) + 10 [0x7fff920ec94a] 2667 Thread_1569709 + 2667 start_wqthread (in libsystem_pthread.dylib) + 13 [0x7fff8f7b640d] + 2667 __workq_kernreturn (in libsystem_kernel.dylib) + 10 [0x7fff920ec94a] 2667 Thread_1569813 + 2667 start_wqthread (in libsystem_pthread.dylib) + 13 [0x7fff8f7b640d] + 2667 __workq_kernreturn (in libsystem_kernel.dylib) + 10 [0x7fff920ec94a] 2667 Thread_1569814 + 2667 start_wqthread (in libsystem_pthread.dylib) + 13 [0x7fff8f7b640d] + 2667 __workq_kernreturn (in libsystem_kernel.dylib) + 10 [0x7fff920ec94a] 2667 Thread_1569815 2667 start_wqthread (in libsystem_pthread.dylib) + 13 [0x7fff8f7b640d] 2667 __workq_kernreturn (in libsystem_kernel.dylib) + 10 [0x7fff920ec94a] Total number in stack (recursive counted multiple, when 
>=5): 5 __workq_kernreturn (in libsystem_kernel.dylib) + 0 [0x7fff920ec940] 5 start_wqthread (in libsystem_pthread.dylib) + 13 [0x7fff8f7b640d] Sort by top of stack, same collapsed (when >= 5): __workq_kernreturn (in libsystem_kernel.dylib) 13335 mach_msg_trap (in libsystem_kernel.dylib) 8001 __select (in libsystem_kernel.dylib) 2667 kevent64 (in libsystem_kernel.dylib) 2667 Binary Images: 0x10a0a1000 - 0x10a0e8fff +com.housetrip.Trailer (1.1.5 - 1164) <36CF8E7E-2026-3369-B458-A47BF13C9BFD> /Applications/Trailer.app/Contents/MacOS/Trailer 0x10a102000 - 0x10a130ff7 +org.andymatuschak.Sparkle (1.8.0 - 0b186dc) <3E8C44BB-5A29-3577-AC71-89DFAB4ECD19> /Applications/Trailer.app/Contents/Frameworks/Sparkle.framework/Versions/A/Sparkle 0x10ccbf000 - 0x10ccbfffe +cl_kernels (???) <25634DE5-9D71-402E-9F80-8774D220E196> cl_kernels 0x10f183000 - 0x10f269fef unorm8_bgra.dylib (2.4.5) <9423FFD4-6EF3-31BF-9DE9-6D55BA76D59E> /System/Library/Frameworks/OpenCL.framework/Versions/A/Libraries/ImageFormats/unorm8_bgra.dylib 0x10f2be000 - 0x10f2befef +cl_kernels (???) <C38B13A9-43C5-4D35-92C4-CAC23363F685> cl_kernels 0x11070f000 - 0x110789ff7 com.apple.xquery (1.3.1 - 30) <9D868AE3-C5A0-34BD-8C33-EB5F8EDD7ACA> /System/Library/PrivateFrameworks/XQuery.framework/XQuery 0x1153b0000 - 0x1153b0ff5 +cl_kernels (???) <A37A3DF7-5951-4991-A19B-4C2CDD754F15> cl_kernels 0x7fff6db6f000 - 0x7fff6dba5837 dyld (0.0 - ???) 
<65DCCB06-339C-3E25-9702-600A28291D0E> /usr/lib/dyld 0x7fff8d7ae000 - 0x7fff8d7c7ff7 com.apple.CFOpenDirectory (10.10 - 187) <0F9747EF-12A3-3694-984D-0B8352CA6C0F> /System/Library/Frameworks/OpenDirectory.framework/Versions/A/Frameworks/CFOpenDirectory.framework/Versions/A/CFOpenDirectory 0x7fff8d7f5000 - 0x7fff8dac4ff3 com.apple.CoreImage (10.0.33) <6E3DDA29-718B-3BDB-BFAF-F8C201BF93A4> /System/Library/Frameworks/QuartzCore.framework/Versions/A/Frameworks/CoreImage.framework/Versions/A/CoreImage 0x7fff8dd45000 - 0x7fff8dd46fff libDiagnosticMessagesClient.dylib (100) <2EE8E436-5CDC-34C5-9959-5BA218D507FB> /usr/lib/libDiagnosticMessagesClient.dylib 0x7fff8dd50000 - 0x7fff8de8dfff com.apple.ImageIO.framework (3.3.0 - 1232) <D7AF3CD2-FAB2-3798-9C26-914886852DCD> /System/Library/Frameworks/ImageIO.framework/Versions/A/ImageIO 0x7fff8de8e000 - 0x7fff8de99fff libGL.dylib (11.1.1) <1F0EB9FB-4B0F-349B-80DD-93FD3F45B9C7> /System/Library/Frameworks/OpenGL.framework/Versions/A/Libraries/libGL.dylib 0x7fff8de9a000 - 0x7fff8de9efff com.apple.TCC (1.0 - 1) <61F36A72-B983-3A2D-9D37-A2F194D31E7D> /System/Library/PrivateFrameworks/TCC.framework/Versions/A/TCC 0x7fff8de9f000 - 0x7fff8dea1fff com.apple.EFILogin (2.0 - 2) <39895ACB-E756-342C-ABE5-DB7100EF0A69> /System/Library/PrivateFrameworks/EFILogin.framework/Versions/A/EFILogin 0x7fff8dea2000 - 0x7fff8dec1fff com.apple.CoreDuet (1.0 - 1) <36AA9FD5-2685-314D-B364-3FA4688D86BD> /System/Library/PrivateFrameworks/CoreDuet.framework/Versions/A/CoreDuet 0x7fff8dec2000 - 0x7fff8df1cff7 com.apple.LanguageModeling (1.0 - 1) <ACA93FE0-A0E3-333E-AE3C-8EB7DE5F362F> /System/Library/PrivateFrameworks/LanguageModeling.framework/Versions/A/LanguageModeling 0x7fff8df40000 - 0x7fff8e068ff7 com.apple.coreui (2.1 - 305.6.1) <B56EC212-73C1-326F-B78C-EB856386296E> /System/Library/PrivateFrameworks/CoreUI.framework/Versions/A/CoreUI 0x7fff8e06c000 - 0x7fff8e17afff com.apple.desktopservices (1.9.2 - 1.9.2) <8670FD3B-8A5B-3D84-B21E-DF21140545A2> 
/System/Library/PrivateFrameworks/DesktopServicesPriv.framework/Versions/A/DesktopServicesPriv 0x7fff8e17b000 - 0x7fff8e1b6fff com.apple.Symbolication (1.4 - 56045) <D64571B1-4483-3FE2-BD67-A91360F79727> /System/Library/PrivateFrameworks/Symbolication.framework/Versions/A/Symbolication 0x7fff8e1c4000 - 0x7fff8e1ccff7 com.apple.AppleSRP (5.0 - 1) <01EC5144-D09A-3D6A-AE35-F6D48585F154> /System/Library/PrivateFrameworks/AppleSRP.framework/Versions/A/AppleSRP 0x7fff8e1cd000 - 0x7fff8e1f8fff libc++abi.dylib (125) <88A22A0F-87C6-3002-BFBA-AC0F2808B8B9> /usr/lib/libc++abi.dylib 0x7fff8e723000 - 0x7fff8e7a0fff com.apple.CoreServices.OSServices (640.3 - 640.3) <84A91B00-0ED4-350C-B30A-AEAE437AE02A> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/OSServices.framework/Versions/A/OSServices 0x7fff8e7a1000 - 0x7fff8e7a3ff7 libsystem_sandbox.dylib (358.1.1) <95312E09-DA28-324A-A084-F3E574D0210E> /usr/lib/system/libsystem_sandbox.dylib 0x7fff8e7e7000 - 0x7fff8e80eff7 com.apple.shortcut (2.13 - 2.13) <0BA7C57A-C2FC-3DFC-83B2-CE6C33770B52> /System/Library/PrivateFrameworks/Shortcut.framework/Versions/A/Shortcut 0x7fff8e80f000 - 0x7fff8e901fff libxml2.2.dylib (26) <B834E7C8-EC3E-3382-BC5A-DA38DC4D720C> /usr/lib/libxml2.2.dylib 0x7fff8eb6d000 - 0x7fff8ebaefff libGLU.dylib (11.1.1) <E9ADAD30-0133-320D-A60E-D1A7F91A7795> /System/Library/Frameworks/OpenGL.framework/Versions/A/Libraries/libGLU.dylib 0x7fff8ebaf000 - 0x7fff8ee5bfff com.apple.GeoServices (1.0 - 982.4.10) <8A7FE04A-2785-30E7-A6E2-DC15D170DAF5> /System/Library/PrivateFrameworks/GeoServices.framework/Versions/A/GeoServices 0x7fff8ee5c000 - 0x7fff8eefeff7 com.apple.Bluetooth (4.3.2 - 4.3.2f6) <95676652-21AB-3FFA-B53D-EBC8BF4E913E> /System/Library/Frameworks/IOBluetooth.framework/Versions/A/IOBluetooth 0x7fff8f16d000 - 0x7fff8f178ff7 libcsfde.dylib (471.10.6) <E1BF5816-3CE6-30CE-B3EE-F68CB6BA1378> /usr/lib/libcsfde.dylib 0x7fff8f179000 - 0x7fff8f1cdfff libc++.1.dylib (120) 
<1B9530FD-989B-3174-BB1C-BDC159501710> /usr/lib/libc++.1.dylib 0x7fff8f346000 - 0x7fff8f36efff libxpc.dylib (559.10.3) <876216DC-D5D3-381E-8AF9-49AE464E5107> /usr/lib/system/libxpc.dylib 0x7fff8f373000 - 0x7fff8f375fff libRadiance.dylib (1232) <E670DDEF-60F8-3AEB-B6A2-B20A1340634C> /System/Library/Frameworks/ImageIO.framework/Versions/A/Resources/libRadiance.dylib 0x7fff8f376000 - 0x7fff8f44cff3 com.apple.DiskImagesFramework (10.10.1 - 396) <E7478685-E829-372A-A945-A512730D3312> /System/Library/PrivateFrameworks/DiskImages.framework/Versions/A/DiskImages 0x7fff8f44d000 - 0x7fff8f44dfff com.apple.Cocoa (6.8 - 21) <EAC0EA1E-3C62-3B28-A941-5D8B1E085FF8> /System/Library/Frameworks/Cocoa.framework/Versions/A/Cocoa 0x7fff8f44e000 - 0x7fff8f46ffff com.apple.framework.Apple80211 (10.1 - 1010.64) <A7378C4B-FFD3-35B9-93E8-0534A2A7B51F> /System/Library/PrivateFrameworks/Apple80211.framework/Versions/A/Apple80211 0x7fff8f470000 - 0x7fff8f4b0ff7 com.apple.CloudDocs (1.0 - 280.6) <C1179CEF-E058-3E16-BF90-C059FE7CDE77> /System/Library/PrivateFrameworks/CloudDocs.framework/Versions/A/CloudDocs 0x7fff8f4b1000 - 0x7fff8f53dff7 libsystem_c.dylib (1044.10.1) <199ED5EB-77A1-3D43-AA51-81779CE0A742> /usr/lib/system/libsystem_c.dylib 0x7fff8f5a3000 - 0x7fff8f634ff7 com.apple.cloudkit.CloudKit (259.2.5 - 259.2.5) <241EB647-C917-32F7-956A-6E505827048C> /System/Library/Frameworks/CloudKit.framework/Versions/A/CloudKit 0x7fff8f681000 - 0x7fff8f69dff7 libsystem_malloc.dylib (53.1.1) <19BCC257-5717-3502-A71F-95D65AFA861B> /usr/lib/system/libsystem_malloc.dylib 0x7fff8f6ec000 - 0x7fff8f71cfff com.apple.GSS (4.0 - 2.0) <FD154E62-F4CF-339D-B66C-AF4AED6A94A6> /System/Library/Frameworks/GSS.framework/Versions/A/GSS 0x7fff8f7b5000 - 0x7fff8f7befff libsystem_pthread.dylib (105.10.1) <3103AA7F-3BAE-3673-9649-47FFD7E15C97> /usr/lib/system/libsystem_pthread.dylib 0x7fff8f800000 - 0x7fff8f838ffb libsystem_network.dylib (411.1) <2EC3A005-473F-3C36-A665-F88B5BACC7F0> /usr/lib/system/libsystem_network.dylib 
0x7fff8f839000 - 0x7fff8f843ff7 com.apple.NetAuth (5.0 - 5.0) <B9EC5425-D38D-308C-865F-207E0A98BAC7> /System/Library/PrivateFrameworks/NetAuth.framework/Versions/A/NetAuth 0x7fff8f9e5000 - 0x7fff8fafdffb com.apple.CoreText (352.0 - 454.3) <B3B8C775-14FA-38F3-9CD5-830422AE9C49> /System/Library/Frameworks/CoreText.framework/Versions/A/CoreText 0x7fff8fed6000 - 0x7fff8fedeffb libcopyfile.dylib (118.1.2) <0C68D3A6-ACDD-3EF3-991A-CC82C32AB836> /usr/lib/system/libcopyfile.dylib 0x7fff8fedf000 - 0x7fff8feeefff com.apple.LangAnalysis (1.7.0 - 1.7.0) <D1E527E4-C561-352F-9457-E8C50232793C> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/LangAnalysis.framework/Versions/A/LangAnalysis 0x7fff8ff41000 - 0x7fff90126ff3 libicucore.A.dylib (531.31) <B08E00D5-13C6-3391-AB3A-8DE693D3B42E> /usr/lib/libicucore.A.dylib 0x7fff90136000 - 0x7fff90161ff3 libarchive.2.dylib (30) <8CBB4416-EBE9-3574-8ADC-44655D245F39> /usr/lib/libarchive.2.dylib 0x7fff9027b000 - 0x7fff90295ff7 liblzma.5.dylib (7) <1D03E875-A7C0-3028-814C-3C27F7B7C079> /usr/lib/liblzma.5.dylib 0x7fff9029e000 - 0x7fff902b1ff7 com.apple.CoreBluetooth (1.0 - 1) <FA9B43B3-E183-3040-AE25-66EF9870CF35> /System/Library/Frameworks/CoreBluetooth.framework/Versions/A/CoreBluetooth 0x7fff9034e000 - 0x7fff9054846f libobjc.A.dylib (647) <759E155D-BC42-3D4E-869B-6F57D477177C> /usr/lib/libobjc.A.dylib 0x7fff90575000 - 0x7fff90582fff com.apple.ProtocolBuffer (1 - 225.1) <2D502FBB-D2A0-3937-A5C5-385FA65B3874> /System/Library/PrivateFrameworks/ProtocolBuffer.framework/Versions/A/ProtocolBuffer 0x7fff90583000 - 0x7fff90583ff7 libunc.dylib (29) <5676F7EA-C1DF-329F-B006-D2C3022B7D70> /usr/lib/system/libunc.dylib 0x7fff90584000 - 0x7fff90591fff com.apple.SpeechRecognitionCore (2.0.32 - 2.0.32) <87F0C88D-502D-3217-8B4A-8388288568BA> /System/Library/PrivateFrameworks/SpeechRecognitionCore.framework/Versions/A/SpeechRecognitionCore 0x7fff905d4000 - 0x7fff905d6fff com.apple.CoreDuetDebugLogging (1.0 - 1) 
<9A6E5710-EA99-366E-BF40-9A65EC1B46A1> /System/Library/PrivateFrameworks/CoreDuetDebugLogging.framework/Versions/A/CoreDuetDebugLogging 0x7fff905d7000 - 0x7fff905defff com.apple.NetFS (6.0 - 4.0) <1581D25F-CC07-39B0-90E8-5D4F3CF84EBA> /System/Library/Frameworks/NetFS.framework/Versions/A/NetFS 0x7fff905df000 - 0x7fff90975fff com.apple.CoreFoundation (6.9 - 1152) <CBD1591C-405E-376E-87E9-B264610EBF49> /System/Library/Frameworks/CoreFoundation.framework/Versions/A/CoreFoundation 0x7fff909dc000 - 0x7fff90a01ff7 libJPEG.dylib (1232) <09466709-4742-3418-A0AC-116EF9714E2D> /System/Library/Frameworks/ImageIO.framework/Versions/A/Resources/libJPEG.dylib 0x7fff90a09000 - 0x7fff90a10ff7 libcompiler_rt.dylib (35) <BF8FC133-EE10-3DA6-9B90-92039E28678F> /usr/lib/system/libcompiler_rt.dylib 0x7fff90a11000 - 0x7fff90b03ff7 libiconv.2.dylib (42) <2A06D02F-8B76-3864-8D96-64EF5B40BC6C> /usr/lib/libiconv.2.dylib 0x7fff90b04000 - 0x7fff90bc4fff com.apple.backup.framework (1.6.2 - 1.6.2) <63E8CA47-B7B8-3A63-B505-D1622CE52527> /System/Library/PrivateFrameworks/Backup.framework/Versions/A/Backup 0x7fff90bc5000 - 0x7fff90bd6ff7 libz.1.dylib (55) <88C7C7DE-04B8-316F-8B74-ACD9F3DE1AA1> /usr/lib/libz.1.dylib 0x7fff90bd7000 - 0x7fff90bd8ff7 com.apple.print.framework.Print (10.0 - 265) <3BC4FE7F-78A0-3E57-8F4C-520E7EFD36FA> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/Print.framework/Versions/A/Print 0x7fff90bd9000 - 0x7fff90bdefff com.apple.DiskArbitration (2.6 - 2.6) <0DFF4D9B-2AC3-3B82-B5C5-30F4EFBD2DB9> /System/Library/Frameworks/DiskArbitration.framework/Versions/A/DiskArbitration 0x7fff90bdf000 - 0x7fff90bf6ff7 libLinearAlgebra.dylib (1128) <E78CCBAA-A999-3B65-8EC9-06DB15E67C37> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libLinearAlgebra.dylib 0x7fff90bf7000 - 0x7fff90c2ffff com.apple.RemoteViewServices (2.0 - 99) <C9A62691-B0D9-34B7-B71C-A48B5F4DC553> 
/System/Library/PrivateFrameworks/RemoteViewServices.framework/Versions/A/RemoteViewServices 0x7fff90c30000 - 0x7fff90c4afff com.apple.AppleVPAFramework (1.2.10 - 1.2.10) <DC3D5A44-AB1E-32A9-9D22-FC922B52346A> /System/Library/PrivateFrameworks/AppleVPA.framework/Versions/A/AppleVPA 0x7fff90c4b000 - 0x7fff90c53fff libsystem_dnssd.dylib (561.1.1) <62B70ECA-E40D-3C63-896E-7F00EC386DDB> /usr/lib/system/libsystem_dnssd.dylib 0x7fff90c54000 - 0x7fff90f58ffb com.apple.HIToolbox (2.1.1 - 757.3) <D827FC03-5668-3AA4-AF0E-46EEF7358EEA> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/HIToolbox.framework/Versions/A/HIToolbox 0x7fff90f59000 - 0x7fff90f5bfff com.apple.OAuth (25 - 25) <EE765AF0-2BB6-3689-9EAA-689BF1F02A0D> /System/Library/PrivateFrameworks/OAuth.framework/Versions/A/OAuth 0x7fff90f5c000 - 0x7fff90f5ffff com.apple.IOSurface (97 - 97) <D4B4D2B2-7B16-3174-9EA6-55E0A10B452D> /System/Library/Frameworks/IOSurface.framework/Versions/A/IOSurface 0x7fff90f60000 - 0x7fff90f66fff com.apple.speech.recognition.framework (5.0.9 - 5.0.9) <BB2D573F-0A01-379F-A2BA-3C454EDCB111> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/SpeechRecognition.framework/Versions/A/SpeechRecognition 0x7fff90f67000 - 0x7fff90f73ff7 com.apple.OpenDirectory (10.10 - 187) <8B98ECCB-7EFA-3A58-BD2B-A0835D869B1A> /System/Library/Frameworks/OpenDirectory.framework/Versions/A/OpenDirectory 0x7fff90fce000 - 0x7fff91063ff7 com.apple.ColorSync (4.9.0 - 4.9.0) <F06733BD-A10C-3DB3-B050-825351130392> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/ColorSync.framework/Versions/A/ColorSync 0x7fff91118000 - 0x7fff911a9ff7 libCoreStorage.dylib (471.10.6) <892DEEE7-C8C7-35EA-931D-FF9862BDEB2B> /usr/lib/libCoreStorage.dylib 0x7fff911aa000 - 0x7fff9129efff libFontParser.dylib (134.1) <EA8452DB-9221-3608-95BF-496F58106313> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/ATS.framework/Versions/A/Resources/libFontParser.dylib 
0x7fff9129f000 - 0x7fff912c5ff7 com.apple.ChunkingLibrary (2.1 - 163.1) <3514F2A4-38BD-3849-9286-B3B991057742> /System/Library/PrivateFrameworks/ChunkingLibrary.framework/Versions/A/ChunkingLibrary 0x7fff912ef000 - 0x7fff9145aff7 com.apple.audio.toolbox.AudioToolbox (1.12 - 1.12) <5C6DBEB4-F2EA-3262-B9FC-AFB89404C1DA> /System/Library/Frameworks/AudioToolbox.framework/Versions/A/AudioToolbox 0x7fff9145b000 - 0x7fff916c3ff3 com.apple.security (7.0 - 57031.10.10) <79C37E73-271B-3BEF-A96E-CDB83FF12CF0> /System/Library/Frameworks/Security.framework/Versions/A/Security 0x7fff916db000 - 0x7fff91753ff7 com.apple.SystemConfiguration (1.14 - 1.14) <E0495F7D-5624-3EF7-B7E5-DA0EE708B6E4> /System/Library/Frameworks/SystemConfiguration.framework/Versions/A/SystemConfiguration 0x7fff9176e000 - 0x7fff918b0fff libsqlite3.dylib (168) <7B580EB9-9260-35FE-AE2F-276A2C242BAB> /usr/lib/libsqlite3.dylib 0x7fff918b1000 - 0x7fff91dc4ff3 com.apple.JavaScriptCore (10600 - 10600.3.13) <C0C3246C-D26F-3440-AC75-81CFFA4F9C91> /System/Library/Frameworks/JavaScriptCore.framework/Versions/A/JavaScriptCore 0x7fff91dc5000 - 0x7fff91e3bfe7 libcorecrypto.dylib (233.1.2) <E1789801-3985-3949-B736-6B3378873301> /usr/lib/system/libcorecrypto.dylib 0x7fff91e3c000 - 0x7fff91e4eff7 com.apple.ImageCapture (9.0 - 9.0) <7FB65DD4-56B5-35C4-862C-7A2DED991D1F> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/ImageCapture.framework/Versions/A/ImageCapture 0x7fff91e52000 - 0x7fff91e52fff com.apple.CoreServices (62 - 62) <9E4577CA-3FC3-300D-AB00-87ADBDDA2E37> /System/Library/Frameworks/CoreServices.framework/Versions/A/CoreServices 0x7fff91e5c000 - 0x7fff91f7eff7 com.apple.LaunchServices (644.12.4 - 644.12.4) <59E909E8-ED4A-33EA-B85D-D409BADDF854> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/LaunchServices.framework/Versions/A/LaunchServices 0x7fff91f7f000 - 0x7fff92013fff com.apple.ink.framework (10.9 - 213) <8E029630-1530-3734-A446-13353F0E7AC5> 
/System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/Ink.framework/Versions/A/Ink 0x7fff9201d000 - 0x7fff9204fff3 com.apple.frameworks.CoreDaemon (1.3 - 1.3) <C6DB0A07-F8E4-3837-BCA9-225F460EDA81> /System/Library/PrivateFrameworks/CoreDaemon.framework/Versions/B/CoreDaemon 0x7fff92066000 - 0x7fff9206afff libcache.dylib (69) <45E9A2E7-99C4-36B2-BEE3-0C4E11614AD1> /usr/lib/system/libcache.dylib 0x7fff9206b000 - 0x7fff92078ff7 libbz2.1.0.dylib (36) <2DF83FBC-5C08-39E1-94F5-C28653791B5F> /usr/lib/libbz2.1.0.dylib 0x7fff920d6000 - 0x7fff920f3fff libsystem_kernel.dylib (2782.10.72) <97CD7ACD-EA0C-3434-BEFC-FCD013D6BB73> /usr/lib/system/libsystem_kernel.dylib 0x7fff9266f000 - 0x7fff9266ffff com.apple.audio.units.AudioUnit (1.12 - 1.12) <76EF1C9D-DEA4-3E55-A134-4099B2FD2CF2> /System/Library/Frameworks/AudioUnit.framework/Versions/A/AudioUnit 0x7fff92670000 - 0x7fff926a3ff7 com.apple.MediaKit (16 - 757) <345EDAFE-3E39-3B0F-8D84-54657EC4396D> /System/Library/PrivateFrameworks/MediaKit.framework/Versions/A/MediaKit 0x7fff926a4000 - 0x7fff926a6ff7 libquarantine.dylib (76) <DC041627-2D92-361C-BABF-A869A5C72293> /usr/lib/system/libquarantine.dylib 0x7fff92fec000 - 0x7fff92fecff7 libkeymgr.dylib (28) <77845842-DE70-3CC5-BD01-C3D14227CED5> /usr/lib/system/libkeymgr.dylib 0x7fff9309a000 - 0x7fff9317efff libcrypto.0.9.8.dylib (52.10.1) <2A2924DE-63FB-37F6-B102-84D69240675B> /usr/lib/libcrypto.0.9.8.dylib 0x7fff93181000 - 0x7fff93182fff libSystem.B.dylib (1213) <90B107BC-FF74-32CC-B1CF-4E02F544D957> /usr/lib/libSystem.B.dylib 0x7fff93184000 - 0x7fff93189ff7 libmacho.dylib (862) <126CA2ED-DE91-308F-8881-B9DAEC3C63B6> /usr/lib/system/libmacho.dylib 0x7fff9318a000 - 0x7fff9318bfff com.apple.TrustEvaluationAgent (2.0 - 25) <2D61A2C3-C83E-3A3F-8EC1-736DBEC250AB> /System/Library/PrivateFrameworks/TrustEvaluationAgent.framework/Versions/A/TrustEvaluationAgent 0x7fff9318c000 - 0x7fff93190fff libspindump.dylib (182) <085978DC-A34D-3B72-BC7B-025C35A0A373> 
/usr/lib/libspindump.dylib 0x7fff93191000 - 0x7fff931b9ffb libRIP.A.dylib (775.16) <7711F7A7-1813-3024-AE42-75CA7C5422B7> /System/Library/Frameworks/CoreGraphics.framework/Versions/A/Resources/libRIP.A.dylib 0x7fff931d3000 - 0x7fff931dafff libCGCMS.A.dylib (775.16) <8A173E74-7123-35F1-B160-853528C144ED> /System/Library/Frameworks/CoreGraphics.framework/Versions/A/Resources/libCGCMS.A.dylib 0x7fff931db000 - 0x7fff931ddfff libCVMSPluginSupport.dylib (11.1.1) <DA0706C5-F02A-3F3D-8EBA-18C04313CA2C> /System/Library/Frameworks/OpenGL.framework/Versions/A/Libraries/libCVMSPluginSupport.dylib 0x7fff931e3000 - 0x7fff933e6ff3 com.apple.CFNetwork (720.2.4 - 720.2.4) <E550C671-930F-3B12-8798-23898473E179> /System/Library/Frameworks/CFNetwork.framework/Versions/A/CFNetwork 0x7fff93451000 - 0x7fff93457ff7 libsystem_networkextension.dylib (167.1.10) <29AB225B-D7FB-30ED-9600-65D44B9A9442> /usr/lib/system/libsystem_networkextension.dylib 0x7fff93471000 - 0x7fff93472ff7 libsystem_blocks.dylib (65) <9615D10A-FCA7-3BE4-AA1A-1B195DACE1A1> /usr/lib/system/libsystem_blocks.dylib 0x7fff93473000 - 0x7fff93473fff com.apple.Accelerate.vecLib (3.10 - vecLib 3.10) <B92888D0-ED3F-3430-8F3A-6E56FD16C5F1> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/vecLib 0x7fff93474000 - 0x7fff93476ff7 com.apple.securityhi (9.0 - 55006) <1F40ECF1-6AEF-3E64-9DAD-ADC646CCEA98> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/SecurityHI.framework/Versions/A/SecurityHI 0x7fff934c2000 - 0x7fff9400cff7 com.apple.AppKit (6.9 - 1344.72) <44EF7DEB-3072-3515-9F34-2857D557E828> /System/Library/Frameworks/AppKit.framework/Versions/C/AppKit 0x7fff9400d000 - 0x7fff94017ff7 com.apple.CrashReporterSupport (10.10 - 629) <4BCAA6B5-EC7F-365F-9D3F-BC483B7E956C> /System/Library/PrivateFrameworks/CrashReporterSupport.framework/Versions/A/CrashReporterSupport 0x7fff94018000 - 0x7fff9408cfff com.apple.ApplicationServices.ATS (360 - 375) 
<2824D38D-460D-353C-9D18-499B4BEEABB7> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/ATS.framework/Versions/A/ATS 0x7fff9408d000 - 0x7fff948e4ffb com.apple.CoreGraphics (1.600.0 - 775.16) <864C1845-C41E-314C-A3B4-438DC39E5FBC> /System/Library/Frameworks/CoreGraphics.framework/Versions/A/CoreGraphics 0x7fff948e5000 - 0x7fff94932ff3 com.apple.print.framework.PrintCore (10.0 - 451) <3CA58254-D14F-3913-9DFB-CAC499570CC7> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/PrintCore.framework/Versions/A/PrintCore 0x7fff94990000 - 0x7fff949f7ffb com.apple.datadetectorscore (6.0 - 396.1.1) <80379385-A4EC-3F9B-AFED-9B1DF781943D> /System/Library/PrivateFrameworks/DataDetectorsCore.framework/Versions/A/DataDetectorsCore 0x7fff94c3c000 - 0x7fff94c44ff7 com.apple.icloud.FindMyDevice (1.0 - 1) <D198E170-3610-3727-BC87-73AD249CA097> /System/Library/PrivateFrameworks/FindMyDevice.framework/Versions/A/FindMyDevice 0x7fff94c45000 - 0x7fff94c56ff7 libsystem_coretls.dylib (35.1.2) <BC691CD1-17B6-39A5-BD02-AF973695FD1D> /usr/lib/system/libsystem_coretls.dylib 0x7fff94d38000 - 0x7fff94d3efff libsystem_trace.dylib (72.1.3) <A9E6B7D8-C327-3742-AC54-86C94218B1DF> /usr/lib/system/libsystem_trace.dylib 0x7fff94d41000 - 0x7fff94d5bff3 com.apple.Ubiquity (1.3 - 313) <DF56A657-CC6E-3BE2-86A0-71F07127724C> /System/Library/PrivateFrameworks/Ubiquity.framework/Versions/A/Ubiquity 0x7fff94e76000 - 0x7fff94ee8ff7 com.apple.framework.IOKit (2.0.2 - 1050.10.8) <FDFB1FBE-6A0E-3D63-828C-CD53500FCB0F> /System/Library/Frameworks/IOKit.framework/Versions/A/IOKit 0x7fff94ee9000 - 0x7fff94f58fff com.apple.SearchKit (1.4.0 - 1.4.0) <BFD6D876-36BA-3A3B-9F15-3E2F7DE6E89D> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/SearchKit.framework/Versions/A/SearchKit 0x7fff94f66000 - 0x7fff94f83ffb libresolv.9.dylib (57) <26B38E61-298A-3C3A-82C1-3B5E98AD5E29> /usr/lib/libresolv.9.dylib 0x7fff95321000 - 0x7fff9532cff7 
com.apple.speech.synthesis.framework (5.3.3 - 5.3.3) <7DF3C68C-B219-3E13-AE72-24B8606A1560> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/SpeechSynthesis.framework/Versions/A/SpeechSynthesis 0x7fff95389000 - 0x7fff953a3ff7 com.apple.Kerberos (3.0 - 1) <7760E0C2-A222-3709-B2A6-B692D900CEB1> /System/Library/Frameworks/Kerberos.framework/Versions/A/Kerberos 0x7fff953a4000 - 0x7fff953f0ff7 com.apple.corelocation (1486.17 - 1615.21.1) <B81BC475-E215-3491-A750-8B23F05ABF5B> /System/Library/Frameworks/CoreLocation.framework/Versions/A/CoreLocation 0x7fff953f1000 - 0x7fff955a1ff7 com.apple.QuartzCore (1.10 - 361.15) <72A78C43-30DF-3748-9015-4B28119DB27B> /System/Library/Frameworks/QuartzCore.framework/Versions/A/QuartzCore 0x7fff955a7000 - 0x7fff9655effb com.apple.WebCore (10600 - 10600.3.15) <59A28076-26E4-3CE2-B6FC-AF59308C0B95> /System/Library/Frameworks/WebKit.framework/Versions/A/Frameworks/WebCore.framework/Versions/A/WebCore 0x7fff965ad000 - 0x7fff96668ff7 com.apple.DiscRecording (9.0 - 9000.4.2) <9BB46993-311A-3F2E-BD77-3CBEFB71C1F0> /System/Library/Frameworks/DiscRecording.framework/Versions/A/DiscRecording 0x7fff96769000 - 0x7fff967a3ffb com.apple.DebugSymbols (115 - 115) <6F03761D-7C3A-3C80-8031-AA1C1AD7C706> /System/Library/PrivateFrameworks/DebugSymbols.framework/Versions/A/DebugSymbols 0x7fff967a4000 - 0x7fff96867ff7 libvMisc.dylib (516) <A84F3A3B-D349-3FBC-B5A6-E0F572734073> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libvMisc.dylib 0x7fff96868000 - 0x7fff9688dff7 libPng.dylib (1232) <6E72AE55-AFB0-3FC4-80B2-EBC3353436B7> /System/Library/Frameworks/ImageIO.framework/Versions/A/Resources/libPng.dylib 0x7fff968a4000 - 0x7fff96bd2fff com.apple.Foundation (6.9 - 1152.14) <E3746EDD-DFB1-3ECB-88ED-A91AC0EF3AAA> /System/Library/Frameworks/Foundation.framework/Versions/C/Foundation 0x7fff96bd3000 - 0x7fff96bdbfe7 libcldcpuengine.dylib (2.4.5) <F9EF8060-5E40-3E88-BC38-7452649672B2> 
/System/Library/Frameworks/OpenCL.framework/Versions/A/Libraries/libcldcpuengine.dylib 0x7fff96c01000 - 0x7fff96c03ffb libCGXType.A.dylib (775.16) <B2DC78CA-179F-39A7-8D0B-873DC0ACFE96> /System/Library/Frameworks/CoreGraphics.framework/Versions/A/Resources/libCGXType.A.dylib 0x7fff96c04000 - 0x7fff96c05fff libsystem_secinit.dylib (18) <581DAD0F-6B63-3A48-B63B-917AF799ABAA> /usr/lib/system/libsystem_secinit.dylib 0x7fff96c39000 - 0x7fff96c67fff com.apple.CoreServicesInternal (221.2.2 - 221.2.2) <16F7A7F1-CF1D-35AD-A91F-690A814048DF> /System/Library/PrivateFrameworks/CoreServicesInternal.framework/Versions/A/CoreServicesInternal 0x7fff96c68000 - 0x7fff96d06fff com.apple.Metadata (10.7.0 - 917.1) <46BE997C-B1F4-3BED-9332-FAC87297C87A> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/Metadata.framework/Versions/A/Metadata 0x7fff96d33000 - 0x7fff96d37fff libCoreVMClient.dylib (79) <FC4E08E3-749E-32FF-B5E9-211F29864831> /System/Library/Frameworks/OpenGL.framework/Versions/A/Libraries/libCoreVMClient.dylib 0x7fff96e59000 - 0x7fff96e5cfff com.apple.help (1.3.3 - 46) <CA4541F4-CEF5-355C-8F1F-EA65DC1B400F> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/Help.framework/Versions/A/Help 0x7fff96e5d000 - 0x7fff96e73ff7 libsystem_asl.dylib (267) <F153AC5B-0542-356E-88C8-20A62CA704E2> /usr/lib/system/libsystem_asl.dylib 0x7fff96e74000 - 0x7fff96e9cfff libsystem_info.dylib (459) <B85A85D5-8530-3A93-B0C3-4DEC41F79478> /usr/lib/system/libsystem_info.dylib 0x7fff96f0b000 - 0x7fff97022fe7 libvDSP.dylib (516) <DFEDB210-49D1-3803-88A2-C61DB6A45C3D> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libvDSP.dylib 0x7fff97023000 - 0x7fff9703dff7 libextension.dylib (55.1) <6D0CF094-85E8-3F5B-A3F1-25ECF60F80D9> /usr/lib/libextension.dylib 0x7fff9722a000 - 0x7fff97291ff7 com.apple.framework.CoreWiFi (3.0 - 300.4) <19269C1D-EB29-384A-83F3-7DDDEB7D9DAD> 
/System/Library/PrivateFrameworks/CoreWiFi.framework/Versions/A/CoreWiFi 0x7fff972a8000 - 0x7fff972acff7 libGIF.dylib (1232) <061D5354-FE4F-3C7E-B563-99DC0198062D> /System/Library/Frameworks/ImageIO.framework/Versions/A/Resources/libGIF.dylib 0x7fff972ae000 - 0x7fff972beff7 libbsm.0.dylib (34) <A3A2E56C-2B65-37C7-B43A-A1F926E1A0BB> /usr/lib/libbsm.0.dylib 0x7fff97e9c000 - 0x7fff981cffff libmecabra.dylib (666.2) <F757CABA-3EDB-3ABA-A378-A7C574EA233B> /usr/lib/libmecabra.dylib 0x7fff981d0000 - 0x7fff981dbfff libcommonCrypto.dylib (60061) <D381EBC6-69D8-31D3-8084-5A80A32CB748> /usr/lib/system/libcommonCrypto.dylib 0x7fff98291000 - 0x7fff98299fff libsystem_platform.dylib (63) <64E34079-D712-3D66-9CE2-418624A5C040> /usr/lib/system/libsystem_platform.dylib 0x7fff9829a000 - 0x7fff982c4ff7 libdispatch.dylib (442.1.4) <502CF32B-669B-3709-8862-08188225E4F0> /usr/lib/system/libdispatch.dylib 0x7fff982f8000 - 0x7fff982f8fff com.apple.Carbon (154 - 157) <0DF27AD6-ED64-34D7-825D-65297D276652> /System/Library/Frameworks/Carbon.framework/Versions/A/Carbon 0x7fff98621000 - 0x7fff9889cff7 com.apple.CoreData (111 - 526.1) <DC4F037B-B7F4-381A-B939-4414489D76BF> /System/Library/Frameworks/CoreData.framework/Versions/A/CoreData 0x7fff9889d000 - 0x7fff989e5ff7 com.apple.WebKitLegacy (10600 - 10600.3.18) <91B3E705-1378-3F73-B079-3223E838B629> /System/Library/Frameworks/WebKit.framework/Versions/A/Frameworks/WebKitLegacy.framework/Versions/A/WebKitLegacy 0x7fff989fb000 - 0x7fff98a16ff7 com.apple.aps.framework (4.0 - 4.0) <F3C3C246-101E-3E81-9608-D2D6E9352532> /System/Library/PrivateFrameworks/ApplePushService.framework/Versions/A/ApplePushService 0x7fff98bd2000 - 0x7fff98bd2fff com.apple.ApplicationServices (48 - 48) <5BF7910B-C328-3BF8-BA4F-CE52B574CE01> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/ApplicationServices 0x7fff98bd3000 - 0x7fff98bd5fff com.apple.loginsupport (1.0 - 1) <21DBC18C-F260-39FC-B52F-04A5AA84523A> 
/System/Library/PrivateFrameworks/login.framework/Versions/A/Frameworks/loginsupport.framework/Versions/A/loginsupport 0x7fff98c31000 - 0x7fff98f18ffb com.apple.CoreServices.CarbonCore (1108.2 - 1108.2) <FD87F83F-301A-3BD6-8262-5692FC1B4457> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/CarbonCore.framework/Versions/A/CarbonCore 0x7fff99071000 - 0x7fff99085ff7 com.apple.MultitouchSupport.framework (262.33.1 - 262.33.1) <62DF9340-01A1-3E12-A604-C90F6361FD9E> /System/Library/PrivateFrameworks/MultitouchSupport.framework/Versions/A/MultitouchSupport 0x7fff9911e000 - 0x7fff99164ffb libFontRegistry.dylib (134) <01B8034A-45FD-3360-A347-A1896F591363> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/ATS.framework/Versions/A/Resources/libFontRegistry.dylib 0x7fff99174000 - 0x7fff99267ff7 libJP2.dylib (1232) <10B78725-0B8A-3D87-B2E3-8FEED0C07F21> /System/Library/Frameworks/ImageIO.framework/Versions/A/Resources/libJP2.dylib 0x7fff99273000 - 0x7fff99275ff7 libutil.dylib (38) <471AD65E-B86E-3C4A-8ABD-B8665A2BCE3F> /usr/lib/libutil.dylib 0x7fff992ac000 - 0x7fff992e7fff com.apple.QD (301 - 301) <C4D2AD03-B839-350A-AAF0-B4A08F8BED77> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/QD.framework/Versions/A/QD 0x7fff99365000 - 0x7fff99365ff7 liblaunch.dylib (559.10.3) <DFCDEBDF-8247-3DC7-9879-E7E497DDA4B4> /usr/lib/system/liblaunch.dylib 0x7fff99c0f000 - 0x7fff99c17ffb com.apple.CoreServices.FSEvents (1210 - 1210) <782A9C69-7A45-31A7-8960-D08A36CBD0A7> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/FSEvents.framework/Versions/A/FSEvents 0x7fff99c18000 - 0x7fff99c48fff libsystem_m.dylib (3086.1) <1E12AB45-6D96-36D0-A226-F24D9FB0D9D6> /usr/lib/system/libsystem_m.dylib 0x7fff99c52000 - 0x7fff99de0fff libBLAS.dylib (1128) <497912C1-A98E-3281-BED7-E9C751552F61> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libBLAS.dylib 
0x7fff99e4c000 - 0x7fff99e4fff7 libdyld.dylib (353.2.1) <4E33E416-F1D8-3598-B8CC-6863E2ECD0E6> /usr/lib/system/libdyld.dylib 0x7fff99e7d000 - 0x7fff99e8ffff libsasl2.2.dylib (193) <E523DD05-544B-3430-8AA9-672408A5AF8B> /usr/lib/libsasl2.2.dylib 0x7fff99e90000 - 0x7fff99eebfff libTIFF.dylib (1232) <29A5C7F7-D50B-35B3-8FA2-A55A47E497A6> /System/Library/Frameworks/ImageIO.framework/Versions/A/Resources/libTIFF.dylib 0x7fff99eec000 - 0x7fff99f0cfff com.apple.IconServices (47.1 - 47.1) <E83DFE3B-6541-3736-96BB-26DC5D0100F1> /System/Library/PrivateFrameworks/IconServices.framework/Versions/A/IconServices 0x7fff99f0d000 - 0x7fff99f5eff7 com.apple.audio.CoreAudio (4.3.0 - 4.3.0) <AF72B06E-C6C1-3FAE-8B47-AF461CAE0E22> /System/Library/Frameworks/CoreAudio.framework/Versions/A/CoreAudio 0x7fff99fc3000 - 0x7fff9a3d0ff7 libLAPACK.dylib (1128) <F9201AE7-B031-36DB-BCF8-971E994EF7C1> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libLAPACK.dylib 0x7fff9a3d1000 - 0x7fff9a3ecff7 libCRFSuite.dylib (34) <D64842BE-7BD4-3D0C-9842-1D202F7C2A51> /usr/lib/libCRFSuite.dylib 0x7fff9a3f7000 - 0x7fff9a3f7fff libOpenScriptingUtil.dylib (162) <EFD79173-A9DA-3AE6-BE15-3948938204A6> /usr/lib/libOpenScriptingUtil.dylib 0x7fff9a3f8000 - 0x7fff9a40aff7 com.apple.CoreDuetDaemonProtocol (1.0 - 1) <CE9FABB4-1C5D-3F9B-9BB8-5CC50C3E5E31> /System/Library/PrivateFrameworks/CoreDuetDaemonProtocol.framework/Versions/A/CoreDuetDaemonProtocol 0x7fff9a40b000 - 0x7fff9a477fff com.apple.framework.CoreWLAN (5.0 - 500.35.2) <37551DDD-C07C-31EB-923A-9721F03D7E29> /System/Library/Frameworks/CoreWLAN.framework/Versions/A/CoreWLAN 0x7fff9a478000 - 0x7fff9a47afff libsystem_configuration.dylib (699.1.5) <5E14864E-089A-3D84-85A4-980B776427A8> /usr/lib/system/libsystem_configuration.dylib 0x7fff9ab72000 - 0x7fff9ab7fff7 libxar.1.dylib (254) <CE10EFED-3066-3749-838A-6A15AC0DBCB6> /usr/lib/libxar.1.dylib 0x7fff9ab80000 - 0x7fff9abdfff3 com.apple.AE (681 - 681) 
<7F544183-A515-31A8-B45F-89A167F56216> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/AE.framework/Versions/A/AE 0x7fff9abe0000 - 0x7fff9ac2cff7 libcups.2.dylib (408) <9CECCDE3-51D7-3028-830C-F58BD36E3317> /usr/lib/libcups.2.dylib 0x7fff9ac2d000 - 0x7fff9ac36fff libGFXShared.dylib (11.1.1) <7AE7D152-597E-3B27-A52C-8DA76760B61C> /System/Library/Frameworks/OpenGL.framework/Versions/A/Libraries/libGFXShared.dylib 0x7fff9ac37000 - 0x7fff9ac37fff com.apple.Accelerate (1.10 - Accelerate 1.10) <F1B96A61-7E4B-31BD-A35B-BA7EF1F16EF4> /System/Library/Frameworks/Accelerate.framework/Versions/A/Accelerate 0x7fff9ac3f000 - 0x7fff9ad6ffff com.apple.UIFoundation (1.0 - 1) <8E030D93-441C-3997-9CD2-55C8DFAC8B84> /System/Library/PrivateFrameworks/UIFoundation.framework/Versions/A/UIFoundation 0x7fff9ad70000 - 0x7fff9ad7bff7 libkxld.dylib (2782.10.72) <68E07A32-28F5-3FBB-9D74-00B4F53C2FD4> /usr/lib/system/libkxld.dylib 0x7fff9ad7c000 - 0x7fff9ad7eff7 libsystem_coreservices.dylib (9) <41B7C578-5A53-31C8-A96F-C73E030B0938> /usr/lib/system/libsystem_coreservices.dylib 0x7fff9add5000 - 0x7fff9ae23fff libcurl.4.dylib (83.1.2) <337A1FF8-E8B1-3173-9F29-C0D4C851D8E1> /usr/lib/libcurl.4.dylib 0x7fff9aea3000 - 0x7fff9aed0fff com.apple.Accounts (113 - 113) <990F0F61-6AC5-3076-932E-02A9A7F75AC4> /System/Library/Frameworks/Accounts.framework/Versions/A/Accounts 0x7fff9aeeb000 - 0x7fff9af5fff3 com.apple.securityfoundation (6.0 - 55126) <DEC91795-7754-334A-8CDA-B429F41B922D> /System/Library/Frameworks/SecurityFoundation.framework/Versions/A/SecurityFoundation 0x7fff9af60000 - 0x7fff9af89ffb libxslt.1.dylib (13) <AED1143F-B848-3E73-81ED-71356F25F084> /usr/lib/libxslt.1.dylib 0x7fff9af8a000 - 0x7fff9afc1ffb com.apple.LDAPFramework (2.4.28 - 194.5) <D22234AA-8B30-3010-8CF0-67516D52CC33> /System/Library/Frameworks/LDAP.framework/Versions/A/LDAP 0x7fff9afc2000 - 0x7fff9b030ffb com.apple.Heimdal (4.0 - 2.0) <3E5DA653-A343-3257-ADE1-BA879BAE280F> 
/System/Library/PrivateFrameworks/Heimdal.framework/Versions/A/Heimdal 0x7fff9b320000 - 0x7fff9b323fff com.apple.xpc.ServiceManagement (1.0 - 1) <5EFD45BF-B0CD-39F2-8232-6BA33E63E5D4> /System/Library/Frameworks/ServiceManagement.framework/Versions/A/ServiceManagement 0x7fff9b458000 - 0x7fff9b459fff liblangid.dylib (117) <B54A4AA0-2E53-3671-90F5-AFF711C0EB9E> /usr/lib/liblangid.dylib 0x7fff9b4bb000 - 0x7fff9b90efc7 com.apple.vImage (8.0 - 8.0) <33BE7B31-72DB-3364-B37E-C322A32748C5> /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vImage.framework/Versions/A/vImage 0x7fff9b90f000 - 0x7fff9b91dff7 com.apple.opengl (11.1.1 - 11.1.1) <F79F5FFF-372E-329E-81FB-EE9BD6A2A7A7> /System/Library/Frameworks/OpenGL.framework/Versions/A/OpenGL 0x7fff9b91e000 - 0x7fff9b94ffff libtidy.A.dylib (15.15) <37FC944D-271A-386A-9ADD-FA33AD48F96D> /usr/lib/libtidy.A.dylib 0x7fff9b975000 - 0x7fff9b9f7fff com.apple.PerformanceAnalysis (1.0 - 1) <94F08B1A-F6AF-38D5-BE92-4FED34742966> /System/Library/PrivateFrameworks/PerformanceAnalysis.framework/Versions/A/PerformanceAnalysis 0x7fff9bd93000 - 0x7fff9bdd3ff7 libGLImage.dylib (11.1.1) <3986BFA3-4F55-380F-B01D-91BA9785D70C> /System/Library/Frameworks/OpenGL.framework/Versions/A/Libraries/libGLImage.dylib 0x7fff9bdd4000 - 0x7fff9bdddff3 com.apple.CommonAuth (4.0 - 2.0) <BA9F5A09-D200-3D18-9F4A-20C789291A30> /System/Library/PrivateFrameworks/CommonAuth.framework/Versions/A/CommonAuth 0x7fff9be2a000 - 0x7fff9be2fff7 libunwind.dylib (35.3) <BE7E51A0-B6EA-3A54-9CCA-9D88F683A6D6> /usr/lib/system/libunwind.dylib 0x7fff9be30000 - 0x7fff9c0f6fff com.apple.WebKit (10600 - 10600.3.18) <F8E36318-4F4C-348B-B1DE-D4BE035036AD> /System/Library/Frameworks/WebKit.framework/Versions/A/WebKit 0x7fff9c175000 - 0x7fff9c198fff com.apple.Sharing (328.3.2 - 328.3.2) <F555679F-1CD1-3EB2-8E01-FCB80EF07330> /System/Library/PrivateFrameworks/Sharing.framework/Versions/A/Sharing 0x7fff9c199000 - 0x7fff9c415ff3 com.apple.RawCamera.bundle (6.02 - 769) 
<1F0F0047-682F-39E3-BE26-2467BF5F0E22> /System/Library/CoreServices/RawCamera.bundle/Contents/MacOS/RawCamera 0x7fff9c416000 - 0x7fff9c45cff7 libauto.dylib (186) <A260789B-D4D8-316A-9490-254767B8A5F1> /usr/lib/libauto.dylib 0x7fff9c641000 - 0x7fff9c690ff7 com.apple.opencl (2.4.2 - 2.4.2) <D16CFDE6-B5F7-301A-995E-8B583D8C675A> /System/Library/Frameworks/OpenCL.framework/Versions/A/OpenCL 0x7fff9c78c000 - 0x7fff9c815fff com.apple.CoreSymbolication (3.1 - 57020) <FDF8F348-164D-38F9-90EB-F42585DD2C77> /System/Library/PrivateFrameworks/CoreSymbolication.framework/Versions/A/CoreSymbolication 0x7fff9c816000 - 0x7fff9c832fff com.apple.GenerationalStorage (2.0 - 209.11) <9FF8DD11-25FB-3047-A5BF-9415339B3EEC> /System/Library/PrivateFrameworks/GenerationalStorage.framework/Versions/A/GenerationalStorage 0x7fff9c833000 - 0x7fff9c83cff7 libsystem_notify.dylib (133.1.1) <61147800-F320-3DAA-850C-BADF33855F29> /usr/lib/system/libsystem_notify.dylib 0x7fff9c931000 - 0x7fff9c945ff7 com.apple.ProtectedCloudStorage (1.0 - 1) <52CFE68A-0663-3756-AB5B-B42195026052> /System/Library/PrivateFrameworks/ProtectedCloudStorage.framework/Versions/A/ProtectedCloudStorage 0x7fff9c946000 - 0x7fff9c95ffff com.apple.openscripting (1.4 - 162) <80DFF366-B950-3F79-903F-99DA0FFDB570> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/OpenScripting.framework/Versions/A/OpenScripting 0x7fff9c985000 - 0x7fff9c986ffb libremovefile.dylib (35) <3485B5F4-6CE8-3C62-8DFD-8736ED6E8531> /usr/lib/system/libremovefile.dylib 0x7fff9c9a3000 - 0x7fff9cdd3fff com.apple.vision.FaceCore (3.1.6 - 3.1.6) <C3B823AA-C261-37D3-B4AC-C59CE91C8241> /System/Library/PrivateFrameworks/FaceCore.framework/Versions/A/FaceCore 0x7fff9ce9e000 - 0x7fff9cea2fff libpam.2.dylib (20) <E805398D-9A92-31F8-8005-8DC188BD8B6E> /usr/lib/libpam.2.dylib 0x7fff9d026000 - 0x7fff9d0c5df7 com.apple.AppleJPEG (1.0 - 1) <9BB3D7DF-630A-3E1C-A124-12D6C4D0DE70> /System/Library/PrivateFrameworks/AppleJPEG.framework/Versions/A/AppleJPEG 
0x7fff9d0c6000 - 0x7fff9d0cbffb libheimdal-asn1.dylib (398.10.1) <A7B6447A-6680-3625-83C3-993B58D5C43F> /usr/lib/libheimdal-asn1.dylib 0x7fff9d116000 - 0x7fff9d15fff3 com.apple.HIServices (1.22 - 520.12) <8EAC82AB-6A7D-3606-AF6F-60A9410D1278> /System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/HIServices.framework/Versions/A/HIServices 0x7fff9d160000 - 0x7fff9d16cff7 com.apple.HelpData (2.1.4 - 90) <471200E4-1D51-3D8C-A956-A52F8EB7B552> /System/Library/PrivateFrameworks/HelpData.framework/Versions/A/HelpData 0x7fff9d16d000 - 0x7fff9d17efff libcmph.dylib (1) <46EC3997-DB5E-38AE-BBBB-A035A54AD3C0> /usr/lib/libcmph.dylib 0x7fff9d182000 - 0x7fff9d187ff7 libsystem_stats.dylib (163.10.18) <9B8CCF24-DDDB-399A-9237-4BEC225D2E8C> /usr/lib/system/libsystem_stats.dylib 0x7fff9d188000 - 0x7fff9d18cfff com.apple.CommonPanels (1.2.6 - 96) <F9ECC8AF-D9CA-3350-AFB4-5113A9B789A5> /System/Library/Frameworks/Carbon.framework/Versions/A/Frameworks/CommonPanels.framework/Versions/A/CommonPanels 0x7fff9d18d000 - 0x7fff9d1b8fff com.apple.DictionaryServices (1.2 - 229) <6789EC43-CADA-394D-8FE8-FC3A2DD136B9> /System/Library/Frameworks/CoreServices.framework/Versions/A/Frameworks/DictionaryServices.framework/Versions/A/DictionaryServices 0x7fff9d1b9000 - 0x7fff9d1e6fff com.apple.CoreVideo (1.8 - 145.1) <18DB07E0-B927-3260-A234-636F298D1917> /System/Library/Frameworks/CoreVideo.framework/Versions/A/CoreVideo Sample analysis of process 509 written to file /dev/stdout ```
non_process
extraordinarily high memory usage version i m sharing this knowing that it probably falls squarely into the wontfix category but i wanted to at least bring it your attention for the last few weeks when my late mac mini starts slowing down in the afternoon i go to activity monitor and see that trailer app has claimed a rather sizable chunk of available memory i have hidden all but two of the dozens of repositories which i can access to see if that helps but it has not process sample sampling process for seconds with millisecond of run time between samples sampling completed processing symbols analysis of sampling trailer pid every millisecond process trailer path applications trailer app contents macos trailer load address identifier com housetrip trailer version code type parent process date time os version mac os x report version analysis tool usr bin sample call graph thread dispatchqueue com apple main thread serial start in libdyld dylib nsapplicationmain in appkit in appkit in appkit dpsnextevent in appkit blockuntilnexteventmatchinglistinmodewithfilter in hitoolbox receivenexteventcommon in hitoolbox runcurrenteventloopinmode in hitoolbox cfrunlooprunspecific in corefoundation cfrunlooprun in corefoundation cfrunloopservicemachport in corefoundation mach msg in libsystem kernel dylib mach msg trap in libsystem kernel dylib thread dispatchqueue com apple libdispatch manager serial dispatch mgr thread in libdispatch dylib in libsystem kernel dylib thread com apple nsurlconnectionloader thread start in libsystem pthread dylib pthread start in libsystem pthread dylib pthread body in libsystem pthread dylib nsthread main in foundation in cfnetwork cfrunlooprunspecific in corefoundation cfrunlooprun in corefoundation cfrunloopservicemachport in corefoundation mach msg in libsystem kernel dylib mach msg trap in libsystem kernel dylib thread thread start in libsystem pthread dylib pthread start in libsystem pthread dylib pthread body in libsystem pthread dylib 
nseventthread in appkit cfrunlooprunspecific in corefoundation cfrunlooprun in corefoundation cfrunloopservicemachport in corefoundation mach msg in libsystem kernel dylib mach msg trap in libsystem kernel dylib thread com apple cfsocket private thread start in libsystem pthread dylib pthread start in libsystem pthread dylib pthread body in libsystem pthread dylib select in libsystem kernel dylib thread start wqthread in libsystem pthread dylib workq kernreturn in libsystem kernel dylib thread start wqthread in libsystem pthread dylib workq kernreturn in libsystem kernel dylib thread start wqthread in libsystem pthread dylib workq kernreturn in libsystem kernel dylib thread start wqthread in libsystem pthread dylib workq kernreturn in libsystem kernel dylib thread start wqthread in libsystem pthread dylib workq kernreturn in libsystem kernel dylib total number in stack recursive counted multiple when workq kernreturn in libsystem kernel dylib start wqthread in libsystem pthread dylib sort by top of stack same collapsed when workq kernreturn in libsystem kernel dylib mach msg trap in libsystem kernel dylib select in libsystem kernel dylib in libsystem kernel dylib binary images com housetrip trailer applications trailer app contents macos trailer org andymatuschak sparkle applications trailer app contents frameworks sparkle framework versions a sparkle cl kernels cl kernels bgra dylib system library frameworks opencl framework versions a libraries imageformats bgra dylib cl kernels cl kernels com apple xquery system library privateframeworks xquery framework xquery cl kernels cl kernels dyld usr lib dyld com apple cfopendirectory system library frameworks opendirectory framework versions a frameworks cfopendirectory framework versions a cfopendirectory com apple coreimage system library frameworks quartzcore framework versions a frameworks coreimage framework versions a coreimage libdiagnosticmessagesclient dylib usr lib libdiagnosticmessagesclient dylib com apple 
imageio framework system library frameworks imageio framework versions a imageio libgl dylib system library frameworks opengl framework versions a libraries libgl dylib com apple tcc system library privateframeworks tcc framework versions a tcc com apple efilogin system library privateframeworks efilogin framework versions a efilogin com apple coreduet system library privateframeworks coreduet framework versions a coreduet com apple languagemodeling system library privateframeworks languagemodeling framework versions a languagemodeling com apple coreui system library privateframeworks coreui framework versions a coreui com apple desktopservices system library privateframeworks desktopservicespriv framework versions a desktopservicespriv com apple symbolication system library privateframeworks symbolication framework versions a symbolication com apple applesrp system library privateframeworks applesrp framework versions a applesrp libc abi dylib usr lib libc abi dylib com apple coreservices osservices system library frameworks coreservices framework versions a frameworks osservices framework versions a osservices libsystem sandbox dylib usr lib system libsystem sandbox dylib com apple shortcut system library privateframeworks shortcut framework versions a shortcut dylib usr lib dylib libglu dylib system library frameworks opengl framework versions a libraries libglu dylib com apple geoservices system library privateframeworks geoservices framework versions a geoservices com apple bluetooth system library frameworks iobluetooth framework versions a iobluetooth libcsfde dylib usr lib libcsfde dylib libc dylib usr lib libc dylib libxpc dylib usr lib system libxpc dylib libradiance dylib system library frameworks imageio framework versions a resources libradiance dylib com apple diskimagesframework system library privateframeworks diskimages framework versions a diskimages com apple cocoa system library frameworks cocoa framework versions a cocoa com apple framework 
system library privateframeworks framework versions a com apple clouddocs system library privateframeworks clouddocs framework versions a clouddocs libsystem c dylib usr lib system libsystem c dylib com apple cloudkit cloudkit system library frameworks cloudkit framework versions a cloudkit libsystem malloc dylib usr lib system libsystem malloc dylib com apple gss system library frameworks gss framework versions a gss libsystem pthread dylib usr lib system libsystem pthread dylib libsystem network dylib usr lib system libsystem network dylib com apple netauth system library privateframeworks netauth framework versions a netauth com apple coretext system library frameworks coretext framework versions a coretext libcopyfile dylib usr lib system libcopyfile dylib com apple langanalysis system library frameworks applicationservices framework versions a frameworks langanalysis framework versions a langanalysis libicucore a dylib usr lib libicucore a dylib libarchive dylib usr lib libarchive dylib liblzma dylib usr lib liblzma dylib com apple corebluetooth system library frameworks corebluetooth framework versions a corebluetooth libobjc a dylib usr lib libobjc a dylib com apple protocolbuffer system library privateframeworks protocolbuffer framework versions a protocolbuffer libunc dylib usr lib system libunc dylib com apple speechrecognitioncore system library privateframeworks speechrecognitioncore framework versions a speechrecognitioncore com apple coreduetdebuglogging system library privateframeworks coreduetdebuglogging framework versions a coreduetdebuglogging com apple netfs system library frameworks netfs framework versions a netfs com apple corefoundation system library frameworks corefoundation framework versions a corefoundation libjpeg dylib system library frameworks imageio framework versions a resources libjpeg dylib libcompiler rt dylib usr lib system libcompiler rt dylib libiconv dylib usr lib libiconv dylib com apple backup framework system library 
privateframeworks backup framework versions a backup libz dylib usr lib libz dylib com apple print framework print system library frameworks carbon framework versions a frameworks print framework versions a print com apple diskarbitration system library frameworks diskarbitration framework versions a diskarbitration liblinearalgebra dylib system library frameworks accelerate framework versions a frameworks veclib framework versions a liblinearalgebra dylib com apple remoteviewservices system library privateframeworks remoteviewservices framework versions a remoteviewservices com apple applevpaframework system library privateframeworks applevpa framework versions a applevpa libsystem dnssd dylib usr lib system libsystem dnssd dylib com apple hitoolbox system library frameworks carbon framework versions a frameworks hitoolbox framework versions a hitoolbox com apple oauth system library privateframeworks oauth framework versions a oauth com apple iosurface system library frameworks iosurface framework versions a iosurface com apple speech recognition framework system library frameworks carbon framework versions a frameworks speechrecognition framework versions a speechrecognition com apple opendirectory system library frameworks opendirectory framework versions a opendirectory com apple colorsync system library frameworks applicationservices framework versions a frameworks colorsync framework versions a colorsync libcorestorage dylib usr lib libcorestorage dylib libfontparser dylib system library frameworks applicationservices framework versions a frameworks ats framework versions a resources libfontparser dylib com apple chunkinglibrary system library privateframeworks chunkinglibrary framework versions a chunkinglibrary com apple audio toolbox audiotoolbox system library frameworks audiotoolbox framework versions a audiotoolbox com apple security system library frameworks security framework versions a security com apple systemconfiguration system library frameworks 
systemconfiguration framework versions a systemconfiguration dylib usr lib dylib com apple javascriptcore system library frameworks javascriptcore framework versions a javascriptcore libcorecrypto dylib usr lib system libcorecrypto dylib com apple imagecapture system library frameworks carbon framework versions a frameworks imagecapture framework versions a imagecapture com apple coreservices system library frameworks coreservices framework versions a coreservices com apple launchservices system library frameworks coreservices framework versions a frameworks launchservices framework versions a launchservices com apple ink framework system library frameworks carbon framework versions a frameworks ink framework versions a ink com apple frameworks coredaemon system library privateframeworks coredaemon framework versions b coredaemon libcache dylib usr lib system libcache dylib dylib usr lib dylib libsystem kernel dylib usr lib system libsystem kernel dylib com apple audio units audiounit system library frameworks audiounit framework versions a audiounit com apple mediakit system library privateframeworks mediakit framework versions a mediakit libquarantine dylib usr lib system libquarantine dylib libkeymgr dylib usr lib system libkeymgr dylib libcrypto dylib usr lib libcrypto dylib libsystem b dylib usr lib libsystem b dylib libmacho dylib usr lib system libmacho dylib com apple trustevaluationagent system library privateframeworks trustevaluationagent framework versions a trustevaluationagent libspindump dylib usr lib libspindump dylib librip a dylib system library frameworks coregraphics framework versions a resources librip a dylib libcgcms a dylib system library frameworks coregraphics framework versions a resources libcgcms a dylib libcvmspluginsupport dylib system library frameworks opengl framework versions a libraries libcvmspluginsupport dylib com apple cfnetwork system library frameworks cfnetwork framework versions a cfnetwork libsystem networkextension 
dylib usr lib system libsystem networkextension dylib libsystem blocks dylib usr lib system libsystem blocks dylib com apple accelerate veclib veclib system library frameworks accelerate framework versions a frameworks veclib framework versions a veclib com apple securityhi system library frameworks carbon framework versions a frameworks securityhi framework versions a securityhi com apple appkit system library frameworks appkit framework versions c appkit com apple crashreportersupport system library privateframeworks crashreportersupport framework versions a crashreportersupport com apple applicationservices ats system library frameworks applicationservices framework versions a frameworks ats framework versions a ats com apple coregraphics system library frameworks coregraphics framework versions a coregraphics com apple print framework printcore system library frameworks applicationservices framework versions a frameworks printcore framework versions a printcore com apple datadetectorscore system library privateframeworks datadetectorscore framework versions a datadetectorscore com apple icloud findmydevice system library privateframeworks findmydevice framework versions a findmydevice libsystem coretls dylib usr lib system libsystem coretls dylib libsystem trace dylib usr lib system libsystem trace dylib com apple ubiquity system library privateframeworks ubiquity framework versions a ubiquity com apple framework iokit system library frameworks iokit framework versions a iokit com apple searchkit system library frameworks coreservices framework versions a frameworks searchkit framework versions a searchkit libresolv dylib usr lib libresolv dylib com apple speech synthesis framework system library frameworks applicationservices framework versions a frameworks speechsynthesis framework versions a speechsynthesis com apple kerberos system library frameworks kerberos framework versions a kerberos com apple corelocation system library frameworks corelocation 
framework versions a corelocation com apple quartzcore system library frameworks quartzcore framework versions a quartzcore com apple webcore system library frameworks webkit framework versions a frameworks webcore framework versions a webcore com apple discrecording system library frameworks discrecording framework versions a discrecording com apple debugsymbols system library privateframeworks debugsymbols framework versions a debugsymbols libvmisc dylib system library frameworks accelerate framework versions a frameworks veclib framework versions a libvmisc dylib libpng dylib system library frameworks imageio framework versions a resources libpng dylib com apple foundation system library frameworks foundation framework versions c foundation libcldcpuengine dylib system library frameworks opencl framework versions a libraries libcldcpuengine dylib libcgxtype a dylib system library frameworks coregraphics framework versions a resources libcgxtype a dylib libsystem secinit dylib usr lib system libsystem secinit dylib com apple coreservicesinternal system library privateframeworks coreservicesinternal framework versions a coreservicesinternal com apple metadata system library frameworks coreservices framework versions a frameworks metadata framework versions a metadata libcorevmclient dylib system library frameworks opengl framework versions a libraries libcorevmclient dylib com apple help system library frameworks carbon framework versions a frameworks help framework versions a help libsystem asl dylib usr lib system libsystem asl dylib libsystem info dylib usr lib system libsystem info dylib libvdsp dylib system library frameworks accelerate framework versions a frameworks veclib framework versions a libvdsp dylib libextension dylib usr lib libextension dylib com apple framework corewifi system library privateframeworks corewifi framework versions a corewifi libgif dylib system library frameworks imageio framework versions a resources libgif dylib libbsm dylib usr 
lib libbsm dylib libmecabra dylib usr lib libmecabra dylib libcommoncrypto dylib usr lib system libcommoncrypto dylib libsystem platform dylib usr lib system libsystem platform dylib libdispatch dylib usr lib system libdispatch dylib com apple carbon system library frameworks carbon framework versions a carbon com apple coredata system library frameworks coredata framework versions a coredata com apple webkitlegacy system library frameworks webkit framework versions a frameworks webkitlegacy framework versions a webkitlegacy com apple aps framework system library privateframeworks applepushservice framework versions a applepushservice com apple applicationservices system library frameworks applicationservices framework versions a applicationservices com apple loginsupport system library privateframeworks login framework versions a frameworks loginsupport framework versions a loginsupport com apple coreservices carboncore system library frameworks coreservices framework versions a frameworks carboncore framework versions a carboncore com apple multitouchsupport framework system library privateframeworks multitouchsupport framework versions a multitouchsupport libfontregistry dylib system library frameworks applicationservices framework versions a frameworks ats framework versions a resources libfontregistry dylib dylib system library frameworks imageio framework versions a resources dylib libutil dylib usr lib libutil dylib com apple qd system library frameworks applicationservices framework versions a frameworks qd framework versions a qd liblaunch dylib usr lib system liblaunch dylib com apple coreservices fsevents system library frameworks coreservices framework versions a frameworks fsevents framework versions a fsevents libsystem m dylib usr lib system libsystem m dylib libblas dylib system library frameworks accelerate framework versions a frameworks veclib framework versions a libblas dylib libdyld dylib usr lib system libdyld dylib dylib usr lib dylib 
libtiff dylib system library frameworks imageio framework versions a resources libtiff dylib com apple iconservices system library privateframeworks iconservices framework versions a iconservices com apple audio coreaudio system library frameworks coreaudio framework versions a coreaudio liblapack dylib system library frameworks accelerate framework versions a frameworks veclib framework versions a liblapack dylib libcrfsuite dylib usr lib libcrfsuite dylib libopenscriptingutil dylib usr lib libopenscriptingutil dylib com apple coreduetdaemonprotocol system library privateframeworks coreduetdaemonprotocol framework versions a coreduetdaemonprotocol com apple framework corewlan system library frameworks corewlan framework versions a corewlan libsystem configuration dylib usr lib system libsystem configuration dylib libxar dylib usr lib libxar dylib com apple ae system library frameworks coreservices framework versions a frameworks ae framework versions a ae libcups dylib usr lib libcups dylib libgfxshared dylib system library frameworks opengl framework versions a libraries libgfxshared dylib com apple accelerate accelerate system library frameworks accelerate framework versions a accelerate com apple uifoundation system library privateframeworks uifoundation framework versions a uifoundation libkxld dylib usr lib system libkxld dylib libsystem coreservices dylib usr lib system libsystem coreservices dylib libcurl dylib usr lib libcurl dylib com apple accounts system library frameworks accounts framework versions a accounts com apple securityfoundation system library frameworks securityfoundation framework versions a securityfoundation libxslt dylib usr lib libxslt dylib com apple ldapframework system library frameworks ldap framework versions a ldap com apple heimdal system library privateframeworks heimdal framework versions a heimdal com apple xpc servicemanagement system library frameworks servicemanagement framework versions a servicemanagement liblangid dylib 
usr lib liblangid dylib com apple vimage system library frameworks accelerate framework versions a frameworks vimage framework versions a vimage com apple opengl system library frameworks opengl framework versions a opengl libtidy a dylib usr lib libtidy a dylib com apple performanceanalysis system library privateframeworks performanceanalysis framework versions a performanceanalysis libglimage dylib system library frameworks opengl framework versions a libraries libglimage dylib com apple commonauth system library privateframeworks commonauth framework versions a commonauth libunwind dylib usr lib system libunwind dylib com apple webkit system library frameworks webkit framework versions a webkit com apple sharing system library privateframeworks sharing framework versions a sharing com apple rawcamera bundle system library coreservices rawcamera bundle contents macos rawcamera libauto dylib usr lib libauto dylib com apple opencl system library frameworks opencl framework versions a opencl com apple coresymbolication system library privateframeworks coresymbolication framework versions a coresymbolication com apple generationalstorage system library privateframeworks generationalstorage framework versions a generationalstorage libsystem notify dylib usr lib system libsystem notify dylib com apple protectedcloudstorage system library privateframeworks protectedcloudstorage framework versions a protectedcloudstorage com apple openscripting system library frameworks carbon framework versions a frameworks openscripting framework versions a openscripting libremovefile dylib usr lib system libremovefile dylib com apple vision facecore system library privateframeworks facecore framework versions a facecore libpam dylib usr lib libpam dylib com apple applejpeg system library privateframeworks applejpeg framework versions a applejpeg libheimdal dylib usr lib libheimdal dylib com apple hiservices system library frameworks applicationservices framework versions a frameworks 
hiservices framework versions a hiservices com apple helpdata system library privateframeworks helpdata framework versions a helpdata libcmph dylib usr lib libcmph dylib libsystem stats dylib usr lib system libsystem stats dylib com apple commonpanels system library frameworks carbon framework versions a frameworks commonpanels framework versions a commonpanels com apple dictionaryservices system library frameworks coreservices framework versions a frameworks dictionaryservices framework versions a dictionaryservices com apple corevideo system library frameworks corevideo framework versions a corevideo sample analysis of process written to file dev stdout
0
90,714
11,426,518,704
IssuesEvent
2020-02-03 22:07:39
worldbank/d4di
https://api.github.com/repos/worldbank/d4di
closed
Ch3: clarify cross sectional time dimension
Ch 3: Designing Research methods Minor/Clarify/Explain ok2go👍🏼
Explain time dimension more clearly, i.e. that in a repeated cross section the sample is re-drawn each round (so the same individuals are not necessarily revisited and an individual-level panel would not be appropriate, even if there is overlap in the samples) https://github.com/worldbank/d4di/blob/a9c4d76da0787053a59fee0c928001e04df648bc/chapters/research-design.tex#L120
1.0
Ch3: clarify cross sectional time dimension - Explain time dimension more clearly, i.e. that in a repeated cross section the sample is re-drawn each round (so the same individuals are not necessarily revisited and an individual-level panel would not be appropriate, even if there is overlap in the samples) https://github.com/worldbank/d4di/blob/a9c4d76da0787053a59fee0c928001e04df648bc/chapters/research-design.tex#L120
non_process
clarify cross sectional time dimension explain time dimension more clearly i e that in a repeated cross section the sample is re drawn each round so the same individuals are not necessarily revisited and an individual level panel would not be appropriate even if there is overlap in the samples
0
17,427
23,246,622,684
IssuesEvent
2022-08-03 20:53:02
MicrosoftDocs/azure-devops-docs
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
closed
Branch control - Verify branch protection
doc-enhancement devops/prod Pri2 devops-cicd-process/tech
[Enter feedback here] Can you please explain better what the "Verify branch protection" is for? Maybe with some practical examples? --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: b067a175-f640-7503-9c1e-f0130c6dbeda * Version Independent ID: ff743c7b-a103-eae6-4478-62ba995a4b36 * Content: [Pipeline deployment approvals - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/approvals?view=azure-devops&tabs=check-pass#branch-control) * Content Source: [docs/pipelines/process/approvals.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/approvals.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @shashban * Microsoft Alias: **shashban**
1.0
Branch control - Verify branch protection - [Enter feedback here] Can you please explain better what the "Verify branch protection" is for? Maybe with some practical examples? --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: b067a175-f640-7503-9c1e-f0130c6dbeda * Version Independent ID: ff743c7b-a103-eae6-4478-62ba995a4b36 * Content: [Pipeline deployment approvals - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/approvals?view=azure-devops&tabs=check-pass#branch-control) * Content Source: [docs/pipelines/process/approvals.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/approvals.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @shashban * Microsoft Alias: **shashban**
process
branch control verify branch protection can you please explain better what the verify branch protection is for maybe with some practical examples document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login shashban microsoft alias shashban
1
64,638
7,822,307,895
IssuesEvent
2018-06-14 01:41:10
monero-project/kovri
https://api.github.com/repos/monero-project/kovri
closed
'kovri -h' prints 3 error lines at end of output
design maintenance minor
--- **By submitting this issue, I confirm the following:** - I have read and understood the developer guide in [kovri-docs](https://github.com/monero-project/kovri-docs). - I have checked that the issue I am reporting can be replicated or that the feature I am suggesting is not present. - I have checked opened or recently closed [pull requests](https://github.com/monero-project/kovri/pulls) for existing solutions/implementations to my issue/suggestion. --- Here's my output: > [2018-06-12 01:10:03.077251] [0x000072484dc80780] [info] > > system: --host arg (=127.0.0.1) -p [ --port ] arg (=0) --data-dir path (=/root/.kovri) -s [ --service ] arg -d [ --enable-daemon ] --disable-console-log --disable-file-log --disable-color-log --enable-auto-flush-log --log-level arg (=3) --log-file-name path -c [ --kovriconf ] path -t [ --tunnelsconf ] path > > > network: --enable-ipv6 --enable-floodfill -b [ --bandwidth ] arg (=L) --disable-ssu --disable-ntcp -r [ --reseed-from ] arg --disable-https --disable-su3-verification > > > client: --httpproxyport arg (=4446) --httpproxyaddress arg (=127.0.0.1) --socksproxyport arg (=4447) --socksproxyaddress arg (=127.0.0.1) --proxykeys arg --i2pcontrolport arg (=0) --i2pcontroladdress arg (=127.0.0.1) --i2pcontrolpassword arg (=itoopie) > > [2018-06-12 01:10:03.077350] [0x000072484dc80780] [error] Configuration: standard exception: 'for more details, see user-guide or config file' [2018-06-12 01:10:03.077365] [0x000072484dc80780] [error] Instance: standard exception: 'for more details, see user-guide or config file' [2018-06-12 01:10:03.077383] [0x000072484dc80780] [error] DaemonSingleton: Configure: standard exception: 'for more details, see user-guide or config file' From IRC: <anonimal> Those are just ctor function try blocks throwing because a real option was called. Help is thrown in with any invalid command, that could be cleaned up
1.0
'kovri -h' prints 3 error lines at end of output - --- **By submitting this issue, I confirm the following:** - I have read and understood the developer guide in [kovri-docs](https://github.com/monero-project/kovri-docs). - I have checked that the issue I am reporting can be replicated or that the feature I am suggesting is not present. - I have checked opened or recently closed [pull requests](https://github.com/monero-project/kovri/pulls) for existing solutions/implementations to my issue/suggestion. --- Here's my output: > [2018-06-12 01:10:03.077251] [0x000072484dc80780] [info] > > system: --host arg (=127.0.0.1) -p [ --port ] arg (=0) --data-dir path (=/root/.kovri) -s [ --service ] arg -d [ --enable-daemon ] --disable-console-log --disable-file-log --disable-color-log --enable-auto-flush-log --log-level arg (=3) --log-file-name path -c [ --kovriconf ] path -t [ --tunnelsconf ] path > > > network: --enable-ipv6 --enable-floodfill -b [ --bandwidth ] arg (=L) --disable-ssu --disable-ntcp -r [ --reseed-from ] arg --disable-https --disable-su3-verification > > > client: --httpproxyport arg (=4446) --httpproxyaddress arg (=127.0.0.1) --socksproxyport arg (=4447) --socksproxyaddress arg (=127.0.0.1) --proxykeys arg --i2pcontrolport arg (=0) --i2pcontroladdress arg (=127.0.0.1) --i2pcontrolpassword arg (=itoopie) > > [2018-06-12 01:10:03.077350] [0x000072484dc80780] [error] Configuration: standard exception: 'for more details, see user-guide or config file' [2018-06-12 01:10:03.077365] [0x000072484dc80780] [error] Instance: standard exception: 'for more details, see user-guide or config file' [2018-06-12 01:10:03.077383] [0x000072484dc80780] [error] DaemonSingleton: Configure: standard exception: 'for more details, see user-guide or config file' From IRC: <anonimal> Those are just ctor function try blocks throwing because a real option was called. Help is thrown in with any invalid command, that could be cleaned up
non_process
kovri h prints error lines at end of output by submitting this issue i confirm the following i have read and understood the developer guide in i have checked that the issue i am reporting can be replicated or that the feature i am suggesting is not present i have checked opened or recently closed for existing solutions implementations to my issue suggestion here s my output system host arg p arg data dir path root kovri s arg d disable console log disable file log disable color log enable auto flush log log level arg log file name path c path t path network enable enable floodfill b arg l disable ssu disable ntcp r arg disable https disable verification client httpproxyport arg httpproxyaddress arg socksproxyport arg socksproxyaddress arg proxykeys arg arg arg arg itoopie configuration standard exception for more details see user guide or config file instance standard exception for more details see user guide or config file daemonsingleton configure standard exception for more details see user guide or config file from irc those are just ctor function try blocks throwing because a real option was called help is thrown in with any invalid command that could be cleaned up
0
31,910
26,234,747,673
IssuesEvent
2023-01-05 05:45:09
AvaloniaUI/Avalonia
https://api.github.com/repos/AvaloniaUI/Avalonia
closed
what is AggregatePackage.NuGet.Sdk used for?
enhancement infrastructure
i noticed https://github.com/jkoritzinsky/AggregatePackage.NuGet.Sdk is now archived and https://www.nuget.org/packages/AggregatePackage.NuGet.Sdk is unlisted. is it still required? removing it seems to still build fine
1.0
what is AggregatePackage.NuGet.Sdk used for? - i noticed https://github.com/jkoritzinsky/AggregatePackage.NuGet.Sdk is now archived and https://www.nuget.org/packages/AggregatePackage.NuGet.Sdk is unlisted. is it still required? removing it seems to still build fine
non_process
what is aggregatepackage nuget sdk used for i noticed is now archived and is unlisted is it still required removing it seems to still build fine
0
21,252
28,376,305,324
IssuesEvent
2023-04-12 21:09:16
metabase/metabase
https://api.github.com/repos/metabase/metabase
closed
[MLv2] [Bug] `:percentile` column name calculation broken
Type:Bug .Backend .metabase-lib .Team/QueryProcessor :hammer_and_wrench:
The fix itself is trivial, `%s` instead of `%d` in the format string, but we really need some tests around this. ```clj clojure.lang.ExceptionInfo: Error calculating column name for [:percentile #:lib{:uuid "62c4dd0b-7db4-4666-9796-62ca15c91c27"} [:field #:lib{:uuid "cdc94b9f-ba6b-413f-916a-ee5a19221ecf"} 133853] 0.9]: d != java.lang.Double ```
1.0
[MLv2] [Bug] `:percentile` column name calculation broken - The fix itself is trivial, `%s` instead of `%d` in the format string, but we really need some tests around this. ```clj clojure.lang.ExceptionInfo: Error calculating column name for [:percentile #:lib{:uuid "62c4dd0b-7db4-4666-9796-62ca15c91c27"} [:field #:lib{:uuid "cdc94b9f-ba6b-413f-916a-ee5a19221ecf"} 133853] 0.9]: d != java.lang.Double ```
process
percentile column name calculation broken the fix itself is trivial s instead of d in the format string but we really need some tests around this clj clojure lang exceptioninfo error calculating column name for percentile lib uuid d java lang double
1
6,946
10,113,064,902
IssuesEvent
2019-07-30 15:53:32
material-components/material-components-ios
https://api.github.com/repos/material-components/material-components-ios
closed
Perform a heavy-duty instrumentation pass on the catalog and demo apps
app:Catalog app:Pesto app:Shrine good first issue skill:Instruments type:Process
### Context We want our components to consume as few cycles as is reasonably possible so that applications have more compute time per screen refresh. ### Potential work ahead For each of our applications: - Deploy the app to a device and run the CPU profiling instrument. - Navigate through the entire application. - Check the profiling results and identify any surprising metrics. - File bugs identifying the performance problem. <!-- Auto-generated content below, do not modify --> --- #### Internal data - Associated internal bug: [b/117179530](http://b/117179530)
1.0
Perform a heavy-duty instrumentation pass on the catalog and demo apps - ### Context We want our components to consume as few cycles as is reasonably possible so that applications have more compute time per screen refresh. ### Potential work ahead For each of our applications: - Deploy the app to a device and run the CPU profiling instrument. - Navigate through the entire application. - Check the profiling results and identify any surprising metrics. - File bugs identifying the performance problem. <!-- Auto-generated content below, do not modify --> --- #### Internal data - Associated internal bug: [b/117179530](http://b/117179530)
process
perform a heavy duty instrumentation pass on the catalog and demo apps context we want our components to consume as few cycles as is reasonably possible so that applications have more compute time per screen refresh potential work ahead for each of our applications deploy the app to a device and run the cpu profiling instrument navigate through the entire application check the profiling results and identify any surprising metrics file bugs identifying the performance problem internal data associated internal bug
1
19,580
25,904,651,154
IssuesEvent
2022-12-15 09:17:01
alphagov/govuk-design-system
https://api.github.com/repos/alphagov/govuk-design-system
opened
Move the team sprint board to the new GitHub projects feature
🕔 days process
## What Github released [a new version of their projects feature](https://docs.github.com/en/issues/planning-and-tracking-with-projects/learning-about-projects/about-projects) earlier this year. We want to move out sprint board to this new version. At the same time, we should consider reviewing and simplifying our labelling system. ## Why The iterated projects feature better meets our needs, for example, it has a built in analytics which can help us measure our sprints better. ## Who needs to work on this Kelly ## Who needs to review this The whole team ## Done when - [ ] Agree what transitions from current board to new (eg review backlog items) - [ ] Investigate most appropriate layout - [ ] Review labelling system - [ ] Create new board - [ ] Review from team - [ ] Close the original sprint board
1.0
Move the team sprint board to the new GitHub projects feature - ## What Github released [a new version of their projects feature](https://docs.github.com/en/issues/planning-and-tracking-with-projects/learning-about-projects/about-projects) earlier this year. We want to move out sprint board to this new version. At the same time, we should consider reviewing and simplifying our labelling system. ## Why The iterated projects feature better meets our needs, for example, it has a built in analytics which can help us measure our sprints better. ## Who needs to work on this Kelly ## Who needs to review this The whole team ## Done when - [ ] Agree what transitions from current board to new (eg review backlog items) - [ ] Investigate most appropriate layout - [ ] Review labelling system - [ ] Create new board - [ ] Review from team - [ ] Close the original sprint board
process
move the team sprint board to the new github projects feature what github released earlier this year we want to move out sprint board to this new version at the same time we should consider reviewing and simplifying our labelling system why the iterated projects feature better meets our needs for example it has a built in analytics which can help us measure our sprints better who needs to work on this kelly who needs to review this the whole team done when agree what transitions from current board to new eg review backlog items investigate most appropriate layout review labelling system create new board review from team close the original sprint board
1
104,764
9,008,779,146
IssuesEvent
2019-02-05 06:08:30
chainer/chainer
https://api.github.com/repos/chainer/chainer
closed
Reconsider using mypy in CI
cat:test
It requires manual type annotations in the code, which is difficult to understand/write for external contributors and make the code less readable. We decided to stop checking annotations with mypy in CIs while optionally allowing type hints on function signatures and attributes. That would allow users to type-check their code by using mypy. However one concern remains: if Chainer lacks annotations in its internal code, that might cause errors in mypy checks external users would run.
1.0
Reconsider using mypy in CI - It requires manual type annotations in the code, which is difficult to understand/write for external contributors and make the code less readable. We decided to stop checking annotations with mypy in CIs while optionally allowing type hints on function signatures and attributes. That would allow users to type-check their code by using mypy. However one concern remains: if Chainer lacks annotations in its internal code, that might cause errors in mypy checks external users would run.
non_process
reconsider using mypy in ci it requires manual type annotations in the code which is difficult to understand write for external contributors and make the code less readable we decided to stop checking annotations with mypy in cis while optionally allowing type hints on function signatures and attributes that would allow users to type check their code by using mypy however one concern remains if chainer lacks annotations in its internal code that might cause errors in mypy checks external users would run
0
7,186
10,325,420,497
IssuesEvent
2019-09-01 17:11:25
microsoft/LightGBM
https://api.github.com/repos/microsoft/LightGBM
closed
[R-package] Add pkgdown documentation support
feature-request help wanted in-process r-package
I will be adding [pkgdown](https://github.com/r-lib/pkgdown) support to the documentation of LightGBM. You can check the branch here: https://github.com/Laurae2/LightGBM/tree/pkgdown To add support for pkgdown, it will require the following tasks: - [x] Find hosting for the pkgdown documentation (I'll host it, max number of simultaneous users basically infinite thanks to static site hosted on CDNs) - [x] Add badges to R README.md - [x] Change a bit the README.md documentation - [x] Add Authors@R field in DESCRIPTION (@guolinke (aut+cre) + @Laurae2 (ctb) + @yanyachen (ctb) ) - [x] Fix Author field in DESCRIPTION - [x] Move `Matrix` and `methods` to Depends instead of Imports (fixes many instances of example crashes) - [x] Fix all instances of crashes in function examples - [x] Update date of DESCRIPTION - [x] Regenerate the whole documentation from scratch - [x] Run successfully `pkgdown::build_site()` on the R-package folder without vignettes - [x] Convert demos to vignettes - [x] Find workarounds for all function examples which crashes and cannot have a direct or immediate workaround - [x] Fix all instances of crashes in vignettes - [x] Find workarounds for all vignettes which crashes and cannot have a direct or immediate workaround - [x] Run successfully `pkgdown::build_site()` on the R-package folder with vignettes Extras: - [ ] Add Travis support for running pkgdown so it can perform all examples / vignettes test in one straight command: `pkgdown::build_site(run_dont_run = TRUE)` (no idea how to do it, if someone can contribute to do it for us that would be great...) 
I'm sharing my setup for running `pkgdown` quickly: ```r # Load useful libraries for development library(devtools) # install.packages("devtools") library(roxygen2) # devtools::install_github("klutometis/roxygen") library(pkgdown) # devtools::install_github("hadley/pkgdown") # Set the working directory to where I am setwd("E:/GitHub/LightGBM/R-package") # Install package install() # Generate documentation document() # Check for errors # devtools::check(document = FALSE) # fails/crashes on LightGBM # Build static website pkgdown::build_site(run_dont_run = TRUE) # currently fails/crashes # Remember to cleanup R-package/src folder after testing (keep only install.libs.R, Makevars, Makevars.win) # Remember to delete R-package/docs after testing ``` It will look like this: ![image](https://user-images.githubusercontent.com/9083669/34339912-0bb64244-e97b-11e7-81e2-1185b3b968b2.png) Too much advanced example, gives a good baseline of what we can do with `pkgdown`: https://keras.rstudio.com/index.html ![image](https://user-images.githubusercontent.com/9083669/34339922-34ed92b6-e97b-11e7-9119-ff326a2e0932.png) ![image](https://user-images.githubusercontent.com/9083669/34339927-3df14f56-e97b-11e7-8c77-c7fb2a561233.png) Another (very simplified) example (live): https://laurae2.github.io/LauraeDS/
1.0
[R-package] Add pkgdown documentation support - I will be adding [pkgdown](https://github.com/r-lib/pkgdown) support to the documentation of LightGBM. You can check the branch here: https://github.com/Laurae2/LightGBM/tree/pkgdown To add support for pkgdown, it will require the following tasks: - [x] Find hosting for the pkgdown documentation (I'll host it, max number of simultaneous users basically infinite thanks to static site hosted on CDNs) - [x] Add badges to R README.md - [x] Change a bit the README.md documentation - [x] Add Authors@R field in DESCRIPTION (@guolinke (aut+cre) + @Laurae2 (ctb) + @yanyachen (ctb) ) - [x] Fix Author field in DESCRIPTION - [x] Move `Matrix` and `methods` to Depends instead of Imports (fixes many instances of example crashes) - [x] Fix all instances of crashes in function examples - [x] Update date of DESCRIPTION - [x] Regenerate the whole documentation from scratch - [x] Run successfully `pkgdown::build_site()` on the R-package folder without vignettes - [x] Convert demos to vignettes - [x] Find workarounds for all function examples which crashes and cannot have a direct or immediate workaround - [x] Fix all instances of crashes in vignettes - [x] Find workarounds for all vignettes which crashes and cannot have a direct or immediate workaround - [x] Run successfully `pkgdown::build_site()` on the R-package folder with vignettes Extras: - [ ] Add Travis support for running pkgdown so it can perform all examples / vignettes test in one straight command: `pkgdown::build_site(run_dont_run = TRUE)` (no idea how to do it, if someone can contribute to do it for us that would be great...) 
I'm sharing my setup for running `pkgdown` quickly: ```r # Load useful libraries for development library(devtools) # install.packages("devtools") library(roxygen2) # devtools::install_github("klutometis/roxygen") library(pkgdown) # devtools::install_github("hadley/pkgdown") # Set the working directory to where I am setwd("E:/GitHub/LightGBM/R-package") # Install package install() # Generate documentation document() # Check for errors # devtools::check(document = FALSE) # fails/crashes on LightGBM # Build static website pkgdown::build_site(run_dont_run = TRUE) # currently fails/crashes # Remember to cleanup R-package/src folder after testing (keep only install.libs.R, Makevars, Makevars.win) # Remember to delete R-package/docs after testing ``` It will look like this: ![image](https://user-images.githubusercontent.com/9083669/34339912-0bb64244-e97b-11e7-81e2-1185b3b968b2.png) Too much advanced example, gives a good baseline of what we can do with `pkgdown`: https://keras.rstudio.com/index.html ![image](https://user-images.githubusercontent.com/9083669/34339922-34ed92b6-e97b-11e7-9119-ff326a2e0932.png) ![image](https://user-images.githubusercontent.com/9083669/34339927-3df14f56-e97b-11e7-8c77-c7fb2a561233.png) Another (very simplified) example (live): https://laurae2.github.io/LauraeDS/
process
add pkgdown documentation support i will be adding support to the documentation of lightgbm you can check the branch here to add support for pkgdown it will require the following tasks find hosting for the pkgdown documentation i ll host it max number of simultaneous users basically infinite thanks to static site hosted on cdns add badges to r readme md change a bit the readme md documentation add authors r field in description guolinke aut cre ctb yanyachen ctb fix author field in description move matrix and methods to depends instead of imports fixes many instances of example crashes fix all instances of crashes in function examples update date of description regenerate the whole documentation from scratch run successfully pkgdown build site on the r package folder without vignettes convert demos to vignettes find workarounds for all function examples which crashes and cannot have a direct or immediate workaround fix all instances of crashes in vignettes find workarounds for all vignettes which crashes and cannot have a direct or immediate workaround run successfully pkgdown build site on the r package folder with vignettes extras add travis support for running pkgdown so it can perform all examples vignettes test in one straight command pkgdown build site run dont run true no idea how to do it if someone can contribute to do it for us that would be great i m sharing my setup for running pkgdown quickly r load useful libraries for development library devtools install packages devtools library devtools install github klutometis roxygen library pkgdown devtools install github hadley pkgdown set the working directory to where i am setwd e github lightgbm r package install package install generate documentation document check for errors devtools check document false fails crashes on lightgbm build static website pkgdown build site run dont run true currently fails crashes remember to cleanup r package src folder after testing keep only install libs r makevars 
makevars win remember to delete r package docs after testing it will look like this too much advanced example gives a good baseline of what we can do with pkgdown another very simplified example live
1
10,644
13,446,201,814
IssuesEvent
2020-09-08 12:37:32
MHRA/products
https://api.github.com/repos/MHRA/products
closed
PARs - Upload Review screen
EPIC - PARs process
There is already a stub Review screen which needs to be updated to list all of the fields the user has entered as per the designs: https://app.zeplin.io/project/5dd51ae21205c944f8c1d35b/screen/5ebbfcb87b423d17aae887f3 Acceptance Criteria: [ ] Change link on the review page for PARs upload, takes you back to the previous page.
1.0
PARs - Upload Review screen - There is already a stub Review screen which needs to be updated to list all of the fields the user has entered as per the designs: https://app.zeplin.io/project/5dd51ae21205c944f8c1d35b/screen/5ebbfcb87b423d17aae887f3 Acceptance Criteria: [ ] Change link on the review page for PARs upload, takes you back to the previous page.
process
pars upload review screen there is already a stub review screen which needs to be updated to list all of the fields the user has entered as per the designs acceptance criteria change link on the review page for pars upload takes you back to the previous page
1
15,672
19,847,391,882
IssuesEvent
2022-01-21 08:25:00
ooi-data/CE07SHSM-SBD12-04-PCO2AA000-recovered_host-pco2a_a_dcl_instrument_air_recovered
https://api.github.com/repos/ooi-data/CE07SHSM-SBD12-04-PCO2AA000-recovered_host-pco2a_a_dcl_instrument_air_recovered
opened
🛑 Processing failed: ValueError
process
## Overview `ValueError` found in `processing_task` task during run ended on 2022-01-21T08:25:00.282997. ## Details Flow name: `CE07SHSM-SBD12-04-PCO2AA000-recovered_host-pco2a_a_dcl_instrument_air_recovered` Task name: `processing_task` Error type: `ValueError` Error message: not enough values to unpack (expected 3, got 0) <details> <summary>Traceback</summary> ``` Traceback (most recent call last): File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/pipeline.py", line 165, in processing final_path = finalize_data_stream( File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 84, in finalize_data_stream append_to_zarr(mod_ds, final_store, enc, logger=logger) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 357, in append_to_zarr _append_zarr(store, mod_ds) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/utils.py", line 187, in _append_zarr existing_arr.append(var_data.values) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/variable.py", line 519, in values return _as_array_or_item(self._data) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/variable.py", line 259, in _as_array_or_item data = np.asarray(data) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/array/core.py", line 1541, in __array__ x = self.compute() File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/base.py", line 288, in compute (result,) = compute(self, traverse=False, **kwargs) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/base.py", line 571, in compute results = schedule(dsk, keys, **kwargs) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/threaded.py", line 79, in get results = get_async( File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py", line 507, in get_async raise_exception(exc, tb) File 
"/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py", line 315, in reraise raise exc File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py", line 220, in execute_task result = _execute_task(task, data) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/core.py", line 119, in _execute_task return func(*(_execute_task(a, cache) for a in args)) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/array/core.py", line 116, in getter c = np.asarray(c) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 357, in __array__ return np.asarray(self.array, dtype=dtype) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 551, in __array__ self._ensure_cached() File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 548, in _ensure_cached self.array = NumpyIndexingAdapter(np.asarray(self.array)) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 521, in __array__ return np.asarray(self.array, dtype=dtype) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 422, in __array__ return np.asarray(array[self.key], dtype=None) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/backends/zarr.py", line 73, in __getitem__ return array[key.tuple] File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 673, in __getitem__ return self.get_basic_selection(selection, fields=fields) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 798, in get_basic_selection return self._get_basic_selection_nd(selection=selection, out=out, File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 841, in _get_basic_selection_nd return self._get_selection(indexer=indexer, out=out, fields=fields) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1135, in 
_get_selection lchunk_coords, lchunk_selection, lout_selection = zip(*indexer) ValueError: not enough values to unpack (expected 3, got 0) ``` </details>
1.0
🛑 Processing failed: ValueError - ## Overview `ValueError` found in `processing_task` task during run ended on 2022-01-21T08:25:00.282997. ## Details Flow name: `CE07SHSM-SBD12-04-PCO2AA000-recovered_host-pco2a_a_dcl_instrument_air_recovered` Task name: `processing_task` Error type: `ValueError` Error message: not enough values to unpack (expected 3, got 0) <details> <summary>Traceback</summary> ``` Traceback (most recent call last): File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/pipeline.py", line 165, in processing final_path = finalize_data_stream( File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 84, in finalize_data_stream append_to_zarr(mod_ds, final_store, enc, logger=logger) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 357, in append_to_zarr _append_zarr(store, mod_ds) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/utils.py", line 187, in _append_zarr existing_arr.append(var_data.values) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/variable.py", line 519, in values return _as_array_or_item(self._data) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/variable.py", line 259, in _as_array_or_item data = np.asarray(data) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/array/core.py", line 1541, in __array__ x = self.compute() File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/base.py", line 288, in compute (result,) = compute(self, traverse=False, **kwargs) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/base.py", line 571, in compute results = schedule(dsk, keys, **kwargs) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/threaded.py", line 79, in get results = get_async( File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py", line 507, in get_async 
raise_exception(exc, tb) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py", line 315, in reraise raise exc File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py", line 220, in execute_task result = _execute_task(task, data) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/core.py", line 119, in _execute_task return func(*(_execute_task(a, cache) for a in args)) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/array/core.py", line 116, in getter c = np.asarray(c) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 357, in __array__ return np.asarray(self.array, dtype=dtype) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 551, in __array__ self._ensure_cached() File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 548, in _ensure_cached self.array = NumpyIndexingAdapter(np.asarray(self.array)) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 521, in __array__ return np.asarray(self.array, dtype=dtype) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 422, in __array__ return np.asarray(array[self.key], dtype=None) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/backends/zarr.py", line 73, in __getitem__ return array[key.tuple] File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 673, in __getitem__ return self.get_basic_selection(selection, fields=fields) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 798, in get_basic_selection return self._get_basic_selection_nd(selection=selection, out=out, File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 841, in _get_basic_selection_nd return self._get_selection(indexer=indexer, out=out, fields=fields) File 
"/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1135, in _get_selection lchunk_coords, lchunk_selection, lout_selection = zip(*indexer) ValueError: not enough values to unpack (expected 3, got 0) ``` </details>
process
🛑 processing failed valueerror overview valueerror found in processing task task during run ended on details flow name recovered host a dcl instrument air recovered task name processing task error type valueerror error message not enough values to unpack expected got traceback traceback most recent call last file srv conda envs notebook lib site packages ooi harvester processor pipeline py line in processing final path finalize data stream file srv conda envs notebook lib site packages ooi harvester processor init py line in finalize data stream append to zarr mod ds final store enc logger logger file srv conda envs notebook lib site packages ooi harvester processor init py line in append to zarr append zarr store mod ds file srv conda envs notebook lib site packages ooi harvester processor utils py line in append zarr existing arr append var data values file srv conda envs notebook lib site packages xarray core variable py line in values return as array or item self data file srv conda envs notebook lib site packages xarray core variable py line in as array or item data np asarray data file srv conda envs notebook lib site packages dask array core py line in array x self compute file srv conda envs notebook lib site packages dask base py line in compute result compute self traverse false kwargs file srv conda envs notebook lib site packages dask base py line in compute results schedule dsk keys kwargs file srv conda envs notebook lib site packages dask threaded py line in get results get async file srv conda envs notebook lib site packages dask local py line in get async raise exception exc tb file srv conda envs notebook lib site packages dask local py line in reraise raise exc file srv conda envs notebook lib site packages dask local py line in execute task result execute task task data file srv conda envs notebook lib site packages dask core py line in execute task return func execute task a cache for a in args file srv conda envs notebook lib site packages 
dask array core py line in getter c np asarray c file srv conda envs notebook lib site packages xarray core indexing py line in array return np asarray self array dtype dtype file srv conda envs notebook lib site packages xarray core indexing py line in array self ensure cached file srv conda envs notebook lib site packages xarray core indexing py line in ensure cached self array numpyindexingadapter np asarray self array file srv conda envs notebook lib site packages xarray core indexing py line in array return np asarray self array dtype dtype file srv conda envs notebook lib site packages xarray core indexing py line in array return np asarray array dtype none file srv conda envs notebook lib site packages xarray backends zarr py line in getitem return array file srv conda envs notebook lib site packages zarr core py line in getitem return self get basic selection selection fields fields file srv conda envs notebook lib site packages zarr core py line in get basic selection return self get basic selection nd selection selection out out file srv conda envs notebook lib site packages zarr core py line in get basic selection nd return self get selection indexer indexer out out fields fields file srv conda envs notebook lib site packages zarr core py line in get selection lchunk coords lchunk selection lout selection zip indexer valueerror not enough values to unpack expected got
1
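The `ValueError` in the traceback above comes from the line `lchunk_coords, lchunk_selection, lout_selection = zip(*indexer)`: unpacking `zip(*indexer)` into three names fails with "not enough values to unpack (expected 3, got 0)" whenever the indexer yields no chunk projections at all. A minimal sketch (using only Python builtins; the function names are illustrative, not zarr's actual API) reproduces the failure and one possible guard:

```python
def unpack_indexer(indexer):
    """Mirror zarr's zip(*indexer) transpose; raises ValueError on empty input."""
    # Each item of `indexer` is a 3-tuple (chunk_coords, chunk_selection, out_selection);
    # zip(*indexer) transposes the list of 3-tuples into three tuples.
    lchunk_coords, lchunk_selection, lout_selection = zip(*indexer)
    return lchunk_coords, lchunk_selection, lout_selection


def unpack_indexer_safe(indexer):
    """Guarded variant: returns three empty tuples instead of raising on no chunks."""
    parts = tuple(zip(*indexer))
    return parts if parts else ((), (), ())


try:
    unpack_indexer([])  # an empty selection, as in the failing append
except ValueError as err:
    print(err)  # not enough values to unpack (expected 3, got 0)
```

Whether returning empty tuples is the right fix depends on the caller; here it only shows why a zero-length selection trips this exact message.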
1,298
3,838,089,866
IssuesEvent
2016-04-02 05:28:09
dita-ot/dita-ot
https://api.github.com/repos/dita-ot/dita-ot
closed
Conref push stage issues error "null"
bug P3 preprocess preprocess/conref
I've added the following markup to garagetaskoverview.xml: ```xml <ul> <li>Start the list</li> <li id="target">target</li> <li>continue the list</li> <li conref="#taskconcept/target" conaction="pushbefore">push content</li> <li conaction="mark"></li> </ul> ``` This markup is incorrect (the conref attribute should be on the last list item). The code is supposed to generate an error, but I get this on the command prompt output: > [conref-push] null The following markup is also incorrect, but is silently ignored: ```xml <li conaction="mark"></li> <li conref="#taskconcept/target" conaction="pushbefore">push content</li> ``` The code has support for some "pushbefore" condition that will generate message DOTJ044W, but I'm not sure what the condition is (found this while trying to clarify that message and user response). The correct markup works as expected, which is: ```xml <li conaction="pushbefore">push content</li> <li conref="#taskconcept/target" conaction="mark"></li> ```
2.0
Conref push stage issues error "null" - I've added the following markup to garagetaskoverview.xml: ```xml <ul> <li>Start the list</li> <li id="target">target</li> <li>continue the list</li> <li conref="#taskconcept/target" conaction="pushbefore">push content</li> <li conaction="mark"></li> </ul> ``` This markup is incorrect (the conref attribute should be on the last list item). The code is supposed to generate an error, but I get this on the command prompt output: > [conref-push] null The following markup is also incorrect, but is silently ignored: ```xml <li conaction="mark"></li> <li conref="#taskconcept/target" conaction="pushbefore">push content</li> ``` The code has support for some "pushbefore" condition that will generate message DOTJ044W, but I'm not sure what the condition is (found this while trying to clarify that message and user response). The correct markup works as expected, which is: ```xml <li conaction="pushbefore">push content</li> <li conref="#taskconcept/target" conaction="mark"></li> ```
process
conref push stage issues error null i ve added the following markup to garagetaskoverview xml xml start the list target continue the list push content this markup is incorrect the conref attribute should be on the last list item the code is supposed to generate an error but i get this on the command prompt output null the following markup is also incorrect but is silently ignored xml push content the code has support for some pushbefore condition that will generate message but i m not sure what the condition is found this while trying to clarify that message and user response the correct markup works as expected which is xml push content
1
340,165
30,497,172,147
IssuesEvent
2023-07-18 11:41:13
KB-RolePlay/kbrp-issues
https://api.github.com/repos/KB-RolePlay/kbrp-issues
closed
Duplication bug
bug need test Urgent
### Description Being a citizen if we die with weapons and a fireman revives us we take back our weapons. In this case, as a staff, we can get our money back, so we can duplicate our weapons. But also make the same principle while being in staff on players ### Expected behaviour It would be necessary to block the fact of being able to reimburse oneself and to block the reimbursements if one is revived by a fireman ### Reproduction steps ```bash 1. to be dead with weapons 2. to be revived by firemen 3. get reimbursed ``` ### Screenshots _No response_
1.0
Duplication bug - ### Description Being a citizen if we die with weapons and a fireman revives us we take back our weapons. In this case, as a staff, we can get our money back, so we can duplicate our weapons. But also make the same principle while being in staff on players ### Expected behaviour It would be necessary to block the fact of being able to reimburse oneself and to block the reimbursements if one is revived by a fireman ### Reproduction steps ```bash 1. to be dead with weapons 2. to be revived by firemen 3. get reimbursed ``` ### Screenshots _No response_
non_process
duplication bug description being a citizen if we die with weapons and a fireman revives us we take back our weapons in this case as a staff we can get our money back so we can duplicate our weapons but also make the same principle while being in staff on players expected behaviour it would be necessary to block the fact of being able to reimburse oneself and to block the reimbursements if one is revived by a fireman reproduction steps bash to be dead with weapons to be revived by firemen get reimbursed screenshots no response
0
528,652
15,371,546,825
IssuesEvent
2021-03-02 10:08:42
HSLdevcom/bultti
https://api.github.com/repos/HSLdevcom/bultti
closed
Remove "Liikennöitsijä" dropdown from Contract page
Priority 2 enhancement
FE PR: https://github.com/HSLdevcom/bultti-ui/pull/53 Use the global liikennöitsijä selection instead. Things to consider after contractPage operator dropdown removal: * add the name of selected operator to the dropdown field? * add the name of selected operator to the page title?
1.0
Remove "Liikennöitsijä" dropdown from Contract page - FE PR: https://github.com/HSLdevcom/bultti-ui/pull/53 Use the global liikennöitsijä selection instead. Things to consider after contractPage operator dropdown removal: * add the name of selected operator to the dropdown field? * add the name of selected operator to the page title?
non_process
remove liikennöitsijä dropdown from contract page fe pr use the global liikennöitsijä selection instead things to consider after contractpage operator dropdown removal add the name of selected operator to the dropdown field add the name of selected operator to the page title
0
21,802
30,316,040,437
IssuesEvent
2023-07-10 15:38:01
h4sh5/npm-auto-scanner
https://api.github.com/repos/h4sh5/npm-auto-scanner
opened
@mongodb-js/oidc-plugin 0.2.4 has 1 guarddog issues
npm-silent-process-execution
```{"npm-silent-process-execution":[{"code":" const child = (0, child_process_1.spawn)(this.options.openBrowser.command, [options.url], {\n shell: true,\n stdio: 'ignore',\n detached: true,\n signal: this.options.openB... });","location":"package/dist/plugin.js:298","message":"This package is silently executing another executable"}]}```
1.0
@mongodb-js/oidc-plugin 0.2.4 has 1 guarddog issues - ```{"npm-silent-process-execution":[{"code":" const child = (0, child_process_1.spawn)(this.options.openBrowser.command, [options.url], {\n shell: true,\n stdio: 'ignore',\n detached: true,\n signal: this.options.openB... });","location":"package/dist/plugin.js:298","message":"This package is silently executing another executable"}]}```
process
mongodb js oidc plugin has guarddog issues npm silent process execution n shell true n stdio ignore n detached true n signal this options openb location package dist plugin js message this package is silently executing another executable
1
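The guarddog finding above flags a `child_process.spawn` call whose output is silenced (`stdio: 'ignore'`, `detached: true`). A toy heuristic in Python (an assumption for illustration only — guarddog's real `npm-silent-process-execution` rule is semgrep-based and far more precise) could look like:

```python
import re

# Naive pattern match: flag JS source that both spawns a child process
# and discards its stdio, the combination reported in the finding above.
SPAWN_RE = re.compile(r"\bspawn\s*\(")
SILENT_RE = re.compile(r"stdio\s*:\s*['\"]ignore['\"]")


def flags_silent_spawn(source: str) -> bool:
    """Return True when JS source both spawns a child and silences its stdio."""
    return bool(SPAWN_RE.search(source) and SILENT_RE.search(source))


snippet = "const child = spawn(cmd, [url], { shell: true, stdio: 'ignore', detached: true });"
print(flags_silent_spawn(snippet))  # True
```

A regex check like this is noisy (it ignores context and legitimate uses such as opening a browser, which is what the flagged plugin actually does); real scanners match on the AST instead.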
493
2,935,469,658
IssuesEvent
2015-06-30 14:39:05
neuropoly/spinalcordtoolbox
https://api.github.com/repos/neuropoly/spinalcordtoolbox
closed
wrong output location for compute CSA
bug sct_process_segmentation
Issue: https://sourceforge.net/p/spinalcordtoolbox/discussion/help/thread/97f71a4a/ Data: ~~~~ sct_issues/20150630_emil/data ~~~~ Syntax: ~~~~ ./batch_emil.sh subj1/ ~~~~
1.0
wrong output location for compute CSA - Issue: https://sourceforge.net/p/spinalcordtoolbox/discussion/help/thread/97f71a4a/ Data: ~~~~ sct_issues/20150630_emil/data ~~~~ Syntax: ~~~~ ./batch_emil.sh subj1/ ~~~~
process
wrong output location for compute csa issue data sct issues emil data syntax batch emil sh
1
5,187
7,965,607,655
IssuesEvent
2018-07-14 10:54:20
exercism/cli
https://api.github.com/repos/exercism/cli
closed
Distribute completion scripts along with executable
release-process
The completion scripts currently live in the CLI website repository, which—as @QuLogic mentions [here](https://github.com/exercism/cli-www/issues/34)—doesn't make much sense. We should update the `bin/build-all` build script to produce a directory containing: - the executable - the completion scripts - a README file that explains what to do with the executable (make sure it's in your path) and the completion scripts (pick the right one, source it in your shell config) Then tar/zip as before.
1.0
Distribute completion scripts along with executable - The completion scripts currently live in the CLI website repository, which—as @QuLogic mentions [here](https://github.com/exercism/cli-www/issues/34)—doesn't make much sense. We should update the `bin/build-all` build script to produce a directory containing: - the executable - the completion scripts - a README file that explains what to do with the executable (make sure it's in your path) and the completion scripts (pick the right one, source it in your shell config) Then tar/zip as before.
process
distribute completion scripts along with executable the completion scripts currently live in the cli website repository which—as qulogic mentions make much sense we should update the bin build all build script to produce a directory containing the executable the completion scripts a readme file that explains what to do with the executable make sure it s in your path and the completion scripts pick the right one source it in your shell config then tar zip as before
1
65,768
14,761,894,576
IssuesEvent
2021-01-09 00:53:24
AlexRogalskiy/electron-vue-template
https://api.github.com/repos/AlexRogalskiy/electron-vue-template
opened
CVE-2020-7598 (Medium) detected in minimist-0.0.8.tgz, minimist-1.2.0.tgz
security vulnerability
## CVE-2020-7598 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>minimist-0.0.8.tgz</b>, <b>minimist-1.2.0.tgz</b></p></summary> <p> <details><summary><b>minimist-0.0.8.tgz</b></p></summary> <p>parse argument options</p> <p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-0.0.8.tgz">https://registry.npmjs.org/minimist/-/minimist-0.0.8.tgz</a></p> <p>Path to dependency file: electron-vue-template/package.json</p> <p>Path to vulnerable library: electron-vue-template/node_modules/minimist/package.json</p> <p> Dependency Hierarchy: - eslint-6.8.0.tgz (Root Library) - mkdirp-0.5.1.tgz - :x: **minimist-0.0.8.tgz** (Vulnerable Library) </details> <details><summary><b>minimist-1.2.0.tgz</b></p></summary> <p>parse argument options</p> <p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz">https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz</a></p> <p>Path to dependency file: electron-vue-template/package.json</p> <p>Path to vulnerable library: electron-vue-template/node_modules/webpack-cli/node_modules/minimist/package.json</p> <p> Dependency Hierarchy: - copy-webpack-plugin-5.1.1.tgz (Root Library) - loader-utils-1.2.3.tgz - json5-1.0.1.tgz - :x: **minimist-1.2.0.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/AlexRogalskiy/electron-vue-template/commit/e180436ddc869ab181e9108f09eafef3237f5eb6">e180436ddc869ab181e9108f09eafef3237f5eb6</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> minimist before 1.2.2 could be tricked into adding or modifying properties of Object.prototype using a "constructor" or "__proto__" payload. 
<p>Publish Date: 2020-03-11 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7598>CVE-2020-7598</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/substack/minimist/commit/63e7ed05aa4b1889ec2f3b196426db4500cbda94">https://github.com/substack/minimist/commit/63e7ed05aa4b1889ec2f3b196426db4500cbda94</a></p> <p>Release Date: 2020-03-11</p> <p>Fix Resolution: minimist - 0.2.1,1.2.3</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-7598 (Medium) detected in minimist-0.0.8.tgz, minimist-1.2.0.tgz - ## CVE-2020-7598 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>minimist-0.0.8.tgz</b>, <b>minimist-1.2.0.tgz</b></p></summary> <p> <details><summary><b>minimist-0.0.8.tgz</b></p></summary> <p>parse argument options</p> <p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-0.0.8.tgz">https://registry.npmjs.org/minimist/-/minimist-0.0.8.tgz</a></p> <p>Path to dependency file: electron-vue-template/package.json</p> <p>Path to vulnerable library: electron-vue-template/node_modules/minimist/package.json</p> <p> Dependency Hierarchy: - eslint-6.8.0.tgz (Root Library) - mkdirp-0.5.1.tgz - :x: **minimist-0.0.8.tgz** (Vulnerable Library) </details> <details><summary><b>minimist-1.2.0.tgz</b></p></summary> <p>parse argument options</p> <p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz">https://registry.npmjs.org/minimist/-/minimist-1.2.0.tgz</a></p> <p>Path to dependency file: electron-vue-template/package.json</p> <p>Path to vulnerable library: electron-vue-template/node_modules/webpack-cli/node_modules/minimist/package.json</p> <p> Dependency Hierarchy: - copy-webpack-plugin-5.1.1.tgz (Root Library) - loader-utils-1.2.3.tgz - json5-1.0.1.tgz - :x: **minimist-1.2.0.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/AlexRogalskiy/electron-vue-template/commit/e180436ddc869ab181e9108f09eafef3237f5eb6">e180436ddc869ab181e9108f09eafef3237f5eb6</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> minimist before 1.2.2 could be tricked into adding or modifying properties of Object.prototype 
using a "constructor" or "__proto__" payload. <p>Publish Date: 2020-03-11 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7598>CVE-2020-7598</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/substack/minimist/commit/63e7ed05aa4b1889ec2f3b196426db4500cbda94">https://github.com/substack/minimist/commit/63e7ed05aa4b1889ec2f3b196426db4500cbda94</a></p> <p>Release Date: 2020-03-11</p> <p>Fix Resolution: minimist - 0.2.1,1.2.3</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in minimist tgz minimist tgz cve medium severity vulnerability vulnerable libraries minimist tgz minimist tgz minimist tgz parse argument options library home page a href path to dependency file electron vue template package json path to vulnerable library electron vue template node modules minimist package json dependency hierarchy eslint tgz root library mkdirp tgz x minimist tgz vulnerable library minimist tgz parse argument options library home page a href path to dependency file electron vue template package json path to vulnerable library electron vue template node modules webpack cli node modules minimist package json dependency hierarchy copy webpack plugin tgz root library loader utils tgz tgz x minimist tgz vulnerable library found in head commit a href found in base branch master vulnerability details minimist before could be tricked into adding or modifying properties of object prototype using a constructor or proto payload publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution minimist step up your open source security game with whitesource
0
360,257
25,282,707,920
IssuesEvent
2022-11-16 16:51:56
CryptoBlades/cryptoblades
https://api.github.com/repos/CryptoBlades/cryptoblades
closed
[Feature] - Issues with some of the BNB contracts
documentation enhancement
### Prerequisites - [X] I checked to make sure that this feature has not already been filed - [X] I'm reporting this information to the correct repository - [X] I understand enough about this issue to complete a comprehensive document ### Describe the feature and its requirements Greetings from BNB Chain, as per our weekly security screening, we identified the following behaviors: Smart contract: 0x4b4baeb52c1a2679e13d9d6d9680b03308216244 (Skill Staking Rewards Upgradeable 60) 0xecdc66ca26e2b90d1e10afaa162164f685f51c81 (Weapon Rename Tag Consumables) 0x5c76ea3a52c0b7c8378420f00f68089c12f4120e (Not sure what this one is) 0x4db374da614c3653ddead0cb8f96bd90c87602c1 (ValorToken) 0xc897e1dcd2e3f865dfe253dcf10c750b08ffebec (Valor Staking Rewards Upgradeable) 0xf521272c491a2ea2b613e39c6d5167e8382786ed (LP2 Staking Rewards Upgradeable Valor) 0x7f7a952b461084dced3a552ff4fbb0f1260e874d (Valor ERC20 Bridge Proxy Contract) Risk: HIGH Comment: An upgradeable proxy with an unverified logic contract Advice: Ensure proxyAdmin is a time lock or multisig. Ensure logic contract is verified. Smart contract: 0x0ccd575bf9378c06f6dca82f8122f570769f00c2 (King Token) Risk: MEDIUM Comment: Owner is an EOA, can change many system parameters Advice: Ensure privileged role is multisig or timelock [Security_Guidelines_for_Projects.pdf](https://github.com/CryptoBlades/cryptoblades/files/10004531/Security_Guidelines_for_Projects.pdf) ### Is your feature request related to an existing issue? Please describe. None ### Is there anything stopping this feature being completed? None ### Describe alternatives you've considered None ### Additional context _No response_
1.0
[Feature] - Issues with some of the BNB contracts - ### Prerequisites - [X] I checked to make sure that this feature has not already been filed - [X] I'm reporting this information to the correct repository - [X] I understand enough about this issue to complete a comprehensive document ### Describe the feature and its requirements Greetings from BNB Chain, as per our weekly security screening, we identified the following behaviors: Smart contract: 0x4b4baeb52c1a2679e13d9d6d9680b03308216244 (Skill Staking Rewards Upgradeable 60) 0xecdc66ca26e2b90d1e10afaa162164f685f51c81 (Weapon Rename Tag Consumables) 0x5c76ea3a52c0b7c8378420f00f68089c12f4120e (Not sure what this one is) 0x4db374da614c3653ddead0cb8f96bd90c87602c1 (ValorToken) 0xc897e1dcd2e3f865dfe253dcf10c750b08ffebec (Valor Staking Rewards Upgradeable) 0xf521272c491a2ea2b613e39c6d5167e8382786ed (LP2 Staking Rewards Upgradeable Valor) 0x7f7a952b461084dced3a552ff4fbb0f1260e874d (Valor ERC20 Bridge Proxy Contract) Risk: HIGH Comment: An upgradeable proxy with an unverified logic contract Advice: Ensure proxyAdmin is a time lock or multisig. Ensure logic contract is verified. Smart contract: 0x0ccd575bf9378c06f6dca82f8122f570769f00c2 (King Token) Risk: MEDIUM Comment: Owner is an EOA, can change many system parameters Advice: Ensure privileged role is multisig or timelock [Security_Guidelines_for_Projects.pdf](https://github.com/CryptoBlades/cryptoblades/files/10004531/Security_Guidelines_for_Projects.pdf) ### Is your feature request related to an existing issue? Please describe. None ### Is there anything stopping this feature being completed? None ### Describe alternatives you've considered None ### Additional context _No response_
non_process
issues with some of the bnb contracts prerequisites i checked to make sure that this feature has not already been filed i m reporting this information to the correct repository i understand enough about this issue to complete a comprehensive document describe the feature and its requirements greetings from bnb chain as per our weekly security screening we identified the following behaviors smart contract skill staking rewards upgradeable weapon rename tag consumables not sure what this one is valortoken valor staking rewards upgradeable staking rewards upgradeable valor valor bridge proxy contract risk high comment an upgradeable proxy with an unverified logic contract advice ensure proxyadmin is a time lock or multisig ensure logic contract is verified smart contract king token risk medium comment owner is an eoa can change many system parameters advice ensure privileged role is multisig or timelock is your feature request related to an existing issue please describe none is there anything stopping this feature being completed none describe alternatives you ve considered none additional context no response
0
380
2,823,565,118
IssuesEvent
2015-05-21 09:36:34
austundag/testing
https://api.github.com/repos/austundag/testing
closed
Show allergies in severity order by default
in process question
It appears that eHMP now displays them in severity order after the latest changes. Need to look in the code to verify.
1.0
Show allergies in severity order by default - It appears that eHMP now displays them in severity order after the latest changes. Need to look in the code to verify.
process
show allergies in severity order by default it appears that ehmp now displays them in severity order after the latest changes need to look in the code to verify
1
3,797
6,778,563,285
IssuesEvent
2017-10-28 12:36:51
dita-ot/dita-ot
https://api.github.com/repos/dita-ot/dita-ot
opened
Branch filter with keyref and copy-to but without href fails
bug P2 preprocess/filtering
When a branch filter branch uses a key for the topic reference and defines a copy-to, the copy and filter fails because the source topic URI is not available at this stage. ```xml <topicref keyref="using-dita-command" copy-to="using-dita-command.dita" keys="first-build-using-dita-command"> <ditavalref href="../resources/novice.ditaval"> <ditavalmeta> <dvrResourcePrefix>first-build-</dvrResourcePrefix> </ditavalmeta> </ditavalref> </topicref> ```
1.0
Branch filter with keyref and copy-to but without href fails - When a branch filter branch uses a key for the topic reference and defines a copy-to, the copy and filter fails because the source topic URI is not available at this stage. ```xml <topicref keyref="using-dita-command" copy-to="using-dita-command.dita" keys="first-build-using-dita-command"> <ditavalref href="../resources/novice.ditaval"> <ditavalmeta> <dvrResourcePrefix>first-build-</dvrResourcePrefix> </ditavalmeta> </ditavalref> </topicref> ```
process
branch filter with keyref and copy to but without href fails when a branch filter branch uses a key for the topic reference and defines a copy to the copy and filter fails because the source topic uri is not available at this stage xml first build
1
22,277
30,828,508,951
IssuesEvent
2023-08-01 22:25:32
googleapis/api-linter
https://api.github.com/repos/googleapis/api-linter
closed
ci: migrate release creation automation to release-please app
type: process
Instead of this repo using its own hand-spun release automation, it should take advantage of the [Release-Please](https://github.com/googleapis/release-please) GitHub App in this Org. This will handle: * triggering release PR creation on eligible commits * manual release triggers * tag/release creation * release note generation * semver management We should keep any release asset generation in GitHub Actions, but have them trigger on _release creation_, so that they augment the release created by Release-Please.
1.0
ci: migrate release creation automation to release-please app - Instead of this repo using its own hand-spun release automation, it should take advantage of the [Release-Please](https://github.com/googleapis/release-please) GitHub App in this Org. This will handle: * triggering release PR creation on eligible commits * manual release triggers * tag/release creation * release note generation * semver management We should keep any release asset generation in GitHub Actions, but have them trigger on _release creation_, so that they augment the release created by Release-Please.
process
ci migrate release creation automation to release please app instead of this repo using its own hand spun release automation it should take advantage of the github app in this org this will handle triggering release pr creation on eligible commits manual release triggers tag release creation release note generation semver management we should keep any release asset generation in github actions but have them trigger on release creation so that they augment the release created by release please
1
249,885
7,965,010,909
IssuesEvent
2018-07-14 01:47:20
php-censor/php-censor
https://api.github.com/repos/php-censor/php-censor
closed
Undocumented fail timeout
component:worker priority:normal type:enhancement
I have a long set of phpunit tests in my setup. One day I’ve faced a problem: after normally running about 30 minutes something happens and build corrupts: on each and every command it starts geting error message like `shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory`. Generally it looks like a project build directory has been deleted by someone. It took me a couple of days to discover the cause. In Command/RunCommand.php the is a non documented build timeout: `$timeout = Config::getInstance()->get(‘php-censor.build.failed_after’, 1800)` So by default after running 30 minutes php-censor gracefully drops build directory (without telling anything to webapp console), keeps calm and continues to build in the non-existent directory. I’d suggest the following: 1. Document this setting in app/config.yml 2. Interrupt the build process and distinctively complain to webapp colsole in case of timeout.
1.0
Undocumented fail timeout - I have a long set of phpunit tests in my setup. One day I’ve faced a problem: after normally running about 30 minutes something happens and build corrupts: on each and every command it starts geting error message like `shell-init: error retrieving current directory: getcwd: cannot access parent directories: No such file or directory`. Generally it looks like a project build directory has been deleted by someone. It took me a couple of days to discover the cause. In Command/RunCommand.php the is a non documented build timeout: `$timeout = Config::getInstance()->get(‘php-censor.build.failed_after’, 1800)` So by default after running 30 minutes php-censor gracefully drops build directory (without telling anything to webapp console), keeps calm and continues to build in the non-existent directory. I’d suggest the following: 1. Document this setting in app/config.yml 2. Interrupt the build process and distinctively complain to webapp colsole in case of timeout.
non_process
undocumented fail timeout i have a long set of phpunit tests in my setup one day i’ve faced a problem after normally running about minutes something happens and build corrupts on each and every command it starts geting error message like shell init error retrieving current directory getcwd cannot access parent directories no such file or directory generally it looks like a project build directory has been deleted by someone it took me a couple of days to discover the cause in command runcommand php the is a non documented build timeout timeout config getinstance get ‘php censor build failed after’ so by default after running minutes php censor gracefully drops build directory without telling anything to webapp console keeps calm and continues to build in the non existent directory i’d suggest the following document this setting in app config yml interrupt the build process and distinctively complain to webapp colsole in case of timeout
0
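The undocumented timeout in the record above comes from a config lookup with a hardcoded default. A minimal Python sketch of that default-fallback pattern follows; the `Config` class and key names here are illustrative stand-ins, not PHP Censor's actual implementation:

```python
# Sketch of the default-fallback lookup described in the PHP Censor record:
# Config::getInstance()->get('php-censor.build.failed_after', 1800).
# This Config class is a hypothetical stand-in for illustration only.

class Config:
    def __init__(self, settings):
        self._settings = settings  # flat dot-separated keys -> values

    def get(self, key, default=None):
        # Return the configured value, or the hardcoded default when the
        # key is absent -- this silent fallback is what made the 30-minute
        # build timeout hard to discover.
        return self._settings.get(key, default)

# No 'failed_after' key configured: the 1800 s default kicks in silently.
cfg = Config({})
print(cfg.get('php-censor.build.failed_after', 1800))  # 1800

# Documenting the key (e.g. in app/config.yml) makes the limit explicit.
cfg = Config({'php-censor.build.failed_after': 7200})
print(cfg.get('php-censor.build.failed_after', 1800))  # 7200
```

Surfacing the default in documentation, as the reporter suggests, costs nothing at runtime; the silent fallback itself is the usability problem.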
425
2,855,762,620
IssuesEvent
2015-06-02 11:34:05
genomizer/genomizer-server
https://api.github.com/repos/genomizer/genomizer-server
closed
Refactor server and clients to be able to run bowtie on a single .fastq file
BL enhancement Processing
Takes two files at the moment, which is not required; only the ratio calculation needs raw data + reference data. This concerns processing, the API and all (?) clients.
1.0
Refactor server and clients to be able to run bowtie on a single .fastq file - Takes two files at the moment, which is not required; only the ratio calculation needs raw data + reference data. This concerns processing, the API and all (?) clients.
process
refactor server and clients to be able to run bowtie on a single fastq file takes two files at the moment which is not required only ratio calculation that needs raw data reference data this concerns processing the api and all clients
1
36,948
5,096,641,005
IssuesEvent
2017-01-03 18:52:11
frontendbr/forum
https://api.github.com/repos/frontendbr/forum
reopened
Karma não roda testes no chrome
Testes [Dúvida]
Boa tarde pessoal! Estou tendo problema com o karma no trabalho, já apanhei para fazer rodá-lo com o projeto, e agora que aparentemente está tudo bem, não está tudo bem! 'haha >.< Isso aqui é o que eu tenho no terminal: ``` > karma start karma.conf.js 03 01 2017 12:28:17.095:INFO [karma]: Karma v0.13.22 server started at http://localhost:9876/ 03 01 2017 12:28:17.108:INFO [launcher]: Starting browser Chrome 03 01 2017 12:28:22.165:INFO [Chrome 55.0.2883 (Windows 10 0.0.0)]: Connected on socket T2ywhpM0JuADMo2SAAAA with id 84549006 Chrome 55.0.2883 (Windows 10 0.0.0): Executed 0 of 0 ERROR (0.008 secs / 0 secs) Chrome 55.0.2883 (Windows 10 0.0.0): Executed 0 of 0 ERROR (0.008 secs / 0 secs) npm ERR! Test failed. See above for more details. ``` Acho que o problema possa ser com o chrome, não dei downgrade nele ainda, estou testando com outros browsers, mas até o momento nada... Tomei como base este [tutorial](http://twofuckingdevelopers.com/2016/01/testing-angular-2-with-karma-and-jasmine/), então os arquivos de configuração e tal são basicamente os mesmos, só modifiquei os path. Alguém já se deparou com algum problema parecido? No terminal não tem nada de util :S
1.0
Karma não roda testes no chrome - Boa tarde pessoal! Estou tendo problema com o karma no trabalho, já apanhei para fazer rodá-lo com o projeto, e agora que aparentemente está tudo bem, não está tudo bem! 'haha >.< Isso aqui é o que eu tenho no terminal: ``` > karma start karma.conf.js 03 01 2017 12:28:17.095:INFO [karma]: Karma v0.13.22 server started at http://localhost:9876/ 03 01 2017 12:28:17.108:INFO [launcher]: Starting browser Chrome 03 01 2017 12:28:22.165:INFO [Chrome 55.0.2883 (Windows 10 0.0.0)]: Connected on socket T2ywhpM0JuADMo2SAAAA with id 84549006 Chrome 55.0.2883 (Windows 10 0.0.0): Executed 0 of 0 ERROR (0.008 secs / 0 secs) Chrome 55.0.2883 (Windows 10 0.0.0): Executed 0 of 0 ERROR (0.008 secs / 0 secs) npm ERR! Test failed. See above for more details. ``` Acho que o problema possa ser com o chrome, não dei downgrade nele ainda, estou testando com outros browsers, mas até o momento nada... Tomei como base este [tutorial](http://twofuckingdevelopers.com/2016/01/testing-angular-2-with-karma-and-jasmine/), então os arquivos de configuração e tal são basicamente os mesmos, só modifiquei os path. Alguém já se deparou com algum problema parecido? No terminal não tem nada de util :S
non_process
karma não roda testes no chrome boa tarde pessoal estou tendo problema com o karma no trabalho já apanhei para fazer rodá lo com o projeto e agora que aparentemente está tudo bem não está tudo bem haha isso aqui é o que eu tenho no terminal karma start karma conf js info karma server started at info starting browser chrome info connected on socket with id chrome windows executed of error secs secs chrome windows executed of error secs secs npm err test failed see above for more details acho que o problema possa ser com o chrome não dei downgrade nele ainda estou testando com outros browsers mas até o momento nada tomei como base este então os arquivos de configuração e tal são basicamente os mesmos só modifiquei os path alguém já se deparou com algum problema parecido no terminal não tem nada de util s
0
299,595
22,615,042,721
IssuesEvent
2022-06-29 21:01:09
trinsic-id/sdk
https://api.github.com/repos/trinsic-id/sdk
closed
Nested Code Injection Targets
documentation
- `dotnet/index.md` - **Configuration** - `testSignInAndGetInfo()` targeted, includes nested `accountServiceConstructor()`, `accountServiceSignIn()`, `accountServiceGetInfo()` - `accountServiceConstructor()` targeted in `reference/index.md`, "Using an SDK Service" section, C# tab - `accountServiceSignIn()` targeted in `account-service.md`, "Sign In" section, C# tab - `accountServiceGetInfo()` targeted in `account-service.md`, "Get Account Info" section, C# tab
1.0
Nested Code Injection Targets - - `dotnet/index.md` - **Configuration** - `testSignInAndGetInfo()` targeted, includes nested `accountServiceConstructor()`, `accountServiceSignIn()`, `accountServiceGetInfo()` - `accountServiceConstructor()` targeted in `reference/index.md`, "Using an SDK Service" section, C# tab - `accountServiceSignIn()` targeted in `account-service.md`, "Sign In" section, C# tab - `accountServiceGetInfo()` targeted in `account-service.md`, "Get Account Info" section, C# tab
non_process
nested code injection targets dotnet index md configuration testsigninandgetinfo targeted includes nested accountserviceconstructor accountservicesignin accountservicegetinfo accountserviceconstructor targeted in reference index md using an sdk service section c tab accountservicesignin targeted in account service md sign in section c tab accountservicegetinfo targeted in account service md get account info section c tab
0
199,037
22,674,229,635
IssuesEvent
2022-07-04 01:29:46
Techini/WebGoat
https://api.github.com/repos/Techini/WebGoat
closed
CVE-2021-21348 (High) detected in xstream-1.4.5.jar - autoclosed
security vulnerability
## CVE-2021-21348 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>xstream-1.4.5.jar</b></p></summary> <p>XStream is a serialization library from Java objects to XML and back.</p> <p>Path to dependency file: /webgoat-lessons/vulnerable-components/pom.xml</p> <p>Path to vulnerable library: /m2/repository/com/thoughtworks/xstream/xstream/1.4.5/xstream-1.4.5.jar,/home/wss-scanner/.m2/repository/com/thoughtworks/xstream/xstream/1.4.5/xstream-1.4.5.jar,/home/wss-scanner/.m2/repository/com/thoughtworks/xstream/xstream/1.4.5/xstream-1.4.5.jar</p> <p> Dependency Hierarchy: - :x: **xstream-1.4.5.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Techini/WebGoat/commit/d33cc0e32a0d1b949ff1b85af16890cd452276f8">d33cc0e32a0d1b949ff1b85af16890cd452276f8</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> XStream is a Java library to serialize objects to XML and back again. In XStream before version 1.4.16, there is a vulnerability which may allow a remote attacker to occupy a thread that consumes maximum CPU time and will never return. No user is affected, who followed the recommendation to setup XStream's security framework with a whitelist limited to the minimal required types. If you rely on XStream's default blacklist of the Security Framework, you will have to use at least version 1.4.16. 
<p>Publish Date: 2021-03-23 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-21348>CVE-2021-21348</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/x-stream/xstream/security/advisories/GHSA-56p8-3fh9-4cvq">https://github.com/x-stream/xstream/security/advisories/GHSA-56p8-3fh9-4cvq</a></p> <p>Release Date: 2021-03-23</p> <p>Fix Resolution: com.thoughtworks.xstream:xstream:1.4.16</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-21348 (High) detected in xstream-1.4.5.jar - autoclosed - ## CVE-2021-21348 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>xstream-1.4.5.jar</b></p></summary> <p>XStream is a serialization library from Java objects to XML and back.</p> <p>Path to dependency file: /webgoat-lessons/vulnerable-components/pom.xml</p> <p>Path to vulnerable library: /m2/repository/com/thoughtworks/xstream/xstream/1.4.5/xstream-1.4.5.jar,/home/wss-scanner/.m2/repository/com/thoughtworks/xstream/xstream/1.4.5/xstream-1.4.5.jar,/home/wss-scanner/.m2/repository/com/thoughtworks/xstream/xstream/1.4.5/xstream-1.4.5.jar</p> <p> Dependency Hierarchy: - :x: **xstream-1.4.5.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Techini/WebGoat/commit/d33cc0e32a0d1b949ff1b85af16890cd452276f8">d33cc0e32a0d1b949ff1b85af16890cd452276f8</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> XStream is a Java library to serialize objects to XML and back again. In XStream before version 1.4.16, there is a vulnerability which may allow a remote attacker to occupy a thread that consumes maximum CPU time and will never return. No user is affected, who followed the recommendation to setup XStream's security framework with a whitelist limited to the minimal required types. If you rely on XStream's default blacklist of the Security Framework, you will have to use at least version 1.4.16. 
<p>Publish Date: 2021-03-23 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-21348>CVE-2021-21348</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/x-stream/xstream/security/advisories/GHSA-56p8-3fh9-4cvq">https://github.com/x-stream/xstream/security/advisories/GHSA-56p8-3fh9-4cvq</a></p> <p>Release Date: 2021-03-23</p> <p>Fix Resolution: com.thoughtworks.xstream:xstream:1.4.16</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in xstream jar autoclosed cve high severity vulnerability vulnerable library xstream jar xstream is a serialization library from java objects to xml and back path to dependency file webgoat lessons vulnerable components pom xml path to vulnerable library repository com thoughtworks xstream xstream xstream jar home wss scanner repository com thoughtworks xstream xstream xstream jar home wss scanner repository com thoughtworks xstream xstream xstream jar dependency hierarchy x xstream jar vulnerable library found in head commit a href vulnerability details xstream is a java library to serialize objects to xml and back again in xstream before version there is a vulnerability which may allow a remote attacker to occupy a thread that consumes maximum cpu time and will never return no user is affected who followed the recommendation to setup xstream s security framework with a whitelist limited to the minimal required types if you rely on xstream s default blacklist of the security framework you will have to use at least version publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com thoughtworks xstream xstream step up your open source security game with whitesource
0
13,462
15,949,578,789
IssuesEvent
2021-04-15 07:36:48
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
Buffer processing: Inward buffer produces areas in certain conditions
Bug Processing Upstream
For most of my objects the inward buffer tool works fine with a negative distance yet for some features residuals are produced where there shouldn't be any. I don't know if this behaviour is within the specs of a negative buffer or not. There doesn't seem to be a logic behind it besides a multiple processing of the same areas by the inward buffer tool. I couldn't get rid of the problem varying any parameters besides the _distance_ parameter. The problem occurred in my appended example case only somewhere within the values between 3.9 and 4.55 m. I would expect a behaviour similar to an erosion tool. It did not happen when the feature object contained more than 4 vertex points. Is this behaviour well known, documented and accepted or am I the first one to notice? At least I couldn't find anything in the docs. Any ideas for a workaround? Thanks in advance! ![inward_buffer_bug_dialog](https://user-images.githubusercontent.com/67278094/108549582-9e162880-72ed-11eb-9f0a-c5f57e8a1dfd.PNG) ![inward_buffer_bug](https://user-images.githubusercontent.com/67278094/108549587-9fdfec00-72ed-11eb-8890-2f0b8d49f2be.PNG) QGIS version | 3.16.3-Hannover | QGIS code revision | 94ac9f21b8 -- | -- | -- | -- Compiled against Qt | 5.11.2 | Running against Qt | 5.11.2 Compiled against GDAL/OGR | 3.1.4 | Running against GDAL/OGR | 3.1.4 Compiled against GEOS | 3.8.1-CAPI-1.13.3 | Running against GEOS | 3.8.1-CAPI-1.13.3 Compiled against SQLite | 3.29.0 | Running against SQLite | 3.29.0 PostgreSQL Client Version | 11.5 | SpatiaLite Version | 4.3.0 QWT Version | 6.1.3 | QScintilla2 Version | 2.10.8 Compiled against PROJ | 6.3.2 | Running against PROJ | Rel. 6.3.2, May 1st, 2020 OS Version | Windows 10 (10.0) Active python plugins | FreehandRasterGeoreferencer; GroupStats; nominatim; QuickOSM; db_manager; MetaSearch; processing [inward_buffer_base_layer.zip](https://github.com/qgis/QGIS/files/6013494/inward_buffer_base_layer.zip)
1.0
Buffer processing: Inward buffer produces areas in certain conditions - For most of my objects the inward buffer tool works fine with a negative distance yet for some features residuals are produced where there shouldn't be any. I don't know if this behaviour is within the specs of a negative buffer or not. There doesn't seem to be a logic behind it besides a multiple processing of the same areas by the inward buffer tool. I couldn't get rid of the problem varying any parameters besides the _distance_ parameter. The problem occurred in my appended example case only somewhere within the values between 3.9 and 4.55 m. I would expect a behaviour similar to an erosion tool. It did not happen when the feature object contained more than 4 vertex points. Is this behaviour well known, documented and accepted or am I the first one to notice? At least I couldn't find anything in the docs. Any ideas for a workaround? Thanks in advance! ![inward_buffer_bug_dialog](https://user-images.githubusercontent.com/67278094/108549582-9e162880-72ed-11eb-9f0a-c5f57e8a1dfd.PNG) ![inward_buffer_bug](https://user-images.githubusercontent.com/67278094/108549587-9fdfec00-72ed-11eb-8890-2f0b8d49f2be.PNG) QGIS version | 3.16.3-Hannover | QGIS code revision | 94ac9f21b8 -- | -- | -- | -- Compiled against Qt | 5.11.2 | Running against Qt | 5.11.2 Compiled against GDAL/OGR | 3.1.4 | Running against GDAL/OGR | 3.1.4 Compiled against GEOS | 3.8.1-CAPI-1.13.3 | Running against GEOS | 3.8.1-CAPI-1.13.3 Compiled against SQLite | 3.29.0 | Running against SQLite | 3.29.0 PostgreSQL Client Version | 11.5 | SpatiaLite Version | 4.3.0 QWT Version | 6.1.3 | QScintilla2 Version | 2.10.8 Compiled against PROJ | 6.3.2 | Running against PROJ | Rel. 
6.3.2, May 1st, 2020 OS Version | Windows 10 (10.0) Active python plugins | FreehandRasterGeoreferencer; GroupStats; nominatim; QuickOSM; db_manager; MetaSearch; processing [inward_buffer_base_layer.zip](https://github.com/qgis/QGIS/files/6013494/inward_buffer_base_layer.zip)
process
buffer processing inward buffer produces areas in certain conditions for most of my objects the inward buffer tool works fine with a negative distance yet for some features residuals are produced where there shouldn t be any i don t know if this behaviour is within the specs of a negative buffer or not there doesn t seem to be a logic behind it besides a multiple processing of the same areas by the inward buffer tool i couldn t get rid of the problem varying any parameters besides the distance parameter the problem occurred in my appended example case only somewhere within the values between and m i would expect a behaviour similar to an erosion tool it did not happen when the feature object contained more than vertex points is this behaviour well known documented and accepted or am i the first one to notice at least i couldn t find anything in the docs any ideas for a workaround thanks in advance qgis version hannover qgis code revision compiled against qt running against qt compiled against gdal ogr running against gdal ogr compiled against geos capi running against geos capi compiled against sqlite running against sqlite postgresql client version spatialite version qwt version version compiled against proj running against proj rel may os version windows active python plugins freehandrastergeoreferencer groupstats nominatim quickosm db manager metasearch processing
1
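The inward-buffer behaviour in the QGIS record above is easiest to reason about on an axis-aligned rectangle, where a negative buffer of distance d shrinks each side by 2*d and a correct implementation should return an empty geometry (no residual areas) once 2*d reaches the shorter side. A small pure-Python sketch of that expectation, not QGIS/GEOS code:

```python
def inward_buffer_rect(width, height, d):
    """Erode an axis-aligned rectangle inward by distance d.

    Returns the (width, height) of the eroded rectangle, or None when
    the erosion distance consumes the rectangle entirely -- the point
    at which an inward buffer should yield an empty geometry rather
    than the residual areas described in the bug report.
    """
    w, h = width - 2 * d, height - 2 * d
    if w <= 0 or h <= 0:
        return None
    return (w, h)

print(inward_buffer_rect(10, 6, 2))  # (6, 2)
print(inward_buffer_rect(10, 6, 3))  # None -- fully eroded, nothing should remain
```

The reporter's expectation of "a behaviour similar to an erosion tool" matches this model: past the critical distance the result is simply empty.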
2,281
5,108,101,563
IssuesEvent
2017-01-05 16:44:29
jlm2017/jlm-video-subtitles
https://api.github.com/repos/jlm2017/jlm-video-subtitles
closed
[Subtitles] [FR] La revue de la semaine n°1 : pauvreté, Hayange, démocratie, Alstom, Juppé et retraites.
Language: French Process: [6] Approved
# Video title La revue de la semaine n°1 : pauvreté, Hayange, démocratie, Alstom, Juppé et retraites. # URL https://www.youtube.com/watch?v=ynfJBfJKzFw # Youtube subtitles language Français # Duration 21:16 # Subtitles URL https://www.youtube.com/timedtext_editor?tab=captions&v=ynfJBfJKzFw&ui=hd&action_mde_edit_form=1&ref=player&lang=fr&bl=vmp
1.0
[Subtitles] [FR] La revue de la semaine n°1 : pauvreté, Hayange, démocratie, Alstom, Juppé et retraites. - # Video title La revue de la semaine n°1 : pauvreté, Hayange, démocratie, Alstom, Juppé et retraites. # URL https://www.youtube.com/watch?v=ynfJBfJKzFw # Youtube subtitles language Français # Duration 21:16 # Subtitles URL https://www.youtube.com/timedtext_editor?tab=captions&v=ynfJBfJKzFw&ui=hd&action_mde_edit_form=1&ref=player&lang=fr&bl=vmp
process
la revue de la semaine n° pauvreté hayange démocratie alstom juppé et retraites video title la revue de la semaine n° pauvreté hayange démocratie alstom juppé et retraites url youtube subtitles language français duration subtitles url
1
3,548
6,587,304,221
IssuesEvent
2017-09-13 20:33:37
cptechinc/soft-6-ecomm
https://api.github.com/repos/cptechinc/soft-6-ecomm
opened
Products have many templates
PHP Processwire
The products should use only one template: product-page. Since all products through distributionPlus share the same fields in the backend, only the front end looks different per product type. So we can configure product-page to check and show different output for the following: - All products - Is a paint product - Is not a paint product We can check whether the category is set, and whether or not it is a paint category.
1.0
Products have many templates - The products should use only one template: product-page. Since all products through distributionPlus share the same fields in the backend, only the front end looks different per product type. So we can configure product-page to check and show different output for the following: - All products - Is a paint product - Is not a paint product We can check whether the category is set, and whether or not it is a paint category.
process
products have many templates the products should only use one template product page since all products through distributionplus share the same fields in the backend only the front end looks different for product type so we can configure the product page to check and show different output for the following all products is a paint product is not a paint product the way we can check if the category is set or if the category is a paint or not paint category
1
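The single-template approach described in the record above branches its output on whether a product's category is set and whether it is a paint category. An illustrative Python sketch of that branching (hypothetical field names, not the ProcessWire API):

```python
# Hypothetical sketch of one product-page template whose output varies
# by category, as described above. Field names are illustrative only.

def render_product(product):
    parts = ["common product fields"]      # shown for all products
    category = product.get("category")
    if category is None:
        return parts                       # category not set: base output only
    if category.get("is_paint"):
        parts.append("paint-specific output")
    else:
        parts.append("non-paint output")
    return parts

print(render_product({"category": {"is_paint": True}}))
# ['common product fields', 'paint-specific output']
print(render_product({}))
# ['common product fields']
```

Keeping the branch inside one template avoids duplicating the shared backend fields across several near-identical templates.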
22,378
31,142,282,536
IssuesEvent
2023-08-16 01:43:58
cypress-io/cypress
https://api.github.com/repos/cypress-io/cypress
closed
Flaky test: CypressError: `cy.task('__internal_scaffoldProject')`
process: flaky test topic: flake ❄️ stage: product backlog CT topic: scaffoldProject stale
### Link to dashboard or CircleCI failure Failures with timeouts: - https://app.circleci.com/pipelines/github/cypress-io/cypress/42187/workflows/1c55c4fe-45fc-4bde-af4c-cf01a8734c9b/jobs/1750692/tests#failed-test-0 - https://app.circleci.com/pipelines/github/cypress-io/cypress/42396/workflows/0d2b9f29-2e73-4600-8264-72e23db048ab/jobs/1761435 Failures with EEXIST: file already exists (originally noted in issue https://github.com/cypress-io/cypress/issues/23417): - https://dashboard.cypress.io/projects/ypt4pf/runs/38107/test-results/a69afa29-37b1-453f-9dbc-dd6f84c9f8a5?utm_source=github&isFlaky=%5B%7B%22value%22%3Atrue%2C%22label%22%3A%22Flaky%22%7D%5D - https://dashboard.cypress.io/projects/ypt4pf/runs/38193/test-results/ddd23eed-07a0-4a77-9630-51a29ffd2581 ### Link to failing test in GitHub Failures with ENOENT: no such file or directory: - app.circleci.com/pipelines/github/cypress-io/cypress/41309/workflows/91b07182-06e5-4d6c-88aa-8861ac4c660f/jobs/1710135 Source code for `scaffoldProject`: https://github.com/cypress-io/cypress/blob/develop/packages/frontend-shared/cypress/e2e/support/e2eSupport.ts#L183 ### Analysis We've seen this timeout a number of times in `cy.task('__internal_scaffoldProject')`: <img width="1172" alt="Screen Shot 2022-08-18 at 10 58 47 PM" src="https://user-images.githubusercontent.com/26726429/185553000-487268b1-7092-48ea-97fb-42e86bb256d5.png"> ### Cypress Version 10.5.0 ### Other _No response_
1.0
Flaky test: CypressError: `cy.task('__internal_scaffoldProject')` - ### Link to dashboard or CircleCI failure Failures with timeouts: - https://app.circleci.com/pipelines/github/cypress-io/cypress/42187/workflows/1c55c4fe-45fc-4bde-af4c-cf01a8734c9b/jobs/1750692/tests#failed-test-0 - https://app.circleci.com/pipelines/github/cypress-io/cypress/42396/workflows/0d2b9f29-2e73-4600-8264-72e23db048ab/jobs/1761435 Failures with EEXIST: file already exists (originally noted in issue https://github.com/cypress-io/cypress/issues/23417): - https://dashboard.cypress.io/projects/ypt4pf/runs/38107/test-results/a69afa29-37b1-453f-9dbc-dd6f84c9f8a5?utm_source=github&isFlaky=%5B%7B%22value%22%3Atrue%2C%22label%22%3A%22Flaky%22%7D%5D - https://dashboard.cypress.io/projects/ypt4pf/runs/38193/test-results/ddd23eed-07a0-4a77-9630-51a29ffd2581 ### Link to failing test in GitHub Failures with ENOENT: no such file or directory: - app.circleci.com/pipelines/github/cypress-io/cypress/41309/workflows/91b07182-06e5-4d6c-88aa-8861ac4c660f/jobs/1710135 Source code for `scaffoldProject`: https://github.com/cypress-io/cypress/blob/develop/packages/frontend-shared/cypress/e2e/support/e2eSupport.ts#L183 ### Analysis We've seen this timeout a number of times in `cy.task('__internal_scaffoldProject')`: <img width="1172" alt="Screen Shot 2022-08-18 at 10 58 47 PM" src="https://user-images.githubusercontent.com/26726429/185553000-487268b1-7092-48ea-97fb-42e86bb256d5.png"> ### Cypress Version 10.5.0 ### Other _No response_
process
flaky test cypresserror cy task internal scaffoldproject link to dashboard or circleci failure failures with timeouts failures with eexist file already exists originally noted in issue link to failing test in github failures with enoent no such file or directory app circleci com pipelines github cypress io cypress workflows jobs source code for scaffoldproject analysis we ve seen this timeout a number of times in cy task internal scaffoldproject img width alt screen shot at pm src cypress version other no response
1
17,588
23,408,187,486
IssuesEvent
2022-08-12 14:45:46
hashgraph/hedera-json-rpc-relay
https://api.github.com/repos/hashgraph/hedera-json-rpc-relay
closed
Add htsPrecompile acceptance tests support for fungible token allowance/approval methods
enhancement limechain P2 process
### Problem No htsPrecompile acceptance tests support for fungible token allowance/approval methods exists ### Solution Add support for the fungible token allowance/approval verifications: - approve(address token, address spender, uint256 amount) external returns (int64 responseCode) - allowance(address token, address owner, address spender) external returns (int64 responseCode, uint256 allowance) ### Alternatives _No response_
1.0
Add htsPrecompile acceptance tests support for fungible token allowance/approval methods - ### Problem No htsPrecompile acceptance tests support for fungible token allowance/approval methods exists ### Solution Add support for the fungible token allowance/approval verifications: - approve(address token, address spender, uint256 amount) external returns (int64 responseCode) - allowance(address token, address owner, address spender) external returns (int64 responseCode, uint256 allowance) ### Alternatives _No response_
process
add htsprecompile acceptance tests support for fungible token allowance approval methods problem no htsprecompile acceptance tests support for fungible token allowance approval methods exists solution add support for the fungible token allowance approval verifications approve address token address spender amount external returns responsecode allowance address token address owner address spender external returns responsecode allowance alternatives no response
1
2,757
5,682,426,194
IssuesEvent
2017-04-13 09:39:58
our-city-app/oca-backend
https://api.github.com/repos/our-city-app/oca-backend
closed
System forces the customer to have exactly X active applications.
process_wontfix type_question
https://docs.greenitglobe.com/gig/proj_actives/issues/24 --- When we have an order that includes adding the customer to other applications, for example 3, the system forces the customer to be in exactly 3 apps. It is not possible to have the app in only two apps.
1.0
System forces the customer to have exactly X active applications. - https://docs.greenitglobe.com/gig/proj_actives/issues/24 --- When we have an order that includes adding the customer to other applications, for example 3, the system forces the customer to be in exactly 3 apps. It is not possible to have the app in only two apps.
process
system forces to have the exactly x active applications when we have an order that includes adding the customer to other applications for example the system forces the customer to be in apps is not possible to have the app in only two apps
1
5,568
8,407,409,926
IssuesEvent
2018-10-11 20:51:45
SynBioDex/SEPs
https://api.github.com/repos/SynBioDex/SEPs
reopened
SEP 001 -- SBOL Enhancement Proposals
Accepted Active Type: Process
# SEP 001 -- SBOL Enhancement Proposals | SEP | 001 | | --- | --- | | **Title** | SBOL Enhancement Proposals | | **Authors** | Raik Grünberg (raik.gruenberg at gmail com), Bryan Bartley (bartleyba at sbolstandard org) | | **Editor** | Raik Grünberg | | **Type** | Procedure | | **Status** | Accepted | | **Created** | 05-Oct-2015 | | **Last modified** | 20-Feb-2016 | ## Abstract SEP stands for [SBOL](http://sbolstandard.org) Enhancement Proposal. A SEP is a design document providing information to the SBOL community. It may describe a new feature or term for the SBOL data model or a rule or organizational process that the SBOL community should follow. The SEP should provide the rationale and a concise technical specification of the feature. We intend SEPs to be the primary mechanisms for proposing major data model changes, for collecting community input on an issue, and for documenting the design decisions that have gone into SBOL. The SEP authors are responsible for building consensus within the community and documenting dissenting opinions. SEPs are filed by SBOL editors in a dedicated issue tracker on github. If not withdrawn, an SEP will eventually be put to a vote by the SBOL community. ## Table of Content - [1. Rationale](#rationale) - [2. Specification of the SEP Workflow](#specification) - 2.1 Drafting an SEP - 2.2 SEP Submission - 2.3 SEP Discussion and Updates - 2.4 Decision - [3. The SEP document](#document) - 3.1 SEP Types - 3.2 SEP Status - 3.3 Document Layout - 3.4 Auxiliary Files - [4. Changes to Voting rules](#voting) - [5. Example or Use Case](#example) - [6. Discussion](#discussion) - 6.1 current versus new issue tracker - 6.2 Filtering by editors - 6.3 Should we reserve the first 10 or 20 SEP numbers for important SBOL government rules? - [References](#references) - [Copyright](#copyright) ## 1. 
Rationale <a name="rationale"></a> With the growth of SBOL in terms of both scope and members, building consensus through informal mailing list discussions and SBOL meetings is becoming increasingly inefficient. Only a minority of SBOL participants is watching a given mailing list thread or present for an actual meeting. This makes it often difficult for others to "catch up" with an ongoing discussion and also leads to many circular debates that are recurring with some regularity or quickly move off-topic. SBOL Enhancement Proposals (SEPs) address many of these issues. SEPs are borrowing heavily from Python Enhancement Proposals (PEPs) [[1](https://www.python.org/dev/peps/pep-0001)] which are the main avenue to proposing and managing changes to the Python programming language. The PEP process has been used and refined by the Python developer community over many years and we hope to benefit from this experience. SEPs have the following goals: - Manage proposed changes to the SBOL data model. - Resolve long open-ended discussions - Distinguish between real issues versus informal suggestions - Summarize arguments **for** and **against** a proposed change - Introduce newcomers and bystanders to an issue under discussion - Formalize the drafting of voting ballots with community input - Document history of decision-making in the SBOL community By drafting proposals as a shared document, the SEP process is intended to help integrate the diverse opinions and perspectives presented in a discussion thread. It encourages consensus-making by forcing co-authors of a proposal to consolidate their opinions around the strongest points of agreement while resolving minor points of contention. Authors of a proposed change will be motivated to acknowledge, summarize, and address contrary points of view other than their own. ## 2. Specification of the SEP Workflow <a name="specification"></a> ### 2.1 Drafting an SEP The SEP process begins with a new idea for improving SBOL. 
It is highly recommended that a single SEP contain only a single key proposal or new idea. The more focused the SEP, the more successful it tends to be. The SBOL editors reserve the right to reject SEP proposals if they appear too unfocused or too broad. If in doubt, split your SEP into several well-focused ones.

The SEP author or authors write the SEP using the style and format described below, shepherd the discussions in the appropriate forums, and attempt to build community consensus around the idea.

The SEP champion (a.k.a. author) should first attempt to ascertain whether the idea is SEP-able. Posting to the sbol-dev@googlegroups.com mailing list is currently the best way to go about this. Vetting an idea publicly before going as far as writing an SEP is meant to save the potential author time. It also helps to make sure the idea is applicable to the entire community and not just the author.

The author then writes up a plain text document, based on the template provided as SEP #2, summarizing their proposal as succinctly and clearly as possible (see below for details).

### 2.2 SEP Submission

Following a discussion on sbol-dev or other channels (e.g. during an SBOL workshop), the proposal should be sent as a draft SEP to the SBOL editors < sbol-editors@googlegroups.com >. The draft must be written in SEP style as described below, else it will be sent back without further regard until proper formatting rules are followed (although minor errors will be corrected by the editors).

Alternatively, the SEP may be submitted as a github pull request:

1. **Fork** https://github.com/SynBioDex/SEPs into your own user space.
2. Open the sep_002_template.md document in RAW view and download/save the file locally (or follow this link: https://raw.githubusercontent.com/SynBioDex/SEPs/master/sep_002_template.md).
3. Rename sep_002_template.md and upload it back to your forked repository.
4. Edit your SEP directly on github.
5. Initiate a **pull request** for your version of the repository on https://github.com/SynBioDex/SEPs.

If approved, an editor will submit the SEP to the dedicated github issue tracker, for which only SBOL editors have write access. The github issue number then becomes the SEP number by which the proposal can be referenced.

The SBOL editors will _not_ unreasonably deny an SEP. Reasons for denying SEP status include duplication of effort, being technically unsound, not providing proper motivation, or not addressing backwards compatibility.

### 2.3 SEP Discussion and Updates

Authors are explicitly encouraged to update their SEP as the discussion progresses and their ideas are refined. As updates become necessary, the SEP author(s) can email new SEP versions to the SBOL editors, who will update the issue accordingly.

The editors may invite, at their own discretion, other SBOL community members to formulate a short paragraph, which will then be added to the discussion section in order to better document dissenting opinions. However, SEP authors are asked to fairly document dissent themselves so that "dissenting voice" paragraphs will hopefully be rarely needed.

### 2.4 Competing SEPs

This is simply a list of any open SEPs that offer a competing or contradictory proposal. Simply by listing a competing SEP, the authors indicate acknowledgement of competing viewpoints. This should help editors shepherd the progress of the SBOL community toward meaningful votes that present clear, contrasting options to voters in the SBOL developers community.

### 2.5 Decision

Eventually, an SEP is either withdrawn by the authors or put to a vote by the SBOL editors (or any two developers). The exact voting procedure is described in SEP #5. If approved, the SEP will be marked as “Accepted”. Once a change is implemented, the status changes to “Final”. Approved procedural SEPs (e.g. those concerning SBOL governing rules) are labelled “Active”, indicating that such a rule may be further adapted in the future.

All SEPs will always remain "Open" on the issue tracker so that they are all visible by default. Editors attach and update issue labels to allow easy filtering by SEP Type and Status.

## 3. The SEP document <a name='document'></a>

### 3.1 SEP Types

There are two types of SEPs:

- data model -- a proposal to change or expand the SBOL data model
- process -- a proposal of how to improve SBOL governance or management

### 3.2 SEP Status

Every SEP starts out in “Draft” status. A draft can be “Accepted”, “Rejected”, or “Withdrawn”. A draft may also be “Deferred” if a discussion is deemed to be postponed. After its implementation, an SEP is labelled “Final”. Later during the life cycle of an SEP it may be “Replaced” or “Deprecated”. Process SEPs are instead labelled “Active”.

### 3.3 Document Layout

An SEP is described as a text document using (github-flavoured) Markdown syntax. A boilerplate template for writing a new SEP will be provided as SEP #2. The document must have the following sections (optional sections or lines given in "[ ]"):

1. Preamble
   - SEP number (assigned by editor; leave empty or XXX)
   - descriptive title (limited to a maximum of 44 characters)
   - names for each author
   - created (date)
   - type of SEP (data model | process; process = SBOL organization or “governing rule”)
   - [SBOL version] (version to which the change should be introduced; only if type == data model)
   - [Replaces] (SEP number; only if this SEP is Final/Accepted and replaces another one)
   - status (Draft | Accepted | Rejected | Withdrawn | Deferred | Final | Active)
2. Abstract -- a (very) short (~200 words, at most) summary of the proposed change or enhancement.
3. Motivation (or Rationale) -- the motivation / rationale for the proposed change and how it fits in with the mission and vision of the SBOL standard as the authors see it.
4. Specification (title may differ) -- the technical specification should describe the syntax and semantics of any new SBOL feature. The specification may include UML but this is not required. The specification can have sub-headings as required/useful. The name of this section may differ (especially for procedure SEPs).
5. Example or Use Case [optional but recommended] -- process or more trivial proposals may choose to skip this section. Data model changes **must** list a short example or use case.
6. Backwards Compatibility [optional] -- all SEPs that introduce backwards incompatibilities must include a section describing these incompatibilities and their severity. The SEP must explain how the author proposes to deal with these incompatibilities.
7. Discussion -- summarize any relevant discussion, in particular counter-arguments brought up.

References are listed at the end. A copyright statement should place the document in the public domain. Authors may choose to change or introduce additional top-level sections but should follow this layout as closely as possible.

### 3.4 Auxiliary Files

SEPs may include auxiliary files such as diagrams. Such files must be named sep-XXX-Y.ext, where "XXX" is the SEP number, "Y" is a serial number (starting at 1), and "ext" is replaced by the actual file extension (e.g. "png"). Editors will attach these files to the github issue.

## 4. Changes to SBOL voting rules <a name='voting'></a>

SBOL governance rules laid out on http://sbolstandard.org/development/gov/ currently do not mention SEPs. They state that any two members of the mailing list can put any change proposal to a vote. Implementing SEP #1 means the governance page needs to be adapted. The necessary changes to the sections **Voting process** and **Voting form** are described in SEP #5.

## 5. Example or Use Case <a name='example'></a>

SEP 1 serves as the first example of an SEP and of the SEP adoption process.

## 6. Discussion <a name='discussion'></a>

### 6.1 Current versus new issue tracker

This proposal is a further evolution of the current informal (and still relatively recent) practice of submitting issues on the synbiodex / SBOL-specification issue tracker. Several SBOL members expressed the opinion that this informal issue tracker is sufficient. The SBOL-specification repo hosts the official SBOL specification document. Issues submitted to that repository can refer both to the document itself (e.g. issues in the description of SBOL or other technical problems) and to actual issues with the current SBOL specification.

Some key differences to the current practice are:

- a dedicated repo / issue tracker only for SEPs
- editor-only write access -- this introduces a social filter that, we think, should be important in moderating and filtering the discussion and will improve the quality of proposals
- prescription of an official proposal template -- promoting minimal documentation standards
- a formalized and transparent process of decision making

Compared to informal issues that can be quickly registered by any developer, SEPs introduce quite some documentation overhead. To some extent, this is intentional and meant to force everyone to take at least one big breath before suggesting SBOL overhauls. It may, however, create an unwanted barrier to reporting smaller issues people encounter when implementing SBOL. We have to see how this plays out in practice. The synbiodex / sbol-specification issue tracker may still be useful to quickly keep track of potential issues before escalating any of them to more formal change requests / SEPs.

A dedicated repo has the added advantage that we can later decide to also put SEP documents under version control by committing them as *.md documents in this repository. That's the way PEPs are organized.

### 6.2 Filtering by editors

In Python, core developers can choose to submit a PEP directly.
We believe going through an editor is an important social filter to ensure we are not overrun by too many requests of low quality.

### 6.3 Should we reserve the first 10 or 20 SEP numbers for important SBOL governance rules?

In Python, PEPs 1-100 have been reserved for community-related process issues. After discussion among the editors, we opted against this, as it is not easy to realize using the github issue tracker.

### 6.4 Should we insist on at least two SEP authors?

The classic SBOL decision-making process always requires two people to suggest anything for a vote. In the case of SEPs, the editors act as an additional "gate", so that a second author may not always be needed.

## References <a name='references'></a>

## Copyright <a name='copyright'></a>

<p xmlns:dct="http://purl.org/dc/terms/" xmlns:vcard="http://www.w3.org/2001/vcard-rdf/3.0#">
<a rel="license" href="http://creativecommons.org/publicdomain/zero/1.0/">
<img src="http://i.creativecommons.org/p/zero/1.0/88x31.png" style="border-style: none;" alt="CC0" />
</a>
<br />
To the extent possible under law,
<a rel="dct:publisher" href="sbolstandard.org">
<span property="dct:title">SBOL developers</span></a>
has waived all copyright and related or neighboring rights to
<span property="dct:title">SEP 001</span>.
This work is published from:
<span property="vcard:Country" datatype="dct:ISO3166" content="US" about="sbolstandard.org">
United States</span>.
</p>
page needs to be adapted the necessary changes to the sections voting process and voting form are described in sep example or use case sep serves as first example of a sep and the sep adoption process discussion current versus new issue tracker this proposal is a further evolution step from the current informal and still relatively recent practice of submitting issues on the synbiodex sbol specification issue tracker several sbol members expressed the opinion this informal issue tracker is sufficient the sbol specification repo hosts the official sbol specification document issues submitted to the repository can refer both to the document itself e g issues in the description of sbol or other technical problems as well as actual issues with the current sbol specification some key differences to the current practice are create a dedicated repo issue tracker only for seps editor only write access this introduces a social filter that we think should be important in moderating and filtering the discussion and will improve the quality of proposals prescription of an official proposal template promoting minimal documentation standards formalized and transparent process of decision making compared to informal issues that can be quickly registered by any developer seps introduces quite some documentation overhead to some extend this is intentional and meant to force everyone taking at least one big breath before suggesting sbol overhauls it may however create an unwanted barrier to reporting smaller issues people encounter when implementing sbol we have to see how this plays out in practice the synbiodex sbol specification issue tracker may still be useful to quickly keep track of potential issues before escalating any of them to more formal change requests seps a dedicated repo has the added advantage that we can later decide to also to put sep documents under version control by committing them as md documents in this repository that s the way peps are organized filtering 
by editors in python core developers can choose to submit a pep directly we believe going through an editor is an important social filter to ensure we are not overrun by too many requests of low quality should we reserve the first or sep numbers for important sbol government rules in python pep have been reserved for community related process issues after discussion among the editors we opted against this as it is not easy to realize using the github issue tracker should we insist on at least two sep authors the classic sbol decission making process always requires two people to suggest anything for a vote in case of seps the editors act as an additional gate so that a second author may not always be needed references copyright p xmlns dct xmlns vcard a rel license href to the extent possible under law a rel dct publisher href sbolstandard org sbol developers has waived all copyright and related or neighboring rights to sep this work is published from span property vcard country datatype dct content us about sbolstandard org united states
1
7,119
10,266,276,692
IssuesEvent
2019-08-22 21:00:00
automotive-edge-computing-consortium/AECC
https://api.github.com/repos/automotive-edge-computing-consortium/AECC
opened
Scheduling of publications (with Marketing)
priority:High status:New type:Process
Publication of WGs' documents (esp. White Paper and Technical Paper) need to be aligned with Marketing activity.
1.0
Scheduling of publications (with Marketing) - Publication of WGs' documents (esp. White Paper and Technical Paper) need to be aligned with Marketing activity.
process
scheduling of publications with marketing publication of wgs documents esp white paper and technical paper need to be aligned with marketing activity
1
160,570
13,792,830,812
IssuesEvent
2020-10-09 14:08:31
aureamunoz/hello-world-fwless
https://api.github.com/repos/aureamunoz/hello-world-fwless
closed
Review README file of the branch dekorate4OS-jib
documentation
## TODO - Review README file of the branch dekorate4OS-jib - Improve content and steps to build/deploy on Openshift
1.0
Review README file of the branch dekorate4OS-jib - ## TODO - Review README file of the branch dekorate4OS-jib - Improve content and steps to build/deploy on Openshift
non_process
review readme file of the branch jib todo review readme file of the branch jib improve content and steps to build deploy on openshift
0
7,869
11,044,259,381
IssuesEvent
2019-12-09 12:59:13
prisma/photonjs
https://api.github.com/repos/prisma/photonjs
opened
Photon facade with zeit now
bug/2-confirmed kind/bug kind/regression process/candidate
After the implementation of https://github.com/prisma/photonjs/issues/261 I tried deploying a package to now and I run into the following runtime errors: ``` 2019-12-09T12:45:29.934Z undefined ERROR Error: @prisma/photon did not initialize yet. Please run "prisma2 generate" and try to import it again. at new Photon (/var/task/api/node_modules/@prisma/photon/index.js:3:11) at Object.<anonymous> (/var/task/api/index.js:7:16) at Module._compile (internal/modules/cjs/loader.js:778:30) at Object.Module._extensions..js (internal/modules/cjs/loader.js:789:10) at Module.load (internal/modules/cjs/loader.js:653:32) at tryModuleLoad (internal/modules/cjs/loader.js:593:12) at Function.Module._load (internal/modules/cjs/loader.js:585:3) at Module.require (internal/modules/cjs/loader.js:692:17) at require (internal/modules/cjs/helpers.js:25:18) at Object.<anonymous> (/var/task/___now_launcher.js:21:18) Duration: 379.72 ms Billed Duration: 400 ms Memory Size: 1024 MB Max Memory Used: 32 MB RequestId: 2f38c67f-5074-44bf-a977-4eda84a796c0 Error: Runtime exited with error: exit status 1 Runtime.ExitError 2019-12-09T12:46:30.877Z undefined ERROR Error: @prisma/photon did not initialize yet. Please run "prisma2 generate" and try to import it again. 
at new Photon (/var/task/api/node_modules/@prisma/photon/index.js:3:11) at Object.<anonymous> (/var/task/api/index.js:7:16) at Module._compile (internal/modules/cjs/loader.js:778:30) at Object.Module._extensions..js (internal/modules/cjs/loader.js:789:10) at Module.load (internal/modules/cjs/loader.js:653:32) at tryModuleLoad (internal/modules/cjs/loader.js:593:12) at Function.Module._load (internal/modules/cjs/loader.js:585:3) at Module.require (internal/modules/cjs/loader.js:692:17) at require (internal/modules/cjs/helpers.js:25:18) at Object.<anonymous> (/var/task/___now_launcher.js:21:18) Duration: 367.03 ms Billed Duration: 400 ms Memory Size: 1024 MB Max Memory Used: 32 MB RequestId: 3e563d22-52da-49a9-88d3-2f2d2984148f Error: Runtime exited with error: exit status 1 Runtime.ExitError ``` Reproduction repository: Reproduction build: https://zeit.co/divyenduz/express-photon/cqgr17uaf (Divy has access)
1.0
Photon facade with zeit now - After the implementation of https://github.com/prisma/photonjs/issues/261 I tried deploying a package to now and I run into the following runtime errors: ``` 2019-12-09T12:45:29.934Z undefined ERROR Error: @prisma/photon did not initialize yet. Please run "prisma2 generate" and try to import it again. at new Photon (/var/task/api/node_modules/@prisma/photon/index.js:3:11) at Object.<anonymous> (/var/task/api/index.js:7:16) at Module._compile (internal/modules/cjs/loader.js:778:30) at Object.Module._extensions..js (internal/modules/cjs/loader.js:789:10) at Module.load (internal/modules/cjs/loader.js:653:32) at tryModuleLoad (internal/modules/cjs/loader.js:593:12) at Function.Module._load (internal/modules/cjs/loader.js:585:3) at Module.require (internal/modules/cjs/loader.js:692:17) at require (internal/modules/cjs/helpers.js:25:18) at Object.<anonymous> (/var/task/___now_launcher.js:21:18) Duration: 379.72 ms Billed Duration: 400 ms Memory Size: 1024 MB Max Memory Used: 32 MB RequestId: 2f38c67f-5074-44bf-a977-4eda84a796c0 Error: Runtime exited with error: exit status 1 Runtime.ExitError 2019-12-09T12:46:30.877Z undefined ERROR Error: @prisma/photon did not initialize yet. Please run "prisma2 generate" and try to import it again. 
at new Photon (/var/task/api/node_modules/@prisma/photon/index.js:3:11) at Object.<anonymous> (/var/task/api/index.js:7:16) at Module._compile (internal/modules/cjs/loader.js:778:30) at Object.Module._extensions..js (internal/modules/cjs/loader.js:789:10) at Module.load (internal/modules/cjs/loader.js:653:32) at tryModuleLoad (internal/modules/cjs/loader.js:593:12) at Function.Module._load (internal/modules/cjs/loader.js:585:3) at Module.require (internal/modules/cjs/loader.js:692:17) at require (internal/modules/cjs/helpers.js:25:18) at Object.<anonymous> (/var/task/___now_launcher.js:21:18) Duration: 367.03 ms Billed Duration: 400 ms Memory Size: 1024 MB Max Memory Used: 32 MB RequestId: 3e563d22-52da-49a9-88d3-2f2d2984148f Error: Runtime exited with error: exit status 1 Runtime.ExitError ``` Reproduction repository: Reproduction build: https://zeit.co/divyenduz/express-photon/cqgr17uaf (Divy has access)
process
photon facade with zeit now after the implementation of i tried deploying a package to now and i run into the following runtime errors undefined error error prisma photon did not initialize yet please run generate and try to import it again at new photon var task api node modules prisma photon index js at object var task api index js at module compile internal modules cjs loader js at object module extensions js internal modules cjs loader js at module load internal modules cjs loader js at trymoduleload internal modules cjs loader js at function module load internal modules cjs loader js at module require internal modules cjs loader js at require internal modules cjs helpers js at object var task now launcher js duration ms billed duration ms memory size mb max memory used mb requestid error runtime exited with error exit status runtime exiterror undefined error error prisma photon did not initialize yet please run generate and try to import it again at new photon var task api node modules prisma photon index js at object var task api index js at module compile internal modules cjs loader js at object module extensions js internal modules cjs loader js at module load internal modules cjs loader js at trymoduleload internal modules cjs loader js at function module load internal modules cjs loader js at module require internal modules cjs loader js at require internal modules cjs helpers js at object var task now launcher js duration ms billed duration ms memory size mb max memory used mb requestid error runtime exited with error exit status runtime exiterror reproduction repository reproduction build divy has access
1
17,704
23,587,985,701
IssuesEvent
2022-08-23 13:10:55
bazelbuild/bazel
https://api.github.com/repos/bazelbuild/bazel
closed
Release 5.3 - August 2022
P1 type: process release team-OSS
# Status of Bazel 5.3 - Expected release date: 08/22/2022 - [List of release blockers](https://github.com/bazelbuild/bazel/milestone/39) To report a release-blocking bug, please add a comment with the text `@bazel-io flag` to the issue. A release manager will triage it and add it to the milestone. To cherry-pick a mainline commit into 5.3, simply send a PR against the `release-5.3.0` branch. Task list: - [ ] [Create draft release announcement](https://docs.google.com/document/d/1wDvulLlj4NAlPZamdlEVFORks3YXJonCjyuQMUQEmB0/edit) - [ ] Send for review the release announcement PR: - [ ] Push the release, notify package maintainers: - [ ] Update the documentation - [ ] Push the blog post - [ ] Update the [release page](https://github.com/bazelbuild/bazel/releases/)
1.0
Release 5.3 - August 2022 - # Status of Bazel 5.3 - Expected release date: 08/22/2022 - [List of release blockers](https://github.com/bazelbuild/bazel/milestone/39) To report a release-blocking bug, please add a comment with the text `@bazel-io flag` to the issue. A release manager will triage it and add it to the milestone. To cherry-pick a mainline commit into 5.3, simply send a PR against the `release-5.3.0` branch. Task list: - [ ] [Create draft release announcement](https://docs.google.com/document/d/1wDvulLlj4NAlPZamdlEVFORks3YXJonCjyuQMUQEmB0/edit) - [ ] Send for review the release announcement PR: - [ ] Push the release, notify package maintainers: - [ ] Update the documentation - [ ] Push the blog post - [ ] Update the [release page](https://github.com/bazelbuild/bazel/releases/)
process
release august status of bazel expected release date to report a release blocking bug please add a comment with the text bazel io flag to the issue a release manager will triage it and add it to the milestone to cherry pick a mainline commit into simply send a pr against the release branch task list send for review the release announcement pr push the release notify package maintainers update the documentation push the blog post update the
1
265,402
20,092,696,528
IssuesEvent
2022-02-06 02:07:09
atsign-foundation/at_libraries
https://api.github.com/repos/atsign-foundation/at_libraries
closed
Review for missing documentation - at_contacts (API Reference)
documentation
- [ ] Right nav review [API reference](https://pub.dev/documentation/at_contact/latest/) missing documentation (samples and examples) - [ ] Is the [documentation link](https://atsign.dev/docs/) pointing to the correct location? Please look at the library links too (current, samples and examples) LIBRARIES - [ ] [at_contact](https://pub.dev/documentation/at_contact/latest/at_contact/at_contact-library.html)
1.0
Review for missing documentation - at_contacts (API Reference) - - [ ] Right nav review [API reference](https://pub.dev/documentation/at_contact/latest/) missing documentation (samples and examples) - [ ] Is the [documentation link](https://atsign.dev/docs/) pointing to the correct location? Please look at the library links too (current, samples and examples) LIBRARIES - [ ] [at_contact](https://pub.dev/documentation/at_contact/latest/at_contact/at_contact-library.html)
non_process
review for missing documentation at contacts api reference right nav review missing documentation samples and examples is the pointing to the correct location please look at the library links too current samples and examples libraries
0
12,021
14,738,500,115
IssuesEvent
2021-01-07 04:56:49
kdjstudios/SABillingGitlab
https://api.github.com/repos/kdjstudios/SABillingGitlab
closed
Sarasota- Setup Monthly Billing cycle in SAB
anc-ops anc-process anp-0.5 ant-bug ant-support
In GitLab by @kdjstudios on Jun 6, 2018, 09:15 **Submitted by:** "Cori Bartlett" <cori.bartlett@answernet.com> **Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2018-06-06-15914 **Server:** Internal **Client/Site:** Sarasota **Account:** NA **Issue:** Would you please create a Monthly cycle in SAB for Sarasota? The next cycle would be for 7/1- Service Period 7/1- 7/31 and Usage period 6/1-6/30. Thank you!
1.0
Sarasota- Setup Monthly Billing cycle in SAB - In GitLab by @kdjstudios on Jun 6, 2018, 09:15 **Submitted by:** "Cori Bartlett" <cori.bartlett@answernet.com> **Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2018-06-06-15914 **Server:** Internal **Client/Site:** Sarasota **Account:** NA **Issue:** Would you please create a Monthly cycle in SAB for Sarasota? The next cycle would be for 7/1- Service Period 7/1- 7/31 and Usage period 6/1-6/30. Thank you!
process
sarasota setup monthly billing cycle in sab in gitlab by kdjstudios on jun submitted by cori bartlett helpdesk server internal client site sarasota account na issue would you please create a monthly cycle in sab for sarasota the next cycle would be for service period and usage period thank you
1
500,626
14,503,140,591
IssuesEvent
2020-12-11 22:10:01
SatcherInstitute/het-frontend
https://api.github.com/repos/SatcherInstitute/het-frontend
closed
Finalize web server hosting decision and document tradeoffs
High Priority
Right now we're using a netlify setup, which has a lot of conveniences like automatic previews from github, but is much less flexible. For features that require more server support like an admin login, we'd need to switch to a Node.js or Flask server (or other, but those are the two natural choices so we don't have to fragment our codebase into three languages). With a Node.js or Flask server, you don't get as much out of the box like with Netlify, but they have other flexibility benefits like being able to implement features that require server-side support and hosting in GCP (which means it can be managed the same as other resources, we can control backend API access, etc). Some other considerations with Netlify: - Requires a paid account for more than one login. This would very likely be a requirement for the main website as we'd want developers to be able to access it - The marketing/informational website is on Netlify. It may be convenient to have both of them there. Is this website going to stick around long-term or be turned into a landing page for the main website? - Does netlify scale with production apps? We should definitely sort this out but I'd recommend punting the decision until we've built more of the frontend and data server.
1.0
Finalize web server hosting decision and document tradeoffs - Right now we're using a netlify setup, which has a lot of conveniences like automatic previews from github, but is much less flexible. For features that require more server support like an admin login, we'd need to switch to a Node.js or Flask server (or other, but those are the two natural choices so we don't have to fragment our codebase into three languages). With a Node.js or Flask server, you don't get as much out of the box like with Netlify, but they have other flexibility benefits like being able to implement features that require server-side support and hosting in GCP (which means it can be managed the same as other resources, we can control backend API access, etc). Some other considerations with Netlify: - Requires a paid account for more than one login. This would very likely be a requirement for the main website as we'd want developers to be able to access it - The marketing/informational website is on Netlify. It may be convenient to have both of them there. Is this website going to stick around long-term or be turned into a landing page for the main website? - Does netlify scale with production apps? We should definitely sort this out but I'd recommend punting the decision until we've built more of the frontend and data server.
non_process
finalize web server hosting decision and document tradeoffs right now we re using a netlify setup which has a lot of conveniences like automatic previews from github but is much less flexible for features that require more server support like an admin login we d need to switch to a node js or flask server or other but those are the two natural choices so we don t have to fragment our codebase into three languages with a node js or flask server you don t get as much out of the box like with netlify but they have other flexibility benefits like being able to implement features that require server side support and hosting in gcp which means it can be managed the same as other resources we can control backend api access etc some other considerations with netlify requires a paid account for more than one login this would very likely be a requirement for the main website as we d want developers to be able to access it the marketing informational website is on netlify it may be convenient to have both of them there is this website going to stick around long term or be turned into a landing page for the main website does netlify scale with production apps we should definitely sort this out but i d recommend punting the decision until we ve built more of the frontend and data server
0
26,545
4,751,418,927
IssuesEvent
2016-10-22 21:39:04
klausw/hackerskeyboard
https://api.github.com/repos/klausw/hackerskeyboard
closed
Long Press | Button | Move key to Lock ctl key from being active
auto-migrated Priority-Medium Type-Defect
``` [ Before posting bugs, please check https://code.google.com/p/hackerskeyboard/wiki/FrequentlyAskedQuestions and the existing bugs for known issues - my responses may be delayed since I have very little time to work on this project. ] What steps will reproduce the problem? 1.type really fast 2.accidentally hit ctl 3.delete everything by hitting another key What is the expected behavior? Data typed doesn't get lost. What do you see instead? Data gets highlighted then deleted once the 'any key' is pressed :( What version of Hacker's Keyboard are you using, for example "1.37"? 1.37.1419.. On what phone or tablet? asus padfone X If applicable, does this affect the 4-row or 5-row layout, or both? Which language(s)? Both / All with ctl key. Please provide any additional information below. Please fix, its exaspersting! ``` Original issue reported on code.google.com by `Andrew.P...@gmail.com` on 13 Dec 2014 at 1:26
1.0
Long Press | Button | Move key to Lock ctl key from being active - ``` [ Before posting bugs, please check https://code.google.com/p/hackerskeyboard/wiki/FrequentlyAskedQuestions and the existing bugs for known issues - my responses may be delayed since I have very little time to work on this project. ] What steps will reproduce the problem? 1.type really fast 2.accidentally hit ctl 3.delete everything by hitting another key What is the expected behavior? Data typed doesn't get lost. What do you see instead? Data gets highlighted then deleted once the 'any key' is pressed :( What version of Hacker's Keyboard are you using, for example "1.37"? 1.37.1419.. On what phone or tablet? asus padfone X If applicable, does this affect the 4-row or 5-row layout, or both? Which language(s)? Both / All with ctl key. Please provide any additional information below. Please fix, its exaspersting! ``` Original issue reported on code.google.com by `Andrew.P...@gmail.com` on 13 Dec 2014 at 1:26
non_process
long press button move key to lock ctl key from being active before posting bugs please check and the existing bugs for known issues my responses may be delayed since i have very little time to work on this project what steps will reproduce the problem type really fast accidentally hit ctl delete everything by hitting another key what is the expected behavior data typed doesn t get lost what do you see instead data gets highlighted then deleted once the any key is pressed what version of hacker s keyboard are you using for example on what phone or tablet asus padfone x if applicable does this affect the row or row layout or both which language s both all with ctl key please provide any additional information below please fix its exaspersting original issue reported on code google com by andrew p gmail com on dec at
0
1,589
4,187,322,745
IssuesEvent
2016-06-23 17:06:25
geneontology/go-ontology
https://api.github.com/repos/geneontology/go-ontology
closed
multi-organism cellular localization ; GO:1902581
multiorganism processes Other term-related request PARL-UCL
multi-organism cellular localization ; GO:1902581 has no descendants. It was a TG-added term when we were creating single-organism and multi-organism grouping terms to avoid multi-organism terms appearing under single-organism processes. It should at least have the following children.... possibly part-of, looking at the children of the 'normal' cellular localisation term (GO:0051641): multi-organism intracellular transport ; GO:1902583 transport of virus ; GO:0046794 The following parentage is also missing for GO:1902581 and GO:1902583: multi-organism cellular process ; GO:0044764 —[isa]multi-organism cellular localization ; GO:1902581 —[isa]multi-organism intracellular transport ; GO:1902583 Thanks!
1.0
multi-organism cellular localization ; GO:1902581 - multi-organism cellular localization ; GO:1902581 has no descendants. It was a TG-added term when we were creating single-organism and multi-organism grouping terms to avoid multi-organism terms appearing under single-organism processes. It should at least have the following children.... possibly part-of, looking at the children of the 'normal' cellular localisation term (GO:0051641): multi-organism intracellular transport ; GO:1902583 transport of virus ; GO:0046794 The following parentage is also missing for GO:1902581 and GO:1902583: multi-organism cellular process ; GO:0044764 —[isa]multi-organism cellular localization ; GO:1902581 —[isa]multi-organism intracellular transport ; GO:1902583 Thanks!
process
multi organism cellular localization go multi organism cellular localization go has no descendants it was a tg added term when we were creating single organism and multi organism grouping terms to avoid multi organism terms appearing under single organism processes it should at least have the following children possibly part of looking at the children of the normal cellular localisation term go multi organism intracellular transport go transport of virus go the following parentage is also missing for go and go multi organism cellular process go — multi organism cellular localization go — multi organism intracellular transport go thanks
1
10,831
13,611,492,555
IssuesEvent
2020-09-23 08:56:44
Graylog2/graylog2-server
https://api.github.com/repos/Graylog2/graylog2-server
opened
JSON parsing: add shallow parsing up to a certain path depth
feature processing
## What? Add JSON parsing function(s) that support shallow parsing a deeply nested JSON document, returning all the key/value pairs up to a certain depth (optionally starting at a JSONPath expression). The unparsed portion of the subtree would be returned as a string, instead of being parsed, allowing either storing it verbatim or parsing it in a later stage. ## Why? When dealing with deeply nested documents, especially from sources like AWS services, the number of unique paths in that document can easily be in the tens of thousands, leading to too many fields in an index. Most of the time those fields aren't important for searching but need to be displayed later on. Given the variable nature of these documents, it's often hard to impossible to know all the interesting paths beforehand, so parsing just the first one or two levels of the JSON document helps in navigating it, without catastrophically exploding the number of keys. Removing the extra keys is often not possible, either, because their names are not known ahead of time, so they cannot be easily removed later on. ## Your Environment * Graylog Version: 3.3.5
1.0
JSON parsing: add shallow parsing up to a certain path depth - ## What? Add JSON parsing function(s) that support shallow parsing a deeply nested JSON document, returning all the key/value pairs up to a certain depth (optionally starting at a JSONPath expression). The unparsed portion of the subtree would be returned as a string, instead of being parsed, allowing either storing it verbatim or parsing it in a later stage. ## Why? When dealing with deeply nested documents, especially from sources like AWS services, the number of unique paths in that document can easily be in the tens of thousands, leading to too many fields in an index. Most of the time those fields aren't important for searching but need to be displayed later on. Given the variable nature of these documents, it's often hard to impossible to know all the interesting paths beforehand, so parsing just the first one or two levels of the JSON document helps in navigating it, without catastrophically exploding the number of keys. Removing the extra keys is often not possible, either, because their names are not known ahead of time, so they cannot be easily removed later on. ## Your Environment * Graylog Version: 3.3.5
process
json parsing add shallow parsing up to a certain path depth what add json parsing function s that support shallow parsing a deeply nested json document returning all the key value pairs up to a certain depth optionally starting at a jsonpath expression the unparsed portion of the subtree would be returned as a string instead of being parsed allowing either storing it verbatim or parsing it in a later stage why when dealing with deeply nested documents especially from sources like aws services the number of unique paths in that document can easily be in the tens of thousands leading to too many fields in an index most of the time those fields aren t important for searching but need to be displayed later on given the variable nature of these documents it s often hard to impossible to know all the interesting paths beforehand so parsing just the first one or two levels of the json document helps in navigating it without catastrophically exploding the number of keys removing the extra keys is often not possible either because their names are not known ahead of time so they cannot be easily removed later on your environment graylog version
1
16,865
22,145,102,288
IssuesEvent
2022-06-03 11:08:01
camunda/zeebe
https://api.github.com/repos/camunda/zeebe
closed
Use profiles to configure CI and integration/unit tests execution
kind/toil team/distributed team/process-automation
**Description** We have several scripts which run tests (it-java.sh, test-java.sh, test-java8.sh) which currently run the same command, but specify different options. Most of these options can actually be configured via maven profiles, and it would simplify the CI pipeline if we leveraged these instead of having scripts with a high level of duplication (which makes it harder to maintain the pipeline). It may be that this is not possible, so we should prototype it first, but it would be a big win if we could simplify our pipeline.
1.0
Use profiles to configure CI and integration/unit tests execution - **Description** We have several scripts which run tests (it-java.sh, test-java.sh, test-java8.sh) which currently run the same command, but specify different options. Most of these options can actually be configured via maven profiles, and it would simplify the CI pipeline if we leveraged these instead of having scripts with a high level of duplication (which makes it harder to maintain the pipeline). It may be that this is not possible, so we should prototype it first, but it would be a big win if we could simplify our pipeline.
process
use profiles to configure ci and integration unit tests execution description we have several scripts which run tests it java sh test java sh test sh which currently run the same command but specify different options most of these options can actually be configured via maven profiles and it would simplify the ci pipeline if we leveraged these instead of having scripts with a high level of duplication which makes it harder to maintain the pipeline it may be that this is not possible so we should prototype it first but it would be a big win if we could simplify our pipeline
1
2,659
5,435,208,034
IssuesEvent
2017-03-05 15:07:08
jlm2017/jlm-video-subtitles
https://api.github.com/repos/jlm2017/jlm-video-subtitles
opened
[subtitles] [en] MÉLENCHON : «LE 18 MARS, MARCHONS POUR LA 6E RÉPUBLIQUE»
Language: English Process: [0] Awaiting subtitles
## MÉLENCHON : «LE 18 MARS, MARCHONS POUR LA 6E RÉPUBLIQUE» &nbsp; | Info ------------- | ------------- **Date** | Sunday, 05 March 2017 **Duration** | 0:28:27 :clock7: **Language** | English :gb: **Video** | [See it on YouTube :arrow_upper_right:](https://www.youtube.com/watch?v=lOP70Zytja0) **Subtitles** | [Edit them in YouTube :arrow_upper_right:](https://www.youtube.com/timedtext_editor?v=lOP70Zytja0&tab=captions&bl=vmp&action_mde_edit_form=1&lang=en&ui=hd)
1.0
[subtitles] [en] MÉLENCHON : «LE 18 MARS, MARCHONS POUR LA 6E RÉPUBLIQUE» - ## MÉLENCHON : «LE 18 MARS, MARCHONS POUR LA 6E RÉPUBLIQUE» &nbsp; | Info ------------- | ------------- **Date** | Sunday, 05 March 2017 **Duration** | 0:28:27 :clock7: **Language** | English :gb: **Video** | [See it on YouTube :arrow_upper_right:](https://www.youtube.com/watch?v=lOP70Zytja0) **Subtitles** | [Edit them in YouTube :arrow_upper_right:](https://www.youtube.com/timedtext_editor?v=lOP70Zytja0&tab=captions&bl=vmp&action_mde_edit_form=1&lang=en&ui=hd)
process
mélenchon «le mars marchons pour la république» mélenchon «le mars marchons pour la république» nbsp info date sunday march duration language english gb video subtitles
1
117,750
15,170,235,657
IssuesEvent
2021-02-12 22:46:51
microsoft/microsoft-ui-xaml
https://api.github.com/repos/microsoft/microsoft-ui-xaml
closed
Accent button foreground should be white
area-Button area-Styling area-UIDesign needs-triage team-Controls
In the new styles, the foreground of an accent button is black which isn't that readable. Consider using white instead
1.0
Accent button foreground should be white - In the new styles, the foreground of an accent button is black which isn't that readable. Consider using white instead
non_process
accent button foreground should be white in the new styles the foreground of an accent button is black which isn t that readable consider using white instead
0
342,085
24,728,307,125
IssuesEvent
2022-10-20 15:31:14
Codepath-iOS-Group-2/Project
https://api.github.com/repos/Codepath-iOS-Group-2/Project
opened
User can create a post
documentation enhancement
* Create Create Post Screen * Layout screen ** Input field for adding caption ** image view to hold post image preview ** buttons to capture or upload image * Buttons to save post or cancel * Add camera functionality * Save post to Parse when save button is tapped * Return to home feed on cancel button * Trigger home feed to reload data to retrieve new post
1.0
User can create a post - * Create Create Post Screen * Layout screen ** Input field for adding caption ** image view to hold post image preview ** buttons to capture or upload image * Buttons to save post or cancel * Add camera functionality * Save post to Parse when save button is tapped * Return to home feed on cancel button * Trigger home feed to reload data to retrieve new post
non_process
user can create a post create create post screen layout screen input field for adding caption image view to hold post image preview buttons to capture or upload image buttons to save post or cancel add camera functionality save post to parse when save button is tapped return to home feed on cancel button trigger home feed to reload data to retrieve new post
0
173,819
6,532,788,108
IssuesEvent
2017-08-31 01:32:55
gudell/bwardp
https://api.github.com/repos/gudell/bwardp
opened
Landing Page - Redevelopment of Overall Landing Page
open development priority 1
Based on output of redesign session, new html-based site to be created by co-op students. Site to include top picture with imbedded search feature, ability to redirect quickly to Listings Page, ability to Login, links to mobile app for downloading, user features area, SmartStart area, user testimonials, and more calls-to-action
1.0
Landing Page - Redevelopment of Overall Landing Page - Based on output of redesign session, new html-based site to be created by co-op students. Site to include top picture with imbedded search feature, ability to redirect quickly to Listings Page, ability to Login, links to mobile app for downloading, user features area, SmartStart area, user testimonials, and more calls-to-action
non_process
landing page redevelopment of overall landing page based on output of redesign session new html based site to be created by co op students site to include top picture with imbedded search feature ability to redirect quickly to listings page ability to login links to mobile app for downloading user features area smartstart area user testimonials and more calls to action
0
165,161
26,112,057,025
IssuesEvent
2022-12-27 21:43:15
flutter/flutter
https://api.github.com/repos/flutter/flutter
closed
M3 Demo app: Make whole body scrollable
framework f: material design
When visiting https://flutter-experimental-m3-demo.web.app/#/ currently you can only scroll the components via trackpad or mouse wheel when the mouse pointer is hovering over the small section of components in the middle. I think the scrollview should extend to include the entire body of the application so you can scroll the body when the mouse pointer is anywhere within the body. This would also display the scrollbars at the right edge of the screen (where it is expected) instead of at the middle. ![image](https://user-images.githubusercontent.com/1227763/191358030-3da2a70d-8e28-460b-8e34-34f8c0d1016d.png) /cc @QuncCccccc @darrenaustin
1.0
M3 Demo app: Make whole body scrollable - When visiting https://flutter-experimental-m3-demo.web.app/#/ currently you can only scroll the components via trackpad or mouse wheel when the mouse pointer is hovering over the small section of components in the middle. I think the scrollview should extend to include the entire body of the application so you can scroll the body when the mouse pointer is anywhere within the body. This would also display the scrollbars at the right edge of the screen (where it is expected) instead of at the middle. ![image](https://user-images.githubusercontent.com/1227763/191358030-3da2a70d-8e28-460b-8e34-34f8c0d1016d.png) /cc @QuncCccccc @darrenaustin
non_process
demo app make whole body scrollable when visiting currently you can only scroll the components via trackpad or mouse wheel when the mouse pointer is hovering over the small section of components in the middle i think the scrollview should extend to include the entire body of the application so you can scroll the body when the mouse pointer is anywhere within the body this would also display the scrollbars at the right edge of the screen where it is expected instead of at the middle cc qunccccccc darrenaustin
0
355,103
10,576,480,416
IssuesEvent
2019-10-07 17:58:04
compodoc/compodoc
https://api.github.com/repos/compodoc/compodoc
closed
[FEATURE] Lazy Loaded Routes
Context : routing Priority: Medium Status: Accepted Time: ~1 hour Type: Bug wontfix
##### **Overview of the issue** Currently it does not seem that the routes that are lazy loaded in the system are found and used in the routers section of the generated documentation. ##### **Operating System, Node.js, npm, compodoc version(s)** Node v8.12.0 npm 6.4.1 compodoc 1.1.5 ##### **Angular configuration, a `package.json` file in the root folder** package.json in root folder, no clue what else is needed here, no config for compodoc in here. ##### **Compodoc installed globally or locally ?** Globally and locally ##### **If possible sourcecode of the file where it breaks** N/A ##### **If possible your terminal logs before the error** N/A ##### **Motivation for or Use Case** A better visual of the routing system in the documentation ##### **Reproduce the error** N/A ##### **Related issues** Possibly #39 or #609
1.0
[FEATURE] Lazy Loaded Routes - ##### **Overview of the issue** Currently it does not seem that the routes that are lazy loaded in the system are found and used in the routers section of the generated documentation. ##### **Operating System, Node.js, npm, compodoc version(s)** Node v8.12.0 npm 6.4.1 compodoc 1.1.5 ##### **Angular configuration, a `package.json` file in the root folder** package.json in root folder, no clue what else is needed here, no config for compodoc in here. ##### **Compodoc installed globally or locally ?** Globally and locally ##### **If possible sourcecode of the file where it breaks** N/A ##### **If possible your terminal logs before the error** N/A ##### **Motivation for or Use Case** A better visual of the routing system in the documentation ##### **Reproduce the error** N/A ##### **Related issues** Possibly #39 or #609
non_process
lazy loaded routes overview of the issue currently it does not seem that the routes that are lazy loaded in the system are found and used in the routers section of the generated documentation operating system node js npm compodoc version s node npm compodoc angular configuration a package json file in the root folder package json in root folder no clue what else is needed here no config for compodoc in here compodoc installed globally or locally globally and locally if possible sourcecode of the file where it breaks n a if possible your terminal logs before the error n a motivation for or use case a better visual of the routing system in the documentation reproduce the error n a related issues possibly or
0
63,611
6,876,432,229
IssuesEvent
2017-11-20 00:29:29
tpfinal-pp1/tp-final
https://api.github.com/repos/tpfinal-pp1/tp-final
closed
IMPORTANTE : publicaciones bug al editar, no me permite editar si dejo activa la publicacion
bug Liberado por desarrollo Liberado por testing
![bug edicion publis.gif](https://images.zenhubusercontent.com/59b95e1fb0222d5de47798be/d9821f09-5d61-4f64-9f7f-487013668824)
1.0
IMPORTANTE : publicaciones bug al editar, no me permite editar si dejo activa la publicacion - ![bug edicion publis.gif](https://images.zenhubusercontent.com/59b95e1fb0222d5de47798be/d9821f09-5d61-4f64-9f7f-487013668824)
non_process
importante publicaciones bug al editar no me permite editar si dejo activa la publicacion
0
11,375
14,217,843,200
IssuesEvent
2020-11-17 10:55:43
geneontology/go-ontology
https://api.github.com/repos/geneontology/go-ontology
closed
NTR : 'envenomation resulting in blood agglutination in other organism'
New term request multi-species process
Hi, I would like to request a new term 'envenomation resulting in blood agglutination in other organism' based on PMID: 10484740 'Primary structure and biological activity of snake venom lectin (APL) from Agkistrodon p. piscivorus (Eastern cottonmouth).' They found that "Purified APL (1 μg) strongly agglutinated rabbit erythrocytes in the presence of CaCl2 (Fig. 3b). The minimum hemagglutinating concentration of APL determined in the absence of additional CaCl2 was 0.21 μg/ml. Incubation of APL at 80°C for 15 min did not affect the hemagglutinating activity, while complete inhibition was observed by 15 mM galactose (Table 1). The effect of galactose was also observed even in the presence of CaCl2 (Fig. 3c). The calcium chelater EGTA, also affected the activity of APL in a concentration-dependent manner (Fig. 3d–g, Table 1), suggesting that APL is a galactose-binding protein and calcium ions are essential for hemagglutination activity." There was an 'lectin' term that has been obsoleted. Looking at that page I noticed that there were some suggestions of alternative terms. ( see below) GO:0005530 'obsolete lectin' Molecular Function Definition (GO:0005530 GONUTS page) OBSOLETE. Lectins are proteins obtained particularly from the seeds of leguminous plants, but also from many other plant and animal sources, that have binding sites for specific mono or oligosaccharides in cell walls or membranes. They thereby change the physiology of the membrane to cause agglutination, mitosis, or other biochemical changes in the cell. The two terms below could potentially work but they are not under the 'multi-organism process' branch. **GO:0007157 'heterophilic cell-cell adhesion via plasma membrane cell adhesion molecules'** Biological Process Definition (GO:0007157 GONUTS page) The attachment of an adhesion molecule in one cell to a nonidentical adhesion molecule in an adjacent cell. 
Synonym : 'agglutination' **GO:0016339 'calcium-dependent cell-cell adhesion via plasma membrane cell adhesion molecules'** Biological Process Definition (GO:0016339 GONUTS page) The attachment of one cell to another cell via adhesion molecules that require the presence of calcium for the interaction. There there is also this term that could be suitable as a parent if it was a new term to be created. **GO:0035738 'envenomation resulting in modulation of process in other organism'** Biological Process Definition (GO:0035738 GONUTS page) The process which begins with venom being forced into an organism by the bite or sting of another organism, and ends with the manifestation of some change or damage to the bitten organism. A comparison chart of the three terms can be seen below. I found this book that describes the how these lectins work (unfortunately the following page was not freely available..) Handbook of Venoms and Toxins of Reptiles Edited By Stephen P. Mackessy, Edition 1st Edition First Published 2010 (https://books.google.co.uk/books?id=x_vME799de4C&pg=PA362&lpg=PA362&dq=does+toxin++mediated+blood+agglutination+require+calcium?&source=bl&ots=PyL5-AxXPl&sig=ACfU3U0EqdbeAR6_3vKUTzzrI3NlTVhJ7A&hl=en&sa=X&ved=2ahUKEwjx-6Wz8vHnAhVUQ8AKHey7DpMQ6AEwDXoECAoQAQ#v=onepage&q=does%20toxin%20%20mediated%20blood%20agglutination%20require%20calcium%3F&f=false) The new term would be **'envenomation resulting in blood agglutination in other organism'** Suggested Definition A process that begins with venom being forced into an organism by the bite or sting of another organism, and ends with specific proteins binding to cell-surface carbohydrates and causing calcium-dependent agglutination of blood cells in the bitten organism. 
Synonyms: envenomation resulting in red blood cells agglutination in other organism envenomation resulting in erythrocytes agglutination in other organism At the moment, I have added all the terms in the comparison chart until a more appropriated term is available. Thank you for looking into this. Penelope
1.0
NTR : 'envenomation resulting in blood agglutination in other organism' - Hi, I would like to request a new term 'envenomation resulting in blood agglutination in other organism' based on PMID: 10484740 'Primary structure and biological activity of snake venom lectin (APL) from Agkistrodon p. piscivorus (Eastern cottonmouth).' They found that "Purified APL (1 μg) strongly agglutinated rabbit erythrocytes in the presence of CaCl2 (Fig. 3b). The minimum hemagglutinating concentration of APL determined in the absence of additional CaCl2 was 0.21 μg/ml. Incubation of APL at 80°C for 15 min did not affect the hemagglutinating activity, while complete inhibition was observed by 15 mM galactose (Table 1). The effect of galactose was also observed even in the presence of CaCl2 (Fig. 3c). The calcium chelater EGTA, also affected the activity of APL in a concentration-dependent manner (Fig. 3d–g, Table 1), suggesting that APL is a galactose-binding protein and calcium ions are essential for hemagglutination activity." There was an 'lectin' term that has been obsoleted. Looking at that page I noticed that there were some suggestions of alternative terms. ( see below) GO:0005530 'obsolete lectin' Molecular Function Definition (GO:0005530 GONUTS page) OBSOLETE. Lectins are proteins obtained particularly from the seeds of leguminous plants, but also from many other plant and animal sources, that have binding sites for specific mono or oligosaccharides in cell walls or membranes. They thereby change the physiology of the membrane to cause agglutination, mitosis, or other biochemical changes in the cell. The two terms below could potentially work but they are not under the 'multi-organism process' branch. **GO:0007157 'heterophilic cell-cell adhesion via plasma membrane cell adhesion molecules'** Biological Process Definition (GO:0007157 GONUTS page) The attachment of an adhesion molecule in one cell to a nonidentical adhesion molecule in an adjacent cell. 
Synonym : 'agglutination' **GO:0016339 'calcium-dependent cell-cell adhesion via plasma membrane cell adhesion molecules'** Biological Process Definition (GO:0016339 GONUTS page) The attachment of one cell to another cell via adhesion molecules that require the presence of calcium for the interaction. There there is also this term that could be suitable as a parent if it was a new term to be created. **GO:0035738 'envenomation resulting in modulation of process in other organism'** Biological Process Definition (GO:0035738 GONUTS page) The process which begins with venom being forced into an organism by the bite or sting of another organism, and ends with the manifestation of some change or damage to the bitten organism. A comparison chart of the three terms can be seen below. I found this book that describes the how these lectins work (unfortunately the following page was not freely available..) Handbook of Venoms and Toxins of Reptiles Edited By Stephen P. Mackessy, Edition 1st Edition First Published 2010 (https://books.google.co.uk/books?id=x_vME799de4C&pg=PA362&lpg=PA362&dq=does+toxin++mediated+blood+agglutination+require+calcium?&source=bl&ots=PyL5-AxXPl&sig=ACfU3U0EqdbeAR6_3vKUTzzrI3NlTVhJ7A&hl=en&sa=X&ved=2ahUKEwjx-6Wz8vHnAhVUQ8AKHey7DpMQ6AEwDXoECAoQAQ#v=onepage&q=does%20toxin%20%20mediated%20blood%20agglutination%20require%20calcium%3F&f=false) The new term would be **'envenomation resulting in blood agglutination in other organism'** Suggested Definition A process that begins with venom being forced into an organism by the bite or sting of another organism, and ends with specific proteins binding to cell-surface carbohydrates and causing calcium-dependent agglutination of blood cells in the bitten organism. 
Synonyms: envenomation resulting in red blood cells agglutination in other organism envenomation resulting in erythrocytes agglutination in other organism At the moment, I have added all the terms in the comparison chart until a more appropriated term is available. Thank you for looking into this. Penelope
process
ntr envenomation resulting in blood agglutination in other organism hi i would like to request a new term envenomation resulting in blood agglutination in other organism based on pmid primary structure and biological activity of snake venom lectin apl from agkistrodon p piscivorus eastern cottonmouth they found that purified apl μg strongly agglutinated rabbit erythrocytes in the presence of fig the minimum hemagglutinating concentration of apl determined in the absence of additional was μg ml incubation of apl at °c for min did not affect the hemagglutinating activity while complete inhibition was observed by mm galactose table the effect of galactose was also observed even in the presence of fig the calcium chelater egta also affected the activity of apl in a concentration dependent manner fig –g table suggesting that apl is a galactose binding protein and calcium ions are essential for hemagglutination activity there was an lectin term that has been obsoleted looking at that page i noticed that there were some suggestions of alternative terms see below go obsolete lectin molecular function definition go gonuts page obsolete lectins are proteins obtained particularly from the seeds of leguminous plants but also from many other plant and animal sources that have binding sites for specific mono or oligosaccharides in cell walls or membranes they thereby change the physiology of the membrane to cause agglutination mitosis or other biochemical changes in the cell the two terms below could potentially work but they are not under the multi organism process branch go heterophilic cell cell adhesion via plasma membrane cell adhesion molecules biological process definition go gonuts page the attachment of an adhesion molecule in one cell to a nonidentical adhesion molecule in an adjacent cell synonym agglutination go calcium dependent cell cell adhesion via plasma membrane cell adhesion molecules biological process definition go gonuts page the attachment of one cell to another cell via adhesion molecules that require the presence of calcium for the interaction there there is also this term that could be suitable as a parent if it was a new term to be created go envenomation resulting in modulation of process in other organism biological process definition go gonuts page the process which begins with venom being forced into an organism by the bite or sting of another organism and ends with the manifestation of some change or damage to the bitten organism a comparison chart of the three terms can be seen below i found this book that describes the how these lectins work unfortunately the following page was not freely available handbook of venoms and toxins of reptiles edited by stephen p mackessy edition edition first published the new term would be envenomation resulting in blood agglutination in other organism suggested definition a process that begins with venom being forced into an organism by the bite or sting of another organism and ends with specific proteins binding to cell surface carbohydrates and causing calcium dependent agglutination of blood cells in the bitten organism synonyms envenomation resulting in red blood cells agglutination in other organism envenomation resulting in erythrocytes agglutination in other organism at the moment i have added all the terms in the comparison chart until a more appropriated term is available thank you for looking into this penelope
1
6,738
9,872,905,038
IssuesEvent
2019-06-22 09:18:22
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
Concave hull (alpha shapes) should allow creating separate hulls for each value of a user-selected field
Feature Request Processing
Author Name: **Johannes Kroeger** (Johannes Kroeger) Original Redmine Issue: [21878](https://issues.qgis.org/issues/21878) Redmine category:processing/qgis --- @Concave hull (alpha shapes)@ and @Concave hull (k-nearest neighbor)@ should allow creating separate hulls for each value of a user-selected field. The @Minimum bounding geometry@ supports this (as do many others). See also #29564 for the same request for the @Convex hull@ tool.
1.0
Concave hull (alpha shapes) should allow creating separate hulls for each value of a user-selected field - Author Name: **Johannes Kroeger** (Johannes Kroeger) Original Redmine Issue: [21878](https://issues.qgis.org/issues/21878) Redmine category:processing/qgis --- @Concave hull (alpha shapes)@ and @Concave hull (k-nearest neighbor)@ should allow creating separate hulls for each value of a user-selected field. The @Minimum bounding geometry@ supports this (as do many others). See also #29564 for the same request for the @Convex hull@ tool.
process
concave hull alpha shapes should allow creating separate hulls for each value of a user selected field author name johannes kroeger johannes kroeger original redmine issue redmine category processing qgis concave hull alpha shapes and concave hull k nearest neighbor should allow creating separate hulls for each value of a user selected field the minimum bounding geometry supports this as do many others see also for the same request for the convex hull tool
1
52,988
13,098,441,059
IssuesEvent
2020-08-03 19:29:18
pwa-builder/PWABuilder
https://api.github.com/repos/pwa-builder/PWABuilder
closed
[Screen Readers - PWA Builder - Share API Page] : Alt="" should be defined for the decorative image available on the web page.
A11yCT A11yMAS A11yMediumImpact Accessibility HCL- PWABuilder HCL-E+D MAS1.1.1 Severity3 bug :bug: fixed
**User Experience:** When User navigates the focus to the image the screen reader instead of giving no result, it reads "Graph Login Image", as alt text is defined as alt="Graph Login Image" **Test Environment:** OS: Windows 10 build 19608.1006 Browser: Edge - Anaheim - Version 85.0.545.0 (Official build) dev (64-bit) URL: https://preview.pwabuilder.com/feature/Create%20Share **Repro Steps** 1. Open the URL https://preview.pwabuilder.com/feature/Create%20Share in Edge Anaheim dev browser. 2. Pwabuilder "Share content with the Web Share API" page will open. 3. Navigate to the image present at the bottom of the page. 4. Observe the issue. **Actual Result** Alt= "Graph Login Image" is defined for the decorative image. **Expected Result** Alt="" should be defined for the decorative image available on the web page as this the decorative image and does not contain any informative information. **MAS Reference:** https://microsoft.sharepoint.com/:w:/r/teams/msenable/_layouts/15/WopiFrame.aspx?sourcedoc={d2d2051f-bdc8-4af7-8e18-38aae867e216} ![Image Alt text defined Should be null](https://user-images.githubusercontent.com/66783558/85013664-9876aa80-b182-11ea-8f4c-26741c63ec57.png)
1.0
[Screen Readers - PWA Builder - Share API Page] : Alt="" should be defined for the decorative image available on the web page. - **User Experience:** When User navigates the focus to the image the screen reader instead of giving no result, it reads "Graph Login Image", as alt text is defined as alt="Graph Login Image" **Test Environment:** OS: Windows 10 build 19608.1006 Browser: Edge - Anaheim - Version 85.0.545.0 (Official build) dev (64-bit) URL: https://preview.pwabuilder.com/feature/Create%20Share **Repro Steps** 1. Open the URL https://preview.pwabuilder.com/feature/Create%20Share in Edge Anaheim dev browser. 2. Pwabuilder "Share content with the Web Share API" page will open. 3. Navigate to the image present at the bottom of the page. 4. Observe the issue. **Actual Result** Alt= "Graph Login Image" is defined for the decorative image. **Expected Result** Alt="" should be defined for the decorative image available on the web page as this the decorative image and does not contain any informative information. **MAS Reference:** https://microsoft.sharepoint.com/:w:/r/teams/msenable/_layouts/15/WopiFrame.aspx?sourcedoc={d2d2051f-bdc8-4af7-8e18-38aae867e216} ![Image Alt text defined Should be null](https://user-images.githubusercontent.com/66783558/85013664-9876aa80-b182-11ea-8f4c-26741c63ec57.png)
non_process
alt should be defined for the decorative image available on the web page user experience when user navigates the focus to the image the screen reader instead of giving no result it reads graph login image as alt text is defined as alt graph login image test environment os windows build browser edge anaheim version official build dev bit url repro steps open the url in edge anaheim dev browser pwabuilder share content with the web share api page will open navigate to the image present at the bottom of the page observe the issue actual result alt graph login image is defined for the decorative image expected result alt should be defined for the decorative image available on the web page as this the decorative image and does not contain any informative information mas reference
0
778,484
27,318,444,535
IssuesEvent
2023-02-24 17:33:42
SETI/pds-webtools
https://api.github.com/repos/SETI/pds-webtools
opened
Add pickle files for holdings/documents directory
A-Enhancement Effort 2 Medium B-Other Priority 4 Useful B-PdsFile
Currently the `documents` directory does not have associated pickle files, which means any access by PdsFile needs to go to the filesystem instead of the pickle files. It would be more consistent to have pickle files for the documents directory as well. This involves updating the scripts in `validation` and also making any needed modifications to PdsFile.
1.0
Add pickle files for holdings/documents directory - Currently the `documents` directory does not have associated pickle files, which means any access by PdsFile needs to go to the filesystem instead of the pickle files. It would be more consistent to have pickle files for the documents directory as well. This involves updating the scripts in `validation` and also making any needed modifications to PdsFile.
non_process
add pickle files for holdings documents directory currently the documents directory does not have associated pickle files which means any access by pdsfile needs to go to the filesystem instead of the pickle files it would be more consistent to have pickle files for the documents directory as well this involves updating the scripts in validation and also making any needed modifications to pdsfile
0
181,657
14,884,713,662
IssuesEvent
2021-01-20 14:54:27
sendelufa/jd0
https://api.github.com/repos/sendelufa/jd0
closed
Не описан формат запроса при редактировании профиля без смены пароля
documentation
не описан запрос ![image](https://user-images.githubusercontent.com/10711828/88476535-842c8700-cf52-11ea-9472-1f95517a3e4e.png) фото отправляется, а пароль нет необходимо добавить в API
1.0
Не описан формат запроса при редактировании профиля без смены пароля - не описан запрос ![image](https://user-images.githubusercontent.com/10711828/88476535-842c8700-cf52-11ea-9472-1f95517a3e4e.png) фото отправляется, а пароль нет необходимо добавить в API
non_process
не описан формат запроса при редактировании профиля без смены пароля не описан запрос фото отправляется а пароль нет необходимо добавить в api
0
738,129
25,546,632,382
IssuesEvent
2022-11-29 19:23:56
meanstream-io/meanstream
https://api.github.com/repos/meanstream-io/meanstream
closed
Feature Request: Button to download Atem Macro & Companion file as a zip
type: enhancement priority: low complexity: low
It would be great if a user could download the config files as a Zip file for backup and-/ or to share the config file with others.
1.0
Feature Request: Button to download Atem Macro & Companion file as a zip - It would be great if a user could download the config files as a Zip file for backup and-/ or to share the config file with others.
non_process
feature request button to download atem macro companion file as a zip it would be great if a user could download the config files as a zip file for backup and or to share the config file with others
0
9,679
12,681,878,322
IssuesEvent
2020-06-19 16:12:58
GoogleCloudPlatform/python-docs-samples
https://api.github.com/repos/GoogleCloudPlatform/python-docs-samples
closed
Docker image rebuilt for dependency update PR
testing type: process
PR: https://github.com/GoogleCloudPlatform/python-docs-samples/pull/4109 Log: https://source.cloud.google.com/results/invocations/62276164-4adf-42d7-ae21-c96d43abe161/targets The PR is only updating `requirements.txt`, but the docker re-build was triggered.
1.0
Docker image rebuilt for dependency update PR - PR: https://github.com/GoogleCloudPlatform/python-docs-samples/pull/4109 Log: https://source.cloud.google.com/results/invocations/62276164-4adf-42d7-ae21-c96d43abe161/targets The PR is only updating `requirements.txt`, but the docker re-build was triggered.
process
docker image rebuilt for dependency update pr pr log the pr is only updating requirements txt but the docker re build was triggered
1
331,197
28,606,200,814
IssuesEvent
2023-04-24 00:58:37
pytorch/pytorch
https://api.github.com/repos/pytorch/pytorch
closed
DISABLED test_noncontiguous_samples_matmul_cuda_float32 (__main__.TestCommonCUDA)
module: cuda triaged module: flaky-tests skipped matrix multiplication
Platforms: linux This test was disabled because it is failing in CI. See [recent examples](https://hud.pytorch.org/flakytest?name=test_noncontiguous_samples_matmul_cuda_float32&suite=TestCommonCUDA) and the most recent trunk [workflow logs](https://github.com/pytorch/pytorch/runs/10988020475). Over the past 3 hours, it has been determined flaky in 8 workflow(s) with 8 failures and 8 successes. **Debugging instructions (after clicking on the recent samples link):** DO NOT ASSUME THINGS ARE OKAY IF THE CI IS GREEN. We now shield flaky tests from developers so CI will thus be green but it will be harder to parse the logs. To find relevant log snippets: 1. Click on the workflow logs linked above 2. Click on the Test step of the job so that it is expanded. Otherwise, the grepping will not work. 3. Grep for `test_noncontiguous_samples_matmul_cuda_float32` 4. There should be several instances run (as flaky tests are rerun in CI) from which you can study the logs. Test file path: `test_ops.py` cc @ngimel @jianyuh @nikitaved @pearu @mruberry @walterddr @IvanYashchuk @xwang233 @Lezcano
1.0
DISABLED test_noncontiguous_samples_matmul_cuda_float32 (__main__.TestCommonCUDA) - Platforms: linux This test was disabled because it is failing in CI. See [recent examples](https://hud.pytorch.org/flakytest?name=test_noncontiguous_samples_matmul_cuda_float32&suite=TestCommonCUDA) and the most recent trunk [workflow logs](https://github.com/pytorch/pytorch/runs/10988020475). Over the past 3 hours, it has been determined flaky in 8 workflow(s) with 8 failures and 8 successes. **Debugging instructions (after clicking on the recent samples link):** DO NOT ASSUME THINGS ARE OKAY IF THE CI IS GREEN. We now shield flaky tests from developers so CI will thus be green but it will be harder to parse the logs. To find relevant log snippets: 1. Click on the workflow logs linked above 2. Click on the Test step of the job so that it is expanded. Otherwise, the grepping will not work. 3. Grep for `test_noncontiguous_samples_matmul_cuda_float32` 4. There should be several instances run (as flaky tests are rerun in CI) from which you can study the logs. Test file path: `test_ops.py` cc @ngimel @jianyuh @nikitaved @pearu @mruberry @walterddr @IvanYashchuk @xwang233 @Lezcano
non_process
disabled test noncontiguous samples matmul cuda main testcommoncuda platforms linux this test was disabled because it is failing in ci see and the most recent trunk over the past hours it has been determined flaky in workflow s with failures and successes debugging instructions after clicking on the recent samples link do not assume things are okay if the ci is green we now shield flaky tests from developers so ci will thus be green but it will be harder to parse the logs to find relevant log snippets click on the workflow logs linked above click on the test step of the job so that it is expanded otherwise the grepping will not work grep for test noncontiguous samples matmul cuda there should be several instances run as flaky tests are rerun in ci from which you can study the logs test file path test ops py cc ngimel jianyuh nikitaved pearu mruberry walterddr ivanyashchuk lezcano
0
812,286
30,324,331,198
IssuesEvent
2023-07-10 22:04:29
OpenSourceMalaria/OSM_To_Do_List
https://api.github.com/repos/OpenSourceMalaria/OSM_To_Do_List
closed
Resynthesise TCMDC 132385 to confirm potency
High Priority Synthetic Chemistry Needed Series 3
Prior to synthesis of more piperazine derivatives it is important to confirm the potency of this compounds: ![tcmdc-132385](https://f.cloud.github.com/assets/2626599/566084/88c25172-c669-11e2-84b1-8e591e0be178.png)
1.0
Resynthesise TCMDC 132385 to confirm potency - Prior to synthesis of more piperazine derivatives it is important to confirm the potency of this compounds: ![tcmdc-132385](https://f.cloud.github.com/assets/2626599/566084/88c25172-c669-11e2-84b1-8e591e0be178.png)
non_process
resynthesise tcmdc to confirm potency prior to synthesis of more piperazine derivatives it is important to confirm the potency of this compounds
0
5,510
8,376,914,626
IssuesEvent
2018-10-05 21:39:57
rchain/bounties
https://api.github.com/repos/rchain/bounties
closed
Copy InvoiceTemplate to Personal Gsheet
invoice-process zz-Operations
Goal: RAM can create an invoice by changing the month of payment Action: get invoice data from [Bounty App](https://rewards.rchain.coop/) to Personal Gsheet. Budget: $1000 Deadline: 4/1/2018 Restriction: trustable person because of access to private data Repeated action for 75 collaborators with [RAMmain registration](https://docs.google.com/spreadsheets/d/1rV6Qga-JJpcx-O2I352kHNabIZAXFa46zAl8rHXqYxE/edit#gid=363550433&range=B1:B680) - [ ] copy mail address from TAB [Form Responses 2](https://docs.google.com/spreadsheets/d/1rV6Qga-JJpcx-O2I352kHNabIZAXFa46zAl8rHXqYxE/edit#gid=363550433&range=B1:B680) to cell [c7](https://docs.google.com/spreadsheets/d/1rV6Qga-JJpcx-O2I352kHNabIZAXFa46zAl8rHXqYxE/edit#gid=133909296&range=C7) - [ ] duplicate TAB "Temp", rename "Copy of Temp" to "T" - [ ] in TAB "T" copy rows [2 to 18](https://docs.google.com/spreadsheets/d/1rV6Qga-JJpcx-O2I352kHNabIZAXFa46zAl8rHXqYxE/edit#gid=239080108&range=2:18) and paste as values - [ ] copy cel c15 [TempInvoice](https://docs.google.com/spreadsheets/d/1rV6Qga-JJpcx-O2I352kHNabIZAXFa46zAl8rHXqYxE/edit#gid=133909296&range=C15:G15) - [ ] TAB "Copy to" and paste it in "Or paste a web address here" - [ ] Press OK and go to next email address - [ ] Can this be programmed in a Macro, Script ??
1.0
Copy InvoiceTemplate to Personal Gsheet - Goal: RAM can create an invoice by changing the month of payment Action: get invoice data from [Bounty App](https://rewards.rchain.coop/) to Personal Gsheet. Budget: $1000 Deadline: 4/1/2018 Restriction: trustable person because of access to private data Repeated action for 75 collaborators with [RAMmain registration](https://docs.google.com/spreadsheets/d/1rV6Qga-JJpcx-O2I352kHNabIZAXFa46zAl8rHXqYxE/edit#gid=363550433&range=B1:B680) - [ ] copy mail address from TAB [Form Responses 2](https://docs.google.com/spreadsheets/d/1rV6Qga-JJpcx-O2I352kHNabIZAXFa46zAl8rHXqYxE/edit#gid=363550433&range=B1:B680) to cell [c7](https://docs.google.com/spreadsheets/d/1rV6Qga-JJpcx-O2I352kHNabIZAXFa46zAl8rHXqYxE/edit#gid=133909296&range=C7) - [ ] duplicate TAB "Temp", rename "Copy of Temp" to "T" - [ ] in TAB "T" copy rows [2 to 18](https://docs.google.com/spreadsheets/d/1rV6Qga-JJpcx-O2I352kHNabIZAXFa46zAl8rHXqYxE/edit#gid=239080108&range=2:18) and paste as values - [ ] copy cel c15 [TempInvoice](https://docs.google.com/spreadsheets/d/1rV6Qga-JJpcx-O2I352kHNabIZAXFa46zAl8rHXqYxE/edit#gid=133909296&range=C15:G15) - [ ] TAB "Copy to" and paste it in "Or paste a web address here" - [ ] Press OK and go to next email address - [ ] Can this be programmed in a Macro, Script ??
process
copy invoicetemplate to personal gsheet goal ram can create an invoice by changing the month of payment action get invoice data from to personal gsheet budget deadline restriction trustable person because of access to private data repeated action for collaborators with copy mail address from tab to cell duplicate tab temp rename copy of temp to t in tab t copy rows and paste as values copy cel tab copy to and paste it in or paste a web address here press ok and go to next email address can this be programmed in a macro script
1
21,255
28,376,900,696
IssuesEvent
2023-04-12 21:41:22
X-Sharp/XSharpPublic
https://api.github.com/repos/X-Sharp/XSharpPublic
closed
X# preprocessor insert excess spaces
bug Preprocessor
**Describe the bug** X# preprocessor insert excess spaces into #translate. Because of this it isn't possible to convert our date literals to X# native. **To Reproduce** ``` #translate $(<d>/<m>/<y>) => {^<y>-<m>-<d>} local d := $(23/03/2023) if $(24/03/2023) < d ? "More" else ? "Less" endif ``` **Expected behavior (ppo file)** ``` local d := {^2023-03-23} if {^2023-03-24} < d ? "More" else ? "Less" endif ``` **Actual behavior (ppo file)** ``` local d := {^23 -03 -2023 } if {^23 -03 -2023 } < d ? "More" else ? "Less" endif ``` **Error message** Error XS9111 The error is most likely related to the token '$' that was used at this location Error XS9002 Parser: unexpected input '23' **Additional context** X# Compiler version 2.14.0.4 (release)
1.0
X# preprocessor insert excess spaces - **Describe the bug** X# preprocessor insert excess spaces into #translate. Because of this it isn't possible to convert our date literals to X# native. **To Reproduce** ``` #translate $(<d>/<m>/<y>) => {^<y>-<m>-<d>} local d := $(23/03/2023) if $(24/03/2023) < d ? "More" else ? "Less" endif ``` **Expected behavior (ppo file)** ``` local d := {^2023-03-23} if {^2023-03-24} < d ? "More" else ? "Less" endif ``` **Actual behavior (ppo file)** ``` local d := {^23 -03 -2023 } if {^23 -03 -2023 } < d ? "More" else ? "Less" endif ``` **Error message** Error XS9111 The error is most likely related to the token '$' that was used at this location Error XS9002 Parser: unexpected input '23' **Additional context** X# Compiler version 2.14.0.4 (release)
process
x preprocessor insert excess spaces describe the bug x preprocessor insert excess spaces into translate because of this it isn t possible to convert our date literals to x native to reproduce translate local d if d more else less endif expected behavior ppo file local d if d more else less endif actual behavior ppo file local d if d more else less endif error message error the error is most likely related to the token that was used at this location error parser unexpected input additional context x compiler version release
1
701,684
24,103,736,925
IssuesEvent
2022-09-20 05:01:05
GTBitsOfGood/helping-mamas
https://api.github.com/repos/GTBitsOfGood/helping-mamas
closed
Fix volunteer view refresh button
enhancement good first issue Low priority
### Overview Replace the refresh button style on the volunteer view with the one on the admin view applicant viewer.
1.0
Fix volunteer view refresh button - ### Overview Replace the refresh button style on the volunteer view with the one on the admin view applicant viewer.
non_process
fix volunteer view refresh button overview replace the refresh button style on the volunteer view with the one on the admin view applicant viewer
0
393,016
26,968,471,582
IssuesEvent
2023-02-09 01:21:54
ledger/ledger
https://api.github.com/repos/ledger/ledger
closed
docs: merge aux date and effective date sections
documentation
@simonmichael commented on IRC: ``` 00:31 < sm> https://www.ledger-cli.org/3.0/doc/ledger3.html#Auxiliary-dates and 00:31 < sm> https://www.ledger-cli.org/3.0/doc/ledger3.html#Effective-Dates are documenting the same thing I think ``` they are the same, yeah...
1.0
docs: merge aux date and effective date sections - @simonmichael commented on IRC: ``` 00:31 < sm> https://www.ledger-cli.org/3.0/doc/ledger3.html#Auxiliary-dates and 00:31 < sm> https://www.ledger-cli.org/3.0/doc/ledger3.html#Effective-Dates are documenting the same thing I think ``` they are the same, yeah...
non_process
docs merge aux date and effective date sections simonmichael commented on irc and are documenting the same thing i think they are the same yeah
0
13,637
16,326,522,132
IssuesEvent
2021-05-12 01:57:22
allinurl/goaccess
https://api.github.com/repos/allinurl/goaccess
closed
A possible divide by zero bug
bug html report log-processing terminal output
In src/gholder.c, the function `set_root_metrics` has the following [code](https://github.com/allinurl/goaccess/blob/0abddd5f09ea6385b8bc2b6b1e0d13e438c1697c/src/gholder.c#L550-#L561): ``` static int set_root_metrics (GRawDataItem item, GModule module, datatype type, GMetrics ** nmetrics) { ... uint32_t hits = 0; if (map_data (module, item, type, &data, &hits) == 1) return 1; ... metrics->avgts.nts = cumts / hits; ``` On successful return, the function `map_data` may set `hits` to `item.hits`, and if `item.hits` is `0`, then there is a divide by zero problem. I think this case maybe possible. For example, function [`parse_raw_data`](https://github.com/allinurl/goaccess/blob/0abddd5f09ea6385b8bc2b6b1e0d13e438c1697c/src/gkhash.c#L3546) seems to leave the field `hits` in `GRawDataItem` as `0`.
1.0
A possible divide by zero bug - In src/gholder.c, the function `set_root_metrics` has the following [code](https://github.com/allinurl/goaccess/blob/0abddd5f09ea6385b8bc2b6b1e0d13e438c1697c/src/gholder.c#L550-#L561): ``` static int set_root_metrics (GRawDataItem item, GModule module, datatype type, GMetrics ** nmetrics) { ... uint32_t hits = 0; if (map_data (module, item, type, &data, &hits) == 1) return 1; ... metrics->avgts.nts = cumts / hits; ``` On successful return, the function `map_data` may set `hits` to `item.hits`, and if `item.hits` is `0`, then there is a divide by zero problem. I think this case maybe possible. For example, function [`parse_raw_data`](https://github.com/allinurl/goaccess/blob/0abddd5f09ea6385b8bc2b6b1e0d13e438c1697c/src/gkhash.c#L3546) seems to leave the field `hits` in `GRawDataItem` as `0`.
process
a possible divide by zero bug in src gholder c the function set root metrics has the following static int set root metrics grawdataitem item gmodule module datatype type gmetrics nmetrics t hits if map data module item type data hits return metrics avgts nts cumts hits on successful return the function map data may set hits to item hits and if item hits is then there is a divide by zero problem i think this case maybe possible for example function seems to leave the field hits in grawdataitem as
1
28,661
7,010,820,083
IssuesEvent
2017-12-20 01:35:47
mozilla-mobile/focus-android
https://api.github.com/repos/mozilla-mobile/focus-android
closed
[breakdown] Rethink/refactor our erase strategy
code P3
From focus-android created by [mcomella](https://github.com/mcomella) : mozilla-mobile/focus-android#1569 Splitting from #1472 to avoid scope creep: > Notes: > - We have three kinds of erases: erases with a WebView instance, erases with static WebView methods, and erasing the files on disk. I'd suggest merging the last two. > - We call each of these three at different times for what seem like a lot of redundant calls, e.g. when sessions becomes empty (delete files on disk), onPause and finishing (delete files on disk), when the erase button is actually hit (all three) > - The methods are confusingly named, especially with the extra layers of abstraction like WebViewProvider > > We should try to centralize the erase so it only happens in one place and perhaps when starting a completely fresh browsing session (since that's the workaround for #1306). > > Another note (for another bug?): @pocmo mentions we erase these files whenever a tab is removed (via back) and this seems extreme.
1.0
[breakdown] Rethink/refactor our erase strategy - From focus-android created by [mcomella](https://github.com/mcomella) : mozilla-mobile/focus-android#1569 Splitting from #1472 to avoid scope creep: > Notes: > - We have three kinds of erases: erases with a WebView instance, erases with static WebView methods, and erasing the files on disk. I'd suggest merging the last two. > - We call each of these three at different times for what seem like a lot of redundant calls, e.g. when sessions becomes empty (delete files on disk), onPause and finishing (delete files on disk), when the erase button is actually hit (all three) > - The methods are confusingly named, especially with the extra layers of abstraction like WebViewProvider > > We should try to centralize the erase so it only happens in one place and perhaps when starting a completely fresh browsing session (since that's the workaround for #1306). > > Another note (for another bug?): @pocmo mentions we erase these files whenever a tab is removed (via back) and this seems extreme.
non_process
rethink refactor our erase strategy from focus android created by mozilla mobile focus android splitting from to avoid scope creep notes we have three kinds of erases erases with a webview instance erases with static webview methods and erasing the files on disk i d suggest merging the last two we call each of these three at different times for what seem like a lot of redundant calls e g when sessions becomes empty delete files on disk onpause and finishing delete files on disk when the erase button is actually hit all three the methods are confusingly named especially with the extra layers of abstraction like webviewprovider we should try to centralize the erase so it only happens in one place and perhaps when starting a completely fresh browsing session since that s the workaround for another note for another bug pocmo mentions we erase these files whenever a tab is removed via back and this seems extreme
0
484,157
13,935,177,595
IssuesEvent
2020-10-22 11:07:08
OpenSRP/opensrp-client-reveal
https://api.github.com/repos/OpenSRP/opensrp-client-reveal
opened
Verification form - If structure was not visited, spray status question should be hidden
Android Client Priority: High
Samsung Galaxy Tab A 2017 Reveal 5.3.7 Android 9 Server: Stage Plan: Full Reveal 2020 IRS QA Login to reveal select other forms, enter verifications form Select no on Was this structure visited question. Current behavior: The spray status question is still present. ![Screenshot_20201021-230226_Reveal](https://user-images.githubusercontent.com/18394882/96863635-d573e600-146f-11eb-897a-62c275301382.jpg)
1.0
Verification form - If structure was not visited, spray status question should be hidden - Samsung Galaxy Tab A 2017 Reveal 5.3.7 Android 9 Server: Stage Plan: Full Reveal 2020 IRS QA Login to reveal select other forms, enter verifications form Select no on Was this structure visited question. Current behavior: The spray status question is still present. ![Screenshot_20201021-230226_Reveal](https://user-images.githubusercontent.com/18394882/96863635-d573e600-146f-11eb-897a-62c275301382.jpg)
non_process
verification form if structure was not visited spray status question should be hidden samsung galaxy tab a reveal android server stage plan full reveal irs qa login to reveal select other forms enter verifications form select no on was this structure visited question current behavior the spray status question is still present
0
11,046
13,865,246,737
IssuesEvent
2020-10-16 03:46:23
abhinavkashyap/sciwing
https://api.github.com/repos/abhinavkashyap/sciwing
closed
Different emb loader for word and characters are not needed
data-processing
There are two different classes for WordEmbLoader and CharEmbLoader. This is not needed. We can have just one embedding loader. The `EmbeddingLoader` abstraction does not care, if it is a word or if it is a character. Provide the appropriate embedding type and the tokens will be instantiated with appropriate values
1.0
Different emb loader for word and characters are not needed - There are two different classes for WordEmbLoader and CharEmbLoader. This is not needed. We can have just one embedding loader. The `EmbeddingLoader` abstraction does not care, if it is a word or if it is a character. Provide the appropriate embedding type and the tokens will be instantiated with appropriate values
process
different emb loader for word and characters are not needed there are two different classes for wordembloader and charembloader this is not needed we can have just one embedding loader the embeddingloader abstraction does not care if it is a word or if it is a character provide the appropriate embedding type and the tokens will be instantiated with appropriate values
1
3,215
6,275,574,815
IssuesEvent
2017-07-18 07:20:08
nodejs/node
https://api.github.com/repos/nodejs/node
closed
Node process with --inspect cannot fork child process without passing --inspect
child_process inspector windows
* **Version**: 8.1.1 * **Platform**: Windows 10 If I start a node process with --inspect and fork a child node process without passing --inspect, child process will crash with exit code of 12. Is this the intended behavior? Edit: added a repro case there https://github.com/nodejs/node/issues/14325#issuecomment-315974733
1.0
Node process with --inspect cannot fork child process without passing --inspect - * **Version**: 8.1.1 * **Platform**: Windows 10 If I start a node process with --inspect and fork a child node process without passing --inspect, child process will crash with exit code of 12. Is this the intended behavior? Edit: added a repro case there https://github.com/nodejs/node/issues/14325#issuecomment-315974733
process
node process with inspect cannot fork child process without passing inspect version platform windows if i start a node process with inspect and fork a child node process without passing inspect child process will crash with exit code of is this the intended behavior edit added a repro case there
1
17,725
23,625,816,304
IssuesEvent
2022-08-25 03:42:25
Battle-s/battle-school-backend
https://api.github.com/repos/Battle-s/battle-school-backend
reopened
[FEAT] 게시판 기능 구현
feature :computer: processing :hourglass_flowing_sand:
## 설명 > 게시판 기능 구현 ## 체크사항 > 이슈를 close하기 위해 필요한 조건들을 체크박스로 나열합니다. - [ ] controller (post, commemt) - [ ] domain (post, commemt) - [ ] repository (post, commemt) - [ ] service (post, commemt) - [ ] test (post, commemt) ## 참고자료 > 이슈를 해결하기 위해 필요한 참고자료가 있다면 추가합니다. ## 관련 논의 > 이슈에 대한 논의가 있었다면 논의 내용을 간략하게 추가합니다.
1.0
[FEAT] 게시판 기능 구현 - ## 설명 > 게시판 기능 구현 ## 체크사항 > 이슈를 close하기 위해 필요한 조건들을 체크박스로 나열합니다. - [ ] controller (post, commemt) - [ ] domain (post, commemt) - [ ] repository (post, commemt) - [ ] service (post, commemt) - [ ] test (post, commemt) ## 참고자료 > 이슈를 해결하기 위해 필요한 참고자료가 있다면 추가합니다. ## 관련 논의 > 이슈에 대한 논의가 있었다면 논의 내용을 간략하게 추가합니다.
process
게시판 기능 구현 설명 게시판 기능 구현 체크사항 이슈를 close하기 위해 필요한 조건들을 체크박스로 나열합니다 controller post commemt domain post commemt repository post commemt service post commemt test post commemt 참고자료 이슈를 해결하기 위해 필요한 참고자료가 있다면 추가합니다 관련 논의 이슈에 대한 논의가 있었다면 논의 내용을 간략하게 추가합니다
1
231,618
25,525,462,797
IssuesEvent
2022-11-29 01:36:08
kapseliboi/WeiPay
https://api.github.com/repos/kapseliboi/WeiPay
reopened
CVE-2021-32796 (Medium) detected in xmldom-0.1.27.tgz
security vulnerability
## CVE-2021-32796 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>xmldom-0.1.27.tgz</b></p></summary> <p>A W3C Standard XML DOM(Level2 CORE) implementation and parser(DOMParser/XMLSerializer).</p> <p>Library home page: <a href="https://registry.npmjs.org/xmldom/-/xmldom-0.1.27.tgz">https://registry.npmjs.org/xmldom/-/xmldom-0.1.27.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/xmldom/package.json</p> <p> Dependency Hierarchy: - react-native-0.55.4.tgz (Root Library) - plist-1.2.0.tgz - :x: **xmldom-0.1.27.tgz** (Vulnerable Library) <p>Found in base branch: <b>stable</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> xmldom is an open source pure JavaScript W3C standard-based (XML DOM Level 2 Core) DOMParser and XMLSerializer module. xmldom versions 0.6.0 and older do not correctly escape special characters when serializing elements removed from their ancestor. This may lead to unexpected syntactic changes during XML processing in some downstream applications. This issue has been resolved in version 0.7.0. As a workaround downstream applications can validate the input and reject the maliciously crafted documents. <p>Publish Date: 2021-07-27 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-32796>CVE-2021-32796</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/xmldom/xmldom/security/advisories/GHSA-5fg8-2547-mr8q">https://github.com/xmldom/xmldom/security/advisories/GHSA-5fg8-2547-mr8q</a></p> <p>Release Date: 2021-07-27</p> <p>Fix Resolution: xmldom - 0.7.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-32796 (Medium) detected in xmldom-0.1.27.tgz - ## CVE-2021-32796 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>xmldom-0.1.27.tgz</b></p></summary> <p>A W3C Standard XML DOM(Level2 CORE) implementation and parser(DOMParser/XMLSerializer).</p> <p>Library home page: <a href="https://registry.npmjs.org/xmldom/-/xmldom-0.1.27.tgz">https://registry.npmjs.org/xmldom/-/xmldom-0.1.27.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/xmldom/package.json</p> <p> Dependency Hierarchy: - react-native-0.55.4.tgz (Root Library) - plist-1.2.0.tgz - :x: **xmldom-0.1.27.tgz** (Vulnerable Library) <p>Found in base branch: <b>stable</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> xmldom is an open source pure JavaScript W3C standard-based (XML DOM Level 2 Core) DOMParser and XMLSerializer module. xmldom versions 0.6.0 and older do not correctly escape special characters when serializing elements removed from their ancestor. This may lead to unexpected syntactic changes during XML processing in some downstream applications. This issue has been resolved in version 0.7.0. As a workaround downstream applications can validate the input and reject the maliciously crafted documents. <p>Publish Date: 2021-07-27 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-32796>CVE-2021-32796</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/xmldom/xmldom/security/advisories/GHSA-5fg8-2547-mr8q">https://github.com/xmldom/xmldom/security/advisories/GHSA-5fg8-2547-mr8q</a></p> <p>Release Date: 2021-07-27</p> <p>Fix Resolution: xmldom - 0.7.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in xmldom tgz cve medium severity vulnerability vulnerable library xmldom tgz a standard xml dom core implementation and parser domparser xmlserializer library home page a href path to dependency file package json path to vulnerable library node modules xmldom package json dependency hierarchy react native tgz root library plist tgz x xmldom tgz vulnerable library found in base branch stable vulnerability details xmldom is an open source pure javascript standard based xml dom level core domparser and xmlserializer module xmldom versions and older do not correctly escape special characters when serializing elements removed from their ancestor this may lead to unexpected syntactic changes during xml processing in some downstream applications this issue has been resolved in version as a workaround downstream applications can validate the input and reject the maliciously crafted documents publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution xmldom step up your open source security game with mend
0
239,082
7,786,383,369
IssuesEvent
2018-06-06 18:46:29
qutebrowser/qutebrowser
https://api.github.com/repos/qutebrowser/qutebrowser
closed
Qt 5.11: Page is sometimes split in half
bug: behavior priority: 0 - high qt: 5.11
# Issue Pages are sometimes displayed split in half. # Workaround The current qutebrowser v1.3.1 release has a workaround for this, which works most (but not all) of the time. If you see this issue, make sure you're running v1.3.1. The issue is also fixed in Qt 5.11.1, but I'm not sure if it's really fixed properly - investigation is still ongoing. # Links - My bug report: [QTBUG-68279](https://bugreports.qt.io/browse/QTBUG-68279) - Original bug report: [QTBUG-68224](https://bugreports.qt.io/browse/QTBUG-68224) - Bug fix: https://codereview.qt-project.org/#/c/229443/ - Regression caused by: https://codereview.qt-project.org/#/c/226416/ (for [QTBUG-65595](https://bugreports.qt.io/browse/QTBUG-65595))
1.0
Qt 5.11: Page is sometimes split in half - # Issue Pages are sometimes displayed split in half. # Workaround The current qutebrowser v1.3.1 release has a workaround for this, which works most (but not all) of the time. If you see this issue, make sure you're running v1.3.1. The issue is also fixed in Qt 5.11.1, but I'm not sure if it's really fixed properly - investigation is still ongoing. # Links - My bug report: [QTBUG-68279](https://bugreports.qt.io/browse/QTBUG-68279) - Original bug report: [QTBUG-68224](https://bugreports.qt.io/browse/QTBUG-68224) - Bug fix: https://codereview.qt-project.org/#/c/229443/ - Regression caused by: https://codereview.qt-project.org/#/c/226416/ (for [QTBUG-65595](https://bugreports.qt.io/browse/QTBUG-65595))
non_process
qt page is sometimes split in half issue pages are sometimes displayed split in half workaround the current qutebrowser release has a workaround for this which works most but not all of the time if you see this issue make sure you re running the issue is also fixed in qt but i m not sure if it s really fixed properly investigation is still ongoing links my bug report original bug report bug fix regression caused by for
0
22,704
32,025,427,044
IssuesEvent
2023-09-22 08:30:06
SupaplexOSM/strassenraumkarte-neukoelln
https://api.github.com/repos/SupaplexOSM/strassenraumkarte-neukoelln
reopened
Improve parking lane script
processing
There are a few thinks we would like to add to https://github.com/SupaplexOSM/strassenraumkarte-neukoelln/tree/main/scripts/parking_lanes and/or the script from #6. This issue is meant as a list for those things … - [ ] Add the recently approved new tagging practices, see https://wiki.openstreetmap.org/wiki/Proposed_features/Parking_lane_conditionals - [x] Use Bus stops ([example](https://www.openstreetmap.org/node/599970222)) and/or bus stop platform ([example](https://www.openstreetmap.org/node/599970222)) to automatically define an area that is used to subtract parking space. This is done manually ATM. - [ ] Respect more `parking_spaces` values, like `disabled` https://wiki.openstreetmap.org/wiki/DE:Tag:amenity%3Dparking_space - [x] Change the direction of the way, which is the opposite of the street ATM.
1.0
Improve parking lane script - There are a few thinks we would like to add to https://github.com/SupaplexOSM/strassenraumkarte-neukoelln/tree/main/scripts/parking_lanes and/or the script from #6. This issue is meant as a list for those things … - [ ] Add the recently approved new tagging practices, see https://wiki.openstreetmap.org/wiki/Proposed_features/Parking_lane_conditionals - [x] Use Bus stops ([example](https://www.openstreetmap.org/node/599970222)) and/or bus stop platform ([example](https://www.openstreetmap.org/node/599970222)) to automatically define an area that is used to subtract parking space. This is done manually ATM. - [ ] Respect more `parking_spaces` values, like `disabled` https://wiki.openstreetmap.org/wiki/DE:Tag:amenity%3Dparking_space - [x] Change the direction of the way, which is the opposite of the street ATM.
process
improve parking lane script there are a few thinks we would like to add to and or the script from this issue is meant as a list for those things … add the recently approved new tagging practices see use bus stops and or bus stop platform to automatically define an area that is used to subtract parking space this is done manually atm respect more parking spaces values like disabled change the direction of the way which is the opposite of the street atm
1
1,406
3,970,958,646
IssuesEvent
2016-05-04 09:43:17
DevExpress/testcafe-hammerhead
https://api.github.com/repos/DevExpress/testcafe-hammerhead
closed
instance.toString is not a function
AREA: client BROWSER: Chrome SYSTEM: resource processing TYPE: bug
``` Error at Object.eval (eval at evaluate (unknown source), <anonymous>:1:2) at Object.InjectedScript._evaluateOn (<anonymous>:904:55) at Object.InjectedScript._evaluateAndWrap (<anonymous>:837:34) at Object.InjectedScript.evaluateOnCallFrame (<anonymous>:963:21) at Object.HammerheadClient.define.exports.isStyleInstance (http://kirov-sv:1340/tcse6be4085-e27c-46d7-8310-fa770c1f18fa/hammerhead.js:25552:23) at Object.HammerheadClient.define.exports.init.elementPropertyAccessors.background.condition (http://kirov-sv:1340/tcse6be4085-e27c-46d7-8310-fa770c1f18fa/hammerhead.js:21606:33) at setProperty (http://kirov-sv:1340/tcse6be4085-e27c-46d7-8310-fa770c1f18fa/hammerhead.js:21775:52) at http://kirov-sv:1340/ace/ace.js?7929ba6d39aa4465=http%3A%7Clocalhost%3A3777%7C4%7Ct%7Cscript:10:86387 at Array.forEach (native) at createKeywordMapper (http://kirov-sv:1340/ace/ace.js?7929ba6d39aa4465=http%3A%7Clocalhost%3A3777%7C4%7Ct%7Cscript:10:86270) ```
1.0
instance.toString is not a function - ``` Error at Object.eval (eval at evaluate (unknown source), <anonymous>:1:2) at Object.InjectedScript._evaluateOn (<anonymous>:904:55) at Object.InjectedScript._evaluateAndWrap (<anonymous>:837:34) at Object.InjectedScript.evaluateOnCallFrame (<anonymous>:963:21) at Object.HammerheadClient.define.exports.isStyleInstance (http://kirov-sv:1340/tcse6be4085-e27c-46d7-8310-fa770c1f18fa/hammerhead.js:25552:23) at Object.HammerheadClient.define.exports.init.elementPropertyAccessors.background.condition (http://kirov-sv:1340/tcse6be4085-e27c-46d7-8310-fa770c1f18fa/hammerhead.js:21606:33) at setProperty (http://kirov-sv:1340/tcse6be4085-e27c-46d7-8310-fa770c1f18fa/hammerhead.js:21775:52) at http://kirov-sv:1340/ace/ace.js?7929ba6d39aa4465=http%3A%7Clocalhost%3A3777%7C4%7Ct%7Cscript:10:86387 at Array.forEach (native) at createKeywordMapper (http://kirov-sv:1340/ace/ace.js?7929ba6d39aa4465=http%3A%7Clocalhost%3A3777%7C4%7Ct%7Cscript:10:86270) ```
process
instance tostring is not a function error at object eval eval at evaluate unknown source at object injectedscript evaluateon at object injectedscript evaluateandwrap at object injectedscript evaluateoncallframe at object hammerheadclient define exports isstyleinstance at object hammerheadclient define exports init elementpropertyaccessors background condition at setproperty at at array foreach native at createkeywordmapper
1
31,046
5,897,225,892
IssuesEvent
2017-05-18 11:55:51
reactor/reactor-core
https://api.github.com/repos/reactor/reactor-core
closed
Refguide and javadoc enhancements
documentation
- [x] (refguide) `delayElement`/`delayElements` in time section of "which operator to use" - [x] (javadoc) better hint at the behavior of `merge`/`mergeWith` with infinite sources that don't have a dedicated Scheduler (see #570) - [x] (javadoc) better hint at the fact that `groupBy` needs its groups to be drain or it can hang, eg. with large number of groups and a `flatMap` with too low `maxConcurrency` (see #596)
1.0
Refguide and javadoc enhancements - - [x] (refguide) `delayElement`/`delayElements` in time section of "which operator to use" - [x] (javadoc) better hint at the behavior of `merge`/`mergeWith` with infinite sources that don't have a dedicated Scheduler (see #570) - [x] (javadoc) better hint at the fact that `groupBy` needs its groups to be drain or it can hang, eg. with large number of groups and a `flatMap` with too low `maxConcurrency` (see #596)
non_process
refguide and javadoc enhancements refguide delayelement delayelements in time section of which operator to use javadoc better hint at the behavior of merge mergewith with infinite sources that don t have a dedicated scheduler see javadoc better hint at the fact that groupby needs its groups to be drain or it can hang eg with large number of groups and a flatmap with too low maxconcurrency see
0
19,524
3,775,059,769
IssuesEvent
2016-03-17 11:58:32
chrismeyersfsu/role-install_mongod
https://api.github.com/repos/chrismeyersfsu/role-install_mongod
opened
Ensure HA
enhancement test
Currently, the tests ensure that mongo is up https://github.com/chrismeyersfsu/role-install_mongod/blob/master/test/main.yml#L60 This is the minimum. We should really include an acceptance test to ensure that HA works. ### Steps to Test * Insert data into the database * Kill a mongo server * Ensure data previously inserted exists This will show that replication is working.
1.0
Ensure HA - Currently, the tests ensure that mongo is up https://github.com/chrismeyersfsu/role-install_mongod/blob/master/test/main.yml#L60 This is the minimum. We should really include an acceptance test to ensure that HA works. ### Steps to Test * Insert data into the database * Kill a mongo server * Ensure data previously inserted exists This will show that replication is working.
non_process
ensure ha currently the tests ensure that mongo is up this is the minimum we should really include an acceptance test to ensure that ha works steps to test insert data into the database kill a mongo server ensure data previously inserted exists this will show that replication is working
0
107,184
23,364,434,255
IssuesEvent
2022-08-10 14:18:35
WordPress/openverse-api
https://api.github.com/repos/WordPress/openverse-api
closed
Only fill in pages if at least one link from first page is not dead
🟥 priority: critical ✨ goal: improvement 💻 aspect: code
## Problem <!-- Describe a problem solved by this feature; or delete the section entirely. --> If we make a request to ES for 20 items and find that any of them are dead, we attempt to "fill the page in" with subsequent "deeper page" queries. If _all_ the links for a query are dead, this means we will slam ES with tons of fast queries to try to fill in the page. ## Description <!-- Describe the feature and how it solves the problem. --> Kudos @obulat for the idea. We should not try to fill pages if all of the links on the first page are dead. ## Additional context We are also working on excluding providers that account for a large number of dead links. ## Implementation <!-- Replace the [ ] with [x] to check the box. --> - [ ] 🙋 I would be interested in implementing this feature.
1.0
Only fill in pages if at least one link from first page is not dead - ## Problem <!-- Describe a problem solved by this feature; or delete the section entirely. --> If we make a request to ES for 20 items and find that any of them are dead, we attempt to "fill the page in" with subsequent "deeper page" queries. If _all_ the links for a query are dead, this means we will slam ES with tons of fast queries to try to fill in the page. ## Description <!-- Describe the feature and how it solves the problem. --> Kudos @obulat for the idea. We should not try to fill pages if all of the links on the first page are dead. ## Additional context We are also working on excluding providers that account for a large number of dead links. ## Implementation <!-- Replace the [ ] with [x] to check the box. --> - [ ] 🙋 I would be interested in implementing this feature.
non_process
only fill in pages if at least one link from first page is not dead problem if we make a request to es for items and find that any of them are dead we attempt to fill the page in with subsequent deeper page queries if all the links for a query are dead this means we will slam es with tons of fast queries to try to fill in the page description kudos obulat for the idea we should not try to fill pages if all of the links on the first page are dead additional context we are also working on excluding providers that account for a large number of dead links implementation 🙋 i would be interested in implementing this feature
0
20,420
27,081,030,181
IssuesEvent
2023-02-14 14:03:03
decidim/decidim
https://api.github.com/repos/decidim/decidim
closed
Ransack returns results for multiple organizations
type: bug module: participatory processes
Hi @ahukkanen, the changes, that introduced ransack, were committed by you, so I thought I'd mention you here. We are working on a Decidim 0.27 instance with multiple organizations and just realized, that the count numbers for the processes in one organization are including all processes from all organizations: <img width="547" alt="Screenshot 2023-01-05 at 10 52 04" src="https://user-images.githubusercontent.com/623008/210751568-75f38e82-56f2-4e4d-b155-e811a461295d.png"> I found the related code here: https://github.com/decidim/decidim/blob/46c48976c7512e25051b85798635e282c1bd24ba/decidim-participatory_processes/app/cells/decidim/participatory_processes/process_filters_cell.rb#L57-L62 https://github.com/decidim/decidim/blob/46c48976c7512e25051b85798635e282c1bd24ba/decidim-participatory_processes/app/cells/decidim/participatory_processes/process_filters_cell.rb#L40-L49 I did some simple tests. Using ransack to filter only processes for one organization returns also processes for other organizations: ```rb > Decidim::ParticipatoryProcess.ransack(organization: Decidim::Organization.first).result.collect(&:decidim_organization_i d) => [1, 1, 1, 2] > Decidim::ParticipatoryProcess.ransack({with_date: "active"}, organization: Decidim::Organization.first).result.collect(& :decidim_organization_id) => [2, 1] ``` I don't understand why ransack does this, but maybe it can't be used for filtering for foreign-keys? Maybe it needs to be done like this? ```rb Decidim::ParticipatoryProcess.where(organization: Decidim::Organization.first).ransack({with_date: "active"}).result.col lect(&:decidim_organization_id) => [1] ```
1.0
Ransack returns results for multiple organizations - Hi @ahukkanen, the changes, that introduced ransack, were committed by you, so I thought I'd mention you here. We are working on a Decidim 0.27 instance with multiple organizations and just realized, that the count numbers for the processes in one organization are including all processes from all organizations: <img width="547" alt="Screenshot 2023-01-05 at 10 52 04" src="https://user-images.githubusercontent.com/623008/210751568-75f38e82-56f2-4e4d-b155-e811a461295d.png"> I found the related code here: https://github.com/decidim/decidim/blob/46c48976c7512e25051b85798635e282c1bd24ba/decidim-participatory_processes/app/cells/decidim/participatory_processes/process_filters_cell.rb#L57-L62 https://github.com/decidim/decidim/blob/46c48976c7512e25051b85798635e282c1bd24ba/decidim-participatory_processes/app/cells/decidim/participatory_processes/process_filters_cell.rb#L40-L49 I did some simple tests. Using ransack to filter only processes for one organization returns also processes for other organizations: ```rb > Decidim::ParticipatoryProcess.ransack(organization: Decidim::Organization.first).result.collect(&:decidim_organization_i d) => [1, 1, 1, 2] > Decidim::ParticipatoryProcess.ransack({with_date: "active"}, organization: Decidim::Organization.first).result.collect(& :decidim_organization_id) => [2, 1] ``` I don't understand why ransack does this, but maybe it can't be used for filtering for foreign-keys? Maybe it needs to be done like this? ```rb Decidim::ParticipatoryProcess.where(organization: Decidim::Organization.first).ransack({with_date: "active"}).result.col lect(&:decidim_organization_id) => [1] ```
process
ransack returns results for multiple organizations hi ahukkanen the changes that introduced ransack were committed by you so i thought i d mention you here we are working on a decidim instance with multiple organizations and just realized that the count numbers for the processes in one organization are including all processes from all organizations img width alt screenshot at src i found the related code here i did some simple tests using ransack to filter only processes for one organization returns also processes for other organizations rb decidim participatoryprocess ransack organization decidim organization first result collect decidim organization i d decidim participatoryprocess ransack with date active organization decidim organization first result collect decidim organization id i don t understand why ransack does this but maybe it can t be used for filtering for foreign keys maybe it needs to be done like this rb decidim participatoryprocess where organization decidim organization first ransack with date active result col lect decidim organization id
1
6,777
9,914,924,032
IssuesEvent
2019-06-28 15:30:02
OpenLiberty/docs
https://api.github.com/repos/OpenLiberty/docs
closed
Improve process for quality tech reviews
doc process
ID and engineering need to work together to ensure quality (accuracy and relevance, in particular) of the docs. The current process is not working well enough to catch problems. For example, see #92 and #91.
1.0
Improve process for quality tech reviews - ID and engineering need to work together to ensure quality (accuracy and relevance, in particular) of the docs. The current process is not working well enough to catch problems. For example, see #92 and #91.
process
improve process for quality tech reviews id and engineering need to work together to ensure quality accuracy and relevance in particular of the docs the current process is not working well enough to catch problems for example see and
1