| in_source_id | issue | before_files | after_files | pr_diff |
|---|---|---|---|---|
pex-tool__pex-804 | Release 2.0.2
On the docket:
+ [x] Add a test of pypi index rendering. (#799)
+ [x] Fix `iter_compatible_interpreters` path biasing. (#798)
+ [x] Fix current platform handling. #801
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.0.1'\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.0.2'\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 52bfe2f01..80e4b80d7 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,22 @@
Release Notes
=============
+2.0.2
+-----
+
+This is a hotfix release that fixes a bug exposed when Pex was asked to use an
+interpreter with a non-canonical path as well as fixes for 'current' platform
+handling in the resolver API.
+
+* Fix current platform handling. (#801)
+ `PR #801 <https://github.com/pantsbuild/pex/pull/801>`_
+
+* Add a test of pypi index rendering. (#799)
+ `PR #799 <https://github.com/pantsbuild/pex/pull/799>`_
+
+* Fix `iter_compatible_interpreters` path biasing. (#798)
+ `PR #798 <https://github.com/pantsbuild/pex/pull/798>`_
+
2.0.1
-----
diff --git a/pex/version.py b/pex/version.py
index 7d8716be3..a698f9d1f 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = '2.0.1'
+__version__ = '2.0.2'
|
pex-tool__pex-743 | Release 1.6.8
On the docket:
+ [x] Fixup pex re-exec during bootstrap. #741
+ [x] Pex should not re-exec when the current interpreter satisfies constraints #709
+ [x] Pex should not lose PEX_PYTHON or PEX_PYTHON_PATH when re-exec-ing #710
+ [x] Fix resolution of `setup.py` project extras. #739
Deferred:
+ [ ] Remove PEX_HTTP_RETRIES and push into a flag for the pex tool #94
+ [ ] Sdist resolution is not always reproducible #735
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '1.6.7'\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '1.6.8'\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index d753c6bb2..a32083e64 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,21 @@
Release Notes
=============
+1.6.8
+-----
+
+* Fixup pex re-exec during bootstrap. (#741)
+ `PR #741 <https://github.com/pantsbuild/pex/pull/741>`_
+
+* Fix resolution of `setup.py` project extras. (#739)
+ `PR #739 <https://github.com/pantsbuild/pex/pull/739>`_
+
+* Tighten up namespace declaration logic. (#732)
+ `PR #732 <https://github.com/pantsbuild/pex/pull/732>`_
+
+* Fixup import sorting. (#731)
+ `PR #731 <https://github.com/pantsbuild/pex/pull/731>`_
+
1.6.7
-----
diff --git a/pex/version.py b/pex/version.py
index 789a4befa..fade44a0f 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = '1.6.7'
+__version__ = '1.6.8'
|
pex-tool__pex-691 | Release 1.6.4
On the docket:
+ [x] Restore pex.pex_bootstrapper.is_compressed API #684
+ [ ] Release more flexible pex binaries. #654
+ [x] If an `--interpreter-constraint` is set, it should always be honored. #656
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '1.6.3'\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '1.6.4'\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index c6e494f18..5979452d2 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,9 +1,31 @@
Release Notes
=============
+1.6.4
+-----
+
+This release un-breaks `lambdex <https://github.com/wickman/lambdex>`_.
+
+* Restore ``pex.pex_bootstrapper.is_compressed`` API. (#685)
+ `PR #685 <https://github.com/pantsbuild/pex/pull/685>`_
+
+* Add the version of pex used to build a pex to build_properties. (#687)
+ `PR #687 <https://github.com/pantsbuild/pex/pull/687>`_
+
+* Honor interpreter constraints even when PEX_PYTHON and PEX_PYTHON_PATH not set (#668)
+ `PR #668 <https://github.com/pantsbuild/pex/pull/668>`_
+
1.6.3
-----
+This release changes the behavior of the ``--interpreter-constraint`` option.
+Previously, interpreter constraints were ANDed, which made it impossible to
+express constraints like '>=2.7,<3' OR '>=3.6,<4'; ie: either python 2.7 or
+else any python 3 release at or above 3.6. Now interpreter constraints are
+ORed, which is likely a breaking change if you have scripts that pass multiple
+interpreter constraints. To transition, use the native ``,`` AND operator in
+your constraint expression, as used in the example above.
+
* Provide control over pex warning behavior. (#680)
`PR #680 <https://github.com/pantsbuild/pex/pull/680>`_
diff --git a/pex/version.py b/pex/version.py
index 24b3a7da7..0fc585ba9 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = '1.6.3'
+__version__ = '1.6.4'
|
pex-tool__pex-702 | Release 1.6.6
On the docket:
+ [x] Release more flexible pex binaries. #654
+ [x] If sys.executable is not on PATH a pex will re-exec itself forever. #700
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '1.6.5'\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '1.6.6'\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 39ebf2bce..808b715ed 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,18 @@
Release Notes
=============
+1.6.6
+-----
+
+This is the first release including only a single PEX pex, which
+supports execution under all interpreters pex supports.
+
+* Fix pex bootstrap interpreter selection. (#701)
+ `PR #701 <https://github.com/pantsbuild/pex/pull/701>`_
+
+* Switch releases to a single multi-pex. (#698)
+ `PR #698 <https://github.com/pantsbuild/pex/pull/698>`_
+
1.6.5
-----
diff --git a/pex/version.py b/pex/version.py
index b15f6cce4..3ecfde800 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = '1.6.5'
+__version__ = '1.6.6'
|
ethereum__consensus-specs-1130 | BLS and testing
Decided I wanted to get this out to explain the current state of testing, and **collect feedback** (implementers please comment) on what you need from testing, and your feelings about BLS usage in tests.
# BLS and testing
The two pain-points to get a pretty (and large) set of test-vectors out for clients are:
- BLS Signature creation
- BLS Signature verification
And side-issue, but easily resolved:
*efficient creation of a genesis state*:
Once BLS functionality is implemented in test code (creation of signed deposits, and verification), building a genesis state becomes expensive.
The solution would be to either cache it, or create it directly without going through the spec functions (the current temporary solution on the experiment branch).
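The caching option can be sketched as below. This is a minimal illustration, not the experiment-branch code; `create_genesis_state` is a stand-in for whatever spec/test helper actually builds the state:

```python
from copy import deepcopy

# Hypothetical cache of genesis states, keyed by validator count.
_genesis_cache = {}


def cached_genesis_state(num_validators, create_genesis_state):
    """Build the (expensive, BLS-heavy) genesis state once per validator
    count, then serve deep copies so tests can mutate state freely."""
    if num_validators not in _genesis_cache:
        _genesis_cache[num_validators] = create_genesis_state(num_validators)
    return deepcopy(_genesis_cache[num_validators])
```

Handing out deep copies keeps the cached state pristine across tests, at a memory cost that is small next to re-signing all the deposits.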
## Status
Talking about the status on [`spectest-deco` PR 1052](https://github.com/ethereum/eth2.0-specs/pull/1052) here, based on the `v06x` branch, where we are developing 0.6 improvements. (to be merged back into dev later)
### The testing pipeline currently looks like:
- py-spec, calls BLS stub
- test-helpers, don't create self-signed objects with valid signatures
- py-test code, unified with test-vector-creation (see [PR 1052](https://github.com/ethereum/eth2.0-specs/pull/1052))
- py-test runner to run spec-tests, purely for assertions
- test-generator running the spec-tests, passing `generator_mode=true` to each of them, making them output a test-vector.
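The unified pytest/generator pattern in the pipeline above can be sketched roughly as follows; the decorator name echoes `@spec_test` from PR 1052, but the body is an illustrative guess, not the actual implementation:

```python
def spec_test(fn):
    """Dual-use test decorator sketch: the test body yields (key, value)
    pairs describing its inputs/outputs while also running its own
    assertions. In normal test mode the pairs are discarded; in
    generator_mode they are collected into a test-vector dict."""
    def wrapper(*args, generator_mode=False, **kwargs):
        vector = {}
        for key, value in fn(*args, **kwargs):
            vector[key] = value
        return vector if generator_mode else None
    return wrapper
```

The same function then serves both as a plain assertion test and, with `generator_mode=True`, as the source of a yaml test-vector.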
### Pytests status:
- move from `tests/` to `eth2spec/test`, i.e. part of package
- removed use of `pytest`
- annotated with `@spec_test` or similar (see PR 1052)
- as part of test-generation effort, yay for shared effort:
- expanded in block-operation testing: [coverage checklist here](https://github.com/ethereum/eth2.0-specs/issues/927)
- slightly faster, less deep-copies
- stuck on BLS stub (no sig creation/verification)
### Test-generation status:
- BLS, SSZ-generic, SSZ-static, shuffling test generators still all in place and up to date (`v06x` branch)
- `operations` test-gen uses test-package ability to output test-vectors for each test-case
- but no valid signatures
- lack of a definition how to handle this signature problem as a test-consumer
- there are no signature-related testcases
- turning BLS off would effectively let you check conformance, but it's hacky, and not remotely a good practice to have even an option for...
- it's approx. 140MB worth (iirc) of yaml-encoded state-transitions, covering many edge-cases. Worth getting into the hands of implementers quickly.
- `sanity` tests are updated and can be cleanly used for test-generation, but more work is required to define the format of the test-vectors, as there is more variety.
- `epoch` processing tests are also updated and usable, but not as complete as block-processing; lower priority.
## Possible ways forward:
- Simple but hacky: "turn BLS off for testing"
- No "BLS off", BLS ON on client side, but only partially on spec side. Rely on signature verification not being hit before anything else during testing
- valid test cases generated with valid signatures
- invalid test cases marked: does it error because of BLS? And runners should check the reason for aborting processing: if it doesn't match, the test should fail. Now these pytests don't need full BLS update work, and can be released somewhat quicker
- "BLS on", more work (~1 week)
- slower on test-generation, but we get the best kind of test-vectors: correct, BLS verification ON.
- blocker: what if a test case fails because of a signature error (the test setup not creating the signature correctly) instead of a real assertion case? The spec would look correct and pass the tests, but things would not be right. We need to mark signature-verification errors distinctly, so we can catch these problems when we turn BLS on in the pyspec. How: instead of `assert verify_...`, just call `verify_...`, and make it raise a special `BLSVerificationError` (or something like that)
- We likely still want to mark tests as "signature related" or not, so implementers can catch it easily if their code is not aborting properly before signature verification, to assure invalid inputs are not costly.
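The distinct-error idea can be sketched like this; `BLSVerificationError` is the name floated above, while `verify_signature` and the injected `bls_verify` callable are hypothetical stand-ins for the pyspec helpers:

```python
class BLSVerificationError(Exception):
    """Raised when a BLS signature check fails, as opposed to a plain
    AssertionError from a real state-transition validity check."""


def verify_signature(pubkey, message_hash, signature, domain, bls_verify):
    # bls_verify is whatever real pairing check is in use; it is injected
    # here so the sketch stays self-contained.
    if not bls_verify(pubkey, message_hash, signature, domain):
        raise BLSVerificationError(
            "signature check failed for pubkey %r" % (pubkey,)
        )
```

A test runner can then tell `BLSVerificationError` apart from `AssertionError` and fail a test whose abort reason does not match the expected one.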
A work-in-progress introduction of actual full BLS usage in the pytests is started here: [`tests-with-sigs` branch](https://github.com/ethereum/eth2.0-specs/tree/tests-with-sigs)
Suggestions welcome.
| [
{
"content": "import sys\nimport function_puller\n\n\ndef build_phase0_spec(sourcefile, outfile):\n code_lines = []\n code_lines.append(\"\"\"\nfrom typing import (\n Any,\n Dict,\n List,\n NewType,\n Tuple,\n)\nfrom eth2spec.utils.minimal_ssz import *\nfrom eth2spec.utils.bls_stub import *... | [
{
"content": "import sys\nimport function_puller\n\n\ndef build_phase0_spec(sourcefile, outfile):\n code_lines = []\n code_lines.append(\"\"\"\nfrom typing import (\n Any,\n Dict,\n List,\n NewType,\n Tuple,\n)\nfrom eth2spec.utils.minimal_ssz import *\nfrom eth2spec.utils.bls import *\n\n\... | diff --git a/Makefile b/Makefile
index 73d8adea89..86303680d3 100644
--- a/Makefile
+++ b/Makefile
@@ -34,7 +34,7 @@ install_test:
cd $(PY_SPEC_DIR); python3 -m venv venv; . venv/bin/activate; pip3 install -r requirements-testing.txt;
test: $(PY_SPEC_ALL_TARGETS)
- cd $(PY_SPEC_DIR); . venv/bin/activate; python -m pytest .
+ cd $(PY_SPEC_DIR); . venv/bin/activate; python -m pytest eth2spec
citest: $(PY_SPEC_ALL_TARGETS)
cd $(PY_SPEC_DIR); mkdir -p test-reports/eth2spec; . venv/bin/activate; python -m pytest --junitxml=test-reports/eth2spec/test_results.xml .
diff --git a/scripts/phase0/build_spec.py b/scripts/phase0/build_spec.py
index da5845951d..26b0e5a8a6 100644
--- a/scripts/phase0/build_spec.py
+++ b/scripts/phase0/build_spec.py
@@ -13,7 +13,7 @@ def build_phase0_spec(sourcefile, outfile):
Tuple,
)
from eth2spec.utils.minimal_ssz import *
-from eth2spec.utils.bls_stub import *
+from eth2spec.utils.bls import *
""")
for i in (1, 2, 3, 4, 8, 32, 48, 96):
diff --git a/specs/core/0_beacon-chain.md b/specs/core/0_beacon-chain.md
index e56fd976cc..46c811fedb 100644
--- a/specs/core/0_beacon-chain.md
+++ b/specs/core/0_beacon-chain.md
@@ -1756,7 +1756,8 @@ def process_deposit(state: BeaconState, deposit: Deposit) -> None:
amount = deposit.data.amount
validator_pubkeys = [v.pubkey for v in state.validator_registry]
if pubkey not in validator_pubkeys:
- # Verify the deposit signature (proof of possession)
+ # Verify the deposit signature (proof of possession).
+ # Invalid signatures are allowed by the deposit contract, and hence included on-chain, but must not be processed.
if not bls_verify(pubkey, signing_root(deposit.data), deposit.data.signature, get_domain(state, DOMAIN_DEPOSIT)):
return
diff --git a/specs/test_formats/README.md b/specs/test_formats/README.md
index 273659ce93..d245fcfa46 100644
--- a/specs/test_formats/README.md
+++ b/specs/test_formats/README.md
@@ -176,6 +176,18 @@ To prevent parsing of hundreds of different YAML files to test a specific test t
... <--- more test types
```
+## Common test-case properties
+
+Some test-case formats share some common key-value pair patterns, and these are documented here:
+
+```
+bls_setting: int -- optional, can have 3 different values:
+ 0: (default, applies if key-value pair is absent). Free to choose either BLS ON or OFF.
+ Tests are generated with valid BLS data in this case,
+ but there is no change of outcome when running the test if BLS is ON or OFF.
+ 1: known as "BLS required" - if the test validity is strictly dependent on BLS being ON
+ 2: known as "BLS ignored" - if the test validity is strictly dependent on BLS being OFF
+```
## Note for implementers
diff --git a/specs/test_formats/epoch_processing/README.md b/specs/test_formats/epoch_processing/README.md
new file mode 100644
index 0000000000..6384a0eda9
--- /dev/null
+++ b/specs/test_formats/epoch_processing/README.md
@@ -0,0 +1,29 @@
+# Epoch processing tests
+
+The different epoch sub-transitions are tested individually with test handlers.
+The format is similar to block-processing state-transition tests.
+There is no "change" factor however, the transitions are pure functions with just the pre-state as input.
+Hence, the format is shared between each test-handler. (See test condition documentation on how to run the tests.)
+
+## Test case format
+
+```yaml
+description: string -- description of test case, purely for debugging purposes
+bls_setting: int -- see general test-format spec.
+pre: BeaconState -- state before running the sub-transition
+post: BeaconState -- state after applying the epoch sub-transition.
+```
+
+## Condition
+
+A handler of the `epoch_processing` test-runner should process these cases,
+ calling the corresponding processing implementation.
+
+Sub-transitions:
+
+| *`sub-transition-name`* | *`processing call`* |
+|-------------------------|-----------------------------------|
+| `crosslinks` | `process_crosslinks(state)` |
+| `registry_updates` | `process_registry_updates(state)` |
+
+The resulting state should match the expected `post` state.
diff --git a/specs/test_formats/operations/README.md b/specs/test_formats/operations/README.md
index 842dc3615f..32cf880b36 100644
--- a/specs/test_formats/operations/README.md
+++ b/specs/test_formats/operations/README.md
@@ -2,9 +2,34 @@
The different kinds of operations ("transactions") are tested individually with test handlers.
-The tested operation kinds are:
-- [`deposits`](./deposits.md)
-- More tests are work-in-progress.
+## Test case format
+```yaml
+description: string -- description of test case, purely for debugging purposes
+bls_setting: int -- see general test-format spec.
+pre: BeaconState -- state before applying the operation
+<operation-name>: <operation-object> -- the YAML encoded operation, e.g. a "ProposerSlashing", or "Deposit".
+post: BeaconState -- state after applying the operation. No value if operation processing is aborted.
+```
+## Condition
+A handler of the `operations` test-runner should process these cases,
+ calling the corresponding processing implementation.
+
+Operations:
+
+| *`operation-name`* | *`operation-object`* | *`input name`* | *`processing call`* |
+|-------------------------|----------------------|----------------------|--------------------------------------------------------|
+| `attestation` | `Attestation` | `attestation` | `process_attestation(state, attestation)` |
+| `attester_slashing` | `AttesterSlashing` | `attester_slashing` | `process_attester_slashing(state, attester_slashing)` |
+| `block_header` | `Block` | `block` | `process_block_header(state, block)` |
+| `deposit` | `Deposit` | `deposit` | `process_deposit(state, deposit)` |
+| `proposer_slashing` | `ProposerSlashing` | `proposer_slashing` | `process_proposer_slashing(state, proposer_slashing)` |
+| `transfer` | `Transfer` | `transfer` | `process_transfer(state, transfer)` |
+| `voluntary_exit` | `VoluntaryExit` | `voluntary_exit` | `process_voluntary_exit(state, voluntary_exit)` |
+
+Note that `block_header` is not strictly an operation (and is a full `Block`), but processed in the same manner, and hence included here.
+
+The resulting state should match the expected `post` state, or if the `post` state is left blank,
+ the handler should reject the input operation as invalid.
diff --git a/specs/test_formats/operations/deposits.md b/specs/test_formats/operations/deposits.md
deleted file mode 100644
index 8f44ebb228..0000000000
--- a/specs/test_formats/operations/deposits.md
+++ /dev/null
@@ -1,18 +0,0 @@
-# Test format: Deposit operations
-
-A deposit is a form of an operation (or "transaction"), modifying the state.
-
-## Test case format
-
-```yaml
-description: string -- description of test case, purely for debugging purposes
-pre: BeaconState -- state before applying the deposit
-deposit: Deposit -- the deposit
-post: BeaconState -- state after applying the deposit. No value if deposit processing is aborted.
-```
-
-## Condition
-
-A `deposits` handler of the `operations` should process these cases,
- calling the implementation of the `process_deposit(state, deposit)` functionality described in the spec.
-The resulting state should match the expected `post` state, or if the `post` state is left blank, the handler should reject the inputs as invalid.
diff --git a/specs/test_formats/sanity/README.md b/specs/test_formats/sanity/README.md
new file mode 100644
index 0000000000..20b36208a4
--- /dev/null
+++ b/specs/test_formats/sanity/README.md
@@ -0,0 +1,7 @@
+# Sanity tests
+
+The aim of the sanity tests is to set a base-line on what really needs to pass, i.e. the essentials.
+
+There are two handlers, documented individually:
+- [`slots`](./slots.md): transitions of one or more slots (and epoch transitions within)
+- [`blocks`](./blocks.md): transitions triggered by one or more blocks
diff --git a/specs/test_formats/sanity/blocks.md b/specs/test_formats/sanity/blocks.md
new file mode 100644
index 0000000000..3004a6de70
--- /dev/null
+++ b/specs/test_formats/sanity/blocks.md
@@ -0,0 +1,18 @@
+# Sanity blocks testing
+
+Sanity tests to cover a series of one or more blocks being processed, aiming to cover common changes.
+
+## Test case format
+
+```yaml
+description: string -- description of test case, purely for debugging purposes
+bls_setting: int -- see general test-format spec.
+pre: BeaconState -- state before running through the transitions triggered by the blocks.
+blocks: [BeaconBlock] -- blocks to process, in given order, following the main transition function (i.e. process slot and epoch transitions in between blocks as normal)
+post: BeaconState -- state after applying all the transitions triggered by the blocks.
+```
+
+## Condition
+
+The resulting state should match the expected `post` state, or if the `post` state is left blank,
+ the handler should reject the series of blocks as invalid.
diff --git a/specs/test_formats/sanity/slots.md b/specs/test_formats/sanity/slots.md
new file mode 100644
index 0000000000..81866d47b9
--- /dev/null
+++ b/specs/test_formats/sanity/slots.md
@@ -0,0 +1,23 @@
+# Sanity slots testing
+
+Sanity tests to cover a series of one or more empty-slot transitions being processed, aiming to cover common changes.
+
+## Test case format
+
+```yaml
+description: string -- description of test case, purely for debugging purposes
+bls_setting: int -- see general test-format spec.
+pre: BeaconState -- state before running through the transitions.
+slots: N -- amount of slots to process, N being a positive number.
+post: BeaconState -- state after applying all the transitions.
+```
+
+The transition with pure time, no blocks, is known as `state_transition_to(state, slot)` in the spec.
+This runs state-caching (pure slot transition) and epoch processing (every E slots).
+
+To process the data, call `state_transition_to(pre, pre.slot + N)`. And see if `pre` mutated into the equivalent of `post`.
+
+
+## Condition
+
+The resulting state should match the expected `post` state.
diff --git a/specs/test_formats/ssz_static/core.md b/specs/test_formats/ssz_static/core.md
index 1d470c3381..0f26e0f9c8 100644
--- a/specs/test_formats/ssz_static/core.md
+++ b/specs/test_formats/ssz_static/core.md
@@ -9,11 +9,11 @@ This test-format ensures these direct serializations are covered.
## Test case format
```yaml
-type_name: string -- string, object name, formatted as in spec. E.g. "BeaconBlock"
-value: dynamic -- the YAML-encoded value, of the type specified by type_name.
-serialized: bytes -- string, SSZ-serialized data, hex encoded, with prefix 0x
-root: bytes32 -- string, hash-tree-root of the value, hex encoded, with prefix 0x
-signing_root: bytes32 -- string, signing-root of the value, hex encoded, with prefix 0x. Optional, present if type contains ``signature`` field
+SomeObjectName: -- key, object name, formatted as in spec. E.g. "BeaconBlock".
+ value: dynamic -- the YAML-encoded value, of the type specified by type_name.
+ serialized: bytes -- string, SSZ-serialized data, hex encoded, with prefix 0x
+ root: bytes32 -- string, hash-tree-root of the value, hex encoded, with prefix 0x
+ signing_root: bytes32 -- string, signing-root of the value, hex encoded, with prefix 0x. Optional, present if type contains ``signature`` field
```
## Condition
diff --git a/test_generators/README.md b/test_generators/README.md
index 43bf7af031..309a64bd92 100644
--- a/test_generators/README.md
+++ b/test_generators/README.md
@@ -58,7 +58,7 @@ It's recommended to extend the base-generator.
Create a `requirements.txt` in the root of your generator directory:
```
-eth-utils==1.4.1
+eth-utils==1.6.0
../../test_libs/gen_helpers
../../test_libs/config_helpers
../../test_libs/pyspec
diff --git a/test_generators/bls/requirements.txt b/test_generators/bls/requirements.txt
index 8a933d41ca..6d83bdfb59 100644
--- a/test_generators/bls/requirements.txt
+++ b/test_generators/bls/requirements.txt
@@ -1,3 +1,3 @@
-py-ecc==1.6.0
-eth-utils==1.4.1
+py-ecc==1.7.0
+eth-utils==1.6.0
../../test_libs/gen_helpers
diff --git a/test_generators/epoch_processing/README.md b/test_generators/epoch_processing/README.md
new file mode 100644
index 0000000000..9b57875e2a
--- /dev/null
+++ b/test_generators/epoch_processing/README.md
@@ -0,0 +1,11 @@
+# Epoch processing
+
+Epoch processing covers the sub-transitions during an epoch change.
+
+An epoch-processing test-runner can consume these sub-transition test-suites,
+ and handle different kinds of epoch sub-transitions by processing the cases using the specified test handler.
+
+Information on the format of the tests can be found in the [epoch-processing test formats documentation](../../specs/test_formats/epoch_processing/README.md).
+
+
+
diff --git a/test_generators/epoch_processing/main.py b/test_generators/epoch_processing/main.py
new file mode 100644
index 0000000000..8f067e4a35
--- /dev/null
+++ b/test_generators/epoch_processing/main.py
@@ -0,0 +1,38 @@
+from typing import Callable, Iterable
+
+from eth2spec.phase0 import spec
+from eth2spec.test.epoch_processing import (
+ test_process_crosslinks,
+ test_process_registry_updates
+)
+from gen_base import gen_runner, gen_suite, gen_typing
+from gen_from_tests.gen import generate_from_tests
+from preset_loader import loader
+
+
+def create_suite(transition_name: str, config_name: str, get_cases: Callable[[], Iterable[gen_typing.TestCase]]) \
+ -> Callable[[str], gen_typing.TestSuiteOutput]:
+ def suite_definition(configs_path: str) -> gen_typing.TestSuiteOutput:
+ presets = loader.load_presets(configs_path, config_name)
+ spec.apply_constants_preset(presets)
+
+ return ("%s_%s" % (transition_name, config_name), transition_name, gen_suite.render_suite(
+ title="%s epoch processing" % transition_name,
+ summary="Test suite for %s type epoch processing" % transition_name,
+ forks_timeline="testing",
+ forks=["phase0"],
+ config=config_name,
+ runner="epoch_processing",
+ handler=transition_name,
+ test_cases=get_cases()))
+
+ return suite_definition
+
+
+if __name__ == "__main__":
+ gen_runner.run_generator("epoch_processing", [
+ create_suite('crosslinks', 'minimal', lambda: generate_from_tests(test_process_crosslinks)),
+ create_suite('crosslinks', 'mainnet', lambda: generate_from_tests(test_process_crosslinks)),
+ create_suite('registry_updates', 'minimal', lambda: generate_from_tests(test_process_registry_updates)),
+ create_suite('registry_updates', 'mainnet', lambda: generate_from_tests(test_process_registry_updates)),
+ ])
diff --git a/test_generators/epoch_processing/requirements.txt b/test_generators/epoch_processing/requirements.txt
new file mode 100644
index 0000000000..595cee69cd
--- /dev/null
+++ b/test_generators/epoch_processing/requirements.txt
@@ -0,0 +1,4 @@
+eth-utils==1.6.0
+../../test_libs/gen_helpers
+../../test_libs/config_helpers
+../../test_libs/pyspec
\ No newline at end of file
diff --git a/test_generators/operations/README.md b/test_generators/operations/README.md
index e0b9d0e187..5cb3afc989 100644
--- a/test_generators/operations/README.md
+++ b/test_generators/operations/README.md
@@ -3,7 +3,6 @@
Operations (or "transactions" in previous spec iterations),
are atomic changes to the state, introduced by embedding in blocks.
-This generator provides a series of test suites, divided into handler, for each operation type.
An operation test-runner can consume these operation test-suites,
and handle different kinds of operations by processing the cases using the specified test handler.
diff --git a/test_generators/operations/deposits.py b/test_generators/operations/deposits.py
deleted file mode 100644
index 075ccbd5ba..0000000000
--- a/test_generators/operations/deposits.py
+++ /dev/null
@@ -1,180 +0,0 @@
-from eth2spec.phase0 import spec
-from eth_utils import (
- to_dict, to_tuple
-)
-from gen_base import gen_suite, gen_typing
-from preset_loader import loader
-from eth2spec.debug.encode import encode
-from eth2spec.utils.minimal_ssz import signing_root
-from eth2spec.utils.merkle_minimal import get_merkle_root, calc_merkle_tree_from_leaves, get_merkle_proof
-
-from typing import List, Tuple
-
-import genesis
-import keys
-from py_ecc import bls
-
-
-def build_deposit_data(state,
- pubkey: spec.BLSPubkey,
- withdrawal_cred: spec.Bytes32,
- privkey: int,
- amount: int):
- deposit_data = spec.DepositData(
- pubkey=pubkey,
- withdrawal_credentials=spec.BLS_WITHDRAWAL_PREFIX_BYTE + withdrawal_cred[1:],
- amount=amount,
- )
- deposit_data.proof_of_possession = bls.sign(
- message_hash=signing_root(deposit_data),
- privkey=privkey,
- domain=spec.get_domain(
- state,
- spec.get_current_epoch(state),
- spec.DOMAIN_DEPOSIT,
- )
- )
- return deposit_data
-
-
-def build_deposit(state,
- deposit_data_leaves: List[spec.Bytes32],
- pubkey: spec.BLSPubkey,
- withdrawal_cred: spec.Bytes32,
- privkey: int,
- amount: int) -> spec.Deposit:
-
- deposit_data = build_deposit_data(state, pubkey, withdrawal_cred, privkey, amount)
-
- item = deposit_data.hash_tree_root()
- index = len(deposit_data_leaves)
- deposit_data_leaves.append(item)
- tree = calc_merkle_tree_from_leaves(tuple(deposit_data_leaves))
- proof = list(get_merkle_proof(tree, item_index=index))
-
- deposit = spec.Deposit(
- proof=list(proof),
- index=index,
- data=deposit_data,
- )
- assert spec.verify_merkle_branch(item, proof, spec.DEPOSIT_CONTRACT_TREE_DEPTH, index, get_merkle_root(tuple(deposit_data_leaves)))
-
- return deposit
-
-
-def build_deposit_for_index(initial_validator_count: int, index: int) -> Tuple[spec.Deposit, spec.BeaconState]:
- genesis_deposits = genesis.create_deposits(
- keys.pubkeys[:initial_validator_count],
- keys.withdrawal_creds[:initial_validator_count]
- )
- state = genesis.create_genesis_state(genesis_deposits)
-
- deposit_data_leaves = [dep.data.hash_tree_root() for dep in genesis_deposits]
-
- deposit = build_deposit(
- state,
- deposit_data_leaves,
- keys.pubkeys[index],
- keys.withdrawal_creds[index],
- keys.privkeys[index],
- spec.MAX_EFFECTIVE_BALANCE,
- )
-
- state.latest_eth1_data.deposit_root = get_merkle_root(tuple(deposit_data_leaves))
- state.latest_eth1_data.deposit_count = len(deposit_data_leaves)
-
- return deposit, state
-
-
-@to_dict
-def valid_deposit():
- new_dep, state = build_deposit_for_index(10, 10)
- yield 'description', 'valid deposit to add new validator'
- yield 'pre', encode(state, spec.BeaconState)
- yield 'deposit', encode(new_dep, spec.Deposit)
- spec.process_deposit(state, new_dep)
- yield 'post', encode(state, spec.BeaconState)
-
-
-@to_dict
-def valid_topup():
- new_dep, state = build_deposit_for_index(10, 3)
- yield 'description', 'valid deposit to top-up existing validator'
- yield 'pre', encode(state, spec.BeaconState)
- yield 'deposit', encode(new_dep, spec.Deposit)
- spec.process_deposit(state, new_dep)
- yield 'post', encode(state, spec.BeaconState)
-
-
-@to_dict
-def invalid_deposit_index():
- new_dep, state = build_deposit_for_index(10, 10)
- # Mess up deposit index, 1 too small
- state.deposit_index = 9
-
- yield 'description', 'invalid deposit index'
- yield 'pre', encode(state, spec.BeaconState)
- yield 'deposit', encode(new_dep, spec.Deposit)
- try:
- spec.process_deposit(state, new_dep)
- except AssertionError:
- # expected
- yield 'post', None
- return
- raise Exception('invalid_deposit_index has unexpectedly allowed deposit')
-
-
-@to_dict
-def invalid_deposit_proof():
- new_dep, state = build_deposit_for_index(10, 10)
- # Make deposit proof invalid (at bottom of proof)
- new_dep.proof[-1] = spec.ZERO_HASH
-
- yield 'description', 'invalid deposit proof'
- yield 'pre', encode(state, spec.BeaconState)
- yield 'deposit', encode(new_dep, spec.Deposit)
- try:
- spec.process_deposit(state, new_dep)
- except AssertionError:
- # expected
- yield 'post', None
- return
- raise Exception('invalid_deposit_index has unexpectedly allowed deposit')
-
-
-@to_tuple
-def deposit_cases():
- yield valid_deposit()
- yield valid_topup()
- yield invalid_deposit_index()
- yield invalid_deposit_proof()
-
-
-def mini_deposits_suite(configs_path: str) -> gen_typing.TestSuiteOutput:
- presets = loader.load_presets(configs_path, 'minimal')
- spec.apply_constants_preset(presets)
-
- return ("deposit_minimal", "deposits", gen_suite.render_suite(
- title="deposit operation",
- summary="Test suite for deposit type operation processing",
- forks_timeline="testing",
- forks=["phase0"],
- config="minimal",
- runner="operations",
- handler="deposits",
- test_cases=deposit_cases()))
-
-
-def full_deposits_suite(configs_path: str) -> gen_typing.TestSuiteOutput:
- presets = loader.load_presets(configs_path, 'mainnet')
- spec.apply_constants_preset(presets)
-
- return ("deposit_full", "deposits", gen_suite.render_suite(
- title="deposit operation",
- summary="Test suite for deposit type operation processing",
- forks_timeline="mainnet",
- forks=["phase0"],
- config="mainnet",
- runner="operations",
- handler="deposits",
- test_cases=deposit_cases()))
diff --git a/test_generators/operations/genesis.py b/test_generators/operations/genesis.py
deleted file mode 100644
index f4d63c10ec..0000000000
--- a/test_generators/operations/genesis.py
+++ /dev/null
@@ -1,44 +0,0 @@
-from eth2spec.phase0 import spec
-from eth2spec.utils.merkle_minimal import get_merkle_root, calc_merkle_tree_from_leaves, get_merkle_proof
-from typing import List
-
-
-def create_genesis_state(deposits: List[spec.Deposit]) -> spec.BeaconState:
- deposit_root = get_merkle_root((tuple([(dep.data.hash_tree_root()) for dep in deposits])))
-
- return spec.get_genesis_beacon_state(
- deposits,
- genesis_time=0,
- genesis_eth1_data=spec.Eth1Data(
- deposit_root=deposit_root,
- deposit_count=len(deposits),
- block_hash=spec.ZERO_HASH,
- ),
- )
-
-
-def create_deposits(pubkeys: List[spec.BLSPubkey], withdrawal_cred: List[spec.Bytes32]) -> List[spec.Deposit]:
-
- # Mock proof of possession
- proof_of_possession = b'\x33' * 96
-
- deposit_data = [
- spec.DepositData(
- pubkey=pubkeys[i],
- withdrawal_credentials=spec.BLS_WITHDRAWAL_PREFIX_BYTE + withdrawal_cred[i][1:],
- amount=spec.MAX_EFFECTIVE_BALANCE,
- proof_of_possession=proof_of_possession,
- ) for i in range(len(pubkeys))
- ]
-
- # Fill tree with existing deposits
- deposit_data_leaves = [data.hash_tree_root() for data in deposit_data]
- tree = calc_merkle_tree_from_leaves(tuple(deposit_data_leaves))
-
- return [
- spec.Deposit(
- proof=list(get_merkle_proof(tree, item_index=i)),
- index=i,
- data=deposit_data[i]
- ) for i in range(len(deposit_data))
- ]
diff --git a/test_generators/operations/keys.py b/test_generators/operations/keys.py
deleted file mode 100644
index db4f59e0e6..0000000000
--- a/test_generators/operations/keys.py
+++ /dev/null
@@ -1,7 +0,0 @@
-from py_ecc import bls
-from eth2spec.phase0.spec import hash
-
-privkeys = list(range(1, 101))
-pubkeys = [bls.privtopub(k) for k in privkeys]
-# Insecure, but easier to follow
-withdrawal_creds = [hash(bls.privtopub(k)) for k in privkeys]
diff --git a/test_generators/operations/main.py b/test_generators/operations/main.py
index 8b0a2a6d83..96c639d12d 100644
--- a/test_generators/operations/main.py
+++ b/test_generators/operations/main.py
@@ -1,9 +1,53 @@
-from gen_base import gen_runner
+from typing import Callable, Iterable
+
+from eth2spec.test.block_processing import (
+ test_process_attestation,
+ test_process_attester_slashing,
+ test_process_block_header,
+ test_process_deposit,
+ test_process_proposer_slashing,
+ test_process_transfer,
+ test_process_voluntary_exit
+)
+
+from gen_base import gen_runner, gen_suite, gen_typing
+from gen_from_tests.gen import generate_from_tests
+from preset_loader import loader
+from eth2spec.phase0 import spec
+
+
+def create_suite(operation_name: str, config_name: str, get_cases: Callable[[], Iterable[gen_typing.TestCase]]) \
+ -> Callable[[str], gen_typing.TestSuiteOutput]:
+ def suite_definition(configs_path: str) -> gen_typing.TestSuiteOutput:
+ presets = loader.load_presets(configs_path, config_name)
+ spec.apply_constants_preset(presets)
+
+ return ("%s_%s" % (operation_name, config_name), operation_name, gen_suite.render_suite(
+ title="%s operation" % operation_name,
+ summary="Test suite for %s type operation processing" % operation_name,
+ forks_timeline="testing",
+ forks=["phase0"],
+ config=config_name,
+ runner="operations",
+ handler=operation_name,
+ test_cases=get_cases()))
+ return suite_definition
-from deposits import mini_deposits_suite, full_deposits_suite
if __name__ == "__main__":
gen_runner.run_generator("operations", [
- mini_deposits_suite,
- full_deposits_suite
+ create_suite('attestation', 'minimal', lambda: generate_from_tests(test_process_attestation)),
+ create_suite('attestation', 'mainnet', lambda: generate_from_tests(test_process_attestation)),
+ create_suite('attester_slashing', 'minimal', lambda: generate_from_tests(test_process_attester_slashing)),
+ create_suite('attester_slashing', 'mainnet', lambda: generate_from_tests(test_process_attester_slashing)),
+ create_suite('block_header', 'minimal', lambda: generate_from_tests(test_process_block_header)),
+ create_suite('block_header', 'mainnet', lambda: generate_from_tests(test_process_block_header)),
+ create_suite('deposit', 'minimal', lambda: generate_from_tests(test_process_deposit)),
+ create_suite('deposit', 'mainnet', lambda: generate_from_tests(test_process_deposit)),
+ create_suite('proposer_slashing', 'minimal', lambda: generate_from_tests(test_process_proposer_slashing)),
+ create_suite('proposer_slashing', 'mainnet', lambda: generate_from_tests(test_process_proposer_slashing)),
+ create_suite('transfer', 'minimal', lambda: generate_from_tests(test_process_transfer)),
+ create_suite('transfer', 'mainnet', lambda: generate_from_tests(test_process_transfer)),
+ create_suite('voluntary_exit', 'minimal', lambda: generate_from_tests(test_process_voluntary_exit)),
+ create_suite('voluntary_exit', 'mainnet', lambda: generate_from_tests(test_process_voluntary_exit)),
])
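The `create_suite` factory in the hunk above relies on a closure capturing `operation_name` and `config_name`, so the same factory can stamp out one suite definition per (operation, config) pair. A minimal standalone sketch of that pattern (the names `make_suite` and `suite_definition` here are illustrative, not the spec's API):

```python
from typing import Callable


def make_suite(name: str, config: str) -> Callable[[], str]:
    # The returned closure captures name/config, so one factory can
    # produce many differently-parameterized suite definitions.
    def suite_definition() -> str:
        return "%s_%s" % (name, config)
    return suite_definition


suites = [make_suite('deposit', 'minimal'), make_suite('deposit', 'mainnet')]
print([s() for s in suites])  # ['deposit_minimal', 'deposit_mainnet']
```

This is why `main.py` can list fourteen one-line `create_suite(...)` calls instead of fourteen near-identical suite functions.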
diff --git a/test_generators/operations/requirements.txt b/test_generators/operations/requirements.txt
index dfe8535365..595cee69cd 100644
--- a/test_generators/operations/requirements.txt
+++ b/test_generators/operations/requirements.txt
@@ -1,5 +1,4 @@
-eth-utils==1.4.1
+eth-utils==1.6.0
../../test_libs/gen_helpers
../../test_libs/config_helpers
-../../test_libs/pyspec
-py_ecc
\ No newline at end of file
+../../test_libs/pyspec
\ No newline at end of file
diff --git a/test_generators/sanity/README.md b/test_generators/sanity/README.md
new file mode 100644
index 0000000000..6d2e2f30dd
--- /dev/null
+++ b/test_generators/sanity/README.md
@@ -0,0 +1,8 @@
+# Sanity tests
+
+Sanity tests cover regular state transitions in a common block-list format, to ensure the basics work.
+
+Information on the format of the tests can be found in the [sanity test formats documentation](../../specs/test_formats/sanity/README.md).
+
+
+
diff --git a/test_generators/sanity/main.py b/test_generators/sanity/main.py
new file mode 100644
index 0000000000..bba6ed03df
--- /dev/null
+++ b/test_generators/sanity/main.py
@@ -0,0 +1,35 @@
+from typing import Callable, Iterable
+
+from eth2spec.test.sanity import test_blocks, test_slots
+
+from gen_base import gen_runner, gen_suite, gen_typing
+from gen_from_tests.gen import generate_from_tests
+from preset_loader import loader
+from eth2spec.phase0 import spec
+
+
+def create_suite(handler_name: str, config_name: str, get_cases: Callable[[], Iterable[gen_typing.TestCase]]) \
+ -> Callable[[str], gen_typing.TestSuiteOutput]:
+ def suite_definition(configs_path: str) -> gen_typing.TestSuiteOutput:
+ presets = loader.load_presets(configs_path, config_name)
+ spec.apply_constants_preset(presets)
+
+ return ("sanity_%s_%s" % (handler_name, config_name), handler_name, gen_suite.render_suite(
+ title="sanity testing",
+ summary="Sanity test suite, %s type, generated from pytests" % handler_name,
+ forks_timeline="testing",
+ forks=["phase0"],
+ config=config_name,
+ runner="sanity",
+ handler=handler_name,
+ test_cases=get_cases()))
+ return suite_definition
+
+
+if __name__ == "__main__":
+ gen_runner.run_generator("sanity", [
+ create_suite('blocks', 'minimal', lambda: generate_from_tests(test_blocks)),
+ create_suite('blocks', 'mainnet', lambda: generate_from_tests(test_blocks)),
+ create_suite('slots', 'minimal', lambda: generate_from_tests(test_slots)),
+ create_suite('slots', 'mainnet', lambda: generate_from_tests(test_slots)),
+ ])
diff --git a/test_generators/sanity/requirements.txt b/test_generators/sanity/requirements.txt
new file mode 100644
index 0000000000..595cee69cd
--- /dev/null
+++ b/test_generators/sanity/requirements.txt
@@ -0,0 +1,4 @@
+eth-utils==1.6.0
+../../test_libs/gen_helpers
+../../test_libs/config_helpers
+../../test_libs/pyspec
\ No newline at end of file
diff --git a/test_generators/shuffling/main.py b/test_generators/shuffling/main.py
index 2c4faeb8fb..bb14520e12 100644
--- a/test_generators/shuffling/main.py
+++ b/test_generators/shuffling/main.py
@@ -10,7 +10,7 @@
def shuffling_case(seed: spec.Bytes32, count: int):
yield 'seed', '0x' + seed.hex()
yield 'count', count
- yield 'shuffled', [spec.get_permuted_index(i, count, seed) for i in range(count)]
+ yield 'shuffled', [spec.get_shuffled_index(i, count, seed) for i in range(count)]
@to_tuple
diff --git a/test_generators/shuffling/requirements.txt b/test_generators/shuffling/requirements.txt
index 8f9bede8f3..595cee69cd 100644
--- a/test_generators/shuffling/requirements.txt
+++ b/test_generators/shuffling/requirements.txt
@@ -1,4 +1,4 @@
-eth-utils==1.4.1
+eth-utils==1.6.0
../../test_libs/gen_helpers
../../test_libs/config_helpers
../../test_libs/pyspec
\ No newline at end of file
diff --git a/test_generators/ssz_generic/requirements.txt b/test_generators/ssz_generic/requirements.txt
index 94afc9d91b..dcdb0824ff 100644
--- a/test_generators/ssz_generic/requirements.txt
+++ b/test_generators/ssz_generic/requirements.txt
@@ -1,4 +1,4 @@
-eth-utils==1.4.1
+eth-utils==1.6.0
../../test_libs/gen_helpers
../../test_libs/config_helpers
ssz==0.1.0a2
diff --git a/test_generators/ssz_static/main.py b/test_generators/ssz_static/main.py
index 1234294db9..e8995b9185 100644
--- a/test_generators/ssz_static/main.py
+++ b/test_generators/ssz_static/main.py
@@ -18,10 +18,7 @@
@to_dict
-def create_test_case(rng: Random, name: str, mode: random_value.RandomizationMode, chaos: bool):
- typ = spec.get_ssz_type_by_name(name)
- value = random_value.get_random_ssz_object(rng, typ, MAX_BYTES_LENGTH, MAX_LIST_LENGTH, mode, chaos)
- yield "type_name", name
+def create_test_case_contents(value, typ):
yield "value", encode.encode(value, typ)
yield "serialized", '0x' + serialize(value).hex()
yield "root", '0x' + hash_tree_root(value).hex()
@@ -29,6 +26,13 @@ def create_test_case(rng: Random, name: str, mode: random_value.RandomizationMod
yield "signing_root", '0x' + signing_root(value).hex()
+@to_dict
+def create_test_case(rng: Random, name: str, mode: random_value.RandomizationMode, chaos: bool):
+ typ = spec.get_ssz_type_by_name(name)
+ value = random_value.get_random_ssz_object(rng, typ, MAX_BYTES_LENGTH, MAX_LIST_LENGTH, mode, chaos)
+ yield name, create_test_case_contents(value, typ)
+
+
@to_tuple
def ssz_static_cases(rng: Random, mode: random_value.RandomizationMode, chaos: bool, count: int):
for type_name in spec.ssz_types:
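The `@to_dict` and `@to_tuple` decorators used throughout these generators come from `eth_utils`; they collect the `(key, value)` pairs (or items) yielded by a generator function into a plain dict (or tuple). A minimal re-implementation of `to_dict`, for illustration only:

```python
import functools


def to_dict(fn):
    # Minimal stand-in for eth_utils.to_dict: drain the generator
    # returned by fn and collect its (key, value) pairs into a dict.
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        return dict(fn(*args, **kwargs))
    return wrapper


@to_dict
def example_case():
    yield "type_name", "uint64"
    yield "value", 42


print(example_case())  # {'type_name': 'uint64', 'value': 42}
```

This is what lets `create_test_case_contents` above read as a flat sequence of `yield key, value` lines while still producing a dict.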
diff --git a/test_generators/ssz_static/requirements.txt b/test_generators/ssz_static/requirements.txt
index 8f9bede8f3..595cee69cd 100644
--- a/test_generators/ssz_static/requirements.txt
+++ b/test_generators/ssz_static/requirements.txt
@@ -1,4 +1,4 @@
-eth-utils==1.4.1
+eth-utils==1.6.0
../../test_libs/gen_helpers
../../test_libs/config_helpers
../../test_libs/pyspec
\ No newline at end of file
diff --git a/test_libs/config_helpers/requirements.txt b/test_libs/config_helpers/requirements.txt
index e441a474b8..f2f208c3fb 100644
--- a/test_libs/config_helpers/requirements.txt
+++ b/test_libs/config_helpers/requirements.txt
@@ -1 +1 @@
-ruamel.yaml==0.15.87
+ruamel.yaml==0.15.96
diff --git a/test_libs/config_helpers/setup.py b/test_libs/config_helpers/setup.py
index 90ad94ee44..9f0ea06419 100644
--- a/test_libs/config_helpers/setup.py
+++ b/test_libs/config_helpers/setup.py
@@ -4,6 +4,6 @@
name='config_helpers',
packages=['preset_loader'],
install_requires=[
- "ruamel.yaml==0.15.87"
+ "ruamel.yaml==0.15.96"
]
)
diff --git a/test_libs/pyspec/tests/__init__.py b/test_libs/gen_helpers/gen_from_tests/__init__.py
similarity index 100%
rename from test_libs/pyspec/tests/__init__.py
rename to test_libs/gen_helpers/gen_from_tests/__init__.py
diff --git a/test_libs/gen_helpers/gen_from_tests/gen.py b/test_libs/gen_helpers/gen_from_tests/gen.py
new file mode 100644
index 0000000000..e7d8011310
--- /dev/null
+++ b/test_libs/gen_helpers/gen_from_tests/gen.py
@@ -0,0 +1,25 @@
+from inspect import getmembers, isfunction
+
+def generate_from_tests(src, bls_active=True):
+ """
+ Generate a list of test cases by running tests from the given src in generator-mode.
+ :param src: to retrieve tests from (discovered using inspect.getmembers)
+ :param bls_active: optional, to override BLS switch preference. Defaults to True.
+ :return: the list of test cases.
+ """
+ fn_names = [
+ name for (name, _) in getmembers(src, isfunction)
+ if name.startswith('test_')
+ ]
+ out = []
+ print("generating test vectors from tests source: %s" % src.__name__)
+ for name in fn_names:
+ tfn = getattr(src, name)
+ try:
+ test_case = tfn(generator_mode=True, bls_active=bls_active)
+ # If no test case data is returned, the test is ignored.
+ if test_case is not None:
+ out.append(test_case)
+ except AssertionError:
+ print("ERROR: failed to generate vector from test: %s (src: %s)" % (name, src.__name__))
+ return out
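The discovery step in `generate_from_tests` uses `inspect.getmembers` with the `test_` name prefix as the filter. That core filter can be exercised standalone; the `sample` module below is fabricated for the example:

```python
from inspect import getmembers, isfunction
import types

# Fabricated stand-in module with a mix of test and helper functions.
sample = types.ModuleType("sample")
sample.test_a = lambda: "a"
sample.test_b = lambda: "b"
sample.helper = lambda: "h"

# Same filter as generate_from_tests: function members named test_*.
fn_names = sorted(
    name for (name, _) in getmembers(sample, isfunction)
    if name.startswith('test_')
)
print(fn_names)  # ['test_a', 'test_b']
```

Because discovery is name-based, any helper function in a test module stays out of the generated vectors as long as it does not start with `test_`.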
diff --git a/test_libs/gen_helpers/requirements.txt b/test_libs/gen_helpers/requirements.txt
index 3d6a39458e..557cae6317 100644
--- a/test_libs/gen_helpers/requirements.txt
+++ b/test_libs/gen_helpers/requirements.txt
@@ -1,2 +1,2 @@
-ruamel.yaml==0.15.87
-eth-utils==1.4.1
+ruamel.yaml==0.15.96
+eth-utils==1.6.0
diff --git a/test_libs/gen_helpers/setup.py b/test_libs/gen_helpers/setup.py
index 5de27a6dbe..ee2c815c76 100644
--- a/test_libs/gen_helpers/setup.py
+++ b/test_libs/gen_helpers/setup.py
@@ -2,9 +2,9 @@
setup(
name='gen_helpers',
- packages=['gen_base'],
+ packages=['gen_base', 'gen_from_tests'],
install_requires=[
- "ruamel.yaml==0.15.87",
- "eth-utils==1.4.1"
+ "ruamel.yaml==0.15.96",
+ "eth-utils==1.6.0"
]
)
diff --git a/test_libs/pyspec/README.md b/test_libs/pyspec/README.md
index df18342100..330972e772 100644
--- a/test_libs/pyspec/README.md
+++ b/test_libs/pyspec/README.md
@@ -46,8 +46,9 @@ The `-B` flag may be helpful to force-overwrite the `pyspec` output after you ma
Run the tests:
```
-pytest --config=minimal
+pytest --config=minimal eth2spec
```
+Note the package name: it is used to locate the tests.
## Contributing
diff --git a/test_libs/pyspec/eth2spec/test/__init__.py b/test_libs/pyspec/eth2spec/test/__init__.py
new file mode 100644
index 0000000000..e69de29bb2
diff --git a/test_libs/pyspec/eth2spec/test/block_processing/__init__.py b/test_libs/pyspec/eth2spec/test/block_processing/__init__.py
new file mode 100644
index 0000000000..e69de29bb2
diff --git a/test_libs/pyspec/eth2spec/test/block_processing/test_process_attestation.py b/test_libs/pyspec/eth2spec/test/block_processing/test_process_attestation.py
new file mode 100644
index 0000000000..af6b39ef6e
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/test/block_processing/test_process_attestation.py
@@ -0,0 +1,255 @@
+from copy import deepcopy
+
+import eth2spec.phase0.spec as spec
+from eth2spec.phase0.spec import (
+ get_current_epoch,
+ process_attestation
+)
+from eth2spec.phase0.state_transition import (
+ state_transition_to,
+)
+from eth2spec.test.context import spec_state_test, expect_assertion_error, always_bls
+from eth2spec.test.helpers.attestations import (
+ get_valid_attestation,
+ sign_attestation,
+)
+from eth2spec.test.helpers.state import (
+ next_epoch,
+ next_slot,
+)
+from eth2spec.test.helpers.block import apply_empty_block
+
+
+def run_attestation_processing(state, attestation, valid=True):
+ """
+ Run ``process_attestation``, yielding:
+ - pre-state ('pre')
+ - attestation ('attestation')
+ - post-state ('post').
+ If ``valid == False``, run expecting ``AssertionError``
+ """
+ # yield pre-state
+ yield 'pre', state
+
+ yield 'attestation', attestation
+
+ # If the attestation is invalid, processing is aborted, and there is no post-state.
+ if not valid:
+ expect_assertion_error(lambda: process_attestation(state, attestation))
+ yield 'post', None
+ return
+
+ current_epoch_count = len(state.current_epoch_attestations)
+ previous_epoch_count = len(state.previous_epoch_attestations)
+
+ # process attestation
+ process_attestation(state, attestation)
+
+ # Make sure the attestation has been processed
+ if attestation.data.target_epoch == get_current_epoch(state):
+ assert len(state.current_epoch_attestations) == current_epoch_count + 1
+ else:
+ assert len(state.previous_epoch_attestations) == previous_epoch_count + 1
+
+ # yield post-state
+ yield 'post', state
+
+
+@spec_state_test
+def test_success(state):
+ attestation = get_valid_attestation(state, signed=True)
+ state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY
+
+ yield from run_attestation_processing(state, attestation)
+
+
+@spec_state_test
+def test_success_previous_epoch(state):
+ attestation = get_valid_attestation(state, signed=True)
+ next_epoch(state)
+ apply_empty_block(state)
+
+ yield from run_attestation_processing(state, attestation)
+
+
+@always_bls
+@spec_state_test
+def test_invalid_attestation_signature(state):
+ attestation = get_valid_attestation(state)
+ state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY
+
+ yield from run_attestation_processing(state, attestation, False)
+
+
+@spec_state_test
+def test_before_inclusion_delay(state):
+ attestation = get_valid_attestation(state, signed=True)
+ # do not increment slot to allow for inclusion delay
+
+ yield from run_attestation_processing(state, attestation, False)
+
+
+@spec_state_test
+def test_after_epoch_slots(state):
+ attestation = get_valid_attestation(state, signed=True)
+ # increment past latest inclusion slot
+ state_transition_to(state, state.slot + spec.SLOTS_PER_EPOCH + 1)
+ apply_empty_block(state)
+
+ yield from run_attestation_processing(state, attestation, False)
+
+
+@spec_state_test
+def test_old_source_epoch(state):
+ state.slot = spec.SLOTS_PER_EPOCH * 5
+ state.finalized_epoch = 2
+ state.previous_justified_epoch = 3
+ state.current_justified_epoch = 4
+ attestation = get_valid_attestation(state, slot=(spec.SLOTS_PER_EPOCH * 3) + 1)
+
+ # test logic sanity check: make sure the attestation is pointing to oldest known source epoch
+ assert attestation.data.source_epoch == state.previous_justified_epoch
+
+ # Now go beyond that, it will be invalid
+ attestation.data.source_epoch -= 1
+
+ sign_attestation(state, attestation)
+
+ yield from run_attestation_processing(state, attestation, False)
+
+
+@spec_state_test
+def test_wrong_shard(state):
+ attestation = get_valid_attestation(state)
+ state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY
+
+ attestation.data.shard += 1
+
+ sign_attestation(state, attestation)
+
+ yield from run_attestation_processing(state, attestation, False)
+
+
+@spec_state_test
+def test_new_source_epoch(state):
+ attestation = get_valid_attestation(state)
+ state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY
+
+ attestation.data.source_epoch += 1
+
+ sign_attestation(state, attestation)
+
+ yield from run_attestation_processing(state, attestation, False)
+
+
+@spec_state_test
+def test_source_root_is_target_root(state):
+ attestation = get_valid_attestation(state)
+ state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY
+
+ attestation.data.source_root = attestation.data.target_root
+
+ sign_attestation(state, attestation)
+
+ yield from run_attestation_processing(state, attestation, False)
+
+
+@spec_state_test
+def test_invalid_current_source_root(state):
+ state.slot = spec.SLOTS_PER_EPOCH * 5
+ state.finalized_epoch = 2
+
+ state.previous_justified_epoch = 3
+ state.previous_justified_root = b'\x01' * 32
+
+ state.current_justified_epoch = 4
+ state.current_justified_root = b'\xff' * 32
+
+ attestation = get_valid_attestation(state, slot=(spec.SLOTS_PER_EPOCH * 3) + 1)
+ state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY
+
+ # Test logic sanity checks:
+ assert state.current_justified_root != state.previous_justified_root
+ assert attestation.data.source_root == state.previous_justified_root
+
+ # Make attestation source root invalid: should be previous justified, not current one
+ attestation.data.source_root = state.current_justified_root
+
+ sign_attestation(state, attestation)
+
+ yield from run_attestation_processing(state, attestation, False)
+
+
+@spec_state_test
+def test_bad_source_root(state):
+ attestation = get_valid_attestation(state)
+ state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY
+
+ attestation.data.source_root = b'\x42' * 32
+
+ sign_attestation(state, attestation)
+
+ yield from run_attestation_processing(state, attestation, False)
+
+
+@spec_state_test
+def test_non_zero_crosslink_data_root(state):
+ attestation = get_valid_attestation(state)
+ state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY
+
+ attestation.data.crosslink_data_root = b'\x42' * 32
+
+ sign_attestation(state, attestation)
+
+ yield from run_attestation_processing(state, attestation, False)
+
+
+@spec_state_test
+def test_bad_previous_crosslink(state):
+ next_epoch(state)
+ apply_empty_block(state)
+
+ attestation = get_valid_attestation(state, signed=True)
+ for _ in range(spec.MIN_ATTESTATION_INCLUSION_DELAY):
+ next_slot(state)
+ apply_empty_block(state)
+
+ state.current_crosslinks[attestation.data.shard].epoch += 10
+
+ yield from run_attestation_processing(state, attestation, False)
+
+
+@spec_state_test
+def test_inconsistent_bitfields(state):
+ attestation = get_valid_attestation(state)
+ state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY
+
+ attestation.custody_bitfield = deepcopy(attestation.aggregation_bitfield) + b'\x00'
+
+ sign_attestation(state, attestation)
+
+ yield from run_attestation_processing(state, attestation, False)
+
+
+@spec_state_test
+def test_non_empty_custody_bitfield(state):
+ attestation = get_valid_attestation(state)
+ state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY
+
+ attestation.custody_bitfield = deepcopy(attestation.aggregation_bitfield)
+
+ sign_attestation(state, attestation)
+
+ yield from run_attestation_processing(state, attestation, False)
+
+
+@spec_state_test
+def test_empty_aggregation_bitfield(state):
+ attestation = get_valid_attestation(state)
+ state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY
+
+ attestation.aggregation_bitfield = b'\x00' * len(attestation.aggregation_bitfield)
+
+ sign_attestation(state, attestation)
+
+ yield from run_attestation_processing(state, attestation, False)
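The `run_*_processing` helpers above all follow one convention: yield the named inputs (`'pre'`, the operation), then either the post-state or `'post', None` when an `AssertionError` is expected. In generator mode these yields become the test vector; in pytest mode a decorator drains them. A toy function following the same convention (`run_toy_processing` is hypothetical, not part of the spec):

```python
def run_toy_processing(value, valid=True):
    # Mirrors the run_*_processing convention: yield named inputs,
    # then either a post-state or None when a failure is expected.
    yield 'pre', value
    if not valid:
        yield 'post', None
        return
    yield 'post', value + 1


print(dict(run_toy_processing(41)))               # {'pre': 41, 'post': 42}
print(dict(run_toy_processing(41, valid=False)))  # {'pre': 41, 'post': None}
```

The `'post', None` shape is what lets downstream consumers distinguish "operation rejected as expected" from a missing vector.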
diff --git a/test_libs/pyspec/eth2spec/test/block_processing/test_process_attester_slashing.py b/test_libs/pyspec/eth2spec/test/block_processing/test_process_attester_slashing.py
new file mode 100644
index 0000000000..28e2322772
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/test/block_processing/test_process_attester_slashing.py
@@ -0,0 +1,149 @@
+import eth2spec.phase0.spec as spec
+from eth2spec.phase0.spec import (
+ get_beacon_proposer_index,
+ process_attester_slashing,
+)
+from eth2spec.test.context import spec_state_test, expect_assertion_error, always_bls
+from eth2spec.test.helpers.attestations import sign_indexed_attestation
+from eth2spec.test.helpers.attester_slashings import get_valid_attester_slashing
+from eth2spec.test.helpers.block import apply_empty_block
+from eth2spec.test.helpers.state import (
+ get_balance,
+ next_epoch,
+)
+
+
+def run_attester_slashing_processing(state, attester_slashing, valid=True):
+ """
+ Run ``process_attester_slashing``, yielding:
+ - pre-state ('pre')
+ - attester_slashing ('attester_slashing')
+ - post-state ('post').
+ If ``valid == False``, run expecting ``AssertionError``
+ """
+
+ yield 'pre', state
+ yield 'attester_slashing', attester_slashing
+
+ if not valid:
+ expect_assertion_error(lambda: process_attester_slashing(state, attester_slashing))
+ yield 'post', None
+ return
+
+ slashed_index = attester_slashing.attestation_1.custody_bit_0_indices[0]
+ pre_slashed_balance = get_balance(state, slashed_index)
+
+ proposer_index = get_beacon_proposer_index(state)
+ pre_proposer_balance = get_balance(state, proposer_index)
+
+ # Process slashing
+ process_attester_slashing(state, attester_slashing)
+
+ slashed_validator = state.validator_registry[slashed_index]
+
+ # Check slashing
+ assert slashed_validator.slashed
+ assert slashed_validator.exit_epoch < spec.FAR_FUTURE_EPOCH
+ assert slashed_validator.withdrawable_epoch < spec.FAR_FUTURE_EPOCH
+
+ if slashed_index != proposer_index:
+ # lost whistleblower reward
+ assert get_balance(state, slashed_index) < pre_slashed_balance
+ # gained whistleblower reward
+ assert get_balance(state, proposer_index) > pre_proposer_balance
+ else:
+ # gained rewards for all slashings, which may include others, and only lost its own slashing penalty.
+ # Net change is at least 0; if more validators were slashed, the balance increases.
+ assert get_balance(state, slashed_index) >= pre_slashed_balance
+
+ yield 'post', state
+
+
+@spec_state_test
+def test_success_double(state):
+ attester_slashing = get_valid_attester_slashing(state, signed_1=True, signed_2=True)
+
+ yield from run_attester_slashing_processing(state, attester_slashing)
+
+
+@spec_state_test
+def test_success_surround(state):
+ next_epoch(state)
+ apply_empty_block(state)
+
+ state.current_justified_epoch += 1
+ attester_slashing = get_valid_attester_slashing(state, signed_1=False, signed_2=True)
+
+ # set attestation 1 to surround attestation 2
+ attester_slashing.attestation_1.data.source_epoch = attester_slashing.attestation_2.data.source_epoch - 1
+ attester_slashing.attestation_1.data.target_epoch = attester_slashing.attestation_2.data.target_epoch + 1
+
+ sign_indexed_attestation(state, attester_slashing.attestation_1)
+
+ yield from run_attester_slashing_processing(state, attester_slashing)
+
+
+@always_bls
+@spec_state_test
+def test_invalid_sig_1(state):
+ attester_slashing = get_valid_attester_slashing(state, signed_1=False, signed_2=True)
+ yield from run_attester_slashing_processing(state, attester_slashing, False)
+
+
+@always_bls
+@spec_state_test
+def test_invalid_sig_2(state):
+ attester_slashing = get_valid_attester_slashing(state, signed_1=True, signed_2=False)
+ yield from run_attester_slashing_processing(state, attester_slashing, False)
+
+
+@always_bls
+@spec_state_test
+def test_invalid_sig_1_and_2(state):
+ attester_slashing = get_valid_attester_slashing(state, signed_1=False, signed_2=False)
+ yield from run_attester_slashing_processing(state, attester_slashing, False)
+
+
+@spec_state_test
+def test_same_data(state):
+ attester_slashing = get_valid_attester_slashing(state, signed_1=False, signed_2=True)
+
+ attester_slashing.attestation_1.data = attester_slashing.attestation_2.data
+ sign_indexed_attestation(state, attester_slashing.attestation_1)
+
+ yield from run_attester_slashing_processing(state, attester_slashing, False)
+
+
+@spec_state_test
+def test_no_double_or_surround(state):
+ attester_slashing = get_valid_attester_slashing(state, signed_1=False, signed_2=True)
+
+ attester_slashing.attestation_1.data.target_epoch += 1
+ sign_indexed_attestation(state, attester_slashing.attestation_1)
+
+ yield from run_attester_slashing_processing(state, attester_slashing, False)
+
+
+@spec_state_test
+def test_participants_already_slashed(state):
+ attester_slashing = get_valid_attester_slashing(state, signed_1=True, signed_2=True)
+
+ # set all indices to slashed
+ attestation_1 = attester_slashing.attestation_1
+ validator_indices = attestation_1.custody_bit_0_indices + attestation_1.custody_bit_1_indices
+ for index in validator_indices:
+ state.validator_registry[index].slashed = True
+
+ yield from run_attester_slashing_processing(state, attester_slashing, False)
+
+
+@spec_state_test
+def test_custody_bit_0_and_1(state):
+ attester_slashing = get_valid_attester_slashing(state, signed_1=False, signed_2=True)
+
+ attester_slashing.attestation_1.custody_bit_1_indices = (
+ attester_slashing.attestation_1.custody_bit_0_indices
+ )
+ sign_indexed_attestation(state, attester_slashing.attestation_1)
+
+ yield from run_attester_slashing_processing(state, attester_slashing, False)
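The invalid-case branches above all funnel through `expect_assertion_error` from `eth2spec.test.context`. A sketch of what such a helper does, assuming only the behavior the call sites rely on (the exact implementation may differ): the wrapped call must raise `AssertionError`, and anything else, including silent success, is itself a failure.

```python
def expect_assertion_error(fn):
    # fn must raise AssertionError; success or any other outcome
    # means the "invalid" test case did not actually reject the input.
    try:
        fn()
    except AssertionError:
        return
    raise Exception("expected an AssertionError, got none")


def bad_op():
    assert False, "invalid operation"


expect_assertion_error(bad_op)  # passes silently
```

This inversion is what turns a spec-level `assert` failure into a passing negative test, while a spec that wrongly accepts invalid input surfaces as a loud generator error.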
diff --git a/test_libs/pyspec/eth2spec/test/block_processing/test_process_block_header.py b/test_libs/pyspec/eth2spec/test/block_processing/test_process_block_header.py
new file mode 100644
index 0000000000..454f557c5c
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/test/block_processing/test_process_block_header.py
@@ -0,0 +1,87 @@
+from copy import deepcopy
+
+from eth2spec.phase0.spec import (
+ get_beacon_proposer_index,
+ cache_state,
+ advance_slot,
+ process_block_header,
+)
+from eth2spec.test.context import spec_state_test, expect_assertion_error, always_bls
+from eth2spec.test.helpers.block import (
+ build_empty_block_for_next_slot,
+ sign_block
+)
+from eth2spec.test.helpers.state import next_slot
+
+
+def prepare_state_for_header_processing(state):
+ cache_state(state)
+ advance_slot(state)
+
+
+def run_block_header_processing(state, block, valid=True):
+ """
+ Run ``process_block_header``, yielding:
+ - pre-state ('pre')
+ - block ('block')
+ - post-state ('post').
+ If ``valid == False``, run expecting ``AssertionError``
+ """
+ prepare_state_for_header_processing(state)
+
+ yield 'pre', state
+ yield 'block', block
+
+ if not valid:
+ expect_assertion_error(lambda: process_block_header(state, block))
+ yield 'post', None
+ return
+
+ process_block_header(state, block)
+ yield 'post', state
+
+
+@spec_state_test
+def test_success_block_header(state):
+ block = build_empty_block_for_next_slot(state, signed=True)
+ yield from run_block_header_processing(state, block)
+
+
+@always_bls
+@spec_state_test
+def test_invalid_sig_block_header(state):
+ block = build_empty_block_for_next_slot(state)
+ yield from run_block_header_processing(state, block, valid=False)
+
+
+@spec_state_test
+def test_invalid_slot_block_header(state):
+ block = build_empty_block_for_next_slot(state)
+ block.slot = state.slot + 2 # invalid slot
+ sign_block(state, block)
+
+ yield from run_block_header_processing(state, block, valid=False)
+
+
+@spec_state_test
+def test_invalid_previous_block_root(state):
+ block = build_empty_block_for_next_slot(state)
+ block.previous_block_root = b'\x12' * 32 # invalid prev root
+ sign_block(state, block)
+
+ yield from run_block_header_processing(state, block, valid=False)
+
+
+@spec_state_test
+def test_proposer_slashed(state):
+ # use stub state to get proposer index of next slot
+ stub_state = deepcopy(state)
+ next_slot(stub_state)
+ proposer_index = get_beacon_proposer_index(stub_state)
+
+ # set proposer to slashed
+ state.validator_registry[proposer_index].slashed = True
+
+ block = build_empty_block_for_next_slot(state, signed=True)
+
+ yield from run_block_header_processing(state, block, valid=False)
diff --git a/test_libs/pyspec/eth2spec/test/block_processing/test_process_deposit.py b/test_libs/pyspec/eth2spec/test/block_processing/test_process_deposit.py
new file mode 100644
index 0000000000..336af3bf73
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/test/block_processing/test_process_deposit.py
@@ -0,0 +1,124 @@
+import eth2spec.phase0.spec as spec
+from eth2spec.phase0.spec import process_deposit
+from eth2spec.test.context import spec_state_test, expect_assertion_error, always_bls
+from eth2spec.test.helpers.deposits import prepare_state_and_deposit, sign_deposit_data
+from eth2spec.test.helpers.state import get_balance
+from eth2spec.test.helpers.keys import privkeys
+
+
+def run_deposit_processing(state, deposit, validator_index, valid=True, effective=True):
+ """
+ Run ``process_deposit``, yielding:
+ - pre-state ('pre')
+ - deposit ('deposit')
+ - post-state ('post').
+ If ``valid == False``, run expecting ``AssertionError``
+ """
+ pre_validator_count = len(state.validator_registry)
+ pre_balance = 0
+ if validator_index < pre_validator_count:
+ pre_balance = get_balance(state, validator_index)
+ else:
+ # if it is a new validator, it should be right at the end of the current registry.
+ assert validator_index == pre_validator_count
+
+ yield 'pre', state
+ yield 'deposit', deposit
+
+ if not valid:
+ expect_assertion_error(lambda: process_deposit(state, deposit))
+ yield 'post', None
+ return
+
+ process_deposit(state, deposit)
+
+ yield 'post', state
+
+ if not effective:
+ assert len(state.validator_registry) == pre_validator_count
+ assert len(state.balances) == pre_validator_count
+ if validator_index < pre_validator_count:
+ assert get_balance(state, validator_index) == pre_balance
+ else:
+ if validator_index < pre_validator_count:
+ # top-up
+ assert len(state.validator_registry) == pre_validator_count
+ assert len(state.balances) == pre_validator_count
+ else:
+ # new validator
+ assert len(state.validator_registry) == pre_validator_count + 1
+ assert len(state.balances) == pre_validator_count + 1
+ assert get_balance(state, validator_index) == pre_balance + deposit.data.amount
+
+ assert state.deposit_index == state.latest_eth1_data.deposit_count
+
+
+@spec_state_test
+def test_new_deposit(state):
+ # fresh deposit = next validator index = validator appended to registry
+ validator_index = len(state.validator_registry)
+ amount = spec.MAX_EFFECTIVE_BALANCE
+ deposit = prepare_state_and_deposit(state, validator_index, amount, signed=True)
+
+ yield from run_deposit_processing(state, deposit, validator_index)
+
+
+@always_bls
+@spec_state_test
+def test_invalid_sig_new_deposit(state):
+ # fresh deposit = next validator index = validator appended to registry
+ validator_index = len(state.validator_registry)
+ amount = spec.MAX_EFFECTIVE_BALANCE
+ deposit = prepare_state_and_deposit(state, validator_index, amount)
+ yield from run_deposit_processing(state, deposit, validator_index, valid=True, effective=False)
+
+
+@spec_state_test
+def test_success_top_up(state):
+ validator_index = 0
+ amount = spec.MAX_EFFECTIVE_BALANCE // 4
+ deposit = prepare_state_and_deposit(state, validator_index, amount, signed=True)
+
+ yield from run_deposit_processing(state, deposit, validator_index)
+
+
+@always_bls
+@spec_state_test
+def test_invalid_sig_top_up(state):
+ validator_index = 0
+ amount = spec.MAX_EFFECTIVE_BALANCE // 4
+ deposit = prepare_state_and_deposit(state, validator_index, amount)
+
+ # invalid signatures, in top-ups, are allowed!
+ yield from run_deposit_processing(state, deposit, validator_index, valid=True, effective=True)
+
+
+@spec_state_test
+def test_wrong_index(state):
+ validator_index = len(state.validator_registry)
+ amount = spec.MAX_EFFECTIVE_BALANCE
+ deposit = prepare_state_and_deposit(state, validator_index, amount)
+
+ # mess up deposit_index
+ deposit.index = state.deposit_index + 1
+
+ sign_deposit_data(state, deposit.data, privkeys[validator_index])
+
+ yield from run_deposit_processing(state, deposit, validator_index, valid=False)
+
+
+# TODO: test invalid signature
+
+
+@spec_state_test
+def test_bad_merkle_proof(state):
+ validator_index = len(state.validator_registry)
+ amount = spec.MAX_EFFECTIVE_BALANCE
+ deposit = prepare_state_and_deposit(state, validator_index, amount)
+
+ # mess up merkle branch
+ deposit.proof[-1] = spec.ZERO_HASH
+
+ sign_deposit_data(state, deposit.data, privkeys[validator_index])
+
+ yield from run_deposit_processing(state, deposit, validator_index, valid=False)
diff --git a/test_libs/pyspec/eth2spec/test/block_processing/test_process_proposer_slashing.py b/test_libs/pyspec/eth2spec/test/block_processing/test_process_proposer_slashing.py
new file mode 100644
index 0000000000..07ccc25f1c
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/test/block_processing/test_process_proposer_slashing.py
@@ -0,0 +1,137 @@
+import eth2spec.phase0.spec as spec
+from eth2spec.phase0.spec import (
+ get_current_epoch,
+ process_proposer_slashing,
+)
+from eth2spec.test.context import spec_state_test, expect_assertion_error, always_bls
+from eth2spec.test.helpers.block_header import sign_block_header
+from eth2spec.test.helpers.keys import privkeys
+from eth2spec.test.helpers.proposer_slashings import get_valid_proposer_slashing
+from eth2spec.test.helpers.state import get_balance
+
+
+def run_proposer_slashing_processing(state, proposer_slashing, valid=True):
+ """
+ Run ``process_proposer_slashing``, yielding:
+ - pre-state ('pre')
+ - proposer_slashing ('proposer_slashing')
+ - post-state ('post').
+ If ``valid == False``, run expecting ``AssertionError``
+ """
+
+ yield 'pre', state
+ yield 'proposer_slashing', proposer_slashing
+
+ if not valid:
+ expect_assertion_error(lambda: process_proposer_slashing(state, proposer_slashing))
+ yield 'post', None
+ return
+
+ pre_proposer_balance = get_balance(state, proposer_slashing.proposer_index)
+
+ process_proposer_slashing(state, proposer_slashing)
+ yield 'post', state
+
+ # check if slashed
+ slashed_validator = state.validator_registry[proposer_slashing.proposer_index]
+ assert slashed_validator.slashed
+ assert slashed_validator.exit_epoch < spec.FAR_FUTURE_EPOCH
+ assert slashed_validator.withdrawable_epoch < spec.FAR_FUTURE_EPOCH
+
+ # lost whistleblower reward
+ assert (
+ get_balance(state, proposer_slashing.proposer_index) <
+ pre_proposer_balance
+ )
+
+
+@spec_state_test
+def test_success(state):
+ proposer_slashing = get_valid_proposer_slashing(state, signed_1=True, signed_2=True)
+
+ yield from run_proposer_slashing_processing(state, proposer_slashing)
+
+
+@always_bls
+@spec_state_test
+def test_invalid_sig_1(state):
+ proposer_slashing = get_valid_proposer_slashing(state, signed_1=False, signed_2=True)
+ yield from run_proposer_slashing_processing(state, proposer_slashing, False)
+
+
+@always_bls
+@spec_state_test
+def test_invalid_sig_2(state):
+ proposer_slashing = get_valid_proposer_slashing(state, signed_1=True, signed_2=False)
+ yield from run_proposer_slashing_processing(state, proposer_slashing, False)
+
+
+@always_bls
+@spec_state_test
+def test_invalid_sig_1_and_2(state):
+ proposer_slashing = get_valid_proposer_slashing(state, signed_1=False, signed_2=False)
+ yield from run_proposer_slashing_processing(state, proposer_slashing, False)
+
+
+@spec_state_test
+def test_invalid_proposer_index(state):
+ proposer_slashing = get_valid_proposer_slashing(state, signed_1=True, signed_2=True)
+ # Index just too high (by 1)
+ proposer_slashing.proposer_index = len(state.validator_registry)
+
+ yield from run_proposer_slashing_processing(state, proposer_slashing, False)
+
+
+@spec_state_test
+def test_epochs_are_different(state):
+ proposer_slashing = get_valid_proposer_slashing(state, signed_1=True, signed_2=False)
+
+ # set slots to be in different epochs
+ proposer_slashing.header_2.slot += spec.SLOTS_PER_EPOCH
+ sign_block_header(state, proposer_slashing.header_2, privkeys[proposer_slashing.proposer_index])
+
+ yield from run_proposer_slashing_processing(state, proposer_slashing, False)
+
+
+@spec_state_test
+def test_headers_are_same(state):
+ proposer_slashing = get_valid_proposer_slashing(state, signed_1=True, signed_2=False)
+
+ # set headers to be the same
+ proposer_slashing.header_2 = proposer_slashing.header_1
+
+ yield from run_proposer_slashing_processing(state, proposer_slashing, False)
+
+
+@spec_state_test
+def test_proposer_is_not_activated(state):
+ proposer_slashing = get_valid_proposer_slashing(state, signed_1=True, signed_2=True)
+
+ # set proposer to not be active yet
+ state.validator_registry[proposer_slashing.proposer_index].activation_epoch = get_current_epoch(state) + 1
+
+ yield from run_proposer_slashing_processing(state, proposer_slashing, False)
+
+
+@spec_state_test
+def test_proposer_is_slashed(state):
+ proposer_slashing = get_valid_proposer_slashing(state, signed_1=True, signed_2=True)
+
+ # set proposer to slashed
+ state.validator_registry[proposer_slashing.proposer_index].slashed = True
+
+ yield from run_proposer_slashing_processing(state, proposer_slashing, False)
+
+
+@spec_state_test
+def test_proposer_is_withdrawn(state):
+ proposer_slashing = get_valid_proposer_slashing(state, signed_1=True, signed_2=True)
+
+ # move 1 epoch into future, to allow for past withdrawable epoch
+ state.slot += spec.SLOTS_PER_EPOCH
+ # set proposer withdrawable_epoch in past
+ current_epoch = get_current_epoch(state)
+ proposer_index = proposer_slashing.proposer_index
+ state.validator_registry[proposer_index].withdrawable_epoch = current_epoch - 1
+
+ yield from run_proposer_slashing_processing(state, proposer_slashing, False)
diff --git a/test_libs/pyspec/eth2spec/test/block_processing/test_process_transfer.py b/test_libs/pyspec/eth2spec/test/block_processing/test_process_transfer.py
new file mode 100644
index 0000000000..83af755743
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/test/block_processing/test_process_transfer.py
@@ -0,0 +1,172 @@
+import eth2spec.phase0.spec as spec
+from eth2spec.phase0.spec import (
+ get_active_validator_indices,
+ get_beacon_proposer_index,
+ get_current_epoch,
+ process_transfer,
+)
+from eth2spec.test.context import spec_state_test, expect_assertion_error, always_bls
+from eth2spec.test.helpers.state import next_epoch
+from eth2spec.test.helpers.block import apply_empty_block
+from eth2spec.test.helpers.transfers import get_valid_transfer
+
+
+def run_transfer_processing(state, transfer, valid=True):
+ """
+ Run ``process_transfer``, yielding:
+ - pre-state ('pre')
+ - transfer ('transfer')
+ - post-state ('post').
+ If ``valid == False``, run expecting ``AssertionError``
+ """
+
+ proposer_index = get_beacon_proposer_index(state)
+ pre_transfer_sender_balance = state.balances[transfer.sender]
+ pre_transfer_recipient_balance = state.balances[transfer.recipient]
+ pre_transfer_proposer_balance = state.balances[proposer_index]
+
+ yield 'pre', state
+ yield 'transfer', transfer
+
+ if not valid:
+ expect_assertion_error(lambda: process_transfer(state, transfer))
+ yield 'post', None
+ return
+
+ process_transfer(state, transfer)
+ yield 'post', state
+
+ sender_balance = state.balances[transfer.sender]
+ recipient_balance = state.balances[transfer.recipient]
+ assert sender_balance == pre_transfer_sender_balance - transfer.amount - transfer.fee
+ assert recipient_balance == pre_transfer_recipient_balance + transfer.amount
+ assert state.balances[proposer_index] == pre_transfer_proposer_balance + transfer.fee
+
+
+@spec_state_test
+def test_success_non_activated(state):
+ transfer = get_valid_transfer(state, signed=True)
+ # un-activate so validator can transfer
+ state.validator_registry[transfer.sender].activation_eligibility_epoch = spec.FAR_FUTURE_EPOCH
+
+ yield from run_transfer_processing(state, transfer)
+
+
+@spec_state_test
+def test_success_withdrawable(state):
+ next_epoch(state)
+ apply_empty_block(state)
+
+ transfer = get_valid_transfer(state, signed=True)
+
+ # withdrawable_epoch in past so can transfer
+ state.validator_registry[transfer.sender].withdrawable_epoch = get_current_epoch(state) - 1
+
+ yield from run_transfer_processing(state, transfer)
+
+
+@spec_state_test
+def test_success_active_above_max_effective(state):
+ sender_index = get_active_validator_indices(state, get_current_epoch(state))[-1]
+ state.balances[sender_index] = spec.MAX_EFFECTIVE_BALANCE + 1
+ transfer = get_valid_transfer(state, sender_index=sender_index, amount=1, fee=0, signed=True)
+
+ yield from run_transfer_processing(state, transfer)
+
+
+@spec_state_test
+def test_success_active_above_max_effective_fee(state):
+ sender_index = get_active_validator_indices(state, get_current_epoch(state))[-1]
+ state.balances[sender_index] = spec.MAX_EFFECTIVE_BALANCE + 1
+ transfer = get_valid_transfer(state, sender_index=sender_index, amount=0, fee=1, signed=True)
+
+ yield from run_transfer_processing(state, transfer)
+
+
+@always_bls
+@spec_state_test
+def test_invalid_signature(state):
+ transfer = get_valid_transfer(state)
+ # un-activate so validator can transfer
+ state.validator_registry[transfer.sender].activation_eligibility_epoch = spec.FAR_FUTURE_EPOCH
+
+ yield from run_transfer_processing(state, transfer, False)
+
+
+@spec_state_test
+def test_active_but_transfer_past_effective_balance(state):
+ sender_index = get_active_validator_indices(state, get_current_epoch(state))[-1]
+ amount = spec.MAX_EFFECTIVE_BALANCE // 32
+ state.balances[sender_index] = spec.MAX_EFFECTIVE_BALANCE
+ transfer = get_valid_transfer(state, sender_index=sender_index, amount=amount, fee=0, signed=True)
+
+ yield from run_transfer_processing(state, transfer, False)
+
+
+@spec_state_test
+def test_incorrect_slot(state):
+ transfer = get_valid_transfer(state, slot=state.slot + 1, signed=True)
+ # un-activate so validator can transfer
+ state.validator_registry[transfer.sender].activation_epoch = spec.FAR_FUTURE_EPOCH
+
+ yield from run_transfer_processing(state, transfer, False)
+
+
+@spec_state_test
+def test_insufficient_balance_for_fee(state):
+ sender_index = get_active_validator_indices(state, get_current_epoch(state))[-1]
+ state.balances[sender_index] = spec.MAX_EFFECTIVE_BALANCE
+ transfer = get_valid_transfer(state, sender_index=sender_index, amount=0, fee=1, signed=True)
+
+ # un-activate so validator can transfer
+ state.validator_registry[transfer.sender].activation_epoch = spec.FAR_FUTURE_EPOCH
+
+ yield from run_transfer_processing(state, transfer, False)
+
+
+@spec_state_test
+def test_insufficient_balance(state):
+ sender_index = get_active_validator_indices(state, get_current_epoch(state))[-1]
+ state.balances[sender_index] = spec.MAX_EFFECTIVE_BALANCE
+ transfer = get_valid_transfer(state, sender_index=sender_index, amount=1, fee=0, signed=True)
+
+ # un-activate so validator can transfer
+ state.validator_registry[transfer.sender].activation_epoch = spec.FAR_FUTURE_EPOCH
+
+ yield from run_transfer_processing(state, transfer, False)
+
+
+@spec_state_test
+def test_no_dust_sender(state):
+ sender_index = get_active_validator_indices(state, get_current_epoch(state))[-1]
+ balance = state.balances[sender_index]
+ transfer = get_valid_transfer(state, sender_index=sender_index, amount=balance - spec.MIN_DEPOSIT_AMOUNT + 1, fee=0, signed=True)
+
+ # un-activate so validator can transfer
+ state.validator_registry[transfer.sender].activation_epoch = spec.FAR_FUTURE_EPOCH
+
+ yield from run_transfer_processing(state, transfer, False)
+
+
+@spec_state_test
+def test_no_dust_recipient(state):
+ sender_index = get_active_validator_indices(state, get_current_epoch(state))[-1]
+ state.balances[sender_index] = spec.MAX_EFFECTIVE_BALANCE + 1
+ transfer = get_valid_transfer(state, sender_index=sender_index, amount=1, fee=0, signed=True)
+ state.balances[transfer.recipient] = 0
+
+ # un-activate so validator can transfer
+ state.validator_registry[transfer.sender].activation_epoch = spec.FAR_FUTURE_EPOCH
+
+ yield from run_transfer_processing(state, transfer, False)
+
+
+@spec_state_test
+def test_invalid_pubkey(state):
+ transfer = get_valid_transfer(state, signed=True)
+ state.validator_registry[transfer.sender].withdrawal_credentials = spec.ZERO_HASH
+
+ # un-activate so validator can transfer
+ state.validator_registry[transfer.sender].activation_epoch = spec.FAR_FUTURE_EPOCH
+
+ yield from run_transfer_processing(state, transfer, False)
diff --git a/test_libs/pyspec/eth2spec/test/block_processing/test_process_voluntary_exit.py b/test_libs/pyspec/eth2spec/test/block_processing/test_process_voluntary_exit.py
new file mode 100644
index 0000000000..53fb4e3f7c
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/test/block_processing/test_process_voluntary_exit.py
@@ -0,0 +1,225 @@
+import eth2spec.phase0.spec as spec
+from eth2spec.phase0.spec import (
+ get_active_validator_indices,
+ get_churn_limit,
+ get_current_epoch,
+ process_voluntary_exit,
+)
+from eth2spec.test.context import spec_state_test, expect_assertion_error, always_bls
+from eth2spec.test.helpers.keys import pubkey_to_privkey
+from eth2spec.test.helpers.voluntary_exits import build_voluntary_exit, sign_voluntary_exit
+
+
+def run_voluntary_exit_processing(state, voluntary_exit, valid=True):
+ """
+ Run ``process_voluntary_exit``, yielding:
+ - pre-state ('pre')
+ - voluntary_exit ('voluntary_exit')
+ - post-state ('post').
+ If ``valid == False``, run expecting ``AssertionError``
+ """
+ validator_index = voluntary_exit.validator_index
+
+ yield 'pre', state
+ yield 'voluntary_exit', voluntary_exit
+
+ if not valid:
+ expect_assertion_error(lambda: process_voluntary_exit(state, voluntary_exit))
+ yield 'post', None
+ return
+
+ pre_exit_epoch = state.validator_registry[validator_index].exit_epoch
+
+ process_voluntary_exit(state, voluntary_exit)
+
+ yield 'post', state
+
+ assert pre_exit_epoch == spec.FAR_FUTURE_EPOCH
+ assert state.validator_registry[validator_index].exit_epoch < spec.FAR_FUTURE_EPOCH
+
+
+@spec_state_test
+def test_success(state):
+ # move state forward PERSISTENT_COMMITTEE_PERIOD epochs to allow for exit
+ state.slot += spec.PERSISTENT_COMMITTEE_PERIOD * spec.SLOTS_PER_EPOCH
+
+ current_epoch = get_current_epoch(state)
+ validator_index = get_active_validator_indices(state, current_epoch)[0]
+ privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey]
+
+ voluntary_exit = build_voluntary_exit(state, current_epoch, validator_index, privkey, signed=True)
+
+ yield from run_voluntary_exit_processing(state, voluntary_exit)
+
+
+@always_bls
+@spec_state_test
+def test_invalid_signature(state):
+ # move state forward PERSISTENT_COMMITTEE_PERIOD epochs to allow for exit
+ state.slot += spec.PERSISTENT_COMMITTEE_PERIOD * spec.SLOTS_PER_EPOCH
+
+ current_epoch = get_current_epoch(state)
+ validator_index = get_active_validator_indices(state, current_epoch)[0]
+ privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey]
+
+ voluntary_exit = build_voluntary_exit(state, current_epoch, validator_index, privkey)
+
+ yield from run_voluntary_exit_processing(state, voluntary_exit, False)
+
+
+@spec_state_test
+def test_success_exit_queue(state):
+ # move state forward PERSISTENT_COMMITTEE_PERIOD epochs to allow for exit
+ state.slot += spec.PERSISTENT_COMMITTEE_PERIOD * spec.SLOTS_PER_EPOCH
+
+ current_epoch = get_current_epoch(state)
+
+ # exit the full churn limit (`get_churn_limit(state)`) of validators
+ initial_indices = get_active_validator_indices(state, current_epoch)[:get_churn_limit(state)]
+
+ # Prepare a bunch of exits, based on the current state
+ exit_queue = []
+ for index in initial_indices:
+ privkey = pubkey_to_privkey[state.validator_registry[index].pubkey]
+ exit_queue.append(build_voluntary_exit(
+ state,
+ current_epoch,
+ index,
+ privkey,
+ signed=True,
+ ))
+
+ # Now run all the exits
+ for voluntary_exit in exit_queue:
+ # the function yields data, but we are only interested in running it here; ignore the yields.
+ for _ in run_voluntary_exit_processing(state, voluntary_exit):
+ continue
+
+ # exit an additional validator
+ validator_index = get_active_validator_indices(state, current_epoch)[-1]
+ privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey]
+ voluntary_exit = build_voluntary_exit(
+ state,
+ current_epoch,
+ validator_index,
+ privkey,
+ signed=True,
+ )
+
+ # This is the interesting part of the test: on a pre-state with a full exit queue,
+ # when processing an additional exit, it results in an exit in a later epoch
+ yield from run_voluntary_exit_processing(state, voluntary_exit)
+
+ assert (
+ state.validator_registry[validator_index].exit_epoch ==
+ state.validator_registry[initial_indices[0]].exit_epoch + 1
+ )
+
+
+@spec_state_test
+def test_validator_exit_in_future(state):
+ # move state forward PERSISTENT_COMMITTEE_PERIOD epochs to allow for exit
+ state.slot += spec.PERSISTENT_COMMITTEE_PERIOD * spec.SLOTS_PER_EPOCH
+
+ current_epoch = get_current_epoch(state)
+ validator_index = get_active_validator_indices(state, current_epoch)[0]
+ privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey]
+
+ voluntary_exit = build_voluntary_exit(
+ state,
+ current_epoch,
+ validator_index,
+ privkey,
+ signed=False,
+ )
+ voluntary_exit.epoch += 1
+ sign_voluntary_exit(state, voluntary_exit, privkey)
+
+ yield from run_voluntary_exit_processing(state, voluntary_exit, False)
+
+
+@spec_state_test
+def test_validator_invalid_validator_index(state):
+ # move state forward PERSISTENT_COMMITTEE_PERIOD epochs to allow for exit
+ state.slot += spec.PERSISTENT_COMMITTEE_PERIOD * spec.SLOTS_PER_EPOCH
+
+ current_epoch = get_current_epoch(state)
+ validator_index = get_active_validator_indices(state, current_epoch)[0]
+ privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey]
+
+ voluntary_exit = build_voluntary_exit(
+ state,
+ current_epoch,
+ validator_index,
+ privkey,
+ signed=False,
+ )
+ voluntary_exit.validator_index = len(state.validator_registry)
+ sign_voluntary_exit(state, voluntary_exit, privkey)
+
+ yield from run_voluntary_exit_processing(state, voluntary_exit, False)
+
+
+@spec_state_test
+def test_validator_not_active(state):
+ current_epoch = get_current_epoch(state)
+ validator_index = get_active_validator_indices(state, current_epoch)[0]
+ privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey]
+
+ state.validator_registry[validator_index].activation_epoch = spec.FAR_FUTURE_EPOCH
+
+ # build and test voluntary exit
+ voluntary_exit = build_voluntary_exit(
+ state,
+ current_epoch,
+ validator_index,
+ privkey,
+ signed=True,
+ )
+
+ yield from run_voluntary_exit_processing(state, voluntary_exit, False)
+
+
+@spec_state_test
+def test_validator_already_exited(state):
+ # move state forward PERSISTENT_COMMITTEE_PERIOD epochs to allow the validator to exit
+ state.slot += spec.PERSISTENT_COMMITTEE_PERIOD * spec.SLOTS_PER_EPOCH
+
+ current_epoch = get_current_epoch(state)
+ validator_index = get_active_validator_indices(state, current_epoch)[0]
+ privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey]
+
+ # but the validator has already exited
+ state.validator_registry[validator_index].exit_epoch = current_epoch + 2
+
+ voluntary_exit = build_voluntary_exit(
+ state,
+ current_epoch,
+ validator_index,
+ privkey,
+ signed=True,
+ )
+
+ yield from run_voluntary_exit_processing(state, voluntary_exit, False)
+
+
+@spec_state_test
+def test_validator_not_active_long_enough(state):
+ current_epoch = get_current_epoch(state)
+ validator_index = get_active_validator_indices(state, current_epoch)[0]
+ privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey]
+
+ voluntary_exit = build_voluntary_exit(
+ state,
+ current_epoch,
+ validator_index,
+ privkey,
+ signed=True,
+ )
+
+ assert (
+ current_epoch - state.validator_registry[validator_index].activation_epoch <
+ spec.PERSISTENT_COMMITTEE_PERIOD
+ )
+
+ yield from run_voluntary_exit_processing(state, voluntary_exit, False)
diff --git a/test_libs/pyspec/eth2spec/test/conftest.py b/test_libs/pyspec/eth2spec/test/conftest.py
new file mode 100644
index 0000000000..dadb0d5d06
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/test/conftest.py
@@ -0,0 +1,36 @@
+from eth2spec.phase0 import spec
+
+# We import pytest only when it's present, i.e. when we are running tests.
+# The test-cases themselves can be generated without installing pytest.
+
+def module_exists(module_name):
+ try:
+ __import__(module_name)
+ except ImportError:
+ return False
+ else:
+ return True
+
+
+def fixture(*args, **kwargs):
+ if module_exists("pytest"):
+ import pytest
+ return pytest.fixture(*args, **kwargs)
+ else:
+ def ignore():
+ pass
+ return ignore
+
+
+def pytest_addoption(parser):
+ parser.addoption(
+ "--config", action="store", default="minimal", help="config: make the pyspec use the specified configuration"
+ )
+
+
+@fixture(autouse=True)
+def config(request):
+ config_name = request.config.getoption("--config")
+ from preset_loader import loader
+ presets = loader.load_presets('../../configs/', config_name)
+ spec.apply_constants_preset(presets)
diff --git a/test_libs/pyspec/eth2spec/test/context.py b/test_libs/pyspec/eth2spec/test/context.py
new file mode 100644
index 0000000000..2be9322de2
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/test/context.py
@@ -0,0 +1,82 @@
+from eth2spec.phase0 import spec
+from eth2spec.utils import bls
+
+from .helpers.genesis import create_genesis_state
+
+from .utils import spectest, with_args, with_tags
+
+# Provides a genesis state as the first argument to the function decorated with this
+with_state = with_args(lambda: [create_genesis_state(spec.SLOTS_PER_EPOCH * 8)])
+
+
+# BLS is turned off by default *for performance purposes during TESTING*.
+# The runner of the test can indicate the preferred setting (test generators prefer BLS to be ON).
+# - Some tests are marked as BLS-requiring, and ignore this setting.
+# (tests that express differences caused by BLS, e.g. invalid signatures being rejected)
+# - Some other tests are marked as BLS-ignoring, and ignore this setting.
+# (tests that are heavily performance impacted / require unsigned state transitions)
+# - Most tests respect the BLS setting.
+DEFAULT_BLS_ACTIVE = False
+
+
+# shorthand for decorating with @with_state, @bls_switch and @spectest()
+def spec_state_test(fn):
+ return with_state(bls_switch(spectest()(fn)))
+
+
+def expect_assertion_error(fn):
+ bad = False
+ try:
+ fn()
+ bad = True
+ except AssertionError:
+ pass
+ except IndexError:
+ # Index errors are special: the spec is not explicit about bounds checking, so an IndexError is treated like a failed assert.
+ pass
+ if bad:
+ raise AssertionError('expected an assertion error, but got none.')
+
+
+# Tags a test as requiring BLS to be disabled for it to pass.
+bls_ignored = with_tags({'bls_setting': 2})
+
+
+def never_bls(fn):
+ """
+ Decorator to apply on top of the ``bls_switch`` decorator to force BLS de-activation. Useful to mark tests as BLS-ignoring.
+ """
+ def entry(*args, **kw):
+ # override bls setting
+ kw['bls_active'] = False
+ return fn(*args, **kw)
+ return bls_ignored(entry)
+
+
+# Tags a test as requiring BLS to be enabled for it to pass.
+bls_required = with_tags({'bls_setting': 1})
+
+
+def always_bls(fn):
+ """
+ Decorator to apply on top of the ``bls_switch`` decorator to force BLS activation. Useful to mark tests as BLS-dependent.
+ """
+ def entry(*args, **kw):
+ # override bls setting
+ kw['bls_active'] = True
+ return fn(*args, **kw)
+ return bls_required(entry)
+
+
+def bls_switch(fn):
+ """
+ Decorator to make a function execute with BLS ON, or BLS off.
+ Based on an optional bool argument ``bls_active``, passed to the function at runtime.
+ """
+ def entry(*args, **kw):
+ old_state = bls.bls_active
+ bls.bls_active = kw.pop('bls_active', DEFAULT_BLS_ACTIVE)
+ out = fn(*args, **kw)
+ bls.bls_active = old_state
+ return out
+ return entry
diff --git a/test_libs/pyspec/eth2spec/test/epoch_processing/__init__.py b/test_libs/pyspec/eth2spec/test/epoch_processing/__init__.py
new file mode 100644
index 0000000000..e69de29bb2
diff --git a/test_libs/pyspec/tests/epoch_processing/test_process_crosslinks.py b/test_libs/pyspec/eth2spec/test/epoch_processing/test_process_crosslinks.py
similarity index 62%
rename from test_libs/pyspec/tests/epoch_processing/test_process_crosslinks.py
rename to test_libs/pyspec/eth2spec/test/epoch_processing/test_process_crosslinks.py
index d6765e3a72..cfbcd18834 100644
--- a/test_libs/pyspec/tests/epoch_processing/test_process_crosslinks.py
+++ b/test_libs/pyspec/eth2spec/test/epoch_processing/test_process_crosslinks.py
@@ -1,116 +1,131 @@
from copy import deepcopy
-import pytest
import eth2spec.phase0.spec as spec
-
-from eth2spec.phase0.state_transition import (
- state_transition,
-)
from eth2spec.phase0.spec import (
cache_state,
get_crosslink_deltas,
process_crosslinks,
)
-from tests.helpers import (
+from eth2spec.phase0.state_transition import (
+ state_transition,
+)
+from eth2spec.test.context import spec_state_test
+from eth2spec.test.helpers.state import (
+ next_epoch,
+ next_slot
+)
+from eth2spec.test.helpers.block import apply_empty_block, sign_block
+from eth2spec.test.helpers.attestations import (
add_attestation_to_state,
build_empty_block_for_next_slot,
fill_aggregate_attestation,
get_crosslink_committee,
get_valid_attestation,
- next_epoch,
- next_slot,
- set_bitfield_bit,
+ sign_attestation,
)
-# mark entire file as 'crosslinks'
-pytestmark = pytest.mark.crosslinks
-
-
def run_process_crosslinks(state, valid=True):
+ """
+ Run ``process_crosslinks``, yielding:
+ - pre-state ('pre')
+ - post-state ('post').
+ If ``valid == False``, run expecting ``AssertionError``
+ """
# transition state to slot before state transition
slot = state.slot + (spec.SLOTS_PER_EPOCH - state.slot % spec.SLOTS_PER_EPOCH) - 1
block = build_empty_block_for_next_slot(state)
block.slot = slot
+ sign_block(state, block)
state_transition(state, block)
# cache state before epoch transition
cache_state(state)
- post_state = deepcopy(state)
- process_crosslinks(post_state)
-
- return state, post_state
+ yield 'pre', state
+ process_crosslinks(state)
+ yield 'post', state
+@spec_state_test
def test_no_attestations(state):
- pre_state, post_state = run_process_crosslinks(state)
+ yield from run_process_crosslinks(state)
for shard in range(spec.SHARD_COUNT):
- assert post_state.previous_crosslinks[shard] == post_state.current_crosslinks[shard]
-
- return pre_state, post_state
+ assert state.previous_crosslinks[shard] == state.current_crosslinks[shard]
+@spec_state_test
def test_single_crosslink_update_from_current_epoch(state):
next_epoch(state)
- attestation = get_valid_attestation(state)
+ attestation = get_valid_attestation(state, signed=True)
fill_aggregate_attestation(state, attestation)
add_attestation_to_state(state, attestation, state.slot + spec.MIN_ATTESTATION_INCLUSION_DELAY)
assert len(state.current_epoch_attestations) == 1
- pre_state, post_state = run_process_crosslinks(state)
-
shard = attestation.data.shard
- assert post_state.previous_crosslinks[shard] != post_state.current_crosslinks[shard]
- assert pre_state.current_crosslinks[shard] != post_state.current_crosslinks[shard]
+ pre_crosslink = deepcopy(state.current_crosslinks[shard])
- return pre_state, post_state
+ yield from run_process_crosslinks(state)
+ assert state.previous_crosslinks[shard] != state.current_crosslinks[shard]
+ assert pre_crosslink != state.current_crosslinks[shard]
+
+@spec_state_test
def test_single_crosslink_update_from_previous_epoch(state):
next_epoch(state)
- attestation = get_valid_attestation(state)
+ attestation = get_valid_attestation(state, signed=True)
fill_aggregate_attestation(state, attestation)
add_attestation_to_state(state, attestation, state.slot + spec.SLOTS_PER_EPOCH)
assert len(state.previous_epoch_attestations) == 1
- pre_state, post_state = run_process_crosslinks(state)
+ shard = attestation.data.shard
+ pre_crosslink = deepcopy(state.current_crosslinks[shard])
+
crosslink_deltas = get_crosslink_deltas(state)
- shard = attestation.data.shard
- assert post_state.previous_crosslinks[shard] != post_state.current_crosslinks[shard]
- assert pre_state.current_crosslinks[shard] != post_state.current_crosslinks[shard]
+ yield from run_process_crosslinks(state)
+
+ assert state.previous_crosslinks[shard] != state.current_crosslinks[shard]
+ assert pre_crosslink != state.current_crosslinks[shard]
+
# ensure rewarded
for index in get_crosslink_committee(state, attestation.data.target_epoch, attestation.data.shard):
assert crosslink_deltas[0][index] > 0
assert crosslink_deltas[1][index] == 0
- return pre_state, post_state
-
+@spec_state_test
def test_double_late_crosslink(state):
+ if spec.get_epoch_committee_count(state, spec.get_current_epoch(state)) < spec.SHARD_COUNT:
+ print("warning: ignoring test, test-assumptions are incompatible with configuration")
+ return
+
next_epoch(state)
state.slot += 4
- attestation_1 = get_valid_attestation(state)
+ attestation_1 = get_valid_attestation(state, signed=True)
fill_aggregate_attestation(state, attestation_1)
- # add attestation_1 in the next epoch
+ # add attestation_1 to next epoch
next_epoch(state)
add_attestation_to_state(state, attestation_1, state.slot + 1)
for slot in range(spec.SLOTS_PER_EPOCH):
attestation_2 = get_valid_attestation(state)
if attestation_2.data.shard == attestation_1.data.shard:
+ sign_attestation(state, attestation_2)
break
next_slot(state)
+ apply_empty_block(state)
+
fill_aggregate_attestation(state, attestation_2)
# add attestation_2 in the next epoch after attestation_1 has
@@ -121,16 +136,15 @@ def test_double_late_crosslink(state):
assert len(state.previous_epoch_attestations) == 1
assert len(state.current_epoch_attestations) == 0
- pre_state, post_state = run_process_crosslinks(state)
crosslink_deltas = get_crosslink_deltas(state)
+ yield from run_process_crosslinks(state)
+
shard = attestation_2.data.shard
# ensure that the current crosslinks were not updated by the second attestation
- assert post_state.previous_crosslinks[shard] == post_state.current_crosslinks[shard]
+ assert state.previous_crosslinks[shard] == state.current_crosslinks[shard]
# ensure no reward, only penalties for the failed crosslink
for index in get_crosslink_committee(state, attestation_2.data.target_epoch, attestation_2.data.shard):
assert crosslink_deltas[0][index] == 0
assert crosslink_deltas[1][index] > 0
-
- return pre_state, post_state
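The hunks above replace `return pre_state, post_state` with `yield 'pre', state` / `yield 'post', state` pairs consumed via `yield from`. A minimal, self-contained sketch of that generator-based pattern (the toy dict state, `bump_epoch`, and runner below are illustrative, not part of the diff):

```python
def run_epoch_sub_transition(state, sub_transition):
    # Yield a snapshot of the pre-state, mutate in place, yield the post-state,
    # mirroring the 'pre'/'post' keys used by run_process_crosslinks above.
    yield 'pre', dict(state)
    sub_transition(state)
    yield 'post', dict(state)


def collect_test_vector(test_gen):
    # A runner consumes the generator and records each named artifact,
    # instead of the old style of returning (pre_state, post_state) tuples.
    return {name: value for name, value in test_gen}


def bump_epoch(state):
    state['epoch'] += 1


vector = collect_test_vector(run_epoch_sub_transition({'epoch': 3}, bump_epoch))
# vector['pre'] is the snapshot before the transition, vector['post'] after it.
```

This is why the new tests assert against `state` directly after `yield from`: the generator hands named snapshots to the vector-writing runner while the test keeps mutating the same object.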
diff --git a/test_libs/pyspec/tests/epoch_processing/test_process_registry_updates.py b/test_libs/pyspec/eth2spec/test/epoch_processing/test_process_registry_updates.py
similarity index 53%
rename from test_libs/pyspec/tests/epoch_processing/test_process_registry_updates.py
rename to test_libs/pyspec/eth2spec/test/epoch_processing/test_process_registry_updates.py
index 11f5de2ad4..71bf89c702 100644
--- a/test_libs/pyspec/tests/epoch_processing/test_process_registry_updates.py
+++ b/test_libs/pyspec/eth2spec/test/epoch_processing/test_process_registry_updates.py
@@ -1,21 +1,44 @@
-from copy import deepcopy
-
-import pytest
-
import eth2spec.phase0.spec as spec
from eth2spec.phase0.spec import (
get_current_epoch,
is_active_validator,
+ process_registry_updates
)
-from tests.helpers import (
- next_epoch,
-)
-
-# mark entire file as 'state'
-pytestmark = pytest.mark.state
-
-
+from eth2spec.phase0.state_transition import state_transition
+from eth2spec.test.helpers.block import build_empty_block_for_next_slot, sign_block
+from eth2spec.test.helpers.state import next_epoch
+from eth2spec.test.context import spec_state_test
+
+
+def run_process_registry_updates(state, valid=True):
+ """
+ Run ``process_registry_updates``, yielding:
+ - pre-state ('pre')
+ - post-state ('post').
+ If ``valid == False``, run expecting ``AssertionError``
+ """
+ # transition state to slot before state transition
+ slot = state.slot + (spec.SLOTS_PER_EPOCH - state.slot % spec.SLOTS_PER_EPOCH) - 1
+ block = build_empty_block_for_next_slot(state)
+ block.slot = slot
+ sign_block(state, block)
+ state_transition(state, block)
+
+ # cache state before epoch transition
+ spec.cache_state(state)
+
+ # process components of epoch transition before registry update
+ spec.process_justification_and_finalization(state)
+ spec.process_crosslinks(state)
+ spec.process_rewards_and_penalties(state)
+
+ yield 'pre', state
+ process_registry_updates(state)
+ yield 'post', state
+
+
+@spec_state_test
def test_activation(state):
index = 0
assert is_active_validator(state.validator_registry[index], get_current_epoch(state))
@@ -26,12 +49,10 @@ def test_activation(state):
state.validator_registry[index].effective_balance = spec.MAX_EFFECTIVE_BALANCE
assert not is_active_validator(state.validator_registry[index], get_current_epoch(state))
- pre_state = deepcopy(state)
-
- blocks = []
for _ in range(spec.ACTIVATION_EXIT_DELAY + 1):
- block = next_epoch(state)
- blocks.append(block)
+ next_epoch(state)
+
+ yield from run_process_registry_updates(state)
assert state.validator_registry[index].activation_eligibility_epoch != spec.FAR_FUTURE_EPOCH
assert state.validator_registry[index].activation_epoch != spec.FAR_FUTURE_EPOCH
@@ -40,9 +61,8 @@ def test_activation(state):
get_current_epoch(state),
)
- return pre_state, blocks, state
-
+@spec_state_test
def test_ejection(state):
index = 0
assert is_active_validator(state.validator_registry[index], get_current_epoch(state))
@@ -51,17 +71,13 @@ def test_ejection(state):
# Mock an ejection
state.validator_registry[index].effective_balance = spec.EJECTION_BALANCE
- pre_state = deepcopy(state)
-
- blocks = []
for _ in range(spec.ACTIVATION_EXIT_DELAY + 1):
- block = next_epoch(state)
- blocks.append(block)
+ next_epoch(state)
+
+ yield from run_process_registry_updates(state)
assert state.validator_registry[index].exit_epoch != spec.FAR_FUTURE_EPOCH
assert not is_active_validator(
state.validator_registry[index],
get_current_epoch(state),
)
-
- return pre_state, blocks, state
diff --git a/test_libs/pyspec/eth2spec/test/helpers/__init__.py b/test_libs/pyspec/eth2spec/test/helpers/__init__.py
new file mode 100644
index 0000000000..e69de29bb2
diff --git a/test_libs/pyspec/eth2spec/test/helpers/attestations.py b/test_libs/pyspec/eth2spec/test/helpers/attestations.py
new file mode 100644
index 0000000000..b541e610f4
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/test/helpers/attestations.py
@@ -0,0 +1,146 @@
+from typing import List
+
+# Access constants from spec pkg reference.
+import eth2spec.phase0.spec as spec
+from eth2spec.phase0.spec import (
+ Attestation,
+ AttestationData,
+ AttestationDataAndCustodyBit,
+ get_epoch_start_slot, get_block_root, get_current_epoch, get_previous_epoch, slot_to_epoch,
+ get_crosslink_committee, get_domain, IndexedAttestation, get_attesting_indices, BeaconState, get_block_root_at_slot,
+ get_epoch_start_shard, get_epoch_committee_count)
+from eth2spec.phase0.state_transition import (
+ state_transition, state_transition_to
+)
+from eth2spec.test.helpers.bitfields import set_bitfield_bit
+from eth2spec.test.helpers.block import build_empty_block_for_next_slot, sign_block
+from eth2spec.test.helpers.keys import privkeys
+from eth2spec.utils.bls import bls_sign, bls_aggregate_signatures
+from eth2spec.utils.minimal_ssz import hash_tree_root
+
+
+def build_attestation_data(state, slot, shard):
+ assert state.slot >= slot
+
+ if slot == state.slot:
+ block_root = build_empty_block_for_next_slot(state).previous_block_root
+ else:
+ block_root = get_block_root_at_slot(state, slot)
+
+ current_epoch_start_slot = get_epoch_start_slot(get_current_epoch(state))
+ if slot < current_epoch_start_slot:
+ epoch_boundary_root = get_block_root(state, get_previous_epoch(state))
+ elif slot == current_epoch_start_slot:
+ epoch_boundary_root = block_root
+ else:
+ epoch_boundary_root = get_block_root(state, get_current_epoch(state))
+
+ if slot < current_epoch_start_slot:
+ justified_epoch = state.previous_justified_epoch
+ justified_block_root = state.previous_justified_root
+ else:
+ justified_epoch = state.current_justified_epoch
+ justified_block_root = state.current_justified_root
+
+ crosslinks = state.current_crosslinks if slot_to_epoch(slot) == get_current_epoch(
+ state) else state.previous_crosslinks
+ return AttestationData(
+ shard=shard,
+ beacon_block_root=block_root,
+ source_epoch=justified_epoch,
+ source_root=justified_block_root,
+ target_epoch=slot_to_epoch(slot),
+ target_root=epoch_boundary_root,
+ crosslink_data_root=spec.ZERO_HASH,
+ previous_crosslink_root=hash_tree_root(crosslinks[shard]),
+ )
+
+
+def get_valid_attestation(state, slot=None, signed=False):
+ if slot is None:
+ slot = state.slot
+
+ epoch = slot_to_epoch(slot)
+ epoch_start_shard = get_epoch_start_shard(state, epoch)
+ committees_per_slot = get_epoch_committee_count(state, epoch) // spec.SLOTS_PER_EPOCH
+ shard = (epoch_start_shard + committees_per_slot * (slot % spec.SLOTS_PER_EPOCH)) % spec.SHARD_COUNT
+
+ attestation_data = build_attestation_data(state, slot, shard)
+
+ crosslink_committee = get_crosslink_committee(state, attestation_data.target_epoch, attestation_data.shard)
+
+ committee_size = len(crosslink_committee)
+ bitfield_length = (committee_size + 7) // 8
+ aggregation_bitfield = b'\x00' * bitfield_length
+ custody_bitfield = b'\x00' * bitfield_length
+ attestation = Attestation(
+ aggregation_bitfield=aggregation_bitfield,
+ data=attestation_data,
+ custody_bitfield=custody_bitfield,
+ )
+ fill_aggregate_attestation(state, attestation)
+ if signed:
+ sign_attestation(state, attestation)
+ return attestation
+
+
+def sign_aggregate_attestation(state: BeaconState, data: AttestationData, participants: List[int]):
+ signatures = []
+ for validator_index in participants:
+ privkey = privkeys[validator_index]
+ signatures.append(
+ get_attestation_signature(
+ state,
+ data,
+ privkey
+ )
+ )
+
+ return bls_aggregate_signatures(signatures)
+
+
+def sign_indexed_attestation(state, indexed_attestation: IndexedAttestation):
+ participants = indexed_attestation.custody_bit_0_indices + indexed_attestation.custody_bit_1_indices
+ indexed_attestation.signature = sign_aggregate_attestation(state, indexed_attestation.data, participants)
+
+
+def sign_attestation(state, attestation: Attestation):
+ participants = get_attesting_indices(
+ state,
+ attestation.data,
+ attestation.aggregation_bitfield,
+ )
+
+ attestation.signature = sign_aggregate_attestation(state, attestation.data, participants)
+
+
+def get_attestation_signature(state, attestation_data, privkey, custody_bit=0b0):
+ message_hash = AttestationDataAndCustodyBit(
+ data=attestation_data,
+ custody_bit=custody_bit,
+ ).hash_tree_root()
+
+ return bls_sign(
+ message_hash=message_hash,
+ privkey=privkey,
+ domain=get_domain(
+ state=state,
+ domain_type=spec.DOMAIN_ATTESTATION,
+ message_epoch=attestation_data.target_epoch,
+ )
+ )
+
+
+def fill_aggregate_attestation(state, attestation):
+ crosslink_committee = get_crosslink_committee(state, attestation.data.target_epoch, attestation.data.shard)
+ for i in range(len(crosslink_committee)):
+ attestation.aggregation_bitfield = set_bitfield_bit(attestation.aggregation_bitfield, i)
+
+
+def add_attestation_to_state(state, attestation, slot):
+ block = build_empty_block_for_next_slot(state)
+ block.slot = slot
+ block.body.attestations.append(attestation)
+ state_transition_to(state, block.slot)
+ sign_block(state, block)
+ state_transition(state, block)
diff --git a/test_libs/pyspec/eth2spec/test/helpers/attester_slashings.py b/test_libs/pyspec/eth2spec/test/helpers/attester_slashings.py
new file mode 100644
index 0000000000..d19b41dfec
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/test/helpers/attester_slashings.py
@@ -0,0 +1,19 @@
+from copy import deepcopy
+
+from eth2spec.phase0.spec import AttesterSlashing, convert_to_indexed
+from eth2spec.test.helpers.attestations import get_valid_attestation, sign_attestation
+
+
+def get_valid_attester_slashing(state, signed_1=False, signed_2=False):
+ attestation_1 = get_valid_attestation(state, signed=signed_1)
+
+ attestation_2 = deepcopy(attestation_1)
+ attestation_2.data.target_root = b'\x01' * 32
+
+ if signed_2:
+ sign_attestation(state, attestation_2)
+
+ return AttesterSlashing(
+ attestation_1=convert_to_indexed(state, attestation_1),
+ attestation_2=convert_to_indexed(state, attestation_2),
+ )
diff --git a/test_libs/pyspec/eth2spec/test/helpers/bitfields.py b/test_libs/pyspec/eth2spec/test/helpers/bitfields.py
new file mode 100644
index 0000000000..7c25d073ab
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/test/helpers/bitfields.py
@@ -0,0 +1,11 @@
+def set_bitfield_bit(bitfield, i):
+ """
+ Set the bit in ``bitfield`` at position ``i`` to ``1``.
+ """
+ byte_index = i // 8
+ bit_index = i % 8
+ return (
+ bitfield[:byte_index] +
+ bytes([bitfield[byte_index] | (1 << bit_index)]) +
+ bitfield[byte_index + 1:]
+ )
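The new `set_bitfield_bit` helper is small enough to exercise standalone. The function body below is copied verbatim from the hunk above; the two usage lines are illustrative:

```python
def set_bitfield_bit(bitfield, i):
    """
    Set the bit in ``bitfield`` at position ``i`` to ``1``.
    Bit order is little-endian within each byte.
    """
    byte_index = i // 8
    bit_index = i % 8
    return (
        bitfield[:byte_index] +
        bytes([bitfield[byte_index] | (1 << bit_index)]) +
        bitfield[byte_index + 1:]
    )


# Setting bit 0 and bit 9 in a two-byte bitfield:
bf = b'\x00\x00'
bf = set_bitfield_bit(bf, 0)   # first byte becomes 0b00000001
bf = set_bitfield_bit(bf, 9)   # second byte becomes 0b00000010
```

`fill_aggregate_attestation` in `attestations.py` applies this bit-by-bit over the whole crosslink committee to mark every member as participating.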
diff --git a/test_libs/pyspec/eth2spec/test/helpers/block.py b/test_libs/pyspec/eth2spec/test/helpers/block.py
new file mode 100644
index 0000000000..81c5e9ef5b
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/test/helpers/block.py
@@ -0,0 +1,77 @@
+from copy import deepcopy
+
+from eth2spec.phase0 import spec
+from eth2spec.phase0.spec import get_beacon_proposer_index, slot_to_epoch, get_domain, BeaconBlock
+from eth2spec.phase0.state_transition import state_transition, state_transition_to
+from eth2spec.test.helpers.keys import privkeys
+from eth2spec.utils.bls import bls_sign, only_with_bls
+from eth2spec.utils.minimal_ssz import signing_root, hash_tree_root
+
+
+# Fully ignore the function if BLS is off, beacon-proposer index calculation is slow.
+@only_with_bls()
+def sign_block(state, block, proposer_index=None):
+ assert state.slot <= block.slot
+
+ if proposer_index is None:
+ if block.slot == state.slot:
+ proposer_index = get_beacon_proposer_index(state)
+ else:
+ if slot_to_epoch(state.slot) + 1 > slot_to_epoch(block.slot):
+ print("warning: block slot far away, and no proposer index manually given."
+ " Signing block is slow due to transition for proposer index calculation.")
+ # use stub state to get proposer index of future slot
+ stub_state = deepcopy(state)
+ state_transition_to(stub_state, block.slot)
+ proposer_index = get_beacon_proposer_index(stub_state)
+
+ privkey = privkeys[proposer_index]
+
+ block.body.randao_reveal = bls_sign(
+ privkey=privkey,
+ message_hash=hash_tree_root(slot_to_epoch(block.slot)),
+ domain=get_domain(
+ state,
+ message_epoch=slot_to_epoch(block.slot),
+ domain_type=spec.DOMAIN_RANDAO,
+ )
+ )
+ block.signature = bls_sign(
+ message_hash=signing_root(block),
+ privkey=privkey,
+ domain=get_domain(
+ state,
+ spec.DOMAIN_BEACON_PROPOSER,
+ slot_to_epoch(block.slot)))
+
+
+def apply_empty_block(state):
+ """
+ Transition via an empty block (on current slot, assuming no block has been applied yet).
+ :return: the empty block that triggered the transition.
+ """
+ block = build_empty_block(state, signed=True)
+ state_transition(state, block)
+ return block
+
+
+def build_empty_block(state, slot=None, signed=False):
+ if slot is None:
+ slot = state.slot
+ empty_block = BeaconBlock()
+ empty_block.slot = slot
+ empty_block.body.eth1_data.deposit_count = state.deposit_index
+ previous_block_header = deepcopy(state.latest_block_header)
+ if previous_block_header.state_root == spec.ZERO_HASH:
+ previous_block_header.state_root = state.hash_tree_root()
+ empty_block.previous_block_root = signing_root(previous_block_header)
+
+ if signed:
+ sign_block(state, empty_block)
+
+ return empty_block
+
+
+def build_empty_block_for_next_slot(state, signed=False):
+ return build_empty_block(state, state.slot + 1, signed=signed)
+
diff --git a/test_libs/pyspec/eth2spec/test/helpers/block_header.py b/test_libs/pyspec/eth2spec/test/helpers/block_header.py
new file mode 100644
index 0000000000..9aba62d37d
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/test/helpers/block_header.py
@@ -0,0 +1,18 @@
+# Access constants from spec pkg reference.
+import eth2spec.phase0.spec as spec
+
+from eth2spec.phase0.spec import get_domain
+from eth2spec.utils.bls import bls_sign
+from eth2spec.utils.minimal_ssz import signing_root
+
+
+def sign_block_header(state, header, privkey):
+ domain = get_domain(
+ state=state,
+ domain_type=spec.DOMAIN_BEACON_PROPOSER,
+ )
+ header.signature = bls_sign(
+ message_hash=signing_root(header),
+ privkey=privkey,
+ domain=domain,
+ )
diff --git a/test_libs/pyspec/eth2spec/test/helpers/deposits.py b/test_libs/pyspec/eth2spec/test/helpers/deposits.py
new file mode 100644
index 0000000000..c5deb124e6
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/test/helpers/deposits.py
@@ -0,0 +1,81 @@
+# Access constants from spec pkg reference.
+import eth2spec.phase0.spec as spec
+
+from eth2spec.phase0.spec import get_domain, DepositData, verify_merkle_branch, Deposit, ZERO_HASH
+from eth2spec.test.helpers.keys import pubkeys, privkeys
+from eth2spec.utils.bls import bls_sign
+from eth2spec.utils.merkle_minimal import calc_merkle_tree_from_leaves, get_merkle_root, get_merkle_proof
+from eth2spec.utils.minimal_ssz import signing_root
+
+
+def build_deposit_data(state, pubkey, privkey, amount, signed=False):
+ deposit_data = DepositData(
+ pubkey=pubkey,
+ # insecurely use pubkey as withdrawal key as well
+ withdrawal_credentials=spec.BLS_WITHDRAWAL_PREFIX_BYTE + spec.hash(pubkey)[1:],
+ amount=amount,
+ )
+ if signed:
+ sign_deposit_data(state, deposit_data, privkey)
+ return deposit_data
+
+
+def sign_deposit_data(state, deposit_data, privkey):
+ signature = bls_sign(
+ message_hash=signing_root(deposit_data),
+ privkey=privkey,
+ domain=get_domain(
+ state,
+ spec.DOMAIN_DEPOSIT,
+ )
+ )
+ deposit_data.signature = signature
+
+
+def build_deposit(state,
+ deposit_data_leaves,
+ pubkey,
+ privkey,
+ amount,
+ signed):
+ deposit_data = build_deposit_data(state, pubkey, privkey, amount, signed)
+
+ item = deposit_data.hash_tree_root()
+ index = len(deposit_data_leaves)
+ deposit_data_leaves.append(item)
+ tree = calc_merkle_tree_from_leaves(tuple(deposit_data_leaves))
+ root = get_merkle_root((tuple(deposit_data_leaves)))
+ proof = list(get_merkle_proof(tree, item_index=index))
+ assert verify_merkle_branch(item, proof, spec.DEPOSIT_CONTRACT_TREE_DEPTH, index, root)
+
+ deposit = Deposit(
+ proof=list(proof),
+ index=index,
+ data=deposit_data,
+ )
+
+ return deposit, root, deposit_data_leaves
+
+
+def prepare_state_and_deposit(state, validator_index, amount, signed=False):
+ """
+ Prepare the state for the deposit, and create a deposit for the given validator, depositing the given amount.
+ """
+ pre_validator_count = len(state.validator_registry)
+ # fill previous deposits with zero-hash
+ deposit_data_leaves = [ZERO_HASH] * pre_validator_count
+
+ pubkey = pubkeys[validator_index]
+ privkey = privkeys[validator_index]
+ deposit, root, deposit_data_leaves = build_deposit(
+ state,
+ deposit_data_leaves,
+ pubkey,
+ privkey,
+ amount,
+ signed
+ )
+
+ state.latest_eth1_data.deposit_root = root
+ state.latest_eth1_data.deposit_count = len(deposit_data_leaves)
+ return deposit
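`build_deposit` above leans on `calc_merkle_tree_from_leaves`, `get_merkle_root`, and `get_merkle_proof` from `eth2spec.utils.merkle_minimal`. The sketch below reimplements the same idea with `hashlib` (assuming a power-of-two leaf count; names and layout are illustrative, not the library's API) to show how a deposit's root and branch relate:

```python
from hashlib import sha256


def hash2(a, b):
    return sha256(a + b).digest()


def merkle_tree(leaves):
    # layers[0] = leaves, layers[-1] = [root]; leaf count assumed a power of two.
    layers = [list(leaves)]
    while len(layers[-1]) > 1:
        layer = layers[-1]
        layers.append([hash2(layer[i], layer[i + 1]) for i in range(0, len(layer), 2)])
    return layers


def merkle_proof(layers, index):
    # Collect the sibling at each layer, bottom-up.
    proof = []
    for layer in layers[:-1]:
        proof.append(layer[index ^ 1])
        index //= 2
    return proof


def verify_branch(leaf, proof, index, root):
    # Fold the proof back up; sibling goes left or right depending on index parity.
    value = leaf
    for sibling in proof:
        value = hash2(sibling, value) if index % 2 else hash2(value, sibling)
        index //= 2
    return value == root


leaves = [sha256(bytes([i])).digest() for i in range(8)]
layers = merkle_tree(leaves)
root = layers[-1][0]
proof = merkle_proof(layers, 5)
```

In `prepare_state_and_deposit` the earlier leaves are zero-hashes, the new deposit is appended as the last leaf, and the resulting root is installed into `state.latest_eth1_data` so `verify_merkle_branch` succeeds during block processing.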
diff --git a/test_libs/pyspec/eth2spec/test/helpers/genesis.py b/test_libs/pyspec/eth2spec/test/helpers/genesis.py
new file mode 100644
index 0000000000..01011cacd0
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/test/helpers/genesis.py
@@ -0,0 +1,51 @@
+# Access constants from spec pkg reference.
+import eth2spec.phase0.spec as spec
+
+from eth2spec.phase0.spec import Eth1Data, ZERO_HASH, get_active_validator_indices
+from eth2spec.test.helpers.keys import pubkeys
+from eth2spec.utils.minimal_ssz import hash_tree_root
+
+
+def build_mock_validator(i: int, balance: int):
+ pubkey = pubkeys[i]
+ # insecurely use pubkey as withdrawal key as well
+ withdrawal_credentials = spec.BLS_WITHDRAWAL_PREFIX_BYTE + spec.hash(pubkey)[1:]
+ return spec.Validator(
+ pubkey=pubkeys[i],
+ withdrawal_credentials=withdrawal_credentials,
+ activation_eligibility_epoch=spec.FAR_FUTURE_EPOCH,
+ activation_epoch=spec.FAR_FUTURE_EPOCH,
+ exit_epoch=spec.FAR_FUTURE_EPOCH,
+ withdrawable_epoch=spec.FAR_FUTURE_EPOCH,
+ effective_balance=min(balance - balance % spec.EFFECTIVE_BALANCE_INCREMENT, spec.MAX_EFFECTIVE_BALANCE)
+ )
+
+
+def create_genesis_state(num_validators):
+ deposit_root = b'\x42' * 32
+
+ state = spec.BeaconState(
+ genesis_time=0,
+ deposit_index=num_validators,
+ latest_eth1_data=Eth1Data(
+ deposit_root=deposit_root,
+ deposit_count=num_validators,
+ block_hash=ZERO_HASH,
+ ))
+
+ # We "hack" in the initial validators,
+ # as it is much faster than creating and processing genesis deposits for every single test case.
+ state.balances = [spec.MAX_EFFECTIVE_BALANCE] * num_validators
+ state.validator_registry = [build_mock_validator(i, state.balances[i]) for i in range(num_validators)]
+
+ # Process genesis activations
+ for validator in state.validator_registry:
+ if validator.effective_balance >= spec.MAX_EFFECTIVE_BALANCE:
+ validator.activation_eligibility_epoch = spec.GENESIS_EPOCH
+ validator.activation_epoch = spec.GENESIS_EPOCH
+
+ genesis_active_index_root = hash_tree_root(get_active_validator_indices(state, spec.GENESIS_EPOCH))
+ for index in range(spec.LATEST_ACTIVE_INDEX_ROOTS_LENGTH):
+ state.latest_active_index_roots[index] = genesis_active_index_root
+
+ return state
diff --git a/test_libs/pyspec/eth2spec/test/helpers/keys.py b/test_libs/pyspec/eth2spec/test/helpers/keys.py
new file mode 100644
index 0000000000..f47cd7c10b
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/test/helpers/keys.py
@@ -0,0 +1,6 @@
+from py_ecc import bls
+from eth2spec.phase0 import spec
+
+privkeys = [i + 1 for i in range(spec.SLOTS_PER_EPOCH * 16)]
+pubkeys = [bls.privtopub(privkey) for privkey in privkeys]
+pubkey_to_privkey = {pubkey: privkey for privkey, pubkey in zip(privkeys, pubkeys)}
diff --git a/test_libs/pyspec/eth2spec/test/helpers/proposer_slashings.py b/test_libs/pyspec/eth2spec/test/helpers/proposer_slashings.py
new file mode 100644
index 0000000000..dfb8895dc2
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/test/helpers/proposer_slashings.py
@@ -0,0 +1,35 @@
+from copy import deepcopy
+
+from eth2spec.phase0.spec import (
+ get_current_epoch, get_active_validator_indices, BeaconBlockHeader, ProposerSlashing
+)
+from eth2spec.test.helpers.block_header import sign_block_header
+from eth2spec.test.helpers.keys import pubkey_to_privkey
+
+
+def get_valid_proposer_slashing(state, signed_1=False, signed_2=False):
+ current_epoch = get_current_epoch(state)
+ validator_index = get_active_validator_indices(state, current_epoch)[-1]
+ privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey]
+ slot = state.slot
+
+ header_1 = BeaconBlockHeader(
+ slot=slot,
+ previous_block_root=b'\x33' * 32,
+ state_root=b'\x44' * 32,
+ block_body_root=b'\x55' * 32,
+ )
+ header_2 = deepcopy(header_1)
+ header_2.previous_block_root = b'\x99' * 32
+ header_2.slot = slot + 1
+
+ if signed_1:
+ sign_block_header(state, header_1, privkey)
+ if signed_2:
+ sign_block_header(state, header_2, privkey)
+
+ return ProposerSlashing(
+ proposer_index=validator_index,
+ header_1=header_1,
+ header_2=header_2,
+ )
diff --git a/test_libs/pyspec/eth2spec/test/helpers/state.py b/test_libs/pyspec/eth2spec/test/helpers/state.py
new file mode 100644
index 0000000000..e720a9709f
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/test/helpers/state.py
@@ -0,0 +1,31 @@
+# Access constants from spec pkg reference.
+import eth2spec.phase0.spec as spec
+
+from eth2spec.phase0.state_transition import state_transition_to
+
+
+def get_balance(state, index):
+ return state.balances[index]
+
+
+def next_slot(state):
+ """
+ Transition to the next slot.
+ """
+ state_transition_to(state, state.slot + 1)
+
+
+def next_epoch(state):
+ """
+ Transition to the start slot of the next epoch
+ """
+ slot = state.slot + spec.SLOTS_PER_EPOCH - (state.slot % spec.SLOTS_PER_EPOCH)
+ state_transition_to(state, slot)
+
+
+def get_state_root(state, slot) -> bytes:
+ """
+ Return the state root at a recent ``slot``.
+ """
+ assert slot < state.slot <= slot + spec.SLOTS_PER_HISTORICAL_ROOT
+ return state.latest_state_roots[slot % spec.SLOTS_PER_HISTORICAL_ROOT]
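The slot arithmetic in `next_epoch` (and repeated in `run_process_registry_updates`, minus one) always lands on the first slot of the next epoch, even when the state is already at an epoch boundary. A quick check of the formula, with an illustrative `SLOTS_PER_EPOCH` rather than the spec's configured value:

```python
SLOTS_PER_EPOCH = 8  # illustrative; the real value comes from the spec config


def next_epoch_start_slot(slot):
    # Same arithmetic as next_epoch() above: advance past any partial epoch
    # to the first slot of the following epoch.
    return slot + SLOTS_PER_EPOCH - (slot % SLOTS_PER_EPOCH)
```

Note the boundary case: at `slot == 8` (an epoch start) the formula still advances a full epoch to `16`, never staying put.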
diff --git a/test_libs/pyspec/eth2spec/test/helpers/transfers.py b/test_libs/pyspec/eth2spec/test/helpers/transfers.py
new file mode 100644
index 0000000000..2045f48ad6
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/test/helpers/transfers.py
@@ -0,0 +1,55 @@
+# Access constants from spec pkg reference.
+import eth2spec.phase0.spec as spec
+
+from eth2spec.phase0.spec import get_current_epoch, get_active_validator_indices, Transfer, get_domain
+from eth2spec.test.helpers.keys import pubkeys, privkeys
+from eth2spec.test.helpers.state import get_balance
+from eth2spec.utils.bls import bls_sign
+from eth2spec.utils.minimal_ssz import signing_root
+
+
+def get_valid_transfer(state, slot=None, sender_index=None, amount=None, fee=None, signed=False):
+ if slot is None:
+ slot = state.slot
+ current_epoch = get_current_epoch(state)
+ if sender_index is None:
+ sender_index = get_active_validator_indices(state, current_epoch)[-1]
+ recipient_index = get_active_validator_indices(state, current_epoch)[0]
+ transfer_pubkey = pubkeys[-1]
+ transfer_privkey = privkeys[-1]
+
+ if fee is None:
+ fee = get_balance(state, sender_index) // 32
+ if amount is None:
+ amount = get_balance(state, sender_index) - fee
+
+ transfer = Transfer(
+ sender=sender_index,
+ recipient=recipient_index,
+ amount=amount,
+ fee=fee,
+ slot=slot,
+ pubkey=transfer_pubkey,
+ )
+ if signed:
+ sign_transfer(state, transfer, transfer_privkey)
+
+ # ensure withdrawal_credentials reproducible
+ state.validator_registry[transfer.sender].withdrawal_credentials = (
+ spec.BLS_WITHDRAWAL_PREFIX_BYTE + spec.hash(transfer.pubkey)[1:]
+ )
+
+ return transfer
+
+
+def sign_transfer(state, transfer, privkey):
+ transfer.signature = bls_sign(
+ message_hash=signing_root(transfer),
+ privkey=privkey,
+ domain=get_domain(
+ state=state,
+ domain_type=spec.DOMAIN_TRANSFER,
+ message_epoch=get_current_epoch(state),
+ )
+ )
+ return transfer
diff --git a/test_libs/pyspec/eth2spec/test/helpers/voluntary_exits.py b/test_libs/pyspec/eth2spec/test/helpers/voluntary_exits.py
new file mode 100644
index 0000000000..54376d694b
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/test/helpers/voluntary_exits.py
@@ -0,0 +1,28 @@
+# Access constants from spec pkg reference.
+import eth2spec.phase0.spec as spec
+
+from eth2spec.phase0.spec import VoluntaryExit, get_domain
+from eth2spec.utils.bls import bls_sign
+from eth2spec.utils.minimal_ssz import signing_root
+
+
+def build_voluntary_exit(state, epoch, validator_index, privkey, signed=False):
+ voluntary_exit = VoluntaryExit(
+ epoch=epoch,
+ validator_index=validator_index,
+ )
+ if signed:
+ sign_voluntary_exit(state, voluntary_exit, privkey)
+ return voluntary_exit
+
+
+def sign_voluntary_exit(state, voluntary_exit, privkey):
+ voluntary_exit.signature = bls_sign(
+ message_hash=signing_root(voluntary_exit),
+ privkey=privkey,
+ domain=get_domain(
+ state=state,
+ domain_type=spec.DOMAIN_VOLUNTARY_EXIT,
+ message_epoch=voluntary_exit.epoch,
+ )
+ )
diff --git a/test_libs/pyspec/eth2spec/test/sanity/__init__.py b/test_libs/pyspec/eth2spec/test/sanity/__init__.py
new file mode 100644
index 0000000000..e69de29bb2
diff --git a/test_libs/pyspec/eth2spec/test/sanity/test_blocks.py b/test_libs/pyspec/eth2spec/test/sanity/test_blocks.py
new file mode 100644
index 0000000000..c9aadbf2ac
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/test/sanity/test_blocks.py
@@ -0,0 +1,408 @@
+from copy import deepcopy
+
+import eth2spec.phase0.spec as spec
+from eth2spec.utils.bls import bls_sign
+
+from eth2spec.utils.minimal_ssz import signing_root
+from eth2spec.phase0.spec import (
+ # SSZ
+ VoluntaryExit,
+ # functions
+ get_active_validator_indices,
+ get_beacon_proposer_index,
+ get_block_root_at_slot,
+ get_current_epoch,
+ get_domain,
+)
+from eth2spec.phase0.state_transition import (
+ state_transition,
+)
+from eth2spec.test.helpers.state import get_balance
+from eth2spec.test.helpers.transfers import get_valid_transfer
+from eth2spec.test.helpers.block import build_empty_block_for_next_slot, sign_block
+from eth2spec.test.helpers.keys import privkeys, pubkeys
+from eth2spec.test.helpers.attester_slashings import get_valid_attester_slashing
+from eth2spec.test.helpers.proposer_slashings import get_valid_proposer_slashing
+from eth2spec.test.helpers.attestations import get_valid_attestation
+from eth2spec.test.helpers.deposits import prepare_state_and_deposit
+
+from eth2spec.test.context import spec_state_test, never_bls
+
+
+@never_bls
+@spec_state_test
+def test_empty_block_transition(state):
+ pre_slot = state.slot
+ pre_eth1_votes = len(state.eth1_data_votes)
+
+ yield 'pre', state
+
+ block = build_empty_block_for_next_slot(state, signed=True)
+ yield 'blocks', [block], [spec.BeaconBlock]
+
+ state_transition(state, block)
+ yield 'post', state
+
+ assert len(state.eth1_data_votes) == pre_eth1_votes + 1
+ assert get_block_root_at_slot(state, pre_slot) == block.previous_block_root
+
+
+@never_bls
+@spec_state_test
+def test_skipped_slots(state):
+ pre_slot = state.slot
+ yield 'pre', state
+
+ block = build_empty_block_for_next_slot(state)
+ block.slot += 3
+ sign_block(state, block)
+ yield 'blocks', [block], [spec.BeaconBlock]
+
+ state_transition(state, block)
+ yield 'post', state
+
+ assert state.slot == block.slot
+ for slot in range(pre_slot, state.slot):
+ assert get_block_root_at_slot(state, slot) == block.previous_block_root
+
+
+@spec_state_test
+def test_empty_epoch_transition(state):
+ pre_slot = state.slot
+ yield 'pre', state
+
+ block = build_empty_block_for_next_slot(state)
+ block.slot += spec.SLOTS_PER_EPOCH
+ sign_block(state, block)
+ yield 'blocks', [block], [spec.BeaconBlock]
+
+ state_transition(state, block)
+ yield 'post', state
+
+ assert state.slot == block.slot
+ for slot in range(pre_slot, state.slot):
+ assert get_block_root_at_slot(state, slot) == block.previous_block_root
+
+
+# @spec_state_test
+# def test_empty_epoch_transition_not_finalizing(state):
+# # copy for later balance lookups.
+# pre_state = deepcopy(state)
+# yield 'pre', state
+#
+# block = build_empty_block_for_next_slot(state)
+# block.slot += spec.SLOTS_PER_EPOCH * 5
+# sign_block(state, block, proposer_index=0)
+# yield 'blocks', [block], [spec.BeaconBlock]
+#
+# state_transition(state, block)
+# yield 'post', state
+#
+# assert state.slot == block.slot
+# assert state.finalized_epoch < get_current_epoch(state) - 4
+# for index in range(len(state.validator_registry)):
+# assert get_balance(state, index) < get_balance(pre_state, index)
+
+
+@spec_state_test
+def test_proposer_slashing(state):
+ # copy for later balance lookups.
+ pre_state = deepcopy(state)
+ proposer_slashing = get_valid_proposer_slashing(state, signed_1=True, signed_2=True)
+ validator_index = proposer_slashing.proposer_index
+
+ assert not state.validator_registry[validator_index].slashed
+
+ yield 'pre', state
+
+ #
+ # Add to state via block transition
+ #
+ block = build_empty_block_for_next_slot(state)
+ block.body.proposer_slashings.append(proposer_slashing)
+ sign_block(state, block)
+ yield 'blocks', [block], [spec.BeaconBlock]
+
+ state_transition(state, block)
+ yield 'post', state
+
+ # check if slashed
+ slashed_validator = state.validator_registry[validator_index]
+ assert slashed_validator.slashed
+ assert slashed_validator.exit_epoch < spec.FAR_FUTURE_EPOCH
+ assert slashed_validator.withdrawable_epoch < spec.FAR_FUTURE_EPOCH
+ # lost whistleblower reward
+ assert get_balance(state, validator_index) < get_balance(pre_state, validator_index)
+
+
+@spec_state_test
+def test_attester_slashing(state):
+ # copy for later balance lookups.
+ pre_state = deepcopy(state)
+
+ attester_slashing = get_valid_attester_slashing(state, signed_1=True, signed_2=True)
+ validator_index = (attester_slashing.attestation_1.custody_bit_0_indices +
+ attester_slashing.attestation_1.custody_bit_1_indices)[0]
+
+ assert not state.validator_registry[validator_index].slashed
+
+ yield 'pre', state
+
+ #
+ # Add to state via block transition
+ #
+ block = build_empty_block_for_next_slot(state)
+ block.body.attester_slashings.append(attester_slashing)
+ sign_block(state, block)
+ yield 'blocks', [block], [spec.BeaconBlock]
+
+ state_transition(state, block)
+ yield 'post', state
+
+ slashed_validator = state.validator_registry[validator_index]
+ assert slashed_validator.slashed
+ assert slashed_validator.exit_epoch < spec.FAR_FUTURE_EPOCH
+ assert slashed_validator.withdrawable_epoch < spec.FAR_FUTURE_EPOCH
+ # lost whistleblower reward
+ assert get_balance(state, validator_index) < get_balance(pre_state, validator_index)
+
+ proposer_index = get_beacon_proposer_index(state)
+ # gained whistleblower reward
+ assert (
+ get_balance(state, proposer_index) >
+ get_balance(pre_state, proposer_index)
+ )
+
+
+# TODO update functions below to be like above, i.e. with @spec_state_test and yielding data to put into the test vector
+
+@spec_state_test
+def test_deposit_in_block(state):
+ initial_registry_len = len(state.validator_registry)
+ initial_balances_len = len(state.balances)
+
+ validator_index = len(state.validator_registry)
+ amount = spec.MAX_EFFECTIVE_BALANCE
+ deposit = prepare_state_and_deposit(state, validator_index, amount, signed=True)
+
+ yield 'pre', state
+
+ block = build_empty_block_for_next_slot(state)
+ block.body.deposits.append(deposit)
+ sign_block(state, block)
+
+ yield 'blocks', [block], [spec.BeaconBlock]
+
+ state_transition(state, block)
+ yield 'post', state
+
+ assert len(state.validator_registry) == initial_registry_len + 1
+ assert len(state.balances) == initial_balances_len + 1
+ assert get_balance(state, validator_index) == spec.MAX_EFFECTIVE_BALANCE
+ assert state.validator_registry[validator_index].pubkey == pubkeys[validator_index]
+
+
+@spec_state_test
+def test_deposit_top_up(state):
+ validator_index = 0
+ amount = spec.MAX_EFFECTIVE_BALANCE // 4
+ deposit = prepare_state_and_deposit(state, validator_index, amount)
+
+ initial_registry_len = len(state.validator_registry)
+ initial_balances_len = len(state.balances)
+ validator_pre_balance = get_balance(state, validator_index)
+
+ yield 'pre', state
+
+ block = build_empty_block_for_next_slot(state)
+ block.body.deposits.append(deposit)
+ sign_block(state, block)
+
+ yield 'blocks', [block], [spec.BeaconBlock]
+
+ state_transition(state, block)
+ yield 'post', state
+
+ assert len(state.validator_registry) == initial_registry_len
+ assert len(state.balances) == initial_balances_len
+ assert get_balance(state, validator_index) == validator_pre_balance + amount
+
+
+@spec_state_test
+def test_attestation(state):
+ state.slot = spec.SLOTS_PER_EPOCH
+
+ yield 'pre', state
+
+ attestation = get_valid_attestation(state, signed=True)
+
+ # Add to state via block transition
+ pre_current_attestations_len = len(state.current_epoch_attestations)
+ attestation_block = build_empty_block_for_next_slot(state)
+ attestation_block.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY
+ attestation_block.body.attestations.append(attestation)
+ sign_block(state, attestation_block)
+ state_transition(state, attestation_block)
+
+ assert len(state.current_epoch_attestations) == pre_current_attestations_len + 1
+
+ # Epoch transition should move to previous_epoch_attestations
+ pre_current_attestations_root = spec.hash_tree_root(state.current_epoch_attestations)
+
+ epoch_block = build_empty_block_for_next_slot(state)
+ epoch_block.slot += spec.SLOTS_PER_EPOCH
+ sign_block(state, epoch_block)
+ state_transition(state, epoch_block)
+
+ yield 'blocks', [attestation_block, epoch_block], [spec.BeaconBlock]
+ yield 'post', state
+
+ assert len(state.current_epoch_attestations) == 0
+ assert spec.hash_tree_root(state.previous_epoch_attestations) == pre_current_attestations_root
+
+
+@spec_state_test
+def test_voluntary_exit(state):
+ validator_index = get_active_validator_indices(
+ state,
+ get_current_epoch(state)
+ )[-1]
+
+ # move state forward PERSISTENT_COMMITTEE_PERIOD epochs to allow for exit
+ state.slot += spec.PERSISTENT_COMMITTEE_PERIOD * spec.SLOTS_PER_EPOCH
+
+ yield 'pre', state
+
+ voluntary_exit = VoluntaryExit(
+ epoch=get_current_epoch(state),
+ validator_index=validator_index,
+ )
+ voluntary_exit.signature = bls_sign(
+ message_hash=signing_root(voluntary_exit),
+ privkey=privkeys[validator_index],
+ domain=get_domain(
+ state=state,
+ domain_type=spec.DOMAIN_VOLUNTARY_EXIT,
+ )
+ )
+
+ # Add to state via block transition
+ initiate_exit_block = build_empty_block_for_next_slot(state)
+ initiate_exit_block.body.voluntary_exits.append(voluntary_exit)
+ sign_block(state, initiate_exit_block)
+ state_transition(state, initiate_exit_block)
+
+ assert state.validator_registry[validator_index].exit_epoch < spec.FAR_FUTURE_EPOCH
+
+ # Process within epoch transition
+ exit_block = build_empty_block_for_next_slot(state)
+ exit_block.slot += spec.SLOTS_PER_EPOCH
+ sign_block(state, exit_block)
+ state_transition(state, exit_block)
+
+ yield 'blocks', [initiate_exit_block, exit_block], [spec.BeaconBlock]
+ yield 'post', state
+
+ assert state.validator_registry[validator_index].exit_epoch < spec.FAR_FUTURE_EPOCH
+
+
+@spec_state_test
+def test_transfer(state):
+ # overwrite default 0 to test
+ spec.MAX_TRANSFERS = 1
+
+ sender_index = get_active_validator_indices(state, get_current_epoch(state))[-1]
+ amount = get_balance(state, sender_index)
+
+ transfer = get_valid_transfer(state, state.slot + 1, sender_index, amount, signed=True)
+ recipient_index = transfer.recipient
+ pre_transfer_recipient_balance = get_balance(state, recipient_index)
+
+ # un-activate so validator can transfer
+ state.validator_registry[sender_index].activation_eligibility_epoch = spec.FAR_FUTURE_EPOCH
+
+ yield 'pre', state
+
+ # Add to state via block transition
+ block = build_empty_block_for_next_slot(state)
+ block.body.transfers.append(transfer)
+ sign_block(state, block)
+
+ yield 'blocks', [block], [spec.BeaconBlock]
+
+ state_transition(state, block)
+ yield 'post', state
+
+ sender_balance = get_balance(state, sender_index)
+ recipient_balance = get_balance(state, recipient_index)
+ assert sender_balance == 0
+ assert recipient_balance == pre_transfer_recipient_balance + amount
+
+
+@spec_state_test
+def test_balance_driven_status_transitions(state):
+ current_epoch = get_current_epoch(state)
+ validator_index = get_active_validator_indices(state, current_epoch)[-1]
+
+ assert state.validator_registry[validator_index].exit_epoch == spec.FAR_FUTURE_EPOCH
+
+ # set validator balance to below ejection threshold
+ state.validator_registry[validator_index].effective_balance = spec.EJECTION_BALANCE
+
+ yield 'pre', state
+
+ # trigger epoch transition
+ block = build_empty_block_for_next_slot(state)
+ block.slot += spec.SLOTS_PER_EPOCH
+ sign_block(state, block)
+ state_transition(state, block)
+
+ yield 'blocks', [block], [spec.BeaconBlock]
+ yield 'post', state
+
+ assert state.validator_registry[validator_index].exit_epoch < spec.FAR_FUTURE_EPOCH
+
+
+@spec_state_test
+def test_historical_batch(state):
+ state.slot += spec.SLOTS_PER_HISTORICAL_ROOT - (state.slot % spec.SLOTS_PER_HISTORICAL_ROOT) - 1
+ pre_historical_roots_len = len(state.historical_roots)
+
+ yield 'pre', state
+
+ block = build_empty_block_for_next_slot(state, signed=True)
+ state_transition(state, block)
+
+ yield 'blocks', [block], [spec.BeaconBlock]
+ yield 'post', state
+
+ assert state.slot == block.slot
+ assert get_current_epoch(state) % (spec.SLOTS_PER_HISTORICAL_ROOT // spec.SLOTS_PER_EPOCH) == 0
+ assert len(state.historical_roots) == pre_historical_roots_len + 1
+
+
+# @spec_state_test
+# def test_eth1_data_votes(state):
+# yield 'pre', state
+#
+# expected_votes = 0
+# assert len(state.eth1_data_votes) == expected_votes
+#
+# blocks = []
+# for _ in range(spec.SLOTS_PER_ETH1_VOTING_PERIOD - 1):
+# block = build_empty_block_for_next_slot(state)
+# state_transition(state, block)
+# expected_votes += 1
+# assert len(state.eth1_data_votes) == expected_votes
+# blocks.append(block)
+#
+# block = build_empty_block_for_next_slot(state)
+# blocks.append(block)
+#
+# state_transition(state, block)
+#
+# yield 'blocks', [block], [spec.BeaconBlock]
+# yield 'post', state
+#
+# assert state.slot % spec.SLOTS_PER_ETH1_VOTING_PERIOD == 0
+# assert len(state.eth1_data_votes) == 1
diff --git a/test_libs/pyspec/eth2spec/test/sanity/test_slots.py b/test_libs/pyspec/eth2spec/test/sanity/test_slots.py
new file mode 100644
index 0000000000..2e5f3a5df6
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/test/sanity/test_slots.py
@@ -0,0 +1,58 @@
+import eth2spec.phase0.spec as spec
+
+from eth2spec.phase0.state_transition import state_transition_to
+from eth2spec.test.helpers.state import get_state_root
+from eth2spec.test.context import spec_state_test
+
+
+@spec_state_test
+def test_slots_1(state):
+ pre_slot = state.slot
+ pre_root = state.hash_tree_root()
+ yield 'pre', state
+
+ slots = 1
+ yield 'slots', slots
+ state_transition_to(state, state.slot + slots)
+
+ yield 'post', state
+ assert state.slot == pre_slot + 1
+ assert get_state_root(state, pre_slot) == pre_root
+
+
+@spec_state_test
+def test_slots_2(state):
+ yield 'pre', state
+ slots = 2
+ yield 'slots', slots
+ state_transition_to(state, state.slot + slots)
+ yield 'post', state
+
+
+@spec_state_test
+def test_empty_epoch(state):
+ yield 'pre', state
+ slots = spec.SLOTS_PER_EPOCH
+ yield 'slots', slots
+ state_transition_to(state, state.slot + slots)
+ yield 'post', state
+
+
+@spec_state_test
+def test_double_empty_epoch(state):
+ yield 'pre', state
+ slots = spec.SLOTS_PER_EPOCH * 2
+ yield 'slots', slots
+ state_transition_to(state, state.slot + slots)
+ yield 'post', state
+
+
+@spec_state_test
+def test_over_epoch_boundary(state):
+ state_transition_to(state, state.slot + (spec.SLOTS_PER_EPOCH // 2))
+ yield 'pre', state
+ slots = spec.SLOTS_PER_EPOCH
+ yield 'slots', slots
+ state_transition_to(state, state.slot + slots)
+ yield 'post', state
+
diff --git a/test_libs/pyspec/tests/test_finality.py b/test_libs/pyspec/eth2spec/test/test_finality.py
similarity index 56%
rename from test_libs/pyspec/tests/test_finality.py
rename to test_libs/pyspec/eth2spec/test/test_finality.py
index ca048c2b2a..56f65eca9a 100644
--- a/test_libs/pyspec/tests/test_finality.py
+++ b/test_libs/pyspec/eth2spec/test/test_finality.py
@@ -1,24 +1,18 @@
from copy import deepcopy
-import pytest
-
import eth2spec.phase0.spec as spec
-
from eth2spec.phase0.state_transition import (
state_transition,
)
-from .helpers import (
- build_empty_block_for_next_slot,
- fill_aggregate_attestation,
+from .context import spec_state_test, never_bls
+from .helpers.state import next_epoch
+from .helpers.block import build_empty_block_for_next_slot, apply_empty_block
+from .helpers.attestations import (
get_current_epoch,
get_epoch_start_slot,
get_valid_attestation,
- next_epoch,
)
-# mark entire file as 'state'
-pytestmark = pytest.mark.state
-
def check_finality(state,
prev_state,
@@ -58,13 +52,11 @@ def next_epoch_with_attestations(state,
slot_to_attest = post_state.slot - spec.MIN_ATTESTATION_INCLUSION_DELAY + 1
if slot_to_attest >= get_epoch_start_slot(get_current_epoch(post_state)):
cur_attestation = get_valid_attestation(post_state, slot_to_attest)
- fill_aggregate_attestation(post_state, cur_attestation)
block.body.attestations.append(cur_attestation)
if fill_prev_epoch:
slot_to_attest = post_state.slot - spec.SLOTS_PER_EPOCH + 1
prev_attestation = get_valid_attestation(post_state, slot_to_attest)
- fill_aggregate_attestation(post_state, prev_attestation)
block.body.attestations.append(prev_attestation)
state_transition(post_state, block)
@@ -73,126 +65,140 @@ def next_epoch_with_attestations(state,
return state, blocks, post_state
+@never_bls
+@spec_state_test
def test_finality_rule_4(state):
- test_state = deepcopy(state)
+ yield 'pre', state
blocks = []
for epoch in range(4):
- prev_state, new_blocks, test_state = next_epoch_with_attestations(test_state, True, False)
+ prev_state, new_blocks, state = next_epoch_with_attestations(state, True, False)
blocks += new_blocks
# justification/finalization skipped at GENESIS_EPOCH
if epoch == 0:
- check_finality(test_state, prev_state, False, False, False)
+ check_finality(state, prev_state, False, False, False)
# justification/finalization skipped at GENESIS_EPOCH + 1
elif epoch == 1:
- check_finality(test_state, prev_state, False, False, False)
+ check_finality(state, prev_state, False, False, False)
elif epoch == 2:
- check_finality(test_state, prev_state, True, False, False)
+ check_finality(state, prev_state, True, False, False)
elif epoch >= 3:
# rule 4 of finality
- check_finality(test_state, prev_state, True, True, True)
- assert test_state.finalized_epoch == prev_state.current_justified_epoch
- assert test_state.finalized_root == prev_state.current_justified_root
+ check_finality(state, prev_state, True, True, True)
+ assert state.finalized_epoch == prev_state.current_justified_epoch
+ assert state.finalized_root == prev_state.current_justified_root
- return state, blocks, test_state
+ yield 'blocks', blocks, [spec.BeaconBlock]
+ yield 'post', state
+@never_bls
+@spec_state_test
def test_finality_rule_1(state):
# get past first two epochs that finality does not run on
next_epoch(state)
+ apply_empty_block(state)
next_epoch(state)
+ apply_empty_block(state)
- pre_state = deepcopy(state)
- test_state = deepcopy(state)
+ yield 'pre', state
blocks = []
for epoch in range(3):
- prev_state, new_blocks, test_state = next_epoch_with_attestations(test_state, False, True)
+ prev_state, new_blocks, state = next_epoch_with_attestations(state, False, True)
blocks += new_blocks
if epoch == 0:
- check_finality(test_state, prev_state, True, False, False)
+ check_finality(state, prev_state, True, False, False)
elif epoch == 1:
- check_finality(test_state, prev_state, True, True, False)
+ check_finality(state, prev_state, True, True, False)
elif epoch == 2:
# finalized by rule 1
- check_finality(test_state, prev_state, True, True, True)
- assert test_state.finalized_epoch == prev_state.previous_justified_epoch
- assert test_state.finalized_root == prev_state.previous_justified_root
+ check_finality(state, prev_state, True, True, True)
+ assert state.finalized_epoch == prev_state.previous_justified_epoch
+ assert state.finalized_root == prev_state.previous_justified_root
- return pre_state, blocks, test_state
+ yield 'blocks', blocks, [spec.BeaconBlock]
+ yield 'post', state
+@never_bls
+@spec_state_test
def test_finality_rule_2(state):
# get past first two epochs that finality does not run on
next_epoch(state)
+ apply_empty_block(state)
next_epoch(state)
+ apply_empty_block(state)
- pre_state = deepcopy(state)
- test_state = deepcopy(state)
+ yield 'pre', state
blocks = []
for epoch in range(3):
if epoch == 0:
- prev_state, new_blocks, test_state = next_epoch_with_attestations(test_state, True, False)
- check_finality(test_state, prev_state, True, False, False)
+ prev_state, new_blocks, state = next_epoch_with_attestations(state, True, False)
+ check_finality(state, prev_state, True, False, False)
elif epoch == 1:
- prev_state, new_blocks, test_state = next_epoch_with_attestations(test_state, False, False)
- check_finality(test_state, prev_state, False, True, False)
+ prev_state, new_blocks, state = next_epoch_with_attestations(state, False, False)
+ check_finality(state, prev_state, False, True, False)
elif epoch == 2:
- prev_state, new_blocks, test_state = next_epoch_with_attestations(test_state, False, True)
+ prev_state, new_blocks, state = next_epoch_with_attestations(state, False, True)
# finalized by rule 2
- check_finality(test_state, prev_state, True, False, True)
- assert test_state.finalized_epoch == prev_state.previous_justified_epoch
- assert test_state.finalized_root == prev_state.previous_justified_root
+ check_finality(state, prev_state, True, False, True)
+ assert state.finalized_epoch == prev_state.previous_justified_epoch
+ assert state.finalized_root == prev_state.previous_justified_root
blocks += new_blocks
- return pre_state, blocks, test_state
+ yield 'blocks', blocks, [spec.BeaconBlock]
+ yield 'post', state
+@never_bls
+@spec_state_test
def test_finality_rule_3(state):
"""
Test scenario described here
https://github.com/ethereum/eth2.0-specs/issues/611#issuecomment-463612892
"""
-
# get past first two epochs that finality does not run on
next_epoch(state)
+ apply_empty_block(state)
next_epoch(state)
+ apply_empty_block(state)
- pre_state = deepcopy(state)
- test_state = deepcopy(state)
+ yield 'pre', state
blocks = []
- prev_state, new_blocks, test_state = next_epoch_with_attestations(test_state, True, False)
+ prev_state, new_blocks, state = next_epoch_with_attestations(state, True, False)
blocks += new_blocks
- check_finality(test_state, prev_state, True, False, False)
+ check_finality(state, prev_state, True, False, False)
# In epoch N, JE is set to N, prev JE is set to N-1
- prev_state, new_blocks, test_state = next_epoch_with_attestations(test_state, True, False)
+ prev_state, new_blocks, state = next_epoch_with_attestations(state, True, False)
blocks += new_blocks
- check_finality(test_state, prev_state, True, True, True)
+ check_finality(state, prev_state, True, True, True)
# In epoch N+1, JE is N, prev JE is N-1, and not enough messages get in to do anything
- prev_state, new_blocks, test_state = next_epoch_with_attestations(test_state, False, False)
+ prev_state, new_blocks, state = next_epoch_with_attestations(state, False, False)
blocks += new_blocks
- check_finality(test_state, prev_state, False, True, False)
+ check_finality(state, prev_state, False, True, False)
# In epoch N+2, JE is N, prev JE is N, and enough messages from the previous epoch get in to justify N+1.
# N+1 now becomes the JE. Not enough messages from epoch N+2 itself get in to justify N+2
- prev_state, new_blocks, test_state = next_epoch_with_attestations(test_state, False, True)
+ prev_state, new_blocks, state = next_epoch_with_attestations(state, False, True)
blocks += new_blocks
# rule 2
- check_finality(test_state, prev_state, True, False, True)
+ check_finality(state, prev_state, True, False, True)
# In epoch N+3, LJE is N+1, prev LJE is N, and enough messages get in to justify epochs N+2 and N+3.
- prev_state, new_blocks, test_state = next_epoch_with_attestations(test_state, True, True)
+ prev_state, new_blocks, state = next_epoch_with_attestations(state, True, True)
blocks += new_blocks
# rule 3
- check_finality(test_state, prev_state, True, True, True)
- assert test_state.finalized_epoch == prev_state.current_justified_epoch
- assert test_state.finalized_root == prev_state.current_justified_root
+ check_finality(state, prev_state, True, True, True)
+ assert state.finalized_epoch == prev_state.current_justified_epoch
+ assert state.finalized_root == prev_state.current_justified_root
- return pre_state, blocks, test_state
+ yield 'blocks', blocks, [spec.BeaconBlock]
+ yield 'post', state
diff --git a/test_libs/pyspec/eth2spec/test/utils.py b/test_libs/pyspec/eth2spec/test/utils.py
new file mode 100644
index 0000000000..b61801c3dd
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/test/utils.py
@@ -0,0 +1,80 @@
+from typing import Dict, Any, Callable, Iterable
+from eth2spec.debug.encode import encode
+
+
+def spectest(description: str = None):
+ def runner(fn):
+    # this wraps the function to hide that it is actually yielding data instead of returning once.
+ def entry(*args, **kw):
+            # check generator mode; the kwarg may be absent or set to something other than True.
+ # "pop" removes it, so it is not passed to the inner function.
+ if kw.pop('generator_mode', False) is True:
+ out = {}
+ if description is None:
+ # fall back on function name for test description
+ name = fn.__name__
+ if name.startswith('test_'):
+ name = name[5:]
+ out['description'] = name
+ else:
+ # description can be explicit
+ out['description'] = description
+ has_contents = False
+ # put all generated data into a dict.
+ for data in fn(*args, **kw):
+ has_contents = True
+ # If there is a type argument, encode it as that type.
+ if len(data) == 3:
+ (key, value, typ) = data
+ out[key] = encode(value, typ)
+ else:
+ # Otherwise, try to infer the type, but keep it as-is if it's not a SSZ container.
+ (key, value) = data
+ if hasattr(value.__class__, 'fields'):
+ out[key] = encode(value, value.__class__)
+ else:
+ out[key] = value
+ if has_contents:
+ return out
+ else:
+ return None
+ else:
+ # just complete the function, ignore all yielded data, we are not using it
+ for _ in fn(*args, **kw):
+ continue
+ return None
+ return entry
+ return runner
+
+
+def with_tags(tags: Dict[str, Any]):
+ """
+ Decorator factory, adds tags (key, value) pairs to the output of the function.
+ Useful to build test-vector annotations with.
+ This decorator is applied after the ``spectest`` decorator is applied.
+ :param tags: dict of tags
+ :return: Decorator.
+ """
+ def runner(fn):
+ def entry(*args, **kw):
+ fn_out = fn(*args, **kw)
+ # do not add tags if the function is not returning a dict at all (i.e. not in generator mode)
+ if fn_out is None:
+ return None
+ return {**tags, **fn_out}
+ return entry
+ return runner
+
+
+def with_args(create_args: Callable[[], Iterable[Any]]):
+ """
+ Decorator factory, adds given extra arguments to the decorated function.
+ :param create_args: function to create arguments with.
+ :return: Decorator.
+ """
+ def runner(fn):
+    # this wraps the function, passing it the extra created arguments on each call.
+ def entry(*args, **kw):
+ return fn(*(list(create_args()) + list(args)), **kw)
+ return entry
+ return runner
diff --git a/test_libs/pyspec/eth2spec/utils/bls.py b/test_libs/pyspec/eth2spec/utils/bls.py
new file mode 100644
index 0000000000..52f1fed632
--- /dev/null
+++ b/test_libs/pyspec/eth2spec/utils/bls.py
@@ -0,0 +1,46 @@
+from py_ecc import bls
+
+# Flag to make BLS active or not. Used for testing; do not disable BLS in production unless you know what you are doing.
+bls_active = True
+
+STUB_SIGNATURE = b'\x11' * 96
+STUB_PUBKEY = b'\x22' * 48
+
+
+def only_with_bls(alt_return=None):
+ """
+ Decorator factory to make a function only run when BLS is active. Otherwise return the default.
+ """
+ def runner(fn):
+ def entry(*args, **kw):
+ if bls_active:
+ return fn(*args, **kw)
+ else:
+ return alt_return
+ return entry
+ return runner
+
+
+@only_with_bls(alt_return=True)
+def bls_verify(pubkey, message_hash, signature, domain):
+ return bls.verify(message_hash=message_hash, pubkey=pubkey, signature=signature, domain=domain)
+
+
+@only_with_bls(alt_return=True)
+def bls_verify_multiple(pubkeys, message_hashes, signature, domain):
+ return bls.verify_multiple(pubkeys, message_hashes, signature, domain)
+
+
+@only_with_bls(alt_return=STUB_PUBKEY)
+def bls_aggregate_pubkeys(pubkeys):
+ return bls.aggregate_pubkeys(pubkeys)
+
+
+@only_with_bls(alt_return=STUB_SIGNATURE)
+def bls_aggregate_signatures(signatures):
+ return bls.aggregate_signatures(signatures)
+
+
+@only_with_bls(alt_return=STUB_SIGNATURE)
+def bls_sign(message_hash, privkey, domain):
+ return bls.sign(message_hash=message_hash, privkey=privkey, domain=domain)
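The `only_with_bls` decorator checks the `bls_active` flag at call time, not at decoration time, which is what lets a test context (e.g. the `@never_bls` decorator used in `test_finality.py`) flip the flag after import. A minimal standalone sketch, with a hypothetical stub body standing in for the real `py_ecc` call:

```python
# Standalone sketch of the only_with_bls pattern from bls.py above:
# a module-level flag short-circuits expensive crypto with a stub value.
bls_active = True
STUB_SIGNATURE = b'\x11' * 96


def only_with_bls(alt_return=None):
    def runner(fn):
        def entry(*args, **kw):
            # flag is read on every call, so it can be toggled after import
            if bls_active:
                return fn(*args, **kw)
            return alt_return
        return entry
    return runner


@only_with_bls(alt_return=STUB_SIGNATURE)
def bls_sign(message_hash, privkey, domain):
    # placeholder for the real py_ecc signing call
    raise RuntimeError("real signing not available in this sketch")


bls_active = False  # what a BLS-off test context would set
assert bls_sign(b'\x00' * 32, 42, 3) == STUB_SIGNATURE
```

The stub return values keep signatures and pubkeys the right length, so SSZ serialization of the resulting objects still works with BLS disabled.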
diff --git a/test_libs/pyspec/eth2spec/utils/bls_stub.py b/test_libs/pyspec/eth2spec/utils/bls_stub.py
deleted file mode 100644
index 108c4ef710..0000000000
--- a/test_libs/pyspec/eth2spec/utils/bls_stub.py
+++ /dev/null
@@ -1,12 +0,0 @@
-
-
-def bls_verify(pubkey, message_hash, signature, domain):
- return True
-
-
-def bls_verify_multiple(pubkeys, message_hashes, signature, domain):
- return True
-
-
-def bls_aggregate_pubkeys(pubkeys):
- return b'\x42' * 96
diff --git a/test_libs/pyspec/tests/block_processing/test_process_attestation.py b/test_libs/pyspec/tests/block_processing/test_process_attestation.py
deleted file mode 100644
index bcf71376ce..0000000000
--- a/test_libs/pyspec/tests/block_processing/test_process_attestation.py
+++ /dev/null
@@ -1,155 +0,0 @@
-from copy import deepcopy
-import pytest
-
-import eth2spec.phase0.spec as spec
-
-from eth2spec.phase0.state_transition import (
- state_transition,
-)
-from eth2spec.phase0.spec import (
- get_current_epoch,
- process_attestation,
- slot_to_epoch,
-)
-from tests.helpers import (
- build_empty_block_for_next_slot,
- get_valid_attestation,
- next_epoch,
- next_slot,
-)
-
-
-# mark entire file as 'attestations'
-pytestmark = pytest.mark.attestations
-
-
-def run_attestation_processing(state, attestation, valid=True):
- """
- Run ``process_attestation`` returning the pre and post state.
- If ``valid == False``, run expecting ``AssertionError``
- """
- post_state = deepcopy(state)
-
- if not valid:
- with pytest.raises(AssertionError):
- process_attestation(post_state, attestation)
- return state, None
-
- process_attestation(post_state, attestation)
-
- current_epoch = get_current_epoch(state)
- if attestation.data.target_epoch == current_epoch:
- assert len(post_state.current_epoch_attestations) == len(state.current_epoch_attestations) + 1
- else:
- assert len(post_state.previous_epoch_attestations) == len(state.previous_epoch_attestations) + 1
-
- return state, post_state
-
-
-def test_success(state):
- attestation = get_valid_attestation(state)
- state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY
-
- pre_state, post_state = run_attestation_processing(state, attestation)
-
- return pre_state, attestation, post_state
-
-
-def test_success_prevous_epoch(state):
- attestation = get_valid_attestation(state)
- block = build_empty_block_for_next_slot(state)
- block.slot = state.slot + spec.SLOTS_PER_EPOCH
- state_transition(state, block)
-
- pre_state, post_state = run_attestation_processing(state, attestation)
-
- return pre_state, attestation, post_state
-
-
-def test_before_inclusion_delay(state):
- attestation = get_valid_attestation(state)
- # do not increment slot to allow for inclusion delay
-
- pre_state, post_state = run_attestation_processing(state, attestation, False)
-
- return pre_state, attestation, post_state
-
-
-def test_after_epoch_slots(state):
- attestation = get_valid_attestation(state)
- block = build_empty_block_for_next_slot(state)
- # increment past latest inclusion slot
- block.slot = state.slot + spec.SLOTS_PER_EPOCH + 1
- state_transition(state, block)
-
- pre_state, post_state = run_attestation_processing(state, attestation, False)
-
- return pre_state, attestation, post_state
-
-
-def test_bad_source_epoch(state):
- attestation = get_valid_attestation(state)
- state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY
-
- attestation.data.source_epoch += 10
-
- pre_state, post_state = run_attestation_processing(state, attestation, False)
-
- return pre_state, attestation, post_state
-
-
-def test_bad_source_root(state):
- attestation = get_valid_attestation(state)
- state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY
-
- attestation.data.source_root = b'\x42' * 32
-
- pre_state, post_state = run_attestation_processing(state, attestation, False)
-
- return pre_state, attestation, post_state
-
-
-def test_non_zero_crosslink_data_root(state):
- attestation = get_valid_attestation(state)
- state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY
-
- attestation.data.crosslink_data_root = b'\x42' * 32
-
- pre_state, post_state = run_attestation_processing(state, attestation, False)
-
- return pre_state, attestation, post_state
-
-
-def test_bad_previous_crosslink(state):
- next_epoch(state)
- attestation = get_valid_attestation(state)
- for _ in range(spec.MIN_ATTESTATION_INCLUSION_DELAY):
- next_slot(state)
-
- state.current_crosslinks[attestation.data.shard].epoch += 10
-
- pre_state, post_state = run_attestation_processing(state, attestation, False)
-
- return pre_state, attestation, post_state
-
-
-def test_non_empty_custody_bitfield(state):
- attestation = get_valid_attestation(state)
- state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY
-
- attestation.custody_bitfield = deepcopy(attestation.aggregation_bitfield)
-
- pre_state, post_state = run_attestation_processing(state, attestation, False)
-
- return pre_state, attestation, post_state
-
-
-def test_empty_aggregation_bitfield(state):
- attestation = get_valid_attestation(state)
- state.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY
-
- attestation.aggregation_bitfield = b'\x00' * len(attestation.aggregation_bitfield)
-
- pre_state, post_state = run_attestation_processing(state, attestation, False)
-
- return pre_state, attestation, post_state
diff --git a/test_libs/pyspec/tests/block_processing/test_process_attester_slashing.py b/test_libs/pyspec/tests/block_processing/test_process_attester_slashing.py
deleted file mode 100644
index 2ea16f13d9..0000000000
--- a/test_libs/pyspec/tests/block_processing/test_process_attester_slashing.py
+++ /dev/null
@@ -1,117 +0,0 @@
-from copy import deepcopy
-import pytest
-
-import eth2spec.phase0.spec as spec
-from eth2spec.phase0.spec import (
- get_beacon_proposer_index,
- process_attester_slashing,
-)
-from tests.helpers import (
- get_balance,
- get_valid_attester_slashing,
- next_epoch,
-)
-
-# mark entire file as 'attester_slashing'
-pytestmark = pytest.mark.attester_slashings
-
-
-def run_attester_slashing_processing(state, attester_slashing, valid=True):
- """
- Run ``process_attester_slashing`` returning the pre and post state.
- If ``valid == False``, run expecting ``AssertionError``
- """
- post_state = deepcopy(state)
-
- if not valid:
- with pytest.raises(AssertionError):
- process_attester_slashing(post_state, attester_slashing)
- return state, None
-
- process_attester_slashing(post_state, attester_slashing)
-
- slashed_index = attester_slashing.attestation_1.custody_bit_0_indices[0]
- slashed_validator = post_state.validator_registry[slashed_index]
- assert slashed_validator.slashed
- assert slashed_validator.exit_epoch < spec.FAR_FUTURE_EPOCH
- assert slashed_validator.withdrawable_epoch < spec.FAR_FUTURE_EPOCH
- # lost whistleblower reward
- assert (
- get_balance(post_state, slashed_index) <
- get_balance(state, slashed_index)
- )
- proposer_index = get_beacon_proposer_index(state)
- # gained whistleblower reward
- assert (
- get_balance(post_state, proposer_index) >
- get_balance(state, proposer_index)
- )
-
- return state, post_state
-
-
-def test_success_double(state):
- attester_slashing = get_valid_attester_slashing(state)
-
- pre_state, post_state = run_attester_slashing_processing(state, attester_slashing)
-
- return pre_state, attester_slashing, post_state
-
-
-def test_success_surround(state):
- next_epoch(state)
- state.current_justified_epoch += 1
- attester_slashing = get_valid_attester_slashing(state)
-
- # set attestion1 to surround attestation 2
- attester_slashing.attestation_1.data.source_epoch = attester_slashing.attestation_2.data.source_epoch - 1
- attester_slashing.attestation_1.data.target_epoch = attester_slashing.attestation_2.data.target_epoch + 1
-
- pre_state, post_state = run_attester_slashing_processing(state, attester_slashing)
-
- return pre_state, attester_slashing, post_state
-
-
-def test_same_data(state):
- attester_slashing = get_valid_attester_slashing(state)
-
- attester_slashing.attestation_1.data = attester_slashing.attestation_2.data
-
- pre_state, post_state = run_attester_slashing_processing(state, attester_slashing, False)
-
- return pre_state, attester_slashing, post_state
-
-
-def test_no_double_or_surround(state):
- attester_slashing = get_valid_attester_slashing(state)
-
- attester_slashing.attestation_1.data.target_epoch += 1
-
- pre_state, post_state = run_attester_slashing_processing(state, attester_slashing, False)
-
- return pre_state, attester_slashing, post_state
-
-
-def test_participants_already_slashed(state):
- attester_slashing = get_valid_attester_slashing(state)
-
- # set all indices to slashed
- attestation_1 = attester_slashing.attestation_1
- validator_indices = attestation_1.custody_bit_0_indices + attestation_1.custody_bit_1_indices
- for index in validator_indices:
- state.validator_registry[index].slashed = True
-
- pre_state, post_state = run_attester_slashing_processing(state, attester_slashing, False)
-
- return pre_state, attester_slashing, post_state
-
-
-def test_custody_bit_0_and_1(state):
- attester_slashing = get_valid_attester_slashing(state)
-
- attester_slashing.attestation_1.custody_bit_1_indices = (
- attester_slashing.attestation_1.custody_bit_0_indices
- )
- pre_state, post_state = run_attester_slashing_processing(state, attester_slashing, False)
-
- return pre_state, attester_slashing, post_state
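
The attester-slashing tests above hinge on two slashable conditions: a double vote (two distinct attestations for the same target epoch) and a surround vote (`test_success_surround` widens attestation 1's span around attestation 2's). A minimal standalone sketch of that predicate, assuming attestation data is reduced to hypothetical `(source_epoch, target_epoch)` tuples rather than the spec's `AttestationData` container:

```python
def is_slashable_attestation_data(data_1, data_2):
    # data_1, data_2: (source_epoch, target_epoch) tuples -- a simplification
    # of the spec's AttestationData container for illustration only.
    source_1, target_1 = data_1
    source_2, target_2 = data_2
    # Double vote: distinct data, same target epoch.
    double_vote = data_1 != data_2 and target_1 == target_2
    # Surround vote: attestation 1 strictly surrounds attestation 2.
    surround_vote = source_1 < source_2 and target_2 < target_1
    return double_vote or surround_vote

# Mirrors test_success_surround: source - 1 and target + 1 around (1, 2).
assert is_slashable_attestation_data((0, 3), (1, 2))
# Mirrors test_same_data and test_no_double_or_surround.
assert not is_slashable_attestation_data((1, 2), (1, 2))
assert not is_slashable_attestation_data((1, 3), (1, 2))
```

The three invalid-case tests each falsify exactly one branch of this predicate.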
diff --git a/test_libs/pyspec/tests/block_processing/test_process_block_header.py b/test_libs/pyspec/tests/block_processing/test_process_block_header.py
deleted file mode 100644
index b35b0a9c11..0000000000
--- a/test_libs/pyspec/tests/block_processing/test_process_block_header.py
+++ /dev/null
@@ -1,76 +0,0 @@
-from copy import deepcopy
-import pytest
-
-
-from eth2spec.phase0.spec import (
- get_beacon_proposer_index,
- cache_state,
- advance_slot,
- process_block_header,
-)
-from tests.helpers import (
- build_empty_block_for_next_slot,
- next_slot,
-)
-
-# mark entire file as 'header'
-pytestmark = pytest.mark.header
-
-
-def prepare_state_for_header_processing(state):
- cache_state(state)
- advance_slot(state)
-
-
-def run_block_header_processing(state, block, valid=True):
- """
- Run ``process_block_header`` returning the pre and post state.
- If ``valid == False``, run expecting ``AssertionError``
- """
- prepare_state_for_header_processing(state)
- post_state = deepcopy(state)
-
- if not valid:
- with pytest.raises(AssertionError):
- process_block_header(post_state, block)
- return state, None
-
- process_block_header(post_state, block)
- return state, post_state
-
-
-def test_success(state):
- block = build_empty_block_for_next_slot(state)
- pre_state, post_state = run_block_header_processing(state, block)
- return state, block, post_state
-
-
-def test_invalid_slot(state):
- block = build_empty_block_for_next_slot(state)
- block.slot = state.slot + 2 # invalid slot
-
- pre_state, post_state = run_block_header_processing(state, block, valid=False)
- return pre_state, block, None
-
-
-def test_invalid_previous_block_root(state):
- block = build_empty_block_for_next_slot(state)
- block.previous_block_root = b'\12' * 32 # invalid prev root
-
- pre_state, post_state = run_block_header_processing(state, block, valid=False)
- return pre_state, block, None
-
-
-def test_proposer_slashed(state):
- # use stub state to get proposer index of next slot
- stub_state = deepcopy(state)
- next_slot(stub_state)
- proposer_index = get_beacon_proposer_index(stub_state)
-
- # set proposer to slashed
- state.validator_registry[proposer_index].slashed = True
-
- block = build_empty_block_for_next_slot(state)
-
- pre_state, post_state = run_block_header_processing(state, block, valid=False)
- return pre_state, block, None
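
The three failure cases above each violate one precondition enforced by `process_block_header`. As a compact sketch of those checks, with hypothetical scalar arguments in place of the full state and block containers:

```python
def check_block_header(state_slot, expected_prev_root, proposer_slashed,
                       block_slot, block_prev_root):
    # cf. test_invalid_slot
    assert block_slot == state_slot
    # cf. test_invalid_previous_block_root
    assert block_prev_root == expected_prev_root
    # cf. test_proposer_slashed
    assert not proposer_slashed

root = b'\xaa' * 32
check_block_header(5, root, False, 5, root)  # a consistent header passes
```

In the deleted tests these failures surface as the `AssertionError` that `run_block_header_processing` wraps in `pytest.raises`.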
diff --git a/test_libs/pyspec/tests/block_processing/test_process_deposit.py b/test_libs/pyspec/tests/block_processing/test_process_deposit.py
deleted file mode 100644
index bbfb390efb..0000000000
--- a/test_libs/pyspec/tests/block_processing/test_process_deposit.py
+++ /dev/null
@@ -1,141 +0,0 @@
-from copy import deepcopy
-import pytest
-
-import eth2spec.phase0.spec as spec
-
-from eth2spec.phase0.spec import (
- ZERO_HASH,
- process_deposit,
-)
-from tests.helpers import (
- get_balance,
- build_deposit,
- privkeys,
- pubkeys,
-)
-
-
-# mark entire file as 'deposits'
-pytestmark = pytest.mark.deposits
-
-
-def test_success(state):
- pre_state = deepcopy(state)
- # fill previous deposits with zero-hash
- deposit_data_leaves = [ZERO_HASH] * len(pre_state.validator_registry)
-
- index = len(deposit_data_leaves)
- pubkey = pubkeys[index]
- privkey = privkeys[index]
- deposit, root, deposit_data_leaves = build_deposit(
- pre_state,
- deposit_data_leaves,
- pubkey,
- privkey,
- spec.MAX_EFFECTIVE_BALANCE,
- )
-
- pre_state.latest_eth1_data.deposit_root = root
- pre_state.latest_eth1_data.deposit_count = len(deposit_data_leaves)
-
- post_state = deepcopy(pre_state)
-
- process_deposit(post_state, deposit)
-
- assert len(post_state.validator_registry) == len(state.validator_registry) + 1
- assert len(post_state.balances) == len(state.balances) + 1
- assert post_state.validator_registry[index].pubkey == pubkeys[index]
- assert get_balance(post_state, index) == spec.MAX_EFFECTIVE_BALANCE
- assert post_state.deposit_index == post_state.latest_eth1_data.deposit_count
-
- return pre_state, deposit, post_state
-
-
-def test_success_top_up(state):
- pre_state = deepcopy(state)
- deposit_data_leaves = [ZERO_HASH] * len(pre_state.validator_registry)
-
- validator_index = 0
- amount = spec.MAX_EFFECTIVE_BALANCE // 4
- pubkey = pubkeys[validator_index]
- privkey = privkeys[validator_index]
- deposit, root, deposit_data_leaves = build_deposit(
- pre_state,
- deposit_data_leaves,
- pubkey,
- privkey,
- amount,
- )
-
- pre_state.latest_eth1_data.deposit_root = root
- pre_state.latest_eth1_data.deposit_count = len(deposit_data_leaves)
- pre_balance = get_balance(pre_state, validator_index)
-
- post_state = deepcopy(pre_state)
-
- process_deposit(post_state, deposit)
-
- assert len(post_state.validator_registry) == len(state.validator_registry)
- assert len(post_state.balances) == len(state.balances)
- assert post_state.deposit_index == post_state.latest_eth1_data.deposit_count
- assert get_balance(post_state, validator_index) == pre_balance + amount
-
- return pre_state, deposit, post_state
-
-
-def test_wrong_index(state):
- pre_state = deepcopy(state)
- deposit_data_leaves = [ZERO_HASH] * len(pre_state.validator_registry)
-
- index = len(deposit_data_leaves)
- pubkey = pubkeys[index]
- privkey = privkeys[index]
- deposit, root, deposit_data_leaves = build_deposit(
- pre_state,
- deposit_data_leaves,
- pubkey,
- privkey,
- spec.MAX_EFFECTIVE_BALANCE,
- )
-
- # mess up deposit_index
- deposit.index = pre_state.deposit_index + 1
-
- pre_state.latest_eth1_data.deposit_root = root
- pre_state.latest_eth1_data.deposit_count = len(deposit_data_leaves)
-
- post_state = deepcopy(pre_state)
-
- with pytest.raises(AssertionError):
- process_deposit(post_state, deposit)
-
- return pre_state, deposit, None
-
-
-def test_bad_merkle_proof(state):
- pre_state = deepcopy(state)
- deposit_data_leaves = [ZERO_HASH] * len(pre_state.validator_registry)
-
- index = len(deposit_data_leaves)
- pubkey = pubkeys[index]
- privkey = privkeys[index]
- deposit, root, deposit_data_leaves = build_deposit(
- pre_state,
- deposit_data_leaves,
- pubkey,
- privkey,
- spec.MAX_EFFECTIVE_BALANCE,
- )
-
- # mess up merkle branch
- deposit.proof[-1] = spec.ZERO_HASH
-
- pre_state.latest_eth1_data.deposit_root = root
- pre_state.latest_eth1_data.deposit_count = len(deposit_data_leaves)
-
- post_state = deepcopy(pre_state)
-
- with pytest.raises(AssertionError):
- process_deposit(post_state, deposit)
-
- return pre_state, deposit, None
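
The deposit tests rely on Merkle proofs over the deposit tree; `test_bad_merkle_proof` corrupts one branch element and expects rejection. A self-contained sketch of branch verification over sha256, shaped like the spec's `verify_merkle_branch` and exercised on a toy depth-2 tree:

```python
from hashlib import sha256

def h(data: bytes) -> bytes:
    return sha256(data).digest()

def verify_merkle_branch(leaf: bytes, branch: list, depth: int,
                         index: int, root: bytes) -> bool:
    # Fold the leaf up the tree, hashing with each sibling in the branch;
    # bit i of `index` says whether we are the right child at level i.
    value = leaf
    for i in range(depth):
        if (index >> i) & 1:
            value = h(branch[i] + value)
        else:
            value = h(value + branch[i])
    return value == root

# Toy depth-2 tree over four 32-byte leaves.
leaves = [bytes([i]) * 32 for i in range(4)]
nodes = [h(leaves[0] + leaves[1]), h(leaves[2] + leaves[3])]
root = h(nodes[0] + nodes[1])
# Branch for leaf index 2: its sibling leaf, then the left subtree node.
branch = [leaves[3], nodes[0]]
assert verify_merkle_branch(leaves[2], branch, 2, 2, root)
# A corrupted branch element fails, as in test_bad_merkle_proof.
assert not verify_merkle_branch(leaves[2], [b'\x00' * 32, nodes[0]], 2, 2, root)
```

The real deposit tree is zero-padded to `DEPOSIT_CONTRACT_TREE_DEPTH`; the folding logic is the same.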
diff --git a/test_libs/pyspec/tests/block_processing/test_process_proposer_slashing.py b/test_libs/pyspec/tests/block_processing/test_process_proposer_slashing.py
deleted file mode 100644
index 4752210366..0000000000
--- a/test_libs/pyspec/tests/block_processing/test_process_proposer_slashing.py
+++ /dev/null
@@ -1,96 +0,0 @@
-from copy import deepcopy
-import pytest
-
-import eth2spec.phase0.spec as spec
-from eth2spec.phase0.spec import (
- get_current_epoch,
- process_proposer_slashing,
-)
-from tests.helpers import (
- get_balance,
- get_valid_proposer_slashing,
-)
-
-# mark entire file as 'proposer_slashings'
-pytestmark = pytest.mark.proposer_slashings
-
-
-def run_proposer_slashing_processing(state, proposer_slashing, valid=True):
- """
- Run ``process_proposer_slashing`` returning the pre and post state.
- If ``valid == False``, run expecting ``AssertionError``
- """
- post_state = deepcopy(state)
-
- if not valid:
- with pytest.raises(AssertionError):
- process_proposer_slashing(post_state, proposer_slashing)
- return state, None
-
- process_proposer_slashing(post_state, proposer_slashing)
-
- slashed_validator = post_state.validator_registry[proposer_slashing.proposer_index]
- assert slashed_validator.slashed
- assert slashed_validator.exit_epoch < spec.FAR_FUTURE_EPOCH
- assert slashed_validator.withdrawable_epoch < spec.FAR_FUTURE_EPOCH
- # lost whistleblower reward
- assert (
- get_balance(post_state, proposer_slashing.proposer_index) <
- get_balance(state, proposer_slashing.proposer_index)
- )
-
- return state, post_state
-
-
-def test_success(state):
- proposer_slashing = get_valid_proposer_slashing(state)
-
- pre_state, post_state = run_proposer_slashing_processing(state, proposer_slashing)
-
- return pre_state, proposer_slashing, post_state
-
-
-def test_epochs_are_different(state):
- proposer_slashing = get_valid_proposer_slashing(state)
-
- # set slots to be in different epochs
- proposer_slashing.header_2.slot += spec.SLOTS_PER_EPOCH
-
- pre_state, post_state = run_proposer_slashing_processing(state, proposer_slashing, False)
-
- return pre_state, proposer_slashing, post_state
-
-
-def test_headers_are_same(state):
- proposer_slashing = get_valid_proposer_slashing(state)
-
- # set headers to be the same
- proposer_slashing.header_2 = proposer_slashing.header_1
-
- pre_state, post_state = run_proposer_slashing_processing(state, proposer_slashing, False)
-
- return pre_state, proposer_slashing, post_state
-
-
-def test_proposer_is_slashed(state):
- proposer_slashing = get_valid_proposer_slashing(state)
-
- # set proposer to slashed
- state.validator_registry[proposer_slashing.proposer_index].slashed = True
-
- pre_state, post_state = run_proposer_slashing_processing(state, proposer_slashing, False)
-
- return pre_state, proposer_slashing, post_state
-
-
-def test_proposer_is_withdrawn(state):
- proposer_slashing = get_valid_proposer_slashing(state)
-
- # set proposer withdrawable_epoch in past
- current_epoch = get_current_epoch(state)
- proposer_index = proposer_slashing.proposer_index
- state.validator_registry[proposer_index].withdrawable_epoch = current_epoch - 1
-
- pre_state, post_state = run_proposer_slashing_processing(state, proposer_slashing, False)
-
- return pre_state, proposer_slashing, post_state
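
The proposer-slashing failure cases reduce to three predicates: both headers fall in the same epoch, the headers differ, and the proposer is still slashable (neither slashed already nor past its withdrawable epoch). A standalone sketch, assuming an illustrative `SLOTS_PER_EPOCH` rather than the config value:

```python
SLOTS_PER_EPOCH = 8  # illustrative; the real value comes from the config

def is_valid_proposer_slashing(slot_1, slot_2, header_1, header_2,
                               slashed, withdrawable_epoch, current_epoch):
    # cf. test_epochs_are_different
    same_epoch = slot_1 // SLOTS_PER_EPOCH == slot_2 // SLOTS_PER_EPOCH
    # cf. test_headers_are_same
    distinct_headers = header_1 != header_2
    # cf. test_proposer_is_slashed / test_proposer_is_withdrawn
    slashable = (not slashed) and current_epoch < withdrawable_epoch
    return same_epoch and distinct_headers and slashable

assert is_valid_proposer_slashing(3, 4, b'h1', b'h2', False, 100, 0)
assert not is_valid_proposer_slashing(3, 3 + SLOTS_PER_EPOCH, b'h1', b'h2', False, 100, 0)
assert not is_valid_proposer_slashing(3, 4, b'h1', b'h1', False, 100, 0)
assert not is_valid_proposer_slashing(3, 4, b'h1', b'h2', True, 100, 0)
```

Each invalid test above flips exactly one of these three conditions while leaving the others satisfied.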
diff --git a/test_libs/pyspec/tests/block_processing/test_process_transfer.py b/test_libs/pyspec/tests/block_processing/test_process_transfer.py
deleted file mode 100644
index 0eeaa77929..0000000000
--- a/test_libs/pyspec/tests/block_processing/test_process_transfer.py
+++ /dev/null
@@ -1,141 +0,0 @@
-from copy import deepcopy
-import pytest
-
-import eth2spec.phase0.spec as spec
-
-from eth2spec.phase0.spec import (
- get_active_validator_indices,
- get_beacon_proposer_index,
- get_current_epoch,
- process_transfer,
-)
-from tests.helpers import (
- get_valid_transfer,
- next_epoch,
-)
-
-
-# mark entire file as 'transfers'
-pytestmark = pytest.mark.transfers
-
-
-def run_transfer_processing(state, transfer, valid=True):
- """
- Run ``process_transfer`` returning the pre and post state.
- If ``valid == False``, run expecting ``AssertionError``
- """
- post_state = deepcopy(state)
-
- if not valid:
- with pytest.raises(AssertionError):
- process_transfer(post_state, transfer)
- return state, None
-
- process_transfer(post_state, transfer)
-
- proposer_index = get_beacon_proposer_index(state)
- pre_transfer_sender_balance = state.balances[transfer.sender]
- pre_transfer_recipient_balance = state.balances[transfer.recipient]
- pre_transfer_proposer_balance = state.balances[proposer_index]
- sender_balance = post_state.balances[transfer.sender]
- recipient_balance = post_state.balances[transfer.recipient]
- assert sender_balance == pre_transfer_sender_balance - transfer.amount - transfer.fee
- assert recipient_balance == pre_transfer_recipient_balance + transfer.amount
- assert post_state.balances[proposer_index] == pre_transfer_proposer_balance + transfer.fee
-
- return state, post_state
-
-
-def test_success_non_activated(state):
- transfer = get_valid_transfer(state)
- # un-activate so validator can transfer
- state.validator_registry[transfer.sender].activation_eligibility_epoch = spec.FAR_FUTURE_EPOCH
-
- pre_state, post_state = run_transfer_processing(state, transfer)
-
- return pre_state, transfer, post_state
-
-
-def test_success_withdrawable(state):
- next_epoch(state)
-
- transfer = get_valid_transfer(state)
-
- # withdrawable_epoch in past so can transfer
- state.validator_registry[transfer.sender].withdrawable_epoch = get_current_epoch(state) - 1
-
- pre_state, post_state = run_transfer_processing(state, transfer)
-
- return pre_state, transfer, post_state
-
-
-def test_success_active_above_max_effective(state):
- sender_index = get_active_validator_indices(state, get_current_epoch(state))[-1]
- amount = spec.MAX_EFFECTIVE_BALANCE // 32
- state.balances[sender_index] = spec.MAX_EFFECTIVE_BALANCE + amount
- transfer = get_valid_transfer(state, sender_index=sender_index, amount=amount, fee=0)
-
- pre_state, post_state = run_transfer_processing(state, transfer)
-
- return pre_state, transfer, post_state
-
-
-def test_active_but_transfer_past_effective_balance(state):
- sender_index = get_active_validator_indices(state, get_current_epoch(state))[-1]
- amount = spec.MAX_EFFECTIVE_BALANCE // 32
- state.balances[sender_index] = spec.MAX_EFFECTIVE_BALANCE
- transfer = get_valid_transfer(state, sender_index=sender_index, amount=amount, fee=0)
-
- pre_state, post_state = run_transfer_processing(state, transfer, False)
-
- return pre_state, transfer, post_state
-
-
-def test_incorrect_slot(state):
- transfer = get_valid_transfer(state, slot=state.slot+1)
- # un-activate so validator can transfer
- state.validator_registry[transfer.sender].activation_epoch = spec.FAR_FUTURE_EPOCH
-
- pre_state, post_state = run_transfer_processing(state, transfer, False)
-
- return pre_state, transfer, post_state
-
-
-def test_insufficient_balance(state):
- sender_index = get_active_validator_indices(state, get_current_epoch(state))[-1]
- amount = spec.MAX_EFFECTIVE_BALANCE
- state.balances[sender_index] = spec.MAX_EFFECTIVE_BALANCE
- transfer = get_valid_transfer(state, sender_index=sender_index, amount=amount + 1, fee=0)
-
- # un-activate so validator can transfer
- state.validator_registry[transfer.sender].activation_epoch = spec.FAR_FUTURE_EPOCH
-
- pre_state, post_state = run_transfer_processing(state, transfer, False)
-
- return pre_state, transfer, post_state
-
-
-def test_no_dust(state):
- sender_index = get_active_validator_indices(state, get_current_epoch(state))[-1]
- balance = state.balances[sender_index]
- transfer = get_valid_transfer(state, sender_index=sender_index, amount=balance - spec.MIN_DEPOSIT_AMOUNT + 1, fee=0)
-
- # un-activate so validator can transfer
- state.validator_registry[transfer.sender].activation_epoch = spec.FAR_FUTURE_EPOCH
-
- pre_state, post_state = run_transfer_processing(state, transfer, False)
-
- return pre_state, transfer, post_state
-
-
-def test_invalid_pubkey(state):
- transfer = get_valid_transfer(state)
- state.validator_registry[transfer.sender].withdrawal_credentials = spec.ZERO_HASH
-
- # un-activate so validator can transfer
- state.validator_registry[transfer.sender].activation_epoch = spec.FAR_FUTURE_EPOCH
-
- pre_state, post_state = run_transfer_processing(state, transfer, False)
-
- return pre_state, transfer, post_state
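
The balance assertions in `run_transfer_processing` above reduce to a conservation rule: the sender pays `amount + fee`, the recipient receives `amount`, and the block proposer collects `fee`. A minimal sketch over hypothetical integer balances keyed by validator index:

```python
def apply_transfer(balances, sender, recipient, proposer, amount, fee):
    # Reject transfers the sender cannot cover (cf. test_insufficient_balance).
    assert balances[sender] >= amount + fee
    new = dict(balances)
    new[sender] -= amount + fee
    new[recipient] += amount
    new[proposer] += fee
    return new

balances = {0: 100, 1: 50, 2: 10}
after = apply_transfer(balances, sender=0, recipient=1, proposer=2, amount=30, fee=5)
assert after == {0: 65, 1: 80, 2: 15}
# Total balance is conserved across the transfer.
assert sum(after.values()) == sum(balances.values())
```

The deleted tests layer eligibility checks (activation, withdrawability, effective-balance and dust limits) on top of this accounting; this sketch covers only the bookkeeping itself.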
diff --git a/test_libs/pyspec/tests/block_processing/test_voluntary_exit.py b/test_libs/pyspec/tests/block_processing/test_voluntary_exit.py
deleted file mode 100644
index c58c5238a9..0000000000
--- a/test_libs/pyspec/tests/block_processing/test_voluntary_exit.py
+++ /dev/null
@@ -1,163 +0,0 @@
-from copy import deepcopy
-import pytest
-
-import eth2spec.phase0.spec as spec
-
-from eth2spec.phase0.spec import (
- get_active_validator_indices,
- get_churn_limit,
- get_current_epoch,
- process_voluntary_exit,
-)
-from tests.helpers import (
- build_voluntary_exit,
- pubkey_to_privkey,
-)
-
-
-# mark entire file as 'voluntary_exits'
-pytestmark = pytest.mark.voluntary_exits
-
-
-def run_voluntary_exit_processing(state, voluntary_exit, valid=True):
- """
- Run ``process_voluntary_exit`` returning the pre and post state.
- If ``valid == False``, run expecting ``AssertionError``
- """
- post_state = deepcopy(state)
-
- if not valid:
- with pytest.raises(AssertionError):
- process_voluntary_exit(post_state, voluntary_exit)
- return state, None
-
- process_voluntary_exit(post_state, voluntary_exit)
-
- validator_index = voluntary_exit.validator_index
- assert state.validator_registry[validator_index].exit_epoch == spec.FAR_FUTURE_EPOCH
- assert post_state.validator_registry[validator_index].exit_epoch < spec.FAR_FUTURE_EPOCH
-
- return state, post_state
-
-
-def test_success(state):
- # move state forward PERSISTENT_COMMITTEE_PERIOD epochs to allow for exit
- state.slot += spec.PERSISTENT_COMMITTEE_PERIOD * spec.SLOTS_PER_EPOCH
-
- current_epoch = get_current_epoch(state)
- validator_index = get_active_validator_indices(state, current_epoch)[0]
- privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey]
-
- voluntary_exit = build_voluntary_exit(
- state,
- current_epoch,
- validator_index,
- privkey,
- )
-
- pre_state, post_state = run_voluntary_exit_processing(state, voluntary_exit)
- return pre_state, voluntary_exit, post_state
-
-
-def test_success_exit_queue(state):
- # move state forward PERSISTENT_COMMITTEE_PERIOD epochs to allow for exit
- state.slot += spec.PERSISTENT_COMMITTEE_PERIOD * spec.SLOTS_PER_EPOCH
-
- current_epoch = get_current_epoch(state)
-
- # exit the full churn limit of validators
- initial_indices = get_active_validator_indices(state, current_epoch)[:get_churn_limit(state)]
- post_state = state
- for index in initial_indices:
- privkey = pubkey_to_privkey[state.validator_registry[index].pubkey]
- voluntary_exit = build_voluntary_exit(
- state,
- current_epoch,
- index,
- privkey,
- )
-
- pre_state, post_state = run_voluntary_exit_processing(post_state, voluntary_exit)
-
- # exit an additional validator
- validator_index = get_active_validator_indices(state, current_epoch)[-1]
- privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey]
- voluntary_exit = build_voluntary_exit(
- state,
- current_epoch,
- validator_index,
- privkey,
- )
-
- pre_state, post_state = run_voluntary_exit_processing(post_state, voluntary_exit)
-
- assert (
- post_state.validator_registry[validator_index].exit_epoch ==
- post_state.validator_registry[initial_indices[0]].exit_epoch + 1
- )
-
- return pre_state, voluntary_exit, post_state
-
-
-def test_validator_not_active(state):
- current_epoch = get_current_epoch(state)
- validator_index = get_active_validator_indices(state, current_epoch)[0]
- privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey]
-
- state.validator_registry[validator_index].activation_epoch = spec.FAR_FUTURE_EPOCH
-
- #
- # build and test voluntary exit
- #
- voluntary_exit = build_voluntary_exit(
- state,
- current_epoch,
- validator_index,
- privkey,
- )
-
- pre_state, post_state = run_voluntary_exit_processing(state, voluntary_exit, False)
- return pre_state, voluntary_exit, post_state
-
-
-def test_validator_already_exited(state):
- # move state forward PERSISTENT_COMMITTEE_PERIOD epochs to allow the validator to exit
- state.slot += spec.PERSISTENT_COMMITTEE_PERIOD * spec.SLOTS_PER_EPOCH
-
- current_epoch = get_current_epoch(state)
- validator_index = get_active_validator_indices(state, current_epoch)[0]
- privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey]
-
- # but validator already has exited
- state.validator_registry[validator_index].exit_epoch = current_epoch + 2
-
- voluntary_exit = build_voluntary_exit(
- state,
- current_epoch,
- validator_index,
- privkey,
- )
-
- pre_state, post_state = run_voluntary_exit_processing(state, voluntary_exit, False)
- return pre_state, voluntary_exit, post_state
-
-
-def test_validator_not_active_long_enough(state):
- current_epoch = get_current_epoch(state)
- validator_index = get_active_validator_indices(state, current_epoch)[0]
- privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey]
-
- voluntary_exit = build_voluntary_exit(
- state,
- current_epoch,
- validator_index,
- privkey,
- )
-
- assert (
- current_epoch - state.validator_registry[validator_index].activation_epoch <
- spec.PERSISTENT_COMMITTEE_PERIOD
- )
-
- pre_state, post_state = run_voluntary_exit_processing(state, voluntary_exit, False)
- return pre_state, voluntary_exit, post_state
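
The failure cases above (`test_validator_not_active`, `test_validator_already_exited`, `test_validator_not_active_long_enough`) each break one precondition of a voluntary exit. A standalone sketch of those checks, with `FAR_FUTURE_EPOCH` as in the spec and `PERSISTENT_COMMITTEE_PERIOD` stubbed to an illustrative value:

```python
FAR_FUTURE_EPOCH = 2**64 - 1
PERSISTENT_COMMITTEE_PERIOD = 2048  # illustrative; config-dependent in the spec

def can_exit(activation_epoch, exit_epoch, current_epoch):
    # cf. test_validator_not_active
    is_active = activation_epoch <= current_epoch < exit_epoch
    # cf. test_validator_already_exited
    not_exited = exit_epoch == FAR_FUTURE_EPOCH
    # cf. test_validator_not_active_long_enough
    active_long_enough = current_epoch >= activation_epoch + PERSISTENT_COMMITTEE_PERIOD
    return is_active and not_exited and active_long_enough

# Mirrors test_success: activated at 0, never exited, past the waiting period.
assert can_exit(0, FAR_FUTURE_EPOCH, 3000)
# Mirrors test_validator_not_active_long_enough.
assert not can_exit(0, FAR_FUTURE_EPOCH, 10)
# Mirrors test_validator_already_exited (an exit epoch is already scheduled).
assert not can_exit(0, 3002, 3000)
```

This is why the successful tests first advance the state by `PERSISTENT_COMMITTEE_PERIOD * SLOTS_PER_EPOCH` slots before building the exit.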
diff --git a/test_libs/pyspec/tests/conftest.py b/test_libs/pyspec/tests/conftest.py
deleted file mode 100644
index 9840dc7b20..0000000000
--- a/test_libs/pyspec/tests/conftest.py
+++ /dev/null
@@ -1,36 +0,0 @@
-import pytest
-
-from eth2spec.phase0 import spec
-from preset_loader import loader
-
-from .helpers import (
- create_genesis_state,
-)
-
-
-def pytest_addoption(parser):
- parser.addoption(
- "--config", action="store", default="minimal", help="config: make the pyspec use the specified configuration"
- )
-
-
-@pytest.fixture(autouse=True)
-def config(request):
- config_name = request.config.getoption("--config")
- presets = loader.load_presets('../../configs/', config_name)
- spec.apply_constants_preset(presets)
-
-
-@pytest.fixture
-def num_validators(config):
- return spec.SLOTS_PER_EPOCH * 8
-
-
-@pytest.fixture
-def deposit_data_leaves():
- return list()
-
-
-@pytest.fixture
-def state(num_validators, deposit_data_leaves):
- return create_genesis_state(num_validators, deposit_data_leaves)
diff --git a/test_libs/pyspec/tests/helpers.py b/test_libs/pyspec/tests/helpers.py
deleted file mode 100644
index 3b9b6904d5..0000000000
--- a/test_libs/pyspec/tests/helpers.py
+++ /dev/null
@@ -1,422 +0,0 @@
-from copy import deepcopy
-
-from py_ecc import bls
-
-from eth2spec.phase0.state_transition import (
- state_transition,
-)
-import eth2spec.phase0.spec as spec
-from eth2spec.utils.minimal_ssz import signing_root
-from eth2spec.phase0.spec import (
- # constants
- ZERO_HASH,
- # SSZ
- Attestation,
- AttestationData,
- AttestationDataAndCustodyBit,
- AttesterSlashing,
- BeaconBlock,
- BeaconBlockHeader,
- Deposit,
- DepositData,
- Eth1Data,
- ProposerSlashing,
- Transfer,
- VoluntaryExit,
- # functions
- convert_to_indexed,
- get_active_validator_indices,
- get_attesting_indices,
- get_block_root,
- get_block_root_at_slot,
- get_crosslink_committee,
- get_current_epoch,
- get_domain,
- get_epoch_start_slot,
- get_genesis_beacon_state,
- get_previous_epoch,
- get_shard_delta,
- hash_tree_root,
- slot_to_epoch,
- verify_merkle_branch,
- hash,
-)
-from eth2spec.utils.merkle_minimal import (
- calc_merkle_tree_from_leaves,
- get_merkle_proof,
- get_merkle_root,
-)
-
-
-privkeys = [i + 1 for i in range(1024)]
-pubkeys = [bls.privtopub(privkey) for privkey in privkeys]
-pubkey_to_privkey = {pubkey: privkey for privkey, pubkey in zip(privkeys, pubkeys)}
-
-
-def get_balance(state, index):
- return state.balances[index]
-
-
-def set_bitfield_bit(bitfield, i):
- """
- Set the bit in ``bitfield`` at position ``i`` to ``1``.
- """
- byte_index = i // 8
- bit_index = i % 8
- return (
- bitfield[:byte_index] +
- bytes([bitfield[byte_index] | (1 << bit_index)]) +
- bitfield[byte_index+1:]
- )
-
-
-def create_mock_genesis_validator_deposits(num_validators, deposit_data_leaves=None):
- if not deposit_data_leaves:
- deposit_data_leaves = []
- signature = b'\x33' * 96
-
- deposit_data_list = []
- for i in range(num_validators):
- pubkey = pubkeys[i]
- deposit_data = DepositData(
- pubkey=pubkey,
- # insecurely use pubkey as withdrawal key as well
- withdrawal_credentials=spec.BLS_WITHDRAWAL_PREFIX_BYTE + hash(pubkey)[1:],
- amount=spec.MAX_EFFECTIVE_BALANCE,
- signature=signature,
- )
- item = deposit_data.hash_tree_root()
- deposit_data_leaves.append(item)
- tree = calc_merkle_tree_from_leaves(tuple(deposit_data_leaves))
- root = get_merkle_root((tuple(deposit_data_leaves)))
- proof = list(get_merkle_proof(tree, item_index=i))
- assert verify_merkle_branch(item, proof, spec.DEPOSIT_CONTRACT_TREE_DEPTH, i, root)
- deposit_data_list.append(deposit_data)
-
- genesis_validator_deposits = []
- for i in range(num_validators):
- genesis_validator_deposits.append(Deposit(
- proof=list(get_merkle_proof(tree, item_index=i)),
- index=i,
- data=deposit_data_list[i]
- ))
- return genesis_validator_deposits, root
-
-
-def create_genesis_state(num_validators, deposit_data_leaves=None):
- initial_deposits, deposit_root = create_mock_genesis_validator_deposits(
- num_validators,
- deposit_data_leaves,
- )
- return get_genesis_beacon_state(
- initial_deposits,
- genesis_time=0,
- genesis_eth1_data=Eth1Data(
- deposit_root=deposit_root,
- deposit_count=len(initial_deposits),
- block_hash=spec.ZERO_HASH,
- ),
- )
-
-
-def build_empty_block_for_next_slot(state):
- empty_block = BeaconBlock()
- empty_block.slot = state.slot + 1
- empty_block.body.eth1_data.deposit_count = state.deposit_index
- previous_block_header = deepcopy(state.latest_block_header)
- if previous_block_header.state_root == spec.ZERO_HASH:
- previous_block_header.state_root = state.hash_tree_root()
- empty_block.previous_block_root = signing_root(previous_block_header)
- return empty_block
-
-
-def build_deposit_data(state, pubkey, privkey, amount):
- deposit_data = DepositData(
- pubkey=pubkey,
- # insecurely use pubkey as withdrawal key as well
- withdrawal_credentials=spec.BLS_WITHDRAWAL_PREFIX_BYTE + hash(pubkey)[1:],
- amount=amount,
- )
- signature = bls.sign(
- message_hash=signing_root(deposit_data),
- privkey=privkey,
- domain=get_domain(
- state,
- spec.DOMAIN_DEPOSIT,
- )
- )
- deposit_data.signature = signature
- return deposit_data
-
-
-def build_attestation_data(state, slot, shard):
- assert state.slot >= slot
-
- if slot == state.slot:
- block_root = build_empty_block_for_next_slot(state).previous_block_root
- else:
- block_root = get_block_root_at_slot(state, slot)
-
- current_epoch_start_slot = get_epoch_start_slot(get_current_epoch(state))
- if slot < current_epoch_start_slot:
- epoch_boundary_root = get_block_root(state, get_previous_epoch(state))
- elif slot == current_epoch_start_slot:
- epoch_boundary_root = block_root
- else:
- epoch_boundary_root = get_block_root(state, get_current_epoch(state))
-
- if slot < current_epoch_start_slot:
- justified_epoch = state.previous_justified_epoch
- justified_block_root = state.previous_justified_root
- else:
- justified_epoch = state.current_justified_epoch
- justified_block_root = state.current_justified_root
-
- crosslinks = state.current_crosslinks if slot_to_epoch(slot) == get_current_epoch(state) else state.previous_crosslinks
- return AttestationData(
- shard=shard,
- beacon_block_root=block_root,
- source_epoch=justified_epoch,
- source_root=justified_block_root,
- target_epoch=slot_to_epoch(slot),
- target_root=epoch_boundary_root,
- crosslink_data_root=spec.ZERO_HASH,
- previous_crosslink_root=hash_tree_root(crosslinks[shard]),
- )
-
-
-def build_voluntary_exit(state, epoch, validator_index, privkey):
- voluntary_exit = VoluntaryExit(
- epoch=epoch,
- validator_index=validator_index,
- )
- voluntary_exit.signature = bls.sign(
- message_hash=signing_root(voluntary_exit),
- privkey=privkey,
- domain=get_domain(
- state=state,
- domain_type=spec.DOMAIN_VOLUNTARY_EXIT,
- message_epoch=epoch,
- )
- )
-
- return voluntary_exit
-
-
-def build_deposit(state,
- deposit_data_leaves,
- pubkey,
- privkey,
- amount):
- deposit_data = build_deposit_data(state, pubkey, privkey, amount)
-
- item = deposit_data.hash_tree_root()
- index = len(deposit_data_leaves)
- deposit_data_leaves.append(item)
- tree = calc_merkle_tree_from_leaves(tuple(deposit_data_leaves))
- root = get_merkle_root((tuple(deposit_data_leaves)))
- proof = list(get_merkle_proof(tree, item_index=index))
- assert verify_merkle_branch(item, proof, spec.DEPOSIT_CONTRACT_TREE_DEPTH, index, root)
-
- deposit = Deposit(
- proof=list(proof),
- index=index,
- data=deposit_data,
- )
-
- return deposit, root, deposit_data_leaves
-
-
-def get_valid_proposer_slashing(state):
- current_epoch = get_current_epoch(state)
- validator_index = get_active_validator_indices(state, current_epoch)[-1]
- privkey = pubkey_to_privkey[state.validator_registry[validator_index].pubkey]
- slot = state.slot
-
- header_1 = BeaconBlockHeader(
- slot=slot,
- previous_block_root=ZERO_HASH,
- state_root=ZERO_HASH,
- block_body_root=ZERO_HASH,
- )
- header_2 = deepcopy(header_1)
- header_2.previous_block_root = b'\x02' * 32
- header_2.slot = slot + 1
-
- domain = get_domain(
- state=state,
- domain_type=spec.DOMAIN_BEACON_PROPOSER,
- )
- header_1.signature = bls.sign(
- message_hash=signing_root(header_1),
- privkey=privkey,
- domain=domain,
- )
- header_2.signature = bls.sign(
- message_hash=signing_root(header_2),
- privkey=privkey,
- domain=domain,
- )
-
- return ProposerSlashing(
- proposer_index=validator_index,
- header_1=header_1,
- header_2=header_2,
- )
-
-
-def get_valid_attester_slashing(state):
- attestation_1 = get_valid_attestation(state)
- attestation_2 = deepcopy(attestation_1)
- attestation_2.data.target_root = b'\x01' * 32
-
- return AttesterSlashing(
- attestation_1=convert_to_indexed(state, attestation_1),
- attestation_2=convert_to_indexed(state, attestation_2),
- )
-
-
-def get_valid_attestation(state, slot=None):
- if slot is None:
- slot = state.slot
-
- if slot_to_epoch(slot) == get_current_epoch(state):
- shard = (state.latest_start_shard + slot) % spec.SLOTS_PER_EPOCH
- else:
- previous_shard_delta = get_shard_delta(state, get_previous_epoch(state))
- shard = (state.latest_start_shard - previous_shard_delta + slot) % spec.SHARD_COUNT
-
- attestation_data = build_attestation_data(state, slot, shard)
-
- crosslink_committee = get_crosslink_committee(state, attestation_data.target_epoch, attestation_data.shard)
-
- committee_size = len(crosslink_committee)
- bitfield_length = (committee_size + 7) // 8
- aggregation_bitfield = b'\xC0' + b'\x00' * (bitfield_length - 1)
- custody_bitfield = b'\x00' * bitfield_length
- attestation = Attestation(
- aggregation_bitfield=aggregation_bitfield,
- data=attestation_data,
- custody_bitfield=custody_bitfield,
- )
- participants = get_attesting_indices(
- state,
- attestation.data,
- attestation.aggregation_bitfield,
- )
- assert len(participants) == 2
-
- signatures = []
- for validator_index in participants:
- privkey = privkeys[validator_index]
- signatures.append(
- get_attestation_signature(
- state,
- attestation.data,
- privkey
- )
- )
-
- attestation.aggregation_signature = bls.aggregate_signatures(signatures)
- return attestation
-
-
-def get_valid_transfer(state, slot=None, sender_index=None, amount=None, fee=None):
- if slot is None:
- slot = state.slot
- current_epoch = get_current_epoch(state)
- if sender_index is None:
- sender_index = get_active_validator_indices(state, current_epoch)[-1]
- recipient_index = get_active_validator_indices(state, current_epoch)[0]
- transfer_pubkey = pubkeys[-1]
- transfer_privkey = privkeys[-1]
-
- if fee is None:
- fee = get_balance(state, sender_index) // 32
- if amount is None:
- amount = get_balance(state, sender_index) - fee
-
- transfer = Transfer(
- sender=sender_index,
- recipient=recipient_index,
- amount=amount,
- fee=fee,
- slot=slot,
- pubkey=transfer_pubkey,
- signature=ZERO_HASH,
- )
- transfer.signature = bls.sign(
- message_hash=signing_root(transfer),
- privkey=transfer_privkey,
- domain=get_domain(
- state=state,
- domain_type=spec.DOMAIN_TRANSFER,
- message_epoch=get_current_epoch(state),
- )
- )
-
- # ensure withdrawal_credentials are reproducible
- state.validator_registry[transfer.sender].withdrawal_credentials = (
- spec.BLS_WITHDRAWAL_PREFIX_BYTE + spec.hash(transfer.pubkey)[1:]
- )
-
- return transfer
-
-
-def get_attestation_signature(state, attestation_data, privkey, custody_bit=0b0):
- message_hash = AttestationDataAndCustodyBit(
- data=attestation_data,
- custody_bit=custody_bit,
- ).hash_tree_root()
-
- return bls.sign(
- message_hash=message_hash,
- privkey=privkey,
- domain=get_domain(
- state=state,
- domain_type=spec.DOMAIN_ATTESTATION,
- message_epoch=attestation_data.target_epoch,
- )
- )
-
-
-def fill_aggregate_attestation(state, attestation):
- crosslink_committee = get_crosslink_committee(state, attestation.data.target_epoch, attestation.data.shard)
- for i in range(len(crosslink_committee)):
- attestation.aggregation_bitfield = set_bitfield_bit(attestation.aggregation_bitfield, i)
-
-
-def add_attestation_to_state(state, attestation, slot):
- block = build_empty_block_for_next_slot(state)
- block.slot = slot
- block.body.attestations.append(attestation)
- state_transition(state, block)
-
-
-def next_slot(state):
- """
- Transition to the next slot via an empty block.
- Return the empty block that triggered the transition.
- """
- block = build_empty_block_for_next_slot(state)
- state_transition(state, block)
- return block
-
-
-def next_epoch(state):
- """
- Transition to the start slot of the next epoch via an empty block.
- Return the empty block that triggered the transition.
- """
- block = build_empty_block_for_next_slot(state)
- block.slot += spec.SLOTS_PER_EPOCH - (state.slot % spec.SLOTS_PER_EPOCH)
- state_transition(state, block)
- return block
-
-
-def get_state_root(state, slot) -> bytes:
- """
- Return the state root at a recent ``slot``.
- """
- assert slot < state.slot <= slot + spec.SLOTS_PER_HISTORICAL_ROOT
- return state.latest_state_roots[slot % spec.SLOTS_PER_HISTORICAL_ROOT]
diff --git a/test_libs/pyspec/tests/test_sanity.py b/test_libs/pyspec/tests/test_sanity.py
deleted file mode 100644
index 1b4d20f4c5..0000000000
--- a/test_libs/pyspec/tests/test_sanity.py
+++ /dev/null
@@ -1,438 +0,0 @@
-from copy import deepcopy
-
-import pytest
-
-from py_ecc import bls
-import eth2spec.phase0.spec as spec
-
-from eth2spec.utils.minimal_ssz import signing_root
-from eth2spec.phase0.spec import (
- # constants
- ZERO_HASH,
- SLOTS_PER_HISTORICAL_ROOT,
- # SSZ
- Deposit,
- Transfer,
- VoluntaryExit,
- # functions
- get_active_validator_indices,
- get_beacon_proposer_index,
- get_block_root_at_slot,
- get_current_epoch,
- get_domain,
- advance_slot,
- cache_state,
- verify_merkle_branch,
- hash,
-)
-from eth2spec.phase0.state_transition import (
- state_transition,
-)
-from eth2spec.utils.merkle_minimal import (
- calc_merkle_tree_from_leaves,
- get_merkle_proof,
- get_merkle_root,
-)
-from .helpers import (
- get_balance,
- build_deposit_data,
- build_empty_block_for_next_slot,
- fill_aggregate_attestation,
- get_state_root,
- get_valid_attestation,
- get_valid_attester_slashing,
- get_valid_proposer_slashing,
- next_slot,
- privkeys,
- pubkeys,
-)
-
-
-# mark entire file as 'sanity'
-pytestmark = pytest.mark.sanity
-
-
-def test_slot_transition(state):
- test_state = deepcopy(state)
- cache_state(test_state)
- advance_slot(test_state)
- assert test_state.slot == state.slot + 1
- assert get_state_root(test_state, state.slot) == state.hash_tree_root()
- return test_state
-
-
-def test_empty_block_transition(state):
- test_state = deepcopy(state)
-
- block = build_empty_block_for_next_slot(test_state)
- state_transition(test_state, block)
-
- assert len(test_state.eth1_data_votes) == len(state.eth1_data_votes) + 1
- assert get_block_root_at_slot(test_state, state.slot) == block.previous_block_root
-
- return state, [block], test_state
-
-
-def test_skipped_slots(state):
- test_state = deepcopy(state)
- block = build_empty_block_for_next_slot(test_state)
- block.slot += 3
-
- state_transition(test_state, block)
-
- assert test_state.slot == block.slot
- for slot in range(state.slot, test_state.slot):
- assert get_block_root_at_slot(test_state, slot) == block.previous_block_root
-
- return state, [block], test_state
-
-
-def test_empty_epoch_transition(state):
- test_state = deepcopy(state)
- block = build_empty_block_for_next_slot(test_state)
- block.slot += spec.SLOTS_PER_EPOCH
-
- state_transition(test_state, block)
-
- assert test_state.slot == block.slot
- for slot in range(state.slot, test_state.slot):
- assert get_block_root_at_slot(test_state, slot) == block.previous_block_root
-
- return state, [block], test_state
-
-
-def test_empty_epoch_transition_not_finalizing(state):
- test_state = deepcopy(state)
- block = build_empty_block_for_next_slot(test_state)
- block.slot += spec.SLOTS_PER_EPOCH * 5
-
- state_transition(test_state, block)
-
- assert test_state.slot == block.slot
- assert test_state.finalized_epoch < get_current_epoch(test_state) - 4
- for index in range(len(test_state.validator_registry)):
- assert get_balance(test_state, index) < get_balance(state, index)
-
- return state, [block], test_state
-
-
-def test_proposer_slashing(state):
- test_state = deepcopy(state)
- proposer_slashing = get_valid_proposer_slashing(state)
- validator_index = proposer_slashing.proposer_index
-
- #
- # Add to state via block transition
- #
- block = build_empty_block_for_next_slot(test_state)
- block.body.proposer_slashings.append(proposer_slashing)
- state_transition(test_state, block)
-
- assert not state.validator_registry[validator_index].slashed
-
- slashed_validator = test_state.validator_registry[validator_index]
- assert slashed_validator.slashed
- assert slashed_validator.exit_epoch < spec.FAR_FUTURE_EPOCH
- assert slashed_validator.withdrawable_epoch < spec.FAR_FUTURE_EPOCH
- # lost whistleblower reward
- assert get_balance(test_state, validator_index) < get_balance(state, validator_index)
-
- return state, [block], test_state
-
-
-def test_attester_slashing(state):
- test_state = deepcopy(state)
- attester_slashing = get_valid_attester_slashing(state)
- validator_index = attester_slashing.attestation_1.custody_bit_0_indices[0]
-
- #
- # Add to state via block transition
- #
- block = build_empty_block_for_next_slot(test_state)
- block.body.attester_slashings.append(attester_slashing)
- state_transition(test_state, block)
-
- assert not state.validator_registry[validator_index].slashed
-
- slashed_validator = test_state.validator_registry[validator_index]
- assert slashed_validator.slashed
- assert slashed_validator.exit_epoch < spec.FAR_FUTURE_EPOCH
- assert slashed_validator.withdrawable_epoch < spec.FAR_FUTURE_EPOCH
- # lost whistleblower reward
- assert get_balance(test_state, validator_index) < get_balance(state, validator_index)
-
- proposer_index = get_beacon_proposer_index(test_state)
- # gained whistleblower reward
- assert (
- get_balance(test_state, proposer_index) >
- get_balance(state, proposer_index)
- )
-
- return state, [block], test_state
-
-
-def test_deposit_in_block(state):
- pre_state = deepcopy(state)
- test_deposit_data_leaves = [ZERO_HASH] * len(pre_state.validator_registry)
-
- index = len(test_deposit_data_leaves)
- pubkey = pubkeys[index]
- privkey = privkeys[index]
- deposit_data = build_deposit_data(pre_state, pubkey, privkey, spec.MAX_EFFECTIVE_BALANCE)
-
- item = deposit_data.hash_tree_root()
- test_deposit_data_leaves.append(item)
- tree = calc_merkle_tree_from_leaves(tuple(test_deposit_data_leaves))
- root = get_merkle_root((tuple(test_deposit_data_leaves)))
- proof = list(get_merkle_proof(tree, item_index=index))
- assert verify_merkle_branch(item, proof, spec.DEPOSIT_CONTRACT_TREE_DEPTH, index, root)
-
- deposit = Deposit(
- proof=list(proof),
- index=index,
- data=deposit_data,
- )
-
- pre_state.latest_eth1_data.deposit_root = root
- pre_state.latest_eth1_data.deposit_count = len(test_deposit_data_leaves)
- post_state = deepcopy(pre_state)
- block = build_empty_block_for_next_slot(post_state)
- block.body.deposits.append(deposit)
-
- state_transition(post_state, block)
- assert len(post_state.validator_registry) == len(state.validator_registry) + 1
- assert len(post_state.balances) == len(state.balances) + 1
- assert get_balance(post_state, index) == spec.MAX_EFFECTIVE_BALANCE
- assert post_state.validator_registry[index].pubkey == pubkeys[index]
-
- return pre_state, [block], post_state
-
-
-def test_deposit_top_up(state):
- pre_state = deepcopy(state)
- test_deposit_data_leaves = [ZERO_HASH] * len(pre_state.validator_registry)
-
- validator_index = 0
- amount = spec.MAX_EFFECTIVE_BALANCE // 4
- pubkey = pubkeys[validator_index]
- privkey = privkeys[validator_index]
- deposit_data = build_deposit_data(pre_state, pubkey, privkey, amount)
-
- merkle_index = len(test_deposit_data_leaves)
- item = deposit_data.hash_tree_root()
- test_deposit_data_leaves.append(item)
- tree = calc_merkle_tree_from_leaves(tuple(test_deposit_data_leaves))
- root = get_merkle_root((tuple(test_deposit_data_leaves)))
- proof = list(get_merkle_proof(tree, item_index=merkle_index))
- assert verify_merkle_branch(item, proof, spec.DEPOSIT_CONTRACT_TREE_DEPTH, merkle_index, root)
-
- deposit = Deposit(
- proof=list(proof),
- index=merkle_index,
- data=deposit_data,
- )
-
- pre_state.latest_eth1_data.deposit_root = root
- pre_state.latest_eth1_data.deposit_count = len(test_deposit_data_leaves)
- block = build_empty_block_for_next_slot(pre_state)
- block.body.deposits.append(deposit)
-
- pre_balance = get_balance(pre_state, validator_index)
- post_state = deepcopy(pre_state)
- state_transition(post_state, block)
- assert len(post_state.validator_registry) == len(pre_state.validator_registry)
- assert len(post_state.balances) == len(pre_state.balances)
- assert get_balance(post_state, validator_index) == pre_balance + amount
-
- return pre_state, [block], post_state
-
-
-def test_attestation(state):
- state.slot = spec.SLOTS_PER_EPOCH
- test_state = deepcopy(state)
- attestation = get_valid_attestation(state)
-
- #
- # Add to state via block transition
- #
- attestation_block = build_empty_block_for_next_slot(test_state)
- attestation_block.slot += spec.MIN_ATTESTATION_INCLUSION_DELAY
- attestation_block.body.attestations.append(attestation)
- state_transition(test_state, attestation_block)
-
- assert len(test_state.current_epoch_attestations) == len(state.current_epoch_attestations) + 1
-
-
- #
- # Epoch transition should move to previous_epoch_attestations
- #
- pre_current_epoch_attestations = deepcopy(test_state.current_epoch_attestations)
-
- epoch_block = build_empty_block_for_next_slot(test_state)
- epoch_block.slot += spec.SLOTS_PER_EPOCH
- state_transition(test_state, epoch_block)
-
- assert len(test_state.current_epoch_attestations) == 0
- assert test_state.previous_epoch_attestations == pre_current_epoch_attestations
-
- return state, [attestation_block, epoch_block], test_state
-
-
-def test_voluntary_exit(state):
- pre_state = deepcopy(state)
- validator_index = get_active_validator_indices(
- pre_state,
- get_current_epoch(pre_state)
- )[-1]
-
- # move state forward PERSISTENT_COMMITTEE_PERIOD epochs to allow for exit
- pre_state.slot += spec.PERSISTENT_COMMITTEE_PERIOD * spec.SLOTS_PER_EPOCH
-
- post_state = deepcopy(pre_state)
-
- voluntary_exit = VoluntaryExit(
- epoch=get_current_epoch(pre_state),
- validator_index=validator_index,
- )
- voluntary_exit.signature = bls.sign(
- message_hash=signing_root(voluntary_exit),
- privkey=privkeys[validator_index],
- domain=get_domain(
- state=pre_state,
- domain_type=spec.DOMAIN_VOLUNTARY_EXIT,
- )
- )
-
- #
- # Add to state via block transition
- #
- initiate_exit_block = build_empty_block_for_next_slot(post_state)
- initiate_exit_block.body.voluntary_exits.append(voluntary_exit)
- state_transition(post_state, initiate_exit_block)
-
- assert post_state.validator_registry[validator_index].exit_epoch < spec.FAR_FUTURE_EPOCH
-
- #
- # Process within epoch transition
- #
- exit_block = build_empty_block_for_next_slot(post_state)
- exit_block.slot += spec.SLOTS_PER_EPOCH
- state_transition(post_state, exit_block)
-
- assert post_state.validator_registry[validator_index].exit_epoch < spec.FAR_FUTURE_EPOCH
-
- return pre_state, [initiate_exit_block, exit_block], post_state
-
-
-def test_transfer(state):
- # overwrite default 0 to test
- spec.MAX_TRANSFERS = 1
-
- pre_state = deepcopy(state)
- current_epoch = get_current_epoch(pre_state)
- sender_index = get_active_validator_indices(pre_state, current_epoch)[-1]
- recipient_index = get_active_validator_indices(pre_state, current_epoch)[0]
- transfer_pubkey = pubkeys[-1]
- transfer_privkey = privkeys[-1]
- amount = get_balance(pre_state, sender_index)
- pre_transfer_recipient_balance = get_balance(pre_state, recipient_index)
- transfer = Transfer(
- sender=sender_index,
- recipient=recipient_index,
- amount=amount,
- fee=0,
- slot=pre_state.slot + 1,
- pubkey=transfer_pubkey,
- )
- transfer.signature = bls.sign(
- message_hash=signing_root(transfer),
- privkey=transfer_privkey,
- domain=get_domain(
- state=pre_state,
- domain_type=spec.DOMAIN_TRANSFER,
- )
- )
-
-    # ensure withdrawal_credentials reproducible
- pre_state.validator_registry[sender_index].withdrawal_credentials = (
- spec.BLS_WITHDRAWAL_PREFIX_BYTE + hash(transfer_pubkey)[1:]
- )
- # un-activate so validator can transfer
- pre_state.validator_registry[sender_index].activation_eligibility_epoch = spec.FAR_FUTURE_EPOCH
-
- post_state = deepcopy(pre_state)
- #
- # Add to state via block transition
- #
- block = build_empty_block_for_next_slot(post_state)
- block.body.transfers.append(transfer)
- state_transition(post_state, block)
-
- sender_balance = get_balance(post_state, sender_index)
- recipient_balance = get_balance(post_state, recipient_index)
- assert sender_balance == 0
- assert recipient_balance == pre_transfer_recipient_balance + amount
-
- return pre_state, [block], post_state
-
-
-def test_balance_driven_status_transitions(state):
- current_epoch = get_current_epoch(state)
- validator_index = get_active_validator_indices(state, current_epoch)[-1]
-
- assert state.validator_registry[validator_index].exit_epoch == spec.FAR_FUTURE_EPOCH
-
- # set validator balance to below ejection threshold
- state.validator_registry[validator_index].effective_balance = spec.EJECTION_BALANCE
-
- post_state = deepcopy(state)
- #
- # trigger epoch transition
- #
- block = build_empty_block_for_next_slot(post_state)
- block.slot += spec.SLOTS_PER_EPOCH
- state_transition(post_state, block)
-
- assert post_state.validator_registry[validator_index].exit_epoch < spec.FAR_FUTURE_EPOCH
-
- return state, [block], post_state
-
-
-def test_historical_batch(state):
- state.slot += spec.SLOTS_PER_HISTORICAL_ROOT - (state.slot % spec.SLOTS_PER_HISTORICAL_ROOT) - 1
-
- post_state = deepcopy(state)
-
- block = build_empty_block_for_next_slot(post_state)
-
- state_transition(post_state, block)
-
- assert post_state.slot == block.slot
- assert get_current_epoch(post_state) % (spec.SLOTS_PER_HISTORICAL_ROOT // spec.SLOTS_PER_EPOCH) == 0
- assert len(post_state.historical_roots) == len(state.historical_roots) + 1
-
- return state, [block], post_state
-
-
-def test_eth1_data_votes(state):
- post_state = deepcopy(state)
-
- expected_votes = 0
- assert len(state.eth1_data_votes) == expected_votes
-
- blocks = []
- for _ in range(spec.SLOTS_PER_ETH1_VOTING_PERIOD - 1):
- block = build_empty_block_for_next_slot(post_state)
- state_transition(post_state, block)
- expected_votes += 1
- assert len(post_state.eth1_data_votes) == expected_votes
- blocks.append(block)
-
- block = build_empty_block_for_next_slot(post_state)
- state_transition(post_state, block)
- blocks.append(block)
-
- assert post_state.slot % spec.SLOTS_PER_ETH1_VOTING_PERIOD == 0
- assert len(post_state.eth1_data_votes) == 1
-
- return state, blocks, post_state
|
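The deleted `get_state_root` helper in the diff above relies on a ring-buffer pattern: state roots are stored modulo `SLOTS_PER_HISTORICAL_ROOT`, so a root is only recoverable while the current slot is still within that window. The following is a minimal self-contained sketch of that pattern; the `State` class and the constant's value here are illustrative stand-ins, not spec definitions.

```python
SLOTS_PER_HISTORICAL_ROOT = 8  # illustrative; the real spec value is much larger

class State:
    def __init__(self) -> None:
        self.slot = 0
        self.latest_state_roots = [b"\x00" * 32] * SLOTS_PER_HISTORICAL_ROOT

    def record_root(self, root: bytes) -> None:
        # Write the current slot's root into the ring buffer, then advance.
        self.latest_state_roots[self.slot % SLOTS_PER_HISTORICAL_ROOT] = root
        self.slot += 1

def get_state_root(state: State, slot: int) -> bytes:
    # Same validity window as the deleted helper: the requested slot must be
    # in the past, but no further back than the ring buffer's length.
    assert slot < state.slot <= slot + SLOTS_PER_HISTORICAL_ROOT
    return state.latest_state_roots[slot % SLOTS_PER_HISTORICAL_ROOT]

state = State()
for i in range(10):
    state.record_root(bytes([i]) * 32)

# Slots 2..9 are still inside the window; slots 0 and 1 have been overwritten.
assert get_state_root(state, 9) == bytes([9]) * 32
assert get_state_root(state, 2) == bytes([2]) * 32
```

Requesting a slot outside the window (here, slot 1) trips the assertion rather than silently returning a root that has already been overwritten.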
pex-tool__pex-822 | Release 2.0.3
On the docket:
+ [x] Pex should trust any host passed via `--index` or `--find-links`. #812
+ [x] A cache should always be used by `pex.resolver.resolve`. #809
+ [x] Use the resolve cache to skip installs. #815
+ [x] Parallelize resolve. #818
+ [x] Cache sdist & local project builds #817
+ [x] Unify resolve and runtime wheel caches. #820
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.0.2'\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.0.3'\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 80e4b80d7..42a5e1252 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,26 @@
Release Notes
=============
+2.0.3
+-----
+
+This release fixes a regression in handling explicitly requested `--index` or
+`--find-links` http (insecure) repos. In addition, performance of the pex 2.x
+resolver is brought in line with the 1.x resolver in all cases and improved in
+most cases.
+
+* Unify PEX buildtime and runtime wheel caches. #821
+ `PR #821 <https://github.com/pantsbuild/pex/pull/821>`_
+
+* Parallelize resolve. (#819)
+ `PR #819 <https://github.com/pantsbuild/pex/pull/819>`_
+
+* Use the resolve cache to skip installs. (#815)
+ `PR #815 <https://github.com/pantsbuild/pex/pull/815>`_
+
+* Implicitly trust explicitly requested repos. (#813)
+ `PR #813 <https://github.com/pantsbuild/pex/pull/813>`_
+
2.0.2
-----
diff --git a/pex/version.py b/pex/version.py
index a698f9d1f..f77cc369d 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = '2.0.2'
+__version__ = '2.0.3'
|
mabel-dev__opteryx-1641 | 🪲 Python 3.9 tests stalling
### Thank you for taking the time to report a problem with Opteryx.
_To help us to respond to your request we ask that you try to provide the below detail about the bug._
**Describe the bug** _A clear and specific description of what the bug is. What the error, incorrect or unexpected behaviour was._
**Expected behaviour** _A clear and concise description of what you expected to happen._
**Sample Code/Statement** _If you can, please submit the SQL statement or Python code snippet, or a representative example using the sample datasets._
~~~sql
~~~
**Additional context** _Add any other context about the problem here, for example what you have done to try to diagnose or workaround the problem._
| [
{
"content": "__build__ = 477\n\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agre... | [
{
"content": "__build__ = 482\n\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agre... | diff --git a/.github/workflows/regression_suite.yaml b/.github/workflows/regression_suite.yaml
index ef2373725..dac54768e 100644
--- a/.github/workflows/regression_suite.yaml
+++ b/.github/workflows/regression_suite.yaml
@@ -11,7 +11,7 @@ jobs:
max-parallel: 4
fail-fast: false
matrix:
- python-version: ['3.9', '3.10', '3.11', '3.12']
+ python-version: ['3.10', '3.11', '3.12']
os: [ubuntu-latest]
runs-on: ${{ matrix.os }}
steps:
diff --git a/.github/workflows/regression_suite_mac_x86.yaml b/.github/workflows/regression_suite_mac_x86.yaml
index ecde26602..d50d52d62 100644
--- a/.github/workflows/regression_suite_mac_x86.yaml
+++ b/.github/workflows/regression_suite_mac_x86.yaml
@@ -8,7 +8,7 @@ jobs:
strategy:
max-parallel: 4
matrix:
- python-version: ['3.9', '3.10', '3.11']
+ python-version: ['3.10', '3.11']
os: ['macos-latest']
runs-on: ${{ matrix.os }}
steps:
diff --git a/.github/workflows/regression_suite_windows.yaml b/.github/workflows/regression_suite_windows.yaml
index 488f85140..f5e13c483 100644
--- a/.github/workflows/regression_suite_windows.yaml
+++ b/.github/workflows/regression_suite_windows.yaml
@@ -8,7 +8,7 @@ jobs:
strategy:
max-parallel: 4
matrix:
- python-version: ['3.9', '3.10', '3.11']
+ python-version: ['3.10', '3.11']
os: ['windows-latest']
runs-on: ${{ matrix.os }}
steps:
diff --git a/opteryx/__version__.py b/opteryx/__version__.py
index ff5d4c87b..25008bb66 100644
--- a/opteryx/__version__.py
+++ b/opteryx/__version__.py
@@ -1,4 +1,4 @@
-__build__ = 477
+__build__ = 482
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
diff --git a/tests/misc/test_documentation.py b/tests/misc/test_documentation.py
index fbc138621..e7ae5d67a 100644
--- a/tests/misc/test_documentation.py
+++ b/tests/misc/test_documentation.py
@@ -8,8 +8,10 @@
sys.path.insert(1, os.path.join(sys.path[0], "../.."))
from tests.tools import download_file
+from tests.tools import is_version, skip_if
+@skip_if(is_version("3.9"))
def test_documentation_connect_example():
import opteryx
@@ -24,6 +26,7 @@ def test_documentation_connect_example():
conn.close()
+@skip_if(is_version("3.9"))
def test_readme_1():
import opteryx
@@ -31,6 +34,7 @@ def test_readme_1():
result.head()
+@skip_if(is_version("3.9"))
def test_readme_2():
import pandas
@@ -44,6 +48,7 @@ def test_readme_2():
aggregated_df.head()
+@skip_if(is_version("3.9"))
def test_readme_3():
import opteryx
@@ -57,6 +62,7 @@ def test_readme_3():
result.head()
+@skip_if(is_version("3.9"))
def test_readme_4():
import opteryx
from opteryx.connectors import GcpCloudStorageConnector
@@ -68,6 +74,7 @@ def test_readme_4():
result.head()
+@skip_if(is_version("3.9"))
def test_readme_5():
import opteryx
from opteryx.connectors import SqlConnector
@@ -90,12 +97,14 @@ def test_readme_5():
result.head()
+@skip_if(is_version("3.9"))
def test_get_started():
import opteryx
result = opteryx.query("SELECT * FROM $planets;").arrow()
+@skip_if(is_version("3.9"))
def test_python_client():
import opteryx
@@ -136,6 +145,7 @@ def test_python_client():
).fetchall()
+@skip_if(is_version("3.9"))
def test_pandas_integration_input():
import pandas
@@ -155,12 +165,14 @@ def test_pandas_integration_input():
results = opteryx.query("SELECT * FROM nephews").arrow()
+@skip_if(is_version("3.9"))
def test_pandas_integration_output():
import opteryx
dataframe = opteryx.query("SELECT * FROM $planets").pandas()
+@skip_if(is_version("3.9"))
def test_polars_integration_input():
import polars
@@ -180,12 +192,14 @@ def test_polars_integration_input():
results = opteryx.query("SELECT * FROM nephews").arrow()
+@skip_if(is_version("3.9"))
def test_polars_integration_output():
import opteryx
dataframe = opteryx.query("SELECT * FROM $planets").polars()
+@skip_if(is_version("3.9"))
def test_permissions_example():
import opteryx
@@ -200,6 +214,7 @@ def test_permissions_example():
print("User does not have permission to execute this query")
+@skip_if(is_version("3.9"))
def test_role_based_permissions():
import opteryx
@@ -226,6 +241,7 @@ def get_user_permissions(user_roles):
assert perms == {"Query", "Execute", "Analyze"}
+@skip_if(is_version("3.9"))
def test_membership_permissions():
import opteryx
diff --git a/tests/plan_optimization/test_fold_all_constants.py b/tests/plan_optimization/test_fold_all_constants.py
index 67591ab63..61124b475 100644
--- a/tests/plan_optimization/test_fold_all_constants.py
+++ b/tests/plan_optimization/test_fold_all_constants.py
@@ -5,8 +5,10 @@
import numpy
import opteryx
+from tests.tools import is_version, skip_if
+@skip_if(is_version("3.9"))
def test_we_dont_fold_random():
SQL = "SELECT random() AS r FROM GENERATE_SERIES(5000) AS g"
|
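The patch above gates 3.9-incompatible tests with `@skip_if(is_version("3.9"))`. The actual `tests.tools` helpers are not shown in this diff, so the following is only a minimal sketch of how such a version-gated skip decorator might look; the repository's real helpers may differ (for instance, by hooking into pytest's skip machinery instead of silently returning).

```python
import sys
from functools import wraps

def is_version(version: str) -> bool:
    # True when the running interpreter's major.minor matches, e.g. "3.9".
    return f"{sys.version_info.major}.{sys.version_info.minor}" == version

def skip_if(condition: bool):
    # Decorator factory: when `condition` holds, the wrapped test becomes a no-op.
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            if condition:
                return None  # test skipped
            return func(*args, **kwargs)
        return wrapper
    return decorator

@skip_if(is_version("0.0"))  # condition can never be true, so the test runs
def test_runs():
    return "ran"

@skip_if(True)  # always skipped
def test_skipped():
    return "ran"

assert test_runs() == "ran"
assert test_skipped() is None
```

Because `condition` is evaluated at decoration time, the interpreter check happens once at import rather than on every call.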
pex-tool__pex-910 | Release 2.1.5
On the docket:
+ [x] Kill `Pip.spawn_install_wheel` `overwrite` arg. #907
+ [x] Silence pip warnings about Python 2.7. #908
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.1.4'\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.1.5'\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 8d19aed0e..80c591fe1 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,18 @@
Release Notes
=============
+2.1.5
+-----
+
+* Silence pip warnings about Python 2.7. (#908)
+  `PR #908 <https://github.com/pantsbuild/pex/pull/908>`_
+
+* Kill `Pip.spawn_install_wheel` `overwrite` arg. (#907)
+  `PR #907 <https://github.com/pantsbuild/pex/pull/907>`_
+
+* Show pex-root from env as default in help output (#901)
+  `PR #901 <https://github.com/pantsbuild/pex/pull/901>`_
+
2.1.4
-----
diff --git a/pex/version.py b/pex/version.py
index 3860dbc2e..3e0c53016 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = '2.1.4'
+__version__ = '2.1.5'
|
mabel-dev__opteryx-1695 | ✨ Memory Pool Optimizations
### Thanks for stopping by to let us know something could be better!
**Is your feature request related to a problem? Please describe.** _A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]_
**Describe the solution you'd like** _A clear and concise description of what you want to happen._
**Describe alternatives you've considered** _A clear and concise description of any alternative solutions or features you've considered._
**Additional context** _Add any other context or screenshots about the feature request here._
| [
{
"content": "__build__ = 527\n\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agre... | [
{
"content": "__build__ = 532\n\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agre... | diff --git a/opteryx/__version__.py b/opteryx/__version__.py
index a2313e19e..f9db5e2cc 100644
--- a/opteryx/__version__.py
+++ b/opteryx/__version__.py
@@ -1,4 +1,4 @@
-__build__ = 527
+__build__ = 532
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
diff --git a/opteryx/compiled/structures/memory_pool.pyx b/opteryx/compiled/structures/memory_pool.pyx
index 538b79869..449ff55d7 100644
--- a/opteryx/compiled/structures/memory_pool.pyx
+++ b/opteryx/compiled/structures/memory_pool.pyx
@@ -1,6 +1,10 @@
# cython: language_level=3
# cython: nonecheck=False
# cython: cdivision=True
+# cython: initializedcheck=False
+# cython: infer_types=True
+# cython: wraparound=True
+# cython: boundscheck=False
from libc.stdlib cimport malloc, free
from libc.string cimport memcpy
@@ -81,24 +85,26 @@ cdef class MemoryPool:
cdef vector[MemorySegment] sorted_segments
self.l1_compaction += 1
- i = 1
n = len(self.free_segments)
+ if n <= 1:
+ return
- sorted_segments = sorted(self.free_segments, key=lambda x: x["start"])
- new_free_segments = [sorted_segments[0]]
+ # Sort the free segments by start attribute
+ self.free_segments = sorted(self.free_segments, key=lambda x: x["start"])
+ new_free_segments = [self.free_segments[0]]
- for segment in sorted_segments[1:]:
+ for segment in self.free_segments[1:]:
last_segment = new_free_segments[-1]
if last_segment.start + last_segment.length == segment.start:
# If adjacent, merge by extending the last segment
- last_segment.length += segment.length
- new_free_segments[-1] = last_segment
+ new_free_segments[-1] = MemorySegment(last_segment.start, last_segment.length + segment.length)
else:
# If not adjacent, just add the segment to the new list
new_free_segments.append(segment)
self.free_segments = new_free_segments
+
def _level2_compaction(self):
"""
Aggressively compacts by pushing all free memory to the end (Level 2 compaction).
@@ -134,7 +140,6 @@ cdef class MemoryPool:
# special case for 0 byte segments
if len_data == 0:
new_segment = MemorySegment(0, 0)
- ref_id = random_int()
self.used_segments[ref_id] = new_segment
self.commits += 1
return ref_id
@@ -179,7 +184,7 @@ cdef class MemoryPool:
segment = self.used_segments[ref_id]
if zero_copy != 0:
- raw_data = <char[:segment.length]> char_ptr
+ raw_data = <char[:segment.length]> (char_ptr + segment.start)
data = memoryview(raw_data) # Create a memoryview from the raw data
else:
data = PyBytes_FromStringAndSize(char_ptr + segment.start, segment.length)
@@ -188,7 +193,6 @@ cdef class MemoryPool:
raise ValueError("Invalid reference ID.")
post_read_segment = self.used_segments[ref_id]
if post_read_segment.start != segment.start or post_read_segment.length != segment.length:
-
with self.lock:
self.read_locks += 1
@@ -197,11 +201,10 @@ cdef class MemoryPool:
segment = self.used_segments[ref_id]
if zero_copy != 0:
- raw_data = <char[:segment.length]> char_ptr
+ raw_data = <char[:segment.length]> (char_ptr + segment.start)
data = memoryview(raw_data) # Create a memoryview from the raw data
else:
return PyBytes_FromStringAndSize(char_ptr + segment.start, segment.length)
-
return data
def read_and_release(self, long ref_id, int zero_copy = 1):
@@ -219,7 +222,7 @@ cdef class MemoryPool:
self.free_segments.push_back(segment)
if zero_copy != 0:
- raw_data = <char[:segment.length]> char_ptr
+ raw_data = <char[:segment.length]> (char_ptr + segment.start)
return memoryview(raw_data) # Create a memoryview from the raw data
else:
return PyBytes_FromStringAndSize(char_ptr + segment.start, segment.length)
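The `_level1_compaction` change in the hunk above sorts the free list by `start` and folds neighbours whose ranges touch. A pure-Python sketch of that merge step follows; the `MemorySegment` dataclass here is an illustrative stand-in for the Cython struct, not the pool's actual type.

```python
from dataclasses import dataclass

@dataclass
class MemorySegment:
    start: int
    length: int

def level1_compaction(free_segments):
    # Sort free segments by start, then merge any pair where the previous
    # segment ends exactly where the next one begins.
    if len(free_segments) <= 1:
        return list(free_segments)
    ordered = sorted(free_segments, key=lambda s: s.start)
    merged = [ordered[0]]
    for seg in ordered[1:]:
        last = merged[-1]
        if last.start + last.length == seg.start:
            # Adjacent: extend the previous segment instead of keeping both.
            merged[-1] = MemorySegment(last.start, last.length + seg.length)
        else:
            merged.append(seg)
    return merged

# Three touching fragments collapse into one free block covering [0, 15).
segs = [MemorySegment(10, 5), MemorySegment(0, 4), MemorySegment(4, 6)]
assert level1_compaction(segs) == [MemorySegment(0, 15)]
```

Note the patched Cython version builds a fresh `MemorySegment` when merging rather than mutating a copy, which is the same choice made here.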
diff --git a/tests/misc/test_memory_pool.py b/tests/misc/test_memory_pool.py
index a8f32a88d..b6dcfcb67 100644
--- a/tests/misc/test_memory_pool.py
+++ b/tests/misc/test_memory_pool.py
@@ -73,7 +73,8 @@ def test_compaction():
mp.release(ref1)
ref3 = mp.commit(b"Third")
# Ensure that the third commit succeeds after compaction, despite the first segment being released
- assert mp.read(ref3, False) == b"Third"
+ data = mp.read(ref3, False)
+ assert data == b"Third"
def test_multiple_commits_and_reads():
@@ -94,6 +95,119 @@ def test_overlapping_writes():
assert mp.read(ref2, False) == b"abcde"
assert mp.read(ref3, False) == b"XYZ"
+def test_overlapping_writes_memcopy():
+ mp = MemoryPool(size=20)
+ ref1 = mp.commit(b"12345")
+ ref2 = mp.commit(b"abcde")
+ mp.release(ref1)
+ ref3 = mp.commit(b"XYZ")
+ # Test if the new write overlaps correctly and does not corrupt other data
+ r2_memcopy = bytes(mp.read(ref2, True))
+ r2_no_memcopy = mp.read(ref2, False)
+ r3_memcopy = bytes(mp.read(ref3, True))
+ r3_no_memcopy = mp.read(ref3, False)
+
+ assert r2_memcopy == r2_no_memcopy == b"abcde", f"{r2_memcopy} / {r2_no_memcopy} / abcde"
+ assert r3_memcopy == r3_no_memcopy == b"XYZ", f"{r3_memcopy} / {r3_no_memcopy} / XYZ"
+
+def test_zero_copy_vs_copy_reads():
+ mp = MemoryPool(size=30)
+
+ # Initial commits
+ ref1 = mp.commit(b"12345")
+ ref2 = mp.commit(b"abcde")
+ ref3 = mp.commit(b"ABCDE")
+
+ # Release one segment to create free space
+ mp.release(ref1)
+
+ # Commit more data to fill the pool
+ ref4 = mp.commit(b"XYZ")
+ ref5 = mp.commit(b"7890")
+
+ # Additional activity
+ ref6 = mp.commit(b"LMNOP")
+ mp.release(ref3)
+ ref7 = mp.commit(b"qrst")
+ mp.release(ref2)
+ ref8 = mp.commit(b"uvwxyz")
+
+ # Reading segments with and without zero-copy
+ r4_memcopy = bytes(mp.read(ref4, True))
+ r4_no_memcopy = mp.read(ref4, False)
+ r5_memcopy = bytes(mp.read(ref5, True))
+ r5_no_memcopy = mp.read(ref5, False)
+ r6_memcopy = bytes(mp.read(ref6, True))
+ r6_no_memcopy = mp.read(ref6, False)
+ r7_memcopy = bytes(mp.read(ref7, True))
+ r7_no_memcopy = mp.read(ref7, False)
+ r8_memcopy = bytes(mp.read(ref8, True))
+ r8_no_memcopy = mp.read(ref8, False)
+
+ assert r4_memcopy == r4_no_memcopy == b"XYZ", f"{r4_memcopy} / {r4_no_memcopy} / XYZ"
+ assert r5_memcopy == r5_no_memcopy == b"7890", f"{r5_memcopy} / {r5_no_memcopy} / 7890"
+ assert r6_memcopy == r6_no_memcopy == b"LMNOP", f"{r6_memcopy} / {r6_no_memcopy} / LMNOP"
+ assert r7_memcopy == r7_no_memcopy == b"qrst", f"{r7_memcopy} / {r7_no_memcopy} / qrst"
+ assert r8_memcopy == r8_no_memcopy == b"uvwxyz", f"{r8_memcopy} / {r8_no_memcopy} / uvwxyz"
+
+
+def test_zero_copy_vs_copy_reads_and_release():
+ mp = MemoryPool(size=30)
+
+ # Initial commits
+ ref1 = mp.commit(b"12345")
+ ref2 = mp.commit(b"abcde")
+ ref3 = mp.commit(b"ABCDE")
+
+ # Release one segment to create free space
+ mp.release(ref1)
+
+ # Commit more data to fill the pool
+ ref4 = mp.commit(b"XYZ")
+ ref5 = mp.commit(b"7890")
+
+ # Additional activity
+ ref6 = mp.commit(b"LMNOP")
+ mp.release(ref3)
+ ref7 = mp.commit(b"qrst")
+ mp.release(ref2)
+ ref8 = mp.commit(b"uvwxyz")
+
+ # Reading segments with and without zero-copy, alternating read and read_and_release
+ # read no zero copy, release zero copy
+ r4_read_no_memcopy = bytes(mp.read(ref4, False))
+ r4_release_memcopy = bytes(mp.read_and_release(ref4, True))
+
+ # read zero copy, release no zero copy
+ r5_read_memcopy = bytes(mp.read(ref5, True))
+ r5_release_no_memcopy = bytes(mp.read_and_release(ref5, False))
+
+ # read zero copy, release zero copy
+ r6_read_memcopy = bytes(mp.read(ref6, True))
+ r6_release_memcopy = bytes(mp.read_and_release(ref6, True))
+
+ # read no zero copy, release no zero copy
+ r7_read_no_memcopy = bytes(mp.read(ref7, False))
+ r7_release_no_memcopy = bytes(mp.read_and_release(ref7, False))
+
+ # read zero copy, release zero copy
+ r8_read_memcopy = bytes(mp.read(ref8, True))
+ r8_release_memcopy = bytes(mp.read_and_release(ref8, True))
+
+ assert r4_read_no_memcopy == r4_release_memcopy == b"XYZ", f"{r4_read_no_memcopy} / {r4_release_memcopy} / XYZ"
+ assert r5_read_memcopy == r5_release_no_memcopy == b"7890", f"{r5_read_memcopy} / {r5_release_no_memcopy} / 7890"
+ assert r6_read_memcopy == r6_release_memcopy == b"LMNOP", f"{r6_read_memcopy} / {r6_release_memcopy} / LMNOP"
+ assert r7_read_no_memcopy == r7_release_no_memcopy == b"qrst", f"{r7_read_no_memcopy} / {r7_release_no_memcopy} / qrst"
+ assert r8_read_memcopy == r8_release_memcopy == b"uvwxyz", f"{r8_read_memcopy} / {r8_release_memcopy} / uvwxyz"
+
+ # Ensure that the segments are released and available for new commits
+ ref9 = mp.commit(b"newdata")
+ r9_memcopy = bytes(mp.read(ref9, True))
+ r9_no_memcopy = mp.read(ref9, False)
+
+ assert r9_memcopy == r9_no_memcopy == b"newdata", f"{r9_memcopy} / {r9_no_memcopy} / newdata"
+
+
def test_pool_exhaustion_and_compaction():
mp = MemoryPool(size=20)
@@ -375,5 +489,5 @@ def test_return_types():
if __name__ == "__main__": # pragma: no cover
from tests.tools import run_tests
-
+ test_compaction_effectiveness()
run_tests()
diff --git a/tests/tools.py b/tests/tools.py
index 19afe1ce4..8788fa848 100644
--- a/tests/tools.py
+++ b/tests/tools.py
@@ -137,7 +137,7 @@ def run_tests():
for index, method in enumerate(test_methods):
start_time = time.monotonic_ns()
test_name = f"\033[38;2;255;184;108m{(index + 1):04}\033[0m \033[38;2;189;147;249m{str(method.__name__)}\033[0m"
- print(test_name.ljust(display_width - 20), end="")
+ print(test_name.ljust(display_width - 20), end="", flush=True)
error = None
output = ""
try:
|
gammapy__gammapy-5151 | Defaults for `methods` in signature of `SafeMaskMaker.__init__` is confusing
```
def __init__(
self,
methods=("aeff-default",),
aeff_percent=10,
bias_percent=10,
position=None,
fixed_offset=None,
offset_max="3 deg",
irfs="DL4",
):
```
In the signature of `SafeMaskMaker`, the `methods` argument defaults to `("aeff-default",)`, which is confusing because the trailing comma is necessary for the code to work. If one omits the comma while using a tuple, the later instruction `set(methods)` gives back `{'-', 'a', 'd', 'e', 'f', 'l', 't', 'u'}`. To make it less confusing I think it would be good to change the tuple to a list, for which `set(["aeff-default"])` gives back `{'aeff-default'}`.
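A minimal standalone sketch of the pitfall, runnable in any Python 3 interpreter (the variable names are illustrative, not from the gammapy code):

```python
# Without the trailing comma, the parentheses are just grouping and the
# value is a plain string; set() then iterates over its characters.
methods_tuple = ("aeff-default",)   # a one-element tuple
methods_string = ("aeff-default")   # no comma: still a str
methods_list = ["aeff-default"]     # a list is unambiguous

print(set(methods_tuple))   # {'aeff-default'}
print(set(methods_string))  # {'-', 'a', 'd', 'e', 'f', 'l', 't', 'u'}
print(set(methods_list))    # {'aeff-default'}
```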
| [
{
"content": "# Licensed under a 3-clause BSD style license - see LICENSE.rst\nimport logging\nimport numpy as np\nfrom astropy import units as u\nfrom astropy.coordinates import Angle\nfrom gammapy.irf import EDispKernelMap\nfrom gammapy.maps import Map\nfrom gammapy.modeling.models import TemplateSpectralMode... | [
{
"content": "# Licensed under a 3-clause BSD style license - see LICENSE.rst\nimport logging\nimport numpy as np\nfrom astropy import units as u\nfrom astropy.coordinates import Angle\nfrom gammapy.irf import EDispKernelMap\nfrom gammapy.maps import Map\nfrom gammapy.modeling.models import TemplateSpectralMode... | diff --git a/gammapy/makers/safe.py b/gammapy/makers/safe.py
index 09c381c9f1..73d9b60b8c 100644
--- a/gammapy/makers/safe.py
+++ b/gammapy/makers/safe.py
@@ -62,7 +62,7 @@ class SafeMaskMaker(Maker):
def __init__(
self,
- methods=("aeff-default",),
+ methods=["aeff-default"],
aeff_percent=10,
bias_percent=10,
position=None,
|
pex-tool__pex-797 | Release 2.0.1
On the docket:
+ [x] pex --index-url=... fails in 2.0.0 #794
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.0.0'\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.0.1'\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 2b97c531c..52bfe2f01 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,15 @@
Release Notes
=============
+2.0.1
+-----
+
+This is a hotfix release that fixes a bug when specifying a custom index
+(`-i`/`--index`/`--index-url`) via the CLI.
+
+* Fix #794 issue by add missing return statement in __str__ (#795)
+ `PR #795 <https://github.com/pantsbuild/pex/pull/795>`_
+
2.0.0
-----
diff --git a/pex/version.py b/pex/version.py
index 772804dc4..7d8716be3 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = '2.0.0'
+__version__ = '2.0.1'
|
pex-tool__pex-891 | Release 2.1.3
On the docket:
+ [x] Error eagerly if an interpreter binary doesn't exist #886
+ [x] The pip-powered resolve in pex 2 will re-tokenize --find-links pages on each transitive requirement #887
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.1.2'\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.1.3'\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index ea14e891f..a8a961167 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,23 @@
Release Notes
=============
+2.1.3
+-----
+
+This release fixes a performance regression in which pip
+would re-tokenize --find-links pages unnecessarily.
+The parsed pages are now cached in a pip patch that has
+also been submitted upstream.
+
+* Revendor pip (#890)
+ `PR #890 <https://github.com/pantsbuild/pex/pull/890>`_
+
+* Add a clear_cache() method to PythonInterpreter. (#885)
+ `PR #885 <https://github.com/pantsbuild/pex/pull/885>`_
+
+* Error eagerly if an interpreter binary doesn't exist. (#886)
+ `PR #886 <https://github.com/pantsbuild/pex/pull/886>`_
+
2.1.2
-----
diff --git a/pex/version.py b/pex/version.py
index b75078c50..393c14b99 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = '2.1.2'
+__version__ = '2.1.3'
|
pex-tool__pex-836 | Release 2.1.0
On the docket:
The prime motivator:
+ [x] Pex does not download foreign abi3 wheels correctly #823
Changes to support the above as well as others:
+ [x] Fix pex resolving for foreign platforms. #835
+ [x] Use pypa/packaging. #831
+ [x] Upgrade vendored setuptools to 42.0.2. #832
+ [x] De-vendor pex just once per version. #833
+ [x] Support VCS urls for vendoring. #834
+ [x] Support python 3.8 in CI. #829
+ [x] Fix pex resolution to respect --ignore-errors. #828
+ [x] Kill `pkg_resources` finders monkey-patching. #827
+ [x] Use flit to distribute pex. #826
+ [x] Cleanup extras_require. #825
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.0.3'\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.1.0'\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 42a5e1252..330d89a06 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,49 @@
Release Notes
=============
+2.1.0
+-----
+
+This release restores and improves support for building and running
+multiplatform pexes. Foreign `linux*` platform builds now include
+`manylinux2014` compatible wheels by default and foreign CPython pexes now
+resolve `abi3` wheels correctly. In addition, error messages at both buildtime
+and runtime related to resolution of dependencies are more informative.
+
+Pex 2.1.0 should be considered the first Pex 2-series release that fully
+replaces and improves upon Pex 1-series functionality.
+
+* Fix pex resolving for foreign platforms. (#835)
+ `PR #835 <https://github.com/pantsbuild/pex/pull/835>`_
+
+* Use pypa/packaging. (#831)
+ `PR #831 <https://github.com/pantsbuild/pex/pull/831>`_
+
+* Upgrade vendored setuptools to 42.0.2. (#832)
+ `PR #832 <https://github.com/pantsbuild/pex/pull/832>`_
+ `PR #1830 <https://github.com/pypa/setuptools/pull/1830>`_
+
+* De-vendor pex just once per version. (#833)
+ `PR #833 <https://github.com/pantsbuild/pex/pull/833>`_
+
+* Support VCS urls for vendoring. (#834)
+ `PR #834 <https://github.com/pantsbuild/pex/pull/834>`_
+
+* Support python 3.8 in CI. (#829)
+ `PR #829 <https://github.com/pantsbuild/pex/pull/829>`_
+
+* Fix pex resolution to respect --ignore-errors. (#828)
+ `PR #828 <https://github.com/pantsbuild/pex/pull/828>`_
+
+* Kill `pkg_resources` finders monkey-patching. (#827)
+ `PR #827 <https://github.com/pantsbuild/pex/pull/827>`_
+
+* Use flit to distribute pex. (#826)
+ `PR #826 <https://github.com/pantsbuild/pex/pull/826>`_
+
+* Cleanup extras_require. (#825)
+ `PR #825 <https://github.com/pantsbuild/pex/pull/825>`_
+
2.0.3
-----
diff --git a/pex/version.py b/pex/version.py
index f77cc369d..befdccbff 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = '2.0.3'
+__version__ = '2.1.0'
|
pex-tool__pex-945 | Release 2.1.8
On the docket:
+ [x] Cache pip.pex. #937
+ [x] Ensure the interpreter path is a file #938
+ [x] Support an unzip toggle for PEXes. #939
+ [x] Better support unzip mode PEXes. #941
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.1.7'\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.1.8'\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 199e0d5db..b306c05b0 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,27 @@
Release Notes
=============
+2.1.8
+-----
+
+This release brings enhanced performance when using the Pex CLI or API to resolve requirements and
+improved performance for many PEXed applications when specifying the `--unzip` option. PEXes built
+with `--unzip` will first unzip themselves into the Pex cache if not unzipped there already and
+then re-execute themselves from there. This can improve startup latency. Pex itself now uses this
+mode in our [PEX release](https://github.com/pantsbuild/pex/releases/download/v2.1.8/pex).
+
+* Better support unzip mode PEXes. (#941)
+ `PR #941 <https://github.com/pantsbuild/pex/pull/941>`_
+
+* Support an unzip toggle for PEXes. (#939)
+ `PR #939 <https://github.com/pantsbuild/pex/pull/939>`_
+
+* Ensure the interpreter path is a file (#938)
+ `PR #938 <https://github.com/pantsbuild/pex/pull/938>`_
+
+* Cache pip.pex. (#937)
+ `PR #937 <https://github.com/pantsbuild/pex/pull/937>`_
+
2.1.7
-----
diff --git a/pex/version.py b/pex/version.py
index 930579c1a..b24b6e806 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = '2.1.7'
+__version__ = '2.1.8'
|
fossasia__open-event-server-2429 | Add filter "Checked In" and "Not checked in" to attendees
In order to easily sort "checked in" and "not checked-in" it would be good to have the relevant options in the filters next to "completed, pending, expired".

| [
{
"content": "# -*- coding: utf-8 -*-\n\n##\n# Module for helper static variables\n##\n\n# Event Licences\n\nEVENT_LICENCES = {\n # Licence Name : ( Long Name, Description, Licence URL, Licence Logo, Licence Compact Logo )\n 'All rights reserved': (\n 'All rights reserved',\n u'The copyright... | [
{
"content": "# -*- coding: utf-8 -*-\n\n##\n# Module for helper static variables\n##\n\n# Event Licences\n\nEVENT_LICENCES = {\n # Licence Name : ( Long Name, Description, Licence URL, Licence Logo, Licence Compact Logo )\n 'All rights reserved': (\n 'All rights reserved',\n u'The copyright... | diff --git a/app/helpers/static.py b/app/helpers/static.py
index 2de510161d..67a0b9635c 100644
--- a/app/helpers/static.py
+++ b/app/helpers/static.py
@@ -210,7 +210,7 @@
('SGD', True, True),
('THB', True, True),
('TWD', True, True),
-
+ ('USD', True, True),
}
# Event Images with Event Topics and Subtopics
diff --git a/app/templates/gentelella/admin/event/tickets/attendees.html b/app/templates/gentelella/admin/event/tickets/attendees.html
index 566cbfe9d7..df2bbf6422 100644
--- a/app/templates/gentelella/admin/event/tickets/attendees.html
+++ b/app/templates/gentelella/admin/event/tickets/attendees.html
@@ -57,6 +57,13 @@
<label class="btn btn-default btn-responsive">
<input type="radio" name="show_state" autocomplete="off" value="Expired"> Expired
</label>
+ <label class="btn btn-default btn-responsive">
+ <input type="radio" name="show_state" autocomplete="off" value="checked_in"> Checked In
+ </label>
+ <label class="btn btn-default btn-responsive">
+ <input type="radio" name="show_state" autocomplete="off" value="not_checked_in"> Not Checked In
+ </label>
+
</div>
</div>
@@ -131,7 +138,7 @@ <h3>View Attendees</h3>
{% if order.status == 'completed' %}
{% if holder.checked_in %}
<button class="btn btn-warning holder-check-in-toggle" data-holder-id="{{ holder.id }}">
- Undo
+ Undo
</button>
{% else %}
<button class="btn btn-success holder-check-in-toggle" data-holder-id="{{ holder.id }}">
@@ -155,10 +162,13 @@ <h3>View Attendees</h3>
$.fn.dataTable.ext.search.push(
function (settings, data, dataIndex) {
var user_option = $("input[name=show_state]:checked").val();
- console.log(data);
var state = data[2].trim() || 'pending';
if (user_option === "all") {
return true;
+ } else if (user_option === 'checked_in') {
+ return data[8].trim().indexOf('Undo') !== -1
+ } else if (user_option === 'not_checked_in') {
+ return data[8].trim().indexOf('Check In') !== -1
} else if (user_option === state) {
return true;
}
@@ -214,11 +224,16 @@ <h3>View Attendees</h3>
success: function (result) {
$btn.prop("disabled", false);
if (result.status === "ok") {
- if(result.checked_in) {
+ if (result.checked_in) {
$btn.html("Undo").removeClass("btn-success").addClass("btn-warning");
} else {
$btn.html('<i class="fa fa-check fa-fw"></i> Check In').removeClass("btn-warning").addClass("btn-success");
}
+ var row = table.row($btn.closest('tr'));
+ var data = row.data();
+ data[8] = $btn.text();
+ row.invalidate();
+ table.draw();
} else {
$btn.html(oldText);
createSnackbar("There was an error while processing.", "Try Again", function () {
diff --git a/app/templates/gentelella/admin/event/wizard/step-1.html b/app/templates/gentelella/admin/event/wizard/step-1.html
index ff1c92ac7d..36049c8342 100644
--- a/app/templates/gentelella/admin/event/wizard/step-1.html
+++ b/app/templates/gentelella/admin/event/wizard/step-1.html
@@ -1503,12 +1503,11 @@ <h3 class="modal-title">Bank Instructions</h3>
}
};
}
-
var $stripeConnectedMessage = $("#stripe-connected-message");
$(".stripe-connect").click(function (e) {
e.preventDefault();
$.oauthpopup({
- path: "https://connect.stripe.com/oauth/authorize?response_type=code&client_id={{ settings.stripe_client_id }}&scope=read_write&redirect_uri={{ url_for('ticketing.stripe_callback', _external=true) }}",
+ path: "https://connect.stripe.com/oauth/authorize?response_type=code&client_id={{ key_settings.stripe_client_id }}&scope=read_write&redirect_uri={{ url_for('ticketing.stripe_callback', _external=true) }}",
callback: function () {
// TODO Disallow test accounts. Only accept live accounts.
if (1 || window.oauth_response.live_mode) {
|
Uberspace__lab-28 | Change project name to lab in config
| [
{
"content": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-\n#\n# Uberspace 7 lab documentation build configuration file, created by\n# sphinx-quickstart on Tue Feb 13 12:19:29 2018.\n#\n# This file is execfile()d with the current directory set to its\n# containing dir.\n#\n# Note that not all possible config... | [
{
"content": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-\n#\n# Uberspace 7 lab documentation build configuration file, created by\n# sphinx-quickstart on Tue Feb 13 12:19:29 2018.\n#\n# This file is execfile()d with the current directory set to its\n# containing dir.\n#\n# Note that not all possible config... | diff --git a/source/_templates/breadcrumbs.html b/source/_templates/breadcrumbs.html
index 39d3706f..81a84c88 100755
--- a/source/_templates/breadcrumbs.html
+++ b/source/_templates/breadcrumbs.html
@@ -32,7 +32,7 @@
<ul class="wy-breadcrumbs">
{% block breadcrumbs %}
- <li><a href="{{ pathto(master_doc) }}">{{ _('Manual') }}</a> »</li>
+ <li><a href="{{ pathto(master_doc) }}">{{ _('Lab') }}</a> »</li>
{% for doc in parents %}
<li><a href="{{ doc.link|e }}">{{ doc.title }}</a> »</li>
{% endfor %}
diff --git a/source/conf.py b/source/conf.py
index eea6ac81..e04ec227 100644
--- a/source/conf.py
+++ b/source/conf.py
@@ -47,7 +47,7 @@
master_doc = 'index'
# General information about the project.
-project = 'Uberspace 7 Lab'
+project = 'UberLab'
copyright = '2018, uberspace.de'
author = 'uberspace.de'
|
pennersr__django-allauth-2388 | Error when 500 template contains a django-allauth template tag
**Error message**:
- `AttributeError: 'NoneType' object has no attribute 'POST'`
**How to reproduce**:
1) Create a 500 template (`500.html` in your template directory) that includes a template tag from django-allauth. For me, it was `Google Sign In` button:
```html
<!--500.html-->
{% load socialaccount %}
<a class="nav-link" href="{% provider_login_url 'google' %}">Log In</a>
```
2. Add an endpoint that is handled by Django's default 500 handler (`handler500` in `django.conf.urls`, which by default points to `django.views.defaults.server_error`)
```python
# urls.py
from django.conf.urls import handler500
urlpatterns = [
# ...
path('500/', handler500, name='500'),
# ...
]
```
- The handler (`server_error`) renders the template: `return HttpResponseServerError(template.render())`
- `render` in `allauth/socialaccount/templatetags/socialaccount.py` is called
- `get_request_param` in `allauth/utils.py` is called in the line `next = get_request_param(request, 'next')`
- `return request.POST.get(param) or request.GET.get(param, default)` in `get_request_param` causes an error because `request` is `None` in this case
**Solution**:
- Add a guard statement to the function `get_request_param` in `allauth/utils.py`
- before a patch: Use a custom 500 handler instead of Django's default 500 handler
| [
{
"content": "import base64\nimport importlib\nimport json\nimport random\nimport re\nimport string\nimport unicodedata\nfrom collections import OrderedDict\n\nfrom django.contrib.auth import get_user_model\nfrom django.contrib.sites.models import Site\nfrom django.core.exceptions import ImproperlyConfigured\nf... | [
{
"content": "import base64\nimport importlib\nimport json\nimport random\nimport re\nimport string\nimport unicodedata\nfrom collections import OrderedDict\n\nfrom django.contrib.auth import get_user_model\nfrom django.contrib.sites.models import Site\nfrom django.core.exceptions import ImproperlyConfigured\nf... | diff --git a/AUTHORS b/AUTHORS
index 588e9b7a77..642ae957c4 100644
--- a/AUTHORS
+++ b/AUTHORS
@@ -66,6 +66,7 @@ Jeff Triplett
Jeremy Satterfield
Jerome Leclanche
Jesse Gerard Brands
+Jihoon Park
Jiyoon Ha
Joe Vanderstelt
John Bazik
diff --git a/allauth/utils.py b/allauth/utils.py
index fe366eae8e..d2c58cda68 100644
--- a/allauth/utils.py
+++ b/allauth/utils.py
@@ -299,4 +299,6 @@ def get_form_class(forms, form_id, default_form):
def get_request_param(request, param, default=None):
+ if request is None:
+ return default
return request.POST.get(param) or request.GET.get(param, default)
|
hylang__hy-1343 | REPL history is lost on (quit)
REPL history is not flushed to disk if the REPL is exited using `(quit)`.
A workaround is to remember to use `CTRL-D` to exit the REPL.
Would be nice if `(quit)` also worked.
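The underlying pattern can be sketched in isolation (names here are illustrative): a context manager that only runs cleanup after a normal `yield` skips it when the body raises `SystemExit`, which is what `(quit)` does; wrapping the `yield` in `try`/`finally` guarantees the cleanup runs.

```python
import contextlib

flushed = []

@contextlib.contextmanager
def completion():
    # Without try/finally, the line after `yield` never runs when the
    # REPL body raises SystemExit -- exactly what (quit) / (exit) do.
    try:
        yield
    finally:
        flushed.append("history written")

try:
    with completion():
        raise SystemExit  # simulates (quit) inside the REPL session
except SystemExit:
    pass

print(flushed)  # ['history written']
```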
| [
{
"content": "# Copyright 2017 the authors.\n# This file is part of Hy, which is free software licensed under the Expat\n# license. See the LICENSE.\n\nimport contextlib\nimport os\nimport re\nimport sys\n\nimport hy.macros\nimport hy.compiler\nfrom hy._compat import builtins, string_types\n\n\ndocomplete = Tru... | [
{
"content": "# Copyright 2017 the authors.\n# This file is part of Hy, which is free software licensed under the Expat\n# license. See the LICENSE.\n\nimport contextlib\nimport os\nimport re\nimport sys\n\nimport hy.macros\nimport hy.compiler\nfrom hy._compat import builtins, string_types\n\n\ndocomplete = Tru... | diff --git a/NEWS b/NEWS
index 3785792ec..300db30e5 100644
--- a/NEWS
+++ b/NEWS
@@ -20,6 +20,8 @@ Changes from 0.13.0
* String literals should no longer be interpreted as special forms or macros
* Tag macros (née sharp macros) whose names begin with `!` are no longer
mistaken for shebang lines
+ * Fixed a bug where REPL history wasn't saved if you quit the REPL with
+ `(quit)` or `(exit)`
Changes from 0.12.1
diff --git a/hy/completer.py b/hy/completer.py
index 0d9c906e9..c4de45cca 100644
--- a/hy/completer.py
+++ b/hy/completer.py
@@ -124,7 +124,8 @@ def completion(completer=None):
readline.parse_and_bind(readline_bind)
- yield
-
- if docomplete:
- readline.write_history_file(history)
+ try:
+ yield
+ finally:
+ if docomplete:
+ readline.write_history_file(history)
|
google-research__text-to-text-transfer-transformer-351 | Unable to import tensorflow_gcs_config in t5_trivia colab notebook
Upon running line `import tensorflow_gcs_config` (in t5_trivia colab notebook, setup section) I get this error,
```
---------------------------------------------------------------------------
NotImplementedError Traceback (most recent call last)
<ipython-input-2-3bb7f36f8553> in <module>()
----> 1 import tensorflow_gcs_config
1 frames
/usr/local/lib/python3.6/dist-packages/tensorflow_gcs_config/__init__.py in _load_library(filename, lib)
55 raise NotImplementedError(
56 "unable to open file: " +
---> 57 "{}, from paths: {}\ncaused by: {}".format(filename, filenames, errs))
58
59 _gcs_config_so = _load_library("_gcs_config_ops.so")
NotImplementedError: unable to open file: _gcs_config_ops.so, from paths: ['/usr/local/lib/python3.6/dist-packages/tensorflow_gcs_config/_gcs_config_ops.so']
caused by: ['/usr/local/lib/python3.6/dist-packages/tensorflow_gcs_config/_gcs_config_ops.so: undefined symbol: _ZN10tensorflow15OpKernelContext5inputEN4absl11string_viewEPPKNS_6TensorE']
```
`tf.__version__` is '2.3.0'
| [
{
"content": "# Copyright 2020 The T5 Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by appl... | [
{
"content": "# Copyright 2020 The T5 Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by appl... | diff --git a/t5/version.py b/t5/version.py
index f3b47b22..e50b2461 100644
--- a/t5/version.py
+++ b/t5/version.py
@@ -18,4 +18,4 @@
Stored in a separate file so that setup.py can reference the version without
pulling in all the dependencies in __init__.py.
"""
-__version__ = '0.6.3'
+__version__ = '0.6.4'
|
espnet__espnet-913 | matplotlib.use('Agg') fail
It results in a plot failure:
```
_tkinter.TclError: no display name and no $DISPLAY environment variable
```
I fixed this by applying a patch on `espnet/nets/pytorch_backend/transformer/plot.py`
```
@@ -1,5 +1,6 @@
import logging
-
+import matplotlib
+matplotlib.use('Agg')
import matplotlib.pyplot as plt
from espnet.asr import asr_utils
```
| [
{
"content": "import logging\n\nimport matplotlib.pyplot as plt\n\nfrom espnet.asr import asr_utils\n\n\ndef _plot_and_save_attention(att_w, filename):\n # dynamically import matplotlib due to not found error\n from matplotlib.ticker import MaxNLocator\n import os\n d = os.path.dirname(filename)\n ... | [
{
"content": "import logging\n\nfrom espnet.asr import asr_utils\nimport matplotlib.pyplot as plt\n\n\ndef _plot_and_save_attention(att_w, filename):\n # dynamically import matplotlib due to not found error\n from matplotlib.ticker import MaxNLocator\n import os\n d = os.path.dirname(filename)\n ... | diff --git a/espnet/nets/pytorch_backend/transformer/plot.py b/espnet/nets/pytorch_backend/transformer/plot.py
index 66985df55e0..af5f0ef6e68 100644
--- a/espnet/nets/pytorch_backend/transformer/plot.py
+++ b/espnet/nets/pytorch_backend/transformer/plot.py
@@ -1,8 +1,7 @@
import logging
-import matplotlib.pyplot as plt
-
from espnet.asr import asr_utils
+import matplotlib.pyplot as plt
def _plot_and_save_attention(att_w, filename):
|
mkdocs__mkdocs-1921 | Unexpected behaviour with page.is_homepage
Starting a new site and rolling my own theme. Came across some slightly odd behaviour.
Mkdocs version 1.0.4
Python version 3.7.1
**Expected:**
`page.is_homepage` evaluates to True on the home (index.md) of the site, and false on all other pages.
**Actual:**
`page.is_homepage` evaluates to True on the home (index.md), and on any other index.md that is included in the nav object without nesting.
**Examples:**
The unexpected result:
```
nav:
- Home: index.md <--- page.is_homepage evaluates to True
- About: about.md <--- page.is_homepage evaluates to False
- Projects: projects/index.md <--- page.is_homepage evaluates to True
```
Changing the filename causes it to evaluate to false:
```
nav:
- Home: index.md <--- page.is_homepage evaluates to True
- About: about.md <--- page.is_homepage evaluates to False
- Projects: projects/test.md <--- page.is_homepage evaluates to False
```
If I tweak it a bit, so that the sections are nested, then it evaluates to false as I'd expect:
```
nav:
- About:
- About: about.md <--- page.is_homepage evaluates to False
- Projects:
- Project home: projects/index.md <--- page.is_homepage evaluates to False
```
This feels like a bug - especially as simply changing the markdown file name causes the behaviour to change.
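The intended behaviour can be sketched as a standalone check (parameter names mirror the mkdocs `Page` properties, but this function is only illustrative): a nested index page is both top-level in the nav and an index file, so distinguishing the real homepage needs the destination URL as well.

```python
def is_homepage(is_top_level, is_index, file_url):
    # Only the page whose rendered URL is the site root "." is the
    # homepage; projects/index.md resolves to "projects/" instead.
    return is_top_level and is_index and file_url == "."

print(is_homepage(True, True, "."))          # True  -> index.md
print(is_homepage(True, True, "projects/"))  # False -> projects/index.md
```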
| [
{
"content": "# coding: utf-8\n\nfrom __future__ import unicode_literals\n\nimport os\nimport io\nimport datetime\nimport logging\n\nimport markdown\nfrom markdown.extensions import Extension\nfrom markdown.treeprocessors import Treeprocessor\nfrom markdown.util import AMP_SUBSTITUTE\n\nfrom mkdocs.structure.to... | [
{
"content": "# coding: utf-8\n\nfrom __future__ import unicode_literals\n\nimport os\nimport io\nimport datetime\nimport logging\n\nimport markdown\nfrom markdown.extensions import Extension\nfrom markdown.treeprocessors import Treeprocessor\nfrom markdown.util import AMP_SUBSTITUTE\n\nfrom mkdocs.structure.to... | diff --git a/docs/about/release-notes.md b/docs/about/release-notes.md
index bb33917c3d..35564ffab2 100644
--- a/docs/about/release-notes.md
+++ b/docs/about/release-notes.md
@@ -56,6 +56,7 @@ your global navigation uses more than one level, things will likely be broken.
### Other Changes and Additions to Version 1.1
+* Bugfix: Ensure nested index pages do not get identified as the homepage (#1919).
* Bugfix: Properly identify deployment version (#1879).
* Bugfix: Properly build `ValidationError` message for `custom_dir` (#1849).
* Bugfix: Exclude Markdown files and READMEs from theme (#1766).
diff --git a/mkdocs/structure/pages.py b/mkdocs/structure/pages.py
index b032d7799b..a15a28a83e 100644
--- a/mkdocs/structure/pages.py
+++ b/mkdocs/structure/pages.py
@@ -93,7 +93,7 @@ def is_top_level(self):
@property
def is_homepage(self):
- return self.is_top_level and self.is_index
+ return self.is_top_level and self.is_index and self.file.url == '.'
@property
def url(self):
diff --git a/mkdocs/tests/structure/page_tests.py b/mkdocs/tests/structure/page_tests.py
index f229c8cbe5..fd015eadf2 100644
--- a/mkdocs/tests/structure/page_tests.py
+++ b/mkdocs/tests/structure/page_tests.py
@@ -70,6 +70,54 @@ def test_nested_index_page(self):
self.assertEqual(pg.title, 'Foo')
self.assertEqual(pg.toc, [])
+ def test_nested_index_page_no_parent(self):
+ cfg = load_config(docs_dir=self.DOCS_DIR)
+ fl = File('sub1/index.md', cfg['docs_dir'], cfg['site_dir'], cfg['use_directory_urls'])
+ pg = Page('Foo', fl, cfg)
+ pg.parent = None # non-homepage at nav root level; see #1919.
+ self.assertEqual(pg.url, 'sub1/')
+ self.assertEqual(pg.abs_url, None)
+ self.assertEqual(pg.canonical_url, None)
+ self.assertEqual(pg.edit_url, None)
+ self.assertEqual(pg.file, fl)
+ self.assertEqual(pg.content, None)
+ self.assertFalse(pg.is_homepage)
+ self.assertTrue(pg.is_index)
+ self.assertTrue(pg.is_page)
+ self.assertFalse(pg.is_section)
+ self.assertTrue(pg.is_top_level)
+ self.assertEqual(pg.markdown, None)
+ self.assertEqual(pg.meta, {})
+ self.assertEqual(pg.next_page, None)
+ self.assertEqual(pg.parent, None)
+ self.assertEqual(pg.previous_page, None)
+ self.assertEqual(pg.title, 'Foo')
+ self.assertEqual(pg.toc, [])
+
+ def test_nested_index_page_no_parent_no_directory_urls(self):
+ cfg = load_config(docs_dir=self.DOCS_DIR, use_directory_urls=False)
+ fl = File('sub1/index.md', cfg['docs_dir'], cfg['site_dir'], cfg['use_directory_urls'])
+ pg = Page('Foo', fl, cfg)
+ pg.parent = None # non-homepage at nav root level; see #1919.
+ self.assertEqual(pg.url, 'sub1/index.html')
+ self.assertEqual(pg.abs_url, None)
+ self.assertEqual(pg.canonical_url, None)
+ self.assertEqual(pg.edit_url, None)
+ self.assertEqual(pg.file, fl)
+ self.assertEqual(pg.content, None)
+ self.assertFalse(pg.is_homepage)
+ self.assertTrue(pg.is_index)
+ self.assertTrue(pg.is_page)
+ self.assertFalse(pg.is_section)
+ self.assertTrue(pg.is_top_level)
+ self.assertEqual(pg.markdown, None)
+ self.assertEqual(pg.meta, {})
+ self.assertEqual(pg.next_page, None)
+ self.assertEqual(pg.parent, None)
+ self.assertEqual(pg.previous_page, None)
+ self.assertEqual(pg.title, 'Foo')
+ self.assertEqual(pg.toc, [])
+
def test_nested_nonindex_page(self):
cfg = load_config(docs_dir=self.DOCS_DIR)
fl = File('sub1/non-index.md', cfg['docs_dir'], cfg['site_dir'], cfg['use_directory_urls'])
|
pex-tool__pex-975 | Release 2.1.10
On the docket:
+ [x] Improve Pex packaging. (#961)
+ [x] Make the interpreter cache deterministic. (#960)
+ [x] Fix deprecation warning for `rU` mode (#956)
+ [x] Fix runtime resolve error message generation. (#955)
+ [x] Kill dead code. (#954)
+ [x] Many Pex tests fail under Python 2.7 in CI #967
+ [x] Add a `--local` mode for packaging the Pex PEX. #971
+ [x] Split Pex resolve API. (#970)
+ [x] Can't run PEX file when a dependency's wheel includes a build tag #964
+ [x] Expose network configuration in pex options. #803
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.1.9'\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.1.10'\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 12c003ef3..bc1013393 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,53 @@
Release Notes
=============
+2.1.10
+------
+
+This release focuses on the resolver API and resolution performance. Pex 2 resolving using Pip is
+now at least at performance parity with Pex 1 in all studied cases and most often is 5% to 10%
+faster.
+
+As part of the resolution performance work, Pip networking configuration is now exposed via Pex CLI
+options and the ``NetworkConfiguration`` API type / new ``resolver.resolve`` API parameter.
+
+With network configuration now wired up, the ``PEX_HTTP_RETRIES`` and ``PEX_HTTP_TIMEOUT`` env var
+support in Pex 1 that was never wired into Pex 2 is now dropped in favor of passing ``--retries``
+and ``--timeout`` via the CLI (See: `Issue #94 <https://github.com/pantsbuild/pex/issues/94>`_)
+
+* Expose Pip network configuration. (#974)
+ `PR #974 <https://github.com/pantsbuild/pex/pull/974>`_
+
+* Restore handling for bad wheel filenames to ``.can_add()`` (#973)
+ `PR #973 <https://github.com/pantsbuild/pex/pull/973>`_
+
+* Fix wheel filename parsing in PEXEnvironment.can_add (#965)
+ `PR #965 <https://github.com/pantsbuild/pex/pull/965>`_
+
+* Split Pex resolve API. (#970)
+ `PR #970 <https://github.com/pantsbuild/pex/pull/970>`_
+
+* Add a ``--local`` mode for packaging the Pex PEX. (#971)
+ `PR #971 <https://github.com/pantsbuild/pex/pull/971>`_
+
+* Constrain the virtualenv version used by tox. (#968)
+ `PR #968 <https://github.com/pantsbuild/pex/pull/968>`_
+
+* Improve Pex packaging. (#961)
+ `PR #961 <https://github.com/pantsbuild/pex/pull/961>`_
+
+* Make the interpreter cache deterministic. (#960)
+ `PR #960 <https://github.com/pantsbuild/pex/pull/960>`_
+
+* Fix deprecation warning for ``rU`` mode (#956)
+ `PR #956 <https://github.com/pantsbuild/pex/pull/956>`_
+
+* Fix runtime resolve error message generation. (#955)
+ `PR #955 <https://github.com/pantsbuild/pex/pull/955>`_
+
+* Kill dead code. (#954)
+ `PR #954 <https://github.com/pantsbuild/pex/pull/954>`_
+
2.1.9
-----
diff --git a/pex/version.py b/pex/version.py
index a6ec8e0ae..ef5420cfc 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = '2.1.9'
+__version__ = '2.1.10'
|
numba__numba-1356 | Use CPython allocator in NRT
NRT should optionally use the CPython memory allocation functions (when imported from CPython). This would allow Numba-allocated memory to be seen by other utilities such as `sys.getallocatedblocks()`, `sys.debugmallocstats()`, and `tracemalloc`.
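The observability this issue asks for can be previewed with a minimal, stdlib-only sketch (no Numba required): once an allocator routes through CPython's memory functions, `tracemalloc` can attribute allocations to the allocating line — the same pattern the `TestTracemalloc` case added in the diff below uses for an NRT-backed array.

```python
import tracemalloc

def allocate():
    # A single large allocation; tracemalloc attributes it to this line.
    return bytearray(10**6)

tracemalloc.start()
before = tracemalloc.take_snapshot()
buf = allocate()
after = tracemalloc.take_snapshot()
tracemalloc.stop()

# Sorted by size delta, the top entry is our ~1 MB buffer.
top = after.compare_to(before, "lineno")[0]
print(top.size >= 10**6)
del buf
```

Memory allocated via plain libc `malloc` (as NRT did before this change) is invisible to this machinery, which is exactly why the patch rebinds NRT to `PyMem_RawMalloc`/`PyMem_RawRealloc`/`PyMem_RawFree`.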
| [
{
"content": "from __future__ import print_function, absolute_import, division\n\nfrom collections import namedtuple\n\nfrom . import atomicops\nfrom llvmlite import binding as ll\n\nfrom numba.utils import finalize as _finalize\nfrom . import _nrt_python as _nrt\n\n_nrt_mstats = namedtuple(\"nrt_mstats\", [\"a... | [
{
"content": "from __future__ import print_function, absolute_import, division\n\nfrom collections import namedtuple\n\nfrom . import atomicops\nfrom llvmlite import binding as ll\n\nfrom numba.utils import finalize as _finalize\nfrom . import _nrt_python as _nrt\n\n_nrt_mstats = namedtuple(\"nrt_mstats\", [\"a... | diff --git a/numba/_pymodule.h b/numba/_pymodule.h
index 89787ce4d6d..182817a0aa1 100644
--- a/numba/_pymodule.h
+++ b/numba/_pymodule.h
@@ -38,8 +38,9 @@
#define Py_uhash_t unsigned long
#endif
-#if PY_MAJOR_VERSION < 3 || (PY_MAJOR_VERSION == 3 && Py_MINOR_VERSION < 4)
+#if PY_MAJOR_VERSION < 3 || (PY_MAJOR_VERSION == 3 && PY_MINOR_VERSION < 4)
#define PyMem_RawMalloc malloc
+ #define PyMem_RawRealloc realloc
#define PyMem_RawFree free
#endif
diff --git a/numba/runtime/_nrt_python.c b/numba/runtime/_nrt_python.c
index 218ca957f30..f1b00e73f50 100644
--- a/numba/runtime/_nrt_python.c
+++ b/numba/runtime/_nrt_python.c
@@ -20,6 +20,14 @@ memsys_shutdown(PyObject *self, PyObject *args) {
Py_RETURN_NONE;
}
+static PyObject *
+memsys_use_cpython_allocator(PyObject *self, PyObject *args) {
+ NRT_MemSys_set_allocator(PyMem_RawMalloc,
+ PyMem_RawRealloc,
+ PyMem_RawFree);
+ Py_RETURN_NONE;
+}
+
static
PyObject*
memsys_set_atomic_inc_dec(PyObject *self, PyObject *args) {
@@ -518,6 +526,7 @@ NRT_decref(MemInfo* mi) {
static PyMethodDef ext_methods[] = {
#define declmethod(func) { #func , ( PyCFunction )func , METH_VARARGS , NULL }
#define declmethod_noargs(func) { #func , ( PyCFunction )func , METH_NOARGS, NULL }
+ declmethod_noargs(memsys_use_cpython_allocator),
declmethod_noargs(memsys_shutdown),
declmethod(memsys_set_atomic_inc_dec),
declmethod(memsys_set_atomic_cas),
diff --git a/numba/runtime/nrt.c b/numba/runtime/nrt.c
index f68f80e805a..220b2ec22dd 100644
--- a/numba/runtime/nrt.c
+++ b/numba/runtime/nrt.c
@@ -22,6 +22,21 @@ struct MemInfo {
};
+/*
+ * Misc helpers.
+ */
+
+static void nrt_fatal_error(const char *msg)
+{
+ fprintf(stderr, "Fatal Numba error: %s\n", msg);
+ fflush(stderr); /* it helps in Windows debug build */
+
+#if defined(MS_WINDOWS) && defined(_DEBUG)
+ DebugBreak();
+#endif
+ abort();
+}
+
/*
* Global resources.
*/
@@ -35,6 +50,12 @@ struct MemSys{
int shutting;
/* Stats */
size_t stats_alloc, stats_free, stats_mi_alloc, stats_mi_free;
+ /* System allocation functions */
+ struct {
+ NRT_malloc_func malloc;
+ NRT_realloc_func realloc;
+ NRT_free_func free;
+ } allocator;
};
/* The Memory System object */
@@ -42,6 +63,10 @@ static MemSys TheMSys;
void NRT_MemSys_init(void) {
memset(&TheMSys, 0, sizeof(MemSys));
+ /* Bind to libc allocator */
+ TheMSys.allocator.malloc = malloc;
+ TheMSys.allocator.realloc = realloc;
+ TheMSys.allocator.free = free;
}
void NRT_MemSys_shutdown(void) {
@@ -54,6 +79,22 @@ void NRT_MemSys_shutdown(void) {
NRT_MemSys_set_atomic_cas_stub();
}
+void NRT_MemSys_set_allocator(NRT_malloc_func malloc_func,
+ NRT_realloc_func realloc_func,
+ NRT_free_func free_func)
+{
+ if ((malloc_func != TheMSys.allocator.malloc ||
+ realloc_func != TheMSys.allocator.realloc ||
+ free_func != TheMSys.allocator.free) &&
+ (TheMSys.stats_alloc != TheMSys.stats_free ||
+ TheMSys.stats_mi_alloc != TheMSys.stats_mi_free)) {
+ nrt_fatal_error("cannot change allocator while blocks are allocated");
+ }
+ TheMSys.allocator.malloc = malloc_func;
+ TheMSys.allocator.realloc = realloc_func;
+ TheMSys.allocator.free = free_func;
+}
+
void NRT_MemSys_set_atomic_inc_dec(atomic_inc_dec_func inc,
atomic_inc_dec_func dec)
{
@@ -122,17 +163,6 @@ void NRT_MemSys_set_atomic_cas_stub(void) {
NRT_MemSys_set_atomic_cas(nrt_testing_atomic_cas);
}
-static void nrt_fatal_error(const char *msg)
-{
- fprintf(stderr, "Fatal Numba error: %s\n", msg);
- fflush(stderr); /* it helps in Windows debug build */
-
-#if defined(MS_WINDOWS) && defined(_DEBUG)
- DebugBreak();
-#endif
- abort();
-}
-
/*
* The MemInfo structure.
@@ -328,14 +358,14 @@ void *NRT_MemInfo_varsize_realloc(MemInfo *mi, size_t size)
*/
void* NRT_Allocate(size_t size) {
- void *ptr = malloc(size);
+ void *ptr = TheMSys.allocator.malloc(size);
NRT_Debug(nrt_debug_print("NRT_Allocate bytes=%zu ptr=%p\n", size, ptr));
TheMSys.atomic_inc(&TheMSys.stats_alloc);
return ptr;
}
void *NRT_Reallocate(void *ptr, size_t size) {
- void *new_ptr = realloc(ptr, size);
+ void *new_ptr = TheMSys.allocator.realloc(ptr, size);
NRT_Debug(nrt_debug_print("NRT_Reallocate bytes=%zu ptr=%p -> %p\n",
size, ptr, new_ptr));
return new_ptr;
@@ -343,6 +373,6 @@ void *NRT_Reallocate(void *ptr, size_t size) {
void NRT_Free(void *ptr) {
NRT_Debug(nrt_debug_print("NRT_Free %p\n", ptr));
- free(ptr);
+ TheMSys.allocator.free(ptr);
TheMSys.atomic_inc(&TheMSys.stats_free);
}
diff --git a/numba/runtime/nrt.h b/numba/runtime/nrt.h
index 5c2cfe5432d..10510ec1acb 100644
--- a/numba/runtime/nrt.h
+++ b/numba/runtime/nrt.h
@@ -36,6 +36,11 @@ typedef int (*atomic_cas_func)(void * volatile *ptr, void *cmp, void *repl,
typedef struct MemInfo MemInfo;
typedef struct MemSys MemSys;
+typedef void *(*NRT_malloc_func)(size_t size);
+typedef void *(*NRT_realloc_func)(void *ptr, size_t new_size);
+typedef void (*NRT_free_func)(void *ptr);
+
+
/* Memory System API */
/* Initialize the memory system */
@@ -44,6 +49,11 @@ void NRT_MemSys_init(void);
/* Shutdown the memory system */
void NRT_MemSys_shutdown(void);
+/*
+ * Register the system allocation functions
+ */
+void NRT_MemSys_set_allocator(NRT_malloc_func, NRT_realloc_func, NRT_free_func);
+
/*
* Register the atomic increment and decrement functions
*/
diff --git a/numba/runtime/nrt.py b/numba/runtime/nrt.py
index b337a18f30e..721cf769db4 100644
--- a/numba/runtime/nrt.py
+++ b/numba/runtime/nrt.py
@@ -98,7 +98,8 @@ def get_allocation_stats(self):
# Alias to _nrt_python._MemInfo
MemInfo = _nrt._MemInfo
-# Create uninitialized runtime
+# Create runtime
+_nrt.memsys_use_cpython_allocator()
rtsys = _Runtime()
# Install finalizer
diff --git a/numba/tests/test_nrt.py b/numba/tests/test_nrt.py
index 94f957f793d..6e1b2add502 100644
--- a/numba/tests/test_nrt.py
+++ b/numba/tests/test_nrt.py
@@ -1,7 +1,13 @@
from __future__ import absolute_import, division, print_function
+import math
+import os
+import sys
+
import numpy as np
+
from numba import unittest_support as unittest
+from numba import njit
from numba.runtime import rtsys
from numba.config import PYVERSION
from .support import MemoryLeakMixin
@@ -164,15 +170,71 @@ def test_buffer(self):
# consumed by another thread.
+@unittest.skipUnless(sys.version_info >= (3, 4),
+ "need Python 3.4+ for the tracemalloc module")
+class TestTracemalloc(unittest.TestCase):
+ """
+ Test NRT-allocated memory can be tracked by tracemalloc.
+ """
+
+ def measure_memory_diff(self, func):
+ import tracemalloc
+ tracemalloc.start()
+ try:
+ before = tracemalloc.take_snapshot()
+ # Keep the result and only delete it after taking a snapshot
+ res = func()
+ after = tracemalloc.take_snapshot()
+ del res
+ return after.compare_to(before, 'lineno')
+ finally:
+ tracemalloc.stop()
+
+ def test_snapshot(self):
+ N = 1000000
+ dtype = np.int8
+
+ @njit
+ def alloc_nrt_memory():
+ """
+ Allocate and return a large array.
+ """
+ return np.empty(N, dtype)
+
+ def keep_memory():
+ return alloc_nrt_memory()
+
+ def release_memory():
+ alloc_nrt_memory()
+
+ alloc_lineno = keep_memory.__code__.co_firstlineno + 1
+
+ # Warmup JIT
+ alloc_nrt_memory()
+
+ # The large NRT-allocated array should appear topmost in the diff
+ diff = self.measure_memory_diff(keep_memory)
+ stat = diff[0]
+ # There is a slight overhead, so the allocated size won't exactly be N
+ self.assertGreaterEqual(stat.size, N)
+ self.assertLess(stat.size, N * 1.01)
+ frame = stat.traceback[0]
+ self.assertEqual(os.path.basename(frame.filename), "test_nrt.py")
+ self.assertEqual(frame.lineno, alloc_lineno)
+
+ # If NRT memory is released before taking a snapshot, it shouldn't
+ # appear.
+ diff = self.measure_memory_diff(release_memory)
+ stat = diff[0]
+ # Something else appears, but nothing the magnitude of N
+ self.assertLess(stat.size, N * 0.01)
+
+
class TestNRTIssue(MemoryLeakMixin, unittest.TestCase):
def test_issue_with_refct_op_pruning(self):
"""
GitHub Issue #1244 https://github.com/numba/numba/issues/1244
"""
- from numba import njit
- import numpy as np
- import math
-
@njit
def calculate_2D_vector_mag(vector):
x, y = vector
|
DjangoGirls__djangogirls-785 | return paginator.Paginator(self._items(), self.limit)
Sentry Issue: [DJANGO-GIRLS-WEBSITE-3V](https://sentry.io/organizations/django-girls/issues/3236790374/?referrer=github_integration)
```
return paginator.Paginator(self._items(), self.limit)
```
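The traceback points at Django's sitemap framework handing `Story.objects.all()` to `Paginator`, which likely warns (`UnorderedObjectListWarning`) on unordered querysets because page boundaries are only stable over an ordered sequence; the fix adds `.order_by('-created')`. A framework-free sketch of why ordering matters for pagination:

```python
def paginate(items, per_page, page):
    # Pages are fixed-width slices; they are only reproducible
    # when the underlying sequence has a stable order.
    start = (page - 1) * per_page
    return items[start:start + per_page]

records = [("c", 3), ("a", 1), ("b", 2)]
# Ordering first (analogous to the queryset's .order_by('-created'))
# makes every page deterministic across requests.
ordered = sorted(records, key=lambda r: r[1], reverse=True)
print(paginate(ordered, 2, 1))  # [('c', 3), ('b', 2)]
```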
| [
{
"content": "from django.contrib.sitemaps import Sitemap\n\nfrom .models import Story\n\n\nclass BlogSiteMap(Sitemap):\n priority = 0.5\n\n def items(self):\n return Story.objects.all()\n\n def location(self, item):\n url = item.post_url\n if url is not None and 'http://' in url:\... | [
{
"content": "from django.contrib.sitemaps import Sitemap\n\nfrom .models import Story\n\n\nclass BlogSiteMap(Sitemap):\n priority = 0.5\n\n def items(self):\n return Story.objects.all().order_by('-created')\n\n def location(self, item):\n url = item.post_url\n if url is not None a... | diff --git a/story/sitemap.py b/story/sitemap.py
index 95f2d36b2..44a8eb71f 100644
--- a/story/sitemap.py
+++ b/story/sitemap.py
@@ -7,7 +7,7 @@ class BlogSiteMap(Sitemap):
priority = 0.5
def items(self):
- return Story.objects.all()
+ return Story.objects.all().order_by('-created')
def location(self, item):
url = item.post_url
|
mlcommons__GaNDLF-537 | Radiology DataLoader takes up a *lot* memory during certain conditions
**Describe the bug**
During sanity checking of subjects, the queue construction seems to take up a lot of memory.
**To Reproduce**
Steps to reproduce the behavior:
1. Have a ridiculous number of subjects on a small machine (e.g., 10k on a machine with 16G RAM)
2. Start training on rad mode
3. See error sometime during/after queue construction:
```bash
## last message
Constructing queue for train data: 100%|██████████| 8681/8681 [07:57<00:00, 18.19it/s]
## failure with message related to exceeded RAM usage
```
**Expected behavior**
There should not be any failure at this stage.
**Screenshots**
N.A>
**GaNDLF Version**
<!-- Put the output of the following command:
python -c 'import GANDLF as g;print(g.__version__)'
-->
0.0.16-dev
**Desktop (please complete the following information):**
CentOS 7
**Additional context**
N.A.
| [
{
"content": "#!/usr/bin/env python\n\n\"\"\"The setup script.\"\"\"\n\n\nimport os\nfrom setuptools import setup, find_packages\nfrom setuptools.command.install import install\nfrom setuptools.command.develop import develop\nfrom setuptools.command.egg_info import egg_info\n\nwith open(\"README.md\") as readme... | [
{
"content": "#!/usr/bin/env python\n\n\"\"\"The setup script.\"\"\"\n\n\nimport os\nfrom setuptools import setup, find_packages\nfrom setuptools.command.install import install\nfrom setuptools.command.develop import develop\nfrom setuptools.command.egg_info import egg_info\n\nwith open(\"README.md\") as readme... | diff --git a/setup.py b/setup.py
index 74d1422d8..46241f78e 100644
--- a/setup.py
+++ b/setup.py
@@ -53,6 +53,7 @@ def run(self):
"numpy==1.22.0",
"scipy",
"SimpleITK!=2.0.*",
+ "SimpleITK!=2.2.1", # https://github.com/mlcommons/GaNDLF/issues/536
"torchvision",
"tqdm",
"torchio==0.18.75",
|
pex-tool__pex-792 | Release 2.0.0
On the docket:
+ [x] Use pip for resolving and building distributions. #788
+ [x] Pex defaults to reproduceable builds. #791
That one issue closes out a slew of others, partially documented in #686.
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '1.6.12'\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '2.0.0'\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index ae7de56f8..2b97c531c 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,58 @@
Release Notes
=============
+2.0.0
+-----
+
+Pex 2.0.0 is cut on the advent of a large, mostly internal change for typical
+use cases: it now uses vendored pip to perform resolves and wheel builds. This
+fixes a large number of compatibility and correctness bugs as well as gaining
+feature support from pip including handling manylinux2010 and manylinux2014 as
+well as VCS requirements and support for PEP-517 & PEP-518 builds.
+
+API changes to be wary of:
+
+* The egg distribution format is no longer supported.
+* The deprecated ``--interpreter-cache-dir`` CLI option was removed.
+* The ``--cache-ttl`` CLI option and ``cache_ttl`` resolver API argument were
+ removed.
+* The resolver API replaced ``fetchers`` with a list of ``indexes`` and a list
+ of ``find_links`` repos.
+* The resolver API removed (http) ``context`` which is now automatically
+ handled.
+* The resolver API removed ``precedence`` which is now pip default precedence:
+ wheels when available and not ruled out via the ``--no-wheel`` CLI option or
+ ``use_wheel=False`` API argument.
+* The ``--platform`` CLI option and ``platform`` resolver API argument now must
+ be full platform strings that include platform, implementation, version and
+ abi; e.g.: ``--platform=macosx-10.13-x86_64-cp-36-m``.
+* The ``--manylinux`` CLI option and ``use_manylinux`` resolver API argument
+ were removed. Instead, to resolve manylinux wheels for a foreign platform,
+ specify the manylinux platform to target with an explicit ``--platform`` CLI
+ flag or ``platform`` resolver API argument; e.g.:
+ ``--platform=manylinux2010-x86_64-cp-36-m``.
+
+In addition, Pex 2.0.0 now builds reproduceable pexes by default; ie:
+
+* Python modules embedded in the pex are not pre-compiled (pass --compile if
+ you want this).
+* The timestamps for Pex file zip entries default to midnight on
+ January 1, 1980 (pass --use-system-time to change this).
+
+This finishes off the effort tracked by
+`Issue #716 <https://github.com/pantsbuild/pex/pull/718>`_
+
+Changes in this release:
+
+* Pex defaults to reproduceable builds. (#791)
+ `PR #791 <https://github.com/pantsbuild/pex/pull/791>`_
+
+* Use pip for resolving and building distributions. (#788)
+ `PR #788 <https://github.com/pantsbuild/pex/pull/788>`_
+
+* Bias selecting the current interpreter. (#783)
+ `PR #783 <https://github.com/pantsbuild/pex/pull/783>`_
+
1.6.12
------
diff --git a/pex/version.py b/pex/version.py
index 23734e89b..772804dc4 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = '1.6.12'
+__version__ = '2.0.0'
|
pex-tool__pex-1191 | Release 2.1.26
On the docket:
+ [x] Pex requirement parsing is tripped up by files in the CWD with the same name as requirements' project names. #1188
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.25\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.26\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index f63fa42fa..03ecea3f8 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,15 @@
Release Notes
=============
+2.1.26
+------
+
+This is a hotfix release that fixes requirement parsing when there is a local file in the CWD with
+the same name as the project name of a remote requirement to be resolved.
+
+* Requirement parsing handles local non-dist files. (#1190)
+ `PR #1190 <https://github.com/pantsbuild/pex/pull/1190>`_
+
2.1.25
------
diff --git a/pex/version.py b/pex/version.py
index d89e91f46..d49dc67b8 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.25"
+__version__ = "2.1.26"
|
pex-tool__pex-1139 | Release 2.1.22
On the docket:
+ [x] Fix `--help-variables` docs. #1113
+ [x] pex binary hangs on startup at atomic_directory #1119
+ [x] pex vendoring does not (always) isolate itself from a colliding setuptools install in site-packages #1031
+ [x] Remove long deprecated support for _pex module. #1135
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.21\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.22\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index c7101cdd1..9a14e88b8 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,76 @@
Release Notes
=============
+2.1.22
+------
+
+This release fixes a deadlock that could be experienced when building
+PEX files in highly concurrent environments in addition to fixing
+`pex --help-variables` output.
+
+A new suite of PEX tools is now available in Pex itself and any PEXes
+built with the new `--include-tools` option. Use
+`PEX_TOOLS=1 pex --help` to find out more about the available tools and
+their usage.
+
+Finally, the long deprecated exposure of the Pex APIs through `_pex` has
+been removed. To use the Pex APIs you must include pex as a dependency
+in your PEX file.
+
+* Add a dependency graph tool. (#1132)
+ `PR #1132 <https://github.com/pantsbuild/pex/pull/1132>`_
+
+* Add a venv tool. (#1128)
+ `PR #1128 <https://github.com/pantsbuild/pex/pull/1128>`_
+
+* Remove long deprecated support for _pex module. (#1135)
+ `PR #1135 <https://github.com/pantsbuild/pex/pull/1135>`_
+
+* Add an interpreter tool. (#1131)
+ `PR #1131 <https://github.com/pantsbuild/pex/pull/1131>`_
+
+* Escape venvs unless PEX_INHERIT_PATH is requested. (#1130)
+ `PR #1130 <https://github.com/pantsbuild/pex/pull/1130>`_
+
+* Improve `PythonInterpreter` venv support. (#1129)
+ `PR #1129 <https://github.com/pantsbuild/pex/pull/1129>`_
+
+* Add support for PEX runtime tools & an info tool. (#1127)
+ `PR #1127 <https://github.com/pantsbuild/pex/pull/1127>`_
+
+* Exclusive atomic_directory always unlocks. (#1126)
+ `PR #1126 <https://github.com/pantsbuild/pex/pull/1126>`_
+
+* Fix `PythonInterpreter` binary normalization. (#1125)
+ `PR #1125 <https://github.com/pantsbuild/pex/pull/1125>`_
+
+* Add a `requires_dists` function. (#1122)
+ `PR #1122 <https://github.com/pantsbuild/pex/pull/1122>`_
+
+* Add an `is_exe` helper. (#1123)
+ `PR #1123 <https://github.com/pantsbuild/pex/pull/1123>`_
+
+* Fix req parsing for local archives & projects. (#1121)
+ `PR #1121 <https://github.com/pantsbuild/pex/pull/1121>`_
+
+* Improve PEXEnvironment constructor ergonomics. (#1120)
+ `PR #1120 <https://github.com/pantsbuild/pex/pull/1120>`_
+
+* Fix `safe_open` for single element relative paths. (#1118)
+ `PR #1118 <https://github.com/pantsbuild/pex/pull/1118>`_
+
+* Add URLFetcher IT. (#1116)
+ `PR #1116 <https://github.com/pantsbuild/pex/pull/1116>`_
+
+* Implement full featured requirement parsing. (#1114)
+ `PR #1114 <https://github.com/pantsbuild/pex/pull/1114>`_
+
+* Fix `--help-variables` docs. (#1113)
+ `PR #1113 <https://github.com/pantsbuild/pex/pull/1113>`_
+
+* Switch from optparse to argparse. (#1083)
+ `PR #1083 <https://github.com/pantsbuild/pex/pull/1083>`_
+
2.1.21
------
diff --git a/pex/version.py b/pex/version.py
index 8b44b5e28..16163f6f4 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.21"
+__version__ = "2.1.22"
|
pex-tool__pex-777 | Release 1.6.12
On the docket:
+ [x] PythonInterpreter: support python binary names with single letter suffixes #769
+ [x] Pex should support some form of verifiably reproducible resolve. #772
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '1.6.11'\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = '1.6.12'\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 6c2163b2d..ae7de56f8 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,18 @@
Release Notes
=============
+1.6.12
+------
+
+This release adds the `--intransitive` option to support pre-resolved requirements
+lists and allows for python binaries built under Gentoo naming conventions.
+
+* Add an --intransitive option. (#775)
+ `PR #775 <https://github.com/pantsbuild/pex/pull/775>`_
+
+* PythonInterpreter: support python binary names with single letter suffixes (#769)
+ `PR #769 <https://github.com/pantsbuild/pex/pull/769>`_
+
1.6.11
------
diff --git a/pex/version.py b/pex/version.py
index 255088057..23734e89b 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = '1.6.11'
+__version__ = '1.6.12'
|
pex-tool__pex-1750 | Release 2.1.85
On the docket:
+ [x] PEX interpreters should support all underlying Python interpreter options. #1745
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.84\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.85\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 0edc4617a..3ecf6a075 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,21 @@
Release Notes
=============
+2.1.85
+------
+
+This PyCon US 2022 release brings full support for Python interpreter
+emulation when a PEX is run in interpreter mode (without an entry point
+or else when forced via ``PEX_INTERPRETER=1``).
+
+A special thank you to Loren Arthur for contributing the fix in the
+Pantsbuild sprint at PyCon.
+
+* PEX interpreters should support all underlying Python interpreter options. (#1745)
+ `Issue #1745 <https://github.com/pantsbuild/pex/issues/1745>`_
+ `PR #1746 <https://github.com/pantsbuild/pex/pull/1746>`_
+ `PR #1748 <https://github.com/pantsbuild/pex/pull/1748>`_
+
2.1.84
------
diff --git a/pex/version.py b/pex/version.py
index be294c41c..62a251651 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.84"
+__version__ = "2.1.85"
|
pex-tool__pex-1709 | Release 2.1.77
On the docket:
+ [x] Fix pathologic lock creation slowness. #1707
+ [x] Support uncompressed PEXes. (#1705)
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.76\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.77\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 1e0fb931f..2f87648a4 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,10 +1,23 @@
Release Notes
=============
+2.1.77
+------
+
+This release fixes pathologically slow cases of lock creation as well as
+introducing support for ``--no-compression`` to allow picking the
+time-space tradeoff you want for your PEX zips.
+
+* Fix pathologic lock creation slowness. (#1707)
+ `PR #1707 <https://github.com/pantsbuild/pex/pull/1707>`_
+
+* Support uncompressed PEXes. (#1705)
+ `PR #1705 <https://github.com/pantsbuild/pex/pull/1705>`_
+
2.1.76
------
-This release finalizes spurious deadlock handling in `--lock` resolves
+This release finalizes spurious deadlock handling in ``--lock`` resolves
worked around in #1694 in Pex 2.1.75.
* Fix lock_resolver to use BSD file locks. (#1702)
diff --git a/pex/version.py b/pex/version.py
index 75d3da9cc..f8a647e8e 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.76"
+__version__ = "2.1.77"
|
pex-tool__pex-1679 | Release 2.1.73
On the docket:
+ [x] Unexpected distribution hash #1683
+ [x] Pex fails to parse wheel tags correctly when resolving from a lock. #1676
+ [x] `pex3 lock create --style universal` does not fully patch ambient interpreter properties. #1681
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.72\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.73\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index da8f81916..b2b079c4a 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,27 @@
Release Notes
=============
+2.1.73
+------
+
+This is a hotfix for various PEX issues:
+
+#. ``--requirements-pex`` handling was broken by #1661 in the 2.1.71
+ release and is now fixed.
+#. Creating ``universal`` locks now works using any interpreter when the
+ resolver version is the ``pip-2020-resolver``.
+#. Building PEXes with ``--lock`` resolves that contain wheels with
+ build tags in their names now works.
+
+* Fix ``--requirements-pex``. (#1684)
+ `PR #1684 <https://github.com/pantsbuild/pex/pull/1684>`_
+
+* Fix universal locks for the ``pip-2020-resolver``. (#1682)
+ `PR #1682 <https://github.com/pantsbuild/pex/pull/1682>`_
+
+* Fix ``--lock`` resolve wheel tag parsing. (#1678)
+ `PR #1678 <https://github.com/pantsbuild/pex/pull/1678>`_
+
2.1.72
------
diff --git a/pex/version.py b/pex/version.py
index 1cf91c0a8..a1e0ffe02 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.72"
+__version__ = "2.1.73"
|
pex-tool__pex-1275 | Release 2.1.34
On the docket:
+ [x] Allow command-line arguments to be read from a file #1271
+ [x] Issue when running a module inside pex file #1018
+ [x] Guard against concurrent re-imports. #1270
+ [x] Ensure Pip logs to stderr. #1268
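The @argfile support on the docket follows argparse's `fromfile_prefix_chars` convention that the changelog below links to; a minimal sketch of the mechanism (the temp file and option names here are illustrative, not Pex's actual flags):

```python
import argparse, os, tempfile

# With fromfile_prefix_chars set, argparse expands "@file" arguments
# by reading the named file, one argument per line by default.
parser = argparse.ArgumentParser(fromfile_prefix_chars="@")
parser.add_argument("--name")
parser.add_argument("--count", type=int)

with tempfile.NamedTemporaryFile("w", suffix=".args", delete=False) as f:
    f.write("--name\npex\n--count\n3\n")
    argfile = f.name

args = parser.parse_args(["@" + argfile])
print(args.name, args.count)  # pex 3
os.remove(argfile)
```

This is useful when a command line would otherwise exceed OS argument-length limits, which is the motivation cited in the release notes.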
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.33\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.34\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 0e5749f8f..b2a99154a 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,26 @@
Release Notes
=============
+2.1.34
+------
+
+Beyond bugfixes for a few important edge cases, this release includes
+new support for @argfiles on the command line from @jjhelmus. These
+can be useful to overcome command line length limitations. See:
+https://docs.python.org/3/library/argparse.html#fromfile-prefix-chars.
+
+* Allow cli arguments to be specified in a file (#1273)
+ `PR #1273 <https://github.com/pantsbuild/pex/pull/1273>`_
+
+* Fix module entrypoints. (#1274)
+ `PR #1274 <https://github.com/pantsbuild/pants/pull/1274>`_
+
+* Guard against concurrent re-imports. (#1270)
+ `PR #1270 <https://github.com/pantsbuild/pants/pull/1270>`_
+
+* Ensure Pip logs to stderr. (#1268)
+ `PR #1268 <https://github.com/pantsbuild/pants/pull/1268>`_
+
2.1.33
------
diff --git a/pex/version.py b/pex/version.py
index c20716d59..ee3ef65c4 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.33"
+__version__ = "2.1.34"
|
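The 2.1.34 record above credits @argfile support to argparse's `fromfile-prefix-chars` hook. A minimal sketch of that stdlib mechanism (illustrative only — this is not Pex's actual parser wiring, and the option names here are made up):

```python
import argparse
import pathlib
import tempfile

# fromfile_prefix_chars="@" makes argparse expand "@file" arguments by
# reading one argument per line from the file -- useful when the real
# command line would exceed the shell's length limit.
parser = argparse.ArgumentParser(fromfile_prefix_chars="@")
parser.add_argument("-o", "--output")
parser.add_argument("requirements", nargs="*")

with tempfile.TemporaryDirectory() as tmp:
    argfile = pathlib.Path(tmp, "args.txt")
    argfile.write_text("-o\napp.pex\nrequests\n")
    args = parser.parse_args(["@{}".format(argfile)])

print(args.output, args.requirements)  # app.pex ['requests']
```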
pex-tool__pex-1450 | Release 2.1.50
On the docket:
+ [x] Fix zipapp layout identification. #1448
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.49\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.50\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 4cf9ec11a..f05e62f6f 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,17 @@
Release Notes
=============
+2.1.50
+------
+
+This is another hotfix of the 2.1.48 release's ``--layout`` feature that
+fixes identification of ``--layout zipapp`` PEXes that have had their
+execute mode bit turned off. A notable example is the Pex PEX when
+downloaded from https://github.com/pantsbuild/pex/releases.
+
+* Fix zipapp layout identification. (#1448)
+ `PR #1448 <https://github.com/pantsbuild/pex/pull/1448>`_
+
2.1.49
------
diff --git a/pex/version.py b/pex/version.py
index 3977ea4eb..fed40855d 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.49"
+__version__ = "2.1.50"
|
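The 2.1.50 note above hinges on the distinction between a zipapp's contents and its file mode. A hedged sketch of a content-based check (`looks_like_zipapp` is a made-up name, not Pex's actual layout probe):

```python
import os
import stat
import tempfile
import zipfile

def looks_like_zipapp(path):
    # Decide by content: is_zipfile() reads the archive's end-of-central-
    # directory record, so it still works when the execute bit was lost,
    # e.g. on a plain HTTP download of a GitHub release asset.
    return zipfile.is_zipfile(path)

with tempfile.TemporaryDirectory() as tmp:
    app = os.path.join(tmp, "app.pex")
    with zipfile.ZipFile(app, "w") as zf:
        zf.writestr("__main__.py", "print('hi')\n")
    os.chmod(app, stat.S_IRUSR | stat.S_IWUSR)  # strip all execute bits
    identified = looks_like_zipapp(app)

print(identified)  # True
```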
pex-tool__pex-1148 | PexInfo.copy does not copy its collection attributes.
The copy method was oversimplified in #1127 and now only copies the dict backing the non-collection attributes of PexInfo.
| [
{
"content": "# Copyright 2014 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\nfrom __future__ import absolute_import\n\nimport json\nimport os\n\nfrom pex import pex_warnings\nfrom pex.common import can_write_dir, open_zip, safe_mkdtemp\nfro... | [
{
"content": "# Copyright 2014 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\nfrom __future__ import absolute_import\n\nimport json\nimport os\n\nfrom pex import pex_warnings\nfrom pex.common import can_write_dir, open_zip, safe_mkdtemp\nfro... | diff --git a/pex/pex_info.py b/pex/pex_info.py
index 24c2bf524..5a9dcb43c 100644
--- a/pex/pex_info.py
+++ b/pex/pex_info.py
@@ -394,7 +394,7 @@ def dump(self):
def copy(self):
# type: () -> PexInfo
- return PexInfo(self._pex_info)
+ return PexInfo(self.as_json_dict())
@staticmethod
def _merge_split(*paths):
diff --git a/tests/test_pex_info.py b/tests/test_pex_info.py
index ee613697f..aea4e23b2 100644
--- a/tests/test_pex_info.py
+++ b/tests/test_pex_info.py
@@ -7,6 +7,7 @@
import pytest
from pex.common import temporary_dir
+from pex.inherit_path import InheritPath
from pex.orderedset import OrderedSet
from pex.pex_info import PexInfo
from pex.pex_warnings import PEXWarning
@@ -142,3 +143,31 @@ def test_pex_root_set_unwriteable():
assert isinstance(message, PEXWarning)
assert pex_root in str(message)
assert pex_info.pex_root in str(message)
+
+
+def test_copy():
+ # type: () -> None
+ default_info = PexInfo.default()
+ default_info_copy = default_info.copy()
+ assert default_info is not default_info_copy
+ assert default_info.dump() == default_info_copy.dump()
+
+ info = PexInfo.default()
+ info.unzip = True
+ info.code_hash = "foo"
+ info.inherit_path = InheritPath.FALLBACK
+ info.add_requirement("bar==1")
+ info.add_requirement("baz==2")
+ info.add_distribution("bar.whl", "bar-sha")
+ info.add_distribution("baz.whl", "baz-sha")
+ info.add_interpreter_constraint(">=2.7.18")
+ info.add_interpreter_constraint("CPython==2.7.9")
+ info_copy = info.copy()
+
+ assert info_copy.unzip is True
+ assert "foo" == info_copy.code_hash
+ assert InheritPath.FALLBACK == info_copy.inherit_path
+ assert OrderedSet(["bar==1", "baz==2"]) == info_copy.requirements
+ assert {"bar.whl": "bar-sha", "baz.whl": "baz-sha"} == info_copy.distributions
+ assert {">=2.7.18", "CPython==2.7.9"} == set(info_copy.interpreter_constraints)
+ assert info.dump() == info_copy.dump()
|
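The `PexInfo.copy` regression above is an instance of a general pitfall: rebuilding an object from a partial backing dict silently drops collection state, while round-tripping through the full serialized form preserves it. A stripped-down illustration (hypothetical `Info` class, not the real `PexInfo`):

```python
class Info:
    def __init__(self, info=None):
        self._info = dict(info or {})  # scalar fields only
        self._requirements = list(self._info.pop("requirements", []))

    def add_requirement(self, req):
        self._requirements.append(req)

    def as_dict(self):
        data = dict(self._info)
        data["requirements"] = list(self._requirements)
        return data

    def copy_buggy(self):
        return Info(self._info)     # drops the collection attribute

    def copy_fixed(self):
        return Info(self.as_dict()) # round-trips everything


info = Info({"code_hash": "abc"})
info.add_requirement("bar==1")

buggy = info.copy_buggy().as_dict()
fixed = info.copy_fixed().as_dict()
print(buggy["requirements"], fixed["requirements"])  # [] ['bar==1']
```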
pex-tool__pex-1377 | Release 2.1.43
On the docket:
+ [x] Support more verbose output for interpreter info. (#1347)
+ [x] Fix Pex emitting warnings about its Pip PEX venv. (#1351)
+ [x] Fix execution modes. (#1353)
+ [x] Warn for PEX env vars unsupported by venv. (#1354)
+ [x] Do not suppress pex output in bidst_pex (#1358)
+ [x] Using --platform manylinux2010 includes pyarrow wheel for manylinux2014 #1355
+ [x] Fix --no-manylinux. #1365
+ [x] Environment markers are incorrectly evaluated for --platform resolves. #1366
+ [x] Pex probes wheel metadata incorrectly. #1375
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.42\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.43\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index fd894df7f..713eaf4ec 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,39 @@
Release Notes
=============
+2.1.43
+------
+
+* Fix dist-info metadata discovery. (#1376)
+ `PR #1376 <https://github.com/pantsbuild/pex/pull/1376>`_
+
+* Fix ``--platform`` resolve handling of env markers. (#1367)
+ `PR #1367 <https://github.com/pantsbuild/pex/pull/1367>`_
+
+* Fix ``--no-manylinux``. (#1365)
+ `PR #1365 <https://github.com/pantsbuild/pex/pull/1365>`_
+
+* Allow ``--platform`` resolves for current interpreter. (#1364)
+ `PR #1364 <https://github.com/pantsbuild/pex/pull/1364>`_
+
+* Do not suppress pex output in bidst_pex (#1358)
+ `PR #1358 <https://github.com/pantsbuild/pex/pull/1358>`_
+
+* Warn for PEX env vars unsupported by venv. (#1354)
+ `PR #1354 <https://github.com/pantsbuild/pex/pull/1354>`_
+
+* Fix execution modes. (#1353)
+ `PR #1353 <https://github.com/pantsbuild/pex/pull/1353>`_
+
+* Fix Pex emitting warnings about its Pip PEX venv. (#1351)
+ `PR #1351 <https://github.com/pantsbuild/pex/pull/1351>`_
+
+* Support more verbose output for interpreter info. (#1347)
+ `PR #1347 <https://github.com/pantsbuild/pex/pull/1347>`_
+
+* Fix typo in recipes.rst (#1342)
+ `PR #1342 <https://github.com/pantsbuild/pex/pull/1342>`_
+
2.1.42
------
diff --git a/pex/version.py b/pex/version.py
index 5d2cd4dfc..4c1b8de5b 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.42"
+__version__ = "2.1.43"
|
pex-tool__pex-1838 | Release 2.1.96
On the docket:
+ [x] PEX_EXTRA_SYS_PATH propagation can break subprocesses run against other venvs. #1836
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.95\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.96\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index a66e2c6c4..bd4f2b4fe 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,16 @@
Release Notes
=============
+2.1.96
+------
+
+This is a hotfix release that fixes ``--venv`` mode
+``PEX_EXTRA_SYS_PATH`` propogation introduced in Pex 2.1.95 to only
+apply to ``sys.executable`` and not other Pythons.
+
+* Fix ``--venv`` ``PEX PEX_EXTRA_SYS_PATH`` propagation. (#1837)
+ `PR #1837 <https://github.com/pantsbuild/pex/pull/1837>`_
+
2.1.95
------
diff --git a/pex/version.py b/pex/version.py
index 40153e391..26a429cb9 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.95"
+__version__ = "2.1.96"
|
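The 2.1.96 hotfix above narrows env-var propagation to the current interpreter only. A rough sketch of that guard (`child_env` is a hypothetical helper; Pex's real mechanism differs):

```python
import os
import sys

def child_env(python, extra_sys_path, base_env=None):
    """Build the env for a child process, forwarding extra sys.path entries
    only when the child is the same interpreter as the current one."""
    env = dict(os.environ if base_env is None else base_env)
    same_interpreter = os.path.realpath(python) == os.path.realpath(sys.executable)
    if same_interpreter and extra_sys_path:
        env["PEX_EXTRA_SYS_PATH"] = os.pathsep.join(extra_sys_path)
    else:
        # Other venvs' Pythons are left alone.
        env.pop("PEX_EXTRA_SYS_PATH", None)
    return env


own = child_env(sys.executable, ["/extra/lib"], base_env={})
other = child_env("/some/other/venv/bin/python", ["/extra/lib"], base_env={})
print("PEX_EXTRA_SYS_PATH" in own, "PEX_EXTRA_SYS_PATH" in other)  # True False
```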
pex-tool__pex-1140 | Release 2.1.23
On the docket:
+ [x] Upgrade Pex to Pip 20.3.1. #1133
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.22\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.23\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 9a14e88b8..3e999fa51 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,21 @@
Release Notes
=============
+2.1.23
+------
+
+This release upgrades Pex to the latest Pip which includes support for
+the new 2020-resolver (see:
+https://pip.pypa.io/en/stable/user_guide/#resolver-changes-2020) as well
+as support for macOS BigSur. Although this release defaults to the
+legacy resolver behavior, the next release will deprecate the legacy
+resolver and support for the legacy resolver will later be removed to
+allow continuing Pip upgrades going forward. To switch to the new
+resolver, use: `--resolver-version pip-2020-resolver`.
+
+* Upgrade Pex to Pip 20.3.1. (#1133)
+ `PR #1133 <https://github.com/pantsbuild/pex/pull/1133>`_
+
2.1.22
------
diff --git a/pex/version.py b/pex/version.py
index 16163f6f4..10a632ad9 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.22"
+__version__ = "2.1.23"
|
rotki__rotki-2262 | Wrong Binance asset mapping for STX
## Problem Definition
This was reported by a user via email. They own some STX in Binance. For Rotki, STX is Stox (https://www.coingecko.com/en/coins/stox), but in Binance it's another token called Stacks (https://www.coingecko.com/en/coins/stack). We support it in Rotki as Blockstack (STX-2), so we just need to change the Binance mapping.
## Task
Fix the binance mapping.
| [
{
"content": "from dataclasses import dataclass, field\nfrom functools import total_ordering\nfrom typing import Any, Optional, Type, TypeVar\n\nfrom rotkehlchen.assets.resolver import AssetResolver\nfrom rotkehlchen.errors import DeserializationError, UnknownAsset, UnsupportedAsset\nfrom rotkehlchen.typing imp... | [
{
"content": "from dataclasses import dataclass, field\nfrom functools import total_ordering\nfrom typing import Any, Optional, Type, TypeVar\n\nfrom rotkehlchen.assets.resolver import AssetResolver\nfrom rotkehlchen.errors import DeserializationError, UnknownAsset, UnsupportedAsset\nfrom rotkehlchen.typing imp... | diff --git a/docs/changelog.rst b/docs/changelog.rst
index a5f1e33579..cd069973b1 100755
--- a/docs/changelog.rst
+++ b/docs/changelog.rst
@@ -2,6 +2,7 @@
Changelog
=========
+* :bug:`2261` Users who had STX in Binance should now see it mapped properly to blockstack and not stox.
* :bug:`-` Users will now see the total worth contained in the card for bigger amounts.
* :bug:`2239` Amounts in the dashboard should now appear in single line for users.
* :bug:`2244` Fix edge case where using a cryptocompare api key could result in the all coins endpoint to error if no cache already existed.
diff --git a/rotkehlchen/assets/asset.py b/rotkehlchen/assets/asset.py
index 9253a63d18..d76c8f1fda 100644
--- a/rotkehlchen/assets/asset.py
+++ b/rotkehlchen/assets/asset.py
@@ -154,6 +154,8 @@
'SOL-2': 'SOL',
# BETH is the eth staked in beacon chain
'ETH2': 'BETH',
+ # STX is Blockstack in Binance
+ 'STX-2': 'STX',
}
WORLD_TO_BITFINEX = {
diff --git a/rotkehlchen/data/all_assets.json b/rotkehlchen/data/all_assets.json
index 886fd48332..5aa997e462 100644
--- a/rotkehlchen/data/all_assets.json
+++ b/rotkehlchen/data/all_assets.json
@@ -11140,6 +11140,7 @@
},
"STX": {
"coingecko": "stox",
+ "cryptocompare": "STOX",
"ethereum_address": "0x006BeA43Baa3f7A6f765F14f10A1a1b08334EF45",
"ethereum_token_decimals": 18,
"name": "Stox",
@@ -11149,7 +11150,7 @@
},
"STX-2": {
"coingecko": "blockstack",
- "cryptocompare": "BSTX",
+ "cryptocompare": "STX",
"name": "Blockstack",
"started": 1572048000,
"symbol": "STX",
diff --git a/rotkehlchen/data/all_assets.meta b/rotkehlchen/data/all_assets.meta
index 79efc18417..79e8195bbd 100644
--- a/rotkehlchen/data/all_assets.meta
+++ b/rotkehlchen/data/all_assets.meta
@@ -1 +1 @@
-{"md5": "6499628db3b6996da07eb0e19741d8c0", "version": 48}
+{"md5": "9d34eb6018b8d6013b4cb3fd8af22edf", "version": 48}
diff --git a/rotkehlchen/tests/unit/test_assets.py b/rotkehlchen/tests/unit/test_assets.py
index 9594df7ab8..cf76d74938 100644
--- a/rotkehlchen/tests/unit/test_assets.py
+++ b/rotkehlchen/tests/unit/test_assets.py
@@ -186,7 +186,7 @@ def test_coingecko_identifiers_are_reachable(data_dir):
def test_assets_json_meta():
"""Test that all_assets.json md5 matches and that if md5 changes since last
time then version is also bumped"""
- last_meta = {'md5': '6499628db3b6996da07eb0e19741d8c0', 'version': 48}
+ last_meta = {'md5': '9d34eb6018b8d6013b4cb3fd8af22edf', 'version': 48}
data_dir = Path(__file__).resolve().parent.parent.parent / 'data'
data_md5 = file_md5(data_dir / 'all_assets.json')
|
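The fix boils down to one entry in an exchange-override table. A simplified sketch of the lookup pattern (function names are illustrative; only the `WORLD_TO_BINANCE` entries mirror the real diff):

```python
# Per-exchange override table: internal asset id -> ticker the exchange uses.
WORLD_TO_BINANCE = {
    "SOL-2": "SOL",
    "ETH2": "BETH",  # BETH is the eth staked in beacon chain
    "STX-2": "STX",  # STX on Binance is Blockstack, not Stox
}
BINANCE_TO_WORLD = {v: k for k, v in WORLD_TO_BINANCE.items()}

def asset_from_binance(symbol):
    # An explicit override wins; unknown symbols pass through unchanged.
    return BINANCE_TO_WORLD.get(symbol, symbol)

print(asset_from_binance("STX"), asset_from_binance("BTC"))  # STX-2 BTC
```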
projectmesa__mesa-1437 | v1.1.0 Safford Release
Milestone: https://github.com/projectmesa/mesa/milestone/31
Highlighted changes:
- #1376 > 6x perf speedup for add/remove agent in `ContinuousSpace`
- #1391 correctness fix for `SimultaneousActivation` and `StagedActivation`
- #1399 make `self.running = True` optional. We need to tell existing users that initializing this is no longer necessary, and so, reducing the boilerplate code
- #1435 Allow user-specified local dir to be served by Tornado. Needed by Mesa-Geo
- #1413 Allow batch_run to take arbitrary parameters
| [
{
"content": "\"\"\"\nMesa Agent-Based Modeling Framework\n\nCore Objects: Model, and Agent.\n\n\"\"\"\nimport datetime\n\nfrom mesa.model import Model\nfrom mesa.agent import Agent\n\nimport mesa.time as time\nimport mesa.space as space\nimport mesa.flat.visualization as visualization\nfrom mesa.datacollection... | [
{
"content": "\"\"\"\nMesa Agent-Based Modeling Framework\n\nCore Objects: Model, and Agent.\n\n\"\"\"\nimport datetime\n\nfrom mesa.model import Model\nfrom mesa.agent import Agent\n\nimport mesa.time as time\nimport mesa.space as space\nimport mesa.flat.visualization as visualization\nfrom mesa.datacollection... | diff --git a/HISTORY.rst b/HISTORY.rst
index 640c93a0c59..55afb0b1840 100644
--- a/HISTORY.rst
+++ b/HISTORY.rst
@@ -3,6 +3,59 @@
Release History
---------------
+1.1.0 (2022-10-10) Safford
+++++++++++++++++++++++++++
+
+**Special notes**
+
+* Perf: ContinuousSpace: speed-up add/remove agents #1376. This is a ~6x performance improvement for add/remove.
+* fix: time: Recompute agent_keys between stages #1391. This is a correctness fix for ``SimultaneousActivation`` and ``StagedActivation`` when agents are being removed during simulation.
+* ModularServer: Always set model.running = True on reset #1399. With this change, specifying ``self.running = True`` in your model ``__init__`` is now optional. Mesa's visualization server will automatically sets it to ``True`` in the beginning of a simulation.
+* feat: Allow user-specified local dir to be served by Tornado #1435. This simplifies the usage of ``ModularServer`` in Mesa-Geo.
+* Allow batch_run to take arbitrary parameters #1413. With this change, you can finally use any arbitrary Python objects as ``batch_run`` parameters, where previously they are restricted to hashable objects only.
+* Prevent seed and random from being shared between instances #1439. With this fix, a model instance has their own isolated RNG.
+
+**Improvements**
+
+* CI Updates
+ * ci: Cancel previous obsolete runs #1378
+ * ci: update black to prevent click error #1382
+ * Add "falsy" to .codespellignore #1412
+ * Upgrade pre-commit CI (with pyupgrade and syntax checks) #1422
+* Tests
+ * test: RandomActivationByType: Test adding agents with duplicate ID #1392
+* Dependency updates
+ * Update Pipfile.lock (dependencies) #1398
+ * Update Pipfile.lock (dependencies) #1408
+ * Update Pipfile.lock (dependencies) #1434
+* Docs
+ * docs: Add Tim Pope's guideline for proper Git commit msg #1379
+ * readme: Improve the pip install for Git repo instruction #1416
+ * Docs: Remove trailing whitespaces #1421
+ * Fixes #1423 - fixes build badge in docs #1424
+* Refactors
+ * refactor: Apply pyupgrade --py37-plus #1429
+ * refactor ModularServer (moving code into __init__) #1403
+* Perf: ContinuousSpace: speed-up add/remove agents #1376
+* Remove monospace formatting for hyperlinks #1388
+* ModularServer: Always set model.running = True on reset #1399
+* Allow batch_run to take arbitrary parameters #1413
+* ModularServer: Put new optional arg port last #1432
+* feat: Allow user-specified local dir to be served by Tornado #1435
+* Improve and measure speed of clamp function #1440
+
+**Fixes**
+
+* Fix stray " in modular_template.html #1380
+* Fix zoom on network visualisation #1381
+* Fix broken monospace links #1387
+* fix: Ensure agent id is unique in RandomActivationByType.add #1386
+* fix: time: Recompute agent_keys between stages #1391
+* Fix batchrunner progress bar #1395
+* Fix stray " in visualisation dropdown labels #1409
+* space: Fix type error for Python < 3.9 #1430
+* Prevent seed and random from being shared between instances #1439
+
1.0.0 (2022-07-06) Quartzsite
+++++++++++++++++++++++++++++++++++++++++++
diff --git a/mesa/__init__.py b/mesa/__init__.py
index 64b3ea6fc0d..b5d5c7a7d03 100644
--- a/mesa/__init__.py
+++ b/mesa/__init__.py
@@ -26,6 +26,6 @@
]
__title__ = "mesa"
-__version__ = "1.0.0"
+__version__ = "1.1.0"
__license__ = "Apache 2.0"
__copyright__ = f"Copyright {datetime.date.today().year} Project Mesa Team"
|
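The "Prevent seed and random from being shared between instances" fix (#1439) in the history above addresses a classic Python pitfall. A sketch of the two patterns (hypothetical model classes, not Mesa's actual implementation):

```python
import random

class SharedRngModel:
    rng = random.Random(42)             # one RNG object for the whole class

class IsolatedRngModel:
    def __init__(self, seed=None):
        self.rng = random.Random(seed)  # a fresh RNG per instance


a, b = SharedRngModel(), SharedRngModel()
shared = a.rng is b.rng                          # True: seeding one model
                                                 # perturbs the other

c, d = IsolatedRngModel(seed=1), IsolatedRngModel(seed=1)
isolated = c.rng is not d.rng                    # True: independent objects
reproducible = c.rng.random() == d.rng.random()  # equal seeds, equal draws
print(shared, isolated, reproducible)  # True True True
```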
pex-tool__pex-1057 | Release 2.1.17
On the docket:
+ [x] TypeError when resolving local platforms. #1043
+ [x] No such file for interpreter's binary name #1009
+ [x] Pex resources leak while bootstrapping pants #1050
+ [x] Pex PEX perf regression #1054
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.16\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.17\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index f1e69a75b..118247d4d 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,46 @@
Release Notes
=============
+2.1.17
+------
+
+This release fixes a bug in ``--resolve-local-platforms`` handling that made it unusable in 2.1.16
+(#1043) as well as fixing a long standing file handle leak (#1050) and a bug when running under
+macOS framework builds of Python (#1009).
+
+* Fix `--unzip` performance regression. (#1056)
+ `PR #1056 <https://github.com/pantsbuild/pex/pull/1056>`_
+
+* Fix resource leak in Pex self-isolation. (#1052)
+ `PR #1052 <https://github.com/pantsbuild/pex/pull/1052>`_
+
+* Fix use of `iter_compatible_interpreters`. (#1048)
+ `PR #1048 <https://github.com/pantsbuild/pex/pull/1048>`_
+
+* Do not rely on `sys.executable` being accurate. (#1049)
+ `PR #1049 <https://github.com/pantsbuild/pex/pull/1049>`_
+
+* slightly demystify the relationship between platforms and interpreters in the library API and CLI (#1047)
+ `PR #1047 <https://github.com/pantsbuild/pex/pull/1047>`_
+
+* Path filter for PythonInterpreter.iter_candidates. (#1046)
+ `PR #1046 <https://github.com/pantsbuild/pex/pull/1046>`_
+
+* Add type hints to `util.py` and `tracer.py` (#1044)
+ `PR #1044 <https://github.com/pantsbuild/pex/pull/1044>`_
+
+* Add type hints to variables.py and platforms.py (#1042)
+ `PR #1042 <https://github.com/pantsbuild/pex/pull/1042>`_
+
+* Add type hints to the remaining tests (#1040)
+ `PR #1040 <https://github.com/pantsbuild/pex/pull/1040>`_
+
+* Add type hints to most tests (#1036)
+ `PR #1036 <https://github.com/pantsbuild/pex/pull/1036>`_
+
+* Use MyPy via type comments (#1032)
+ `PR #1032 <https://github.com/pantsbuild/pex/pull/1032>`_
+
2.1.16
------
diff --git a/pex/version.py b/pex/version.py
index 5d7a390f0..b8849c9a0 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.16"
+__version__ = "2.1.17"
|
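The 2.1.17 notes above include a file-handle leak fix (#1050/#1052). A generic sketch of that leak class (not Pex's bootstrap code): a `ZipFile` opened without a context manager keeps its OS handle until garbage collection, while the `with` form releases it deterministically on block exit — which matters in long-lived bootstrapping processes.

```python
import os
import tempfile
import zipfile

def read_member_leaky(path, name):
    zf = zipfile.ZipFile(path)  # handle closed only when zf is GC'd
    return zf.read(name)

def read_member(path, name):
    with zipfile.ZipFile(path) as zf:
        return zf.read(name)    # handle closed on block exit


with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "a.zip")
    with zipfile.ZipFile(path, "w") as zf:
        zf.writestr("hello.txt", "hi")
    data = read_member(path, "hello.txt")
print(data)  # b'hi'
```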
pytorch__ignite-2907 | Issue with Enum on Python3.11
## 🐛 Bug description
Importing `ignite.distributed` fails on Python3.11.
To reproduce:
```bash
python3.11 -m pip install pytorch-ignite
python3.11 -c 'import ignite.distributed'
```
I get the following `AttributeError`:
```python
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/opt/homebrew/lib/python3.11/site-packages/ignite/__init__.py", line 3, in <module>
import ignite.engine
File "/opt/homebrew/lib/python3.11/site-packages/ignite/engine/__init__.py", line 7, in <module>
from ignite.engine.deterministic import DeterministicEngine
File "/opt/homebrew/lib/python3.11/site-packages/ignite/engine/deterministic.py", line 11, in <module>
from ignite.engine.engine import Engine
File "/opt/homebrew/lib/python3.11/site-packages/ignite/engine/engine.py", line 13, in <module>
from ignite.engine.events import CallableEventWithFilter, EventEnum, Events, EventsList, RemovableEventHandle, State
File "/opt/homebrew/lib/python3.11/site-packages/ignite/engine/events.py", line 254, in <module>
class Events(EventEnum):
File "/opt/homebrew/Cellar/python@3.11/3.11.2_1/Frameworks/Python.framework/Versions/3.11/lib/python3.11/enum.py", line 560, in __new__
raise exc
File "/opt/homebrew/Cellar/python@3.11/3.11.2_1/Frameworks/Python.framework/Versions/3.11/lib/python3.11/enum.py", line 280, in __set_name__
enum_member = enum_class._value2member_map_[value]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/ignite/engine/events.py", line 200, in __hash__
return hash(self._name_)
^^^^^^^^^^^
AttributeError: 'CallableEventWithFilter' object has no attribute '_name_'. Did you mean: 'name'?
```
## Environment
- PyTorch Version: 2.0.0
- Ignite Version: 0.4.11
- OS (e.g., Linux): macOS Ventura 13.2.1
- How you installed Ignite: `pip`
- Python version: 3.11
| [
{
"content": "import numbers\nimport warnings\nimport weakref\nfrom collections.abc import Sequence\nfrom enum import Enum\nfrom types import DynamicClassAttribute\nfrom typing import Any, Callable, Dict, Iterable, Iterator, List, Optional, TYPE_CHECKING, Union\n\nfrom torch.utils.data import DataLoader\n\nfrom... | [
{
"content": "import numbers\nimport warnings\nimport weakref\nfrom collections.abc import Sequence\nfrom enum import Enum\nfrom types import DynamicClassAttribute\nfrom typing import Any, Callable, Dict, Iterable, Iterator, List, Optional, TYPE_CHECKING, Union\n\nfrom torch.utils.data import DataLoader\n\nfrom... | diff --git a/ignite/engine/events.py b/ignite/engine/events.py
index a80277c525d3..9dd99348492b 100644
--- a/ignite/engine/events.py
+++ b/ignite/engine/events.py
@@ -237,7 +237,10 @@ def function_before_backprop(engine):
# ...
"""
- pass
+ def __new__(cls, value: str) -> "EventEnum":
+ obj = CallableEventWithFilter.__new__(cls)
+ obj._value_ = value
+ return obj
class Events(EventEnum):
|
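The one-hunk fix in the diff above is easier to see in condensed form. A sketch (the filtering machinery of `CallableEventWithFilter` is dropped; only the `__new__` pattern from the fix is kept): giving the mixin-based Enum a `__new__` that sets `_value_` up front keys the member bookkeeping by the plain string value, so Python 3.11's class construction never needs to hash a half-built member through the `_name_`-based `__hash__`.

```python
from enum import Enum

class CallableEvent:
    # Simplified stand-in for CallableEventWithFilter: hashes by member name.
    def __hash__(self):
        return hash(self._name_)

class EventEnum(CallableEvent, Enum):
    def __new__(cls, value):
        # Set _value_ explicitly, as the #2907 fix does.
        obj = object.__new__(cls)
        obj._value_ = value
        return obj

class Events(EventEnum):
    STARTED = "started"
    COMPLETED = "completed"


print(Events.STARTED.value)  # started
```

Value lookups such as `Events("completed")` then resolve through the string-keyed member map as usual.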
pex-tool__pex-1673 | Release 2.1.72
On the docket:
+ [x] Fix Locker to prune un-downloaded entries. (#1666)
+ [x] Fix venv creation to ignore ambient PEX env vars. #1669
+ [x] Lockfiles: requirement might not be compatible with requested interpreter constraints #1667
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.71\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.72\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index e7c692766..da8f81916 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,15 @@
Release Notes
=============
+2.1.72
+------
+
+This release fixes an old bug with ``--venv`` PEXes initially executed
+with either ``PEX_MODULE`` or ``PEX_SCRIPT`` active in the environment.
+
+* Fix venv creation to ignore ambient PEX env vars. (#1669)
+ `PR #1669 <https://github.com/pantsbuild/pex/pull/1669>`_
+
2.1.71
------
diff --git a/pex/version.py b/pex/version.py
index ff6708e70..1cf91c0a8 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.71"
+__version__ = "2.1.72"
|
pex-tool__pex-1610 | Release 2.1.66
On the docket:
+ [x] Support specifying foreign platforms in full detail. #1597
+ [x] Respect PEX_ROOT in PEXEnvironment.mount. #1599
+ [x] Be able to see what .pex file is run from the list of system processes #1604
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.65\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.66\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index c524a7ef1..31b6db792 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,36 @@
Release Notes
=============
+2.1.66
+------
+
+This release brings a new ``--complete-platform`` Pex CLI option that
+can be used instead of ``--platform`` when more detailed foreign
+platform specification is needed to satisfy a resolve (most commonly,
+when ``python_full_version`` environment markers are in-play). This,
+paired with the new ``pex3 interpreter inspect`` command that can be
+used to generate complete platform data on the foreign platform machine
+being targeted, should allow all foreign platform PEX builds to succeed
+exactly as they would if run on that foreign platform as long as
+pre-built wheels are available for that foreign platform.
+
+Additionally, PEXes now know how to set a useable process name when the
+PEX contains the `psutil` distribution. See
+`here <https://pex.readthedocs.io/en/v2.1.66/recipes.html#long-running-pex-applications-and-daemons>`_
+for more information.
+
+* Add support for ``--complete-platform``. (#1609)
+ `PR #1609 <https://github.com/pantsbuild/pex/pull/1609>`_
+
+* Introduce ``pex3 interpreter inspect``. (#1607)
+ `PR #1607 <https://github.com/pantsbuild/pex/pull/1607>`_
+
+* Use setproctitle to sanitize ``ps`` info. (#1605)
+ `PR #1605 <https://github.com/pantsbuild/pex/pull/1605>`_
+
+* Respect ``PEX_ROOT`` in ``PEXEnvironment.mount``. (#1599)
+ `PR #1599 <https://github.com/pantsbuild/pex/pull/1599>`_
+
2.1.65
------
diff --git a/pex/version.py b/pex/version.py
index 891b24c7e..24551f628 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.65"
+__version__ = "2.1.66"
|
pex-tool__pex-1733 | Release 2.1.82
On the docket:
+ [x] Pex resolve checking does not allow resolved pre-releases when --no-pre. #1730
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.81\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.82\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index d79994b8a..76a815eff 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,15 @@
Release Notes
=============
+2.1.82
+------
+
+This is a hotfix release for a regression in prerelease version handling
+introduced in the 2.1.81 release by #1727.
+
+* Fix prerelease handling when checking resolves. (#1732)
+ `PR #1732 <https://github.com/pantsbuild/pex/pull/1732>`_
+
2.1.81
------
diff --git a/pex/version.py b/pex/version.py
index ad0037b0d..45a6e1143 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.81"
+__version__ = "2.1.82"
|
pex-tool__pex-1834 | Release 2.1.95
On the docket:
+ [x] Lock creation should skip Windows-only requirements and / or allow selecting target platforms (OS classes). #1821
+ [x] Feature request: "universal" lock mode can reject unsupported platforms #1595
+ [x] Avoid ENOEXEC for --venv shebangs. #1828
+ [x] pex3 lock export doesn't seem to respect the platform flag. #1826
+ [x] Clarify pex3 lock export command. #1645
+ [x] Support exporting PYTHONPATH before running user code #1825
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.94\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.95\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index b40dc81c0..a66e2c6c4 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,49 @@
Release Notes
=============
+2.1.95
+------
+
+This release brings two new ``pex3 lock`` features for
+``--style universal`` locks.
+
+By default, universal locks are created to target all operating systems.
+This can cause problems when you only target a subset of operating
+systems and a lock transitive dependency that is conditional on an OS
+you do not target is not lockable. The new
+``--target-system {linux,mac,windows}`` option allows you to restrict
+the set of targeted OSes to work around this sort of issue. Since PEX
+files currently only support running on Linux and Mac, specifying
+``--target-system linux --target-system mac`` is a safe way to
+pre-emptively avoid these sorts of locking issues when creating a
+universal lock.
+
+Previously you could not specify the ``--platform``\s or
+``--complete-platform``\s you would be using later to build PEXes with
+when creating a universal lock. You now can, and Pex will verify the
+universal lock can support all the specified platforms.
+
+As is usual there are also several bug fixes including properly
+propagating ``PEX_EXTRA_SYS_PATH`` additions to forked Python processes,
+fixing ``pex3 lock export`` to only attempt to export for the selected
+target and avoiding too long shebang errors for ``--venv`` mode PEXes in
+a robust way.
+
+* Fix ``PEX_EXTRA_SYS_PATH`` propagation. (#1832)
+ `PR #1832 <https://github.com/pantsbuild/pex/pull/1832>`_
+
+* Fix ``pex3 lock export``: re-use ``--lock`` resolver. (#1831)
+ `PR #1831 <https://github.com/pantsbuild/pex/pull/1831>`_
+
+* Avoid ENOEXEC for ``--venv`` shebangs. (#1828)
+ `PR #1828 <https://github.com/pantsbuild/pex/pull/1828>`_
+
+* Check lock can resolve platforms at creation time. (#1824)
+ `PR #1824 <https://github.com/pantsbuild/pex/pull/1824>`_
+
+* Support restricting universal lock target os. (#1823)
+ `PR #1823 <https://github.com/pantsbuild/pex/pull/1823>`_
+
2.1.94
------
diff --git a/pex/version.py b/pex/version.py
index d72f3cdbf..40153e391 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.94"
+__version__ = "2.1.95"
|
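The release notes above describe restricting a universal lock's target operating systems so that a transitive dependency guarded by an OS environment marker need not be locked. The mechanics can be sketched with a toy marker check — the function and the platform names here are illustrative, not Pex's actual marker evaluation (real evaluation follows PEP 508):

```python
def dependency_needed(marker_platforms, target_systems):
    """Is a platform-guarded dependency still needed for a lock?

    marker_platforms: sys_platform values the requirement applies to,
    e.g. {"win32"} for 'pywin32; sys_platform == "win32"'.
    target_systems: the OSes the universal lock is asked to cover.
    Illustrative only; real resolvers evaluate full PEP 508 markers.
    """
    return bool(marker_platforms & target_systems)


# Restricting targets to linux+mac lets the locker skip a
# Windows-only (and perhaps unlockable) transitive dependency:
print(dependency_needed({"win32"}, {"linux", "darwin"}))           # False
print(dependency_needed({"win32"}, {"linux", "darwin", "win32"}))  # True
```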
pex-tool__pex-1319 | Release 2.1.39
On the docket:
+ [x] Running opvault 0.4.9 pex leads to infinite recursion in setup tools #1316
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.38\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.39\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 778a61032..d63bfcf3e 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,15 @@
Release Notes
=============
+2.1.39
+------
+
+A hotfix that fixes a bug present since 2.1.25 that results in infinite
+recursion in PEX runtime resolves when handling dependency cycles.
+
+* Guard against cyclic dependency graphs. (#1317)
+ `PR #1317 <https://github.com/pantsbuild/pex/pull/1317>`_
+
2.1.38
------
diff --git a/pex/version.py b/pex/version.py
index 092a74bf7..82e4c47d5 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.38"
+__version__ = "2.1.39"
|
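The fix above guards dependency-graph walks against cycles so a cyclic graph terminates instead of recursing infinitely. A minimal sketch of the standard technique — a visited set over an explicit worklist (names are illustrative, not Pex's actual code):

```python
def iter_transitive_deps(graph, root):
    """Yield every dependency reachable from root, tolerating cycles.

    graph maps a project name to a list of its direct dependencies.
    The `seen` set guards against revisiting nodes, so a cycle such
    as a -> b -> a terminates instead of recursing forever.
    """
    seen = set()
    stack = [root]
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        yield node
        stack.extend(graph.get(node, []))


# A cyclic graph: a depends on b, b depends back on a (and on c).
cyclic = {"a": ["b"], "b": ["a", "c"], "c": []}
print(sorted(iter_transitive_deps(cyclic, "a")))  # ['a', 'b', 'c']
```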
pex-tool__pex-1288 | Release 2.1.35
On the docket:
+ [x] Ensure venv pex does not enter a re-exec loop. #1286
+ [x] Improve resolve error information. #1287
+ [x] Expose Pex tools via a pex-tools console script. #1279
+ [x] Fix auto-created `--venv` core scripts. (#1278)
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.34\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.35\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index b2a99154a..fd0770874 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,25 @@
Release Notes
=============
+2.1.35
+------
+
+This release hardens a few aspects of `--venv` mode PEXes. An infinite
+re-exec loop in venv `pex` scripts is fixed and the `activate` family
+of scripts in the venv is fixed.
+
+* Improve resolve error information. (#1287)
+ `PR #1287 <https://github.com/pantsbuild/pex/pull/1287>`_
+
+* Ensure venv pex does not enter a re-exec loop. (#1286)
+ `PR #1286 <https://github.com/pantsbuild/pex/pull/1286>`_
+
+* Expose Pex tools via a pex-tools console script. (#1279)
+ `PR #1279 <https://github.com/pantsbuild/pex/pull/1279>`_
+
+* Fix auto-created `--venv` core scripts. (#1278)
+ `PR #1278 <https://github.com/pantsbuild/pex/pull/1278>`_
+
2.1.34
------
diff --git a/pex/version.py b/pex/version.py
index ee3ef65c4..c4e8f5075 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.34"
+__version__ = "2.1.35"
|
pex-tool__pex-1844 | Release 2.1.97
On the docket:
+ [x] Avoid ENOEXEC for Pex internal --venvs. #1843
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.96\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.97\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index bd4f2b4fe..cfaab5109 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,16 @@
Release Notes
=============
+2.1.97
+------
+
+This release patches a hole left by #1828 in the Pex 2.1.95 release
+whereby, although you could run a PEX under a too-long PEX_ROOT you
+could not build a PEX under a tool-long PEX_ROOT.
+
+* Avoid ENOEXEC for Pex internal --venvs. (#1843)
+ `PR #1843 <https://github.com/pantsbuild/pex/pull/1843>`_
+
2.1.96
------
@@ -9,7 +19,7 @@ This is a hotfix release that fixes ``--venv`` mode
apply to ``sys.executable`` and not other Pythons.
* Fix ``--venv`` ``PEX PEX_EXTRA_SYS_PATH`` propagation. (#1837)
- `PR #1837 <https://github.com/pantsbuild/pex/pull/1837>`_
+ `PR #1837 <https://github.com/pantsbuild/pex/pull/1837>`_
2.1.95
------
diff --git a/pex/version.py b/pex/version.py
index 26a429cb9..3cebb26c2 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.96"
+__version__ = "2.1.97"
|
pex-tool__pex-1761 | Release 2.1.87
On the docket:
+ [ ] A relative --tmpdir foils pex3 lock create. #1758
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.86\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.87\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 37e7ea169..8a8cac55a 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,15 +1,23 @@
Release Notes
=============
+2.1.87
+------
+
+This release fixes ``pex3 lock create`` to handle relative ``--tmpdir``.
+
+* Fix lock save detection to be more robust. (#1760)
+ `PR #1760 <https://github.com/pantsbuild/pex/pull/1760>`_
+
2.1.86
------
This release fixes an oversight in lock file use against secured custom
indexes and find links repos. Previously credentials were passed during
-the lock creation process via either `~/.netrc` or via embedded
+the lock creation process via either ``~/.netrc`` or via embedded
credentials in the custom indexes and find links URLs Pex was configured
with. But, at lock use time, these credentials were not used. Now
-`~/.netrc` entries are always used and embedded credentials passed via
+``~/.netrc`` entries are always used and embedded credentials passed via
custom URLS at lock creation time can be passed in the same manner at
lock use time.
diff --git a/pex/version.py b/pex/version.py
index 10d51c427..3899ba6eb 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.86"
+__version__ = "2.1.87"
|
pex-tool__pex-1419 | Release 2.1.46
On the docket:
+ [x] Fix Pip proprietary URL env marker handling. #1417
+ [x] Un-reify installed wheel script shebangs. #1410
+ [x] Support deterministic repository extract tool. #1411
+ [x] support setuptools scripts #1379
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.45\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.46\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index c829f74d5..26dc72d40 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,32 @@
Release Notes
=============
+2.1.46
+------
+
+This release improves PEX file build reproducibility and requirement
+parsing of environment markers in Pip's proprietary URL format.
+
+Also, the `-c` / `--script` / `--console-script` argument now supports
+non-Python distribution scripts.
+
+Finally, new contributor @blag improved the README.
+
+* Fix Pip proprietary URL env marker handling. (#1417)
+ `PR #1417 <https://github.com/pantsbuild/pex/pull/1417>`_
+
+* Un-reify installed wheel script shebangs. (#1410)
+ `PR #1410 <https://github.com/pantsbuild/pex/pull/1410>`_
+
+* Support deterministic repository extract tool. (#1411)
+ `PR #1411 <https://github.com/pantsbuild/pex/pull/1411>`_
+
+* Improve examples and add example subsection titles (#1409)
+ `PR #1409 <https://github.com/pantsbuild/pex/pull/1409>`_
+
+* support any scripts specified in `setup(scripts=...)` from setup.py. (#1381)
+ `PR #1381 <https://github.com/pantsbuild/pex/pull/1381>`_
+
2.1.45
------
@@ -72,7 +98,7 @@ that improves Pip execution environment isolation.
2.1.41
------
-This release brings a hotfix from @kaos for interpreter identification
+This release brings a hotfix from @kaos for interpreter identification
on macOS 11.
* Update interpreter.py (#1332)
diff --git a/pex/version.py b/pex/version.py
index 29d0fe485..2c840cdc4 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.45"
+__version__ = "2.1.46"
|
pex-tool__pex-1664 | Release 2.1.71
On the docket:
+ [x] Secure Pex against sha1 collision attacks. #1662
+ [x] Problems building venvs from certain distributions. #1656
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.70\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.71\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index d8bc7c6e8..e7c692766 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,19 @@
Release Notes
=============
+2.1.71
+------
+
+This release fixes the instability introduced in 2.1.68 by switching to
+a more robust means of determining venv layouts. Along the way it
+upgrades Pex internals to cache all artifacts with strong hashes (
+previously sha1 was used). It's strongly recommended to upgrade or use
+the exclude ``!=2.1.68,!=2.1.69,!=2.1.70`` when depending on an open
+ended Pex version range.
+
+* Switch Pex installed wheels to ``--prefix`` scheme. (#1661)
+ `PR #1661 <https://github.com/pantsbuild/pex/pull/1661>`_
+
2.1.70
------
diff --git a/pex/version.py b/pex/version.py
index 4183f280c..ff6708e70 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.70"
+__version__ = "2.1.71"
|
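The notes above mention moving Pex's artifact cache from sha1 to strong hashes because sha1 is vulnerable to collision attacks. Content-addressed caching on a strong hash can be sketched like this (a generic illustration, not Pex's actual cache code):

```python
import hashlib


def fingerprint(data, algorithm="sha256"):
    """Return a strong content hash suitable for a cache key.

    sha1 is vulnerable to chosen-prefix collisions, so keying a
    content-addressed cache on sha256 ensures two artifacts only
    share a cache entry when their bytes are identical.
    """
    digest = hashlib.new(algorithm)
    digest.update(data)
    return digest.hexdigest()


print(fingerprint(b"hello"))
# 2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824
```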
pex-tool__pex-1559 | Release 2.1.61
On the docket:
+ [x] Merge packages for --venv-site-packages-copies. #1557
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.60\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.61\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 76c9fd97b..35792bd9e 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,16 @@
Release Notes
=============
+2.1.61
+------
+
+This release fixes a regression in Pex ``--venv`` mode compatibility
+with distributions that are members of a namespace package that was
+introduced by #1532 in the 2.1.57 release.
+
+* Merge packages for ``--venv-site-packages-copies``. (#1557)
+ `PR #1557 <https://github.com/pantsbuild/pex/pull/1557>`_
+
2.1.60
------
diff --git a/pex/version.py b/pex/version.py
index e534a9d5b..225183d03 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.60"
+__version__ = "2.1.61"
|
pex-tool__pex-1446 | Release 2.1.49
On the docket:
+ [ ] Avoid re-using old ~/.pex/code/ caches. #1444
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.48\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.49\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 52558ddeb..4cf9ec11a 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,17 @@
Release Notes
=============
+2.1.49
+------
+
+This is a hotfix release that fixes the new ``--layout {zipapp,packed}``
+modes for PEX files with no user code & just third party dependencies
+when executed against a ``$PEX_ROOT`` where similar PEXes built with the
+old ``--not-zip-safe`` option were were run in the past.
+
+* Avoid re-using old ~/.pex/code/ caches. (#1444)
+ `PR #1444 <https://github.com/pantsbuild/pex/pull/1444>`_
+
2.1.48
------
diff --git a/pex/version.py b/pex/version.py
index c923ea6ba..3977ea4eb 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.48"
+__version__ = "2.1.49"
|
pex-tool__pex-1482 | Release 2.1.51
On the docket:
+ [ ] UnicodeDecodeError when packaging after upgrading to v2.1.46 #1479
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.50\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.51\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index f05e62f6f..07f84eada 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,17 @@
Release Notes
=============
+2.1.51
+------
+
+This release fixes both PEX creation and ``--venv`` creation to handle
+distributions that contain scripts with non-ascii characters in them
+when running in environments with a default encoding that does not
+contain those characters under PyPy3, Python 3.5 and Python 3.6.
+
+* Fix non-ascii script shebang re-writing. (#1480)
+ `PR #1480 <https://github.com/pantsbuild/pex/pull/1480>`_
+
2.1.50
------
diff --git a/pex/version.py b/pex/version.py
index fed40855d..f960a0a0b 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.50"
+__version__ = "2.1.51"
|
pypi__warehouse-3165 | Add a "shortlink" for projects
**From user testing:**
When viewing projects on PyPI, some users type the URL directly if they know the project name.
We should create a shortlink like`pypi.org/p/myproject` which would redirect to `pypi.org/projects/myproject`
cc @di for feedback / guidance.
---
**Good First Issue**: This issue is good for first time contributors. If you've already contributed to Warehouse, please work on [another issue without this label](https://github.com/pypa/warehouse/issues?utf8=%E2%9C%93&q=is%3Aissue+is%3Aopen+-label%3A%22good+first+issue%22) instead. If there is not a corresponding pull request for this issue, it is up for grabs. For directions for getting set up, see our [Getting Started Guide](https://warehouse.pypa.io/development/getting-started/). If you are working on this issue and have questions, please feel free to ask them here, [`#pypa-dev` on Freenode](https://webchat.freenode.net/?channels=%23pypa-dev), or the [pypa-dev mailing list](https://groups.google.com/forum/#!forum/pypa-dev).
| [
{
"content": "# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, softw... | [
{
"content": "# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, softw... | diff --git a/tests/unit/test_routes.py b/tests/unit/test_routes.py
index 362b91010524..1d9607ca39a7 100644
--- a/tests/unit/test_routes.py
+++ b/tests/unit/test_routes.py
@@ -269,6 +269,7 @@ def add_policy(name, filename):
]
assert config.add_redirect.calls == [
+ pretend.call("/p/{name}/", "/project/{name}/", domain=warehouse),
pretend.call("/pypi/{name}/", "/project/{name}/", domain=warehouse),
pretend.call(
"/pypi/{name}/{version}/",
diff --git a/warehouse/routes.py b/warehouse/routes.py
index 0a49b73894fc..84902feb03f0 100644
--- a/warehouse/routes.py
+++ b/warehouse/routes.py
@@ -178,6 +178,7 @@ def includeme(config):
)
# Packaging
+ config.add_redirect('/p/{name}/', '/project/{name}/', domain=warehouse)
config.add_route(
"packaging.project",
"/project/{name}/",
|
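The diff above wires up the shortlink as a single redirect rule, `/p/{name}/` to `/project/{name}/`. The path mapping it performs can be sketched as a plain function — this is an illustrative stand-in, not Warehouse's actual Pyramid routing code:

```python
def shortlink_redirect(path):
    """Map a /p/<name>/ shortlink to its canonical project URL.

    Mirrors the redirect in the diff above
    ('/p/{name}/' -> '/project/{name}/'); returns None for paths
    the rule does not match.
    """
    prefix = "/p/"
    if path.startswith(prefix) and path.endswith("/") and len(path) > len(prefix) + 1:
        name = path[len(prefix):-1]
        return "/project/{}/".format(name)
    return None


print(shortlink_redirect("/p/requests/"))  # /project/requests/
print(shortlink_redirect("/about/"))      # None
```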
ivy-llc__ivy-23142 | ifft
| [
{
"content": "# local\nimport ivy\nfrom ivy.functional.frontends.jax.func_wrapper import to_ivy_arrays_and_back\nfrom ivy.func_wrapper import with_unsupported_dtypes\n\n\n@to_ivy_arrays_and_back\ndef fft(a, n=None, axis=-1, norm=None):\n if norm is None:\n norm = \"backward\"\n return ivy.fft(a, ax... | [
{
"content": "# local\nimport ivy\nfrom ivy.functional.frontends.jax.func_wrapper import to_ivy_arrays_and_back\nfrom ivy.func_wrapper import with_unsupported_dtypes\n\n\n@to_ivy_arrays_and_back\ndef fft(a, n=None, axis=-1, norm=None):\n if norm is None:\n norm = \"backward\"\n return ivy.fft(a, ax... | diff --git a/ivy/functional/frontends/jax/numpy/fft.py b/ivy/functional/frontends/jax/numpy/fft.py
index 7a62b524d67e8..16d9cb97e67b2 100644
--- a/ivy/functional/frontends/jax/numpy/fft.py
+++ b/ivy/functional/frontends/jax/numpy/fft.py
@@ -27,3 +27,10 @@ def fftshift(x, axes=None, name=None):
roll = ivy.roll(x, shifts, axis=axes)
return roll
+
+
+@to_ivy_arrays_and_back
+def ifft(a, n=None, axis=-1, norm=None):
+ if norm is None:
+ norm = "backward"
+ return ivy.ifft(a, axis, norm=norm, n=n)
diff --git a/ivy_tests/test_ivy/test_frontends/test_jax/test_numpy/test_fft.py b/ivy_tests/test_ivy/test_frontends/test_jax/test_numpy/test_fft.py
index ed316ca5a544e..7fd9f00f3f135 100644
--- a/ivy_tests/test_ivy/test_frontends/test_jax/test_numpy/test_fft.py
+++ b/ivy_tests/test_ivy/test_frontends/test_jax/test_numpy/test_fft.py
@@ -70,3 +70,45 @@ def test_jax_numpy_fftshift(
x=arr[0],
axes=None,
)
+
+
+# ifft
+@handle_frontend_test(
+ fn_tree="jax.numpy.fft.ifft",
+ dtype_values_axis=helpers.dtype_values_axis(
+ available_dtypes=helpers.get_dtypes("complex"),
+ num_arrays=1,
+ min_value=-1e5,
+ max_value=1e5,
+ min_num_dims=1,
+ max_num_dims=5,
+ min_dim_size=1,
+ max_dim_size=5,
+ allow_inf=False,
+ large_abs_safety_factor=2.5,
+ small_abs_safety_factor=2.5,
+ safety_factor_scale="log",
+ valid_axis=True,
+ force_int_axis=True,
+ ),
+ n=st.integers(min_value=2, max_value=10),
+ norm=st.sampled_from(["backward", "ortho", "forward", None]),
+)
+def test_jax_numpy_ifft(
+ dtype_values_axis, n, norm, frontend, backend_fw, test_flags, fn_tree, on_device
+):
+ dtype, values, axis = dtype_values_axis
+ helpers.test_frontend_function(
+ input_dtypes=dtype,
+ frontend=frontend,
+ backend_to_test=backend_fw,
+ test_flags=test_flags,
+ fn_tree=fn_tree,
+ on_device=on_device,
+ a=values[0],
+ n=n,
+ axis=axis,
+ norm=norm,
+ atol=1e-02,
+ rtol=1e-02,
+ )
|
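The frontend added above delegates to `ivy.ifft` with the default `"backward"` norm. That convention — no scaling on the forward transform, a 1/N factor on the inverse — can be illustrated with a naive DFT pair in pure Python (a teaching sketch, not Ivy's implementation):

```python
import cmath


def dft(x):
    """Naive forward DFT (no normalization -- the 'backward' convention)."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]


def idft(X):
    """Naive inverse DFT; the 'backward' norm puts the 1/N factor here."""
    n = len(X)
    return [sum(X[j] * cmath.exp(2j * cmath.pi * j * k / n) for j in range(n)) / n
            for k in range(n)]


signal = [1.0, 2.0, 3.0, 4.0]
roundtrip = idft(dft(signal))
print([round(v.real, 6) for v in roundtrip])  # [1.0, 2.0, 3.0, 4.0]
```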
pex-tool__pex-1251 | Release 2.1.31
On the docket:
+ [x] When Pex is run from a Pex PEX its isolation is broken. #1232
+ [x] The `--venv` mode `pex` script does not have a `__name__ == '__main__'` guard breaking multiprocessing. #1236
+ [x] The `--seed` mode for a `--venv` PEX is unsafe. #1239
+ [x] The venv `pex` script handles entrypoint functions differently from PEX. #1241
+ [x] Interpreter identification leaks an unconstrained `$PWD` entry into `sys.path`. #1231
+ [x] Support control of venv creation mode `--copies` vs. `--symlinks` #1230
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.30\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.31\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index fa1eefff1..85a8bdc9b 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,38 @@
Release Notes
=============
+2.1.31
+------
+
+This release primarily hardens Pex venvs fixing several bugs.
+
+* Fix Pex isolation. (#1250)
+ `PR #1250 <https://github.com/pantsbuild/pex/pull/1250>`_
+
+* Support pre-compiling a venv. (#1246)
+ `PR #1246 <https://github.com/pantsbuild/pex/pull/1246>`_
+
+* Support venv relocation. (#1247)
+ `PR #1247 <https://github.com/pantsbuild/pex/pull/1247>`_
+
+* Fix `--runtime-pex-root` leak in pex bootstrap. (#1244)
+ `PR #1244 <https://github.com/pantsbuild/pex/pull/1244>`_
+
+* Support venvs that can outlive their base python. (#1245)
+ `PR #1245 <https://github.com/pantsbuild/pex/pull/1245>`_
+
+* Harden Pex interpreter identification. (#1248)
+ `PR #1248 <https://github.com/pantsbuild/pex/pull/1248>`_
+
+* The `pex` venv script handles entrypoints like PEX. (#1242)
+ `PR #1242 <https://github.com/pantsbuild/pex/pull/1242>`_
+
+* Ensure PEX files aren't symlinked in venv. (#1240)
+ `PR #1240 <https://github.com/pantsbuild/pex/pull/1240>`_
+
+* Fix venv pex script for use with multiprocessing. (#1238)
+ `PR #1238 <https://github.com/pantsbuild/pex/pull/1238>`_
+
2.1.30
------
diff --git a/pex/version.py b/pex/version.py
index ea2323052..058623131 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.30"
+__version__ = "2.1.31"
|
pex-tool__pex-1502 | Release 2.1.53
On the docket:
+ [x] pex stops interpreter search if even one interpreter fails to identify itself #1494
+ [x] Add support for setting custom venv prompts. #1499
+ [x] How to know whether we are running from within pex #1485
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.52\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.53\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index e3cb3fcdf..72ae8ff25 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,20 @@
Release Notes
=============
+2.1.53
+------
+
+This release fixes a bug identifying certain interpreters on macOS
+Monterey. Additionally, Pex now exposes the ``PEX`` environment
+variable inside running PEXes to allow application code to both detect
+it's running from a PEX and determine where that PEX is located.
+
+* Guard against fake interpreters. (#1500)
+ `PR #1500 <https://github.com/pantsbuild/pex/pull/1500>`_
+
+* Introduce the ``PEX`` env var. (#1495)
+ `PR #1495 <https://github.com/pantsbuild/pex/pull/1495>`_
+
2.1.52
------
diff --git a/pex/version.py b/pex/version.py
index b9833ab3e..ea63babce 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.52"
+__version__ = "2.1.53"
|
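The release notes above introduce the ``PEX`` environment variable so application code can detect that it is running from a PEX and find that PEX's location. A detection sketch (the helper takes the environment mapping as a parameter purely to keep the example testable; inside a real PEX you would read `os.environ`):

```python
def pex_location(environ):
    """Return the running PEX's path, or None when not inside a PEX.

    Pex 2.1.53 exposes the ``PEX`` environment variable to user code;
    when set it holds the path of the PEX being executed.
    """
    return environ.get("PEX")


print(pex_location({"PEX": "/apps/tool.pex"}))  # /apps/tool.pex
print(pex_location({}))                         # None
```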
pex-tool__pex-1314 | Release 2.1.38
On the docket:
+ [ ] PEX direct requirement metadata for resolves via Pip is incorrect. #1311
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.37\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.38\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index f66b63894..778a61032 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,16 @@
Release Notes
=============
+2.1.38
+------
+
+A hotfix that finishes work started in 2.1.37 by #1304 to align Pip
+based resolve results with ``--pex-repository`` based resolve results
+for requirements with '.' in their names as allowed by PEP-503.
+
+* Fix PEX direct requirements metadata. (#1312)
+ `PR #1312 <https://github.com/pantsbuild/pex/pull/1312>`_
+
2.1.37
------
diff --git a/pex/version.py b/pex/version.py
index 7fc3ab4af..092a74bf7 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.37"
+__version__ = "2.1.38"
|
pex-tool__pex-1547 | Release 2.1.59
On the docket:
+ [x] Add knob for --venv site-packages symlinking. #1543
+ [x] Fix Pex to identify Python 3.10 interpreters. #1545
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.58\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.59\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index c8533e860..6f5b53a3e 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,21 @@
Release Notes
=============
+2.1.59
+------
+
+This release adds the boolean option ``--venv-site-packages-copies`` to
+control whether ``--venv`` execution mode PEXes create their venv with
+copies (hardlinks when possible) or symlinks. It also fixes a bug that
+prevented Python 3.10 interpreters from being discovered when
+``--interpreter-constraint`` was used.
+
+* Add knob for --venv site-packages symlinking. (#1543)
+ `PR #1543 <https://github.com/pantsbuild/pex/pull/1543>`_
+
+* Fix Pex to identify Python 3.10 interpreters. (#1545)
+ `PR #1545 <https://github.com/pantsbuild/pex/pull/1545>`_
+
2.1.58
------
diff --git a/pex/version.py b/pex/version.py
index 51231815e..30fbcbd9c 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.58"
+__version__ = "2.1.59"
|
pex-tool__pex-1255 | Release 2.1.32
On the docket:
+ [x] Venv `pex` and bin scripts can run afoul of shebang length limits. #1252
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.31\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.32\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 85a8bdc9b..d72ec401a 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,15 @@
Release Notes
=============
+2.1.32
+------
+
+This is a hotfix release that fixes ``--venv`` mode shebangs being too long for some Linux
+environments.
+
+* Guard against too long ``--venv`` mode shebangs. (#1254)
+ `PR #1254 <https://github.com/pantsbuild/pex/pull/1254>`_
+
2.1.31
------
diff --git a/pex/version.py b/pex/version.py
index 058623131..21d1e3b7c 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.31"
+__version__ = "2.1.32"
|
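The hotfix above guards against ``--venv`` shebangs that exceed the kernel's limit. On Linux the ``#!`` line is historically capped at ``BINPRM_BUF_SIZE`` (127 bytes before kernel 5.1, where longer lines were silently truncated). A length guard can be sketched like this — the constant and helper are illustrative, not Pex's actual code:

```python
# Historic Linux limit for the '#!' line (BINPRM_BUF_SIZE); kernels
# before 5.1 truncate longer shebangs, which surfaces as exec errors.
MAX_SHEBANG_LENGTH = 127


def shebang_fits(interpreter_path):
    """Return True if a '#!<path>' shebang fits within the limit."""
    return len("#!" + interpreter_path) <= MAX_SHEBANG_LENGTH


print(shebang_fits("/usr/bin/python3"))             # True
print(shebang_fits("/very/deep" * 20 + "/python3"))  # False
```

When the guard fails, a common workaround is an indirect launcher: a short shebang pointing at `/bin/sh` that execs the real, long interpreter path.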
pytest-dev__pytest-django-881 | admin_client is not checking for login success
`client.login` inside `admin_client` can return `False` in the case when there's an existing admin user with a password set to something other than `'password'`. Perhaps, `admin_client` should use `force_login` instead?
| [
{
"content": "\"\"\"All pytest-django fixtures\"\"\"\n\n\nimport os\nfrom contextlib import contextmanager\nfrom functools import partial\n\nimport pytest\n\nfrom . import live_server_helper\nfrom .django_compat import is_django_unittest\nfrom .lazy_django import skip_if_no_django\n\n__all__ = [\n \"django_d... | [
{
"content": "\"\"\"All pytest-django fixtures\"\"\"\n\n\nimport os\nfrom contextlib import contextmanager\nfrom functools import partial\n\nimport pytest\n\nfrom . import live_server_helper\nfrom .django_compat import is_django_unittest\nfrom .lazy_django import skip_if_no_django\n\n__all__ = [\n \"django_d... | diff --git a/docs/helpers.rst b/docs/helpers.rst
index 03434faf7..d70ffe2d0 100644
--- a/docs/helpers.rst
+++ b/docs/helpers.rst
@@ -158,14 +158,18 @@ Example
response = client.get('/')
assert response.content == 'Foobar'
-To use `client` as an authenticated standard user, call its `login()` method before accessing a URL:
+To use `client` as an authenticated standard user, call its `force_login()` or
+`login()` method before accessing a URL:
::
def test_with_authenticated_client(client, django_user_model):
username = "user1"
password = "bar"
- django_user_model.objects.create_user(username=username, password=password)
+ user = django_user_model.objects.create_user(username=username, password=password)
+ # Use this:
+ client.force_login(user)
+ # Or this:
client.login(username=username, password=password)
response = client.get('/private')
assert response.content == 'Protected Area'
diff --git a/pytest_django/fixtures.py b/pytest_django/fixtures.py
index 0f2dd6115..d1918d3fd 100644
--- a/pytest_django/fixtures.py
+++ b/pytest_django/fixtures.py
@@ -304,7 +304,7 @@ def admin_client(db, admin_user):
from django.test.client import Client
client = Client()
- client.login(username=admin_user.username, password="password")
+ client.force_login(admin_user)
return client
diff --git a/tests/test_fixtures.py b/tests/test_fixtures.py
index 6a8e4208a..26b5394aa 100644
--- a/tests/test_fixtures.py
+++ b/tests/test_fixtures.py
@@ -49,6 +49,17 @@ def test_admin_client_no_db_marker(admin_client):
assert force_str(resp.content) == "You are an admin"
+# For test below.
+@pytest.fixture
+def existing_admin_user(django_user_model):
+ return django_user_model._default_manager.create_superuser('admin', None, None)
+
+
+def test_admin_client_existing_user(db, existing_admin_user, admin_user, admin_client):
+ resp = admin_client.get("/admin-required/")
+ assert force_str(resp.content) == "You are an admin"
+
+
@pytest.mark.django_db
def test_admin_user(admin_user, django_user_model):
assert isinstance(admin_user, django_user_model)
|
scverse__scanpy-2566 | pyproject.toml should refer to `igraph` and not `python-igraph`
### Please make sure these conditions are met
- [X] I have checked that this issue has not already been reported.
- [X] I have confirmed this bug exists on the latest version of scanpy.
- [X] (optional) I have confirmed this bug exists on the master branch of scanpy.
### What happened?
I've noticed that `pyproject.toml` refers to the `python-igraph` package in the PyPI repository. This name is deprecated; the package is now simply called [`igraph`](https://pypi.org/project/igraph). The old package name currently works as a redirect (i.e. it brings in `igraph` as its own sub-dependency), but it will not be maintained in the future. Please switch to referring to `igraph` in `pyproject.toml` instead of `python-igraph`.
### Minimal code sample
```python
N/A
```
### Error output
_No response_
### Versions
<details>
```
-----
anndata 0.9.1
scanpy 1.9.3
-----
PIL 10.0.0
cycler 0.10.0
cython_runtime NA
dateutil 2.8.2
h5py 3.9.0
joblib 1.3.1
kiwisolver 1.4.4
llvmlite 0.40.1
matplotlib 3.7.2
mpl_toolkits NA
natsort 8.4.0
numba 0.57.1
numpy 1.24.4
packaging 23.1
pandas 2.0.3
pyparsing 3.0.9
pytz 2023.3
scipy 1.11.1
session_info 1.0.0
sitecustomize NA
six 1.16.0
sklearn 1.3.0
threadpoolctl 3.2.0
-----
Python 3.11.4 (main, Jun 20 2023, 17:23:00) [Clang 14.0.3 (clang-1403.0.22.14.1)]
macOS-13.2.1-arm64-arm-64bit
-----
Session information updated at 2023-07-19 13:34
```
</details>
BaseException: Could not construct partition: Weight vector not the same size as the number of edges.
- [X] I have checked that this issue has not already been reported.
- [X] I have confirmed this bug exists on the latest version of scanpy.
- [ ] (optional) I have confirmed this bug exists on the master branch of scanpy.
---
I have been trying to replicate [this tutorial](https://scanpy-tutorials.readthedocs.io/en/latest/paga-paul15.html#Clustering-and-PAGA) on trajectory inference. I have followed every step up until clustering, where I try to use `sc.tl.leiden(adata)` to cluster but keep getting the following error. The problem seems to resolve itself when installing leidenalg via pip, but with a conda install it fails every time.
### Minimal code sample (that we can copy&paste without having any data)
```python
sc.tl.leiden(adata)
```
```pytb
BaseException Traceback (most recent call last)
Cell In [15], line 1
----> 1 sc.tl.leiden(adata)
File ~/miniconda3/envs/py39/lib/python3.9/site-packages/scanpy/tools/_leiden.py:144, in leiden(adata, resolution, restrict_to, random_state, key_added, adjacency, directed, use_weights, n_iterations, partition_type, neighbors_key, obsp, copy, **partition_kwargs)
142 partition_kwargs['resolution_parameter'] = resolution
143 # clustering proper
--> 144 part = leidenalg.find_partition(g, partition_type, **partition_kwargs)
145 # store output into adata.obs
146 groups = np.array(part.membership)
File ~/miniconda3/envs/py39/lib/python3.9/site-packages/leidenalg/functions.py:81, in find_partition(graph, partition_type, initial_membership, weights, n_iterations, max_comm_size, seed, **kwargs)
79 if not weights is None:
80 kwargs['weights'] = weights
--> 81 partition = partition_type(graph,
82 initial_membership=initial_membership,
83 **kwargs)
84 optimiser = Optimiser()
86 optimiser.max_comm_size = max_comm_size
File ~/miniconda3/envs/py39/lib/python3.9/site-packages/leidenalg/VertexPartition.py:855, in RBConfigurationVertexPartition.__init__(self, graph, initial_membership, weights, node_sizes, resolution_parameter)
851 else:
852 # Make sure it is a list
853 node_sizes = list(node_sizes)
--> 855 self._partition = _c_leiden._new_RBConfigurationVertexPartition(pygraph_t,
856 initial_membership, weights, node_sizes, resolution_parameter)
857 self._update_internal_membership()
BaseException: Could not construct partition: Weight vector not the same size as the number of edges.
```
#### Versions
<details>
```
anndata 0.8.0
scanpy 1.9.1
-----
PIL 9.2.0
appnope 0.1.3
asttokens NA
backcall 0.2.0
beta_ufunc NA
binom_ufunc NA
cffi 1.15.1
colorama 0.4.5
cycler 0.10.0
cython_runtime NA
dateutil 2.8.2
debugpy 1.6.3
decorator 5.1.1
defusedxml 0.7.1
entrypoints 0.4
executing 1.1.0
h5py 3.7.0
hypergeom_ufunc NA
igraph 0.9.11
ipykernel 6.16.0
ipython_genutils 0.2.0
ipywidgets 8.0.2
jedi 0.18.1
joblib 1.2.0
jupyter_server 1.19.1
kiwisolver 1.4.4
leidenalg 0.8.10
llvmlite 0.39.1
louvain 0.7.1
matplotlib 3.6.0
matplotlib_inline 0.1.6
mpl_toolkits NA
natsort 8.2.0
nbinom_ufunc NA
ncf_ufunc NA
numba 0.56.2
numpy 1.23.3
packaging 21.3
pandas 1.5.0
parso 0.8.3
pexpect 4.8.0
pickleshare 0.7.5
pkg_resources NA
prompt_toolkit 3.0.31
psutil 5.9.2
ptyprocess 0.7.0
pure_eval 0.2.2
pycparser 2.21
pydev_ipython NA
pydevconsole NA
pydevd 2.8.0
pydevd_file_utils NA
pydevd_plugins NA
pydevd_tracing NA
pygments 2.13.0
pynndescent 0.5.7
pyparsing 3.0.9
pytz 2022.2.1
scipy 1.9.1
session_info 1.0.0
setuptools 65.4.0
six 1.16.0
sklearn 1.1.2
sphinxcontrib NA
stack_data 0.5.1
statsmodels 0.13.2
texttable 1.6.4
threadpoolctl 3.1.0
tornado 6.2
tqdm 4.64.1
traitlets 5.4.0
typing_extensions NA
umap 0.5.3
wcwidth 0.2.5
zipp NA
zmq 24.0.1
zoneinfo NA
-----
IPython 8.5.0
jupyter_client 7.3.5
jupyter_core 4.11.1
jupyterlab 3.4.7
notebook 6.4.12
-----
Python 3.9.13 | packaged by conda-forge | (main, May 27 2022, 17:00:33) [Clang 13.0.1 ]
macOS-12.6-arm64-arm-64bit
-----
Session information updated at 2022-09-29 11:08
```
</details>
| [
{
"content": "\"\"\"Logging and Profiling\n\"\"\"\nimport logging\nimport sys\nfrom functools import update_wrapper, partial\nfrom logging import CRITICAL, ERROR, WARNING, INFO, DEBUG\nfrom datetime import datetime, timedelta, timezone\nfrom typing import Optional, IO\nimport warnings\n\nimport anndata.logging\... | [
{
"content": "\"\"\"Logging and Profiling\n\"\"\"\nimport logging\nimport sys\nfrom functools import update_wrapper, partial\nfrom logging import CRITICAL, ERROR, WARNING, INFO, DEBUG\nfrom datetime import datetime, timedelta, timezone\nfrom typing import Optional, IO\nimport warnings\n\nimport anndata.logging\... | diff --git a/.gitignore b/.gitignore
index ed60829903..74a7fdcc33 100644
--- a/.gitignore
+++ b/.gitignore
@@ -21,6 +21,7 @@
/scanpy/tests/notebooks/figures/
# Environment management
+/hatch.toml
/Pipfile
/Pipfile.lock
/requirements*.lock
diff --git a/docs/installation.md b/docs/installation.md
index b8b75e1a9d..c94883946b 100644
--- a/docs/installation.md
+++ b/docs/installation.md
@@ -24,7 +24,7 @@ pip install 'scanpy[leiden]'
```
The extra `[leiden]` installs two packages that are needed for popular
-parts of scanpy but aren't requirements: [python-igraph] [^cite_csardi06] and [leiden] [^cite_traag18].
+parts of scanpy but aren't requirements: [igraph] [^cite_csardi06] and [leiden] [^cite_traag18].
(dev-install-instructions)=
@@ -83,7 +83,7 @@ pip install --user scanpy
- `brew install igraph`
-- If python-igraph still fails to install, see the question on [compiling igraph].
+- If igraph still fails to install, see the question on [compiling igraph].
Alternatively consider installing gcc via `brew install gcc --without-multilib`
and exporting the required variables:
@@ -125,5 +125,5 @@ The whole process takes just a couple of minutes.
[leiden]: https://leidenalg.readthedocs.io
[miniconda]: http://conda.pydata.org/miniconda.html
[on github]: https://github.com/scverse/scanpy
-[python-igraph]: http://igraph.org/python/
+[igraph]: https://python.igraph.org/en/stable/
[unofficial binaries]: https://www.lfd.uci.edu/~gohlke/pythonlibs/
diff --git a/pyproject.toml b/pyproject.toml
index 1a0d3abdac..076155e867 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -131,9 +131,9 @@ dev = [
"docutils",
]
# Algorithms
-paga = ["python-igraph"]
-louvain = ["python-igraph", "louvain>=0.6,!=0.6.2"] # Louvain community detection
-leiden = ["python-igraph", "leidenalg"] # Leiden community detection
+paga = ["igraph"]
+louvain = ["igraph", "louvain>=0.6,!=0.6.2"] # Louvain community detection
+leiden = ["igraph>=0.10", "leidenalg>=0.9"] # Leiden community detection
bbknn = ["bbknn"] # Batch balanced KNN (batch correction)
magic = ["magic-impute>=2.0"] # MAGIC imputation method
skmisc = ["scikit-misc>=0.1.3"] # highly_variable_genes method 'seurat_v3'
diff --git a/scanpy/logging.py b/scanpy/logging.py
index 712a187961..086c559593 100644
--- a/scanpy/logging.py
+++ b/scanpy/logging.py
@@ -127,7 +127,7 @@ def format(self, record: logging.LogRecord):
'pandas',
('sklearn', 'scikit-learn'),
'statsmodels',
- ('igraph', 'python-igraph'),
+ 'igraph',
'louvain',
'leidenalg',
'pynndescent',
diff --git a/scanpy/testing/_pytest/marks.py b/scanpy/testing/_pytest/marks.py
index 64ef05d24e..6ea633bf17 100644
--- a/scanpy/testing/_pytest/marks.py
+++ b/scanpy/testing/_pytest/marks.py
@@ -10,7 +10,7 @@
louvain="louvain",
skmisc="scikit-misc",
fa2="fa2",
- igraph="python-igraph",
+ igraph="igraph",
dask="dask",
zarr="zarr",
zappy="zappy",
|
pex-tool__pex-1618 | Release 2.1.67
On the docket:
+ [x] Expand --platform syntax: support full versions. #1614
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.66\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.67\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 7ade9666f..3d9366907 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,20 @@
Release Notes
=============
+2.1.67
+------
+
+This release brings support for `--platform` arguments with a
+3-component PYVER portion. This supports working around
+`python_full_version` environment marker evaluation failures for
+`--platform` resolves by changing, for example, a platform of
+`linux_x86_64-cp-38-cp38` to `linux_x86_64-cp-3.8.10-cp38`. This is
+likely a simpler way to work around these issues than using the
+`--complete-platform` facility introduced in 2.1.66 by #1609.
+
+* Expand `--platform` syntax: support full versions. (#1614)
+ `PR #1614 <https://github.com/pantsbuild/pex/pull/1614>`_
+
2.1.66
------
diff --git a/pex/version.py b/pex/version.py
index 24551f628..872eae86e 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.66"
+__version__ = "2.1.67"
diff --git a/tests/test_pep_508.py b/tests/test_pep_508.py
index 8842ce185..a108b19b2 100644
--- a/tests/test_pep_508.py
+++ b/tests/test_pep_508.py
@@ -50,13 +50,11 @@ def test_extended_platform_marker_environment():
# type: () -> None
platform = Platform.create("linux-x86_64-cp-3.10.1-cp310")
marker_environment = MarkerEnvironment.from_platform(platform)
- env_defaulted = marker_environment.as_dict(default_unknown=True)
- env_sparse = marker_environment.as_dict(default_unknown=False)
+ env = marker_environment.as_dict()
def assert_known_marker(expression):
# type: (str) -> None
- assert evaluate_marker(expression, env_defaulted)
- assert evaluate_marker(expression, env_sparse)
+ assert evaluate_marker(expression, env)
assert_known_marker("python_full_version == '3.10.1'")
assert_known_marker("python_version == '3.10'")
@@ -66,9 +64,8 @@ def assert_known_marker(expression):
def assert_unknown_marker(expression):
# type: (str) -> None
- assert not evaluate_marker(expression, env_defaulted)
with pytest.raises(markers.UndefinedEnvironmentName):
- evaluate_marker(expression, env_sparse)
+ evaluate_marker(expression, env)
assert_unknown_marker("platform_release == '5.12.12-arch1-1'")
assert_unknown_marker("platform_version == '#1 SMP PREEMPT Fri, 18 Jun 2021 21:59:22 +0000'")
|
Bitmessage__PyBitmessage-1387 | Better logging
Using the built-in Python logging module, I've made various log levels possible and made the creation of a log file a matter of changing the configuration in debug.py. The Python logging module is thread-safe, so we can safely replace all `print` calls with calls to `logger`. I only replaced some of them, mainly to test the configuration (and there are a lot of `print` calls).
There are some commits in my merge that mention translation files; I'm working on that and didn't mean to include them in this merge. I deleted them, but the commit history is already there.
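A minimal sketch of what such a debug.py-style configuration might look like (the function and logger names here are hypothetical illustrations, not PyBitmessage's actual API):

```python
import logging

def setup_logging(logfile=None, level=logging.DEBUG):
    """Configure a logger once; swap console output for a file by
    passing a path, mirroring the 'change the configuration' idea."""
    logger = logging.getLogger("default")
    logger.setLevel(level)
    fmt = logging.Formatter("%(asctime)s - %(levelname)s - %(message)s")
    # One handler: stderr by default, or a file if a path is given
    handler = logging.StreamHandler() if logfile is None else logging.FileHandler(logfile)
    handler.setFormatter(fmt)
    logger.addHandler(handler)
    return logger

logger = setup_logging()
logger.info("console handler active")
```

Because the module is thread-safe, `logger.debug(...)`/`logger.info(...)` calls can replace `print` directly without extra locking.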
| [
{
"content": "#!/usr/bin/env python2\n\"\"\"\nCheck dependendies and give recommendations about how to satisfy them\n\nLimitations:\n\n * Does not detect whether packages are already installed. Solving this requires writing more of a configuration\n management system. Or we could switch to an existing one... | [
{
"content": "#!/usr/bin/env python2\n\"\"\"\nCheck dependendies and give recommendations about how to satisfy them\n\nLimitations:\n\n * Does not detect whether packages are already installed. Solving this requires writing more of a configuration\n management system. Or we could switch to an existing one... | diff --git a/checkdeps.py b/checkdeps.py
index c0e1005199..03782037e8 100755
--- a/checkdeps.py
+++ b/checkdeps.py
@@ -22,7 +22,7 @@
from setup import EXTRAS_REQUIRE
except ImportError:
HAVE_SETUPTOOLS = False
- EXTRAS_REQUIRE = []
+ EXTRAS_REQUIRE = {}
from importlib import import_module
|
liqd__a4-meinberlin-4706 | #6460 Previous/Next Button Poll Request Results no background color
**URL:** https://meinberlin-dev.liqd.net/projekte/test-poll-merge-running-poll-with-user-content/
**user:** any
**expected behaviour:** Previous/Next button on the poll request results has a pink background.
**behaviour:** Button has no background. Only the outlines turn pink when the button is clicked
**important screensize:**
**device & browser:**
**Comment/Question:**
Screenshot?
dev:
<img width="286" alt="Bildschirmfoto 2022-11-09 um 05 38 05" src="https://user-images.githubusercontent.com/113356258/200740386-60d26bc2-f169-40e4-9730-79d6d8724dad.png">
<img width="220" alt="Bildschirmfoto 2022-11-09 um 05 40 30" src="https://user-images.githubusercontent.com/113356258/200740411-e40f6bf6-83ba-468f-a941-93bbfe045993.png">
stage:
<img width="189" alt="Bildschirmfoto 2022-11-09 um 05 44 21" src="https://user-images.githubusercontent.com/113356258/200740726-f116d498-cb19-4074-bd57-541f7d5d8d2a.png">
| [
{
"content": "from django.contrib import messages\nfrom django.db import transaction\nfrom django.urls import reverse\nfrom django.utils.translation import gettext_lazy as _\nfrom django.views import generic\n\nfrom adhocracy4.categories import filters as category_filters\nfrom adhocracy4.exports.views import D... | [
{
"content": "from django.contrib import messages\nfrom django.db import transaction\nfrom django.urls import reverse\nfrom django.utils.translation import gettext_lazy as _\nfrom django.views import generic\n\nfrom adhocracy4.categories import filters as category_filters\nfrom adhocracy4.exports.views import D... | diff --git a/meinberlin/apps/ideas/views.py b/meinberlin/apps/ideas/views.py
index fb0f02f039..2bd994b679 100644
--- a/meinberlin/apps/ideas/views.py
+++ b/meinberlin/apps/ideas/views.py
@@ -55,7 +55,7 @@ class IdeaFilterSet(a4_filters.DefaultsFilterSet):
class Meta:
model = models.Idea
- fields = ['search', 'labels', 'category']
+ fields = ['search', 'category', 'labels']
class AbstractIdeaListView(ProjectMixin,
diff --git a/meinberlin/assets/scss/components/_dropdown.scss b/meinberlin/assets/scss/components/_dropdown.scss
index f9db9a61fe..abe6aad79c 100644
--- a/meinberlin/assets/scss/components/_dropdown.scss
+++ b/meinberlin/assets/scss/components/_dropdown.scss
@@ -47,9 +47,7 @@
}
}
- button:last-child,
- li:last-child > a,
- li:last-child > button {
+ &:last-child {
border-bottom: 0;
}
diff --git a/meinberlin/assets/scss/components/_poll.scss b/meinberlin/assets/scss/components/_poll.scss
index bb11928dc8..587ec9085d 100644
--- a/meinberlin/assets/scss/components/_poll.scss
+++ b/meinberlin/assets/scss/components/_poll.scss
@@ -241,7 +241,7 @@ $checkbox-size: 20px;
}
.poll-slider__answer {
- padding-bottom: 1.75 * $spacer;
+ padding-bottom: 2 * $spacer;
i {
color: $gray-lighter;
@@ -253,7 +253,7 @@ $checkbox-size: 20px;
font-size: $font-size-xs;
color: $gray-lighter;
position: absolute;
- bottom: 1.25 * $spacer; // to allign with arrows
+ bottom: 1.75 * $spacer; // to allign with arrows
left: $spacer;
}
@@ -266,7 +266,7 @@ $checkbox-size: 20px;
// slick overwrites - nested for specificity
.slick-prev {
left: revert !important;
- right: 4 * $spacer !important;
+ right: 5 * $spacer !important;
}
.slick-next {
@@ -275,13 +275,14 @@ $checkbox-size: 20px;
.slick-prev,
.slick-next {
+ @extend .btn;
@extend .btn--primary;
position: absolute;
top: revert;
bottom: 0;
text-align: center;
- width: 30px;
- height: 30px;
+ width: 40px;
+ height: 40px;
border-radius: 100%;
z-index: 1; // for when tile links overlap
@@ -289,6 +290,7 @@ $checkbox-size: 20px;
opacity: 0.25;
cursor: not-allowed;
pointer-events: none;
+ box-shadow: none;
}
&:before {
@@ -296,8 +298,8 @@ $checkbox-size: 20px;
opacity: 1;
font-family: "Font Awesome 6 Free", sans-serif;
font-weight: 900;
- font-size: $font-size-xl;
- line-height: 1.3rem;
+ font-size: $font-size-xxl;
+ line-height: 1.6rem;
}
}
@@ -325,7 +327,7 @@ $checkbox-size: 20px;
@media (min-width: $breakpoint) {
.poll-slider__count--spaced {
left: revert;
- right: 6.5 * $spacer;
+ right: 8.5 * $spacer;
}
.poll-slider__count {
|
pwndbg__pwndbg-1104 | `dX` commands truncate output longer than native word size
### Example
The screenshot below shows pwndbg commands issued when debugging an x86 program.
Note that some of the data printed by the `dd` command is omitted by the `dq` command:

### Cause
This happens in the first line of `enhex()`, which is called by `dX()`:
https://github.com/pwndbg/pwndbg/blob/5d358585b1149aead6774f17c5721f10c4bed7be/pwndbg/commands/windbg.py#L137-L138
`value` is masked to the native word size, resulting in loss of information when `dX()` tries to print words longer than this, e.g. printing quadwords from an x86 process memory.
### Possible solution
Making the mask in `enhex()` fit the requested data width could fix this.
`pwndbg.arch.ptrmask` is calculated like so: https://github.com/pwndbg/pwndbg/blob/5d358585b1149aead6774f17c5721f10c4bed7be/pwndbg/arch.py#L53
So perhaps replacing the first line of `enhex()` with `value = value & (1 << 8*size) - 1` might work.
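The effect of the two masks can be checked in plain Python; this sketch inlines `enhex()` with the proposed size-based mask (the example value is arbitrary):

```python
# pwndbg's native word mask on x86 (ptrsize == 4)
ptrmask = (1 << 8 * 4) - 1  # 0xffffffff

value = 0xCAFEBABEDEADBEEF  # a quadword read from memory

# current behaviour: masking with the native word size drops the high dword
assert value & ptrmask == 0xDEADBEEF

def enhex(size, value):
    # proposed fix: size the mask to the requested data width (size in bytes)
    value = value & ((1 << 8 * size) - 1)
    return ("%x" % abs(value)).rjust(size * 2, "0")

assert enhex(8, value) == "cafebabedeadbeef"  # quadword survives intact
assert enhex(4, value) == "deadbeef"          # dword view unchanged
```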
| [
{
"content": "\"\"\"\nCompatibility functionality for Windbg users.\n\"\"\"\n\nimport argparse\nimport codecs\nimport math\nimport sys\nfrom builtins import str\n\nimport gdb\n\nimport pwndbg.arch\nimport pwndbg.commands\nimport pwndbg.memory\nimport pwndbg.strings\nimport pwndbg.symbol\nimport pwndbg.typeinfo\... | [
{
"content": "\"\"\"\nCompatibility functionality for Windbg users.\n\"\"\"\n\nimport argparse\nimport codecs\nimport math\nimport sys\nfrom builtins import str\n\nimport gdb\n\nimport pwndbg.arch\nimport pwndbg.commands\nimport pwndbg.memory\nimport pwndbg.strings\nimport pwndbg.symbol\nimport pwndbg.typeinfo\... | diff --git a/pwndbg/commands/windbg.py b/pwndbg/commands/windbg.py
index f02bb70d6e4..e81af32eeaf 100644
--- a/pwndbg/commands/windbg.py
+++ b/pwndbg/commands/windbg.py
@@ -190,7 +190,7 @@ def dX(size, address, count, to_string=False, repeat=False):
def enhex(size, value):
- value = value & pwndbg.arch.ptrmask
+ value = value & ((1 << 8 * size) - 1)
x = "%x" % abs(value)
x = x.rjust(size * 2, "0")
return x
diff --git a/tests/test_windbg.py b/tests/test_windbg.py
index 1af5dbedadd..8aabb7c7816 100644
--- a/tests/test_windbg.py
+++ b/tests/test_windbg.py
@@ -4,6 +4,8 @@
import tests
MEMORY_BINARY = tests.binaries.get("memory.out")
+X86_BINARY = tests.binaries.get("gosample.x86")
+
data_addr = "0x400081"
@@ -299,3 +301,67 @@ def test_windbg_eX_commands(start_binary):
# Check if the write actually occurred
assert pwndbg.memory.read(stack_last_qword_ea, 8) == b"\xef\xbe\xad\xde\xbe\xba\xfe\xca"
+
+
+def test_windbg_commands_x86(start_binary):
+ """
+ Tests windbg compatibility commands that dump memory
+ like dq, dw, db, ds etc.
+ """
+ start_binary(X86_BINARY)
+
+ # Prepare memory
+ pwndbg.memory.write(pwndbg.regs.esp, b"1234567890abcdef_")
+ pwndbg.memory.write(pwndbg.regs.esp + 16, b"\x00" * 16)
+ pwndbg.memory.write(pwndbg.regs.esp + 32, bytes(range(16)))
+ pwndbg.memory.write(pwndbg.regs.esp + 48, b"Z" * 16)
+
+ #################################################
+ #### dX command tests
+ #################################################
+ db = gdb.execute("db $esp", to_string=True).splitlines()
+ assert db == [
+ "%x 31 32 33 34 35 36 37 38 39 30 61 62 63 64 65 66" % pwndbg.regs.esp,
+ "%x 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00" % (pwndbg.regs.esp + 16),
+ "%x 00 01 02 03 04 05 06 07 08 09 0a 0b 0c 0d 0e 0f" % (pwndbg.regs.esp + 32),
+ "%x 5a 5a 5a 5a 5a 5a 5a 5a 5a 5a 5a 5a 5a 5a 5a 5a" % (pwndbg.regs.esp + 48),
+ ]
+
+ dw = gdb.execute("dw $esp", to_string=True).splitlines()
+ assert dw == [
+ "%x 3231 3433 3635 3837 3039 6261 6463 6665" % pwndbg.regs.esp,
+ "%x 0000 0000 0000 0000 0000 0000 0000 0000" % (pwndbg.regs.esp + 16),
+ "%x 0100 0302 0504 0706 0908 0b0a 0d0c 0f0e" % (pwndbg.regs.esp + 32),
+ "%x 5a5a 5a5a 5a5a 5a5a 5a5a 5a5a 5a5a 5a5a" % (pwndbg.regs.esp + 48),
+ ]
+
+ dd = gdb.execute("dd $esp", to_string=True).splitlines()
+ assert dd == [
+ "%x 34333231 38373635 62613039 66656463" % pwndbg.regs.esp,
+ "%x 00000000 00000000 00000000 00000000" % (pwndbg.regs.esp + 16),
+ "%x 03020100 07060504 0b0a0908 0f0e0d0c" % (pwndbg.regs.esp + 32),
+ "%x 5a5a5a5a 5a5a5a5a 5a5a5a5a 5a5a5a5a" % (pwndbg.regs.esp + 48),
+ ]
+
+ dq = gdb.execute("dq $esp", to_string=True).splitlines()
+ assert dq == [
+ "%x 3837363534333231 6665646362613039" % pwndbg.regs.esp,
+ "%x 0000000000000000 0000000000000000" % (pwndbg.regs.esp + 16),
+ "%x 0706050403020100 0f0e0d0c0b0a0908" % (pwndbg.regs.esp + 32),
+ "%x 5a5a5a5a5a5a5a5a 5a5a5a5a5a5a5a5a" % (pwndbg.regs.esp + 48),
+ ]
+
+ #################################################
+ #### eX command tests
+ #################################################
+ gdb.execute("eb $esp 00")
+ assert pwndbg.memory.read(pwndbg.regs.esp, 1) == b"\x00"
+
+ gdb.execute("ew $esp 4141")
+ assert pwndbg.memory.read(pwndbg.regs.esp, 2) == b"\x41\x41"
+
+ gdb.execute("ed $esp 5252525252")
+ assert pwndbg.memory.read(pwndbg.regs.esp, 4) == b"\x52" * 4
+
+ gdb.execute("eq $esp 1122334455667788")
+ assert pwndbg.memory.read(pwndbg.regs.esp, 8) == b"\x88\x77\x66\x55\x44\x33\x22\x11"
|
readthedocs__readthedocs.org-10610 | Change profile edit form success page
Currently, when a user saves the profile edit form, the success page is not the profile form page; instead, the user gets redirected to the public profile view page. This is quite confusing UX but might be baked into Allauth. I would expect to end up on the profile edit form page instead.
| [
{
"content": "\"\"\"Views for creating, editing and viewing site-specific user profiles.\"\"\"\n\nfrom allauth.account.views import LoginView as AllAuthLoginView\nfrom allauth.account.views import LogoutView as AllAuthLogoutView\nfrom django.conf import settings\nfrom django.contrib import messages\nfrom django... | [
{
"content": "\"\"\"Views for creating, editing and viewing site-specific user profiles.\"\"\"\n\nfrom allauth.account.views import LoginView as AllAuthLoginView\nfrom allauth.account.views import LogoutView as AllAuthLogoutView\nfrom django.conf import settings\nfrom django.contrib import messages\nfrom django... | diff --git a/readthedocs/profiles/views.py b/readthedocs/profiles/views.py
index 97607e0eda5..00f72266458 100644
--- a/readthedocs/profiles/views.py
+++ b/readthedocs/profiles/views.py
@@ -61,8 +61,7 @@ def get_object(self):
def get_success_url(self):
return reverse(
- 'profiles_profile_detail',
- kwargs={'username': self.request.user.username},
+ "profiles_profile_edit",
)
diff --git a/readthedocs/rtd_tests/tests/test_profile_views.py b/readthedocs/rtd_tests/tests/test_profile_views.py
index 23f5297fc5d..a8bf31d439b 100644
--- a/readthedocs/rtd_tests/tests/test_profile_views.py
+++ b/readthedocs/rtd_tests/tests/test_profile_views.py
@@ -36,6 +36,7 @@ def test_edit_profile(self):
},
)
self.assertTrue(resp.status_code, 200)
+ self.assertEqual(resp["Location"], "/accounts/edit/")
self.user.refresh_from_db()
self.user.profile.refresh_from_db()
|
pex-tool__pex-1859 | Release 2.1.100
On the docket:
+ [x] Using --target-system linux --target-system mac can still lead to failed attempts to lock Windows requirements. #1856
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.99\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.100\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 134903f81..4c6cf4cb8 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,15 @@
Release Notes
=============
+2.1.100
+-------
+
+This release fixes a hole in the lock creation ``--target-system``
+feature added in #1823 in Pex 2.1.95.
+
+* Fix lock creation ``--target-system`` handling. (#1858)
+ `PR #1858 <https://github.com/pantsbuild/pex/pull/1858>`_
+
2.1.99
------
diff --git a/pex/version.py b/pex/version.py
index 1262846c2..80d82318d 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.99"
+__version__ = "2.1.100"
|
microsoft__Qcodes-5046 | Update Sphinx favicon config
Thanks for using [Sphinx Favicon](https://github.com/tcmetzger/sphinx-favicon) in your project! I just released version 1.0 of the extension, which brings one breaking change: to better conform with Python standards, we changed the module name to `sphinx_favicon` (instead of `sphinx-favicon`). This means you'll have to update the name in the `extensions` list of your conf.py file (https://github.com/QCoDeS/Qcodes/blob/master/docs/conf.py#L81) to use version 1.0. Otherwise, your existing configuration should continue to work!
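The change amounts to a one-line rename in conf.py's `extensions` list. A minimal sketch (the other list entry is a placeholder, not the project's actual configuration):

```python
# docs/conf.py (sketch)
extensions = [
    "sphinx.ext.autodoc",  # placeholder for the project's other extensions
    "sphinx_favicon",      # module name as of sphinx-favicon 1.0
    # "sphinx-favicon",    # pre-1.0 name, deprecated
]
```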
| [
{
"content": "#!/usr/bin/env python3\n#\n# QCoDeS documentation build configuration file, created by\n# sphinx-quickstart on Thu Jun 2 10:41:37 2016.\n#\n# This file is execfile()d with the current directory set to its\n# containing dir.\n#\n# Note that not all possible configuration values are present in this... | [
{
"content": "#!/usr/bin/env python3\n#\n# QCoDeS documentation build configuration file, created by\n# sphinx-quickstart on Thu Jun 2 10:41:37 2016.\n#\n# This file is execfile()d with the current directory set to its\n# containing dir.\n#\n# Note that not all possible configuration values are present in this... | diff --git a/docs/conf.py b/docs/conf.py
index f199041d57f..de9b9b3413f 100644
--- a/docs/conf.py
+++ b/docs/conf.py
@@ -78,7 +78,7 @@
"sphinxcontrib.towncrier",
"autodocsumm",
"sphinx_issues",
- "sphinx-favicon",
+ "sphinx_favicon",
]
# include special __xxx__ that DO have a docstring
diff --git a/pyproject.toml b/pyproject.toml
index 1b6ece851b0..c851529a1b1 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -107,7 +107,7 @@ docs = [
"nbsphinx>=0.8.9",
"PyVisa-sim>=0.4.0",
"sphinx>=4.5.0",
- "sphinx-favicon>=0.2",
+ "sphinx-favicon>=1.0",
"sphinx-issues>=3.0.1",
"sphinx-jsonschema>=1.19.1",
"sphinx-rtd-theme>=1.0.0",
diff --git a/requirements.txt b/requirements.txt
index d58457f88a5..de398991089 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -139,7 +139,7 @@ snowballstemmer~=2.2.0
sortedcontainers~=2.4.0
soupsieve~=2.4
Sphinx @ git+https://github.com/jenshnielsen/sphinx.git@fix_9884_6_1_3
-sphinx-favicon==0.2
+sphinx-favicon==1.0
sphinx-issues~=3.0.1
sphinx-jsonschema~=1.19.1
sphinx-rtd-theme~=1.2.0
|
codespell-project__codespell-89 | Makefile is broken
The Makefile is no longer working, since there is no longer a `codespell.py`.
| [
{
"content": "#! /usr/bin/env python\n\n# adapted from mne-python\n\nimport os\nfrom os import path as op\n\ntry:\n import setuptools # noqa to allow --develop\nexcept Exception:\n pass\nfrom distutils.core import setup\n\nfrom codespell_lib import __version__\n\nDISTNAME = 'codespell'\nDESCRIPTION = \"\... | [
{
"content": "#! /usr/bin/env python\n\n# adapted from mne-python\n\nimport os\nfrom os import path as op\n\ntry:\n import setuptools # noqa to allow --develop\nexcept Exception:\n pass\nfrom distutils.core import setup\n\nfrom codespell_lib import __version__\n\nDISTNAME = 'codespell'\nDESCRIPTION = \"\... | diff --git a/README.rst b/README.rst
index cc19f4cf8f..44b4f0b749 100644
--- a/README.rst
+++ b/README.rst
@@ -31,7 +31,7 @@ You can use ``pip`` to install codespell with e.g.::
Usage
-----
-Check usage with ``./codespell.py -h``. There are a few command line options.
+Check usage with ``codespell -h``. There are a few command line options.
Note that upon installation with "make install" we don't have the "py" suffix.
We ship a dictionary that is an improved version of the one available
`on Wikipedia <https://en.wikipedia.org/wiki/Wikipedia:Lists_of_common_misspellings/For_machines>`_
@@ -71,7 +71,7 @@ directly, but instead be manually inspected. E.g.:
License
-------
-The Python script ``codespell.py`` is available with the following terms:
+The Python script ``codespell`` is available with the following terms:
(*tl;dr*: `GPL v2`_)
Copyright (C) 2010-2011 Lucas De Marchi <lucas.de.marchi@gmail.com>
diff --git a/bin/codespell.py b/bin/codespell
similarity index 100%
rename from bin/codespell.py
rename to bin/codespell
diff --git a/codespell_lib/tests/test_basic.py b/codespell_lib/tests/test_basic.py
index 05d2fe9693..c0285412f9 100644
--- a/codespell_lib/tests/test_basic.py
+++ b/codespell_lib/tests/test_basic.py
@@ -18,12 +18,12 @@
def run_codespell(args=(), cwd=None):
"""Helper to run codespell"""
return subprocess.Popen(
- ['codespell.py'] + list(args), cwd=cwd,
+ ['codespell'] + list(args), cwd=cwd,
stdout=subprocess.PIPE, stderr=subprocess.PIPE).wait()
def test_command():
- """Test running codespell.py"""
+ """Test running the codespell executable"""
# With no arguments does "."
with TemporaryDirectory() as d:
assert_equal(run_codespell(cwd=d), 0)
diff --git a/setup.py b/setup.py
index 5619389a75..e43fbf23a9 100755
--- a/setup.py
+++ b/setup.py
@@ -55,4 +55,4 @@
op.join('data', 'dictionary.txt'),
op.join('data', 'linux-kernel.exclude'),
]},
- scripts=['bin/codespell.py'])
+ scripts=['bin/codespell'])
|
OpenEnergyPlatform__oeplatform-495 | The data versioning does not track change types
Change tables do not store the types of changes; the "_type" field has to be injected into the queries.
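The eventual fix records the change type in the metadata dict that accompanies every versioned operation. A minimal sketch of the corrected helper (the concrete `method` values such as "insert" are assumptions for illustration):

```python
def set_meta_info(method, user, message=None):
    """Build the metadata stored alongside each change-table row."""
    val_dict = {}
    val_dict["_user"] = user
    val_dict["_message"] = message
    val_dict["_type"] = method  # the change type, previously missing
    return val_dict
```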
| [
{
"content": "###########\n# Parsers #\n###########\nimport decimal\nimport re\nfrom datetime import datetime, date\n\nimport geoalchemy2 # Although this import seems unused is has to be here\nimport sqlalchemy as sa\nfrom sqlalchemy import (\n Column,\n MetaData,\n Table,\n and_,\n not_,\n c... | [
{
"content": "###########\n# Parsers #\n###########\nimport decimal\nimport re\nfrom datetime import datetime, date\n\nimport geoalchemy2 # Although this import seems unused is has to be here\nimport sqlalchemy as sa\nfrom sqlalchemy import (\n Column,\n MetaData,\n Table,\n and_,\n not_,\n c... | diff --git a/api/parser.py b/api/parser.py
index 3c76c36ef..8e0f4abba 100644
--- a/api/parser.py
+++ b/api/parser.py
@@ -98,6 +98,7 @@ def set_meta_info(method, user, message=None):
val_dict = {}
val_dict["_user"] = user # TODO: Add user handling
val_dict["_message"] = message
+ val_dict["_type"] = method
return val_dict
|
bridgecrewio__checkov-2810 | HCL2 parser cannot parse functions with comments interleaved in the arguments.
**Describe the issue**
The HCL2 parser fails to parse a file containing an expression with a Terraform function call that has comments interleaved among its arguments.
**Example Value**
A file that contains the following exaple variable will fail to parse.
```hcl
variable "example" {
default = function(
# this comment is fine
argument1,
# this comment causes a parsing error
argument2
# this comment is fine
)
}
```
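Until a parser fix lands, one workaround is to strip `#` line comments before handing the text to the parser. The sketch below is intentionally naive (it ignores the possibility of `#` inside string literals) and is only an illustration, not the fix from the linked PR:

```python
import re

SNIPPET = '''
variable "example" {
  default = function(
    # this comment is fine
    argument1,
    # this comment causes a parsing error
    argument2
    # this comment is fine
  )
}
'''

def strip_line_comments(hcl_text):
    """Drop '#'-to-end-of-line comments (naive: ignores '#' inside strings)."""
    cleaned = []
    for line in hcl_text.splitlines():
        stripped = re.sub(r"#.*$", "", line).rstrip()
        if stripped:
            cleaned.append(stripped)
    return "\n".join(cleaned)
```

After preprocessing, the function call contains only the two arguments and parses without the interleaved comments.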
This issue is replicated downstream as well > https://github.com/amplify-education/python-hcl2/issues/95.
I have opened a PR to fix this in the bridgecrewio-specific parser > https://github.com/bridgecrewio/python-hcl2/pull/29.
**Question**
Is the bridgecrewio HCL2 Parser intended to be merged upstream?
If not, I will implement the change in Amplify's codebase separately.
**An aside**
Checkov is an awesome tool, it makes the jobs of myself and the rest of the Platform/DevOps Engineers on my team so much easier!
| [
{
"content": "#!/usr/bin/env python\nimport logging\nimport os\nfrom importlib import util\nfrom os import path\n\nimport setuptools\nfrom setuptools import setup\n\n# read the contents of your README file\nthis_directory = path.abspath(path.dirname(__file__))\nwith open(path.join(this_directory, \"README.md\")... | [
{
"content": "#!/usr/bin/env python\nimport logging\nimport os\nfrom importlib import util\nfrom os import path\n\nimport setuptools\nfrom setuptools import setup\n\n# read the contents of your README file\nthis_directory = path.abspath(path.dirname(__file__))\nwith open(path.join(this_directory, \"README.md\")... | diff --git a/Pipfile b/Pipfile
index 1f3262d62c..c27ece993d 100644
--- a/Pipfile
+++ b/Pipfile
@@ -33,7 +33,7 @@ dlint = "*"
#
# REMINDER: Update "install_requires" deps on setup.py when changing
#
-bc-python-hcl2 = "==0.3.38"
+bc-python-hcl2 = "==0.3.39"
deep_merge = "*"
tabulate = "*"
colorama="*"
diff --git a/Pipfile.lock b/Pipfile.lock
index d72dfdbf46..150358cf43 100644
--- a/Pipfile.lock
+++ b/Pipfile.lock
@@ -1,7 +1,7 @@
{
"_meta": {
"hash": {
- "sha256": "169f933b7a713d9a651f2709ecae1a574bc490f5b62bc0c5a9859096b876ca1d"
+ "sha256": "8ca53ee8b86605dbfd93847c70135a3b8dab3b11c3c632f87efa9678b83d29d7"
},
"pipfile-spec": 6,
"requires": {
@@ -144,11 +144,11 @@
},
"bc-python-hcl2": {
"hashes": [
- "sha256:8bccdfd4ac9ec1997f313abef7b130f32e54b5cdb028a3941213141cffd46dee",
- "sha256:ac54165081831db2eb25fdb7cd9e0c3c350b677afdd68467dcf295ca7811da6f"
+ "sha256:24c436b0b8009cc275ff49d1f0b80a6e93c1e378152ad0adad92abfd3e29d0ef",
+ "sha256:baa5491d0d1497a5c2f07ef2eea2f4a2f3bc1d730b3a7f96dd9bf92ce8c1b586"
],
"index": "pypi",
- "version": "==0.3.38"
+ "version": "==0.3.39"
},
"beautifulsoup4": {
"hashes": [
@@ -160,19 +160,19 @@
},
"boto3": {
"hashes": [
- "sha256:013ba57295f05da141e364191dd46f4086e8fe3eb83a3cd09730eeb684ffbab3",
- "sha256:1e845aa92b3ad70b954329b98835135c28b3000e322ff8d3fc46a956bdb6e94b"
+ "sha256:56425debf5f1fd2cf5494d9cb110b2a977453888f071898a12e6ab64bdd41796",
+ "sha256:b709cb65ffc4e3f78c590145e2dee40758056c9edafb9ee692f67d170855dfc3"
],
"index": "pypi",
- "version": "==1.21.37"
+ "version": "==1.21.39"
},
"botocore": {
"hashes": [
- "sha256:21e164a213beca36033c46026bffa62f2ee2cd2600777271f9a551fb34dba006",
- "sha256:70c48c4ae3c2b9ec0ca025385979d01f4c7dae4d9a61c82758d4cf7caa7082cd"
+ "sha256:94f50a544003918270ba726eb5652b2c31f6cb34accbf25e053ed6ea97ecf1fd",
+ "sha256:a0883dfe8b81689060af7bb2ca4ce3048b954b25bef4ed712c6760ce3da51485"
],
"markers": "python_version >= '3.6'",
- "version": "==1.24.37"
+ "version": "==1.24.39"
},
"cached-property": {
"hashes": [
@@ -926,11 +926,11 @@
},
"setuptools": {
"hashes": [
- "sha256:7999cbd87f1b6e1f33bf47efa368b224bed5e27b5ef2c4d46580186cbcb1a86a",
- "sha256:a65e3802053e99fc64c6b3b29c11132943d5b8c8facbcc461157511546510967"
+ "sha256:26ead7d1f93efc0f8c804d9fafafbe4a44b179580a7105754b245155f9af05a8",
+ "sha256:47c7b0c0f8fc10eec4cf1e71c6fdadf8decaa74ffa087e68cd1c20db7ad6a592"
],
"markers": "python_version >= '3.7'",
- "version": "==62.0.0"
+ "version": "==62.1.0"
},
"six": {
"hashes": [
@@ -1872,11 +1872,11 @@
},
"virtualenv": {
"hashes": [
- "sha256:1e8588f35e8b42c6ec6841a13c5e88239de1e6e4e4cedfd3916b306dc826ec66",
- "sha256:8e5b402037287126e81ccde9432b95a8be5b19d36584f64957060a3488c11ca8"
+ "sha256:e617f16e25b42eb4f6e74096b9c9e37713cf10bf30168fb4a739f3fa8f898a3a",
+ "sha256:ef589a79795589aada0c1c5b319486797c03b67ac3984c48c669c0e4f50df3a5"
],
"markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'",
- "version": "==20.14.0"
+ "version": "==20.14.1"
},
"yarl": {
"hashes": [
diff --git a/setup.py b/setup.py
index 9a9f9109bd..6128201ffc 100644
--- a/setup.py
+++ b/setup.py
@@ -33,7 +33,7 @@
]
},
install_requires=[
- "bc-python-hcl2==0.3.38",
+ "bc-python-hcl2==0.3.39",
"cloudsplaining>=0.4.1",
"deep_merge",
"tabulate",
|
bridgecrewio__checkov-3151 | Terraform parsing error string with escaped backslash at the end
**Describe the issue**
Checkov crashes if it encounters an escaped backslash (`"\\"`) at the end of a string.
**Examples**
Minimal example to reproduce the error:
```terraform
variable "slash" {
default = "\\"
}
output "slash" {
value = var.slash
}
```
`terraform validate` sees this configuration as valid, but checkov fails with a parsing error.
This only happens when the last character of the string is the escaped backslash, as the parser assumes the closing quotation mark is escaped. Adding any normal character at the end of the string doesn't trigger this error.
**Exception Trace**
Relevant traceback
```sh
> LOG_LEVEL=DEBUG checkov -d .
[...]
[MainThread ] [DEBUG] failed while parsing file /workdir/main.tf
Traceback (most recent call last):
File "/Users/user/.local/pipx/venvs/checkov/lib/python3.8/site-packages/checkov/terraform/parser.py", line 726, in _load_or_die_quietly
raw_data = hcl2.load(f)
File "/Users/user/.local/pipx/venvs/checkov/lib/python3.8/site-packages/hcl2/api.py", line 12, in load
return loads(file.read())
File "/Users/user/.local/pipx/venvs/checkov/lib/python3.8/site-packages/hcl2/api.py", line 80, in loads
raise ValueError(f"Line has unclosed quote marks: {line}")
ValueError: Line has unclosed quote marks: default = "\\"
[...]
```
**Desktop (please complete the following information):**
- OS: MacOS 12.3.1 (Intel)
- Checkov Version: 2.0.1230
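The root cause described above is the quote-escape check: whether a closing quote is escaped cannot be decided from the single preceding character, because in `"\\"` the first backslash escapes the second, leaving the quote as a real delimiter. The correct rule counts the run of backslashes and checks its parity. This is a minimal illustrative sketch, not the actual `hcl2` code:

```python
def quote_is_escaped(line, quote_idx):
    # A quote is escaped only when preceded by an ODD number of
    # consecutive backslashes: before '\\"' the backslashes escape
    # each other, so the quote really closes the string.
    backslashes = 0
    i = quote_idx - 1
    while i >= 0 and line[i] == "\\":
        backslashes += 1
        i -= 1
    return backslashes % 2 == 1
```

For the failing line `default = "\\"`, a naive previous-character check sees a backslash before the final quote and reports the string unclosed, while the parity rule correctly reports the quote as a closing delimiter.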
| [
{
"content": "#!/usr/bin/env python\nimport logging\nimport os\nfrom importlib import util\nfrom os import path\n\nimport setuptools\nfrom setuptools import setup\n\n# read the contents of your README file\nthis_directory = path.abspath(path.dirname(__file__))\nwith open(path.join(this_directory, \"README.md\")... | [
{
"content": "#!/usr/bin/env python\nimport logging\nimport os\nfrom importlib import util\nfrom os import path\n\nimport setuptools\nfrom setuptools import setup\n\n# read the contents of your README file\nthis_directory = path.abspath(path.dirname(__file__))\nwith open(path.join(this_directory, \"README.md\")... | diff --git a/Pipfile b/Pipfile
index 7a57559a95..e164b06aed 100644
--- a/Pipfile
+++ b/Pipfile
@@ -41,7 +41,7 @@ flake8-bugbear = "*"
#
# REMINDER: Update "install_requires" deps on setup.py when changing
#
-bc-python-hcl2 = "==0.3.42"
+bc-python-hcl2 = "==0.3.44"
deep_merge = "*"
tabulate = "*"
colorama="*"
diff --git a/Pipfile.lock b/Pipfile.lock
index 5cfd9bed84..f2ea7e5f42 100644
--- a/Pipfile.lock
+++ b/Pipfile.lock
@@ -1,7 +1,7 @@
{
"_meta": {
"hash": {
- "sha256": "a63095146044c16ab3ae85422a75874c797a0aaa6bd246fdf054d0bcc31fef6f"
+ "sha256": "f2b84cfe07cdad3a1d23ddb6849034ffac03c489ead842fc690a8bedd32b0a44"
},
"pipfile-spec": 6,
"requires": {
@@ -115,6 +115,7 @@
"sha256:26e62109036cd181df6e6ad646f91f0dcfd05fe16d0cb924138ff2ab75d64e3a",
"sha256:78ed67db6c7b7ced4f98e495e572106d5c432a93e1ddd1bf475e1dc05f5b7df2"
],
+ "markers": "python_version >= '3.6'",
"version": "==1.2.0"
},
"argcomplete": {
@@ -130,44 +131,56 @@
"sha256:2163e1640ddb52b7a8c80d0a67a08587e5d245cc9c553a74a847056bc2976b15",
"sha256:8ca1e4fcf50d07413d66d1a5e416e42cfdf5851c981d679a09851a6853383b3c"
],
+ "markers": "python_version >= '3.6'",
"version": "==4.0.2"
},
+ "asynctest": {
+ "hashes": [
+ "sha256:5da6118a7e6d6b54d83a8f7197769d046922a44d2a99c21382f0a6e4fadae676",
+ "sha256:c27862842d15d83e6a34eb0b2866c323880eb3a75e4485b079ea11748fd77fac"
+ ],
+ "markers": "python_version < '3.8'",
+ "version": "==0.13.0"
+ },
"attrs": {
"hashes": [
"sha256:2d27e3784d7a565d36ab851fe94887c5eccd6a463168875832a1be79c82828b4",
"sha256:626ba8234211db98e869df76230a137c4c40a12d72445c45d5f5b716f076e2fd"
],
+ "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'",
"version": "==21.4.0"
},
"bc-python-hcl2": {
"hashes": [
- "sha256:ab9e851013561f537015725a2cc787091611c7f31bb900a03bd4ffc5e280054b",
- "sha256:b383b6c6835a7d81c5053a3190ace0279363407e0ce43ecf729b2688066a47ff"
+ "sha256:0bb7649c37e8378af05c8e7f621b0a5e87abb69414a6838aadb6d6e32557ae04",
+ "sha256:f9beb5d6b835413fb306a0f458098ca0bf74a79db5bc5af6fe97a405c6a3f8fa"
],
"index": "pypi",
- "version": "==0.3.42"
+ "version": "==0.3.44"
},
"beautifulsoup4": {
"hashes": [
"sha256:58d5c3d29f5a36ffeb94f02f0d786cd53014cf9b3b3951d42e0080d8a9498d30",
"sha256:ad9aa55b65ef2808eb405f46cf74df7fcb7044d5cbc26487f96eb2ef2e436693"
],
+ "markers": "python_version >= '3.6'",
"version": "==4.11.1"
},
"boto3": {
"hashes": [
- "sha256:0b9757575b8003928defc5fb6e816936fa1bdb1384d0edec6622bb9fb104e96c",
- "sha256:f39b91a4c3614db8e44912ee82426fb4b16d5df2cd66883f3aff6f76d7f5d310"
+ "sha256:7033d3a351171b85647405eb70c4c9ae0d75c085dd987d7674557607acbcd459",
+ "sha256:e87dbc67475b0ea7564b17b6686995fd3a120312a95a625e6db61490fa0a3fed"
],
"index": "pypi",
- "version": "==1.24.12"
+ "version": "==1.24.20"
},
"botocore": {
"hashes": [
- "sha256:17d3ec9f684d21e06b64d9cb224934557bcd95031e2ecb551bf16271e8722fec",
- "sha256:b8ac156e55267da6e728ea0b806bfcd97adf882801cffe7849c4b88ce4780326"
+ "sha256:bb80a2204ccd51c1611e562d3d0511dc2a156257f87edeb59e99d7cef24b75d6",
+ "sha256:d3445a382711b58b4ec29e42267f074aa743ac7a5ddc50a08e0aae2b8309e3a5"
],
- "version": "==1.27.12"
+ "markers": "python_version >= '3.7'",
+ "version": "==1.27.20"
},
"cached-property": {
"hashes": [
@@ -189,6 +202,7 @@
"sha256:84c85a9078b11105f04f3036a9482ae10e4621616db313fe045dd24743a0820d",
"sha256:fe86415d55e84719d75f8b69414f6438ac3547d2078ab91b67e779ef69378412"
],
+ "markers": "python_version >= '3.6'",
"version": "==2022.6.15"
},
"cffi": {
@@ -248,11 +262,11 @@
},
"charset-normalizer": {
"hashes": [
- "sha256:2857e29ff0d34db842cd7ca3230549d1a697f96ee6d3fb071cfa6c7393832597",
- "sha256:6881edbebdb17b39b4eaaa821b438bf6eddffb4468cf344f09f89def34a8b1df"
+ "sha256:5189b6f22b01957427f35b6a08d9a0bc45b46d3788ef5a92e978433c7a35f8a5",
+ "sha256:575e708016ff3a5e3681541cb9d79312c416835686d054a23accb873b254f413"
],
"index": "pypi",
- "version": "==2.0.12"
+ "version": "==2.1.0"
},
"click": {
"hashes": [
@@ -267,6 +281,7 @@
"sha256:9653a2297357335d7325a1827e71ac1245d91c97d959346a7decabd4a52d5354",
"sha256:a6e924f3c46b657feb5b72679f7e930f8e5b224b766ab35c91ae4019b4e0615e"
],
+ "markers": "python_version >= '3.6' and python_version < '4'",
"version": "==0.5.3"
},
"cloudsplaining": {
@@ -298,21 +313,23 @@
"sha256:3fbdb64466afd23abaf6c977627b75b6139a5a3e8ce38405c5b413aed7a0471f",
"sha256:ab1e2bfe1d01d968e1b7e8d9023bc51ef3509bba217bb730cee3827e1ee82869"
],
+ "markers": "python_version >= '3.6'",
"version": "==21.6.0"
},
"cyclonedx-python-lib": {
"hashes": [
- "sha256:7a3aebcc1603e2cb0bc13ebf4274d2bd28ee46d199a7c2c05bd9d823ea7143e4",
- "sha256:875c0dac4c8be1da58cef399eb09ceba8668a153d2bfed67b7af8bdbca5bad61"
+ "sha256:06242c2a61033c4112b41b4f55d3b5130bc2a7bc6107a7b4950eac2431351963",
+ "sha256:8235aad70efc0f84cdf154b8c28802b605ee8133cd5c9a247d834bdb8b6c827a"
],
"index": "pypi",
- "version": "==2.5.2"
+ "version": "==2.6.0"
},
"decorator": {
"hashes": [
"sha256:637996211036b6385ef91435e4fae22989472f9d571faba8927ba8253acbc330",
"sha256:b8c3f85900b9dc423225913c5aace94729fe1fa9763b38939a95226f02d37186"
],
+ "markers": "python_version >= '3.5'",
"version": "==5.1.1"
},
"deep-merge": {
@@ -416,6 +433,7 @@
"sha256:f96293d6f982c58ebebb428c50163d010c2f05de0cde99fd681bfdc18d4b2dc2",
"sha256:ff9310f05b9d9c5c4dd472983dc956901ee6cb2c3ec1ab116ecdde25f3ce4951"
],
+ "markers": "python_version >= '3.7'",
"version": "==1.3.0"
},
"gitdb": {
@@ -423,6 +441,7 @@
"sha256:8033ad4e853066ba6ca92050b9df2f89301b8fc8bf7e9324d412a63f8bf1a8fd",
"sha256:bac2fd45c0a1c9cf619e63a90d62bdc63892ef92387424b855792a6cabe789aa"
],
+ "markers": "python_version >= '3.6'",
"version": "==4.0.9"
},
"gitpython": {
@@ -438,15 +457,16 @@
"sha256:84d9dd047ffa80596e0f246e2eab0b391788b0503584e8945f2368256d2735ff",
"sha256:9d643ff0a55b762d5cdb124b8eaa99c66322e2157b69160bc32796e824360e6d"
],
+ "markers": "python_version >= '3.5'",
"version": "==3.3"
},
"importlib-metadata": {
"hashes": [
- "sha256:5d26852efe48c0a32b0509ffbc583fda1a2266545a78d104a6f4aff3db17d700",
- "sha256:c58c8eb8a762858f49e18436ff552e83914778e50e9d2f1660535ffb364552ec"
+ "sha256:637245b8bab2b6502fcbc752cc4b7a6f6243bb02b31c5c26156ad103d3d45670",
+ "sha256:7401a975809ea1fdc658c3aa4f78cc2195a0e019c5cbc4c06122884e9ae80c23"
],
"index": "pypi",
- "version": "==4.11.4"
+ "version": "==4.12.0"
},
"importlib-resources": {
"hashes": [
@@ -461,6 +481,7 @@
"sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852",
"sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"
],
+ "markers": "python_version >= '3.7'",
"version": "==3.1.2"
},
"jmespath": {
@@ -507,6 +528,7 @@
"sha256:cbb516f16218e643d8e0a95b309f77eb118cb138d39a4f27851e6a63581db874",
"sha256:f5da449a6e1c989a4cea2631aa8ee67caa5a2ef855d551c88f9e309f4634c621"
],
+ "markers": "python_version >= '3.6'",
"version": "==3.3.7"
},
"markupsafe": {
@@ -552,6 +574,7 @@
"sha256:f121a1420d4e173a5d96e47e9a0c0dcff965afdf1626d28de1460815f7c4ee7a",
"sha256:fc7b548b17d238737688817ab67deebb30e8073c95749d55538ed473130ec0c7"
],
+ "markers": "python_version >= '3.7'",
"version": "==2.1.1"
},
"multidict": {
@@ -616,6 +639,7 @@
"sha256:feba80698173761cddd814fa22e88b0661e98cb810f9f986c54aa34d281e4937",
"sha256:feea820722e69451743a3d56ad74948b68bf456984d63c1a92e8347b7b88452d"
],
+ "markers": "python_version >= '3.7'",
"version": "==6.0.2"
},
"networkx": {
@@ -628,10 +652,11 @@
},
"packageurl-python": {
"hashes": [
- "sha256:07aa852d1c48b0e86e625f6a32d83f96427739806b269d0f8142788ee807114b",
- "sha256:872a0434b9a448b3fa97571711f69dd2a3fb72345ad66c90b17d827afea82f09"
+ "sha256:99df143960b7100fff3b2cf5b0beba2f64b6d8c818f6c9f125aed6fac7438763",
+ "sha256:c7dc928aaa9465f04c86eaa956c75247d5f140ec8d50bc111b55314f143324bb"
],
- "version": "==0.9.9"
+ "markers": "python_version >= '3.6'",
+ "version": "==0.10.0"
},
"packaging": {
"hashes": [
@@ -653,15 +678,16 @@
"sha256:5358f388ba7ff682a337f0a80a9cb7fb1ee981b6f46ad1c446132c017ac5ede4",
"sha256:75137fc7e1311bc24836855dce7caa40548f3f81a72045bb6731d55f48de644a"
],
+ "markers": "python_version >= '3.6'",
"version": "==0.12.3"
},
"policyuniverse": {
"hashes": [
- "sha256:826705f0a77018b314e60d4d620c4b2a004b935c89ad68bf7695444c3698d15a",
- "sha256:997db60c3c0181a3fbae09e73a56f5c28076fa5e9b13ea09f93eedf9a0978fa7"
+ "sha256:be5d9148bf6cc2586b02aa85242e9c9cdc94e4469f9b393114950cae299eeb5d",
+ "sha256:c66b1fb907750643a1987eb419b2112ae3f9c527c013429525f9fab989c9a2d7"
],
"index": "pypi",
- "version": "==1.5.0.20220523"
+ "version": "==1.5.0.20220613"
},
"prettytable": {
"hashes": [
@@ -673,39 +699,39 @@
},
"pycares": {
"hashes": [
- "sha256:03490be0e7b51a0c8073f877bec347eff31003f64f57d9518d419d9369452837",
- "sha256:056330275dea42b7199494047a745e1d9785d39fb8c4cd469dca043532240b80",
- "sha256:0aa897543a786daba74ec5e19638bd38b2b432d179a0e248eac1e62de5756207",
- "sha256:112e1385c451069112d6b5ea1f9c378544f3c6b89882ff964e9a64be3336d7e4",
- "sha256:27a6f09dbfb69bb79609724c0f90dfaa7c215876a7cd9f12d585574d1f922112",
- "sha256:2b837315ed08c7df009b67725fe1f50489e99de9089f58ec1b243dc612f172aa",
- "sha256:2f5f84fe9f83eab9cd68544b165b74ba6e3412d029cc9ab20098d9c332869fc5",
- "sha256:40079ed58efa91747c50aac4edf8ecc7e570132ab57dc0a4030eb0d016a6cab8",
- "sha256:439799be4b7576e907139a7f9b3c8a01b90d3e38af4af9cd1fc6c1ee9a42b9e6",
- "sha256:4d5da840aa0d9b15fa51107f09270c563a348cb77b14ae9653d0bbdbe326fcc2",
- "sha256:4e190471a015f8225fa38069617192e06122771cce2b169ac7a60bfdbd3d4ab2",
- "sha256:5632f21d92cc0225ba5ff906e4e5dec415ef0b3df322c461d138190681cd5d89",
- "sha256:569eef8597b5e02b1bc4644b9f272160304d8c9985357d7ecfcd054da97c0771",
- "sha256:58a41a2baabcd95266db776c510d349d417919407f03510fc87ac7488730d913",
- "sha256:6831e963a910b0a8cbdd2750ffcdf5f2bb0edb3f53ca69ff18484de2cc3807c4",
- "sha256:71b99b9e041ae3356b859822c511f286f84c8889ec9ed1fbf6ac30fb4da13e4c",
- "sha256:8319afe4838e09df267c421ca93da408f770b945ec6217dda72f1f6a493e37e4",
- "sha256:8fd1ff17a26bb004f0f6bb902ba7dddd810059096ae0cc3b45e4f5be46315d19",
- "sha256:a810d01c9a426ee8b0f36969c2aef5fb966712be9d7e466920beb328cd9cefa3",
- "sha256:ad7b28e1b6bc68edd3d678373fa3af84e39d287090434f25055d21b4716b2fc6",
- "sha256:b0e50ddc78252f2e2b6b5f2c73e5b2449dfb6bea7a5a0e21dfd1e2bcc9e17382",
- "sha256:b266cec81dcea2c3efbbd3dda00af8d7eb0693ae9e47e8706518334b21f27d4a",
- "sha256:c000942f5fc64e6e046aa61aa53b629b576ba11607d108909727c3c8f211a157",
- "sha256:c6680f7fdc0f1163e8f6c2a11d11b9a0b524a61000d2a71f9ccd410f154fb171",
- "sha256:c7eba3c8354b730a54d23237d0b6445a2f68570fa68d0848887da23a3f3b71f3",
- "sha256:cbceaa9b2c416aa931627466d3240aecfc905c292c842252e3d77b8630072505",
- "sha256:dc942692fca0e27081b7bb414bb971d34609c80df5e953f6d0c62ecc8019acd9",
- "sha256:e1489aa25d14dbf7176110ead937c01176ed5a0ebefd3b092bbd6b202241814c",
- "sha256:e5a060f5fa90ae245aa99a4a8ad13ec39c2340400de037c7e8d27b081e1a3c64",
- "sha256:ec00f3594ee775665167b1a1630edceefb1b1283af9ac57480dba2fb6fd6c360",
- "sha256:ed71dc4290d9c3353945965604ef1f6a4de631733e9819a7ebc747220b27e641"
- ],
- "version": "==4.1.2"
+ "sha256:061dd4c80fec73feb150455b159704cd51a122f20d36790033bd6375d4198579",
+ "sha256:15dd5cf21bc73ad539e8aabf7afe370d1df8af7bc6944cd7298f3bfef0c1a27c",
+ "sha256:1a9506d496efeb809a1b63647cb2f3f33c67fcf62bf80a2359af692fef2c1755",
+ "sha256:1f37f762414680063b4dfec5be809a84f74cd8e203d939aaf3ba9c807a9e7013",
+ "sha256:2113529004df4894783eaa61e9abc3a680756b6f033d942f2800301ae8c71c29",
+ "sha256:2fd53eb5b441c4f6f9c78d7900e05883e9998b34a14b804be4fc4c6f9fea89f3",
+ "sha256:3636fccf643c5192c34ee0183c514a2d09419e3a76ca2717cef626638027cb21",
+ "sha256:396ee487178e9de06ca4122a35a157474db3ce0a0db6038a31c831ebb9863315",
+ "sha256:3b78bdee2f2f1351d5fccc2d1b667aea2d15a55d74d52cb9fd5bea8b5e74c4dc",
+ "sha256:4ee625d7571039038bca51ae049b047cbfcfc024b302aae6cc53d5d9aa8648a8",
+ "sha256:5333b51ef4ff3e8973b4a1b57cad5ada13e15552445ee3cd74bd77407dec9d44",
+ "sha256:66b5390a4885a578e687d3f2683689c35e1d4573f4d0ecf217431f7bb55c49a0",
+ "sha256:6724573e830ea2345f4bcf0f968af64cc6d491dc2133e9c617f603445dcdfa58",
+ "sha256:735b4f75fd0f595c4e9184da18cd87737f46bc81a64ea41f4edce2b6b68d46d2",
+ "sha256:7a901776163a04de5d67c42bd63a287cff9cb05fc041668ad1681fe3daa36445",
+ "sha256:8bd6ed3ad3a5358a635c1acf5d0f46be9afb095772b84427ff22283d2f31db1b",
+ "sha256:99e00e397d07a79c9f43e4303e67f4f97bcabd013bda0d8f2d430509b7aef8a0",
+ "sha256:9b05c2cec644a6c66b55bcf6c24d4dfdaf2f7205b16e5c4ceee31db104fac958",
+ "sha256:a521d7f54f3e52ded4d34c306ba05cfe9eb5aaa2e5aaf83c96564b9369495588",
+ "sha256:b03f69df69f0ab3bfb8dbe54444afddff6ff9389561a08aade96b4f91207a655",
+ "sha256:c8a46839da642b281ac5f56d3c6336528e128b3c41eab9c5330d250f22325e9d",
+ "sha256:d2e8ec4c8e07c986b70a3cc8f5b297c53b08ac755e5b9797512002a466e2de86",
+ "sha256:d83f193563b42360528167705b1c7bb91e2a09f990b98e3d6378835b72cd5c96",
+ "sha256:d9cd826d8e0c270059450709bff994bfeb072f79d82fd3f11c701690ff65d0e7",
+ "sha256:e4dc37f732f7110ca6368e0128cbbd0a54f5211515a061b2add64da2ddb8e5ca",
+ "sha256:e75cbd4d3b3d9b02bba6e170846e39893a825e7a5fb1b96728fc6d7b964f8945",
+ "sha256:e7a95763cdc20cf9ec357066e656ea30b8de6b03de6175cbb50890e22aa01868",
+ "sha256:e9dbfcacbde6c21380c412c13d53ea44b257dea3f7b9d80be2c873bb20e21fee",
+ "sha256:f05223de13467bb26f9a1594a1799ce2d08ad8ea241489fecd9d8ed3bbbfc672",
+ "sha256:f8e6942965465ca98e212376c4afb9aec501d8129054929744b2f4a487c8c14b",
+ "sha256:fbd53728d798d07811898e11991e22209229c090eab265a53d12270b95d70d1a"
+ ],
+ "version": "==4.2.1"
},
"pycep-parser": {
"hashes": [
@@ -727,6 +753,7 @@
"sha256:2b020ecf7d21b687f219b71ecad3631f644a47f01403fa1d1036b0c6416d70fb",
"sha256:5026bae9a10eeaefb61dab2f09052b9f4307d44aee4eda64b309723d8d206bbc"
],
+ "markers": "python_full_version >= '3.6.8'",
"version": "==3.0.9"
},
"pyrsistent": {
@@ -753,6 +780,7 @@
"sha256:f87cc2863ef33c709e237d4b5f4502a62a00fab450c9e020892e8e2ede5847f5",
"sha256:fd8da6d0124efa2f67d86fa70c851022f87c98e205f0594e1fae044e7119a5a6"
],
+ "markers": "python_version >= '3.7'",
"version": "==0.18.1"
},
"python-dateutil": {
@@ -760,6 +788,7 @@
"sha256:0123cacc1627ae19ddf3c27a5de5bd67ee4586fbdd6440d9748f8abb483d3e86",
"sha256:961d03dc3453ebbc59dbdea9e4e11c5651520a876d0f4db161e8674aae935da9"
],
+ "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'",
"version": "==2.8.2"
},
"pyyaml": {
@@ -878,20 +907,23 @@
"sha256:fdecb225d0f1d50d4b26ac423e0032e76d46a788b83b4e299a520717a47d968c",
"sha256:ffef4b30785dc2d1604dfb7cf9fca5dc27cd86d65f7c2a9ec34d6d3ae4565ec2"
],
+ "markers": "python_version >= '3.6'",
"version": "==2022.6.2"
},
"requests": {
"hashes": [
- "sha256:bc7861137fbce630f17b03d3ad02ad0bf978c844f3536d0edda6499dafce2b6f",
- "sha256:d568723a7ebd25875d8d1eaf5dfa068cd2fc8194b2e483d7b1f7c81918dbec6b"
+ "sha256:7c5599b102feddaa661c826c56ab4fee28bfd17f5abca1ebbe3e7f19d7c97983",
+ "sha256:8fefa2a1a1365bf5520aac41836fbee479da67864514bdb821f31ce07ce65349"
],
- "version": "==2.28.0"
+ "markers": "python_version >= '3.7' and python_version < '4'",
+ "version": "==2.28.1"
},
"s3transfer": {
"hashes": [
"sha256:06176b74f3a15f61f1b4f25a1fc29a4429040b7647133a463da8fa5bd28d5ecd",
"sha256:2ed07d3866f523cc561bf4a00fc5535827981b117dd7876f036b0c1aca42c947"
],
+ "markers": "python_version >= '3.7'",
"version": "==0.6.0"
},
"schema": {
@@ -909,11 +941,20 @@
"index": "pypi",
"version": "==2.10.0"
},
+ "setuptools": {
+ "hashes": [
+ "sha256:990a4f7861b31532871ab72331e755b5f14efbe52d336ea7f6118144dd478741",
+ "sha256:c1848f654aea2e3526d17fc3ce6aeaa5e7e24e66e645b5be2171f3f6b4e5a178"
+ ],
+ "markers": "python_version >= '3.7'",
+ "version": "==62.6.0"
+ },
"six": {
"hashes": [
"sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926",
"sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254"
],
+ "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'",
"version": "==1.16.0"
},
"smmap": {
@@ -921,6 +962,7 @@
"sha256:2aba19d6a040e78d8b09de5c57e96207b09ed71d8e55ce0959eeee6c8e190d94",
"sha256:c840e62059cd3be204b0c9c9f74be2c09d5648eddd4580d9314c3ecde0b30936"
],
+ "markers": "python_version >= '3.6'",
"version": "==5.0.0"
},
"sortedcontainers": {
@@ -935,15 +977,17 @@
"sha256:3b2503d3c7084a42b1ebd08116e5f81aadfaea95863628c80a3b774a11b7c759",
"sha256:fc53893b3da2c33de295667a0e19f078c14bf86544af307354de5fcf12a3f30d"
],
+ "markers": "python_version >= '3.6'",
"version": "==2.3.2.post1"
},
"tabulate": {
"hashes": [
- "sha256:d7c013fe7abbc5e491394e10fa845f8f32fe54f8dc60c6622c6cf482d25d47e4",
- "sha256:eb1d13f25760052e8931f2ef80aaf6045a6cceb47514db8beab24cded16f13a7"
+ "sha256:0ba055423dbaa164b9e456abe7920c5e8ed33fcc16f6d1b2f2d152c8e1e8b4fc",
+ "sha256:436f1c768b424654fce8597290d2764def1eea6a77cfa5c33be00b1bc0f4f63d",
+ "sha256:6c57f3f3dd7ac2782770155f3adb2db0b1a269637e42f27599925e64b114f519"
],
"index": "pypi",
- "version": "==0.8.9"
+ "version": "==0.8.10"
},
"termcolor": {
"hashes": [
@@ -957,6 +1001,7 @@
"sha256:806143ae5bfb6a3c6e736a764057db0e6a0e05e338b5630894a5f779cabb4f9b",
"sha256:b3bda1d108d5dd99f4a20d24d9c348e91c4db7ab1b749200bded2f839ccbe68f"
],
+ "markers": "python_version >= '2.6' and python_version not in '3.0, 3.1, 3.2, 3.3'",
"version": "==0.10.2"
},
"tqdm": {
@@ -988,6 +1033,7 @@
"sha256:44ece4d53fb1706f667c9bd1c648f5469a2ec925fcf3a776667042d645472c14",
"sha256:aabaf16477806a5e1dd19aa41f8c2b7950dd3c746362d7e3223dbe6de6ac448e"
],
+ "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4' and python_version < '4'",
"version": "==1.26.9"
},
"wcwidth": {
@@ -999,10 +1045,11 @@
},
"websocket-client": {
"hashes": [
- "sha256:50b21db0058f7a953d67cc0445be4b948d7fc196ecbeb8083d68d94628e4abf6",
- "sha256:722b171be00f2b90e1d4fb2f2b53146a536ca38db1da8ff49c972a4e1365d0ef"
+ "sha256:5d55652dc1d0b3c734f044337d929aaf83f4f9138816ec680c1aefefb4dc4877",
+ "sha256:d58c5f284d6a9bf8379dab423259fe8f85b70d5fa5d2916d5791a84594b122b1"
],
- "version": "==1.3.2"
+ "markers": "python_version >= '3.7'",
+ "version": "==1.3.3"
},
"yarl": {
"hashes": [
@@ -1079,6 +1126,7 @@
"sha256:fce78593346c014d0d986b7ebc80d782b7f5e19843ca798ed62f8e3ba8728576",
"sha256:fd547ec596d90c8676e369dd8a581a21227fe9b4ad37d0dc7feb4ccf544c2d59"
],
+ "markers": "python_version >= '3.6'",
"version": "==1.7.2"
},
"zipp": {
@@ -1086,7 +1134,7 @@
"sha256:56bf8aadb83c24db6c4b577e13de374ccfb67da2078beba1d037c17980bf43ad",
"sha256:c4f6e5bbf48e74f7a38e7cc5b0480ff42b0ae5178957d564d18932525d5cf099"
],
- "markers": "python_version < '3.10'",
+ "markers": "python_version >= '3.7'",
"version": "==3.8.0"
}
},
@@ -1182,6 +1230,7 @@
"sha256:26e62109036cd181df6e6ad646f91f0dcfd05fe16d0cb924138ff2ab75d64e3a",
"sha256:78ed67db6c7b7ced4f98e495e572106d5c432a93e1ddd1bf475e1dc05f5b7df2"
],
+ "markers": "python_version >= '3.6'",
"version": "==1.2.0"
},
"async-timeout": {
@@ -1189,8 +1238,17 @@
"sha256:2163e1640ddb52b7a8c80d0a67a08587e5d245cc9c553a74a847056bc2976b15",
"sha256:8ca1e4fcf50d07413d66d1a5e416e42cfdf5851c981d679a09851a6853383b3c"
],
+ "markers": "python_version >= '3.6'",
"version": "==4.0.2"
},
+ "asynctest": {
+ "hashes": [
+ "sha256:5da6118a7e6d6b54d83a8f7197769d046922a44d2a99c21382f0a6e4fadae676",
+ "sha256:c27862842d15d83e6a34eb0b2866c323880eb3a75e4485b079ea11748fd77fac"
+ ],
+ "markers": "python_version < '3.8'",
+ "version": "==0.13.0"
+ },
"atomicwrites": {
"hashes": [
"sha256:6d1784dea7c0c8d4a5172b6c620f40b6e4cbfdf96d783691f2e1302a7b88e197",
@@ -1204,6 +1262,7 @@
"sha256:2d27e3784d7a565d36ab851fe94887c5eccd6a463168875832a1be79c82828b4",
"sha256:626ba8234211db98e869df76230a137c4c40a12d72445c45d5f5b716f076e2fd"
],
+ "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'",
"version": "==21.4.0"
},
"bandit": {
@@ -1219,6 +1278,7 @@
"sha256:84c85a9078b11105f04f3036a9482ae10e4621616db313fe045dd24743a0820d",
"sha256:fe86415d55e84719d75f8b69414f6438ac3547d2078ab91b67e779ef69378412"
],
+ "markers": "python_version >= '3.6'",
"version": "==2022.6.15"
},
"cfgv": {
@@ -1226,20 +1286,19 @@
"sha256:c6a0883f3917a037485059700b9e75da2464e6c27051014ad85ba6aaa5884426",
"sha256:f5a830efb9ce7a445376bb66ec94c638a9787422f96264c98edc6bdeed8ab736"
],
+ "markers": "python_full_version >= '3.6.1'",
"version": "==3.3.1"
},
"charset-normalizer": {
"hashes": [
- "sha256:2857e29ff0d34db842cd7ca3230549d1a697f96ee6d3fb071cfa6c7393832597",
- "sha256:6881edbebdb17b39b4eaaa821b438bf6eddffb4468cf344f09f89def34a8b1df"
+ "sha256:5189b6f22b01957427f35b6a08d9a0bc45b46d3788ef5a92e978433c7a35f8a5",
+ "sha256:575e708016ff3a5e3681541cb9d79312c416835686d054a23accb873b254f413"
],
"index": "pypi",
- "version": "==2.0.12"
+ "version": "==2.1.0"
},
"coverage": {
- "extras": [
- "toml"
- ],
+ "extras": [],
"hashes": [
"sha256:004d1880bed2d97151facef49f08e255a20ceb6f9432df75f4eef018fdd5a78c",
"sha256:01d84219b5cdbfc8122223b39a954820929497a1cb1422824bb86b07b74594b6",
@@ -1324,6 +1383,7 @@
"sha256:8f694f3ba9cc92cab508b152dcfe322153975c29bda272e2fd7f3f00f36e47c5",
"sha256:a295f7cc774947aac58dde7fdc85f4aa00c42adf5d8f5468fc630c1acf30a142"
],
+ "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'",
"version": "==1.9.0"
},
"filelock": {
@@ -1331,6 +1391,7 @@
"sha256:37def7b658813cda163b56fc564cdc75e86d338246458c4c28ae84cabefa2404",
"sha256:3a0fd85166ad9dbab54c9aec96737b744106dc5f15c0b09a6744a445299fcf04"
],
+ "markers": "python_version >= '3.7'",
"version": "==3.7.1"
},
"flake8": {
@@ -1343,11 +1404,11 @@
},
"flake8-bugbear": {
"hashes": [
- "sha256:ec374101cddf65bd7a96d393847d74e58d3b98669dbf9768344c39b6290e8bd6",
- "sha256:f7c080563fca75ee6b205d06b181ecba22b802babb96b0b084cc7743d6908a55"
+ "sha256:ac3317eba27d79dc19dcdeb7356ca1f656f0cde11d899c4551badf770f05cbef",
+ "sha256:ad2b33dbe33a6d4ca1f0037e1d156d0a89107ee63c0600e3b4f7b60e37998ac2"
],
"index": "pypi",
- "version": "==22.4.25"
+ "version": "==22.6.22"
},
"frozenlist": {
"hashes": [
@@ -1411,6 +1472,7 @@
"sha256:f96293d6f982c58ebebb428c50163d010c2f05de0cde99fd681bfdc18d4b2dc2",
"sha256:ff9310f05b9d9c5c4dd472983dc956901ee6cb2c3ec1ab116ecdde25f3ce4951"
],
+ "markers": "python_version >= '3.7'",
"version": "==1.3.0"
},
"gitdb": {
@@ -1418,6 +1480,7 @@
"sha256:8033ad4e853066ba6ca92050b9df2f89301b8fc8bf7e9324d412a63f8bf1a8fd",
"sha256:bac2fd45c0a1c9cf619e63a90d62bdc63892ef92387424b855792a6cabe789aa"
],
+ "markers": "python_version >= '3.6'",
"version": "==4.0.9"
},
"gitpython": {
@@ -1433,6 +1496,7 @@
"sha256:0dca2ea3e4381c435ef9c33ba100a78a9b40c0bab11189c7cf121f75815efeaa",
"sha256:3d11b16f3fe19f52039fb7e39c9c884b21cb1b586988114fbe42671f03de3e82"
],
+ "markers": "python_version >= '3.7'",
"version": "==2.5.1"
},
"idna": {
@@ -1440,8 +1504,17 @@
"sha256:84d9dd047ffa80596e0f246e2eab0b391788b0503584e8945f2368256d2735ff",
"sha256:9d643ff0a55b762d5cdb124b8eaa99c66322e2157b69160bc32796e824360e6d"
],
+ "markers": "python_version >= '3.5'",
"version": "==3.3"
},
+ "importlib-metadata": {
+ "hashes": [
+ "sha256:637245b8bab2b6502fcbc752cc4b7a6f6243bb02b31c5c26156ad103d3d45670",
+ "sha256:7401a975809ea1fdc658c3aa4f78cc2195a0e019c5cbc4c06122884e9ae80c23"
+ ],
+ "index": "pypi",
+ "version": "==4.12.0"
+ },
"importlib-resources": {
"hashes": [
"sha256:568c9f16cb204f9decc8d6d24a572eeea27dacbb4cee9e6b03a8025736769751",
@@ -1542,6 +1615,7 @@
"sha256:feba80698173761cddd814fa22e88b0661e98cb810f9f986c54aa34d281e4937",
"sha256:feea820722e69451743a3d56ad74948b68bf456984d63c1a92e8347b7b88452d"
],
+ "markers": "python_version >= '3.7'",
"version": "==6.0.2"
},
"mypy": {
@@ -1582,10 +1656,11 @@
},
"nodeenv": {
"hashes": [
- "sha256:3ef13ff90291ba2a4a7a4ff9a979b63ffdd00a464dbe04acf0ea6471517a4c2b",
- "sha256:621e6b7076565ddcacd2db0294c0381e01fd28945ab36bcf00f41c5daf63bef7"
+ "sha256:27083a7b96a25f2f5e1d8cb4b6317ee8aeda3bdd121394e5ac54e498028a042e",
+ "sha256:e0e7f7dfb85fc5394c6fe1e8fa98131a2473e04311a45afb6508f7cf1836fa2b"
],
- "version": "==1.6.0"
+ "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4, 3.5, 3.6'",
+ "version": "==1.7.0"
},
"packaging": {
"hashes": [
@@ -1600,6 +1675,7 @@
"sha256:e547125940bcc052856ded43be8e101f63828c2d94239ffbe2b327ba3d5ccf0a",
"sha256:e8dca2f4b43560edef58813969f52a56cef023146cbb8931626db80e6c1c4308"
],
+ "markers": "python_version >= '2.6'",
"version": "==5.9.0"
},
"platformdirs": {
@@ -1607,6 +1683,7 @@
"sha256:027d8e83a2d7de06bbac4e5ef7e023c02b863d7ea5d079477e722bb41ab25788",
"sha256:58c8abb07dcb441e6ee4b11d8df0ac856038f944ab98b7be6b27b2a3c7feef19"
],
+ "markers": "python_version >= '3.7'",
"version": "==2.5.2"
},
"pluggy": {
@@ -1614,6 +1691,7 @@
"sha256:4224373bacce55f955a878bf9cfa763c1e360858e330072059e10bad68531159",
"sha256:74134bbf457f031a36d68416e1509f34bd5ccc019f0bcc952c7b909d06b37bd3"
],
+ "markers": "python_version >= '3.6'",
"version": "==1.0.0"
},
"pre-commit": {
@@ -1629,6 +1707,7 @@
"sha256:51c75c4126074b472f746a24399ad32f6053d1b34b68d2fa41e558e6f4a98719",
"sha256:607c53218732647dff4acdfcd50cb62615cedf612e72d1724fb1a0cc6405b378"
],
+ "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'",
"version": "==1.11.0"
},
"pycodestyle": {
@@ -1636,6 +1715,7 @@
"sha256:720f8b39dde8b293825e7ff02c475f3077124006db4f440dcbc9a20b76548a20",
"sha256:eddd5847ef438ea1c7870ca7eb78a9d47ce0cdb4851a5523949f2601d0cbbe7f"
],
+ "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'",
"version": "==2.8.0"
},
"pyflakes": {
@@ -1643,6 +1723,7 @@
"sha256:05a85c2872edf37a4ed30b0cce2f6093e1d0581f8c19d7393122da7e25b2b24c",
"sha256:3bb3a3f256f4b7968c9c788781e4ff07dce46bdf12339dcda61053375426ee2e"
],
+ "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'",
"version": "==2.4.0"
},
"pyparsing": {
@@ -1650,6 +1731,7 @@
"sha256:2b020ecf7d21b687f219b71ecad3631f644a47f01403fa1d1036b0c6416d70fb",
"sha256:5026bae9a10eeaefb61dab2f09052b9f4307d44aee4eda64b309723d8d206bbc"
],
+ "markers": "python_full_version >= '3.6.8'",
"version": "==3.0.9"
},
"pyrsistent": {
@@ -1676,6 +1758,7 @@
"sha256:f87cc2863ef33c709e237d4b5f4502a62a00fab450c9e020892e8e2ede5847f5",
"sha256:fd8da6d0124efa2f67d86fa70c851022f87c98e205f0594e1fae044e7119a5a6"
],
+ "markers": "python_version >= '3.7'",
"version": "==0.18.1"
},
"pytest": {
@@ -1708,15 +1791,16 @@
"sha256:8b67587c8f98cbbadfdd804539ed5455b6ed03802203485dd2f53c1422d7440e",
"sha256:bbbb6717efc886b9d64537b41fb1497cfaf3c9601276be8da2cccfea5a3c8ad8"
],
+ "markers": "python_version >= '3.6'",
"version": "==1.4.0"
},
"pytest-mock": {
"hashes": [
- "sha256:5112bd92cc9f186ee96e1a92efc84969ea494939c3aead39c50f421c4cc69534",
- "sha256:6cff27cec936bf81dc5ee87f07132b807bcda51106b5ec4b90a04331cba76231"
+ "sha256:2c6d756d5d3bf98e2e80797a959ca7f81f479e7d1f5f571611b0fdd6d1745240",
+ "sha256:d989f11ca4a84479e288b0cd1e6769d6ad0d3d7743dcc75e460d1416a5f2135a"
],
"index": "pypi",
- "version": "==3.7.0"
+ "version": "==3.8.1"
},
"pytest-xdist": {
"hashes": [
@@ -1767,10 +1851,11 @@
},
"requests": {
"hashes": [
- "sha256:bc7861137fbce630f17b03d3ad02ad0bf978c844f3536d0edda6499dafce2b6f",
- "sha256:d568723a7ebd25875d8d1eaf5dfa068cd2fc8194b2e483d7b1f7c81918dbec6b"
+ "sha256:7c5599b102feddaa661c826c56ab4fee28bfd17f5abca1ebbe3e7f19d7c97983",
+ "sha256:8fefa2a1a1365bf5520aac41836fbee479da67864514bdb821f31ce07ce65349"
],
- "version": "==2.28.0"
+ "markers": "python_version >= '3.7' and python_version < '4'",
+ "version": "==2.28.1"
},
"responses": {
"hashes": [
@@ -1780,11 +1865,20 @@
"index": "pypi",
"version": "==0.21.0"
},
+ "setuptools": {
+ "hashes": [
+ "sha256:990a4f7861b31532871ab72331e755b5f14efbe52d336ea7f6118144dd478741",
+ "sha256:c1848f654aea2e3526d17fc3ce6aeaa5e7e24e66e645b5be2171f3f6b4e5a178"
+ ],
+ "markers": "python_version >= '3.7'",
+ "version": "==62.6.0"
+ },
"six": {
"hashes": [
"sha256:1e61c37477a1626458e36f7b1d82aa5c9b094fa4802892072e49de9c60c4c926",
"sha256:8abb2f1d86890a2dfb989f9a77cfcfd3e47c2a354b01111771326f8aa26e0254"
],
+ "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3'",
"version": "==1.16.0"
},
"smmap": {
@@ -1792,6 +1886,7 @@
"sha256:2aba19d6a040e78d8b09de5c57e96207b09ed71d8e55ce0959eeee6c8e190d94",
"sha256:c840e62059cd3be204b0c9c9f74be2c09d5648eddd4580d9314c3ecde0b30936"
],
+ "markers": "python_version >= '3.6'",
"version": "==5.0.0"
},
"stevedore": {
@@ -1799,6 +1894,7 @@
"sha256:a547de73308fd7e90075bb4d301405bebf705292fa90a90fc3bcf9133f58616c",
"sha256:f40253887d8712eaa2bb0ea3830374416736dc8ec0e22f5a65092c1174c44335"
],
+ "markers": "python_version >= '3.6'",
"version": "==3.5.0"
},
"toml": {
@@ -1806,6 +1902,7 @@
"sha256:806143ae5bfb6a3c6e736a764057db0e6a0e05e338b5630894a5f779cabb4f9b",
"sha256:b3bda1d108d5dd99f4a20d24d9c348e91c4db7ab1b749200bded2f839ccbe68f"
],
+ "markers": "python_version >= '2.6' and python_version not in '3.0, 3.1, 3.2, 3.3'",
"version": "==0.10.2"
},
"tomli": {
@@ -1816,13 +1913,43 @@
"markers": "python_version < '3.11'",
"version": "==2.0.1"
},
+ "typed-ast": {
+ "hashes": [
+ "sha256:0261195c2062caf107831e92a76764c81227dae162c4f75192c0d489faf751a2",
+ "sha256:0fdbcf2fef0ca421a3f5912555804296f0b0960f0418c440f5d6d3abb549f3e1",
+ "sha256:183afdf0ec5b1b211724dfef3d2cad2d767cbefac291f24d69b00546c1837fb6",
+ "sha256:211260621ab1cd7324e0798d6be953d00b74e0428382991adfddb352252f1d62",
+ "sha256:267e3f78697a6c00c689c03db4876dd1efdfea2f251a5ad6555e82a26847b4ac",
+ "sha256:2efae9db7a8c05ad5547d522e7dbe62c83d838d3906a3716d1478b6c1d61388d",
+ "sha256:370788a63915e82fd6f212865a596a0fefcbb7d408bbbb13dea723d971ed8bdc",
+ "sha256:39e21ceb7388e4bb37f4c679d72707ed46c2fbf2a5609b8b8ebc4b067d977df2",
+ "sha256:3e123d878ba170397916557d31c8f589951e353cc95fb7f24f6bb69adc1a8a97",
+ "sha256:4879da6c9b73443f97e731b617184a596ac1235fe91f98d279a7af36c796da35",
+ "sha256:4e964b4ff86550a7a7d56345c7864b18f403f5bd7380edf44a3c1fb4ee7ac6c6",
+ "sha256:639c5f0b21776605dd6c9dbe592d5228f021404dafd377e2b7ac046b0349b1a1",
+ "sha256:669dd0c4167f6f2cd9f57041e03c3c2ebf9063d0757dc89f79ba1daa2bfca9d4",
+ "sha256:6778e1b2f81dfc7bc58e4b259363b83d2e509a65198e85d5700dfae4c6c8ff1c",
+ "sha256:683407d92dc953c8a7347119596f0b0e6c55eb98ebebd9b23437501b28dcbb8e",
+ "sha256:79b1e0869db7c830ba6a981d58711c88b6677506e648496b1f64ac7d15633aec",
+ "sha256:7d5d014b7daa8b0bf2eaef684295acae12b036d79f54178b92a2b6a56f92278f",
+ "sha256:98f80dee3c03455e92796b58b98ff6ca0b2a6f652120c263efdba4d6c5e58f72",
+ "sha256:a94d55d142c9265f4ea46fab70977a1944ecae359ae867397757d836ea5a3f47",
+ "sha256:a9916d2bb8865f973824fb47436fa45e1ebf2efd920f2b9f99342cb7fab93f72",
+ "sha256:c542eeda69212fa10a7ada75e668876fdec5f856cd3d06829e6aa64ad17c8dfe",
+ "sha256:cf4afcfac006ece570e32d6fa90ab74a17245b83dfd6655a6f68568098345ff6",
+ "sha256:ebd9d7f80ccf7a82ac5f88c521115cc55d84e35bf8b446fcd7836eb6b98929a3",
+ "sha256:ed855bbe3eb3715fca349c80174cfcfd699c2f9de574d40527b8429acae23a66"
+ ],
+ "markers": "python_version < '3.8'",
+ "version": "==1.5.4"
+ },
"types-cachetools": {
"hashes": [
- "sha256:4291c3b6ae10e7b0d7ae3c4cb7d9daa6b21d4b7deb64d193e44bcec7ca5c4095",
- "sha256:bf22b2e9f9243983914f6510e43a1873f012afb8c3fc5e09a59b0ccbe3ab0f35"
+ "sha256:069cfc825697cd51445c1feabbe4edc1fae2b2315870e7a9a179a7c4a5851bee",
+ "sha256:b496b7e364ba050c4eaadcc6582f2c9fbb04f8ee7141eb3b311a8589dbd4506a"
],
"index": "pypi",
- "version": "==5.0.2"
+ "version": "==5.2.1"
},
"types-colorama": {
"hashes": [
@@ -1834,11 +1961,11 @@
},
"types-jmespath": {
"hashes": [
- "sha256:46ec8e126f2b132879f431c607e9ef7928d0040c10a8e9eb40bf75752431a003",
- "sha256:c2f5810f4c5026ea537e352d6b06368ae456daf983bf2dafb68c8f4c6f864842"
+ "sha256:89c0f6894f59626dcd074664a7294c4a7740b9e2195f5ccb698ada0f6680ce1f",
+ "sha256:db05811bbd758c76b3209fa92c78f8b28ac9fdf5e62bac6aed95cffc55ff0195"
],
"index": "pypi",
- "version": "==0.10.2"
+ "version": "==1.0.0"
},
"types-jsonschema": {
"hashes": [
@@ -1850,35 +1977,35 @@
},
"types-pyyaml": {
"hashes": [
- "sha256:56a7b0e8109602785f942a11ebfbd16e97d5d0e79f5fbb077ec4e6a0004837ff",
- "sha256:d9495d377bb4f9c5387ac278776403eb3b4bb376851025d913eea4c22b4c6438"
+ "sha256:33ae75c84b8f61fddf0c63e9c7e557db9db1694ad3c2ee8628ec5efebb5a5e9b",
+ "sha256:b738e9ef120da0af8c235ba49d3b72510f56ef9bcc308fc8e7357100ff122284"
],
"index": "pypi",
- "version": "==6.0.8"
+ "version": "==6.0.9"
},
"types-requests": {
"hashes": [
- "sha256:b9b6cd0a6e5d500e56419b79f44ec96f316e9375ff6c8ee566c39d25e9612621",
- "sha256:ca8d7cc549c3d10dbcb3c69c1b53e3ffd1270089c1001a65c1e9e1017eb5e704"
+ "sha256:85383b4ef0535f639c3f06c5bbb6494bbf59570c4cd88bbcf540f0b2ac1b49ab",
+ "sha256:9863d16dfbb3fa55dcda64fa3b989e76e8859033b26c1e1623e30465cfe294d3"
],
"index": "pypi",
- "version": "==2.27.30"
+ "version": "==2.28.0"
},
"types-tabulate": {
"hashes": [
- "sha256:2fc3fa4fe1853ac987cf50e8d4599e3fe446dd53064fe86a46a407a98e9fc04f",
- "sha256:7971ed0cd40454eb18d82c01e2f18bcd09ca23cc9eb901c62d2b04e5d1f57f84"
+ "sha256:17a5fa3b5ca453815778fc9865e8ecd0118b07b2b9faff3e2b06fe448174dd5e",
+ "sha256:af811268241e8fb87b63c052c87d1e329898a93191309d5d42111372232b2e0e"
],
"index": "pypi",
- "version": "==0.8.9"
+ "version": "==0.8.11"
},
"types-termcolor": {
"hashes": [
- "sha256:4986dea39b82c9b78714154ac88033d4e225e3c06e0386491f74003c9071e541",
- "sha256:becba28967a8792221f202c6ba14c2ae236ef90519dd14aaefe8af54d94639e0"
+ "sha256:3dc714e884a98b6a8c4c6af22ee99e1b53d2e595a22e0933b2dc9cc32b8b8c58",
+ "sha256:dd10b878548dbd72885f72c1c45d42a45172634f7c8d0284559238785604e068"
],
"index": "pypi",
- "version": "==1.1.4"
+ "version": "==1.1.5"
},
"types-urllib3": {
"hashes": [
@@ -1900,6 +2027,7 @@
"sha256:44ece4d53fb1706f667c9bd1c648f5469a2ec925fcf3a776667042d645472c14",
"sha256:aabaf16477806a5e1dd19aa41f8c2b7950dd3c746362d7e3223dbe6de6ac448e"
],
+ "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4' and python_version < '4'",
"version": "==1.26.9"
},
"urllib3-mock": {
@@ -1912,10 +2040,11 @@
},
"virtualenv": {
"hashes": [
- "sha256:e617f16e25b42eb4f6e74096b9c9e37713cf10bf30168fb4a739f3fa8f898a3a",
- "sha256:ef589a79795589aada0c1c5b319486797c03b67ac3984c48c669c0e4f50df3a5"
+ "sha256:288171134a2ff3bfb1a2f54f119e77cd1b81c29fc1265a2356f3e8d14c7d58c4",
+ "sha256:b30aefac647e86af6d82bfc944c556f8f1a9c90427b2fb4e3bfbf338cb82becf"
],
- "version": "==20.14.1"
+ "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2, 3.3, 3.4'",
+ "version": "==20.15.1"
},
"yarl": {
"hashes": [
@@ -1992,6 +2121,7 @@
"sha256:fce78593346c014d0d986b7ebc80d782b7f5e19843ca798ed62f8e3ba8728576",
"sha256:fd547ec596d90c8676e369dd8a581a21227fe9b4ad37d0dc7feb4ccf544c2d59"
],
+ "markers": "python_version >= '3.6'",
"version": "==1.7.2"
},
"zipp": {
@@ -1999,7 +2129,7 @@
"sha256:56bf8aadb83c24db6c4b577e13de374ccfb67da2078beba1d037c17980bf43ad",
"sha256:c4f6e5bbf48e74f7a38e7cc5b0480ff42b0ae5178957d564d18932525d5cf099"
],
- "markers": "python_version < '3.10'",
+ "markers": "python_version >= '3.7'",
"version": "==3.8.0"
}
}
diff --git a/setup.py b/setup.py
index 7e7a2020a0..59e644cec6 100644
--- a/setup.py
+++ b/setup.py
@@ -33,7 +33,7 @@
]
},
install_requires=[
- "bc-python-hcl2==0.3.42",
+ "bc-python-hcl2==0.3.44",
"cloudsplaining>=0.4.1",
"deep_merge",
"tabulate",
|
litestar-org__litestar-1633 | StaticFilesConfig and virtual directories
I'm trying to write a ``FileSystemProtocol`` to load files from the package data using [importlib_resources](https://importlib-resources.readthedocs.io/en/latest/using.html#). But because ``directories`` is defined as ``DirectoryPath``, pydantic checks if the given directories exist in the local filesystem.
This is not generally true, especially in any kind of virtual filesystem (e.g. a zipped package). I think this condition should be relaxed to support virtual filesystems.
https://github.com/starlite-api/starlite/blob/9bb6dcd57c10a591377cf8e3a537e9292566d5b9/starlite/config/static_files.py#L32
| [
{
"content": "from __future__ import annotations\n\nimport argparse\nimport importlib.metadata\nimport json\nimport os\nimport shutil\nimport subprocess\nfrom contextlib import contextmanager\nfrom pathlib import Path\nfrom typing import TypedDict\n\nREDIRECT_TEMPLATE = \"\"\"\n<!DOCTYPE HTML>\n<html lang=\"en-... | [
{
"content": "from __future__ import annotations\n\nimport argparse\nimport importlib.metadata\nimport json\nimport os\nimport shutil\nimport subprocess\nfrom contextlib import contextmanager\nfrom pathlib import Path\nfrom typing import TypedDict\n\nREDIRECT_TEMPLATE = \"\"\"\n<!DOCTYPE HTML>\n<html lang=\"en-... | diff --git a/tools/build_docs.py b/tools/build_docs.py
index 46e577e3e5..34b4f48b4a 100644
--- a/tools/build_docs.py
+++ b/tools/build_docs.py
@@ -90,7 +90,7 @@ def main() -> None:
build(
output_dir=args.output,
version=args.version,
- ignore_missing_output=args.ignore_missing_output,
+ ignore_missing_output=args.ignore_missing_examples_output,
)
|
zulip__zulip-8684 | lint rules: Prevent `return undefined;`
We should sweep the code to replace `return undefined;` with `return;`, and then make a lint rule for it, either via eslint (if they support that) or by making a custom rule.
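A hedged illustration of the sweep half of the proposal — a rough regex pass (a hypothetical helper, not Zulip's actual tooling) that rewrites `return undefined;` to `return;`; the lint rule would then keep regressions out:

```python
import re

# Matches a literal `return undefined;` statement; deliberately does
# not touch identifiers that merely start with "undefined".
RETURN_UNDEFINED = re.compile(r"\breturn undefined;")

def sweep(source: str) -> str:
    """Rewrite `return undefined;` into the equivalent `return;`."""
    return RETURN_UNDEFINED.sub("return;", source)

print(sweep("error: function () { return undefined; },"))
# prints: error: function () { return; },
```

In a bare function body the two statements are equivalent in JavaScript (both yield `undefined` to the caller), which is why a mechanical sweep is safe here.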
| [
{
"content": "ZULIP_VERSION = \"1.7.1+git\"\n\n# Bump the minor PROVISION_VERSION to indicate that folks should provision\n# only when going from an old version of the code to a newer version. Bump\n# the major version to indicate that folks should provision in both\n# directions.\n\n# Typically, adding a depen... | [
{
"content": "ZULIP_VERSION = \"1.7.1+git\"\n\n# Bump the minor PROVISION_VERSION to indicate that folks should provision\n# only when going from an old version of the code to a newer version. Bump\n# the major version to indicate that folks should provision in both\n# directions.\n\n# Typically, adding a depen... | diff --git a/.eslintrc.json b/.eslintrc.json
index 7c680d38220ce..db7c93bd78cad 100644
--- a/.eslintrc.json
+++ b/.eslintrc.json
@@ -172,6 +172,9 @@
"common": false,
"panels": false
},
+ "plugins": [
+ "eslint-plugin-empty-returns"
+ ],
"rules": {
"array-callback-return": "error",
"array-bracket-spacing": "error",
@@ -191,6 +194,7 @@
"complexity": [ 0, 4 ],
"curly": 2,
"dot-notation": [ "error", { "allowKeywords": true } ],
+ "empty-returns/main": "error",
"eol-last": [ "error", "always" ],
"eqeqeq": 2,
"func-style": [ "off", "expression" ],
diff --git a/frontend_tests/node_tests/people.js b/frontend_tests/node_tests/people.js
index f7a1552f9d644..4e62c45a3aff8 100644
--- a/frontend_tests/node_tests/people.js
+++ b/frontend_tests/node_tests/people.js
@@ -2,7 +2,7 @@ zrequire('util');
zrequire('people');
set_global('blueslip', {
- error: function () { return undefined; },
+ error: function () { return; },
});
set_global('page_params', {});
set_global('md5', function (s) {
@@ -555,7 +555,7 @@ initialize();
assert.equal(email, 'debbie71@example.com');
// Test undefined slug
- people.emails_strings_to_user_ids_string = function () { return undefined; };
+ people.emails_strings_to_user_ids_string = function () { return; };
assert.equal(people.emails_to_slug(), undefined);
}());
diff --git a/frontend_tests/node_tests/people_errors.js b/frontend_tests/node_tests/people_errors.js
index 3eb3a5d0d6666..c3906c9d798a0 100644
--- a/frontend_tests/node_tests/people_errors.js
+++ b/frontend_tests/node_tests/people_errors.js
@@ -107,7 +107,7 @@ people.initialize_current_user(me.user_id);
assert(reply_to.indexOf('?') > -1);
people.pm_with_user_ids = function () { return [42]; };
- people.get_person_from_user_id = function () { return undefined; };
+ people.get_person_from_user_id = function () { return; };
global.blueslip.error = function (msg) {
assert.equal(msg, 'Unknown people in message');
};
diff --git a/frontend_tests/node_tests/search_suggestion.js b/frontend_tests/node_tests/search_suggestion.js
index 437a86ea7105b..382e29429889e 100644
--- a/frontend_tests/node_tests/search_suggestion.js
+++ b/frontend_tests/node_tests/search_suggestion.js
@@ -53,7 +53,7 @@ topic_data.reset();
};
global.narrow_state.stream = function () {
- return undefined;
+ return;
};
var suggestions = search.get_suggestions(query);
@@ -73,7 +73,7 @@ topic_data.reset();
};
global.narrow_state.stream = function () {
- return undefined;
+ return;
};
var ted =
@@ -244,7 +244,7 @@ topic_data.reset();
};
global.narrow_state.stream = function () {
- return undefined;
+ return;
};
set_global('activity', {
@@ -430,7 +430,7 @@ init();
};
global.narrow_state.stream = function () {
- return undefined;
+ return;
};
var suggestions = search.get_suggestions(query);
@@ -466,7 +466,7 @@ init();
};
global.narrow_state.stream = function () {
- return undefined;
+ return;
};
var query = '';
diff --git a/frontend_tests/node_tests/topic_generator.js b/frontend_tests/node_tests/topic_generator.js
index 70c6b8023f0bb..61f182d870488 100644
--- a/frontend_tests/node_tests/topic_generator.js
+++ b/frontend_tests/node_tests/topic_generator.js
@@ -174,7 +174,7 @@ function is_odd(i) { return i % 2 === 1; }
assert.equal(gen.next(), undefined);
var undef = function () {
- return undefined;
+ return;
};
global.blueslip.error = function (msg) {
@@ -315,7 +315,7 @@ function is_odd(i) { return i % 2 === 1; }
unread.num_unread_for_person = function (user_ids_string) {
if (user_ids_string === 'unk') {
- return undefined;
+ return;
}
if (user_ids_string === 'read') {
diff --git a/package.json b/package.json
index 69db08b0d523a..2c8a4f0f332e8 100644
--- a/package.json
+++ b/package.json
@@ -50,6 +50,7 @@
"cssstyle": "0.2.29",
"difflib": "0.2.4",
"eslint": "3.9.1",
+ "eslint-plugin-empty-returns": "1.0.2",
"htmlparser2": "3.8.3",
"istanbul": "0.4.5",
"jsdom": "9.4.1",
diff --git a/static/js/blueslip.js b/static/js/blueslip.js
index 65c2c0cc6dd9e..aa597cee54b3d 100644
--- a/static/js/blueslip.js
+++ b/static/js/blueslip.js
@@ -71,7 +71,7 @@ Logger.prototype = (function () {
if (console[name] !== undefined) {
return console[name].apply(console, arguments);
}
- return undefined;
+ return;
};
}
diff --git a/static/js/common.js b/static/js/common.js
index 12390dde98a6d..d145583bb9449 100644
--- a/static/js/common.js
+++ b/static/js/common.js
@@ -26,7 +26,7 @@ exports.autofocus = function (selector) {
exports.password_quality = function (password, bar, password_field) {
// We load zxcvbn.js asynchronously, so the variable might not be set.
if (typeof zxcvbn === 'undefined') {
- return undefined;
+ return;
}
var min_length = password_field.data('minLength');
@@ -58,7 +58,7 @@ exports.password_quality = function (password, bar, password_field) {
exports.password_warning = function (password, password_field) {
if (typeof zxcvbn === 'undefined') {
- return undefined;
+ return;
}
var min_length = password_field.data('minLength');
diff --git a/static/js/compose_fade.js b/static/js/compose_fade.js
index ea307c5982076..4b916097cc9a0 100644
--- a/static/js/compose_fade.js
+++ b/static/js/compose_fade.js
@@ -122,13 +122,13 @@ exports.would_receive_message = function (email) {
if (!sub) {
// If the stream isn't valid, there is no risk of a mix
// yet, so don't fade.
- return undefined;
+ return;
}
if (user && user.is_bot && !sub.invite_only) {
// Bots may receive messages on public streams even if they are
// not subscribed.
- return undefined;
+ return;
}
return stream_data.user_is_subscribed(focused_recipient.stream, email);
}
diff --git a/static/js/copy_and_paste.js b/static/js/copy_and_paste.js
index 69cadb26141ad..99f3eb1f41aeb 100644
--- a/static/js/copy_and_paste.js
+++ b/static/js/copy_and_paste.js
@@ -11,7 +11,7 @@ function find_boundary_tr(initial_tr, iterate_row) {
// parent tr, we should let the browser handle the copy-paste
// entirely on its own
if (tr.length === 0) {
- return undefined;
+ return;
}
// If the selection boundary is on a table row that does not have an
@@ -24,7 +24,7 @@ function find_boundary_tr(initial_tr, iterate_row) {
tr = iterate_row(tr);
}
if (j === 10) {
- return undefined;
+ return;
} else if (j !== 0) {
// If we updated tr, then we are not dealing with a selection
// that is entirely within one td, and we can skip the same td
diff --git a/static/js/dict.js b/static/js/dict.js
index f8b5577f02adf..b7485f853bc85 100644
--- a/static/js/dict.js
+++ b/static/js/dict.js
@@ -57,7 +57,7 @@ Dict.prototype = {
_munge: function Dict__munge(k) {
if (k === undefined) {
blueslip.error("Tried to call a Dict method with an undefined key.");
- return undefined;
+ return;
}
if (this._opts.fold_case) {
k = k.toLowerCase();
@@ -74,7 +74,7 @@ Dict.prototype = {
get: function Dict_get(key) {
var mapping = this._items[this._munge(key)];
if (mapping === undefined) {
- return undefined;
+ return;
}
return mapping.v;
},
diff --git a/static/js/echo.js b/static/js/echo.js
index d1285b89c586f..a6bde1f55b837 100644
--- a/static/js/echo.js
+++ b/static/js/echo.js
@@ -60,19 +60,19 @@ var get_next_local_id = (function () {
// If our id is already used, it is probably an edge case like we had
// to abort a very recent message.
blueslip.warn("We don't reuse ids for local echo.");
- return undefined;
+ return;
}
if (next_local_id % 1 > local_id_increment * 5) {
blueslip.warn("Turning off local echo for this message to let host catch up");
- return undefined;
+ return;
}
if (next_local_id % 1 === 0) {
// The logic to stop at 0.05 should prevent us from ever wrapping around
// to the next integer.
blueslip.error("Programming error");
- return undefined;
+ return;
}
already_used[next_local_id] = true;
@@ -139,18 +139,18 @@ function insert_local_message(message_request, local_id) {
exports.try_deliver_locally = function try_deliver_locally(message_request) {
if (markdown.contains_backend_only_syntax(message_request.content)) {
- return undefined;
+ return;
}
if (narrow_state.active() && !narrow_state.filter().can_apply_locally()) {
- return undefined;
+ return;
}
var next_local_id = get_next_local_id();
if (!next_local_id) {
// This can happen for legit reasons.
- return undefined;
+ return;
}
return insert_local_message(message_request, next_local_id);
diff --git a/static/js/hashchange.js b/static/js/hashchange.js
index 9d2d29b70f374..917565455e40c 100644
--- a/static/js/hashchange.js
+++ b/static/js/hashchange.js
@@ -85,7 +85,7 @@ exports.parse_narrow = function (hash) {
}
operators.push({negated: negated, operator: operator, operand: operand});
} catch (err) {
- return undefined;
+ return;
}
}
return operators;
diff --git a/static/js/localstorage.js b/static/js/localstorage.js
index c68d7aa1ef47e..1bea82f21b6d9 100644
--- a/static/js/localstorage.js
+++ b/static/js/localstorage.js
@@ -6,7 +6,7 @@ var ls = {
try {
return JSON.parse(str);
} catch (err) {
- return undefined;
+ return;
}
},
diff --git a/static/js/markdown.js b/static/js/markdown.js
index c436e73ba6c0a..c171e54ecabbd 100644
--- a/static/js/markdown.js
+++ b/static/js/markdown.js
@@ -63,7 +63,7 @@ exports.apply_markdown = function (message) {
'@' + name +
'</span>';
}
- return undefined;
+ return;
},
groupMentionHandler: function (name) {
var group = user_groups.get_user_group_from_name(name);
@@ -75,7 +75,7 @@ exports.apply_markdown = function (message) {
'@' + group.name +
'</span>';
}
- return undefined;
+ return;
},
};
message.content = marked(message.raw_content + '\n\n', options).trim();
@@ -165,7 +165,7 @@ function handleAvatar(email) {
function handleStream(streamName) {
var stream = stream_data.get_sub(streamName);
if (stream === undefined) {
- return undefined;
+ return;
}
var href = window.location.origin + '/#narrow/stream/' + hash_util.encode_stream_name(stream.name);
return '<a class="stream" data-stream-id="' + stream.stream_id + '" ' +
diff --git a/static/js/message_list.js b/static/js/message_list.js
index e49e6f8828bd3..4b73888cbb725 100644
--- a/static/js/message_list.js
+++ b/static/js/message_list.js
@@ -100,7 +100,7 @@ exports.MessageList.prototype = {
get: function MessageList_get(id) {
id = parseFloat(id);
if (isNaN(id)) {
- return undefined;
+ return;
}
return this._hash[id];
},
diff --git a/static/js/narrow_state.js b/static/js/narrow_state.js
index af47562ec5a6a..028510671bd86 100644
--- a/static/js/narrow_state.js
+++ b/static/js/narrow_state.js
@@ -41,7 +41,7 @@ exports.update_email = function (user_id, new_email) {
/* Operators we should send to the server. */
exports.public_operators = function () {
if (current_filter === undefined) {
- return undefined;
+ return;
}
return current_filter.public_operators();
};
@@ -96,7 +96,7 @@ exports.set_compose_defaults = function () {
exports.stream = function () {
if (current_filter === undefined) {
- return undefined;
+ return;
}
var stream_operands = current_filter.operands("stream");
if (stream_operands.length === 1) {
@@ -106,18 +106,18 @@ exports.stream = function () {
// name (considering renames and capitalization).
return stream_data.get_name(name);
}
- return undefined;
+ return;
};
exports.topic = function () {
if (current_filter === undefined) {
- return undefined;
+ return;
}
var operands = current_filter.operands("topic");
if (operands.length === 1) {
return operands[0];
}
- return undefined;
+ return;
};
exports.pm_string = function () {
diff --git a/static/js/people.js b/static/js/people.js
index 9db02b7606040..c64ffd6047bd5 100644
--- a/static/js/people.js
+++ b/static/js/people.js
@@ -35,7 +35,7 @@ exports.init();
exports.get_person_from_user_id = function (user_id) {
if (!people_by_user_id_dict.has(user_id)) {
blueslip.error('Unknown user_id in get_person_from_user_id: ' + user_id);
- return undefined;
+ return;
}
return people_by_user_id_dict.get(user_id);
};
@@ -44,7 +44,7 @@ exports.get_by_email = function (email) {
var person = people_dict.get(email);
if (!person) {
- return undefined;
+ return;
}
if (person.email.toLowerCase() !== email.toLowerCase()) {
@@ -91,12 +91,12 @@ exports.get_user_id = function (email) {
if (person === undefined) {
var error_msg = 'Unknown email for get_user_id: ' + email;
blueslip.error(error_msg);
- return undefined;
+ return;
}
var user_id = person.user_id;
if (!user_id) {
blueslip.error('No user_id found for ' + email);
- return undefined;
+ return;
}
return user_id;
@@ -555,7 +555,7 @@ exports.is_valid_email_for_compose = function (email) {
exports.get_active_user_for_email = function (email) {
var person = people.get_by_email(email);
if (!person) {
- return undefined;
+ return;
}
return active_user_dict.get(person.user_id);
};
@@ -596,7 +596,7 @@ exports.get_active_user_ids = function () {
exports.is_cross_realm_email = function (email) {
var person = people.get_by_email(email);
if (!person) {
- return undefined;
+ return;
}
return cross_realm_dict.has(person.user_id);
};
diff --git a/static/js/stream_data.js b/static/js/stream_data.js
index 1ac17478782c1..e5374022aa137 100644
--- a/static/js/stream_data.js
+++ b/static/js/stream_data.js
@@ -370,7 +370,7 @@ exports.user_is_subscribed = function (stream_name, user_email) {
// subscribed, we can't keep track of the subscriber list in general,
// so we return undefined (treated as falsy if not explicitly handled).
blueslip.warn("We got a user_is_subscribed call for a non-existent or unsubscribed stream.");
- return undefined;
+ return;
}
var user_id = people.get_user_id(user_email);
if (!user_id) {
@@ -529,7 +529,7 @@ exports.get_newbie_stream = function () {
return page_params.notifications_stream;
}
- return undefined;
+ return;
};
exports.remove_default_stream = function (stream_id) {
diff --git a/static/js/typing.js b/static/js/typing.js
index f364566d91678..86b669f3ef892 100644
--- a/static/js/typing.js
+++ b/static/js/typing.js
@@ -24,7 +24,7 @@ function send_typing_notification_ajax(recipients, operation) {
function get_recipient() {
var compose_recipient = compose_state.recipient();
if (compose_recipient === "") {
- return undefined;
+ return;
}
return compose_recipient;
}
diff --git a/static/js/user_groups.js b/static/js/user_groups.js
index cdb9cd8281646..1b63815b323da 100644
--- a/static/js/user_groups.js
+++ b/static/js/user_groups.js
@@ -31,7 +31,7 @@ exports.remove = function (user_group) {
exports.get_user_group_from_id = function (group_id) {
if (!user_group_by_id_dict.has(group_id)) {
blueslip.error('Unknown group_id in get_user_group_from_id: ' + group_id);
- return undefined;
+ return;
}
return user_group_by_id_dict.get(group_id);
};
diff --git a/version.py b/version.py
index 72fea154e844d..96d25f384546d 100644
--- a/version.py
+++ b/version.py
@@ -8,4 +8,4 @@
# Typically, adding a dependency only requires a minor version bump, and
# removing a dependency requires a major version bump.
-PROVISION_VERSION = '15.9'
+PROVISION_VERSION = '15.10'
diff --git a/yarn.lock b/yarn.lock
index 04d346301a3a7..27c03618df4f7 100644
--- a/yarn.lock
+++ b/yarn.lock
@@ -1731,6 +1731,10 @@ escope@^3.6.0:
esrecurse "^4.1.0"
estraverse "^4.1.1"
+eslint-plugin-empty-returns@^1.0.1:
+ version "1.0.1"
+ resolved "https://registry.yarnpkg.com/eslint-plugin-empty-returns/-/eslint-plugin-empty-returns-1.0.1.tgz#ca19faa501e114812577db68ec6882ea48c40a27"
+
eslint@3.9.1:
version "3.9.1"
resolved "https://registry.yarnpkg.com/eslint/-/eslint-3.9.1.tgz#5a8597706fc6048bc6061ac754d4a211d28f4f5b"
@@ -3994,13 +3998,13 @@ mapbox-gl-function@^1.2.1:
version "1.3.0"
resolved "https://registry.yarnpkg.com/mapbox-gl-function/-/mapbox-gl-function-1.3.0.tgz#cee3d95750c189d45e83ab41a0a57fc2a8a509bc"
-"mapbox-gl-shaders@github:mapbox/mapbox-gl-shaders#de2ab007455aa2587c552694c68583f94c9f2747":
+mapbox-gl-shaders@mapbox/mapbox-gl-shaders#de2ab007455aa2587c552694c68583f94c9f2747:
version "1.0.0"
resolved "https://codeload.github.com/mapbox/mapbox-gl-shaders/tar.gz/de2ab007455aa2587c552694c68583f94c9f2747"
dependencies:
brfs "^1.4.0"
-"mapbox-gl-style-spec@github:mapbox/mapbox-gl-style-spec#83b1a3e5837d785af582efd5ed1a212f2df6a4ae":
+mapbox-gl-style-spec@mapbox/mapbox-gl-style-spec#83b1a3e5837d785af582efd5ed1a212f2df6a4ae:
version "8.8.0"
resolved "https://codeload.github.com/mapbox/mapbox-gl-style-spec/tar.gz/83b1a3e5837d785af582efd5ed1a212f2df6a4ae"
dependencies:
|
pex-tool__pex-2240 | Release 2.1.146
On the docket:
+ [x] Fix non executable venv sys path bug #2236
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.145\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.146\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.md b/CHANGES.md
index e0b86c7d2..e4578df2c 100644
--- a/CHANGES.md
+++ b/CHANGES.md
@@ -1,5 +1,12 @@
# Release Notes
+## 2.1.146
+
+This release brings a fix by new contributor @yjabri for the `__pex__`
+import hook that gets it working properly for `--venv` mode PEXes.
+
+* Fix non executable venv sys path bug (#2236)
+
## 2.1.145
This release broadens the range of the `flit-core` build system Pex uses
diff --git a/pex/version.py b/pex/version.py
index 1fff73a2e..79b99d81d 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.145"
+__version__ = "2.1.146"
|
pex-tool__pex-2042 | Release 2.1.121
On the docket:
+ [x] Building Pex with requirements.txt that includes local directory + Python version specifier fails #2037
+ [x] Failed to resolve compatible distributions when building Pex from .whl with local dependencies #2038
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.120\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.121\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 992b31e92..052808652 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,18 @@
Release Notes
=============
+2.1.121
+-------
+
+This release fixes two bugs brought to light trying to interoperate with
+Poetry projects.
+
+* Support space separated markers in URL reqs. (#2039)
+ `PR #2039 <https://github.com/pantsbuild/pex/pull/2039>`_
+
+* Handle file:// URL deps in distributions. (#2041)
+ `PR #2041 <https://github.com/pantsbuild/pex/pull/2041>`_
+
2.1.120
-------
diff --git a/pex/version.py b/pex/version.py
index 85c867798..2513fd6e8 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.120"
+__version__ = "2.1.121"
|
pex-tool__pex-2245 | Release 2.1.147
On the docket:
+ [x] pex does not use .pip/pip.conf to resolve packages #336 / #838
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.146\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.147\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.md b/CHANGES.md
index e4578df2c..70201b9bb 100644
--- a/CHANGES.md
+++ b/CHANGES.md
@@ -1,5 +1,13 @@
# Release Notes
+## 2.1.147
+
+Add support for `--use-pip-config` to allow the Pip Pex calls to read
+`PIP_*` env vars and Pip configuration files. This can be particularly
+useful for picking up custom index configuration (including auth).
+
+* Add support for --use-pip-config. (#2243)
+
## 2.1.146
This release brings a fix by new contributor @yjabri for the `__pex__`
diff --git a/pex/version.py b/pex/version.py
index 79b99d81d..34e32d6eb 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.146"
+__version__ = "2.1.147"
|
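The `--use-pip-config` flag above lets the Pip that Pex invokes read `PIP_*` env vars and files like `~/.pip/pip.conf`, which are plain INI files. A standard-library sketch of pulling a custom index URL out of such a file (the section and option names follow Pip's documented config format; the helper itself is illustrative):

```python
import configparser


def read_index_url(pip_conf_path):
    """Return the [global] index-url from a pip.conf file, or None if unset."""
    parser = configparser.ConfigParser()
    parser.read(pip_conf_path)  # silently ignores a missing file
    return parser.get("global", "index-url", fallback=None)
```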
pex-tool__pex-1947 | Release 2.1.110
On the docket:
+ [x] PEX runtime sys.path scrubbing is imperfect. #1944
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.109\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.110\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index cc087a1ee..79be2e8ea 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,17 @@
Release Notes
=============
+2.1.110
+-------
+
+This release fixes Pex runtime ``sys.path`` scrubbing for cases where
+Pex is not the main entry point. An important example of this is in
+Lambdex where the AWS Lambda Python runtime packages (``boto3`` and
+``botocore``) are leaked into the PEX runtime ``sys.path``.
+
+* Fix ``sys.path`` scrubbing. (#1946)
+ `PR #1946 <https://github.com/pantsbuild/pex/pull/1946>`_
+
2.1.109
-------
diff --git a/pex/version.py b/pex/version.py
index 32f577f51..6e6ad76d6 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.109"
+__version__ = "2.1.110"
|
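The scrubbing fix above removes ambient runtime entries (such as the Lambda-provided `boto3` and `botocore`) from `sys.path` so only PEX-controlled paths remain. A simplified illustration of the idea, not Pex's actual implementation:

```python
import os


def scrub_sys_path(sys_path, allowed_prefixes):
    """Keep only sys.path entries rooted under one of allowed_prefixes,
    dropping ambient entries (e.g. a host runtime's own site-packages)."""
    def allowed(entry):
        real = os.path.realpath(entry)
        return any(
            real == prefix or real.startswith(prefix + os.sep)
            for prefix in allowed_prefixes
        )
    return [entry for entry in sys_path if allowed(entry)]
```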
ivy-llc__ivy-26758 | igamma
| [
{
"content": "# global\nfrom typing import Any\nimport itertools\nimport string\nimport builtins\n\n# local\nimport ivy\nfrom ivy.func_wrapper import with_supported_dtypes\nfrom ivy.functional.frontends.jax.func_wrapper import to_ivy_arrays_and_back\nfrom ivy.func_wrapper import with_unsupported_dtypes, fronten... | [
{
"content": "# global\nfrom typing import Any\nimport itertools\nimport string\nimport builtins\n\n# local\nimport ivy\nfrom ivy.func_wrapper import with_supported_dtypes\nfrom ivy.functional.frontends.jax.func_wrapper import to_ivy_arrays_and_back\nfrom ivy.func_wrapper import with_unsupported_dtypes, fronten... | diff --git a/ivy/functional/frontends/jax/lax/operators.py b/ivy/functional/frontends/jax/lax/operators.py
index e456a89ff4e01..488e60b4335e2 100644
--- a/ivy/functional/frontends/jax/lax/operators.py
+++ b/ivy/functional/frontends/jax/lax/operators.py
@@ -454,6 +454,11 @@ def gt(x, y):
return ivy.greater(x, y)
+@to_ivy_arrays_and_back
+def igamma(a, x):
+ return ivy.igamma(a, x=x)
+
+
@to_ivy_arrays_and_back
def imag(x):
return ivy.imag(x)
diff --git a/ivy_tests/test_ivy/test_frontends/test_jax/test_lax/test_operators.py b/ivy_tests/test_ivy/test_frontends/test_jax/test_lax/test_operators.py
index 7ad483e8edfc1..b04d2ec3604f1 100644
--- a/ivy_tests/test_ivy/test_frontends/test_jax/test_lax/test_operators.py
+++ b/ivy_tests/test_ivy/test_frontends/test_jax/test_lax/test_operators.py
@@ -1921,6 +1921,40 @@ def test_jax_gt(
)
+# igamma
+@handle_frontend_test(
+ fn_tree="jax.lax.igamma",
+ dtypes_and_xs=helpers.dtype_and_values(
+ available_dtypes=helpers.get_dtypes("numeric"),
+ num_arrays=2,
+ shared_dtype=True,
+ ),
+ test_with_out=st.just(False),
+)
+def test_jax_igamma(
+ *,
+ dtypes_and_xs,
+ on_device,
+ fn_tree,
+ frontend,
+ test_flags,
+ backend_fw,
+):
+ input_dtypes, (x, y) = dtypes_and_xs
+
+ helpers.test_frontend_function(
+ input_dtypes=input_dtypes,
+ backend_to_test=backend_fw,
+ frontend=frontend,
+ test_flags=test_flags,
+ fn_tree=fn_tree,
+ on_device=on_device,
+ test_values=True,
+ x=x,
+ y=y,
+ )
+
+
# imag
@handle_frontend_test(
fn_tree="jax.lax.imag",
|
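`igamma(a, x)` here is the regularized lower incomplete gamma function P(a, x) = γ(a, x) / Γ(a), matching the JAX/TensorFlow convention. A backend-independent numeric sketch using its power-series expansion (valid for a > 0, x ≥ 0; the term count is an illustrative choice):

```python
import math


def igamma(a, x, terms=200):
    """Regularized lower incomplete gamma P(a, x) via the power series
    gamma(a, x) = x**a * e**-x * sum_{n>=0} x**n / (a * (a+1) * ... * (a+n))."""
    if x < 0 or a <= 0:
        raise ValueError("requires x >= 0 and a > 0")
    if x == 0:
        return 0.0
    total = term = 1.0 / a
    denom = a
    for _ in range(1, terms):
        denom += 1.0
        term *= x / denom  # next series term: previous * x / (a + n)
        total += term
    # Multiply by x**a * e**-x / Gamma(a), computed in log space for stability.
    return total * math.exp(a * math.log(x) - x - math.lgamma(a))
```

For a = 1 this reduces to 1 - e^-x, and for a = 1/2 it equals erf(sqrt(x)), which makes handy sanity checks.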
docker__docker-py-635 | Can't use multiple of same host path in binds
Reference for issue is docker/compose#983
The `convert_volume_binds` function uses the dict key as the host path to bind.
Because of this, it is impossible to do the equivalent of `docker run -v /foo:/bar -v /foo:/baz`.
{
"content": "# Copyright 2013 dotCloud inc.\n\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n\n# http://www.apache.org/licenses/LICENSE-2.0\n\n# Unless requir... | [
{
"content": "# Copyright 2013 dotCloud inc.\n\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n\n# http://www.apache.org/licenses/LICENSE-2.0\n\n# Unless requir... | diff --git a/docker/utils/utils.py b/docker/utils/utils.py
index e4a3c9e64..724af4650 100644
--- a/docker/utils/utils.py
+++ b/docker/utils/utils.py
@@ -174,6 +174,9 @@ def convert_port_bindings(port_bindings):
def convert_volume_binds(binds):
+ if isinstance(binds, list):
+ return binds
+
result = []
for k, v in binds.items():
if isinstance(v, dict):
diff --git a/docs/volumes.md b/docs/volumes.md
index de2821400..db421557a 100644
--- a/docs/volumes.md
+++ b/docs/volumes.md
@@ -19,3 +19,16 @@ container_id = c.create_container(
})
)
```
+
+You can alternatively specify binds as a list. This code is equivalent to the
+example above:
+
+```python
+container_id = c.create_container(
+ 'busybox', 'ls', volumes=['/mnt/vol1', '/mnt/vol2'],
+ host_config=docker.utils.create_host_config(binds=[
+ '/home/user1/:/mnt/vol2',
+ '/var/www:/mnt/vol1:ro',
+ ])
+)
+```
diff --git a/tests/test.py b/tests/test.py
index e0a9e3452..97af11eec 100644
--- a/tests/test.py
+++ b/tests/test.py
@@ -808,6 +808,36 @@ def test_create_container_with_binds_rw(self):
DEFAULT_TIMEOUT_SECONDS
)
+ def test_create_container_with_binds_list(self):
+ try:
+ self.client.create_container(
+ 'busybox', 'true', host_config=create_host_config(
+ binds=[
+ "/tmp:/mnt/1:ro",
+ "/tmp:/mnt/2",
+ ],
+ )
+ )
+ except Exception as e:
+ self.fail('Command should not raise exception: {0}'.format(e))
+
+ args = fake_request.call_args
+ self.assertEqual(args[0][0], url_prefix +
+ 'containers/create')
+ expected_payload = self.base_create_payload()
+ expected_payload['HostConfig'] = create_host_config()
+ expected_payload['HostConfig']['Binds'] = [
+ "/tmp:/mnt/1:ro",
+ "/tmp:/mnt/2",
+ ]
+ self.assertEqual(json.loads(args[1]['data']), expected_payload)
+ self.assertEqual(args[1]['headers'],
+ {'Content-Type': 'application/json'})
+ self.assertEqual(
+ args[1]['timeout'],
+ DEFAULT_TIMEOUT_SECONDS
+ )
+
def test_create_container_with_port_binds(self):
self.maxDiff = None
try:
|
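The fix above lets callers pass `binds` as a list of raw `host:container[:mode]` strings, bypassing the dict whose host-path keys made `-v /foo:/bar -v /foo:/baz` impossible to express. A simplified sketch of the converter's two accepted shapes (the dict handling is condensed from the real `convert_volume_binds`):

```python
def convert_volume_binds(binds):
    """Normalize volume binds to a list of 'host:container:mode' strings.

    A list is passed through untouched, which allows the same host path
    to back multiple container paths; the legacy dict form keys on the
    host path and therefore cannot.
    """
    if isinstance(binds, list):
        return binds
    result = []
    for host_path, spec in binds.items():
        if isinstance(spec, dict):
            mode = "ro" if spec.get("ro") else "rw"
            result.append("{0}:{1}:{2}".format(host_path, spec["bind"], mode))
        else:
            result.append("{0}:{1}:rw".format(host_path, spec))
    return result
```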
pex-tool__pex-2143 | Release 2.1.135
On the docket:
+ [x] Add Support for Pip 23.1.1. #2133
+ [x] Introduce pex3 venv inspect. #2135
+ [x] Add support for Pip 23.1.2. #2142
+ [x] Introduce pex3 venv create. #2140
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.134\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.135\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 46b9ed061..5730d24ef 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,42 @@
Release Notes
=============
+2.1.135
+-------
+
+This release brings support for ``pex3 venv {inspect,create}`` for
+working with venvs directly using Pex. Previously, a PEX built with
+``--include-tools`` (or ``--venv``) had the capability of turning itself
+into a venv but the new ``pex3 venv create`` command can do this for any
+PEX file with the addition of a few new features:
+
+#. The venv can now be created directly from requirements producing no
+ intermediate PEX file.
+#. The venv can be created either from a PEX file or a lock file. A
+ subset of either of those can be chosen by also supplying
+ requirements.
+#. Instead of creating a full-fledged venv, just the site-packages can
+ be exported (without creating an intermediate venv). This "flat"
+ layout is used by several prominent runtimes - notably AWS Lambda -
+ and emulates ``pip install --target``. This style layout can also be
+ zipped and prefixed. Additionally it supports ``--platform`` and
+ ``--complete-platform`` allowing creation of, for example, an AWS
+ Lambda (or Lambda Layer) deployment zip on a non-Linux host.
+
+Additionally this release adds support for Pip 23.1.1 and 23.1.2.
+
+* Add Support for Pip 23.1.1. (#2133)
+ `PR #2133 <https://github.com/pantsbuild/pex/pull/2133>`_
+
+* Introduce pex3 venv inspect. (#2135)
+ `PR #2135 <https://github.com/pantsbuild/pex/pull/2135>`_
+
+* Introduce pex3 venv create. (#2140)
+ `PR #2140 <https://github.com/pantsbuild/pex/pull/2140>`_
+
+* Add support for Pip 23.1.2. (#2142)
+ `PR #2142 <https://github.com/pantsbuild/pex/pull/2142>`_
+
2.1.134
-------
diff --git a/pex/version.py b/pex/version.py
index bfbeb741e..7e49b0302 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.134"
+__version__ = "2.1.135"
|
Kinto__kinto-1304 | Cannot import name `Utc`
While trying to debug #1299 I encountered the following error:
```
$ make serve
...
~/.virtualenvs/test/bin/kinto migrate --ini config/kinto.ini
Traceback (most recent call last):
File "~/.virtualenvs/test/bin/kinto", line 11, in <module>
load_entry_point('kinto', 'console_scripts', 'kinto')()
File "~/.virtualenvs/test/lib/python3.5/site-packages/pkg_resources/__init__.py", line 560, in load_entry_point
return get_distribution(dist).load_entry_point(group, name)
File "~/.virtualenvs/test/lib/python3.5/site-packages/pkg_resources/__init__.py", line 2648, in load_entry_point
return ep.load()
File "~/.virtualenvs/test/lib/python3.5/site-packages/pkg_resources/__init__.py", line 2302, in load
return self.resolve()
File "~/.virtualenvs/test/lib/python3.5/site-packages/pkg_resources/__init__.py", line 2308, in resolve
module = __import__(self.module_name, fromlist=['__name__'], level=0)
File "~/mozilla/kinto/kinto/__init__.py", line 4, in <module>
import kinto.core
File "~/mozilla/kinto/kinto/core/__init__.py", line 10, in <module>
from kinto.core import errors
File "~/mozilla/kinto/kinto/core/errors.py", line 1, in <module>
import colander
File "~/.virtualenvs/test/lib/python3.5/site-packages/colander/__init__.py", line 22, in <module>
from . import iso8601
File "~/.virtualenvs/test/lib/python3.5/site-packages/colander/iso8601.py", line 3, in <module>
from iso8601.iso8601 import (parse_date, ParseError, Utc, FixedOffset, UTC, ZERO, ISO8601_REGEX)
ImportError: cannot import name 'Utc'
Makefile:87 : la recette pour la cible « migrate » a échouée
make: *** [migrate] Erreur 1
```
| [
{
"content": "import codecs\nimport os\nfrom setuptools import setup, find_packages\n\nhere = os.path.abspath(os.path.dirname(__file__))\n\n\ndef read_file(filename):\n \"\"\"Open a related file and return its content.\"\"\"\n with codecs.open(os.path.join(here, filename), encoding='utf-8') as f:\n ... | [
{
"content": "import codecs\nimport os\nfrom setuptools import setup, find_packages\n\nhere = os.path.abspath(os.path.dirname(__file__))\n\n\ndef read_file(filename):\n \"\"\"Open a related file and return its content.\"\"\"\n with codecs.open(os.path.join(here, filename), encoding='utf-8') as f:\n ... | diff --git a/setup.py b/setup.py
index 1ffb4863d..7515b388d 100644
--- a/setup.py
+++ b/setup.py
@@ -18,8 +18,7 @@ def read_file(filename):
REQUIREMENTS = [
'bcrypt',
- 'iso8601==0.1.11', # Refs #1301
- 'colander >= 1.3.2',
+ 'colander >= 1.4.0',
'cornice >= 2.4',
'cornice_swagger >= 0.5.1',
'jsonschema',
|
pex-tool__pex-2278 | Release 2.1.150
On the docket:
+ [x] Add support for Pip 23.3.1. #2276
+ [x] Support .egg-info dist metadata. #2264
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.149\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.150\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.md b/CHANGES.md
index d8ccde54f..9c6d8a052 100644
--- a/CHANGES.md
+++ b/CHANGES.md
@@ -1,5 +1,11 @@
# Release Notes
+## 2.1.150
+
+This release brings support for `--pip-version 23.3.1`.
+
+* Add support for Pip 23.3.1. (#2276)
+
## 2.1.149
Fix `--style universal` lock handing of `none` ABI wheels with a
diff --git a/pex/version.py b/pex/version.py
index 70044c887..e85c65794 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.149"
+__version__ = "2.1.150"
|
pex-tool__pex-2226 | Release 2.1.144
On the docket:
+ [x] Traverse directories in stable order when building a PEX #2220
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.143\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.144\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.md b/CHANGES.md
index 51e6cf86f..8b6c19d5f 100644
--- a/CHANGES.md
+++ b/CHANGES.md
@@ -1,5 +1,12 @@
# Release Notes
+## 2.1.144
+
+This release fixes Pex to build PEX files with deterministic file order
+regardless of the operating system / file system the PEX was built on.
+
+* Traverse directories in stable order when building a PEX (#2220)
+
## 2.1.143
This release fixes Pex to work by default under eCryptFS home dirs.
diff --git a/pex/version.py b/pex/version.py
index b3894a15d..80b91697b 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.143"
+__version__ = "2.1.144"
|
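Deterministic PEX contents require walking the source tree in a stable order: `os.walk` ordering is filesystem-dependent unless the dirnames/filenames lists are sorted. A sketch of the technique (sorting `dirnames` in place steers the descent order):

```python
import os


def stable_walk(root):
    """Yield all file paths under root in a deterministic, sorted order,
    identical across operating systems and file systems."""
    for dirpath, dirnames, filenames in os.walk(root):
        dirnames.sort()  # in-place sort controls os.walk's descent order
        for name in sorted(filenames):
            yield os.path.join(dirpath, name)
```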
pex-tool__pex-1922 | Release 2.1.106
On the docket:
+ [x] Providing a direct reference to a wheel with a local version fails to resolve #1919
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.105\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.106\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 4d8d4f999..15da265af 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,17 +1,26 @@
Release Notes
=============
+2.1.106
+-------
+
+This release fixes a long standing bug in handling direct reference
+requirements with a local version component.
+
+* Unquote path component of parsed url requirements (#1920)
+ `PR #1920 <https://github.com/pantsbuild/pex/pull/1920>`_
+
2.1.105
-------
-This is a fix release which addresses issues related to build time work_dir creation,
-virtualenv, and sh_boot support.
+This is a fix release which addresses issues related to build time
+work_dir creation, virtualenv, and sh_boot support.
In the unlikely event of a UUID collision in atomic workdir creation,
pex could overwrite an existing directory and cause a corrupt state.
-When building a shell bootable ``--sh-boot`` pex the ``--runtime-pex-root``
-was not always respected based on the condition of the build environment,
-and the value of the PEX_ROOT.
+When building a shell bootable ``--sh-boot`` pex the
+``--runtime-pex-root`` was not always respected based on the condition
+of the build environment, and the value of the PEX_ROOT.
* Fail on atomic_directory work_dir collision. (#1905)
`PR #1905 <https://github.com/pantsbuild/pex/pull/1905>`_
@@ -19,8 +28,6 @@ and the value of the PEX_ROOT.
* Use raw_pex_root when constructing sh_boot pexes. (#1906)
`PR #1906 <https://github.com/pantsbuild/pex/pull/1906>`_
-Docs.
-
* Add support for offline downloads (#1898)
`PR #1898 <https://github.com/pantsbuild/pex/pull/1898>`_
diff --git a/pex/version.py b/pex/version.py
index 1daaee1de..1949bc250 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.105"
+__version__ = "2.1.106"
|
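A wheel with a local version (e.g. `1.0+local`) carries a `+` in its filename, which a URL encodes as `%2B`; resolving the downloaded artifact therefore requires unquoting the URL path back to the real filename. A standard-library sketch of that step (illustrative of the fix, not Pex's code):

```python
import posixpath
from urllib.parse import unquote, urlparse


def wheel_filename_from_url(url):
    """Extract the on-disk wheel filename from a direct-reference URL,
    decoding percent-escapes such as %2B back to '+'."""
    return unquote(posixpath.basename(urlparse(url).path))
```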
pex-tool__pex-2219 | Release 2.1.143
On the docket:
+ [x] pex fails to build pycryptodome due to filename too long #2087
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.142\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.143\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.md b/CHANGES.md
index 5bbc1c157..51e6cf86f 100644
--- a/CHANGES.md
+++ b/CHANGES.md
@@ -1,5 +1,11 @@
# Release Notes
+## 2.1.143
+
+This release fixes Pex to work by default under eCryptFS home dirs.
+
+* Guard against too long filenames on eCryptFS. (#2217)
+
## 2.1.142
This release fixes Pex to handle Pip backtracking due to sdist build
diff --git a/pex/version.py b/pex/version.py
index f385f96a2..b3894a15d 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.142"
+__version__ = "2.1.143"
|
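eCryptFS caps filenames well below the usual 255 bytes (around 143), so cache file names derived from long requirement strings must be shortened deterministically. A sketch of hash-based shortening (the 143 limit and the prefix-plus-digest scheme are illustrative choices, not Pex's exact scheme):

```python
import hashlib


def shorten_filename(name, limit=143):
    """Return name unchanged if it fits, else a deterministic shortened
    form: a truncated prefix plus a sha256 digest of the full name."""
    if len(name) <= limit:
        return name
    digest = hashlib.sha256(name.encode("utf-8")).hexdigest()
    keep = limit - len(digest) - 1  # room for the '-' separator
    return "{0}-{1}".format(name[:keep], digest)
```

Because the digest covers the full original name, two distinct long names can never collapse to the same shortened form except by hash collision.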
pex-tool__pex-2062 | Release 2.1.123
On the docket:
+ [x] Create lockfile for xmlsec fails #2063
+ [x] Internal not enough values to unpack error for pex3 lock create 'pip @ https://github.com/pypa/pip/archive/22.0.2.zip' ... #2057
+ [x] Pex lock creation does not handle wheels with non {cp,pp,py} pyver tag. #2059
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.122\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.123\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 869877089..978a81753 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,17 +1,51 @@
Release Notes
=============
+2.1.123
+-------
+
+This release fixes a few ``pex3 lock create`` bugs.
+
+There was a regression introduced in Pex 2.1.122 where projects that
+used a PEP-518 ``[build-system] requires`` but specified no
+corresponding ``build-backend`` would fail to lock.
+
+There were also two long standing issues handling more exotic direct
+reference URL requirements. Source archives with names not following the
+standard Python sdist naming scheme of
+``<project name>-<version>.{zip,tar.gz}`` would cause a lock error. An
+important class of these is provided by GitHub's magic source archive
+download URLs. Also, although local projects addressed with Pip
+proprietary support for pure local path requirements would lock, the
+same local projects addressed via
+``<project name> @ file://<local project path>`` would also cause a lock
+error. Both of these cases are now fixed and can be locked successfully.
+
+When locking with an ``--interpreter-constraint``, any resolve
+traversing wheels using the ``pypyXY`` or ``cpythonXY`` python tags
+would cause the lock to error. Wheels with this form of python tag are
+now handled correctly.
+
+* Handle ``[build-system]`` with no build-backend. (#2064)
+ `PR #2064 <https://github.com/pantsbuild/pex/pull/2064>`_
+
+* Handle locking all direct reference URL forms. (#2060)
+ `PR #2060 <https://github.com/pantsbuild/pex/pull/2060>`_
+
+* Fix python tag handling in IC locks. (#2061)
+ `PR #2061 <https://github.com/pantsbuild/pex/pull/2061>`_
+
2.1.122
-------
This release fixes posix file locks used by Pex internally and enhances
lock creation to support locking sdist-only C extension projects that
do not build on the current platform. Pex is also updated to support
-`--pip-version 22.3.1` and `--pip-version 23.0`, bringing it up to date
-with the latest Pip's available.
+``--pip-version 22.3.1`` and ``--pip-version 23.0``, bringing it up to
+date with the latest Pip's available.
* Support the latest Pip releases: 22.3.1 & 23.0 (#2056)
- `PR #2053 <https://github.com/pantsbuild/pex/pull/2056>`_
+ `PR #2056 <https://github.com/pantsbuild/pex/pull/2056>`_
* Lock sdists with ``prepare-metadata-for-build-wheel``. (#2053)
`PR #2053 <https://github.com/pantsbuild/pex/pull/2053>`_
diff --git a/pex/version.py b/pex/version.py
index c30e2a6bb..79f767fd5 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.122"
+__version__ = "2.1.123"
|
pex-tool__pex-1925 | Release 2.1.107
On the docket:
+ [x] `git` username replaced with `****` redaction in lockfile for `git+ssh` direct references #1918
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.106\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.107\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 15da265af..be93a0e4d 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,15 @@
Release Notes
=============
+2.1.107
+-------
+
+This release fixes an issue handling credentials in git+ssh VCS urls
+when creating locks.
+
+* Fix locks for git+ssh with credentials. (#1923)
+ `PR #1923 <https://github.com/pantsbuild/pex/pull/1923>`_
+
2.1.106
-------
diff --git a/pex/version.py b/pex/version.py
index 1949bc250..648e9a986 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.106"
+__version__ = "2.1.107"
|
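The bug above was over-eager redaction: for `git+ssh` URLs the `git` username is required for authentication and is not a secret, so only the password portion of the userinfo should be masked. A sketch of password-only redaction (illustrative, not Pex's implementation):

```python
from urllib.parse import urlsplit, urlunsplit


def redact_password(url):
    """Mask only the password in a URL's userinfo, keeping the username
    (e.g. the mandatory 'git' user in git+ssh URLs) intact."""
    parts = urlsplit(url)
    if parts.password is None:
        return url
    host = parts.hostname or ""
    if parts.port is not None:
        host = "{0}:{1}".format(host, parts.port)
    netloc = "{0}:****@{1}".format(parts.username, host)
    return urlunsplit((parts.scheme, netloc, parts.path, parts.query, parts.fragment))
```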
pex-tool__pex-1864 | Release 2.1.101
On the docket:
+ [x] Pex fails to find RECORD for python-certifi-win32 1.6.1 #1861
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.100\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.101\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 4c6cf4cb8..d02e8c1cc 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,15 @@
Release Notes
=============
+2.1.101
+-------
+
+This release fixes a corner-case revealed by python-certifi-win32 1.6.1
+that was not previously handled when installing certain distributions.
+
+* Make wheel install ``site-packages`` detection robust. (#1863)
+ `PR #1863 <https://github.com/pantsbuild/pex/pull/1863>`_
+
2.1.100
-------
diff --git a/pex/version.py b/pex/version.py
index 80d82318d..37d0e7cd6 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.100"
+__version__ = "2.1.101"
|
zulip__zulip-12366 | Scrollbar drag can result in unintended click actions
Split off from #11792:
> * on the settings pages, if you click on the scrollbar, drag it down, and then release your click when the mouse is outside the settings modal (e.g. below it or to the right), it closes the settings modal. I don't know if this is an existing thing or a regression, but I ran into it a bunch of times when testing even after knowing the behavior.
This was not a regression from perfect-scrollbar, but I fixed it in Grsmto/simplebar#312 and Grsmto/simplebar#317. Just waiting for the fixes to be included in a new upstream release.
| [
{
"content": "ZULIP_VERSION = \"2.0.3+git\"\nLATEST_MAJOR_VERSION = \"2.0\"\nLATEST_RELEASE_VERSION = \"2.0.3\"\nLATEST_RELEASE_ANNOUNCEMENT = \"https://blog.zulip.org/2019/03/01/zulip-2-0-released/\"\n\n# Bump the minor PROVISION_VERSION to indicate that folks should provision\n# only when going from an old ve... | [
{
"content": "ZULIP_VERSION = \"2.0.3+git\"\nLATEST_MAJOR_VERSION = \"2.0\"\nLATEST_RELEASE_VERSION = \"2.0.3\"\nLATEST_RELEASE_ANNOUNCEMENT = \"https://blog.zulip.org/2019/03/01/zulip-2-0-released/\"\n\n# Bump the minor PROVISION_VERSION to indicate that folks should provision\n# only when going from an old ve... | diff --git a/package.json b/package.json
index 7072781bf8cae..58195f9d7ddb6 100644
--- a/package.json
+++ b/package.json
@@ -34,7 +34,7 @@
"plotly.js": "1.37.1",
"sass-loader": "7.0.1",
"script-loader": "0.7.2",
- "simplebar": "^4.0.0-alpha.9",
+ "simplebar": "^4.0.0",
"sortablejs": "^1.7.0",
"sorttable": "1.0.2",
"source-map-loader": "0.2.3",
diff --git a/version.py b/version.py
index f9bad91b290c6..9d3ebd232d245 100644
--- a/version.py
+++ b/version.py
@@ -11,4 +11,4 @@
# Typically, adding a dependency only requires a minor version bump, and
# removing a dependency requires a major version bump.
-PROVISION_VERSION = '32.0'
+PROVISION_VERSION = '32.1'
diff --git a/yarn.lock b/yarn.lock
index 820cb7c560785..0847b96ce0991 100644
--- a/yarn.lock
+++ b/yarn.lock
@@ -10687,10 +10687,10 @@ signum@^1.0.0:
resolved "https://registry.yarnpkg.com/signum/-/signum-1.0.0.tgz#74a7d2bf2a20b40eba16a92b152124f1d559fa77"
integrity sha1-dKfSvyogtA66FqkrFSEk8dVZ+nc=
-simplebar@^4.0.0-alpha.9:
- version "4.0.0-alpha.9"
- resolved "https://registry.yarnpkg.com/simplebar/-/simplebar-4.0.0-alpha.9.tgz#e6cf24a2e613abbef952e962680ed2429d421617"
- integrity sha512-WGscL/Lsrfk0uTuG1Pyl/jV6ZkZh0A70atCxcVfvS81aGZdnRjQfHntQPT/nSr+8jxv6YSib5F+FnPCGVw9raw==
+simplebar@^4.0.0:
+ version "4.0.0"
+ resolved "https://registry.yarnpkg.com/simplebar/-/simplebar-4.0.0.tgz#7f1b9e735ec94a58f887d4803f6b15abf401b6b5"
+ integrity sha512-td6vJVhqIXfa3JgNZR5OgETPLfmHNSSpt+OXIbk6WH/nOrUtX3Qcyio30+5rdxxAV/61+F5eJ4jJV4Ek7/KJYQ==
dependencies:
can-use-dom "^0.1.0"
core-js "^3.0.1"
|
pex-tool__pex-2034 | Release 2.1.120
On the docket:
+ [x] Support REPL command history #2019
+ [x] Using --complete-platform with --resolve-local-platforms should build sdists when local platform provides a subset of complete-platforms #2026
+ [x] A loose layout, venv-with-symlink PEX creates brittle symlinks #2023
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.119\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.120\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index a7c5d4575..992b31e92 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,29 @@
Release Notes
=============
+2.1.120
+-------
+
+This release completes the ``--complete-platform`` fix started in
+Pex 2.1.116 by #1991. That fix did not work in all cases but now does.
+
+PEXes run in interpreter mode now support command history when the
+underlying interpreter being used to run the PEX does; use the
+``PEX_INTERPRETER_HISTORY`` bool env var to turn this on.
+
+Additionally, PEXes built with the combination
+``--layout loose --venv --no-venv-site-packages-copies`` are fixed to
+be robust to moves of the source loose PEX directory.
+
+* Fix loose --venv PEXes to be robust to moves. (#2033)
+ `PR #2033 <https://github.com/pantsbuild/pex/pull/2033>`_
+
+* Fix interpreter resolution when using --complete-platform with --resolve-local-platforms (#2031)
+ `PR #2031 <https://github.com/pantsbuild/pex/pull/2031>`_
+
+* Support REPL command history. (#2018)
+ `PR #2018 <https://github.com/pantsbuild/pex/pull/2018>`_
+
2.1.119
-------
diff --git a/pex/version.py b/pex/version.py
index a517c22a2..85c867798 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.119"
+__version__ = "2.1.120"
|
pex-tool__pex-2095 | Release 2.1.129
On the docket:
+ [x] Pex resolves VCS and local project requirements from locks incorrectly. #2092
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.128\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.129\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index d4a3a0197..46c800c11 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,16 @@
Release Notes
=============
+2.1.129
+-------
+
+This release fixes a bug downloading a VCS requirement from a lock when
+the ambient Python interpreter used to run Pex does not meet the
+``Requires-Python`` constraint of the VCS requirement.
+
+* Fix VCS lock downloads to respect target. (#2094)
+ `PR #2094 <https://github.com/pantsbuild/pex/pull/2094>`_
+
2.1.128
-------
diff --git a/pex/version.py b/pex/version.py
index f00e67d65..2553debad 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.128"
+__version__ = "2.1.129"
|
pex-tool__pex-2000 | Release 2.1.117
On the docket:
+ [x] Published pex on github no longer works with PyPy since 2.1.109 #1995
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.116\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.117\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 3f75d503d..20955dbe7 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,21 @@
Release Notes
=============
+2.1.117
+-------
+
+This release fixes a bug introduced in Pex 2.1.109 where the released
+Pex PEX could not be executed by PyPy interpreters. More generally, any
+PEX created with interpreter constraints that did not specify the Python
+implementation, e.g.: ``==3.8.*``, were interpreted as being CPython
+specific, i.e.: ``CPython==3.8.*``. This is now fixed, but if the
+intention of a constraint like ``==3.8.*`` was in fact to restrict to
+CPython only, interpreter constraints need to say so now and use
+``CPython==3.8.*`` explicitly.
+
+* Fix interpreter constraint parsing. (#1998)
+ `PR #1998 <https://github.com/pantsbuild/pex/pull/1998>`_
+
2.1.116
-------
diff --git a/pex/version.py b/pex/version.py
index 12b8e8168..eadcefbeb 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.116"
+__version__ = "2.1.117"
|
pex-tool__pex-1997 | Release 2.1.116
On the docket:
+ [x] The --resolve-local-platforms option does not work with --complete-platforms #1899
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.115\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.116\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 94f0b27b4..3f75d503d 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,15 @@
Release Notes
=============
+2.1.116
+-------
+
+This release fixes a bug in ``--resolve-local-platforms`` when
+``--complete-platform`` was used.
+
+* Check for --complete-platforms match when --resolve-local-platforms (#1991)
+ `PR #1991 <https://github.com/pantsbuild/pex/pull/1991>`_
+
2.1.115
-------
diff --git a/pex/version.py b/pex/version.py
index edffe5b53..12b8e8168 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.115"
+__version__ = "2.1.116"
|
pex-tool__pex-1942 | Release 2.1.109
On the docket:
+ [x] pex does not support musllinux wheels #1933
+ [x] Empty string PEX_PATH="" env var causes CWD (.) to be added bootstrapped pex_path #1936
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.108\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.109\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index f1a94fcc9..cc087a1ee 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,20 @@
Release Notes
=============
+2.1.109
+-------
+
+This release brings musllinux wheel support and a fix for a regression
+introduced in Pex 2.1.105 by #1902 that caused ``PEX_PATH=`` (an
+exported ``PEX_PATH`` with an empty string value) to raise an error in
+almost all use cases.
+
+* Vendor latest packaging; support musllinux wheels. (#1937)
+ `PR #1937 <https://github.com/pantsbuild/pex/pull/1937>`_
+
+* Don't treat ``PEX_PATH=`` as ``.`` like other PATHS. (#1938)
+ `PR #1938 <https://github.com/pantsbuild/pex/pull/1938>`_
+
2.1.108
-------
diff --git a/pex/version.py b/pex/version.py
index c0dfd4790..32f577f51 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.108"
+__version__ = "2.1.109"
|
DataDog__integrations-extras-1031 | Collect clock_time_seconds metric from cert-manager
cert-manager v1.5+ exposes a `clock_time` metric which reports the current seconds since the Unix Epoch
See: https://github.com/jetstack/cert-manager/pull/4105
It would be useful to collect this metric in DataDog so we can alert on seconds until a given certificate expires
| [
{
"content": "# (C) Datadog, Inc. 2019-present\n# All rights reserved\n# Licensed under a 3-clause BSD style license (see LICENSE)\n\nCERT_METRICS = {\n 'certmanager_certificate_ready_status': 'certificate.ready_status',\n 'certmanager_certificate_expiration_timestamp_seconds': 'certificate.expiration_tim... | [
{
"content": "# (C) Datadog, Inc. 2019-present\n# All rights reserved\n# Licensed under a 3-clause BSD style license (see LICENSE)\n\nCERT_METRICS = {\n 'certmanager_certificate_ready_status': 'certificate.ready_status',\n 'certmanager_certificate_expiration_timestamp_seconds': 'certificate.expiration_tim... | diff --git a/cert_manager/datadog_checks/cert_manager/metrics.py b/cert_manager/datadog_checks/cert_manager/metrics.py
index f098835222..7df0d19522 100644
--- a/cert_manager/datadog_checks/cert_manager/metrics.py
+++ b/cert_manager/datadog_checks/cert_manager/metrics.py
@@ -8,6 +8,7 @@
}
CONTROLLER_METRICS = {
+ 'certmanager_clock_time_seconds': 'clock_time',
'certmanager_controller_sync_call_count': 'controller.sync_call.count',
}
diff --git a/cert_manager/metadata.csv b/cert_manager/metadata.csv
index e9822aaf53..95c6dea960 100644
--- a/cert_manager/metadata.csv
+++ b/cert_manager/metadata.csv
@@ -1,4 +1,5 @@
metric_name,metric_type,interval,unit_name,per_unit_name,description,orientation,integration,short_name
+cert_manager.clock_time,count,,second,,The clock time given in seconds (from 1970/01/01 UTC),0,cert-manager,cm.clock_time
cert_manager.prometheus.health,gauge,,,,Whether the check is able to connect to the metrics endpoint,0,cert-manager,cm.health
cert_manager.certificate.ready_status,gauge,,,,The ready status of the certificate,0,cert-manager,cm.cert_ready_status
cert_manager.certificate.expiration_timestamp,gauge,,second,,The date after which the certificate expires. Expressed as a Unix Epoch Time,0,cert-manager,cm.cert_exp_time
diff --git a/cert_manager/tests/common.py b/cert_manager/tests/common.py
index cf75d2464a..f9ffa63be6 100644
--- a/cert_manager/tests/common.py
+++ b/cert_manager/tests/common.py
@@ -16,6 +16,7 @@
}
CONTROLLER_METRICS = {
+ 'cert_manager.clock_time': aggregator.MONOTONIC_COUNT,
'cert_manager.controller.sync_call.count': aggregator.MONOTONIC_COUNT,
'cert_manager.prometheus.health': aggregator.GAUGE,
}
diff --git a/cert_manager/tests/conftest.py b/cert_manager/tests/conftest.py
index 7afb08c858..88c2f581f0 100644
--- a/cert_manager/tests/conftest.py
+++ b/cert_manager/tests/conftest.py
@@ -27,7 +27,7 @@ def setup_cert_manager():
"kubectl",
"apply",
"-f",
- "https://github.com/jetstack/cert-manager/releases/download/v1.2.0/cert-manager.yaml",
+ "https://github.com/jetstack/cert-manager/releases/download/v1.5.0/cert-manager.yaml",
]
)
run_command(
diff --git a/cert_manager/tests/fixtures/cert_manager.txt b/cert_manager/tests/fixtures/cert_manager.txt
index d1cf303737..b5e0fd94b1 100644
--- a/cert_manager/tests/fixtures/cert_manager.txt
+++ b/cert_manager/tests/fixtures/cert_manager.txt
@@ -18,6 +18,9 @@ certmanager_certificate_ready_status{condition="Unknown",name="acme-cert",namesp
certmanager_certificate_ready_status{condition="Unknown",name="acme-cert2",namespace="default"} 0
certmanager_certificate_ready_status{condition="Unknown",name="myingress-cert",namespace="cert-manager-test"} 0
certmanager_certificate_ready_status{condition="Unknown",name="selfsigned-cert",namespace="cert-manager-test"} 0
+# HELP certmanager_clock_time_seconds The clock time given in seconds (from 1970/01/01 UTC).
+# TYPE certmanager_clock_time_seconds counter
+certmanager_clock_time_seconds 1.61915483e+09
# HELP certmanager_controller_sync_call_count The number of sync() calls made by a controller.
# TYPE certmanager_controller_sync_call_count counter
certmanager_controller_sync_call_count{controller="CertificateIssuing"} 20
|
pex-tool__pex-2086 | Release 2.1.127
On the docket:
+ [x] Pex fails to subset a "foo @ file:///bar" URL lock. #2083
| [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.126\"\n",
"path": "pex/version.py"
}
] | [
{
"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.127\"\n",
"path": "pex/version.py"
}
] | diff --git a/CHANGES.rst b/CHANGES.rst
index 112d072a0..b7c71cb3c 100644
--- a/CHANGES.rst
+++ b/CHANGES.rst
@@ -1,6 +1,15 @@
Release Notes
=============
+2.1.127
+-------
+
+This release fixes `--lock` resolve sub-setting for local project
+requirements.
+
+* Fix lock subsetting for local projects. (#2085)
+ `PR #2085 <https://github.com/pantsbuild/pex/pull/2085>`_
+
2.1.126
-------
diff --git a/pex/version.py b/pex/version.py
index 014827637..98d74ea5d 100644
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.126"
+__version__ = "2.1.127"
|
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.