organization string | repo_name string | base_commit string | iss_html_url string | iss_label string | title string | body string | code null | pr_html_url string | commit_html_url string | file_loc string | own_code_loc list | ass_file_loc list | other_rep_loc list | analysis dict | loctype dict | iss_has_pr int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
scrapy | scrapy | fd55f62207bbbb18d7758c8e2ef46fe9115eb2c5 | https://github.com/scrapy/scrapy/issues/5400 | bug
CI | Tests broken with Twisted 22.1.0 | `ImportError: cannot import name 'PayloadResource' from 'twisted.web.test.test_webclient'`
`ImportError: cannot import name 'ForeverTakingResource' from 'twisted.web.test.test_webclient'` | null | https://github.com/scrapy/scrapy/pull/5405 | null | {'base_commit': 'fd55f62207bbbb18d7758c8e2ef46fe9115eb2c5', 'files': [{'path': 'pytest.ini', 'status': 'modified', 'Loc': {'(None, None, 24)': {'mod': [24, 25]}}}, {'path': 'tests/mockserver.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [17, 20]}, "('LeafResource', None, 38)": {'mod': [38]}, "('Root', None, 178)": {'mod': [178]}, "('Root', '__init__', 180)": {'mod': [181, 190]}}}, {'path': 'tests/test_downloader_handlers.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [18, 19, 37]}}}, {'path': 'tests/test_webclient.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [24, 25, 26, 27, 28, 29, 30, 31, 39]}}}, {'path': 'tox.ini', 'status': 'modified', 'Loc': {'(None, None, 22)': {'mod': [22, 23]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"tests/mockserver.py"
],
"doc": [],
"test": [
"tests/test_webclient.py",
"tests/test_downloader_handlers.py"
],
"config": [
"pytest.ini",
"tox.ini"
],
"asset": []
} | null |
ultralytics | yolov5 | 04081f810270712ba3a69577c47e5dcfa850fa90 | https://github.com/ultralytics/yolov5/issues/1355 | bug | The exported label txt seems to have a problem | Hi, @glenn-jocher I managed to use `python detect.py --save-txt` to semi-auto label images, but when I set `Open Dir` and `Change Save Dir` in [labelImg](https://github.com/tzutalin/labelImg/releases/tag/v1.8.1), labelImg cannot display the exported bbox, and its command-line window shows an error:
```
Traceback (most recent call last):
File "<string>", line 1268, in openNextImg
File "<string>", line 1035, in loadFile
File "<string>", line 1427, in loadYOLOTXTByFilename
File "Z:\home\darrenl\tmp\labelImg\build-tools\build\labelImg\out00-PYZ.pyz\libs.yolo_io", line 112, in __init__
File "Z:\home\darrenl\tmp\labelImg\build-tools\build\labelImg\out00-PYZ.pyz\libs.yolo_io", line 142, in parseYoloFormat
ValueError: too many values to unpack
```
If i set `Change Save Dir` to another empty folder, it will not occur error, so i doubt it is the problem of exported label txt, could you have a try ? | null | https://github.com/ultralytics/yolov5/pull/1377 | null | {'base_commit': '04081f810270712ba3a69577c47e5dcfa850fa90', 'files': [{'path': '.github/workflows/ci-testing.yml', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [69, 72]}}}, {'path': 'README.md', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [99]}}}, {'path': 'detect.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [164], 'mod': [13, 159, 160]}, "(None, 'detect', 17)": {'mod': [18, 19, 24, 25, 26, 27]}}}, {'path': 'test.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [16, 282, 290, 291, 308]}, "(None, 'test', 20)": {'mod': [49, 50, 51, 52]}}}, {'path': 'train.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [417], 'mod': [30, 413, 414, 431, 433, 443, 469, 470, 517]}, "(None, 'train', 37)": {'mod': [39, 40, 41, 44, 45, 46, 49, 51, 123, 124, 191, 218, 299, 324, 372, 381]}}}, {'path': 'tutorial.ipynb', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [600, 614, 890, 972, 989, 990, 1033, 1042, 1043, 1044, 1081, 1173, 1175]}}}, {'path': 'utils/general.py', 'status': 'modified', 'Loc': {"(None, 'get_latest_run', 63)": {'mod': [63]}, "(None, 'increment_dir', 954)": {'mod': [954, 955, 956, 957, 958, 959, 960, 962, 964, 965, 966, 967, 968, 969, 970]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"tutorial.ipynb",
"utils/general.py",
"detect.py",
"train.py"
],
"doc": [
"README.md"
],
"test": [
"test.py"
],
"config": [
".github/workflows/ci-testing.yml"
],
"asset": []
} | 1 |
psf | requests | be62645dd56580dd7576032b348cf79d880851d8 | https://github.com/psf/requests/issues/1088 | Feature Request | Session pickling support is broken and tests for it are removed | The commit 42b029552190f6639642d0f62d27abcd1ceed51e removes the `__attrs__` attribute of the `Session` class, which is used in the pickle protocol's `__getstate__` method.
The tests that are testing this functionality (functions `test_session_pickling` and `test_unpickled_session_requests` in the once present `tests/test_requests.py`) are also removed.
The commit messages don't seem to indicate any reason for this, and I can't find anything searching in the issues.
If it is intended that pickling of Session objects not be supported, could you give the reason? And maybe the `__getstate__` and `__setstate__` methods should be removed too, as they might send the wrong message.
If this is unintended (which is what I think is the case), I can work on a pull request to fix this. Please confirm.
Thank you.
| null | https://github.com/psf/requests/pull/1223 | null | {'base_commit': 'be62645dd56580dd7576032b348cf79d880851d8', 'files': [{'path': 'requests/sessions.py', 'status': 'modified', 'Loc': {"('Session', None, 166)": {'add': [178]}}}]} | [] | [] | [] | {
"iss_type": "4",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"requests/sessions.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | 1 |
Significant-Gravitas | AutoGPT | 34261a15835390c5c464cef88c4a42b52a88b739 | https://github.com/Significant-Gravitas/AutoGPT/issues/987 | Message about Pinecone initializing | ### Duplicates
- [X] I have searched the existing issues
### Summary 💡
Add a message like: "Connecting Pinecone. This may take some time..."
### Examples 🌈
_No response_
### Motivation 🔦
At this point, if the Pinecone index setup takes a noticeable amount of time, the console just stops. It is necessary to notify the user that the index is being configured now and this may take some time. | null | https://github.com/Significant-Gravitas/AutoGPT/pull/1194 | null | {'base_commit': '34261a15835390c5c464cef88c4a42b52a88b739', 'files': [{'path': 'autogpt/memory/pinecone.py', 'status': 'modified', 'Loc': {"('PineconeMemory', '__init__', 10)": {'add': [40]}}}]} | [] | [] | [] | {
"iss_type": "4",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"autogpt/memory/pinecone.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | 1 | |
scikit-learn | scikit-learn | 6f7ae911f18fda59669309582706f1aa1f36374d | https://github.com/scikit-learn/scikit-learn/issues/19489 | Bug
Regression
module:feature_extraction | 'feature_name' referenced before assignment | <!--
Before submitting a bug, please make sure the issue hasn't been already
addressed by searching through the past issues.
-->
#### Describe the bug
When I run some preprocessing on my data the line triggering the error is:
```
C:\local_tools\Anaconda3\envs\mother_env\lib\site-packages\sklearn\feature_extraction\_dict_vectorizer.py in _transform(self, X, fitting)
226 indices=indices, values=values)
227
--> 228 if feature_name is not None:
229 if fitting and feature_name not in vocab:
230 vocab[feature_name] = len(feature_names)
UnboundLocalError: local variable 'feature_name' referenced before assignment
```
#### Steps/Code to Reproduce
<!--
Please add a minimal example that we can reproduce the error by running the
code. Be as succinct as possible, do not depend on external data. In short, we
are going to copy-paste your code and we expect to get the same
result as you.
Example:
```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
docs = ["Help I have a bug" for i in range(1000)]
vectorizer = CountVectorizer(input=docs, analyzer='word')
lda_features = vectorizer.fit_transform(docs)
lda_model = LatentDirichletAllocation(
n_topics=10,
learning_method='online',
evaluate_every=10,
n_jobs=4,
)
model = lda_model.fit(lda_features)
```
If the code is too long, feel free to put it in a public gist and link
it in the issue: https://gist.github.com
-->
It involves a bit too much preprocessing to put here but from inspecting the respective source file (see above, sklearn\feature_extraction\_dict_vectorizer.py) I have the strong suspicion that ```feature_name``` can go through all if/elif checks without being assigned anything.
#### Versions
<!--
Please run the following snippet and paste the output below.
For scikit-learn >= 0.20:
import sklearn; sklearn.show_versions()
For scikit-learn < 0.20:
import platform; print(platform.platform())
import sys; print("Python", sys.version)
import numpy; print("NumPy", numpy.__version__)
import scipy; print("SciPy", scipy.__version__)
import sklearn; print("Scikit-Learn", sklearn.__version__)
import imblearn; print("Imbalanced-Learn", imblearn.__version__)
-->
System:
python: 3.8.5 (default, Sep 3 2020, 21:29:08) [MSC v.1916 64 bit (AMD64)]
executable: C:\local_tools\Anaconda3\envs\mother_env\python.exe
machine: Windows-10-10.0.18362-SP0
Python dependencies:
pip: 20.3.3
setuptools: 52.0.0.post20210125
sklearn: 0.24.1
numpy: 1.19.2
scipy: 1.6.0
Cython: None
pandas: 1.2.1
matplotlib: 3.3.4
joblib: 1.0.1
threadpoolctl: 2.1.0
Built with OpenMP: True
<!-- Thanks for contributing! -->
| null | https://github.com/scikit-learn/scikit-learn/pull/19520 | null | {'base_commit': '6f7ae911f18fda59669309582706f1aa1f36374d', 'files': [{'path': 'doc/whats_new/v1.0.rst', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [346], 'mod': [343]}}}, {'path': 'sklearn/feature_extraction/_dict_vectorizer.py', 'status': 'modified', 'Loc': {"('DictVectorizer', '_transform', 190)": {'add': [246], 'mod': [229, 230, 231, 232, 233, 234, 235]}}}, {'path': 'sklearn/feature_extraction/tests/test_dict_vectorizer.py', 'status': 'modified', 'Loc': {"(None, 'test_dictvectorizer_dense_sparse_equivalence', 174)": {'add': [211]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"sklearn/feature_extraction/_dict_vectorizer.py"
],
"doc": [
"doc/whats_new/v1.0.rst"
],
"test": [
"sklearn/feature_extraction/tests/test_dict_vectorizer.py"
],
"config": [],
"asset": []
} | 1 |
scikit-learn | scikit-learn | 0fb9a50033574e36a8bd635d8e5c0a793428877c | https://github.com/scikit-learn/scikit-learn/issues/8996 | Easy
Sprint | Deprecate LSHForest | LSHForest should be deprecated and scheduled for removal in 0.21. It should also warn about having bad performance. cc @ogrisel | null | https://github.com/scikit-learn/scikit-learn/pull/9078 | null | {'base_commit': '0fb9a50033574e36a8bd635d8e5c0a793428877c', 'files': [{'path': 'benchmarks/bench_plot_approximate_neighbors.py', 'status': 'removed', 'Loc': {}}, {'path': 'doc/modules/classes.rst', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [1062]}}}, {'path': 'doc/modules/neighbors.rst', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [515, 517, 518, 520, 521, 522, 523, 524, 525, 527, 528, 529, 530, 531, 532, 533, 534, 535, 536, 538, 539, 541, 542, 543, 544, 545, 546, 547, 548, 550, 551, 552, 554, 555, 556, 557, 559, 560, 561, 562, 564, 565, 566, 568, 569, 570, 571, 572, 573, 574, 575, 577, 578, 579, 580, 582, 583, 584, 585, 587, 588, 589, 590, 592, 593, 594, 596, 598, 599, 601, 602, 604, 606, 607, 609, 610, 611, 612, 613, 614, 615, 616, 618, 619, 620, 621, 623, 624, 626, 627, 628, 629, 630, 632, 633, 634, 635, 636, 638, 639, 641, 642, 643, 644, 645, 646, 647, 648, 650, 651, 652, 653, 655, 656, 657, 658, 659, 660, 661, 662, 664, 665, 666, 667, 668, 669, 670, 671, 672, 673, 674, 675, 676, 677, 679, 681, 682, 683, 684, 685, 687, 688, 689, 690]}}}, {'path': 'doc/whats_new.rst', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [467], 'mod': [435]}}}, {'path': 'examples/neighbors/plot_approximate_nearest_neighbors_hyperparameters.py', 'status': 'removed', 'Loc': {}}, {'path': 'examples/neighbors/plot_approximate_nearest_neighbors_scalability.py', 'status': 'removed', 'Loc': {}}, {'path': 'sklearn/neighbors/approximate.py', 'status': 'modified', 'Loc': {"('LSHForest', None, 110)": {'add': [219]}}}, {'path': 'sklearn/neighbors/tests/test_approximate.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [28]}, "(None, 'test_neighbors_accuracy_with_n_candidates', 29)": {'mod': [41]}, "(None, 
'test_neighbors_accuracy_with_n_estimators', 65)": {'mod': [77]}, "(None, 'test_kneighbors', 100)": {'mod': [111]}, "(None, 'test_radius_neighbors', 149)": {'mod': [162]}, "(None, 'test_radius_neighbors_boundary_handling', 223)": {'mod': [233, 234]}, "(None, 'test_distances', 283)": {'mod': [291]}, "(None, 'test_fit', 309)": {'mod': [317]}, "(None, 'test_partial_fit', 336)": {'mod': [346]}, "(None, 'test_hash_functions', 371)": {'mod': [383, 384]}, "(None, 'test_candidates', 400)": {'mod': [410, 424]}, "(None, 'test_graphs', 438)": {'mod': [446]}, "(None, 'test_sparse_input', 458)": {'mod': [463, 464]}}}]} | [] | [] | [] | {
"iss_type": "4",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"examples/neighbors/plot_approximate_nearest_neighbors_scalability.py",
"benchmarks/bench_plot_approximate_neighbors.py",
"examples/neighbors/plot_approximate_nearest_neighbors_hyperparameters.py",
"sklearn/neighbors/approximate.py"
],
"doc": [
"doc/modules/classes.rst",
"doc/modules/neighbors.rst",
"doc/whats_new.rst"
],
"test": [
"sklearn/neighbors/tests/test_approximate.py"
],
"config": [],
"asset": []
} | 1 |
deepfakes | faceswap | 9438672b1cf80602fc93536670d9601d655377f5 | https://github.com/deepfakes/faceswap/issues/150 | code to integrate | Multi-GPU training | I've read reports of people successfully training on multiple GPUs using the following code:
```
from keras.utils import multi_gpu_model
autoencoder_A = multi_gpu_model( autoencoder_A ,2)
autoencoder_B = multi_gpu_model( autoencoder_B ,2)
```
https://keras.io/utils/#multi_gpu_model
I could add support for this but I can't test it as I only have a single GPU.
Anyone here with a multi-GPU setup that would like to have a go at this? | null | https://github.com/deepfakes/faceswap/pull/241 | null | {'base_commit': '9438672b1cf80602fc93536670d9601d655377f5', 'files': [{'path': 'scripts/train.py', 'status': 'modified', 'Loc': {"('TrainingProcessor', 'parse_arguments', 25)": {'mod': [75]}}}]} | [] | [] | [] | {
"iss_type": "4",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"scripts/train.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | 1 |
localstack | localstack | b8290ff8013366de16f7dd2ed14d74b56d1fb03b | https://github.com/localstack/localstack/issues/10860 | Internal Refactoring: Towards a Multi-Distribution Setup | Over the next few weeks and months we’re refactoring the code in this repository to move toward a **multi-distribution setup**.
For now this only affects active contributors as well as any developers that depend on code existing in the published container under the path `/opt/code/localstack/localstack`. Most users should not be affected by this change.
## Motivation
This will enable us to define clearer boundaries and allows for easier re-use of individual components.
Some of the previously internal code has already been moved into external open repositories such as [localstack/rolo](https://github.com/localstack/rolo).
Other parts of the codebase will keep living in this repository but under its own distribution.
How we will map this to PyPI items is still being discussed and should become clearer over the next weeks during the initial refactorings.
The code layout is not part of any official API or semver guarantees; nevertheless, we still want to use this chance to give you a heads-up and some guidance on how to make your existing code compatible with the new structure.
## Detailed instructions
### 1. Moving everything into `localstack-core`
As a first step, the entirety of the localstack module is moved into a `localstack-core` directory with https://github.com/localstack/localstack/pull/10800, which will make up one of the multiple distributions.
In this initial step on our way to a multi-distribution system, only the additional root level of `localstack-core` is introduced and the rest of the directory structure is unchanged.
- If you have an open PR, you can rebase onto master after https://github.com/localstack/localstack/pull/10800 has been merged.
- After the PR is merged, update your local repository with `git pull`, remove the now-empty localstack directory (`rm -r localstack`), and run a `make clean install`. You should see a `localstack_core.egg-info` directory in `localstack-core/`
- If you are an active contributor and you're using the PyCharm IDE, you need to adapt your project structure by marking the new `localstack-core` module as a source folder. Otherwise you will encounter errors where it will complain about not being able to find the `localstack` module.
- 
- If you want to call code from the `localstack` module, you now need to perform an installation of the project (e.g. with `pip install -e .`). Previously, since `localstack` was a root-level module, Python automatically included it in its import path. With a source directory layout there is a stricter boundary now, which also helps avoid unintentional imports. See [here](https://packaging.python.org/en/latest/discussions/src-layout-vs-flat-layout/) for more information on the differences between a flat and a src layout.
- The location of test files is unchanged.
- Locally the code moves from `.../localstack/localstack/...` => `.../localstack/localstack-core/localstack/...`
- In the published container the code moves from `/opt/code/localstack/localstack` => `/opt/code/localstack/localstack-core/localstack`
### ?. Next steps
After the initial move is over the line, additional code will be extracted from `localstack-core` into new distributions such as `localstack-cli`.
This issue will be updated with new information as the project progresses.
| null | https://github.com/localstack/localstack/pull/10800 | null | {'base_commit': 'b8290ff8013366de16f7dd2ed14d74b56d1fb03b', 'files': [{'path': '.circleci/config.yml', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [137, 405, 468, 469, 559, 560, 561, 562]}}}, {'path': '.dockerignore', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [4]}}}, {'path': '.github/workflows/asf-updates.yml', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [61, 66]}}}, {'path': '.github/workflows/tests-pro-integration.yml', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [294, 301, 337]}}}, {'path': '.github/workflows/tests-s3-image.yml', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [7, 8, 9, 10, 11, 12, 13, 14, 15, 26, 27, 28, 29, 30, 31, 32, 33, 34]}}}, {'path': 'Dockerfile', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [101], 'mod': [179]}}}, {'path': 'Dockerfile.s3', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [6], 'mod': [84]}}}, {'path': 'Makefile', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [239], 'mod': [76, 83, 244]}}}]} | [] | [] | [] | {
"iss_type": "4",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
".circleci/config.yml"
],
"doc": [
".dockerignore"
],
"test": [],
"config": [
".github/workflows/tests-pro-integration.yml",
"Makefile",
"Dockerfile",
"Dockerfile.s3",
".github/workflows/tests-s3-image.yml",
".github/workflows/asf-updates.yml"
],
"asset": []
} | 1 | |
pallets | flask | 6f2fdc5ac4ad869a21c4c0281d7fa1eb8aa5a689 | https://github.com/pallets/flask/issues/3628 | Returning Response and headers causes duplicate headers | <!-- **This issue tracker is a tool to address bugs in Flask itself.
Please use the Pallets Discord or Stack Overflow for general questions
about using Flask or issues not related to Flask.** -->
<!-- If you'd like to report a bug in Flask, fill out the template below. Provide
any extra information that may be useful / related to your problem.
Ideally, create an [MCVE](https://stackoverflow.com/help/mcve), which helps us
understand the problem and helps check that it is not caused by something in
your code. -->
### Expected Behavior
```
from flask import Flask
app = Flask(__name__)
@app.route('/')
def issue():
return {'test': 'test'}, {'Content-Type': 'test'}
```
Using `curl -v http://127.0.0.1:5000/` to query the view I expect only one `Content-Type` header > `Content-Type: test`
### Actual Behavior
Duplicate headers are returned
```
< Content-Type: application/json
< Content-Type: test
```
### Environment
* Python version: 3.8.2
* Flask version: 1.1.2
* Werkzeug version: 1.0.1
### Context
This issue also affects responses created with make_response when using a dict or jsonify body + the headers argument with a 'Content-Type':
```
from flask import Flask, make_response
app = Flask(__name__)
@app.route('/')
def issue():
return make_response({'test': 'test'}, {'Content-Type': 'test'})
```
This issue is caused by jsonify adding a 'Content-Type' header; make_response then uses `extend` to add the additional headers, leading to the duplicate.
Returning a str/bytes body does not have this problem, as no 'Content-Type' is added by flask; if one is missing, it is added by werkzeug.
The reason I came across this issue is we have older code which does `return json.dumps(data), 200, {'Content-Type': 'application/json+somecustomtype'}` and I assumed based on the flask docs that just returning the data and letting flask do the jsonify would be better.
| null | https://github.com/pallets/flask/pull/3684 | null | {'base_commit': '6f2fdc5ac4ad869a21c4c0281d7fa1eb8aa5a689', 'files': [{'path': 'CHANGES.rst', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [30, 31, 32]}}}, {'path': 'src/flask/app.py', 'status': 'modified', 'Loc': {"('Flask', 'make_response', 1935)": {'mod': [2048]}}}, {'path': 'tests/test_basic.py', 'status': 'modified', 'Loc': {"(None, 'from_response_headers', 1118)": {'mod': [1120, 1121]}, "(None, 'test_response_types', 1092)": {'mod': [1158]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"src/flask/app.py"
],
"doc": [
"CHANGES.rst"
],
"test": [
"tests/test_basic.py"
],
"config": [],
"asset": []
} | 1 | |
lllyasviel | Fooocus | 0a87da7dc1998e0073ba824c7f223cd331858b24 | https://github.com/lllyasviel/Fooocus/issues/3502 | bug
can't reproduce
feedback pending | [Bug]: Unsupported image type in input when using input image | ### Checklist
- [ ] The issue has not been resolved by following the [troubleshooting guide](https://github.com/lllyasviel/Fooocus/blob/main/troubleshoot.md)
- [ ] The issue exists on a clean installation of Fooocus
- [ ] The issue exists in the current version of Fooocus
- [ ] The issue has not been reported before recently
- [ ] The issue has been reported before but has not been fixed yet
### What happened?
Whether I'm using image prompt or inpaint/outpaint, I get the error: Unsupported image type in input.
Normal image generation from text works fine, but using any input image throws this error.
Others have reported the same issue on Reddit, so it's not just me.
### Steps to reproduce the problem
1. Run Fooocus.
2. Try to use inpaint on any image.
### What should have happened?
Just do the magic...
### What browsers do you use to access Fooocus?
Microsoft Edge
### Where are you running Fooocus?
Cloud (Google Colab)
### What operating system are you using?
Win 11
### Console logs
```Shell
Traceback (most recent call last):
File "/content/Fooocus/modules/gradio_hijack.py", line 279, in preprocess
im = processing_utils.decode_base64_to_image(x)
File "/usr/local/lib/python3.10/dist-packages/gradio/processing_utils.py", line 59, in decode_base64_to_image
img = Image.open(BytesIO(base64.b64decode(image_encoded)))
File "/usr/local/lib/python3.10/dist-packages/PIL/Image.py", line 3283, in open
rawmode = mode
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x782f4810f010>
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/gradio/routes.py", line 488, in run_predict
output = await app.get_blocks().process_api(
File "/usr/local/lib/python3.10/dist-packages/gradio/blocks.py", line 1429, in process_api
inputs = self.preprocess_data(fn_index, inputs, state)
File "/usr/local/lib/python3.10/dist-packages/gradio/blocks.py", line 1239, in preprocess_data
processed_input.append(block.preprocess(inputs[i]))
File "/content/Fooocus/modules/gradio_hijack.py", line 281, in preprocess
raise Error("Unsupported image type in input")
gradio.exceptions.Error: 'Unsupported image type in input'
```
### Additional information
I noticed gradio was down today; I don't know if this has anything to do with the issue | null | https://github.com/lllyasviel/Fooocus/pull/3506 | null | {'base_commit': '0a87da7dc1998e0073ba824c7f223cd331858b24', 'files': [{'path': 'launch.py', 'status': 'modified', 'Loc': {"(None, 'download_models', 104)": {'add': [104]}, '(None, None, None)': {'mod': [24]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"launch.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | 1 |
Z4nzu | hackingtool | 5c69e5cb13127601aaba6ee04e522ead84b74f6a | https://github.com/Z4nzu/hackingtool/issues/181 | help me | When I run install.sh, it gives me this error: [✘] Installation Failed !!! [✘]
[✔] Loading ...
Hit:1 http://kali.download/kali kali-rolling InRelease
Reading package lists... Done
E: Could not open lock file /var/lib/dpkg/lock-frontend - open (13: Permission denied)
E: Unable to acquire the dpkg frontend lock (/var/lib/dpkg/lock-frontend), are you root?
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
Package python-pip is not available, but is referred to by another package.
This may mean that the package is missing, has been obsoleted, or
is only available from another source
However the following packages replace it:
python3-pip
E: Package 'python-pip' has no installation candidate
[✔] Checking directories...
[✔] Installing ...
fatal: could not create work tree dir '/usr/share/doc/hackingtool': Permission denied
[✔] Trying to installing Requirements ...
Requirement already satisfied: lolcat in /usr/local/lib/python3.9/dist-packages (1.4)
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
figlet is already the newest version (2.2.5-3+b1).
0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
Requirement already satisfied: boxes in /usr/local/lib/python3.9/dist-packages (0.0.0)
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
boxes is already the newest version (2.1.1-2).
0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
Requirement already satisfied: flask in /usr/local/lib/python3.9/dist-packages (2.0.2)
Requirement already satisfied: click>=7.1.2 in /usr/local/lib/python3.9/dist-packages (from flask) (8.0.3)
Requirement already satisfied: Jinja2>=3.0 in /usr/local/lib/python3.9/dist-packages (from flask) (3.0.3)
Requirement already satisfied: Werkzeug>=2.0 in /usr/local/lib/python3.9/dist-packages (from flask) (2.0.2)
Requirement already satisfied: itsdangerous>=2.0 in /usr/local/lib/python3.9/dist-packages (from flask) (2.0.1)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.9/dist-packages (from Jinja2>=3.0->flask) (2.0.1)
Requirement already satisfied: requests in /usr/local/lib/python3.9/dist-packages (2.27.1)
Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.9/dist-packages (from requests) (3.3)
Requirement already satisfied: charset-normalizer~=2.0.0 in /usr/local/lib/python3.9/dist-packages (from requests) (2.0.10)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in /usr/local/lib/python3.9/dist-packages (from requests) (1.26.8)
Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.9/dist-packages (from requests) (2021.10.8)
[✘] Installation Failed !!! [✘] | null | https://github.com/Z4nzu/hackingtool/pull/348 | null | {'base_commit': '5c69e5cb13127601aaba6ee04e522ead84b74f6a', 'files': [{'path': 'install.sh', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [7, 8, 9, 10, 11, 12, 13, 14, 15, 17, 19, 33, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 65, 66, 67, 68, 69, 70, 72, 73, 74, 75, 76, 77, 78, 79, 82, 83, 84, 86, 88, 89, 90, 91, 92, 93, 95, 96, 98, 99, 100, 102]}}}, {'path': 'update.sh', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [0], 'mod': [9, 11, 13, 15, 17, 19, 21, 23, 25, 27, 29, 31, 33, 35, 37, 39, 41, 43, 45, 47, 49, 51, 53]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [],
"doc": [],
"test": [],
"config": [],
"asset": [
"update.sh",
"install.sh"
]
} | 1 | |
binary-husky | gpt_academic | 6b5bdbe98a882a726ec9710e5e94baa94d470ad6 | https://github.com/binary-husky/gpt_academic/issues/286 | A timid question: how can I parse a front-end project? | A timid question: how can I parse a front-end project? | null | https://github.com/binary-husky/gpt_academic/pull/290 | null | {'base_commit': '6b5bdbe98a882a726ec9710e5e94baa94d470ad6', 'files': [{'path': 'functional_crazy.py', 'status': 'modified', 'Loc': {"(None, 'get_crazy_functionals', 3)": {'mod': [46]}}}]} | [] | [] | [] | {
"iss_type": "3",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"functional_crazy.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | 1 | |
scikit-learn | scikit-learn | 2b79665b90bd54fa59701090d5f608a1fc4dd33a | https://github.com/scikit-learn/scikit-learn/issues/18408 | Bug
module:ensemble | Data type mismatch problem when calling HistGradientBoostingClassifier.predict() | <!--
Before submitting a bug, please make sure the issue hasn't been already
addressed by searching through the past issues.
-->
#### Describe the bug
It looks like HistGradientBoostingClassifier has problems handling datasets with different data types. It works fine when X is `np.float`. However, when X is of type `uint8`, HistGradientBoostingClassifier crashes when calling `predict()`.
#### Steps/Code to Reproduce
<!--
Please add a minimal example that we can reproduce the error by running the
code. Be as succinct as possible, do not depend on external data. In short, we
are going to copy-paste your code and we expect to get the same
result as you.
Example:
```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
docs = ["Help I have a bug" for i in range(1000)]
vectorizer = CountVectorizer(input=docs, analyzer='word')
lda_features = vectorizer.fit_transform(docs)
lda_model = LatentDirichletAllocation(
n_topics=10,
learning_method='online',
evaluate_every=10,
n_jobs=4,
)
model = lda_model.fit(lda_features)
```
If the code is too long, feel free to put it in a public gist and link
it in the issue: https://gist.github.com
-->
```
from keras.datasets import mnist
from sklearn.metrics import accuracy_score
from sklearn.experimental import enable_hist_gradient_boosting
from sklearn.ensemble import HistGradientBoostingClassifier
if __name__ == '__main__':
(X_train, y_train), (X_test, y_test) = mnist.load_data()
X_train = X_train.reshape(X_train.shape[0], -1)
X_test = X_test.reshape(X_test.shape[0], -1)
model = HistGradientBoostingClassifier(max_iter=100,
loss='categorical_crossentropy',
validation_fraction=None,
random_state=0)
model.fit(X_train, y_train)
y_pred = model.predict(X_test)
acc = accuracy_score(y_test, y_pred)
print('Testing Acc: {:.4f} %'.format(100.*acc))
```
#### Expected Results
The HistGradientBoostingClassifier successfully returns prediction results.
#### Actual Results
```
File "FILEPATH", line 21, in <module>
y_pred = model.predict(X_test)
File "C:\Software\Anaconda\lib\site-packages\sklearn\ensemble\_hist_gradient_boosting\gradient_boosting.py", line 1114, in predict
encoded_classes = np.argmax(self.predict_proba(X), axis=1)
File "C:\Software\Anaconda\lib\site-packages\sklearn\ensemble\_hist_gradient_boosting\gradient_boosting.py", line 1130, in predict_proba
raw_predictions = self._raw_predict(X)
File "C:\Software\Anaconda\lib\site-packages\sklearn\ensemble\_hist_gradient_boosting\gradient_boosting.py", line 667, in _raw_predict
raw_predictions[k, :] += predict(X)
File "C:\Software\Anaconda\lib\site-packages\sklearn\ensemble\_hist_gradient_boosting\predictor.py", line 47, in predict
_predict_from_numeric_data(self.nodes, X, out)
File "sklearn\ensemble\_hist_gradient_boosting\_predictor.pyx", line 26, in sklearn.ensemble._hist_gradient_boosting._predictor._predict_from_numeric_data
ValueError: Buffer dtype mismatch, expected 'const X_DTYPE_C' but got 'unsigned char'
```
#### Versions
<!--
Please run the following snippet and paste the output below.
For scikit-learn >= 0.20:
import sklearn; sklearn.show_versions()
For scikit-learn < 0.20:
import platform; print(platform.platform())
import sys; print("Python", sys.version)
import numpy; print("NumPy", numpy.__version__)
import scipy; print("SciPy", scipy.__version__)
import sklearn; print("Scikit-Learn", sklearn.__version__)
import imblearn; print("Imbalanced-Learn", imblearn.__version__)
-->
cython == 0.29.21
scikit-learn == 0.23.1
<!-- Thanks for contributing! -->
| null | https://github.com/scikit-learn/scikit-learn/pull/18410 | null | {'base_commit': '2b79665b90bd54fa59701090d5f608a1fc4dd33a', 'files': [{'path': 'doc/whats_new/v0.24.rst', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [213]}}}, {'path': 'sklearn/ensemble/_hist_gradient_boosting/gradient_boosting.py', 'status': 'modified', 'Loc': {"('BaseHistGradientBoosting', '_raw_predict', 635)": {'mod': [648, 649, 656]}}}, {'path': 'sklearn/ensemble/_hist_gradient_boosting/tests/test_gradient_boosting.py', 'status': 'modified', 'Loc': {"(None, 'test_staged_predict', 760)": {'add': [796]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"sklearn/ensemble/_hist_gradient_boosting/gradient_boosting.py"
],
"doc": [
"doc/whats_new/v0.24.rst"
],
"test": [
"sklearn/ensemble/_hist_gradient_boosting/tests/test_gradient_boosting.py"
],
"config": [],
"asset": []
} | 1 |
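Until a release containing the fix from the linked PR, a user-side workaround is to cast the input to `float64` before calling `predict()`; a minimal sketch of just the cast (numpy only, no model fitting):

```python
import numpy as np

# The compiled predictor expects X_DTYPE (float64) buffers, so an integer
# array such as MNIST's uint8 pixels must be cast first, e.g.:
#   y_pred = model.predict(X_test.astype(np.float64))
X_test = np.arange(12, dtype=np.uint8).reshape(4, 3)
X_cast = np.ascontiguousarray(X_test, dtype=np.float64)
```

The linked PR makes `_raw_predict` perform an equivalent conversion internally, so newer scikit-learn releases should accept `uint8` input directly.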
nvbn | thefuck | fee874cddc1af36344e1cdaedd6d80eb6aea8341 | https://github.com/nvbn/thefuck/issues/449 | Fuck alias for fish | `~> fuck` says:
> ```
> Seems like fuck alias isn't configured!
> Please put eval thefuck --alias in your ~/.config/fish/config.fish.
> More details - https://github.com/nvbn/thefuck#manual-installation
> ```
but https://github.com/nvbn/thefuck/wiki/Shell-aliases says:
> Add this function to config.fish:
>
> ``` fish
> eval (thefuck --alias | tr '\n' ';')
> ```
What should I add to my `config.fish`?
- `eval thefuck --alias`
or
- `eval (thefuck --alias | tr '\n' ';')`
| null | https://github.com/nvbn/thefuck/pull/450 | null | {'base_commit': 'fee874cddc1af36344e1cdaedd6d80eb6aea8341', 'files': [{'path': 'thefuck/shells.py', 'status': 'modified', 'Loc': {"('Fish', 'how_to_configure', 201)": {'mod': [202]}}}]} | [] | [] | [] | {
"iss_type": "3",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"thefuck/shells.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | 1 | |
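The difference between the two instructions is mechanical: older fish versions of `eval` choke on multi-line input, so the wiki's variant collapses the alias function onto one line with `tr`. A small shell sketch of that collapsing step, with placeholder text standing in for real `thefuck --alias` output:

```shell
# Stand-in for the multi-line function emitted by `thefuck --alias`.
alias_output='function fuck
echo placeholder
end'

# The wiki's `| tr '\n' ';'` turns newlines into command separators so that
# a single-line eval sees the whole function definition.
one_line=$(printf '%s\n' "$alias_output" | tr '\n' ';')
echo "$one_line"
```

The PR above adjusts `Fish.how_to_configure` so that the hint thefuck prints matches the wiki's working form.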
python | cpython | 9f1814723f5596115a794a8bec0d053f25dbf32f | https://github.com/python/cpython/issues/96828 | type-feature
topic-SSL | Add an `ssl.OP_ENABLE_KTLS` option for enabling the use of the kernel TLS | # Feature or enhancement
A new `ssl.OP_ENABLE_KTLS` option for enabling the use of the kernel TLS.
# Pitch
Kernel Transport Layer Security (kTLS) can improve performance of programs using TLS by reducing the number of switches between the user space and the kernel space. kTLS allows using the `sendfile` system call for sending data using TLS. Also, it may offload TLS to network interface controllers.
kTLS is not enabled by default for various reasons which you can find in https://github.com/openssl/openssl/issues/13794. Even if a system supports the feature and OpenSSL was compiled with support for it, Python still has to set an OpenSSL's option `SSL_OP_ENABLE_KTLS` to use it.
In theory, it is possible to enable the kernel TLS in any Python compiled against OpenSSL 3 using this following code. If all other requirements are met, Python should start writing to and reading from a secure socket using the kernel TLS.
```python
import ssl
context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
context.options |= 8 # SSL_OP_ENABLE_KTLS
```
Since Python's `ssl` module defines a few constants similar to `SSL_OP_ENABLE_KTLS`, it should provide an `ssl.OP_ENABLE_KTLS` option.
# Previous discussion
I created https://discuss.python.org/t/sslsocket-sendfile-and-kernel-tls/18886 previously to discuss benefiting from the OpenSSL's [SSL_sendfile](https://www.openssl.org/docs/manmaster/man3/SSL_sendfile.html) function. An option for enabling kTLS is a base for the work.
| null | https://github.com/python/cpython/pull/96830 | null | {'base_commit': '9f1814723f5596115a794a8bec0d053f25dbf32f', 'files': [{'path': 'Doc/library/ssl.rst', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [841]}}}, {'path': 'Modules/_ssl.c', 'status': 'modified', 'Loc': {"(None, 'sslmodule_init_constants', 5725)": {'add': [5883]}}}]} | [] | [] | [] | {
"iss_type": "4",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"Modules/_ssl.c"
],
"doc": [
"Doc/library/ssl.rst"
],
"test": [],
"config": [],
"asset": []
} | 1 |
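Before the constant lands, code that wants kTLS has to use the raw bit shown in the issue; a hedged sketch that prefers the named constant when this interpreter's `ssl` module already provides it (the `0x8` fallback is OpenSSL 3's `SSL_OP_ENABLE_KTLS` value, as stated above):

```python
import ssl

# Use ssl.OP_ENABLE_KTLS where it exists (the feature proposed here),
# otherwise fall back to OpenSSL 3's raw bit value from the issue text.
OP_ENABLE_KTLS = getattr(ssl, "OP_ENABLE_KTLS", 0x8)

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.options |= OP_ENABLE_KTLS
```

Setting the option is only one requirement; kTLS still depends on kernel support, an OpenSSL 3 build compiled with it, and a negotiated cipher suite the kernel can handle.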
huggingface | transformers | fa876aee2adf525b597495c10ad9c96896953dbd | https://github.com/huggingface/transformers/issues/9620 | SQuAD 2.0 metric not supported | Hello.
I'm trying to run the official `run_qa.py` code for SQuAD 2.0.
You have an open TODO here that is causing a bug: https://github.com/huggingface/transformers/blob/master/examples/question-answering/run_qa.py#L436
I would like to know what is the status of this TODO, and if it is going to be updated, or is there a way around it.
This is the current code:
```python
current_dir = os.path.sep.join(os.path.join(__file__).split(os.path.sep)[:-1])
metric = load_metric(os.path.join(current_dir, "squad_v2_local") if data_args.version_2_with_negative else "squad")
```
I receive:
```
FileNotFoundError: Couldn't find file locally at .../squad_v2_local/squad_v2_local.py,
```
I've tried to change it to:
```python
metric = load_metric("squad_v2" if data_args.version_2_with_negative else "squad")
```
But this is the stacktrace I receive:
```
Traceback (most recent call last):
File "/data/users/yonatab/transformers_pip/QA/run_qa_val_more_valueable.py", line 557, in <module>
main()
File "/data/users/yonatab/transformers_pip/QA/run_qa_val_more_valueable.py", line 538, in main
results = trainer.evaluate()
File "/data/users/yonatab/transformers_pip/QA/trainer_qa.py", line 63, in evaluate
metrics = self.compute_metrics(eval_preds)
File "/data/users/yonatab/transformers_pip/QA/run_qa_val_more_valueable.py", line 499, in compute_metrics
return metric.compute(predictions=p.predictions, references=p.label_ids)
File "/data/users/yonatab/transformers_pip/trans_pip/lib/python3.6/site-packages/datasets/metric.py", line 398, in compute
output = self._compute(predictions=predictions, references=references, **kwargs)
File "/home/ec2-user/.cache/huggingface/modules/datasets_modules/metrics/squad_v2/7529efd518b03f775290694e7b797412cb2253e90b4f843af83cf7434cccb3a8/squad_v2.py", line 108, in _compute
exact_raw, f1_raw = get_raw_scores(dataset, predictions)
File "/home/ec2-user/.cache/huggingface/modules/datasets_modules/metrics/squad_v2/7529efd518b03f775290694e7b797412cb2253e90b4f843af83cf7434cccb3a8/evaluate.py", line 111, in get_raw_scores
gold_answers = [a["text"] for a in qa["answers"] if normalize_answer(a["text"])]
File "/home/ec2-user/.cache/huggingface/modules/datasets_modules/metrics/squad_v2/7529efd518b03f775290694e7b797412cb2253e90b4f843af83cf7434cccb3a8/evaluate.py", line 111, in <listcomp>
gold_answers = [a["text"] for a in qa["answers"] if normalize_answer(a["text"])]
TypeError: string indices must be integers
100%|███████████████████████████████████████████| 13/13 [00:05<00:00, 2.51it/s]
```
How can I solve it?
Thanks | null | https://github.com/huggingface/transformers/pull/9677 | null | {'base_commit': 'fa876aee2adf525b597495c10ad9c96896953dbd', 'files': [{'path': 'examples/question-answering/requirements.txt', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [1]}}}, {'path': 'examples/question-answering/run_qa.py', 'status': 'modified', 'Loc': {"(None, 'main', 159)": {'mod': [436, 437, 438]}}}, {'path': 'examples/question-answering/run_qa_beam_search.py', 'status': 'modified', 'Loc': {"(None, 'main', 158)": {'mod': [475, 476, 477]}}}, {'path': 'examples/question-answering/squad_v2_local/evaluate.py', 'status': 'removed', 'Loc': {}}, {'path': 'examples/question-answering/squad_v2_local/squad_v2_local.py', 'status': 'removed', 'Loc': {}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"examples/question-answering/run_qa.py",
"examples/question-answering/squad_v2_local/evaluate.py",
"examples/question-answering/squad_v2_local/squad_v2_local.py",
"examples/question-answering/run_qa_beam_search.py"
],
"doc": [],
"test": [],
"config": [
"examples/question-answering/requirements.txt"
],
"asset": []
} | 1 | |
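The final traceback comes from the metric receiving references whose `answers` field is not in the shape `squad_v2` expects (a dict of parallel `text`/`answer_start` lists, as I understand the metric's format). A small hypothetical validator for that shape — not part of transformers or datasets — that catches the mismatch before `metric.compute` does:

```python
def looks_like_squad_v2_reference(ref):
    """Check one reference dict against the assumed squad_v2 answer shape."""
    answers = ref.get("answers")
    return (
        isinstance(answers, dict)
        and isinstance(answers.get("text"), list)
        and isinstance(answers.get("answer_start"), list)
        and len(answers["text"]) == len(answers["answer_start"])
    )

good = {"id": "q1", "answers": {"text": ["Paris"], "answer_start": [0]}}
bad = {"id": "q1", "answers": ["Paris"]}  # a shape that yields the TypeError
```

Iterating a list of plain strings where dicts are expected is exactly what produces "string indices must be integers" inside the metric's scoring code.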
sherlock-project | sherlock | 21fe11db51edcca881665694c4cc2a3fe6f1af54 | https://github.com/sherlock-project/sherlock/issues/113 | help wanted | Blackplanet false positive | Blackplanet is giving false positives. (request from germany)
@Czechball you added this in #81 ; maybe you are able to fix it? | null | https://github.com/sherlock-project/sherlock/pull/169 | null | {'base_commit': '21fe11db51edcca881665694c4cc2a3fe6f1af54', 'files': [{'path': 'data.json', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136, 137, 138, 139, 140, 141, 142, 143, 144, 145, 146, 147, 148, 149, 150, 151, 152, 153, 154, 155, 156, 157, 158, 159, 160, 161, 162, 163, 164, 165, 166, 167, 168, 169, 170, 171, 172, 173, 174, 175, 176, 177, 178, 179, 180, 181, 182, 183, 184, 185, 186, 187, 188, 189, 190, 191, 192, 193, 194, 195, 196, 197, 198, 199, 200, 201, 202, 203, 204, 205, 206, 207, 208, 209, 210, 211, 212, 213, 214, 215, 216, 217, 218, 219, 220, 221, 222, 223, 224, 225, 226, 227, 228, 229, 230, 231, 232, 233, 234, 235, 236, 237, 238, 239, 240, 241, 242, 243, 244, 245, 246, 247, 248, 249, 250, 251, 252, 253, 254, 255, 256, 257, 258, 259, 260, 261, 262, 263, 264, 265, 266, 267, 268, 269, 270, 271, 272, 273, 274, 275, 276, 277, 278, 279, 280, 281, 282, 283, 284, 285, 286, 287, 288, 289, 290, 291, 292, 293, 294, 295, 296, 297, 298, 299, 300, 301, 302, 303, 304, 305, 306, 307, 308, 309, 310, 311, 312, 313, 314, 315, 316, 317, 318, 319, 320, 321, 322, 323, 324, 325, 326, 327, 328, 329, 330, 331, 332, 333, 334, 335, 336, 337, 338, 339, 340, 341, 342, 343, 344, 345, 346, 347, 348, 349, 350, 351, 352, 353, 354, 355, 356, 357, 358, 359, 360, 361, 362, 363, 
364, 365, 366, 367, 368, 369, 370, 371, 372, 373, 374, 375, 376, 377, 378, 379, 380, 381, 382, 383, 384, 385, 386, 387, 388, 389, 390, 391, 392, 393, 394, 395, 396, 397, 398, 399, 400, 401, 402, 403, 404, 405, 406, 407, 408, 409, 410, 411, 412, 413, 414, 415, 416, 417, 418, 419, 420, 421, 422, 423, 424, 425, 426, 427, 428, 429, 430, 431, 432, 433, 434, 435, 436, 437, 438, 439, 440, 441, 442, 443, 444, 445, 446, 447, 448, 449, 450, 451, 452, 453, 454, 455, 456, 457, 458, 459, 460, 461, 462, 463, 464, 465, 466, 467, 468, 469, 470, 471, 472, 473, 474, 475, 476, 477, 478, 479, 480, 481, 482, 483, 484, 485, 486, 487, 488, 489, 490, 491, 492, 493, 494, 495, 496, 497, 498, 499, 500, 501, 502, 503, 504, 505, 506, 507, 508, 509, 510, 511, 512, 513, 514, 515, 516, 517, 518, 519, 520, 521, 522, 523, 524, 525, 526, 527, 528, 529, 530, 531, 532, 533, 534, 535, 536, 537, 538, 539, 540, 541, 542, 543, 544, 545, 546, 547, 548, 549, 550, 551, 552, 553, 554, 555, 556, 557, 558, 559, 560, 561, 562, 563, 564, 565, 566, 567, 568, 569, 570, 571, 572, 573, 574, 575, 576, 577, 578, 579, 580, 581, 582, 583, 584, 585, 586, 587, 588, 589, 590, 591, 592, 593, 594, 595, 596, 597, 598, 599, 600, 601, 602, 603, 604, 605, 606, 607, 608, 609, 610, 611, 612, 613, 614, 615, 616, 617, 618, 619, 620, 621, 622, 623, 624, 625, 626, 627, 628, 629, 630, 631, 632, 633, 634, 635, 636, 637, 638, 639, 640, 641, 642, 643, 644, 645, 646, 647, 648, 649, 650, 651, 652, 653, 654, 655, 656, 657, 658, 659, 660, 661, 662, 663, 664, 665, 666, 667, 668, 669, 670, 671, 672, 673, 674, 675, 676, 677, 678, 679, 680, 681, 682, 683, 684, 685, 686, 687, 688, 689, 690, 691, 692, 693, 694, 695, 696, 697, 698, 699, 700, 701, 702, 703, 704, 705, 706, 707, 708, 709, 710, 711, 712, 713, 714, 715, 716, 717, 718, 719, 720, 721, 722, 723, 724, 725, 726, 727, 728, 729, 730, 731, 732, 733, 734, 735, 736, 737, 738, 739, 740, 741, 742, 743, 744, 745, 746, 747, 748, 749, 750, 751, 752, 753, 754, 755, 756, 757, 758, 759, 760, 761, 762, 763, 
764, 765, 766, 767, 768, 769, 770, 771, 772, 773, 774, 775, 776, 777, 778, 779, 780, 781, 782, 783, 784, 785, 786, 787, 788, 789, 790, 791, 792, 793, 794, 795, 796, 797, 798, 799, 800, 801, 802, 803, 804, 805, 806, 807, 808, 809, 810, 811, 812, 813, 814, 815, 816, 817, 818, 819, 820, 821, 822, 823, 824, 825, 826, 827, 828, 829, 830, 831, 832, 833, 834, 835, 836, 837, 838, 839, 840, 841, 842, 843, 844, 845, 846, 847, 848, 849, 850, 851, 852, 853, 854, 855, 856, 857, 858, 859, 860, 861, 862, 863, 864, 865, 866, 867, 868, 869, 870, 871, 872, 873, 874, 875, 876, 877, 878, 879, 880, 881, 882, 883, 884, 885, 886, 887, 888, 889, 890, 891, 892, 893, 894, 895, 896, 897, 898, 899, 900, 901, 902, 903, 904, 905, 906, 907, 908, 909, 910, 911, 912, 913, 914, 915, 916, 917, 918, 919, 920, 921, 922, 923, 924, 925, 926, 927, 928, 929, 930, 931, 932, 933, 934, 935, 936, 937, 938, 939, 940, 941, 942, 943, 944, 945, 946, 947, 948, 949, 950, 951, 952, 953, 954, 955, 956, 957, 958, 959, 960, 961, 962, 963, 964, 965, 966, 967, 968, 969, 970, 971, 972, 973, 974, 975, 976, 977, 978, 979, 980, 981, 982, 983, 984, 985, 986, 987, 988, 989, 990, 991, 992, 993, 994, 995, 996, 997, 998, 999, 1000, 1001, 1002, 1003, 1004, 1005, 1006, 1007, 1008, 1009, 1010, 1011, 1012, 1013, 1014, 1015, 1016, 1017, 1018, 1019, 1020, 1021, 1022, 1023, 1024, 1025, 1026, 1027, 1028, 1029, 1030, 1031, 1032, 1033, 1034, 1035, 1036, 1037, 1038, 1039, 1040, 1041, 1042, 1043, 1044, 1045, 1046, 1047, 1048, 1049, 1050, 1051, 1052, 1053, 1054, 1055, 1056, 1057, 1058, 1059, 1060, 1061, 1062, 1063, 1064, 1065, 1066, 1067, 1068, 1069, 1070, 1071, 1072, 1073, 1074, 1075, 1076, 1077, 1078, 1079, 1080, 1081, 1082, 1083, 1084, 1085, 1086, 1087, 1088, 1089, 1090, 1091, 1092, 1093, 1094, 1095, 1096, 1097, 1098, 1099, 1100, 1101, 1102, 1103, 1104, 1105, 1106, 1107, 1108, 1109, 1110, 1111, 1112, 1113, 1114, 1115, 1116, 1117, 1118, 1119, 1120, 1121, 1122, 1123, 1124, 1125, 1126, 1127, 1128, 1129, 1130, 1131, 1132, 1133, 1134, 1135, 
1136, 1137, 1138, 1139, 1140, 1141, 1142, 1143, 1144]}}}, {'path': 'sites.md', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136, 138]}}}, {'path': 'tests/all.py', 'status': 'modified', 'Loc': {"('SherlockSiteCoverageTests', 'test_coverage_true_via_message', 188)": {'add': [204]}}}, {'path': 'tests/base.py', 'status': 'modified', 'Loc': {"('SherlockBaseTest', 'detect_type_check', 109)": {'add': [168]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"tests/base.py",
"tests/all.py",
"data.json"
],
"doc": [
"sites.md"
],
"test": [],
"config": [],
"asset": []
} | 1 |
geekan | MetaGPT | f201b2f5f32c2d48eab6632bf103e9b3a92fc999 | https://github.com/geekan/MetaGPT/issues/1239 | RAG faiss AssertionError | **Bug description**
<!-- Clearly and directly describe the current bug -->
execute this demo
```Python
import asyncio
from metagpt.rag.engines import SimpleEngine
from metagpt.rag.schema import FAISSRetrieverConfig
from metagpt.const import EXAMPLE_DATA_PATH
DOC_PATH = EXAMPLE_DATA_PATH / "rag/travel.txt"
async def main():
engine = SimpleEngine.from_docs(input_files=[DOC_PATH], retriever_configs=[FAISSRetrieverConfig()])
answer = await engine.aquery("What does Bob like?")
print(answer)
if __name__ == "__main__":
asyncio.run(main())
```
get error
```bash
Traceback (most recent call last):
File "/home/wanfu/projects/llm/multi_agent_rag/src/simple_custom_object.py", line 26, in <module>
asyncio.run(main())
File "/home/wanfu/data/miniconda3/envs/metagpt/lib/python3.9/asyncio/runners.py", line 44, in run
return loop.run_until_complete(main)
File "/home/wanfu/data/miniconda3/envs/metagpt/lib/python3.9/asyncio/base_events.py", line 647, in run_until_complete
return future.result()
File "/home/wanfu/projects/llm/multi_agent_rag/src/simple_custom_object.py", line 21, in main
engine.add_docs([DOC_PATH])
File "/mnt/data/work/development/projects/llm/MetaGPT/metagpt/rag/engines/simple.py", line 195, in add_docs
self._save_nodes(nodes)
File "/mnt/data/work/development/projects/llm/MetaGPT/metagpt/rag/engines/simple.py", line 274, in _save_nodes
self.retriever.add_nodes(nodes)
File "/mnt/data/work/development/projects/llm/MetaGPT/metagpt/rag/retrievers/faiss_retriever.py", line 12, in add_nodes
self._index.insert_nodes(nodes, **kwargs)
File "/home/wanfu/data/miniconda3/envs/metagpt/lib/python3.9/site-packages/llama_index/core/indices/vector_store/base.py", line 320, in insert_nodes
self._insert(nodes, **insert_kwargs)
File "/home/wanfu/data/miniconda3/envs/metagpt/lib/python3.9/site-packages/llama_index/core/indices/vector_store/base.py", line 311, in _insert
self._add_nodes_to_index(self._index_struct, nodes, **insert_kwargs)
File "/home/wanfu/data/miniconda3/envs/metagpt/lib/python3.9/site-packages/llama_index/core/indices/vector_store/base.py", line 233, in _add_nodes_to_index
new_ids = self._vector_store.add(nodes_batch, **insert_kwargs)
File "/home/wanfu/data/miniconda3/envs/metagpt/lib/python3.9/site-packages/llama_index/vector_stores/faiss/base.py", line 121, in add
self._faiss_index.add(text_embedding_np)
File "/home/wanfu/data/miniconda3/envs/metagpt/lib/python3.9/site-packages/faiss/__init__.py", line 214, in replacement_add
assert d == self.d
AssertionError
```
**Bug solved method**
<!-- If you solved the bug, describe the idea or process to solve the current bug. Of course, you can also paste the URL address of your Pull Request. -->
<!-- If not, provide more auxiliary information to facilitate our further positioning and investigation -->
**Environment information**
<!-- Environment:System version (like ubuntu 22.04), Python version (conda python 3.7), LLM type and model (OpenAI gpt-4-1106-preview) -->
- LLM type and model name: zhipuai
- Embeddings : fastchat, BAAI/bge-large-zh
- System version: Ubuntu 22.04
- Python version: 3.9.19
- MetaGPT version or branch:
<!-- Dependent packagess:the packages version cause the bug(like `pydantic 1.10.8`), installation method(like `pip install metagpt` or `pip install from source` or `run in docker`) -->
- packages version:
- installation method: pip install from source
**Screenshots or logs**
<!-- Screenshots or logs of the bug can help us understand the problem more quickly -->
| null | https://github.com/geekan/MetaGPT/pull/1241 | null | {'base_commit': 'f201b2f5f32c2d48eab6632bf103e9b3a92fc999', 'files': [{'path': 'config/config2.example.yaml', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [20]}}}, {'path': 'metagpt/configs/embedding_config.py', 'status': 'modified', 'Loc': {"('EmbeddingConfig', None, 16)": {'add': [22, 27, 34, 43]}}}, {'path': 'metagpt/rag/schema.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [14]}, "('FAISSRetrieverConfig', 'check_dimensions', 45)": {'mod': [47]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"metagpt/rag/schema.py",
"metagpt/configs/embedding_config.py"
],
"doc": [],
"test": [],
"config": [
"config/config2.example.yaml"
],
"asset": []
} | 1 | |
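The failing `assert d == self.d` inside faiss means the vectors being inserted have a different dimensionality than the one the index was built with — here plausibly because `BAAI/bge-large-zh` emits 1024-dim embeddings while the retriever config's default `dimensions` assumes another model (the linked PR ties the config to the embedding settings). A hypothetical pre-flight check that turns the bare assertion into a readable error:

```python
def check_embedding_dim(index_dim, embedding):
    """Raise a descriptive error instead of faiss's bare AssertionError."""
    if len(embedding) != index_dim:
        raise ValueError(
            f"FAISS index expects dim {index_dim}, "
            f"but the embedding model returned dim {len(embedding)}"
        )
    return True
```

In practice the durable fix is configuring the retriever's `dimensions` to match the embedding model, rather than checking at insert time.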
keras-team | keras | 8f5592bcb61ff48c96560c8923e482db1076b54a | https://github.com/keras-team/keras/issues/20324 | type:support
keras-team-review-pending | Reason for the recently added shape restriction in MultiHeadAttention | Hello,
Wondering why there is a restriction requiring the input shapes of `query` and `value` to have a matching final dimension?
This blocks cross-attention to a source with a different shape than the query, unless an extra projection layer is added. Given that all input tensors (`query`, `key`, `value`) are immediately projected by dense layers inside `MultiHeadAttention`, I don't think any restriction on the final dims is necessary.
For reference, the [pytorch doc](https://keras.io/api/layers/attention_layers/multi_head_attention/) for `MultiHeadAttention` explicitly uses 3 distinct variables to describe the expected dimensions for the three tensors. The tensorflow implementation does not enforce such a restriction either.
The restriction is enforced here: https://github.com/keras-team/keras/blob/5aa5f88dc200bbf2cd765d5a213c23c58da48e80/keras/src/layers/attention/multi_head_attention.py#L214-L219
And was added as part of the PR #19973 (in response to the issue #19769)
Thanks | null | https://github.com/keras-team/keras/pull/20340 | null | {'base_commit': '8f5592bcb61ff48c96560c8923e482db1076b54a', 'files': [{'path': 'keras/src/layers/attention/multi_head_attention.py', 'status': 'modified', 'Loc': {"('MultiHeadAttention', 'build', 199)": {'mod': [214, 215, 216, 217, 218, 219]}, "('MultiHeadAttention', 'compute_output_shape', 598)": {'mod': [607, 608, 609, 610, 611, 612]}}}, {'path': 'keras/src/layers/attention/multi_head_attention_test.py', 'status': 'modified', 'Loc': {"('MultiHeadAttentionTest', None, 17)": {'add': [106], 'mod': [133]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"keras/src/layers/attention/multi_head_attention_test.py",
"keras/src/layers/attention/multi_head_attention.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | 1 |
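The issue's point — that separate projections make a matching final dimension unnecessary — can be shown with a plain numpy sketch of single-head cross-attention (illustrative sizes, not Keras code):

```python
import numpy as np

rng = np.random.default_rng(0)

T_q, d_q = 5, 16     # query: length 5, feature size 16
T_v, d_v = 7, 32     # value: length 7, a *different* feature size
key_dim = 8          # everything meets only in this projected size

W_q = rng.normal(size=(d_q, key_dim))
W_k = rng.normal(size=(d_v, key_dim))
W_v = rng.normal(size=(d_v, key_dim))

query = rng.normal(size=(T_q, d_q))
value = rng.normal(size=(T_v, d_v))

q = query @ W_q                      # (T_q, key_dim)
k = value @ W_k                      # (T_v, key_dim)
v = value @ W_v                      # (T_v, key_dim)

scores = q @ k.T / np.sqrt(key_dim)  # (T_q, T_v): no d_q/d_v match needed
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)
out = weights @ v                    # (T_q, key_dim)
```

The only agreement required is between the projection outputs, which is exactly what the dense layers inside `MultiHeadAttention` provide.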
ansible | ansible | 2897cf43cea3d61b9673ce14ba796a663d99f19d | https://github.com/ansible/ansible/issues/56571 | python3
support:community
bug
has_pr
affects_2.7
collection
collection:community.general
needs_collection_redirect
bot_closed | "machinectl: invalid option -- 'c'" when using become_method: machinectl | <!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Explain the problem briefly below -->
`become_method: machinectl` fails with the error "machinectl: invalid option -- 'c'".
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
`lib/ansible/plugins/become/machinectl.py`
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.7.10
config file = /etc/ansible/ansible.cfg
configured module search path = ['/home/thomas/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.7/site-packages/ansible
executable location = /usr/bin/ansible
python version = 3.7.3 (default, Mar 26 2019, 21:43:19) [GCC 8.2.1 20181127]
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
<no output>
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
Host: Arch Linux x64
Target: Ubuntu 16.04.6 Desktop 64-bits
Target machinectl version:
```
$ machinectl --version
systemd 229
+PAM +AUDIT +SELINUX +IMA +APPARMOR +SMACK +SYSVINIT +UTMP +LIBCRYPTSETUP +GCRYPT +GNUTLS +ACL +XZ -LZ4 +SECCOMP +BLKID +ELFUTILS +KMOD -IDN
```
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
<!--- Paste example playbooks or commands between quotes below -->
```yaml
$ ansible -m ping --user thomas --become --become-user somebody --become-method machinectl target-machine
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
No error, `machinectl` should just work. (I need to use `machinectl` because I want to create/start a systemd user service running as `somebody`, while `somebody` may not be logged in.)
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
With `-vvvv` (lots of spammy ssh output): [gist](https://gist.github.com/ttencate/36b75976a3564b8cd59ce1562c906c89)
From the output we can see that Ansible is running this command on the target:
```
machinectl shell -q somebody@ /bin/sh -c '"'"'"'"'"'"'"'"'echo BECOME-SUCCESS-kppmktulvryalmvucprbybyfjgvsiseh; /usr/bin/python /var/tmp/ansible-tmp-1558089659.8620167-87402387160364/AnsiballZ_ping.py'"'"'"'"'"'"'"'"' && sleep 0
```
On the Ubuntu target (systemd 229), that fails:
```
$ machinectl shell -q somebody@ /bin/sh -c 'echo foo'
machinectl: invalid option -- 'c'
```
On the Arch Linux host (systemd 242), it succeeds:
```
$ machinectl shell -q thomas@ /bin/sh -c 'echo foo'
[...]
foo
```
The cause seems to be [systemd issue #2420](https://github.com/systemd/systemd/issues/2420), which presumably was fixed just too late to make it into the Ubuntu 16.04 release. A simple workaround is to add `--` before the actual command, which terminates the option list and works on both old and new versions [edit 2019-07-05: no it doesn't, see below!]:
```
$ machinectl shell -q somebody@ -- /bin/sh -c 'echo foo'
[...]
foo
``` | null | https://github.com/ansible/ansible/pull/56572 | null | {'base_commit': '2897cf43cea3d61b9673ce14ba796a663d99f19d', 'files': [{'path': 'lib/ansible/plugins/become/machinectl.py', 'status': 'modified', 'Loc': {"('BecomeModule', 'build_become_command', 78)": {'mod': [87]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"lib/ansible/plugins/become/machinectl.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | 1 |
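The `--` workaround relies on standard getopt behavior: it ends option parsing, so later dash-prefixed arguments are treated as operands. A quick shell demonstration with `cat` standing in for machinectl (whether a given machinectl build honors `--` is version-dependent, as the author's later edit warns):

```shell
tmp=$(mktemp -d)
printf 'hello\n' > "$tmp/-c"   # a file literally named '-c'
cd "$tmp"

# Without '--' this would fail with "invalid option -- 'c'";
# with it, '-c' is treated as a filename.
cat -- -c
```

This is the same mechanism the ansible PR leans on when building the become command for `machinectl shell`.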
pandas-dev | pandas | b5a5268dabb2a4dea1c3c543a1ddff501b87a447 | https://github.com/pandas-dev/pandas/issues/16870 | Docs
Groupby
good first issue | (DOC) A `string` passed to `groupby` is hard to understand based on current doc | #### Code Sample, a copy-pastable example if possible
From [Here](pandas/doc/source/groupby.rst)
```rst
For DataFrame objects, a string indicating a column to be used to group. Of course
df.groupby('A') is just syntactic sugar for df.groupby(df['A']), but
it makes life simpler
For DataFrame objects, a string indicating an index level to be used to group.
```
#### Problem description
These two sentences are somewhat in conflict with each other until one reads the note below.
#### Expected Output
Reword to make it clear that a string may indicate column or index level
#### Output of ``pd.show_versions()``
<details>
INSTALLED VERSIONS
------------------
commit: None
python: 3.5.3.final.0
python-bits: 64
OS: Darwin
OS-release: 16.6.0
machine: x86_64
processor: i386
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8
pandas: 0.21.0.dev+193.gb2b5dc32e
pytest: 3.1.2
pip: 9.0.1
setuptools: 36.0.1
Cython: 0.25.2
numpy: 1.13.0
scipy: 0.19.0
xarray: None
IPython: 6.0.0
sphinx: 1.6.2
patsy: 0.4.1
dateutil: 2.6.0
pytz: 2017.2
blosc: None
bottleneck: 1.2.1
tables: None
numexpr: 2.6.2
feather: None
matplotlib: 2.0.2
openpyxl: None
xlrd: None
xlwt: None
xlsxwriter: None
lxml: None
bs4: None
html5lib: 0.9999999
sqlalchemy: None
pymysql: None
psycopg2: None
jinja2: 2.9.6
s3fs: None
pandas_gbq: None
pandas_datareader: None
</details>
| null | https://github.com/pandas-dev/pandas/pull/36238 | null | {'base_commit': 'b5a5268dabb2a4dea1c3c543a1ddff501b87a447', 'files': [{'path': 'doc/source/user_guide/groupby.rst', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [90, 91, 92, 93, 94]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [],
"doc": [
"doc/source/user_guide/groupby.rst"
],
"test": [],
"config": [],
"asset": []
} | 1 |
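The doc passage under discussion becomes clearer with a concrete example: the same string passed to `groupby` can name either a column or an index level, and the results agree (a minimal pandas sketch):

```python
import pandas as pd

df = pd.DataFrame({"A": ["x", "x", "y"], "B": [1, 2, 3]})

by_column = df.groupby("A")["B"].sum()                 # 'A' is a column
by_level = df.set_index("A").groupby("A")["B"].sum()   # 'A' is an index level
```

Either way, pandas resolves the string `"A"` to a grouping key, which is the point the reworded documentation makes explicit.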
pandas-dev | pandas | 48d0460ab9acbee223bae1be699344f8fd232224 | https://github.com/pandas-dev/pandas/issues/12401 | Indexing
API Design
Deprecate
Needs Discussion | DEPR: filter & select | do we need label selectors? we should for sure just have a single method for this. maybe call it `query_labels`? to be consistent with `.query` as the workhorse for data selection.
- [x] ``.select`` (#17633)
- [ ] ``.filter``
xref #6599
| null | null | https://github.com/pandas-dev/pandas/commit/48d0460ab9acbee223bae1be699344f8fd232224 | {'base_commit': '48d0460ab9acbee223bae1be699344f8fd232224', 'files': [{'path': 'doc/source/whatsnew/v0.21.0.txt', 'status': 'modified', 'Loc': {'(None, None, 669)': {'add': [669]}}}, {'path': 'pandas/core/common.py', 'status': 'modified', 'Loc': {"(None, '_apply_if_callable', 444)": {'add': [447]}}}, {'path': 'pandas/core/generic.py', 'status': 'modified', 'Loc': {"('NDFrame', 'select', 2338)": {'add': [2341, 2351]}, "('NDFrame', 'filter', 3061)": {'mod': [3104, 3123, 3124, 3127, 3128, 3130, 3131, 3132, 3135, 3136]}}}, {'path': 'pandas/core/indexing.py', 'status': 'modified', 'Loc': {"('_NDFrameIndexer', '__call__', 98)": {'add': [101]}, "('_NDFrameIndexer', '__getitem__', 110)": {'add': [119], 'mod': [121, 123]}, "('_NDFrameIndexer', None, 88)": {'add': [198], 'mod': [110, 195]}, "('_NDFrameIndexer', '_convert_tuple', 228)": {'add': [235]}, "('_NDFrameIndexer', '_getitem_iterable', 1110)": {'add': [1155], 'mod': [1141]}, "('_NDFrameIndexer', '_convert_to_indexer', 1167)": {'add': [1260], 'mod': [1258]}, "('_LocationIndexer', None, 1355)": {'add': [1358], 'mod': [1357]}, "('_iLocIndexer', '_getitem_tuple', 1735)": {'add': [1744], 'mod': [1751]}, "('_iLocIndexer', '_get_list_axis', 1778)": {'add': [1785], 'mod': [1784]}, "('_NDFrameIndexer', '_get_label', 129)": {'mod': [138, 141]}, "('_NDFrameIndexer', '_get_setitem_indexer', 157)": {'mod': [176]}, "('_NDFrameIndexer', '_multi_take_opportunity', 882)": {'mod': [898]}, "('_NDFrameIndexer', '_convert_for_reindex', 916)": {'mod': [928]}, "('_NDFrameIndexer', '_getitem_lowerdim', 963)": {'mod': [1018]}, "('_NDFrameIndexer', '_getitem_nested_tuple', 1024)": {'mod': [1052]}, "('_NDFrameIndexer', '_getitem_axis', 1072)": {'mod': [1087]}, "('_IXIndexer', '__init__', 1324)": {'mod': [1328, 1336, 1337]}, "('_IXIndexer', '_has_valid_type', 1338)": {'mod': [1345, 1348]}, "('_LocIndexer', '_is_scalar_access', 1518)": {'mod': 
[1531]}, "('_iLocIndexer', '_is_valid_list_like', 1716)": {'mod': [1720, 1732]}, "('_iLocIndexer', '_getitem_axis', 1799)": {'mod': [1821]}}}, {'path': 'pandas/tests/frame/test_alter_axes.py', 'status': 'modified', 'Loc': {"('TestDataFrameAlterAxes', 'test_set_index_bug', 143)": {'add': [149], 'mod': [146, 147]}}}, {'path': 'pandas/tests/frame/test_axis_select_reindex.py', 'status': 'modified', 'Loc': {"('TestDataFrameSelectReindex', None, 25)": {'add': [798]}, "('TestDataFrameSelectReindex', 'test_select', 798)": {'add': [806], 'mod': [800, 801, 802, 803, 805, 808]}}}, {'path': 'pandas/tests/frame/test_mutate_columns.py', 'status': 'modified', 'Loc': {}}, {'path': 'pandas/tests/groupby/test_groupby.py', 'status': 'modified', 'Loc': {"('TestGroupBy', '_func', 3105)": {'mod': [3106]}}}, {'path': 'pandas/tests/series/test_indexing.py', 'status': 'modified', 'Loc': {"('TestSeriesIndexing', 'test_select', 2227)": {'mod': [2228, 2229, 2230, 2231, 2233, 2234, 2235]}}}, {'path': 'pandas/tests/test_multilevel.py', 'status': 'modified', 'Loc': {"('TestMultiLevel', 'test_groupby_level_no_obs', 1236)": {'mod': [1242]}}}]} | [] | [] | [] | {
"iss_type": "4",
"iss_reason": "2",
"loc_way": "commit",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"pandas/core/common.py",
"pandas/core/generic.py",
"pandas/core/indexing.py"
],
"doc": [
"doc/source/whatsnew/v0.21.0.txt"
],
"test": [
"pandas/tests/series/test_indexing.py",
"pandas/tests/test_multilevel.py",
"pandas/tests/frame/test_mutate_columns.py",
"pandas/tests/groupby/test_groupby.py",
"pandas/tests/frame/test_alter_axes.py",
"pandas/tests/frame/test_axis_select_reindex.py"
],
"config": [],
"asset": []
} | null |
deepfakes | faceswap | 9dc151e5b58abb5f8862d2aa84124ed86156e0b8 | https://github.com/deepfakes/faceswap/issues/355 | when using GUI recent version, A converting error has occurred. | I am testing the gui version downloaded today. But when converting, the following error has occurred.
Can anyone tell me what I am doing wrong or how to solve it?
(1) error message
"Failed to convert image: ...\faceA_source_gui\out1.png. Reason: argument of type 'NoneType' is not iterable"
(1) train image :
https://imgur.com/tLB15CB
(2) convert error image :
https://imgur.com/OAzWKdR
| null | https://github.com/deepfakes/faceswap/pull/352 | null | {'base_commit': '9dc151e5b58abb5f8862d2aa84124ed86156e0b8', 'files': [{'path': 'faceswap.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [30]}}}, {'path': 'requirements-gpu-python35-cuda8.txt', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [10]}}}, {'path': 'requirements-gpu-python36-cuda9.txt', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [10]}}}, {'path': 'requirements-python35.txt', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [10]}}}, {'path': 'requirements-python36.txt', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [10]}}}, {'path': 'scripts/convert.py', 'status': 'modified', 'Loc': {"('ConvertImage', 'get_optional_arguments', 26)": {'mod': [119]}}}, {'path': 'scripts/gui.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [2, 5, 377], 'mod': [1, 4, 10, 11, 12, 13, 14, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 40, 41, 42, 43, 44, 45, 46, 47]}, "('TKGui', None, 423)": {'add': [435, 470], 'mod': [424, 425, 426]}, "('TKGui', 'extract_options', 436)": {'add': [441], 'mod': [437, 438, 439, 443]}, "('TKGui', 'process', 480)": {'add': [482], 'mod': [481]}, "('FaceswapGui', None, 49)": {'mod': [49, 50, 51, 52, 70, 71, 72, 73, 74, 75, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 159, 160, 161, 162, 163, 165, 166, 167]}, "('FaceswapGui', '__init__', 51)": {'mod': [54, 56, 57, 58, 59, 60, 61, 62, 64, 65, 67, 68]}, "('FaceswapGui', 'build_gui', 70)": {'mod': [77, 78, 79, 80]}, "('FaceswapGui', 'load_config', 97)": {'mod': [98, 107, 108]}, "('FaceswapGui', 'set_command_args', 110)": {'mod': [111]}, "('FaceswapGui', 'save_config', 118)": {'mod': [119, 120, 121, 122, 132]}, "('FaceswapGui', 'reset_config', 134)": {'mod': [135]}, "('FaceswapGui', 'clear_config', 145)": {'mod': [146]}, "('ActionFrame', None, 169)": {'mod': [169, 170, 171, 172, 173, 174, 175, 177, 
178, 179, 180, 217, 218, 219, 220]}, "('ActionFrame', 'build_frame', 177)": {'mod': [182, 183, 184, 185, 187, 188, 189, 190, 191, 192, 193, 194, 195, 196, 197, 198, 199, 200, 201, 202, 204, 205, 206, 207, 209, 210, 211, 212, 213, 214, 215]}, "('ActionFrame', 'add_util_buttons', 217)": {'mod': [222, 223, 224, 225, 226, 227, 228, 229, 230, 231, 233, 234, 235, 236, 238, 239, 240, 241, 242, 243, 244, 245, 246, 247, 248, 249]}, "('CommandTab', None, 251)": {'mod': [252, 253, 254, 272, 273, 274, 275, 277, 278, 279, 280, 292, 293, 294, 295, 296, 297, 299, 300, 301, 303, 304, 305, 306, 307, 308, 309, 326, 327, 328, 331, 332]}, "('CommandTab', 'build_tab', 260)": {'mod': [261, 262, 264, 265, 267, 268]}, "('CommandTab', 'add_right_frame', 277)": {'mod': [282, 283, 285, 287, 288, 290]}, "('CommandTab', 'build_tabs', 304)": {'mod': [312, 314, 315, 317, 318, 320, 321, 322, 323]}, "('CommandTab', 'build_control', 331)": {'mod': [335, 341, 342, 344, 345, 352, 354, 355]}, "('CommandTab', 'add_browser_buttons', 357)": {'mod': [358, 359, 361, 362]}, "('CommandTab', 'ask_folder', 365)": {'mod': [366]}, "('CommandTab', 'ask_load', 372)": {'mod': [373]}, "('FaceswapControl', None, 378)": {'mod': [379, 380, 381, 382, 383, 384, 386, 387, 388, 390, 391]}, "('FaceswapControl', 'execute_script', 390)": {'mod': [393, 394, 396, 397, 398, 403, 405, 406, 407, 408, 410, 411, 412]}, "('FaceswapControl', 'launch_faceswap', 410)": {'mod': [414, 415, 416, 417, 418, 419, 420, 421]}, "('TKGui', '__init__', 425)": {'mod': [428, 431, 433]}, "('TKGui', 'set_control_title', 449)": {'mod': [450, 452]}, "('TKGui', 'set_control', 456)": {'mod': [457, 459, 467]}, "('TKGui', 'parse_arguments', 470)": {'mod': [476, 477, 478]}}}, {'path': 'scripts/train.py', 'status': 'modified', 'Loc': {"('TrainingProcessor', 'get_argument_list', 40)": {'add': [108]}, '(None, None, None)': {'mod': [7]}, "('TrainingProcessor', 'process', 141)": {'mod': [164, 165]}, "('TrainingProcessor', 'show', 226)": {'mod': [228]}}}, {'path': 
'tools.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [5, 29]}}}, {'path': 'tools/sort.py', 'status': 'modified', 'Loc': {"('SortProcessor', None, 35)": {'add': [40], 'mod': [721, 722, 723, 724, 803, 804]}, '(None, None, None)': {'mod': [1, 11, 13, 30, 31, 32, 33, 817, 818]}, "(None, 'import_face_recognition', 17)": {'mod': [18]}, "(None, 'import_FaceLandmarksExtractor', 23)": {'mod': [24]}, "('SortProcessor', '__init__', 36)": {'mod': [37]}, "('SortProcessor', 'parse_arguments', 41)": {'mod': [53, 54, 55, 56, 57, 59, 60, 61, 62, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 112, 113, 114, 115, 116, 117, 118, 119, 120, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131, 133, 134, 135, 136, 137, 138, 140, 141, 142, 143, 144, 145, 146, 147, 148, 149, 151, 152, 153, 154, 155, 156, 157, 158, 159, 160, 161]}, "('SortProcessor', 'add_optional_arguments', 166)": {'mod': [167]}, "('SortProcessor', 'process_arguments', 170)": {'mod': [171, 177, 178, 181, 182, 185, 186, 189, 190, 192, 194, 196, 199, 203, 204]}, "('SortProcessor', 'process', 208)": {'mod': [214, 215, 216, 218, 219, 220, 221, 224, 226, 234]}, "('SortProcessor', 'sort_blur', 237)": {'mod': [238, 240, 241, 242]}, "('SortProcessor', 'sort_face', 248)": {'mod': [251, 253, 255, 258, 260, 261, 272, 273]}, "('SortProcessor', 'sort_face_dissim', 277)": {'mod': [280, 282, 284, 287, 289, 300]}, "('SortProcessor', 'sort_face_cnn', 304)": {'mod': [307, 309, 312, 314, 317, 319, 320, 324, 329]}, "('SortProcessor', 'sort_face_cnn_dissim', 333)": {'mod': [336, 338, 341, 343, 346, 348, 353, 357]}, "('SortProcessor', 'sort_face_yaw', 362)": {'mod': [363, 364, 373, 376, 378, 380]}, "('SortProcessor', 'calc_landmarks_face_pitch', 363)": {'mod': [366]}, "('SortProcessor', 'calc_landmarks_face_yaw', 367)": {'mod': [368, 369, 370]}, "('SortProcessor', 'sort_hist', 385)": 
{'mod': [386, 388, 390, 393, 395, 396, 397, 401]}, "('SortProcessor', 'sort_hist_dissim', 405)": {'mod': [406, 408, 410, 413, 415, 418, 423]}, "('SortProcessor', 'group_blur', 429)": {'mod': [431, 438, 439]}, "('SortProcessor', 'group_face', 452)": {'mod': [453, 465]}, "('SortProcessor', 'group_face_cnn', 503)": {'mod': [504, 517, 521]}, "('SortProcessor', 'group_hist', 545)": {'mod': [546, 555]}, "('SortProcessor', 'final_process_rename', 578)": {'mod': [579, 581, 584, 585, 587, 593, 595, 598, 600, 601, 605, 608, 610, 611]}, "('SortProcessor', 'final_process_group', 613)": {'mod': [614, 616, 620, 622, 623, 624, 626, 628, 632, 634, 636, 638, 639, 641, 642]}, "('SortProcessor', 'reload_images', 645)": {'mod': [657, 660, 662, 667, 670]}, "('SortProcessor', 'find_images', 703)": {'mod': [709]}, "('SortProcessor', 'renaming', 759)": {'mod': [762, 763]}, "('SortProcessor', 'renaming', 769)": {'mod': [772, 773]}, "('SortProcessor', 'get_avg_score_hist', 778)": {'mod': [783]}, "('SortProcessor', 'get_avg_score_faces', 786)": {'mod': [792]}, "('SortProcessor', 'get_avg_score_faces_cnn', 795)": {'mod': [798, 800]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"scripts/gui.py",
"scripts/train.py",
"faceswap.py",
"tools.py",
"tools/sort.py",
"scripts/convert.py"
],
"doc": [],
"test": [],
"config": [
"requirements-gpu-python36-cuda9.txt",
"requirements-gpu-python35-cuda8.txt",
"requirements-python35.txt",
"requirements-python36.txt"
],
"asset": []
} | 1 | |
xtekky | gpt4free | b9478049b3e8644be2de93015476b9111126d683 | https://github.com/xtekky/gpt4free/issues/660 | bug | gpt4free useless: IndexError: list index out of range | **Bug description**
Telegram bot using gpt4free not working
main.py:
```python
import telebot
from gpt4free import usesless
bot = telebot.TeleBot('my_token')
@bot.message_handler(commands=['start'])
def send_welcome(message):
bot.reply_to(message, "ChatGPT unlimited and free but without memory")
@bot.message_handler()
def test(message):
prompt = ""
req = usesless.Completion.create(prompt=prompt)
prompt = message.text
bot.reply_to(message, req["text"])
if __name__ == "__main__":
bot.polling()
```
Error:
```
Traceback (most recent call last):
  File "main.py", line 20, in <module>
    bot.polling()
  File "/home/runner/Test/venv/lib/python3.10/site-packages/telebot/__init__.py", line 1043, in polling
    self.__threaded_polling(non_stop=non_stop, interval=interval, timeout=timeout, long_polling_timeout=long_polling_timeout,
  File "/home/runner/Test/venv/lib/python3.10/site-packages/telebot/__init__.py", line 1118, in __threaded_polling
    raise e
  File "/home/runner/Test/venv/lib/python3.10/site-packages/telebot/__init__.py", line 1074, in __threaded_polling
    self.worker_pool.raise_exceptions()
  File "/home/runner/Test/venv/lib/python3.10/site-packages/telebot/util.py", line 147, in raise_exceptions
    raise self.exception_info
  File "/home/runner/Test/venv/lib/python3.10/site-packages/telebot/util.py", line 90, in run
    task(*args, **kwargs)
  File "/home/runner/Test/venv/lib/python3.10/site-packages/telebot/__init__.py", line 6770, in _run_middlewares_and_handler
    result = handler['function'](message)
  File "main.py", line 15, in test
    req = usesless.Completion.create(prompt=prompt)
  File "/home/runner/Test/venv/lib/python3.10/site-packages/gpt4free/usesless/__init__.py", line 46, in create
    response = Completion.__response_to_json(content)
  File "/home/runner/Test/venv/lib/python3.10/site-packages/gpt4free/usesless/__init__.py", line 53, in __response_to_json
    split_text = text.rsplit("\n", 1)[1]
IndexError: list index out of range
```
**Environement**
- python version 3.10
- server location Poland
**Additional context**
If you need more information to help me, please let me know. | null | https://github.com/xtekky/gpt4free/pull/664 | null | {'base_commit': 'b9478049b3e8644be2de93015476b9111126d683', 'files': [{'path': 'gpt4free/usesless/__init__.py', 'status': 'modified', 'Loc': {"('Completion', '__response_to_json', 148)": {'mod': [151, 152, 153]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"gpt4free/usesless/__init__.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | 1 |
keras-team | keras | aab55e649c34f8a24f00ee63922d049d3417c979 | https://github.com/keras-team/keras/issues/8304 | HDF5 Normalizer not working. | ```
def preprocess_train(array):
    """Given a batch of numpy arrays, output a batch of numpy arrays with all preprocessing applied.

    size : (w, h)
    """
    num1 = np.random.randint(0, 128 - 112)
    num2 = np.random.randint(0, 171 - 112)
    crop = array[:, num1:num1 + 112, num2:num2 + 112, :]
    crop = crop / 255.0
    return crop
```
```
X_train = HDF5Matrix(train_loc, 'images', start=0, normalizer=preprocess_train)
y_train = HDF5Matrix(train_loc, 'labels')
```
```
model_final.fit(X_train, y_train, batch_size=16, shuffle='batch', validation_data = [X_test, y_test], epochs=10)
```
```
ValueError: Error when checking model input: expected conv1_input to have shape (None, 16, 112, 112, 3) but got array with shape (5797, 16, 128, 171, 3)
```
Basically I have an h5py file with shape (5797, 16, 128, 171, 3), and my preprocess function should output (16, 112, 112, 3) — but this is not happening.
However, when I use X_train alone and call X_train.__getitem__(1), it outputs an array with shape (16, 112, 112, 3).
Not sure where I am going wrong. Can someone help me ? | null | https://github.com/keras-team/keras/pull/10749 | null | {'base_commit': 'aab55e649c34f8a24f00ee63922d049d3417c979', 'files': [{'path': 'keras/utils/io_utils.py', 'status': 'modified', 'Loc': {"('HDF5Matrix', '__init__', 44)": {'add': [60]}, "('HDF5Matrix', 'shape', 98)": {'mod': [104]}, "('HDF5Matrix', 'dtype', 107)": {'mod': [113]}}}, {'path': 'tests/keras/utils/io_utils_test.py', 'status': 'modified', 'Loc': {"(None, 'test_io_utils', 43)": {'add': [106]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"keras/utils/io_utils.py",
"tests/keras/utils/io_utils_test.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | 1 | |
home-assistant | core | 8c8feb95a9c9048d655bc1eb263f6bc6ee61ee74 | https://github.com/home-assistant/core/issues/4 | Instructions don't result in homeassistant listening on any port | Neither of these results in homeassistant listening on port `8123`
``` bash
python3 -m homeassistant
python3 -m homeassistant --config=config
```
In fact, it isn't seeming to be listening on _any_ port.
``` bash
(ve)[jeff@omniscience home-assistant] (master)$ ./build_frontend
(ve)[jeff@omniscience home-assistant] (master)$ git status
On branch master
Your branch is up-to-date with 'origin/master'.
Changes not staged for commit:
(use "git add/rm <file>..." to update what will be committed)
(use "git checkout -- <file>..." to discard changes in working directory)
modified: build_frontend
deleted: config/home-assistant.conf.example
modified: homeassistant/components/http/frontend.py
modified: homeassistant/components/http/www_static/frontend.html
Untracked files:
(use "git add <file>..." to include in what will be committed)
ve/
no changes added to commit (use "git add" and/or "git commit -a")
(ve)[jeff@omniscience home-assistant] (master)$ python3 -m homeassistant --config config
INFO:homeassistant.loader:Loaded component demo from homeassistant.components.demo
ERROR:homeassistant.loader:Error loading homeassistant.components.http
Traceback (most recent call last):
File "/home/jeff/git/home-assistant/homeassistant/loader.py", line 91, in _get_component
comp = importlib.import_module(module)
File "/usr/lib64/python3.4/importlib/__init__.py", line 109, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 2254, in _gcd_import
File "<frozen importlib._bootstrap>", line 2237, in _find_and_load
File "<frozen importlib._bootstrap>", line 2226, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 1200, in _load_unlocked
File "<frozen importlib._bootstrap>", line 1129, in _exec
File "<frozen importlib._bootstrap>", line 1471, in exec_module
File "<frozen importlib._bootstrap>", line 321, in _call_with_frames_removed
File "/home/jeff/git/home-assistant/homeassistant/components/http/__init__.py", line 86, in <module>
import homeassistant.remote as rem
File "/home/jeff/git/home-assistant/homeassistant/remote.py", line 18, in <module>
import requests
ImportError: No module named 'requests'
ERROR:homeassistant.loader:Unable to load component http
INFO:homeassistant.loader:Loaded component group from homeassistant.components.group
INFO:homeassistant.bootstrap:Home Assistant core initialized
INFO:homeassistant:Bus:Handling <Event state_changed[L]: entity_id=a.Demo_Mode, new_state=<state Enabled @ 22:30:15 06-11-2014>>
INFO:homeassistant.loader:Loaded component sun from homeassistant.components.sun
ERROR:homeassistant.components.sun:Error while importing dependency ephem.
Traceback (most recent call last):
File "/home/jeff/git/home-assistant/homeassistant/components/sun.py", line 66, in setup
import ephem
ImportError: No module named 'ephem'
INFO:homeassistant:Bus:Handling <Event state_changed[L]: entity_id=light.Bowl, new_state=<state on:xy_color=[0.861, 0.3259], brightness=200 @ 22:30:15 06-11-2014>>
INFO:homeassistant:Bus:Handling <Event state_changed[L]: entity_id=light.Ceiling, new_state=<state on:xy_color=[0.1684, 0.0416], brightness=200 @ 22:30:15 06-11-2014>>
INFO:homeassistant:Bus:Handling <Event state_changed[L]: entity_id=light.TV_Back_light, new_state=<state off @ 22:30:15 06-11-2014>>
INFO:homeassistant:Bus:Handling <Event state_changed[L]: entity_id=light.Bed_light, new_state=<state off @ 22:30:15 06-11-2014>>
INFO:homeassistant:Bus:Handling <Event state_changed[L]: entity_id=group.all_lights, new_state=<state on:entity_id=['light.Bowl', 'light.Ceiling', 'light.TV_Back_light', 'light.Bed_light'], auto=True @ 22:30:15 06-11-2014>>
INFO:homeassistant:Bus:Handling <Event state_changed[L]: entity_id=wemo.AC, new_state=<state on @ 22:30:15 06-11-2014>>
INFO:homeassistant:Bus:Handling <Event state_changed[L]: entity_id=wemo.Christmas_Lights, new_state=<state off @ 22:30:15 06-11-2014>>
INFO:homeassistant:Bus:Handling <Event state_changed[L]: entity_id=group.living_room, new_state=<state on:entity_id=['light.Bowl', 'light.Ceiling', 'light.TV_Back_light', 'wemo.AC'], auto=False @ 22:30:15 06-11-2014>>
ERROR:homeassistant:WorkerPool:All 4 threads are busy and 17 jobs pending
INFO:homeassistant:Bus:Handling <Event state_changed[L]: entity_id=group.bedroom, new_state=<state off:entity_id=['light.Bed_light', 'wemo.Christmas_Lights'], auto=False @ 22:30:15 06-11-2014>>
INFO:homeassistant:Bus:Handling <Event state_changed[L]: entity_id=process.XBMC, new_state=<state on @ 22:30:15 06-11-2014>>
INFO:homeassistant:Bus:Handling <Event state_changed[L]: entity_id=device_tracker.Paulus, new_state=<state home:entity_picture=http://graph.facebook.com/schoutsen/picture @ 22:30:15 06-11-2014>>
ERROR:homeassistant:WorkerPool:All 4 threads are busy and 33 jobs pending
INFO:homeassistant:Bus:Handling <Event state_changed[L]: entity_id=device_tracker.Anne_Therese, new_state=<state not_home:entity_picture=http://graph.facebook.com/anne.t.frederiksen/picture @ 22:30:15 06-11-2014>>
INFO:homeassistant:Bus:Handling <Event state_changed[L]: entity_id=group.all_devices, new_state=<state home:entity_id=['device_tracker.Paulus', 'device_tracker.Anne_Therese'], auto=True @ 22:30:15 06-11-2014>>
INFO:homeassistant:Bus:Handling <Event state_changed[L]: entity_id=chromecast.Living_Rm, new_state=<state Netflix:friendly_name=Living Room @ 22:30:15 06-11-2014>>
INFO:homeassistant.bootstrap:component demo initialized
INFO:homeassistant.bootstrap:component group initialized
INFO:homeassistant:Bus:Handling <Event homeassistant_start[L]>
INFO:homeassistant:Timer:starting
INFO:homeassistant:Bus:Handling <Event time_changed[L]: now=22:30:20 06-11-2014>
^C
```
Sadly, I can't use the docker container due to [docker being broken in Fedora 21](https://github.com/docker/docker/issues/7952) right now. So while it is running, I tried `lsof -i tcp:8123` and `lsof -p $(pidof python3)`
This is with Python 3.4.1 on Fedora 21 (pre-release) x86_64.
FYI: I work on python automation code and django apps for `$REAL_JOB` and would love to help you improve this software if at all possible. I've got a home Insteon network and have SONOS speakers throughout the house. Once I get this all working, one of the first things I'd like to write is the integration between this and the SONOS xml api
| null | https://github.com/home-assistant/core/pull/35811 | null | {'base_commit': '8c8feb95a9c9048d655bc1eb263f6bc6ee61ee74', 'files': [{'path': 'homeassistant/components/google_assistant/helpers.py', 'status': 'modified', 'Loc': {"('GoogleEntity', 'sync_serialize', 393)": {'add': [428]}}}, {'path': 'tests/components/google_assistant/test_helpers.py', 'status': 'modified', 'Loc': {"(None, 'test_google_entity_sync_serialize_with_local_sdk', 25)": {'mod': [47, 48, 49, 50, 51, 52, 53, 54, 55]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"homeassistant/components/google_assistant/helpers.py"
],
"doc": [],
"test": [
"tests/components/google_assistant/test_helpers.py"
],
"config": [],
"asset": []
} | 1 | |
psf | requests | 2203c3bccd5e4888a16d73247d540fd6e359d29c | https://github.com/psf/requests/issues/1 | Cookie support? | An feature request (not found in documentation).
Does this support cookies?
Usecase: I can integrate this module inside an existings framework. This framework generate for me the authentication/session cookie, so to perform request using requests there I need to add the same auth cookie already generated.
| null | null | https://github.com/psf/requests/commit/2203c3bccd5e4888a16d73247d540fd6e359d29c | {'base_commit': '2203c3bccd5e4888a16d73247d540fd6e359d29c', 'files': [{'path': 'requests/core.py', 'status': 'modified', 'Loc': {"('Request', '__init__', 68)": {'add': [76]}, "('Request', None, 61)": {'add': [101]}, "('Request', '_get_opener', 101)": {'mod': [108, 109, 112, 113, 114]}}}]} | [] | [] | [] | {
"iss_type": "4",
"iss_reason": "2",
"loc_way": "commit",
  "loc_scope": "No corresponding PR was found; the PR provided in this row also cannot resolve the issue. What resolved the issue was a commit referenced in the issue thread.",
"info_type": ""
} | {
"code": [
"requests/core.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null | |
psf | requests | ac4e05874a1a983ca126185a0e4d4e74915f792e | https://github.com/psf/requests/issues/1859 | Brittle test | The test `test_expires_valid_str` fails on my OS X box, in Python 2.7:
``` python
============================= test session starts ==============================
platform darwin -- Python 2.7.5 -- pytest-2.3.4
plugins: cov
collected 116 items
test_requests.py .................................................................................................................F..
=================================== FAILURES ===================================
_______________ TestMorselToCookieExpires.test_expires_valid_str _______________
self = <test_requests.TestMorselToCookieExpires testMethod=test_expires_valid_str>
def test_expires_valid_str(self):
"""Test case where we convert expires from string time."""
morsel = Morsel()
morsel['expires'] = 'Thu, 01-Jan-1970 00:00:01 GMT'
cookie = morsel_to_cookie(morsel)
> assert cookie.expires == 1
E AssertionError: assert -3599 == 1
E + where -3599 = Cookie(version=0, name=None, value=None, port=None, port_specified=False, domain='', domain_specified=False, domain_in...False, secure=False, expires=-3599, discard=False, comment='', comment_url=False, rest={'HttpOnly': ''}, rfc2109=False).expires
test_requests.py:1111: AssertionError
==================== 1 failed, 115 passed in 23.32 seconds =====================
```
I've not yet got a good theory for this, though I think it's telling that the error is one hour. I don't know _what_ it's telling though, because time is complicated.
Anyway, this test needs to be rewritten to be more accepting of breakage. It's also possible that the intermittent failure of this test represents a problem with the `morsel_to_cookie` function itself, in which case that needs rewriting.
| null | https://github.com/psf/requests/pull/1860 | null | {'base_commit': 'ac4e05874a1a983ca126185a0e4d4e74915f792e', 'files': [{'path': 'requests/cookies.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [9]}, "(None, 'morsel_to_cookie', 388)": {'mod': [396, 397]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"requests/cookies.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | 1 | |
nvbn | thefuck | 58ddd4338adf12a3abc2ffed0e27794a398fa8d2 | https://github.com/nvbn/thefuck/issues/994 | help wanted
hacktoberfest | UnicodeDecodeError when using thefuck | I followed the alias guide, but I got an error when running thefuck in PowerShell:
```
Traceback (most recent call last):
File "d:\python36\lib\runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "d:\python36\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "D:\Python36\Scripts\thefuck.exe\__main__.py", line 9, in <module>
File "d:\python36\lib\site-packages\thefuck\entrypoints\main.py", line 26, in main
fix_command(known_args)
File "d:\python36\lib\site-packages\thefuck\entrypoints\fix_command.py", line 36, in fix_command
command = types.Command.from_raw_script(raw_command)
File "d:\python36\lib\site-packages\thefuck\types.py", line 82, in from_raw_script
output = get_output(script, expanded)
File "d:\python36\lib\site-packages\thefuck\output_readers\__init__.py", line 20, in get_output
return rerun.get_output(script, expanded)
File "d:\python36\lib\site-packages\thefuck\output_readers\rerun.py", line 62, in get_output
output = result.stdout.read().decode('utf-8')
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xb2 in position 9: invalid start byte
``` | null | https://github.com/nvbn/thefuck/pull/1214 | null | {'base_commit': '58ddd4338adf12a3abc2ffed0e27794a398fa8d2', 'files': [{'path': 'tests/output_readers/test_rerun.py', 'status': 'modified', 'Loc': {"('TestRerun', None, 9)": {'add': [24]}}}, {'path': 'thefuck/output_readers/rerun.py', 'status': 'modified', 'Loc': {"(None, 'get_output', 45)": {'mod': [63]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"thefuck/output_readers/rerun.py"
],
"doc": [],
"test": [
"tests/output_readers/test_rerun.py"
],
"config": [],
"asset": []
} | 1 |
deepfakes | faceswap | c50287c23b3f35f54aa703823a8c3f9cbfc34377 | https://github.com/deepfakes/faceswap/issues/233 | Some faces with one eye hair covered can't be recognized | *First THANKS A LOT for all contributors' hard work!
*Always make a compare test after big change, test with same source 1000 pics (kar801 -> kar1800) , compare with FakeApp1.1 & latest faceswap commit 232d931.
*Test files [Link Removed]
## Expected behavior
Not sure, limitation ? or possible to improve ?
## Actual behavior
FakeApp1.1 extract rate is 988/1000
faceswap -D cnn extract rate is 943/1000
[Image Removed]
Notice that some faces - specially one eye covered by hair can't be extract. Example: kar1086 -> kar1090, these 5 pics can be extract normally in FakeApp, but failed in faceswap. Compare kar1085 with kar1086, no big gap in these 2 pics, just corner of the eye be covered by hair in kar1086.
## Steps to reproduce
python faceswap.py extract -i D:/project4/data_A1/ -o D:/project4/data_A1/output/ -D cnn
## Other relevant information
- **Operating system and version:** Windows, macOS, Linux
Windows10
Python3.6.4
CUDA9.0
dlib 19.9
The others env same as requirements-gpu-python36-cuda9.txt
| null | https://github.com/deepfakes/faceswap/pull/236 | null | {'base_commit': 'c50287c23b3f35f54aa703823a8c3f9cbfc34377', 'files': [{'path': 'lib/FaceLandmarksExtractor/FaceLandmarksExtractor.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [13], 'mod': [11, 12]}, "(None, 'extract', 114)": {'add': [162, 170], 'mod': [114, 115, 117, 118, 121, 123, 124, 125, 126, 127, 129, 130, 132, 133, 136, 139, 141, 143, 145, 146, 147, 148, 149, 150, 151, 152, 153, 155, 156, 157, 158, 161, 165, 169, 172]}, "(None, 'onExit', 16)": {'mod': [17, 18, 25, 26, 28, 29]}}}, {'path': 'lib/ModelAE.py', 'status': 'modified', 'Loc': {"('TrainerAE', 'show_sample', 69)": {'add': [80], 'mod': [82]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"lib/FaceLandmarksExtractor/FaceLandmarksExtractor.py",
"lib/ModelAE.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | 1 | |
huggingface | transformers | 02e05fb0a532e572b56ba75dad6ba3db625bbdeb | https://github.com/huggingface/transformers/issues/9438 | Documentation | Doc styling utils adds parasite new lines | ## Environment info
- `transformers` version: 4.2.0dev0
- Platform: Windows-10-10.0.18362-SP0
- Python version: 3.7.9
- PyTorch version (GPU?): 1.7.1 (False)
- Tensorflow version (GPU?): 2.3.1 (False)
- Using GPU in script?: Nope
- Using distributed or parallel set-up in script?: Nope
### Who can help
@sgugger
## Information
Running the python util to style docs adds parasite new lines in every single docstring. See:
```bash
$ python utils/style_doc.py src/transformers docs/source --max_len 119 --check_only
Traceback (most recent call last):
File "utils/style_doc.py", line 491, in <module>
main(*args.files, max_len=args.max_len, check_only=args.check_only)
File "utils/style_doc.py", line 479, in main
raise ValueError(f"{len(changed)} files should be restyled!")
ValueError: 345 files should be restyled!
```
See this commit for an example of what it does: https://github.com/huggingface/transformers/pull/9150/commits/b4dedd5ca25f043c66d12c774fa00a34c74dffb2
## To reproduce
Steps to reproduce the behavior:
1. Checkout and update master branch
2. run `python utils/style_doc.py src/transformers docs/source --max_len 119 --check-only` from transformers root
Output:
```python
Traceback (most recent call last):
File "utils/style_doc.py", line 491, in <module>
main(*args.files, max_len=args.max_len, check_only=args.check_only)
File "utils/style_doc.py", line 479, in main
raise ValueError(f"{len(changed)} files should be restyled!")
ValueError: 345 files should be restyled!
```
It might have something to do with Windows or a particular setup of my machine because behavior cannot be reproduced by @patrickvonplaten.
## Expected behavior
On master branch, documentation should not need to be restyled
| null | https://github.com/huggingface/transformers/pull/9488 | null | {'base_commit': '02e05fb0a532e572b56ba75dad6ba3db625bbdeb', 'files': [{'path': 'docs/source/benchmarks.rst', 'status': 'modified', 'Loc': {}}, {'path': 'utils/style_doc.py', 'status': 'modified', 'Loc': {"(None, 'style_rst_file', 378)": {'mod': [384, 386]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"utils/style_doc.py"
],
"doc": [
"docs/source/benchmarks.rst"
],
"test": [],
"config": [],
"asset": []
} | 1 |
scrapy | scrapy | 09e56ae43eb63641381e0d722a04536c2fe22c0d | https://github.com/scrapy/scrapy/issues/3616 | Document LogFormatter | Currently, the `LogFormatter` class is only mentioned in the [Release notes](https://docs.scrapy.org/en/latest/news.html) page of the documentation. This class should be properly documented, both its API members and a small section introducing it on the documentation page about [Logging](https://docs.scrapy.org/en/latest/topics/logging.html).
The responses to [Scrapy - Silently drop an item](https://stackoverflow.com/q/13527921/939364) in StackOverflow would be a good starting point. | null | https://github.com/scrapy/scrapy/pull/3660 | null | {'base_commit': '09e56ae43eb63641381e0d722a04536c2fe22c0d', 'files': [{'path': 'docs/topics/logging.rst', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [195]}}}, {'path': 'docs/topics/settings.rst', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [868]}}}, {'path': 'scrapy/logformatter.py', 'status': 'modified', 'Loc': {"('LogFormatter', None, 13)": {'add': [33, 34, 51, 65], 'mod': [16, 17, 18, 21, 22, 23, 25, 26, 27, 29, 30, 31, 32]}, "('LogFormatter', 'crawled', 34)": {'mod': [43]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"scrapy/logformatter.py"
],
"doc": [
"docs/topics/logging.rst",
"docs/topics/settings.rst"
],
"test": [],
"config": [],
"asset": []
} | 1 | |
scrapy | scrapy | 47b9de93a9c7a514f4007439335facd8ea82a12d | https://github.com/scrapy/scrapy/issues/2905 | enhancement
docs
help wanted | An error occurred while connecting: [Failure instance: Traceback: <class 'ValueError'>: filedescriptor out of range in select() | I'm trying crawl ~200k sites, only the home pages. In the beginning the crawl works fine but the logs quickly fill up with the following errors:
2017-08-29 11:18:55,131 - scrapy.core.scraper - ERROR - Error downloading <GET http://axo-suit.eu>
Traceback (most recent call last):
File "venv/lib/python3.6/site-packages/twisted/internet/defer.py", line 1384, in _inlineCallbacks
result = result.throwExceptionIntoGenerator(g)
File "venv/lib/python3.6/site-packages/twisted/python/failure.py", line 393, in throwExceptionIntoGenerator
return g.throw(self.type, self.value, self.tb)
File "venv/lib/python3.6/site-packages/scrapy/core/downloader/middleware.py", line 43, in process_request
defer.returnValue((yield download_func(request=request,spider=spider)))
twisted.internet.error.ConnectError: An error occurred while connecting: [Failure instance: Traceback: <class 'ValueError'>: filedescriptor out of range in select()
venv/lib/python3.6/site-packages/twisted/internet/base.py:1243:run
venv/lib/python3.6/site-packages/twisted/internet/base.py:1255:mainLoop
venv/lib/python3.6/site-packages/twisted/internet/selectreactor.py:106:doSelect
venv/lib/python3.6/site-packages/twisted/internet/selectreactor.py:88:_preenDescriptors
--- <exception caught here> ---
venv/lib/python3.6/site-packages/twisted/internet/selectreactor.py:85:_preenDescriptors
].
lsof shows that the process indeed has >1024 open network connections, which I believe is the limit for select().
I set CONCURRENT_REQUESTS = 100 and REACTOR_THREADPOOL_MAXSIZE = 20 based on https://doc.scrapy.org/en/latest/topics/broad-crawls.html.
Not sure how the crawl ends up with so many open connections. Maybe it's leaking filedescriptors somewhere?
I'm using:
Python 3.6.2
Scrapy 1.4.0
Twisted 17.5.0
macOS Sierra 10.12.6
Happy to provide more info as needed. | null | https://github.com/scrapy/scrapy/pull/4294 | null | {'base_commit': '47b9de93a9c7a514f4007439335facd8ea82a12d', 'files': [{'path': 'docs/faq.rst', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [363]}}}, {'path': 'docs/topics/broad-crawls.rst', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [213]}}}, {'path': 'docs/topics/settings.rst', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [1465], 'mod': [163, 165, 166, 168, 170, 172, 173, 174, 175, 176, 177, 178, 180, 181, 182]}}}, {'path': 'pytest.ini', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [23], 'mod': [112, 132, 138]}}}, {'path': 'scrapy/crawler.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [25], 'mod': [16]}, "('CrawlerRunner', '__init__', 133)": {'mod': [141]}, "('CrawlerRunner', None, 114)": {'mod': [235, 236, 237, 238]}, "('CrawlerProcess', None, 241)": {'mod': [327, 328, 329, 330]}}}, {'path': 'scrapy/settings/default_settings.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [293], 'mod': [22]}}}, {'path': 'scrapy/utils/asyncio.py', 'status': 'removed', 'Loc': {}}, {'path': 'scrapy/utils/defer.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [6], 'mod': [5, 12]}}}, {'path': 'scrapy/utils/log.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [4, 9], 'mod': [3, 8, 12, 14]}, "(None, 'log_scrapy_info', 145)": {'mod': [152, 153]}}}, {'path': 'scrapy/utils/reactor.py', 'status': 'modified', 'Loc': {"('CallLaterOnce', '__call__', 42)": {'add': [44]}, '(None, None, None)': {'mod': [1]}}}, {'path': 'tests/CrawlerProcess/asyncio_enabled_no_reactor.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [13]}}}, {'path': 'tests/CrawlerProcess/asyncio_enabled_reactor.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [18]}}}, {'path': 'tests/test_commands.py', 'status': 'modified', 'Loc': {"('RunSpiderCommandTest', 
'test_asyncio_enabled_true', 298)": {'mod': [299, 300]}, "('RunSpiderCommandTest', 'test_asyncio_enabled_false', 302)": {'mod': [303, 304]}}}, {'path': 'tests/test_crawler.py', 'status': 'modified', 'Loc': {"('CrawlerProcessSubprocess', 'test_ipv6_alternative_name_resolver', 315)": {'add': [325]}, "('CrawlerRunnerHasSpider', 'test_crawler_runner_asyncio_enabled_true', 255)": {'mod': [257, 259, 261]}, "('CrawlerRunnerHasSpider', 'test_crawler_process_asyncio_enabled_true', 264)": {'mod': [267, 269, 271, 273]}, "('CrawlerRunnerHasSpider', 'test_crawler_process_asyncio_enabled_false', 276)": {'mod': [277, 280]}, "('CrawlerProcessSubprocess', 'test_simple', 294)": {'mod': [297]}, "('CrawlerProcessSubprocess', 'test_asyncio_enabled_no_reactor', 299)": {'mod': [302]}, "('CrawlerProcessSubprocess', 'test_asyncio_enabled_reactor', 304)": {'mod': [307]}}}, {'path': 'tests/test_utils_asyncio.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [5]}, "('AsyncioTest', 'test_install_asyncio_reactor', 15)": {'mod': [17]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"scrapy/utils/reactor.py",
"scrapy/crawler.py",
"scrapy/utils/log.py",
"tests/CrawlerProcess/asyncio_enabled_no_reactor.py",
"scrapy/utils/defer.py",
"scrapy/utils/asyncio.py",
"tests/CrawlerProcess/asyncio_enabled_reactor.py",
"scrapy/settings/default_settings.py"
],
"doc": [
"docs/topics/broad-crawls.rst",
"docs/topics/settings.rst",
"docs/faq.rst"
],
"test": [
"tests/test_utils_asyncio.py",
"tests/test_crawler.py",
"tests/test_commands.py"
],
"config": [
"pytest.ini"
],
"asset": []
} | 1 |
pallets | flask | fb89745408cc02515815c792355c7e883b2d08a4 | https://github.com/pallets/flask/issues/4602 | Flask.auto_find_instance_path() can return wrong path for namespace packages installed in development mode | https://github.com/pallets/flask/blob/bd56d19b167822a9a23e2e9e2a07ccccc36baa8d/src/flask/scaffold.py#L798
If there are several packages under the same namespace, all installed in development mode, like:
```
~/namespace-package1/
namespace/
package1/
__init__.py
app.py
instance/
~/namespace-package2/
namespace/
package2/
__init__.py
app.py
instance/
```
and the code in `namespace.package2` uses `app.instance_path`, then its expected value is `~/namespace-package2/instance` (["Uninstalled package" decision path](https://flask.palletsprojects.com/en/2.1.x/config/#instance-folders)).
Instead of that the following happens:
* `find_package()` [cuts import info](https://github.com/pallets/flask/blob/bd56d19b167822a9a23e2e9e2a07ccccc36baa8d/src/flask/scaffold.py#L846) to the very top package name, `namespace`,
* then `_find_package_path()` finds module specification for the whole namespace package, which contains several submodule search locations, like `ModuleSpec(name='namespace', loader=<_frozen_importlib_external._NamespaceLoader object at ...>, submodule_search_locations=_NamespacePath(['~/namespace-package1/namespace', '~/namespace-package2/namespace']))`
* and then the quoted line returns first, i.e. _arbitrary_, package from that namespace, e.g. `~/namespace-package1`, which produces wrong instance path.
Suggestion: pass also `import_name` into `_find_package_path` and use it for resolving ambiguity at this point, like:
```
def _find_package_path(root_mod_name, import_name):
...
if spec.origin in {"namespace", None}:
package_spec = importlib.util.find_spec(import_name)
package_path = os.path.commonpath(package_spec.submodule_search_locations)
return os.path.dirname(next(
location for location in spec.submodule_search_locations
if package_path.startswith(location)
))
``` | null | https://github.com/pallets/flask/pull/4610 | null | {'base_commit': 'fb89745408cc02515815c792355c7e883b2d08a4', 'files': [{'path': 'CHANGES.rst', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [10]}}}, {'path': 'src/flask/scaffold.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [2]}, "(None, '_find_package_path', 783)": {'add': [784], 'mod': [783, 786, 788, 794, 799, 800, 802, 803, 806]}, "(None, 'find_package', 835)": {'mod': [848, 849, 853]}}}, {'path': 'tests/test_instance_config.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [61], 'mod': [1, 18, 19, 20, 21, 22, 24, 26, 27, 30, 45]}}}, {'path': 'tox.ini', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [11]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"src/flask/scaffold.py"
],
"doc": [
"CHANGES.rst"
],
"test": [
"tests/test_instance_config.py"
],
"config": [
"tox.ini"
],
"asset": []
} | 1 | |
huggingface | transformers | 0ee71188ff184ee5f8b70081665858301fe4afb1 | https://github.com/huggingface/transformers/issues/20395 | some tokenizer(s) don't save the updated attributes | ### System Info
transformers version: 4.25.0.dev0
Torch version: 1.13.0+cpu
Cuda available: False
Cuda version: None
CuDNN version: None
Number of GPUs available: 0
### Description
For `GPT2Tokenizer(Fast)`, Set `tokenizer.model_max_length` to `128` (originally `1024`), save it then reload, will give `tokenizer.model_max_length` being `1024`.
### Reproduction
```python
from transformers import GPT2Tokenizer, GPT2TokenizerFast
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
print(tokenizer.model_max_length)
tokenizer.model_max_length = 128
print(tokenizer.model_max_length)
tokenizer.save_pretrained("my-gpt2")
tokenizer_loaded = GPT2TokenizerFast.from_pretrained("my-gpt2")
print(tokenizer_loaded.model_max_length)
```
The output is
```bash
1024
128
1024
```
### Expected behavior
`tokenizer_loaded.model_max_length` should be `128` in the above example. In general, the updated attribute(s) should be saved. | null | https://github.com/huggingface/transformers/pull/20401 | null | {'base_commit': '0ee71188ff184ee5f8b70081665858301fe4afb1', 'files': [{'path': 'src/transformers/tokenization_utils_base.py', 'status': 'modified', 'Loc': {"('PreTrainedTokenizerBase', 'save_pretrained', 2022)": {'add': [2084]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"src/transformers/tokenization_utils_base.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | 1 | |
AntonOsika | gpt-engineer | c6dd5237428895c0ba6cda40e3b2b95012276a05 | https://github.com/AntonOsika/gpt-engineer/issues/928 | bug
triage | KeyError in apply_edits breaking improve mode | I am running improve mode, creating c# and xaml. GPT Engineer is attempting to make updates to a xaml user control (here renamed to be "myExistingUserControl.xaml") and running into an issue where the filepath is invalid.
These edits will ensure that the code changes are in the correct format and can be found in the code.

```
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "C:\Users\asdf\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\Scripts\gpte.exe\__main__.py", line 7, in <module>
sys.exit(app())
^^^^^
File "C:\Users\asdf\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\gpt_engineer\applications\cli\main.py", line 194, in main
files_dict = agent.improve(files_dict, prompt)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\asdf\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\gpt_engineer\applications\cli\cli_agent.py", line 131, in improve
files_dict = self.improve_fn(
^^^^^^^^^^^^^^^^
File "C:\Users\asdf\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\gpt_engineer\core\default\steps.py", line 182, in improve
overwrite_code_with_edits(chat, files_dict)
File "C:\Users\asdf\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\gpt_engineer\core\chat_to_files.py", line 97, in overwrite_code_with_edits
apply_edits(edits, files_dict)
File "C:\Users\asdf\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\gpt_engineer\core\chat_to_files.py", line 185, in apply_edits
occurrences_cnt = files_dict[filename].count(edit.before)
~~~~~~~~~~^^^^^^^^^^
KeyError: 'some/dir/myExistingUserControl.xaml'
```
"iss_type": "1",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [],
"doc": [],
"test": [
"tests/core/test_chat_to_files.py"
],
"config": [],
"asset": [
"gpt_engineer/preprompts/improve"
]
} | 1 |
ultralytics | yolov5 | 77415a42e5975ea356393c9f1d5cff0ae8acae2c | https://github.com/ultralytics/yolov5/issues/2446 | enhancement | Images in MPO Format are considered corrupted | I am using images taken by a DJI drone. These images are deemed corrupted by the dataset loader, and are thus not used.
This happens because in datasets.py the `im.format` is checked against a list of formats that doesn't contain "mpo".
If I add that entry manually everything works as expected.
MPO is a container format, that can contain any of the valid formats.
## 🐛 Bug
Images that report "MPO" as PIL.Image.format are deemed corrupted.
## To Reproduce (REQUIRED)
Try to load MPO images.

I'm not sure whether GitHub tampers with the image. If necessary I can upload it somewhere else.
## Expected behavior
Images should be considered valid.
| null | https://github.com/ultralytics/yolov5/pull/2615 | null | {'base_commit': '77415a42e5975ea356393c9f1d5cff0ae8acae2c', 'files': [{'path': 'utils/datasets.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [29]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"utils/datasets.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | 1 |
scikit-learn | scikit-learn | 0bbd57b322aaa5aeca4f3af2dd7f802360d29673 | https://github.com/scikit-learn/scikit-learn/issues/2190 | Bug | crash in MeanShift tests after make cython (edited from k_means) | The crash:
```
[erg@pliny scikit-learn]$ [master*] nosetests -v
/home/erg/python/scikit-learn/sklearn/feature_selection/selector_mixin.py:7: DeprecationWarning: sklearn.feature_selection.selector_mixin.SelectorMixin has been renamed sklearn.feature_selection.from_model._LearntSelectorMixin, and this alias will be removed in version 0.16
DeprecationWarning)
Affinity Propagation algorithm ... ok
Tests the DBSCAN algorithm with a similarity array. ... ok
Tests the DBSCAN algorithm with a feature vector array. ... ok
Tests the DBSCAN algorithm with a callable metric. ... ok
sklearn.cluster.tests.test_dbscan.test_pickle ... ok
Check that we obtain the correct solution for structured ward tree. ... ok
Check that we obtain the correct solution for unstructured ward tree. ... ok
Check that the height of ward tree is sorted. ... ok
Check that we obtain the correct number of clusters with Ward clustering. ... ok
Check that we obtain the correct solution in a simplistic case ... ok
Test scikit ward with full connectivity (i.e. unstructured) vs scipy ... ok
Check that connectivity in the ward tree is propagated correctly during ... ok
Check non regression of a bug if a non item assignable connectivity is ... ok
sklearn.cluster.tests.test_k_means.test_square_norms ... ok
sklearn.cluster.tests.test_k_means.test_kmeans_dtype ... ok
sklearn.cluster.tests.test_k_means.test_labels_assignment_and_inertia ... ok
Check that dense and sparse minibatch update give the same results ... ok
sklearn.cluster.tests.test_k_means.test_k_means_plus_plus_init ... ok
sklearn.cluster.tests.test_k_means.test_k_means_check_fitted ... ok
sklearn.cluster.tests.test_k_means.test_k_means_new_centers ... ok
sklearn.cluster.tests.test_k_means.test_k_means_plus_plus_init_2_jobs ... ok
sklearn.cluster.tests.test_k_means.test_k_means_plus_plus_init_sparse ... ok
sklearn.cluster.tests.test_k_means.test_k_means_random_init ... ok
sklearn.cluster.tests.test_k_means.test_k_means_random_init_sparse ... ok
sklearn.cluster.tests.test_k_means.test_k_means_plus_plus_init_not_precomputed ... ok
sklearn.cluster.tests.test_k_means.test_k_means_random_init_not_precomputed ... ok
sklearn.cluster.tests.test_k_means.test_k_means_perfect_init ... ok
sklearn.cluster.tests.test_k_means.test_mb_k_means_plus_plus_init_dense_array ... ok
sklearn.cluster.tests.test_k_means.test_mb_kmeans_verbose ... ok
sklearn.cluster.tests.test_k_means.test_mb_k_means_plus_plus_init_sparse_matrix ... ok
sklearn.cluster.tests.test_k_means.test_minibatch_init_with_large_k ... ok
sklearn.cluster.tests.test_k_means.test_minibatch_k_means_random_init_dense_array ... ok
sklearn.cluster.tests.test_k_means.test_minibatch_k_means_random_init_sparse_csr ... ok
sklearn.cluster.tests.test_k_means.test_minibatch_k_means_perfect_init_dense_array ... ok
sklearn.cluster.tests.test_k_means.test_minibatch_k_means_perfect_init_sparse_csr ... ok
sklearn.cluster.tests.test_k_means.test_minibatch_reassign ... ok
sklearn.cluster.tests.test_k_means.test_sparse_mb_k_means_callable_init ... ok
sklearn.cluster.tests.test_k_means.test_mini_batch_k_means_random_init_partial_fit ... ok
sklearn.cluster.tests.test_k_means.test_minibatch_default_init_size ... ok
sklearn.cluster.tests.test_k_means.test_minibatch_tol ... ok
sklearn.cluster.tests.test_k_means.test_minibatch_set_init_size ... ok
sklearn.cluster.tests.test_k_means.test_k_means_invalid_init ... ok
sklearn.cluster.tests.test_k_means.test_mini_match_k_means_invalid_init ... ok
Check if copy_x=False returns nearly equal X after de-centering. ... ok
Check k_means with a bad initialization does not yield a singleton ... ok
sklearn.cluster.tests.test_k_means.test_predict ... ok
sklearn.cluster.tests.test_k_means.test_score ... ok
sklearn.cluster.tests.test_k_means.test_predict_minibatch_dense_input ... ok
sklearn.cluster.tests.test_k_means.test_predict_minibatch_kmeanspp_init_sparse_input ... ok
sklearn.cluster.tests.test_k_means.test_predict_minibatch_random_init_sparse_input ... ok
sklearn.cluster.tests.test_k_means.test_input_dtypes ... ok
sklearn.cluster.tests.test_k_means.test_transform ... ok
sklearn.cluster.tests.test_k_means.test_fit_transform ... ok
Check that increasing the number of init increases the quality ... ok
sklearn.cluster.tests.test_k_means.test_k_means_function ... ok
Test MeanShift algorithm ... Segmentation fault (core dumped)
```
Some related warnings?
```
[erg@pliny ~]$ cython --version
Cython version 0.19.1
[erg@pliny scikit-learn]$ [master*] make cython
find sklearn -name "*.pyx" | xargs cython
warning: sklearn/neighbors/binary_tree.pxi:1199:20: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1257:48: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1258:46: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1260:45: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1345:20: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1355:42: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1357:36: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1398:59: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1400:46: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1401:48: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1403:45: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1491:20: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1544:64: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1589:20: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1199:20: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1257:48: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1258:46: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1260:45: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1345:20: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1355:42: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1357:36: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1398:59: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1400:46: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1401:48: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1403:45: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1491:20: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1544:64: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
warning: sklearn/neighbors/binary_tree.pxi:1589:20: the result of using negative indices inside of code sections marked as 'wraparound=False' is undefined
```
| null | https://github.com/scikit-learn/scikit-learn/pull/2230 | null | {'base_commit': '0bbd57b322aaa5aeca4f3af2dd7f802360d29673', 'files': [{'path': 'sklearn/neighbors/binary_tree.pxi', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [1199, 1257, 1258, 1260, 1345, 1355, 1357, 1398, 1400, 1401, 1403, 1491, 1544, 1589]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"sklearn/neighbors/binary_tree.pxi"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | 1 |
Significant-Gravitas | AutoGPT | 4c2a566acc37c8d95b07c023f8c52a1a2a5d15bf | https://github.com/Significant-Gravitas/AutoGPT/issues/2186 | bug
needs investigation
API access | Azure support broken? | ### ⚠️ Search for existing issues first ⚠️
- [X] I have searched the existing issues, and there is no existing issue for my problem
### GPT-3 or GPT-4
- [ ] I am using Auto-GPT with GPT-3 (GPT-3.5)
### Steps to reproduce 🕹
```yaml
azure.yaml:
azure_api_type: azure
azure_api_base: https://test.openai.azure.com/
azure_api_version: 2023-03-15-preview
azure_model_map:
fast_llm_model_deployment_id: "gpt-35-turbo"
smart_llm_model_deployment_id: "gpt-4"
embedding_model_deployment_id: "emb-ada"
```
### Current behavior 😯
When I run "python -m autogpt", it just breaks
Welcome back! Would you like me to return to being Entrepreneur-GPT?
Continue with the last settings?
Name: Entrepreneur-GPT
Role: an AI designed to autonomously develop and run businesses with the
Goals: ['Increase net worth', 'Grow Twitter Account', 'Develop and manage multiple businesses autonomously']
Continue (y/n): y
Using memory of type: LocalCache
Using Browser: chrome
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/data/Auto-GPT/autogpt/__main__.py", line 50, in <module>
main()
File "/data/Auto-GPT/autogpt/__main__.py", line 46, in main
agent.start_interaction_loop()
File "/data/Auto-GPT/autogpt/agent/agent.py", line 75, in start_interaction_loop
assistant_reply = chat_with_ai(
^^^^^^^^^^^^^
File "/data/Auto-GPT/autogpt/chat.py", line 159, in chat_with_ai
assistant_reply = create_chat_completion(
^^^^^^^^^^^^^^^^^^^^^^^
File "/data/Auto-GPT/autogpt/llm_utils.py", line 84, in create_chat_completion
deployment_id=CFG.get_azure_deployment_id_for_model(model),
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/data/Auto-GPT/autogpt/config/config.py", line 120, in get_azure_deployment_id_for_model
return self.azure_model_to_deployment_id_map[
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: list indices must be integers or slices, not str
### Expected behavior 🤔
It should work well.
### Your prompt 📝
```yaml
# Paste your prompt here
```
| null | https://github.com/Significant-Gravitas/AutoGPT/pull/2351 | null | {'base_commit': '4c2a566acc37c8d95b07c023f8c52a1a2a5d15bf', 'files': [{'path': 'autogpt/config/config.py', 'status': 'modified', 'Loc': {"('Config', 'load_azure_config', 136)": {'mod': [157]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"autogpt/config/config.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | 1 |
nvbn | thefuck | f966ecd4f5b8221ee15e843f5ec287e1f7cca940 | https://github.com/nvbn/thefuck/issues/740 | wrong suggestion with git push --set-upstream | Thefuck is incorrectly adding the remote name at the end of the command suggestion:
```
$ git push myfork
fatal: The current branch test-branch has no upstream branch.
To push the current branch and set the remote as upstream, use
git push --set-upstream myfork test-branch
$ fuck
git push --set-upstream myfork test-branch myfork [enter/↑/↓/ctrl+c]
error: src refspec myfork does not match any.
error: failed to push some refs to 'git@github.com:waldyrious/project-foo.git'
``` | null | https://github.com/nvbn/thefuck/pull/745 | null | {'base_commit': 'f966ecd4f5b8221ee15e843f5ec287e1f7cca940', 'files': [{'path': 'tests/rules/test_git_push.py', 'status': 'modified', 'Loc': {"(None, 'test_get_new_command', 23)": {'add': [25]}}}, {'path': 'thefuck/rules/git_push.py', 'status': 'modified', 'Loc': {"(None, 'get_new_command', 22)": {'add': [34]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [
"thefuck/rules/git_push.py"
],
"doc": [],
"test": [
"tests/rules/test_git_push.py"
],
"config": [],
"asset": []
} | 1 | |
huggingface | transformers | e4b234834a79541f31be227aadce13f5aafda85a | https://github.com/huggingface/transformers/issues/16497 | WIP | [TODO] Investigate equivalence tests | **(add a lot of assignees just to make you informed and kept updated in the future. Don't hesitate to remove yourself if you think it's irrelevant)**
Currently the PT/TF/Flax equivalence tests use `1e-5` as the tolerance for the absolute differences of outputs.
We see that these tests fail with a non-negligible (although not carefully defined) frequency.
Create this page to track a list of models to investigate.
- **FlaxWav2Vec2ModelTest** (2.2888184e-05 > 1e-5)
- https://app.circleci.com/pipelines/github/huggingface/transformers/37363/workflows/a4b06424-0ba8-4fbc-9054-6ff52fbf8145/jobs/411654
- **TFGPT2EncoderDecoderModelTest** (0.001009281724691391 > 1e-3)
- https://app.circleci.com/pipelines/github/huggingface/transformers/37358/workflows/43c12161-33d8-4df5-ba3c-3e62a4507ee7/jobs/411579
- This also happens to **TFBERTEncoderDecoderModelTest**
- This is caused by some sequence in a batch which gets all 0s as attention mask (generated by ids_tensor) - may happens on both encoder and decoder (especially after combining with the causal mask).
- For **TFBERTEncoderDecoderModelTest**, the difference is smaller than *TFGPT2EncoderDecoderModelTest* (by a magnitude of 5x~10x) -> this is due to the last hidden states in GPT2 is after layer norm (not the case for BERT).
- If we look the cross attention diff between PT/TF, it is clear that we have the same issue (both in the magnitude of `1e-3`)
- The encoder attention diff between PT/TF is in the magnitude of `5e-8`: ~~**not very sure why this doesn't get much larger**~~.
- This is because PT/TF (at least in BERT) has different `encoder_extended_attention_mask`: `1e-4` vs `1e-9`.
- **TFViTMAEModelTest** (1.013279e-05 > 1e-5)
- https://app.circleci.com/pipelines/github/huggingface/transformers/37319/workflows/5adfba7a-d12b-4e1e-9a7a-e33c7d5fd6ee/jobs/411002 | null | https://github.com/huggingface/transformers/pull/16517 | null | {'base_commit': 'e4b234834a79541f31be227aadce13f5aafda85a', 'files': [{'path': 'templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/test_modeling_tf_{{cookiecutter.lowercase_modelname}}.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [24]}, "(None, 'prepare_config_and_inputs', 90)": {'mod': [95]}}}, {'path': 'tests/albert/test_modeling_tf_albert.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [24]}, "('TFAlbertModelTester', 'prepare_config_and_inputs', 94)": {'mod': [99]}}}, {'path': 'tests/bert/test_modeling_tf_bert.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [24]}, "('TFBertModelTester', 'prepare_config_and_inputs', 94)": {'mod': [99]}}}, {'path': 'tests/clip/test_modeling_tf_clip.py', 'status': 'modified', 'Loc': {"('TFCLIPTextModelTester', 'prepare_config_and_inputs', 298)": {'add': [303]}}}, {'path': 'tests/convbert/test_modeling_tf_convbert.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [23]}, "('TFConvBertModelTester', 'prepare_config_and_inputs', 92)": {'mod': [97]}}}, {'path': 'tests/ctrl/test_modeling_tf_ctrl.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [23]}, "('TFCTRLModelTester', 'prepare_config_and_inputs', 67)": {'mod': [72]}}}, {'path': 'tests/deberta/test_modeling_tf_deberta.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [23]}, "('TFDebertaModelTester', 'prepare_config_and_inputs', 90)": {'mod': [95]}}}, {'path': 'tests/deberta_v2/test_modeling_tf_deberta_v2.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [23]}, "('TFDebertaV2ModelTester', 'prepare_config_and_inputs', 93)": {'mod': [98]}}}, {'path': 'tests/distilbert/test_modeling_tf_distilbert.py', 'status': 'modified', 'Loc': {'(None, None, 
None)': {'mod': [23]}, "('TFDistilBertModelTester', 'prepare_config_and_inputs', 68)": {'mod': [73]}}}, {'path': 'tests/dpr/test_modeling_tf_dpr.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [22]}, "('TFDPRModelTester', 'prepare_config_and_inputs', 92)": {'mod': [97, 98, 99]}}}, {'path': 'tests/electra/test_modeling_tf_electra.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [23]}, "('TFElectraModelTester', 'prepare_config_and_inputs', 69)": {'mod': [74]}}}, {'path': 'tests/flaubert/test_modeling_tf_flaubert.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [22]}, "('TFFlaubertModelTester', 'prepare_config_and_inputs', 76)": {'mod': [78]}}}, {'path': 'tests/funnel/test_modeling_tf_funnel.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [23]}, "('TFFunnelModelTester', 'prepare_config_and_inputs', 109)": {'mod': [114]}}}, {'path': 'tests/gpt2/test_modeling_tf_gpt2.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [22]}, "('TFGPT2ModelTester', 'prepare_config_and_inputs', 72)": {'mod': [77]}}}, {'path': 'tests/gptj/test_modeling_tf_gptj.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [23]}, "('TFGPTJModelTester', 'prepare_config_and_inputs', 68)": {'mod': [73]}}}, {'path': 'tests/layoutlm/test_modeling_tf_layoutlm.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [24]}, "('TFLayoutLMModelTester', 'prepare_config_and_inputs', 90)": {'mod': [110]}}}, {'path': 'tests/longformer/test_modeling_tf_longformer.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [23]}, "('TFLongformerModelTester', 'prepare_config_and_inputs', 77)": {'mod': [82]}}}, {'path': 'tests/lxmert/test_modeling_tf_lxmert.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [26]}, "('TFLxmertModelTester', 'prepare_config_and_inputs', 119)": {'mod': [127]}}}, {'path': 'tests/mobilebert/test_modeling_tf_mobilebert.py', 'status': 'modified', 'Loc': {'(None, None, None)': 
{'mod': [23]}, "('TFMobileBertModelTester', 'prepare_config_and_inputs', 112)": {'mod': [117]}}}, {'path': 'tests/mpnet/test_modeling_tf_mpnet.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [23]}, "('TFMPNetModelTester', 'prepare_config_and_inputs', 88)": {'mod': [93]}}}, {'path': 'tests/openai/test_modeling_tf_openai.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [23]}, "('TFOpenAIGPTModelTester', 'prepare_config_and_inputs', 68)": {'mod': [73]}}}, {'path': 'tests/rembert/test_modeling_tf_rembert.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [23]}, "('TFRemBertModelTester', 'prepare_config_and_inputs', 93)": {'mod': [98]}}}, {'path': 'tests/roberta/test_modeling_tf_roberta.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [23]}, "('TFRobertaModelTester', 'prepare_config_and_inputs', 70)": {'mod': [75]}}}, {'path': 'tests/roformer/test_modeling_tf_roformer.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [23]}, "('TFRoFormerModelTester', 'prepare_config_and_inputs', 93)": {'mod': [98]}}}, {'path': 'tests/t5/test_modeling_tf_t5.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [23]}, "('TFT5ModelTester', 'prepare_config_and_inputs', 56)": {'mod': [61]}}}, {'path': 'tests/tapas/test_modeling_tf_tapas.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [41]}, "('TFTapasModelTester', 'prepare_config_and_inputs', 156)": {'mod': [161]}}}, {'path': 'tests/test_modeling_tf_common.py', 'status': 'modified', 'Loc': {"(None, 'random_attention_mask', 1440)": {'mod': [1443]}}}, {'path': 'tests/xlm/test_modeling_tf_xlm.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [23]}, "('TFXLMModelTester', 'prepare_config_and_inputs', 76)": {'mod': [78]}}}, {'path': 'tests/xlnet/test_modeling_tf_xlnet.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [25]}, "('TFXLNetModelTester', 'prepare_config_and_inputs', 74)": {'mod': [78]}}}]} | [] | [] | [] 
| {
"iss_type": "4",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "",
"info_type": ""
} | {
"code": [],
"doc": [],
"test": [
"tests/test_modeling_tf_common.py",
"templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/test_modeling_tf_{{cookiecutter.lowercase_modelname}}.py",
"tests/openai/test_modeling_tf_openai.py",
"tests/funnel/test_modeling_tf_funnel.py",
"tests/convbert/test_modeling_tf_convbert.py",
"tests/bert/test_modeling_tf_bert.py",
"tests/roformer/test_modeling_tf_roformer.py",
"tests/t5/test_modeling_tf_t5.py",
"tests/lxmert/test_modeling_tf_lxmert.py",
"tests/mpnet/test_modeling_tf_mpnet.py",
"tests/rembert/test_modeling_tf_rembert.py",
"tests/layoutlm/test_modeling_tf_layoutlm.py",
"tests/dpr/test_modeling_tf_dpr.py",
"tests/gptj/test_modeling_tf_gptj.py",
"tests/roberta/test_modeling_tf_roberta.py",
"tests/flaubert/test_modeling_tf_flaubert.py",
"tests/clip/test_modeling_tf_clip.py",
"tests/tapas/test_modeling_tf_tapas.py",
"tests/deberta/test_modeling_tf_deberta.py",
"tests/electra/test_modeling_tf_electra.py",
"tests/gpt2/test_modeling_tf_gpt2.py",
"tests/xlm/test_modeling_tf_xlm.py",
"tests/longformer/test_modeling_tf_longformer.py",
"tests/deberta_v2/test_modeling_tf_deberta_v2.py",
"tests/distilbert/test_modeling_tf_distilbert.py",
"tests/albert/test_modeling_tf_albert.py",
"tests/xlnet/test_modeling_tf_xlnet.py",
"tests/mobilebert/test_modeling_tf_mobilebert.py",
"tests/ctrl/test_modeling_tf_ctrl.py"
],
"config": [],
"asset": []
} | 1 |
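Per the Loc data above, the only functional change in the linked PR lands in `random_attention_mask` in `tests/test_modeling_tf_common.py`; the per-model test files appear to just pick up the shared helper. A pure-Python sketch of the invariant that change plausibly enforces — every sampled attention mask attends to at least one token, so a degenerate all-zero mask can't push model outputs past the 1e-5 tolerance — is below (the real helper builds a TF tensor; this function signature and the "force the last position" choice are illustrative assumptions):

```python
import random

def random_attention_mask(batch_size, seq_len, rng=random):
    """Sample a random 0/1 attention mask, but guarantee that every row
    attends to at least one position (here: the last one). Without this,
    a rare all-zero row makes attention softmax over nothing, which can
    surface as flaky tolerance failures like the TFViTMAE one above."""
    mask = [[rng.randint(0, 1) for _ in range(seq_len)]
            for _ in range(batch_size)]
    for row in mask:
        row[-1] = 1  # at least one attended token per sequence
    return mask
```

Forcing a fixed position keeps the sample space almost unchanged while removing the one degenerate case.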
pallets | flask | 01081dbe6cdfa3fc43d8e1fff708d4ed95e1be7e | https://github.com/pallets/flask/issues/1971 | Implement RFC 7233 | It would be great to support [RFC 7233 : Hypertext Transfer Protocol (HTTP/1.1): Range Requests](https://tools.ietf.org/html/rfc7233) for next major version, at least for non multipart/byteranges media type.
I'm willing to implement this, so please share your thoughts about this.
What must be done:
- Modify `send_file` method to support Range Requests
- Use the existing `conditional` parameter to enable Range Requests support?
| null | https://github.com/pallets/flask/pull/2031 | null | {'base_commit': '01081dbe6cdfa3fc43d8e1fff708d4ed95e1be7e', 'files': [{'path': 'CHANGES', 'status': 'modified', 'Loc': {'(None, None, 20)': {'add': [20]}}}, {'path': 'flask/helpers.py', 'status': 'modified', 'Loc': {"(None, 'send_file', 430)": {'add': [448, 502], 'mod': [538, 544, 578]}, '(None, None, None)': {'mod': [28, 29]}}}, {'path': 'tests/test_helpers.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [18]}, "('TestSendfile', None, 356)": {'add': [464]}}}]} | [] | [] | [] | {
"iss_type": "4",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"flask/helpers.py"
],
"doc": [
"CHANGES"
],
"test": [
"tests/test_helpers.py"
],
"config": [],
"asset": []
} | null | |
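For context on what `send_file` has to do once Range Requests are enabled, here is a standalone sketch of single-range RFC 7233 `Range` header parsing (not Flask's actual implementation; the helper name and the `(start, end)` return convention are invented for illustration, and multipart `byteranges` — which the reporter explicitly scopes out — are likewise skipped):

```python
def parse_byte_range(header, file_size):
    """Parse a single-range RFC 7233 'Range' header like 'bytes=0-499'.

    Returns an inclusive (start, end) pair, or None when the header is
    absent, malformed, multipart, or unsatisfiable (the None case is
    where a real server answers 416 Range Not Satisfiable).
    """
    if not header or not header.startswith("bytes="):
        return None
    spec = header[len("bytes="):]
    if "," in spec:  # multiple ranges: out of scope for this sketch
        return None
    first, _, last = spec.partition("-")
    if first == "":  # suffix form: 'bytes=-500' means the last 500 bytes
        length = int(last)
        if length == 0:
            return None
        return (max(file_size - length, 0), file_size - 1)
    start = int(first)
    if start >= file_size:
        return None
    end = int(last) if last else file_size - 1
    return (start, min(end, file_size - 1))
```

A 206 response would then serve `end - start + 1` bytes with a `Content-Range: bytes {start}-{end}/{file_size}` header.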
pallets | flask | 673e5af658cf029e82d87047dcb7ebee3d343d10 | https://github.com/pallets/flask/issues/2823 | Flask complains a .env file exists when not using python-dotenv, even though that .env is a directory | I place my virtualenvs in a `.env` directory in my project directory. Flask 1.x sees this directory and thinks it might be a "dotenv" file (even though it is a directory).
### Expected Behavior
`flask` should ignore a `.env` directory when `python-dotenv` is not installed.
### Actual Behavior
`flask` says:
> * Tip: There are .env files present. Do "pip install python-dotenv" to use them.
### Environment
* Python version: 3.6.5
* Flask version: 1.0.2
* Werkzeug version: 0.14.1
| null | https://github.com/pallets/flask/pull/2827 | null | {'base_commit': '673e5af658cf029e82d87047dcb7ebee3d343d10', 'files': [{'path': 'flask/cli.py', 'status': 'modified', 'Loc': {"(None, 'load_dotenv', 567)": {'mod': [587]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"flask/cli.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null | |
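The one-line fix in `flask/cli.py` amounts to requiring a regular file before counting a dotenv candidate, so a `.env` *directory* (a common virtualenv location) no longer triggers the tip. A minimal sketch of that check (the helper name is invented; Flask looks for `.env` and `.flaskenv`):

```python
from pathlib import Path

def dotenv_candidates(directory):
    """Return dotenv files present under `directory`, skipping anything
    that is not a regular file — e.g. a `.env` directory holding a
    virtualenv, the situation reported in this issue."""
    names = (".env", ".flaskenv")
    return [Path(directory) / name for name in names
            if (Path(directory) / name).is_file()]
```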
pallets | flask | 8e589daaf2cec6a10262b8ff88801127f2fa14fd | https://github.com/pallets/flask/issues/4220 | `template_filter` decorator typing does not support custom filters with multiple arguments | `template_filter` decorator typing does not support custom filters that take in multiple arguments. Consider:
```py
from flask import Flask
app = Flask(__name__)
@app.template_filter('foo_bar')
def foo_bar_filter(foo, bar):
return f'{foo} {bar}'
```
`mypy` will return the following error message:
```
error: Argument 1 has incompatible type "Callable[[Any, Any], Any]"; expected "Callable[[Any], str]" [arg-type]
```
As custom filters with multiple arguments are supported by Jinja (https://jinja.palletsprojects.com/en/3.0.x/api/#custom-filters), I think this typing error is a false positive.
Environment:
- Python version: 3.6.13
- Flask version: 2.0.1
- Mypy version: 0.812
| null | null | https://github.com/pallets/flask/commit/8e589daaf2cec6a10262b8ff88801127f2fa14fd | {'base_commit': '8e589daaf2cec6a10262b8ff88801127f2fa14fd', 'files': [{'path': 'CHANGES.rst', 'status': 'modified', 'Loc': {'(None, None, 10)': {'add': [10]}}}, {'path': 'src/flask/typing.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [43, 44, 45]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"src/flask/typing.py"
],
"doc": [
"CHANGES.rst"
],
"test": [],
"config": [],
"asset": []
} | null | |
sherlock-project | sherlock | a4df5010f49044eb1f1713057e8914e6a5a104b3 | https://github.com/sherlock-project/sherlock/issues/1073 | false positive | producthunt.com false positive |
## Checklist
- [X] I'm reporting a website that is returning **false positive** results
- [X] I've checked for similar site support requests including closed ones
- [X] I've checked for pull requests attempting to fix this false positive
- [X] I'm only reporting **one** site (create a separate issue for each site)
## Description
https://www.producthunt.com/@adasaaakzzzzzzzzsdsdsdasdadadasqe22aasd
| null | null | https://github.com/sherlock-project/sherlock/commit/a4df5010f49044eb1f1713057e8914e6a5a104b3 | {'base_commit': 'a4df5010f49044eb1f1713057e8914e6a5a104b3', 'files': [{'path': 'sherlock/resources/data.json', 'status': 'modified', 'Loc': {'(None, None, 1159)': {'mod': [1159]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "1",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"sherlock/resources/data.json"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null |
keras-team | keras | d2803c0fb7d0ba9361dcba8eb9bcebbf2f774958 | https://github.com/keras-team/keras/issues/11023 | Cannot load_model | Thank you!
- [ ] Check that you are up-to-date with the master branch of Keras. You can update with:
pip install git+git://github.com/keras-team/keras.git --upgrade --no-deps
- [x] If running on TensorFlow, check that you are up-to-date with the latest version. The installation instructions can be found [here](https://www.tensorflow.org/get_started/os_setup).
- [ ] If running on Theano, check that you are up-to-date with the master branch of Theano. You can update with:
pip install git+git://github.com/Theano/Theano.git --upgrade --no-deps
- [x] Provide a link to a GitHub Gist of a Python script that can reproduce your issue (or just copy the script here if it is short).
I am using Google Colab to train a CNN and then save the entire model to a `.h5` file. The code is available here: [CNN-Colab](https://gist.github.com/abhisheksoni27/184c49ca703eb124e1b17eb8dd8af518)
The model gets saved but when I later try to load it back, I get the following error:
```
TypeError: float() argument must be a string or a number, not 'dict'
```
The entire Output log is here: [CNN - Colab - Error](https://gist.github.com/abhisheksoni27/732bec240629d2dd721e80130cb2956b)
| null | https://github.com/keras-team/keras/pull/10727 | null | {'base_commit': 'd2803c0fb7d0ba9361dcba8eb9bcebbf2f774958', 'files': [{'path': 'keras/engine/saving.py', 'status': 'modified', 'Loc': {"(None, 'get_json_type', 61)": {'mod': [82, 83]}}}, {'path': 'tests/test_model_saving.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [14, 643]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"keras/engine/saving.py"
],
"doc": [],
"test": [
"tests/test_model_saving.py"
],
"config": [],
"asset": []
} | null | |
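The PR touches `get_json_type` in `keras/engine/saving.py`, the `default` hook Keras hands to `json.dumps` when serializing a model config. A rough stdlib-only analogue of such a hook (not the exact Keras code; `FakeNumpyScalar` is a stand-in for numpy scalar types, which expose `.item()`):

```python
import json

def get_json_type(obj):
    """json.dumps `default` hook: coerce non-JSON-native objects into
    plain Python values instead of letting serialization fail with the
    kind of TypeError quoted in this issue."""
    if hasattr(obj, "tolist"):   # array-likes (e.g. np.ndarray)
        return obj.tolist()
    if hasattr(obj, "item"):     # scalar wrappers (e.g. np.float32)
        return obj.item()
    if callable(obj):            # e.g. activation functions by name
        return obj.__name__
    raise TypeError("Not JSON serializable: %r" % (obj,))

class FakeNumpyScalar:
    """Stand-in for a numpy scalar: exposes .item() like np.float32."""
    def __init__(self, value):
        self.value = value
    def item(self):
        return self.value

encoded = json.dumps({"lr": FakeNumpyScalar(0.01)}, default=get_json_type)
```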
nvbn | thefuck | b28ece0f34e54d1c980e31223451f3b2f0f20ff9 | https://github.com/nvbn/thefuck/issues/1021 | Git checkout should provide multiple corrections | When correcting git checkout, the default is to use the 'closest branch'. We have a lot of branches with similar names, but quite often, what I actually meant to do was supply the '-b' flag.
Can the git checkout rule be updated to return all of the possible options, rather than trying to guess based on some arbitrary priority?
| null | https://github.com/nvbn/thefuck/pull/1022 | null | {'base_commit': 'b28ece0f34e54d1c980e31223451f3b2f0f20ff9', 'files': [{'path': 'tests/rules/test_git_checkout.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [59, 62, 66, 70]}}}, {'path': 'thefuck/rules/git_checkout.py', 'status': 'modified', 'Loc': {"(None, 'get_new_command', 31)": {'add': [36], 'mod': [38, 39, 40, 41, 42, 43]}}}]} | [] | [] | [] | {
"iss_type": "4",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"thefuck/rules/git_checkout.py"
],
"doc": [],
"test": [
"tests/rules/test_git_checkout.py"
],
"config": [],
"asset": []
} | null | |
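The merged PR rewrites `get_new_command` in `thefuck/rules/git_checkout.py` to return several candidates instead of a single guess. A self-contained sketch of that idea (function and branch names are illustrative; the real rule parses git's stderr rather than taking a branch list as an argument):

```python
import difflib

def get_new_commands(script, branches):
    """For a failed `git checkout <name>`, offer every plausible fix:
    each close-matching existing branch, plus creating the branch with
    `-b` — the option the reporter usually wanted."""
    missing = script.split()[-1]
    close = difflib.get_close_matches(missing, branches)
    candidates = [script.replace(missing, branch, 1) for branch in close]
    candidates.append(script.replace("checkout", "checkout -b", 1))
    return candidates
```

Returning a list lets thefuck cycle through the options interactively instead of committing to one priority.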
nvbn | thefuck | 2d81166213c403dce5c04d1fb73ba5d3e57d6676 | https://github.com/nvbn/thefuck/issues/660 | Slow execution time | The command output is very slow on macOS w/ fish shell. Reproduction rate is ~80% for me.
Version: The Fuck 3.18 using Python 2.7.10
Shell: fish, version 2.6.0
OS: macOS 10.12.5
Debug Output:
```
❯ fuck 333ms
DEBUG: Run with settings: {'alter_history': True,
'debug': True,
'env': {'GIT_TRACE': '1', 'LANG': 'C', 'LC_ALL': 'C'},
'exclude_rules': [],
'history_limit': None,
'no_colors': False,
'priority': {},
'repeat': False,
'require_confirmation': True,
'rules': [<const: All rules enabled>],
'slow_commands': ['lein', 'react-native', 'gradle', './gradlew', 'vagrant'],
'user_dir': PosixPath('/Users/sbennett/.config/thefuck'),
'wait_command': 3,
'wait_slow_command': 15}
DEBUG: Execution timed out!
DEBUG: Call: fish -ic "fuck"; with env: {'PYTHONIOENCODING': 'utf-8', 'VERSIONER_PYTHON_PREFER_32_BIT': 'no', 'TERM_PROGRAM_VERSION': '3.0.15', 'LOGNAME': 'sbennett', 'USER': 'sbennett', 'HOME': '/Users/sbennett', 'PATH': '/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin', 'TERM_PROGRAM': 'iTerm.app', 'LANG': 'C', 'THEFUCK_DEBUG': 'true', 'TERM': 'xterm-256color', 'Apple_PubSub_Socket_Render': '/private/tmp/com.apple.launchd.1eq3gwtm7Y/Render', 'COLORFGBG': '15;0', 'VERSIONER_PYTHON_VERSION': '2.7', 'SHLVL': '2', 'XPC_FLAGS': '0x0', 'ITERM_SESSION_ID': 'w1t1p0:E781FA41-C385-4CCE-A9E0-EBF190B3D246', 'TERM_SESSION_ID': 'w1t1p0:E781FA41-C385-4CCE-A9E0-EBF190B3D246', 'SSH_AUTH_SOCK': '/private/tmp/com.apple.launchd.leMomVKppy/Listeners', 'TF_ALIAS': 'fuck', 'XPC_SERVICE_NAME': '0', 'SHELL': '/usr/local/bin/fish', 'ITERM_PROFILE': 'Default', 'LC_ALL': 'C', 'TMPDIR': '/var/folders/0s/c0f2hl495352w24847p7ybwm35h1r_/T/', 'GIT_TRACE': '1', '__CF_USER_TEXT_ENCODING': '0x658070A:0x0:0x0', 'PWD': '/Users/sbennett'}; is slow: took: 0:00:03.018166
DEBUG: Importing rule: ag_literal; took: 0:00:00.000511
DEBUG: Importing rule: apt_get; took: 0:00:00.000571
DEBUG: Importing rule: apt_get_search; took: 0:00:00.000224
DEBUG: Importing rule: apt_invalid_operation; took: 0:00:00.000715
DEBUG: Importing rule: aws_cli; took: 0:00:00.000235
DEBUG: Importing rule: brew_install; took: 0:00:00.000279
DEBUG: Importing rule: brew_link; took: 0:00:00.000217
DEBUG: Importing rule: brew_uninstall; took: 0:00:00.000276
DEBUG: Importing rule: brew_unknown_command; took: 0:00:00.000105
DEBUG: Importing rule: brew_update_formula; took: 0:00:00.000222
DEBUG: Importing rule: brew_upgrade; took: 0:00:00.000061
DEBUG: Importing rule: cargo; took: 0:00:00.000049
DEBUG: Importing rule: cargo_no_command; took: 0:00:00.000223
DEBUG: Importing rule: cd_correction; took: 0:00:00.000950
DEBUG: Importing rule: cd_mkdir; took: 0:00:00.000342
DEBUG: Importing rule: cd_parent; took: 0:00:00.000050
DEBUG: Importing rule: chmod_x; took: 0:00:00.000058
DEBUG: Importing rule: composer_not_command; took: 0:00:00.001520
DEBUG: Importing rule: cp_omitting_directory; took: 0:00:00.000677
DEBUG: Importing rule: cpp11; took: 0:00:00.000324
DEBUG: Importing rule: dirty_untar; took: 0:00:00.001812
DEBUG: Importing rule: dirty_unzip; took: 0:00:00.000257
DEBUG: Importing rule: django_south_ghost; took: 0:00:00.000066
DEBUG: Importing rule: django_south_merge; took: 0:00:00.000113
DEBUG: Importing rule: docker_not_command; took: 0:00:00.000528
DEBUG: Importing rule: dry; took: 0:00:00.000068
DEBUG: Importing rule: fab_command_not_found; took: 0:00:00.000396
DEBUG: Importing rule: fix_alt_space; took: 0:00:00.000337
DEBUG: Importing rule: fix_file; took: 0:00:00.003110
DEBUG: Importing rule: gem_unknown_command; took: 0:00:00.000506
DEBUG: Importing rule: git_add; took: 0:00:00.000520
DEBUG: Importing rule: git_add_force; took: 0:00:00.000252
DEBUG: Importing rule: git_bisect_usage; took: 0:00:00.000249
DEBUG: Importing rule: git_branch_delete; took: 0:00:00.000232
DEBUG: Importing rule: git_branch_exists; took: 0:00:00.000309
DEBUG: Importing rule: git_branch_list; took: 0:00:00.000236
DEBUG: Importing rule: git_checkout; took: 0:00:00.000254
DEBUG: Importing rule: git_diff_no_index; took: 0:00:00.000238
DEBUG: Importing rule: git_diff_staged; took: 0:00:00.000228
DEBUG: Importing rule: git_fix_stash; took: 0:00:00.000252
DEBUG: Importing rule: git_flag_after_filename; took: 0:00:00.000231
DEBUG: Importing rule: git_help_aliased; took: 0:00:00.000231
DEBUG: Importing rule: git_not_command; took: 0:00:00.000363
DEBUG: Importing rule: git_pull; took: 0:00:00.000242
DEBUG: Importing rule: git_pull_clone; took: 0:00:00.000239
DEBUG: Importing rule: git_pull_uncommitted_changes; took: 0:00:00.000244
DEBUG: Importing rule: git_push; took: 0:00:00.000246
DEBUG: Importing rule: git_push_force; took: 0:00:00.000238
DEBUG: Importing rule: git_push_pull; took: 0:00:00.000221
DEBUG: Importing rule: git_push_without_commits; took: 0:00:00.000343
DEBUG: Importing rule: git_rebase_merge_dir; took: 0:00:00.000250
DEBUG: Importing rule: git_rebase_no_changes; took: 0:00:00.000164
DEBUG: Importing rule: git_remote_seturl_add; took: 0:00:00.000159
DEBUG: Importing rule: git_rm_local_modifications; took: 0:00:00.000241
DEBUG: Importing rule: git_rm_recursive; took: 0:00:00.000493
DEBUG: Importing rule: git_rm_staged; took: 0:00:00.000347
DEBUG: Importing rule: git_stash; took: 0:00:00.000286
DEBUG: Importing rule: git_stash_pop; took: 0:00:00.000281
DEBUG: Importing rule: git_tag_force; took: 0:00:00.000268
DEBUG: Importing rule: git_two_dashes; took: 0:00:00.000239
DEBUG: Importing rule: go_run; took: 0:00:00.000217
DEBUG: Importing rule: gradle_no_task; took: 0:00:00.000566
DEBUG: Importing rule: gradle_wrapper; took: 0:00:00.000227
DEBUG: Importing rule: grep_arguments_order; took: 0:00:00.000235
DEBUG: Importing rule: grep_recursive; took: 0:00:00.000222
DEBUG: Importing rule: grunt_task_not_found; took: 0:00:00.000479
DEBUG: Importing rule: gulp_not_task; took: 0:00:00.000227
DEBUG: Importing rule: has_exists_script; took: 0:00:00.000240
DEBUG: Importing rule: heroku_not_command; took: 0:00:00.000310
DEBUG: Importing rule: history; took: 0:00:00.000067
DEBUG: Importing rule: hostscli; took: 0:00:00.000383
DEBUG: Importing rule: ifconfig_device_not_found; took: 0:00:00.000296
DEBUG: Importing rule: java; took: 0:00:00.000226
DEBUG: Importing rule: javac; took: 0:00:00.000216
DEBUG: Importing rule: lein_not_task; took: 0:00:00.000370
DEBUG: Importing rule: ln_no_hard_link; took: 0:00:00.000237
DEBUG: Importing rule: ln_s_order; took: 0:00:00.000241
DEBUG: Importing rule: ls_all; took: 0:00:00.000208
DEBUG: Importing rule: ls_lah; took: 0:00:00.000347
DEBUG: Importing rule: man; took: 0:00:00.000241
DEBUG: Importing rule: man_no_space; took: 0:00:00.000062
DEBUG: Importing rule: mercurial; took: 0:00:00.000234
DEBUG: Importing rule: missing_space_before_subcommand; took: 0:00:00.000085
DEBUG: Importing rule: mkdir_p; took: 0:00:00.000252
DEBUG: Importing rule: mvn_no_command; took: 0:00:00.000213
DEBUG: Importing rule: mvn_unknown_lifecycle_phase; took: 0:00:00.000260
DEBUG: Importing rule: no_command; took: 0:00:00.000261
DEBUG: Importing rule: no_such_file; took: 0:00:00.000066
DEBUG: Importing rule: npm_missing_script; took: 0:00:00.000593
DEBUG: Importing rule: npm_run_script; took: 0:00:00.000235
DEBUG: Importing rule: npm_wrong_command; took: 0:00:00.000378
DEBUG: Importing rule: open; took: 0:00:00.000605
DEBUG: Importing rule: pacman; took: 0:00:00.000366
DEBUG: Importing rule: pacman_not_found; took: 0:00:00.000111
DEBUG: Importing rule: path_from_history; took: 0:00:00.000099
DEBUG: Importing rule: pip_unknown_command; took: 0:00:00.000315
DEBUG: Importing rule: port_already_in_use; took: 0:00:00.000183
DEBUG: Importing rule: python_command; took: 0:00:00.000261
DEBUG: Importing rule: python_execute; took: 0:00:00.000232
DEBUG: Importing rule: quotation_marks; took: 0:00:00.000052
DEBUG: Importing rule: react_native_command_unrecognized; took: 0:00:00.000224
DEBUG: Importing rule: remove_trailing_cedilla; took: 0:00:00.000051
DEBUG: Importing rule: rm_dir; took: 0:00:00.000242
DEBUG: Importing rule: rm_root; took: 0:00:00.000235
DEBUG: Importing rule: scm_correction; took: 0:00:00.000254
DEBUG: Importing rule: sed_unterminated_s; took: 0:00:00.000222
DEBUG: Importing rule: sl_ls; took: 0:00:00.000052
DEBUG: Importing rule: ssh_known_hosts; took: 0:00:00.000239
DEBUG: Importing rule: sudo; took: 0:00:00.000059
DEBUG: Importing rule: sudo_command_from_user_path; took: 0:00:00.000231
DEBUG: Importing rule: switch_lang; took: 0:00:00.000091
DEBUG: Importing rule: systemctl; took: 0:00:00.000378
DEBUG: Importing rule: test.py; took: 0:00:00.000051
DEBUG: Importing rule: tmux; took: 0:00:00.000212
DEBUG: Importing rule: touch; took: 0:00:00.000223
DEBUG: Importing rule: tsuru_login; took: 0:00:00.000281
DEBUG: Importing rule: tsuru_not_command; took: 0:00:00.000223
DEBUG: Importing rule: unknown_command; took: 0:00:00.000062
DEBUG: Importing rule: vagrant_up; took: 0:00:00.000308
DEBUG: Importing rule: whois; took: 0:00:00.000282
DEBUG: Importing rule: workon_doesnt_exists; took: 0:00:00.000309
DEBUG: Importing rule: yarn_alias; took: 0:00:00.000219
DEBUG: Importing rule: yarn_command_not_found; took: 0:00:00.000494
DEBUG: Importing rule: yarn_command_replaced; took: 0:00:00.000357
DEBUG: Importing rule: yarn_help; took: 0:00:00.000232
DEBUG: Trying rule: dirty_unzip; took: 0:00:00.000568
No fucks given
DEBUG: Total took: 0:00:03.282835
``` | null | null | https://github.com/nvbn/thefuck/commit/2d81166213c403dce5c04d1fb73ba5d3e57d6676 | {'base_commit': '2d81166213c403dce5c04d1fb73ba5d3e57d6676', 'files': [{'path': 'tests/shells/test_fish.py', 'status': 'modified', 'Loc': {"('TestFish', 'test_get_overridden_aliases', 29)": {'mod': [31, 32]}}}, {'path': 'thefuck/shells/fish.py', 'status': 'modified', 'Loc': {"('Fish', '_get_overridden_aliases', 40)": {'mod': [46]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "2",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"thefuck/shells/fish.py"
],
"doc": [],
"test": [
"tests/shells/test_fish.py"
],
"config": [],
"asset": []
} | null | |
nvbn | thefuck | 6da0bc557f0fd94ea1397d3a7f508be896cc98d8 | https://github.com/nvbn/thefuck/issues/1120 | Trying rule missing_space_before_subcommand taking so long |
The output of `thefuck --version` (something like `The Fuck 3.1 using Python
3.5.0 and Bash 4.4.12(1)-release`):
The Fuck 3.29 using Python 3.8.2 and ZSH 5.8
Your system (Debian 7, ArchLinux, Windows, etc.):
ubuntu 20.04 on wsl2
How to reproduce the bug:
env THEFUCK_DEBUG=true thefuck test
The output of The Fuck with `THEFUCK_DEBUG=true` exported (typically execute `export THEFUCK_DEBUG=true` in your shell before The Fuck):
DEBUG: Trying rule: missing_space_before_subcommand; took: 0:00:08.341279
No fucks given
Anything else you think is relevant:
I have no idea why this is taking so long. Anyone else having this problem?
| null | null | https://github.com/KiaraGrouwstra/thefuck/commit/6da0bc557f0fd94ea1397d3a7f508be896cc98d8 | {'base_commit': '6da0bc557f0fd94ea1397d3a7f508be896cc98d8', 'files': [{'path': 'README.md', 'status': 'modified', 'Loc': {'(None, None, 436)': {'add': [436]}, '(None, None, 468)': {'add': [468]}}}, {'path': 'tests/test_conf.py', 'status': 'modified', 'Loc': {"('TestSettingsFromEnv', 'test_from_env', 48)": {'add': [67], 'mod': [57]}}}, {'path': 'tests/test_utils.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [96]}}}, {'path': 'thefuck/conf.py', 'status': 'modified', 'Loc': {"('Settings', '_val_from_env', 91)": {'mod': [104]}}}, {'path': 'thefuck/const.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [46, 61]}}}, {'path': 'thefuck/utils.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [106]}, "(None, 'get_all_executables', 112)": {'add': [121]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "2",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"thefuck/conf.py",
"thefuck/utils.py",
"thefuck/const.py"
],
"doc": [
"README.md"
],
"test": [
"tests/test_conf.py",
"tests/test_utils.py"
],
"config": [],
"asset": []
} | null | |
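The fix commit adds caching around `get_all_executables` in `thefuck/utils.py` plus a config knob; the rule's cost is dominated by rescanning `PATH` on every invocation, which is especially slow on WSL filesystems. A sketch of the memoization idea (thefuck uses its own decorator; `functools.lru_cache` plays the same role here, and the PATH walk below is a simplified stand-in):

```python
import functools
import os

@functools.lru_cache(maxsize=None)
def get_all_executables():
    """Collect the names of executables on PATH once per process.
    Subsequent calls return the cached frozenset, so rules that match
    against every executable no longer pay the directory-walk cost."""
    executables = set()
    for directory in os.environ.get("PATH", "").split(os.pathsep):
        try:
            for entry in os.scandir(directory):
                if entry.is_file() and os.access(entry.path, os.X_OK):
                    executables.add(entry.name)
        except OSError:
            continue  # missing or unreadable PATH entries are skipped
    return frozenset(executables)
```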
nvbn | thefuck | a84671dd3b7505d4d73f11ee9c7d057429542e24 | https://github.com/nvbn/thefuck/issues/20 | Some Unicode error in Ubuntu 14.10 | ``` bash
$ apt-get update
E: Не удалось открыть файл блокировки /var/lib/apt/lists/lock - open (13: Отказано в доступе)
E: Невозможно заблокировать каталог /var/lib/apt/lists/
E: Не удалось открыть файл блокировки /var/lib/dpkg/lock - open (13: Отказано в доступе)
E: Не удалось выполнить блокировку управляющего каталога (/var/lib/dpkg/); у вас есть права суперпользователя?
$ fuck
Traceback (most recent call last):
File "/usr/local/bin/thefuck", line 9, in <module>
load_entry_point('thefuck==1.7', 'console_scripts', 'thefuck')()
File "/usr/local/lib/python2.7/dist-packages/thefuck/main.py", line 91, in main
matched_rule = get_matched_rule(command, rules, settings)
File "/usr/local/lib/python2.7/dist-packages/thefuck/main.py", line 67, in get_matched_rule
if rule.match(command, settings):
File "/usr/local/lib/python2.7/dist-packages/thefuck/utils.py", line 41, in wrapper
return fn(command, settings)
File "/usr/local/lib/python2.7/dist-packages/thefuck/rules/no_command.py", line 19, in match
output = _get_output(command, settings)
File "/usr/local/lib/python2.7/dist-packages/thefuck/rules/no_command.py", line 13, in _get_output
return result.stderr.read().decode()
UnicodeDecodeError: 'ascii' codec can't decode byte 0xd0 in position 0: ordinal not in range(128)
```
| null | null | https://github.com/nvbn/thefuck/commit/a84671dd3b7505d4d73f11ee9c7d057429542e24 | {'base_commit': 'a84671dd3b7505d4d73f11ee9c7d057429542e24', 'files': [{'path': 'setup.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [5]}}}, {'path': 'thefuck/rules/no_command.py', 'status': 'modified', 'Loc': {"(None, '_get_output', 9)": {'mod': [13]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "2",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"thefuck/rules/no_command.py",
"setup.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null | |
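The commit's fix is effectively one line in `thefuck/rules/no_command.py`: decode the subprocess output explicitly instead of relying on the ASCII default under Python 2, which is what chokes on the localized apt-get error text above. A standalone sketch (the `errors="replace"` fallback is an extra safety margin for this illustration, not necessarily what the commit used):

```python
def read_output(raw_bytes):
    """Decode command output explicitly as UTF-8. bytes.decode() with no
    argument uses ASCII on Python 2, which raises UnicodeDecodeError on
    the Cyrillic apt-get messages in this report; stray invalid bytes
    are mapped to U+FFFD instead of crashing."""
    return raw_bytes.decode("utf-8", errors="replace")
```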
nvbn | thefuck | 622298549172754afff07a8ea1f55358062e17a7 | https://github.com/nvbn/thefuck/issues/330 | Add command options (--version, --help, --update/--upgrade) | And perhaps a manpage too, even if it only says "Please use fuck --help for documentation"
| null | null | https://github.com/nvbn/thefuck/commit/622298549172754afff07a8ea1f55358062e17a7 | {'base_commit': '622298549172754afff07a8ea1f55358062e17a7', 'files': [{'path': 'README.md', 'status': 'modified', 'Loc': {'(None, None, 110)': {'mod': [110]}, '(None, None, 112)': {'mod': [112]}}}, {'path': 'thefuck/main.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [0, 3], 'mod': [83, 99, 100]}, "(None, 'print_alias', 100)": {'add': [101]}, "(None, 'fix_command', 86)": {'mod': [97]}}}]} | [] | [] | [] | {
"iss_type": "4",
"iss_reason": "2",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"thefuck/main.py"
],
"doc": [
"README.md"
],
"test": [],
"config": [],
"asset": []
} | null | |
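The commit grows option handling in `thefuck/main.py`. A sketch of the requested CLI surface using `argparse` (`--version`/`--help` come for free; the `--alias` flag and version string below are illustrative of the idea, not a transcript of thefuck's actual flags):

```python
import argparse

def make_parser():
    """Build a parser exposing --version, --help, and an alias printer,
    roughly the surface this issue asks for."""
    parser = argparse.ArgumentParser(prog="thefuck")
    parser.add_argument("--version", action="version",
                        version="thefuck 3.x")  # placeholder version
    parser.add_argument("--alias", nargs="?", const="fuck",
                        help="print an alias definition for your shell")
    parser.add_argument("command", nargs="*",
                        help="command to fix")
    return parser
```

`argparse` also generates the `--help` text that a stub manpage could point at.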
nvbn | thefuck | 284d49da8d0ab3252b5426423b608033d39c2669 | https://github.com/nvbn/thefuck/issues/786 | next release | "TypeError: 'module' object is not callable" On any invocation of thefuck |
The output of `thefuck --version` (something like `The Fuck 3.1 using Python 3.5.0`):
The Fuck 3.25 using Python 3.6.4+
Your shell and its version (`bash`, `zsh`, *Windows PowerShell*, etc.):
GNU bash, version 4.4.18(1)-release (x86_64-pc-linux-gnu)
Your system (Debian 7, ArchLinux, Windows, etc.):
Ubuntu 18.04, Bionic Beaver
How to reproduce the bug:
Execute any bad command (I tested with `cd..` and `apt install whatever`). Then enter `fuck`.
The output of The Fuck with `THEFUCK_DEBUG=true` exported (typically execute `export THEFUCK_DEBUG=true` in your shell before The Fuck):
```
DEBUG: Run with settings: {'alter_history': True,
'debug': True,
'env': {'GIT_TRACE': '1', 'LANG': 'C', 'LC_ALL': 'C'},
'exclude_rules': [],
'history_limit': None,
'instant_mode': False,
'no_colors': False,
'priority': {},
'repeat': False,
'require_confirmation': True,
'rules': [<const: All rules enabled>],
'slow_commands': ['lein', 'react-native', 'gradle', './gradlew', 'vagrant'],
'user_dir': PosixPath('/home/thomasokeeffe/.config/thefuck'),
'wait_command': 3,
'wait_slow_command': 15}
DEBUG: Received output:
DEBUG: Call: export THEFUCK_DEBUG=true; with env: {'CLUTTER_IM_MODULE': 'xim', 'LS_COLORS': 'rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=00:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arc=01;31:*.arj=01;31:*.taz=01;31:*.lha=01;31:*.lz4=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.tzo=01;31:*.t7z=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.dz=01;31:*.gz=01;31:*.lrz=01;31:*.lz=01;31:*.lzo=01;31:*.xz=01;31:*.zst=01;31:*.tzst=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.alz=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.cab=01;31:*.wim=01;31:*.swm=01;31:*.dwm=01;31:*.esd=01;31:*.jpg=01;35:*.jpeg=01;35:*.mjpg=01;35:*.mjpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=00;36:*.au=00;36:*.flac=00;36:*.m4a=00;36:*.mid=00;36:*.midi=00;36:*.mka=00;36:*.mp3=00;36:*.mpc=00;36:*.ogg=00;36:*.ra=00;36:*.wav=00;36:*.oga=00;36:*.opus=00;36:*.spx=00;36:*.xspf=00;36:', 'LESSCLOSE': '/usr/bin/lesspipe %s %s', 'XDG_MENU_PREFIX': 'gnome-', 'LANG': 'C', 'GDM_LANG': 'en_US', 'MANAGERPID': '1425', 'DISPLAY': ':0', 'INVOCATION_ID': '09b52cf5b26f4acf8d4fcf48e96663bb', 'UNITY_DEFAULT_PROFILE': 'unity', 'COMPIZ_CONFIG_PROFILE': 'ubuntu', 'GTK2_MODULES': 'overlay-scrollbar', 'DOOMWADDIR': '/opt/doom', 'GTK_CSD': '0', 'COLORTERM': 'truecolor', 'TF_SHELL_ALIASES': 'alias 
alert=\'notify-send --urgency=low -i "$([ $? = 0 ] && echo terminal || echo error)" "$(history|tail -n1|sed -e \'\\\'\'s/^\\s*[0-9]\\+\\s*//;s/[;&|]\\s*alert$//\'\\\'\')"\'\nalias dfhack=\'~/df_linux/dfhack\'\nalias dwarff=\'/home/thomasokeeffe/df_linux/df\'\nalias egrep=\'egrep --color=auto\'\nalias fgrep=\'fgrep --color=auto\'\nalias grep=\'grep --color=auto\'\nalias l=\'ls -CF\'\nalias la=\'ls -A\'\nalias ll=\'ls -alF\'\nalias ls=\'ls --color=auto\'\nalias pip=\'pip3\'\nalias python=\'python3\'', 'JAVA_HOME': '/usr/lib/jvm/java-8-oracle/', 'J2SDKDIR': '/usr/lib/jvm/java-9-oracle', 'PYTHONIOENCODING': 'utf-8', 'SSH_AUTH_SOCK': '/run/user/1000/keyring/ssh', 'MANDATORY_PATH': '/usr/share/gconf/unity.mandatory.path', 'XDG_GREETER_DATA_DIR': '/var/lib/lightdm-data/thomasokeeffe', 'DERBY_HOME': '/usr/lib/jvm/java-9-oracle/db', 'USER': 'thomasokeeffe', 'DESKTOP_SESSION': 'unity', 'QT4_IM_MODULE': 'xim', 'TEXTDOMAINDIR': '/usr/share/locale/', 'DEFAULTS_PATH': '/usr/share/gconf/unity.default.path', 'PWD': '/home/thomasokeeffe', 'HOME': '/home/thomasokeeffe', 'JOURNAL_STREAM': '9:28556', 'TEXTDOMAIN': 'im-config', 'J2REDIR': '/usr/lib/jvm/java-9-oracle', 'QT_ACCESSIBILITY': '1', 'XDG_SESSION_TYPE': 'x11', 'COMPIZ_BIN_PATH': '/usr/bin/', 'XDG_DATA_DIRS': '/usr/share/unity:/usr/share/unity:/usr/local/share:/usr/share:/var/lib/snapd/desktop:/var/lib/snapd/desktop', 'XDG_SESSION_DESKTOP': 'unity', 'WINEDEBUG': '-all', 'SSH_AGENT_LAUNCHER': 'gnome-keyring', 'GTK_MODULES': 'gail:atk-bridge:unity-gtk-module', 'GNOME_SESSION_XDG_SESSION_PATH': '/org/freedesktop/DisplayManager/Session0', 'TERM': 'xterm-256color', 'VTE_VERSION': '5002', 'SHELL': '/bin/bash', 'XDG_SEAT_PATH': '/org/freedesktop/DisplayManager/Seat0', 'QT_IM_MODULE': 'ibus', 'XMODIFIERS': '@im=ibus', 'IM_CONFIG_PHASE': '2', 'XDG_CURRENT_DESKTOP': 'Unity:Unity7:ubuntu', 'GPG_AGENT_INFO': '/home/thomasokeeffe/.gnupg/S.gpg-agent:0:1:', 'TF_ALIAS': 'fuck', 'UNITY_HAS_3D_SUPPORT': 'true', 'SHLVL': '2', 'LANGUAGE': 'en_US', 
'WINDOWID': '67108870', 'GDMSESSION': 'unity', 'GNOME_DESKTOP_SESSION_ID': 'this-is-deprecated', 'LOGNAME': 'thomasokeeffe', 'DBUS_SESSION_BUS_ADDRESS': 'unix:path=/run/user/1000/bus', 'XDG_RUNTIME_DIR': '/run/user/1000', 'XAUTHORITY': '/home/thomasokeeffe/.Xauthority', 'TF_HISTORY': '\t python\n\t fuck\n\t source ~/.bashrc\n\t fuck\n\t apt install whatever\n\t fuck\n\t cd..\n\t fuck\n\t fuck --version\n\t export THEFUCK_DEBUG=true', 'XDG_SESSION_PATH': '/org/freedesktop/DisplayManager/Session0', 'XDG_CONFIG_DIRS': '/etc/xdg/xdg-unity:/etc/xdg/xdg-unity:/etc/xdg', 'PATH': '/usr/bin/ski:/home/thomasokeeffe/.local/bin:/opt/doom:/usr/bin/python3:/usr/bin/ski:/home/thomasokeeffe/.local/bin:/opt/doom:/usr/bin/python3:/home/thomasokeeffe/.local/share/umake/bin:/home/thomasokeeffe/bin:/home/thomasokeeffe/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/lib/jvm/java-9-oracle/bin:/usr/lib/jvm/java-9-oracle/db/bin', 'THEFUCK_DEBUG': 'true', 'LD_PRELOAD': 'libgtk3-nocsd.so.0', 'SESSION_MANAGER': 'local/Wirecat:@/tmp/.ICE-unix/1738,unix/Wirecat:/tmp/.ICE-unix/1738', 'LESSOPEN': '| /usr/bin/lesspipe %s', 'GTK_IM_MODULE': 'ibus', '_': '/home/thomasokeeffe/.local/bin/thefuck', 'LC_ALL': 'C', 'GIT_TRACE': '1'}; is slow: took: 0:00:00.001356
DEBUG: Importing rule: ag_literal; took: 0:00:00.000609
DEBUG: Importing rule: apt_get; took: 0:00:00.001838
DEBUG: Total took: 0:00:00.028332
Traceback (most recent call last):
File "/home/thomasokeeffe/.local/bin/thefuck", line 11, in <module>
sys.exit(main())
File "/home/thomasokeeffe/.local/lib/python3.6/site-packages/thefuck/entrypoints/main.py", line 25, in main
fix_command(known_args)
File "/home/thomasokeeffe/.local/lib/python3.6/site-packages/thefuck/entrypoints/fix_command.py", line 41, in fix_command
corrected_commands = get_corrected_commands(command)
File "/home/thomasokeeffe/.local/lib/python3.6/site-packages/thefuck/corrector.py", line 89, in get_corrected_commands
corrected for rule in get_rules()
File "/home/thomasokeeffe/.local/lib/python3.6/site-packages/thefuck/corrector.py", line 49, in get_rules
key=lambda rule: rule.priority)
File "/home/thomasokeeffe/.local/lib/python3.6/site-packages/thefuck/corrector.py", line 17, in get_loaded_rules
rule = Rule.from_path(path)
File "/home/thomasokeeffe/.local/lib/python3.6/site-packages/thefuck/types.py", line 140, in from_path
rule_module = load_source(name, str(path))
File "/usr/lib/python3.6/imp.py", line 172, in load_source
module = _load(spec)
File "<frozen importlib._bootstrap>", line 696, in _load
File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/thomasokeeffe/.local/lib/python3.6/site-packages/thefuck/rules/apt_get.py", line 8, in <module>
command_not_found = CommandNotFound()
TypeError: 'module' object is not callable
```
| null | null | https://github.com/nvbn/thefuck/commit/fb39d0bbd349e916ae12a77f04efd151dd046e6b
https://github.com/nvbn/thefuck/commit/284d49da8d0ab3252b5426423b608033d39c2669 | {'base_commit': '284d49da8d0ab3252b5426423b608033d39c2669', 'files': [{'path': 'tests/rules/test_apt_get.py', 'status': 'modified', 'Loc': {"(None, 'test_match', 13)": {'mod': [15, 16, 17]}, "(None, 'test_not_match', 30)": {'mod': [33, 34, 35]}, "(None, 'test_get_new_command', 49)": {'mod': [52, 53, 54]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [],
"doc": [],
"test": [
"tests/rules/test_apt_get.py"
],
"config": [],
"asset": []
} | null |
home-assistant | core | 2b5e7c26111e447c2714284151c2e7555abd11e4 | https://github.com/home-assistant/core/issues/27175 | integration: google_assistant | Google assistant: something went wrong when using alarm | <!-- READ THIS FIRST:
- If you need additional help with this template please refer to https://www.home-assistant.io/help/reporting_issues/
- Make sure you are running the latest version of Home Assistant before reporting an issue: https://github.com/home-assistant/home-assistant/releases
- Frontend issues should be submitted to the home-assistant-polymer repository: https://github.com/home-assistant/home-assistant-polymer/issues
- iOS issues should be submitted to the home-assistant-iOS repository: https://github.com/home-assistant/home-assistant-iOS/issues
- Do not report issues for integrations if you are using a custom integration: files in <config-dir>/custom_components
- This is for bugs only. Feature and enhancement requests should go in our community forum: https://community.home-assistant.io/c/feature-requests
- Provide as many details as possible. Paste logs, configuration sample and code into the backticks. Do not delete any text from this template!
-->
**Home Assistant release with the issue:**
0.100.0b0
<!--
- Frontend -> Developer tools -> Info
- Or use this command: hass --version
-->
**Last working Home Assistant release (if known):**
**Operating environment (Hass.io/Docker/Windows/etc.):**
hassio
**Integration:**
<!--
Please add the link to the documentation at https://www.home-assistant.io/integrations/ of the integration in question.
-->
nabu casa cloud
google assistant
envisalink
**Description of problem:**
Using the google assistant to arm home/arm away/disarm causes the google assistant to indicate that "something went wrong" although it actually performed the action.
I am using the envisalink component which allows you to specify the code so that it is sent with each service call. I tried with/without the code configuration and it made no difference.
**Problem-relevant `configuration.yaml` entries and (fill out even if it seems unimportant):**
```yaml
```
**Traceback (if applicable):**
```
```
**Additional information:**
| null | https://github.com/home-assistant/core/pull/36942 | null | {'base_commit': '2b5e7c26111e447c2714284151c2e7555abd11e4', 'files': [{'path': 'homeassistant/components/google_assistant/trait.py', 'status': 'modified', 'Loc': {"('ArmDisArmTrait', None, 974)": {'add': [990, 1000]}, "('ArmDisArmTrait', 'sync_attributes', 1001)": {'mod': [1005]}, "('ArmDisArmTrait', 'execute', 1031)": {'mod': [1034, 1038]}}}, {'path': 'tests/components/google_assistant/test_trait.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [1033]}, "(None, 'test_arm_disarm_arm_away', 865)": {'mod': [876, 895, 896, 897, 898, 899, 900, 901, 902, 903, 904, 905, 906, 907, 908, 909, 910, 911, 912, 913]}, "(None, 'test_arm_disarm_disarm', 1035)": {'mod': [1046, 1053, 1054, 1055, 1056, 1057, 1058, 1059, 1060, 1061, 1062, 1063, 1064, 1065, 1066, 1067, 1068, 1069, 1070]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"homeassistant/components/google_assistant/trait.py"
],
"doc": [],
"test": [
"tests/components/google_assistant/test_trait.py"
],
"config": [],
"asset": []
} | null |
home-assistant | core | fb7fb0ea78ee335cd23f3647223a675718ccf048 | https://github.com/home-assistant/core/issues/40316 | integration: knx | KNX problem with 0.115.0 and 0.115.1 | ## The problem
KNX integration has changed behavior and doesn't work correctly:
1) it is possible to read the status of a scene only if it is launched from the KNX bus, but not if it is launched from HA
2) KNX climate doesn't read operation_mode_state_address correctly; when the operation mode is changed it reads the correct state, then it changes to "standby"
## Environment
Home Assistant 0.115.1
Frontend: 20200917.1 - latest
Raspberry 3
arch | armv7l
chassis | embedded
dev | false
docker | true
docker_version | 19.03.11
hassio | true
host_os | HassOS 4.13
installation_type | Home Assistant OS
os_name | Linux
os_version | 4.19.127-v7
python_version | 3.8.5
supervisor | 245
timezone | Europe/Rome
version | 0.115.1
virtualenv | false
- Home Assistant Core release with the issue: 0.115.1
- Last working Home Assistant Core release (if known): 0.113.3
- Operating environment (OS/Container/Supervised/Core): 4.12
- Integration causing this issue: KNX
- Link to integration documentation on our website: https://www.home-assistant.io/integrations/knx/
| null | https://github.com/home-assistant/core/pull/40472 | null | {'base_commit': 'fb7fb0ea78ee335cd23f3647223a675718ccf048', 'files': [{'path': 'homeassistant/components/knx/manifest.json', 'status': 'modified', 'Loc': {'(None, None, 5)': {'mod': [5]}}}, {'path': 'requirements_all.txt', 'status': 'modified', 'Loc': {'(None, None, 2268)': {'mod': [2268]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Doc\nJson"
} | {
"code": [
"homeassistant/components/knx/manifest.json"
],
"doc": [],
"test": [],
"config": [
"requirements_all.txt"
],
"asset": []
} | null |
home-assistant | core | faedba04079d2c999a479118b5189ef4c0bff060 | https://github.com/home-assistant/core/issues/77928 | integration: velux
stale | Somfy blind motors cannot be assigned to a room | ### The problem
Somfy motors will return `None` as serial number via the Velux KLF-200:
[Handle devices without serial numbers.](https://github.com/Julius2342/pyvlx/pull/42/commits/d409d66db8732553e928f5dd9d00d458ba638dea)
This serial is used as the unique id here:
[core/homeassistant/components/velux/__init__.py#L114](https://github.com/home-assistant/core/blob/dev/homeassistant/components/velux/__init__.py#L114)
Would it be reasonable to return the node name instead of `None`?
```python
if self.node.serial_number:
return self.node.serial_number
elif self.node.name:
return self.node.name
else:
return "velux_#" + str(self.node.node_id)
```
### What version of Home Assistant Core has the issue?
2022.8.7
### What was the last working version of Home Assistant Core?
_No response_
### What type of installation are you running?
Home Assistant OS
### Integration causing the issue
Velux
### Link to integration documentation on our website
https://www.home-assistant.io/integrations/velux/
### Diagnostics information
_No response_
### Example YAML snippet
_No response_
### Anything in the logs that might be useful for us?
_No response_
### Additional information
Related issues:
[66262](https://github.com/home-assistant/core/issues/66262)
[35935](https://github.com/home-assistant/core/issues/35935)
[74009](https://github.com/home-assistant/core/issues/74009)
| null | https://github.com/home-assistant/core/pull/117508 | null | {'base_commit': 'faedba04079d2c999a479118b5189ef4c0bff060', 'files': [{'path': 'homeassistant/components/velux/__init__.py', 'status': 'modified', 'Loc': {"('VeluxEntity', None, 106)": {'mod': [111]}, "('VeluxEntity', '__init__', 111)": {'mod': [114]}}}, {'path': 'homeassistant/components/velux/cover.py', 'status': 'modified', 'Loc': {"(None, 'async_setup_entry', 26)": {'mod': [32]}, "('VeluxCover', None, 38)": {'mod': [44]}, "('VeluxCover', '__init__', 44)": {'mod': [46]}}}, {'path': 'homeassistant/components/velux/light.py', 'status': 'modified', 'Loc': {"(None, 'async_setup_entry', 19)": {'mod': [26]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"homeassistant/components/velux/light.py",
"homeassistant/components/velux/cover.py",
"homeassistant/components/velux/__init__.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null |
home-assistant | core | 551a584ca69771804b6f094eceb67dcb25a2f627 | https://github.com/home-assistant/core/issues/68620 | needs-more-information
integration: overkiz | Polling interval for stateless (e.g. Somfy (Oceania)) is not applied in Overkiz | ### The problem
Every day I get a "Gateway ID" error in Overkiz error that reads as below. Same problem as [#66606](https://github.com/home-assistant/core/issues/66606)
"Translation Error: The intl string context variable "gateway id" was not provided to the string "Gateway: {gateway id}" Overkiz (by Somfy)".
When I click "Reconfigure" and reenter my password, the problem is corrected. But then it reoccurs in the next day or so.
Looking at the log, it seems like there's some really aggressive polling going on?
### What version of Home Assistant Core has the issue?
core-2022.3.5
### What was the last working version of Home Assistant Core?
_No response_
### What type of installation are you running?
Home Assistant Supervised
### Integration causing the issue
Overkiz (by Somfy)
### Link to integration documentation on our website
https://www.home-assistant.io/integrations/overkiz
### Diagnostics information
[config_entry-overkiz-0bf20335f9aeaa86644cb071861f6ef1.json.txt](https://github.com/home-assistant/core/files/8341651/config_entry-overkiz-0bf20335f9aeaa86644cb071861f6ef1.json.txt)
### Example YAML snippet
_No response_
### Anything in the logs that might be useful for us?
_No response_
### Additional information
_No response_ | null | https://github.com/home-assistant/core/pull/133617 | null | {'base_commit': '551a584ca69771804b6f094eceb67dcb25a2f627', 'files': [{'path': 'homeassistant/components/overkiz/__init__.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [43]}, "(None, 'async_setup_entry', 57)": {'mod': [116, 117, 118, 119, 122]}}}, {'path': 'homeassistant/components/overkiz/const.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [46]}}}, {'path': 'homeassistant/components/overkiz/coordinator.py', 'status': 'modified', 'Loc': {"('OverkizDataUpdateCoordinator', None, 36)": {'add': [38]}, "('OverkizDataUpdateCoordinator', '__init__', 39)": {'add': [67], 'mod': [48, 62, 63, 64]}, "('OverkizDataUpdateCoordinator', '_async_update_data', 69)": {'add': [104], 'mod': [106]}, '(None, None, None)': {'add': [126], 'mod': [29]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"homeassistant/components/overkiz/const.py",
"homeassistant/components/overkiz/__init__.py",
"homeassistant/components/overkiz/coordinator.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null |
yt-dlp | yt-dlp | f7590d47641cedbf630b909aa8f53930c4a9ce5c | https://github.com/yt-dlp/yt-dlp/issues/983 | site-bug | VRV - NoneType object is not iterable | <!--
######################################################################
WARNING!
IGNORING THE FOLLOWING TEMPLATE WILL RESULT IN ISSUE CLOSED AS INCOMPLETE
######################################################################
-->
## Checklist
<!--
Carefully read and work through this check list in order to prevent the most common mistakes and misuse of yt-dlp:
- First of, make sure you are using the latest version of yt-dlp. Run `yt-dlp --version` and ensure your version is 2021.09.02. If it's not, see https://github.com/yt-dlp/yt-dlp on how to update. Issues with outdated version will be REJECTED.
- Make sure that all provided video/audio/playlist URLs (if any) are alive and playable in a browser.
- Make sure that all URLs and arguments with special characters are properly quoted or escaped as explained in https://github.com/yt-dlp/yt-dlp.
- Search the bugtracker for similar issues: https://github.com/yt-dlp/yt-dlp. DO NOT post duplicates.
- Read bugs section in FAQ: https://github.com/yt-dlp/yt-dlp
- Finally, put x into all relevant boxes like this [x] (Dont forget to delete the empty space)
-->
- [X] I'm reporting a bug unrelated to a specific site
- [X] I've verified that I'm running yt-dlp version **2021.09.02**
- [X] I've checked that all provided URLs are alive and playable in a browser
- [X] The provided URLs do not contain any DRM to the best of my knowledge
- [X] I've checked that all URLs and arguments with special characters are properly quoted or escaped
- [X] I've searched the bugtracker for similar bug reports including closed ones
- [X] I've read bugs section in FAQ
## Verbose log
<!--
Provide the complete verbose output of yt-dlp that clearly demonstrates the problem.
Add the `-v` flag to your command line you run yt-dlp with (`yt-dlp -v <your command line>`), copy the WHOLE output and insert it below. It should look similar to this:
[debug] System config: []
[debug] User config: []
[debug] Command-line args: [u'-v', u'http://www.youtube.com/watch?v=BaW_jenozKc']
[debug] Encodings: locale cp1251, fs mbcs, out cp866, pref cp1251
[debug] yt-dlp version 2021.09.02
[debug] Python version 2.7.11 - Windows-2003Server-5.2.3790-SP2
[debug] exe versions: ffmpeg N-75573-g1d0487f, ffprobe N-75573-g1d0487f, rtmpdump 2.4
[debug] Proxy map: {}
<more lines>
-->
```
ytdl -F -u PRIVATE -p PRIVATE "https://vrv.co/watch/GRP5G39JR/The-Seven-Heavenly-Virtues:The-Angels-Descend" --verbose
[debug] Command-line config: ['-F', '-u', 'PRIVATE', '-p', 'PRIVATE', 'https://vrv.co/watch/GRP5G39JR/The-Seven-Heavenly-Virtues:The-Angels-Descend', '--verbose']
[debug] Encodings: locale cp1252, fs utf-8, out utf-8, pref cp1252
[debug] yt-dlp version 2021.09.02 (exe)
[debug] Python version 3.8.10 (CPython 64bit) - Windows-10-10.0.18363-SP0
[debug] exe versions: ffmpeg 4.4-full_build-www.gyan.dev, ffprobe 4.4-full_build-www.gyan.dev
[debug] Optional libraries: mutagen, pycryptodome, sqlite, websockets
[debug] Proxy map: {}
[vrv] None: Downloading webpage
[vrv] Downloading Token Credentials JSON metadata
[debug] [vrv] Extracting URL: https://vrv.co/watch/GRP5G39JR/The-Seven-Heavenly-Virtues:The-Angels-Descend
[vrv] GRP5G39JR: Downloading resource path JSON metadata
[vrv] GRP5G39JR: Downloading CMS Signing JSON metadata
[vrv] GRP5G39JR: Downloading object JSON metadata
[vrv] GRP5G39JR: Downloading video JSON metadata
[vrv] GRP5G39JR: Downloading streams JSON metadata
[vrv] GRP5G39JR: Downloading dash-audio-en-US information
[vrv] GRP5G39JR: Downloading hls-audio-en-US information
[debug] Formats sorted by: hasvid, ie_pref, lang, quality, res, fps, vcodec:vp9.2(10), acodec, filesize, fs_approx, tbr, vbr, abr, asr, proto, vext, aext, hasaud, source, id
ERROR: 'NoneType' object is not iterable
Traceback (most recent call last):
File "yt_dlp\YoutubeDL.py", line 1214, in wrapper
File "yt_dlp\YoutubeDL.py", line 1239, in __extract_info
File "yt_dlp\extractor\common.py", line 584, in extract
File "yt_dlp\extractor\vrv.py", line 221, in _real_extract
TypeError: 'NoneType' object is not iterable
```
<!--
Do not remove the above ```
-->
## Description
<!--
Provide an explanation of your issue in an arbitrary form. Please make sure the description is worded well enough to be understood, see https://github.com/ytdl-org/youtube-dl#is-the-description-of-the-issue-itself-sufficient. Provide any additional information, suggested solution and as much context and examples as possible.
If work on your issue requires account credentials please provide them or explain how one can obtain them.
-->
I've noticed this is happening a little more often; it seems the entire series for this one does this, yet other series work just fine. I haven't really pinned down where this is hanging up, but I used `--write-pages` and got an extra dump file for this one vs. one that actually downloads, which looks like this.
```
<!DOCTYPE html>
<html lang="en">
<script>
window.segmentConfig = {"writeKey":"SIeNJozAqhQxDdHOOY6mvnSKKzHo1BvJ","defaultApiHost":"vrv-eec.etp-prod.com/v1","fallbackLibraryHost":"sa.etp-prod.com/analytics.js/v1/"};
window.vilos_main_style = "https://static.vrv.co/vrvweb/build/vilos.fb39b50eca04ae7083e9.css"
</script>
<head data-reactroot=""><link rel="icon" type="image/png" sizes="16x16" href="https://static.vrv.co/vrvweb/assets/img/favicons/favicon-16x16.png"/><link rel="icon" type="image/png" sizes="32x32" href="https://static.vrv.co/vrvweb/assets/img/favicons/favicon-32x32.png"/><link rel="icon" type="image/png" sizes="96x96" href="https://static.vrv.co/vrvweb/assets/img/favicons/favicon-96x96.png"/><link rel="apple-touch-icon" sizes="57x57" href="https://static.vrv.co/vrvweb/assets/img/favicons/apple-touch-icon-57x57.png"/><link rel="apple-touch-icon" sizes="60x60" href="https://static.vrv.co/vrvweb/assets/img/favicons/apple-touch-icon-60x60.png"/><link rel="apple-touch-icon" sizes="72x72" href="https://static.vrv.co/vrvweb/assets/img/favicons/apple-touch-icon-72x72.png"/><link rel="apple-touch-icon" sizes="76x76" href="https://static.vrv.co/vrvweb/assets/img/favicons/apple-touch-icon-76x76.png"/><link rel="apple-touch-icon" sizes="114x114" href="https://static.vrv.co/vrvweb/assets/img/favicons/apple-touch-icon-114x114.png"/><link rel="apple-touch-icon" sizes="120x120" href="https://static.vrv.co/vrvweb/assets/img/favicons/apple-touch-icon-120x120.png"/><link rel="apple-touch-icon" sizes="144x144" href="https://static.vrv.co/vrvweb/assets/img/favicons/apple-touch-icon-144x144.png"/><link rel="apple-touch-icon" sizes="152x152" href="https://static.vrv.co/vrvweb/assets/img/favicons/apple-touch-icon-152x152.png"/><link rel="apple-touch-icon" sizes="180x180" href="https://static.vrv.co/vrvweb/assets/img/favicons/apple-touch-icon-180x180.png"/><meta charSet="UTF-8"/><meta name="viewport" content="width=device-width"/><meta http-equiv="X-UA-Compatible" content="IE=edge"/><meta name="viewport" content="width=device-width, initial-scale=1, maximum-scale=1.0, user-scalable=0"/><meta name="theme-color" content="#1b1a26"/><meta name="apple-itunes-app" content="app-id=1165206979"/><link rel="manifest" href="https://static.vrv.co/vrvweb/assets/manifest.json"/><meta 
data-react-helmet="true" name="description" content="VRV is the home of your favorite channels, events, and communities celebrating anime, animation, video games, comics, science fiction, fantasy, and tech."/><meta data-react-helmet="true" name="keywords" content="Japanese Animation, Role Playing Games, Channel Frederator, Cartoon Hangover, Bravest Warriors, Naruto Shippuden, Adult Animation, Science Fiction, Hunter x Hunter, Japanese Anime, Bee & Puppycat, Graphic Novel, Rooster Teeth, Board Games, Video Games, Crunchyroll, Titansgrave, Crunch Time, Otter Media, Comic Book, Super Hero, Frederator, Lazer Team, Animation, One Piece, Tabletop, Ellation, Fantasy, Chernin, Manhua, Sci-Fi, Gaming, Comics, Naruto, Fandom, Stream, Online, Anime, Manga, Gamer, Otaku, Watch, Video, RWBY, Xbox, Fans, Nerd, Geek, VRV"/><link data-react-helmet="true" rel="canonical" href="https://vrv.co"/><title data-react-helmet="true">VRV - Home of Your Favorite Channels</title><script type="application/javascript" src="https://static.vrv.co/vrvweb/build/segment.96bfc94608106fdea978.js"></script><script type="application/javascript" src="https://js.adsrvr.org/up_loader.1.1.0.js"></script><link rel="stylesheet" type="text/css" href="https://static.vrv.co/vrvweb/build/main.59a6f4611b0080f805eb.css"/><script type="application/javascript" src="https://static.vrv.co/vrvweb/build/common.6fb25c4cff650ac4e6ae.js" defer=""></script><script type="application/javascript" src="https://static.vrv.co/vrvweb/build/main.75b5fcb9699fbd1844be.js" defer=""></script><script type="application/javascript" src="https://static.vrv.co/vrvweb/build/app.0d3f6794c03f11f4106a.js" defer=""></script><script type="application/javascript" src="https://static.vrv.co/vrvweb/build/manifest.3bed60dfdc3cde9cafe8.js" defer=""></script><script type="application/javascript" src="https://static.vrv.co/vrvweb/build/unsupported.ec440e1cfcb18abd44e1.js" defer=""></script></head>
<body>
<div id="content"><div><div class="erc-flash-messages success"><div class="content"><ul class="message-list"></ul><button class="dismiss" type="button"><svg class="vrv-interaction-icon close" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 20 20" data-t="close-svg"><polygon points="18.67 1.33 17.33 1.33 10 8.67 2.67 1.33 1.33 1.33 1.33 2.67 8.67 10 1.33 17.33 1.33 18.67 2.67 18.67 10 11.33 17.33 18.67 18.67 18.67 18.67 17.33 11.33 10 18.67 2.67 18.67 1.33"></polygon></svg><span class="dismiss-text">Close</span></button></div></div><div class="erc-header"><div class="header-content"><div class="header-left"><div class="erc-nav-header-item"><div class="wrapper"><div tabindex="0" class="nav-item-clickable"></div><a tabindex="0" class="item-logo" href="/"><svg class="icon-home" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 16 16" data-t="home-svg"><polygon points="8.009 0.002 0.009 4.802 0.009 16.002 5.875 16.002 5.875 10.669 10.142 10.669 10.142 16.002 16.009 16.002 16.009 4.802 8.009 0.002"></polygon></svg></a><div class="item-info"><h1 class="item-title">Vrv Home</h1></div><div class="arrows-up-down"><svg class="arrows-icon" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 12 16" data-t="arrows-up-down-svg"><path d="M6,0l5.6,5.6H.4ZM6,16,.4,10.4H11.6Z"></path></svg></div></div><div class="erc-nav-dropdown"><div class="nav-dropdown-scrollable"><div class="erc-nav-dropdown-item erc-nav-dropdown-vrv-item"><a aria-current="page" class="erc-nav-dropdown-item-link state-active" tabindex="0" href="/"><div class="logo"><svg class="home-logo" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 16 16" data-t="home-svg"><polygon points="8.009 0.002 0.009 4.802 0.009 16.002 5.875 16.002 5.875 10.669 10.142 10.669 10.142 16.002 16.009 16.002 16.009 4.802 8.009 0.002"></polygon></svg></div><div class="info"><h3 class="title">Vrv Home</h3><p class="description">All of VRV</p></div></a><a class="erc-nav-dropdown-item-browse-link state-hidden-desktop" style="color:" tabindex="0" 
href="/browse"><svg class="icon" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 20 20" data-t="view-all-svg"><path d="M6,9V4H9V9Zm5,0V4h3V9ZM6,16V11H9v5Zm5,0V11h3v5Z"></path></svg></a><a class="erc-nav-dropdown-item-browse-link state-hidden-mobile" style="color:" tabindex="0" href="/browse"><svg class="icon" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 20 20" data-t="view-all-svg"><path d="M6,9V4H9V9Zm5,0V4h3V9ZM6,16V11H9v5Zm5,0V11h3v5Z"></path></svg></a></div><div class="nav-dropdown-channels"><h4 class="nav-dropdown-title">Channels</h4><div class="erc-nav-dropdown-item"><a class="erc-nav-dropdown-item-link" style="color:#EC2024" tabindex="0" href="/cartoonhangover"><div style="background-color:#EC2024" class="c-circle-icon c-circle-icon--size-small"><img class="c-circle-icon__element" src="https://beta.crunchyroll.com/imgsrv/display/thumbnail/92x92/catalog/cartoonhangover/fe6d9f6e046d18d85c3e260517324bf2.png" alt="Cartoon Hangover"/></div><div class="info"><h3 class="title">Cartoon Hangover</h3><p class="description">Indie Cartoons</p></div></a><a class="erc-nav-dropdown-item-browse-link state-hidden-desktop" style="color:#EC2024" tabindex="0" href="/cartoonhangover/browse"><svg class="icon" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 20 20" data-t="view-all-svg"><path d="M6,9V4H9V9Zm5,0V4h3V9ZM6,16V11H9v5Zm5,0V11h3v5Z"></path></svg></a><a class="erc-nav-dropdown-item-browse-link state-hidden-mobile" style="color:#EC2024" tabindex="0" href="/cartoonhangover/browse"><svg class="icon" xmlns="http://www.w3.org/2000/svg" viewBox="0 0 20 20" data-t="view-all-svg"><path d="M6,9V4H9V9Zm5,0V4h3V9ZM6,16V11H9v5Zm5,0V11h3v5Z"></path></svg></a></div><div class="erc-nav-dropdown-item"><a class="erc-nav-dropdown-item-link" style="color:#F47521" tabindex="0" href="/crunchyroll"><div style="background-color:#F47521" class="c-circle-icon c-circle-icon--size-small"><img class="c-circle-icon__element" 
chlist":{"allIds":{"parent":[],"child":[]},"byId":{},"isLoading":false,"isItemLoading":false,"partialIds":{"parent":[],"child":[]},"sorting":{"type":"","order":"desc"},"filters":{"favorite":false,"content":"","language":""}},"discovery":{"isLoading":false,"loadingError":null,"metadata":{},"byLetter":{},"byId":{}},"talkbox":{"comments":{"byId":{},"topLevelIds":[],"newCommentId":null,"pagination":{"total":0,"limit":50,"page":0}},"isLoading":false,"loadingError":null},"guestbook":{"byId":{"G6497Z43Y":{"__class__":"guestbook","__href__":"","__links__":{"guestbook/comments":{"href":"http://talkbox.cx-prod.com/talkbox/guestbooks/G6497Z43Y/comments"}},"__actions__":[],"guestbook_key":"G6497Z43Y","domain_id":"vrv","total_comments":0}},"allIds":["G6497Z43Y"],"activeId":null,"isLoading":false,"loadingError":null},"userActionsFlow":{"sequenceById":{"COMPLETE_PROFILE_FLOW":[]}},"fetchEnvironment":{"isClient":false},"att":{"view":"","isSignUpLoading":false,"signInError":null,"isSignInLoading":false,"signUpError":null},"feeds":{"byId":{"ea6ba25c9e01fb8ea569c3243410b0ea":{"reference":{"__class__":"home_feed_item","__href__":"","__links__":{"resource":{"href":"/cms/v2/US/M2/-/series/G6497Z43Y?locale=en-US"}},"__actions__":{},"id":"ea6ba25c9e01fb8ea569c3243410b0ea","title":"Miss Kobayashi's Dragon Maid","resource_type":"panel","display_type":"hero","description":"Miss Kobayashi is your average office worker who lives a boring life, alone in her small apartment–until she saves the life of a female dragon in distress. The dragon, named Tohru, has the ability to magically transform into an adorable human girl (albeit with horns and a long tail!), who will do anything to pay off her debt of gratitude, whether Miss Kobayashi likes it or not. 
With a very persistent and amorous dragon as a roommate, nothing comes easy, and Miss Kobayashi’s normal life is about to go off the deep end!","source_media_id":"none","source_media_title":"none","original_id":"G6497Z43Y","version":null},"version":null,"items":["G6497Z43Y"]},"df016415e47c2cc35f5bd9ba4b396ef2":{"reference":{"__class__":"home_feed_item","__href__":"","__links__":{"resource":{"href":"/disc/public/v1/US/M2/-/-/browse?locale=en-US&n=20&sort_by=popularity&start=0"}},"__actions__":{},"id":"df016415e47c2cc35f5bd9ba4b396ef2","title":"Most Popular","resource_type":"dynamic_collection","display_type":"shelf","description":"The most-watched shows and movies on VRV.","source_media_id":"none","source_media_title":"none","original_id":"browse_popular","version":null}},"d417565f8000121785ccfc50f15ea283":{"reference":{"__class__":"home_feed_item","__href__":"","__links__":{"resource":{"href":"/cms/v2/US/M2/-/series/GZJH3D719?locale=en-US"}},"__actions__":{},"id":"d417565f8000121785ccfc50f15ea283","title":"TSUKIMICHI -Moonlit Fantasy-","resource_type":"panel","display_type":"shelf","description":"","source_media_id":"none","source_media_title":"none","original_id":"GZJH3D719","version":null}},"b86092c279ba81d716bebabce315090c":{"reference":{"__class__":"home_feed_item","__href__":"","__links__":{"resource":{"href":"/cms/v2/US/M2/-/series/GDKHZEN80?locale=en-US"}},"__actions__":{},"id":"b86092c279ba81d716bebabce315090c","title":"Drug Store in Another World - The Slow Life of a Cheat Pharmacist","resource_type":"panel","display_type":"shelf","description":"Reiji Kirio was a corporate wage slave, who did nothing but work all day, every day. On a typical day of walking to work with a hollow look on his face, he suddenly found himself in the forest of another world. \"Oh, this must be one of those isekai reincarnations people keep talking about.\"\r\n\r\nThe two skills he has are \"Appraise\" and \"Drug Discovery.\" Those don't sound too impressive... well, whatever. 
Except these turn out to be particularly overpowered skills! \r\n\r\nAnd so, a story about following the slow life philosophy while running a drugstore in another world begins.","source_media_id":"none","source_media_title":"none","original_id":"GDKHZEN80","version":null}},"7584383dc2159f4941f0ce08dc5073ad":{"reference":{"__class__":"home_feed_item","__href__":"","__links__":{"resource":{"href":"/disc/public/v1/US/M2/-/-/browse?locale=en-US&n=20&sort_by=newly_added&start=0"}},"__actions__":{},"id":"7584383dc2159f4941f0ce08dc5073ad","title":"Just Updated On VRV","resource_type":"dynamic_collection","display_type":"shelf","description":"The newest and coolest stuff to watch on VRV, updated live.","source_media_id":"none","source_media_title":"none","original_id":"browse_newly","version":null}},"943b4d5ac3f5b9400c058255ffb079a2":{"reference":{"__class__":"home_feed_item","__href__":"","__links__":{"resource":{"href":"/cms/v2/US/M2/-/curated_feeds/G6QWVXG16?locale=en-US&version=1.1"}},"__actions__":{},"id":"943b4d5ac3f5b9400c058255ffb079a2","title":"Does This Spark Joy?","resource_type":"curated_collection","display_type":"shelf","description":"","source_media_id":"none","source_media_title":"none","original_id":"G6QWVXG16","version":null}},"772ce5a996b502565a6f2db2ead4ec5b":{"reference":{"__class__":"home_feed_item","__href__":"","__links__":{"resource":{"href":"/cms/v2/US/M2/-/series/GRDQ41V1Y?locale=en-US"}},"__actions__":{},"id":"772ce5a996b502565a6f2db2ead4ec5b","title":"Bee and PuppyCat","resource_type":"panel","display_type":"shelf","description":"Bee, a reluctant hero, becomes entangled in the adventures of a puppy (...or is he a cat?) as they travel between reality and the void of Fishbowl Space. Created by Natasha Allegri, character designer and storyboard artist for Adventure Time. 
We can neither confirm nor deny the autobiographical nature of Bee & PuppyCat.","source_media_id":"none","source_media_title":"none","original_id":"GRDQ41V1Y","version":null}},"3295b943c8a5a5e4d5bb9cc6a4364b51":{"reference":{"__class__":"home_feed_item","__href__":"","__links__":{"resource":{"href":"/cms/v2/US/M2/-/series/G609MWD26?locale=en-US"}},"__actions__":{},"id":"3295b943c8a5a5e4d5bb9cc6a4364b51","title":"RWBY","resource_type":"panel","display_type":"shelf","description":"The future-fantasy world of Remnant is filled with ravenous monsters, treacherous terrain, and more villains than you can shake a sniper-scythe at. Fortunately, Beacon Academy is training Huntsmen and Huntresses to battle the evils of the world, and Ruby, Weiss, Blake, and Yang are ready for their first day of class.","source_media_id":"none","source_media_title":"none","original_id":"G609MWD26","version":null}},"7cda4eb9d2e2ba89fffada0e0148bf40":{"reference":{"__class__":"home_feed_item","__href__":"","__links__":{"resource":{"href":"/cms/v2/US/M2/-/series/GYP5213VY?locale=en-US"}},"__actions__":{},"id":"7cda4eb9d2e2ba89fffada0e0148bf40","title":"CF 107 Recuts","resource_type":"panel","display_type":"shelf","description":"","source_media_id":"none","source_media_title":"none","original_id":"GYP5213VY","version":null}},"4e54f5a702e54e7b119afb964e6593b5":{"reference":{"__class__":"home_feed_item","__href__":"","__links__":{"resource":{"href":"/cms/v2/US/M2/-/series/GR71570NR?locale=en-US"}},"__actions__":{},"id":"4e54f5a702e54e7b119afb964e6593b5","title":"NANA","resource_type":"panel","display_type":"shelf","description":"A chance meeting on a Tokyo-bound train sets Nana Komatsu and Nana Osaki on a collision course with destiny. 
Their fates intertwine as they pursue their dreams, but soon the harsh realities of life in metropolitan Tokyo threaten to tear them apart.","source_media_id":"none","source_media_title":"none","original_id":"GR71570NR","version":null}},"65b7c10023d4dc6f228a3b3f6f824e8e":{"reference":{"__class__":"home_feed_item","__href__":"","__links__":{"resource":{"href":"/cms/v2/US/M2/-/series/G63VKW8WY?locale=en-US"}},"__actions__":{},"id":"65b7c10023d4dc6f228a3b3f6f824e8e","title":"Cat Agent","resource_type":"panel","display_type":"shelf","description":"CAT AGENT is a cat who's an agent, who represents cats. Created by Kent Osborne (of 'Adventure Time' and 'SpongeBob SquarePants').","source_media_id":"none","source_media_title":"none","original_id":"G63VKW8WY","version":null}},"cb0619299c66c5b335127aa24c979ae8":{"reference":{"__class__":"home_feed_item","__href__":"","__links__":{"resource":{"href":"/cms/v2/US/M2/-/series/GR751KNZY?locale=en-US"}},"__actions__":{},"id":"cb0619299c66c5b335127aa24c979ae8","title":"Attack on Titan","resource_type":"panel","display_type":"shelf","description":"Known in Japan as Shingeki no Kyojin, many years ago, the last remnants of humanity were forced to retreat behind the towering walls of a fortified city to escape the massive, man-eating Titans that roamed the land outside their fortress. Only the heroic members of the Scouting Legion dared to stray beyond the safety of the walls – but even those brave warriors seldom returned alive. 
Those within the city clung to the illusion of a peaceful existence until the day that dream was shattered, and their slim chance at survival was reduced to one horrifying choice: kill – or be devoured!","source_media_id":"none","source_media_title":"none","original_id":"GR751KNZY","version":null}},"5d0d4335deb157894bf035df9e986068":{"reference":{"__class__":"home_feed_item","__href__":"","__links__":{"resource":{"href":"/cms/v2/US/M2/-/curated_feeds/GYEXG5NQ6?locale=en-US&version=1.1"}},"__actions__":{},"id":"5d0d4335deb157894bf035df9e986068","title":"Cardsharks","resource_type":"curated_collection","display_type":"shelf","description":"","source_media_id":"none","source_media_title":"none","original_id":"GYEXG5NQ6","version":null}},"a2ae7f9ab7ed668b73b83ddfdada04e3":{"reference":{"__class__":"home_feed_item","__href__":"","__links__":{"resource":{"href":"/cms/v2/US/M2/-/series/GRNQZ129R?locale=en-US"}},"__actions__":{},"id":"a2ae7f9ab7ed668b73b83ddfdada04e3","title":"HarmonQuest","resource_type":"panel","display_type":"shelf","description":"A comedic journey into the hilarious world of fantasy roleplaying with Dan Harmon and his Comedian Companions. A new Seeso Original Series starring Dan Harmon, Spencer Crittenden, Erin McGathy, and Jeff B. Davis. Guest stars include, Paul F. Tompkins, Chelsea Peretti, Steve Agee, Aubrey Plaza, Thomas Middleditch, Kumail Nanjiani, & more.","source_media_id":"none","source_media_title":"none","original_id":"GRNQZ129R","version":null}},"e1cc724c96c70c7333af19a70a0eaaf8":{"reference":{"__class__":"home_feed_item","__href__":"","__links__":{"resource":{"href":"/cms/v2/US/M2/-/series/G6KE1QDW6?locale=en-US"}},"__actions__":{},"id":"e1cc724c96c70c7333af19a70a0eaaf8","title":"Kinda Funny Podcast","resource_type":"panel","display_type":"shelf","description":"A look into the shenanigan filled lives of the core Kinda Funny cast. 
Hosted by Greg Miller, Tim Gettys, Nick Scarpino, and Andy Cortez.","source_media_id":"none","source_media_title":"none","original_id":"G6KE1QDW6","version":null}},"dad83360976cf44ef69c13fb75f8b04a":{"reference":{"__class__":"home_feed_item","__href__":"","__links__":{"resource":{"href":"/cms/v2/US/M2/-/series/G6W4M3V0R?locale=en-US"}},"__actions__":{},"id":"dad83360976cf44ef69c13fb75f8b04a","title":"MADE IN ABYSS","resource_type":"panel","display_type":"shelf","description":"Within the depths of the Abyss, a girl named Riko stumbles upon a robot who looks like a young boy. Riko and her new friend descend into uncharted territory to unlock its mysteries, but what lies in wait for them in the darkness?","source_media_id":"none","source_media_title":"none","original_id":"G6W4M3V0R","version":null}},"b8b1cd76aeedb1bac4a8d98f1b70782f":{"reference":{"__class__":"home_feed_item","__href__":"","__links__":{"resource":{"href":"/cms/v2/US/M2/-/series/G6Q485KPR?locale=en-US"}},"__actions__":{},"id":"b8b1cd76aeedb1bac4a8d98f1b70782f","title":"Happy Tree Friends ","resource_type":"panel","display_type":"shelf","description":"","source_media_id":"none","source_media_title":"none","original_id":"G6Q485KPR","version":null}},"155f622e6446b0626d20e9df14e9e238":{"reference":{"__class__":"home_feed_item","__href__":"","__links__":{"resource":{"href":"/cms/v2/US/M2/-/series/G65VMP8Q6?locale=en-US"}},"__actions__":{},"id":"155f622e6446b0626d20e9df14e9e238","title":"Shadowstone Park","resource_type":"panel","display_type":"shelf","description":"From Charlie the Unicorn creator Jason Steele of Filmcow comes Shadowstone Park. A pelican with a cat in its mouth attempts to uncover the secrets of Shadowstone Park, a nature preserve where tourists have been mysteriously disappearing. The pelican is an athletic warrior-type and the cat is an investigative mastermind. 
Together they form the only mystery-solving duo capable of solving these tricky cases.","source_media_id":"none","source_media_title":"none","original_id":"G65VMP8Q6","version":null}},"f07cae91a46b58050101fc9ab56250da":{"reference":{"__class__":"home_feed_item","__href__":"","__links__":{"resource":{"href":"/cms/v2/US/M2/-/curated_feeds/G6KKD3XE6?locale=en-US&version=1.1"}},"__actions__":{},"id":"f07cae91a46b58050101fc9ab56250da","title":"People Die When They Are Killed","resource_type":"curated_collection","display_type":"shelf","description":"","source_media_id":"none","source_media_title":"none","original_id":"G6KKD3XE6","version":null}},"c2cf56995a1a3ba30050058973ac2339":{"reference":{"__class__":"home_feed_item","__href__":"","__links__":{"resource":{"href":"/cms/v2/US/M2/-/series/G6KK5G7G6?locale=en-US"}},"__actions__":{},"id":"c2cf56995a1a3ba30050058973ac2339","title":"Ultraseven","resource_type":"panel","display_type":"shelf","description":"As an “Interstellar War of Invasion” spreads across the universe, a scout from the Land of Light in Nebula M78 comes to Earth on a mission of peace.","source_media_id":"none","source_media_title":"none","original_id":"G6KK5G7G6","version":null}},"8f0db24a72d8835fc1642cc3af734e28":{"reference":{"__class__":"home_feed_item","__href__":"","__links__":{"resource":{"href":"/cms/v2/US/M2/-/series/G79H23G0D?locale=en-US"}},"__actions__":{},"id":"8f0db24a72d8835fc1642cc3af734e28","title":"So I'm a Spider, So What?","resource_type":"panel","display_type":"shelf","description":"I, the protagonist, was just an ordinary high school girl, but suddenly I was reincarnated as a spider monster in a fantasy world. Not only that, but I awakened in a dungeon filled with vicious monsters. Armed with only my human knowledge and my overwhelming positivity, I'm forced to use spiderwebs and traps to defeat far stronger monsters just to stay alive... 
So begins the labyrinth survival story of a girl with incredible mental strength living as one of the lowest-ranked beasts!","source_media_id":"none","source_media_title":"none","original_id":"G79H23G0D","version":null}},"7c99b1d8bbed0d0272e9f134dd1cea05":{"reference":{"__class__":"home_feed_item","__href__":"","__links__":{"resource":{"href":"/cms/v2/US/M2/-/series/GYP8VJJQY?locale=en-US"}},"__actions__":{},"id":"7c99b1d8bbed0d0272e9f134dd1cea05","title":"Bravest Warriors","resource_type":"panel","display_type":"shelf","description":"Bravest Warriors follows four teenaged heroes-for-hire as they warp through the universe to save adorable aliens and their worlds using the power of their emotions. Created by Pendleton Ward, the mind behind Cartoon Network's Adventure Time.","source_media_id":"none","source_media_title":"none","original_id":"GYP8VJJQY","version":null}},"520d4acaa213a8c8881b1bed3991faa5":{"reference":{"__class__":"home_feed_item","__href__":"","__links__":{"resource":{"href":"/cms/v2/US/M2/-/curated_feeds/GYG53DGDY?locale=en-US&version=1.1"}},"__actions__":{},"id":"520d4acaa213a8c8881b1bed3991faa5","title":"Not saying it was aliens... 
but it was ALIENS","resource_type":"curated_collection","display_type":"shelf","description":"","source_media_id":"none","source_media_title":"none","original_id":"GYG53DGDY","version":null}}},"allIdsMap":{"home":["ea6ba25c9e01fb8ea569c3243410b0ea","df016415e47c2cc35f5bd9ba4b396ef2","d417565f8000121785ccfc50f15ea283","b86092c279ba81d716bebabce315090c","7584383dc2159f4941f0ce08dc5073ad","943b4d5ac3f5b9400c058255ffb079a2","772ce5a996b502565a6f2db2ead4ec5b","3295b943c8a5a5e4d5bb9cc6a4364b51","7cda4eb9d2e2ba89fffada0e0148bf40","4e54f5a702e54e7b119afb964e6593b5","65b7c10023d4dc6f228a3b3f6f824e8e","cb0619299c66c5b335127aa24c979ae8","5d0d4335deb157894bf035df9e986068","a2ae7f9ab7ed668b73b83ddfdada04e3","e1cc724c96c70c7333af19a70a0eaaf8","dad83360976cf44ef69c13fb75f8b04a","b8b1cd76aeedb1bac4a8d98f1b70782f","155f622e6446b0626d20e9df14e9e238","f07cae91a46b58050101fc9ab56250da","c2cf56995a1a3ba30050058973ac2339","8f0db24a72d8835fc1642cc3af734e28","7c99b1d8bbed0d0272e9f134dd1cea05","520d4acaa213a8c8881b1bed3991faa5"]},"error":null},"upNext":{"byId":{}},"content":{"byId":{"G6497Z43Y":{"__class__":"series","__href__":"/cms/v2/US/M2/-/series/G6497Z43Y","__resource_key__":"cms:/series/G6497Z43Y","__links__":{"series/channel":{"href":"/cms/v2/US/M2/-/channels/crunchyroll"},"series/seasons":{"href":"/cms/v2/US/M2/-/seasons?series_id=G6497Z43Y"}},"__actions__":{},"id":"G6497Z43Y","channel_id":"crunchyroll","title":"Miss Kobayashi's Dragon Maid","slug":"","slug_title":"miss-kobayashis-dragon-maid","description":"Miss Kobayashi is your average office worker who lives a boring life, alone in her small apartment–until she saves the life of a female dragon in distress. The dragon, named Tohru, has the ability to magically transform into an adorable human girl (albeit with horns and a long tail!), who will do anything to pay off her debt of gratitude, whether Miss Kobayashi likes it or not. 
With a very persistent and amorous dragon as a roommate, nothing comes easy, and Miss Kobayashi’s normal life is about to go off the deep end!","extended_description":"","keywords":["kobayashi-san chi no maid dragon","miss kobayashi's maid dragon","slice of life","romance","comedy","fantasy","deutsche synchro","dublado","vf","seinen"],"season_tags":["summer-2021"],"images":{"poster_tall":[[{"width":60,"height":90,"type":"poster_tall","source":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/60x90/catalog/crunchyroll/6f47c37390d0efe7ab61e34f93ddcfe7.jpeg"},{"width":120,"height":180,"type":"poster_tall","source":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/120x180/catalog/crunchyroll/6f47c37390d0efe7ab61e34f93ddcfe7.jpeg"},{"width":240,"height":360,"type":"poster_tall","source":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/240x360/catalog/crunchyroll/6f47c37390d0efe7ab61e34f93ddcfe7.jpeg"},{"width":480,"height":720,"type":"poster_tall","source":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/480x720/catalog/crunchyroll/6f47c37390d0efe7ab61e34f93ddcfe7.jpeg"},{"width":750,"height":1125,"type":"poster_tall","source":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/750x1125/catalog/crunchyroll/6f47c37390d0efe7ab61e34f93ddcfe7.jpeg"},{"width":960,"height":1440,"type":"poster_tall","source":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/960x1440/catalog/crunchyroll/6f47c37390d0efe7ab61e34f93ddcfe7.jpeg"},{"width":1125,"height":1688,"type":"poster_tall","source":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1125x1688/catalog/crunchyroll/6f47c37390d0efe7ab61e34f93ddcfe7.jpeg"},{"width":1200,"height":1800,"type":"poster_tall","source":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1200x1800/catalog/crunchyroll/6f47c37390d0efe7ab61e34f93ddcfe7.jpeg"},{"width":1560,"height":2340,"type":"poster_tall","source":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1560x2340/catalog/crunchyroll/6f47c37390d0efe7ab61e34f
93ddcfe7.jpeg"}]],"poster_wide":[[{"width":320,"height":180,"type":"poster_wide","source":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/320x180/catalog/crunchyroll/9af6719d7998089e7fe249c1c60986e2.jpeg"},{"width":600,"height":338,"type":"poster_wide","source":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/600x338/catalog/crunchyroll/9af6719d7998089e7fe249c1c60986e2.jpeg"},{"width":640,"height":360,"type":"poster_wide","source":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/640x360/catalog/crunchyroll/9af6719d7998089e7fe249c1c60986e2.jpeg"},{"width":800,"height":450,"type":"poster_wide","source":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/800x450/catalog/crunchyroll/9af6719d7998089e7fe249c1c60986e2.jpeg"},{"width":1200,"height":675,"type":"poster_wide","source":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1200x675/catalog/crunchyroll/9af6719d7998089e7fe249c1c60986e2.jpeg"},{"width":1440,"height":810,"type":"poster_wide","source":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1440x810/catalog/crunchyroll/9af6719d7998089e7fe249c1c60986e2.jpeg"},{"width":1600,"height":900,"type":"poster_wide","source":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1600x900/catalog/crunchyroll/9af6719d7998089e7fe249c1c60986e2.jpeg"},{"width":1920,"height":1080,"type":"poster_wide","source":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1920x1080/catalog/crunchyroll/9af6719d7998089e7fe249c1c60986e2.jpeg"}]]},"maturity_ratings":["TV-14"],"episode_count":107,"season_count":13,"media_count":107,"content_provider":"ABC 
Asahi","is_mature":false,"mature_blocked":false,"is_subbed":true,"is_dubbed":false,"is_simulcast":true,"seo_title":"","seo_description":"","availability_notes":""}}},"playheads":{"byId":{}},"overlay":{"isOverlayOpen":false,"visitedChannels":[]},"episodeSort":{"isOrderAscending":true},"channels":{"byId":{"cartoonhangover":{"__class__":"channel","__href__":"/cms/v2/US/M2/-/channels/cartoonhangover","__resource_key__":"cms:/channels/cartoonhangover","__links__":{"channel/curated_feeds":{"href":"/cms/v2/US/M2/-/curated_feeds?channel_id=cartoonhangover"},"channel/feed":{"href":"/cms/v2/US/M2/-/channels/cartoonhangover/feed"},"channel/movie_listings":{"href":"/cms/v2/US/M2/-/movie_listings?channel_id=cartoonhangover&mode=channel"},"channel/primary_feed":{"href":"/cms/v2/US/M2/-/curated_feeds/GY3VD49ZR?version=1.1"},"channel/primary_feed_expanded":{"href":"/cms/v2/US/M2/-/curated_feeds/GY3VD49ZR?version=1.1&expand=true"},"channel/series":{"href":"/cms/v2/US/M2/-/series?channel_id=cartoonhangover&mode=channel"}},"__actions__":{},"id":"cartoonhangover","name":"Cartoon Hangover","description":"Perfectly odd entertainment for perfect people, like you.","slug":"Indie 
Cartoons","primary_background_color":"#EC2024","images":{"channel_promo":{"320":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/320x180/catalog/cartoonhangover/78b4e19547e95a038734ee545a1e604b.png","600":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/600x338/catalog/cartoonhangover/78b4e19547e95a038734ee545a1e604b.png","640":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/640x360/catalog/cartoonhangover/78b4e19547e95a038734ee545a1e604b.png","800":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/800x450/catalog/cartoonhangover/78b4e19547e95a038734ee545a1e604b.png","1200":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1200x675/catalog/cartoonhangover/78b4e19547e95a038734ee545a1e604b.png","1440":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1440x810/catalog/cartoonhangover/78b4e19547e95a038734ee545a1e604b.png","1600":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1600x900/catalog/cartoonhangover/78b4e19547e95a038734ee545a1e604b.png","1920":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1920x1080/catalog/cartoonhangover/78b4e19547e95a038734ee545a1e604b.png"},"logo_mark":{"20":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/20x20/catalog/cartoonhangover/bb0dae7fdb5cd466bd7b924332368bf2.png","40":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/40x40/catalog/cartoonhangover/bb0dae7fdb5cd466bd7b924332368bf2.png","46":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/46x46/catalog/cartoonhangover/bb0dae7fdb5cd466bd7b924332368bf2.png","92":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/92x92/catalog/cartoonhangover/bb0dae7fdb5cd466bd7b924332368bf2.png","100":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/100x100/catalog/cartoonhangover/bb0dae7fdb5cd466bd7b924332368bf2.png","180":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/180x180/catalog/cartoonhangover/bb0dae7fdb5cd466bd7b924332368bf2.png","200":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/200x200/catalog/cart
oonhangover/bb0dae7fdb5cd466bd7b924332368bf2.png","360":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/360x360/catalog/cartoonhangover/bb0dae7fdb5cd466bd7b924332368bf2.png","400":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/400x400/catalog/cartoonhangover/bb0dae7fdb5cd466bd7b924332368bf2.png","600":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/600x600/catalog/cartoonhangover/bb0dae7fdb5cd466bd7b924332368bf2.png","800":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/800x800/catalog/cartoonhangover/bb0dae7fdb5cd466bd7b924332368bf2.png","1000":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1000x1000/catalog/cartoonhangover/bb0dae7fdb5cd466bd7b924332368bf2.png"},"logo_mark_simple":{"20":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/20x20/catalog/cartoonhangover/fe6d9f6e046d18d85c3e260517324bf2.png","40":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/40x40/catalog/cartoonhangover/fe6d9f6e046d18d85c3e260517324bf2.png","46":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/46x46/catalog/cartoonhangover/fe6d9f6e046d18d85c3e260517324bf2.png","92":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/92x92/catalog/cartoonhangover/fe6d9f6e046d18d85c3e260517324bf2.png","100":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/100x100/catalog/cartoonhangover/fe6d9f6e046d18d85c3e260517324bf2.png","180":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/180x180/catalog/cartoonhangover/fe6d9f6e046d18d85c3e260517324bf2.png","200":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/200x200/catalog/cartoonhangover/fe6d9f6e046d18d85c3e260517324bf2.png","360":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/360x360/catalog/cartoonhangover/fe6d9f6e046d18d85c3e260517324bf2.png","400":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/400x400/catalog/cartoonhangover/fe6d9f6e046d18d85c3e260517324bf2.png","600":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/600x600/catalog/cartoonhangover/fe6d9f6e046
d18d85c3e260517324bf2.png","800":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/800x800/catalog/cartoonhangover/fe6d9f6e046d18d85c3e260517324bf2.png","1000":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1000x1000/catalog/cartoonhangover/fe6d9f6e046d18d85c3e260517324bf2.png"}}},"crunchyroll":{"__class__":"channel","__href__":"/cms/v2/US/M2/-/channels/crunchyroll","__resource_key__":"cms:/channels/crunchyroll","__links__":{"channel/curated_feeds":{"href":"/cms/v2/US/M2/-/curated_feeds?channel_id=crunchyroll"},"channel/feed":{"href":"/cms/v2/US/M2/-/channels/crunchyroll/feed"},"channel/movie_listings":{"href":"/cms/v2/US/M2/-/movie_listings?channel_id=crunchyroll&mode=channel"},"channel/primary_feed":{"href":"/cms/v2/US/M2/-/curated_feeds/GY5VE1WPY?version=1.1"},"channel/primary_feed_expanded":{"href":"/cms/v2/US/M2/-/curated_feeds/GY5VE1WPY?version=1.1&expand=true"},"channel/series":{"href":"/cms/v2/US/M2/-/series?channel_id=crunchyroll&mode=channel"}},"__actions__":{},"id":"crunchyroll","name":"Crunchyroll","description":"The most comprehensive library of new and classic anime and more from Japan.","slug":"World's Largest Anime 
Collection","primary_background_color":"#F47521","images":{"channel_promo":{"320":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/320x180/catalog/crunchyroll/643094d094a400c93cbd05460d1edbda.jpg","600":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/600x338/catalog/crunchyroll/643094d094a400c93cbd05460d1edbda.jpg","640":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/640x360/catalog/crunchyroll/643094d094a400c93cbd05460d1edbda.jpg","800":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/800x450/catalog/crunchyroll/643094d094a400c93cbd05460d1edbda.jpg","1200":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1200x675/catalog/crunchyroll/643094d094a400c93cbd05460d1edbda.jpg","1440":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1440x810/catalog/crunchyroll/643094d094a400c93cbd05460d1edbda.jpg","1600":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1600x900/catalog/crunchyroll/643094d094a400c93cbd05460d1edbda.jpg","1920":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1920x1080/catalog/crunchyroll/643094d094a400c93cbd05460d1edbda.jpg"},"logo_mark":{"20":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/20x20/catalog/crunchyroll/f7b0fbfd084de75d4c5e515063ead238.png","40":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/40x40/catalog/crunchyroll/f7b0fbfd084de75d4c5e515063ead238.png","46":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/46x46/catalog/crunchyroll/f7b0fbfd084de75d4c5e515063ead238.png","92":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/92x92/catalog/crunchyroll/f7b0fbfd084de75d4c5e515063ead238.png","100":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/100x100/catalog/crunchyroll/f7b0fbfd084de75d4c5e515063ead238.png","180":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/180x180/catalog/crunchyroll/f7b0fbfd084de75d4c5e515063ead238.png","200":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/200x200/catalog/crunchyroll/f7b0fbfd084de75d4c5e515063ead238.png","360":"h
ttps://beta.crunchyroll.com/imgsrv/display/thumbnail/360x360/catalog/crunchyroll/f7b0fbfd084de75d4c5e515063ead238.png","400":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/400x400/catalog/crunchyroll/f7b0fbfd084de75d4c5e515063ead238.png","600":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/600x600/catalog/crunchyroll/f7b0fbfd084de75d4c5e515063ead238.png","800":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/800x800/catalog/crunchyroll/f7b0fbfd084de75d4c5e515063ead238.png","1000":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1000x1000/catalog/crunchyroll/f7b0fbfd084de75d4c5e515063ead238.png"},"logo_mark_simple":{"20":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/20x20/catalog/crunchyroll/2f608375a63408fd2a808049ebe1177d.png","40":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/40x40/catalog/crunchyroll/2f608375a63408fd2a808049ebe1177d.png","46":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/46x46/catalog/crunchyroll/2f608375a63408fd2a808049ebe1177d.png","92":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/92x92/catalog/crunchyroll/2f608375a63408fd2a808049ebe1177d.png","100":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/100x100/catalog/crunchyroll/2f608375a63408fd2a808049ebe1177d.png","180":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/180x180/catalog/crunchyroll/2f608375a63408fd2a808049ebe1177d.png","200":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/200x200/catalog/crunchyroll/2f608375a63408fd2a808049ebe1177d.png","360":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/360x360/catalog/crunchyroll/2f608375a63408fd2a808049ebe1177d.png","400":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/400x400/catalog/crunchyroll/2f608375a63408fd2a808049ebe1177d.png","600":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/600x600/catalog/crunchyroll/2f608375a63408fd2a808049ebe1177d.png","800":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/800x800/catalog/crunchyroll/2f
608375a63408fd2a808049ebe1177d.png","1000":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1000x1000/catalog/crunchyroll/2f608375a63408fd2a808049ebe1177d.png"}}},"hidive":{"__class__":"channel","__href__":"/cms/v2/US/M2/-/channels/hidive","__resource_key__":"cms:/channels/hidive","__links__":{"channel/curated_feeds":{"href":"/cms/v2/US/M2/-/curated_feeds?channel_id=hidive"},"channel/feed":{"href":"/cms/v2/US/M2/-/channels/hidive/feed"},"channel/movie_listings":{"href":"/cms/v2/US/M2/-/movie_listings?channel_id=hidive&mode=channel"},"channel/primary_feed":{"href":"/cms/v2/US/M2/-/curated_feeds/GYM8EK8WY?version=1.1"},"channel/primary_feed_expanded":{"href":"/cms/v2/US/M2/-/curated_feeds/GYM8EK8WY?version=1.1&expand=true"},"channel/series":{"href":"/cms/v2/US/M2/-/series?channel_id=hidive&mode=channel"}},"__actions__":{},"id":"hidive","name":"HIDIVE","description":"The best simulcasts, dubs, exclusives, uncensored anime and live-action series – all in one place.\n","slug":"Stream Anime & 
Stuff","primary_background_color":"#00aeef","images":{"channel_promo":{"320":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/320x180/catalog/hidive/b7f5ccfc79d5ae3005233f4560fe16f5.png","600":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/600x338/catalog/hidive/b7f5ccfc79d5ae3005233f4560fe16f5.png","640":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/640x360/catalog/hidive/b7f5ccfc79d5ae3005233f4560fe16f5.png","800":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/800x450/catalog/hidive/b7f5ccfc79d5ae3005233f4560fe16f5.png","1200":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1200x675/catalog/hidive/b7f5ccfc79d5ae3005233f4560fe16f5.png","1440":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1440x810/catalog/hidive/b7f5ccfc79d5ae3005233f4560fe16f5.png","1600":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1600x900/catalog/hidive/b7f5ccfc79d5ae3005233f4560fe16f5.png","1920":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1920x1080/catalog/hidive/b7f5ccfc79d5ae3005233f4560fe16f5.png"},"logo_mark":{"20":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/20x20/catalog/hidive/3252360d921815bf55b43db0c2b8a8c5.png","40":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/40x40/catalog/hidive/3252360d921815bf55b43db0c2b8a8c5.png","46":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/46x46/catalog/hidive/3252360d921815bf55b43db0c2b8a8c5.png","92":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/92x92/catalog/hidive/3252360d921815bf55b43db0c2b8a8c5.png","100":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/100x100/catalog/hidive/3252360d921815bf55b43db0c2b8a8c5.png","180":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/180x180/catalog/hidive/3252360d921815bf55b43db0c2b8a8c5.png","200":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/200x200/catalog/hidive/3252360d921815bf55b43db0c2b8a8c5.png","360":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/360x360/catalog/hidive/3252
360d921815bf55b43db0c2b8a8c5.png","400":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/400x400/catalog/hidive/3252360d921815bf55b43db0c2b8a8c5.png","600":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/600x600/catalog/hidive/3252360d921815bf55b43db0c2b8a8c5.png","800":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/800x800/catalog/hidive/3252360d921815bf55b43db0c2b8a8c5.png","1000":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1000x1000/catalog/hidive/3252360d921815bf55b43db0c2b8a8c5.png"},"logo_mark_simple":{"20":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/20x20/catalog/hidive/5d65b6c92604fc2e793d3f621805e6ad.png","40":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/40x40/catalog/hidive/5d65b6c92604fc2e793d3f621805e6ad.png","46":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/46x46/catalog/hidive/5d65b6c92604fc2e793d3f621805e6ad.png","92":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/92x92/catalog/hidive/5d65b6c92604fc2e793d3f621805e6ad.png","100":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/100x100/catalog/hidive/5d65b6c92604fc2e793d3f621805e6ad.png","180":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/180x180/catalog/hidive/5d65b6c92604fc2e793d3f621805e6ad.png","200":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/200x200/catalog/hidive/5d65b6c92604fc2e793d3f621805e6ad.png","360":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/360x360/catalog/hidive/5d65b6c92604fc2e793d3f621805e6ad.png","400":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/400x400/catalog/hidive/5d65b6c92604fc2e793d3f621805e6ad.png","600":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/600x600/catalog/hidive/5d65b6c92604fc2e793d3f621805e6ad.png","800":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/800x800/catalog/hidive/5d65b6c92604fc2e793d3f621805e6ad.png","1000":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1000x1000/catalog/hidive/5d65b6c92604fc2e793d3f621805e6ad.png"
}}},"mondo":{"__class__":"channel","__href__":"/cms/v2/US/M2/-/channels/mondo","__resource_key__":"cms:/channels/mondo","__links__":{"channel/curated_feeds":{"href":"/cms/v2/US/M2/-/curated_feeds?channel_id=mondo"},"channel/feed":{"href":"/cms/v2/US/M2/-/channels/mondo/feed"},"channel/movie_listings":{"href":"/cms/v2/US/M2/-/movie_listings?channel_id=mondo&mode=channel"},"channel/primary_feed":{"href":"/cms/v2/US/M2/-/curated_feeds/GYQ4WKX96?version=1.1"},"channel/primary_feed_expanded":{"href":"/cms/v2/US/M2/-/curated_feeds/GYQ4WKX96?version=1.1&expand=true"},"channel/series":{"href":"/cms/v2/US/M2/-/series?channel_id=mondo&mode=channel"}},"__actions__":{},"id":"mondo","name":"Mondo","description":"Mondo is your home for all things animated. Bold. Subversive. Irreverent.","slug":"Extreme animation","primary_background_color":"#E93735","images":{"channel_promo":{"320":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/320x180/catalog/mondo/a27459c4be91f959af3df26e35075ee3.png","600":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/600x338/catalog/mondo/a27459c4be91f959af3df26e35075ee3.png","640":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/640x360/catalog/mondo/a27459c4be91f959af3df26e35075ee3.png","800":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/800x450/catalog/mondo/a27459c4be91f959af3df26e35075ee3.png","1200":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1200x675/catalog/mondo/a27459c4be91f959af3df26e35075ee3.png","1440":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1440x810/catalog/mondo/a27459c4be91f959af3df26e35075ee3.png","1600":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1600x900/catalog/mondo/a27459c4be91f959af3df26e35075ee3.png","1920":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1920x1080/catalog/mondo/a27459c4be91f959af3df26e35075ee3.png"},"logo_mark":{"20":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/20x20/catalog/mondo/081b887593b6fc979fb721665ad4ef31.png","40":"https://be
ta.crunchyroll.com/imgsrv/display/thumbnail/40x40/catalog/mondo/081b887593b6fc979fb721665ad4ef31.png","46":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/46x46/catalog/mondo/081b887593b6fc979fb721665ad4ef31.png","92":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/92x92/catalog/mondo/081b887593b6fc979fb721665ad4ef31.png","100":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/100x100/catalog/mondo/081b887593b6fc979fb721665ad4ef31.png","180":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/180x180/catalog/mondo/081b887593b6fc979fb721665ad4ef31.png","200":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/200x200/catalog/mondo/081b887593b6fc979fb721665ad4ef31.png","360":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/360x360/catalog/mondo/081b887593b6fc979fb721665ad4ef31.png","400":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/400x400/catalog/mondo/081b887593b6fc979fb721665ad4ef31.png","600":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/600x600/catalog/mondo/081b887593b6fc979fb721665ad4ef31.png","800":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/800x800/catalog/mondo/081b887593b6fc979fb721665ad4ef31.png","1000":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1000x1000/catalog/mondo/081b887593b6fc979fb721665ad4ef31.png"},"logo_mark_simple":{"20":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/20x20/catalog/mondo/e4e43ea71ac360a96d36ef760883559a.png","40":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/40x40/catalog/mondo/e4e43ea71ac360a96d36ef760883559a.png","46":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/46x46/catalog/mondo/e4e43ea71ac360a96d36ef760883559a.png","92":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/92x92/catalog/mondo/e4e43ea71ac360a96d36ef760883559a.png","100":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/100x100/catalog/mondo/e4e43ea71ac360a96d36ef760883559a.png","180":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/180x180/catalog/
mondo/e4e43ea71ac360a96d36ef760883559a.png","200":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/200x200/catalog/mondo/e4e43ea71ac360a96d36ef760883559a.png","360":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/360x360/catalog/mondo/e4e43ea71ac360a96d36ef760883559a.png","400":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/400x400/catalog/mondo/e4e43ea71ac360a96d36ef760883559a.png","600":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/600x600/catalog/mondo/e4e43ea71ac360a96d36ef760883559a.png","800":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/800x800/catalog/mondo/e4e43ea71ac360a96d36ef760883559a.png","1000":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1000x1000/catalog/mondo/e4e43ea71ac360a96d36ef760883559a.png"}}},"roosterteeth":{"__class__":"channel","__href__":"/cms/v2/US/M2/-/channels/roosterteeth","__resource_key__":"cms:/channels/roosterteeth","__links__":{"channel/curated_feeds":{"href":"/cms/v2/US/M2/-/curated_feeds?channel_id=roosterteeth"},"channel/feed":{"href":"/cms/v2/US/M2/-/channels/roosterteeth/feed"},"channel/movie_listings":{"href":"/cms/v2/US/M2/-/movie_listings?channel_id=roosterteeth&mode=channel"},"channel/primary_feed":{"href":"/cms/v2/US/M2/-/curated_feeds/G6JQ9J33R?version=1.1"},"channel/primary_feed_expanded":{"href":"/cms/v2/US/M2/-/curated_feeds/G6JQ9J33R?version=1.1&expand=true"},"channel/series":{"href":"/cms/v2/US/M2/-/series?channel_id=roosterteeth&mode=channel"}},"__actions__":{},"id":"roosterteeth","name":"Rooster Teeth","description":"Award-winning shows for fans of animation, gaming, and comedy.","slug":" Original online 
hits","primary_background_color":"#AF272F","images":{"channel_promo":{"320":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/320x180/catalog/roosterteeth/56d1c5df6717367cfeefc82f02957182.png","600":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/600x338/catalog/roosterteeth/56d1c5df6717367cfeefc82f02957182.png","640":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/640x360/catalog/roosterteeth/56d1c5df6717367cfeefc82f02957182.png","800":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/800x450/catalog/roosterteeth/56d1c5df6717367cfeefc82f02957182.png","1200":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1200x675/catalog/roosterteeth/56d1c5df6717367cfeefc82f02957182.png","1440":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1440x810/catalog/roosterteeth/56d1c5df6717367cfeefc82f02957182.png","1600":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1600x900/catalog/roosterteeth/56d1c5df6717367cfeefc82f02957182.png","1920":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1920x1080/catalog/roosterteeth/56d1c5df6717367cfeefc82f02957182.png"},"logo_mark":{"20":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/20x20/catalog/roosterteeth/c953ff0cd7ae00f074acd2a35cd3675e.png","40":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/40x40/catalog/roosterteeth/c953ff0cd7ae00f074acd2a35cd3675e.png","46":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/46x46/catalog/roosterteeth/c953ff0cd7ae00f074acd2a35cd3675e.png","92":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/92x92/catalog/roosterteeth/c953ff0cd7ae00f074acd2a35cd3675e.png","100":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/100x100/catalog/roosterteeth/c953ff0cd7ae00f074acd2a35cd3675e.png","180":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/180x180/catalog/roosterteeth/c953ff0cd7ae00f074acd2a35cd3675e.png","200":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/200x200/catalog/roosterteeth/c953ff0cd7ae00f074acd2a35cd3675e.png"
,"360":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/360x360/catalog/roosterteeth/c953ff0cd7ae00f074acd2a35cd3675e.png","400":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/400x400/catalog/roosterteeth/c953ff0cd7ae00f074acd2a35cd3675e.png","600":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/600x600/catalog/roosterteeth/c953ff0cd7ae00f074acd2a35cd3675e.png","800":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/800x800/catalog/roosterteeth/c953ff0cd7ae00f074acd2a35cd3675e.png","1000":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1000x1000/catalog/roosterteeth/c953ff0cd7ae00f074acd2a35cd3675e.png"},"logo_mark_simple":{"20":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/20x20/catalog/roosterteeth/84580e82f7182ab2f523832390a2ca9b.png","40":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/40x40/catalog/roosterteeth/84580e82f7182ab2f523832390a2ca9b.png","46":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/46x46/catalog/roosterteeth/84580e82f7182ab2f523832390a2ca9b.png","92":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/92x92/catalog/roosterteeth/84580e82f7182ab2f523832390a2ca9b.png","100":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/100x100/catalog/roosterteeth/84580e82f7182ab2f523832390a2ca9b.png","180":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/180x180/catalog/roosterteeth/84580e82f7182ab2f523832390a2ca9b.png","200":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/200x200/catalog/roosterteeth/84580e82f7182ab2f523832390a2ca9b.png","360":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/360x360/catalog/roosterteeth/84580e82f7182ab2f523832390a2ca9b.png","400":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/400x400/catalog/roosterteeth/84580e82f7182ab2f523832390a2ca9b.png","600":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/600x600/catalog/roosterteeth/84580e82f7182ab2f523832390a2ca9b.png","800":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/800x80
0/catalog/roosterteeth/84580e82f7182ab2f523832390a2ca9b.png","1000":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1000x1000/catalog/roosterteeth/84580e82f7182ab2f523832390a2ca9b.png"}}},"vrvselect":{"__class__":"channel","__href__":"/cms/v2/US/M2/-/channels/vrvselect","__resource_key__":"cms:/channels/vrvselect","__links__":{"channel/curated_feeds":{"href":"/cms/v2/US/M2/-/curated_feeds?channel_id=vrvselect"},"channel/feed":{"href":"/cms/v2/US/M2/-/channels/vrvselect/feed"},"channel/movie_listings":{"href":"/cms/v2/US/M2/-/movie_listings?channel_id=vrvselect&mode=channel"},"channel/primary_feed":{"href":"/cms/v2/US/M2/-/curated_feeds/GYNQZ209Y?version=1.1"},"channel/primary_feed_expanded":{"href":"/cms/v2/US/M2/-/curated_feeds/GYNQZ209Y?version=1.1&expand=true"},"channel/series":{"href":"/cms/v2/US/M2/-/series?channel_id=vrvselect&mode=channel"}},"__actions__":{},"id":"vrvselect","name":"VRV Select","description":"VRV's hand-picked shows and movies, just for Premium members.","slug":"Discover VRV's next big 
thing","primary_background_color":"#808285","images":{"channel_promo":{"320":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/320x180/catalog/vrvselect/df705d6badbeedef739e44f21311c590.jpg","600":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/600x338/catalog/vrvselect/df705d6badbeedef739e44f21311c590.jpg","640":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/640x360/catalog/vrvselect/df705d6badbeedef739e44f21311c590.jpg","800":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/800x450/catalog/vrvselect/df705d6badbeedef739e44f21311c590.jpg","1200":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1200x675/catalog/vrvselect/df705d6badbeedef739e44f21311c590.jpg","1440":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1440x810/catalog/vrvselect/df705d6badbeedef739e44f21311c590.jpg","1600":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1600x900/catalog/vrvselect/df705d6badbeedef739e44f21311c590.jpg","1920":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1920x1080/catalog/vrvselect/df705d6badbeedef739e44f21311c590.jpg"},"logo_mark":{"20":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/20x20/catalog/vrvselect/8f0f6b18e05bf89359d29aac1ce4a64d.png","40":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/40x40/catalog/vrvselect/8f0f6b18e05bf89359d29aac1ce4a64d.png","46":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/46x46/catalog/vrvselect/8f0f6b18e05bf89359d29aac1ce4a64d.png","92":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/92x92/catalog/vrvselect/8f0f6b18e05bf89359d29aac1ce4a64d.png","100":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/100x100/catalog/vrvselect/8f0f6b18e05bf89359d29aac1ce4a64d.png","180":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/180x180/catalog/vrvselect/8f0f6b18e05bf89359d29aac1ce4a64d.png","200":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/200x200/catalog/vrvselect/8f0f6b18e05bf89359d29aac1ce4a64d.png","360":"https://beta.crunchyroll.com/imgsrv/
display/thumbnail/360x360/catalog/vrvselect/8f0f6b18e05bf89359d29aac1ce4a64d.png","400":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/400x400/catalog/vrvselect/8f0f6b18e05bf89359d29aac1ce4a64d.png","600":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/600x600/catalog/vrvselect/8f0f6b18e05bf89359d29aac1ce4a64d.png","800":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/800x800/catalog/vrvselect/8f0f6b18e05bf89359d29aac1ce4a64d.png","1000":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/1000x1000/catalog/vrvselect/8f0f6b18e05bf89359d29aac1ce4a64d.png"},"logo_mark_simple":{"20":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/20x20/catalog/vrvselect/495617604205faf4ef10affdecf12006.png","40":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/40x40/catalog/vrvselect/495617604205faf4ef10affdecf12006.png","46":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/46x46/catalog/vrvselect/495617604205faf4ef10affdecf12006.png","92":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/92x92/catalog/vrvselect/495617604205faf4ef10affdecf12006.png","100":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/100x100/catalog/vrvselect/495617604205faf4ef10affdecf12006.png","180":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/180x180/catalog/vrvselect/495617604205faf4ef10affdecf12006.png","200":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/200x200/catalog/vrvselect/495617604205faf4ef10affdecf12006.png","360":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/360x360/catalog/vrvselect/495617604205faf4ef10affdecf12006.png","400":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/400x400/catalog/vrvselect/495617604205faf4ef10affdecf12006.png","600":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/600x600/catalog/vrvselect/495617604205faf4ef10affdecf12006.png","800":"https://beta.crunchyroll.com/imgsrv/display/thumbnail/800x800/catalog/vrvselect/495617604205faf4ef10affdecf12006.png","1000":"https://beta.crunchyrol
l.com/imgsrv/display/thumbnail/1000x1000/catalog/vrvselect/495617604205faf4ef10affdecf12006.png"}}}},"allIds":["cartoonhangover","crunchyroll","hidive","mondo","roosterteeth","vrvselect"],"isLoading":false,"fetchError":null},"router":{"isNotFound":false},"sunset":{"user":{},"lastRemindedAt":0,"modal":null}};
window.__APP_CONFIG__ = {"whatIsVrvVideoId":"LpOumrC4yno","serverSideDataFetching":{"enabled":true,"onEnterExecutionTimeout":6000},"logger":{"enabled":true},"debug":{"enabled":false},"newrelic":{"enabled":true},"segment":{"writeKey":"SIeNJozAqhQxDdHOOY6mvnSKKzHo1BvJ","defaultApiHost":"vrv-eec.etp-prod.com/v1","fallbackLibraryHost":"sa.etp-prod.com/analytics.js/v1/"},"drm":{"dashRolloutPercent":100},"payPal":{"env":"production"},"darkFeatureConfig":{"features":{"vendriPID":"7e1c53b9c4dddc90afc13ff0330d5695f5a82454a830f5ac4062251efa58b525","strictMode":"adc72b417b58bb9970bf5ac02fc275acf92977f3ae3c602ab638f1f39c243155","vilosDrmdash":"3f7362b27b3c7503e356a7dc22340f130299ec67c3081d0b864b508ac8ed8995","vilosDrmhls":"f6b48e4fcab5a38fe3f38e4a3553d6d973837addc20494e45faf5339f25a00be","vilosUrl":"79fd1e27d68052f4e55907d20273b1334df964a9c89e19314bbd1a9ec44ad287","sunset":"0829103205fbe7963996bfacbaeaab326bb8eded5bd97e9b677f7c05fbe44b5b","sunsetDateOverride":"f7e30c63e98fd5ca4d548b284dd594f79bdf9e16cc5265b8f341dcef0e32ccb3"},"storageKey":"efg_state","securedRoutes":["/feature(?![\\w\\d])","/feature(?![wd])"]},"staticDomain":"https://static.vrv.co","assetsPath":"/vrvweb","assetsBuildPath":"/build","crunchyrollSiteUrl":"https://www.crunchyroll.com","baseSiteUrl":"https://vrv.co","vilosPlayerUrl":"https://static.vrv.co/vilos/player.html","cancelHappyMealUrl":"https://www.crunchyroll.com/acct/membership","cxApiParams":{"oAuthKey":"OvqR158Z9212i41UkNRzooutpU9Vp0vuXD9K0zKAvJdXPh6LfMOro4stVQRS","oAuthSecret":"EBgJav6Z99M9jFLzcexL6iETovNGbobFAJGudkDKMloqaBJgdo9u3WNuumM1","apiDomain":"https://api.vrv.co","talkboxOAuth":{"oAuthKey":"5UZvuVsRWpcJYaINRg9PGPyd7mgFebkfEOwlzwspAWDYp62WipmD7C6UzYyG","oAuthSecret":"efYuyLgNTfEFN30d5F5Up6T1Ra3yMVI6FfxwXxcfqKVbkEOGtFklNosqBw8m"}},"redirectRoutes":[{"url":"/maintenance","patterns":[],"statusCode":302}],"authorizedRoutes":["^/account(?![wd])","^/signup/profile","^/watchlist(?![wd])"],"availableRoutes":["/","/account/link-accounts/*","/account/link
-accounts","/account/manage-notifications","/account/manage-payments","/account/manage-profile","/account/memberships","/account","/boomerang(/browse)?","/browse","/cartoonhangover(/browse)?","/crunchyroll(/browse)?","/crunchyroll/watch/*","/feature","/get-vrv","/gopremium-offline-viewing","/gopremium","/help","/hidive(/browse)?","/maintenance","/modal","/mondo(/browse)?","/nicksplat(/browse)?","/privacy","/redeem","/roosterteeth(/browse)?","/seeso(/browse)?","/series/*","/signin/att","/signin","/signup/att","/signup/profile","/signup","/terms","/tested(/browse)?","/tryfree","/unavailable","/unsubscribe","/vrvselect(/browse)?","/watch/*","/watchlist"],"NODE_ENV":"production"};
</script>
</div>
</html>
```
Not sure if it is helpful, but that's all I got for now.
"iss_type": "1",
"iss_reason": "1",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"yt_dlp/extractor/vrv.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null |
yt-dlp | yt-dlp | 50e93e03a7ca6ae35a319ea310104f7d6d91eee3 | https://github.com/yt-dlp/yt-dlp/issues/3183 | geo-blocked
site-bug | Tele5 has an extraction error | ### Checklist
- [X] I'm reporting a broken site
- [X] I've verified that I'm running yt-dlp version **2022.03.08.1**. ([update instructions](https://github.com/yt-dlp/yt-dlp#update))
- [X] I've checked that all provided URLs are alive and playable in a browser
- [X] I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/ytdl-org/youtube-dl#video-url-contains-an-ampersand-and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
- [X] I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
- [X] I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
- [ ] I've read about [sharing account credentials](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#are-you-willing-to-share-account-details-if-needed) and I'm willing to share it if required
### Region
Germany
### Description
Trying to download the current Andromeda series:
`yt-dlp -F https://tele5.de/mediathek/gene-roddenberrys-andromeda/`
`[Tele5] gene-roddenberrys-andromeda: Downloading webpage`
`ERROR: gene-roddenberrys-andromeda: An extractor error has occurred. (caused by KeyError('assetid')); please report this issue on https://github.com/yt-dlp/yt-dlp , filling out the "Broken site" issue template properly. Confirm you are on the latest ver`
### Verbose log
```shell
ERROR: gene-roddenberrys-andromeda: An extractor error has occurred. (caused by KeyError('assetid')); please report this issue on https://github.com/yt-dlp/yt-dlp , filling out the "Broken site" issue template properly. Confirm you are on the latest version using yt-dlp -U
File "/usr/bin/yt-dlp/yt_dlp/extractor/common.py", line 617, in extract
ie_result = self._real_extract(url)
File "/usr/bin/yt-dlp/yt_dlp/extractor/tele5.py", line 81, in _real_extract
asset_id, country, realm = (player_info[x] for x in ('assetid', 'locale', 'realm', ))
File "/usr/bin/yt-dlp/yt_dlp/extractor/tele5.py", line 81, in <genexpr>
asset_id, country, realm = (player_info[x] for x in ('assetid', 'locale', 'realm', ))
KeyError: 'assetid'
```
| null | null | https://github.com/yt-dlp/yt-dlp/commit/50e93e03a7ca6ae35a319ea310104f7d6d91eee3 | {'base_commit': '50e93e03a7ca6ae35a319ea310104f7d6d91eee3', 'files': [{'path': 'yt_dlp/YoutubeDL.py', 'status': 'modified', 'Loc': {}}, {'path': 'yt_dlp/extractor/aliexpress.py', 'status': 'modified', 'Loc': {"('AliExpressLiveIE', None, 12)": {'mod': [21]}}}, {'path': 'yt_dlp/extractor/applepodcasts.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [5, 6]}, "('ApplePodcastsIE', None, 15)": {'add': [26], 'mod': [17, 22, 24, 25, 42, 43, 44, 45, 46]}, "('ApplePodcastsIE', '_real_extract', 42)": {'add': [52, 61], 'mod': [50, 56]}}}, {'path': 'yt_dlp/extractor/arte.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [14]}, "('ArteTVPlaylistIE', '_real_extract', 230)": {'add': [255]}}}, {'path': 'yt_dlp/extractor/audiomack.py', 'status': 'modified', 'Loc': {"('AudiomackIE', None, 16)": {'add': [31]}}}, {'path': 'yt_dlp/extractor/bbc.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [13]}, "('BBCIE', None, 604)": {'add': [786, 791], 'mod': [796, 797, 799, 800, 801]}, "('BBCCoUkIE', None, 39)": {'mod': [41]}, "('BBCCoUkIE', '_process_media_selector', 363)": {'mod': [397, 398, 399]}, "('BBCIE', '_real_extract', 906)": {'mod': [1174, 1175, 1176]}, "('BBCIE', 'parse_media', 1206)": {'mod': [1217]}}}, {'path': 'yt_dlp/extractor/bigo.py', 'status': 'modified', 'Loc': {"('BigoIE', '_real_extract', 30)": {'add': [36], 'mod': [39, 47]}}}, {'path': 'yt_dlp/extractor/extractors.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [70, 93, 308]}}}, {'path': 'yt_dlp/extractor/nuvid.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [2], 'mod': [8]}, "('NuvidIE', None, 15)": {'add': [22, 25, 30, 48], 'mod': [29]}, "('NuvidIE', '_real_extract', 53)": {'mod': [58, 59, 60, 61, 62, 63, 64, 70]}}}, {'path': 'yt_dlp/extractor/rutv.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [9]}, "('RUTVIE', 
'_real_extract', 126)": {'mod': [182]}}}, {'path': 'yt_dlp/extractor/streamcz.py', 'status': 'modified', 'Loc': {"('StreamCZIE', None, 14)": {'add': [24], 'mod': [34]}}}, {'path': 'yt_dlp/extractor/tele5.py', 'status': 'modified', 'Loc': {"('Tele5IE', None, 12)": {'add': [30], 'mod': [16, 45, 67, 68, 70, 71, 73, 74, 75, 76, 78, 80, 81, 82, 83, 84, 86, 87, 88, 90, 91, 92, 93, 94, 95, 97, 98, 99, 101, 102, 104, 105, 106, 107, 108]}, '(None, None, None)': {'mod': [4, 6, 7, 8, 10, 11, 12]}}}, {'path': 'yt_dlp/extractor/tv2dk.py', 'status': 'modified', 'Loc': {"('TV2DKIE', '_real_extract', 79)": {'add': [98], 'mod': [94]}, "('TV2DKIE', None, 16)": {'mod': [44, 45]}}}, {'path': 'yt_dlp/extractor/uol.py', 'status': 'modified', 'Loc': {"('UOLIE', '_real_extract', 67)": {'mod': [98]}}}, {'path': 'yt_dlp/extractor/urplay.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [6, 7]}, "('URPlayIE', None, 16)": {'add': [28], 'mod': [26, 53, 54, 55, 56, 57]}, "('URPlayIE', '_real_extract', 54)": {'add': [113], 'mod': [75, 76, 77, 78, 79, 101]}}}, {'path': 'yt_dlp/extractor/videa.py', 'status': 'modified', 'Loc': {"('VideaIE', '_real_extract', 112)": {'mod': [149, 166, 167, 168]}}}, {'path': 'yt_dlp/extractor/vimeo.py', 'status': 'modified', 'Loc': {"('VimeoIE', None, 297)": {'add': [638]}}}, {'path': 'yt_dlp/extractor/wdr.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [12, 24]}, "('WDRIE', None, 25)": {'add': [43], 'mod': [31, 39]}, "('WDRPageIE', None, 139)": {'add': [209, 234], 'mod': [173, 175, 177, 186, 194, 197, 248]}, "('WDRPageIE', '_real_extract', 258)": {'add': [273], 'mod': [293, 295, 296, 299, 300, 301, 302]}, "('WDRElefantIE', '_real_extract', 324)": {'add': [336]}, "('WDRIE', '_real_extract', 47)": {'mod': [129, 130, 132]}}}, {'path': 'yt_dlp/extractor/zdf.py', 'status': 'modified', 'Loc': {"('ZDFIE', None, 136)": {'add': [138], 'mod': [198, 199, 200, 201, 202, 203, 204]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "commit",
"loc_scope": "",
"info_type": "Code"
} | {
"code": [
"yt_dlp/extractor/extractors.py",
"yt_dlp/extractor/streamcz.py",
"yt_dlp/extractor/bbc.py",
"yt_dlp/extractor/zdf.py",
"yt_dlp/extractor/tv2dk.py",
"yt_dlp/extractor/rutv.py",
"yt_dlp/extractor/aliexpress.py",
"yt_dlp/extractor/wdr.py",
"yt_dlp/extractor/videa.py",
"yt_dlp/extractor/nuvid.py",
"yt_dlp/extractor/arte.py",
"yt_dlp/extractor/vimeo.py",
"yt_dlp/extractor/urplay.py",
"yt_dlp/extractor/bigo.py",
"yt_dlp/YoutubeDL.py",
"yt_dlp/extractor/applepodcasts.py",
"yt_dlp/extractor/tele5.py",
"yt_dlp/extractor/uol.py",
"yt_dlp/extractor/audiomack.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null |
yt-dlp | yt-dlp | 80e8493ee7c3083f4e215794e4a67ba5265f24f7 | https://github.com/yt-dlp/yt-dlp/issues/2885 | site-request
patch-available | Add Filmarkivet.se as a Supported Site | ### Checklist
- [X] I'm reporting a new site support request
- [X] I've verified that I'm running yt-dlp version **2022.02.04**. ([update instructions](https://github.com/yt-dlp/yt-dlp#update))
- [X] I've checked that all provided URLs are alive and playable in a browser
- [X] I've checked that none of provided URLs [violate any copyrights](https://github.com/ytdl-org/youtube-dl#can-you-add-support-for-this-anime-video-site-or-site-which-shows-current-movies-for-free) or contain any [DRM](https://en.wikipedia.org/wiki/Digital_rights_management) to the best of my knowledge
- [X] I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
- [X] I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
- [ ] I've read about [sharing account credentials](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#are-you-willing-to-share-account-details-if-needed) and am willing to share it if required
### Region
United States
### Example URLs
https://www.filmarkivet.se/movies/paris-d-moll/
### Description
Please add Filmarkivet.se as a supported site. I already watched the YouTube video "The Secret Logos Of SF Studios (1919 - 1999)" by CCGFilms, which has some SF Studios logos. I need to capture its logos.
### Verbose log
```shell
[debug] Command-line config: ['-v', 'https://www.filmarkivet.se/movies/paris-d-moll/']
[debug] Encodings: locale cp1252, fs utf-8, out utf-8 (No ANSI), err utf-8 (No ANSI), pref cp1252
[debug] yt-dlp version 2022.02.04 [c1653e9] (win_exe)
[debug] Python version 3.8.10 (CPython 64bit) - Windows-7-6.1.7601-SP1
[debug] exe versions: ffmpeg N-105662-ge534d98af3-20220217 (setts), ffprobe N-105038-g30322ebe3c-sherpya
[debug] Optional libraries: Cryptodome, mutagen, sqlite, websockets
[debug] Proxy map: {}
[debug] [generic] Extracting URL: https://www.filmarkivet.se/movies/paris-d-moll/
[generic] paris-d-moll: Requesting header
WARNING: [generic] Falling back on generic information extractor.
[generic] paris-d-moll: Downloading webpage
WARNING: [generic] URL could be a direct video link, returning it as such.
[debug] Default format spec: bestvideo*+bestaudio/best
[info] paris-d-moll: Downloading 1 format(s): 0
[debug] Invoking downloader on "https://www.filmarkivet.se/movies/paris-d-moll/"
[download] Destination: paris-d-moll [paris-d-moll].unknown_video
[download] 100% of 373.64KiB in 00:01
```
| null | null | https://github.com/yt-dlp/yt-dlp/commit/80e8493ee7c3083f4e215794e4a67ba5265f24f7 | {'base_commit': '80e8493ee7c3083f4e215794e4a67ba5265f24f7', 'files': [{'path': 'yt_dlp/extractor/generic.py', 'status': 'modified', 'Loc': {"('GenericIE', None, 143)": {'add': [2529]}}}, {'path': 'yt_dlp/utils.py', 'status': 'modified', 'Loc': {"(None, 'is_html', 3283)": {'add': [3292], 'mod': [3294, 3295, 3296, 3297, 3298]}, '(None, None, None)': {'mod': [3300]}}}]} | [] | [] | [] | {
"iss_type": "4",
"iss_reason": "2",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"yt_dlp/utils.py",
"yt_dlp/extractor/generic.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null |
yt-dlp | yt-dlp | 5da08bde9e073987d1aae2683235721e4813f9c6 | https://github.com/yt-dlp/yt-dlp/issues/5424 | site-enhancement | [VLIVE.TV] Extract release timestamp | ### DO NOT REMOVE OR SKIP THE ISSUE TEMPLATE
- [X] I understand that I will be **blocked** if I remove or skip any mandatory\* field
### Checklist
- [X] I'm asking a question and **not** reporting a bug or requesting a feature
- [X] I've looked through the [README](https://github.com/yt-dlp/yt-dlp#readme)
- [X] I've verified that I'm running yt-dlp version **2022.10.04** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
- [X] I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar questions **including closed ones**. DO NOT post duplicates
- [X] I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
### Please make sure the question is worded well enough to be understood
Is there a way to change `upload_date` from UTC to a specific GMT? This video (https://www.vlive.tv/post/1-18318601) was posted on Nov. 28, 2018 KST (Korean Standard Time) but yt-dlp downloads it as 20181127.
I know you can choose not to use UTC for YouTube videos, but I don't know how to do that for other sites.
Here is my command:
`!yt-dlp -vU --embed-metadata --embed-thumbnail --merge-output-format "mkv/mp4" --write-subs --sub-langs all,-live_chat --embed-subs --compat-options no-keep-subs "https://www.vlive.tv/post/1-18318601" -o "%(upload_date)s - %(creator)s - %(title)s.%(ext)s" -P "/content/drive/Shareddrives/VLIVE" -P temp:"/content/drive/Shareddrives/VLIVE/!temp"`
Thanks!
### Provide verbose output that clearly demonstrates the problem
- [X] Run **your** yt-dlp command with **-vU** flag added (`yt-dlp -vU <your command line>`)
- [X] Copy the WHOLE output (starting with `[debug] Command-line config`) and insert it below
### Complete Verbose Output
```shell
[debug] Command-line config: ['-vU', '--embed-metadata', '--embed-thumbnail', '--merge-output-format', 'mkv/mp4', '--write-subs', '--sub-langs', 'all,-live_chat', '--embed-subs', '--compat-options', 'no-keep-subs', 'https://www.vlive.tv/post/1-18318601', '-o', '%(upload_date)s - %(creator)s - %(title)s.%(ext)s', '-P', '/content/drive/Shareddrives/VLIVE', '-P', 'temp:/content/drive/Shareddrives/VLIVE/!temp']
[debug] Encodings: locale UTF-8, fs utf-8, pref UTF-8, out UTF-8, error UTF-8, screen UTF-8
[debug] yt-dlp version 2022.10.04 [4e0511f27]
[debug] Lazy loading extractors is disabled
[debug] Compatibility options: no-keep-subs
[debug] Python 3.7.15 (CPython 64bit) - Linux-5.10.133+-x86_64-with-Ubuntu-18.04-bionic (glibc 2.26)
[debug] Checking exe version: ffmpeg -bsfs
[debug] Checking exe version: ffprobe -bsfs
[debug] exe versions: ffmpeg 3.4.11, ffprobe 3.4.11
[debug] Optional libraries: Cryptodome-3.15.0, brotli-1.0.9, certifi-2022.09.24, mutagen-1.46.0, sqlite3-2.6.0, websockets-10.4
[debug] Proxy map: {}
[debug] Loaded 1706 extractors
[debug] Fetching release info: https://api.github.com/repos/yt-dlp/yt-dlp/releases/latest
Latest version: 2022.10.04, Current version: 2022.10.04
yt-dlp is up to date (2022.10.04)
[debug] [vlive:post] Extracting URL: https://www.vlive.tv/post/1-18318601
[vlive:post] 1-18318601: Downloading post JSON metadata
[debug] [vlive] Extracting URL: http://www.vlive.tv/video/101216
[vlive] 101216: Downloading officialVideoPost JSON metadata
[vlive] 101216: Downloading inkey JSON metadata
[vlive] 101216: Downloading JSON metadata
[debug] Formats sorted by: hasvid, ie_pref, lang, quality, res, fps, hdr:12(7), vcodec:vp9.2(10), channels, acodec, filesize, fs_approx, tbr, vbr, abr, asr, proto, vext, aext, hasaud, source, id
[info] 101216: Downloading subtitles: en_US, es_PA, es_ES, fr_FR, in_ID, pt_PT, vi_VN, jp, zh_CN, ko_KR
[debug] Default format spec: bestvideo*+bestaudio/best
[info] 101216: Downloading 1 format(s): avc1_720P
[info] Writing video subtitles to: /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.en_US.vtt
[debug] Invoking http downloader on "http://resources-rmcnmv.pstatic.net/globalv/c/read/v2/VOD_ALPHA/global_v_2018_12_06_3/aa339b4a-f89e-11e8-bc80-3ca82a21f531-1544022097899_en_US_cp.vtt"
[download] Destination: /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.en_US.vtt
[download] 100% of 55.30KiB in 00:00:00 at 154.65KiB/s
[info] Writing video subtitles to: /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.es_PA.vtt
[debug] Invoking http downloader on "http://resources-rmcnmv.pstatic.net/globalv/c/read/v2/VOD_ALPHA/global_v_2018_11_28_2/3b1a6fdd-f30e-11e8-8111-3ca82a220799-1543410308156_es_PA_cp.vtt"
[download] Destination: /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.es_PA.vtt
[download] 100% of 24.65KiB in 00:00:00 at 317.51KiB/s
[info] Writing video subtitles to: /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.es_ES.vtt
[debug] Invoking http downloader on "http://resources-rmcnmv.pstatic.net/globalv/c/read/v2/VOD_ALPHA/2020_11_16/a3003ab1-27e4-11eb-9a2e-0050569c085d-1605514850566_es_ES_fan.vtt"
[download] Destination: /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.es_ES.vtt
[download] 100% of 59.17KiB in 00:00:00 at 197.86KiB/s
[info] Writing video subtitles to: /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.fr_FR.vtt
[debug] Invoking http downloader on "http://resources-rmcnmv.pstatic.net/globalv/c/read/v2/VOD_ALPHA/2020_12_14/8d2ea1db-3dcf-11eb-9b2a-0050569c085d-1607924720110_fr_FR_fan.vtt"
[download] Destination: /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.fr_FR.vtt
[download] 100% of 55.22KiB in 00:00:00 at 528.98KiB/s
[info] Writing video subtitles to: /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.in_ID.vtt
[debug] Invoking http downloader on "http://resources-rmcnmv.pstatic.net/globalv/c/read/v2/VOD_ALPHA/global_v_2018_11_28_2/3ace9972-f30e-11e8-8606-3ca82a22c1e9-1543410307659_in_ID_cp.vtt"
[download] Destination: /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.in_ID.vtt
[download] 100% of 24.42KiB in 00:00:00 at 157.16KiB/s
[info] Writing video subtitles to: /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.pt_PT.vtt
[debug] Invoking http downloader on "http://resources-rmcnmv.pstatic.net/globalv/c/read/v2/VOD_ALPHA/global_v_2018_11_28_2/3b090ad1-f30e-11e8-9c04-3ca82a225339-1543410308041_pt_PT_cp.vtt"
[download] Destination: /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.pt_PT.vtt
[download] 100% of 25.00KiB in 00:00:00 at 267.88KiB/s
[info] Writing video subtitles to: /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.vi_VN.vtt
[debug] Invoking http downloader on "http://resources-rmcnmv.pstatic.net/globalv/c/read/v2/VOD_ALPHA/global_v_2018_12_07_4/dac66573-f9fc-11e8-98b0-3ca82a22d7a5-1544172503245_vi_VN_cp.vtt"
[download] Destination: /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.vi_VN.vtt
[download] 100% of 64.32KiB in 00:00:00 at 555.77KiB/s
[info] Writing video subtitles to: /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.jp.vtt
[debug] Invoking http downloader on "http://resources-rmcnmv.pstatic.net/globalv/c/read/v2/VOD_ALPHA/global_v_2018_11_28_2/3aee2f6b-f30e-11e8-bb16-3ca82a21e509-1543410307868_jp_cp.vtt"
[download] Destination: /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.jp.vtt
[download] 100% of 23.29KiB in 00:00:00 at 258.58KiB/s
[info] Writing video subtitles to: /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.zh_CN.vtt
[debug] Invoking http downloader on "http://resources-rmcnmv.pstatic.net/globalv/c/read/v2/VOD_ALPHA/global_v_2018_12_10_4/077db2f1-fc58-11e8-8818-3ca82a22c1e9-1544431564794_zh_CN_cp.vtt"
[download] Destination: /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.zh_CN.vtt
[download] 100% of 52.85KiB in 00:00:00 at 581.89KiB/s
[info] Writing video subtitles to: /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.ko_KR.vtt
[debug] Invoking http downloader on "http://resources-rmcnmv.pstatic.net/globalv/c/read/v2/VOD_ALPHA/global_v_2018_12_06_2/cc2e3feb-f921-11e8-8285-3ca82a2243c9-1544078418977_ko_KR_cp.vtt"
[download] Destination: /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.ko_KR.vtt
[download] 100% of 64.29KiB in 00:00:00 at 450.71KiB/s
[info] Downloading video thumbnail 1 ...
[info] Writing video thumbnail 1 to: /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.png
[download] /content/drive/Shareddrives/VLIVE/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.mp4 has already been downloaded
[EmbedSubtitle] Embedding subtitles in "/content/drive/Shareddrives/VLIVE/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.mp4"
[debug] ffmpeg command line: ffmpeg -y -loglevel repeat+info -i 'file:/content/drive/Shareddrives/VLIVE/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.mp4' -i 'file:/content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.en_US.vtt' -i 'file:/content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.es_PA.vtt' -i 'file:/content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.es_ES.vtt' -i 'file:/content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.fr_FR.vtt' -i 'file:/content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.in_ID.vtt' -i 'file:/content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.pt_PT.vtt' -i 'file:/content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.vi_VN.vtt' -i 'file:/content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.jp.vtt' -i 'file:/content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.zh_CN.vtt' -i 'file:/content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.ko_KR.vtt' -map 0 -dn -ignore_unknown -c copy -c:s mov_text -map -0:s -map 1:0 -metadata:s:s:0 language=eng -map 2:0 -metadata:s:s:1 language=spa -map 3:0 -metadata:s:s:2 language=spa -map 4:0 -metadata:s:s:3 language=fra -map 5:0 -metadata:s:s:4 language=ind -map 6:0 -metadata:s:s:5 language=por -map 7:0 -metadata:s:s:6 language=vie -map 8:0 -metadata:s:s:7 language=jp -map 9:0 -metadata:s:s:8 language=zho -map 10:0 -metadata:s:s:9 language=kor -movflags +faststart 'file:/content/drive/Shareddrives/VLIVE/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.temp.mp4'
Deleting original file /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.en_US.vtt (pass -k to keep)
Deleting original file /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.fr_FR.vtt (pass -k to keep)
Deleting original file /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.es_PA.vtt (pass -k to keep)
Deleting original file /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.jp.vtt (pass -k to keep)
Deleting original file /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.ko_KR.vtt (pass -k to keep)
Deleting original file /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.zh_CN.vtt (pass -k to keep)
Deleting original file /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.in_ID.vtt (pass -k to keep)
Deleting original file /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.pt_PT.vtt (pass -k to keep)
Deleting original file /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.vi_VN.vtt (pass -k to keep)
Deleting original file /content/drive/Shareddrives/VLIVE/!temp/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.es_ES.vtt (pass -k to keep)
[Metadata] Adding metadata to "/content/drive/Shareddrives/VLIVE/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.mp4"
[debug] ffmpeg command line: ffmpeg -y -loglevel repeat+info -i 'file:/content/drive/Shareddrives/VLIVE/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.mp4' -map 0 -dn -ignore_unknown -c copy -write_id3v1 1 -metadata 'title=♥도요일♥ 12/1 도다제 녹음현장 ! with 도영' -metadata date=20181127 -metadata purl=http://www.vlive.tv/video/101216 -metadata comment=http://www.vlive.tv/video/101216 -metadata 'artist=NCT의 night night!' -movflags +faststart 'file:/content/drive/Shareddrives/VLIVE/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.temp.mp4'
[EmbedThumbnail] mutagen: Adding thumbnail to "/content/drive/Shareddrives/VLIVE/20181127 - NCT의 night night! - ♥도요일♥ 12⧸1 도다제 녹음현장 ! with 도영.mp4"
```
| null | null | https://github.com/HHeroin/yt-dlp/commit/5da08bde9e073987d1aae2683235721e4813f9c6 | {'base_commit': '5da08bde9e073987d1aae2683235721e4813f9c6', 'files': [{'path': 'yt_dlp/extractor/vlive.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [15]}, "('VLiveIE', None, 69)": {'add': [83, 100]}, "('VLiveIE', '_real_extract', 148)": {'add': [171]}}}]} | [] | [] | [] | {
"iss_type": "3",
"iss_reason": "2",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"yt_dlp/extractor/vlive.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null |
yt-dlp | yt-dlp | 51c22ef4e2af966d6100d0d97d9e8019022df8ad | https://github.com/yt-dlp/yt-dlp/issues/2996 | bug | '<' not supported between instances of 'float' and 'str' and --throttled-rate error after update? | ### Checklist
- [X] I'm reporting a bug unrelated to a specific site
- [X] I've verified that I'm running yt-dlp version **2022.03.08.1**. ([update instructions](https://github.com/yt-dlp/yt-dlp#update))
- [X] I've checked that all provided URLs are alive and playable in a browser
- [X] I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/ytdl-org/youtube-dl#video-url-contains-an-ampersand-and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
- [X] I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
- [X] I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
### Description
After the update I get the error
'<' not supported between instances of 'float' and 'str'
I found out that it is somewhat related to the --throttled-rate setting. When I remove it, I can download from YT with no issues.
If I leave it, I get the following message
[download] 0.0% of 714.94MiB at 499.98KiB/s ETA 24:24ERROR: '<' not supported between instances of 'float' and 'str'
### Verbose log
```shell
Microsoft Windows [Version 6.1.7601]
>yt-dlp https://www.youtube.com/watch?v=XUp9pe1T-UE --throttled-rate 999k
[youtube] XUp9pe1T-UE: Downloading webpage
[youtube] XUp9pe1T-UE: Downloading android player API JSON
[info] XUp9pe1T-UE: Downloading 1 format(s): 571+251
WARNING: Requested formats are incompatible for merge and will be merged into mkv
[download] Destination: 8k VIDEOS _ Beauty of Nature 8K (60 FPS) HDR UltraHD _ Sony Demo [XUp9pe1T-UE].f571.mp4
[download] 0.0% of 505.86MiB at 90.90KiB/s ETA 01:34:58ERROR: '<' not supported between instances of 'float' and 'str'
yt>
```
| null | null | https://github.com/yt-dlp/yt-dlp/commit/51c22ef4e2af966d6100d0d97d9e8019022df8ad | {'base_commit': '51c22ef4e2af966d6100d0d97d9e8019022df8ad', 'files': [{'path': 'yt_dlp/__init__.py', 'status': 'modified', 'Loc': {"(None, 'validate_options', 156)": {'mod': [258]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"yt_dlp/__init__.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null |
yt-dlp | yt-dlp | 6f638d325e1878df304822c6bf4e231e06dae89a | https://github.com/yt-dlp/yt-dlp/issues/3467 | docs/meta/cleanup
high-priority
regression | Error since commit 43cc91a | ### Checklist
- [X] I'm reporting a bug unrelated to a specific site
- [X] I've verified that I'm running yt-dlp version **2022.04.08** ([update instructions](https://github.com/yt-dlp/yt-dlp#update)) or later (specify commit)
- [X] I've checked that all provided URLs are alive and playable in a browser
- [X] I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/ytdl-org/youtube-dl#video-url-contains-an-ampersand-and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
- [X] I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
- [X] I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
### Description
After commit 43cc91a, I get the error shown in the verbose log.
### Verbose log
```shell
yt-dlp -Uv
Traceback (most recent call last):
File "/usr/lib/python3.8/runpy.py", line 194, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/usr/lib/python3.8/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "/usr/local/bin/yt-dlp/__main__.py", line 13, in <module>
File "<frozen zipimport>", line 259, in load_module
File "/usr/local/bin/yt-dlp/yt_dlp/__init__.py", line 12, in <module>
ModuleNotFoundError: No module named 'yt_dlp.compat'
```
| null | null | https://github.com/yt-dlp/yt-dlp/commit/6f638d325e1878df304822c6bf4e231e06dae89a | {'base_commit': '6f638d325e1878df304822c6bf4e231e06dae89a', 'files': [{'path': 'Makefile', 'status': 'modified', 'Loc': {'(None, None, 61)': {'add': [61]}, '(None, None, 64)': {'mod': [64]}, '(None, None, 68)': {'mod': [68]}, '(None, None, 70)': {'mod': [70]}}}, {'path': 'yt_dlp/extractor/anvato.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [7], 'mod': [22, 23, 24, 25, 26, 27, 28, 29, 30]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"yt_dlp/extractor/anvato.py"
],
"doc": [],
"test": [],
"config": [
"Makefile"
],
"asset": []
} | null |
yt-dlp | yt-dlp | 14a086058a30a0748b5b716e9b21481f993518f3 | https://github.com/yt-dlp/yt-dlp/issues/1601 | site-bug | ARD:mediathek doesn't work anymore | ### Checklist
- [X] I'm reporting a broken site
- [X] I've verified that I'm running yt-dlp version **2021.10.22**. ([update instructions](https://github.com/yt-dlp/yt-dlp#update))
- [X] I've checked that all provided URLs are alive and playable in a browser
- [X] I've checked that all URLs and arguments with special characters are [properly quoted or escaped](https://github.com/ytdl-org/youtube-dl#video-url-contains-an-ampersand-and-im-getting-some-strange-output-1-2839-or-v-is-not-recognized-as-an-internal-or-external-command)
- [X] I've searched the [bugtracker](https://github.com/yt-dlp/yt-dlp/issues?q=) for similar issues including closed ones. DO NOT post duplicates
- [X] I've read the [guidelines for opening an issue](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#opening-an-issue)
- [X] I've read about [sharing account credentials](https://github.com/yt-dlp/yt-dlp/blob/master/CONTRIBUTING.md#are-you-willing-to-share-account-details-if-needed) and I'm willing to share it if required
### Region
Germany
### Description
Downloading from ARDmediathek doesn't work anymore
### Verbose log
```shell
$ /repositories/yt-dlp/yt-dlp --no-config --verbose https://www.ardmediathek.de/video/tagesschau-oder-tagesschau-20-00-uhr/das-erste/Y3JpZDovL2Rhc2Vyc3RlLmRlL3RhZ2Vzc2NoYXUvZmM4ZDUxMjgtOTE0ZC00Y2MzLTgzNzAtNDZkNGNiZWJkOTll/
[debug] Command-line config: ['--no-config', '--verbose', 'https://www.ardmediathek.de/video/tagesschau-oder-tagesschau-20-00-uhr/das-erste/Y3JpZDovL2Rhc2Vyc3RlLmRlL3RhZ2Vzc2NoYXUvZmM4ZDUxMjgtOTE0ZC00Y2MzLTgzNzAtNDZkNGNiZWJkOTll/']
[debug] Encodings: locale UTF-8, fs utf-8, out utf-8, err utf-8, pref UTF-8
[debug] yt-dlp version 2021.10.22 (zip)
[debug] Lazy loading extractors is disabled
[debug] Plugins: ['SamplePluginIE', 'SamplePluginPP']
[debug] Python version 3.9.7 (CPython 64bit) - Linux-5.13.0-21-generic-x86_64-with-glibc2.34
[debug] exe versions: ffmpeg 4.4 (setts), ffprobe 4.4, rtmpdump 2.4
[debug] Optional libraries: Cryptodome, keyring, mutagen, sqlite
[debug] Proxy map: {}
[debug] Using fake IP 53.36.205.78 (DE) as X-Forwarded-For
[debug] [ARD:mediathek] Extracting URL: https://www.ardmediathek.de/video/tagesschau-oder-tagesschau-20-00-uhr/das-erste/Y3JpZDovL2Rhc2Vyc3RlLmRlL3RhZ2Vzc2NoYXUvZmM4ZDUxMjgtOTE0ZC00Y2MzLTgzNzAtNDZkNGNiZWJkOTll/
[ARD:mediathek] Y3JpZDovL2Rhc2Vyc3RlLmRlL3RhZ2Vzc2NoYXUvZmM4ZDUxMjgtOTE0ZC00Y2MzLTgzNzAtNDZkNGNiZWJkOTll: Downloading webpage
[ARD:mediathek] 10049223: Downloading media JSON
ERROR: [ARD:mediathek] Unable to download JSON metadata: HTTP Error 404: Not Found (caused by <HTTPError 404: 'Not Found'>); please report this issue on https://github.com/yt-dlp/yt-dlp . Make sure you are using the latest version; type yt-dlp -U to update. Be sure to call yt-dlp with the --verbose flag and include its complete output.
File "/repositories/yt-dlp/yt-dlp/yt_dlp/extractor/common.py", line 713, in _request_webpage
return self._downloader.urlopen(url_or_request)
File "/repositories/yt-dlp/yt-dlp/yt_dlp/YoutubeDL.py", line 3288, in urlopen
return self._opener.open(req, timeout=self._socket_timeout)
File "/usr/lib/python3.9/urllib/request.py", line 523, in open
response = meth(req, response)
File "/usr/lib/python3.9/urllib/request.py", line 632, in http_response
response = self.parent.error(
File "/usr/lib/python3.9/urllib/request.py", line 555, in error
result = self._call_chain(*args)
File "/usr/lib/python3.9/urllib/request.py", line 494, in _call_chain
result = func(*args)
File "/usr/lib/python3.9/urllib/request.py", line 747, in http_error_302
return self.parent.open(new, timeout=req.timeout)
File "/usr/lib/python3.9/urllib/request.py", line 523, in open
response = meth(req, response)
File "/usr/lib/python3.9/urllib/request.py", line 632, in http_response
response = self.parent.error(
File "/usr/lib/python3.9/urllib/request.py", line 561, in error
return self._call_chain(*args)
File "/usr/lib/python3.9/urllib/request.py", line 494, in _call_chain
result = func(*args)
File "/usr/lib/python3.9/urllib/request.py", line 641, in http_error_default
raise HTTPError(req.full_url, code, msg, hdrs, fp)
```
| null | null | https://github.com/yt-dlp/yt-dlp/commit/14a086058a30a0748b5b716e9b21481f993518f3 | {'base_commit': '14a086058a30a0748b5b716e9b21481f993518f3', 'files': [{'path': 'yt_dlp/extractor/ard.py', 'status': 'modified', 'Loc': {"('ARDBetaMediathekIE', None, 390)": {'add': [405, 428], 'mod': [391]}, "('ARDBetaMediathekIE', '_ARD_extract_playlist', 512)": {'mod': [528, 529, 530, 531, 532, 533, 534, 536, 537, 538, 539, 540, 541]}, "('ARDBetaMediathekIE', '_real_extract', 551)": {'mod': [577]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"yt_dlp/extractor/ard.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null |
comfyanonymous | ComfyUI | ab7d4f784892c275e888d71aa80a3a2ed59d9b83 | https://github.com/comfyanonymous/ComfyUI/issues/2019 | [Bug] text of collapsed node still present | On latest commit https://github.com/comfyanonymous/ComfyUI/commit/d66b631d74e6f6ac95c61c63d4a0da150bf74903.
Dragging the node also doesn't do anything until it's uncollapsed.
<img width="1236" alt="Screenshot 2023-11-21 at 1 14 19 PM" src="https://github.com/comfyanonymous/ComfyUI/assets/111034657/abb0b5c1-3e94-4928-813d-3481ca5f1f47">
| null | null | https://github.com/comfyanonymous/ComfyUI/commit/ab7d4f784892c275e888d71aa80a3a2ed59d9b83 | {'base_commit': 'ab7d4f784892c275e888d71aa80a3a2ed59d9b83', 'files': [{'path': 'web/scripts/domWidget.js', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [235, 292]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "1",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"web/scripts/domWidget.js"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null | |
AntonOsika | gpt-engineer | 3e589bf1356024fb471a9d17738e4626f21a953b | https://github.com/AntonOsika/gpt-engineer/issues/1153 | bug
triage | Azure Deployment Name Bug | ## Policy and info
- Maintainers will close issues that have been stale for 14 days if they contain relevant answers.
- Adding the label "sweep" will automatically turn the issue into a coded pull request. Works best for mechanical tasks. More info/syntax at: https://docs.sweep.dev/
## Expected Behavior
There shouldn't be an error with the model name.
## Current Behavior
### Deployment name seems to mix with model name.
Everything seems to work perfectly and code is being made:

But then an error pops up telling me that the model doesn't exist and it takes my Azure OpenAI deployment name and says it's not a model.

Here is the command style I used following these instructions from here: https://gpt-engineer.readthedocs.io/en/latest/open_models.html

`gpt-engineer --azure [redacted_endpoint_url] ./snake_game/ [redacted_deployment_name]`
## Additional Failure Information
Using Azure OpenAI with gpt-4-turbo deployed with a different deployment name. Only installed gpt-engineer in a virtual environment. | null | https://github.com/AntonOsika/gpt-engineer/pull/1170 | null | {'base_commit': '3e589bf1356024fb471a9d17738e4626f21a953b', 'files': [{'path': '.github/CONTRIBUTING.md', 'status': 'modified', 'Loc': {'(None, None, 114)': {'add': [114]}}}, {'path': 'gpt_engineer/core/ai.py', 'status': 'modified', 'Loc': {"('AI', '_create_chat_model', 330)": {'mod': [349]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"gpt_engineer/core/ai.py"
],
"doc": [
".github/CONTRIBUTING.md"
],
"test": [],
"config": [],
"asset": []
} | null |
AntonOsika | gpt-engineer | c4c1203fc07b2e23c3e5a5e9277266a711ab9466 | https://github.com/AntonOsika/gpt-engineer/issues/35 | .py files are not being created. I just get all_output.txt that I manually have to create from. | Hi, I absolutely love this script. This is the most accurate auto-GPT development script I have tried yet, it's so powerful!
In the demo video it shows the script automatically creating each of the development files, in my case .py files within the workspace folder. My build isn't doing this; I just get an all_output.txt file with all the .py files' code in one place, plus a single Python file.
How do I ensure that GPT-Engineer automatically creates the .py files for me? Thanks
"iss_type": "2",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"gpt_engineer/chat_to_files.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null | |
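The linked PR (#120) reworked `parse_chat` in `gpt_engineer/chat_to_files.py` so that fenced code blocks in the model's reply are split into per-file outputs instead of one `all_output.txt`. A minimal stand-alone sketch of that idea follows; the regex is an illustrative assumption, not the project's exact pattern.

```python
import re

# Sketch of a parse_chat-style extractor: pull (filename, code) pairs from
# an LLM reply so each block can be written to its own file.

FENCE = "`" * 3  # avoid writing a literal fence marker inside this example

PATTERN = re.compile(
    r"(\S+)\n\s*" + FENCE + r"[^\n]*\n(.*?)" + FENCE,
    re.DOTALL,
)

def parse_chat(chat: str) -> list[tuple[str, str]]:
    files = []
    for name, code in PATTERN.findall(chat):
        files.append((name.strip("`*:"), code))  # strip markdown decoration
    return files
```

Each returned pair can then be written to disk under the workspace folder.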
AntonOsika | gpt-engineer | 7b91676a0c2ccd4589a42f2cadbf1e69f93ad81b | https://github.com/AntonOsika/gpt-engineer/issues/1128 | bug
triage | Applying diffs failing silently | ## Expected Behavior
I would expect GPT Engineer to either successfully apply all diffs sent by the AI, or to fail in a way that tells you which diffs were applied and which failed, and lets you manually salvage the failed parts by copying and pasting.
## Current Behavior
The current behaviour seems to be that it applies the sections of the diff which it can and silently throws the rest of the code away. From a user's perspective it seems like everything has gone well - but in reality it has only applied a portion of the diff.
This is really bad from a usability perspective - for one, a partially applied diff is obviously never going to be working code, so applying it is pointless. Also, knowing that this is the behaviour of gpte means I need to manually check every single output to verify it has applied the whole diff, which is a complete waste of time for diffs which do apply successfully.
Not applying any of the diffs at all would actually be a better outcome for me, as at least I would have a consistent workflow of copying and pasting... however, a more sensible solution is to apply the diffs it can, and if it can't apply a diff for a file, not apply any change to that file at all, instead providing an error output which is convenient for the user to copy and paste manually into the file.
### Failure Logs
I can't upload failure logs as the code I'm working on is sensitive | null | https://github.com/AntonOsika/gpt-engineer/pull/1138 | null | {'base_commit': '7b91676a0c2ccd4589a42f2cadbf1e69f93ad81b', 'files': [{'path': 'gpt_engineer/core/diff.py', 'status': 'modified', 'Loc': {"('Diff', 'validate_and_correct', 340)": {'mod': [357]}}}, {'path': 'tests/core/test_salvage_correct_hunks.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [82]}}}]}
"iss_type": "1",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"gpt_engineer/core/diff.py"
],
"doc": [],
"test": [
"tests/core/test_salvage_correct_hunks.py"
],
"config": [],
"asset": []
} | null |
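The behaviour the reporter asks for (apply what applies cleanly, leave a file untouched when any of its hunks fail, and report the failures) can be sketched with a deliberately simplified hunk model; the real logic lives in `gpt_engineer/core/diff.py` and is more involved than this.

```python
# Sketch of "no silent partial diffs": if every hunk for a file applies,
# return the patched text; if any hunk fails, return the file unchanged and
# report which hunks failed so the user can paste them in manually.
# A hunk here is simplified to an (old, new) text pair.

def apply_hunks(text: str, hunks: list[tuple[str, str]]) -> tuple[str, list[str]]:
    failed = []
    result = text
    for old, new in hunks:
        if old in result:
            result = result.replace(old, new, 1)
        else:
            failed.append(old)
    if failed:
        # All-or-nothing per file: do not hand back a half-patched file.
        return text, failed
    return result, []
```

The caller can then print the `failed` list instead of pretending the diff applied.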
lllyasviel | Fooocus | f7bb578a1409b1f96aff534ff5ed2bd10502296f | https://github.com/lllyasviel/Fooocus/issues/1527 | Add copy to clipboard in plaintext for image details | Add copy to clipboard in plaintext for image details
A button we can click to copy to clipboard all of the image details shown in the log output file. If not on the log page then on the app itself.
The quick copying of these settings enables us to share our work methods with others in the community more smoothly, thereby assisting them in a more efficient and effective way.


When I copy the text manually from the log file it looks like a garbled mess. See example below.
```
Prompt | Cute troll with fluffy long spiked hair wearing a ugly Christmas sweater. snow falling down and troll village in the background. full body
-- | --
Negative Prompt |
Fooocus V2 Expansion | Cute troll with fluffy long spiked hair wearing a ugly Christmas sweater. snow falling down and troll village in the background. full body, intricate, elegant, highly detailed, sharp focus, illuminated, sunny, magical, scenic, artistic, true colors, deep aesthetic, very inspirational, cute, cozy, inspired, original, fine detail, professional, winning, enhanced, polished
Styles | ['SAI Photographic', 'Fooocus V2', 'Artstyle Hyperrealism', 'MRE Artistic Vision']
Performance | Quality
Resolution | (1024, 1024)
Sharpness | 3
Guidance Scale | 1.7
ADM Guidance | (1.5, 0.8, 0.3)
Base Model | dreamshaperXL_turboDpmppSDEKarras.safetensors
Refiner Model | None
Refiner Switch | 0.5
Sampler | dpmpp_sde
Scheduler | karras
Seed | 5044578018584347060
Version | v2.1.853
``` | null | null | https://github.com/lllyasviel/Fooocus/commit/f7bb578a1409b1f96aff534ff5ed2bd10502296f | {'base_commit': 'f7bb578a1409b1f96aff534ff5ed2bd10502296f', 'files': [{'path': 'fooocus_version.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [1]}}}, {'path': 'modules/async_worker.py', 'status': 'modified', 'Loc': {"(None, 'handler', 116)": {'mod': [400, 401, 780, 782]}}}, {'path': 'modules/private_logger.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [3]}, "(None, 'log', 21)": {'add': [38, 61], 'mod': [42, 60]}}}, {'path': 'update_log.md', 'status': 'modified', 'Loc': {'(None, None, 1)': {'add': [0]}}}, {'path': 'webui.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [3, 14, 111, 512], 'mod': [103]}}}]} | [] | [] | [] | {
"iss_type": "4",
"iss_reason": "2",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"modules/private_logger.py",
"webui.py",
"modules/async_worker.py",
"fooocus_version.py"
],
"doc": [
"update_log.md"
],
"test": [],
"config": [],
"asset": []
} | null | |
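The request boils down to flattening the log table into clipboard-friendly plaintext; the fix landed in `modules/private_logger.py` and `webui.py`. A minimal sketch of the formatting step is below; the exact output format Fooocus settled on is an assumption here, with key names taken from the example table above.

```python
# Render image-generation metadata as aligned "Key: Value" lines, so a
# straight copy to the clipboard is readable instead of a garbled mess.

def format_metadata(items: list[tuple[str, str]]) -> str:
    width = max(len(key) for key, _ in items)
    return "\n".join(f"{key.ljust(width)}: {value}" for key, value in items)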
lllyasviel | Fooocus | 3a55e7e3910b8ae58f82a5a0e4c11d7d4fa3143f | https://github.com/lllyasviel/Fooocus/issues/2561 | enhancement | [Feature Request]: Prompt embedded LoRAs | ### Is there an existing issue for this?
- [x] I have searched the existing issues and checked the recent builds/commits
### What would your feature do?
Similar to how A1111 handles LoRAs by default, I believe there should be an option to embed LoRAs in the prompt by using the following structure:
```csharp
<LORA_NAME:WEIGHT>
```
The current workflow works well, but has a few limitations, namely being able to use wildcards and LoRAs together for more dynamic prompts. Additionally, this feature already exists for embeddings, so I reckon adding it for LoRAs should be trivial.
### Proposed workflow
1. Enter LoRAs in the prompt using the `<LORA_NAME:WEIGHT>` structure
2. Generate images, and LoRAs are loaded for each iteration
### Additional information
_No response_ | null | https://github.com/lllyasviel/Fooocus/pull/2323 | null | {'base_commit': '3a55e7e3910b8ae58f82a5a0e4c11d7d4fa3143f', 'files': [{'path': 'modules/async_worker.py', 'status': 'modified', 'Loc': {"(None, 'handler', 134)": {'add': [435], 'mod': [155, 453, 454, 655, 865, 908, 912]}, "(None, 'worker', 19)": {'mod': [47, 50, 51, 72]}, "(None, 'callback', 806)": {'mod': [810]}}}, {'path': 'modules/config.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [23], 'mod': [11]}}}, {'path': 'modules/sdxl_styles.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [5, 7, 12]}, "(None, 'apply_wildcards', 68)": {'mod': [68, 69, 70, 71, 72, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 91, 92, 95]}, "(None, 'get_words', 95)": {'mod': [104]}}}, {'path': 'modules/util.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [8, 16], 'mod': [1]}, "(None, 'get_files_from_folder', 166)": {'mod': [166, 167, 168, 170, 172, 173, 174, 175, 176, 177, 178, 179, 180, 182]}, "('PromptStyle', None, 358)": {'mod': [358]}, "(None, 'get_enabled_loras', 396)": {'mod': [397]}}}]} | [] | [] | [] | {
"iss_type": "4",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"modules/async_worker.py",
"modules/sdxl_styles.py",
"modules/config.py",
"modules/util.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null |
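The proposed `<LORA_NAME:WEIGHT>` syntax amounts to tokenizing the prompt before generation. Fooocus implements this across `modules/util.py` and `modules/async_worker.py`; the sketch below shows the parsing idea with an assumed token grammar, which may differ from what the project finally accepted.

```python
import re

# Pull A1111-style <name:weight> LoRA tokens out of a prompt and return the
# cleaned prompt plus the LoRA list, so wildcards and LoRAs can mix freely.

LORA_TOKEN = re.compile(r"<([^<>:]+):([0-9.]+)>")

def extract_loras(prompt: str) -> tuple[str, list[tuple[str, float]]]:
    loras = [(name, float(weight)) for name, weight in LORA_TOKEN.findall(prompt)]
    cleaned = " ".join(LORA_TOKEN.sub(" ", prompt).split())
    return cleaned, loras
```

The extracted pairs can then be merged with the LoRAs chosen in the UI for each iteration.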
lllyasviel | Fooocus | 8e62a72a63b30a3067d1a1bc3f8d226824bd9283 | https://github.com/lllyasviel/Fooocus/issues/1671 | bug (AMD) | Cannot use image prompts | I am trying to use two images as an image prompt, but when I press Generate this is what I get (I can generate just fine without image prompts):
Full console log:
<code>[Parameters] Adaptive CFG = 7
[Parameters] Sharpness = 3
[Parameters] ADM Scale = 1.5 : 0.8 : 0.3
[Parameters] CFG = 1.5
[Parameters] Seed = 953753918774495193
[Fooocus] Downloading control models ...
[Fooocus] Loading control models ...
[Parameters] Sampler = dpmpp_2m_sde_gpu - karras
[Parameters] Steps = 6 - 30
[Fooocus] Initializing ...
[Fooocus] Loading models ...
Refiner unloaded.
model_type EPS
UNet ADM Dimension 2816
Using split attention in VAE
Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
Using split attention in VAE
extra {'cond_stage_model.clip_l.logit_scale', 'cond_stage_model.clip_g.transformer.text_model.embeddings.position_ids', 'cond_stage_model.clip_l.text_projection'}
Base model loaded: H:\Programs\Fooocus_win64_2-1-831\Fooocus\models\checkpoints\realisticStockPhoto_v10.safetensors
Request to load LoRAs [['None', 0.25], ['None', 1.0], ['None', 1.0], ['None', 1.0], ['None', 1.0]] for model [H:\Programs\Fooocus_win64_2-1-831\Fooocus\models\checkpoints\realisticStockPhoto_v10.safetensors].
Requested to load SDXLClipModel
Loading 1 new model
[Fooocus] Processing prompts ...
[Fooocus] Encoding positive #1 ...
[Fooocus] Encoding negative #1 ...
[Fooocus] Image processing ...
Traceback (most recent call last):
File "H:\Programs\Fooocus_win64_2-1-831\Fooocus\modules\async_worker.py", line 806, in worker
handler(task)
File "H:\Programs\Fooocus_win64_2-1-831\python_embeded\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "H:\Programs\Fooocus_win64_2-1-831\python_embeded\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "H:\Programs\Fooocus_win64_2-1-831\Fooocus\modules\async_worker.py", line 647, in handler
task[0] = ip_adapter.preprocess(cn_img, ip_adapter_path=ip_adapter_path)
File "H:\Programs\Fooocus_win64_2-1-831\python_embeded\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "H:\Programs\Fooocus_win64_2-1-831\python_embeded\lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "H:\Programs\Fooocus_win64_2-1-831\Fooocus\extras\ip_adapter.py", line 185, in preprocess
cond = image_proj_model.model(cond).to(device=ip_adapter.load_device, dtype=ip_adapter.dtype)
File "H:\Programs\Fooocus_win64_2-1-831\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
File "H:\Programs\Fooocus_win64_2-1-831\Fooocus\extras\resampler.py", line 117, in forward
latents = attn(x, latents) + latents
File "H:\Programs\Fooocus_win64_2-1-831\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
File "H:\Programs\Fooocus_win64_2-1-831\Fooocus\extras\resampler.py", line 55, in forward
latents = self.norm2(latents)
File "H:\Programs\Fooocus_win64_2-1-831\python_embeded\lib\site-packages\torch\nn\modules\module.py", line 1501, in _call_impl
return forward_call(*args, **kwargs)
File "H:\Programs\Fooocus_win64_2-1-831\python_embeded\lib\site-packages\torch\nn\modules\normalization.py", line 190, in forward
return F.layer_norm(
File "H:\Programs\Fooocus_win64_2-1-831\python_embeded\lib\site-packages\torch\nn\functional.py", line 2515, in layer_norm
return torch.layer_norm(input, normalized_shape, weight, bias, eps, torch.backends.cudnn.enabled)
RuntimeError: Expected all tensors to be on the same device, but found at least two devices, privateuseone:0 and cpu!
Total time: 37.40 seconds
</code>
| null | https://github.com/lllyasviel/Fooocus/pull/1678 | null | {'base_commit': '8e62a72a63b30a3067d1a1bc3f8d226824bd9283', 'files': [{'path': 'extras/ip_adapter.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [10], 'mod': [5]}, "(None, 'load_ip_adapter', 90)": {'mod': [119, 120, 121, 122, 123, 124, 125, 126]}}}, {'path': 'fooocus_version.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [1]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"fooocus_version.py",
"extras/ip_adapter.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null |
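The traceback is a classic mixed-device failure: the layer-norm weights sit on the CPU while the activations are on the DirectML device (`privateuseone:0`). The linked PR (#1678) resolves it in `extras/ip_adapter.py` by ensuring the image-projection model and its inputs are on the IP-adapter's load device before the forward pass. A torch-free stand-in for that guard is shown below; the class and function names are illustrative, not Fooocus code.

```python
class FakeTensor:
    """Stand-in for a tensor that records which device it lives on."""

    def __init__(self, device: str):
        self.device = device

    def to(self, device: str) -> "FakeTensor":
        return FakeTensor(device)

def ensure_same_device(x: FakeTensor, module_device: str) -> FakeTensor:
    # Mirror of the fix: move the input onto the module's device up front,
    # instead of letting layer_norm raise "Expected all tensors to be on
    # the same device" deep inside the forward pass.
    if x.device != module_device:
        x = x.to(module_device)
    return x
```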
lllyasviel | Fooocus | d57afc88a48359bc1642c2ae30a091f0426eff43 | https://github.com/lllyasviel/Fooocus/issues/1063 | Faceswap crashes | **Describe the problem**
The program crashes when trying to use an image as a prompt with the FaceSwap advanced option selected.
**Full Console Log**
Requirement already satisfied: pygit2==1.12.2 in /usr/local/lib/python3.10/dist-packages (1.12.2)
Requirement already satisfied: cffi>=1.9.1 in /usr/local/lib/python3.10/dist-packages (from pygit2==1.12.2) (1.16.0)
Requirement already satisfied: pycparser in /usr/local/lib/python3.10/dist-packages (from cffi>=1.9.1->pygit2==1.12.2) (2.21)
/content
fatal: destination path 'Fooocus' already exists and is not an empty directory.
/content/Fooocus
Already up-to-date
Update succeeded.
[System ARGV] ['entry_with_update.py', '--preset', 'realistic', '--share']
Loaded preset: /content/Fooocus/presets/realistic.json
Python 3.10.12 (main, Nov 20 2023, 15:14:05) [GCC 11.4.0]
Fooocus version: 2.1.824
Running on local URL: http://127.0.0.1:7865/
Running on public URL: https://fb6371be5d9ced0c1d.gradio.live/
This share link expires in 72 hours. For free permanent hosting and GPU upgrades, run `gradio deploy` from Terminal to deploy to Spaces (https://huggingface.co/spaces)
Total VRAM 15102 MB, total RAM 12983 MB
2023-11-29 21:03:50.202601: E tensorflow/compiler/xla/stream_executor/cuda/cuda_dnn.cc:9342] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
2023-11-29 21:03:50.202658: E tensorflow/compiler/xla/stream_executor/cuda/cuda_fft.cc:609] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
2023-11-29 21:03:50.202708: E tensorflow/compiler/xla/stream_executor/cuda/cuda_blas.cc:1518] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
2023-11-29 21:03:52.244376: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
Set vram state to: NORMAL_VRAM
Disabling smart memory management
Device: cuda:0 Tesla T4 : native
VAE dtype: torch.float32
Using pytorch cross attention
Refiner unloaded.
model_type EPS
adm 2816
Using pytorch attention in VAE
Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
Using pytorch attention in VAE
extra keys {'cond_stage_model.clip_l.text_projection', 'cond_stage_model.clip_g.transformer.text_model.embeddings.position_ids', 'cond_stage_model.clip_l.logit_scale'}
Base model loaded: /content/Fooocus/models/checkpoints/realisticStockPhoto_v10.safetensors
Request to load LoRAs [['SDXL_FILM_PHOTOGRAPHY_STYLE_BetaV0.4.safetensors', 0.25], ['None', 1.0], ['None', 1.0], ['None', 1.0], ['None', 1.0]] for model [/content/Fooocus/models/checkpoints/realisticStockPhoto_v10.safetensors].
Loaded LoRA [/content/Fooocus/models/loras/SDXL_FILM_PHOTOGRAPHY_STYLE_BetaV0.4.safetensors] for UNet [/content/Fooocus/models/checkpoints/realisticStockPhoto_v10.safetensors] with 788 keys at weight 0.25.
Loaded LoRA [/content/Fooocus/models/loras/SDXL_FILM_PHOTOGRAPHY_STYLE_BetaV0.4.safetensors] for CLIP [/content/Fooocus/models/checkpoints/realisticStockPhoto_v10.safetensors] with 264 keys at weight 0.25.
Fooocus V2 Expansion: Vocab with 642 words.
Fooocus Expansion engine loaded for cuda:0, use_fp16 = True.
Requested to load SDXLClipModel
Requested to load GPT2LMHeadModel
Loading 2 new models
[Fooocus Model Management] Moving model(s) has taken 1.30 seconds
App started successful. Use the app with http://127.0.0.1:7865/ or 127.0.0.1:7865 or https://fb6371be5d9ced0c1d.gradio.live/
[Parameters] Adaptive CFG = 7
[Parameters] Sharpness = 2
[Parameters] ADM Scale = 1.5 : 0.8 : 0.3
[Parameters] CFG = 3.0
[Parameters] Seed = 604471590939558783
[Parameters] Sampler = dpmpp_2m_sde_gpu - karras
[Parameters] Steps = 60 - 30
[Fooocus] Initializing ...
[Fooocus] Loading models ...
Refiner unloaded.
[Fooocus] Processing prompts ...
[Fooocus] Preparing Fooocus text #1 ...
[Prompt Expansion] Portrait of a young man on the beach, full light, gorgeous, amazing, elegant, intricate, highly detailed, dynamic, rich deep vivid colors, beautiful, very inspirational, inspiring, thought, fancy, sharp focus, colorful, epic, professional, artistic, new, charismatic, cool, brilliant, awesome, attractive, shiny, fine detail, pretty, focused, creative
[Fooocus] Preparing Fooocus text #2 ...
[Prompt Expansion] Portrait of a young man on the beach, full pretty, attractive, fine detail, intricate, elegant, luxury, elite, dramatic light, highly detailed, cinematic, complex, sharp focus, illuminated, amazing, marvelous, thought, epic, fabulous, colorful, shiny, brilliant, symmetry, great, excellent composition, ambient, dynamic, vibrant colors, relaxed, beautiful
[Fooocus] Encoding positive #1 ...
[Fooocus Model Management] Moving model(s) has taken 0.11 seconds
[Fooocus] Encoding positive #2 ...
[Fooocus] Encoding negative #1 ...
[Fooocus] Encoding negative #2 ...
[Parameters] Denoising Strength = 1.0
[Parameters] Initial Latent shape: Image Space (1152, 896)
Preparation time: 3.60 seconds
[Sampler] refiner_swap_method = joint
[Sampler] sigma_min = 0.0291671771556139, sigma_max = 14.614643096923828
Requested to load SDXL
Loading 1 new model
[Fooocus Model Management] Moving model(s) has taken 2.40 seconds
100% 60/60 [00:55<00:00, 1.09it/s]
Image generated with private log at: /content/Fooocus/outputs/2023-11-29/log.html
Generating and saving time: 60.73 seconds
[Sampler] refiner_swap_method = joint
[Sampler] sigma_min = 0.0291671771556139, sigma_max = 14.614643096923828
Requested to load SDXL
Loading 1 new model
[Fooocus Model Management] Moving model(s) has taken 2.01 seconds
100% 60/60 [00:56<00:00, 1.06it/s]
Image generated with private log at: /content/Fooocus/outputs/2023-11-29/log.html
Generating and saving time: 61.85 seconds
Requested to load SDXLClipModel
Requested to load GPT2LMHeadModel
Loading 2 new models
[Fooocus Model Management] Moving model(s) has taken 1.57 seconds
Total time: 131.21 seconds
[Parameters] Adaptive CFG = 7
[Parameters] Sharpness = 2
[Parameters] ADM Scale = 1.5 : 0.8 : 0.3
[Parameters] CFG = 3.0
[Parameters] Seed = 7513856776859948774
[Fooocus] Downloading control models ...
[Fooocus] Loading control models ...
extra keys clip vision: ['vision_model.embeddings.position_ids']
| null | https://github.com/lllyasviel/Fooocus/pull/1710 | null | {'base_commit': 'd57afc88a48359bc1642c2ae30a091f0426eff43', 'files': [{'path': 'fooocus_colab.ipynb', 'status': 'modified', 'Loc': {'(None, None, 15)': {'mod': [15]}}}, {'path': 'readme.md', 'status': 'modified', 'Loc': {'(None, None, 127)': {'add': [127]}, '(None, None, 118)': {'mod': [118]}, '(None, None, 124)': {'mod': [124]}}}, {'path': 'ldm_patched/modules/args_parser.py', 'Loc': {'(None, None, None)': [99]}, 'base_commit': 'cca0ca704a713ab153938e78de6787609c723cad'}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "5",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"fooocus_colab.ipynb",
"ldm_patched/modules/args_parser.py"
],
"doc": [
"readme.md"
],
"test": [],
"config": [],
"asset": []
} | null | |
odoo | odoo | 72ec0050b442214c9be93907fc01a48832243c15 | https://github.com/odoo/odoo/issues/7306 | [v8.0] Bank statement: Customer Import Invoice wizard does not auto-fill the right field | Steps to reproduce:
1. Create a customer invoice
2. Create a new bank statement and import this invoice
3. Click on 'Reconcile'
Problem: no match proposition between the bank statement line and the invoice move line can be found, since the communication field is '/'. (The invoice number is in the 'Reference' field instead.)
So the invoice reference should be used as the communication.
Thanks
| null | null | https://github.com/odoo/odoo/commit/72ec0050b442214c9be93907fc01a48832243c15 | {'base_commit': '72ec0050b442214c9be93907fc01a48832243c15', 'files': [{'path': 'addons/account/account_bank_statement.py', 'status': 'modified', 'Loc': {"('account_bank_statement_line', 'get_reconciliation_proposition', 537)": {'mod': [575]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "1",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"addons/account/account_bank_statement.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null | |
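The one-line fix in `account_bank_statement.py` (`get_reconciliation_proposition`) makes matching fall back to the statement line's reference when the communication is only the '/' placeholder. A hedged stand-alone sketch of that rule, dict-based for simplicity, with field names taken from the issue text rather than the ORM:

```python
# Pick the string used to search for a matching invoice move line: prefer
# the communication ("name"), but fall back to the reference ("ref") when
# the communication is empty or the '/' placeholder.

def matching_key(st_line: dict) -> str:
    communication = (st_line.get("name") or "").strip()
    if communication in ("", "/"):
        return (st_line.get("ref") or "").strip()
    return communication
```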
binary-husky | gpt_academic | 197287fc303119bf71caf9b3f72280cab08da749 | https://github.com/binary-husky/gpt_academic/issues/1147 | [Bug]: Translating an arXiv paper fails, on both a local build and the official online version | ### Installation Method | 安装方法与平台
OneKeyInstall (one-click install script, Windows)
### Version | 版本
Latest | 最新版
### OS | 操作系统
Windows
### Describe the bug | 简述
The error from the official online version:
> [Local Message] Experimental function call failed:
>
> Traceback (most recent call last):
> File "./toolbox.py", line 165, in decorated
> yield from f(main_input, llm_kwargs, plugin_kwargs, chatbot_with_cookie, history, *args, **kwargs)
> File "./crazy_functions/Latex输出PDF结果.py", line 249, in Latex翻译中文并重新编译PDF
> txt, arxiv_id = yield from arxiv_download(chatbot, history, txt)
> File "./crazy_functions/Latex输出PDF结果.py", line 141, in arxiv_download
> extract_archive(file_path=dst, dest_dir=extract_dst)
> File "./toolbox.py", line 507, in extract_archive
> with tarfile.open(file_path, 'r:*') as tarobj:
> File "/usr/lib/python3.8/tarfile.py", line 1608, in open
> raise ReadError("file could not be opened successfully")
> tarfile.ReadError: file could not be opened successfully
>
> Current proxy availability:
> 
> Proxy config socks5h://localhost:7890, proxy location: Japan
The error from the local build:
> [Local Message] Experimental function call failed:
>
> Traceback (most recent call last):
> File ".\toolbox.py", line 150, in decorated
> yield from f(main_input, llm_kwargs, plugin_kwargs, chatbot_with_cookie, history, *args, **kwargs)
> File ".\crazy_functions\Latex输出PDF结果.py", line 250, in Latex翻译中文并重新编译PDF
> txt, arxiv_id = yield from arxiv_download(chatbot, history, txt, allow_cache)
> File ".\crazy_functions\Latex输出PDF结果.py", line 139, in arxiv_download
> extract_archive(file_path=dst, dest_dir=extract_dst)
> File ".\toolbox.py", line 461, in extract_archive
> with tarfile.open(file_path, 'r:*') as tarobj:
> File "D:\academic-gpt\installer_files\env\lib\tarfile.py", line 1811, in open
> raise ReadError(f"file could not be opened successfully:\n{error_msgs_summary}")
> tarfile.ReadError: file could not be opened successfully:
> - method gz: ReadError('invalid header')
> - method bz2: ReadError('not a bzip2 file')
> - method xz: ReadError('not an lzma file')
> - method tar: ReadError('invalid header')
>
> Current proxy availability:
> 
> Proxy config socks5h://127.0.0.1:12341, proxy location: Hong Kong - Cloudflare, Inc.
The arXiv paper being translated: https://arxiv.org/abs/2112.10551
### Screen Shot | 有帮助的截图

### Terminal Traceback & Material to Help Reproduce Bugs | 终端traceback(如有) + 帮助我们复现的测试材料样本(如有)
_No response_ | null | null | https://github.com/binary-husky/gpt_academic/commit/197287fc303119bf71caf9b3f72280cab08da749 | {'base_commit': '197287fc303119bf71caf9b3f72280cab08da749', 'files': [{'path': 'shared_utils/handle_upload.py', 'status': 'modified', 'Loc': {"(None, 'extract_archive', 91)": {'mod': [107, 108, 109, 110, 111, 112, 113, 114, 116, 117]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "2",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"shared_utils/handle_upload.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null | |
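The fix (in `shared_utils/handle_upload.py`) appears to harden `extract_archive` against what arXiv actually serves, since an "e-print" download is not always a tarball. A stdlib sketch of that idea follows; the probing strategy and error wording are my own, not the project's.

```python
import os
import tarfile
import zipfile

# Probe the archive format instead of assuming tarfile can open everything:
# arXiv source downloads may be a tarball, a zip, or something else entirely.

def extract_archive(file_path: str, dest_dir: str) -> None:
    os.makedirs(dest_dir, exist_ok=True)
    if tarfile.is_tarfile(file_path):
        with tarfile.open(file_path, "r:*") as tarobj:
            tarobj.extractall(dest_dir)
    elif zipfile.is_zipfile(file_path):
        with zipfile.ZipFile(file_path) as zipobj:
            zipobj.extractall(dest_dir)
    else:
        raise RuntimeError(f"Unrecognized archive format: {file_path}")
```

Probing up front turns the opaque `ReadError: file could not be opened successfully` into an actionable message.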
binary-husky | gpt_academic | 65317e33af87640b68c84c9f6ee67188b76c6d7a | https://github.com/binary-husky/gpt_academic/issues/558 | Could EdgeGPT be used to support the Microsoft Bing API? | Please take a look at this project: https://github.com/acheong08/EdgeGPT
If the Bing API (or future third-party APIs from Baidu, Alibaba, etc.) could be called easily, it would be a blessing for users who have neither an OpenAI key nor the means to deploy GLM locally. | null | null | https://github.com/binary-husky/gpt_academic/commit/65317e33af87640b68c84c9f6ee67188b76c6d7a | {'base_commit': '65317e33af87640b68c84c9f6ee67188b76c6d7a', 'files': [{'path': 'config.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [65], 'mod': [47, 48]}}}, {'path': 'request_llm/bridge_all.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [21, 119]}}}]}
"iss_type": "4",
"iss_reason": "2",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"request_llm/bridge_all.py",
"config.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null | |
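The linked commit wires a "newbing" backend into the model registry in `request_llm/bridge_all.py` and exposes it in `config.py`. A simplified sketch of how such a registry maps a model name to its handler is below; the field names are assumptions based on the file's role, not its actual schema.

```python
# Sketch of a model registry: each model name maps to a predict callable
# plus limits/endpoint metadata, so new backends (like EdgeGPT's "newbing")
# plug in by registering one more entry.

def make_registry() -> dict:
    registry = {}

    def register(name, predict_fn, max_token=4096, endpoint=None):
        registry[name] = {
            "fn_with_ui": predict_fn,
            "max_token": max_token,
            "endpoint": endpoint,
        }

    register(
        "gpt-3.5-turbo",
        lambda *a: "openai",
        endpoint="https://api.openai.com/v1/chat/completions",
    )
    register("newbing", lambda *a: "edgegpt")  # backend from acheong08/EdgeGPT
    return registry
```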
binary-husky | gpt_academic | e359fff0405c4cb865b809b4ecfc0a95a54d2512 | https://github.com/binary-husky/gpt_academic/issues/1554 | [Bug]: Docker-installed version fails when using the Spark API | ### Installation Method | 安装方法与平台
Docker-Compose(Windows/Mac)
### Version | 版本
Latest | 最新版
### OS | 操作系统
Mac
### Describe the bug | 简述
With a local conda install on macOS, the Spark API works fine. After installing via docker compose, however, requests through the Spark API fail, while the Qianfan API still works normally.
### Screen Shot | 有帮助的截图
<img width="1423" alt="Snipaste_2024-02-14_21-12-27" src="https://github.com/binary-husky/gpt_academic/assets/57496712/a1672606-c4ce-4328-a05a-b1d311cc26bd">
### Terminal Traceback & Material to Help Reproduce Bugs | 终端traceback(如有) + 帮助我们复现的测试材料样本(如有)
gpt_academic_nolocalllms-1 | error: Connection to remote host was lost.
gpt_academic_nolocalllms-1 | Exception ignored in thread started by: <function SparkRequestInstance.create_blocking_request.<locals>.run at 0x2aaaf7fdfa60>
gpt_academic_nolocalllms-1 | Traceback (most recent call last):
gpt_academic_nolocalllms-1 | File "/gpt/request_llms/com_sparkapi.py", line 113, in run
gpt_academic_nolocalllms-1 | ws.send(data)
gpt_academic_nolocalllms-1 | File "/usr/local/lib/python3.11/site-packages/websocket/_app.py", line 284, in send
gpt_academic_nolocalllms-1 | raise WebSocketConnectionClosedException("Connection is already closed.")
gpt_academic_nolocalllms-1 | websocket._exceptions.WebSocketConnectionClosedException: Connection is already closed.
gpt_academic_nolocalllms-1 | Traceback (most recent call last):
gpt_academic_nolocalllms-1 | File "/usr/local/lib/python3.11/site-packages/gradio/routes.py", line 422, in run_predict
gpt_academic_nolocalllms-1 | output = await app.get_blocks().process_api(
gpt_academic_nolocalllms-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
gpt_academic_nolocalllms-1 | File "/usr/local/lib/python3.11/site-packages/gradio/blocks.py", line 1323, in process_api
gpt_academic_nolocalllms-1 | result = await self.call_function(
gpt_academic_nolocalllms-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^
gpt_academic_nolocalllms-1 | File "/usr/local/lib/python3.11/site-packages/gradio/blocks.py", line 1067, in call_function
gpt_academic_nolocalllms-1 | prediction = await utils.async_iteration(iterator)
gpt_academic_nolocalllms-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
gpt_academic_nolocalllms-1 | File "/usr/local/lib/python3.11/site-packages/gradio/utils.py", line 336, in async_iteration
gpt_academic_nolocalllms-1 | return await iterator.__anext__()
gpt_academic_nolocalllms-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^
gpt_academic_nolocalllms-1 | File "/usr/local/lib/python3.11/site-packages/gradio/utils.py", line 329, in __anext__
gpt_academic_nolocalllms-1 | return await anyio.to_thread.run_sync(
gpt_academic_nolocalllms-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
gpt_academic_nolocalllms-1 | File "/usr/local/lib/python3.11/site-packages/anyio/to_thread.py", line 56, in run_sync
gpt_academic_nolocalllms-1 | return await get_async_backend().run_sync_in_worker_thread(
gpt_academic_nolocalllms-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
gpt_academic_nolocalllms-1 | File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 2134, in run_sync_in_worker_thread
gpt_academic_nolocalllms-1 | return await future
gpt_academic_nolocalllms-1 | ^^^^^^^^^^^^
gpt_academic_nolocalllms-1 | File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 851, in run
gpt_academic_nolocalllms-1 | result = context.run(func, *args)
gpt_academic_nolocalllms-1 | ^^^^^^^^^^^^^^^^^^^^^^^^
gpt_academic_nolocalllms-1 | File "/usr/local/lib/python3.11/site-packages/gradio/utils.py", line 312, in run_sync_iterator_async
gpt_academic_nolocalllms-1 | return next(iterator)
gpt_academic_nolocalllms-1 | ^^^^^^^^^^^^^^
gpt_academic_nolocalllms-1 | File "/gpt/toolbox.py", line 115, in decorated
gpt_academic_nolocalllms-1 | yield from f(txt_passon, llm_kwargs, plugin_kwargs, chatbot_with_cookie, history, system_prompt, *args)
gpt_academic_nolocalllms-1 | File "/gpt/request_llms/bridge_all.py", line 765, in predict
gpt_academic_nolocalllms-1 | yield from method(inputs, llm_kwargs, *args, **kwargs)
gpt_academic_nolocalllms-1 | File "/gpt/request_llms/bridge_spark.py", line 60, in predict
gpt_academic_nolocalllms-1 | if response == f"[Local Message] 等待{model_name}响应中 ...":
gpt_academic_nolocalllms-1 | ^^^^^^^^
gpt_academic_nolocalllms-1 | UnboundLocalError: cannot access local variable 'response' where it is not associated with a value
| null | null | https://github.com/binary-husky/gpt_academic/commit/e359fff0405c4cb865b809b4ecfc0a95a54d2512 | {'base_commit': 'e359fff0405c4cb865b809b4ecfc0a95a54d2512', 'files': [{'path': 'request_llms/bridge_qianfan.py', 'status': 'modified', 'Loc': {"(None, 'predict', 135)": {'add': [148, 151], 'mod': [161, 162, 163, 164, 165, 166]}}}, {'path': 'request_llms/bridge_qwen.py', 'status': 'modified', 'Loc': {"(None, 'predict', 25)": {'add': [53]}}}, {'path': 'request_llms/bridge_skylark2.py', 'status': 'modified', 'Loc': {"(None, 'predict', 32)": {'add': [58]}}}, {'path': 'request_llms/bridge_spark.py', 'status': 'modified', 'Loc': {"(None, 'predict', 36)": {'add': [54]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"request_llms/bridge_qwen.py",
"request_llms/bridge_qianfan.py",
"request_llms/bridge_skylark2.py",
"request_llms/bridge_spark.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null | |
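The traceback bottoms out in `UnboundLocalError: cannot access local variable 'response'`: when the websocket drops before the Spark stream yields anything, `response` is never assigned, yet the code compares it afterwards. The fix commit touches the `predict` functions in several `request_llms/bridge_*.py` files; the sketch below illustrates the pattern of binding `response` before the loop (the message strings are my own, not the project's).

```python
# Bind `response` before the streaming loop so a stream that closes without
# yielding anything produces a clean error instead of UnboundLocalError.

def predict(stream, model_name="sparkv2"):
    waiting = f"[Local Message] waiting for {model_name} ..."
    response = waiting  # defined up front, regardless of what the stream does
    for chunk in stream:
        response = chunk
        yield response
    if response == waiting:  # stream ended without ever yielding data
        yield f"[Local Message] {model_name} returned no data; check the connection."
```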
binary-husky | gpt_academic | c17fc2a9b55b1c7447718a06a3eac4378828bb22 | https://github.com/binary-husky/gpt_academic/issues/1021 | waiting feedback | [Feature]: The Tongyi Qianwen (Qwen) model has been open-sourced; please consider adding it | ### Class | 类型
None
### Feature Request | 功能请求
Open-source links:
ModelScope:
https://modelscope.cn/models/qwen/Qwen-7B/summary
https://modelscope.cn/models/qwen/Qwen-7B-Chat/summary
Hugging Face:https://huggingface.co/Qwen
GitHub:https://github.com/QwenLM/Qwen-7B | null | null | https://github.com/binary-husky/gpt_academic/commit/c17fc2a9b55b1c7447718a06a3eac4378828bb22 | {'base_commit': 'c17fc2a9b55b1c7447718a06a3eac4378828bb22', 'files': [{'path': 'config.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [74]}}}, {'path': 'request_llm/bridge_all.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [337]}}}, {'path': 'request_llm/bridge_qwen.py', 'status': 'modified', 'Loc': {"('GetONNXGLMHandle', 'load_model_and_tokenizer', 26)": {'mod': [35, 37, 38, 39, 40]}, "('GetONNXGLMHandle', None, 19)": {'mod': [43, 57, 58]}}}]} | [] | [] | [] | {
"iss_type": "4",
"iss_reason": "2",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"request_llm/bridge_all.py",
"request_llm/bridge_qwen.py",
"config.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null |
binary-husky | gpt_academic | 19bd0c35ed05e6f99c8e3c0a8c994b1385341cae | https://github.com/binary-husky/gpt_academic/issues/1053 | ToDo | [Bug]: Local LaTeX translation fails | ### Installation Method | 安装方法与平台
Pip Install (I used latest requirements.txt)
### Version | 版本
Latest | 最新版
### OS | 操作系统
Windows
### Describe the bug | 简述
* Problem: the so-called "fp" (file pointer) cannot be found

* Stack: this is a call to a sub-function (merge_tex_files_) of the **tex-file merge (merge_tex_files)** function, whose main job is to inline the contents of the \input commands in the original tex. In practice there is a problem, found through debugging; the debug code (just a few added prints) and result screenshots are attached below.
```python
def merge_tex_files_(project_foler, main_file, mode):
"""
    Merge Tex project recursively
"""
main_file = rm_comments(main_file)
for s in reversed([q for q in re.finditer(r"\\input\{(.*?)\}", main_file, re.M)]):
f = s.group(1)
fp = os.path.join(project_foler, f)
fp = find_tex_file_ignore_case(fp)
if fp:
with open(fp, 'r', encoding='utf-8', errors='replace') as fx: c = fx.read()
else:
raise RuntimeError(f'找不到{fp},Tex源文件缺失!')
c = merge_tex_files_(project_foler, c, mode)
main_file = main_file[:s.span()[0]] + c + main_file[s.span()[1]:]
return main_file
```
**Debug code**
```python
def merge_tex_files_(project_foler, main_file, mode):
"""
    Merge Tex project recursively
"""
## === AAS ADDED FOR TEST ===
print('======== IN merge_tex_files_(SUB FUN) Function ===========')
# print('project_foler:{}\nmain_file:{}\nmode:{}'.format(project_foler,main_file,mode))
## ===
main_file = rm_comments(main_file)
for s in reversed([q for q in re.finditer(r"\\input\{(.*?)\}", main_file, re.M)]):
## === AAS ADDED FOR TEST ===
print('======== IN LOOP of merge_tex_files_(SUB FUN)===========')
print("s:",s)
## === AAS ADDED FOR TEST ===
f = s.group(1)
## === AAS ADDED FOR TEST ===
print("f:",f)
## === AAS ADDED FOR TEST ===
fp = os.path.join(project_foler, f)
## === AAS ADDED FOR TEST ===
print("fp1:",fp)
## === AAS ADDED FOR TEST ===
fp = find_tex_file_ignore_case(fp)
## === AAS ADDED FOR TEST ===
print("fp2:",fp)
## === AAS ADDED FOR TEST ===
if fp:
with open(fp, 'r', encoding='utf-8', errors='replace') as fx: c = fx.read()
else:
raise RuntimeError(f'找不到{fp},Tex源文件缺失!')
c = merge_tex_files_(project_foler, c, mode)
main_file = main_file[:s.span()[0]] + c + main_file[s.span()[1]:]
return main_file
```
**Result screenshot**

**Code of the failing section**
```python
def find_tex_file_ignore_case(fp):
dir_name = os.path.dirname(fp)
base_name = os.path.basename(fp)
## === AAS ADDED FOR TEST ===
print('============ IN find_tex_file_ignore_case Fun ==========')
print('dir_name:',dir_name)
print('base_name',base_name)
## === AAS ADDED FOR TEST ===
    ## The failing case is a .bbl include, not a .tex file; try removing the .tex restriction
if not base_name.endswith('.tex'): base_name+='.tex'
## === AAS ADDED FOR TEST ===
if os.path.exists(pj(dir_name, base_name)): return pj(dir_name, base_name)
# go case in-sensitive
import glob
for f in glob.glob(dir_name+'/*.tex'):
base_name_s = os.path.basename(fp)
if base_name_s.lower() == base_name.lower(): return f
return None
```
* Actual root cause: in short, this spot (the `find_tex_file_ignore_case` function) **only handles merging .tex files**. Other cases, such as **`.bbl`** (another citation format that can be compiled directly inside the tex; fairly primitive but compact, and slightly different from the usual `.bib` workflow via `references`), **were not considered**, so a tex source file appeared to be missing during the merge
* Suggested improvement: for `\input` it is true that usually only tex appears, so using `find_tex_file_ignore_case` is fine, but other cases are worth considering, e.g. plain text (.txt) and other code (`.c, .cpp, .py`, etc.)
 - Approach: **simply drop the .tex restriction** and insert whatever is referenced, then let TeX compile it. That is what effectively happens anyway, so there is no real need to restrict `\input` to tex at this point; if something is wrong, TeX's own output will reveal it. Code-wise, just comment out the line below
```python
if not base_name.endswith('.tex'): base_name+='.tex'
```
* P.S. tracking this down was rather tedious, haha; at first glance it was unclear what was going on, but it is actually a small issue
After commenting it out, it works normally for now
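A minimal, self-contained sketch of the suggested fix (the function name and logic here are illustrative, not the repository's actual code): resolve an `\input{...}` target by trying the path as written, then with a `.tex` suffix, then falling back to a case-insensitive directory scan, so `.bbl`, `.txt`, and code files are accepted too.

```python
import glob
import os

def find_input_file_ignore_case(fp):
    """Resolve an \\input{...} target without forcing a .tex suffix."""
    dir_name = os.path.dirname(fp)
    base_name = os.path.basename(fp)
    candidates = (base_name, base_name + '.tex')
    # 1) exact path, 2) path with .tex appended
    for cand in candidates:
        path = os.path.join(dir_name, cand)
        if os.path.exists(path):
            return path
    # 3) case-insensitive fallback over every file in the directory
    lowered = tuple(c.lower() for c in candidates)
    for path in glob.glob(os.path.join(dir_name, '*')):
        if os.path.basename(path).lower() in lowered:
            return path
    return None
```

With this shape, a `\input{refs.bbl}` resolves to the `.bbl` file directly instead of raising "Tex source file missing".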
### Screen Shot | 有帮助的截图
See the Describe the bug part
### Terminal Traceback & Material to Help Reproduce Bugs | 终端traceback(如有) + 帮助我们复现的测试材料样本(如有)
See the Describe the bug part | null | null | https://github.com/binary-husky/gpt_academic/commit/19bd0c35ed05e6f99c8e3c0a8c994b1385341cae | {'base_commit': '19bd0c35ed05e6f99c8e3c0a8c994b1385341cae', 'files': [{'path': 'crazy_functions/latex_fns/latex_toolbox.py', 'status': 'modified', 'Loc': {"(None, 'find_tex_file_ignore_case', 281)": {'add': [283], 'mod': [286]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"crazy_functions/latex_fns/latex_toolbox.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null |
binary-husky | gpt_academic | e24f077b68e38b679e5ca25853ea2c402f074ea3 | https://github.com/binary-husky/gpt_academic/issues/1120 | [Feature]: Please add a model option for Azure OpenAI GPT-4 | ### Class | 类型
Main program
### Feature Request | 功能请求
As titled. | null | null | https://github.com/binary-husky/gpt_academic/commit/e24f077b68e38b679e5ca25853ea2c402f074ea3 | {'base_commit': 'e24f077b68e38b679e5ca25853ea2c402f074ea3', 'files': [{'path': 'config.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [83]}}}, {'path': 'request_llm/bridge_all.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [147]}}}]} | [] | [] | [] | {
"iss_type": "4",
"iss_reason": "2",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"request_llm/bridge_all.py",
"config.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null | |
deepfakes | faceswap | a799f769e4c48908c3efd64792384403392f2e82 | https://github.com/deepfakes/faceswap/issues/67 | Cluster faces during extract using dlib.chinese_whispers_clustering | I have had some success hacking together a pre-processing script to run over my training images. It uses [dlib.chinese_whispers_clustering](http://dlib.net/python/index.html#dlib.chinese_whispers_clustering) to group the found faces in the training data based on likeness. I think one of the keys to good results is good training sets, and this helps to prevent polluting the training data with other peoples faces as tends to be the case with Google image search sets or images with multiple faces.
There are a couple of ways I think this could be integrated into the project:
1) during extract when generating face chips, discard non target faces (all faces not in the largest cluster)
2) during convert where frames have multiple faces, identifying only the target face for replacement.
Here's [the script](https://gist.github.com/badluckwiththinking/92dd6f155bc8babca6422b08b642d35d), sorry it's a bit hacky, I just wanted something that worked and haven't cleaned it up. I'm not sure where I would begin to integrate it into the project, perhaps as an alternative plugin?
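For readers unfamiliar with the algorithm, here is a tiny pure-Python sketch of the Chinese Whispers idea behind `dlib.chinese_whispers_clustering` (dlib builds the edge list from pairwise face-descriptor distances; everything below is illustrative, not dlib's implementation):

```python
import random

def chinese_whispers(edges, n_nodes, iterations=20, seed=0):
    """Cluster an undirected weighted graph.

    edges: list of (i, j, weight) tuples; returns one label per node.
    Each node repeatedly adopts the label carrying the largest total
    edge weight among its neighbours, so densely connected groups
    (e.g. the same person's faces) converge onto one label.
    """
    rng = random.Random(seed)
    neighbors = {i: [] for i in range(n_nodes)}
    for i, j, w in edges:
        neighbors[i].append((j, w))
        neighbors[j].append((i, w))
    labels = list(range(n_nodes))  # every node starts as its own cluster
    for _ in range(iterations):
        order = list(range(n_nodes))
        rng.shuffle(order)
        for node in order:
            if not neighbors[node]:
                continue
            weights = {}
            for nb, w in neighbors[node]:
                weights[labels[nb]] = weights.get(labels[nb], 0.0) + w
            labels[node] = max(weights, key=weights.get)
    return labels
```

Discarding all faces outside the largest resulting cluster is then a one-liner with `collections.Counter`.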
| null | https://github.com/deepfakes/faceswap/pull/61 | null | {'base_commit': 'a799f769e4c48908c3efd64792384403392f2e82', 'files': [{'path': 'Dockerfile', 'status': 'modified', 'Loc': {'(None, None, 14)': {'add': [14]}, '(None, None, 10)': {'mod': [10, 11, 12]}, '(None, None, 16)': {'mod': [16]}, '(None, None, 18)': {'mod': [18]}}}, {'path': 'faceswap.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [2, 8, 17, 18, 19, 20]}}}, {'path': 'lib/DetectedFace.py', 'status': 'removed', 'Loc': {}}, {'path': 'lib/aligner.py', 'status': 'modified', 'Loc': {"(None, 'get_align_mat', 25)": {'mod': [26]}}}, {'path': 'lib/cli.py', 'status': 'modified', 'Loc': {"('DirectoryProcessor', 'process_arguments', 34)": {'add': [47], 'mod': [49, 51]}, '(None, None, None)': {'mod': [5]}, "('DirectoryProcessor', 'process_directory', 51)": {'mod': [56, 59]}, "('DirectoryProcessor', None, 14)": {'mod': [62]}}}, {'path': 'lib/faces_detect.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [0], 'mod': [3, 4, 28]}, "(None, 'detect_faces', 6)": {'mod': [9, 11, 12, 13, 14, 15, 16]}}}, {'path': 'lib/model.py', 'status': 'removed', 'Loc': {}}, {'path': 'lib/training_data.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [2, 3, 5], 'mod': [45]}, "(None, 'get_training_data', 13)": {'mod': [13, 14, 15, 16, 17, 18, 20, 21, 22, 23, 24, 26, 27, 29]}, "(None, 'random_warp', 47)": {'mod': [49]}}}, {'path': 'lib/utils.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [16], 'mod': [1, 2]}, "(None, 'get_folder', 8)": {'mod': [10]}, "(None, 'load_images', 18)": {'mod': [18, 19, 20, 21, 22, 23, 24, 25, 26]}}}, {'path': 'plugins/Convert_Adjust.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [0]}, "('Convert', None, 5)": {'mod': [6, 7]}, "('Convert', 'patch_image', 12)": {'mod': [21]}}}, {'path': 'plugins/Convert_Masked.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [0], 'mod': [6]}, "('Convert', None, 8)": {'mod': [9, 
10]}, "('Convert', 'get_new_face', 51)": {'mod': [54]}, "('Convert', 'get_image_mask', 58)": {'mod': [67]}}}, {'path': 'plugins/Extract_Align.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [0]}, "('Extract', 'extract', 6)": {'add': [7]}}}, {'path': 'plugins/Extract_Crop.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [0]}}}, {'path': 'plugins/PluginLoader.py', 'status': 'modified', 'Loc': {"('PluginLoader', None, 2)": {'mod': [4, 5, 6, 9, 10, 11, 14, 15]}}}, {'path': 'scripts/convert.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [4, 64], 'mod': [7, 8, 9]}, "('ConvertImage', 'process_image', 38)": {'add': [48], 'mod': [42, 43, 44, 45, 47, 50, 51, 52, 53, 54, 57]}, "('ConvertImage', None, 13)": {'mod': [38, 39, 40]}}}, {'path': 'scripts/extract.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [5]}, "('ExtractTrainingData', None, 8)": {'mod': [18, 19]}, "('ExtractTrainingData', 'process_image', 18)": {'mod': [22, 23, 24, 25, 26, 28, 29, 30, 31]}}}, {'path': 'scripts/train.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [10], 'mod': [5, 6, 8, 9]}, "('TrainingProcessor', 'process_arguments', 18)": {'mod': [24, 25, 26, 27, 28, 29, 30]}, "('TrainingProcessor', None, 12)": {'mod': [89, 90, 91, 92, 93, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 107, 108, 109, 111, 113, 114, 115, 116]}, "('TrainingProcessor', 'process', 118)": {'mod': [119, 122, 123, 125, 127, 129, 131, 132, 133, 134, 135, 136, 138, 139, 140, 142, 143, 144, 146, 147, 148, 149, 150, 151, 152, 153, 154, 155]}}}]} | [] | [] | [] | {
"iss_type": "4",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"lib/aligner.py",
"lib/model.py",
"lib/training_data.py",
"plugins/Convert_Adjust.py",
"plugins/Extract_Align.py",
"plugins/Extract_Crop.py",
"scripts/train.py",
"faceswap.py",
"plugins/PluginLoader.py",
"plugins/Convert_Masked.py",
"lib/DetectedFace.py",
"lib/faces_detect.py",
"lib/utils.py",
"lib/cli.py",
"scripts/convert.py",
"scripts/extract.py"
],
"doc": [],
"test": [],
"config": [
"Dockerfile"
],
"asset": []
} | null | |
deepfakes | faceswap | f5dd18352c6640bc5c39a01642c7ac7356c0dea1 | https://github.com/deepfakes/faceswap/issues/718 | bug | [Windows] cuda_path was not set when the first check succeeded. | **Describe the bug**
setup.py file:
cuDNN was not detected when `cuda_check` succeeded on the first check (via `nvcc -V`), because `self.env.cuda_path` was never set
**To Reproduce**
Steps to reproduce the behavior:
1. Run `python setup.py` in a Windows 10 environment
**Expected behavior**
The cuDNN library should be detected.
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Desktop (please complete the following information):**
- OS: Windows 10
**Additional context**
I temporarily disabled the first CUDA check method, so it is working for now.
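An illustrative sketch of the reported bug class (not Faceswap's real API; the probe callables are injected here just to make the flow testable): whichever branch detects CUDA must also record the install path, because the later cuDNN check reads it.

```python
import os

def cuda_check(env, run_nvcc, fallback_check):
    """Detect CUDA and record both version and install path in env."""
    version = run_nvcc()  # e.g. '10.0' when `nvcc -V` works, else None
    if version:
        env['cuda_version'] = version
        # The reported bug: this assignment was missing on the fast
        # nvcc path, so cuDNN detection failed despite CUDA being found.
        env['cuda_path'] = os.environ.get('CUDA_PATH', '')
        return True
    return fallback_check(env)  # registry / ldconfig style probe
```

The point is simply that every success path must leave `env` in the same shape.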
| null | null | https://github.com/deepfakes/faceswap/commit/f5dd18352c6640bc5c39a01642c7ac7356c0dea1 | {'base_commit': 'f5dd18352c6640bc5c39a01642c7ac7356c0dea1', 'files': [{'path': 'lib/gpu_stats.py', 'status': 'modified', 'Loc': {"('GPUStats', 'initialize', 64)": {'mod': [92]}}}, {'path': 'setup.py', 'status': 'modified', 'Loc': {"('Checks', None, 314)": {'add': [353]}, "('Checks', 'cudnn_check', 458)": {'add': [459]}, "('Install', 'ask_continue', 542)": {'add': [543]}, "('Checks', 'cuda_check_linux', 423)": {'mod': [442, 443, 444]}, "('Checks', 'cuda_check_windows', 445)": {'mod': [451]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "1",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"setup.py",
"lib/gpu_stats.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null |
deepfakes | faceswap | dea984efc1c720832d7c32513c806b4b67cc6560 | https://github.com/deepfakes/faceswap/issues/590 | Disable logging | In previous commits, before the logging implementation, multiple GPUs were able to run different tasks simultaneously (extract/train/convert).
After the logging commit, only 1 task can be run due to the log file being in use by the first process.
Is there an option to disable logging or specify a log file instead? | null | null | https://github.com/deepfakes/faceswap/commit/dea984efc1c720832d7c32513c806b4b67cc6560 | {'base_commit': 'dea984efc1c720832d7c32513c806b4b67cc6560', 'files': [{'path': 'lib/cli.py', 'status': 'modified', 'Loc': {"('ScriptExecutor', 'execute_script', 83)": {'mod': [85]}, "('DirOrFileFullPaths', None, 150)": {'mod': [150]}, "('FaceSwapArgs', 'get_global_arguments', 265)": {'mod': [274, 275, 276, 277]}}}, {'path': 'lib/gui/utils.py', 'status': 'modified', 'Loc': {"('FileHandler', '__init__', 36)": {'mod': [48, 49, 50, 51, 57, 58]}, "('ContextMenu', None, 332)": {'mod': [334]}}}, {'path': 'lib/logger.py', 'status': 'modified', 'Loc': {"(None, 'log_setup', 71)": {'mod': [71, 79]}, "(None, 'file_handler', 89)": {'mod': [89, 91, 92, 93]}}}]} | [] | [] | [] | {
"iss_type": "4",
"iss_reason": "2",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"lib/cli.py",
"lib/gui/utils.py",
"lib/logger.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null | |
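Regarding the log-file contention described in the faceswap "Disable logging" issue above (two simultaneous extract/train/convert processes cannot share one log file handle on Windows), one common workaround is a per-process log file. A hedged sketch; the function and file names are made up, not Faceswap's actual options:

```python
import logging
import logging.handlers
import os
import tempfile

def make_file_handler(logfile=None):
    """Return a file handler whose default path embeds the PID, so two
    concurrent runs never open the same log file."""
    if logfile is None:
        logfile = os.path.join(tempfile.gettempdir(),
                               f"faceswap_{os.getpid()}.log")
    handler = logging.handlers.RotatingFileHandler(
        logfile, maxBytes=1_000_000, backupCount=1)
    handler.setFormatter(logging.Formatter(
        "%(asctime)s %(processName)s %(levelname)-8s %(message)s"))
    return handler
```

Exposing `logfile` as a CLI option would also cover the "specify a log file instead" request.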
3b1b | manim | b4b4d39ec51cbfce7fafdc5ff0f9f4ddfd26b181 | https://github.com/3b1b/manim/issues/1436 | bug | PNG images have a black background (no transparency) | ### Description
When trying to display a PNG image (with a transparent background), the background shows as black; the issue did not occur with the Cairo renderer.
**Code**:
```python
img = ImageMobject("./dice.png")
self.play(FadeIn(img))
```
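As background, the classic "over" compositing rule shows why a renderer that ignores the alpha channel paints transparent pixels as their raw RGB, which is often black. A minimal numeric sketch, unrelated to manim's actual GLSL fix:

```python
def composite_over(src_rgba, dst_rgb):
    """Alpha-over compositing: out = a * src + (1 - a) * dst, per channel.
    If alpha were ignored (treated as 1), fully transparent pixels would
    render as their stored RGB instead of letting the background through."""
    r, g, b, a = src_rgba
    return tuple(a * s + (1 - a) * d for s, d in zip((r, g, b), dst_rgb))
```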
### Results
<img width="626" alt="result" src="https://user-images.githubusercontent.com/38077008/110259231-7639ec80-7f9e-11eb-898a-c0cf5757de5d.png">
# Original image

| null | null | https://github.com/3b1b/manim/commit/b4b4d39ec51cbfce7fafdc5ff0f9f4ddfd26b181 | {'base_commit': 'b4b4d39ec51cbfce7fafdc5ff0f9f4ddfd26b181', 'files': [{'path': 'manimlib/shaders/image/frag.glsl', 'status': 'modified', 'Loc': {'(None, None, 12)': {'mod': [12]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "2",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [],
"doc": [],
"test": [],
"config": [],
"asset": [
"manimlib/shaders/image/frag.glsl"
]
} | null |
3b1b | manim | e1c049bece420bc1190eb3ed4d5d9878c431aa5e | https://github.com/3b1b/manim/issues/394 | import readline is failing | I am trying to run example_scenes.py and it threw a ModuleNotFoundError when it tried to import readline. This should be easy to resolve: just pip install readline, right? Nope. readline apparently doesn't work on Windows, and I got this strange follow-up error below. I don't know what to do at this point. Help?
c:\Tensorexperiments\manim>python -m manim example_scenes.py SquareToCircle -pl
Traceback (most recent call last):
File "C:\Program Files\Python36\lib\runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "C:\Program Files\Python36\lib\runpy.py", line 85, in _run_code
exec(code, run_globals)
File "c:\Tensorexperiments\manim\manim.py", line 4, in <module>
import manimlib.stream_starter
File "c:\Tensorexperiments\manim\manimlib\stream_starter.py", line 4, in <module>
import readline
ModuleNotFoundError: No module named 'readline'
c:\Tensorexperiments\manim>pip install readline
Collecting readline
Downloading https://files.pythonhosted.org/packages/f4/01/2cf081af8d880b44939a5f1b446551a7f8d59eae414277fd0c303757ff1b/readline-6.2.4.1.tar.gz (2.3MB)
100% |████████████████████████████████| 2.3MB 8.5MB/s
Complete output from command python setup.py egg_info:
error: this module is not meant to work on Windows
Command "python setup.py egg_info" failed with error code 1 in C:\Users\SAMERN~1\AppData\Local\Temp\pip-install-z8maklzo\readline\ | null | https://github.com/3b1b/manim/pull/672 | null | {'base_commit': 'e1c049bece420bc1190eb3ed4d5d9878c431aa5e', 'files': [{'path': 'requirements.txt', 'status': 'modified', 'Loc': {'(None, None, 11)': {'add': [11]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [],
"doc": [],
"test": [],
"config": [
"requirements.txt"
],
"asset": []
} | null | |
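For the Windows `readline` failure in the manim issue above: the linked PR fixed it by adding a Windows-only dependency to `requirements.txt`, but at the code level the portable pattern is to treat line editing as optional. A generic sketch (not manim's actual code):

```python
# GNU readline exists only on POSIX CPython; guard the import so the
# program still starts on Windows, just without history/completion.
try:
    import readline
except ImportError:
    readline = None

def remember(line):
    """Record a line in interactive history when readline is available."""
    if readline is not None and line:
        readline.add_history(line)
    return line
```

Callers keep working on every platform, and Windows users can install a readline substitute to restore history.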
All-Hands-AI | OpenHands | 660d1d1e64c5e28e96bf9b8172cd87d1d809fd07 | https://github.com/All-Hands-AI/OpenHands/issues/5876 | bug
severity:medium | [Bug]: "The model produces invalid content" | ### Is there an existing issue for the same bug?
- [X] I have checked the existing issues.
### Describe the bug and reproduction steps
https://www.all-hands.dev/share?share_id=dab4a77e7d64e7a4dc6124dc672d3f4beb2d411a33155977425b821e292d4f4c
The LLM is `gpt-4o`
In the logs I got
```python
{'error': {'message': 'The model produced invalid content. Consider modifying your prompt if you are seeing this error persistently.', 'type': 'model_error', 'param': None, 'code': None}}
```
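Since this kind of provider-side "model produced invalid content" error is usually transient, the standard mitigation is to resend the identical request with backoff. A generic retry sketch (not OpenHands' actual wrapper):

```python
import time

def call_with_retry(fn, retries=3, backoff=1.0, is_transient=lambda exc: True):
    """Call fn(), retrying transient failures with exponential backoff.
    Non-transient errors (or exhausting the retry budget) re-raise."""
    for attempt in range(retries):
        try:
            return fn()
        except Exception as exc:
            if attempt == retries - 1 or not is_transient(exc):
                raise
            time.sleep(backoff * (2 ** attempt))  # 1x, 2x, 4x, ...
```

An `is_transient` predicate matching the provider's `model_error` type keeps genuine bugs from being retried.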
### OpenHands Installation
Docker command in README
### OpenHands Version
0.17
### Operating System
Windows
### Logs, Errors, Screenshots, and Additional Context
_No response_ | null | https://github.com/All-Hands-AI/OpenHands/pull/7045 | null | {'base_commit': '660d1d1e64c5e28e96bf9b8172cd87d1d809fd07', 'files': [{'path': 'openhands/llm/llm.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [78]}, "('LLM', 'wrapper', 180)": {'mod': [220, 221, 222]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"openhands/llm/llm.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null |
scrapy | scrapy | 7e8453cf1ec992e5df5cebfeda08552c58e7c9bc | https://github.com/scrapy/scrapy/issues/2656 | sos filepipelines 302 | hi
when I set file_urls to "http://m.baidu.com/api?action=redirect&token=kpyysd&from=1014090y&type=app&dltype=new&refid=2650327114&tj=soft_5845028_88031597_%E8%AF%AD%E9%9F%B3%E6%90%9C%E7%B4%A2&refp=action_search&blink=da5b687474703a2f2f7265736765742e39312e636f6d2f536f66742f436f6e74726f6c6c65722e617368783f616374696f6e3d646f776e6c6f61642674706c3d312669643d34313034393931c658&crversion=1"
this URL redirects 3 times, so when Scrapy downloads it, it returns 302. How can I configure it so that it works? Please help me!

| null | https://github.com/scrapy/scrapy/pull/2616 | null | {'base_commit': '7e8453cf1ec992e5df5cebfeda08552c58e7c9bc', 'files': [{'path': 'docs/topics/media-pipeline.rst', 'status': 'modified', 'Loc': {'(None, None, 324)': {'add': [324]}}}, {'path': 'scrapy/pipelines/files.py', 'status': 'modified', 'Loc': {"('FilesPipeline', '__init__', 226)": {'mod': [252]}}}, {'path': 'scrapy/pipelines/media.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [2, 7]}, "('MediaPipeline', None, 16)": {'add': [29, 95], 'mod': [27]}, "('MediaPipeline', '_check_media_to_download', 96)": {'mod': [106]}}}, {'path': 'tests/mockserver.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [7, 10, 14, 122]}, "('Root', '__init__', 152)": {'add': [162]}}}, {'path': 'tests/test_pipeline_media.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [8, 84]}, "('BaseMediaPipelineTestCase', None, 22)": {'add': [24]}, "('MediaPipelineTestCase', 'test_use_media_to_download_result', 245)": {'add': [251]}, "('BaseMediaPipelineTestCase', 'setUp', 26)": {'mod': [28]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"scrapy/pipelines/media.py",
"scrapy/pipelines/files.py",
"tests/mockserver.py"
],
"doc": [
"docs/topics/media-pipeline.rst"
],
"test": [
"tests/test_pipeline_media.py"
],
"config": [],
"asset": []
} | null | |
ansible | ansible | cc00f21a358923c03e334e245d58df0853d10661 | https://github.com/ansible/ansible/issues/57069 | networking
module
support:network
nxos
bug
affects_2.7
cisco | nxos_vpc breaks using default vrf | ##### SUMMARY
When using `pkl_vrf: "default"`, the generated command is missing the vrf value
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
Module: nxos_vpc
##### ANSIBLE VERSION
```
ansible 2.7.2
config file = /etc/ansible/ansible.cfg
configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/dist-packages/ansible
executable location = /usr/bin/ansible
python version = 2.7.15rc1 (default, Nov 12 2018, 14:31:15) [GCC 7.3.0]
```
##### CONFIGURATION
Asterisks used for privacy
```
CACHE_PLUGIN(/etc/ansible/ansible.cfg) = jsonfile
CACHE_PLUGIN_CONNECTION(/etc/ansible/ansible.cfg) = /**/config/ansible/facts
DEFAULT_HOST_LIST(/etc/ansible/ansible.cfg) = [u'/**/config/ansible/**/hosts.yml']
DISPLAY_SKIPPED_HOSTS(/etc/ansible/ansible.cfg) = True
HOST_KEY_CHECKING(/etc/ansible/ansible.cfg) = False
RETRY_FILES_ENABLED(/etc/ansible/ansible.cfg) = False
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
##### STEPS TO REPRODUCE
```
nxos_vpc:
domain: 10
pkl_src: 1.1.1.2
pkl_dest: 1.1.1.1
pkl_vrf: default
```
##### EXPECTED RESULTS
```
"commands": [
"vpc domain 10",
"peer-keepalive destination 1.1.1.1 source 1.1.1.2 vrf default",
```
##### ACTUAL RESULTS
```
"commands": [
"vpc domain 10",
"peer-keepalive destination 1.1.1.1 source 1.1.1.2",
```
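The expected-vs-actual diff above comes down to how the keepalive command string is assembled. An illustrative reconstruction of the bug class (not the module's real code):

```python
def keepalive_command(dest, src, vrf=None):
    """Build the NX-OS peer-keepalive line; the vrf suffix must be
    emitted even when its value is the literal string 'default'."""
    cmd = f"peer-keepalive destination {dest} source {src}"
    # Bug pattern: a check like `if vrf and vrf != 'default'` silently
    # swallows the default VRF; only skip the suffix when vrf is unset.
    if vrf:
        cmd += f" vrf {vrf}"
    return cmd
```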
| null | https://github.com/ansible/ansible/pull/57370 | null | {'base_commit': 'cc00f21a358923c03e334e245d58df0853d10661', 'files': [{'path': 'lib/ansible/modules/network/nxos/nxos_vpc.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [60, 63, 277]}, "(None, 'main', 317)": {'add': [396], 'mod': [392]}, "(None, 'get_vpc', 222)": {'mod': [265, 266, 267, 268, 269, 270, 271, 272, 273, 274]}, "(None, 'get_commands_to_config_vpc', 278)": {'mod': [288]}}}, {'path': 'test/units/modules/network/nxos/test_nxos_vpc.py', 'status': 'modified', 'Loc': {"('TestNxosVpcModule', 'setUp', 31)": {'add': [33]}, "('TestNxosVpcModule', 'tearDown', 40)": {'add': [41]}, "('TestNxosVpcModule', 'load_fixtures', 45)": {'add': [54], 'mod': [56]}, "('TestNxosVpcModule', 'test_nxos_vpc_present', 58)": {'add': [66]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"lib/ansible/modules/network/nxos/nxos_vpc.py"
],
"doc": [],
"test": [
"test/units/modules/network/nxos/test_nxos_vpc.py"
],
"config": [],
"asset": []
} | null |
ansible | ansible | 44b53141748d29220441e0799b54ea3130ac6753 | https://github.com/ansible/ansible/issues/78076 | support:core
has_pr
docs
affects_2.12 | Minor change to the getting started diagram | ### Summary
I was looking through the new Ansible getting started guide and noticed one of the nodes in the diagram has a duplicate label. s/node 2/node 3
### Issue Type
Documentation Report
### Component Name
https://github.com/ansible/ansible/blob/devel/docs/docsite/rst/images/ansible_basic.svg
### Ansible Version
```console
$ ansible --version
ansible [core 2.12.6]
config file = /etc/ansible/ansible.cfg
configured module search path = ['/home/dnaro/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python3.10/site-packages/ansible
ansible collection location = /home/dnaro/.ansible/collections:/usr/share/ansible/collections
executable location = /usr/bin/ansible
python version = 3.10.4 (main, Mar 25 2022, 00:00:00) [GCC 12.0.1 20220308 (Red Hat 12.0.1-0)]
jinja version = 3.0.3
libyaml = True
```
### Configuration
```console
$ ansible-config dump --only-changed -t all
BECOME:
======
CACHE:
=====
CALLBACK:
========
CLICONF:
=======
CONNECTION:
==========
HTTPAPI:
=======
INVENTORY:
=========
LOOKUP:
======
NETCONF:
=======
SHELL:
=====
VARS:
====
```
### OS / Environment
Fedora 36
### Additional Information
It corrects something that is wrong.
### Code of Conduct
- [X] I agree to follow the Ansible Code of Conduct | null | https://github.com/ansible/ansible/pull/78077 | null | {'base_commit': '44b53141748d29220441e0799b54ea3130ac6753', 'files': [{'path': 'docs/docsite/rst/images/ansible_basic.svg', 'status': 'modified', 'Loc': {'(None, None, 27)': {'mod': [27, 28, 29]}, '(None, None, 35)': {'mod': [35]}, '(None, None, 51)': {'mod': [51]}, '(None, None, 67)': {'mod': [67]}, '(None, None, 192)': {'mod': [192]}, '(None, None, 194)': {'mod': [194, 195, 196, 197, 198, 199, 200]}, '(None, None, 203)': {'mod': [203]}, '(None, None, 205)': {'mod': [205]}, '(None, None, 207)': {'mod': [207]}, '(None, None, 209)': {'mod': [209]}, '(None, None, 211)': {'mod': [211]}, '(None, None, 213)': {'mod': [213]}, '(None, None, 215)': {'mod': [215]}, '(None, None, 217)': {'mod': [217]}, '(None, None, 219)': {'mod': [219]}, '(None, None, 221)': {'mod': [221]}, '(None, None, 223)': {'mod': [223, 224, 225, 226]}, '(None, None, 230)': {'mod': [230]}, '(None, None, 233)': {'mod': [233]}, '(None, None, 236)': {'mod': [236, 237, 238, 239, 240, 241, 242, 243, 244, 245, 246, 247, 248, 249, 250, 251, 252, 253, 254, 255, 256, 257, 258, 259, 260, 261, 262, 263, 264, 265, 266, 267, 268, 269, 270, 271, 272, 273, 274, 275, 276, 277, 278, 279, 280, 281, 282, 283, 284, 285, 286, 287, 288, 289, 290, 291, 292, 293, 294, 295, 296, 297, 298, 299, 300, 301, 302, 303, 304, 305, 306, 307, 308, 309, 310, 311, 312, 313, 314, 315, 316, 317, 318, 319, 320, 321, 322, 323]}, '(None, None, 326)': {'mod': [326, 327, 328, 329, 330, 331, 332]}}}]} | [] | [] | [] | {
"iss_type": "4",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Doc"
} | {
"code": [],
"doc": [
"docs/docsite/rst/images/ansible_basic.svg"
],
"test": [],
"config": [],
"asset": []
} | null |
ansible | ansible | 0335d05f437eb59bcb77a58ef7819562f298ba79 | https://github.com/ansible/ansible/issues/3730 | ansible stacktrace | a simple ansible facts run now stack traces:
```
ansible -m setup -c local -i ~/hosts 127.0.0.1
```
127.0.0.1 | FAILED => Traceback (most recent call last):
File "/home/bcoca/work/ansible/lib/ansible/runner/__init__.py", line 367, in _executor
exec_rc = self._executor_internal(host, new_stdin)
File "/home/bcoca/work/ansible/lib/ansible/runner/__init__.py", line 389, in _executor_internal
host_variables = self.inventory.get_variables(host)
File "/home/bcoca/work/ansible/lib/ansible/inventory/__init__.py", line 284, in get_variables
self._vars_per_host[hostname] = self._get_variables(hostname)
File "/home/bcoca/work/ansible/lib/ansible/inventory/__init__.py", line 294, in _get_variables
vars_results = [ plugin.run(host) for plugin in self._vars_plugins ]
File "/home/bcoca/work/ansible/lib/ansible/inventory/vars_plugins/group_vars.py", line 43, in run
self.pb_basedir = os.path.abspath(inventory.playbook_basedir())
File "/usr/lib/python2.7/posixpath.py", line 343, in abspath
if not isabs(path):
File "/usr/lib/python2.7/posixpath.py", line 53, in isabs
return s.startswith('/')
AttributeError: 'NoneType' object has no attribute 'startswith'
bisect showed 16efb45735899737aacc106f89014ee9551fd625 as culprit
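The traceback boils down to `os.path.abspath(None)`: an ad-hoc `ansible -m setup` run has no playbook, so the inventory's playbook basedir is `None`. A guard sketch (names illustrative, not the plugin's actual code):

```python
import os

def resolve_pb_basedir(inventory_basedir):
    """Tolerate a missing playbook basedir instead of passing None to
    os.path.abspath, which raises AttributeError on str methods."""
    if inventory_basedir is None:
        return None
    return os.path.abspath(inventory_basedir)
```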
| null | null | https://github.com/ansible/ansible/commit/0335d05f437eb59bcb77a58ef7819562f298ba79 | {'base_commit': '0335d05f437eb59bcb77a58ef7819562f298ba79', 'files': [{'path': 'lib/ansible/inventory/vars_plugins/group_vars.py', 'status': 'modified', 'Loc': {"('VarsModule', 'run', 38)": {'mod': [43]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"lib/ansible/inventory/vars_plugins/group_vars.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null | |
ansible | ansible | f841c2803a1e36bb6f392c466d36b669f9243464 | https://github.com/ansible/ansible/issues/77073 | module
support:core
feature
P3
affects_2.13 | Add support for deb822 apt sources with apt_repository | ### Summary
Debian has deprecated APT's original `sources.list` file format. As of Debian 11 (and Ubuntu 20.10), APT uses [the newer "DEB822" format](https://manpages.debian.org/unstable/apt/sources.list.5.en.html#DEB822-STYLE_FORMAT) by default. This format has been supported since APT 1.1, which goes back to Ubuntu 16.04 and Debian 9.
Ansible should generate DEB822 `.sources` files instead of legacy `.list` files on supported systems.
### Issue Type
Feature Idea
### Component Name
apt_repository
### Additional Information
Here's an example of the deb822 format:
```
Types: deb
URIs: http://deb.debian.org/debian
Suites: bullseye
Components: main contrib non-free
```
The `apt_repository` module can behave a lot more like the `yum_repository` one with this new format.
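To make the comparison with `yum_repository` concrete, rendering a stanza in this format from structured parameters is straightforward. A sketch with a made-up dict schema (the field names follow sources.list(5); everything else is illustrative):

```python
def render_deb822(repo):
    """Render one deb822 .sources stanza from a dict of fields."""
    order = ("Types", "URIs", "Suites", "Components", "Signed-By")
    lines = [f"{key}: {repo[key]}" for key in order if key in repo]
    return "\n".join(lines) + "\n"
```

A module built on this shape could take per-field options just as `yum_repository` does, instead of a single free-form `repo` string.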
### Code of Conduct
- [X] I agree to follow the Ansible Code of Conduct | null | https://github.com/ansible/ansible/pull/80018 | null | {'base_commit': 'f841c2803a1e36bb6f392c466d36b669f9243464', 'files': [{'path': 'test/integration/targets/setup_deb_repo/tasks/main.yml', 'status': 'modified', 'Loc': {'(None, None, 61)': {'add': [61]}}}]} | [] | [] | [] | {
"iss_type": "4",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [],
"doc": [],
"test": [],
"config": [
"test/integration/targets/setup_deb_repo/tasks/main.yml"
],
"asset": []
} | null |
ansible | ansible | 6a71aef6c5b2f4c26d5f6522cd5b1a85cd78ee6b | https://github.com/ansible/ansible/issues/58126 | networking
python3
module
support:network
bug
affects_2.8
ios
cisco | ios_facts module not enumerating ansible_net_model in Ansible 2.8 | <!--- Verify first that your issue is not already reported on GitHub -->
<!--- Also test if the latest release and devel branch are affected too -->
<!--- Complete *all* sections as described, this form is processed automatically -->
##### SUMMARY
<!--- Explain the problem briefly below -->
ios_facts module not enumerating ansible_net_model in Ansible 2.8
##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
<!--- Write the short name of the module, plugin, task or feature below, use your best guess if unsure -->
ios_facts
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes -->
```paste below
ansible 2.8.1
config file = /home/ryan/test/ansible.cfg
configured module search path = ['/home/ryan/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/local/lib/python3.6/site-packages/ansible
executable location = /usr/local/bin/ansible
python version = 3.6.8 (default, Apr 25 2019, 21:02:35) [GCC 4.8.5 20150623 (Red Hat 4.8.5-36)]
```
##### CONFIGURATION
<!--- Paste verbatim output from "ansible-config dump --only-changed" between quotes -->
```paste below
$ ansible-config dump --only-changed
DEFAULT_GATHERING(/home/ryan/test/ansible.cfg) = explicit
DEFAULT_HOST_LIST(/home/ryan/test/ansible.cfg) = ['/home/ryan/test/inventory']
DEPRECATION_WARNINGS(/home/ryan/test/ansible.cfg) = False
HOST_KEY_CHECKING(/home/ryan/test/ansible.cfg) = False
PERSISTENT_COMMAND_TIMEOUT(/home/ryan/test/ansible.cfg) = 30
PERSISTENT_CONNECT_TIMEOUT(/home/ryan/test/ansible.cfg) = 30
RETRY_FILES_ENABLED(/home/ryan/test/ansible.cfg) = False
```
##### OS / ENVIRONMENT
<!--- Provide all relevant information below, e.g. target OS versions, network device firmware, etc. -->
Host OS: CentOS 7 virtual machine (VMware player)
Python versions: Reproducible on 2.7.5 and 3.6
Tested on:
CSR1000v running IOS-XE 16.09.03
ISR4331 running IOS-XE 16.06.03
Catalyst 3850 running IOS-XE 03.06.03E
##### STEPS TO REPRODUCE
<!--- Describe exactly how to reproduce the problem, using a minimal test-case -->
Run a playbook to gather ios_facts; ansible_net_model is not included in any subset. It should always be included, per: https://docs.ansible.com/ansible/latest/modules/ios_facts_module.html
<!--- Paste example playbooks or commands between quotes below -->
```yaml
- name: IOS Facts gathering
  hosts: CSRTEST
  connection: network_cli
  gather_facts: yes
  tasks:
    - name: Gather facts from device
      ios_facts:
        gather_subset: all
```
<!--- HINT: You can paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- Describe what you expected to happen when running the steps above -->
Expecting ansible_net_model back as one of the facts gathered.
##### ACTUAL RESULTS
<!--- Describe what actually happened. If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes -->
```paste below
TASK [Gather facts from device] ****************************************************************************************************************************************************************************************************************************************
task path: /home/ryan/test/test_facts.yml:6
<CSRTEST> attempting to start connection
<CSRTEST> using connection plugin network_cli
<CSRTEST> found existing local domain socket, using it!
<CSRTEST> updating play_context for connection
<CSRTEST>
<CSRTEST> local domain socket path is /home/ryan/.ansible/pc/5485150d9c
<CSRTEST> ESTABLISH LOCAL CONNECTION FOR USER: ryan
<CSRTEST> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /home/ryan/.ansible/tmp/ansible-local-228682ct6pdp_/ansible-tmp-1561047456.179465-161563255687379 `" && echo ansible-tmp-1561047456.179465-161563255687379="` echo /home/ryan/.ansible/tmp/ansible-local-228682ct6pdp_/ansible-tmp-1561047456.179465-161563255687379 `" ) && sleep 0'
Using module file /usr/local/lib/python3.6/site-packages/ansible/modules/network/ios/ios_facts.py
<CSRTEST> PUT /home/ryan/.ansible/tmp/ansible-local-228682ct6pdp_/tmp6gh5jigs TO /home/ryan/.ansible/tmp/ansible-local-228682ct6pdp_/ansible-tmp-1561047456.179465-161563255687379/AnsiballZ_ios_facts.py
<CSRTEST> EXEC /bin/sh -c 'chmod u+x /home/ryan/.ansible/tmp/ansible-local-228682ct6pdp_/ansible-tmp-1561047456.179465-161563255687379/ /home/ryan/.ansible/tmp/ansible-local-228682ct6pdp_/ansible-tmp-1561047456.179465-161563255687379/AnsiballZ_ios_facts.py && sleep 0'
<CSRTEST> EXEC /bin/sh -c '/usr/bin/python /home/ryan/.ansible/tmp/ansible-local-228682ct6pdp_/ansible-tmp-1561047456.179465-161563255687379/AnsiballZ_ios_facts.py && sleep 0'
<CSRTEST> EXEC /bin/sh -c 'rm -f -r /home/ryan/.ansible/tmp/ansible-local-228682ct6pdp_/ansible-tmp-1561047456.179465-161563255687379/ > /dev/null 2>&1 && sleep 0'
ok: [CSRTEST] => {
"ansible_facts": {
"ansible_net_all_ipv4_addresses": [
"192.168.102.133"
],
"ansible_net_all_ipv6_addresses": [],
"ansible_net_api": "cliconf",
"ansible_net_config": "!\n! Last configuration change at 16:13:51 UTC Thu Jun 20 2019\n!\nversion 16.9\nservice timestamps debug datetime msec\nservice timestamps log datetime msec\nplatform qfp utilization monitor load 80\nno platform punt-keepalive disable-kernel-core\nplatform console virtual\n!\nhostname CSRTEST\n!\nboot-start-marker\nboot-end-marker\n!\n!\n!\nno aaa new-model\n!\n!\n!\n!\n!\n!\n!\n!\n!\n!\nlogin on-success log\n!\n!\n!\n!\n!\n!\n!\nsubscriber templating\n! \n! \n! \n! \n!\nmultilink bundle-name authenticated\n!\n!\n!\n!\n!\ncrypto pki trustpoint TP-self-signed-3768273344\n enrollment selfsigned\n subject-name cn=IOS-Self-Signed-Certificate-3768273344\n revocation-check none\n rsakeypair TP-self-signed-3768273344\n!\n!\ncrypto pki certificate chain TP-self-signed-3768273344\n certificate self-signed 01\n 30820330 30820218 A0030201 02020101 300D0609 2A864886 F70D0101 05050030 \n 31312F30 2D060355 04031326 494F532D 53656C66 2D536967 6E65642D 43657274 \n 69666963 6174652D 33373638 32373333 3434301E 170D3139 30363230 31363134 \n 30395A17 0D333030 31303130 30303030 305A3031 312F302D 06035504 03132649 \n 4F532D53 656C662D 5369676E 65642D43 65727469 66696361 74652D33 37363832 \n 37333334 34308201 22300D06 092A8648 86F70D01 01010500 0382010F 00308201 \n 0A028201 0100891F 68316AAF AF54176F 7D9C39F5 E34FB187 F4D88C88 8265FDE9 \n B3A338A1 FADD5622 1A2887D2 1E655477 9EDEA72C 94EAB9C4 744C428C 83BC30A1 \n E18B6EBC 69856EC8 4F5E8649 9D442076 3544F7D1 01AC0B0B 76E9CBE1 AEFA2C4A \n 4EB0EE8B 29895287 97A9C7CC 586A0241 19DC79E9 35A415A5 7D976DAB 7E072350 \n C2617E80 F8DB84D1 CFC0EBE5 3194A8C4 2E7AAC3C 7F97D423 2B016D97 C12164A6 \n D75B73E8 A9EA96ED 079CAB76 2B8DEA2E BBB61836 C913E020 B0F7659D DA4CF838 \n 7FCC72B5 522932D6 37196DD2 2897D197 BD6FD0C0 576CED54 85A7C94B 029BC4A3 \n F0D7F7CC 4AAFC50A 297B6E6E ECF97699 2062D939 38DD585D E78A2794 40381513 \n 75AEAA98 F8550203 010001A3 53305130 0F060355 1D130101 FF040530 030101FF \n 301F0603 551D2304 18301680 147DF3A5 
74A80322 7F0D4A33 C839CE1E 479BCFD0 \n 8C301D06 03551D0E 04160414 7DF3A574 A803227F 0D4A33C8 39CE1E47 9BCFD08C \n 300D0609 2A864886 F70D0101 05050003 82010100 87C47448 FAE908F7 47B564D7 \n 992A8E16 24966357 D0B864AB B32BB538 6A5371F3 0BF093E8 D0E461AC 2ED99B84 \n 768E700C A88464AA B8E0B774 2308D4A2 881495B7 AFE1F6D7 3D25AFEE 2A7D6653 \n 6814B4AC E4189640 15C0003E 1E1EE9B1 6E3FF371 448CA017 DA622BCD 49EF07C5 \n FB4D6859 208FF4FE 29AEB2F3 BB9BA26E 1D140B6A B2C4DADA 913D4846 84370AF0 \n A67E3D78 F0E9CE1E 9D344542 2732C2A7 70A50162 B32BBE36 BF3382AD 641DB7A6 \n 1AE1FD10 2CFEC3A6 1ACCD4FD 58E48276 9F2417F4 1871A9F7 11C61604 09E4BBEB \n 2D821D14 815A48FC 7B14A7C2 8766F1B1 7C04112A 139DB760 EFF339D0 1BA82B52 \n 5E85BBA9 3FC49134 4FEDD944 BA27F4A4 1317652C\n \tquit\n!\n!\n!\n!\n!\n!\n!\n!\nlicense udi pid CSR1000V sn 9U4DE1R3P2Y\nlicense boot level ax\nno license smart enable\ndiagnostic bootup level minimal\n!\nspanning-tree extend system-id\n!\n!\n!\nusername ansible privilege 15 secret 5 $1$Ax9o$F2JTz/1dXjNSB21muGqxU1\n!\nredundancy\n!\n!\n!\n!\n!\n!\n! \n!\n!\n!\n!\n!\n!\n!\n!\n!\n!\n!\n!\n! \n! \n!\n!\ninterface GigabitEthernet1\n ip address dhcp\n negotiation auto\n no mop enabled\n no mop sysid\n!\ninterface GigabitEthernet2\n no ip address\n shutdown\n negotiation auto\n no mop enabled\n no mop sysid\n!\ninterface GigabitEthernet3\n no ip address\n shutdown\n negotiation auto\n no mop enabled\n no mop sysid\n!\nip forward-protocol nd\nip http server\nip http authentication local\nip http secure-server\nip route 0.0.0.0 0.0.0.0 GigabitEthernet1 dhcp\n!\nip ssh version 2\n!\n!\n!\n!\n!\ncontrol-plane\n!\n!\n!\n!\n!\n!\nline con 0\n stopbits 1\nline vty 0 4\n login local\nline vty 5 15\n login local\n!\n!\n!\n!\n!\n!\nend",
"ansible_net_filesystems": [
"bootflash:"
],
"ansible_net_filesystems_info": {
"bootflash:": {
"spacefree_kb": 6801160,
"spacetotal_kb": 7712692
}
},
"ansible_net_gather_subset": [
"hardware",
"default",
"interfaces",
"config"
],
"ansible_net_hostname": "CSRTEST",
"ansible_net_image": "bootflash:packages.conf",
"ansible_net_interfaces": {
"GigabitEthernet1": {
"bandwidth": 1000000,
"description": null,
"duplex": "Full",
"ipv4": [
{
"address": "192.168.102.133",
"subnet": "24"
}
],
"lineprotocol": "up",
"macaddress": "000c.29a5.1122",
"mediatype": "Virtual",
"mtu": 1500,
"operstatus": "up",
"type": "CSR vNIC"
},
"GigabitEthernet2": {
"bandwidth": 1000000,
"description": null,
"duplex": "Full",
"ipv4": [],
"lineprotocol": "down",
"macaddress": "000c.29a5.112c",
"mediatype": "Virtual",
"mtu": 1500,
"operstatus": "administratively down",
"type": "CSR vNIC"
},
"GigabitEthernet3": {
"bandwidth": 1000000,
"description": null,
"duplex": "Full",
"ipv4": [],
"lineprotocol": "down",
"macaddress": "000c.29a5.1136",
"mediatype": "Virtual",
"mtu": 1500,
"operstatus": "administratively down",
"type": "CSR vNIC"
}
},
"ansible_net_iostype": "IOS-XE",
"ansible_net_memfree_mb": 1863849,
"ansible_net_memtotal_mb": 2182523,
"ansible_net_neighbors": {},
"ansible_net_python_version": "2.7.5",
"ansible_net_serialnum": "9U4DE1R3P2Y",
"ansible_net_system": "ios",
"ansible_net_version": "16.09.03"
},
"changed": false,
"invocation": {
"module_args": {
"auth_pass": null,
"authorize": null,
"gather_subset": [
"all"
],
"host": null,
"password": null,
"port": null,
"provider": null,
"ssh_keyfile": null,
"timeout": null,
"username": null
}
}
}
META: ran handlers
META: ran handlers
PLAY RECAP *************************************************************************************************************************************************************************************************************************************************************
CSRTEST : ok=2 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
```
| null | https://github.com/ansible/ansible/pull/58174 | null | {'base_commit': '6a71aef6c5b2f4c26d5f6522cd5b1a85cd78ee6b', 'files': [{'path': 'lib/ansible/plugins/cliconf/ios.py', 'status': 'modified', 'Loc': {"('Cliconf', 'get_device_info', 199)": {'mod': [210, 211, 212]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"lib/ansible/plugins/cliconf/ios.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null |
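The record above points at `get_device_info` in the ios cliconf plugin (`lib/ansible/plugins/cliconf/ios.py`), whose `show version` parsing missed the model line on these IOS-XE platforms. As an illustrative sketch only — the sample output and regex are assumptions, not the exact lines from PR #58174 — extracting the model from `show version` can look like this:

```python
import re

# Hypothetical excerpt of "show version" output from an IOS-XE device;
# note the platform line starts with lowercase "cisco".
SHOW_VERSION = """\
Cisco IOS XE Software, Version 16.09.03
cisco CSR1000V (VXE) processor (revision VXE) with 2190795K/3075K bytes of memory.
Processor board ID 9U4DE1R3P2Y
"""

def parse_model(output):
    # IOS-XE prints the platform on the "cisco <MODEL> ... bytes of memory" line.
    match = re.search(r"^[Cc]isco (\S+).+bytes of memory", output, re.M)
    return match.group(1) if match else None

print(parse_model(SHOW_VERSION))  # CSR1000V
```

A regex that only matched the capitalized `Cisco ...` banner line would return nothing on these devices, which is consistent with `ansible_net_model` silently missing from the gathered facts.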
ansible | ansible | d1cd6ee56d492deef40f6f2f178832a1815730a5 | https://github.com/ansible/ansible/issues/37734 | cloud
azure
module
affects_2.4
support:certified
feature | Add network interface to Load Balancer Backend pool in azure_rm_networkinterface | ##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
azure_rm_networkinterface
##### ANSIBLE VERSION
```
ansible --version
ansible 2.4.3.0
config file = /etc/ansible/ansible.cfg
configured module search path = [u'/home/dgermain/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/lib/python2.7/dist-packages/ansible
executable location = /usr/bin/ansible
python version = 2.7.12 (default, Dec 4 2017, 14:50:18) [GCC 5.4.0 20160609]
```
##### CONFIGURATION
```
ansible-config dump --only-changed
#empty return
```
##### OS / ENVIRONMENT
N/A
##### SUMMARY
In current azure loadbalancer module, you can create Backend pools, but you don't have the possibility to add network interfaces in this Backend pool, neither in *azure_rm_networkinterface* nor in *azure_rm_loadbalancer*.
As an example, this feature is available in the Azure PowerShell cmdlets when handling network interfaces:
```
$nic = Get-AzureRmNetworkInterface -Name "$virtualnetworkcardname" -ResourceGroupName $resourceGroup
$nic.IpConfigurations[0].LoadBalancerBackendAddressPools = $backend
Set-AzureRmNetworkInterface -NetworkInterface $nic
```
##### STEPS TO REPRODUCE
As far as I can tell, you don't have this option in the ansible module
##### EXPECTED RESULTS
Have an option to allow this
##### ACTUAL RESULTS
No option to do so | null | github.com/ansible/ansible/pull/38643 | null | {'base_commit': 'd1cd6ee56d492deef40f6f2f178832a1815730a5', 'files': [{'path': 'lib/ansible/module_utils/azure_rm_common.py', 'status': 'modified', 'Loc': {"('AzureRMModuleBase', None, 216)": {'add': [605]}, '(None, None, None)': {'mod': [131]}}}, {'path': 'lib/ansible/modules/cloud/azure/azure_rm_networkinterface.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [66, 73, 153, 198, 210, 239, 351], 'mod': [55, 57, 58, 59, 60, 61, 62, 63, 64, 65, 68, 127, 128, 160, 162, 163, 165, 186, 197, 207, 220, 222, 233, 234, 277, 286]}, "(None, 'nic_to_dict', 306)": {'add': [313]}, "('AzureRMNetworkInterface', 'exec_module', 411)": {'add': [525], 'mod': [427, 428, 429, 431, 432, 435, 468, 469, 470, 473, 477, 514, 515, 516, 530, 532, 534]}, "('AzureRMNetworkInterface', None, 356)": {'add': [600], 'mod': [594]}, "('AzureRMNetworkInterface', 'construct_ip_configuration_set', 601)": {'add': [606]}, "('AzureRMNetworkInterface', '__init__', 358)": {'mod': [364, 371, 372, 380, 386, 392, 393, 397]}, "('AzureRMNetworkInterface', 'get_security_group', 594)": {'mod': [597]}}}, {'path': 'test/integration/targets/azure_rm_networkinterface/tasks/main.yml', 'status': 'modified', 'Loc': {'(None, None, 19)': {'add': [19]}, '(None, None, 124)': {'add': [124]}, '(None, None, 131)': {'add': [131]}, '(None, None, 148)': {'add': [148]}, '(None, None, 164)': {'add': [164]}, '(None, None, 179)': {'add': [179]}, '(None, None, 189)': {'add': [189]}, '(None, None, 36)': {'mod': [36]}, '(None, None, 40)': {'mod': [40, 41]}, '(None, None, 43)': {'mod': [43]}, '(None, None, 48)': {'mod': [48]}, '(None, None, 52)': {'mod': [52, 53]}, '(None, None, 55)': {'mod': [55]}, '(None, None, 78)': {'mod': [78]}, '(None, None, 90)': {'mod': [90]}, '(None, None, 113)': {'mod': [113]}, '(None, None, 137)': {'mod': [137]}, '(None, None, 159)': {'mod': [159]}, '(None, None, 176)': {'mod': [176]}}}]} | [] | [] | [] | {
"iss_type": "4",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code\nConfig\nTest"
} | {
"code": [
"lib/ansible/modules/cloud/azure/azure_rm_networkinterface.py",
"lib/ansible/module_utils/azure_rm_common.py"
],
"doc": [],
"test": [],
"config": [
"test/integration/targets/azure_rm_networkinterface/tasks/main.yml"
],
"asset": []
} | null |
ansible | ansible | 6f8c1da0c805f334b8598fd2556f7ed92dc9348e | https://github.com/ansible/ansible/issues/79277 | bug
traceback
affects_2.13 | ansible-test fails to report the proper error when validating ansible-doc | ### Summary
The `ansible-test sanity` utility is fantastic and does its job. Unfortunately, when validating the ansible-doc blocks, malformed YAML produces an internal traceback instead of the actual YAML error.
### Issue Type
Bug Report
### Component Name
ansible-test
### Ansible Version
```console
$ ansible --version
ansible [core 2.13.6rc1.post0] (stable-2.13 33852737fd) last updated 2022/10/31 21:51:24 (GMT +200)
config file = None
configured module search path = ['/home/warkdev/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /home/warkdev/ansible/lib/ansible
ansible collection location = /home/warkdev/.ansible/collections:/usr/share/ansible/collections
executable location = /home/warkdev/ansible/bin/ansible
python version = 3.9.2 (default, Feb 28 2021, 17:03:44) [GCC 10.2.1 20210110]
jinja version = 3.1.2
libyaml = False
```
### Configuration
```console
# if using a version older than ansible-core 2.12 you should omit the '-t all'
$ ansible-config dump --only-changed -t all
```
### OS / Environment
Debian 12
### Steps to Reproduce
* Generate an ansible module that you want to validate and introduce invalid YAML syntax in the ansible-doc
* Run ansible-test sanity against that module
* Verify that the error is happening
I've tracked down the issue till this code: https://github.com/ansible/ansible/blob/stable-2.13/test/lib/ansible_test/_util/controller/sanity/validate-modules/validate_modules/utils.py#L157
### Expected Results
ERROR: Found 2 yamllint issue(s) which need to be resolved:
ERROR: plugins/modules/axway_cft_about_info.py:36:15: error: RETURN: syntax error: mapping values are not allowed here (syntax)
ERROR: plugins/modules/axway_cft_about_info.py:36:15: unparsable-with-libyaml: None - mapping values are not allowed in this context
### Actual Results
```console
Traceback (most recent call last):
File "/root/ansible/test/lib/ansible_test/_util/controller/sanity/validate-modules/validate_modules/utils.py", line 153, in parse_yaml
data = yaml_load(value, Loader=loader)
File "/root/.ansible/test/venv/sanity.validate-modules/3.10/487215fd/lib/python3.10/site-packages/yaml/__init__.py", line 81, in load
return loader.get_single_data()
File "/root/.ansible/test/venv/sanity.validate-modules/3.10/487215fd/lib/python3.10/site-packages/yaml/constructor.py", line 49, in get_single_data
node = self.get_single_node()
File "yaml/_yaml.pyx", line 673, in yaml._yaml.CParser.get_single_node
File "yaml/_yaml.pyx", line 687, in yaml._yaml.CParser._compose_document
File "yaml/_yaml.pyx", line 731, in yaml._yaml.CParser._compose_node
File "yaml/_yaml.pyx", line 845, in yaml._yaml.CParser._compose_mapping_node
File "yaml/_yaml.pyx", line 731, in yaml._yaml.CParser._compose_node
File "yaml/_yaml.pyx", line 845, in yaml._yaml.CParser._compose_mapping_node
File "yaml/_yaml.pyx", line 731, in yaml._yaml.CParser._compose_node
File "yaml/_yaml.pyx", line 847, in yaml._yaml.CParser._compose_mapping_node
File "yaml/_yaml.pyx", line 860, in yaml._yaml.CParser._parse_next_event
yaml.scanner.ScannerError: mapping values are not allowed in this context
in "<unicode string>", line 9, column 15
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/root/ansible/test/lib/ansible_test/_util/controller/sanity/validate-modules/validate.py", line 6, in <module>
main()
File "/root/ansible/test/lib/ansible_test/_util/controller/sanity/validate-modules/validate_modules/main.py", line 2475, in main
run()
File "/root/ansible/test/lib/ansible_test/_util/controller/sanity/validate-modules/validate_modules/main.py", line 2363, in run
mv1.validate()
File "/root/ansible/test/lib/ansible_test/_util/controller/sanity/validate-modules/validate_modules/main.py", line 2156, in validate
doc_info, docs = self._validate_docs()
File "/root/ansible/test/lib/ansible_test/_util/controller/sanity/validate-modules/validate_modules/main.py", line 1080, in _validate_docs
data, errors, traces = parse_yaml(doc_info['RETURN']['value'],
File "/root/ansible/test/lib/ansible_test/_util/controller/sanity/validate-modules/validate_modules/utils.py", line 157, in parse_yaml
e.problem_mark.line += lineno - 1
AttributeError: attribute 'line' of 'yaml._yaml.Mark' objects is not writable
```
### Code of Conduct
- [X] I agree to follow the Ansible Code of Conduct | null | https://github.com/ansible/ansible/pull/79682 | null | {'base_commit': '6f8c1da0c805f334b8598fd2556f7ed92dc9348e', 'files': [{'path': 'test/integration/targets/ansible-test-sanity-validate-modules/runme.sh', 'status': 'modified', 'Loc': {'(None, None, 7)': {'mod': [7]}}}, {'path': 'test/lib/ansible_test/_util/controller/sanity/validate-modules/validate_modules/utils.py', 'status': 'modified', 'Loc': {"(None, 'parse_yaml', 137)": {'mod': [157, 158, 161]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"test/lib/ansible_test/_util/controller/sanity/validate-modules/validate_modules/utils.py"
],
"doc": [],
"test": [],
"config": [],
"asset": [
"test/integration/targets/ansible-test-sanity-validate-modules/runme.sh"
]
} | null |
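The traceback in the record above comes from `utils.py` mutating `e.problem_mark.line` on an exception raised by the libyaml C parser, whose `Mark` objects expose read-only attributes. A minimal stdlib simulation of the failure mode, plus the fix pattern of computing the adjusted position instead of mutating the mark (the `lineno` value is a hypothetical offset, not taken from the actual patch):

```python
class CMark:
    """Stand-in for yaml._yaml.Mark: its attributes are read-only properties."""
    def __init__(self, line, column):
        self._line, self._column = line, column

    @property
    def line(self):
        return self._line

    @property
    def column(self):
        return self._column


mark = CMark(line=8, column=14)
try:
    mark.line += 1  # what utils.py line 157 effectively did
    failed = False
except AttributeError:
    failed = True
print(failed)  # True

# Fix: derive the module-file-relative position without touching the mark.
lineno = 29  # hypothetical line where the RETURN block starts in the module
adjusted = (mark.line + lineno, mark.column + 1)
print(adjusted)  # (37, 15)
```

With the pure-Python parser the marks happen to be mutable, which is why the bug only surfaces once libyaml is in use.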
ansible | ansible | d97080174e9bbebd27a967368934ef91d1f28f64 | https://github.com/ansible/ansible/issues/32070 | networking
affects_2.4
support:core
nxos
bug
cisco | Occasional failures with NXOS modules | ##### ISSUE TYPE
- Bug Report
##### COMPONENT NAME
nxos modules
##### ANSIBLE VERSION
ansible 2.4.0.0
config file = /project-mcuk/ap/ipp/jrosser/cloud/ansible-mist/ansible.cfg
configured module search path = [u'/etc/ansible/roles/plugins/library', u'/usr/local/lib/python2.7/dist-packages/ara/plugins/modules']
ansible python module location = /usr/lib/python2.7/dist-packages/ansible
executable location = /usr/bin/ansible
python version = 2.7.12 (default, Nov 19 2016, 06:48:10) [GCC 5.4.0 20160609]
##### CONFIGURATION
DEFAULT_ACTION_PLUGIN_PATH(env: ANSIBLE_ACTION_PLUGINS) = [u'/etc/ansible/roles/plugins/action', u'/usr/local/lib/python2.7/dist-packages/ara/plugins/actions']
DEFAULT_CALLBACK_PLUGIN_PATH(env: ANSIBLE_CALLBACK_PLUGINS) = [u'/etc/ansible/roles/plugins/callback', u'/usr/local/lib/python2.7/dist-packages/ara/plugins/callbacks']
DEFAULT_CALLBACK_WHITELIST(/project-mcuk/ap/ipp/jrosser/cloud/ansible-mist/ansible.cfg) = ['profile_tasks']
DEFAULT_FILTER_PLUGIN_PATH(/project-mcuk/ap/ipp/jrosser/cloud/ansible-mist/ansible.cfg) = [u'/project-mcuk/ap/ipp/jrosser/cloud/ansible-mist/plugins/filter']
DEFAULT_FORCE_HANDLERS(/project-mcuk/ap/ipp/jrosser/cloud/ansible-mist/ansible.cfg) = True
DEFAULT_HOST_LIST(/project-mcuk/ap/ipp/jrosser/cloud/ansible-mist/ansible.cfg) = [u'/project-mcuk/ap/ipp/jrosser/cloud/ansible-mist/inventory']
DEFAULT_MODULE_PATH(env: ANSIBLE_LIBRARY) = [u'/etc/ansible/roles/plugins/library', u'/usr/local/lib/python2.7/dist-packages/ara/plugins/modules']
DEFAULT_ROLES_PATH(/project-mcuk/ap/ipp/jrosser/cloud/ansible-mist/ansible.cfg) = [u'/project-mcuk/ap/ipp/jrosser/cloud/ansible-mist/roles-dev', u'/project-mcuk/ap/ipp/jrosser/cloud/ansible-mist/roles', u'/project-mcuk/ap/ipp/jrosser/cloud/ansible-mist/roles-base-shell']
HOST_KEY_CHECKING(/project-mcuk/ap/ipp/jrosser/cloud/ansible-mist/ansible.cfg) = False
##### OS / ENVIRONMENT
Ubuntu 16.04
NXOS: version 7.0(3)I7(1)
##### SUMMARY
I observe non-deterministic failures with the nxos modules when configuring 9200 series switches, in this specific case a 92160.
##### STEPS TO REPRODUCE
Sadly this is difficult to reproduce. I have a playbook which configures a couple of dozen ports on several switches, each taking a dozen or more tasks. This is a sufficient number of tasks to occasionally trigger a failure of a task. Running the playbook again most likely will result in no errors.
Playbook https://gist.github.com/jrosser/b4d88748f5b1323828a8f2f266596ead
##### EXPECTED RESULTS
All tasks to run without error. Running with -vvvv gives no insight into the communication with the switch so doesn't provide any useful debug.
##### ACTUAL RESULTS
Very occasionally one or more tasks will fail.
```
TASK [Ensure all layer 2 interfaces are up] ***********************************************************************************************************
Tuesday 24 October 2017 10:54:15 +0000 (0:00:21.378) 0:01:00.450 *******
changed: [fbs0-b505-10] => (item={u'interface': u'Ethernet1/1', u'description': u'to infra0-1-b505-10'})
changed: [fbs0-b505-9] => (item={u'interface': u'Ethernet1/1', u'description': u'to infra0-1-b505-10'})
changed: [fbs0-b505-9] => (item={u'interface': u'Ethernet1/2', u'description': u'to infra0-2-b505-10'})
changed: [fbs0-b505-10] => (item={u'interface': u'Ethernet1/2', u'description': u'to infra0-2-b505-10'})
changed: [fbs0-b505-10] => (item={u'interface': u'Ethernet1/3', u'description': u'to infra0-3-b505-10'})
changed: [fbs0-b505-9] => (item={u'interface': u'Ethernet1/3', u'description': u'to infra0-3-b505-10'})
changed: [fbs0-b505-10] => (item={u'interface': u'Ethernet1/4', u'description': u'to infra0-4-b505-10'})
changed: [fbs0-b505-9] => (item={u'interface': u'Ethernet1/4', u'description': u'to infra0-4-b505-10'})
changed: [fbs0-b505-10] => (item={u'interface': u'Ethernet1/5', u'description': u'to infra0-5-b505-10'})
changed: [fbs0-b505-9] => (item={u'interface': u'Ethernet1/5', u'description': u'to infra0-5-b505-10'})
changed: [fbs0-b505-10] => (item={u'interface': u'Ethernet1/6', u'description': u'to infra0-6-b505-10'})
changed: [fbs0-b505-9] => (item={u'interface': u'Ethernet1/6', u'description': u'to infra0-6-b505-10'})
changed: [fbs0-b505-10] => (item={u'interface': u'Ethernet1/7', u'description': u'to infra0-7-b505-10'})
changed: [fbs0-b505-9] => (item={u'interface': u'Ethernet1/7', u'description': u'to infra0-7-b505-10'})
changed: [fbs0-b505-10] => (item={u'interface': u'Ethernet1/8', u'description': u'to infra0-8-b505-10'})
changed: [fbs0-b505-9] => (item={u'interface': u'Ethernet1/8', u'description': u'to infra0-8-b505-10'})
changed: [fbs0-b505-10] => (item={u'interface': u'Ethernet1/9', u'description': u'to infra0-1-b505-9'})
changed: [fbs0-b505-9] => (item={u'interface': u'Ethernet1/9', u'description': u'to infra0-1-b505-9'})
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: TypeError: string indices must be integers, not str
failed: [fbs0-b505-10] (item={u'interface': u'Ethernet1/10', u'description': u'to infra0-2-b505-9'}) => {"changed": false, "failed": true, "item": {"description": "to infra0-2-b505-9", "interface": "Ethernet1/10"}, "module_stderr": "Traceback (most recent call last):\n File \"/tmp/ansible_SPjZ0l/ansible_module_nxos_interface.py\", line 710, in <module>\n main()\n File \"/tmp/ansible_SPjZ0l/ansible_module_nxos_interface.py\", line 701, in main\n normalized_interface)\n File \"/tmp/ansible_SPjZ0l/ansible_module_nxos_interface.py\", line 534, in smart_existing\n existing = get_interface(normalized_interface, module)\n File \"/tmp/ansible_SPjZ0l/ansible_module_nxos_interface.py\", line 281, in get_interface\n interface_table = body['TABLE_interface']['ROW_interface']\nTypeError: string indices must be integers, not str\n", "module_stdout": "", "msg": "MODULE FAILURE", "rc": 0}
changed: [fbs0-b505-9] => (item={u'interface': u'Ethernet1/10', u'description': u'to infra0-2-b505-9'})
changed: [fbs0-b505-10] => (item={u'interface': u'Ethernet1/11', u'description': u'to infra0-3-b505-9'})
changed: [fbs0-b505-9] => (item={u'interface': u'Ethernet1/11', u'description': u'to infra0-3-b505-9'})
changed: [fbs0-b505-10] => (item={u'interface': u'Ethernet1/12', u'description': u'to infra0-4-b505-9'})
changed: [fbs0-b505-9] => (item={u'interface': u'Ethernet1/12', u'description': u'to infra0-4-b505-9'})
changed: [fbs0-b505-10] => (item={u'interface': u'Ethernet1/13', u'description': u'to infra0-5-b505-9'})
changed: [fbs0-b505-9] => (item={u'interface': u'Ethernet1/13', u'description': u'to infra0-5-b505-9'})
changed: [fbs0-b505-10] => (item={u'interface': u'Ethernet1/14', u'description': u'to infra0-6-b505-9'})
changed: [fbs0-b505-9] => (item={u'interface': u'Ethernet1/14', u'description': u'to infra0-6-b505-9'})
changed: [fbs0-b505-10] => (item={u'interface': u'Ethernet1/15', u'description': u'to infra0-7-b505-9'})
changed: [fbs0-b505-9] => (item={u'interface': u'Ethernet1/15', u'description': u'to infra0-7-b505-9'})
changed: [fbs0-b505-10] => (item={u'interface': u'Ethernet1/16', u'description': u'to infra0-8-b505-9'})
changed: [fbs0-b505-9] => (item={u'interface': u'Ethernet1/16', u'description': u'to infra0-8-b505-9'})
ok: [fbs0-b505-10] => (item={u'interface': u'Ethernet1/47', u'stp_port_type': u'network', u'description': u'vpc peer link'})
ok: [fbs0-b505-9] => (item={u'interface': u'Ethernet1/47', u'stp_port_type': u'network', u'description': u'vpc peer link'})
ok: [fbs0-b505-10] => (item={u'interface': u'Ethernet1/48', u'stp_port_type': u'network', u'description': u'vpc peer link'})
ok: [fbs0-b505-9] => (item={u'interface': u'Ethernet1/48', u'stp_port_type': u'network', u'description': u'vpc peer link'})
TASK [Ensure vrrpv3 is applied for vlans that need it] ************************************************************************************************
Tuesday 24 October 2017 11:01:48 +0000 (0:00:11.191) 0:08:33.606 *******
skipping: [fbs0-b505-9] => (item={u'vrf': u'default', u'vlan_id': 999})
ok: [fbs0-b505-9] => (item={u'vrrpv3': {u'priority': u'102', u'address_family': u'ipv4', u'group_id': 23, u'description': u'storage-clients', u'address': u'10.23.128.5'}, u'vrf': u'STORAGE', u'address': u'10.23.128.1/24', u'interface': u'Vlan1923', u'extra_lines': [u'mtu 9216'], u'vlan_id': 1923})
ok: [fbs0-b505-9] => (item={u'vrrpv3': {u'priority': u'102', u'address_family': u'ipv4', u'group_id': 21, u'description': u'storage-services', u'address': u'10.21.128.5'}, u'vrf': u'STORAGE', u'address': u'10.21.128.1/24', u'interface': u'Vlan1921', u'extra_lines': [u'mtu 9216'], u'vlan_id': 1921})
ok: [fbs0-b505-9] => (item={u'interface': u'Vlan1911', u'vrrpv3': {u'priority': u'102', u'address_family': u'ipv4', u'group_id': 11, u'description': u'osmgmt', u'address': u'10.11.128.5'}, u'vrf': u'OSMGMT', u'vlan_id': 1911, u'address': u'10.11.128.1/24'})
ok: [fbs0-b505-9] => (item={u'interface': u'Vlan1931', u'vrrpv3': {u'priority': u'102', u'address_family': u'ipv4', u'group_id': 31, u'description': u'metal', u'address': u'10.31.128.5'}, u'vrf': u'METAL', u'vlan_id': 1931, u'address': u'10.31.128.1/24'})
ok: [fbs0-b505-9] => (item={u'interface': u'Vlan1932', u'vrrpv3': {u'priority': u'102', u'address_family': u'ipv4', u'group_id': 32, u'description': u'metal', u'address': u'10.32.128.5'}, u'vrf': u'METAL', u'vlan_id': 1932, u'address': u'10.32.128.1/24'})
failed: [fbs0-b505-9] (item={u'interface': u'Vlan1941', u'vrrpv3': {u'priority': u'102', u'address_family': u'ipv4', u'group_id': 41, u'description': u'tunnels', u'address': u'10.41.128.5'}, u'vrf': u'TUNNEL', u'vlan_id': 1941, u'address': u'10.41.128.1/24'}) => {"changed": false, "failed": true, "item": {"address": "10.41.128.1/24", "interface": "Vlan1941", "vlan_id": 1941, "vrf": "TUNNEL", "vrrpv3": {"address": "10.41.128.5", "address_family": "ipv4", "description": "tunnels", "group_id": 41, "priority": "102"}}, "msg": "interface Vlan1941\r\r\n ^\r\n% Invalid command at '^' marker.\r\n\rfbs0-b505-9# "}
``` | null | https://github.com/ansible/ansible/pull/32114 | null | {'base_commit': 'd97080174e9bbebd27a967368934ef91d1f28f64', 'files': [{'path': 'lib/ansible/module_utils/nxos.py', 'status': 'modified', 'Loc': {"('Cli', 'run_commands', 139)": {'add': [171]}, '(None, None, None)': {'mod': [37]}}}, {'path': 'lib/ansible/modules/network/nxos/nxos_interface.py', 'status': 'modified', 'Loc': {"(None, 'get_interface', 238)": {'mod': [278, 280, 281, 282, 283, 284, 285, 286, 288, 289, 290, 291, 292, 293, 295, 296, 297, 298, 299, 300, 301, 302, 303, 304, 305, 306, 307, 308, 309, 310, 311, 312, 313, 315, 316, 317, 318, 319, 320, 321, 322, 324, 325, 326, 327, 329, 330, 331, 332, 333, 334, 335, 336, 338, 339, 340, 341, 342, 343]}, "(None, 'get_interfaces_dict', 361)": {'mod': [372]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "",
"info_type": "Code"
} | {
"code": [
"lib/ansible/modules/network/nxos/nxos_interface.py",
"lib/ansible/module_utils/nxos.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null |
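The `TypeError: string indices must be integers` in the record above occurs when the device occasionally answers with plain text instead of structured JSON, so `body['TABLE_interface']` ends up subscripting a string. A hedged sketch of the defensive pattern the fix applies in `nxos_interface.py` (function and key names follow the traceback, but the exact control flow is illustrative):

```python
def get_interface_table(body):
    # When the device falls back to plain-text output, body is a str,
    # and indexing it with a key would raise TypeError.
    if not isinstance(body, dict):
        return []
    table = body.get('TABLE_interface', {}).get('ROW_interface', [])
    # A single row may come back as a bare dict rather than a one-item list.
    return table if isinstance(table, list) else [table]

print(get_interface_table("% Invalid command"))  # []
print(get_interface_table(
    {'TABLE_interface': {'ROW_interface': {'interface': 'Ethernet1/10'}}}))
```

Returning an empty list lets the caller treat "no structured data" as "interface not yet known" instead of crashing mid-run.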
ansible | ansible | cc7a5228b02344658dac69c38ccb7d6580d2b4c6 | https://github.com/ansible/ansible/issues/34012 | module
affects_2.4
net_tools
support:community
bug | nmcli module fails with self.dns4=' '.join(module.params['dns4']) TypeError |
##### ISSUE TYPE
<!--- Pick one below and delete the rest -->
- Bug Report
##### COMPONENT NAME
`nmcli`
##### ANSIBLE VERSION
<!--- Paste verbatim output from "ansible --version" between quotes below -->
```
ansible 2.4.1.0
config file = /Users/dlbewley/src/ansible/playbook-openshift/ansible.cfg
configured module search path = [u'/Users/dlbewley/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/local/Cellar/ansible/2.4.1.0/libexec/lib/python2.7/site-packages/ansible
executable location = /usr/local/bin/ansible
python version = 2.7.14 (default, Sep 25 2017, 09:53:22) [GCC 4.2.1 Compatible Apple LLVM 9.0.0 (clang-900.0.37)]
```
##### CONFIGURATION
##### OS / ENVIRONMENT
- Manager: OS X
- Managed: Red Hat Enterprise Linux Server release 7.4 (Maipo)
##### SUMMARY
Playbook fails when trying to join `None` value for `dns4` param [here](https://github.com/ansible/ansible/blob/devel/lib/ansible/modules/net_tools/nmcli.py#L559)
I do not see a requirement to include dns servers, and expect to use DHCP.
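The failure at the linked line is `' '.join(None)`: when `dns4` is omitted from the task, `module.params['dns4']` is `None`, which is not iterable. A minimal reproduction and the guard that avoids it (a sketch, not the actual patch):

```python
params = {'dns4': None}  # dns4 omitted from the task, as in the playbook below

try:
    dns4 = ' '.join(params['dns4'])
    raised = False
except TypeError:
    raised = True
print(raised)  # True

# Guard: only join when a list was actually supplied.
dns4 = ' '.join(params['dns4']) if params['dns4'] else None
print(dns4)  # None
```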
##### STEPS TO REPRODUCE
Host with links on eno1 and eno2; interface eno1 carries the default gateway.
<!--- Paste example playbooks or commands between quotes below -->
```yaml
---
- hosts: bonded
# Dec 18 18:08:43 ose-prod-node-07 ansible-nmcli[31031]: Invoked with conn_name=cluster ingress=None slavepriority=32 vlandev=None forwarddelay=15 egress=None ageingtime=300 mtu=None hellotime=2 maxage=20 vlanid=None priority=128 gw4=None state=present gw6=None master=None stp=True ifname=None type=bond miimon=None arp_ip_target=None downdelay=None mac=None ip6=None ip4=None autoconnect=None dns6=None dns4=None arp_interval=None flags=None mode=802.3ad updelay=None
vars:
nmcli_bond:
- conn_name: cluster
mode: 802.3ad
mtu: 9000
nmcli_bond_slave:
- conn_name: eno1
master: cluster
- conn_name: eno2
master: cluster
tasks:
- name: create bond
nmcli:
type: bond
conn_name: '{{ item.conn_name }}'
mode: '{{ item.mode }}'
state: present
with_items:
- '{{ nmcli_bond }}'
- name: add interfaces to bond
nmcli:
type: bond-slave
conn_name: '{{ item.conn_name }}'
ifname: '{{ item.ifname }}'
master: '{{ item.master }}'
state: present
with_items:
- '{{ nmcli_bond_slave }}'
```
<!--- You can also paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- What did you expect to happen when running the steps above? -->
First test, but expect playbook to run without error.
##### ACTUAL RESULTS
<!--- What actually happened? If possible run with extra verbosity (-vvvv) -->
<!--- Paste verbatim command output between quotes below -->
```
failed: [ose-prod-node-07.example.com] (item={u'conn_name': u'cluster', u'mode': u'802.3ad', u'mtu': 9000}) => {
"changed": false,
"failed": true,
"item": {
"conn_name": "cluster",
"mode": "802.3ad",
"mtu": 9000
},
"module_stderr": "OpenSSH_7.4p1, LibreSSL 2.5.0\r\ndebug1: Reading configuration data /Users/dlbewley/.ssh/config\r\ndebug1: /Users/dlbewley/.ssh/config line 3: Applying options for *\r\ndebug1: Reading configuration data /etc/ssh/ssh_config\r\ndebug1: /etc/ssh/ssh_config line 51: Applying options for *\r\ndebug1: /etc/ssh/ssh_config line 56: Applying options for *\r\ndebug1: auto-mux: Trying existing master\r\ndebug2: fd 3 setting O_NONBLOCK\r\ndebug2: mux_client_hello_exchange: master version 4\r\ndebug3: mux_client_forwards: request forwardings: 0 local, 0 remote\r\ndebug3: mux_client_request_session: entering\r\ndebug3: mux_client_request_alive: entering\r\ndebug3: mux_client_request_alive: done pid = 10219\r\ndebug3: mux_client_request_session: session request sent\r\ndebug1: mux_client_request_session: master session id: 2\r\ndebug3: mux_client_read_packet: read header failed: Broken pipe\r\ndebug2: Received exit status from master 1\r\nShared connection to ose-prod-node-07.example.com closed.\r\n",
"module_stdout": "/tmp/ansible_gKn2an/ansible_module_nmcli.py:493: PyGIWarning: NetworkManager was imported without specifying a version first. Use gi.require_version('NetworkManager', '1.0') before import to ensure that the right version gets loaded.\r\n from gi.repository import NetworkManager, NMClient\r\n/tmp/ansible_gKn2an/ansible_module_nmcli.py:493: PyGIWarning: NMClient was imported without specifying a version first. Use gi.require_version('NMClient', '1.0') before import to ensure that the right version gets loaded.\r\n from gi.repository import NetworkManager, NMClient\r\nTraceback (most recent call last):\r\n File \"/tmp/ansible_gKn2an/ansible_module_nmcli.py\", line 1190, in <module>\r\n main()\r\n File \"/tmp/ansible_gKn2an/ansible_module_nmcli.py\", line 1134, in main\r\n nmcli=Nmcli(module)\r\n File \"/tmp/ansible_gKn2an/ansible_module_nmcli.py\", line 559, in __init__\r\n self.dns4=' '.join(module.params['dns4'])\r\nTypeError\r\n",
"msg": "MODULE FAILURE",
"rc": 1
}
```
| null | https://github.com/ansible/ansible/pull/30757 | null | {'base_commit': 'cc7a5228b02344658dac69c38ccb7d6580d2b4c6', 'files': [{'path': 'lib/ansible/modules/net_tools/nmcli.py', 'status': 'modified', 'Loc': {"('Nmcli', '__init__', 549)": {'mod': [559]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"lib/ansible/modules/net_tools/nmcli.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null |
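The traceback in the nmcli row above ends in `self.dns4=' '.join(module.params['dns4'])` raising `TypeError`, because `dns4` defaults to `None` when the playbook omits it and `str.join` only accepts an iterable. A minimal reproduction plus a hedged guard; the helper name is illustrative, not the module's actual code:

```python
def join_param(value, sep=" "):
    # nmcli-style params may legitimately be absent; joining None raises
    # TypeError, so pass None through and only join real lists.
    if value is None:
        return None
    return sep.join(value)

# str.join(None) is exactly the failure in the traceback above:
try:
    " ".join(None)
except TypeError:
    pass  # TypeError: can only join an iterable
```

Guarding at the point of assignment keeps the rest of the module's string-based handling unchanged.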
ultralytics | yolov5 | 5f7d39fede4de8af98472bd009c63c3a86568e2d | https://github.com/ultralytics/yolov5/issues/2840 | bug | wandb: Network error (ReadTimeout), entering retry loop. See wandb\debug-internal.log for full traceback. |
- **Current repo**: yolov5-5.0 release version
- **Common dataset**: VisDrone.yaml
- **Common environment**: Colab, Google Cloud, or Docker image. See https://github.com/ultralytics/yolov5#environments
## 🐛 Bug
I try to use your rep to train yolov4's NET because yolov4(https://github.com/WongKinYiu/PyTorch_YOLOv4)'s code is outdate and do not maintain, it has many bugs.
when I train my own yolov4-tiny.yaml, it comes this bug, I think this bug is because my network can not connect to wandb's server? before today, I can train normally, and a few minute ago, I try many times to `python train.py `,but I still can not begin my train code.
## To Reproduce (REQUIRED)
`python train.py `
Output:
```
YOLOv5 2021-4-15 torch 1.7.1 CUDA:0 (GRID V100D-32Q, 32638.0MB)
Namespace(adam=False, artifact_alias='latest', batch_size=64, bbox_interval=-1, bucket='', cache_images=False, cfg='models/yolov4-tiny.yaml', data='datai/Visdrone.yaml', device='', entity=None, epochs=300, evolve=False, exist_ok=False, global_rank=-1, hyp='data/hyp.scratch.yaml', image_weights=False, img_size=[640, 640], label_smoothing=0.0, linear_lr=False, local_rank=-1, multi_scale=False, name='exp', noautoanchor=False, nosave=False, notest=False, project='runs/train', quad=False, rect=False, resume=False, save_dir='runs\\train\\exp8', save_period=-1, single_cls=False, sync_bn=False, total_batch_size=64, upload_dataset=False, weights='', workers=8, world_size=1)
tensorboard: Start with 'tensorboard --logdir runs/train', view at http://localhost:6006/
hyperparameters: lr0=0.01, lrf=0.2, momentum=0.937, weight_decay=0.0005, warmup_epochs=3.0, warmup_momentum=0.8, warmup_bias_lr=0.1, box=0.05, cls=0.5, cls_pw=1.0, obj=1.0, obj_pw=1.0, iou_t=0.2, anchor_t=4.0, fl_gamma=0.0, hsv_h=0.015, hsv_s=0.7, hsv_v=0.4, degrees=0.0, translate=0.1, scale=0.5, shear=0.0, perspective=0.0, flipud=0.0, fliplr=0.5, mosaic=1.0, mixup=0.0
wandb: Currently logged in as: zigar (use `wandb login --relogin` to force relogin)
wandb: Network error (ReadTimeout), entering retry loop. See wandb\debug-internal.log for full traceback.
```
## Expected behavior
A clear and concise description of what you expected to happen.
## Environment
If applicable, add screenshots to help explain your problem.
- OS: [e.g. WIndows 10]
- GPU [e.g. GRID V100D-32Q, 32638.0MB]
## Additional context
Add any other context about the problem here.
| null | https://github.com/ultralytics/yolov5/pull/2882 | null | {'base_commit': '5f7d39fede4de8af98472bd009c63c3a86568e2d', 'files': [{'path': 'data/argoverse_hd.yaml', 'status': 'modified', 'Loc': {'(None, None, 3)': {'mod': [3]}}}, {'path': 'data/coco.yaml', 'status': 'modified', 'Loc': {'(None, None, 3)': {'mod': [3]}}}, {'path': 'data/coco128.yaml', 'status': 'modified', 'Loc': {'(None, None, 3)': {'mod': [3]}}}, {'path': 'data/scripts/get_argoverse_hd.sh', 'status': 'modified', 'Loc': {'(None, None, 5)': {'mod': [5]}}}, {'path': 'data/scripts/get_coco.sh', 'status': 'modified', 'Loc': {'(None, None, 5)': {'mod': [5]}}}, {'path': 'data/scripts/get_voc.sh', 'status': 'modified', 'Loc': {'(None, None, 41)': {'add': [41]}, '(None, None, 77)': {'add': [77]}, '(None, None, 120)': {'add': [120]}, '(None, None, 5)': {'mod': [5]}, '(None, None, 32)': {'mod': [32, 33]}, '(None, None, 35)': {'mod': [35, 36]}, '(None, None, 38)': {'mod': [38]}, '(None, None, 40)': {'mod': [40]}, '(None, None, 43)': {'mod': [43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54]}, '(None, None, 57)': {'mod': [57, 58, 59]}, '(None, None, 68)': {'mod': [68]}, '(None, None, 72)': {'mod': [72, 73]}, '(None, None, 76)': {'mod': [76]}, '(None, None, 79)': {'mod': [79, 80, 81, 82]}, '(None, None, 84)': {'mod': [84]}, '(None, None, 93)': {'mod': [93]}, '(None, None, 95)': {'mod': [95]}, '(None, None, 97)': {'mod': [97, 98, 99, 100, 102, 103, 104]}, '(None, None, 106)': {'mod': [106]}, '(None, None, 108)': {'mod': [108, 109, 111, 112, 113, 114, 116, 117, 118, 119]}, '(None, None, 123)': {'mod': [123, 124, 126, 127, 128, 129, 131, 132, 133, 134]}}}, {'path': 'data/voc.yaml', 'status': 'modified', 'Loc': {'(None, None, 3)': {'mod': [3]}}}, {'path': 'utils/general.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [11, 175]}, "(None, 'check_dataset', 156)": {'add': [166], 'mod': [164, 168, 169, 171]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"utils/general.py"
],
"doc": [],
"test": [],
"config": [
"data/argoverse_hd.yaml",
"data/voc.yaml",
"data/coco.yaml",
"data/coco128.yaml"
],
"asset": [
"data/scripts/get_argoverse_hd.sh",
"data/scripts/get_voc.sh",
"data/scripts/get_coco.sh"
]
} | null |
ultralytics | yolov5 | cbd55da5d24becbe3b94afaaa4cdd1187a512c3f | https://github.com/ultralytics/yolov5/issues/2824 | bug | Sizes of tensors must match | Multi Threaded Inference is not working with Yolo5. It throws the following error,
```
File "/home/zumbala/anaconda3/envs/environment/lib/python3.8/site-packages/torch/nn/modules/module.py", line 889, in _call_impl
result = self.forward(*input, **kwargs)
File "/home/zumbala/yolov5/models/yolo.py", line 113, in forward
yi = self.forward_once(xi)[0] # forward
File "/home/zumbala/yolov5/models/yolo.py", line 139, in forward_once
x = m(x) # run
File "/home/zumbala/anaconda3/envs/environment/lib/python3.8/site-packages/torch/nn/modules/module.py", line 889, in _call_impl
result = self.forward(*input, **kwargs)
File "/home/zumbala/yolov5/models/yolo.py", line 54, in forward
y[..., 0:2] = (y[..., 0:2] * 2. - 0.5 + self.grid[i]) * self.stride[i] # xy
RuntimeError: The size of tensor a (68) must match the size of tensor b (56) at non-singleton dimension 3
Exception in thread Thread-112:
Traceback (most recent call last):
File "/home/zumbala/anaconda3/envs/environment/lib/python3.8/threading.py", line 932, in _bootstrap_inner
self.run()
File "/home/zumbala/anaconda3/envs/environment/lib/python3.8/threading.py", line 870, in run
self._target(*self._args, **self._kwargs)
```
I saw the similar bug in other issue and I used the latest version of this repo. Still the problem persists. How can I fix it?
| null | null | https://github.com/ultralytics/yolov5/commit/cbd55da5d24becbe3b94afaaa4cdd1187a512c3f | {'base_commit': 'cbd55da5d24becbe3b94afaaa4cdd1187a512c3f', 'files': [{'path': 'models/yolo.py', 'status': 'modified', 'Loc': {"('Detect', 'forward', 38)": {'mod': [52]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"models/yolo.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null |
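The `RuntimeError: The size of tensor a (68) must match the size of tensor b (56)` in the row above comes from adding a grid cached for one input size to a feature map of a different size; the linked commit's change to `Detect.forward` rebuilds the grid when the shapes diverge. A framework-free sketch of that cache-invalidation pattern (names and the nested-list grid are illustrative, not the yolo.py code):

```python
def make_grid(ny, nx):
    # Stand-in for torch.meshgrid-based grid construction.
    return [[(x, y) for x in range(nx)] for y in range(ny)]

class GridCache:
    def __init__(self):
        self.grid = []    # cached per-layer grids
        self.shapes = []  # the shape each cached grid was built for

    def get(self, i, ny, nx):
        # Grow the cache lazily for layer i.
        while len(self.grid) <= i:
            self.grid.append(None)
            self.shapes.append(None)
        # Rebuild only when the incoming feature-map shape differs from
        # the cached one -- the guard that prevents the size mismatch.
        if self.shapes[i] != (ny, nx):
            self.grid[i] = make_grid(ny, nx)
            self.shapes[i] = (ny, nx)
        return self.grid[i]
```

The same-shape case reuses the cached object, so the guard costs nothing on the common path.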
ultralytics | yolov5 | d9b64c27c24db2001535bb480959aca015159510 | https://github.com/ultralytics/yolov5/issues/119 | question
Stale | The yolov5m model grew from 42M to 84M; what change was made? | When I trained on 6.16 (yolov5m), the resulting model size was 42M,
but today (6.18), training with the latest code, the model size is 84M.
What change was made to cause this? | null | null | https://github.com/ultralytics/yolov5/commit/d9b64c27c24db2001535bb480959aca015159510 | {'base_commit': 'd9b64c27c24db2001535bb480959aca015159510', 'files': [{'path': 'train.py', 'status': 'modified', 'Loc': {"(None, 'train', 60)": {'mod': [335]}}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "2",
"loc_way": "commit",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"train.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null |
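A likely cause of a checkpoint doubling in size, as in the 42M-to-84M row above, is saving optimizer state (and other training-only entries) alongside the weights; YOLOv5 ships a `strip_optimizer` utility for removing it after training. A hedged sketch of the idea only; the checkpoint-dict layout here is illustrative, not the repository's exact format:

```python
def strip_optimizer_state(ckpt):
    # Dropping the optimizer moments (and other training-only entries)
    # roughly halves a checkpoint that bundled weights plus optimizer.
    slim = dict(ckpt)  # shallow copy; leave the original untouched
    for key in ("optimizer", "training_results"):
        if key in slim:
            slim[key] = None
    return slim
```

In practice the slimmed checkpoint is what gets shipped for inference, while the full one stays usable for resuming training.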
ultralytics | yolov5 | bfd51f62f8e0a114cb94c269e83ff135e31d8bdb | https://github.com/ultralytics/yolov5/issues/187 | bug | can't test with my finetune weights | i train a model in my custom data, can get the weights (**last.pt** and **best.pt**)
i run:
`python test.py --img 640 --batch 16 --data ./data/patrol.yaml --weights weights/last.pt --device 4`
`python test.py --img 640 --batch 16 --data ./data/patrol.yaml --weights weights/best.pt --device 4`
both raise the error:
**Traceback (most recent call last):
File "test.py", line 277, in <module>
opt.verbose)
File "test.py", line 86, in test
names = model.names if hasattr(model, 'names') else model.module.names
File "/home/anaconda3/envs/yolov5/lib/python3.7/site-packages/torch/nn/modules/module.py", line 594, in __getattr__
type(self).__name__, name))
AttributeError: 'Model' object has no attribute 'module'**
However, i can run with the default weight **yolov5s.pt**
`python test.py --img 640 --batch 16 --data ./data/patrol.yaml --device 4`
pytorch = 1.5 | null | https://github.com/ultralytics/yolov5/pull/245 | null | {'base_commit': 'bfd51f62f8e0a114cb94c269e83ff135e31d8bdb', 'files': [{'path': 'train.py', 'status': 'modified', 'Loc': {"(None, 'train', 62)": {'add': [135, 136, 174], 'mod': [82, 291]}, '(None, None, None)': {'mod': [375]}}}, {'path': 'utils/torch_utils.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [56]}, "(None, 'model_info', 101)": {'mod': [114, 115]}, "('ModelEMA', 'update', 184)": {'mod': [188]}, "('ModelEMA', 'update_attr', 198)": {'mod': [199, 200, 201, 202]}}}, {'path': 'utils/utils.py', 'status': 'modified', 'Loc': {"(None, 'check_img_size', 48)": {'mod': [50]}}}]} | [] | [] | [] | {
"iss_type": "1",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"train.py",
"utils/utils.py",
"utils/torch_utils.py"
],
"doc": [],
"test": [],
"config": [],
"asset": []
} | null |
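The `AttributeError: 'Model' object has no attribute 'module'` in the row above comes from `model.names if hasattr(model, 'names') else model.module.names`, which assumes a `.module` wrapper (present under DataParallel) whenever `names` is missing. A hedged sketch of a safer lookup; not the repository's actual helper:

```python
def lookup(model, name):
    # Try the model itself first, then an optional .module wrapper,
    # instead of assuming the wrapper exists.
    if hasattr(model, name):
        return getattr(model, name)
    inner = getattr(model, "module", None)
    if inner is not None and hasattr(inner, name):
        return getattr(inner, name)
    raise AttributeError(name)

class Plain:
    # Stands in for a bare model carrying class names directly.
    names = ["person", "car"]

class Wrapped:
    # Stands in for a DataParallel-style wrapper exposing .module.
    def __init__(self, inner):
        self.module = inner
```

The lookup then works unchanged whether the checkpoint was saved wrapped or unwrapped.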
CorentinJ | Real-Time-Voice-Cloning | 5425557efe30863267f805851f918124191e0be0 | https://github.com/CorentinJ/Real-Time-Voice-Cloning/issues/227 | Short text samples | It would be awesome to be able to use this to help train a hot word detector. In addition to recording myself saying the hotword, I could create an even larger dataset by adding outputs of this model that used my voice as the reference.
The problem with that, however, is that this model seems to only work well on sentences of medium length (+- 20 words according to demo_cli.py). Is there anything I can do to make short text samples (e.g. 2 words) sound better? | null | https://github.com/CorentinJ/Real-Time-Voice-Cloning/pull/472 | null | {'base_commit': '5425557efe30863267f805851f918124191e0be0', 'files': [{'path': 'README.md', 'status': 'modified', 'Loc': {'(None, None, 18)': {'mod': [18]}, '(None, None, 23)': {'mod': [23, 24]}, '(None, None, 65)': {'mod': [65, 66, 68, 70]}}}, {'path': 'demo_cli.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [13, 43, 162], 'mod': [24, 25, 26, 30, 31, 32, 70, 76]}}}, {'path': 'demo_toolbox.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [5, 32], 'mod': [23, 24, 25]}}}, {'path': 'encoder/audio.py', 'status': 'modified', 'Loc': {"(None, 'preprocess_wav', 19)": {'mod': [20, 43, 44]}}}, {'path': 'requirements.txt', 'status': 'modified', 'Loc': {'(None, None, 16)': {'add': [16]}, '(None, None, 1)': {'mod': [1]}}}, {'path': 'requirements_gpu.txt', 'status': 'removed', 'Loc': {}}, {'path': 'synthesizer/LICENSE.txt', 'status': 'modified', 'Loc': {'(None, None, 3)': {'add': [3]}, '(None, None, 4)': {'add': [4]}}}, {'path': 'synthesizer/audio.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [4]}}}, {'path': 'synthesizer/feeder.py', 'status': 'removed', 'Loc': {}}, {'path': 'synthesizer/hparams.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [348], 'mod': [1, 3, 5, 6, 7, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 20, 21, 22, 23, 24, 25, 26, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 39, 40, 41, 42, 44, 45, 46, 47, 48, 49, 50, 51, 52, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 105, 106, 107, 108, 109, 110, 111, 113, 114, 115, 116, 117, 119, 121, 122, 123, 
124, 125, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136, 137, 138, 139, 140, 141, 143, 144, 145, 146, 147, 149, 150, 151, 152, 153, 154, 155, 157, 158, 159, 160, 161, 162, 164, 165, 166, 167, 168, 169, 170, 172, 174, 175, 176, 177, 178, 180, 181, 182, 183, 184, 185, 186, 187, 189, 190, 191, 192, 193, 194, 196, 197, 198, 199, 201, 202, 203, 204, 205, 206, 207, 208, 209, 210, 211, 212, 213, 214, 216, 217, 218, 219, 220, 221, 222, 223, 224, 225, 226, 227, 228, 229, 231, 232, 233, 234, 235, 237, 238, 239, 240, 242, 243, 244, 245, 246, 247, 248, 249, 250, 251, 252, 253, 255, 256, 257, 258, 259, 260, 261, 262, 264, 265, 266, 267, 269, 270, 271, 272, 273, 274, 275, 276, 278, 279, 280, 281, 283, 284, 285, 286, 287, 288, 289, 290, 291, 292, 293, 294, 295, 296, 297, 298, 299, 300, 301, 302, 303, 304, 305, 306, 308, 309, 310, 311, 313, 314, 315, 316, 317, 318, 319, 320, 321, 322, 323, 324, 325, 326, 327, 328, 329, 330, 331, 332, 333, 334, 335, 336, 337, 338, 339, 342, 343, 344, 345, 347]}, "(None, 'hparams_debug_string', 350)": {'mod': [351, 352, 353]}}}, {'path': 'synthesizer/inference.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [6], 'mod': [1, 2, 3, 4, 5, 9, 11]}, "('Synthesizer', '__init__', 19)": {'add': [33], 'mod': [21, 22, 24, 25, 26, 27, 28, 29, 30, 31, 32, 35, 36, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 49, 50, 51, 52, 53, 54, 55, 56, 57, 59]}, "('Synthesizer', 'griffin_lim', 149)": {'add': [154]}, "('Synthesizer', None, 15)": {'mod': [19, 106, 107, 108, 109, 110, 111, 113, 114, 116, 117, 118, 119, 121]}, "('Synthesizer', 'is_loaded', 61)": {'mod': [63]}, "('Synthesizer', 'load', 67)": {'mod': [69, 70, 71, 72, 73, 74, 75]}, "('Synthesizer', 'synthesize_spectrograms', 77)": {'mod': [91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 104]}}}, {'path': 'synthesizer/infolog.py', 'status': 'removed', 'Loc': {}}, {'path': 'synthesizer/models/__init__.py', 'status': 'removed', 'Loc': {}}, {'path': 'synthesizer/models/architecture_wrappers.py', 
'status': 'removed', 'Loc': {}}, {'path': 'synthesizer/models/attention.py', 'status': 'removed', 'Loc': {}}, {'path': 'synthesizer/models/custom_decoder.py', 'status': 'removed', 'Loc': {}}, {'path': 'synthesizer/models/helpers.py', 'status': 'removed', 'Loc': {}}, {'path': 'synthesizer/models/modules.py', 'status': 'removed', 'Loc': {}}, {'path': 'synthesizer/models/tacotron.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [11], 'mod': [1, 2, 3, 4, 5, 6, 7, 8, 9]}, "(None, 'split_func', 14)": {'mod': [14, 15, 16, 17, 18, 19, 20, 21, 24, 25, 26, 28, 29, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 65, 66, 67, 68, 69, 70, 71, 73, 74, 75, 76, 77, 79, 81, 82, 84, 86, 87, 88, 89, 90, 91, 93, 94, 95, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 108, 109, 110, 111, 113, 114, 115, 116, 117, 119, 120, 121, 122, 123, 124, 125, 126, 127, 128, 130, 131, 132, 134, 135, 136, 137, 139, 140, 141, 142, 143, 145, 147, 148, 151, 153, 154, 155, 156, 157, 158, 160, 163, 164, 165, 166, 167, 168, 169, 170, 171, 172, 173, 174, 175, 176, 177, 178, 179, 180, 181, 182, 183, 184, 185, 186, 187, 188, 190, 191, 192, 193, 194, 195, 196, 198, 199, 200, 201, 202, 203, 205, 206, 207, 209, 210, 212, 213, 214, 215, 216, 217, 218, 220, 221, 222, 223, 225, 226, 228, 229, 230, 232, 233, 234, 235, 237, 238, 240, 241, 242, 243, 244, 245, 246, 247, 249, 250, 252, 253, 254, 256, 257, 259, 260, 261, 263, 264, 265, 266, 267, 268, 269, 270, 271, 273, 274, 275, 277, 278, 279, 280, 281, 282, 283, 284, 286, 288, 289, 290, 291, 292, 293, 294, 295, 296, 297, 298, 299, 300, 301, 302, 303, 304, 305, 307, 308, 309, 312, 313, 314, 316, 317, 318, 319, 320, 321, 323, 324, 325, 326, 327, 328, 330, 331, 333, 334, 335, 336, 337, 338, 339, 340, 341, 342, 343, 344, 345, 346, 347, 348, 349, 350, 351, 352, 353, 354, 355, 356, 357, 358, 359, 360, 361, 362, 363, 364, 365, 366, 367, 369, 370, 371, 373, 374, 375, 376, 377, 
378, 379, 380, 381, 382, 383, 385, 386, 387, 388, 389, 390, 391, 392, 394, 395, 396, 397, 398, 399, 400, 402, 403, 404, 405, 406, 407, 409, 410, 412, 413, 414, 415, 416, 417, 418, 420, 421, 422, 423, 424, 425, 427, 428, 429, 430, 431, 432, 433, 435, 436, 437, 439, 441, 442, 443, 444, 445, 446, 447, 448, 449, 451, 452, 454, 455, 456, 457, 458, 459, 460, 461, 462, 464, 465, 466, 467, 468, 469, 470, 471, 472, 473, 474, 475, 476, 477, 479, 480, 481, 483, 484, 485, 486, 487, 488, 489, 491, 492, 493, 494, 495, 497, 498, 499, 501, 502, 504, 505, 507, 508, 509, 510, 512, 513, 514, 515, 516, 517, 518, 520, 521]}}}, {'path': 'synthesizer/preprocess.py', 'status': 'modified', 'Loc': {"(None, 'process_utterance', 185)": {'add': [204]}}}, {'path': 'synthesizer/synthesize.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [82], 'mod': [1, 3, 4, 6, 7]}, "(None, 'run_eval', 10)": {'mod': [10, 11, 12, 14, 15, 16, 17, 18, 20, 21, 23, 24, 25, 27, 28, 29, 30, 31, 32, 34, 35, 36, 37]}, "(None, 'run_synthesis', 39)": {'mod': [40, 41, 42, 43, 45, 46, 47, 48, 50, 51, 52, 53, 54, 55, 57, 58, 59, 60, 61, 62, 64, 65, 66, 67, 69, 70, 71, 72, 73, 74, 75, 77, 78, 80, 81]}}}, {'path': 'synthesizer/tacotron2.py', 'status': 'removed', 'Loc': {}}, {'path': 'synthesizer/train.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [0, 79, 83], 'mod': [3, 4, 5, 6, 7, 9, 10, 12, 14, 16, 19, 20, 21, 22, 24, 25, 26, 27, 28, 29, 31, 32, 35, 36, 37, 38, 39, 40, 41, 43, 44, 45, 46, 47, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78]}, "(None, 'model_train_mode', 85)": {'mod': [85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121, 122, 123, 124, 125, 126, 127, 128, 130, 131, 133, 134, 135, 136, 138, 139, 141, 142, 143, 144, 146, 147, 148, 149]}, "(None, 'train', 110)": {'mod': [151, 152, 153, 154, 155, 156, 157, 159, 161, 
167, 169, 171, 172, 173, 174, 176, 177, 178, 179, 181, 183, 184, 185, 186, 187, 189, 190, 191, 192, 194, 195, 196, 198, 199, 201, 202, 204, 205, 207, 208, 210, 212, 213, 214, 215, 216, 218, 219, 220, 222, 223, 224, 226, 227, 228, 230, 231, 232, 233, 234, 235, 237, 238, 239, 240, 241, 242, 243, 244, 245, 246, 247, 248, 249, 250, 251, 252, 253, 254, 255, 256, 257, 258, 260, 261, 262, 263, 265, 266, 267, 268, 269, 270, 271, 272, 273, 274, 275, 276, 277, 278, 279, 280, 281, 283, 284, 285, 286, 288, 289, 290, 291, 292, 293, 295, 296, 297, 298, 299, 300, 301, 302, 303, 304, 305, 306, 307, 308, 309, 310, 311, 313, 314, 315, 316, 317, 318, 319, 320, 322, 323, 324, 325, 327, 328, 329, 330, 332, 333, 334, 335, 336, 337, 338, 339, 341, 342, 343, 344, 346, 347, 348, 349, 350, 352, 353, 354, 355, 356, 357, 358, 359, 360, 361, 362, 363, 364, 365, 366, 367, 368, 370, 371, 372, 374, 375, 376, 377, 378, 379, 381, 382, 383, 385, 386, 387, 388, 391, 392]}}}, {'path': 'synthesizer/utils/__init__.py', 'status': 'modified', 'Loc': {"('ValueWindow', None, 1)": {'add': [0]}}}, {'path': 'synthesizer_train.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [2, 4, 6, 9, 10, 11, 12, 13, 14, 15, 16, 21, 22, 23, 24, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 53, 55]}}}, {'path': 'toolbox/__init__.py', 'status': 'modified', 'Loc': {"('Toolbox', 'init_encoder', 325)": {'add': [333]}, "('Toolbox', None, 42)": {'mod': [43]}, "('Toolbox', '__init__', 43)": {'mod': [54]}, "('Toolbox', 'synthesize', 207)": {'mod': [211, 212, 213, 214, 215, 216, 217, 221, 224, 228]}, "('Toolbox', 'vocode', 237)": {'mod': [243]}}}, {'path': 'toolbox/ui.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'mod': [41]}, "('UI', None, 53)": {'mod': [331]}, "('UI', 'populate_models', 338)": {'mod': [347, 348, 349, 350, 351, 352, 353]}}}, {'path': 'vocoder_preprocess.py', 'status': 'modified', 'Loc': {'(None, None, None)': {'add': [32, 40], 'mod': [20]}}}]} | [] | [] | 
[] | {
"iss_type": "2",
"iss_reason": "2",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"synthesizer/models/modules.py",
"synthesizer/models/tacotron.py",
"synthesizer/train.py",
"synthesizer/models/attention.py",
"synthesizer_train.py",
"demo_cli.py",
"toolbox/__init__.py",
"demo_toolbox.py",
"synthesizer/models/architecture_wrappers.py",
"synthesizer/audio.py",
"synthesizer/preprocess.py",
"synthesizer/tacotron2.py",
"synthesizer/hparams.py",
"synthesizer/utils/__init__.py",
"synthesizer/synthesize.py",
"toolbox/ui.py",
"encoder/audio.py",
"synthesizer/feeder.py",
"synthesizer/models/helpers.py",
"synthesizer/models/__init__.py",
"synthesizer/inference.py",
"vocoder_preprocess.py",
"synthesizer/models/custom_decoder.py",
"synthesizer/infolog.py"
],
"doc": [
"synthesizer/LICENSE.txt",
"README.md"
],
"test": [],
"config": [
"requirements_gpu.txt",
"requirements.txt"
],
"asset": []
} | null | |
AUTOMATIC1111 | stable-diffusion-webui | f108782e30369dedfc66f22d21c2b72c77941de7 | https://github.com/AUTOMATIC1111/stable-diffusion-webui/issues/5050 | bug | [Bug]: img2img sampler is not changing | ### Is there an existing issue for this?
- [X] I have searched the existing issues and checked the recent builds/commits
### What happened?
I'm trying to choose another sampler, but it is not working.
I tried checking the p value, and found sampler_name = None
There seems to be a code missing to assign the variable sampler_name in the img2img
txt2img seems working fine, though.
### Steps to reproduce the problem
Change the sampler and see the results. They are all the same.
### What should have happened?
Different samplers should produce different results.
### Commit where the problem happens
828438b
### What platforms do you use to access UI ?
Windows
### What browsers do you use to access the UI ?
Mozilla Firefox
### Command Line Arguments
_No response_
### Additional information, context and logs
_No response_ | null | https://github.com/AUTOMATIC1111/stable-diffusion-webui/pull/4910 | null | {'base_commit': 'f108782e30369dedfc66f22d21c2b72c77941de7', 'files': [{'path': 'scripts/xy_grid.py', 'status': 'modified', 'Loc': {"(None, 'confirm_samplers', 71)": {'add': [74]}, "('Script', 'process_axis', 276)": {'add': [279]}}}, {'path': 'img2img.py', 'Loc': {}}, {'path': 'Line 102: sampler_index=sd_samplers.samplers_for_img2img[sampler_index].name', 'Loc': {}}]} | [] | [] | [] | {
"iss_type": "2",
"iss_reason": "1",
"loc_way": "pr",
"loc_scope": "0",
"info_type": "Code"
} | {
"code": [
"img2img.py",
"scripts/xy_grid.py"
],
"doc": [],
"test": [],
"config": [],
"asset": [
"Line 102: sampler_index=sd_samplers.samplers_for_img2img[sampler_index].name"
]
} | null |
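The bug in the last row is that img2img left `sampler_name` as `None` when only a sampler index was supplied (the row's file note points at `sampler_index=sd_samplers.samplers_for_img2img[sampler_index].name`). A hedged sketch of resolving index to name with an explicit-name override; the sampler list below is illustrative, not the webui's real table:

```python
SAMPLERS_FOR_IMG2IMG = ["Euler a", "Euler", "LMS", "DDIM"]  # illustrative names

def resolve_sampler_name(sampler_index, sampler_name=None):
    # Prefer an explicit name; otherwise map the UI index to a name so
    # downstream processing never sees sampler_name=None and silently
    # falls back to a single default sampler.
    if sampler_name:
        return sampler_name
    return SAMPLERS_FOR_IMG2IMG[sampler_index]
```

Resolving at the entry point means every code path downstream can rely on a non-None name.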