| status | repo_name | repo_url | issue_id | updated_files | title | body | issue_url | pull_url | before_fix_sha | after_fix_sha | report_datetime | language | commit_datetime |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
closed | apache/airflow | https://github.com/apache/airflow | 15,622 | ["airflow/providers/google/CHANGELOG.rst", "airflow/providers/google/cloud/example_dags/example_dataproc.py", "airflow/providers/google/cloud/hooks/dataproc.py", "airflow/providers/google/cloud/operators/dataproc.py", "airflow/providers/google/cloud/sensors/dataproc.py", "tests/providers/google/cloud/hooks/test_datapro... | Inconsistencies with Dataproc Operator parameters | I'm looking at the GCP Dataproc operator and noticed that the `DataprocCreateClusterOperator` and `DataprocDeleteClusterOperator` require a `region` parameter, but other operators, like the `DataprocSubmitJobOperator` require a `location` parameter instead. I think it would be best to consistently enforce the parameter... | https://github.com/apache/airflow/issues/15622 | https://github.com/apache/airflow/pull/16034 | 5a5f30f9133a6c5f0c41886ff9ae80ea53c73989 | b0f7f91fe29d1314b71c76de0f11d2dbe81c5c4a | 2021-04-30T23:46:34Z | python | 2021-07-07T20:37:32Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,598 | ["airflow/providers/qubole/CHANGELOG.rst", "airflow/providers/qubole/hooks/qubole.py", "airflow/providers/qubole/hooks/qubole_check.py", "airflow/providers/qubole/operators/qubole.py", "airflow/providers/qubole/provider.yaml"] | Qubole Hook Does Not Support 'include_headers' | **Description**<br>Qubole Hook and Operator do not support `include_header` param for getting results with headers<br>Add Support for `include_header` get_results(... arguments=[True])<br>**Use case / motivation**<br>It's very hard to work with CSV results from db without headers.<br>This is super important when using Q... | https://github.com/apache/airflow/issues/15598 | https://github.com/apache/airflow/pull/15615 | edbc89c64033517fd6ff156067bc572811bfe3ac | 47a5539f7b83826b85b189b58b1641798d637369 | 2021-04-29T21:01:34Z | python | 2021-05-04T06:39:27Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,596 | ["airflow/dag_processing/manager.py", "docs/apache-airflow/administration-and-deployment/logging-monitoring/metrics.rst", "newsfragments/30076.significant.rst", "tests/dag_processing/test_manager.py"] | Using SLAs causes DagFileProcessorManager timeouts and prevents deleted dags from being recreated | **Apache Airflow version**: 2.0.1 and 2.0.2<br>**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): N/A<br>**Environment**: Celery executors, Redis + Postgres<br>- **Cloud provider or hardware configuration**: Running inside docker<br>- **OS** (e.g. from /etc/os-release): Centos (inside Docker)<br>... | https://github.com/apache/airflow/issues/15596 | https://github.com/apache/airflow/pull/30076 | 851fde06dc66a9f8e852f9a763746a47c47e1bb7 | 53ed5620a45d454ab95df886a713a5e28933f8c2 | 2021-04-29T20:21:20Z | python | 2023-03-16T21:51:23Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,559 | ["airflow/settings.py", "tests/core/test_sqlalchemy_config.py", "tests/www/test_app.py"] | airflow dag success , but tasks not yet started,not scheduled. | hi,team:<br>my dag is 1 minute schedule,one parts dag state is success,but tasks state is not yet started in a dag:<br><br>how can to fix it? | https://github.com/apache/airflow/issues/15559 | https://github.com/apache/airflow/pull/15714 | 507bca57b9fb40c36117e622de3b1313c45b41c3 | 231d104e37da57aa097e5f726fe6d3031ad04c52 | 2021-04-28T03:58:29Z | python | 2021-05-09T08:45:16Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,538 | ["airflow/providers/amazon/aws/hooks/s3.py", "airflow/providers/amazon/aws/sensors/s3_key.py", "tests/providers/amazon/aws/hooks/test_s3.py", "tests/providers/amazon/aws/sensors/test_s3_key.py"] | S3KeySensor wildcard fails to match valid unix wildcards | <!--<br>Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.<br>Don't worry if they're not all applicable; just try to include what you can :-)<br>If you need to include code snippets or logs, please put them in fenced code<br>blocks. If they're super-long, please use the details ... | https://github.com/apache/airflow/issues/15538 | https://github.com/apache/airflow/pull/18211 | 2f88009bbf8818f3b4b553a04ae3b848af43c4aa | 12133861ecefd28f1d569cf2d190c2f26f6fd2fb | 2021-04-26T20:30:10Z | python | 2021-10-01T17:36:03Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,536 | ["airflow/providers/apache/beam/hooks/beam.py", "tests/providers/apache/beam/hooks/test_beam.py", "tests/providers/google/cloud/hooks/test_dataflow.py"] | Get rid of state in Apache Beam provider hook | As discussed in https://github.com/apache/airflow/pull/15534#discussion_r620500075, we could possibly rewrite Beam Hook to remove the need of storing state in it. | https://github.com/apache/airflow/issues/15536 | https://github.com/apache/airflow/pull/29503 | 46d45e09cb5607ae583929f3eba1923a64631f48 | 7ba27e78812b890f0c7642d78a986fe325ff61c4 | 2021-04-26T17:29:42Z | python | 2023-02-17T14:19:11Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,532 | ["airflow/config_templates/config.yml", "airflow/config_templates/default_airflow.cfg"] | Airflow 1.10.15 : The CSRF session token is missing when i try to trigger a new dag | <!--<br>Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.<br>Don't worry if they're not all applicable; just try to include what you can :-)<br>If you need to include code snippets or logs, please put them in fenced code<br>blocks. If they're super-long, please use the details ... | https://github.com/apache/airflow/issues/15532 | https://github.com/apache/airflow/pull/15546 | 5b2fe0e74013cd08d1f76f5c115f2c8f990ff9bc | dfaaf49135760cddb1a1f79399c7b08905833c21 | 2021-04-26T15:09:02Z | python | 2021-04-27T21:20:02Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,526 | ["tests/kubernetes/kube_config", "tests/kubernetes/test_refresh_config.py"] | Improve test coverage of Kubernetes config_refresh | Kuberentes refresh_config has untested method https://codecov.io/gh/apache/airflow/src/master/airflow/kubernetes/refresh_config.py 75%<br>We might want to improve that. | https://github.com/apache/airflow/issues/15526 | https://github.com/apache/airflow/pull/18563 | 73fcbb0e4e151c9965fd69ba08de59462bbbe6dc | a6be59726004001214bd4d7e284fd1748425fa98 | 2021-04-26T07:33:28Z | python | 2021-10-13T23:30:28Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,524 | ["tests/cli/commands/test_task_command.py"] | Improve test coverage of task_command | Task command has a few missing commands not tested: https://codecov.io/gh/apache/airflow/src/master/airflow/cli/commands/task_command.py (77%) | https://github.com/apache/airflow/issues/15524 | https://github.com/apache/airflow/pull/15760 | 37d549bde79cd560d24748ebe7f94730115c0e88 | 51e54cb530995edbb6f439294888a79724365647 | 2021-04-26T07:29:42Z | python | 2021-05-14T04:34:15Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,523 | ["tests/executors/test_kubernetes_executor.py"] | Improve test coverage of Kubernetes Executor | The Kubernetes executor has surprisingly low test coverage: 64%<br>https://codecov.io/gh/apache/airflow/src/master/airflow/executors/kubernetes_executor.py - looks like some of the "flush/end" code is not tested.<br>We might want to improve it. | https://github.com/apache/airflow/issues/15523 | https://github.com/apache/airflow/pull/15617 | cf583b9290b3c2c58893f03b12d3711cc6c6a73c | dd56875066486f8c7043fbc51f272933fa634a25 | 2021-04-26T07:28:03Z | python | 2021-05-04T21:08:21Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,483 | ["airflow/providers/apache/beam/operators/beam.py", "tests/providers/apache/beam/operators/test_beam.py"] | Dataflow operator checks wrong project_id | **Apache Airflow version**:<br>composer-1.16.1-airflow-1.10.15<br>**Environment**:<br>- **Cloud provider or hardware configuration**: Google Composer<br>**What happened**:<br>First, a bit of context. We have a single instance of airflow within its own GCP project, which runs dataflows jobs on different GCP projects.<br>Le... | https://github.com/apache/airflow/issues/15483 | https://github.com/apache/airflow/pull/24020 | 56fd04016f1a8561f1c02e7f756bab8805c05876 | 4a5250774be8f48629294785801879277f42cc62 | 2021-04-22T09:22:48Z | python | 2022-05-30T12:17:42Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,463 | ["scripts/in_container/_in_container_utils.sh"] | Inconsistency between the setup.py and the constraints file | **Apache Airflow version**: 2.0.2<br>**What happened**:<br>Airflow's 2.0.2's [constraints file](https://raw.githubusercontent.com/apache/airflow/constraints-2.0.2/constraints-3.8.txt) has used newer `oauthlib==3.1.0` and `request-oauthlib==1.3.0` than 2.0.1's [constraints file](https://raw.githubusercontent.com/apache/... | https://github.com/apache/airflow/issues/15463 | https://github.com/apache/airflow/pull/15470 | c5e302030de7512a07120f71f388ad1859b26ca2 | 5da74f668e68132144590d1f95008bacf6f8b45e | 2021-04-20T21:40:34Z | python | 2021-04-21T12:06:22Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,451 | ["airflow/providers/google/provider.yaml", "scripts/in_container/run_install_and_test_provider_packages.sh", "tests/core/test_providers_manager.py"] | No module named 'airflow.providers.google.common.hooks.leveldb' | **Apache Airflow version**:<br>2.0.2<br>**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):<br>v1.18.18<br>**Environment**:<br>Cloud provider or hardware configuration: AWS<br>**What happened**:<br>Updated to Airflow 2.0.2 and a new warning appeared in webserver logs:<br>```<br>WARNING - Exception whe... | https://github.com/apache/airflow/issues/15451 | https://github.com/apache/airflow/pull/15453 | 63bec6f634ba67ec62a77c301e390b8354e650c9 | 42a1ca8aab905a0eb1ffb3da30cef9c76830abff | 2021-04-20T10:44:17Z | python | 2021-04-20T17:36:40Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,439 | ["airflow/jobs/local_task_job.py", "tests/jobs/test_local_task_job.py"] | DAG run state not updated while DAG is paused | **Apache Airflow version**:<br>2.0.0<br>**What happened**:<br>The state of a DAG run does not update while the DAG is paused. The _tasks_ continue to run if the DAG run was kicked off before the DAG was paused and eventually finish and are marked correctly. The DAG run state does not get updated and stays in Running... | https://github.com/apache/airflow/issues/15439 | https://github.com/apache/airflow/pull/16343 | d53371be10451d153625df9105234aca77d5f1d4 | 3834df6ade22b33addd47e3ab2165a0b282926fa | 2021-04-19T18:27:33Z | python | 2021-06-17T23:29:00Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,434 | ["airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py", "kubernetes_tests/test_kubernetes_pod_operator.py", "tests/providers/cncf/kubernetes/operators/test_kubernetes_pod.py"] | KubernetesPodOperator name randomization | `KubernetesPodOperator.name` randomization should be decoupled from the way the name is set. Currently `name` is only randomized if the `name` kwarg is used. However, one could also want name randomization when a name is set in a `pod_template_file` or `full_pod_spec`.<br>Move the name randomization feature behind a ne... | https://github.com/apache/airflow/issues/15434 | https://github.com/apache/airflow/pull/19398 | ca679c014cad86976c1b2e248b099d9dc9fc99eb | 854b70b9048c4bbe97abde2252b3992892a4aab0 | 2021-04-19T14:15:31Z | python | 2021-11-07T16:47:01Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,416 | ["BREEZE.rst", "scripts/in_container/configure_environment.sh"] | breeze should load local tmux configuration in 'breeze start-airflow' | **Description**<br>Currently, when we run<br>`<br>breeze start-airflow<br>`<br>**breeze** doesn't load local tmux configuration file **.tmux.conf** and we get default tmux configuration inside the containers.<br>**Use case / motivation**<br>Breeze must load local **tmux configuration** in to the containers and developers ... | https://github.com/apache/airflow/issues/15416 | https://github.com/apache/airflow/pull/15454 | fdea6226742d36eea2a7e0ef7e075f7746291561 | 508cd394bcf8dc1bada8824d52ebff7bb6c86b3b | 2021-04-17T14:34:32Z | python | 2021-04-21T16:46:02Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,399 | ["airflow/models/pool.py", "tests/models/test_pool.py"] | Not scheduling since there are (negative number) open slots in pool | **Apache Airflow version**: 2.0.1<br>**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): 1.16<br>**Environment**:<br>- **Cloud provider or hardware configuration**:<br>- **OS** (e.g. from /etc/os-release):<br>- **Kernel** (e.g. `uname -a`):<br>- **Install tools**:<br>- **Others**:<br>**What happened... | https://github.com/apache/airflow/issues/15399 | https://github.com/apache/airflow/pull/15426 | 8711f90ab820ed420ef317b931e933a2062c891f | d7c27b85055010377b6f971c3c604ce9821d6f46 | 2021-04-16T05:14:41Z | python | 2021-04-19T22:14:40Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,384 | ["airflow/www/utils.py", "airflow/www/views.py", "tests/www/test_utils.py"] | Pagination doesn't work with tags filter | **Apache Airflow version**:<br>2.0.1<br>**Environment**:<br>- **OS**: Linux Mint 19.2<br>- **Kernel**: 5.5.0-050500-generic #202001262030 SMP Mon Jan 27 01:33:36 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux<br>**What happened**:<br>Seems that pagination doesn't work. I filter DAGs by tags and get too many results to get them... | https://github.com/apache/airflow/issues/15384 | https://github.com/apache/airflow/pull/15411 | cb1344b63d6650de537320460b7b0547efd2353c | f878ec6c599a089a6d7516b7a66eed693f0c9037 | 2021-04-15T14:57:20Z | python | 2021-04-16T21:34:10Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,374 | ["airflow/models/dag.py", "tests/models/test_dag.py"] | Clearing a subdag task leaves parent dag in the failed state | **Apache Airflow version**:<br>2.0.1<br>**Kubernetes version**:<br>Server Version: version.Info{Major:"1", Minor:"18+", GitVersion:"v1.18.9-eks-d1db3c", GitCommit:"d1db3c46e55f95d6a7d3e5578689371318f95ff9", GitTreeState:"clean", BuildDate:"2020-10-20T22:18:07Z", GoVersion:"go1.13.15", Compiler:"gc", Platform:"linux/amd64"}... | https://github.com/apache/airflow/issues/15374 | https://github.com/apache/airflow/pull/15562 | 18531f81848dbd8d8a0d25b9f26988500a27e2a7 | a4211e276fce6521f0423fe94b01241a9c43a22c | 2021-04-14T21:15:44Z | python | 2021-04-30T19:52:26Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,353 | ["docs/apache-airflow/howto/custom-view-plugin.rst", "docs/apache-airflow/howto/index.rst", "docs/spelling_wordlist.txt", "metastore_browser/hive_metastore.py"] | Some more information regarding custom view plugins would be really nice! | **Description**<br>Some more information regarding custom view plugins would be really nice<br>**Use case / motivation**<br>I have created a custom view for airflow which was a little tricky since the Airflow docs are quite short and most of the information in the www is out of date.<br>Additionally the only example cannot... | https://github.com/apache/airflow/issues/15353 | https://github.com/apache/airflow/pull/27244 | 1447158e690f3d63981b3d8ec065665ec91ca54e | 544c93f0a4d2673c8de64d97a7a8128387899474 | 2021-04-13T19:08:56Z | python | 2022-10-31T04:33:52Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,332 | ["airflow/providers/sftp/hooks/sftp.py", "airflow/providers/sftp/sensors/sftp.py", "tests/providers/sftp/hooks/test_sftp.py", "tests/providers/sftp/sensors/test_sftp.py"] | SftpSensor w/ possibility to use RegEx or fnmatch | **Description**<br>SmartSftpSensor with possibility to search for patterns (RegEx or UNIX fnmatch) in filenames or folders<br>**Use case / motivation**<br>I would like to have the possibility to use wildcards and/or regular expressions to look for certain files when using an SftpSensor.<br>At the moment I tried to do s... | https://github.com/apache/airflow/issues/15332 | https://github.com/apache/airflow/pull/24084 | ec84ffe71cfa8246155b9b4cb10bf2167e75adcf | e656e1de55094e8369cab80b9b1669b1d1225f54 | 2021-04-12T17:01:24Z | python | 2022-06-06T12:54:27Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,318 | ["airflow/cli/cli_parser.py", "airflow/cli/commands/role_command.py", "tests/cli/commands/test_role_command.py"] | Add CLI to delete roles | Currently there is no option to delete a role from CLI which is very limiting.<br>I think it would be good if CLI will allow to delete a role (assuming no users are assigned to the role) | https://github.com/apache/airflow/issues/15318 | https://github.com/apache/airflow/pull/25854 | 3c806ff32d48e5b7a40b92500969a0597106d7db | 799b2695bb09495fc419d3ea2a8d29ff27fc3037 | 2021-04-11T06:39:05Z | python | 2022-08-27T00:37:12Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,289 | ["setup.py"] | reporting deprecated warnings saw in breeze | when working with breeze I saw these warnings many times in the logs:<br>```<br>=============================== warnings summary ===============================<br>tests/providers/amazon/aws/log/test_cloudwatch_task_handler.py::TestCloudwatchTaskHandler::test_close_prevents_duplicate_calls<br>/usr/local/lib/python3.6... | https://github.com/apache/airflow/issues/15289 | https://github.com/apache/airflow/pull/15290 | 594d93d3b0882132615ec26770ea77ff6aac5dff | 9ba467b388148f4217b263d2518e8a24407b9d5c | 2021-04-08T19:07:53Z | python | 2021-04-09T15:28:43Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,280 | ["airflow/providers/apache/spark/hooks/spark_submit.py", "tests/providers/apache/spark/hooks/test_spark_submit.py"] | Incorrect handle stop DAG when use spark-submit in cluster mode on yarn cluster. on yarn | **Apache Airflow version**: v2.0.1<br>**Environment**:<br>- **Cloud provider or hardware configuration**:<br>- bare metal<br>- **OS** (e.g. from /etc/os-release):<br>- Ubuntu 20.04.2 LTS (GNU/Linux 5.4.0-65-generic x86_64)<br>- **Kernel** (e.g. `uname -a`):<br>- Linux 5.4.0-65-generic #73-Ubuntu SMP Mon Jan 18 17:25:17 UTC 2021 ... | https://github.com/apache/airflow/issues/15280 | https://github.com/apache/airflow/pull/15304 | 9dd14aae40f4c2164ce1010cd5ee67d2317ea3ea | 9015beb316a7614616c9d8c5108f5b54e1b47843 | 2021-04-08T15:15:44Z | python | 2021-04-09T23:04:14Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,279 | ["airflow/providers/amazon/aws/log/cloudwatch_task_handler.py", "setup.py"] | Error on logging empty line to Cloudwatch | **Apache Airflow version**: 2.0.1<br>**Environment**:<br>- **Cloud provider or hardware configuration**: AWS<br>**What happened**:<br>I have Airflow with Cloudwatch-based remote logging running. I also have `BashOperator` that does, for example, `rsync` with invalid parameters, for example `rsync -av test test`. The outp... | https://github.com/apache/airflow/issues/15279 | https://github.com/apache/airflow/pull/19907 | 5b50d610d4f1288347392fac4a6eaaed78d1bc41 | 2539cb44b47d78e81a88fde51087f4cc77c924c5 | 2021-04-08T15:10:23Z | python | 2021-12-01T17:53:30Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,261 | ["airflow/www/static/js/task_instance.js", "airflow/www/templates/airflow/task_instance.html"] | Changing datetime will never show task instance logs | This is an extension of #15103<br>**Apache Airflow version**: 2.x.x<br>**What happened**:<br>Once you get to the task instance logs page, the date will successfully load at first. But if you change the time of the `execution_date` from the datetimepicker in any way the logs will be blank.<br>The logs seem to require e... | https://github.com/apache/airflow/issues/15261 | https://github.com/apache/airflow/pull/15284 | de9567f3f5dc212cee4e83f41de75c1bbe43bfe6 | 56a03710a607376a01cb201ec81eb9d87d7418fe | 2021-04-07T20:52:49Z | python | 2021-04-09T00:51:18Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,260 | ["docs/apache-airflow-providers-mysql/connections/mysql.rst"] | Documentation - MySQL Connection - example contains a typo | **What happened**:<br>There is an extra single quote after /tmp/server-ca.pem in the example.<br>[MySQL Connections](https://airflow.apache.org/docs/apache-airflow-providers-mysql/stable/connections/mysql.html)<br>Example “extras” field:<br>{<br>"charset": "utf8",<br>"cursor": "sscursor",<br>"local_infile": true,<br>"... | https://github.com/apache/airflow/issues/15260 | https://github.com/apache/airflow/pull/15265 | c594d9cfb32bbcfe30af3f5dcb452c6053cacc95 | 7ab4b2707669498d7278113439a13f58bd12ea1a | 2021-04-07T20:31:20Z | python | 2021-04-08T11:09:55Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,259 | ["chart/templates/scheduler/scheduler-deployment.yaml", "chart/tests/test_scheduler.py", "chart/values.schema.json", "chart/values.yaml"] | Scheduler livenessprobe and k8s v1.20+ | Pre Kubernetes v1.20, exec livenessprobes `timeoutSeconds` wasn't functional, and defaults to 1 second. The livenessprobe for the scheduler, however, takes longer than 1 second to finish so the scheduler will have consistent livenessprobe failures when running on v1.20.<br>> Before Kubernetes 1.20, the field timeoutSec... | https://github.com/apache/airflow/issues/15259 | https://github.com/apache/airflow/pull/15333 | 18c5b8af1020a86a82c459b8a26615ba6f1d8df6 | 8b56629ecd44d346e35c146779e2bb5422af1b5d | 2021-04-07T20:04:27Z | python | 2021-04-12T22:46:59Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,255 | ["tests/jobs/test_scheduler_job.py", "tests/test_utils/asserts.py"] | [QUARANTINED] TestSchedulerJob.test_scheduler_keeps_scheduling_pool_full is flaky | For example here:<br>https://github.com/apache/airflow/runs/2288380184?check_suite_focus=true#step:6:8759<br>```<br>=================================== FAILURES ===================================<br>__________ TestSchedulerJob.test_scheduler_keeps_scheduling_pool_full __________<br>self = <tests.jobs.test_sched... | https://github.com/apache/airflow/issues/15255 | https://github.com/apache/airflow/pull/19860 | d1848bcf2460fa82cd6c1fc1e9e5f9b103d95479 | 9b277dbb9b77c74a9799d64e01e0b86b7c1d1542 | 2021-04-07T18:12:23Z | python | 2021-12-13T17:55:43Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,248 | ["airflow/example_dags/tutorial_taskflow_api_etl_virtualenv.py", "airflow/exceptions.py", "airflow/models/dagbag.py", "airflow/providers/papermill/example_dags/example_papermill.py", "tests/api_connexion/endpoints/test_log_endpoint.py", "tests/core/test_impersonation_tests.py", "tests/dags/test_backfill_pooled_tasks.py... | Clear notification in UI when duplicate dag names are present | <!--<br>Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.<br>Don't worry if they're not all applicable; just try to include what you can :-)<br>If you need to include code snippets or logs, please put them in fenced code<br>blocks. If they're super-long, please use the details ... | https://github.com/apache/airflow/issues/15248 | https://github.com/apache/airflow/pull/15302 | faa4a527440fb1a8f47bf066bb89bbff380b914d | 09674537cb12f46ca53054314aea4d8eec9c2e43 | 2021-04-07T10:04:57Z | python | 2021-05-06T11:59:25Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,245 | ["airflow/providers/google/cloud/operators/dataproc.py", "tests/providers/google/cloud/operators/test_dataproc.py"] | Passing Custom Image Family Name to the DataprocClusterCreateOperator() | **Description**<br>Currently, we can only pass custom Image name to **DataprocClusterCreateOperator(),**<br>as the custom image expires after 60 days, we either need to update the image or we need to pass the expiration token.<br>Functionality is already available in **gcloud** and **REST**.<br>`gcloud dataproc clusters t... | https://github.com/apache/airflow/issues/15245 | https://github.com/apache/airflow/pull/15250 | 99ec208024933d790272a09a6f20b241410a7df7 | 6da36bad2c5c86628284d91ad6de418bae7cd029 | 2021-04-07T06:17:45Z | python | 2021-04-18T17:26:44Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,218 | ["airflow/config_templates/config.yml", "airflow/config_templates/default_airflow.cfg", "airflow/executors/kubernetes_executor.py", "airflow/jobs/scheduler_job.py", "airflow/kubernetes/kube_config.py", "airflow/utils/event_scheduler.py", "tests/executors/test_kubernetes_executor.py", "tests/utils/test_event_scheduler.p... | Task stuck in queued state with pending pod | **Apache Airflow version**: 2.0.1<br>**Kubernetes version**: v1.19.7<br>**Executor**: KubernetesExecutor<br>**What happened**:<br>If you have a worker that gets stuck in pending forever, say with a missing volume mount, the task will stay in the queued state forever. Nothing is applying a timeout on it actually being able ... | https://github.com/apache/airflow/issues/15218 | https://github.com/apache/airflow/pull/15263 | 1e425fe6459a39d93a9ada64278c35f7cf0eab06 | dd7ff4621e003421521960289a323eb1139d1d91 | 2021-04-05T22:04:15Z | python | 2021-04-20T18:24:38Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,179 | ["chart/templates/NOTES.txt"] | Kubernetes does not show logs for task instances if remote logging is not configured | Without configuring remote logging, logs from Kubernetes for task instances are not complete.<br>Without remote logging configured, the logging for task instances are outputted as :<br>logging_level: INFO<br>```log<br>BACKEND=postgresql<br>DB_HOST=airflow-postgresql.airflow.svc.cluster.local<br>DB_PORT=5432<br>[2021-04-03 12... | https://github.com/apache/airflow/issues/15179 | https://github.com/apache/airflow/pull/16784 | 1eed6b4f37ddf2086bf06fb5c4475c68fadac0f9 | 8885fc1d9516b30b316487f21e37d34bdd21e40e | 2021-04-03T21:20:52Z | python | 2021-07-06T18:37:31Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,178 | ["airflow/example_dags/tutorial.py", "airflow/models/baseoperator.py", "airflow/serialization/schema.json", "airflow/www/utils.py", "airflow/www/views.py", "docs/apache-airflow/concepts.rst", "tests/serialization/test_dag_serialization.py", "tests/www/test_utils.py"] | Task doc is not shown on Airflow 2.0 Task Instance Detail view | <!--<br>Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.<br>Don't worry if they're not all applicable; just try to include what you can :-)<br>If you need to include code snippets or logs, please put them in fenced code<br>blocks. If they're super-long, please use the details ... | https://github.com/apache/airflow/issues/15178 | https://github.com/apache/airflow/pull/15191 | 7c17bf0d1e828b454a6b2c7245ded275b313c792 | e86f5ca8fa5ff22c1e1f48addc012919034c672f | 2021-04-03T20:48:59Z | python | 2021-04-05T02:46:41Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,145 | ["airflow/providers/google/cloud/example_dags/example_bigquery_to_mssql.py", "airflow/providers/google/cloud/transfers/bigquery_to_mssql.py", "airflow/providers/google/provider.yaml", "tests/providers/google/cloud/transfers/test_bigquery_to_mssql.py"] | Big Query to MS SQL operator | <!--<br>Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.<br>Don't worry if they're not all applicable; just try to include what you can :-)<br>If you need to include code snippets or logs, please put them in fenced code<br>blocks. If they're super-long, please use the details ... | https://github.com/apache/airflow/issues/15145 | https://github.com/apache/airflow/pull/15422 | 70cfe0135373d1f0400e7d9b275ebb017429794b | 7f8f75eb80790d4be3167f5e1ffccc669a281d55 | 2021-04-01T20:36:55Z | python | 2021-06-12T21:07:06Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,113 | ["setup.py"] | ImportError: cannot import name '_check_google_client_version' from 'pandas_gbq.gbq' | **What happened**:<br>`pandas-gbq` released version [0.15.0](https://github.com/pydata/pandas-gbq/releases/tag/0.15.0) which broke `apache-airflow-backport-providers-google==2020.11.23`<br>```<br>../lib/python3.7/site-packages/airflow/providers/google/cloud/hooks/bigquery.py:49: in <module><br>from pandas_gbq.gbq impor... | https://github.com/apache/airflow/issues/15113 | https://github.com/apache/airflow/pull/15114 | 64b00896d905abcf1fbae195a29b81f393319c5f | b3b412523c8029b1ffbc600952668dc233589302 | 2021-03-31T14:39:00Z | python | 2021-04-04T17:25:22Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,107 | ["Dockerfile", "chart/values.yaml", "docs/docker-stack/build-arg-ref.rst", "docs/docker-stack/build.rst", "docs/docker-stack/docker-examples/extending/writable-directory/Dockerfile", "docs/docker-stack/entrypoint.rst", "scripts/in_container/prod/entrypoint_prod.sh"] | Make the entrypoint in Prod image fail in case the user/group is not properly set | Airflow Production image accepts two types of uid/gid setting:<br>* airflow user (50000) with any GID<br>* any other user wit GID = 0 (this is to accommodate OpenShift Guidelines https://docs.openshift.com/enterprise/3.0/creating_images/guidelines.html)<br>We should check the uid/gid at entrypoint and fail it with clear ... | https://github.com/apache/airflow/issues/15107 | https://github.com/apache/airflow/pull/15162 | 1d635ef0aefe995553059ee5cf6847cf2db65b8c | ce91872eccceb8fb6277012a909ad6b529a071d2 | 2021-03-31T10:30:38Z | python | 2021-04-08T17:28:36Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,103 | ["airflow/www/static/js/task_instance.js"] | Airflow web server redirects to a non-existing log folder - v2.1.0.dev0 | **Apache Airflow version**: v2.1.0.dev0<br>**Environment**:<br>- **Others**: Docker + docker compose<br>```<br>docker pull apache/airflow:master-python3.8<br>```<br>**What happened**:<br>Once the tasks finish successfully, I click on the Logs button in the web server, then I got redirected to this URL:<br>`http://localhost:808... | https://github.com/apache/airflow/issues/15103 | https://github.com/apache/airflow/pull/15258 | 019241be0c839ba32361679ffecd178c0506d10d | 523fb5c3f421129aea10045081dc5e519859c1ae | 2021-03-30T23:29:48Z | python | 2021-04-07T20:38:30Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,071 | ["airflow/cli/cli_parser.py", "airflow/cli/commands/scheduler_command.py", "chart/templates/scheduler/scheduler-deployment.yaml", "tests/cli/commands/test_scheduler_command.py"] | Run serve_logs process as part of scheduler command | <!--<br>Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.<br>Don't worry if they're not all applicable; just try to include what you can :-)<br>If you need to include code snippets or logs, please put them in fenced code<br>blocks. If they're super-long, please use the details ... | https://github.com/apache/airflow/issues/15071 | https://github.com/apache/airflow/pull/15557 | 053d903816464f699876109b50390636bf617eff | 414bb20fad6c6a50c5a209f6d81f5ca3d679b083 | 2021-03-29T17:46:46Z | python | 2021-04-29T15:06:06Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,059 | ["airflow/api_connexion/openapi/v1.yaml", "airflow/api_connexion/schemas/user_schema.py", "tests/api_connexion/endpoints/test_user_endpoint.py", "tests/api_connexion/schemas/test_user_schema.py"] | Remove 'user_id', 'role_id' from User and Role in OpenAPI schema | Would be good to remove the 'id' of both User and Role schemas from what is dumped in REST API endpoints. ID of User and Role table are sensitive data that would be fine to hide from the endpoints | https://github.com/apache/airflow/issues/15059 | https://github.com/apache/airflow/pull/15117 | b62ca0ad5d8550a72257ce59c8946e7f134ed70b | 7087541a56faafd7aa4b9bf9f94eb6b75eed6851 | 2021-03-28T15:40:00Z | python | 2021-04-07T13:54:45Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,023 | ["airflow/www/api/experimental/endpoints.py", "airflow/www/templates/airflow/trigger.html", "airflow/www/views.py", "tests/api_connexion/endpoints/test_dag_run_endpoint.py", "tests/www/api/experimental/test_endpoints.py", "tests/www/views/test_views_trigger_dag.py"] | DAG task execution and API fails if dag_run.conf is provided with an array or string (instead of dict) | **Apache Airflow version**: 2.0.1
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): Tried both pip install and k8s image
**Environment**: Dev Workstation of K8s execution - both the same
- **OS** (e.g. from /etc/os-release): Ubuntu 20.04 LTS
- **Others**: Python 3.6
**What happe... | https://github.com/apache/airflow/issues/15023 | https://github.com/apache/airflow/pull/15057 | eeb97cff9c2cef46f2eb9a603ccf7e1ccf804863 | 01c9818405107271ee8341c72b3d2d1e48574e08 | 2021-03-25T22:50:15Z | python | 2021-06-22T12:31:37Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,019 | ["airflow/ui/src/api/index.ts", "airflow/ui/src/components/TriggerRunModal.tsx", "airflow/ui/src/interfaces/api.ts", "airflow/ui/src/utils/memo.ts", "airflow/ui/src/views/Pipelines/Row.tsx", "airflow/ui/src/views/Pipelines/index.tsx", "airflow/ui/test/Pipelines.test.tsx"] | Establish mutation patterns via the API | https://github.com/apache/airflow/issues/15019 | https://github.com/apache/airflow/pull/15068 | 794922649982b2a6c095f7fa6be4e5d6a6d9f496 | 9ca49b69113bb2a1eaa0f8cec2b5f8598efc19ea | 2021-03-25T21:24:01Z | python | 2021-03-30T00:32:11Z | |
closed | apache/airflow | https://github.com/apache/airflow | 15,018 | ["airflow/ui/package.json", "airflow/ui/src/api/index.ts", "airflow/ui/src/components/Table.tsx", "airflow/ui/src/interfaces/react-table-config.d.ts", "airflow/ui/src/views/Pipelines/PipelinesTable.tsx", "airflow/ui/src/views/Pipelines/Row.tsx", "airflow/ui/test/Pipelines.test.tsx", "airflow/ui/yarn.lock"] | Build out custom Table components | https://github.com/apache/airflow/issues/15018 | https://github.com/apache/airflow/pull/15805 | 65519ab83ddf4bd6fc30c435b5bfccefcb14d596 | 2c6b003fbe619d5d736cf97f20a94a3451e1a14a | 2021-03-25T21:22:50Z | python | 2021-05-27T20:23:02Z | |
closed | apache/airflow | https://github.com/apache/airflow | 15,001 | ["airflow/providers/amazon/aws/sensors/s3_prefix.py", "tests/providers/amazon/aws/sensors/test_s3_prefix.py"] | S3MultipleKeysSensor operator | **Description**
Currently we have an operator, S3KeySensor, which polls for a given prefix in the bucket. At times, there is a need to poll for multiple prefixes in a given bucket in one go. To support that, I propose an S3MultipleKeysSensor, which would poll for multiple prefixes in the given bucket in one go.
... | https://github.com/apache/airflow/issues/15001 | https://github.com/apache/airflow/pull/18807 | ec31b2049e7c3b9f9694913031553f2d7eb66265 | 176165de3b297c0ed7d2b60cf6b4c37fc7a2337f | 2021-03-25T07:24:52Z | python | 2021-10-11T21:15:16Z |
closed | apache/airflow | https://github.com/apache/airflow | 15,000 | ["airflow/providers/amazon/aws/operators/ecs.py", "tests/providers/amazon/aws/operators/test_ecs.py"] | When an ECS Task fails to start, ECS Operator raises a CloudWatch exception | <!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.
Don't worry if they're not all applicable; just try to include what you can :-)
If you need to include code snippets or logs, please put them in fenced code
blocks. If they're super-long, please use the details ... | https://github.com/apache/airflow/issues/15000 | https://github.com/apache/airflow/pull/18733 | a192b4afbd497fdff508b2a06ec68cd5ca97c998 | 767a4f5207f8fc6c3d8072fa780d84460d41fc7a | 2021-03-25T05:55:31Z | python | 2021-10-05T21:34:26Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,991 | ["scripts/ci/libraries/_md5sum.sh", "scripts/ci/libraries/_verify_image.sh", "scripts/docker/compile_www_assets.sh"] | Static file not being loaded in web server in docker-compose | Apache Airflow version: apache/airflow:master-python3.8
Environment:
Cloud provider or hardware configuration:
OS (e.g. from /etc/os-release): Mac OS 10.16.5
Kernel (e.g. uname -a): Darwin Kernel Version 19.6.0
Browser:
Google Chrome Version 89.0.4389.90
What happened:
I am having an issue with runnin... | https://github.com/apache/airflow/issues/14991 | https://github.com/apache/airflow/pull/14995 | 775ee51d0e58aeab5d29683dd2ff21b8c9057095 | 5dc634bf74bbec68bbe1c7b6944d0a9efd85181d | 2021-03-24T20:54:58Z | python | 2021-03-25T13:04:43Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,989 | [".github/workflows/ci.yml", "docs/exts/docs_build/fetch_inventories.py", "scripts/ci/docs/ci_docs.sh", "scripts/ci/docs/ci_docs_prepare.sh"] | Make Docs builds fallback in case external docs sources are missing | Every now and then our docs builds start to fail because of an external dependency (latest example here #14985). And while we now cache that information, it does not help when the initial retrieval fails. This information does not change often, but with the number of dependencies we have it will continue to ... | https://github.com/apache/airflow/issues/14989 | https://github.com/apache/airflow/pull/15109 | 2ac4638b7e93d5144dd46f2c09fb982c374db79e | 8cc8d11fb87d0ad5b3b80907874f695a77533bfa | 2021-03-24T18:15:48Z | python | 2021-04-02T22:11:44Z
closed | apache/airflow | https://github.com/apache/airflow | 14,959 | ["airflow/providers/docker/operators/docker_swarm.py", "tests/providers/docker/operators/test_docker_swarm.py"] | Support all terminus task states for Docker Swarm Operator | **Apache Airflow version**: latest
**What happened**:
There are more terminus task states than the ones we currently check in the Docker Swarm Operator. This makes the operator run indefinitely when the service goes into these states.
**What you expected to happen**:
The operator should terminate.
**How to re... | https://github.com/apache/airflow/issues/14959 | https://github.com/apache/airflow/pull/14960 | 6b78394617c7e699dda1acf42e36161d2fc29925 | ab477176998090e8fb94d6f0e6bf056bad2da441 | 2021-03-23T15:44:21Z | python | 2021-04-07T12:39:43Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,957 | [".github/workflows/ci.yml", ".pre-commit-config.yaml", "BREEZE.rst", "STATIC_CODE_CHECKS.rst", "breeze-complete", "scripts/ci/selective_ci_checks.sh", "scripts/ci/static_checks/eslint.sh"] | Run selective CI pipeline for UI-only PRs | For PRs that only touch files in `airflow/ui/` we'd like to run a selective set of CI actions. We only need linting and UI tests.
Additionally, this update should pull the test runs out of the pre-commit. | https://github.com/apache/airflow/issues/14957 | https://github.com/apache/airflow/pull/15009 | a2d99293c9f5bdf1777fed91f1c48230111f53ac | 7417f81d36ad02c2a9d7feb9b9f881610f50ceba | 2021-03-23T14:32:41Z | python | 2021-03-31T22:10:00Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,924 | ["airflow/utils/cli.py", "airflow/utils/log/file_processor_handler.py", "airflow/utils/log/file_task_handler.py", "airflow/utils/log/non_caching_file_handler.py"] | Scheduler Memory Leak in Airflow 2.0.1 | **Apache Airflow version**: 2.0.1
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): v1.17.4
**Environment**: Dev
- **OS** (e.g. from /etc/os-release): RHEL7
**What happened**:
After running fine for some time my airflow tasks got stuck in scheduled state with below error in Task... | https://github.com/apache/airflow/issues/14924 | https://github.com/apache/airflow/pull/18054 | 6acb9e1ac1dd7705d9bfcfd9810451dbb549af97 | 43f595fe1b8cd6f325d8535c03ee219edbf4a559 | 2021-03-21T15:35:14Z | python | 2021-09-09T10:50:45Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,888 | ["airflow/providers/amazon/aws/transfers/s3_to_redshift.py", "tests/providers/amazon/aws/transfers/test_s3_to_redshift.py"] | S3ToRedshiftOperator is not transaction safe for truncate | **Apache Airflow version**: 2.0.0
**Environment**:
- **Cloud provider or hardware configuration**: AWS
- **OS** (e.g. from /etc/os-release): Amazon Linux 2
**What happened**:
In Redshift, the TRUNCATE operation has the fine print that it commits the transaction it runs in.
See https://docs.aws.amazon.com/re... | https://github.com/apache/airflow/issues/14888 | https://github.com/apache/airflow/pull/17117 | 32582b5bf1432e7c7603b959a675cf7edd76c9e6 | f44d7bd9cfe00b1409db78c2a644516b0ab003e9 | 2021-03-19T00:33:07Z | python | 2021-07-21T16:33:22Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,880 | ["airflow/providers/slack/operators/slack.py", "tests/providers/slack/operators/test_slack.py"] | SlackAPIFileOperator is broken | <!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.
Don't worry if they're not all applicable; just try to include what you can :-)
If you need to include code snippets or logs, please put them in fenced code
blocks. If they're super-long, please use the details ... | https://github.com/apache/airflow/issues/14880 | https://github.com/apache/airflow/pull/17247 | 797b515a23136d1f00c6bd938960882772c1c6bd | 07c8ee01512b0cc1c4602e356b7179cfb50a27f4 | 2021-03-18T16:07:03Z | python | 2021-08-01T23:08:07Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,864 | ["airflow/exceptions.py", "airflow/utils/task_group.py", "tests/utils/test_task_group.py"] | Using TaskGroup without context manager (Graph view visual bug) | **Apache Airflow version**: 2.0.0
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): n/a
**What happened**:
When I do not use the context manager for the task group and instead call the add function to add the tasks, those tasks show up on the Graph view.
![Screen Shot 2021-03-17 at 2... | https://github.com/apache/airflow/issues/14864 | https://github.com/apache/airflow/pull/23071 | 9caa511387f92c51ab4fc42df06e0a9ba777e115 | 337863fa35bba8463d62e5cf0859f2bb73cf053a | 2021-03-17T22:25:05Z | python | 2022-06-05T13:52:02Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,830 | ["airflow/api_connexion/endpoints/role_and_permission_endpoint.py", "airflow/api_connexion/openapi/v1.yaml", "airflow/security/permissions.py", "tests/api_connexion/endpoints/test_role_and_permission_endpoint.py"] | Add Create/Update/Delete API endpoints for Roles | To be able to fully manage the permissions in the UI we will need to be able to modify the roles and the permissions they have.
It probably makes sense to have one PR that adds CUD (Read is already done) endpoints for Roles.
Permissions are not createable via anything but code, so we only need these endpoints for... | https://github.com/apache/airflow/issues/14830 | https://github.com/apache/airflow/pull/14840 | 266384a63f4693b667f308d49fcbed9a10a41fce | 6706b67fecc00a22c1e1d6658616ed9dd96bbc7b | 2021-03-16T10:58:54Z | python | 2021-04-05T09:22:43Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,811 | ["setup.cfg"] | Latest SQLAlchemy (1.4) Incompatible with latest sqlalchemy_utils | <!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.
Don't worry if they're not all applicable; just try to include what you can :-)
If you need to include code snippets or logs, please put them in fenced code
blocks. If they're super-long, please use the details ... | https://github.com/apache/airflow/issues/14811 | https://github.com/apache/airflow/pull/14812 | 251eb7d170db3f677e0c2759a10ac1e31ac786eb | c29f6fb76b9d87c50713ae94fda805b9f789a01d | 2021-03-15T19:39:29Z | python | 2021-03-15T20:28:06Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,807 | ["airflow/ui/package.json", "airflow/ui/src/components/AppContainer/AppHeader.tsx", "airflow/ui/src/components/AppContainer/TimezoneDropdown.tsx", "airflow/ui/src/components/MultiSelect.tsx", "airflow/ui/src/providers/TimezoneProvider.tsx", "airflow/ui/src/views/Pipelines/PipelinesTable.tsx", "airflow/ui/src/views/Pipe... | Design/build timezone switcher modal | - Once we have the current user's preference set and available in Context, add a modal that allows the preferred display timezone to be changed.
- Modal will be triggered by clicking the time/TZ in the global navigation. | https://github.com/apache/airflow/issues/14807 | https://github.com/apache/airflow/pull/15674 | 46d62782e85ff54dd9dc96e1071d794309497983 | 3614910b4fd32c90858cd9731fc0421078ca94be | 2021-03-15T15:14:24Z | python | 2021-05-07T17:49:37Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,755 | ["tests/jobs/test_backfill_job.py"] | [QUARANTINE] Backfill depends on past test is flaky | The test backfill_depends_on_past is flaky. The whole Backfill class was in Heisentest, but I believe this is the only one that is problematic now, so I am removing the class from heisentests and moving the depends_on_past test to quarantine.
<!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following ... | https://github.com/apache/airflow/issues/14755 | https://github.com/apache/airflow/pull/19862 | 5ebd63a31b5bc1974fc8974f137b9fdf0a5f58aa | a804666347b50b026a8d3a1a14c0b2e27a369201 | 2021-03-13T13:00:28Z | python | 2021-11-30T12:59:42Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,726 | [".pre-commit-config.yaml", "BREEZE.rst", "STATIC_CODE_CHECKS.rst", "airflow/ui/package.json", "airflow/ui/yarn.lock", "breeze-complete"] | Add precommit linting and testing to the new /ui | **Description**
We just initialized the new UI for AIP-38 under `/airflow/ui`. To continue development, it would be best to add a pre-commit hook to run the linting and testing commands for the new project.
**Use case / motivation**
The new UI already has linting and testing setup with `yarn lint` and `yarn te... | https://github.com/apache/airflow/issues/14726 | https://github.com/apache/airflow/pull/14836 | 5f774fae530577e302c153cc8726c93040ebbde0 | e395fcd247b8aa14dbff2ee979c1a0a17c42adf4 | 2021-03-11T16:18:27Z | python | 2021-03-16T23:06:26Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,682 | ["airflow/providers/amazon/aws/transfers/local_to_s3.py", "airflow/providers/google/cloud/transfers/azure_fileshare_to_gcs.py", "airflow/providers/google/cloud/transfers/s3_to_gcs.py", "tests/providers/amazon/aws/transfers/test_local_to_s3.py"] | The S3ToGCSOperator fails on templated `dest_gcs` URL | <!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.
Don't worry if they're not all applicable; just try to include what you can :-)
If you need to include code snippets or logs, please put them in fenced code
blocks. If they're super-long, please use the details ... | https://github.com/apache/airflow/issues/14682 | https://github.com/apache/airflow/pull/19048 | efdfd15477f92da059fa86b4fa18b6f29cb97feb | 3c08c025c5445ffc0533ac28d07ccf2e69a19ca8 | 2021-03-09T14:44:14Z | python | 2021-10-27T06:15:00Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,675 | ["airflow/utils/helpers.py", "tests/utils/test_helpers.py"] | TriggerDagRunOperator OperatorLink doesn't work when HTML base url doesn't match the Airflow base url | <!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.
Don't worry if they're not all applicable; just try to include what you can :-)
If you need to include code snippets or logs, please put them in fenced code
blocks. If they're super-long, please use the details ... | https://github.com/apache/airflow/issues/14675 | https://github.com/apache/airflow/pull/14990 | 62aa7965a32f1f8dde83cb9c763deef5b234092b | aaa3bf6b44238241bd61178426b692df53770c22 | 2021-03-09T01:03:33Z | python | 2021-04-11T11:51:59Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,597 | ["airflow/models/taskinstance.py", "docs/apache-airflow/concepts/connections.rst", "docs/apache-airflow/macros-ref.rst", "tests/models/test_taskinstance.py"] | Provide jinja template syntax to access connections | **Description**
Expose the connection into the jinja template context via `conn.value.<connectionname>.{host,port,login,password,extra_config,etc}`
Today it is possible to conveniently access [airflow's variables](https://airflow.apache.org/docs/apache-airflow/stable/concepts.html#variables) in jinja templates using... | https://github.com/apache/airflow/issues/14597 | https://github.com/apache/airflow/pull/16686 | 5034414208f85a8be61fe51d6a3091936fe402ba | d3ba80a4aa766d5eaa756f1fa097189978086dac | 2021-03-04T07:51:09Z | python | 2021-06-29T10:50:31Z
closed | apache/airflow | https://github.com/apache/airflow | 14,592 | ["airflow/configuration.py", "airflow/models/connection.py", "airflow/models/variable.py", "tests/core/test_configuration.py"] | Unreachable Secrets Backend Causes Web Server Crash | **Apache Airflow version**:
1.10.12
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
n/a
**Environment**:
- **Cloud provider or hardware configuration**:
Amazon MWAA
- **OS** (e.g. from /etc/os-release):
Amazon Linux (latest)
- **Kernel** (e.g. `uname -a`):
n/a
- ... | https://github.com/apache/airflow/issues/14592 | https://github.com/apache/airflow/pull/16404 | 4d4830599578ae93bb904a255fb16b81bd471ef1 | 0abbd2d918ad9027948fd8a33ebb42487e4aa000 | 2021-03-03T23:17:03Z | python | 2021-08-27T20:59:15Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,563 | ["airflow/example_dags/example_external_task_marker_dag.py", "airflow/models/dag.py", "airflow/sensors/external_task.py", "docs/apache-airflow/howto/operator/external_task_sensor.rst", "tests/sensors/test_external_task_sensor.py"] | TaskGroup Sensor | <!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.
Don't worry if they're not all applicable; just try to include what you can :-)
If you need to include code snippets or logs, please put them in fenced code
blocks. If they're super-long, please use the details ... | https://github.com/apache/airflow/issues/14563 | https://github.com/apache/airflow/pull/24902 | 0eb0b543a9751f3d458beb2f03d4c6ff22fcd1c7 | bc04c5ff0fa56e80d3d5def38b798170f6575ee8 | 2021-03-02T14:22:22Z | python | 2022-08-22T18:13:09Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,518 | ["airflow/cli/commands/cheat_sheet_command.py", "airflow/cli/commands/info_command.py", "airflow/cli/simple_table.py"] | Airflow info command doesn't work properly with pbcopy on Mac OS | Hello,
Mac OS has a command for copying data to the clipboard - `pbcopy`. Unfortunately, with the [introduction of more fancy tables](https://github.com/apache/airflow/pull/12689) to the `airflow info` command, we can no longer use the two together.
For example:
```bash
airflow info | pbcopy
```
<details>
<summary>Clipboard... | https://github.com/apache/airflow/issues/14518 | https://github.com/apache/airflow/pull/14528 | 1b0851c9b75f0d0a15427898ae49a2f67d076f81 | a1097f6f29796bd11f8ed7b3651dfeb3e40eec09 | 2021-02-27T21:07:49Z | python | 2021-02-28T15:42:33Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,517 | ["airflow/cli/cli_parser.py", "airflow/cli/simple_table.py", "docs/apache-airflow/usage-cli.rst", "docs/spelling_wordlist.txt"] | The tables are not parsable by standard linux utilities. | Hello,
I changed the format of the tables a long time ago so that they could be parsed in standard Linux tools such as AWK.
https://github.com/apache/airflow/pull/8409
For example, to list the files that contain the DAG, I could run the command below.
```
airflow dags list | grep -v "dag_id" | awk '{print $2}' |... | https://github.com/apache/airflow/issues/14517 | https://github.com/apache/airflow/pull/14546 | 8801a0cc3b39cf3d2a3e5ef6af004d763bdb0b93 | 0ef084c3b70089b9b061090f7d88ce86e3651ed4 | 2021-02-27T20:56:59Z | python | 2021-03-02T19:12:53Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,489 | ["airflow/providers/ssh/CHANGELOG.rst", "airflow/providers/ssh/hooks/ssh.py", "airflow/providers/ssh/provider.yaml"] | Add a retry with wait interval for SSH operator | <!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.
Don't worry if they're not all applicable; just try to include what you can :-)
If you need to include code snippets or logs, please put them in fenced code
blocks. If they're super-long, please use the details ... | https://github.com/apache/airflow/issues/14489 | https://github.com/apache/airflow/pull/19981 | 4a73d8f3d1f0c2cb52707901f9e9a34198573d5e | b6edc3bfa1ed46bed2ae23bb2baeefde3f9a59d3 | 2021-02-26T21:22:34Z | python | 2022-02-01T09:30:09Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,486 | ["airflow/www/static/js/tree.js"] | tree view task instances have too much left padding in webserver UI | **Apache Airflow version**: 2.0.1
Here is tree view of a dag with one task:

For some reason the task instances render partially off the page, and there's a large amount of empty space that could have b... | https://github.com/apache/airflow/issues/14486 | https://github.com/apache/airflow/pull/14566 | 8ef862eee6443cc2f34f4cc46425357861e8b96c | 3f7ebfdfe2a1fa90b0854028a5db057adacd46c1 | 2021-02-26T19:02:06Z | python | 2021-03-04T00:00:12Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,481 | ["airflow/api_connexion/schemas/dag_schema.py", "tests/api_connexion/endpoints/test_dag_endpoint.py", "tests/api_connexion/schemas/test_dag_schema.py"] | DAG /details endpoint returning empty array objects | When testing the following two endpoints, I get different results for the array of owners and tags. The former should be identical to the response of the latter endpoint.
`/api/v1/dags/{dag_id}/details`:
```json
{
"owners": [],
"tags": [
{},
{}
],
}
```
`/api/v1/dags/{dag_id}`:
```js... | https://github.com/apache/airflow/issues/14481 | https://github.com/apache/airflow/pull/14490 | 9c773bbf0174a8153720d594041f886b2323d52f | 4424d10f05fa268b54c81ef8b96a0745643690b6 | 2021-02-26T14:59:56Z | python | 2021-03-03T14:39:02Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,473 | ["airflow/www/static/js/tree.js"] | DagRun duration not visible in tree view tooltip if not currently running | On airflow 2.0.1
On tree view if dag run is running, duration shows as expected:

But if dag run is complete, duration is null:
closed | apache/airflow | https://github.com/apache/airflow | 14,469 | ... | ... | ..., we should update mainly for the `AUTH_ROLES_MAPPING` feature, which lets users bind to RBAC roles based on their LDAP/OAUTH group membership.
Here are the docs about Flask-AppBui... | https://github.com/apache/airflow/issues/14469 | https://github.com/apache/airflow/pull/14665 | b718495e4caecb753742c3eb22919411a715f24a | 97b5e4cd6c001ec1a1597606f4e9f1c0fbea20d2 | 2021-02-25T23:00:08Z | python | 2021-03-08T17:12:05Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,422 | ["airflow/jobs/local_task_job.py", "tests/jobs/test_local_task_job.py"] | on_failure_callback does not seem to fire on pod deletion/eviction | **Apache Airflow version**: 2.0.1
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): 1.16.x
**Environment**: KubernetesExecutor with single scheduler pod
**What happened**: On all previous versions we used (from 1.10.x to 2.0.0), evicting or deleting a running task pod triggered the ... | https://github.com/apache/airflow/issues/14422 | https://github.com/apache/airflow/pull/15172 | e5d69ad6f2d25e652bb34b6bcf2ce738944de407 | def1e7c5841d89a60f8972a84b83fe362a6a878d | 2021-02-24T16:55:21Z | python | 2021-04-23T22:47:20Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,421 | ["airflow/api_connexion/openapi/v1.yaml", "tests/api_connexion/endpoints/test_task_instance_endpoint.py"] | NULL values in the operator column of task_instance table cause API validation failures | **Apache Airflow version**: 2.0.1
**Environment**: Docker on Linux Mint 20.1, image based on apache/airflow:2.0.1-python3.8
**What happened**:
I'm using the airflow API and the following exception occurred:
```python
>>> import json
>>> import requests
>>> from requests.auth import HTTPBasicAuth
>>> pay... | https://github.com/apache/airflow/issues/14421 | https://github.com/apache/airflow/pull/16516 | 60925453b1da9fe54ca82ed59889cd65a0171516 | 087556f0c210e345ac1749933ff4de38e40478f6 | 2021-02-24T15:24:05Z | python | 2021-06-18T07:56:05Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,364 | ["airflow/jobs/scheduler_job.py", "tests/jobs/test_scheduler_job.py"] | Missing schedule_delay metric | <!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.
Don't worry if they're not all applicable; just try to include what you can :-)
If you need to include code snippets or logs, please put them in fenced code
blocks. If they're super-long, please use the details ... | https://github.com/apache/airflow/issues/14364 | https://github.com/apache/airflow/pull/15105 | 441b4ef19f07d8c72cd38a8565804e56e63b543c | ca4c4f3d343dea0a034546a896072b9c87244e71 | 2021-02-22T18:18:44Z | python | 2021-03-31T12:38:14Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,331 | ["airflow/api_connexion/openapi/v1.yaml", "airflow/utils/state.py", "tests/api_connexion/endpoints/test_task_instance_endpoint.py"] | Airflow stable API taskInstance call fails if a task is removed from running DAG | **Apache Airflow version**: 2.0.1
**Environment**: Docker on Win 10 with WSL, image based on `apache/airflow:2.0.1-python3.8`
**What happened**:
I'm using the airflow API and the following (what I believe to be a) bug popped up:
```Python
>>> import requests
>>> r = requests.get("http://localhost:8084/api... | https://github.com/apache/airflow/issues/14331 | https://github.com/apache/airflow/pull/14381 | ea7118316660df43dd0ac0a5e72283fbdf5f2396 | 7418679591e5df4ceaab6c471bc6d4a975201871 | 2021-02-20T13:15:11Z | python | 2021-03-08T21:24:59Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,326 | ["airflow/kubernetes/pod_generator.py", "tests/kubernetes/test_pod_generator.py"] | Task Instances stuck in "scheduled" state | <!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.
Don't worry if they're not all applicable; just try to include what you can :-)
If you need to include code snippets or logs, please put them in fenced code
blocks. If they're super-long, please use the details ... | https://github.com/apache/airflow/issues/14326 | https://github.com/apache/airflow/pull/14703 | b1ce429fee450aef69a813774bf5d3404d50f4a5 | b5e7ada34536259e21fca5032ef67b5e33722c05 | 2021-02-20T00:11:56Z | python | 2021-03-26T14:41:18Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,299 | ["airflow/www/templates/airflow/dag_details.html"] | UI: Start Date is incorrect in "DAG Details" view | **Apache Airflow version**: 2.0.0
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release): Ubuntu
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
**What happen... | https://github.com/apache/airflow/issues/14299 | https://github.com/apache/airflow/pull/16206 | 78c4f1a46ce74f13a99447207f8cdf0fcfc7df95 | ebc03c63af7282c9d826054b17fe7ed50e09fe4e | 2021-02-18T20:14:50Z | python | 2021-06-08T14:13:18Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,279 | ["airflow/providers/amazon/aws/example_dags/example_s3_bucket.py", "airflow/providers/amazon/aws/example_dags/example_s3_bucket_tagging.py", "airflow/providers/amazon/aws/hooks/s3.py", "airflow/providers/amazon/aws/operators/s3_bucket.py", "airflow/providers/amazon/aws/operators/s3_bucket_tagging.py", "airflow/provider... | Add AWS S3 Bucket Tagging Operator | **Description**
Add the missing AWS Operators for the three (get/put/delete) AWS S3 bucket tagging APIs, including testing.
**Use case / motivation**
I am looking to add an Operator that will implement the existing API functionality to manage the tags on an AWS S3 bucket.
**Are you willing to submit a PR?**... | https://github.com/apache/airflow/issues/14279 | https://github.com/apache/airflow/pull/14402 | f25ec3368348be479dde097efdd9c49ce56922b3 | 8ced652ecf847ed668e5eed27e3e47a51a27b1c8 | 2021-02-17T17:07:01Z | python | 2021-02-28T20:50:11Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,270 | ["airflow/task/task_runner/standard_task_runner.py", "tests/task/task_runner/test_standard_task_runner.py"] | Specify that exit code -9 is due to RAM | Related to https://github.com/apache/airflow/issues/9655
It would be nice to add a message with some info when you get this error, like 'This is probably because of a lack of RAM' or something like that.
I have found the code where the -9 is assigned but have no idea how to add a logging message.
self.p... | https://github.com/apache/airflow/issues/14270 | https://github.com/apache/airflow/pull/15207 | eae22cec9c87e8dad4d6e8599e45af1bdd452062 | 18e2c1de776c8c3bc42c984ea0d31515788b6572 | 2021-02-17T09:01:05Z | python | 2021-04-06T19:02:11Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,252 | ["airflow/models/baseoperator.py", "tests/core/test_core.py"] | Unable to clear Failed task with retries |
**Apache Airflow version**: 2.0.1
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): NA
**Environment**: Windows WSL2 (Ubuntu) Local
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release): Ubuntu 18.04
- **Kernel** (e.g. `uname -a`): Linux d255bce4... | https://github.com/apache/airflow/issues/14252 | https://github.com/apache/airflow/pull/16415 | 643f3c35a6ba3def40de7db8e974c72e98cfad44 | 15ff2388e8a52348afcc923653f85ce15a3c5f71 | 2021-02-15T22:27:00Z | python | 2021-06-13T00:29:14Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,202 | ["chart/templates/scheduler/scheduler-deployment.yaml"] | Scheduler in helm chart cannot access DAG with git sync | <!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.
Don't worry if they're not all applicable; just try to include what you can :-)
If you need to include code snippets or logs, please put them in fenced code
blocks. If they're super-long, please use the details ... | https://github.com/apache/airflow/issues/14202 | https://github.com/apache/airflow/pull/14203 | 8f21fb1bf77fc67e37dc13613778ff1e6fa87cea | e164080479775aca53146331abf6f615d1f03ff0 | 2021-02-12T06:56:10Z | python | 2021-02-19T01:03:39Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,200 | ["docs/apache-airflow/best-practices.rst", "docs/apache-airflow/security/index.rst", "docs/apache-airflow/security/secrets/secrets-backend/index.rst"] | Update Best practises doc | Update https://airflow.apache.org/docs/apache-airflow/stable/best-practices.html#variables to use a Secret Backend (especially Environment Variables), as it asks users not to use Variables at the top level | https://github.com/apache/airflow/issues/14200 | https://github.com/apache/airflow/pull/17319 | bcf719bfb49ca20eea66a2527307968ff290c929 | 2c1880a90712aa79dd7c16c78a93b343cd312268 | 2021-02-11T19:31:08Z | python | 2021-08-02T20:43:12Z
closed | apache/airflow | https://github.com/apache/airflow | 14,106 | ["airflow/lineage/__init__.py", "airflow/lineage/backend.py", "docs/apache-airflow/lineage.rst", "tests/lineage/test_lineage.py"] | Lineage Backend removed for no reason | <!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.
Don't worry if they're not all applicable; just try to include what you can :-)
If you need to include code snippets or logs, please put them in fenced code
blocks. If they're super-long, please use the details ... | https://github.com/apache/airflow/issues/14106 | https://github.com/apache/airflow/pull/14146 | 9ac1d0a3963b0e152cb2ba4a58b14cf6b61a73a0 | af2d11e36ed43b0103a54780640493b8ae46d70e | 2021-02-05T16:47:46Z | python | 2021-04-03T08:26:59Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,104 | ["airflow/config_templates/config.yml", "airflow/config_templates/default_airflow.cfg"] | BACKEND: Unbound Variable issue in docker entrypoint | This is NOT a bug in Airflow, I'm writing this issue for documentation should someone come across this same issue and need to identify how to solve it. Please tag as appropriate.
**Apache Airflow version**: Docker 2.0.1rc2
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): N/A
**En... | https://github.com/apache/airflow/issues/14104 | https://github.com/apache/airflow/pull/14124 | d77f79d134e0d14443f75325b24dffed4b779920 | b151b5eea5057f167bf3d2f13a84ab4eb8e42734 | 2021-02-05T15:31:07Z | python | 2021-03-22T15:42:37Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,097 | ["UPDATING.md", "airflow/contrib/sensors/gcs_sensor.py", "airflow/providers/google/BACKPORT_PROVIDER_README.md", "airflow/providers/google/cloud/sensors/gcs.py", "tests/always/test_project_structure.py", "tests/deprecated_classes.py", "tests/providers/google/cloud/sensors/test_gcs.py"] | Typo in Sensor: GCSObjectsWtihPrefixExistenceSensor (should be GCSObjectsWithPrefixExistenceSensor) | Typo in Google Cloud Storage sensor: airflow/providers/google/cloud/sensors/gcs/GCSObjectsWithPrefixExistenceSensor
The word _With_ is spelled incorrectly. It should be: GCSObjects**With**PrefixExistenceSensor
**Apache Airflow version**: 2.0.0
**Environment**:
- **Cloud provider or hardware configuration**: Goo... | https://github.com/apache/airflow/issues/14097 | https://github.com/apache/airflow/pull/14179 | 6dc6339635f41a9fa50a987c4fdae5af0bae9fdc | e3bcaa3ba351234effe52ad380345c4e39003fcb | 2021-02-05T12:13:09Z | python | 2021-02-12T20:14:00Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,077 | ["airflow/providers/google/marketing_platform/hooks/display_video.py", "airflow/providers/google/marketing_platform/operators/display_video.py", "tests/providers/google/marketing_platform/hooks/test_display_video.py", "tests/providers/google/marketing_platform/operators/test_display_video.py"] | GoogleDisplayVideo360Hook.download_media does not pass the resourceName correctly | **Apache Airflow version**: 1.10.12
**Environment**: Google Cloud Composer 1.13.3
- **Cloud provider or hardware configuration**:
- Google Cloud Composer
**What happened**:
The GoogleDisplayVideo360Hook.download_media hook tries to download media using the "resource_name" argument. However, [per the API spe... | https://github.com/apache/airflow/issues/14077 | https://github.com/apache/airflow/pull/20528 | af4a2b0240fbf79a0a6774a9662243050e8fea9c | a6e60ce25d9f3d621a7b4089834ca5e50cd123db | 2021-02-04T16:35:25Z | python | 2021-12-30T12:48:55Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,071 | ["airflow/providers/jenkins/operators/jenkins_job_trigger.py", "tests/providers/jenkins/operators/test_jenkins_job_trigger.py"] | Add support for UNSTABLE Jenkins status | **Description**
Don't mark dag as `failed` when `UNSTABLE` status received from Jenkins.
It can be done by adding `allow_unstable: bool` or `success_status_values: list` parameter to `JenkinsJobTriggerOperator.__init__`. For now `SUCCESS` status is hardcoded, any other lead to fail.
**Use case / motivation**
... | https://github.com/apache/airflow/issues/14071 | https://github.com/apache/airflow/pull/14131 | f180fa13bf2a0ffa31b30bb21468510fe8a20131 | 78adaed5e62fa604d2ef2234ad560eb1c6530976 | 2021-02-04T15:20:47Z | python | 2021-02-08T21:43:39Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,051 | ["docs/build_docs.py", "docs/exts/docs_build/spelling_checks.py", "docs/spelling_wordlist.txt"] | Docs Builder creates SpellingError for Sphinx error unrelated to spelling issues | <!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.
Don't worry if they're not all applicable; just try to include what you can :-)
If you need to include code snippets or logs, please put them in fenced code
blocks. If they're super-long, please use the details ... | https://github.com/apache/airflow/issues/14051 | https://github.com/apache/airflow/pull/14196 | e31b27d593f7379f38ced34b6e4ce8947b91fcb8 | cb4a60e9d059eeeae02909bb56a348272a55c233 | 2021-02-03T16:46:25Z | python | 2021-02-12T23:46:23Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,050 | ["airflow/jobs/scheduler_job.py", "airflow/serialization/serialized_objects.py", "tests/jobs/test_scheduler_job.py", "tests/serialization/test_dag_serialization.py"] | SLA mechanism does not work | **Apache Airflow version**: 2.0.0
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release):
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
**What happened**:
... | https://github.com/apache/airflow/issues/14050 | https://github.com/apache/airflow/pull/14056 | 914e9ce042bf29dc50d410f271108b1e42da0add | 604a37eee50715db345c5a7afed085c9afe8530d | 2021-02-03T14:58:32Z | python | 2021-02-04T01:59:31Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,046 | ["airflow/www/templates/airflow/tree.html"] | Day change flag is in wrong place | **Apache Airflow version**: 2.0
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release):
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
**What happened**:
... | https://github.com/apache/airflow/issues/14046 | https://github.com/apache/airflow/pull/14141 | 0f384f0644c8cfe55ca4c75d08b707be699b440f | 6dc6339635f41a9fa50a987c4fdae5af0bae9fdc | 2021-02-03T13:19:58Z | python | 2021-02-12T18:50:02Z |
closed | apache/airflow | https://github.com/apache/airflow | 14,010 | ["airflow/www/templates/airflow/task.html"] | Order of items not preserved in Task instance view | **Apache Airflow version**: 2.0.0
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release):
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
**What happened**:
... | https://github.com/apache/airflow/issues/14010 | https://github.com/apache/airflow/pull/14036 | 68758b826076e93fadecf599108a4d304dd87ac7 | fc67521f31a0c9a74dadda8d5f0ac32c07be218d | 2021-02-01T15:51:38Z | python | 2021-02-05T15:38:13Z |
closed | apache/airflow | https://github.com/apache/airflow | 13,989 | ["airflow/providers/telegram/operators/telegram.py", "tests/providers/telegram/operators/test_telegram.py"] | AttributeError: 'TelegramOperator' object has no attribute 'text' | Hi there 👋
I was playing with the **TelegramOperator** and stumbled upon a bug with the `text` field. It is supposed to be a template field but in reality the instance of the **TelegramOperator** does not have this attribute thus every time I try to execute code I get the error:
> AttributeError: 'TelegramOperat... | https://github.com/apache/airflow/issues/13989 | https://github.com/apache/airflow/pull/13990 | 9034f277ef935df98b63963c824ba71e0dcd92c7 | 106d2c85ec4a240605830bf41962c0197b003135 | 2021-01-30T19:25:35Z | python | 2021-02-10T12:06:04Z |
closed | apache/airflow | https://github.com/apache/airflow | 13,985 | ["airflow/www/static/js/connection_form.js"] | Can't save any connection if provider-provided connection form widgets have fields marked as InputRequired | **Apache Airflow version**: 2.0.0 with the following patch: https://github.com/apache/airflow/pull/13640
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): N/A
**Environment**:
- **Cloud provider or hardware configuration**: AMD Ryzen 3900X (12C/24T), 64GB RAM
- **OS** (e.g. from ... | https://github.com/apache/airflow/issues/13985 | https://github.com/apache/airflow/pull/14052 | f9c9e9c38f444a39987478f3d1a262db909de8c4 | 98bbe5aec578a012c1544667bf727688da1dadd4 | 2021-01-30T16:21:53Z | python | 2021-02-11T13:59:21Z |
closed | apache/airflow | https://github.com/apache/airflow | 13,924 | ["scripts/in_container/_in_container_utils.sh"] | Improve error messages and propagation in CI builds | Airflow version: dev
The error information in `Backport packages: wheel` is not that easy to find.
Here is the end of the step that failed and end of its log:
<img width="1151" alt="Screenshot 2021-01-27 at 12 02 01" src="https://user-images.githubusercontent.com/9528307/105982515-aa64e800-6097-11eb-91c8-9911448... | https://github.com/apache/airflow/issues/13924 | https://github.com/apache/airflow/pull/15190 | 041a09f3ee6bc447c3457b108bd5431a2fd70ad9 | 7c17bf0d1e828b454a6b2c7245ded275b313c792 | 2021-01-27T11:07:09Z | python | 2021-04-04T20:20:11Z |
closed | apache/airflow | https://github.com/apache/airflow | 13,918 | ["airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py", "kubernetes_tests/test_kubernetes_pod_operator.py", "kubernetes_tests/test_kubernetes_pod_operator_backcompat.py", "tests/providers/cncf/kubernetes/operators/test_kubernetes_pod.py"] | KubernetesPodOperator with pod_template_file = No Metadata & Wrong Pod Name | **Apache Airflow version**: 2.0.0
**Kubernetes version (if you are using kubernetes)** 1.15.15
**What happened**:
If you use the **KubernetesPodOperator** with **LocalExecutor** and you use a **pod_template_file**, the pod created doesn't have metadata like :
- dag_id
- task_id
- ...
I want to have a ``... | https://github.com/apache/airflow/issues/13918 | https://github.com/apache/airflow/pull/15492 | def1e7c5841d89a60f8972a84b83fe362a6a878d | be421a6b07c2ae9167150b77dc1185a94812b358 | 2021-01-26T20:27:09Z | python | 2021-04-23T22:54:43Z |
closed | apache/airflow | https://github.com/apache/airflow | 13,905 | ["setup.py"] | DockerOperator fails to pull an image | **Apache Airflow version**: 2.0
**Environment**:
- **OS** (from /etc/os-release): Debian GNU/Linux 10 (buster)
- **Kernel** (`uname -a`): Linux 37365fa0b59b 5.4.0-47-generic #51-Ubuntu SMP Fri Sep 4 19:50:52 UTC 2020 x86_64 GNU/Linux
- **Others**: running inside a docker container, forked puckel/docker-airflow
... | https://github.com/apache/airflow/issues/13905 | https://github.com/apache/airflow/pull/15731 | 7933aaf07f5672503cfd83361b00fda9d4c281a3 | 41930fdebfaa7ed2c53e7861c77a83312ca9bdc4 | 2021-01-26T05:49:03Z | python | 2021-05-09T21:05:49Z |
closed | apache/airflow | https://github.com/apache/airflow | 13,891 | ["airflow/api_connexion/endpoints/dag_run_endpoint.py", "airflow/migrations/versions/2c6edca13270_resource_based_permissions.py", "airflow/www/templates/airflow/dags.html", "airflow/www/views.py", "docs/apache-airflow/security/access-control.rst", "tests/api_connexion/endpoints/test_dag_run_endpoint.py", "tests/www/tes... | RBAC Granular DAG Permissions don't work as intended | Previous versions (before 2.0) allowed for granular can_edit DAG permissions so that different user groups can trigger different DAGs and access control is more specific. Since 2.0 it seems that this does not work as expected.
How to reproduce:
Create a copy of the VIEWER role, try adding it can dag edit on a speci... | https://github.com/apache/airflow/issues/13891 | https://github.com/apache/airflow/pull/13922 | 568327f01a39d6f181dda62ef6a143f5096e6b97 | 629abfdbab23da24ca45996aaaa6e3aa094dd0de | 2021-01-25T13:55:12Z | python | 2021-02-03T03:16:18Z |
closed | apache/airflow | https://github.com/apache/airflow | 13,805 | ["airflow/cli/commands/task_command.py"] | Could not get scheduler_job_id | **Apache Airflow version:**
2.0.0
**Kubernetes version (if you are using kubernetes) (use kubectl version):**
1.18.3
**Environment:**
Cloud provider or hardware configuration: AWS
**What happened:**
When trying to run a DAG, it gets scheduled, but task is never run. When attempting to run task manu... | https://github.com/apache/airflow/issues/13805 | https://github.com/apache/airflow/pull/16108 | 436e0d096700c344e7099693d9bf58e12658f9ed | cdc9f1a33854254607fa81265a323cf1eed6d6bb | 2021-01-21T10:09:05Z | python | 2021-05-27T12:50:03Z |
closed | apache/airflow | https://github.com/apache/airflow | 13,799 | ["airflow/migrations/versions/8646922c8a04_change_default_pool_slots_to_1.py", "airflow/models/taskinstance.py"] | Scheduler crashes when unpausing some dags with: TypeError: '>' not supported between instances of 'NoneType' and 'int' | **Apache Airflow version**:
2.0.0
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
1.15
**Environment**:
- **Cloud provider or hardware configuration**:
GKE
- **OS** (e.g. from /etc/os-release):
Ubuntu 18.04
**What happened**:
I just migrated from 1.10.14 to 2.0.0. When I t... | https://github.com/apache/airflow/issues/13799 | https://github.com/apache/airflow/pull/14406 | c069e64920da780237a1e1bdd155319b007a2587 | f763b7c3aa9cdac82b5d77e21e1840fbe931257a | 2021-01-20T22:08:00Z | python | 2021-02-25T02:56:40Z |
closed | apache/airflow | https://github.com/apache/airflow | 13,774 | ["airflow/providers/amazon/aws/operators/s3_copy_object.py"] | add acl_policy to S3CopyObjectOperator | <!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.
Don't worry if they're not all applicable; just try to include what you can :-)
If you need to include code snippets or logs, please put them in fenced code
blocks. If they're super-long, please use the details ... | https://github.com/apache/airflow/issues/13774 | https://github.com/apache/airflow/pull/13773 | 9923d606d2887c52390a30639fc1ee0d4000149c | 29730d720066a4c16d524e905de8cdf07e8cd129 | 2021-01-19T21:53:18Z | python | 2021-01-20T15:16:25Z |
closed | apache/airflow | https://github.com/apache/airflow | 13,761 | ["airflow/example_dags/tutorial.py", "airflow/models/baseoperator.py", "airflow/serialization/schema.json", "airflow/www/utils.py", "airflow/www/views.py", "docs/apache-airflow/concepts.rst", "tests/serialization/test_dag_serialization.py", "tests/www/test_utils.py"] | Markdown from doc_md is not being rendered in ui | **Apache Airflow version**: 1.10.14
**Environment**:
- **Cloud provider or hardware configuration**: docker
- **OS** (e.g. from /etc/os-release): apache/airflow:1.10.14-python3.8
- **Kernel** (e.g. `uname -a`): Linux host 5.4.0-62-generic #70-Ubuntu SMP Tue Jan 12 12:45:47 UTC 2021 x86_64 x86_64 x86_64 GNU/Linu... | https://github.com/apache/airflow/issues/13761 | https://github.com/apache/airflow/pull/15191 | 7c17bf0d1e828b454a6b2c7245ded275b313c792 | e86f5ca8fa5ff22c1e1f48addc012919034c672f | 2021-01-19T08:10:12Z | python | 2021-04-05T02:46:41Z |