Opens an SSH connection to the remote host.
def get_conn(self): """ Opens an SSH connection to the remote host. :rtype: paramiko.client.SSHClient """ self.log.debug('Creating SSH client for conn_id: %s', self.ssh_conn_id) client = paramiko.SSHClient() if not self.allow_host_key_change: self.log...
Creates a tunnel between two hosts, like ssh -L <LOCAL_PORT>:host:<REMOTE_PORT>.
def get_tunnel(self, remote_port, remote_host="localhost", local_port=None): """ Creates a tunnel between two hosts. Like ssh -L <LOCAL_PORT>:host:<REMOTE_PORT>. :param remote_port: The remote port to create a tunnel to :type remote_port: int :param remote_host: The remote host ...
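For illustration, a minimal sketch of the same local port forwarding done directly with the sshtunnel package; the hostname, ports, and username below are placeholders, not values from the hook.

from sshtunnel import SSHTunnelForwarder

# Placeholder host and credentials; roughly equivalent to
# ssh -L 5000:localhost:5432 airflow@remote.example.com
tunnel = SSHTunnelForwarder(
    ("remote.example.com", 22),
    ssh_username="airflow",
    local_bind_address=("localhost", 5000),
    remote_bind_address=("localhost", 5432),
)
tunnel.start()
# ... connect to localhost:5000, which now forwards to port 5432 on the remote ...
tunnel.stop()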
Creates a transfer job that runs periodically.
def create_transfer_job(self, body): """ Creates a transfer job that runs periodically. :param body: (Required) A request body, as described in https://cloud.google.com/storage-transfer/docs/reference/rest/v1/transferJobs/patch#request-body :type body: dict :return: ...
Gets the latest state of a long-running operation in Google Storage Transfer Service.
def get_transfer_job(self, job_name, project_id=None): """ Gets the latest state of a long-running operation in Google Storage Transfer Service. :param job_name: (Required) Name of the job to be fetched :type job_name: str :param project_id: (Optional) the ID of the proj...
Lists long-running operations in Google Storage Transfer Service that match the specified filter.
def list_transfer_job(self, filter): """ Lists long-running operations in Google Storage Transfer Service that match the specified filter. :param filter: (Required) A request filter, as described in https://cloud.google.com/storage-transfer/docs/reference/rest/v1/transferJob...
Updates a transfer job that runs periodically.
def update_transfer_job(self, job_name, body): """ Updates a transfer job that runs periodically. :param job_name: (Required) Name of the job to be updated :type job_name: str :param body: A request body, as described in https://cloud.google.com/storage-transfer/docs...
Deletes a transfer job. This is a soft delete. After a transfer job is deleted, the job and all the transfer executions are subject to garbage collection. Transfer jobs become eligible for garbage collection 30 days after soft delete.
def delete_transfer_job(self, job_name, project_id): """ Deletes a transfer job. This is a soft delete. After a transfer job is deleted, the job and all the transfer executions are subject to garbage collection. Transfer jobs become eligible for garbage collection 30 days after s...
Cancels a transfer operation in Google Storage Transfer Service.
def cancel_transfer_operation(self, operation_name): """ Cancels a transfer operation in Google Storage Transfer Service. :param operation_name: Name of the transfer operation. :type operation_name: str :rtype: None """ self.get_conn().transferOperations().cance...
Gets a transfer operation in Google Storage Transfer Service.
def get_transfer_operation(self, operation_name): """ Gets a transfer operation in Google Storage Transfer Service. :param operation_name: (Required) Name of the transfer operation. :type operation_name: str :return: transfer operation See: https://cloud...
Lists transfer operations in Google Storage Transfer Service that match the specified filter.
def list_transfer_operations(self, filter): """ Lists transfer operations in Google Storage Transfer Service that match the specified filter. :param filter: (Required) A request filter, as described in https://cloud.google.com/storage-transfer/docs/reference/rest/v1/transferJobs/list#body.QUERY_PARAMETERS.filter ...
Pauses a transfer operation in Google Storage Transfer Service.
def pause_transfer_operation(self, operation_name): """ Pauses a transfer operation in Google Storage Transfer Service. :param operation_name: (Required) Name of the transfer operation. :type operation_name: str :rtype: None """ self.get_conn().transferOperation...
Resumes a transfer operation in Google Storage Transfer Service.
def resume_transfer_operation(self, operation_name): """ Resumes a transfer operation in Google Storage Transfer Service. :param operation_name: (Required) Name of the transfer operation. :type operation_name: str :rtype: None """ self.get_conn().transferOperati...
Waits until the job reaches the expected state.
def wait_for_transfer_job(self, job, expected_statuses=(GcpTransferOperationStatus.SUCCESS,), timeout=60): """ Waits until the job reaches the expected state. :param job: Transfer job See: https://cloud.google.com/storage-transfer/docs/reference/rest/v1/transferJobs#Tran...
Checks whether the operation list has an operation with the expected status and, if so, returns true. If it encounters operations in FAILED or ABORTED state, it raises :class:`airflow.exceptions.AirflowException`.
def operations_contain_expected_statuses(operations, expected_statuses): """ Checks whether the operation list has an operation with the expected status and, if so, returns true. If it encounters operations in FAILED or ABORTED state, it raises :class:`airflow.exceptions.AirflowException`. ...
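A sketch of the described check, assuming each operation dict carries its status under metadata["status"] as in the Transfer Service REST responses; the helper name is illustrative.

from airflow.exceptions import AirflowException

def contains_expected_statuses_sketch(operations, expected_statuses):
    # Collect the statuses seen across all operations.
    current_statuses = {op["metadata"]["status"] for op in operations}
    # Any overlap with the expected set means success.
    if current_statuses & set(expected_statuses):
        return True
    # Failure states abort the wait immediately.
    if current_statuses & {"FAILED", "ABORTED"}:
        raise AirflowException(
            "An unexpected operation status was encountered: {}".format(current_statuses))
    return False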
Returns all task reschedules for the task instance and try number in ascending order.
def find_for_task_instance(task_instance, session): """ Returns all task reschedules for the task instance and try number, in ascending order. :param task_instance: the task instance to find task reschedules for :type task_instance: airflow.models.TaskInstance """ ...
Kubernetes only supports lowercase alphanumeric characters and "-" and "." in the pod name. However, there are special rules about how "-" and "." can be used, so let's only keep alphanumeric chars. See here for detail: https://kubernetes.io/docs/concepts/overview/working-with-objects/names/
def _strip_unsafe_kubernetes_special_chars(string): """ Kubernetes only supports lowercase alphanumeric characters and "-" and "." in the pod name. However, there are special rules about how "-" and "." can be used, so let's only keep alphanumeric chars; see here for detail...
Kubernetes pod names must be <= 253 chars and must pass the following regex for validation: ^[a-z0-9]([-a-z0-9]*[a-z0-9])?(\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*$
def _make_safe_pod_id(safe_dag_id, safe_task_id, safe_uuid): """ Kubernetes pod names must be <= 253 chars and must pass the following regex for validation "^[a-z0-9]([-a-z0-9]*[a-z0-9])?(\\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*$" :param safe_dag_id: a dag_id with only alphanumeric ch...
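A sketch of the truncate-and-suffix approach this implies; the helper name, constant, and exact slicing are illustrative, not the verbatim implementation.

import re

MAX_POD_ID_LEN = 253  # Kubernetes limit for DNS-subdomain names

def make_safe_pod_id_sketch(safe_dag_id, safe_task_id, safe_uuid):
    # Truncate the dag/task key so that, with the "-<uuid>" suffix appended,
    # the whole pod id stays within the 253-character limit.
    safe_key = safe_dag_id + safe_task_id
    return safe_key[:MAX_POD_ID_LEN - len(safe_uuid) - 1] + "-" + safe_uuid

pod_id = make_safe_pod_id_sketch("mydag", "mytask", "abc123def")
assert re.match(r"^[a-z0-9]([-a-z0-9]*[a-z0-9])?(\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*$", pod_id)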
Valid label values must be 63 characters or less and must be empty or begin and end with an alphanumeric character ([a-z0-9A-Z]), with dashes (-), underscores (_), dots (.), and alphanumerics between.
def _make_safe_label_value(string): """ Valid label values must be 63 characters or less and must be empty or begin and end with an alphanumeric character ([a-z0-9A-Z]) with dashes (-), underscores (_), dots (.), and alphanumerics between. If the label value is then greater than...
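One plausible way to satisfy those constraints, sketched here with an md5 suffix for over-long values; the exact substitution regex and hash length are assumptions, not the verbatim implementation.

import hashlib
import re

MAX_LABEL_LEN = 63

def make_safe_label_value_sketch(string):
    # Collapse runs of disallowed characters to "-", then trim characters
    # that may not start or end a label value.
    safe_label = re.sub(r"[^a-z0-9A-Z]+", "-", string).strip("-_.")
    if len(safe_label) > MAX_LABEL_LEN:
        # Too long: keep a prefix and append a short stable hash for uniqueness.
        digest = hashlib.md5(string.encode()).hexdigest()[:9]
        safe_label = safe_label[:MAX_LABEL_LEN - len(digest) - 1] + "-" + digest
    return safe_label

print(make_safe_label_value_sketch("my_dag.with/odd*chars" * 5))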
If the airflow scheduler restarts with pending "Queued" tasks, the tasks may or may not have been launched. Thus, on starting up the scheduler, let's check every "Queued" task to see if it has been launched (i.e. if there is a corresponding pod on Kubernetes).
def clear_not_launched_queued_tasks(self, session=None): """ If the airflow scheduler restarts with pending "Queued" tasks, the tasks may or may not have been launched. Thus, on starting up the scheduler, let's check every "Queued" task to see if it has been launched (ie: i...
Returns the number of slots open at the moment
def open_slots(self, session): """ Returns the number of slots open at the moment """ from airflow.models.taskinstance import \ TaskInstance as TI # Avoid circular import used_slots = session.query(func.count()).filter(TI.pool == self.pool).filter( TI.st...
Expands (potentially nested) env vars by repeatedly applying expandvars and expanduser until interpolation stops having any effect.
def expand_env_var(env_var): """ Expands (potentially nested) env vars by repeatedly applying `expandvars` and `expanduser` until interpolation stops having any effect. """ if not env_var: return env_var while True: interpolated = os.path.expanduser(os.path.expandvars(str(env...
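The truncated body above is a fixed-point loop; a self-contained, runnable sketch:

import os

def expand_env_var_sketch(env_var):
    if not env_var:
        return env_var
    # Re-apply expansion until the string stops changing (a fixed point),
    # which handles env vars whose values contain further env vars or "~".
    while True:
        interpolated = os.path.expanduser(os.path.expandvars(str(env_var)))
        if interpolated == env_var:
            return interpolated
        env_var = interpolated

os.environ["BASE"] = "~/airflow"
print(expand_env_var_sketch("$BASE/dags"))  # e.g. /home/<user>/airflow/dags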
Runs command and returns stdout
def run_command(command): """ Runs command and returns stdout """ process = subprocess.Popen( shlex.split(command), stdout=subprocess.PIPE, stderr=subprocess.PIPE, close_fds=True) output, stderr = [stream.decode(sys.getdefaultencoding(), 'ignore') ...
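A runnable sketch completing the truncated body; the RuntimeError is a stand-in for the AirflowConfigException the real helper raises on a non-zero exit code.

import shlex
import subprocess
import sys

def run_command_sketch(command):
    # Split the command shell-style, capture both streams, and decode them.
    process = subprocess.Popen(
        shlex.split(command),
        stdout=subprocess.PIPE, stderr=subprocess.PIPE, close_fds=True)
    output, stderr = [
        stream.decode(sys.getdefaultencoding(), 'ignore')
        for stream in process.communicate()
    ]
    if process.returncode != 0:
        raise RuntimeError("Command failed: {}\n{}".format(command, stderr))
    return output

print(run_command_sketch("echo hello"))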
Generates a configuration from the provided template + variables defined in current scope. :param template: a config content templated with {{variables}}
def parameterized_config(template): """ Generates a configuration from the provided template + variables defined in current scope :param template: a config content templated with {{variables}} """ all_vars = {k: v for d in [globals(), locals()] for k, v in d.items()} return template.format(*...
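The trick here is merging globals() and locals() into one mapping for str.format; a minimal illustration with a made-up template and variable (the real helper runs inside airflow.configuration, where AIRFLOW_HOME and friends are in scope):

AIRFLOW_HOME = "/opt/airflow"  # stand-in for the variables in the real scope

def parameterized_config_sketch(template):
    # Merge module globals and function locals into a single mapping,
    # so the template can reference any name visible in this scope.
    all_vars = {k: v for d in [globals(), locals()] for k, v in d.items()}
    return template.format(**all_vars)

print(parameterized_config_sketch("[core]\ndags_folder = {AIRFLOW_HOME}/dags"))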
Remove an option if it exists in config from a file or default config. If both configs have the same option, this removes the option in both configs unless remove_default=False.
def remove_option(self, section, option, remove_default=True): """ Remove an option if it exists in config from a file or default config. If both configs have the same option, this removes the option in both configs unless remove_default=False. """ if super().has_option...
Returns the section as a dict. Values are converted to int, float, or bool as required.
def getsection(self, section): """ Returns the section as a dict. Values are converted to int, float, bool as required. :param section: section from the config :rtype: dict """ if (section not in self._sections and section not in self.airflow_defa...
Returns the current configuration as an OrderedDict of OrderedDicts. :param display_source: If False, the option value is returned. If True, a tuple of (option_value, source) is returned. Source is either 'airflow.cfg', 'default', 'env var' or 'cmd'. :type display_source: bool :param display_sensitive: If True, the values of opti...
def as_dict( self, display_source=False, display_sensitive=False, raw=False): """ Returns the current configuration as an OrderedDict of OrderedDicts. :param display_source: If False, the option value is returned. If True, a tuple of (option_value, source) is returned. So...
Allocate IDs for incomplete keys.
def allocate_ids(self, partial_keys): """ Allocate IDs for incomplete keys. .. seealso:: https://cloud.google.com/datastore/docs/reference/rest/v1/projects/allocateIds :param partial_keys: a list of partial keys. :type partial_keys: list :return: a list of f...
Begins a new transaction.
def begin_transaction(self): """ Begins a new transaction. .. seealso:: https://cloud.google.com/datastore/docs/reference/rest/v1/projects/beginTransaction :return: a transaction handle. :rtype: str """ conn = self.get_conn() resp = (conn ...
Commit a transaction, optionally creating, deleting or modifying some entities.
def commit(self, body): """ Commit a transaction, optionally creating, deleting or modifying some entities. .. seealso:: https://cloud.google.com/datastore/docs/reference/rest/v1/projects/commit :param body: the body of the commit request. :type body: dict :...
Lookup some entities by key.
def lookup(self, keys, read_consistency=None, transaction=None): """ Lookup some entities by key. .. seealso:: https://cloud.google.com/datastore/docs/reference/rest/v1/projects/lookup :param keys: the keys to lookup. :type keys: list :param read_consistency...
Roll back a transaction.
def rollback(self, transaction): """ Roll back a transaction. .. seealso:: https://cloud.google.com/datastore/docs/reference/rest/v1/projects/rollback :param transaction: the transaction to roll back. :type transaction: str """ conn = self.get_conn()...
Run a query for entities.
def run_query(self, body): """ Run a query for entities. .. seealso:: https://cloud.google.com/datastore/docs/reference/rest/v1/projects/runQuery :param body: the body of the query request. :type body: dict :return: the batch of query results. :rtype...
Gets the latest state of a long-running operation.
def get_operation(self, name): """ Gets the latest state of a long-running operation. .. seealso:: https://cloud.google.com/datastore/docs/reference/data/rest/v1/projects.operations/get :param name: the name of the operation resource. :type name: str :return...
Deletes the long-running operation.
def delete_operation(self, name): """ Deletes the long-running operation. .. seealso:: https://cloud.google.com/datastore/docs/reference/data/rest/v1/projects.operations/delete :param name: the name of the operation resource. :type name: str :return: none if...
Poll backup operation state until it's completed.
def poll_operation_until_done(self, name, polling_interval_in_seconds): """ Poll backup operation state until it's completed. :param name: the name of the operation resource :type name: str :param polling_interval_in_seconds: The number of seconds to wait before calling another ...
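A sketch of that polling loop, assuming the operation resource carries the standard long-running "done" flag; get_operation is passed in here only to keep the example self-contained.

import time

def poll_operation_until_done_sketch(get_operation, name, polling_interval_in_seconds):
    # Fetch the operation repeatedly, sleeping between attempts,
    # until the API marks it as done.
    while True:
        result = get_operation(name)
        if result.get("done"):
            return result
        time.sleep(polling_interval_in_seconds)

# Trivial usage with a fake fetcher that is already done:
print(poll_operation_until_done_sketch(lambda n: {"done": True}, "op-1", 10))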
Export entities from Cloud Datastore to Cloud Storage for backup.
def export_to_storage_bucket(self, bucket, namespace=None, entity_filter=None, labels=None): """ Export entities from Cloud Datastore to Cloud Storage for backup. .. note:: Keep in mind that this requests the Admin API not the Data API. .. seealso:: https://clou...
Import a backup from Cloud Storage to Cloud Datastore.
def import_from_storage_bucket(self, bucket, file, namespace=None, entity_filter=None, labels=None): """ Import a backup from Cloud Storage to Cloud Datastore. .. note:: Keep in mind that this requests the Admin API not the Data API. .. seealso:: https://cloud.g...
Publish a message to a topic or an endpoint.
def publish_to_target(self, target_arn, message): """ Publish a message to a topic or an endpoint. :param target_arn: either a TopicArn or an EndpointArn :type target_arn: str :param message: the default message you want to send :param message: str """ c...
Fetch the hostname using the callable from the config or using socket.getfqdn as a fallback.
def get_hostname(): """ Fetch the hostname using the callable from the config or using `socket.getfqdn` as a fallback. """ # First we attempt to fetch the callable path from the config. try: callable_path = conf.get('core', 'hostname_callable') except AirflowConfigException: ...
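A sketch of the lookup-then-fallback logic; the "module:attr" path format mirrors how core.hostname_callable is conventionally written, and the example path below is just for demonstration.

import importlib
import socket

def get_hostname_sketch(callable_path=None):
    # No callable configured: fall back to the fully qualified domain name.
    if not callable_path:
        return socket.getfqdn()
    # Paths take the "package.module:function" form, e.g. "socket:gethostname".
    module_path, attr_name = callable_path.split(":")
    module = importlib.import_module(module_path)
    return getattr(module, attr_name)()

print(get_hostname_sketch())                      # fallback: socket.getfqdn()
print(get_hostname_sketch("socket:gethostname"))  # resolved from the path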
Retrieves connection to Cloud Natural Language service.
def get_conn(self): """ Retrieves connection to Cloud Natural Language service. :return: Cloud Natural Language service object :rtype: google.cloud.language_v1.LanguageServiceClient """ if not self._conn: self._conn = LanguageServiceClient(credentials=self._g...
Finds named entities in the text along with entity types, salience, mentions for each entity, and other properties.
def analyze_entities(self, document, encoding_type=None, retry=None, timeout=None, metadata=None): """ Finds named entities in the text along with entity types, salience, mentions for each entity, and other properties. :param document: Input document. If a dict is provided, ...
A convenience method that provides all the features that analyzeSentiment, analyzeEntities, and analyzeSyntax provide in one call.
def annotate_text(self, document, features, encoding_type=None, retry=None, timeout=None, metadata=None): """ A convenience method that provides all the features that analyzeSentiment, analyzeEntities, and analyzeSyntax provide in one call. :param document: Input document. I...
Classifies a document into categories.
def classify_text(self, document, retry=None, timeout=None, metadata=None): """ Classifies a document into categories. :param document: Input document. If a dict is provided, it must be of the same form as the protobuf message Document :type document: dict or class google.cl...
Return the task object identified by the given dag_id and task_id.
def get_task(dag_id, task_id): """Return the task object identified by the given dag_id and task_id.""" dagbag = DagBag() # Check DAG exists. if dag_id not in dagbag.dags: error_message = "Dag id {} not found".format(dag_id) raise DagNotFound(error_message) # Get DAG object and che...
Gets template fields for specific operator class.
def get_template_field(env, fullname): """ Gets template fields for specific operator class. :param fullname: Full path to operator class. For example: ``airflow.contrib.operators.gcp_vision_operator.CloudVisionProductSetCreateOperator`` :return: List of template field :rtype: list[str] ...
A role that allows you to include a list of template fields in the middle of the text. This is especially useful when writing guides describing how to use the operator. The result is a list of fields where each field is shown in a literal block.
def template_field_role(app, typ, rawtext, text, lineno, inliner, options={}, content=[]): """ A role that allows you to include a list of template fields in the middle of the text. This is especially useful when writing guides describing how to use the operator. The result is a list of fields where eac...
Properly close pooled database connections
def dispose_orm(): """ Properly close pooled database connections """ log.debug("Disposing DB connection pool (PID %s)", os.getpid()) global engine global Session if Session: Session.remove() Session = None if engine: engine.dispose() engine = None
Ensures that certain subfolders of AIRFLOW_HOME are on the classpath
def prepare_classpath(): """ Ensures that certain subfolders of AIRFLOW_HOME are on the classpath """ if DAGS_FOLDER not in sys.path: sys.path.append(DAGS_FOLDER) # Add ./config/ for loading custom log parsers etc, or # airflow_local_settings etc. config_path = os.path.join(AIRFLOW...
Gets the returned Celery result from the Airflow task ID provided to the sensor, and returns True if the Celery result has finished execution.
def _check_task_id(self, context): """ Gets the returned Celery result from the Airflow task ID provided to the sensor, and returns True if the Celery result has finished execution. :param context: Airflow's execution context :type context: dict :return: Tru...
Return true if the ticket cache contains "conf" information as is found in ticket caches of Kerberos 1.8.1 or later. This is incompatible with the Sun Java Krb5LoginModule in Java6, so we need to take an action to work around it.
def detect_conf_var(): """Return true if the ticket cache contains "conf" information as is found in ticket caches of Kerberos 1.8.1 or later. This is incompatible with the Sun Java Krb5LoginModule in Java6, so we need to take an action to work around it. """ ticket_cache = configuration.conf.ge...
Transforms a SQLAlchemy model instance into a dictionary
def alchemy_to_dict(obj): """ Transforms a SQLAlchemy model instance into a dictionary """ if not obj: return None d = {} for c in obj.__table__.columns: value = getattr(obj, c.name) if type(value) == datetime: value = value.isoformat() d[c.name] = val...
Yield successive chunks of a given size from a list of items
def chunks(items, chunk_size): """ Yield successive chunks of a given size from a list of items """ if chunk_size <= 0: raise ValueError('Chunk size must be a positive integer') for i in range(0, len(items), chunk_size): yield items[i:i + chunk_size]
Reduce the given list of items by splitting it into chunks of the given size and passing each chunk through the reducer
def reduce_in_chunks(fn, iterable, initializer, chunk_size=0): """ Reduce the given list of items by splitting it into chunks of the given size and passing each chunk through the reducer """ if len(iterable) == 0: return initializer if chunk_size == 0: chunk_size = len(iterable) ...
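Since the two helpers above are shown truncated, here is a self-contained sketch of the same chunk-then-reduce pattern with a usage example; the bodies follow the shown code, completed under that assumption.

from functools import reduce

def chunks(items, chunk_size):
    # Yield successive slices of chunk_size items.
    if chunk_size <= 0:
        raise ValueError('Chunk size must be a positive integer')
    for i in range(0, len(items), chunk_size):
        yield items[i:i + chunk_size]

def reduce_in_chunks(fn, iterable, initializer, chunk_size=0):
    # Reduce over chunks rather than individual items.
    if len(iterable) == 0:
        return initializer
    if chunk_size == 0:
        chunk_size = len(iterable)
    return reduce(fn, chunks(iterable, chunk_size), initializer)

# Sum a list two items at a time: ((0 + 1+2) + 3+4) + 5 = 15
print(reduce_in_chunks(lambda acc, chunk: acc + sum(chunk), [1, 2, 3, 4, 5], 0, 2))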
Given a number of tasks, builds a dependency chain.
def chain(*tasks): """ Given a number of tasks, builds a dependency chain. chain(task_1, task_2, task_3, task_4) is equivalent to task_1.set_downstream(task_2) task_2.set_downstream(task_3) task_3.set_downstream(task_4) """ for up_task, down_task in zip(tasks[:-1], tasks[1:]): ...
Returns a pretty ASCII table from tuples.
def pprinttable(rows): """Returns a pretty ascii table from tuples If namedtuple are used, the table will have headers """ if not rows: return if hasattr(rows[0], '_fields'): # if namedtuple headers = rows[0]._fields else: headers = ["col{}".format(i) for i in range(len...
Tries really hard to terminate all children (including grandchildren). Will send sig (SIGTERM) to the process group of pid. If any process is alive after timeout, a SIGKILL will be sent.
def reap_process_group(pid, log, sig=signal.SIGTERM, timeout=DEFAULT_TIME_TO_WAIT_AFTER_SIGTERM): """ Tries really hard to terminate all children (including grandchildren). Will send sig (SIGTERM) to the process group of pid. If any process is alive after timeout a SIGKILL will be...
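A simplified, Unix-only sketch of the escalation pattern; the real helper also tracks individual children (e.g. via psutil) before escalating, so treat this as an illustration of the SIGTERM-then-SIGKILL idea only.

import os
import signal
import time

def reap_group_sketch(pid, sig=signal.SIGTERM, timeout=60):
    # Signal the whole process group, then escalate to SIGKILL if any
    # member is still alive once the timeout expires.
    pgid = os.getpgid(pid)
    os.killpg(pgid, sig)
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            os.killpg(pgid, 0)  # signal 0 probes: raises if the group is gone
        except ProcessLookupError:
            return True  # everything exited within the timeout
        time.sleep(1)
    os.killpg(pgid, signal.SIGKILL)
    return False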
Given task instance, try_number, and filename_template, return the rendered log filename.
def render_log_filename(ti, try_number, filename_template): """ Given task instance, try_number, filename_template, return the rendered log filename :param ti: task instance :param try_number: try_number of the task :param filename_template: filename template, which can be jinja template or ...
Return the task instance identified by the given dag_id, task_id and execution_date.
def get_task_instance(dag_id, task_id, execution_date): """Return the task instance identified by the given dag_id, task_id and execution_date.""" dagbag = DagBag() # Check DAG exists. if dag_id not in dagbag.dags: error_message = "Dag id {} not found".format(dag_id) raise DagNotFound(error_message) ...
Integrate plugins into the context.
def _integrate_plugins(): """Integrate plugins into the context""" import sys from airflow.plugins_manager import operators_modules for operators_module in operators_modules: sys.modules[operators_module.__name__] = operators_module globals()[operators_module._name] = operators_module
Returns a Google Cloud Dataproc service object.
def get_conn(self): """Returns a Google Cloud Dataproc service object.""" http_authorized = self._authorize() return build( 'dataproc', self.api_version, http=http_authorized, cache_discovery=False)
Waits for a Google Cloud Dataproc operation to complete.
def wait(self, operation): """Waits for a Google Cloud Dataproc operation to complete.""" submitted = _DataProcOperation(self.get_conn(), operation, self.num_retries) submitted.wait_for_done()
Coerces content, or all values of content if it is a dict, to a string. The function will throw if content contains non-string or non-numeric types.
def _deep_string_coerce(content, json_path='json'): """ Coerces content or all values of content if it is a dict to a string. The function will throw if content contains non-string or non-numeric types. The reason why we have this function is because the ``self.json`` field must be a dict with only...
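A sketch of the recursive coercion this describes, using a plain TypeError where the real function raises an Airflow exception; the json_path bookkeeping only serves to produce readable error messages.

def deep_string_coerce_sketch(content, json_path='json'):
    # Strings pass through; ints/floats become strings; lists/dicts recurse.
    if isinstance(content, str):
        return content
    elif isinstance(content, (int, float)):
        return str(content)
    elif isinstance(content, (list, tuple)):
        return [deep_string_coerce_sketch(e, '{0}[{1}]'.format(json_path, i))
                for i, e in enumerate(content)]
    elif isinstance(content, dict):
        return {k: deep_string_coerce_sketch(v, '{0}[{1}]'.format(json_path, k))
                for k, v in content.items()}
    raise TypeError('Type {0} used for parameter {1} is not a number or a string'
                    .format(type(content), json_path))

print(deep_string_coerce_sketch({'retries': 3, 'tags': ['etl', 7]}))
# -> {'retries': '3', 'tags': ['etl', '7']}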
Handles the Airflow + Databricks lifecycle logic for a Databricks operator
def _handle_databricks_operator_execution(operator, hook, log, context): """ Handles the Airflow + Databricks lifecycle logic for a Databricks operator :param operator: Databricks operator being handled :param context: Airflow context """ if operator.do_xcom_push: context['ti'].xcom_pus...
Run a pig script using the pig cli.
def run_cli(self, pig, verbose=True): """ Run a pig script using the pig cli >>> ph = PigCliHook() >>> result = ph.run_cli("ls /;") >>> ("hdfs://" in result) True """ with TemporaryDirectory(prefix='airflow_pigop_') as tmp_dir: with NamedTem...
Fetch and return the state of the given celery task. The scope of this function is global so that it can be called by subprocesses in the pool.
def fetch_celery_task_state(celery_task): """ Fetch and return the state of the given celery task. The scope of this function is global so that it can be called by subprocesses in the pool. :param celery_task: a tuple of the Celery task key and the async Celery object used to fetch the task's s...
How many Celery tasks should each worker process send.
def _num_tasks_per_send_process(self, to_send_count): """ How many Celery tasks should each worker process send. :return: Number of tasks that should be sent per process :rtype: int """ return max(1, int(math.ceil(1.0 * to_send_count / self._sync_paral...
How many Celery task states should each worker process fetch.
def _num_tasks_per_fetch_process(self): """ How many Celery task states should each worker process fetch. :return: Number of tasks that should be used per process :rtype: int """ return max(1, int(math.ceil(1.0 * len(self.tasks) / self._sync_parallelism...
Like a Python builtin dict object, setdefault returns the current value for a key, and if it isn't there, stores the default value and returns it.
def setdefault(cls, key, default, deserialize_json=False): """ Like a Python builtin dict object, setdefault returns the current value for a key, and if it isn't there, stores the default value and returns it. :param key: Dict key for this Variable :type key: str :param ...
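Hypothetical usage of the classmethod above; the key and default are made up, and an initialized Airflow metadata database is required for this to run.

from airflow.models import Variable

# First call stores and returns the default; later calls return the stored value.
config = Variable.setdefault("feature_flags", {"retries": 3}, deserialize_json=True)
print(config["retries"])  # 3 on first call, or whatever value was already stored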
Returns a Google MLEngine service object.
def get_conn(self): """ Returns a Google MLEngine service object. """ authed_http = self._authorize() return build('ml', 'v1', http=authed_http, cache_discovery=False)
Launches an MLEngine job and waits for it to reach a terminal state.
def create_job(self, project_id, job, use_existing_job_fn=None): """ Launches an MLEngine job and waits for it to reach a terminal state. :param project_id: The Google Cloud project id within which MLEngine job will be launched. :type project_id: str :param job: MLEng...
Gets an MLEngine job based on the job name.
def _get_job(self, project_id, job_id): """ Gets an MLEngine job based on the job name. :return: MLEngine job object if succeed. :rtype: dict Raises: googleapiclient.errors.HttpError: if HTTP error is returned from server """ job_name = 'projects/{}/j...
Waits for the Job to reach a terminal state.
def _wait_for_job_done(self, project_id, job_id, interval=30): """ Waits for the Job to reach a terminal state. This method will periodically check the job state until the job reach a terminal state. Raises: googleapiclient.errors.HttpError: if HTTP error is returne...
Creates the Version on Google Cloud ML Engine.
def create_version(self, project_id, model_name, version_spec): """ Creates the Version on Google Cloud ML Engine. Returns the operation if the version was created successfully and raises an error otherwise. """ parent_name = 'projects/{}/models/{}'.format(project_id, mo...
Sets a version to be the default. Blocks until finished.
def set_default_version(self, project_id, model_name, version_name): """ Sets a version to be the default. Blocks until finished. """ full_version_name = 'projects/{}/models/{}/versions/{}'.format( project_id, model_name, version_name) request = self._mlengine.project...
Lists all available versions of a model. Blocks until finished.
def list_versions(self, project_id, model_name): """ Lists all available versions of a model. Blocks until finished. """ result = [] full_parent_name = 'projects/{}/models/{}'.format( project_id, model_name) request = self._mlengine.projects().models().version...
Deletes the given version of a model. Blocks until finished.
def delete_version(self, project_id, model_name, version_name): """ Deletes the given version of a model. Blocks until finished. """ full_name = 'projects/{}/models/{}/versions/{}'.format( project_id, model_name, version_name) delete_request = self._mlengine.projects(...
Create a Model. Blocks until finished.
def create_model(self, project_id, model): """ Create a Model. Blocks until finished. """ if not model['name']: raise ValueError("Model name must be provided and " "could not be an empty string") project = 'projects/{}'.format(project_id) ...
Gets a Model. Blocks until finished.
def get_model(self, project_id, model_name): """ Gets a Model. Blocks until finished. """ if not model_name: raise ValueError("Model name must be provided and " "it could not be an empty string") full_model_name = 'projects/{}/models/{}'.f...
Executes command received and stores result state in queue. :param key: the key to identify the TI :type key: tuple(dag_id, task_id, execution_date) :param command: the command to execute :type command: str
def execute_work(self, key, command): """ Executes command received and stores result state in queue. :param key: the key to identify the TI :type key: tuple(dag_id, task_id, execution_date) :param command: the command to execute :type command: str """ if ...
Write batch items to dynamodb table with provisioned throughput capacity.
def write_batch_data(self, items): """ Write batch items to dynamodb table with provisioned throughput capacity. """ dynamodb_conn = self.get_conn() try: table = dynamodb_conn.Table(self.table_name) with table.batch_writer(overwrite_by_pkeys=self.table_...
Integrate plugins into the context.
def _integrate_plugins(): """Integrate plugins into the context.""" from airflow.plugins_manager import executors_modules for executors_module in executors_modules: sys.modules[executors_module.__name__] = executors_module globals()[executors_module._name] = executors_module
Creates a new instance of the configured executor if none exists and returns it
def get_default_executor(): """Creates a new instance of the configured executor if none exists and returns it""" global DEFAULT_EXECUTOR if DEFAULT_EXECUTOR is not None: return DEFAULT_EXECUTOR executor_name = configuration.conf.get('core', 'EXECUTOR') DEFAULT_EXECUTOR = _get_executor(ex...
Creates a new instance of the named executor. If the executor name is not known in airflow, look for it in the plugins.
def _get_executor(executor_name): """ Creates a new instance of the named executor. If the executor name is not known in airflow, look for it in the plugins """ if executor_name == Executors.LocalExecutor: return LocalExecutor() elif executor_name == Executors.SequentialExecutor:...
Handles error callbacks when using Segment with segment_debug_mode set to True
def on_error(self, error, items): """ Handles error callbacks when using Segment with segment_debug_mode set to True """ self.log.error('Encountered Segment error: {segment_error} with ' 'items: {with_items}'.format(segment_error=error, ...
Launches the pod synchronously and waits for completion. Args: pod (Pod): startup_timeout (int): Timeout for startup of the pod (if pod is pending for too long, considers task a failure).
def run_pod(self, pod, startup_timeout=120, get_logs=True): # type: (Pod, int, bool) -> Tuple[State, Optional[str]] """ Launches the pod synchronously and waits for completion. Args: pod (Pod): startup_timeout (int): Timeout for startup of the pod (if pod is pendi...
Returns an mssql connection object.
def get_conn(self): """ Returns an mssql connection object """ conn = self.get_connection(self.mssql_conn_id) conn = pymssql.connect( server=conn.host, user=conn.login, password=conn.password, database=self.schema or conn.schema, ...
Call the SparkSubmitHook to run the provided spark job
def execute(self, context): """ Call the SparkSubmitHook to run the provided spark job """ self._hook = SparkSubmitHook( conf=self._conf, conn_id=self._conn_id, files=self._files, py_files=self._py_files, archives=self._archives...
Trigger a new dag run for a Dag with an execution date of now unless specified in the data.
def trigger_dag(dag_id): """ Trigger a new dag run for a Dag with an execution date of now unless specified in the data. """ data = request.get_json(force=True) run_id = None if 'run_id' in data: run_id = data['run_id'] conf = None if 'conf' in data: conf = data['co...
Delete all DB records related to the specified Dag.
def delete_dag(dag_id): """ Delete all DB records related to the specified Dag. """ try: count = delete.delete_dag(dag_id) except AirflowException as err: _log.error(err) response = jsonify(error="{}".format(err)) response.status_code = err.status_code return ...
Returns a list of Dag Runs for a specific DAG ID. :query param state: a query string parameter '?state=queued|running|success...' :param dag_id: String identifier of a DAG :return: List of DAG runs of a DAG with requested state, or all runs if the state is not specified
def dag_runs(dag_id): """ Returns a list of Dag Runs for a specific DAG ID. :query param state: a query string parameter '?state=queued|running|success...' :param dag_id: String identifier of a DAG :return: List of DAG runs of a DAG with requested state, or all runs if the state is not specified...
Return python code of a given dag_id.
def get_dag_code(dag_id): """Return python code of a given dag_id.""" try: return get_code(dag_id) except AirflowException as err: _log.info(err) response = jsonify(error="{}".format(err)) response.status_code = err.status_code return response
Returns a JSON with a task's public instance variables.
def task_info(dag_id, task_id): """Returns a JSON with a task's public instance variables. """ try: info = get_task(dag_id, task_id) except AirflowException as err: _log.info(err) response = jsonify(error="{}".format(err)) response.status_code = err.status_code return...
(Un)pauses a dag.
def dag_paused(dag_id, paused): """(Un)pauses a dag""" DagModel = models.DagModel with create_session() as session: orm_dag = ( session.query(DagModel) .filter(DagModel.dag_id == dag_id).first() ) if paused == 'true': orm_dag.is_paused = Tr...
Returns a JSON with a task instance's public instance variables. The format for the exec_date is expected to be "YYYY-mm-DDTHH:MM:SS", for example: "2016-11-16T11:34:15". This will of course need to have been encoded for URL in the request.
def task_instance_info(dag_id, execution_date, task_id): """ Returns a JSON with a task instance's public instance variables. The format for the exec_date is expected to be "YYYY-mm-DDTHH:MM:SS", for example: "2016-11-16T11:34:15". This will of course need to have been encoded for URL in the request...
Returns a JSON with a dag_run's public instance variables. The format for the exec_date is expected to be "YYYY-mm-DDTHH:MM:SS", for example: "2016-11-16T11:34:15". This will of course need to have been encoded for URL in the request.
def dag_run_status(dag_id, execution_date): """ Returns a JSON with a dag_run's public instance variables. The format for the exec_date is expected to be "YYYY-mm-DDTHH:MM:SS", for example: "2016-11-16T11:34:15". This will of course need to have been encoded for URL in the request. """ # Co...
Get all pools.
def get_pools(): """Get all pools.""" try: pools = pool_api.get_pools() except AirflowException as err: _log.error(err) response = jsonify(error="{}".format(err)) response.status_code = err.status_code return response else: return jsonify([p.to_json() for ...
Create a pool.
def create_pool(): """Create a pool.""" params = request.get_json(force=True) try: pool = pool_api.create_pool(**params) except AirflowException as err: _log.error(err) response = jsonify(error="{}".format(err)) response.status_code = err.status_code return respon...
Delete pool.
def delete_pool(name): """Delete pool.""" try: pool = pool_api.delete_pool(name=name) except AirflowException as err: _log.error(err) response = jsonify(error="{}".format(err)) response.status_code = err.status_code return response else: return jsonify(poo...
Create a new container group
def create_or_update(self, resource_group, name, container_group): """ Create a new container group :param resource_group: the name of the resource group :type resource_group: str :param name: the name of the container group :type name: str :param container_group...
Get the state and exitcode of a container group
def get_state_exitcode_details(self, resource_group, name): """ Get the state and exitcode of a container group :param resource_group: the name of the resource group :type resource_group: str :param name: the name of the container group :type name: str :return: A...