[AIRFLOW-406] Sphinx/rst fixes
Dear Airflow Maintainers,

- Fix some syntax errors in our sphinx/rst docstrings, which appear on pythonhosted.org
- Move `xcom_push` documentation from constructor to class docstring
- Rewrite some copy in `HivePartitionSensor` and `NamedHivePartitionSensor` docstrings
- Fix the `AirflowImporter` docstring, which appears to have been mangled by an automated search-and-replace

Closes apache#1717 from zodiac/xuanji/fix_documentation
ldct authored and aoen committed Aug 11, 2016
1 parent d200f60 commit 5ac8ff1
Showing 7 changed files with 29 additions and 22 deletions.
5 changes: 3 additions & 2 deletions airflow/contrib/operators/bigquery_check_operator.py
@@ -25,8 +25,9 @@ class BigQueryCheckOperator(CheckOperator):
     values return ``False`` the check is failed and errors out.
     Note that Python bool casting evals the following as ``False``:
-    * False
-    * 0
+    * ``False``
+    * ``0``
     * Empty string (``""``)
     * Empty list (``[]``)
     * Empty dictionary or set (``{}``)
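
The falsy values listed in this docstring are plain Python semantics, so they are easy to verify directly; a quick sanity check:

    # Every value the docstring lists bool-casts to False.
    for value in [False, 0, "", [], {}, set()]:
        assert bool(value) is False, value

    # A non-empty, non-zero value passes the check.
    assert bool(42) is True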
5 changes: 3 additions & 2 deletions airflow/hooks/druid_hook.py
@@ -165,8 +165,9 @@ def load_from_hdfs(
             intervals, num_shards, target_partition_size, metric_spec=None, hadoop_dependency_coordinates=None):
         """
         load data to druid from hdfs
-        :params ts_dim: The column name to use as a timestamp
-        :params metric_spec: A list of dictionaries
+        :param ts_dim: The column name to use as a timestamp
+        :param metric_spec: A list of dictionaries
         """
         task_id = self.send_ingest_query(
             datasource, static_path, ts_dim, columns, metric_spec,
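
A hedged sketch of a call to this method, assuming the positional parameters mirror the `send_ingest_query` call above; the datasource, path, and metric definitions are illustrative placeholders, not values from this diff:

    from airflow.hooks.druid_hook import DruidHook

    hook = DruidHook()  # assumes default connection configuration
    hook.load_from_hdfs(
        'events',                          # datasource
        '/user/hive/warehouse/events',     # static_path on HDFS
        'created_at',                      # ts_dim: column used as the timestamp
        ['user_id', 'country', 'clicks'],  # columns
        ['2016-01-01/2016-02-01'],         # intervals
        1,                                 # num_shards
        -1,                                # target_partition_size
        metric_spec=[                      # a list of dictionaries, per the docstring
            {'type': 'count', 'name': 'count'},
            {'type': 'longSum', 'name': 'clicks', 'fieldName': 'clicks'},
        ],
    )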
8 changes: 4 additions & 4 deletions airflow/operators/bash_operator.py
@@ -31,6 +31,9 @@ class BashOperator(BaseOperator):
     :param bash_command: The command, set of commands or reference to a
         bash script (must be '.sh') to be executed.
     :type bash_command: string
+    :param xcom_push: If xcom_push is True, the last line written to stdout
+        will also be pushed to an XCom when the bash command completes.
+    :type xcom_push: bool
     :param env: If env is not None, it must be a mapping that defines the
         environment variables for the new process; these are used instead
         of inheriting the current process environment, which is the default
@@ -50,10 +53,7 @@ def __init__(
             env=None,
             output_encoding='utf-8',
             *args, **kwargs):
-        """
-        If xcom_push is True, the last line written to stdout will also
-        be pushed to an XCom when the bash command completes.
-        """
+
         super(BashOperator, self).__init__(*args, **kwargs)
         self.bash_command = bash_command
         self.env = env
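
With the `xcom_push` documentation now on the class, the behavior is easiest to see in context; a minimal sketch, assuming a `dag` object defined elsewhere:

    from airflow.operators.bash_operator import BashOperator

    push = BashOperator(
        task_id='produce_value',
        bash_command='echo "ignored"; echo "the value"',
        xcom_push=True,   # only the last line on stdout ("the value") is pushed to XCom
        dag=dag,
    )

    pull = BashOperator(
        task_id='consume_value',
        # Jinja templating can read the pushed value back out of XCom.
        bash_command='echo "{{ ti.xcom_pull(task_ids=\'produce_value\') }}"',
        dag=dag,
    )
    pull.set_upstream(push)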
5 changes: 3 additions & 2 deletions airflow/operators/check_operator.py
@@ -30,8 +30,9 @@ class CheckOperator(BaseOperator):
     values return ``False`` the check is failed and errors out.
     Note that Python bool casting evals the following as ``False``:
-    * False
-    * 0
+    * ``False``
+    * ``0``
     * Empty string (``""``)
     * Empty list (``[]``)
     * Empty dictionary or set (``{}``)
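
Concretely, the operator runs its SQL, takes the first row, and bool-casts each value; a hedged usage sketch, with a placeholder connection id and table:

    from airflow.operators.check_operator import CheckOperator

    check = CheckOperator(
        task_id='check_rows_loaded',
        # A COUNT(*) of 0 casts to False, so an empty partition fails the task.
        sql="SELECT COUNT(*) FROM events WHERE ds = '{{ ds }}'",
        conn_id='my_database',   # placeholder; any configured connection id
        dag=dag,
    )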
5 changes: 3 additions & 2 deletions airflow/operators/presto_check_operator.py
@@ -25,8 +25,9 @@ class PrestoCheckOperator(CheckOperator):
     values return ``False`` the check is failed and errors out.
     Note that Python bool casting evals the following as ``False``:
-    * False
-    * 0
+    * ``False``
+    * ``0``
     * Empty string (``""``)
     * Empty list (``[]``)
     * Empty dictionary or set (``{}``)
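
The Presto variant behaves identically, only against a Presto connection; a similar hedged sketch:

    from airflow.operators.presto_check_operator import PrestoCheckOperator

    presto_check = PrestoCheckOperator(
        task_id='check_partition_not_empty',
        sql="SELECT COUNT(1) FROM events WHERE ds = '{{ ds }}'",
        presto_conn_id='presto_default',   # assumed default connection id
        dag=dag,
    )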
15 changes: 8 additions & 7 deletions airflow/operators/sensors.py
@@ -245,10 +245,11 @@ class NamedHivePartitionSensor(BaseSensorOperator):
     :param partition_names: List of fully qualified names of the
         partitions to wait for. A fully qualified name is of the
-        form schema.table/pk1=pv1/pk2=pv2, for example,
+        form ``schema.table/pk1=pv1/pk2=pv2``, for example,
         default.users/ds=2016-01-01. This is passed as is to the metastore
-        Thrift client "get_partitions_by_name" method. Note that
-        you cannot use logical operators as in HivePartitionSensor.
+        Thrift client ``get_partitions_by_name`` method. Note that
+        you cannot use logical or comparison operators as in
+        HivePartitionSensor.
     :type partition_names: list of strings
     :param metastore_conn_id: reference to the metastore thrift service
         connection id
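
A hedged sketch of the fully qualified name format described above; the dag and connection id are illustrative:

    from airflow.operators.sensors import NamedHivePartitionSensor

    wait_for_users = NamedHivePartitionSensor(
        task_id='wait_for_users_partition',
        # Fully qualified names only -- no logical or comparison operators here.
        partition_names=['default.users/ds={{ ds }}'],
        metastore_conn_id='metastore_default',   # assumed connection id
        dag=dag,
    )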
@@ -312,17 +313,17 @@ class HivePartitionSensor(BaseSensorOperator):
     """
     Waits for a partition to show up in Hive.
-    Note: Because @partition supports general logical operators, it
+    Note: Because ``partition`` supports general logical operators, it
     can be inefficient. Consider using NamedHivePartitionSensor instead if
     you don't need the full flexibility of HivePartitionSensor.
     :param table: The name of the table to wait for, supports the dot
         notation (my_database.my_table)
     :type table: string
     :param partition: The partition clause to wait for. This is passed as
-        is to the metastore Thrift client "get_partitions_by_filter" method,
-        and apparently supports SQL like notation as in `ds='2015-01-01'
-        AND type='value'` and > < sings as in "ds>=2015-01-01"
+        is to the metastore Thrift client ``get_partitions_by_filter`` method,
+        and apparently supports SQL like notation as in ``ds='2015-01-01'
+        AND type='value'`` and comparison operators as in ``"ds>=2015-01-01"``
     :type partition: string
     :param metastore_conn_id: reference to the metastore thrift service
         connection id
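
By contrast, HivePartitionSensor accepts a filter clause; a hedged sketch using the operators the docstring mentions:

    from airflow.operators.sensors import HivePartitionSensor

    wait_for_typed_rows = HivePartitionSensor(
        task_id='wait_for_typed_rows',
        table='my_database.my_table',   # dot notation, per the docstring
        # Passed as-is to get_partitions_by_filter, so logical and comparison
        # operators work -- at the cost of a heavier metastore call.
        partition="ds='{{ ds }}' AND type='value'",
        metastore_conn_id='metastore_default',   # assumed connection id
        dag=dag,
    )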
8 changes: 5 additions & 3 deletions airflow/utils/helpers.py
@@ -177,13 +177,15 @@ def f(t):
 class AirflowImporter(object):
     """
     Importer that dynamically loads a class and module from its parent. This
-    allows Airflow to support `from airflow.operators.bash_operator import
-    BashOperator` even though BashOperator is actually in
-    airflow.operators.bash_operator.
+    allows Airflow to support ``from airflow.operators import BashOperator``
+    even though BashOperator is actually in
+    ``airflow.operators.bash_operator``.
     The importer also takes over for the parent_module by wrapping it. This is
     required to support attribute-based usage:
     .. code:: python
         from airflow import operators
         operators.BashOperator(...)
     """
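
For readers unfamiliar with the pattern, the general technique is to replace the parent module in `sys.modules` with a wrapper that resolves attributes by importing the submodule that defines them. A minimal illustrative sketch, not Airflow's actual implementation:

    import importlib
    import sys


    class LazyImporter(object):
        """Wraps a package module and lazily resolves attributes from submodules."""

        def __init__(self, parent_module, attribute_modules):
            self._parent = parent_module
            # e.g. {'BashOperator': 'bash_operator'}
            self._attribute_modules = attribute_modules

        def __getattr__(self, name):
            # Only called for attributes not found on the instance itself.
            submodule = self._attribute_modules.get(name)
            if submodule is None:
                return getattr(self._parent, name)
            module = importlib.import_module(
                '%s.%s' % (self._parent.__name__, submodule))
            return getattr(module, name)

    # Installing the wrapper makes `from mypackage import BashOperator` and
    # `mypackage.BashOperator` both work even though the class lives in
    # `mypackage.bash_operator` (commented out: needs a real package):
    # sys.modules['mypackage'] = LazyImporter(
    #     sys.modules['mypackage'], {'BashOperator': 'bash_operator'})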
