black formatted codebase with pre-commit files (pytorch#792)
* Added configuration files

* Added additional flake8 ignores caused by black

* Updated pre-commit yaml

* Black formatted files

* Updated .travis.yml and CONTRIBUTING.md

* Updated CONTRIBUTING.md with links

Co-authored-by: vfdev <[email protected]>
anmolsjoshi and vfdev-5 authored Feb 19, 2020
1 parent 57df912 commit 6040d45
Showing 164 changed files with 4,972 additions and 4,112 deletions.
12 changes: 12 additions & 0 deletions .pre-commit-config.yaml
Original file line number Diff line number Diff line change
@@ -0,0 +1,12 @@
repos:
- repo: https://github.com/python/black
rev: 19.10b0
hooks:
- id: black
language_version: python3.7

- repo: https://gitlab.com/pycqa/flake8
rev: 3.7.7
hooks:
- id: flake8
args: [--append-config=tox.ini]
6 changes: 4 additions & 2 deletions .travis.yml
@@ -86,8 +86,10 @@ jobs:
- stage: Lint check
python: "3.7"
before_install: # Nothing to do
install: pip install flake8
script: flake8
install: pip install flake8 black
script:
- flake8 .
- black .
after_success: # Nothing to do

# GitHub Pages Deployment: https://docs.travis-ci.com/user/deployment/pages/
41 changes: 41 additions & 0 deletions CONTRIBUTING.md
@@ -17,6 +17,47 @@ In both cases, you will also need to code some tests to ensure the correct behavior

New code should be compatible with Python 3.X versions. Once you have implemented a feature or bugfix along with its tests, please run the lint checks and tests:

#### pre-commit
To keep the codebase consistent with a single style guide, we use [black](https://black.readthedocs.io/en/stable/) to format code and [flake8](https://flake8.pycqa.org/en/latest/) to check it for PEP 8 compliance.

To automate the process, the repo is configured with [pre-commit hooks](https://pre-commit.com/) that run black on the staged files, so every commit complies with the style guide. This requires some one-time setup, described below:

1. Install pre-commit in your Python environment.
2. Run `pre-commit install`, which configures a virtual environment to invoke black and flake8 on commits.

```bash
pip install pre-commit
pre-commit install
```

3. When files are committed:
- If the staged files are not compliant with black, black will autoformat them; when this happens, the reformatted files must be staged and committed again. See the example below.
- If the staged files are not compliant with flake8, errors will be raised; fix them and commit the files again. See the example below.

```bash
git add .
git commit -m "Added awesome feature"
# DON'T WORRY IF ERRORS ARE RAISED.
# YOUR CODE IS NOT YET COMPLIANT WITH FLAKE8 OR BLACK.
# Fix any flake8 errors by following their suggestions.
# black will have reformatted the staged files, so they may look different;
# stage them again before committing.
# After fixing any flake8 errors:
git add .
git commit -m "Added feature"
```

#### Formatting Code without pre-commit
If you choose not to use pre-commit, you can use an IDE extension configured to run black, or invoke black manually to format files before committing them.

```bash
pip install black
black .
# This should autoformat the files
git add .
git commit -m "....."
```


#### Run lint checking
```bash
flake8 ignite/ tests/ examples/
88 changes: 44 additions & 44 deletions docs/source/conf.py
@@ -14,28 +14,29 @@
#
import os
import sys
sys.path.insert(0, os.path.abspath('../..'))

sys.path.insert(0, os.path.abspath("../.."))
import ignite
import pytorch_sphinx_theme

# -- Project information -----------------------------------------------------

project = 'ignite'
copyright = '2019, Torch Contributors'
author = 'Torch Contributors'
project = "ignite"
copyright = "2019, Torch Contributors"
author = "Torch Contributors"

# The short X.Y version
try:
version = os.environ['code_version']
if 'master' in version:
version = 'master (' + ignite.__version__ + ')'
version = os.environ["code_version"]
if "master" in version:
version = "master (" + ignite.__version__ + ")"
else:
version = version.replace('v', '')
version = version.replace("v", "")
except KeyError:
version = ignite.__version__

# The full version, including alpha/beta/rc tags
release = 'master'
release = "master"


# -- General configuration ---------------------------------------------------
@@ -48,27 +49,27 @@
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = [
'sphinx.ext.autosummary',
'sphinx.ext.doctest',
'sphinx.ext.intersphinx',
'sphinx.ext.todo',
'sphinx.ext.coverage',
'sphinx.ext.mathjax',
'sphinx.ext.napoleon',
'sphinx.ext.viewcode'
"sphinx.ext.autosummary",
"sphinx.ext.doctest",
"sphinx.ext.intersphinx",
"sphinx.ext.todo",
"sphinx.ext.coverage",
"sphinx.ext.mathjax",
"sphinx.ext.napoleon",
"sphinx.ext.viewcode",
]

# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
templates_path = ["_templates"]

# The suffix(es) of source filenames.
# You can specify multiple suffix as a list of string:
#
# source_suffix = ['.rst', '.md']
source_suffix = '.rst'
source_suffix = ".rst"

# The master toctree document.
master_doc = 'index'
master_doc = "index"

# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
@@ -83,25 +84,25 @@
exclude_patterns = []

# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'
pygments_style = "sphinx"


# -- Options for HTML output -------------------------------------------------

# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
#
html_theme = 'pytorch_sphinx_theme'
html_theme = "pytorch_sphinx_theme"
html_theme_path = [pytorch_sphinx_theme.get_html_theme_path()]

html_theme_options = {
'canonical_url': 'https://pytorch.org/ignite/index.html',
'collapse_navigation': False,
'display_version': True,
'logo_only': True,
"canonical_url": "https://pytorch.org/ignite/index.html",
"collapse_navigation": False,
"display_version": True,
"logo_only": True,
}

html_logo = '_static/img/ignite-logo-dark.svg'
html_logo = "_static/img/ignite-logo-dark.svg"

# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
@@ -112,21 +113,21 @@
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
html_static_path = ['_static', '_templates/_static']
html_static_path = ["_static", "_templates/_static"]

html_context = {
'css_files': [
"css_files": [
# 'https://fonts.googleapis.com/css?family=Lato',
# '_static/css/pytorch_theme.css'
'_static/css/ignite_theme.css'
"_static/css/ignite_theme.css"
],
}


# -- Options for HTMLHelp output ---------------------------------------------

# Output file base name for HTML help builder.
htmlhelp_basename = 'ignitedoc'
htmlhelp_basename = "ignitedoc"


# -- Options for LaTeX output ------------------------------------------------
@@ -135,15 +136,12 @@
# The paper size ('letterpaper' or 'a4paper').
#
# 'papersize': 'letterpaper',

# The font size ('10pt', '11pt' or '12pt').
#
# 'pointsize': '10pt',

# Additional stuff for the LaTeX preamble.
#
# 'preamble': '',

# Latex figure (float) alignment
#
# 'figure_align': 'htbp',
@@ -153,19 +151,15 @@
# (source start file, target name, title,
# author, documentclass [howto, manual, or own class]).
latex_documents = [
(master_doc, 'ignite.tex', 'ignite Documentation',
'Torch Contributors', 'manual'),
(master_doc, "ignite.tex", "ignite Documentation", "Torch Contributors", "manual"),
]


# -- Options for manual page output ------------------------------------------

# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
(master_doc, 'ignite', 'ignite Documentation',
[author], 1)
]
man_pages = [(master_doc, "ignite", "ignite Documentation", [author], 1)]


# -- Options for Texinfo output ----------------------------------------------
@@ -174,9 +168,15 @@
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
(master_doc, 'ignite', 'ignite Documentation',
author, 'ignite', 'One line description of project.',
'Miscellaneous'),
(
master_doc,
"ignite",
"ignite Documentation",
author,
"ignite",
"One line description of project.",
"Miscellaneous",
),
]


@@ -185,7 +185,7 @@
# -- Options for intersphinx extension ---------------------------------------

# Example configuration for intersphinx: refer to the Python standard library.
intersphinx_mapping = {'https://docs.python.org/': None}
intersphinx_mapping = {"https://docs.python.org/": None}

# -- Options for todo extension ----------------------------------------------

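The conf.py changes above also show black's line-length rule at work: a call or literal that fits within black's default 88-character limit is collapsed onto a single line, otherwise it is exploded with one element per line. A sketch of the collapsed form (values taken from the diff; the length check is only illustrative):

```python
# After black: the tuple fits within the default 88-character limit,
# so it stays on one line (master_doc value assumed from the diff).
master_doc = "index"
latex_documents = [
    (master_doc, "ignite.tex", "ignite Documentation", "Torch Contributors", "manual"),
]

# The formatted line is 87 characters wide, within black's default limit of 88.
line = '    (master_doc, "ignite.tex", "ignite Documentation", "Torch Contributors", "manual"),'
assert len(line) <= 88
```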
46 changes: 12 additions & 34 deletions examples/contrib/cifar10/fastresnet.py
@@ -1,4 +1,3 @@

# Network from https://github.com/davidcpage/cifar10-fast
# Adapted to python < 3.6

@@ -9,8 +8,7 @@ def fastresnet():
return FastResnet()


def batch_norm(num_channels, bn_bias_init=None, bn_bias_freeze=False,
bn_weight_init=None, bn_weight_freeze=False):
def batch_norm(num_channels, bn_bias_init=None, bn_bias_freeze=False, bn_weight_init=None, bn_weight_freeze=False):
m = nn.BatchNorm2d(num_channels)
if bn_bias_init is not None:
m.bias.data.fill_(bn_bias_init)
@@ -32,10 +30,9 @@ def seq_conv_bn(in_channels, out_channels, conv_kwargs, bn_kwargs):
if "bias" not in conv_kwargs:
conv_kwargs["bias"] = False
return nn.Sequential(
nn.Conv2d(in_channels, out_channels,
kernel_size=3, **conv_kwargs),
nn.Conv2d(in_channels, out_channels, kernel_size=3, **conv_kwargs),
batch_norm(out_channels, **bn_kwargs),
nn.ReLU(inplace=True)
nn.ReLU(inplace=True),
)


@@ -47,24 +44,19 @@ def conv_bn_elu(in_channels, out_channels, conv_kwargs, bn_kwargs, alpha=1.0):
if "bias" not in conv_kwargs:
conv_kwargs["bias"] = False
return nn.Sequential(
nn.Conv2d(in_channels, out_channels,
kernel_size=3, **conv_kwargs),
nn.Conv2d(in_channels, out_channels, kernel_size=3, **conv_kwargs),
batch_norm(out_channels, **bn_kwargs),
nn.ELU(alpha=alpha, inplace=True)
nn.ELU(alpha=alpha, inplace=True),
)


class Flatten(nn.Module):

def forward(self, x):
return x.view(x.size(0), x.size(1))


class FastResnet(nn.Module):

def __init__(self, conv_kwargs=None, bn_kwargs=None,
conv_bn_fn=seq_conv_bn,
final_weight=0.125):
def __init__(self, conv_kwargs=None, bn_kwargs=None, conv_bn_fn=seq_conv_bn, final_weight=0.125):
super(FastResnet, self).__init__()

conv_kwargs = {} if conv_kwargs is None else conv_kwargs
@@ -75,34 +67,22 @@ def __init__(self, conv_kwargs=None, bn_kwargs=None,
self.layer1 = nn.Sequential(
conv_bn_fn(64, 128, conv_kwargs, bn_kwargs),
nn.MaxPool2d(kernel_size=2),
IdentityResidualBlock(128, 128, conv_kwargs, bn_kwargs, conv_bn_fn=conv_bn_fn)
IdentityResidualBlock(128, 128, conv_kwargs, bn_kwargs, conv_bn_fn=conv_bn_fn),
)

self.layer2 = nn.Sequential(
conv_bn_fn(128, 256, conv_kwargs, bn_kwargs),
nn.MaxPool2d(kernel_size=2)
)
self.layer2 = nn.Sequential(conv_bn_fn(128, 256, conv_kwargs, bn_kwargs), nn.MaxPool2d(kernel_size=2))

self.layer3 = nn.Sequential(
conv_bn_fn(256, 512, conv_kwargs, bn_kwargs),
nn.MaxPool2d(kernel_size=2),
IdentityResidualBlock(512, 512, conv_kwargs, bn_kwargs, conv_bn_fn=conv_bn_fn)
IdentityResidualBlock(512, 512, conv_kwargs, bn_kwargs, conv_bn_fn=conv_bn_fn),
)

self.head = nn.Sequential(
nn.AdaptiveMaxPool2d(1),
Flatten(),
)
self.head = nn.Sequential(nn.AdaptiveMaxPool2d(1), Flatten(),)

self.final_weight = final_weight

self.features = nn.Sequential(
self.prep,
self.layer1,
self.layer2,
self.layer3,
self.head
)
self.features = nn.Sequential(self.prep, self.layer1, self.layer2, self.layer3, self.head)

self.classifier = nn.Linear(512, 10, bias=False)

@@ -115,9 +95,7 @@ def forward(self, x):


class IdentityResidualBlock(nn.Module):

def __init__(self, in_channels, out_channels, conv_kwargs, bn_kwargs,
conv_bn_fn=seq_conv_bn):
def __init__(self, in_channels, out_channels, conv_kwargs, bn_kwargs, conv_bn_fn=seq_conv_bn):
super(IdentityResidualBlock, self).__init__()
self.conv1 = conv_bn_fn(in_channels, out_channels, conv_kwargs, bn_kwargs)
self.conv2 = conv_bn_fn(out_channels, out_channels, conv_kwargs, bn_kwargs)