Use relative links, cleanup whitespace.
dan-zheng committed Apr 26, 2018
1 parent 480785c commit ddddf84
Showing 8 changed files with 92 additions and 81 deletions.
8 changes: 4 additions & 4 deletions CODE_OF_CONDUCT.md
@@ -19,7 +19,7 @@ Examples of unacceptable behavior by participants include:
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic address, without explicit permission
* Conduct which could reasonably be considered inappropriate for the forum in which it occurs.

All TensorFlow forums and spaces are meant for professional interactions, and any behavior which could reasonably be considered inappropriate in a professional setting is unacceptable.

@@ -35,7 +35,7 @@ Project maintainers have the right and responsibility to remove, edit, or reject

This Code of Conduct applies to all content on tensorflow.org, TensorFlow’s GitHub organization, or any other official TensorFlow web presence allowing for community interactions, as well as at all official TensorFlow events, whether offline or online.

The Code of Conduct also applies within project spaces and in public spaces whenever an individual is representing TensorFlow or its community. Examples of representing a project or community include using an official project e-mail address, posting via an official social media account, or acting as an appointed or de facto representative at an online or offline event.


## Conflict Resolution
@@ -44,11 +44,11 @@ Conflicts in an open source project can take many forms, from someone having a b

If the behavior is threatening or harassing, or for other reasons requires immediate escalation, please see below.

However, for the vast majority of issues, we aim to empower individuals to first resolve conflicts themselves, asking for help when needed, and only after that fails to escalate further. This approach gives people more control over the outcome of their dispute.

If you are experiencing or witnessing conflict, we ask you to use the following escalation strategy to address the conflict:

1. Address the perceived conflict directly with those involved, preferably in a real-time medium.
2. If this fails, get a third party (e.g. a mutual friend, and/or someone with background on the issue but not involved in the conflict) to intercede.
3. If you are still unable to resolve the conflict, and you believe it rises to harassment or another code of conduct violation, report it.

1 change: 0 additions & 1 deletion Installation.md
@@ -1,4 +1,3 @@

# Install Swift for TensorFlow

To install Swift for TensorFlow, download one of the packages below and follow the instructions for your operating system. After installation, you can use the full suite of Swift tools, including `swift` (Swift REPL/interpreter) and `swiftc` (Swift compiler). See [here](Usage.md) for more details about using Swift for TensorFlow.
46 changes: 32 additions & 14 deletions README.md
@@ -2,45 +2,59 @@

Welcome to the Swift for TensorFlow development community!

- Swift for TensorFlow is the result of first-principles thinking applied to machine learning frameworks and aims to take TensorFlow usability to new heights. Swift for TensorFlow is based on the belief that machine learning is important enough for first-class language and compiler support, and thus works very differently from normal language bindings.
+ Swift for TensorFlow is the result of first-principles thinking applied to
+ machine learning frameworks and aims to take TensorFlow usability to new
+ heights. Swift for TensorFlow is based on the belief that machine learning is
+ important enough for first-class language and compiler support, and thus works
+ very differently from normal language bindings.

First-class language and compiler support allow us to innovate in areas that
- traditionally were out of bounds for machine learning libraries. Our programming model combines the performance of TensorFlow graphs with the flexibility and expressivity of Eager execution, while keeping a strong focus on improved usability at every level of the stack.
+ traditionally were out of bounds for machine learning libraries. Our
+ programming model combines the performance of TensorFlow graphs with the
+ flexibility and expressivity of Eager execution, while keeping a strong focus
+ on improved usability at every level of the stack.

- **Note:** Swift for TensorFlow is an early stage research project. It has been released to enable open source development and is not yet ready for general use by machine learning developers.
+ **Note:** Swift for TensorFlow is an early stage research project. It has been
+ released to enable open source development and is not yet ready for general use
+ by machine learning developers.

## Installation and Usage

- You can download a pre-built package for Swift for TensorFlow [here](https://github.com/tensorflow/swift/blob/master/Installation.md). After installing Swift for TensorFlow, you can learn how to use the project [here](https://github.com/tensorflow/swift/blob/master/Usage.md).
+ You can download a pre-built package for Swift for TensorFlow
+ [here](Installation.md). After installing Swift for TensorFlow, you can learn
+ how to use the project [here](Usage.md).

- For instructions on building from source, visit [google/swift](https://github.com/google/swift/tree/tensorflow).
+ For instructions on building from source, visit
+ [google/swift](https://github.com/google/swift/tree/tensorflow).

## Documentation

Below are some documents explaining the Swift for TensorFlow project.

Conceptual:

- - [Swift for TensorFlow Design Overview](https://github.com/tensorflow/swift/blob/master/docs/DesignOverview.md)
- - [Why *Swift* for TensorFlow?](https://github.com/tensorflow/swift/blob/master/docs/WhySwiftForTensorFlow.md)
+ - [Swift for TensorFlow Design Overview](docs/DesignOverview.md)
+ - [Why *Swift* for TensorFlow?](docs/WhySwiftForTensorFlow.md)

Deeper dives:

- - [Graph Program Extraction](https://github.com/tensorflow/swift/blob/master/docs/GraphProgramExtraction.md)
- - [Automatic Differentiation](https://github.com/tensorflow/swift/blob/master/docs/AutomaticDifferentiation.md)
- - [Python Interoperability](https://github.com/tensorflow/swift/blob/master/docs/PythonInteroperability.md)
+ - [Graph Program Extraction](docs/GraphProgramExtraction.md)
+ - [Automatic Differentiation](docs/AutomaticDifferentiation.md)
+ - [Python Interoperability](docs/PythonInteroperability.md)

## Source code

- Currently, the active development of Swift for TensorFlow will happen under the "tensorflow" branch of
+ Currently, the active development of Swift for TensorFlow will happen under
+ the "tensorflow" branch of
[google/swift](https://github.com/google/swift/tree/tensorflow).

These projects include:

- The compiler and standard libraries: [google/swift](http://github.com/google/swift/tree/tensorflow)
- Debugger and REPL support: [google/swift-lldb](http://github.com/google/swift-lldb)

- As the code matures, we aim to move it upstream to the corresponding [Swift.org](https://swift.org) repositories.
+ As the code matures, we aim to move it upstream to the corresponding
+ [Swift.org](https://swift.org) repositories.

## Models

@@ -59,7 +73,10 @@ mailing list.

## Contributing

- We welcome source code contributions: please read the [Contributor Guide](https://github.com/google/swift/blob/tensorflow/CONTRIBUTING.md) to get started. It's always a good idea to discuss your plans on the mailing list before making any major submissions.
+ We welcome source code contributions: please read the [Contributor
+ Guide](https://github.com/google/swift/blob/tensorflow/CONTRIBUTING.md) to get
+ started. It is always a good idea to discuss your plans on the mailing list
+ before making any major submissions.

## Code of Conduct

@@ -71,4 +88,5 @@ experience, education, socio-economic status, nationality, personal appearance,
race, religion, or sexual identity and orientation.

The Swift for TensorFlow community is guided by our [Code of
- Conduct](CODE_OF_CONDUCT.md), which we encourage everybody to read before participating.
+ Conduct](CODE_OF_CONDUCT.md), which we encourage everybody to read before
+ participating.
4 changes: 2 additions & 2 deletions Usage.md
@@ -112,15 +112,15 @@ This was a simple demonstration of Swift for TensorFlow. To see example models w

## (Mac-only) Xcode

To use Swift for TensorFlow with Xcode, you must have installed a toolchain from [this page](Installation.md).

1. Open Xcode’s `Preferences`, navigate to `Components > Toolchains`, and select the installed Swift for TensorFlow toolchain. The name of the toolchain should start with "Swift for TensorFlow Development Snapshot".

<span align="center">
<img src="docs/images/Installation-XcodePreferences.png?raw=true" alt="Select toolchain in Xcode preferences."/>
</span>

2. In the menu bar, select `File > New > Playground...`.

3. Then, select `macOS` and `Blank` and hit `Next`.

14 changes: 6 additions & 8 deletions docs/DesignOverview.md
@@ -9,7 +9,7 @@ This document provides a high level view of these subcomponents and describe how
We now describe these pieces of the project:

- [Swift](#swift)
- [TensorFlow](#tensorflow)
- [Graph Program Extraction](#graph-program-extraction)
- [The TensorFlow module](#the-tensorflow-module)
- [Automatic Differentiation](#automatic-differentiation)
@@ -35,7 +35,7 @@ One warning: Swift evolved rapidly in its early years, so you should be careful

## TensorFlow

[TensorFlow](https://tensorflow.org/) is a popular and widely-used machine learning framework. TensorFlow provides a graph-based Python API where you explicitly build graph operations and then execute the graph one or more times with the session API. In addition, TensorFlow added [eager execution](https://www.tensorflow.org/programmers_guide/eager) which lets you call operations one-by-one in a Pythonic mode, but without the benefits of graphs.

In that context, many users will initially think Swift for TensorFlow is just a straight language binding. However, Swift for TensorFlow lets you write imperative eager execution-style code, while Swift gives you the full performance of the explicit graph APIs. The magic behind this is a [compiler transformation](#graph-program-extraction) that analyzes your code and automatically builds the TensorFlow graph and runtime calls for you. The nice thing about this is that TensorFlow "just works", and you don’t have to think about graphs at all.
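The eager-looking style described above can be sketched roughly as follows (a hedged sketch: `Tensor` and the `•` matrix-multiplication operator come from the project's `TensorFlow` module, and the surface API was still evolving at this stage):

```swift
import TensorFlow  // the module shipped with the Swift for TensorFlow toolchain

// Ordinary imperative code: no explicit graph or session objects.
var x = Tensor<Float>([[1, 2], [3, 4]])
for _ in 1...3 {
  // `•` denotes matrix multiplication; the compiler extracts this loop
  // into a TensorFlow graph behind the scenes.
  x += x • x
}
print(x)
```

Despite reading like eager code, the loop body runs as a compiled TensorFlow graph rather than as op-by-op runtime calls.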

@@ -46,12 +46,12 @@ Swift for TensorFlow has a low-level syntax that gives you direct access to any
```swift
struct Tensor<Scalar> {
...
// Implement the infix `+` operator on Tensor in terms of the TensorFlow `Add` op,
// which takes two input tensors and returns one result.
static func +(lhs: Tensor, rhs: Tensor) -> Tensor {
return #tfop("Add", lhs, rhs)
}
// Another example that implements a method in terms of the TensorFlow `Conv2D` op,
// which takes two input tensors, as well as a `strides` and `padding` attribute.
func convolved2D(withFilter filter: Tensor,
strides: (Int32, Int32, Int32, Int32),
@@ -75,7 +75,7 @@ The Graph Program Extraction transformation is the key technique that allows Ten

First, the compiler finds the tensor operations in the code (which is trivial due to the low-level `#tfop` syntax described above). Next, it desugars high-level abstractions (like structs, tuples, generics, functions, variables, etc) that connect tensor operations through a process called "deabstraction". After deabstraction, the tensor operations are directly connected to each other through SSA dataflow edges and are embedded in a control flow graph represented in the [Swift Intermediate Language](https://github.com/apple/swift/blob/master/docs/SIL.rst) (SIL). The code for this is primarily implemented in [TFDeabstraction.cpp](Link to Github).

Once the tensor operations are desugared, a transformation we call "partitioning" extracts the graph operations from the program and builds a new SIL function to represent the tensor code. In addition to removing the tensor operations from the host code, new calls are injected that call into [our new runtime library](#runtime-entry-points-for-extraction) to start up TensorFlow, rendezvous to collect any results, and send/receive values between the host and the tensor program as it runs. The bulk of the Graph Program Extraction transformation itself lives in [TFPartition.cpp](TODO: LINK TO GITHUB).

Once the tensor function is formed, it has some transformations applied to it, and is eventually emitted to a TensorFlow graph using the code in [TFLowerGraph.cpp](TODO: LINK TO GITHUB). After the TensorFlow graph is formed, we serialize it to a protobuf and encode the bits directly into the executable, making it easy to load at program runtime.
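As a rough illustration of what partitioning accomplishes (the function and names here are hypothetical, and the real transformation operates on SIL, not on source code):

```swift
import TensorFlow

// What the user writes: a single imperative function mixing tensor and host code.
func inference(_ x: Tensor<Float>, _ w: Tensor<Float>) -> Tensor<Float> {
  let y = x • w              // tensor op: becomes a MatMul node in the graph
  print("computed product")  // host code: stays in the host program
  return max(y, 0)           // tensor op: becomes a max/relu-style graph node
}

// Conceptually, partitioning splits `inference` into:
//   1. a TensorFlow graph containing the MatMul and max nodes, and
//   2. host code that starts the graph via the runtime library, sends `x`
//      and `w`, runs the `print`, and rendezvous to collect the result.
```

The user never sees this split; the host and graph halves are wired together automatically by the injected runtime calls.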

@@ -150,7 +150,7 @@ The most significant unimplemented piece of our compiler and runtime model is su

## Automatic Differentiation

[Automatic differentiation](https://en.wikipedia.org/wiki/Automatic_differentiation) (AD) is a powerful technique that all machine learning frameworks are expected to implement, because gradients are so important for this work (e.g. with [SGD](https://en.wikipedia.org/wiki/Stochastic_gradient_descent)). TensorFlow implements automatic differentiation as a TensorFlow graph transformation, but we would like to deploy more powerful techniques to improve user experience in failure cases, enable differentiating custom data structures, recursion, and higher-order differentiation. As such, we built a stand-alone AD feature for Swift: one that is completely independent of the standard TensorFlow implementation of AD, and also completely independent of TensorFlow support in Swift.

The way this works is by having Swift AD support arbitrary user-defined types. Swift for TensorFlow builds on this by making its Tensor types conform to the AD system, allowing them to participate as you’d expect. A nice thing about this is that Swift programmers interested in non-Tensor numerical analysis can use AD for any other types that are important for their work.
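For instance, a sketch of how this might look in practice (illustrative only: the differential operator's surface syntax, written here as `#gradient`, was still in flux at the time):

```swift
import TensorFlow

// A simple differentiable function on tensors.
func f(_ x: Tensor<Float>) -> Tensor<Float> {
  return x * x + 3 * x
}

// Hypothetical use of the compiler-based AD system: `#gradient` asks the
// compiler to synthesize the derivative of `f`, mathematically 2x + 3.
let df = #gradient(of: f)
```

Because the AD system works over arbitrary conforming types, the same operator could in principle apply to non-Tensor numeric types as well.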

@@ -233,5 +233,3 @@ We’re focusing on finishing the basic Swift for TensorFlow model, gaining more
**Differentiating Opaque Closures:** Statically differentiating a function requires the body of the function to be visible to the compiler. However, this limits the expressiveness of the differential operator, e.g. users can’t apply the gradient operator to a function argument that has a function type because the compiler can’t always see into the body of the original function. We will discuss the possibility to introduce a new function convention - when a differentiable function is passed around, a pointer to its primal and adjoint gets passed along. This enables the compiler to directly call the primal and the adjoint, without the need to see into the function declaration. This is important for class and protocol methods.

**Quantization Support:** We believe we can get a much better user experience for [fixed-point quantization tools](https://www.tensorflow.org/performance/quantization) if we integrate them into the compiler, and this should help with integrating quantization into the training process.


