TensorFlow Models

This repository contains machine learning models implemented in TensorFlow. The models are maintained by their respective authors. To propose a model for inclusion, please submit a pull request.

Currently, the models are compatible with TensorFlow 1.0 or later. If you are running TensorFlow 0.12 or earlier, please upgrade your installation.
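A quick way to check which version is installed is shown below. This is a minimal sketch; it assumes TensorFlow was installed as the standard `tensorflow` pip package.

```python
# Print the installed TensorFlow version; the models in this
# repository expect 1.0 or later.
import tensorflow as tf

print(tf.__version__)
```

If the printed version is older than 1.0, running `pip install --upgrade tensorflow` should bring the installation up to date.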

Models

  • adversarial_text: semi-supervised sequence learning with adversarial training.
  • autoencoder: various autoencoders.
  • compression: compressing and decompressing images using a pre-trained Residual GRU network.
  • differential_privacy: privacy-preserving student models from multiple teachers.
  • domain_adaptation: domain separation networks.
  • im2txt: image-to-text neural network for image captioning.
  • inception: deep convolutional networks for computer vision.
  • learning_to_remember_rare_events: a large-scale life-long memory module for use in deep learning.
  • lm_1b: language modeling on the one billion word benchmark.
  • namignizer: recognize and generate names.
  • neural_gpu: highly parallel neural computer.
  • neural_programmer: neural network augmented with logic and mathematical operations.
  • next_frame_prediction: probabilistic future frame synthesis via cross convolutional networks.
  • real_nvp: density estimation using real-valued non-volume preserving (real NVP) transformations.
  • resnet: deep and wide residual networks.
  • skip_thoughts: recurrent neural network sentence-to-vector encoder.
  • slim: image classification models in TF-Slim.
  • street: identify the name of a street (in France) from an image using a Deep RNN.
  • swivel: the Swivel algorithm for generating word embeddings.
  • syntaxnet: neural models of natural language syntax.
  • textsum: sequence-to-sequence with attention model for text summarization.
  • transformer: spatial transformer network, which allows the spatial manipulation of data within the network.
  • tutorials: models described in the TensorFlow tutorials.
  • video_prediction: predicting future video frames with neural advection.
