Commit

sgugger committed Apr 22, 2020
1 parent 5fb6482 commit f94b161
Showing 4 changed files with 5 additions and 87 deletions.
45 changes: 3 additions & 42 deletions 01_intro.ipynb
@@ -28,11 +28,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Hello, and thank you for letting us join you on your deep learning journey, however far along it you may be! If you are a complete beginner to deep learning and machine learning, then you are most welcome here. Our only expectation is that you already know how to code, preferably in Python.\n",
"\n",
"> note: If you don't have any experience coding, that's OK too! The first three chapters have been explicitly written in a way that will allow executives, product managers, etc. to understand the most important things they'll need to know about deep learning. When you see bits of code in the text, try to look over them to get an intuitive sense of what they're doing. We'll explain them line by line. The details of the syntax are not nearly as important as the high level understanding of what's going on.\n",
"\n",
"If you are already a confident deep learning practitioner, then you will also find a lot here. In this book we will be showing you how to achieve world-class results, including techniques from the latest research. As we will show, this doesn't require advanced mathematical training, or years of study. It just requires a bit of common sense and tenacity."
"Hello, and thank you for letting us join you on your deep learning journey, however far along you may be! In this chapter, we will tell you a little bit more about what to expect in this book, introduce the key concepts behind deep learning, and train our first models on different tasks. It doesn't matter if you don't come from a technical or a mathematical background (though it's okay if you do!); we wrote this book to put deep learning in the hands of as many people as possible."
]
},
{
@@ -158,42 +154,7 @@
"\n",
"Although researchers showed 30 years ago that to get good practical performance you need to use even more layers of neurons, it is only in the last decade that this principle has been more widely appreciated and applied. Neural networks are now finally living up to their potential, thanks to the use of more layers, coupled with the capacity to do so due to improvements in computer hardware, increases in data availability, and algorithmic tweaks that allow neural networks to be trained faster and more easily. We now have what Rosenblatt had promised: \"a machine capable of perceiving, recognizing and identifying its surroundings without any human training or control\".\n",
"\n",
"This is what you will learn how to build in this book."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## What you will learn"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"To be exact, after reading this book you will know:\n",
"\n",
"- How to train models that achieve state of the art results in:\n",
" - Computer vision: Image classification (e.g. classify pet photos by breed), and image localization and detection (e.g. find where the animals in an image are)\n",
" - NLP: Document classification (e.g. movie review sentiment analysis), and language modelling\n",
" - Tabular data (e.g. sales prediction) with categorical data, continuous data, and mixed data, including time series\n",
" - Collaborative filtering (e.g. movie recommendation)\n",
"- How to turn your models into web applications\n",
"- Why and how deep learning models work, and how to use that knowledge to improve the accuracy, speed, and reliability of your models\n",
"- The latest deep learning techniques which really matter in practice\n",
"- How to read a deep learning research paper\n",
"- How to implement deep learning algorithms from scratch\n",
"- How to think about ethical implications of your work, to help ensure that you're making the world a better place, and that your work isn't misused for harm\n",
"\n",
"See the table of contents for a complete list; but to give you a taste, here's some of the techniques covered (don't worry if none of these words mean anything to you yet – you'll learn them all soon): Affine functions and non-linearities; Parameters and activations; Random init and transfer learning; SGD, Momentum, Adam and more optimizers; Convolutions; Batch normalization; Dropout; Data augmentation; Weight decay; Resnet and Densenet architectures; Image classification and regression; Embeddings; Recurrent neural networks (RNNs); Transformers; Segmentation; U-net; Generative Adversarial Networks (GANs), and much more."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"> note: If you look at the end of each chapter, you'll find a questionnaire. That's a great place also to see what we cover in each chapter, since (we hope!) by the end of each chapter you'll be able to answer all the questions there. In fact, one of our reviewers (thanks Fred!) said that he likes to read the questionnaire *first*, before reading the chapter, so that way he knows what to look for."
"This is what you will learn how to build in this book. Since we are going to be spending a lot of time together, let's get to know each other a bit… "
]
},
{
@@ -207,7 +168,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Since we are going to be spending a lot of time together, let's get to know each other a bit… We are Sylvain and Jeremy, your guides on this journey. We hope that you will find us well suited for this position.\n",
"We are Sylvain and Jeremy, your guides on this journey. We hope that you will find us well suited for this position.\n",
"\n",
"Jeremy has been using and teaching machine learning for around 30 years. He started using neural networks 25 years ago. During this time, he has led many companies and projects which have machine learning at their core, including founding the first company to focus on deep learning and medicine, Enlitic, and taking on the role of President and Chief Scientist of the world's largest machine learning community, Kaggle. He is the co-founder, along with Dr Rachel Thomas, of fast.ai, the organisation that built the course this book is based on.\n",
"\n",
20 changes: 1 addition & 19 deletions 06_multicat.ipynb
@@ -773,25 +773,7 @@
"source": [
"def binary_cross_entropy(inputs, targets):\n",
" inputs = inputs.sigmoid()\n",
" return torch.where(targets==1, inputs, 1-inputs).log().mean()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"binary_cross_entropy(activs, y)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"F.binary_cross_entropy_with_logits(activs, y)"
" return -torch.where(targets==1, inputs, 1-inputs).log().mean()"
]
},
{
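The change above adds a leading minus sign to `binary_cross_entropy`: the log of a probability is always ≤ 0, so without the negation the function returns the mean log-likelihood rather than a loss, and minimizing it would drive the model in the wrong direction. A minimal sketch (using made-up `activs` and `y` tensors as stand-ins for the notebook's activations and targets) checking that the corrected function agrees with PyTorch's built-in `F.binary_cross_entropy_with_logits`:

```python
import torch
import torch.nn.functional as F

def binary_cross_entropy(inputs, targets):
    # Sigmoid squashes raw activations (logits) into probabilities in (0, 1).
    inputs = inputs.sigmoid()
    # Keep p where the target is 1 and (1 - p) where it is 0, then take the
    # negative mean log-likelihood. The leading minus is essential: log of a
    # probability is <= 0, so without it the "loss" would be negative and
    # gradient descent would make predictions worse, not better.
    return -torch.where(targets == 1, inputs, 1 - inputs).log().mean()

# Hypothetical stand-ins for the notebook's model activations and labels.
activs = torch.randn(4, 3)               # raw model outputs (logits)
y = torch.randint(0, 2, (4, 3)).float()  # multi-label binary targets

ours = binary_cross_entropy(activs, y)
ref = F.binary_cross_entropy_with_logits(activs, y)
assert torch.isclose(ours, ref)  # matches the library implementation
```

`F.binary_cross_entropy_with_logits` fuses the sigmoid and the log into one numerically stable operation, which is why it takes the raw activations directly.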
7 changes: 0 additions & 7 deletions clean/01_intro.ipynb
@@ -31,13 +31,6 @@
"## Neural networks: a brief history"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## What you will learn"
]
},
{
"cell_type": "markdown",
"metadata": {},
20 changes: 1 addition & 19 deletions clean/06_multicat.ipynb
@@ -510,25 +510,7 @@
"source": [
"def binary_cross_entropy(inputs, targets):\n",
" inputs = inputs.sigmoid()\n",
" return torch.where(targets==1, inputs, 1-inputs).log().mean()"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"binary_cross_entropy(activs, y)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"F.binary_cross_entropy_with_logits(activs, y)"
" return -torch.where(targets==1, inputs, 1-inputs).log().mean()"
]
},
{
