Commit 04a6455

revising the structure of the repository
1 parent 9886fc5 commit 04a6455

30 files changed: +81 -0 lines changed

.gitignore

Lines changed: 3 additions & 0 deletions
@@ -0,0 +1,3 @@
.DS_Store
.ipynb_checkpoints/
logs/
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.

ch8/01_Intro_to_CNN/readme.md

Lines changed: 13 additions & 0 deletions
@@ -0,0 +1,13 @@
## Introduction to Convolutional Neural Networks

Convolutional Neural Networks (CNNs) are responsible for the major breakthroughs in image recognition over the past few years.

In mathematics, a convolution is an operation that combines two functions to produce a third. In our case, we consider sliding a filter (a small matrix of weights) across an image, taking an element-wise product and summing at each position. See the below diagram for an example of how this may work; a short code sketch follows it.

![Convolutional Filter](../images/01_intro_cnn.png)
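
To make the sliding-filter idea concrete, below is a minimal NumPy sketch of convolving one 2-D filter across a grayscale image (stride 1, no padding). The image and filter values are toy numbers chosen purely for illustration; TensorFlow provides the same operation for batched 4-D tensors as `tf.nn.conv2d`.

```python
import numpy as np

def conv2d(image, kernel):
    """Slide a 2-D filter across a 2-D image (stride 1, 'valid' padding).

    Each output value is the sum of the element-wise product between the
    filter and the image patch beneath it -- one feature value per position.
    """
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 "image"
kernel = np.array([[1., -1.],
                   [1., -1.]])                    # toy 2x2 filter
print(conv2d(image, kernel))                      # 3x3 feature map
```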

CNNs generally follow a common structure: (input array) -> (convolutional filter layer) -> (pooling) -> (activation layer). The above diagram depicts how a convolutional layer may create one feature. In practice, filters are multidimensional and end up creating many features, and it is also common to have completely separate filters of different sizes acting on the same layer. After the convolutional filter, it is common to apply a pooling layer; this may be a max-pooling, an average pooling, or another aggregation. A key concept here is that the pooling layer has no trainable parameters while it decreases the layer size. See the below diagram for an example of max-pooling; a pooling sketch follows it.

![Max Pooling](../images/01_intro_cnn2.png)
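
Here is a max-pooling sketch, assuming non-overlapping 2x2 windows: the output keeps only the largest value in each window, halving each spatial dimension without introducing any trainable parameters. TensorFlow offers the equivalent for 4-D tensors as `tf.nn.max_pool`.

```python
import numpy as np

def max_pool(feature_map, size=2):
    """Non-overlapping max pooling: no trainable parameters, and each
    spatial dimension shrinks by a factor of `size`."""
    h, w = feature_map.shape
    h, w = h - h % size, w - w % size              # trim ragged edges
    blocks = feature_map[:h, :w].reshape(h // size, size, w // size, size)
    return blocks.max(axis=(1, 3))                 # max of each window

fmap = np.array([[1., 3., 2., 0.],
                 [4., 8., 1., 1.],
                 [0., 2., 9., 5.],
                 [6., 1., 3., 7.]])
print(max_pool(fmap))  # [[8. 2.]
                       #  [6. 9.]]
```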

After the max pooling, there is generally an activation layer. One of the more common activation layers is the ReLU (Rectified Linear Unit). See [Chapter 1, Section 6](../../01_Introduction/06_Implementing_Activation_Functions) for examples.
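
The ReLU itself is just `max(0, x)` applied element-wise, so a sketch is a one-liner (TensorFlow exposes it as `tf.nn.relu`):

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: zero out negative values, keep positives."""
    return np.maximum(0., x)

print(relu(np.array([-2., -0.5, 0., 1.5, 3.])))  # [0.  0.  0.  1.5 3. ]
```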

ch8/02_Intro_to_CNN_MNIST/02_introductory_cnn.ipynb

Lines changed: 1 addition & 0 deletions
Large diffs are not rendered by default.
