Deep Neural Networks: Algorithms

Maths

BLAS

Probability and Information Theory

Numerical Computation

Neurons

Weights

Zeros

Ones

Constants

Random Uniform

Random Normal

Truncated Normal

Uniform Unit Scaling

Random Walk
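
A minimal NumPy sketch of the truncated-normal initializer from the list above (the 2-sigma re-draw convention is an assumption; frameworks differ in the cutoff they use):

```python
import numpy as np

def truncated_normal(shape, mean=0.0, std=1.0, rng=None):
    """Draw from N(mean, std^2), re-sampling anything beyond 2 std
    (a common truncated-normal convention for weight init)."""
    rng = rng or np.random.default_rng(0)
    w = rng.normal(mean, std, size=shape)
    bad = np.abs(w - mean) > 2.0 * std
    while bad.any():
        w[bad] = rng.normal(mean, std, size=bad.sum())
        bad = np.abs(w - mean) > 2.0 * std
    return w

w = truncated_normal((500, 800), std=0.01)   # e.g. an fc weight matrix
```

Zeros, ones, constants, and the plain uniform/normal variants are the corresponding `np.zeros`, `np.ones`, `np.full`, `rng.uniform`, and `rng.normal` calls.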

Layers

Forward:

Convolutions

1D Convolution
2D Convolution
3D Convolution
Group Convolution
Dilated Convolution
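
A 1D sketch covering the plain and dilated cases (valid padding assumed; "convolution" here is the cross-correlation most DL frameworks implement):

```python
import numpy as np

def conv1d(x, k, stride=1, dilation=1):
    """Valid cross-correlation. dilation > 1 spreads the kernel
    taps apart (dilated / atrous convolution)."""
    x, k = np.asarray(x, float), np.asarray(k, float)
    span = dilation * (len(k) - 1) + 1          # receptive field of the kernel
    out_len = (len(x) - span) // stride + 1
    return np.array([np.dot(x[i*stride : i*stride + span : dilation], k)
                     for i in range(out_len)])
```

The 2D/3D cases extend the same window-and-dot-product loop over more axes; group convolution splits the channel axis into independent groups.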

Pooling

Max Pooling
Avg Pooling
RoI Pooling
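
Max and average pooling share one windowed reduction; a small sketch (RoI pooling adds per-region adaptive bins on top of this idea):

```python
import numpy as np

def pool2d(x, size=2, stride=2, mode="max"):
    """Max / average pooling over strided square windows."""
    h, w = x.shape
    oh, ow = (h - size) // stride + 1, (w - size) // stride + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            win = x[i*stride : i*stride + size, j*stride : j*stride + size]
            out[i, j] = win.max() if mode == "max" else win.mean()
    return out
```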

Regions

Region Proposal Network

Fully Connected

Inner Product

Activations

Single-Fold X

Identity

Step

Piecewise Linear

Sigmoid

Complementary Log Log

Bipolar

Bipolar Sigmoid

TanH

LeCun’s TanH

Hard TanH

Absolute

Rectifier

Modifications of ReLU

Smooth Rectifier

Logit

Probit

Cosine
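
A few of the single-input activations above, as one-liners (LeCun's tanh constants are the 1.7159 / 2/3 scaling from his efficient-backprop recipe):

```python
import numpy as np

def sigmoid(x):    return 1.0 / (1.0 + np.exp(-x))
def hard_tanh(x):  return np.clip(x, -1.0, 1.0)
def relu(x):       return np.maximum(0.0, x)              # rectifier
def softplus(x):   return np.log1p(np.exp(x))             # smooth rectifier
def lecun_tanh(x): return 1.7159 * np.tanh(2.0 / 3.0 * x)
```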

Multi-Fold Xs

Softmax

Maxout
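
Both multi-input activations, sketched in NumPy (the max-shift in softmax is the standard numerical-stability trick; the maxout parameter shapes are one common convention):

```python
import numpy as np

def softmax(z):
    z = z - z.max()                 # shift so exp never overflows
    e = np.exp(z)
    return e / e.sum()

def maxout(x, W, b):
    """Maxout unit: element-wise max over k affine pieces.
    W: (k, n_out, n_in), b: (k, n_out)."""
    return (W @ x + b).max(axis=0)
```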

RBF

(RBF) Gaussian

(RBF) Multiquadratic

(RBF) Inverse Multiquadratic

Normalizations

Regularizations

L1 Regularization

L2 Regularization

Dropout
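
The two weight penalties and inverted dropout in a few lines (the 1/2 factor on L2 and the train-time scaling are common conventions, not the only ones):

```python
import numpy as np

def l1_penalty(w, lam): return lam * np.abs(w).sum()
def l2_penalty(w, lam): return 0.5 * lam * (w ** 2).sum()

def dropout(x, p=0.5, train=True, rng=None):
    """Inverted dropout: drop units with prob p at train time and
    scale survivors by 1/(1-p) so test time needs no rescaling."""
    if not train:
        return x
    rng = rng or np.random.default_rng(0)
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)
```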

Backward:

Losses

Contrastive Loss

Hinge Loss

Euclidean Loss

Infogain Loss

Sigmoid Cross Entropy Loss

Softmax Loss

Multinomial Logistic Loss

Smooth L1 Loss
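
Three of these losses, sketched for a single example (the hinge form shown is one common multiclass variant; smooth L1 is the delta=1 Huber form used for box regression in Fast(er) R-CNN):

```python
import numpy as np

def euclidean_loss(pred, target):
    """Sum of squares, halved so the gradient is simply (pred - target)."""
    return 0.5 * ((pred - target) ** 2).sum()

def hinge_loss(scores, y):
    """Penalize classes whose score comes within margin 1 of the true class y."""
    margins = np.maximum(0.0, 1.0 + scores - scores[y])
    margins[y] = 0.0
    return margins.sum()

def smooth_l1(x):
    """Quadratic near 0, linear elsewhere: 0.5*x^2 if |x| < 1 else |x| - 0.5."""
    ax = np.abs(x)
    return np.where(ax < 1.0, 0.5 * x ** 2, ax - 0.5).sum()
```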

Gradients

Stochastic Gradient Descent

AdaDelta

AdaGrad (Adaptive Gradient)

Adam

Nesterov’s Accelerated Gradient

RMSProp
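
Two update rules from the list, written as pure step functions (hyperparameters are the usual defaults; the quadratic demo at the end is illustrative only):

```python
import numpy as np

def sgd_momentum(w, g, v, lr=0.1, mu=0.9):
    """Classical momentum: accumulate a velocity, then step along it."""
    v = mu * v - lr * g
    return w + v, v

def adam(w, g, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """Adam: bias-corrected running means of the gradient and its square."""
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g ** 2
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Minimize f(w) = w^2 (gradient 2w) with momentum SGD.
w, v = 5.0, 0.0
for _ in range(300):
    w, v = sgd_momentum(w, 2 * w, v)
```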

Networks

Faster R-CNN

YOLO

Solvers

Train

Validate

Test

Models

Convolutional Neural Networks

Network Year
LeNet 1998
AlexNet 2012
VGG 9/2014
GoogLeNet 9/2014
InceptionBN 2/2015
Inception V3 12/2015
ResNet 12/2015

LeNet

Model
00 input
01 conv                             5x5/1,20      20x24x24
02 activation       relu / tanh                   20x24x24
03 pool                             2x2/2         20x12x12
04 conv                             5x5/1,50      50x8x8
05 activation       relu / tanh                   50x8x8
06 pool                             2x2/2         50x4x4
07 flatten                                        800
08 fc                               500           500
09 activation       relu / tanh                   500
10 fc                               10            10
11 softmax
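
The spatial sizes in the table follow from the usual output-size formula; a quick check, assuming the 28x28 MNIST input that the numbers imply:

```python
def conv_out(n, k, s=1, p=0):
    """Spatial output size of a conv/pool layer: floor((n + 2p - k) / s) + 1."""
    return (n + 2 * p - k) // s + 1

# Walk a 28x28 input through the layer table above.
n = 28
n = conv_out(n, 5)        # conv 5x5/1 -> 24
n = conv_out(n, 2, s=2)   # pool 2x2/2 -> 12
n = conv_out(n, 5)        # conv 5x5/1 -> 8
n = conv_out(n, 2, s=2)   # pool 2x2/2 -> 4
flat = 50 * n * n         # flatten    -> 800
```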

AlexNet

Model
00 input

01 conv                         11x11/4,96       96x54x54
02 activation       relu                         96x54x54
03 pool             max         3x3/2            96x27x27
04 LRN                                           96x27x27
05 conv                         5x5/1,256        256x27x27
06 activation       relu                         256x27x27
07 pool             max         3x3/2            256x13x13
08 LRN                                           256x13x13

09 conv                         3x3/1,384        384x13x13
10 activation       relu                         384x13x13
11 conv                         3x3/1,384        384x13x13
12 activation       relu                         384x13x13
13 conv                         3x3/1,256        256x13x13
14 activation       relu                         256x13x13
15 pool             max         3x3/2            256x6x6
16 flatten                                       9216

17 fc                           4096             4096
18 activation       relu                         4096
19 dropout                                       4096
20 fc                           4096             4096
21 activation       relu                         4096
22 dropout                                       4096

23 fc                           2                2
24 softmax

Overfeat

VGG

Model
00 input

01 conv                           3x3/1,64        64x224x224
02 activation       relu                          64x224x224
03 pool             max           2x2/2           64x112x112
04 conv                           3x3/1,128       128x112x112
05 activation       relu                          128x112x112
06 pool             max           2x2/2           128x56x56

07 conv                           3x3/1,256       256x56x56
08 activation       relu                          256x56x56
09 conv                           3x3/1,256       256x56x56
10 activation       relu                          256x56x56
11 pool             max           2x2/2           256x28x28

12 conv                           3x3/1,512       512x28x28
13 activation       relu                          512x28x28
14 conv                           3x3/1,512       512x28x28
15 activation       relu                          512x28x28
16 pool             max           2x2/2           512x14x14

17 conv                           3x3/1,512       512x14x14
18 activation       relu                          512x14x14
19 conv                           3x3/1,512       512x14x14
20 activation       relu                          512x14x14
21 pool             max           2x2/2           512x7x7

22 flatten                                        25088
23 fc                             4096            4096
24 activation       relu                          4096
25 dropout                                        4096
26 fc                             4096            4096
27 activation       relu                          4096
28 dropout                                        4096

29 fc                             2               2
30 softmax

GoogLeNet

ResNet

Applications

MNIST

CIFAR

PASCAL

COCO

ImageNet