
Welcome to dl Documentation

Documentation for dl.

Table of Contents

C1 - Neural Networks and Deep Learning

Table of Contents

Week1 - Introduction to Deep Learning
Table of Contents
What is a neural network?

Size -> Neuron -> Price

Concept
ReLU Function

ReLU (Rectified Linear Unit) is a function that is zero for negative inputs and grows linearly for positive inputs.
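As a quick illustration (a minimal sketch, not from the course materials), ReLU can be written in one line of Python:

```python
def relu(z):
    # zero for negative inputs, identity for non-negative inputs
    return max(0.0, z)

print(relu(-2.0))  # 0.0
print(relu(3.0))   # 3.0
```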

Week2 - Logistic Regression as a Neural Network
Table of Contents
Binary Classification
Logistic Regression

Logistic regression is used for binary classification, where what we mostly want is the probability that the output is 1: typically we predict 1 when that probability is above 50% and 0 otherwise. By contrast, linear regression computes a dot product plus a constant, which can be negative or very large, and such a value is poorly suited to represent a probability.

So the essence of logistic regression is to apply a sigmoid function on top of linear regression, mapping the interval (-∞, +∞) onto (0, 1).

                           1
σ(wᵀx + b) = σ(z) = ---------------
                      1 + e^(-z)

        | /------
        |/
   0.5  /
       /|
------/ |
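The sigmoid squashing and the full logistic-regression prediction can be sketched in a few lines of Python (the helper names `sigmoid` and `predict` are illustrative, not from the course):

```python
import math

def sigmoid(z):
    # maps any real z into the interval (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def predict(w, x, b):
    # logistic regression: sigmoid of the linear combination w.x + b
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return sigmoid(z)

print(sigmoid(0.0))  # 0.5
```

Note that `sigmoid(0) == 0.5`, matching the crossing point in the sketch above.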
Logistic Regression Cost Function

The traditional least-mean-squares (LMS) error is not a good cost function for binary classification, because it makes the cost function non-convex.

The logistic regression cost function looks like this: Cost(y', y) = - ( y * log(y') + (1 - y) * log(1 - y') )

Our goal in optimizing the parameters is to minimize the value of this cost function.
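A minimal sketch of this cross-entropy cost in Python (the function name `cost` is illustrative; `y_pred` is the predicted probability y' and `y` is the true label):

```python
import math

def cost(y_pred, y):
    # cross-entropy cost for a single example:
    # -( y * log(y') + (1 - y) * log(1 - y') )
    return -(y * math.log(y_pred) + (1 - y) * math.log(1 - y_pred))

print(cost(0.99, 1))  # small: confident and correct
print(cost(0.01, 1))  # large: confident and wrong
```

When the prediction matches the label the cost is near zero, and it grows without bound as a confident prediction turns out wrong, which is exactly the pressure gradient descent needs.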

About the Author

(\ (\
( -.-)o    I am a lovely Rabbit!
o_(")(")

Sanhe Hu has been an active Python developer since 2010, and currently works at Whiskerlabs as a Data Scientist. His research areas include Machine Learning, Big Data Infrastructure, Blockchain, Business Intelligence, Open Cloud, and Distributed Systems. He loves photography, vocal music, the outdoors, arts, games, and, above all, Python.

API Documentation