Welcome to dl Documentation¶
Documentation for dl.
Quick Links¶
Table of Contents¶
C1 - Neural Networks and Deep Learning¶
Table of Contents¶
Week1 - Introduction to Deep Learning¶
Week2 - Logistic Regression as a Neural Network¶
Table of Contents¶
Logistic regression is used for binary classification. In binary classification, what we usually want is the probability that the output is 1: conventionally, a probability above 50% is classified as 1 and one below 50% as 0. Linear regression, by contrast, produces a dot product plus a constant, which is an arbitrary number that can be negative or very large, and is therefore a poor representation of a probability.
The essence of logistic regression is to take linear regression and apply a sigmoid function on top of it, mapping the interval (-∞, +∞) onto (0, 1).
σ(wT * x + b) = σ(z) = 1 / (1 + e ** (-z))
(Figure: the sigmoid curve, an S-shape that passes through 0.5 at z = 0 and approaches 0 and 1 asymptotically.)
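As a minimal sketch (the function names here are illustrative, not from the original notes), the sigmoid mapping and the logistic regression prediction can be written in plain Python:

```python
import math

def sigmoid(z):
    """Map any real number z into the interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def predict(w, x, b):
    """Logistic regression prediction: sigmoid of the linear score w.x + b."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return sigmoid(z)
```

Note that `sigmoid(0)` is exactly 0.5, the conventional decision boundary between classifying an example as 1 or 0.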
The traditional least-mean-squares (LMS) error is not a good cost function for binary classification, because it makes the cost function non-convex.
The logistic regression cost function looks like this: Cost(y', y) = - ( y * log(y') + (1 - y) * log(1 - y') )
Our parameter optimization goal is to minimize the value of this cost function.
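A quick sketch of this cost function for a single example (the function name is illustrative, not from the original notes):

```python
import math

def logistic_cost(y_hat, y):
    """Cross-entropy cost for one example: -(y*log(y') + (1-y)*log(1-y'))."""
    return -(y * math.log(y_hat) + (1 - y) * math.log(1 - y_hat))
```

The cost is small when the predicted probability y' agrees with the true label y, and grows without bound as the prediction becomes confidently wrong, which is what drives the parameters toward good predictions during minimization.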
About the Author¶
(\ (\
( -.-)o I am a lovely Rabbit!
o_(")(")
Sanhe Hu has been an active Python developer since 2010, and is now working at Whiskerlabs as a data scientist. His research areas include machine learning, big data infrastructure, blockchain, business intelligence, open cloud, and distributed systems. He loves photography, singing, the outdoors, arts, games, and, above all, Python.
- My Github: https://github.com/MacHu-GWU
- My HomePage: http://www.sanhehu.org/