Convolutional Neural Network
2018-08-19
Machine-Learning
333
This article is a reflection on my previous work FaceLock, a project that recognizes the user's face and locks the computer if the user isn't present for a certain time. A CNN is used to recognize different faces. I watched the Coursera course Convolutional Neural Networks by Andrew Ng to understand CNNs better, so this is also a learning note on the course.
One Layer of a Convolutional Network
In a non-convolutional (fully connected) network, we have the following formula:

$$z^{[l]} = W^{[l]} a^{[l-1]} + b^{[l]}, \qquad a^{[l]} = g(z^{[l]})$$

Similarly, in the convolutional network, we can have:

$$Z^{[l]} = W^{[l]} * A^{[l-1]} + b^{[l]}, \qquad A^{[l]} = g(Z^{[l]})$$

$*$ is the convolution operation.
$A^{[l-1]}$ is the input matrix.
$W^{[l]}$ is the filter. Different filters can detect different features, e.g. vertical edges, diagonal edges, etc.
$b^{[l]}$ is the bias.
$g$ is an activation function.
$A^{[l]}$ is the output matrix, which can be fed to the next layer.
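To make the formula concrete, here is a minimal NumPy sketch of one convolutional layer's forward pass. This is an illustrative example, not the FaceLock code; the array shapes, stride of 1, and "valid" (no) padding are assumptions.

```python
import numpy as np

def conv_layer_forward(A_prev, W, b, activation=lambda z: np.maximum(z, 0)):
    """One convolutional layer: Z = W * A_prev + b, A = g(Z).

    A_prev: input of shape (n_H, n_W, n_C), e.g. a 64x64x3 image
    W:      filters of shape (f, f, n_C, n_F), i.e. n_F filters of size f x f x n_C
    b:      biases of shape (n_F,)
    Assumes stride 1 and no padding ("valid" convolution).
    """
    n_H, n_W, n_C = A_prev.shape
    f, _, _, n_F = W.shape
    out_H, out_W = n_H - f + 1, n_W - f + 1
    Z = np.zeros((out_H, out_W, n_F))

    for i in range(out_H):
        for j in range(out_W):
            patch = A_prev[i:i + f, j:j + f, :]        # local region of the input
            for k in range(n_F):
                # elementwise product with the k-th filter, summed, plus its bias
                Z[i, j, k] = np.sum(patch * W[..., k]) + b[k]

    return activation(Z)  # A = g(Z), fed to the next layer

# Example: a 6x6x3 input with 2 filters of size 3x3x3 gives a 4x4x2 output
A = conv_layer_forward(np.random.rand(6, 6, 3), np.random.rand(3, 3, 3, 2), np.zeros(2))
print(A.shape)  # (4, 4, 2)
```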
Calculating the Number
The Number of Parameters
Suppose we have 10 filters in one layer of a neural network.
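For example, assuming each of the 10 filters has size $3 \times 3 \times 3$ (the filter size here is an assumption for illustration): each filter has $3 \cdot 3 \cdot 3 = 27$ weights plus 1 bias, so the layer has $(27 + 1) \times 10 = 280$ parameters in total, independent of the size of the input image.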
AdaBoost
2018-03-30
Machine-Learning
1004
Python implementation: AdaBoost - Donny-Hikari - Github
Introduction
AdaBoost is short for Adaptive Boosting. Boosting is an Ensemble Learning method; other Ensemble Learning methods include Bagging, Stacking, etc. The differences among Bagging, Boosting, and Stacking are as follows (a short code comparison follows the list):
Bagging:
Equal-weight voting. Trains each model on a randomly drawn subset of the training set.
Boosting:
Trains each new model instance to emphasize the training instances that previous models misclassified. Usually has better accuracy than bagging, but also tends to overfit.
Stacking:
Trains a learning algorithm to combine the predictions of several other learning algorithms.
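As a concrete illustration of the three approaches, the sketch below uses scikit-learn's off-the-shelf ensembles. This is an illustrative example, not the implementation linked above; the base estimators, dataset, and hyperparameters are arbitrary choices.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, AdaBoostClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Bagging: equal-weight voting over models trained on random subsets of the data
bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=0)

# Boosting: each new model focuses on the samples the previous ones misclassified
boosting = AdaBoostClassifier(n_estimators=50, random_state=0)

# Stacking: a meta-learner combines the predictions of several base learners
stacking = StackingClassifier(
    estimators=[("tree", DecisionTreeClassifier()), ("nb", GaussianNB())],
    final_estimator=LogisticRegression(),
)

for name, model in [("Bagging", bagging), ("Boosting", boosting), ("Stacking", stacking)]:
    print(name, model.fit(X, y).score(X, y))
```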
The Formulas
Given an N×M matrix X (the features) and an N-dimensional vector y (the labels), where N is the number of samples and M is the number of features, AdaBoost trains T weak classifiers with the following steps:
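A minimal sketch of these training steps, assuming binary labels $y \in \{-1, +1\}$ and decision stumps (depth-1 trees) as the weak classifiers; this is the standard discrete AdaBoost loop, not necessarily identical to the linked implementation.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_train(X, y, T=50):
    """Train T weak classifiers on (X, y), where y takes values in {-1, +1}."""
    N = len(X)
    w = np.full(N, 1.0 / N)                        # start with uniform sample weights
    stumps, alphas = [], []

    for _ in range(T):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)            # fit weak learner on weighted samples
        pred = stump.predict(X)

        err = np.sum(w[pred != y])                  # weighted error rate
        err = np.clip(err, 1e-10, 1 - 1e-10)        # avoid division by zero / log(0)
        alpha = 0.5 * np.log((1 - err) / err)       # weight of this weak classifier

        w *= np.exp(-alpha * y * pred)              # emphasize misclassified samples
        w /= w.sum()                                # renormalize to a distribution

        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    """Weighted vote of the weak classifiers: sign(sum_t alpha_t * h_t(x))."""
    scores = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.sign(scores)
```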