In machine learning, the perceptron is an algorithm for the supervised learning of binary classifiers: it decides which of two possible classes a given input belongs to. It is a type of linear classifier, that is, a classification algorithm that makes its predictions based on a linear predictor function combining a set of weights with the feature vector describing the input, with the weights adjusted by an error-correcting update such as the delta rule (Kugler 77). The perceptron learning algorithm is also an online algorithm, in that it processes the elements of the training set one at a time.
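The error-correcting update invoked here is commonly written as w ← w + η(d − y)x, where w is the weight vector, x the feature vector of the current training example, d the desired output for that example, y the output the perceptron actually produced, and η a small positive learning-rate constant; applying this correction once per example is what makes the procedure online.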
The perceptron itself is a binary classifier that maps its input x (a real-valued vector) to an output value f(x) (a single binary value): f(x) = 1 if w · x + b > 0, and f(x) = 0 otherwise.
Here w is a vector of real-valued weights, w · x is the dot product (which in this case computes a weighted sum), and b is the bias, a constant term that does not depend on the input value.
The value of f(x) (0 or 1) is used to classify x as either a positive or a negative instance, in the case of a binary classification problem. If b is negative, then the weighted combination of inputs must produce a positive value greater than |b| in order to push the classifier neuron over the 0 threshold. Spatially, the bias alters the position (though not the orientation) of the decision boundary. The perceptron learning algorithm does not terminate if the learning set is not linearly separable; when the vectors are not linearly separable, learning will never reach a point at which all vectors are classified properly.
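As a concrete illustration of the procedure described above, the sketch below implements the online update in Python. It is only an outline under the definitions given here; the function names, the learning-rate value, and the epoch cap are assumptions of this illustration rather than details taken from Kugler.

def predict(w, b, x):
    # Fire (output 1) only if the weighted sum plus the bias clears the 0 threshold.
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

def train(samples, n_features, lr=1.0, max_epochs=1000):
    # Online training: the weights and bias are corrected after each example.
    # If the samples are not linearly separable, the error count never reaches
    # zero and the loop stops only because of the max_epochs cap.
    w = [0.0] * n_features
    b = 0.0
    for _ in range(max_epochs):
        errors = 0
        for x, d in samples:                 # d is the desired output, 0 or 1
            error = d - predict(w, b, x)     # -1, 0, or +1
            if error != 0:
                errors += 1
                w = [wi + lr * error * xi for wi, xi in zip(w, x)]
                b += lr * error
        if errors == 0:                      # every vector classified properly
            break
    return w, b

On a linearly separable set such as the logical AND of two inputs, train([((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)], 2) terminates quickly; on XOR, which is not linearly separable, it runs until the cap, which is the non-termination behavior noted above.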
Nearest-neighbor-based designs, on the other hand, provide an approach to acoustic modeling that avoids the often lengthy and heuristic procedure of training traditional Gaussian-mixture-based models. Here the problem is studied as one of choosing a distance metric for a frame-level k-nearest-neighbor (k-NN) phonetic classifier.
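To make the frame-level classification concrete, the following sketch labels one acoustic feature frame by majority vote among its k nearest labeled training frames, with the distance metric left as a pluggable argument, since choosing that metric is precisely the problem under study. The Euclidean metric, the value of k, and all names below are assumptions of this illustration, not details from the work being summarized.

import math
from collections import Counter

def euclidean(a, b):
    # One candidate metric; swapping this function is how different metrics are compared.
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def knn_phone_label(frame, labeled_frames, k=5, distance=euclidean):
    # labeled_frames: list of (feature_vector, phone_label) pairs from the training data.
    # The frame receives the majority label of its k nearest neighbors under the chosen metric.
    nearest = sorted(labeled_frames, key=lambda pair: distance(frame, pair[0]))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]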
Work Cited
Kugler, Eliás A. A Method for Adapting Two Layers of Weights in a Perceptron Applied to a Problem of Feature Detection. Ithaca, N. Y., 1965. Print.