Back Prop

Constants

- $n$: # of features
- $m$: size of the training data
- $h$: size of the hidden layer
- $k$: output size
- $x^{(i)}$: the $i$-th training sample

Perceptron

First, let us talk a bit about the general setup of a neural network. A neural network alternates between linear and non-linear functions: every time a non-linear operation is applied, a new layer begins. So two things happen at every layer: a linear transformation followed by a non-linear operation.
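As a concrete illustration, here is a minimal NumPy sketch of a single layer under these assumptions: a weight matrix `W`, a bias `b`, and ReLU as the non-linearity (the names and the choice of ReLU are illustrative, not taken from the text above).

```python
import numpy as np

def layer(x, W, b):
    """One layer: a linear transformation followed by a non-linear operation."""
    z = W @ x + b            # linear transformation
    a = np.maximum(z, 0.0)   # non-linearity (ReLU, chosen here for illustration)
    return a

# Example: 3 input features, 4 units in the layer
x = np.random.randn(3)
W = np.random.randn(4, 3)
b = np.zeros(4)
print(layer(x, W, b).shape)  # (4,)
```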

We start with the simplest setup, a neural network with no hidden layer, also known as a perceptron. We assume the output is of size $k$ to be more general.

*Figure: perceptron.*
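Concretely, using the constants above, a perceptron with input $x \in \mathbb{R}^n$ maps the input directly to the output with a single linear transformation followed by a non-linearity. The weight matrix $W \in \mathbb{R}^{k \times n}$, bias $b \in \mathbb{R}^k$, and the softmax output function $\sigma$ below are a sketch of one common choice; the text does not specify the output non-linearity.

$$
\hat{y} = \sigma(Wx + b), \qquad \sigma(z)_j = \frac{e^{z_j}}{\sum_{c=1}^{k} e^{z_c}}
$$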
