A perceptron adds weights and a bias, then sums the weighted inputs.
If a weight is zero: that input has no effect on the output.
If the weighted inputs exactly cancel each other: no bias is needed.
If they are only partially opposed: the bias tips the decision.
If the bias is zero: the weights can be scaled freely — you can multiply or divide all of them by any positive number without changing the output.
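The notes above can be sketched in a few lines of Python. This is a minimal illustration, not code from the course; the function name and the example weights are my own:

```python
def perceptron(inputs, weights, bias):
    """Fire (return 1) when the weighted sum of inputs plus bias is non-negative."""
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if total >= 0 else 0

# A zero weight removes an input's influence: flipping the first input changes nothing.
print(perceptron([0, 1], [0.0, 1.0], -0.5))  # 1
print(perceptron([1, 1], [0.0, 1.0], -0.5))  # 1

# With zero bias, scaling all weights by the same positive factor
# leaves the decision unchanged.
print(perceptron([1, -1], [2, 1], 0))    # 1
print(perceptron([1, -1], [20, 10], 0))  # 1
```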
The building blocks of a neural network are, again, the NOT, AND, and OR gates.
This is best captured in the following comment from Udacity, where we built an XOR gate out of NOT, AND, and OR gates.
"You've seen that a perceptron can solve linearly separable problems. To solve more complex problems, you use more perceptrons. You saw this by calculating AND, OR, NOT, and XOR operations using perceptrons. These operations can be used to create any computer program. With enough data and time, a neural network can solve any problem that a computer can calculate. However, you don't build a Twitter using a neural network. A neural network is like any tool: you have to know when to use it.
The power of a neural network isn't building it by hand, like we were doing. It's the ability to learn from examples. In the next few sections, you'll learn how a neural network sets its own weights and biases."
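The hand-built XOR the quote refers to can be reproduced with a few perceptrons. A minimal sketch, assuming the standard construction XOR(a, b) = (a OR b) AND NOT (a AND b); the specific weights and bias values are my own choices, not the course's:

```python
def perceptron(inputs, weights, bias):
    """Fire (return 1) when the weighted sum of inputs plus bias is non-negative."""
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if total >= 0 else 0

# Each logic gate is a single perceptron with hand-picked weights and bias.
def AND(a, b):
    return perceptron([a, b], [1, 1], -1.5)

def OR(a, b):
    return perceptron([a, b], [1, 1], -0.5)

def NOT(a):
    return perceptron([a], [-1], 0.5)

# XOR is not linearly separable, so it needs a second layer of perceptrons:
# XOR(a, b) = (a OR b) AND NOT (a AND b)
def XOR(a, b):
    return AND(OR(a, b), NOT(AND(a, b)))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, XOR(a, b))  # 0 0 0 / 0 1 1 / 1 0 1 / 1 1 0
```

The point of the exercise is exactly what the quote says: a single perceptron draws one line, so XOR needs perceptrons stacked in layers — and a trained network finds such weights on its own instead of us picking them by hand.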