Function approximation using Neural Networks.
Views: 11 (last 30 days)
I have a task like this:
Manual design of a neural network for computing a function - Suggest weights of a multi-layered neural network computing the function f(x1, x2) = 3 − x1 − x2, where x1, x2 are input bits (each of value 0 or 1). The neurons of the network should use the sigmoidal transfer function with slope 1, and they have biases. The topology of the network must be the following:
(a) two input neurons – inputs are bits (with value 0 or 1),
(b) two neurons in a single hidden layer, and
(c) two neurons in the output layer.
Outputs of the network will be interpreted as two-bit binary number in the following way:
• output greater than or equal to 0.5 will be considered as logical 1,
• output less than 0.5 will be considered as logical 0.
List weights and biases of all neurons and also a table with the actual outputs of the network (before rounding) for all four combinations of the input bits.
MY QUESTION: At the end, will I have a single set of weights for each neuron that satisfies all four combinations, or what? As far as I understand, I need to train the network and get weights for each neuron that satisfy all four combinations, right?
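For concreteness, here is a minimal MATLAB sketch of how such a 2-2-2 network can be evaluated on all four input combinations. The specific weights and biases below are only an illustrative assumption (one of many valid hand-designed choices, not the assignment's official solution): they make the hidden neurons act roughly like OR(x1, x2) and AND(x1, x2), the first output neuron (the high bit) like NAND(x1, x2), and the second output neuron (the low bit) like XNOR(x1, x2), which together encode f(x1, x2) = 3 − x1 − x2 as a two-bit number.

% Minimal sketch: evaluate a fixed 2-2-2 sigmoid network on all four
% input combinations. The numeric values below are an illustrative
% hand-picked guess, not the only valid choice.
sigm = @(z) 1 ./ (1 + exp(-z));        % sigmoid with slope 1

W1 = [20 20; 20 20];                   % hidden-layer weights (rows = hidden neurons)
b1 = [-10; -30];                       % hidden biases: neuron 1 ~ OR, neuron 2 ~ AND

W2 = [  0 -20; -20  20];               % output-layer weights (rows = output neurons)
b2 = [ 10;  10];                       % output biases: neuron 1 ~ NAND, neuron 2 ~ XNOR

X = [0 0; 0 1; 1 0; 1 1];              % all four combinations of the input bits

for k = 1:size(X, 1)
    x = X(k, :)';                      % column vector [x1; x2]
    h = sigm(W1 * x + b1);             % hidden-layer activations
    y = sigm(W2 * h + b2);             % output activations (before rounding)
    bits = double(y >= 0.5);           % threshold at 0.5
    value = 2 * bits(1) + bits(2);     % interpret as a two-bit binary number
    fprintf('x = [%d %d]  y = [%.5f %.5f]  bits = [%d %d]  value = %d\n', ...
            x(1), x(2), y(1), y(2), bits(1), bits(2), value);
end

Because the weights are large, the sigmoids saturate close to 0 or 1, so the thresholded outputs give 3, 2, 2, 1 for the four inputs. In other words, for a problem this small the weights can be written down by hand and verified in a table like the one above; training the network would also work, but it is not strictly required by the assignment.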
1 Comment
Greg Heath on 30 Nov 2018
Edited: Greg Heath on 30 Nov 2018
Peripheral comment:
Input nodes are not neurons.
Yeah. I know.
Greg
Answers (0)