Question
Given a two-input neuron with the following weight matrix and input vector: W = [3 -2] and p = [-7 7]T, we would like to have an output of 0.5. Do you suppose that there is a combination of bias and transfer function that might allow this?
Is there a transfer function from Table 2.1 that will do the job if the bias is zero?
Is there a bias that will do the job if the linear transfer function is used? If yes, what is it?
Is there a bias that will do the job if a log-sigmoid transfer function is used? Again, if yes, what is it?
Is there a bias that will do the job if a symmetrical hard limit transfer function is used? Again, if yes, what is it?
Explanation / Answer
a) Yes, there is a combination of bias and transfer function that will work. The input vector is multiplied by the weight matrix and the bias is added to form the net input n = Wp + b = (3)(-7) + (-2)(7) + b = -35 + b. The net input then passes through a transfer function f to produce the scalar output a = f(n). Since we are free to choose both the bias and the transfer function, we can certainly obtain a = 0.5; for example, a linear transfer function with b = 35.5 gives a = -35 + 35.5 = 0.5.
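The net-input computation above can be sketched in a few lines. This is a minimal illustration using plain Python and the values from the problem statement; the helper name `net_input` is ours, not from the textbook.

```python
# Single-neuron net input n = W p + b, assuming the problem's values:
# W = [3, -2], p = [-7, 7], and a bias b.
def net_input(W, p, b):
    """Return the net input n = W p + b for a single two-input neuron."""
    return sum(w * x for w, x in zip(W, p)) + b

W = [3, -2]
p = [-7, 7]
print(net_input(W, p, 0))  # -35
```

With zero bias the net input is -35, which is the starting point for parts (b) through (e).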
b) No. With b = 0 the net input is n = Wp = -35, and none of the transfer functions in Table 2.1 returns 0.5 for n = -35: the hard limit gives 0, the symmetrical hard limit gives -1, the linear function gives -35, the saturating linear and positive linear functions give 0, the symmetric saturating linear function gives -1, the log-sigmoid gives 1/(1 + e^35), which is essentially 0, and the hyperbolic tangent sigmoid gives approximately -1.
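This exhaustive check can be verified directly. The sketch below evaluates each Table 2.1 transfer function (names follow the textbook's abbreviations, e.g. hardlim, logsig) at the zero-bias net input n = -35 and confirms none of them yields 0.5.

```python
import math

# Transfer functions from Table 2.1 (Hagan et al.), evaluated at the
# zero-bias net input n = Wp = -35.
n = -35
transfer = {
    "hardlim":  lambda n: 1.0 if n >= 0 else 0.0,
    "hardlims": lambda n: 1.0 if n >= 0 else -1.0,
    "purelin":  lambda n: float(n),
    "satlin":   lambda n: min(max(float(n), 0.0), 1.0),
    "satlins":  lambda n: min(max(float(n), -1.0), 1.0),
    "logsig":   lambda n: 1.0 / (1.0 + math.exp(-n)),
    "tansig":   lambda n: math.tanh(n),
    "poslin":   lambda n: max(0.0, float(n)),
}
for name, f in transfer.items():
    print(f"{name}: {f(n)}")
# None of these outputs equals 0.5 when n = -35.
```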
c) Yes. The bias is much like a weight, except that it has a constant input of 1. With the linear transfer function the output is a = n = Wp + b = -35 + b. Setting -35 + b = 0.5 gives b = 35.5.
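Solving for the bias is one line of arithmetic; a quick sketch, using the problem's values:

```python
# Linear (purelin) transfer function: a = n = Wp + b.
# Solve Wp + b = 0.5 for b.
Wp = 3 * (-7) + (-2) * 7   # -35
target = 0.5
b = target - Wp            # 35.5
print(b)                   # 35.5
```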
d) Yes. The log-sigmoid is a = 1/(1 + e^(-n)), which equals 0.5 exactly when n = 0. Setting n = Wp + b = -35 + b = 0 gives b = 35. (A related sigmoid can be built from the hyperbolic tangent, the tan-sigmoid, but here the log-sigmoid is what is required, and its smooth transition region is what makes an exact output of 0.5 attainable.)
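The log-sigmoid solution can be checked numerically; a minimal sketch with the problem's values:

```python
import math

# logsig(n) = 1 / (1 + exp(-n)) equals 0.5 exactly when n = 0,
# so the bias must cancel Wp = -35.
Wp = -35
b = 0 - Wp                           # 35
a = 1.0 / (1.0 + math.exp(-(Wp + b)))
print(b, a)                          # 35 0.5
```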
e) No. The symmetrical hard limit transfer function outputs only -1 or +1, so no choice of bias can produce 0.5. This illustrates how the desired characteristics of the output signal constrain the choice of transfer function: if the output must be either -1 or +1, a symmetrical hard limit is appropriate, but if a value like 0.5 is required, a function with a continuous range (such as the linear or log-sigmoid) must be used instead.
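To make the impossibility concrete, the sketch below sweeps a range of integer biases through the symmetrical hard limit (hardlims) and shows the output set never leaves {-1, +1}; the sweep range is an illustrative choice, since the same argument holds for any bias.

```python
# hardlims outputs only -1 or +1, so no bias can yield 0.5.
def hardlims(n):
    return 1.0 if n >= 0 else -1.0

Wp = -35
outputs = {hardlims(Wp + b) for b in range(0, 80)}
print(outputs)  # only -1.0 and 1.0 ever appear
```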