250+ TOP MCQs on Neural Networks – 2 and Answers

AI Multiple Choice Questions on “Neural Networks – 2”.

1. Why is the XOR problem exceptionally interesting to neural network researchers?
a) Because it can be expressed in a way that allows you to use a neural network
b) Because it is a complex binary operation that cannot be solved using neural networks
c) Because it can be solved by a single layer perceptron
d) Because it is the simplest linearly inseparable problem that exists
Answer: d
Clarification: XOR is the simplest function that is not linearly separable, so a single-layer perceptron cannot compute it; solving it requires more than one layer.
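
As an illustration (a minimal sketch in plain Python, not part of the original quiz), the perceptron learning rule converges on a linearly separable function such as AND but never settles on XOR, which is exactly what makes XOR interesting:

```python
# Minimal sketch: the perceptron learning rule converges on AND (linearly
# separable) but keeps making mistakes on XOR (linearly inseparable).

def train_perceptron(samples, epochs=100):
    w0, w1, b = 0, 0, 0
    for _ in range(epochs):
        mistakes = 0
        for (x0, x1), target in samples:
            out = 1 if w0 * x0 + w1 * x1 + b > 0 else 0
            err = target - out
            if err != 0:
                mistakes += 1
                w0, w1, b = w0 + err * x0, w1 + err * x1, b + err
        if mistakes == 0:            # a full pass with no errors: converged
            return True
    return False                     # never converged within the epoch budget

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

print("AND converged:", train_perceptron(AND))   # True
print("XOR converged:", train_perceptron(XOR))   # False
```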

2. What is back propagation?
a) It is another name given to the curvy function in the perceptron
b) It is the transmission of error back through the network to adjust the inputs
c) It is the transmission of error back through the network to allow weights to be adjusted so that the network can learn
d) None of the mentioned
Answer: c
Clarification: Back propagation is the transmission of error back through the network to allow weights to be adjusted so that the network can learn.
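
To make the idea concrete, here is a minimal sketch (plain Python, with hypothetical starting weights) of a single backpropagation step on a tiny 2-2-1 sigmoid network: the output error is propagated backwards and every weight is nudged by gradient descent.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical starting weights: w_h[j][i] connects input i to hidden unit j.
w_h = [[0.15, 0.20], [0.25, 0.30]]
b_h = [0.35, 0.35]
w_o = [0.40, 0.45]            # hidden unit j to the single output unit
b_o = 0.60
x, target, lr = [0.05, 0.10], 0.01, 0.5

# Forward pass
h = [sigmoid(w_h[j][0] * x[0] + w_h[j][1] * x[1] + b_h[j]) for j in range(2)]
y = sigmoid(w_o[0] * h[0] + w_o[1] * h[1] + b_o)

# Backward pass: send the output error back through the network
delta_o = (y - target) * y * (1 - y)    # dE/dz_o for E = 0.5 * (y - target)**2
delta_h = [delta_o * w_o[j] * h[j] * (1 - h[j]) for j in range(2)]

# Gradient-descent updates: this adjustment of the weights is the learning step
w_o = [w_o[j] - lr * delta_o * h[j] for j in range(2)]
b_o -= lr * delta_o
for j in range(2):
    for i in range(2):
        w_h[j][i] -= lr * delta_h[j] * x[i]
    b_h[j] -= lr * delta_h[j]

print("output before the update:", round(y, 4))
```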

3. Why are linearly separable problems of interest to neural network researchers?
a) Because they are the only class of problem that network can solve successfully
b) Because they are the only class of problem that Perceptron can solve successfully
c) Because they are the only mathematical functions that are continuous
d) Because they are the only mathematical functions you can draw
Answer: b
Clarification: Linearly separable problems are of interest to neural network researchers because they are the only class of problem that a perceptron can solve successfully.

4. Which of the following is not a promise of an artificial neural network?
a) It can explain results
b) It can survive the failure of some nodes
c) It has inherent parallelism
d) It can handle noise
Answer: a
Clarification: An artificial neural network (ANN) cannot explain its results.

5. Neural Networks are complex ______________ with many parameters.
a) Linear Functions
b) Nonlinear Functions
c) Discrete Functions
d) Exponential Functions
Answer: b
Clarification: Neural networks are complex nonlinear functions with many parameters; it is the nonlinear activation at each unit that lets them model relationships a purely linear model cannot.
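
A quick way to see why the nonlinearity matters (a minimal sketch in plain Python, not part of the original question): composing linear layers without an activation collapses to a single linear function, so nothing is gained from depth.

```python
# Two linear layers with no activation reduce to one linear map; inserting a
# nonlinearity (here max(0, x), i.e. ReLU) breaks that collapse.
def layer(w, b, x):
    return w * x + b

x = 3.0
# Linear composed with linear is still linear: 2*(5*x + 1) + 3 == 10*x + 5
assert layer(2, 3, layer(5, 1, x)) == 10 * x + 5

# With a nonlinearity in between, no single (w, b) reproduces the mapping
relu = lambda v: max(0.0, v)
print(layer(2, 3, relu(layer(5, 1, -1.0))))   # 3.0, not 10*(-1) + 5 = -5
```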

6. A perceptron adds up all the weighted inputs it receives, and if it exceeds a certain value, it outputs a 1, otherwise it just outputs a 0.
a) True
b) False
c) Sometimes – it can also output intermediate values as well
d) Can’t say
Answer: a
Clarification: Yes, the perceptron works exactly like that: it computes a weighted sum of its inputs and applies a hard threshold to decide whether to output 1 or 0.

7. What is the name of the function in the following statement “A perceptron adds up all the weighted inputs it receives, and if it exceeds a certain value, it outputs a 1, otherwise it just outputs a 0”?
a) Step function
b) Heaviside function
c) Logistic function
d) Perceptron function
Answer: b
Clarification: The Heaviside function is also known as the step function, so option a) is also acceptable. It is a hard thresholding function: the output is either on or off, with no in-between values.
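
A minimal sketch of that thresholding behaviour (plain Python, with hand-picked example weights, not taken from the original quiz):

```python
# A perceptron output is the Heaviside (hard threshold / step) function
# applied to the weighted sum of its inputs.
def heaviside(z):
    return 1 if z > 0 else 0          # on or off, no in-between values

def perceptron(weights, bias, inputs):
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return heaviside(total)

# Example: weights and bias chosen by hand so the unit behaves like logical AND
print(perceptron([1, 1], -1.5, [1, 1]))   # 1
print(perceptron([1, 1], -1.5, [0, 1]))   # 0
```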

8. Having multiple perceptrons can actually solve the XOR problem satisfactorily: this is because each perceptron can partition off a linear part of the space itself, and they can then combine their results.
a) True – this always works, and these multiple perceptrons learn to classify even complex problems
b) False – perceptrons are mathematically incapable of solving linearly inseparable functions, no matter what you do
c) True – perceptrons can do this but are unable to learn to do it – they have to be explicitly hand-coded
d) False – just having a single perceptron is enough
Answer: c
Clarification: A layered arrangement of perceptrons can represent XOR, but the perceptron learning rule cannot train the hidden units, so the weights of such a network have to be set by hand rather than learned.
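
A minimal sketch of such a hand-coded solution (plain Python; the OR/NAND/AND decomposition used here is one common choice, not the only one):

```python
# XOR built from hand-coded perceptrons (weights set explicitly, not learned):
# x XOR y == (x OR y) AND (x NAND y).
def step(z):
    return 1 if z > 0 else 0

def unit(weights, bias, inputs):
    return step(sum(w * x for w, x in zip(weights, inputs)) + bias)

def xor(x0, x1):
    h_or   = unit([1, 1], -0.5, [x0, x1])      # fires unless both inputs are 0
    h_nand = unit([-1, -1], 1.5, [x0, x1])     # fires unless both inputs are 1
    return unit([1, 1], -1.5, [h_or, h_nand])  # AND of the two hidden units

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor(a, b))   # 0 0 -> 0, 0 1 -> 1, 1 0 -> 1, 1 1 -> 0
```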

9. The network that involves backward links from output to the input and hidden layers is called _________
a) Self organizing maps
b) Perceptrons
c) Recurrent neural network
d) Multi-layered perceptron
Answer: c
Clarification: RNN (Recurrent neural network) topology involves backward links from output to the input and hidden layers.
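
A minimal sketch of that feedback idea (plain Python, reduced to a single recurrent unit with hypothetical scalar weights): the state computed at one time step is fed back into the next.

```python
# In a recurrent network the state computed at one step is fed back as an
# input to the next step (the "backward link").
import math

w_in, w_rec, bias = 0.8, 0.5, 0.1   # hypothetical weights for a one-unit RNN
hidden = 0.0                        # initial hidden state

for t, x in enumerate([1.0, 0.0, 1.0]):
    hidden = math.tanh(w_in * x + w_rec * hidden + bias)  # feedback of hidden
    print(f"step {t}: hidden state = {hidden:.3f}")
```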

10. Which of the following is an application of NN (Neural Network)?
a) Sales forecasting
b) Data validation
c) Risk management
d) All of the mentioned
Answer: d
Clarification: All of the mentioned options are applications of neural networks.
