250+ MCQs on Neural Networks Recall and Answers

Neural Networks Multiple Choice Questions on “Recall”.

1. Is the Lyapunov function vector in nature?
A. yes
B. no

Answer: B
Clarification: The Lyapunov function is scalar in nature.

2. What is the role of the Lyapunov function?
A. to determine stability
B. to determine convergence
C. both stability & convergence
D. none of the mentioned

Answer: A
Clarification: The Lyapunov function is an energy function, used to determine stability.

3. Is the existence of a Lyapunov function necessary for stability?
A. yes
B. no

Answer: B
Clarification: It is a sufficient but not a necessary condition.

4. V(x) is said to be a Lyapunov function if?
A. v(x) >=0
B. v(x) <=0
C. v(x) =0
D. none of the mentioned

Answer: B
Clarification: It is the condition for the existence of a Lyapunov function.
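For reference, a standard textbook statement of the Lyapunov conditions (not spelled out in the question above; V is a scalar function and the equilibrium is assumed at the origin) is:

```latex
V(\mathbf{0}) = 0, \qquad
V(\mathbf{x}) > 0 \ \text{for } \mathbf{x} \neq \mathbf{0}, \qquad
\dot{V}(\mathbf{x}) = \nabla V(\mathbf{x}) \cdot \dot{\mathbf{x}} \le 0
\ \text{along trajectories.}
```

The condition in option B is usually read as the requirement on the rate of change, i.e. the energy is non-increasing along trajectories.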

5. What does the Cohen-Grossberg theorem show?
A. shows the stability of fixed weight autoassociative networks
B. shows the stability of adaptive autoassociative networks
C. shows the stability of adaptive heteroassociative networks
D. none of the mentioned

Answer: A
Clarification: The Cohen-Grossberg theorem shows the stability of fixed weight autoassociative networks.

6. What does the Cohen-Grossberg-Kosko theorem show?
A. shows the stability of fixed weight autoassociative networks
B. shows the stability of adaptive autoassociative networks
C. shows the stability of adaptive heteroassociative networks
D. none of the mentioned

Answer: B
Clarification: The Cohen-Grossberg-Kosko theorem shows the stability of adaptive autoassociative networks.

7. What does the third theorem describing the stability of a set of nonlinear dynamical systems show?
A. shows the stability of fixed weight autoassociative networks
B. shows the stability of adaptive autoassociative networks
C. shows the stability of adaptive heteroassociative networks
D. none of the mentioned

Answer: C
Clarification: The third theorem for nonlinear dynamical systems shows the stability of adaptive heteroassociative networks.

8. What happens during recall in neural networks?
A. weight changes are suppressed
B. input to the network determines the output activation
C. both processes have to happen
D. none of the mentioned

Answer: C
Clarification: Follows from the basic definition of recall in a network.

9. Can a neural network learn & recall at the same time?
A. yes
B. no

Answer: A
Clarification: This was later proved by Kosko in 1988.

10. In the nearest-neighbour case, the stored pattern closest to the input pattern is recalled. Where does this occur?
A. feedback pattern classification
B. feedforward pattern classification
C. can be feedback or feedforward
D. none of the mentioned

Answer: B
Clarification: It is a case of feedforward networks.
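As an illustrative sketch (function and variable names are my own, not from the text), feedforward nearest-neighbour recall can be written as selecting the stored pattern with the minimum Hamming distance to the input:

```python
import numpy as np

def nearest_neighbour_recall(stored, x):
    """Return the stored bipolar pattern closest to the input x."""
    distances = [int(np.sum(p != x)) for p in stored]  # Hamming distance to each pattern
    return stored[int(np.argmin(distances))]

stored = [np.array([1, 1, -1, -1]), np.array([-1, -1, 1, 1])]
noisy = np.array([1, -1, -1, -1])  # first pattern with one bit flipped
print(nearest_neighbour_recall(stored, noisy))  # recalls the first stored pattern
```

The computation is a single forward pass over the stored patterns, which is why this recall mode belongs to feedforward pattern classification.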

 

250+ MCQs on Stochastic Networks and Answers

Neural Networks Multiple Choice Questions on “Stochastic Networks”.

1. p(s=1|x) = 1/(1+exp(-x/T)), where ‘s’ is the output given the activation ‘x’, describes a?
A. hopfield network
B. sigma network
C. stochastic network
D. none of the mentioned
Answer: C
Clarification: This is the basic equation of a stochastic network.
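A minimal sketch of this equation (function names are my own): the unit’s binary output is sampled with a probability given by the temperature-controlled sigmoid.

```python
import math
import random

def p_fire(x, T):
    """p(s=1|x): probability that the output s is 1 for activation x at temperature T."""
    return 1.0 / (1.0 + math.exp(-x / T))

def stochastic_update(x, T, rng=random.random):
    """Sample the binary output s of a stochastic unit."""
    return 1 if rng() < p_fire(x, T) else 0

# Raising T flattens the probability toward 0.5 (more randomness);
# lowering T makes the update nearly deterministic.
print(p_fire(1.0, 0.1))   # close to 1
print(p_fire(1.0, 10.0))  # close to 0.5
```

Because the output is sampled rather than computed deterministically, two runs of the same network generally follow different state trajectories, which is the point of the next question.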

2. Will a stochastic network evolve differently each time it is run?
A. yes
B. no
Answer: A
Clarification: The trajectory of the state of the network becomes a sample function of a random process.

3. In case of deterministic update, what kind of equilibrium is reached?
A. static
B. dynamic
C. neutral
D. none of the mentioned
Answer: A
Clarification: In case of deterministic update, static equilibrium is reached.

4. In case of stochastic update, can static equilibrium be reached?
A. yes
B. no
Answer: B
Clarification: There will never be a static equilibrium in a stochastic network.

5. In case of stochastic update, what kind of equilibrium is reached?
A. static
B. dynamic
C. neutral
D. equilibrium not possible
Answer: B
Clarification: In case of stochastic update, dynamic equilibrium is reached.

6. Is it possible in a stochastic network that the average state of the network doesn’t change with time?
A. yes
B. no
Answer: A
Clarification: Dynamic equilibrium is possible in a stochastic network.

7. What can be the possible reason for thermal equilibrium in stochastic networks?
A. probability distribution of states changes and compensates
B. probability distribution change with only update
C. probability distribution does not change with time
D. none of the mentioned
Answer: C
Clarification: The probability distribution of states not changing with time is the reason for thermal equilibrium in stochastic networks.

8. Can networks with symmetric weight reach thermal equilibrium?
A. yes
B. no
Answer: A
Clarification: Networks with symmetric weights reach thermal equilibrium at a given temperature.

9. When activation value is determined by using the average of fluctuations of outputs from other units, it is known as?
A. maximum field approximation
B. median field approximation
C. minimum field approximation
D. none of the mentioned
Answer: D
Clarification: It is known as the mean field approximation.
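A rough sketch of the idea (my own formulation, assuming bipolar units): each fluctuating binary output is replaced by its average value, giving deterministic update equations of the form ⟨s⟩ = tanh(x/T).

```python
import numpy as np

def mean_field_iterate(W, T, n_iter=50):
    """Iterate the mean-field equations m = tanh(W m / T) for the average outputs m."""
    m = np.full(W.shape[0], 0.1)   # small initial average activations
    for _ in range(n_iter):
        m = np.tanh(W @ m / T)     # deterministic update of the averages
    return m

W = np.array([[0.0, 1.0],
              [1.0, 0.0]])           # symmetric weights, no self-connections
print(mean_field_iterate(W, T=0.5))  # below critical T: settles at non-zero averages
print(mean_field_iterate(W, T=5.0))  # above critical T: averages decay to zero
```

The two temperatures illustrate the critical-temperature behaviour asked about in the last question of this set: stable non-zero states appear only below the critical temperature.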

10. Where does a stochastic network exhibit stable states?
A. at any temperature
B. above critical temperature
C. at critical temperature
D. below critical temperature
Answer: D
Clarification: Stochastic network exhibits stable states below critical temperature.

250+ MCQs on Characteristics – 3 and Answers

Neural Networks Questions & Answers for entrance exams on “Characteristics – 3”.

1. The cell body of a neuron can be analogous to what mathematical operation?
A. summing
B. differentiator
C. integrator
D. none of the mentioned
Answer: A
Clarification: The adding of potentials (due to neural fluid) at different parts of the neuron is the reason for its firing.

2. What is the critical threshold voltage at which a neuron fires?
A. 30mV
B. 20mV
C. 25mV
D. 10mV
Answer: D
Clarification: This critical value was found by a series of experiments conducted by neuroscientists.

3. Is there any effect on a particular neuron which is repeatedly fired?
A. yes
B. no
Answer: A
Clarification: The strength of the neuron to fire in future increases.

4. What is the name of the above mechanism?
A. hebb rule learning
B. error correction learning
C. memory based learning
D. none of the mentioned
Answer: A
Clarification: It follows from the basic definition of Hebb rule learning.

5. What is Hebb’s rule of learning?
A. the system learns from its past mistakes
B. the system recalls previous reference inputs & respective ideal outputs
C. the strength of neural connection gets modified accordingly
D. none of the mentioned
Answer: C
Clarification: The strength of a neuron to fire in future increases, if it is fired repeatedly.
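The mechanism can be sketched as the Hebbian update Δw = η·y·x (a minimal illustration with made-up names, not a full learning algorithm):

```python
import numpy as np

def hebb_update(w, x, y, eta=0.1):
    """Hebb rule: strengthen connections between co-active units (dw = eta * y * x)."""
    return w + eta * np.outer(y, x)

x = np.array([1.0, 0.0])   # presynaptic activities: only the first input fires
y = np.array([1.0])        # postsynaptic activity: the unit fires
w = np.zeros((1, 2))
for _ in range(5):         # repeated co-firing keeps strengthening the link
    w = hebb_update(w, x, y)
print(w)                   # weight from the active input grows; the other stays 0
```

Only the connection between units that fire together grows, which matches the clarification above: repeated firing increases the strength to fire in future.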

6. Are all neurons in the brain of the same type?
A. yes
B. no
Answer: B
Clarification: Follows from the fact that no two body cells are exactly similar in the human body, even if they belong to the same class.

7. What is the estimated number of neurons in the human cortex?
A. 10^8
B. 10^5
C. 10^11
D. 10^20
Answer: C
Clarification: It is a biological fact!

8. What is the estimated density of neurons per mm^2 of cortex?
A. 15*(10^2)
B. 15*(10^4)
C. 15*(10^3)
D. 5*(10^4)
Answer: B
Clarification: It is a biological fact!

9. Why can’t we design a perfect neural network?
A. the full operation of biological neurons is still not known
B. the number of neurons is itself not precisely known
C. the number of interconnections is very large & very complex
D. all of the mentioned
Answer: D
Clarification: These are all fundamental reasons why we cannot design a perfect neural network!

10. How many synaptic connections are there in the human brain?
A. 10^10
B. 10^15
C. 10^20
D. 10^5
Answer: B
Clarification: You can estimate this value from the number of neurons in the human cortex & their density.


250+ MCQs on Pattern Association – 1 and Answers

Neural Networks Multiple Choice Questions on “Pattern Association – 1”.

1. Feedforward networks are used for?
A. pattern mapping
B. pattern association
C. pattern classification
D. all of the mentioned
Answer: D
Clarification: Feedforward networks are used for pattern mapping, pattern association, and pattern classification.

2. Feedback networks are used for?
A. autoassociation
B. pattern storage
C. both autoassociation & pattern storage
D. none of the mentioned
Answer: C
Clarification: Feedback networks are used for autoassociation and pattern storage.

3. Is the simplest combination network called a competitive learning network?
A. yes
B. no
Answer: A
Clarification: The most basic example of a combination of feedforward & feedback networks is the competitive learning net.

4. Competitive learning net is used for?
A. pattern grouping
B. pattern storage
C. pattern grouping or storage
D. none of the mentioned
Answer: A
Clarification: Competitive learning net is used for pattern grouping.

5. Feedback connection strengths are usually?
A. fixed
B. variable
C. both fixed or variable type
D. none of the mentioned
Answer: A
Clarification: Feedback connection strengths are usually fixed & linear to reduce complexity.

6. Are feedforward networks used for pattern storage?
A. yes
B. no
Answer: B
Clarification: Feedforward networks are used for pattern mapping, pattern association, and pattern classification.

7. If some of the output patterns in a pattern association problem are identical, then the problem shifts to?
A. pattern storage problem
B. pattern classification problem
C. pattern mapping problem
D. none of the mentioned
Answer: B
Clarification: Because then the distinct outputs can be viewed as class labels.

8. The network for pattern mapping is expected to perform?
A. pattern storage
B. pattern classification
C. generalization
D. none of the mentioned
Answer: C
Clarification: The network for pattern mapping is expected to perform generalization.

9. In case of autoassociation by feedback nets in a pattern recognition task, what is the behaviour expected?
A. accretive
B. interpolative
C. can be either accretive or interpolative
D. none of the mentioned
Answer: B
Clarification: When a noisy pattern is given, the network retrieves a noisy pattern.

10. In case of pattern storage by feedback nets in a pattern recognition task, what is the behaviour expected?
A. accretive
B. interpolative
C. can be either accretive or interpolative
D. none of the mentioned
Answer: A
Clarification: Accretive behaviour is exhibited in case of the pattern storage problem.
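As an illustrative sketch of accretive behaviour (my own minimal code, using a Hopfield-style outer-product rule that the text does not spell out), a feedback net pulls a noisy input back to the exact stored pattern:

```python
import numpy as np

def train(patterns):
    """Store bipolar patterns in symmetric weights via the outer-product rule."""
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0.0)       # no self-connections
    return W

def recall(W, x, n_iter=5):
    """Feedback recall: repeatedly threshold the weighted sums."""
    for _ in range(n_iter):
        x = np.sign(W @ x)
    return x

p = np.array([1, -1, 1, -1])
W = train([p])
noisy = np.array([1, 1, 1, -1])    # stored pattern with one bit flipped
print(recall(W, noisy))            # accretive: the exact stored pattern comes back
```

Interpolative behaviour, by contrast, would return an output that still carries some of the input noise rather than snapping to a stored pattern.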

250+ MCQs on Boltzmann Machine – 1 and Answers

Neural Networks Multiple Choice Questions on “Boltzmann Machine – 1”.

1. Probability of error in recall of stored patterns can be reduced if?
A. patterns are stored appropriately
B. inputs are captured appropriately
C. weights are chosen appropriately
D. none of the mentioned
Answer: C
Clarification: Probability of error in recall of stored patterns can be reduced if weights are chosen appropriately.

2. What is pattern environment?
A. probability of desired patterns
B. probability of given patterns
C. behaviour of system
D. none of the mentioned
Answer: D
Clarification: The pattern environment is the probability distribution of the given patterns.

3. For what purpose is pattern environment useful?
A. determining structure
B. determining desired outputs
C. determining future inputs
D. none of the mentioned
Answer: D
Clarification: The pattern environment is useful for determining the weights.

4. What should be the aim of the training procedure in a Boltzmann machine of feedback networks?
A. to capture inputs
B. to feedback the captured outputs
C. to capture the behaviour of system
D. none of the mentioned
Answer: D
Clarification: The training procedure should try to capture the pattern environment.

5. What does a Boltzmann machine consist of?
A. fully connected network with both hidden and visible units
B. asynchronous operation
C. stochastic update
D. all of the mentioned
Answer: D
Clarification: A Boltzmann machine consists of a fully connected network with both hidden and visible units, operating asynchronously with stochastic update.

6. By using which method does a Boltzmann machine reduce the effect of additional stable states?
A. no such method exist
B. simulated annealing
C. hopfield reduction
D. none of the mentioned
Answer: B
Clarification: A Boltzmann machine uses simulated annealing to reduce the effect of additional stable states.
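A generic simulated-annealing sketch (illustrative only; this is not the Boltzmann machine's actual training procedure, and all names are my own): uphill moves are accepted with probability exp(-ΔE/T), and T is lowered gradually so the state can escape shallow, spurious minima before settling.

```python
import math
import random

def anneal(energy, neighbour, state, T=5.0, cooling=0.95, steps=1000, seed=0):
    """Minimise energy(state) by annealed random search over neighbour moves."""
    rng = random.Random(seed)
    for _ in range(steps):
        candidate = neighbour(state, rng)
        dE = energy(candidate) - energy(state)
        # Always accept downhill moves; accept uphill moves with prob exp(-dE/T).
        if dE < 0 or rng.random() < math.exp(-dE / T):
            state = candidate
        T *= cooling               # cooling schedule: lower T each step
    return state

energy = lambda x: x * x                      # toy energy with minimum at 0
neighbour = lambda x, rng: x + rng.choice([-1, 1])
print(anneal(energy, neighbour, state=8))     # ends at or near the minimum
```

At high T the search is nearly random; as T falls it becomes greedy, which is the mechanism the question refers to for suppressing additional (spurious) stable states.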

7. For which other task can a Boltzmann machine be used?
A. pattern mapping
B. feature mapping
C. classification
D. pattern association
Answer: D
Clarification: A Boltzmann machine can be used for pattern association.

8. How are energy minima related to probability of occurrence of corresponding patterns in the environment?
A. directly
B. inversely
C. directly or inversely
D. no relation
Answer: A
Clarification: Energy minima are directly related to the probability of occurrence of the corresponding patterns in the environment.

9. Is exact representation of pattern environment possible?
A. yes
B. no
Answer: B
Clarification: Exact representation of the pattern environment is not possible.

10. What may be the reasons for non zero probability of error in recalling?
A. spurious stable states
B. approximation in pattern environment representation
C. extra stable states
D. all of the mentioned
Answer: D
Clarification: These are all primary reasons for the existence of a non-zero probability of error.

250+ MCQs on Neural Networks History and Answers

Neural Networks Multiple Choice Questions on “History”.

1. What kind of operations can neural networks perform?
A. serial
B. parallel
C. serial or parallel
D. none of the mentioned

Answer: C
Clarification: General characteristics of neural networks.

2. Is the argument that information in the brain is adaptable, whereas in the computer it is replaceable, valid?
A. yes
B. no

Answer: A
Clarification: It’s a fact related to basic knowledge of neural networks!

3. Does there exist a central control for processing information in the brain, as in a computer?
A. yes
B. no

Answer: B
Clarification: In the human brain, information is locally processed & analysed.

4. Which action is faster: pattern classification or adjustment of weights in neural nets?
A. pattern classification
B. adjustment of weights
C. equal
D. either of them can be fast, depending on conditions

Answer: A
Clarification: Memory is addressable, so patterns can be easily classified.

5. What is the feature of ANNs due to which they can deal with noisy, fuzzy, inconsistent data?
A. associative nature of networks
B. distributive nature of networks
C. both associative & distributive
D. none of the mentioned

Answer: C
Clarification: General characteristics of ANNs.

6. What was the name of the first model which could perform a weighted sum of inputs?
A. McCulloch-Pitts neuron model
B. Marvin Minsky neuron model
C. Hopfield model of neuron
D. none of the mentioned

Answer: A
Clarification: The McCulloch-Pitts neuron model can perform a weighted sum of inputs followed by a threshold logic operation.
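A minimal sketch of the model (my own function names): a weighted sum of binary inputs followed by threshold logic.

```python
def mcculloch_pitts(inputs, weights, threshold):
    """McCulloch-Pitts unit: weighted sum of inputs followed by threshold logic."""
    s = sum(w * x for w, x in zip(weights, inputs))  # weighted sum
    return 1 if s >= threshold else 0                # fire iff the sum reaches threshold

# With unit weights and threshold 2, the unit computes logical AND:
for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, mcculloch_pitts((a, b), (1, 1), threshold=2))
```

Lowering the threshold to 1 in this sketch would turn the same unit into logical OR, which is why threshold logic was the model's key contribution.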

7. Who developed the first learning machine in which connection strengths could be adapted automatically?
A. McCulloch-Pitts
B. Marvin Minsky
C. Hopfield
D. none of the mentioned

Answer: B
Clarification: In 1954 Marvin Minsky developed the first learning machine in which connection strengths could be adapted automatically & efficiently.

8. Who proposed the first perceptron model in 1958?
A. McCulloch-Pitts
B. Marvin Minsky
C. Hopfield
D. Rosenblatt

Answer: D
Clarification: Rosenblatt proposed the first perceptron model in 1958.

9. John Hopfield was credited for what important aspect of neural networks?
A. learning algorithms
B. adaptive signal processing
C. energy analysis
D. none of the mentioned

Answer: C
Clarification: It was a major contribution of his work in 1982.

10. What is the contribution of Ackley and Hinton to neural networks?
A. perceptron
B. boltzman machine
C. learning algorithms
D. none of the mentioned

Answer: B
Clarification: Ackley and Hinton built the Boltzmann machine.