Total Questions: 40
Expected Time: 40 Minutes

1. What is the 'Hessian matrix' in the context of neural networks, and how is it relevant to model optimization?

2. What is the purpose of the 'Leaky ReLU' activation function in neural networks?
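As a reference point for this question, Leaky ReLU can be sketched in a few lines of Python. The function name and the default slope `alpha=0.01` below are illustrative choices, not mandated by the question:

```python
def leaky_relu(x, alpha=0.01):
    # Pass positive inputs through unchanged; scale negative inputs
    # by a small alpha so the unit keeps a nonzero gradient instead
    # of "dying" the way a plain ReLU can.
    return x if x > 0 else alpha * x
```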

3. In neural networks, what does 'epoch' refer to?

4. Discuss the concept of 'neuroevolution' and how it differs from traditional gradient-based optimization in neural networks.

5. How does 'batch size' impact the training of neural networks?

6. Discuss the role of 'transfer learning' in the training of deep neural networks, and provide examples of its applications.

7. Discuss the significance of 'residual networks' (ResNets) in addressing the challenges of deep neural networks.

8. How does 'ensemble learning' contribute to improving the performance of neural networks, and what types of ensembles are commonly used?

9. What is the primary role of the 'backpropagation' algorithm in neural network training?

10. What is 'data augmentation' in neural network training?

11. What is a 'feedforward' neural network?

12. What is the primary function of the 'sigmoid' activation function in neural networks?

13. In neural network training, what does 'computational overhead' refer to?

14. What are the main differences between 'supervised learning' and 'reinforcement learning' in neural networks?

15. In the context of neural networks, what is 'data normalization' and why is it important?
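One common form of data normalization this question may be probing is z-score standardization; a minimal pure-Python sketch (population standard deviation, illustrative only):

```python
import math

def z_score_normalize(values):
    # Shift to zero mean and scale to unit variance so features with
    # very different ranges contribute comparably during training.
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    std = math.sqrt(var)
    return [(v - mean) / std for v in values]
```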

16. In neural networks, what are 'weights' associated with?

17. What does 'training' a neural network involve?

18. In convolutional neural networks (CNNs), what is the purpose of pooling layers?

19. What is the role of the 'learning rate schedule' in neural network training?

20. What role does 'learning rate annealing' play in the optimization of neural network training?

21. What does 'gradient descent' optimize during neural network training?
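A worked one-dimensional example helps anchor this question: gradient descent repeatedly steps opposite the gradient to minimize a loss. The learning rate and step count below are illustrative:

```python
def gradient_descent(grad, w0, lr=0.1, steps=100):
    # Iteratively move the parameter against the gradient of the loss.
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)
    return w

# Minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3);
# the minimizer is w = 3.
w_min = gradient_descent(lambda w: 2 * (w - 3), w0=0.0)
```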

22. What is the 'dropout' technique, and how does it contribute to the training of neural networks?
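For reference, "inverted dropout" (the formulation used by most modern frameworks) can be sketched as follows; the drop probability `p=0.5` is an illustrative default:

```python
import random

def dropout(activations, p=0.5, training=True):
    # During training, zero each activation with probability p and
    # scale survivors by 1/(1-p) so the expected activation matches
    # inference time, when dropout is disabled.
    if not training:
        return list(activations)
    return [0.0 if random.random() < p else a / (1 - p)
            for a in activations]
```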

23. Explain the concept of 'self-supervised learning' in neural networks and its advantages in representation learning.

24. What is the purpose of 'validation data' in neural network training?

25. What is 'transfer learning' in the context of neural networks?

26. What is a 'neuron' in a neural network?

27. In the context of neural networks, what is 'hyperparameter tuning'?

28. What is 'underfitting' in neural network training?

29. In neural network terminology, what is an 'epoch'?

30. In neural networks, what does 'batch size' represent?

31. Explain the concept of 'attention mechanism' in neural networks and its role in natural language processing.

32. What is the vanishing gradient problem in deep learning?

33. What role does the 'momentum' term play in optimization algorithms for neural networks?

34. What challenges does 'long-term dependency' pose in traditional recurrent neural networks (RNNs), and how are they addressed in more advanced architectures like LSTMs?

35. Explain the 'Gated Recurrent Unit' (GRU) and its advantages over traditional recurrent neural networks (RNNs).

36. What is the 'Bag of Words' model in the context of natural language processing and neural networks?

37. Explain 'unsupervised learning' in the context of neural networks, and provide an example of its application.

38. What is the 'Kullback-Leibler (KL) divergence' and how is it used in the context of probabilistic models and neural networks?
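For discrete distributions, the KL divergence D_KL(P || Q) = Σ p_i log(p_i / q_i) can be computed directly; a minimal sketch (natural-log units, assuming q_i > 0 wherever p_i > 0):

```python
import math

def kl_divergence(p, q):
    # D_KL(P || Q): expected extra "surprise" from using Q to
    # approximate P; zero iff the distributions are identical.
    # Terms with p_i == 0 contribute nothing by convention.
    return sum(pi * math.log(pi / qi)
               for pi, qi in zip(p, q) if pi > 0)
```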

39. What is a neural network?

40. In neural networks, what is 'weight decay' used for?
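For plain SGD, weight decay amounts to shrinking each weight toward zero at every update, which is equivalent to adding an L2 penalty to the loss. A minimal sketch of one update step (the learning rate and decay coefficient below are illustrative):

```python
def sgd_step_with_weight_decay(w, grad, lr=0.01, wd=1e-4):
    # Each weight is pulled toward zero by wd * w in addition to
    # the loss gradient, discouraging large weights (regularization).
    return [wi - lr * (gi + wd * wi) for wi, gi in zip(w, grad)]
```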