Total Questions: 50
Expected Time: 50 Minutes

1. Discuss the role of 'bidirectional LSTM' in NLP and its advantages over traditional LSTM.

2. Which step is typically included in the preprocessing phase of NLP tasks?

3. In the context of NLP, what does the term 'corpus' refer to?

4. Discuss the concept of 'transfer learning' in NLP and its advantages in training language models.

5. What is the purpose of a 'stemming algorithm' in natural language processing? Provide an example.
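As a toy illustration for this question (not the Porter algorithm or any real stemmer), a stemming algorithm can be sketched as crude suffix stripping:

```python
def simple_stem(word):
    # Strip common suffixes, keeping a stem of at least 3 characters.
    # A deliberately naive sketch; real stemmers (e.g. Porter) use
    # ordered rule sets with measure conditions.
    for suffix in ("ing", "edly", "ed", "es", "s"):
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

for w in ["jumps", "happened", "running"]:
    print(w, "->", simple_stem(w))
# jumps -> jump, happened -> happen, running -> runn
```

Note that "running" becomes "runn" rather than "run": stems need not be valid words, which is one answer to why lemmatization is sometimes preferred.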

6. How does 'context window' influence the performance of word embeddings in NLP?
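To make the 'context window' concrete, the sketch below (an assumption about the usual skip-gram-style setup, not any specific library) generates the (target, context) training pairs that a word-embedding model would see for a given window size:

```python
def context_pairs(tokens, window=2):
    # For each target word, pair it with every word within `window`
    # positions on either side. A larger window captures broader,
    # more topical context; a smaller one captures syntactic context.
    pairs = []
    for i, target in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((target, tokens[j]))
    return pairs

print(context_pairs(["the", "cat", "sat"], window=1))
# [('the', 'cat'), ('cat', 'the'), ('cat', 'sat'), ('sat', 'cat')]
```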

7. What does TF-IDF stand for in the context of document representation?

8. What is the significance of the term 'TF-IDF' in document representation, and how does it contribute to NLP tasks?
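The standard TF-IDF computation the two questions above refer to can be worked through in a few lines (using the plain log-IDF variant; real toolkits add smoothing):

```python
import math

docs = [["the", "cat", "sat"], ["the", "dog", "barked"], ["the", "cat", "ran"]]

def tf_idf(term, doc, docs):
    tf = doc.count(term) / len(doc)                 # term frequency in this doc
    df = sum(1 for d in docs if term in d)          # document frequency
    idf = math.log(len(docs) / df)                  # inverse document frequency
    return tf * idf

print(tf_idf("cat", docs[0], docs))  # positive: "cat" is in only 2 of 3 docs
print(tf_idf("the", docs[0], docs))  # 0.0: "the" appears in every document
```

This shows why TF-IDF down-weights ubiquitous words: a term present in every document gets an IDF of log(1) = 0.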

9. What are 'stop words' in NLP, and why are they often excluded from text analysis?
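Stop-word removal amounts to filtering tokens against a fixed list; the list below is a tiny illustrative one, not any library's canonical set:

```python
STOP_WORDS = {"the", "a", "is", "and", "of", "to"}  # toy list for illustration

def remove_stop_words(tokens):
    # Case-insensitive filter: drop high-frequency function words that
    # carry little topical meaning for tasks like retrieval or topic modeling.
    return [t for t in tokens if t.lower() not in STOP_WORDS]

print(remove_stop_words(["The", "cat", "is", "on", "the", "mat"]))
# ['cat', 'on', 'mat']
```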

10. Discuss the significance of 'Named Entity Recognition (NER)' in NLP and its real-world applications.

11. What is the purpose of an attention mechanism in NLP models?

12. What is the purpose of stemming in NLP?

13. Examine the impact of imbalanced datasets on the performance of Natural Language Processing models. Propose strategies to address this issue.

14. In machine translation, what does the acronym BLEU stand for?

15. Discuss the challenges and potential solutions in handling sarcasm detection using Natural Language Processing techniques.

16. What is tokenization in the context of NLP?
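A minimal regex-based tokenizer sketches what this question is after; real tokenizers (subword, language-aware) handle far more cases:

```python
import re

def tokenize(text):
    # Split text into word tokens and standalone punctuation marks.
    # \w+ grabs runs of word characters; [^\w\s] grabs each punctuation char.
    return re.findall(r"\w+|[^\w\s]", text)

print(tokenize("Hello, world!"))
# ['Hello', ',', 'world', '!']
```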

17. Compare and contrast the bag-of-words model and word embeddings in NLP. Highlight their respective advantages and limitations.
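One half of this comparison can be shown directly: a bag-of-words vector counts terms over a fixed vocabulary and discards word order, which is exactly its main limitation relative to embeddings:

```python
def bag_of_words(doc, vocab):
    # Build a |V|-dimensional count vector; word order is lost entirely,
    # and out-of-vocabulary tokens are simply ignored.
    counts = {w: 0 for w in vocab}
    for token in doc:
        if token in counts:
            counts[token] += 1
    return [counts[w] for w in vocab]

vocab = ["cat", "dog", "sat", "the"]
print(bag_of_words(["the", "cat", "sat", "the"], vocab))
# [1, 0, 1, 2]
```

Note that "the cat sat" and "sat the cat the" would map to the same vector; dense embeddings trained on context avoid this and capture similarity between related words.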

18. Discuss the challenges associated with cross-lingual Natural Language Processing and propose techniques to overcome language barriers in NLP applications.

19. What is the purpose of the stemming process in NLP?

20. What is the purpose of a word embedding in NLP?

21. Which technique is commonly used for text summarization in NLP?

22. Which neural network architecture is commonly used for named entity recognition?

23. What is the 'long-tail distribution' in the context of language processing?

24. Explain the concept of co-reference resolution in Natural Language Processing. Provide an example scenario where co-reference resolution is crucial.

25. What is the key difference between precision and recall in the context of NLP evaluation metrics?
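The precision/recall distinction is easiest to see computed over sets of predicted vs. gold items (e.g. extracted entities); the names below are illustrative:

```python
def precision_recall(predicted, gold):
    # Precision: of what we predicted, how much was correct?
    # Recall: of what was actually there, how much did we find?
    tp = len(predicted & gold)
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    return precision, recall

pred = {"Paris", "London", "IBM"}
gold = {"Paris", "London", "Berlin", "Google"}
print(precision_recall(pred, gold))
# (0.6666666666666666, 0.5)
```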

26. Which library is commonly used for NLP tasks in Python?

27. Define 'corpus' in NLP and its role in training language models.

28. In the context of neural networks, explain the concept of transfer learning and its application in Natural Language Processing.

29. Which technique is commonly used for sentiment analysis in NLP?

30. What is the role of 'stop words' in text processing? Provide an example.

31. What is the significance of 'syntax tree' in the analysis of sentence structure in NLP?

32. Discuss the challenges associated with 'sentiment analysis' in natural language processing.

33. Define 'TF-IDF (Term Frequency-Inverse Document Frequency)' and its role in text analysis.

34. What is 'word sense disambiguation' in NLP, and why is it important?

35. What is the purpose of a language model in NLP?

36. Discuss the challenges associated with 'machine translation' in natural language processing.

37. How does 'semantic analysis' contribute to the understanding of language in NLP?

38. Which evaluation metric is commonly used for named entity recognition tasks?

39. Examine the ethical considerations in deploying sentiment analysis models, particularly in social media. How can biases be addressed in such applications?

40. Define 'lemmatization' and explain its significance in linguistic analysis.

41. What is the primary purpose of 'tokenization' in Natural Language Processing?

42. Which algorithm is commonly used for text classification in NLP?

43. Explain the concept of word sense disambiguation in Natural Language Processing and provide an example scenario where it is crucial.

44. Explain the role of attention mechanisms in advanced Natural Language Processing models and provide an example of their application.

45. Which deep learning architecture is commonly used for sequence-to-sequence tasks in NLP?

46. Explain the concept of 'Named Entity Recognition (NER)' in NLP and its applications.

47. Discuss the significance of 'part-of-speech tagging' in NLP and its applications.

48. What is the primary goal of Natural Language Processing (NLP)?

49. How does 'lemmatization' differ from 'stemming' in NLP, and why might one be preferred over the other?
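The contrast in this question can be sketched side by side: a lemmatizer maps inflected forms (including irregular ones) to dictionary lemmas via lookup, while a crude suffix-stripping stemmer cannot. The lookup table here is a toy stand-in for a real morphological dictionary:

```python
LEMMA_TABLE = {"mice": "mouse", "ran": "run", "better": "good", "was": "be"}

def lemmatize(word):
    # Dictionary lookup always yields a valid word (the lemma).
    return LEMMA_TABLE.get(word, word)

def crude_stem(word):
    # Suffix stripping knows nothing about irregular forms.
    return word[:-1] if word.endswith("s") else word

print(lemmatize("mice"), "vs", crude_stem("mice"))
# mouse vs mice
```

This is why lemmatization is preferred when valid words matter (search, display), while stemming is cheaper and often sufficient for bag-of-words retrieval.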

50. Define 'recurrent neural network (RNN)' in the context of NLP and its limitations.