Natural Language Processing MCQ



Question 651 : According to Austin, speech acts are direct when

  1. the locutionary and perlocutionary acts coincide
  2. the locutionary and illocutionary acts coincide
  3. no acts coincide
  4. the illocutionary and perlocutionary acts coincide
  

Question 652 : Which of the following NLP tasks use a sequence labeling technique?

  1. POS tagging
  2. Named Entity Recognition
  3. Speech recognition
  4. All of the above
  

Question 653 : Computer vs. computational is an example of ______ morphology.

  1. Inflectional
  2. Derivational
  3. Cliticization
  4. Information Retrieval
  

Question 654 : Which of the following is a merit of Context-Free Grammar?

  1. Simplest style of grammar
  2. They are highly precise
  3. High speed
  4. Efficiency
  

Question 655 : "The dish is displayed on the screen." The type of ambiguity here is

  1. Phonetic
  2. Lexical
  3. Structural
  4. Semantic
  

Question 656 : In which type of morphology are new words created by changing the part of speech? e.g. organize, organization, organizational

  1. Inflectional
  2. Both derivational and inflectional
  3. Semantic
  4. Derivational
  

Question 657 : Choose the area where NLP cannot be useful.

  1. Automatic Text Summarization
  2. Automatic Question-Answering Systems
  3. Information Retrieval
  4. X-Ray Analysis
  

Question 658 : When were the first patents for translating machines applied for?

  1. In the middle of 1930
  2. At the end of 1930
  3. At the beginning of 1930
  4. In the middle of 1931
  

Question 659 : When Gmail extracts only the data from an email you received in order to add it to your Google Calendar, this example denotes

  1. Information extraction
  2. Information retrieval
  3. Information Handling
  4. Information Transformation
  

Question 660 : Which NLP-based system can read out your mails on the telephone or even read out a storybook for you?

  1. Speech Recognition
  2. Machine Translation
  3. Speech Synthesis
  4. Information Retrieval
  

Question 661 : Which data structure is used to give better heuristic estimates?

  1. Forward State-Space
  2. Backward State-Space
  3. Planning Graph
  4. Planning Graph Algorithm
  

Question 662 : After applying which n-gram does "The Cat flys" give the output "The Cat", "Cat flys"?

  1. Unigram
  2. Bigram
  3. Trigram
  4. Quadrigram
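
A minimal sketch of how the bigram split in Question 662 can be produced in Python; the helper function and variable names are illustrative, not part of the question bank.

    # Extract n-grams from a token list; n=2 yields the bigrams "The Cat", "Cat flys".
    def ngrams(tokens, n):
        return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

    tokens = "The Cat flys".split()
    print(ngrams(tokens, 2))   # [('The', 'Cat'), ('Cat', 'flys')]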
  

Question 663 : ____ uses hand-written rules to identify the correct tag

  1. Stochastic POS tagging
  2. Rule-based POS tagging
  3. Transformation-based Tagging
  4. Fuzzy-logic-based Tagging
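
A small illustration of Question 663's idea of hand-written rules, sketched with NLTK's RegexpTagger (NLTK assumed installed); the rule list is a toy example, not a complete tag set.

    # Rule-based POS tagging: the first matching hand-written pattern assigns the tag.
    from nltk.tag import RegexpTagger

    patterns = [
        (r'.*ing$', 'VBG'),        # gerunds
        (r'.*ed$', 'VBD'),         # simple past
        (r'^(the|a|an)$', 'DT'),   # determiners
        (r'.*s$', 'NNS'),          # plural nouns
        (r'.*', 'NN'),             # default: noun
    ]
    tagger = RegexpTagger(patterns)
    print(tagger.tag("the dog is barking".split()))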
  

Question 664 : What is the role of NLP in recommendation engines like Collaborative Filtering?

  1. Extracting features from text
  2. Measuring semantic similarity
  3. Constructing feature vector
  4. All of the mentioned
  

Question 665 : _________ can specify the results of processes described by utterances in a discourse.

  1. Generics
  2. One-anaphora
  3. Inferrables
  4. Discontinuous sets
  

Question 666 : Google Translate is an example of a ________________ application.

  1. Machine translation
  2. Information Retrieval
  3. Information Extraction
  4. Summarisation
  

Question 667 : “Bat is flying in the sky.” Identify the dependency to check in order to perform sense disambiguation of ‘Bat’.

  1. Bat → sky
  2. Sky → fly
  3. Bat → fly
  4. Bat → sky, fly
  

Question 668 : The relation from a verb to a specific manner elaboration of that verb is called

  1. Homonymy
  2. Troponym
  3. Polysemy
  4. Metonymy
  

Question 669 : E.g., the original statement in speech is 'I saw a van', but during speech-to-text conversion the statement becomes "eye awe of an". This type of error can be removed by

  1. Parser
  2. Tagger
  3. N-gram
  4. FST
  

Question 670 : Consider the CFG defined as: X → XY, X → aX | bX | a, Y → Ya | Yb | b. Any string of terminals that can be generated by this CFG

  1. Has at least one b
  2. Ends with a
  3. Has no consecutive a’s and b’s
  4. Has at least 2 a’s.
  

Question 671 : What is the main challenge of NLP?

  1. Handling Ambiguity of Sentences
  2. Handling Tokenization
  3. Handling POS-Tagging
  4. Stemming
  

Question 672 : Which semantic relation exists between the words "piece" and "peace"?

  1. Homophony
  2. Homonymy
  3. Hypernymy
  4. Meronymy
  

Question 673 : How can WordNet be used to measure semantic relatedness between words?

  1. Measure the shortest path between two words in WordNet
  2. Count the number of shared parent nodes
  3. Measure the difference between their depths in WordNet
  4. Measure the difference between the number of child nodes they have
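
A brief sketch of the shortest-path idea from Question 673 using NLTK's WordNet interface; it assumes NLTK is installed and the WordNet corpus has been downloaded, and the synsets chosen are illustrative.

    # Relatedness from the shortest is-a path in WordNet:
    # path_similarity = 1 / (1 + length of the shortest path between the synsets).
    # Assumes nltk.download('wordnet') has already been run.
    from nltk.corpus import wordnet as wn

    dog = wn.synset('dog.n.01')
    cat = wn.synset('cat.n.01')
    print(dog.path_similarity(cat))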
  

Question 674 : Which of the following is not a feature of a binary machine learning classifier?

  1. Length of the keyphrase
  2. Frequency of the keyphrase
  3. The most recurring word in keyphrase
  4. Stemming
  

Question 675 : Which technique is used to find the relevance of a word in a document?

  1. TF-IDF
  2. Lemma
  3. Tokenizer
  4. POS tagging
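
A compact TF-IDF sketch for Question 675 using scikit-learn (assumed installed, version 1.0 or later for get_feature_names_out); the two toy documents are illustrative.

    # TF-IDF weights a word highly when it is frequent in a document
    # but rare across the whole collection.
    from sklearn.feature_extraction.text import TfidfVectorizer

    docs = [
        "the cat sat on the mat",
        "the dog chased the cat",
    ]
    vectorizer = TfidfVectorizer()
    tfidf = vectorizer.fit_transform(docs)        # one row per document
    print(vectorizer.get_feature_names_out())     # vocabulary
    print(tfidf.toarray().round(2))               # higher value = more relevant word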
  

Question 676 : Which of the following will be a better choice to address NLP use cases such as semantic similarity, reading comprehension, and common sense reasoning?

  1. ELMo
  2. OpenAI’s GPT
  3. ULMFiT
  4. GPT-2
  

Question 677 : In NLP, the process of identifying people, organizations, etc. from a given sentence or paragraph is called

  1. Stemming
  2. Lemmatization
  3. Stop word removal
  4. Named Entity Recognition
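
A short named entity recognition sketch for Question 677 with spaCy; it assumes the library and the en_core_web_sm model are installed, and the example sentence is illustrative.

    # NER: label spans of text as PERSON, ORG, GPE, etc.
    # Assumes: pip install spacy && python -m spacy download en_core_web_sm
    import spacy

    nlp = spacy.load("en_core_web_sm")
    doc = nlp("Sundar Pichai announced new products at Google in California.")
    for ent in doc.ents:
        print(ent.text, ent.label_)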
  

Question 678 : Clock = digital / analog / alarm. Which lexical relation does this illustrate?

  1. Polysemy
  2. Meronymy
  3. Hyponymy
  4. Cline
  

Question 679 : Natural language processing is divided into _____ fields.

  1. 5
  2. 6
  3. 3
  4. 2
  

Question 680 : Which of the following techniques can be used to compute similarity between two sentences in NLP?

  1. Lemmatization
  2. Part of Speech Tagging
  3. Cosine Similarity
  4. N-grams
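
A minimal cosine similarity sketch for Question 680 over bag-of-words count vectors; the helper function and the two example sentences are illustrative.

    # Cosine similarity: dot product of the two count vectors divided by their norms.
    import math
    from collections import Counter

    def cosine_similarity(s1, s2):
        v1, v2 = Counter(s1.lower().split()), Counter(s2.lower().split())
        dot = sum(v1[w] * v2[w] for w in set(v1) & set(v2))
        norm1 = math.sqrt(sum(c * c for c in v1.values()))
        norm2 = math.sqrt(sum(c * c for c in v2.values()))
        return dot / (norm1 * norm2)

    print(cosine_similarity("the cat sat on the mat", "the cat lay on the rug"))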
  

Question 681 : In linguistic morphology _____________ is the process for reducing inflected words to their root form.

  1. Rooting
  2. Stemming
  3. Text-Proofing
  4. Both Rooting & Stemming
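
A tiny stemming sketch for Question 681 with NLTK's Porter stemmer (NLTK assumed installed); the word list is illustrative.

    # Stemming chops inflectional endings to reach a root-like form.
    from nltk.stem import PorterStemmer

    stemmer = PorterStemmer()
    for word in ["running", "flies", "organizations"]:
        print(word, "->", stemmer.stem(word))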
  

Question 682 : How is the conditional probability P(B | A) = P(A, B) / P(A) rewritten in a language model?

  1. P(A, B) = P(A) P(B | A)
  2. P(A, B) = P(A) P(A | B)
  3. P(A, B) = P(B) P(B | A)
  4. P(A) = P(A) P(B | A)
  

Question 683 : Given a sentence S = "w1 w2 w3 ... wn", how would you compute the likelihood of S using a bigram model?

  1. Calculate the conditional probability of each word in the sentence given the preceding word and add the resulting numbers
  2. Calculate the conditional probability of each word in the sentence given the preceding word and multiply the resulting numbers
  3. Calculate the conditional probability of each word given all preceding words in a sentence and add the resulting numbers
  4. Calculate the conditional probability of each word given all preceding words in a sentence and multiply the resulting numbers
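
A small sketch tying Questions 682 and 683 together: the chain rule P(A, B) = P(A) P(B | A) is applied word by word, with each bigram probability estimated as count(prev, cur) / count(prev), and the sentence likelihood is the product of these terms. The toy corpus and names are illustrative; smoothing and the start-of-sentence term are omitted.

    # Bigram sentence likelihood: multiply P(w_i | w_{i-1}) estimated from counts.
    from collections import Counter

    corpus = ["the cat sat", "the cat ran", "the dog sat"]
    unigrams, bigrams = Counter(), Counter()
    for sent in corpus:
        tokens = sent.split()
        unigrams.update(tokens)
        bigrams.update(zip(tokens, tokens[1:]))

    def likelihood(sentence):
        prob = 1.0
        tokens = sentence.split()
        for prev, cur in zip(tokens, tokens[1:]):
            prob *= bigrams[(prev, cur)] / unigrams[prev]   # P(cur | prev)
        return prob

    print(likelihood("the cat sat"))   # (2/3) * (1/2) = 0.333...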
  

Question 684 : What is most commonly described as 'language above the sentence level' or as 'language in use'?

  1. Discourse
  2. Word Level Analysis
  3. Semantic Analysis
  4. Syntax Analysis
  

Question 685 : Consider the following sentences. "The horse ran up the hill. It was very steep. It soon got tired." What type of ambiguity is introduced due to the word "it"?

  1. Syntactic
  2. Pragmatics
  3. Cataphoric
  4. Anaphoric
  

Question 686 : What is irony?

  1. Using language to signal attitude other than what has been literally said.
  2. Using words that are context bound
  3. A mixture of vague language and humour
  4. The process of deriving implied meanings.
  

Question 687 : In the sentence, “He ate the pizza”, the BOLD part is an example of _____.

  1. Noun phrase
  2. Verb phrase
  3. Prepositional phrase
  4. Adverbial phrase
  

Question 688 : The study of how knowledge about the world and language conventions interact with literal meaning is called ____________.

  1. Morphology
  2. Discourse analysis
  3. Co reference
  4. Reference Resolution
  

Question 689 : ____________ are the entities that have been previously introduced into the discourse.

  1. Anaphoras
  2. Cataphoras
  3. Pronouns
  4. Determiners
  

Question 690 : Any question answering system is classified into _________ and ___________ types.

  1. Locked Domain QAS, Unlocked Domain QAS
  2. Easy Domain QAS, Difficult Domain QAS
  3. Closed Domain QAS, Open Domain QAS
  4. Direct Domain QAS, Indirect Domain QAS
  

Question 691 : What is the reason for stop word removal?

  1. Stop words slow down processing
  2. Stop words enhance the speed of searching
  3. Stop word removal programs are easily available
  4. This is routine pre-processing without any benefit
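
A short stop word removal sketch related to Question 691, using NLTK's English stop word list; it assumes NLTK is installed and nltk.download('stopwords') has been run, and the sentence is illustrative.

    # Drop high-frequency function words before indexing or searching.
    from nltk.corpus import stopwords

    stop_words = set(stopwords.words("english"))
    tokens = "this is a simple example of stop word removal".split()
    print([t for t in tokens if t not in stop_words])
    # ['simple', 'example', 'stop', 'word', 'removal']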
  

Question 692 : What is the full form of NLP?

  1. Natural Language Processing
  2. Nature Language Processing
  3. Natural Language Process
  4. Natural Language Pages
  

Question 693 : Co-reference resolution is

  1. Anaphora Resolution
  2. Given a sentence or larger chunk of text, determine which words (“mentions”) refer to the same objects (“entities”)
  3. Counting the frequency of defined terms.
  4. Notation of vector for every sentence.
  

Question 694 : Which of the following is not a rule of language?

  1. Lexicalization
  2. Morphology
  3. Semantics
  4. Phonology
  

Question 695 : FST is used in _____ analysis.

  1. Lexical
  2. Morphological
  3. Semantic
  4. Syntactic
  

Question 696 : Discourse analysis is a part of ____________.

  1. Semantic Analysis
  2. Syntax Analysis
  3. Pragmatics
  4. Morphology
  

Question 697 : Automatic text summarization is not useful for

  1. Creating a short summary
  2. Reducing reading time
  3. Accelerating research
  4. Improving redundancy
  

Question 698 : The process of assigning tags or categories to text according to its content is called

  1. Sentiment Analysis
  2. Text Summarization
  3. Information Retrieval
  4. Text classification
  

Question 699 : In 1969, Roger Schank introduced ________________ dependency theory for NL understanding.

  1. Conceptual
  2. Bilateral
  3. Trilateral
  4. Textual
  

Question 700 : "Buy books for children." Which type of ambiguity exists in the above sentence?

  1. Semantic
  2. Syntactic
  3. Lexical
  4. Pragmatic