CS/B.Tech/(CSE)/SEM-8/CS-802B/2012
2012
NATURAL LANGUAGE PROCESSING
Time Allotted : 3 Hours Full Marks : 70
The figures in the margin indicate full marks.
Candidates are required to give their answers in their own words
as far as practicable.
GROUP – A
( Multiple Choice Type Questions )
1) Choose the correct alternatives for the following : 10 × 1 = 10
i) The period ( . ) in a Regular Expression is used to specify
a) any context
b) any number
c) any character
d) none of these.
ii) Word probability is calculated by
a) Likelihood probability
b) Prior probability
c) Bayes' rule
d) none of these.
iii) Minimum edit distance is computed by
a) Phonology
b) Dynamic programming
c) Tautology
d) Hidden Markov Model ( HMM ).
iv) Brackets [ ] in a Regular Expression are used to specify
a) disjunction of characters
b) disjunction of numbers
c) a word sequence
d) none of these.
v) The Viterbi algorithm is used in
a) Speech processing
b) Language processing
c) Speech & Language processing
d) none of these.
vi) In the deleted interpolation algorithm, which symbol is used ?
a) α
b) β
c) γ
d) μ
vii) Entropy is used to
a) measure the information
b) correct the information
c) detect the information
d) handle the noise.
viii) Phrase Structure Grammar is used in
a) Regular Grammar
b) Context–Free Grammar ( CFG )
c) Context–Sensitive Grammar ( CSG )
d) None of these.
ix) ______________________________
a) nouns
b) verbs
c) both (a) and (b)
d) none of these.
x) Subcategorization of verbs is classified into
a) Transitive
b) Intransitive
c) both (a) & (b)
d) none of these.
GROUP – B
( Short Answer Type Questions )
Answer any three of the following. 3 × 5 = 15
2) What is a Regular Expression ? Write down the Regular Expression for each of the following languages :
a) the set of all alphabetic strings
b) Column 1 Column 2 Column 3
c) 5.7 Gb.
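One possible set of answers to 2), sketched in Python (the pattern choices are ours; a marking scheme may accept variants):

import re

# a) the set of all alphabetic strings
alphabetic = re.compile(r"^[a-zA-Z]+$")
# b) lines of the form "Column 1 Column 2 Column 3 ..."
columns = re.compile(r"^(Column [0-9]+ ?)+$")
# c) the string "5.7 Gb" (the period must be escaped)
gigabytes = re.compile(r"5\.7 Gb")

print(bool(alphabetic.match("flight")))                     # True
print(bool(columns.match("Column 1 Column 2 Column 3")))    # True
print(bool(gigabytes.search("needs 5.7 Gb of disk space")))  # True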
3) Define two-level Morphology with a suitable example. Briefly describe the different types of Error Handling mechanisms.
4) Define Inflectional and Derivational Morphology with suitable examples. What is a stem ? What is a morpheme ?
5) Why is POS ( Part-of-Speech ) Tagging required in NLP ( Natural Language Processing ) ? Briefly compare the Top-Down and Bottom-Up Parsing techniques.
6) Explain the concept of a feature structure. What is unification ? What is Word Sense Disambiguation ( WSD ) ?
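As an illustration for 6), a toy unification over feature structures encoded as nested dicts (a sketch; the unify helper is hypothetical, not a library API):

def unify(f1, f2):
    """Unify two feature structures given as (possibly nested) dicts.
    Returns the merged structure, or None on a feature clash."""
    if not isinstance(f1, dict) or not isinstance(f2, dict):
        return f1 if f1 == f2 else None   # atomic values must match
    result = dict(f1)
    for key, value in f2.items():
        if key in result:
            merged = unify(result[key], value)
            if merged is None:
                return None               # conflict, unification fails
            result[key] = merged
        else:
            result[key] = value
    return result

# [AGREEMENT [NUMBER sg]] unifies with [AGREEMENT [PERSON 3]]
print(unify({"agreement": {"number": "sg"}},
            {"agreement": {"person": "3"}}))
# sg vs pl clash: unification fails and returns None
print(unify({"agreement": {"number": "sg"}},
            {"agreement": {"number": "pl"}}))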
GROUP – C
( Long Answer Type Questions )
Answer any three of the following. 3 × 15 = 45
7) a) Define wordform, lemma, type and token.
b) Briefly describe the role of a Finite State Transducer ( FST ) with a suitable example.
c) Define prior probability and likelihood probability using the Bayesian method.
d) What is a Confusion Matrix ? Why is it required in NLP ( Natural Language Processing ) ?
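For 7) c), the decomposition usually expected is the noisy-channel form of Bayes' rule (a sketch in standard notation, with o the observation and w a candidate word):

\hat{w} = \operatorname*{argmax}_{w} P(w \mid o)
        = \operatorname*{argmax}_{w} \frac{P(o \mid w)\, P(w)}{P(o)}
        = \operatorname*{argmax}_{w} \underbrace{P(o \mid w)}_{\text{likelihood}} \, \underbrace{P(w)}_{\text{prior}}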
8) a) What is Smoothing ? Why is it required ?
b) Write down the equation for trigram probability estimation.
c) Write down the equation for the discount d_c = c*/c used in add-one smoothing. Is the same discount used in Witten-Bell smoothing ? How do the two differ ?
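For reference against 8) b) and c): the maximum-likelihood trigram estimate, and the add-one adjusted count and discount for a unigram model (a sketch; N is the number of training tokens and V the vocabulary size, whereas Witten-Bell instead reserves probability mass using the number of distinct types observed):

P(w_n \mid w_{n-2}\, w_{n-1}) = \frac{C(w_{n-2}\, w_{n-1}\, w_n)}{C(w_{n-2}\, w_{n-1})}

c^* = (c + 1)\, \frac{N}{N + V}, \qquad d_c = \frac{c^*}{c}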
9) a) Compute the minimum edit distance by hand : determine how close the word intention is to the word execution by calculating their minimum edit distance.
b) Estimate P( t | c ), where c_p is the p-th character of the word c, using the four confusion matrices of Kernighan et al., one for each type of single error.
c) Briefly describe the Hidden Markov Model ( HMM ).
d) Compare open class & closed class word groups with suitable examples.
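A way to check the hand computation asked for in 9) a): a minimal dynamic-programming sketch (assuming unit costs; the Jurafsky & Martin convention charges 2 per substitution, which the sub_cost parameter covers):

# Minimum edit distance via dynamic programming (Levenshtein).
def min_edit_distance(source: str, target: str, sub_cost: int = 1) -> int:
    n, m = len(source), len(target)
    # dp[i][j] = distance between source[:i] and target[:j]
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        dp[i][0] = i          # i deletions
    for j in range(1, m + 1):
        dp[0][j] = j          # j insertions
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            same = source[i - 1] == target[j - 1]
            dp[i][j] = min(
                dp[i - 1][j] + 1,                            # delete
                dp[i][j - 1] + 1,                            # insert
                dp[i - 1][j - 1] + (0 if same else sub_cost),  # substitute/copy
            )
    return dp[n][m]

print(min_edit_distance("intention", "execution"))      # 5 with unit costs
print(min_edit_distance("intention", "execution", 2))   # 8 with the J&M convention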
10) a) Draw the tree structure for the following ATIS sentences :
I prefer a morning flight
I want a morning flight
using the grammar :
S → NP VP
NP → Pronoun | Proper-Noun | Det Nominal
Nominal → Noun Nominal | Noun
VP → Verb | Verb NP | Verb NP PP | Verb PP
b) Write rules expressing the verbal subcategory of English auxiliaries, with examples.
c) Define predeterminers, cardinal numbers, ordinal numbers and quantifiers with suitable examples.
d) How are Transformation Based Learning ( TBL ) Rules applied in NLP ( Natural Language Processing ) ?
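For 10) a), the trees can also be checked mechanically; a sketch assuming the NLTK library is installed, with the minimal lexical rules (our additions, not part of the question) that its grammar needs:

import nltk

# The grammar from 10) a), plus invented lexical rules; the ProperNoun
# and PP branches are filled in only so the grammar is self-contained,
# and are not used by these two sentences.
grammar = nltk.CFG.fromstring("""
S -> NP VP
NP -> Pronoun | ProperNoun | Det Nominal
Nominal -> Noun Nominal | Noun
VP -> Verb | Verb NP | Verb NP PP | Verb PP
PP -> Preposition NP
Pronoun -> 'I'
ProperNoun -> 'Delta'
Det -> 'a'
Noun -> 'morning' | 'flight'
Verb -> 'prefer' | 'want'
Preposition -> 'from'
""")

parser = nltk.ChartParser(grammar)
for words in (["I", "prefer", "a", "morning", "flight"],
              ["I", "want", "a", "morning", "flight"]):
    for tree in parser.parse(words):
        tree.pretty_print()   # prints each parse tree as ASCII art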
11) Write short notes on any three of the following :
a) Weighted Automata
b) Bayes' rule in the noisy channel
c) Stochastic Part-of-Speech Tagging
d) HMM Tagging
e) Constituency & Agreement.
f) Problems with the basic top-down parser.
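Touching notes c) and d): a minimal Viterbi decoder for a toy HMM tagger (a sketch; the states and probability tables below are invented purely for illustration):

# Viterbi decoding for a toy two-state HMM POS tagger.
states = ["Noun", "Verb"]
start = {"Noun": 0.6, "Verb": 0.4}
trans = {"Noun": {"Noun": 0.3, "Verb": 0.7},
         "Verb": {"Noun": 0.8, "Verb": 0.2}}
emit = {"Noun": {"flights": 0.5, "book": 0.2},
        "Verb": {"flights": 0.1, "book": 0.6}}

def viterbi(words):
    # v[t][s] = probability of the best tag sequence ending in s at step t
    v = [{s: start[s] * emit[s].get(words[0], 1e-6) for s in states}]
    back = []
    for word in words[1:]:
        col, ptr = {}, {}
        for s in states:
            prev, p = max(((r, v[-1][r] * trans[r][s]) for r in states),
                          key=lambda x: x[1])
            col[s] = p * emit[s].get(word, 1e-6)
            ptr[s] = prev
        v.append(col)
        back.append(ptr)
    # Follow the back-pointers from the best final state.
    last = max(states, key=lambda s: v[-1][s])
    tags = [last]
    for ptr in reversed(back):
        tags.append(ptr[tags[-1]])
    return list(reversed(tags))

print(viterbi(["book", "flights"]))   # ['Verb', 'Noun'] with these tables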
-----------------------x-------------------