University of Calcutta
Admission to the Ph.D. Programme in Statistics: 2021
Procedure:
1. The conditions for eligibility will be guided overall by the rules specified in the notification regarding Regulations for the Degree of Doctor of Philosophy (Ph.D.) of the University of Calcutta.

2. Eligibility: Candidates with an M.Sc. or equivalent degree in Statistics and allied subjects from any UGC-recognized University/Institute, with 55% marks in aggregate (SC/ST/OBC with 50% marks), are eligible to apply for admission to the Ph.D. programme.

(a) In the case of candidates from other Universities, admission to the Ph.D. Programme will proceed after determination of equivalence by the relevant University body and fulfilment of the admission criteria.

(b) Foreign students will be required to produce clearance from the Government of India and/or other appropriate authorities, if any, for admission to the Ph.D. Programme. Enrolment in the Ph.D. Programme may be allowed only to foreign nationals who have obtained and hold a research visa.

(c) OBC candidates will have to submit a non-creamy layer certificate.

3. The admission procedure consists of a written test followed by an interview for candidates who qualify in the written test. Candidates who have cleared the UGC/CSIR (JRF) examinations / NET / SET (Mathematical Sciences) / GATE (Mathematics), hold a DST INSPIRE Fellowship / Teachers' Fellowship, or have obtained an M.Phil. degree in Statistics and allied subjects or the M.Tech. (QR&OR) degree of ISI prior to the application deadline will be exempted from the written examination, but will still have to apply and appear for the interview.

4. Number of seats: 6
5. Reservations will be followed as per the existing West Bengal Higher Educational
Institutions (Reservations in Admissions) Rules.
6. Application forms may be downloaded from the university website
(http://www.caluniv.ac.in/admission/CU_RET_Form.pdf).
Application Deadlines: as per the schedule of dates given below.
Payment of Fees
The prescribed application fee of Rs. 100/- is to be deposited online through SBI Collect
https://www.onlinesbi.com/sbicollect/
To make the payment, go to the site specified above, choose State (West Bengal) and Type (Educational Institution), then select Calcutta University, proceed to Category (Payment of Misc. Fees), choose "Application for Admission to PhD" and proceed to the next step.

Form Submission
1. The following documents need to be submitted offline to the Office of the Department of Statistics, University of Calcutta, or can be mailed directly to [email protected]:
● Recent colour photo of the candidate.
● M.Sc. final marksheet.
● Rank Card/Certificate etc. of NET/SET/GATE/M.Phil. etc.
● Caste Certificate, if applicable (SC/ST/OBC-A/OBC-B etc.).
● SBI Collect Transaction document.
2. Please note that candidates who are eligible for waiver of the written test are also required to complete and submit the application form by the deadline given in the schedule below.

Structure of the entrance examination and subsequent process:
Date of Advertisement : September 15, 2021
Last date of submission of application form : October 30, 2021
Date of Ph.D. Entrance Test (online/offline) : November 15, 2021
Result of Ph.D. Entrance Test : November 22, 2021
Date of Interview : November 29, 2021
Date of publication of selection list : December 02, 2021

1. There will be 24 short-answer-type questions of 5 marks each, out of which candidates have to answer 15 questions (a worked total is given in the note after this list).
2. The qualifying marks for the Entrance Test will be 50%.
3. The mode of the entrance examination will be informed later.
4. Candidates successful in the written examination would have to compete with other eligible candidates who have already cleared NET / SET / GATE / M.Phil. / M.Tech. (QR&OR) at the interview stage. The list of finally selected candidates will be posted on the University website and the Departmental Notice Board.
5. Candidates selected for the final interview will be required to submit, by a specified date before the interview, a Statement of Purpose (SoP) that should at least state his/her areas of interest. However, the selection committee may, at its discretion, require a candidate to opt for a topic/area other than his/her initial choice before admitting him/her into the Ph.D. programme. The final date for submitting the SoP will be announced along with the intimation for the interview.
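
Note (illustrative): with 15 questions to be answered at 5 marks each, the maximum attainable score is 15 × 5 = 75 marks; assuming the 50% qualifying requirement applies to this maximum attainable score (the notice does not state this explicitly), the qualifying mark would be 37.5 out of 75.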
Detailed Syllabus for the common M.Phil.-Ph.D. Entrance Examination:
Real Analysis:
Real Number System, Cluster points of sets, Closed and open sets, Compact sets, Bolzano-Weierstrass Property, Heine-Borel Property. Sets of real vectors, Sequences and series, Convergence. Real valued functions. Limit, Continuity and Uniform continuity. Differentiability of univariate functions. Mean value theorems. Extrema of functions. Riemann integral. Improper integrals. Sequences and series of functions, Uniform convergence, Power series. Term-by-term differentiation and integration.
Probability
Fields, sigma-fields and generators, semifields, Borel sigma-field on R and R^k. Monotone classes. Measurable functions and properties, compositions; product sigma-fields, Borel sigma-field on Euclidean spaces. Measures, finite and sigma-finite measures. Probability measures, properties. Independence of events, Borel-Cantelli lemmas. Extensions of measures, Lebesgue measure on R and R^k, induced measures. Random variables, Distribution functions, measures on R and R^k. Probability distributions. Discrete and absolutely continuous distributions, probability densities. Convergence in probability and almost sure convergence. Integration: simple, nonnegative and general measurable functions, integrability, MCT, DCT, Fatou's lemma. Change of variables. Holder's and Minkowski's inequalities. Expectations, moments. Jensen's inequality.
Product measures. Fubini's theorem. Independence of random variables. Sums, variances, covariances. Second Borel-Cantelli lemma. Kolmogorov's 0-1 law. Weak and strong laws of large numbers. Kolmogorov's inequality. Convergence in distribution. Integration of complex-valued functions, characteristic functions. Inversion and Continuity theorems. Central Limit Theorems.
L_p-convergence of random variables. Connections between various modes of convergence (in distribution, in probability, L_p, almost sure). Absolute continuity and singularity of measures. Radon-Nikodym theorem (statement).
Linear Algebra and Linear Programming
Vectors and Matrices: Vector spaces and subspaces, Linear dependence and independence, span, basis, orthogonality and orthonormality.
Matrix algebra.
Linear programming: Graphical Solution and Simplex Algorithm.
Sampling Distributions
Non-central χ², t and F distributions – definitions and properties. Distribution of quadratic forms – Cochran's theorem.
Large Sample Theory
Scheffe’s theorem, Slutsky’s theorem. Asymptotic normality, multivariate CLTs, delta method.
Glivenko-Cantelli Lemma
Asymptotic distributions of sample moments and functions of moments, Asymptotic
distributions of Order Statistics and Quantiles. Consistency and Asymptotic Efficiency of
Estimators, Large sample properties of Maximum Likelihood estimators. Asymptotic
distributions and properties of Likelihood ratio tests, Rao’s test and Wald’s tests in the simple
hypothesis case.
Statistical Inference
Sufficiency & completeness, Notions of minimal sufficiency, bounded completeness and ancillarity, Exponential family. Point estimation: Bhattacharya system of lower bounds to variance of estimators. Minimum variance unbiased estimators – Applications of Rao-Blackwell and Lehmann-Scheffe theorems. Testing of Hypothesis: nonrandomized and randomized tests, critical function, power function. MP tests – Neyman-Pearson Lemma. UMP tests. Monotone Likelihood Ratio families. Generalized Neyman-Pearson Lemma. UMPU tests for one parameter families. Locally best tests. Similar tests. Neyman structure. UMPU tests for composite hypotheses.
Confidence sets: relation with hypothesis testing. Optimum parametric confidence intervals.
Sequential tests. Wald’s equation for ASN. SPRT and its properties – fundamental identity. O.C.
and ASN. Optimality of SPRT (under usual approximation).
Linear Models
Gauss Markov Model: Estimable function, error function, BLUE, Gauss Markov theorem.
Correlated set-up, least squares estimate with restriction on parameters.
Linear Set, General linear hypothesis –related sampling distributions, Multiple comparison
techniques due to Scheffe and Tukey.
Analysis of variance: Balanced classification, Fixed Effects Model, Random Effects Model and
Mixed Effects Model; Inference on Variance components.
Regression analysis, Analysis of covariance.
Regression Analysis
Building a regression model: Transformations – Box-Cox model, Stepwise regression, Model selection (adjusted R², cross-validation and Mallows' Cp criteria, AIC and BIC), Multicollinearity.
Detection of outliers and influential observations: residuals and leverages, DFBETA, DFFIT and
Cook’s Distance.
Checking for normality: Q-Q plots, Normal Probability plot, Shapiro-Wilk test.
Departures from the Gauss-Markov set-up: Heteroscedasticity and Autocorrelation – detection
and remedies.
Longitudinal Data Analysis – introduction with motivation. Exploring longitudinal data. Linear
models for longitudinal data –introduction, mean models, covariance models, mixed effects
models. Predictions.
Types of data. Two-way classified data – Contingency Tables and associated distributions, Types
of studies, Relative Risk and Odds Ratio and their properties. More-than-two-way classified data
– partial associations, marginal and conditional odds.
Generalized Linear Models: Introduction, Components of a GLM, Goodness of fit – deviance,
Residuals, Maximum likelihood estimation.
Binary data and Count data: ungrouped and grouped. Polytomous data.
Overdispersion, Quasi-likelihood.
Models with constant coefficient of variation, joint modeling of mean and variance, Generalized
additive models.
Discrete longitudinal data – generalized linear marginal models, GEE for marginal models,
Generalized linear subject specific models and transition models.
Design of Experiments
Block Designs: Connectedness, Orthogonality, Balance and Efficiency; Resolvable designs;
Properties of BIB designs, Designs derived from BIB designs.
Intrablock analysis of BIB, Lattice and PBIB designs, Row column designs, Youden Square
designs; Recovery of inter-block information in BIB designs; Missing plot technique.
Construction of mutually orthogonal Latin Squares (MOLS); Construction of BIB designs
through MOLS and Bose’s fundamental method of differences.
Factorial designs: Analysis, Confounding and balancing in Symmetric Factorials.
Sample Surveys
Probability sampling from a finite population – Notions of sampling design, sampling scheme, inclusion probabilities, Horvitz-Thompson estimator of a population total. Basic sampling schemes – Simple random sampling with and without replacement, Unequal probability sampling with and without replacement, Systematic sampling. Related estimators of population total/mean, their variances and variance estimators – Mean per distinct unit in simple random with replacement sampling, Hansen-Hurwitz estimator in unequal probability sampling with replacement, Des Raj and Murthy's estimators (for samples of size two) in unequal probability sampling without replacement. Stratified sampling – Allocation problem and construction of strata. Ratio, Product, Difference and Regression estimators. Unbiased Ratio estimators – Probability proportional to aggregate size sampling, Hartley-Ross estimator in simple random sampling. Sampling and sub-sampling of clusters. Two-stage sampling with equal/unequal number of second stage units and simple random sampling without replacement / unequal probability sampling with replacement at the first stage, Ratio estimation in two-stage sampling. Double sampling for stratification. Double sampling ratio and regression estimators. Sampling on successive occasions.
Bayesian Analysis
Different Priors and related Posteriors
Estimation, testing and prediction for Univariate Normal distribution with known / unknown
mean and / or variance.
Hierarchical and Empirical Bayes under normal setup.
Prior and posterior analysis in Generalized linear models.
Decision Theory
Risk function, Admissibility of decision rules, Complete, essentially complete, minimal complete and minimal essentially complete classes. Essential completeness and completeness of the class of rules based on a sufficient statistic and of the class of nonrandomized rules for convex loss.
Resampling Techniques
Empirical distribution function and its properties
Jackknife and Bootstrap for estimating bias and standard error.
Consistency of the Jackknife variance estimate in an iid setup.
Bootstrap confidence intervals.
Stochastic Processes
Poisson process. Renewal Theory: renewal processes, renewal function, elementary renewal
theorem, applications, Blackwell’s theorem and key renewal theorem (statements), applications,
alternating renewal processes, applications to limiting excess and age.
Markov chains: time-homogeneity, one-step & multi-step transition probabilities, Chapman-Kolmogorov equations, Markov times, strong Markov property, classification of states, stationary distributions, periodicity, ergodicity, convergence, convergence rate. Examples: birth-and-death processes, branching processes.
Jump-Markov processes: conservativeness, transition probabilities, holding times, embedded
Markov chain, Chapman-Kolmogorov equations, Kolmogorov backward and forward equations,
stationary distributions. Examples: pure birth, birth-and-death chains, Markovian queues.
Time Series Analysis
Stationary time series. Autocorrelation and partial autocorrelation functions. Correlogram.
Box-Jenkins Models – identification, estimation and diagnostic checking.
Volatility – ARCH, GARCH models.
Multivariate Analysis:
Multivariate normal distribution and its properties – marginal and conditional distributions. Random sampling from a multivariate normal distribution – UMVUE and MLE of parameters, joint distribution of sample mean vector and SS-SP matrix; Wishart distribution and its properties. Distribution of sample correlation coefficients, partial and multiple correlation coefficients, partial regression coefficient and intraclass correlation coefficient. Distributions of Hotelling's T² and Mahalanobis' D² statistics – their applications in testing and confidence set construction. Multivariate linear model, MANOVA for one-way and two-way classified data.
Applied Multivariate Analysis
Clustering: Hierarchical clustering for continuous and categorical data – different choices of proximity measures, Agglomerative and Divisive algorithms. K-means clustering – optimum choice of the number of clusters.
Classification and discrimination procedures: Discrimination between two known populations
– Bayes, Minimax and Likelihood Ratio procedures. Discrimination between two multivariate
normal populations. Sample discriminant function. Likelihood ratio rule. Tests associated with
discriminant function, Probabilities of misclassification and their estimation. Classification of
several populations. Fisher’s method for discriminating among several populations.
Principal Component Analysis: Population and sample Principal components and their uses.
Plotting techniques, Large sample inferences.
Factor Analysis: The orthogonal factor model, Estimation of factor loading, Factor rotation,
Estimation of Factor scores, Interpretation of Factor Analysis.
Canonical Correlations: Population and sample canonical variables and canonical correlations
and their interpretations. Plotting techniques, Large sample inferences.
