Syllabus
Course Code: Elective-I MTEC-109 | Course Name: Statistical Information Processing
MODULE NO / UNIT | COURSE SYLLABUS CONTENTS OF MODULE | NOTES |
---|---|---|
1 | Review of random variables: probability concepts, distribution and density functions, moments; independent, uncorrelated and orthogonal random variables; vector-space representation of random variables, vector quantization, Chebyshev's inequality theorem, Central Limit Theorem, discrete and continuous random variables. Random processes: expectations, moments, ergodicity, discrete-time random processes, stationary processes, autocorrelation and autocovariance functions, spectral representation of random signals, properties of the power spectral density, Gaussian processes and the white-noise process. | |
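The Chebyshev inequality listed in this module bounds the tail probability of any random variable with finite variance: P(|X − μ| ≥ kσ) ≤ 1/k². A minimal illustrative sketch (not part of the syllabus) checks the bound empirically on a simulated Gaussian sample:

```python
# Illustrative sketch: Chebyshev's inequality P(|X - mu| >= k*sigma) <= 1/k^2,
# checked empirically against a simulated Gaussian sample.
import random
import statistics

random.seed(0)
samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]
mu = statistics.fmean(samples)
sigma = statistics.stdev(samples)

k = 2.0
tail_fraction = sum(abs(x - mu) >= k * sigma for x in samples) / len(samples)
bound = 1.0 / k**2  # distribution-free Chebyshev bound

print(f"empirical tail probability: {tail_fraction:.4f}")
print(f"Chebyshev bound:            {bound:.4f}")
```

For a Gaussian source the empirical two-sided 2σ tail is near 0.046, comfortably inside the distribution-free bound of 0.25, which shows how loose (but universal) the inequality is.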
2 | Random signal modelling: MA(q), AR(p), ARMA(p,q) models; Hidden Markov Models and their applications; linear systems with random inputs; forward and backward prediction; the Levinson-Durbin algorithm. Statistical decision theory: Bayes' criterion, binary hypothesis testing, M-ary hypothesis testing, minimax criterion, Neyman-Pearson criterion, composite hypothesis testing. Parameter estimation theory: maximum likelihood estimation, generalized likelihood ratio test, some criteria for good estimators; Bayes' estimation: minimum mean-square error estimate, minimum mean absolute value of error estimate, maximum a posteriori estimate; multiple parameter estimation, best linear unbiased estimator, least-squares estimation, recursive least-squares estimator. | |
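The Levinson-Durbin algorithm named in this module solves the Yule-Walker equations order-recursively, turning autocorrelation lags into AR(p) prediction coefficients. A minimal sketch (the function name `levinson_durbin` is illustrative, not from the syllabus):

```python
# Illustrative sketch of the Levinson-Durbin recursion: given autocorrelation
# lags r[0..p], compute AR(p) coefficients a[1..p] and the prediction error power.
def levinson_durbin(r, p):
    a = [0.0] * (p + 1)   # a[0] is implicitly 1 in the AR model
    e = r[0]              # prediction error power, initialised to r[0]
    for i in range(1, p + 1):
        # reflection coefficient k_i for the order-i predictor
        acc = r[i] - sum(a[j] * r[i - j] for j in range(1, i))
        k = acc / e
        a_new = a[:]
        a_new[i] = k
        for j in range(1, i):          # order-update of earlier coefficients
            a_new[j] = a[j] - k * a[i - j]
        a = a_new
        e *= (1.0 - k * k)             # error power shrinks at each order
    return a[1:], e

# Example: an AR(1) process x[n] = 0.8 x[n-1] + w[n] has r[k] proportional
# to 0.8**k; the recursion should recover the single coefficient 0.8.
r = [0.8 ** k for k in range(3)]
coeffs, err = levinson_durbin(r, 2)
print(coeffs)  # -> [0.8, 0.0]
```

Because the input autocorrelation is exactly that of an AR(1) process, the second coefficient comes out zero, showing how the recursion also reveals the true model order.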
3 | Spectral analysis: estimated autocorrelation function, periodogram, averaging the periodogram (Bartlett method), Welch modification, parametric methods, AR(p) spectral estimation, and detection of harmonic signals. Information theory and source coding: introduction; uncertainty, information and entropy; source coding theorem; Huffman, Shannon-Fano, arithmetic and adaptive coding; RLE, LZW data compaction, LZ-77, LZ-78. Discrete memoryless channels, mutual information, channel capacity, channel coding theorem, differential entropy and mutual information for continuous ensembles. | |
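Huffman coding and the source coding theorem from this module connect directly: the average codeword length of an optimal prefix code is bounded below by the source entropy H(X). A minimal sketch (illustrative only, not syllabus material) builds a Huffman code for a small dyadic source, where the bound is met with equality:

```python
# Illustrative sketch: Huffman coding for a small discrete memoryless source,
# compared against the entropy bound H(X) <= average code length.
import heapq
import math

def huffman_code(probs):
    """probs: dict symbol -> probability. Returns dict symbol -> codeword."""
    # heap entries are (probability, tiebreak, tree); a tree is a symbol
    # (leaf) or a pair of subtrees (internal node)
    heap = [(p, i, s) for i, (s, p) in enumerate(sorted(probs.items()))]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, t1 = heapq.heappop(heap)   # merge the two least probable
        p2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (p1 + p2, count, (t1, t2)))
        count += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            codes[tree] = prefix or "0"   # single-symbol edge case
    walk(heap[0][2], "")
    return codes

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
codes = huffman_code(probs)
avg_len = sum(probs[s] * len(codes[s]) for s in probs)
entropy = -sum(p * math.log2(p) for p in probs.values())
print(codes, avg_len, entropy)
```

Because every probability here is a power of 1/2, both the entropy and the average code length equal 1.75 bits/symbol; for non-dyadic sources the average length exceeds the entropy by less than one bit.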
4 | Application of information theory: groups, rings and fields; vector spaces; GF addition and multiplication rules. Introduction to BCH codes: primitive elements, minimal polynomials, generator polynomials in terms of minimal polynomials, some examples of BCH codes and their decoder; Reed-Solomon codes and decoder; implementation of Reed-Solomon encoders and decoders. | |
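The GF addition and multiplication rules listed in this module are the arithmetic foundation of BCH and Reed-Solomon codes. A minimal sketch (illustrative only) implements GF(2³) with the primitive polynomial x³ + x + 1: addition is XOR, and multiplication reduces products modulo the polynomial; the powers of a primitive element enumerate every nonzero field element:

```python
# Illustrative sketch: arithmetic in GF(2^3) with primitive polynomial
# x^3 + x + 1 (0b1011). These field rules underlie BCH and Reed-Solomon codes.
PRIM = 0b1011  # x^3 + x + 1

def gf_add(a, b):
    # characteristic-2 field: addition and subtraction are both XOR
    return a ^ b

def gf_mul(a, b):
    # shift-and-add multiplication with reduction modulo the primitive poly
    result = 0
    while b:
        if b & 1:
            result ^= a
        b >>= 1
        a <<= 1
        if a & 0b1000:   # degree reached 3: reduce
            a ^= PRIM
    return result

# alpha = x (0b010) is primitive: its powers cycle through all 7 nonzero
# elements of GF(8) before repeating
alpha, powers, x = 0b010, [], 1
for _ in range(7):
    powers.append(x)
    x = gf_mul(x, alpha)
print(powers)  # -> [1, 2, 4, 3, 6, 7, 5]
```

The fact that α generates the full multiplicative group is exactly what makes it usable for defining the roots of BCH and Reed-Solomon generator polynomials.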