Estimation of Specific Parameters
Estimation of Single Specific Parameter
This chapter focuses on the estimation of key parameters, including amplitude, phase, time of arrival (delay), and frequency.
Both individual parameter estimation and joint estimation of multiple parameters are explored in depth.
Building on the foundational concepts introduced in Chapter 3 (Introduction to Estimation), this chapter applies general estimation theory to specific signal parameters such as amplitude and phase, considering cases where the noise is white Gaussian.
The emphasis is on Maximum A Posteriori (MAP) and Maximum Likelihood (ML) estimation techniques.
We begin by examining the case of a signal within additive white Gaussian noise (WGN), where the parameter is embedded in the signal.
Following this, we specialize these results to signal amplitude estimation within WGN for both coherent and noncoherent signals, presenting both MAP and ML estimation approaches.
Note that noncoherent estimation refers to estimation with an unknown signal phase.
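As a minimal numerical sketch of the coherent case (the discrete-time model x[n] = a·s[n] + w[n], the signal shape, and all values below are illustrative assumptions, not taken from the text), the ML amplitude estimate for a known signal in WGN reduces to a correlator normalized by the signal energy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed discrete-time model: x[n] = a * s[n] + w[n], w[n] white Gaussian.
n = np.arange(100)
s = np.cos(2 * np.pi * 0.05 * n)        # known (coherent) signal shape
a_true, sigma = 2.0, 1.0                # illustrative amplitude and noise std

x = a_true * s + sigma * rng.standard_normal(n.size)

# ML amplitude estimate: correlate the observation with the known signal
# and normalize by the signal energy.
a_ml = (s @ x) / (s @ s)
print(a_ml)
```

Under this assumed model the estimator is unbiased with variance sigma**2 / (s @ s), which is also its Cramér-Rao bound.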
Subsequent sections cover phase and time-delay estimation in WGN, as well as an initial approach to frequency estimation, focusing on estimating a single frequency in WGN.
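As an illustrative sketch of the single-frequency case (the tone parameters, sample size, and zero-padding factor below are assumptions for the example), the ML frequency estimate is well approximated by the location of the periodogram peak, and with the frequency fixed the ML phase estimate follows from the quadrature correlators:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed model: single tone in WGN, x[n] = A*cos(2*pi*f0*n + phi) + w[n].
N = 256
n = np.arange(N)
A, f0, phi, sigma = 1.0, 0.12, 0.7, 1.0
x = A * np.cos(2 * np.pi * f0 * n + phi) + sigma * rng.standard_normal(N)

# Frequency estimate: peak of the periodogram, on a zero-padded FFT grid
# for finer frequency resolution.
nfft = 8 * N
periodogram = np.abs(np.fft.rfft(x, nfft)) ** 2 / N
freqs = np.fft.rfftfreq(nfft)            # frequencies in cycles/sample
f_hat = freqs[np.argmax(periodogram)]

# Phase estimate at the estimated frequency, from the quadrature correlators.
c = x @ np.cos(2 * np.pi * f_hat * n)
q = x @ np.sin(2 * np.pi * f_hat * n)
phi_hat = np.arctan2(-q, c)
print(f_hat, phi_hat)
```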
In more typical scenarios, the exact number of signals and the noise characteristics may be unknown.
The chapter then addresses the simultaneous estimation of multiple parameters in WGN (time permitting).
The Cramér-Rao bound is derived, expressed in terms of the Fisher information matrix, whose inverse provides insights into the covariance elements of the estimators.
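In standard notation (not reproduced from the source), the bound takes the following form:

```latex
% Fisher information matrix for the parameter vector \theta
[\mathbf{J}(\boldsymbol{\theta})]_{ij}
   = -\,E\!\left[ \frac{\partial^{2} \ln p(\mathbf{x};\boldsymbol{\theta})}
                       {\partial\theta_{i}\,\partial\theta_{j}} \right]

% Cramer-Rao bound: the estimator covariance is bounded below by the inverse
% FIM, so the diagonal of J^{-1} bounds the individual estimator variances.
\operatorname{cov}(\hat{\boldsymbol{\theta}}) \succeq \mathbf{J}^{-1}(\boldsymbol{\theta}),
\qquad
\operatorname{var}(\hat{\theta}_{i}) \ge \left[ \mathbf{J}^{-1}(\boldsymbol{\theta}) \right]_{ii}
```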
Estimation of Multiple Parameters
Next, the single-parameter estimation problem introduced in the previous sections is extended to the multiple-parameter case, where almost all of the operations are linear.
Both ML (Maximum Likelihood) and MAP (Maximum A Posteriori) estimates are computed using a discrete linear observation model that is constructed from the parameters to be estimated.
The ML estimate is computed for the general case in which the noise is zero mean but not necessarily white, and is characterized by its covariance matrix.
From the Cramér-Rao bound, the ML estimate is found to be unbiased and minimum variance.
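A minimal sketch of this computation, assuming a hypothetical linear Gaussian model x = Hθ + w with exponentially correlated noise (the observation matrix, covariance, and values below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed linear observation model: x = H @ theta + w, w ~ N(0, C),
# where C is not necessarily a scaled identity (colored noise).
N = 50
H = np.column_stack([np.ones(N), np.arange(N)])        # observation matrix
theta_true = np.array([1.0, 0.3])

# Illustrative exponentially correlated noise covariance.
rho = 0.8
C = 0.5 * rho ** np.abs(np.subtract.outer(np.arange(N), np.arange(N)))
x = H @ theta_true + rng.multivariate_normal(np.zeros(N), C)

# ML estimate for the linear Gaussian model (weighted least squares):
#   theta_ml = (H^T C^-1 H)^-1 H^T C^-1 x
Ci_H = np.linalg.solve(C, H)          # C^-1 H
J = H.T @ Ci_H                        # Fisher information matrix for this model
theta_ml = np.linalg.solve(J, Ci_H.T @ x)
crb = np.linalg.inv(J)                # inverse FIM = Cramér-Rao bound
print(theta_ml, np.diag(crb))
```

For this assumed model, H.T @ C^-1 @ H is the Fisher information matrix, so the diagonal of its inverse gives the Cramér-Rao variances that the ML estimate attains.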
We consider the case where the parameters are random with a Gaussian probability distribution.
The Fisher information matrix is then derived, and the MAP estimate is shown to be unbiased and minimum variance.
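A corresponding sketch for the MAP case, assuming a Gaussian prior on the parameters (the prior mean, prior covariance, and noise covariance below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

# Same assumed linear Gaussian model x = H @ theta + w, but theta is now
# random with a Gaussian prior theta ~ N(mu, Sigma).
N = 50
H = np.column_stack([np.ones(N), np.arange(N)])
C = 0.5 * np.eye(N)                   # noise covariance (white here)
mu = np.array([0.0, 0.0])             # prior mean
Sigma = np.diag([4.0, 1.0])           # prior covariance

theta = rng.multivariate_normal(mu, Sigma)
x = H @ theta + rng.multivariate_normal(np.zeros(N), C)

# MAP estimate for the linear Gaussian model:
#   theta_map = (H^T C^-1 H + Sigma^-1)^-1 (H^T C^-1 x + Sigma^-1 mu)
HtCi = H.T @ np.linalg.inv(C)
A = HtCi @ H + np.linalg.inv(Sigma)
theta_map = np.linalg.solve(A, HtCi @ x + np.linalg.solve(Sigma, mu))
print(theta, theta_map)
```

As the prior covariance grows, the prior terms vanish and the MAP estimate reduces to the ML estimate of the previous sketch.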
Modern estimation techniques are often concerned with providing estimates before all of the data have been observed. This motivates the discussion of sequential estimation in Gaussian noise, in which the estimate is updated after each new observation, using only the current observation and the previous estimate, rather than waiting for all observations to arrive.
It is pointed out that this formulation is actually a simple version of the Kalman filter.
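A minimal sketch of one such recursion, assuming scalar observations of a constant parameter vector (the model and values are illustrative); each step uses only the newest observation together with the previous estimate and its covariance, which is the form that specializes the Kalman filter to a non-varying parameter:

```python
import numpy as np

rng = np.random.default_rng(4)

# Assumed model: scalar observations x_k = h_k^T theta + w_k of a constant
# parameter vector theta, arriving one at a time.
theta_true = np.array([1.0, -0.5])
sigma2 = 0.25                          # measurement noise variance

theta_hat = np.zeros(2)                # prior estimate
P = 10.0 * np.eye(2)                   # prior estimate covariance

for k in range(200):
    h = np.array([1.0, np.cos(0.1 * k)])                      # known observation vector
    x_k = h @ theta_true + np.sqrt(sigma2) * rng.standard_normal()

    # Gain, then update estimate and covariance from this observation alone.
    K = P @ h / (h @ P @ h + sigma2)
    theta_hat = theta_hat + K * (x_k - h @ theta_hat)
    P = P - np.outer(K, h) @ P

print(theta_hat)   # approaches theta_true as observations accumulate
```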
References
The contents of the sections in this chapter are based on the following materials.
T. Schonhoff and A. Giordano, Detection and Estimation Theory and its Applications, Prentice Hall, 2006, Chapters 11 and 12.