# Random Processes

## Introduction to Random Processes

**Definition: Random Process**
A random process is a function that depends on both the elements of a sample space, \( S \), and an independent variable, \( t \). Given an experiment \( E \) with sample space \( S \), a random process \( X(t) \) maps each possible outcome \( \xi \in S \) to a function of \( t \), denoted as \( x(t, \xi) \), according to a specific rule.
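The mapping from outcomes to sample functions can be sketched numerically. A minimal sketch, assuming the random-phase sine wave used later in this section with illustrative values \( a = 1 \) and \( f_0 = 5 \) Hz: each draw of an outcome \( \xi \) (here, a phase) yields one sample function \( x(t, \xi) \).

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 201)          # the independent variable t

def sample_function(rng, t, a=1.0, f0=5.0):
    """Draw one outcome xi from S and return its sample function x(t, xi)."""
    xi = rng.uniform(0.0, 2.0 * np.pi)  # outcome xi: a random phase (assumed rule)
    return a * np.sin(2.0 * np.pi * f0 * t + xi)

# An ensemble of sample functions: one function of t per outcome
ensemble = np.array([sample_function(rng, t) for _ in range(4)])
print(ensemble.shape)  # 4 sample functions, each on 201 time points
```

Each row of `ensemble` is one realization of the process; the process itself is the whole family of such functions together with their probabilities.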
## Mean and Autocorrelation Functions

For a random process \( X(t) \), the mean function \( m_x(t) \) and the autocorrelation function \( R_{XX}(t_1, t_2) \) are defined as:

\[ m_x(t) = \mathbb{E}[X(t)], \qquad R_{XX}(t_1, t_2) = \mathbb{E}\big[X(t_1)\, X^*(t_2)\big], \]
where \( R_{XX}(t_1, t_2) \) characterizes the correlation between values of the process at different time instances.
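Both moments can be estimated in practice by averaging across many independent realizations (an ensemble average). A sketch, assuming the random-phase sine-wave process of the example below with \( a = 1 \) and \( f_0 = 5 \) Hz:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 101)
N = 20000                                     # number of realizations

theta = rng.uniform(0.0, 2.0 * np.pi, size=(N, 1))
X = np.sin(2.0 * np.pi * 5.0 * t + theta)     # N sample functions (a = 1, f0 = 5)

m_hat = X.mean(axis=0)                        # ensemble estimate of m_x(t)
R_hat = (X.T @ X) / N                         # estimate of R_XX(t1, t2) on the grid

print(np.abs(m_hat).max())                    # near 0: the mean is (about) zero
print(R_hat[0, 0])                            # near 0.5 = a^2/2 at t1 = t2
```

The diagonal `R_hat[i, i]` estimates the average power \( \mathbb{E}[|X(t_i)|^2] \) of the process at each time point.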
## Wide-Sense Stationary (WSS) Random Processes

**Definition: WSS Random Process**

A random process \( X(t) \) is said to be wide-sense stationary (WSS) if it satisfies the following conditions:

1. The mean \( m_x(t) = m_x \) is constant over time.
2. The autocorrelation function depends only on the time difference \( \tau = t_1 - t_2 \), i.e.,
   \[ R_{XX}(t_1, t_2) = R_{XX}(\tau). \]
### Properties of WSS Processes

For a WSS process, the autocorrelation function satisfies the conjugate (Hermitian) symmetry

\[ R_{XX}(-\tau) = R^*_{XX}(\tau). \]

A complex-valued process is WSS if and only if its real and imaginary components are jointly WSS.
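For a real-valued process the conjugation does nothing, so the property reduces to even symmetry, \( R_{XX}(-\tau) = R_{XX}(\tau) \). A quick numerical check using a sample autocorrelation (random-phase sine with assumed normalized frequency \( f_0 = 0.05 \)):

```python
import numpy as np

rng = np.random.default_rng(2)
n, f0 = 4096, 0.05
x = np.sin(2.0 * np.pi * f0 * np.arange(n) + rng.uniform(0.0, 2.0 * np.pi))

def autocorr(x, tau):
    """Biased sample autocorrelation R_hat(tau); handles positive and negative lags."""
    n = len(x)
    if tau >= 0:
        return float(np.dot(x[: n - tau], x[tau:]) / n)
    return float(np.dot(x[-tau:], x[: n + tau]) / n)

# For a real-valued process, R(-tau) = R(tau)
for tau in (1, 5, 20):
    print(abs(autocorr(x, tau) - autocorr(x, -tau)) < 1e-12)
```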
## Power Spectral Density (PSD)

**Definition: Power Spectral Density**

The power spectral density (PSD), also known as the power spectrum, of a wide-sense stationary (WSS) random process \( X(t) \) is a function \( S_X(f) \) that describes how the power of the process is distributed over frequency.

**Unit of PSD**

The unit of power spectral density is watts per hertz (W/Hz).
### Wiener-Khinchin Theorem

The Wiener-Khinchin theorem states that for a WSS random process, the power spectral density is the Fourier transform of the autocorrelation function \( R_{XX}(\tau) \), given by:

\[ S_X(f) = \int_{-\infty}^{\infty} R_{XX}(\tau)\, e^{-j 2\pi f \tau}\, d\tau. \]
This theorem establishes a fundamental connection between the time-domain and frequency-domain representations of a stationary random process, showing that the PSD provides insight into the frequency content of the process.
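The frequency-domain view can be illustrated with a periodogram, a basic PSD estimate. For a random-phase sine wave the theoretical PSD concentrates its power at \( \pm f_0 \), so the estimated spectrum should peak near \( f_0 \). A sketch with assumed values \( f_s = 100 \) Hz and \( f_0 = 5 \) Hz:

```python
import numpy as np

rng = np.random.default_rng(3)
n, fs, f0, a = 4096, 100.0, 5.0, 1.0
t = np.arange(n) / fs
x = a * np.sin(2.0 * np.pi * f0 * t + rng.uniform(0.0, 2.0 * np.pi))

# Periodogram: |FFT|^2 scaled so the estimate has units of W/Hz
X = np.fft.rfft(x)
psd = (np.abs(X) ** 2) / (fs * n)
freqs = np.fft.rfftfreq(n, d=1.0 / fs)

f_peak = freqs[np.argmax(psd)]
print(f_peak)   # close to f0 = 5.0 Hz
```

The periodogram is a noisy estimator in general; averaging over segments or realizations (as in Welch's method) gives a smoother estimate.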
## Example: Sine-Wave Random Process

This example illustrates a simple case of a random process in which randomness enters only through the phase, while the amplitude remains fixed.

Consider a sine-wave random process whose only random quantity is the phase \( \Theta \), uniformly distributed over the interval \([0, 2\pi]\). That is,

\[ X(t) = a \sin(2\pi f_0 t + \Theta), \]

where the amplitude \( a \) and frequency \( f_0 \) are fixed (non-random) constants.
### Mean Function

The mean function of the process is given by:

\[ m_x(t) = \mathbb{E}[X(t)] = \mathbb{E}\big[a \sin(2\pi f_0 t + \Theta)\big]. \]

Since \( \Theta \) follows a uniform distribution over \([0, 2\pi]\), its probability density function (PDF) is:

\[ f_\Theta(\theta) = \frac{1}{2\pi}, \quad 0 \le \theta \le 2\pi. \]

Substituting this into the expectation,

\[ m_x(t) = \int_0^{2\pi} a \sin(2\pi f_0 t + \theta)\, \frac{1}{2\pi}\, d\theta. \]

Evaluating the integral,

\[ m_x(t) = \frac{a}{2\pi} \Big[ -\cos(2\pi f_0 t + \theta) \Big]_0^{2\pi} = 0, \]

since the cosine is \( 2\pi \)-periodic in \( \theta \). Thus, the mean function is constant and equal to zero.
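A quick Monte Carlo check of this result: averaging \( a \sin(2\pi f_0 t + \Theta) \) over many phase draws should give (approximately) zero at any fixed \( t \). The constants below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(5)
a, f0, t = 2.0, 5.0, 0.37               # assumed (arbitrary) constants
theta = rng.uniform(0.0, 2.0 * np.pi, 1_000_000)

# Monte Carlo estimate of E[a sin(2*pi*f0*t + Theta)] at the fixed time t
m_hat = np.mean(a * np.sin(2.0 * np.pi * f0 * t + theta))
print(abs(m_hat) < 0.02)   # True: the mean is zero regardless of t
```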
### Autocorrelation Function

For the sine-wave random process with a random phase,

\[ X(t) = a \sin(2\pi f_0 t + \Theta), \]

the autocorrelation function is given by:

\[ R_{XX}(t_1, t_2) = \mathbb{E}[X(t_1) X(t_2)] = a^2\, \mathbb{E}\big[\sin(2\pi f_0 t_1 + \Theta)\sin(2\pi f_0 t_2 + \Theta)\big]. \]
**Simplification Using a Trigonometric Identity**

To evaluate the expectation, we use the trigonometric identity:

\[ \sin A \sin B = \frac{1}{2}\big[\cos(A - B) - \cos(A + B)\big]. \]

Applying this identity with \( A = 2\pi f_0 t_1 + \Theta \) and \( B = 2\pi f_0 t_2 + \Theta \),

\[ R_{XX}(t_1, t_2) = \frac{a^2}{2} \cos\big(2\pi f_0 (t_1 - t_2)\big) - \frac{a^2}{2}\, \mathbb{E}\Big[\cos\big(2\pi f_0 (t_1 + t_2) + 2\Theta\big)\Big]. \]

Since \( \Theta \) is uniformly distributed over \([0, 2\pi]\), the expectation of the second term evaluates to zero, leaving:

\[ R_{XX}(t_1, t_2) = \frac{a^2}{2} \cos\big(2\pi f_0 (t_1 - t_2)\big). \]
**Final Autocorrelation Function**

By setting \( \tau = t_2 - t_1 \) and using the evenness of the cosine, the autocorrelation function can be rewritten as:

\[ R_{XX}(\tau) = \frac{a^2}{2} \cos(2\pi f_0 \tau). \]
**Key Observation**

The autocorrelation function depends only on the time difference \( \tau \), not on the absolute time instants \( t_1 \) and \( t_2 \). Together with the constant (zero) mean found above, this confirms that the process is wide-sense stationary (WSS).
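This stationarity can be verified numerically: estimating the correlation at the same lag \( \tau \) but different absolute start times should give (approximately) the same value, matching \( \frac{a^2}{2}\cos(2\pi f_0 \tau) \). A Monte Carlo sketch with assumed \( a = 1 \) and \( f_0 = 5 \) Hz:

```python
import numpy as np

rng = np.random.default_rng(4)
a, f0, N = 1.0, 5.0, 200_000
theta = rng.uniform(0.0, 2.0 * np.pi, N)

def R_hat(t1, t2):
    """Ensemble-average estimate of E[X(t1) X(t2)] over N phase draws."""
    x1 = a * np.sin(2.0 * np.pi * f0 * t1 + theta)
    x2 = a * np.sin(2.0 * np.pi * f0 * t2 + theta)
    return np.mean(x1 * x2)

tau = 0.03
# Same lag tau, different absolute times -> (approximately) the same correlation
r1 = R_hat(0.10, 0.10 + tau)
r2 = R_hat(0.55, 0.55 + tau)
r_theory = (a ** 2 / 2.0) * np.cos(2.0 * np.pi * f0 * tau)
print(r1, r2, r_theory)   # all three agree to Monte Carlo accuracy
```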
This result also prompts a natural question: What would happen if we considered a cosine-wave random process instead?