Random Processes#

Introduction to Random Processes#

Definition: Random Process
A random process is a function that depends on both the elements of a sample space, \( S \), and an independent variable, \( t \). Given an experiment \( E \) with sample space \( S \), a random process \( X(t) \) maps each possible outcome \( \xi \in S \) to a function of \( t \), denoted as \( x(t, \xi) \), according to a specific rule.

Mean and Autocorrelation Function#

For a random process \( X(t) \), the mean function \( m_x(t) \) and the autocorrelation function \( R_{XX}(t_1, t_2) \) are defined as:

\[ m_x(t) = \mathbb{E}[X(t)] \]
\[ R_{XX}(t_1, t_2) = \mathbb{E}[X(t_1)X^*(t_2)] \]

where \( R_{XX}(t_1, t_2) \) characterizes the correlation between values of the process at different time instances.
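As a numerical illustration (not part of the definitions above), these ensemble statistics can be estimated by averaging over many realizations. The specific process below, \( X(t) = A t \) with \( A \) uniform on \([0, 1]\), along with the sample count and time grid, are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative process: X(t) = A*t with A ~ Uniform(0, 1) (a random-slope line).
# Ensemble statistics are averages across realizations (axis 0), not over time.
n_realizations = 200_000
t = np.linspace(0.0, 1.0, 5)               # a small time grid
A = rng.uniform(0.0, 1.0, size=(n_realizations, 1))
X = A * t                                   # each row is one sample function x(t, xi)

# Mean function m_x(t) = E[X(t)]  (theory here: t/2)
m_hat = X.mean(axis=0)

# Autocorrelation R_XX(t1, t2) = E[X(t1) X(t2)]  (theory here: E[A^2] t1 t2 = t1 t2 / 3)
R_hat = (X.T @ X) / n_realizations

print(m_hat)        # ≈ t/2
print(R_hat[4, 4])  # ≈ 1/3 at t1 = t2 = 1
```

Note that the estimated autocorrelation matrix is symmetric, as expected for a real-valued process.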

Wide-Sense Stationary (WSS) Random Processes#

Definition: WSS Random Process
A random process \( X(t) \) is said to be wide-sense stationary (WSS) if it satisfies the following conditions:

  1. The mean \( m_x(t) \) is constant over time.

  2. The autocorrelation function depends only on the time difference \( \tau = t_1 - t_2 \), i.e.,

    \[ R_{XX}(t_1, t_2) = R_{XX}(\tau). \]

Properties of WSS Processes#

  • For a WSS process, the autocorrelation function satisfies:

    \[ R_{XX}(-\tau) = R^*_{XX}(\tau). \]
  • A complex-valued process is WSS if its real and imaginary components are jointly WSS.
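The conjugate-symmetry property can be checked numerically. As a sketch, consider the illustrative complex WSS process \( X(t) = e^{j(\omega_0 t + \Theta)} \) with \( \Theta \) uniform on \([0, 2\pi)\), whose autocorrelation is \( R_{XX}(\tau) = e^{j \omega_0 \tau} \) (the choice of process, \( \omega_0 \), and sample size are assumptions for this demo):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative complex WSS process: X(t) = exp(j*(w0*t + Theta)), Theta ~ U[0, 2*pi).
# Theory: R_XX(tau) = E[X(t + tau) X*(t)] = exp(j*w0*tau), so R(-tau) = conj(R(tau)).
w0 = 2.0
n = 100_000
theta = rng.uniform(0.0, 2.0 * np.pi, size=n)

def R_hat(tau):
    """Monte Carlo estimate of R_XX(tau), averaging over the random phase."""
    t = 0.0  # any fixed t gives the same answer for a WSS process
    x1 = np.exp(1j * (w0 * (t + tau) + theta))
    x2 = np.exp(1j * (w0 * t + theta))
    return np.mean(x1 * np.conj(x2))

tau = 0.7
print(R_hat(tau), np.conj(R_hat(-tau)))  # the two values should agree
```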

Power Spectral Density (PSD)#

Definition: Power Spectral Density
The power spectral density (PSD), also known as the power spectrum, of a wide-sense stationary (WSS) random process \( X(t) \) is a function \( S_X(f) \) that describes the distribution of power as a function of frequency.

Unit of PSD
The unit of power spectral density is watts per hertz (W/Hz).

Wiener-Khinchin Theorem#

The Wiener-Khinchin theorem states that for a WSS random process, the power spectral density is the Fourier transform of the autocorrelation function \( R_{XX}(\tau) \), given by:

\[ S_X(f) = \mathcal{F} \{ R_{XX}(\tau) \}. \]

This theorem establishes a fundamental connection between the time-domain and frequency-domain representations of a stationary random process, showing that the PSD provides insight into the frequency content of the process.
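A minimal discrete-time sketch of this relation, using white noise as the test process (the noise level, record length, and trial count are illustrative assumptions): for white noise the autocorrelation is an impulse, \( R_{XX}[k] = \sigma^2 \delta[k] \), so its Fourier transform, the PSD, should be flat at \( \sigma^2 \).

```python
import numpy as np

rng = np.random.default_rng(2)

# Discrete-time check of the Wiener-Khinchin relation for white noise:
# R_XX[k] = sigma^2 * delta[k], so the PSD should be flat at sigma^2.
sigma = 1.5
n, n_trials = 256, 2000
x = rng.normal(0.0, sigma, size=(n_trials, n))

# Route 1: averaged periodogram, a direct estimate of S_X(f)
S_periodogram = np.mean(np.abs(np.fft.fft(x, axis=1)) ** 2, axis=0) / n

# Route 2: estimate R_XX[k] by circular correlation, then take its DFT
X = np.fft.fft(x, axis=1)
R_hat = np.mean(np.fft.ifft(np.abs(X) ** 2, axis=1).real, axis=0) / n
S_from_R = np.fft.fft(R_hat).real

print(S_periodogram.mean(), S_from_R.mean())  # both ≈ sigma^2 = 2.25
```

Both routes give the same flat spectrum, which is the theorem at work: the DFT of the (circular) autocorrelation estimate equals the periodogram.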

Example: Sine-Wave Random Process#

This example illustrates a simple case of a random process where randomness is introduced only through the phase variation while the amplitude remains fixed.

Consider a sine-wave random process where the random variable is the phase, \( \Theta \), which is uniformly distributed over the interval \([0, 2\pi]\). That is,

\[ X(t) = a \sin(\omega_0 t + \Theta), \]

where the amplitude \( a \) is a fixed (non-random) constant.

Mean Function#

The mean function of the process is given by:

\[ \mu_X(t) = \mathbb{E}[X(t)] = \mathbb{E}[a \sin(\omega_0 t + \Theta)]. \]

Since \( \Theta \) follows a uniform distribution over \([0, 2\pi]\), its probability density function (PDF) is:

\[ f_\Theta(\theta) = \frac{1}{2\pi}, \quad 0 \leq \theta < 2\pi. \]

Substituting this into the expectation,

\[ \mu_X(t) = a \int_{0}^{2\pi} \sin(\omega_0 t + \theta) \cdot \frac{1}{2\pi} d\theta. \]

Evaluating the integral (the sine integrates to zero over any full period of length \( 2\pi \)),

\[ \mu_X(t) = \frac{a}{2\pi} \int_{0}^{2\pi} \sin(\omega_0 t + \theta) \, d\theta = 0. \]

Thus, the mean function is constant and equal to zero.
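This zero-mean result is easy to confirm by Monte Carlo simulation. The values of \( a \), \( \omega_0 \), the sample size, and the test times below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# Check E[a*sin(w0*t + Theta)] = 0 for Theta ~ Uniform[0, 2*pi), at several times t.
a, w0 = 2.0, 5.0
theta = rng.uniform(0.0, 2.0 * np.pi, size=1_000_000)

means = []
for t in (0.0, 0.3, 1.7):
    m = np.mean(a * np.sin(w0 * t + theta))
    means.append(m)
    print(f"t = {t}: estimated mean = {m:+.5f}")  # ≈ 0 for every t
```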

Autocorrelation Function#

For the sine-wave random process with a random phase:

\[ X(t) = a \sin(\omega_0 t + \Theta), \]

the autocorrelation function is given by:

\[ R_{XX}(t_1, t_2) = \mathbb{E}[X(t_1)X(t_2)] = \mathbb{E}[a^2 \sin(\omega_0 t_1 + \Theta) \sin(\omega_0 t_2 + \Theta)]. \]

Simplification Using a Trigonometric Identity
To evaluate the expectation, we use the trigonometric identity:

\[ \sin(x) \sin(y) = \frac{1}{2} \cos(x - y) - \frac{1}{2} \cos(x + y). \]

Applying this identity with \( x = \omega_0 t_1 + \Theta \) and \( y = \omega_0 t_2 + \Theta \) (and using that the cosine is even),

\[ R_{XX}(t_1, t_2) = \frac{a^2}{2} \mathbb{E}[\cos(\omega_0 (t_2 - t_1))] - \frac{a^2}{2} \mathbb{E}[\cos(\omega_0 (t_1 + t_2) + 2\Theta)]. \]

Since \( \Theta \) is uniformly distributed over \([0, 2\pi]\), the argument \( 2\Theta \) sweeps two full periods of the cosine, so the expectation of the second term evaluates to zero, leaving:

\[ R_{XX}(t_1, t_2) = \frac{a^2}{2} \cos(\omega_0 (t_2 - t_1)). \]

Final Autocorrelation Function
By setting \( \tau = t_2 - t_1 \), the autocorrelation function can be rewritten as:

\[ \boxed{ R_{XX}(t, t + \tau) = \frac{a^2}{2} \cos(\omega_0 \tau). } \]

Key Observation

  • The autocorrelation function depends only on the time difference \( \tau \), rather than on absolute time values \( t_1 \) and \( t_2 \).

  • Together with the constant (zero) mean found earlier, this confirms that the process is wide-sense stationary (WSS): the mean is constant, and the autocorrelation depends only on \( \tau \), not on the absolute time \( t \).
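The boxed result can also be checked numerically. As a sketch (the values of \( a \), \( \omega_0 \), \( t \), \( \tau \), and the sample size are illustrative assumptions), a Monte Carlo average over the random phase should reproduce \( \frac{a^2}{2} \cos(\omega_0 \tau) \):

```python
import numpy as np

rng = np.random.default_rng(4)

# Check R_XX(t, t + tau) = (a^2 / 2) * cos(w0 * tau) by averaging over the phase.
a, w0 = 2.0, 5.0
theta = rng.uniform(0.0, 2.0 * np.pi, size=1_000_000)

t, tau = 0.4, 0.9  # illustrative values; any t gives the same answer (WSS)
x1 = a * np.sin(w0 * t + theta)
x2 = a * np.sin(w0 * (t + tau) + theta)

R_hat = np.mean(x1 * x2)
R_theory = (a ** 2 / 2) * np.cos(w0 * tau)
print(R_hat, R_theory)  # the estimate should be close to the theoretical value
```

Changing \( t \) while keeping \( \tau \) fixed leaves the estimate unchanged, which is exactly the WSS property observed above.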

This result also prompts a natural question: What would happen if we considered a cosine-wave random process instead?