## What’s in a “Noise”?
- As in everyday English, it can mean an unwanted signal of any kind; if two signals interfere with each other, each would consider the other to be noise.
- “Noise” also refers to a signal that contains components at many frequencies, so it lacks the harmonic structure of periodic signals.
This post is about the second kind.
## Uncorrelated noise
In uncorrelated uniform noise (UU noise), “uniform” means the signal contains random values drawn from a uniform distribution, and “uncorrelated” means the values are independent: one value provides no information about the others.
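As a rough illustration, here is a minimal NumPy sketch of generating UU noise; the sample count, amplitude range, and seed are arbitrary choices, not values from the text:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Uncorrelated uniform (UU) noise: independent samples drawn
# from a uniform distribution (here between -1 and 1).
n = 1024
uu_noise = rng.uniform(-1, 1, size=n)
```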
Relationship between power (the square of amplitude) and frequency: in the spectrum of UU noise, the power at every frequency is drawn from the same distribution; that is, the average power is the same at all frequencies. This is easier to see in the integrated spectrum of UU noise, a plot whose x-axis is frequency (Hz) and whose y-axis is the cumulative fraction of total power up to that frequency.
The integrated spectrum of UU noise is a straight line, which indicates that the power at all frequencies is constant, on average. Noise with equal power at all frequencies is called white noise, by analogy with light: an equal mixture of light at all visible frequencies is white.
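A sketch of how one might compute the power spectrum and integrated spectrum with NumPy; the sample count and the unit sample spacing are assumptions:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n = 1024
uu_noise = rng.uniform(-1, 1, size=n)

# One-sided spectrum of the real-valued signal.
spectrum = np.fft.rfft(uu_noise)
power = np.abs(spectrum) ** 2       # power = squared magnitude
freqs = np.fft.rfftfreq(n, d=1.0)   # frequencies, assuming unit sample spacing

# Integrated spectrum: cumulative fraction of total power up to each frequency.
# For white noise this curve is close to a straight line.
integrated = np.cumsum(power) / np.sum(power)
```

Plotting `integrated` against `freqs` gives the straight line described above.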
## Brownian noise
In Brownian noise, each value is the sum of the previous value and a random “step”. It is called “Brownian” by analogy with Brownian motion, which is often described as a “random walk”: a mathematical model of a path where the size of each step is drawn from a random distribution.
In a one-dimensional random walk, a particle moves up or down by a random amount at each time step, and its location at any point in time is the sum of all previous steps. That is also how Brownian noise is generated: compute the cumulative sum of a sequence of uncorrelated random steps, as in the sketch below.
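A minimal sketch of this construction, assuming uniform steps and an arbitrary length and seed:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n = 1024

# Uncorrelated random steps.
steps = rng.uniform(-1, 1, size=n)

# Brownian noise: each value is the previous value plus a random step,
# i.e. the cumulative sum of the steps.
brownian = np.cumsum(steps)
```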
For Brownian noise, the slope of the power spectrum on a log-log scale is -2, so we can write this relationship as: $$ \log P = k - 2\log f $$ where $P$ is power, $f$ is frequency, and $k$ is the intercept of the line. Exponentiating both sides yields: $$ P = K / f^2 $$ where $K$ is $e^k$, a constant. So power is proportional to $1/f^2$, which is characteristic of Brownian noise.
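One way to check this slope numerically, sketched with NumPy; the signal length and the use of a least-squares line fit are my own choices:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n = 4096
brownian = np.cumsum(rng.uniform(-1, 1, size=n))

power = np.abs(np.fft.rfft(brownian)) ** 2
freqs = np.fft.rfftfreq(n, d=1.0)

# Fit a straight line to log(power) versus log(frequency),
# skipping the zero-frequency (DC) component.
slope, intercept = np.polyfit(np.log(freqs[1:]), np.log(power[1:]), deg=1)
print(slope)   # should come out near -2 for Brownian noise
```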
Brownian noise is also called “red noise”, for the same reason that white noise is called “white”: if we combined visible light with power proportional to $1/f^2$, most of the power would be at the low-frequency end of the spectrum, which is red.
## Pink noise
More generally, we can synthesize noise with any exponent, $\beta$, in the power-frequency relationship: $$ P = K / f^{\beta} $$ When $\beta = 0$, power is constant at all frequencies, so the result is white noise. When $\beta = 2$, the result is red noise.
When $\beta$ is between 0 and 2, the result is between white and red noise, so it is called “pink noise”. One way to synthesize it is sketched below.
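A common approach is to start with a flat spectrum of random complex values and scale the amplitudes so that power falls off as $1/f^{\beta}$; this sketch and the helper name `pink_noise` are illustrative assumptions, not the only way to do it:

```python
import numpy as np

def pink_noise(n, beta=1.0, seed=0):
    """Synthesize noise whose power spectrum falls off as 1/f**beta."""
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(n, d=1.0)

    # Start with a flat (white) complex spectrum with random phases.
    spectrum = rng.normal(size=freqs.size) + 1j * rng.normal(size=freqs.size)

    # Power ~ 1/f**beta means amplitude ~ 1/f**(beta/2).
    scale = np.ones_like(freqs)
    scale[1:] = freqs[1:] ** (-beta / 2)   # leave the DC bin alone
    spectrum *= scale

    return np.fft.irfft(spectrum, n)

white = pink_noise(1024, beta=0)   # white noise
pink = pink_noise(1024, beta=1)    # pink noise
red = pink_noise(1024, beta=2)     # red (Brownian-like) noise
```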
## Gaussian noise
When people talk about “white noise”, they don’t always mean UU noise. In fact, more often they mean uncorrelated Gaussian noise (UG noise).
UG noise is similar in many ways to UU noise. The spectrum has equal power at all frequencies, on average, so UG noise is also white. And it has one other interesting property: the spectrum of UG noise is also UG noise. More precisely, the real and imaginary parts of the spectrum are uncorrelated Gaussian values, which can be checked with a normal probability plot.
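A small sketch of that check, with the plotting part left optional; the sample count and seed are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n = 4096

# Uncorrelated Gaussian (UG) noise.
ug_noise = rng.normal(size=n)
spectrum = np.fft.fft(ug_noise)

# The real and imaginary parts of the spectrum should themselves look
# like Gaussian noise; printing their standard deviations is only a
# quick look at their spread, the probability plot is the real test.
print(np.std(spectrum.real), np.std(spectrum.imag))

# Optional: a normal probability plot of the real part.
# from scipy import stats
# import matplotlib.pyplot as plt
# stats.probplot(spectrum.real, plot=plt)
# plt.show()
```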
In fact, by the Central Limit Theorem (CLT), the spectrum of almost any uncorrelated noise is approximately Gaussian, as long as the distribution has finite mean and standard deviation and the number of samples is large.
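To see the CLT at work, one could repeat the same check with noise from a clearly non-Gaussian distribution; this sketch uses an exponential distribution, which is just one possible choice:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n = 4096

# Uncorrelated noise from a non-Gaussian distribution with
# finite mean and standard deviation.
noise = rng.exponential(scale=1.0, size=n)
noise -= noise.mean()            # remove the DC offset

spectrum = np.fft.fft(noise)

# Each spectral component is a weighted sum of many independent samples,
# so by the CLT its real and imaginary parts should be approximately
# Gaussian even though the input values are not; a normal probability
# plot, as above, would show the bell shape.
print(np.std(spectrum.real), np.std(spectrum.imag))
```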