## Problem Statement:

Given a signal that is regularly sampled over time and is "noisy", how can the noise be reduced while minimizing changes to the underlying signal?

## Discussion:

A common problem in reconstructing data is elimination of noise. Noise can corrupt a signal through many means: quantization, measurement noise, errors in sampling time, sensor bias, sensor nonlinearities, signal cross coupling, etc. There are many methods which can be used to eliminate the noise on a signal. Some common approaches include use of a linear filter, Kalman filtering, Wiener filtering, construction of a custom optimization problem, and any number of ad-hoc approaches. Furthermore, the filtering of the signal can be done causally or noncausally. In a causal filter, the filtered value for a particular sample is determined using only that sample and the samples received before it. In acausal or noncausal filtering, all of the data before and after the sample can be used to determine the 'best' value for each sample.
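To make the causal/noncausal distinction concrete, here is a minimal sketch using a moving average (the simplest linear filter); the function names are illustrative, not part of any library:

```python
# Causal: average the current sample and the (window-1) samples before it.
# Noncausal: average a window centered on the current sample, so "future"
# samples are used too. Both shrink the window at the edges of the data.

def causal_moving_average(x, window):
    out = []
    for k in range(len(x)):
        lo = max(0, k - window + 1)
        out.append(sum(x[lo:k + 1]) / (k + 1 - lo))
    return out

def noncausal_moving_average(x, window):
    half = window // 2
    out = []
    for k in range(len(x)):
        lo = max(0, k - half)
        hi = min(len(x), k + half + 1)
        out.append(sum(x[lo:hi]) / (hi - lo))
    return out
```

Note how an isolated spike `[0, 0, 3, 0, 0]` is smeared only forward in time by the causal filter, but symmetrically in both directions by the noncausal one.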
One way to quickly filter a dataset without much effort is to use a Fourier transform. A Fourier transform is a way to decompose a signal into a sum of sine waves. The amplitude and phase associated with each sine wave is known as the spectrum of the signal. If the spectrum of the noise is away from the spectrum of the original signal, then the original signal can be recovered by taking a Fourier transform, filtering the Fourier transform, then using the inverse Fourier transform to reconstruct the signal.
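The decomposition can be seen directly: the FFT of a pure sine wave concentrates nearly all of its energy in the bins at plus and minus the sine's frequency. A small self-contained sketch (variable names chosen to mirror the script below):

```python
import numpy as np

dt = 0.01          # sample period [s]
f_signal = 6       # sine frequency [Hz]
t = np.arange(1000) * dt
s = np.sin(2 * np.pi * f_signal * t)

F = np.fft.fft(s)
freqs = np.fft.fftfreq(len(F), dt)

# the bin with the largest magnitude sits at +/- f_signal
peak_freq = abs(freqs[np.argmax(np.abs(F))])
```

With 1000 samples at 0.01 s, the frequency resolution is 0.1 Hz, so the 6 Hz tone lands exactly on a bin and `peak_freq` comes out at 6 Hz.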
Let's start with a simple example. Consider a signal consisting of a single sine wave, $$s(t)=\sin(\omega t)$$. Let the signal be subject to white noise which is added in during measurement, $$s_{measured}(t)=s(t)+n$$. Let $$F$$ be the Fourier transform of $$s_{measured}$$. Now, by setting the value of $$F$$ to zero for frequencies above and below $$\omega$$, the noise is reduced. Let $$F_{filtered}$$ be the filtered Fourier transform. Taking the inverse Fourier transform of $$F_{filtered}$$ yields $$s_{filtered}(t)$$. The code sample which follows illustrates these operations.

 """
Script to demonstrate the use of Fourier Transforms to acausally filter
a signal.

"""

__author__ = 'Ed Tate'
__email__  = 'edtategmail-dot-com'
__website__ = 'exnumerus.blogspot.com'

from pylab import *

# setup the problem
num_samples  = 1000 # number of samples

# generate an ideal signal
f_signal  = 6   # signal frequency  in Hz
dt = 0.01 # sample timing in sec
p  = 30   # 30 degrees of phase shift
a  = 1    # signal amplitude
s = [a*sin((2*pi)*f_signal*k*dt) for k in range(0,num_samples)]
s_time = [k*dt for k in range(0,num_samples)]

# simulate measurement noise
from random import gauss
mu = 0
sigma = 2
n = [gauss(mu,sigma) for k in range(0,num_samples)]

# measured signal
s_measured = [ss+nn for ss,nn in zip(s,n)]

# take the fourier transform of the data
F = fft(s_measured)

# calculate the frequencies for each FFT sample
f = fftfreq(len(F),dt)  # get sample frequency in cycles/sec (Hz)

# filter the Fourier transform
def filter_rule(x,freq):
band = 0.05
if abs(freq)>f_signal+band or abs(freq)<f_signal-band:
return 0
else:
return x

F_filtered = array([filter_rule(x,freq) for x,freq in zip(F,f)])

# reconstruct the filtered signal
s_filtered = ifft(F_filtered)

# get error
err = [abs(s1-s2) for s1,s2 in zip(s,s_filtered) ]

figure()
subplot(4,1,1)
plot(s_time,s)
ylabel('Original Signal')
xlabel('time [s]')

subplot(4,1,2)
plot(s_time,s_measured)
ylabel('Measured Signal')
xlabel('time [s]')

subplot(4,1,3)
semilogy(f,abs(F_filtered),'or')
semilogy(f,abs(F),'.')
legend(['Filtered Spectrum','Measured Spectrum',])
xlabel('frequency [Hz]')

subplot(4,1,4)
plot(s_time,s_filtered,'r')
plot(s_time,s,'b')
legend(['Filtered Signal','Original Signal'])
xlabel('time [s]')

show()


The output from this script shows the original signal, then the measured signal corrupted with noise, then the measured and filtered spectra, and finally the filtered signal overlaid on the original, which it recovers closely.
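The script computes a per-sample error `err` but never summarizes it. One way to quantify the recovery is the root-mean-square error; a sketch using stand-in data (in the script itself, `s` and `s_filtered` would come from the filtering above):

```python
import math

# stand-in for the script's original and filtered signals:
# a sine plus a small constant residual of 0.01
s = [math.sin(0.1 * k) for k in range(100)]
s_filtered = [v + 0.01 for v in s]

err = [abs(s1 - s2) for s1, s2 in zip(s, s_filtered)]
rms = math.sqrt(sum(e * e for e in err) / len(err))
```

For the fabricated residual above, `rms` is 0.01; for the script's output it gives a single number to compare filter settings (e.g. different `band` widths) against each other.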

## Test Configuration:
• win7
• PythonXY 2.7.2.1