SINE WAVE ORTHOGONALITY
SUITABILITY OF SINE WAVES AS COORDINATE BASES

Discussing the Discrete Fourier Transform in terms of physical metaphors like “detectors” and “bins”, or even statistical notions like correlation, is a bit disingenuous. In some sense, this style of discourse takes a bunch of fascinating mathematical relationships and reduces them to constructions which are amenable to thought-by-analogy but far less mathematically profound.1 It’s probably better to recognize that sine waves possess a number of (sometimes stunning) properties which make them particularly well suited for use as the bases of a coordinate system. I don’t want to thoroughly investigate coordinate systems and bases in this primer, but I’d like to provide some related visualizations which you generally will not find in the relevant literature.

Orthogonality is an important property for the basis vectors of a coordinate system. Two vectors (signals) are orthogonal to one another if their dot product is zero. Geometrically, orthogonal vectors point at right angles to one another. The natural bases - the default bases for the Cartesian system - are orthogonal to one another. In a three-dimensional Cartesian system, the three natural basis vectors are,

[1, 0, 0]
[0, 1, 0]
[0, 0, 1]

It’s easy to show that these vectors are orthogonal to one another. We can prove it by writing out all of the dot products, or simply by noticing that the dot product between any two of the vectors must be zero, since every 1 will be multiplied by a 0.
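That observation is trivial to confirm in code. A quick sketch (assuming NumPy) writes out all three pairwise dot products:

```python
import numpy as np

# The three natural basis vectors of a three-dimensional Cartesian system.
e1 = np.array([1, 0, 0])
e2 = np.array([0, 1, 0])
e3 = np.array([0, 0, 1])

# Every pairwise dot product is zero, so the vectors are mutually orthogonal.
print(np.dot(e1, e2), np.dot(e1, e3), np.dot(e2, e3))  # 0 0 0
```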

A bit more surprisingly, any two sine waves whose frequencies are distinct integer multiples of a common fundamental are also orthogonal, regardless of their phases, provided the dot product is taken over a whole number of periods of that fundamental. This statement is harder to validate mentally. You can take it on faith, or look at a few examples before digging into a proof. Figure 1 allows you to compute the dot product between sine and cosine pairs at different frequencies. Notice that the dot product is zero whenever the two waves are at different frequencies.
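If you'd rather not take it on faith, the claim is also easy to check numerically. The sketch below (a NumPy example with arbitrarily chosen phases, sampled over one full period of the fundamental) compares dot products at different and at identical harmonic frequencies:

```python
import numpy as np

N = 1000                      # samples across one period of the fundamental
t = np.arange(N) / N          # one full period of a 1 Hz fundamental

def wave(freq, phase):
    """A sinusoid at an integer multiple of the fundamental frequency."""
    return np.sin(2 * np.pi * freq * t + phase)

# Distinct harmonics: the dot product is zero (up to floating point)
# no matter which phases we pick.
print(np.dot(wave(3, 0.7), wave(5, 2.1)))   # ~0

# Same frequency, similar phase: the dot product is large, not zero.
print(np.dot(wave(3, 0.0), wave(3, 0.1)))
```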


Figure 1.  Orthogonality of Sine and Cosine













1. This is a bit of a non sequitur since I know next to nothing about Douglas Hofstadter, but I really enjoyed his talk titled "Analogy as the Core of Cognition".

SUMMING SINE WAVES
ANOTHER CURIOUS PROPERTY

If we take two sine waves with the same frequency and sum them together, the result will always be a sine wave with the exact same frequency. This is a somewhat curious result, and it holds even if the two sinusoids have completely different phases and amplitudes. Sinusoids are the only periodic signals with this property. It means that we can take any number of sine waves at a particular frequency, adjust their phases and amplitudes arbitrarily, sum them up, and still have a sine wave at the original frequency.
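This property falls out of phasor addition: each sinusoid A·sin(2πft + φ) corresponds to a complex number A·e^(iφ), and summing the phasors gives the amplitude and phase of the resulting sinusoid. A minimal numerical check, assuming NumPy and arbitrarily chosen amplitudes and phases:

```python
import numpy as np

t = np.linspace(0, 1, 1000)
f = 4.0                                   # shared frequency, in Hz

# Two sinusoids with the same frequency but different amplitudes and phases.
a1, p1 = 1.3, 0.4
a2, p2 = 0.8, 2.6
total = a1 * np.sin(2*np.pi*f*t + p1) + a2 * np.sin(2*np.pi*f*t + p2)

# Phasor addition predicts the amplitude and phase of the sum,
# which is itself a sinusoid at the same frequency f.
phasor = a1 * np.exp(1j*p1) + a2 * np.exp(1j*p2)
a, p = np.abs(phasor), np.angle(phasor)
predicted = a * np.sin(2*np.pi*f*t + p)

print(np.allclose(total, predicted))      # True
```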

Figure 2 demonstrates the summing of two sine waves with equal frequency. You can use the sliders to adjust the phase and amplitude of the second sine wave. Notice that the frequency of the sum is always equal to the frequency of the two input sines.



Figure 2.  Summing Sine Waves with Equal Frequency




Sine waves are utterly fascinating. Their peculiar nature makes them particularly well suited to act as the “atomic” components of complex signals. I’d encourage you to play around with sinusoids and the trigonometric identities. Successful signal processing practice requires being somewhat intellectually intimate with sine waves: you'll constantly use them as test signals and ground truths when writing and reasoning about signal processing algorithms.
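As a starting point for that kind of play, the product-to-sum identity sin(a)·sin(b) = ½[cos(a − b) − cos(a + b)], which drives the orthogonality result discussed earlier, can be checked numerically with a few lines (a NumPy sketch with an arbitrary fixed angle b):

```python
import numpy as np

a = np.linspace(0, 2 * np.pi, 500)
b = 1.234  # an arbitrary fixed angle

# Product-to-sum identity: sin(a)*sin(b) = 0.5*(cos(a-b) - cos(a+b)).
lhs = np.sin(a) * np.sin(b)
rhs = 0.5 * (np.cos(a - b) - np.cos(a + b))
print(np.allclose(lhs, rhs))  # True
```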