Definition of Sampling Frequency

Prerequisites

Sampling Period | \( T_{s} \)
Sampling Frequency | \( f_{s} \)

Description

This is the definition of sampling frequency: the number of times per second that a continuous (analog) signal is sampled.

\[\htmlClass{sdt-0000000055}{f_{s}} = \frac{1}{\htmlClass{sdt-0000000048}{T_{s}}}\]
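
For instance, if an analog-to-digital converter takes one sample every millisecond, so that \(T_{s} = 10^{-3}\ \mathrm{s}\), then

\[f_{s} = \frac{1}{10^{-3}\ \mathrm{s}} = 1000 \text{ samples per second}.\]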

Symbols Used:

\( T_{s} \)

This symbol represents sampling period, the amount of time between samples taken in an analog-to-digital converter.

\( f_{s} \)

This symbol represents sampling frequency, the frequency at which a continuous signal is sampled when converting it to a discrete signal.

Derivation

  1. Consider the definition of sampling frequency:

    The sampling frequency, \(f_{s}\), is the frequency at which a continuous signal is sampled when converting it to a discrete signal. It is defined as \(\frac{1}{\htmlClass{sdt-0000000048}{T_{s}}}\), with units of samples per second. To preserve all the information in a signal when discretizing it, the sampling frequency must be at least the Nyquist rate, i.e. twice the highest frequency component of the signal (a short numerical sketch after this derivation illustrates this).

  2. We can now consider the definition of sampling period:

    The sampling period, denoted by the symbol \(T_{s}\), is the amount of time between samples taken in an analog-to-digital converter, i.e. the number of seconds between consecutive samples of the continuous signal.

  3. Since a frequency counts events per unit of time, taking one sample every \(T_{s}\) seconds yields \(\frac{1}{T_{s}}\) samples per second. Therefore:
    \[\htmlClass{sdt-0000000055}{f_{s}} = \frac{1}{\htmlClass{sdt-0000000048}{T_{s}}}\]

as required.
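
To illustrate the relationship numerically, here is a minimal Python sketch, assuming NumPy is available; the 5 Hz sine wave and the 1 ms sampling period are illustrative values. It samples the signal with \(T_{s} = 1\ \mathrm{ms}\), which corresponds to \(f_{s} = 1/T_{s} = 1000\) samples per second, well above the Nyquist rate of \(2 \times 5 = 10\) Hz.

    import numpy as np

    # Illustrative values: a 5 Hz sine wave sampled once every millisecond.
    T_s = 1e-3                      # sampling period in seconds
    f_s = 1.0 / T_s                 # sampling frequency: f_s = 1 / T_s = 1000 samples per second
    f_signal = 5.0                  # highest frequency component of the continuous signal, in Hz

    # The Nyquist rate is twice the highest frequency in the signal;
    # f_s must be at least this large to preserve all the information.
    nyquist_rate = 2.0 * f_signal   # 10 Hz, far below f_s = 1000 Hz

    # Sample one second of the continuous signal x(t) = sin(2*pi*f_signal*t).
    t = np.arange(0.0, 1.0, T_s)    # sample instants spaced T_s apart
    x = np.sin(2.0 * np.pi * f_signal * t)

    print(f"f_s = {f_s:.0f} samples per second")                   # 1000
    print(f"samples taken in one second: {len(t)}")                # 1000
    print(f"f_s exceeds the Nyquist rate: {f_s >= nyquist_rate}")  # True

Changing \(T_{s}\) directly changes how many samples land in each second of signal, which is exactly what \(f_{s} = \frac{1}{T_{s}}\) expresses.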

Example

coming soon...