Source: Free On-Line Dictionary of Computing
wavelength division multiplexing
(WDM) {Multiplexing} several {Optical Carrier n} signals on a
single {optical fibre} by using different wavelengths (colours)
of {laser} light to carry different signals.
The device that joins the signals together is known as a
{multiplexor}, and the one that splits them apart is a
{demultiplexor}. With the right type of fibre you can have a
device that does both, which ought to be called a "mudem" but
isn't.
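As a rough illustration only (not an optical model), the
multiplexor/demultiplexor pair can be pictured in a few lines of
Python; the wavelength values used here are arbitrary examples:

    # Toy sketch: a "multiplexor" combines independent signals keyed
    # by wavelength; a "demultiplexor" picks one back out.  Wavelength
    # values (in nanometres) are illustrative only.
    def multiplex(signals_by_wavelength):
        return dict(signals_by_wavelength)    # one fibre, many colours

    def demultiplex(fibre, wavelength_nm):
        return fibre[wavelength_nm]

    fibre = multiplex({1550.1: "signal A", 1550.9: "signal B"})
    print(demultiplex(fibre, 1550.9))         # -> signal B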
The first WDM systems combined two signals and appeared around
1985. Modern systems can handle up to 128 signals and can
expand a basic 9.6 {Gbps} fibre system to a capacity of over
1000 Gbps.
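A quick arithmetic check of the figures quoted above, sketched in
Python purely for illustration:

    channels = 128          # signals a modern WDM system can carry
    per_channel_gbps = 9.6  # basic per-wavelength capacity quoted above
    print(channels * per_channel_gbps)  # 1228.8 Gbps, i.e. over 1000 Gbps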
WDM systems are popular with telecommunications companies
because they allow them to expand the capacity of their fibre
networks without digging up the road again. All they have to
do is upgrade the (de)multiplexors at each end. However,
these systems are expensive and complicated to run. There is
currently no {standard}, which makes it awkward to integrate
with older but more standard {SONET} systems.
Note that this term applies to an optical {carrier} (which is
typically described by its wavelength), whereas {frequency
division multiplexing} typically applies to a {radio} carrier
(which is more often described by frequency). However, since
wavelength and frequency are inversely proportional, and since
radio and light are both forms of electromagnetic radiation,
the distinction is somewhat arbitrary.
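For example, using the relation frequency = speed of light /
wavelength, a 1550 nm optical carrier (an assumed but typical WDM
wavelength) corresponds to roughly 193 THz; a minimal Python sketch:

    c = 299_792_458.0        # speed of light in m/s
    wavelength_m = 1550e-9   # 1550 nm, a typical WDM wavelength (assumed)
    frequency_thz = c / wavelength_m / 1e12
    print(round(frequency_thz, 1))   # about 193.4 THz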
See also {time division multiplexing}, {code division
multiplexing}.
[Is "wave division multiplexing", as in "dense wave division
multiplexing" (DWDM) just a trendy abbreviation?]
(2002-07-16)