Information Theory: The branch of applied mathematics involving the quantification of information.
Information: A message that consists of an ordered sequence of symbols, or the meaning that can be interpreted from it.
Claude E. Shannon first developed information theory in 1948 as part of his research into the fundamental limits of signal processing. This research was a purely statistical study that did not consider the content or meaning of the information. It also produced what is now the standard unit of information, the bit. The full concept involves a source of information that produces a message, which is then converted by a transmitter into a signal. During transmission, noise can enter the system and degrade the signal, causing information to be lost. Eventually the signal is picked up by a receiver, and the recovered message is delivered to the destination.
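The bit as a unit of information can be made concrete with Shannon's entropy formula, H = -Σ p·log₂(p), which measures the average information content of a message source in bits. The function below is a minimal illustrative sketch, not part of the original text:

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over symbol probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin flip carries exactly 1 bit of information.
print(entropy_bits([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so each flip carries less information.
print(entropy_bits([0.9, 0.1]))   # about 0.47 bits
```

The second result reflects Shannon's statistical view: the less uncertain a source is, the less information each message conveys.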
These basic concepts of information theory make possible much of what we take for granted today, including computers, the internet, cell phones, and things we don't even notice. Information is an important part of life in the twenty-first century, and in the form of DNA, information is absolutely necessary for life to exist at all.