The central problem of information theory is how to transmit messages from a source to a receiver reliably, without significant distortion. Shannon postulated that the point of sending information is to resolve uncertainty: before a message arrives, the receiver's uncertainty about its contents is at a maximum, and the quantity that measures this uncertainty has been termed "information entropy." Receiving the message reduces that uncertainty. Shannon's solution to the transmission problem was three-fold. First, convert all information into bits, each a 1 or a 0. Second, eliminate the redundancy contained in the message so that it can be encoded efficiently. Third, use mathematically structured codes to overcome noise, error, and distortion. Shannon proved that such codes exist: with them, messages can cross a noisy channel with an arbitrarily small probability of error, moving the receiver from a state of maximum entropy (no knowledge of the message) to understanding, a reduction of entropy (Gleick, 2011; Seife, 2006; Shannon, 1948; Soni & Goodman, 2017; Stone, 2015).
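The entropy Shannon defined can be illustrated with a minimal sketch (not drawn from the cited works) of his formula H = -Σ pᵢ log₂ pᵢ, where the pᵢ are the probabilities of the possible messages:

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero probabilities."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin toss is maximally uncertain: exactly 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))   # → 1.0

# A biased coin is more predictable, so each toss resolves less uncertainty.
print(shannon_entropy([0.9, 0.1]))   # less than 1 bit

# A certain outcome carries no information at all.
print(shannon_entropy([1.0]))        # → 0.0
```

The fair coin's maximum entropy corresponds to the receiver's state before a message arrives; learning the outcome removes that uncertainty entirely.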