Quantifying Information
Summary
This document explains the concept of entropy in both thermodynamics and information technology. It covers the basics of entropy, its relationship to disorder and randomness, and how it applies to information theory.
Full Transcript
Quantifying Information – COURSE #04 – ECONOMIC INFORMATICS

With its Greek prefix en-, meaning "within", and the trop- root here meaning "change", entropy basically means "change within (a closed system – usually the entire universe)": lack of order or predictability; gradual decline into disorder; chaos, disorganization, randomness; large-scale collapse.

Thermodynamics
Entropy is most simply defined as the measure of disorder: the more disordered the particles are, the higher their entropy. It is a measure of the molecular disorder, or randomness, of a system – a measure of how quickly the particles inside an object are moving. The universe is constantly becoming more chaotic over time ("the general trend of the universe toward death and disorder", "the arrow of time", "the probability argument"): the entropy of a closed system can only increase over time, never decrease.

Information Technology
Entropy – the expected value of the information contained in a message.
Entropy – a measure of randomness (knowledge vs. entropy). Pick a random ball: how much do I know about the ball I am picking? How easy is it to guess a random letter?
Entropy – the minimum number of yes/no questions needed to guess any randomly selected letter in the sequence (see the entropy sketch at the end of this transcript).

How much space do we need to store the data defining the information? What is the amount of information contained in a message?

Bit (BInary digiT) – the basic unit of information in computing and digital communications; it can store one of two possible outcomes – 0/1, T/F, Y/N, … A coin flip can land on either heads or tails – 1 bit – H/T.
The bit represents:
→ the quantitative measure of information
→ the measure of surprise
1 bit → 2 states (0/1, T/F, Y/N, H/T, …)
2 bits → 4 states (00, 01, 10, 11)
3 bits → 8 states (000, 001, 010, 011, 100, 101, 110, 111)
…
N bits → 2^N states; M states → log2(M) bits

How many positions (minimum number of yes/no questions to be 100% sure) do we need for:
4 coin flips? 4 × ???
7 numerical digits (0–9)? 7 × ???
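The two questions above follow directly from the "M states → log2(M) bits" rule on the previous slide. The short Python sketch below is an added illustration, not part of the original slides (the helper name bits_needed is just illustrative): 4 coin flips need 4 × log2(2) = 4 bits, while 7 decimal digits need 7 × log2(10) ≈ 23.25 bits, i.e. 24 whole bits of storage.

import math

def bits_needed(num_states: int) -> float:
    # Minimum number of yes/no questions (bits) to identify one of
    # num_states equally likely outcomes: log2(num_states).
    return math.log2(num_states)

# 4 coin flips: each flip has 2 outcomes, so 4 x log2(2) = 4 bits.
print(4 * bits_needed(2))               # 4.0

# 7 numerical digits (0-9): each digit has 10 outcomes,
# so 7 x log2(10) ~ 23.25 bits, rounded up to 24 whole bits of storage.
print(7 * bits_needed(10))              # ~23.25
print(math.ceil(7 * bits_needed(10)))   # 24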
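Returning to the earlier definition – entropy as the expected value of the information contained in a message, and as the minimum number of yes/no questions needed to guess a random letter – the transcript does not show the underlying formula, but the standard quantity behind both readings is Shannon entropy, H = −Σ p(i)·log2 p(i). Below is a minimal Python sketch, added here as an illustration under the assumption that letter probabilities are estimated from a sample sequence:

from collections import Counter
from math import log2

def shannon_entropy(sequence: str) -> float:
    # Expected information per symbol, in bits: H = -sum(p_i * log2(p_i)),
    # with p_i estimated as the relative frequency of each symbol.
    counts = Counter(sequence)
    total = len(sequence)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Four equally likely letters: 2 bits per letter, i.e. two yes/no questions.
print(shannon_entropy("ABCDABCDABCDABCD"))   # 2.0

# Skewed letter frequencies: easier to guess, so fewer bits per letter.
print(shannon_entropy("AAAAAAAABBBBCCDD"))   # 1.75

When all four letters are equally likely the result is 2 bits per letter (two yes/no questions), while the skewed sequence is easier to guess and so carries less information per letter (1.75 bits).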