Lecture 10: Data Compression & Huffman Coding (.pptx)

Uploaded by StylishSpessartine

University of Science and Technology

Tags: data compression, Huffman coding, information theory

Full Transcript

Information Theory: Data Compression & Huffman Code

Data Compression
- Data compression can be achieved by assigning short descriptions to the most frequent outcomes of a data source.
- Goal: find the shortest average description length of a random variable.
- Define the notion of an instantaneous code.

Data Compression (cont'd)
- The expected description length must be greater than or equal to the entropy.
- Entropy is a natural measure of efficient description length.
- The Huffman coding procedure finds the minimum expected description length.
- Huffman codes are optimal.

Huffman Coding
- Huffman coding is a simple and systematic way to design good variable-length codes, given the probabilities of the symbols.
- The resulting code is both uniquely decodable and instantaneous (prefix-free).
- The Huffman coding algorithm can be summarized as follows:

Huffman Coding (cont'd)
1) Think of the probabilities pi as the leaf nodes of a tree. In constructing the Huffman code, it is useful to sort the pi in decreasing order.
2) Starting with the leaf nodes, construct the tree as follows: repeatedly join the two nodes with the smallest probabilities to form a new node whose probability is the sum of the two just joined. Assign a 0 to one branch and a 1 to the other; it is helpful to do this assignment in a systematic way.
3) The codeword for each symbol is the sequence of 0s and 1s read from the root node down to the leaf node corresponding to that symbol.

Example
- Consider a source with symbols s1, s2, s3, s4 with probabilities 1/2, 1/4, 1/8, 1/8 respectively.
- The Huffman code is constructed as follows:

Solution
The resulting Huffman code is:

Symbol       | s1 | s2 | s3  | s4
Huffman code | 1  | 01 | 001 | 000

Example
- Consider a source with symbols s1, s2, ..., s8 with the following probabilities, and construct the Huffman code:

Symbol | s1   | s2   | s3   | s4   | s5     | s6     | s7     | s8
pi     | 0.25 | 0.21 | 0.15 | 0.14 | 0.0625 | 0.0625 | 0.0625 | 0.0625

Solution

Symbol       | s1   | s2   | s3   | s4   | s5     | s6     | s7     | s8
pi           | 0.25 | 0.21 | 0.15 | 0.14 | 0.0625 | 0.0625 | 0.0625 | 0.0625
Huffman code | 01   | 00   | 111  | 110  | 1001   | 1000   | 1011   | 1010

THE END OF THE COURSE
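The construction steps above can be sketched in Python. This is a minimal sketch, not the slides' method verbatim: `huffman_code` is a hypothetical helper name, and because the 0/1 branch labels and tie-breaks between equal probabilities are arbitrary choices, the exact codewords may differ from the worked example while the codeword lengths (and thus the average length) agree. For the first example's dyadic probabilities, the average length equals the entropy exactly, illustrating the entropy lower bound.

```python
import heapq
from math import log2
from itertools import count

def huffman_code(probs):
    """Build a binary Huffman code for a dict {symbol: probability}.

    Returns {symbol: codeword string}. Step 2 of the slides: repeatedly
    merge the two least-probable nodes, labeling one branch 0 and the
    other 1; each symbol's codeword is read from the root to its leaf.
    """
    tiebreak = count()  # unique tie-breaker so equal probabilities never compare dicts
    # Each heap entry: (probability, tie-breaker, {symbol: partial codeword})
    heap = [(p, next(tiebreak), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)  # two nodes with smallest probabilities
        p1, _, c1 = heapq.heappop(heap)
        # Prepend 0 on one branch and 1 on the other (label choice is arbitrary)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(tiebreak), merged))
    return heap[0][2]

# First example from the slides: p = 1/2, 1/4, 1/8, 1/8
probs = {"s1": 0.5, "s2": 0.25, "s3": 0.125, "s4": 0.125}
code = huffman_code(probs)
avg_len = sum(p * len(code[s]) for s, p in probs.items())
entropy = -sum(p * log2(p) for p in probs.values())
print(code)               # codeword lengths 1, 2, 3, 3 as in the slides
print(avg_len, entropy)   # 1.75 1.75 -- dyadic probabilities meet the entropy bound exactly
```

A quick sanity check on the output: no codeword is a prefix of another, so the code is instantaneous, and the expected length L = 0.5·1 + 0.25·2 + 0.125·3 + 0.125·3 = 1.75 bits matches H = 1.75 bits.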
