Chapter 5  Lossless Source Coding
Information Theory and Coding

- General model of a discrete, lossless, memoryless source: an input symbol set S = {s1, ..., sq}, a code symbol set X = {x1, ..., xr}, and a codeword set C = {W1, ..., Wq}.

5.1 Lossless encoder

- How to code losslessly? Even when the statistical characteristics of the source are ignored, a fixed length code must satisfy the condition

    q ≤ r^l,

  which guarantees that there are enough codewords to represent every source symbol. From this condition we get the inequality

    l ≥ log q / log r.

- E.g. 5.1  Suppose we only have the first eight letters of the English alphabet (A to H) in our vocabulary. The Fixed Length Code (FLC) for this set of letters would be:

    Fixed Length Code
    Letter  Codeword      Letter  Codeword
    A       000           E       100
    B       001           F       101
    C       010           G       110
    D       011           H       111

- A Variable Length Code (VLC) for the same set of letters can be:

    Variable Length Code 1 (Huffman coding)
    Letter  Codeword      Letter  Codeword
    A       00            E       101
    B       010           F       110
    C       011           G       1110
    D       100           H       1111

- Suppose we have to code the series of letters "ABADCAB". The fixed length and variable length representations of the 7 letters are:

    Fixed Length Code:      000 001 000 011 010 000 001   Total bits = 21
    Variable Length Code 1: 00 010 00 100 011 00 010      Total bits = 18

  The VLC uses fewer bits than the FLC because the letters appearing more frequently in the sentence (for example A) are represented with fewer bits (00).

- Instantaneous code (prefix code)
  (1) It is a kind of uniquely decodable code.
  (2) In such a variable length code, no codeword is a prefix of any other codeword.
  (3) When decoding, there is no need to refer to the following codewords; the decoder can make its judgment immediately, carrying out decoding without delay.

- Let us have a look at Example 5.1 again. Now we consider another VLC for the first 8 letters of the English alphabet:

    Variable Length Code 2
    Letter  Codeword      Letter  Codeword
    A       0             E       10
    B       1             F       11
    C       00            G       000
    D       01            H       111

- This second variable length code appears to be more efficient than the first one in terms of representation of the letters:

    Variable Length Code 1: 00 010 00 100 011 00 010   Total bits = 18
    Variable Length Code 2: 0 1 0 01 00 0 1            Total bits = 9

  However, it has decoding problems:

    Original: 010010001 → ABADCAB
    Now:      010010001 → AEABAAD
    or:       010010001 → ABAABAAAB

- Definition 5.1  A Prefix Code is one in which no codeword forms the prefix of any other codeword. Every prefix code is uniquely decodable (the converse does not hold in general); prefix codes are also called Instantaneous Codes.

- The optimal code:
  - It is a uniquely decodable code.
  - Its average code length is less than that of any other uniquely decodable code.

5.2 Lossless source coding

5.2.1 Fixed length coding theorem

- Fixed length source coding theorem: For a discrete, memoryless, stationary, ergodic source with symbol set S = (s1, s2, ..., sq), we can use r different symbols X = (x1, x2, ..., xr) to perform fixed length coding. For any ε > 0 and δ > 0, for the N-th extension source, if the code length l satisfies

    (l / N) log r ≥ H(S) + ε,

  then, when N is big enough, the decoding error probability can be made less than δ. The fixed length coding theorem presents a theoretical limit of the code length needed for fixed length coding.

- If the equal length code is required to be uniquely decodable, then it must satisfy r^l ≥ q^N. If N = 1, then l ≥ log q / log r. Conclusion: for unique decoding, each source symbol needs at least log q / log r code symbols.

- When binary code symbols are used (r = 2), l ≥ log2 q. Conclusion: with an equal length binary code, the limit code length per source symbol is log2 q.

5.2.2 Unfixed length (variable length) source coding

- Several concepts of code types (cf. Eg. 2.4.2):
  - non-singular code and singular code
  - uniquely decodable code
  - prefix code (instantaneous code, non-delay code)
  - Kraft theorem
  - unfixed length coding theorem (Shannon's First Theorem)

Kraft theorem
- Question: find a real-time, uniquely decodable code.
- Method: study the code partition condition of the real-time uniquely decodable code.
- Introduction: the concept of the "code tree".
- Conclusion: the condition under which a real-time uniquely decodable code (prefix code) exists, that is, the Kraft theorem.

Code tree
The corresponding relationship between a VLC and its code tree:
(1) Tree root ↔ starting point of a codeword
(2) Number of branches from
a tree node ↔ code dimension (the number of code symbols, r)
(3) Node ↔ part of a codeword
(4) Terminal node ↔ end of a codeword
(5) Number of branches passed ↔ code length
(6) Non-full branch tree ↔ variable length code
(7) Full branch tree ↔ fixed length code

- Theorem 5.1 (Kraft Inequality)  A necessary and sufficient condition for the existence of a binary prefix code set whose codewords have lengths l1, l2, ..., lq is

    Σ_{i=1}^{q} 2^(-l_i) ≤ 1.

  (For an r-ary code symbol set the condition is Σ r^(-l_i) ≤ 1.) The proof uses the code tree map.

- Average length of the prefix code. Consider a discrete memoryless stationary source whose limit source entropy is H∞(S) and whose input symbol set has q symbols. If the code length of each codeword Wi is li, then the average code length of the prefix code is

    L̄ = Σ_{i=1}^{q} p(si) · li   (code symbols per source symbol).

5.3.3 Lossless unfixed length (variable length) source coding theorem

- Lossless unfixed length source coding theorem (Shannon's First Theorem): For a discrete memoryless stationary source S with limit entropy H∞(S) and code signals X = {x1, ..., xr}, we can always construct a uniquely decodable code. The average code length per source signal then satisfies

    H∞(S) / log r ≤ L̄N / N < H∞(S) / log r + 1/N,

  where L̄N is the average codeword length for the N-th extension source.
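The Kraft inequality of Theorem 5.1 can be checked numerically. A minimal sketch, assuming the codeword lengths of the three codes from Example 5.1 (the function name `kraft_sum` is ours):

```python
def kraft_sum(lengths, r=2):
    """Left-hand side of the Kraft inequality: sum of r**(-l_i)."""
    return sum(r ** -l for l in lengths)

# Codeword lengths taken from Example 5.1:
flc  = [3] * 8                    # Fixed Length Code: A-H, 3 bits each
vlc1 = [2, 3, 3, 3, 3, 3, 4, 4]   # Variable Length Code 1 (a prefix code)
vlc2 = [1, 1, 2, 2, 2, 2, 3, 3]   # Variable Length Code 2 (not uniquely decodable)

print(kraft_sum(flc))   # 1.0  -> a binary prefix code with these lengths exists
print(kraft_sum(vlc1))  # 1.0  -> likewise
print(kraft_sum(vlc2))  # 2.25 -> violates the inequality: no prefix code possible
```

Note that the inequality speaks only about lengths: VLC2's lengths cannot belong to any prefix code, which matches the decoding ambiguity seen earlier.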
- Theorem 5.3 (Source Coding Theorem)  Let X be the set of letters from a DMS with finite entropy H(X), and let xk, k = 1, 2, ..., L, be the output symbols, occurring with probabilities pk. Given these parameters, it is possible to construct a code that satisfies the prefix condition and has an average length R̄ that satisfies the following inequality:

    H(X) ≤ R̄ < H(X) + 1.
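The bound of Theorem 5.3 can be witnessed by the classical construction l_k = ⌈-log2 p_k⌉ (Shannon code lengths), which always satisfy the Kraft inequality. A sketch, assuming a hypothetical two-symbol source with probabilities 3/4 and 1/4 (the function names are ours):

```python
from math import ceil, log2

def entropy(probs):
    """Entropy H(X) in bits per symbol."""
    return -sum(p * log2(p) for p in probs)

def shannon_lengths(probs):
    """Codeword lengths l_k = ceil(-log2 p_k); they satisfy the Kraft
    inequality, so a prefix code with these lengths exists."""
    return [ceil(-log2(p)) for p in probs]

probs = [3/4, 1/4]                      # hypothetical two-symbol DMS
H = entropy(probs)                      # about 0.811 bit/symbol
R = sum(p * l for p, l in zip(probs, shannon_lengths(probs)))
assert H <= R < H + 1                   # the bound of Theorem 5.3
print(H, R)
```

These lengths merely witness the bound; they need not be optimal. For this source the code s1 → 0, s2 → 1 does better (R̄ = 1), as the worked example in this section shows.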
- Example: a discrete source without memory with two signals s1, s2 and probabilities p(s1) = 3/4, p(s2) = 1/4.

  Its entropy is:

    H(S) = -(3/4) log2(3/4) - (1/4) log2(1/4) ≈ 0.811 bit/signal.

  Use dual signals (0, 1; r = 2) to construct a prefix code s1 → 0, s2 → 1. The average code length per signal is L̄ = 1, and the code efficiency is

    η = H(S) / L̄ ≈ 0.811.

  Construct a prefix code for the expansion source S²:

    Message  Probability  Prefix code
    s1s1     9/16         0
    s1s2     3/16         10
    s2s1     3/16         110
    s2s2     1/16         111

  Average code length:

    L̄2 = (9/16)·1 + (3/16)·2 + (3/16)·3 + (1/16)·3 = 27/16 code symbols per message.

  Average code length per signal in source S²: L̄2 / 2 = 27/32 ≈ 0.844. Code efficiency:

    η2 = H(S) / (L̄2 / 2) ≈ 0.811 / 0.844 ≈ 0.961 > η.

  To improve the code efficiency, a two-dimensional joint code is adopted by considering the 2nd order expansion source of S.

  Conclusion: Using unfixed length codes, we can achieve quite high coding efficiency even when N is not very high; moreover, we may realize lossless coding. As N increases, the coding efficiency approaches 1.

5.4 Typical examples of lossless source coding

5.4.1 Huffman coding
- Coding method
- Characteristics
- Application

(Huffman coding)  Eg. 5.7  Coding steps:
1) Sort the source messages U according to their probabilities in descending order.
2) Start from the two least probabilities; the lower branch (smaller probability) is assigned "1", and the upper branch is assigned "0". If the two branches' probabilities are equal, still assign the lower branch "1" and the upper "0".
3) Combine the two coded branches, then reorder and recode.
4) Repeat step 3) until the sum of the probabilities is 1.
5) Trace back from the rightmost end to the left along the tree branches to get the corresponding codeword, e.g. U3 → "110".

(Huffman coding)
- E.g. 2  Eight symbols (A-H) occur with the probabilities given in the table below. Draw the Huffman coding procedure and compute the coding efficiency.

    Symbol  Probability      Symbol  Probability
    A       0.10             E       0.06
    B       0.18             F       0.10
    C       0.40             G       0.07
    D       0.05             H       0.04

  (1) The entropy of this source is

    H(U) = -Σ pk log2 pk ≈ 2.552 bit/symbol.

  (2) The Huffman procedure assigns codeword lengths 1 (C), 3 (B and one of the 0.10 symbols), 4 (the other 0.10 symbol, E and G) and 5 (D and H), so the average length of a codeword is

    L̄ = Σ pk lk = 2.61 code symbols/symbol.

  (3) The efficiency of this code is therefore

    η = H(U) / L̄ ≈ 0.978.

- Characteristics:
  - The coding method is not unique.
  - A signal with higher probability gets a shorter codeword, while a signal with lower probability gets a longer one. This guarantees a shorter average code length.
  - The last two codewords always differ in the last bit symbol and agree in the previous bits.
  - The two longest codewords have the same code length.
  These four characteristics make Huffman coding one of the optimum coding methods.

- Application: applied widely in various fields, for example the International digital facsimile coding standard (1980) and US HDTV (1995).

- Problems:
  - Error spreading: a single channel error can desynchronize the decoding of subsequent codewords.
  - Algorithm complexity grows rapidly as the source string length increases.

End of Chapter 5
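The Huffman procedure of E.g. 2 can be sketched with a min-heap instead of the slides' repeated sort-and-merge table. This is an illustrative implementation (its tie-breaking may differ from the slides' "lower branch gets 1" rule), but any Huffman tree yields the same optimal average length:

```python
import heapq
from itertools import count
from math import log2

def huffman_lengths(probs):
    """Return Huffman codeword lengths for the given probabilities.
    Each heap entry is (probability, tiebreak, list of symbol indices);
    merging two entries adds one bit to every symbol inside them."""
    tiebreak = count()               # prevents comparing the index lists on ties
    heap = [(p, next(tiebreak), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, ids1 = heapq.heappop(heap)   # two least probable groups
        p2, _, ids2 = heapq.heappop(heap)
        for i in ids1 + ids2:               # each merge adds one bit
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, next(tiebreak), ids1 + ids2))
    return lengths

# E.g. 2: eight symbols A-H
probs = [0.10, 0.18, 0.40, 0.05, 0.06, 0.10, 0.07, 0.04]
lengths = huffman_lengths(probs)
H    = -sum(p * log2(p) for p in probs)            # entropy, about 2.552 bit/symbol
Lbar = sum(p * l for p, l in zip(probs, lengths))  # average length, 2.61
print(H, Lbar, H / Lbar)                           # efficiency about 0.978
```

The heap keeps each merge step O(log n), so building the whole code is O(n log n) in the alphabet size n, which is why the cost grows quickly when Huffman coding is applied to extension sources whose alphabets grow exponentially with N.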