Color motion artifact detection and processing apparatus compatible with video coding standards
BACKGROUND OF THE INVENTION
The present invention relates to processing of video data, and more specifically to detection and reduction of color-motion artifacts in the processing of video data.
Successive frames in a typical video sequence are often very similar to each other. For example, a sequence of frames may have scenes in which an object moves across a stationary background, or a background moves behind a stationary object. Consequently, much of the content of one frame may also appear, at a different position, in a subsequent frame. Video systems take advantage of such redundancy within the frames by using predictive coding techniques, such as motion estimation and motion compensation, to reduce the volume of data required in compressing the frames.
In accordance with the well-known motion estimation technique, to conserve bitrate, data related to the differences between the positions of similar objects in successive frames are captured by one or more motion vectors. The motion vectors are then used to identify the spatial coordinates of the shifted objects in a subsequent frame. The motion vectors therefore reduce the bitrate that would otherwise be required to encode the data associated with the shifted objects.
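To make the technique concrete, the following is a minimal Python sketch of full-search block matching on the luminance plane. The function name, the 16×16 block size, the ±8-pixel search radius, and the use of the sum of absolute differences (SAD) as the matching cost are illustrative assumptions and are not taken from the text.

import numpy as np

def estimate_motion_vector(ref_luma, cur_luma, top, left, block=16, radius=8):
    # Return the (dy, dx) displacement whose candidate block in the reference
    # frame best matches the current block, using the sum of absolute differences.
    cur_block = cur_luma[top:top + block, left:left + block].astype(np.int32)
    best_sad, best_mv = None, (0, 0)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + block > ref_luma.shape[0] or x + block > ref_luma.shape[1]:
                continue  # candidate block falls outside the reference frame
            cand = ref_luma[y:y + block, x:x + block].astype(np.int32)
            sad = int(np.abs(cur_block - cand).sum())
            if best_sad is None or sad < best_sad:
                best_sad, best_mv = sad, (dy, dx)
    return best_mv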
Motion estimation and compensation are used in several international standards such as H.261, H.263, MPEG-1, MPEG-2, and MPEG-4. Partly due to its computational intensity, a motion vector is typically shared by all color components in the (Y, U, V) or (Y, Cr, Cb) coordinate systems. In the (Y, U, V) color coordinate system, Y is the luma component (also referred to below as the luminance, and related to the intensity), and U and V are the chroma components (also referred to below as the chrominance components, and related to hue and saturation) of a color. Similarly, in the (Y, Cr, Cb) color coordinate system, Y is the luma component, and Cb and Cr are the chroma components. Each motion vector is typically generated for a macroblock. Each macroblock typically includes a block of, e.g., 16×16 or 8×8 pixels. The MPEG-2 standard provides an interlaced mode that separates each 16×16 macroblock into two 16×8 sub-macroblocks, each having an associated motion vector. In the following, the terms block, macroblock, and sub-macroblock may be used interchangeably.
To simplify computation, most commonly known video standards use only the luminance component to generate a motion vector for each macroblock. This motion vector is subsequently applied to both chroma components associated with that macroblock. The generation of a motion vector using only the luminance component may cause undesirable color-motion artifacts (alternatively referred to herein below as color artifacts) such as color patches.
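The reuse of the luma-derived vector for the chroma components can be sketched as follows. The assumption of 4:2:0 chroma subsampling (so the vector is halved for the half-resolution chroma planes), along with the function names, is an illustrative choice and is not stated in the text.

def chroma_motion_vector(luma_mv):
    # In 4:2:0 material the chroma planes are half resolution in each
    # dimension, so the luma-derived vector is simply halved before reuse.
    dy, dx = luma_mv
    return (dy // 2, dx // 2)

def predict_chroma_block(ref_chroma, top, left, luma_mv, block=8):
    # Fetch the chroma prediction using the vector derived from luma alone;
    # no chroma matching is performed, which is the source of the artifacts
    # discussed below.
    dy, dx = chroma_motion_vector(luma_mv)
    return ref_chroma[top + dy:top + dy + block, left + dx:left + dx + block]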
To further reduce the bitrate required for encoding of video data, inter-frame and intra-frame encoding have been developed. In accordance with inter-frame coding, the difference between the data contained in a previous frame and a current frame is used to encode the current frame. Inter-frame encoding may not improve coding efficiency, for example, if the current frame is the first frame in a new scene (i.e., when there is a scene change), in which case intra-frame encoding is used. In accordance with intra-frame encoding, only the information contained within the frame itself is used to encode the frame.
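As a rough illustration of the two modes, the sketch below forms an intra residual from the frame itself and an inter residual from the difference against a prediction. The simple scene-change heuristic (mean absolute frame difference against a fixed threshold) is an assumption made only for illustration.

import numpy as np

def frame_residual(cur, prediction, intra):
    # Intra-frame coding encodes the frame itself; inter-frame coding encodes
    # the difference between the current frame and its prediction.
    if intra:
        return cur.astype(np.int32)
    return cur.astype(np.int32) - prediction.astype(np.int32)

def looks_like_scene_change(cur, prev, threshold=30.0):
    # A crude test for choosing intra coding on the first frame of a new scene.
    return np.abs(cur.astype(np.int32) - prev.astype(np.int32)).mean() > threshold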
FIG. 1 is a simplified high-level block diagram of a conventional system 100 adapted to detect color-motion artifacts. System 100 receives a sequence of incoming video frames via frame reorder block 102. In response, frame reorder block 102 serially supplies the (Y, U, V) components of a current frame of the frame sequence to an adder/subtractor 104. Adder/subtractor 104 is also adapted to receive a signal from motion compensation block 106 via selector 108. If selector 108 is in the upper position, then intra-frame coding is used, in which case a null signal (i.e., 0) is supplied to adder/subtractor 104. On the other hand, if selector 108 is in the lower position, then inter-frame coding is used. Adder/subtractor 104 generates a signal that corresponds to the difference between the video data supplied by frame reorder block 102 and that supplied by selector 108.
The signal generated by adder/subtractor 104 is supplied to a discrete cosine transform (DCT) block 110 whose output signal is quantized by a quantizer 112. The quantized signal generated by quantizer 112 is then encoded by variable-length coder (VLC) 114. The signal encoded by VLC 114 is then stored in buffer 116, which, in turn, supplies the encoded video bitstream to a video decoder (not shown).
The signal generated by quantizer 112 is inversely quantized by an inverse quantizer 118 and is subsequently delivered to an inverse DCT (IDCT) 120. IDCT 120 performs an inverse DCT function on the signal it receives and supplies that signal to adder 122. Adder 122 adds the signal it receives from selector 108 to the signal it receives from IDCT 120 and stores the added result in frame memory 124 for future retrieval. The signal stored in frame memory 124 only includes the luma component of a current frame and is adapted to serve as a reference frame for motion estimation and compensation of future frames.
A motion estimator 128 receives the signal stored in frame memory 124 and the signal supplied by frame reorder block 102 to generate a motion vector signal that is supplied to motion compensation block 106. Only the luma components of the current frame (as supplied by frame reorder block 102) and the reference frame (as supplied by frame memory 124) are received and used by motion estimator 128 to generate a motion vector. The motion vector generated by motion estimator 128 is supplied to motion compensator 106. Motion compensator 106, in turn, compensates for the motion of the signal it receives from frame memory 124 using the motion vector signal that it receives from motion estimator 128. The output signal generated by motion compensator 106 is a motion-compensated signal of a current frame and serves as the reference frame for the next incoming frame when inter-frame encoding is used.
There may be occasions when a reference frame is not required. For example, no reference frame is required when a new video sequence is received by system 100. Similarly, there is no need for a reference frame when processing the first frame of a new scene. To accommodate situations where no reference frame is needed, selector 108 is provided with an upper position. When placed in the upper position, a null signal (i.e., 0) is transferred to adder/subtractor 104.
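The local reconstruction loop formed by blocks 110, 112, 118, 120, 122, and 124 of FIG. 1 can be sketched as follows. The 8×8 orthonormal DCT matrix and the single uniform quantization step are simplifying assumptions, since the text does not specify the transform size or the quantizer design.

import numpy as np

N = 8
DCT = np.array([[np.sqrt((1 if k == 0 else 2) / N) *
                 np.cos(np.pi * (2 * n + 1) * k / (2 * N))
                 for n in range(N)] for k in range(N)])

def encode_and_reconstruct_block(residual, prediction, qstep=16):
    # Forward path: DCT (block 110) followed by the quantizer (block 112);
    # the quantized coefficients would be passed to the VLC (block 114).
    coeffs = DCT @ residual @ DCT.T
    quantized = np.round(coeffs / qstep)
    # Local decode: inverse quantizer (block 118), IDCT (block 120), and adder
    # (block 122); the result is what frame memory 124 stores as the reference.
    rec_residual = DCT.T @ (quantized * qstep) @ DCT
    reconstructed = prediction + rec_residual
    return quantized, reconstructed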
Conventional luminance-only based motion estimation and compensation systems, such as system 100, fail to reflect the true movement of an object in a video sequence. This failure results in noticeable color-motion artifacts. FIG. 2 is an exemplary diagram showing color-motion artifacts stemming from a failure to detect the motion of a color object. In FIG. 2, a uniform gray area 200 provides a background to two synthetic color patches, namely a red color patch 210 and a green color patch 220. Both red and green color patches 210 and 220 produce the same luminance level as gray background 200. Both color patches 210 and 220 are also moving in front of gray background 200.
Assume that in FIG. 2, the color conversion recommended by the ITU-R standard BT.709 is used, as shown below:
Y = 0.715G + 0.072B + 0.213R.   (1)
Assume further that gray background 200 has (R, G, B) color components of (40, 40, 40), resulting in a luma component (i.e., intensity level) of 40. Assume further that red color patch 210 and green color patch 220 have respective (R, G, B) color components of (188, 0, 0) and (0, 56, 0). Consequently, in accordance with equation (1) above, both red color patch 210 and green color patch 220 have the same luminance level (i.e., 40) as the gray background 200. Therefore, conventional luminance-only based motion estimation and motion compensation systems, such as system 100 of FIG. 1, fail to detect the movement of red and green color patches 210 and 220 relative to gray background 200. This failure results in noticeable differences in the quantization errors of the chroma components, particularly for the gray background, thereby creating color motion artifacts.
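A quick numeric check of this example, using the weights of equation (1), is sketched below; the (R, G, B) triples are those given above, and the helper function is purely illustrative.

def luma(r, g, b):
    # Equation (1): Y = 0.715 G + 0.072 B + 0.213 R
    return 0.715 * g + 0.072 * b + 0.213 * r

for name, (r, g, b) in [("gray background 200", (40, 40, 40)),
                        ("red color patch 210", (188, 0, 0)),
                        ("green color patch 220", (0, 56, 0))]:
    print(name, round(luma(r, g, b)))  # all three round to the same luma level, 40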
One known method for overcoming the problems associated with luminance-only motion estimation and compensation systems is described in U.S. Pat. No. 5,544,263 and involves performing motion estimation for each of the color components (Y, U, V). This three-motion-vector method, while achieving a good match for each of the color components, substantially increases the computational intensity. The increase in computational intensity raises both the cost of the system and the bandwidth required to transmit the three motion vectors. This, in turn, limits the bitrate available to encode the motion-compensated inter-frame differences. Therefore, although the method described in U.S. Pat. No. 5,544,263 provides improved color motion artifact rejection, it leaves less bandwidth for encoding the inter-frame differences and is thus less immune to granular noise.
Another known method for overcoming the problems associated with luminance-only motion estimation and compensation systems is described in U.S. Pat. No. 5,544,263 and involves using all color components when matching blocks. In accordance with this method, block matching for each macroblock is performed only once. However, this technique may require 50% more computation than do luminance-only motion estimation and compensation systems. This 50% increase in computation increases the system complexity and cost.
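The extra work implied by matching on all components can be seen in the following sketch, which accumulates SAD over Y, U, and V for one candidate displacement. Assuming 4:2:0 subsampling, a 16×16 luma block contributes 256 samples and the two co-located 8×8 chroma blocks add 128 more, which is roughly the 50% increase noted above; the function name and argument layout are illustrative assumptions.

import numpy as np

def sad_all_components(cur_yuv, ref_yuv, top, left, dy, dx, block=16):
    # cur_yuv and ref_yuv are (Y, U, V) tuples of 2-D arrays; in 4:2:0 the
    # chroma planes are half resolution, so coordinates and the displacement
    # are halved for U and V.
    y_c, u_c, v_c = cur_yuv
    y_r, u_r, v_r = ref_yuv
    sad = np.abs(y_c[top:top + block, left:left + block].astype(np.int32) -
                 y_r[top + dy:top + dy + block, left + dx:left + dx + block].astype(np.int32)).sum()
    ct, cl, cb, cdy, cdx = top // 2, left // 2, block // 2, dy // 2, dx // 2
    for cur_p, ref_p in ((u_c, u_r), (v_c, v_r)):
        sad += np.abs(cur_p[ct:ct + cb, cl:cl + cb].astype(np.int32) -
                      ref_p[ct + cdy:ct + cdy + cb, cl + cdx:cl + cdx + cb].astype(np.int32)).sum()
    return int(sad)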
A need continues to exist for improved color-motion artifact detection and reduction techniques.
BRIEF SUMMARY OF THE INVENTION