Hi,
Assume a hardware board with a RISC CPU (e.g. an ARM Cortex processor), associated peripherals, data and instruction caches, DRAM, and some OS running on it (say embedded Linux). Roughly what percentage of the CPU clock (its MHz budget) can be allotted to a video decoder?
Asked differently, if a Cortex-A8 processor runs at 666 MHz, how much of that would be available to a video decoder in a mobile phone system under real-time constraints? That figure would help decide the maximum MCPS the video decoder can consume.
(I know the exact answer depends on what else runs in the real system, such as the audio decoder, video decoder, modem, and so on, but are there any typical estimates for a mobile phone system?)
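To make the question concrete, here is one way I'd frame the estimate. All the load percentages below are purely illustrative assumptions, not measured figures for any real handset; the only number taken from above is the 666 MHz clock:

```python
# Hypothetical CPU-budget sketch. The percentages are illustrative
# placeholders, NOT measurements from any real phone platform.
CPU_MHZ = 666  # Cortex-A8 clock from the question

# Assumed shares of CPU time taken by other subsystems (guesses):
other_loads = {
    "OS / drivers": 0.10,
    "audio decoder": 0.05,
    "modem / protocol stack": 0.15,
    "UI and applications": 0.10,
}
headroom = 0.10  # safety margin so real-time deadlines are still met

# Whatever fraction remains is the video decoder's ceiling.
available_fraction = 1.0 - sum(other_loads.values()) - headroom
video_mcps_budget = CPU_MHZ * available_fraction

print(f"Video decoder budget: {video_mcps_budget:.0f} MCPS "
      f"({available_fraction:.0%} of {CPU_MHZ} MHz)")
```

With these made-up numbers the decoder would get about half the clock; what I'm really asking is whether the real-world fractions look anything like this.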
Thanks,
-AD.