Hey
I was just wondering, is there anything wrong with running multiple video encoding tasks at once?
For example, at any one time:
- running multiple instances of DGIndex at once to create d2v files from MPEG-2s
or
- encoding with, say, Gordian Knot using XviD and with MeGUI using x264
or
- encoding with GK/MeGUI to XviD/x264 while using CCE to encode XviD/x264 to MPEG-2 (DVD)
Is it possible to get screwed-up video, glitches, or any other sort of problems in any of the above scenarios?
Thanks
-
There shouldn't be any problem running multiple tasks; it will just run them all a little slower. Tasks that demand real-time performance, like capturing, may drop frames if too many other things are running in the background, and burning could have problems because of disc access, but encoding and similar jobs should have no ill effects other than slowing down.
-
My experience with doing two encodes at the same time was that it actually took longer than if I encoded one after the other. But both encodes turned out fine (although terribly fragmented).
"Shut up Wesley!" -- Captain Jean-Luc Picard
Buy My Books -
Terribly fragmented? Doesn't fragmented mean broken up into little fragments?
:S -
Originally Posted by spanky123"Shut up Wesley!" -- Captain Jean-Luc Picard
Buy My Books -
If you put too much stress on the HDD (reading and writing too many things at the same time; I'm not talking about the CPU), you may end up with bad blocks and eventually premature HDD failure.
...happened to me, so go ahead... As was said above, it's not going to be faster, and it's inviting trouble. Your encoding has to be set to the lowest priority so the PC can interrupt it when it needs to; if it can't interrupt, things happen. Something like the sketch below is what I mean by dropping the priority.
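Rough Python sketch only - I'm assuming a Windows box with a reasonably recent Python, and the x264 command line is just a placeholder for whatever your encoder actually takes:
Code:
    import subprocess

    # Launch the encoder at below-normal priority so the rest of the system
    # (and the disk) can still get a word in. Windows-only priority class.
    cmd = ["x264", "-o", "out.264", "in.avs"]  # placeholder command line
    proc = subprocess.Popen(cmd, creationflags=subprocess.BELOW_NORMAL_PRIORITY_CLASS)
    proc.wait()
-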
So I probably shouldn't do it?
Last night, for the first time, I ran DGIndex on 3 different videos at the same time to make d2v files and demux the audio. Then I encoded them to x264 with MeGUI.
For the first time in a long while, the file sizes were quite a bit off from what I had specified. The only thing I did differently this time was that I ran multiple instances of DGIndex simultaneously.
Is it possible that this could have affected the output file size?
Seems kinda strange to me -
It shouldn't affect the encoding outcome; it just makes the HDD work harder, especially when fragmented. I lost an HDD (well, had it replaced) while encoding too many things (and watching video) at the same time. Twenty minutes at most of this was enough to produce over 100 bad blocks, some of them irreparable (I decided not to low-level format, which could have remapped them, and just returned the drive).
It's just not worth the time you have to invest fixing things afterwards. This happened a month ago, so beware. Task priority is the key: if your PC has room to react you may be OK; if it cannot interrupt and unload memory, sh.t happens. -
I do it all the time, but I usually write to different drives to avoid the fragmentation issue. NTFS handles fragmentation well, but multi-encoding thrashes the poor hard drive. The drive may keep up with it, but it makes you fear the thing will explode, or at least expire.
Dual core allows capture to proceed while encoding if set up properly. Best thing since sliced bread.
Again, capture ideally should go to a separate drive from the encoder's read/write drive. -
If you have a dual core system but only a single-threaded (or poorly multithreaded) encoder, you can nearly double your throughput by doing two encodes at the same time.
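In case it helps, a minimal sketch of what I mean - Python, with placeholder x264 command lines; point the two jobs at different sources/outputs, ideally on different drives:
Code:
    import subprocess

    # Start two single-threaded encodes side by side so each one can keep
    # a core busy, then wait for both to finish. Commands are placeholders.
    jobs = [
        ["x264", "-o", "out1.264", "in1.avs"],
        ["x264", "-o", "out2.264", "in2.avs"],
    ]
    procs = [subprocess.Popen(cmd) for cmd in jobs]
    for p in procs:
        p.wait()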
-
As I said, the files were fine, but instead of saving time (my goal), it actually took longer. At the time I didn't have the luxury of enough drives. The source files were on one drive and the destinations were on another, but the constant reads and writes between the drives caused a lot of fragmentation, and when I tried to author I had some delays that made it kind of frustrating. I defragged and it worked fine then, but that just added more time to the process. I can't say it caused my hard drives to die prematurely, as I'm still using the destination drive. The source drive has since been replaced, but it gave me a good six years of service, so I can't complain.
"Shut up Wesley!" -- Captain Jean-Luc Picard
Buy My Books -
The slowest response on a drive is seek time as the heads mechanically move across the drive.
-
Originally Posted by edDV
Would defragging the HDD help at all here?
-
One thing to remember is that the HDD is still the slowest item in your PC (besides the CD/DVD drive) when it comes to servicing CPU requests.
While RAM and the CPU can easily interrupt their internal processes, the HDD is held hostage by them and has to work through a slew of requests until it is finished with the chunk it was given. If it has to be halted by brute force with repeated requests from the CPU (another application banging on it for disk access), it will eventually corrupt either the page file or your data. I use a dual-CPU system, and although it gives a lot of room for internal processing, most of the processed data either comes from or ends up on a HDD, which is an obvious bottleneck for a PC. Unlike RAM and the CPU, the HDD has mechanical parts that limit what it can process and how quickly. While RAM is purged on the spot, the HDD needs time to complete the task.
An HDD rarely fails in hardware without some early signs, and most disk corruption is due to logical rather than physical errors. Unfortunately both are frustrating (although not to the same degree) and difficult to recover from. Obviously a physical failure is a candidate for professional recovery services, while the chances of salvaging data from logical errors are usually better for a home user. Although I personally don't subscribe to weekly defrags, performing one once a month and monitoring the fragmentation level may prevent a disaster.
Encoding is one of the most intense tests for PC hardware, so doing it sequentially is safer than trying to cram everything in at once. I was pushing it, and after a very in-depth sector analysis I could hardly believe how much damage it had done. It looked like a repeated head crash, even in areas that contained no data at all. This HDD was a new addition (Sept. 2006), a 320 GB Seagate. One thing to say though: love Seagate's 5-year warranty. -
Hi.
So, it's not rocket science. When two processes are doing intensive disk I/O you will definitely have fragmentation. You will put extra wear and tear on the hard drive heads and mechanism. If your drive uses a voice coil mechanism then you will have good performance - however most consumer drives use a stepper motor to move the heads. This will eventually wear out.
Ideally, you want a clean defragged optimized HD with free space that is contiguous toward the end of the drive.
And you want to run one encoding at a time.
Here's what I do:
I have 3 hard drives. One is reserved for the final output - that is, the encoded MPEG video. That drive is always defragged and optimized.
I use CCE Basic, create a batch of files to run, and then run them overnight.
With this setup I know I won't have any problems.
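If it helps, here's roughly how I'd script that overnight queue - a Python sketch with placeholder commands and file names, not actual CCE syntax:
Code:
    import subprocess

    # Run the queued encode jobs one at a time so the output drive only
    # ever serves a single writer. Commands and file names are placeholders.
    queue = [
        ["encoder.exe", "job1"],
        ["encoder.exe", "job2"],
        ["encoder.exe", "job3"],
    ]
    for cmd in queue:
        subprocess.run(cmd, check=True)  # stop the queue if a job fails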
JB -
I must strongly disagree with some of what has been stated here.
A heavy usage pattern will certainly shorten a drive's lifetime. However, that lifetime is measured in tens of thousands of hours.
A heavy usage pattern will not CAUSE a defect in a drive's hardware. It will REVEAL one that already exists. Head crashes specifically are a failure of the hardware. Usage does not create this failure. There is no possible pattern of software usage which should cause a head crash in a non-defective drive. Power issues or extreme vibration would be the only cases where a defective drive is not the cause.
Example - you drive a car at 80 mph for one hour and the engine explodes. Did the driving cause the failure? You could argue that it did; however, any engine that is functioning properly, with no inherent defects, should be able to do this with no problem at all. An engine which fails in these circumstances was defective in some way.
Defragging - it does reduce wear and tear, but the defrag itself represents a fair amount of usage. For most folks, once a month, or even once every three months, is fine. Some need it every day. If the analysis says you are 95% or better unfragmented, wait longer next time. -
Keep in mind that even new HDDs come with a "known errors" table stored in their firmware (as well as a table of reserve sectors in case bad ones occur), so ALL of them are imperfect straight from the factory.
A head crash is physical damage; I said it looked like one (in magnitude), never that it actually happened.
If you say "usage does not create this failure," then what does - sonar-ocular emissions?
Your very eloquent example with the car only confirms a suspicion I've had for some time: things are not perfect even if they sometimes appear so. Otherwise the engine would run forever... Just a thought.
-
Short answer: Running multiple encodes simultaneously will not affect the results of the encodes. But it can create disk access bottlenecks (and possibly excessive OS/CPU overhead, depending on the CPU used) that slow down the overall progress of the encodes compared to running them successively. I've noticed this phenomenon big-time when running encodes, so I usually do them successively.
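If you want to see the effect on your own machine, a quick timing comparison along these lines (Python sketch; the encoder command lines are just placeholders) will show whether successive or simultaneous wins for your setup:
Code:
    import subprocess, time

    # Time the same two (placeholder) encode jobs run back to back,
    # then run at the same time, and compare the totals.
    jobs = [["x264", "-o", "out1.264", "in1.avs"],
            ["x264", "-o", "out2.264", "in2.avs"]]

    start = time.time()
    for cmd in jobs:                                  # successive
        subprocess.run(cmd, check=True)
    successive = time.time() - start

    start = time.time()
    procs = [subprocess.Popen(cmd) for cmd in jobs]   # simultaneous
    for p in procs:
        p.wait()
    simultaneous = time.time() - start

    print("successive: %.0f s, simultaneous: %.0f s" % (successive, simultaneous))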
Usually long gone and forgotten -
I'm with Nelson37. All things being equal, just because a drive is doing a ton of reads/writes for an extended period does not mean the drive will be damaged. It will produce more heat, and if your drive isn't properly vented that can cause damage, but a drive will not be damaged simply by doing what it was designed to do.
I think TheFamilyMan summed it up nicely. -
Ahh, OK.
As multi-core goes 2x to 4x to 8x and single-drive capacities keep increasing past a terabyte, this is an interesting trend (in the "may you live in interesting times" sense). -
MTBF - that is, mean time between failures - is at least 50,000 hours for even the cheapest drives; that works out to roughly 5.7 years of running 24/7. This is not "utopian", this is established baseline fact.
As far as what causes the actual failure, what difference would that make to you? Simple wear and tear over tens of thousands of hours, sunspots, molecular discontinuities measured at the Angstrom level - none of these are under user control. Clean power, good cooling, and minimal vibration are the only things the user can effectively control.
A usage pattern of thrashing continued over a period of months might begin to have an effect, but a few hours is largely irrelevant. I have tested drives with butterfly seeks for days with no detectable ill effects, and the same drives have continued to run perfectly for years. The one or two which failed these tests were, by definition, defective before the tests were run. -
Originally Posted by Nelson37
http://www.storagereview.com/guide2000/ref/hdd/perf/qual/specMTBF.html
I agree that encoding two videos simultaneously isn't an issue, though - unless you're converting uncompressed video to uncompressed video 24 hours a day for months on end. -
What some would call "utopian" certainly appears to me to be valid statistical analysis, based on a large sample set. The specific manner by which this does or does not apply to an individual drive is something a lot of people just don't get a handle on. Understanding the "MEAN" in MTBF is the first step in this process.
The author mentions reduced warranty periods as an indicator of reliability. Absolutely no mention is made of competitive pricing, which creates pressure to reduce costs. Reducing the warranty coverage period is an effective way to reduce costs. The lower pricing is something we, the consumers, have voted for with our wallets and demanded through our purchasing practices. In doing so, we have actually lowered the warranty periods by our own actions. We are not willing to pay more for a drive with a longer warranty period.
Someone will undoubtedly post to state their individual purchases conflict with this, illustrating the lack of understanding of the large sample set mentioned in paragraph one. -