VideoHelp Forum




  1. Member
    Join Date
    May 2009
    Location
    United States
    I was thumbing through an x264 wiki when I encountered this:

    interlaced

    Default: Not Set

    Enable interlaced encoding. x264's interlaced encoding is inherently less efficient than its progressive encoding, so it is probably better to deinterlace an interlaced source before encoding rather than use interlaced mode.
    Now, my plan is to encode a big (~90 minute) raw video into a hopefully Blu-ray-compatible 1080i30 H.264 video. As it happens, the entire video was rendered as 1080p60, so I have the option of feeding the progressive footage to x264, but my assumption had been that I would need to give MeGUI an appropriate .AVS file with interlacing functions to render it as 1080i30 before letting x264 have it.

    The above-quoted wiki entry seems to be suggesting that if I have the option of giving x264 progressive footage, x264 can do a better job of rendering its output than if I only have interlaced footage. Perhaps I am getting my hopes up for nothing, but does this mean I could feed x264 (via MeGUI) the 1080p60 footage and, with the use of an appropriate command-line argument, tell x264 to spit out 1080i30 H.264 from it?

    Thought I'd check. ;p
  2. Originally Posted by Asterra View Post
    Perhaps I am getting my hopes up for nothing, but does this mean I could feed x264 (via MeGUI) the 1080p60 footage and, with the use of an appropriate command-line argument, tell x264 to spit out 1080i30 H.264 from it?
    I don't see how you drew that conclusion from the Wiki entry you quoted. Why not just reinterlace it in the script?
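
    Something like this at the end of the .avs would do the reinterlacing (an untested sketch; it assumes a top-field-first target and that the 1080p60 clip is already loaded in the script):

    AssumeTFF()
    SeparateFields()        # 60 frames per second -> 120 fields per second
    SelectEvery(4, 0, 3)    # keep the top field of frame N and the bottom field of frame N+1
    Weave()                 # weave each pair back into one of 30 interlaced frames per second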

    I think it's talking about progressive vs. interlaced encoding when both are at the same 30 fps frame rate to begin with.
  3. does this mean I could feed x264 (via MeGUI) the 1080p60 footage and, with the use of an appropriate command-line argument, tell x264 to spit out 1080i30 H.264 from it?
    No, it doesn't. It means they recommend deinterlacing and encoding progressively instead of using interlaced encoding.
    (If you feed x264 a 60p source and enable interlacing, you will end up with 60i output, not 30i.)

    Cu Selur

    Ps.: Personally I would, depending on the source, go for 1280x720@60p instead of 1920x1080@30i
    Last edited by Selur; 9th Jul 2011 at 02:00.
  4. Member
    Join Date
    May 2009
    Location
    United States
    Originally Posted by Selur View Post
    No, it doesn't. It means they recommend deinterlacing and encoding progressively instead of using interlaced encoding.
    Pretty much what I thought, which makes the suggestion a dubious one, because deinterlacing footage just so it can be re-interlaced, with no actual improvement to the encode, sounds like a colossal waste of time. If they meant to say, "Compressed progressive video will look better than compressed interlaced video", they didn't have to word it so ambiguously. In truth, I plan to do some tests to see if the literal wording they used has any merit.

    Originally Posted by Selur View Post
    Ps.: Personally I would, depending on the source, go for 1280x720@60p instead of 1920x1080@30i
    Well. About 85% of the footage was interlaced to begin with, but rendered as progressive with QTGMC (Very Slow). Certainly the result of that process is better than any realtime hardware deinterlacer you could name. The other 15% came from a 1080p60 camcorder. As for the resolution, about half of it is well above 720p so it would, unfortunately, be a far bigger sacrifice to jump down to 720p than to stick with 1080i30.
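
    For reference, the deinterlacing pass on those portions was essentially just the following (a sketch from memory; it assumes QTGMC and its dependencies are installed and that the source is top field first):

    AssumeTFF()
    QTGMC(Preset="Very Slow")   # double-rate deinterlace: 30i in, 60p out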

    Of course I should mention that I am the sort of person who feels that recording in 24 or 30fps would be an atrocious sacrifice of smoothness. ;p
  5. Member
    Join Date
    May 2009
    Location
    United States
    Addressing my own question for posterity's sake. I gave x264 (via MeGUI) two inputs, each encoded with and without the interlaced flag, for four outputs (rough command-line equivalents are sketched below):

    1) In: 1080i30, Out: (no interlaced flag) : Result was 1080i30 video.

    2) In: 1080i30, Out: -tff (interlaced flag) : Result was 1080i30 video. There are very clear differences between these two outputs. With the -tff flag, the output exhibits much better separation between the two fields, whereas without the flag the two fields woven together often seem to be one big compressed mess. On the other hand, with -tff in place, macroblocking is a little more apparent. But I would judge that when giving x264 interlaced footage, it is by far preferable to specify the use of the -tff flag.

    3) In: 1080p60, Out: (no interlaced flag) : Result was 1080p30 footage, doubtless because Blu-ray compatibility was specified.

    4) In: 1080p60, Out: -tff : Result was again 1080p30 footage, which means that -tff serves no apparent purpose unless one specifically feeds x264 interlaced video.
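
    For anyone repeating this: stripped of the MeGUI wrapper, the runs boil down to command lines roughly like these (a sketch only; the rate-control settings are placeholders, the CLI spelling of the flag is --tff, and this assumes a Windows x264 build that accepts .avs input):

    x264 --preset slow --crf 18 --output test_progressive.264 source.avs
    x264 --preset slow --crf 18 --tff --output test_interlaced.264 source.avs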


