VideoHelp Forum
  1. I have a camcorder which shoots at 50 Mbps. I want to broadcast it on an ATSC MPEG-2 subchannel, so the bit rate must be around 3 Mbps.

    I'm having trouble downsampling it as an MPEG-2 transport stream, from 50 Mbps to 3 Mbps. The video tends to freeze at 3 Mbps when played in VLC, even if I do two-pass MPEG-2. Audio is AC3 at 192k, which poses no problem, except when playback freezes.

    Would it make sense to transcode it first to, say, h.265 or HuffYUV lossless to get a lower bit rate, say, 25 Mbps, and then to 3 Mbps MPEG-2, rather than going directly from 50 Mbps to 3 Mbps, or is this a dumb idea?

    Yes, there are TV stations that broadcast programming at 2.3 Mbps. See KPBS:

    https://en.wikipedia.org/wiki/KPBS_(TV)#Digital_channels
    Last edited by chris319; 18th Apr 2019 at 06:43.
  2. Originally Posted by chris319
    Would it make sense to transcode it first to, say, h.265 or HuffYUV lossless to get a lower bit rate, say, 25 Mbps, and then to 3 Mbps MPEG-2, rather than going directly from 50 Mbps to 3 Mbps
    No (and huffyuv will give you higher bitrates, not lower). The process of reencoding video involves first decompressing the source then compressing with the new codec. The encoder has no idea what the bitrate of the source was. It receives uncompressed frames.

    Originally Posted by chris319
    Yes, there are TV stations that broadcast programming at 2.3 Mbps. See KPBS:
    At 480i.
  3. What, then, is the benefit of doing double passes?
  4. Member (Joined Mar 2008, United States)
    Originally Posted by chris319
    What, then, is the benefit of doing double passes?
    Double-pass encode? This is so the encoder can figure out the best bit-rate allocation when a lossy encoder is used.
  5. Originally Posted by chris319
    What, then, is the benefit of doing double passes?
    2-pass encodes are used when you want the best bitrate allocation for a particular file size (average bitrate). The encoder examines the video during the first pass so it knows what parts need more bitrate and what parts need less, that information is stored in a log file. During the second pass it uses that log to determine how much bitrate to give each part to meet the requested size.
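For concreteness, a two-pass run of the kind described above might look like the sketch below in ffmpeg terms (the file names and the 3 Mbps target are placeholders taken from this thread, not recommendations). The script only builds and prints the two commands so they can be inspected before running:

```shell
# Hypothetical names; assumes ffmpeg is on PATH when the printed commands are run.
IN="C0008.MP4"
OUT="output.ts"
VRATE="3M"

# Pass 1 analyzes the video and writes a stats log (ffmpeg2pass-0.log);
# the encoded output itself is discarded.
PASS1="ffmpeg -y -i $IN -c:v mpeg2video -b:v $VRATE -pass 1 -an -f null /dev/null"

# Pass 2 reads that log to give complex scenes more bits and simple ones fewer,
# while still hitting the requested average.
PASS2="ffmpeg -y -i $IN -c:v mpeg2video -b:v $VRATE -pass 2 -c:a ac3 -b:a 192k -f mpegts $OUT"

echo "$PASS1"
echo "$PASS2"
```

Note the log file ties the two runs together; both passes must use the same bitrate and codec for the allocation to be meaningful.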
  6. Surely a simple avisynth script fed to something like HCEnc would suffice, wouldn't it? You can do any resizing, colour conversions, etc., and then set the output characteristics for the mpeg2 file as you want them.
  7. 2-pass encodes are used when you want the best bitrate allocation for a particular file size (average bitrate).
    Ah. I'm aiming for constant bit rates.

    As has yet to be discussed here, U.S. terrestrial broadcast television nowadays carries several digital subchannels. The sum of the bit rates of the main channel and the subchannels must total no more than 19.39 Mbps to fit within the allocated 6 MHz RF TV channel.

    Here is the single-pass code I use:

    Code:
    ffmpeg -i "C0008.MP4" -s 1280x720 -vcodec mpeg2video -vb 5.5M -minrate 5.5M -maxrate 5.5M -bufsize 10M -muxrate 6.0M -vf scale=out_color_matrix=bt709 -acodec ac3 -strict -2 -ab 192k -y -f mpegts output.ts
  8. These days broadcasters usually use statistical encoders that encode all the channels at once. They use variable bitrates, allocating more to the channels that need it, less to those that don't, filling up the total 19 Mb/s.
  9. Originally Posted by jagabo
    These days broadcasters usually use statistical encoders that encode all the channels at once. They use variable bitrates, allocating more to the channels that need it, less to those that don't, filling up the total 19 Mb/s.
    Can you cite one such unit? Make and model and a link to a web page?
  10. I'm sure you can perform a Google search as well as I can. Here's one example: http://anywavecom.net/encoders-stream-processing/
  11. Originally Posted by chris319
    Originally Posted by jagabo
    These days broadcasters usually use statistical encoders that encode all the channels at once. They use variable bitrates, allocating more to the channels that need it, less to those that don't, filling up the total 19 Mb/s.
    Can you cite one such unit? Make and model and a link to a web page?

    lol, are you trying to fit HD within 3 Mbps with the MPEG-2 codec? Good luck...
    Cisco (Scientific Atlanta) Regulus data sheet attached as PDF...
    Attached: product_data_sheet0900aecd806e333f.pdf

  12. are you trying to fit HD within 3Mbps with MPEG-2 codec? Good luck...
    Not any more. Something about a gallon of s*** in a quart bottle.

    6 Mbps seems to make VLC happy at 1280 x 720.
  13. Member Cornucopia (Joined Oct 2001, Deep in the Heart of Texas)
    As has been told to you before: give the TV station a copy of your high quality master and they will transcode it properly for their system (either during ingest, or in realtime using tools such as jagabo mentioned).

    BTW, going to an interim very-lossy codec is almost never a good idea.

    Scott
  14. give the TV station a copy of your high quality master and they will transcode it properly for their system
    That's not the point of this exercise. The point is to do the CBR encoding myself.

    I have an ATSC modulator on order, and this is an experiment to see if an ordinary TV set can decode an MPEG-2 RF stream fed to it by the modulator. Just playing around. The experiment stops there, because I cannot work with multiple streams on different TV channels, i.e. multiplexing, without $$$ hardware.

    6 Mbps is not a typical delivery format anyway. A station would want a 25 - 50 Mbps master to be delivered.
  15. CBR is the worst choice ever - you can reduce horizontal resolution to 2/3 or 3/4 of nominal - it was quite common to broadcast 1440x1080 anamorphic 16:9, and the same rule applies to other resolutions (compare with the ATSC specification). You may gain some bits to improve quality, with limited impact on perceived resolution...
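A sketch of that squeeze in ffmpeg terms (file names hypothetical): store 1440x1080 but flag a 4:3 sample aspect ratio so the display aspect still comes out 16:9. The script only prints the command and checks the aspect arithmetic:

```shell
# Hypothetical filenames; scale to 1440x1080 and mark SAR 4:3, so that
# 1440 * 4/3 = 1920 effective pixels, i.e. a 16:9 display aspect.
W=1440; H=1080
CMD="ffmpeg -i input.ts -vf scale=${W}:${H},setsar=4/3 -c:v mpeg2video -b:v 3M -f mpegts out.ts"
echo "$CMD"

# Sanity check: storage WxH with SAR 4:3 gives DAR (W*4)/(H*3) = 16/9.
DAR_N=$((W * 4)); DAR_D=$((H * 3))
[ $((DAR_N * 9)) -eq $((DAR_D * 16)) ] && echo "DAR is 16:9"
```

The same arithmetic works for 960x720 with SAR 4:3 against a 720-line raster.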
  16. CBR is the worst choice ever
    Why?

    you can reduce horizontal resolution to 2/3 or 3/4 of nominal - it was quite common to broadcast 1440x1080 anamorphic 16:9
    How is this achieved in ffmpeg?

    Which U.S. OTA broadcasters do this? My station doesn't and we're about as major as you get.
  17. Dinosaur Supervisor KarMa (Joined Jul 2015, US)
    Originally Posted by chris319
    CBR is the worst choice ever
    Why?
    Because it gives everything the same bitrate. So a black screen gets the same bitrate as a fast action sports game. But in the broadcast world CBR is much more common and necessary because of fixed bandwidth.


    Originally Posted by chris319
    you can reduce horizontal resolution to 2/3 or 3/4 of nominal - it was quite common to broadcast 1440x1080 anamorphic 16:9
    How is this achieved in ffmpeg?

    Which U.S. OTA broadcasters do this? My station doesn't and we're about as major as you get.
    I've only known of the BBC doing this, at the start of the digital broadcasting age. Now they don't. I've seen anamorphic 480p but never 1080p-type stuff.
    Last edited by KarMa; 20th Apr 2019 at 03:20. Reason: Messed up the quotes
  18. But in the broadcast world CBR is much more common and necessary because of fixed bandwidth.
    Thank you. That's what I thought and why I went with CBR in the first place.

    Broadcasters have to budget their bit rates and may NOT exceed 19.38 Mbps across 6 MHz as you know.
  19. Member Cornucopia (Joined Oct 2001, Deep in the Heart of Texas)
    Just as DV could be anamorphic 16:9 and was occasionally used as a source for SDTV broadcast, so too can HDV, DVCProHD and a few other formats be anamorphic and still be considered legit sources for HDTV broadcast. 1440x1080 and 960x720 (both stretched to 16:9) are not that rare. They're allowed in the spec, and that leeway enables stations to ingest and transmit without resizing, thus avoiding a common quality hit.

    Y'all might need to do your homework some more instead of assuming an anecdotal sample is totally indicative of the whole market.

    Btw, I'm pretty sure pandy was referring to CBR as being "the worst choice ever" in the theoretical sense (he is well known for arguing from the place of theory & standards), when comparing the various lossy bitrate-reduction options: CBR, 1-pass VBR, 2-pass VBR, CQ/CRF.
    And in that sense, he is totally right: it is the worst, as in least efficient, least "bang for your buck". But pandy also already knows of the constraint of broadcast channel bandwidth, and I don't think he was commenting so much on your attempt to keep things within that constraint, because by your own admission it's an exercise.
    But it is a partially disingenuous exercise, because while you know the nominal overall bitrate constraint, that is NOT the constraint real programs will have in the real world, precisely because it doesn't take into account multistream compression and the statistical juggling that goes on behind the scenes. And that intrinsically would turn all those programs into some form of (even further constrained) VBR.

    Scott
  20. Y'all might need to do your homework some more instead of assuming an anecdotal sample is totally indicative of the whole market.
    The bottom line is what can the installed base of receivers handle?

    If a station starts broadcasting in anamorphic and home receivers can't deal with it, you're going to get phone calls from viewers (I've had to take some of those calls). They'll be wondering why the picture on your channel looks funny. During the transition to digital we had to assign someone just to take viewer reception calls.

    The ATSC spec supports h.264 but we're not broadcasting it. I was told point blank that we're not going to because there might be receivers "out there" that can't handle it and we could lose viewers. Engineering managers have lost their jobs for making decisions that have cost a station viewers.

    Same with ATSC 3. No station is going to risk losing viewers over ATSC 3 and we're not going to make viewers buy a new TV receiver just to watch an ATSC 3 signal.

    it is a partially disingenuous exercise, because while you know the nominal overall bitrate constraint, that is NOT the constraint real programs will have in the real world, precisely because it doesn't take into account multistream compression and the statistical juggling that goes on behind the scenes. And that intrinsically would turn all those programs into some form of (even further constrained) VBR.
    I have already posted (recently) that many programs are delivered at 25 - 50 Mbps. Some are even h.264. Needless to say, they must be transcoded to MPEG-2 at a lower bit rate. To complicate things further, every station is different.
    Last edited by chris319; 20th Apr 2019 at 03:13.
  21. Dinosaur Supervisor KarMa (Joined Jul 2015, US)
    Originally Posted by Cornucopia
    Just like DV could be anamorphic 16:9 and was occasionally used as sources for SDTV broadcast, so too is HDV, DVCProHD and a few other formats can be anamorphic and still be considered legit sources for HDTV broadcast. 1440x1080 and 960x720 (both stretched to 16:9) is not that rare. It's allowed in the spec, and that leeway enables stations to ingest and transmit without resizing, thus avoiding a common quality hit.
    Do you have a station call sign that does 1440x1080? You can enter the call sign here and click "technical data"; they are usually pretty good at showing the true resolution, and sometimes stuff like bitrate and whether they use VBR: https://www.rabbitears.info/market.php?request=station_search&callsign=KLRU#station

    Do stations actually freely switch between 1920 and 1440, like between every commercial? Are broadcasters really going to take the risk that every consumer-level TV is fast enough, or even capable of, switching seamlessly? These are the same stations that broadcast 24/7 stereo or 24/7 5.1 surround without ever switching, even though most sources are going to be stereo - meaning most broadcast 5.1 audio is just upmixed stereo, simply for the sake of consistency. As for anamorphic capture formats (DVCProHD, etc.), those still need to be converted to MPEG-2, and if they get an anamorphic feed from wherever (sat, fiber) they are still going to convert it to be compliant with their system. Even if the source is MPEG-2 and within ATSC spec, it's probably going to get encoded again for broadcast. So in any of these cases, they have opportunities to resize to the resolution they broadcast at.
  22. Do stations actually freely switch between 1920 and 1440, like between every commercial?
    What would be the point?
  23. Dinosaur Supervisor KarMa (Joined Jul 2015, US)
    Originally Posted by chris319
    Do stations actually freely switch between 1920 and 1440, like between every commercial?
    What would be the point?
    Idk, not having to stretch the content out to 16:9 yourself. But I've never seen 1440x1080 in the two markets I watch TV from, let alone switching between 1920 and 1440 on the fly. Nor have I seen any of this on FTA satellite.
  24. Idk, not having to stretch the content out to 16:9 yourself.
    In 2019 it's probably shot in 16:9.
  25. CBR is the worst ever as it is a pure waste of bandwidth - it is used by broadcasters only when some limitation of the head end exists (mostly lack of a statistical multiplexer) - a decent broadcaster plans the mux and channel allocations carefully; usually there is one HQ service plus a few auxiliary ones...
    IMHO it is way better to encode video in VBR with a particular VBV buffer (in the case of MPEG-2 it is quite limited) and the maximum bitrate allowed to fill the buffer - this should satisfy a static multiplexer. IMHO CBR is mostly a workaround for a poor multiplexer.

    Anamorphic HD is (was?) quite common for HD MPEG-2 services. IMHO reducing luminance bandwidth to 2/3 or 3/4 is not a severe quality loss, and it can sometimes save even a few Mbps; careful pre-processing (e.g. adaptive motion blur) may also benefit overall quality.
    As loss of quality is unavoidable, it is better to do it in a way that is more subjectively acceptable (i.e. trying to match the human eye's spatial/temporal characteristics). Some techniques used by x264 (particularly in-loop deblocking) may be applied to MPEG-2 (I think this was the main idea behind the x262 project).
    AFAIR NHK demonstrated many years ago an MPEG-2 encoder capable of delivering decent quality for 1920x1080 at around 11 - 15 Mbps...
    Attached: trev_304-mpeg2.pdf
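In ffmpeg terms, the VBR-with-VBV approach described above might be sketched as follows (the rates and file names are illustrative; the buffer value is the MPEG-2 MP@ML VBV limit, which the High Level relaxes). The script only prints the command for inspection:

```shell
# Illustrative rates: 3 Mb/s average, peaks capped at 6 Mb/s, constrained by
# a fixed VBV buffer (ffmpeg's -bufsize is given in bits for mpeg2video).
AVG="3M"; MAX="6M"
VBV=1835008   # MPEG-2 MP@ML VBV size in bits; MP@HL allows up to 9781248

CMD="ffmpeg -i input.ts -c:v mpeg2video -b:v $AVG -maxrate $MAX -bufsize $VBV -c:a ac3 -b:a 192k -f mpegts out.ts"
echo "$CMD"
```

The maxrate/bufsize pair is what keeps the peaks within what a static multiplexer slot can absorb, while the average stays lower than a CBR stream of the same peak.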

  26. CBR is the worst ever as it is a pure waste of bandwidth - it is used by broadcasters only when some limitation of the head end exists (mostly lack of a statistical multiplexer) - a decent broadcaster plans the mux and channel allocations carefully; usually there is one HQ service plus a few auxiliary ones...
    All this to save a few Mbps?

    viz.:

    Service 4 SD: 3 Mbps
    Service 3 SD: 3 Mbps
    Service 2 SD: 3 Mbps
    Service 1 HD: 10 Mbps

    Total:
    19 Mbps

    Where is the big waste of bit rate?

    Bigger question: can the audience or the advertising clients tell the difference?
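The arithmetic for a layout like the one listed can be sanity-checked mechanically; the script below just re-adds the service mix (the overhead figure for PSIP tables and null packets is an assumption, not from the thread):

```shell
# All rates in kb/s so the shell's integer arithmetic suffices.
BUDGET=19390        # ~19.39 Mb/s ATSC payload in a 6 MHz channel
HD=10000            # Service 1 HD
SD=3000             # per SD subchannel
N_SD=3              # Services 2-4
OVERHEAD=390        # PSIP tables, null packets, etc. (assumed figure)

TOTAL=$((HD + N_SD * SD + OVERHEAD))
echo "total ${TOTAL} kb/s of ${BUDGET} kb/s"
[ "$TOTAL" -le "$BUDGET" ] && echo "fits in the mux"
```

With these numbers the mux is exactly full, which is the point of contention: a statistical multiplexer would shuffle those fixed allocations moment to moment instead.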
    Last edited by chris319; 21st Apr 2019 at 01:10.
  27. Member Cornucopia (Joined Oct 2001, Deep in the Heart of Texas)
    You think (possibly interlaced) SD MPEG-2 @ 3 Mbps and HD MPEG-2 @ 10 Mbps is good?

    Ok, nuff said. Oh yeah, CBR at that.

    Scott
    Last edited by Cornucopia; 21st Apr 2019 at 02:22.
  28. You think (possibly interlaced) SD MPEG-2 @ 3 Mbps and HD MPEG-2 @ 10 Mbps is good?
    Your argument is with the corporate engineering managers who make these decisions.

    They typically have an HD main channel and several SD subchannels. That layout is close to what they use. The subchannels often run old B&W movies and filmed TV shows, or foreign language programming, BUT the commercials might be in color and in HD. The commercials are where the money is.
  29. Originally Posted by chris319
    CBR is the worst ever as it is a pure waste of bandwidth - it is used by broadcasters only when some limitation of the head end exists (mostly lack of a statistical multiplexer) - a decent broadcaster plans the mux and channel allocations carefully; usually there is one HQ service plus a few auxiliary ones...
    All this to save a few Mbps?

    viz.:

    Service 4 SD: 3 Mbps
    Service 3 SD: 3 Mbps
    Service 2 SD: 3 Mbps
    Service 1 HD: 10 Mbps

    Total:
    19 Mbps

    Where is the big waste of bit rate?

    Bigger question: can the audience or the advertising clients tell the difference?
    Everywhere... that's why on a single mux you need to mix different types of services - dynamic with static - so they can share the dynamically available bandwidth and improve overall quality. Objectively, a mux using CBR will always be, on average, worse in quality than a mux using statistical multiplexing. With VBR (vbrmax) you may gain some space for auxiliary data (dynamically ingested into the overall mux) - proper service planning is a demanding task; you need to start by analysing complexity...
    Advertising is content that can be compressed offline - that is another topic. How many of your services are live, and how many are ingested from a library? If all of them are from a library, then you can simulate statistical multiplexing by proper planning...
    It is not clear to me what your intentions are, but I have the impression that there are many naive assumptions in the background...
  30. Member Cornucopia (Joined Oct 2001, Deep in the Heart of Texas)
    Agreed.

    My argument was with your assumption that, at those mediocre bitrate levels, CBR would do justice to the quality without wasting bitrate. It does neither.
    BTW, color vs. B&W is not a compelling reason for judging a certain channel set as being of acceptable quality at such mediocre levels - color doesn't contribute appreciably to the bitrate requirements, due to chroma subsampling. ESPECIALLY where conversion from DV is involved.

    Scott


