I have a camcorder that shoots at 50 Mbps. I want to broadcast its footage on an ATSC MPEG-2 subchannel, so the bit rate must be around 3 Mbps.
I'm having trouble downconverting it as an MPEG-2 transport stream from 50 Mbps to 3 Mbps. The video tends to freeze at 3 Mbps when played in VLC, even if I do two-pass MPEG-2. Audio is AC3 at 192k, which poses no problem, except when playback freezes.
Would it make sense to transcode it first to, say, H.265 or HuffYUV lossless to get a lower bit rate, say 25 Mbps, and then to 3 Mbps MPEG-2, rather than going directly from 50 Mbps to 3 Mbps, or is this a dumb idea?
Yes, there are TV stations that broadcast programming at 2.3 Mbps. See KPBS:
https://en.wikipedia.org/wiki/KPBS_(TV)#Digital_channels
-
No (and HuffYUV will give you higher bitrates, not lower). Re-encoding video involves first decompressing the source, then compressing with the new codec. The encoder has no idea what the bitrate of the source was; it receives uncompressed frames.
At 480i. -
-
2-pass encodes are used when you want the best bitrate allocation for a particular file size (average bitrate). The encoder examines the video during the first pass so it knows what parts need more bitrate and what parts need less; that information is stored in a log file. During the second pass it uses that log to determine how much bitrate to give each part to meet the requested size.
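The idea can be sketched in a few lines of Python. This is a toy model only; real encoders like ffmpeg's mpeg2video use far more elaborate rate control, and the function names here are made up for illustration:

```python
# Toy illustration of two-pass average-bitrate allocation (NOT ffmpeg's
# actual rate control): pass 1 measures per-segment complexity, pass 2
# splits a fixed bit budget in proportion to that complexity.

def first_pass(complexities):
    """Pretend analysis pass: just record complexity per segment."""
    return list(complexities)  # a real encoder writes this to a log file

def second_pass(log, avg_bitrate_bps, seg_seconds=1.0):
    """Allocate the total bit budget proportionally to logged complexity."""
    total_bits = avg_bitrate_bps * seg_seconds * len(log)
    total_complexity = sum(log)
    return [total_bits * c / total_complexity for c in log]

# Two quiet segments (e.g. black screen) vs. two fast-action segments:
log = first_pass([1.0, 1.0, 8.0, 10.0])
alloc = second_pass(log, avg_bitrate_bps=3_000_000)
print([round(a / 1e6, 2) for a in alloc])  # → [0.6, 0.6, 4.8, 6.0] (Mb per segment)
```

The average still comes out to exactly 3 Mbps; the quiet parts just donate their surplus to the busy parts.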
-
2-pass encodes are used when you want the best bitrate allocation for a particular file size (average bitrate).
As hasn't been discussed here yet, U.S. terrestrial broadcast television nowadays carries several digital subchannels. The combined bit rate of the main channel and subchannels must total no more than 19.39 Mbps to fit within the allocated 6 MHz RF TV channel.
Here is the single-pass code I use:
Code:
ffmpeg -i "C0008.MP4" -s 1280x720 -vcodec mpeg2video -vb 5.5M -minrate 5.5M -maxrate 5.5M -bufsize 10M -muxrate 6.0M -vf scale=out_color_matrix=bt709 -acodec ac3 -strict -2 -ab 192k -y -f mpegts output.ts
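One thing worth checking with a command like that is whether the requested -muxrate actually leaves room for the elementary streams plus transport-stream packetization. A rough back-of-the-envelope check in Python (the ~4% TS overhead figure is an illustrative assumption, not an exact number):

```python
# Sanity-check a TS muxrate against the elementary stream rates.
# 188-byte TS packets carry 4-byte headers plus PAT/PMT/PCR tables;
# ~4% overhead is a rough illustrative estimate, not a spec value.

def min_muxrate(video_bps, audio_bps, ts_overhead=0.04):
    """Approximate minimum muxrate for one video + one audio stream."""
    return (video_bps + audio_bps) * (1 + ts_overhead)

need = min_muxrate(5_500_000, 192_000)
print(f"{need / 1e6:.2f} Mbps needed, 6.00 Mbps requested")  # → 5.92 Mbps needed...
```

So 6.0M is cutting it close for 5.5M CBR video plus 192k audio; with no headroom in the mux, stalls at the decoder are one plausible cause of the freezing.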
-
These days broadcasters usually use statistical encoders that encode all the channels at once. They use variable bitrates, allocating more to the channels that need it, less to those that don't, filling up the total 19 Mb/s.
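A toy sketch of what such a statistical multiplexer does, assuming a shared pool of roughly 19 Mbps and a made-up per-service floor (real statmuxes are far more sophisticated):

```python
POOL_MBPS = 19.0  # approximate video payload shared by all services in the mux

def statmux(complexities, pool=POOL_MBPS, floor=1.0):
    """Toy statistical multiplexer: each service gets a guaranteed floor,
    and the rest of the pool is split in proportion to the instantaneous
    complexity of each service's content."""
    spare = pool - floor * len(complexities)
    total = sum(complexities)
    return [floor + spare * c / total for c in complexities]

# An HD sports channel (busy) sharing a mux with three quiet SD subchannels:
rates = statmux([8.0, 1.0, 0.5, 0.5])
print([round(r, 2) for r in rates])  # → [13.0, 2.5, 1.75, 1.75]
```

The allocations always sum to the pool, so the channel that needs bits at this instant gets them, and nothing is parked on a black screen.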
-
-
I'm sure you can perform a Google search as well as I can. Here's one example: http://anywavecom.net/encoders-stream-processing/
-
-
As has been told to you before: give the TV station a copy of your high quality master and they will transcode it properly for their system (either during ingest, or in realtime using tools such as jagabo mentioned).
BTW, going to an interim very-lossy codec is almost never a good idea.
Scott -
give the TV station a copy of your high quality master and they will transcode it properly for their system
I have an ATSC modulator on order and this is an experiment to see if an ordinary TV set can decode an MPEG-2 RF stream fed to it by the modulator. Just playing around. The experiment stops there, because I cannot work with multiple streams on different TV channels, i.e. multiplexing, without $$$ hardware.
6 Mbps is not a typical delivery format anyway. A station would want a 25 - 50 Mbps master to be delivered. -
CBR is the worst choice ever. You can also reduce horizontal resolution to 2/3 or 3/4 of nominal; it was quite common to broadcast 1440x1080 anamorphic 16:9, and the same rule applies to other resolutions (compare with the ATSC specification). You may gain some bits to improve quality, with limited impact on perceived resolution...
-
Because it gives everything the same bitrate. So a black screen gets the same bitrate as a fast action sports game. But in the broadcast world CBR is much more common and necessary because of fixed bandwidth.
I've only known the BBC to do this, at the start of the digital broadcasting age. Now they don't. I've seen anamorphic 480p but never 1080p-type stuff.
-
But in the broadcast world CBR is much more common and necessary because of fixed bandwidth.
Broadcasters have to budget their bit rates and may NOT exceed 19.39 Mbps across 6 MHz, as you know. -
Just as DV could be anamorphic 16:9 and was occasionally used as a source for SDTV broadcast, HDV, DVCPRO HD, and a few other formats can be anamorphic and still be considered legit sources for HDTV broadcast. 1440x1080 and 960x720 (both stretched to 16:9) are not that rare. They're allowed in the spec, and that leeway enables stations to ingest and transmit without resizing, thus avoiding a common quality hit.
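The arithmetic behind the quality argument is simple: storing 1440 instead of 1920 luma samples per line cuts the pixel rate by a quarter, leaving more bits per pixel at the same bitrate. This is a rough proxy only, since codec efficiency isn't linear in pixel count:

```python
# Luma samples per second for full-raster vs. anamorphic 1080-line HD.

def luma_pixel_rate(width, height, fps):
    """Luma samples the encoder must code per second."""
    return width * height * fps

full = luma_pixel_rate(1920, 1080, 29.97)
ana = luma_pixel_rate(1440, 1080, 29.97)  # stored anamorphic, displayed as 16:9
print(f"savings: {1 - ana / full:.0%}")  # → savings: 25%
```

At a fixed bitrate, that 25% fewer pixels translates directly into more bits available for each remaining pixel.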
Y'all might need to do your homework some more instead of assuming an anecdotal sample is totally indicative of the whole market.
Btw, I'm pretty sure pandy was referring to CBR as being "the worst choice ever" in the theoretical sense (he is well known for quoting from the place of theory & standards), when comparing the various lossy BRR options: CBR, 1passVBR, 2passVBR, CQ/CRF.
And in that sense, he is totally right: it is the worst, as in least efficient, least "bang for your buck". But pandy also already knows of the constraint of broadcast channel bandwidth, and I don't think he was commenting so much on your attempt to keep things within that constraint, because by your own admission "it's an exercise".
But it is a partially disingenuous exercise, because while you know the nominal overall bitrate constraint, that is NOT the constraint real programs will have in the real world, precisely because it doesn't take into account multistream compression and the statistical juggling that goes on behind the scenes. And that intrinsically would turn all those programs into some form of (even further constrained) VBR.
Scott -
Y'all might need to do your homework some more instead of assuming an anecdotal sample is totally indicative of the whole market.
If a station starts broadcasting in anamorphic and home receivers can't deal with it, you're going to get phone calls from viewers (I've had to take some of those calls). They'll be wondering why the picture on your channel looks funny. During the transition to digital we had to assign someone just to take viewer reception calls.
The ATSC spec supports h.264 but we're not broadcasting it. I was told point blank that we're not going to because there might be receivers "out there" that can't handle it and we could lose viewers. Engineering managers have lost their jobs for making decisions that have cost a station viewers.
Same with ATSC 3. No station is going to risk losing viewers over ATSC 3 and we're not going to make viewers buy a new TV receiver just to watch an ATSC 3 signal.
it is a partially disingenuous exercise, because while you know the nominal overall bitrate constraint, that is NOT the constraint real programs will have in the real world, precisely because it doesn't take into account multistream compression and the statistical juggling that goes on behind the scenes. And that intrinsically would turn all those programs into some form of (even further constrained) VBR.
-
Do you have a station call sign that does 1440x1080? You can enter the call sign here and click "technical data"; they are usually pretty good at showing the true resolution, and sometimes stuff like bitrate and whether they use VBR: https://www.rabbitears.info/market.php?request=station_search&callsign=KLRU#station
Do stations actually freely switch between 1920 and 1440, like between every commercial? Are broadcasters really going to take the risk that every consumer-level TV is fast enough for, or even capable of, seamlessly switching? These are the same stations that broadcast 24/7 stereo or 24/7 5.1 surround without ever switching, even though most sources are going to be stereo, meaning most broadcast 5.1 audio is just upmixed stereo, simply for the sake of consistency. As for anamorphic capture formats (DVCPRO HD, etc.), those still need to be converted to MPEG-2, and if stations get an anamorphic feed from wherever (sat, fiber) they are still going to convert it to be compliant with their system. Even if the source is MPEG-2 and within ATSC spec, it's probably going to get encoded again for broadcast. So in any of these cases, they have opportunities to resize to the resolution they broadcast at. -
Do stations actually freely switch between 1920 and 1440, like between every commercial?
-
-
Idk, not having to stretch the content out to 16:9 yourself.
-
CBR is the worst ever, as it is a pure waste of bandwidth; it is used by broadcasters only when some limitation of the head end exists (mostly the lack of a statistical multiplexer). Decent broadcasters plan the channel allocations on a mux carefully; usually there is one HQ service plus a few auxiliary ones...
IMHO it is way better to encode video as VBR with a particular VBV buffer (in the case of MPEG-2 it is quite limited) and a maximum bitrate allowed to fill the buffer; this should satisfy a static multiplexer. IMHO CBR is mostly a workaround for a poor multiplexer.
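The VBV constraint can be illustrated with a toy decoder-buffer simulation. It is deliberately simplified (the buffer starts full and fills at maxrate each frame interval; real decoders model an initial fill delay); 1,835,008 bits is the MPEG-2 MP@ML VBV buffer size, and HD profiles use a larger one:

```python
def vbv_ok(frame_bits, maxrate_bps, vbv_bits, fps):
    """Simplified VBV check: the buffer fills at maxrate and each coded
    frame drains its size; underflow means the decoder would stall."""
    fill_per_frame = maxrate_bps / fps
    level = vbv_bits  # simplification: start with a full buffer
    for bits in frame_bits:
        level = min(level + fill_per_frame, vbv_bits)  # cap at buffer size
        level -= bits
        if level < 0:
            return False  # underflow: frame arrives too late
    return True

fps = 30
frames = [150_000] * 30  # a steady ~4.5 Mb/s of coded frames
print(vbv_ok(frames, 6_000_000, 1_835_008, fps))                    # → True
print(vbv_ok([2_000_000] + frames, 6_000_000, 1_835_008, fps))      # → False
```

A VBR encode that never underflows this buffer at the mux's maxrate is exactly the kind of stream a static multiplexer can accept without CBR padding.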
Anamorphic HD is (was?) quite common for HD MPEG-2 services. IMHO reducing luminance bandwidth to 2/3 or 3/4 is not a severe quality loss, and it can sometimes save even a few Mbps; careful pre-processing (e.g. adaptive motion blur) may also benefit overall quality.
As loss of quality is unavoidable, it is better to do it in a way that is subjectively more acceptable (i.e. trying to match the human eye's spatial/temporal characteristics). Some techniques used by x264 (particularly in-loop deblocking) might be applied to MPEG-2 (I think this was the main idea behind the x262 project).
AFAIR, NHK demonstrated many years ago an MPEG-2 encoder capable of delivering decent quality for 1920x1080 at around 11 - 15 Mbps... -
CBR is worst ever as it is pure waste of bandwidth - it is used by broadcasters only when some limitations of head end exist (mostly lack of statistical multiplexer) - decent broadcaster plan carefully mux on channel allocations, usually there is one hq service plus few auxiliary...
viz.:
Service 4 SD: 3 Mbps
Service 3 SD: 3 Mbps
Service 2 SD: 3 Mbps
Service 1 HD: 10 Mbps
Total:
19 Mbps
Where is the big waste of bit rate?
Bigger question: can the audience or the advertising clients tell the difference?
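For what it's worth, the layout above does fit the 19.39 Mbps payload. A quick check in Python (the 0.3 Mbps PSIP/overhead allowance is an illustrative assumption, not a spec value):

```python
ATSC_PAYLOAD_MBPS = 19.39  # total MPEG-2 TS payload of one 6 MHz ATSC 1.0 channel

def fits_in_mux(service_rates_mbps, overhead_mbps=0.3):
    """True if the listed services, plus a PSIP/tables allowance, fit the
    ATSC payload. The 0.3 Mbps overhead is an illustrative assumption."""
    return sum(service_rates_mbps) + overhead_mbps <= ATSC_PAYLOAD_MBPS

# One 10 Mbps HD main channel plus three 3 Mbps SD subchannels:
print(fits_in_mux([10.0, 3.0, 3.0, 3.0]))  # → True
print(fits_in_mux([12.0, 3.0, 3.0, 3.0]))  # → False
```

The layout is feasible; whether fixed 3 Mbps slices are a good use of those bits is the separate question being argued here.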
-
You think (possibly interlaced) SD mpeg2 @3mbps and HD mpeg2 @10mbps is good?
Ok, nuff said. Oh yeah, CBR at that.
Scott
-
You think (possibly interlaced) SD mpeg2 @3mbps and HD mpeg2 @10mbps is good?
They typically have an HD main channel and several SD subchannels. That layout is close to what they use. The subchannels often run old B&W movies and filmed TV shows, or foreign language programming, BUT the commercials might be in color and in HD. The commercials are where the money is. -
Everywhere... That's why on a single mux you need to mix different types of services, dynamic with static, so they can share the available bandwidth dynamically and improve overall quality. Objectively, a mux using CBR will always have worse average quality than a mux using statistical multiplexing. With VBR (capped at a vbrmax) you may gain some space for auxiliary data (dynamically ingested into the overall mux). Proper service planning is a demanding task; you need to start by analysing complexity...
Advertising is content that can be compressed offline, but that is another topic. How many of your services are live, and how many are ingested from a library? If all of them are from the library, then you can simulate statistical multiplexing with proper planning...
It is not clear to me what your intentions are, but I have the impression that there are many naive assumptions in the background... -
Agreed.
My argument was with your assumption that, at those mediocre bitrate levels, CBR would do justice to the quality without wasting bitrate. It does neither.
BTW, color vs. B&W (or the lack of it) is not a compelling reason for judging a channel set acceptable at such mediocre quality levels: color doesn't contribute appreciably to the bitrate requirements, due to chroma subsampling, ESPECIALLY where conversion from DV is involved.
Scott