VideoHelp Forum
  1. Will there be any picture quality difference if I encode video at the same bitrate for 720p and for 1080p?
  2. It depends on the source and the bitrate. If you use a high enough bitrate it probably won't make any difference (aside from any detail lost by resizing down): if the quality is high for 1080p it'll be high for 720p too, even though the bitrate would be higher than 720p needed for decent quality. At low bitrates you're more likely to see compression artefacts at 1080p, because there's more picture to encode with the same number of bits available to encode it.

    Most people use quality-based encoding rather than specifying an average bitrate for the x264 encoder. You can't control the bitrate as such, but the quality will be the same for a given quality setting (a CRF value, as x264 calls it). You can probably get away with slightly lower quality for 1080p, because the encode doesn't need to be upscaled on playback (or is upscaled less); upscaling also enlarges any compression artefacts, making them easier to see.

    I tend to encode at 720p unless the source has lots of fine detail that'll be lost when downscaling (which often isn't the case), using a decent quality setting, but there's probably no simple answer.
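    If it helps, here's a minimal Avisynth sketch of the downscale step I'm describing (the file name is just an example, and it assumes you have a source plugin such as FFMS2 installed):

    # Open the 1080p source (FFVideoSource comes from the FFMS2 plugin)
    FFVideoSource("D:\source-1080p.mkv")
    # Spline36 is a reasonably sharp resizer for the 1080p -> 720p downscale
    Spline36Resize(1280,720)

    The script is then fed to the encoder with a quality (CRF) setting rather than a fixed bitrate.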
  3. Ditto everything that hello_hello wrote.

    I'd also say that it's often incredibly subjective. Most people will probably be fine with either option, but there'll always be some people who will see the flaws and not be happy. (Many of them are members here!) If you can, I'd suggest cutting a relatively small bit of video that ideally contains some fast-moving action across the screen as well as some close-ups of faces, and doing some test encodes on it. If you do as hello_hello suggests and use quality-based encoding then, once you've found the settings that you're happy with, you can use them as a starting point for any future encodes.

    Let us know how you get on and what you decided in the end.
  4. There are no rules, guidelines, or formulas to answer the question, because the interrelationship between resolution, bitrate, and quality is not at all linear and is highly subjective: you trade one artifact (loss of detail as you decrease the resolution) for another (increasing encoding artifacts as you decrease the bitrate).

    My answer: YOU have to encode several test samples, watch them on the display you plan to be using, and decide for yourself. It's the only way to get the right answer.
  5. Sorry, I should have been more specific about the encoding. I will use 2-pass bitrate encoding. Assume the source is a Blu-ray and I will encode at a bitrate of 4000 kb/s.

    1080p => encoding will take a longer time
    720p => encoding will take less time compared to 1080p.

    Will there be any benefit if I encode at 1080p? And from a technical point of view, will there be any difference when watching on a big screen?
  6. As stated above, only you can decide if the quality difference is noticeable on your set, now and in the future when you upgrade. 1080p contains about 2.25x the pixels (~0.9 million px for 720p vs ~2.1 million px for 1080p), and on a 1080p HDTV it will be pixel perfect, whereas the 720p version has to be upscaled and is therefore slightly less clear.

    This may not be a good analogy, but say you have half a bottle of wine, add water to make it full again, and pour a glass from it. Could you tell the difference from a glass of undiluted wine poured from a full bottle? Since I'm not a wine drinker I probably couldn't tell the difference, but someone who knows wine definitely would.
  7. Originally Posted by firekid
    Sorry, I should have been more specific about the encoding. I will use 2-pass bitrate encoding. Assume the source is a Blu-ray and I will encode at a bitrate of 4000 kb/s.

    1080p => encoding will take a longer time
    720p => encoding will take less time compared to 1080p.

    Will there be any benefit if I encode at 1080p? And from a technical point of view, will there be any difference when watching on a big screen?
    The problem is that no two videos require the same bitrate for the same quality. It depends on how much grain there is, and how much action and detail, etc. At CRF18, which is a fairly high quality for x264, I've had 720p encodes come in at under 2000 kb/s, so a 1080p encode of the same video at 4000 kb/s would probably look good, but for harder-to-compress video 4000 kb/s wouldn't be unrealistic for 720p, so at 1080p the quality wouldn't be all that special.
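    To put a rough number on how thinly 4000 kb/s gets spread, here's a back-of-the-envelope check (Avisynth is used purely as a calculator here, and a 24fps source is assumed):

    # Bits available per pixel per frame at 4000 kb/s, assuming a 24fps source
    bitrate = 4000.0 * 1000                # bits per second
    fps = 24.0
    bpp_1080 = bitrate / (1920*1080*fps)   # roughly 0.08 bits per pixel
    bpp_720  = bitrate / (1280*720*fps)    # roughly 0.18 bits per pixel
    BlankClip(width=640, height=120, fps=24)
    Subtitle("1080p: " + String(bpp_1080) + " bpp    720p: " + String(bpp_720) + " bpp")

    The same bitrate has to cover 2.25 times as many pixels at 1080p, which is why harder-to-compress sources tend to show artefacts there first.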

    How far you sit from the screen can be a factor. I'm generally not close enough to see 1080p worth of detail, so if I were forced to encode at 4000 kb/s all the time and didn't want to mess around checking the quality of each encode, I'd probably stick to 720p and keep the encoding quality high, but that's just me.

    There's a guide to resolution and viewing distance here: http://s3.carltonbale.com/resolution_chart.html

    I still wouldn't bother with 2-pass encoding unless there's a good reason. CRF encoding produces the same quality as 2-pass at the same bitrate, given the same encoder settings (if you ran a CRF encode and then used the resulting bitrate for a 2-pass encode, the quality would be virtually identical), and as CRF only requires a single pass it's a fair bit faster. The file sizes will vary, but after a few encodes you'll settle on a quality setting you're happy with that gives you a bitrate you're happy with... on average. I generally use CRF18 for 720p and CRF20 for 1080p, x264's medium speed preset or slower, and the film tuning, but once again, that's just me.
    Last edited by hello_hello; 14th Dec 2019 at 21:09.
  8. These pics compare 1080p to 720p. Both are at a bitrate of 35 Mbps. If you switch between them you can see the loss of detail in the 720p version.
    [Attached images: 1080p.JPG, 720p.JPG]

    Canon C100 mk2 - Dell XPS8700 i7 - Win 10 - 24gb RAM - GTX 1060/6GB - DaVinci Resolve Studio 18.6.3 - Blackmagic Speed Editor - Presonus Faderport 1 - 3 calibrated screens
  9. I'd be willing to go out on a limb and say that's more blurring due to the downscaling and/or upscaling not being sharp than it is a loss of detail as such. Of course that is a factor. You are at the mercy of the player's/TV's upscaling in that respect. However.....

    The 1080p image downscaled to 720p then upscaled again in an Avisynth script (screenshot 1).

    ImageSource("D:\1080p.JPG")
    Spline36Resize(1280,720)
    Spline36Resize(1920,1080)

    Downscaling to 540p and back still looks a tad sharper than the 720p image upscaled by WMP, although it's hard to know for sure as they're not the same frame (screenshot 2).

    ImageSource("D:\1080p.JPG")
    Spline36Resize(960,540)
    Spline36Resize(1920,1080)

    To 720p and back with a little extra sharpening. I converted to YV12 first for LSFMod (screenshot 3).

    ImageSource("D:\1080p.JPG")
    ConvertToYV12(matrix="Rec709")
    Spline36Resize(1280,720)
    Spline36Resize(1920,1080)
    LSFMod()

    I'm not saying encoding at the original resolution isn't ideal if there are no bitrate limitations to worry about, but the visible difference in detail is often negligible to none, even when comparing still frames.
    [Attached images: 1080p-720p-1080p.png, 1080p-540p-1080p.png, 1080p-720p-1080p-lsfmod.png]

    Last edited by hello_hello; 15th Dec 2019 at 16:50.
  10. The 720p was not downscaled using a video player; it was rendered at that resolution in DaVinci Resolve. The 1080p is from the camera original.
    Canon C100 mk2 - Dell XPS8700 i7 - Win 10 - 24gb RAM - GTX 1060/6GB - DaVinci Resolve Studio 18.6.3 - Blackmagic Speed Editor - Presonus Faderport 1 - 3 calibrated screens
  11. Yeah, but if the 1080p source was used for the 720p encode, it must've been downscaled for encoding. That's what I meant when I referred to the downscaling, but I don't know anything about DaVinci Resolve. The blurriness could be purely due to the upscaling on playback. I don't know how WMP upscales.
  12. Going back to the OP's original question, 1080p has greater detail than 720p. If, for instance, you intend to upload your video to YouTube, you should strive for the highest quality possible to combat the re-encoding that YT will do. As the OP says about viewing on a big screen, the lower resolution of 720p will be even more apparent when the video is scaled to fit that screen. So I come out on the side of 1080p being better.
    Canon C100 mk2 - Dell XPS8700 i7 - Win 10 - 24gb RAM - GTX 1060/6GB - DaVinci Resolve Studio 18.6.3 - Blackmagic Speed Editor - Presonus Faderport 1 - 3 calibrated screens
  13. @ChapmanDolly
    For me it is hard to believe that your 2 pictures are even from the same original frame. See for example the steam: it is totally different between the 2 pics when switching back and forth, and that never happens due to a scaling effect alone.
    Also, are your 2 pictures of the same type, i.e. are both I, P, or B frames?
  14. There's no way of knowing what type the pics are. The original video file is 1080p 50fps MP4 at 35 Mbps. For maybe a fairer comparison, these two new pics are grabs from the DaVinci Resolve timeline. Top 1080p, bottom 720p.
    The steam is different in the original post as they are not quite the same frame.
    [Attached images: 1080p Capture.JPG, 720p Capture.JPG]

    Last edited by ChapmanDolly; 16th Dec 2019 at 10:08.
    Canon C100 mk2 - Dell XPS8700 i7 - Win 10 - 24gb RAM - GTX 1060/6GB - DaVinci Resolve Studio 18.6.3 - Blackmagic Speed Editor - Presonus Faderport 1 - 3 calibrated screens
  15. Originally Posted by ChapmanDolly
    The 720p was not downscaled using a video player; it was rendered at that resolution in DaVinci Resolve. The 1080p is from the camera original.
    I understand you're comparing the 1080p original from the camera with a re-encoded (re-rendered) 720p frame from DaVinci. That means the 720p went through an extra encoding step, if I read your post correctly. I would conclude that the blurriness of the 720p is an effect of the re-encoding rather than loss of detail due to the downscaling alone.

    Secondly, the true resolution of the "original" out of the camera appears to be less than 1080p (despite the 1080p picture size, of course) due to the camera's on-the-fly compression. You can see compression artefacts like halos and mashed details (the leaves of the trees, the grass in the background). The 720p DaVinci picture seems to have smoothed away some of these artefacts, but introduces a little blurriness at the same time.

    Anyway, the resolution of a true 1080p high quality source will be better than 720p (by definition!). Normally 720p is perceived as a slightly softer picture.
  16. dellsam34
    If encoding time is your concern, then leave the resolution the same as the source to save the re-scaling processing time; if the Blu-ray is 1080p then so be it. Besides, in the future, if the footage is viewed on 4K and 8K TV sets the upscaling will be straight math, just doubling the scan lines, but with 720p there is going to be a compromise. You never go wrong by seeking better quality, that I can tell you for sure.
  17. Originally Posted by dellsam34
    If encoding time is your concern, then leave the resolution the same as the source to save the re-scaling processing time .......
    Really? Re-scaling is much faster than encoding.
    Rescaling + encoding at 720p is much faster than no rescaling and encoding at 1080p (assuming the OP re-encodes the 1080p source to reduce the file size, rather than just copying it).
  18. Originally Posted by ChapmanDolly
    Going back to the OP's original question, 1080p has greater detail than 720p. If, for instance, you intend to upload your video to YouTube, you should strive for the highest quality possible to combat the re-encoding that YT will do. As the OP says about viewing on a big screen, the lower resolution of 720p will be even more apparent when the video is scaled to fit that screen. So I come out on the side of 1080p being better.
    1080p can potentially have more detail than 720p, but it's not always the case, so a lower resolution mightn't necessarily look less detailed when upscaled to 1080p. Downscaling your original 1080p screenshot and then upscaling it again with a sharper resizer shows that. At least it appears that way to me.
    It also doesn't take into consideration that the OP wants to use a bitrate of 4000 kb/s. A 1080p encode at that bitrate is more likely to have visible encoding artefacts than the same bitrate at 720p. I'd rather have a little blurriness than compression artefacts if I had to choose.

    Here's an example of a little TLC making an upscaled 720p encode look sharper than the 1080p source, and better to me, even after some noise filtering, although I doubt the OP wants to go to that much trouble.

    Originally Posted by dellsam34
    If encoding time is your concern, then leave the resolution the same as the source to save the re-scaling processing time; if the Blu-ray is 1080p then so be it. Besides, in the future, if the footage is viewed on 4K and 8K TV sets the upscaling will be straight math, just doubling the scan lines, but with 720p there is going to be a compromise. You never go wrong by seeking better quality, that I can tell you for sure.
    The time it takes to rescale is negligible compared to the difference in encoding speed for 720p and 1080p.
    Would it be as simple as doubling the scan lines? If you had straight lines running across the screen and you just doubled the scan lines, wouldn't it look something like this? The image still has to be resized.
    [Attached image]
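    If anyone wants to try it themselves, here's a rough Avisynth illustration of plain pixel/scan-line doubling versus a proper resize, using the earlier 1080p screenshot:

    ImageSource("D:\1080p.JPG")
    # PointResize simply repeats each pixel/scan line - the "straight math" doubling
    PointResize(3840,2160)
    # Compare with an interpolating resizer, which actually has to resample the image:
    # Spline36Resize(3840,2160)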
    Last edited by hello_hello; 16th Dec 2019 at 16:51.
  19. dellsam34
    Yes, but 1080p will display better on 4K and 8K screens while 720p looks like cr@p. I don't know why someone would consider 720p in 2019 when the source is in fact higher; you may as well just record on VHS and be done with it.
  20. Well, I didn't get around to addressing the bitrate. As the OP has said they intend to use 4 Mbps, you might as well forget 720p as well. It will of course look awful at such a low bitrate.
    Hello_hello, why would you re-scale downwards and then up again? Just use the proper tools in a decent NLE. All I did was change the timeline resolution from 1920 x 1080 to 1280 x 720, no messing about at all.
    Canon C100 mk2 - Dell XPS8700 i7 - Win 10 - 24gb RAM - GTX 1060/6GB - DaVinci Resolve Studio 18.6.3 - Blackmagic Speed Editor - Presonus Faderport 1 - 3 calibrated screens
  21. Sharc, the second set of frame grabs are from Resolve's timeline; no rendering was done, only the timeline resolution was changed. The loss of detail at 720p is particularly obvious at the bottom of the frame, on the wire of the fence and the grass. The artifacts are more likely from the JPEG captures.
    These are the video details of the camera-original clip as reported by MediaInfo.
    Anyway, I just posted the comparison clips to illustrate the difference between the resolutions, something others have not done.
    [Attached image: Media Info.JPG]

    Last edited by ChapmanDolly; 16th Dec 2019 at 19:04. Reason: More information
    Canon C100 mk2 - Dell XPS8700 i7 - Win 10 - 24gb RAM - GTX 1060/6GB - DaVinci Resolve Studio 18.6.3 - Blackmagic Speed Editor - Presonus Faderport 1 - 3 calibrated screens
  22. @ChapmanDolly
    OK, thanks for the clarification.
  23. Originally Posted by dellsam34
    Yes, but 1080p will display better on 4K and 8K screens while 720p looks like cr@p. I don't know why someone would consider 720p in 2019 when the source is in fact higher; you may as well just record on VHS and be done with it.
    Available bitrate, for one thing, but if a 4K or 8K display isn't any larger than a 1080p display, the resolution might be higher but the actual size of the image isn't, so is it really going to look worse? My ex bought a 4K TV not so long ago. The screen is a few inches bigger than my TV, but I don't recall even standard definition looking any worse than it does on mine, at least in respect to upscaling. She didn't buy a particularly expensive TV, so in every other respect my not-particularly-expensive 1080p plasma blows it away for picture quality at any resolution.
  24. Originally Posted by ChapmanDolly
    Well, I didn't get around to addressing the bitrate. As the OP has said they intend to use 4 Mbps, you might as well forget 720p as well. It will of course look awful at such a low bitrate.
    What encoder are you using? It depends on the source. I've encoded plenty of 720p video where the resulting bitrate has been much lower than 4000 kb/s and some where it's much higher, using the same quality for x264 each time (usually CRF18). The lower-bitrate encodes don't look awful just because the bitrate is low; it just means the video was easier to compress.

    Originally Posted by ChapmanDolly
    Hello_hello, why would you re-scale downwards and then up again? Just use the proper tools in a decent NLE. All I did was change the timeline resolution from 1920 x 1080 to 1280 x 720, no messing about at all.
    I thought that was obvious. Wasn't the object of the exercise to see how much detail is lost due to downscaling? You used an NLE to downscale. I assumed the downscaled version was an encode, given you originally said they both used the same bitrate, but for you to post a 1080p screenshot of the downscaled version, it was obviously upscaled to 1080p again at some stage. I did the same thing, only I did the downscaling and upscaling in a single Avisynth script and took a screenshot of the output. If your NLE can't do the same, I've no idea how that means you spent less time messing about, or that the NLE I'm using isn't a proper tool.

    Originally Posted by ChapmanDolly
    Anyway, I just posted the comparison clips to illustrate the difference between the resolutions, something others have not done.
    Do you understand how I did the same now, only with a different result?
    I'm pretty sure in the case of your screenshot, sharper resizing shows it can be done with virtually no perceivable loss of detail.

    Here's the same downscaling and upscaling again, only this time with a not-so-sharp resizer. Comparing this screenshot to my previous one, all you're seeing is the difference in downscaling and upscaling methods. It's similar to your result, but that doesn't mean I haven't already illustrated it doesn't have to be. With sharper downscaling and upscaling, and maybe some subtle sharpening in between, when a 1080p source doesn't have 1080p worth of detail it's not uncommon for the downscaled version to look a tad more detailed, but that's something of an illusion due to it being sharper.

    ImageSource("D:\1080p.jpg")
    BilinearResize(1280,720)
    BilinearResize(1920,1080)
    [Attached image: Bilinear Resizing.png]

    Last edited by hello_hello; 17th Dec 2019 at 16:57.
  25. See post below.
    Canon C100 mk2 - Dell XPS8700 i7 - Win 10 - 24gb RAM - GTX 1060/6GB - DaVinci Resolve Studio 18.6.3 - Blackmagic Speed Editor - Presonus Faderport 1 - 3 calibrated screens
  26. No encoding was used to produce either pic in my second post. They are direct grabs from the timeline. First the timeline was set to 1080, the clip's native resolution, then it was changed to 720.
    Both JPEG screen grabs are 1080, the first of a 1080 video, the second of a 720 video. It's a good illustration of what a 1280 x 720 video will look like when played on a monitor or TV with 1920 x 1080 resolution. The bitrate of the original clip was not changed when producing the 720 version. The frame grabs, taken with the Windows Snipping Tool, have produced JPEGs with a resolution only a third of that of the actual video frames.
    Last edited by ChapmanDolly; 17th Dec 2019 at 18:17.
    Canon C100 mk2 - Dell XPS8700 i7 - Win 10 - 24gb RAM - GTX 1060/6GB - DaVinci Resolve Studio 18.6.3 - Blackmagic Speed Editor - Presonus Faderport 1 - 3 calibrated screens
  27. Instead of JPEGs, here are two clips. Both have the same grading and both have been rendered from the same original; one is at 1080p, the other at 720p.
    [Attached files]
    Canon C100 mk2 - Dell XPS8700 i7 - Win 10 - 24gb RAM - GTX 1060/6GB - DaVinci Resolve Studio 18.6.3 - Blackmagic Speed Editor - Presonus Faderport 1 - 3 calibrated screens
  28. Just for fun, here is the same clip at 720p but at the OP's proposed 4 Mbps. Lots of compression artifacts, especially in the shadows.
    [Attached file]
    Canon C100 mk2 - Dell XPS8700 i7 - Win 10 - 24gb RAM - GTX 1060/6GB - DaVinci Resolve Studio 18.6.3 - Blackmagic Speed Editor - Presonus Faderport 1 - 3 calibrated screens
  29. What encoder did you use? The one that ships with DaVinci? How does it look when you encode it at 1080p at 4 Mbps? Is your conclusion still that 1080p is better than 720p?
    Even your "Graded 1080p" has heavy compression artifacts, which are not unusual for a camera's on-the-fly AVC compression. The 1-second GOP pulsing (every 50 frames) is obvious, for example. And as I mentioned before, the true resolution is less than 1080p.
  30. What are you using to view the clips? No pulsing every second here. What compression artifacts are you seeing?
    Yes, these are rendered out of Resolve, and yes, I still conclude that 1080p is better than 720p.
    [Attached files]
    Canon C100 mk2 - Dell XPS8700 i7 - Win 10 - 24gb RAM - GTX 1060/6GB - DaVinci Resolve Studio 18.6.3 - Blackmagic Speed Editor - Presonus Faderport 1 - 3 calibrated screens