VideoHelp Forum
  1. dellsam34
    I agree on YouTube and streaming. DVDs were mostly interlaced content, except maybe in the later years of DVD, when some movies were authored from progressive digital files originating from 2K/4K telecine machines. But I agree the main topic is upscaling.
  2. Originally Posted by poisondeathray
    How would you "watch" the original non-aliased source ?
    You are definitely right on this little point.
  3. Originally Posted by Quint
    Originally Posted by poisondeathray
    How would you "watch" the original non-aliased source ?
    You are definitely right on this little point.

    It's a major point; arguably the most important point in the whole thread.

    Do you have an SD TV? Is that what you're watching it on?

    Aliasing artifacts generated during upscaling are the artifacts that are clearly visible to everybody - the buzzing, fuzzy lines that flicker. A clear indication of a crappy upscale. It's analogous to a bad deinterlace - because that's partially what deinterlacing is - spatial interpolation of the missing scanlines (deinterlacers are upscaling spatially). Deinterlacing is only a 2x vertical upscale; SD to UHD is >4x, so the artifacts can be as bad or worse. The larger the enlargement factor, the more prone to aliasing.

    The improved 1st gen upscalers in the early 2000s were antialiasers first. That's the most important step for "upscaling". Before that, 3-lobe lanczos was the "gold standard" for upscaling - lanczos was in all the broadcast engineering publications and AV magazines, and even NLEs started implementing the option because it was so prevalent in the literature. That's why the EEDI1/2/3 and NNEDI1/2/3 derivatives were so popular 10-15 years ago (and now) - the antialiasing is much superior to lanczos.

    Do I need to post a demonstration? You're going to get aliasing on a non-aliased SD source when using normal "upscaling" - or, as you call it, "upscaling without improving procedure".
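
    A minimal VapourSynth sketch of that comparison - a sketch only, assuming the lsmas source plugin and the nnedi3 plugin are installed ("sd_source.mkv" is a hypothetical 720x480 progressive SD clip). The nnedi3 path is essentially what nnedi3_rpow2-style wrappers automate:
    Code:
import vapoursynth as vs
core = vs.core

# Hypothetical 720x480 progressive SD source
src = core.lsmas.LWLibavSource("sd_source.mkv")

# Plain one-step Lanczos upscale - prone to aliasing/ringing on diagonals and fine lines
plain = core.resize.Lanczos(src, width=1440, height=1080)

# NNEDI3 doubling: dh=True interpolates new scanlines in an edge-directed
# (antialiased) way; transpose to double the width the same way, then
# finish with a conventional resize to the final size. (The small centre-shift
# correction that rpow2-style wrappers apply is omitted for brevity.)
def nnedi3_double(clip):
    clip = core.nnedi3.nnedi3(clip, field=1, dh=True)   # 2x height
    clip = core.std.Transpose(clip)
    clip = core.nnedi3.nnedi3(clip, field=1, dh=True)   # 2x width
    clip = core.std.Transpose(clip)
    return clip

aa = nnedi3_double(src)                                 # 1440x960
aa = core.resize.Lanczos(aa, width=1440, height=1080)   # final target size

# Interleave the two results for A/B comparison (both are now 1440x1080)
core.std.Interleave([plain, aa]).set_output()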
  4. Originally Posted by poisondeathray
    Originally Posted by Quint
    Originally Posted by poisondeathray
    How would you "watch" the original non-aliased source ?
    You are definitely right on this little point.

    It's a major point; arguably the most important point in the whole thread.
    Yes, of course it is. Don't you realise irony even when I use an extra
    It was just meant nicely - that you are right, of course.
  5. Originally Posted by Quint
    Originally Posted by poisondeathray
    Originally Posted by Quint
    Originally Posted by poisondeathray
    How would you "watch" the original non-aliased source ?
    You are definitely right on this little point.

    It's a major point; arguably the most important point in the whole thread.
    Yes, of course it is. Don't you realise irony even when I use an extra
    It was just meant nicely - that you are right, of course.
    Sorry, missed it.

    Cheers
  6. Member (Mesquite, NV)
    Originally Posted by poisondeathray
    Originally Posted by dellsam34
    Originally Posted by poisondeathray
    Sure, but the OP is dealing with "480p", why would he deinterlace?
    Yes, but most 480p content came from interlaced video tapes (analog and digital alike), with the exception of film content that was scanned directly from film into a progressive format, not into an interlaced format such as Betacam or via SD telecine machines.
    More common when someone says "480p" in 2023 would be a DVD source, YouTube, or some SD stream. "Interlaced video tape" would be near the very bottom of my list of guesses.

    I've seen badly de-interlaced content from hardware processing throughout the years of SD broadcasting, when we moved to digital TV and DVD. My point was that most losses occur during that de-interlacing step. This further proves my point that computer processing is better than hardware processing.
    I agree bad deinterlacing causes many issues, in the past and now - but there are many assumptions that you're making about what the OP's source was.

    I don't see how those assumptions + observations "prove" that "computer processing is better than hardware processing". Where is the "proof"? I don't see the connection. "Computer processing" can be VERY bad too... anyone who reads a few threads on video forums will learn that quickly.

    But I can make educated guesses about what you're referring to - QTGMC in terms of deinterlacing is very good - that is your "computer processing" example (QTGMC has some issues, but it's still the best overall on most sources; a minimal script sketch follows below this quote).

    The high-end TV processing chips have very good deinterlacing too; I daresay they are getting very close, and may even surpass it soon (while low-end displays are still terrible, similar to a bob deinterlace). Deinterlacing does not get high priority for development, because nobody cares: 99.99% of video is progressive now. Think of all the cell phones, GoPros, point-and-shoot cameras, and DSLRs - those are the majority of consumer video. Consider that TempGaussMC_beta2 (QTGMC's precursor) was released around 2008. That's a long run for open source software.

    Theoretically, computer processing for *any* video processing task will always be potentially "better", because of the limitations of fixed hardware. Sure, you can get some firmware revisions and minor improvements for a chip. But a software setup potentially has more resources and doesn't need to meet a real-time deadline. A software solution can be customized to a specific situation. Besides, all hardware is prototyped in software first anyway, before committing to silicon. HW is always going to be one or more generations behind state-of-the-art developments. But you, me, and Joe Public do not necessarily have access to that state-of-the-art computer processing. e.g. I don't have access to Sony's or Samsung's training or models for their next-gen chips. The only public access to their state-of-the-art processing is to buy their next TV.
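
    A minimal VapourSynth sketch of the QTGMC "computer processing" deinterlace mentioned in the quote above - a sketch only, assuming havsfunc (with its plugin dependencies) and the lsmas source plugin are installed; "interlaced_capture.mkv" is a hypothetical top-field-first SD capture:
    Code:
import vapoursynth as vs
import havsfunc as haf

core = vs.core

# Hypothetical top-field-first interlaced SD capture
src = core.lsmas.LWLibavSource("interlaced_capture.mkv")

# QTGMC: motion-compensated, double-rate deinterlace.
# Preset trades speed for quality; TFF must match the source's field order.
deint = haf.QTGMC(src, Preset="Slower", TFF=True)

deint.set_output()
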
    Actually, the true shame of this is that most TV/movie original film contains higher resolution - remember how good movies looked in the theater even back in the 60s, 70s and 80s, when televisions were limited to 480i/p resolution by CRT technology. That was the era when VHS video recorders (which won out over the higher-quality Betamax due to the longer tape capacity of VHS tapes) and video rental stores popped up all over, from Blockbuster to the mom-and-pop stores. In fact, in the US Air Force we used to have VHS tape recorders that only had 30 minutes of capacity, and a major upgrade in the 90s was switching to 8mm tape recorders and players that allowed us to record two hours of video on smaller tapes at higher quality.

    Back to the original topic: the real crime in all this is that the industry has the ability to produce very high quality digital transfers IF it so desired. The best are created by scanning each frame on a high quality film scanner that CAN produce 4K transfers, when the studio owning the rights so desires. When it comes to TV (especially if a show was not considered to be a long-running series, or it was low budget and stayed low budget), there are great disparities in the quality of the original tape media.

    And finally, even with the advent of high quality 4K digital video cameras, there are many, many things that are broadcast in 720p (HD) or 1080p (FHD) to keep bandwidth costs down - most sporting events fit into this category (basketball, baseball, football, hockey, etc.). And even if 4K video is available, the studio (Disney, I'm looking at you) will stream their content (which means conversion to a lower quality stream) in 720p rather than 1080p - which upscales very nicely, just doubling from 1920x1080 to 3840x2160.

    On most of my 4K TVs (even OLEDs) I struggle to see a significant difference between a good 1080p Blu-ray and a 4K HDR UHD Blu-ray. When you read reviews of 4K UHD Blu-rays, you will always find references to small snippets (i.e. the scene at timestamp 20:30-21:30 where the ______ looks spectacular). For various reasons (including a fire that destroyed everything) I have owned a number of different TVs at sizes ranging from a 40-inch FHD TV to a 55" LG B2 OLED to, currently, a 65" QLED with full-array local dimming. My next TV will definitely be QLED plus Mini LED, as I'm not willing to pay the price AND take the risks of potential burn-in with all current OLED/QD-OLED TVs. You get 95% of the picture quality of the OLED with higher brightness for half the price. I'm mostly looking at the newest Hisense/TCL-level QLED Mini LEDs vs the LG/Samsung/Sony OLEDs or QD-OLEDs. Sure wish the broadcasters would put out more high quality video (especially sports) than is usually broadcast.


