Something that’s been floating around my mind for a while. Fair warning: this is a bit philosophical, a bit tongue-in-cheek, and even a little nonsensical.
I so often see posters wanting the “best” quality while changing one or more aspects (enhancing, correcting or shrinking) of the original. Of course it’s impossible to make changes of this type without losing some quality, but how much is too much? What is the “best” quality?
Even a DV copy, when “corrected” and “enhanced”, is a generation down from the original source. Does this make it definitively better than the original? Or is the original the “best” quality, with the 2nd-generation copy merely subjectively better?
I fully agree with, and seek out, “movies seen as they are meant to be seen”. But as we get generations away from the creator (be it a cinematic director or a home movie maker), we may tend to “correct” and “enhance” the video to suit the current generation.
I’ve fiddled with my HDTV’s settings so that a high-quality source “pops out of the screen”, looking “almost lifelike”. I can see every strand of hair, every imperfection on an actor’s face, every wrinkle in their clothing. After a few moments of this, I reset everything back to the original settings, since this is NOT what the director intended.
I’ve mentioned this before, but I’ve seen Akira Kurosawa’s Seven Samurai dozens of times over multiple decades. On the movie screen, TV, videotape, laserdisc, DVD and now Blu-ray. I hope to be around when it’s available in 4K or beyond!
The quality of each medium “enhances” the viewing experience…or does it? One might argue that the “best” quality is the original film stock, with all its inherent cinematic blemishes and limitations. Kurosawa knew the capabilities and limitations of his medium and worked with them. THIS is how he envisioned his film being seen.
On the other hand, one may argue that the corrected and enhanced video is more aesthetically pleasing and easier to view critically. Not hampered by the occasional imperfection, the viewer can fully concentrate on the composition. The finer nuances of the film that were hidden by the limitations of film stock can now be seen. If he had had today’s technology, would THIS be how he envisioned his film being seen?
A final thought. I think about the Mona Lisa. Bill Gates could probably have an exact copy of the Mona Lisa, stroke for stroke, colored as it was the day it was created, but it would never be better than the original…or would it?
-
Who cares what the director intended....convert Seven Samurai to 4K 48fps 3D now!
-
a. 'best' is whatever you, the viewer, prefer the most; if you like the HD/UHD/SD version better than another, fine
b. directors rarely target a specific medium (VHS/DVD/Blu-ray) when directing a movie, so I wouldn't put too much weight on that line of thought (normally they take what is available and within the budget). That said, if you change the colors, remove artistic noise and the like, it is probably something that the (art) director didn't want; but if you like it, see 'a.'
c. "Kurosawa knew the capabilities and limitations of his medium and worked with them." I agree that directors work with the limitation of the camera&co equipment (may be even the current cinema projectors), but I doubt that they really know the 'the capabilities and limitations' of the medium the video will end up for normal users,... -
If everyone stuck with somebody else's intentions, what if their intention was half-baked? What if it was all wrong, but for the right reasons? Or right for all the wrong reasons?
Most movies flop. So do Directors really know what they're doing anyway?
-
So do Directors really know what they're doing anyway?
- not all directors are so good
- if the script isn't good, having a good director won't save the movie
- if the director doesn't have the tools to do his job properly, the result might still be a technical disaster
- How can anyone really know what they are doing, with all the consequences?
-> I wouldn't blame just the director for a crappy movie
-
The quality of each medium “enhances” the viewing experience…or does it? One might argue that the “best” quality is the original film stock, with all its inherent cinematic blemishes and limitations.
So, in my own opinion, what constitutes the highest quality viewing experience will necessarily vary from viewer to viewer, and it will probably also vary with viewing context. For me, my eye has been trained to spot things like compression artifacts and other visual anomalies. I'm distracted by such things, so I prefer the highest quality the current tech can offer. I've never seen Seven Samurai, but I expect I would enjoy it most in a digitally-enhanced presentation, even though Kurosawa's original cut from film stock would probably be closest to his artistic vision. I think I would be interested in such a cut for different reasons; but for purposes of becoming engrossed in a narrative, I find I need very high visual quality to "forget" that I'm watching a film, sitting in a screening room.
But I'm a product of the digital revolution, so my views are skewed in that direction. I'm actively trying to appreciate "old tech," but I find I've been quite spoiled by the new offerings.
Edit:
You know what? I've thought more on it: I would prefer to view a film as close as possible to the director's original context. So, for Seven Samurai, I would most prefer to see it in Kurosawa's screening room, on a period projector, amplified over a period speaker system, completely devoid of anything digital.
Conversely, I would most prefer to experience LotR in Peter Jackson's screening room in its original 8k on seven-figure equipment (or whatever the state of the art is).
If I'm limited to consumer tech, I would default to the cutting edge in either case (I would never want to see either on VHS, for example).
Edit 2:
Apparently I have more to say on the subject. As far as correction/enhancement goes, I think the overall guiding principle is faithfulness to the original. And by "original," I mean the real-life original scene that was shot, not necessarily the raw DV dump. Case in point: I recently cut together a 2-camera wedding sequence using cheap cameras. One camera shot warm footage, the other cool. It would have been most faithful to the raw footage if I had just left both untouched. Instead I sacrificed some "empirical quality" of the cooler footage to improve the overall "subjective quality" of the sequence. The final, "warm" sequence is, in my opinion, much less distracting and much more faithful to the original event than if I had left it alone.
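In spirit, that correction amounts to nudging one camera's average color cast toward the other's rather than toward "correct." Here's a rough sketch of the simplest version of the idea, a gray-world style per-channel gain match. The file names are hypothetical and it assumes NumPy and Pillow are installed; a real grade would use curves or a proper white-balance tool, but the principle is the same:

```python
import numpy as np
from PIL import Image

# Hypothetical frame grabs from the two cameras (names are made up).
warm = Image.open("cam_warm_frame.png").convert("RGB")
cool = Image.open("cam_cool_frame.png").convert("RGB")

warm_px = np.asarray(warm, dtype=np.float64).reshape(-1, 3)
cool_px = np.asarray(cool, dtype=np.float64).reshape(-1, 3)

# Per-channel gains that push the cool camera's average R/G/B
# toward the warm camera's: a crude gray-world style match.
gains = warm_px.mean(axis=0) / cool_px.mean(axis=0)

# Apply the gains to the cool frame and clamp back to 8-bit range.
matched = np.clip(np.asarray(cool, dtype=np.float64) * gains, 0, 255)
Image.fromarray(matched.astype(np.uint8)).save("cam_cool_matched.png")
```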
-
The whole "that's how the director wanted the film to look" is such a cop-out.
Directors never really make those choices.
It's all technical mumbo-jumbo, and they couldn't care less.
-
Maybe this is a good place to ask/rant......
Why, in this digital age, does so much of the digital stuff released on disc look so crappy? I'm not referring to digital compression and any artefacts it may produce, but the picture quality itself. Why does so much of it look like noisy, grainy, over-saturated crap with bad luminance levels? I mean... I don't go to the cinema much... but I don't recall ever sitting in one and watching a movie while thinking it looks like a noisy, grainy, over-saturated piece of crap with bad luminance levels, so why does so much of the stuff I watch via disc look that way?
And this 4K, 8K thing... I've never watched anything in 4K or 8K, but if the "4K mastering" touted when producing Blu-ray releases is anything more than marketing, I'd be amazed. And most 1080p video might contain 1080p worth of noise, but I can downsize to 720p much of the time without losing any actual picture detail, so will 4K/8K be any better?
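That claim is easy to sanity-check, by the way. A rough sketch (the file name is hypothetical; assumes NumPy and Pillow): downscale a 1080p frame grab to 720p, scale it straight back up, and measure the PSNR against the original. A very high number means the extra 1080p pixels were mostly carrying noise, not picture detail.

```python
import numpy as np
from PIL import Image

# Hypothetical 1080p frame grab (file name is made up).
orig_img = Image.open("frame_1080p.png").convert("RGB")
orig = np.asarray(orig_img, dtype=np.float64)

# Downscale to 720p, then scale straight back up to the original size.
down = orig_img.resize((1280, 720), Image.LANCZOS)
up = np.asarray(down.resize(orig_img.size, Image.LANCZOS), dtype=np.float64)

# PSNR of the round trip against the original frame.
mse = np.mean((orig - up) ** 2)
psnr = 10 * np.log10(255.0 ** 2 / mse) if mse > 0 else float("inf")
print(f"1080p -> 720p -> 1080p round trip: {psnr:.1f} dB PSNR")
```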
I don't know anything about the digital production process, but I'd also be keen to know if there's a standard noise filter used in digital production which, by design, doesn't de-noise the first frame or two of every single scene. Anybody seen the Blu-ray (or even iTunes) release of The Sopranos? That'd be a classic example of what I'm referring to. It drives me nutty if I sit close enough to the screen for it to be obvious.
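For what it's worth, that's exactly the behaviour a naive temporal filter would produce: with no history at a scene cut, the first frame or two pass through nearly raw, then the noise smooths out as the running average builds. A toy sketch of the mechanism (pure NumPy, not any real production tool):

```python
import numpy as np

def temporal_denoise(frames, alpha=0.2):
    """Toy running-average temporal filter over an iterable of float frames."""
    avg = None
    for frame in frames:
        if avg is None:
            # First frame after a cut: no history yet, so it passes through raw.
            avg = frame.astype(np.float64)
        else:
            # Later frames lean on the accumulated average, smoothing the noise.
            avg = alpha * frame + (1 - alpha) * avg
        yield avg

# Demo: a static gray "scene" with fresh additive noise on every frame.
rng = np.random.default_rng(0)
scene = 128 + 20 * rng.standard_normal((10, 360, 640))
for i, out in enumerate(temporal_denoise(scene)):
    print(f"frame {i}: residual noise std = {np.std(out - 128):.1f}")
```

Run that and the first frame comes out as noisy as it went in, while the later ones settle down, which would match what you're seeing at scene changes.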
Well, the more video I convert, the more I tend to filter it in one way or another. At least the stuff I care about. For the stuff I don't, the result probably isn't worth the huge effort which can go into "getting it right", but while I mightn't go so far as to say my efforts make my encodes look definitively better, I'd be willing to claim the originals often look subjectively worse.