Is it possible to analyze the perspective of scenes and match the result with another scene's perspective?
Let me try to explain my question:
I have 3 MOV files with different distance from the stage.
I would like to know which file was shot closest to the stage and which farthest from it.
Thanks in advance
Imho, you can measure the perspective of a picture using Krita, for example, or some NLE program for video. But I think it is impossible to match them.
Check YouTube for Krita perspective tutorials; there are several perspective types. I think you could only match them if the video were a 3D scene from a 3D program like Blender, 3ds Max, Maya, etc. Matchmoving takes the position of the camera into account. So you can do rotoscopy once you get the perspective, but you can't make them all look the same (same perspective). All you can do is change height and width, and that is all. Matching perspective in 2D is IMHO almost impossible.
I want to match the results, not the scenes.
I'm studying Krita.
Thanks for help.
I know, you want to match 3 video sources. You could transform the stage to a rectangle, then resize it to the same size in all 3 sources. That is the only way I can see to achieve something. It also depends on how shaky the source is, or whether it was shot from a tripod or something. But with the perspective itself, IMHO you couldn't do anything. I will be very happy if somebody proves me wrong.
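The stage-to-rectangle transform suggested above is a planar homography, which can be solved from the four stage corners. A minimal numpy sketch (the corner coordinates below are made up for illustration; this mirrors what OpenCV's cv2.getPerspectiveTransform computes):

```python
import numpy as np

def homography_from_4pts(src, dst):
    """Solve for the 3x3 homography H mapping 4 src points to 4 dst points.

    Builds the standard 8x8 linear system (with h33 fixed to 1) and solves
    it with numpy.
    """
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.extend([u, v])
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, pt):
    """Apply homography H to a single (x, y) point."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Hypothetical stage corners as they appear in one camera (a trapezoid),
# mapped onto an upright 1920x1080 rectangle:
stage_in_frame = [(410, 220), (1510, 260), (1700, 980), (230, 940)]
rectangle = [(0, 0), (1920, 0), (1920, 1080), (0, 1080)]
H = homography_from_4pts(stage_in_frame, rectangle)
```

Computing one such H per camera and warping each clip with it would put all three stages into the same rectangle, which is the "resize to same size" idea above; the rest of each frame is still distorted differently, which is why the perspectives as a whole still won't match.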
I just want to know the location and distance of multiple cameras from a point of common interest.
If you can, post a screenshot from each camera; probably someone will help you then. Don't forget that zoom changes perspective as well (in a different way than simply being nearer). You said there is a stage; that can be a good reference for measuring which cam is nearest and which is farthest, or which is on the left or right. It all depends on the reference object. Since nobody has answered this post, it seems it isn't as easy as it looks.
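One way to use the stage as that reference object: under a pinhole model, distance = focal_length × real_width / pixel_width, so *if* all three cameras used the same focal length and zoom (a big assumption, exactly the caveat raised above), the camera that sees the stage largest is the closest. A hypothetical sketch with made-up filenames and measurements:

```python
# Rank cameras by distance from the stage using the pinhole relation
# distance = f * real_width / pixel_width. With equal (even if unknown)
# focal length and zoom, pixel width alone orders the cameras:
# the biggest stage in frame means the closest camera.

def rank_by_distance(stage_pixel_widths):
    """stage_pixel_widths: dict of camera name -> measured stage width (px).
    Returns camera names sorted nearest-first (largest apparent stage first)."""
    return sorted(stage_pixel_widths, key=stage_pixel_widths.get, reverse=True)

# Hypothetical widths measured off one frame of each MOV file:
widths = {"clipA.mov": 1480, "clipB.mov": 620, "clipC.mov": 990}
print(rank_by_distance(widths))  # → ['clipA.mov', 'clipC.mov', 'clipB.mov']
```

If the real stage width is known (say, in meters) and the focal length in pixels can be estimated, the same relation gives absolute distances instead of just a ranking.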
Hugin - Panorama photo stitcher. If you have enough overlap between the cameras, it can recreate the 3D environment. (EDIT: maybe not.)
Last edited by raffriff42; 10th Nov 2017 at 06:57.
As an adjunct to supporting & improving a stereo3D (multi-)camera workflow, I've devised a 3D object that is a solid counterpart to calibration test/chip charts. It must be set at certain known distances and has color-coded sides & corners/vertices (it is a polyhedral skeleton of a certain size), mounted on a stand.
Thus, when shot by cameras (stereo or monocular), you can fairly easily match the color-coded corners and apply mesh-warping adjustments to fully link the differing perspectives into a contiguous (virtual) 3D whole. Similar to what is used for 3D tracking in mocap or matchmove (though mine is static, but quick to set up).
Structured light mapping can achieve similar results.
If you can use this to help you, go for it. Just give me technical credit in any productions you utilize it with (contact me for more details, then let me know when complete & give me a demo link so I can reference it).