Hello,
I am wondering if there is a way to merge two .flv videos so that the new video plays both at the same time. For example, when video1.flv and video2.flv are merged, output.flv would have video1.flv on the left-hand side and video2.flv on the right-hand side, and both would play at the same time.
I am using Adobe Media Server to record two streams in a chat session (one stream for each side), and I wanted to play them back the way they were recorded so that it appears as if you are seeing the session again. If the above approach is not correct or not doable, then is there a different way to achieve my goal (i.e., play back two streams of a session)?
Regards,
~Barjawi
Lots of ways.
1. Throw both videos into any NLE that can handle two video tracks, reposition them side by side, and export.
2. Open them in Avisynth and use StackHorizontal() to play them side by side.
3. Open them in Adobe Flash and create a .swf frame that plays them simultaneously (sync may be iffy). -
Thank you for your quick reply.
Sorry, I forgot to mention that I want to do this from the Linux command line. My goal is to make it automatic; I don't want to do it manually. Basically, once a session is recorded, I want to run a command that joins the videos right away, so the result is available for viewing as soon as possible.
Thanks, -
For command line, probably ffmpeg, using -vf with scale, pad and overlay.
eg
http://stackoverflow.com/questions/9293265/ffmpeg-2-videos-transcoded-and-side-by-side-in-1-frame -
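For reference, a minimal sketch of that pad/overlay approach (hypothetical filenames; assumes both inputs share the same resolution, otherwise add a scale filter first):

```shell
# Hypothetical example: place video1.flv on the left, video2.flv on
# the right. Pads the first video's frame to double width, then
# overlays the second video in the right half. Assumes both inputs
# are the same size and that ffmpeg is installed.
ffmpeg -y -i video1.flv -i video2.flv -filter_complex \
  '[0:v] pad=2*iw:ih [left]; [left][1:v] overlay=main_w/2:0 [vout]' \
  -map '[vout]' -map '0:a' output.flv
```

This only keeps the first input's audio (`-map '0:a'`); mixing both audio tracks needs an audio filter as well.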
Time to get your hands dirty, then!
Work hard now & easier later, or easy now and hard later.
Scott -
Hello,
Thank you all for your help. After getting my hands dirty with ffmpeg... I have some idea on how to use it.
I was able to achieve the desired result. However, one thing is not working with me at all.
My current goal is getting the two stereo audio inputs (from the two videos) to play at the same time (to become one stereo output that plays on both sides of the stream [left and right]).
It seems that no matter what I try, I always end up with each audio stream playing on one side of the output stream. For example audio stream from Video1 would play on the right side of my headset, and audio stream from Video2 would play on the left side of my headset.
Here are the command lines that I tried. I hope someone can shed some light on where I am messing up:
using amerge and -ac 2
Code:
ffmpeg -y -i video1.flv -i video2.flv -filter_complex '[0:v] setpts=PTS-STARTPTS, pad=2*iw:ih [left]; [1:v] setpts=PTS-STARTPTS [right]; [left][right] overlay=main_w/2:0 [vout]; [0:a][1:a] amerge=inputs=2 [aout]' -map '[vout]' -map '[aout]' -ac 2 output.flv
using amerge and pan
Code:
ffmpeg -y -i video1.flv -i video2.flv -filter_complex '[0:v] setpts=PTS-STARTPTS, pad=2*iw:ih [left]; [1:v] setpts=PTS-STARTPTS [right]; [left][right] overlay=main_w/2:0 [vout]; [0:a][1:a] amerge=inputs=2, pan=stereo|c0<c0+c2|c1<c1+c3 [aout]' -map '[vout]' -map '[aout]' output.flv
using join and -ac 2
Code:
ffmpeg -y -i video1.flv -i video2.flv -filter_complex '[0:v] setpts=PTS-STARTPTS, pad=2*iw:ih [left]; [1:v] setpts=PTS-STARTPTS [right]; [left][right] overlay=main_w/2:0 [vout]; [0:a][1:a] join [aout]' -map '[vout]' -map '[aout]' -ac 2 output.flv
Lastly, I currently control the video resolution and the audio bit rate and codec. However, from my tests, the command fails if, for example, the resolution changes mid-session. Is there a way to handle these scenarios from the command line itself, or do I have to write separate scripts that check the files before the command runs?
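One way such a pre-check could be scripted (a sketch, assuming ffprobe is installed; it decodes every frame, so it is slow on long recordings):

```shell
# Hypothetical pre-check: refuse to merge if either input reports
# more than one distinct frame size, i.e. the resolution changed
# mid-session. Assumes ffprobe is available on PATH.
for f in video1.flv video2.flv; do
  sizes=$(ffprobe -v error -select_streams v:0 \
            -show_entries frame=width,height \
            -of csv=p=0 "$f" | sort -u | wc -l)
  if [ "$sizes" -ne 1 ]; then
    echo "$f changes resolution mid-stream; re-encode it first" >&2
    exit 1
  fi
done
```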
Regards, -
The only thing I notice that's quite different from those reference wiki methods (beyond the existence of video in your inputs) is that they use double quotes and you used single quotes.
************************
You cannot consistently overlay if the parameters of the overlay change. You will have to make them consistent throughout FIRST, and then overlay. This may require an encoding step. Why would they change, unless you (or others) have been inappropriately joining dissimilar sections?
Scott -
I used single quotes because that's how they are used in the documentation. Actually, in the beginning I used double quotes, just like the examples on the net. While testing, I came across an error that complained about the double quotes; the error wasn't in the quotes themselves, of course, but before them. Anyway, I changed to single quotes afterwards and stayed with them.
I have a chat-like application that uses Adobe Media Server as a back end. The Flex SDK gives me control over the audio/video quality, and I usually change these settings as needed based on network latency. Hence the need (sometimes) for a stream whose settings change during the session.
Is there any other method to do the task other than the three ways I described above?
Thanks, -
I fixed it!
I was treating the input audio streams as stereo and mapping them to a stereo output... but the inputs are actually mono (not stereo), and that was the problem.
So I fixed it by changing my command line to:
Code:
ffmpeg -y -i video1.flv -i video2.flv -filter_complex '[0:v] setpts=PTS-STARTPTS, pad=2*iw:ih [left]; [1:v] setpts=PTS-STARTPTS [right]; [left][right] overlay=main_w/2:0 [vout]; [0:a][1:a] amerge, pan=stereo:c0<c0+c1:c1<c0+c1 [aout]' -map '[vout]' -map '[aout]' output.flv
Thank you so much @Cornucopia and to everyone else who contributed to this thread. -
That would do it. Your OP led me to believe it was stereo.
Scott -
Yes that was a mistake on my end. I don't know why I "assumed" the input audio streams to be stereo.
When everything I could find online failed, I started thinking there must be something else wrong, something in my own inputs... so I checked them and found it.
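For anyone hitting the same thing, the channel layout can be checked up front (a sketch, assuming ffprobe is installed):

```shell
# Hypothetical check: print the channel count and layout of each
# input's first audio stream, so you know whether to treat it as
# mono or stereo before building the pan filter.
for f in video1.flv video2.flv; do
  ffprobe -v error -select_streams a:0 \
    -show_entries stream=channels,channel_layout \
    -of default=noprint_wrappers=1 "$f"
done
```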
Regards,
~Barjawi