VideoHelp Forum

  1. Member
    Join Date
    Sep 2021
    Location
    Portugal
    Hi.
Here's my problem: the person being recorded needs to see himself while recording.
So it's the issue of monitoring in real time. I found that using the tee pseudo-muxer seems to be the simplest and most straightforward approach.
The tee muxer can be used to write the same data to several outputs, such as files or streams. It can be used, for example, to stream a video over a network and save it to disk at the same time.
Perfect, except in my case it's not over a network but just to the screen.

    So it gives us something like:
    ffmpeg -f v4l2 -i /dev/video0 -framerate 30 -video_size 1280x720 -c:v utvideo -f tee "out.mvk|[f=nut]-" | ffplay -

    But it says :
    Code:
    Output #0, tee, to 'out.mvk|-':
    Output file #0 does not contain any stream
    pipe:: Invalid data found when processing input
        nan    :  0.000 fd=   0 aq=    0KB vq=    0KB sq=    0B f=0/0
    If I need to use the map thing I don't know how I should do it.

    Thanks for your help.
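For what it's worth, a sketch of how the failing command might be corrected, going by the ffmpeg tee muxer docs: the v4l2 input options (-framerate, -video_size) have to come before -i to act on the capture, and tee needs an explicit -map. Here out.mkv (assuming "out.mvk" was a typo), pipe:1, and /dev/video0 are taken from the post; the command is stored in a variable and printed so the quoting survives, since actually running it needs the webcam:

```shell
# Untested sketch: v4l2 input options must precede -i, and the tee muxer
# requires an explicit -map to know which streams to duplicate.
# Printed rather than executed here, as it needs a live /dev/video0.
CMD='ffmpeg -f v4l2 -framerate 30 -video_size 1280x720 -i /dev/video0 -c:v utvideo -map 0:v -f tee "out.mkv|[f=nut]pipe:1" | ffplay -'
printf '%s\n' "$CMD"
```

The double quotes around the tee argument matter: without them the shell would split on the | inside the slave list instead of passing it to ffmpeg.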
  2. Member
    Join Date
    Sep 2021
    Location
    Portugal
    In this blog I am going to explain how to take the live video streaming url as input and transcode it and record it in a local disk using FFmpeg.
I'm sorry sir, there is no stream in my case; I want to record from the camera and send the video to both the screen and a file, so I can see live, with filters applied, what will be saved to disk.
  3. Member
    Join Date
    Feb 2006
    Location
    United States
Originally Posted by Sentinel166
    In this blog I am going to explain how to take the live video streaming url as input and transcode it and record it in a local disk using FFmpeg.
    I'm sorry sir, there is no stream in my case, I want to record from the camera and send the video to both the screen and a file. To see in live and with filters applied, what will be saved on the disk.
    try this here - https://trac.ffmpeg.org/wiki/Capture/Desktop
    i assume you mean webcam correct ?? - https://trac.ffmpeg.org/wiki/Capture/Webcam
  4. Member
    Join Date
    Sep 2021
    Location
    Portugal
    Yes webcam. Thanks I'll look at it tomorrow.
  5. I have this for downloading and playing a live m3u8 stream. It should work for a camera too.

    Code:
    ffmpeg -y -i "http://live.prd.go.th:1935/live/ch1_L.sdp/chunklist_w1249643471.m3u8" -c:v libx264 -af "volume=15dB" output.mp4 -c copy -f mpegts - | ffplay -x 1280 -y 720 -
  6. Member
    Join Date
    Sep 2021
    Location
    Portugal
    Code:
    ffmpeg  -y -f v4l2 -framerate 30 -video_size 1280x720  -input_format mjpeg -i /dev/video2 -c:v utvideo -vf hflip essai.nut -c copy -f nut - | ffplay -
That is the command that works with your example.
Except the time lag is terrible; it's supposed to be real-time.

    and whatever -f format I choose for the pipe stream I always get:
    Code:
    av_interleaved_write_frame(): Broken pipe
    Error writing trailer of pipe:: Broken pipe
    frame=  263 fps= 26 q=-0.0 Lq=-1.0 size=  165895kB time=00:00:08.78 bitrate=154646.4kbits/s speed=0.879x
    video:205515kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown
    Conversion failed!
As I understand it, when asked for several outputs like that, ffmpeg encodes multiple times.
I want this, and nothing else will do with regard to real time:
camera -> filter -+-> file
                  +-> screen

    If you can tweak my command to remove the time lag, then fine.
    Otherwise I'll need the tee muxer.
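Much of the pipe lag is often ffplay's input probing and buffering rather than the encode itself. An untested sketch of the same command with ffplay told to probe and buffer less (the flag values are guesses, not from the thread); it is stored in a variable and printed, since running it needs the webcam at /dev/video2:

```shell
# Untested sketch: -probesize/-analyzeduration shrink ffplay's input probing,
# -fflags nobuffer disables input buffering, -framedrop drops late frames
# instead of queueing them. Printed rather than executed here.
CMD='ffmpeg -y -f v4l2 -framerate 30 -video_size 1280x720 -input_format mjpeg -i /dev/video2 -c:v utvideo -vf hflip essai.nut -c copy -f nut - | ffplay -probesize 32 -analyzeduration 0 -fflags nobuffer -framedrop -'
printf '%s\n' "$CMD"
```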
    Last edited by Sentinel166; 28th Sep 2021 at 05:41.
  7. Member
    Join Date
    Sep 2021
    Location
    Portugal
I tried all manner of tee, including with command substitution. It never works out.
ffplay -f v4l2 -framerate 30 -video_size 1280x720 -input_format mjpeg -vf hflip -i /dev/video2 | tee -a essai; ffmpeg -y -i essai -f utvideo essai.nut
"essai" isn't recognized. Apparently sending frames in real time to append to a file doesn't fit with the age-old Unix f***in' syntax.

Is it possible to stack up all the data in the essai file, so that the input ffmpeg takes is exactly the same as if it came straight from the webcam?

Piping from ffmpeg to ffplay is way too slow.

    It has to work with something like:
    ffmpeg -nostdin -f v4l2 -i /dev/video2 -framerate 30 -video_size 1280x720 -c:v utvideo -f tee -map 0:v "[f=nut]essai.nut|[f=data](ffplay -)"

    Can THAT work ? How ?
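One way the tee-to-ffplay idea might work without shell piping (an untested sketch, not from the thread) is a named pipe: tee writes the second copy to a FIFO and ffplay reads from it, so ffmpeg encodes only once. The FIFO path is an assumption; the rest is taken from the commands above. Stored as a string and printed, since running it needs the webcam:

```shell
# Untested sketch: the tee muxer writes one copy to essai.nut and one to a
# FIFO (/tmp/cam.nut, a hypothetical path) that ffplay reads from.
# Printed rather than executed here, as it needs a live /dev/video2.
CMD='mkfifo /tmp/cam.nut
ffplay /tmp/cam.nut &
ffmpeg -nostdin -f v4l2 -framerate 30 -video_size 1280x720 -i /dev/video2 -vf hflip -c:v utvideo -map 0:v -f tee "essai.nut|[f=nut]/tmp/cam.nut"'
printf '%s\n' "$CMD"
```

Starting ffplay first matters: a FIFO open for writing blocks until a reader attaches.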
    Last edited by Sentinel166; 28th Sep 2021 at 07:07.
  8. Member
    Join Date
    Oct 2021
    Location
    Wheeling WV USA
    "stacking up" data is not going to work, especially if it is sliced up. compressed video formats like MP4 are made up of different kinds of frame types that must follow in the correct order. I frames contain a whole image and others are differences that need to have that particular frame before them.

    your command line (which i see in text, not in code) has "| tee -a" which in Linux will run the system "tee" command which knows nothing about any file format. it works only on raw byte streams. it can be useful in some cases, but not for collecting multiple parts of just about any data format, especially compressed formats like video or audio.

    you are creating a "mishmash". ffmpeg and its other tools may be able to find video in there. but at the boundaries you are creating, it will be a mess. you would be better to transcode that out to raw uncompressed frames of equal size. but that will be huge data and transcoding back to compressed will take many times the play duration for typical general purpose CPUs (BTDT).
  9. I don't know if it will help but sending raw video to ffplay may result in less delay. I only have a webcam to test with. It has an inherent delay so I can't tell how much delay is from the webcam and how much from the processing/piping.

    Code:
    ffmpeg -y -f dshow -r 30 -i video="USB Video":audio="Digital Audio Interface (USB Digital Audio)" ^
    -c:v h264_qsv -c:a pcm_s16le output.mkv ^
    -c:v rawvideo -c:a pcm_s16le -f nut - | "G:\Program Files\ffmpeg64\bin\ffplay.exe" -


