When I download broadcasts from france.tv with youtube-dl, from the m3u8 stream, I very often get a slightly corrupted video: typically a few seconds where the audio stutters while the picture freezes, then it goes back to normal. If I download the same video again, it is usually correct, but for longer videos it can take 3-5 attempts before I get two strictly identical files, which is the only way I have found to verify a good download without watching the whole video right away. So I made a script that downloads each video twice in a row; if both downloaded files have the same size and MD5, I consider the download good. If one is bigger (corrupted videos are always slightly bigger; I did thorough tests with the Avisynth “Subtract” method described here before coming to that conclusion), I download a third time and check again. Obviously this is quite impractical, especially since my download speed is only about 1 MB/s, so for long broadcasts each corrupted download wastes 30+ minutes of downloading time.
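The size/MD5 comparison step can be sketched as a small POSIX shell function (a sketch only; it assumes `md5sum` from coreutils is available — on Windows, `certutil -hashfile <file> MD5` in the batch script gives the same information):

```shell
# Sketch: decide whether two downloads of the same video match.
# Assumes md5sum (coreutils); on Windows, "certutil -hashfile <file> MD5"
# would serve the same purpose.
downloads_match() {
    a="$1"; b="$2"
    # Compare sizes first (cheap), then the MD5 digests.
    if [ "$(wc -c < "$a")" -ne "$(wc -c < "$b")" ]; then
        echo "different"; return 1
    fi
    if [ "$(md5sum < "$a" | cut -d' ' -f1)" = "$(md5sum < "$b" | cut -d' ' -f1)" ]; then
        echo "identical"
    else
        echo "different"; return 1
    fi
}
```

If the function prints “different”, a third download is needed, which is then compared against each of the first two.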

Is it common with m3u8 streams and/or youtube-dl? Or is it more likely an issue with this particular provider? Or a problem with my system or my connection's stability? Or something else? Is there any way to ensure that this kind of error doesn't happen, or gets corrected right away? (I don't have such issues with regular HTTP downloads.)

Example link. (This one should stay online for at least a few weeks, and it's a 90 min. show; the issue is more likely to happen with longer broadcasts.)
Script currently used:
Code:
chcp 1252
set lien=https://www.france.tv/france-5/la-grande-librairie/la-grande-librairie-saison-13/2110297-emission-du-mercredi-9-decembre-2020.html
rem First download, with all metadata and subtitles
youtube-dl -f best %lien% --all-subs --write-info-json --write-description --write-pages -o "G:\%%(upload_date)s - F5 - %%(title)s [%%(id)s].%%(ext)s"
rem Second download, verification copy to compare size and MD5 against the first
youtube-dl -f best %lien% -o "G:\%%(upload_date)s - F5 - %%(title)s [%%(id)s] vérif.%%(ext)s"
Side question: youtube-dl downloads and actually writes the m3u8 fragments to disk, adding unnecessary clutter to the partition's filesystem (scanning it with data recovery software, I can see hundreds of deleted .part-fragX.part files). It also records to a .mp4.part temporary file, which seems to be in TS format, then at the end uses ffmpeg to convert that temporary file to an actual MP4 file (a process oddly described as “Fixing malformed AAC bitstream in [file.mp4]”). This means that free space must be at least double the actual size of the video being downloaded, which can be a problem for long broadcasts, combined with the fact that I have to download at least twice. When downloading directly with ffmpeg from the m3u8 URL, it records straight to an MP4 file, without writing the fragments. Why is the downloading process so convoluted with youtube-dl when it can be done in such a straightforward way with a utility that is not even primarily meant for downloading?
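For reference, the direct ffmpeg route mentioned above looks roughly like this (a sketch, not a tested recipe; here the m3u8 URL is resolved with youtube-dl's `-g` option, and the output filename is illustrative):

```shell
# Resolve the direct m3u8 stream URL, then let ffmpeg fetch and remux it.
url=$(youtube-dl -f best -g "$lien")
# -c copy remuxes without re-encoding; aac_adtstoasc converts the ADTS AAC
# packaging used in TS segments to the format MP4 expects (the same step
# youtube-dl reports as "Fixing malformed AAC bitstream").
ffmpeg -i "$url" -c copy -bsf:a aac_adtstoasc "output.mp4"
```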
Also, the resulting file has the exact same size (provided that no corruption occurred; see the first question), and usually only 3 bytes differ, right before the index at the end, in an area that contains the text “SoundHandler”. Those bytes apparently indicate the maximum audio bitrate. In the ffmpeg download, that field is identical to the one right after it, which apparently indicates the average bitrate, and MediaInfo reports the bitrate as “constant”. In the youtube-dl download, the value is slightly higher than the one right after it (for instance 00 01 84 2F / 00 01 77 00 for a file which MediaInfo identifies as having a 96 kbps average audio bitrate and a 99.4 kbps maximum audio bitrate), and MediaInfo reports the bitrate as “variable” (both in the “Audio” section and in the “General” section). Why this small discrepancy?
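Decoding those two 4-byte fields as big-endian integers does match MediaInfo's figures, which supports the max/average bitrate reading (a quick arithmetic check in shell):

```shell
# The two 4-byte fields quoted above, read as big-endian integers (bits/s):
max_bitrate=$((0x0001842F))   # 99375 bit/s, i.e. ~99.4 kbps (maximum)
avg_bitrate=$((0x00017700))   # 96000 bit/s, i.e. 96 kbps (average)
echo "$max_bitrate $avg_bitrate"
```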