Over the years I saved a gazillion YouTube links/URLs "to watch later", and later we are indeed. ("Hindsight is 20/20", as they say, whoever they may be, and we're now in 2020.) A lot of them must be defunct by now, but some might still be relevant and worth watching before civilization collapses. The trouble is that for many of them I only copied the URL, so examining them one by one, without even remotely knowing what they're about, would be very tedious.
So I would like to process those lists with a clever script which, for each URL, would output the URL followed by the title of the video, perhaps with extra information like the duration and the name of the account, and which would mark the ones no longer online as such. One way would be to load 'em all into JDownloader and let it analyse them, but I already have another gazillion links in JDownloader's download queue, and it's quite a slow and unwieldy utility; I'd prefer something very light.
Can this be done with youtube-dl or something similar?
I just ran into the same problem and found this post. I used youtube-dl — the GUI version, but the CLI variant works well too. Downloaded from here: https://mrs0m30n3.github.io/youtube-dl-gui/
I just copied my list into it and scrolled through the results.
If you want the duration, name of the account etc., then look into the advanced settings, or use the CLI variant of the program and pipe its output through whatever Unix tool you like. For example:
youtube-dl -o "b:\Videos\%(uploader)s_%(upload_date)s_%(resolution)s_%(id)s_%(title)s.%(ext)s" --write-thumbnail --skip-download url
where url is the video's URL (which is based on the video id).
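To get exactly what the original question asked for — each URL annotated with title and duration, with dead links marked — here is a minimal sketch of the "pipe it through whatever you like" approach, wrapped in Python. It assumes youtube-dl is installed and on your PATH, and that the URLs sit one per line in a text file passed as the first argument; the [GONE] marker and the tab-separated layout are just my choices, not anything youtube-dl produces itself.

```python
#!/usr/bin/env python3
"""Annotate a list of YouTube URLs with title and duration via youtube-dl.

Sketch only: assumes youtube-dl is on PATH and the URLs live one per
line in a text file given as the first command-line argument.
"""
import subprocess
import sys


def fetch_info(url):
    """Return (title, duration) for a URL, or None if youtube-dl fails
    (video deleted, made private, network error, etc.)."""
    try:
        proc = subprocess.run(
            ["youtube-dl", "--no-warnings", "-e", "--get-duration", url],
            capture_output=True, text=True, timeout=60)
    except (OSError, subprocess.TimeoutExpired):
        return None
    if proc.returncode != 0:
        return None
    lines = proc.stdout.splitlines()
    if not lines:
        return None
    # -e prints the title on one line, --get-duration the duration on the next
    return (lines[0], lines[1] if len(lines) > 1 else "?")


def format_row(url, info):
    """One tab-separated output line per URL; dead links are marked [GONE]."""
    if info is None:
        return f"{url}\t[GONE]"
    title, duration = info
    return f"{url}\t{title}\t{duration}"


if __name__ == "__main__":
    with open(sys.argv[1]) as fh:
        for line in fh:
            url = line.strip()
            if url:
                print(format_row(url, fetch_info(url)))
```

Run it as `python3 annotate.py watchlater.txt > annotated.txt`; since the output is tab-separated, you can sort or grep it afterwards with the usual Unix tools.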
Well, at least I got an answer before the end of the year... é_è
(Cuz pretty soon hindsight is gonna be 2021, and one day it's gonna be 2050, which sure ain't much.)