VideoHelp Forum

Results 1 to 8 of 8
  1. Member (joined May 2020, Melbourne, Australia)
    Hey all, I am currently attempting to archive the contents of a YouTube channel which has massively sprung to life over the last few weeks.

    The channel's owners have run a pop culture website for years. Since about 2006 their site has hosted snippets of THOUSANDS of unique audio-only music performances (full versions by subscription). They are unique because they record the artists in their own studios (usually 3-5 songs), creating "one off" live recordings - hence my interest. A few weeks ago they started uploading full session performances to YouTube, en masse, sometimes 600 clips in a day.

    The problem is they have now surpassed 20000 uploads on their YouTube channel, and YouTube seems to limit the number of items viewable to 20000 as a "video playlist". Thus if I feed the URL https://www.youtube.com/user/***trotter/videos into my download program (I've tried TVDownloader and MediaHuman) it limits the number of possible downloads to 20000 - presumably the most recent. It doesn't matter how many new clips they add, this number always stays at 20000.

    The same result occurs if I choose "play all" in the full video list on the YouTube page - https://www.youtube.com/watch?v=An3-V6j0Hh8&list=UUNCIPlHNNqEHw3Ac1CqEZEg&index=1 sets up a displayed playlist of 20000 items (only).

    Would any of you fine people have any ideas to get around this and delve back further than the allowed 20000? With each deluge of uploads they do I am getting further away from the older ones. I estimate the oldest uploads on this YouTube channel to have been around 2013.

    Many thanks for any assistance! I'm hoping I don't have to do individual artist or clip searches as this would take ages and be very impractical.

    *The missing part (***) of the above URL is day.
  2. Member DB83 (joined Jul 2007, United Kingdom)
    20000 titles!!!!


    Either someone has too much time on their hands to upload such a number or.....


    Maybe, just maybe, YT set an arbitrary number that any channel could provide, in the belief that no channel would EVER get that close.


    In that scenario the older uploads cease to exist (you might find them on the Wayback Machine, but even then don't consider downloading them on anything more than a one-by-one basis).


    And let's just consider the other factor. The channel exists to promote items only available via membership. What if those self-same items actually also appeared on the 'free' playlist? If I have my maths correct, an upload of 600 titles creates a playlist-life-expectancy of less than 4 days. Those titles are still available via subscription. Yet what if those 20000 titles were not even unique, and they simply re-uploaded older ones to the top of the pile? Well, you can find that out if you ignore what the titles say and blindly download 20000 titles a day. Good luck with that!!
  3. Member (joined May 2020, Melbourne, Australia)
    MediaHuman's YouTube Downloader is great because it keeps a history of what's already been downloaded from a playlist or channel, and only notifies when new titles appear. So fortunately I don't believe I need to download 20000 titles a day for what I'm asking - even if I could!

    Anecdotally, there have been (and presumably still are) other YT channels with way more than 20000 titles on them. Some gaming channels apparently have over a million uploads. So I believe the issue here is a display/access one of a channel, rather than an enforced capacity limit of the channel.

    Regarding your last paragraph, yes, you're absolutely correct. The channel does exist to refer YT users back to its website to subscribe (the premise being that most people don't know how to download stuff from YT - which is a pretty sound contention, it just doesn't apply to me). But I have to say your maths looks dodgy regarding the 600 titles a day... to reach 20000 would take 33-1/3 days, not 4. And I don't believe they are re-uploading older songs to the top of the pile. I haven't observed this happening, and with MediaHuman I would find out really quickly anyway, because the repeat file names would be tagged with a "-1".
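
    For anyone following along, the back-of-the-envelope arithmetic is trivial to check (assuming a steady 600 uploads a day against a 20000-item display window):

    ```python
    # How long would it take 600 uploads/day to fill a 20000-item window?
    uploads_per_day = 600
    window = 20000
    days = window / uploads_per_day
    print(round(days, 2))  # 33.33 days, not 4
    ```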

    So my initial question remains: how can I access files #20001-∞ on a YT channel... for a bulk download?
  4. Member DB83 (joined Jul 2007, United Kingdom)
    Ok. I'll concede the wine-sponsored maths.


    But if a playlist is restricted to 20000 items - still a huge number, and did those channels you reported actually have a playlist? - you can hardly expect some code to grab earlier ones in bulk.


    Yet by looking at the channel's actual uploads you can see what is available. You might have to scroll through many pages to do that for the earlier ones.
  5. Member (joined May 2020, Melbourne, Australia)
    OK, this was my somewhat cumbersome workaround... partially staring me in the face the whole time. DB83's last comment inspired me to try something which worked. So thank you for that! I'm providing this info for any future obsessives wrestling with this same issue.

    I decided to try scrolling down the channel's entire videos page, which takes some time, to see whether that would display more than the 20000 clips normally shown. YouTube only loads 30 videos at a time with this method, with a pause of several seconds in between, and loading slows further the longer the page gets. So I lodged my Pg Dn key down with some folded up cardboard overnight. In the morning the whole page had downloaded. I saved the page for offline reference.

    The last (oldest) files were ones I didn't have, so I was pretty sure I was onto a winner. However, accessing the URLs was a pain, and I was facing considerable time-consuming work copying and loading them individually into MediaHuman YTD.

    Then, thicko here finally realised YouTube allows a very useful sort/display function from oldest to newest. I noticed, though, that the oldest files from my overnight scrolling (which seemed to display consecutively) didn't appear until quite a way into the list, presumably due to some sort of weird YouTube algorithm. The very oldest files on the channel were uploaded six years ago, and a large number of them too. I had already downloaded the files they'd uploaded five years ago, so I scrolled down the "oldest-to-newest" display just far enough to load all the six-year-old clips and a bit beyond that, then saved the URLs into a .txt file using the Chrome extension Link Klipper. I will be feeding this list of URLs into MediaHuman and combining the results with the downloaded clips I already have. MediaHuman YTD has "download from a list" functionality; it's a very good program.
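
    For any future obsessive repeating this, the URL-collection and de-duplication step can also be scripted rather than done by hand. A minimal sketch (the sample text and the "already downloaded" list are illustrative assumptions; it only looks for standard watch?v= links in saved page text or a Link Klipper dump):

    ```python
    import re

    def extract_video_urls(text, already_have=()):
        """Pull unique YouTube watch URLs out of saved page text,
        skipping video IDs we've already downloaded."""
        ids = re.findall(r"watch\?v=([A-Za-z0-9_-]{11})", text)
        seen = set(already_have)
        urls = []
        for vid in ids:
            if vid not in seen:
                seen.add(vid)
                urls.append("https://www.youtube.com/watch?v=" + vid)
        return urls

    # Hypothetical input: a duplicate link and one ID already downloaded
    sample = "watch?v=An3-V6j0Hh8&list=X watch?v=An3-V6j0Hh8 watch?v=AAAAAAAAAAA"
    print(extract_video_urls(sample, already_have={"AAAAAAAAAAA"}))
    # -> ['https://www.youtube.com/watch?v=An3-V6j0Hh8']
    ```

    Writing the result out one URL per line gives exactly the kind of .txt list that MediaHuman's "download from a list" function expects.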

    It's a bit ass-about, working in reverse, but it will work for what I wanted to do.
  6. > So I lodged my Pg Dn key down with some folded up cardboard overnight.
    One word: AutoIt. (Well, that's three words now.) (And actually fourteen in total.)

    > In the morning the whole page had downloaded. I saved the page for offline reference.
    What browser do you use which didn't explode along the way and destroy the entire fabric of the universe? Did you bless your computer with Chuck Norris' sweat, or some other bodily fluid, to ensure that it would never ever crash even under the most excruciatingly insane workloads?

    > Anecdotally, there have been (and presumably still are) other YT channels with way more than 20000 titles on them. Some gaming channels apparently have over a million uploads.
    NOT anecdotally: no wonder the greenhouse gas emissions caused by the Internet are now almost double those of airplane traffic, and increasing at a much faster rate. é_è
    The word “Cloud” is one big smokescreen to conceal the fact that everything going on online is very real and has very real consequences; it's not some ethereal magic that comes out of thin air and then vanishes once we stop thinking about it. I'm not saying that people should stop watching and uploading videos, but a wee bit of a sense of responsibility and moderation in that matter would be more than welcome, for the sake of whatever in our current civilisation is worth preserving more than a few decades. I am not sure that those gazillions of video game recordings are part of it.
    https://www.youtube.com/watch?v=4JkIs37a2JE
    (The irony is that he became a part of it now, he's been absorbed by the very “virtual insanity” he was ranting about... It's a bit too preachy for my taste — and some lines are just plain silly — but it's nevertheless highly relevant some 25 years later.)
  7. Member (joined May 2020, Melbourne, Australia)
    Thank you for your amusing contribution!
    > One word: AutoIt.
    I don't know what this is or how to do it (or even if it's Windows-relevant).
    > What browser do you use which didn't explode along the way
    Chrome. And no, no particular tweaking, it went a bit slow but it handled it OK.
  8. > Thank you for your amusing contribution!
    Well, at least one part was intended to be serious... é_è


    > I don't know what this is or how to do it (or even if it's Windows-relevant).
    Yes, it's Windows-relevant. It's a scripting environment and rudimentary programming language meant (as its name implies) to automate tasks, mostly repetitive ones, and execute them without having to be there to press the button every 108 minutes to prevent the end of the world (for instance), and without having to resort to dirty tricks like putting folded cardboard on the keyboard or training your dog to do it.
    I only used it once, recently, for a slightly more complicated task (so the cardboard trick wouldn't have worked, and my beloved cat is no more; 4 years with a breast cancer, she definitely went the distance). As part of an in-depth data recovery endeavour, I wanted to copy the “Info” tab in Recuva to a text file for each of about 90000 files on an HDD (as it doesn't allow exporting all the required information at once, and I don't know any utility that does). Doing that manually would have taken days or even weeks, and been so excruciatingly tedious that there would have been many errors along the way. I battled with the basics for an evening, did some promising tests which revealed unexpected issues (so the script had to be tweaked a bit), but finally got it to work as intended, and then spent the next evening reading a book while the computer did the task unattended (mostly: I did it in bursts of 10000, fearing that the computer would crash halfway through, which it does quite often).
    The script looked like this:
    Code:
    ; wait for the Recuva window to be active before doing anything
    WinWaitActive("[CLASS:PiriformRecuva]")
    Local $i = 0
    ; nudge the selection down and back up so the first item is focused
    Send("{DOWN}")
    Sleep(800)
    Send("{UP}")
    Sleep(800)
    Do
       ; grab all the visible text from the Recuva window (including the Info tab)
       Local $sText = WinGetText("[CLASS:PiriformRecuva]")
       $hFileOpen = FileOpen("G:\Recuva WD30EZRX AutoIt 4.txt", 1) ; mode 1 = append
       FileWrite($hFileOpen, $sText)
       FileClose($hFileOpen) ; close the handle each pass so none are leaked
       Send("{DOWN}")
       ; wait until the window shows the next file's details
       WinWaitActive("[CLASS:PiriformRecuva]", "Filename:")
       Sleep(10)
       $i = $i + 1
    Until $i = 10000
    In the case you described, just pressing a single button over and over, it would be very simple; perhaps two or three lines would be enough.
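    Something along these lines, say (a rough, untested sketch: the repeat count and delay are arbitrary assumptions, and it simply sends Page Down to whatever window currently has focus):
    Code:
    ; press Page Down every 3 seconds, 5000 times, in the active window
    For $i = 1 To 5000
       Send("{PGDN}")
       Sleep(3000)
    Next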


