Diffstat (limited to 'src/content/blog/2018/12/21/ytdl-subs.adoc')
-rw-r--r-- src/content/blog/2018/12/21/ytdl-subs.adoc | 279
1 file changed, 0 insertions, 279 deletions
diff --git a/src/content/blog/2018/12/21/ytdl-subs.adoc b/src/content/blog/2018/12/21/ytdl-subs.adoc
deleted file mode 100644
index 10afbf6..0000000
--- a/src/content/blog/2018/12/21/ytdl-subs.adoc
+++ /dev/null
@@ -1,279 +0,0 @@
-= Using "youtube-dl" to manage YouTube subscriptions
-
-:ytsm-ann: https://old.reddit.com/r/DataHoarder/comments/9sg8q5/i_built_a_selfhosted_youtube_subscription_manager/
-:ytsm-code: https://github.com/chibicitiberiu/ytsm
-:ytdl: https://youtube-dl.org/
-
-I've recently read the {ytsm-ann}[announcement] of a very nice
-{ytsm-code}[self-hosted YouTube subscription manager]. I haven't used YouTube's
-built-in subscriptions for a while now, and haven't missed it at all. When I
-saw the announcement, I considered writing about the solution I've built on top
-of {ytdl}[youtube-dl].
-
-== Background: the problem with YouTube
-
-:net-giants: https://staltz.com/what-happens-when-you-block-internet-giants.html
-
-In many ways, I agree with {net-giants}[André Staltz's view on data ownership
-and privacy]:
-
-____
-I started with the basic premise that "I want to be in control of my data".
-Sometimes that meant choosing when to interact with an internet giant and how
-much I feel like revealing to them. Most of times it meant not interacting with
-them at all. I don't want to let them be in full control of how much they can
-know about me. I don't want to be in autopilot mode. (...) Which leads us to
-YouTube. While I was able to find alternatives to Gmail (Fastmail), Calendar
-(Fastmail), Translate (Yandex Translate), _etc._ YouTube remains as the most
-indispensable Google-owned web service. It is really really hard to avoid
-consuming YouTube content. It was probably the smartest startup acquisition
-ever. My privacy-oriented alternative is to watch YouTube videos through Tor,
-which is technically feasible but not polite to use the Tor bandwidth for these
-purposes. I'm still scratching my head with this issue.
-____
-
-Even though I don't use most alternative services he mentions, I do watch videos
-from YouTube. But I also feel uncomfortable logging in to YouTube with a Google
-account, watching videos, creating playlists and similar things.
-
-Using the mobile app is worse: you can't even block ads there. You have even
-less control over what you share with YouTube and Google.
-
-== youtube-dl
-
-:other-sites: https://rg3.github.io/youtube-dl/supportedsites.html
-
-youtube-dl is a command-line tool for downloading videos, from YouTube and
-{other-sites}[many other sites]:
-
-[source,sh]
-----
-$ youtube-dl https://www.youtube.com/watch?v=rnMYZnY3uLA
-[youtube] rnMYZnY3uLA: Downloading webpage
-[youtube] rnMYZnY3uLA: Downloading video info webpage
-[download] Destination: A Origem da Vida _ Nerdologia-rnMYZnY3uLA.mp4
-[download] 100% of 32.11MiB in 00:12
-----
-
-It can be used to download individual videos as shown above, but it also has
-some interesting flags that we can use:
-
-* `--output`: use a custom template to create the name of the downloaded file;
-* `--download-archive`: use a text file for recording and remembering which
- videos were already downloaded;
-* `--prefer-free-formats`: prefer free video formats, like `webm`, `ogv` and
- Matroska `mkv`;
-* `--playlist-end`: stop after downloading this many videos from a "playlist"
-  (a channel, a user or an actual playlist);
-* `--write-description`: write the video description to a `.description` file,
- useful for accessing links and extra content.
-
-Putting it all together:
-
-[source,sh]
-----
-$ youtube-dl "https://www.youtube.com/channel/UClu474HMt895mVxZdlIHXEA" \
- --download-archive ~/Nextcloud/cache/youtube-dl-seen.conf \
- --prefer-free-formats \
- --playlist-end 20 \
- --write-description \
- --output "~/Downloads/yt-dl/%(uploader)s/%(upload_date)s - %(title)s.%(ext)s"
-----
-
-This will download the latest 20 videos from the selected channel, and record
-the video IDs in the `youtube-dl-seen.conf` file. Running it again right away
-has no effect.
-
-If the channel posts one more video, running the same command again will
-download only the last video, since the other 19 were already downloaded.
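The archive file itself is plain text, with one `extractor video-id` pair per
line (the IDs below are only illustrative):

[source]
----
youtube rnMYZnY3uLA
youtube dQw4w9WgXcQ
----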
-
-With this basic setup you have a minimal subscription system in place, and you
-can create some shell functions to help you manage it:
-
-[source,sh]
-----
-#!/usr/bin/env bash
-# Note: `export -f` below is a bashism, so this script requires bash.
-
-export DEFAULT_PLAYLIST_END=15
-
-download() {
- youtube-dl "$1" \
- --download-archive ~/Nextcloud/cache/youtube-dl-seen.conf \
- --prefer-free-formats \
- --playlist-end "$2" \
- --write-description \
- --output "~/Downloads/yt-dl/%(uploader)s/%(upload_date)s - %(title)s.%(ext)s"
-}
-export -f download
-
-
-download_user() {
- download "https://www.youtube.com/user/$1" "${2-$DEFAULT_PLAYLIST_END}"
-}
-export -f download_user
-
-
-download_channel() {
- download "https://www.youtube.com/channel/$1" "${2-$DEFAULT_PLAYLIST_END}"
-}
-export -f download_channel
-
-
-download_playlist() {
- download "https://www.youtube.com/playlist?list=$1" "${2-$DEFAULT_PLAYLIST_END}"
-}
-export -f download_playlist
-----
-
-With these functions, you can now have a subscription-fetching script to
-download the latest videos from your favorite channels:
-
-[source,sh]
-----
-#!/usr/bin/env bash
-# bash is needed here too: the exported functions above are only
-# visible to child bash processes.
-
-download_user ClojureTV 15
-download_channel 'UCmEClzCBDx-vrt0GuSKBd9g' 100
-download_playlist 'PLqG7fA3EaMRPzL5jzd83tWcjCUH9ZUsbX' 15
-----
-
-Now, whenever you want to watch the latest videos, just run the above script
-and you'll have them all on your local machine.
-
-== Tradeoffs
-
-=== I've made it for myself, with my use case in mind
-
-
-[qanda]
-Offline::
-My internet speed is somewhat
-reasonable{empty}footnote:internet-speed[
- Considering how expensive it is and the many ways it could be better, but also
- how much it has improved over the last years, I say it's reasonable.
-], but it is really unstable. Either at work or at home, it's not uncommon to
-lose internet access for 2 minutes 3-5 times every day, and to stay completely
-offline for a couple of hours once every week.
-+
-Working through the hassle of keeping a playlist on disk has paid off many,
-many times. Sometimes I don't even notice when the connection drops for a few
-minutes, because I'm watching a video and working on some document, all on my
-local computer.
-+
-There's also none of the web player's quality adjustment: I always pick the
-highest quality and it doesn't change during the video. For some types of
-content, like a podcast with a few small visual resources, this doesn't matter
-much. For other types of content, like a keynote presentation with text on the
-slides, watching in 144p isn't really an option.
-+
-If the internet connection drops during the video download, youtube-dl will
-resume from where it stopped.
-+
-This is an offline-first benefit that I really like, and it works well for me.
-
-
-Sync the "seen" file::
-I already have a running instance of Nextcloud, so just dumping the
-`youtube-dl-seen.conf` file inside Nextcloud was a no-brainer.
-+
-You could try putting it in a dedicated git repository, and wrap the script
-with an autocommit after every run. If you ever had a merge conflict, you'd
-simply accept all changes and then tidy up the file. Note that redirecting a
-command's output into its own input file truncates the file before it is read,
-and `uniq` only removes _adjacent_ duplicates, so sort into a temporary file
-instead:
-+
-[source,sh]
-----
-$ sort -u youtube-dl-seen.conf > youtube-dl-seen.conf.tmp
-$ mv youtube-dl-seen.conf.tmp youtube-dl-seen.conf
-----
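+
A sketch of such an autocommit wrapper, assuming the seen file lives inside a
git repository (the function name and commit message are my own invention):
+
[source,sh]
----
autocommit_seen() {
    # $1: directory containing youtube-dl-seen.conf (a git repository)
    git -C "$1" add youtube-dl-seen.conf
    git -C "$1" commit -q -m "seen: update $(date +%F)"
}
----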
-
-
-Doesn't work on mobile::
-The primary device I use every day is my laptop, not my phone. This setup
-works well for me.
-+
-Also, it's harder to add ad-blockers to mobile phones, and most mobile software
-still depends on Google's and Apple's blessing.
-+
-If you wish, you can sync the videos to the SD card periodically, but that's a
-bit of extra manual work.
-
-
-=== The Good
-
-
-[qanda]
-Better privacy::
-We don't even have to configure the ad-blocker to keep ads and trackers away!
-+
-YouTube still sees your IP address, so using a VPN is always a good idea.
-However, a timing analysis could still identify you (given the current
-implementation).
-
-
-No need to self-host::
-There's no host that needs maintenance. Everything runs locally.
-+
-As long as you keep youtube-dl itself up to date and sync your "seen" file,
-there's little extra work to do.
-
-
-Track your subscriptions with git::
-After creating a `subscriptions.sh` executable that downloads all the videos,
-you can add it to git and use it to track metadata about your subscriptions.
-
-
-=== The Bad
-
-
-[qanda]
-Maximum playlist size is your disk size::
-This is a good thing for getting a realistic view of your actual "watch later"
-list. However, I've run out of disk space many times, and now I need to keep a
-closer eye on how much space is left.
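+
A small pruning helper can keep that in check. This is just a sketch; the
function name and the 30-day policy are made up:
+
[source,sh]
----
prune_downloads() {
    # $1: downloads directory; $2: delete files older than this many days
    find "$1" -type f -mtime "+$2" -delete
}

# e.g.: prune_downloads ~/Downloads/yt-dl 30
----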
-
-
-=== The Ugly
-
-We can only avoid all the bad parts of YouTube with youtube-dl as long as
-YouTube keeps its videos public and programmatically accessible. If YouTube
-ever blocks that, we'd lose the ability to consume content this way, but we'd
-also lose confidence in YouTube as a healthy repository of videos on the
-internet.
-
-
-== Going beyond
-
-Since you're running everything locally, here are some possibilities to be
-explored:
-
-
-=== A playlist that is too long to download all at once
-
-You can wrap the `download_playlist` function (let's call the wrapper
-`inc_download`) and, instead of passing a fixed number to the `--playlist-end`
-parameter, store the current `$n` in a file (something like
-`$HOME/.yt-db/$PLAYLIST_ID`) and increment it by `$step` every time you run
-`inc_download`.
-
-This way you can incrementally download videos from a huge playlist without
-filling your disk with gigabytes of content all at once.
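A minimal sketch of that wrapper, assuming the `download_playlist` function
from above and a `$HOME/.yt-db/` directory for the counters:

[source,sh]
----
inc_download() {
    playlist_id=$1
    step=${2-15}
    counter_file="$HOME/.yt-db/$playlist_id"

    mkdir -p "$HOME/.yt-db"
    # Read how far we got last time, defaulting to 0 on the first run.
    n=$(cat "$counter_file" 2>/dev/null || echo 0)
    n=$((n + step))

    # Don't advance the counter if the download fails.
    download_playlist "$playlist_id" "$n" || return 1

    # Remember the new position for the next run.
    echo "$n" > "$counter_file"
}
----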
-
-
-=== Multiple computer scenario
-
-The `download_playlist` function could be aware of which machine it is running
-on and apply a machine-specific policy: always download everything; only
-download videos that aren't present anywhere else; _etc._
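One way to sketch this is a small helper that maps the current hostname to a
`--playlist-end` value. The hostnames and numbers below are examples, not a
recommendation:

[source,sh]
----
machine_playlist_end() {
    # Map this machine to a --playlist-end policy.
    case "$(hostname)" in
        desktop) echo 100 ;;  # plenty of disk: download a lot
        laptop)  echo 15  ;;  # limited disk: only the latest few
        *)       echo 5   ;;  # unknown machine: be conservative
    esac
}

# e.g.: download_playlist "$PLAYLIST_ID" "$(machine_playlist_end)"
----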
-
-
-== Conclusion
-
-youtube-dl is a great tool to keep at hand. It covers a really large range of
-video websites and works robustly.
-
-Feel free to copy and modify this code, and send me suggestions for
-improvements or related content.
-
-== _Edit_
-
-2019-05-22: Fix spelling.