Compare commits


3 commits

| SHA | Message | Date |
| --- | --- | --- |
| `5c42677eac` | Add note about `<link rel="alternate">` | 2024-05-06 20:31:03 +10:00 |
| `b71422b97c` | Fix typo | 2024-05-06 20:20:42 +10:00 |
| `aae6ebd5cf` | Force wrapping of inline code if needed. Stops horizontal scroll on mobile for pre-formatted text like URLs. | 2024-05-06 14:54:26 +10:00 |
3 changed files with 14 additions and 5 deletions

View file

@@ -9,6 +9,7 @@ Build with the [Zola] static site compiler:
 Generate dates in front-matter from vim:
 :r! date +\%Y-\%m-\%dT\%H:\%M:\%S\%:z
+:r! date -Iseconds
 ## Terminal screenshots

View file

@@ -2,8 +2,8 @@
 title = "Exporting YouTube Subscriptions to OPML and Watching via RSS"
 date = 2024-05-06T10:38:22+10:00
-#[extra]
-#updated = 2024-02-21T10:05:19+10:00
+[extra]
+updated = 2024-05-06T20:24:23+10:00
 +++
 This post describes how I exported my 500+ YouTube subscriptions to an OPML
@@ -77,7 +77,7 @@ which means I needed to determine the channel id for each page. To do that
 without futzing around with Google API keys and APIs I needed to download the
 HTML of each channel page.
-To do that I generated a config file for `curl` from the JSON file:
+First I generated a config file for `curl` from the JSON file:
 jaq --raw-output '.[] | (split("/") | last) as $name | "url \(.)\noutput \($name).html"' subscriptions.json > subscriptions.curl
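The `jaq` filter above is dense; a minimal Python sketch of the same transformation, assuming `subscriptions.json` is a flat JSON array of channel URL strings (an assumption inferred from the filter, which splits each element on `/`):

```python
import json

def curl_config(subscriptions_json: str) -> str:
    """Mirror the jaq filter: emit a 'url .../output ....html' pair
    per channel. Input format is an assumption, not confirmed."""
    lines = []
    for url in json.loads(subscriptions_json):
        name = url.split("/")[-1]  # last path segment, e.g. a channel handle
        lines.append(f"url {url}\noutput {name}.html")
    return "\n".join(lines)

print(curl_config('["https://www.youtube.com/@example"]'))
```

curl can then consume the generated file via `curl --config subscriptions.curl`.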
@@ -145,6 +145,12 @@ Let's break that down:
 - `TITLE`, `XML_URL`, and `URL` are escaped.
 - Finally we generate a JSON object with the title, URL, and RSS URL and write it into a `json` directory under the name of the channel.
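That final step might look roughly like this in Python; the function name and file layout here are illustrative assumptions, not the post's actual script:

```python
import json
from pathlib import Path

def write_channel(title: str, url: str, xml_url: str, out_dir: str = "json") -> Path:
    """Write one channel's {title, url, xml_url} record into the json/
    directory, named after the channel (layout is an assumption)."""
    name = url.rstrip("/").split("/")[-1]
    path = Path(out_dir) / f"{name}.json"
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps({"title": title, "url": url, "xml_url": xml_url}))
    return path
```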
+**Update:** [Stephen pointed out on Mastodon][sedmonds] that the HTML contains the usual
+`<link rel="alternate">` tag for RSS auto-discovery. I did check for that initially but
+I think the Firefox dev tools were having a bad time with the large size of the YouTube
+pages and didn't show me any matches at the time. Anyway, that could have been used to
+find the feed URL directly instead of building it from the `og:url`.
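Pulling a feed URL out of that auto-discovery tag needs only the standard library; a sketch using `html.parser` (the class and sample URL are made up for illustration):

```python
from html.parser import HTMLParser

class FeedFinder(HTMLParser):
    """Collect the href of every RSS auto-discovery <link> tag."""
    def __init__(self):
        super().__init__()
        self.feeds = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if (tag == "link" and a.get("rel") == "alternate"
                and a.get("type") == "application/rss+xml"):
            self.feeds.append(a.get("href"))

def find_feeds(html: str) -> list[str]:
    parser = FeedFinder()
    parser.feed(html)
    return parser.feeds

sample = ('<html><head><link rel="alternate" type="application/rss+xml" '
          'href="https://www.youtube.com/feeds/videos.xml?channel_id=UC123">'
          '</head></html>')
print(find_feeds(sample))
```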
 Ok, almost there. That script had to be run for each of the channel URLs.
 First I generated a file with just a plain text list of the channel URLs:
@@ -218,7 +224,7 @@ It does the following:
 - Indent the OPML document.
 - Write it to stdout using a Unicode encoding with an XML declaration (`<?xml version='1.0' encoding='utf-8'?>`).
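Those last two steps match `xml.etree.ElementTree` behaviour, which emits exactly that declaration; a minimal sketch (the skeletal element structure is a placeholder, not the post's actual OPML):

```python
import io
import xml.etree.ElementTree as ET

# Build a skeletal OPML document (real outline structure omitted).
opml = ET.Element("opml", version="2.0")
ET.SubElement(opml, "head")
ET.SubElement(opml, "body")
tree = ET.ElementTree(opml)

ET.indent(tree)  # indent the document (Python 3.9+)
buf = io.BytesIO()
tree.write(buf, encoding="utf-8", xml_declaration=True)
print(buf.getvalue().decode("utf-8"))
```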
-Whew that was a lot! With the OMPL file generated I was finally able to import
+Whew that was a lot! With the OPML file generated I was finally able to import
 all my subscriptions into Feedbin.
All the code is available in [this
@@ -293,3 +299,4 @@ option.
 [feedbin-sharing]: https://feedbin.com/help/sharing-read-it-later-services/
 [OpenGraph]: https://ogp.me/
 [feedbin-search]: https://feedbin.com/help/search-syntax/
+[sedmonds]: https://aus.social/@popcorncx/112392881683597817

View file

@@ -36,6 +36,7 @@ pre, code {
   padding: 0.1em 0.2em;
   font-size: 16px;
   border-radius: 3px;
+  overflow-wrap: anywhere;
 }
 pre {
   padding: 0.5em 1em;