# YouTube Subscriptions to OPML
This repo contains a small collection of scripts that I used to turn my YouTube subscriptions into an OPML file for import into [Feedbin].
## Dependencies
The scripts have only been run on a Linux system using GNU coreutils. They will
probably need some tweaking to run on other UNIX-like systems.
- [scraper](https://lib.rs/crates/scraper)
- [jaq](https://github.com/01mf02/jaq)
- curl
- Python
- awk
- GNU make (I haven't tested non-GNU make)
## Usage
1. Visit your [subscriptions page](https://www.youtube.com/feed/channels)
2. Repeatedly scroll to the end of the page to make them all load
3. Run the following in the JavaScript console to copy the list of subscriptions to your clipboard as a JSON array:
```javascript
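// Collect the href of every channel link on the page, de-duplicate them,
// drop the URLs in the /channel/ form, and copy the remainder to the
// clipboard as pretty-printed JSON.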
copy(JSON.stringify(Array.from(new Set(Array.prototype.map.call(document.querySelectorAll('a.channel-link'), (link) => link.href))).filter((x) => !x.includes('/channel/')), null, 2))
```
**Note:** I only tested the above on Firefox.
Also, why do this instead of processing the `subscriptions.csv` from Google Takeout?
1. Takeout generates multiple gigabytes of archives I have to download to get the CSV file.
2. It's slow to generate. This process can be done whenever you want.
4. Paste the list of subscriptions into `subscriptions.json`.
5. Run `make fetch` to fetch the channel pages of all the subscriptions. This only needs to be run once.
6. Run `make channel-json` to extract info from each channel page.
7. Run `make subscriptions.opml` to generate the OPML file.
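
For a rough idea of what steps 6 and 7 produce, here is a minimal Python sketch (the repo's actual scripts use scraper, jaq, and awk) that pulls a channel ID and title out of a saved channel page and emits OPML that Feedbin can import. The script name and parsing regexes are assumptions for illustration only; the per-channel feed URL (`https://www.youtube.com/feeds/videos.xml?channel_id=...`) is YouTube's standard channel RSS feed.

```python
#!/usr/bin/env python3
"""Hypothetical illustration of the extract + OPML steps, not the repo's scripts."""

import re
import sys
from xml.sax.saxutils import quoteattr

# YouTube exposes a per-channel RSS feed at this URL; the channel ID is the
# UC... value that appears in the channel page's canonical URL.
FEED_URL = "https://www.youtube.com/feeds/videos.xml?channel_id={id}"


def channel_info(html):
    """Extract (title, channel_id) from a saved channel page.

    Assumes the page contains a quoted https://www.youtube.com/channel/UC...
    URL and an og:title meta tag; real pages may need more robust parsing.
    """
    id_match = re.search(r'"https://www\.youtube\.com/channel/(UC[\w-]+)"', html)
    title_match = re.search(r'<meta property="og:title" content="([^"]+)"', html)
    if not id_match or not title_match:
        return None
    return title_match.group(1), id_match.group(1)


def opml(channels):
    """Render an OPML document with one <outline> per channel."""
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<opml version="1.0">',
        "  <head><title>YouTube Subscriptions</title></head>",
        "  <body>",
    ]
    for title, channel_id in channels:
        lines.append(
            '    <outline type="rss" text={t} title={t} xmlUrl={u} htmlUrl={h}/>'.format(
                t=quoteattr(title),
                u=quoteattr(FEED_URL.format(id=channel_id)),
                h=quoteattr("https://www.youtube.com/channel/" + channel_id),
            )
        )
    lines += ["  </body>", "</opml>"]
    return "\n".join(lines)


if __name__ == "__main__":
    # Hypothetical usage: python3 make_opml.py channels/*.html > subscriptions.opml
    channels = []
    for path in sys.argv[1:]:
        with open(path, encoding="utf-8") as f:
            info = channel_info(f.read())
        if info:
            channels.append(info)
    print(opml(channels))
```

When the resulting file is imported, Feedbin subscribes to each outline's `xmlUrl` as an ordinary feed, so every channel shows up as its own feed of videos.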
[Feedbin]: https://feedbin.com/