yt2mega

This is a Bash script to back up whole YouTube channels to mega.nz.

Requires youtube-dl and megatools.

Optionally, use aria2 to speed up downloads using multiple segments/connections (see below).

I didn't spend a lot of time ironing things out or adding features, but here it is for anyone who wishes to use it.

Features:

  • Resumable/schedulable: checks mega to see whether a video is already backed up by grepping the remote listing for its id (see the sketch after this list)
  • Serial operation (download, upload, then delete locally), prioritizing disk space over speed
  • Chronologically indexed filenames
  • Better than doing it manually
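
For illustration, here is a minimal sketch of the check-download-upload-delete loop described above (not the actual script); it assumes megatools is already configured via ~/.megarc, the remote folder exists, and CHANNEL_URL/REMOTE_DIR are placeholder values:

CHANNEL_URL="https://www.youtube.com/user/ExampleChannel"
REMOTE_DIR="/Root/YouTube-Backups-Example"

youtube-dl --get-id "$CHANNEL_URL" | while read -r id; do
    # Resumable: skip ids that already appear in the mega listing
    megals "$REMOTE_DIR" | grep -q "$id" && continue
    # Serial operation: download one video, upload it, then reclaim the disk space
    youtube-dl -o "%(id)s.%(ext)s" -- "$id"
    megaput --path "$REMOTE_DIR" "$id".*
    rm -f "$id".*
done

Uploading and deleting one video at a time keeps peak disk usage to roughly a single video, at the cost of speed.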

Flaws:

  • Only supports folders one level deep on mega, e.g., /Root/YouTube-Backups-VSauce (see the example after this list)
  • Only runs on GNU+Linux (though there is always Bash on Ubuntu on Windows)
  • Untested, but deleted videos might mess up the chronological indexing. Please let me know if you encounter this, and how you fixed it if you did.
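
Because only one folder level below /Root is supported, create the destination folder as a direct child of /Root before the first run; a minimal example using megatools (the folder name is just an illustration):

megamkdir /Root/YouTube-Backups-VSauce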

aria2c with youtube-dl (thanks, tobbez)

The following commands configure youtube-dl to download through aria2c, using multiple parallel segments and connections per file; run them one line at a time or via a script:

mkdir -p ~/.config/youtube-dl/
cat > ~/.config/youtube-dl/config <<EOF
-o "[%(upload_date)s][%(id)s] %(title)s (by %(uploader)s).%(ext)s"
--external-downloader aria2c
--external-downloader-args "-c -j 3 -x 3 -s 3 -k 1M"
EOF

--external-downloader-args cheatsheet:

  • -c, --continue[=true|false]: resume partially downloaded files
  • -j, --max-concurrent-downloads=N: maximum number of parallel downloads
  • -x, --max-connection-per-server=N: maximum number of connections to one server per download
  • -s, --split=N: number of segments per file
  • -k, --min-split-size=SIZE: minimum size of each segment
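
Once the config is in place, you can check that youtube-dl is actually delegating to aria2c by inspecting its verbose output, which should mention the aria2c command line it builds (the URL is a placeholder):

youtube-dl --verbose "https://www.youtube.com/watch?v=VIDEO_ID" 2>&1 | grep -i aria2c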
