Ensure IO is properly closed when importing NewPipe subscriptions #4346

Open · wants to merge 2 commits into base: master
Changes from 1 commit
48 changes: 22 additions & 26 deletions src/invidious/user/imports.cr
@@ -290,42 +290,38 @@ struct Invidious::User
    end

    def from_newpipe(user : User, body : String) : Bool
-     io = IO::Memory.new(body)
-
-     Compress::Zip::File.open(io) do |file|
+     Compress::Zip::File.open(IO::Memory.new(body), true) do |file|
        file.entries.each do |entry|
          entry.open do |file_io|
-           # Ensure max size of 4MB
-           io_sized = IO::Sized.new(file_io, 0x400000)
-
            next if entry.filename != "newpipe.db"

-           tempfile = File.tempfile(".db")
-
-           begin
-             File.write(tempfile.path, io_sized.gets_to_end)
-           rescue
-             return false
-           end
-
-           db = DB.open("sqlite3://" + tempfile.path)
+           # Ensure max size of 4MB
+           io_sized = IO::Sized.new(file_io, 0x400000)

-           user.watched += db.query_all("SELECT url FROM streams", as: String)
-             .map(&.lchop("https://www.youtube.com/watch?v="))
+           temp = File.tempfile(".db") do |tempfile|
Contributor Author:

Would we maybe want to randomly generate part of the file's name (e.g. `.db-{UUID}`)? I imagine this might cause some issues if Invidious starts to use concurrency.
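The suggestion could be sketched like this (entirely hypothetical code, not part of the PR; note that `File.tempfile` already adds random characters to the name via the OS, so the explicit UUID is belt-and-braces):

```crystal
require "uuid"

# Hypothetical sketch of the `.db-{UUID}` idea: give each import its own
# uniquely-named temporary file so concurrent imports cannot collide.
tempfile = File.tempfile(".db-#{UUID.random}")
begin
  # ... write the extracted newpipe.db bytes here ...
ensure
  tempfile.delete
end
```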

Member:

Yes, that might be a huge problem! The ideal option would be to use in-memory databases and sqlite3_serialize, but that requires making a PR to crystal-sqlite3.

Contributor Author:

Yes, that would be even better! I imagine a PR might need to be made here as well (if serialization is supported by other databases): crystal-db

Member:

In-memory databases are already supported in crystal-sqlite3, like so:

```crystal
DB.open "sqlite3::memory:" do |db|
  db.exec "create table contacts (name text, age integer)"
  db.exec "insert into contacts values (?, ?)", "John Doe", 30
end
```

The only thing that needs to be added is a way to actually access an in-memory database.

There's no need for a PR directly imo, an issue would be enough.

+             begin
+               File.write(tempfile.path, io_sized.gets_to_end)
+             rescue
+               return false
+             end

-           user.watched.uniq!
-           Invidious::Database::Users.update_watch_history(user)
+             DB.open("sqlite3://" + tempfile.path) do |db|
+               user.watched += db.query_all("SELECT url FROM streams", as: String)
+                 .map(&.lchop("https://www.youtube.com/watch?v="))

-           user.subscriptions += db.query_all("SELECT url FROM subscriptions", as: String)
-             .map(&.lchop("https://www.youtube.com/channel/"))
+               user.watched.uniq!
+               Invidious::Database::Users.update_watch_history(user)

-           user.subscriptions.uniq!
-           user.subscriptions = get_batch_channels(user.subscriptions)
+               user.subscriptions += db.query_all("SELECT url FROM subscriptions", as: String)
+                 .map(&.lchop("https://www.youtube.com/channel/"))

-           Invidious::Database::Users.update_subscriptions(user)
+               user.subscriptions.uniq!
+               user.subscriptions = get_batch_channels(user.subscriptions)

-           db.close
-           tempfile.delete
+               Invidious::Database::Users.update_subscriptions(user)
+             end
+           end
+           temp.delete
          end
        end
      end
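The idiom the PR relies on, shown in isolation: Crystal's block forms of `File.tempfile` and `DB.open` yield the resource and close it when the block exits, even via an early `return` or an exception, which is exactly what the old hand-written `db.close` could not guarantee. A minimal standalone sketch (suffix is illustrative):

```crystal
# `File.tempfile` with a block closes the file when the block exits and
# returns the File object, so the path can still be deleted afterwards.
temp = File.tempfile(".example") do |tempfile|
  tempfile.print("some data")
end # file handle is closed here
temp.delete
```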