
Fix ignored tempo information in the original MXL file #11

Open
wants to merge 1 commit into
base: main
Conversation

RibomBalt

Greetings, Blob Opera is amazing and I really appreciate this cool project!

However, I experienced a similar issue to #10: the tempo is not right. And I think I found out why.

Reason for the BUG

First, in blobopera/recording.py, in the Part.from_part method, lines 314-316 (in your original commit):

for index, syllable in enumerate(syllables):
    duration = current.quarterLength / len(syllables)
    time = (current.offset + index * duration) / tempo

I've checked that the quarterLength and offset fields are measured in quarter notes, rather than real time (seconds). This means that with a tempo of "Quarter = 80", 4 quarter notes should last 3 seconds, but the offset is still 4. However, the offset seems to be used directly to generate the result, which means the resulting tempo is effectively fixed at 60 BPM. In my case it's even worse, since the tempo changes in the middle of the song :)

This may explain issue #10, since the tempo of the score therein is about 116, so the speed is about halved.
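
For example, converting a quarter-note offset to wall-clock seconds only needs the quarter-note BPM. This is a minimal sketch of that conversion, not the code from the patch itself:

# Offsets and quarterLengths are counted in quarter notes, so with
# "Quarter = 80" each quarter note lasts 60 / 80 seconds.
def beats_to_seconds(offset_in_quarters: float, quarter_bpm: float) -> float:
    return offset_in_quarters * 60.0 / quarter_bpm

# With "Quarter = 80", an offset of 4 quarter notes is 3 seconds, as noted above.
assert beats_to_seconds(4, 80) == 3.0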

Then, in blobopera/recording.py, in the same Part.from_part method, lines 248-252:

notes = (
    event
    for event in part.flat
    if isinstance(event, music21.note.GeneralNote)
)

This is used to iterate over all note/rest events in a music21 part and discard everything else. However, the tempo events are discarded as well.
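
As a rough sketch of the idea (with illustrative names, not the literal code from my commit), the same pass over part.flat could keep the metronome marks alongside the notes:

import music21

def split_notes_and_tempos(part: music21.stream.Part):
    """Collect note/rest events and tempo changes from a music21 part."""
    notes, tempos = [], []
    for event in part.flat:
        if isinstance(event, music21.note.GeneralNote):
            notes.append(event)
        elif isinstance(event, music21.tempo.MetronomeMark):
            # Remember the quarter-note offset where this tempo takes effect.
            tempos.append((event.offset, event.getQuarterBPM()))
    return notes, tempos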

My work to DEBUG & remaining TODOs

The two classes I added are just used to conveniently store the tempo information. Since these classes are not used in the protocol with the server, they don't extend the proto.Message class.
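
Roughly, the idea is a plain container like the following, which also handles a tempo that changes in the middle of the song (this is a simplified sketch; the actual class names and fields in the commit may differ):

from dataclasses import dataclass
from typing import List

@dataclass
class TempoSegment:
    start_offset: float  # quarter-note offset where this tempo takes effect
    quarter_bpm: float   # quarter notes per minute from that point on

class TempoMap:
    def __init__(self, segments: List[TempoSegment]):
        self.segments = sorted(segments, key=lambda segment: segment.start_offset)

    def seconds_at(self, offset: float) -> float:
        """Accumulate seconds across every tempo segment up to `offset`."""
        seconds = 0.0
        for index, segment in enumerate(self.segments):
            next_start = (
                self.segments[index + 1].start_offset
                if index + 1 < len(self.segments)
                else offset
            )
            span = min(offset, next_start) - segment.start_offset
            if span <= 0:
                break
            seconds += span * 60.0 / segment.quarter_bpm
        return seconds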

However, I only edited the import part of this code, so the export part will still lack the tempo information. This is future work, but for the purpose of using it to generate Blob Opera videos, it's enough.

@0x2b3bfa0 linked an issue Oct 28, 2021 that may be closed by this pull request
@Raseven

Raseven commented Mar 26, 2023

The current release does not read the tempo from the input, but the --tempo switch does accept any float value, not just 0.5-2.0. However, when you use values above 2.0, the blobs start to desync, especially the soprano.

I merged the changes @RibomBalt made into the current version of recording.py, and I can confirm that it sets the correct tempo, but it also causes the same desync issue for tempos above 120.

I found a workaround: MuseScore has a feature called "Paste half duration". So, for a 170 BPM song, you can copy and paste it at half duration and then set --tempo to 1.42 (which × 60 = 85.2 BPM; since every note is now half as long, this plays back at roughly the original 170 BPM), and it will play properly when uploaded to Blob Opera.
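
In other words, the arithmetic behind the workaround is roughly this (my reading of it, not an official formula):

# The converter effectively plays the score at quarter = 60 and --tempo scales
# that. Pasting at half duration halves every note length, so the factor needed
# to reach the original speed is halved as well.
original_bpm = 170
tempo_factor = original_bpm / 2 / 60   # ~1.42, within the stable 0.5-2.0 range
print(round(tempo_factor, 2))          # 1.42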
