
Source example code for Player strategy needs to be updated #74

Closed
MileanCo opened this issue Mar 26, 2024 · 7 comments

Comments

@MileanCo

MileanCo commented Mar 26, 2024

Just a nitpick, but the source code on the README page for the "Player strategy" is missing some imports and definitions. For example, I had to update the import like this:

```typescript
import { createRealTimeBpmProcessor, getBiquadFilter } from 'realtime-bpm-analyzer';
```

And define an audio context like so:

```typescript
const audioContext = new AudioContext();
```

Also, the `track` element appears to be missing: how do you get it from the DOM / HTML page? These things might be really obvious to some people but not to others :)

@MileanCo
Author

MileanCo commented Mar 28, 2024

I tried the example and couldn't get it to work... am I missing something? It just says the BPM is undefined. Here's my code.

TypeScript file (Angular):

```typescript
import { createRealTimeBpmProcessor, getBiquadFilter } from 'realtime-bpm-analyzer';

const audioContext = new AudioContext();
const realtimeAnalyzerNode = await createRealTimeBpmProcessor(audioContext);

// Set the source with the HTML Audio Node
const track = document.getElementById('track') as HTMLAudioElement;
const source = audioContext.createMediaElementSource(track);
const lowpass = getBiquadFilter(audioContext);

// Connect nodes together
source.connect(lowpass).connect(realtimeAnalyzerNode);
source.connect(audioContext.destination);

realtimeAnalyzerNode.port.onmessage = (event) => {
  if (event.data.message === 'BPM') {
    console.log(event);
    console.log('BPM', event.data.result);
  }
  if (event.data.message === 'BPM_STABLE') {
    console.log('BPM_STABLE', event.data.result);
  }
};
```
HTML file:

```html
<audio src="/assets/Come With Me.wav" id="track"></audio>
```

Then if I load this URL in my Angular web app, it starts downloading the file, so it's definitely there. If I change it to an invalid path, I get an error about the file not being found.

@dlepaux
Owner

dlepaux commented Mar 30, 2024

Hey @MileanCo

The AudioContext must be created after a human gesture, so you should create it within an event handler, typically a button click (e.g. a "Get BPM" button).

If the error persists, can you share your repo or an example so I can try it out?

Thanks
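The gesture requirement above can be sketched as a lazy, create-once wrapper, so the AudioContext is only constructed inside the click handler (the button id and helper name here are illustrative, not from the library):

```typescript
// Wrap a factory so it runs at most once; subsequent calls return the
// same instance. Useful for deferring AudioContext creation to a gesture.
function lazyOnce<T>(create: () => T): () => T {
  let instance: T | undefined;
  return () => (instance ??= create());
}

// In the browser (assumption: a button with id "start" exists):
// const getContext = lazyOnce(() => new AudioContext());
// document.getElementById('start')!.addEventListener('click', () => {
//   const audioContext = getContext(); // created inside the gesture handler
// });
```

This keeps a single AudioContext per page while still satisfying the browser's autoplay policy, which suspends contexts created outside a user gesture.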

@MileanCo
Author

MileanCo commented Apr 3, 2024

Hey @dlepaux, thanks for the response. I do have the audioContext being created after a button click. I'm using Angular and TypeScript; perhaps that's the reason? Here are some more code snippets (not the complete project, as that would be a lot of code). Perhaps this is somehow missing $event in the button click function?

```typescript
import { Component, OnInit } from '@angular/core';
import { HttpClient } from '@angular/common/http';
import { FormControl, ValidationErrors, Validator, Validators } from '@angular/forms';
import { ActivatedRoute, NavigationEnd, Router } from '@angular/router';

import { createRealTimeBpmProcessor, getBiquadFilter } from 'realtime-bpm-analyzer';

@Component({
  selector: 'app-home',
  templateUrl: './home.component.html',
  styleUrls: ['./home.component.less']
})
export class HomeComponent implements OnInit {
    public demo_track = "Mystec - Kuze's Dream.mp3";

    constructor(
        private router: Router,
        private httpClient: HttpClient,
        private activatedRoute: ActivatedRoute) {
    }

    ngOnInit(): void {
    }

    async submit() {
        const audioContext = new AudioContext();
        const realtimeAnalyzerNode = await createRealTimeBpmProcessor(audioContext);

        // Set the source with the HTML Audio Node
        const track = document.getElementById('track') as HTMLAudioElement;
        const source = audioContext.createMediaElementSource(track);
        const lowpass = getBiquadFilter(audioContext);

        // Connect nodes together
        source.connect(lowpass).connect(realtimeAnalyzerNode);
        source.connect(audioContext.destination);

        realtimeAnalyzerNode.port.onmessage = (event) => {
          if (event.data.message === 'BPM') {
            console.log(event);
            console.log('BPM', event.data.result);
          }
          if (event.data.message === 'BPM_STABLE') {
            console.log('BPM_STABLE', event.data.result);
          }
        };
    }
}
```

home.component.html:

```html
<audio src="/assets/{{demo_track}}" id="track"></audio>

<button mat-raised-button color="primary" aria-label="Submit" (click)="submit()">
  Submit
</button>
```

You get to the home page (index page) via a router:


```typescript
/* Lib imports */
import { NgModule } from '@angular/core';
import { ExtraOptions, RouterModule, Routes } from '@angular/router';
/* App imports */
import { HomeComponent } from './home/home.component';

const routes: Routes = [
    { path: '', component: HomeComponent },
];

const routerOptions: ExtraOptions = {
    // useHash: false,
    anchorScrolling: 'enabled',
};

@NgModule({
  declarations: [],
  imports: [
    RouterModule.forRoot(routes, routerOptions)
  ],
  exports: [
    RouterModule
  ]
})
export class AppRoutingModule { }
```

If I use an invalid path to an .mp3 file that doesn't exist, I get this error:

```
GET http://localhost:4200/assets/Mystec%20-%20Kuze's%20Dream.mp3f net::ERR_ABORTED 404 (Not Found)
```

If I use a valid path, I get BPM undefined. I print `event.data` and it looks like this:

```json
{
    "message": "BPM",
    "data": {
        "bpm": [],
        "threshold": 0.2
    }
}
```
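Note that the payload above nests the candidates under `data.bpm`, while the snippet logs `event.data.result`. A small helper (names hypothetical, payload shape taken from the printed output above) makes the empty-candidate case explicit:

```typescript
// Shapes mirror the event.data printed above; names are illustrative.
interface BpmCandidate {
  tempo: number;
  count: number;
  confidence: number;
}

interface AnalyzerPayload {
  message: string;
  data: { bpm: BpmCandidate[]; threshold: number };
}

// Returns the leading tempo, or undefined while no candidate is ready,
// which is exactly what an empty `bpm` array produces.
function firstTempo(payload: AnalyzerPayload): number | undefined {
  return payload.data.bpm[0]?.tempo;
}
```

With the payload printed above (`bpm: []`), `firstTempo` returns `undefined`.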

You can easily generate your own Angular project by using `ng generate`:

```shell
ng generate app [name]
ng generate component component-name --module app
```

https://angular.io/cli/generate

@MileanCo
Author

MileanCo commented Apr 3, 2024

I printed the various pieces and got this:

```typescript
// Set the source with the HTML Audio Node
const track = document.getElementById('track') as HTMLAudioElement;
console.log(track);
const source = audioContext.createMediaElementSource(track);
console.log(source);
const lowpass = getBiquadFilter(audioContext);
console.log(lowpass);

// Connect nodes together
source.connect(lowpass).connect(realtimeAnalyzerNode);
source.connect(audioContext.destination);
```

[Screenshot: console output of the logged track, source, and lowpass nodes]

@dlepaux
Owner

dlepaux commented Apr 3, 2024

@MileanCo it seems you've successfully plugged in the library.
The expected behavior is that you need to play the track; then you will eventually get a result.

One reason you might not get any results is that your track doesn't have enough energy in the bass frequencies.

You can try removing the lowpass filter to check whether you get any results.

Can you share the audio file you're trying to analyse, please? I'll take a look tomorrow.

@MileanCo
Author

MileanCo commented Apr 4, 2024

OK, even after playing the audio by just doing track.play() it still doesn't work, and I can hear the song playing. It's a house track, so it should be really easy to identify the BPM. No idea why it isn't working here; probably some environment weirdness related to Angular or TypeScript.

Anyway, I tried the Local/Offline strategy and it worked! I had to change the code from what's on the README page again; TypeScript complained about reader.result having three possible types (string, ArrayBuffer, or null).

Here's the code I got working


```typescript
// Note: analyzeFullBuffer comes from the library, e.g.
// import * as realtimeBpm from 'realtime-bpm-analyzer';
async submit() {
    const audioContext = new AudioContext();
    const url = "/assets/You Got Me.mp3";
    const fileBlob = await this.httpClient.get(url, { responseType: 'blob' }).toPromise() as Blob;

    const reader: FileReader = new FileReader();
    reader.onload = () => {
        const arrayBuffer = reader.result as ArrayBuffer;
        // The file is loaded, now we decode it
        audioContext.decodeAudioData(arrayBuffer, (audioBuffer) => {
            // The result is passed to the analyzer
            realtimeBpm.analyzeFullBuffer(audioBuffer).then((topCandidates) => {
                // Do something with the BPM
                console.log('topCandidates', topCandidates);
            });
        });
    };
    reader.readAsArrayBuffer(fileBlob);
}
```

And the result:

```json
[
    { "tempo": 124, "count": 327, "confidence": 0 },
    { "tempo": 165, "count": 241, "confidence": 0 },
    { "tempo": 99, "count": 196, "confidence": 0 },
    { "tempo": 142, "count": 102, "confidence": 0 },
    { "tempo": 110, "count": 93, "confidence": 0 }
]
```

Can we always assume the first top candidate (index 0) is correct? The track is indeed 124.

@dlepaux
Owner

dlepaux commented Apr 10, 2024

@MileanCo Glad you succeeded in using it!
The first candidate (the one with the highest count) is the most accurate most of the time, roughly 80% of the time, though the second one can occasionally be more appropriate.
Can I close the issue? Thank you
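That heuristic can be sketched as a small selection helper. The candidate shape is taken from the result printed earlier in this thread; the 0.8 margin threshold is an illustrative choice, not part of the library:

```typescript
interface TempoCandidate {
  tempo: number;
  count: number;
  confidence: number;
}

// Pick the leading candidate, flagging the result as uncertain when the
// runner-up's count is close to the winner's (illustrative 0.8 margin).
function pickTempo(
  candidates: TempoCandidate[],
): { tempo: number; uncertain: boolean } | undefined {
  if (candidates.length === 0) return undefined;
  const [first, second] = candidates;
  const uncertain = second !== undefined && second.count / first.count > 0.8;
  return { tempo: first.tempo, uncertain };
}
```

With the candidates above (counts 327 and 241), `pickTempo` returns tempo 124 with `uncertain: false`, since 241/327 is about 0.74.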

@dlepaux dlepaux closed this as completed May 17, 2024