
Questions about the paper #5

Open
Hatins opened this issue May 8, 2024 · 0 comments


Hatins commented May 8, 2024

Hi @NikolaZubic,

I recently studied the paper and source code carefully, and there are a few aspects I don't quite understand.

Firstly, in the output-masking step, why is only 'C' masked and not 'A' and 'B'?

Secondly, in your code, adjusting only `step_scale` leaves the mask of 'C' unaffected:

    step = step_scale * torch.exp(self.log_step)

    freqs = step / step_scale * self.Lambda[:, 1].abs() / (2 * math.pi)
    mask = torch.where(freqs < bandlimit * 0.5, 1, 0)  # (64,)

The `freqs` are not linked to the value of `step_scale`, since the factor cancels: `freqs = step / step_scale * self.Lambda[:, 1].abs() / (2 * math.pi)` simplifies to `freqs = torch.exp(self.log_step) * self.Lambda[:, 1].abs() / (2 * math.pi)`. Is the code wrong?
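The cancellation can be checked numerically. Below is a minimal sketch (with hypothetical stand-in tensors for `self.log_step` and `self.Lambda[:, 1]`, not the repository's actual parameters) showing that the resulting mask is identical for any `step_scale`:

```python
import math
import torch

torch.manual_seed(0)

# Hypothetical stand-ins for the model's parameters (not the repo's real values):
log_step = torch.randn(64)            # plays the role of self.log_step
lambda_imag = torch.randn(64).abs()   # plays the role of self.Lambda[:, 1].abs()
bandlimit = 1.0

def compute_mask(step_scale):
    step = step_scale * torch.exp(log_step)
    # step_scale cancels out in the next line, so freqs never depends on it
    freqs = step / step_scale * lambda_imag / (2 * math.pi)
    return torch.where(freqs < bandlimit * 0.5, 1, 0)

# The mask is unchanged no matter what step_scale is passed in:
assert torch.equal(compute_mask(1.0), compute_mask(4.0))
assert torch.equal(compute_mask(1.0), compute_mask(0.25))
```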

Question 2 leads to a related Question 3. The paper only mentions that generalization from low frequencies to high frequencies is achieved by masking 'C', but in the code the discretization of 'A' and 'B' also depends on `step_scale`. So I wonder: for generalization from low frequencies to high frequencies, is it necessary to adjust all three values 'A', 'B', and 'C'?

    if not torch.is_tensor(step_scale) or step_scale.ndim == 0:
        # step_scale = torch.ones(signal.shape[-2], device=signal.device) * step_scale
        step = step_scale * torch.exp(self.log_step)
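To make the contrast with the mask concrete: unlike `freqs`, the discretized state matrices genuinely change with `step_scale`. Here is a minimal sketch of zero-order-hold discretization of a diagonal SSM (assumed standard ZOH formulas with made-up `Lambda` and `B` values, not the repository's exact code):

```python
import torch

torch.manual_seed(0)

# Hypothetical diagonal SSM parameters (illustrative only):
log_step = torch.randn(4)
Lambda = torch.complex(-torch.rand(4) - 0.1, torch.randn(4))  # stable diagonal A
B = torch.randn(4, dtype=torch.cfloat)

def discretize(step_scale):
    # Standard ZOH: A_bar = exp(step * Lambda), B_bar = (A_bar - 1) / Lambda * B
    step = step_scale * torch.exp(log_step)
    A_bar = torch.exp(step * Lambda)
    B_bar = (A_bar - 1.0) / Lambda * B
    return A_bar, B_bar

A1, B1 = discretize(1.0)
A2, B2 = discretize(2.0)
# The discretized A and B do depend on step_scale, unlike the frequency mask:
assert not torch.allclose(A1, A2)
assert not torch.allclose(B1, B2)
```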