
Unsupported atoms are simply ignored? #616

Open
RMeli opened this issue Apr 3, 2022 · 3 comments

Comments

RMeli commented Apr 3, 2022

I would expect the following code to fail:

import torch
import torchani

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

model = torchani.models.ANI2x(periodic_table_index=True).to(device)

coordinates = torch.tensor([[[0.03192167, 0.00638559, 0.01301679],
                             [-0.83140486, 0.39370209, -0.26395324],
                             [-0.66518241, -0.84461308, 0.20759389],
                             [0.45554739, 0.54289633, 0.81170881],
                             [0.66091919, -0.16799635, -0.91037834]]],
                           requires_grad=True, device=device)

# Species with unsupported atomic number 42
species = torch.tensor([[42, 1, 1, 1, 1]], device=device)

energy = model((species, coordinates)).energies

but it runs without errors or warnings despite the unsupported atom:

In [12]: energy.item()
Out[12]: -2.1059779035978243

Is this intended? I understand the need to ignore atoms internally due to padding, but silently ignoring input atoms feels very error-prone to me (from a user perspective).
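
In the meantime, the only safeguard seems to be a user-side check on the input species. Below is a minimal sketch, assuming the ANI-2x element set (H, C, N, O, F, S, Cl) and torchani's -1 padding convention; check_species is a hypothetical helper for illustration, not part of the torchani API:

supported_z = {1, 6, 7, 8, 9, 16, 17}  # H, C, N, O, F, S, Cl: elements covered by ANI-2x

def check_species(species: torch.Tensor) -> None:
    """Raise if any non-padding atomic number is outside the supported set."""
    # -1 is torchani's padding value, so it is skipped on purpose
    atomic_numbers = {int(z) for z in species.flatten().tolist() if z != -1}
    unsupported = atomic_numbers - supported_z
    if unsupported:
        raise ValueError(f"Unsupported atomic numbers in input: {sorted(unsupported)}")

check_species(species)  # would raise ValueError for the atomic number 42 used above
energy = model((species, coordinates)).energies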

RMeli closed this as not planned on Oct 22, 2023
@UnixJunkie

An exception should be raised in such cases.
Silently ignoring atoms is really bad.

yueyericardo commented Oct 23, 2023

Thanks! We will take care of this issue soon!

yueyericardo reopened this on Oct 23, 2023

RMeli commented Oct 23, 2023

@UnixJunkie totally agree. I closed this issue only because it had not been acknowledged for over a year, and I didn't know whether it was still relevant.

Thanks @yueyericardo for re-opening and looking into this.
