dials.index writes out reflections for a rejected experiment #2652
Comments
It looks like this is the problem:

```python
>>> from dials.array_family import flex
>>> rt = flex.reflection_table.from_file("indexed.refl")
>>> list(set(rt["id"]))
[0, 1, 2, -1]
>>> (rt["id"] == 0).count(True)
1329
>>> (rt["id"] == 1).count(True)
1924
>>> (rt["id"] == 2).count(True)
349
>>> (rt["id"] == -1).count(True)
467
```

There are 349 reflections with an id of 2, even though the third experiment was rejected.
Indeed, so this is a `DIALS_ASSERT(id[i] < num_expr)` failure.
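The failing invariant can be sketched in plain Python. This is a hypothetical stand-in for the `DIALS_ASSERT` check, not the actual DIALS C++ code; the counts are taken from the session above, under the assumption that after rejecting the third lattice only two experiments remain:

```python
def check_ids(ids, num_expr):
    """Mimic DIALS_ASSERT(id[i] < num_expr): every indexed reflection's
    id must refer to an experiment that still exists. Unindexed
    reflections carry id == -1 and always pass."""
    for i in ids:
        if i >= num_expr:
            raise AssertionError(f"DIALS_ASSERT failure: id {i} >= num_expr {num_expr}")

# Counts from the session above: ids 0, 1, 2 and -1 survive in the
# reflection table, but only two experiments remain after rejection.
ids = [0] * 1329 + [1] * 1924 + [2] * 349 + [-1] * 467

try:
    check_ids(ids, num_expr=2)
except AssertionError as e:
    print(e)
```

With `num_expr=3` (no experiment rejected) the same table would pass the check, which is why the error only appears in this rejection corner case.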
Ok, I think I understand why this is a corner case now. For this data set, when we get here (dials/src/dials/algorithms/indexing/indexer.py, lines 614 to 616 at commit 9af1ec1), the value of the loop counter `i_cycle` is 1. That is, we have already done one round of refinement before the crystal model becomes "too similar" to an existing model. This means that the attribute `self.refined_reflections` has already been set in the previous round with `id` values up to 2.
The fix would be to reset these reflections to have an id of -1.
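A minimal sketch of that fix, using a plain Python list and a hypothetical helper name (in DIALS itself this would operate on the `id` column of `self.refined_reflections` via flex selections, not Python lists):

```python
def reset_rejected_ids(ids, rejected_id):
    """Sketch of the proposed fix: reflections previously assigned to a
    crystal model that is later rejected as 'too similar' are marked
    unindexed again (id = -1) instead of keeping a stale experiment id."""
    return [-1 if i == rejected_id else i for i in ids]

print(reset_rejected_ids([0, 1, 2, -1, 2, 0], rejected_id=2))  # → [0, 1, -1, -1, -1, 0]
```

After such a reset, every surviving id refers to a retained experiment, so the `DIALS_ASSERT(id[i] < num_expr)` check passes.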
For the other data set I looked at while trying to reproduce this, the new crystal model was immediately "too similar" to an existing model and no refinement with the new model was done, so the problem did not show up there.
I found a weird error trying to process data from https://zenodo.org/records/10974780.
Taking just the first data set (exp_705), which has more than one lattice, and then running the processing commands leads to an arcane error message.
I think this is related to the fact that we try to find 3 lattices, but the third is rejected as being too similar to the first. However, I have been unable so far to reproduce this with a different data set.