
RemoveDataObjectJob appears to break when run synchronously #53

Open
chrispenny opened this issue Mar 14, 2021 · 0 comments


I have use_sync_jobs set to true.

Currently, this only seems to be an issue with this config switched on. If RemoveDataObjectJob is run through the usual ProcessJobQueueTask, it appears to complete successfully.
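
For context, the config I'm running with is roughly this (a minimal sketch from memory; I believe use_sync_jobs sits on IndexConfiguration, but treat the exact class name as illustrative):

# app/_config/search.yml (illustrative sketch, not copied verbatim from my project)
SilverStripe\SearchService\Service\IndexConfiguration:
  use_sync_jobs: true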

[Screenshot: Screen Shot 2021-03-15 at 8.53.02 AM]

I’m struggling to decipher parts of this, but this is what I have so far…

I archive a page.

  • Good: SearchServiceExtension::removeFromIndexes() is called. A DataObjectDocument is created and passed to $this->getBatchProcessor()->removeDocuments(). As far as I can tell, this Document accurately represents my SiteTree record.
  • Good: DataObjectBatchProcessor loops through my one Document, instantiates a Job, and runs it.
  • Bad?: RemoveDataObjectJob::setup() has a variable $documents, which holds the output of an array_map that looks to use $this->document->getDependentDocuments() (sketched after the snippet below). Adding some logging here, I can see that getDependentDocuments() returns an empty array (because my SiteTree record doesn't have any dependencies). This empty array is then passed to $this->indexer->setDocuments(). I can't see anywhere that $this->document (which represents the original SiteTree record) is added to the indexer.
  • Bad continued: Indexer::processNode() calls $documents = array_shift($remainingChildren), but if $remainingChildren is empty, then $documents will be null. A quick demonstration:
$arr = [];
$documents = array_shift($arr);

var_dump($documents);

// Output: NULL
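
To make the setup() step above concrete, here is a simplified paraphrase of the flow as I read it (not the actual module source; per-document bookkeeping is elided, and the only point is what gets handed to the indexer):

// Paraphrased sketch of RemoveDataObjectJob::setup() as I understand it.
// Only the dependent documents are collected; $this->document itself
// (the archived SiteTree record) is never passed to the indexer.
public function setup()
{
    parent::setup();

    $documents = array_map(
        static function ($document) {
            // (per-document handling elided in this sketch)
            return $document;
        },
        $this->document->getDependentDocuments() // empty array for a page with no dependencies
    );

    // The indexer therefore receives an empty array, and
    // Indexer::processNode() ends up shifting from an empty list.
    $this->indexer->setDocuments($documents);
}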

I haven't been able to determine what is "different" between queued Jobs and synchronously run Jobs. Perhaps some cache/state is persisted between the Jobs when they are run synchronously in the same request?
