I have `use_sync_jobs` set to `true`. Currently, this only seems to be an issue with that config enabled. If `RemoveDataObjectJob` is run through the usual `ProcessJobQueueTask`, it completes successfully.
I’m struggling to decipher parts of this, but this is what I have so far…
I archive a page.
Good: `SearchServiceExtension::removeFromIndexes()` is called. A `DataObjectDocument` is created and passed to `$this->getBatchProcessor()->removeDocuments()`. As far as I can tell, this `Document` accurately represents my `SiteTree` record.
Good: `DataObjectBatchProcessor` loops through my single `Document`, instantiates a `Job`, and runs it.
Bad?: `RemoveDataObjectJob::setup()` has a variable `$documents`, which is set to the output of an `array_map` that operates on `$this->document->getDependentDocuments()`. With some logging added, I can see that `getDependentDocuments()` returns an empty array (because my `SiteTree` record has no dependencies). This empty array is then passed to `$this->indexer->setDocuments()`. I can't see anywhere that `$this->document` (which represents the original `SiteTree` record) is added to the indexer.
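To illustrate, here is a minimal, self-contained sketch of the flow described above. This is not the actual silverstripe/search-service source; it just reproduces the suspected logic, with the variable names taken from my reading of `setup()`:

```php
<?php
// Stand-in for getDependentDocuments() on a SiteTree record with no
// dependencies: it returns an empty array.
$dependentDocuments = [];

// setup() maps over the dependent documents only; the original
// $this->document is never appended to the result.
$documents = array_map(
    static function ($dependent) {
        return $dependent; // placeholder for the real mapping callback
    },
    $dependentDocuments
);

// The indexer therefore receives an empty list.
var_dump(count($documents)); // int(0)
```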
Bad, continued: `Indexer::processNode()` calls `$documents = array_shift($remainingChildren)`, but if `$remainingChildren` is empty, then `$documents` will be `null`.
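The `null` result is just PHP's documented `array_shift()` behaviour on an empty array:

```php
<?php
// array_shift() returns null when the array is empty, so $documents
// ends up null in processNode() when there are no remaining children.
$remainingChildren = [];
$documents = array_shift($remainingChildren);
var_dump($documents); // NULL
```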
I haven't been able to determine what is "different" between queued Jobs and synchronously run Jobs. Perhaps some cache or state persists between Jobs when they run synchronously in the same request?