I have a feed from the Sage API, which is paginated.
The feed itself is set up correctly in FeedMe and imports data into Entries just great.
The correct node is set for the Pagination parameter.
When the feed is processed, however, it does not paginate: only the first batch is imported.
This is the data in the Queue Job, where we can see `"paginationNode": "$next"`, and Sage's node for the next page is `$next` (yes, they use a dollar sign in their keys).
If I do the same query using Consume rather than FeedMe you can see what sort of data we get back from their API:
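The screenshot of the Consume output hasn't survived here. Illustratively (the key names are based on Sage's documented pagination fields, and the URL is a placeholder, not the real domain), the response shape is along these lines:

```json
{
  "$total": 123,
  "$page": 1,
  "$next": "https://wrong-domain.example/accounts/v3/sites?page=2",
  "$back": null,
  "$items": []
}
```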
Note the incorrect domain. That's on Sage's side and we can't do anything about it, so it's being "fixed" via a module that listens to the After Fetch Feed event and substitutes the proper domain:
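The module code itself was omitted above. As a sketch (the event and property names are from memory of Feed Me's event documentation and may not match exactly, and both domains are placeholders), it was along these lines:

```php
<?php

namespace modules;

use craft\feedme\events\FeedDataEvent;
use craft\feedme\services\DataTypes;
use yii\base\Event;

class Module extends \yii\base\Module
{
    public function init(): void
    {
        parent::init();

        // Rewrite the bad domain in the raw response before Feed Me
        // parses the feed (and so before it reads the $next URL).
        Event::on(
            DataTypes::class,
            DataTypes::EVENT_AFTER_FETCH_FEED,
            function (FeedDataEvent $event) {
                if (is_string($event->response['data'] ?? null)) {
                    $event->response['data'] = str_replace(
                        'https://wrong-domain.example', // placeholder: domain Sage returns
                        'https://api.sage.example',     // placeholder: correct domain
                        $event->response['data']
                    );
                }
            }
        );
    }
}
```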
Update: we have managed to get the client to correct the main URL on Sage's end, so we are now getting "real" URLs returned, and have removed that edit from our module.
It still does not paginate, and the logs don't show any attempt to paginate.
{"date":"2024-01-24 09:27:32","type":"info","message":"Protected Parks - Sage API: Preparing for feed processing."}
{"date":"2024-01-24 09:27:32","type":"info","message":"Protected Parks - Sage API: Finished preparing for feed processing."}
{"date":"2024-01-24 09:27:32","type":"info","message":"Protected Parks - Sage API: Starting processing of node `#1`.","key":"vyjozztouhyyxyfbuxea"}
{"date":"2024-01-24 09:27:32","type":"info","message":"Protected Parks - Sage API: Match existing element with data `{\"title\":\"Battersea Fields Millgrove Street Open Space\",\"sageSiteId\":\"5001\"}`.","key":"vyjozztouhyyxyfbuxea"}
...SKIPPED 10 IMPORTS...
{"date":"2024-01-24 09:27:32","type":"info","message":"Protected Parks - Sage API: The following elements have been deleted: {\"10\":244229,\"11\":244230,\"12\":244231,\"13\":244232,\"14\":244233,\"15\":244234,\"16\":244235,\"17\":244236,\"18\":244237,\"19\":244238}.","key":"ljoqblqqgeefcwwglcaw"}
{"date":"2024-01-24 09:27:32","type":"info","message":"Protected Parks - Sage API: Processing 10 elements finished in 0.38s"}
Digging in with my basic knowledge of the PHP debugger, it looks as though Feed Me/Craft is rejecting the supplied pagination URL as invalid.
At this point, it seems that either the Sage API is returning non-standard URLs, or there is some edge case here in which Craft flags a valid URL as invalid.
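I can't see exactly which check is failing, but as an illustration of how a strict validator can reject a "next" link (this is plain PHP's `filter_var`, not necessarily what Feed Me actually uses):

```php
<?php

// filter_var() only accepts absolute URLs with a scheme, so a relative
// or scheme-less "next" link would fail this kind of check.
var_dump(filter_var('https://api.example.com/sites?page=2', FILTER_VALIDATE_URL)); // the URL string: passes
var_dump(filter_var('/sites?page=2', FILTER_VALIDATE_URL));                        // bool(false): fails
```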
I'm at the edge of my knowledge here, and I don't know what about this is invalid.
The domain substitution did appear to work while it was in place (the replaced domain was visible in the output), but I cannot work out why pagination is not triggering.