Current Behavior
See issue #2853: the problem only occurs if a CSV node is used in a sequence and the columns property of the CSV node is initially empty.
In this case the columns property of the CSV node is set correctly while the first part of the sequence is processed.
But the columns property is not updated for the second and all following parts. As a result, every written data file contains a wrong header followed by rows with empty values only.
This behaviour is exactly the same as in the closed issue #2853, so that fix apparently did not fully resolve the problem.
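The behaviour described above is what you get when the column list is resolved once and then cached. The sketch below is illustrative only (it is not the CSV node's actual source; `toCsvBuggy` and `cachedColumns` are made-up names) and shows why later parts produce a wrong header and empty rows:

```javascript
// Illustrative sketch of the faulty caching pattern (NOT the node's real code):
// the columns are taken from the first part of the sequence and reused
// for every later part, even when msg.columns has changed.
let cachedColumns = null;

function toCsvBuggy(msg) {
  if (!cachedColumns) cachedColumns = msg.columns; // only set on the first part
  const header = cachedColumns.join(",");
  const rows = msg.payload.map(r => cachedColumns.map(c => r[c] ?? "").join(","));
  return [header, ...rows].join("\n");
}

// First part: correct output.
toCsvBuggy({ columns: ["a", "b"], payload: [{ a: 1, b: 2 }] }); // "a,b\n1,2"

// Second part with different columns: the header still comes from the
// first part, and the rows contain empty values only.
toCsvBuggy({ columns: ["x", "y"], payload: [{ x: 3, y: 4 }] }); // "a,b\n,"
```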
Expected Behavior
For every part of a sequence, the CSV node should not only process the current msg.payload but also update its columns property to the current value of msg.columns.
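A minimal sketch of the expected behaviour, assuming a simplified conversion function (`toCsv` and the `node` object here are illustrative, not the node's actual API): the column list is re-evaluated for every message instead of being cached from the first part.

```javascript
// Illustrative sketch of the EXPECTED behaviour: columns are re-read
// on every message of the sequence.
function toCsv(node, msg) {
  // Configured columns win; otherwise fall back to the current msg.columns,
  // otherwise derive them from the keys of the first row.
  const columns = (Array.isArray(node.columns) && node.columns.length)
    ? node.columns
    : (msg.columns || Object.keys(msg.payload[0] || {}));
  const header = columns.join(",");
  const rows = msg.payload.map(r => columns.map(c => r[c] ?? "").join(","));
  return [header, ...rows].join("\n");
}

const node = { columns: [] }; // columns property initially empty, as in the report

// Each part of the sequence carries its own msg.columns and gets its own header:
toCsv(node, { columns: ["a", "b"], payload: [{ a: 1, b: 2 }] }); // "a,b\n1,2"
toCsv(node, { columns: ["x", "y"], payload: [{ x: 3, y: 4 }] }); // "x,y\n3,4"
```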
Hello everyone,
I have just provided a sample flow with sample data for two tables.
With the first table everything is OK; with the second table the issue described above occurs.
See "debug 12" / msg.payload (it contains the generated CSV string).
Many thanks for your help,
Michael
Steps To Reproduce
No response
Example flow
Environment