Replies: 1 comment
-
I was wrong again. I removed the try, and the task now fails as expected:
INFO - Marking task as FAILED. dag_id=dag_download_list_pro_match_id_from_date, task_id=upload_match_id_to_temp_table
ERROR - Failed to execute job 112 for task upload_match_id_to_temp_table (integer out of range
-
Hello. I'm new to Airflow; I searched for similar discussions but couldn't find one.
I created a table:
CREATE TABLE IF NOT EXISTS temp_promatches_download_queue (
match_id INTEGER PRIMARY KEY
);
I wrote a DAG and a task. The DAG completed successfully, but there is no data in the table:
Marking task as SUCCESS. dag_id=dag_download_list_pro_match_id_from_date, task_id=upload_match_id_to_temp_table
I found my mistake: in the table, match_id is INTEGER, and the number we are trying to record is larger, so the column must be BIGINT (or BIGSERIAL, if it should also auto-generate values).
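For context, a quick check of the limits involved (a sketch; the sample match id below is a hypothetical value, since the real ids were not posted):

```python
# PostgreSQL INTEGER (int4) is a signed 32-bit type, so its maximum is
# 2**31 - 1 = 2147483647; BIGINT (int8) goes up to 2**63 - 1. A match id
# above the int4 maximum is what triggers "integer out of range".
INT4_MAX = 2**31 - 1
INT8_MAX = 2**63 - 1

match_id = 7_500_000_000  # hypothetical large match id, not from the thread

print(match_id > INT4_MAX)   # True -> overflows an INTEGER column
print(match_id <= INT8_MAX)  # True -> fits comfortably in BIGINT
```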
Given the above: is this normal PostgresHook behavior, or is it a bug? Logically, if the data is not written, the task should end as "failed".
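One likely explanation for the SUCCESS status (a minimal sketch, not the author's actual task, which was not posted): if the insert call is wrapped in a bare try/except, the database error never propagates, the callable returns normally, and Airflow marks the task SUCCESS. Removing the try, as in the reply, lets the exception reach Airflow and the task is marked FAILED.

```python
# Minimal illustration: a bare try/except swallows the DB error, so the
# task callable "succeeds" even though nothing was written.
def failing_insert():
    # stand-in for the driver raising "integer out of range" on the INSERT
    raise ValueError("integer out of range")

def upload_with_try(insert=failing_insert):
    try:
        insert()
    except Exception:
        pass            # error hidden -> Airflow marks the task SUCCESS
    return "done"

def upload_without_try(insert=failing_insert):
    insert()            # error propagates -> Airflow marks the task FAILED

print(upload_with_try())  # returns normally despite the failed insert
```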