[2022-06-17 01:57:44,140] {base.py:68} INFO - Using connection ID 'airbyte_default' for task execution.
[2022-06-17 01:57:44,141] {http.py:129} INFO - Sending 'POST' to url: http://airbyte-server:8001/api/v1/connections/sync
[2022-06-17 01:57:49,912] {base.py:68} INFO - Using connection ID 'airbyte_default' for task execution.
[2022-06-17 01:57:49,913] {http.py:129} INFO - Sending 'POST' to url: http://airbyte-server:8001/api/v1/jobs/get
[2022-06-17 01:57:53,029] {base.py:68} INFO - Using connection ID 'airbyte_default' for task execution.
[2022-06-17 01:57:53,031] {http.py:129} INFO - Sending 'POST' to url: http://airbyte-server:8001/api/v1/jobs/get
[2022-06-17 01:57:56,071] {base.py:68} INFO - Using connection ID 'airbyte_default' for task execution.
[2022-06-17 01:57:56,072] {http.py:129} INFO - Sending 'POST' to url: http://airbyte-server:8001/api/v1/jobs/get
[2022-06-17 01:57:59,154] {base.py:68} INFO - Using connection ID 'airbyte_default' for task execution.
[2022-06-17 01:57:59,162] {http.py:129} INFO - Sending 'POST' to url: http://airbyte-server:8001/api/v1/jobs/get
[2022-06-17 01:57:59,329] {taskinstance.py:1889} ERROR - Task failed with exception
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/airbyte/operators/airbyte.py", line 77, in execute
hook.wait_for_job(job_id=job_id, wait_seconds=self.wait_seconds, timeout=self.timeout)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/airbyte/hooks/airbyte.py", line 84, in wait_for_job
raise Exception(f"Encountered unexpected state `{state}` for job_id `{job_id}`")
Exception: Encountered unexpected state `failed` for job_id `1`
[2022-06-17 01:57:59,337] {taskinstance.py:1400} INFO - Marking task as FAILED. dag_id=airflow_summit_airbyte, task_id=airbyte_sync_source_dest_example, execution_date=20220101T000000, start_date=20220617T015734, end_date=20220617T015759
[2022-06-17 01:57:59,377] {debug_executor.py:87} ERROR - Failed to execute task: Encountered unexpected state `failed` for job_id `1`.
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/executors/debug_executor.py", line 79, in _run_task
ti._run_raw_task(job_id=ti.job_id, **params)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 71, in wrapper
return func(*args, session=session, **kwargs)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1451, in _run_raw_task
self._execute_task_with_callbacks(context, test_mode)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1598, in _execute_task_with_callbacks
result = self._execute_task(context, task_orig)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1659, in _execute_task
result = execute_callable(context=context)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/airbyte/operators/airbyte.py", line 77, in execute
hook.wait_for_job(job_id=job_id, wait_seconds=self.wait_seconds, timeout=self.timeout)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/airbyte/hooks/airbyte.py", line 84, in wait_for_job
raise Exception(f"Encountered unexpected state `{state}` for job_id `{job_id}`")
Exception: Encountered unexpected state `failed` for job_id `1`
[2022-06-17 01:57:59,389] {backfill_job.py:190} ERROR - Task instance <TaskInstance: airflow_summit_airbyte.airbyte_sync_source_dest_example backfill__2022-01-01T00:00:00+00:00 [failed]> failed
[2022-06-17 01:57:59,396] {dagrun.py:583} ERROR - Deadlock; marking run <DagRun airflow_summit_airbyte @ 2022-01-01T00:00:00+00:00: backfill__2022-01-01T00:00:00+00:00, externally triggered: False> failed
[2022-06-17 01:57:59,397] {dagrun.py:622} INFO - DagRun Finished: dag_id=airflow_summit_airbyte, execution_date=2022-01-01T00:00:00+00:00, run_id=backfill__2022-01-01T00:00:00+00:00, run_start_date=2022-06-17 01:57:34.108440+00:00, run_end_date=2022-06-17 01:57:59.397766+00:00, run_duration=25.289326, state=failed, external_trigger=False, run_type=backfill, data_interval_start=2022-01-01T00:00:00+00:00, data_interval_end=2022-01-02T00:00:00+00:00, dag_hash=None
[2022-06-17 01:57:59,401] {backfill_job.py:378} INFO - [backfill progress] | finished run 1 of 1 | tasks waiting: 2 | succeeded: 1 | running: 0 | failed: 1 | skipped: 0 | deadlocked: 0 | not ready: 2
[2022-06-17 01:57:59,417] {backfill_job.py:460} ERROR - Task instance <TaskInstance: airflow_summit_airbyte.dbt_deps backfill__2022-01-01T00:00:00+00:00 [upstream_failed]> with state upstream_failed
[2022-06-17 01:57:59,439] {backfill_job.py:523} ERROR - Task instance <TaskInstance: airflow_summit_airbyte.dbt_run backfill__2022-01-01T00:00:00+00:00 [upstream_failed]> upstream failed
[2022-06-17 01:57:59,453] {backfill_job.py:378} INFO - [backfill progress] | finished run 1 of 1 | tasks waiting: 0 | succeeded: 1 | running: 0 | failed: 3 | skipped: 0 | deadlocked: 0 | not ready: 0
Some task instances failed:
DAG ID Task ID Run ID Try number
---------------------- -------------------------------- ----------------------------------- ------------
airflow_summit_airbyte airbyte_sync_source_dest_example backfill__2022-01-01T00:00:00+00:00 1
airflow_summit_airbyte dbt_deps backfill__2022-01-01T00:00:00+00:00 1
airflow_summit_airbyte dbt_run backfill__2022-01-01T00:00:00+00:00 1
The dag exit with errors
Credentials were added to {HOME}/.octavia and a dbt YAML file was created as described in the README.
Then I ran the tools/run.sh script, and the Airflow 'airbyte_sync_source_dest_example' task failed with the error message above. Why do I get a different result when I followed the instructions exactly? I'd like to understand why. And thank you for your presentation.
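For context on why the task fails with this particular message: the traceback shows `AirbyteHook.wait_for_job` polling `jobs/get` and raising as soon as the job reaches an unexpected terminal state. Below is a simplified, self-contained sketch of that polling logic (not the actual provider code; the state names and the error message are taken from the log above, and the real hook also sleeps between polls and enforces a timeout):

```python
# Simplified sketch of the polling loop in AirbyteHook.wait_for_job.
# NOT the provider implementation; state handling mirrors the log above.

def wait_for_job(get_state):
    """Poll get_state() until the job reaches a terminal state.

    'succeeded' returns normally; any other terminal state such as
    'failed' raises, which is exactly what the task log shows.
    """
    while True:
        state = get_state()
        if state == "succeeded":
            return state
        if state in ("failed", "cancelled", "error"):
            # Same message format as the traceback above (job_id hardcoded
            # to 1 here purely for illustration).
            raise Exception(f"Encountered unexpected state `{state}` for job_id `1`")
        # Non-terminal state ('running', 'pending', ...): poll again.
        # The real hook sleeps wait_seconds here and aborts after timeout.

# Simulate a sync that runs for a while and then fails, as in the log.
states = iter(["running", "running", "failed"])
try:
    wait_for_job(lambda: next(states))
except Exception as exc:
    print(exc)  # Encountered unexpected state `failed` for job_id `1`
```

So the Airflow task itself is behaving correctly here: it merely surfaces that the underlying Airbyte sync job ended in the `failed` state, and the root cause has to be found in the Airbyte job logs.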
Hi @zigbang-yeezy, I'm glad you found the presentation helpful!
[2022-06-17 01:57:59,377] {debug_executor.py:87} ERROR - Failed to execute task: Encountered unexpected state `failed` for job_id `1`.
This implies that something went wrong during the Airbyte step, most likely while loading data into your database, since we are using an in-memory source. Can you share some screenshots from the Airbyte UI and the logs from the sync? Thanks!
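To pull those sync details, one option is to query the same `jobs/get` endpoint the operator was polling (its URL is visible in the task log above). A minimal sketch using only the standard library, assuming the `airbyte-server:8001` address from the log, job id `1`, and the `{"id": ...}` request shape of the Airbyte v1 API:

```python
import json
import urllib.request

# URL and job id are taken from the task log above; the payload shape
# ({"id": <job id>}) is an assumption about the Airbyte v1 jobs/get API.
AIRBYTE_URL = "http://airbyte-server:8001/api/v1/jobs/get"
payload = json.dumps({"id": 1}).encode()

request = urllib.request.Request(
    AIRBYTE_URL,
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Actually sending the request needs a reachable Airbyte server, so it
# is left commented out here:
# with urllib.request.urlopen(request) as resp:
#     job = json.load(resp)
#     print(json.dumps(job, indent=2))  # job status plus per-attempt logs

print(payload.decode())  # {"id": 1}
```

The JSON response should include the job status and per-attempt logs, which is the same information you would see by opening the sync attempt in the Airbyte UI.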