lineage plugin throws java.util.NoSuchElementException: None.get #6384

Closed
2 of 4 tasks
2018yinjian opened this issue May 10, 2024 · 11 comments
Labels: kind:bug (This is clearly a bug), priority:major

Comments

@2018yinjian

Code of Conduct

Search before asking

  • I have searched in the issues and found no similar issues.

Describe the bug

I am using the kyuubi-spark-lineage module to push Spark SQL data lineage to Atlas (following https://kyuubi.readthedocs.io/en/v1.8.1-docs/extensions/engines/spark/lineage.html#get-lineage-events), and the following exception keeps appearing:
24/05/10 11:32:19 WARN SparkSQLLineageParseHelper: Extract Statement[30918] columns lineage failed.
java.util.NoSuchElementException: None.get
at scala.None$.get(Option.scala:529) ~[scala-library-2.12.15.jar:?]
at scala.None$.get(Option.scala:527) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.getV2TableName(SparkSQLLineageParseHelper.scala:493) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.extractColumnsLineage(SparkSQLLineageParseHelper.scala:304) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.parse(SparkSQLLineageParseHelper.scala:54) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.parse$(SparkSQLLineageParseHelper.scala:52) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.SparkSQLLineageParseHelper.parse(SparkSQLLineageParseHelper.scala:510) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.SparkSQLLineageParseHelper.$anonfun$transformToLineage$1(SparkSQLLineageParseHelper.scala:516) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at scala.util.Try$.apply(Try.scala:213) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.SparkSQLLineageParseHelper.transformToLineage(SparkSQLLineageParseHelper.scala:516) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.SparkOperationLineageQueryExecutionListener.onSuccess(SparkOperationLineageQueryExecutionListener.scala:34) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.spark.sql.util.ExecutionListenerBus.doPostEvent(QueryExecutionListener.scala:165) ~[spark-sql_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.sql.util.ExecutionListenerBus.doPostEvent(QueryExecutionListener.scala:135) ~[spark-sql_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.util.ListenerBus.postToAll(ListenerBus.scala:117) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.util.ListenerBus.postToAll$(ListenerBus.scala:101) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.sql.util.ExecutionListenerBus.postToAll(QueryExecutionListener.scala:135) ~[spark-sql_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.sql.util.ExecutionListenerBus.onOtherEvent(QueryExecutionListener.scala:147) ~[spark-sql_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.SparkListenerBus.doPostEvent(SparkListenerBus.scala:100) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.SparkListenerBus.doPostEvent$(SparkListenerBus.scala:28) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.util.ListenerBus.postToAll(ListenerBus.scala:117) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.util.ListenerBus.postToAll$(ListenerBus.scala:101) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.AsyncEventQueue.super$postToAll(AsyncEventQueue.scala:105) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:105) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23) ~[scala-library-2.12.15.jar:?]
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) ~[scala-library-2.12.15.jar:?]
at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:100) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:96) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1446) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:96) ~[spark-core_2.12-3.3.1.jar:3.3.1]
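For reference, the lineage plugin is wired in as a Spark query execution listener, roughly as in the sketch below. This is a minimal sketch based on the linked docs and the listener class visible in the stack trace, not the exact job setup; the Atlas-side dispatcher settings described in the docs are omitted here.

// Minimal sketch of the listener registration assumed by this report.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("spark-lineage-to-atlas")
  .config(
    "spark.sql.queryExecutionListeners",
    "org.apache.kyuubi.plugin.lineage.SparkOperationLineageQueryExecutionListener")
  .getOrCreate()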

Affects Version(s)

kyuubi-spark-lineage 1.8.1

Kyuubi Server Log Output

No response

Kyuubi Engine Log Output

No response

Kyuubi Server Configurations

No response

Kyuubi Engine Configurations

No response

Additional context

24/05/10 11:32:19 WARN SparkSQLLineageParseHelper: Extract Statement[30918] columns lineage failed.
java.util.NoSuchElementException: None.get
(full stack trace identical to the one in the bug description above)

Are you willing to submit PR?

  • Yes. I would be willing to submit a PR with guidance from the Kyuubi community to fix.
  • No. I cannot submit a PR at this time.
2018yinjian added the kind:bug and priority:major labels on May 10, 2024
@wForget (Member) commented May 11, 2024

Can you provide SQL to reproduce this?

@2018yinjian (Author)

from pyspark.sql import functions as F   # needed for F.expr(...) below
from pyspark.sql.functions import col    # needed for col(...) in the joins below

# src_configs, tar_configs, src_process_time, tar_tab, df_coop, df_eng and df_gate
# are defined elsewhere in the job.
df1 = (spark.readStream
    .format("kafka")
    .options(**src_configs)
    .load()
    .selectExpr("offset",
        "get_json_object(cast(value as string), '$.database') as database_name",
        "get_json_object(cast(value as string), '$.table') as table_name",
        "get_json_object(cast(value as string), '$.type') as operation_type",
        "cast(get_json_object(cast(value as string), '$.isDdl') as boolean) as is_ddl",
        "get_json_object(cast(value as string), '$.data') as new_data",
        "cast(from_unixtime(cast(get_json_object(cast(value as string), '$.es') as bigint)/1000) as timestamp) as canal_create_time"
    ))
df = (df1.filter("operation_type != 'DELETE' and is_ddl = false")
    .selectExpr("offset", "database_name", "table_name", "canal_create_time",
        "explode(from_json(new_data,'array')) AS new_data",
    ))

query = (df.writeStream.queryName("ads-work-real-starrocks")
    .foreachBatch(process_row)
    .trigger(processingTime=str(src_process_time) + ' seconds')
    .option("checkpointLocation", f"/tmp/checkpoint/{tar_tab}")
    .start())
query.awaitTermination()

def process_row(df):
    df_extend1 = (df.filter(
        "database_name = 'base_serv_work' and rlike(table_name,'serv_work_extend_[0,1,2,3,4,5,6,7]')")
        .selectExpr("offset", "cast(get_json_object(new_data, '$.serv_work_id') as bigint) as id",
            "cast(get_json_object(new_data, '$.source_plat') as int) as plat_source",
            "cast(get_json_object(new_data, '$.work_plat') as int) as plat",
            "cast(get_json_object(new_data, '$.biz_type') as int) as serv_cluster_type",
            "cast(get_json_object(new_data, '$.engineer_supervisor_id') as int) as engineer_sup_id",
            "cast(get_json_object(new_data, '$.liability_engineer_id') as int) as perf_engineer_id",
            "cast(get_json_object(new_data, '$.source_cooperation_id') as int) as cooperation_id",
            "cast(get_json_object(new_data, '$.duplicate') as int) as duplicate",
            "cast(get_json_object(new_data, '$.test') as int) as test",
            "cast(get_json_object(new_data, '$.receive_entrance_id') as int) as receive_entrance",
            "cast(get_json_object(new_data, '$.exam_area') as int) as exam_area",
            "cast(get_json_object(new_data, '$.deleted') as int) as deleted"
        )
        .withColumn("num", F.expr("row_number() over(partition by id order by offset desc)")))
    df_extend2 = df_extend1.filter("num=1 and deleted=1").na.fill(0)
    # Join the cooperation dimension table (via the source cooperation) for cooperation and channel
    # info, excluding deactivated accounts; join the engineer dimension table for names; join the
    # entrance table for the entrance type.
    if not df_extend2.rdd.isEmpty():
        df_extend = (df_extend2.alias("m")
            .join(df_coop.alias("co1"), col("m.cooperation_id") == col("co1.cooperate_id"), "left")
            .join(df_eng.alias("e"), col("m.perf_engineer_id") == col("e.engineer_id"), "left")
            .join(df_gate.alias("g"), col("m.receive_entrance") == col("g.gate_id"), "left")
            .selectExpr("m.*", "co1.channel_one_id", "co1.channel_two_id", "co1.channel_thr_id"
                # supplementary fields merged from the source
                , "co1.cooperate_name as cooperation_name", "co1.cooperate_one_id as cooperation_one_id"
                , "co1.cooperate_manage_dept_one_id as cooperation_dept_one_id",
                "co1.cooperate_manage_dept_two_id as cooperation_dept_two_id",
                "co1.cooperate_manage_dept_thr_id as cooperation_dept_thr_id"
                , "co1.cooperation_type", "co1.coop_brand_type as brand_type", "co1.coop_brand_id as brand_id"
                , "e.real_name as perf_engineer_name", "cast(g.gate_type as int) as receive_entrance_type"
                , "cast(now() as bigint) as data_update_time"
            ))
        resultDF = df_extend.selectExpr("id", "plat_source", "plat", "serv_cluster_type", "engineer_sup_id", "perf_engineer_id", "perf_engineer_name",
            "duplicate", "test", "receive_entrance", "receive_entrance_type", "cooperation_id", "cooperation_name", "cooperation_one_id",
            "cooperation_dept_one_id", "cooperation_dept_two_id", "cooperation_dept_thr_id", "cooperation_type", "brand_type", "brand_id",
            "channel_one_id", "channel_two_id", "channel_thr_id", "exam_area", "data_update_time")
        (resultDF.write.format('starrocks').options(**tar_configs)
            .option('starrocks.write.properties.partial_update', 'true')
            .option('starrocks.columns', '''id,plat_source,plat,serv_cluster_type,engineer_sup_id,perf_engineer_id,perf_engineer_name,duplicate,test
,receive_entrance,receive_entrance_type,cooperation_id,cooperation_name,cooperation_one_id,cooperation_dept_one_id,cooperation_dept_two_id
,cooperation_dept_thr_id,cooperation_type,brand_type,brand_id,channel_one_id,channel_two_id,channel_thr_id,exam_area,data_update_time''')
            .mode('append').save())
Writing to StarRocks in real time through Structured Streaming triggers this warning frequently; the warning log is as follows:
24/05/10 23:26:08 WARN SparkSQLLineageParseHelper: Extract Statement[11677] columns lineage failed.
java.util.NoSuchElementException: None.get
at scala.None$.get(Option.scala:529) ~[scala-library-2.12.15.jar:?]
at scala.None$.get(Option.scala:527) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.getV2TableName(SparkSQLLineageParseHelper.scala:493) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.extractColumnsLineage(SparkSQLLineageParseHelper.scala:304) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.parse(SparkSQLLineageParseHelper.scala:54) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.parse$(SparkSQLLineageParseHelper.scala:52) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.SparkSQLLineageParseHelper.parse(SparkSQLLineageParseHelper.scala:510) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.SparkSQLLineageParseHelper.$anonfun$transformToLineage$1(SparkSQLLineageParseHelper.scala:516) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at scala.util.Try$.apply(Try.scala:213) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.SparkSQLLineageParseHelper.transformToLineage(SparkSQLLineageParseHelper.scala:516) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.SparkOperationLineageQueryExecutionListener.onSuccess(SparkOperationLineageQueryExecutionListener.scala:34) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.spark.sql.util.ExecutionListenerBus.doPostEvent(QueryExecutionListener.scala:165) ~[spark-sql_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.sql.util.ExecutionListenerBus.doPostEvent(QueryExecutionListener.scala:135) ~[spark-sql_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.util.ListenerBus.postToAll(ListenerBus.scala:117) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.util.ListenerBus.postToAll$(ListenerBus.scala:101) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.sql.util.ExecutionListenerBus.postToAll(QueryExecutionListener.scala:135) ~[spark-sql_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.sql.util.ExecutionListenerBus.onOtherEvent(QueryExecutionListener.scala:147) ~[spark-sql_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.SparkListenerBus.doPostEvent(SparkListenerBus.scala:100) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.SparkListenerBus.doPostEvent$(SparkListenerBus.scala:28) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.util.ListenerBus.postToAll(ListenerBus.scala:117) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.util.ListenerBus.postToAll$(ListenerBus.scala:101) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.AsyncEventQueue.super$postToAll(AsyncEventQueue.scala:105) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:105) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23) ~[scala-library-2.12.15.jar:?]
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) ~[scala-library-2.12.15.jar:?]
at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:100) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:96) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1446) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:96) ~[spark-core_2.12-3.3.1.jar:3.3.1]

@2018yinjian (Author)

This scenario also produces the warning. Could the warning be directed to a background log, or removed entirely?

df_work = spark.read.format("starrocks").options(**tar_configs).load().filter("region>=202305 and after_type in (10,20)")
df_work.createOrReplaceTempView("temp_work")
df = spark.sql("select * from temp_work limit 1")
df.show(1)
+------------------+------------------+------------------+----------------+----------+---------------+------------------+---------------------+-------------------+-------------+-------------+------------+-----------+-------------+-----------+-------------+-----------+------+------------+----------+-----------+----------+-------------------+----------------+--------------+---------------+---------------+-------------+-------------+-----------+--------------------+--------------------+--------------------+-------+------+-------------+-------------+----------+--------+------------+--------------------+----------------------+----------------+-------+------------+----------+-----------------+-------------------+------------------------+-----------------------------+-----------------------------+-----------------------------+-------------+---------------+-------------+-------------+---------------+---------------+-------------+-------+-------+----------+----------+-----------+----+-----------------+---------------+----------------+------------------+---------+----+----------------+---------------------+--------------+--------------+--------------+--------------+----------------+------------------+-----------------------+-----------------------+-----------------------+----------------+----------+--------+---------+------+---------------------+-------------+------------+------------+-----------+-------------------+-------------+-------------------+-----------+-----------------+-----------+-------+----------+---------+---------+------------+-----------+---------+----------+------------+------------+----------+----------------+------------+------------+-------------+--------------+-----------+-------------+----------------------+---------------+---------------+------------------+---------------------+----------------+
| id| work_id| order_id|work_create_time|enter_time|distribute_time|engineer_take_time|engineer_contact_time|engineer_visit_time|complete_time|checkout_time|account_time|cancel_time|receiver_type|receiver_id|canceler_type|canceler_id|region|create_month|create_day|create_hour|enter_date|engineer_visit_date|distribute_month|distribute_day|distribute_hour|distribute_date|complete_date|account_month|account_day|canceler_dept_one_id|canceler_dept_two_id|canceler_dept_thr_id|main_id|status|result_status|result_reason|after_type|other_id|other_status|order_cooperation_id|order_cooperation_name|appointment_time|user_id|scan_user_id|meter_flag|engineer_visit_id|engineer_visit_name|order_cooperation_one_id|order_cooperation_dept_one_id|order_cooperation_dept_two_id|order_cooperation_dept_thr_id|ec_product_id|serv_cluster_id|serv_group_id|serv_categ_id|ec_categ_one_id|ec_categ_two_id|sp_company_id|sp_type|grid_id|org_thr_id|org_two_id|plat_source|plat|serv_cluster_type|engineer_sup_id|perf_engineer_id|perf_engineer_name|duplicate|test|receive_entrance|receive_entrance_type|channel_one_id|channel_two_id|channel_thr_id|cooperation_id|cooperation_name|cooperation_one_id|cooperation_dept_one_id|cooperation_dept_two_id|cooperation_dept_thr_id|cooperation_type|brand_type|brand_id|exam_area|tag_id|total_hierarchy_level|refund_cancel|refund_scene|unpay_amount|meter_vaild|meter_success_vaild|prepare_price|valuation_unconfirm|project_tag|star_level_amount|province_id|city_id|city_level|county_id|street_id| longitude| latitude|user_type|user_level|categ_one_id|categ_two_id|product_id|product_brand_id|product_type|order_amount|income_amount|receipt_amount|reduce_cost|prepay_amount|channel_receive_amount|progress_amount|engineer_amount|new_machine_amount|machine_complete_time|data_update_time|
+------------------+------------------+------------------+----------------+----------+---------------+------------------+---------------------+-------------------+-------------+-------------+------------+-----------+-------------+-----------+-------------+-----------+------+------------+----------+-----------+----------+-------------------+----------------+--------------+---------------+---------------+-------------+-------------+-----------+--------------------+--------------------+--------------------+-------+------+-------------+-------------+----------+--------+------------+--------------------+----------------------+----------------+-------+------------+----------+-----------------+-------------------+------------------------+-----------------------------+-----------------------------+-----------------------------+-------------+---------------+-------------+-------------+---------------+---------------+-------------+-------+-------+----------+----------+-----------+----+-----------------+---------------+----------------+------------------+---------+----+----------------+---------------------+--------------+--------------+--------------+--------------+----------------+------------------+-----------------------+-----------------------+-----------------------+----------------+----------+--------+---------+------+---------------------+-------------+------------+------------+-----------+-------------------+-------------+-------------------+-----------+-----------------+-----------+-------+----------+---------+---------+------------+-----------+---------+----------+------------+------------+----------+----------------+------------+------------+-------------+--------------+-----------+-------------+----------------------+---------------+---------------+------------------+---------------------+----------------+

24/05/11 11:57:31 WARN SparkSQLLineageParseHelper: Extract Statement[68] columns lineage failed.
java.util.NoSuchElementException: next on empty iterator
at scala.collection.Iterator$$anon$2.next(Iterator.scala:41) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator$$anon$2.next(Iterator.scala:39) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.head(IterableLike.scala:109) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.head$(IterableLike.scala:108) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterable.head(Iterable.scala:56) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.$anonfun$mergeRelationColumnLineage$1(SparkSQLLineageParseHelper.scala:180) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at scala.collection.LinearSeqOptimized.foldLeft(LinearSeqOptimized.scala:126) ~[scala-library-2.12.15.jar:?]
at scala.collection.LinearSeqOptimized.foldLeft$(LinearSeqOptimized.scala:122) ~[scala-library-2.12.15.jar:?]
at scala.collection.immutable.List.foldLeft(List.scala:91) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.mergeRelationColumnLineage(SparkSQLLineageParseHelper.scala:178) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.extractColumnsLineage(SparkSQLLineageParseHelper.scala:457) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.$anonfun$extractColumnsLineage$54(SparkSQLLineageParseHelper.scala:478) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach$(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1431) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach(IterableLike.scala:74) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach$(IterableLike.scala:73) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterable.foreach(Iterable.scala:56) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map$(TraversableLike.scala:279) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractTraversable.map(Traversable.scala:108) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.extractColumnsLineage(SparkSQLLineageParseHelper.scala:478) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.$anonfun$extractColumnsLineage$22(SparkSQLLineageParseHelper.scala:341) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach$(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1431) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach(IterableLike.scala:74) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach$(IterableLike.scala:73) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterable.foreach(Iterable.scala:56) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map$(TraversableLike.scala:279) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractTraversable.map(Traversable.scala:108) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.extractColumnsLineage(SparkSQLLineageParseHelper.scala:341) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.$anonfun$extractColumnsLineage$54(SparkSQLLineageParseHelper.scala:478) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach$(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1431) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach(IterableLike.scala:74) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach$(IterableLike.scala:73) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterable.foreach(Iterable.scala:56) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map$(TraversableLike.scala:279) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractTraversable.map(Traversable.scala:108) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.extractColumnsLineage(SparkSQLLineageParseHelper.scala:478) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.$anonfun$extractColumnsLineage$54(SparkSQLLineageParseHelper.scala:478) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach$(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1431) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach(IterableLike.scala:74) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach$(IterableLike.scala:73) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterable.foreach(Iterable.scala:56) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map$(TraversableLike.scala:279) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractTraversable.map(Traversable.scala:108) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.extractColumnsLineage(SparkSQLLineageParseHelper.scala:478) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.$anonfun$extractColumnsLineage$22(SparkSQLLineageParseHelper.scala:341) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach$(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1431) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach(IterableLike.scala:74) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach$(IterableLike.scala:73) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterable.foreach(Iterable.scala:56) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map$(TraversableLike.scala:279) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractTraversable.map(Traversable.scala:108) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.extractColumnsLineage(SparkSQLLineageParseHelper.scala:341) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.$anonfun$extractColumnsLineage$54(SparkSQLLineageParseHelper.scala:478) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach$(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1431) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach(IterableLike.scala:74) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach$(IterableLike.scala:73) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterable.foreach(Iterable.scala:56) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map$(TraversableLike.scala:279) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractTraversable.map(Traversable.scala:108) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.extractColumnsLineage(SparkSQLLineageParseHelper.scala:478) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.$anonfun$extractColumnsLineage$54(SparkSQLLineageParseHelper.scala:478) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach$(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1431) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach(IterableLike.scala:74) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach$(IterableLike.scala:73) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterable.foreach(Iterable.scala:56) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map$(TraversableLike.scala:279) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractTraversable.map(Traversable.scala:108) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.extractColumnsLineage(SparkSQLLineageParseHelper.scala:478) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.parse(SparkSQLLineageParseHelper.scala:54) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.parse$(SparkSQLLineageParseHelper.scala:52) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.SparkSQLLineageParseHelper.parse(SparkSQLLineageParseHelper.scala:510) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.SparkSQLLineageParseHelper.$anonfun$transformToLineage$1(SparkSQLLineageParseHelper.scala:516) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at scala.util.Try$.apply(Try.scala:213) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.SparkSQLLineageParseHelper.transformToLineage(SparkSQLLineageParseHelper.scala:516) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.SparkOperationLineageQueryExecutionListener.onSuccess(SparkOperationLineageQueryExecutionListener.scala:34) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.spark.sql.util.ExecutionListenerBus.doPostEvent(QueryExecutionListener.scala:165) ~[spark-sql_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.sql.util.ExecutionListenerBus.doPostEvent(QueryExecutionListener.scala:135) ~[spark-sql_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.util.ListenerBus.postToAll(ListenerBus.scala:117) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.util.ListenerBus.postToAll$(ListenerBus.scala:101) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.sql.util.ExecutionListenerBus.postToAll(QueryExecutionListener.scala:135) ~[spark-sql_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.sql.util.ExecutionListenerBus.onOtherEvent(QueryExecutionListener.scala:147) ~[spark-sql_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.SparkListenerBus.doPostEvent(SparkListenerBus.scala:100) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.SparkListenerBus.doPostEvent$(SparkListenerBus.scala:28) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.util.ListenerBus.postToAll(ListenerBus.scala:117) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.util.ListenerBus.postToAll$(ListenerBus.scala:101) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.AsyncEventQueue.super$postToAll(AsyncEventQueue.scala:105) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:105) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23) ~[scala-library-2.12.15.jar:?]
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) ~[scala-library-2.12.15.jar:?]
at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:100) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:96) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1446) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:96) ~[spark-core_2.12-3.3.1.jar:3.3.1]

@2018yinjian (Author)

@wForget The warning now appears whether the job is run from the command line or submitted in the background with spark-submit. Real-time jobs in particular flood the logs with it. Is there any way to deal with this warning?
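One stopgap, independent of the plugin fix, is to silence this specific logger. Below is a minimal sketch assuming Spark 3.3's default log4j2 setup (conf/log4j2.properties); the logger id "kyuubi_lineage" is arbitrary and the exact file location depends on the deployment.

# Hypothetical log4j2.properties fragment: only lower the lineage parse helper's warnings.
logger.kyuubi_lineage.name = org.apache.kyuubi.plugin.lineage.helper.SparkSQLLineageParseHelper
logger.kyuubi_lineage.level = error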

@wForget (Member) commented May 11, 2024

at org.apache.kyuubi.plugin.lineage.helper.LineageParser.getV2TableName(SparkSQLLineageParseHelper.scala:493) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]

From the error stack, it seems that relation.identifier is None. Would you mind sending a PR to avoid this error?

case relation: DataSourceV2Relation =>
  val catalog = relation.catalog.map(_.name()).getOrElse(LineageConf.DEFAULT_CATALOG)
  val database = relation.identifier.get.namespace().mkString(".")
  val table = relation.identifier.get.name()
  s"$catalog.$database.$table"

@wForget (Member) commented May 11, 2024

If it is difficult to determine which datasource table caused the exception, we can simply handle it like this:

      case relation: DataSourceV2Relation if relation.identifier.isDefined =>
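For illustration, a minimal sketch of how that guard could slot into getV2TableName, with the existing fallthrough case picking up relations that have no identifier (the reporter's own rewrite appears later in the thread):

// Sketch only, based on the snippets quoted in this thread; not the committed fix.
case relation: DataSourceV2Relation if relation.identifier.isDefined =>
  val catalog = relation.catalog.map(_.name()).getOrElse(LineageConf.DEFAULT_CATALOG)
  val id = relation.identifier.get
  s"$catalog.${id.namespace().mkString(".")}.${id.name()}"
case _ =>
  plan.name // relations without an identifier fall back to the plan's own name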

@2018yinjian (Author)

scala> val workDF = spark.read.format("starrocks").option("starrocks.table.identifier", "ads.ads_real_work_sr").option("starrocks.fe.http.url", urlHttp).option("starrocks.fe.jdbc.url", urlJdbc).option("starrocks.user", userName).option("starrocks.password", password).load()
workDF: org.apache.spark.sql.DataFrame = [id: bigint, work_id: bigint ... 120 more fields]

scala> workDF.createOrReplaceTempView("temp_work_240511")
24/05/11 18:58:53 WARN package: Truncated the string representation of a plan since it was too large. This behavior can be adjusted by setting 'spark.sql.debug.maxToStringFields'.

scala> val execSql = "select a.source_id,b.main_id from ods.ods_biz_aftersale_refund_base a left join (select id,main_id from temp_work_240511 where nvl(region,0) = 202405) b on a.source_id = b.id where b.main_id is not null"
execSql: String = select a.source_id,b.main_id from ods.ods_biz_aftersale_refund_base a left join (select id,main_id from temp_work_240511 where nvl(region,0) = 202405) b on a.source_id = b.id where b.main_id is not null

scala> val resDF = spark.sql(execSql)
24/05/11 18:59:02 INFO HiveConf: Found configuration file file:/etc/taihao-apps/spark-conf/hive-site.xml
24/05/11 18:59:03 WARN HiveConf: HiveConf of name hive.metastore.type does not exist
24/05/11 18:59:03 INFO metastore: Trying to connect to metastore with URI thrift://master-1-2.c-bde909f92a7de709.cn-beijing.emr.aliyuncs.com:9083
24/05/11 18:59:04 INFO metastore: Opened a connection to metastore, current connections: 1
24/05/11 18:59:04 INFO metastore: Connected to metastore.
24/05/11 18:59:04 INFO Hive: instanced a metaStoreClient with type: com.sun.proxy.$Proxy48
24/05/11 18:59:04 INFO Hive: Registering function aesdecrypt com.bigdata.udf.AESDecrypt
24/05/11 18:59:04 INFO Hive: Registering function aesencrypt com.bigdata.udf.AESEncrypt
24/05/11 18:59:05 WARN HoodieBackedTableMetadata: Metadata table was not found at path hdfs://hdfs-cluster/user/hive/warehouse/ods.db/ods_biz_aftersale_refund_base/.hoodie/metadata
24/05/11 18:59:11 WARN HoodieBackedTableMetadata: Metadata table was not found at path hdfs://hdfs-cluster/user/hive/warehouse/ods.db/ods_biz_aftersale_refund_base/.hoodie/metadata
resDF: org.apache.spark.sql.DataFrame = [source_id: bigint, main_id: bigint]

scala> resDF.show()
+-------------------+-------------------+
| source_id| main_id|
+-------------------+-------------------+
|6111660950322814848|1111660950150566272|
|6111661570997755777|1111661568738859392|
|6111662540116074368|1111662539939650176|
|6111662818451398272|1111662818197361024|
|6111665385705185152|1111665383376034433|
|6111666565817442177|1111666565634183552|
|6111666750468005505|1111666750119090561|
|6111667693871570816|1111667688501287553|
|6111667725892984704|1111667725723881857|
|6111667793142881920|1111667792968536448|
|6111667831230832256|1111667830479788672|
|6111668106227752832|1111668104138201472|
|6111668122150116993|1111668121942497921|
|6111668499527901568|1111663413280774784|
|6111668590817189504|1111668588992909696|
|6111668610538282624|1111668610321226113|
|6111668736570564481|1111668736389665152|
|6111668776803639169|1111668774515644801|
|6111668968737873793|1111668963609287041|
|6111668997785263745|1111668997594421633|
+-------------------+-------------------+
only showing top 20 rows

scala> 24/05/11 18:59:37 WARN SparkSQLLineageParseHelper: Extract Statement[8] columns lineage failed.
java.util.NoSuchElementException: next on empty iterator
at scala.collection.Iterator$$anon$2.next(Iterator.scala:41) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator$$anon$2.next(Iterator.scala:39) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.head(IterableLike.scala:109) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.head$(IterableLike.scala:108) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterable.head(Iterable.scala:56) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.$anonfun$mergeRelationColumnLineage$1(SparkSQLLineageParseHelper.scala:180) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at scala.collection.LinearSeqOptimized.foldLeft(LinearSeqOptimized.scala:126) ~[scala-library-2.12.15.jar:?]
at scala.collection.LinearSeqOptimized.foldLeft$(LinearSeqOptimized.scala:122) ~[scala-library-2.12.15.jar:?]
at scala.collection.immutable.List.foldLeft(List.scala:91) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.mergeRelationColumnLineage(SparkSQLLineageParseHelper.scala:178) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.extractColumnsLineage(SparkSQLLineageParseHelper.scala:457) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.$anonfun$extractColumnsLineage$54(SparkSQLLineageParseHelper.scala:478) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach$(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1431) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach(IterableLike.scala:74) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach$(IterableLike.scala:73) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterable.foreach(Iterable.scala:56) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map$(TraversableLike.scala:279) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractTraversable.map(Traversable.scala:108) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.extractColumnsLineage(SparkSQLLineageParseHelper.scala:478) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.$anonfun$extractColumnsLineage$54(SparkSQLLineageParseHelper.scala:478) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach$(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1431) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach(IterableLike.scala:74) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach$(IterableLike.scala:73) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterable.foreach(Iterable.scala:56) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map$(TraversableLike.scala:279) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractTraversable.map(Traversable.scala:108) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.extractColumnsLineage(SparkSQLLineageParseHelper.scala:478) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.$anonfun$extractColumnsLineage$22(SparkSQLLineageParseHelper.scala:341) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach$(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1431) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach(IterableLike.scala:74) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach$(IterableLike.scala:73) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterable.foreach(Iterable.scala:56) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map$(TraversableLike.scala:279) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractTraversable.map(Traversable.scala:108) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.extractColumnsLineage(SparkSQLLineageParseHelper.scala:341) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.$anonfun$extractColumnsLineage$54(SparkSQLLineageParseHelper.scala:478) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach$(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1431) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach(IterableLike.scala:74) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach$(IterableLike.scala:73) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterable.foreach(Iterable.scala:56) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map$(TraversableLike.scala:279) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractTraversable.map(Traversable.scala:108) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.extractColumnsLineage(SparkSQLLineageParseHelper.scala:478) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.$anonfun$extractColumnsLineage$44(SparkSQLLineageParseHelper.scala:392) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach$(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1431) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach(IterableLike.scala:74) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach$(IterableLike.scala:73) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterable.foreach(Iterable.scala:56) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map$(TraversableLike.scala:279) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractTraversable.map(Traversable.scala:108) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.extractColumnsLineage(SparkSQLLineageParseHelper.scala:392) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.$anonfun$extractColumnsLineage$54(SparkSQLLineageParseHelper.scala:478) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach$(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1431) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach(IterableLike.scala:74) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach$(IterableLike.scala:73) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterable.foreach(Iterable.scala:56) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map$(TraversableLike.scala:279) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractTraversable.map(Traversable.scala:108) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.extractColumnsLineage(SparkSQLLineageParseHelper.scala:478) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.$anonfun$extractColumnsLineage$22(SparkSQLLineageParseHelper.scala:341) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach$(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1431) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach(IterableLike.scala:74) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach$(IterableLike.scala:73) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterable.foreach(Iterable.scala:56) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map$(TraversableLike.scala:279) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractTraversable.map(Traversable.scala:108) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.extractColumnsLineage(SparkSQLLineageParseHelper.scala:341) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.$anonfun$extractColumnsLineage$22(SparkSQLLineageParseHelper.scala:341) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach$(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1431) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach(IterableLike.scala:74) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach$(IterableLike.scala:73) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterable.foreach(Iterable.scala:56) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map$(TraversableLike.scala:279) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractTraversable.map(Traversable.scala:108) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.extractColumnsLineage(SparkSQLLineageParseHelper.scala:341) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.$anonfun$extractColumnsLineage$54(SparkSQLLineageParseHelper.scala:478) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach$(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1431) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach(IterableLike.scala:74) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach$(IterableLike.scala:73) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterable.foreach(Iterable.scala:56) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map$(TraversableLike.scala:279) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractTraversable.map(Traversable.scala:108) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.extractColumnsLineage(SparkSQLLineageParseHelper.scala:478) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.$anonfun$extractColumnsLineage$54(SparkSQLLineageParseHelper.scala:478) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.Iterator.foreach$(Iterator.scala:943) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterator.foreach(Iterator.scala:1431) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach(IterableLike.scala:74) ~[scala-library-2.12.15.jar:?]
at scala.collection.IterableLike.foreach$(IterableLike.scala:73) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractIterable.foreach(Iterable.scala:56) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map(TraversableLike.scala:286) ~[scala-library-2.12.15.jar:?]
at scala.collection.TraversableLike.map$(TraversableLike.scala:279) ~[scala-library-2.12.15.jar:?]
at scala.collection.AbstractTraversable.map(Traversable.scala:108) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.extractColumnsLineage(SparkSQLLineageParseHelper.scala:478) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.parse(SparkSQLLineageParseHelper.scala:54) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.LineageParser.parse$(SparkSQLLineageParseHelper.scala:52) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.SparkSQLLineageParseHelper.parse(SparkSQLLineageParseHelper.scala:510) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.SparkSQLLineageParseHelper.$anonfun$transformToLineage$1(SparkSQLLineageParseHelper.scala:516) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at scala.util.Try$.apply(Try.scala:213) ~[scala-library-2.12.15.jar:?]
at org.apache.kyuubi.plugin.lineage.helper.SparkSQLLineageParseHelper.transformToLineage(SparkSQLLineageParseHelper.scala:516) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.kyuubi.plugin.lineage.SparkOperationLineageQueryExecutionListener.onSuccess(SparkOperationLineageQueryExecutionListener.scala:34) ~[kyuubi-spark-lineage_2.12-1.8.1-jar-with-dependencies.jar:?]
at org.apache.spark.sql.util.ExecutionListenerBus.doPostEvent(QueryExecutionListener.scala:165) ~[spark-sql_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.sql.util.ExecutionListenerBus.doPostEvent(QueryExecutionListener.scala:135) ~[spark-sql_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.util.ListenerBus.postToAll(ListenerBus.scala:117) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.util.ListenerBus.postToAll$(ListenerBus.scala:101) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.sql.util.ExecutionListenerBus.postToAll(QueryExecutionListener.scala:135) ~[spark-sql_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.sql.util.ExecutionListenerBus.onOtherEvent(QueryExecutionListener.scala:147) ~[spark-sql_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.SparkListenerBus.doPostEvent(SparkListenerBus.scala:100) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.SparkListenerBus.doPostEvent$(SparkListenerBus.scala:28) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.util.ListenerBus.postToAll(ListenerBus.scala:117) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.util.ListenerBus.postToAll$(ListenerBus.scala:101) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.AsyncEventQueue.super$postToAll(AsyncEventQueue.scala:105) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:105) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23) ~[scala-library-2.12.15.jar:?]
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) ~[scala-library-2.12.15.jar:?]
at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:100) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:96) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1446) ~[spark-core_2.12-3.3.1.jar:3.3.1]
at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:96) ~[spark-core_2.12-3.3.1.jar:3.3.1]
How can the warning be avoided in this scenario?
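For context, this second variant ("next on empty iterator") is raised by a .head call inside mergeRelationColumnLineage (SparkSQLLineageParseHelper.scala:180 in the trace), not by getV2TableName. A minimal, hypothetical illustration of that failure mode and a guarded alternative (the plugin's actual method body is not shown in this thread):

// Illustration only: .head on an empty iterator-backed collection throws
// java.util.NoSuchElementException: next on empty iterator (Scala 2.12).
val refs: Set[String] = Set.empty
// refs.head                                        // would throw "next on empty iterator"
val safe = refs.headOption.getOrElse("<unknown>")   // guarded alternative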

@2018yinjian (Author)

private def getV2TableName(plan: NamedRelation): String = {
  plan match {
    case relation: DataSourceV2ScanRelation =>
      val catalog = relation.relation.catalog.map(_.name()).getOrElse(LineageConf.DEFAULT_CATALOG)
      val database = relation.relation.identifier.get.namespace().mkString(".")
      val table = relation.relation.identifier.get.name()
      s"$catalog.$database.$table"
    case relation: DataSourceV2Relation if relation.identifier.isDefined =>
      val catalog = relation.catalog.map(_.name()).getOrElse(LineageConf.DEFAULT_CATALOG)
      val database = relation.identifier.get.namespace().mkString(".")
      val table = relation.identifier.get.name()
      s"$catalog.$database.$table"
    case _ =>
      plan.name
  }
}

The problem still exists after this change.

@wForget (Member) commented May 14, 2024

The problem still exists after this change.

Can you provide error details?

@2018yinjian (Author)

I have modified the code to handle the exception and avoid the warning.

@wForget (Member) commented May 16, 2024

I have modified the code to handle the exception and avoid the warning.

Can you send a PR for this fix?

pan3793 changed the title from "java.util.NoSuchElementException: None.get【kyuubi-spark-lineage】" to "lineage plugin throws java.util.NoSuchElementException: None.get" on May 24, 2024