
[Feature][Flink] DataSource sql generated supports cdc and jdbc task different type conversion rules #3346

Open
2 of 3 tasks
xiaofan2022 opened this issue Apr 1, 2024 · 16 comments
Labels: Discussing (the problem is being discussed), New Feature

Comments

@xiaofan2022
Contributor

Search before asking

  • I had searched in the issues and found no similar feature requirement.

Description

The SQL generated from a data source should support different type conversion rules for CDC and JDBC tasks. See: jdbc type mapping · cdc type mapping.

Use case

SQL generated from a data source supports different type conversion rules for CDC and JDBC tasks.

Related issues

No response

Are you willing to submit a PR?

  • Yes I am willing to submit a PR!

Code of Conduct

xiaofan2022 added the New Feature and Waiting for reply labels on Apr 1, 2024

github-actions bot commented Apr 1, 2024

Hello @xiaofan2022, this issue is about CDC/CDCSOURCE, so I assign it to @aiwenmo. If you have any questions, you can comment and reply.


@xiaofan2022
Contributor Author

Support SQL field type mapping for major connectors such as flink-cdc, flink-jdbc, flink-hive, and flink-hudi.

@Zzm0809
Contributor

Zzm0809 commented Apr 1, 2024

Please explain the purpose and the design plan.

@Zzm0809 Zzm0809 added Discussing The problem is being discussed and removed Waiting for reply Waiting for reply labels Apr 1, 2024
@xiaofan2022
Contributor Author

Purpose: the SQL generation function automatically generates synchronization SQL (with type mapping) based on the connector.
Plan: first support field type mapping for SQL generated for the JDBC connector type.
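As a rough illustration of that first step, the JDBC mapping could start as a plain lookup table from source column types to Flink SQL types. This is a hypothetical sketch: the class name and the mappings are assumptions loosely following the Flink JDBC connector documentation, not Dinky's actual code.

```java
import java.util.Map;

// Hypothetical sketch: lookup from MySQL column types to Flink SQL types.
// Mappings follow the Flink JDBC connector docs; names are illustrative.
public class JdbcTypeMapper {
    private static final Map<String, String> MYSQL_TO_FLINK = Map.of(
            "TINYINT", "TINYINT",
            "INT", "INT",
            "BIGINT", "BIGINT",
            "FLOAT", "FLOAT",
            "DOUBLE", "DOUBLE",
            "DECIMAL", "DECIMAL",
            "VARCHAR", "STRING",
            "TEXT", "STRING",
            "DATETIME", "TIMESTAMP");

    public static String toFlinkType(String jdbcType) {
        // Fall back to STRING for types without an explicit rule.
        return MYSQL_TO_FLINK.getOrDefault(jdbcType.toUpperCase(), "STRING");
    }
}
```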

@Zzm0809
Contributor

Zzm0809 commented Apr 15, 2024

Purpose: the SQL generation function automatically generates synchronization SQL (with type mapping) based on the connector. Plan: first support field type mapping for SQL generated for the JDBC connector type.

Can you implement this function?

@xiaofan2022
Contributor Author

I can do the backend. Who will handle the frontend?

@aiwenmo
Contributor

aiwenmo commented Apr 18, 2024

I can do the backend. Who will handle the frontend?

Thank you very much for your participation. Please clarify the front-end requirements, and we will be responsible for their implementation.

@xiaofan2022
Contributor Author

The front end adds different connector types (such as CDC, JDBC, Hive, Hudi) when generating SQL statements, and the back end maps the corresponding fields according to the official documentation based on the connector type.

@Zzm0809
Contributor

Zzm0809 commented Apr 21, 2024

The front end adds different connector types (such as CDC, JDBC, Hive, Hudi) when generating SQL statements, and the back end maps the corresponding fields according to the official documentation based on the connector type.

Can you provide an example of the backend's map data structure? I will design the frontend.
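One possible shape for that backend map structure: a nested map from connector type to a per-connector table of source-type-to-Flink-type rules. This is purely illustrative for discussion; the keys and mappings are assumptions, not Dinky code.

```java
import java.util.Map;

// Illustrative sketch: connector type -> (source column type -> Flink SQL type).
// Values are example rules only, not a complete or authoritative mapping.
public class ConnectorTypeMappings {
    public static final Map<String, Map<String, String>> MAPPINGS = Map.of(
            "jdbc", Map.of("DATETIME", "TIMESTAMP", "TEXT", "STRING"),
            "cdc", Map.of("DATETIME", "TIMESTAMP(3)", "TEXT", "STRING"),
            "hive", Map.of("DATETIME", "TIMESTAMP", "TEXT", "STRING"));

    public static String map(String connector, String sourceType) {
        // Unknown connectors or types fall back to STRING.
        return MAPPINGS.getOrDefault(connector, Map.of())
                .getOrDefault(sourceType, "STRING");
    }
}
```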

@Zzm0809
Contributor

Zzm0809 commented Apr 22, 2024

The front end adds different connector types (such as CDC, JDBC, Hive, Hudi) when generating SQL statements, and the back end maps the corresponding fields according to the official documentation based on the connector type.

My understanding is that it could be designed as follows:

  1. The front end adds a drop-down box in Generate SQL -> FlinkSQL tab (with values such as CDC, JDBC, Hive, Hudi).
  2. When the user generates SQL, the selected connector type is passed as an additional interface input parameter; the interface generates the result based on the connector type and returns it.

How does that sound?
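The extra input parameter in step 2 could be sketched as a small request DTO carrying the connector type alongside the existing parameters. All field names here are hypothetical, not the actual Dinky API.

```java
// Hypothetical request shape for the "generate SQL" endpoint once the
// connector-type dropdown exists; field names are assumptions.
public record GenerateSqlRequest(
        Integer dataSourceId,  // existing: which registered data source
        String schemaName,     // existing: schema to read
        String tableName,      // existing: table to read
        String connectorType   // new: "cdc", "jdbc", "hive", or "hudi"
) {}
```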

@xiaofan2022
Contributor Author

yes

@xiaofan2022
Contributor Author

If we want to add flink-cdc type mapping, should we add a new flink-cdc-meta module and put the corresponding logic into flink-cdc-meta?

@Zzm0809
Contributor

Zzm0809 commented Apr 23, 2024

@aiwenmo What do you think?

@aiwenmo
Contributor

aiwenmo commented Apr 25, 2024

Put the corresponding logic into dinky-cdc.

@Zzm0809
Contributor

Zzm0809 commented Apr 26, 2024

Put the corresponding logic into dinky-cdc.

It would be inappropriate to put everything in dinky-cdc. In theory, the column type information in the data source module also needs to use this type conversion to achieve a unified effect.

@xiaofan2022
Contributor Author

Is it possible to maintain separate conversion logic for each source (e.g. flink-cdc, jdbc, hive, hudi), with each connector owning its corresponding conversion rules?
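One way to keep each connector's rules separate is a small strategy interface with one implementation per connector, selected by the connector type chosen on the frontend. This is a minimal sketch; all names and mappings are hypothetical.

```java
// Strategy interface: one implementation per connector keeps rules isolated.
interface TypeConvert {
    String convert(String sourceType);
}

// JDBC rules (illustrative mappings only).
class JdbcTypeConvert implements TypeConvert {
    @Override
    public String convert(String sourceType) {
        return "DATETIME".equals(sourceType) ? "TIMESTAMP" : "STRING";
    }
}

// CDC rules: change streams typically need millisecond precision.
class CdcTypeConvert implements TypeConvert {
    @Override
    public String convert(String sourceType) {
        return "DATETIME".equals(sourceType) ? "TIMESTAMP(3)" : "STRING";
    }
}

// Factory that picks a strategy from the connector type string.
class TypeConvertFactory {
    static TypeConvert of(String connector) {
        return switch (connector) {
            case "cdc" -> new CdcTypeConvert();
            default -> new JdbcTypeConvert();
        };
    }
}
```

A datasource module and dinky-cdc could then share these strategies instead of each hard-coding its own rules.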

Development

No branches or pull requests

3 participants