
kafka + Maxwell + bireme synchronization issue #116

Open
stephenjwq opened this issue Nov 8, 2018 · 0 comments

Comments

stephenjwq commented Nov 8, 2018

Hello all,
I have set up the following environment: Kafka + Maxwell + bireme.
Versions:
kafka: 2.11-2.0.0
maxwell: 1.19.0
bireme: 2.0.0-alpha-1
java: jdk-8u191-linux-x64
When I start the bireme process, it reports the following error:

root>more bireme.out
10:09:38 ERROR Maxwell-maxwell1-test-1 - Stack Trace:
cn.hashdata.bireme.BiremeException: Transform failed.

at cn.hashdata.bireme.Dispatcher.dispatch(Dispatcher.java:60) ~[bireme-2.0.0-alpha-1.jar:?]
at cn.hashdata.bireme.pipeline.PipeLine.startDispatch(PipeLine.java:150) [bireme-2.0.0-alpha-1.jar:?]
at cn.hashdata.bireme.pipeline.PipeLine.executePipeline(PipeLine.java:106) [bireme-2.0.0-alpha-1.jar:?]
at cn.hashdata.bireme.pipeline.PipeLine.call(PipeLine.java:87) [bireme-2.0.0-alpha-1.jar:?]
at cn.hashdata.bireme.pipeline.PipeLine.call(PipeLine.java:39) [bireme-2.0.0-alpha-1.jar:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_191]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_191]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_191]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_191]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_191]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_191]

Caused by: java.lang.ClassCastException: com.google.gson.JsonNull cannot be cast to com.google.gson.JsonObject
at cn.hashdata.bireme.pipeline.MaxwellPipeLine$MaxwellTransformer$MaxwellRecord.<init>(MaxwellPipeLine.java:129) ~[bireme-2.0.0-alpha-1.jar:?]
at cn.hashdata.bireme.pipeline.MaxwellPipeLine$MaxwellTransformer.transform(MaxwellPipeLine.java:89) ~[bireme-2.0.0-alpha-1.jar:?]
at cn.hashdata.bireme.pipeline.KafkaPipeLine$KafkaTransformer.fillRowSet(KafkaPipeLine.java:113) ~[bireme-2.0.0-alpha-1.jar:?]
at cn.hashdata.bireme.pipeline.PipeLine$Transformer.call(PipeLine.java:273) ~[bireme-2.0.0-alpha-1.jar:?]
at cn.hashdata.bireme.pipeline.PipeLine$Transformer.call(PipeLine.java:248) ~[bireme-2.0.0-alpha-1.jar:?]
... 4 more
10:09:38 INFO Scheduler - All pipeline stop.
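The root cause in the trace is that some Maxwell message carries `"data": null`, which Gson models as `JsonNull`, and bireme's `MaxwellRecord` constructor casts it straight to `JsonObject`. A minimal sketch of a defensive check, using plain JDK types as a stand-in for the parsed JSON (the `hasUsableData` helper and the event shapes are hypothetical, not bireme's actual API):

```java
import java.util.HashMap;
import java.util.Map;

public class MaxwellDataCheck {
    // Hypothetical guard: returns true only when the message carries a
    // usable "data" payload. Control messages (e.g. DDL or bootstrap
    // events, depending on Maxwell's configuration) may set "data" to
    // null, which is what Gson represents as JsonNull and what triggers
    // the ClassCastException when cast to JsonObject.
    static boolean hasUsableData(Map<String, Object> message) {
        Object data = message.get("data");
        return data instanceof Map;
    }

    public static void main(String[] args) {
        Map<String, Object> insertRow = new HashMap<>();
        insertRow.put("type", "insert");
        insertRow.put("data", Map.of("id", 1));

        Map<String, Object> controlEvent = new HashMap<>();
        controlEvent.put("type", "table-create");
        controlEvent.put("data", null); // what Gson parses as JsonNull

        System.out.println(hasUsableData(insertRow));    // true
        System.out.println(hasUsableData(controlEvent)); // false
    }
}
```

In Gson terms, the equivalent check before the cast would be `element.isJsonNull()` (or `element.isJsonObject()`); skipping such records instead of casting would keep the pipeline alive.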

Maxwell itself starts OK:
[root@HQ-T-GDMYSQL1 maxwell-1.19.0]# bin/maxwell --config config.properties
Using kafka version: 1.0.0
10:04:22,893 WARN MaxwellMetrics - Metrics will not be exposed: metricsReportingType not configured.
10:04:23,309 INFO ProducerConfig - ProducerConfig values:
acks = 1
………………
transaction.timeout.ms = 60000
transactional.id = null
value.serializer = class org.apache.kafka.common.serialization.StringSerializer

10:04:23,372 INFO AppInfoParser - Kafka version : 1.0.0
10:04:23,372 INFO AppInfoParser - Kafka commitId : aaa7af6d4a11b29d
10:04:23,392 INFO Maxwell - Maxwell v1.19.0 is booting (MaxwellKafkaProducer), starting at Position[BinlogPosition[mysql-bin.000010:1018737021], lastHeartbeat=1541587793607]
10:04:23,542 INFO MysqlSavedSchema - Restoring schema id 4 (last modified at Position[BinlogPosition[mysql-bin.000010:849998013], lastHeartbeat=1541566377636])
10:04:23,659 INFO MysqlSavedSchema - Restoring schema id 1 (last modified at Position[BinlogPosition[mysql-bin.000010:728643351], lastHeartbeat=0])
10:04:23,877 INFO MysqlSavedSchema - beginning to play deltas...
10:04:23,890 INFO MysqlSavedSchema - played 3 deltas in 13ms
10:04:23,910 INFO BinlogConnectorReplicator - Setting initial binlog pos to: mysql-bin.000010:1018737021
10:04:23,970 INFO BinaryLogClient - Connected to localhost:3306 at mysql-bin.000010/1018737021 (sid:6379, cid:26140)
10:04:23,970 INFO BinlogConnectorLifecycleListener - Binlog connected.

The Kafka consumer can read data from the binlog, like this:
[screenshot: Kafka console consumer output showing binlog rows]
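To confirm whether the crash is caused by null-`data` records, the topic can be inspected and filtered offline. A sketch, assuming a local broker and the default `maxwell` topic name (the sample JSON lines below are illustrative, not captured from this setup):

```shell
# Live inspection (assumes a broker on localhost:9092 and topic "maxwell"):
#   bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
#       --topic maxwell --from-beginning
#
# Offline check: count records whose "data" field is JSON null, which
# is what trips bireme's cast from JsonNull to JsonObject.
printf '%s\n' \
  '{"type":"insert","data":{"id":1}}' \
  '{"type":"table-create","data":null}' \
| grep -c '"data":null'
```

A non-zero count here would point at control or DDL messages on the topic rather than at the row data itself.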

So if I run this in a new environment and insert only data without null values, will it work?
