# Description of the problem
A Flink job reads data from Kafka and writes it to MySQL. After running for a while, the job throws the error below and restarts automatically. The problem: after the automatic restart, the job still consumes from Kafka normally, but it no longer writes any data to the database.
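If the sink is hand-written (e.g. a custom `RichSinkFunction` rather than Flink's built-in `JdbcSink`), one common cause of exactly this symptom — consumption continues but nothing reaches MySQL — is a catch block that logs a write failure and drops the record, so the task looks healthy while every write fails. A minimal illustration of the two behaviors (class names are hypothetical; plain Java, with the database outage simulated by a flag):

```java
import java.util.ArrayList;
import java.util.List;

/** Anti-pattern: failures are swallowed, so the job keeps "running"
 *  while Kafka offsets advance and rows are silently lost. */
class SilentSink {
    final List<String> written = new ArrayList<>();

    void invoke(String row, boolean dbDown) {
        try {
            if (dbDown) throw new RuntimeException("connection lost");
            written.add(row);
        } catch (RuntimeException e) {
            // swallowed: nothing reaches the database, but the task stays up
        }
    }
}

/** Preferred: rethrow, so Flink fails the task and recovers
 *  from the last checkpoint instead of dropping data. */
class FailFastSink {
    final List<String> written = new ArrayList<>();

    void invoke(String row, boolean dbDown) {
        if (dbDown) throw new RuntimeException("connection lost");
        written.add(row);
    }
}
```

With the fail-fast variant, a broken MySQL connection surfaces as a task failure that Flink's restart strategy can handle, rather than a silently stalled write path.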
# Related code and error message
```
org.apache.flink.util.FlinkException: Scheduler is being stopped.
at org.apache.flink.runtime.scheduler.SchedulerBase.closeAsync(SchedulerBase.java:607) ~[streampark-flinkjob_todolist_to_sgsmart.jar:?]
at org.apache.flink.runtime.jobmaster.JobMaster.stopScheduling(JobMaster.java:962) ~[streampark-flinkjob_todolist_to_sgsmart.jar:?]
at org.apache.flink.runtime.jobmaster.JobMaster.stopJobExecution(JobMaster.java:926) ~[streampark-flinkjob_todolist_to_sgsmart.jar:?]
at org.apache.flink.runtime.jobmaster.JobMaster.onStop(JobMaster.java:398) ~[streampark-flinkjob_todolist_to_sgsmart.jar:?]
at org.apache.flink.runtime.rpc.RpcEndpoint.internalCallOnStop(RpcEndpoint.java:214) ~[streampark-flinkjob_todolist_to_sgsmart.jar:?]
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor$StartedState.terminate(AkkaRpcActor.java:563) ~[streampark-flinkjob_todolist_to_sgsmart.jar:?]
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleControlMessage(AkkaRpcActor.java:186) ~[streampark-flinkjob_todolist_to_sgsmart.jar:?]
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26) [streampark-flinkjob_todolist_to_sgsmart.jar:?]
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21) [streampark-flinkjob_todolist_to_sgsmart.jar:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:123) [streampark-flinkjob_todolist_to_sgsmart.jar:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:122) [streampark-flinkjob_todolist_to_sgsmart.jar:?]
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21) [streampark-flinkjob_todolist_to_sgsmart.jar:?]
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171) [streampark-flinkjob_todolist_to_sgsmart.jar:?]
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172) [streampark-flinkjob_todolist_to_sgsmart.jar:?]
at akka.actor.Actor.aroundReceive(Actor.scala:517) [streampark-flinkjob_todolist_to_sgsmart.jar:?]
at akka.actor.Actor.aroundReceive$(Actor.scala:515) [streampark-flinkjob_todolist_to_sgsmart.jar:?]
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225) [streampark-flinkjob_todolist_to_sgsmart.jar:?]
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592) [streampark-flinkjob_todolist_to_sgsmart.jar:?]
at akka.actor.ActorCell.invoke(ActorCell.scala:561) [streampark-flinkjob_todolist_to_sgsmart.jar:?]
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258) [streampark-flinkjob_todolist_to_sgsmart.jar:?]
at akka.dispatch.Mailbox.run(Mailbox.scala:225) [streampark-flinkjob_todolist_to_sgsmart.jar:?]
at akka.dispatch.Mailbox.exec(Mailbox.scala:235) [streampark-flinkjob_todolist_to_sgsmart.jar:?]
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) [streampark-flinkjob_todolist_to_sgsmart.jar:?]
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) [streampark-flinkjob_todolist_to_sgsmart.jar:?]
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) [streampark-flinkjob_todolist_to_sgsmart.jar:?]
```
# My initial analysis
I checked MySQL for failed or aborted connections and found none. It appears the Flink job still consumes Kafka data normally, but it no longer obtains a MySQL connection to write data to the database. Note that "Scheduler is being stopped" is raised during job shutdown, so the root cause of the original failure likely appears earlier in the log.
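If this hypothesis holds — the job restarted but the sink's JDBC connection was never re-established — a defensive fix is to validate the connection before each write and rebuild it when stale (with plain JDBC, `connection.isValid(timeoutSeconds)`). Below is a minimal, generic sketch of that pattern; all names are hypothetical, and in a real Flink job this logic would live inside a `RichSinkFunction`'s `invoke()` with `java.sql.Connection`:

```java
import java.util.function.Predicate;
import java.util.function.Supplier;

/** Rebuilds a connection-like resource whenever it is no longer valid,
 *  instead of writing against a dead handle forever. */
class ReconnectingWriter<C> {
    private final Supplier<C> factory;   // opens a fresh connection
    private final Predicate<C> isValid;  // e.g. conn -> conn.isValid(5) for JDBC
    private C connection;

    ReconnectingWriter(Supplier<C> factory, Predicate<C> isValid) {
        this.factory = factory;
        this.isValid = isValid;
    }

    /** Returns a usable connection, re-opening it if the old one is stale. */
    C ensureConnection() {
        if (connection == null || !isValid.test(connection)) {
            connection = factory.get(); // reconnect rather than silently fail
        }
        return connection;
    }
}
```

Alternatively, Flink's bundled JDBC connector (`JdbcSink` in `org.apache.flink.connector.jdbc`, available in 1.13) manages connections and write retries itself, which avoids hand-rolling this logic.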
# Environment and software versions
Flink 1.13.6