I am trying to run a job using the DataSet API through IntelliJ. Note that if I run the same job through the Flink UI, it runs fine. To run the job, I first need to specify, through environment variables, the amount of data that will be processed. When the amount is relatively small the job runs fine, but as it gets bigger I begin to get the following error:
Connected to JobManager at Actor[akka://flink/user/jobmanager_N#XXXXXXXXXXXXX] with leader session xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx.
ERROR StatusLogger Log4j2 could not find a logging implementation. Please add log4j-core to the classpath. Using SimpleLogger to log to the console...
42908 [main] ERROR com.whatever.somecompany.SomeClass - Error executing pipeline
java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
    at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:223)
    at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:157)
    at scala.concurrent.Await$$anonfun$ready$1.apply(package.scala:169)
    at scala.concurrent.Await$$anonfun$ready$1.apply(package.scala:169)
    ...
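For context, the amount of data is picked up roughly like this (a minimal, stdlib-only sketch; `RECORD_COUNT` and the default value are illustrative, not my actual code):

```java
public class JobSize {
    /** Illustrative: read the data-volume knob from an environment variable. */
    static long recordCount() {
        String raw = System.getenv("RECORD_COUNT"); // hypothetical variable name
        return raw == null ? 1_000L : Long.parseLong(raw.trim());
    }

    public static void main(String[] args) {
        // The DataSet source is then sized from this value (omitted here).
        System.out.println("records to process: " + recordCount());
    }
}
```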
What can I change to solve this? Do I need to increase my heap memory, or something else? Thanks.
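From the stack trace, the 10000 milliseconds looks like a client-side ask/request timeout rather than an out-of-memory error, so I suspect the relevant knobs might be the Akka timeouts, something along these lines in the Flink configuration (an assumption on my part; I have not confirmed these keys apply to my Flink version):

```yaml
# flink-conf.yaml -- assumed keys, not verified for my Flink version
akka.client.timeout: 600 s
akka.ask.timeout: 600 s
```

Since I run the job from IntelliJ, I assume I would need to set these programmatically instead, e.g. by passing a `Configuration` to `ExecutionEnvironment.createLocalEnvironment(config)`, but I am not sure this is the right fix.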