Using Flink's Accumulator

Flink's Accumulator is similar in its application scenario to Spark's Accumulator: it is a good way to observe how a task's data changes while the job is running. The accumulator can be updated inside an operator function of a Flink job, but its final result can only be retrieved after the job has finished.

Using an accumulator in Flink is very simple:

1: Create an accumulator: val acc = new IntCounter()

2: Register the accumulator: getRuntimeContext().addAccumulator("accumulator", acc)

3: Use the accumulator: this.acc.add(1)

4: Get the accumulator's result: myJobExecutionResult.getAccumulatorResult("accumulator")

Let's see a complete demo:

package flink

import org.apache.flink.api.common.accumulators.IntCounter
import org.apache.flink.api.common.functions.RichMapFunction
import org.apache.flink.api.scala.ExecutionEnvironment
import org.apache.flink.api.scala._
import org.apache.flink.configuration.Configuration

/**
  * Usage of Flink accumulators
  */
object flinkBatch {
  def main(args: Array[String]): Unit = {
    val env = ExecutionEnvironment.getExecutionEnvironment
    val data = env.fromElements("a", "b", "c", "d")
    val result = data.map(new RichMapFunction[String, String] {
      private val acc = new IntCounter()                     // 1: create the accumulator
      override def open(parameters: Configuration): Unit = {
        getRuntimeContext.addAccumulator("accumulator", acc) // 2: register the accumulator
      }
      override def map(value: String): String = {
        this.acc.add(1)                                      // 3: use the accumulator
        value
      }
    })
    result.writeAsText("/tmp/flink_acc_demo") // a sink is required before execute()
    val myJobExecutionResult = env.execute("flinkBatch")
    // 4: get the accumulator's result, available only after the job finishes
    println(myJobExecutionResult.getAccumulatorResult[Integer]("accumulator"))
  }
}
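Besides `IntCounter`, the `org.apache.flink.api.common.accumulators` package also provides `LongCounter` and `Histogram`, which are registered and read the same way. A minimal sketch of one rich function tracking several accumulators at once (the accumulator names, the byte-counting, and the length histogram are illustrative assumptions, not from the original post; running it requires the Flink dependencies on the classpath):

```scala
import org.apache.flink.api.common.accumulators.{Histogram, IntCounter, LongCounter}
import org.apache.flink.api.common.functions.RichMapFunction
import org.apache.flink.configuration.Configuration

// Counts records, sums input bytes, and tracks a record-length histogram.
class MultiAccMapper extends RichMapFunction[String, String] {
  private val records = new IntCounter()
  private val bytes   = new LongCounter()
  private val lengths = new Histogram()

  override def open(parameters: Configuration): Unit = {
    val ctx = getRuntimeContext
    ctx.addAccumulator("records", records)
    ctx.addAccumulator("bytes", bytes)
    ctx.addAccumulator("lengths", lengths)
  }

  override def map(value: String): String = {
    records.add(1)
    bytes.add(value.getBytes("UTF-8").length.toLong)
    lengths.add(value.length) // increments the bucket for this record length
    value
  }
}
```

After `env.execute(...)`, each value is fetched with `getAccumulatorResult` under the name it was registered with, just as in the demo above.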

Origin blog.csdn.net/xianpanjia4616/article/details/86680066