Spark: Spark Streaming

Spark Streaming uses a “micro-batch” architecture, where the streaming computation is treated as a continuous series of batch computations on small batches of data. Spark Streaming receives data from various input sources and groups it into small batches. New batches are created at regular time intervals: at the beginning of each interval a new batch is created, and any data that arrives during that interval is added to it. At the end of the interval the batch stops growing. The size of the time intervals is determined by a parameter called the batch interval.
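The grouping-by-interval idea can be illustrated with a small simulation. This is not Spark's actual implementation, just a toy sketch: `assign_to_batches` and its arguments are hypothetical names, and the "timestamp" here stands in for a record's arrival time at the receiver.

```python
from collections import defaultdict

def assign_to_batches(events, batch_interval):
    """Group (arrival_time, record) pairs into micro-batches.

    Each batch covers one interval [k * batch_interval, (k + 1) * batch_interval);
    a record joins whichever batch is open when it arrives.
    """
    batches = defaultdict(list)
    for arrival_time, record in events:
        batch_id = arrival_time // batch_interval  # index of the open interval
        batches[batch_id].append(record)
    return dict(batches)

# Records arriving at times 0.2, 0.7, 1.1, 1.9 with a 1-second batch interval:
events = [(0.2, "a"), (0.7, "b"), (1.1, "c"), (1.9, "d")]
print(assign_to_batches(events, 1))  # {0.0: ['a', 'b'], 1.0: ['c', 'd']}
```

A shorter batch interval lowers latency but increases per-batch scheduling overhead, which is the central trade-off when choosing it.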
Transformations

Transformations on DStreams can be grouped into either stateless or stateful:

  • In stateless transformations, the processing of each batch does not depend on the data of its previous batches.
  • Stateful transformations, in contrast, use data or intermediate results from previous batches to compute the results of the current batch. They include transformations based on sliding windows and on tracking state across time.
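The distinction can be sketched without Spark at all. Below, a stateless word count starts fresh for every batch, while a stateful count folds each batch into totals carried over from previous batches (in spirit like Spark Streaming's `updateStateByKey`); the function names are illustrative, not Spark API.

```python
def stateless_count(batch):
    """Stateless: the result depends only on the current batch."""
    counts = {}
    for word in batch:
        counts[word] = counts.get(word, 0) + 1
    return counts

def stateful_count(batch, state):
    """Stateful: folds the current batch into state from previous batches."""
    for word in batch:
        state[word] = state.get(word, 0) + 1
    return state

batches = [["spark", "streaming"], ["spark"]]
state = {}
for batch in batches:
    per_batch = stateless_count(batch)    # fresh result each interval
    state = stateful_count(batch, state)  # running totals across intervals

print(per_batch)  # {'spark': 1} -- reflects the last batch only
print(state)      # {'spark': 2, 'streaming': 1} -- accumulated across batches
```

Sliding-window transformations are the other stateful family: they recompute over the last N intervals of data rather than over all history.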

References

  • Learning Spark

Reposted from ylzhj02.iteye.com/blog/2205094