StreamingPro

    xiaoxiao · 2025-10-05

    Declarative workflows for building Spark Streaming

    Spark is an extensible and programmable framework for massively distributed processing of datasets, which it represents as Resilient Distributed Datasets (RDDs). Spark Streaming is an extension of the core Spark API that enables stream processing from a variety of sources: it receives input data streams and divides the data into batches, which are then processed by the Spark engine to generate the results. In Spark Streaming, data is organized into a sequence of DStreams, each represented internally as a sequence of RDDs.
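    The micro-batching idea described above can be sketched conceptually in plain Python (this is an illustration of the concept, not the Spark API): a continuous stream is cut into fixed-size batches, and each batch is handed to the processing engine as one unit, just as a DStream is processed as a sequence of RDDs.

```python
def micro_batches(stream, batch_size):
    """Yield successive fixed-size batches from an event stream.

    Mirrors (conceptually) how Spark Streaming slices a continuous
    input stream into batches for the Spark engine.
    """
    batch = []
    for event in stream:
        batch.append(event)
        if len(batch) == batch_size:
            yield batch       # hand a full batch to the engine
            batch = []
    if batch:                 # flush the final, possibly partial batch
        yield batch

# Each batch is then processed as a whole, e.g. counting distinct
# words per batch:
events = ["a", "b", "a", "c", "a", "b", "d"]
results = [len(set(b)) for b in micro_batches(events, 3)]
```

    In real Spark Streaming the batch boundary is a time interval (the batch duration) rather than a fixed element count, but the processing model per batch is the same.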

    StreamingPro

    StreamingPro is not a complete application, but rather an extensible and programmable framework for Spark Streaming (and also for Spark and Storm) that makes it easy to build your own streaming application. With StreamingPro, building a streaming program can be as simple as assembling components (e.g. the SQL component) in a configuration file.

    Features

    - Pure Spark Streaming (or ordinary Spark) programs; Storm support planned
    - No coding required: workflows are purely declarative
    - REST API for interactive use
    - SQL-oriented workflow support
    - Data continuously streamed in and processed in near real-time
    - Dynamic CRUD of workflows at runtime via the REST API
    - Flexible workflows (input, output, parsers, etc.)
    - High performance
    - Scalable

    Documents

    - Properties
    - Build
    - Run your first application
    - Submit application
    - Dynamic CRUD of workflows at runtime via the REST API
    - Recovery
    - Useful modules introduction
    - Other runtime support

    Architecture

    [Architecture diagram: Snip20160510_3.png]

    Declarative workflows

    [Declarative workflow diagram: Snip20160510_4.png]
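    To make the declarative approach concrete, a workflow is described entirely in a configuration file rather than in code. The fragment below is a hypothetical sketch: the job name, component ("compositor") names, and parameters are illustrative assumptions, not the project's exact schema.

```json
{
  "my-first-job": {
    "desc": "count words arriving on a socket stream (illustrative only)",
    "strategy": "spark_streaming",
    "compositor": [
      { "name": "stream.source.socket",
        "params": [{ "host": "127.0.0.1", "port": "9999" }] },
      { "name": "stream.sql",
        "params": [{ "sql": "SELECT word, COUNT(*) AS cnt FROM stream GROUP BY word" }] },
      { "name": "stream.output.console",
        "params": [] }
    ]
  }
}
```

    The point of the design is that the pipeline — source, SQL transformation, sink — is assembled from named components in configuration, so the same framework binary can run many different streaming jobs without recompilation.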

    Implementation

    [Implementation diagram: Snip20160510_1.png]