Flink Job Submission


As a distributed compute engine similar to Spark, Flink supports multiple deployment modes: standalone, Kubernetes (k8s), YARN, and Mesos.

Standalone Mode

Installation steps:

1. Download the installation package and distribute the configured package to every node.

2. Configure flink-conf.yaml & slaves.

3. Start the Flink cluster (see the sketch after this list).

4. Verify that Flink is up: http://localhost:8081/

Process 1 - StandaloneSessionClusterEntrypoint

Process 2 - TaskManagerRunner
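
A minimal sketch of steps 3 and 4 on a single node, assuming the Flink 1.7 distribution has been unpacked and configured as described above (hostnames and paths are illustrative):

# 3. Start the standalone cluster (JobManager plus the TaskManagers listed in conf/slaves)
bin/start-cluster.sh

# 4. Verify the JVM processes with jps; the two processes named above should appear
jps
#   StandaloneSessionClusterEntrypoint   <- JobManager of the standalone session
#   TaskManagerRunner                    <- one per TaskManager

# The web UI is then reachable at http://localhost:8081/
# To shut the cluster down later:
bin/stop-cluster.sh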

Submitting a Flink Job

The Maven packaging plugin configuration for Flink programs below is tested and working; it comes from the official documentation:
https://ci.apache.org/projects/flink/flink-docs-release-1.7/dev/projectsetup/dependencies.html

<build>
    <plugins>
        <!-- We use the maven-shade plugin to create a fat jar that contains all necessary dependencies. -->
        <!-- Change the value of <mainClass>...</mainClass> if your program entry point changes. -->
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>3.0.0</version>
            <executions>
                <!-- Run shade goal on package phase -->
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <artifactSet>
                            <excludes>
                                <exclude>org.apache.flink:force-shading</exclude>
                                <exclude>com.google.code.findbugs:jsr305</exclude>
                                <exclude>org.slf4j:*</exclude>
                                <exclude>log4j:*</exclude>
                            </excludes>
                        </artifactSet>
                        <filters>
                            <filter>
                                <!-- Do not copy the signatures in the META-INF folder.
                                Otherwise, this might cause SecurityExceptions when using the JAR. -->
                                <artifact>*:*</artifact>
                                <excludes>
                                    <exclude>META-INF/*.SF</exclude>
                                    <exclude>META-INF/*.DSA</exclude>
                                    <exclude>META-INF/*.RSA</exclude>
                                </excludes>
                            </filter>
                        </filters>
                    </configuration>
                </execution>
            </executions>
        </plugin>

        <!-- Java Compiler -->
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.1</version>
            <configuration>
                <source>1.8</source>
                <target>1.8</target>
            </configuration>
        </plugin>

        <!-- Scala Compiler -->
        <plugin>
            <groupId>net.alchim31.maven</groupId>
            <artifactId>scala-maven-plugin</artifactId>
            <version>3.2.2</version>
            <executions>
                <execution>
                    <goals>
                        <goal>compile</goal>
                        <goal>testCompile</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>

</build>
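
With this <build> section in the project's pom.xml, building the fat jar is a standard Maven package run (a minimal sketch; the exact jar name under target/ depends on the project's artifactId and version):

# build the shaded (fat) jar containing the job and its dependencies
mvn clean package
# the resulting jar under target/ is what gets submitted with bin/flink run below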

1. Submit through Flink's web UI (the monitoring page).

2. Submit from the shell command line.

# a>> standalone submission
bin/flink run  # the submit command
-c com.shufang.flink.examples.FlinkWordCount  # main class
[-p 2]  # set the parallelism (optional)
xxxxxxxxxxxxx.jar  # path to the job jar
--port 9999
--host localhost
---------------------------------
bin/flink list [--all]  # list running jobs (--all shows all)
bin/flink cancel [jobId]  # cancel a job
---------------------------------
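
Put together on one line, a standalone submission might look like the following sketch; the jar name is a hypothetical stand-in for the path above, and --host/--port are assumed to be program arguments read by FlinkWordCount:

# hypothetical jar name; substitute the fat jar produced by mvn clean package
bin/flink run -c com.shufang.flink.examples.FlinkWordCount -p 2 target/flink-examples-1.0.jar --host localhost --port 9999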

# b>> yarn-session submission: requires a Flink build with Hadoop support
bin/yarn-session.sh
-n 2  # number of TaskManagers
-s 2  # number of slots per TaskManager
-jm 1024  # JobManager memory (MB)
-tm 1024  # TaskManager memory (MB)
-nm test  # application name shown in the YARN UI (AppName)
-d  # detached mode (run in the background)
# once the session is up, jobs can be submitted
bin/flink run
-m yarn-cluster  # use the YARN cluster as the master / resource manager
-c com.shufang.flink.examples.FlinkWordCount
xxxxxxxx.jar
--input ///x.txt
--output ////x.csv
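
Collapsed onto single lines, the same flow might look like this sketch; the jar and the input/output paths are hypothetical placeholders for the elided values above:

# start a detached YARN session: 2 TaskManagers, 2 slots each, 1024 MB for JobManager and TaskManager
bin/yarn-session.sh -n 2 -s 2 -jm 1024 -tm 1024 -nm test -d

# submit the job with the same flags as above (hypothetical jar and paths)
bin/flink run -m yarn-cluster -c com.shufang.flink.examples.FlinkWordCount target/flink-examples-1.0.jar --input /path/to/x.txt --output /path/to/x.csv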


Reposted from blog.csdn.net/shufangreal/article/details/104592448