Spark Shell, and a Comparison of Spark 2.2 and Spark 1.6

2019-12-12  09:37:43

Spark Shell

spark-shell is an interactive shell that ships with Spark. It provides a user-friendly interactive environment in which you can write Scala programs against Spark from the command line.

spark-shell is used for testing and can be started in two modes: local mode and cluster mode.

Local mode:

PS: you must run the command from the bin directory under the Spark installation directory
spark-shell
Started this way, the shell runs in local mode. In local mode only a SparkSubmit process is started on the local machine; no connection to a cluster is established, so although the SparkSubmit process exists, nothing is submitted to a cluster.
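Once the shell is up you can sanity-check the built-in SparkContext directly at the prompt; `sc` is pre-created by the shell, and the numbers here are only an illustration:

```scala
// typed at the spark-shell prompt; `sc` is the shell's built-in SparkContext
val rdd = sc.parallelize(1 to 10)
rdd.reduce(_ + _)   // sums the numbers 1..10 and returns 55
```

In local mode this job runs entirely inside the single SparkSubmit process, which is why no cluster is needed.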


Cluster mode:

You must run the command from the bin directory under the Spark installation directory:
spark-shell \
--master spark://hadoop01:7077 \
--executor-memory 512m \
--total-executor-cores 1
PS: the --master option is mandatory; the other two options can be omitted.


Exit Shell:

1. Use :quit to exit the shell
2. Use Ctrl+C to exit the shell
PS: exiting with Ctrl+C may leave the port occupied by a background process.
   Check the listening port with netstat -apn | grep 4040 and kill whichever process is occupying it.


Comparison of the Spark 2.2 shell and the Spark 1.6 shell

1) Spark 2.2

2) Spark 1.6

Description: the Spark 2.x shell has two built-in objects:

SparkContext  ——> variable name: sc

SparkSession  ——> variable name: spark

SparkSession cannot be explained in isolation: starting with 2.x, Spark merges SQLContext and HiveContext and provides a new, unified entry point for creating them, SparkSession.
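A minimal sketch of the 2.x entry point, assuming a standalone application rather than the shell (in the shell, `spark` is already pre-created for you):

```scala
import org.apache.spark.sql.SparkSession

// SparkSession is the single unified entry point in Spark 2.x,
// replacing the separate SQLContext and HiveContext
val spark = SparkSession.builder()
  .appName("demo")           // hypothetical application name
  .master("local[*]")        // local mode, as in the shell examples above
  .getOrCreate()

// the classic SparkContext is still reachable through the session
val sc = spark.sparkContext
```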

The Spark 1.6 shell has two built-in objects:

SparkContext  ——> variable name: sc

SQLContext  ——> variable name: sqlContext
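For contrast, a 1.6-style application builds the two objects separately (in the 1.6 shell both are pre-created); this is a sketch of the old pattern:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Spark 1.6 style: SparkContext first, then a SQLContext wrapped around it
val conf = new SparkConf().setAppName("demo").setMaster("local[*]")
val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)   // superseded by SparkSession in 2.x
```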


Origin www.cnblogs.com/yumengfei/p/12027506.html