SparkSQL and SparkCore: saving data to a relational database

After data analysis, the results eventually have to be persisted somewhere. Generally they are saved to HDFS or to a common relational database. Here I will use MySQL as the example.
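
Whichever API you use, the MySQL JDBC driver has to be on the classpath. Assuming an sbt build (the original post does not show one), the dependency would look something like the line below; the 5.1.x connector matches the com.mysql.jdbc.Driver class name used in the examples:

    libraryDependencies += "mysql" % "mysql-connector-java" % "5.1.49"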

Spark SQL is actually easy to work with: just call write.jdbc on the DataFrame.

    // JDBC connection properties
    val prop = new java.util.Properties()
    prop.put("driver", "com.mysql.jdbc.Driver")
    prop.put("user", "root")
    prop.put("password", "root")

    // Write the result DataFrame to the `student` table
    resultDF.write.jdbc("jdbc:mysql://localhost:3306/test?createDatabaseIfNotExist=true&characterEncoding=UTF-8", "student", prop)
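
For context, here is a minimal, self-contained sketch of how such a resultDF could be produced and written. The SparkSession setup, the hypothetical Student case class, and the sample rows are assumptions for illustration only; SaveMode.Append adds rows to the table, while SaveMode.Overwrite would recreate it:

    import java.util.Properties

    import org.apache.spark.sql.{SaveMode, SparkSession}

    object JdbcWriteExample {
      // Hypothetical schema, only for this sketch
      case class Student(id: Int, name: String, score: Int)

      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("JdbcWriteExample")
          .master("local[*]")
          .getOrCreate()
        import spark.implicits._

        // Stand-in for the real analysis result
        val resultDF = Seq(Student(1, "Tom", 90), Student(2, "Jerry", 85)).toDF()

        val prop = new Properties()
        prop.put("driver", "com.mysql.jdbc.Driver")
        prop.put("user", "root")
        prop.put("password", "root")

        // Append keeps existing rows; Overwrite drops and recreates the table
        resultDF.write
          .mode(SaveMode.Append)
          .jdbc("jdbc:mysql://localhost:3306/test?createDatabaseIfNotExist=true&characterEncoding=UTF-8", "student", prop)

        spark.stop()
      }
    }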

If you are using the Core API (here through Spark Streaming's foreachRDD), you write the JDBC calls yourself. The following is an example of how I save a word count:

    import java.sql.{DriverManager, ResultSet}

    tuple.foreachRDD(rdd => {
      rdd.foreach(word => {
        // Load the MySQL driver and open a connection
        Class.forName("com.mysql.jdbc.Driver")
        val conn = DriverManager.getConnection("jdbc:mysql://localhost:3306/test", "root", "")
        // Write the data to MySQL
        try {
          var totalcount = word._2
          var sql = ""
          val querysql = "select count from wordcount where titleName='" + word._1 + "'"
          val queryresult: ResultSet = conn.prepareStatement(querysql).executeQuery()
          // Update if the word already exists, otherwise insert it
          if (queryresult.next()) {
            totalcount = queryresult.getString("count").toInt + word._2
            sql = "update wordcount set count='" + totalcount + "' where titleName='" + word._1 + "'"
          } else {
            sql = "insert into wordcount (titleName,count) values ('" + word._1 + "','" + totalcount + "')"
          }

          conn.prepareStatement(sql).executeUpdate()
          println("save finished --------------------------------------------------------------")
        } finally {
          conn.close()
        }
      })
    })
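
Opening a connection and running a lookup query for every single record works, but it is expensive, and the string-concatenated SQL is fragile. As a refinement (a sketch, not the original author's code), you can open one connection per partition and let MySQL do the upsert with a parameterized statement. This assumes the same wordcount table with an integer count column and a unique key on titleName:

    import java.sql.DriverManager

    tuple.foreachRDD(rdd => {
      rdd.foreachPartition(partition => {
        // One connection per partition instead of one per record
        val conn = DriverManager.getConnection("jdbc:mysql://localhost:3306/test", "root", "")
        // Insert new words, add to the count of existing ones (requires a unique key on titleName)
        val upsert = conn.prepareStatement(
          "insert into wordcount (titleName, count) values (?, ?) " +
          "on duplicate key update count = count + ?")
        try {
          partition.foreach { case (title, cnt) =>
            upsert.setString(1, title)
            upsert.setInt(2, cnt)
            upsert.setInt(3, cnt)
            upsert.executeUpdate()
          }
        } finally {
          upsert.close()
          conn.close()
        }
      })
    })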

Original post: blog.csdn.net/dudadudadd/article/details/114374087