Site description: the article has been viewed 69 times. Code excerpt from the post (cut off by the site's preview):

    import Utils.SparkUtils
    import org.apache.spark.SparkContext
    import org.apache.spark.rdd.RDD

    object Demo {
      def main(args: Array[String]): Unit = {
        val sc: SparkContext = SparkUtils.getSparkContext()
        val rdd: RDD[(String, Int)] = sc.textFile("data/
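The excerpt above stops mid-statement, so the rest of the post's pipeline is unknown. As a hedged sketch of the kind of word-count-style job the title suggests (`RentHouseCount.scala`): `SparkUtils.getSparkContext()` is the post author's helper, so this sketch builds a local `SparkContext` directly; the object name, input path, and counting logic are all assumptions, not the article's actual code. It also shows where the `MapPartitionsRDD[2] at flatMap at ...` text in the title comes from: it is Spark's debug name for the RDD produced by `flatMap`.

    // Hypothetical, self-contained sketch; NOT the article's actual code.
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.rdd.RDD

    object RentHouseCountSketch {
      def main(args: Array[String]): Unit = {
        // Local-mode context instead of the author's SparkUtils helper.
        val conf = new SparkConf().setAppName("RentHouseCount").setMaster("local[*]")
        val sc = new SparkContext(conf)

        // textFile returns an RDD[String]; flatMap over it yields a
        // MapPartitionsRDD, whose toString/toDebugString looks like
        // "MapPartitionsRDD[2] at flatMap at RentHouseCountSketch.scala:NN".
        val words: RDD[String] = sc
          .textFile("data/renthouse.txt") // hypothetical input path
          .flatMap(_.split(" "))

        // Classic per-key count, matching the RDD[(String, Int)] type
        // annotation visible in the truncated excerpt.
        val counts: RDD[(String, Int)] = words.map((_, 1)).reduceByKey(_ + _)
        counts.collect().foreach(println)

        sc.stop()
      }
    }

Running this requires Spark on the classpath (e.g. `spark-core` as a dependency); in local mode no cluster is needed.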
- Link URL: https://blog.csdn.net/DearNingning/article/details/117589131
- Link title: 进阶RDD_mappartitionsrdd[2] at flatmap at renthousecount.s-CSDN博客 ("Advanced RDD", on CSDN Blog)
- Site: blog.csdn.net
- Bookmark count: 9299
- Site tags: mappartitionsrdd[2] at flatmap at renthousecount.scala:11