PySpark Practice Notes 3

Copyright notice: this is the author's original post; do not repost without permission. https://blog.csdn.net/yepeng2007fei/article/details/79670017
# Convert a local date string to a Unix timestamp
import time

fullend = "20170120"
trackdate = int(time.mktime(time.strptime(fullend, '%Y%m%d')))
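A quick round-trip check of the conversion above (a minimal sketch; note that time.mktime interprets the string in the machine's local timezone, so the numeric timestamp itself is machine-dependent):

```python
import time

fullend = "20170120"
trackdate = int(time.mktime(time.strptime(fullend, '%Y%m%d')))

# Formatting the timestamp back through the same local timezone
# recovers the original date string.
roundtrip = time.strftime('%Y%m%d', time.localtime(trackdate))
```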


# Convert a string date column to a Unix timestamp in Spark
from pyspark.sql.functions import unix_timestamp

test1 = test.withColumn('unixtime', unix_timestamp('time', 'yyyy/MM/dd'))
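As a pure-Python cross-check of what unix_timestamp computes with the pattern 'yyyy/MM/dd' (a sketch, not Spark code: the Java pattern 'yyyy/MM/dd' corresponds to Python's '%Y/%m/%d', and since Spark applies the session timezone, this version pins the calculation to UTC with calendar.timegm for reproducibility):

```python
import calendar
from datetime import datetime

def to_unixtime_utc(s):
    # Parse a 'yyyy/MM/dd'-style string ('%Y/%m/%d' in Python) and
    # return seconds since the epoch, treating the date as UTC midnight.
    return calendar.timegm(datetime.strptime(s, '%Y/%m/%d').timetuple())

ts = to_unixtime_utc('2017/01/20')
```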


# Outer join on multiple columns/keys at once
pctdata = data.join(data1, ["phone_id", "idcard", "time"], "outer")
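The "outer" mode keeps rows from both sides, filling in null for columns that exist only on the other side when a key combination has no match. A minimal pure-Python sketch of this multi-key full outer join (full_outer_join is a hypothetical helper illustrating the semantics, not part of Spark; it assumes the key combination is unique on each side, whereas Spark would produce one output row per matching pair):

```python
def full_outer_join(left, right, keys):
    # Index each side's rows by the tuple of join-key values.
    lidx = {tuple(r[k] for k in keys): r for r in left}
    ridx = {tuple(r[k] for k in keys): r for r in right}
    out = []
    for key in lidx.keys() | ridx.keys():  # union of keys from both sides
        row = {}
        row.update(lidx.get(key, {}))   # left columns, if the key matched
        row.update(ridx.get(key, {}))   # right columns, if the key matched
        out.append(row)
    return out

left = [{"phone_id": 1, "idcard": "a", "time": 1, "x": 10}]
right = [{"phone_id": 1, "idcard": "a", "time": 1, "y": 20},
         {"phone_id": 2, "idcard": "b", "time": 2, "y": 30}]
rows = full_outer_join(left, right, ["phone_id", "idcard", "time"])
```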


# Natural-log (ln) score calculation, where p is a probability (0 < p < 1);
# math.log defaults to base e, so the explicit base is unnecessary
import math

def score(p):
    return int(11.0 * math.log((1 - p) / p) + 12.0)
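A self-contained version of the formula with a couple of sample values (a sketch; the constants 11.0 and 12.0 are taken from the snippet above). Since ln((1-p)/p) is 0 at p = 0.5, the score there is exactly the offset 12, and the score decreases monotonically as p grows:

```python
import math

def score(p):
    # Log-odds style score: ln((1-p)/p) scaled by 11 and shifted by 12.
    return int(11.0 * math.log((1 - p) / p) + 12.0)

mid = score(0.5)  # ln(1) == 0, so this is int(12.0)
```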


# A column of a Spark DataFrame holds a list; split it into 2 separate columns

data2 = (data1
         .withColumn("phone", data1["Tokens"].getItem(0))
         .withColumn("idcard", data1["Tokens"].getItem(1)))
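getItem(i) pulls element i out of an array column, yielding null when the index is out of range. A pure-Python sketch of that behavior (get_item is a hypothetical helper mirroring the semantics, not a Spark call; the sample values are made up):

```python
def get_item(tokens, i):
    # Mirror Column.getItem on an array column: return the element,
    # or None (Spark's null) when the list is missing or too short.
    return tokens[i] if tokens is not None and 0 <= i < len(tokens) else None

row = {"Tokens": ["p1", "id1"]}   # made-up sample row
phone = get_item(row["Tokens"], 0)
idcard = get_item(row["Tokens"], 1)
missing = get_item(row["Tokens"], 2)
```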

If you'd like to discuss, you can join the Python & Spark QQ group 636866908 or the R & Big Data Analysis QQ group 456726635.
