Translate Python to PySpark
I have Python code that keeps all rows whose Type contains NCO - ETD, grouped by ID and Date:
cond = 'NCO - ETD'
# transform('any') broadcasts the group result back to every row, so the
# boolean mask keeps whole (Date, ID) groups that contain at least one match
df_ = data[data.assign(new=data['Type'].str.contains(cond))
           .groupby(['Date', 'ID'])['new']
           .transform('any')]
df_
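To make the pandas behavior concrete, here is a minimal runnable illustration of the same group-wise filtering pattern (the sample data is invented for the demo; `regex=False` is used so the match is a literal substring, which is also what the PySpark translation will do):

```python
import pandas as pd

data = pd.DataFrame({
    'Date': ['2021-01-01', '2021-01-01', '2021-01-01', '2021-01-02'],
    'ID':   [1, 1, 2, 2],
    'Type': ['NCO - ETD', 'other', 'other', 'NCO - ETD'],
})

cond = 'NCO - ETD'
# group-level "any" broadcast back to each row of the group
mask = (data.assign(new=data['Type'].str.contains(cond, regex=False))
            .groupby(['Date', 'ID'])['new']
            .transform('any'))
df_ = data[mask]
print(df_)
```

Here the (2021-01-01, 1) and (2021-01-02, 2) groups each contain a matching row, so all of their rows are kept, while the (2021-01-01, 2) group is dropped entirely.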
Now I need to translate this code to PySpark.
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow