How to escape ( parentheses in Spark Scala?
I am trying to replace parentheses in a string (i.e. in column names). It works fine with white spaces but not with parentheses. I tried `"""`, `\(`, and `\\(`, but I always get an error. I also tried this tip: How can I escape special symbols in scala string? but it did not help me. Can you please tell me how to solve this?
import org.apache.commons.lang3.StringEscapeUtils
var newDf = df
for(col <- df.columns){
newDf = newDf.withColumnRenamed(col,col.replaceAll(StringEscapeUtils.escapeJava("("), "_"))
newDf = newDf.withColumnRenamed(col,col.replaceAll(" ", "-"))
}
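For context on the error: `replaceAll` interprets its first argument as a regular expression, and a bare `(` is an invalid pattern (an unclosed group), while `StringEscapeUtils.escapeJava("(")` returns `(` unchanged, so it does not help here. A minimal plain-Scala illustration (no Spark required; the object name `Repro` is just for this sketch):

```scala
import java.util.regex.PatternSyntaxException

object Repro {
  // A bare "(" is an unclosed group in regex syntax and throws at compile time
  def bareParen(s: String): String =
    try s.replaceAll("(", "_")
    catch { case _: PatternSyntaxException => "invalid regex" }

  // Inside a Scala string literal, "\\(" reaches the regex engine as \( ,
  // which matches a literal opening parenthesis
  def escapedParen(s: String): String =
    s.replaceAll("\\(", "_")
}
```

So the escape itself must survive both the Scala string literal and the regex engine, which is why `\\(` (two backslashes in source) is the working single-character form.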
Thanks a lot!
Solution 1:[1]
replaceAll uses regex, so you can simply put the parentheses in a character class, in which case you don't have to escape them:
val df = Seq((1,2)).toDF("(ABC)", "C D")
df.columns
// res28: Array[String] = Array((ABC), C D)
var newDf = df
for(col <- df.columns){
newDf = newDf.withColumnRenamed(col, col.replaceAll(" ", "-").replaceAll("[()]", "_"))
}
newDf.columns
// res30: Array[String] = Array(_ABC_, C-D)
Or `\\(|\\)` should also work:
newDf.withColumnRenamed(col, col.replaceAll(" ", "-").replaceAll("\\(|\\)", "_"))
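Either variant can be packaged as a small helper that runs on plain strings, so it is easy to test before applying it to Spark column names via `withColumnRenamed`. A minimal sketch (the names `sanitize` and `sanitizeLiteral` are made up for this example; `Pattern.quote` is a third option that escapes the whole argument so it is matched literally):

```scala
import java.util.regex.Pattern

object ColumnNameSanitizer {
  // Character-class form from the answer: spaces -> "-", parens -> "_"
  def sanitize(name: String): String =
    name.replaceAll(" ", "-").replaceAll("[()]", "_")

  // Pattern.quote wraps its argument in \Q...\E, so no manual escaping
  // is needed even for regex metacharacters like "("
  def sanitizeLiteral(name: String): String =
    name
      .replaceAll(Pattern.quote("("), "_")
      .replaceAll(Pattern.quote(")"), "_")
}
```

In the loop above you would then write `newDf.withColumnRenamed(col, ColumnNameSanitizer.sanitize(col))`. `Pattern.quote` is the safer choice if the character to replace is not known in advance, since it works for any regex metacharacter.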
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow