regexp_extract_all not working with Spark SQL

I'm using a Databricks notebook to extract all field occurrences from a text column using the regexp_extract_all function. Here is the input:

field_map#'IFDSIMP.7'.$1.$0 == 'X') OR (field_map#'IFDSIMP.14'.$1.$0 == 'X')

I'm able to extract the values when querying the df as a view:

SELECT regexp_extract_all(raw, "field_map#\'.*?\'", 0) as field1 from fieldViews

+----------------------+
|field1                |
+----------------------+
|field_map#'IFDSDEP.7' |
|field_map#'IFDSIMP.14'|
+----------------------+
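
For context, a minimal sketch of how the DataFrame and view might be set up (the single sample row and the SparkSession boilerplate are assumptions; the column name raw and the view name fieldViews come from the query above):

from pyspark.sql import SparkSession

# In a Databricks notebook the session already exists as `spark`
spark = SparkSession.builder.getOrCreate()

# One sample row holding the input string shown above
df = spark.createDataFrame(
    [("field_map#'IFDSIMP.7'.$1.$0 == 'X') OR (field_map#'IFDSIMP.14'.$1.$0 == 'X')",)],
    ["raw"],
)
df.createOrReplaceTempView("fieldViews")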

However, I get an empty result set with Spark SQL:

spark.sql("SELECT regexp_extract_all(raw, 'field_map#\'.*?\'', 0) as field1 from fieldViews")
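
One thing worth checking is the SQL text that Python actually hands to spark.sql, since inside a double-quoted Python string \' is resolved to a plain ' before Spark ever parses the statement. A small diagnostic sketch (not a fix), building the same statement as a variable so the submitted text can be inspected:

# Inside this double-quoted Python string, \' is just ', so the pattern Spark
# receives may differ from the one used in the view query above.
query = "SELECT regexp_extract_all(raw, 'field_map#\'.*?\'', 0) as field1 from fieldViews"
print(query)  # shows the exact text the Spark SQL parser will receive
spark.sql(query).show(truncate=False)  # same call as above, with the submitted text now visible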


