Databricks Error: AnalysisException: Incompatible format detected when writing with Delta

I'm getting the following error when I attempt to write to my data lake with Delta on Databricks:

fulldf = spark.read.format("csv").option("header", True).option("inferSchema",True).load("/databricks-datasets/flights/")

fulldf.write.format("delta").mode("overwrite").save('/mnt/lake/BASE/flights/Full/')

The above produces the following error:

AnalysisException: Incompatible format detected.

You are trying to write to `/mnt/lake/BASE/flights/Full/` using Databricks Delta, but there is no
transaction log present. Check the upstream job to make sure that it is writing
using format("delta") and that you are trying to write to the table base path.

To disable this check, SET spark.databricks.delta.formatCheck.enabled=false
To learn more about Delta, see https://docs.databricks.com/delta/index.html

Any reason for the error?



Solution 1:[1]

This worked in my similar situation:

%sql CONVERT TO DELTA parquet.`/mnt/lake/BASE/flights/Full/`
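For reference, the same conversion can be done through the Delta Lake Python API instead of the %sql magic. This is only a minimal sketch, assuming the folder already contains Parquet files and that the delta-spark package is available on the cluster; the path is the one from the question:

# Sketch: convert the existing Parquet files in place to a Delta table.
# Assumes /mnt/lake/BASE/flights/Full/ already holds Parquet data.
from delta.tables import DeltaTable

DeltaTable.convertToDelta(spark, "parquet.`/mnt/lake/BASE/flights/Full/`")

# Afterwards the path can be read back as a Delta table:
df = spark.read.format("delta").load("/mnt/lake/BASE/flights/Full/")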

Solution 2:[2]

Such an error usually occurs when the target folder already contains data in another format, for example if you previously wrote Parquet or CSV files into it. Remove the folder completely and try again.
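As a minimal sketch of that approach, assuming nothing in the folder needs to be preserved and you are running in a Databricks notebook where dbutils is available:

# Sketch: delete the folder with its stale non-Delta files, then rewrite as Delta.
# Assumes /mnt/lake/BASE/flights/Full/ contains only data you are free to discard.
dbutils.fs.rm("/mnt/lake/BASE/flights/Full/", True)  # True = recursive delete

fulldf.write.format("delta").mode("overwrite").save("/mnt/lake/BASE/flights/Full/")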

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

[1] Solution 1: John Stud
[2] Solution 2: Alex Ott