IllegalArgumentException: File must be dbfs or s3n: /

dbutils.fs.mount(
    source = f"wasbs://{blob.storage_account_container}@{blob.storage_account_name}.blob.core.windows.net/",
    mount_point = "/mnt/MLRExtract/",
    extra_configs = {f"fs.azure.account.key.{blob.storage_account_name}.blob.core.windows.net": blob.storage_account_access_key})

While creating the mount point, I'm getting the error below:

IllegalArgumentException: File must be dbfs or s3n: /



Solution 1:

This IllegalArgumentException usually means something is wrong with the syntax of the mount call. You can follow the syntax below; I reproduced the same setup in my environment and it works fine.

Syntax:

spark.conf.set("fs.azure.account.key.<storage-account-name>.dfs.core.windows.net", dbutils.secrets.get(scope="<Scope-Name>",key="Key_Value"))

dbutils.fs.mount(
    source = "wasbs://<container-name>@<storage-account-name>.blob.core.windows.net/",
    mount_point = "/mnt/io24",  # note: no trailing slash, unlike /mnt/MLRExtract/ in the question
    extra_configs = {"fs.azure.account.key.<storage-account-name>.blob.core.windows.net": "<storage-account-access-key>"})
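If a previous failed attempt left the mount point registered, remounting typically fails with a "Directory already mounted" error. A minimal sketch, assuming the /mnt/io24 mount point from the example above, that unmounts a stale entry and verifies the container is readable after remounting:

# check whether /mnt/io24 is already mounted; unmount a stale entry first
if any(m.mountPoint == "/mnt/io24" for m in dbutils.fs.mounts()):
    dbutils.fs.unmount("/mnt/io24")

# after a successful dbutils.fs.mount call, confirm the container contents are visible
display(dbutils.fs.ls("/mnt/io24"))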

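Separately, the spark.conf.set line above sets the account key for the ADLS Gen2 (dfs) endpoint, which lets you read the container directly over abfss:// without any mount. A minimal sketch, assuming a hypothetical CSV file in the same container:

# direct access via the ABFS driver; container, account, and file path are placeholders
df = spark.read.csv(
    "abfss://<container-name>@<storage-account-name>.dfs.core.windows.net/sample/data.csv",
    header=True)
df.show()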

Reference:

https://docs.microsoft.com/en-us/azure/databricks/data/data-sources/azure/adls-gen2/azure-datalake-gen2-sp-access

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow