How to get the workspace name inside a Python notebook in Databricks

I am trying to get the workspace name inside a Python notebook. Is there any way to do this?

Ex: My workspace name is databricks-test. I want to capture this in a variable in the Python notebook.



Solution 1:[1]

Using the command below, you can get the workspace (organization) ID. Getting the workspace name itself, I think, is difficult.

spark.conf.get("spark.databricks.clusterUsageTags.clusterOwnerOrgId")

Solution 2:[2]

To get the workspace name (not the Org ID, which the other answer gives you), you can do it in one of two main ways:

spark.conf.get("spark.databricks.workspaceUrl")

which will give you the full URL; you can then split on the first dot, i.e.

spark.conf.get("spark.databricks.workspaceUrl").split('.')[0]

You could also get it in these two ways:

dbutils.notebook.entry_point.getDbutils().notebook().getContext() \
    .browserHostName().toString()

or

import json

json.loads(dbutils.notebook.entry_point.getDbutils().notebook() \
  .getContext().toJson())['tags']['browserHostName']
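
If it helps, here is a rough best-effort helper that tries the Spark conf first and falls back to the notebook context tags. It is only a sketch, not an official API, and assumes a Databricks notebook where both spark and dbutils are predefined:

import json

def get_workspace_host():
    # Preferred source: the Spark conf set by Databricks
    try:
        return spark.conf.get("spark.databricks.workspaceUrl")
    except Exception:
        pass
    # Fallback: the notebook context tags
    ctx = json.loads(
        dbutils.notebook.entry_point.getDbutils().notebook().getContext().toJson()
    )
    return ctx["tags"].get("browserHostName")

workspace_name = get_workspace_host().split(".")[0]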

Top tip: if you're ever wondering what Spark confs exist, you can get most of them as a list like this:

sc.getConf().getAll()
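
Since getAll() returns a long list of (key, value) tuples, you might filter it, for example (a small sketch):

# Print only the Databricks-related Spark confs
for key, value in sc.getConf().getAll():
    if "databricks" in key.lower():
        print(key, "=", value)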

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: Karthikeyan Rasipalay Durairaj
Solution 2: Alex Ott