How to detect Databricks environment programmatically

steven35 :

I'm writing a spark job that needs to be runnable locally as well as on Databricks.

The code has to differ slightly between the two environments (file paths), so I'm trying to find a way to detect whether the job is running on Databricks. The best approach I've found so far is to look for a "dbfs" directory in the root directory and, if it exists, assume the job is running on Databricks. This doesn't feel like the right solution. Does anyone have a better idea?

pathikrit :

You can simply check for the existence of an environment variable that Databricks sets on its clusters, e.g.:

def isRunningInDatabricks(): Boolean = 
  sys.env.contains("DATABRICKS_RUNTIME_VERSION")
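Building on that check, here is a minimal sketch of how you might branch the file paths between the two environments. The object name and the paths (`dbfs:/mnt/data/input`, `data/input`) are hypothetical placeholders for illustration:

```scala
object PathConfig {
  // True when the DATABRICKS_RUNTIME_VERSION env var is present,
  // which Databricks sets on its cluster nodes.
  def isRunningInDatabricks: Boolean =
    sys.env.contains("DATABRICKS_RUNTIME_VERSION")

  // Hypothetical example paths: DBFS mount on Databricks,
  // a relative local directory otherwise.
  def inputPath: String =
    if (isRunningInDatabricks) "dbfs:/mnt/data/input"
    else "data/input"
}
```

Your job can then read from `PathConfig.inputPath` without any environment-specific code at the call site.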
