Note
This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview.
Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions.
The workspace command group within the Databricks CLI allows you to list, import, export, and delete workspace files and folders. See What are workspace files?.
databricks workspace delete
Delete a workspace object.
Deletes an object or a directory, and optionally recursively deletes all objects in the directory. If path does not exist, this call returns the error RESOURCE_DOES_NOT_EXIST. If path is a non-empty directory and recursive is set to false, this call returns the error DIRECTORY_NOT_EMPTY.
Object deletion cannot be undone, and deleting a directory recursively is not atomic.
databricks workspace delete PATH [flags]
Arguments
PATH
The absolute path of the notebook or directory.
Options
--json JSON
The inline JSON string or the @path to the JSON file with the request body.
--recursive
The flag that specifies whether to delete the object recursively.
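For example, to recursively delete a workspace folder (the path is illustrative):

```shell
# Recursively delete a workspace folder and everything in it.
# This cannot be undone and is not atomic.
databricks workspace delete /Users/someone@example.com/old-project --recursive
```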
databricks workspace export
Export a workspace object or the contents of an entire directory.
If path does not exist, this call returns the error RESOURCE_DOES_NOT_EXIST. If the exported data exceeds the size limit, this call returns the error MAX_NOTEBOOK_SIZE_EXCEEDED. Currently, this API does not support exporting a library.
databricks workspace export SOURCE_PATH [flags]
Arguments
SOURCE_PATH
The absolute path of the object or directory. Exporting a directory is only supported for the DBC, SOURCE, and AUTO formats.
Options
--file string
Path on the local file system to save exported file at.
--format ExportFormat
The format of the exported file. Supported values: AUTO, DBC, HTML, JUPYTER, RAW, R_MARKDOWN, SOURCE
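For example, to export a notebook as source code to a local file (paths are illustrative):

```shell
# Export a workspace notebook in SOURCE format and save it locally
databricks workspace export /Users/someone@example.com/my-notebook \
  --format SOURCE --file ./my-notebook.py
```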
databricks workspace export-dir
Export a directory recursively from a Databricks workspace to the local file system.
databricks workspace export-dir SOURCE_PATH TARGET_PATH [flags]
Arguments
SOURCE_PATH
The source directory path in the workspace
TARGET_PATH
The target directory path on the local file system
Options
--overwrite
Overwrite existing local files.
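For example, to export an entire workspace folder to the local file system (paths are illustrative):

```shell
# Recursively export a workspace folder, overwriting any
# local files that already exist
databricks workspace export-dir /Users/someone@example.com/my-project ./my-project --overwrite
```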
databricks workspace get-status
Get the status of an object or a directory. If path does not exist, this call returns the error RESOURCE_DOES_NOT_EXIST.
databricks workspace get-status PATH [flags]
Arguments
PATH
The absolute path of the notebook or directory.
Options
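For example, to inspect a workspace object (the path is illustrative):

```shell
# Show the object type, language, and object ID of a workspace object.
# The returned object ID can be used with the permissions commands below.
databricks workspace get-status /Users/someone@example.com/my-notebook
```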
databricks workspace import
Imports a workspace object (for example, a notebook or file) or the contents of an entire directory. If path already exists and overwrite is set to false, this call returns the error RESOURCE_ALREADY_EXISTS. To import a directory, you can use either the DBC format or the SOURCE format with the language field unset. To import a single file as SOURCE, you must set the language field.
databricks workspace import TARGET_PATH [flags]
Arguments
TARGET_PATH
The absolute path of the object or directory. Importing a directory is only supported for the DBC and SOURCE formats.
Options
--content string
The base64-encoded content.
--file string
Path of the local file to import.
--format ImportFormat
The format of the file to be imported. Supported values: AUTO, DBC, HTML, JUPYTER, RAW, R_MARKDOWN, SOURCE
--json JSON
The inline JSON string or the @path to the JSON file with the request body.
--language Language
The language of the object. Supported values: PYTHON, R, SCALA, SQL
--overwrite
The flag that specifies whether to overwrite an existing object.
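For example, to import a local Python file as a notebook (paths are illustrative):

```shell
# Import a local Python file as a SOURCE-format notebook,
# overwriting any existing object at the target path
databricks workspace import /Users/someone@example.com/my-notebook \
  --file ./my-notebook.py --format SOURCE --language PYTHON --overwrite
```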
databricks workspace import-dir
Import a directory recursively from the local file system to a Databricks workspace.
Notebooks will have their extensions stripped.
databricks workspace import-dir SOURCE_PATH TARGET_PATH [flags]
Arguments
SOURCE_PATH
The source directory path on the local file system
TARGET_PATH
The target directory path in the workspace
Options
--overwrite
Overwrite existing workspace files.
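For example, to import a local folder into the workspace (paths are illustrative):

```shell
# Recursively import a local folder into the workspace;
# notebook files have their extensions stripped on import
databricks workspace import-dir ./my-project /Users/someone@example.com/my-project --overwrite
```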
databricks workspace list
List the contents of a directory, or the object itself if it is not a directory. If the input path does not exist, this call returns the error RESOURCE_DOES_NOT_EXIST.
databricks workspace list PATH [flags]
Arguments
PATH
The absolute path of the notebook or directory.
Options
--notebooks-modified-after int
Only return notebooks modified after this UTC timestamp in milliseconds.
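For example, to list only recently modified notebooks in a folder (the path and timestamp are illustrative):

```shell
# List notebooks under a folder that were modified after the given
# UTC timestamp in milliseconds
databricks workspace list /Users/someone@example.com --notebooks-modified-after 1700000000000
```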
databricks workspace mkdirs
Create the specified directory (and necessary parent directories if they do not exist). If there is an object (not a directory) at any prefix of the input path, this call returns the error RESOURCE_ALREADY_EXISTS.
Note that if this operation fails, it might have succeeded in creating some of the necessary parent directories.
databricks workspace mkdirs PATH [flags]
Arguments
PATH
The absolute path of the directory. If the parent directories do not exist, they are also created. If the directory already exists, this command does nothing and succeeds.
Options
--json JSON
The inline JSON string or the @path to the JSON file with the request body.
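For example, to create a nested directory (the path is illustrative):

```shell
# Create a nested directory, creating parent directories as needed
databricks workspace mkdirs /Users/someone@example.com/projects/etl/config
```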
databricks workspace get-permission-levels
Get workspace object permission levels.
databricks workspace get-permission-levels WORKSPACE_OBJECT_TYPE WORKSPACE_OBJECT_ID [flags]
Arguments
WORKSPACE_OBJECT_TYPE
The workspace object type for which to get or manage permissions.
WORKSPACE_OBJECT_ID
The workspace object for which to get or manage permissions.
Options
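For example, to list the permission levels available for a notebook (the object type value and object ID are illustrative; an object's ID can be retrieved with `databricks workspace get-status`):

```shell
# List the permission levels that can be granted on a notebook
databricks workspace get-permission-levels notebooks 1234567890
```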
databricks workspace get-permissions
Get the permissions of a workspace object. Workspace objects can inherit permissions from their parent objects or root object.
databricks workspace get-permissions WORKSPACE_OBJECT_TYPE WORKSPACE_OBJECT_ID [flags]
Arguments
WORKSPACE_OBJECT_TYPE
The workspace object type for which to get or manage permissions.
WORKSPACE_OBJECT_ID
The workspace object for which to get or manage permissions.
Options
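For example, to view the current permissions on a notebook (the object type value and object ID are illustrative):

```shell
# Show the direct and inherited permissions on a notebook
databricks workspace get-permissions notebooks 1234567890
```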
databricks workspace set-permissions
Set workspace object permissions.
Sets permissions on an object, replacing existing permissions if they exist. Deletes all direct permissions if none are specified. Objects can inherit permissions from their parent objects or root object.
databricks workspace set-permissions WORKSPACE_OBJECT_TYPE WORKSPACE_OBJECT_ID [flags]
Arguments
WORKSPACE_OBJECT_TYPE
The workspace object type for which to get or manage permissions.
WORKSPACE_OBJECT_ID
The workspace object for which to get or manage permissions.
Options
--json JSON
The inline JSON string or the @path to the JSON file with the request body.
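For example, to replace the direct permissions on a notebook with a single grant (the object ID and group name are illustrative):

```shell
# Replace all direct permissions on the notebook with one grant
databricks workspace set-permissions notebooks 1234567890 --json '{
  "access_control_list": [
    {"group_name": "data-engineers", "permission_level": "CAN_RUN"}
  ]
}'
```

Because set-permissions replaces existing direct permissions, any grant not included in the request body is removed.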
databricks workspace update-permissions
Update the permissions on a workspace object. Workspace objects can inherit permissions from their parent objects or root object.
databricks workspace update-permissions WORKSPACE_OBJECT_TYPE WORKSPACE_OBJECT_ID [flags]
Arguments
WORKSPACE_OBJECT_TYPE
The workspace object type for which to get or manage permissions.
WORKSPACE_OBJECT_ID
The workspace object for which to get or manage permissions.
Options
--json JSON
The inline JSON string or the @path to the JSON file with the request body.
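For example, to add or update a single grant without replacing the other direct permissions (the object ID and user name are illustrative):

```shell
# Add or update one grant; other direct permissions are left in place
databricks workspace update-permissions notebooks 1234567890 --json '{
  "access_control_list": [
    {"user_name": "someone@example.com", "permission_level": "CAN_EDIT"}
  ]
}'
```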
Global flags
--debug
Whether to enable debug logging.
-h
or --help
Display help for the Databricks CLI or the related command group or the related command.
--log-file string
The file to write output logs to. If this flag is not specified, output logs are written to stderr.
--log-format format
The log format type, text or json. The default value is text.
--log-level string
The log level. If this flag is not specified, log output is disabled.
-o, --output type
The command output type, text or json. The default value is text.
-p, --profile string
The name of the profile in the ~/.databrickscfg file to use to run the command. If this flag is not specified and a profile named DEFAULT exists, that profile is used.
--progress-format format
The format to display progress logs: default, append, inplace, or json.
-t, --target string
If applicable, the bundle target to use.