storage-credentials command group

Note

This information applies to Databricks CLI versions 0.205 and above. The Databricks CLI is in Public Preview.

Databricks CLI use is subject to the Databricks License and Databricks Privacy Notice, including any Usage Data provisions.

The storage-credentials command group within the Databricks CLI contains commands to manage storage credentials in Unity Catalog. A storage credential represents an authentication and authorization mechanism for accessing data stored on your cloud tenant. Each storage credential is subject to Unity Catalog access-control policies that control which users and groups can access the credential. If a user does not have access to a storage credential in Unity Catalog, the request fails and Unity Catalog does not attempt to authenticate to your cloud tenant on the user's behalf. See Manage storage credentials.

databricks storage-credentials create

Create a new storage credential.

The caller must be a metastore admin or have the CREATE_STORAGE_CREDENTIAL privilege on the metastore.

databricks storage-credentials create NAME [flags]

Arguments

NAME

    The credential name. The name must be unique among storage and service credentials within the metastore.

Options

--comment string

    Comment associated with the credential.

--json JSON

    The inline JSON string or the @path to the JSON file with the request body.

--read-only

    Whether the credential is usable only for read operations.

--skip-validation

    If set, skips validation of the created credential.

Global flags

Examples

The following example creates a new storage credential:

databricks storage-credentials create my-storage-credential

The following example creates a storage credential with a comment:

databricks storage-credentials create my-storage-credential --comment "S3 credential for analytics data"

The following example creates a read-only storage credential:

databricks storage-credentials create my-storage-credential --read-only
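
The following example creates a storage credential whose cloud-specific configuration is passed as a JSON request body. The storage-credential.json file name is illustrative; its contents should supply the credential fields for your cloud provider (for example, an AWS IAM role ARN):

databricks storage-credentials create my-storage-credential --json @storage-credential.json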

databricks storage-credentials delete

Delete a storage credential from the metastore. The caller must be an owner of the storage credential.

databricks storage-credentials delete NAME [flags]

Arguments

NAME

    Name of the storage credential.

Options

--force

    Force deletion even if there are dependent external locations or external tables (when purpose is STORAGE) or dependent services (when purpose is SERVICE).

Global flags

Examples

The following example deletes a storage credential:

databricks storage-credentials delete my-storage-credential

The following example force deletes a storage credential:

databricks storage-credentials delete my-storage-credential --force

databricks storage-credentials get

Get a storage credential from the metastore. The caller must be a metastore admin, the owner of the storage credential, or have some permission on the storage credential.

databricks storage-credentials get NAME [flags]

Arguments

NAME

    Name of the storage credential.

Options

Global flags

Examples

The following example gets information about a storage credential:

databricks storage-credentials get my-storage-credential
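
The following example gets information about a storage credential and returns the output as JSON, using the global --output flag:

databricks storage-credentials get my-storage-credential --output json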

databricks storage-credentials list

List storage credentials. The list is limited to only those storage credentials the caller has permission to access. If the caller is a metastore admin, retrieval of credentials is unrestricted. There is no guarantee of a specific ordering of the elements in the array.

databricks storage-credentials list [flags]

Options

--max-results int

    Maximum number of storage credentials to return.

--page-token string

    Opaque pagination token used to retrieve the next page of results, based on the previous query.

Global flags

Examples

The following example lists all storage credentials:

databricks storage-credentials list
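
The following example limits the number of storage credentials returned per page. If the response includes a page token, pass it to --page-token to retrieve the next page:

databricks storage-credentials list --max-results 10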

databricks storage-credentials update

Update a storage credential on the metastore.

The caller must be the owner of the storage credential or a metastore admin. If the caller is a metastore admin, only the owner field can be changed.

databricks storage-credentials update NAME [flags]

Arguments

NAME

    Name of the storage credential.

Options

--comment string

    Comment associated with the credential.

--force

    Force update even if there are dependent external locations or external tables.

--isolation-mode IsolationMode

    Whether the current securable is accessible from all workspaces or a specific set of workspaces. Supported values: [ISOLATION_MODE_ISOLATED, ISOLATION_MODE_OPEN]

--json JSON

    The inline JSON string or the @path to the JSON file with the request body.

--new-name string

    New name for the storage credential.

--owner string

    Username of the current owner of the credential.

--read-only

    Whether the credential is usable only for read operations.

--skip-validation

    If set, skips validation of the updated credential.

Global flags

Examples

The following example updates a storage credential's comment:

databricks storage-credentials update my-storage-credential --comment "Updated S3 credential"

The following example changes the owner of a storage credential:

databricks storage-credentials update my-storage-credential --owner someone@example.com

The following example renames a storage credential:

databricks storage-credentials update my-storage-credential --new-name updated-credential
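
The following example sets the isolation mode so that the storage credential is accessible only from a specific set of workspaces:

databricks storage-credentials update my-storage-credential --isolation-mode ISOLATION_MODE_ISOLATED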

databricks storage-credentials validate

Validate a storage credential. At least one of external_location_name and url must be provided. If only one is provided, it is used for validation. If both are provided, url is used for validation and external_location_name is ignored when checking for overlapping URLs.

Either the storage_credential_name or the cloud-specific credential must be provided.

The caller must be a metastore admin or the storage credential owner or have the CREATE_EXTERNAL_LOCATION privilege on the metastore and the storage credential.

databricks storage-credentials validate [flags]

Options

--external-location-name string

    The name of an existing external location to validate.

--json JSON

    The inline JSON string or the @path to the JSON file with the request body.

--read-only

    Whether the storage credential is only usable for read operations.

--storage-credential-name string

    Required. The name of the storage credential to validate.

--url string

    The external location url to validate.

Global flags

Examples

The following example validates a storage credential against an external location:

databricks storage-credentials validate --storage-credential-name my-storage-credential --external-location-name my-external-location

The following example validates a storage credential against a URL:

databricks storage-credentials validate --storage-credential-name my-storage-credential --url s3://my-bucket/path
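
The following example validates that a storage credential can be used for read-only access to the given URL:

databricks storage-credentials validate --storage-credential-name my-storage-credential --url s3://my-bucket/path --read-only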

Global flags

--debug

    Whether to enable debug logging.

-h, --help

    Display help for the Databricks CLI, the related command group, or the related command.

--log-file string

    The file to write output logs to. If this flag is not specified, output logs are written to stderr.

--log-format format

    The log format type, text or json. The default value is text.

--log-level string

    A string representing the log level. If this flag is not specified, log output is disabled.

-o, --output type

    The command output type, text or json. The default value is text.

-p, --profile string

    The name of the profile in the ~/.databrickscfg file to use to run the command. If this flag is not specified, the profile named DEFAULT is used, if it exists.

--progress-format format

    The format to display progress logs: default, append, inplace, or json.

-t, --target string

    If applicable, the bundle target to use.