---
subcategory: "Storage"
---
-> **Note** If you have a fully automated setup with workspaces created by `databricks_mws_workspaces` or `azurerm_databricks_workspace`, please make sure to add the `depends_on` attribute in order to prevent _default auth: cannot configure default credentials_ errors.
This data source allows you to get a list of file names from Databricks File System (DBFS).
```hcl
data "databricks_dbfs_file_paths" "partitions" {
  path      = "dbfs:/user/hive/default.db/table"
  recursive = false
}
```
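Following the note above, a minimal sketch of adding `depends_on` directly on the data source; the workspace resource address `databricks_mws_workspaces.this` is a hypothetical placeholder for whatever resource creates your workspace:

```hcl
data "databricks_dbfs_file_paths" "partitions" {
  path      = "dbfs:/user/hive/default.db/table"
  recursive = false

  # Hypothetical workspace resource name - ensures the workspace
  # exists before this data source attempts to authenticate.
  depends_on = [databricks_mws_workspaces.this]
}
```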
- `path` - (Required) Path on DBFS to perform the listing on.
- `recursive` - (Required) Whether or not to recursively list all files.
This data source exports the following attributes:

- `path_list` - returns a list of objects, each with `path` and `file_size` attributes.
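For illustration, `path_list` can be consumed elsewhere in a configuration, for example to expose just the file paths via an output (the output name here is arbitrary):

```hcl
output "partition_files" {
  # Each element of path_list is an object with `path` and `file_size`.
  value = [for p in data.databricks_dbfs_file_paths.partitions.path_list : p.path]
}
```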
The following resources are used in the same context:
- End to end workspace management guide.
- databricks_dbfs_file data to get file content from Databricks File System (DBFS).
- databricks_dbfs_file_paths data to get a list of file names from Databricks File System (DBFS).
- databricks_dbfs_file to manage relatively small files on Databricks File System (DBFS).
- databricks_library to install a library on databricks_cluster.
- databricks_mount to mount your cloud storage on `dbfs:/mnt/name`.