Problem Statement
Being able to work with both files and folders is a fairly common pattern. Currently the Files API exposes an upload method that supports file-by-file upload only. It would be great to be able to point it at a folder instead. This is already possible with the `databricks fs cp` command from the CLI today:
`databricks fs cp --recursive ./data dbfs:/Volumes/....` However, it does not seem to be supported in the SDK yet.
Proposed Solution
Either make the upload method at https://github.com/databricks/databricks-sdk-py/blob/main/databricks/sdk/service/files.py#L668 agnostic to files vs. directories, or add an upload-folder function that wraps the existing upload-file one.
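For illustration, a wrapper along the lines of the second option could look like the sketch below. It walks a local directory and calls a per-file upload callback for each file, mirroring the relative layout on the remote side. The `upload` callback, the function name, and the example remote path are all assumptions here, not part of the SDK; with the real SDK the callback would wrap the existing single-file upload method.

```python
from pathlib import Path
from typing import BinaryIO, Callable, List


def upload_directory(
    local_dir: str,
    remote_dir: str,
    upload: Callable[[str, BinaryIO], None],
) -> List[str]:
    """Recursively upload every file under local_dir to remote_dir.

    `upload` is a hypothetical callback taking (remote_path, file_object);
    with the SDK it could delegate to the existing single-file upload.
    Returns the list of remote paths that were written.
    """
    root = Path(local_dir)
    uploaded = []
    for path in sorted(root.rglob("*")):
        if path.is_file():
            # Mirror the relative layout under remote_dir, using POSIX
            # separators for the remote (Volumes/DBFS) side.
            relative = path.relative_to(root).as_posix()
            remote_path = f"{remote_dir.rstrip('/')}/{relative}"
            with path.open("rb") as f:
                upload(remote_path, f)
            uploaded.append(remote_path)
    return uploaded
```

Usage would then be a one-liner such as `upload_directory("./data", "/Volumes/catalog/schema/vol/data", upload=my_upload_fn)`, where `my_upload_fn` forwards to the SDK's file upload; keeping the per-file call behind a callback leaves room for overwrite flags or parallelism later.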