Cannot write dataframe to Databricks Unity Catalog table #3397
Comments
@edgararuiz, do you happen to know anything regarding this? :)
Hi,
@edgararuiz thanks for the response. Yes, I want to create a permanent table in Unity Catalog. Do you know which method I should use for that? I have not been able to locate the correct one myself. I need to be able to specify which catalog and schema the table should be created in, like using the
@edgararuiz sorry for pinging you again, but do you have any updates or ideas?
Hi @Zurina & @edgararuiz
A workaround might be:
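The workaround snippet itself did not come through, but one common shape for this kind of workaround is to copy the data frame into a temporary Spark view and then promote it to a Unity Catalog table with a SQL statement. The sketch below is an assumption, not the commenter's original code; the connection method, data frame, and target table names are all illustrative:

```r
library(sparklyr)
library(DBI)

# Illustrative connection; assumes Databricks Connect is configured
# (host, cluster, and token supplied via the usual environment variables).
sc <- spark_connect(method = "databricks_connect")

# Stage the local data frame as a temporary Spark view.
copy_to(sc, my_df, "my_tmp_view", overwrite = TRUE)

# Promote the view to a permanent Unity Catalog table using a fully
# qualified catalog.schema.table name. sparklyr connections implement
# the DBI interface, so dbExecute() can run the SQL.
dbExecute(sc, "CREATE OR REPLACE TABLE main.default.my_table2 AS SELECT * FROM my_tmp_view")
```

The key point is that the `CREATE TABLE ... AS SELECT` statement carries the full three-part name, so the catalog and schema are explicit even though `copy_to()` itself only created a temporary view.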
@cocinerox, thanks for your input. I agree that part should work. Your workaround definitely works, but I hope this will eventually be possible in native R :)
@Zurina, a "native" R solution:
where
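The "native" R snippet was elided above, so the following is only a hedged sketch of what such a solution could look like. It assumes `spark_write_table()` accepts a fully qualified Unity Catalog target, here expressed with `in_catalog()` from dbplyr; the data frame and table names are illustrative:

```r
library(sparklyr)
library(dbplyr)   # provides in_catalog() for catalog.schema.table names

# Illustrative connection via Databricks Connect (assumed configured).
sc <- spark_connect(method = "databricks_connect")

# Stage the local data frame as a Spark DataFrame.
sdf <- copy_to(sc, my_df, "my_tmp_view", overwrite = TRUE)

# Write it as a permanent table, addressing the Unity Catalog target
# with an explicit catalog.schema.table name.
spark_write_table(
  sdf,
  name = in_catalog("main", "default", "my_table2"),
  mode = "overwrite"
)
```

If `in_catalog()` is not accepted by the installed sparklyr version, a plain `I("main.default.my_table2")` string is the other form such a call might take; both are assumptions here, not confirmed API behavior.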
Morning, the latest version of
@edgararuiz It works for me. Thanks!
Hi,
I am unable to write to a table in Databricks Unity Catalog, although I can read data from catalogs and schemas without problems. I am using Databricks Connect for Python, and I get the same result whether I authenticate with an Azure token or a PAT. For example, this code:
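For illustration, a read of this shape (connection method and table names are assumed, not taken from the original report):

```r
library(sparklyr)
library(dplyr)
library(dbplyr)   # provides in_catalog()

# Illustrative connection via Databricks Connect; host, cluster, and
# token are assumed to come from the usual environment variables.
sc <- spark_connect(method = "databricks_connect")

# Reading from a Unity Catalog table with a catalog.schema.table
# reference, then pulling the result into R.
df <- tbl(sc, in_catalog("main", "default", "my_table")) |>
  collect()
```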
The above works well, but I seem to be unable to write data. I have tried the following:
sparklyr::copy_to(sc, my_table, in_catalog("main", "default", "my_table2"))
I receive:
Using:
Any ideas on how I can write to a specific table in Unity Catalog using the catalog.schema.table path format?