[BUG] .databricks.env not updating after extension configuration target change #1476
Comments
Because of the above, the
Thanks for the response, and apologies for the delay in getting back to you. What I am trying to accomplish is to use the Python databricks-sdk to programmatically query some of the APIs. Previously, the databricks-vscode plugin was a solution for authentication: it manages U2M tokens and starts the metadata service. It still does those things (from what I can tell), but now there is no access to the metadata-service URL from within the Python environment when running locally. My understanding of Databricks Connect is that it allows you to submit jobs (queries, notebook code, etc.) to a Databricks cluster from a local client (vscode) instead of the web interface. Since what I'm doing doesn't require Spark or a running cluster, why am I limited to running the code only with Databricks Connect?
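For context, the SDK picks up its credentials from environment variables, so the question above hinges on whether `DATABRICKS_METADATA_SERVICE_URL` is visible to the Python process at all. A minimal stdlib-only sketch of that check (the function name `metadata_service_url` is mine, not part of any Databricks API; the two variable names come from this issue):

```python
import os

def metadata_service_url():
    """Return the metadata-service URL the Databricks SDK would pick up,
    or None if the extension has not exported it to this environment."""
    # DATABRICKS_METADATA_SERVICE_URL is the variable named in this issue;
    # DATABRICKS_HOST is checked too, since both must agree after a target switch.
    url = os.environ.get("DATABRICKS_METADATA_SERVICE_URL")
    host = os.environ.get("DATABRICKS_HOST")
    if url is None or host is None:
        return None
    return url

if __name__ == "__main__":
    print(metadata_service_url() or "metadata-service URL not set in this terminal")
```

Running this in a freshly reactivated terminal would confirm whether the stale-`.env` symptom described below is what breaks the SDK auth.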
"Run current file with Databricks Connect" does the same things the old extension did with the Python "run" button. It is now just a separate, explicitly named action, although indeed the name is too specific. You can freely use it to run Python code that doesn't use dbconnect; it will still work and provide you with the auth. As a possibly more fitting alternative, you can create a
The launch config, I think, works closest to what I want to do (thanks for that!), but I am now running into issues with authentication when running pytest. I had previously been relying on the variables added to
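One way to restore those variables for pytest, without depending on terminal reactivation, would be to parse `.databricks/.databricks.env` and inject its assignments before tests run. A sketch under two assumptions from this issue (the file path, and a simple `KEY=VALUE` line format — neither is a documented contract):

```python
import os
from pathlib import Path

def load_databricks_env(env_file=Path(".databricks/.databricks.env")):
    """Parse simple KEY=VALUE lines from the extension-managed env file
    and inject them into os.environ. Returns the variables loaded."""
    loaded = {}
    if not env_file.exists():
        return loaded
    for line in env_file.read_text().splitlines():
        line = line.strip()
        # skip blanks, comments, and anything that isn't an assignment
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        loaded[key.strip()] = value.strip().strip('"')
        os.environ[key.strip()] = loaded[key.strip()]
    return loaded

# hypothetical conftest.py usage: load before tests read auth config
# def pytest_configure(config):
#     load_databricks_env()
```

This only papers over the staleness bug, of course: it loads whatever is currently in the file, stale URL included.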
Describe the bug
my project uses metadata-service to authenticate, but when i start a new vscode session and the terminals reactivate, the new `DATABRICKS_METADATA_SERVICE_URL` is not propagated to `.databricks/.databricks.env`. i am working around this by deleting the .env file and reinitiating the extension configuration. `DATABRICKS_BUNDLE_TARGET` and `DATABRICKS_HOST` aren't being updated when switching targets, either.
additionally, vscode is not picking up `.databricks/.databricks.env` variable declarations when setting the environment variables for a python terminal. i am working around that by copying the file down to the project root.
To Reproduce
Steps to reproduce the behavior:
the workaround to get the new url into the .env file is to:
1. remove `"databricks.python.envFile": "${workspaceFolder}/.env",` from `.vscode/settings.json`. skipping this step restores the previous version of the .env file, with the inactive metadata-service url.
2. delete `.databricks/.databricks.env`.
3. in the `databricks` extension pane, under `configuration`, select a new `target` to initiate a change to the host and bundle target.
4. copy `.databricks/.databricks.env` to `.env`, and the env file in the project root gets picked up by vscode.

another weird symptom is that the config setting in the docs is not recognized by the vscode settings linter. i don't know if this is related.
full output of the python terminal below. `53566` and `fd53d2ad-ee05-437d-8dbe-b12383b84c4e` are the port and random uuid from the previous vscode session's terminals.

if we do have the correct metadata-service url, changing the target environment without manually refreshing the .env files returns the following:
Help: About
Databricks Extension Version: v2.4.8