# gooddata-cloud
Toan
Hi Team, we are trying to clean up our GoodData workspaces. We have many orphan workspaces and would like to delete them programmatically. The question is: how can we get the `updated_at` information for a workspace using the GoodData Python SDK? Thank you!
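For context, this is roughly the cleanup flow we have in mind. It is only a sketch: the host and token are placeholders, and `is_orphan` is a hypothetical stand-in for the check we would build on `updated_at` if the SDK exposed it.

```python
from gooddata_sdk import GoodDataSdk

# Placeholders: replace with your GoodData Cloud host and API token.
sdk = GoodDataSdk.create("https://example.gooddata.com", "my_api_token")

def is_orphan(workspace) -> bool:
    # Hypothetical check; this is where an `updated_at` timestamp
    # would be used if the SDK exposed one.
    return False

for workspace in sdk.catalog_workspace.list_workspaces():
    if is_orphan(workspace):
        # delete_workspace permanently removes the workspace.
        sdk.catalog_workspace.delete_workspace(workspace.id)
```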
Moises
Hi Toan, I am afraid this metadata is not available on the workspace. May I know the reason you need to retrieve it and how it relates to the cleanup?
Toan
@Moises Morales I would like to clean up orphan workspaces. I have found a way to solve the original problem and have run my script successfully on my local machine. Now I have a problem running it in a serverless Databricks notebook: the call `workspaces = sdk.catalog_workspace.list_workspaces()` raises an error while decoding the response data, as follows:

```
File /local_disk0/.ephemeral_nfs/envs/pythonEnv-ee17b0fe-cded-41ba-a8d6-cf6bd632128e/lib/python3.11/site-packages/gooddata_api_client/api_client.py:225, in ApiClient.__call_api(self, resource_path, method, path_params, query_params, header_params, body, post_params, files, response_type, auth_settings, _return_http_data_only, collection_formats, _preload_content, _request_timeout, _host, _check_type, _content_type, _request_auths)
    223 if match:
    224     encoding = match.group(1)
--> 225 response_data.data = response_data.data.decode(encoding)
    227 return_data = self.deserialize(
    228     response_data,
    229     response_type,
    230     _check_type
    231 )
    232 else:
```
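For completeness, this is the minimal snippet that triggers it (host and token are placeholders, as above):

```python
from gooddata_sdk import GoodDataSdk

# Placeholders: replace with your GoodData Cloud host and API token.
sdk = GoodDataSdk.create("https://example.gooddata.com", "my_api_token")

# Runs fine on my local machine; in the serverless Databricks notebook
# it fails inside the generated API client with the decode error above.
workspaces = sdk.catalog_workspace.list_workspaces()
print(f"found {len(workspaces)} workspaces")
```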
Moises
I have tested this on my own serverless Databricks instance and ran into the same error, despite trying a few ways to pass UTF-8 as a hardcoded encoding. All in all, it seems the serverless network layer ends up modifying the response. You mentioned that you were able to run the script locally; may I know the use case for which you are trying to run it this way? Have you also tried using a standard cluster type instead?
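In the meantime, one way to check that hypothesis is to compare the raw response headers between environments. This is only a sketch, assuming the GoodData Cloud REST endpoint `/api/v1/entities/workspaces` with bearer-token auth; the host and token are placeholders:

```python
import requests

# Placeholders: your GoodData Cloud host and API token.
HOST = "https://example.gooddata.com"
TOKEN = "my_api_token"

# Fetch the workspaces endpoint directly and inspect how the
# response encoding is advertised in each environment.
resp = requests.get(
    f"{HOST}/api/v1/entities/workspaces",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
print(resp.status_code)
print(resp.headers.get("Content-Type"))  # charset the server/proxy claims
print(resp.encoding)                     # what requests inferred from it
```

If the `Content-Type` charset differs between your local machine and the serverless notebook, that would confirm the response is being rewritten in transit.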