# gooddata-cn
n
Hi GoodData! We noticed that our filters that use dimension tables were not filtering the data properly. Although the fact tables were being filtered correctly, all entries in the dimension tables were visible to everyone across workspaces. All tables and data in our database were dropped, but after the migration the table schema remained the same. After the last database update, we called the api/v1/actions/dataSources/{}/uploadNotification endpoint. Do you have any tips on what could have caused this issue?
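For reference, this is roughly how we call the endpoint from our pipeline; the host, token, and data source id below are placeholders, and the POST method with bearer-token auth is our assumption:

```python
import requests

# Placeholders for our environment.
GOODDATA_HOST = "https://gooddata-cn.example.com"
API_TOKEN = "<api-token>"
DATA_SOURCE_ID = "<data-source-id>"

# Notify GoodData CN that new data was loaded into the data source.
response = requests.post(
    f"{GOODDATA_HOST}/api/v1/actions/dataSources/{DATA_SOURCE_ID}/uploadNotification",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
)
response.raise_for_status()
```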
j
Are you aware of the concept of the Physical Data Model (PDM)? If you change the relational model in your database, you have to re-scan it in GoodData. We cache the model on our side so that we do not query DB catalogs too often (it is costly, especially on clustered DBs like Snowflake).
n
@Jan Soubusta, thank you for your response. In this case, we removed the tables, but when we applied the migration, all the tables and columns remained the same. However, something went wrong, and the WDF_fields were not respected, even though they existed in the tables. Regarding automation, which command or API should we use to correctly rescan the data source?
j
Hm, I am wrong, sorry
Basically, there is a dropdown (picker) in the left panel of the modelling UI application, containing the list of available data sources.
A popup with an explanatory message is displayed when you hover over the button next to it.
The button re-scans the data source and refreshes the PDM.
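For reference, the same rescan should be reachable over the REST API as well; a rough sketch, assuming your GoodData CN version exposes the scan action at /api/v1/actions/dataSources/{dataSourceId}/scan (the request fields shown are illustrative, please check the API reference of your deployment):

```python
import requests

# Placeholders; adjust to your environment.
GOODDATA_HOST = "https://gooddata-cn.example.com"
API_TOKEN = "<api-token>"
DATA_SOURCE_ID = "<data-source-id>"

# Assumed scan action: re-reads the database catalog for the data source.
response = requests.post(
    f"{GOODDATA_HOST}/api/v1/actions/dataSources/{DATA_SOURCE_ID}/scan",
    headers={
        "Authorization": f"Bearer {API_TOKEN}",
        "Content-Type": "application/json",
    },
    # Illustrative payload: which DB objects to scan and the column-name separator.
    json={"scanTables": True, "scanViews": False, "separator": "__"},
)
response.raise_for_status()
print(response.json())  # detected tables and columns
```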
n
Yes, but we want to do this via an API or something, not in the UI, because we have a pipeline that sends the data and then we want to trigger the rescan. Basically, we want this "re-scans the data source and refreshes the PDM" step done with commands or an API.
j
I recommend using our Python SDK.
Here is an open-source repository containing an end-to-end demo: https://gitlab.com/patrikbraborec/gooddata-data-pipeline
The main README contains a description of onboarding and links to articles.
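For your pipeline use case, here is a minimal sketch with the gooddata-sdk package; the host, token, and data source id are placeholders, and the method names (scan_data_source, register_upload_notification) are assumed from the SDK version I have at hand, so please double-check them against the SDK docs:

```python
from gooddata_sdk import GoodDataSdk

# Placeholders; adjust to your environment.
HOST = "https://gooddata-cn.example.com"
TOKEN = "<api-token>"
DATA_SOURCE_ID = "<data-source-id>"

sdk = GoodDataSdk.create(HOST, TOKEN)

# Re-scan the database catalog so GoodData picks up the current tables and columns
# (assumed method; see the catalog_data_source service in your SDK version).
scan_result = sdk.catalog_data_source.scan_data_source(DATA_SOURCE_ID)
print(scan_result)

# After each data load, notify GoodData that the data changed so caches are invalidated.
sdk.catalog_data_source.register_upload_notification(DATA_SOURCE_ID)
```

In a pipeline you would typically run the rescan step after schema migrations and the upload notification after every data load.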
n
Thanks for your response, @Jan Soubusta, I will take a look at this!