# gooddata-platform
Hi Guys, I’ve added an additional detail here about the errors when exporting/importing the whole layout, could you please help us out here? Thanks https://gooddataconnect.slack.com/archives/C01USCF4S10/p1659093059278549
There are two models in GoodData:
- Physical Data Model (PDM): it is a part of data sources.
- Logical Data Model (LDM): it is a part of workspaces. It contains the mapping from datasets to tables and from facts/attributes to table columns.

Most likely you forgot to remove the column here. We have two models to allow users to decouple the physical model from analytics objects (metrics, insights). When you make a breaking change in the physical model, you only need to refactor the logical model (e.g. change the name of a mapped column); you do not need to update any metrics, insights, or dashboards.
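To illustrate the decoupling, a declarative LDM dataset (the YAML form stored by the python-sdk) maps a logical dataset to a physical table roughly like this. The table and column names below are hypothetical, and the exact schema should be checked against your own export:

```yaml
# Hypothetical declarative LDM fragment (names are illustrative)
datasets:
  - id: table
    dataSourceTableId:
      id: myTable             # physical table in the data source (PDM)
    attributes:
      - id: table.status
        sourceColumn: status  # if the column is renamed in the database,
                              # only this mapping needs to change;
                              # metrics keep referencing {label/table.status}
```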
Thanks for the info! I see. I’ve seen different errors which may be related to what you mentioned, but I don’t know how I can fix them
for example:
```json
"metrics": [
  {
    "content": {
      "format": "#,##0",
      "maql": "SELECT COUNT({label/table.requestid}) WHERE {label/table.status} = \"ALLOWED\""
    },
    "description": "allowed_sdk",
    "id": "allallowed",
    "title": "CHECK - allowed"
  }
]
```
we already deleted this metric in the UI, but I also need to delete it by hand from the exported JSON
I need to understand your expectations and what you want to achieve. You work in our UI apps. You also export the model (LDM, workspace, all workspaces, whole organization) into JSON file(s).

- Do you want to import the JSON to another environment? Regularly? Why?
- Do you want to version the JSON files in git?
- Do you want to regularly sync the state of the GoodData environment (what users do in the UI apps) with JSON files?

Generally, I strongly recommend utilizing our python-sdk:
- repo: https://github.com/gooddata/gooddata-python-sdk
- docs: https://gooddata-sdk.readthedocs.io/en/latest/

Specifically, there are declarative methods for storing an equivalent of the JSON files to disk and loading them back into any environment. In this case, we store YAML files in a directory structure, so it is more convenient for developers to edit them manually if needed. Additionally, my colleague @Patrik Braborec wrote a very nice article on how to manage metadata with python-sdk and how to build a CI/CD pipeline: https://medium.com/gooddata-developers/how-to-automate-data-analytics-using-ci-cd-9f1475065d61

Let me know what you want to achieve and we will surely find a solution.
> You work in our UI apps.
> You also export the model (LDM, workspace, all workspaces, whole organization) into JSON file(s).
> Do you want to import the JSON to another environment? Regularly? Why?
Yep, we have 3 environments and we would like to keep all 3 with the same charts and visualizations. There is QA, where we build the charts in GoodData.CN; then we would like to export the whole thing, test it in DEV, and later go to PROD with it. We’re using just the API (organization export GET) and would like to import into another environment with a PUT (of course after changing the needed login/access fields). Should we do it differently in order to avoid this issue/error?
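A minimal sketch of that GET/change-secure-fields/PUT flow in plain Python. Assumptions: the export shape follows `/api/v1/layout/organization` with a top-level `dataSources` list, secure fields (passwords/tokens) are omitted from the export and must be re-inserted before import, and the API accepts a bearer token; adjust to your actual environment:

```python
import copy
import json
import urllib.request


def inject_secure_fields(layout: dict, secrets: dict) -> dict:
    """Return a copy of an exported org layout with data source credentials
    re-inserted, since secure fields are not included in the export.

    secrets maps a data source id to the fields to merge in,
    e.g. {"pg": {"password": "..."}} (ids and field names are examples).
    """
    fixed = copy.deepcopy(layout)
    for ds in fixed.get("dataSources", []):
        if ds.get("id") in secrets:
            ds.update(secrets[ds["id"]])
    return fixed


def put_layout(host: str, api_token: str, layout: dict) -> None:
    """PUT the (fixed) layout to the target environment."""
    req = urllib.request.Request(
        f"{host}/api/v1/layout/organization",
        data=json.dumps(layout).encode(),
        method="PUT",
        headers={
            "Authorization": f"Bearer {api_token}",
            "Content-Type": "application/json",
        },
    )
    urllib.request.urlopen(req)  # raises on HTTP errors
```

The point is to keep the credential injection as a pure, testable step, separate from the HTTP call, so each environment only differs by its `secrets` mapping.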
It has to work with the APIs too; it is just more convenient to use the python-sdk. Also, org export/import is IMO not a good way: you export and import not only workspaces, but also data sources, users, user groups, and permissions. Usually there are different users and permissions in DEV compared to PROD. Often the data sources are different too (dev vs. prod database).

But if you really want to copy everything, org export/import is the right way. You always have to manage the secure fields (re-ingest passwords). If the source DEV environment is fully working (consistent) and you export it, the import should work. E.g., if you delete a table from a data source, delete everything related to it, and everything works without error in DEV, then org export/import should work without error.
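Since the org export carries everything, it can help to review what an export actually contains before PUTting it into another environment. A small sketch, assuming the export's top-level sections are lists such as `workspaces`, `dataSources`, `users`, and `userGroups` (verify against your own export):

```python
def summarize_org_layout(layout: dict) -> dict:
    """Count the entities in each list-valued top-level section of an
    organization export, so you can see what an import would overwrite
    (e.g. users and permissions that differ between DEV and PROD)."""
    return {
        key: len(value)
        for key, value in layout.items()
        if isinstance(value, list)
    }
```

Printing this summary before every import is a cheap sanity check that you are not about to replace PROD users with DEV ones.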
Regarding the error you observe - did you really export the whole organization before importing it?
yep, just full export and import without any changes in the workspace scope
using the /api/v1/layout/organization GET & PUT
OK. That means that the source organization is broken. It seems that you re-scanned the data source and the column myTable.myField is no longer there, but there is an entity in the LDM mapped to this column. Please, can you go to the workspace mentioned in the error message, open the "Data" section (the LDM Modeler app), and check if there is any error? Do you know which LDM dataset was mapped to the table mentioned in the error?
Hi @Jan Soubusta Thanks for the answer! The strange thing is that we started from a clean source organization: we cleaned up the JSON, reloaded it without any error, and then it worked. After some work was done in the application, we exported and imported again and the errors appeared again. And it’s not only about the LDM; it’s also about deleted metrics. If we check the Data part, there is no error, I think.
I see. Generally, it is possible to delete an object with dependencies (other objects depend on it). AFAIK we always warn users when they try to do something like this, e.g. "Do you really want to delete this metric? These insights/dashboards depend on it!" But if users know what they are doing - e.g. they want to delete it, start from scratch, and the final state will be valid - then we allow them to delete such objects and create a not fully valid model. May I ask for your ideas about how this should ideally work for you?
Yes, it’s totally okay and I don’t have a better idea here, but in this case the underlying descriptor and the exported JSON will be invalid from the application's point of view, and can’t be imported again without errors. My question would be: is there any tool which can clean up this invalid JSON (remove unreferenced/invalid labels, metrics, etc.)?
Would you be ok to use a tool written in Python?
Or to be more specific - could you run `pip install gooddata-sdk` in your environment(s)?
yep, sure, did it on my Mac. Do you have suggestions on which command I should use now? Thanks
We are discussing this topic internally, I will answer ASAP.
As of now, unfortunately, we do not have any production-ready solution for this use case. We know about this issue and we know how to solve it. I will try to prioritize the delivery of the solution in our roadmap. I can't guarantee that it will be part of the next GoodData.CN release, which should happen on 2022-09-08; after that we should release every 10 weeks. The solution will be released sooner in our cloud (hosted) offering: https://www.gooddata.com/trial/ (releases every 2 weeks).
Okay, thank you!
Meanwhile, I will try to create a simple workaround using our python-sdk, because I need it too 😉 Will keep you posted.
great, it would be good to have a script which can clean up the JSON from dead references
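Until an official tool exists, a first pass at such a cleanup can be sketched in plain Python. This sketch assumes metrics sit in a `"metrics"` list with their MAQL under `content["maql"]` (as in the snippet earlier in the thread), and that MAQL references objects as `{label/<id>}`, `{attribute/<id>}`, `{fact/<id>}`, or `{metric/<id>}`; all of these paths should be checked against your actual export:

```python
import re

# Matches MAQL object references like {label/table.status}
REF_RE = re.compile(r"\{(label|attribute|fact|metric)/([^}]+)\}")


def drop_dead_metrics(analytics: dict, known_ids: dict) -> dict:
    """Remove metrics whose MAQL references ids that no longer exist.

    known_ids maps a reference type ("label", "fact", ...) to the set of
    valid ids of that type still present in the workspace.
    """
    alive = []
    for metric in analytics.get("metrics", []):
        maql = metric.get("content", {}).get("maql", "")
        refs = REF_RE.findall(maql)
        if all(obj_id in known_ids.get(kind, set()) for kind, obj_id in refs):
            alive.append(metric)
    return {**analytics, "metrics": alive}
```

A complete version would have to re-run this to a fixed point, since dropping one metric can invalidate other metrics (or insights and dashboards) that reference it.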
Hi @Jan Soubusta, do you have any update on this? Have you managed to create some kind of workaround with Python?
Sorry, I forgot to respond. It was not feasible to do it before. Recently, we released the new version 2.1 together with a new version of python-sdk: https://community.gooddata.com/product-updates/gooddata-cn-2-1-0-649 Now it should be feasible to implement the script. I will try to do a PoC and will let you know.
I see, okay, thanks!