# gooddata-cn
Greetings, thanks to everyone for all of their help so far... things are moving along and I am super excited to get GoodData.CN running. I am having an issue creating my physical data model... I have what I would call a "normal" Postgres database... but when I try to add a single table to the physical data model, it adds basically all tables from the database, and then when I try to publish it gives numerous errors complaining that
`Multiple datasets are mapped to the same table`
I was hoping to be able to add the database basically as it was with no changes to the tool... Tableau was able to handle this database with no issue whatsoever. Might anyone have any suggestions? I am reading the docs and following them, but they don't seem to be much help. Thanks in advance and please let me know if there is more information that I can provide.
Hi again Vincil. I will ping my team to help you further, but wanted to supply you with 2 GoodData University courses, Understanding the Logical Data Model (LDM) and Designing Data Models. They may help answer some of your questions.
awesome thanks, watching them now!
This is one error that gets intermittently thrown:
```
msg="Bad Request" logger=com.gooddata.tiger.web.exception.BaseExceptionHandling thread=http-nio-9007-exec-10 orgId=default spanId=265f6ded6e01f95c traceId=265f6ded6e01f95c userId=demo exc="errorType=com.gooddata.tiger.metadata.service.ldm.model.ReferenceWithoutPrimaryKeyReferentialException, message=Column using reference naming convention points to non-existing primary key columnWithNaming=user_id referencedTableId=orm_model_team_user referencedColumn=id
```
It seems that GoodData is making some assumptions about the data model... but what I am not seeing is a way to intervene or override them
and I will say, that I have interacted with the physical data model without that error being thrown
so...not sure why the error is thrown sometimes and not others
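The error above suggests GoodData.CN infers references from a column-naming convention and then expects a matching primary key on the referenced table. As a way to picture that (the exact convention and the PDM JSON shape here are my assumptions, not the documented behavior), here is a small local sketch that flags `<table>_id`-style columns whose target table never declares an `id` primary key:

```python
# Hypothetical sketch: mimic the reference-naming check that seems to trip
# GoodData.CN. The PDM JSON shape and the naming rule are assumptions.

def find_broken_references(tables):
    """Flag columns named '<table>_id' whose target table lacks an 'id' primary key."""
    by_name = {t["id"]: t for t in tables}
    problems = []
    for table in tables:
        for col in table["columns"]:
            name = col["name"]
            if not name.endswith("_id"):
                continue
            target = by_name.get(name[: -len("_id")])
            if target is None:
                continue  # no table matches the convention; nothing is inferred
            pks = {c["name"] for c in target["columns"] if c.get("isPrimaryKey")}
            if "id" not in pks:
                problems.append((table["id"], name, target["id"]))
    return problems

tables = [
    {"id": "orm_model_team_user",
     "columns": [{"name": "id"}]},  # 'id' exists but is not declared a primary key
    {"id": "orm_model_task",
     "columns": [{"name": "id", "isPrimaryKey": True},
                 {"name": "orm_model_team_user_id"}]},
]
print(find_broken_references(tables))
# → [('orm_model_task', 'orm_model_team_user_id', 'orm_model_team_user')]
```

If this mirrors what the server does, declaring the missing primary key in the PDM (or removing the convention-matching column) would explain why the error comes and goes depending on which tables the scan picked up.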
@Leilani Greer I was able to get past my issue by exporting the generated PDM file from the API, removing anything that was not necessary, and then using that to import
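Scripted, that workaround could look something like the sketch below: trim the exported PDM down to a whitelist of tables before re-importing it. The `{"tables": [...]}` payload shape and the table ids are assumptions for illustration, not the documented declarative schema:

```python
import json

# Hedged sketch: keep only the tables we actually need from an exported PDM.
# The JSON shape and the ids below are hypothetical.

KEEP = {"orm_model_team_user", "orm_model_task"}

def trim_pdm(pdm, keep):
    """Return a copy of the PDM containing only the whitelisted tables."""
    return {**pdm, "tables": [t for t in pdm["tables"] if t["id"] in keep]}

pdm = {"tables": [{"id": "orm_model_team_user"},
                  {"id": "orm_model_task"},
                  {"id": "django_migrations"}]}  # auto-scanned noise we drop

trimmed = trim_pdm(pdm, KEEP)
print(json.dumps(trimmed, indent=2))
```

The trimmed JSON would then be PUT back through the same declarative API the export came from.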
So glad you were able to find a solution, and apologies for the delay! Here is some help documentation on modifying the LDM that might help in the future.
no worries, it was a learning experience... learning it's better to work through the API... and now I have it all scripted... so in some way it makes me feel better... more reproducible
better than Tableau LOL
If I need to make a model change I can probably do it by hand
from now on
Our APIs are very powerful! I believe you are working with David Munka, and he can always set up a call with one of our SEs in the US to help you or answer technical questions too. Just wanted to let you know it is an option. 😄
sounds good...just getting the hang of it for now
Completely understand
Our slack channel is great for that
yes, glad I can make forward progress
This is interesting. Generally there are two steps:
• generate and store the PDM
• create the LDM: generate it, drag and drop tables in the LDM Modeler, or create it completely manually
How did you create the LDM that you were not able to publish (the one that returned the error)? If you generated it in the LDM Modeler (or through the generateLogicalModel API) and the result was not valid, could you share the PDM so I can reproduce it on my side and find the root cause?
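The second step, generating the LDM from the PDM, happens server-side via the generateLogicalModel API; purely to illustrate the mapping it performs, here is a hedged local stand-in that derives one minimal dataset per PDM table. Every JSON shape and field name in it is an assumption, not the real API contract:

```python
# Hedged, local-only sketch of "generate LDM from PDM": one dataset per table.
# The real work is done server-side by generateLogicalModel; shapes are assumed.

def generate_ldm(pdm):
    datasets = []
    for table in pdm["tables"]:
        datasets.append({
            "id": table["id"],
            "dataSourceTableId": table["id"],  # dataset maps back to its source table
            "attributes": [c["name"] for c in table["columns"]],
        })
    return {"datasets": datasets}

pdm = {"tables": [{"id": "orm_model_task", "columns": [{"name": "id"}]}]}
print(generate_ldm(pdm)["datasets"][0]["id"])
# → orm_model_task
```

A one-dataset-per-table mapping like this also shows why a PDM with leftover duplicate entries could yield "Multiple datasets are mapped to the same table" at publish time.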
sure thing... I am still trying to pick apart what happened
finally, the way I was able to make it work was by exporting the PDM using the API and then editing it by hand to use only the items that I needed
some problems centered around the use of the Postgres uuid type
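One hand-edit that could address a type like that (purely a hypothetical sketch; the PDM JSON shape and the `dataType` values are assumptions) is rewriting scanned `UUID` columns to `STRING` in the exported PDM before importing it:

```python
# Hedged sketch: rewrite a data type the model chokes on (e.g. Postgres uuid)
# to STRING in an exported PDM. JSON shape and type names are assumptions.

def map_uuid_to_string(pdm):
    for table in pdm["tables"]:
        for col in table["columns"]:
            if col.get("dataType", "").upper() == "UUID":
                col["dataType"] = "STRING"
    return pdm

pdm = {"tables": [{"id": "orm_model_team_user",
                   "columns": [{"name": "id", "dataType": "UUID"},
                               {"name": "email", "dataType": "STRING"}]}]}
print(map_uuid_to_string(pdm)["tables"][0]["columns"][0]["dataType"])
# → STRING
```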