Our team is about to integrate the GoodData platform into our application. I was trying to build a proof-of-concept project, but failed to scan the PostgreSQL data schema.
The UI crashes immediately in Chrome with an out-of-memory error. In Firefox it keeps running at around 9 GB of memory usage, but is still frozen after 30 minutes. Is there a limit on the number of tables in a schema? Our schema currently contains 300+ tables.
Is this a bug, or am I doing something wrong? If so, please suggest a workaround.
Steps to reproduce:
- run gooddata/gooddata-cn-ce:latest locally with Docker
- connect a custom data source via the API (POST /api/entities/dataSources)
- click "Connect data" on the workspace
- after a few seconds the Google Chrome tab crashes with OOM
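For context, the data source registration in the second step can be sketched roughly as below. This only builds the JSON:API request body locally; the attribute names (`url`, `schema`, `username`, etc.), the id, and the connection details are assumptions for illustration, so check the GoodData.CN API reference for the exact payload:

```python
import json

# Hypothetical PostgreSQL connection details -- replace with your own.
# Attribute names here are assumptions; verify them against the
# GoodData.CN API reference for POST /api/entities/dataSources.
payload = {
    "data": {
        "type": "dataSource",
        "id": "pg-datasource",  # assumed identifier
        "attributes": {
            "name": "pg-datasource",
            "type": "POSTGRESQL",
            "url": "jdbc:postgresql://host.docker.internal:5432/appdb",
            "schema": "public",
            "username": "scanner",
            "password": "secret",
        },
    }
}

body = json.dumps(payload)
# The actual request (e.g. via the `requests` library) would POST `body`
# to the local GoodData.CN endpoint with an Authorization header;
# it is omitted here so the sketch stays self-contained.
print(body)
```

After registering the data source this way, clicking "Connect data" in the workspace triggers the schema scan that crashes the browser tab.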
If I scan only the views, everything works fine. I also tried to scan parts of the schema using different table prefixes, but no tables were loaded at all whenever I specified a prefix.
Best answer by jacek