We're trying to load a big database into a model. I can create PDM and LDM through API calls, but when I press the "Create Model" button in the UI a browser tab crashes.
Are there any limitations on the database size?
Best answer by Robert Moucha
May I know which Data Source you are using, please? I have to admit it is a bit strange, as the Modeler in the UI uses our API calls as well.
We’re using the Postgres data source. Some tables in the DB are 20 GB in size; could this be the cause?
Yes, this might be an issue. Please be so kind and check this Community post; hopefully it will help you optimize your model.
It’s not a matter of DB table size. What matters more is the overall complexity of your data model: the number of tables and views, the number of table columns, and the number of table relations (foreign keys).
You may filter tables by a common prefix to load only the subset of tables you are interested in. Later on, you can append another portion of your model.
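A minimal sketch of the prefix-filtering idea above. Note this is plain Python for illustration only, not the GoodData API itself; the table names and the `filter_by_prefix` helper are hypothetical:

```python
# Hypothetical sketch: keep only scanned tables whose names share a common
# prefix, so the model starts from a manageable subset.
def filter_by_prefix(tables, prefix):
    """Return only the table names that start with the given prefix."""
    return [t for t in tables if t.startswith(prefix)]

# Example table names (assumed, not from a real scan)
scanned = ["sales_orders", "sales_items", "hr_employees", "log_events"]

subset = filter_by_prefix(scanned, "sales_")
print(subset)  # ['sales_orders', 'sales_items']
```

You could then load the model from this subset first and append, say, the `hr_`-prefixed tables in a later pass.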
We are aware of performance issues in the Modeler: the UI starts to slow down when the number of datasets exceeds two or three hundred. Some improvements were made in a recent release, but there are still cases that need to be improved.