# gooddata-platform
t
Hey and good morning! We have the problem that, because of the size of our data, we cannot load the complete data model in one go and instead have to load, e.g., first all the dimensions and then the fact tables. This is all more or less OK, but we have a problem in the following scenario: whenever we change an underlying computation such that some keys are deleted / no longer available, we cannot load the dimensions in first, since that makes the data model inconsistent. So what I'd like to do is flush out the old data before loading the fresh data in, and thus avoid the inconsistencies. What is the easiest way to do so?
m
Hi Thomas, if you want to completely wipe the data loaded into a dataset in the GoodData Platform, probably the fastest way is the MAQL DDL command SYNCHRONIZE. https://help.gooddata.com/doc/enterprise/en/data-integration/data-modeling-in-gooddata/data-modeling-and-maql/maql-ddl/#MAQLDDL-Synchronize If you do not use the "PRESERVE DATA" modifier, this immediately removes all data in the datasets mentioned in the command, and it is usually much faster than invoking a MAQL DELETE command. It is similar to how the SQL command TRUNCATE works. Please note that this operation cannot be undone. To invoke the command, you send it to the API (documentation here) or use the gray page /gdc/md/projectID/ldm/manage2
t
thanks!
that works just fine