# gooddata-cloud
Good morning beautiful people! We have a situation that requires us to remove all data uploaded to all datasets in our model over the last 2 days, so everything from January 10th forward. We are examining a few options, but the simplest solution seems to be executing a MAQL statement for each dataset. Does anyone have experience with a situation like this? Any ideas? Context: we have a few Bricks configured to extract data from some tables in BigQuery. Due to internal failures on our side, these tables had only partial data, so the data was not fully loaded for most accounts. For example, if there should have been 100 events, only 85 were present at the moment of extraction, so only those 85 were loaded by the Brick execution.
Hi Tomas, it sounds to me like you are using the GoodData Platform rather than GoodData Cloud (in GD Cloud you query the DB directly, you don't extract the data through Bricks). If so, running a MAQL statement against the /ldm/manage2 resource is the correct approach. Simply running
SYNCHRONIZE {dataset.datasetId};
will delete all the data in the particular dataset.
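For anyone scripting this across many datasets, here is a minimal sketch of how the statements could be generated and submitted. The dataset IDs, host, and project ID below are hypothetical placeholders; the POST to `/gdc/md/<project>/ldm/manage2` is shown only as a commented-out outline, so check the Platform API docs for the exact payload your deployment expects.

```python
# Sketch: build one SYNCHRONIZE statement per dataset so each can be
# submitted to the GoodData Platform /ldm/manage2 resource.
# Dataset identifiers here are hypothetical examples.

def build_synchronize_maql(dataset_ids):
    """Return one MAQL DDL statement per dataset.

    Plain SYNCHRONIZE drops all data in the dataset; adding
    PRESERVE DATA would keep the data, which is NOT what we want here.
    """
    return ["SYNCHRONIZE {%s};" % ds for ds in dataset_ids]

statements = build_synchronize_maql(["dataset.events", "dataset.accounts"])
for maql in statements:
    print(maql)
    # Assumed submission step (uncomment and adapt; requires `requests`
    # plus valid host, project_id, and authentication):
    # payload = {"manage": {"maql": maql}}
    # requests.post(f"https://{host}/gdc/md/{project_id}/ldm/manage2",
    #               json=payload, auth=(user, password))
```

Running the statements one dataset at a time also makes it easy to stop and verify after each deletion before moving on.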
Another option is to trigger a full reload of the data (delete all data from the dataset and load all values for the particular client_id from your DB).
Hey @Boris i apologize for confusing the channels, I did mean to post on the Platform 😔
We are trying to avoid that; in our past experience, a full load on all datasets can take up to 5-6 days
What do you think about restoring the affected projects from a snapshot? That will probably be easier? Then we could just do the incremental load to get the data back on track
Yes, we might be able to do that; the question is how many projects are affected. Please send a request to us, and we can figure out the details there.
I forgot to mention that these tables don't have PKs, so an incremental load over the affected dates results in duplication...
Thank you @Boris! I have just sent an email requesting this assistance. TKT n. 118718