Thomas Karbe
04/30/2024, 8:32 AM"message":"Feature flag etl.lastRecordDeduplication must be disabled for upload file size larger than %s
. Is there a way to make this work temporarily, before we can work on reducing the dataset size?Moises Morales
04/30/2024, 8:34 AM{
"settingItem": {
"key": "etl.lastRecordDeduplication",
"value": "true",
"source": "catalog",
"links": {
"self": "/gdc/projects/workspace_id/config/etl.lastRecordDeduplication"
}
}
}
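The response above shows the setting being read from the project config endpoint (the path in the "self" link). As a hypothetical sketch of how one might build the body for a PUT back to that same endpoint to disable the flag (the helper name and workspace_id are assumptions, not a confirmed GoodData client API; note the API represents the boolean value as a string, as in the response above):

```python
import json

def build_setting_payload(key: str, value: bool) -> dict:
    """Build the settingItem body for a PUT to /gdc/projects/<workspace_id>/config/<key>.

    Hypothetical helper; the endpoint path is taken from the "self" link above.
    """
    return {
        "settingItem": {
            "key": key,
            # The config API stores boolean values as strings ("true"/"false"),
            # matching the response shown above.
            "value": str(value).lower(),
        }
    }

payload = build_setting_payload("etl.lastRecordDeduplication", False)
print(json.dumps(payload, indent=2))
```

An HTTP client would then PUT this body to the "self" URL; the exact authentication and request mechanics depend on your setup and are not shown here.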
Thomas Karbe / Moises Morales
04/30/2024, 8:41 AM – 8:50 AM
[message content not captured in this export]

Michal Hauzírek
04/30/2024, 12:03 PM
Even when the etl.lastRecordDeduplication feature flag is set, a data load will still:
• remove all old data if it is a full load
• update the existing data based on the defined primary key in case of an incremental load
The only real difference is what happens when there is a duplicate on the primary key inside the batch of data being uploaded.

Thomas Karbe
04/30/2024, 12:11 PM
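The batch-level behavior Michal describes can be sketched as follows. This is an illustrative assumption about what "last record deduplication" means, not GoodData's actual implementation: when several records in one uploaded batch share a primary-key value, the last occurrence wins.

```python
def dedupe_last_record(batch, primary_key):
    """Keep only the last record per primary-key value.

    Illustrative sketch of last-record deduplication within a single
    upload batch; relies on dicts preserving insertion order.
    """
    latest = {}
    for record in batch:
        # Later records with the same key overwrite earlier ones.
        latest[record[primary_key]] = record
    return list(latest.values())

batch = [
    {"id": 1, "amount": 10},
    {"id": 2, "amount": 20},
    {"id": 1, "amount": 30},  # duplicate primary key id=1 within the batch
]
print(dedupe_last_record(batch, "id"))
# the id=1 row with amount 30 survives; the amount-10 row is dropped
```

Full loads and incremental loads behave the same with or without the flag, as noted above; only this within-batch duplicate handling differs.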