# gooddata-platform
m
hi GD team, i have a problem with implementing changes to my data model. i have amended some fact datasets with new key columns that are supposed to reference a newly created dimension. i have run a full load on this new dim, so it already exists and seems to be connected to the fact dataset(s). but when i try to run a full force load on any of the affected datasets, i hit the following error.
Project "ox3jebjo7n512ue1ci68tyjk9xc169tu" was not integrated. Reason: Non-synchronized dataset "dataset.fact_subscriptions/77675", identified unrelated columns "906566".
fact datasets are incremental. when i was saving the changes to my model, the notification below popped up. i picked "save anyway", but no data has been deleted since then. do you have any hints on where i should look to fix this error?
btw, i did almost the same thing to the same datasets a week ago and it worked fine, so i struggle to understand what exactly went differently this time. i can see my new keys in the data model, but i don't see them as part of the datasets in the gray pages, so it seems like something has desynchronized, but i'm not sure what exactly
m
Hi Masha, based on this error, it seems there is some discrepancy between your physical and logical data model. Please try to SYNCHRONIZE the dataset as described in this article: https://help.gooddata.com/doc/enterprise/en/data-integration/data-modeling-in-gooddata/data-modeling-and-maql/maql-ddl/#MAQLDDL-Synchronize
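As a sketch, the soft-synchronize statement from that article would look like this for the dataset named in your error (adjust the identifier to your own dataset):

```
SYNCHRONIZE {dataset.fact_subscriptions} PRESERVE DATA;
```

The PRESERVE DATA clause keeps the data already loaded into the dataset; omitting it drops the dataset's data, which then has to be reloaded.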
m
Hi @Michael Ullock, thanks a lot. SYNCHRONIZE helped for some datasets, but for others i'm getting different results within a couple of seconds for the same GET request. for instance, i run the same GET request against the fact_rewards dataset, and once i get a normally mapped body of the object
{
  "column": {
    "meta": {
      "summary": "",
      "uri": "/gdc/md/ox3jebjo7n512ue1ci68tyjk9xc169tu/obj/906562",
      "contributor": "/gdc/account/profile/ffce5af76e69a37de5c047556753a50a",
      "title": "col.f_fact_rewards.dim_conversions_id",
      "created": "2024-10-28 17:12:44",
      "deprecated": "0",
      "author": "/gdc/account/profile/ffce5af76e69a37de5c047556753a50a",
      "category": "column",
      "identifier": "col.f_fact_rewards.dim_conversions_id",
      "tags": "",
      "updated": "2024-10-28 17:12:44",
      "isProduction": 1
    },
    "content": {
      "columnType": "fk",
      "columnDBName": "dim_conversions_id",
      "table": "/gdc/md/ox3jebjo7n512ue1ci68tyjk9xc169tu/obj/17172"
    }
  }
}
and the next request, two seconds later, returns this
<html xmlns='http://www.w3.org/1999/xhtml' lang='en-US' xml:lang='en-US'>

<head>
    <title>GoodData Resource</title>
    <meta http-equiv='Content-Type' content='application/xhtml+xml; charset=utf-8' />
    <style type='text/css' media='screen'>
        html {
            background-color: #eee;
            font-size: 12px;
        }

        html body div {
            padding: 3px;
            background-color: #999;
        }
    </style>
</head>

<body>
    <h1>GoodData Resource</h1>
    <hr />
    <pre>{
   "column" : {
      "content" : {
         "columnDBName" : "dim_conversions_id",
         "columnType" : "fk",
         "table" : "<a href="/gdc/md/ox3jebjo7n512ue1ci68tyjk9xc169tu/obj/17172">/gdc/md/ox3jebjo7n512ue1ci68tyjk9xc169tu/obj/17172</a>"
      },
      "meta" : {
         "author" : "<a href="/gdc/account/profile/ffce5af76e69a37de5c047556753a50a">/gdc/account/profile/ffce5af76e69a37de5c047556753a50a</a>",
         "category" : "column",
         "contributor" : "<a href="/gdc/account/profile/ffce5af76e69a37de5c047556753a50a">/gdc/account/profile/ffce5af76e69a37de5c047556753a50a</a>",
         "created" : "2024-10-28 17:12:44",
         "deprecated" : "0",
         "identifier" : "col.f_fact_rewards.dim_conversions_id",
         "isProduction" : 1,
         "summary" : "",
         "tags" : "",
         "title" : "col.f_fact_rewards.dim_conversions_id",
         "updated" : "2024-10-28 17:12:44",
         "uri" : "<a href="/gdc/md/ox3jebjo7n512ue1ci68tyjk9xc169tu/obj/906562">/gdc/md/ox3jebjo7n512ue1ci68tyjk9xc169tu/obj/906562</a>"
      }
   }
}
</pre>
    <hr />
</body>

</html>
seems like two versions exist in parallel somehow. does it mean SYNCHRONIZE didn't work / i didn't execute it properly, or is it something else?
ok, it looks like it behaves the same way (changes between consecutive requests) with other objects as well, so probably not the root cause here then
m
If I understand correctly, you are trying to execute the MAQL DDL command via an API client, hence the response you got in your second code block. It usually means the super secure token is no longer valid, but it may also happen for other reasons. To rule this out, can you please try executing the command directly in the gray pages: {your_domain}/gdc/md/<project>/ldm/manage2? Please let me know how you get on.
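If you later want to retry it through an API client, a minimal sketch of the request body for the manage2 resource might look like the following. The {"manage": {"maql": ...}} payload shape is my assumption based on the platform's API reference, so please verify it against the documentation for your domain before relying on it:

```python
import json

# Hypothetical helper: builds the JSON body for
# POST {your_domain}/gdc/md/<project>/ldm/manage2.
# The {"manage": {"maql": ...}} shape is an assumption drawn from the
# platform API reference; verify it for your domain before use.
def build_maql_payload(maql: str) -> str:
    return json.dumps({"manage": {"maql": maql}})

body = build_maql_payload("SYNCHRONIZE {dataset.fact_subscriptions} PRESERVE DATA;")
print(body)
```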
m
hi @Moises Morales, thanks a lot for the fast reply. i ran it in the gray pages and it worked, but now i'm getting the following error:
Project "ox3jebjo7n512ue1ci68tyjk9xc169tu" was not integrated. Reason: Cannot start integration of dataset: [dataset.dim_customers]. There is an unfinished integration.The integration has started at [2024-10-29 16:38:24] with ID: [3edc45c687fedd7bfe7a2e83cdbf7b4c00000010]
now i'm thinking about synchronizing it without preserving the data and then reloading the whole thing. or maybe you can recommend something better here?
m
If the soft synchronization is not working, then I would recommend trying to hard synchronize the dataset (without preserving data, as you stated). Regarding the error you got, you will need to wait for the previous process to either error out, finish, or time out before you can start your new integration.
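For completeness, the hard variant is the same statement without the PRESERVE DATA clause, shown here for the dataset from your error message. Note that this drops the dataset's loaded data, so it should be run deliberately:

```
SYNCHRONIZE {dataset.dim_customers};
```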
m
thank you! could you please explain what exactly the 'integration process' means? how does it get triggered, and where can i check if it's still running?
the error message above was the result of a data load process failure, and i usually see that integration is part of every load, but i'm not aware of where it runs independently of the load process
m
Hi Masha, the integration process is part of the data synchronization that occurs when datasets in the LDM are mapped to the physical data stored in the data source. This process ensures that the structure and relationships defined in the LDM align with the actual data. To check the progress: if you run this via the API and the process is still running, the response will indicate it with the integration start time and status.
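As an illustration of how that status check could be handled in a client, here is a small sketch. The "wTaskStatus" field name is an assumption borrowed from other GoodData task-poll responses, so treat it as illustrative rather than the confirmed shape for integration tasks:

```python
# Hypothetical sketch: decide whether a polled task response means the
# integration has reached a terminal state. The "wTaskStatus" shape is an
# assumption, not a confirmed contract for integration tasks.
def integration_finished(poll_response: dict) -> bool:
    status = poll_response.get("wTaskStatus", {}).get("status")
    # "OK" and "ERROR" are terminal; anything else is treated as in progress.
    return status in ("OK", "ERROR")

print(integration_finished({"wTaskStatus": {"status": "RUNNING"}}))  # → False
```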
m
got it, thank you, Michael!