Error when loading data into column that's not in the data model

  • 9 March 2022
  • 8 replies

I have a column in one of my tables that I've removed from the data model, but when I try to load the data, the load fails because the values in that column are too large for its type — even though I removed the column. Why is it trying to load data into a column I don't want GoodData to worry about? Does the Load process load all data regardless of what's been selected in the Model?




Best answer by Moises Morales 9 March 2022, 15:12


8 replies


Hello Chris,

What method are you using to load data into GoodData? Can you share with us the error ID / request ID / task ID from the failed data load?

Thank you


Thank you in advance for the potential help, Jan. We're reading Hubspot data from a warehouse we've built. They have an hs_deal table with all the deals in the CRM. There are columns in that table to track the time spent in each stage of the CRM's pipeline, and one of those columns is called "hs_deal.property_hs_time_in_1105203". I've removed the fact corresponding to that column from the data model, but the load is still throwing an error: "SQLSTATE_NUMERIC_VALUE_OUT_OF_RANGE. Details: numeric field overflow: 20120646217.0". I've attached the run log in case it helps too.


Hi Chris, 


It is possible that the fact was removed from the Logical Data Model (UI) only and still exists on the backend, which is why ADD is still throwing the numeric field overflow error. You will have to go to the project gray pages, get the fact identifier, and drop it via MAQL DDL.

Here’s how you can access gray pages: Access the Gray Pages for a Workspace. From there, navigate to “metadata” > objects > query > dataSet. Find the corresponding dataset and click on the fact you want to drop, its identifier will be in its “meta” definition.

Alternatively, you can also find the dataset metadata via our GoodData Extension Tool for Chrome by clicking “Object” and then “Object” from the dataset page under “Manage” in the UI.

Once you get the object identifier, you can proceed to run the MAQL DDL query to drop it (see Remove attribute/fact in the documentation), e.g.:

ALTER DATASET {dataset.players} DROP {attr.players.age};

Please make sure the identifier is correct since the changes are irreversible.

However, if you would still like to upload the problematic column, you can do so by altering its datatype so the values fall within the specified range. Again, this is achievable with a MAQL DDL query; make sure to double-check that the datatype is correct. More information can be found here: Specify a DATATYPE
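For example, a datatype change for the fact from your table might look like the following. Note that the identifier and the DECIMAL precision here are assumptions — look up the real identifier in the gray pages and pick a precision that fits your largest value:

ALTER DATATYPE {fact.hs_deal.property_hs_time_in_1105203} DECIMAL(15,2);

The overflowing value 20120646217.0 has 11 integer digits, so the datatype needs at least that much precision before the decimal places.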

Let me know if this helps. 


Thank you, Moises. I go to the gray pages, but it asks for a login, and my GoodData credentials are getting rejected. Is there a different account I need to set up to access the gray pages?




Can you please make sure you are using your own domain and not our default domain? https://{your_domain}/gdc/projects/{workspace_id}

Moises, I was able to log in and removed some columns using MAQL, but now I'm getting this error when loading data. I looked in the metadata, but there's no attribute with id 27658 (the first one in the list), for example.


2022-03-09T22:15:48.194+0100 [WARN]: Project "xyp336qpoai0p3fqmra8xrbkwpmnk9u5" was not integrated. Reason: Non-synchronized dataset "dataset.hs_deal/28360", identified unused columns "27658,27646,27534,27770,27448,27844,27812,27408,2163,27484,27710,27634,27560,27454,27750,27836,27556,27590,27904,27720,27764,27854,27398,27506,27866,27772,27422,27896,27768,27662,27706,27804,2181,27504,27636,27524,27610,27682,27780,27450,27404,27652,27614,27782,27618,27558,27698,27890,27474,27824,27508,27578,27648,27766,27876,27498,27814,27718,27604,27678,27452,27548,27406,27740,27414,27532,27838,27848,27676,27714,27458,27522,27460,27462,27688,27724,27722,27738,27576,27732,27834,27670,27700,27674,27756,27882,27496,27774,27514,27482,27716,27464,27410,27638,27680,27778,27754,27906,27694,27490,27612,27858,27494,27684,27516,27400,27796,27880,27582,27884,27868,27762,27728,27594,27892,27468,27748,27702,27566,27786,27416,27430,27794,27826,27672,27752,27760,27510,27800,27802,27874,27620,27552,27478,27692,27788,27530,27602,27654,27550,27628,27864,27886,27660,27434,27664,27424,27758,27596,27428,27554,27632,27570,27616,27568,27500,27730,27622,27476,2119,27418,27894,27542,27580,27798,27564,27806,27624,27574,27456,27440,2097,27488,27902,27608,27846,27420,27746,27708,27486,27828,27842,27492,27644,27900,27640,27412,27872,27592,27668,27776,27606,27480,27518,27402,27586,27734,27502,27830,27426,27436,27820,27690,27816,27526,27878,27512,27562,27790,27818,27544,27908,27446,27600,27520,27650,27442,27432,27538,27686,27810,27888,27808,27626,27466,27744,27898,27784,27822,27540,27666,27852,27832,27726,27470,27588,27742,27584,27860,27642,27472,27704,27862,27736,27696,27438,27656,27792,27572,27850,27528,27598,27870,27444,27840,27712,27630,27536,27856,27546".

Hello Chris,

Can you try to run the SYNCHRONIZE command on the dataset? That should do the trick.
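A sketch of the synchronize call, assuming the dataset identifier shown in your run log. Note that a plain SYNCHRONIZE deletes the data already loaded into the dataset; adding PRESERVE DATA keeps the existing rows while rebuilding the physical model to match the LDM:

SYNCHRONIZE {dataset.hs_deal} PRESERVE DATA;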


Thank you, that worked!