# gooddata-platform
m
morning, our platform is not accessible (endless spinning wheel and then unknown error)
i cannot recall it happening during the last 2 years, and now it's happened for the 3rd time in a month. i'm not sure where exactly to address it
j
Hello Masha, I'm really sorry for the terrible inconvenience. Our engineers have recently noticed an issue and are currently working on a fix here. We will keep you posted with further details.
m
thanks Joseph, it seems to be back now. i have notified our csm about the issue, hopefully he'll be able to resolve it for the future
j
Thanks for the confirmation
🙏 1
m
@Joseph Heun URGENT our customer facing workspace has a really weird validation log
eg PDM::PDM_VS_DWH, ERROR, PDM_TABLE_MISSING: Object id:"tab.d_pifc_fact_kpi_saleschannel" [423901] table 'd_pifc_fact_kp_aaad656bc3bm52j' doesn't exist.
table name is not correct
```
tableDataLoad "d_pifc_fact_kp_aaac656bc3bm52j" {aacVbK51qDYd}"
```
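For anyone re-checking a workspace after an incident like this, project validation can also be triggered programmatically rather than through the UI. A minimal sketch, assuming the `/gdc/md/<project_id>/validate` endpoint and the `pdm::pdm_vs_dwh` task name (the one that produced the log line above) as described in the public GoodData platform docs; host, project id, and authentication are placeholders:

```python
# Sketch: trigger a GoodData project validation over the REST API.
# The endpoint path and task name are assumptions based on the public
# GoodData platform docs; host, project id and auth are placeholders.
import json


def build_validation_request(tasks):
    """Build the JSON body for POST /gdc/md/<project_id>/validate."""
    return {"validateProject": list(tasks)}


if __name__ == "__main__":
    body = build_validation_request(["pdm::pdm_vs_dwh"])
    print(json.dumps(body))
    # To actually run it (needs the `requests` package and a valid session):
    # import requests
    # resp = requests.post(
    #     "https://secure.gooddata.com/gdc/md/<project_id>/validate",
    #     json=body,
    #     cookies={"GDCAuthTT": "..."},  # placeholder auth token
    # )
```

The response is a polling link; the finished report contains entries like the `PDM_TABLE_MISSING` line pasted above.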
it's related to multiple objects
j
Hi Masha, this is related to the issue. Currently, our engineers need to restore the metadata in the workspace. These errors will disappear after the workspace has been restored.
m
it's causing data load errors, meaning all the customer-facing insights are down. do you guys have an ETA for that to be resolved?
j
Unfortunately, we do not have an ETA at the moment. The workspaces need to be restored in a consecutive order, but we will keep an eye on the process and let you know when it is close.
Do you have specific workspaces you are interested in?
m
do you have access to the validation log i've shared above? that's exactly the workspace; the workspace id is part of the link. can it be tackled urgently? once again, this specific workspace is embedded in the customer interface, and all our customers are reporting there's no data
j
We can see that this specific workspace is in the middle of the first batch of restorations. It will take no more than 4 hours for the restoration to finish. Hopefully much sooner.
m
ok, thank you for the info!
@Joseph Heun seems like i was eventually able to fix it by synchronising all the datasets in metadata, and the validation log also seems ok now. do you think i can announce it as solved, or could it be overwritten by the ongoing restorations? thanks much again for staying in touch
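For reference, the workaround described here (re-synchronising datasets in metadata) can also be scripted instead of clicked through per dataset. A minimal sketch, assuming the MAQL DDL `SYNCHRONIZE ... PRESERVE DATA` statement and the `/gdc/md/<project_id>/ldm/manage2` endpoint from the GoodData platform docs; the dataset identifier and auth are placeholders:

```python
# Sketch: re-sync datasets via GoodData MAQL DDL.
# The SYNCHRONIZE statement and the ldm/manage2 endpoint follow the
# public GoodData platform docs; ids and auth below are placeholders.


def build_synchronize_maql(dataset_ids, preserve_data=True):
    """Build a MAQL DDL statement that re-syncs the given datasets.

    PRESERVE DATA keeps loaded data instead of truncating the tables.
    """
    suffix = " PRESERVE DATA" if preserve_data else ""
    targets = ", ".join("{%s}" % ds for ds in dataset_ids)
    return "SYNCHRONIZE %s%s;" % (targets, suffix)


if __name__ == "__main__":
    maql = build_synchronize_maql(["dataset.d_pifc_fact_kpi_saleschannel"])
    print(maql)
    # POST {"manage": {"maql": maql}} to
    # https://secure.gooddata.com/gdc/md/<project_id>/ldm/manage2
```

Dropping `PRESERVE DATA` would recreate the physical tables empty, so for a production workspace the default above is the safer choice.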
j
That workspace restoration has not finished yet, but it is nice to know there may be a workaround. However, I'm not sure if this synchronization actually fixed everything. Please let us know if you stumble upon any issues.
🙏 1
r
Heya @Masha Akatieva, Radek from the GD Technical Support team here - I noticed some errors coming from one of your workspaces a couple of minutes ago. We figured that since you fixed the others by synchronizing the datasets, you might want to take the same approach for this one, so we avoided restoring it - the workspace ID is
lpykd6t5y3j432mdd61pn2gsltf6iiun
, please let me know if you already looked at that one too, and if so, if you'd like me to run the restore for it! 🙂
m
hi @Radek Novacek🙂 thanks for pointing it out, i hadn't noticed it, and for some reason we were not getting alerts for this kind of load failure. will check all of them rn
@Radek Novacek hi Radek, hope you're doing well! our biggest workspace's data load failed twice today with an error that we cannot relate to our data. is it also something related to your recent maintenance, or should we look deeper into our data? reloading helped the first time, but now it has happened again. thanks in advance!
```
Project "ox3jebjo7n512ue1ci68tyjk9xc169tu" was not integrated. Reason: DBD::MariaDB::db do failed: Max connect timeout reached while reaching hostgroup 110 after 10000ms [for Statement "BEGIN"]
```
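Since the `Max connect timeout` error above was transient (a plain reload fixed it the first time), wrapping the load step in a retry with backoff is a common mitigation while the underlying datacenter issue is being addressed. A generic sketch; `load_fn` and the use of `RuntimeError` for transient failures are hypothetical, not part of any GoodData API:

```python
import time


def run_with_retries(load_fn, attempts=3, base_delay=1.0, sleep=time.sleep):
    """Retry a flaky load step with exponential backoff.

    `load_fn` is any callable that raises on a transient failure
    (e.g. the 'Max connect timeout' seen in the log above).
    """
    last_error = None
    for attempt in range(attempts):
        try:
            return load_fn()
        except RuntimeError as err:  # hypothetical transient-error type
            last_error = err
            if attempt < attempts - 1:
                sleep(base_delay * (2 ** attempt))  # wait 1s, 2s, 4s, ...
    raise last_error
```

A real scheduler should also distinguish transient errors (timeouts) from permanent ones (bad data), and give up immediately on the latter.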
r
Hi Masha, we have noticed some issues in the datacenter, they are currently being addressed! 🙂
m
got it and thanks for the fast feedback, please keep us posted if possible🙂
r
Okay, everything should be back to normal now! Please accept my apologies for any inconvenience caused 🙂
m
no worries and thanks much for checking!