# gooddata-cloud
p
Hi Team, on our dashboard, the visualization/data filters often fail to load (refer to screenshot 1); however, if I refresh multiple times they get loaded. This wasn't happening earlier. I checked the network logs and they show some execution failing in the backend with a 500 error. Is there a way we can see the execution logs in GoodData to better track what's happening? Please advise.
j
Hello Pradeep, could you please send us the traceID from this error? We will gladly double-check the logs for you. You should be able to obtain it by reproducing the error again and checking the console or network tab in your browser's web tools.
p
Thanks @Julius Kos. For the filters not loading, the trace IDs are 790ead824df61dc4774bc4648ba82cf4 and 6120642a3a338624ca7ba4cc508f828b. The trace ID for the visualization not loading is 220eca69c880d8db6cf4f5489020a8bf.
j
Hi Pradeep, unfortunately it is not entirely clear from our logs what is happening. Is this happening only to some specific visualizations, or is this a pattern across your workspace(s)? Could you please DM me the direct links to the dashboard in question and the visualization? Thanks
p
Hi @Julius Kos, thanks for following up on this. This is happening to specific visualizations on a dashboard. Unfortunately, this dashboard is in our prod environment and I won't be able to give you access. Right now I am focusing on fixing the filters-not-loading issue, so any specific guidance on figuring out the issue would help. Feel free to reach out to me for any info.
Hi @Julius Kos, DM'd you the link
j
Hi Pradeep, pls rather send it via DM, for security purposes 🙂
just delete it, I got it now, thanks
So far it looks that you are definitely hitting some limit:
```
reason: 'xtab-rows: The limit for the maximum size of the d…cs was exceeded. Limit: 100_000. Actual: 268_661.',
```
I'm currently checking internally for more information.
p
There are multiple issues on the dashboard. I am ordering them by our priority to fix:
1. Filters don't load; they give a 500 error
2. % participation across program and total participation across program not loading
3. Highest donation total, highest volunteer totals (this has the 100,000 limit issue)
j
I have noticed that one Insight simply uses too many datapoints. You can check some of the limits here: https://support.gooddata.com/hc/en-us/articles/9703853324179-GoodData-Cloud-Limits Regarding breaching the 100,000 limit, we are checking internally. The headlines you mentioned probably hit some limit as well, as I can see some connection timeouts in the logs.
In any case, I have passed this issue to our L2 Technical Team. One of the team members will review the problem and contact you as soon as possible.
p
thanks @Julius Kos
Hi @Julius Kos, following up on the issue
b
Hi Pradeep, this is Branislav from the L2 Technical Support team. Thank you for your patience while I reviewed the case after taking over from Julius. Currently, I have an update regarding issue #1 (filters don't load; 500 error). We have noticed the error(s) on our side as well and notified the developers with a request to review and investigate further. So far, the errors seem to be caused by the following:
```
message=Connection is closed
```
that occurs after ~30 s. In the meantime, while waiting on the developers' review, and since this did not happen before, I would like to kindly ask if you are aware of any recent setup changes to your database, mostly related to some connection timeout(s). We are sorry for the inconvenience caused and thank you for your patience while the development team reviews and, if needed, prioritises and works on the fix. Of course, we will continue the investigation and will share an update with you after Easter as soon as we hear back from them.
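One way to rule out a database-side timeout (a hedged sketch; the relevant settings depend on your PostgreSQL version and hosting provider, and a managed service or connection pooler may enforce its own limits on top of these) is to check the server's timeout parameters, any of which could close idle or long-running connections after roughly 30 s:

```sql
-- Show PostgreSQL settings that can close connections server-side.
-- A value of 0 means the timeout is disabled.
SHOW statement_timeout;                   -- aborts queries running longer than this
SHOW idle_in_transaction_session_timeout; -- kills sessions idling inside a transaction
SHOW idle_session_timeout;                -- kills fully idle sessions (PostgreSQL 14+)
SHOW tcp_keepalives_idle;                 -- TCP-level keepalive interval, in seconds
```

If any of these sit near the 30-second mark, that would line up with the `Connection is closed` error above.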
p
Hi @Branislav Slávik, thank you for the update. We haven't made any changes to the database recently. Let me know if you need any other info.
Hi @Branislav Slávik, Let me know if you have any update on the "Filters not loading" issue
Hi @Branislav Slávik @Julius Kos, any update on above issue ?
b
Hi @Pradeep Soni, apologies for the delayed answer. Unfortunately, there is no update yet due to the Easter break. We will continue our investigation today and will let you know as soon as we have any news.
p
Hi @Branislav Slávik, Following up on this.
b
Hi @Pradeep Soni,
The report for the development team mentioned before is still being reviewed and investigated. We are sorry for the inconvenience caused and would like to thank you for your patience while they prioritise and work on the fix. I will share an update with you as soon as I hear back from them.
p
Hi @Branislav Slávik, Let me know if you have any update.
Hi @Branislav Slávik, following up on the issue. Let me know if you have any update.
b
Hi @Pradeep Soni, our developers just got back to me with the following notes:
1. We do not have any 30 s timeout limit for queries that could cause the issue.
2. Could you please double-check on your side whether there is anything in between, e.g. a firewall, or a setup on the DB side, that could be closing the connections?
3. After further investigation, the following possibly related error was found:
```
Connection org.postgresql.jdbc.PgConnection@678e070c marked as broken because of SQLSTATE(08006), ErrorCode(0)","exc":"org.postgresql.util.PSQLException: An I/O error occurred while sending to the backend.
```
This error means that your PostgreSQL server closed the connection for some reason. Their hypothesis is that we might be exceeding the `max_connections` setting on your PostgreSQL instance. With that in mind, they kindly request that you check your PostgreSQL logs for messages like:
```
pq: sorry, too many clients already
pg: too many connections for database "exampledatabase"
pg: too many connections for role "examplerole"
```
in order to confirm the above.
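Besides the logs, the connection headroom can be checked directly in PostgreSQL (a hedged sketch, assuming a role with permission to read `pg_stat_activity`; on a managed service the effective ceiling may also be capped by the provider):

```sql
-- Configured connection ceiling for the instance.
SHOW max_connections;

-- How many backends are currently connected, in total and per database.
SELECT count(*) AS total_connections FROM pg_stat_activity;

SELECT datname, count(*) AS connections
FROM pg_stat_activity
GROUP BY datname
ORDER BY connections DESC;
```

If `total_connections` regularly approaches `max_connections`, new connections from GoodData would be refused or dropped, matching the `PSQLException` above.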
Hi @Pradeep Soni, I was just wondering whether you have had a chance to review my latest update? I provided it some time ago but have not heard back from you. If you would like to share an update, or require more time to work through the latest message, simply reply here and let us know. If there is no update from your side, this issue will be considered resolved within the next few days.
p
Hi @Branislav Slávik, we haven't found anything blocking or causing the DB connection to drop, so we are still investigating.