In our lake we store several date-time values; specifically, for our use case, a Postgres database with nullable "timestamp with time zone" columns.
We don't have a calculated column today, nor are we calculating at insert time, so we're trying to create a metric that selects the average of (call end time - call start time). Since durations may cross the 24-hour mark, and we want to present data stored in UTC across different timezones, I'm looking for a way to make this work.
In SQL it would be something like:
SELECT AVG(TIMESTAMPDIFF(SECOND, call_start_time, call_end_time)) AS avg_time_not_answered FROM your_table_name WHERE Interaction_handled = 0;
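(Worth noting: TIMESTAMPDIFF is MySQL syntax. Since the lake is Postgres, the equivalent would subtract the two timestamps and extract the epoch from the resulting interval — a sketch, reusing the same placeholder table and column names:)

```sql
-- Postgres equivalent: subtracting two timestamptz columns yields an
-- interval; EXTRACT(EPOCH FROM ...) converts it to seconds.
-- Table and column names are the same placeholders as above.
SELECT AVG(EXTRACT(EPOCH FROM (call_end_time - call_start_time))) AS avg_time_not_answered
FROM your_table_name
WHERE interaction_handled = 0;
```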
How can I replicate this in GoodData? Secondly, what is the easiest way to change the visualization of our data into a selected timezone?
Usually, when it comes to implementations that aren’t available by default (or are difficult to implement), there is always the chance for a customer to work with us towards a custom solution, so that may be a good option if you’re planning on using granularity in seconds as a cornerstone of your analytics - alternatively, I’ll be happy to put in a product request towards our engineers on your behalf to have this considered as a candidate for a new feature.
As another option, you could always consider giving GoodData Cloud Native a look as well - this version includes a DATETIME_DIFF function, which is nearly what you’re looking for, excluding the granularity in seconds (but like above, I can put in a product request for you).
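(For illustration only, a GoodData Cloud metric built around DATETIME_DIFF might look roughly like the sketch below — the dataset and field identifiers are hypothetical, and the exact function signature and supported granularities should be checked against the current MAQL documentation:)

```
-- Hypothetical MAQL sketch: average difference between call start and
-- call end, at MINUTE granularity (seconds are not supported, as noted
-- above). All identifiers are placeholders, not real LDM references.
SELECT AVG(DATETIME_DIFF({dataset.call_start_time}, {dataset.call_end_time}, MINUTE))
WHERE {label/interaction_handled} = 0
```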
I’ll review this. Is there any plan to simplify this for customers in an upcoming enhancement? I’m working on a reference table of intervals in the day, so this makes sense on the surface, and I’ll review it as a shorter-term option.
Also, I’m wondering if you have any use-case documentation for this purpose - some of your customers must be call centers, and I’m sure they’ve tackled the same thing we’re looking to do. Any knowledge sharing here?
Radek from the GoodData Technical Support team here! There is a way to achieve this, though it’s unfortunately not as easy as the SQL function.
As the Date dimension by default can only go as low as days in terms of granularity, you would need to use a custom Date dimension in your LDM - that way, you could get the granularity of hours/minutes/seconds as well.
Afterwards, there’s Date attribute arithmetic to help you calculate the average event time - to get an example close to your calculated average, this could be as simple as subtracting the call start from the call end and averaging the result.
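(As a rough sketch of that arithmetic — the names below are hypothetical and assume call start/end are modeled against a custom Date dimension with second granularity, as described above:)

```
-- Hypothetical MAQL sketch: subtract the start timestamp from the end
-- timestamp and average the result across unanswered interactions.
-- "Call End Time", "Call Start Time" and "Interaction Handled" are
-- placeholder attribute names, not real LDM identifiers.
SELECT AVG(Call End Time - Call Start Time)
WHERE Interaction Handled = 0
```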
Secondly, as far as timezones are concerned - timezones can either be set per-workspace, or per-user (setting a timezone under user settings would override timezone settings for the workspace). We have an article on how timezones affect reporting that goes into detail, but the short version would be that a visualisation has to be built with timezone changes in mind (using MAQL macros with relative references such as THIS, PREVIOUS and NEXT).
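(As an example of the relative-reference style mentioned above, a filter like the following resolves “the current day” in the workspace’s — or user’s — configured timezone rather than against a hard-coded date; the metric and attribute names are hypothetical:)

```
-- Hypothetical MAQL sketch: restrict a metric to the current day,
-- resolved in the configured timezone via the THIS relative reference.
SELECT Avg Time Not Answered WHERE Date (Call Start) = THIS
```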
Hope this helps!