# gooddata-platform
e
ui/js sdk question: When scrolling through a PivotTable I see there are network requests going out for getting 100 rows at a time. How can I invoke my own requests for this data? We want to use our own in-house table component and to have some more control over the data itself without needing to configure the actual insight.
j
Hello Evan! Jiri from GoodData here. For this I would recommend following the end-to-end flow as described here.
The idea is that you build the request payload programmatically in code, and for paging you request a specific window, e.g.:

```ts
const firstPage = await result.readWindow([0, 0], [10, 10]);
```
Please let me know if it helps!
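For readers following along: `readWindow` takes an offset pair and a size pair (rows, columns). A minimal sketch of turning a page index into that window follows; `pageWindow` and `DataWindow` are my own illustrative names, only `readWindow` itself is the real SDK call.

```ts
// Sketch: translate a zero-based page index into the offset/size pair
// expected by result.readWindow(offset, size). Names are illustrative.
type DataWindow = {
  offset: [number, number]; // [rowOffset, columnOffset]
  size: [number, number];   // [rowCount, columnCount]
};

function pageWindow(page: number, pageSize: number, columnCount: number): DataWindow {
  return {
    offset: [page * pageSize, 0], // first row of the requested page, first column
    size: [pageSize, columnCount],
  };
}

// Usage against a real result (not executed here):
// const { offset, size } = pageWindow(2, 100, 12);
// const thirdPage = await result.readWindow(offset, size);
```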
e
Thanks Jiri! Will let you know if I have any follow-up questions
Is there a way to apply filters without having to generate an entirely new execution?
j
Hello Evan, are we still talking about the end-to-end execution flow using the raw Promise (i.e., you are not using the <Execute /> component), correct?
Just to make sure I understand, you already built (and probably also executed) an execution, like this:
```ts
const result = await backend
   .workspace("workspace_id")
   .execution()
   .forItems(measuresAndAttributes, [filter])
   .withSorting(...sort)
   .withDimensions(...dimensions)
   .execute();

const firstPage = await result.readWindow([0, 0], [10, 10]);
const allData = await result.readAll();
```
And now you're looking for a way to just call `.forItems(measuresAndAttributes, [filter])` again, only with an updated filters array?
e
correct
I’ve gotten it to work by making a new execution, but if there is a simpler way to do it I’d still like to know!
j
I believe that creating a new execution every time is the right approach. Do you see a problem with it? Are you concerned about performance, maybe? Or is it just code cleanliness and readability? Would you recommend we implement it another way? I think that creating a new execution is more "react-ive" and more "functional" than modifying an existing one, and that once compiled the performance difference is negligible. But please let us know if you think otherwise; we are interested in your feedback!
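To illustrate the "fresh execution per change" idea without a live backend: the sketch below models an execution definition as immutable plain data, where changing filters produces a new definition rather than mutating the old one, just as calling `forItems` again produces a new execution. `ExecutionDef` and `withFilters` are invented names, not SDK API.

```ts
// Sketch of the "new execution per filter change" pattern as plain data.
// In the real SDK you would re-run backend...execution().forItems(items, filters);
// this only shows the immutable-update shape of that approach.
interface ExecutionDef {
  readonly items: readonly string[];
  readonly filters: readonly string[];
}

function withFilters(def: ExecutionDef, filters: readonly string[]): ExecutionDef {
  // Produce a fresh definition; the previous one is left untouched.
  return { ...def, filters };
}

const base: ExecutionDef = { items: ["Revenue", "Month"], filters: [] };
const filtered = withFilters(base, ["Region = EMEA"]);
// base keeps its empty filter list; filtered carries the new filter.
```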
e
Mostly a cleanliness and organization issue, or I guess it would be more accurate to say it’s a compatibility thing? I’m using ag-grid to visualize the data sourced with GoodData, and its implementation of server-side filtering/sorting is to have a mostly static “data source” with a `getRows(params)` callback, where `params` contains the sort, filter, and infinite scroll position information. This made it very simple to get a new set of rows as the user scrolls, but when it came to applying ag-grid-defined sorts and filters I had to build a new execution first before actually performing the read query.
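The glue Evan describes boils down to translating ag-grid's requested row range into a GoodData window. A sketch of that translation, assuming ag-grid's convention that `startRow` is inclusive and `endRow` is exclusive; `toWindow` is my own name, and the column count is assumed known from the execution definition.

```ts
// Sketch: map ag-grid's requested row range onto the [offset, size]
// window used by result.readWindow. Names here are illustrative.
function toWindow(
  startRow: number, // inclusive, from ag-grid's getRows params
  endRow: number,   // exclusive, from ag-grid's getRows params
  columnCount: number
): { offset: [number, number]; size: [number, number] } {
  return {
    offset: [startRow, 0],
    size: [endRow - startRow, columnCount],
  };
}

// Inside getRows(params) one could then do (not executed here):
// const { offset, size } = toWindow(params.startRow, params.endRow, columnCount);
// const page = await result.readWindow(offset, size);
```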
j
Ah, I see. I'll make a note of this and pass it to our devs, and if there are more similar requests in the future we will consider a different approach. But I hear you were able to make it work, so there most likely won't be a follow-up in the near future. Thank you for your feedback @Evan Shen!
e
no problem!
another follow-up: I switched to using `forBuckets` instead of `forInsightByRef`, since I want to be able to dynamically add and remove attributes/measures in the execution. How do I convert an ICatalogFact and an ICatalogAttribute into measures and attributes to use with `forBuckets`?
j
Hello @Evan Shen, instead of `forBuckets` I would recommend trying `forItems`; it could look something like this:
```ts
const measure = Ldm.Revenue;
// const measureFromFact = Ldm.Budget.Sum;
const viewBy = Ldm.DateDatasets.Date.Month.Short;
const stackBy = Ldm.ProductCategory;

const result = await backend
    .workspace(workspace)
    .execution()
    .forItems([measure, viewBy, stackBy], [dateFilter, attributeFilter])
    .withDimensions(...newTwoDimensional([viewBy], [MeasureGroupIdentifier, stackBy]))
    .execute();
```
The `Ldm` variable is a product of Catalog Export; please read https://sdk.gooddata.com/gooddata-ui/docs/export_catalog.html for more.
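On the original ICatalogFact/ICatalogAttribute question: if I recall the `@gooddata/sdk-model` helpers correctly, the conversion goes through `newAttribute` and `newMeasure` (an attribute is executed via its default display form; a fact becomes a measure once you pick an aggregation). The sketch below uses simplified stand-in shapes so it is self-contained; the real types and helper signatures should be checked against the sdk-model docs.

```ts
// Sketch with simplified stand-ins for the catalog types. In the real SDK
// the equivalents would be roughly newAttribute(attr.defaultDisplayForm.ref)
// and newMeasure(fact.fact.ref, m => m.aggregation("sum")) — verify against
// the @gooddata/sdk-model documentation.
interface CatalogAttributeLike {
  defaultDisplayForm: { id: string };
}
interface CatalogFactLike {
  fact: { id: string };
}

// Attributes are executed via a display form, not the attribute itself.
function toAttributeItem(attr: CatalogAttributeLike) {
  return { attribute: { displayFormId: attr.defaultDisplayForm.id } };
}

// Facts only become measures once you choose an aggregation, e.g. "sum".
function toMeasureItem(fact: CatalogFactLike, aggregation = "sum") {
  return { measure: { itemId: fact.fact.id, aggregation } };
}

const items = [
  toAttributeItem({ defaultDisplayForm: { id: "label.product.category" } }),
  toMeasureItem({ fact: { id: "fact.revenue" } }),
];
```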
e
Ok, I was able to get it to work. Thanks again!