Automated migration of dashboards / insights to use new attributes

  • 21 June 2023
  • 6 replies


As I have updated my data model a bit, I’d like now to migrate all the dashboards to use new attributes in place of the old ones.

What can be done to automate it, at least partially? The article shows the usage of the `` endpoint, so I can check which old attributes are used where. Unfortunately, I couldn't find any documentation page describing what the categories of the returned entities mean. Some are self-descriptive, but not all.

So, knowing how my old attributes are used and which attributes they should be changed to, what are the options, preferably via the Ruby SDK or grey pages endpoints, to update the existing dashboards/insights/metrics so it doesn't have to be done manually?





Best answer by Anonymous 21 June 2023, 16:20


Hi Hanna!

The usedby API endpoint will return all metadata objects associated with the object you are querying, akin to what is specified in our API documentation here: However, the SDK will conveniently wrap each of them in a Ruby object, an instance of one of the classes shipped with the SDK. Therefore, to determine exactly what object is being returned, you can rely on our API documentation. Alternatively, you can call <object>.class within your Ruby script to determine its type, and print it out or save this information to a file while looking for dependencies.
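For instance, once you have the array of wrapped objects back from usedby, a quick way to see what types you are dealing with is to group them by class. The sketch below uses stand-in structs in place of the real SDK classes, so it only illustrates the grouping idea:

```ruby
# Stand-in structs simulating SDK metadata classes; in a real script these
# would be instances such as GoodData::Metric returned by <object>.usedby.
Metric = Struct.new(:identifier)
Report = Struct.new(:identifier)

used_by = [Metric.new('metric.one'), Report.new('report.one'), Metric.new('metric.two')]

# Group the dependencies by their Ruby class to see which types are involved
by_type = used_by.group_by(&:class)
by_type.each do |klass, objects|
  puts "#{klass}: #{objects.map(&:identifier).join(', ')}"
end
```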

If I understand correctly, you are trying to replace old attributes with new ones in your model. There is no straightforward way to perform a bulk replacement everywhere, but I would follow these steps:

  1. Run a dependency analysis for all old attributes you are trying to replace, i.e. saving the response from usedby for every old attribute.
    1. Focus on dashboardFilter, userFilter, filterContext, filter and metric objects, especially their content (expression).
  2. Prepare a 1:1 mapping of the objects you wish to replace.
    1. You will need to replace every metric or filter expression with the new attribute.
  3. Iterate over the objects while updating and saving their new definition (with new attribute in place).
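At the core of steps 2 and 3 is a plain string substitution over each object's expression. A minimal sketch with hypothetical URIs (the real ones would come from your dependency analysis):

```ruby
# Hypothetical old->new URI mapping; the real URIs come from your project metadata
mapping = {
  '/gdc/md/project_id/obj/100' => '/gdc/md/project_id/obj/200'
}

# A metric expression referencing the old attribute
expression = 'SELECT COUNT([/gdc/md/project_id/obj/100])'

# Apply every mapping pair to the expression
migrated = mapping.reduce(expression) do |expr, (old_uri, new_uri)|
  expr.gsub(old_uri, new_uri)
end

puts migrated  # SELECT COUNT([/gdc/md/project_id/obj/200])
```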

Please let me know if you find this helpful.

Thank you!

Hi Marek,

Thanks for the steps. As I understand it, step 3 has to be done manually; steps 1 and 2 will be rather straightforward.



Hi Hanna,

You are most welcome!

The third step can also be done via the Ruby SDK; I would suggest iterating over an array of the metadata objects, editing their expressions and saving them. Once you have the mapping ready, it can be done automatically. I'm pasting a code snippet you might find useful. It is not complete, so it will not work out of the box, but it should give you a general idea of how to approach this:

# prepare a hash that maps each old attribute to its new counterpart
# (the actual attribute objects would come from the SDK)
mapping = {}

# grab all metrics whose expression references an old attribute
# (this can also be an iterative step looping through all old attributes)
new_metrics = []
project.metrics.each { |m| new_metrics << m if m.expression.include?('old_object_identifier') }

# find a match, replace the expression and save the metric
mapping.each do |old_attr, new_attr|
  new_metrics.each do |m|
    if m.expression.include?(old_attr.uri)
      m.expression = m.expression.sub(old_attr.uri, new_attr.uri)
      m.save
    end
  end
end

Hope this helps.


Hi Marek,

That is really helpful! Is there a similar way of updating filters or insights? I couldn't find one.



Ah, and since some attributes are in new datasets, it would be a safer approach to prepare a copy of the original metrics/insights to ensure everything works correctly with the new attributes. Can all objects be cloned?



Hi again Hanna,

Glad you found it helpful! You can repeat the process analogously for all the metadata objects mentioned in my comment above (dashboardFilter, userFilter, filterContext, filter). Alternatively, you can just clump them all together and update them in bulk (that is, if those objects support all the methods you will be calling during the iteration).
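One way to clump them together is a small helper that applies the mapping to any object exposing an editable expression. Whether every collection's objects support `expression` and `save` depends on your SDK version, hence the `respond_to?` guards; the struct below is a stand-in for illustration only:

```ruby
# Apply an old->new URI mapping to every object that exposes an editable
# expression; `save` (if present) would persist the change via the SDK.
def migrate_expressions(objects, mapping)
  objects.each do |obj|
    next unless obj.respond_to?(:expression) && obj.expression
    updated = mapping.reduce(obj.expression) { |expr, (old_uri, new_uri)| expr.gsub(old_uri, new_uri) }
    next if updated == obj.expression
    obj.expression = updated
    obj.save if obj.respond_to?(:save)  # real SDK objects persist themselves here
  end
end

# Illustration with a stand-in struct instead of real SDK metadata objects
FilterStub = Struct.new(:expression)
filters = [FilterStub.new('[/gdc/md/p/obj/1] = 42')]
migrate_expressions(filters, '/gdc/md/p/obj/1' => '/gdc/md/p/obj/9')
puts filters.first.expression  # [/gdc/md/p/obj/9] = 42
```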

As for cloning, you can clone the project in its entirety (that means, including all metadata). The easiest way to do that is via GoodData Extension Tool:

  1. Obtain your authorization token:
    1. Navigate to the project you wish to clone.
    2. Click on GD Extension Tool (let’s just call it ‘extension’) and then navigate to Workspace > Info.
    3. The grey page resource will display a response from the API endpoint in JSON format. Your authorization token can be found under "project" : {"content" : { "authorizationToken":...}}
  2. Prepare the original project for export:
    1. Navigate to the project you wish to clone.
    2. Click on the extension and then navigate to Workspace > Export.
    3. Based on whether you wish to export the project with its data, users or email schedules, mark the respective tickboxes. However, for development purposes I suggest copying just the metadata.
    4. Hit submit and save the import token that will be displayed under { "exportArtifact" : { "token" : "actual_token" } }.
  3. Clone the project:
    1. Wait until the task from step 2 is finished (you can navigate to the URI of the export task shown above the export token).
    2. Click on the extension and then navigate to Workspace > New.
    3. In the dialogue, enter the name of your cloned project along with your respective authorization and export tokens. You can optionally add a description of the workspace as well.
    4. Click on submit. A URL with the new project ID will be displayed, where you can check the status of the new project once you click it.
    5. Once the status is enabled, your project has been successfully cloned.

It is also possible to do this programmatically, either via the Ruby SDK or by calling the respective API endpoints with a tool or language of your choice. More information is available here:
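For reference, the programmatic route follows the same export/import flow as the extension: POSTing a payload like the one below to the project's /maintenance/export resource returns the exportArtifact token mentioned in step 2. The flag names are shown as in the standard API; please verify them against the documentation for your platform version:

```json
{
  "exportProject": {
    "exportUsers": 0,
    "exportData": 0
  }
}
```

Setting both flags to 0 copies metadata only, matching the development-purposes suggestion above.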