# gooddata-cn
Hello team, I am planning to migrate a self-hosted deployment of GoodData.CN Community Edition (the all-in-one Docker image) to the recommended K8s installation. However, I am concerned about dumping the database data (e.g., `pg_dump` on the first deployment and `pg_restore` on the second) because of possible incompatibilities. Is it safe to perform the migration? Do you have any recommendations?

Additional information:
• My GoodData.CN Community Edition deployment is using `gooddata/gooddata-cn-ce:2.2`
• I am following this link for the K8s chart installation with Helm

Thank you!
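For context, this is roughly the dump/restore procedure I have in mind, written as a small Python wrapper for illustration. The container name, database name, Postgres user, and target host are placeholders/assumptions, not values I have confirmed for the CE image:

```python
# Minimal sketch of the dump/restore idea, assuming the metadata lives in a
# Postgres database inside the all-in-one CE container. All names in angle
# brackets are placeholders that must be verified.
import subprocess

CE_CONTAINER = "<gooddata-cn-ce-container>"  # hypothetical container name
DB_USER = "<db-user>"                        # placeholder Postgres user
DB_NAME = "<metadata-db>"                    # placeholder metadata database name
DUMP_FILE = "gooddata-metadata.dump"

# Dump the metadata database from the embedded Postgres in the CE container.
subprocess.run(
    ["docker", "exec", CE_CONTAINER,
     "pg_dump", "-U", DB_USER, "-Fc", "-d", DB_NAME, "-f", f"/tmp/{DUMP_FILE}"],
    check=True,
)
subprocess.run(
    ["docker", "cp", f"{CE_CONTAINER}:/tmp/{DUMP_FILE}", DUMP_FILE],
    check=True,
)

# Restore into the Postgres instance used by the K8s installation
# (connection details are assumptions; e.g. via a port-forward).
subprocess.run(
    ["pg_restore", "-h", "<k8s-postgres-host>", "-U", DB_USER,
     "-d", DB_NAME, "--clean", "--if-exists", DUMP_FILE],
    check=True,
)
```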
How many users did you create in the Community Edition deployment? How many data sources did you register there? Do you plan to use Dex in the K8s deployment too, or do you plan to switch to an OIDC provider?
I created around 20 users and a single data source. As far as I know, the development team has changed the original authentication method to a private/internal authentication provider, so I suspect that we will move forward with the same method.
Why did I ask these questions? Because you could work around this by exporting/importing metadata using the declarative APIs (/api/v1/layout), or even better the corresponding Python SDK methods. The only tricky part is transferring sensitive properties like user passwords and data source passwords. If you no longer use Dex, you only need to inject one data source password, which should be very easy. If you consider this approach, I can help you implement it.
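For illustration, here is a minimal sketch of that approach using the Python SDK (the gooddata-sdk package). The hosts and tokens are placeholders, and the exact method names and credentials handling should be double-checked against the SDK documentation — treat them as assumptions:

```python
# Minimal sketch: copy declarative metadata from the CE deployment to the K8s
# deployment. Hosts and tokens below are placeholders, not real values.
from gooddata_sdk import GoodDataSdk

SOURCE_HOST = "http://localhost:3000"          # hypothetical CE (all-in-one) endpoint
TARGET_HOST = "https://gooddata.example.com"   # hypothetical K8s deployment endpoint
SOURCE_TOKEN = "<source api token>"
TARGET_TOKEN = "<target api token>"

source = GoodDataSdk.create(SOURCE_HOST, SOURCE_TOKEN)
target = GoodDataSdk.create(TARGET_HOST, TARGET_TOKEN)

# 1. Copy data source definitions. Passwords are not exported, so the single
#    data source password has to be re-injected on the target side.
data_sources = source.catalog_data_source.get_declarative_data_sources()
target.catalog_data_source.put_declarative_data_sources(data_sources)

# 2. Copy user groups and users.
user_groups = source.catalog_user.get_declarative_user_groups()
target.catalog_user.put_declarative_user_groups(user_groups)
users = source.catalog_user.get_declarative_users()
target.catalog_user.put_declarative_users(users)

# 3. Copy all workspaces (logical data model, metrics, dashboards, ...).
workspaces = source.catalog_workspace.get_declarative_workspaces()
target.catalog_workspace.put_declarative_workspaces(workspaces)
```

As noted above, the sensitive properties are the one part this does not carry over: the data source password must be set again on the target, and user passwords would matter only if you kept the built-in Dex.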
Thank you for the recommendation @Jan Soubusta. I'll give you more feedback after deciding internally with the team. They seem to be having trouble with some APIs under /api/v1/layout when filtering data, but I am not involved in that part.
Personally, I recommend using the Python SDK instead of the raw APIs. We can quickly fix anything that is not working ideally in the Python SDK 😉
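For comparison, this is roughly what fetching the declarative workspace layout looks like via the raw API versus the SDK; the host, token, and exact sub-path under /api/v1/layout are assumptions:

```python
# Raw API call vs. the SDK equivalent for reading the declarative workspace layout.
import requests
from gooddata_sdk import GoodDataSdk

HOST = "https://gooddata.example.com"   # hypothetical endpoint
TOKEN = "<api token>"                   # placeholder, do not hardcode real tokens

# Raw API: you handle auth headers, paths, and (de)serialization yourself.
resp = requests.get(
    f"{HOST}/api/v1/layout/workspaces",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()
raw_layout = resp.json()

# Python SDK: the same layout returned as typed objects, with auth handled for you.
sdk = GoodDataSdk.create(HOST, TOKEN)
sdk_layout = sdk.catalog_workspace.get_declarative_workspaces()
```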
Great, I'll pass them this information!