Solved

GD Cloud Native Docker - connectivity issue (timeout)

  • 2 December 2021
  • 7 replies
  • 246 views

I installed Cloud Native GoodData, Community edition. I followed instructions from here: https://hub.docker.com/r/gooddata/gooddata-cn-ce/

I am able to run the docker container and access the application. I was also able to add demo-ds datasource, scan it and create some model out of it.

However, when I try to connect to either an on-premise Postgres database or a Snowflake instance, the connections to those data sources always time out. Any suggestions on how to get around this?

Thank you, 

Aleks

Best answer by Robert Moucha 13 December 2021, 19:28

Update #1.

I was able to connect to an on-premise Postgres database. I tested it via http://localhost:3000/api/actions/dataSource/test and got a success message. However, after setting up the data source and trying to scan it, the Nginx server times out after one minute. Any ideas how to extend this timeout?
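For context, a 60-second cutoff matches Nginx's default `proxy_read_timeout`. In a stock Nginx configuration, the limit for the upstream on port 9060 would be raised roughly like this; whether the bundled config inside the gooddata-cn-ce image can be edited this way is an assumption, so treat this as a general Nginx sketch rather than a supported GoodData.CN setting:

```nginx
# General nginx sketch -- the path and overridability of this block inside
# the gooddata-cn-ce image are assumptions, not documented behavior.
location /api/ {
    proxy_pass http://127.0.0.1:9060;   # upstream used by the API
    proxy_read_timeout 300s;            # default is 60s
    proxy_send_timeout 300s;
}
```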

Hi Aleks,

Just guessing: it may be caused by too many tables in the schema you specify in the data source configuration. How many tables are in the schema?

Anyway, can you please use the “docker logs” command, or watch the logs produced in the terminal where you started GD.CN, and paste here any errors you find? Error logs related to GD.CN's communication with Postgres may appear after the timeout.

 

The Postgres schema in question has only 44 tables. It times out after 60 seconds. What is the proper way to configure a corporate proxy for your Docker container? Could it be ignoring Docker’s global proxy settings?

This is what I see in logs:

2021/12/03 13:41:10 [error] 18712#18712: *82 upstream timed out (110: Connection timed out) while reading response header from upstream, client: 172.17.0.1, server: localhost, request: "POST /api/actions/dataSources/MDS-dev/scan HTTP/1.1", upstream: "http://127.0.0.1:9060/api/actions/dataSources/MDS-dev/scan", host: "localhost:3000", referrer: "http://localhost:3000/modeler/"
2021/12/03 13:41:10 [error] 18712#18712: *82 open() "/usr/share/nginx/html/50x.html" failed (2: No such file or directory), client: 172.17.0.1, server: localhost, request: "POST /api/actions/dataSources/MDS-dev/scan HTTP/1.1", upstream: "http://127.0.0.1:9060/api/actions/dataSources/MDS-dev/scan", host: "localhost:3000", referrer: "http://localhost:3000/modeler/"
172.17.0.1 - - [03/Dec/2021:13:41:10 +0000] "POST /api/actions/dataSources/MDS-dev/scan HTTP/1.1" 404 193 "http://localhost:3000/modeler/" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.45 Safari/537.36"
ts="2021-12-03 13:41:11.001" level=ERROR msg="read datasource metadata" logger=com.gooddata.tiger.grpc.client.datasource.metadata.DataSourceMetadataClient thread=reactor-http-epoll-8 orgId=default spanId=6fa2c0b3eabe5882 traceId=6fa2c0b3eabe5882 userId=demo exc="kotlinx.coroutines.JobCancellationException: MonoCoroutine was cancelled; job=MonoCoroutine{Cancelling}@3558554b

Update #2.

This is the exception I get when trying to connect to Snowflake:

 

ts="2021-12-03 15:49:00.262" level=ERROR msg="HikariPool-4 - Exception during pool initialization." logger=com.zaxxer.hikari.pool.HikariPool thread=DefaultDispatcher-worker-1 orgId=default spanId=d2833cd0d49c359f traceId=100ca0c1f7035763 userId=admin exc="net.snowflake.client.jdbc.SnowflakeSQLException: JDBC driver encountered communication error. Message: Exception encountered for HTTP request: Connect to wd19745.us-east-2.aws.snowflakecomputing.com:443 [wd19745.us-east-2.aws.snowflakecomputing.com/3.130.46.178, wd19745.us-east-2.aws.snowflakecomputing.com/18.218.153.55, wd19745.us-east-2.aws.snowflakecomputing.com/3.133.26.128] failed: Connection refused (Connection refused).
        at net.snowflake.client.jdbc.RestRequest.execute(RestRequest.java:317)
        at net.snowflake.client.core.HttpUtil.executeRequestInternal(HttpUtil.java:550)
        at net.snowflake.client.core.HttpUtil.executeRequest(HttpUtil.java:491)
        at net.snowflake.client.core.HttpUtil.executeGeneralRequest(HttpUtil.java:456)
        at net.snowflake.client.core.SessionUtil.newSession(SessionUtil.java:627)
        at net.snowflake.client.core.SessionUtil.openSession(SessionUtil.java:283)
        at net.snowflake.client.core.SFSession.open(SFSession.java:694)
        at net.snowflake.client.jdbc.SnowflakeConnectionV1.initialize(SnowflakeConnectionV1.java:172)
        at net.snowflake.client.jdbc.SnowflakeConnectionV1.<init>(SnowflakeConnectionV1.java:124)
        at net.snowflake.client.jdbc.SnowflakeDriver.connect(SnowflakeDriver.java:169)
        at com.zaxxer.hikari.util.DriverDataSource.getConnection(DriverDataSource.java:138)
        at com.zaxxer.hikari.pool.PoolBase.newConnection(PoolBase.java:364)
        at com.zaxxer.hikari.pool.PoolBase.newPoolEntry(PoolBase.java:206)
        at com.zaxxer.hikari.pool.HikariPool.createPoolEntry(HikariPool.java:476)
        at com.zaxxer.hikari.pool.HikariPool.checkFailFast(HikariPool.java:561)
        at com.zaxxer.hikari.pool.HikariPool.<init>(HikariPool.java:115)
        at com.zaxxer.hikari.HikariDataSource.<init>(HikariDataSource.java:81)
        at com.gooddata.tiger.sqlexecutor.db.ConnectedDataSource.<init>(ConnectedDataSource.kt:24)
        at com.gooddata.tiger.sqlexecutor.db.ConnectedDataSource.<init>(ConnectedDataSource.kt)
        at com.gooddata.tiger.sqlexecutor.db.ConnectedDataSource$Companion.createFromDataSource(ConnectedDataSource.kt:51)
        at com.gooddata.tiger.sqlexecutor.db.DataSourceMapper$getConnectedDataSource$1.invoke(DataSourceMapper.kt:50)
        at com.gooddata.tiger.sqlexecutor.db.DataSourceMapper$getConnectedDataSource$1.invoke(DataSourceMapper.kt:48)
        at com.gooddata.tiger.sqlexecutor.db.DataSourceMapper.getDatabaseVendor(DataSourceMapper.kt:67)
        at com.gooddata.tiger.sqlexecutor.db.DataSourceMapper.getConnectedDataSource(DataSourceMapper.kt:48)
        at com.gooddata.tiger.sqlexecutor.datasource.metadata.service.TestService$testDefinition$2$1.invokeSuspend(TestService.kt:34)
        at com.gooddata.tiger.sqlexecutor.datasource.metadata.service.TestService$testDefinition$2$1.invoke(TestService.kt)
        at com.gooddata.tiger.sqlexecutor.datasource.metadata.service.TestService$testDefinition$2$1.invoke(TestService.kt)
        at com.gooddata.tiger.sqlexecutor.datasource.metadata.service.TestService.wrapTest(TestService.kt:44)
        at com.gooddata.tiger.sqlexecutor.datasource.metadata.service.TestService.access$wrapTest(TestService.kt:19)
        at com.gooddata.tiger.sqlexecutor.datasource.metadata.service.TestService$testDefinition$2.invokeSuspend(TestService.kt:33)
        at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
        at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:56)
        at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:571)
        at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:738)
        at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:678)
        at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:665)
Caused by: net.snowflake.client.jdbc.internal.apache.http.conn.HttpHostConnectException: Connect to wd19745.us-east-2.aws.snowflakecomputing.com:443 [wd19745.us-east-2.aws.snowflakecomputing.com/3.130.46.178, wd19745.us-east-2.aws.snowflakecomputing.com/18.218.153.55, wd19745.us-east-2.aws.snowflakecomputing.com/3.133.26.128] failed: Connection refused (Connection refused)
        at net.snowflake.client.jdbc.internal.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:159)
        at net.snowflake.client.jdbc.internal.apache.http.impl.conn.PoolingHttpClientConnectionManager.connect(PoolingHttpClientConnectionManager.java:373)
        at net.snowflake.client.jdbc.internal.apache.http.impl.execchain.MainClientExec.establishRoute(MainClientExec.java:381)
        at net.snowflake.client.jdbc.internal.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:237)
        at net.snowflake.client.jdbc.internal.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:185)
        at net.snowflake.client.jdbc.internal.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
        at net.snowflake.client.jdbc.internal.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:111)
        at net.snowflake.client.jdbc.internal.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
        at net.snowflake.client.jdbc.internal.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
        at net.snowflake.client.jdbc.internal.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:108)
        at net.snowflake.client.jdbc.RestRequest.execute(RestRequest.java:174)
        ... 35 more
Caused by: java.net.ConnectException: Connection refused (Connection refused)
        at java.base/java.net.PlainSocketImpl.socketConnect(Native Method)
        at java.base/java.net.AbstractPlainSocketImpl.doConnect(Unknown Source)
        at java.base/java.net.AbstractPlainSocketImpl.connectToAddress(Unknown Source)
        at java.base/java.net.AbstractPlainSocketImpl.connect(Unknown Source)
        at java.base/java.net.SocksSocketImpl.connect(Unknown Source)
        at java.base/java.net.Socket.connect(Unknown Source)
        at net.snowflake.client.jdbc.internal.apache.http.conn.ssl.SSLConnectionSocketFactory.connectSocket(SSLConnectionSocketFactory.java:339)
        at net.snowflake.client.jdbc.internal.apache.http.impl.conn.DefaultHttpClientConnectionOperator.connect(DefaultHttpClientConnectionOperator.java:142)
        ... 45 more

OK, so this is definitely an issue related to network connectivity.

I do not have any issues when connecting from inside the Docker container (Community Edition) to e.g. Snowflake in AWS, but we do not have any restrictions in our company network.

Where exactly is your docker container running? Where exactly is your Postgres instance running? Could any kind of company network policy block the connection between the docker container and your data sources?

As a reminder, I have 2 connectivity issues:

  1. Snowflake. I wrote a Java program and was able to connect to Snowflake from my machine successfully. However, I still cannot connect from the GD Cloud Native Docker container. I tried passing our proxy to the container via “-e”, but then the container fails to start. What is the proper way to configure GD.CN to use a proxy, since it seems that Docker’s global settings are ignored?
  2. Postgres. I can connect to the Postgres server when I just test the connection via api/actions/dataSource/test, but it times out when I try to scan the same data source. To answer your questions: Postgres is on the same internal network as my machine, there are no restrictions/firewalls on that Postgres instance, and the GD.CN container is running on my laptop.
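For reference, the conventional way to hand proxy settings to a container is with `-e` environment variables on `docker run`. A minimal sketch, using a placeholder proxy at proxy.example.com:3128 (this is the standard Docker pattern, not a confirmed fix for gooddata-cn-ce):

```shell
# Sketch: pass proxy settings as container environment variables.
# proxy.example.com:3128 is a placeholder for your corporate proxy.
PROXY_URL="http://proxy.example.com:3128"
DOCKER_CMD="docker run -i -t -p 3000:3000 \
  -e HTTP_PROXY=$PROXY_URL -e HTTPS_PROXY=$PROXY_URL \
  -e NO_PROXY=localhost,127.0.0.1 \
  gooddata/gooddata-cn-ce:latest"
# Print the full command for review instead of running it here:
echo "$DOCKER_CMD"
```

Whether the image tolerates these variables at startup is exactly the open question in this thread.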

Hi, if I understand correctly, you’re behind some corporate proxy, so you can’t access remote resources directly?

There’s no explicit support for proxied connections in GoodData.CN, and I believe it will not work out of the box even if you try to set the usual proxy-related variables.
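One plausible reason the usual variables have no effect: the backend services are JVM-based (visible in the Kotlin/HikariCP stack traces above), and standard Java networking reads proxy settings from system properties, not from `HTTP_PROXY`/`HTTPS_PROXY`. Whether gooddata-cn-ce offers any way to inject JVM options is an assumption; the sketch below only shows the standard Java-side properties such a mechanism would need to set:

```shell
# Standard JVM proxy system properties. Injecting these into the
# gooddata-cn-ce services is assumed to be possible here -- that is NOT
# documented behavior; this only illustrates the Java side of the issue.
JVM_PROXY_OPTS="-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=3128 -Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=3128 -Dhttp.nonProxyHosts=localhost|127.0.0.1"
echo "$JVM_PROXY_OPTS"
```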

 
