# gooddata-cn
p
Hi, I’m trying to “scan my data source (Postgres)” via the UI, but it times out after ~1 minute, which I believe is too short for my database:
```json
{
  "abeType": "UE",
  "cause": {
    "message": "Request failed with status code 504",
    "name": "Error",
    "stack": "Error: Request failed with status code 504\n at e.exports (<http://localhost:3000/modeler/app.dd9b3208b967c638b221.js:2:2729724>)\n at e.exports (<http://localhost:3000/modeler/app.dd9b3208b967c638b221.js:2:2732147>)\n at XMLHttpRequest.g (<http://localhost:3000/modeler/app.dd9b3208b967c638b221.js:2:2725044>)",
    "config": {
      "url": "/api/v1/actions/dataSources/4c931590-8678-4292-8168-dea3969b1de1/scan",
      "method": "post",
      "data": "{\"scanTables\":true,\"scanViews\":false,\"separator\":\"__\",\"tablePrefix\":\"\",\"viewPrefix\":\"\",\"schemata\":[]}",
      "headers": {
        "Accept": "application/json, text/plain, */*",
        "X-Requested-With": "XMLHttpRequest",
        "X-GDC-JS-PACKAGE": "gdc-msf-modeler",
        "X-GDC-JS-PACKAGE-VERSION": "f87c2fcb13eda46b8497e20213cf382a544b6f0d",
        "Content-Type": "application/json"
      },
      "transformRequest": [
        null
      ],
      "transformResponse": [
        null
      ],
      "timeout": 0,
      "withCredentials": true,
      "xsrfCookieName": "XSRF-TOKEN",
      "xsrfHeaderName": "X-XSRF-TOKEN",
      "maxContentLength": -1,
      "maxBodyLength": -1,
      "transitional": {
        "silentJSONParsing": true,
        "forcedJSONParsing": true,
        "clarifyTimeoutError": false
      }
    }
  }
}
```
Is there any setting to increase the timeout?
j
Not sure if it is possible to tweak the scan timeout - @Pavel Cerny, do you know? Anyway, a scan on PostgreSQL is usually very fast. How many tables do you have in this database?
There are options to configure the scan so that fewer tables are involved, e.g. you can specify tablePrefix (a sketch of such a request is below). Not sure if it could help in your case.
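For illustration, a prefix-limited scan request could look like the sketch below. The payload fields come from the failing request above; the host, API token, and prefix value are placeholders.
```python
# Hypothetical sketch of the scan call with a tablePrefix filter.
# Field names mirror the payload of the failing request; the token,
# host, and "analytics_" prefix are illustrative placeholders.
import requests

resp = requests.post(
    "http://localhost:3000/api/v1/actions/dataSources"
    "/4c931590-8678-4292-8168-dea3969b1de1/scan",
    headers={"Authorization": "Bearer <api-token>"},  # placeholder credential
    json={
        "scanTables": True,
        "scanViews": False,
        "separator": "__",
        "tablePrefix": "analytics_",  # only tables with this prefix are scanned
        "viewPrefix": "",
        "schemata": [],
    },
)
resp.raise_for_status()
print(resp.json())
```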
p
Which environment have you used?
j
I see localhost:3000, so community edition on localhost. The question is where the PostgreSQL database you are trying to scan is running: also on localhost, or somewhere in a cloud (far from your localhost)? Latency can also play a significant role.
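A rough way to measure that round trip (all connection parameters below are placeholders): a metadata scan issues many small queries, so per-query latency multiplies across the whole operation.
```python
# Time a trivial query to estimate round-trip latency to the database.
# Host, credentials, and database name are placeholders.
import time

import psycopg2

conn = psycopg2.connect(host="db.example.com", port=5432,
                        dbname="mydb", user="scanner", password="secret")
cur = conn.cursor()
start = time.perf_counter()
cur.execute("SELECT 1")
cur.fetchone()
print(f"round trip: {(time.perf_counter() - start) * 1000:.1f} ms")
conn.close()
```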
p
1 minute is the timeout in nginx. You would need to tweak nginx.conf to prolong it, which I do not recommend. As @Jan Soubusta wrote, scanning PG metadata is typically fast, but it depends on the number of objects in the data source schema. By chance we have recently optimized the PG scan, which also greatly reduces the effect of latency on the operation. The optimization is available in the latest development community edition Docker image, so you can give it a try. Another option is to limit the number of objects to scan with tablePrefix, as was already advised.
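If you do decide to raise it anyway, the relevant knobs are nginx’s proxy timeout directives. This is only a sketch (the location block, upstream name, and values are illustrative, not the shipped nginx.conf); note that nginx’s default proxy_read_timeout of 60s matches the ~1 minute cut-off seen above.
```nginx
# Illustrative only - adjust the actual location/upstream in your nginx.conf.
location /api/ {
    proxy_pass http://gooddata-cn-backend;  # placeholder upstream
    proxy_read_timeout 300s;   # default is 60s; allow longer scans
    proxy_send_timeout 300s;
}
```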
r
It's possible the DB is not accessible from inside the container and packets are silently dropped by some network device. Did you successfully test the connection when creating the data source?
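A quick raw-reachability check (host and port are placeholders for your data source settings): if a network device silently drops packets, this tends to hang until the timeout instead of failing fast with "connection refused".
```python
# Minimal TCP reachability check for the database endpoint.
# A silently dropping network device usually shows up as a timeout here.
import socket

try:
    with socket.create_connection(("db.example.com", 5432), timeout=5) as s:
        print("TCP connection OK:", s.getpeername())
except OSError as exc:
    print("connection failed:", exc)
```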
p
I’ve managed to scan the data source by specifying a table prefix and prefix separator. I suspect the number of tables was the issue. Thanks guys 🙂
j
Btw, we are currently working on performance tuning of the scan. Specifically, in the case of PostgreSQL, a scan of 1000 tables used to take 106 seconds and now takes 4 seconds. The improvement should be released soon.
🙌 1