
bug: Internal server error when ADMIN_ENDPOINT is set #6755

Open
2 of 4 tasks
sneakylenny opened this issue Oct 30, 2024 · 1 comment
Labels
bug Something isn't working

Comments


sneakylenny commented Oct 30, 2024

Describe the bug

I was trying to host Logto locally behind a proxy, using Nginx Proxy Manager. I followed the deployment steps in the docs as a reference, but I could not get it to work.

Expected behavior

I expected to be able to host it behind a proxy in a couple of minutes, but this was not the case.
After receiving errors, I expected to be able to check logs or at least get a more useful error message.

How to reproduce?

I used "localhost" as the domain; any other domain yields the same result.

  1. I added Logto to my docker compose:

    compose.yml
    services:
      proxy:
        image: jc21/nginx-proxy-manager:2.12.1
        restart: unless-stopped
        ports:
          - '80:80'
          - '81:81'
          - '443:443'
        volumes:
          - ./nginx/data:/data
          - ./nginx/letsencrypt:/etc/letsencrypt
    
      logto:
        image: svhd/logto:1.21.0
        depends_on:
            postgres:
              condition: service_healthy
        entrypoint: ["sh", "-c", "npm run cli db seed -- --swe && npm start"]
        ports:
          - 3001:3001
          - 3002:3002
        environment:
          - TRUST_PROXY_HEADER=1
          - DB_URL=postgres://postgres:p0stgr3s@postgres:5432/logto
          # Mandatory for GitPod to map host env to the container, thus GitPod can dynamically configure the public URL of Logto;
          # Or, you can leverage it for local testing.
          - ENDPOINT=https://auth.localhost/
          - ADMIN_ENDPOINT=https://test.localhost/
    
      postgres:
        image: postgres:14-alpine
        user: postgres
        environment:
          POSTGRES_USER: postgres
          POSTGRES_PASSWORD: p0stgr3s
        healthcheck:
          test: ["CMD-SHELL", "pg_isready"]
          interval: 10s
          timeout: 5s
          retries: 5
        volumes:
          - postgres-data:/var/lib/postgresql/data    
    
    volumes:
      postgres-data:
  2. I added Logto to the proxy:

    • Upload an SSL certificate on the "SSL Certificates" tab by clicking "Add SSL Certificate" and "custom" (I used a self-signed certificate for development purposes)

    • Go to hosts tab "Proxy Hosts"

    • Click on "Add proxy host"

    • Enter "auth.localhost" into domain names

    • Enter the name of the service (logto) as Forward Hostname / IP and the container port as Forward Port

    • Enable Force SSL on the SSL tab

    • Click save

    • Add "test.localhost" the same way you added "auth.localhost"

    • These were my resulting configs:

      Proxy for auth.localhost
      # ------------------------------------------------------------
      # auth.localhost
      # ------------------------------------------------------------
      
      map $scheme $hsts_header {
        https "max-age=63072000; preload";
      }
      
      server {
        set $forward_scheme http;
        set $server "logto";
        set $port 5071;
      
        listen 80;
        listen [::]:80;
      
        listen 443 ssl;
        listen [::]:443 ssl;
      
        server_name auth.localhost;
      
        # Custom SSL
        ssl_certificate /data/custom_ssl/npm-2/fullchain.pem;
        ssl_certificate_key /data/custom_ssl/npm-2/privkey.pem;
      
        # Force SSL
        include conf.d/include/force-ssl.conf;
      
        access_log /data/logs/proxy-host-3_access.log proxy;
        error_log /data/logs/proxy-host-3_error.log warn;
      
        location / {
          # Proxy!
          include conf.d/include/proxy.conf;
        }
      
        # Custom
        include /data/nginx/custom/server_proxy[.]conf;
      }
      Proxy for test.localhost
      # ------------------------------------------------------------
      # test.localhost
      # ------------------------------------------------------------
      
      map $scheme $hsts_header {
        https "max-age=63072000; preload";
      }
      
      server {
        set $forward_scheme http;
        set $server "logto";
        set $port 3002;
      
        listen 80;
        listen [::]:80;
      
        listen 443 ssl;
        listen [::]:443 ssl;
      
        server_name test.localhost;
      
        # Custom SSL
        ssl_certificate /data/custom_ssl/npm-2/fullchain.pem;
        ssl_certificate_key /data/custom_ssl/npm-2/privkey.pem;
      
        access_log /data/logs/proxy-host-5_access.log proxy;
        error_log /data/logs/proxy-host-5_error.log warn;
      
        location / {
          # Proxy!
          include conf.d/include/proxy.conf;
        }
      
        # Custom
        include /data/nginx/custom/server_proxy[.]conf;
      }
  3. Registering and logging in was possible on test.localhost, but on the dashboard (and any other route) I got error toast "Internal server error."

  4. Checking the network debug tools I see that a request to auth.localhost is causing it:

    (screenshot of the network debug tools showing the failing requests)

  5. Manually checking one of the requests, for example the first https://auth.localhost/api/dashboard/users/total, results in:

    {"message":"Internal server error."}

    (Same result when bypassing the proxy on http://localhost:3001/api/dashboard/users/total)

I noticed that when I don't define ADMIN_ENDPOINT, I get a different error:

- - ADMIN_ENDPOINT=https://test.localhost/
+ - ADMIN_ENDPOINT

Now when hitting https://auth.localhost/api/dashboard/users/total I get:

{"message":"Authorization header is missing.","code":"auth.authorization_header_missing"}

Which, to me, is a clear indication of what's wrong, but now I can't go to the admin console to log in, as I will run into CSP and SSL errors.

I tried the following without success:

  • I tried adding this manually but got the same result:
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto https;
  • I tried deleting the database volume
  • I tried different domain and subdomains
  • I tried searching the issues for similar errors

If I don't use any proxy at all and don't set any ENDPOINT variable everything works fine as expected.

Here's a log of my docker-compose:

postgres-1  | 
postgres-1  | PostgreSQL Database directory appears to contain a database; Skipping initialization
postgres-1  | 
postgres-1  | 2024-10-30 15:40:34.176 UTC [1] LOG:  starting PostgreSQL 14.13 on aarch64-unknown-linux-musl, compiled by gcc (Alpine 13.2.1_git20240309) 13.2.1 20240309, 64-bit
postgres-1  | 2024-10-30 15:40:34.176 UTC [1] LOG:  listening on IPv4 address "0.0.0.0", port 5432
postgres-1  | 2024-10-30 15:40:34.176 UTC [1] LOG:  listening on IPv6 address "::", port 5432
postgres-1  | 2024-10-30 15:40:34.177 UTC [1] LOG:  listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
postgres-1  | 2024-10-30 15:40:34.178 UTC [14] LOG:  database system was interrupted; last known up at 2024-10-30 15:39:22 UTC
proxy-1     | ❯ Configuring npm user ...
proxy-1     | useradd warning: npm's uid 0 outside of the UID_MIN 1000 and UID_MAX 60000 range.
proxy-1     | ❯ Configuring npm group ...
proxy-1     | ❯ Checking paths ...
proxy-1     | ❯ Setting ownership ...
postgres-1  | 2024-10-30 15:40:34.237 UTC [14] LOG:  database system was not properly shut down; automatic recovery in progress
postgres-1  | 2024-10-30 15:40:34.238 UTC [14] LOG:  redo starts at 0/199B9B0
postgres-1  | 2024-10-30 15:40:34.238 UTC [14] LOG:  invalid record length at 0/19A4F60: wanted 24, got 0
postgres-1  | 2024-10-30 15:40:34.238 UTC [14] LOG:  redo done at 0/19A4F28 system usage: CPU: user: 0.00 s, system: 0.00 s, elapsed: 0.00 s
postgres-1  | 2024-10-30 15:40:34.245 UTC [1] LOG:  database system is ready to accept connections
proxy-1     | ❯ Dynamic resolvers ...
proxy-1     | ❯ IPv6 ...
proxy-1     | Enabling IPV6 in hosts in: /etc/nginx/conf.d
proxy-1     | - /etc/nginx/conf.d/default.conf
proxy-1     | - /etc/nginx/conf.d/production.conf
proxy-1     | - /etc/nginx/conf.d/include/ssl-ciphers.conf
proxy-1     | - /etc/nginx/conf.d/include/log.conf
proxy-1     | - /etc/nginx/conf.d/include/ip_ranges.conf
proxy-1     | - /etc/nginx/conf.d/include/letsencrypt-acme-challenge.conf
proxy-1     | - /etc/nginx/conf.d/include/proxy.conf
proxy-1     | - /etc/nginx/conf.d/include/block-exploits.conf
proxy-1     | - /etc/nginx/conf.d/include/force-ssl.conf
proxy-1     | - /etc/nginx/conf.d/include/assets.conf
proxy-1     | - /etc/nginx/conf.d/include/resolvers.conf
proxy-1     | Enabling IPV6 in hosts in: /data/nginx
proxy-1     | - /data/nginx/proxy_host/5.conf
proxy-1     | - /data/nginx/proxy_host/3.conf
proxy-1     | - /data/nginx/proxy_host/2.conf
proxy-1     | ❯ Docker secrets ...
proxy-1     | 
proxy-1     | -------------------------------------
proxy-1     |  _   _ ____  __  __
proxy-1     | | \ | |  _ \|  \/  |
proxy-1     | |  \| | |_) | |\/| |
proxy-1     | | |\  |  __/| |  | |
proxy-1     | |_| \_|_|   |_|  |_|
proxy-1     | -------------------------------------
proxy-1     | User:  npm PUID:0 ID:0 GROUP:0
proxy-1     | Group: npm PGID:0 ID:0
proxy-1     | -------------------------------------
proxy-1     | 
proxy-1     | ❯ Starting backend ...
proxy-1     | ❯ Starting nginx ...
proxy-1     | [10/30/2024] [3:40:34 PM] [Global   ] › ℹ  info      Using Sqlite: /data/database.sqlite
proxy-1     | [10/30/2024] [3:40:34 PM] [Migrate  ] › ℹ  info      Current database version: none
proxy-1     | [10/30/2024] [3:40:34 PM] [Setup    ] › ℹ  info      Logrotate Timer initialized
proxy-1     | [10/30/2024] [3:40:34 PM] [Global   ] › ⬤  debug     CMD: logrotate /etc/logrotate.d/nginx-proxy-manager
proxy-1     | [10/30/2024] [3:40:34 PM] [Setup    ] › ℹ  info      Logrotate completed.
proxy-1     | [10/30/2024] [3:40:34 PM] [IP Ranges] › ℹ  info      Fetching IP Ranges from online services...
proxy-1     | [10/30/2024] [3:40:34 PM] [IP Ranges] › ℹ  info      Fetching https://ip-ranges.amazonaws.com/ip-ranges.json
proxy-1     | [10/30/2024] [3:40:34 PM] [IP Ranges] › ℹ  info      Fetching https://www.cloudflare.com/ips-v4
proxy-1     | [10/30/2024] [3:40:34 PM] [IP Ranges] › ℹ  info      Fetching https://www.cloudflare.com/ips-v6
proxy-1     | [10/30/2024] [3:40:34 PM] [SSL      ] › ℹ  info      Let's Encrypt Renewal Timer initialized
proxy-1     | [10/30/2024] [3:40:34 PM] [SSL      ] › ℹ  info      Renewing SSL certs expiring within 30 days ...
proxy-1     | [10/30/2024] [3:40:34 PM] [IP Ranges] › ℹ  info      IP Ranges Renewal Timer initialized
proxy-1     | [10/30/2024] [3:40:34 PM] [Global   ] › ℹ  info      Backend PID 165 listening on port 3000 ...
proxy-1     | [10/30/2024] [3:40:34 PM] [SSL      ] › ℹ  info      Completed SSL cert renew process
logto-1     | 
logto-1     | > cli
logto-1     | > logto db seed --swe
logto-1     | 
logto-1     | info Seeding skipped
logto-1     | npm notice
logto-1     | npm notice New minor version of npm available! 10.8.2 -> 10.9.0
logto-1     | npm notice Changelog: https://github.com/npm/cli/releases/tag/v10.9.0
logto-1     | npm notice To update run: npm install -g [email protected]
logto-1     | npm notice
logto-1     | 
logto-1     | > start
logto-1     | > cd packages/core && NODE_ENV=production node .
logto-1     | 
logto-1     | (node:44) ExperimentalWarning: Importing JSON modules is an experimental feature and might change at any time
logto-1     | (Use `node --trace-warnings ...` to show where the warning was created)
logto-1     | cache    warn No Redis client initialized, skipping
logto-1     | core     info Core app is running at http://localhost:5071/
logto-1     | core     info Core app is running at https://auth.localhost/
logto-1     | admin    info Admin app is running at http://localhost:5072/
logto-1     | admin    info Admin app is running at https://test.localhost/
logto-1     | rQRpW_QrZogND-oa   <-- GET /sign-in
logto-1     | tenant   info Init tenant: admin Custom domain: undefined
logto-1     | rQRpW_QrZogND-oa   --> GET /sign-in 200 89ms 4.5kb
logto-1     | uJLT5-6h85ZOvdZ-   <-- PUT /api/interaction
logto-1     | uJLT5-6h85ZOvdZ-   --> PUT /api/interaction 204 577ms 
logto-1     | vZBEFvr-2a2eLb4N   <-- POST /api/interaction/submit
logto-1     | vZBEFvr-2a2eLb4N   --> POST /api/interaction/submit 200 14ms 70b
logto-1     | fneOa66tciJyZebZ   <-- GET /oidc/auth/12GM4zYPByEJID_V7_AO2
logto-1     | fneOa66tciJyZebZ   --> GET /oidc/auth/12GM4zYPByEJID_V7_AO2 303 23ms 47b
logto-1     | eAODg2g5wB90nPvk   <-- GET /consent
logto-1     | eAODg2g5wB90nPvk   --> GET /consent 302 7ms 137b
logto-1     | RjsIDCAV3XqsRFiV   <-- GET /oidc/auth/V8p6ARlX6dhW_4InkssZ4
logto-1     | RjsIDCAV3XqsRFiV   --> GET /oidc/auth/V8p6ARlX6dhW_4InkssZ4 303 14ms 485b
logto-1     | N3ZnVVObfNdOEPJq   <-- GET /console/callback?code=xzLgvHHEicXJMzXFjzW5S5o1lrLF0yQ_9GRds7ma_EA&state=FTinzZSCt4ICsw6oOwwCYwsDts1wWvq45XgsDVosYeXkZL9FBD-ua6Xr6n5qUMtZAIPVSvatSp7xSmNqQQR3kQ&iss=https%3A%2F%2Ftest.localhost%2Foidc
logto-1     | N3ZnVVObfNdOEPJq   --> GET /console/callback?code=xzLgvHHEicXJMzXFjzW5S5o1lrLF0yQ_9GRds7ma_EA&state=FTinzZSCt4ICsw6oOwwCYwsDts1wWvq45XgsDVosYeXkZL9FBD-ua6Xr6n5qUMtZAIPVSvatSp7xSmNqQQR3kQ&iss=https%3A%2F%2Ftest.localhost%2Foidc 200 4ms 640b
logto-1     | Crh3mI4qH3Rr3stH   <-- GET /console/assets/cytoscape--v800llU.js
logto-1     | NC_zJXgiLFR8VhHE   <-- GET /console/assets/phrases-ja-mIiSdMxi.js
logto-1     | iJ2p8V3cJMdXzd_A   <-- GET /console/assets/phrases-ko-DiTQBnOj.js
logto-1     | 1hdVtWcpY7gHNYpe   <-- GET /console/assets/phrases-pl-pl-DXmFHVyp.js
logto-1     | NC_zJXgiLFR8VhHE   --> GET /console/assets/phrases-ja-mIiSdMxi.js 200 26ms 39.88kb
logto-1     | __2CqiHKrfCEoU6o   <-- GET /api/.well-known/endpoints/default
logto-1     | __2CqiHKrfCEoU6o   --> GET /api/.well-known/endpoints/default 200 3ms 33b
logto-1     | WrxJ300aoFYNrluH   <-- POST /oidc/token
logto-1     | oidc-provider WARNING: already parsed request body detected, having upstream middleware parser is not recommended, resolving to use req.body or request.body instead
logto-1     | WrxJ300aoFYNrluH   --> POST /oidc/token 200 31ms 795b
logto-1     | vL1hIrGalxEScwJw   <-- GET /console/assets/apple-touch-icon-D-pXUhFz.png
logto-1     | j1lWvhbsGdwFMLdU   <-- GET /oidc/jwks
logto-1     | j1lWvhbsGdwFMLdU   --> GET /oidc/jwks 304 2ms 
logto-1     | muhHg9lC1AF8GiAC   <-- POST /oidc/token
logto-1     | muhHg9lC1AF8GiAC   --> POST /oidc/token 200 18ms 933b
logto-1     | S9iAUjcnWAr3CeJp   <-- GET /me
logto-1     | S9iAUjcnWAr3CeJp   --> GET /me 200 5ms 303b
logto-1     | i0Q0BPdMgZtyozed   <-- POST /oidc/token
logto-1     | i0Q0BPdMgZtyozed   --> POST /oidc/token 200 14ms 918b
logto-1     | vSnSp65Q1HEx-QEK   <-- POST /oidc/token
logto-1     | vSnSp65Q1HEx-QEK   --> POST /oidc/token 200 13ms 932b
logto-1     | c94V38itu-0ue_oM   <-- OPTIONS /api/dashboard/users/new
logto-1     | tenant   info Init tenant: default Custom domain: undefined
logto-1     | 8Yp8hN0LlhR-D2Fu   <-- OPTIONS /api/dashboard/users/total
logto-1     | 7kOegw7LAYDZ2iz0   <-- OPTIONS /api/dashboard/users/active?date=2024-10-30
logto-1     | c94V38itu-0ue_oM   --> OPTIONS /api/dashboard/users/new 204 34ms 
logto-1     | 8Yp8hN0LlhR-D2Fu   --> OPTIONS /api/dashboard/users/total 204 33ms 
logto-1     | 7kOegw7LAYDZ2iz0   --> OPTIONS /api/dashboard/users/active?date=2024-10-30 204 33ms 
logto-1     | KAwHEqOgit_RxVQV   <-- GET /api/dashboard/users/new
logto-1     | q1FnoS4Fd4ij8aCY   <-- GET /api/dashboard/users/total
logto-1     | snxYu1-YGisCRWG2   <-- GET /api/dashboard/users/active?date=2024-10-30
logto-1     | KAwHEqOgit_RxVQV error TypeError: fetch failed
logto-1     |     at node:internal/deps/undici/undici:13185:13
logto-1     |     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
logto-1     |     at async function_ (file:///etc/logto/node_modules/.pnpm/[email protected]/node_modules/ky/distribution/core/Ky.js:18:28)
logto-1     |     at async Ky._retry (file:///etc/logto/node_modules/.pnpm/[email protected]/node_modules/ky/distribution/core/Ky.js:215:20)
logto-1     |     at async result.<computed> [as json] (file:///etc/logto/node_modules/.pnpm/[email protected]/node_modules/ky/distribution/core/Ky.js:54:39)
logto-1     |     at async getAdminTenantTokenValidationSet (file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:4244:25)
logto-1     |     at async getKeysAndIssuer (file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:4293:22)
logto-1     |     at async verifyBearerTokenFromRequest (file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:4300:28)
logto-1     |     at async authMiddleware (file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:4326:48)
logto-1     |     at async file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:2202:5
logto-1     |     at async cors (/etc/logto/node_modules/.pnpm/@[email protected]/node_modules/@koa/cors/index.js:109:16)
logto-1     |     at async /etc/logto/node_modules/.pnpm/[email protected]/node_modules/koa-mount/index.js:58:5
logto-1     |     at async /etc/logto/node_modules/.pnpm/[email protected]/node_modules/koa-mount/index.js:52:26
logto-1     |     at async compressMiddleware (/etc/logto/node_modules/.pnpm/[email protected]/node_modules/koa-compress/lib/index.js:56:5)
logto-1     |     at async file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:743:7
logto-1     |     at async file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:1233:7 {
logto-1     |   headers: {
logto-1     |     'Access-Control-Allow-Origin': 'https://test.localhost',
logto-1     |     'Access-Control-Expose-Headers': '*',
logto-1     |     vary: 'Origin'
logto-1     |   },
logto-1     |   [cause]: Error: connect ECONNREFUSED 127.0.0.1:443
logto-1     |       at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1607:16) {
logto-1     |     errno: -111,
logto-1     |     code: 'ECONNREFUSED',
logto-1     |     syscall: 'connect',
logto-1     |     address: '127.0.0.1',
logto-1     |     port: 443
logto-1     |   }
logto-1     | }
logto-1     | KAwHEqOgit_RxVQV   --> GET /api/dashboard/users/new 500 965ms 36b
logto-1     | q1FnoS4Fd4ij8aCY error TypeError: fetch failed
logto-1     |     at node:internal/deps/undici/undici:13185:13
logto-1     |     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
logto-1     |     at async function_ (file:///etc/logto/node_modules/.pnpm/[email protected]/node_modules/ky/distribution/core/Ky.js:18:28)
logto-1     |     at async Ky._retry (file:///etc/logto/node_modules/.pnpm/[email protected]/node_modules/ky/distribution/core/Ky.js:215:20)
logto-1     |     at async result.<computed> [as json] (file:///etc/logto/node_modules/.pnpm/[email protected]/node_modules/ky/distribution/core/Ky.js:54:39)
logto-1     |     at async getAdminTenantTokenValidationSet (file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:4244:25)
logto-1     |     at async getKeysAndIssuer (file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:4293:22)
logto-1     |     at async verifyBearerTokenFromRequest (file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:4300:28)
logto-1     |     at async authMiddleware (file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:4326:48)
logto-1     |     at async file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:2202:5
logto-1     |     at async cors (/etc/logto/node_modules/.pnpm/@[email protected]/node_modules/@koa/cors/index.js:109:16)
logto-1     |     at async /etc/logto/node_modules/.pnpm/[email protected]/node_modules/koa-mount/index.js:58:5
logto-1     |     at async /etc/logto/node_modules/.pnpm/[email protected]/node_modules/koa-mount/index.js:52:26
logto-1     |     at async compressMiddleware (/etc/logto/node_modules/.pnpm/[email protected]/node_modules/koa-compress/lib/index.js:56:5)
logto-1     |     at async file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:743:7
logto-1     |     at async file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:1233:7 {
logto-1     |   headers: {
logto-1     |     'Access-Control-Allow-Origin': 'https://test.localhost',
logto-1     |     'Access-Control-Expose-Headers': '*',
logto-1     |     vary: 'Origin'
logto-1     |   },
logto-1     |   [cause]: Error: connect ECONNREFUSED 127.0.0.1:443
logto-1     |       at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1607:16) {
logto-1     |     errno: -111,
logto-1     |     code: 'ECONNREFUSED',
logto-1     |     syscall: 'connect',
logto-1     |     address: '127.0.0.1',
logto-1     |     port: 443
logto-1     |   }
logto-1     | }
logto-1     | q1FnoS4Fd4ij8aCY   --> GET /api/dashboard/users/total 500 958ms 36b
logto-1     | snxYu1-YGisCRWG2 error TypeError: fetch failed
logto-1     |     at node:internal/deps/undici/undici:13185:13
logto-1     |     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
logto-1     |     at async function_ (file:///etc/logto/node_modules/.pnpm/[email protected]/node_modules/ky/distribution/core/Ky.js:18:28)
logto-1     |     at async Ky._retry (file:///etc/logto/node_modules/.pnpm/[email protected]/node_modules/ky/distribution/core/Ky.js:215:20)
logto-1     |     at async result.<computed> [as json] (file:///etc/logto/node_modules/.pnpm/[email protected]/node_modules/ky/distribution/core/Ky.js:54:39)
logto-1     |     at async getAdminTenantTokenValidationSet (file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:4244:25)
logto-1     |     at async getKeysAndIssuer (file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:4293:22)
logto-1     |     at async verifyBearerTokenFromRequest (file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:4300:28)
logto-1     |     at async authMiddleware (file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:4326:48)
logto-1     |     at async file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:2202:5
logto-1     |     at async cors (/etc/logto/node_modules/.pnpm/@[email protected]/node_modules/@koa/cors/index.js:109:16)
logto-1     |     at async /etc/logto/node_modules/.pnpm/[email protected]/node_modules/koa-mount/index.js:58:5
logto-1     |     at async /etc/logto/node_modules/.pnpm/[email protected]/node_modules/koa-mount/index.js:52:26
logto-1     |     at async compressMiddleware (/etc/logto/node_modules/.pnpm/[email protected]/node_modules/koa-compress/lib/index.js:56:5)
logto-1     |     at async file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:743:7
logto-1     |     at async file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:1233:7 {
logto-1     |   headers: {
logto-1     |     'Access-Control-Allow-Origin': 'https://test.localhost',
logto-1     |     'Access-Control-Expose-Headers': '*',
logto-1     |     vary: 'Origin'
logto-1     |   },
logto-1     |   [cause]: Error: connect ECONNREFUSED 127.0.0.1:443
logto-1     |       at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1607:16) {
logto-1     |     errno: -111,
logto-1     |     code: 'ECONNREFUSED',
logto-1     |     syscall: 'connect',
logto-1     |     address: '127.0.0.1',
logto-1     |     port: 443
logto-1     |   }
logto-1     | }
logto-1     | snxYu1-YGisCRWG2   --> GET /api/dashboard/users/active?date=2024-10-30 500 957ms 36b
logto-1     | J97f2xuNp7Ib6L3R   <-- GET /api/dashboard/users/new
logto-1     | 4ghtRsOab65i2Qc6   <-- GET /api/dashboard/users/total
logto-1     | w1P6F20wAMEnU0U7   <-- GET /api/dashboard/users/active?date=2024-10-30
logto-1     | J97f2xuNp7Ib6L3R error TypeError: fetch failed
logto-1     |     at node:internal/deps/undici/undici:13185:13
logto-1     |     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
logto-1     |     at async function_ (file:///etc/logto/node_modules/.pnpm/[email protected]/node_modules/ky/distribution/core/Ky.js:18:28)
logto-1     |     at async Ky._retry (file:///etc/logto/node_modules/.pnpm/[email protected]/node_modules/ky/distribution/core/Ky.js:215:20)
logto-1     |     at async result.<computed> [as json] (file:///etc/logto/node_modules/.pnpm/[email protected]/node_modules/ky/distribution/core/Ky.js:54:39)
logto-1     |     at async getAdminTenantTokenValidationSet (file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:4244:25)
logto-1     |     at async getKeysAndIssuer (file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:4293:22)
logto-1     |     at async verifyBearerTokenFromRequest (file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:4300:28)
logto-1     |     at async authMiddleware (file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:4326:48)
logto-1     |     at async file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:2202:5
logto-1     |     at async cors (/etc/logto/node_modules/.pnpm/@[email protected]/node_modules/@koa/cors/index.js:109:16)
logto-1     |     at async /etc/logto/node_modules/.pnpm/[email protected]/node_modules/koa-mount/index.js:58:5
logto-1     |     at async /etc/logto/node_modules/.pnpm/[email protected]/node_modules/koa-mount/index.js:52:26
logto-1     |     at async compressMiddleware (/etc/logto/node_modules/.pnpm/[email protected]/node_modules/koa-compress/lib/index.js:56:5)
logto-1     |     at async file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:743:7
logto-1     |     at async file:///etc/logto/packages/core/build/main-BOZ4RWKM.js:1233:7 {
logto-1     |   headers: {
logto-1     |     'Access-Control-Allow-Origin': 'https://test.localhost',
logto-1     |     'Access-Control-Expose-Headers': '*',
logto-1     |     vary: 'Origin'
logto-1     |   },
logto-1     |   [cause]: Error: connect ECONNREFUSED 127.0.0.1:443
logto-1     |       at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1607:16) {
logto-1     |     errno: -111,
logto-1     |     code: 'ECONNREFUSED',
logto-1     |     syscall: 'connect',
logto-1     |     address: '127.0.0.1',
logto-1     |     port: 443
logto-1     |   }


From here on it's just repetition of the errors above, or similar errors.

Context

  • Logto Cloud
  • Self-hosted, Logto version = 1.21.0
    • Container (Docker image)
    • Raw Node.js

Screenshots

(screenshot attached in the original issue)

Comment

I'm hoping I made a dumb mistake, or that I simply shouldn't use Nginx Proxy Manager at all, because this service fits perfectly in our ecosystem.

@sneakylenny sneakylenny added the bug Something isn't working label Oct 30, 2024

cdrx commented Nov 11, 2024

I think Logto internally uses the URLs in ENDPOINT and ADMIN_ENDPOINT to make requests to itself from inside the backend.

This works as long as neither resolves to localhost; I assume that in your case auth.localhost and test.localhost both do.

In that case, when Logto makes a request to those hosts, the request does not go via nginx-proxy-manager: Logto ends up connecting (inside its own container) to 127.0.0.1:443, where only 127.0.0.1:3001 and 127.0.0.1:3002 are listening.
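The loopback hypothesis is easy to check. A minimal sketch (an illustration, not part of Logto itself; it assumes `*.localhost` names behave like plain `localhost`, which resolvers conventionally map straight to a loopback address):

```python
import socket

# Plain "localhost" resolves to loopback; resolvers conventionally treat
# "*.localhost" names (auth.localhost, test.localhost) the same way.
# So a fetch to https://auth.localhost/ from inside the Logto container
# targets 127.0.0.1:443, where nothing is listening, and fails with the
# ECONNREFUSED seen in the logs above.
addrs = {info[4][0] for info in socket.getaddrinfo("localhost", 443)}
print(addrs)  # loopback addresses only, e.g. 127.0.0.1 and/or ::1
```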

You can work around this by adding these hosts as aliases on your nginx-proxy-manager, overriding the DNS resolution and forcing Logto to send those requests via nginx. Add this to your nginx-proxy-manager service in docker-compose.yml:

services:
  proxy:
    ...
    networks:
      default:
        aliases:
          - auth.localhost
          - test.localhost
  ...

You may also need to tell Logto to ignore TLS failures, since it won't be aware of the TLS certificate you created. You can add: NODE_TLS_REJECT_UNAUTHORIZED=0 to Logto's environment to do that.
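Putting the two workarounds together, the logto service's environment from the compose file above might look like this (a sketch; note that NODE_TLS_REJECT_UNAUTHORIZED=0 disables TLS certificate verification for the entire Node.js process, so it should only be used in development):

```yaml
  logto:
    image: svhd/logto:1.21.0
    environment:
      - TRUST_PROXY_HEADER=1
      - DB_URL=postgres://postgres:p0stgr3s@postgres:5432/logto
      - ENDPOINT=https://auth.localhost/
      - ADMIN_ENDPOINT=https://test.localhost/
      # Development only: accept the self-signed certificate served by the proxy
      - NODE_TLS_REJECT_UNAUTHORIZED=0
```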
