
Teleport crashes with vague error when using --insecure-no-tls #53423

Closed
fheinecke opened this issue Mar 25, 2025 · 1 comment
fheinecke (Contributor)

Expected behavior:

Teleport should not crash when the flag is set.

Current behavior:

Teleport crashes when the flag is set, with the following error:

ERROR REPORT:
Original Error: *trace.BadParameterError listener cannot be nil
Stack Trace:
        github.com/gravitational/teleport/lib/limiter/listener.go:40 github.com/gravitational/teleport/lib/limiter.NewListener
        github.com/gravitational/teleport/lib/limiter/limiter.go:181 github.com/gravitational/teleport/lib/limiter.(*Limiter).WrapListener
        github.com/gravitational/teleport/lib/service/service.go:4542 github.com/gravitational/teleport/lib/service.(*TeleportProcess).initProxyEndpoint
        github.com/gravitational/teleport/lib/service/service.go:3968 github.com/gravitational/teleport/lib/service.(*TeleportProcess).initProxy.func1
        github.com/gravitational/teleport/lib/service/supervisor.go:581 github.com/gravitational/teleport/lib/service.(*LocalService).Serve
        github.com/gravitational/teleport/lib/service/supervisor.go:307 github.com/gravitational/teleport/lib/service.(*LocalSupervisor).serve.func1
        runtime/asm_arm64.s:1223 runtime.goexit
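
For context, a minimal, self-contained Go sketch of the apparent failure mode: the limiter rejects a nil listener, and with --insecure-no-tls the proxy init path appears to hand it one. The names here (wrapListener, webListener) are illustrative stand-ins, not the actual Teleport code.

    // Hypothetical sketch of the failure mode seen in the stack trace above.
    package main

    import (
        "errors"
        "fmt"
        "net"
    )

    // wrapListener mimics a rate-limiting wrapper that rejects nil listeners,
    // similar in spirit to the BadParameter error reported by the limiter.
    func wrapListener(l net.Listener) (net.Listener, error) {
        if l == nil {
            return nil, errors.New("listener cannot be nil")
        }
        return l, nil // a real limiter would return a wrapping listener here
    }

    func main() {
        // With --insecure-no-tls the web listener is apparently never created,
        // so the proxy init path ends up passing nil and the process exits.
        var webListener net.Listener // nil: stands in for the missing TLS listener

        if _, err := wrapListener(webListener); err != nil {
            fmt.Println("proxy.init failed:", err)
        }
    }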

Bug details:

  • Teleport version - v14.0.0 and later, possibly older versions as well. Both OSS and Enterprise are affected.
  • Recreation steps
  1. Create a default Teleport config: teleport configure -o file:///tmp/teleport.yaml
  2. Run Teleport with the flag (running in a container to avoid possible "contamination" with other system config): docker run --rm --mount type=bind,src=/tmp/teleport.yaml,dst=/etc/teleport/teleport.yaml public.ecr.aws/gravitational/teleport-distroless:17.3.4 --debug --insecure-no-tls
  3. Observe the error. On v14, this panics with a nil pointer dereference.
  • Debug logs
% teleport configure -o file:///tmp/teleport.yaml                                                                                                      

A Teleport configuration file has been created at "/tmp/teleport.yaml".
To start Teleport with this configuration file, run:

sudo teleport start --config="/tmp/teleport.yaml"

Note that starting a Teleport server with this configuration will require root access as:
- Teleport will be storing data at "/var/lib/teleport". To change that, edit the "data_dir" field in "/tmp/teleport.yaml".
Happy Teleporting!

% docker run --rm -it --mount type=bind,src=/tmp/teleport.yaml,dst=/etc/teleport/teleport.yaml public.ecr.aws/gravitational/teleport-distroless:17.3.4 --insecure-no-tls
2025-03-25T21:42:29.751Z INFO  Starting Teleport with a config file version:17.3.4 config_file:/etc/teleport/teleport.yaml common/teleport.go:780
2025-03-25T21:42:29.761Z INFO [PROC:1]    Generating new host UUID pid:7.1 host_uuid:6274d561-8e7c-4fec-823b-0dd81c713834 service/service.go:6676
2025-03-25T21:42:30.269Z INFO [PROC:1]    Service is creating new listener. pid:7.1 type:debug address:/var/lib/teleport/debug.sock service/signals.go:242
2025-03-25T21:42:31.322Z INFO [AUTH]      Updating cluster configuration: StaticTokens([]). auth/init.go:509
2025-03-25T21:42:31.323Z INFO [AUTH]      Creating namespace: "default". auth/init.go:516
2025-03-25T21:42:31.324Z INFO [AUTH]      Creating access graph settings: kind:"access_graph_settings" version:"v1" metadata:{name:"access-graph-settings"} spec:{secrets_scan_config:ACCESS_GRAPH_SECRETS_SCAN_CONFIG_DISABLED}. auth/init.go:1022
2025-03-25T21:42:31.324Z INFO [AUTH]      Creating session recording config: Kind:"session_recording_config" Version:"v2" Metadata:<Name:"session-recording-config" Namespace:"default" Labels:<key:"teleport.dev/origin" value:"defaults" > > Spec:<Mode:"node" ProxyChecksHostKeys:"\010\001" > . auth/init.go:984
2025-03-25T21:42:31.324Z INFO [AUTH]      Creating cluster networking configuration: Kind:"cluster_networking_config" Version:"v2" Metadata:<Name:"cluster-networking-config" Namespace:"default" Labels:<key:"teleport.dev/origin" value:"config-file" > > Spec:<KeepAliveInterval:300000000000 KeepAliveCountMax:3 ProxyListenerMode:Multiplex TunnelStrategy:<AgentMesh:<> > > . auth/init.go:944
2025-03-25T21:42:31.327Z INFO [AUTH]      Creating cluster auth preference: AuthPreference(Type="local",SecondFactors=["SECOND_FACTOR_TYPE_OTP"]). auth/init.go:903
2025-03-25T21:42:31.346Z INFO [AUTH:1]    Starting migration. pid:7.1 version:1 name:create_db_cas trace_id:1b19c65b585bb11d7156b81a48d4285b span_id:0713ed480ee32404 migration/migration.go:119
2025-03-25T21:42:31.350Z INFO [AUTH:1]    Completed migration. pid:7.1 version:1 name:create_db_cas trace_id:1b19c65b585bb11d7156b81a48d4285b span_id:0713ed480ee32404 migration/migration.go:139
2025-03-25T21:42:31.350Z INFO [AUTH]      First start: generating openssh certificate authority. auth/init.go:701
2025-03-25T21:42:31.350Z INFO [AUTH]      First start: generating oidc_idp certificate authority. auth/init.go:701
2025-03-25T21:42:31.352Z INFO [AUTH]      First start: generating okta certificate authority. auth/init.go:701
2025-03-25T21:42:31.352Z INFO [AUTH]      First start: generating user certificate authority. auth/init.go:701
2025-03-25T21:42:31.352Z INFO [AUTH]      First start: generating host certificate authority. auth/init.go:701
2025-03-25T21:42:31.352Z INFO [AUTH]      First start: generating saml_idp certificate authority. auth/init.go:701
2025-03-25T21:42:31.352Z INFO [AUTH]      First start: generating db_client certificate authority. auth/init.go:701
2025-03-25T21:42:31.353Z INFO [AUTH]      First start: generating db certificate authority. auth/init.go:701
2025-03-25T21:42:31.357Z INFO [AUTH]      First start: generating spiffe certificate authority. auth/init.go:701
2025-03-25T21:42:31.365Z INFO [AUTH]      First start: generating jwt certificate authority. auth/init.go:701
2025-03-25T21:42:31.501Z INFO  emitting audit event event_type:role.created fields:map[cluster_name:Mac.echozulu.local code:T9000I ei:0 event:role.created expires:0001-01-01T00:00:00Z name:access time:2025-03-25T21:42:31.501Z trace.component:audit uid:5444c439-a000-40db-ba50-e52ca494a317 user:system] events/emitter.go:287
2025-03-25T21:42:31.503Z INFO  emitting audit event event_type:role.created fields:map[cluster_name:Mac.echozulu.local code:T9000I ei:0 event:role.created expires:0001-01-01T00:00:00Z name:wildcard-workload-identity-issuer time:2025-03-25T21:42:31.504Z trace.component:audit uid:57a2c6da-4fa2-4141-af75-b80da0f0409f user:system] events/emitter.go:287
2025-03-25T21:42:31.505Z INFO  emitting audit event event_type:role.created fields:map[cluster_name:Mac.echozulu.local code:T9000I ei:0 event:role.created expires:0001-01-01T00:00:00Z name:editor time:2025-03-25T21:42:31.505Z trace.component:audit uid:3fcefd5d-cd84-4e27-a52c-125ebf1b3b9e user:system] events/emitter.go:287
2025-03-25T21:42:31.507Z INFO  emitting audit event event_type:role.created fields:map[cluster_name:Mac.echozulu.local code:T9000I ei:0 event:role.created expires:0001-01-01T00:00:00Z name:terraform-provider time:2025-03-25T21:42:31.508Z trace.component:audit uid:843f1007-3a4b-41fa-96e0-9691ece043ce user:system] events/emitter.go:287
2025-03-25T21:42:31.509Z INFO  emitting audit event event_type:role.created fields:map[cluster_name:Mac.echozulu.local code:T9000I ei:0 event:role.created expires:0001-01-01T00:00:00Z name:auditor time:2025-03-25T21:42:31.509Z trace.component:audit uid:8307340f-af11-423b-a18a-e52dc2c998ba user:system] events/emitter.go:287
2025-03-25T21:42:31.511Z INFO [AUTH]      Auth server is running periodic operations. auth/init.go:622
2025-03-25T21:42:31.514Z INFO [AUTH:COMP] upload completer starting check_interval:5m0s events/complete.go:165
2025-03-25T21:42:31.515Z INFO [PROC:1]    Successfully obtained credentials to connect to the cluster. pid:7.1 identity:Admin service/connect.go:464
2025-03-25T21:42:31.517Z INFO [PROC:1]    The process successfully wrote the credentials and state to the disk. pid:7.1 identity:Admin service/connect.go:508
2025-03-25T21:42:31.522Z INFO [AUTH:1:CA] Cache "auth" first init succeeded. cache/cache.go:1152
2025-03-25T21:42:31.522Z INFO [PROC:1]    Service is creating new listener. pid:7.1 type:auth address:0.0.0.0:3025 service/signals.go:242
2025-03-25T21:42:31.522Z WARN [AUTH:1]    'proxy_protocol' unspecified. Starting Auth service with external PROXY protocol support, but IP pinned connection affected by PROXY headers will not be allowed. Set 'proxy_protocol: on' in 'auth_service' config if Auth service runs behind L4 load balancer with enabled PROXY protocol, or set 'proxy_protocol: off' otherwise pid:7.1 service/service.go:2399
2025-03-25T21:42:31.523Z WARN [AUTH:1]    Configuration setting auth_service/advertise_ip is not set, using inferred address pid:7.1 address:172.17.0.2:3025 service/service.go:2482
2025-03-25T21:42:31.523Z INFO [WORKLOAD_] Starting to generate new CRL workloadidentityv1/revocation_service.go:607
2025-03-25T21:42:31.523Z INFO [WORKLOAD_] Finished generating new CRL revocations:0 workloadidentityv1/revocation_service.go:659
2025-03-25T21:42:31.523Z INFO [UPLOAD:1]  starting upload completer service pid:7.1 service/service.go:3396
2025-03-25T21:42:31.523Z INFO [UPLOAD:1]  Creating directory. pid:7.1 directory:/var/lib/teleport/log service/service.go:3412
2025-03-25T21:42:31.523Z INFO [UPLOAD:1]  Creating directory. pid:7.1 directory:/var/lib/teleport/log/upload service/service.go:3412
2025-03-25T21:42:31.523Z INFO [UPLOAD:1]  Creating directory. pid:7.1 directory:/var/lib/teleport/log/upload/streaming service/service.go:3412
2025-03-25T21:42:31.523Z INFO [UPLOAD:1]  Creating directory. pid:7.1 directory:/var/lib/teleport/log/upload/streaming/default service/service.go:3412
2025-03-25T21:42:31.523Z INFO [UPLOAD:1]  Creating directory. pid:7.1 directory:/var/lib/teleport/log service/service.go:3412
2025-03-25T21:42:31.523Z INFO [UPLOAD:1]  Creating directory. pid:7.1 directory:/var/lib/teleport/log/upload service/service.go:3412
2025-03-25T21:42:31.523Z INFO [UPLOAD:1]  Creating directory. pid:7.1 directory:/var/lib/teleport/log/upload/corrupted service/service.go:3412
2025-03-25T21:42:31.523Z INFO [UPLOAD:1]  Creating directory. pid:7.1 directory:/var/lib/teleport/log/upload/corrupted/default service/service.go:3412
2025-03-25T21:42:31.524Z INFO [AUTH:1]    Auth service is starting. pid:7.1 version:17.3.4 git_ref:v17.3.4-0-g29c19ac listen_address:172.17.0.2:3025 service/service.go:2436
2025-03-25T21:42:31.524Z INFO [AUTH:1]    Starting autoupdate_agent_rollout controller pid:7.1 component:rollout-controller period:1m0s rollout/controller.go:119
2025-03-25T21:42:31.524Z INFO [UPLOAD]    uploader server ready scan_dir:/var/lib/teleport/log/upload/streaming/default scan_period:5s filesessions/fileasync.go:194
2025-03-25T21:42:31.524Z INFO [UPLOAD:1:] upload completer starting check_interval:5m0s events/complete.go:165
2025-03-25T21:42:31.527Z INFO [PROC:1]    Successfully obtained credentials to connect to the cluster. pid:7.1 identity:Instance service/connect.go:464
2025-03-25T21:42:31.527Z INFO [PROC:1]    Successfully obtained credentials to connect to the cluster. pid:7.1 identity:Proxy service/connect.go:464
2025-03-25T21:42:31.528Z INFO [PROC:1]    Successfully obtained credentials to connect to the cluster. pid:7.1 identity:Node service/connect.go:464
2025-03-25T21:42:31.530Z INFO [PROC:1]    features loaded from auth server pid:7.1 identity:Instance features:Kubernetes:true App:true DB:true Desktop:true DeviceTrust:<> AccessRequests:<> AccessList:<> AccessMonitoring:<> Policy:<> SupportType:SUPPORT_TYPE_FREE JoinActiveSessions:true entitlements:<key:"AccessLists" value:<> > entitlements:<key:"AccessMonitoring" value:<> > entitlements:<key:"AccessRequests" value:<> > entitlements:<key:"App" value:<enabled:true > > entitlements:<key:"CloudAuditLogRetention" value:<> > entitlements:<key:"DB" value:<enabled:true > > entitlements:<key:"Desktop" value:<enabled:true > > entitlements:<key:"DeviceTrust" value:<> > entitlements:<key:"ExternalAuditStorage" value:<> > entitlements:<key:"FeatureHiding" value:<> > entitlements:<key:"HSM" value:<> > entitlements:<key:"Identity" value:<> > entitlements:<key:"JoinActiveSessions" value:<enabled:true > > entitlements:<key:"K8s" value:<enabled:true > > entitlements:<key:"LicenseAutoUpdate" value:<> > entitlements:<key:"MobileDeviceManagement" value:<> > entitlements:<key:"OIDC" value:<> > entitlements:<key:"OktaSCIM" value:<> > entitlements:<key:"OktaUserSync" value:<> > entitlements:<key:"Policy" value:<> > entitlements:<key:"SAML" value:<> > entitlements:<key:"SessionLocks" value:<> > entitlements:<key:"UpsellAlert" value:<> > entitlements:<key:"UsageReporting" value:<> >  service/connect.go:1097
2025-03-25T21:42:31.531Z INFO [AUTH:SPIF] Obtained lock, SPIFFEFederation syncer is starting pid:7.1 machineidv1/spiffe_federation_syncer.go:219
2025-03-25T21:42:31.535Z INFO [PROC:1]    The process successfully wrote the credentials and state to the disk. pid:7.1 identity:Instance service/connect.go:508
2025-03-25T21:42:31.535Z INFO [INSTANCE:] Successfully registered instance client. pid:7.1 service/service.go:2954
2025-03-25T21:42:31.535Z INFO [PROC:1]    Reusing Instance client. pid:7.1 identity:Node additional_system_roles:[Auth Node Proxy] service/connect.go:1068
2025-03-25T21:42:31.535Z INFO [PROC:1]    Reusing Instance client. pid:7.1 identity:Proxy additional_system_roles:[Auth Node Proxy] service/connect.go:1068
2025-03-25T21:42:31.539Z INFO [PROC:1]    The process successfully wrote the credentials and state to the disk. pid:7.1 identity:Node service/connect.go:508
2025-03-25T21:42:31.543Z INFO [PROC:1]    The process successfully wrote the credentials and state to the disk. pid:7.1 identity:Proxy service/connect.go:508
2025-03-25T21:42:31.548Z INFO [NODE:1:CA] Cache "node" first init succeeded. cache/cache.go:1152
2025-03-25T21:42:31.548Z INFO [PROC:1]    Service is creating new listener. pid:7.1 type:node address:0.0.0.0:3022 service/signals.go:242
2025-03-25T21:42:31.548Z INFO [NODE:1]    SSH Service is starting. pid:7.1 version:17.3.4 git_ref:v17.3.4-0-g29c19ac listen_address:0.0.0.0:3022 cache_policy:in-memory cache service/service.go:3191
2025-03-25T21:42:31.549Z INFO [PROXY:1:C] Cache "proxy" first init succeeded. cache/cache.go:1152
2025-03-25T21:42:31.549Z INFO [PROC:1]    Service is creating new listener. pid:7.1 type:proxy:web address:0.0.0.0:3080 service/signals.go:242
2025-03-25T21:42:31.549Z WARN [PROC:1]    Teleport process has exited with error. error:[
ERROR REPORT:
Original Error: *trace.BadParameterError listener cannot be nil
Stack Trace:
        github.com/gravitational/teleport/lib/limiter/listener.go:40 github.com/gravitational/teleport/lib/limiter.NewListener
        github.com/gravitational/teleport/lib/limiter/limiter.go:181 github.com/gravitational/teleport/lib/limiter.(*Limiter).WrapListener
        github.com/gravitational/teleport/lib/service/service.go:4542 github.com/gravitational/teleport/lib/service.(*TeleportProcess).initProxyEndpoint
        github.com/gravitational/teleport/lib/service/service.go:3968 github.com/gravitational/teleport/lib/service.(*TeleportProcess).initProxy.func1
        github.com/gravitational/teleport/lib/service/supervisor.go:581 github.com/gravitational/teleport/lib/service.(*LocalService).Serve
        github.com/gravitational/teleport/lib/service/supervisor.go:307 github.com/gravitational/teleport/lib/service.(*LocalSupervisor).serve.func1
        runtime/asm_arm64.s:1223 runtime.goexit
User Message: listener cannot be nil] pid:7.1 service:proxy.init service/supervisor.go:313
2025-03-25T21:42:31.549Z ERRO [PROC:1]    Critical service has exited with error, aborting. pid:7.1 service:proxy.init error:[
ERROR REPORT:
Original Error: *trace.BadParameterError listener cannot be nil
Stack Trace:
        github.com/gravitational/teleport/lib/limiter/listener.go:40 github.com/gravitational/teleport/lib/limiter.NewListener
        github.com/gravitational/teleport/lib/limiter/limiter.go:181 github.com/gravitational/teleport/lib/limiter.(*Limiter).WrapListener
        github.com/gravitational/teleport/lib/service/service.go:4542 github.com/gravitational/teleport/lib/service.(*TeleportProcess).initProxyEndpoint
        github.com/gravitational/teleport/lib/service/service.go:3968 github.com/gravitational/teleport/lib/service.(*TeleportProcess).initProxy.func1
        github.com/gravitational/teleport/lib/service/supervisor.go:581 github.com/gravitational/teleport/lib/service.(*LocalService).Serve
        github.com/gravitational/teleport/lib/service/supervisor.go:307 github.com/gravitational/teleport/lib/service.(*LocalSupervisor).serve.func1
        runtime/asm_arm64.s:1223 runtime.goexit
User Message: listener cannot be nil] service/signals.go:174
ERROR: listener cannot be nil
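
Given the expected behavior above, a hedged sketch of the kind of up-front config validation that would turn the crash into an actionable error. The proxyConfig type, its field names, and the assumption that the flag leaves the web listener unconfigured are hypothetical, inferred from the stack trace rather than taken from Teleport's code.

    // Hypothetical sketch: validate the flag/listener combination before
    // starting services, so the user sees a clear message instead of a
    // BadParameter crash deep in proxy initialization.
    package main

    import (
        "errors"
        "fmt"
        "os"
    )

    type proxyConfig struct {
        insecureNoTLS  bool
        webListenerSet bool // whether a non-TLS web listener was configured
    }

    func validate(cfg proxyConfig) error {
        if cfg.insecureNoTLS && !cfg.webListenerSet {
            return errors.New(
                "--insecure-no-tls disables the TLS web listener but no plain " +
                    "listener is configured; remove the flag or configure one")
        }
        return nil
    }

    func main() {
        cfg := proxyConfig{insecureNoTLS: true}
        if err := validate(cfg); err != nil {
            fmt.Fprintln(os.Stderr, "ERROR:", err)
            os.Exit(1)
        }
    }
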
fheinecke added the bug label on Mar 25, 2025
zmb3 (Collaborator) commented Mar 26, 2025

Closing in favor of #53476

zmb3 closed this as not planned on Mar 26, 2025