Teleport version: v14.0.0 and later, possibly older versions as well. Both OSS and Enterprise are affected.
Recreation steps
Create a default Teleport config: teleport configure -o file:///tmp/teleport.yaml
Run Teleport with the --insecure-no-tls flag (in a container, to avoid possible "contamination" from other system config): docker run --rm --mount type=bind,src=/tmp/teleport.yaml,dst=/etc/teleport/teleport.yaml public.ecr.aws/gravitational/teleport-distroless:17.3.4 --debug --insecure-no-tls
Observe the error. On v17.3.4 the process exits with "listener cannot be nil" (full logs below); on v14 this instead panics with a nil pointer dereference. See the sketch after this list for the difference between the two failure modes.
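To make the two failure modes concrete, here is a minimal standalone Go sketch. It is not Teleport source; the webListener variable and the wrapListener helper are hypothetical stand-ins. It only shows why using a nil listener directly panics with a nil pointer dereference (the v14 symptom), while an explicit nil check produces an ordinary "listener cannot be nil" error (the v17 symptom):

```go
// Standalone sketch, not Teleport code. Illustrates the two failure modes
// reported above when the proxy web listener is nil.
package main

import (
	"errors"
	"fmt"
	"net"
)

// wrapListener mimics the kind of guard visible in the v17 stack trace
// (limiter/listener.go:40): a nil listener is rejected with an error.
func wrapListener(l net.Listener) (net.Listener, error) {
	if l == nil {
		return nil, errors.New("listener cannot be nil")
	}
	return l, nil
}

func main() {
	// Assumption for this sketch: under --insecure-no-tls the proxy web
	// listener is never created, so the init path holds a nil net.Listener.
	var webListener net.Listener

	// v17-style failure: the guard converts the nil listener into an error.
	if _, err := wrapListener(webListener); err != nil {
		fmt.Println("ERROR:", err) // prints: ERROR: listener cannot be nil
	}

	// v14-style failure: using the nil listener directly panics.
	defer func() {
		if r := recover(); r != nil {
			fmt.Println("recovered from panic:", r) // nil pointer dereference
		}
	}()
	_ = webListener.Addr()
}
```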
Debug logs
% teleport configure -o file:///tmp/teleport.yaml
A Teleport configuration file has been created at "/tmp/teleport.yaml".
To start Teleport with this configuration file, run:
sudo teleport start --config="/tmp/teleport.yaml"
Note that starting a Teleport server with this configuration will require root access as:
- Teleport will be storing data at "/var/lib/teleport". To change that, edit the "data_dir" field in "/tmp/teleport.yaml".
Happy Teleporting!
% docker run --rm -it --mount type=bind,src=/tmp/teleport.yaml,dst=/etc/teleport/teleport.yaml public.ecr.aws/gravitational/teleport-distroless:17.3.4 --insecure-no-tls
2025-03-25T21:42:29.751Z INFO Starting Teleport with a config file version:17.3.4 config_file:/etc/teleport/teleport.yaml common/teleport.go:780
2025-03-25T21:42:29.761Z INFO [PROC:1] Generating new host UUID pid:7.1 host_uuid:6274d561-8e7c-4fec-823b-0dd81c713834 service/service.go:6676
2025-03-25T21:42:30.269Z INFO [PROC:1] Service is creating new listener. pid:7.1 type:debug address:/var/lib/teleport/debug.sock service/signals.go:242
2025-03-25T21:42:31.322Z INFO [AUTH] Updating cluster configuration: StaticTokens([]). auth/init.go:509
2025-03-25T21:42:31.323Z INFO [AUTH] Creating namespace: "default". auth/init.go:516
2025-03-25T21:42:31.324Z INFO [AUTH] Creating access graph settings: kind:"access_graph_settings" version:"v1" metadata:{name:"access-graph-settings"} spec:{secrets_scan_config:ACCESS_GRAPH_SECRETS_SCAN_CONFIG_DISABLED}. auth/init.go:1022
2025-03-25T21:42:31.324Z INFO [AUTH] Creating session recording config: Kind:"session_recording_config" Version:"v2" Metadata:<Name:"session-recording-config" Namespace:"default" Labels:<key:"teleport.dev/origin" value:"defaults" > > Spec:<Mode:"node" ProxyChecksHostKeys:"\010\001" > . auth/init.go:984
2025-03-25T21:42:31.324Z INFO [AUTH] Creating cluster networking configuration: Kind:"cluster_networking_config" Version:"v2" Metadata:<Name:"cluster-networking-config" Namespace:"default" Labels:<key:"teleport.dev/origin" value:"config-file" > > Spec:<KeepAliveInterval:300000000000 KeepAliveCountMax:3 ProxyListenerMode:Multiplex TunnelStrategy:<AgentMesh:<> > > . auth/init.go:944
2025-03-25T21:42:31.327Z INFO [AUTH] Creating cluster auth preference: AuthPreference(Type="local",SecondFactors=["SECOND_FACTOR_TYPE_OTP"]). auth/init.go:903
2025-03-25T21:42:31.346Z INFO [AUTH:1] Starting migration. pid:7.1 version:1 name:create_db_cas trace_id:1b19c65b585bb11d7156b81a48d4285b span_id:0713ed480ee32404 migration/migration.go:119
2025-03-25T21:42:31.350Z INFO [AUTH:1] Completed migration. pid:7.1 version:1 name:create_db_cas trace_id:1b19c65b585bb11d7156b81a48d4285b span_id:0713ed480ee32404 migration/migration.go:139
2025-03-25T21:42:31.350Z INFO [AUTH] First start: generating openssh certificate authority. auth/init.go:701
2025-03-25T21:42:31.350Z INFO [AUTH] First start: generating oidc_idp certificate authority. auth/init.go:701
2025-03-25T21:42:31.352Z INFO [AUTH] First start: generating okta certificate authority. auth/init.go:701
2025-03-25T21:42:31.352Z INFO [AUTH] First start: generating user certificate authority. auth/init.go:701
2025-03-25T21:42:31.352Z INFO [AUTH] First start: generating host certificate authority. auth/init.go:701
2025-03-25T21:42:31.352Z INFO [AUTH] First start: generating saml_idp certificate authority. auth/init.go:701
2025-03-25T21:42:31.352Z INFO [AUTH] First start: generating db_client certificate authority. auth/init.go:701
2025-03-25T21:42:31.353Z INFO [AUTH] First start: generating db certificate authority. auth/init.go:701
2025-03-25T21:42:31.357Z INFO [AUTH] First start: generating spiffe certificate authority. auth/init.go:701
2025-03-25T21:42:31.365Z INFO [AUTH] First start: generating jwt certificate authority. auth/init.go:701
2025-03-25T21:42:31.501Z INFO emitting audit event event_type:role.created fields:map[cluster_name:Mac.echozulu.local code:T9000I ei:0 event:role.created expires:0001-01-01T00:00:00Z name:access time:2025-03-25T21:42:31.501Z trace.component:audit uid:5444c439-a000-40db-ba50-e52ca494a317 user:system] events/emitter.go:287
2025-03-25T21:42:31.503Z INFO emitting audit event event_type:role.created fields:map[cluster_name:Mac.echozulu.local code:T9000I ei:0 event:role.created expires:0001-01-01T00:00:00Z name:wildcard-workload-identity-issuer time:2025-03-25T21:42:31.504Z trace.component:audit uid:57a2c6da-4fa2-4141-af75-b80da0f0409f user:system] events/emitter.go:287
2025-03-25T21:42:31.505Z INFO emitting audit event event_type:role.created fields:map[cluster_name:Mac.echozulu.local code:T9000I ei:0 event:role.created expires:0001-01-01T00:00:00Z name:editor time:2025-03-25T21:42:31.505Z trace.component:audit uid:3fcefd5d-cd84-4e27-a52c-125ebf1b3b9e user:system] events/emitter.go:287
2025-03-25T21:42:31.507Z INFO emitting audit event event_type:role.created fields:map[cluster_name:Mac.echozulu.local code:T9000I ei:0 event:role.created expires:0001-01-01T00:00:00Z name:terraform-provider time:2025-03-25T21:42:31.508Z trace.component:audit uid:843f1007-3a4b-41fa-96e0-9691ece043ce user:system] events/emitter.go:287
2025-03-25T21:42:31.509Z INFO emitting audit event event_type:role.created fields:map[cluster_name:Mac.echozulu.local code:T9000I ei:0 event:role.created expires:0001-01-01T00:00:00Z name:auditor time:2025-03-25T21:42:31.509Z trace.component:audit uid:8307340f-af11-423b-a18a-e52dc2c998ba user:system] events/emitter.go:287
2025-03-25T21:42:31.511Z INFO [AUTH] Auth server is running periodic operations. auth/init.go:622
2025-03-25T21:42:31.514Z INFO [AUTH:COMP] upload completer starting check_interval:5m0s events/complete.go:165
2025-03-25T21:42:31.515Z INFO [PROC:1] Successfully obtained credentials to connect to the cluster. pid:7.1 identity:Admin service/connect.go:464
2025-03-25T21:42:31.517Z INFO [PROC:1] The process successfully wrote the credentials and state to the disk. pid:7.1 identity:Admin service/connect.go:508
2025-03-25T21:42:31.522Z INFO [AUTH:1:CA] Cache "auth" first init succeeded. cache/cache.go:1152
2025-03-25T21:42:31.522Z INFO [PROC:1] Service is creating new listener. pid:7.1 type:auth address:0.0.0.0:3025 service/signals.go:242
2025-03-25T21:42:31.522Z WARN [AUTH:1] 'proxy_protocol' unspecified. Starting Auth service with external PROXY protocol support, but IP pinned connection affected by PROXY headers will not be allowed. Set 'proxy_protocol: on' in 'auth_service' config if Auth service runs behind L4 load balancer with enabled PROXY protocol, or set 'proxy_protocol: off' otherwise pid:7.1 service/service.go:2399
2025-03-25T21:42:31.523Z WARN [AUTH:1] Configuration setting auth_service/advertise_ip is not set, using inferred address pid:7.1 address:172.17.0.2:3025 service/service.go:2482
2025-03-25T21:42:31.523Z INFO [WORKLOAD_] Starting to generate new CRL workloadidentityv1/revocation_service.go:607
2025-03-25T21:42:31.523Z INFO [WORKLOAD_] Finished generating new CRL revocations:0 workloadidentityv1/revocation_service.go:659
2025-03-25T21:42:31.523Z INFO [UPLOAD:1] starting upload completer service pid:7.1 service/service.go:3396
2025-03-25T21:42:31.523Z INFO [UPLOAD:1] Creating directory. pid:7.1 directory:/var/lib/teleport/log service/service.go:3412
2025-03-25T21:42:31.523Z INFO [UPLOAD:1] Creating directory. pid:7.1 directory:/var/lib/teleport/log/upload service/service.go:3412
2025-03-25T21:42:31.523Z INFO [UPLOAD:1] Creating directory. pid:7.1 directory:/var/lib/teleport/log/upload/streaming service/service.go:3412
2025-03-25T21:42:31.523Z INFO [UPLOAD:1] Creating directory. pid:7.1 directory:/var/lib/teleport/log/upload/streaming/default service/service.go:3412
2025-03-25T21:42:31.523Z INFO [UPLOAD:1] Creating directory. pid:7.1 directory:/var/lib/teleport/log service/service.go:3412
2025-03-25T21:42:31.523Z INFO [UPLOAD:1] Creating directory. pid:7.1 directory:/var/lib/teleport/log/upload service/service.go:3412
2025-03-25T21:42:31.523Z INFO [UPLOAD:1] Creating directory. pid:7.1 directory:/var/lib/teleport/log/upload/corrupted service/service.go:3412
2025-03-25T21:42:31.523Z INFO [UPLOAD:1] Creating directory. pid:7.1 directory:/var/lib/teleport/log/upload/corrupted/default service/service.go:3412
2025-03-25T21:42:31.524Z INFO [AUTH:1] Auth service is starting. pid:7.1 version:17.3.4 git_ref:v17.3.4-0-g29c19ac listen_address:172.17.0.2:3025 service/service.go:2436
2025-03-25T21:42:31.524Z INFO [AUTH:1] Starting autoupdate_agent_rollout controller pid:7.1 component:rollout-controller period:1m0s rollout/controller.go:119
2025-03-25T21:42:31.524Z INFO [UPLOAD] uploader server ready scan_dir:/var/lib/teleport/log/upload/streaming/default scan_period:5s filesessions/fileasync.go:194
2025-03-25T21:42:31.524Z INFO [UPLOAD:1:] upload completer starting check_interval:5m0s events/complete.go:165
2025-03-25T21:42:31.527Z INFO [PROC:1] Successfully obtained credentials to connect to the cluster. pid:7.1 identity:Instance service/connect.go:464
2025-03-25T21:42:31.527Z INFO [PROC:1] Successfully obtained credentials to connect to the cluster. pid:7.1 identity:Proxy service/connect.go:464
2025-03-25T21:42:31.528Z INFO [PROC:1] Successfully obtained credentials to connect to the cluster. pid:7.1 identity:Node service/connect.go:464
2025-03-25T21:42:31.530Z INFO [PROC:1] features loaded from auth server pid:7.1 identity:Instance features:Kubernetes:true App:true DB:true Desktop:true DeviceTrust:<> AccessRequests:<> AccessList:<> AccessMonitoring:<> Policy:<> SupportType:SUPPORT_TYPE_FREE JoinActiveSessions:true entitlements:<key:"AccessLists" value:<> > entitlements:<key:"AccessMonitoring" value:<> > entitlements:<key:"AccessRequests" value:<> > entitlements:<key:"App" value:<enabled:true > > entitlements:<key:"CloudAuditLogRetention" value:<> > entitlements:<key:"DB" value:<enabled:true > > entitlements:<key:"Desktop" value:<enabled:true > > entitlements:<key:"DeviceTrust" value:<> > entitlements:<key:"ExternalAuditStorage" value:<> > entitlements:<key:"FeatureHiding" value:<> > entitlements:<key:"HSM" value:<> > entitlements:<key:"Identity" value:<> > entitlements:<key:"JoinActiveSessions" value:<enabled:true > > entitlements:<key:"K8s" value:<enabled:true > > entitlements:<key:"LicenseAutoUpdate" value:<> > entitlements:<key:"MobileDeviceManagement" value:<> > entitlements:<key:"OIDC" value:<> > entitlements:<key:"OktaSCIM" value:<> > entitlements:<key:"OktaUserSync" value:<> > entitlements:<key:"Policy" value:<> > entitlements:<key:"SAML" value:<> > entitlements:<key:"SessionLocks" value:<> > entitlements:<key:"UpsellAlert" value:<> > entitlements:<key:"UsageReporting" value:<> > service/connect.go:1097
2025-03-25T21:42:31.531Z INFO [AUTH:SPIF] Obtained lock, SPIFFEFederation syncer is starting pid:7.1 machineidv1/spiffe_federation_syncer.go:219
2025-03-25T21:42:31.535Z INFO [PROC:1] The process successfully wrote the credentials and state to the disk. pid:7.1 identity:Instance service/connect.go:508
2025-03-25T21:42:31.535Z INFO [INSTANCE:] Successfully registered instance client. pid:7.1 service/service.go:2954
2025-03-25T21:42:31.535Z INFO [PROC:1] Reusing Instance client. pid:7.1 identity:Node additional_system_roles:[Auth Node Proxy] service/connect.go:1068
2025-03-25T21:42:31.535Z INFO [PROC:1] Reusing Instance client. pid:7.1 identity:Proxy additional_system_roles:[Auth Node Proxy] service/connect.go:1068
2025-03-25T21:42:31.539Z INFO [PROC:1] The process successfully wrote the credentials and state to the disk. pid:7.1 identity:Node service/connect.go:508
2025-03-25T21:42:31.543Z INFO [PROC:1] The process successfully wrote the credentials and state to the disk. pid:7.1 identity:Proxy service/connect.go:508
2025-03-25T21:42:31.548Z INFO [NODE:1:CA] Cache "node" first init succeeded. cache/cache.go:1152
2025-03-25T21:42:31.548Z INFO [PROC:1] Service is creating new listener. pid:7.1 type:node address:0.0.0.0:3022 service/signals.go:242
2025-03-25T21:42:31.548Z INFO [NODE:1] SSH Service is starting. pid:7.1 version:17.3.4 git_ref:v17.3.4-0-g29c19ac listen_address:0.0.0.0:3022 cache_policy:in-memory cache service/service.go:3191
2025-03-25T21:42:31.549Z INFO [PROXY:1:C] Cache "proxy" first init succeeded. cache/cache.go:1152
2025-03-25T21:42:31.549Z INFO [PROC:1] Service is creating new listener. pid:7.1 type:proxy:web address:0.0.0.0:3080 service/signals.go:242
2025-03-25T21:42:31.549Z WARN [PROC:1] Teleport process has exited with error. error:[ERROR REPORT:
Original Error: *trace.BadParameterError listener cannot be nil
Stack Trace:
  github.com/gravitational/teleport/lib/limiter/listener.go:40 github.com/gravitational/teleport/lib/limiter.NewListener
  github.com/gravitational/teleport/lib/limiter/limiter.go:181 github.com/gravitational/teleport/lib/limiter.(*Limiter).WrapListener
  github.com/gravitational/teleport/lib/service/service.go:4542 github.com/gravitational/teleport/lib/service.(*TeleportProcess).initProxyEndpoint
  github.com/gravitational/teleport/lib/service/service.go:3968 github.com/gravitational/teleport/lib/service.(*TeleportProcess).initProxy.func1
  github.com/gravitational/teleport/lib/service/supervisor.go:581 github.com/gravitational/teleport/lib/service.(*LocalService).Serve
  github.com/gravitational/teleport/lib/service/supervisor.go:307 github.com/gravitational/teleport/lib/service.(*LocalSupervisor).serve.func1
  runtime/asm_arm64.s:1223 runtime.goexit
User Message: listener cannot be nil] pid:7.1 service:proxy.init service/supervisor.go:313
2025-03-25T21:42:31.549Z ERRO [PROC:1] Critical service has exited with error, aborting. pid:7.1 service:proxy.init error:[ERROR REPORT:
Original Error: *trace.BadParameterError listener cannot be nil
Stack Trace:
  github.com/gravitational/teleport/lib/limiter/listener.go:40 github.com/gravitational/teleport/lib/limiter.NewListener
  github.com/gravitational/teleport/lib/limiter/limiter.go:181 github.com/gravitational/teleport/lib/limiter.(*Limiter).WrapListener
  github.com/gravitational/teleport/lib/service/service.go:4542 github.com/gravitational/teleport/lib/service.(*TeleportProcess).initProxyEndpoint
  github.com/gravitational/teleport/lib/service/service.go:3968 github.com/gravitational/teleport/lib/service.(*TeleportProcess).initProxy.func1
  github.com/gravitational/teleport/lib/service/supervisor.go:581 github.com/gravitational/teleport/lib/service.(*LocalService).Serve
  github.com/gravitational/teleport/lib/service/supervisor.go:307 github.com/gravitational/teleport/lib/service.(*LocalSupervisor).serve.func1
  runtime/asm_arm64.s:1223 runtime.goexit
User Message: listener cannot be nil] service/signals.go:174
ERROR: listener cannot be nil
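Reading the trace: initProxyEndpoint hands the proxy web listener to the connection limiter, limiter.NewListener rejects it because it is nil, and because proxy.init is a critical service the supervisor shuts the whole process down. The standalone Go sketch below mirrors that sequence; the service struct and runServices function are hypothetical and only imitate the supervision pattern suggested by the logs, not Teleport's actual implementation:

```go
// Standalone sketch of a "critical service" supervisor, loosely modeled on
// the log sequence above. Names and structure are hypothetical.
package main

import (
	"errors"
	"fmt"
	"os"
)

// service is a hypothetical stand-in for a supervised Teleport service.
type service struct {
	name     string
	critical bool
	run      func() error
}

// runServices mirrors the behavior visible in the logs: a failed service is
// logged as a warning, and a failed critical service aborts the process.
func runServices(services []service) {
	for _, s := range services {
		err := s.run()
		if err == nil {
			continue
		}
		fmt.Printf("WARN Teleport process has exited with error. error:%v service:%s\n", err, s.name)
		if s.critical {
			fmt.Printf("ERRO Critical service has exited with error, aborting. service:%s\n", s.name)
			fmt.Println("ERROR:", err)
			os.Exit(1)
		}
	}
}

func main() {
	runServices([]service{{
		name:     "proxy.init",
		critical: true,
		// Stand-in for initProxyEndpoint: with --insecure-no-tls the web
		// listener is apparently never created, so wrapping it fails.
		run: func() error { return errors.New("listener cannot be nil") },
	}})
}
```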
Expected behavior:
Teleport should not crash when --insecure-no-tls is set.
Current behavior:
Teleport crashes when --insecure-no-tls is set, with the following error:
ERROR: listener cannot be nil
Bug details:
Config generated with: teleport configure -o file:///tmp/teleport.yaml (see the version, recreation steps, and debug logs above).
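As a rough illustration of the expected behavior, the hypothetical sketch below (not a patch against Teleport; requireListener and webListener are invented names, and the assumption that the flag leaves the web listener nil comes only from the stack trace) validates the listener up front and reports a descriptive configuration error instead of failing deep inside proxy initialization:

```go
// Hypothetical sketch, not Teleport code: turn a missing proxy web listener
// into an actionable configuration error before services start.
package main

import (
	"fmt"
	"net"
	"os"
)

// requireListener is a stand-in for an early check that converts a nil
// listener into a clear, user-facing error message.
func requireListener(name string, l net.Listener) error {
	if l == nil {
		return fmt.Errorf("proxy %s listener was not created; this configuration cannot be used with --insecure-no-tls", name)
	}
	return nil
}

func main() {
	// Assumption for this sketch: under --insecure-no-tls the web listener
	// is never created, so it is still nil when the proxy would start.
	var webListener net.Listener

	if err := requireListener("web", webListener); err != nil {
		fmt.Fprintln(os.Stderr, "ERROR:", err)
		os.Exit(1)
	}
}
```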