
Conversation

@boxofrad (Contributor) commented Jan 9, 2026

This log looks pretty concerning but is actually totally normal and expected: it just means another Auth Service instance currently holds the auth_server/auto_update_bot_version_reporter semaphore, so this one waits and retries. We've had a number of customer reports of it being confusing.

```
2025-12-03T14:39:40.989Z DEBU  Waiting to retry operation again wait:43.025160021s error:[
ERROR REPORT:
Original Error: *trace.LimitExceededError cannot acquire semaphore auth_server/auto_update_bot_version_reporter (err-max-leases)
Stack Trace:
	github.com/gravitational/teleport/[email protected]/types/semaphore.go:147 github.com/gravitational/teleport/api/types.(*SemaphoreV3).Acquire
	github.com/gravitational/teleport/lib/services/local/presence.go:697 github.com/gravitational/teleport/lib/services/local.(*PresenceService).acquireSemaphore
	github.com/gravitational/teleport/lib/services/local/presence.go:623 github.com/gravitational/teleport/lib/services/local.(*PresenceService).AcquireSemaphore
	github.com/gravitational/teleport/lib/services/semaphore.go:292 github.com/gravitational/teleport/lib/services.AcquireSemaphoreLock
	github.com/gravitational/teleport/lib/services/semaphore.go:330 github.com/gravitational/teleport/lib/services.AcquireSemaphoreLockWithRetry.func1
	github.com/gravitational/teleport/[email protected]/utils/retryutils/retry.go:185 github.com/gravitational/teleport/api/utils/retryutils.(*Linear).For
	github.com/gravitational/teleport/lib/services/semaphore.go:329 github.com/gravitational/teleport/lib/services.AcquireSemaphoreLockWithRetry
	github.com/gravitational/teleport/lib/auth/machineid/machineidv1/auto_update_version_reporter.go:163 github.com/gravitational/teleport/lib/auth/machineid/machineidv1.(*AutoUpdateVersionReporter).runLeader
	github.com/gravitational/teleport/lib/auth/machineid/machineidv1/auto_update_version_reporter.go:137 github.com/gravitational/teleport/lib/auth/machineid/machineidv1.(*AutoUpdateVersionReporter).Run
	github.com/gravitational/teleport/lib/service/service.go:2580 github.com/gravitational/teleport/lib/service.(*TeleportProcess).initAuthService.func8
	github.com/gravitational/teleport/lib/service/supervisor.go:604 github.com/gravitational/teleport/lib/service.(*LocalService).Serve
	github.com/gravitational/teleport/lib/service/supervisor.go:327 github.com/gravitational/teleport/lib/service.(*LocalSupervisor).serve.func1
	runtime/asm_amd64.s:1700 runtime.goexit
User Message: cannot acquire semaphore auth_server/auto_update_bot_version_reporter (err-max-leases)] retryutils/retry.go:193
```

@boxofrad added the `machine-id`, `no-changelog`, and `backport/branch/v18` labels Jan 9, 2026
```go
	slog.DebugContext(ctx, "Waiting to retry operation again", "wait", wait.String(), "error", err)
} else {
	r.BeforeRetry(ctx, err, wait, r.attempt)
}
```
@rosstimothy (Contributor) commented Jan 9, 2026
Would this be less scary/noisy if we didn't include the stack trace? Can we remove the callback and always unwrap the error instead?

		slog.DebugContext(ctx, "Waiting to retry operation again", "wait", r.Duration().String(), "error", trace.Unwrap(err))

Contributor
Yeah - I think removing the stack trace makes this much less visually disruptive.
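
For context on the suggested fix: `trace.Unwrap` comes from the `github.com/gravitational/trace` package and returns the innermost wrapped error. Below is a minimal sketch (not code from this PR) of why that removes the noise, assuming the standard trace API:

```go
package main

import (
	"fmt"

	"github.com/gravitational/trace"
)

func main() {
	// trace.LimitExceeded returns the error wrapped with a captured stack trace.
	err := trace.LimitExceeded("cannot acquire semaphore %s (%s)",
		"auth_server/auto_update_bot_version_reporter", "err-max-leases")

	// Verbose rendering: the multi-line "ERROR REPORT" with the stack trace,
	// which is what makes the debug log above look alarming.
	fmt.Println(trace.DebugReport(err))

	// Unwrapped error: just the message, which is all the retry log needs.
	fmt.Println(trace.Unwrap(err).Error())
}
```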
