Description
We are seeing occasional errors logged in NewRelic that appear to be originating in the NewRelic gem code. Here is a sample backtrace:
NoMethodError: undefined method `add_request_headers' for nil:NilClass
…ms/newrelic_rpm-9.16.1/lib/new_relic/agent/instrumentation/net_http/instrumentation.rb:24:in `request_with_tracing'
…undle/gems/newrelic_rpm-9.16.1/lib/new_relic/agent/instrumentation/net_http/prepend.rb:15:in `request'
…local/bundle/gems/aws-sdk-core-3.214.0/lib/seahorse/client/net_http/connection_pool.rb:348:in `request'
/usr/local/bundle/gems/aws-sdk-core-3.214.0/lib/seahorse/client/net_http/handler.rb:80:in `block in transmit'
/usr/local/bundle/gems/aws-sdk-core-3.214.0/lib/seahorse/client/net_http/handler.rb:134:in `block in session'
…local/bundle/gems/aws-sdk-core-3.214.0/lib/seahorse/client/net_http/connection_pool.rb:106:in `session_for'
/usr/local/bundle/gems/aws-sdk-core-3.214.0/lib/seahorse/client/net_http/handler.rb:129:in `session'
/usr/local/bundle/gems/aws-sdk-core-3.214.0/lib/seahorse/client/net_http/handler.rb:77:in `transmit'
/usr/local/bundle/gems/aws-sdk-core-3.214.0/lib/seahorse/client/net_http/handler.rb:46:in `block in call'
/usr/local/bundle/gems/aws-sdk-core-3.214.0/lib/seahorse/client/net_http/handler.rb:206:in `block in span_wrapper'
/usr/local/bundle/gems/aws-sdk-core-3.214.0/lib/aws-sdk-core/telemetry/no_op.rb:29:in `in_span'
/usr/local/bundle/gems/aws-sdk-core-3.214.0/lib/seahorse/client/net_http/handler.rb:202:in `span_wrapper'
/usr/local/bundle/gems/aws-sdk-core-3.214.0/lib/seahorse/client/net_http/handler.rb:45:in `call'
…r/local/bundle/gems/aws-sdk-core-3.214.0/lib/seahorse/client/plugins/content_length.rb:24:in `call'
…local/bundle/gems/aws-sdk-core-3.214.0/lib/seahorse/client/plugins/request_callback.rb:118:in `call'
/usr/local/bundle/gems/aws-sdk-s3-1.176.1/lib/aws-sdk-s3/plugins/s3_signer.rb:78:in `call'
/usr/local/bundle/gems/aws-sdk-s3-1.176.1/lib/aws-sdk-s3/plugins/s3_host_id.rb:17:in `call'
/usr/local/bundle/gems/aws-sdk-s3-1.176.1/lib/aws-sdk-s3/plugins/http_200_errors.rb:17:in `call'
/usr/local/bundle/gems/aws-sdk-core-3.214.0/lib/aws-sdk-core/xml/error_handler.rb:10:in `call'
/usr/local/bundle/gems/aws-sdk-core-3.214.0/lib/aws-sdk-core/plugins/sign.rb:53:in `call'
…r/local/bundle/gems/aws-sdk-core-3.214.0/lib/aws-sdk-core/plugins/transfer_encoding.rb:27:in `call'
…cal/bundle/gems/aws-sdk-core-3.214.0/lib/aws-sdk-core/plugins/helpful_socket_errors.rb:12:in `call'
/usr/local/bundle/gems/aws-sdk-s3-1.176.1/lib/aws-sdk-s3/plugins/s3_signer.rb:53:in `call'
/usr/local/bundle/gems/aws-sdk-s3-1.176.1/lib/aws-sdk-s3/plugins/redirects.rb:20:in `call'
/usr/local/bundle/gems/aws-sdk-core-3.214.0/lib/aws-sdk-core/plugins/user_agent.rb:69:in `call'
<truncated 26 additional frames>
…/bundle/gems/aws-sdk-core-3.214.0/lib/seahorse/client/plugins/raise_response_errors.rb:16:in `call'
/usr/local/bundle/gems/aws-sdk-s3-1.176.1/lib/aws-sdk-s3/plugins/sse_cpk.rb:24:in `call'
/usr/local/bundle/gems/aws-sdk-s3-1.176.1/lib/aws-sdk-s3/plugins/dualstack.rb:21:in `call'
/usr/local/bundle/gems/aws-sdk-s3-1.176.1/lib/aws-sdk-s3/plugins/accelerate.rb:43:in `call'
…/local/bundle/gems/aws-sdk-core-3.214.0/lib/aws-sdk-core/plugins/checksum_algorithm.rb:111:in `call'
…local/bundle/gems/aws-sdk-core-3.214.0/lib/aws-sdk-core/plugins/jsonvalue_converter.rb:16:in `call'
/usr/local/bundle/gems/aws-sdk-core-3.214.0/lib/aws-sdk-core/plugins/invocation_id.rb:16:in `call'
…r/local/bundle/gems/aws-sdk-core-3.214.0/lib/aws-sdk-core/plugins/idempotency_token.rb:19:in `call'
/usr/local/bundle/gems/aws-sdk-core-3.214.0/lib/aws-sdk-core/plugins/param_converter.rb:26:in `call'
…local/bundle/gems/aws-sdk-core-3.214.0/lib/seahorse/client/plugins/request_callback.rb:89:in `call'
/usr/local/bundle/gems/aws-sdk-core-3.214.0/lib/aws-sdk-core/plugins/response_paging.rb:12:in `call'
…/local/bundle/gems/aws-sdk-core-3.214.0/lib/seahorse/client/plugins/response_target.rb:24:in `call'
/usr/local/bundle/gems/aws-sdk-core-3.214.0/lib/aws-sdk-core/plugins/telemetry.rb:39:in `block in call'
/usr/local/bundle/gems/aws-sdk-core-3.214.0/lib/aws-sdk-core/telemetry/no_op.rb:29:in `in_span'
/usr/local/bundle/gems/aws-sdk-core-3.214.0/lib/aws-sdk-core/plugins/telemetry.rb:53:in `span_wrapper'
/usr/local/bundle/gems/aws-sdk-core-3.214.0/lib/aws-sdk-core/plugins/telemetry.rb:39:in `call'
/usr/local/bundle/gems/aws-sdk-core-3.214.0/lib/seahorse/client/request.rb:72:in `send_request'
/usr/local/bundle/gems/aws-sdk-s3-1.176.1/lib/aws-sdk-s3/client.rb:17185:in `put_object'
/usr/src/app/app/services/logging_caching/logger_helper.rb:142:in `save_to_s3'
Specifically, it appears to be blowing up at instrumentation.rb:24 in `request_with_tracing`, meaning that NewRelic::Agent::Tracer.start_external_request_segment seems to be incorrectly returning nil.
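Purely for illustration, here is a minimal sketch of the failure mode implied by the backtrace. All names below are hypothetical stand-ins, not the gem's actual source; the point is only that a nil return from the tracer call turns the very next method call into exactly the NoMethodError we see, and that a safe-navigation guard would degrade gracefully instead.

```ruby
# Hypothetical stand-in for a tracer segment; a real segment would
# inject distributed-tracing headers into the outgoing request.
FakeSegment = Struct.new(:name) do
  def add_request_headers(request)
    request['traceparent'] = 'stub'
  end
end

# Mirrors the shape of the code at instrumentation.rb:24: the return
# value of the tracer call is used without a nil check.
def request_with_tracing(request, tracer)
  segment = tracer.call # stands in for Tracer.start_external_request_segment
  segment.add_request_headers(request) # NoMethodError when segment is nil
  request
end

# A defensive variant: safe navigation skips instrumentation when the
# tracer returns nil, instead of crashing the caller's request.
def request_with_tracing_guarded(request, tracer)
  segment = tracer.call
  segment&.add_request_headers(request)
  request
end
```

With a healthy tracer both variants behave identically; with a nil-returning tracer the first raises the NoMethodError above, while the guarded variant completes the request uninstrumented.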
Expected Behavior
NewRelic instrumentation works reliably when multiple requests are in flight simultaneously on different threads under JRuby.
Troubleshooting or NR Diag results
Steps to Reproduce
This appears to be a race condition. We see it quite frequently when multiple requests are running at once on different threads, but it is difficult to reproduce deliberately.
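We cannot share a deterministic reproduction, but the shape of the load that triggers it is roughly the following. Everything here is a stub: the fake segment stands in for a real tracer segment, and the forced nil on every tenth call simulates the intermittent nil we suspect the race produces.

```ruby
# Simulated concurrent load: many threads performing instrumented calls,
# with the tracer intermittently yielding nil. All names are hypothetical.
fake_segment = Object.new
def fake_segment.add_request_headers(_request); end

traced_upload = lambda do |segment|
  segment.add_request_headers({}) # NoMethodError when segment is nil
  :uploaded
end

results = []
errors  = []
mutex   = Mutex.new

threads = (0...20).map do |i|
  Thread.new do
    # Deterministically inject the suspected failure: 2 of 20 calls get nil.
    segment = (i % 10 == 3) ? nil : fake_segment
    begin
      r = traced_upload.call(segment)
      mutex.synchronize { results << r }
    rescue NoMethodError => e
      mutex.synchronize { errors << e }
    end
  end
end
threads.each(&:join)

puts "#{results.size} uploads succeeded, #{errors.size} hit a nil segment"
# prints "18 uploads succeeded, 2 hit a nil segment"
```

In production the nil appears far less predictably than this stub suggests, but the resulting backtrace matches the one above.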
Your Environment
JRuby 9.4.9.0 (jruby:9.4.9.0-jdk21 Docker image)
newrelic_rpm 9.16.1
Additional context
For Maintainers Only or Hero Triaging this bug
Suggested Priority (P1,P2,P3,P4,P5):
Suggested T-Shirt size (S, M, L, XL, Unknown):