Description
Logstash information:
Please include the following information:
- Logstash version: 8.17.3
- Logstash installation source (e.g. built from source, with a package manager: DEB/RPM, expanded from tar or zip archive, docker):
Logstash is installed by downloading the official tar.gz archive (logstash-8.17.3-linux-x86_64.tar.gz) directly from Elastic's artifacts site, verifying its checksum, and extracting it inside the Docker image. The Dockerfile is based on mcr.microsoft.com/openjdk/jdk:17-mariner; Logstash is not installed via a package manager or built from source.
- How is Logstash being run (e.g. as a service/service manager: systemd, upstart, etc. Via command line, docker/kubernetes):
Logstash runs inside a Docker container and is started via the docker command line.
Plugins installed (bin/logstash-plugin list --verbose):
logstash-filter-prune, logstash-output-kusto
JVM (e.g. java -version):
If the affected version of Logstash is 7.9 (or earlier), or if it is NOT using the bundled JDK or using the 'no-jdk' version in 7.10 (or higher), please provide the following information:
- JVM version (java -version): OpenJDK 17
- JVM installation source (e.g. from the Operating System's package manager, from source, etc.): From the base image mcr.microsoft.com/openjdk/jdk:17-mariner, which includes the Microsoft Build of OpenJDK 17 pre-installed (i.e., bundled in the container image, not installed via the OS package manager or built from source).
- Value of the LS_JAVA_HOME environment variable if set.
OS version (uname -a if on a Unix-like system):
Linux 5.15.0-1082-azure #91~20.04.1-Ubuntu SMP Tue Feb 25 03:23:03 UTC 2025 x86_64 x86_64 x86_64 GNU/Linux
Description of the problem including expected versus actual behavior:
Steps to reproduce:
Please include a minimal but complete recreation of the problem, including (e.g.) pipeline definition(s), settings, locale, etc. The easier you make it for us to reproduce it, the more likely that somebody will take the time to look at it.
- The basic flow of our service: backend service ---> RabbitMQ ---> Logstash ---> Kusto
- We observed that the service had been running normally for a few days, but suddenly Logstash stopped consuming messages from RabbitMQ. Upon checking the Logstash container's status and logs, we noticed that the container had recently restarted and had only been up for a few seconds. Further inspection of the container logs revealed a java.lang.OutOfMemoryError: Java heap space error:
Dumping heap to java_pid1.hprof ...
Heap dump file created [1138047821 bytes in 4.153 secs]
[2025-04-07T04:42:29,756][FATAL][org.logstash.Logstash ][txlog] uncaught error (in thread [txlog]>worker24)
java.lang.OutOfMemoryError: Java heap space
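For scale, a rough back-of-envelope sketch (assuming a purely hypothetical average in-memory event size of ~64 KiB; actual event sizes vary) shows how the worst-case in-flight volume implied by the startup log below (32 workers, batch size 512, i.e. pipeline.max_inflight=16384) could by itself reach the entire 1 GB heap set by -Xmx1g:

```python
# Worst-case in-flight event memory estimate for the pipeline settings
# reported in the startup log. The 64 KiB average event size is an
# assumption for illustration only, not a measured value.
workers = 32                      # pipeline.workers (from the log)
batch_size = 512                  # pipeline.batch.size (from the log)
max_inflight = workers * batch_size

assumed_event_bytes = 64 * 1024   # hypothetical average decoded event size

inflight_bytes = max_inflight * assumed_event_bytes
print(max_inflight)                    # 16384, matching pipeline.max_inflight in the log
print(inflight_bytes / (1024 ** 3))    # 1.0, i.e. the whole -Xmx1g heap
```

If that arithmetic is in the right ballpark for these events, raising -Xmx (via jvm.options or the LS_JAVA_OPTS environment variable) or lowering pipeline.batch.size / pipeline.workers may be worth trying as a mitigation while the underlying issue is investigated.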
Provide logs (if relevant):
docker logs a7205dbf703a 2>&1 | head -n 100
Using bundled JDK: /usr/share/logstash/jdk
Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
[2025-04-07T04:42:09,717][WARN ][logstash.runner ] NOTICE: Running Logstash as a superuser is strongly discouraged as it poses a security risk. Set 'allow_superuser' to false for better security.
[2025-04-07T04:42:09,724][INFO ][logstash.runner ] Log4j configuration path used is: /usr/share/logstash/config/log4j2.properties
[2025-04-07T04:42:09,726][WARN ][logstash.runner ] The use of JAVA_HOME has been deprecated. Logstash 8.0 and later ignores JAVA_HOME and uses the bundled JDK. Running Logstash with the bundled JDK is recommended. The bundled JDK has been verified to work with each specific version of Logstash, and generally provides best performance and reliability. If you have compelling reasons for using your own JDK (organizational-specific compliance requirements, for example), you can configure LS_JAVA_HOME to use that version instead.
[2025-04-07T04:42:09,726][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.17.3", "jruby.version"=>"jruby 9.4.9.0 (3.1.4) 2024-11-04 547c6b150e OpenJDK 64-Bit Server VM 21.0.6+7-LTS on 21.0.6+7-LTS +indy +jit [x86_64-linux]"}
[2025-04-07T04:42:09,729][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
[2025-04-07T04:42:09,760][INFO ][org.logstash.jackson.StreamReadConstraintsUtil] Jackson default value override logstash.jackson.stream-read-constraints.max-string-length configured to 200000000
[2025-04-07T04:42:09,761][INFO ][org.logstash.jackson.StreamReadConstraintsUtil] Jackson default value override logstash.jackson.stream-read-constraints.max-number-length configured to 10000
[2025-04-07T04:42:09,905][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2025-04-07T04:42:10,445][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2025-04-07T04:42:11,199][INFO ][org.reflections.Reflections] Reflections took 76 ms to scan 1 urls, producing 152 keys and 530 values
[2025-04-07T04:42:11,472][INFO ][logstash.filters.ruby.script] Test run complete {:script_path=>"/usr/share/logstash/sanitize_txlog.rb", :results=>{:passed=>0, :failed=>0, :errored=>0}}
[2025-04-07T04:42:11,548][INFO ][logstash.codecs.jsonlines] ECS compatibility is enabled but target option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the target option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2025-04-07T04:42:11,910][INFO ][logstash.javapipeline ] Pipeline txlog is configured with pipeline.ecs_compatibility: v8 setting. All plugins in this pipeline will default to ecs_compatibility => v8 unless explicitly configured otherwise.
[2025-04-07T04:42:11,948][INFO ][logstash.outputs.kusto ][txlog] Preparing Kusto resources.
[2025-04-07T04:42:11,949][INFO ][logstash.outputs.kusto ][txlog] Using user managed identity.
[2025-04-07T04:42:12,354][INFO ][com.microsoft.azure.kusto.ingest.QueuedIngestClientImpl][txlog] Creating a new IngestClient
[2025-04-07T04:42:12,358][INFO ][com.microsoft.azure.kusto.data.http.HttpClientFactory][txlog] Creating new CloseableHttpClient client
[2025-04-07T04:42:12,511][INFO ][com.azure.core.http.netty.implementation.NettyUtility][txlog] The following is Netty version information that was found on the classpath: 'io.netty:netty-common' version: 4.1.118.Final, 'io.netty:netty-handler' version: 4.1.118.Final, 'io.netty:netty-handler-proxy' version: 4.1.118.Final, 'io.netty:netty-buffer' version: 4.1.118.Final, 'io.netty:netty-codec' version: 4.1.118.Final, 'io.netty:netty-codec-http' version: 4.1.118.Final, 'io.netty:netty-codec-http2' version: 4.1.118.Final, 'io.netty:netty-transport-native-unix-common' version: 4.1.118.Final, 'io.netty:netty-transport-native-epoll' version: 4.1.118.Final, 'io.netty:netty-transport-native-kqueue' version: 4.1.118.Final. The version of azure-core-http-netty being used was built with Netty version 4.1.108.Final and Netty Tcnative version 2.0.65.Final. If your application runs without issue this message can be ignored, otherwise please align the Netty versions used in your application. For more information, see https://aka.ms/azsdk/java/dependency/troubleshoot.
[2025-04-07T04:42:12,580][INFO ][com.microsoft.azure.kusto.ingest.ResourceManager][txlog] Refreshing Ingestion Resources
[2025-04-07T04:42:12,600][INFO ][com.microsoft.azure.kusto.data.ExponentialRetry][txlog] execute: Attempt 0
[2025-04-07T04:42:12,601][INFO ][logstash.filters.json ][txlog] ECS compatibility is enabled but target option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the target option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2025-04-07T04:42:12,612][INFO ][logstash.javapipeline ][txlog] Starting pipeline {:pipeline_id=>"txlog", "pipeline.workers"=>32, "pipeline.batch.size"=>512, "pipeline.batch.delay"=>500, "pipeline.max_inflight"=>16384, "pipeline.sources"=>["/usr/share/logstash/pipeline/logstash.conf"], :thread=>"#<Thread:0x127bc4a6 /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:138 run>"}
[2025-04-07T04:42:14,175][INFO ][logstash.javapipeline ][txlog] Pipeline Java execution initialization time {"seconds"=>1.56}
[2025-04-07T04:42:14,179][INFO ][com.azure.identity.ManagedIdentityCredential][txlog] Azure Identity => Managed Identity environment: AZURE VM IMDS ENDPOINT
[2025-04-07T04:42:14,228][INFO ][logstash.javapipeline ][txlog] Pipeline started {"pipeline.id"=>"txlog"}
[2025-04-07T04:42:14,241][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:txlog], :non_running_pipelines=>[]}
[2025-04-07T04:42:14,609][INFO ][logstash.inputs.rabbitmq ][txlog][loggingbot] Connected to RabbitMQ {:url=>"amqp://guest:XXXXXX@localhost:5672/"}
[2025-04-07T04:42:14,632][INFO ][logstash.inputs.rabbitmq ][txlog][loggingbot] Connected to RabbitMQ {:url=>"amqp://guest:XXXXXX@localhost:5672/"}
[2025-04-07T04:42:14,641][INFO ][logstash.inputs.rabbitmq ][txlog][loggingbot] Connected to RabbitMQ {:url=>"amqp://guest:XXXXXX@localhost:5672/"}
[2025-04-07T04:42:14,688][INFO ][logstash.inputs.rabbitmq ][txlog][loggingbot] Connected to RabbitMQ {:url=>"amqp://guest:XXXXXX@localhost:5672/"}
[2025-04-07T04:42:14,724][INFO ][logstash.inputs.rabbitmq ][txlog][loggingbot] Declaring exchange 'logging_fanout' with type fanout
[2025-04-07T04:42:14,724][INFO ][logstash.inputs.rabbitmq ][txlog][loggingbot] Declaring exchange 'logging_fanout' with type fanout
[2025-04-07T04:42:14,724][INFO ][logstash.inputs.rabbitmq ][txlog][loggingbot] Declaring exchange 'logging_fanout' with type fanout
[2025-04-07T04:42:14,730][INFO ][logstash.inputs.rabbitmq ][txlog][loggingbot] Declaring exchange 'logging_fanout' with type fanout
/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/march_hare-4.6.0-java/lib/march_hare/queue.rb:161: warning: already initialized constant org.jruby.gen::InterfaceImpl868482742
/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/march_hare-4.6.0-java/lib/march_hare/queue.rb:161: warning: already initialized constant org.jruby.gen::InterfaceImpl868482742
/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/march_hare-4.6.0-java/lib/march_hare/queue.rb:161: warning: already initialized constant org.jruby.gen::InterfaceImpl868482742
[2025-04-07T04:42:15,014][INFO ][logstash.outputs.kusto ][txlog][51d1baf26915b795cd423d866799563862c713d431f72864949dc1384826ab9c] Opening file {:path=>"/kusto/apistats/2025-04-06-09-38data.txt.Txlog.UserLog"}
[2025-04-07T04:42:15,016][INFO ][logstash.outputs.kusto ][txlog][51d1baf26915b795cd423d866799563862c713d431f72864949dc1384826ab9c] Creating directory {:directory=>"/kusto/apistats"}
[2025-04-07T04:42:15,381][INFO ][logstash.outputs.kusto ][txlog][51d1baf26915b795cd423d866799563862c713d431f72864949dc1384826ab9c] Opening file {:path=>"/kusto/apistats/2025-04-06-09-44data.txt.Txlog.UserLog"}
[2025-04-07T04:42:15,400][INFO ][com.microsoft.azure.kusto.ingest.ResourceManager][txlog] Refreshing Ingestion Resources Finished
[2025-04-07T04:42:15,424][INFO ][com.microsoft.azure.kusto.ingest.ResourceManager][txlog] Refreshing Ingestion Auth Token
[2025-04-07T04:42:15,428][INFO ][com.azure.identity.ManagedIdentityCredential][txlog] Azure Identity => Managed Identity environment: AZURE VM IMDS ENDPOINT
[2025-04-07T04:42:15,445][INFO ][com.microsoft.azure.kusto.ingest.ResourceManager][txlog] Refreshing Ingestion Auth Token Finished
java.lang.OutOfMemoryError: Java heap space
Dumping heap to java_pid1.hprof ...
Heap dump file created [1138047821 bytes in 4.153 secs]
[2025-04-07T04:42:29,756][FATAL][org.logstash.Logstash ][txlog] uncaught error (in thread [txlog]>worker24)
java.lang.OutOfMemoryError: Java heap space
at java.lang.invoke.Invokers$Holder.invokeExact_MT(java/lang/invoke/Invokers$Holder) ~[?:?]
at jdk.internal.reflect.DirectMethodHandleAccessor.invokeImpl(jdk/internal/reflect/DirectMethodHandleAccessor.java:154) ~[?:?]
at jdk.internal.reflect.DirectMethodHandleAccessor.invoke(jdk/internal/reflect/DirectMethodHandleAccessor.java:103) ~[?:?]
at java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:580) ~[?:?]
at org.logstash.ackedqueue.Queue.deserialize(org/logstash/ackedqueue/Queue.java:759) ~[logstash-core.jar:?]
at org.logstash.ackedqueue.Batch.deserializeElements(org/logstash/ackedqueue/Batch.java:89) ~[logstash-core.jar:?]
at org.logstash.ackedqueue.Batch.&lt;init&gt;(org/logstash/ackedqueue/Batch.java:49) ~[logstash-core.jar:?]
at org.logstash.ackedqueue.Queue$SerializedBatchHolder.deserialize(org/logstash/ackedqueue/Queue.java:917) ~[logstash-core.jar:?]
at org.logstash.ackedqueue.Queue.readBatch(org/logstash/ackedqueue/Queue.java:616) ~[logstash-core.jar:?]
at org.logstash.ackedqueue.ext.JRubyAckedQueueExt.readBatch(org/logstash/ackedqueue/ext/JRubyAckedQueueExt.java:158) ~[logstash-core.jar:?]
at org.logstash.ackedqueue.AckedReadBatch.create(org/logstash/ackedqueue/AckedReadBatch.java:49) ~[logstash-core.jar:?]
at org.logstash.ext.JrubyAckedReadClientExt.readBatch(org/logstash/ext/JrubyAckedReadClientExt.java:87) ~[logstash-core.jar:?]
at org.logstash.execution.WorkerLoop.run(org/logstash/execution/WorkerLoop.java:82) ~[logstash-core.jar:?]
at java.lang.invoke.DirectMethodHandle$Holder.invokeSpecial(java/lang/invoke/DirectMethodHandle$Holder) ~[?:?]
at java.lang.invoke.LambdaForm$MH/0x00007f5a3c5e5400.invoke(java/lang/invoke/LambdaForm$MH) ~[?:?]
at java.lang.invoke.Invokers$Holder.invokeExact_MT(java/lang/invoke/Invokers$Holder) ~[?:?]
at jdk.internal.reflect.DirectMethodHandleAccessor.invokeImpl(jdk/internal/reflect/DirectMethodHandleAccessor.java:153) ~[?:?]
at jdk.internal.reflect.DirectMethodHandleAccessor.invoke(jdk/internal/reflect/DirectMethodHandleAccessor.java:103) ~[?:?]
at java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:580) ~[?:?]
at org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:300) ~[jruby.jar:?]
at org.jruby.javasupport.JavaMethod.invokeDirect(org/jruby/javasupport/JavaMethod.java:164) ~[jruby.jar:?]
at RUBY.start_workers(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:308) ~[?:?]
at org.jruby.RubyProc.call(org/jruby/RubyProc.java:354) ~[jruby.jar:?]
[2025-04-07T04:42:29,758][FATAL][org.logstash.Logstash ][txlog] uncaught error (in thread [txlog]>worker22)
java.lang.OutOfMemoryError: Java heap space
at java.nio.CharBuffer.wrap(java/nio/CharBuffer.java:436) ~[?:?]
at com.jrjackson.RubyStringConverter.convert(com/jrjackson/RubyStringConverter.java:15) ~[jrjackson-1.2.37.jar:?]
at com.jrjackson.RubyHandler.treatString(com/jrjackson/RubyHandler.java:96) ~[jrjackson-1.2.37.jar:?]
at com.jrjackson.JrParse.handleRubyToken(com/jrjackson/JrParse.java:106) ~[jrjackson-1.2.37.jar:?]
at com.jrjackson.JrParse.deserialize(com/jrjackson/JrParse.java:30) ~[jrjackson-1.2.37.jar:?]
at com.jrjackson.JrJacksonRuby.__parse(com/jrjackson/JrJacksonRuby.java:85) ~[jrjackson-1.2.37.jar:?]
at com.jrjackson.JrJacksonRuby.parse(com/jrjackson/JrJacksonRuby.java:63) ~[jrjackson-1.2.37.jar:?]
at java.lang.invoke.DirectMethodHandle$Holder.invokeStatic(java/lang/invoke/DirectMethodHandle$Holder) ~[?:?]
at java.lang.invoke.LambdaForm$MH/0x00007f5a3d1abc00.invoke(java/lang/invoke/LambdaForm$MH) ~[?:?]
at java.lang.invoke.DelegatingMethodHandle$Holder.delegate(java/lang/invoke/DelegatingMethodHandle$Holder) ~[?:?]
at java.lang.invoke.LambdaForm$MH/0x00007f5a3c6ec800.guard(java/lang/invoke/LambdaForm$MH) ~[?:?]
at java.lang.invoke.DelegatingMethodHandle$Holder.delegate(java/lang/invoke/DelegatingMethodHandle$Holder) ~[?:?]
at java.lang.invoke.LambdaForm$MH/0x00007f5a3c6ec800.guard(java/lang/invoke/LambdaForm$MH) ~[?:?]
at java.lang.invoke.Invokers$Holder.linkToCallSite(java/lang/invoke/Invokers$Holder) ~[?:?]
at usr.share.logstash.logstash_minus_core.lib.logstash.json.jruby_load(/usr/share/logstash/logstash-core/lib/logstash/json.rb:29) ~[?:?]
at java.lang.invoke.DirectMethodHandle$Holder.invokeStatic(java/lang/invoke/DirectMethodHandle$Holder) ~[?:?]
at java.lang.invoke.LambdaForm$MH/0x00007f5a3c739c00.invoke(java/lang/invoke/LambdaForm$MH) ~[?:?]
at java.lang.invoke.DelegatingMethodHandle$Holder.delegate(java/lang/invoke/DelegatingMethodHandle$Holder) ~[?:?]
at java.lang.invoke.LambdaForm$MH/0x00007f5a3c6e9000.guard(java/lang/invoke/LambdaForm$MH) ~[?:?]
at java.lang.invoke.DelegatingMethodHandle$Holder.delegate(java/lang/invoke/DelegatingMethodHandle$Holder) ~[?:?]
at java.lang.invoke.LambdaForm$MH/0x00007f5a3c6e9000.guard(java/lang/invoke/LambdaForm$MH) ~[?:?]
at java.lang.invoke.Invokers$Holder.linkToCallSite(java/lang/invoke/Invokers$Holder) ~[?:?]
at usr.share.logstash.vendor.bundle.jruby.$3_dot_1_dot_0.gems.logstash_minus_filter_minus_json_minus_3_dot_2_dot_1.lib.logstash.filters.json.filter(/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-filter-json-3.2.1/lib/logstash/filters/json.rb:84) ~[?:?]
at java.lang.invoke.LambdaForm$DMH/0x00007f5a3cda5000.invokeStatic(java/lang/invoke/LambdaForm$DMH) ~[?:?]