Comments (19)

yawkat commented on May 28, 2024

you could try setting micronaut.server.netty.server-type: full_content

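In a plain application.yml this would be (a sketch, showing only the property named above):

micronaut:
  server:
    netty:
      server-type: full_content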

katoquro commented on May 28, 2024

Hello.
We don't use multipart data at all. Recently I deployed a new service that answers only health checks, Prometheus metrics, and rare POSTs with data to store in Mongo. It is a very simple microservice, so I gave it 0.5 GB of RAM, and I see an OOM there every day or two.
We use MN 4.2.0, Netty, NO GraalVM, and Project Reactor everywhere.
I'll try to investigate a bit deeper later


yawkat commented on May 28, 2024

@loicmathieu only works if you lower your max-request-size to something that fits in memory.


loicmathieu commented on May 28, 2024

@yawkat one user confirmed that using the following configuration fixes the issue (or works around it):

configuration:
  micronaut:
    server:
      max-request-size: 1GB
      netty:
        server-type: full_content


loicmathieu commented on May 28, 2024

I don't know if it is of any help, but I noticed in a heap dump that in DelayedExecutionFlowImpl there is a head attribute which contains a next attribute, which contains a next attribute... recursively, with no apparent end. It looks like all the DelayedExecutionFlowImpl instances are the next of a parent one.


loicmathieu commented on May 28, 2024

cc @yawkat


loicmathieu commented on May 28, 2024

More information to help diagnose the issue:
A single StreamingByteBody is holding 6 million DelayedExecutionFlowImpl$OnErrorResume objects through a RequestLifecycle lambda, retaining 1.6 GB.
[heap dump screenshot]


tchiotludo commented on May 28, 2024

Just raw information: our whole application is broken due to this memory leak and customers and users are complaining; we tried multiple workarounds with no success 😭
We also tried to make a PR, but the HTTP server part is definitely really complex for newcomers.
If you have any workaround advice, it would be awesome.


yawkat commented on May 28, 2024

@tchiotludo please give us some way to reproduce this. The form/multipart code is very complex and I don't see a starting point for debugging here


loicmathieu commented on May 28, 2024

@yawkat it's very problematic as I didn't succeed in reproducing the problem.

That's why I added as much information as I could; users don't seem to be using form/multipart that much, and the memory leak points to RequestLifecycle, so I'm not sure it is linked to form/multipart at all.

I can ask if I can share the dump if you want, but as a memory dump can contain sensitive data, I need to check first with the user and share it privately.

I can ask our users to provide more information but creating a reproducer seems to be very complex.


loicmathieu commented on May 28, 2024

Thanks @yawkat, we will test it; meanwhile I'll try my best to make a reproducer.


loicmathieu commented on May 28, 2024

@yawkat we cannot use micronaut.server.netty.server-type: full_content; it crashes all requests with:

2024-04-09 11:33:36,466 WARN  default-nioEventLoopGroup-1-3 io.netty.channel.ChannelInitializer Failed to initialize a channel. Closing: [id: 0x646fd7cb, L:/[0:0:0:0:0:0:0:1]:8080 - R:/[0:0:0:0:0:0:0:1]:48850]
java.lang.IllegalArgumentException: maxContentLength : -2147483648 (expected: >= 0)
	at io.netty.util.internal.ObjectUtil.checkPositiveOrZero(ObjectUtil.java:144)
	at io.netty.handler.codec.MessageAggregator.validateMaxContentLength(MessageAggregator.java:88)
	at io.netty.handler.codec.MessageAggregator.<init>(MessageAggregator.java:77)
	at io.netty.handler.codec.http.HttpObjectAggregator.<init>(HttpObjectAggregator.java:128)
	at io.micronaut.http.server.netty.HttpPipelineBuilder$StreamPipeline.insertMicronautHandlers(HttpPipelineBuilder.java:608)
	at io.micronaut.http.server.netty.HttpPipelineBuilder$StreamPipeline.insertHttp1DownstreamHandlers(HttpPipelineBuilder.java:638)
	at io.micronaut.http.server.netty.HttpPipelineBuilder$ConnectionPipeline.configureForHttp1(HttpPipelineBuilder.java:380)
	at io.micronaut.http.server.netty.HttpPipelineBuilder$ConnectionPipeline.initChannel(HttpPipelineBuilder.java:299)
	at io.micronaut.http.server.netty.NettyHttpServer$Listener.initChannel(NettyHttpServer.java:892)
	at io.netty.channel.ChannelInitializer.initChannel(ChannelInitializer.java:129)
	at io.netty.channel.ChannelInitializer.handlerAdded(ChannelInitializer.java:112)
	at io.netty.channel.AbstractChannelHandlerContext.callHandlerAdded(AbstractChannelHandlerContext.java:1130)
	at io.netty.channel.DefaultChannelPipeline.callHandlerAdded0(DefaultChannelPipeline.java:609)
	at io.netty.channel.DefaultChannelPipeline.access$100(DefaultChannelPipeline.java:46)
	at io.netty.channel.DefaultChannelPipeline$PendingHandlerAddedTask.execute(DefaultChannelPipeline.java:1463)
	at io.netty.channel.DefaultChannelPipeline.callHandlerAddedForAllHandlers(DefaultChannelPipeline.java:1115)
	at io.netty.channel.DefaultChannelPipeline.invokeHandlerAddedIfNeeded(DefaultChannelPipeline.java:650)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.register0(AbstractChannel.java:514)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.access$200(AbstractChannel.java:429)
	at io.netty.channel.AbstractChannel$AbstractUnsafe$1.run(AbstractChannel.java:486)
	at io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:173)
	at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:166)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:470)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:569)
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
	at java.base/java.lang.Thread.run(Thread.java:840)
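
(Note: -2147483648 is Integer.MIN_VALUE, which suggests the configured request size limit overflows the int maxContentLength that the pipeline hands to Netty's HttpObjectAggregator, as visible in the trace above.)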


loicmathieu commented on May 28, 2024

@katoquro to check if it's the same issue you can try the following command to see if the same objects are accumulating:

jmap -histo:live <pid> | grep io.micronaut.core.execution


loicmathieu commented on May 28, 2024

@yawkat with this configuration, files of more than 1GB lead to requests that seem to be "blocked forever" without an exception. So it's a workaround for some of our users but not a long-term solution.

Do you still need a reproducer (I'm working on one but still haven't managed to reproduce the issue)?


yawkat commented on May 28, 2024

yes, I still need a reproducer, either from you or from @katoquro.

full_content buffers the full request and bypasses most of the places that use DelayedExecutionFlow, but it's not recommended for permanent use.


katoquro commented on May 28, 2024

@loicmathieu
At first glance, not my case. The micro has been running for 5 hours:

4245:             1             24  io.micronaut.core.execution.ImperativeExecutionFlowImpl

I will look for a leak in another place 🤔
[heap histogram screenshot]


loicmathieu commented on May 28, 2024

@katoquro remove the grep and look at the most frequent objects in the histogram: jmap -histo:live <pid>. Take multiple histograms and check which objects grow in number; this can be an easy way to find a leak.
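
As an illustration (not something from the thread), here is a minimal Java sketch that diffs two saved histogram files and prints the classes whose instance counts grew the most; the class name HistoDiff, the file names, and the jmap output layout (rank, instances, bytes, class name) are assumptions:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.HashMap;
import java.util.Map;

// Diffs two `jmap -histo:live <pid>` snapshots and prints the classes whose
// instance counts grew, largest growth first. File names are placeholders.
public class HistoDiff {

    public static void main(String[] args) throws IOException {
        Map<String, Long> before = parse(Path.of(args[0])); // e.g. histo-1.txt
        Map<String, Long> after = parse(Path.of(args[1]));  // e.g. histo-2.txt

        after.entrySet().stream()
            .map(e -> Map.entry(e.getKey(), e.getValue() - before.getOrDefault(e.getKey(), 0L)))
            .filter(e -> e.getValue() > 0)
            .sorted((a, b) -> Long.compare(b.getValue(), a.getValue()))
            .limit(30)
            .forEach(e -> System.out.printf("%+10d  %s%n", e.getValue(), e.getKey()));
    }

    // Histogram data rows look like: "   1:       123456      7890123  com.example.Foo";
    // header, separator, and "Total" lines are skipped because they lack the "N:" rank.
    private static Map<String, Long> parse(Path file) throws IOException {
        Map<String, Long> counts = new HashMap<>();
        for (String line : Files.readAllLines(file)) {
            String[] parts = line.trim().split("\\s+");
            if (parts.length >= 4 && parts[0].endsWith(":")) {
                counts.put(parts[3], Long.parseLong(parts[1]));
            }
        }
        return counts;
    }
}

Capture the two histograms a few minutes apart, then run: java HistoDiff histo-1.txt histo-2.txt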

And if it's a different leak, better to open a new issue ;)


dstepanov commented on May 28, 2024

Can you analyze the memory dump and see what is being leaked? You can try https://eclipse.dev/mat/


katoquro commented on May 28, 2024

@dstepanov @loicmathieu
I think my case is really different. I have the following graph, where the green line is the total committed heap reported by Micrometer (sum of jvm_memory_committed_bytes) and the yellow line is the memory consumed by the Java process, taken from /proc/<pid>/stat:
[memory graph screenshot]

It's something outside the heap, non-heap, etc. 🤔
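
If it really is off-heap, one way to narrow it down (a suggestion, not something tried in the thread) is the JVM's native memory tracking: start the process with -XX:NativeMemoryTracking=summary and then compare snapshots with:

jcmd <pid> VM.native_memory summary

That breaks committed native memory down by category (heap, class metadata, threads, GC, internal, ...).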

