
Memory usage going way over the wrapper limit

Posted: 10 Feb 2025 18:58
by explodingfire
After running the router for a while I realized it was using so much memory that some of it was pushed to swap, causing huge slowdowns on the Raspberry Pi (Debian 12).
I was running v2.7; nothing changed on 2.8, nor with a different Java version (17 and 23).
I then disabled swap to see what would happen, and was not disappointed: with the memory cap (in wrapper.config) set to 500 MB, it only takes about an hour before the process gets a SIGKILL from the system (900 MB of total RAM, with less than 200 MB used outside of I2P).
At first glance it looks like a memory leak: the process takes more and more RAM without noticing it (no out-of-memory errors in the log, and the graphs never show usage above the limit). With a lower cap it just takes longer before the eventual crash.
Maybe I'm missing something, or it's my fault, but it looks like an important issue.

wrapper.config has no changes other than the java.maxmemory setting
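For context, this is the standard Tanuki wrapper property, with the value given in megabytes; a 500 MB cap as described above would look something like this (a minimal sketch, not a full wrapper.config):

Code: Select all

# Maximum Java heap size, in MB. This caps only the JVM heap,
# not the total memory the process uses.
wrapper.java.maxmemory=500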

The logs don't really show anything (because the log level is WARN).

Code: Select all

/var/log/i2p/wrapped.log
14:09:20 | Launching a JVM...
14:09:21 | WrapperManager: Initializing...
14:09:24 | Starting I2P 2.8.0-0-3~ubuntu1
14:09:24 | WARN: There may be another router already running. Waiting a while to be sure...
14:09:39 | WARN: Old router was not shut down gracefully, deleting /run/i2p/router.ping
14:09:42 | INFO: No, there wasn't another router already running. Proceeding with startup.
14:09:43 | INFO: Locally optimized native BigInteger library loaded from file
14:57:15 | JVM received a signal SIGKILL (9).
14:57:15 | JVM process is gone.
14:57:15 | JVM process exited with a code of 1, setting the Wrapper exit code to 1.
14:57:15 | JVM exited unexpectedly.
14:57:32 | Launching a JVM...
14:57:33 | WrapperManager: Initializing...
14:57:36 | Starting I2P 2.8.0-0-3~ubuntu1
14:57:36 | WARN: Old router was not shut down gracefully, deleting /run/i2p/router.ping
14:57:39 | INFO: Locally optimized native BigInteger library loaded from file
15:42:33 | JVM received a signal SIGKILL (9).
15:42:34 | JVM process is gone.
15:42:34 | JVM process exited with a code of 1, setting the Wrapper exit code to 1.
15:42:34 | JVM exited unexpectedly.
15:42:51 | Launching a JVM...
15:42:52 | WrapperManager: Initializing...
15:42:54 | Starting I2P 2.8.0-0-3~ubuntu1
15:42:55 | WARN: Old router was not shut down gracefully, deleting /run/i2p/router.ping
15:42:58 | INFO: Locally optimized native BigInteger library loaded from file
16:28:46 | JVM received a signal SIGKILL (9).
16:28:46 | JVM process is gone.
16:28:46 | JVM process exited with a code of 1, setting the Wrapper exit code to 1.
16:28:46 | JVM exited unexpectedly.
With maxmemory = 400 MB (here the process is taking more than 600 MB):

Code: Select all

$ cat /proc/32482/status
Name:   java
Umask:  0022
State:  S (sleeping)
Tgid:   32482
Ngid:   0
Pid:    32482
PPid:   32468
TracerPid:      0
Uid:    106     106     106     106
Gid:    112     112     112     112
FDSize: 2048
Groups: 112 
NStgid: 32482
NSpid:  32482
NSpgid: 32467
NSsid:  32467
Kthread:        0
VmPeak:  4315488 kB
VmSize:  4272648 kB
VmLck:         0 kB
VmPin:         0 kB
VmHWM:    629416 kB
VmRSS:    628904 kB
RssAnon:          616788 kB
RssFile:           12116 kB
RssShmem:              0 kB
VmData:   888440 kB
VmStk:       132 kB
VmExe:         4 kB
VmLib:     24836 kB
VmPTE:      2044 kB
VmSwap:        0 kB

Re: Memory usage going way over the wrapper limit

Posted: 15 Feb 2025 18:07
by zzz
Not your fault but not ours either.

The Java virtual machine's memory limit applies only to its "heap", the internally allocated memory area. It does not include OS overhead, other types of memory usage, or the total memory reported by the OS.

As shown in your graph, the JVM is correctly limiting the heap size to the configuration.

If it's too much and is crashing your JVM, then reduce the limits.
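The gap between heap and process memory can be seen from inside the JVM itself: the standard java.lang.management API reports heap and non-heap usage separately, and the non-heap portion (metaspace, code cache, etc.), plus thread stacks and native buffers, is what pushes the process RSS above the configured heap cap. A minimal sketch:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

public class HeapVsProcess {
    public static void main(String[] args) {
        MemoryMXBean mem = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = mem.getHeapMemoryUsage();
        MemoryUsage nonHeap = mem.getNonHeapMemoryUsage();
        // wrapper.java.maxmemory (-Xmx) caps only the heap max below;
        // the non-heap figure comes on top of it, and neither includes
        // thread stacks or native allocations counted in VmRSS.
        System.out.printf("heap max:      %d MB%n", heap.getMax() / (1024 * 1024));
        System.out.printf("heap used:     %d MB%n", heap.getUsed() / (1024 * 1024));
        System.out.printf("non-heap used: %d MB%n", nonHeap.getUsed() / (1024 * 1024));
    }
}
```

Comparing this output against VmRSS in /proc/PID/status shows how much of the resident set lies outside the heap.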

Re: Memory usage going way over the wrapper limit - Solved

Posted: 16 Feb 2025 22:36
by explodingfire
Thanks for the help!

I knew the JVM would use more RAM than the heap, but here the effect is definitely noticeable (520 MB used with a 300 MB heap).
I didn't choose the best screenshot in my previous post to illustrate what I wanted to show: sometimes the GC kicks in without any crash, showing there is enough memory available, but over time the usage keeps going up and eventually the crash happens.

After some digging I was able to determine my maximum usable RAM and where it is spent; it also seems preferable to set Xms equal to Xmx, considering it's literally a server.
It's interesting to see how far the actual memory used is from what the JVM reports, especially at certain points:
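Setting Xms equal to Xmx maps to the wrapper's initmemory property (also in MB, assuming the standard Tanuki property names); with a 300 MB heap that would look like:

Code: Select all

# Pre-allocate the full heap at startup (-Xms == -Xmx), so the
# heap size stays stable instead of growing on demand.
wrapper.java.initmemory=300
wrapper.java.maxmemory=300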