Hello!

First, thanks for the nice project. I like it, especially the Docker version, and I'm using it in my Nude Crawler project.

Problem: when using the Docker image, it almost never releases memory and consumes more and more until it runs out of memory.

How to reproduce the problem:
# Start the container and check how much memory it uses (a tiny ~11MiB)
$ sudo docker run -d --rm -p 9191:9191 --name aid --memory=1G opendating/adult-image-detector
$ sudo docker stats --no-stream
CONTAINER ID NAME CPU % MEM USAGE / LIMIT MEM % NET I/O BLOCK I/O PIDS
6b22f14d33a3 aid 0.00% 11.27MiB / 1GiB 1.10% 5.11kB / 0B 11.6MB / 0B 6
# Analyse the first file
$ curl -s -i -X POST -F "image=@/tmp/eropicture.jpg" http://localhost:9191/api/v1/detect > /dev/null
$ sudo docker stats --no-stream
CONTAINER ID NAME CPU % MEM USAGE / LIMIT MEM % NET I/O BLOCK I/O PIDS
6b22f14d33a3 aid 0.00% 170MiB / 1GiB 16.60% 315kB / 1.9kB 35.9MB / 0B 10
# and two more
$ curl -s -i -X POST -F "image=@/tmp/eropicture.jpg" http://localhost:9191/api/v1/detect > /dev/null
$ curl -s -i -X POST -F "image=@/tmp/eropicture.jpg" http://localhost:9191/api/v1/detect > /dev/null
# Why does it need an extra ~50MiB after analysing just two more images? OK...
$ sudo docker stats --no-stream
CONTAINER ID NAME CPU % MEM USAGE / LIMIT MEM % NET I/O BLOCK I/O PIDS
6b22f14d33a3 aid 0.00% 221.7MiB / 1GiB 21.65% 932kB / 5.48kB 35.9MB / 0B 10
# Let's analyse 100 more images
$ for i in $(seq 1 100); do curl -s -i -X POST -F "image=@/tmp/eropicture.jpg" http://localhost:9191/api/v1/detect > /dev/null; done
# Now ~500MiB is used for something...
$ sudo docker stats --no-stream
CONTAINER ID NAME CPU % MEM USAGE / LIMIT MEM % NET I/O BLOCK I/O PIDS
6b22f14d33a3 aid 0.00% 545.4MiB / 1GiB 53.26% 31.7MB / 174kB 36MB / 614kB 11
# Now it uses nearly the whole available limit, sometimes 1022MiB, sometimes dropping back to 976MiB,
# but at around the 700-800th curl request the container crashes.
# If run in the foreground, no error messages are displayed; the last log lines are:
2023/03/17 18:34:21 For file 2023-03-17T18:34:21Z_fabede53-3817-47f8-a077-86d712d51602.jpg, openNsfwScore=0.834231
2023/03/17 18:34:21 For file 2023-03-17T18:34:21Z_fabede53-3817-47f8-a077-86d712d51602.jpg, anAlgorithmForNudityDetection=true
2023/03/17 18:34:21 Uploaded file eropicture.jpg, saved as 2023-03-17T18:34:21Z_3c3370b3-ede2-4847-ac04-9ba5957765b0.jpg
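For completeness, here is the same loop expanded into a small helper that also samples docker stats after each batch, so the growth is easy to chart (same container name aid and endpoint as above; the batch size is arbitrary):

#!/bin/sh
# Helper sketch: send requests in batches of 10 and record the container's
# memory usage after each batch. Assumes the container is running as "aid"
# and the test image is /tmp/eropicture.jpg, as in the steps above.
for batch in $(seq 1 10); do
    for i in $(seq 1 10); do
        curl -s -i -X POST -F "image=@/tmp/eropicture.jpg" \
            http://localhost:9191/api/v1/detect > /dev/null
    done
    printf 'after %3d requests: ' $((batch * 10))
    sudo docker stats --no-stream --format '{{.MemUsage}}' aid
done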
The test image size is 299K. All tests were done on a Docker image I pulled today.
If I set the RAM limit to 300MB, it reaches almost-full memory much sooner, at about the 20th-30th request, but keeps working at that usage until roughly the 100th request, then crashes. When I start it without a memory limit it runs much longer (I have 16GB RAM), but in the end the machine hits OOM anyway.

If I stop sending new requests, it keeps holding almost all of that memory for a long time (I waited a few hours of idle time and memory usage was nearly the same).
In theory, I could restart the Docker container after every N requests, but that looks ugly. And my script runs under an ordinary user account; I would not like to give it root access, even to analyze nude women.
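Just to illustrate what I mean by the ugly workaround, a rough sketch (N, the image directory, and the container name are placeholders; it assumes my user were in the docker group so no sudo is needed, and that the container was started without --rm so a restart does not remove it):

#!/bin/sh
# Sketch of the restart workaround: bounce the detector container after
# every N analysed images to drop the accumulated memory.
N=100
count=0
for img in /tmp/pictures/*.jpg; do
    curl -s -X POST -F "image=@$img" http://localhost:9191/api/v1/detect > /dev/null
    count=$((count + 1))
    if [ $((count % N)) -eq 0 ]; then
        docker restart aid      # drop all accumulated memory
        sleep 5                 # give the service time to come back up
    fi
done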
Maybe it's possible to tune the garbage collector or restart the process? Or maybe add an API call (which I would trigger after every N requests) that releases memory or restarts the program?
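On the garbage-collector idea: assuming the server is a Go binary (the log lines look like Go's standard logger), the runtime could perhaps be nudged through environment variables passed to docker run. This is only a guess at a mitigation, not a fix for the growth itself:

# Guessed mitigation, not a fix: make the Go GC more aggressive and set a soft
# heap limit via environment variables. GOGC is honoured by any Go version;
# GOMEMLIMIT only takes effect if the binary was built with Go 1.19+.
$ sudo docker run -d --rm -p 9191:9191 --name aid --memory=1G \
      -e GOGC=50 -e GOMEMLIMIT=750MiB \
      opendating/adult-image-detector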