#general

Doron Gaznavi

12/28/2021, 2:47 PM
Hi, I'm trying to scan my hard disk using on-demand YARA scanning in Python. When I start the scan from the root, the program consumes a lot of memory and fails after about half a minute, once the osqueryd service's memory usage reaches 500–700 MB. The exact error is: "osqueryd worker (1565658) stopping: Memory limits exceeded: 1493492000", and a log is written. On the other hand, when I scan each directory under the root iteratively (var, home, etc.), the osqueryd service's memory usage increases slowly and steadily as the program runs, ending up at ~1.3 GB by the end of the scan (with no failure). Do you know how to solve this behavior?
2:48 PM
One last thing to note: when my program finishes, the osqueryd service doesn't release its memory. I have set the flags --watchdog_memory_limit=350 --watchdog_utilization_limit=200
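For reference, those watchdog limits are usually placed in osquery's flags file rather than passed on the command line each time; a minimal sketch (the file path is a typical Linux location, an assumption that may differ per install):

```
# /etc/osquery/osquery.flags  (path is an assumption; adjust for your install)
--watchdog_memory_limit=350       # worker soft memory limit, in MB
--watchdog_utilization_limit=200  # worker CPU utilization limit
```

The "Memory limits exceeded" message in the error above is the watchdog killing the worker once it crosses this limit.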
2:51 PM
Attaching a picture of the osquery monitoring at the time of the failure and the log written after the failure.

seph

12/28/2021, 10:04 PM
I don't know that I can debug this, but at some level I would expect that passing the entire disk through YARA would consume a large amount of memory (and likely fail).

Doron Gaznavi

12/29/2021, 11:54 AM
Thanks. I scanned the filesystem recursively and, for each directory, passed only its direct files through YARA. That way the daemon used about 500 MB of memory and the whole scan took 5 minutes instead of 44.
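The per-directory approach described above can be sketched roughly as follows. This is an illustration, not the poster's actual code: it walks the tree, builds one query against osquery's yara table per directory (the `path`, `matches`, and `sigfile` columns come from that table's schema), and shells out to `osqueryi` once per batch. The `scan` helper and its invocation style are assumptions.

```python
import os
import subprocess

def directory_batches(root):
    """Yield (directory, [direct files]) pairs, one batch per directory,
    instead of feeding the whole tree to YARA in a single query."""
    for dirpath, _dirnames, filenames in os.walk(root):
        files = [os.path.join(dirpath, f) for f in filenames]
        if files:
            yield dirpath, files

def yara_query(paths, sigfile):
    """Build an on-demand yara table query for one batch of files."""
    quoted = ",".join("'{}'".format(p.replace("'", "''")) for p in paths)
    return ("SELECT path, matches FROM yara "
            "WHERE path IN ({}) AND sigfile = '{}';".format(quoted, sigfile))

def scan(root, sigfile):
    """Run one osqueryi invocation per directory, so the worker's memory
    is bounded by the largest single batch rather than the whole disk."""
    for _dirpath, files in directory_batches(root):
        sql = yara_query(files, sigfile)
        subprocess.run(["osqueryi", "--json", sql],
                       capture_output=True, check=False)
```

Keeping each query to one directory's direct files is what produces the bounded ~500 MB working set mentioned above, since the worker never has to hold results for the entire filesystem at once.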

seph

12/29/2021, 1:34 PM
That kinda makes sense -- it has the effect of chunking the work into reasonable batches
1:35 PM
I don't know if that's something we could do under the hood.
1:35 PM
I guess... It's also worth noting that the file table has had trouble recursing the entire filesystem before. It's not really made for that.