Fleetdm Splunk logging thoughts - it is misleading to state that Fleet can send logs to Splunk. There is no functionality as of the latest release to do this. Sending logs to Firehose and then on to Splunk is a workaround, not a feature. What happens if you don't have AWS? A more credible solution is to write Fleet query logs to a file/directory that a Splunk forwarder can index. It would also be nice if Fleet could post logs to a Splunk HTTP Event Collector. That indeed could be called a Fleet Splunk logging feature.
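For illustration, a native HEC output could be as simple as posting each scheduled-query result wrapped in Splunk's HEC event envelope. A minimal sketch in Python — the `/services/collector/event` path and `Authorization: Splunk <token>` header follow Splunk's HEC docs, but the token, URL, and sourcetype here are made-up placeholders, not anything Fleet ships today:

```python
import json
import urllib.request

def build_hec_event(result: dict, sourcetype: str = "fleet:results") -> str:
    """Wrap one osquery result row in Splunk's HEC event envelope."""
    return json.dumps({"event": result, "sourcetype": sourcetype})

def send_to_hec(hec_url: str, token: str, result: dict) -> None:
    """POST a single event to a Splunk HEC endpoint (placeholder URL/token)."""
    req = urllib.request.Request(
        hec_url + "/services/collector/event",
        data=build_hec_event(result).encode(),
        headers={"Authorization": "Splunk " + token,
                 "Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # raises on a non-2xx response

# Example envelope for one scheduled-query result:
payload = build_hec_event({"name": "pack/global/osquery_info", "action": "snapshot"})
```

A real plugin would also want batching, retries, and TLS verification options, but the wire format itself is this simple.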
Yea, I write to a file and then Logstash picks it up and sends it to Elastic. I would like what you like, but for Elastic instead of Splunk :D
We do see users shipping logs to Splunk via both methods you described (Firehose and filesystem+splunk forwarder). We would be happy to review PRs including native logging plugins for Splunk and Elastic.
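For anyone following the filesystem route mentioned above, the Fleet server side is configured roughly like this (flag names per Fleet's logging configuration docs; the file path is just an example — double-check against your Fleet version):

```
# Fleet server environment: filesystem logging plugin for scheduled query results
FLEET_OSQUERY_RESULT_LOG_PLUGIN=filesystem
FLEET_FILESYSTEM_RESULT_LOG_FILE=/var/log/fleet/osqueryd.results.log
# A Splunk forwarder (or any log shipper) then monitors that file.
```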
I'd also like to see scheduled queries write their output to a file/socket that can be consumed by splunk/pushed into cribl.
I'm pretty sure that's already how it works with the filesystem plugin, or at least that's how I've been doing it. I think the OP was right with the suggestion that a HEC output would be best for this, more so for things like k8s; filesystem works well if it's just a static Docker host with a UF running on it.
Yes, the filesystem logging plugin will write scheduled query results to the filesystem on the fleet server where they can be picked up by any forwarder. @wkleinhenz what do you mean by HEC?
Sorry, HEC is Splunk's HTTP Event Collector. It's a slightly different format/input than Splunk TCP that allows logs to be submitted over HTTP. There are some general docs here https://docs.splunk.com/Documentation/Splunk/8.2.3/Data/UsetheHTTPEventCollector and some dev docs https://dev.splunk.com/enterprise/docs/devtools/httpeventcollector/ that'll do a better job explaining it.
For fleet running in kubernetes the filesystem plugin would work as it does elsewhere. Just need a sidecar container (splunk forwarder) or if you're using splunk connect for kubernetes (fluentd daemon set), configure it to ship fleet logs to your indexer cluster.
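For the sidecar approach, the forwarder side is just a monitor stanza in `inputs.conf`; a sketch (the path, sourcetype, and index are examples, not prescribed values):

```
[monitor:///var/log/fleet/osqueryd.results.log]
sourcetype = osquery:results
index = osquery
disabled = false
```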
I'm dumb, I thought if you define
for the logs, then you can have a heavy forwarder forward those to Splunk. Seemingly the functionality is already there, just not the HTTP Event Collector functionality?
Any Splunk forwarder, universal or heavy, can forward JSON files such as osquery results, but that's not Fleet functionality — it requires installing and configuring a Splunk forwarder. I know I'm parsing words here. Sending data to an HEC, which could run on an HF or indexer, takes some development to incorporate into Fleet. So why wouldn't one just install a forwarder to ship the query logs to a HF or indexer? Certainly possible, but sometimes you have constraints where that is not possible/feasible. This is where HEC comes in handy: it was built to ingest JSON efficiently, it's highly performant, and it's an elegant solution.
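To make the "built to ingest JSON efficiently" point concrete: HEC accepts multiple event envelopes stacked in a single POST body, so a shipper tailing Fleet's newline-delimited result log can batch many results into one request instead of one request per line. A rough sketch (the sourcetype is an arbitrary example):

```python
import json

def batch_hec_body(result_lines, sourcetype="fleet:results"):
    """Turn newline-delimited osquery result lines into one HEC POST body.

    Splunk's /services/collector/event endpoint accepts multiple event
    envelopes concatenated in a single request body.
    """
    envelopes = []
    for line in result_lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines in the tailed log file
        envelopes.append(json.dumps({"event": json.loads(line),
                                     "sourcetype": sourcetype}))
    return "\n".join(envelopes)

lines = ['{"name": "q1", "action": "added"}',
         '{"name": "q2", "action": "removed"}']
body = batch_hec_body(lines)
```

One POST of `body` to the HEC endpoint then lands two events in Splunk, which is exactly the efficiency argument being made here.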