This doesn't help with installing the dissect filter specifically, but the dissect filter is for parsing, and a better alternative for parsing is an Elasticsearch ingest node.
You can read more about ingest node parsing at:
https://www.elastic.co/guide/en/elasticsearch/reference/6.8/ingest.html
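If you go the ingest node route, a minimal pipeline using the built-in kv processor might look something like this (the pipeline name, source field, and split characters here are just examples you'd adjust for your logs):

PUT _ingest/pipeline/syslog-kv
{
  "description": "Parse key=value pairs out of the message field",
  "processors": [
    {
      "kv": {
        "field": "message",
        "field_split": " ",
        "value_split": "="
      }
    }
  ]
}

You'd then reference that pipeline name in your indexing request or index settings so documents get parsed on ingest.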
You'll likely want to use the kv filter plugin as opposed to the grok filter plugin, as kv can pretty much auto-parse key=value pairs. You could certainly still use grok, but if the format of the log varies, it makes it harder to manage.
https://www.elastic.co/guide/en/logstash/current/plugins-filters-kv.html
Also, you'll want to wrap the filter in a conditional on a field that is already being extracted via the native forwarding from syslog-ng:
filter {
  if "something" in [syslog-host_from] {
    kv {}
  }
}
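If the key=value pairs live in a particular field, you can point kv at it explicitly. This is a sketch, assuming your pairs are in the message field and you want the parsed keys nested under a target field (both names are just examples):

filter {
  if "something" in [syslog-host_from] {
    kv {
      source => "message"
      target => "kv"
    }
  }
}

Nesting under a target keeps the parsed keys from colliding with existing top-level fields.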
Thanks,
Wes
No. Just go into the filebeat.yml file and change the output to logstash.
https://www.elastic.co/guide/en/beats/filebeat/7.8/configuring-howto-filebeat.html
Also make sure you are using the correct Beats version (7.8, I think).
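For reference, the relevant change in filebeat.yml is roughly this (the host and port are examples for your environment; comment out the elasticsearch output so only one output is active):

# Comment out the default Elasticsearch output:
#output.elasticsearch:
#  hosts: ["localhost:9200"]

# Send to Logstash instead:
output.logstash:
  hosts: ["your-logstash-host:5044"]

Filebeat will refuse to start if more than one output is enabled at once, so disabling the elasticsearch section is required, not optional.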
I'm able to reach Kibana on 80/443, that's fine. I'm specifically referencing part of the packetbeat setup as explained on Elastic's site.
https://www.elastic.co/guide/en/beats/packetbeat/current/view-kibana-dashboards.html
I would still love to know why I can't open Kibana to other interfaces on my SO master server.
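For anyone following along, the dashboard-loading step that page describes boils down to a single command (the Kibana URL here is a placeholder you'd replace with your own):

packetbeat setup --dashboards -E setup.kibana.host=https://your-kibana-host:443

That pushes Packetbeat's prebuilt dashboards into Kibana, which is why it needs to reach Kibana directly rather than just Elasticsearch.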
Yep, verified the ISO. I burned to USB using Ventoy (https://www.ventoy.net/en/index.html), which is an awesome tool, and is how I installed Security Onion 16.04.1. I don't mind trying with Balena Etcher though.
I'll try another USB port too. There are two different USB controllers on this machine. I'll keep digging. Thanks for your help.
You'll need to do something like the following since so-email doesn't support sending to external domains OOB:
Having spent hours downloading the OVA file, just to install it on my server, allocate the 8 procs and 32 GB of RAM, and watch it still run like total crap: save yourself the time and trouble, it isn't worth it. If you are wanting something for just HIDS-level detection, Wazuh has its own server platform for its agents: https://wazuh.com/. But I like SO because I can allocate sniffer ports and set up a SPAN/port mirror on a firewall or switch to dump the traffic to. This way it checks all of my traffic without interfering with it.
Wazuh engineer here! Ping us if you have any doubts or issues with the product.
We also have Slack for the community, https://wazuh.com/community/join-us-on-slack/.
And thanks for the post Security Onion.
When you switch to Elastic Features, you can remain on the BASIC license for free forever. For more information, please see:
Not out of the box, but there are several options to add it:
- Qosient Argus was included until it was removed in Security Onion 16.04.4.1. However, it can still be installed to provide flow capture and aggregation.
- SiLK was never included in SO and requires somewhat more configuration. However, CMU provides very thorough documentation which should guide setup.
- Logstash includes a netflow module, which one could use to add flow into the existing ELK setup. https://www.elastic.co/guide/en/logstash/current/netflow-module.html
- nfsen and other similar tools are also available as Docker containers (google: docker netflow).
All of these will require some setup, and will have an impact on your platform depending on how much data you're trying to aggregate. Of the list, Argus may be the least intrusive, if you're willing to do all your queries on the CLI.
nfsen is a good "dedicated system" with a web GUI. FlowBAT is another GUI option that can be added to SiLK.
See also the discussion here: https://www.reddit.com/r/networking/comments/4dxv6a/netflow_collectors_what_do_you_use_and_why/
Finally, what is the use case? For directly captured traffic, Bro already produces data very similar to netflow, so this is best pursued when you have several remote routers feeding data. A production Onion setup also has separation of duties between different node types, based on data capture, processing, and cluster management and query... so if there is a significant volume of flow data, it's worth setting it up on a dedicated host, using one of the systems mentioned above or in the r/networking post.
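If you do try the Logstash netflow module, enabling it is roughly a one-liner (the UDP port is an example; point your exporters at whatever port you choose):

bin/logstash --modules netflow --setup -M netflow.var.input.udp.port=2055

The --setup flag also loads the module's index template and Kibana dashboards on first run, so you get visualizations without building them by hand.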