Yes, idstools can help you here. There are 2 programs, idstools-u2json and idstools-u2eve. u2json is a pretty straightforward unified2-record-to-JSON converter. So an event is a discrete JSON object, the following event is discrete, as is the extra data. All the fields are preserved so you can correlate based on (event-id, event-second). So it might be a little more raw than what you are looking for.
u2eve is an attempt to output, as close as possible, Suricata compatible EVE records. So you get your event along with one packet, then any tagged packets would follow as their own "packet" records. They can be correlated with the original event as well.
I suggest you download and try them out on the command line to check the output of each.
Please see this example on how you might use it for continuous reading of your unified2 file:
http://idstools.readthedocs.io/en/latest/tools/u2json.html#example-continuous-conversion-to-json
I'm not actively running Snort anymore, but I originally made these for easily sending unified2 to Elastic Search, before UnifiedBeats existed.
https://blog.jasonish.org/2014/04/16/snort-logstash-elastic-search-and-kibana/
Let me know if you have any questions.
from py-idstools.
Thank you...I will give it a shot and report my findings.
Ok...so two things. 1. Packet data isn't human readable with u2json:
{"packet": {"packet-second": 1466603481, "linktype": 1, "sensor-id": 0, "packet-microsecond": 639942, "event-second": 1466603481, "length": 1506, "data": "BBVSBR4XACJBMxKyCABFAAXUHRRAADYGjSpKeYhlwKgBXwBQx3mSHxQ5NVoQalAQAO3cOwAA<snip>", "event-id": 62}}
piping that to ES, unless there's something I'm missing, won't be super useful as it doesn't contain readable packet data. 2. It appears u2eve isn't happy with the latest 2.9.9.0 Snort unified2 format:
Loaded 54694 rule message map entries.
Loaded 35 classifications.
Traceback (most recent call last):
File "bin/idstools-u2eve", line 12, in <module>
sys.exit(main())
File "/home/nobackup/build/py-idstools/idstools/scripts/u2eve.py", line 349, in main
writer.write(event)
File "/home/nobackup/build/py-idstools/idstools/scripts/u2eve.py", line 249, in write
encoded = json.dumps(self.formatter.filter(event))
File "/home/nobackup/build/py-idstools/idstools/scripts/u2eve.py", line 172, in filter
return self.format_event(event)
File "/home/nobackup/build/py-idstools/idstools/scripts/u2eve.py", line 144, in format_event
if event["packet"]:
KeyError: 'packet'
Thank you...still digging into this.
Packet is base64 encoded so no data is lost. If it were printable, some data would be lost. I'd be open to adding a new field with the printable output, but the existing field will remain base64 so enough data is retained to convert it back to pcap.
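For anyone wanting the raw bytes back, the base64 in the "data" field decodes directly; a minimal Python sketch with a made-up, shortened record (real records look like the u2json output above):

```python
import base64
import json

# Toy stand-in for a single u2json packet record (the real "data"
# field is much longer - see the example output earlier in the thread).
record = json.loads(
    '{"packet": {"linktype": 1, "length": 6, "data": "AAECA0FC", "event-id": 62}}'
)

# Decoding the base64 gives back the raw packet bytes, which could
# then be written out with a pcap header to rebuild a capture file.
raw = base64.b64decode(record["packet"]["data"])
print(len(raw))  # 6 bytes for this toy record
```

The same decode works on the large records u2json actually emits; nothing is lost in the round trip.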
Snort has slightly altered unified2 since appid without incrementing any version identifiers. I believe u2spewfoo works as long as you run the one built from the same version of Snort that created the logs. idstools needs to handle all versions, and sometimes guessing has to be used. So I'll have to dig into this specific error case. Likely in the next 2 days.
So ok...folks will want to modify these (for example, you may have a different tag for destination IP than "dst_ip", and items tagged as "junk" may be important to someone :)). That being said, here's the Logstash config that I'm using...a remote Logstash listening on a specific port:
input {
  tcp {
    type => "unified"
    port => 8514
  }
}
filter {
  if [type] == "unified" {
    grok {
match => [ "message", '%{SYSLOGTIMESTAMP:date} %{IPORHOST} unified {"type": "event", "event": {"protocol": %{INT:proto}, "classification": "%{DATA:ids_classification}", "dport-icode": %{INT:dst_port}, "sensor-id": %{INT:sensor-id}, "impact-flag": %{INT:junk}, "destination-ip": "%{IP:dst_ip}", "vlan-id": %{INT:junk}, "blocked": %{INT:junk}, "impact": %{INT:junk}, "source-ip": "%{IP:src_ip}", "priority": %{INT:ids_priority}, "msg": "%{DATA:ids_alert}", "event-second": %{INT:junk}, "pad2": %{INT:junk}, "classification-id": %{INT:junk}, "event-microsecond": %{INT:junk}, "signature-revision": %{INT:rev}, "signature-id": %{INT:sid}, "sport-itype": %{INT:junk}, "generator-id": %{INT:gid}, "mpls-label": %{INT:junk}, "event-id": %{INT:junk}}' ]
match => [ "message", '%{SYSLOGTIMESTAMP:date} %{IPORHOST} unified {"type": "packet", "packet": {"packet-second": %{INT:proto}, "linktype": %{INT:junk}, "data-hex": "%{DATA:hex}", "sensor-id": %{INT:sensor-id}, "packet-microsecond": %{INT:junk}, "event-second": %{INT:junk}, "length": %{INT:bytes}, "data-printable": "%{DATA:ascii}", "data": "%{DATA:raw}", "event-id": %{INT:junk}}' ]
    }
  }
  mutate {
    convert => [ "src_port", "integer" ]
    convert => [ "dst_port", "integer" ]
    gsub => [
      "ids_alert", "[ \-\(\)\/\:\=]", "_"
    ]
    remove_field => [ "junk" ]
  }
}
I'll need to have a gsub for proto to convert integer to protocol names like udp, tcp, etc... Command used:
bin/idstools-u2json -C /opt/etc/snort/classification.config -S /opt/etc/snort/sid-msg.map -G /opt/etc/snort/gen-msg.map --directory tests/ --prefix ipv6-alert --packet-printable --packet-hex --extra-printable --output ~/dev/testsnort.json
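As for the proto-to-name conversion mentioned above, Logstash's translate filter might be simpler than a chain of gsubs. A sketch, assuming the logstash-filter-translate plugin is installed (option names follow the plugin's classic field/destination style, so check your plugin version; the dictionary values are illustrative):

```
translate {
  field       => "proto"
  destination => "proto_name"
  dictionary  => {
    "1"  => "icmp"
    "6"  => "tcp"
    "17" => "udp"
  }
  fallback => "unknown"
}
```

This keeps the numeric field intact and adds a readable one alongside it.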
That file is being read by rsyslog; /etc/rsyslog.d/11-ids.conf:
module(load="imfile")
input(type="imfile"
      File="/home/dev/testsnort.json"
      Tag="unified")
if ($syslogtag == "unified") then {
    action(type="omfwd" Target="x.x.x.x.x" Port="8514" Protocol="tcp")
} else {
    *.*;local7.none @normalsyslogserverip
}
Anyway, awesome...screenshots included.
My last challenge will be somehow linking the packet data entry with the event data entry. If I do a search for, say, "192.168.1.1", I'll get the event entry, but the packet entry won't match. Maybe I can get Logstash to merge packet and event....hrmmm....I'll work on that.
That's why I like the "eve" format better. You at least get one packet, and then have enough info to pivot to a search for more. Kibana is probably not the best interface for that, but that's partly why I came up with my own.
Good deal...yeah, the idea is to have everything go to ELK...I'll see what I can muster.
@jasonish - you mention https://github.com/jasonish/evebox . Recently there was a question on http://seclists.org/snort/2017/q1/663 asking what people use to monitor snort events, but nothing other than legacy or private, closed-source tools came up.
What I would like to see is a way to alert on a higher level, based on some logic involving several events, like in several proposed DSLs for IDS. A human looking at the dashboards won't be able to answer certain questions, or notice certain patterns. There are academic papers published on that subject, but I did not find a modular tool built based on those ideas.
I was able to get alerting based on simple questions like "are there persistent attackers on the network, who access the system over an extended period of time using various or the same types of attacks" by sending u2json output into https://prometheus.io/ , but the query language of Prometheus is not flexible enough for complex, ad-hoc queries (https://groups.google.com/forum/#!msg/prometheus-users/ERmX4gB-tZA/wFscpmU1BAAJ)
Example configuration given at https://github.com/marcindulak/vagrant-snort-nfqueue-tutorial-centos7
Have you tried https://www.elastic.co/guide/en/kibana/current/timelion.html or looked into such higher level queries to be included in evebox?
FYI: snort is further adapting the format of unified2 for the changes in snort3, for example "buffer" instead of "packet" can now appear in the unified2 log http://seclists.org/snort/2017/q1/640
> @jasonish - you mention https://github.com/jasonish/evebox . Recently there was a question on http://seclists.org/snort/2017/q1/663 asking what people use to monitor snort events, but nothing other than legacy or private, closed-source tools came up.
> What I would like to see is a way to alert on a higher level, based on some logic involving several events, like in several proposed DSLs for IDS. A human looking at the dashboards won't be able to answer certain questions, or notice certain patterns. There are academic papers published on that subject, but I did not find a modular tool built based on those ideas.
Yes, I agree something like that would be nice. I also think the industry has gone overboard with their machine learning and AI solutions to do this, in fact to the point that users don't really know what it's doing, don't understand it and just don't like it. EveBox was my attempt to make it fast and easy to acknowledge all events, and to clear the meaningless ones away faster, and not much more. My work on a commercial/proprietary alert management solution for Snort showed this worked best when the customer had a dedicated team looking at events. But I'm sure organizations with dedicated teams like that are in the minority.
> I was able to get alerting based on simple questions like "are there persistent attackers on the network, who access the system over an extended period of time using various or the same types of attacks" by sending u2json output into https://prometheus.io/ , but the query language of Prometheus is not flexible enough for complex, ad-hoc queries (https://groups.google.com/forum/#!msg/prometheus-users/ERmX4gB-tZA/wFscpmU1BAAJ)
> Example configuration given at https://github.com/marcindulak/vagrant-snort-nfqueue-tutorial-centos7
Prometheus has been on my radar but I haven't had time to look into it; the fact that you found some use in getting it to answer those questions has me more curious.
> Have you tried https://www.elastic.co/guide/en/kibana/current/timelion.html or looked into such higher level queries to be included in evebox?
I installed Timelion as I already have Suricata and collectd logging to it. My plan was to use it to graph the more time series stuff like stats and metrics but I gave up pretty quickly when I had issues trying to do simple visualizations and switched to Grafana which just worked. I really haven't done much more with it than monitor stats for anything abnormal though.
The reports I've added to EveBox are my first attempt at higher level queries - the idea was to provide out of the box about 50-80% of the visualizations a user might go to Kibana for, and there will always be those who do need to go to Kibana.
One of the next things I'll be looking at is auto-archiving alerts based on filters the user provides. I think Elastic Search could then be leveraged to provide suggestions on others, but I have to admit I haven't played with related queries at all yet. That's just my initial step into providing some if-this-then-that type functionality to archive (and suppress from the inbox) alerts and perhaps generate a higher order event. All just wishful thinking at the moment though.
> FYI: snort is further adapting the format of unified2 for the changes in snort3, for example "buffer" instead of "packet" can now appear in the unified2 log http://seclists.org/snort/2017/q1/640
Cool. Thanks for the heads up. By the way, I have Golang (EveBox is Golang) Snort rule parsers and a dated unified2 parser, so EveBox could potentially consume unified2 directly without too much work.
Anyways, I'd be interested in talking about this more outside of this issue if you are.
So ok...I'm super close to getting Logstash to merge events into one large event. Here's what I'm seeing...there are three types of records that we get from u2:
{"type": "event", "event"}
{"type": "packet", "packet"}
{"type": "extra-data", "extra-data"}
In that order. Now...when I manually run a u2spewfoo I see:
(Event)
sensor id: 0 event id: 1235 event second: 1491499872 event microsecond: 741953
sig id: 2022493 gen id: 1 revision: 1 classification: 21
priority: 1 ip source: x.x.x.x ip destination: 166.78.177.250
src port: 42174 dest port: 80 protocol: 6 impact_flag: 0 blocked: 0
Packet
sensor id: 0 event id: 1235 event second: 1491499872
packet second: 1491499872 packet microsecond: 741953
linktype: 1 packet_length: 957
<snip>
[ 48] 91 7F 50 18 FF FF 2F 8D 00 00 47 45 54 20 2F 3F ..P.../...GET /?
[ 64] 6B 65 79 77 6F 72 64 3D 30 32 37 31 38 20 48 54 keyword=02718 HT
[ 80] 54 50 2F 31 2E 31 0D 0A 41 63 63 65 70 74 3A 20 TP/1.1..Accept:
[ 96] 74 65 78 74 2F 68 74 6D 6C 2C 20 61 70 70 6C 69 text/html, appli
[ 112] 63 61 74 69 6F 6E 2F 78 68 74 6D 6C 2B 78 6D 6C cation/xhtml+xml
[ 128] 2C 20 2A 2F 2A 0D 0A 52 65 66 65 72 65 72 3A 20 , */*..Referer:
[ 144] 68 74 74 70 3A 2F 2F 77 77 77 2E 63 61 62 6C 65 http://www.cable
[ 160] 73 74 6F 67 6F 2E 63 6F 6D 2F 0D 0A 41 63 63 65 stogo.com/..Acce
<snip>
[ 944] 39 31 32 39 63 36 30 31 30 0D 0A 0D 0A 9129c6010....
(ExtraDataHdr)
event type: 4 event length: 36
(ExtraData)
sensor id: 0 event id: 1235 event second: 1491499872
type: 1 datatype: 1 bloblength: 12 Original Client IP: x.x.x.x
(ExtraDataHdr)
event type: 4 event length: 47
(ExtraData)
sensor id: 0 event id: 1235 event second: 1491499872
type: 9 datatype: 1 bloblength: 23 HTTP URI: /?keyword=02718
(ExtraDataHdr)
event type: 4 event length: 53
(ExtraData)
sensor id: 0 event id: 1235 event second: 1491499872
type: 10 datatype: 1 bloblength: 29 HTTP Hostname: search.cablestogo.com
So per the logstash docs I can use either a "previous" or "next". I have two challenges:
- In what order does u2json convert the u2 records to JSON? My screenshot shows either a) ES just goes by timestamp, or b) u2json does not convert in order? I'm not sure...so that's one question :)
- The extra-data is another challenge. Again, the goal here is to have a single event that contains all the data in one shot so I can see event data with IPs and ports, linked with packet data. Thanks again.
> What order does u2json convert the u2 to json? My screenshot shows either a) ES just goes by timestamp, or b) u2json converts not in order? I'm not sure...so that's one question :)
u2json processes the records in the same order as they are in the unified2 file, so it's the same order Snort wrote them, which should be by timestamp.
As all of these have the same timestamp, it's going to be up to Elastic Search - but Logstash should at least see them in the same order.
Something to keep in mind - the packet following an event is almost always associated with the event just before. But subsequent packets may not be. In the case of tagging, a packet logged could be for an event that triggered some time ago.
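The (event-id, event-second) correlation can be sketched outside of Logstash too; a toy Python example with simplified stand-in records (these are not real u2json output, just the same one-JSON-object-per-line shape):

```python
import json
from collections import defaultdict

# Simplified stand-ins for u2json output lines, one JSON object per
# line, all belonging to the same event.
lines = [
    '{"type": "event", "event": {"event-id": 1235, "event-second": 1491499872, "signature-id": 2022493}}',
    '{"type": "packet", "packet": {"event-id": 1235, "event-second": 1491499872, "length": 957}}',
    '{"type": "extra-data", "extra-data": {"event-id": 1235, "event-second": 1491499872, "type": 9}}',
]

# Group every record under the (event-id, event-second) key of the
# event it belongs to; the record body lives under the "type" name.
grouped = defaultdict(list)
for line in lines:
    record = json.loads(line)
    body = record[record["type"]]
    key = (body["event-id"], body["event-second"])
    grouped[key].append(record["type"])

print(grouped[(1235, 1491499872)])  # ['event', 'packet', 'extra-data']
```

The tagging caveat above still applies: a late tagged packet would land in the group of the event that triggered it, even if it arrives much later in the file.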
The extra-data is another challenge. Again, the goal here is to have a single event that contains all the data in one shot so I can see event data with ip's and ports, link with packet data. Thanks again.
u2eve will do most of this for you. It bundles the Event record, the immediate packet record and any associated extra data into a single JSON object. Packets that are associated with the immediate previous event are logged as their own packet records. I've never looked at doing such grouping with Elastic Search.
Thanks Jason...I'm trying to reduce any overhead on the IDS box so that's why I'm trying to stick with u2json. Unless I'm not understanding something, the pipeline I'm thinking of with u2json is:
u2 file -> u2.json -> syslog up to ES
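The first hop of that pipeline, reading JSON lines as they are appended, is just a tail-follow loop. A generic Python sketch (not idstools-specific; the function name and options here are made up for illustration):

```python
import json
import time

def follow_json_lines(path, poll_interval=1.0, from_start=False):
    """Yield parsed JSON objects from a file as lines are appended, tail -f style."""
    with open(path) as fileobj:
        if not from_start:
            fileobj.seek(0, 2)  # jump to the end of the file, like tail -f
        while True:
            line = fileobj.readline()
            if not line:
                # No complete new line yet; wait and try again.
                time.sleep(poll_interval)
                continue
            yield json.loads(line)

# Each yielded record could then be forwarded over TCP syslog to Logstash.
```

In practice rsyslog's imfile does exactly this for you, which is what the config earlier in the thread relies on.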
How would u2eve work?
And thanks for the lightning fast answer :)
> How would u2eve work?
Pretty much the same. Each line of u2eve output is a single JSON object like u2json, but with the event/packet/extra-data already aggregated. Something to keep in mind is that the messages can be quite large, so TCP syslog would be ideal. Extra-data messages can already be very large, so it's probably something you already have to watch out for.
This might be of some help: https://www.balabit.com/blog/collecting-and-parsing-suricata-logs-using-syslog-ng/ - it's for Suricata, but that's the idea of u2eve: convert u2 into a Suricata-like event.
Basically your flow wouldn't change, your Logstash config just becomes simpler.
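One way it could become simpler: each u2eve line is already valid JSON, so after a grok that captures everything following the syslog header into a field, Logstash's json filter could do the rest of the parsing. A sketch ("json_payload" is a hypothetical field name, not something from the configs above):

```
filter {
  if [type] == "unified" {
    # "json_payload" would be captured by a preceding grok that strips
    # the "%{SYSLOGTIMESTAMP} %{IPORHOST} unified " prefix.
    json {
      source => "json_payload"
    }
  }
}
```

That replaces the per-record-shape grok patterns with a single generic parse.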
Ok...I'll give it a shot and see how it works in dev...good call on the TCP, thanks.
Ok...here's my logstash grok for u2eve. I'm dropping anything that doesn't have a src_ip:
if [type] == "unified" {
  if [message] !~ "src_ip" {
    drop { }
  } else {
    grok {
      #icmp
match => [ "message", '%{SYSLOGTIMESTAMP:date} %{IPORHOST} unified {"timestamp": "%{TIMESTAMP_ISO8601:event-timestamp}", "sensor_id": %{INT:sensor-id}, "event_id": %{INT:junk}, "event_second": %{INT:junk}, "event_type": "%{WORD:event_type}", "src_ip": "%{IP:src_ip}", "dest_ip": "%{IP:dst_ip}", "proto": "(?<proto>ICMP)", "icmp_type": %{INT:icmp_type}, "icmp_code": %{INT:icmp_code}, "flow_id": %{INT:flow_id}, "alert": {"action": "allowed", "gid": %{INT:gid}, "signature_id": %{INT:sid}, "rev": %{INT:rev}, "signature": "%{DATA:ids_alert}", "category": "%{DATA:ids_classification}", "severity": %{INT:severity}}, "packet": "%{DATA:packet_raw}", "packet_printable": "%{DATA:ascii}", "packet_info": {"linktype": %{INT:linktype}}' ]
#no src and dst port
match => [ "message", '%{SYSLOGTIMESTAMP:date} %{IPORHOST} unified {"timestamp": "%{TIMESTAMP_ISO8601:event-timestamp}", "sensor_id": %{INT:sensor-id}, "event_id": %{INT:junk}, "event_second": %{INT:junk}, "event_type": "%{WORD:event_type}", "src_ip": "%{IP:src_ip}", "dest_ip": "%{IP:dst_ip}", "proto": "%{WORD:proto}", "flow_id": %{INT:flow_id}, "alert": {"action": "allowed", "gid": %{INT:gid}, "signature_id": %{INT:sid}, "rev": %{INT:rev}, "signature": "%{DATA:ids_alert}", "category": "%{DATA:ids_classification}", "severity": %{INT:severity}}, "packet": "%{DATA:packet_raw}", "packet_printable": "%{DATA:ascii}", "packet_info": {"linktype": %{INT:linktype}}' ]
#u2eve
match => [ "message", '%{SYSLOGTIMESTAMP:date} %{IPORHOST} unified {"timestamp": "%{TIMESTAMP_ISO8601:event-timestamp}", "sensor_id": %{INT:sensor-id}, "event_id": %{INT:junk}, "event_second": %{INT:junk}, "event_type": "%{WORD:event_type}", "src_ip": "%{IP:src_ip}", "src_port": %{INT:src_port}, "dest_ip": "%{IP:dst_ip}", "dest_port": %{INT:dst_port}, "proto": "%{WORD:proto}", "flow_id": %{INT:flow_id}, "alert": {"action": "allowed", "gid": %{INT:gid}, "signature_id": %{INT:sid}, "rev": %{INT:rev}, "signature": "%{DATA:ids_alert}", "category": "%{DATA:ids_classification}", "severity": %{INT:severity}}, "packet": "%{DATA:packet_raw}", "packet_printable": "%{DATA:ascii}", "packet_info": {"linktype": %{INT:linktype}}' ]
#match null
match => [ "message", '%{SYSLOGTIMESTAMP:date} %{IPORHOST} unified {"timestamp": "%{TIMESTAMP_ISO8601:event-timestamp}", "sensor_id": %{INT:sensor-id}, "event_id": %{INT:junk}, "event_second": %{INT:junk}, "event_type": "%{WORD:event_type}", "src_ip": "%{IP:src_ip}", "src_port": %{INT:src_port}, "dest_ip": "%{IP:dst_ip}", "dest_port": %{INT:dst_port}, "proto": "%{WORD:proto}", "flow_id": %{INT:flow_id}, "alert": {"action": "allowed", "gid": %{INT:gid}, "signature_id": %{INT:sid}, "rev": %{INT:rev}, "signature": null, "category": "%{DATA:ids_classification}", "severity": %{INT:severity}}, "packet": "%{DATA:packet_raw}", "packet_printable": "%{DATA:ascii}", "packet_info": {"linktype": %{INT:linktype}}' ]
    }
  }
}
and the command I've been using...I dropped the hex since it was too much data:
sudo bin/idstools-u2eve -C /opt/etc/snort/classification.config -S /opt/etc/snort/sid-msg.map -G /opt/etc/snort/gen-msg.map --directory /opt/var/log/external --prefix external --packet-printable --output ~/dev/testsnort.json
I don't think I have any other questions or anything else I can think of with this...thanks so much for working on this Jason..I'll be putting this into production soon :)
If you haven't done so already, you could adapt this, https://github.com/StamusNetworks/SELKS/blob/master/staging/etc/logstash/conf.d/logstash.conf, to do geoip and user agent parsing as well.
Thanks...yeah, geoip is rocking so I'm good. I'll close this ticket out...thanks SO MUCH Jason for all the work on this!