davehull / kansa
A PowerShell incident response framework
License: Apache License 2.0
In my testing, I was able to get this collector to return a match on kansa.ps1 and another .ps1 script in a different path, but I'm unable to get it to return a match on C:\Windows\System32\cmd.exe. I'm working on figuring out why.
Z4n4tsu pointed out that having the directives on the first (and second) lines breaks the .SYNOPSIS / help functionality in collector scripts.
I'm working on a change to Kansa.ps1 to read those directives from the .NOTES section. Directives will be the same as before: OUTPUT must be at the start of a line, followed by a space and then the type of output (CSV/TSV/XML/TXT/Default/BIN/Zip). PUSHBIN must also be at the start of a line, followed by a space and then the path to the executable to be copied to remote hosts. The ordering will no longer matter.
This will enable the .SYNOPSIS sections to work and will not be much additional overhead because it only has to be done once per module rather than once per host.
Marking this as a bug since it breaks .SYNOPSIS, thus it is my first priority for dev at the moment.
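The change above can be sketched as a small parser; the function name and plumbing here are illustrative, not Kansa's actual implementation. A directive is any line that starts with the directive name, a single space, and its value, wherever it appears in the module:

```powershell
# Sketch: read OUTPUT/PUSHBIN directives from anywhere in a collector
# script (e.g. its .NOTES section) instead of its first lines.
# Function name is illustrative, not Kansa's actual implementation.
function Get-Directive {
    param(
        [string]$ModuleText,  # full text of a collector script
        [string]$Name         # e.g. 'OUTPUT' or 'PUSHBIN'
    )
    foreach ($line in $ModuleText -split "`r?`n") {
        # Directive must start the line: NAME, one space, then its value.
        if ($line -match "^$Name (.+)$") {
            return $Matches[1].Trim()
        }
    }
}
```

Because each directive is matched independently, the ordering of OUTPUT and PUSHBIN no longer matters, and the comment-based help block keeps its .SYNOPSIS intact.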
Create a generic script that will pull Value, Data and LastWriteTime from arbitrary registry keys supplied by the user; maybe it has its own configuration file with a list of keys.
Make it return counts
How about a module that allows you to submit suspicious files to Cuckoo Sandbox via its REST API?
"curl -F file=@/path/to/file http://localhost:8090/tasks/create/file"
In addition to file submissions, hashes (MD5, ssdeep, SHA, etc.) from Autoruns could be queried against existing Cuckoo Sandbox submissions.
API documentation here: http://cuckoo.readthedocs.org/en/latest/usage/api/#tasks-create-file
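A minimal sketch of both directions, assuming Cuckoo's API server is listening on its default port 8090. The function names are mine, and the -Form upload requires PowerShell 6.1 or later:

```powershell
# Sketch of a Cuckoo Sandbox client. The base URL is an assumption
# (Cuckoo's api.py defaults to port 8090); adapt to your deployment.
function Submit-FileToCuckoo {
    param(
        [string]$Path,
        [string]$ApiBase = 'http://localhost:8090'
    )
    # Multipart upload, equivalent to:
    #   curl -F file=@$Path $ApiBase/tasks/create/file
    # -Form requires PowerShell 6.1+.
    Invoke-RestMethod -Uri "$ApiBase/tasks/create/file" -Method Post -Form @{ file = Get-Item $Path }
}

function Get-CuckooHashLookupUri {
    param(
        [string]$Md5,
        [string]$ApiBase = 'http://localhost:8090'
    )
    # Existing submissions can be queried by hash via /files/view/md5/<md5>.
    "$ApiBase/files/view/md5/$Md5"
}
```

A hash hit against /files/view would let an analysis script skip re-submitting files Cuckoo has already seen.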
I recall seeing something about a unit testing framework for PowerShell. Look into it and see if we can use it for some automated testing.
An analysis script that consults OSINT sources for domain reputation info
Anti-malware health status is collected, but there's currently no analysis of the data.
Add a -Target argument that takes a single hostname; that host becomes the sole target. This allows users to specify a single host at the command line.
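The target-resolution logic could look like the sketch below; parameter names are illustrative rather than Kansa's actual interface, with -Target taking precedence over a file of hostnames:

```powershell
# Sketch of target resolution: a single -Target wins over a target
# list file. Parameter names mirror the proposal, not Kansa's code.
function Get-Targets {
    param(
        [string]$Target,      # single host, if supplied
        [string]$TargetList   # path to a file of hostnames, one per line
    )
    if ($Target) {
        @($Target)            # -Target makes this host the sole target
    } elseif ($TargetList) {
        Get-Content $TargetList
    }
}
```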
Logparser analysis scripts were originally written with -rtp:nn, which causes the output to paginate. Go back and fix them all to use -rtp:-1 so output is not paginated at all.
I'm not sure what's going on, but on my dev machine this module has stopped working when run via Kansa against localhost. It works stand-alone outside of Kansa, and it works via Invoke-Command -ComputerName localhost -FilePath ... (and against remote hosts), but it won't work via Kansa against remote hosts. No errors are returned or written to the error.log.
Write an analysis script for looking up hashes from Autoruns and other collected data against Virus Total, maintain a local db of known good/bad so they don't have to be looked up.
Possible bug here for user accounts that have been renamed. The SID resolution to username may not match the name of the user profile directory on disk. Need to investigate.
Modify script to return PS Objects.
Also, locked ntuser.dat hives are locked because the user is logged on. Modify the script to parse the UserAssist data for those logged-on users; their hives should be accessible via HKEY_USERS... in theory.
List-Modules is not very robust. It needs to be able to handle all of the input that the -ModulePath argument can handle.
Lots of code duped in here. Refactor this thing with a function or something to get rid of the duplication.
One attribute the WMI process collector gathers is ExecutablePath. Create a query to return any executables with "temp" in their paths.
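A sketch of that check, factored so the filter can run against live WMI results or already-collected data (the function name is mine):

```powershell
# Filter process objects whose ExecutablePath contains 'temp'.
# -match is case-insensitive by default, so Temp/TEMP also hit.
function Select-TempPathProcess {
    param([object[]]$Processes)   # objects with an ExecutablePath property
    $Processes | Where-Object { $_.ExecutablePath -match 'temp' }
}

# On a live host this could feed from WMI:
#   Select-TempPathProcess (Get-WmiObject Win32_Process)
# or be pushed into the query itself as WQL:
#   Get-WmiObject -Query "SELECT Name, ExecutablePath FROM Win32_Process WHERE ExecutablePath LIKE '%temp%'"
```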
Kansa is littered with some try/catch blocks because I started writing it before I understood that those were only useful for terminating errors. All of them should be reviewed and refactored as necessary because some of them are probably positioned where they will never catch terminating errors.
Modify all existing modules that currently require editing to set parameters so they will accept arguments from the command line, then modify modules\modules.conf with acceptable default arguments.
Create an analysis script for collected process data that returns processes by ascending order of frequency.
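Least-frequency-of-occurrence stacking is a natural fit here; a minimal sketch (function name is illustrative):

```powershell
# Stack collected process data by name, rarest first: outliers
# usually live at the top of an ascending-frequency sort.
function Get-ProcessFrequency {
    param([object[]]$Processes)   # objects with a Name property
    $Processes | Group-Object Name |
        Sort-Object Count |       # ascending: least frequent first
        Select-Object Count, Name
}
```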
Create a module and/or analysis script that can generate leads for DLL Search Order hijacking.
I'd like to deprecate the -ModulePath argument's ability to point to an arbitrary directory in favor of a hard-coded .\Modules\ directory. I may modify the code such that -ModulePath can still be used to point to a single module, or make a new command line argument that can be used to point to a single collector module.
Change all analysis scripts from -q:on to -stats:off. -q:on is suppressing headers when I only meant to suppress stats.
It might be nice if modules could accept command line arguments via the modules.conf file. Today any modules that take arguments (Get-File.ps1, Get-FlsBodyFile.ps1, Get-ProcDump.ps1, etc.) have to be edited at the module level. If users could edit the modules.conf file, they could make all of their changes in one place, and modules could be written with sensible defaults or simply not run if they are missing necessary arguments.
Write a collector for Sysinternals Rammap.
Automated analysis depends on the DATADIR directive, which is currently the top line of each Analysis script. This breaks .SYNOPSIS.
Pushbin seems to be broken. Error log shows:
Failed to copy .\Modules\bin\Handle.exe to localhost.
Could not find a part of the path '\\localhost\ADMIN$.\Modules\bin\Handle.exe'.
Description says it aggregates on the first two octets of the IP; that's Get-NetstatForeign16sStack's function. The 24s stacker aggregates on the first three octets. Fix the comment in the description.
Create a collector for common browser cache and history
Let's make analysis automated in the same way that collection is. Collection is controlled by .\Modules\Modules.conf. Add functionality to Kansa.ps1 to read a configuration file for analysis that kicks off analysis scripts against the collected data and generates reports for the analyst to review.
Add an analysis script to return UserAssist keys sorted by date.
Capturing this for consideration -- Get-LogUserAssist could alert investigators of renamed accounts, but of course this requires some code.
The current Get-DNSCache module has two different ways of collecting its data depending on whether Get-DnsClientCache, the PowerShell cmdlet, is available. Get-DnsClientCache is a newer cmdlet, introduced in 2012. The output from ipconfig /displaydns does not mix well with the output from Get-DnsClientCache, so in mixed environments analysis is broken.
Either create a second collector for the newer cmdlet or normalize the data from both methods so they are the same and can be analyzed.
An analysis script that looks for entropy in dns cache entries.
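Shannon entropy over the cached names would surface algorithmically generated (DGA-style) domains, which score well above typical human-registered ones. A self-contained sketch; the scoring threshold would be a tuning decision, not anything fixed by Kansa:

```powershell
# Shannon entropy of a string in bits per character. High values
# on DNS cache entries are leads, not verdicts.
function Get-StringEntropy {
    param([string]$Text)
    $len = $Text.Length
    if ($len -eq 0) { return 0.0 }
    $entropy = 0.0
    $Text.ToCharArray() | Group-Object | ForEach-Object {
        $p = $_.Count / $len
        $entropy -= $p * [math]::Log($p, 2)
    }
    $entropy
}
```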
As @davehull found in #41, the Windows API transparently redirects calls from a 32-bit PowerShell instance running on 64-bit operating systems from C:\Windows\System32 to C:\Windows\SysWOW64. Normally this can be bypassed by pre-pending .\ to the path, but the API call used in Get-FilesByHash.ps1 does not support this construction.
It would be nice to have a cleanup option when using PushBin. This option would delete all binaries that were pushed once Kansa was finished running. For some of the more adept adversaries that are looking to maintain a presence, it's best to not leave tools lying about that they can tamper with or possibly discover how you're finding them and get sneakier. Some enterprise systems might also flag a system having Sysinternals tools on a non-admin workstation.
Also, thanks for putting together such a great suite of tools and scripts!
Make it return objects!
The default number of parallel jobs for PowerShell remoting is 32 hosts. Add a command line option that makes this adjustable; PowerShell's Invoke-Command -ThrottleLimit parameter controls this.
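The plumbing could look like the sketch below. The function name and the 1–1024 validation bound are my assumptions; the default of 32 matches PowerShell remoting's fan-out:

```powershell
# Sketch: surface Invoke-Command's -ThrottleLimit as a top-level
# parameter. Names and the validation bound are illustrative.
function Invoke-KansaRemoting {
    param(
        [string[]]$Targets,
        [ValidateRange(1, 1024)]
        [int]$ThrottleLimit = 32   # PowerShell remoting's default fan-out
    )
    # In Kansa.ps1 this would be threaded into the existing call, e.g.:
    # Invoke-Command -ComputerName $Targets -FilePath $Module -ThrottleLimit $ThrottleLimit
    $ThrottleLimit   # returned here only to make the plumbing visible
}
```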
Create an analysis script that sorts collected process data by process age.
Some modules call Get-Command in a conditional. If they find the command, they execute it; if they don't, they either execute something else or return a message that the command could not be found. Get-Command calls in these conditionals should have an ErrorAction of SilentlyContinue.
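The pattern in question, as a generic helper (the helper name is mine): probe for a cmdlet without leaking an error record, and fall back when it's absent.

```powershell
# Without -ErrorAction SilentlyContinue, a probe for a missing
# command writes a CommandNotFoundException error record even
# though the conditional handles that case.
function Invoke-IfAvailable {
    param(
        [string]$CommandName,
        [scriptblock]$Fallback
    )
    if (Get-Command $CommandName -ErrorAction SilentlyContinue) {
        & $CommandName
    } else {
        & $Fallback
    }
}

# e.g. Invoke-IfAvailable 'Get-DnsClientCache' { ipconfig /displaydns }
```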
Create a collector for directory listings of common temp folder locations.
Similar to the OUTPUT and PUSHBIN directives, a new directive would allow a module to specify a minimum PowerShell version required to run. Kansa would need to check the remote PowerShell version, throw a warning if a module cannot execute on a specific host, and then skip that module.
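The gate itself is a simple version comparison; this sketch assumes the remote version has already been fetched (e.g. $PSVersionTable.PSVersion via an initial Invoke-Command against each host), and the function name is illustrative:

```powershell
# Sketch of the per-host version gate for a minimum-version directive.
function Test-ModuleSupported {
    [CmdletBinding()]
    param(
        [version]$Required,   # from the module's directive, e.g. 3.0
        [version]$Remote      # the host's $PSVersionTable.PSVersion
    )
    if ($Remote -ge $Required) {
        $true
    } else {
        Write-Warning "Module requires PowerShell $Required; host has $Remote. Skipping."
        $false
    }
}
```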
Add a flag for running Kansa in an analysis-only mode against data that has already been collected. The user will configure the .\Analysis\Analysis.conf file according to their needs, then run Kansa with the -AnalysisOnly flag and a -DataDir argument, and Kansa will run through all the appropriate analysis scripts for the data found in the argument to -DataDir.
Today only Kansa.ps1's errors make it to the error log. Figure out a way for collector scripts to pass error messages back to Kansa.ps1 and have Kansa.ps1 write them to the error log.
Write a stacker for Get-LogUserAssist.ps1 data
This function is too long and doing too many things. Refactor it into multiple smaller functions with high cohesion.
Create a module to support acquisition of a single file.
Collectors that return PowerShell objects can specify how Kansa should save their data: text, XML, CSV or TSV (binary and zip are also supported, but not for objects). For CSV/TSV output, allow the collector author or user to specify a delimiter so they aren't limited to commas or tabs.
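ConvertTo-Csv and Export-Csv already accept a -Delimiter parameter, so Kansa mostly needs to expose it per module. A sketch with an illustrative function name:

```powershell
# Thread a caller-supplied delimiter through to ConvertTo-Csv,
# which already supports one; the plumbing here is illustrative.
function ConvertTo-DelimitedText {
    param(
        [object[]]$Data,
        [char]$Delimiter = ','   # author/user override, e.g. '|' or ';'
    )
    $Data | ConvertTo-Csv -NoTypeInformation -Delimiter $Delimiter
}
```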
Let's write a module that can create fuzzy hashes using SSDeep.
Need to rewrite this one so it selects Entry as opposed to FQDN.