Monday, June 25, 2012

GUI Risk Stats

Code:
https://github.com/JasonMOliver/Java_Parsers/blob/master/XMLVulnStatsTab.java

-----

I have been playing with gnuplot for about a day and I have to say it's a lot of fun, if occasionally complex. So far I have been able to generate some decent pictures of data for reports and such, but I hope in the future this idea will get far more complex.

As of now I have attached a beta stats file for converting .nessus files into tab-delimited summary data for parsing with gnuplot.

I started out with the following two graphs, which I thought I would share; they both show vuln data for the top 20 hosts, ranked by overall CVSS score.

To get the tool output into a sorted, parsable format, I use the following commands:

java XMLVulnStatsTab TestTab.out *.nessus

head -n 1 TestTab.out | awk -F '\t' '{print $1"\t"$3"\t"$4"\t"$5"\t"$6}' > TestDataWithTabs20Vuln.dat

awk -F '\t' '{print $2"\t"$1"\t"$3"\t"$4"\t"$5"\t"$6}' TestTab.out | sort -g -r | head -n 20 \
  | awk -F '\t' '{print $2"\t"$3"\t"$4"\t"$5"\t"$6}' >> TestDataWithTabs20Vuln.dat
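The trick in that pipeline is worth calling out: it moves the numeric column to the front so a plain sort -g -r can rank the rows, then swaps the columns back. A minimal sketch on toy data (the two-column layout here is made up for illustration, not the real XMLVulnStatsTab output):

```shell
#!/bin/sh
# Toy data in a hypothetical Host<TAB>Score layout (NOT the real
# XMLVulnStatsTab columns), just to illustrate the swap/sort/swap trick.
printf 'hostA\t42.5\nhostB\t99.1\nhostC\t10.0\n' > demo.tsv

# Move the numeric column to the front, numeric-sort descending,
# keep the top 2, then restore the original column order.
awk -F '\t' '{print $2"\t"$1}' demo.tsv | sort -g -r | head -n 2 \
  | awk -F '\t' '{print $2"\t"$1}' > demo_top2.tsv

cat demo_top2.tsv
```

The same pattern scales to any number of columns; only the two awk reorderings change.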

At this point you should have the top 20 data to play with.

A side note: at this point you can get gnuplot for the Mac from MacPorts:

port install gnuplot

After you have gnuplot set up and running, use the following command sets for the reports.

#Top 20 Cluster Chart

set style data histogram
set style histogram cluster gap 1
set xtics rotate
set style fill solid border rgb "black"
set auto x
set yrange [0:*]
plot "TestDataWithTabs20Vuln.dat" using 3:xticlabels(1) title col lc rgb "purple", \
     "TestDataWithTabs20Vuln.dat" using 4:xticlabels(1) title col lc rgb "red", \
     "TestDataWithTabs20Vuln.dat" using 5:xticlabels(1) title col lc rgb "yellow", \
     "TestDataWithTabs20Vuln.dat" using 6:xticlabels(1) title col lc rgb "green"



or

#Top 20 Row Chart with CVSS Total Score  
set style data histogram 
set style histogram rows gap 1 
set xtics rotate 
set style fill solid border rgb "black" 
set auto x  
set yrange [0:*] 
plot "TestDataWithTabs20Vuln.dat" using 2:xticlabels(1) title col with linespoints pointtype 5, \
     "TestDataWithTabs20Vuln.dat" using 6:xticlabels(1) title col lc rgb "green", \
     "TestDataWithTabs20Vuln.dat" using 5:xticlabels(1) title col lc rgb "yellow", \
     "TestDataWithTabs20Vuln.dat" using 4:xticlabels(1) title col lc rgb "red", \
     "TestDataWithTabs20Vuln.dat" using 3:xticlabels(1) title col lc rgb "purple"
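Either way, to drop a chart into a report you will want it as an image file rather than the interactive window. A sketch, assuming your gnuplot build has the pngcairo terminal (plain png works as a fallback):

```gnuplot
# Render to a file instead of the interactive window
set terminal pngcairo size 1200,600
set output "Top20Cluster.png"
# ...then run one of the plot commands above...
set output
```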



Anyway, I expect this to get more complex as I get used to the tool, but for one day of playing (and it's been a lot of fun) this seems like a handy way to put a nice spin on my report data and add some color.

If you build any interesting reports with this data - please add them to the thread

cheers

JSN

Geektools - Mac

Link:
http://www.macosxtips.co.uk/geeklets/system/scurity-log-parse-aka-attack-tripwire/

-----

This is something I have been playing with on my Mac hosts to keep an eye on the event logs that no one reads. You will need to install GeekTool from the App Store (it's free):

http://itunes.apple.com/us/app/geektool/id456877552?mt=12

----

Security Log Parse (aka Attack Tripwire)

Just add the following command as a shell Geeklet:

echo "Who is online:" ; who ; echo ''; echo 'Active Screen Sessions:'; screen -wls | awk -F 'in' '{print $1}'; echo ''; echo 'Failed Authentication:' ; grep 'Failed to authenticate user' /var/log/secure.log| awk -F ':' '{print $1":"$2""$4}' | awk -F '(' '{print $1}' | sort | uniq -c; grep 'authentication error' /var/log/secure.log| awk -F ':' '{print $1":"$2$6}' | sed 's/authentication error for //g' | sort | uniq -c

or if you're getting a lot, you can trim it to only alerts from the current month:

echo "Who is online:" ; who ; echo ''; echo 'Active Screen Sessions:'; screen -wls | awk -F 'in' '{print $1}'; echo ''; echo 'Failed Authentication:'; i=$(date +"%b"); grep 'Failed to authenticate user' /var/log/secure.log| awk -F ':' '{print $1":"$2""$4}' | awk -F '(' '{print $1}' | sort | uniq -c | grep $i; grep 'authentication error' /var/log/secure.log| awk -F ':' '{print $1":"$2$6}' | sed 's/authentication error for //g' | sort | uniq -c | grep $i
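The month trimming works because syslog-style lines start with the month abbreviation, so grepping for date +"%b" keeps only current-month entries. A small sketch on made-up log lines (the month is hard-coded here so the demo is deterministic; the Geeklet uses date):

```shell
#!/bin/sh
# Synthetic secure.log-style lines (made up for illustration)
printf 'Jun 25 10:00:01 mac Failed to authenticate user bob\nMay  2 09:12:44 mac Failed to authenticate user eve\n' > demo_secure.log

# The Geeklet does i=$(date +"%b"); hard-coded to "Jun" for the demo
i="Jun"
grep 'Failed to authenticate user' demo_secure.log | grep "$i" > demo_hits.log
cat demo_hits.log
```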

This little script is good in cafes, offices, etc., to see if someone is trying to log into your computer and what address they source from. It also lists the active sessions on your machine.

*Note: The awk formatting may need to be adjusted if you're not using Lion

Anyway just a fun little idea I had when I was sitting around today - enjoy and cheers

JSN

Risk Stats V3

Code:
https://github.com/JasonMOliver/Java_Parsers/blob/master/XMLVulnStatsV3.java

-----

OK - I had an interesting idea to add some summary data to this report, so it ended up as a rev 3.

Now when you run XMLVulnStatsV3 you will get two extra tables at the bottom, along with the risk chart. One has summary data on how many unique hosts were scanned and how many failed authentication. The items that failed auth are now also highlighted pink in the report.

Additionally, you get a list of the hosts (with OS) that failed authentication. Note that this process supports rescans in the set, so if you scan a machine 4 times and one scan authenticated, it will not show up in the final list of hosts with failed auth.

Additional data on the core source is below.

---


The first version of the script, XMLVulnStats.java, will work from a single .nessus file or multiple .nessus files and give you the following summary data. This script requires Excel to do some of the front-end math; due to the use of Excel, the impact levels can be modified after the fact for more accurate results.

The command-line works as follows:

java XMLVulnStatsV3 Output.xls *.nessus

The output will be a table with the following columns:

  •  IP Address
  •  Total CVSS Count - This totals the CVSS score for all Vulns on the Host
  •  Critical Count
  •  High Count
  •  Medium Count
  •  Low Count
  •  None Count
  •  Host Criticality - Adjustable figure between 100-1000 ranking hosts
  •  Risk Score - Total CVSS * Host Criticality
  •  Total Vuln - Total of Critical, High, Med, Low Vulns
  •  Average CVSS
  •  Scan Depth

Additionally you will get an Average System Risk Level calculation based on the averages for all hosts.

Note that you will need to set the Host Criticality for your system after the script is run, based on system knowledge. In the Federal / NIST space I have been using a spread based on the FIPS 199 level (i.e., for a moderate system, hosts are ranked between 400-600 based on impact: workstations 400, domain controllers 600, etc.).
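As a sanity check on the spreadsheet math: the Risk Score column is just Total CVSS multiplied by the Host Criticality you assigned, so it can be reproduced outside Excel with a one-line awk. The column layout below is hypothetical, purely for illustration:

```shell
#!/bin/sh
# Toy table in a hypothetical Host / TotalCVSS / HostCriticality layout
printf '10.0.0.1\t42.5\t400\n10.0.0.2\t99.1\t600\n' > demo_risk.tsv

# Risk Score = Total CVSS * Host Criticality
awk -F '\t' '{printf "%s\t%.1f\n", $1, $2 * $3}' demo_risk.tsv > demo_scores.tsv
cat demo_scores.tsv
```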

Hope you all are having fun with the data - any ideas send them my way.

cheers

JSN

Download All Nessus Reports at Command-line

So I have a lot of Nessus scan files and have been looking for a quick way to download all of the reports in Nessus V2 format for processing. I found this to be the simplest way - and if you put all 3 lines into a shell script, it's even simpler.

token="$(/opt/local/bin/wget --no-check-certificate --post-data 'login=userID&password=password' https://127.0.0.1:8834/login -O - | grep '<token>' | sed 's/<contents><token>//g' | sed 's/<\/token><user>//g')"

/opt/local/bin/wget --post-data "token=$token" --no-check-certificate https://127.0.0.1:8834/report/list -O - | grep 'name' | sed 's/<name>//g' | sed 's/<\/name>//g' > reports

for i in $(cat reports); do /opt/local/bin/wget --post-data "token=$token&report=$i" --no-check-certificate https://127.0.0.1:8834/file/report/download -O - > $i.nessus; done;


You will need to swap out userID and password for your local Nessus user ID and password - but there you go: a few lines and you have all of your reports.

You may also need to adjust the path for wget - I was using the MacPorts build on my machine.

cheers

JSN

Rescan Validation Update

Code:
https://github.com/JasonMOliver/Java_Parsers/blob/master/XMLValidate.java

-----

This is a quick update to the associated code to make it work with the changes to the .nessus V2 format in Nessus 5.0+.

If you update your scanner you will need to use this code on the output.

** On the upside, this is the only code set that broke with the changes - all of the other scripts still work with Nessus 5.0 **

cheers

JSN

----

I received a task a while back to validate that a .nessus artifact (some scan output) could be used to confirm that an item found in the past had been fixed.

I broke this task down into a few items:

  1. Was the pluginID scanned for in the file?
  2. Was it found on any hosts in the scan output?
  3. What was scanned?
I created this little Java command to validate these items from the command-line.

It's used like this:

java XMLValidate <fileName> <pluginID>

You can check for more than one pluginID at a time - simply keep adding them as args to the command.

The output looks like this:

--------
java XMLValidate ScanInput.nessus 30218

PluginID: 30218 was located as item 11903 scanned for in the plugin_set.
----> PluginID 30218 was identified on host 10.10.10.1
----> PluginID 30218 was identified on host 10.10.10.2

Scanned Hosts:
10.10.10.1
10.10.10.2
10.10.10.3
10.10.10.4
10.10.10.5


--------
Or, in the case where the file is clean:

--------
java XMLValidate ScanInput.nessus 30218

PluginID: 30218 was located as item 11903 scanned for in the plugin_set.
----> PluginID 30218 was NOT identified on any scanned host.

Scanned Hosts:
10.10.10.1
10.10.10.2
10.10.10.3
10.10.10.4
10.10.10.5


---------

As always, drop me a note with improvements - this just represents my hack-and-slash attempt to save time validating files while on an airline flight.

cheers

JSN

Risk Stats V2

Code:
https://github.com/JasonMOliver/Java_Parsers/blob/master/XMLVulnStatsV2.java

-----

OK - I spent some time with techs in the field this week and found they really need more data about the hosts when working with risk & scans.

So I created XMLVulnStatsV2, which adds the following columns to the table - helpful data about the hosts in addition to the IP address.

The output table now includes:
  • FQDN
  • OS
  • Mac Address
  • Scan Start Time
For all the techs looking for a quick view of the set of scans they have conducted, this is it.

Additional data on the core source is below.

---


The first version of the script, XMLVulnStats.java, will work from a single .nessus file or multiple .nessus files and give you the following summary data. This script requires Excel to do some of the front-end math; due to the use of Excel, the impact levels can be modified after the fact for more accurate results.

The command-line works as follows:

java XMLVulnStats Output.xls *.nessus

The output will be a table with the following columns
  •  IP Address
  •  Total CVSS Count - This totals the CVSS score for all Vulns on the Host
  •  Critical Count
  •  High Count
  •  Medium Count
  •  Low Count
  •  None Count
  •  Host Criticality - Adjustable figure between 100-1000 ranking hosts
  •  Risk Score - Total CVSS * Host Criticality
  •  Total Vuln - Total of Critical, High, Med, Low Vulns
  •  Average CVSS
Additionally you will get an Average System Risk Level calculation based on the averages for all hosts.

Note that you will need to set the Host Criticality for your system after the script is run, based on system knowledge. In the Federal / NIST space I have been using a spread based on the FIPS 199 level (i.e., for a moderate system, hosts are ranked between 400-600 based on impact: workstations 400, domain controllers 600, etc.).

cheers

JSN

Measurements & Risk Stats

Slides:
https://github.com/JasonMOliver/Misc/blob/master/ShmooCon%20EP%20Talk.pdf

Code:
https://github.com/JasonMOliver/Java_Parsers/blob/master/XMLVulnStats.java
https://github.com/JasonMOliver/Java_Parsers/blob/master/XMLTableStats.java

I have been working on a talk this week for ShmooCon Epilogue

http://novahackers.blogspot.com/2012/01/shmoocon-epilogue-speakers-and-location.html

For the people who missed the talk, I will have the slides, and the video if possible, posted on the media blog soon.



Video: "Epilogue: Jason Oliver - Risk Reporting Metrics" from Georgia Weidman on Vimeo.

---

In the talk I go over some ideas I have been playing with to answer a few key questions all of us techs get on risk assessments:
  • What are the X worst machines?
  • What is the overall risk level of my network?
  • Which fix would have the greatest risk-reduction effect?
In an effort to give a quantifiable answer to these questions, I created a few scripts that work with .nessus V2 files. Please note that the theory in the talk can be applied to any vulnerability data, but I wanted to show some actual implementation of the theory with the scripts.

The first script, XMLVulnStats.java, will work from a single .nessus file or multiple .nessus files and give you the following summary data. This script requires Excel to do some of the front-end math; due to the use of Excel, the impact levels can be modified after the fact for more accurate results.

The command-line works as follows:

java XMLVulnStats Output.xls *.nessus

The output will be a table with the following columns:

  •  IP Address
  •  Total CVSS Count - This totals the CVSS score for all Vulns on the Host
  •  Critical Count
  •  High Count
  •  Medium Count
  •  Low Count
  •  None Count
  •  Host Criticality - Adjustable figure between 100-1000 ranking hosts
  •  Risk Score - Total CVSS * Host Criticality
  •  Total Vuln - Total of Critical, High, Med, Low Vulns
  •  Average CVSS


Additionally you will get an Average System Risk Level calculation based on the averages for all hosts.

Note that you will need to set the Host Criticality for your system after the script is run, based on system knowledge. In the Federal / NIST space I have been using a spread based on the FIPS 199 level (i.e., for a moderate system, hosts are ranked between 400-600 based on impact: workstations 400, domain controllers 600, etc.).

The second script, XMLTableStats.java, is a simple edit of one of my older scripts that adds a column for Host Count.

The overall value of this is that it allows you to rank fix / repair order by vulnerability.

Simply run the script:
java -Xms32m -Xmx1024m XMLTableStats *.nessus > Output.xls

Then, in Excel, sort by Risk Factor, CVSS Score, and Host Count.

This will give you a fix list ordered first by vulnerability severity and then by the number of hosts affected, giving you the biggest bang for your buck if you fix by patch / issue.
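If you would rather skip Excel for the ordering step, the same three-key descending sort can be approximated with sort(1) on tab-delimited output. A sketch with made-up rows and hypothetical column positions:

```shell
#!/bin/sh
# Toy rows: Plugin, RiskFactor (4=Critical .. 1=Low), CVSS, HostCount
printf 'plugA\t3\t7.5\t12\nplugB\t4\t9.3\t3\nplugC\t4\t9.3\t40\n' > demo_fix.tsv

# Sort by Risk Factor, then CVSS, then Host Count, all descending
sort -t "$(printf '\t')" -k2,2nr -k3,3nr -k4,4nr demo_fix.tsv > demo_fixlist.tsv
cat demo_fixlist.tsv
```

Ties on severity fall through to the later keys, so a Critical issue hitting 40 hosts sorts above the same-severity issue hitting 3.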

Hope this helps - I hope to extend the research a bit and refine the theory. I have only been playing with the numbers for a month or two, so any feedback would be appreciated.

Keep in mind what a measurement really is - for the most part, it is anything that helps you understand a figure better than before. This is not designed to be a perfect, definitive number; it's designed to give you a quantifiable baseline to work from that is certainly better than what you had before.

cheers

JSN


Update: The talk has been posted on the media blog if you are interested.