Wednesday, October 10, 2012

SAPpy - Annual 800-53 CM Control Testing Selection Automated

Code:
https://github.com/JasonMOliver/Java_Parsers/blob/master/SAPpy.java
https://github.com/JasonMOliver/Java_Parsers/blob/master/SAPpy.zip <-- With Include Files

-----

I have been working with the NIST / FISMA process for some time and have always wanted to automate the selection of which controls need to be tested when you are on a 3-year cycle but actually applying the concepts of Continuous Monitoring (CM).

Far too much time is spent during the Security Assessment Plan (SAP) process picking which controls need to be tested.
Additionally, people never seem to be sure what to select, or whether everything over the 3-year cycle has been addressed.

It is worth noting that the 3-year cycle is a dated concept, but most Federal agencies are still accrediting networks for 3 years at this point, even though, with CM, it is supposed to be a continuous process.

The end result is that every control needs to be accounted for in the baseline at some point in the 3-year accreditation time frame; if your accreditation is shorter, simply account for that by leaving the prior-year files blank for any year not counted in the process.
(i.e. if you have a 2-year accreditation, do not fill out the 2-years-ago data in Yr1.txt)

SAPpy simplifies that, though rather crudely, by making sure every control is accounted for in the cycle and that the additional requirements are also covered: major changes, POA&Ms, FIPS 200 updates, etc.

Simply fill in the associated text files with the following data:
 
  • Baseline.txt - All controls from NIST in the Low, Moderate, or High baseline. * Do not adjust for FIPS 200 at this point *
  • req.txt - All of the controls with annual testing requirements for the system.
  • FIPS200.txt - List all of the controls tailored out of the baseline per FIPS 200.
  • POAM.txt - List all the controls with POA&Ms closed in the last 12 months. * This may not be necessary if you audit during POA&M closure *
  • MajorChange.txt - List any controls that have had a major change. (e.g. if you moved buildings, add in the PE controls)
  • Yr1.txt - List all of the controls tested in the SCA from 2 years ago.
  • Yr2.txt - List all of the controls tested in the SCA from 1 year ago.
Then run the following command:

java SAPpy

At this point SAPpy will take over, sort everything out, and print to STDOUT the set of controls that need to be tested for the year, based on your system's baseline.
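Under the hood this is just set arithmetic over those text files. The real selection logic is in SAPpy.java; purely as a sketch of the idea (assuming one control ID per line in each file, and that no file is empty), the same result can be roughed out in the shell:

```shell
# Sketch only - SAPpy.java is the real implementation.
# Start from the baseline, drop what FIPS 200 tailored out and what the
# last two assessment years already covered, then force in the annual,
# POA&M, and major-change controls.
sort -u Baseline.txt |
  grep -v -x -F -f FIPS200.txt |            # tailored out per FIPS 200
  grep -v -x -F -f Yr1.txt |                # tested two years ago
  grep -v -x -F -f Yr2.txt |                # tested last year
  cat - req.txt POAM.txt MajorChange.txt |  # always-test sets
  sort -u
```

Unlike SAPpy this will not flag anything odd (e.g. a req.txt control missing from the baseline); it just prints the merged set.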

Simple as that!

As always, if you see issues with this code, or if you see this process implemented with a different interpretation, you may need to adjust the code.

cheers and happy testing
JSN

  

Sunday, July 8, 2012

Reports with Plugin Family Analysis

Code:
https://github.com/JasonMOliver/Java_Parsers/blob/master/XMLVulnStatsV4.java
https://github.com/JasonMOliver/Java_Parsers/blob/master/XMLTableStatsV2.java

-----

I have been working on building out better charts, with metric divisions that assist in pinpointing problem areas in large networks. This started out with wanting to publish charts of vulns by OS, and while I am still working on that, OS is a complicated puzzle.

It's complicated for a number of reasons.

One is that OS detection is questionable, and even when it works, the data sometimes has more detail than is needed for simplified OS-based metrics.

Also, plugins that fire based on middleware are ponderous: should they be bunched in with the underlying OS (i.e. Apache on Red Hat vs. Windows, etc.)?

So in the meantime I adjusted the Table code and the Stats code I had to include the Nessus Plugin Family. This also lets me push the data to gnuplot for a nice management-looking chart of vulns by family. For the most part you can see what jumps out as a place to focus, be that Windows patches, middleware / application patching, web code, etc.

The one glitch-type thing is that Nessus uses the categories Misc. and General, which can be confusing and in need of some clarification.

cheers

JSN

Monday, June 25, 2012

GUI Risk Stats

Code:
https://github.com/JasonMOliver/Java_Parsers/blob/master/XMLVulnStatsTab.java

-----

I have been playing with gnuplot for about a day and I have to say it's a lot of fun and can be complex. So far I have been able to generate some decent pictures of the data for reports and such, but I hope in the future this idea will get far more complex.

As of now I have attached a beta stats file for outputting .nessus files into tab-delimited summary data for parsing with gnuplot.

I started out with the following two graphs, which I thought I would share; they both show vuln data for the top 20 hosts based on overall CVSS score.

To get the data from the tool output into a sorted, parsable format, I use the following commands:

java XMLVulnStatsTab TestTab.out *.nessus

# Header row: keep columns 1 and 3-6
head -n 1 TestTab.out | awk -F '\t' '{print $1"\t"$3"\t"$4"\t"$5"\t"$6}' > TestDataWithTabs20Vuln.dat

# Move column 2 to the front as a sort key, numeric-sort descending,
# keep the top 20 rows, then drop the key and append columns 1 and 3-6
awk -F '\t' '{print $2"\t"$1"\t"$3"\t"$4"\t"$5"\t"$6}' TestTab.out | sort -g -r | head -n 20 | awk -F '\t' '{print $2"\t"$3"\t"$4"\t"$5"\t"$6}' >> TestDataWithTabs20Vuln.dat

At this point you should have the Top 20 data to play with.

A side note: at this point you can get gnuplot for the Mac from MacPorts:

port install gnuplot

After you have gnuplot all set up and running, use the following command sets for the reports:

#Top 20 Cluster Chart

set style data histogram
set style histogram cluster gap 1
set xtics rotate
set style fill solid border rgb "black"
set auto x
set yrange [0:*]
plot "TestDataWithTabs20Vuln.dat" using 3:xticlabels(1) title col lc rgb "purple", "TestDataWithTabs20Vuln.dat" using 4:xticlabels(1) title col lc rgb "red", "TestDataWithTabs20Vuln.dat" using 5:xticlabels(1) title col lc rgb "yellow", "TestDataWithTabs20Vuln.dat" using 6:xticlabels(1) title col lc rgb "green"



or

#Top 20 Row Chart with CVSS Total Score  
set style data histogram 
set style histogram rows gap 1 
set xtics rotate 
set style fill solid border rgb "black" 
set auto x  
set yrange [0:*] 
plot "TestDataWithTabs20Vuln.dat" using 2:xticlabels(1) title col with linespoints pointtype 5, "TestDataWithTabs20Vuln.dat" using 6:xticlabels(1) title col lc rgb "green", "TestDataWithTabs20Vuln.dat" using 5:xticlabels(1) title col lc rgb "yellow", "TestDataWithTabs20Vuln.dat" using 4:xticlabels(1) title col lc rgb "red", "TestDataWithTabs20Vuln.dat" using 3:xticlabels(1) title col lc rgb "purple"



Anyway, I expect this to get more complex as I get used to the tool, but for one day of playing (and it has been a lot of fun) this seems like it will be a handy way to put a nice spin on my report data and add some color.

If you have any reports you find interesting with this data, please add them to the thread.

cheers

JSN

Geektools - Mac

Link:
http://www.macosxtips.co.uk/geeklets/system/scurity-log-parse-aka-attack-tripwire/

-----

This is something I have been playing with on my Mac hosts to just keep an eye on the event logs that no one reads. You will need to install Geektool from the App Store (it's free):

http://itunes.apple.com/us/app/geektool/id456877552?mt=12

----

Security Log Parse (aka Attack Tripwire)

Just add the following command in as a shell Geeklet:

echo "Who is online:" ; who ; echo ''; echo 'Active Screen Sessions:'; screen -wls | awk -F 'in' '{print $1}'; echo ''; echo 'Failed Authentication:' ; grep 'Failed to authenticate user' /var/log/secure.log| awk -F ':' '{print $1":"$2""$4}' | awk -F '(' '{print $1}' | sort | uniq -c; grep 'authentication error' /var/log/secure.log| awk -F ':' '{print $1":"$2$6}' | sed 's/authentication error for //g' | sort | uniq -c

or, if you're getting a lot, you can trim it to only alerts from the current month:

echo "Who is online:" ; who ; echo ''; echo 'Active Screen Sessions:'; screen -wls | awk -F 'in' '{print $1}'; echo ''; echo 'Failed Authentication:'; i=$(date +"%b"); grep 'Failed to authenticate user' /var/log/secure.log| awk -F ':' '{print $1":"$2""$4}' | awk -F '(' '{print $1}' | sort | uniq -c | grep $i; grep 'authentication error' /var/log/secure.log| awk -F ':' '{print $1":"$2$6}' | sed 's/authentication error for //g' | sort | uniq -c | grep $i

This little script is good in cafes or offices, etc., to see if someone is trying to log into your computer and what address they source from. It also lists the active sessions on your machine.

*Note: The awk formatting may need to be adjusted if you're not running Lion

Anyway, just a fun little idea I had while sitting around today - enjoy and cheers

JSN

Risk Stats V3

Code:
https://github.com/JasonMOliver/Java_Parsers/blob/master/XMLVulnStatsV3.java

-----

Ok - I had an interesting idea to put some summary data into this report, so it ended up as a rev 3.

Now when you run XMLVulnStatsV3 you will get two extra tables at the bottom along with the risk chart. One has summary data on how many unique hosts were scanned and how many failed authentication. Also, items that failed auth are now highlighted pink in the report.

Additionally, you get a list of the hosts (with OS) that failed authentication. Note that this process supports rescans in the set, so if you scan a machine 4 times and one of the scans authenticated, it will not show up in the final list of hosts with failed auth.
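That last rule is worth spelling out: a host only lands on the failed-auth list if none of its scans authenticated. As a sketch of the idea (the real logic is in XMLVulnStatsV3.java, and the host/status feed here is a made-up format), in awk:

```shell
# Input (hypothetical format): one "host<TAB>ok|failed" line per scan.
# A host is only printed if it never authenticated in any scan.
awk -F '\t' '
  { seen[$1] = 1; if ($2 == "ok") ok[$1] = 1 }
  END { for (h in seen) if (!(h in ok)) print h }
'
```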

Additional data on the core source is below.

---


The first version of the script, XMLVulnStats.java, will work from a .nessus file (or multiple .nessus files) and give you the following summary data. This script requires Excel to do some of the front-end math; due to the use of Excel, the impact levels can be modified after the fact to gain more accurate results.

The command-line works as follows:

java XMLVulnStatsV3 Output.xls *.nessus

The output will be a table with the following columns:

  •  IP Address
  •  Total CVSS Count - This totals the CVSS score for all Vulns on the Host
  •  Critical Count
  •  High Count
  •  Medium Count
  •  Low Count
  •  None Count
  •  Host Criticality - Adjustable figure between 100-1000 ranking hosts
  •  Risk Score - Total CVSS * Host Criticality
  •  Total Vuln - Total of Critical, High, Med, Low Vulns
  •  Average CVSS
  •  Scan Depth

Additionally you will get an Average System Risk Level calculation based on the averages for all hosts.

Note that you will need to set the Host Criticality for your system after the script is run, based on system knowledge. In the Federal / NIST space I have been using a spread based on the FIPS 199 level (i.e. if it's a moderate system, hosts are ranked between 400-600 based on impact: workstations 400, domain controllers 600, etc.).
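For reference, the arithmetic the spreadsheet does is simple; a hypothetical awk rendering (assuming the tab-separated column order above, with Total CVSS in column 2, Host Criticality in column 8, and a plain mean of the per-host risk scores for the system average) looks like this:

```shell
# Risk Score = Total CVSS x Host Criticality (columns 2 and 8 above);
# the average at the end is a simple mean over all hosts.
awk -F '\t' '
  { risk = $2 * $8; total += risk; n++
    printf "%s\t%.1f\n", $1, risk }
  END { if (n) printf "Average System Risk: %.1f\n", total / n }
'
```

So a moderate-impact workstation with a Total CVSS of 10 and a criticality of 400 scores 4000.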

Hope you are all having fun with the data - any ideas, send them my way.

cheers

JSN

Download All Nessus Reports at Command-line

So I have a lot of Nessus scan files and have been looking for a quick way to download all of the reports in Nessus V2 format for processing. I found this to be the simple way - and if you put all 3 lines into a shell script, even simpler.

token="$(/opt/local/bin/wget --no-check-certificate --post-data 'login=userID&password=password' https://127.0.0.1:8834/login -O - | grep '<token>' | sed 's/<contents><token>//g' | sed 's/<\/token><user>//g')"

/opt/local/bin/wget --post-data "token=$token" --no-check-certificate https://127.0.0.1:8834/report/list -O - | grep 'name' | sed 's/<name>//g' | sed 's/<\/name>//g' > reports

for i in $(cat reports); do /opt/local/bin/wget --post-data "token=$token&report=$i" --no-check-certificate https://127.0.0.1:8834/file/report/download -O - > $i.nessus; done;


You will need to swap out the userID and password for your local Nessus user ID and password - but there you go: a few lines and you have all of your reports.

You may also need to adjust the path for wget - I was using the MacPorts build on my machine.

cheers

JSN

Rescan Validation Update

Code:
https://github.com/JasonMOliver/Java_Parsers/blob/master/XMLValidate.java

-----

This is a quick update to the associated code to make it work with the changes to the .nessus V2 format in Nessus 5.0+.

If you update your scanner you will need to use this code on the output.

** On the up side, this is the only code set that broke with the changes - all of the other scripts still work with Nessus 5.0 **

cheers

JSN

----

I received a task a while back to validate that a .nessus artifact (some scan output) could support validating that an item found in the past was fixed.

I broke this task down into a few items:

  1. Was the pluginID scanned for in the file?
  2. Was it found on any hosts in the scan output?
  3. What was scanned?

I created this little Java command to validate these items from the command-line.

It's used thus:

java XMLValidate <fileName> <pluginID>

You can check for more than one pluginID at a time; simply keep adding them as args to the command.
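If you're stuck without the Java handy, checks 2 and 3 can be roughed out with grep against the V2 XML (ReportItem elements carry a pluginID attribute, ReportHost elements a name attribute). This is only a loose approximation of what XMLValidate does, and it skips the plugin_set check entirely; the file name here is hypothetical:

```shell
file=ScanInput.nessus   # hypothetical input file
id=30218

# 2. Was the plugin found on any host?
grep -q "pluginID=\"$id\"" "$file" \
  && echo "PluginID $id was identified in $file" \
  || echo "PluginID $id was NOT identified on any scanned host."

# 3. What was scanned?
grep -o 'ReportHost name="[^"]*"' "$file" | sed 's/ReportHost name="//; s/"$//' | sort -u
```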

The output looks like this:

--------
java XMLValidate ScanInput.nessus 30218

PluginID: 30218 was located as item 11903 scanned for in the plugin_set.
----> PluginID 30218 was identified on host 10.10.10.1
----> PluginID 30218 was identified on host 10.10.10.2

Scanned Hosts:
10.10.10.1
10.10.10.2
10.10.10.3
10.10.10.4
10.10.10.5


--------
Or in the case the file is clean:

--------
java XMLValidate ScanInput.nessus 30218

PluginID: 30218 was located as item 11903 scanned for in the plugin_set.
----> PluginID 30218 was NOT identified on any scanned host.

Scanned Hosts:
10.10.10.1
10.10.10.2
10.10.10.3
10.10.10.4
10.10.10.5


---------

As always, drop me a note with improvements, as this just represents my hack-and-slash attempt to save time validating files while on an airline flight.

cheers

JSN