Tuesday, October 15, 2019

Nessus Compliance to Baseline Fix it List


This script is a riff on XMLCompTable, called XMLCompTableFix. It produces a list of fix items needed to bring a machine into compliance.

The script gives an X/Y chart with tests down one axis and machines across the top, resulting in a chart of compliance failures and how to fix them. ** See Example Below **


WN10-00-000175 - The Secondary Logon service must be disabled on Windows 10.

Check Failed - Solution: Configure the 'Secondary Logon' service 'Startup Type' to 'Disabled'. References: 800-171|3.4.6,800-171|3.4.7,800-53|CM-7,CAT|II,CCI|CCI-000381,CIP|007-6-R1,CN-L3|,CN-L3|,CSCv6|9.1,CSF|PR.IP-1,CSF|PR.PT-3,ITSG-33|CM-7,NIAv2|SS13b,NIAv2|SS14a,NIAv2|SS14c,NIAv2|SS15a,PCI-DSSv3.1|2.2.2,PCI-DSSv3.1|2.2.3,PCI-DSSv3.2|2.2.2,PCI-DSSv3.2|2.2.3,Rule-ID|SV-89393r1_rule,STIG-ID|WN10-00-000175,SWIFT-CSCv1|2.3,Vuln-ID|V-74719

Monday, June 29, 2015

XMLCompTable2 - Now with details!


It has been a while since I last wrote code to work with Nessus, but I recently needed to run CIS benchmarks that had not been edited to match a local security policy.

Because of this, I needed to see not only the default pass and fail but also what the scanner actually found when it identified a pass or fail.

The output looks something like the following: machines on the top axis and tests on the side axis.

To run the command, as always:

java -Xmx1g XMLCompTable2 *.nessus > output.[html/xls]

To get the best value out of this, scan and parse one baseline at a time (i.e. all Windows 2008, all Windows 7, all Red Hat, etc.).



Wednesday, October 10, 2012

SAPpy - Annual 800-53 CM Control Testing Selection Automated

https://github.com/JasonMOliver/Java_Parsers/blob/master/SAPpy.zip <-- With Include Files


I have been working for some time with the NIST / FISMA process and always wanted to automate the selection of which controls need to be tested, given that you're working on a 3-year cycle but actually using the concepts of Continuous Monitoring (CM).

Way too much time is spent in the Security Assessment Plan (SAP) process on picking which controls need to be tested.
Additionally, people never seem to be sure what to select, or whether everything over the 3-year cycle has been addressed.

It's worth noting that the 3-year cycle is a dated concept, but most Federal agencies are still accrediting networks for 3 years at this point, even though, with CM, it is supposed to be a continuous process.

The end result is that everything in the baseline needs to be accounted for at some point in the 3-year accreditation time frame. If your accreditation is shorter, simply account for that by leaving the prior-year files blank for any year not counted in the process.
(i.e. if you have a 2-year accreditation, do not fill out the 2+ year ago data in Yr1.txt)

SAPpy simplifies that, though rather crudely, by making sure every control is accounted for in the cycle, along with the additional requirements like major changes, POA&Ms, FIPS 200 updates, etc.

Simply fill in the associated text files with the following data:
  • Baseline.txt - All controls from NIST in the Low, Moderate, or High baseline. * Do not adjust for FIPS 200 at this point *
  • req.txt - All of the controls with annual testing requirements for the system.
  • FIPS200.txt - List all of the controls tailored out of the baseline in the FIPS 200.
  • POAM.txt - List all the controls with POA&Ms closed in the last 12 mo. * This may not be necessary if you audit during POA&M closure *
  • MajorChange.txt - List any controls that have had a major change. (e.g. if you moved buildings, add in the PE controls, etc.)
  • Yr1.txt - List all of the controls tested in the SCA from 2 years ago.
  • Yr2.txt - List all of the controls tested in the SCA from 1 year ago.
Then run the following command:

java SAPpy

At this point SAPpy will take over, sort everything out, and print to STDOUT the set of controls that need to be tested for the year, based on your system's baseline.
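Under the hood the selection is just set arithmetic, which can be sketched with coreutils. This is an illustrative approximation, not the actual SAPpy source, and the control lists below are made up:

```shell
# Sketch of the selection logic: controls due this year =
# (Baseline - FIPS200 tailoring) not tested in Yr1/Yr2,
# plus annual requirements, closed POA&Ms, and major changes.
# All files hold one control ID per line; comm needs sorted input.
printf 'AC-1\nAC-2\nCM-7\nPE-3\n' | sort > Baseline.txt   # sample baseline
printf 'PE-3\n' | sort > FIPS200.txt                      # tailored out
printf 'AC-1\n' | sort > Yr1.txt                          # tested 2 years ago
printf 'AC-2\n' | sort > Yr2.txt                          # tested 1 year ago
printf 'CM-7\n' | sort > req.txt                          # annual requirement
: > POAM.txt; : > MajorChange.txt                         # nothing this year

comm -23 Baseline.txt FIPS200.txt > scoped.txt   # drop FIPS 200 tailoring
sort -u Yr1.txt Yr2.txt > tested.txt             # covered in prior years
comm -23 scoped.txt tested.txt > untested.txt    # remainder of the cycle
sort -u untested.txt req.txt POAM.txt MajorChange.txt
```

With this toy data the final line prints only CM-7: it is the one scoped control not covered in the last two years, and it also happens to carry an annual requirement.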

Simple as that!

As always, if you see issues with this code, or if you see this process implemented with different interpretations, you may need to adjust the code.

cheers and happy testing


Sunday, July 8, 2012

Reports with Plugin Family Analysis



I have been working on building out better charts, with metric divisions that help pinpoint problem areas in large networks. This started out with wanting to publish charts of vulns by OS, and while I am still working on that, OS is a complicated puzzle.

It's complicated for a number of reasons:

One is that OS detection is questionable, and even when it works, the data sometimes has more detail than is needed for simple OS-based metrics.

Also, plugins that fire based on middleware raise the awkward question of whether you want them bunched in with the underlying OS (i.e. Apache on Red Hat vs. Windows, etc.).

So in the meantime I adjusted my existing Table and Stats code to include the Nessus plugin family. This also allows me to push the data to gnuplot for a nice management-looking chart of vulns by plugin family. For the most part you can see what jumps out as a place to focus, be that Windows patches, middleware / application patching, web code, etc.
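If you just want a quick plugin-family breakdown without the Java tooling, a rough shell approximation works on .nessus (v2) exports, since each ReportItem carries a pluginFamily attribute. The sample file here is a made-up stand-in for a real export:

```shell
# Build a tiny stand-in for a real .nessus v2 export
cat > sample.nessus <<'EOF'
<ReportItem port="445" pluginFamily="Windows" pluginID="1001"/>
<ReportItem port="80" pluginFamily="Web Servers" pluginID="1002"/>
<ReportItem port="445" pluginFamily="Windows" pluginID="1003"/>
EOF

# Count findings per plugin family, most common first
grep -o 'pluginFamily="[^"]*"' sample.nessus | sort | uniq -c | sort -rn
```

On the sample above this prints Windows with a count of 2 and Web Servers with a count of 1; on a real scan it gives a quick feel for where the findings cluster.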

The one glitchy thing is that Nessus uses the categories Misc. and General, which can be confusing and in need of some clarification.



Monday, June 25, 2012

GUI Risk Stats



I have been playing with gnuplot for about a day and I have to say it's a lot of fun, and it can get complex. So far I have been able to generate some decent pictures of data for reports and such, but I hope in the future this idea will get far more complex.

As of now I have attached a beta stats file for turning .nessus files into tab-delimited summary data for parsing with gnuplot.

I started out with the following two graphs, which I thought I would share; both show vuln data for the top 20 hosts based on overall CVSS score.

To get the data from the tool output into a sorted, parsable format, I use the following commands:

java XMLVulnStatsTab TestTab.out *.nessus

head -n 1 TestTab.out | awk -F '\t' '{print $1"\t"$3"\t"$4"\t"$5"\t"$6}' > TestDataWithTabs20Vuln.dat; awk -F '\t' '{print $2"\t"$1"\t"$3"\t"$4"\t"$5"\t"$6}' TestTab.out | sort -g -r | head -n 20 | awk -F '\t' '{print $2"\t"$3"\t"$4"\t"$5"\t"$6}' >> TestDataWithTabs20Vuln.dat

At this point you should have the Top 20 data to play with.
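That long pipeline is the classic "move the sort key to the front, sort, then drop the key" pattern. A miniature version on made-up three-host data (top 2 instead of top 20) shows the idea:

```shell
# Made-up rows: IP <tab> total CVSS <tab> label
printf '10.0.0.1\t12.5\ta\n10.0.0.2\t40.0\tb\n10.0.0.3\t25.0\tc\n' > mini.out

# Move the CVSS score to column 1, sort numerically descending,
# keep the top 2 rows, then drop the sort key again
awk -F '\t' '{print $2"\t"$1"\t"$3}' mini.out |
  sort -g -r | head -n 2 |
  awk -F '\t' '{print $2"\t"$3}'
```

This prints 10.0.0.2 first and 10.0.0.3 second, since those have the two highest scores; the real pipeline does the same thing with six columns and a header row.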

A side note: you can get gnuplot for the Mac from MacPorts

port install gnuplot

After you have gnuplot set up and running, use the following command sets for the reports:

#Top 20 Cluster Chart

set style data histogram
set style histogram cluster gap 1
set xtics rotate
set style fill solid border rgb "black"
set auto x
set yrange [0:*]
plot "TestDataWithTabs20Vuln.dat" using 3:xticlabels(1) title col lc rgb "purple", "TestDataWithTabs20Vuln.dat" using 4:xticlabels(1) title col lc rgb "red", "TestDataWithTabs20Vuln.dat" using 5:xticlabels(1) title col lc rgb "yellow", "TestDataWithTabs20Vuln.dat" using 6:xticlabels(1) title col lc rgb "green"


#Top 20 Row Chart with CVSS Total Score  
set style data histogram 
set style histogram rows gap 1 
set xtics rotate 
set style fill solid border rgb "black" 
set auto x  
set yrange [0:*] 
plot "TestDataWithTabs20Vuln.dat" using 2:xticlabels(1) title col with linespoints pointtype 5, "TestDataWithTabs20Vuln.dat" using 6:xticlabels(1) title col lc rgb "green", "TestDataWithTabs20Vuln.dat" using 5:xticlabels(1) title col lc rgb "yellow", "TestDataWithTabs20Vuln.dat" using 4:xticlabels(1) title col lc rgb "red", "TestDataWithTabs20Vuln.dat" using 3:xticlabels(1) title col lc rgb "purple"

Anyway, I expect this to get more complex as I get used to the tool, but for one day of playing (and it's been a lot of fun) this seems like it will be a handy way to put a nice spin on my report data and add some color.

If you have any reports you find interesting with this data - please add them to the thread.



Geektools - Mac



This is something I have been playing with on my Mac hosts to keep an eye on the event logs that no one reads. You will need to install GeekTool from the App Store (it's free).



Security Log Parse (aka Attack Tripwire)

Just add the following command as a shell Geeklet:

echo "Who is online:" ; who ; echo ''; echo 'Active Screen Sessions:'; screen -wls | awk -F 'in' '{print $1}'; echo ''; echo 'Failed Authentication:' ; grep 'Failed to authenticate user' /var/log/secure.log| awk -F ':' '{print $1":"$2""$4}' | awk -F '(' '{print $1}' | sort | uniq -c; grep 'authentication error' /var/log/secure.log| awk -F ':' '{print $1":"$2$6}' | sed 's/authentication error for //g' | sort | uniq -c

or, if you're getting a lot, you can trim it to only alerts from the current month:

echo "Who is online:" ; who ; echo ''; echo 'Active Screen Sessions:'; screen -wls | awk -F 'in' '{print $1}'; echo ''; echo 'Failed Authentication:'; i=$(date +"%b"); grep 'Failed to authenticate user' /var/log/secure.log| awk -F ':' '{print $1":"$2""$4}' | awk -F '(' '{print $1}' | sort | uniq -c | grep $i; grep 'authentication error' /var/log/secure.log| awk -F ':' '{print $1":"$2$6}' | sed 's/authentication error for //g' | sort | uniq -c | grep $i

This little script is good for cafes, offices, etc., to see if someone is trying to log into your computer and what address they source from. It also lists the active sessions on your machine.

*Note: The formatting of the awk may need to be adjusted if you're not using Lion
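To see what the first grep/awk chain in the Geeklet actually extracts, here is a miniature run on a made-up log excerpt. The message format is approximated from memory, and real secure.log lines vary by OS version, which is exactly why the awk may need adjusting:

```shell
# Made-up secure.log excerpt (format approximated, not a real capture)
cat > secure.sample <<'EOF'
Jul  8 10:15:01 mac SecurityAgent[123]: Failed to authenticate user <alice> (error: 9).
Jul  8 10:15:05 mac SecurityAgent[123]: Failed to authenticate user <alice> (error: 9).
EOF

# Same chain as the Geeklet: keep the date (to the minute) and the
# message up to the user name, then count duplicate attempts
grep 'Failed to authenticate user' secure.sample |
  awk -F ':' '{print $1":"$2""$4}' |
  awk -F '(' '{print $1}' | sort | uniq -c
```

The two attempts within the same minute collapse into one line with a count of 2, so repeated guessing against an account stands out at a glance.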

Anyway just a fun little idea I had when I was sitting around today - enjoy and cheers


Risk Stats V3



Ok - I had an interesting idea to put some summary data into this report, so it ended up as a rev 3.

Now when you run XMLVulnStatsV3 you will get two extra tables at the bottom, along with the risk chart. One has summary data on how many unique hosts were scanned and how many failed authentication. Also, items that failed auth are now highlighted pink in the report.

Additionally, you get a list of the hosts (with OS) that failed authentication. Note that this process supports rescans in the set, so if you scan a machine 4 times and one scan authenticated, it will not show up in the final list of failed-auth hosts.

Additional data on the core source is below.


The first version of the script, XMLVulnStats.java, works from one or more .nessus files and gives you the following summary data. This script requires Excel to do some of the front-end math; because of that, the impact levels can be modified after the fact for more accurate results.

The command-line works as follows:

java XMLVulnStatsV3 Output.xls *.nessus

The output will be a table with the following columns

  • IP Address
  • Total CVSS Count - This totals the CVSS scores for all vulns on the host
  • Critical Count
  • High Count
  • Medium Count
  • Low Count
  • None Count
  • Host Criticality - Adjustable figure between 100-1000 ranking hosts
  • Risk Score - Total CVSS * Host Criticality
  • Total Vulns - Total of Critical, High, Med, Low vulns
  • Average CVSS
  • Scan Depth

Additionally you will get an Average System Risk Level calculation based on the averages for all hosts.

Note that you will need to set the Host Criticality for your system after the script is run, based on system knowledge. In the Federal / NIST space I have been using a spread based on the FIPS 199 level (e.g. for a moderate system, hosts are ranked between 400-600 based on impact: workstations 400, domain controllers 600, etc.).
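As a quick sanity check on the spreadsheet math, Risk Score (Total CVSS * Host Criticality) can be recomputed outside Excel with awk. The row below is made-up data, with the column positions assumed from the list above:

```shell
# Sample row: IP, Total CVSS, Host Criticality
# (a moderate-system domain controller ranked at 600)
printf '10.0.0.5\t42.5\t600\n' |
  awk -F '\t' '{printf "%s\tRisk Score: %s\n", $1, $2 * $3}'
```

For this host the Risk Score comes out to 25500, matching what the Excel column should show for the same inputs.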

Hope you all are having fun with the data - any ideas send them my way.