Extracting NTDS Users Data... The Fastest Way.

Extracting what?

One of the most tedious and time-consuming tasks a pentester has to do when assessing a company's domain is the extraction of the NTDS.dit file, usually located on the Domain Controllers of the Active Directory.

Retrieving the hashes of all users from the Active Directory is the first thing that a hacker (well... I think) should do after obtaining Domain Admin privileges. Obtaining the LM/NTLM hashes is crucial; it opens up a huge range of possibilities to maintain access after a successful exploitation (e.g. Golden Ticket, password cracking, pass-the-hash, etc.), and looks beautiful when it is documented in a pentesting report ;)

Common ways to dump hashes

There are at least three well-known ways of extracting the LM/NTLM hashes from Active Directory.

  1. Extracting the NTDS.dit file from a shadow copy created with vssadmin, dumping the datatable and link_table tables with esedbexport from the esedbtools suite (libesedb), and retrieving the user data with scripts from the NTDSXtract framework, such as dsusers.py or dshashes.py (a command-line sketch of this workflow is shown below).
  2. Uploading an executable (and likely flagged as malicious) file, such as dumpntds.exe (available on GitHub), to a Domain Controller to create the datatable and link_table files; or executing a Meterpreter session and then running hashdump.
  3. Retrieving hashes (and maybe passwords) remotely using the Get-ADReplAccount cmdlet from the DSInternals suite.
All of these methods probably work, and all of them retrieve the data we are looking for. But not all of them are efficient.
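For reference, the first workflow typically looks something like this at the command line. This is only a rough sketch: the first three commands run on the DC, the last two run offline on your own machine, and the shadow copy number, the exported table file suffixes and the NTDSXtract option names vary between environments and tool versions.

vssadmin create shadow /for=C:
copy \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1\Windows\NTDS\ntds.dit C:\temp\ntds.dit
reg save HKLM\SYSTEM C:\temp\SYSTEM
esedbexport -m tables ntds.dit
dsusers.py ntds.dit.export/datatable.3 ntds.dit.export/link_table.5 work_dir --syshive SYSTEM --passwordhashes --pwdformat john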

Method 1: Getting the NTDS.dit and processing it offline

Cons:

  • When the NTDS.dit file is huge, usually because the domain has many users, copying and downloading the file is very time-consuming. It gets even worse when the network is slow.
  • After the painful download of the NTDS.dit file, you still need to extract the datatable and link_table files. The processing time of this task is proportional to the number of users registered in the domain. (Once I had to extract hashes from a domain with a 10 GB NTDS.dit file; it took me almost a week to obtain the domain users' LM/NTLM hashes.)

Method 2: Extracting the tables on the DC and processing them offline

Cons:

  • Dumping the NTDS.dit tables directly on the Domain Controller may take less time than method 1, but it is still inefficient.
  • We still have to deal with processing the datatable and link_table files, which is quite time-consuming in some cases.
  • On the other hand, Meterpreter dumps the LM/NTLM hashes pretty quickly, but unfortunately the hashdump module does not retrieve all the hashes.

Method 3: Retrieving hashes remotely with DSInternals

Cons:

  • This is probably the fastest way to obtain the info we are looking for. However, despite the speed of the process, the output is not readable by the usual password cracking tools (e.g. hashcat, john, l0phtcrack).
  • Parsing the file into a format those tools can use can be tedious (an example of the expected format is shown below).
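To give an idea of the gap: hashcat's NTLM mode (-m 1000), for instance, expects one 32-character hex hash per line, optionally prefixed with the username when the --username option is used. A valid line looks like this (jdoe is a placeholder; the hash is the well-known NT hash of an empty password):

jdoe:31d6cfe0d16ae931b73c59d7e0c089c0

The Get-ADReplAccount output, in contrast, prints each account as a multi-line block of properties rather than one hash per line.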

The fastest way

The fastest way is retrieving the hashes with the DSInternals suite, but the output file is not readable by the most common password cracking tools.

To make the LM/NTLM hash extraction process easier, I have developed a script that takes the Get-ADReplAccount output and parses it into a password-cracking-friendly format. The script is called dsinternalsparser.py and can be downloaded here.
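For the curious, the idea behind the parsing is simple. The following is only a minimal sketch of that idea, not the actual dsinternalsparser.py code; it assumes the Get-ADReplAccount text output labels the relevant fields as "SamAccountName:" and "NTHash:", which is based on the DSInternals property names and may differ between versions.

#!/usr/bin/env python3
# Minimal sketch: turn Get-ADReplAccount text output into user:hash lines.
# NOT the real dsinternalsparser.py; the field labels are assumptions.
import re
import sys

def parse(path):
    pairs, user = [], None
    with open(path, encoding="utf-8", errors="ignore") as f:
        for line in f:
            m = re.search(r"SamAccountName:\s*(\S+)", line)
            if m:
                user = m.group(1)   # remember the account currently being read
                continue
            m = re.search(r"NTHash:\s*([0-9a-fA-F]{32})", line)
            if m and user:
                pairs.append((user, m.group(1).lower()))
    return pairs

if __name__ == "__main__":
    for user, nthash in parse(sys.argv[1]):
        print(f"{user}:{nthash}")   # format accepted by hashcat --username / john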


The script takes the output file of the Get-ADReplAccount execution and creates text files with the data split up and organized. To make it clear, let's walk through an example.

To obtain all the user data, first set the $cred variable with the Get-Credential cmdlet and then execute the command below.
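The credential setup is simply (this assumes the DSInternals module is already installed on the machine you run the query from):

Import-Module DSInternals
$cred = Get-Credential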

Get-ADReplAccount -All -Server dc01 -NamingContext "dc=anotherdomain,dc=org" -Credential $cred > example.txt

After executing the Get-ADReplAccount cmdlet, we obtain a file like this one:


In the example.txt file, we dumped a domain with only one user. Now, to process the file, we run dsinternalsparser.py from a terminal:


./dsinternalsparser.py example.txt

By default, the script generates all the possible output files with the prefix output. If you only want certain files, specify which ones to generate as parameters (e.g. --ntlm, --cleartext, --lmhistory, etc.). Likewise, you can change the output filename prefix with the -o parameter.
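For instance, illustrative invocations built from the options just mentioned could look like:

./dsinternalsparser.py example.txt --ntlm --cleartext
./dsinternalsparser.py example.txt -o anotherdomain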

As a result, the script generates output files like these:



These files can be used to speed up the processing of Domain Users information, including the password cracking process.
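For example, an NTLM file with user:hash lines can be fed straight to hashcat's NTLM mode (output_ntlm.txt below is just a placeholder for whatever filename the script produced in your case):

hashcat -m 1000 --username output_ntlm.txt wordlist.txt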


I compared the processing time from NTDS.dit to LM/NTLM hashes on a 10 GB file. As I mentioned, the traditional way took me almost a week to obtain the data needed to start password cracking. Using this last method, I obtained all the hashes in hashcat format in 20 minutes.

To sum up, the fastest way (that I have found) to dump the Domain Users information from the NTDS.dit file is:

  1. Get the output file from the Get-ADReplAccount execution.
  2. Download the file and parse it with dsinternalsparser.py.
  3. Enjoy >:)
For more information about the Get-ADReplAccount module and DSInternals, refer to:
If you have questions about dsinternalsparser.py, go to:

Happy dumping.


#r4wd3r
