Second Issue of Exploit Mag is out!

The second free issue of Exploit Magazine has been released!

This month’s issue highlights three articles written by, well… yours truly!

(Guys, we really need more contributors. Help the security community out and share your knowledge! Contact me at cyberarms (at) Live.com for more info.)

Included are updated versions of my Pentesting with Metasploitable 2 article series, Security Testing with PowerShell and PowerPoint, and a short article on Listening to VoIP Calls with Wireshark:

Practice Pentesting with Metasploitable 2

You have been learning some mad hacking skills, but how do you test them? Wouldn’t it be great if there was a system that came with vulnerabilities that you could try to exploit? Well, there is: meet Metasploitable 2!

In this article we will take a look at the purposefully vulnerable Linux system and learn how to exploit it. We will cover scanning a system, using a remote exploit to get root access, cracking the passwords, and then using the passwords to exploit all the systems on the network.

Security Testing with PowerShell and PowerPoint

Many times hackers think “Out-of-the-Box” and manipulate common services and programs to exploit a system. In this article we will look at gaining remote shells with PowerShell through the Social Engineering Toolkit and how to get remote user credentials via PowerPoint.

Listening to VoIP Calls from Packet Captures

In this article we will look at recovering and playing voice calls from nothing more than a network packet capture that includes VoIP traffic.

How difficult would it be to scan a packet capture, find the calls among the thousands of available packets, and somehow listen to them? Well, as it turns out, it is not hard at all. The feature is built into Wireshark!

Check it out!

Windows 8 Forensics: Reset and Refresh Artifacts

(Note:  The following information is primarily from a paper that I wrote detailing the Windows 8 Reset and Refresh functions.  A few pieces have been changed for formatting, but the structure has stayed the same.  The sections are broken down in the same manner as the paper. – Ethan Fleisher)

1. Introduction

1.1 Research Problem

Windows 8 ships with a new feature that will be extremely handy for the average consumer: the Reset and Refresh function.  This lets a user choose between refreshing the OS, quickly resetting their entire computer, or thoroughly resetting their entire computer.  A function that has the potential to wipe out data is of extreme importance to the world of digital forensics, as it could easily make or break a case.  This paper will delve into what can be found on a machine that has had the refresh, quick reset, or thorough reset function performed on it.

2. Refreshed and Reset Machines

Before diving into the differences between a refreshed machine and a reset machine, the first important thing to look at is whether or not the machine even had one of the two functions performed on it.

2.1 Recovery Directory

Within the system recovery volume on a Windows 8 computer, a folder named Recovery can be found.  Within the Recovery folder, a folder labeled with a GUID will exist.  Three files are located in this folder: Winre.wim, boot.sdi, and ReAgent.xml.  Winre.wim is the Windows image format file, boot.sdi is the Windows deployment system image, and ReAgent.xml is associated with recovery.  All of this is typical behavior and can be found on any installation of a Windows 8 machine.

Windows 8 – System Recovery Volume\Recovery\GUID\ReAgent.xml

When a Refresh or Reset is done to a system, a new folder can be found in the Recovery folder on the system volume.  This folder, named Logs, contains a file named Reload.xml.  The information contained within this file remains the same whether the system was refreshed or reset.

Windows 8 – System Recovery Volume\Recovery\Logs\Reload.xml

3. Refresh vs. Data Generation

At first glance, three folders stand out when comparing a refreshed image to one that has never been refreshed.  The Windows partition contains $SysReset, Windows.old, and Lost Files.

4. Lost Files

To start, we’ll take a look at the Lost Files folder.  This folder will appear on more than just a system that has been refreshed, but it is still worth mentioning what it is.  The Lost Files folder contains files that still have an MFT entry on the system, but whose parent folder has been deleted.  The files and the folder that contained them were deleted, and only the parent folder was overwritten.  The files within were not overwritten, however, and their MFT entries are still present.  Because the entries are still present, forensic software can still recognize that the files exist.

With that being said, the Lost Files folder could potentially hold data of forensic value, but, in regard to this Windows 8 paper, it is nothing new.

5. $SysReset

The $SysReset folder contains a vast amount of data, ranging from log files to migration XML documents, all of which provide useful information to a forensic investigator.

5.1 Bin Directory

The bin directory is a great source of information.  Within the bin directory, a directory named rollback can be found.  It holds three text files that provide information relevant to the refresh that occurred.  These files are:

1. QuarantineLog.txt
2. LogRestore.txt
3. FolderMoveLog.txt

5.1.1 QuarantineLog.txt

QuarantineLog.txt records which folders were saved, and where they were saved to.

5.1.2 LogRestore.txt

LogRestore.txt contains the location of the migration log from the reset. This will be explained in further detail  later.

Example: D:\$SysReset\Logs\Mig

5.1.3 FolderMoveLog.txt

FolderMoveLog.txt contains a list of all folders that were moved, listing their new location followed by their previous location.  It is notated in the format:

New file location | Previous file location

Within this text file are entries ranging from typical user document files to Internet favorites and Metro settings.
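If there are many entries, the pipe-delimited layout makes the log easy to process programmatically. Below is a minimal Python sketch, assuming the “new | previous” format shown above; the file path is only an example, and the text encoding of the log on a given image may differ.

```python
# Minimal sketch: parse FolderMoveLog.txt into (new location, previous location) pairs.
# Assumes the pipe-delimited "New file location | Previous file location" layout
# described above. The path is an example; the log's encoding may vary.
from pathlib import Path

def parse_folder_move_log(path):
    data = Path(path).read_bytes()
    if data.startswith((b"\xff\xfe", b"\xfe\xff")):
        text = data.decode("utf-16")      # BOM present: UTF-16 log
    else:
        text = data.decode("latin-1")     # otherwise treat as single-byte text
    moves = []
    for line in text.splitlines():
        if "|" in line:
            new_loc, _, old_loc = line.partition("|")
            moves.append((new_loc.strip(), old_loc.strip()))
    return moves

if __name__ == "__main__":
    # Example path; point this at the rollback directory on the image being examined.
    for new_loc, old_loc in parse_folder_move_log(r"D:\$SysReset\bin\rollback\FolderMoveLog.txt"):
        print(f"{old_loc}  ->  {new_loc}")
```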

5.2 Framework Directory

The framework directory contains information that does not immediately appear to be very helpful.

Within Framework\Migration\Preserve is a file named Immersive Apps MigrationMigration.xml.  This file contains information that appears to relate to Metro apps.  It contains registry key information, as well as multiple lines stating “rejuvenation”.  These rejuvenation lines relate to:

  • AppX Payload
  • AppX Licensing
  • Modern Tiles
  • Modern App Data
  • AppX Enterprise Apps Authorization
  • AppX Lock Screen Notifications
  • AppX Application Tamper State Cache

5.3 Logs Directory

$SysReset contains a directory named Logs as well.  Within this directory are multiple log files and XML files documenting the migration process during the refresh, stating where files previously resided and where they currently reside.

Within the $SysReset\Logs directory is a file named MigLog.xml, as well as two subdirectories, Mig and Rollback.  These files appear to be of the most importance.

5.3.1 MigLog.xml

MigLog.xml can be quite helpful in determining basic information about the machine itself.  Information such as the system name, username-to-SID correlation, last access and login times, and Windows mapping schemes can all be found here.

For example, a simple Ctrl-F search for the username that was used turned up, on the first hit, the last access time, profile path, SID, and the domain that the account was tied to.
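The same lookup can be scripted for a set of accounts. Below is a minimal sketch; it is a plain text search rather than a schema-aware XML parser, and the MigLog.xml path, username, and context window are example choices.

```python
# Minimal sketch: emulate the manual Ctrl-F by printing every line of MigLog.xml
# that mentions a username of interest, plus a few surrounding lines, which is
# where the SID, profile path, and domain tend to appear. Text search only; the
# file path and username are examples.
from pathlib import Path

def find_user_lines(miglog_path, username, context=3):
    lines = Path(miglog_path).read_text(errors="ignore").splitlines()
    for i, line in enumerate(lines):
        if username.lower() in line.lower():
            start = max(0, i - context)
            yield from lines[start:i + context + 1]
            yield "-" * 60

if __name__ == "__main__":
    for line in find_user_lines(r"C:\$SysReset\Logs\MigLog.xml", "Ethan"):
        print(line)
```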


5.3.2 Logs\Mig Subdirectory

Within the subdirectory Mig are three files, two logs and one XML file, that provide more information about the system.  These three files are setupact.log, systemresetplatform.log, and miglog.xml.

5.3.3 Setupact.log

Setupact.log holds some basic information about the system and the setup itself.  All user profiles that were present on the machine at the time of the migration can be located here.  By searching for the string “Processing Profile”, all of the accounts that were migrated over can be found.  These range from the system profile to LocalService and NetworkService, as well as the created users themselves.  Default locations are mapped for each user as well.

The machine name, SID, and GUID can all also be found in setupact.log.


Finally, all apps that were recursively downloaded from the Windows 8 Store and migrated over can be found.  Searching for the string “STORERECURSIVE” will display this.  A list of all applications downloaded from the Windows Store can be found here.
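Both of these string searches are easy to automate. The sketch below scans setupact.log for the two markers mentioned above; the marker strings come straight from the log as described, while the path is only an example.

```python
# Minimal sketch: pull migrated user profiles and Windows Store apps out of
# setupact.log by scanning for the two marker strings discussed above.
# The log path is an example; adjust it to the image being examined.
from pathlib import Path

MARKERS = ("Processing Profile", "STORERECURSIVE")

def scan_setupact(path):
    hits = {marker: [] for marker in MARKERS}
    for line in Path(path).read_text(errors="ignore").splitlines():
        for marker in MARKERS:
            if marker in line:
                hits[marker].append(line.strip())
    return hits

if __name__ == "__main__":
    for marker, lines in scan_setupact(r"C:\$SysReset\Logs\Mig\setupact.log").items():
        print(f"== {marker}: {len(lines)} hit(s) ==")
        for line in lines:
            print("  " + line)
```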

5.3.4 Systemresetplatform.log

This relatively short log file contains a couple of pieces of information.   It is convenient that, like the other logs, this one also gives timestamps for when the events happened.  This can very easily put a date and time to the refresh.  Also, much like setupact.log, all of the immersive Metro apps that were installed on the system and migrated over can be found here.

Perhaps the most interesting piece of information here, though, is where the old registry hives were unloaded to.  These include the SOFTWARE hive, SYSTEM hive, and NTUSER.dat hives.  This is located at the very end of the log.

5.3.5 MigLog.xml

MigLog.xml contains similar information to the previous files, including system names, SID numbers, domain names, profile names, mapping information, and more.  Any of these logs can be used to gain information about the system and the migration process, giving investigators the locations of both old data and the new migrated locations.

5.4 MigEngineStore Directory

The MigEngineStore directory contains two subdirectories: MachineSpecific and XMLs.  The MachineSpecific folder has two files, migstate.dat and catalog.mig.  After very briefly parsing these, however, it appears that the information provided is nothing overly new compared to the other files that have been found.

The XMLs subdirectory contains two XML files, both of which appear to simply ensure that the system is set up correctly.  Once again, the information in here may be useful, but it would be extremely situational.

5.5 MigEngineWork and Temp

The remaining directories in $SysReset are MigEngineWork and Temp.  On the system I worked with, both of these were empty.

6. Windows.Old

The windows.old folder is an amazing resource.  Opening this folder is almost like opening the computer before the refresh was even done.  When initially drilling through the folder structure, it appears to resemble exactly that of the previous computer.

The simple breakdown of a few key points and differences looks like this:

6.1 $Recycle.Bin

Within the $Recycle.Bin folder, deleted files that were never emptied from the recycle bin can still be found.  Unlike on the current version of the system, however, the $R file is not displayed with its original file name; it simply appears under its $R name.  The metadata is still present in the matching $I file, and the $R file still contains the data itself.
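Since the original names live in the $I records, they can be pulled back out directly. The following is a minimal sketch assuming the Vista/7/8 (version 1) $I layout of an 8-byte header, 8-byte original size, 8-byte deletion FILETIME, and a UTF-16LE original path; the file name in the example is made up.

```python
# Minimal sketch: recover the original path, size, and deletion time from a
# $I record in $Recycle.Bin. Assumes the Vista/7/8 (version 1) layout:
# 8-byte header, 8-byte original size, 8-byte deletion FILETIME, then a
# UTF-16LE original path. (Windows 10 later changed this layout slightly.)
import struct
from datetime import datetime, timedelta

def parse_i_file(path):
    data = open(path, "rb").read()
    version, size, filetime = struct.unpack_from("<QQQ", data, 0)
    # FILETIME = 100-nanosecond intervals since 1601-01-01 UTC
    deleted = datetime(1601, 1, 1) + timedelta(microseconds=filetime // 10)
    original_path = data[24:].decode("utf-16-le", errors="ignore").split("\x00", 1)[0]
    return version, size, deleted, original_path

if __name__ == "__main__":
    # Example file name; real $I names pair with a matching $R file.
    ver, size, deleted, original = parse_i_file(r"$IQX2Z0A.docx")
    print(f"version={ver} size={size} deleted={deleted} original={original}")
```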

6.2 System Volume Information

This folder cannot be found in the windows.old directory, only under the new install.

6.3 Users

A majority of the data in each user’s directory can be recovered from a refreshed machine.

Internet history is preserved and can be found here.  Primarily, with Windows 8, we will be looking in a variety of places, including WebCacheV24.dat and the IndexedDB directory within <user>\appdata\local\microsoft\internet explorer.  Other Internet-related activity, such as TypedURLs and TypedURLsTime results from NTUSER.dat, can be recovered as well.

An interesting key can be found in NTUSER.dat\software\microsoft\windows\currentversion\settingsync.  At this location is a registry value labeled LastLocalTimeChange.  This value is displayed in big endian hex format.

When run through DCode, the value in the above picture yielded Wed, 20 June 2012 16:41:10 UTC.  This could, at the very least, help place the computer at a specific date and time.

Because I was logged into the system under a Microsoft Live account, however, I would be curious to see if this key exists when only a local account has been used and the Microsoft Sync was not occurring.
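For completeness, the value can also be pulled out of an exported NTUSER.DAT programmatically. Below is a minimal sketch using the third-party python-registry module; the hive path is only an example, the key and value names are those reported above, and decoding the raw bytes as a 64-bit FILETIME is shown in the sketch in section 7.

```python
# Minimal sketch: read the LastLocalTimeChange value from an exported NTUSER.DAT
# with the third-party python-registry module and print its raw contents.
# The hive path is an example; the key and value names are as reported above.
from Registry import Registry  # pip install python-registry

hive = Registry.Registry(r"C:\Windows.old\Users\Ethan\NTUSER.DAT")
key = hive.open("Software\\Microsoft\\Windows\\CurrentVersion\\SettingSync")
value = key.value("LastLocalTimeChange")
raw = value.value()
# Binary values come back as bytes; print them as hex so they can be decoded
# as a FILETIME (see the sketch in section 7).
print(value.name(), raw.hex(" ") if isinstance(raw, bytes) else raw)
```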

Each user’s desktop contains an HTML file named Removed Apps.  Opening this file shows all removed applications that were installed by a third-party vendor on the machine.

All of the users’ downloads, pictures, videos, and music are also left untouched and intact in their native folders.

Taking a glance at where Windows 7 stored jump list information, c:\users\<user>\appdata\roaming\microsoft\windows\recent\automaticdestinations, I was quickly able to find the same information.  Many of the pieces of information listed in here came from recent locations that were touched, including websites, downloaded files, and pictures.

All UserAssist information can be recovered from the windows.old directory as well.

Other items, such as Open/Save MRUs, LNK files, the RunMRU, and the Last Visited MRU, were all found in the same locations as in Windows 7.

6.4 Windows

Before diving into registry hives, the first thing I checked for was the existence of event logs.  Fortunately, all of the computer’s event logs can be located within system32\winevt\logs.  Simply exporting them and viewing them in your preferred event log viewer is all that is necessary.
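If a full viewer is overkill, the exported logs can also be walked with the third-party python-evtx module. A minimal sketch, with an example path, is below.

```python
# Minimal sketch: dump records from one of the event logs preserved under
# windows.old\Windows\System32\winevt\Logs using the third-party python-evtx
# module. Any event log viewer works just as well; the path is an example.
from Evtx.Evtx import Evtx  # pip install python-evtx

with Evtx(r"C:\Windows.old\Windows\System32\winevt\Logs\System.evtx") as log:
    for record in log.records():
        # Print the record timestamp and the start of its XML payload.
        print(record.timestamp(), record.xml()[:120])
```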

A quick glance at the registry also reveals all of the USB devices that were plugged in.  A quick search for setupapi.dev.log provided results as well, and allowed me to determine the first time USB drives were plugged into the system.

Other artifacts, such as prefetch files, system information (e.g., timezone info), and network history information, can still be found in the same areas as in Windows 7.

7. Reset vs. Data Generation

Upon first glance at a system that has undergone either of the reset functions, it would appear that not much information can be located.  Unlike the Refresh function, which leaves two folders full of information ($SysReset and Windows.old), a reset machine appears as though it is a fresh image.  While looking through the various folders, however, I was able to come across important artifacts.

The recovery volume of the system appears to be relatively untouched by the resets done to the computer.  As shown below, the MFT, $Bitmap, and other important system files were, for the most part, created and last written to prior to the reset of the computer.

Only a few other pieces of evidence were found that tie the computer to a previous date.  Internet history within WebCacheV24.dat can still be located from before the machine was reset.

As noted previously, 36 bytes prior to the Visited: Ethan@http/website (highlighted in blue) is the timestamp of the visit in big-endian format.  The decoded value of this (DC 60 82 DF 4C 4A CD 01 – big endian) is equal to Thu, 14 June 2012 16:43:49 UTC.  The computer itself was reset on Wednesday, June 20th, 2012.
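The same conversion is easy to reproduce without DCode. Below is a minimal sketch that treats the eight bytes quoted above as a standard Windows FILETIME in its on-disk (little-endian) byte order, which yields the same 14 June 2012 16:43:49 UTC value.

```python
# Minimal sketch: decode the eight bytes quoted above as a Windows FILETIME
# (100-nanosecond intervals since 1601-01-01 UTC). Read in on-disk
# little-endian order, DC 60 82 DF 4C 4A CD 01 decodes to the same
# Thu, 14 June 2012 16:43:49 UTC value reported by DCode.
import struct
from datetime import datetime, timedelta, timezone

def filetime_bytes_to_datetime(raw):
    (filetime,) = struct.unpack("<Q", raw)  # little-endian 64-bit integer
    return datetime(1601, 1, 1, tzinfo=timezone.utc) + timedelta(microseconds=filetime // 10)

print(filetime_bytes_to_datetime(bytes.fromhex("DC6082DF4C4ACD01")))
# -> 2012-06-14 16:43:49 UTC (with sub-second precision appended)
```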

Along with this discovery, the history.IE5 folder contains subdirectories from dates prior to the reset, yet they all still contain empty container.dat files.

Exporting WebCacheV24.dat and parsing it with EseDbViewer presented user browsing information as well, dating back to the creation of the virtual machine.

Besides these pieces of evidence, there isn’t much that can pin the machine back to before the reset date; at least, not much that I found.  Running log2timeline, however, did provide some information about the system prior to the reset.

Most of the information parsed by log2timeline unfortunately just related to the system recovery partition, and more or less showed that a system existed before the reset occurred.

With the exception of these few artifacts, not much else has been found that can be recovered.  Although it is somewhat disappointing, the fact that a chunk of Internet history still exists is amazing.  Log2timeline giving us insight into the fact that the system existed on a certain date is also helpful in the grand scheme of a timeline.

Both the quick and thorough resets left behind the same traces of data.  One function did not outperform the other in terms of data deletion, even when it came to WebCacheV24.dat.

8. Conclusion

It appears that a system that was simply refreshed can still provide a plethora of evidence to an investigator.  Seemingly everything about the machine pre-refresh can be recovered, and is conveniently placed into a nifty folder named windows.old.  Information in regards to the migration process itself, old mappings versus new mappings, and the exact date and time of the refresh can be found by examining the $SysReset folder and checking the specific log and xml files within.

All in all, let’s hope that people choose the Refresh option if they use any of the three features, or that other artifacts are left behind once more user activity occurs on the computer.

Keep in mind, too, that all of this testing was done on the Release Preview.  Although I doubt things will change drastically when the release itself hits in a couple of months, it is possible some artifacts may change.

About the Author:

Ethan Fleisher is a senior majoring in Computer and Digital Forensics at Champlain College. Originally from Carlisle, Pennsylvania, Ethan currently works as a Forensic Intern and System Administrator at the Senator Patrick Leahy Center for Digital Investigation, where he is involved in forensic analysis for real-life investigations, network and system administration, and forensic research. Ethan has spent close to the last year researching the Microsoft Windows 8 OS with a focus on revealing new artifacts and attempting to confirm previous methodologies.

(Guest post provided by Ethan Fleisher. Original article can be found at the author’s blog dig4n6.blogspot.com.)

Hakin9 Exploiting Software SamuraiWTF Toolkit

A new issue of Hakin9 Exploiting Software is out!

Diving Through SamuraiWTF Toolkit – Massive article on setting up and using SamuraiWTF, the web pentesting Ubuntu distro.

Penetration Testing LAB Setup Guide – Exceptional article on setting up a kickin network test lab by Jeremiah Brott. I normally use physical machines or VMWare virtual machines, but in this article Jeremiah covers setting up an awesome lab using VirtualBox and PFSense. I now use this setup regularly – it works fantastic.

Web Filtering with Websense. To be or not to be filtered: that is the dilemma – Great article on Websense, the web filtering program, and also a good look at why your company needs web filtering.

Malware, a cyber threat increasingly difficult to contain – I haven’t read this article yet, but I have read a lot of Pierluigi Paganini’s material. He is an exceptional writer and security expert.

Also in this issue:

  • Burp Suite Automating Attacks By Ric Messier
  • Memory Levels Gate Mitigation By Amr Thabet
  • Anti-Rootkits in the Era of Cyber Wars By Igor Korkin
  • Password Construction and Management By Gaurav Kumar
  • Picking Up Mushrooms in the Rain Forest – Social Engineering Information Gathering By Vlad Styran

Subscribe to Hakin9 Exploiting Software now!

The Deep Web vs Network Security Monitoring

We have all heard the horror stories of the Deep Web. You know, the evil internet underground where cyber criminals and sexual predators lurk. Where boogiemen and anarchists trade secret coded messages through encrypted channels.

But is it really that bad?

Into the Void

The “Deep Web”, also called the Dark Web or hidden internet, is a massive collection (some say up to 500 times the size of the regular internet) of sites and databases that don’t show up in standard search engines like Google. One of the easiest ways to connect to this network is via Tor, which provides data encryption and anonymity. There are several Deep Web search engines and portals that are only accessible through Tor. They have long cryptic names that usually end in “.onion”.

Does the dark web stand up to its dark-side nomenclature? Absolutely! View any of the portal entrance menus and you’ll instantly know that you are not in Kansas anymore. Criminals, hitmen, drug dealers and others openly ply their trade. And don’t even bother putting normal “G-rated” terms into a Deep Web search engine. It most likely won’t return a result, or it will return a very deviant take on what you typed in.

So, is this a place that you want ANYONE on your corporate network to visit?

NO WAY.

Though many use Tor for legitimate purposes, the deep web just isn’t that kind of place. But what can you do?

Enter Network Security Monitoring!

You do have a network monitoring system, don’t you? If you don’t have a web proxy to control and block suspicious traffic, you can still use your network security monitoring system to catch Tor traffic.

As a test, I downloaded Tails, the Linux distro that comes all wired up to run Tor out of the box. To its credit, it is by far one of the fastest Tor implementations that I have seen. Surfing normal websites and searching with Google was relatively quick, unlike the Tor performance I am used to on my Ubuntu or Windows systems.

I visited a couple of the “Deep Web” portals and even used the Torch search engine. Other than the portals being painfully slow to access, I was actually able to find some legal material to use as a test! I grabbed some hardware “how-to” images and a couple of goofy .pdf files.

I then pulled up my security server console to see if it had caught anything:

It sure did! I received several alerts concerning my trip into the void. The traffic tripped several “known Tor node” rules. The Tails system’s IP address is listed along with the rule alerts. A security analyst monitoring this network could easily tell which corporate system was using the Tor network, and when it was used.
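The same kind of check can be scripted directly against the capture. Below is a minimal sketch using scapy; the pcap name and the tor_nodes.txt list (one relay IP per line, taken from whatever Tor node list your monitoring platform maintains) are hypothetical example inputs.

```python
# Minimal sketch: flag traffic to known Tor relays in a packet capture, roughly
# what the "known Tor node" IDS rules above are matching on. The pcap file name
# and tor_nodes.txt (one relay IP per line) are hypothetical example inputs.
from scapy.all import rdpcap, IP  # pip install scapy

tor_nodes = {line.strip() for line in open("tor_nodes.txt") if line.strip()}

for pkt in rdpcap("tails_session.pcap"):
    if IP in pkt and pkt[IP].dst in tor_nodes:
        print(f"{pkt[IP].src} -> {pkt[IP].dst}  (known Tor node)")
```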

For further analysis, I grabbed the network packet capture for the session and imported it into my NetWitness Investigator program. It too detected the Tor traffic:

It didn’t throw an alert, though, which I really thought it would. Suspicious traffic usually shows up at the top of Investigator, under “alerts”.

I did notice something else that bothered me. To be extra sure, I ran the packet capture through both Xplico and NetworkMiner. The results from these backed up my initial findings.

There were no pictures… or text documents… or PDF files… found in the packet capture.

As a matter of fact there was 0% detected unencrypted text. Yikes!

With just standard packet capture and detection, without SSL decryption, there would be no way to determine what was viewed or downloaded from the Tor network or, worse, the Deep Web.

Conclusion

The Tor network creates an encrypted channel from your system to the Tor onion routers. The data is bounced around several servers and then decrypted at the exit nodes, when the packets leave the Tor network. Though some businesses use Tor for legitimate purposes, most don’t use it at all. If your corporate users are accessing the Deep Web from work, this could open up your network to a multitude of malicious threats. And if they are downloading questionable, illegal, or copyrighted material, this could put your corporation at legal risk.

Record and monitor ALL of your network traffic. This could help you detect issues before they become major problems. Block or monitor suspicious SSL traffic on your network. You may capture bot command-and-control communication or someone using your network for less-than-legal purposes.