Tuesday, January 30, 2018

Several minor updates to buatapa!

Hello again readers and welcome back! I am pleased to announce that today there is a brand new, updated version of buatapa! Over the past several months I've had requests for better in-script feedback on how buatapa processes the results of Autoruns, but I just have not had the free time to sit down and implement them. The new version is a little more "wordy", as it tries its best to help the user when there are processing problems. For example, if you did not run Autoruns with the needed flags, buatapa will recognize that from the output file you are running and suggest you run it again. For those on Mac (and maybe a few *nix systems), it also tells you if you do not have the proper permissions to access the Autoruns output file.
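As a point of reference, CSV output suitable for processing is generated by running the command-line version of Autoruns with flags along these lines (this is an example invocation using documented autorunsc switches; check the buatapa documentation for the exact set it expects):

autorunsc.exe -accepteula -a * -c -h -s > autoruns_output.csv

Here -a * collects all entry types, -c produces CSV output, -h computes file hashes, and -s verifies digital signatures.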

There are also some slight changes to the interior processing and a little better logic flow. All in all, buatapa has held up quite well since the early testing nearly three years ago, and hopefully it remains a useful tool for triaging Windows systems within your environment.

If you have any questions or encounter any bugs/issues, please do not hesitate to reach out! - download here 

MD5: 8c2f9dc33094b3c5635bd0d61dbeb979
SHA-256: c1f67387484d7187a8c40171d0c819d4c520cb8c4f7173fc1bba304400846162
Version 0.0.7
Updated: January 30, 2018
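If you would like to verify your download against the hashes above, Windows ships with certutil, which can compute both (the file name here is just an example; use whatever name your download saved as):

certutil -hashfile buatapa.zip MD5
certutil -hashfile buatapa.zip SHA256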

Tuesday, December 26, 2017

Amazon Alexa Forensic Walkthrough Guide

Hello again readers and welcome back! We are working on wrapping up 2017 here at BriMor Labs, as this was a very productive and busy year. One of the things that Jessica and I have been meaning to put together for quite some time was a small document summarizing the URLs to query from Amazon to return some of the Amazon "Echo-system" data.

After several months, we (cough cough, Jessica) finally were able to find the time to put it together and share it with all of you. We hope that it is helpful during your investigations and analysis, and if you need anything else please do not hesitate to reach out to Jessica or myself!

Alexa Cloud Data Reference Guide

Monday, June 26, 2017

A Brief Recap of the SANS DFIR Summit

Hello again readers and welcome back!! I had the pleasure of attending (and speaking at, more on that in a bit!) the 10th SANS DFIR Summit this past week. It is one conference that I always try to attend, as it always has a fantastic lineup of DFIR professionals speaking about amazing research and experiences that they have had. This year was, of course, no exception, as the two day event was filled with incredible talks. The full lineup of slides from the talks can be found here. This was also the first year that the presenters had "walk-up music" before their talks.

This year, my good friend Jessica Hyde and I gave a presentation on the Amazon "Echo-system" in a talk we titled "Alexa, are you Skynet?". We even brought a slight cosplay element to the talk, as I dressed up in a Terminator shirt and Jessica went full Sarah Connor! One other quick note about our talk that I would like to add: we chose the song "All The Things" by Dual Core as our walk-up music. Dual Core actually lives in Austin, and fortunately his schedule allowed him to attend our talk. It was really cool having the actual artist who performed our walk-up music in attendance!

Jessica and I speaking about the Amazon Echo-system at the 2017 SANS DFIR Summit

We admittedly had a LOT of slides and a LOT of material to cover, but if you have attended any of our presentations in the past, you know the reason our slide decks tend to be long: we want to make sure that the slides themselves can still paint a pretty good picture of what we talked about. This way, even if you were not fortunate enough to see our presentation, you can follow along with the slides, and they can also serve as reference points during future examinations. We received a lot of really great comments about our talk and had some fantastic conversations afterwards as well, so hopefully if you attended you enjoyed it!

My other favorite part of the DFIR Summit is getting to see colleagues and friends that I interact with throughout the year, actually in person and not just as a message box in a chat window! Even though some of us live fairly close to each other in the greater Baltimore/DC area, we fly 1500 miles every summer to hang out for a few days. While in Austin, several of us had some discussions about trying to start some local meetup-type events on a more regular basis, so there definitely will be more on that to follow in the coming weeks!

Thursday, March 9, 2017

How to load a SQL .bak file for analysis, without SQL Server previously installed

Hello again readers and welcome back! I hope that this new year has been treating you well so far! I recently worked a case with an interesting twist that I had never dealt with before, so I figured I would make a blog post about it and share my experiences. I also wanted to document the whole process just in case I have to deal with it again!

The case that I worked involved a SQL Server backup file (with a ".bak" file extension), which was created from a Microsoft SQL Server instance. Loading and parsing a SQL Server backup file is fairly trivial if you have a SQL Server environment, but I do not have one, so I had to come up with a way to process the data.

Edited March 10, 2017 - The reddit user fozzie33 made a fantastic point that I did not specify in this particular post. I was working from a copy of the data that was originally provided, but it is best to change the attributes to read-only in an effort to ensure the raw data itself does not change. In any forensic investigation you should always work from a copy of the data and never the original, but changing the attributes to read-only is another step one should take to limit any changes to the data, even if it is a working copy!

I followed a total of nine steps to accomplish analysis of the backed up SQL database:

1) Download SQL Server 2016 SP1 Developer edition
2) Download Microsoft SQL Server Management Studio
3) Copy executables to flash drive
4) Copy executables to offline system
5) Install SQL Server
6) Install SSMS
7) Launch SSMS & restore the SQL database
8) Make your SQL queries using SSMS
9) Great success! High five!

Step 1: Download SQL Server 2016 SP1 Developer edition

Hopefully you have a Microsoft Developer Network account; if not, pop over to the MSDN page and sign up for one, it is free and quite easy to do. Once you are logged in, you can download the SQL Server 2016 SP1 Developer edition. The reason for using this version, compared to the Express version, is that the Express version limits the size of your database to 10GB. If you know your database is going to be smaller than that, you can definitely use the Express version, but I prefer the Developer edition just to be sure I can handle the database regardless of its size.

IMPORTANT NOTE: The license of the Developer edition explicitly prohibits using "Production data". While the backup file is indeed "Production data", I recommend installing the needed items and processing all of the data on a completely offline machine, and when you are finished with the analysis, completely uninstalling everything from your system. My personal take on the EULA is that Microsoft does not want you to use the Developer edition to power an online database backend, as they of course want you to purchase a license for that. My opinion is that performing offline analysis of a SQL Server backup file is well within the limitations of the Developer license, but if you have any questions about the legality of the issue please consult proper legal counsel, as I am not a lawyer nor did I stay at a Holiday Inn Express last night!

To download the files for your offline machine, first choose the "SQL Server 2016 Developer Edition Download" option. 
Choose the "SQL Server 2016 Developer Edition Download" option

The download page will load, then choose the "SQL Server 2016 Developer with Service Pack 1" option.

Choose the "SQL Server 2016 Developer with Service Pack 1" option

You will be presented with an option to download the .iso, or you can use the "Click here to utilize the SQL installer." option, which will download a file with a name like "SQLServer2016-SSEI-Dev.exe". This installer will let you download the files so you can install everything on your offline machine.

Choose "Click here to utilize the SQL installer." option

The file "SQLServer2016-SSEI-Dev.exe" was downloaded

When you run the program, you will be presented with a screen containing three options. We are going to select the "Download Media" option, as we want to install it on another machine.

Choosing the "Download Media" option
On the next screen we will be presented with the option to download the ISO or the CAB. We want the CAB option as it will be easier to install on another Windows machine, so choose the "CAB" option and save it to the download path of your liking, then click the "Download" button.

Choose "CAB" option

The download will take a few seconds (or minutes, depending on your ISP) and there will be a friendly new screen informing you that the download is finished upon completion.

Congratulations, the download is now complete!

When the download is complete, you should have the files "SQLServer2016-DEV-x64-ENU.box" and "SQLServer2016-DEV-x64-ENU.exe" saved in your directory:

The files "" and "SQLServer2016-DEV-x64-ENU.exe" in the download folder

Step 2: Download Microsoft SQL Server Management Studio

The Microsoft SQL Server Management Studio (SSMS) allows you to interact with data from the SQL database in a fairly easy, straightforward manner. Even if you have very limited experience dealing with data from SQL, you can pretty easily start to navigate your way through with some of the built-in options in SSMS.

Choose "Download SQL Server Management Studio" option

There should be a file with a name similar to "SSMS-Setup-ENU.exe" now saved in your downloads folder.

SSMS-Setup-ENU.exe saved in the "Downloads" folder

Step 3: Copy executables to flash drive

The filenames themselves may change based on exactly when you download them, but you should now copy the two SQL Server installation files (.box and .exe) and the SSMS installation file to a flash drive so you can transfer them to your offline system.

Files copied to flash drive for offline system

Step 4: Copy executables to offline system

Although you can install it directly from the flash drive, in my experience it is always better to copy the needed files to your offline system. 

Files copied to offline system

Step 5: Install SQL Server

The first thing we are going to do is install SQL Server on our offline system. When you double-click the file, you are greeted with a popup asking for the directory in which you wish to save the extracted files. I just left this as the default option and clicked "OK".

Choose the directory for extracted files

You will see a file extraction progress bar.

File extraction progress

When that is done, you will see a new window titled "SQL Server Installation Center". We are going to install SQL Server on our system, so click on the Installation link.

Choosing the Installation link

There are several options that are presented here, but we are only interested in the first one, labeled "New SQL Server stand-alone installation or add features to an existing installation".

Choose to install a new SQL Server instance

Once you click that option, you will see the installation screen. Because we have the Developer edition, there is no need to enter a product key, so just click Next.

"Product Key" screen

Check the box on the next screen next to "I accept the license terms".

Accept the license terms (you did read all the way through it, right?)

Your system is offline, so there is no need to check the box about using Microsoft Update; just click Next.

Our system is offline, so this does not apply

Again, because the system is offline, you will see an error message saying it could not search for updates. This is fine, so just click Next.

Looks bad, but it is ok as our system is offline, so this is fine!

You should now see a screen labeled "Install Rules" that should list a couple of passed items and a couple of failed items. The .NET Application security should have a warning because the system is offline. However, depending on your system settings, the Windows Firewall may generate a warning because it is on, or it may pass because it is off.

"Install Rules" status

You should now see a screen labeled "Feature Selection". With this you can choose to install everything, but in my limited testing just selecting "Database Engine Services" should be enough. You can also choose where to install the files, but again the default(s) should be sufficient.

Feature Selection. Select as little, or as much, as you would like!

It may take a few minutes, but when it is finished you will see a screen labeled "Instance Configuration". You can choose whatever options that you would like, but I personally prefer to leave the default options again.

"Instance Configuration"

It may take a few minutes, but when it is finished you will see a screen labeled "Server Configuration". You can choose different options of course, but again I prefer to leave the defaults.

"Server Configuration"

Next you should see a screen labeled "Database Engine Configuration". I prefer to just leave "Windows Authentication Mode" checked. You must also choose an account (or accounts) for the SQL Server administrator; the easiest option is to click the "Add Current User" button and it will populate. Once that is finished, click Next.

Database Engine Configuration. Don't forget to add a SQL Server administrator!

Now that ALL that work is done, you should see a screen that resembles a tree hierarchy. Now you can click the Install button and install your SQL Server instance! This will probably take some time, so be patient!

Ready to install at last!!

Once that is finished, you should see a screen that is labeled "Complete" and several options should all say "Succeeded" next to them. You can now click "Close".

All done!!
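As an aside, if you ever need to repeat this setup on several analysis machines, the extracted SQL Server setup also supports unattended installs from the command line. A minimal sketch using documented setup parameters (the instance name and account value here are example placeholders):

SETUP.EXE /Q /ACTION=Install /FEATURES=SQLEngine /INSTANCENAME=MSSQLSERVER /SQLSYSADMINACCOUNTS="YourMachine\YourUser" /IACCEPTSQLSERVERLICENSETERMS

The /Q switch runs setup quietly, /FEATURES=SQLEngine mirrors the "Database Engine Services" selection above, and /SQLSYSADMINACCOUNTS plays the same role as the "Add Current User" button.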

Step 6: Install SSMS

Now, despite there being an "Install SQL Server Management Tools" option under the Installation section of the SQL Server Installation Center, that option simply opens a web page and tries to download the installer, which means you need an internet connection to use it. That is exactly why we downloaded SSMS separately and have it on our offline system ready to install!

To begin the process, double click the executable, and you should see a screen with "Microsoft SQL Server Management Studio" on it. All we have to do here is click "Install".

Installation screen for SSMS

You should see a screen showing packages loading; the installation process will likely take a few minutes.

Packages are loading, this may take a bit!

Once the installation is "complete", you will have to restart the system in order for the installation to "complete" (because it is Windows, after all!)

Installation is complete, but we have to restart to complete the installation. Huh??

Step 7: Launch SSMS & restore the SQL database

Now that SQL Server and SSMS are both installed on our system, we can launch SSMS. Navigate to Program Files and launch the executable. 

Getting ready to launch SSMS for the first time!

There may be a brief loading screen for user settings, then you should see the SSMS console, complete with the "Connect to Server" window.

SSMS main console

All you should have to do is click the "Connect" button, and you should see a tree view of options in the "Object Explorer" window.

The "Object Explorer" window is populated

We are interested in the "Databases" option, since we are going to be restoring a database from a backup file. Right click on the "Databases" folder and choose the "Restore Database" option.

Choose the "Restore Database" option

Now we will get a new popup window that is labeled "Restore Database".

The "Restore Database" popup window

We are going to choose the "Device" option under "Source", then click on the box with the three dots.

Tick "Device", then click the box with three dots

This brings up a new window titled "Select backup devices". Our "Backup media type" will be "File", and we will click the "Add" button to add our .bak file. (PRO TIP: Saving the .bak file at the root of a drive, like "C:\", makes it much easier to find and navigate to.) Select the file and then click "OK".

Click the "Add" button
Browse to the folder containing the .bak file

Now the "Select backup devices" window should be populated with our backup file. As long as it appears properly in the box, click "OK".

Select backup devices is now populated!

There will be a pause as the system processes the information, and you should see the box under "Backup sets to restore" populate with information. As long as it populates properly, you can click "OK".

The fields are populated, so we can click OK and let the backup restore process start!

The restore will take some time depending on the size of the database, but once it is done, the database will be fully loaded and we can start to make our queries!

The restore has been completed!
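As a side note, the same restore can be performed from a query window instead of the GUI. A minimal T-SQL sketch, assuming the backup sits at "C:\backup.bak" and contains a database we will call CaseDB (all names here are placeholder examples):

-- First, list the logical file names contained in the backup
RESTORE FILELISTONLY FROM DISK = 'C:\backup.bak';
-- Then restore, moving the data and log files to a directory of your choosing
RESTORE DATABASE CaseDB FROM DISK = 'C:\backup.bak'
WITH MOVE 'CaseDB_Data' TO 'C:\SQLData\CaseDB.mdf',
     MOVE 'CaseDB_Log' TO 'C:\SQLData\CaseDB_Log.ldf';

The logical names used in the MOVE clauses must match whatever RESTORE FILELISTONLY reports for your particular backup.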

Step 8: Make your SQL queries using SSMS

Once the database is loaded, you will see it under the "Databases" folder.

The database, seen under the databases folder

You can expand the database and see all of the associated information, but more than likely "Tables" is going to be the main area that you focus on.

Some of the tables in this database. There are SOOO many tables!

Thanks to the power of SSMS, you can actually use some of the preconfigured queries to get you started!

Some of the options. "Select Top 1000 Rows" is your friend!

You can select the top 1000 rows, and then build out your specific queries accordingly, however you would like!

The results of selecting the top 1000 rows from this particular table
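For reference, the "Select Top 1000 Rows" option simply generates and runs a basic query that you can then edit. A sketch of what that looks like, with hypothetical database, table, and column names:

-- What SSMS generates (names are examples)
SELECT TOP 1000 *
FROM [CaseDB].[dbo].[Users];

-- From there, narrow the results with WHERE clauses as needed
SELECT TOP 1000 [username], [last_login]
FROM [CaseDB].[dbo].[Users]
WHERE [last_login] >= '2017-01-01'
ORDER BY [last_login] DESC;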

Step 9: Great success! High five!

I definitely hope that this rather lengthy blog post helps in the event that you ever find yourself in a similar situation. It is of course much easier to get data from whatever database front end is available, but if you can only get a backup of the raw database, it takes some time and research to build up good queries to find the information that you are after!

Monday, December 12, 2016

Live Response Collection - Bambiraptor

Good news everyone!! After a fairly busy year, the past few weeks I have finally had enough down time to work on adding some long overdue, and hopefully highly anticipated, features to the Live Response Collection. This version, named Bambiraptor, fixes some of the small issues that were pointed out in the scripts, including:

1) Making it a little more pronounced that the collection uses the Belkasoft RAM Capture tool, with an additional file created in both the 32 and 64 bit folders, at the request of the great folks over at Belkasoft
2) Fixing the autoruns output, which was previously saved as the csv file twice rather than one csv and one easy-to-read text file
3) Additional logic to ensure that the "secure" options actually secure the data
4) A couple of minor text fixes to the output

The biggest change is on the OSX side though, so without further ado, we shall dive into that!

The biggest change on the OSX side is the addition of automated disk imaging. It uses the internal "dd" command to do this, so be aware that if you suspect your system may be SEVERELY compromised, this may generate non-consistent output. If that is the case, you should probably be looking at a commercial solution such as BlackBag's MacQuisition to acquire the data from the system. Remember, the Live Response Collection is simply another tool in your arsenal, and while it does have some pretty robust capabilities, always be sure that you test and verify that it is working properly within your environment. I have tried my best to ensure that it either works properly or fails, but as there are different flavors of Mac hardware and software, it gets harder and harder to account for every possibility (this, along with the fact that I see way more Windows systems than OSX/*nix systems in the wild, is why my development plan is Windows first, followed by OSX, followed by *nix).
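For context, the imaging the script automates is conceptually similar to running dd by hand, along these lines (the device identifier and output path are examples; the script determines the appropriate values when it runs):

sudo dd if=/dev/disk0 of=/Volumes/External/evidence.dd bs=1m conv=noerror,sync

Here conv=noerror,sync tells dd to keep going past read errors and pad the bad blocks, which is generally what you want when imaging a possibly flaky drive.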

With the addition of the disk imaging, there are now a total of three scripts that you can choose to run on an OSX system. They are self explanatory, just like on the Windows side. However, unlike the Windows side, you MUST run the script with super user privileges (i.e. with sudo), or else the memory dump & disk imaging will not occur. The Windows side is set to run automatically as Administrator as long as you click the proper pop ups; OSX, to my knowledge, does not have this option.

I have purposely held off on releasing "secure" options on the OSX side because I want quite a bit more real-world testing to hopefully identify and eliminate any bugs before the scripts start securing the data automatically. The reason for this is, again, that it is more difficult to account for small changes that can have a big impact on the OSX side, and I want to ensure the script(s) are working as reliably as possible before encrypting and securely erasing collected data. I don't want anyone to have to run the process(es) more than once because one system does not understand a single quotation mark compared to a double quotation mark.

I hope you have a chance to use the Live Response Collection, and as always, if you identify any issues with it, if you find any bugs, or if there are any additional features you would like to see added, please let me know. The roadmap for next year includes rewriting portions of the OSX script to better adhere to bash scripting security guidelines, adding secure options to the OSX side, and adding memory dump & automated disk imaging to *nix systems, as well as continuing to add updates and features to the scripts as needed and/or requested. - download here

MD5: 8603e36be474e8b69c652e5dc86adc2e
SHA-256: ec79422ce2e7218a7bc57b0caf52a5eae2eca98810ac466dddac1115aade493e 

Updated: December 12, 2016

Friday, October 28, 2016

Public release of "allyouruarecordarebelongtous" Perl script

Hello again readers and welcome back! This blog post is going to be short, as the primary purpose is to publicly announce a new script, cleverly titled "allyouruarecordarebelongtous", which I presented in my "Who Watches The Smart Watches" presentation at OSDFCon on October 26. This Perl script will allow the user to parse out data from SQLite databases associated with Under Armour Record stored on an Android device and present that information in an easy to read format. Please let me know if you have any questions or comments about the script.

If you would like to see the slides from my OSDFCon presentation, you can view them here.

The script itself can be found on our github page:

Please note, in order to run the script you may have to install some Perl modules. On a Windows system, to do this open a command prompt and paste the following command:

ppm install DBI DBD::SQLite DateTime IO::All

On an OSX/*nix system, open a terminal window and paste the following command:

sudo cpan DBI DBD::SQLite DateTime IO::All

Additionally, I would very much like to thank Jessica Hyde for helping me generate some test data and helping with code review and script output formatting. There is no way I would have been able to put this all together in 2 1/2 weeks without her help!

Friday, June 24, 2016

Public release of "allyourpebblearebelongtous" Perl script

Hello again readers and welcome back! This blog post is going to be fairly short, as the primary purpose is to publicly announce a new script, cleverly titled "allyourpebblearebelongtous". This Perl script will allow the user to parse out data from a SQLite database associated with Pebble data stored on either an iOS or Android device, and present that information in an easy to read format. Please let me know if you have any questions or comments about the script.

If you would like to see the slides from my SANS presentation, you can view them here.

Parsed notifications from Android device

Parsed notifications from iOS device

The script can be found on our newly created github account:

Please note, in order to run the script you may have to install some Perl modules. On a Windows system, to do this open a command prompt and paste the following command:

ppm install DBI YAML DBD::SQLite Data::Plist DateTime IO::All

On a Linux system, open a terminal window and paste the following command:

sudo cpan DBI YAML DBD::SQLite Data::Plist DateTime IO::All

Additionally, I would like to thank Adrian Leong, Mari DeGrazia, and Heather Mahalik for their help in gathering and testing the collected data.