
Tuesday, July 17, 2018

Let's Talk About Kext


Hello again readers and welcome back! Today's blog post is going to cover some of the interesting things I found poking around MacOS while developing updates to the Live Response Collection. First off, I have to offer my thanks to Sarah Edwards for taking the time to talk about what she has done with regards to the quirkiness ("official technical term") regarding MacOS, System Integrity Protection ("SIP"), kernel extensions, and everything else that completely derailed my plans for pulling data from a Mac!


Our story begins by trying to diagnose some errors that I noticed while trying to perform a memory dump on my system using osxpmem. The errors were related to my system loading the kernel extension MacPmem.kext, which resulted in the error message "/Users/brimorlabs/Desktop/Cedarpelta-DEV/OSX_Live_Response/Tools/osxpmem_2.1/temp/osxpmem.app/MacPmem.kext failed to load - (libkern/kext) system policy prevents loading; check the system/kernel logs for errors or try kextutil(8)." Even though I was running the script as root, for some reason the kext was failing to load.


That weird error message is weird

The Live Response Collection script has always changed the owner of the kernel extension, so I knew that ownership was not the problem, which left me in a bit of a bind. Fortunately the tool "kextutil" is included on a standard Mac install, so I hoped that running that command could shed some light on my issues. The results from running kextutil were mostly underwhelming, with the exception of .... what the heck is that path? 
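If you want to run the same diagnosis yourself, the invocation is roughly this (a sketch; point it at wherever your copy of the kext actually lives):

    # -t prints diagnostics, -n performs the checks without actually loading, -v adds verbosity
    sudo kextutil -t -n -v /path/to/osxpmem.app/MacPmem.kext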


Output from kextutil. What is "/Library/StagedExtensions/Users/brimorlabs/Desktop/Cedarpelta-DEV/OSX_Live_Response/Tools/osxpmem_2.1/osxpmem.app/MacPmem.kext" and why are you there, and not the folder you are supposed to be in?


The actual path on disk was "/Users/brimorlabs/Desktop/Cedarpelta-DEV/OSX_Live_Response/Tools/osxpmem_2.1/osxpmem.app/MacPmem.kext", but for some reason the operating system was putting it in another spot. OK, that seems really weird, so why is my system doing stuff that I don't specifically want it to do? Oh Apple, how very Python of you! :)
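A quick way to see the redirection for yourself is to list both locations (a sketch using the paths from my system; note the StagedExtensions tree is itself SIP-protected, so the second listing may be refused on a live box):

    # where the kext actually sits on disk
    ls -l /Users/brimorlabs/Desktop/Cedarpelta-DEV/OSX_Live_Response/Tools/osxpmem_2.1/osxpmem.app/
    # the copy MacOS staged on its own
    sudo ls -lR /Library/StagedExtensions/Users/brimorlabs/Desktop/Cedarpelta-DEV/OSX_Live_Response/Tools/osxpmem_2.1/osxpmem.app/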


osxpmem/MacPmem.kext related files under the "/Library/StagedExtensions" path


As it turns out, thanks to quite a bit of research, the "/Library/StagedExtensions" folder is, in very basic terms, the sandbox in which MacOS puts kernel extensions that it does not trust, as a function of SIP. Now, if you were presented with the "Do you trust this extension" prompt ...


This is what the "System Extension Blocked" popup looks like. This is NOT the popup you see with osxpmem


... then, if you navigate to "Security & Privacy", click on the "General" tab, and click "Allow" ...


Security & Privacy - "General". Note the "System software from developer REDACTED was blocked from loading" and the "Allow" button

It would (ok, *should*) then stop the symbolic linking (that is what I assume is happening, although that is not confirmed yet) from the original folder to the StagedExtensions folder that allows the sandboxing/SIP to occur. That means the kernel extension would then be able to run, and the world would be a glorious place. Except.....it seems that once a developer/company has a signed kext approved this way, which allows the bypassing of SIP, EVERYTHING signed by them in the future will also, automatically, be trusted. Obviously this could present a security issue down the road if those signing certificates were ever stolen. I don't know of that happening yet, but it certainly seems plausible.
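If you are curious which developers/Team IDs a given box has already approved, those user-approval decisions are recorded in the kext policy database. This is only a sketch, and an assumption on my part that your system keeps the database in the usual spot; with SIP enabled you may only be able to read it from a forensic image or from Recovery:

    # each row ties a Team ID/bundle ID to an allow decision
    sudo sqlite3 /var/db/SystemPolicyConfiguration/KextPolicy \
      "SELECT team_id, bundle_id, developer_name, allowed FROM kext_policy;"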



I've tried a couple of workarounds to bypass the SIP process so that I can dump memory from a system without having to go through all of the bypass SIP/csrutil steps (if you are unfamiliar with those, please follow the link here). None of my attempts has succeeded yet, but I am still trying. I specifically do NOT want to reboot the computer, because I want to collect memory from the system and not potentially lose volatile data. I will either update this post, or continue this as a series, when I find a sufficient workaround (if there is one) to this issue! With that being said, if you have found a way to dump memory from a live MacOS system that has SIP enabled, and you are able to share it publicly (or privately), please share your methodology. I would love for the next LRC update to be able to include memory dumps from systems with SIP enabled!
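If nothing else, it is worth confirming up front what you are dealing with on the live system before burning time on workarounds; csrutil ships with MacOS, so this check is safe to run:

    csrutil status
    # "System Integrity Protection status: enabled." means the staging behavior described above applies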

Sunday, June 17, 2018

Who's Down With PTP?


Hello again readers and welcome back! Today's blog post covers a series of (unfortunate) events that I had to work through in order to acquire data from an LG Aristo phone. These methods might also work for other devices, especially ones that are severely locked down, such as those primarily used on pre-paid plans like TracFone. (DISCLAIMER: I am *NOT* claiming that this will work all the time. It seems that tech companies/developers sometimes take shortcuts (*gasp*), which means that devices don't quite function the way they are "supposed" to function.)


Our journey begins with using Magnet Axiom (thanks Jessica!) in an attempt to acquire data, and subsequently process that data, from a stock Android device. Following the very concise, user-friendly prompts, all of the steps were properly taken in an effort to acquire the device. However, the first issue arose when the "Trust this computer" prompt never came up on the Aristo itself. Since I've had many experiences with mobile devices in the past, my first thought was to fire up Android Debug Bridge (adb) to make sure that adb was properly recognizing the device, because if adb can't recognize it, acquisition through just about any commercial tool won't work. Interestingly, choosing the "Charging only" option from the USB options in Developer mode, which is usually the standard in Android device acquisition, results in nothing being recognized in adb.



Charging, the usual method, does not work 
No devices shown in adb
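For anyone who wants to verify this behavior themselves, the check is simply (assuming adb is installed and on your PATH):

    adb devices
    # with "Charging only" selected on the Aristo, the "List of devices attached" output came back empty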


So, the next step was to see if connecting with MTP (Media Transfer Protocol) would allow adb to recognize the device. It is a different protocol, and I know from past experience that sometimes a different protocol means the difference between a tool working or not. When I chose MTP from the Developer Options, I was *finally* presented with the desired "Allow USB debugging" prompt, which also lists the unique computer fingerprint. So...success!!



MTP is picked as the USB connection on the device


Finally, debugging options show up on the device with MTP!


Or not. adb recognizes the device and allows me to send commands, such as making a backup, but what I need is the Magnet agent to be pushed to the device so we can get the user data, such as SMS, contacts, call history, etc. When connected via MTP, it seems the Aristo allows some data to be transferred from the device to a system, but it does not allow data to go from the system to the mobile device. Curses!! Foiled again!!




adb recognizes the device with MTP. Partial success!
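To illustrate the one-way behavior under MTP, this is roughly what I was seeing (the file names here are just placeholders):

    adb devices                          # the device shows up once USB debugging is allowed
    adb backup -all -f aristo_backup.ab  # device-to-computer transfer: works
    adb push agent.apk /sdcard/          # computer-to-device transfer: this is the direction that failed under MTP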


After deliberating, some adb-kung-fu, and using Google to search for additional options, I decided to try using PTP (Picture Transfer Protocol). adb still recognized the device; however, for reasons that COMPLETELY elude me, setting it up this way not only allowed the backup to be performed, but ALSO allowed data (aka the Magnet agent) to be pushed to the device! At last, I finally had success!



Now we choose the PTP option. For some reason, this choice works!!


In Axiom, we choose the ADB (Unlocked) method
Finally! With the PTP connection, AXIOM recognizes the device!
Ready to start processing!
Data acquisition has begun at last!
Our data has been acquired!


Interestingly enough, however, when I completely cleared the Trusted Devices on the mobile device, I could not get the "Trust Connections from this device" prompt to show up using a PTP connection. So, as long as you follow the method below, you *may* be able to get data from a severely locked down mobile device!

0) Get familiar with using adb from the command line. 

It is a free download, and most commercial tools use adb behind the scenes. If you do any work with Android devices, you should know some basics of adb! (A rough command-line version of these steps is sketched just after this list.)

1) Connect the device using the MTP protocol. 

2) When presented with the Trust Connections prompt on the device, choose OK and make sure the "Always allow from this computer" box is checked
3) Change the connection protocol to PTP
4) Acquire the device using Magnet Axiom
5) ....
6) PROFIT!!
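As promised in step 0, here is a rough command-line version of the same workflow (a sketch; the pushed file name is just a placeholder, and Axiom handles the actual acquisition):

    # with MTP selected and the "Always allow from this computer" box checked:
    adb devices                           # the Aristo should be listed as "device", not "unauthorized"
    # switch the USB mode on the handset to PTP, then confirm it is still visible:
    adb devices
    adb push test.txt /sdcard/Download/   # quick sanity check that data can now go TO the device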


One additional note I would like to add is about the Magnet agent when using Magnet Axiom to acquire data from a device. In my opinion, it is very important to choose the "Remove agent from device upon completion" option, found under Settings, when acquiring data from a mobile device. We ran into the issue of an agent being left behind when processing mobile devices during forward deployments. When we had devices associated with high value entities, the final step in data acquisition was interacting with the device to manually remove the agent and acquisition log(s). (NOTE: it was not Magnet's agent, as they did not exist at the time; it was another vendor whom I will not publicly name.) It is entirely up to the end user whether they feel comfortable leaving behind an agent or not. I definitely do not, and will always choose to remove it. I just wanted to specifically point that out to anyone using Axiom to get data from mobile devices!
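If you do end up needing to verify that nothing was left behind, a rough check from adb might look like this (the package name below is purely a placeholder; the actual agent package name varies by vendor and version):

    adb shell pm list packages -3                # list third-party packages and look for the agent
    adb uninstall com.example.acquisitionagent   # placeholder package name
    adb shell ls /sdcard/                        # look for any acquisition logs written to storage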



To change the agent settings, Open Process, Navigate to Tools, then Settings


Check the "Restore Device State" box to remove the Magnet agent after acquisition

Friday, April 6, 2018

Fishing for work is almost as bad as phishing (for anything)


Hello again readers and welcome back! The topic of today's blog post is something that we posted about a few years back, but unfortunately it's worth repeating again. Companies (both large and small) who provide any kind of cyber security services have a responsibility to anyone they interact with to be completely transparent, particularly when words like "breach", "victim", and "target" start getting thrown around. Case in point: a few weeks ago a client received an email from a large, well-established cyber security services company that caused a bit of internal alarm, yet ultimately did not contain enough information to be actionable.


In short, sharing information, threat intelligence, tactics/techniques/procedures (TTPs), indicators of compromise (IOCs), etc. is something that ALL of us in the industry need to do better. I applaud the sharing of IOCs and threat information (when it's unclassified, obviously). If this particular email had simply contained that information in a timely manner, I would have applauded the initiative. Unfortunately, the information shared about a seven-month-old phish consisted of:


  • four domains
  • tentative attribution to Kazakhstan, but zero supporting evidence
  • “new” (but, admittedly, unanalyzed) malware, including an MD5 hash, and of course, 
  • a sales pitch

The recipient of this email attempted to find out more information, but was ultimately turned off by the tone and was unsure whether the information was valid or just a thinly veiled sales pitch. They reached out to us directly for assistance.


I passed this particular information on to others within the information security field, and recently Arbor Networks actually put out a much more comprehensive overview of this activity, with a whole bunch of indicators and information that was not included, or even alluded to, in this particular email. I wish that more companies would take the initiative and do research into actors and campaigns such as this. If I were a CIO looking for a particular indicator from an email, and in searching for more information I came across the Arbor post, I would be much more inclined to engage with Arbor if my team and I needed external resources than with the sender of an email that may have had good intentions, but felt like a services fishing expedition.


On the exact opposite end of the spectrum, the outreach around the recent Panera data loss was done perfectly. In the original email, the individual attempted to contact the proper security personnel, had no luck initially, was very disappointed by the initial response from Panera, and tried repeatedly to work with the team. The team at Panera pretty much did nothing until the researcher went public with the issue just a few days ago, which (finally) spurred Panera to react, albeit in a less than satisfactory fashion, again. To be 100% honest, if I were in that situation I would have done everything exactly the same way. It is a sad state of affairs when we as customers/consumers are more concerned about protecting our own information than the companies who are charged with the care of that information for their services/loyalty programs/etc.


Additionally, no one wants to hear that their company or team has security issues, but responsible disclosure methods are always the way to go. However, it is hard for companies and individuals who are trying to do the right thing to highlight and address issues when "fishing for work" is so pervasive. I've seen many companies blow off security notifications as scams and ignore them completely, due precisely to this problem.


So ideally, how can we share information better?
  1. Join information sharing programs and network (Twitter, LinkedIn, conferences, etc.)
  2. Don’t “cold call” unless you have no other option. The process works much better when you already have a relationship (or know someone who does)
  3. Share complete, useful, and actionable information: recognize that not all companies can search the same way, due to limitations in available resources as well as policy, regulations, and even privacy laws. Some companies cannot search by email, while others will need traditional IOCs (IPs, domains, hashes (not just MD5 hashes; also include SHA1 and SHA256 if you can)).
  4. Include the body of the phishing email and the complete headers--if the company is unable to search for the IOCs, they may be able to determine that it was likely blocked by their security stack    
  5. Be timely. Scant details about a phish from seven months ago fall well beyond the search and retention capabilities of most companies
  6. Be selective in how and to whom you share. Sending these “helpful” notifications to C-levels is guaranteed to bring the infosec department to a full-stop while they work on only this specific threat, be it real, imagined, or incorrect. Which brings me to #7….
  7. Make sure (absolutely sure) you are correct. “Helpful notifications” that are based on incorrect information and lack of technical expertise are common enough that a large company could have days of downtime dedicated to them. (And if the client themselves points out your technical errors with factual observations, consider the possibility that you might be wrong, apologize profusely, and DO NOT keep calling every day)

Tuesday, January 30, 2018

Several minor updates to buatapa!


Hello again readers and welcome back! I am pleased to announce that today there is a brand new, updated version of buatapa! Over the past several months I've had requests for better in-script feedback on some of the ways that buatapa processes the results of autoruns, but I just have not had the free time to sit down and work on implementing them. The new version is a little more "wordy", as it tries the best that it can to help the user if there are processing problems. For example, if you did not run autoruns with the needed flags, buatapa will recognize that from the output file you are processing and suggest you run it again. For those on Mac (and maybe a few *nix systems), it also tells you if you do not have the proper permissions to access the autoruns output file.
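As a rough example of the workflow buatapa expects (this is a sketch; the autorunsc switches and the buatapa argument below are from memory/placeholders, so double-check the README for the exact flags it wants):

    :: on the Windows system being triaged - CSV output with hashes and signature checks
    autorunsc.exe -a * -c -h -s -nobanner > autoruns_output.csv

    # then on your analysis machine (argument form is a placeholder)
    python buatapa.py autoruns_output.csv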


There are also some slight changes to the interior processing and a little better logic flow. All in all, buatapa has held up quite well since the early testing nearly three years ago, and hopefully it remains a useful tool for helping to triage Windows systems within your environment.



If you have any questions or encounter any bugs/issues, please do not hesitate to reach out!



buatapa_0_0_7.zip - download here 

MD5: 8c2f9dc33094b3c5635bd0d61dbeb979
SHA-256: c1f67387484d7187a8c40171d0c819d4c520cb8c4f7173fc1bba304400846162
Version 0.0.7
Updated: January 30, 2018

Tuesday, December 26, 2017

Amazon Alexa Forensic Walkthrough Guide


Hello again readers and welcome back! We are working on wrapping up 2017 here at BriMor Labs, as this was a very productive and busy year. One of the things that Jessica and I have been meaning to put together for quite some time was a small document summarizing the URLs to query from Amazon to return some of the Amazon Echosystem data.

After several months, we (cough cough, Jessica) finally were able to get the time to put it together and share it with all of you. We hope that it is helpful during your investigations and analysis, and if you need anything else please do not hesitate to reach out to Jessica or myself!



Alexa Cloud Data Reference Guide




Monday, June 26, 2017

A Brief Recap of the SANS DFIR Summit


Hello again readers and welcome back!! I had the pleasure of attending (and speaking at, more on that in a bit!) the 10th SANS DFIR Summit this past week. It is one conference that I always try to attend, as it consistently has a fantastic lineup of DFIR professionals speaking about amazing research and experiences that they have had. This year was, of course, no exception, as the two day event was filled with incredible talks. The full lineup of slides from the talks can be found here. This was also the first year that the presenters had "walk-up music" before the talks.


This year, my good friend Jessica Hyde and I gave a presentation on the Amazon "Echo-system" in a talk we titled "Alexa, are you Skynet". We even brought a slight cosplay element to the talk, as I dressed up in a Terminator shirt and Jessica went full Sarah Connor! One other quick note I would like to add about our talk: we chose the song "All The Things" by Dual Core as our walk-up music. Dual Core actually lives in Austin, and fortunately his schedule allowed him to attend our talk. It was really cool having the actual artist who performed our walk-up music in attendance at our talk!

Jessica and I speaking about the Amazon Echo-system at the 2017 SANS DFIR Summit

We admittedly had a LOT of slides and a LOT of material to cover, but if you have attended any of our presentations in the past, you know the reason our slide decks tend to be long is that we want to make sure the slides themselves can still paint a pretty good picture of what we talked about. This way, even if you were not fortunate enough to see our presentation, you can follow along with the slides, and they can also serve as reference points during future examinations. We received a lot of really great comments about our talk and had some fantastic conversations afterwards as well, so hopefully if you attended you enjoyed it!


My other favorite part of the DFIR Summit is getting to see colleagues and friends that you interact with throughout the year, actually in person and not just as a message box in a chat window! Even though some of us live fairly close to each other in the greater Baltimore/DC area, we fly 1500 miles every summer to hang out for a few days. While in Austin several of us had some discussions about trying to start some local meetup type events on a more regular basis, so there definitely will be more on that to follow in the coming weeks! 


Thursday, March 9, 2017

How to load a SQL .bak file for analysis, without SQL Server previously installed


Hello again readers and welcome back! I hope that this new year has been treating you well so far! I recently worked a case with an interesting twist that I never had to deal with before, so I figured I would make a blog post about it and share my experiences. I also wanted to document the whole process just in case I have to deal with it again!


The case that I worked involved a SQL Server backup file (with a ".bak" file extension), which was created from a Microsoft SQL Server instance. Loading and parsing a SQL Server backup file is fairly trivial if you have a SQL Server environment, but I do not have a SQL Server environment and had to come up with a way to be able to process the data. 


Edited March 10, 2017 - The reddit user fozzie33 made a fantastic point that I did not specify in this particular post. I was working from a copy of the data that was originally provided, but it is best to change the attributes to read-only in an effort to ensure the raw data itself does not change. In any forensic investigation you should always be working from a copy of the data and never the original, but changing the attributes to read-only is another step one should take to limit any changes to the data, even if it is a working copy!
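On Windows, marking the working copy read-only is a one-liner (the path is a placeholder):

    attrib +R "C:\case_data\database_backup.bak"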


I followed a total of nine steps to accomplish analysis of the backed up SQL database:

1) Download SQL Server 2016 SP1 Developer edition
2) Download Microsoft SQL Server Management Studio
3) Copy executables to flash drive
4) Copy executables to offline system
5) Install SQL Server
6) Install SSMS
7) Launch SSMS & restore the SQL database
8) Make your SQL queries using SSMS
9) Great success! High five!



Step 1: Download SQL Server 2016 SP1 Developer edition https://msdn.microsoft.com/library/dd206988.aspx

Hopefully you have a Microsoft Developer Network account; if not, pop over to the MSDN page and sign up for one, it is free and quite easy to do. Once you are logged in, you can download the SQL Server 2016 SP1 Developer edition. The reason for using this version, compared to the Express version, is that the Express version limits the size of your database to 10GB. If you know your database is going to be smaller than that, you can definitely use the Express version, but I prefer the Developer edition just to be sure I can handle the database regardless of its size.

IMPORTANT NOTE: The license of the Developer edition explicitly prohibits using "Production data". While the backup file is indeed "Production data", I recommend installing the needed items and processing all of the data on a completely offline machine, and when you are finished with the analysis completely uninstall everything from your system. My personal take on the EULA is that Microsoft does not want you to use the Developer edition to power an online database backend, as they of course want you to purchase the license to allow you to do that. My opinion is that performing offline analysis of a SQL Server backup file is well within the limitations of the Developer license, but if you have any question on the legality of the issue please consult proper legal counsel, as I am not a lawyer nor did I stay at a Holiday Inn Express last night!

To download the files for your offline machine, first choose the "SQL Server 2016 Developer Edition Download" option. 
Choose the "SQL Server 2016 Developer Edition Download" option

The download page will load, then choose the "SQL Server 2016 Developer with Service Pack 1" option.


Choose the "SQL Server 2016 Developer with Service Pack 1" option

You will be presented with an option to download the .iso, or you can use the "Click here to utilize the SQL installer." option which will download a file with a name like "SQLServer2016-SSEI-Dev.exe". This installer will let you download the files so you can install it all to your offline machine.



Choose "Click here to utilize the SQL installer." option


The file "SQLServer2016-SSEI-Dev.exe" was downloaded

When you run the program, you will be presented with a screen containing three options. We are going to select the "Download Media" option, as we want to install it on another machine.



Choosing the "Download Media" option
On the next screen we will be presented with the option to download the ISO or the CAB. We want the CAB option as it will be easier to install on another Windows machine, so choose the "CAB" option and save it to the download path of your liking, then click the "Download" button.


Choose "CAB" option

The download will take a few seconds (or minutes, depending on your ISP) and there will be a friendly new screen informing you that the download is finished upon completion.


Congratulations, the download is now complete!

When the download is complete, you should have the files "SQLServer2016-DEV-x64-ENU.box" and "SQLServer2016-DEV-x64-ENU.exe" saved in your directory:



The files "SQLServer2016-DEV-x64-ENU.box" and "SQLServer2016-DEV-x64-ENU.exe" in the download folder


Step 2: Download Microsoft SQL Server Management Studio https://docs.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms

The Microsoft SQL Server Management Studio (SSMS) allows you to interact with data from the SQL database in a fairly easy, fairly straightforward manner. Even if you have very limited experience dealing with data from SQL, you can pretty easily start to navigate your way through with some of the built-in options from SSMS.


Choose "Download SQL Server Management Studio" option

There should be a file with a name similar to "SSMS-Setup-ENU.exe" now saved in your downloads folder.


SSMS-Setup-ENU.exe saved in the "Downloads" folder



Step 3: Copy executables to flash drive

The filenames themselves may change based on exactly when you download them, but you should now copy the two SQL Server installation files (.box and .exe) and the SSMS installation file to a flash drive so you can transfer them to your offline system.


Files copied to flash drive for offline system

Step 4: Copy executables to offline system

Although you can install it directly from the flash drive, in my experience it is always better to copy the needed files to your offline system. 


Files copied to offline system


Step 5: Install SQL Server

The first thing we are going to do is install SQL Server to our offline system. When you double-click the file you are greeted with a popup asking for the directory in which you wish to save the extracted files. I just left this as the default option and clicked "OK".


Choose the directory for extracted files

You will see a file extraction progress bar.

File extraction progress


and when that is done, you will see a new window titled "SQL Server Installation Center". We are going to install SQL Server on our system, so click on the Installation link.


Choosing the Installation link


There are several options that are presented here, but we are only interested in the first one, labeled "New SQL Server stand-alone installation or add features to an existing installation".


Choose to install a new SQL Server instance

Once you click that option, you will see the installation screen. Because we have the Developer edition, there is no need to insert a product key, so just click Next.


"Product Key" screen

Check the box on the next screen next to "I accept the license terms".


Accept the license terms (you did read all the way through it, right?)


Your system is offline, so there is no need to check the box about using Microsoft Update; just click Next.


Our system is offline, so this does not apply

Again, because the system is offline, you will see an error message saying it could not search for updates. This is fine, so just click Next.


Looks bad, but it is ok as our system is offline, so this is fine!

You should now see a screen labeled "Install Rules" that should list a couple of passed items and a couple of failed items. The .NET Application security should have a warning because the system is offline. However, depending on your system settings, the Windows Firewall may generate a warning because it is on, or it may pass because it is off.


"Install Rules" status

You should now see a screen labeled "Feature Selection". With this you can choose to install everything, but in my limited testing just selecting "Database Engine Services" should be enough. You can also choose where to install the files, but again the default(s) should be sufficient.


Feature Selection. Select as little, or as much, as you would like!

It may take a few minutes, but when it is finished you will see a screen labeled "Instance Configuration". You can choose whatever options that you would like, but I personally prefer to leave the default options again.


"Instance Configuration"

It may take a few minutes, but when it is finished you will see a screen labeled "Server Configuration". You can choose different options of course, but again I prefer to leave the defaults.


"Server Configuration"

Next you should see a screen labeled "Database Engine Configuration". I prefer to just leave "Windows Authentication Mode" checked. You must also choose an account (or accounts) for the SQL Server administrator; the easiest option is to click the "Add Current User" button and it will populate. Once that is finished, click Next.


Database Engine Configuration. Don't forget to add a SQL Server administrator!


Now that ALL that work is done, you should see a screen that resembles a tree hierarchy. Now you can click the Install button and install your SQL Server instance! This will probably take some time, so be patient!


Ready to install at last!!

Once that is finished, you should see a screen that is labeled "Complete" and several options should all say "Succeeded" next to them. You can now click "Close".


All done!!


Step 6: Install SSMS

Now, despite there being an "Install SQL Server Management Tools" link under the Installation section of the SQL Server Installation Center, that link simply opens a new page and tries to download it, which means you need an internet connection to do so. That is exactly why we downloaded SSMS separately and have it on our offline system ready to install!

To begin the process, double click the executable, and you should see a screen with "Microsoft SQL Server Management Studio" on it. All we have to do here is click "Install".


Installation screen for SSMS

You should see a screen showing packages loading; the process will likely take a few minutes to install.


Packages are loading, this may take a bit!

Once the installation is "complete", you will have to restart the system in order for the installation to "complete" (because it is Windows, after all!)


Installation is complete, but we have to restart to complete the installation. Huh??

Step 7: Launch SSMS & restore the SQL database

Now that SQL Server and SSMS are both installed on our system, we can launch SSMS. Navigate to Program Files and launch the executable. 


Getting ready to launch SSMS for the first time!

There may be a brief loading screen for user settings, then you should see the SSMS console, complete with the Connect to Server Window.


SSMS main console

All you should have to do is click the "Connect" button, and you should see a tree view of options in the "Object Explorer" window.


The "Object Explorer" window is populated

We are interested in the "Databases" option, since we are going to be restoring a database from a backup file. Right click on the "Databases" folder and choose the "Restore Database" option.


Choose the "Restore Database" option

Now we will get a new popup window that is labeled "Restore Database".


The "Restore Database" popup window

We are going to choose the "Device" option under "Source", then click on the box with the three dots.


Tick "Device", then click the box with three dots

This brings up a new window titled "Select backup devices". Our backup media type will be "File", and we will click the "Add" button to add our .bak file (PRO TIP: saving the .bak file on the root of a drive (like "C:\") makes it much easier to find and navigate to). Select the file and then click "OK".


Click the "Add" button
Browse to the folder containing the .bak file


Now the "Select backup devices" window should be populated with our backup file. As long as it is properly shown in the box, click "OK".


Select backup devices is now populated!

There will be a pause as the system processes the information, and you should see the box under "Backup sets to restore" populate with information. As long as it populates properly, you can click "OK".


The fields are populated, so we can click OK and let the backup restore process start!

The restore will take some time depending on the size of the database, but once it is done, the database will be fully loaded and we can start to make our queries!

The restore has been completed!
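For what it is worth, the same restore can also be done from the command line with sqlcmd instead of the SSMS GUI, which is handy if you want to script it. This is only a sketch; the database name and path are placeholders, and you may need WITH MOVE clauses if the original data/log paths inside the backup do not exist on your analysis box:

    :: see which logical files are inside the backup first
    sqlcmd -S localhost -E -Q "RESTORE FILELISTONLY FROM DISK = N'C:\database_backup.bak'"
    :: then restore it to a new database
    sqlcmd -S localhost -E -Q "RESTORE DATABASE [CaseDatabase] FROM DISK = N'C:\database_backup.bak'"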

Step 8: Make your SQL queries using SSMS


Once the database is loaded, you will see it under the "Databases" folder.


The database, seen under the databases folder

You can expand the database and see all of the associated information, but more than likely "Tables" is going to be the main area that you are going to focus on.


Some of the tables in this database. There are SOOO many tables!

Thanks to the power of SSMS, you can actually use some of the preconfigured queries to get you started!


Some of the options. "Select Top 1000 Rows" is your friend!

You can select the top 1000 rows, and then build out your specific queries however you would like!



The results of selecting the top 1000 rows from this particular table
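If you prefer to work outside the GUI, the equivalent of the "Select Top 1000 Rows" option as a plain query looks like this (the database and table names are placeholders):

    sqlcmd -S localhost -E -d CaseDatabase -Q "SELECT TOP (1000) * FROM dbo.SomeTable"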


Step 9: Great success! High five!


I definitely hope that this rather lengthy blog post helps in the event that you ever find yourself in a situation like this. It is of course much easier to get data from whatever database front end that is available, but if you can only get a backup of the raw database, it takes some time and research to build up good queries to find the information that you are after!