Tuesday, December 26, 2017
Hello again readers and welcome back! We are working on wrapping up 2017 here at BriMor Labs, as this was a very productive and busy year. One of the things that Jessica and I have been meaning to put together for quite some time was a small document summarizing the URLs to query from Amazon to return some of the Amazon Echo-system data.
After several months, we (cough cough Jessica) finally were able to get the time to put it together and share it with all of you. We hope that it is helpful during your investigations and analysis, and if you need anything else please do not hesitate to reach out to Jessica or myself!
Alexa Cloud Data Reference Guide
Monday, June 26, 2017
Hello again readers and welcome back!! I had the pleasure of attending (and speaking at, more on that in a bit!) the 10th SANS DFIR Summit this past week. It is one conference that I always try to attend, as it always has a fantastic lineup of DFIR professionals speaking about amazing research and experiences that they have had. This year was, of course, no exception, as the two day event was filled with incredible talks. The full lineup of slides from the talks can be found here. This was also the first year that the presenters had "walk-up music" before the talks.
This year, my good friend Jessica Hyde and I gave a presentation on the Amazon "Echo-system" in a talk we titled "Alexa, are you Skynet". We even brought a slight cosplay element to the talk as I dressed up in a Terminator shirt and Jessica went full Sarah Connor! One other quick note about our talk that I would like to add, is we chose the song "All The Things" by Dual Core as our walk-up music. Dual Core actually lives in Austin and fortunately his schedule allowed him to attend our talk. It was really cool having the actual artist who performed our walk-up music be in attendance at our talk!
|Jessica and I speaking about the Amazon Echo-system at the 2017 SANS DFIR Summit|
We admittedly had a LOT of slides and a LOT of material to cover, but if you have attended any of our presentations in the past, the reason our slide decks tend to be long is that we want to make sure that the slides themselves can still paint a pretty good picture of what we talked about. This way, even if you were not fortunate enough to see our presentation, you can still follow along with the slides, and they can also serve as reference points during future examinations. We received a lot of really great comments about our talk and had some fantastic conversations afterwards as well, so hopefully if you attended you enjoyed it!
My other favorite part of the DFIR Summit is getting to see colleagues and friends that you interact with throughout the year, actually in person and not just as a message box in a chat window! Even though some of us live fairly close to each other in the greater Baltimore/DC area, we fly 1500 miles every summer to hang out for a few days. While in Austin several of us had some discussions about trying to start some local meetup type events on a more regular basis, so there definitely will be more on that to follow in the coming weeks!
Thursday, March 9, 2017
Hello again readers and welcome back! I hope that this new year has been treating you well so far! I recently worked a case with an interesting twist that I never had to deal with before, so I figured I would make a blog post about it and share my experiences. I also wanted to document the whole process just in case I have to deal with it again!
The case that I worked involved a SQL Server backup file (with a ".bak" file extension), which was created from a Microsoft SQL Server instance. Loading and parsing a SQL Server backup file is fairly trivial if you have a SQL Server environment, but I do not have a SQL Server environment and had to come up with a way to be able to process the data.
Edited March 10, 2017 - The reddit user fozzie33 made a fantastic point that I did not specify in this particular post. I was working from a copy of the data that was originally provided, but it is best to change the attributes to read-only in an effort to ensure the raw data itself does not change. In any forensic investigation you should always be working from a copy of the data and never the original, but changing the attributes to read-only is another step one should take to limit any changes to the data, even if it is a working copy!
I followed a total of nine steps to accomplish analysis of the backed up SQL database:
1) Download SQL Server 2016 SP1 Developer edition
2) Download Microsoft SQL Server Management Studio
3) Copy executables to flash drive
4) Copy executables to offline system
5) Install SQL Server
6) Install SSMS
7) Launch SSMS & restore the SQL database
8) Make your SQL queries using SSMS
9) Great success! High five!
Step 1: Download SQL Server 2016 SP1 Developer edition https://msdn.microsoft.com/library/dd206988.aspx
Hopefully you have a Microsoft Developer Network account; if not, pop over to the MSDN page and sign up for one, it is free and quite easy to do. Once you are logged in, you can download the SQL Server 2016 SP1 Developer edition. The reason for using this version, compared to the Express version, is that the Express version limits the size of your database to 10GB. If you know your database is going to be smaller than that, you can definitely use the Express version, but I prefer the Developer edition just to be sure I can handle the database regardless of its size.
IMPORTANT NOTE: The license of the Developer edition explicitly prohibits using "Production data". While the backup file is indeed "Production data", I recommend installing the needed items and processing all of the data on a completely offline machine, and when you are finished with the analysis completely uninstall everything from your system. My personal take on the EULA is that Microsoft does not want you to use the Developer edition to power an online database backend, as they of course want you to purchase the license to allow you to do that. My opinion is that performing offline analysis of a SQL Server backup file is well within the limitations of the Developer license, but if you have any question on the legality of the issue please consult proper legal counsel, as I am not a lawyer nor did I stay at a Holiday Inn Express last night!
To download the files for your offline machine, first choose the "SQL Server 2016 Developer Edition Download" option.
|Choose the "SQL Server 2016 Developer Edition Download" option|
The download page will load, then choose the "SQL Server 2016 Developer with Service Pack 1" option.
|Choose the "SQL Server 2016 Developer with Service Pack 1" option|
You will be presented with an option to download the .iso, or you can use the "Click here to utilize the SQL installer." option which will download a file with a name like "SQLServer2016-SSEI-Dev.exe". This installer will let you download the files so you can install it all to your offline machine.
|Choose "Click here to utilize the SQL installer." option|
|The file "SQLServer2016-SSEI-Dev.exe" was downloaded|
When you run the program, you will be presented with a screen containing three options. We are going to select the "Download Media" option, as we want to install it on another machine.
|Choosing the "Download Media" option|
|Choose "CAB" option|
The download will take a few seconds (or minutes, depending on your ISP) and there will be a friendly new screen informing you that the download is finished upon completion.
|Congratulations, the download is now complete!|
When the download is complete, you should have the files "SQLServer2016-DEV-x64-ENU.box" and "SQLServer2016-DEV-x64-ENU.exe" saved in your directory:
|The files "SQLServer2016-DEV-x64-ENU.box" and "SQLServer2016-DEV-x64-ENU.exe" in the download folder|
Step 2: Download Microsoft SQL Server Management Studio https://docs.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms
The Microsoft SQL Server Management Studio (SSMS) allows you to interact with data from the SQL database in a fairly easy, fairly straightforward manner. Even if you have very limited experience dealing with data from SQL, you can pretty easily start to navigate your way through with some of the built-in options from SSMS.
|Choose "Download SQL Server Management Studio" option|
There should be a file with a name similar to "SSMS-Setup-ENU.exe" now saved in your downloads folder.
|SSMS-Setup-ENU.exe saved in the "Downloads" folder|
Step 3: Copy executables to flash drive
The filenames themselves may change based on exactly when you download them, but you should now copy the two SQL Server installation files (.box and .exe) and the SSMS installation file to a flash drive so you can transfer them to your offline system.
|Files copied to flash drive for offline system|
Step 4: Copy executables to offline system
Although you can install it directly from the flash drive, in my experience it is always better to copy the needed files to your offline system.
|Files copied to offline system|
Step 5: Install SQL Server
The first thing we are going to do is install SQL Server to our offline system. When you double-click the file you are greeted with a popup asking for the directory in which you wish to save the extracted files. I just left this as the default option and clicked "OK".
|Choose the directory for extracted files|
You will see a file extraction progress bar.
|File extraction progress|
When that is done, you will see a new window titled "SQL Server Installation Center". We are going to install SQL Server on our system, so click on the Installation link.
|Choosing the Installation link|
There are several options that are presented here, but we are only interested in the first one, labeled "New SQL Server stand-alone installation or add features to an existing installation".
|Choose to install a new SQL Server instance|
Once you click that option, you will see the installation screen. Because we have the Developer edition, there is no need to enter a product key, so just click Next.
|"Product Key" screen|
Check the box on the next screen next to "I accept the license terms".
|Accept the license terms (you did read all the way through it, right?)|
Your system is offline, so there is no need to check the box about using Microsoft Update, so just click Next.
|Our system is offline, so this does not apply|
Again, because the system is offline, you will see an error message saying it could not search for updates. This is fine, so just click Next.
|Looks bad, but it is ok as our system is offline, so this is fine!|
You should now see a screen labeled "Install Rules" that should list a couple of passed items and a couple of failed items. The .NET Application security should have a warning because the system is offline. However, depending on your system settings, the Windows Firewall may generate a warning because it is on, or it may pass because it is off.
|"Install Rules" status|
You should now see a screen labeled "Feature Selection". With this you can choose to install everything, but in my limited testing just selecting "Database Engine Services" should be enough. You can also choose where to install the files, but again the default(s) should be sufficient.
|Feature Selection. Select as little, or as much, as you would like!|
It may take a few minutes, but when it is finished you will see a screen labeled "Instance Configuration". You can choose whatever options that you would like, but I personally prefer to leave the default options again.
It may take a few minutes, but when it is finished you will see a screen labeled "Server Configuration". You can choose different options of course, but again I prefer to leave the defaults.
Next you should see a screen labeled "Database Engine Configuration". I prefer to just leave the "Windows Authentication Mode" checked. You must also choose an account(s) for the SQL Server Administrator, the easiest option for this is to click the "Add Current User" button and it will populate. Once that is finished, click Next.
|Database Engine Configuration. Don't forget to add a SQL Server administrator!|
Now that ALL that work is done, you should see a screen that resembles a tree hierarchy. Now you can click the Install button and install your SQL Server instance! This will probably take some time, so be patient!
|Ready to install at last!!|
Once that is finished, you should see a screen that is labeled "Complete" and several options should all say "Succeeded" next to them. You can now click "Close".
Step 6: Install SSMS
Now, despite there being an "Install SQL Server Management Tools" link under the Installation section of the SQL Server Installation Center, that link simply opens a web page and tries to download the installer, which means you need an internet connection to do so. That is exactly why we downloaded SSMS separately and have it on our offline system ready to install!
To begin the process, double click the executable, and you should see a screen with "Microsoft SQL Server Management Studio" on it. All we have to do here is click "Install".
|Installation screen for SSMS|
You should see a screen that involves loading packages, as the process will likely take a few minutes to install.
|Packages are loading, this may take a bit!|
Once the installation is "complete", you will have to restart the system in order for the installation to "complete" (because it is Windows, after all!)
|Installation is complete, but we have to restart to complete the installation. Huh??|
Step 7: Launch SSMS & restore the SQL database
Now that SQL Server and SSMS are both installed on our system, we can launch SSMS. Navigate to Program Files and launch the executable.
|Getting ready to launch SSMS for the first time!|
There may be a brief loading screen for user settings, then you should see the SSMS console, complete with the Connect to Server Window.
|SSMS main console|
All you should have to do is click the "Connect" button and you should see a tree view of options in the "Object Explorer" window.
|The "Object Explorer" window is populated|
We are interested in the "Databases" option, since we are going to be restoring a database from a backup file. Right click on the "Databases" folder and choose the "Restore Database" option.
|Choose the "Restore Database" option|
Now we will get a new popup window that is labeled "Restore Database".
|The "Restore Database" popup window|
We are going to choose the "Device" option under "Source", then click on the box with the three dots.
|Tick "Device", then click the box with three dots|
This brings up a new window titled "Select backup devices". Our backup media type will be "File", and we will click the "Add" button to add our .bak file (PRO TIP: saving the .bak file on the root of a drive (like "C:\") makes it much easier to find and navigate to). Select the file and then click "OK".
|Click the "Add" button|
|Browse to the folder containing the .bak file|
Now the "Select backup devices" window should be populated with our backup file. As long as it appears properly in the box, click "OK".
|Select backup devices is now populated!|
There will be a pause as the system processes the information, and you should see the box under "Backup sets to restore" populate with information. As long as it populates properly, you can click "OK".
|The fields are populated, so we can click OK and let the backup restore process start!|
The restore will take some time depending on the size of the database, but once it is done, the database will be fully loaded and we can start to make our queries!
|The restore has been completed!|
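If you would rather script the restore than click through the GUI dialogs, the same result can be achieved with a couple of T-SQL statements run from a query window in SSMS. This is only a sketch: the database name, file paths, and logical file names below are hypothetical, so check the FILELISTONLY output from your own .bak file and adjust accordingly.

```sql
-- List the logical file names contained in the backup first
RESTORE FILELISTONLY FROM DISK = 'C:\evidence.bak';

-- Restore the database, relocating the data and log files
-- (the database name, logical names, and paths here are hypothetical)
RESTORE DATABASE EvidenceDB
FROM DISK = 'C:\evidence.bak'
WITH MOVE 'EvidenceDB'     TO 'C:\SQLData\EvidenceDB.mdf',
     MOVE 'EvidenceDB_log' TO 'C:\SQLData\EvidenceDB_log.ldf',
     RECOVERY;
```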
Step 8: Make your SQL queries using SSMS
Once the database is loaded, you will see it under the "Databases" folder.
|The database, seen under the databases folder|
You can expand the database and see all of the associated information, but more than likely "Tables" is going to be the main area that you are going to focus on.
|Some of the tables in this database. There are SOOO many tables!|
Thanks to the power of SSMS, you can actually use some of the preconfigured queries to get you started!
|Some of the options. "Select Top 1000 Rows" is your friend!|
You can select the top 1000 rows, and then build out your specific queries accordingly, however you would like!
|The results of selecting the top 1000 rows from this particular table|
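For reference, the query that SSMS generates when you choose "Select Top 1000 Rows" is plain T-SQL, so you can paste it into a new query window and build it out from there. The table and column names below are hypothetical; a query against INFORMATION_SCHEMA is also a quick way to get your bearings in an unfamiliar database.

```sql
-- Enumerate all of the tables in the restored database
SELECT TABLE_SCHEMA, TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'
ORDER BY TABLE_SCHEMA, TABLE_NAME;

-- The "Select Top 1000 Rows" pattern, narrowed with a WHERE clause
-- (the table and column names here are hypothetical)
SELECT TOP (1000) *
FROM dbo.UserActivity
WHERE ActivityDate >= '2017-01-01'
ORDER BY ActivityDate DESC;
```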
Step 9: Great success! High five!
I definitely hope that this rather lengthy blog post helps in the event that you ever find yourself in a situation like this. It is of course much easier to get data from whatever database front end that is available, but if you can only get a backup of the raw database, it takes some time and research to build up good queries to find the information that you are after!
Monday, December 12, 2016
Good news everyone!! After a fairly busy year, the past few weeks I have finally had enough down time to work on adding some long overdue, and hopefully highly anticipated, features to the Live Response Collection. This version, named Bambiraptor, fixes some of the small issues that were pointed out in the scripts: at the request of the great folks over at Belkasoft, it is now a little more pronounced that I am using the Belkasoft RAM Capture tool in the collection (including an additional file created in both the 32 and 64 bit folders, respectively); the autoruns output, which was being saved as the csv file twice rather than as one csv and one easy to read text file, has been corrected; some additional logic has been built in to ensure that the "secure" options actually secure the data; and there are a couple of minor text fixes to the output. The biggest change is on the OSX side though, so without further ado, we shall dive into that!
The biggest change on the OSX side is the addition of automated disk imaging. It uses the internal "dd" command to do this, so again, be aware that if you suspect your system may be SEVERELY compromised, this may generate inconsistent output. If that is the case, you should probably be looking at a commercial solution such as Blackbag's MacQuisition to acquire the data from a system. Remember, the Live Response Collection is simply another tool in your arsenal, and while it does have some pretty robust capabilities, always be sure that you test and verify that it is working properly within your environment. I have tried my best to ensure that it either works properly or fails, but as there are different flavors of Mac hardware and software, it gets harder and harder to account for every possibility (this, along with the fact that I see way more Windows systems than OSX/*nix systems in the wild, is why my development plan is Windows first, followed by OSX, followed by *nix).
With the addition of the disk imaging, there are now a total of three scripts that you can choose to run on an OSX system. They are self explanatory, just like on the Windows side. However, unlike the Windows side, you MUST specify to the script that you are running it with super user privileges, or else the memory dump & disk imaging will not occur. (The Windows side is set to run automatically as Administrator as long as you click the proper pop ups; OSX, to my knowledge, does not have this option.)
I have purposely held off on releasing "secure" options on the OSX side because I want quite a bit more real-world testing to hopefully identify and eliminate any bugs before starting to secure the data automatically. The reason, again, is that it is more difficult to account for small changes that can have a big impact on the OSX side, and I want to ensure the script(s) are working as properly as possible before encrypting and securely erasing collected data, as I don't want to have to run process(es) more than once because one system does not understand a single quotation mark compared to a double quotation mark.
I hope you have a chance to use the Live Response Collection, and as always, if you identify any issues with it, if you find any bugs, or if there are any additional features you would like to add, please let me know. The roadmap for next year includes rewriting portions of the OSX script to better adhere to bash scripting security guidelines, adding secure options to the OSX side, and adding memory dump & automated disk imaging to *nix systems, as well as continuing to add updates and features to the scripts as needed and/or requested.
LiveResponseCollection-Bambiraptor.zip - download here
Updated: December 12, 2016
Friday, October 28, 2016
Hello again readers and welcome back! This blog post is going to be short, as the primary purpose is to publicly announce a new script, cleverly titled "allyouruarecordrebelongtous.pl", which was in my "Who Watches The Smart Watches" presentation that I gave at OSDFCon on October 26. This Perl script will allow the user to parse out data from SQLite databases associated with Under Armour Record stored on an Android device and present that information in an easy to read format. Please let me know if you have any questions or comments about the script.
If you would like to see the slides from my OSDFCon presentation, you can view them here.
The script itself can be found on our github page:
Please note, in order to run the script you may have to install some Perl modules. On a Windows system, to do this open a command prompt and paste the following command:
ppm install DBI DBD::SQLite DateTime IO::All
On OSX/*nix system, open a terminal window and paste the following command:
sudo cpan DBI DBD::SQLite DateTime IO::All
Additionally, I would very much like to thank Jessica Hyde (https://twitter.com/B1N2H3X) for helping me generate some test data and helping with code review and script output formatting. There is no way I would have been able to put this all together in 2 1/2 weeks without her help!
Friday, June 24, 2016
Hello again readers and welcome back! This blog post is going to be fairly short, as the primary purpose is to publicly announce a new script, cleverly titled "allyourpebblearebelongtous.pl". This Perl script will allow the user to parse out data from a SQLite database associated with Pebble data stored on either an iOS or Android device, and present that information in an easy to read format. Please let me know if you have any questions or comments about the script.
If you would like to see the slides from my SANS presentation, you can view them here.
|Parsed notifications from Android device|
|Parsed notifications from iOS device|
The script can be found on our newly created github account:
Please note, in order to run the script you may have to install some Perl modules. On a Windows system, to do this open a command prompt and paste the following command:
ppm install DBI YAML DBD::SQLite Data::Plist DateTime IO::All
On a Linux system, open a terminal window and paste the following command:
sudo cpan DBI YAML DBD::SQLite Data::Plist DateTime IO::All
Additionally, I would like to thank Adrian Leong (https://twitter.com/Cheeky4n6Monkey), Mari DeGrazia (https://twitter.com/maridegrazia), and Heather Mahalik (https://twitter.com/HeatherMahalik) for their help in gathering and testing the collected data.
Friday, April 22, 2016
Hello again readers, it has been busy over here for the past few months, but over the past few days there has been some really interesting research done by Casey Smith (@subTee) regarding COM+ objects, specifically using regsvr32 to access external sites (cough cough potentially malware), cleverly named "squiblydoo". The original blog post is here. Apparently it leaves almost no trace on the system, as a quick look at running it in Noriben shows:
|Brian Baskin's tweet regarding results of Noriben looking at "squiblydoo"|
Now, I am sure some of you are thinking, "so what, <fill in thoughts here>", because after all, several of the things in the past that we were supposed to get all spun up about (most recently, the debacle that was "badlock") have really turned out to be a lot of marketing hype and not much else. Well, this is something that you should take note of. Until/unless regsvr32 is modified to change the way that it works, there is very little left on the system itself to show that something bad happened. There have been several well respected experts weighing in on this issue (browsing for it will likely give you more information than you ever wanted to know) and the general consensus is that this is pretty worrying.
|Twitter weighs in on "squiblydoo"|
So, what to do? It is very likely that how often regsvr32 actually gets called is dependent on what you do in your environment. It really should never hit the internet, for anything (I will note that statement has not been fully determined yet), but what I have found to be the most successful solution thus far in limited testing is using the open source tool "Process Notifier". It is pretty easy to set up: you run the proper flavor (32 or 64 bit), choose "Processes to Monitor", then type "regsvr32.exe" as your process name to check, choose "Started" and click "Add", then "Apply" and "Save".
|Process Notifier options|
|Adding regsvr32 to the processes to monitor list|
Then you can set up the email alerts under "E-mail Settings" by choosing your send-to email address, the message subject, and message body, and even take a screenshot if you'd like under "Message". The next part is very important: under "SMTP" I highly recommend creating a one time throw away gmail account for this, because it does save the account password in plain text on the system. Once you do all of these steps, again choose "Apply" and "Save".
|"Message" options under E-mail Settings|
|"SMTP" options under E-mail Settings|
|My emailed alert on regsvr32, complete with screenshot!|
|Command prompt running regsvr32 captured in the screenshot!|
It is important to note that if this was used maliciously, having the alert on regsvr32 means it will take the screenshot when the process starts. So you may not see your shell (or whatever else was done) but you should see the site/file that it references. And even if it downloads malware that cleans up after itself and squiblydoo, the email should have been sent before that actually happens, so (fingers crossed) you will hopefully get a notification. And if you do get a notification, this would probably be a really good time to at least start gathering data from the system, most likely at least memory and volatile data (hmm...sounds like a good job for the Live Response Collection!)
Unfortunately this only works for finding regsvr32 and does not have the capability to look for urls in the command itself, but it should be a pretty useful quick check to see if it gets called. And if your environment actually does use regsvr32 on a regular basis, this will get very noisy and a different solution will have to be found. It is also very important to remember that there still has to be a considerable amount of testing to try to remedy this situation, so this (or any other method) should only be a temporary fix until a long-term, viable, solution is presented, which is what we are all working toward!