GiftStick

Summary

This project contains code that allows an inexperienced user to easily (with one click) upload forensic evidence (such as information about the system, a full disk image, and the system's firmware, if supported) from a target device (booted from an external drive containing the code) to Google Cloud Storage.

It supports configuring what artifacts to collect and which Cloud credentials to use.

This is not an officially supported Google product.

Usage

Make a bootable disk image with the provided script

In the tools directory, the script remaster.sh helps you with the process of:

  • Creating a bootable USB disk image with the required dependencies
  • Making sure the image boots on EFI-enabled systems, and installing third-party input drivers for the latest MacBooks
  • Creating a GCS bucket to receive the evidence, as well as a Service Account with the proper roles & ACL
  • Adding a clickable icon on the system's Desktop to start the acquisition process

It needs as input:

  • a Xubuntu 20.04 ISO (it won't work with a non-Xubuntu ISO, and is untested with versions other than 20.04)
  • the name of your GCP project
  • the name of the GCS bucket (remember that bucket names need to be globally unique)

You need to have the Google Cloud SDK installed, your environment set up, and to be logged in (a minimal setup is sketched after the command below). Then run:

bash tools/remaster.sh \
  --project some-forensics-project-XYZ \
  --bucket giftstick-uploads-XYZ \
  --source_iso xubuntu-20.04-desktop-amd64.iso
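
If the Cloud SDK is not yet configured on your workstation, a minimal setup might look like this (a sketch, assuming the gcloud CLI is already installed; the project name is the example one used above):

# Authenticate with your Google account and point gcloud at the project
gcloud auth login
gcloud config set project some-forensics-project-XYZ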

Manually set up the required Google Cloud environment & call the script

First, the script needs credentials (for example, those of a Service Account) that provide the following roles (see IAM roles); a sketch of creating such an account appears after the list:

  • roles/storage.objectCreator, to be able to create (but not overwrite) new storage objects,
  • (optional) roles/logging.logWriter for the Stackdriver logging system.
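
If you do not already have such a Service Account, creating one and granting it the storage role might look like the following (a sketch; the account name giftstick-sa and the display name are illustrative):

# Create a Service Account in the project (names are illustrative)
gcloud iam service-accounts create giftstick-sa \
    --project giftstick-project \
    --display-name "GiftStick evidence uploader"

# Allow it to create (but not overwrite) storage objects in the project
gcloud projects add-iam-policy-binding giftstick-project \
    --member "serviceAccount:giftstick-sa@giftstick-project.iam.gserviceaccount.com" \
    --role "roles/storage.objectCreator"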

These credentials need to be downloaded and saved as a JSON file. For example, for a Service Account named giftstick-sa@giftstick-project.iam.gserviceaccount.com, you can create a new key and save it as credentials.json:

gcloud iam service-accounts --project giftstick-project keys create \
        --iam-account "giftstick-sa@giftstick-project.iam.gserviceaccount.com" \
        credentials.json

Now pull the code and install its dependencies:

git clone https://github.com/google/GiftStick
cd GiftStick
pip3 install -r requirements.txt

Unfortunately, because of boto/boto#3699, some patches are required for boto to work in a Python3 environment:

boto_dir=$(python3 -c "import boto; print(boto.__path__[0])")
patch -p0 "${boto_dir}/connection.py" config/patches/boto_pr3561_connection.py.patch
patch -p0 "${boto_dir}/s3/key.py" config/patches/boto_pr3561_key.py.patch

Once you have booted the system to acquire evidence from, using the newly created USB stick, you can run the acquisition script as follows to upload the evidence to a GCS URL such as gs://giftstick-bucket/forensics_evidence/:

cd auto_forensicate
sudo python3 auto_acquire.py \
    --gs_keyfile=credentials.json \
    --logging stdout \
    --acquire all \
    gs://giftstick-bucket/forensics_evidence/

You'll then get the following hierarchy in your GCS bucket (a quick way to inspect it with gsutil is sketched after the listing):

gs://giftstick-bucket/forensics_evidence/20181104-1543/SYSTEM_SERIAL/system_info.txt
gs://giftstick-bucket/forensics_evidence/20181104-1543/SYSTEM_SERIAL/stamp.json
gs://giftstick-bucket/forensics_evidence/20181104-1543/SYSTEM_SERIAL/Disks/
gs://giftstick-bucket/forensics_evidence/20181104-1543/SYSTEM_SERIAL/Disks/sda.hash
gs://giftstick-bucket/forensics_evidence/20181104-1543/SYSTEM_SERIAL/Disks/sda.image
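
To check the upload afterwards, the objects can be listed and inspected with gsutil, for example (bucket and paths taken from the example hierarchy above):

# List everything uploaded for this acquisition
gsutil ls -r gs://giftstick-bucket/forensics_evidence/

# Print the hashes computed for the acquired disk
gsutil cat gs://giftstick-bucket/forensics_evidence/20181104-1543/SYSTEM_SERIAL/Disks/sda.hash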

Dependencies

The acquisition scripts need Python3 and have been tested with the 20.04 LTS version of Xubuntu. Previous versions should still work but are not actively supported.

The following packages should be installed on the system you're booting into:

  • sudo apt install dcfldd python-pip zenity
  • For Chipsec (optional): apt install python-dev libffi-dev build-essential gcc nasm

Acquired evidence

Currently the script uploads the following data (the underlying tools can also be run by hand, as sketched after this list):

  • System information (output of dmidecode)
  • For each block device that is most likely an internal disk:
    • all of the device's bytes (a full disk image)
    • the corresponding hashes
    • the device's information (output of udevadm)
  • The system's firmware, dumped with Chipsec
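
As mentioned above, the underlying tools can also be invoked by hand for a quick look, for example (a sketch; not necessarily the exact invocations the script uses):

# System information
sudo dmidecode

# Information about a given block device
udevadm info --query=all --name=/dev/sda

# Dump the firmware (SPI flash) to rom.bin with Chipsec
sudo chipsec_util spi dump rom.bin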

It can also upload a folder (for example a mounted filesystem) with --acquire directory. In this case, the script builds a .tar file and uploads it alongside a corresponding .timeline file, which is a bodyfile-compatible file generated with the find command (and stat, when run on macOS).
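
For illustration, a bodyfile-style timeline for a directory can be generated with GNU find roughly as follows (a sketch of the format, with an illustrative mount point; not the exact command the script runs):

# One pipe-separated record per file:
# MD5|name|inode|mode|UID|GID|size|atime|mtime|ctime|crtime
find /mnt/evidence -xdev -printf '0|%p|%i|%M|%U|%G|%s|%A@|%T@|%C@|0\n' > evidence.timeline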

FAQ

Some answers to Frequently Asked Questions can be found here

Contributors

jonathan-greig, meeehow, onager, rgayon, someguyiknow, tomchop, toryc

Issues

Move to Xubuntu 20.04

This will (hopefully) ship with Python3 as the only version, which means we can stop pinning the cachetools version.

It also ships with kernel 5.3, which means we can stop installing applespi.

Enabling Stackdriver logging causes duplicate output

Enabling Stackdriver logging via --logging stackdriver causes everything logged to be output twice:

2020-07-15 01:40:02,005 - INFO - AutoForensicate - Acquisition starting with args '['auto_acquire.py', '--gs_keyfile', 'keyfile.json', '--acquire', 'all', '--logging', 'stackdriver', 'gs://bucket']'
Acquisition starting with args '['auto_acquire.py', '--gs_keyfile', 'keyfile.json', '--acquire', 'all', '--logging', 'stackdriver', 'gs://bucket]'
2020-07-15 01:40:02,167 - INFO - AutoForensicate - Uploading 'lsblk.txt' (1.8KiB, Task 1/4)
Uploading 'lsblk.txt' (1.8KiB, Task 1/4)

Error when capturing and exporting error from Chipsec

2019-08-20 18:22:00,898 - INFO - AutoForensicate - Uploading 'rom.bin' (Unknown size, Task 1/1)
rom.bin 2019-08-20 18:22:00,901 - INFO - ProcessOutputArtifact - Running command '[u'/usr/bin/python', u'/usr/local/lib/python2.7/dist-packages/chipsec_util.py', u'-l', u'/dev/stderr', u'spi', u'dump', u'/dev/stdout']'
2019-08-20 18:22:01,681 - ERROR - ProcessOutputArtifact - Command '[u'/usr/bin/python', u'/usr/local/lib/python2.7/dist-packages/chipsec_util.py', u'-l', u'/dev/stderr', u'spi', u'dump', u'/dev/stdout']' failed with 'Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/chipsec_util.py", line 263, in
sys.exit( main() )
File "/usr/local/lib/python2.7/dist-packages/chipsec_util.py", line 259, in main
return chipsecUtil.main()
File "/usr/local/lib/python2.7/dist-packages/chipsec_util.py", line 226, in main
comm.run()
File "/usr/local/lib/python2.7/dist-packages/chipsec/utilcmd/spi_cmd.py", line 119, in run
buf = _spi.read_spi_to_file( 0, spi_size, out_file )
File "/usr/local/lib/python2.7/dist-packages/chipsec/hal/spi.py", line 525, in read_spi_to_file
buf = self.read_spi( spi_fla, data_byte_count )
File "/usr/local/lib/python2.7/dist-packages/chipsec/hal/spi.py", line 541, in read_spi
self.check_hardware_sequencing()
File "/usr/local/lib/python2.7/dist-packages/chipsec/hal/spi.py", line 518, in check_hardware_sequencing
raise SpiRuntimeError("Chipset does not support hardware sequencing")
chipsec.hal.spi.SpiRuntimeError: Chipset does not support hardware sequencing

################################################################

CHIPSEC: Platform Hardware Security Assessment Framework

################################################################
[CHIPSEC] Version 1.3.0
****** Chipsec Linux Kernel module is licensed under GPL 2.0
[CHIPSEC] API mode: using CHIPSEC kernel module API
ERROR: Unsupported Platform: VID = 0x8086, DID = 0x5910
WARNING: *******************************************************************
WARNING: * Unknown platform!
WARNING: * Platform dependent functionality will likely be incorrect
WARNING: * Error Message: "Unsupported Platform: VID = 0x8086, DID = 0x5910"
WARNING: *******************************************************************
[CHIPSEC] Executing command 'spi' with args ['dump', '/dev/stdout']

[CHIPSEC] dumping entire SPI flash memory to '/dev/stdout'
[CHIPSEC] it may take a few minutes (use DEBUG or VERBOSE logger options to see progress)
[CHIPSEC] BIOS region: base = 0x00000000, limit = 0x00000FFF
[CHIPSEC] dumping 0x00001000 bytes (to the end of BIOS region)
ERROR: HSFS.FDV is 0, hardware sequencing is disabled' return code 1)
2019-08-20 18:22:01,681 - ERROR - AutoForensicate - Unable to upload artifact rom.bin
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/auto_forensicate-20181010-py2.7.egg/EGG-INFO/scripts/auto_acquire.py", line 347, in _UploadArtifact
  File "build/bdist.linux-x86_64/egg/auto_forensicate/uploader.py", line 104, in UploadArtifact
    artifact.OpenStream(), remote_path, update_callback=update_callback)
  File "build/bdist.linux-x86_64/egg/auto_forensicate/recipes/base.py", line 89, in OpenStream
    self._stream = self._GetStream()
  File "build/bdist.linux-x86_64/egg/auto_forensicate/recipes/base.py", line 231, in _GetStream
    self._buffered_content = BytesIO(command_output)
TypeError: 'unicode' does not have the buffer interface

Error when starting zenity with --select_disk

sudo auto_acquire.py --acquire all --select_disk /mnt/evidence
2022-10-12 08:01:11,237 - INFO - AutoForensicate - Acquisition starting with args '['/usr/local/bin/auto_acquire.py', '--acquire', 'all', '--select_disk', '/mnt/evidence']'
2022-10-12 08:01:15,279 - ERROR - AutoForensicate - Recipe disk failed to run
Traceback (most recent call last):
  File "/usr/local/bin/auto_acquire.py", line 574, in Main
    self.Do(recipe_class(recipe_name, options=options))
  File "/usr/local/bin/auto_acquire.py", line 495, in Do
    artifacts = recipe.GetArtifacts()
  File "/usr/local/lib/python3.8/dist-packages/auto_forensicate/recipes/disk.py", line 407, in GetArtifacts
    disks_to_collect = gui.AskDiskList(all_disks)
  File "/usr/local/lib/python3.8/dist-packages/auto_forensicate/ux/gui.py", line 62, in AskDiskList
    choices = zenity.CheckList(
  File "/usr/local/lib/python3.8/dist-packages/auto_forensicate/ux/zenity.py", line 86, in CheckList
    return process.stdout.read().strip().split('|')
TypeError: a bytes-like object is required, not 'str'

Move to supporting xubuntu 22.04 as base ISO

Much seems to have changed on the ISOs: isolinux no longer seems to exist, and the only supported booting method is EFI.

I suspect moving to EFI only might be an issue when booting old hardware.

ProgressBar raises exceptions

sda %(percent).1f%% 2020-06-17 19:02:33,797 - INFO - LinuxDiskArtifact - Opening disk with command '['/usr/bin/dcfldd', 'if=/dev/sda', 'hashlog=sda.hash', 'hash=md5,sha1', 'bs=2M', 'conv=noerror', 'hashwindow=128M']'
2020-06-17 19:02:33,837 - ERROR - AutoForensicate - Unable to upload artifact sda
Traceback (most recent call last):
  File "auto_forensicate/auto_acquire.py", line 352, in _UploadArtifact
    artifact, update_callback=update_callback)
  File "/usr/local/google/home/romaing/dev/gift_github/auto_forensicate/uploader.py", line 104, in UploadArtifact
    artifact.OpenStream(), remote_path, update_callback=update_callback)
  File "/usr/local/google/home/romaing/dev/gift_github/auto_forensicate/uploader.py", line 145, in _UploadStream
    update_callback(len(buf), copied)
  File "auto_forensicate/auto_acquire.py", line 120, in update_with_total
    self._Update(current_bytes)
  File "auto_forensicate/auto_acquire.py", line 84, in _Update
    self.update()
  File "/usr/local/google/home/romaing/venvs/gift_github/lib/python3.7/site-packages/progress/bar.py", line 80, in update
    suffix = self.suffix % self
  File "/usr/local/google/home/romaing/venvs/gift_github/lib/python3.7/site-packages/progress/__init__.py", line 61, in __getitem__
    return getattr(self, key, None)
  File "/usr/local/google/home/romaing/venvs/gift_github/lib/python3.7/site-packages/progress/__init__.py", line 147, in eta_td
    return timedelta(seconds=self.eta)
OverflowError: Python int too large to convert to C int

Report used NIC for network connection

To debug possibly flaky network connections, it would be nice to report the active default NIC used for the upload connection. This way we can see whether, for example, it is wired or wireless.
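
One way to look that up from a shell, as a sketch of the idea rather than an implemented feature:

# Print the interface the default route would use (8.8.8.8 is just an example routable address)
ip route get 8.8.8.8 | sed -n 's/.* dev \([^ ]*\).*/\1/p'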
