
Getting Started Out of the Box


This section includes:

1. Editing Configuration File

2. Running the Analysis Utility

3. Components Included Out of the Box

Editing your configuration file

You must modify the carbonblackcloud and engine sections to make the Toolkit functional.

An example configuration file is available here.

Configuration fields

carbonblackcloud

| Parameter | Definition | Format | Example Value |
| --- | --- | --- | --- |
| url | Base URL of your Carbon Black Cloud server | https://defense-<environment>.conferdeploy.net | https://defense-prod05.conferdeploy.net |
| api_token | Concatenated API Secret Key and API Key ID | [API Secret Key] / [API Key ID] | ABCDEFGHIJKLMNOPQRSTUVWX/ABCDE12345 |
| org_key | Organization's Org Key, found in console | 8 alphanumeric characters | ABCDEFGH |
| ssl_verify | Use SSL to verify connection to Carbon Black Cloud | True or False | True |
| expiration_seconds | How long links to download binaries from AWS S3 should be valid for, in seconds | Integer | 3600 |

Parameter Details

url will be in the form https://defense-<environment>.conferdeploy.net/, where <environment> is prod02, prod03, prod05, etc. This is unique to your organization.

api_token will be a concatenation of an API Secret Key and API Key ID. See Credentials in Configuration File for more detail.

org_key can be found in your Carbon Black Cloud Console, under Settings -> API Access -> API Keys.

ssl_verify can be either True or False, depending on whether you want to verify your connection to the Carbon Black Cloud via SSL.

expiration_seconds is how long the links to download binaries from AWS S3 will be valid for. This may need to be adjusted depending on how quickly your analysis engine can process binaries. With the included YARA analysis engine, 3600 seconds (60 minutes) is more than sufficient.
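Putting the values from the table together, the carbonblackcloud section of your configuration file might look like the sketch below. The URL, token, and org key shown are the table's example values, not real credentials; replace them with your own.

carbonblackcloud:
  url: https://defense-prod05.conferdeploy.net
  api_token: ABCDEFGHIJKLMNOPQRSTUVWX/ABCDE12345
  org_key: ABCDEFGH
  ssl_verify: True
  expiration_seconds: 3600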

database

| Parameter | Definition | Format |
| --- | --- | --- |
| _provider | Class name of your chosen persistence provider. Default is the built-in cbc_binary_toolkit.state.builtin.Persistor | Fully-qualified class name |
| location | Pathname of the database file to be opened. Default is ":memory:", where the database is stored in RAM | path-like object |

Parameter Details

The default :memory: location means there is no state maintained between runs. If you want persistence between runs, use a local database file.

To get up and running quickly, use the built-in cbc_binary_toolkit.state.builtin.Persistor persistence provider.
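For example, a database section that keeps state between runs in a local file might look like this; the file path is illustrative and can be any writable location.

database:
  _provider: cbc_binary_toolkit.state.builtin.Persistor
  location: /path/to/binary-analysis.db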

engine

| Parameter | Definition | Format | Example Value |
| --- | --- | --- | --- |
| name | Name of your analysis engine. Used in engine responses and referenced in reports sent to Carbon Black Cloud | String | "Yara" |
| feed_id | The Feed ID that the Toolkit should send analysis results to. Can be retrieved from the Carbon Black Cloud Feed API | 21 alphanumeric characters | 578FTEQBNWSHCPGIXD4R3 |
| type | Whether the analysis engine is running locally or remotely. Only local analysis is currently supported | local | local |
| _provider | Class name of your chosen analysis engine | Fully-qualified class name | cbc_binary_toolkit_examples.engine.yara_local.yara_engine.YaraFactory |

Parameter Details

The engine section of your configuration file is extensible -- you may add or remove parameters and adjust the analysis utility code accordingly. See the Developer Guide for more information.

For example, the YARA analysis engine requires a rules_file parameter to tell the engine where to look for YARA rules.

| Parameter for YARA | Definition | Format | Example Value |
| --- | --- | --- | --- |
| rules_file | Location of file containing YARA rules | path-like object | `__file__/example_rule.yara` |

Running the Analysis Utility

The analysis utility is your entry point to the Toolkit. Through command line arguments, you can analyze hashes, clear already-analyzed hashes from the state management database, or restart analysis if an error occurred during a previous run and some hashes were not fully analyzed.

Specify a Custom Configuration

| Parameter | Flag | Purpose |
| --- | --- | --- |
| --config | -c | Specify a configuration file location |

The analysis utility can be supplied with the --config parameter or -c flag to specify where to find your configuration file. If you do not specify a path to a configuration file, the analysis utility defaults to cbc-binary-toolkit/config/binary-analysis-config.yaml.example.

user@machine:~$ cbc-binary-analysis --config /path/to/config.yaml [...]

Analyze

| Parameter | Flag | Purpose |
| --- | --- | --- |
| --file | -f | Input a CSV file containing hashes |
| --list | -l | Input a JSON string of hashes |

There are two options for inputting hashes into the Toolkit:

  1. CSV file with one hash per line:
user@machine:~$ cbc-binary-analysis analyze --file path/to/hashes.csv

Here, you pass --file the path to your CSV file containing hashes, one per line.

  2. JSON string containing hashes:

On Mac/UNIX, enclose each hash in quotes, and wrap the whole list in brackets.

user@machine:~$ cbc-binary-analysis analyze --list '["hashone64charslong", "hashtwo64charslong"]'

On Windows, use the \ escape character before each quote:

C:\Users\User> cbc-binary-analysis analyze --list [\"hashone64charslong\", \"hashtwo64charslong\"]

If a hash has been previously analyzed, it will not be analyzed again unless you clear the persistor.

Clear

| Parameter | Flag | Purpose |
| --- | --- | --- |
| --timestamp | N/A | Limit clearing to before timestamp |
| --force | N/A | Skip confirmation prompt |
| --reports | -r | Remove unsent reports and hash records |

The clear command removes the records showing that hashes have been processed; if those hashes are submitted again, they will be sent to the engine and re-analyzed.

Clear all records of hashes having been processed from the persistor:

user@machine:~$ cbc-binary-analysis clear

Clear records of hashes processed before a specific timestamp. The timestamp must be in ISO 8601 date format (YYYY-MM-DD HH:MM:SS.SSS):

user@machine:~$ cbc-binary-analysis clear --timestamp '2020-01-02 10:30:00.000'

A confirmation prompt will appear before anything is cleared. To skip this prompt, use the --force flag:

user@machine:~$ cbc-binary-analysis clear --force

The --force flag can be used with or without a --timestamp.

Remove unsent Reports in addition to hash processing records, with --reports or -r:

user@machine:~$ cbc-binary-analysis clear --reports

The --reports flag can be used with or without a --timestamp.
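Assuming the flags combine as independent options (as the notes above suggest), a single command could clear everything processed before a given timestamp, including unsent reports, without prompting:

user@machine:~$ cbc-binary-analysis clear --timestamp '2020-01-02 10:30:00.000' --reports --force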

Restart

Resume analysis of hashes that have already been ingested but were not fully analyzed, or whose results were not sent to the Carbon Black Cloud:

user@machine:~$ cbc-binary-analysis restart

For all command line options, see the Running Binary Analysis tool section in the README for detailed usage, or run cbc-binary-analysis --help from your terminal.

Components Included Out of the Box

The Toolkit can be run with minimal configuration out of the box. Here is an example Analysis Utility that uses a Built-in State Management Database and YARA Analysis Engine to show what is possible.

How the Analysis Utility example steps through execution

The example analysis utility:

  1. Loads your configuration file
  2. Initializes the necessary components, then
  3. Executes your desired functionality: analyze, clear, or restart.

If you choose to analyze hashes, the Toolkit then

  1. Communicates with the Unified Binary Store (UBS)
  2. Gives the metadata retrieved from the UBS to the analysis engine
  3. Accepts analysis results from the analysis engine
  4. Stores then sends the analysis results to a Feed

The feed that analysis results are sent to is specified by Feed ID in your configuration file.
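As a rough illustration of that flow, the sketch below walks one batch of hashes through the same four steps. Every class and method name here is a hypothetical stand-in, not the Toolkit's actual API; it only shows the order of operations.

# Hypothetical sketch of the analyze flow; names do not match the real Toolkit API.
class UnifiedBinaryStore:
    def fetch_metadata(self, sha256):
        # 1. The UBS returns binary metadata, including a time-limited S3 download link.
        return {"sha256": sha256, "url": "https://example-s3-link"}

class AnalysisEngine:
    def analyze(self, metadata):
        # 2-3. The engine receives metadata, inspects the binary, and hands back a result.
        return {"sha256": metadata["sha256"], "iocs": [], "severity": 1}

class Persistor:
    def __init__(self):
        self.processed, self.reports = set(), []
    def already_processed(self, sha256):
        return sha256 in self.processed
    def store(self, result):
        self.processed.add(result["sha256"])
        self.reports.append(result)

def run_analysis(hashes, feed_id):
    ubs, engine, persistor = UnifiedBinaryStore(), AnalysisEngine(), Persistor()
    for sha256 in hashes:
        if persistor.already_processed(sha256):  # previously analyzed hashes are skipped
            continue
        metadata = ubs.fetch_metadata(sha256)    # 1. communicate with the UBS
        result = engine.analyze(metadata)        # 2-3. metadata in, results out
        persistor.store(result)                  # 4a. store the results
    # 4b. send the stored results to the Feed named by feed_id in the configuration
    print(f"Sending {len(persistor.reports)} report(s) to feed {feed_id}")

run_analysis(["a" * 64], "example-feed-id")  # placeholder hash and Feed ID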

Data Flow for Analysis Utility's analyze command

[Figure: Data Flow within the Binary Analysis Toolkit]

Built-in State Management Database

There is a built-in example for managing the state of hashes you've input for analysis. The example uses SQLite3 with two tables:

  1. run_state: tracks checkpoints as hashes travel through the Toolkit

  2. report_item: stores Reports queued for dispatch to the Carbon Black Cloud.

As a user, you will not interact with the state management database.
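You do not need to open it, but if you configure a file-backed location and are curious, a quick check with Python's sqlite3 module will show the two tables. The database path below is illustrative (use whatever you set as location in the database section); the tables' column layouts are internal to the Toolkit and are not shown here.

import sqlite3

conn = sqlite3.connect("/path/to/binary-analysis.db")  # your database.location value
cur = conn.cursor()

# List the tables the built-in persistor maintains (run_state and report_item).
cur.execute("SELECT name FROM sqlite_master WHERE type='table'")
print(cur.fetchall())

# Count checkpointed hashes and queued reports.
print(cur.execute("SELECT COUNT(*) FROM run_state").fetchone())
print(cur.execute("SELECT COUNT(*) FROM report_item").fetchone())

conn.close()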

Example YARA Analysis Engine

The YARA analysis engine uses the yara-python package. YARA is a rule-based static analysis method. With string matching, you can categorize binaries and assign different severities to different categories or rules.

The engine accepts binary metadata, which contains a link to download the binary from AWS S3. After downloading, the binary is checked against the rules in the rules_file. The location of the rules file is specified in the engine: section of your configuration file.

[Figure: Data Flow within the Binary Analysis Toolkit]

How to use the example YARA Analysis Engine

The example YARA engine is located at cbc_binary_toolkit_examples/engine/yara_local/yara_engine.py. It uses one rule, called CompanyMatch.

rule CompanyMatch
{
	meta:
		sev = 2
	strings:
		$microsoft = "microsoft" nocase
		$google = "google" nocase

	condition:
		$microsoft or $google
}

This rule looks in a binary for the case-insensitive strings "microsoft" or "google". The rule has a low severity value of 2, because these are not signs of malicious behavior.
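To see how the engine's matching works at a small scale, here is a minimal yara-python sketch that compiles the CompanyMatch rule and scans a short byte string. The inlined rule and sample data are for illustration only; the Toolkit's engine performs the equivalent match against binaries it downloads from the UBS.

import yara

# The CompanyMatch rule from example_rule.yara, inlined for illustration.
RULE_SOURCE = r'''
rule CompanyMatch
{
    meta:
        sev = 2
    strings:
        $microsoft = "microsoft" nocase
        $google = "google" nocase
    condition:
        $microsoft or $google
}
'''

rules = yara.compile(source=RULE_SOURCE)                # compile the rule set
matches = rules.match(data=b"...Microsoft Windows...")  # scan sample bytes

for match in matches:
    # Each match exposes the rule name and its metadata, including the
    # severity value ("sev") the example engine reports.
    print(match.rule, match.meta.get("sev"))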

To use this example engine, copy the values from cbc_binary_toolkit_examples/engine/yara_local/conf.yaml.example to your Toolkit configuration file and replace feed_id with a real Feed ID. For reference, the example YARA configuration is below.

engine:
  name: Yara
  feed_id: example-feed-id
  type: local
  _provider: cbc_binary_toolkit_examples.engine.yara_local.yara_engine.YaraFactory
  rules_file: __file__/example_rule.yara

Many open and closed source repositories of YARA rules are available online. To make the example more useful, edit the example_rule.yara file and add more rules. You may also edit the rules_file value in the engine: section of your configuration file to use a different rules file. See Configuring YARA Rules for detailed instructions.
