AutoC

AutoC is an automated tool designed to extract and analyze Indicators of Compromise (IoCs) from open-source threat intelligence sources.

Features

  • Threat Intelligence Parsing: Parses blogs, reports, and feeds from various OSINT sources.
  • IoC Extraction: Automatically extracts IoCs such as IP addresses, domains, file hashes, and more (a minimal illustrative sketch follows this list).
  • Visualization: Displays extracted IoCs and analysis in a user-friendly interface.
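
To make the IoC types concrete, here is a minimal, self-contained sketch of regex-based indicator matching. It is illustrative only: AutoC's actual pipeline is LLM-assisted and more involved, and the patterns and record shape below are assumptions, not the tool's implementation.

# Illustrative sketch only -- not AutoC's implementation.
# Shows the kinds of indicators (IPs, domains, file hashes) the tool extracts.
import re

PATTERNS = {
    "ipv4": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
    "domain": re.compile(r"\b(?:[a-z0-9-]+\.)+[a-z]{2,}\b", re.IGNORECASE),
    "md5": re.compile(r"\b[a-f0-9]{32}\b", re.IGNORECASE),
    "sha256": re.compile(r"\b[a-f0-9]{64}\b", re.IGNORECASE),
}

def extract_iocs(text: str) -> dict:
    """Return a mapping of IoC type to the unique values found in the text."""
    return {kind: sorted(set(rx.findall(text))) for kind, rx in PATTERNS.items()}

if __name__ == "__main__":
    sample = ("The C2 at 203.0.113.7 resolved from evil-update.example.com; "
              "dropper MD5 d41d8cd98f00b204e9800998ecf8427e")
    print(extract_iocs(sample))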

Getting Started

🚀 Quick Start

The fastest way to get started with AutoC is to run it with Docker (using docker-compose).

Make sure to set up the .env file with your API keys before running the app (see the Configuration section below for more details).

git clone https://github.com/barvhaim/AutoC.git
cd AutoC
docker-compose up --build

Once the app is up and running, you can access it at http://localhost:8000
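
Optionally, you can check from another terminal that the server is responding; this assumes only that the app is served at the address above:

curl -I http://localhost:8000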

📦 Installation

  1. Install Python 3.11 or later. (https://www.python.org/downloads/)
  2. Install uv package manager (https://docs.astral.sh/uv/getting-started/installation/)
    • For Linux and macOS, you can use the following command:
      curl -LsSf https://astral.sh/uv/install.sh | sh
    • For Windows, you can use the following command:
      powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
  3. Clone the project repository and navigate to the project directory.
    git clone https://github.com/barvhaim/AutoC.git
    cd AutoC
  4. Install the required Python packages using uv.
    uv sync
  5. Configure the .env file with your API keys (See Configuration section below for more details).

🔑 Configuration

Set up API keys by adding them to the .env file (use the .env.example file as a template). You can use any of several LLM providers (IBM watsonx.ai, OpenAI, RITS, Ollama); you will configure which one to use in the next step.

cp .env.example .env

Supported LLM providers:

  • watsonx.ai by IBM ("watsonx")
  • OpenAI ("openai") - Experimental
  • RITS (IBM internal) ("rits")
  • Ollama ("ollama") - Experimental

Suggested models by provider (set LLM_PROVIDER and LLM_MODEL accordingly):

  • watsonx.ai by IBM (watsonx): meta-llama/llama-3-3-70b-instruct, ibm-granite/granite-3.1-8b-instruct
  • RITS (rits): meta-llama/llama-3-3-70b-instruct, ibm-granite/granite-3.1-8b-instruct, deepseek-ai/DeepSeek-V3
  • OpenAI (openai): gpt-4.1-nano
  • Ollama (ollama) - Experimental: granite3.2:8b
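
For example, to use watsonx.ai you would set roughly the following in .env. LLM_PROVIDER and LLM_MODEL come from the table above; the key names below are placeholders, so copy the exact variable names from .env.example:

LLM_PROVIDER=watsonx
LLM_MODEL=meta-llama/llama-3-3-70b-instruct
# Placeholder key names -- use the exact names from .env.example
WATSONX_API_KEY=<your_watsonx_api_key>
WATSONX_PROJECT_ID=<your_watsonx_project_id>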

Enhanced Blog post extraction (optional)

By default, AutoC uses a combination of the docling and beautifulsoup4 libraries to extract blog post content, fetching the page behind the scenes with the requests library.

There is an option to use Crawl4AI, which uses a headless browser to fetch the blog post content; this is more reliable but requires additional setup.

To enable Crawl4AI, you need a Crawl4AI backend server, which can be run using Docker (see the docker-compose.yml file for more details):

crawl4ai:
    image: unclecode/crawl4ai:0.6.0-r2
    container_name: crawl4ai
    restart: unless-stopped
    shm_size: 1g
    ports:
      - "11235:11235"

And then set the environment variables in the .env file to point to the Crawl4AI server:

USE_CRAWL4AI_HEADLESS_BROWSER_HTML_PARSER=true
CRAWL4AI_BASE_URL=http://localhost:11235
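
If you prefer not to bring up the full docker-compose stack, you can start just this service by name; the service name here matches the snippet above:

docker-compose up -d crawl4ai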

MITRE ATT&CK TTPs detection (optional)

AutoC can detect MITRE ATT&CK TTPs in the blog post content, which helps identify the techniques and tactics used by the threat actors. To enable MITRE ATT&CK TTP detection, set the following environment variables in the .env file:

HF_TOKEN=<your_huggingface_token>
DETECT_MITRE_TTPS_MODEL_PATH=dvir056/mitre-ttp  # Hugging Face model path for MITRE ATT&CK TTPs detection
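
As a rough illustration of how such a Hugging Face model can be used, here is a minimal classification sketch. It is not AutoC's actual detection code; the pipeline task type and the example sentence are assumptions, and only the two environment variables above come from the project:

# Illustrative sketch only -- AutoC's real TTP detection pipeline may differ.
import os
from transformers import pipeline

classifier = pipeline(
    "text-classification",                             # assumed task type for this model
    model=os.environ["DETECT_MITRE_TTPS_MODEL_PATH"],  # e.g. dvir056/mitre-ttp
    token=os.environ.get("HF_TOKEN"),                  # needed if the model is gated/private
)

print(classifier("The actor created a scheduled task to maintain persistence."))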

πŸ“ Usage

Run the AutoC tool with the following command:

uv run python cli.py extract --help    # see the available options
uv run python cli.py extract --url <blog_post_url>

πŸ§‘β€πŸ’» Bonus - Try our UI


πŸƒUp and running options:

Assuming the app .env file is configured correctly, you can run the app using one of the following options:

Running the app

To run the app locally, you'll need Node.js 20 and npm installed on your machine. We recommend using nvm to manage Node versions.

cd frontend
nvm use
npm install
npm run build

Once the build is complete, you can run the app using the following command from the root directory:

cd ..
uv run python -m uvicorn main:app --host 0.0.0.0 --port 8000 --workers 4

Once the app is up and running, you can access it at http://localhost:8000

Development

For development purposes, you can run the app in development mode using the following command:

Start the backend server:

uv run python -m uvicorn main:app --reload

Then, in a separate terminal, start the frontend development server:

cd frontend
nvm use
npm install
npm run build
npm run dev

Once the app is up and running, you can access it at http://localhost:5173

🔨 MCP tool for Claude Desktop (Experimental)


Make sure you have Claude Desktop, the uv package manager, and Python installed on your machine. Clone the project repository and navigate to the project directory.

Install the required Python packages using uv.

uv sync

Edit the Claude Desktop config file (claude_desktop_config.json) and add the following lines to the mcpServers section:

{
  "mcpServers": {
    "AutoC": {
      "command": "uv",
      "args": [
        "--directory",
        "/PATH/TO/AutoC",
        "run",
        "mcp_server.py"
      ]
    }
  }
}
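
The /PATH/TO/AutoC placeholder is the absolute path to your local clone; you can print it from inside the repository directory with:

pwd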

Restart Claude Desktop, and you should see the AutoC MCP server in the list of available MCP servers.
