Windows data

This page helps customers set up the Forward Events integration in their Anvilogic account using FluentBit on Windows.

Last updated 8 months ago

Pre-Reqs

  • Anvilogic account

  • Snowflake data repository connected to your Anvilogic account

Setting up FluentBit Config

  1. Anvilogic provides an S3 bucket and the corresponding access key ID and secret access key (note that these change for each integration) when you create a Forward Events integration in your Anvilogic deployment.

  2. Install the AWS CLI by following the steps in the AWS CLI installation guide. Once the installation is complete, run aws configure and paste in the access key ID and secret access key provided. Then validate that the credentials file has been created, usually at C:\Users\YourUsername\.aws\credentials.

  3. Once that has been validated, create a system environment variable so that FluentBit can read and use these credentials. To do so:

    1. Open the Start Menu and search for “Environment Variables.”

    2. Select Edit the system environment variables.

    3. In the System Properties window, click the Environment Variables button.

    4. Under System variables, click New.

    5. Enter the following:

      1. Variable name: AWS_SHARED_CREDENTIALS_FILE

      2. Variable value: C:\Users\YourUsername\.aws\credentials

    6. Next, configure FluentBit to read your logs and send them to S3. In this example, we ingest the Windows Event Logs. You can change which channels are collected by simply adding or removing them.

      1. Please note, bucket is the bucket name plus path.

        1. This could mean that it is sdi_custom_data-1, -2, or -3.

[INPUT]
    Name         winlog
    Channels     Security, Application, System
    Interval_Sec 1

[OUTPUT]
    Name              s3
    Match             *
    bucket            avl-raw-prod-s3-221-24243202/sdi_custom_data-1
    region            us-east-1
    use_put_object    On
    Store_dir         C:\Windows\Temp\fluent-bit\s3
    s3_key_format     /$TAG/%Y/%m/%d/%H-%M-%S

Paste the above config into your fluent-bit.conf file, typically located at C:\Program Files\fluent-bit\conf.

  • NOTE: You can also add or edit your own custom parsers for logs by editing the parsers.conf file in the same conf directory.

  • Once you have edited your fluent-bit.conf, restart the FluentBit service.
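The credential variable and restart steps above can also be scripted. A minimal sketch from an elevated Command Prompt, assuming default install paths and that the FluentBit Windows service is named fluent-bit (verify the service name on your host):

```shell
:: Point FluentBit at the AWS shared credentials file (machine-wide; requires an elevated prompt)
setx AWS_SHARED_CREDENTIALS_FILE "C:\Users\YourUsername\.aws\credentials" /M

:: Validate the configuration without starting ingestion
"C:\Program Files\fluent-bit\bin\fluent-bit.exe" --dry-run -c "C:\Program Files\fluent-bit\conf\fluent-bit.conf"

:: Restart the service so the new config and variable take effect
net stop fluent-bit
net start fluent-bit
```

Note that setx only affects processes started after it runs, which is why the service restart comes last.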

  4. You can now confirm that data has landed in your Snowflake account.
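For example, to also collect Sysmon events alongside the channels shown above, extend the Channels list in the [INPUT] section (the Microsoft-Windows-Sysmon/Operational channel name assumes a standard Sysmon installation):

```
[INPUT]
    Name         winlog
    Channels     Security, Application, System, Microsoft-Windows-Sysmon/Operational
    Interval_Sec 1
```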

Please update the [INPUT] section of this example config to fit your exact needs.

Before starting, make sure you have:

  • AWS CLI installed

  • FluentBit installed