
Syslog data

This page helps customers use FluentBit with the Forward Events integration in their Anvilogic account to send syslog data to their Snowflake data repository.


Pre-Reqs

  • Anvilogic account

  • Snowflake data repository connected to your Anvilogic account

  • FluentBit installed

Setting up FluentBit Config

  1. When you create a Forward Events integration in your Anvilogic deployment, Anvilogic provides an S3 bucket and the corresponding access key ID and secret access key (note that these change for each integration).

  2. Create a credentials file on the machine that FluentBit can read from, for example /home/<username>/creds. Inside the file, paste the following config with your specific access key ID and secret access key:

[default]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
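
Because this file holds a live AWS secret, you may want to restrict its permissions so only the account running FluentBit can read it (a general precaution, not an Anvilogic requirement):

chmod 600 /home/<username>/creds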
  3. With the credentials saved in the /home/<username>/creds file, configure the Fluent Bit service config file to set the path to this credentials file. Fire up your favorite text editor, open the fluent-bit.service file located at /usr/lib/systemd/system/fluent-bit.service, and add the following line to the [Service] section:

    Environment="AWS_SHARED_CREDENTIALS_FILE=/home/<username>/creds"
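
If you'd rather not edit the packaged unit file directly (a package upgrade can overwrite it), a systemd drop-in override achieves the same thing; this is standard systemd behavior, not anything Anvilogic-specific:

sudo systemctl edit fluent-bit

This opens an override file (typically /etc/systemd/system/fluent-bit.service.d/override.conf) where you can add the same [Service] and Environment= lines.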

  4. Then run the following commands in a terminal window to reload systemd and start the service:

    1. sudo systemctl daemon-reload

    2. sudo systemctl start fluent-bit
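
To confirm the service came up cleanly, the standard systemd tooling works here:

sudo systemctl status fluent-bit
sudo journalctl -u fluent-bit -f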

  5. Next, configure FluentBit to read your logs and send them to S3. In this example, logs are received via syslog over UDP and forwarded to S3:

[INPUT]
    Name              syslog
    Mode              udp
    Listen            0.0.0.0
    Port              1515
    Parser            syslog-rfc3164
    Mem_Buf_Limit     10MB

[OUTPUT]
    Name              s3
    Match             *
    bucket            avl-raw-prod-s3-221-24243202/sdi_custom_data-0
    region            us-east-1
    use_put_object    On
    Store_dir         /tmp/fluent-bit/s3
    s3_key_format     /$TAG/%Y/%m/%d/%H/%M/%S

Once you have pasted the above config into your fluent-bit.conf file (typically located at /etc/fluent-bit/fluent-bit.conf):

  • NOTE: You can also edit or add your own custom parsers for logs by editing the parsers.conf file in /etc/fluent-bit/ (see the parser sketch after this list).

  • Once you have edited your fluent-bit.conf, restart the FluentBit service: sudo systemctl restart fluent-bit

    • You can validate that your config is working by heading to /tmp/fluent-bit/s3/ and looking inside that folder.
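
For reference, custom parsers in parsers.conf use Fluent Bit's [PARSER] block format. Here is a minimal sketch; the name, regex, and time format are placeholders rather than values Anvilogic provides:

[PARSER]
    Name        my_custom_syslog
    Format      regex
    Regex       ^(?<time>[A-Za-z]+ +[0-9]+ [0-9:]+) (?<host>[^ ]+) (?<message>.*)$
    Time_Key    time
    Time_Format %b %d %H:%M:%S

Reference the parser from your [INPUT] section with Parser my_custom_syslog, and make sure your [SERVICE] section loads the file via the parsers_file setting.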

  6. You can now confirm that data has landed in your Snowflake account.
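
If you need a test event to look for, the util-linux logger command can send a message over UDP straight to the syslog input above (adjust the host and port to match your [INPUT] section):

logger -n 127.0.0.1 -P 1515 -d "fluent-bit forward events test"

A buffered copy should appear under /tmp/fluent-bit/s3/ shortly, and the event should land in Snowflake once FluentBit uploads the file to S3.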

Please update the [INPUT] section of this example config to fit your exact needs.
