Syslog data
This page is designed to help customers leverage the Forward Events integration in their Anvilogic account using Fluent Bit.
Prerequisites:
Anvilogic account
Snowflake data repository connected to your Anvilogic account
Anvilogic will provide an S3 bucket and the corresponding access keys/IDs (note that these change for each integration) when you create a Forward Events integration in your Anvilogic deployment.
Create a credential file on the machine that Fluent Bit can read from, for example /home/<username>/creds. Inside the file, paste the following config with your specific access key/ID:
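A minimal sketch of that file, assuming the standard AWS shared credentials format; replace the placeholder values with the access key ID and secret key provided by Anvilogic:

[default]
# Credentials supplied by Anvilogic for this Forward Events integration
aws_access_key_id = <your_access_key_id>
aws_secret_access_key = <your_secret_access_key>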
Since our credentials are already stored in the /home/<username>/creds file, we need to configure the service config file for Fluent Bit and set the path to this credential file. To do that, fire up your favorite text editor, edit the fluent-bit.service file located at /usr/lib/systemd/system/fluent-bit.service, and add the following line to the [Service] section:
Environment="AWS_SHARED_CREDENTIALS_FILE=/home/<username>/creds"
Then run the following commands in a terminal window to reload systemd and start the Fluent Bit service:
sudo systemctl daemon-reload
sudo systemctl start fluent-bit
Next we need to configure Fluent Bit to read our logs and send them to S3. In this example, logs are received over Syslog and forwarded to S3.
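Below is a minimal sketch of such a config, assuming the built-in syslog input and s3 output plugins; the listen address, port, parser, bucket name, and region are placeholders to replace with your own values and the bucket details provided by Anvilogic:

[SERVICE]
    # Load the parser definitions used by the syslog input below
    parsers_file parsers.conf

[INPUT]
    # Receive syslog messages over UDP
    name    syslog
    mode    udp
    listen  0.0.0.0
    port    5140
    parser  syslog-rfc3164

[OUTPUT]
    # Ship all matched records to the Anvilogic-provided S3 bucket
    name             s3
    match            *
    bucket           <anvilogic-provided-bucket>
    region           <bucket-region>
    store_dir        /tmp/fluent-bit/s3
    total_file_size  10M
    upload_timeout   5m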
Once you have pasted the above config into your fluent-bit.conf file (typically located at /etc/fluent-bit/fluent-bit.conf), update the placeholder values to match your environment.
NOTE: You can also edit or add any of your own custom parsers for logs by editing the parsers.conf file at /etc/fluent-bit/parsers.conf.
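As an illustration, a custom parser entry in parsers.conf might look like the following; the parser name, regex, and time format are hypothetical and would need to match your actual log format:

[PARSER]
    # Hypothetical parser for logs shaped like "<timestamp> <level> <message>"
    name        my_custom_app
    format      regex
    regex       ^(?<time>[^ ]+) (?<level>[^ ]+) (?<message>.*)$
    time_key    time
    time_format %Y-%m-%dT%H:%M:%S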
Once you have edited your fluent-bit.conf, please restart the Fluent Bit service:
sudo systemctl restart fluent-bit
You can validate that your config is working by heading to /tmp/fluent-bit/s3/ and looking inside that folder.
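For example, a quick directory listing should show buffered files and directories while data is flowing (the exact names will vary):

ls -lh /tmp/fluent-bit/s3/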
You can now confirm that data has landed in your Snowflake account.
Please update the input section of this example config to fit your exact needs.