Anvilogic implementation with Azure (Data Explorer, Log Analytics, and Fabric).
Architecture Diagram
Below is the generic architecture diagram for how Anvilogic works on top of Azure.
It supports Azure Log Analytics, Azure Data Explorer (ADX), and Microsoft Fabric workspaces.
The Anvilogic integration can only be installed in one Azure tenant.
Currently, we do not support querying a Log Analytics (LA) workspace in a different tenant than the Anvilogic Azure Data Explorer cluster; LA can only be queried if it is in the same Azure tenant that Anvilogic is installed in.
We do support querying ADX clusters across multiple tenants, even if those clusters are in different tenants than Anvilogic's resource group.
Frequently Asked Questions (FAQs)
What gets installed in my Azure environment?
The following infrastructure will be created in the resource group you create for Anvilogic; it is deployed through our automated installation:
User-assigned managed identity (grants permissions to access the Key Vault and ADX tables)
Azure Key Vault
ADX cluster, database, and tables
Azure Container App environment, jobs, and instance
Log Analytics workspace for the Container App
Azure Container App registry & cache
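If you want to verify what the installation created, a minimal sketch (our illustration, not part of the Anvilogic installer) using the Azure SDK for Python can list everything in the resource group. The subscription ID and resource group name below are placeholders.

```python
# Illustrative sketch: list every resource in the Anvilogic resource group.
# Requires the azure-identity and azure-mgmt-resource packages and an
# authenticated Azure context (e.g. "az login").
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

SUBSCRIPTION_ID = "<your-subscription-id>"   # placeholder
RESOURCE_GROUP = "anvilogic-rg"              # placeholder resource group name

client = ResourceManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Print each resource's type and name (e.g. Microsoft.Kusto/Clusters,
# Microsoft.App/containerApps, Microsoft.KeyVault/vaults, ...).
for resource in client.resources.list_by_resource_group(RESOURCE_GROUP):
    print(f"{resource.type:60} {resource.name}")
```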
Does Anvilogic's integration incur any Azure costs?
Yes, there are costs associated with running the Anvilogic Resource Group in Azure, specifically on the Azure Data Explorer hosting cluster. These costs depend on the compute required to execute detections within your environment.
The average cost of running Anvilogic's required infrastructure in Azure ranges from $12,000 to $25,000 per year.
Please review your billing configuration for the Azure Kubernetes Service (AKS) and Azure Data Explorer (ADX) pricing tiers, which control cluster management, to ensure the scaling configuration and cost expectations for the Anvilogic service are correct.
What costs money?
During the setup process, a VM is created to manage the Data Explorer cluster. The default size used by our automated installation is Standard_E8ads_v5 (Medium, 8 vCPUs).
Calculate Costs:
Visit the Azure pricing calculator and type "azure data explorer" under Products.
In the Instance section, type "E8ads".
The pay-as-you-go list price for a Standard_E8ads_v5 VM is $1.40 per hour per VM in a production cluster with SLA (2 VMs are required in a production cluster with SLA).
If you are running a Standard_E8ads_v5 cluster in production with SLA, it comes with 2 instances, so 2 instances at $1.40 per hour is approximately $24,528 per year.
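To make the arithmetic explicit, here is a small back-of-the-envelope calculation that reproduces the figure above. The hourly rate is the list price quoted here; actual pricing varies by region, reservation, and contract.

```python
# Reproduce the yearly compute cost quoted above for a production-with-SLA cluster.
hourly_rate_usd = 1.40      # per VM, pay-as-you-go list price for Standard_E8ads_v5
vm_count = 2                # production-with-SLA clusters require 2 VMs
hours_per_year = 24 * 365   # 8,760 hours

annual_cost = hourly_rate_usd * vm_count * hours_per_year
print(f"Estimated annual compute cost: ${annual_cost:,.0f}")  # -> $24,528
```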
Azure Data Explorer offers two cluster configurations:
Production (with SLA): Production clusters contain at least two nodes for the engine cluster and at least two nodes for the data management cluster. These clusters operate under the Azure Data Explorer SLA.
Dev/Test (no SLA): Dev/Test clusters contain a single node for the engine cluster and a single node for the data management cluster. This is the lowest-cost configuration because of its low instance count. There is no redundancy or SLA for this cluster configuration.
What permissions do I need to create/use to install Anvilogic’s Azure integration?
You will be creating the following:
Create new App registration for Anvilogic
Create a new secret in the new App that was created in Step 1
Create an Anvilogic resource group
Go through the integration setup on the Anvilogic Platform
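As a rough illustration of how the app registration and client secret from the first two steps are used, the sketch below authenticates with those credentials and requests a token for an ADX cluster. The IDs and cluster URI are placeholders; this is not the Anvilogic installer, only an example of the credential flow.

```python
# Illustrative only: authenticate with an app registration + client secret and
# request a token scoped to a (hypothetical) ADX cluster. Anvilogic's backend
# performs an equivalent authentication step when it queries your data.
from azure.identity import ClientSecretCredential

TENANT_ID = "<azure-tenant-id>"                   # placeholder
CLIENT_ID = "<anvilogic-app-registration-id>"     # placeholder
CLIENT_SECRET = "<app-registration-secret>"       # placeholder

credential = ClientSecretCredential(TENANT_ID, CLIENT_ID, CLIENT_SECRET)

token = credential.get_token(
    "https://anvilogic-example.eastus.kusto.windows.net/.default"  # placeholder cluster
)
print("token acquired, expires at:", token.expires_on)
```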
Can you query Data Explorer, Log Analytics, and Fabric?
Yes, Anvilogic supports searching and running detections against any data source inside of an Azure LA workspace, ADX cluster, or Fabric workspace.
You will need to give the Anvilogic app service principal permissions to query any of the ADX clusters, LA workspaces, or Fabric workspaces you want Anvilogic searches and detections to use (a sketch of one way to grant this follows below).
For Microsoft Fabric, you need to create a workspace, leverage an eventstream under Real-Time Intelligence, and the destination of the eventstream MUST be a KQL database. The Anvilogic app service principal will then be able to query that KQL database.
Currently, we do not support querying a Log Analytics Workspace in a different tenant than the Anvilogic Azure Data Explorer Cluster.
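One way to grant that access on an ADX database is shown in the sketch below, which runs a Kusto management command through the azure-kusto-data Python SDK. The cluster URI, database name, and IDs are placeholders, and your organization may prefer to assign the permission from the Azure portal or the ADX web UI instead.

```python
# Hedged sketch: grant an AAD application (the Anvilogic app service principal)
# database-level read ("viewers") access on an ADX database via a management command.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

CLUSTER_URI = "https://yourcluster.eastus.kusto.windows.net"   # placeholder
DATABASE = "SecurityLogs"                                      # placeholder

kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(CLUSTER_URI)
client = KustoClient(kcsb)

grant_cmd = (
    ".add database ['SecurityLogs'] viewers "
    "('aadapp=<anvilogic-app-id>;<azure-tenant-id>') 'Anvilogic app service principal'"
)
client.execute_mgmt(DATABASE, grant_cmd)
```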
How does the Anvilogic platform query our LA, ADX, or Fabric Clusters?
We connect into your ADX cluster and then use Microsoft's query capabilities to initiate a query against any other LA workspace, ADX cluster, or Fabric workspace that our app service principal has access to.
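As a minimal sketch of what such a query can look like (assuming KQL's cross-cluster cluster()/database() functions and the azure-kusto-data Python SDK; all cluster names, database names, tables, and credentials below are placeholders):

```python
# Connect to the Anvilogic ADX cluster and run a KQL query that reaches into a
# different cluster the app service principal has been granted access to.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

ANVILOGIC_CLUSTER = "https://anvilogic-example.eastus.kusto.windows.net"  # placeholder
ANVILOGIC_DB = "anvilogic"                                                # placeholder

kcsb = KustoConnectionStringBuilder.with_aad_application_key_authentication(
    ANVILOGIC_CLUSTER,
    "<anvilogic-app-id>", "<app-secret>", "<azure-tenant-id>",
)
client = KustoClient(kcsb)

# Cross-cluster query: read a table that lives in another ADX cluster.
query = """
cluster('othercluster.westus.kusto.windows.net').database('SecurityLogs').SigninLogs
| where TimeGenerated > ago(1h)
| take 10
"""
for row in client.execute(ANVILOGIC_DB, query).primary_results[0]:
    print(row)
```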
What if Azure isn't my primary SIEM and I have a hybrid set up?
Since Anvilogic supports multiple SIEMs/data lakes, you can configure all of the events of interest (EOIs) generated from detection queries to also write a copy back to your primary alert lake or EOI data store. That can be located in any of the other supported platforms (e.g., Splunk, Snowflake).
For example, if Splunk is your primary SIEM, you can configure all of your Azure detection results to also send a copy of the event of interest (EOI) back to the Anvilogic index in Splunk. The Anvilogic platform handles all of this EOI routing for you.
Can you help bring alert data into Azure for us?
Yes, Anvilogic can help retrieve alerts/signals from SaaS security tools (ex. Proofpoint, Wiz, Crowdstrike, etc.) and can ingest those into the Anvilogic table in ADX for correlation.
Can you help bring raw data into Azure for us?
No, Anvilogic does not support raw data ingestion into ADX, LA, or Fabric. Data must already be present in those environments.
Anvilogic only supports raw data ingestion for Azure Snowflake.
Do you provide parsers for un-normalized data?
Yes, we provide hundreds of out-of-the-box parsers that can be used to normalize your security data inside of ADX, LA, or Fabric.
What is the Anvilogic Alert Table in ADX?
The Anvilogic Alert table in ADX stores the output from all detections running within the Container App environment.
This is a fully normalized set of signals that we call “events of interest” that can be used to escalate activity to your SOAR or can be used as a hunting index to create Threat Scenario correlations.
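For instance, a sketch of using the alert table as a hunting index might look like the following. The table name, column names, and cluster URI are hypothetical placeholders, since the actual table is the one you specify during setup.

```python
# Hypothetical hunting query: count recent events of interest per detection.
# "anvilogic_alerts", "ingest_time", and "detection_name" are placeholder names.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(
    "https://yourcluster.eastus.kusto.windows.net"   # placeholder cluster URI
)
client = KustoClient(kcsb)

query = """
anvilogic_alerts
| where ingest_time > ago(24h)
| summarize eoi_count = count() by detection_name
| top 20 by eoi_count
"""
for row in client.execute("anvilogic", query).primary_results[0]:
    print(row)
```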
Do you collect the alerts stored in the Anvilogic Alert Table in ADX?
Alerts are stored inside your Azure environment, in the ADX table you specify during setup.
The Anvilogic AI-Insights packages (e.g., Hunting, Tuning, Health) require a copy of these events to be collected and stored by Anvilogic. If enabled, a copy of those events will be collected into Anvilogic.
Do you integrate with SOAR?
Yes, Anvilogic can integrate with most SOARs via REST API through either a push or a pull method.