ELK4QRadar
This project was created to help SOC/MSSP teams that use QRadar SIEM with multiple clients collect and centralize monitoring statistics from all of their QRadar deployments.
The project is available on GitHub.
Guide
In this repository we provide an index template that you can install in your Elastic Stack:
PUT _template/<YOUR_TEMPLATE_NAME>
Populate the YAML files in /etc/logstash with the appropriate data for your context. We provide samples in this project:
timezone.yml: Contains a dictionary of client names and their corresponding timezones.
clientnames.yml: Contains a dictionary of input configuration tags and their corresponding client names.
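As an illustration, the two dictionaries might contain entries like these (the client names, tags, and timezones are hypothetical sample values, not the project's data):

```yaml
# timezone.yml: a client name mapped to its timezone (sample values)
"ClientA": "Europe/Paris"
"ClientB": "America/New_York"
```

```yaml
# clientnames.yml: an input configuration tag mapped to a client name (sample values)
"clienta_offenses": "ClientA"
"clientb_offenses": "ClientB"
```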
Copy the conf.d configuration into your Logstash conf.d folder and customize it to your needs.
Create a /home/USER/Offenses/ folder to save the extracted search data from QRadar in CSV format. Then create the following scripted fields in Kibana.
Metrics samples
Busiest Day
Busiest Hour
Offenses average by day of week
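As one example of what such a scripted field can look like, a Painless expression deriving the day of week from an event date could back the Busiest Day metric; the field name timestamp below is an assumption, not the project's actual field:

```painless
// Hypothetical scripted field "day_of_week": assumes a date field named "timestamp"
doc['timestamp'].value.getDayOfWeekEnum().getDisplayName(TextStyle.FULL, Locale.ROOT)
```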
Index Template
I created a custom template for this use case:
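The actual template ships with this repository; the sketch below only illustrates the general shape of such a template (the index pattern and field names are assumptions, not the project's real mapping). It would be installed with the PUT _template request shown in the Guide:

```json
{
  "index_patterns": ["qradar-*"],
  "settings": {
    "number_of_shards": 1
  },
  "mappings": {
    "properties": {
      "client_name":   { "type": "keyword" },
      "offense_count": { "type": "long" },
      "starttime":     { "type": "date" }
    }
  }
}
```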
Logstash Configuration Files
This part of the project contains the Logstash configuration files that process and parse the CSV files saved by the Python script in /home/elk/Offenses. Notice here that I am storing my AQL search results in the Offenses folder under the elk user's home folder.
PS: Please see the index template definition for a basic understanding of the fields used in this project.
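The extraction script itself lives elsewhere in the repository; as a hedged sketch of the CSV-saving step it performs (the function, column names, and default path are assumptions, not the script's actual code):

```python
import csv
from pathlib import Path

# Hypothetical helper: persist rows returned by a QRadar AQL search to a
# per-client CSV file under the Offenses folder that Logstash watches.
def save_offenses_csv(rows, client, offenses_dir="/home/elk/Offenses"):
    out_dir = Path(offenses_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    out_file = out_dir / f"{client}_offenses.csv"
    if not rows:
        return out_file
    with out_file.open("w", newline="") as fh:
        # Column order follows the keys of the first result row
        writer = csv.DictWriter(fh, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)
    return out_file
```

The file name embeds the client name so that one input configuration per client can pick it up.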
Logstash pipelines are organized in three parts:
Input configurations: Make an input configuration for each file you want to ingest into Elasticsearch.
Example :
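The sketch below is illustrative, not the repository's actual file: a file input tailing one client's CSV and tagging the events so the filter stage can resolve the client name from clientnames.yml (path and tag are hypothetical):

```conf
input {
  file {
    # Hypothetical path; one input block per client CSV
    path => "/home/elk/Offenses/clienta_offenses.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    tags => ["clienta_offenses"]
  }
}
```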
Filter configuration : For processing and enriching the incoming data and normalizing event fields.
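A hedged sketch of what such a filter stage can look like, combining the csv filter with the translate plugin to map the input tag to a client name (column names are assumptions; recent versions of the translate plugin use source/target, older ones field/destination):

```conf
filter {
  csv {
    # Hypothetical columns; align these with your AQL SELECT clause
    columns => ["starttime", "offensecount", "magnitude"]
    separator => ","
  }
  translate {
    source => "[tags][0]"
    target => "client_name"
    dictionary_path => "/etc/logstash/clientnames.yml"
  }
}
```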
Output configuration : Used for sending data to Elasticsearch.
Example :
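Again a hedged sketch rather than the repository's actual file (host and index name are placeholders; choose an index name that matches your template's index_patterns):

```conf
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "qradar-offenses-%{+YYYY.MM.dd}"
  }
}
```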