ELK4QRadar
This project was created to help SOC/MSSP teams that run QRadar SIEM for multiple clients collect and centralize monitoring statistics from all of their QRadar deployments.

1. Load the index template provided in this repository into your Elastic Stack:

   ```
   PUT _index_template/<YOUR_TEMPLATE_NAME>
   ```

   The template body below uses the composable index template format, so it is loaded through the `_index_template` endpoint.
2. Populate the YAML files in `/etc/logstash` with the data appropriate to your context. We provide samples in this project (a hypothetical excerpt is sketched after this list):
   - `clientnames.yml`: contains a dictionary of input configuration tags and their corresponding client names
3. Copy the `conf.d` configuration into your Logstash `conf.d` folder and customize it to your needs.
4. Create a `/home/USER/Offenses/` folder to save the search data extracted from QRadar as CSV.
5. Create the following scripted fields in Kibana:
| Name | Lang | Script | Format |
| --- | --- | --- | --- |
| offense.day_of_week | painless | doc['@timestamp'].value.dayOfWeekEnum | String |
| offense.hour_of_day | painless | doc['@timestamp'].value.hourOfDay | Number |
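As referenced in step 2, here is a hypothetical excerpt of `clientnames.yml`; the tags and client names are placeholders to replace with your own:

```yaml
# Hypothetical entries: each key is a tag set in a Logstash input,
# each value is the client name to record in [client][name]
"CLIENT_A_TAG": "Client A"
"CLIENT_B_TAG": "Client B"
```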
These scripted fields power visualizations such as:

- Busiest Day
- Busiest Hour
- Offenses average by day of week
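Before creating these scripted fields, you can sanity-check the painless expressions by running them as `script_fields` in a search against an index that already holds data; a minimal verification sketch:

```
GET soc-statistics-offenses-*/_search
{
  "size": 1,
  "script_fields": {
    "offense.day_of_week": {
      "script": {
        "lang": "painless",
        "source": "doc['@timestamp'].value.dayOfWeekEnum"
      }
    },
    "offense.hour_of_day": {
      "script": {
        "lang": "painless",
        "source": "doc['@timestamp'].value.hourOfDay"
      }
    }
  }
}
```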

I created a custom template for this use case:

```json
{
"index_patterns": [
"soc-statistics-offenses-*"
],
"template": {
"settings": {
"number_of_shards": 1,
"number_of_replicas": 0
},
"mappings": {
"dynamic_templates": [
{
"strings_as_keyword": {
"mapping": {
"ignore_above": 1024,
"type": "keyword"
},
"match_mapping_type": "string"
}
}
],
"properties": {
"@timestamp": {
"type": "date"
},
"offense": {
"properties": {
"owner": {
"type": "keyword"
},
"note": {
"properties": {
"date": {
"format": "MMM d, yyyy, h:mm:ss a",
"type": "date"
}
}
},
"storagetime": {
"format": "yyyy-MM-dd hh:mm:ss a",
"type": "date"
},
"assigned": {
"properties": {
"date": {
"format": "MMM d, yyyy, h:mm:ss a",
"type": "date"
}
}
},
"id": {
"type": "keyword"
},
"starttime": {
"format": "yyyy-MM-dd hh:mm:ss a",
"type": "date"
},
"logsourcetime": {
"format": "yyyy-MM-dd hh:mm:ss a",
"type": "date"
},
"close": {
"properties": {
"date": {
"format": "MMM d, yyyy, h:mm:ss a",
"type": "date"
},
"reason": {
"type": "text"
},
"analyst": {
"type": "keyword"
}
}
},
"hour_of_day": {
"type": "keyword"
},
"status": {
"type": "keyword"
},
"day_of_week": {
"type": "keyword"
}
}
},
"domain": {
"properties": {
"name": {
"type": "keyword"
}
}
},
"rule": {
"properties": {
"severity": {
"type": "keyword"
},
"risk_score": {
"type": "keyword"
},
"name": {
"type": "keyword"
},
"threat": {
"properties": {
"technique": {
"properties": {
"name": {
"type": "keyword"
},
"id": {
"type": "keyword"
}
}
},
"tactic": {
"properties": {
"name": {
"type": "keyword"
},
"id": {
"type": "keyword"
}
}
}
}
},
"category": {
"type": "keyword"
},
"class": {
"type": "keyword"
}
}
},
"client": {
"properties": {
"name": {
"type": "keyword"
}
}
},
"analyst": {
"type": "nested",
"properties": {
"notes": {
"type": "text"
},
"username": {
"type": "keyword"
}
}
},
"event": {
"properties": {
"timezone": {
"type": "keyword"
},
"name": {
"type": "keyword"
}
}
},
"tags": {
"type": "keyword"
}
}
}
}
}
```
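For orientation, here is a hypothetical document shaped to match this mapping (all values are invented for illustration; note the custom date formats on `offense.starttime` and `offense.close.date`):

```json
POST soc-statistics-offenses-client_a-2024.01/_doc
{
  "@timestamp": "2024-01-15T09:30:00Z",
  "client": { "name": "client_a" },
  "offense": {
    "id": "4242",
    "status": "CLOSED",
    "starttime": "2024-01-15 09:30:00 AM",
    "close": {
      "date": "Jan 15, 2024, 11:05:12 AM",
      "reason": "False positive",
      "analyst": "jdoe"
    }
  },
  "rule": {
    "name": "Excessive Firewall Denies",
    "severity": "5"
  },
  "tags": ["CLIENT_A_TAG"]
}
```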
This part of the project contains the Logstash configuration files that process and parse the CSV files saved by the Python script in `/home/elk/Offenses`. Notice that here I am storing my AQL search results in the `Offenses` folder under the `elk` user's home folder.

PS: Please see the index template definition above for a basic understanding of the fields used in this project.

Logstash pipelines are organized in three parts:
- Input configurations: Create an input configuration for each file you want to ingest into Elasticsearch. Example:

  ```
  input {
    file {
      path => "/home/<USER>/<FOLDER NAME>/<FILENAME>.csv"
      start_position => "beginning"
      tags => "<MY_CLIENT>"
      type => "OFFENSES"
    }
  }
  ```
- Filter configurations: Process and enrich the incoming data and normalize event fields; a hedged filter sketch follows this list.
- Output configurations: Send the data to Elasticsearch. Example:

  ```
  output {
    if [type] == "OFFENSES" {
      elasticsearch {
        hosts => ["https://localhost:9200"]
        index => "soc-statistics-offenses-%{[client][name]}-%{+yyyy.MM}"
        #manage_template => false
        cacert => "/etc/logstash/root-ca.pem"
        user => "<USERNAME>"
        password => "<PASSWORD>"
        ssl => true
        ssl_certificate_verification => false
      }
    }
  }
  ```
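As referenced above, a minimal filter sketch. It assumes the `logstash-filter-translate` plugin is installed (with the recent `source`/`target` option names) and that the CSV column names match your AQL export; both are assumptions to adapt:

```
filter {
  if [type] == "OFFENSES" {
    # Parse the CSV line; the column names here are hypothetical
    # and must match the columns of your exported AQL search
    csv {
      separator => ","
      skip_header => true
      columns => ["id", "status", "starttime", "owner"]
    }
    # Resolve the input tag to a client name via the clientnames.yml dictionary
    translate {
      source => "[tags][0]"
      target => "[client][name]"
      dictionary_path => "/etc/logstash/clientnames.yml"
    }
    # Move parsed columns into the offense.* fields defined by the index template
    mutate {
      rename => {
        "id" => "[offense][id]"
        "status" => "[offense][status]"
        "starttime" => "[offense][starttime]"
        "owner" => "[offense][owner]"
      }
    }
  }
}
```

Since the output's `index` option interpolates `%{[client][name]}`, this enrichment is what fans events out into one monthly index per client; `GET _cat/indices/soc-statistics-offenses-*?v` is a quick way to confirm the indices are being created.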