{"id":71432,"date":"2025-04-09T11:38:48","date_gmt":"2025-04-09T06:08:48","guid":{"rendered":"https:\/\/www.tothenew.com\/blog\/?p=71432"},"modified":"2025-04-22T10:50:32","modified_gmt":"2025-04-22T05:20:32","slug":"setting-up-elastalert2-for-real-time-alerting-on-elasticsearch-indices","status":"publish","type":"post","link":"https:\/\/www.tothenew.com\/blog\/setting-up-elastalert2-for-real-time-alerting-on-elasticsearch-indices\/","title":{"rendered":"Mastering Real-Time Alerting with ElastAlert2: Detecting DOS Attacks from WAF Logs"},"content":{"rendered":"<h2>Introduction<\/h2>\n<p>ElastAlert 2 is a simple framework for alerting on anomalies, spikes, and other patterns of interest in data from Elasticsearch and OpenSearch.<\/p>\n<p>ElastAlert 2 is a tool for monitoring real-time data in Elasticsearch and alerting on matching patterns.<\/p>\n<p><strong>Elastalert accepts this Alert type:<\/strong><\/p>\n<ol>\n<li>Email<\/li>\n<li>AWS SES (Amazon Simple Email Service)<\/li>\n<li>AWS SNS (Amazon Simple Notification Service)<\/li>\n<li>Chatwork Command<\/li>\n<li>Datadog<\/li>\n<li>GoogleChat<\/li>\n<li>Jira Slack<\/li>\n<li>Telegram<\/li>\n<\/ol>\n<p>It is a powerful monitoring tool, but it demands equal amounts of effort to realize its full potential.<\/p>\n<h1><\/h1>\n<h2><strong>Use Case: Real-Time DOS Attack Detection on WAF Logs with ElastAlert2<\/strong><\/h2>\n<p>In a recent project, I used ElastAlert2 to monitor and detect Denial of Service (DoS) attacks by monitoring WAF data stored in Elasticsearch. The purpose was to send email alerts in near real time when suspicious traffic patterns indicating a DOS assault were observed.<\/p>\n<h3>Challenge<\/h3>\n<p>While ElastAlert2 is an effective framework for anomaly detection and alerting on Elasticsearch\/OpenSearch data, the setup process might be difficult. 
The documentation is minimal, and the configuration requires careful attention to detail, especially when custom pipelines and email notifications are involved.<\/p>\n<h3>Solution<\/h3>\n<p>Here&#8217;s how I structured the solution:<\/p>\n<p>1. Alerting mechanism<br \/>\nThe alert type was set to email (SMTP). When the alert condition is met (for example, a DoS pattern is detected in the logs), an email is automatically sent to the incident response team with full details.<\/p>\n<p>2. Rule type: frequency<br \/>\nI developed a frequency rule that fires when at least one DoS-related log event appears within a 60-minute timeframe. The rule uses wildcard filters to capture variations such as DOS* and Behavioral* in the dos_attack_name field.<\/p>\n<p>3. Pipeline for parsing nested JSON logs<br \/>\nBecause the essential log data was nested within the _source field and stored as raw JSON in event.original, I built a custom ingest pipeline in Elasticsearch to convert the raw JSON into structured fields. This enabled ElastAlert2 to query and evaluate the actual log content.<\/p>\n<p>4. Testing before deployment<br \/>\nBefore running the alert as a service, I used elastalert-test-rule to validate the configuration and rule logic, which helped me catch syntax errors early.<\/p>\n<p>5. Service integration<br \/>\nFinally, I set up ElastAlert2 as a systemd service, ensuring that it runs continuously in the background and starts automatically on reboot.<\/p>\n<h3>Key Takeaways<\/h3>\n<p>ElastAlert2 is great for custom alerting but has a steep setup curve. JSON parsing through Elasticsearch ingest pipelines is essential when logs are nested. 
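<\/p>\n<p>Before wiring an ingest pipeline into production, it can be dry-run with Elasticsearch&#8217;s _simulate API. A sketch (the pipeline id matches the one built later in this post; the sample document value is illustrative, not taken from production data):<\/p>\n<pre>POST _ingest\/pipeline\/parse_json\/_simulate\r\n{\r\n  \"docs\": [\r\n    {\r\n      \"_source\": {\r\n        \"event\": {\r\n          \"original\": \"{\\\"dos_attack_name\\\": \\\"DOS Flood\\\"}\"\r\n        }\r\n      }\r\n    }\r\n  ]\r\n}<\/pre>\n<p>The response shows each document as it would look after the pipeline runs, so field-mapping mistakes surface before any rule depends on them.<\/p>\n<p>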
Always validate your rules with elastalert-test-rule before deploying to production.<\/p>\n<h2>Prerequisites<\/h2>\n<ul>\n<li>\n<h3>Python &gt;= 3.9<\/h3>\n<\/li>\n<\/ul>\n<pre><em>sudo yum groupinstall \"Development Tools\"<\/em>\r\n<em>sudo yum install libffi-devel<\/em>\r\n<em>sudo yum install python3-devel<\/em><\/pre>\n<p><em><strong>Please note that the commands above are for Red Hat-based Linux systems. Here are the equivalents for Ubuntu-based systems:<\/strong><\/em><\/p>\n<pre><em>sudo apt-get install build-essential<\/em>\r\n<em>sudo apt-get install libffi-dev<\/em>\r\n<em>sudo apt-get install python3-dev<\/em><\/pre>\n<ul>\n<li>\n<h3>ElastAlert2 Installation:<\/h3>\n<\/li>\n<\/ul>\n<pre><em>git clone https:\/\/github.com\/jertel\/elastalert2.git<\/em>\r\n<em>cd elastalert2<\/em>\r\n<em>python3 setup.py install<\/em>\r\n\r\n<em>elastalert-create-index<\/em>\r\n<em>New index name (Default elastalert_status)<\/em>\r\n<em>Name of existing index to copy (Default None)<\/em>\r\n<em>New index elastalert_status created<\/em>\r\n<em>Done!<\/em><\/pre>\n<h2>ElastAlert2 Configuration Setup:<\/h2>\n<p>ElastAlert2 relies on several configuration settings; you can keep them all in one file or split them into separate files by purpose. 
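<\/p>\n<p>For orientation, here is the directory layout this walkthrough ends up with (assembled from the paths used throughout this post):<\/p>\n<pre>\/root\/elastalert2\/examples\/\r\n├── config.yaml                 # main configuration\r\n├── smtp_auth_file.yaml         # SMTP credentials\r\n└── rules\/\r\n    └── example_frequency.yaml  # the DoS detection rule<\/pre>\n<p>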
For example, I split my configuration into two parts:<\/p>\n<ol>\n<li>Main Configuration<\/li>\n<li>Rules Configuration<\/li>\n<\/ol>\n<h4><strong>Main Config:<\/strong><\/h4>\n<p>Path: \/root\/elastalert2\/examples<\/p>\n<pre># This is the folder that contains the rule yaml files.\r\n# This can also be a list of directories.\r\n# Any .yaml file will be loaded as a rule.\r\nrules_folder: \/root\/elastalert2\/examples\/rules\r\n\r\n# How often ElastAlert will query Elasticsearch.\r\n# The unit can be anything from weeks to seconds.\r\nrun_every:\r\n  minutes: 2\r\n\r\n# ElastAlert will buffer results from the most recent\r\n# period of time, in case some log sources are not in real time.\r\nbuffer_time:\r\n  minutes: 15\r\n\r\n# The Elasticsearch hostname for metadata writeback.\r\n# Note that every rule can have its own Elasticsearch host.\r\nes_host: &lt;es_host_ip&gt;\r\n\r\n# The Elasticsearch port\r\nes_port: 9200\r\n\r\n# The AWS profile to use. 
Use this if you are using an AWS CLI profile.\r\n# See http:\/\/docs.aws.amazon.com\/cli\/latest\/userguide\/cli-chap-getting-started.html for details.\r\n#profile: test\r\n\r\n# Optional URL prefix for Elasticsearch\r\n#es_url_prefix: elasticsearch\r\n\r\n# Optional prefix for statsd metrics\r\n#statsd_instance_tag: elastalert\r\n\r\n# Optional statsd host\r\n#statsd_host: dogstatsd\r\n\r\n# Connect with TLS to Elasticsearch\r\n#use_ssl: True\r\n\r\n# Verify TLS certificates\r\n#verify_certs: True\r\n\r\n# Show TLS or certificate related warnings\r\n#ssl_show_warn: True\r\n\r\n# GET request with body is the default option for Elasticsearch.\r\n# If it fails for some reason, you can pass 'GET', 'POST' or 'source'.\r\n# See https:\/\/elasticsearch-py.readthedocs.io\/en\/master\/connection.html?highlight=send_get_body_as#transport for details.\r\nes_send_get_body_as: GET\r\n\r\n# Optional basic-auth username and password for Elasticsearch\r\n#es_username: someusername\r\n#es_password: somepassword\r\n\r\n# Use SSL authentication with client certificates. client_cert must be\r\n# a pem file containing both cert and key for the client.\r\n#ca_certs: \/path\/to\/cacert.pem\r\n#client_cert: \/path\/to\/client_cert.pem\r\n#client_key: \/path\/to\/client_key.key\r\n\r\n# The index on es_host which is used for metadata storage.\r\n# This can be an unmapped index, but it is recommended that you run\r\n# elastalert-create-index to set a mapping.\r\nwriteback_index: elastalert_status\r\n\r\n# If an alert fails for some reason, ElastAlert will retry\r\n# sending the alert until this time period has elapsed.\r\nalert_time_limit:\r\n  seconds: 10\r\n\r\n_source_enabled: true\r\n\r\nlogging:\r\n  version: 1\r\n  formatters:\r\n    simple:\r\n      format: '%(asctime)s %(levelname)s %(message)s'\r\n  handlers:\r\n    console:\r\n      class: logging.StreamHandler\r\n      formatter: 
simple\r\n      level: DEBUG\r\n    file:\r\n      class: logging.FileHandler\r\n      formatter: simple\r\n      level: DEBUG\r\n      filename: \/var\/log\/elastalert.log  # Adjust the path as needed\r\n  root:\r\n    level: DEBUG\r\n    handlers: [console, file]\r\n\r\nsmtp_auth_file: \/root\/elastalert2\/examples\/smtp_auth_file.yaml\r\n\r\n# Alerting (Email)\r\nsmtp_host: &lt;smtp_host_address&gt;\r\nsmtp_port: &lt;smtp_port&gt;\r\n#smtp_ssl: false\r\n#verify_certs: false<\/pre>\n<p>I am using the SMTP configuration to send notifications via email.<\/p>\n<p><strong>smtp_auth_file<\/strong>: this file contains the username and password for the SMTP server. <strong>_source_enabled<\/strong>: this setting is vital, so make sure you include it in your configuration.<\/p>\n<p><strong>rules_folder<\/strong>: this variable points to the directory holding my rules configuration, described below.<\/p>\n<p>Learn more about the other parameters via the links in the References section at the end of this post. All variables play a crucial role when testing your rule. 
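<\/p>\n<p>The contents of the <strong>smtp_auth_file<\/strong> itself are easy to get wrong, so here is a minimal sketch (the values below are placeholders, not real credentials):<\/p>\n<pre>user: \"alerts@example.com\"\r\npassword: \"your-smtp-password\"<\/pre>\n<p>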
ElastAlert2 rarely warns you about a mistyped or missing variable, so it&#8217;s important to double-check them yourself.<\/p>\n<h4>Testing Your Rule<\/h4>\n<p>There is one bonus, however: you can test your config and rule before enabling the ElastAlert service, and if there is an exception or syntax issue, you will notice it right away.<\/p>\n<pre><em>elastalert-test-rule --config configuration\/config.yaml my_rules\/rule1.yaml<\/em><\/pre>\n<h4><strong>Rule Configuration:<\/strong><\/h4>\n<p>Path: \/root\/elastalert2\/examples\/rules<\/p>\n<pre># Name of the rule\r\nname: \"DOS Attack Detection\"\r\n\r\n# Type of alert rule\r\ntype: frequency\r\n\r\n# The Elasticsearch index to search\r\nindex: big_ip-waf-logs-*\r\n\r\n# The filter to match DOS attack logs\r\nfilter:\r\n- bool:\r\n    should:\r\n    - wildcard:\r\n        dos_attack_name: \"DOS*\"\r\n    - wildcard:\r\n        dos_attack_name: \"Behavioral*\"\r\n\r\n# The timeframe to search for the keyword\r\ntimeframe:\r\n  minutes: 60\r\n\r\n# Alert if at least 1 event is found\r\nnum_events: 1\r\n\r\n# Use local time for alerts\r\nuse_local_time: false\r\n\r\n# SMTP authentication file\r\nsmtp_auth_file: \/root\/elastalert2\/examples\/smtp_auth_file.yaml\r\n\r\n# Alert actions\r\nalert:\r\n- email\r\n\r\n# Email alert settings\r\nfrom_addr: \"cloudapm@tothenew.com\"\r\nemail:\r\n- \"chetan.singh1@tothenew.com\"\r\n\r\n# Additional alert details to include in the alert message\r\nalert_subject: \"DOS Attack Detected\"\r\nalert_text: |\r\n  DEVICE TYPE : ELK\r\n  APPLICATION NAME : {0}\r\n  ACTION : {1}\r\n  DOS ATTACK ID : {2}\r\n  HOST NAME : {3}\r\n  CONTEXT NAME : {4}\r\n  ATTACK IP ADDRESS : {5}\r\n  DATE : {6}\r\n  ITEM NAME : {7}\r\n  VALUE : {8}\r\n  SEVERITY : Exception\r\n  GROUP : L2 Wintel\r\n\r\nalert_text_args: [\"_index\", \"action\", \"dos_attack_id\", \"hostname\", \"context_name\", \"source_ip\", \"@timestamp\", \"dos_attack_name\", 
\"dos_attack_tps\"]\r\n\r\nalert_text_type: alert_text_only\r\n\r\nalert_missing_value: \"NOT FOUND\"\r\n\r\n# Disable the inclusion of the full event\r\n\r\n#include: []\r\n\r\n# Define how often this rule can trigger\r\n\r\n#realert:\r\n\r\n#\u00a0 minutes: 1\r\n\r\n# Aggregation period\r\n\r\n#aggregation:\r\n\r\n#\u00a0 minutes: 5<\/pre>\n<p>I&#8217;m using Frequency as my Rule Type in this configuration, and I&#8217;m looking for DOS attacks in my Elasticsearch Index, or more specifically, in WAF Logs.<\/p>\n<p>The rule and configuration were running properly; however, one step failed. Any ideas?<\/p>\n<h1><\/h1>\n<h2>The _source Block Challenge<\/h2>\n<p>I was unable to extract the values from Elasticsearch since the value we received in the Elasticsearch Index was contained within the <strong>_source<\/strong> block.<\/p>\n<p>https:\/\/www.elastic.co\/guide\/en\/elasticsearch\/reference\/current\/mapping-source-field.html<\/p>\n<p>If you look at the official documentation for the _source block, you will notice that it is not searchable; instead, you may see it in Kibana Discover.<\/p>\n<p>&nbsp;<\/p>\n<p>To address this issue, I enabled the <strong>_source_enabled<\/strong> variable and created a separate <strong>pipeline<\/strong> to parse the original event from the index.<\/p>\n<pre>PUT _ingest\/pipeline\/parse_json\r\n\r\n{\r\n\r\n\"description\": \"Parse JSON string in event.original field\",\r\n\r\n\"processors\": [\r\n\r\n{\r\n\r\n\"json\": {\r\n\r\n\"field\": \"event.original\",\r\n\r\n\"target_field\": \"parsed_event\"\r\n\r\n}\r\n\r\n}\r\n\r\n]\r\n\r\n}<\/pre>\n<p>&nbsp;<\/p>\n<p>By utilizing that, I was able to extract values from the Elasticsearch Index using the Elastalert configuration.<\/p>\n<h1><\/h1>\n<h2>Running ElastAlert as a Service<\/h2>\n<p>To launch ElastAlert as a service, create the following file:<strong> \/etc\/systemd\/system\/elastalert.service.<\/strong><\/p>\n<p>And it 
includes:<\/p>\n<pre>[Unit]\r\nDescription=elastalert\r\nAfter=multi-user.target\r\n\r\n[Service]\r\nType=simple\r\nUser=root\r\nGroup=root\r\nWorkingDirectory=\/root\/elastalert2\/\r\nExecStart=\/usr\/bin\/python3 -m elastalert.elastalert --verbose --config \/root\/elastalert2\/examples\/config.yaml --rule \/root\/elastalert2\/examples\/rules\/example_frequency.yaml\r\nStandardOutput=syslog\r\nStandardError=syslog\r\nKillSignal=SIGKILL\r\n\r\n[Install]\r\nWantedBy=multi-user.target<\/pre>\n<h2>Final Thoughts<\/h2>\n<p>Setting up ElastAlert2 can be intimidating at first due to its sparse documentation and setup complexity. However, once you understand its structure and quirks, particularly those involving the _source field, it becomes an extremely useful alerting tool.<\/p>\n<p>By breaking configurations down into manageable parts, testing rules thoroughly, and leveraging tools like ingest pipelines, you can turn ElastAlert2 into a powerful ally for proactive monitoring.<\/p>\n<p>If you work with Elasticsearch and need dependable, configurable alerting, ElastAlert2 is well worth the setup effort.<\/p>\n<h2>References<\/h2>\n<p>https:\/\/elastalert2.readthedocs.io\/en\/latest\/ruletypes.html<\/p>\n<p>https:\/\/github.com\/jertel\/elastalert2\/tree\/master\/examples\/rules<\/p>\n<p>https:\/\/elastalert2.readthedocs.io\/_\/downloads\/en\/latest\/pdf\/<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Introduction ElastAlert 2 is a simple framework for alerting on anomalies, spikes, and other patterns of interest in data from Elasticsearch and OpenSearch. ElastAlert 2 is a tool for monitoring real-time data in Elasticsearch and alerting on matching patterns. 
Elastalert accepts this Alert type: Email AWS SES (Amazon Simple Email Service) AWS SNS (Amazon Simple [&hellip;]<\/p>\n","protected":false},"author":1638,"featured_media":0,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"iawp_total_views":242},"categories":[2348],"tags":[5278,7243,3389,7244,1423],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/posts\/71432"}],"collection":[{"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/users\/1638"}],"replies":[{"embeddable":true,"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/comments?post=71432"}],"version-history":[{"count":6,"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/posts\/71432\/revisions"}],"predecessor-version":[{"id":71555,"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/posts\/71432\/revisions\/71555"}],"wp:attachment":[{"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/media?parent=71432"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/categories?post=71432"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/tags?post=71432"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}