AWS WAFv2 S3 Logs Reading Script

  • This Linux shell script reads AWS WAFv2 logs from an S3 bucket.
  • The same script can be used to read WAF Classic logs from S3.
  • WAFv2 logging must be enabled, with logs streamed to the S3 bucket via Amazon Kinesis Data Firehose.
  • The script fetches all folders under the S3 location specified in the S3_Location variable.
  • Modify the S3_Location variable to restrict the run to logs for a specific date.
  • The script creates an s3_waf_logs folder and downloads all WAF logs from the S3 bucket into it.
  • For each log file, the script extracts three fields: the action (allowed or blocked request), the Host header, and the URI.
  • A file named waf-logs-unique-urls.csv is created with the final output.
  • For updated code, please refer to the GitHub repository: https://github.com/nitinp91/WAFv2-S3-Logs-Reading-Script
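Each WAFv2 log record delivered by Firehose is a single JSON object per line. The snippet below applies the same jq filter the script uses to a simplified, hypothetical record (real records carry many more fields, and the Host header is assumed to be the first entry in the headers array):

```shell
# Simplified, hypothetical WAFv2 log record for illustration only.
record='{"action":"ALLOW","httpRequest":{"headers":[{"name":"Host","value":"example.com"}],"uri":"/index.html"}}'

# Same jq filter as the script: action, first header value (assumed Host), URI.
echo "$record" | jq -r '"\(.action), \(.httpRequest.headers[0].value), \(.httpRequest.uri)"'
# → ALLOW, example.com, /index.html
```

If your requests can arrive with a different header order, select the Host header by name instead of by position, e.g. `.httpRequest.headers[] | select(.name == "Host") | .value`.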
#!/bin/bash
# Location of the WAF logs in S3; narrow the prefix to target a specific date.
S3_Location='s3://waf.logs.mybucket/firehose/2020/'
FILES='./s3_waf_logs/'

mkdir -p "$FILES"
aws s3 cp "$S3_Location" "$FILES" --recursive --profile ana-ndc-prod

# Extract the action, Host header (first header), and URI from every log file.
find "$FILES" -name 'aws-waf-logs*' | while read -r f
do
        echo "Processing $f file..."
        jq -r '"\(.action), \(.httpRequest.headers[0].value), \(.httpRequest.uri)"' < "$f" >> waf-logs-all-urls.csv
done

# Deduplicate: sort -u keeps one copy of every distinct line.
# (uniq -u would instead drop every line that appears more than once.)
sort -u ./waf-logs-all-urls.csv > waf-logs-unique-urls.csv
rm ./waf-logs-all-urls.csv
rm -rf "$FILES"
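The choice of deduplication command matters here: `sort -u` keeps one copy of every distinct line, whereas `sort | uniq -u` keeps only lines that occur exactly once, silently dropping any request seen more than once. A quick illustration with hypothetical CSV lines:

```shell
# Two hypothetical log lines, one of them repeated.
printf 'BLOCK, example.com, /admin\nALLOW, example.com, /index.html\nBLOCK, example.com, /admin\n' > /tmp/waf-demo.csv

# sort -u: one copy of every distinct line (the usual "unique list").
sort -u /tmp/waf-demo.csv
# → ALLOW, example.com, /index.html
# → BLOCK, example.com, /admin

# sort | uniq -u: only lines occurring exactly once — the repeated BLOCK line vanishes.
sort /tmp/waf-demo.csv | uniq -u
# → ALLOW, example.com, /index.html

rm /tmp/waf-demo.csv
```

For a list of every distinct request, `sort -u` is the behavior you almost always want.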
