The ELK stack provides a great solution for aggregating and indexing logs and events across an organization. But you may not want to keep old data in Elasticsearch forever.
Also see ELK installation and configuration
To delete old data you can use the "elasticsearch-curator" tool.
You can install it with pip:

$ pip install elasticsearch-curator

Then complete some basic configuration.
Create config.yml and action.yml as following. In config.yml, blacklist the chatty elasticsearch and urllib3 loggers so the curator log stays readable:

blacklist: ['elasticsearch', 'urllib3']

Also create the log directory that curator will write to:

$ mkdir -p /var/log/curator/
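A complete config.yml (curator 5.x format) might look like the sketch below. The client host and port (127.0.0.1:9200) are assumptions for a local Elasticsearch node; adjust them for your cluster.

```yaml
# /opt/sw/curator/config.yml -- client connection and logging settings
client:
  hosts:
    - 127.0.0.1          # assumption: local single-node Elasticsearch
  port: 9200
  use_ssl: False
  ssl_no_validate: False
  timeout: 30
  master_only: False
logging:
  loglevel: INFO
  logfile: /var/log/curator/actions.log   # matches the dry-run check below
  logformat: default
  blacklist: ['elasticsearch', 'urllib3'] # silence noisy client loggers
```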
action.yml: change unit_count (days) according to your requirement. The example below deletes indices older than 10 days (based on index name) for logstash- prefixed indices, ignores the error if the filter does not result in an actionable list of indices (ignore_empty_list), and exits cleanly. It chains two filters: a pattern filter that selects the logstash- indices, and an age filter that keeps only those older than 10 days:

actions:
  1:
    action: delete_indices
    description: >-
      Delete indices older than 10 days (based on index name), for logstash-
      prefixed indices. Ignore the error if the filter does not result in an
      actionable list of indices (ignore_empty_list) and exit cleanly.
    options:
      ignore_empty_list: True
    filters:
    - filtertype: pattern
      kind: prefix
      value: logstash-
    - filtertype: age
      source: name
      direction: older
      timestring: '%Y.%m.%d'
      unit: days
      unit_count: 10
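To make the name-based age filter concrete, here is a small Python sketch (not part of curator, just an illustration) of how "older than 10 days, with the date parsed from logstash-%Y.%m.%d index names" plays out:

```python
from datetime import datetime, timedelta

def indices_to_delete(index_names, prefix="logstash-", days=10, now=None):
    """Mimic the pattern + age filters above: keep only prefix-matching
    indices whose date, parsed from the index name, is older than `days`."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=days)
    doomed = []
    for name in index_names:
        if not name.startswith(prefix):
            continue  # pattern filter: prefix does not match
        try:
            # age filter: parse the %Y.%m.%d suffix of the index name
            stamp = datetime.strptime(name[len(prefix):], "%Y.%m.%d")
        except ValueError:
            continue  # name carries no parseable date; leave it alone
        if stamp < cutoff:
            doomed.append(name)
    return doomed

names = ["logstash-2024.01.01", "logstash-2024.01.20", "kibana-2024.01.01"]
print(indices_to_delete(names, now=datetime(2024, 1, 25)))
# → ['logstash-2024.01.01']
```

Only the logstash- index whose embedded date falls before the 10-day cutoff is selected; the newer index and the non-matching kibana- index are left untouched, just as curator's filters would leave them.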
Try a dry run and check the log file /var/log/curator/actions.log:

$ /usr/local/bin/curator --config /opt/sw/curator/config.yml --dry-run /opt/sw/curator/action.yml
Once you are satisfied with the logs, you can set up a cron job to delete old data automatically:
0 0 * * * /usr/local/bin/curator --config /opt/sw/curator/config.yml /opt/sw/curator/action.yml