ELK hunting (logs)

I’ve got an ELK stack set up for my log drain. Out of curiosity, will elasticsearch/logstash backups ever be included with the default setup the way, say, postgresql ones are?

For databases in PHI-ready environments, the disks are all backed up the same way: disk volumes attached to the hosts in your database layers are all snapshotted nightly. In other words, that process is db-agnostic.

So is there a way to configure elasticsearch to close old indexes without needing to set up our own S3 bucket, etc.? (As long as they are being backed up, I don’t feel a need to have them in our own bucket.)

If we get too many indexes open, Kibana goes down.

Hey @Philonous I don’t think there’s any way to set up elasticsearch to automatically close old indexes, but you could easily modify https://github.com/aptible/elasticsearch-logstash-s3-backup to close them instead of backing them up. Or just close the previous month’s indices manually every month yourself.
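
For the manual route, something like this would do it (just an untested sketch, assuming the default logstash-YYYY.MM.DD daily index naming; `ES_URL` is a placeholder for your cluster endpoint, not part of the stock setup):

```python
# Untested sketch: close last month's daily logstash indices via the
# Elasticsearch close-index API. Assumes the default
# logstash-YYYY.MM.DD index naming.
import datetime
import requests

ES_URL = "http://localhost:9200"  # placeholder for your cluster endpoint

# Work out the previous calendar month.
first_of_this_month = datetime.date.today().replace(day=1)
last_month = first_of_this_month - datetime.timedelta(days=1)

# Wildcard matching every daily index from that month,
# e.g. logstash-2015.09.* when run in October 2015.
pattern = "logstash-{:%Y.%m}.*".format(last_month)

# POST /<index>/_close closes the matching indices. Depending on your
# Elasticsearch version, using wildcards on a destructive action like
# this may require action.destructive_requires_name to be disabled.
response = requests.post("{}/{}/_close".format(ES_URL, pattern))
response.raise_for_status()
print("Closed indices matching {}".format(pattern))
```

You could drop that in a monthly cron job, or fold the same logic into the backup script linked above.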

Yeah, Kibana does not like having too many indexes open. Each open index consumes server resources, and when you have too many open you’ll encounter memory errors (among other errors…)
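
If you want to keep an eye on how close you’re getting, the `_cat/indices` API reports each index’s open/close status. A quick sketch (again, `ES_URL` is a placeholder):

```python
# Sketch: count open vs. closed indices via the _cat/indices API.
import requests

ES_URL = "http://localhost:9200"  # placeholder for your cluster endpoint

# _cat/indices returns one line per index; the "status" column
# is either "open" or "close".
cat = requests.get(ES_URL + "/_cat/indices?h=index,status")
cat.raise_for_status()

statuses = [line.split()[-1] for line in cat.text.splitlines() if line.strip()]
print("open: {}, closed: {}".format(statuses.count("open"),
                                    statuses.count("close")))
```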


I’ll look into that :thinking: