Question

I have a number of applications that are running in different data centers, developed and maintained by different vendors. Each application has a web service that exposes relevant log data (audit data, security data, data related to cost calculations, performance data, ...) consolidated for the application.

My task is to get data from each system into a setup of Elasticsearch, Kibana and Logstash so I can create business reports or just view data the way I want to.

Assuming I have a JBoss application server that handles the integration with these "expose log" services, what is the best way to feed Elasticsearch? Some Logstash plugin that calls each service? Should JBoss use some Logstash plugin? Or some other way?

Solution

The best way is to set up the Logstash shipper on each server where the logs are created.

The shipper will then send the events to a Redis server, which acts as a broker between the shippers and the indexer.
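
As a rough sketch, a shipper configuration could look something like the following. The log path, the Redis host and the key name are placeholders, so adjust them to your environment:

    input {
      # Tail the application log files on the server where they are written.
      file {
        path => "/var/log/myapp/*.log"     # placeholder path
        start_position => "beginning"
      }
    }
    output {
      # Push the raw events onto a Redis list that acts as the broker.
      redis {
        host => "redis.example.com"        # placeholder Redis host
        data_type => "list"
        key => "logstash"
      }
    }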

Another Logstash instance (the indexer) will then pull the data from Redis, process it, and send it to Elasticsearch for indexing.
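
A minimal indexer configuration, again with placeholder hostnames, might look like this (note that option names such as host versus hosts have changed between Logstash versions):

    input {
      # Pull events off the same Redis list the shippers write to.
      redis {
        host => "redis.example.com"        # placeholder Redis host
        data_type => "list"
        key => "logstash"
      }
    }
    output {
      # Hand the events to Elasticsearch, which does the actual indexing.
      elasticsearch {
        hosts => ["http://elasticsearch.example.com:9200"]   # placeholder
      }
    }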

Kibana will then provide an interface to Elasticsearch, which is where the goodness happens.

I wrote a post on how to install Logstash a little while ago. Versions may have been updated since, but it's still valid:

http://www.nightbluefruit.com/blog/2013/09/how-to-install-and-setup-logstash/

OTHER TIPS

Does your JBoss application server write its logs to a file?

In my experience, my JBoss applications (on multiple servers) write their logs to files. I then use Logstash to read the log files and ship all the logs to a central server. You can refer to here.
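
For example, a shipper on a JBoss host could read the server log directly. The log path below assumes a default standalone installation, and the multiline settings assume the default log format where each entry starts with a date, so that stack trace lines stay attached to the event they belong to:

    input {
      file {
        path => "/opt/jboss/standalone/log/server.log"   # assumed default path
        # Join lines that do not start with a date (e.g. stack trace lines)
        # onto the previous event.
        codec => multiline {
          pattern => "^\d{4}-\d{2}-\d{2}"
          negate => true
          what => "previous"
        }
      }
    }
    output {
      # Ship to the central broker (placeholder host).
      redis {
        host => "redis.example.com"
        data_type => "list"
        key => "logstash"
      }
    }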

So what you can do is set up a Logstash shipper in each data center. If you do not have permission to do this, you may want to write a program that gets the logs from the different web services and saves them to a file, and then set up Logstash to read that file (see the sketch below). So far, Logstash does not have any plugin that can call web services.
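
If you take the "write a program" route, a minimal sketch could be a small script that polls each service and appends the responses as JSON lines to a file that a local Logstash file input then picks up. The endpoint URLs, the response format (assumed to be a JSON array of log entries) and the output path are all assumptions to adapt to the real services:

    #!/usr/bin/env python3
    # Hypothetical sketch: poll each vendor's "expose log" web service and
    # append the entries as JSON lines for a Logstash file input to tail.
    import json
    import time
    import urllib.request

    ENDPOINTS = [
        "https://app1.example.com/api/logs",   # hypothetical service URLs
        "https://app2.example.com/api/logs",
    ]
    OUTPUT_FILE = "/var/log/consolidated/services.log"

    def fetch(url):
        """Call one log web service; assumes it returns a JSON array of objects."""
        with urllib.request.urlopen(url, timeout=30) as resp:
            return json.loads(resp.read().decode("utf-8"))

    def main():
        while True:
            for url in ENDPOINTS:
                try:
                    entries = fetch(url)
                except Exception as exc:
                    print("failed to fetch %s: %s" % (url, exc))
                    continue
                # One JSON object per line so Logstash's json codec can parse it.
                with open(OUTPUT_FILE, "a") as out:
                    for entry in entries:
                        entry["source_service"] = url   # tag the origin
                        out.write(json.dumps(entry) + "\n")
            time.sleep(60)   # poll interval in seconds

    if __name__ == "__main__":
        main()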

Licensed under: CC-BY-SA with attribution