{"id":11551,"date":"2014-01-30T12:53:49","date_gmt":"2014-01-30T07:23:49","guid":{"rendered":"http:\/\/www.tothenew.com\/blog\/?p=11551"},"modified":"2015-07-09T11:25:24","modified_gmt":"2015-07-09T05:55:24","slug":"building-a-central-log-server-with-logstash","status":"publish","type":"post","link":"https:\/\/www.tothenew.com\/blog\/building-a-central-log-server-with-logstash\/","title":{"rendered":"Building a Central Log Server with Logstash"},"content":{"rendered":"<p>If you are an <em>operations<\/em> person and your <em>development<\/em> team has built a cool application hosted on several web servers, your main concern is finding bugs and closely monitoring the application error logs from those different servers.<\/p>\n<p>Things that you can do in this case are:<\/p>\n<ul>\n<li>Give the <a title=\"DevOps Automation Consulting\" href=\"http:\/\/www.tothenew.com\/devops-automation-consulting\">development team server access<\/a> to view the logs (seriously, you won&#8217;t do that)<\/li>\n<li>Use utilities like <a title=\"rsyslog\" href=\"http:\/\/www.rsyslog.com\/doc\/manual.html\" target=\"_blank\"><strong>rsyslog<\/strong><\/a> or <a title=\"syslog-ng\" href=\"http:\/\/www.balabit.com\/network-security\/syslog-ng\/opensource-logging-system\" target=\"_blank\"><strong>syslog-ng<\/strong><\/a> (too much to configure, and it gets complex at times)<\/li>\n<li>Use paid services like <strong><a title=\"Loggly\" href=\"https:\/\/www.loggly.com\/\" target=\"_blank\">Loggly<\/a><\/strong><\/li>\n<\/ul>\n<p><strong>Or you can use Logstash.<\/strong><\/p>\n<p><strong>What is <a title=\"logstash\" href=\"http:\/\/logstash.net\/\" target=\"_blank\">logstash<\/a>?<\/strong><\/p>\n<p>logstash is a free and open source tool for managing events and logs. 
You can use it to collect logs, parse them, and store them in a central place. logstash is now also part of the <a href=\"http:\/\/elasticsearch.org\/\">Elasticsearch<\/a> family.<\/p>\n<p><strong>How does it work?<\/strong><\/p>\n<p>A shipping agent ships logs from the source to a queuing agent, which collects them and also acts as a buffer. An indexer then reads events from the queue and indexes them so that they can be searched from a web UI.<\/p>\n<p><a href=\"\/blog\/wp-ttn-blog\/uploads\/2014\/01\/Logstash-1.png\"><img decoding=\"async\" loading=\"lazy\" class=\"alignnone size-full wp-image-11555\" src=\"\/blog\/wp-ttn-blog\/uploads\/2014\/01\/Logstash-1.png\" alt=\"\" width=\"669\" height=\"299\" \/><\/a><\/p>\n<p><strong><span style=\"text-decoration: underline;\">Setting up Logstash<\/span><\/strong><\/p>\n<p>Setting up logstash is simple. You only need to configure a shipper, a collector, an indexer and a web UI.<\/p>\n<p>Now we&#8217;ll demonstrate this by shipping Apache access logs.<\/p>\n<p><strong>Prerequisites:<\/strong><\/p>\n<ul>\n<li><strong>Java &gt; 1.5<\/strong><\/li>\n<li><strong>Elasticsearch<\/strong><\/li>\n<li><strong>Redis<\/strong><\/li>\n<\/ul>\n<p><strong>Elasticsearch:<\/strong><\/p>\n<ul>\n<li><strong>Download<\/strong> Elasticsearch from its <a title=\"Elasticsearch\" href=\"http:\/\/www.elasticsearch.org\/\" target=\"_blank\">official website<\/a><\/li>\n<li>Extract the archive<\/li>\n<li>Run Elasticsearch from the extracted directory:<\/li>\n<\/ul>\n<p>[shell].\/bin\/elasticsearch -f[\/shell]<\/p>\n<p>You will see output as Elasticsearch initializes the master node, something like:<\/p>\n<p><a href=\"\/blog\/wp-ttn-blog\/uploads\/2014\/01\/Logstash-2.png\"><img decoding=\"async\" loading=\"lazy\" class=\"alignnone size-full wp-image-11556\" src=\"\/blog\/wp-ttn-blog\/uploads\/2014\/01\/Logstash-2.png\" alt=\"\" width=\"669\" height=\"129\" \/><\/a><\/p>\n<p><strong>Redis:<\/strong><\/p>\n<p>We will be using Redis to collect the logs.<\/p>\n<p>If 
you are not familiar with the Redis installation and configuration process, simply follow the instructions below:<\/p>\n<ol>\n<li>Download Redis from <a title=\"Redis Download\" href=\"http:\/\/redis.io\/download\" target=\"_blank\">http:\/\/redis.io\/download<\/a> (the latest stable release is likely what you want)<\/li>\n<li>Extract the source, change to the directory and run <strong>make<\/strong><\/li>\n<li>Run Redis with\n<p>[shell]src\/redis-server --loglevel verbose[\/shell]<\/p>\n<\/li>\n<\/ol>\n<p>To verify that Redis is installed and working:<\/p>\n<p>[shell]redis-cli info memory[\/shell]<\/p>\n<p><strong>Setting up the Shipper:<\/strong><\/p>\n<p>First, download the logstash agent from <a href=\"http:\/\/logstash.net\/\">http:\/\/logstash.net\/<\/a>.<\/p>\n<p>Now configure the logstash agent to ship the logs in JSON format to our collector (Redis).<\/p>\n<p>[shell]<br \/>\ninput {<br \/>\nfile {<br \/>\n# Change this type label to match your logs<br \/>\ntype =&gt; &quot;prod-log&quot;<br \/>\n# Change this to the location where the log files are generated<br \/>\npath =&gt; [ &quot;\/var\/log\/httpd\/*.log&quot; ]<br \/>\n}<br \/>\n}<br \/>\noutput {<br \/>\nstdout { debug =&gt; true debug_format =&gt; &quot;json&quot; }<br \/>\nredis { host =&gt; &quot;127.0.0.1&quot; data_type =&gt; &quot;list&quot; key =&gt; &quot;logstash&quot; 
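<br \/>\n# Note: &#39;key&#39; names the Redis list; the indexer&#39;s redis input must read the same key<br \/>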
}<br \/>\n}<br \/>\n[\/shell]<\/p>\n<p>Now save this file as shipper.conf and start the shipper:<\/p>\n<p>[shell]java -jar logstash.jar agent -f shipper.conf[\/shell]<\/p>\n<p><strong>Configuring the Collector and Indexer:<\/strong><\/p>\n<p>Next, we will index the logs being collected in Redis into our Elasticsearch instance.<\/p>\n<p>[shell]<br \/>\ninput {<br \/>\nredis {<br \/>\nhost =&gt; &quot;127.0.0.1&quot;<br \/>\ntype =&gt; &quot;redis-input&quot;<br \/>\ndata_type =&gt; &quot;list&quot;<br \/>\nkey =&gt; &quot;logstash&quot;<br \/>\nmessage_format =&gt; &quot;json_event&quot;<br \/>\n}<br \/>\n}<br \/>\noutput {<br \/>\nstdout { debug =&gt; true debug_format =&gt; &quot;json&quot; }<br \/>\nelasticsearch {<br \/>\nhost =&gt; &quot;127.0.0.1&quot;<br \/>\nport =&gt; 9300<br \/>\n}<br \/>\n}<br \/>\n[\/shell]<\/p>\n<p>Save this file as indexer.conf and run the logstash agent:<\/p>\n<p>[shell]java -jar logstash.jar agent -f indexer.conf[\/shell]<\/p>\n<p><strong>Configuring the Logstash Web UI:<\/strong><\/p>\n<p>I&#8217;ll be using Kibana to view the logs, as it has a nice user interface and you can customize it to your requirements.<\/p>\n<p>You can download Kibana from <a href=\"https:\/\/github.com\/elasticsearch\/kibana\">https:\/\/github.com\/elasticsearch\/kibana<\/a><\/p>\n<p>After downloading:<\/p>\n<ul>\n<li>Edit the <strong>config.js<\/strong> file and update the location of your Elasticsearch server. Keep in mind that this location must be reachable from your web browser.<\/li>\n<li>Rename <strong>app\/dashboards\/logstash.json<\/strong> to <strong>app\/dashboards\/default.json<\/strong><\/li>\n<\/ul>\n<p>Now we have to host Kibana on our web server. 
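As a minimal sketch, an Apache virtual host can serve the Kibana files (the ServerName and DocumentRoot below are assumptions; point DocumentRoot at wherever you extracted Kibana):<\/p>\n<p>[shell]<br \/>\n&lt;VirtualHost *:80&gt;<br \/>\nServerName kibana.example.com<br \/>\nDocumentRoot \/opt\/kibana<br \/>\n&lt;\/VirtualHost&gt;<br \/>\n[\/shell]<\/p>\n<p>Kibana 3 is a purely static application, so any web server that can serve files will do. 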
I&#8217;m skipping this part, as you will already be familiar with this process.<\/p>\n<p>After configuring your web server, open the application in your browser and access your application error logs.<\/p>\n<p><a href=\"\/blog\/wp-ttn-blog\/uploads\/2014\/01\/Logstash-3.png\"><img decoding=\"async\" loading=\"lazy\" class=\"alignnone size-full wp-image-11557\" src=\"\/blog\/wp-ttn-blog\/uploads\/2014\/01\/Logstash-3.png\" alt=\"\" width=\"669\" height=\"331\" \/><\/a><\/p>\n<p><strong>Post-Setup Configuration:<\/strong><\/p>\n<p>This was a simple configuration. Before using it in a production environment, you should tune the Elasticsearch parameters, and you can also distribute the load by running Redis on a separate instance and adding more Elasticsearch nodes.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>If you are an operations person and your development team has built a cool application hosted on several web servers, your main concern is finding bugs and closely monitoring the application error logs from those different servers. 
Things that you can do in this case are: Give server access to [&hellip;]<\/p>\n","protected":false},"author":108,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"iawp_total_views":13},"categories":[1174],"tags":[],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/posts\/11551"}],"collection":[{"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/users\/108"}],"replies":[{"embeddable":true,"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/comments?post=11551"}],"version-history":[{"count":0,"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/posts\/11551\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/media?parent=11551"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/categories?post=11551"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.tothenew.com\/blog\/wp-json\/wp\/v2\/tags?post=11551"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}