Ansible, Kibana, Kafka, Storm, Punchlet


Ansible:

Ansible is an open source yet powerful IT automation tool from Red Hat that your IT team can grasp easily. It is simple enough that you can use it even if you are not a Linux guru. Its powerful automation lets you carry out the most complex deployments in a simple manner. It takes over those repetitive tasks and frees you to do something new.
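Ansible automation is described in YAML playbooks. A minimal sketch of one (the `webservers` group name and nginx role are illustrative assumptions, not from this post) that automates a classic repetitive task, installing and starting a service on every host:

```yaml
# Hypothetical playbook: install nginx and keep it running
# on all hosts in the "webservers" inventory group.
- hosts: webservers
  become: true
  tasks:
    - name: Install nginx
      ansible.builtin.package:
        name: nginx
        state: present

    - name: Ensure nginx is running and starts on boot
      ansible.builtin.service:
        name: nginx
        state: started
        enabled: true
```

Run once with `ansible-playbook site.yml` and the same playbook can be re-applied safely; Ansible only changes hosts that are not already in the desired state.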

Kibana:

Overwhelmed by thousands of lines of logs and IP addresses attacking your network? Use Kibana. Kibana is an open source, browser-based visualization tool. Security experts mainly use Kibana to analyse large volumes of logs. Kibana lets you see the real picture in the form of line graphs, bar graphs, pie charts, heat maps, region maps, coordinate maps, gauges, goals, Timelion, and more.

The visualizations make it easy to spot changes in the trends of errors or other significant events in the input source. Kibana works in sync with Elasticsearch and Logstash, which together form the so-called ELK stack.
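Under the hood, Kibana charts are built from Elasticsearch aggregations over the indexed logs. A minimal sketch of such a query, assuming hypothetical `@timestamp` and `source.ip` fields, that would back a "top attacking IPs in the last hour" bar chart:

```json
{
  "size": 0,
  "query": {
    "range": { "@timestamp": { "gte": "now-1h" } }
  },
  "aggs": {
    "top_attacking_ips": {
      "terms": { "field": "source.ip", "size": 10 }
    }
  }
}
```

Kibana generates queries like this for you from the UI; seeing the underlying DSL helps when a visualization needs tuning.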

Kafka:

Apache Kafka is an open-source distributed event streaming platform used by thousands of companies for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications.
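Kafka's core abstraction is a partitioned, append-only log: producers route records to a partition by key, and consumers read from an offset they track themselves. A conceptual pure-Python sketch of that model (this is not the Kafka API; real clients such as kafka-python or confluent-kafka talk to a broker):

```python
class MiniLog:
    """Toy model of a Kafka topic: numbered partitions of append-only records."""

    def __init__(self, partitions=2):
        self.partitions = [[] for _ in range(partitions)]

    def produce(self, key, value):
        # Like Kafka, route by key so one key always lands in one partition,
        # preserving per-key ordering.
        p = hash(key) % len(self.partitions)
        self.partitions[p].append((key, value))
        return p, len(self.partitions[p]) - 1  # (partition, offset)

    def consume(self, partition, offset):
        # Consumers track their own offsets; reading never removes records,
        # so many independent consumers can replay the same stream.
        return self.partitions[partition][offset:]


log = MiniLog()
p, _ = log.produce("sensor-1", 42)
log.produce("sensor-1", 43)
records = log.consume(p, 0)
print(records)  # both records for sensor-1, in production order
```

The two properties the sketch demonstrates, per-key ordering and replayable reads, are what make Kafka suitable for both pipelines and mission-critical consumers that must recover from failures.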

Storm:

Apache Storm is a free and open source distributed realtime computation system. Apache Storm makes it easy to reliably process unbounded streams of data, doing for realtime processing what Hadoop did for batch processing. Apache Storm is simple, can be used with any programming language, and is a lot of fun to use!

Apache Storm has many use cases: realtime analytics, online machine learning, continuous computation, distributed RPC, ETL, and more. Apache Storm is fast: a benchmark clocked it at over a million tuples processed per second per node. It is scalable, fault-tolerant, guarantees your data will be processed, and is easy to set up and operate.

Apache Storm integrates with the queueing and database technologies you already use. An Apache Storm topology consumes streams of data and processes those streams in arbitrarily complex ways, repartitioning the streams between each stage of the computation however needed.
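A Storm topology wires spouts (stream sources) into bolts (processing stages). A conceptual pure-Python sketch of Storm's classic word-count topology (not the real Storm API, which is JVM-based; Python bolts normally run via Storm's multi-lang protocol or streamparse):

```python
from collections import Counter


def sentence_spout():
    # A spout emits a stream of tuples; in Storm the stream is unbounded,
    # here it is a short fixed sample.
    for line in ["the cow jumped over the moon",
                 "the man went to the store"]:
        yield line


def split_bolt(stream):
    # A bolt transforms one stream into another: sentences into words.
    for sentence in stream:
        yield from sentence.split()


def count_bolt(stream):
    # A stateful bolt keeping running word counts.
    counts = Counter()
    for word in stream:
        counts[word] += 1
    return counts


counts = count_bolt(split_bolt(sentence_spout()))
print(counts["the"])  # 4
```

In real Storm, each stage runs as many parallel tasks across the cluster, and the "repartitioning between stages" mentioned above is the stream grouping (e.g. fields grouping on the word) that routes every occurrence of a word to the same counting task.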

Punchlet:

A punchlet is a small function, written in the Punch language, that parses and transforms log data as it flows through PunchPlatform pipelines built on the tools above (Kafka, Storm, Elasticsearch/Kibana).
