How far can you go with Kibana in “do it yourself” mode?
You expect to use Kibana and run a successful Do It Yourself (DIY) installation, but you wonder whether it is within your reach. Having done this many times ourselves, we will tell you a bit more about the kind of challenge a Kibana/Elasticsearch setup can be, and walk through the Do It Yourself steps one by one.
Kibana: what are the main steps to Do It Yourself?
Many rows… so many rows in your files. In fact, you can't even open them on your desktop to see what is going on. You have had a quick look at data visualization tools, and you think Kibana could help you build your own dashboard on top of your server logs, or the IoT devices you installed in your business. So it's time to get going: you feel almost ready to install the famous ELK stack on your own and use it “like a pro”.
INSTALLATION (not only Kibana!)
First, you download the usual packages on your desktop, run a local instance, and it's done: you are the king of the ELK (Elasticsearch, Logstash, Kibana) stack.
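For the record, a manual install on Linux looks roughly like the sketch below. This is an outline only: replace `<version>` with the current release listed on elastic.co, and repeat the same pattern for the Kibana and Logstash archives.

```shell
# Sketch only -- substitute <version> with the current release from elastic.co
wget https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-<version>-linux-x86_64.tar.gz
tar -xzf elasticsearch-<version>-linux-x86_64.tar.gz
./elasticsearch-<version>/bin/elasticsearch   # listens on localhost:9200 by default

# Same idea for Kibana (port 5601) and Logstash, each from its own archive.
```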
Even though you made it run on a local desktop, you will surely get into trouble when you try to connect these modules to one another: Kibana is not talking to Elasticsearch, and Logstash is sending no data to Elasticsearch. We have run into every one of these issues ourselves.
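As a sketch, the basic wiring usually comes down to two settings. The hostnames below assume everything runs on one machine; the Kibana key is named `elasticsearch.url` in older 6.x releases, so check the documentation for your version.

```yaml
# kibana.yml -- tell Kibana where Elasticsearch lives
elasticsearch.hosts: ["http://localhost:9200"]
```

And on the Logstash side, an output stage that actually ships events to Elasticsearch:

```
# Logstash pipeline -- output stage
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```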
Moreover, let us tell you that it is probably much better to install the stack on a cloud or hybrid infrastructure. Ah… you are not a cloud infrastructure expert? Sorry… see you next year!
PARSING (really ready to do it yourself?)
Ready to go? Everything installed and connected? OK, on to the next task. It seems your Elasticsearch index ingests everything into a single field. Whatever you do, it is always one single field, and there is no way to build a dashboard from that. In other words, you are discovering that Logstash is very sensitive to the way your data is parsed. You will have to deal with Grok, or find someone who knows how to “talk to Grok”. Good luck anyway!
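By way of illustration, a minimal Logstash filter for standard Apache access logs can lean on one of the Grok patterns bundled with Logstash. The pattern name below is real; whether it matches your data depends entirely on your log format, which is exactly where the pain starts.

```
filter {
  grok {
    # COMBINEDAPACHELOG is a pattern shipped with Logstash;
    # it splits an Apache access-log line into named fields
    # (clientip, response, bytes, and so on).
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}
```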
This is not the end of your road trip, because it is now time to tell your index the type of each field. To clarify: if you have specific types that need to be recognized, you will have to work on your dataset and its JSON properties again before indexing and storing the data. And you may get an annoying “MapperParsingException” if your dataset structure changed without you noticing. Never give up 😉
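One way to reduce the surprises is to declare an explicit mapping before indexing anything, so Elasticsearch does not have to guess the field types. A sketch follows; the index and field names are made up for the example, and the exact mapping syntax varies between Elasticsearch versions.

```json
PUT /weblogs
{
  "mappings": {
    "properties": {
      "timestamp":     { "type": "date" },
      "client_ip":     { "type": "ip" },
      "response_time": { "type": "float" }
    }
  }
}
```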
SCALING
This is when the bad words start. Scaling means adapting, among other things, your hardware infrastructure, your queuing tool, and possibly your Elasticsearch/Logstash/Kibana versions. If you do not plan for this kind of thing, some of your datasets may not be shipped correctly. Where has the big data infrastructure expert gone? Who is responsible for data loss now?
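On the queuing side, one option worth knowing about is Logstash's persistent queue, which buffers in-flight events on disk instead of in memory. The settings below are a sketch (the values are examples); the feature is available from Logstash 5.x onward.

```yaml
# logstash.yml -- buffer events on disk so a crash does not drop in-flight data
queue.type: persisted
queue.max_bytes: 4gb
```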
SECURITY (neither only Kibana, nor do it yourself)
Above all, you have to know that the ELK stack does not natively provide security profiles. As a result, you now have two options: let everyone see everything inside both your indexes and Kibana, or invest in a security plugin or custom development. Your main goals will be LDAP integration and SSO. In other words, good luck making that work with little budget and staff.
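If you do go the plugin route, an LDAP realm in the X-Pack plugin, for instance, is configured on the Elasticsearch side. A rough sketch follows; the realm name and LDAP server details are placeholders, and the exact configuration keys differ between plugin versions.

```yaml
# elasticsearch.yml -- assumes the X-Pack security plugin is installed;
# realm name "ldap1" and the LDAP URL are placeholders for the example
xpack.security.enabled: true
xpack.security.authc.realms.ldap1:
  type: ldap
  order: 0
  url: "ldaps://ldap.example.com:636"
```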
KIBANA : DO YOU REALLY STILL WANT TO DO IT YOURSELF ?
In conclusion, there is no way to make it work with a small budget and a reduced staff.
Kibana “do it yourself” is a dead end if you want it effective and secure. Just try another option: why not try Kibana with octave.io?