Ever wondered how the search bars on all the major websites give you suggestions while you type? If you've been scratching your head over the same question, this is the right article to go through.

This article comes in two parts. In the first part we’ll see how this magical feature works and what tools and stack we need to implement it. The second part will delve into the technical details of actually implementing it.

Let’s get started

Let’s implement a search bar in which you type an employee’s age, name, salary, etc., and it lists the matching employees in a table as you type.
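To make the goal concrete, here is a rough sketch of the kind of query Elasticsearch can run on each keystroke. The field names (`name`, `email`, `department`) are just assumptions for this example; `multi_match` with `phrase_prefix` is one of several ways ES matches partially typed input.

```python
def suggest_query(typed_text, fields=("name", "email", "department")):
    """Build an Elasticsearch query body that matches documents whose
    indexed fields start with what the user has typed so far."""
    return {
        "query": {
            "multi_match": {
                "query": typed_text,
                "type": "phrase_prefix",  # match on a prefix of the last word
                "fields": list(fields),
            }
        }
    }
```

On every keystroke the frontend would send this body to ES and render the hits into the table.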

You specify which fields to index when setting up Elasticsearch, so it knows how your database is structured and where to look.

So we’ll tell ES to index only the name, email and department from the employee object. ES works on JSON documents (don’t be baffled, yes it does!). So let’s go ahead and add one employee object to ES, with a simple PUT request that has the employee object as its request body.

Make sure you add only the required keys of the employee data to ES. You don’t want ES to index data you are never going to search for.
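As a minimal sketch of the two points above, assuming a hypothetical local ES server at `http://localhost:9200` and an index named `employees`: trim the employee record down to the fields we actually search on, then PUT it as a JSON document.

```python
import json
import urllib.request

# Assumed schema: the only fields we ever search on.
INDEXED_FIELDS = ("name", "email", "department")

def to_es_doc(employee):
    """Keep only the searchable keys; don't make ES index the rest."""
    return {k: employee[k] for k in INDEXED_FIELDS if k in employee}

def index_employee(employee, emp_id, base_url="http://localhost:9200"):
    """PUT the trimmed employee document into the 'employees' index."""
    req = urllib.request.Request(
        url=f"{base_url}/employees/_doc/{emp_id}",
        data=json.dumps(to_es_doc(employee)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="PUT",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

Note how `to_es_doc` silently drops keys like `salary` before the document ever reaches ES.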

I think I have provided enough theory. Let’s roll!!

What You Need to Implement Autocomplete Functionality

- A search algorithm for different data types.
- An index which tells you what to search for in the database.
- A mechanism to learn a user’s preferences for autosuggest. (Optional)
- A user on whom we can perform experiments (JK) :P

Before we start, you should know there are other alternatives, like Algolia.

Elastic Stack (ELK)

To get started with implementing autocomplete you’ll need three things, known together as ELK: Elasticsearch, Logstash and Kibana.

Elasticsearch is the search engine that does the actual work of matching your query against the indexed data and returning suggestions.

Kibana is a console from which we can execute our queries and visually explore the ES database.

Logstash is the data-processing pipeline that ingests data from various sources, transforms it, and ships it into ES.

Now that we know what this stack does, we can either host it on our own servers and get our hands dirty handling the configs and scaling, or we can host it in the cloud. There are many cloud hosting services available, such as AWS, Google Cloud and Elastic.co.

What I found in my research is that AWS offers the best value for money: it’s cheap, needs little configuration, and has the widest network of servers across the globe. Having a server closer to your user base is beneficial in many ways.

AWS

AWS has its own Elasticsearch hosting service, which makes our task even easier. We just have to select the computing power and storage that fit our needs. It comes with Kibana integrated, so that’s less work for us.

Let’s go ahead and do the following to host ES on AWS:

1. Go to AWS and create an account if you haven’t already.
2. Make sure you select the appropriate (nearest) data center from the top-right dropdown.
3. Click on create a new Domain.
4. Select the computing requirements you need. If you are just testing or developing, select the smallest instance available.
5. When it asks for the network configuration, select public access for now (or set it up as per your own needs).
6. For the access policy, choose the template that allows access to the domain from specific IPs and give it your public IP. (Google “what’s my IP” to find out.)
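The IP-restricted template in the last step generates a resource-based policy roughly like the one below. The region, account ID, domain name and IP are placeholders; your console will fill in your own values.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "*" },
      "Action": "es:*",
      "Resource": "arn:aws:es:us-east-1:123456789012:domain/my-domain/*",
      "Condition": {
        "IpAddress": { "aws:SourceIp": ["203.0.113.25/32"] }
      }
    }
  ]
}
```

Only requests originating from the listed IP are allowed to reach the domain; everything else is rejected.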

It’ll take some time to activate your servers; once they are online you’ll see the Active status on the dashboard.

Congratulations, you’ve successfully hosted Elasticsearch on AWS and it’s live too.

You can go to Kibana (a link is available on the dashboard) and submit a GET request over the REST API to your ES server to check that it’s online.
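You can also do that check from code. A minimal sketch, assuming your ES endpoint URL from the AWS dashboard: a bare GET to the cluster root returns a small JSON blob if the server is up.

```python
import json
import urllib.request
import urllib.error

def es_is_online(base_url, timeout=5):
    """GET the cluster root; return the info JSON if ES responds,
    or None if the server is unreachable."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return json.loads(resp.read().decode("utf-8"))
    except (urllib.error.URLError, OSError, ValueError):
        return None
```

If your AWS access policy only allows your own IP, run this from the machine whose IP you whitelisted, or the request will be rejected.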

Now all that remains is to configure your empty ES server, load it with your database, and map it as per your own needs.

We’ll do that in part 2 of this article, which will also contain the actual queries and Kibana requests. Coming soon.