How to Set Up and Use Elasticsearch with Strapi

Learn how to use Elasticsearch to build a search engine for your application by adding a search feature to the Strapi Foodadvisor application.

Author: Alex Godwin

The need for a search feature in an application cannot be overstated. A good search experience makes users' lives easier and keeps them engaged: how easily users can find a particular resource, or a collection of resources, greatly affects the user experience of a web or mobile application.

There are several ways to achieve search in an application; however, in this article, we’ll be exploring the use of Elasticsearch to build a search engine for our Strapi application by adding a search feature to the Strapi Foodadvisor application.

Prerequisites

Before continuing in this article, you should have the following:

  • Knowledge of JavaScript
  • Node.js (v14 recommended for Strapi)
  • A basic understanding of React
  • Docker

Introduction to Strapi

Strapi is the leading open-source, customizable, headless CMS that gives developers the freedom to choose their favorite tools and frameworks while also allowing editors to manage and distribute their content easily.

Strapi enables the world's largest companies to accelerate content delivery while building beautiful digital experiences by making the admin panel and API extensible through a plugin system.

Scaffolding a Strapi Project

To install Strapi, head over to the documentation. We’ll be using the SQLite database for this project. Run one of the following commands:

    yarn create strapi-app my-project # using yarn
    npx create-strapi-app@latest my-project # using npx

Replace my-project with the name you wish to call your application directory. Your package manager will create a directory with the specified name and install Strapi.

If you have followed the instructions correctly, you should have Strapi installed on your machine. Run the following commands to start the Strapi development server:

    yarn develop # using yarn
    npm run develop # using npm

The development server starts the app on localhost:1337/admin.

What is Elasticsearch?

Elasticsearch, in the words of its own documentation, “helps everyone find what they need faster—from employees who need documents from your intranet to customers browsing online for the perfect pair of shoes”.

In short, Elasticsearch provides a way for you to integrate a full-blown search engine into your application.

Why Should You Use Elasticsearch?

Traditional relational databases do not perform well when searching through large amounts of text. Speed is a huge factor for search, i.e. how quickly accurate results are returned. Below are some benefits of using Elasticsearch:

  • Elasticsearch is great for searching large datasets because of its incredible speed. According to the documentation, Elasticsearch search is near real-time.
  • Elasticsearch is highly customizable; it provides auto-complete and logging features, plus the ability to track what users search for and tailor search suggestions to their needs.
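To see where that speed comes from, here is a toy sketch (an illustration only, not Elasticsearch internals) of the inverted-index idea Elasticsearch builds on: each token maps directly to the documents containing it, so a query never has to scan every row's text.

```javascript
// Toy inverted index: map each token to the set of document ids containing it.
const docs = [
  { id: 1, text: 'best pizza in town' },
  { id: 2, text: 'vegan pizza and salads' },
  { id: 3, text: 'sushi bar downtown' }
]

const index = {}
for (const doc of docs) {
  for (const token of doc.text.toLowerCase().split(/\s+/)) {
    if (!index[token]) index[token] = new Set()
    index[token].add(doc.id)
  }
}

// A lookup is now a direct hash access instead of a full table scan.
console.log([...index['pizza']]) // → [ 1, 2 ]
```

Real Elasticsearch indexes also handle tokenization, stemming, relevance scoring, and distribution across shards, but the core lookup structure is the same idea.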

What to Consider Before Using Elasticsearch

There are a couple of options for running Elasticsearch:

  • Running hosted Elasticsearch service using Elastic Cloud: the easy way to get started with Elasticsearch
  • Running a self-managed Elasticsearch:
    • Running Elasticsearch on a local machine
    • Running Elasticsearch in a Docker container
    • Running Elastic Cloud on Kubernetes (ECK)

If you want to hit the ground running quickly, you should consider using the Elastic Cloud service. It offers all features, but a subscription fee is required, although you can sign up for a free trial.

Running a self-managed Elasticsearch instance means that you get all the features of Elasticsearch for free, but you have to go through the hassle of setting up the instance yourself, e.g. making sure the host machine has enough memory to run Elasticsearch.

Getting Started with Elasticsearch

In this tutorial, we’ll be running a self-managed Elasticsearch instance in a Docker container. If you prefer Elastic Cloud instead, register for a free trial. Follow the steps below to run Elasticsearch using Docker Compose:

  1. Create an Elastic_Deployment directory.
  2. Create a .env file that will store credentials that Elasticsearch requires.

     STACK_VERSION=8.4.2
     ELASTIC_PASSWORD=Elastic_password
     KIBANA_PASSWORD=Kibana_password
     ES_PORT=9200
     CLUSTER_NAME=es-cluster
     LICENSE=basic
     MEM_LIMIT=1073741824
     KIBANA_PORT=5601
     ENTERPRISE_SEARCH_PORT=3002
     ENCRYPTION_KEYS=secret
    

    Replace the Elastic_password and Kibana_password values above with whatever passwords you like.

  3. Create a docker-compose.yaml file and paste the following configurations:

     version: "2.2"
    
     services:
       setup:
         image: docker.elastic.co/elasticsearch/elasticsearch:${STACK_VERSION}
         volumes:
           - certs:/usr/share/elasticsearch/config/certs
         user: "0"
         command: >
           bash -c '
             if [ x${ELASTIC_PASSWORD} == x ]; then
               echo "Set the ELASTIC_PASSWORD environment variable in the .env file";
               exit 1;
             elif [ x${KIBANA_PASSWORD} == x ]; then
               echo "Set the KIBANA_PASSWORD environment variable in the .env file";
               exit 1;
             fi;
             if [ ! -f certs/ca.zip ]; then
               echo "Creating CA";
               bin/elasticsearch-certutil ca --silent --pem -out config/certs/ca.zip;
               unzip config/certs/ca.zip -d config/certs;
             fi;
             if [ ! -f certs/certs.zip ]; then
               echo "Creating certs";
               echo -ne \
               "instances:\n"\
               "  - name: es01\n"\
               "    dns:\n"\
               "      - es01\n"\
               "      - localhost\n"\
               "    ip:\n"\
               "      - 127.0.0.1\n"\
               > config/certs/instances.yml;
               bin/elasticsearch-certutil cert --silent --pem -out config/certs/certs.zip --in config/certs/instances.yml --ca-cert config/certs/ca/ca.crt --ca-key config/certs/ca/ca.key;
               unzip config/certs/certs.zip -d config/certs;
             fi;
             echo "Setting file permissions"
             chown -R root:root config/certs;
             find . -type d -exec chmod 750 \{\} \;;
             find . -type f -exec chmod 640 \{\} \;;
             echo "Waiting for Elasticsearch availability";
             until curl -s --cacert config/certs/ca/ca.crt https://es01:9200 | grep -q "missing authentication credentials"; do sleep 30; done;
             echo "Setting kibana_system password";
             until curl -s -X POST --cacert config/certs/ca/ca.crt -u elastic:${ELASTIC_PASSWORD} -H "Content-Type: application/json" https://es01:9200/_security/user/kibana_system/_password -d "{\"password\":\"${KIBANA_PASSWORD}\"}" | grep -q "^{}"; do sleep 10; done;
             echo "All done!";
           '
         healthcheck:
           test: ["CMD-SHELL", "[ -f config/certs/es01/es01.crt ]"]
           interval: 1s
           timeout: 5s
           retries: 120
    
       es01:
         depends_on:
           setup:
             condition: service_healthy
         image: docker.elastic.co/elasticsearch/elasticsearch:${STACK_VERSION}
         volumes:
           - certs:/usr/share/elasticsearch/config/certs
           - esdata01:/usr/share/elasticsearch/data
         ports:
           - ${ES_PORT}:9200
         environment:
           - node.name=es01
           - cluster.name=${CLUSTER_NAME}
           - cluster.initial_master_nodes=es01
           - ELASTIC_PASSWORD=${ELASTIC_PASSWORD}
           - bootstrap.memory_lock=true
           - xpack.security.enabled=true
           - xpack.security.http.ssl.enabled=true
           - xpack.security.http.ssl.key=certs/es01/es01.key
           - xpack.security.http.ssl.certificate=certs/es01/es01.crt
           - xpack.security.http.ssl.certificate_authorities=certs/ca/ca.crt
           - xpack.security.http.ssl.verification_mode=certificate
           - xpack.security.transport.ssl.enabled=true
           - xpack.security.transport.ssl.key=certs/es01/es01.key
           - xpack.security.transport.ssl.certificate=certs/es01/es01.crt
           - xpack.security.transport.ssl.certificate_authorities=certs/ca/ca.crt
           - xpack.security.transport.ssl.verification_mode=certificate
           - xpack.license.self_generated.type=${LICENSE}
         mem_limit: ${MEM_LIMIT}
         ulimits:
           memlock:
             soft: -1
             hard: -1
         healthcheck:
           test:
             [
                 "CMD-SHELL",
                 "curl -s --cacert config/certs/ca/ca.crt https://localhost:9200 | grep -q 'missing authentication credentials'",
             ]
           interval: 10s
           timeout: 10s
           retries: 120
    
       kibana:
         depends_on:
           es01:
             condition: service_healthy
         image: docker.elastic.co/kibana/kibana:${STACK_VERSION}
         volumes:
           - certs:/usr/share/kibana/config/certs
           - kibanadata:/usr/share/kibana/data
         ports:
           - ${KIBANA_PORT}:5601
         environment:
           - SERVERNAME=kibana
           - ELASTICSEARCH_HOSTS=https://es01:9200
           - ELASTICSEARCH_USERNAME=kibana_system
           - ELASTICSEARCH_PASSWORD=${KIBANA_PASSWORD}
           - ELASTICSEARCH_SSL_CERTIFICATEAUTHORITIES=config/certs/ca/ca.crt
           - ENTERPRISESEARCH_HOST=http://enterprisesearch:${ENTERPRISE_SEARCH_PORT}
         mem_limit: ${MEM_LIMIT}
         healthcheck:
           test:
             [
                 "CMD-SHELL",
                 "curl -s -I http://localhost:5601 | grep -q 'HTTP/1.1 302 Found'",
             ]
           interval: 10s
           timeout: 10s
           retries: 120
    
       enterprisesearch:
         depends_on:
           es01:
             condition: service_healthy
           kibana:
             condition: service_healthy
         image: docker.elastic.co/enterprise-search/enterprise-search:${STACK_VERSION}
         volumes:
           - certs:/usr/share/enterprise-search/config/certs
           - enterprisesearchdata:/usr/share/enterprise-search/config
         ports:
           - ${ENTERPRISE_SEARCH_PORT}:3002
         environment:
           - SERVERNAME=enterprisesearch
           - secret_management.encryption_keys=[${ENCRYPTION_KEYS}]
           - allow_es_settings_modification=true
           - elasticsearch.host=https://es01:9200
           - elasticsearch.username=elastic
           - elasticsearch.password=${ELASTIC_PASSWORD}
           - elasticsearch.ssl.enabled=true
           - elasticsearch.ssl.certificate_authority=/usr/share/enterprise-search/config/certs/ca/ca.crt
           - kibana.external_url=http://kibana:5601
         mem_limit: ${MEM_LIMIT}
         healthcheck:
           test:
             [
                 "CMD-SHELL",
                 "curl -s -I http://localhost:3002 | grep -q 'HTTP/1.1 302 Found'",
             ]
           interval: 10s
           timeout: 10s
           retries: 120
    
     volumes:
       certs:
         driver: local
       enterprisesearchdata:
         driver: local
       esdata01:
         driver: local
       kibanadata:
         driver: local
    

    To start the Elasticsearch instance, run:

     docker-compose up --remove-orphans
    

    If you get an error relating to vm.max_map_count, then refer to this documentation on how to solve it for your particular OS.
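On Linux, that error is usually fixed by raising the kernel's vm.max_map_count setting, which Elasticsearch needs for its memory-mapped files; a sketch (requires root, and resets on reboot unless you persist it in /etc/sysctl.conf):

```shell
# Raise the mmap count limit Elasticsearch requires (Linux only).
sudo sysctl -w vm.max_map_count=262144
```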

The Strapi FoodAdvisor Application

FoodAdvisor is the official Strapi demo application; you can learn a lot about Strapi by studying the repo.

To clone the repository, run the following command in your terminal:

    git clone https://github.com/strapi/foodadvisor.git

Follow the instructions below to start the foodadvisor application:

Server:

  1. Navigate to the api directory of the foodadvisor application by running cd api in your terminal.
  2. In the foodadvisor/api directory, copy the contents of the env.example file into a .env file.
  3. Run the following commands:
     yarn && yarn seed && yarn develop
    
    The Strapi server should be up and running on localhost:1337.

Client:

  1. Navigate to the client directory of the foodadvisor application by running cd client in your terminal.
  2. Run the following commands:
     yarn && yarn dev
    
    The Next.js client should be running on localhost:3000.

Integrating Elasticsearch into the Strapi Server

To connect Elasticsearch to our Strapi server, we need to install a client that allows our server to talk to Elasticsearch. Run one of the commands below:

    yarn add @elastic/elasticsearch # using yarn
    npm i @elastic/elasticsearch # using npm

In the foodadvisor/api directory, follow the instructions below.

  1. Create a helpers directory - mkdir helpers.
  2. In the helpers directory, create an elastic_client.js file by running the command below:

     cd helpers
     touch elastic_client.js
    
  3. Update the content of elastic_client.js with the following:

     const { Client } = require('@elastic/elasticsearch')
     const host = process.env.ELASTIC_HOST
    
     const fs = require('fs')
    
     const connector = () => {
    
       return new Client({
         node: host,
         auth: {
           username: process.env.ELASTIC_USERNAME,
           password: process.env.ELASTIC_PASSWORD
         },
         tls: {
           ca: fs.readFileSync('./http_ca.crt'),
           rejectUnauthorized: false
         }
       })
     }
    
     const testConn = (client) => {
       client.info()
         .then(response => console.log(response))
         .catch(error => console.error(error))
     }
    
     module.exports = {
       connector,
       testConn
     }
    

To get this connection working properly:

  1. Update the contents of the .env file in the foodadvisor/api directory. Open up the .env file and add the following credentials to it:
     ELASTIC_HOST=https://localhost:9200
     ELASTIC_USERNAME=elastic
     ELASTIC_PASSWORD=ELASTIC_PASSWORD_FROM_YOUR_ELASTIC_DEPLOYMENT
    
    With the Docker Compose setup above, Elasticsearch is exposed on https://localhost:9200, and elastic is the default superuser of a self-managed instance. ELASTIC_PASSWORD should be the same password you set in your Elastic_Deployment/.env file.

    Recall that the tls section of elastic_client.js reads a local certificate file:
    tls: {
          ca: fs.readFileSync('./http_ca.crt'),
          rejectUnauthorized: false
        }

To get the ./http_ca.crt file, we have to go into the Elasticsearch container. To do that, run:

    docker exec -it <CONTAINER_ID> /bin/bash
    cd config/certs/ca
    cat ca.crt

CONTAINER_ID is the id of the container exposed on :9200 (run docker ps to find it). Copy the output of the cat command, then create a file in the foodadvisor/api directory called http_ca.crt and paste the output into it.
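Alternatively, if you would rather not copy and paste the certificate by hand, docker cp can pull the file straight out of the container; a sketch, assuming <CONTAINER_ID> is the same container id as above:

```shell
# Copy the CA certificate out of the Elasticsearch container
# into the current directory (run this from foodadvisor/api).
docker cp <CONTAINER_ID>:/usr/share/elasticsearch/config/certs/ca/ca.crt ./http_ca.crt
```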

  2. Create an elastic_index.js file in the foodadvisor/api/scripts directory and update its contents with the following lines of code:

     const strapi_url = 'http://localhost:1337/'
     const axios = require('axios')
     require('array.prototype.flatmap').shim()
    
     const { connector, testConn } = require('../helpers/elastic_client')
    
     const client = connector()
     testConn(client)
    
     const run = async () => {
    
       const response = await axios.get(`${strapi_url}api/search/restaurants`)
    
       const dataset = response.data
    
       await client.indices.create({
         index: 'foodadvisor-restaurant',
         mappings: {
           properties: {
             id: { type: 'integer' },
             name: { type: 'text' },
             slug: { type: 'keyword' },
             location: { type: 'text' },
             description: { type: 'text' },
             url: { type: 'text' }
           }
         }
       }, { ignore: [400] })
    
       const operations = dataset.flatMap(doc => [{ index: { _index: 'foodadvisor-restaurant' } }, doc])
    
       const bulkResponse = await client.bulk({ refresh: true, operations })
    
       if (bulkResponse.errors) {
         const erroredDocuments = []
         // The items array has the same order of the dataset we just indexed.
         // The presence of the `error` key indicates that the operation
         // that we did for the document has failed.
         bulkResponse.items.forEach((action, i) => {
           const operation = Object.keys(action)[0]
           if (action[operation].error) {
             erroredDocuments.push({
               // If the status is 429 it means that you can retry the document,
               // otherwise it's very likely a mapping error, and you should
               // fix the document before to try it again.
               status: action[operation].status,
               error: action[operation].error,
               operation: operations[i * 2],
               document: operations[i * 2 + 1]
             })
           }
         })
       }
    
       const count = await client.count({ index: 'foodadvisor-restaurant' })
    
       console.log(count)
    
     }
    
     run().catch(console.log)
    

In the code snippet above, we’re creating an index programmatically. An index is an optimized collection of documents and each document is a collection of fields, which are the key-value pairs that contain your data. Next, we’re creating a run() function which performs a bulk insert of data into the created index.
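The bulk API expects the operations array to alternate an action line with the document it applies to, which is exactly what the flatMap call produces; a minimal sketch with made-up sample data:

```javascript
// Each document is preceded by an action object naming its target index;
// flatMap interleaves the two into one flat array for client.bulk().
const dataset = [
  { id: 1, name: 'Mint Lounge' },
  { id: 2, name: 'Biscotte Restaurant' }
]

const operations = dataset.flatMap(doc => [
  { index: { _index: 'foodadvisor-restaurant' } },
  doc
])

// action, doc, action, doc → twice as many entries as documents
console.log(operations.length) // → 4
```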

Run npm i array.prototype.flatmap to install the array.prototype.flatmap package.

Make sure your Strapi server is up and running, then run the following command from the foodadvisor/api directory to populate your foodadvisor-restaurant Elasticsearch index:

    node scripts/elastic_index.js

Generating a Strapi API

To generate a Strapi API, run the following:

  1. cd api && yarn strapi generate api
  2. Name the generated API search.
  3. When asked if the API is for a plugin, select "no" as the answer.

The generated API should be located in foodadvisor/api/src/api/search. The content of the directory includes:

  1. Routes Directory: Update the contents of its search.js file:

     module.exports = {
       routes: [
         {
          method: 'GET',
          path: '/search/restaurants',
          handler: 'search.restaurants',
          config: {
            policies: [],
            middlewares: [],
            auth: false
          },
         },
         {
           method: 'POST',
           path: '/search/restaurants',
           handler: 'search.search_restaurants',
           config: {
             policies: [],
             middlewares: [],
             auth: false
           },
          },
       ],
     };
    
  2. Controllers Directory: Update the contents of its search.js file:

     'use strict';
    
     /**
      * A set of functions called "actions" for `search`
      */
    
     module.exports = {
       restaurants: async (ctx, next) => {
         try {
           const data = await strapi.service('api::search.search').restaurants()
           // console.log('here', data)
           ctx.body = data
         } catch (err) {
           ctx.body = err;
         }
       },
    
       search_restaurants: async(ctx, next) => {
         try {
           const data = await strapi.service('api::search.search').search_restaurants(ctx.query)
           // console.log('here', ctx.query)
           ctx.body = data
         } catch (err) {
           ctx.body = err;
         }
       }
    
     };
    
  3. Services Directory: Update the contents of its search.js file:

     'use strict';

     const { connector, testConn } = require('../../../../helpers/elastic_client')

     const client = connector()

     /**
      * search service
      */

     module.exports = ({ strapi }) => ({
       restaurants: async () => {
         const data = await strapi.entityService.findMany('api::restaurant.restaurant', {
           populate: { information: true, place: true, images: true }
         })

         const mappedData = data.map((el, i) => {
           return { id: el.id, slug: el.slug, name: el.name, description: el.information.description, location: el.place.name, image: el.images[0].url }
         })

         return mappedData
       },

       search_restaurants: async (data) => {
         // test the client's connection to Elasticsearch
         testConn(client)

         async function read() {
           const search = data.s
           const field = data.field || 'name'

           const body = await client.search({
             index: 'foodadvisor-restaurant',
             body: {
               query: {
                 regexp: {
                   [field]: {
                     value: `${search}.*`,
                     flags: 'ALL',
                     case_insensitive: true,
                   },
                 },
               },
             },
           })

           const mappedData = body.hits.hits

           // replace each hit with the fully populated Strapi entity
           await Promise.all(mappedData.map(async (el, i) => {
             mappedData[i] = await strapi.entityService.findOne('api::restaurant.restaurant', el._source.id, {
               populate: { information: true, place: true, images: true, category: true }
             })
           }))

           // reshape each entity to match the structure the Next.js client expects
           mappedData.map((el, i) => {
             const images = el.images
             const place = el.place
             const category = el.category
             delete el.images
             delete el.place
             delete el.category

             const imageData = []
             images.forEach(el => {
               imageData.push({ id: el.id, attributes: el })
             })
             el.images = { data: imageData }
             el.place = { data: { attributes: place } }
             el.category = { data: { attributes: category } }
           })

           return mappedData
         }

         return read().catch(console.log)
       },

       populate_restaurants: async (data) => {
         await Promise.all(data.map(async (el, i) => {
           data[i] = await strapi.entityService.findOne('api::restaurant.restaurant', el.id, {
             populate: { information: true, place: true, images: true, category: true }
           })
         }))

         data.map((el, i) => {
           const images = el.images
           const place = el.place
           const category = el.category
           delete el.images
           delete el.place
           delete el.category

           const imageData = []
           images.forEach(el => {
             imageData.push({ id: el.id, attributes: el })
           })
           el.images = { data: imageData }
           el.place = { data: { attributes: place } }
           el.category = { data: { attributes: category } }
         })

         return data
       }
     });
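Stripped of the Strapi plumbing, the search_restaurants service sends Elasticsearch a regexp query; a sketch of the query body it builds, with illustrative values (buildQuery is a hypothetical helper, not part of the service):

```javascript
// Build the same regexp query body the service sends to Elasticsearch.
// `search` and `field` mirror the values read from ctx.query.
const buildQuery = (search, field = 'name') => ({
  query: {
    regexp: {
      [field]: {
        value: `${search}.*`,       // prefix-style wildcard match
        flags: 'ALL',
        case_insensitive: true
      }
    }
  }
})

console.log(buildQuery('mint').query.regexp.name.value) // → mint.*
```

Because the pattern is anchored at the start of the field, this behaves like a case-insensitive prefix search on whichever field the client asks for.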


Updating the Next.js Frontend

  1. Update the API request to include a POST request to the restaurant search route. In the client/utils/index.js file, add the following lines of code to its content:

    export async function search(searchText) {
      console.log('searching', searchText)
      const resRestaurants = await fetch(
        getStrapiURL(`/search/restaurants?s=${searchText}`),
        {
          method: "POST"
        }
      )

      const restaurants = await resRestaurants.json()

      return { restaurants: restaurants, count: restaurants.length }
    }
  2. Next, in client/pages/restaurants/index.js, update the following code appropriately:

     //other imports
     import { getData, getRestaurants, getStrapiURL, search } from "../../utils";

     //other useState hooks
     const [searchText, setSearchText] = useState('')
     const [searchData, setSearchData] = useState('')

     //replace the <Container> tag and its children with the following

         <Header {...header} />
         <div className="flex flex-col content-end items-center md:flex-row gap-2 my-24 px-4">
           <div>
             {/* categories */}
             <select
               className="block w-52 py-2 px-3 border border-gray-300 bg-white rounded-md shadow-sm focus:outline-none focus:ring-primary-500 focus:border-primary-500"
               onChange={(value) => {
                 setCategoryId(delve(value, "target.value"))
                 setSearchData('')
               }}
             >
               <option value="">
                 {categoryId
                   ? "Clear filter"
                   : categoryText || "Select a category"}
               </option>
               {categories &&
                 categories.map((category, index) => (
                   <option
                     key={`categoryOption-${index}`}
                     value={delve(category, "attributes.id")}
                   >
                     {delve(category, "attributes.name")}
                   </option>
                 ))}
             </select>
           </div>
           <div>
             {/* location */}
             <select
               className="block w-52 py-2 px-3 border border-gray-300 bg-white rounded-md shadow-sm focus:outline-none focus:ring-primary-500 focus:border-primary-500"
               onChange={(value) => {
                 setPlaceId(delve(value, "target.value"))
                 setSearchData('')
               }}
             >
               <option value="">
                 {placeId ? "Clear filter" : placeText || "Select a place"}
               </option>
               {places &&
                 places.map((place, index) => (
                   <option
                     key={`placeOption-${index}`}
                     value={delve(place, "attributes.id")}
                   >
                     {delve(place, "attributes.name")}
                   </option>
                 ))}
             </select>
           </div>
           {/* search */}
           <div className="flex flex-col md:flex-row justify-items-end gap-2 px-2">
             <input className="block w-80 right-0 py-2 px-3 border border-gray-300 bg-white rounded-md shadow-sm focus:outline-none focus:ring-primary-500 focus:border-primary-500" placeholder="Search Restaurants" onChange={(event) => {
               setSearchText(event.target.value)
             }}/>
             <button
                   type="button"
                   className={`${
                     searchText.length <= 2 ? "cursor-not-allowed opacity-50" : ""
                   } w-1/4 p-4 border rounded-full bg-primary hover:bg-primary-darker text-white hover:bg-gray-100 focus:outline-none`} disabled={searchText.length <= 2} onClick={async () => {
                     const res = await search(searchText)
                     setSearchData(res)
                     setCategoryId(null)
                     setPlaceId(null)
                     // console.log(data.restaurants)
                   }}
                 >
                   Search
                 </button>
           </div>
         </div>
    
         <NoResults status={status || (searchData != '' && searchData.length == 0)} length={ searchData != '' ? searchData.restaurants.length : delve(data, "restaurants").length} />
    
         {/* render initial data || search results  */}
         {searchData.length <= 0 ? <div className="grid md:grid-cols-3 sm:grid-cols-2 grid-cols-1 gap-16 mt-24 px-4">
           {status === "success" &&
             delve(data, "restaurants") &&
             data.restaurants.map((restaurant, index) => (
               <RestaurantCard
                 {...restaurant.attributes}
                 locale={locale}
                 key={index}
               />
             ))}
         </div> : <div className="grid md:grid-cols-3 sm:grid-cols-2 grid-cols-1 gap-16 mt-24 px-4">
           {status === "success" &&
             delve(data, "restaurants") &&
             searchData.restaurants.map((restaurant, index) => (
               <RestaurantCard
                 {...restaurant}
                 locale={locale}
                 key={index}
               />
             ))}
         </div>
             }
    
        {delve(data, "count") > 0 && (
          <div className="grid grid-cols-3 gap-4 my-24">
            <div className="col-start-2 col-end-3">
              {searchData.length <= 0 ? <div className="flex items-center">
                <button
                  type="button"
                  className={`${
                    pageNumber <= 1 ? "cursor-not-allowed opacity-50" : ""
                  } w-full p-4 border text-base rounded-l-xl text-gray-600 bg-white hover:bg-gray-100 focus:outline-none`}
                  onClick={() => setPageNumber(pageNumber - 1)}
                  disabled={pageNumber <= 1}
                >
                  Previous
                </button>

                <button
                  type="button"
                  className={`${
                    pageNumber >= lastPage
                      ? "cursor-not-allowed opacity-50"
                      : ""
                  } w-full p-4 border-t border-b border-r text-base rounded-r-xl text-gray-600 bg-white hover:bg-gray-100 focus:outline-none`}
                  onClick={() => setPageNumber(pageNumber + 1)}
                  disabled={pageNumber >= lastPage}
                >
                  Next
                </button>
              </div>: ''}
            </div>
          </div>
        )}
      </Container>


Test-running the Search Feature

Navigate to the FoodAdvisor restaurant page located at http://localhost:3000/restaurants?lang=en.

Here’s how the page should look:

demo

Type any text into the search box, then click the search button. Depending on the text you entered, either a collection of results is returned or the no-results component is rendered.

Valid search results:

App Screenshot

Invalid search results:

App Screenshot

Conclusion

In this article, we discussed what Elasticsearch is and its benefits. We also saw how to create Elastic indices programmatically and how to integrate Elasticsearch into a Strapi application. This is just the tip of the iceberg of what Elasticsearch can do; I hope that with these basics nailed down, you’re ready to explore more of its features.

Here’s a link to this project’s GitHub repository.