Extending Firebase with Cloud Functions & Elasticsearch

Firebase is a hosted NoSQL solution from Google that focuses on realtime data and ease of use for developers. It features easy user authentication, security rules, realtime communication and blob storage. With these great features come some limitations in query support: Firebase only supports startAt(), endAt() and equalTo() conditions, with no form of full-text search, let alone aggregations / facets. In this tutorial we’ll show you how to add full-text query and aggregation support to Firebase using Elasticsearch and Firebase Cloud Functions.

We won’t bother with a frontend for now, but focus on the Elasticsearch and Firebase Cloud Functions part. We’ll use an example cars database found online and add data and queries in Firebase using the Firebase CLI.

If you are already experienced with Firebase or have a running Elasticsearch instance, you can skip the setup part and jump right into the Firebase cloud functions.

 

Setup Firebase

To get started with Firebase you need to create a new Firebase project: in your Firebase console, click the ‘Add project’ button, name your project and click the ‘Create project’ button.

Upgrade your Firebase plan to ‘Blaze’ to allow network connections to the Elasticsearch VM we are about to create. The Blaze plan will cost actual money. After starting the upgrade, it might take a few minutes to actually complete.

 

Setup ElasticSearch VM

Go to https://console.cloud.google.com/ and log in with your Google account, or use https://console.cloud.google.com/freetrial to get a free trial of Google Cloud Platform with $300 credit (it does require a credit card). Once you are in the Google Cloud Console, go to the Cloud Launcher using the sidenav, search for ‘Elasticsearch’ (Inception!) and select the Bitnami image.

The Bitnami image is the cheap option, which is perfectly fine for getting some hands-on experience. Once you have selected the Bitnami Elasticsearch VM, click the ‘Launch on Compute Engine’ button. This might trigger an ‘Enable billing’ dialog if you didn’t already have billing enabled. In the following screen you can reduce the VM machine type from ‘small’ to ‘micro’, as we’re not putting any production load on it in this tutorial, then click the ‘Deploy’ button.

 

The deployment might take a little while, but the deployment manager will eventually present you with the details we need to get data into the search index and query it: the IP address (ephemeral, but usable for now) and the Elasticsearch credentials. After a successful deployment you can find the instance in the sidenav menu under ‘Compute Engine’. Keep track of the site address, admin user and admin password; you will need these later when configuring the Firebase cloud function.

 

Setup Elasticsearch mapping

For a basic UI in your browser to send and receive Elasticsearch requests, you can install the Sense Chrome extension. It’s a bit buggy, but works well enough.

https://chrome.google.com/webstore/detail/sense-beta/lhjgkmllcaadmopgmanpapmpjgmfcfig?hl=en

Now we need to tell Elasticsearch what kind of data we’ll be sending and querying. These mappings can be configured by sending an HTTP request using the Sense Chrome extension (or any other HTTP client, like curl). This requires the credentials shown in the Bitnami Elasticsearch VM details; the default username is always ‘user’.

Here you can find our Elasticsearch cars mapping
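In case the mapping doesn’t render, a minimal sketch could look like the request below (the field names are assumptions borrowed from the classic cars dataset, and the syntax assumes Elasticsearch 5.x; adapt both to your own data). You can paste it straight into Sense:

```json
PUT /cars
{
  "mappings": {
    "car": {
      "properties": {
        "name":       { "type": "text" },
        "horsepower": { "type": "integer" },
        "year":       { "type": "integer" },
        "origin":     { "type": "keyword" }
      }
    }
  }
}
```

The keyword type keeps origin unanalyzed, so we can aggregate on it later; text makes name full-text searchable.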

 

Setup Firebase tools

> npm install -g firebase-tools

The first command installs the Firebase CLI tool. This tool helps you import, export and deploy Firebase projects. A deployment can consist of static files, security rules and cloud functions. We’ll focus on the cloud functions.

> firebase login

The second command authenticates you with Firebase using your default browser. If all goes well, it shows a ‘Firebase CLI Login Successful’ page and you can close the browser. These credentials are then used whenever you deploy, import or export using the Firebase CLI.

> firebase init functions

Next we initialize a Firebase functions project. You will be asked to select a Firebase project. A new functions directory will be bootstrapped with some npm dependencies, together with a .firebaserc and a firebase.json file.

 

The Firebase Cloud functions

Now we have all the basics set up: a Firebase project in the cloud, an Elasticsearch instance and a local project folder.

Firebase Cloud Functions are triggers with a callback that gets executed when one of the following happens:

  • Data is written to the database
  • A user is created or deleted
  • Something changes in file storage
  • An HTTP(S) call is made
  • A message is published on a topic in Google Pub/Sub
  • A specified Firebase Analytics event occurs

 

For our case with Elasticsearch we’ll use the database onWrite event to monitor anything that is written to /cars/{carId}. carId is a placeholder whose value can be retrieved as an event parameter.

To communicate with the Elasticsearch VM we’ll require some additional dependencies:

> npm install --save request request-promise lodash

First we’ll filter the data we receive from Firebase to only include the fields used in Elasticsearch, then we post the JSON object to the Elasticsearch cars index. For our credentials and host configuration we use Firebase config variables, to separate the code from the configuration. If we don’t receive any data, it means the data was deleted and we can safely delete it from the Elasticsearch index as well.

 

/functions/index.js
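In case the embedded source doesn’t render, a minimal sketch of the function could look like this. It uses the firebase-functions v0.x event API of the time; the export name and the indexed field names are assumptions, so adapt them to your own mapping:

```javascript
// functions/index.js (sketch): sync /cars/{carId} writes to Elasticsearch.
const functions = require('firebase-functions');
const request = require('request-promise');
const _ = require('lodash');

// Host and credentials come from the Firebase environment config,
// set via `firebase functions:config:set elasticsearch.url=...` etc.
const elasticsearch = functions.config().elasticsearch;

exports.indexCarsToElasticsearch = functions.database.ref('/cars/{carId}')
  .onWrite(event => {
    const carId = event.params.carId;
    const car = event.data.val();

    const options = {
      uri: `${elasticsearch.url}cars/car/${carId}`,
      auth: {
        username: elasticsearch.username,
        password: elasticsearch.password
      },
      json: true
    };

    if (car === null) {
      // No data means the node was deleted: remove it from the index too.
      options.method = 'DELETE';
    } else {
      // Only send the fields we mapped in Elasticsearch (names are assumptions).
      options.method = 'PUT';
      options.body = _.pick(car, ['name', 'horsepower', 'year', 'origin']);
    }

    return request(options);
  });
```

The onWrite trigger fires for creates, updates and deletes alike; a null snapshot value is how we detect the delete case.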

Keep in mind that the Firebase cloud function will not be called when / or /cars itself gets deleted, or when you use the Firebase Admin SDK or a console import!

To set the needed firebase config variables we use the Firebase CLI functions:config:set command:

> firebase functions:config:set elasticsearch.username="user" elasticsearch.password="my_password" elasticsearch.url="http://104.154.41.53/elasticsearch/"

And now its time to deploy our first Firebase cloud function!

> firebase deploy

 

Import test data

Now it’s time to import some cars data using a Node script that reads the first 1000 lines of a cars dataset. We will need a Firebase service account. You can download the keys for the service account in the Firebase console => Project settings => Service accounts page.

 

Place the service account key json file in a new import directory (outside the functions directory), download this file as cars.json in that same import directory and copy the following file as import.js.
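In case the embedded import.js doesn’t render, a rough sketch could look like this (the service account file name and the database URL placeholder are assumptions; adjust them to your project):

```javascript
// import/import.js (sketch): push the first 1000 cars into /cars.
const admin = require('firebase-admin');
const serviceAccount = require('./service-account.json');
const cars = require('./cars.json');

admin.initializeApp({
  credential: admin.credential.cert(serviceAccount),
  databaseURL: 'https://<your-project-id>.firebaseio.com'
});

const db = admin.database();

// Write each car under /cars/{index}; our cloud function picks up each write.
const writes = cars.slice(0, 1000).map((car, index) =>
  db.ref(`/cars/${index}`).set(car)
);

Promise.all(writes)
  .then(() => {
    console.log('Import done');
    process.exit(0);
  })
  .catch(err => {
    console.error('Import failed', err);
    process.exit(1);
  });
```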

Now, let’s run our import using the following commands:

> npm install --save firebase-admin
> node import.js

When all the data is imported, we can query Elasticsearch with some aggregations, and we should receive 1000 hits in total.
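For example, a search with a terms aggregation could look like this in Sense (the origin field name is an assumption; it needs to be mapped as a non-analyzed keyword field to aggregate on it):

```json
POST /cars/_search
{
  "size": 0,
  "query": { "match_all": {} },
  "aggs": {
    "by_origin": {
      "terms": { "field": "origin" }
    }
  }
}
```

hits.total should report 1000, and the by_origin buckets give us the facet counts per origin.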

 

All Firebase cloud function invocations can be seen in the logs. You will see errors when, for example, the Elasticsearch VM was not available.

 

Conclusion

The actual code needed to run triggers based on Firebase database writes is minimal. Just the way we like it. As said, the Elasticsearch indexing is pretty basic: we’ll need to improve the error handling and find ways to trigger re-indexing when needed. But Firebase Cloud Functions are definitely a better solution than previous approaches like Flashlight and Firebase queues.

Liked it? Questions? Improvements? Let me know!
