Node.js and MongoDB on AWS Lambda using Serverless Framework

May 18, 2021

Tested with Ubuntu Desktop 18.04.5 LTS and Node.js v14.17.0 LTS

In the previous post we created a simple REST API to manage todos and we deployed it on Heroku.

In this post we are going to deploy the same API on AWS Lambda, making a few small changes to our code and relying on Serverless Framework to configure the cloud infrastructure.

The source code is available on GitHub.

Before beginning the tutorial, I suggest reading the brief introduction about AWS Lambda provided by the Serverless Framework documentation.

Create an AWS account

We will deploy our solution on AWS Lambda, so we need to create an AWS account in order to access all the services we will use.

Create a MongoDB Atlas account and setup a cluster

We will use a fully managed instance of MongoDB provided by Atlas. We need to create a MongoDB Atlas account and then set up a cluster following this guide.

Once Atlas has finished the provisioning, we have to click on CONNECT and then Connect your application.

Copy the provided connection string, open a text editor and paste it there. We will use it soon.

Installing Serverless Framework

To install Serverless Framework we have to run

$ npm install -g serverless

To check if it works, we can run

$ serverless --version

and the output should be similar to

Framework Core: 2.41.1
Plugin: 4.6.0
SDK: 4.2.2
Components: 3.9.2

Creating an IAM user and configuring Serverless Framework to use AWS Access Keys

To do this task quickly, we can use the short video tutorial provided in the official Serverless Framework documentation.

Important: the documentation states clearly

Note that the above steps grant the Serverless Framework administrative access to your account. While this makes things simple when you are just starting out, we recommend that you create and use more fine grained permissions once you determine the scope of your serverless applications and move them into production.

Fine, all the boring stuff is done.

Initializing our solution

Let's create the folder that will contain our solution

$ mkdir todo-api-aws-lambda

then

$ cd todo-api-aws-lambda
$ npm init -y

It is time to install the libraries we will use in our code

$ npm install mongoose && npm install mongodb-memory-server tap --save-dev

mongodb-memory-server and tap are the two libraries we will use to write tests.

Open package.json and replace

// package.json
"scripts": {
  "test": "echo \"Error: no test specified\" && exit 1"
}

with

// package.json
"scripts": {
  "test": "clear; tap --reporter=spec --watch"
}

Let's create a folder test where we keep our test files.

Time to create the service

$ serverless create --template aws-nodejs

After running the command above, we should find 3 additional files in our solution folder: handler.js, .npmignore and serverless.yml. I personally hate YAML and fortunately it is possible to use its JSON counterpart.

We can remove serverless.yml and replace it with serverless.json, whose content is the following

// serverless.json
{
  "service": "todo-api-aws-lambda",
  "frameworkVersion": "2",
  "provider": {
    "apiGateway": {
      "shouldStartNameWithService": true
    },
    "environment": {
      "MONGODB_URI": "mongodb+srv://<username>:<password>@<cluster-name>.mongodb.net/<db-name>"
    },
    "lambdaHashingVersion": 20201221,
    "name": "aws",
    "region": "eu-central-1",
    "runtime": "nodejs14.x",
    "stage": "dev"
  },
  "functions": ["${file(./todos/functions.json)}"]
}

This file contains everything related to the deployment on AWS Lambda.

In the provider.environment section we have to replace the value of MONGODB_URI with the connection string we pasted into the text editor before (filling in username, password, cluster name and database name). This is how we set environment variables on AWS Lambda through Serverless Framework.

For the region property we should use the same region as the MongoDB Atlas cluster. It is not mandatory, but deploying the database and our service in the same region keeps the latency between them to a minimum.

In the functions property we are using a Serverless Framework variable: the functions of our service are defined in the file functions.json inside the folder todos. When the service to deploy is complex, we can split its configuration across different folders and files in order to keep it easy to develop and maintain.

Let's create the folder todos and, inside it, the file functions.json with this content

{ "create": { "handler": "./todos/handler.create", "events": [ { "http": { "method": "post", "path": "todos" } } ] }, "remove": { "handler": "./todos/handler.remove", "events": [ { "http": { "method": "delete", "path": "todos/{id}" } } ] }, "get": { "handler": "./todos/handler.get", "events": [ { "http": { "method": "get", "path": "todos", "request": { "parameters": { "querystrings": { "is-completed": true, "sort-by": true } } } } } ] }, "getById": { "handler": "./todos/handler.getById", "events": [ { "http": { "method": "get", "path": "todos/{id}" } } ] }, "update": { "handler": "./todos/handler.update", "events": [ { "http": { "method": "put", "path": "todos/{id}" } } ] } }

Our REST API provides functions to create, read (get and getById), update and delete todos.

The handler property defines which JavaScript function will manage the request.

The events property defines which events will trigger an invocation of the function.
http events are the ones triggered by AWS API Gateway. For each event we can specify the HTTP method and the path the client has to use.

The path property can have

  • path parameters, defined using the {paramName} syntax as in todos/{id}

  • query string parameters, which must be listed within the property request.parameters.querystrings, as we did for is-completed and sort-by in the get function
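
As a reference for the next sections, this is roughly the shape of the event object that API Gateway hands to our Lambda functions. It is a trimmed-down, illustrative example; the real event contains many more fields, such as headers and requestContext.

// Illustrative event for GET /todos?is-completed=true&sort-by=createdAt.asc
// (query string values always arrive as strings)
const exampleEvent = {
  httpMethod: "GET",
  path: "/todos",
  queryStringParameters: {
    "is-completed": "true",
    "sort-by": "createdAt.asc",
  },
  pathParameters: null, // for GET /todos/{id} it would be { id: "<the-todo-id>" }
  body: null, // for POST and PUT it contains the raw JSON string sent by the client
};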

The configuration of our service is ready.

Connecting and querying MongoDB: database.js

If you read the previous tutorial, Fastify and MongoDB on Heroku, you will find that the code of this module is exactly the same. We do not need to change anything ... And this is a brutal copy and paste 😁️

The first brick of our solution is the module to connect and query MongoDB. Let's create the file database.js.

We will rely on mongoose to do all the operations.

We have to create the Schema for our data collection and the Model to manipulate the documents of our collection.

// database.js
const mongoose = require("mongoose");
const { Schema } = mongoose;

const todoSchema = new Schema(
  {
    isCompleted: {
      default: false,
      type: Boolean,
    },
    text: {
      required: true,
      type: String,
    },
  },
  { timestamps: true }
);

const Todo = mongoose.model("Todo", todoSchema);

The schema defines a todo as a document that has a field text (always required) and a field isCompleted that represents the status of the todo (completed or not).

The timestamps option decorates each document with two additional fields: createdAt and updatedAt both of type Date.

After defining our schema, we will use it to instantiate our model called Todo.
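
Just to visualize it, a saved todo will look more or less like this (the id and the dates are illustrative; __v is the version key added by mongoose by default):

{
  "_id": "60a3f2c7e4b0f1a2b3c4d5e6",
  "text": "Write the blog post",
  "isCompleted": false,
  "createdAt": "2021-05-18T10:00:00.000Z",
  "updatedAt": "2021-05-18T10:00:00.000Z",
  "__v": 0
}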

Now we have to create the main function that implements the logic of the database module.

// database.js
const database = (mongoUri) => {
  mongoose.connect(mongoUri, {
    useFindAndModify: false,
    useNewUrlParser: true,
    useUnifiedTopology: true,
  });

  return {
    close: () => {
      mongoose.connection.close();
    },
    create: async (params) => {
      const todo = new Todo({
        text: params.text,
      });

      return todo.save();
    },
    get: async (params) => {
      let queryParams = {};
      let sortParams = {};

      if (params != null) {
        // get by id
        if ("id" in params && params.id != null) {
          return Todo.findById(params.id).then(
            (response) => {
              return response;
            },
            (error) => {
              return null;
            }
          );
        }

        // all other gets
        for (var key in params) {
          switch (key) {
            case "isCompleted": {
              queryParams.isCompleted = params[key];
              break;
            }
            case "sortBy": {
              const paramsSortBy = params[key];

              sortParams[paramsSortBy.property] = paramsSortBy.isDescending ? -1 : 1;
              break;
            }
          }
        }
      }

      return Todo.find(queryParams).sort(sortParams);
    },
    remove: async (id) => {
      return Todo.findOneAndDelete({ _id: id });
    },
    update: async (todo) => {
      return Todo.findOneAndUpdate({ _id: todo._id }, todo, {
        new: true,
      });
    },
  };
};

module.exports = database;

While writing the code above, I created the tests using mongodb-memory-server in order to generate a mongoUri that I passed to the database function. All queries are executed in memory without the need to connect to the MongoDB cluster.
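
A minimal sketch of such a test, assuming a recent version of mongodb-memory-server where MongoMemoryServer.create() and getUri() are available (the real test files are in the test folder of the repository), could look like this:

// test/database.test.js (illustrative sketch)
const t = require("tap");
const { MongoMemoryServer } = require("mongodb-memory-server");
const database = require("./../database");

t.test("create and get a todo", async (t) => {
  // start an in-memory MongoDB instance and connect our module to it
  const mongod = await MongoMemoryServer.create();
  const db = database(mongod.getUri());

  // proper teardown: close the connection and stop the in-memory server
  t.teardown(async () => {
    await db.close();
    await mongod.stop();
  });

  const created = await db.create({ text: "Write tests" });
  t.equal(created.text, "Write tests");
  t.equal(created.isCompleted, false);

  const found = await db.get({ id: created._id });
  t.equal(found.text, "Write tests");
});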

Let's discuss the code.

First, we use mongoose to connect to the database. Keep in mind that we don't have to wait until the connection is established to use the model.

The close() function terminates the connection to the database. This is particularly useful while writing tests: to execute a proper teardown of resources and avoid timeout errors, we call it in our test suites.

The create(params) function creates a Todo. As defined in our schema, the text property is mandatory, so it is the only parameter we need.

The get(params) function allows querying the database according to constraints defined as parameters.
If the id property is present, all other constraints are ignored because the caller needs a specific todo.
Otherwise, the parameters are used to build the query. Our API supports filtering todos by completion status through the boolean isCompleted property. In addition, we can specify a sort by defining the property to sort on and the direction isDescending (false for ascending, true for descending).
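
For example, the request /todos?is-completed=true&sort-by=createdAt.desc ends up in a call like this (the shape of the params object is the one produced by queryStringParser, described below):

// Completed todos, from the newest to the oldest
const todos = await db.get({
  isCompleted: true,
  sortBy: { property: "createdAt", isDescending: true },
});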

The remove(id) function takes the id of a todo and removes it from the database.

The update(todo) function takes a modified todo and saves it into the database. The third parameter of findOneAndUpdate ({ new: true }) tells mongoose to return the updated document: by default, findOneAndUpdate returns the document as it was before the update was applied.

queryStringParser and http modules

Like the database module, queryStringParser does not need any change. It is a module that performs basic validation and parsing of the query string parameters.
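
The real implementation is in the repository; a rough sketch of the idea, based on the parameters we declared in functions.json, could be:

// queryStringParser.js (rough sketch, see GitHub for the real module)
const queryStringParser = (queryStringParameters) => {
  if (queryStringParameters == null) {
    return null;
  }

  const params = {};

  // "is-completed=true" -> { isCompleted: true }
  if ("is-completed" in queryStringParameters) {
    params.isCompleted = queryStringParameters["is-completed"] === "true";
  }

  // "sort-by=createdAt.desc" -> { sortBy: { property: "createdAt", isDescending: true } }
  if ("sort-by" in queryStringParameters) {
    const [property, direction] = queryStringParameters["sort-by"].split(".");

    params.sortBy = { property, isDescending: direction === "desc" };
  }

  return params;
};

module.exports = queryStringParser;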

The new module we need to create is http. It is a really simple module that we use in our functions to build the HTTP response. Take a look at the code on GitHub.
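
For reference, a minimal version of it could be something like the following (again, the real module is in the repository); it simply wraps a body and a status code in the response format expected by API Gateway:

// http.js (minimal sketch, see GitHub for the real module)
const response = (body, statusCode) => {
  return {
    statusCode: statusCode,
    headers: { "Content-Type": "application/json" },
    body: body != null ? JSON.stringify(body) : "",
  };
};

module.exports = { response };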

todos/handler.js: the core of our solution

We are going to implement the functions of our service. This is the most relevant change compared with the same REST API deployed on Heroku.

We don't need to use Fastify: the routing is managed by AWS API Gateway (we configured it in todos/functions.json).

Let's create the file todos/handler.js with this scaffolding

// todos/handler.js
"use strict";

const database = require("./../database");
const queryStringParser = require("./../queryStringParser");
const { response } = require("./../http");

const db = database(process.env.MONGODB_URI);

module.exports.create = async (event, context) => {
  // ...
};

module.exports.get = async (event, context) => {
  // ...
};

module.exports.getById = async (event, context) => {
  // ...
};

module.exports.remove = async (event, context) => {
  // ...
};

module.exports.update = async (event, context) => {
  // ...
};

We create the connection to the database outside the handlers (const db = database(process.env.MONGODB_URI)), following the best practices suggested by the official MongoDB Atlas documentation: this way the connection can be reused across function invocations.

todos/handler.create

This function creates a new todo

// todos/handler.js
module.exports.create = async (event, context) => {
  const todo = JSON.parse(event.body);
  const saved = await db.create(todo);

  return response(saved, 201);
};

We access the body of the request through event.body. Then, after calling the database, we use the response function of the http module to return the result to the client.

todos/handler.get

This function retrieves todos. In addition, it offers the possibility to filter and sort them. We access the query string parameters of the request through event.queryStringParameters.

module.exports.get = async (event, context) => {
  const params = queryStringParser(event.queryStringParameters);
  const todos = await db.get(params);

  return response(todos, 200);
};

This endpoint supports query strings like

Query                            Return
/todos?sort-by=createdAt.asc     Todos sorted from the oldest to the newest
/todos?sort-by=createdAt.desc    Todos sorted from the newest to the oldest
/todos?is-completed=true         Completed todos
/todos?is-completed=false        Todos still open

It is also possible to sort by updatedAt and, of course, to combine query strings, e.g. /todos?is-completed=true&sort-by=createdAt.asc.

todos/handler.getById

This function retrieves the todo with the requested id. We access the path parameters of the request through event.pathParameters.

module.exports.getById = async (event, context) => {
  const todoId = event.pathParameters.id;
  const todo = await db.get({ id: todoId });

  if (todo == null) {
    return response({ message: `Todo with id ${todoId} not found` }, 404);
  }

  return response(todo, 200);
};

If the requested todo does not exist, we reply with a 404 HTTP status.

todos/handler.remove

This function deletes the todo with the requested id.

module.exports.remove = async (event, context) => {
  const todoId = event.pathParameters.id;
  const removed = await db.remove(todoId);

  return response(null, 204);
};

todos/handler.update

This function updates the todo with the requested id.

module.exports.update = async (event, context) => {
  const todo = JSON.parse(event.body);
  const updated = await db.update(todo);

  if (updated == null) {
    return response({ message: `Todo with id ${todo._id} not found` }, 404);
  }

  return response(updated, 200);
};

Like getById, if the requested todo does not exist, we reply with a 404 HTTP status.
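
Note that the handler reads the _id from the request body (db.update uses todo._id), so the PUT body carries the whole todo. An illustrative body for PUT /todos/60a3f2c7e4b0f1a2b3c4d5e6 that marks the todo as completed:

{
  "_id": "60a3f2c7e4b0f1a2b3c4d5e6",
  "text": "Write the blog post",
  "isCompleted": true
}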

Deploying on AWS Lambda

We are ready to deploy our solution on AWS Lambda. We have to run

$ serverless deploy -v

The output of this command contains the URLs (autogenerated by AWS) of our functions and should be similar to

...
endpoints:
  POST - https://mkpc4dsro9.execute-api.eu-central-1.amazonaws.com/dev/todos
  DELETE - https://mkpc4dsro9.execute-api.eu-central-1.amazonaws.com/dev/todos/{id}
  GET - https://mkpc4dsro9.execute-api.eu-central-1.amazonaws.com/dev/todos
  GET - https://mkpc4dsro9.execute-api.eu-central-1.amazonaws.com/dev/todos/{id}
  PUT - https://mkpc4dsro9.execute-api.eu-central-1.amazonaws.com/dev/todos/{id}
functions:
  create: todo-api-aws-lambda-dev-create
  remove: todo-api-aws-lambda-dev-remove
  get: todo-api-aws-lambda-dev-get
  getById: todo-api-aws-lambda-dev-getById
  update: todo-api-aws-lambda-dev-update
...

We can use a tool like Postman to test our API. In the GitHub repository there is a ready-to-use collection for this purpose.

Important: remember to edit the collection variable called aws_lambda_url according to the output above. In my case, I have to replace http://localhost with https://mkpc4dsro9.execute-api.eu-central-1.amazonaws.com/dev.

If we want to access the logs of a function, we can use the command serverless logs -f <nameOfTheFunction> -t. For example, if we want to see the logs of the getById function, we have to run

$ serverless logs -f getById -t

Final notes

If we compare this solution with the one deployed on Heroku, the big difference lies in the configuration files. The code is almost the same: we just adapted a few lines (compare todo-api-aws-lambda/todos/handler.js with todo-api-heroku/app.js).

We learned how to split the Serverless Framework configuration across different files, how to set environment variables on AWS Lambda and how to access request parameters (event.body, event.pathParameters and event.queryStringParameters).

Local testing while developing for a FaaS platform is a bit challenging. The Serverless Framework documentation gives some advice on how to structure the code. You can find the tests for this solution inside the test folder and you can run them with

$ npm test

As the Serverless Framework documentation suggests, it is always a good idea to use stages: at least one for development (dev) and one for production (production).
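
A possible way to do that with the JSON configuration above, assuming we want to pick the stage at deploy time, is to read it from a CLI option with a Serverless Framework variable, falling back to dev:

// serverless.json (excerpt): the stage is read from the --stage CLI option
"provider": {
  "stage": "${opt:stage, 'dev'}"
}

and then deploy with

$ serverless deploy --stage production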

Further considerations

There are some other aspects related to AWS Lambda, and below are two interesting articles that I think are worth your attention:

  1. Cold Starts in AWS Lambda via Brian LeRoux on Twitter

  2. Why AWS Lambda Pricing Has to Change for the Enterprise via Eoin Shanaghy on Twitter


Written by
Elia Contini
Sardinian UX engineer and a Front-end web architect based in Switzerland. Marathoner, traveller, wannabe nature photographer.