Creating a serverless API with Auth in under an hour.


The aim

At the end of this tutorial, you should have a serverless API with production and development environments running on AWS using Amazon Cognito, DynamoDB, API Gateway, and Lambdas. We will use Amplify CLI to configure all this so as to avoid the nightmare of having to set up the connections between these services manually. We will also add a front end that is capable of consuming this API.

We will assume that you already have the Amplify CLI installed and configured with a profile. If not, head on over to https://docs.amplify.aws/cli/start/install#configure-the-amplify-cli and follow the instructions.

For our example, we are using React for the front end but there is no reason this couldn’t work with any other front end framework.


Let us begin

So first up let's set up our project.

npx create-react-app amplify-test
cd amplify-test
amplify init
 

When asked to name our Amplify project we are going to use ‘amplifytest’, but you can go with whatever you want. Just remember to substitute your own name whenever we reference amplifytest in this tutorial.

You will then be asked a series of questions in your terminal window. Remember to pick ‘React’ when choosing the framework.

Notice how the default env is ‘dev’. We will leave this as is and treat this as our dev environment. We will set up a ‘prod’ environment later on in the tutorial.

When asked whether to use an AWS profile, choose Y and pick the profile you want to use for this tutorial. (It's possible to set up multiple profiles for multiple AWS accounts so you can deploy to any AWS account you want.)
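If you have never set one up, an AWS profile is just a named entry in your local AWS config files. A minimal sketch (the profile name and region here are only examples, use your own):

# ~/.aws/config
[profile amplify-tutorial]
region = ap-southeast-1

# ~/.aws/credentials
[amplify-tutorial]
aws_access_key_id = <your access key id>
aws_secret_access_key = <your secret access key>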

Now you should be able to run amplify status to see what's been set up for us so far.

At this point it's also possible to head on over to the Amplify console and see the base of our project has already been set up for us.


Now that we have a React front end with an Amplify back end, it's time to add some services.


Adding the API

We will now use Amplify to add a REST API and our first endpoint which uses a combo of AWS services under the hood. At the end of this section, we should have an endpoint in Amazon API Gateway which points to a Lambda that fetches data from a shiny new DynamoDB table.

Adding a public endpoint

To start let’s add a /users endpoint to fetch back a set of users from the database.

Run...

amplify add api
 

You will then be guided through the API setup wizard.

Notice we called the API ‘amplifytestapi’. If you are going to be running multiple projects in a single Amplify account then it's good practice to follow some kind of naming convention, otherwise it gets really hard to see which resources relate to which project in the console. If you are planning multiple projects, it may be better to have a separate AWS account or IAM user for each project to keep them apart.

When you reach the question about which function template you want to use, make sure to pick ‘CRUD function for DynamoDB’. This will ensure that Amplify asks about setting up a DynamoDB table and assigns the correct permissions for the Lambda function to access the database.

Once you have selected the function template you should be presented with the storage wizard. (NB. in Amplify both DynamoDB and S3 are under the category of storage. If you ever need to add more DynamoDB tables to your app simply run amplify add storage and choose the DynamoDB option).

Notice we keep to our naming scheme by naming the table ‘amplifytestuserstore’.

Amplify will now prompt us to add some columns to our table. At this stage it's most important to think about our primary key and if we need a sort key, based on how we might need to query for data in our application. DynamoDB is a NoSQL database so you don't really need to think about any columns other than what you want to use for the partition and sort key at this stage.

For the sake of this demo, we are going to add ‘id’ and use that as the primary key. We will skip having a sort key, and add ‘firstname’ and ‘surname’ columns so we can look at the defaults that Amplify sets up for us in the Lambda later.

We will then be asked about the primary key. Select ‘id’ and choose ‘none’ for the sort key.

The sort key is out of the scope of this tutorial, but it's extremely useful when more complicated queries need to be performed. It's essentially a second part of the key that we can use to query the database. E.g. if you had a service where users could create blog posts, you might have a ‘posts’ table with ‘userID’ as the partition key and the post ‘id’ as the sort key. This would allow you to query for all posts belonging to a given userID.
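As a very rough sketch (the ‘posts’ table is hypothetical, and dynamodb is the same DocumentClient we set up inside the Lambda later), such a query could look like this:

// Hypothetical example: fetch every post belonging to one user.
// 'posts' has 'userID' as the partition key and the post 'id' as the sort key.
const params = {
  TableName: 'posts',
  KeyConditionExpression: 'userID = :u',
  ExpressionAttributeValues: { ':u': 'some-user-id' }
};

dynamodb.query(params, (err, data) => {
  if (err) console.error(err);
  else console.log(data.Items); // all posts for that user
});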

Answer ‘no’ to all the other questions until you get to the end of the wizard.

You should now have all of the services for your endpoint created locally.

Open up the AWS console and head on over to the DynamoDB section and click on tables. You will not see the table we created just yet, that is because we have to push the changes first.

Run amplify push (don't worry, you can always cancel before anything is deployed) and Amplify will print a summary of the resources it is about to create.

Notice how Amplify informs us of what has changed and what will be updated in the cloud when we push. As all these resources are new they have the create operation and will be created upon pushing.

Choose ‘Y’ and, once complete, you should hopefully get a success message that includes the endpoint of your new API.

 

Pro tip: If you forget the endpoint in future, run amplify status and you will get a similar summary of what's changed in the project plus the endpoints for any APIs, hosting, etc.

 

Navigate back to DynamoDB in the AWS console and once again visit the Tables tab. You should now see the newly created table.

At this point you can also take the time to find the newly created Lambda and API gateway in the console as well if you want.

We can also test out our new endpoint. Navigate to the endpoint you got at the end of the last step.

You should see the following response. Don't panic: this is just API Gateway's standard reply when a request doesn't match any configured route; the base URL on its own isn't an endpoint, and we will add real paths to the Lambda next.

{
  "message": "Missing Authentication Token"
}
 

Simplifying the Lambda template 

Before we edit our Lambda, it's worth mentioning that most of what you do with the Amplify CLI resides in the ./amplify directory of your project. Here you will find a directory for each Lambda you add under the function directory. You will also see directories for the API and for the databases (under storage), although you will probably only ever change the function files. If you ever need to change anything that is outside the scope of Amplify, you will find all the CloudFormation files here as well.
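As a rough guide (the exact folder names depend on what you chose in the wizards), the layout after the steps so far looks something like this:

amplify/
  backend/
    api/
      amplifytestapi/            # the REST API (API Gateway) definition
    function/
      amplifytestusers/
        src/
          app.js                 # the Lambda code we edit below
    storage/
      <your dynamodb resource>/  # the DynamoDB table definition
    backend-config.json
  team-provider-info.json        # per-environment settings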

Right, let's open up our Lambda and make some edits.

Open ./amplify/backend/function/amplifytestusers/src/app.js

When we ran amplify add api, Amplify kindly added a Lambda template to our project which includes all the paths for CRUD operations on our table. You could just use this out of the box if you wish, but for this tutorial we are going to strip it down and only add some GET methods so we can understand, at a very basic level, how the Lambda fetches data from DynamoDB. This will also get you used to how editing and pushing changes to the API works.

So before we proceed take a good look at the template Amplify provided to get familiar with how it works.

Then... delete it all.

We will now add the base for our Lambda. Add the following, which you will notice is much the same as the first bit of the original template.

First, include all the modules we will need.

const AWS = require('aws-sdk');
const awsServerlessExpressMiddleware = require('aws-serverless-express/middleware');
const bodyParser = require('body-parser');
const express = require('express');
 

Then create the DynamoDB DocumentClient and set our ‘tableName’ and ‘path’ variables.

const dynamodb = new AWS.DynamoDB.DocumentClient();

let tableName = "amplifytestuserstore";
if (process.env.ENV && process.env.ENV !== "NONE") {
 tableName = tableName + '-' + process.env.ENV;
}

const path = "/users";
 

The tableName variable is very important and ensures that our Lambda always connects to the correct environment's DynamoDB table. We will see how this works later on when we create our prod environment.

The path variable defines the path to our Lambda. It's the bit appended to our API gateway endpoint we were given earlier. To access this Lambda we will be able to go to something like https://271xzvzbnb.execute-api.ap-southeast-1.amazonaws.com/dev/users

Next, we initiate the express app

// declare a new express app
var app = express();
app.use(bodyParser.json());
app.use(awsServerlessExpressMiddleware.eventContext());

// Enable CORS for all methods
app.use(function (req, res, next) {
  res.header("Access-Control-Allow-Origin", "*");
  res.header("Access-Control-Allow-Headers", "Origin, X-Requested-With, Content-Type, Accept");
  next();
});

// init the app
app.listen(3000, function () {
  console.log("App started")
});

module.exports = app
 

Adding an endpoint to return all users.

Now we will add the /users/get endpoint that will return a list of all the users in our database. Add the following to ./amplify/backend/function/amplifytestusers/src/app.js

app.get(`${path}/get`, function (req, res) {

 let queryParams = {
   TableName: tableName,
 }

 dynamodb.scan(queryParams, (err, data) => {
   if (err) {
     res.statusCode = 500;
     res.json({
       error: 'Could not load items: ' + err
     });
   } else {
     res.json(data.Items);
   }
 });
});
 

Note: We are using scan here. This is a costly operation and has the potential to use up your read capacity really quickly, so avoid it where possible. In a production-ready app we would hopefully be able to use query or batchGetItem instead.
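For example, if we already knew the ids we needed, a batchGet through the DocumentClient would be far cheaper than a scan. A rough sketch only (not part of the tutorial code, and the ids are made up):

const batchParams = {
  RequestItems: {
    // tableName is the same variable we set up at the top of the Lambda
    [tableName]: {
      Keys: [{ id: '1' }, { id: '2' }]
    }
  }
};

dynamodb.batchGet(batchParams, (err, data) => {
  if (err) console.error(err);
  else console.log(data.Responses[tableName]); // the matching items
});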

 

Now let's test it out. Go ahead and run amplify push then head over to https://271xzvzbnb.execute-api.ap-southeast-1.amazonaws.com/dev/users/get and let's see if our new api endpoint works as expected.

If you see an empty array returned from the endpoint then give yourself a pat on the back. This is totally expected, as we have not actually added any users to our DynamoDB users table yet. We will cover this in the next step.

Adding some users

Now it's time to add some dummy users to our new user table so we have some data to fetch.

Head over to the DynamoDB console and select the amplifytestuserstore-dev table. Then select the ‘items’ tab and go ahead and click ‘Create Item’.

You will see something like the screenshot below.

Even though we set up our table to have three columns, ‘id’, ‘firstname’, and ‘surname’, you will only see ‘id’ as this is the only required field. Technically you can add any number of extra columns here, and they don't even have to match between entries, as this is a NoSQL database.

We are going to go ahead and add firstname and surname by hitting the little plus icon and choosing the append option. It should end up looking like this…

Repeat that process a few more times to populate a few more users and then head back over to the users/get endpoint and refresh the page.

If all went well you now should see the users you just created listed out in the browser.

[
  {
    "id": "1",
    "firstname": "Bob",
    "surname": "Geldoff"
  },
  {
    "id": "2",
    "firstname": "Sally",
    "surname": "Pennysworth"
  }
]
 

Adding an endpoint to fetch a specific user.

Now we have our first working API endpoint. We could stop here, but in the real world we would most likely want to fetch a user by ‘id’, so let's add the ability to pass an ‘id’ as a path parameter.

To do this we will change the route to /users/get/:id, making the ‘id’ an optional path parameter so that plain /users/get still returns the full list. Switch out our existing /users/get handler with the following...

// the ':id?' makes the id optional, so plain /users/get still returns every user
app.get(`${path}/get/:id?`, function (req, res) {

 let queryParams;

 if (req.params.id) {

   queryParams = {
     TableName: tableName,
     Key: {
       id: req.params.id
     }
   }

   dynamodb.get(queryParams, (err, data) => {
     if (err) {
       res.statusCode = 500;
       res.json({
         error: 'Could not load items: ' + err
       });
     } else {
       if (data.Item) {
         res.json(data.Item);
       } else {
         res.json({
           error: `No user with id '${req.params.id}' exists in the database`
         });
       }

     }
   });

 } else {

   queryParams = {
     TableName: tableName,
   }

   dynamodb.scan(queryParams, (err, data) => {
     if (err) {
       res.statusCode = 500;
       res.json({
         error: 'Could not load items: ' + err
       });
     } else {
       res.json(data.Items);
     }
   });

 }
});
 

Quickly run amplify push again and when done head over to https://271xzvzbnb.execute-api.ap-southeast-1.amazonaws.com/dev/users/get/1 to see if it returns the user with id 1.

You should see something like the following depending on what you named your users…

{
  "id": "1",
  "firstname": "Bob",
  "surname": "Geldoff"
}
 

This is all great for a public-facing API, but what happens if we have some sensitive data that we want to protect behind a login? Here's where Amplify Auth comes into play.

Adding a protected endpoint

To kick things off, run amplify update api to update our API. Choose REST, as our API is a REST API. Then select ‘amplifytestapi’.

Choose ‘add another path’ and follow the prompts as before choosing to create a new lambda.

This time we will call the Lambda ‘amplifytestsecrets’.

As before, pick ‘node’ and ‘CRUD function for DynamoDB (Integration with API Gateway)’. Follow the prompts and create a new DynamoDB called ‘amplifytestsecretstore’ and add the following columns:

id: string

userid: string

secret: string



This time when we get to the sort key option we will select yes and pick ‘userid’ as the sort key. 

 

Note: We will not be going into how to use the sort key for this tutorial but in a real-world scenario we would want to query secrets based on userid. You might want to build on this tutorial later and try setting up a ‘secrets/get/:userid’ on your own.

 

Now, answer no to all the following options until you get to 

Restrict API access (Y/n)
 

This time pick yes, then choose Authenticated users only.

We will just check ‘read’ for simplicity in this example, but you can guess how ‘create’, ‘update’, etc. work.

Run amplify status again and you will see auth has now been added to the stack along with the new storage table.

We will then open up our new Lambda locally and replace its contents with the contents of the users Lambda. Open ./amplify/backend/function/amplifytestsecrets/src/app.js in your editor of choice and copy and paste the contents of the users Lambda into this file.

We only need to make two small changes:

Change tableName to ‘amplifytestsecretstore’

Change path to ‘/secrets’
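After those two edits the top of ./amplify/backend/function/amplifytestsecrets/src/app.js should look something like this:

const dynamodb = new AWS.DynamoDB.DocumentClient();

let tableName = "amplifytestsecretstore";
if (process.env.ENV && process.env.ENV !== "NONE") {
 tableName = tableName + '-' + process.env.ENV;
}

const path = "/secrets";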

Run amplify push and then head over to the aws console and the DynamoDB Tables tab. You should now see the ‘amplifytestsecretstore’ table.

Add some items to the table the same as you did before. 

 

Note: If you want to do the homework, make sure you add the ‘id’ of one of the users from the ‘amplifytestuserstore’ table in the ‘userid’ column.

 

Now, if this endpoint had been set up like the last one, with no auth, heading over to https://271xzvzbnb.execute-api.ap-southeast-1.amazonaws.com/dev/secrets/get would show us a list of secrets.

BUT... we added auth so instead, we see...

{
  "message": "Missing Authentication Token"
}
 

This is awesome as it means our auth is working and we have no access to the /secrets/get endpoint without being logged in.

To be able to see any data we need to create a user and then log in as that user.

So, let's head over to the Cognito section of the AWS console. When presented with the splash screen choose the ‘Manage User Pools’ option, not ‘Manage Identity Pools’. You can read about the differences here: https://aws.amazon.com/premiumsupport/knowledge-center/cognito-user-pools-identity-pools/.

You should now be able to see the user pool that Amplify created for us. It should look something like this…

Click into the newly created user pool and create a user by navigating to ‘General settings’, selecting ‘Users and groups’, clicking ‘Create user’, and filling out the form.

Once created, you should get an email from Amazon asking you to verify the user and create a password. To do this you will need the temporary password you set on the create user screen.

Now that we have auth set up on the API and a user in our user pool, we are ready to create the front end so we can log in and test out our secrets endpoint.


Setting up the Front end

We need to head over to the app’s front end for the first time.

Setup

Make sure you are in the app's root directory and run

yarn add aws-amplify
 

Then start the app up with yarn start and head on over to  http://localhost:3000

Create React App comes with a startup page with a fancy spinning React logo. As nice as that is we don't need any of that so open up ./src/App.js and replace the code with…

import './App.css';
import Amplify from 'aws-amplify';
import awsconfig from './aws-exports';
Amplify.configure(awsconfig);

function App() {
 return (
   <div className="App">
   </div>
 );
}

export default App;
 

Querying the API

Now, when it comes to querying the API it's totally possible to use libraries such as Axios or the browser's built-in fetch, but Amplify comes with a handy helper made specifically for querying API Gateway, and it even handles things such as switching environments for us.

First, import API from aws-amplify by updating the existing import in ./src/App.js:

import Amplify, { API } from 'aws-amplify';


Then set the API name (this is the name you gave the API; you can get it by running amplify status).

const apiName = 'amplifytestapi';
 

Next, we will create a function to fetch a list of users from our users endpoint. Add the following to your App() function (you will also need to import useEffect from ‘react’).

useEffect(() => {
   getUsers();
 }, []);

 const getUsers = () => {
   API.get(apiName, '/users/get').then(res => {
     console.log(res);
   }).catch(e => {
     console.error({...e});
   });
 }
 

Head to http://localhost:3000 and open up the console. Hopefully, you will see the users in your database logged to the console. This endpoint did not require login, so this query should just work.

We will now try to query the protected API Endpoint. Add the getSecrets() function to your App() function and call it in the useEffect() along with getUsers().

const getSecrets = () => {
   API.get(apiName, '/secrets/get').then(res => {
     console.log(res);
   }).catch(e => {
     console.error({...e});
   });
 };

useEffect(() => {
   getUsers();
   getSecrets();
}, []);
 

Then open up the console and hopefully you will see an error message when the /secrets endpoint is called.

Notice how the credentials are undefined which is stopping the request.

We now need to test that we can indeed fetch the secrets back once we are logged in and have valid credentials so it's time to add a login screen.

Adding a login form

There are two ways to add authentication capabilities to your application.

  1. Use pre-built UI components

  2. Call Authentication APIs manually

To save time for this example we are going with option number 1.

If you want to learn more or have a go at option number 2 head over to the amplify docs on authentication. https://docs.amplify.aws/lib/auth/getting-started/q/platform/js#configure-your-application
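For reference, option 2 boils down to calling the Auth module yourself. A minimal sketch (not used in the rest of this tutorial):

import { Auth } from 'aws-amplify';

// Manual sign-in: you would wire this up to your own form and handle the
// various Cognito challenges (new password required, MFA, etc.) yourself.
async function signIn(username, password) {
  try {
    const user = await Auth.signIn(username, password);
    console.log('Signed in as', user.username);
  } catch (err) {
    console.error('Error signing in', err);
  }
}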

To be able to use the Amplify React components in our app we first need to install the ui package. Run

yarn add @aws-amplify/ui-react
 

Then open up ./src/App.js and import the required components.

import { withAuthenticator, AmplifySignOut } from '@aws-amplify/ui-react';
 

We can then create a LoginForm component and add it just before the App function is declared.

const LoginForm = withAuthenticator(() => {
 return (
   <div className="logoutHolder">
     <p>You are now signed in.</p>
     <AmplifySignOut />
   </div>
 );
});

function App() {...
 

This component can now be used in our app to display the login form when not logged in, or the ‘You are now signed in’ message and a sign out button when logged in.

Let's go ahead and add this new component to our app by placing it in the render section of our App function.

return (
  <div className="App">
    <LoginForm />
  </div>
);
 

Visit http://localhost:3000 and your app should now show a pre-styled login form.

Sign in using the user you created earlier and the form should be replaced by the ‘you are now signed in’ message and sign out button.

Checking back in on the console you should now see the two requests go through seamlessly and the list of secrets should be logged to the console along with the list of users. If you check the auth headers you should also see that the credentials are no longer undefined.

Notice how Amplify handles all the auth for you. You don't need to change anything on the front end or handle any tokens etc. yourself.

Now it's time for one of those ‘here's one I made earlier’ moments where we pretty it all up and output the responses to the requests in the UI, so we don't have to keep checking the console. You could use a UI framework like Material UI or Bootstrap to achieve this but, as it's so little styling and we normally steer away from UI libraries here at Hatchd, I'm just going to use good ol' plain CSS.

You can get the finished code here [Link to github]. Just check it out and run amplify publish && yarn && yarn start to see the end result. Or if you are feeling brave and know what you are doing you can just style yours yourself.

You should end up with something that looks like this.

As you can see the results of the users and secrets requests are shown on the right and these requests are remade upon logging in or out.

Adding cloud hosting

Using amplify to host your frontend gives you free HTTPS and the use of CloudFront to serve your files to users as fast as possible wherever they are in the world. 

To set up the deployment of the frontend to the cloud, you simply have to run amplify add hosting and you should be presented with the following...

If you have a git repository set up then the best option is to go with ‘Hosting with Amplify Console’ and then ‘Continuous deployments’, as this will hook up your repository to the Amplify console and your app will get built and deployed every time you push to that repo.

If you don't have a repo for some reason you can choose the ‘Manual deployment’ option and then run... 

amplify publish
 

This will deploy both your front end and your back end, unlike amplify push, which only deploys your back end.

 

Note: Without continuous deployment set up you will have to run ‘amplify publish’ every time you want to see changes in your frontend.

 

Adding a production environment

If this was a real-world application we would most definitely need both a dev and a prod environment at the very least.

This is super easy using the Amplify CLI. By calling the ‘env add’ command we can create a copy of all the resources in the currently checked-out environment (in our case dev) and tag them with the new environment name.

Cloning dev and deploying prod

Let’s go ahead and run amplify env add.

When asked if you want to use an existing environment, select no then name the new environment ‘prod’.

Then if we run amplify status we can see we are on the new prod env and everything needs to be created.

Once we have reviewed the changes, run amplify push.

You will now see we have a prod API endpoint. Amplify will have deployed prod copies of all our services to the cloud, exactly matching the dev environment.

Now is the perfect time to head over to the AWS console and have a look for all our newly created prod resources. You will notice that all the dev resources have the -dev suffix and all the prod ones have the -prod suffix.

We can also visit the prod api endpoint to see if it works…

https://utkmomb8sf.execute-api.ap-southeast-1.amazonaws.com/prod/users/get

You should just see an empty array, as we don't have any data in our prod database yet.


How it all works

Whenever we add a new env, its name is available in the process.env.ENV variable inside the Lambda.

The Lambda already automagically knows which database table to connect to based on the environment, thanks to the following snippet, which was handily created when we initially chose the ‘CRUD function for DynamoDB’ template for our function.

let tableName = "amplifytestuserstore";
if (process.env.ENV && process.env.ENV !== "NONE") {
 tableName = tableName + '-' + process.env.ENV;
}
 

If we head over to http://localhost:3000 we should see that our front end has automagically picked up that we have the prod environment checked out and is using the new prod API path.

You should now see two empty arrays in our users and secrets request panels, as we have no data in our prod tables.

Reload the page and check the console and you should see the prod path now being requested rather than the old dev one.

 

Note: If you still see the dev API being queried, try running checkout again: amplify env checkout dev && amplify env checkout prod, then reload the page.

 

Notice we still have the error about connecting to the secrets part of the API. This is because we are not logged in. We will not be able to log in with our current user, as that user is in the dev user pool, not the prod one.

This time we will create a user using the front end functionality we gained by using the pre-made Amplify sign in component. Go ahead and click the “Create Account” link.

Fill out the form and a verification code will be emailed to the address you used. Enter the code and submit to create the user and sign in.

Note: It's good practice with the first user to log into the AWS console and go to the Cognito section to check that our new user got added to the prod user pool correctly.

Log in and you should be presented with the logged-in view as before, with no errors in the console.

And that's it. You now have a working serverless API with both public and protected endpoints that you can deploy to any AWS account and a front end that consumes it.


Optional: Copying the dev user table to prod.

At this point you may be thinking: how do I get the data from my dev tables into my prod tables? And this is a good question.

DynamoDB currently does not offer a clone table functionality. It's totally possible to copy data across using backups though.

In the AWS console, head to the DynamoDB Tables section and select the ‘amplifytestuserstore-dev’ table. Click the Backups tab and create a backup. It's then possible to select this backup and restore it to a new table. Unfortunately, you can't restore it to an existing table, so we will have to do something that might seem scary.

We have to delete the ‘amplifytestuserstore-prod’ table.

 

Note: Although it seems like deleting the table and recreating it might sever the link with Amplify, trust me, it doesn't. You will see this in action later when we remove the prod environment.

 

Once the prod user table has been deleted, we can select the dev users table backup and restore it to a new table (this may take some time). Make sure to name the table ‘amplifytestuserstore-prod’, exactly the same as the one we deleted.

Once completed we can head back over to our frontend, refresh the page and we should see the users being returned from the API instead of an empty array.

Obviously, you would probably only go from dev to prod on the first deployment. But going from prod to dev can be very useful if you want to move the prod data back into dev for testing and developing with real data.


Taking it the extra mile.

If this was a true production app there are a few more steps we would take to really go that extra mile.

Pretty domain names

Using Amazon Route 53 we could add pretty domain names to both the CloudFront hosting and the API endpoints.

Fetching secrets by user ID

In real life, we would want an endpoint that could fetch back an individual user's secrets based on user id. The secrets table already stores ‘userid’ as the sort key, but bear in mind that DynamoDB can't query on a sort key alone, so you would either use a filtered scan or add a global secondary index on ‘userid’ to do this efficiently, and we have yet to create the endpoint itself. Feel free to have a go at adding it yourself. It will help you understand the importance of planning how your database is structured, and what you might need to query, before rushing into creating it. NoSQL databases are not the same as relational databases: you can't just query on anything without a cost.
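If you fancy a head start on that homework, here is a rough sketch of what such an endpoint could look like inside the secrets Lambda. The route (/secrets/get/user/:userid, chosen so it doesn't clash with the existing /secrets/get/:id route) and the filtered scan are assumptions rather than something the tutorial sets up; with a GSI on ‘userid’ you could swap the scan for a much cheaper query.

// Hypothetical extra route for the secrets Lambda: all secrets for one user.
// A filtered scan is the simplest approach without a GSI, but it still reads
// the whole table, so treat it as a starting point only.
app.get(`${path}/get/user/:userid`, function (req, res) {
  const params = {
    TableName: tableName,
    FilterExpression: 'userid = :u',
    ExpressionAttributeValues: { ':u': req.params.userid }
  };

  dynamodb.scan(params, (err, data) => {
    if (err) {
      res.statusCode = 500;
      res.json({ error: 'Could not load items: ' + err });
    } else {
      res.json(data.Items);
    }
  });
});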

More complex permissions

If we were turning this into a big customer application we might need things such as Cognito user pool groups, each with fine-grained permissions to their own tables. The following link shows how this can be done using the same techniques we have already used.

https://aws.amazon.com/blogs/mobile/amplify-cli-enables-creating-amazon-cognito-user-pool-groups-configuring-fine-grained-permissions-on-groups-and-adding-user-management-capabilities-to-applications/

Workflow

We could also improve our workflow by adding scripts to the project's package.json that check out the correct back-end env at the same time as the git branch.

"scripts": {
  "checkout:dev": "git checkout dev && amplify env checkout dev",
  "checkout:prod": "git checkout prod && amplify env checkout prod"
}
 

Then to checkout prod we would run

yarn checkout:prod
 

Instead of 

git checkout prod
 

This could also be achieved with git hooks, but you would need to ensure your whole team had the correct hooks installed.
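For reference, a minimal post-checkout hook could look something like this (a sketch only, and it assumes your git branch names match your Amplify environment names):

#!/bin/sh
# .git/hooks/post-checkout - keep the Amplify env in sync with the git branch
branch=$(git rev-parse --abbrev-ref HEAD)
if [ "$branch" = "dev" ] || [ "$branch" = "prod" ]; then
  amplify env checkout "$branch"
fi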

It’s also totally possible to have Amplify environments for each feature you work on. 


Deleting the Environments and cleaning up

Now the tutorial is over we should probably clean up and delete all those resources in the cloud. 

You will probably never get to do this in real life so let’s have some fun and go ahead and delete production.

First, make sure you are in the dev environment, as you can't remove the environment you currently have checked out.

amplify env checkout dev
 

Then go ahead and remove prod.

amplify env remove prod
 

You can now visit the console and you will see that all the resources with the -prod suffix have disappeared. If you followed the optional step of copying the dev data to prod, you will be happy to see that even the DynamoDB table we deleted and recreated through the console has been removed.

Then run...

amplify delete
 

This will delete the remaining dev environment and the entire Amplify project, and your AWS account will be back to its original state.


And to finish...

APIs are the backbone of the internet. They can be used to power just about anything, from the most basic of websites to corporate apps to IoT devices, and everything in between. Hopefully, this article has helped you realise just how easy it is to get a fully scalable, auth-protected, serverless API up and running in a matter of hours, without it costing you an arm and a leg.
