In the previous post of this series we looked into building the back-end architecture for our serverless file hosting web app using Cognito, AppSync, Lambda, DynamoDB and CloudFront.

In this part we will look into the front-end implementation and its integration with the back-end. Hope you enjoy the read, so let’s get started.

Requirements

Let’s start with high level requirements and combine them into sprint tasks:

1. Sign-in with username/password
2. Sign-up, sign-out and “Forgot password”
3. List the content of the filestore
4. Upload a file to the filestore
5. Delete a file from the filestore
6. Download a file from the filestore
7. Get real-time updates to the filestore content on each device when using multiple devices
8. Invalidate the CloudFront cache on the deleteObject mutation

This gives us enough requirements to start with. Let’s group the tasks by functionality into four sprints: Sprint 1 (authentication, tasks 1 and 2), Sprint 2 (content management, tasks 3–6), Sprint 3 (real-time capability and cache invalidation, tasks 7 and 8) and Sprint 4 (deployment).

Design

As per our requirements, we need to implement the various flows shown on the diagram below. Note that the user should be able to use the app from different clients simultaneously, with real-time state updates on each client.

High Level Architecture Design

Now that the requirements have been sorted and the sprints thoroughly planned, it’s time to move on to implementation.

Implementation

We are going to use the React library for the UI and the AWS Amplify library for back-end integration. As a component framework we are going to use Semantic UI. For prototyping we are going to use codesandbox.io.

Sprint 1 (Authentication)

Let’s start with the boilerplate code by creating a new react sandbox.

Delete the function App from src/index.js and create a new file src/App.js with the following content:

```javascript
// Create a new file with the App class component
import React, { Component } from "react";

class App extends Component {
  render() {
    return <div>Hello World</div>;
  }
}

export default App;
```

Update src/index.js to import our class component from src/App.js:

```javascript
// Update src/index.js to import our class from src/App.js
import React from "react";
import ReactDOM from "react-dom";
import App from "./App";

ReactDOM.render(<App />, document.getElementById("root"));
```

Add some dependencies by clicking “Add Dependency” in your sandbox and adding the aws-amplify and aws-amplify-react packages, then update the App class:

```javascript
// Update src/App.js with new dependencies
import Amplify from "aws-amplify";
import { withAuthenticator } from "aws-amplify-react";

// Configure the Amplify library with Cognito and AppSync
Amplify.configure({
  Auth: {
    identityPoolId: "us-east-1:516132f0-8056-4450-a1d5-fd4e6f877845",
    region: "us-east-1",
    userPoolId: "us-east-1_QZ3Aa0LBe",
    userPoolWebClientId: "5s6jmc25o0vm3vui34r9vh580j"
  },
  aws_appsync_graphqlEndpoint: "https://graphql.dlozitskiy.online",
  aws_appsync_region: "us-east-1",
  aws_appsync_authenticationType: "AMAZON_COGNITO_USER_POOLS"
});

// Update the export by wrapping the App class in the withAuthenticator
// Higher Order Component provided by the Amplify library
const AppComponent = withAuthenticator(App);

export default AppComponent;
```

Once the UI re-renders you will see the authentication form provided by the Amplify library.

Try signing in with your Cognito User Pool credentials. Now that we are logged in, let’s pull in some Semantic-UI dependencies and implement the sign-out functionality:

```javascript
// Import the semantic-ui framework for React into src/App.js
import {
  Grid,
  Header,
  List,
  Segment,
  Menu,
  Button
} from "semantic-ui-react";
import "semantic-ui-css/semantic.css";

// Import Auth for sign-out
import { Auth } from "aws-amplify";

// Create a layout with the sign-out button
class App extends Component {
  signOut = async () => {
    await Auth.signOut();
    this.props.rerender();
  };

  render() {
    return (
      <Grid padded>
        <Grid.Column>
          <Menu>
            <Menu.Item>
              <Button onClick={this.signOut}>Sign-out</Button>
            </Menu.Item>
          </Menu>
        </Grid.Column>
      </Grid>
    );
  }
}

export default props => {
  const AppComponent = withAuthenticator(App);
  return <AppComponent {...props} />;
};
```

Update src/index.js with a wrapper component that re-renders the UI on sign-out and shows the user the login screen:

```javascript
class AuthWrapper extends React.Component {
  rerender = () => this.forceUpdate();

  render() {
    return <App rerender={this.rerender} />;
  }
}

ReactDOM.render(<AuthWrapper />, document.getElementById("root"));
```

Now if you click the “Sign-out” button it will bring you back to the initial sign-in screen.

Sprint 2 (Content Management)

For content management we will be using the AWS Amplify Storage module. Let’s import the module and configure it:

```javascript
// Import and configure the Storage module in src/App.js
import { Storage } from "aws-amplify";

Storage.configure({
  bucket: "bucket-with-protected-content",
  region: "us-east-1",
  identityPoolId: "us-east-1:516132f0-8056-4450-a1d5-fd4e6f877845"
});
```

Now we can implement the file upload functionality. Let’s add an “Upload file” button to our UI. We also want a corresponding metadata entry once the upload is complete, so we will call the putObject mutation on a successful S3 upload:

```javascript
// Create a new class for S3 file upload
import { Connect } from "aws-amplify-react";

const putObject = `mutation putObject($objectId: String!){
  putObject(objectId: $objectId) {
    objectId
    userId
  }
}`;

class S3Upload extends React.Component {
  constructor(props) {
    super(props);
    this.state = { uploading: false };
  }

  onChange = async e => {
    const file = e.target.files[0];
    this.setState({ uploading: true });
    const identityId = await Auth.currentSession()
      .then(data => {
        return data.idToken.payload.sub;
      })
      .catch(err => console.log(err));
    await Storage.put(file.name, file, {
      identityId: identityId,
      level: "private",
      customPrefix: { private: "" }
    }).then(async () => {
      const result = await API.graphql(
        graphqlOperation(putObject, { objectId: file.name })
      );
      console.info(`Created object with id ${JSON.stringify(result)}`);
    });
    this.setState({ uploading: false });
  };

  render() {
    return (
      <div>
        <Button
          primary
          onClick={() => document.getElementById("uploadFile").click()}
          disabled={this.state.uploading}
          content={this.state.uploading ? "Uploading..." : "Upload file"}
        />
        <input
          id="uploadFile"
          type="file"
          onChange={this.onChange}
          style={{ display: "none" }}
        />
      </div>
    );
  }
}

// Update the layout with the button
class App extends Component {
  signOut = async () => {
    await Auth.signOut();
    this.props.rerender();
  };

  render() {
    return (
      <Grid padded>
        <Grid.Column>
          <Menu>
            <Menu.Item>
              <S3Upload />
            </Menu.Item>
            <Menu.Item>
              <Button onClick={this.signOut}>Sign-out</Button>
            </Menu.Item>
          </Menu>
        </Grid.Column>
      </Grid>
    );
  }
}
```

This will give us a simple layout as below:

Here “Upload file” uploads the file to our S3 bucket with the Cognito user sub (the UUID of the authenticated user) as a prefix, and also creates a metadata entry for the uploaded file using the AppSync mutation.

Now let’s implement our first query so we can list the content of the filestore.

```javascript
// Import the API module and the graphqlOperation method from the
// Amplify library; the Connect component will be used to execute
// queries and mutations
import { graphqlOperation, API } from "aws-amplify";
import { Connect } from "aws-amplify-react";

// Add our getObjects query to src/App.js
const getObjects = `query {
  getObjects {
    objectId
  }
}`;

// Add a FileList class to iterate through the API query response
class FileList extends React.Component {
  Files() {
    if (this.props.files.length !== 0) {
      return this.props.files.map(file => (
        <List.Item key={file.objectId}>
          <List.Content as="a">{file.objectId}</List.Content>
        </List.Item>
      ));
    } else {
      return (
        <List.Item>
          <List.Content>Your filestore is empty</List.Content>
        </List.Item>
      );
    }
  }

  render() {
    return (
      <Segment>
        <List divided verticalAlign="middle">
          {this.Files()}
        </List>
      </Segment>
    );
  }
}

// Add a FilesListLoader class that queries the API
class FilesListLoader extends React.Component {
  render() {
    return (
      <Connect query={graphqlOperation(getObjects)}>
        {({ data, loading, errors }) => {
          if (loading) {
            return <div>Loading...</div>;
          }
          if (!data.getObjects) return null;
          return <FileList files={data.getObjects} />;
        }}
      </Connect>
    );
  }
}

// Add FilesListLoader to our App
class App extends Component {
  signOut = async () => {
    await Auth.signOut();
    this.props.rerender();
  };

  render() {
    return (
      <Grid padded>
        <Grid.Column>
          <Menu>
            <Menu.Item>
              <S3Upload />
            </Menu.Item>
            <Menu.Item>
              <Button onClick={this.signOut}>Sign-out</Button>
            </Menu.Item>
          </Menu>
          <Segment>
            <Header as="h3">My Files</Header>
          </Segment>
          <FilesListLoader />
        </Grid.Column>
      </Grid>
    );
  }
}
```

Let’s upload a file and refresh; it should give us something like this:

Let’s add a “Delete” button. Once clicked, it should delete the object from the S3 bucket and also call the deleteObject mutation to remove the object metadata:

```javascript
// Create a new component
const deleteObject = `mutation deleteObject($objectId: String!){
  deleteObject(objectId: $objectId) {
    objectId
    userId
  }
}`;

class S3Delete extends React.Component {
  constructor(props) {
    super(props);
    this.state = { deleting: false };
  }

  onClick = async e => {
    const file = this.props.file;
    this.setState({ deleting: true });
    const identityId = await Auth.currentSession()
      .then(data => {
        return data.idToken.payload.sub;
      })
      .catch(err => console.log(err));
    await Storage.remove(file, {
      identityId: identityId,
      level: "private",
      customPrefix: { private: "" }
    }).then(async () => {
      const result = await API.graphql(
        graphqlOperation(deleteObject, { objectId: file })
      );
      console.info(`Deleted object with id ${JSON.stringify(result)}`);
    });
  };

  render() {
    return (
      <div>
        <Button
          negative
          onClick={this.onClick}
          disabled={this.state.deleting}
          content={this.state.deleting ? "Deleting..." : "Delete"}
        />
      </div>
    );
  }
}

// Include the new component in FileList
class FileList extends React.Component {
  Files() {
    if (this.props.files.length !== 0) {
      return this.props.files.map(file => (
        <List.Item key={file.objectId}>
          <List.Content floated="right">
            <S3Delete file={file.objectId} />
          </List.Content>
          <List.Content as="a">{file.objectId}</List.Content>
        </List.Item>
      ));
    } else {
      return (
        <List.Item>
          <List.Content>Your filestore is empty</List.Content>
        </List.Item>
      );
    }
  }

  render() {
    return (
      <Segment>
        <List divided verticalAlign="middle">
          {this.Files()}
        </List>
      </Segment>
    );
  }
}
```

The UI will look like below; once you click “Delete” and refresh the browser, the file will disappear from the filestore:

OK, we can upload and delete files, but how about downloading them? Let’s implement the download functionality. As you may remember from Part 1, we have a Lambda resolver which signs CloudFront URLs for us; we will use this URL for the file download:

```javascript
// Update the FileList class to request a signed URL on the onClick event
const getObject = `query getObject($objectId: String!){
  getObject(objectId: $objectId) {
    url
  }
}`;

class FileList extends React.Component {
  getUrl = async file => {
    const result = await API.graphql(
      graphqlOperation(getObject, { objectId: file })
    );
    window.location.assign(result.data.getObject.url);
  };

  Files() {
    if (this.props.files.length !== 0) {
      return this.props.files.map(file => (
        <List.Item key={file.objectId}>
          <List.Content floated="right">
            <S3Delete file={file.objectId} />
          </List.Content>
          <List.Content
            as="a"
            href="javascript:void(0)"
            onClick={() => {
              this.getUrl(file.objectId);
            }}
          >
            {file.objectId}
          </List.Content>
        </List.Item>
      ));
    } else {
      return (
        <List.Item>
          <List.Content>Your filestore is empty</List.Content>
        </List.Item>
      );
    }
  }

  render() {
    return (
      <Segment>
        <List divided verticalAlign="middle">
          {this.Files()}
        </List>
      </Segment>
    );
  }
}
```

Now when you upload a test file and refresh your browser you will see it in the filestore. Clicking on the file name will request a signed URL from the API and start the file download. That’s it for Sprint 2.

Sprint 3 (Real-time capability and cache invalidation)

So far we had to refresh the browser every time we uploaded or deleted a file, to trigger a getObjects query and render the UI with the updated metadata. This is obviously not how we want our production users to deal with the app, so we need to implement real-time feedback on the actions the user performs on the filestore. We could simply re-render on each update/delete, but we want something more sophisticated: we want to update the UI even if the user has modified the filestore content in a different browser or from a different device. AppSync allows us to do this with subscriptions. We will subscribe the user to events related to their filestore.

Let’s modify the schema:

```graphql
# We add a boolean state input parameter to the putObject mutation so we
# can differentiate putObject events from deleteObject events
type Mutation {
  putObject(objectId: String!, state: Boolean!, comment: String): Object
  deleteObject(objectId: String!): Object
}

# Add a state field to the Object type
type Object {
  objectId: String
  userId: String
  state: Boolean
  url: String
  comment: String
}

# Subscribe to both mutations, filtering events by userId
type Subscription {
  onObjectModify(userId: String): Object
    @aws_subscribe(mutations: ["putObject", "deleteObject"])
}
```

Update putObject mutation in our code:

```javascript
const putObject = `mutation putObject($objectId: String!){
  putObject(objectId: $objectId, state: true) {
    objectId
    userId
    state
  }
}`;
```

Subscribe to AppSync events:

```javascript
// Select objectId and state as well as userId: the subscription handler
// below relies on them to decide whether to add or remove an entry
const onObjectModify = `
subscription onObjectModify($userId: String) {
  onObjectModify(userId: $userId) {
    objectId
    userId
    state
  }
}
`;

class FilesListLoader extends React.Component {
  constructor(props) {
    super(props);
    this.state = { identityId: "" };
  }

  async componentDidMount() {
    await Auth.currentSession().then(data => {
      this.setState({ identityId: data.idToken.payload.sub });
    });
  }

  render() {
    return (
      this.state.identityId !== "" && (
        <Connect
          query={graphqlOperation(getObjects)}
          subscription={graphqlOperation(onObjectModify, {
            userId: this.state.identityId
          })}
          onSubscriptionMsg={(prev, { onObjectModify }) => {
            const index = prev.getObjects.findIndex(
              obj => obj.objectId === onObjectModify.objectId
            );
            if (!onObjectModify.state) {
              prev.getObjects.splice(index, 1);
            } else {
              prev.getObjects.push(onObjectModify);
            }
            return prev;
          }}
        >
          {({ data, loading, errors }) => {
            if (loading) {
              return <div>Loading...</div>;
            }
            if (!data.getObjects) return null;
            return <FileList files={data.getObjects} />;
          }}
        </Connect>
      )
    );
  }
}
```

Now let’s make sure that our subscription resolver allows a user to subscribe only to events coming from their own userId. First, let’s create a dummy data source with the NONE type:

Then go and attach a resolver to our onObjectModify subscription:
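As a sketch of what that resolver’s mapping templates can look like (assuming the authorization check compares the caller’s Cognito sub against the userId argument, following the common AppSync pattern for subscription resolvers backed by a NONE data source — the exact templates may differ from the original setup):

```vtl
## Request mapping template: the NONE data source has nothing to fetch,
## so we pass an empty payload through
{
  "version": "2017-02-28",
  "payload": {}
}
```

```vtl
## Response mapping template: reject the subscription if the caller
## tries to subscribe to another user's events
#if($context.identity.sub != $context.arguments.userId)
  $util.unauthorized()
#end
$util.toJson(null)
```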

Now if you open the app in multiple browsers and upload/delete files, you should see the changes in real time in all app sessions.

For CloudFront cache invalidation on the deleteObject mutation we will be using AppSync pipeline resolvers. Go ahead and create a new Lambda function with a role policy that allows creating CloudFront cache invalidations:

```javascript
var aws = require('aws-sdk');
var cloudfront = new aws.CloudFront();

exports.handler = async (event) => {
  var params = {
    DistributionId: process.env.distribution_id,
    InvalidationBatch: {
      CallerReference: `${event.userId}_${Math.floor(Date.now() / 1000)}`,
      Paths: {
        Quantity: 1,
        Items: [
          `/${event.userId}/${event.objectId}`
        ]
      }
    }
  };
  await cloudfront.createInvalidation(params).promise()
    .then((response) => {
      console.log('response:', response);
    })
    .catch((error) => {
      console.log('error:', error);
    });
};
```
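The original post doesn’t show the role policy itself; a minimal sketch of what it might look like is below (note that scoping the resource more tightly than `*` may not be supported for invalidations, depending on the CloudFront API version in use):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "cloudfront:CreateInvalidation",
      "Resource": "*"
    }
  ]
}
```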

Create a new AppSync data source from the Lambda function. Now we need to convert the existing deleteObject resolver to a pipeline resolver. Go to the resolver and click “Convert to pipeline resolver”:

Under functions, click “Add function”, then “Create new function”, and choose our Lambda function as the data source. We don’t need a response, so we set it to null:

Now let’s chain this function in the pipeline, so it runs before our main deleteObject function:
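As an illustration, the request mapping template for such a Lambda pipeline function typically invokes the function with values taken from the resolver context; a sketch, assuming field names that match what the Lambda above reads from its event:

```vtl
## Invoke the invalidation Lambda with the caller's sub and the object id
{
  "version": "2018-05-29",
  "operation": "Invoke",
  "payload": {
    "userId": "$context.identity.sub",
    "objectId": "$context.arguments.objectId"
  }
}
```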

Try deleting an object using the UI and you should see a new CloudFront invalidation created by this resolver:

That’s it for Sprint 3.

Sprint 4 (Deployment)

Deployment (“Old school” way)

Now that our app is ready, let’s look at how to deploy it the old-school way (provisioning all the resources ourselves, to understand how it all binds together).

Before we go into deployment, let’s prepare a new CloudFront distribution with a custom SSL certificate and an S3 bucket as the origin from which our React app will be served.

S3 bucket

Create an S3 bucket with the following policy:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": "*",
      "Action": ["s3:GetObject"],
      "Resource": ["arn:aws:s3:::dropbox-website-bucket/*"]
    }
  ]
}
```

And CORS configuration:

```xml
<CORSConfiguration>
  <CORSRule>
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedMethod>POST</AllowedMethod>
    <AllowedMethod>HEAD</AllowedMethod>
    <AllowedMethod>DELETE</AllowedMethod>
    <MaxAgeSeconds>3000</MaxAgeSeconds>
    <AllowedHeader>*</AllowedHeader>
  </CORSRule>
</CORSConfiguration>
```

Enable static website hosting on a bucket:

Build

We are finished with prototyping, so let’s download our source code from codesandbox and build it locally.

```shell
$ mkdir dropbox/
$ cd dropbox/
$ unzip ../q78l9l6now.zip
$ npm install
$ npm run build
```

Deploy

After the build has finished, upload the contents of the ./build folder to the S3 bucket created earlier:

```shell
$ cd build/
$ aws s3 sync ./ s3://dropbox-website-bucket
```

Now the app should be available at the S3 static website URL created earlier.

CloudFront

To serve the app using AWS global infrastructure and with a custom SSL certificate, let’s create a new CloudFront distribution with our S3 bucket static website as the origin.

Route53

The final step is to create an alias record in Route53 that forwards requests to our distribution:

Now our web app should be available at https://dropbox.dlozitskiy.online

Deployment (New way)

AWS is constantly working to make developers’ lives easier, and recently the AWS Amplify Console was announced. It is a deployment tool that integrates with various source control providers and does all the heavy lifting of building your web app and provisioning the SSL, Route53 and CloudFront resources. Let’s go through the steps of deploying the same app using the AWS Amplify Console to see how easy it is.

Let’s create a new app and connect it to our GitHub repository:

For the build settings we are going to choose the defaults, as they do exactly what we need:

Save and deploy the app. That’s it! In a couple of clicks we got the same result.

Now if you navigate to the URL provisioned by Amplify, you will see the same version we deployed earlier.

To serve it from a custom URL we need a couple more steps. Create a new Route53 public zone with a domain name that you own:

Note that your domain name registrar should have the Route 53 servers as authoritative name servers; for example, for Freenom it should look like below:

Next, go to “Domain Management” and add a custom domain name:

It will take some time to verify ownership; the console says it can take up to 48 hours, but in our case it took around an hour:

Once ownership is verified, let’s create a subdomain that points to our distribution and serves our app. Go to “Manage subdomains” and add a new subdomain serving the app built from the master branch:

After some time it should be available at the new subdomain URL https://dropbox.dlozitskiy.tk

Post Release and Closing Notes

That’s it, and congrats on making it this far!

So what’s next? Relax, sit back and enjoy, because it’s 100% serverless and everything is taken care of, right?

Not really. The good news is that running 100% serverless brings a new set of challenges, sometimes even harder to deal with, since we have less control over the infrastructure we run on. We need to closely monitor AppSync error logs and watch our functions for cold starts, timeouts, execution times, cost, etc. But let’s leave that to our future selves.

Hope you enjoyed the read. Please clap if you liked it, and let me know if you have any ideas or suggestions.

The source code for the app can be found here: https://github.com/Dlozitskiy/serverless-dropbox