Users engage Google Assistant in conversation to get things done, like buying groceries, booking a ride, or, in our case, reaching out to resolve an issue (for a complete list of what’s possible now, see the Actions directory). As a developer, you can use Actions on Google to easily create and manage delightful and effective conversational experiences between users and your own third-party fulfillment service. In this tutorial, we will learn how to build an app for Google Assistant using Dialogflow Enterprise Edition and Actions on Google: build rich, effective conversations between users and your own fulfillment service, add life-like responses using SSML, store user data in Google Cloud Datastore, and publish your Actions for Google Assistant.

This tutorial is Part 2 of our previous tutorial on How to create a chatbot using Dialogflow Enterprise Edition and Dialogflow API V2.

How to build an App for Google Assistant using Dialogflow Enterprise Edition and Actions on Google:

By the end of this tutorial, you will have a better understanding of the following:

As mentioned earlier, this tutorial is a continuation of the previous one. Follow steps 1 to 8 from the previous tutorial and come back here to continue. Most of the project creation and setup, creating the Dialogflow agent, and storing information in Google Cloud Datastore is already covered in Part 1.

Step 1: Follow this tutorial from step 1 to 8 and continue to next step.

Step 2: Dialogflow Integrations

Go to Dialogflow Console, select your agent and click on Integrations.

Click on Google Assistant

This should open up a pop-up window as shown below. Click on TEST button.

Clicking on the TEST button launches the Actions on Google Simulator as shown below.

Step 3: Actions Simulator

The Actions simulator in the Actions Console lets you test your apps through an easy-to-use web interface that can simulate hardware devices and their settings. You can also access debug information, such as the request and response that your fulfillment receives and sends.

The simulator is the best way to test your Actions on Google Assistant if you don’t have a supported hardware device.

First, let us test the app that we have built so far. The example below shows the experience of our test app in Google Assistant.

Step 4: How to improve the Conversational Experience

The app/action we built so far does exactly what we intended, but the overall experience can be a whole lot better. Notice that when the app reads out the ticket number, it speaks it in cardinal format (for example, 54321 is read as “fifty-four thousand three hundred twenty-one”). We would rather have the ticket number spoken as individual digits. Since this tutorial is about how to build an app for Google Assistant using Dialogflow Enterprise Edition and Actions on Google, we need to make sure that our app is engaging and natural in conversational dialogue.

Step 5: Add SSML (Speech Synthesis Markup Language) to your Response

When returning a response to the Google Assistant, you can use a subset of the Speech Synthesis Markup Language (SSML) in your responses. By using SSML, you can make your agent’s responses seem more life-like.
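Before touching the Cloud Function, here is a minimal standalone sketch of the idea. It is not tied to any Google library, and `toSsml` is a hypothetical helper name used only for illustration:

```javascript
// Wrap a ticket number in SSML so the Assistant reads it digit by digit
// instead of as a cardinal number. (toSsml is a hypothetical helper.)
function toSsml(ticketnum) {
  return '<speak>I have successfully logged your ticket, the ticket number is ' +
    '<say-as interpret-as="characters">' + ticketnum + '</say-as>. ' +
    'Someone from the helpdesk will reach out to you within 24 hours.</speak>';
}

// With interpret-as="characters", 54321 is spoken as
// "five four three two one" rather than a single large number.
console.log(toSsml(54321));
```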

To add SSML in our responses, we need to modify our cloud functions.

Go to Dialogflow Console.

Click on Fulfillment.

Use the Inline Editor to modify our Cloud Function index.js to include SSML.



'use strict';

const http = require('http');

// Imports the Google Cloud client library
const Datastore = require('@google-cloud/datastore');

// Your Google Cloud Platform project ID
const projectId = 'REPLACE_WITH_YOUR_PROJECT_ID';

// Instantiates a client
const datastore = Datastore({
  projectId: projectId
});

// The kind for the new entity
const kind = 'ticket';

exports.dialogflowFirebaseFulfillment = (req, res) => {
  console.log('Dialogflow Request body: ' + JSON.stringify(req.body));

  // Get the ticket description and user details from the request
  let ticketDescription = req.body.queryResult['queryText'];
  let username = req.body.queryResult.outputContexts[1].parameters['given-name.original'];
  let phone_number = req.body.queryResult.outputContexts[1].parameters['phone-number.original'];

  console.log('description is ' + ticketDescription);
  console.log('name is ' + username);
  console.log('phone number is ' + phone_number);

  // Generates a random ticket number
  function randomIntInc(low, high) {
    return Math.floor(Math.random() * (high - low + 1) + low);
  }

  let ticketnum = randomIntInc(11111, 99999);

  // The Cloud Datastore key for the new entity
  const taskKey = datastore.key(kind);

  // Prepares the new entity
  const task = {
    key: taskKey,
    data: {
      description: ticketDescription,
      username: username,
      phoneNumber: phone_number,
      ticketNumber: ticketnum
    }
  };

  console.log('incidence is ', task);

  // Saves the entity
  datastore.save(task)
    .then(() => {
      console.log(`Saved ${task.key}: ${task.data.description}`);
      res.setHeader('Content-Type', 'application/json');
      // SSML response to send back to Dialogflow
      res.send(JSON.stringify({ 'fulfillmentText': '<speak>I have successfully logged your ticket, the ticket number is <say-as interpret-as="characters">' + ticketnum + '</say-as>. Someone from the helpdesk will reach out to you within 24 hours.</speak>' }));
    })
    .catch((err) => {
      console.error('ERROR:', err);
      res.setHeader('Content-Type', 'application/json');
      res.send(JSON.stringify({ 'speech': 'Error occurred while saving, try again later', 'displayText': 'Error occurred while saving, try again later' }));
    });
};
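For reference, here is a minimal sketch of the Dialogflow API V2 request shape that the fulfillment reads from. The field names and the `outputContexts` index mirror the code; a real request carries many more fields, and the context names and parameter values below are invented for illustration:

```javascript
// Hypothetical, trimmed-down Dialogflow V2 webhook request body.
const sampleRequest = {
  queryResult: {
    queryText: 'My laptop will not boot',
    outputContexts: [
      { name: 'contexts/some-other-context', parameters: {} },
      {
        name: 'contexts/ticketinfo', // invented context name
        parameters: {
          'given-name.original': 'Sachin',
          'phone-number.original': '555-0100'
        }
      }
    ]
  }
};

// The same extraction the fulfillment performs:
const ticketDescription = sampleRequest.queryResult['queryText'];
const username = sampleRequest.queryResult.outputContexts[1].parameters['given-name.original'];
const phone_number = sampleRequest.queryResult.outputContexts[1].parameters['phone-number.original'];

console.log(ticketDescription); // → My laptop will not boot
console.log(username);          // → Sachin
console.log(phone_number);      // → 555-0100
```

Note that the code indexes `outputContexts[1]` directly, so it depends on the order in which Dialogflow returns contexts; a more robust fulfillment would look the context up by name.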

The most important change is the SSML markup added to the fulfillmentText response.

Note: Make sure to replace the Project ID with your own Project ID that you created earlier by following the previous tutorial.

<speak>I have successfully logged your ticket, the ticket number is <say-as interpret-as="characters">' + ticketnum + '</say-as>. Someone from the helpdesk will reach out to you within 24 hours.</speak>

<speak>: the root element of the SSML response.

<say-as>: lets you indicate information about the type of text construct that is contained within the element. It also helps specify the level of detail for rendering the contained text.

Learn more about SSML markup responses here.

Deploy the Cloud Functions.

Click on Integrations, then Google Assistant, and then TEST.

Test the app in the simulator and check the response. You will hear that the ticket number is now spelled out as individual characters rather than in cardinal format, which is exactly what we wanted.

Step 6: End Conversation

From the previous demo, you may have noticed that our app needs to be cancelled to end the conversation. However, we can automatically end the conversation once we have gathered all the information: as soon as the app replies with the ticket number, we can end the conversation.
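Besides the console setting described below, a webhook can also end the conversation per-request. A hedged sketch, assuming the Actions on Google payload format for Dialogflow V2, where `expectUserResponse: false` tells the Assistant to speak the response and then close the conversation:

```javascript
// Sketch: a Dialogflow V2 webhook response that ends the conversation
// via the Actions on Google payload instead of the console toggle.
const webhookResponse = {
  fulfillmentText: 'Your ticket has been logged. Goodbye!',
  payload: {
    google: {
      // false = speak the response, then close the microphone
      expectUserResponse: false
    }
  }
};

console.log(JSON.stringify(webhookResponse));
```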

Click on Intents.

Expand the Submit Ticket intent and click on the Submit Ticket — collect description intent.

Under Responses, enable the “Set this intent as end of conversation” toggle.

Step 7: Test your app using Google Home

You can also test your app using a Google Home, Home Mini, phone, or any other device running Google Assistant. Make sure that you have enabled your test draft in the Actions on Google console. Your app should run on any Assistant device linked to the Google account used to create this project.

You can test your app on Google Home by saying “OK Google, Talk to my test app”.

You can also check that all the data, including the user info and ticket details, appears in Google Cloud Datastore.

Step 8: Add Invocation Name and Choose Google Assistant Voice

In order for users to interact with your action/app, we need to add an invocation name that they can use to invoke our action.

Users verbally say the invocation name to invoke your Action. For example, if the invocation name is Mr. Pixel, users can say: “OK Google, talk to Mr. Pixel.”

Add Invocation Name.

Add Directory title.

Choose the right Google Assistant Voice depending on your persona or brand.

Step 9: Branding, Theme Customization, Invocation and Discovery Checklist

Theme customization lets you customize the look and feel of your Actions to highlight your brand identity to users. After customizing the theme, save your changes and test them out in the Simulator.

Make sure to go through the Invocation and Discovery Checklist to see that you have covered everything on it.

Step 10: How to build an App for Google Assistant using Dialogflow Enterprise Edition and Actions on Google — Deploy and Release:

Before you deploy and publish your application/action, make sure to go through the Publishing Checklist.

Finally, fill in all the required information under the Deploy section.

Directory information: all the required details to show your Actions in the Actions directory.

Location targeting: specify the countries where your Actions will trigger. Your Actions will work for people in every country who speak a language your Actions support.

Surface capabilities: the surfaces your Actions will trigger on, and the required device capabilities for your Actions.

Release: submit your Action for production. A production release lets you officially launch your Actions to all Google Assistant users.

Conclusion — How to build an App for Google Assistant using Dialogflow Enterprise Edition and Actions on Google:

So to conclude, we have successfully built an Action for the Google Assistant. The purpose of this tutorial was to show you how to build an app for Google Assistant using Dialogflow Enterprise Edition and Actions on Google. There are still lots of things you can add to make the overall conversational experience a lot better, and you can train your agent with more training data in Dialogflow. I will be covering a lot more on Actions on Google and Google Assistant in upcoming tutorials. Let me know in the comments below what you would like me to cover in future tutorials.

You can also check out other tutorials like

- Build your own Google Voice Assistant without code

- How to create a chatbot using Dialogflow Enterprise Edition and Dialogflow API V2

- Sentiment Analysis using Google Cloud Natural Language API

- Entity Analysis using Google Cloud Natural Language API

Also check out TechWithSach.com for more interesting tutorials on machine learning, AI, Flutter, Unity, and more.