Conditional Templating in Dialogflow for Google Assistant

What I like most about Dialogflow (the tool for creating chatbots and smart assistants / Google Assistant apps) is that you can maintain your conversations within the Dialogflow UI.

Many users write their complete FAQ content in intents and responses. You no longer need a developer to tweak your conversations or deploy your agent, which makes it very scalable for large organizations.

It's also possible to load data from external systems. In that case, you are hosting parts of the conversation elsewhere. You could store these parts in language files or databases.
Some organizations prefer to write the full conversation in the Dialogflow UI. That way, your copywriters can maintain the full conversation.

It's good to know that it is possible to use templates and conditionals within the Dialogflow UI. The trick is to make use of the custom payload response setting, which you can find in the Intent > Responses section.

Just like how you would include Rich Cards, you can choose custom payload, and provide your own JSON.
Since JSON syntax is a subset of JavaScript, my first try was to embed if/else conditions directly in the code as a string:

  {
    "web": {
      "type": "text",
      "fn": "var today = new Date(); var curHr = today.getHours(); var greet = \"\"; if (curHr < 12) { greet = \"Good morning!\"; } else if (curHr < 18) { greet = \"Good afternoon!\"; } else { greet = \"Good evening!\"; } return greet;"
    }
  }

In my SDK back-end code or fulfillment app, I could convert the string to executable JavaScript code.
Although this works, it kinda feels dirty, since I need to use the evil eval() (or equivalent code) in my back-end. On top of that, I expect the person who maintains the conversation to have JavaScript skills.
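For illustration only (not something I'd recommend), here is a minimal sketch of what that back-end conversion could look like, using the Function constructor, which is essentially eval() in disguise. The payload shape matches the example above; the variable names are my own:

```javascript
// Sketch: turn the "fn" string from the custom payload into a callable.
// new Function() is eval() in disguise - exactly why this feels dirty.
const payload = {
  web: {
    type: 'text',
    fn: 'var today = new Date(); var curHr = today.getHours(); ' +
        'var greet = ""; if (curHr < 12) { greet = "Good morning!"; } ' +
        'else if (curHr < 18) { greet = "Good afternoon!"; } ' +
        'else { greet = "Good evening!"; } return greet;'
  }
};

// Build a function whose body is the payload string, then call it.
const run = new Function(payload.web.fn);
const greeting = run(); // one of the three greetings, depending on the hour
console.log(greeting);
```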

The next solution came from a customer of mine. Instead, let's make use of a templating library, so you can provide readable templates and variables that can be injected.
Think Jinja (for Python or Java developers), Smarty (PHP), or Jade/Pug, Handlebars, and Mustache (for JavaScript developers).

I tried this with PugJS (formerly known as Jade), and it works really nicely.

Let's take this intent:

Intent Name: [templating] example
Training Phrases:
* Greet me
* Greet Lee
Parameter: username - @sys.given-name - $username
Fulfillment: Enable webhook call

Here's an example custom payload:

   {
      "custom": {
         "locals": {
            "username": "$username"
         },
         "pug": [
            "if username\n",
            "  | Hello $username\n",
            "else\n",
            "  | Hello stranger"
         ]
      }
   }

The values of the locals object are the parameter values. In the pug object, I wrote a multi-line string template with an if/else branch. When using PugJS, the line indentation is important. Pay attention to the newline \n characters and the | pipe for plain text.

In my fulfillment Cloud Function, I take this template and compile it together with the local template variables.

Your code will look like this:

Custom greeting Google Assistant app with Dialogflow and Actions on Google

Let's have a look into how you can create custom Welcome messages for your Google Assistant with Dialogflow and Actions on Google.

First open your Dialogflow Console. Create a new Intent with the following settings:

Intent name: [bot-first-greeting]
Events: Choose Google Assistant Welcome
Training Phrases: Empty
Fulfillment: Enable Webhook call for intent

Click Save.

For the code I am using Google Cloud Functions. Please also see my previous post.

Your Google Cloud Function could look like this:

'use strict';

const {
  dialogflow
} = require('actions-on-google'); // npm actions-on-google 2.1.1

const welcomeHandler = (conv) => {

  var today = new Date();
  var curHr = today.getHours();
  var greet = "";

  if (curHr < 12) {
    greet = "Good morning!";
  } else if (curHr < 18) {
    greet = "Good afternoon!";
  } else {
    greet = "Good evening!";
  }

  // Send the greeting back to the user.
  conv.ask(greet);
};

const app = dialogflow();

app.intent('[bot-first-greeting]', welcomeHandler);

exports.index = app;

Click the Fulfillment menu item, and make sure the URL points to your Google Cloud function.

Assuming you have already linked Actions on Google in the Integrations tab, move back to your Actions on Google simulator and refresh your app.

When you start your Google Assistant app, it will greet you based on the time of day.

Obviously, this is a simple example, but this can become more interesting when loading profile information or previous contexts before starting your app.

Actions on Google for Google Cloud Functions

When building Google Assistant apps (actions) with Dialogflow, you likely will have to write some logic. The most common way of developing this logic layer is by using a webhook and a Cloud Function.

The webhook requires a URL, so technically you can use any web server and programming language you like; Cloud Functions are just easy. They're serverless, which means you don't need to worry about setting up and maintaining an environment, and they scale out of the box.

Dialogflow integrates with Firebase Cloud Functions. There's an easy inline editor you can use, which creates the Cloud Function within Firebase (which, under the hood, uses Google Cloud infrastructure).

For Dialogflow Enterprise customers (the Dialogflow edition that is compliant and better suited for large organizations), Firebase Functions don't make much sense (and in fact are not created within your current GCP project). Instead, you use Google Cloud Functions.

Both types of functions can make use of HTTP triggers, but the way of invoking them is different:


// Google Cloud Function with an HTTP trigger
exports.helloWorld = function helloWorld (request, response) {
  response.send(`Hello from GCP!`);
};


// Firebase Cloud Function with an HTTP trigger
exports.helloWorld = functions.https.onRequest((request, response) => {
  response.send("Hello from Firebase!");
});

The Actions on Google Node.js library (for creating Dialogflow agents with Google Assistant) explains how you can integrate the library within a Firebase Cloud Function. Unfortunately, it doesn't explain how to integrate it with a GCP Cloud Function.

So here's how you would do this:

As you can see, you can get the request and response headers from the conv object in the conversation handler function.

NOTE: This example was written for the 2.1.1 version of the Actions on Google NPM package:


{
    "name": "Demo",
    "description": "Google Assistant with Dialogflow Enterprise",
    "version": "1.0.0",
    "license": "Apache-2.0",
    "author": "Lee Boonstra",
    "engines": {
        "node": "^6.11.5"
    },
    "dependencies": {
        "actions-on-google": "2.1.1"
    }
}

Video: DLP & Vision API demo @ Google Cloud Summit Munich

Posted in Machine Learning

At the Google Cloud Summit event in Munich, I presented the Vision API and DLP API for 3000+ IT professionals:

DLP API demo

Vision API demo

VIDEO: Google Cloud Summit Paris, Keynote: Video Intelligence

Posted in GCP, Machine Learning

At the Google Cloud Summit event in Paris, I presented the Video Intelligence API for 3000+ IT professionals:

VIDEO: Cloud on air webinar: Create custom conversations with Dialogflow

Posted in Dialogflow, Uncategorized

On the 17th of October, I ran a global webinar for Google about custom conversations and chatbots.

This was really exciting, as we recorded it together with a full professional film crew!


VIDEO: Google Cloud Next Amsterdam Keynote – Video Intelligence API

Posted in GCP, Machine Learning

At the Google Cloud Next event in Amsterdam, I presented the Video Intelligence API for 3000+ IT professionals:

VIDEO: Create custom chat agents for Google Home with API.AI

Posted in Chatbots, Conversation Agents, GCP

During Google Cloud Next in Amsterdam, I spoke for the Leader Circle (c-level discussions for Dutch companies) about chatbots, and smart conversation agents.
A talk about Google Home, and how you can create custom agents with the Conversation Engine / API.AI.

VIDEO: Extending G Suite with Business Applications

Posted in G Suite, JavaScript

In the summer of 2017, I ran a technical session about G Suite at Google Cloud Next in Amsterdam.

This session is a technical talk on how companies can extend G Suite, whether they integrate it with their organization's own applications or write code on top of G Suite. The talk is followed by a live use case with the CEO and founder of VOGSY, Leo Koster.

Enjoy watching!

Video: Machine Learning APIs for Python Developers

Posted in GCP, Machine Learning

A couple of months ago, my co-worker @dnovakovskiy and I spoke at a Python event in the north of the Netherlands: PyGrunn.

I forgot that recordings were made, and I came across them today. What a nice surprise!

So in case you are a Python dev, here's the video of our keynote: Machine Learning APIs for Python Developers.