dialogflow-conversation-components-nodejs's Introduction

Actions on Google: Dialogflow Conversation Components Sample

⚠️ Warning: Conversational Actions will be deprecated on June 13, 2023. For more information, see Conversational Actions Sunset.

A sample showcasing rich responses for building Actions on the Google Assistant, using the Node.js client library and deployed on Cloud Functions for Firebase.

⚠️ This code sample was built using Dialogflow. We now recommend using Actions Builder or the Actions SDK to develop, test, and deploy Conversational Actions.

Setup

Select only one of the options below.

Option 1: Add to Dialogflow

To create an agent from our template:

Conversation Component

Option 2: Dialogflow Restore and Firebase CLI

Prerequisites

  1. Node.js and NPM
    • We recommend installing using NVM
  2. Install the Firebase CLI
    • We recommend using version 6.5.0: npm install -g firebase-tools@6.5.0
    • Run firebase login with your Google account

Configuration

Actions Console

  1. From the Actions on Google Console, New project (this will be your Project ID) > Create Project > under More options > Conversational
  2. From the top menu under Develop > Actions (left nav) > Add your first action > BUILD (this will bring you to the Dialogflow console) > Select language and time zone > CREATE.
  3. In the Dialogflow console, go to Settings ⚙ > Export and Import > Restore from zip using the agent.zip in this sample's directory.

Firebase Deployment

  1. On your local machine, in the functions directory, run npm install
  2. Run firebase deploy --project {PROJECT_ID} to deploy the function
    • To find your Project ID: In Dialogflow console under Settings ⚙ > General tab > Project ID.
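For orientation, the entry point that gets deployed looks roughly like the following. This is a minimal sketch, not the sample's actual index.js; the welcome handler and its reply text are illustrative, but the export name must match the function URL (dialogflowFirebaseFulfillment):

```javascript
// Minimal sketch of a Cloud Functions fulfillment entry point.
// The export name determines the deployed function's URL segment.
const functions = require('firebase-functions');
const {dialogflow} = require('actions-on-google');

const app = dialogflow({debug: true});

// Illustrative handler; the real sample registers handlers for each
// rich response type (cards, lists, carousels, media, and so on).
app.intent('Default Welcome Intent', (conv) => {
  conv.ask('Hi! Try asking for a card, list, carousel, or media response.');
});

exports.dialogflowFirebaseFulfillment = functions.https.onRequest(app);
```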

Dialogflow Console

  1. Return to the Dialogflow Console > select Fulfillment > Enable Webhook > Set URL to the Function URL that was returned after the deploy command > SAVE.
    Function URL (dialogflowFirebaseFulfillment): https://<REGION>-<PROJECT_ID>.cloudfunctions.net/dialogflowFirebaseFulfillment
    
  2. From the left navigation menu, click Integrations > Integration Settings under Google Assistant > Enable Auto-preview changes > Test to open the Actions on Google simulator then say or type Talk to my test app.

Testing this Sample

  • (Recommended) Open up the Assistant app then say or type OK Google, talk to my test app on a mobile device where Google Assistant is associated with the same account as your Action.
  • You can also use the Actions on Google Console simulator to test most features and preview on-device behavior.

References & Issues

Make Contributions

Please read and follow the steps in the CONTRIBUTING.md.

License

See LICENSE.

Terms

Your use of this sample is subject to, and by using or downloading the sample files you agree to comply with, the Google APIs Terms of Service.

dialogflow-conversation-components-nodejs's People

Contributors

atulep, canain, lucaswadedavis, mandnyc, norulesjustfeels, silvolu, smishra2, taycaldwell


dialogflow-conversation-components-nodejs's Issues

Validation Errors

When clicking on lists or carousels in the Actions on Google simulator, the following validation error appears:

Error Type
MalformedResponse
expected_inputs[0].possible_intents[0]: intent 'actions.intent.OPTION' is only supported for version 2 and above.

Here is the response JSON:
{
  "response": "my test app isn’t responding right now. Try again soon.\n",
  "audioResponse": "//NExAASW...",
  "debugInfo": {
    "sharedDebugInfo": [
      {
        "name": "ExecutionResponse",
        "debugInfo": "HarpoonRe..."
      },
      {
        "name": "ResponseValidation",
        "subDebugEntry": [
          {
            "name": "MalformedResponse",
            "debugInfo": "expected_inputs[0].possible_intents[0]: intent 'actions.intent.OPTION' is only supported for version 2 and above."
          }
        ]
      }
    ]
  },
  "visualResponse": {}
}

How to resolve this?

Add Device Screenshots of Components

Would it be possible to show what the List component looks like on a small device (e.g. iPhone 5c) vs. a large device (e.g. Google Pixel XL)?

It's a little disappointing that these visual components don't have any screenshots here.

How to use without Firebase?

I want to run my own Node service, but there seem to be some useful helper methods here.
How do I use this codebase without Firebase?

How should this be rewritten in the context of a vanilla Express app?

exports.dialogflowFirebaseFulfillment = functions.https.onRequest(app);
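A minimal sketch of one way to do this, assuming the actions-on-google conversation app can be mounted as a plain request handler on an Express route (the route path and port are illustrative):

```javascript
// Sketch: mounting an actions-on-google Dialogflow app on a vanilla
// Express server instead of Cloud Functions.
const express = require('express');
const bodyParser = require('body-parser');
const {dialogflow} = require('actions-on-google');

const app = dialogflow();
// ...register intent handlers on `app` as usual...

const server = express();
server.use(bodyParser.json());
server.post('/fulfillment', app);  // the conv app is (req, res) compatible
server.listen(3000);
```

Point your Dialogflow fulfillment webhook URL at this server (it must be reachable over HTTPS).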

Media response MP3 live streaming specifications

Hi,

The documentation about media responses has changed in the last few weeks.

First it was https://developers.google.com/assistant/conversational/df-asdk/rich-responses#media_responses where you could read: "Audio for playback must be in a correctly formatted .mp3 file. Live streaming is not supported."

Now it is https://developers.google.com/assistant/conversational/prompts-media and you can now read: "Audio for playback must be in a correctly formatted MP3 file. MP3 files must be hosted on a web server and be publicly available through an HTTPS URL. Live streaming is only supported for the MP3 format."

Both before and after this change, we have built some Actions for a radio broadcast station where we want to play the station's live stream.

As the URL we use an MP3 stream, https://shoutcast.ccma.cat/ccma/catalunyaradioHD.mp3, with this specification:

  • MPEG Audio Layer 1/2
  • Stereo
  • 44100 Hz
  • 32 bits per sample
  • bit rate 128 Kbps

This stream is working well in:

  • Simulator
  • Google Mini
  • Google Nest Hub

But it is not working on mobile phones! For this reason our Action wasn't approved for production.

If we change the MP3 URL to this one, https://shoutcast.ccma.cat/ccma/catalunyaradio.mp3, with this specification:

  • MPEG Audio Layer 1/2
  • Stereo
  • 44100 Hz
  • 32 bits per sample
  • bit rate 64 Kbps

Then we can listen to the radio on mobile phones as well, but the quality is terrible, and only on mobile phones running the Google Assistant application. You can hear it in this video capture: https://drive.google.com/file/d/1eBoLRDx-n0QNHgHPWR4lzaZ1SyjWTGok/view?usp=sharing

So, what is the correct specification for MP3 live streaming?
Why can't we hear the 128 kbps stream on mobile phones?
Why does the 64 kbps stream sound so awful on mobile phones?

Thanks in advance,

Eduard

Sample seems to be broken

Steps to Reproduce

  1. Restore Agent from zip (dialogflow/agent.zip)
  2. Click on Google Assistant from Integrations Menu
  3. Launch the test app and select List from the suggestions

Fulfillment URL:

https://us-central1-conversation-component-sample.cloudfunctions.net/conversationComponentApiAi

Change-Id:

Ibb8e918b355c19a278829af6b102f490b639d1c9

Error Response

"sharedDebugInfo": [
    {
      "name": "ResponseValidation",
      "subDebugEntry": [
        {
          "debugInfo": "expected_inputs[0].possible_intents[0]: intent 'actions.intent.OPTION' is only supported for version 2 and above.",
          "name": "MalformedResponse"
        }
      ]
    }
  ]

Error getting documents: Error: Unknown response type:

Hi,

I'm trying to read documents in the Firestore database from Dialogflow,
but I keep getting this error: Error getting documents: Error: Unknown response type: "{document data}"

I want the document to be a response to the user's request, so it would go like:
user: I want to see this document
agent: Here is the document
agent: Document Information

Below are the imports:

 //these are functionalities that need to be used
const {dialogflow} = require('actions-on-google');
const functions = require('firebase-functions');
const {WebhookClient} = require('dialogflow-fulfillment');
const {Card, Suggestion} = require('dialogflow-fulfillment');

//initialise DB connection
const admin = require("firebase-admin");
 
process.env.DEBUG = 'dialogflow:debug*'; // enables lib debugging statements
//configure sdk
admin.initializeApp(functions.config().firebase);
const db = admin.firestore();
const timestamp = admin.firestore.Timestamp;

and the function to retrieve the data

function getPositiveLogs(agent) {
  const dialogflowdoc = db.collection('users').doc('user3')
      .collection('positiveLogs').where('date', '==', true).limit(2);

  return dialogflowdoc.get()
      .then(function(querySnapshot) {
        querySnapshot.forEach(function(doc) {
          if (!doc.exists) {
            agent.add('Sorry there do not seem to be any logs to see');
          } else {
            const data = doc.data();
            agent.add(data.date);
          }
          console.log(doc.id, ' => ', doc.data());
        });
      })
      .catch(function(error) {
        agent.add('error retrieving documents');
        console.log('Error getting documents: ', error);
      });
}

I would be grateful to anyone who can tell me where I am going wrong!
Thank you

Error getting documents: Error: Unknown response type: "{"user_feeling":"happy","user_activity":"met up with friend","user_reward":"no","date":"2019-01-10T19:06:01.526Z"}" at WebhookClient.addResponse_ (/user_code/node_modules/dialogflow-fulfillment/src/dialogflow-fulfillment.js:287:13) at WebhookClient.add (/user_code/node_modules/dialogflow-fulfillment/src/dialogflow-fulfillment.js:254:12) at /user_code/index.js:185:23 at QuerySnapshot.forEach (/user_code/node_modules/firebase-admin/node_modules/@google-cloud/firestore/build/src/reference.js:731:22) at /user_code/index.js:181:23 at process._tickDomainCallback (internal/process/next_tick.js:135:7) 
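A likely cause, given the error text: agent.add() in dialogflow-fulfillment accepts strings (or rich response objects), not plain objects or raw Firestore values, and the code above passes data.date straight in. A hedged sketch of a fix; the helper name is hypothetical, and the field names mirror the document shown in the error message:

```javascript
// Convert the Firestore document to a string before handing it to
// agent.add(). formatLogEntry is an illustrative helper; the fields
// (user_feeling, user_activity, date) come from the error above.
function formatLogEntry(data) {
  const day = new Date(data.date).toDateString();
  return `On ${day} you felt ${data.user_feeling} after you ${data.user_activity}.`;
}

// Inside the query callback, instead of agent.add(data.date):
// agent.add(formatLogEntry(doc.data()));
```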


Unable to retrieve parameters from previous context in code

Hi,

I am trying to retrieve what the user says from a previous context so that I can save it in a log in Firebase.

So the conversation will go,
dialogflow agent: What are you grateful for today
user: I'm grateful for my family
dialogflow agent: Now can you tell me what you are proud of achieving?
user: I'm proud of being able to swim 1km

I then want the fulfilment to save both responses from the user in one document.

However, nothing is working. I've been through all the possible solutions shown online and none seem to work for me.
Could you possibly tell me what I am doing wrong?

So first what I did was store the response in a constant then use it to set the context.

const params = [agent.parameters.response1];
let ctx = {'name': '1-praise - custom', 'lifespan': 5, 'parameters': {'response1': params}};
agent.context.set(ctx);

In the next function I then retrieve the same context and get the parameter from it.

const praiseParam = agent.context.get('1-praise - custom').parameters.response1;
//var praiseParam = praiseContext.response1;
var praise2Param = agent.parameters.second;

const dialogflowAgentRef = db.collection('users').doc('user3')
    .collection('gratitufeLogs');

return dialogflowAgentRef.add({
  praise: praiseParam,
  secondPraise: praise2Param
})
.then(function(docRef) {
  console.log("Document written with ID: ", docRef.id);
})
.catch(function(error) {
  console.error("Error adding document: ", error);
});

There are no errors showing in the function logs, so I am not sure what I am doing wrong.
I would be grateful for any suggestions or pointers on how to resolve this.

Thank you so much in advance!
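One detail worth checking, sketched below: response1 is stored wrapped in an array, so reading it back from the context returns an array rather than the raw value. A hedged sketch (the context and parameter names are taken from the code above; the unwrapping is an assumption about what the intent handler expects):

```javascript
// Setting the context: note the parameter is wrapped in an array here.
const params = [agent.parameters.response1];
agent.context.set({
  name: '1-praise - custom',
  lifespan: 5,
  parameters: {response1: params},
});

// Reading it back in the next handler: unwrap the array (or simply
// store the raw value instead of [value] when setting the context).
const stored = agent.context.get('1-praise - custom').parameters.response1;
const praiseParam = Array.isArray(stored) ? stored[0] : stored;
```

Also verify in the Dialogflow console that the output context name the agent actually sets matches '1-praise - custom' exactly, including spaces.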

FetchError: request to a JSON URI of my SAP system

Hi,

When I am trying to fetch data from the JSON URL I am getting the error below:
FetchError: request to http://b1-dgwap01.XXXXXXXXXXX/sap/opu/odata/sap/ZMM_MOB_SALON_SUPPLIES_SRV/GET_AVAIL_BUDGETSet(XXXXXXX)?$format=json failed, reason: getaddrinfo ENOTFOUND b1-dgwap01.XXXXXX.lcl b1-dgwap01.XXXXXXXX.

My Node.js webhook code is able to fetch data from public example URLs, like https://jsonplaceholder.typicode.com/todos, and return the response to Dialogflow.

Is this problem caused by the authentication required by the URL?

Thank you.

Error: Unknown response type (JSON)

I am getting an error parsing a JSON response. With my limited understanding of JSON, is it because the response starts with an array?

I am using the request library:
https://github.com/request/request

And I am getting the data from the following service.
https://api.tfl.gov.uk/StopPoint/490002072A/Arrivals

Error: Unknown response type: "{"statusCode":200,"body":[{"$type":"Tfl.Api.Presentation.Entities.Prediction, Tfl.Api.Presentation.Entities","id":"-580534719","operationType":1,"vehicleId":"BF65WJK","naptanId":"490002072A","stationName":"Sandilands Tram Stop","lineId":"119","lineName":"119","platformName":"U","direction":"outbound","bearing":"273","destinationNaptanId":"","destinationName":"Purley Way, Colonnades","timestamp":"2019-02-27T09:37:38.4799254Z","timeToStation":156,"currentLocation":"","towards":"South Croydon or West Croydon","expectedArrival":"2019-02-27T09:40:14Z","timeToLive":"2019-02-27T09:40:44Z","modeName":"bus","timing":{"$type":"Tfl.Api.Presentation.Entities.PredictionTiming, Tfl.Api.Presentation.Entities","countdownServerAdjustment":"-00:00:02.6243184","source":"2019-02-27T02:35:21.836Z","insert":"2019-02-27T09:36:59.861Z","read":"2019-02-27T09:36:57.155Z","sent":"2019-02-27T09:37:38Z","received":"0001-01-01T00:00:00Z"}},{"$type":"Tfl.Api.Presentation.Entities.Prediction, Tfl.Api.Presentation.Entities","id":"-739400795","operationType":1,"vehicleId":"BP15OLT","naptanId":"490002072A","stationName":"Sandilands Tram Stop","lineId":"119","lineName":"119","platformName":"U","direction":"outbound","bearing":"273","destinationNaptanId":"","destinationName":"Purley Way, Colonnades","timestamp":"2019-02-27T09:37:38.4799254Z","timeToStation":891,"currentLocation":"","towards":"South Croydon or West Croydon","expectedArrival":"2019-02-27T09:52:29Z","timeToLive":"2019-02-27T09:52:59Z","modeName":"bus","timing":{"$type":"Tfl.Api.Presentation.Entities.PredictionTiming, Tfl.Api.Presentation.Entities","countdownServerAdjustment":"-00:00:02.6751628","source":"2019-02-27T02:35:21.836Z","insert":"2019-02-27T09:37:09.88Z","read":"2019-02-27T09:37:07.17Z","sent":"2019-02-27T09:37:38Z","received":"0001-01-01T00:00:00Z"}},{"$type":"Tfl.Api.Presentation.Entities.Prediction, 
Tfl.Api.Presentation.Entities","id":"1989751219","operationType":1,"vehicleId":"BP15OMB","naptanId":"490002072A","stationName":"Sandilands Tram Stop","lineId":"119","lineName":"119","platformName":"U","direction":"outbound","bearing":"273","destinationNaptanId":"","destinationName":"Purley Way, Colonnades","timestamp":"2019-02-27T09:37:38.4799254Z","timeToStation":1745,"currentLocation":"","towards":"South Croydon or West Croydon","expectedArrival":"2019-02-27T10:06:43Z","timeToLive":"2019-02-27T10:07:13Z","modeName":"bus","timing":{"$type":"Tfl.Api.Presentation.Entities.PredictionTiming, Tfl.Api.Presentation.Entities","countdownServerAdjustment":"-00:00:02.7104852","source":"2019-02-27T02:35:21.836Z","insert":"2019-02-27T09:37:14.52Z","read":"2019-02-27T09:37:11.825Z","sent":"2019-02-27T09:37:38Z","received":"0001-01-01T00:00:00Z"}},{"$type":"Tfl.Api.Presentation.Entities.Prediction, Tfl.Api.Presentation.Entities","id":"1698915770","operationType":1,"vehicleId":"LJ08CTE","naptanId":"490002072A","stationName":"Sandilands Tram Stop","lineId":"194","lineName":"194","platformName":"U","direction":"outbound","bearing":"273","destinationNaptanId":"","destinationName":"West Croydon","timestamp":"2019-02-27T09:37:38.4799254Z","timeToStation":1532,"currentLocation":"","towards":"South Croydon or West Croydon","expectedArrival":"2019-02-27T10:03:10Z","timeToLive":"2019-02-27T10:03:40Z","modeName":"bus","timing":{"$type":"Tfl.Api.Presentation.Entities.PredictionTiming, Tfl.Api.Presentation.Entities","countdownServerAdjustment":"-00:00:02.7146433","source":"2019-02-27T02:35:21.836Z","insert":"2019-02-27T09:36:49.846Z","read":"2019-02-27T09:36:47.139Z","sent":"2019-02-27T09:37:38Z","received":"0001-01-01T00:00:00Z"}},{"$type":"Tfl.Api.Presentation.Entities.Prediction, Tfl.Api.Presentation.Entities","id":"-15610565","operationType":1,"vehicleId":"LJ13CHY","naptanId":"490002072A","stationName":"Sandilands Tram 
Stop","lineId":"194","lineName":"194","platformName":"U","direction":"outbound","bearing":"273","destinationNaptanId":"","destinationName":"West Croydon","timestamp":"2019-02-27T09:37:38.4799254Z","timeToStation":634,"currentLocation":"","towards":"South Croydon or West Croydon","expectedArrival":"2019-02-27T09:48:12Z","timeToLive":"2019-02-27T09:48:42Z","modeName":"bus","timing":{"$type":"Tfl.Api.Presentation.Entities.PredictionTiming, Tfl.Api.Presentation.Entities","countdownServerAdjustment":"-00:00:02.6688867","source":"2019-02-27T02:35:21.836Z","insert":"2019-02-27T09:36:49.846Z","read":"2019-02-27T09:36:47.139Z","sent":"2019-02-27T09:37:38Z","received":"0001-01-01T00:00:00Z"}},{"$type":"Tfl.Api.Presentation.Entities.Prediction, Tfl.Api.Presentation.Entities","id":"-1803100894","operationType":1,"vehicleId":"LJ10HVE","naptanId":"490002072A","stationName":"Sandilands Tram Stop","lineId":"198","lineName":"198","platformName":"U","direction":"outbound","bearing":"273","destinationNaptanId":"","destinationName":"Thornton Heath, High Street","timestamp":"2019-02-27T09:37:38.4799254Z","timeToStation":1160,"currentLocation":"","towards":"South Croydon
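The error suggests the raw response object is being passed to agent.add(), which only accepts strings and rich response types. One approach, sketched below, is to reduce the arrivals array to a single spoken string first. The field names (lineName, destinationName, timeToStation) come from the TfL response above; summarizeArrivals itself is a hypothetical helper:

```javascript
// Reduce a TfL arrivals array to a short spoken summary string,
// sorted by soonest arrival. `limit` caps how many predictions
// are read out.
function summarizeArrivals(arrivals, limit = 3) {
  return arrivals
      .slice()  // avoid mutating the caller's array
      .sort((a, b) => a.timeToStation - b.timeToStation)
      .slice(0, limit)
      .map((p) => {
        const mins = Math.round(p.timeToStation / 60);
        return `Bus ${p.lineName} to ${p.destinationName} in ${mins} minutes`;
      })
      .join('. ');
}

// const payload = JSON.parse(rawBody);        // the object in the error
// agent.add(summarizeArrivals(payload.body)); // pass a string, not JSON
```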

Error: unknown response type

I'm new to Dialogflow and I wanted to use a TableCard in my application to present a response to the user. I tried the following, but it's not working:

'use strict';

const functions = require('firebase-functions');
const {WebhookClient} = require('dialogflow-fulfillment');
const {Carousel, Table} = require('actions-on-google');

process.env.DEBUG = 'dialogflow:debug';

exports.dialogflowFirebaseFulfillment = functions.https.onRequest((request, response) => {
	const agent = new WebhookClient({ request, response });
	
	function myIntentName(){
	
		agent.add(new Table({
			dividers: true,
			columns: ['header 1', 'header 2', 'header 3'],
			rows: [
			  ['row 1 item 1', 'row 1 item 2', 'row 1 item 3'],
			  ['row 2 item 1', 'row 2 item 2', 'row 2 item 3'],
			],
		  }));
	}
	
	
	let intentMap = new Map();
	
	intentMap.set('myIntentName', myIntentName);
	
	agent.handleRequest(intentMap);
	
});

And this is the error:

Error: unknown response type
    at WebhookClient.add (/user_code/node_modules/dialogflow-fulfillment/src/dialogflow-fulfillment.js:225:13)
    at cosaPuoiFare (/user_code/index.js:790:9)
    at WebhookClient.handleRequest (/user_code/node_modules/dialogflow-fulfillment/src/dialogflow-fulfillment.js:251:44)
    at exports.dialogflowFirebaseFulfillment.functions.https.onRequest (/user_code/index.js:829:8)
    at cloudFunction (/user_code/node_modules/firebase-functions/lib/providers/https.js:26:47)
    at /var/tmp/worker/worker.js:689:7
    at /var/tmp/worker/worker.js:673:9
    at _combinedTickCallback (internal/process/next_tick.js:73:7)
    at process._tickDomainCallback (internal/process/next_tick.js:128:9)

Do you have any idea why it's not working?
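One pattern commonly suggested for mixing actions-on-google response types into dialogflow-fulfillment (hedged — whether it applies depends on the library versions in your package.json): route the Table through the underlying conversation object via agent.conv() instead of passing it to agent.add() directly.

```javascript
// Sketch: send the actions-on-google Table through the underlying
// conversation object rather than passing it to agent.add() directly.
function myIntentName(agent) {
  const conv = agent.conv();  // the actions-on-google conversation, if any
  conv.ask('Here is a table:');
  conv.ask(new Table({
    dividers: true,
    columns: ['header 1', 'header 2', 'header 3'],
    rows: [
      ['row 1 item 1', 'row 1 item 2', 'row 1 item 3'],
      ['row 2 item 1', 'row 2 item 2', 'row 2 item 3'],
    ],
  }));
  agent.add(conv);  // hand the whole conversation back to the agent
}
```

Note that agent.conv() is null for requests that do not come from the Google Assistant, so a production handler should check for that before asking.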
