
Update to VA docs for personalisation, adding a skill (microsoft#286)
* docs updates

* update to skills docs
darrenj authored and lauren-mills committed Nov 12, 2018
1 parent fabb449 commit 9b8880c
Showing 8 changed files with 150 additions and 181 deletions.
@@ -10,9 +10,7 @@ Follow the instructions below to build, deploy and configure your Assistant.

### Prerequisites
- [Node.js](https://nodejs.org/) version 8.5 or higher.

- Install the Azure Bot Service command line (CLI) tools. It's important to do this even if you have earlier versions as the Virtual Assistant makes use of new deployment capabilities.

@@ -38,7 +36,8 @@ Once the Solution has been cloned you will see the following folder structure.

| - Virtual-Assistant
    | - Assistant
    | - LinkedAccounts
    | - Microsoft.Bot.Solutions
    | - Skills
        | - CalendarSkill
        | - DemoSkill
@@ -49,7 +48,7 @@
    | - TestHarnesses
        | - Assistant-ConsoleDirectLineSample
        | - Assistant-WebTest
    | - Tests
    | - VirtualAssistant.sln

### Build the Solution
@@ -2,9 +2,4 @@

# Known Issues

Our backlog is fully accessible within the [GitHub repo](https://github.com/Microsoft/AI/).
@@ -4,4 +4,4 @@

Natural Language Understanding is at the core of a Virtual Assistant. The LUIS Cognitive Service is used throughout the Virtual Assistant and Skills, so refer to the [LUIS supported languages](https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-supported-languages) to understand the foundational language availability at this time.

In line with customer prioritisation, we have prioritised English, French, Italian, German, Spanish and Chinese for the language models made available through the Virtual Assistant, along with the Language Generation (responses). Additional language support is in planning and can be prioritised according to customer demand.
@@ -13,9 +13,9 @@ The folder structure of your Virtual Assistant is shown below.
| - YOURBOT.bot // The .bot file containing all of your Bot configuration including dependencies
| - README.md // README file containing links to documentation
| - Program.cs // Default Program.cs file
| - Startup.cs // Core Bot Initialisation including Bot Configuration LUIS, Dispatcher, etc.
| - appsettings.json // References above .bot file for Configuration information. App Insights key
| - CognitiveModels
    | - LUIS // .LU files containing base conversational intents (Greeting, Help, Cancel)
    | - QnA // .LU files containing example QnA items
| - DeploymentScripts // msbot clone recipe for deployment
@@ -101,17 +101,83 @@ The introduction is presented with an [Adaptive Card](https://adaptivecards.io/)

## Update Virtual Assistant Responses

Ahead of the new Language Generation capabilities, the Virtual Assistant makes use of Resource Files (RESX) for all base assistant responses. These are defined at the dialog level and can be found within the Resources folder of the corresponding Dialog within the Dialogs folder.

The `Main\Resources` folder contains responses shared across the Virtual Assistant. All resource files are localised, with a separate language version denoted by a locale suffix. For example, `MainStrings.resx` under Main contains the English responses, while `MainStrings.es.resx` contains the Spanish, and so on.

The in-built Visual Studio resource file editor makes it easy to apply changes to suit your Virtual Assistant scenario. Once you make changes, rebuild your project for them to take effect and ensure you update the localised versions as appropriate for your scenario.

![Resource File Editor](media/virtualassistant-resourcefile.png)
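
To illustrate how these resources are consumed at runtime, the sketch below fetches a localised response with the standard .NET `ResourceManager`; the resource base name and `Greeting` entry are assumptions for the example rather than names taken from the project:

```
// Minimal sketch (assumed names): fetching a localised response with the
// standard .NET ResourceManager, which falls back to the neutral (English)
// resources when no locale-specific version exists.
using System.Globalization;
using System.Resources;

public static class ResponseLookupSketch
{
    private static readonly ResourceManager Resources = new ResourceManager(
        "VirtualAssistant.Dialogs.Main.Resources.MainStrings", // assumed base name
        typeof(ResponseLookupSketch).Assembly);

    public static string GetResponse(string resourceName, string locale)
    {
        // e.g. GetResponse("Greeting", "es") returns the Spanish variant from MainStrings.es.resx.
        return Resources.GetString(resourceName, new CultureInfo(locale));
    }
}
```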

## Update the Skill responses

The Skills make use of [T4 Text Templating](https://docs.microsoft.com/en-us/visualstudio/modeling/code-generation-and-t4-text-templates?view=vs-2017) for the more complex Skill response generation. This is ahead of the new Language Generation capabilities that we will move to when ready.

You may wish to change the Skill responses to better suit your scenario and apply a different personality to all responses. This can be done by changing the appropriate JSON file representing each dialog's responses.

These JSON files can be found within the Resources folder of the corresponding Dialog, as shown below. You will need to expand the corresponding TT file and JSON file to see all of the language variations.

![Skill Text Templating JSON Response File](media/virtualassistant-skilljsonresponses.png)

An excerpt of the `CreateEventDialog` response file is shown below. In this case, the `NoLocation` response surfaced to the Dialog code has a `Text` display variant and a `Speak` variant, enabling the client to select the most appropriate response for the user's context (e.g. text versus speech led).

In addition, the [`inputHint`](https://docs.microsoft.com/en-us/azure/bot-service/dotnet/bot-builder-dotnet-add-input-hints?view=azure-bot-service-3.0) provides a hint to the client around microphone control. In this case the text is used for a prompt, so the hint is set to `expectingInput`, signalling that the client should automatically open the microphone for the response. If this is not set correctly, the client may inadvertently open the microphone when not needed, or force the user to click a speech button to respond.

```
"NoLocation": {
"replies": [
{
"text": "What is the location for the meeting?",
"speak": "What is the location for the meeting?"
}
],
"inputHint": "expectingInput"
}
```
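
For context, a reply built from the `NoLocation` template above maps directly onto a Bot Builder activity. Here is a minimal sketch (illustrative wiring, not code from the Skill itself) using `MessageFactory.Text`, which accepts the text, speak and input hint values:

```
// Minimal sketch: surfacing a reply's Text/Speak variants and inputHint on an activity.
using System.Threading.Tasks;
using Microsoft.Bot.Builder;
using Microsoft.Bot.Schema;

public static class NoLocationPromptSketch
{
    public static async Task SendAsync(ITurnContext turnContext)
    {
        // MessageFactory.Text(text, ssml, inputHint) sets all three fields on the activity.
        var reply = MessageFactory.Text(
            "What is the location for the meeting?",
            "What is the location for the meeting?",
            InputHints.ExpectingInput);

        await turnContext.SendActivityAsync(reply);
    }
}
```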
Multiple variations of a response can be provided, as shown in the error message response detailed below.
```
"EventCreationFailed": {
"replies": [
{
"text": "Event creation failed",
"speak": "Event creation failed"
},
{
"text": "Something went wrong, try again please.",
"speak": "Something went wrong, try again please."
},
{
"text": "It seems the event could not be created, please try again later.",
"speak": "It seems the event could not be created, please try again later."
},
{
"text": "Creation of the event failed, please try again.",
"speak": "Creation of the event failed, please try again."
},
{
"text": "An error occured with creating the event.",
"speak": "An error occured with creating the event."
}
],
"inputHint": "expectingInput"
},
```

Clean and Rebuild your project once changes have been made and ensure you update all localised versions as required for your assistant.
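
As a rough illustration of how response files in this shape can be consumed (this is not the T4-generated code shipped with the Skills, and it assumes Newtonsoft.Json), a random variation can be selected as follows:

```
// Rough sketch: deserialising a response JSON file like the excerpts above
// and picking a random reply variation.
using System;
using System.Collections.Generic;
using Newtonsoft.Json;

public class Reply
{
    [JsonProperty("text")] public string Text { get; set; }
    [JsonProperty("speak")] public string Speak { get; set; }
}

public class ResponseTemplate
{
    [JsonProperty("replies")] public List<Reply> Replies { get; set; }
    [JsonProperty("inputHint")] public string InputHint { get; set; }
}

public static class ResponseUtil
{
    private static readonly Random Rand = new Random();

    public static Reply PickVariation(string json, string responseName)
    {
        // The file's top level is an object of named templates, e.g. "EventCreationFailed".
        var templates = JsonConvert.DeserializeObject<Dictionary<string, ResponseTemplate>>(json);
        var template = templates[responseName];
        return template.Replies[Rand.Next(template.Replies.Count)];
    }
}
```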

## Update the FAQ with QnAMaker

The FAQ provided covers commonly asked questions about the Bot Framework, but you may wish to provide industry-specific samples.

To update an existing QnAMaker Knowledge Base, perform the following steps:
1. Make changes to your QnAMaker Knowledge Base via the [LuDown](https://github.com/Microsoft/botbuilder-tools/tree/master/packages/Ludown) and [QnAMaker](https://github.com/Microsoft/botbuilder-tools/tree/master/packages/QnAMaker) CLI tools leveraging the existing QnA file stored within the `CognitiveModels\QnA` folder of your project or directly through the [QnAMaker Portal](https://qnamaker.ai).
2. Run the following command to update your Dispatch model to reflect your changes (ensures proper message routing):
```shell
dispatch refresh --bot "YOURBOT.bot" --secret YOURSECRET
```

## Demoing the Skills

You can review a sample transcript showcasing the Productivity & Point of Interest Skills [here](transcripts/skillsdemo.transcript).
195 changes: 50 additions & 145 deletions solutions/Virtual-Assistant/docs/virtualassistant-skillenablement.md
@@ -1,149 +1,54 @@
# Creating a Skill

## Getting Started

An initial Skill template has been made available to simplify creation of your own skill. This can be found within the [Skill-Template](https://github.com/Microsoft/AI/tree/master/templates/Skill-Template) folder of the repository.

Create a new folder called `Bot Framework` within your `%userprofile%\Documents\Visual Studio 2017\Templates\ProjectTemplates\Visual C#` folder. Then, within this, create a `Virtual Assistant Skill` folder and copy the contents of the template into it.

Restart Visual Studio and create a new project; you should now see the `Skill Template` appear within the available C#\Bot Framework Templates.

## Adding your Skill to your Virtual Assistant

- Add a project reference to your new Skill project, ensuring that the Virtual Assistant can locate your assemblies when invoking the skill.
- Add the LUIS model corresponding to your new Skill to the Virtual Assistant bot file through the following command:
```shell
msbot connect luis --appId [LUIS_APP_ID] --authoringKey [LUIS_AUTHORING_KEY] --subscriptionKey [LUIS_SUBSCRIPTION_KEY]
```
- Run the following command to update the Dispatcher model to reflect the new dispatch target
```shell
dispatch refresh --bot "YOURBOT.bot" --secret YOURSECRET
```
- Generate an updated Dispatch model for your Assistant to enable evaluation of incoming messages. The Dispatch.cs file is located in the `assistant\Dialogs\Shared` folder. Ensure you run this command within the assistant directory of your cloned repo.
```shell
msbot get dispatch --bot "YOURBOT.bot" | luis export version --stdin | luisgen - -cs Dispatch -o Dialogs\Shared
```
- Update the `assistant\Dialogs\Main\MainDialog.cs` file to add the corresponding Dispatch intent for your skill to the Skill handler; an excerpt is shown below. Add Authentication providers and configuration information as required.
```
case Dispatch.Intent.l_Calendar:
case Dispatch.Intent.l_Email:
case Dispatch.Intent.l_ToDo:
case Dispatch.Intent.l_PointOfInterest:
{}
```
- Finally, add your Skill configuration to the appsettings.json file:
```
{
"type": "skill",
"id": "YOUR_SKILL_NAME",
"name": "YOUR_SKILL_NAME",
"assembly": "YourSkillNameSpace.YourSkillClass, YourSkillAssembly, Version=1.0.0.0, Culture=neutral",
"dispatchIntent": "l_YOURSKILLDISPATCHINTENT",
"supportedProviders": [
],
"luisServiceIds": [
"YOUR_SKILL_LUIS_MODEL_NAME",
"general"
],
"parameters": [
"IPA.Timezone"
],
"configuration": {
}
}
```
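
For illustration, the `assembly` value above is a standard assembly-qualified type name. A rough sketch of how a registered skill could be resolved and instantiated via reflection follows (an assumption for clarity, not the Virtual Assistant's actual loading code):

```
// Rough illustration (assumed, not the Virtual Assistant's actual loader): the
// "assembly" field above is an assembly-qualified type name which can be
// resolved with reflection and instantiated when the skill is invoked.
using System;

public static class SkillLoaderSketch
{
    public static object CreateSkill(string assemblyQualifiedName, object[] constructorArgs)
    {
        // e.g. "YourSkillNameSpace.YourSkillClass, YourSkillAssembly, Version=1.0.0.0, Culture=neutral"
        var skillType = Type.GetType(assemblyQualifiedName, throwOnError: true);

        // constructorArgs carries whatever the Skill's constructor expects.
        return Activator.CreateInstance(skillType, constructorArgs);
    }
}
```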

## Sending the End of Conversation message

When the active Dialog completes, the Skill should send an `EndOfConversation` activity to hand control back to the Virtual Assistant:

```
case DialogTurnStatus.Complete:
    // if the dialog is complete, send endofconversation to complete the skill
    var response = turnContext.Activity.CreateReply();
    response.Type = ActivityTypes.EndOfConversation;
    await turnContext.SendActivityAsync(response);
    await dc.EndDialogAsync();
```

## Authentication

In scenarios where your Skill needs access to a Token from the User to perform an action, this should be handled by the Virtual Assistant, ensuring that Tokens are held centrally and can be shared across Skills where appropriate (e.g. a Microsoft Graph token).

This is performed by sending a `tokens/request` event to the Virtual Assistant and then waiting for a `tokens/response` event to be returned. If a token is already stored by the Virtual Assistant it will be returned immediately; otherwise, a Prompt will be shown to the user to initiate login. See [Linked Accounts](./virtualassistant-linkedaccounts.md) for how to ensure Tokens are made available during initial onboarding of the user to the Virtual Assistant.

Register a `SkillAuth` Dialog as part of your overall Dialog registration. Note that this uses the `EventPrompt` class provided as part of the Virtual Assistant.
```
private const string AuthSkillMode = "SkillAuth";
...
AddDialog(new EventPrompt(AuthSkillMode, "tokens/response", TokenResponseValidator));
```
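
The `TokenResponseValidator` referenced above is not shown in the excerpt. A minimal sketch follows, assuming `EventPrompt` follows the standard Bot Builder `PromptValidator<Activity>` shape (requires `Microsoft.Bot.Builder.Dialogs`, `Microsoft.Bot.Schema`, `System.Threading` and `System.Threading.Tasks`):

```
// Minimal sketch of the TokenResponseValidator referenced above, assuming
// EventPrompt follows the Bot Builder PromptValidator<Activity> shape.
private Task<bool> TokenResponseValidator(PromptValidatorContext<Activity> promptContext, CancellationToken cancellationToken)
{
    // Accept the prompt result once a "tokens/response" event activity has arrived.
    var activity = promptContext.Recognized.Value;
    return Task.FromResult(activity != null && activity.Type == ActivityTypes.Event);
}
```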

Then, when you require a Token, request one from the Virtual Assistant.

```
// If in Skill mode we ask the calling Bot for the token
if (skillOptions != null && skillOptions.SkillMode)
{
    // We trigger a Token Request from the Parent Bot by sending a "TokenRequest" event back and then waiting for a "TokenResponse"
    var response = sc.Context.Activity.CreateReply();
    response.Type = ActivityTypes.Event;
    response.Name = "tokens/request";

    // Send the tokens/request Event
    await sc.Context.SendActivityAsync(response);

    // Wait for the tokens/response event
    return await sc.PromptAsync(AuthSkillMode, new PromptOptions());
}
```