How To Create An Opensource NLU API With Rasa


You Have Access To An Exceptional Natural Language Understanding Tool

Introduction

There is much hype regarding chatbots and conversational AI in general.

(Image: Rasa NLU API called from Postman)

Technologies are often compared to each other in order to find the best fit for an organization or task.

The problem is that, with all the technology available, the initial decisions are easy to make.

The harder design and architecture decisions come later, once the system is established.

In this article I would like to focus on using an open-source option which will serve well for a seed project.

But more than that, a framework which can evolve and support a fully fledged enterprise solution.

Starting with the NLU API will lay a good foundation to grow the solution into a fully fledged conversational interface.

NLU & Core

There are a few chatbot platforms which have a clear separation between the NLU portion and the dialog management and integration portions. This allows for the development of a stand-alone NLU API.

(Image: Rasa chatbot architecture with the NLU portion marked)

The Rasa architecture gives you the opportunity to have an NLU API which can also be used for natural language understanding tasks not related to live conversations. This includes conversations archived in email, live-agent conversations and so on.

NLU Data Format

There are exceptionally good videos by Rasa on how to install it in different environments.

Here you can find a video by Rachel on installation for Windows 10.

Secondly, to get started, Rasa has a series of Masterclass videos which are a time-efficient way of getting up to speed with the whole Rasa stack.

But, back to creating an NLU API…

The examples below used for NLU training are based on a GitHub project.

The NLU training file has the entities annotated within the intents. Hence we see the healthy trend of intents and entities merging.

Intents and entities are in a text file in markdown format.

Intents & Entities

Simple Examples

The simplest example is just having an intent defined with some examples sans any entities. Below is the check_balance intent with a few examples.

## intent:check_balance
- How much money is on my account?
- what's my balance?
- what's my current balance?
- What's left on that account?
- How much do I have on that account?

Let’s add an entity of type account_type, annotated in an example with the value credit.

- what's my [credit](account_type) account balance?

The result:

(Image: The intent is check_balance and the entity is account_type with the value credit)
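
In practice you would annotate the entity across a number of training examples under the intent. A minimal sketch could look like this; the additional account types used here are illustrative and not taken from the project's training data:

## intent:check_balance
- what's my [credit](account_type) account balance?
- how much do I have in my [savings](account_type) account?
- what is the balance on my [cheque](account_type) account?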

Regular Expressions

This is how regular expressions are defined within the nlu.md file…

## regex:accountNumber
- [0-9]{9}

## intent:inform_accountNumber
- my account number is [123454434](accountNumber)
- my account number is [334564343](accountNumber)
- my account number is [940564343](accountNumber)

And the output…

(Image: The intent is detected and the accountNumber entity is extracted)

Normalize Data

Data captured in the entity can be normalized. The entity related to credit card accounts is captured under the entity account_type, but always with the value credit.

{"text":"Pay off my [credit card]{"entity": "account_type", "value": "credit"}, please"}

And the result…

Here you can see that the rather skewed input "creditcard accout" is normalized to the value credit in the account_type entity.
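
Another way to achieve this kind of normalization in the markdown format is a synonym block, which maps several surface forms onto one canonical value. Below is a minimal sketch; the exact surface forms listed are assumptions:

## synonym:credit
- credit card
- creditcard
- credit account

Note that synonyms only remap the value of an entity after it has been extracted, so the surface forms still need to appear in annotated examples.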

Models

Each time you train your model, a new model file is created in the models folder. This allows you to switch between models for your API, roll back to previous versions, and so on.

(Image: Different trained models can be invoked when running the API)
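
Assuming the default Rasa project layout, an NLU-only model is trained from the command line, and each run writes a new timestamped archive into the models folder:

rasa train nlu

The nlu-20200917-225530.tar.gz file used below is the output of one such training run.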

Creating The API

With a single command the API is launched on port 5005. But first, make sure you have activated your Anaconda virtual environment with:

conda activate rasa2

My virtual environment is called rasa2.
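
If you want to sanity-check the model interactively before exposing it over HTTP, you can load it in the Rasa shell; the model archive used here is the same one passed to the run command below:

rasa shell nlu -m models/nlu-20200917-225530.tar.gz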

Run the API:

rasa run --enable-api -m models/nlu-20200917-225530.tar.gz

You can access the API at the following URL:

http://localhost:5005/model/parse

Interact with your API via a client like Postman.

(Image: Sending a JSON query string to the Rasa API)
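
If you prefer the command line to Postman, the same endpoint can be called with curl; the sample sentence here is just an illustration based on the training examples above:

curl -X POST http://localhost:5005/model/parse \
  -H "Content-Type: application/json" \
  -d '{"text": "what is my credit account balance?"}'

The response contains the recognized intent, the intent ranking and any extracted entities.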

Conclusion

There are other features available when defining the training data. The features mentioned here are the primary ones, IMO. In a following article I want to look at the structures available for entities.


Read More Here

- Cobus Greyling (cobusgreyling.me)
- Cobus Greyling on Medium (www.medium.com)
- Rasa: Open Source Conversational AI (www.rasa.com)
- Rasa NLU: Language Understanding for Chatbots and AI Assistants (rasa.com)
- RasaHQ/financial-demo (github.com)
- cobusgreyling/RasaBankingBot (github.com)


