Every Christmas for the past five or six years, I’ve set myself a little challenge.
Of particular note… I built my first startup, Fidgetstick (2010), my second startup, Braindu (2012), and the first version of our Nutribu app (2013). This year, I wanted to do something even more fun and challenging!
I wanted to bring intelligence and a personalised experience into one of our app projects, in the form of a chat BOT, and really put the NUWE platform and associated micro-services to the test.
So, I built…
KATE - a virtual nutritionist assistant.
### Why Kate?
The name “Kate” has been with us on our journey in building nutrition apps since the beginning. Firstly, our nutritionist consultant, one very experienced and talented Kate Cook, was the backbone of our original nutrition expertise.
Second, with the launch of Nutribu iOS v1, our female persona was called “Kate” (versus her male counterpart, Will).
So, Kate stuck around.
### Why “Nutritionist Assistant”?
I could have come over all hyperbolic, with a title like “Virtual Nutritionist” or something like that. But in my experience, and it appears in the common public experience too, chat-bots, whilst on something of a hot streak right now, suffer from a mismatch between the promise and the reality.
Professing to disrupt human jobs, replacing doctors and nurses and nutritionists and personal trainers and all manner of roles, may get some Twitter buzz or TechCrunch articles firing, but it’s not really doing anything to move us forward as a tool or technology.
So, for me at least, the key with intelligent technology is to think about ways we can make humans better. This means playing to the respective strengths of humans and machines.
My Kate-BOT would be able to calculate, in the blink of an eye, the nutritional intake across key micro- or macro-nutrients for a whole year, or detect the presence of a potential allergen in something I’m about to eat. My real-world nutritionist wouldn’t have even reached the phone, let alone decided whether to take my call, by that point.
But my Kate-BOT will inevitably struggle to provide holistic and personalised recommendations and plans for me to follow in order to achieve my goals (not those of an overly generalised population), and to react to, and empathise with, my emotional, behavioural and psychological biases.
### Kate for iOS
So, now we’ve decided the WHAT, we need to consider the HOW.
My “simple” plan was to achieve a few things:
- A real-time Chat user interface for native iOS (it would be embedded into our Nutribu v2 app, currently in development).
- The ability to parse and infer Intent, within a given Context, from natural language input, not just dumb regex string matching.
- The ability to react to intent & context with intelligent responses and to manage controlled Flows (e.g. an onboarding process).
## Chat UI
I didn’t really want to build a chat interface from scratch. As chat is such a popular design pattern and mobile feature, I figured there must be an existing control, framework or open source project I could use…
In the end, I settled on a pretty full-fledged option.
I used a really nice project by RelatedCode, which wires up some additional third-party libraries with Parse and Firebase for a very full-featured chat interface, with custom message types and the option of some premium features too, which I haven’t used yet.
This project is dependent on some additional libraries, such as:
- Jesse Squires’ JSQMessagesViewController
- ideaismobile’s IDMPhotoBrowser
- Jesse Squires’ JSQSystemSoundPlayer
- RelatedCode’s ProgressHUD
- Ryan Nystrom’s RNGridMenu
- Olivier Poitrey’s SDWebImage
There was quite a bit of work to extract what I needed from the project to put into the Nutribu app, and generally to work my way around the structure of the various classes. But once I did that and resolved any issues, I was able to call the UI when a user long-presses the main button on the Nutribu main page.
We did have to add and initialise some new back-end services, in addition to the existing Nuwe services already in place. This is pretty straightforward to do, but I had one initial design challenge: how to successfully map our existing NUUser to the PFUser needed for the Parse integration.
For now, to keep it simple, I settled on a background call on login to the app that logs the user in to Parse with a random password and their NUUser email as username.
In our main view (called StatusView) we create these 2 methods:
- (void)signInWithParseUser:(NSString *)username withEmail:(NSString *)email andPassword:(NSString *)password {
    // Log in on a background thread, using the email address as the Parse username.
    [PFUser logInWithUsernameInBackground:email password:password
        block:^(PFUser *user, NSError *error) {
            if (user) {
                // Do stuff after successful login.
                NSLog(@"Logged in successfully with Parse as: %@", user.email);
            } else {
                // The login failed. Check error to see why.
                NSLog(@"Login failed with error: %@", error.description);
            }
    }];
}
- (void)registerParseUser:(NSString *)username withEmail:(NSString *)email andPassword:(NSString *)password {
    PFUser *user = [PFUser user];
    user.username = username;
    user.password = password;
    user.email = email;
    user[@"fullname"] = username;

    [user signUpInBackgroundWithBlock:^(BOOL succeeded, NSError *error) {
        if (!error) {
            NSLog(@"Signed up successfully with Parse as: %@", user.email);
        } else {
            // Show the errorString somewhere and let the user try again.
            NSString *errorString = [error userInfo][@"error"];
            NSLog(@"error: %@", errorString);
            // Sign-up failed (e.g. the user already exists), so fall back to signing in.
            [self signInWithParseUser:username withEmail:email andPassword:password];
        }
    }];
}
And then called from viewDidLoad:
if ([NUCore getCurrentUser] == nil) {
    return;
}
if ([PFUser currentUser] == nil) {
    [self registerParseUser:[NUCore getCurrentUser].email withEmail:[NUCore getCurrentUser].email andPassword:[self randomPassword]];
}
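The randomPassword helper isn’t shown above; a minimal sketch, assuming a UUID string is acceptable as a throwaway credential, might be:

- (NSString *)randomPassword {
    // Hypothetical helper: a throwaway password for the background Parse account.
    // Note that if it isn't persisted somewhere (e.g. the keychain), a later
    // sign-in for an already-registered user won't be able to reproduce it.
    return [[NSUUID UUID] UUIDString];
}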
This is good enough for now to ensure we have a dedicated data store for our Users, Sessions & Push notifications.
Setting up the Firebase service was very easy, as everything is already there in the project code.
In the Chat project there is a constants header file, AppConstant.h, where you’ll need to drop your Firebase app URL:
#define FIREBASE @"https://YOUR_CHAT_APP.firebaseio.com"
I can already see how a nicely constructed framework / library with all of this stuff pre-configured, and tighter communication between the NUWE and Parse SDKs, would work a treat. For later…
Finally, to call the chat feature, I added a long press gesture onto the existing EatButton.
- (void)eatLongPress:(UILongPressGestureRecognizer *)gesture {
    if (gesture.state == UIGestureRecognizerStateEnded) {
        [self.eatButton setBackgroundImage:[UIImage imageNamed:@"btn_round_orange.png"] forState:UIControlStateNormal];
        self.eatButton.imageView.image = [UIImage imageNamed:@"microphone-512.png"];

        // Build the Firebase group Id from Kate's Id and the current user's Id.
        NSString *systemId = NU_AI_USER_SYSTEM_ID;
        NSString *groupId = [NSString stringWithFormat:@"%@%@", systemId, [PFUser currentUser].objectId];
        [self goToChat:groupId];
    }
}
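The recognizer itself still needs attaching to the button; a minimal sketch (assuming eatButton is the existing outlet), placed in viewDidLoad, might be:

// Wire the long press on the Eat button to the handler above.
UILongPressGestureRecognizer *longPress = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(eatLongPress:)];
[self.eatButton addGestureRecognizer:longPress];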
- (void)goToChat:(NSString *)groupId {
    ChatView *chatView = [[ChatView alloc] initWith:groupId];
    [chatView setTitle:@"Kate"];
    chatView.hidesBottomBarWhenPushed = YES;

    UINavigationController *navController = [[UINavigationController alloc] initWithRootViewController:chatView];
    navController.modalTransitionStyle = UIModalTransitionStyleCoverVertical;
    [navController.navigationBar setBarTintColor:[UIColor colorWithRed:0.0/255.0 green:158.0/255.0 blue:118.0/255.0 alpha:1.0]];
    [navController.navigationBar setTitleTextAttributes:@{NSForegroundColorAttributeName : [UIColor whiteColor]}];

    [self presentViewController:navController animated:YES completion:^{
        NSLog(@"Showing Chat View");
    }];
}
## The Birth of the BOT
So this gets us a working Chat feature, without reinventing all the wheels and engine and windscreen. But there’s no sign of any intelligent life on the other side…
I decided to start with a brand new empty shell of an Objective-C class that would be my prototype representation of KATE. It’s called NUAIUser.
We want our NUAIUser to be treated, by the system at least, just like a real user. This way, she has access to all the same message APIs as our current logged-in user, and we can extend them with new custom message types and actions to suit. She can also exist in group conversations, because we treat her just like other remote users, except we’ll control her initially from within our app’s logic.
For this to work, our NUAIUser will need to exist in Parse as if she were a regular user, so we’ll add her as a row in the Parse Users table. This will give her an objectId that we can use, for instance, if we want to send remote messages as Kate.
I decided to use a singleton pattern for our NUAIUser, since I only want one instance of this class running in the app. The NUAIUser could take on different characteristics with a more evolved API but, for now at least, I want to ensure I’m just dealing with one KATE-BOT instance.
You can see from the method above that when I load the Chat UI, I am doing so by passing it a groupId. The groupId is for Firebase, to ensure it loads the conversation between two parties, in this case Kate and our current user. The groupId format simply concatenates a unique Id for each party into a single string. I have declared the unique Id for Kate in a constants header file, equal to her objectId from the Parse data store.
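For illustration, the declarations might look like this in the constants header (the Id value below is a placeholder, not Kate’s real objectId):

#define NU_AI_USER_SYSTEM_ID @"aB3dEf9GhI" // Kate's row objectId in the Parse Users table (placeholder value)
#define NU_AI_USER_NAME @"Kate"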
Our current user’s objectId can be retrieved using the Parse SDK’s currentUser helper method on PFUser.
So long as these two Ids don’t change, the user will always be taken to the chat room with the conversation history between themselves and Kate.
The primary role of the NUAIUser class will be to handle responses to user inputs and to proactively lead the user through managed processes, via methods that I’ll start creating here before I outgrow them.
In order to give our NUAIUser, from now on referred to as Kate, the ability to do stuff, we need to hook her up to the Chat View. In this case, the Chat View is our NUAIUser’s parent.
In NUAIUser.h:

@interface NUAIUser : NSObject {
    NSString *name;
    id parentViewController;
    UIView *messageView;
}

@property (nonatomic, retain) NSString *name;

+ (id)sharedNUAIUser;
- (id)initWithParentViewController:(id)viewController;

@end
And in NUAIUser.m:

- (id)initWithParentViewController:(id)viewController {
    if (self = [super init]) {
        parentViewController = viewController;
        name = @"Kate";
    }
    return self;
}
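The sharedNUAIUser accessor isn’t implemented above; a minimal sketch using GCD’s dispatch_once, assuming the parent is attached after creation rather than at init time, could be:

+ (id)sharedNUAIUser {
    // Standard Objective-C singleton: created exactly once, on first access.
    static NUAIUser *sharedInstance = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        sharedInstance = [[NUAIUser alloc] init];
    });
    return sharedInstance;
}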
Kate now has access to the API of our ChatView, so we can start by sending a simple message.
There is an existing method:

- (void)messageSend:(NSString *)text Video:(NSURL *)video Picture:(UIImage *)picture Audio:(NSString *)audio

that looks useful, but it misses the ability to specify the user as Kate. We’ll use it as a template, though, to create a method for Kate to fire out messages as she likes…
- (void)AIUserMessageSend:(NSString *)text Video:(NSURL *)video Picture:(UIImage *)picture Audio:(NSString *)audio User:(NSString *)systemId Name:(NSString *)systemName
//-------------------------------------------------------------------------------------------------------------------------------------------------
{
    Outgoing *outgoing = [[Outgoing alloc] initWith:groupId View:self.navigationController.view];
    // Forward the caller-supplied Id and name so the message is attributed to Kate.
    [outgoing send:text Video:video Picture:picture Audio:audio User:systemId Name:systemName];
    //---------------------------------------------------------------------------------------------------------------------------------------------
    [JSQSystemSoundPlayer jsq_playMessageSentSound];
    [self finishSendingMessage];
}
We’ve added two additional parameters that we can specify: the user’s Id and her name. This makes the method flexible so we can use it for other things but, for now, this is OK.
We also need to adapt the Outgoing message class to use this information, so I’ll add a new method to assist…
The default method assumes that send: is being sent by the currentUser. I need to override that, to send it as Kate. So, instead of:
- (void)send:(NSString *)text Video:(NSURL *)video Picture:(UIImage *)picture Audio:(NSString *)audio
//-------------------------------------------------------------------------------------------------------------------------------------------------
{
    NSMutableDictionary *item = [[NSMutableDictionary alloc] init];
    //---------------------------------------------------------------------------------------------------------------------------------------------
    item[@"userId"] = [PFUser currentId];
    item[@"name"] = [PFUser currentName];
    item[@"date"] = Date2String([NSDate date]);
    item[@"status"] = TEXT_DELIVERED;
    //---------------------------------------------------------------------------------------------------------------------------------------------
    item[@"video"] = item[@"thumbnail"] = item[@"picture"] = item[@"audio"] = item[@"latitude"] = item[@"longitude"] = @"";
    item[@"video_duration"] = item[@"audio_duration"] = @0;
    item[@"picture_width"] = item[@"picture_height"] = @0;
    //---------------------------------------------------------------------------------------------------------------------------------------------
    if (text != nil) [self sendTextMessage:item Text:text];
    else if (video != nil) [self sendVideoMessage:item Video:video];
    else if (picture != nil) [self sendPictureMessage:item Picture:picture];
    else if (audio != nil) [self sendAudioMessage:item Audio:audio];
    else [self sendLoactionMessage:item];
}
We have:
- (void)send:(NSString *)text Video:(NSURL *)video Picture:(UIImage *)picture Audio:(NSString *)audio User:(NSString *)systemId Name:(NSString *)systemName
//-------------------------------------------------------------------------------------------------------------------------------------------------
{
    NSMutableDictionary *item = [[NSMutableDictionary alloc] init];
    //---------------------------------------------------------------------------------------------------------------------------------------------
    item[@"userId"] = systemId;
    item[@"name"] = systemName;
    item[@"date"] = Date2String([NSDate date]);
    item[@"status"] = TEXT_DELIVERED;
    NSLog(@"Message delivered: %@", text);
    //---------------------------------------------------------------------------------------------------------------------------------------------
    item[@"video"] = item[@"thumbnail"] = item[@"picture"] = item[@"audio"] = item[@"latitude"] = item[@"longitude"] = @"";
    item[@"video_duration"] = item[@"audio_duration"] = @0;
    item[@"picture_width"] = item[@"picture_height"] = @0;
    //---------------------------------------------------------------------------------------------------------------------------------------------
    if (text != nil) [self sendTextMessage:item Text:text];
    else if (video != nil) [self sendVideoMessage:item Video:video];
    else if (picture != nil) [self sendPictureMessage:item Picture:picture];
    else if (audio != nil) [self sendAudioMessage:item Audio:audio];
    else [self sendLoactionMessage:item];
}
So now, if we circle back to Kate’s class, we can simply call:
[parentViewController AIUserMessageSend:@"Hello, I'm happy to be here!"
                                  Video:nil Picture:nil Audio:nil User:NU_AI_USER_SYSTEM_ID Name:NU_AI_USER_NAME];
to send a text message and render it into the Chat UI. YEAH!
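For this call to work, Kate first needs a reference to the ChatView. One way to wire it, sketched here with a hypothetical setParentViewController: setter and sayHello wrapper (neither is in the project as-is), is at the end of ChatView’s viewDidLoad:

// Hand Kate a reference to this ChatView so she can use its messaging API.
NUAIUser *kate = [NUAIUser sharedNUAIUser];
[kate setParentViewController:self]; // hypothetical setter for the parentViewController ivar
[kate sayHello];                     // hypothetical wrapper around the AIUserMessageSend: call above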
I’m now going to take a break, and when I get back, we’ll give Kate some mild intelligence.