How to Code Natural Language Processing on Android With IBM Watson

Thanks to the rising wave of artificial intelligence, users these days have come to expect apps that are both smart and aware of the contexts in which they’re being used. IBM Watson offers a variety of natural language-related services you can use to create such apps.

For instance, you can use its Natural Language Understanding service to extract keywords, entities, sentiments, and a lot of other semantic details from any text the user is reading. And if the text happens to be in a foreign language, you can use the Language Translator service to identify the language and translate it to one the user understands.

In this tutorial, I’m going to introduce you to a few of those services by showing you how to create an app that can translate German webpages to English and extract sentiments, important entities, and emotions from them.

Before you proceed, I suggest that you read the following introductory tutorial on IBM Watson services:

  • Machine Learning
    Coding an Android App With IBM Watson Machine Learning
    Ashraff Hathibelagal

1. Activating the Services

We’ll be working with three Watson services today, and each of them needs to be activated separately. So open your IBM Bluemix dashboard and press the Create button.

The first service we’re going to activate is the Document Conversion service, which allows us to convert HTML, PDF, and DOCX documents to plain text or JSON. Select it from the catalog, give it a meaningful name, and press the Create button.

Configure Document Conversion service

Next, go back to the catalog and choose the Language Translator service. It supports several widely spoken languages and can, by default, handle text in three domains: news, conversation, and patent. While the first two domains are adequate for most texts, the last domain can be more accurate for texts containing lots of technical or legal terms.

In its configuration page, give the service a meaningful name and press the Create button.

Configuring Language Translator service

Return to the catalog and choose the Natural Language Understanding service. We’ll be using this service to extract sentiments, entities, and emotions from unstructured text. Again, give it a meaningful name in the configuration screen and press the Create button.

Configuring Natural Language Understanding service

If you open the dashboard now, you should be able to see something like this:

Three active services in the dashboard

All three services have unique login credentials associated with them. Note them down, because you’ll need them later. To find the credentials of a service, select it on the dashboard, open its Service credentials tab, and press the View Credentials button.

2. Project Setup

To be able to use these three services in an Android Studio project, we must add the Watson Java SDK as an implementation dependency in the app module’s build.gradle file.

Additionally, we’ll be using the Fuel library as an HTTP client, so add it too as an implementation dependency.
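Together, the two dependencies might look like the following sketch. The version numbers shown here are only examples of releases available when this tutorial was written; use the latest versions available to you.

```groovy
dependencies {
    // IBM Watson Java SDK
    implementation 'com.ibm.watson.developer_cloud:java-sdk:4.2.1'

    // Fuel HTTP client for Kotlin
    implementation 'com.github.kittinunf.fuel:fuel-android:1.12.0'
}
```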

Both Fuel and the Watson Java SDK can work only if our app has the INTERNET permission, so ask for it in the manifest file.
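The permission is requested with the usual uses-permission element, placed just before the application tag:

```xml
<uses-permission android:name="android.permission.INTERNET" />
```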

Next, add <string> tags containing the usernames and passwords of all three services to the strings.xml file.
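The resource names below are arbitrary choices; replace the placeholder values with the credentials you noted down earlier:

```xml
<resources>
    <string name="document_conversion_username">YOUR_USERNAME</string>
    <string name="document_conversion_password">YOUR_PASSWORD</string>
    <string name="language_translator_username">YOUR_USERNAME</string>
    <string name="language_translator_password">YOUR_PASSWORD</string>
    <string name="nlu_username">YOUR_USERNAME</string>
    <string name="nlu_password">YOUR_PASSWORD</string>
</resources>
```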

Lastly, to keep our code concise, we’ll be using Kotlin instead of Java in this tutorial, so make sure that you’ve enabled Kotlin support.

3. Using the Document Conversion Service

We’ll be using the Watson Document Conversion service to convert HTML webpages to plain text. To allow the user to type in a webpage address, add an EditText widget to your activity’s layout. Additionally, include a TextView widget to display the contents of the webpage as plain text. To make sure that the contents of lengthy webpages are not truncated, I suggest you place them inside a ScrollView widget.
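Here’s one possible layout satisfying those requirements (the widget ids page_url and page_contents are arbitrary choices):

```xml
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical">

    <EditText
        android:id="@+id/page_url"
        android:layout_width="match_parent"
        android:layout_height="wrap_content"
        android:hint="Enter a URL"
        android:imeOptions="actionGo"
        android:inputType="textUri" />

    <ScrollView
        android:layout_width="match_parent"
        android:layout_height="match_parent">

        <TextView
            android:id="@+id/page_contents"
            android:layout_width="match_parent"
            android:layout_height="wrap_content" />
    </ScrollView>
</LinearLayout>
```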

In the above code, you can see that the imeOptions attribute of the EditText widget is set to actionGo. It allows users to press a “Go” button on their virtual keyboards when they’ve finished typing the address. To listen to that button-press event, add the following Kotlin code to your activity’s onCreate() method:
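A minimal listener might look like the following sketch. It assumes the EditText widget has the id page_url; the actual download and conversion logic is described in the next steps.

```kotlin
val pageURL = findViewById<EditText>(R.id.page_url)
pageURL.setOnEditorActionListener { _, actionId, _ ->
    if (actionId == EditorInfo.IME_ACTION_GO) {
        // Download and convert the webpage here
    }
    true
}
```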

Inside the event listener, the first thing we need to do is determine the URL the user typed. We can do so easily by accessing the text property of the EditText widget. Once we have the URL, we can use Fuel’s httpGet() method to download the contents of the webpage.

Because we want the httpGet() method to run asynchronously, we must add a callback to it using the responseString() method, which also allows us to process the downloaded contents as a string.
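Assuming the EditText widget is available as pageURL, the download might look like this sketch:

```kotlin
val url = pageURL.text.toString()
url.httpGet().responseString { _, _, result ->
    // result wraps either the response body or a FuelError;
    // get() returns the body and throws if the request failed
    val htmlContents = result.get()
    // Pass htmlContents on to the Document Conversion service
}
```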

It’s now time to create an instance of the DocumentConversion class, which has all the methods we need to interact with the Document Conversion service. Its constructor expects a version date along with the service’s login credentials.
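Here’s a sketch of the initialization, assuming the credentials are stored in string resources named document_conversion_username and document_conversion_password:

```kotlin
val documentConversion = DocumentConversion(
        "2015-12-15", // version date
        resources.getString(R.string.document_conversion_username),
        resources.getString(R.string.document_conversion_password)
)
```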

The Watson Java SDK doesn’t allow us to directly pass strings to the Document Conversion service. It needs File objects instead. Therefore, let us now create a temporary file using the createTempFile() method of the File class, and write the contents of the webpage we downloaded to it using the writeText() method.

At this point, we can call the convertDocumentToText() method and pass the temporary file to it to start the conversion. The method also expects the MIME type of the temporary file, so don’t forget to include it. Once the conversion is complete, you can display the plain text by simply assigning it to the text property of the TextView widget.

The following code shows you how to perform the conversion inside a new thread and update the TextView in the UI thread:
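Here’s a sketch, assuming the downloaded HTML is available as htmlContents, the client as documentConversion, and the TextView has the id page_contents. The thread() function comes from the kotlin.concurrent package.

```kotlin
thread {
    // Write the downloaded HTML to a temporary file
    val tempFile = File.createTempFile("page", ".html", cacheDir)
    tempFile.writeText(htmlContents)

    // Convert the file to plain text
    val plainText = documentConversion
            .convertDocumentToText(tempFile, "text/html")
            .execute()

    runOnUiThread {
        findViewById<TextView>(R.id.page_contents).text = plainText
    }
}
```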

You can run the app now and type in a German webpage’s URL to see the Document Conversion service working.

A German webpage converted to plain text

4. Using the Language Translator Service

With the Language Translator service, we can now translate the German plain text to English.

To let the user start the translation manually, we’ll add a menu to our activity instead of changing its layout. To do so, start by creating a new menu resource file and adding the following code to it:
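The item ids and titles below are only suggestions:

```xml
<?xml version="1.0" encoding="utf-8"?>
<menu xmlns:android="http://schemas.android.com/apk/res/android">
    <item
        android:id="@+id/translate"
        android:title="Translate" />
    <item
        android:id="@+id/analyze"
        android:title="Analyze" />
</menu>
```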

As you can see, the above code creates a menu with two options: translate and analyze. In this step, we’ll be working with the first option only.

To render the menu, we must inflate it inside the onCreateOptionsMenu() method of our activity.
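Assuming the menu resource file is called my_menu.xml, the override is just:

```kotlin
override fun onCreateOptionsMenu(menu: Menu): Boolean {
    menuInflater.inflate(R.menu.my_menu, menu)
    return true
}
```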

By overriding the onOptionsItemSelected() method, we can tell when the user selects a menu item. Furthermore, we can determine which item was pressed by checking its itemId. The following code checks whether the user chose the translate option.
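In outline, that check looks like this:

```kotlin
override fun onOptionsItemSelected(item: MenuItem): Boolean {
    if (item.itemId == R.id.translate) {
        // Start the translation here
    }
    return super.onOptionsItemSelected(item)
}
```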

Just like the Document Conversion service, the Language Translator service also has a dedicated class that allows us to interact with it. As you might have guessed, it’s called LanguageTranslator. To create an instance of the class, we need to pass only the service’s login credentials to its constructor.
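Assuming the credentials live in string resources named language_translator_username and language_translator_password, the initialization is just:

```kotlin
val translator = LanguageTranslator(
        resources.getString(R.string.language_translator_username),
        resources.getString(R.string.language_translator_password)
)
```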

The class has a translate() method we can now use to translate our German text to English. As its arguments, it expects the text to translate as a string, the current language of the text, and the desired language.

After the translation completes successfully, we will have access to a TranslationResult object, whose firstTranslation property contains the translated text.

The following code shows you how to perform the translation and render the result in the TextView widget.
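Here’s a sketch, assuming the client is available as translator and the TextView (still holding the German plain text) as pageContents. As before, the network call runs in a background thread:

```kotlin
thread {
    val result = translator
            .translate(pageContents.text.toString(), "de", "en")
            .execute()

    runOnUiThread {
        pageContents.text = result.firstTranslation
    }
}
```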

You can now run the app again, type in a German webpage’s URL, and use the menu to translate its contents to English.

Text in German translated to English

5. Using the Natural Language Understanding Service

Finally, to perform a semantic analysis on the translated text and extract various important details from it, we can use the NaturalLanguageUnderstanding class, which serves as a client for the Natural Language Understanding service.

The following code shows you how to initialize the client only when the user presses the second option of the menu we created in the previous step:
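Inside onOptionsItemSelected(), this mirrors the translate branch (the credential resource names are assumptions):

```kotlin
if (item.itemId == R.id.analyze) {
    val nlu = NaturalLanguageUnderstanding(
            "2017-02-27", // version date
            resources.getString(R.string.nlu_username),
            resources.getString(R.string.nlu_password)
    )
    // Configure and run the analysis here
}
```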

Compared to the other natural language-related services, using the Natural Language Understanding service is slightly more involved, primarily because it has a large number of features.

For now, let’s say we want to determine the overall sentiment of the translated text, and extract all the major entities it mentions. Each entity itself can have an emotion and sentiment associated with it, so let’s say we want to extract those too.

To tell the service that we want to extract all entities and the emotions and sentiments associated with them, we need an EntitiesOptions object, which can be created using the EntitiesOptions.Builder class.
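Both options are enabled on the builder:

```kotlin
val entitiesOptions = EntitiesOptions.Builder()
        .emotion(true)
        .sentiment(true)
        .build()
```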

Similarly, to tell the service that we want the overall sentiment of the text, we need a SentimentOptions object.
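It too has a builder:

```kotlin
val sentimentOptions = SentimentOptions.Builder()
        .document(true)
        .build()
```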

The SentimentOptions and EntitiesOptions objects must now be bundled together to form a Features object, which can be used to compose an AnalyzeOptions object. The AnalyzeOptions object is the most important of all the above objects because it is where you specify the text you want to analyze.
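Here’s a sketch, assuming the translated text is available as translatedText:

```kotlin
val features = Features.Builder()
        .entities(entitiesOptions)
        .sentiment(sentimentOptions)
        .build()

val analyzeOptions = AnalyzeOptions.Builder()
        .features(features)
        .text(translatedText)
        .build()
```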

Once the AnalyzeOptions object is ready, we can pass it to the analyze() method to start the analysis.
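As with the other services, the call is best made off the UI thread:

```kotlin
thread {
    val results = nlu.analyze(analyzeOptions).execute()
    // Process the AnalysisResults object here
}
```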

The result of the analysis is an AnalysisResults object, containing all the information we asked for.

To determine the overall sentiment of the text, we must first extract the overall sentiment score using the sentiment.document.score property. The score is simply a floating-point number: zero means the sentiment is neutral, while negative and positive scores mean the sentiment is negative or positive, respectively.
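As a tiny illustration, the score could be mapped to a human-readable label like this:

```kotlin
// Map a sentiment score to a human-readable label
fun describeSentiment(score: Double) = when {
    score > 0 -> "positive"
    score < 0 -> "negative"
    else -> "neutral"
}
```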

Next, by looping through the entities list present in the AnalysisResults object, we can process each entity individually. By default, each entity has a type associated with it. For instance, the service can tell if an entity is a person, a company, or a vehicle. Currently, it can identify over 450 different types of entities.

Because we asked for them, each entity will now also have a sentiment score and emotions associated with it.

We can determine the sentiment score by simply using the sentiment.score property. Determining the emotion associated with an entity, however, is not as straightforward. Watson currently supports five emotions: anger, joy, disgust, fear, and sadness. Every entity has all five emotions, but each with a different value specifying how confident the service is that the emotion is present. Therefore, to determine the right emotion, we must pick the one with the highest value.

The following code lists each entity along with its type, sentiment score, and emotion:
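Here’s a sketch, assuming the analysis results are available as results; the dominant emotion is picked by comparing the five confidence values:

```kotlin
val output = StringBuilder()
for (entity in results.entities) {
    // Collect the five emotion confidence values for this entity
    val emotions = mapOf(
            "anger" to entity.emotion.anger,
            "joy" to entity.emotion.joy,
            "disgust" to entity.emotion.disgust,
            "fear" to entity.emotion.fear,
            "sadness" to entity.emotion.sadness
    )
    // The dominant emotion is the one with the highest confidence
    val topEmotion = emotions.maxBy { it.value }?.key

    output.append("${entity.text} (${entity.type}): ")
    output.append("sentiment ${entity.sentiment.score}, emotion $topEmotion\n")
}
```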

To display the output we generated, we can again update the TextView widget.

At this point, you can run the app again to see all three services working together.

Results of semantic analysis

Conclusion

You now know how to use three of the most widely used natural language-related services Watson offers. In this tutorial, you also saw how easy it is to use the Watson Java SDK to make all the services work together to create a smart Android app.

To learn more about the services and the SDK, you can refer to the SDK's GitHub repository. And to learn more about using Watson machine learning in your own apps, check out some of our other posts here on Envato Tuts+!

  • Android SDK
    How to Use Google Cloud Machine Learning Services for Android
    Ashraff Hathibelagal
  • Android Things
    Android Things and Machine Learning
    Paul Trebilcox-Ruiz
  • Android SDK
    Introduction to Android Architecture Components
    Tin Megali
  • Machine Learning
    Coding an Android App With IBM Watson Machine Learning
    Ashraff Hathibelagal
