expert.ai Natural Language API for Java v2

This Java client adds natural language understanding capabilities to your Java apps. It can use either the cloud-based expert.ai Natural Language API or a Studio Local Deployment Agent.

Check out what the expert.ai Natural Language API can do for your application with the live demo. The Natural Language API provides a comprehensive set of natural language understanding capabilities based on expert.ai technology:

  • Document analysis:
    • Deep linguistic analysis:
      • Text subdivision
      • Part-of-speech tagging
      • Syntactic analysis
      • Lemmatization
      • Keyphrase extraction
      • Semantic analysis
    • Named entity recognition
    • Relation extraction
    • Sentiment analysis
  • Document classification
  • Information detection

What you'll need

  • About 15 minutes
  • A favorite text editor or IDE
  • Java JDK version 8 or higher
  • Gradle installed
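
To double-check your environment before starting, you can run, for example:

java -version
gradle --version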

Build from source

git clone git@github.com:therealexpertai/nlapi-java.git
cd nlapi-java
./gradlew build    

Generate a distribution from source

git clone git@github.com:therealexpertai/nlapi-java.git
cd nlapi-java
./gradlew distZip    
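
The resulting archive is written to Gradle's default output location, typically build/distributions.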

Add Maven dependency

<dependency>
    <groupId>ai.expert</groupId>
    <artifactId>nlapi-java-sdk</artifactId>
    <version>2.3.1</version>
</dependency>
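
If you use Gradle instead of Maven, the equivalent dependency declaration (same coordinates as above) is:

implementation 'ai.expert:nlapi-java-sdk:2.3.1'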

Setting your credentials

You need an expert.ai developer account to use the APIs; you can get one for free by registering on the expert.ai developer portal.

The client resolves your credentials using a chain of credential providers.

The default chain checks the following, in order:

  • the environment variables EAI_USERNAME and EAI_PASSWORD set on the machine
  • the system properties eai.username and eai.password, which can also be loaded from a properties file (e.g. myProperties.txt)
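
For example, on Linux or macOS you could export the environment variables before running your application (placeholder values shown):

export EAI_USERNAME=your-username
export EAI_PASSWORD=your-password

or pass the system properties on the command line (your-app.jar is a placeholder for your own artifact):

java -Deai.username=your-username -Deai.password=your-password -jar your-app.jar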

Usage examples

Here are some examples of how to use the library to leverage the Natural Language API:

Document analysis

You can get the result of the document analysis applied to your text as follows:

Natural Language API:

import ai.expert.nlapi.security.Authentication;
import ai.expert.nlapi.security.Authenticator;
import ai.expert.nlapi.security.BasicAuthenticator;
import ai.expert.nlapi.security.DefaultCredentialsProvider;
import ai.expert.nlapi.v2.API;
import ai.expert.nlapi.v2.cloud.Analyzer;
import ai.expert.nlapi.v2.cloud.AnalyzerConfig;
import ai.expert.nlapi.v2.message.AnalyzeResponse;

public class AnalysisTest {

    static StringBuilder sb = new StringBuilder();

    // Sample text to be analyzed
    static {
        sb.append("Michael Jordan was one of the best basketball players of all time.");
        sb.append("Scoring was Jordan's stand-out skill, but he still holds a defensive NBA record, with eight steals in a half.");  
    }

    public static String getSampleText() {
        return sb.toString();
    }
    
    // Method for setting up authentication; credentials are resolved through the default provider chain
    public static Authentication createAuthentication() throws Exception {
        DefaultCredentialsProvider credentialsProvider = new DefaultCredentialsProvider();
        Authenticator authenticator = new BasicAuthenticator(credentialsProvider);
        return new Authentication(authenticator);
    }

    // Method for selecting the resource to be called by the API;
    // as of today, the API provides the standard context only and
    // five languages: English, French, Spanish, German and Italian
    public static Analyzer createAnalyzer() throws Exception {
        return new Analyzer(AnalyzerConfig.builder()
            .withVersion(API.Versions.V2)
            .withContext("standard")
            .withLanguage(API.Languages.en)
            .withAuthentication(createAuthentication())
            .build());
    }

    public static void main(String[] args) {
        try {
            Analyzer analyzer = createAnalyzer();
            AnalyzeResponse response = null;
            
            // Disambiguation analysis
            response = analyzer.disambiguation(getSampleText());
            response.prettyPrint();

            // Relevants analysis
            response = analyzer.relevants(getSampleText());
            response.prettyPrint();

            // Entities analysis
            response = analyzer.entities(getSampleText());
            response.prettyPrint();

            // Relations analysis
            response = analyzer.relations(getSampleText());
            response.prettyPrint();

            // Sentiment analysis
            response = analyzer.sentiment(getSampleText());
            response.prettyPrint();

            // Full analysis
            response = analyzer.analyze(getSampleText());
            response.prettyPrint();
        }
        catch(Exception ex) {
            ex.printStackTrace();
        }
    }
}
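
Each call returns an AnalyzeResponse; prettyPrint writes the JSON returned by the service to standard output.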

Studio Local Deployment Agent:

import ai.expert.nlapi.security.Authentication;
import ai.expert.nlapi.security.Authenticator;
import ai.expert.nlapi.security.BasicAuthenticator;
import ai.expert.nlapi.security.DefaultCredentialsProvider;
import ai.expert.nlapi.v2.API;
import ai.expert.nlapi.v2.edge.Analyzer;
import ai.expert.nlapi.v2.edge.AnalyzerConfig;
import ai.expert.nlapi.v2.message.AnalyzeResponse;

public class AnalysisTest {

    static StringBuilder sb = new StringBuilder();

    // Sample text to be analyzed
    static {
        sb.append("Michael Jordan was one of the best basketball players of all time.");
        sb.append("Scoring was Jordan's stand-out skill, but he still holds a defensive NBA record, with eight steals in a half.");  
    }

    public static String getSampleText() {
        return sb.toString();
    }
    
    // Method for setting up authentication; credentials are resolved through the default provider chain
    public static Authentication createAuthentication() throws Exception {
        DefaultCredentialsProvider credentialsProvider = new DefaultCredentialsProvider();
        Authenticator authenticator = new BasicAuthenticator(credentialsProvider);
        return new Authentication(authenticator);
    }

    // Method for configuring the Analyzer to call the Studio Local Deployment Agent
    public static Analyzer createAnalyzer() throws Exception {
        return new Analyzer(AnalyzerConfig.builder()
            .withVersion(API.Versions.V2)
            .withHost(API.DEFAULT_EDGE_HOST)
            .withAuthentication(createAuthentication())
            .build());
    }

    public static void main(String[] args) {
        try {
            Analyzer analyzer = createAnalyzer();
            AnalyzeResponse response = null;
            
            // Disambiguation analysis
            response = analyzer.disambiguation(getSampleText());
            response.prettyPrint();

            // Relevants analysis
            response = analyzer.relevants(getSampleText());
            response.prettyPrint();

            // Entities analysis
            response = analyzer.entities(getSampleText());
            response.prettyPrint();

            // Relations analysis
            response = analyzer.relations(getSampleText());
            response.prettyPrint();

            // Sentiment analysis
            response = analyzer.sentiment(getSampleText());
            response.prettyPrint();

            // Full analysis
            response = analyzer.analyze(getSampleText());
            response.prettyPrint();
        }
        catch(Exception ex) {
            ex.printStackTrace();
        }
    }
}

The API document analysis resources operate within a context. To retrieve the list of all valid contexts, use this code:

import ai.expert.nlapi.security.Authentication;
import ai.expert.nlapi.security.Authenticator;
import ai.expert.nlapi.security.BasicAuthenticator;
import ai.expert.nlapi.security.DefaultCredentialsProvider;
import ai.expert.nlapi.v2.API;
import ai.expert.nlapi.v2.cloud.InfoAPI;
import ai.expert.nlapi.v2.cloud.InfoAPIConfig;

public class ContextsTest {


    // Method for setting up authentication; credentials are resolved through the default provider chain
    public static Authentication createAuthentication() throws Exception {
        DefaultCredentialsProvider credentialsProvider = new DefaultCredentialsProvider();
        Authenticator authenticator = new BasicAuthenticator(credentialsProvider);
        return new Authentication(authenticator);
    }



    public static void main(String[] args) {
        try {
            // create InfoAPI
            InfoAPI infoAPI = new InfoAPI(InfoAPIConfig.builder()
                .withAuthentication(createAuthentication())
                .withVersion(API.Versions.V2)
                .build());

            // get the list of available contexts and print the JSON response
            Contexts contexts = infoAPI.getContexts();
            contexts.prettyPrint();
        }
        catch(Exception ex) {
            ex.printStackTrace();
        }
    }
}

Document classification

You can also run document classification, for example with the IPTC Media Topics taxonomy:

package ai.expert.nlapi.v2.test;

import ai.expert.nlapi.security.Authentication;
import ai.expert.nlapi.security.Authenticator;
import ai.expert.nlapi.security.BasicAuthenticator;
import ai.expert.nlapi.security.DefaultCredentialsProvider;
import ai.expert.nlapi.v2.API;
import ai.expert.nlapi.v2.cloud.Categorizer;
import ai.expert.nlapi.v2.cloud.CategorizerConfig;
import ai.expert.nlapi.v2.message.CategorizeResponse;

public class CategorizationIPTCTest {

    static StringBuilder sb = new StringBuilder();
    
    // Sample text to be analyzed
    static {
        sb.append("Michael Jordan was one of the best basketball players of all time.");
        sb.append("Scoring was Jordan's stand-out skill, but he still holds a defensive NBA record, with eight steals in a half.");  
    }

    public static String getSampleText() {
        return sb.toString();
    }

    // Method for setting up authentication; credentials are resolved through the default provider chain
    public static Authentication createAuthentication() throws Exception {
        DefaultCredentialsProvider credentialsProvider = new DefaultCredentialsProvider();
        Authenticator authenticator = new BasicAuthenticator(credentialsProvider);
        return new Authentication(authenticator);
    }
    
    // Method for selecting the resource to be called by the API;
    // here the IPTC Media Topics taxonomy is selected, with English as the language
    // (English, French, Spanish, German and Italian are supported)
    public static Categorizer createCategorizer() throws Exception {
        return new Categorizer(CategorizerConfig.builder()
            .withVersion(API.Versions.V2)
            .withTaxonomy("iptc")
            .withLanguage(API.Languages.en)
            .withAuthentication(createAuthentication())
            .build());
    }

    public static void main(String[] args) {
        try {
            Categorizer categorizer = createCategorizer();
            
            //Perform the IPTC classification and store it into a Response Object
            CategorizeResponse categorization = categorizer.categorize(getSampleText());
            categorization.prettyPrint();
        }
        catch(Exception ex) {
            ex.printStackTrace();
        }
    }
}

or with the GeoTax taxonomy:

package ai.expert.nlapi.v2.test;

import ai.expert.nlapi.security.Authentication;
import ai.expert.nlapi.security.Authenticator;
import ai.expert.nlapi.security.BasicAuthenticator;
import ai.expert.nlapi.security.DefaultCredentialsProvider;
import ai.expert.nlapi.v2.API;
import ai.expert.nlapi.v2.cloud.Categorizer;
import ai.expert.nlapi.v2.cloud.CategorizerConfig;
import ai.expert.nlapi.v2.message.CategorizeResponse;

public class CategorizationGeoTAXTest {

    static StringBuilder sb = new StringBuilder();
    
    // Sample text to be analyzed
    static {
        // set text to be analyzed using the GeoTAX taxonomy
        sb.append("Rome is the capital city and a special comune of Italy as well as the capital of the Lazio region. ");
        sb.append("The city has been a major human settlement for almost three millennia. ");
        sb.append("It is the third most populous city in the European Union by population within city limits.");
    }

    public static String getSampleText() {
        return sb.toString();
    }

    // Method for setting up authentication; credentials are resolved through the default provider chain
    public static Authentication createAuthentication() throws Exception {
        DefaultCredentialsProvider credentialsProvider = new DefaultCredentialsProvider();
        Authenticator authenticator = new BasicAuthenticator(credentialsProvider);
        return new Authentication(authenticator);
    }
    
    // Method for selecting the resource to be called by the API;
    // here the GeoTAX taxonomy is selected, with English as the language
    // (English, French, Spanish, German and Italian are supported)
    public static Categorizer createCategorizer() throws Exception {
        return new Categorizer(CategorizerConfig.builder()
            .withVersion(API.Versions.V2)
            .withTaxonomy("geotax")
            .withLanguage(API.Languages.en)
            .withAuthentication(createAuthentication())
            .build());
    }

    public static void main(String[] args) {
        try {
            Categorizer categorizer = createCategorizer();
            
            //Perform the GeoTAX classification and store it into a Response Object
            CategorizeResponse categorization = categorizer.categorize(getSampleText());
            categorization.prettyPrint();
        }
        catch(Exception ex) {
            ex.printStackTrace();
        }
    }
}

To retrieve the list of all categories of a taxonomy for a specific language, follow this example:

GeoTax categories for English

import ai.expert.nlapi.security.Authentication;
import ai.expert.nlapi.security.Authenticator;
import ai.expert.nlapi.security.BasicAuthenticator;
import ai.expert.nlapi.security.DefaultCredentialsProvider;
import ai.expert.nlapi.v2.API;
import ai.expert.nlapi.v2.cloud.InfoAPI;
import ai.expert.nlapi.v2.cloud.InfoAPIConfig;
import ai.expert.nlapi.v2.message.TaxonomyResponse;

public class TaxonomyTest {


    // Method for setting up authentication; credentials are resolved through the default provider chain
    public static Authentication createAuthentication() throws Exception {
        DefaultCredentialsProvider credentialsProvider = new DefaultCredentialsProvider();
        Authenticator authenticator = new BasicAuthenticator(credentialsProvider);
        return new Authentication(authenticator);
    }



    public static void main(String[] args) {
        try {
            // create InfoAPI
            InfoAPI infoAPI = new InfoAPI(InfoAPIConfig.builder()
                .withAuthentication(createAuthentication())
                .withVersion(API.Versions.V2)
                .build());

            // get the GeoTAX taxonomy for English and print the JSON response
            TaxonomyResponse taxonomy = infoAPI.getTaxonomy("geotax", API.Languages.en);
            taxonomy.prettyPrint();
        }
        catch(Exception ex) {
            ex.printStackTrace();
        }
    }
}

To retrieve the list of all valid taxonomies, use this code:

import ai.expert.nlapi.security.Authentication;
import ai.expert.nlapi.security.Authenticator;
import ai.expert.nlapi.security.BasicAuthenticator;
import ai.expert.nlapi.security.DefaultCredentialsProvider;
import ai.expert.nlapi.v2.API;
import ai.expert.nlapi.v2.cloud.InfoAPI;
import ai.expert.nlapi.v2.cloud.InfoAPIConfig;

public class TaxonomiesTest {


    //Method for setting the authentication credentials - set your credentials here.
    public static Authentication createAuthentication() throws Exception {
    	DefaultCredentialsProvider credentialsProvider = new DefaultCredentialsProvider();
        Authenticator authenticator = new BasicAuthenticator(credentialsProvider);
        return new Authentication(authenticator);
    }



    public static void main(String[] args) {
        try {
            // create InfoAPI
            InfoAPI infoAPI = new InfoAPI(InfoAPIConfig.builder()
                .withAuthentication(createAuthentication())
                .withVersion(API.Versions.V2)
                .build());

            // get the list of taxonomies and print the JSON response
            Taxonomies taxonomies = infoAPI.getTaxonomies();
            taxonomies.prettyPrint();
        }
        catch(Exception ex) {
            ex.printStackTrace();
        }
    }
}

Information detection

You can also perform information detection using one of the available detectors. For example, the PII detector (PII stands for Personally Identifiable Information) detects and extracts information, such as names, dates, addresses and telephone numbers, that could be considered "sensitive".

package ai.expert.nlapi.v2.test;

import ai.expert.nlapi.security.Authentication;
import ai.expert.nlapi.security.Authenticator;
import ai.expert.nlapi.security.BasicAuthenticator;
import ai.expert.nlapi.security.DefaultCredentialsProvider;
import ai.expert.nlapi.v2.API;
import ai.expert.nlapi.v2.cloud.Detector;
import ai.expert.nlapi.v2.cloud.DetectorConfig;
import ai.expert.nlapi.v2.message.DetectResponse;

public class PIIDetectionTest {

    static StringBuilder sb = new StringBuilder();
    
    // Sample text to be analyzed
    static {
        sb.append("Michael Jordan was one of the best basketball players of all time.");
        sb.append("Scoring was Jordan's stand-out skill, but he still holds a defensive NBA record, with eight steals in a half.");  
    }

    public static String getSampleText() {
        return sb.toString();
    }

    // Method for setting up authentication; credentials are resolved through the default provider chain
    public static Authentication createAuthentication() throws Exception {
        DefaultCredentialsProvider credentialsProvider = new DefaultCredentialsProvider();
        Authenticator authenticator = new BasicAuthenticator(credentialsProvider);
        return new Authentication(authenticator);
    }
    
    // Method for creating a Detector for the given detector name and language
    public static Detector createDetector(Authentication authentication, String detector, API.Languages lang) throws Exception {
        return new Detector(DetectorConfig.builder()
                      .withVersion(API.Versions.V2)
                      .withDetector(detector)
                      .withLanguage(lang)
                      .withAuthentication(authentication)
                      .build());
    }

    public static void main(String[] args) {
        try {
            // create the PII detector for English
            Detector detectorEn = createDetector(createAuthentication(), "pii", API.Languages.en);

            // send the detection request and get the response
            DetectResponse detect = detectorEn.detect(getSampleText());
            // print json response
            detect.prettyPrint();
        }
        catch(Exception ex) {
            ex.printStackTrace();
        }
    }
}

To retrieve the list of all valid detectors, use this code:

import ai.expert.nlapi.security.Authentication;
import ai.expert.nlapi.security.Authenticator;
import ai.expert.nlapi.security.BasicAuthenticator;
import ai.expert.nlapi.security.DefaultCredentialsProvider;
import ai.expert.nlapi.v2.message.DetectorsResponse;
import ai.expert.nlapi.v2.API;
import ai.expert.nlapi.v2.cloud.InfoAPI;
import ai.expert.nlapi.v2.cloud.InfoAPIConfig;

public class DetectorsTest {


    // Method for setting up authentication; credentials are resolved through the default provider chain
    public static Authentication createAuthentication() throws Exception {
        DefaultCredentialsProvider credentialsProvider = new DefaultCredentialsProvider();
        Authenticator authenticator = new BasicAuthenticator(credentialsProvider);
        return new Authentication(authenticator);
    }



    public static void main(String[] args) {
        try {
            // create InfoAPI
            InfoAPI infoAPI = new InfoAPI(InfoAPIConfig.builder()
                .withAuthentication(createAuthentication())
                .withVersion(API.Versions.V2)
                .build());

            // get the list of detectors and print the JSON response
            DetectorsResponse detectors = infoAPI.getDetectors();
            detectors.prettyPrint();
        }
        catch(Exception ex) {
            ex.printStackTrace();
        }
    }
}

API capabilities

Refer to the Natural Language API and Studio Local Deployment Agent documentation to learn more about the APIs' capabilities.

Notes

The project uses Project Lombok.

Project Lombok is a Java library that automatically plugs into your editor and build tools. If you use JetBrains IntelliJ IDEA, see https://projectlombok.org/setup/intellij; for Eclipse, see https://projectlombok.org/setup/eclipse.
