SearchAPI is a Web API responsible for interfacing with our search engine. Its main controller, SearchController, handles document searches. You can build queries using features such as filters, facets, sorting, and more. See the API Reference for details.
To generate a client, you can use the Haxe cross-platform toolkit (incubating feature).
In order to build the SearchAPI application you need:

- JDK 1.8
- The `JAVA_HOME` environment variable set to the path of JDK 1.8
To build this project for the first time, run:

```shell
./gradlew build
```
You can import the project into your favorite IDE:

- Eclipse
  - Run the command `./gradlew cleanEclipse eclipse`;
  - Import the project into Eclipse.
- IntelliJ IDEA
  - Import the project as a Gradle project using `build.gradle`.
You must pass the `-Des.cluster.name` Java parameter.
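For example, assuming the application is packaged as an executable jar (the jar path and cluster name below are assumptions, not project settings), the parameter could be passed like this:

```shell
# Hypothetical launch command; the jar path and cluster name are placeholders.
java -Des.cluster.name=my-es-cluster -jar build/libs/search-api.jar
```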
There are two types of tests:

- Unit tests

  ```shell
  ./gradlew test
  ```

  Note that when you run just `test`, the integration tests always run as well.

- Integration tests

  ```shell
  ./gradlew integrationTest
  ```

  The integration tests guarantee that SearchAPI integrates correctly with Elasticsearch. We use Docker Compose to bring up the Elasticsearch and SearchAPI containers and run the `SearchApiIntegrationTest` class.
To skip:

- Integration tests: add `-x integrationTest` to your Gradle command.
- Docker Compose: pass the `-PnoDockerCompose` Gradle parameter.
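For instance, combining the two flags above, a build that skips both the integration tests and the Docker Compose setup could be invoked like this:

```shell
# Skip the integration tests and the Docker Compose containers in one build.
./gradlew build -x integrationTest -PnoDockerCompose
```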
There are only two groups of endpoints, for searching and monitoring:

- Searching endpoints:
  - `GET /v2/{index}`: Search documents
  - `GET /v2/{index}/{id}`: Search a document by id
  - `GET /v2/{index}/stream`: Streaming endpoint (uses the `application/x-ndjson` content type)
Main parameters:

| Name | Type | Description |
| --- | --- | --- |
| `from` | int | Index to start the search from |
| `size` | int | The number of search hits to return |
| `filter` | string | Query DSL |
| `includeFields` | string[] | Fields that will be included in the result |
| `excludeFields` | string[] | Fields that will be excluded from the result |

There are many more parameters, which you can see here.
- Monitoring endpoints:
  - `GET /v2/cluster/settings`: Get all configs
  - `GET /v2/properties/local`: Get local properties
  - `GET /v2/properties/remote`: Get remote properties
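For example, a quick check of the cluster configuration (assuming the same hypothetical host used elsewhere in this document):

```shell
# Fetch all cluster settings from the monitoring endpoint.
curl http://api/v2/cluster/settings
```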
SearchAPI provides a Query DSL for creating flexible queries against our search engine. The idea is to abstract away the complexity of building queries and search optimizations. Basically, you just need to use the `filter` query parameter, for example:
```shell
curl -X GET "http://api/v2/listings?filter=field1 EQ 'value1' AND (field2 EQ 'value2' OR field3 EQ 'value3')"
```
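Note that the filter expression contains spaces, quotes, and parentheses, so when calling the API from a shell it usually needs to be URL-encoded; one way to do that (a sketch, using the same hypothetical host) is curl's `--data-urlencode`:

```shell
# -G sends the encoded data as query-string parameters on a GET request.
curl -G "http://api/v2/listings" \
  --data-urlencode "filter=field1 EQ 'value1' AND (field2 EQ 'value2' OR field3 EQ 'value3')"
```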
SearchAPI parses this query using different kinds of parsers and generates an Abstract Syntax Tree from the query fragments. For an explanation of the query fragments, see the image below:
You can see more details in the wiki.
We deploy SearchAPI to Amazon AWS using the El Asno Alado project. The main deployment configuration file is a Makefile located in the project's root directory.
First of all, you need to set up your AWS credentials and sync the git submodules:
```shell
export AWS_ACCESS_KEY_ID=<YOUR_ACCESS_KEY>
export AWS_SECRET_ACCESS_KEY=<YOUR_SECRET_KEY>
git submodule init && git submodule update --init --recursive
```
After that, you can simply run `make deploy`, passing all required parameters. Make sure you've pushed the Docker image to DockerHub before deploying the environment.
```shell
make ENV=${ENV} \
     IMAGE_NAME=${IMAGE_NAME} \
     STACK_ALIAS=${STACK_ALIAS} \
     AWS_DEFAULT_REGION=${AWS_DEFAULT_REGION} \
     ES_CLUSTER_NAME=${ES_CLUSTER_NAME} \
     deploy
```
- `ENV`: the environment (`prod` or `qa`).
- `IMAGE_NAME`: a string with the name of the image pushed to DockerHub.
- `STACK_ALIAS`: a string used to name the CloudFormation stack. If not present, the hash of the current commit will be used to identify the stack.
- `AWS_DEFAULT_REGION`: a string with the AWS region to deploy to (e.g. `sa-east-1`).
- `ES_CLUSTER_NAME`: a string with the name of the current Elasticsearch cluster.
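A concrete invocation might look like this (all values below are placeholders, not real project settings):

```shell
# Example deploy to the qa environment; substitute your own values.
make ENV=qa \
     IMAGE_NAME=vivareal/search-api:latest \
     STACK_ALIAS=search-api-qa \
     AWS_DEFAULT_REGION=sa-east-1 \
     ES_CLUSTER_NAME=my-es-cluster \
     deploy
```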
There is a `load-test` sub-project responsible for executing load tests against SearchAPI. See Load Test.
We use JMH for code benchmark tests, together with the JMH Gradle plugin.
To run the code benchmark tests:

```shell
make benchmark
```
Made with ♥ by VivaReal's engineering team.