thehivedocs's Issues

Creation of a new user unaccessible on the training VM

Hi,

When trying to use the training VM (both the latest one and the previous one) with a local authentication scheme, clicking the "update database" button on the maintenance page after the installation skips the creation of the new user and goes directly to the login form. It is therefore not possible to log into the application, because no user exists.

Could this be an issue, or an error I made while installing or configuring TheHive? Thanks for your help!

API Documentation about template

Hi!

I was looking into your API documentation, and I don't see how (or whether) it is possible to create a case from a template.
If it is possible to do that through the API, could you please document it? :)

Elasticsearch connection via TLS.

I cannot find it in your documentation: is there a setting to make TheHive talk to Elasticsearch over TLS for its transport connection?

Authentication section API KEY

Hello folks!
In the authentication section of the API "TheHiveDocs/api/authentication.md" it is hinted that the API key is a form of authentication. However, I suggest it is rather a form of identification and it is TLS that is doing the authentication (if any).

I suggest that this section be revisited in order to make it clearer what type of authentication (if any) is really performed within the API. This would make it easier to promote the project to nitpicky security professionals who read the docs.

Thanks for all the hard work!

no vars.sh

When I start TheHive, it complains that vars.sh is missing:
"/etc/init.d/thehive: line 25: /lib/init/vars.sh: No such file or directory".
The server OS is CentOS 7.4.1708.
Thanks

Oops, cannot start the server. Cannot load play.http.filters after Upgrade

Support,

Old TheHive version: 2.10.2
New TheHive version: 3.0.10
Server: Ubuntu 16

We performed a binary upgrade following the documented instructions.

Upon service start, we receive the following error:

Oops, cannot start the server.
at play.utils.Reflect$.loadClass$1(Reflect.scala
at play.utils.Reflect$.configuredClass(Reflect.scala
at play.utils.Reflect$.bindingsFromConfiguration(Reflect.scala
at play.api.http.HttpFilters$.bindingsFromConfiguration(HttpFilters.scala
at play.api.inject.BuiltinModule$$anonfun$$lessinit$greater$1.$anonfun$new$3(BuiltinModule.scala
at play.api.inject.BuiltinModule$$anonfun$$lessinit$greater$1.$anonfun$new$1(BuiltinModule.scala
at scala.collection.TraversableLike.$anonfun$flatMap$1(TraversableLike.scala
at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala
at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala
at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala
at scala.collection.TraversableLike.flatMap(TraversableLike.scala
at scala.collection.TraversableLike.flatMap$(TraversableLike.scala
at scala.collection.AbstractTraversable.flatMap(Traversable.scala
at play.api.inject.BuiltinModule$$anonfun$$lessinit$greater$1.dynamicBindings$1(BuiltinModule.scala
at play.api.inject.BuiltinModule$$anonfun$$lessinit$greater$1.apply(BuiltinModule.scala
at play.api.inject.BuiltinModule$$anonfun$$lessinit$greater$1.apply(BuiltinModule.scala
at play.api.inject.SimpleModule.bindings(Module.scala
at play.api.inject.guice.GuiceableModuleConversions.guice(GuiceInjectorBuilder.scala
at play.api.inject.guice.GuiceableModuleConversions.guice$(GuiceInjectorBuilder.scala
at play.api.inject.guice.GuiceableModule$.guice(GuiceInjectorBuilder.scala
at play.api.inject.guice.GuiceableModuleConversions$$anon$3.$anonfun$guiced$2(GuiceInjectorBuilder.scala
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala
at scala.collection.immutable.List.foreach(List.scala
at scala.collection.TraversableLike.map(TraversableLike.scala
at scala.collection.TraversableLike.map$(TraversableLike.scala
at scala.collection.immutable.List.map(List.scala
at play.api.inject.guice.GuiceableModuleConversions$$anon$3.guiced(GuiceInjectorBuilder.scala
at play.api.inject.guice.GuiceableModule$.$anonfun$guiced$1(GuiceInjectorBuilder.scala
at scala.collection.TraversableLike.$anonfun$flatMap$1(TraversableLike.scala
at scala.collection.immutable.List.foreach(List.scala
at scala.collection.TraversableLike.flatMap(TraversableLike.scala
at scala.collection.TraversableLike.flatMap$(TraversableLike.scala
at scala.collection.immutable.List.flatMap(List.scala
at play.api.inject.guice.GuiceableModule$.guiced(GuiceInjectorBuilder.scala
at play.api.inject.guice.GuiceBuilder.createModule(GuiceInjectorBuilder.scala
at play.api.inject.guice.GuiceApplicationBuilder.applicationModule(GuiceApplicationBuilder.scala
at play.api.inject.guice.GuiceBuilder.injector(GuiceInjectorBuilder.scala
at play.api.inject.guice.GuiceApplicationBuilder.build(GuiceApplicationBuilder.scala
at play.api.inject.guice.GuiceApplicationLoader.load(GuiceApplicationLoader.scala
at play.core.server.ProdServerStart$.start(ProdServerStart.scala
at play.core.server.ProdServerStart$.main(ProdServerStart.scala
at play.core.server.ProdServerStart.main(ProdServerStart.scala)
Caused by
at java.net.URLClassLoader.findClass(URLClassLoader.java
at java.lang.ClassLoader.loadClass(ClassLoader.java
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java
at java.lang.ClassLoader.loadClass(ClassLoader.java
at play.utils.Reflect$.loadClass$1(Reflect.scala
... 41 more


If we comment out play.http.filters = global.TheHiveFilters in application.conf, the service starts but does not connect to Elasticsearch.

Best,

Joe

Elasticsearch to current version?

Hi,

I see that you are still using Elasticsearch 2 within TheHive.
Can you please update compatibility to Elasticsearch 5.x, which is the current version?

Thanks in advance.

/bin/activator does not create /target/universal/stage

I am building from source on CentOS 7 (I am actually writing a bash script for it). The line

sudo cp -r /opt/TheHive/target/universal/stage /opt/thehive

did not work. I have no knowledge of SBT, but as far as I understand the issue, the command

bin/activator clean stage

should be

/opt/TheHive/bin/activator clean compile stage dist

This generates the path described in the documentation for me.

Is this an issue with the documentation, or did I make a mistake?

Missing API documentation about the _search endpoints

Hi,

I'm looking to use the _search endpoints (mainly /api/cases/_search and /api/tasks/_search), but I am not able to find much documentation about them.

Is any processing performed on the query, or is it simply forwarded to Elasticsearch? Either way, could you add documentation:

  • for the various parameters (query, range, etc.)
  • and on the pagination methodology and how to loop over all objects returned by a query?

Many thanks
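In the meantime, here is a sketch of how such a request could be assembled. The range=start-end pagination parameter and the _and query operator are assumptions on my part, not documented behaviour:

```python
import json

def build_search_request(query, page=0, page_size=10):
    """Assemble the range parameter and JSON body for a _search call.

    Both the "range" format and the query grammar used below are
    assumptions, pending official documentation.
    """
    start = page * page_size
    params = {"range": f"{start}-{start + page_size}"}
    body = json.dumps({"query": query})
    return params, body

# Hypothetical usage against /api/cases/_search: fetch the third page,
# 25 results per page, of all open cases.
params, body = build_search_request({"_and": [{"status": "Open"}]},
                                    page=2, page_size=25)
print(params["range"])  # 50-75
```

Looping over all objects would then presumably mean repeating the call with an increasing page until fewer than page_size results come back.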

Docker-compose up fails

Docker creation fails when trying to bring the container up with the command below:

#docker-compose up

Error

2018-06-01 09:08:25,288 [INFO] from play.api.Play in main - Application started (Prod)
2018-06-01 09:08:26,528 [INFO] from play.core.server.AkkaHttpServer in main - Listening for HTTP on /0.0.0.0:9001
2018-06-01 10:36:33,002 [WARN] from akka.actor.ActorSystemImpl in application-akka.actor.default-dispatcher-11 - Illegal request, responding with status '400 Bad Request': Unsupported HTTP method: HTTP method too long (started with 'Auミ￟'). Increase akka.http.server.parsing.max-method-length to support HTTP methods with more characters.
2018-06-01 10:36:33,003 [WARN] from akka.actor.ActorSystemImpl in application-akka.actor.default-dispatcher-11 - Illegal request, responding with status '400 Bad Request': Unsupported HTTP method: HTTP method too long (started with 'Vラᅡ�o'). Increase akka.http.server.parsing.max-method-length to support HTTP methods with more characters.
2018-06-01 10:36:33,320 [WARN] from akka.actor.ActorSystemImpl in application-akka.actor.default-dispatcher-11 - Illegal request, responding with status '400 Bad Request': Unsupported HTTP method: HTTP method too long (started with 'ᆴᆰI↑￲ᅲy'). Increase akka.http.server.parsing.max-method-length to support HTTP methods with more characters.
2018-06-01 10:36:33,376 [WARN] from akka.actor.ActorSystemImpl in application-akka.actor.default-dispatcher-13 - Illegal request, responding with status '400 Bad Request': Unsupported HTTP method: HTTP method too long (started with 'ᆴ￶L'). Increase akka.http.server.parsing.max-method-length to support HTTP methods with more characters.
2018-06-01 10:36:40,555 [WARN] from akka.actor.ActorSystemImpl in application-akka.actor.default-dispatcher-14 - Illegal request, responding with status '400 Bad Request': Unsupported HTTP method: HTTP method too long (started with '￲ᅡ/ᅢチ'). Increase akka.http.server.parsing.max-method-length to support HTTP methods with more characters.
2018-06-01 10:36:40,568 [WARN] from akka.actor.ActorSystemImpl in application-akka.actor.default-dispatcher-12 - Illegal request, responding with status '400 Bad Request': Unsupported HTTP method: HTTP method too long (started with '￞?:"○'). Increase akka.http.server.parsing.max-method-length to support HTTP methods with more characters.
2018-06-01 10:36:40,587 [WARN] from akka.actor.ActorSystemImpl in application-akka.actor.default-dispatcher-8 - Illegal request, responding with status '400 Bad Request': Unsupported HTTP method: HTTP method too long (started with 'ᆴᆰXᆭ{←'). Increase akka.http.server.parsing.max-method-length to support HTTP methods with more characters.
2018-06-01 10:36:40,594 [WARN] from akka.actor.ActorSystemImpl in application-akka.actor.default-dispatcher-12 - Illegal request, responding with status '400 Bad Request': Unsupported HTTP method: HTTP method too long (started with 'ᆴᆰ￶トᄑ/→'). Increase akka.http.server.parsing.max-method-length to support HTTP methods with more characters.
2018-06-01 10:36:46,310 [WARN] from akka.actor.ActorSystemImpl in application-akka.actor.default-dispatcher-11 - Illegal request, responding with status '400 Bad Request': Unsupported HTTP '). Increase akka.http.server.parsing.max-method-length to support HTTP methods with more characters.
[root@localhost logs]# ll

How can I fix this issue?

Misp synchronization failed

I am using TheHive Training VM with the following configuration:

  1. The MISP user is given the user role
  2. The tag name is the default, "misp"
  3. max date = 30 days

I have kept all the other configuration the same as per the docs.
But I am still facing the following error (when checked from the logs):
(screenshot of the error not included)

Please help!

(suggestion) Commands to create group and user

It might be an idea to point out in the documentation that on Linux distributions without Debian's wrappers (CentOS, for example), addgroup corresponds to groupadd and adduser to useradd:

sudo groupadd thehive
sudo useradd -g thehive --system thehive

Docker-Compose Setup "None of the configured nodes are available"

https://github.com/CERT-BDF/TheHiveDocs/blob/master/installation/docker-guide.md

The instructions in the above link read as though you either need to use docker-compose OR manually install and configure Elasticsearch.

I'm pretty new to this, so I went the docker-compose route, and the instructions give me the following error...

thehive_1 | [error] a.a.OneForOneStrategy - None of the configured nodes are available: [{#transport#-1}{127.0.0.1}{127.0.0.1:9200}]
thehive_1 | org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: [{#transport#-1}{127.0.0.1}{127.0.0.1:9200}]
thehive_1 | at org.elasticsearch.client.transport.TransportClientNodesService.ensureNodesAreAvailable(TransportClientNodesService.java:290)
thehive_1 | at org.elasticsearch.client.transport.TransportClientNodesService.execute(TransportClientNodesService.java:207)
thehive_1 | at org.elasticsearch.client.transport.support.TransportProxyClient.execute(TransportProxyClient.java:55)
thehive_1 | at org.elasticsearch.client.transport.TransportClient.doExecute(TransportClient.java:288)
thehive_1 | at org.elasticsearch.client.support.AbstractClient.execute(AbstractClient.java:359)
thehive_1 | at org.elasticsearch.client.support.AbstractClient.search(AbstractClient.java:582)
thehive_1 | at com.sksamuel.elastic4s.SearchDsl$SearchDefinitionExecutable$$anonfun$apply$1.apply(SearchDsl.scala:40)
thehive_1 | at com.sksamuel.elastic4s.SearchDsl$SearchDefinitionExecutable$$anonfun$apply$1.apply(SearchDsl.scala:40)
thehive_1 | at com.sksamuel.elastic4s.Executable$class.injectFutureAndMap(Executable.scala:21)
thehive_1 | at com.sksamuel.elastic4s.SearchDsl$SearchDefinitionExecutable$.injectFutureAndMap(SearchDsl.scala:37)

The instructions may need to be modified to be clearer.
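One hint for whoever revises the instructions: the error above shows TheHive trying to reach Elasticsearch on 127.0.0.1, which under docker-compose is the TheHive container itself. Presumably search.host in application.conf has to point at the Elasticsearch container instead; the service name elasticsearch below is an assumption about the compose file, not taken from the guide:

```
search {
  # use the name of the Elasticsearch service from docker-compose.yml,
  # and the transport port 9300 rather than the HTTP port 9200
  host = ["elasticsearch:9300"]
}
```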

Unclear observable documentation

The documentation for uploading an observable is unclear. I am trying to upload a file as an observable to a case via the api and the documentation says:

Required attributes:

  • data (string): content of the observable (read only). An observable can’t contain data and attachment attributes
  • attachment (attachment): observable file content (read-only). An observable can’t contain data and attachment attributes

Why does it say data and attachment are required attributes, and then say "An observable can't contain data and attachment attributes"?
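My reading (an interpretation, not an official statement) is that the two attributes are mutually exclusive alternatives, exactly one of which is required. As a rule it would look like this:

```python
def validate_observable(obs):
    """Require exactly one of 'data' or 'attachment'.

    Both are listed as "required" because one of the two alternatives is
    mandatory, not because both must appear together. (This is my
    interpretation of the quoted docs, not an official rule.)
    """
    has_data = "data" in obs
    has_attachment = "attachment" in obs
    if has_data == has_attachment:  # neither present, or both present
        raise ValueError("exactly one of 'data' or 'attachment' is required")
    return obs

validate_observable({"dataType": "ip", "data": "127.0.0.1"})  # accepted
```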

Keep-alive in TheHive

Hi all,

Following a discussion on the Gitter channel, I decided to open this issue to bring more attention to it. This might not be an issue, so feel free to close it if that's the case.

I'm currently running TheHive behind a load balancer and I had several timeouts, which resulted in the following error popping up in the lower left corner of TheHive (regardless of which screen we are on):

(screenshot of the error not included)

The load balancer in this case is AWS Elastic Load Balancer (ELB), and following their documentation, one of the possible solutions was to increase the timeout, even though I don't believe that is the best solution.

Another recommendation from AWS is to implement a keep-alive in the application. I looked around and I'm not sure this is supported by TheHive. I'm also not sure whether this can be done at the OS layer, so any feedback on this is greatly appreciated.

I saw some keep-alive settings but they only applied to Elastic. Any tips? Can TH support keep-alive?

Feature: Add "auto-completion" to the UI

An auto-completion feature would be great in the UI, especially for adding tags or generally when using text inputs in forms.

It would make using the app easier and more straightforward for new users.

;-)

API Call with Key: Resource not found by Assets controller

Followed the instructions for a Docker install, here:

https://github.com/CERT-BDF/TheHiveDocs/blob/master/installation/docker-guide.md

Everything inside the web interface seems to be working okay, with the exception of the Help menu option, where I get "A client error occurred on GET /docs : Resource not found by Assets controller".

Using the following command (substituting my api key for the text in the command):

curl -H 'Authorization: Bearer APIKEY' http://127.0.0.1:9000/api/cases*

results in the following error:

"A client error occurred on GET /api/cases : Resource not found by Assets controller"

Any advice would be greatly appreciated.

Thanks,
Wes

Patching artifact/observable's reports via API impossible

Hello,

I'm using the /api/case/artifact/<id> endpoint to patch an artifact with report data.
Even though I receive a confirmation response from the API (response.ok == True / response.status_code == 200/201), the reports are empty.

Could you add to the docs an example of how to do that properly?
If not, then maybe just show the full request/response that travels between TheHive, Cortex and Elasticsearch when you execute an analyzer.

Run Analyzer in TheHive

I want to run a PCAP file analyzer that I have developed in TheHive, but I am getting the message below, even though the analyzer works perfectly in Cortex. Please help me resolve the issue.
(screenshot of the error not included)

Discrepancy between Alert API and Artifact required attributes

The Artifact API specifies that message is a required attribute of any artifact uploaded using the API. However, the Alert API documentation uses an example of adding artifacts to an alert that does not include the supposedly required message attribute:

curl -XPOST -u myuser:mypassword -H 'Content-Type: application/json' http://127.0.0.1:9000/api/alert -d '{
  "title": "Other alert",
  "description": "alert description",
  "type": "external",
  "source": "instance1",
  "sourceRef": "alert-ref",
  "severity": 3,
  "tlp": 3,
  "artifacts": [
    { "dataType": "ip", "data": "127.0.0.1" },
    { "dataType": "domain", "data": "thehive-project.org" },
    { "dataType": "file", "data": "logo.svg;image/svg+xml;PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0idXRmLTgiPz4NCjwhLS0gR2VuZXJhdG9yOiBBZG9iZSBJbGx1c3RyYXRvciAxOC4wLjAsIFNWRyBFeHBvcnQgUGx1Zy1JbiAuIFNWRyBWZXJzaW9uOiA2LjAwIEJ1aWxkIDApICAtLT4NCjwhRE9DVFlQRSBzdmcgUFVCTElDICItLy9XM0MvL0RURCBTVkcgMS4xLy9FTiIgImh0dHA6Ly93d3cudzMub3JnL0dyYXBoaWNzL1NWRy8xLjEvRFREL3N2ZzExLmR0ZCI+DQo8c3ZnIHZlcnNpb249IjEuMSIgaWQ9IkxheWVyXzEiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyIgeG1sbnM6eGxpbms9Imh0dHA6Ly93d3cudzMub3JnLzE5OTkveGxpbmsiIHg9IjBweCIgeT0iMHB4Ig0KCSB2aWV3Qm94PSIwIDAgNjI0IDIwMCIgZW5hYmxlLWJhY2tncm91bmQ9Im5ldyAwIDAgNjI0IDIwMCIgeG1sOnNwYWNlPSJwcmVzZXJ2ZSI+DQo8Zz4NCgk8Zz4NCgkJPHBhdGggZmlsbD0iIzE1MTYzMiIgZD0iTTE3Mi4yLDczdjY2LjRoLTIwLjdWNzNoLTI3LjRWNTQuOGg3NS41VjczSDE3Mi4yeiIvPg0KCQk8cGF0aCBmaWxsPSIjMTUxNjMyIiBkPSJNMjcyLjgsMTAwLjV2MzguOWgtMjAuMXYtMzQuNmMwLTcuNC00LjQtMTIuNS0xMS0xMi41Yy03LjgsMC0xMyw1LjQtMTMsMTcuN3YyOS40aC0yMC4yVjQ4LjVoMjAuMlY4Mg0KCQkJYzQuOS01LDExLjUtNy45LDE5LjYtNy45QzI2Myw3NC4xLDI3Mi44LDg0LjYsMjcyLjgsMTAwLjV6Ii8+DQoJCTxwYXRoIGZpbGw9IiMxNTE2MzIiIGQ9Ik0zNTYuMywxMTIuOGgtNDYuNGMxLjYsNy42LDYuOCwxMi4yLDEzLjYsMTIuMmM0LjcsMCwxMC4xLTEuMSwxMy41LTcuM2wxNy45LDMuNw0KCQkJYy01LjQsMTMuNC0xNi45LDE5LjgtMzEuNCwxOS44Yy0xOC4zLDAtMzMuNC0xMy41LTMzLjQtMzMuNmMwLTE5LjksMTUuMS0zMy42LDMzLjYtMzMuNmMxNy45LDAsMzIuMywxMi45LDMyLjcsMzMuNlYxMTIuOHoNCgkJCSBNMzEwLjMsMTAwLjVoMjYuMWMtMS45LTYuOC02LjktMTAtMTIuNy0xMEMzMTgsOTAuNSwzMTIuMiw5NCwzMTAuMywxMDAuNXoiLz4NCgkJPHBhdGggZmlsbD0iI0YzRDAyRiIgZD0iTTQ0NS41LDEzOS4zaC0yMC43di0zMy40aC0zNS42djMzLjRoLTIwLjhWNTQuOGgyMC44djMyLjloMzUuNlY1NC44aDIwLjdWMTM5LjN6Ii8+DQoJCTxwYXRoIGZpbGw9IiNGM0QwMkYiIGQ9Ik00NzguNiw1Ny4zYzAsNi40LTQuOSwxMS4yLTExLjcsMTEuMmMtNi44LDAtMTEuNi00LjgtMTEuNi0xMS4yYzAtNi4yLDQuOC0xMS41LDExLjYtMTEuNQ0KCQkJQzQ3My43LDQ1LjgsNDc4LjYsNTEuMSw0NzguNiw1Ny4zeiBNNDU2LjgsMTM5LjNWNzZoMjAuMnY2My4zSDQ1Ni44eiIvPg0KCQk8cGF0aCBmaWxsPSIjRjNEMDJGIiBkPSJNNTI4LjUsMTM5LjNoLTIwLjZsLTI2LjItNjMuNUg1MDNsMTUuMywzOS4xbDE1LjEtMzkuMWgyMS4zTDUyOC41LDEzOS4zeiIvPg0KCQk8cGF0aCBmaWxsPSIjRj
NEMDJGIiBkPSJNNjE4LjMsMTEyLjhoLTQ2LjRjMS42LDcuNiw2LjgsMTIuMiwxMy42LDEyLjJjNC43LDAsMTAuMS0xLjEsMTMuNS03LjNsMTcuOSwzLjcNCgkJCWMtNS40LDEzLjQtMTYuOSwxOS44LTMxLjQsMTkuOGMtMTguMywwLTMzLjQtMTMuNS0zMy40LTMzLjZjMC0xOS45LDE1LjEtMzMuNiwzMy42LTMzLjZjMTcuOSwwLDMyLjMsMTIuOSwzMi43LDMzLjZWMTEyLjh6DQoJCQkgTTU3Mi4yLDEwMC41aDI2LjFjLTEuOS02LjgtNi45LTEwLTEyLjctMTBDNTc5LjksOTAuNSw1NzQuMSw5NCw1NzIuMiwxMDAuNXoiLz4NCgk8L2c+DQoJPGc+DQoJCTxnPg0KCQkJPHBhdGggZmlsbD0iI0YzRDAyRiIgZD0iTTU3LDcwLjNjNi42LDAsMTIuMiw2LjQsMTIuMiwxMS41YzAsNi4xLTEwLDYuNi0xMiw2LjZsMCwwYy0yLjIsMC0xMi0wLjMtMTItNi42DQoJCQkJQzQ0LjgsNzYuNyw1MC40LDcwLjMsNTcsNzAuM0w1Nyw3MC4zeiBNNDQuMSwxMzMuNmwyNS4yLDAuMWwyLjIsNS42bC0yOS42LTAuMUw0NC4xLDEzMy42eiBNNDcuNiwxMjUuNmwyLjItNS42bDE0LjIsMGwyLjIsNS42DQoJCQkJTDQ3LjYsMTI1LjZ6IE01MywxMTIuMWwzLjktOS41bDMuOSw5LjVMNTMsMTEyLjF6IE0yMy4zLDE0My42Yy0xLjcsMC0zLjItMC4zLTQuNi0xYy02LjEtMi43LTkuMy05LjgtNi41LTE1LjkNCgkJCQljNi45LTE2LjYsMjcuNy0yOC41LDM5LTMwLjJsLTcuNCwxOC4xbDAsMEwzOC4zLDEyOGwwLDBsLTMuNSw4LjFDMzIuNiwxNDAuNywyOC4yLDE0My42LDIzLjMsMTQzLjZMMjMuMywxNDMuNnogTTU2LjcsMTYxLjgNCgkJCQljLTguMSwwLTE0LjctNS45LTE3LjMtMTVsMzQuNywwLjFDNzEuNCwxNTYuMiw2NC44LDE2MS44LDU2LjcsMTYxLjhMNTYuNywxNjEuOHogTTk1LDE0Mi45Yy0xLjUsMC43LTMuMiwxLTQuNiwxDQoJCQkJYy00LjksMC05LjMtMy0xMS4yLTcuNmwtMy40LTguMWwwLDBsLTUuMS0xMi43YzAtMC41LTAuMi0xLTAuNS0xLjVsLTctMTcuNmMxMS4yLDIsMzIsMTQsMzguOCwzMC41DQoJCQkJQzEwNC4zLDEzMy4zLDEwMS4zLDE0MC40LDk1LDE0Mi45TDk1LDE0Mi45eiIvPg0KCQkJDQoJCQkJPGxpbmUgZmlsbD0ibm9uZSIgc3Ryb2tlPSIjRjNEMDJGIiBzdHJva2Utd2lkdGg9IjUuMjE0NiIgc3Ryb2tlLWxpbmVjYXA9InJvdW5kIiBzdHJva2UtbWl0ZXJsaW1pdD0iMTAiIHgxPSI0Ny44IiB5MT0iNjcuNSIgeDI9IjQzLjciIHkyPSI1OC45Ii8+DQoJCQkNCgkJCQk8bGluZSBmaWxsPSJub25lIiBzdHJva2U9IiNGM0QwMkYiIHN0cm9rZS13aWR0aD0iNS4yMTQ2IiBzdHJva2UtbGluZWNhcD0icm91bmQiIHN0cm9rZS1taXRlcmxpbWl0PSIxMCIgeDE9IjY2LjEiIHkxPSI2Ny41IiB4Mj0iNzAuMSIgeTI9IjU4LjkiLz4NCgkJPC9nPg0KCQkNCgkJCTxwb2x5bGluZSBmaWxsPSJub25lIiBzdHJva2U9IiNGM0QwMkYiIHN0cm9rZS13aWR0aD0iNS4yMTQ2IiBzdHJva2UtbGluZWNhcD0icm91bmQiIHN0cm9rZS1saW5lam9pbj0icm91bmQiIHN0cm9rZS
1taXRlcmxpbWl0PSIxMCIgcG9pbnRzPSINCgkJCTk0LjgsMTAzLjUgMTA1LjUsODQuMiA4MS4xLDQyLjEgMzIuNyw0Mi4xIDguMyw4NC4yIDIwLDEwMy41IAkJIi8+DQoJPC9nPg0KPC9nPg0KPC9zdmc+DQo=" }
  ],
  "caseTemplate": "external-alert"
}'

Which method is appropriate for adding artifacts via an alert? My testing seems to indicate that adding artifacts with a message in an alert gives me an Invalid Format error for alert.artifacts.
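For reference, a small helper reflecting the behaviour reported above; the claim that message must be omitted from alert artifacts comes from the reporter's testing, not from the documentation:

```python
import json

def make_alert_artifact(data_type, data):
    """Build an artifact for an /api/alert payload.

    Per the testing described above, including a 'message' attribute in
    alert artifacts triggered an Invalid Format error, so this helper
    deliberately emits only dataType and data.
    """
    return {"dataType": data_type, "data": data}

artifacts = [
    make_alert_artifact("ip", "127.0.0.1"),
    make_alert_artifact("domain", "thehive-project.org"),
]
print(json.dumps(artifacts))
```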

Content-Type usage unclear

The documentation about cases:
https://github.com/CERT-BDF/TheHiveDocs/blob/master/api/case.md
does not mention when the Content-Type header is needed (and when it is not).

Fails:
curl -XPOST -u user:password -H "Content-Type: application/json" "http://127.0.0.1:9000/api/case/task/_search"

Works:
curl -XPOST -u user:password -H "Content-Type: application/json" "http://127.0.0.1:9000/api/case/task/_search" -d '{}'
curl -XPOST -u user:password "http://127.0.0.1:9000/api/case/task/_search"
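A plausible explanation for the failing variant (my reading, not something the docs confirm): sending Content-Type: application/json makes the server run the empty body through a JSON parser, which fails, while either supplying -d '{}' or omitting the header avoids the parse error. In miniature:

```python
import json

def parse_body(content_type, body):
    """Loosely mimics a JSON body parser: only bodies declared as
    application/json are parsed, and an empty string is not valid JSON."""
    if content_type == "application/json":
        return json.loads(body)  # raises ValueError on an empty body
    return None

parse_body("application/json", "{}")  # fine: the working call with -d '{}'
parse_body(None, "")                  # fine: the working call without the header
try:
    parse_body("application/json", "")  # mirrors the failing curl call
except ValueError:
    print("empty body with a JSON content type is rejected")
```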

Update Database - thehive:3.0.10-2

Hi,

The TheHive 3.0.10-2 Docker image connects to elasticsearch:5.6.0 and then asks to update the database.

When I click "Update Database", it never asks me to "Create New User". If I refresh, it points me to the login page (but there is no default user, as I wasn't able to create one).

I can see that the Elasticsearch index is created:

curl 'http://172.17.0.3:9200/_cat/indices?v'
health status index       uuid                   pri rep docs.count docs.deleted store.size pri.store.size
yellow open   the_hive_13 gULOmg0fROeIlVUboD8-NA 5   1   19         0            50kb       50kb

Simple application.conf:

```
# Elasticsearch
search {
  # Name of the index
  index = the_hive

  # Name of the Elasticsearch cluster
  cluster = hive

  # Address of the Elasticsearch instance
  host = ["172.17.0.3:9300"]

  # Scroll keepalive
  keepalive = 1m

  # Size of the page for scroll
  pagesize = 50

  # Number of shards
  nbshards = 5

  # Number of replicas
  nbreplicas = 1

  # Arbitrary settings
  settings {
    # Maximum number of nested fields
    mapping.nested_fields.limit = 100
  }

  # Enable SSL to connect to ElasticSearch
  search.ssl.enabled = false
}
```

From the Docker logs, there are some errors while creating/migrating the database:
[info] o.e.ErrorHandler - GET /api/user/current returned 520
org.elasticsearch.transport.RemoteTransportException: [hlpgbPW][172.17.0.3:9300][indices:data/read/search]
Caused by: org.elasticsearch.index.IndexNotFoundException: no such index
at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver$WildcardExpressionResolver.infe(IndexNameExpressionResolver.java:676)
at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver$WildcardExpressionResolver.innerResolve(IndexNameExpressionResolver.java:630)
at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver$WildcardExpressionResolver.resolve(IndexNameExpressionResolver.java:578)
at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver.concreteIndices(IndexNameExpressionResolver.java:168)
at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver.concreteIndices(IndexNameExpressionResolver.java:140)
at org.elasticsearch.action.search.TransportSearchAction.executeSearch(TransportSearchAction.java:265)
at org.elasticsearch.action.search.TransportSearchAction.doExecute(TransportSearchAction.java:188)
at org.elasticsearch.action.search.TransportSearchAction.doExecute(TransportSearchAction.java:67)
at org.elasticsearch.action.support.TransportAction$RequestFilterChain.proceed(TransportAction.java:170)
at org.elasticsearch.action.support.TransportAction.execute(TransportAction.java:142)
Uncaught error from thread [application-akka.actor.default-dispatcher-10]: javax/xml/bind/DatatypeConverter, shutting down JVM since 'akka.jvm-exit-on-fatal-error' is enabled for for ActorSystem[application]
java.lang.NoClassDefFoundError: javax/xml/bind/DatatypeConverter
at io.jsonwebtoken.impl.Base64Codec.decode(Base64Codec.java:26)
at io.jsonwebtoken.impl.DefaultJwtBuilder.signWith(DefaultJwtBuilder.java:106)
at play.api.mvc.JWTCookieDataCodec$JWTFormatter.format(Cookie.scala:718)
at play.api.mvc.JWTCookieDataCodec.encode(Cookie.scala:577)
at play.api.mvc.JWTCookieDataCodec.encode$(Cookie.scala:575)
at play.api.mvc.DefaultJWTCookieDataCodec.encode(Cookie.scala:768)
at play.api.mvc.FallbackCookieDataCodec.encode(Cookie.scala:742)
at play.api.mvc.FallbackCookieDataCodec.encode$(Cookie.scala:741)
at play.api.mvc.DefaultSessionCookieBaker.encode(Session.scala:95)
at play.api.mvc.CookieBaker.encodeAsCookie(Cookie.scala:414)
at play.api.mvc.CookieBaker.encodeAsCookie$(Cookie.scala:413)
at play.api.mvc.DefaultSessionCookieBaker.encodeAsCookie(Session.scala:95)
at play.api.mvc.Result.$anonfun$bakeCookies$2(Results.scala:297)
at scala.Option.map(Option.scala:146)
at play.api.mvc.Result.bakeCookies(Results.scala:296)
at play.core.server.common.ServerResultUtils.prepareCookies(ServerResultUtils.scala:227)
at play.core.server.AkkaHttpServer.$anonfun$executeAction$3(AkkaHttpServer.scala:302)
at akka.http.scaladsl.util.FastFuture$.strictTransform$1(FastFuture.scala:41)
at akka.http.scaladsl.util.FastFuture$.$anonfun$transformWith$3(FastFuture.scala:51)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:60)
at play.api.libs.streams.Execution$trampoline$.executeScheduled(Execution.scala:109)
at play.api.libs.streams.Execution$trampoline$.execute(Execution.scala:71)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:68)
at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:284)
at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:284)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:284)
at scala.concurrent.Promise.complete(Promise.scala:49)
at scala.concurrent.Promise.complete$(Promise.scala:48)
at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:183)
at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:38)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:60)
at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:91)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:81)
at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:91)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:43)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.ClassNotFoundException: javax.xml.bind.DatatypeConverter
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:582)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:190)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:499)
... 42 more
[info] p.c.s.AkkaHttpServer - Stopping server...
[error] a.a.ActorSystemImpl - Uncaught error from thread [application-akka.actor.default-dispatcher-10]: javax/xml/bind/DatatypeConverter, shutting down JVM since 'akka.jvm-exit-on-fatal-error' is enabled for ActorSystem[application]
java.lang.NoClassDefFoundError: javax/xml/bind/DatatypeConverter
at io.jsonwebtoken.impl.Base64Codec.decode(Base64Codec.java:26)
at io.jsonwebtoken.impl.DefaultJwtBuilder.signWith(DefaultJwtBuilder.java:106)
at play.api.mvc.JWTCookieDataCodec$JWTFormatter.format(Cookie.scala:718)
at play.api.mvc.JWTCookieDataCodec.encode(Cookie.scala:577)
at play.api.mvc.JWTCookieDataCodec.encode$(Cookie.scala:575)
at play.api.mvc.DefaultJWTCookieDataCodec.encode(Cookie.scala:768)
at play.api.mvc.FallbackCookieDataCodec.encode(Cookie.scala:742)
at play.api.mvc.FallbackCookieDataCodec.encode$(Cookie.scala:741)
at play.api.mvc.DefaultSessionCookieBaker.encode(Session.scala:95)
at play.api.mvc.CookieBaker.encodeAsCookie(Cookie.scala:414)
Caused by: java.lang.ClassNotFoundException: javax.xml.bind.DatatypeConverter
at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:582)
at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:190)
at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:499)
at io.jsonwebtoken.impl.Base64Codec.decode(Base64Codec.java:26)
at io.jsonwebtoken.impl.DefaultJwtBuilder.signWith(DefaultJwtBuilder.java:106)
at play.api.mvc.JWTCookieDataCodec$JWTFormatter.format(Cookie.scala:718)
at play.api.mvc.JWTCookieDataCodec.encode(Cookie.scala:577)
at play.api.mvc.JWTCookieDataCodec.encode$(Cookie.scala:575)
at play.api.mvc.DefaultJWTCookieDataCodec.encode(Cookie.scala:768)
at play.api.mvc.FallbackCookieDataCodec.encode(Cookie.scala:742)
[info] a.a.CoordinatedShutdown - Starting coordinated shutdown from JVM shutdown hook
Using secret: vP5JidPztfNsnT1503hxdP6iJoA6YMdqNI0gu3NUxFYKmHefCTjlvLKEXkrHq0uJ
Warning automatic elasticsearch host config fails
elasticsearch host not configured
[info] o.r.Reflections - Reflections took 167 ms to scan 5 urls, producing 116 keys and 1174 values
[info] module - Loading model class org.elastic4play.services.DBListModel
[info] module - Loading model class models.LogModel
[info] module - Loading model class org.elastic4play.services.AttachmentModel
[info] module - Loading model class connectors.cortex.models.ReportTemplateModel
[info] module - Loading model class models.AuditModel
[info] module - Loading model class models.AlertModel
[info] module - Loading model class models.CaseTemplateModel
[info] module - Loading model class models.UserModel
[info] module - Loading model class models.ArtifactModel
[info] module - Loading model class models.CaseModel
[info] module - Loading model class connectors.cortex.models.JobModel
[info] module - Loading model class models.TaskModel
[info] module - Loading model class models.DashboardModel
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by com.google.inject.internal.cglib.core.$ReflectUtils$1 (file:/opt/thehive/lib/com.google.inject.guice-4.1.0.jar) to method java.lang.ClassLoader.defineClass(java.lang.String,byte[],int,int,java.security.ProtectionDomain)
WARNING: Please consider reporting this to the maintainers of com.google.inject.internal.cglib.core.$ReflectUtils$1
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
[info] a.e.s.Slf4jLogger - Slf4jLogger started
[info] o.e.p.PluginsService - no modules loaded
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
[info] i.n.u.i.PlatformDependent - Your platform does not provide complete low-level API for accessing direct buffers reliably. Unless explicitly requested, heap buffer will always be preferred to avoid potential system instability.
[info] play.api.Play - Application started (Prod)
[info] p.c.s.AkkaHttpServer - Listening for HTTP on /0.0.0.0:9000
[info] o.e.p.PluginsService - no modules loaded
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
[info] o.e.p.PluginsService - no modules loaded
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
[info] o.e.p.PluginsService - no modules loaded
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
[info] o.e.p.PluginsService - no modules loaded
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
[info] o.e.p.PluginsService - no modules loaded
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
[info] o.e.p.PluginsService - no modules loaded
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
[info] o.e.p.PluginsService - no modules loaded
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
[info] o.e.p.PluginsService - no modules loaded
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
[info] o.e.p.PluginsService - no modules loaded
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
[info] o.e.p.PluginsService - no modules loaded
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
[info] o.e.p.PluginsService - no modules loaded
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
[info] o.e.p.PluginsService - no modules loaded
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
[info] o.e.s.MigrationSrv - Create a new empty database
[info] o.e.s.MigrationSrv - Migrate database from version 0, add operations for version 2
[info] o.e.s.MigrationSrv - Migrate database from version 0, add operations for version 3
[info] o.e.s.MigrationSrv - Migrate database from version 0, add operations for version 4
[info] o.e.s.MigrationSrv - Migrate database from version 0, add operations for version 5
[info] o.e.s.MigrationSrv - Migrate database from version 0, add operations for version 6
[info] o.e.s.MigrationSrv - Migrate database from version 0, add operations for version 7
[info] o.e.s.MigrationSrv - Migrate database from version 0, add operations for version 8
[info] o.e.s.MigrationSrv - Migrate database from version 0, add operations for version 9
[info] o.e.s.MigrationSrv - Migrate database from version 0, add operations for version 10
[info] o.e.s.MigrationSrv - Migrate database from version 0, add operations for version 11
[info] o.e.s.MigrationSrv - Migrate database from version 0, add operations for version 12
[info] o.e.s.MigrationSrv - Migrate database from version 0, add operations for version 13
[info] o.e.s.MigrationSrv - Migrating 0 entities from sequence
[info] o.e.s.MigrationSrv - Migrating 0 entities from alert
[info] o.e.s.MigrationSrv - Migrating 0 entities from audit
[error] s.DeadLetterMonitoringActor - receive dead message : DeadLetter(Request(500),Actor[akka://application/user/$a#1665843734],Actor[akka://application/user/$a#1665843734])
migrateEntity(sequence) has finished : Success(())
[info] o.e.s.MigrationSrv - Migrating 0 entities from case
[info] o.e.s.MigrationSrv - Migrating 0 entities from caseTemplate
[info] o.e.s.MigrationSrv - Migrating 0 entities from case_artifact
[info] o.e.s.MigrationSrv - Migrating 0 entities from case_artifact_job
[info] o.e.s.MigrationSrv - Migrating 0 entities from case_task
[info] o.e.s.MigrationSrv - Migrating 0 entities from case_task_log
migrateEntity(audit) has finished : Success(())
migrateEntity(alert) has finished : Success(())
migrateEntity(case_artifact) has finished : Success(())
migrateEntity(caseTemplate) has finished : Success(())
migrateEntity(case_task) has finished : Success(())
migrateEntity(case) has finished : Success(())
migrateEntity(case_artifact_job) has finished : Success(())
[info] o.e.s.MigrationSrv - Migrating 0 entities from dashboard
[error] s.DeadLetterMonitoringActor - receive dead message : DeadLetter(Request(500),Actor[akka://application/user/$b#1966013194],Actor[akka://application/user/$b#1966013194])
[error] s.DeadLetterMonitoringActor - receive dead message : DeadLetter(Request(500),Actor[akka://application/user/$c#-1602372507],Actor[akka://application/user/$c#-1602372507])
migrateEntity(case_task_log) has finished : Success(())
[error] s.DeadLetterMonitoringActor - receive dead message : DeadLetter(Request(500),Actor[akka://application/user/$d#238368017],Actor[akka://application/user/$d#238368017])
[error] s.DeadLetterMonitoringActor - receive dead message : DeadLetter(Request(500),Actor[akka://application/user/$e#744329480],Actor[akka://application/user/$e#744329480])
[error] s.DeadLetterMonitoringActor - receive dead message : DeadLetter(Request(500),Actor[akka://application/user/$f#-888545708],Actor[akka://application/user/$f#-888545708])
[error] s.DeadLetterMonitoringActor - receive dead message : DeadLetter(Request(500),Actor[akka://application/user/$g#342460513],Actor[akka://application/user/$g#342460513])
[error] s.DeadLetterMonitoringActor - receive dead message : DeadLetter(Request(500),Actor[akka://application/user/$h#-601023282],Actor[akka://application/user/$h#-601023282])
[error] s.DeadLetterMonitoringActor - receive dead message : DeadLetter(Request(500),Actor[akka://application/user/$i#-1144459192],Actor[akka://application/user/$i#-1144459192])
[info] o.e.s.MigrationSrv - Migrating 0 entities from data
migrateEntity(dashboard) has finished : Success(())
[error] s.DeadLetterMonitoringActor - receive dead message : DeadLetter(Request(500),Actor[akka://application/user/$j#1125204883],Actor[akka://application/user/$j#1125204883])
[info] o.e.s.MigrationSrv - Migrating 0 entities from dblist
migrateEntity(data) has finished : Success(())
[error] s.DeadLetterMonitoringActor - receive dead message : DeadLetter(Request(500),Actor[akka://application/user/$k#96368690],Actor[akka://application/user/$k#96368690])
[info] o.e.s.MigrationSrv - Migrating 0 entities from reportTemplate
migrateEntity(dblist) has finished : Success(())
[info] o.e.s.MigrationSrv - Migrating 0 entities from user
migrateEntity(reportTemplate) has finished : Success(())
migrateEntity(user) has finished : Success(())
[error] s.DeadLetterMonitoringActor - receive dead message : DeadLetter(Request(500),Actor[akka://application/user/$n#778214378],Actor[akka://application/user/$n#778214378])
[info] m.Migration - Retrieve MISP attribute to update alerts
[info] m.Migration - Updating observable data type list
[info] o.e.s.MigrationSrv - End of migration

`
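The `ClassNotFoundException: javax.xml.bind.DatatypeConverter` near the top of this log is characteristic of running TheHive 3.x on Java 9 or later, where the JAXB classes are no longer on the default class path; TheHive of this vintage targets Java 8. A small sketch of that version check (the parsing helper is hypothetical, for illustration only):

```python
# Hypothetical helper: decide from a `java -version` style string whether the
# JVM still puts javax.xml.bind on the default class path (true only up to Java 8).
def jaxb_on_default_classpath(java_version: str) -> bool:
    parts = java_version.split(".")
    # Pre-9 releases report as "1.8.0_252"; 9+ report as "11.0.7".
    major = int(parts[1]) if parts[0] == "1" else int(parts[0])
    return major <= 8

print(jaxb_on_default_classpath("1.8.0_252"))  # True  -> JAXB available
print(jaxb_on_default_classpath("11.0.7"))     # False -> expect this exception
```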

Install Error While Following Deb Package Install Instructions

I received this error when installing from the DEB package.

Err:11 https://dl.bintray.com/cert-bdf/debian any/main amd64 Packages
404 Not Found
Ign:12 https://dl.bintray.com/cert-bdf/debian any/main i386 Packages
Ign:13 https://dl.bintray.com/cert-bdf/debian any/main all Packages
Ign:14 https://dl.bintray.com/cert-bdf/debian any/main Translation-en_US
Ign:15 https://dl.bintray.com/cert-bdf/debian any/main Translation-en
Fetched 102 kB in 3s (34.0 kB/s)
Reading package lists... Done
W: The repository 'https://dl.bintray.com/cert-bdf/debian any Release' does not have a Release file.
N: Data from such a repository can't be authenticated and is therefore potentially dangerous to use.
N: See apt-secure(8) manpage for repository creation and user configuration details.
E: Failed to fetch https://dl.bintray.com/cert-bdf/debian/dists/any/main/binary-amd64/Packages 404 Not Found
E: Some index files failed to download. They have been ignored, or old ones used instead.

How to run Hive in the background

Hello Folks,

I have successfully provisioned my TheHive deployment. I appreciate the hard work that went into this and am happy it is so well documented, especially for a beginner Linux user like myself.
Kudos to the developers!

When I run the command to start the TheHive service and fire up the platform, it seems to take over my console session and I am unable to issue any more commands. I know the installation document mentions running everything under a new user; I followed those steps and created the 'thehive' user.

Step 5 of the install guide says that if you would like to run TheHive as a service, you should enter the commands listed. I did that, and when I enter `sudo service thehive start` I don't get any errors. However, when I browse to my socket address nothing is online. It's not until I issue `sudo bin/thehive -Dconfig.file=/etc/thehive/application.conf` that the web interface comes online.
What's the point of running TheHive as a service if the service needs an additional command to get everything up?
After I run `sudo bin/thehive -Dconfig.file=/etc/thehive/application.conf`, the console starts showing me all of the log information for the server. Can I get around this by somehow running the command as the 'thehive' user we set up in step 5?

Any help would be appreciated, sorry for being such a noob.

Thanks,
Eddie
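For reference, a systemd unit along these lines keeps TheHive running in the background under the dedicated user, with logs going to the journal instead of your console. The paths below are hypothetical and should match your install (the DEB/RPM packages ship their own unit):

```ini
# /etc/systemd/system/thehive.service -- hypothetical paths, adjust as needed
[Unit]
Description=TheHive
After=network.target elasticsearch.service

[Service]
User=thehive
Group=thehive
ExecStart=/opt/thehive/bin/thehive -Dconfig.file=/etc/thehive/application.conf
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

After `systemctl daemon-reload` and `systemctl enable --now thehive`, the process runs detached and no longer ties up a console session.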

Integration with SIEM

Hello,

How can I receive alerts from a SIEM?
Is there a tutorial or guide?

Thank you very much
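For context, TheHive's alert API (`POST /api/alert`) is the usual integration point for SIEM events: a forwarder maps each event to a JSON alert and posts it with an API key. A minimal payload sketch follows; the event field names and the `build_alert` helper are assumptions for illustration, not a fixed mapping:

```python
import json

def build_alert(event: dict) -> dict:
    """Map a hypothetical SIEM event to a TheHive alert payload."""
    return {
        "title": event["rule_name"],
        "description": event.get("details", "Forwarded from SIEM"),
        "type": "siem",
        "source": event["siem"],
        "sourceRef": event["event_id"],  # must be unique per (type, source)
        "severity": 2,
    }

alert = build_alert({"rule_name": "Brute force detected",
                     "siem": "my-siem", "event_id": "evt-42"})
body = json.dumps(alert)
# POST `body` to http://<thehive>:9000/api/alert with headers
# {"Authorization": "Bearer <API_KEY>", "Content-Type": "application/json"}
print(alert["sourceRef"])  # evt-42
```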

Database maintenance hangs during update

I am doing an install with the binaries. I am able to get to the page that says:

"Update Database"

Once I click that button, I get the following in the logs:

[info] o.e.p.PluginsService - no modules loaded
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
[info] o.e.p.PluginsService - no modules loaded
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
[info] o.e.p.PluginsService - no modules loaded
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
[info] o.e.p.PluginsService - no modules loaded
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
[info] o.e.p.PluginsService - no modules loaded
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
[info] o.e.p.PluginsService - no modules loaded
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
[info] o.e.p.PluginsService - no modules loaded
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
[info] o.e.p.PluginsService - no modules loaded
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
[info] o.e.p.PluginsService - no modules loaded
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
[info] o.e.p.PluginsService - no modules loaded
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
[info] o.e.p.PluginsService - no modules loaded
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty3Plugin]
[info] o.e.p.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
[info] o.e.s.MigrationSrv - Create a new empty database
[info] o.e.s.MigrationSrv - Migrate database from version 0, add operations for version 2
[info] o.e.s.MigrationSrv - Migrate database from version 0, add operations for version 3
[info] o.e.s.MigrationSrv - Migrate database from version 0, add operations for version 4
[info] o.e.s.MigrationSrv - Migrate database from version 0, add operations for version 5
[info] o.e.s.MigrationSrv - Migrate database from version 0, add operations for version 6
[info] o.e.s.MigrationSrv - Migrate database from version 0, add operations for version 7
[info] o.e.s.MigrationSrv - Migrate database from version 0, add operations for version 8
[info] o.e.s.MigrationSrv - Migrate database from version 0, add operations for version 9
[info] o.e.s.MigrationSrv - Migrate database from version 0, add operations for version 10
[info] o.e.s.MigrationSrv - Migrate database from version 0, add operations for version 11
[info] o.e.s.MigrationSrv - Migrate database from version 0, add operations for version 12
[error] o.e.s.MigrationSrv - Migration fail
org.elasticsearch.transport.RemoteTransportException: [9rKdW9z][127.0.0.1:9300][indices:admin/create]
Caused by: java.lang.IllegalArgumentException: Rejecting mapping update to [the_hive_12] as the final mapping would have more than 1 type: [dblist, data, case_artifact_job, caseTemplate, case_task, reportTemplate, case_task_log, alert, audit, case_artifact, user, dashboard, case]
at org.elasticsearch.index.mapper.MapperService.internalMerge(MapperService.java:494)
at org.elasticsearch.index.mapper.MapperService.internalMerge(MapperService.java:350)
at org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:274)
at org.elasticsearch.cluster.metadata.MetaDataCreateIndexService$IndexCreationTask.execute(MetaDataCreateIndexService.java:444)
at org.elasticsearch.cluster.ClusterStateUpdateTask.execute(ClusterStateUpdateTask.java:45)
at org.elasticsearch.cluster.service.MasterService.executeTasks(MasterService.java:640)
at org.elasticsearch.cluster.service.MasterService.calculateTaskOutputs(MasterService.java:270)
at org.elasticsearch.cluster.service.MasterService.runTasks(MasterService.java:195)
at org.elasticsearch.cluster.service.MasterService$Batcher.run(MasterService.java:130)
at org.elasticsearch.cluster.service.TaskBatcher.runIfNotProcessed(TaskBatcher.java:150)
[info] o.e.ErrorHandler - POST /api/maintenance/migrate returned 400
org.elasticsearch.transport.RemoteTransportException: [9rKdW9z][127.0.0.1:9300][indices:admin/create]
Caused by: java.lang.IllegalArgumentException: Rejecting mapping update to [the_hive_12] as the final mapping would have more than 1 type: [dblist, data, case_artifact_job, caseTemplate, case_task, reportTemplate, case_task_log, alert, audit, case_artifact, user, dashboard, case]
at org.elasticsearch.index.mapper.MapperService.internalMerge(MapperService.java:494)
at org.elasticsearch.index.mapper.MapperService.internalMerge(MapperService.java:350)
at org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:274)
at org.elasticsearch.cluster.metadata.MetaDataCreateIndexService$IndexCreationTask.execute(MetaDataCreateIndexService.java:444)
at org.elasticsearch.cluster.ClusterStateUpdateTask.execute(ClusterStateUpdateTask.java:45)
at org.elasticsearch.cluster.service.MasterService.executeTasks(MasterService.java:640)
at org.elasticsearch.cluster.service.MasterService.calculateTaskOutputs(MasterService.java:270)
at org.elasticsearch.cluster.service.MasterService.runTasks(MasterService.java:195)
at org.elasticsearch.cluster.service.MasterService$Batcher.run(MasterService.java:130)
at org.elasticsearch.cluster.service.TaskBatcher.runIfNotProcessed(TaskBatcher.java:150)
[info] o.e.ErrorHandler - GET /api/user/current returned 520
org.elasticsearch.transport.RemoteTransportException: [9rKdW9z][127.0.0.1:9300][indices:data/read/search]
Caused by: org.elasticsearch.index.IndexNotFoundException: no such index
at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver$WildcardExpressionResolver.indexNotFoundException(IndexNameExpressionResolver.java:678)
at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver$WildcardExpressionResolver.innerResolve(IndexNameExpressionResolver.java:630)
at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver$WildcardExpressionResolver.resolve(IndexNameExpressionResolver.java:586)
at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver.concreteIndices(IndexNameExpressionResolver.java:164)
at org.elasticsearch.cluster.metadata.IndexNameExpressionResolver.concreteIndices(IndexNameExpressionResolver.java:139)
at org.elasticsearch.action.search.TransportSearchAction.executeSearch(TransportSearchAction.java:294)
at org.elasticsearch.action.search.TransportSearchAction.lambda$doExecute$4(TransportSearchAction.java:193)
at org.elasticsearch.action.ActionListener$1.onResponse(ActionListener.java:60)
at org.elasticsearch.index.query.Rewriteable.rewriteAndFetch(Rewriteable.java:113)
at org.elasticsearch.index.query.Rewriteable.rewriteAndFetch(Rewriteable.java:86)
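The "more than 1 type" rejection above is the behavior of Elasticsearch 6.x, which dropped multi-type indices; TheHive at this point required Elasticsearch 5.x, so checking which major version is answering on port 9200 is the first step. A sketch of that check (the parsing helper is illustrative; the banner is what an ES node returns on `GET http://localhost:9200/`):

```python
import json

def es_major_version(banner_json: str) -> int:
    """Extract the major version from the JSON banner an ES node returns on GET /."""
    return int(json.loads(banner_json)["version"]["number"].split(".")[0])

# banner = urllib.request.urlopen("http://127.0.0.1:9200/").read().decode()
sample = '{"version": {"number": "6.2.4"}}'
print(es_major_version(sample))  # 6 -> a 6.x node rejects multi-type mappings
```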

While deleting cases, it takes too much time to reflect in the GUI

Hi Team,

While deleting cases in TheHive, it takes a long time for the deletion to be reflected in the GUI, even though the notification bar shows that the case has already been deleted.

See the screenshot below, which shows that both cases were deleted successfully through the API but are still displayed in my TheHive GUI.

(screenshot: deleted)

Reverse proxy SSL

I cannot find it in your documentation.
I am trying to set up an SSL reverse proxy in front of the application.
The schema is as follows:
internet (port 443) -> reverse proxy (port 9443) -> the hive server (port 9443)

I am looking for a configuration for an RHEL httpd server.

Do you have an example configuration to share?

Thank you
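A minimal httpd vhost along these lines usually works: TLS is terminated at the proxy and traffic is forwarded in clear to TheHive. Hostnames, ports, and certificate paths below are placeholders, and `mod_ssl`, `mod_proxy`, and `mod_proxy_http` must be loaded:

```apache
# Hypothetical sketch for RHEL httpd; adjust the listen port (443 or 9443)
# to match your schema, and the backend host/port to your TheHive server.
<VirtualHost *:443>
    ServerName thehive.example.com
    SSLEngine on
    SSLCertificateFile    /etc/pki/tls/certs/thehive.crt
    SSLCertificateKeyFile /etc/pki/tls/private/thehive.key

    ProxyPreserveHost On
    ProxyPass        / http://thehive-server:9443/
    ProxyPassReverse / http://thehive-server:9443/
</VirtualHost>
```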

npm dependencies

I am building from source on CentOS 7 (actually, I am writing a bash script for it) and got the following output:

[error] npm WARN deprecated [email protected]: Deprecated
[error] npm WARN deprecated [email protected]: CoffeeScript on NPM has moved to "coffeescript" (no hyphen)

After some research, it seems this can be solved by installing semver:

npm install -g semver bower grunt-cli

Could it be that this was forgotten in the documentation?

TheHive Docker Install - Cannot connect to elasticsearch.

Howdy,

Trying to test TheHive, I am installing via Docker. I have installed both Docker CE and Docker Compose, and TheHive runs successfully and I can get to the login page. However, it does not prompt me to create an admin account, and it seems to be failing to connect to Elasticsearch, as I am unable to list any indices in a separate PuTTY window.

Any help will be greatly appreciated. Thanks!
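For comparison, a compose file in the shape below is the usual pattern: an Elasticsearch 5.x service plus TheHive, which reaches the database by the service name on the compose network. All image tags and settings here are illustrative and should be checked against the official docker-compose file:

```yaml
# Hypothetical sketch -- verify images and options against the official compose file
version: "2"
services:
  elasticsearch:
    image: elasticsearch:5.6
    environment:
      - http.host=0.0.0.0
      - script.inline=true
  thehive:
    image: certbdf/thehive:latest
    depends_on:
      - elasticsearch
    ports:
      - "9000:9000"
```

If TheHive reaches the login page but never offers to create the admin account, the first thing to verify is that the `thehive` container can resolve and reach `elasticsearch:9200`.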

Auditing the time spent per task

I am struggling to create a dashboard chart or otherwise aggregate the time spent per task so that I can audit my team members.

Is there a way of pulling the timing information?
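One workaround is to pull tasks over the API and derive elapsed time from their `startDate`/`endDate` timestamps (epoch milliseconds). Note that this measures wall-clock time between starting and closing a task, not effort actually spent; the helper below is an illustrative sketch:

```python
def task_duration_hours(task: dict):
    """Wall-clock hours between task start and completion (epoch ms), if both are set."""
    start, end = task.get("startDate"), task.get("endDate")
    if start is None or end is None:
        return None  # task not started or not yet closed
    return (end - start) / 3_600_000  # ms per hour

sample = {"title": "Triage", "owner": "alice",
          "startDate": 1_500_000_000_000, "endDate": 1_500_007_200_000}
print(task_duration_hours(sample))  # 2.0
```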

ElasticSearch issue using docker-compose.yml file

Elasticsearch keeps exiting with error 78.

Any ideas?

elasticsearch_1 | [2017-11-18T18:32:08,920][INFO ][o.e.x.m.j.p.l.CppLogMessageHandler] [controller/47] [Main.cc@128] controller (64 bit): Version 5.5.2 (Build 0d83940bb9d682) Copyright (c) 2017 Elasticsearch BV
elasticsearch_1 | [2017-11-18T18:32:09,459][INFO ][o.e.d.DiscoveryModule ] [5jyNkqi] using discovery type [zen]
thehive_1 | [info] c.c.s.CortexClient - new Cortex(cortex1, http://172.18.0.3:9000, ) Basic Auth enabled: false
thehive_1 | [info] c.c.s.CortexSrv - Search for unfinished job ...
elasticsearch_1 | [2017-11-18T18:32:17,894][INFO ][o.e.n.Node ] initialized
elasticsearch_1 | [2017-11-18T18:32:17,900][INFO ][o.e.n.Node ] [5jyNkqi] starting ...
elasticsearch_1 | [2017-11-18T18:32:19,679][INFO ][o.e.t.TransportService ] [5jyNkqi] publish_address {172.18.0.2:9300}, bound_addresses {[::]:9300}
elasticsearch_1 | [2017-11-18T18:32:19,804][INFO ][o.e.b.BootstrapChecks ] [5jyNkqi] bound or publishing to a non-loopback or non-link-local address, enforcing bootstrap checks
elasticsearch_1 | ERROR: [1] bootstrap checks failed
elasticsearch_1 | [1]: max virtual memory areas vm.max_map_count [65530] is too low, increase to at least [262144]
elasticsearch_1 | [2017-11-18T18:32:19,899][INFO ][o.e.n.Node ] [5jyNkqi] stopping ...
elasticsearch_1 | [2017-11-18T18:32:20,066][INFO ][o.e.n.Node ] [5jyNkqi] stopped
elasticsearch_1 | [2017-11-18T18:32:20,074][INFO ][o.e.n.Node ] [5jyNkqi] closing ...
elasticsearch_1 | [2017-11-18T18:32:20,240][INFO ][o.e.n.Node ] [5jyNkqi] closed
elasticsearch_1 | [2017-11-18T18:32:20,280][INFO ][o.e.x.m.j.p.NativeController] Native controller process has stopped - no new native processes can be started
hive_elasticsearch_1 exited with code 78
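Exit code 78 here is Elasticsearch failing its bootstrap checks; the log itself states that `vm.max_map_count` is 65530 and must be at least 262144. This is a kernel setting that must be raised on the Docker host, not inside the container:

```
# On the Docker host: raise the limit immediately...
#   sudo sysctl -w vm.max_map_count=262144
# ...and persist it in /etc/sysctl.conf (or a drop-in under /etc/sysctl.d/):
vm.max_map_count = 262144
```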

Stats API

Hi, I'm working on using the stats API to pull statistics from TheHive and mirror the graphs used in the dashboards. I was hoping to use that to create a client portal so each client would only be able to see alert and case statistics relevant to their organization. However, I've been struggling to figure out how to use the stats API. Could some additional documentation be added to the docs?
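For what it's worth, the most reliable way to learn the format is to watch the requests TheHive's own dashboards issue in the browser's network tab. The body shape below (for something like `POST /api/case/_stats`) is an assumption modeled on those requests, with an illustrative tag filter per customer; verify it against your instance:

```python
import json

def case_stats_body(org_tag: str) -> dict:
    """Assumed shape of a case-stats request body: count cases per status,
    filtered to one customer's tag. Field names are illustrative -- confirm
    them against the requests the built-in dashboards send."""
    return {
        "query": {"_and": [{"tags": org_tag}]},
        "stats": [{
            "_agg": "field",
            "_field": "status",
            "_select": [{"_agg": "count"}],
        }],
    }

body = case_stats_body("customer-a")
print(body["stats"][0]["_field"])  # status
```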

Request: custom severity levels

Hello,

Could you please add the possibility to modify or add case severity levels?
For example, in my company we need a minimum of four severity levels for SOC cases.

thanks for your work :)
