
spring-cloud / spring-cloud-dataflow


A microservices-based toolkit for streaming and batch data processing in Cloud Foundry and Kubernetes

Home Page: https://dataflow.spring.io

License: Apache License 2.0

Java 96.79% Ruby 0.01% XSLT 0.01% Dockerfile 0.02% Vim Snippet 0.01% Shell 2.21% JavaScript 0.01% TypeScript 0.71% Starlark 0.21% Python 0.01% Groovy 0.03%
microservices-architecture orchestration stream-processing batch-processing predictive-analytics cloud-native datapipelines

spring-cloud-dataflow's Introduction


Spring Cloud Data Flow is a microservices-based toolkit for building streaming and batch data processing pipelines in Cloud Foundry and Kubernetes.

Data processing pipelines consist of Spring Boot apps, built using the Spring Cloud Stream or Spring Cloud Task microservice frameworks.

This makes Spring Cloud Data Flow ideal for a range of data processing use cases, from import/export to event streaming and predictive analytics.


Components

Architecture: The Spring Cloud Data Flow Server is a Spring Boot application that provides a RESTful API and REST clients (Shell, Dashboard, Java DSL). A single Spring Cloud Data Flow installation can orchestrate the deployment of streams and tasks to Local, Cloud Foundry, and Kubernetes.

Familiarize yourself with the Spring Cloud Data Flow architecture and feature capabilities.

Deployer SPI: A Service Provider Interface (SPI) is defined in the Spring Cloud Deployer project. The Deployer SPI provides an abstraction layer for deploying the apps for a given streaming or batch data pipeline and managing the application lifecycle.

Spring Cloud Deployer Implementations:

Domain Model: The Spring Cloud Data Flow domain module includes the concept of a stream that is a composition of Spring Cloud Stream applications in a linear data pipeline from a source to a sink, optionally including processor application(s) in between. The domain also includes the concept of a task, which may be any process that does not run indefinitely, including Spring Batch jobs.
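
The linear source-to-sink composition described above can be pictured as a fold over the processors. The following is a toy illustration only; `LinearStream` and its method are hypothetical names, not SCDF domain classes:

```java
import java.util.List;
import java.util.function.Function;

// Toy sketch of the domain concept: a linear stream in which a payload flows
// from a source through zero or more processors to a sink.
// Illustrative names only; not part of the Spring Cloud Data Flow codebase.
public class LinearStream {
    // Applies each processor in order to a payload emitted by the source;
    // the final value is what the sink would receive.
    public static Object run(Object sourcePayload, List<Function<Object, Object>> processors) {
        Object payload = sourcePayload;
        for (Function<Object, Object> processor : processors) {
            payload = processor.apply(payload);
        }
        return payload;
    }

    public static void main(String[] args) {
        Object out = run("hello", List.of(
                p -> p.toString().toUpperCase(),   // processor 1
                p -> p + "!"                       // processor 2
        ));
        System.out.println(out); // the sink receives "HELLO!"
    }
}
```

With no processors the source feeds the sink directly, which matches the "optionally including processor application(s)" wording above.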

Application Registry: The App Registry maintains the metadata of the catalog of reusable applications. For example, if relying on Maven coordinates, an application URI would be of the format: maven://<groupId>:<artifactId>:<version>.
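
A hedged sketch of how such a URI breaks down into coordinates (illustrative only; `MavenUriParser` is a hypothetical helper, and real Maven coordinates may additionally carry packaging and classifier segments not handled here):

```java
// Hypothetical sketch: splitting a Maven-style app URI of the basic form
// maven://<groupId>:<artifactId>:<version> into its three coordinates.
// Not part of the SCDF codebase; names are illustrative.
public class MavenUriParser {
    static final String SCHEME = "maven://";

    // Returns {groupId, artifactId, version}, or throws on malformed input.
    public static String[] parse(String uri) {
        if (uri == null || !uri.startsWith(SCHEME)) {
            throw new IllegalArgumentException("expected a maven:// URI: " + uri);
        }
        String[] parts = uri.substring(SCHEME.length()).split(":");
        if (parts.length != 3) {
            throw new IllegalArgumentException("expected groupId:artifactId:version");
        }
        return parts;
    }

    public static void main(String[] args) {
        String[] c = parse("maven://org.springframework.cloud.stream.app:http-source-rabbit:1.0.0");
        System.out.println(c[0] + " / " + c[1] + " / " + c[2]);
    }
}
```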

Shell/CLI: The Shell connects to the Spring Cloud Data Flow Server's REST API and supports a DSL that simplifies the process of defining a stream or task and managing its lifecycle.


Building

Clone the repo and type

$ ./mvnw -s .settings.xml clean install 

Looking for more information? Follow this link.

Building on Windows

When using Git on Windows to check out the project, it is important to handle line-endings correctly during checkouts. By default Git will change the line-endings during checkout to CRLF. This is, however, not desired for Spring Cloud Data Flow as this may lead to test failures under Windows.

Therefore, please ensure that you set Git property core.autocrlf to false, e.g. using: $ git config core.autocrlf false. For more information please refer to the Git documentation, Formatting and Whitespace.


Running Locally w/ Oracle

By default, the Dataflow server jar does not include the Oracle database driver dependency. If you want to use Oracle for development/testing when running locally, you can specify the local-dev-oracle Maven profile when building. The following command will include the Oracle driver dependency in the jar:

$ ./mvnw -s .settings.xml clean package -Plocal-dev-oracle

You can follow the steps in the Oracle on Mac ARM64 Wiki to run Oracle XE locally in Docker with Dataflow pointing at it.

NOTE: If you are not running on Mac ARM64, skip the steps related to Homebrew and Colima.


Running Locally w/ Microsoft SQL Server

By default, the Dataflow server jar does not include the MSSQL database driver dependency. If you want to use MSSQL for development/testing when running locally, you can specify the local-dev-mssql Maven profile when building. The following command will include the MSSQL driver dependency in the jar:

$ ./mvnw -s .settings.xml clean package -Plocal-dev-mssql

You can follow the steps in the MSSQL on Mac ARM64 Wiki to run MSSQL locally in Docker with Dataflow pointing at it.

NOTE: If you are not running on Mac ARM64, skip the steps related to Homebrew and Colima.


Running Locally w/ IBM DB2

By default, the Dataflow server jar does not include the DB2 database driver dependency. If you want to use DB2 for development/testing when running locally, you can specify the local-dev-db2 Maven profile when building. The following command will include the DB2 driver dependency in the jar:

$ ./mvnw -s .settings.xml clean package -Plocal-dev-db2

You can follow the steps in the DB2 on Mac ARM64 Wiki to run DB2 locally in Docker with Dataflow pointing at it.

NOTE: If you are not running on Mac ARM64, skip the steps related to Homebrew and Colima.


Contributing

We welcome contributions! See the CONTRIBUTING guide for details.


Code formatting guidelines

  • The directory ./src/eclipse has two files for use with code formatting, eclipse-code-formatter.xml for the majority of the code formatting rules and eclipse.importorder to order the import statements.

  • In Eclipse you import these files by navigating to Window -> Preferences and then the menu items Preferences > Java > Code Style > Formatter and Preferences > Java > Code Style > Organize Imports respectively.

  • In IntelliJ, install the Eclipse Code Formatter plugin. You can find it by searching "Browse Repositories" under the plugin options within IntelliJ (once installed you will need to restart IntelliJ for it to take effect). Then navigate to IntelliJ IDEA > Preferences and select Eclipse Code Formatter. Select the eclipse-code-formatter.xml file for the field Eclipse Java Formatter config file and the eclipse.importorder file for the field Import order. Enable the Eclipse code formatter by clicking Use the Eclipse code formatter, then click OK. NOTE: If you configure the Eclipse Code Formatter from File > Other Settings > Default Settings, it will set this policy across all of your IntelliJ projects.

License

Spring Cloud Data Flow is Open Source software released under the Apache 2.0 license.


spring-cloud-dataflow's Issues

UI: Study PUI theming scope

As a user, I'd like to use the admin-ui and flo with consistent look and feel.

Acceptance:

  • Scope of the effort is documented

UI: Add SPI type and version to about section

As a user, I'd like to see the version and SPI type in the about section, so I can confirm which build of admin-ui I'm currently using.

Acceptance:
SPI type and the corresponding version is listed in about section

Status information about generated endpoint for the deployed stream

How is the user of the shell supposed to understand where to send messages to and receive them from?

Let's assume I have this stream:

stream create --name foobar-extreme-super-long-string --definition "http --server.port=7788 | filter --script=foobar.groovy | elasticsearch-sink --server.port=8192"

(Note: this might not be a correct string but you get the idea ;))

Let's assume now that the deployer is deploying this on the cloud (a Kubernetes cluster or Cloud Foundry). How is the user supposed to know how to send messages to the stream and receive them back?

I would assume the following:
For http source: http://foobar-super-long-string-http.mydomain.org:7788/message
For ES transport: foobar-super-long-string-elasticsearch-sink.mydomain.org:8192

My questions are:

  1. How would the user know how "the deployer" deduces the string?
  2. Is there a way for the user to query that in the Admin (from a user's perspective)? If not, what is the thought process on how and where to add that?
  3. Let's say some deployers have restrictions on the names. For example, Kubernetes has the concept of a "service" that makes it possible to expose URLs like the ones I show above. However, foobar-super-long-string-http needs to conform to an RFC 952 DNS label (24 characters, etc.). This means that if the user passes in a very long stream name, it will fail to deploy. It gets even worse if the deployer prefixes or suffixes the ID with the name of the stream (group) and the label, as the string may easily exceed 24 characters (just assume the module name is very long).
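
To illustrate point 3, a deployer could sanitize and truncate IDs before creating the service. A hypothetical sketch (not SCDF or Kubernetes code; the class and the 24-character limit simply follow the RFC 952 figure quoted in the issue):

```java
// Illustrative sketch: turning a deployment ID into something usable as a
// DNS label under the naming restriction discussed above.
// Hypothetical helper; not part of any deployer implementation.
public class DnsLabelSanitizer {
    static final int MAX_LABEL_LENGTH = 24;

    public static String sanitize(String name) {
        // lower-case, replace disallowed characters, then truncate
        String label = name.toLowerCase().replaceAll("[^a-z0-9-]", "-");
        if (label.length() > MAX_LABEL_LENGTH) {
            label = label.substring(0, MAX_LABEL_LENGTH);
        }
        // a label may not begin or end with a hyphen
        return label.replaceAll("^-+|-+$", "");
    }

    public static void main(String[] args) {
        System.out.println(sanitize("foobar-extreme-super-long-string-http"));
    }
}
```

Note that truncation alone can produce collisions between similarly named streams, which is part of why the issue asks where such derived names should be queryable.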

module info fails on library

Running module info on a defined library fails to resolve artifact:

org.eclipse.aether.resolution.ArtifactResolutionException: Could not find artifact org.springframework.cloud.dataflow.library:hadoop-phd30:pom:exec:1.0.0.BUILD-SNAPSHOT in repository 1 (https://repo.spring.io/libs-snapshot)

It looks like it automatically appends exec to the coordinates even though they were defined without it:

module register --type library --name hadoop-phd30 --coordinates 'org.springframework.cloud.dataflow.library:hadoop-phd30:pom:1.0.0.BUILD-SNAPSHOT'

Explore non-maven solution to include "domain" objects

As a user, I'd want to use a non-Maven solution to include "entity domain objects" that are typically required while using sinks like Cassandra or MongoDB.

Currently, the "domain objects" are included in the stream definition via the --includes attribute. The file is expected to be built, bundled, and hosted as a Maven artifact, and referenced in the stream definition as:

--includes='org.springframework.cloud.stream.demo:example:1.0.0.BUILD-SNAPSHOT'

UI: Replace XD with Data Flow

As a developer, I'd like to replace all references of Spring XD with Spring Cloud Data Flow.

Acceptance:
Name changes reflect in admin-ui

Add "stream update" command in shell

As a user, I'd like to have direct shell commands to scale up/down a given module instance, so I can avoid SPI-specific CLI commands that need to run outside of data flow.

Acceptance:

  • stream update ticktock --properties "app.log.count=3" as a new command is available for use; where:
    • --properties relates to general set of properties applicable for the stream definition
    • app.log.count relates to instance-count representation for the log application
  • Upon execution of this command, I'm able to scale the log application to 3 instances

List task executions in shell

As a user, I'd want to view the task executions via the SCDF shell.

Acceptance:

  • New shell command: task execution list is added
  • Invoking shell command provides the following details
    • name
    • id
    • start time
    • end time
    • exit code
  • Tests included to evaluate REST-API contract

Add "runtime modules" shell command

As an s-c-d user, I'd like to have runtime modules as a shell command, so I can use it to list the details about the modules.

Acceptance:
Using runtime modules command in shell lists SPI agnostic module info

Modules/SCD Deployers: How to provide "cloud connector" support

Currently, s-c-s modules all come with baked-in support for multiple cloud binding technologies:

<!-- Lattice core dependency that activates cloud,lattice profiles when running on Lattice -->
<dependency>
  <groupId>org.springframework.cloud</groupId>
  <artifactId>spring-cloud-lattice-core</artifactId>
  <version>${spring-cloud-lattice.version}</version>
  <optional>true</optional>
</dependency>

<!-- Cloud connector dependencies -->
<!-- Lattice connector dependency to create services info from lattice -->
<dependency>
  <groupId>org.springframework.cloud</groupId>
  <artifactId>spring-cloud-lattice-connector</artifactId>
  <version>${spring-cloud-lattice.version}</version>
  <optional>true</optional>
</dependency>

<!-- CF connector dependency to create services info from CF -->
<dependency>
  <groupId>org.springframework.cloud</groupId>
  <artifactId>spring-cloud-cloudfoundry-connector</artifactId>
  <optional>true</optional>
</dependency>

<!-- dependency to connect to detected cloud services -->
<dependency>
  <groupId>org.springframework.cloud</groupId>
  <artifactId>spring-cloud-spring-service-connector</artifactId>
  <optional>true</optional>
</dependency>

Should the deployers add those at runtime instead?

Passing properties for undeploying a stream

Background: I'm writing a Kubernetes deployer. One of the concepts of Kubernetes is the namespace (a logical separation of the cluster). This can be helpful to model various "organizations" on a cluster.

Now, I want to leverage this and deploy streams into different namespaces. Deployment properties come in handy here.

dataflow:>stream create --name deployment1 --definition "http --server.port=9999 | log"
Created new stream 'deployment1'
dataflow:>stream deploy --name deployment1 --properties "module.*.namespace=myorg"
Deployed stream 'deployment1'

This will now be deployed on Kubernetes in the namespace myorg. When I'm trying to undeploy, though, this information is lost because I have no way of defining extra properties on undeploy.

What is the best way to deal with this? The idea I had was to enable the deployer SPI to add an extra bag of metadata to the ModuleDeploymentId that is returned from the deploy(..) function. Currently this return value is ignored in the admin, but I think it should be persisted. This way I could return an instance of the ModuleDeploymentId as follows:

@Override
public ModuleDeploymentId deploy(ModuleDeploymentRequest request) {

  ModuleDeploymentId id = ModuleDeploymentId.fromModuleDefinition(request.getDefinition());

  // do my deploy to Kubernetes into the right namespace

  id.addMetadataProperty("namespace", request.getDeploymentProperties().get("namespace"));

  return id;
}       

Then later in the undeploy:

@Override
public void undeploy(ModuleDeploymentId id) {
  logger.info("Undeploying module: {}", id);

  // undeploy in kubernetes
  kubeClient.replicationControllers()
    .inNamespace(id.getMetadataProperty("namespace"))
    .withName(id.toString())
    .delete();
  // ...
}

Or maybe the deployment properties should be persisted as part of the StreamDefinition class directly in the StreamController in the admin? Or a separate repository implementation for the Kubernetes deployer only.

Does this make sense? Any suggestions? I can submit a PR for this if desired and useful.

TAB completion is impractical, with 100s of options

As a user, I'm trying to use the module info command to look up the supported options for each module; however, I'm seeing hundreds of options listed instead. The vast majority of them are irrelevant to the module itself and are Boot-specific overrides.

Acceptance:
When I look up module info http:source, I should only see http-module-specific overrides such as:

  Option Name            Description                                                                                                                                       Default                                                              Type
  ---------------------  ------------------------------------------------------------------------------------------------------------------------------------------------  -------------------------------------------------------------------  ---------------------------------
  https                  true for https://                                                                                                                                 false                                                                boolean
  maxContentLength       the maximum allowed content length                                                                                                                1048576                                                              int
  messageConverterClass  the name of a custom MessageConverter class, to convert HttpRequest to Message; must have a constructor with a 'MessageBuilderFactory' parameter  org.springframework.integration.x.http.NettyInboundMessageConverter  java.lang.String
  port                   the port to listen to                                                                                                                             9000                                                                 int
  sslPropertiesLocation  location (resource) of properties containing the location of the pkcs12 keyStore and pass phrase                                                  classpath:httpSSL.properties                                         java.lang.String
  outputType             how this module should emit messages it produces                                                                                                  <none>                                                               org.springframework.util.MimeType

Add direct binding support for DSL and Shell

As an s-c-d user, I'd like to directly bind stream modules, so I can allow co-located contiguous modules to communicate directly instead of over a remote transport.

Acceptance:

  • DSL / Shell support to directly bind modules is included in the deployment manifest
  • Documentation on direct binding with samples is available in the reference guide

Document 'partitioning' through deployment properties

As an s-c-d user, I'd like to have documentation on data partitioning, so I could use it as reference to build distributed streaming pipelines.

Acceptance:
Deployment properties and the partition awareness (e.g: partitionIndex=0..N-1) is explained with an example
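
The partitionIndex=0..N-1 idea above can be illustrated with the usual hash-modulo routing. A minimal sketch with hypothetical names (not the Spring Cloud Stream implementation, which is driven by deployment properties such as a partition key expression):

```java
// Minimal sketch of partition awareness: a producer routes each message to
// partitionIndex = hash(key) mod partitionCount, so each consumer instance
// with a fixed partitionIndex in 0..N-1 sees a stable subset of keys.
// Illustrative names only.
public class PartitionRouter {
    public static int partitionIndex(Object key, int partitionCount) {
        // Math.floorMod keeps the index non-negative even for negative hash codes
        return Math.floorMod(key.hashCode(), partitionCount);
    }

    public static void main(String[] args) {
        for (String key : new String[] {"alpha", "beta", "gamma"}) {
            System.out.println(key + " -> partition " + partitionIndex(key, 3));
        }
    }
}
```

The important property for documentation purposes is determinism: the same key always lands on the same partition index, so stateful downstream consumers see a consistent key subset.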

Admin connection refused

I'm deploying a few streams in the shell. I keep seeing this exception. Have not identified the root cause yet. I'm running the admin in a docker container (but this might not be relevant).

2015-10-22 12:35:30.158 ERROR 7 --- [nio-9393-exec-3] o.a.c.c.C.[Tomcat].[localhost]           : Exception Processing ErrorPage[errorCode=0, location=/error]
java.io.IOException: Broken pipe
    at sun.nio.ch.FileDispatcherImpl.write0(Native Method) ~[na:1.8.0_60]
    at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) ~[na:1.8.0_60]
    at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) ~[na:1.8.0_60]
    at sun.nio.ch.IOUtil.write(IOUtil.java:65) ~[na:1.8.0_60]
    at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:471) ~[na:1.8.0_60]
    at org.apache.tomcat.util.net.NioChannel.write(NioChannel.java:127) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.tomcat.util.net.NioBlockingSelector.write(NioBlockingSelector.java:101) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.tomcat.util.net.NioSelectorPool.write(NioSelectorPool.java:172) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.coyote.http11.InternalNioOutputBuffer.writeToSocket(InternalNioOutputBuffer.java:139) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.coyote.http11.InternalNioOutputBuffer.flushBuffer(InternalNioOutputBuffer.java:244) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.coyote.http11.InternalNioOutputBuffer.addToBB(InternalNioOutputBuffer.java:189) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.coyote.http11.InternalNioOutputBuffer.access$000(InternalNioOutputBuffer.java:41) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.coyote.http11.InternalNioOutputBuffer$SocketOutputBuffer.doWrite(InternalNioOutputBuffer.java:320) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.coyote.http11.filters.ChunkedOutputFilter.doWrite(ChunkedOutputFilter.java:116) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.coyote.http11.AbstractOutputBuffer.doWrite(AbstractOutputBuffer.java:256) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.coyote.Response.doWrite(Response.java:503) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.catalina.connector.OutputBuffer.realWriteBytes(OutputBuffer.java:388) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    ... 88 common frames omitted
Wrapped by: org.apache.catalina.connector.ClientAbortException: java.io.IOException: Broken pipe
    at org.apache.catalina.connector.OutputBuffer.realWriteBytes(OutputBuffer.java:393) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.tomcat.util.buf.ByteChunk.flushBuffer(ByteChunk.java:426) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.catalina.connector.OutputBuffer.doFlush(OutputBuffer.java:342) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.catalina.connector.OutputBuffer.flush(OutputBuffer.java:317) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.catalina.connector.CoyoteOutputStream.flush(CoyoteOutputStream.java:110) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at com.fasterxml.jackson.core.json.UTF8JsonGenerator.flush(UTF8JsonGenerator.java:1022) ~[jackson-core-2.6.1.jar!/:2.6.1]
    at com.fasterxml.jackson.databind.ObjectWriter.writeValue(ObjectWriter.java:891) ~[jackson-databind-2.6.1.jar!/:2.6.1]
    at org.springframework.http.converter.json.AbstractJackson2HttpMessageConverter.writeInternal(AbstractJackson2HttpMessageConverter.java:264) ~[spring-web-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
    at org.springframework.http.converter.AbstractGenericHttpMessageConverter.write(AbstractGenericHttpMessageConverter.java:100) ~[spring-web-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
    at org.springframework.web.servlet.mvc.method.annotation.AbstractMessageConverterMethodProcessor.writeWithMessageConverters(AbstractMessageConverterMethodProcessor.java:167) ~[spring-webmvc-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
    at org.springframework.web.servlet.mvc.method.annotation.HttpEntityMethodProcessor.handleReturnValue(HttpEntityMethodProcessor.java:185) ~[spring-webmvc-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
    at org.springframework.web.method.support.HandlerMethodReturnValueHandlerComposite.handleReturnValue(HandlerMethodReturnValueHandlerComposite.java:80) ~[spring-web-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
    at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:127) ~[spring-webmvc-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
    at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:806) ~[spring-webmvc-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
    at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:729) ~[spring-webmvc-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
    at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85) ~[spring-webmvc-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
    at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:959) ~[spring-webmvc-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
    at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:893) ~[spring-webmvc-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
    at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970) ~[spring-webmvc-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
    at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:861) ~[spring-webmvc-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:622) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846) ~[spring-webmvc-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:729) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:291) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:101) ~[spring-web-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:101) ~[spring-web-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:316) ~[spring-security-web-4.0.2.RELEASE.jar!/:4.0.2.RELEASE]
    at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:115) ~[spring-security-web-4.0.2.RELEASE.jar!/:4.0.2.RELEASE]
    at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:90) ~[spring-security-web-4.0.2.RELEASE.jar!/:4.0.2.RELEASE]
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330) ~[spring-security-web-4.0.2.RELEASE.jar!/:4.0.2.RELEASE]
    at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:114) ~[spring-security-web-4.0.2.RELEASE.jar!/:4.0.2.RELEASE]
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330) ~[spring-security-web-4.0.2.RELEASE.jar!/:4.0.2.RELEASE]
    at org.springframework.security.web.session.SessionManagementFilter.doFilter(SessionManagementFilter.java:69) ~[spring-security-web-4.0.2.RELEASE.jar!/:4.0.2.RELEASE]
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330) ~[spring-security-web-4.0.2.RELEASE.jar!/:4.0.2.RELEASE]
    at org.springframework.security.web.authentication.AnonymousAuthenticationFilter.doFilter(AnonymousAuthenticationFilter.java:111) ~[spring-security-web-4.0.2.RELEASE.jar!/:4.0.2.RELEASE]
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330) ~[spring-security-web-4.0.2.RELEASE.jar!/:4.0.2.RELEASE]
    at org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:169) ~[spring-security-web-4.0.2.RELEASE.jar!/:4.0.2.RELEASE]
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330) ~[spring-security-web-4.0.2.RELEASE.jar!/:4.0.2.RELEASE]
    at org.springframework.security.web.savedrequest.RequestCacheAwareFilter.doFilter(RequestCacheAwareFilter.java:48) ~[spring-security-web-4.0.2.RELEASE.jar!/:4.0.2.RELEASE]
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330) ~[spring-security-web-4.0.2.RELEASE.jar!/:4.0.2.RELEASE]
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:101) ~[spring-web-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330) ~[spring-security-web-4.0.2.RELEASE.jar!/:4.0.2.RELEASE]
    at org.springframework.security.web.authentication.logout.LogoutFilter.doFilter(LogoutFilter.java:120) ~[spring-security-web-4.0.2.RELEASE.jar!/:4.0.2.RELEASE]
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330) ~[spring-security-web-4.0.2.RELEASE.jar!/:4.0.2.RELEASE]
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:101) ~[spring-web-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330) ~[spring-security-web-4.0.2.RELEASE.jar!/:4.0.2.RELEASE]
    at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:91) ~[spring-security-web-4.0.2.RELEASE.jar!/:4.0.2.RELEASE]
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330) ~[spring-security-web-4.0.2.RELEASE.jar!/:4.0.2.RELEASE]
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:101) ~[spring-web-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
    at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:330) ~[spring-security-web-4.0.2.RELEASE.jar!/:4.0.2.RELEASE]
    at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:213) ~[spring-security-web-4.0.2.RELEASE.jar!/:4.0.2.RELEASE]
    at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:176) ~[spring-security-web-4.0.2.RELEASE.jar!/:4.0.2.RELEASE]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:101) ~[spring-web-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:101) ~[spring-web-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:101) ~[spring-web-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:101) ~[spring-web-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.catalina.core.ApplicationDispatcher.invoke(ApplicationDispatcher.java:721) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.catalina.core.ApplicationDispatcher.doInclude(ApplicationDispatcher.java:584) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.catalina.core.ApplicationDispatcher.include(ApplicationDispatcher.java:523) ~[tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.catalina.core.StandardHostValve.custom(StandardHostValve.java:433) [tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.catalina.core.StandardHostValve.status(StandardHostValve.java:305) [tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.catalina.core.StandardHostValve.throwable(StandardHostValve.java:399) [tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:180) [tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79) [tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88) [tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:518) [tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1091) [tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:673) [tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1526) [tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1482) [tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_60]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_60]
    at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) [tomcat-embed-core-8.0.26.jar!/:8.0.26]
    at java.lang.Thread.run(Thread.java:745) [na:1.8.0_60]

Add support to load Hadoop distribution of choice

As a s-c-d user, I'd like the option to choose a Hadoop distribution, so I can load the right Hadoop libraries on the classpath.

Acceptance:

  • --hadoopDistro is an available option to bootstrap the admin with the right Hadoop distribution libraries (will likely be --include hadoop-phd30 when defining a module)
  • create BOMs for the major distributions we would want to support
  • Integration tests included

UI: Job modules page wouldn't load

As a user, I'm trying to load the Job - Modules page in the admin-ui, but I'm seeing exceptions in the console and the page won't load.

Failed to convert value of type 'java.lang.String' to required type 'org.springframework.cloud.dataflow.core.ArtifactType'; nested exception is org.springframework.core.convert.ConversionFailedException: Failed to convert from type java.lang.String to type @org.springframework.web.bind.annotation.RequestParam org.springframework.cloud.dataflow.core.ArtifactType for value 'job'; nested exception is java.lang.IllegalArgumentException: No enum constant org.springframework.cloud.dataflow.core.ArtifactType.job 

Random NPE when hitting the admin in parallel

I'm hitting the admin with some requests (stream create, destroy all, query) in parallel. Keeping this for the record and will update as I see more.

2015-10-23 17:04:55.426  WARN 34283 --- [io-9393-exec-10] .m.m.a.ExceptionHandlerExceptionResolver : Handler execution resulted in exception: null
2015-10-23 17:05:15.613 ERROR 34283 --- [nio-9393-exec-3] o.s.c.d.a.c.RestControllerAdvice         : Caught exception while handling a request
java.lang.NullPointerException: null
    at org.springframework.cloud.dataflow.admin.controller.StreamController.calculateStreamState(StreamController.java:412) ~[classes/:na]
    at org.springframework.cloud.dataflow.admin.controller.StreamController.access$0(StreamController.java:409) ~[classes/:na]
    at org.springframework.cloud.dataflow.admin.controller.StreamController$Assembler.instantiateResource(StreamController.java:455) ~[classes/:na]
    at org.springframework.cloud.dataflow.admin.controller.StreamController$Assembler.instantiateResource(StreamController.java:1) ~[classes/:na]
    at org.springframework.hateoas.mvc.ResourceAssemblerSupport.createResourceWithId(ResourceAssemblerSupport.java:89) ~[spring-hateoas-0.19.0.RELEASE.jar:na]
    at org.springframework.hateoas.mvc.ResourceAssemblerSupport.createResourceWithId(ResourceAssemblerSupport.java:81) ~[spring-hateoas-0.19.0.RELEASE.jar:na]
    at org.springframework.cloud.dataflow.admin.controller.StreamController$Assembler.toResource(StreamController.java:449) ~[classes/:na]
    at org.springframework.cloud.dataflow.admin.controller.StreamController$Assembler.toResource(StreamController.java:1) ~[classes/:na]
    at org.springframework.data.web.PagedResourcesAssembler.createResource(PagedResourcesAssembler.java:137) ~[spring-data-commons-1.9.1.RELEASE.jar:na]
    at org.springframework.data.web.PagedResourcesAssembler.toResource(PagedResourcesAssembler.java:96) ~[spring-data-commons-1.9.1.RELEASE.jar:na]
    at org.springframework.cloud.dataflow.admin.controller.StreamController.list(StreamController.java:132) ~[classes/:na]
    at sun.reflect.GeneratedMethodAccessor119.invoke(Unknown Source) ~[na:na]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_11]
    at java.lang.reflect.Method.invoke(Method.java:483) ~[na:1.8.0_11]
    at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:221) ~[spring-web-4.2.1.RELEASE.jar:4.2.1.RELEASE]
    at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:137) ~[spring-web-4.2.1.RELEASE.jar:4.2.1.RELEASE]
    at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:111) ~[spring-webmvc-4.2.1.RELEASE.jar:4.2.1.RELEASE]
    at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:806) ~[spring-webmvc-4.2.1.RELEASE.jar:4.2.1.RELEASE]
    at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:729) ~[spring-webmvc-4.2.1.RELEASE.jar:4.2.1.RELEASE]
    at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85) ~[spring-webmvc-4.2.1.RELEASE.jar:4.2.1.RELEASE]
    at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:959) ~[spring-webmvc-4.2.1.RELEASE.jar:4.2.1.RELEASE]
    at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:893) ~[spring-webmvc-4.2.1.RELEASE.jar:4.2.1.RELEASE]
    at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970) [spring-webmvc-4.2.1.RELEASE.jar:4.2.1.RELEASE]
    at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:861) [spring-webmvc-4.2.1.RELEASE.jar:4.2.1.RELEASE]
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:622) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846) [spring-webmvc-4.2.1.RELEASE.jar:4.2.1.RELEASE]
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:729) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:291) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52) [tomcat-embed-websocket-8.0.26.jar:8.0.26]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.springframework.boot.actuate.autoconfigure.EndpointWebMvcAutoConfiguration$ApplicationContextHeaderFilter.doFilterInternal(EndpointWebMvcAutoConfiguration.java:249) [spring-boot-actuator-1.3.0.M5.jar:1.3.0.M5]
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) [spring-web-4.2.1.RELEASE.jar:4.2.1.RELEASE]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.springframework.boot.actuate.trace.WebRequestTraceFilter.doFilterInternal(WebRequestTraceFilter.java:102) [spring-boot-actuator-1.3.0.M5.jar:1.3.0.M5]
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) [spring-web-4.2.1.RELEASE.jar:4.2.1.RELEASE]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:207) [spring-security-web-4.0.2.RELEASE.jar:4.0.2.RELEASE]
    at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:176) [spring-security-web-4.0.2.RELEASE.jar:4.0.2.RELEASE]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:85) [spring-web-4.2.1.RELEASE.jar:4.2.1.RELEASE]
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) [spring-web-4.2.1.RELEASE.jar:4.2.1.RELEASE]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.springframework.web.filter.HttpPutFormContentFilter.doFilterInternal(HttpPutFormContentFilter.java:87) [spring-web-4.2.1.RELEASE.jar:4.2.1.RELEASE]
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) [spring-web-4.2.1.RELEASE.jar:4.2.1.RELEASE]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:77) [spring-web-4.2.1.RELEASE.jar:4.2.1.RELEASE]
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) [spring-web-4.2.1.RELEASE.jar:4.2.1.RELEASE]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.springframework.boot.actuate.autoconfigure.MetricsFilter.doFilterInternal(MetricsFilter.java:69) [spring-boot-actuator-1.3.0.M5.jar:1.3.0.M5]
    at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) [spring-web-4.2.1.RELEASE.jar:4.2.1.RELEASE]
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:239) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:219) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.apache.catalina.core.StandardContextValve.__invoke(StandardContextValve.java:106) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.apache.catalina.core.StandardHostValve.__invoke(StandardHostValve.java:142) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:518) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1091) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:673) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1526) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1482) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_11]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_11]
    at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) [tomcat-embed-core-8.0.26.jar:8.0.26]
    at java.lang.Thread.run(Thread.java:745) [na:1.8.0_11]
2015-10-23 17:05:15.618  WARN 34283 --- [nio-9393-exec-3] .m.m.a.ExceptionHandlerExceptionResolver : Handler execution resulted in exception: null

List task executions by a specific name

As a user, I'd like to view all attributes of a specific TaskExecution, so I can dig into the specifics of a single task and its performance.

Acceptance:
The information returned should include all values in a TaskExecution instance

Add TaskExecution detail shell command

As a user, I'd like to be able to view the details of a TaskExecution via a shell command.

Acceptance:

  • Issuing dataflow:> task execution view --id 1234 lists all task and execution specifics for the task-id 1234
  • Invoking shell command includes the following details
    • name
    • id
    • start time
    • end time
    • exit code
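The fields in the acceptance list above could be modeled with a simple value type; this is a hypothetical sketch (field and class names assumed from the acceptance list, not the actual TaskExecution class):

```java
import java.time.LocalDateTime;

// Hypothetical sketch of the fields the "task execution view" command would
// render, taken from the acceptance list above; not the real TaskExecution class.
public class TaskExecutionView {

    final long id;
    final String name;
    final LocalDateTime startTime;
    final LocalDateTime endTime;
    final Integer exitCode;

    TaskExecutionView(long id, String name, LocalDateTime startTime,
                      LocalDateTime endTime, Integer exitCode) {
        this.id = id;
        this.name = name;
        this.startTime = startTime;
        this.endTime = endTime;
        this.exitCode = exitCode;
    }

    /** Renders the execution roughly the way a shell table row might. */
    String render() {
        return String.format("id=%d name=%s start=%s end=%s exitCode=%s",
                id, name, startTime, endTime, exitCode);
    }

    public static void main(String[] args) {
        TaskExecutionView v = new TaskExecutionView(1234L, "myTask",
                LocalDateTime.of(2015, 10, 23, 17, 0),
                LocalDateTime.of(2015, 10, 23, 17, 5), 0);
        System.out.println(v.render());
    }
}
```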

incorrect module deployment lookup - admin attempts to find a source instead of a processor

When deploying the following pipe:

queue:in > groovy-transform --script=.../myscript.groovy --variables='...' > queue:out

I get an admin exception:

2015-10-24 23:01:42.967 ERROR 5 --- [io-9393-exec-18] o.s.c.d.a.c.RestControllerAdvice         : Caught exception while handling a request
java.lang.IllegalArgumentException: Module groovy-transform of type source not found in registry
        at org.springframework.cloud.dataflow.admin.controller.StreamController.deployStream(StreamController.java:236) ~[admin.jar!/:1.0.0.BUILD-SNAPSHOT]
        at org.springframework.cloud.dataflow.admin.controller.StreamController.deploy(StreamController.java:219) ~[admin.jar!/:1.0.0.BUILD-SNAPSHOT]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_60]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_60]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_60]
        at java.lang.reflect.Method.invoke(Method.java:497) ~[na:1.8.0_60]
        at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:221) ~[spring-web-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]

The exception indicates that the admin is looking for a "source" module, while it should in fact be looking for a "processor" module (because groovy-transform receives input data from the named channel queue:in).

Part of this could be an issue with the code in https://github.com/spring-cloud/spring-cloud-dataflow/blob/master/spring-cloud-dataflow-admin/src/main/java/org/springframework/cloud/dataflow/admin/controller/StreamController.java#L231

            boolean isFirst = !iterator.hasNext();
            boolean isLast = (i == 0);

Shouldn't that be the other way around?
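The expected behavior can be sketched as position-plus-channel type inference: a module at the head of the pipeline should only be treated as a source when its input is not a named channel. This is a hypothetical sketch, not the actual StreamController logic:

```java
// Hypothetical sketch of position-plus-channel type inference for stream
// modules; names are illustrative, not the actual StreamController code.
public class ModuleTypeInference {

    enum ModuleType { SOURCE, PROCESSOR, SINK }

    /**
     * A module is only a source when it is first AND not fed by a named
     * channel, and only a sink when it is last AND not draining into one.
     */
    static ModuleType infer(int index, int moduleCount,
                            boolean inputIsNamedChannel,
                            boolean outputIsNamedChannel) {
        boolean first = (index == 0);
        boolean last = (index == moduleCount - 1);
        if (first && !inputIsNamedChannel) {
            return ModuleType.SOURCE;
        }
        if (last && !outputIsNamedChannel) {
            return ModuleType.SINK;
        }
        return ModuleType.PROCESSOR;
    }

    public static void main(String[] args) {
        // "queue:in > groovy-transform ... > queue:out": a single module with
        // named channels on both sides resolves to PROCESSOR, not SOURCE.
        System.out.println(infer(0, 1, true, true)); // prints PROCESSOR
    }
}
```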

Add support to build Admin with individual SPI deployers

As a s-c-d developer, I'd like to break up the build lifecycle so that SPI deployers can be bundled optionally, so I don't have to build the admin with all deployer variations as a single artifact.

Acceptance:

  • admin project build includes option to build by SPI deployer
  • Local deployer is included by default

Add REST-API to retrieve task executions

As a user, I'd like to use the task RESTful endpoint to obtain the statuses of task executions.

Acceptance:

  • A new REST-API is exposed and accessible
  • Invoking REST-API provides task and task-executions details as follows
    • name
    • id
    • start time
    • end time
    • exit code
  • Tests included to evaluate REST-API contract

Add support for composed apps

Composed modules currently behave as "white boxes": as soon as a module is composed (say "http | filter"), all options of the child modules are available (e.g. http.port and filter.expression in the example above).

Change this so that a composed module is a black box: the user has to explicitly expose an option for it to be available (most certainly using a short name). Hardcoded values would be retained (and possibly overridable).

Possible syntaxes :
1)

app compose foo --definition "http --port=${myport:1234} | filter"
app compose foo --definition "http | filter" --expose port

2.1) in case of ambiguity (simulated in this particular example):

app compose foo --definition "http | filter" --expose http.port

2.2) for specifying a default:

app compose foo --definition "http | filter" --expose port=1234
3) allow both 1) and 2), using 1) mainly for cases where we don't map one-to-one with the underlying option, e.g.:

filter --expression=${expr}+'foo'
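For the 2.2 syntax above, parsing the --expose value means splitting an optional default off the exposed name. A rough sketch with a hypothetical helper, not the actual SCDF option parser:

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.Map;

// Hypothetical sketch of parsing an "--expose" value such as "port",
// "http.port", or "port=1234"; not the actual SCDF option parser.
public class ExposeOption {

    /** Returns (exposedName, defaultValue), where defaultValue may be null. */
    static Map.Entry<String, String> parse(String expose) {
        int eq = expose.indexOf('=');
        if (eq < 0) {
            return new SimpleEntry<>(expose, null); // no default supplied
        }
        return new SimpleEntry<>(expose.substring(0, eq),
                                 expose.substring(eq + 1));
    }

    public static void main(String[] args) {
        System.out.println(parse("port=1234")); // exposed name with a default
        System.out.println(parse("http.port")); // qualified name, no default
    }
}
```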

UI: Task deployment page is not loading

As a user, I'm trying to load the Task, Task Deployment, and Task Executions pages, but I'm seeing an error (Error fetching data. Is the XD server running?) instead.
Acceptance:
REST APIs are wired correctly and the pages load with the expected results

mvn clean package fails in StreamCommandTests.java behind corporate firewall

mvn clean package fails the test in StreamCommandTests.java on line 36:

-------------------------------------------------------------------------------
Test set: org.springframework.cloud.dataflow.shell.command.StreamCommandTests
-------------------------------------------------------------------------------
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 23.223 sec <<< FAILURE! - in org.springframework.cloud.dataflow.shell.command.StreamCommandTests
testStreamLifecycleForTickTock(org.springframework.cloud.dataflow.shell.command.StreamCommandTests)  Time elapsed: 20.437 sec  <<< ERROR!
java.lang.IllegalArgumentException: Failure.  CommandResult = CommandResult [success=false, result=null, exception=org.springframework.cloud.dataflow.rest.client.DataFlowClientException: org.eclipse.aether.resolution.ArtifactResolutionException: Could not transfer artifact org.springframework.cloud.stream.module:log-sink:jar:exec:1.0.0.BUILD-SNAPSHOT from/to repository 1 (https://repo.spring.io/libs-snapshot): connect timed out
]
    at org.springframework.util.Assert.isTrue(Assert.java:68)
    at org.springframework.cloud.dataflow.shell.AbstractShellIntegrationTest$DataFlowShell.executeCommand(AbstractShellIntegrationTest.java:195)
    at org.springframework.cloud.dataflow.shell.command.StreamCommandTemplate.doCreate(StreamCommandTemplate.java:86)
    at org.springframework.cloud.dataflow.shell.command.StreamCommandTemplate.create(StreamCommandTemplate.java:66)
    at org.springframework.cloud.dataflow.shell.command.StreamCommandTests.testStreamLifecycleForTickTock(StreamCommandTests.java:36)

It appears to be attempting to pull an artifact from https://repo.spring.io/libs-snapshot, which I believe is failing due to the fact that I am required to operate behind a corporate firewall. Internally we have an artifact repository that acts as a proxy in these cases, and we have Maven configured via settings.xml.
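For reference, routing such downloads through an internal repository manager is typically done with a mirror entry in Maven's settings.xml; a minimal sketch, with a placeholder id and URL for the internal proxy:

```xml
<settings>
  <mirrors>
    <mirror>
      <id>corporate-proxy</id>
      <!-- placeholder: point at your internal repository manager -->
      <url>https://repo.example.com/maven-proxy</url>
      <!-- route all external repositories, including repo.spring.io, through the proxy -->
      <mirrorOf>external:*</mirrorOf>
    </mirror>
  </mirrors>
</settings>
```

Note that the failing resolution here happens at runtime through Aether, so the server's remote-repository configuration may also need to point at the proxy.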

Support source channel to sink channel mapping with no module (bridge)

When deploying the following flow:

queue:simple1-out > queue:simple2-in

SCDF Admin fails with:

2015-10-23 09:22:02.235 ERROR 5 --- [nio-9393-exec-6] o.s.c.d.a.c.RestControllerAdvice         : Caught exception while handling a request
java.lang.IllegalArgumentException: Module bridge of type source not found in registry
        at org.springframework.cloud.dataflow.admin.controller.StreamController.deployStream(StreamController.java:235) ~[admin.jar!/:1.0.0.BUILD-SNAPSHOT]
        at org.springframework.cloud.dataflow.admin.controller.StreamController.save(StreamController.java:156) ~[admin.jar!/:1.0.0.BUILD-SNAPSHOT]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_60]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_60]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_60]
        at java.lang.reflect.Method.invoke(Method.java:497) ~[na:1.8.0_60]
        at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:221) ~[spring-web-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
        at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:137) ~[spring-web-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
        at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:111) ~[spring-webmvc-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
        at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:806) ~[spring-webmvc-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
        at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:729) ~[spring-webmvc-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
        at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85) ~[spring-webmvc-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
        at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:959) ~[spring-webmvc-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
        at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:893) ~[spring-webmvc-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
        at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970) [spring-webmvc-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
        at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:872) [spring-webmvc-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:648) [tomcat-embed-core-8.0.26.jar!/:8.0.26]
...

... because StreamParser attempts to create a bridge module which does not exist.

        if (bridge) {
            // Create a bridge module to hang the source/sink channels off
            tokens.decrementPosition(); // Rewind so we can nicely eat the sink channel
            moduleNodes = new ArrayList<>();
            moduleNodes.add(new ModuleNode(null, "bridge", tokens.peek().startPos,
                    tokens.peek().endPos, null));
        }

Add support to export/import stream and task definitions

As a user, I'd like to have the option to export stream and batch definitions, including deployment manifest specs, so I can reuse the produced artifact and import it as needed. This is specifically helpful when recovering from outages, performing upgrades, or setting up new infrastructure.

Acceptance:

  • dataflow:>export definitions exports all stream and batch definitions including deployment manifest as a jar/zip file
  • dataflow:>import definitions <path-to-exported-file> imports definitions
  • User is alerted with error messages upon discrepancies with deployment manifest and underlying infrastructure
  • User is greeted with success message upon successful import/deployment

Depends on #188

Document Task features

As a user, I'd like to refer to the Task documentation, so I can use it while developing and deploying tasks as Spring Boot applications.

Binding properties in deployer incorrect?

I'm writing a deployer for Kubernetes (will file a separate issue for that, as I'm planning to contribute it - if desired).

As part of this, I'm a bit confused by the deployment parameters of the modules. When I run the following test:

dataflow:>stream create --name deployment1 --definition "http --server.port=9999 | log" --deploy

I get the following parameters when a module is deployed, which I'm passing on to the module-launcher.

For the log module (I already converted them to uppercase, as I'm trying to pass them in as env vars):

MODULES=org.springframework.cloud.stream.module:log-sink:jar:1.0.0.BUILD-SNAPSHOT
SPRING_CLOUD_STREAM_BINDINGS_INPUT_DESTINATION=deployment1.0

For the http module:

SERVER_PORT=9999
SPRING_CLOUD_STREAM_BINDINGS_OUTPUT_DESTINATION=deployment1.0
MODULES=org.springframework.cloud.stream.module:http-source:jar:1.0.0.BUILD-SNAPSHOT

When I check my deployment, the messages are not sent from http to the sink. Upon reading the docs in the spring-cloud-stream project, I figured out that the properties should be SPRING_CLOUD_STREAM_BINDINGS_INPUT and SPRING_CLOUD_STREAM_BINDINGS_OUTPUT (without the DESTINATION suffix).

Is this a bug or am I doing something wrong in my deployer implementation? The fix is very simple, I just had to change the INPUT_BINDING_KEY and OUTPUT_BINDING_KEY constants in the org.springframework.cloud.dataflow.core.BindingConstants. I can submit a PR once confirmed.
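The uppercase conversion described in the report follows Spring Boot's relaxed-binding convention for environment variables; a small sketch of that mapping (hypothetical helper, assuming dots and dashes map to underscores and the result is uppercased):

```java
import java.util.Locale;

// Hypothetical sketch of the property-name to environment-variable mapping
// used in the report above (Spring Boot relaxed-binding style); illustrative only.
public class EnvVarNames {

    static String toEnvVar(String propertyName) {
        // spring.cloud.stream.bindings.input.destination
        //   -> SPRING_CLOUD_STREAM_BINDINGS_INPUT_DESTINATION
        return propertyName.replace('.', '_')
                           .replace('-', '_')
                           .toUpperCase(Locale.ROOT);
    }

    public static void main(String[] args) {
        System.out.println(toEnvVar("spring.cloud.stream.bindings.input.destination"));
        System.out.println(toEnvVar("server.port"));
    }
}
```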

Add profile support for stream repositories

As a s-c-d developer, I'd like to add support for profiles to the core Admin application, so I can back the stream repository with a respective backend strategy. For example, the local profile would use an in-memory strategy to store the metadata.

REST - Provide better error message for incorrect enumeration values

Maybe we should provide better error messages if wrong enumeration values are provided.

http://localhost:9393/modules?type=job

[
  {
    "logref": "MethodArgumentTypeMismatchException",
    "message": "Failed to convert value of type 'java.lang.String' to required type 'org.springframework.cloud.dataflow.core.ArtifactType'; nested exception is org.springframework.core.convert.ConversionFailedException: Failed to convert from type java.lang.String to type @org.springframework.web.bind.annotation.RequestParam org.springframework.cloud.dataflow.core.ArtifactType for value 'job'; nested exception is java.lang.IllegalArgumentException: No enum constant org.springframework.cloud.dataflow.core.ArtifactType.job",
    "links": []
  }
]
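A friendlier message could list the valid constants instead of surfacing the raw conversion exception; a sketch of such a formatter (generic helper, not the actual SCDF exception handler):

```java
import java.util.Arrays;
import java.util.stream.Collectors;

// Hypothetical sketch of a friendlier conversion-error message listing the
// valid enum constants; not the actual SCDF exception handler.
public class EnumErrorMessages {

    static <E extends Enum<E>> String badValueMessage(Class<E> enumType, String badValue) {
        String allowed = Arrays.stream(enumType.getEnumConstants())
                .map(Enum::name)
                .collect(Collectors.joining(", "));
        return String.format("'%s' is not a valid %s; allowed values are: %s",
                badValue, enumType.getSimpleName(), allowed);
    }

    public static void main(String[] args) {
        // Using a JDK enum here, since ArtifactType isn't on the classpath.
        System.out.println(badValueMessage(java.time.DayOfWeek.class, "funday"));
    }
}
```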

Allow "app compose" to specify an explicit type

Currently, app composition always guesses the correct type, because we don't have a module with a given name N that is both a source and a processor, or a processor and a sink (we only have the source-and-sink case, as in jdbc/jdbc or file/file).

If that were the case, the heuristics for guessing the resulting type of a composition would break.

This issue is about adding the option for the user to explicitly specify the expected type of the composition, /if needed/.

Acceptance:

  • An ambiguous definition should fail.
  • At the shell level, such a failure should be caught and a message displayed that invites the user to specify the type hint
  • A type hint should be added to the "module compose" command, and honored through all layers (REST endpoint down to the parser)
  • Unit tests for failing case and ambiguity lifting should be added

How should users customize dataflow-server

As a developer, I'd like to self-register the dataflow-server with Eureka, so the server endpoints are discoverable entities.

This depends on the Spring Initializr experience. There are gaps in it at the moment.
