OHDSI WebAPI

OHDSI WebAPI contains all OHDSI RESTful services that can be called from OHDSI applications.

Features

  • Provides a centralized API for working with one or more databases converted to the Common Data Model (CDM) v5.
  • Searches the OMOP standardized vocabularies for medical concepts and supports constructing concept sets.
  • Defines cohort definitions for identifying patient populations.
  • Characterizes cohorts.
  • Computes incidence rates.
  • Retrieves patient profiles.
  • Designs population-level estimation and patient-level prediction studies.

Technology

OHDSI WebAPI is a Java 8 web application that utilizes a PostgreSQL database for storage.

API Documentation

The API Documentation is found at http://webapidoc.ohdsi.org/

System Requirements & Installation

Documentation can be found in the WebAPI Installation Guide, which covers the system requirements and installation instructions.

SAML Auth support

The following parameters are used:

  • security.saml.idpMetadataLocation=classpath:saml/dev/idp-metadata.xml - path to the identity provider metadata
  • security.saml.metadataLocation=saml/dev/sp-metadata.xml - path to the service provider metadata
  • security.saml.keyManager.keyStoreFile=classpath:saml/samlKeystore.jks - path to the keystore
  • security.saml.keyManager.storePassword=nalle123 - keystore password
  • security.saml.keyManager.passwords.arachnenetwork=nalle123 - private key password
  • security.saml.keyManager.defaultKey=apollo - keystore alias of the default key
  • security.saml.sloUrl=https://localhost:8443/cas/logout - identity provider logout URL
  • security.saml.callbackUrl=http://localhost:8080/WebAPI/user/saml/callback - URL the identity provider calls after login

Sample IdP and SP metadata configuration files for Okta:

  • saml/dev/idp-metadata-okta.xml
  • saml/dev/sp-metadata-okta.xml

Managing auth providers

The following parameters are used to enable or disable a given provider:

  • security.auth.windows.enabled
  • security.auth.kerberos.enabled
  • security.auth.openid.enabled
  • security.auth.facebook.enabled
  • security.auth.github.enabled
  • security.auth.google.enabled
  • security.auth.jdbc.enabled
  • security.auth.ldap.enabled
  • security.auth.ad.enabled
  • security.auth.cas.enabled

Acceptable values are true and false.
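For example, a deployment that accepts only LDAP and database-backed logins might set (a sketch; property names are from the list above, the chosen values are illustrative):

```properties
security.auth.ldap.enabled=true
security.auth.jdbc.enabled=true
security.auth.windows.enabled=false
security.auth.google.enabled=false
security.auth.openid.enabled=false
```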

Geospatial support

Instructions can be found at webapi-component-geospatial.

Testing

Embedded PostgreSQL was chosen over H2 for unit tests because H2 doesn't support window functions, the md5 function, HEX-to-BIT conversion, setval, SET datestyle, or CTAS combined with CTEs.

Support

Contribution

Versioning

  • WebAPI follows Semantic Versioning.
  • Only non-SNAPSHOT dependencies should be present in pom.xml on release branches/tags.

License

OHDSI WebAPI is licensed under the Apache License 2.0.


Issues

ATLAS : Jobs link from the left navigator issue

I recently set up ATLAS locally and configured it to use my own database schema on SQL Server. However, the Jobs link in the left navigator points to a different schema than the one I configured.

Should we upgrade to Java 8?

I don't know what the issues would be but something came up where I apparently used a Java 8 feature and it caused a problem for someone else. I fixed it, but Frank suggested we revisit the question since it was last discussed a long time ago.

getCohortDefinitions should go against the cohort schema

Right now the cohort functionality seems hardwired against the Results schema, but people may be using two different schemas for results and cohorts:

SELECT
…
COHORT_DEFINITION_DESCRIPTION,
DEFINITION_TYPE_CONCEPT_ID,
COHORT_DEFINITION_SYNTAX,
SUBJECT_CONCEPT_ID,
COHORT_INITIATION_DATE
FROM
@results_schema.COHORT_DEFINITION

Person service using incorrect daimon

In the PersonService, the table qualifier being used is set to the Vocabulary Daimon, but should probably be CDM:

public class PersonService extends AbstractDaoService {

    @Path("{personId}")
    @GET
    @Produces(MediaType.APPLICATION_JSON)
    public PersonProfile getPersonProfile(@PathParam("sourceKey") String sourceKey, @PathParam("personId") String personId)
    {
        final PersonProfile profile = new PersonProfile();

        Source source = getSourceRepository().findBySourceKey(sourceKey);
        String tableQualifier = source.getTableQualifier(SourceDaimon.DaimonType.Vocabulary);

The DaimonType should be CDM.

/vocabulary/lookup/mapped is using invalid concept relationships

The query loaded in the vocabulary service to resolve mapped concepts uses this source file:
/resources/vocabulary/sql/getMappedSourcecodes.sql

In this file, the query used is:

select CONCEPT_ID, CONCEPT_NAME, ISNULL(STANDARD_CONCEPT,'N') STANDARD_CONCEPT, ISNULL(c.INVALID_REASON,'V') INVALID_REASON, CONCEPT_CODE, CONCEPT_CLASS_ID, DOMAIN_ID, VOCABULARY_ID
from @CDM_schema.concept_relationship cr
join @CDM_schema.concept c on c.concept_id = cr.concept_id_1
where cr.concept_id_2 in (@identifiers)
and relationship_id in ('Maps to')
and standard_concept IS NULL
order by domain_id, vocabulary_id

The problem is that invalid concept relationships are being pulled in, and there is no way to filter them out in the UI. The 'Valid' filter in the UI refers to the returned concepts; there is no way to control the concept_relationship rows.
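One possible fix, sketched against the query above, is to exclude invalidated relationship rows in SQL rather than in the UI. This assumes the standard concept_relationship.invalid_reason column; it is a sketch of the intent, not a committed fix:

```sql
select CONCEPT_ID, CONCEPT_NAME, ISNULL(STANDARD_CONCEPT,'N') STANDARD_CONCEPT,
       ISNULL(c.INVALID_REASON,'V') INVALID_REASON, CONCEPT_CODE, CONCEPT_CLASS_ID,
       DOMAIN_ID, VOCABULARY_ID
from @CDM_schema.concept_relationship cr
join @CDM_schema.concept c on c.concept_id = cr.concept_id_1
where cr.concept_id_2 in (@identifiers)
  and cr.relationship_id in ('Maps to')
  and cr.invalid_reason is null  -- drop invalidated relationship rows
  and c.standard_concept is null
order by domain_id, vocabulary_id
```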

CohortResults queries deadlocking on PDW

We've observed that some queries that use highly nested subqueries with joins deadlock on PDW. An example is the query found in resources.cohortresults.sql.cohortSpecific.drugEraPrevalenceOfDrug.sql

The queries in this package that use the approach of nesting large queries to query the vocabulary to build out a flattened hierarchy (i.e. RxNorm -> ATC) should be isolated to a temp table. This temp table should then be used inside of the overall analysis query.
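A sketch of that restructuring in PDW-style SQL (table and column names are illustrative, not the actual WebAPI queries):

```sql
-- materialize the flattened RxNorm -> ATC hierarchy once, in a temp table
select ca.descendant_concept_id, ca.ancestor_concept_id
into #drug_hierarchy
from @vocab_schema.concept_ancestor ca
where ca.ancestor_concept_id in (@atc_concepts);

-- then join the small temp table in the analysis query instead of
-- nesting the vocabulary query inside it
select h.ancestor_concept_id, count(*) as person_count
from @results_schema.drug_era de
join #drug_hierarchy h on h.descendant_concept_id = de.drug_concept_id
group by h.ancestor_concept_id;
```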

Coalesce death_concept_id to 0 when null

Cohort definitions looking for deaths are not finding people when 'Any Death' is specified because the query performs a count(death_concept_id) on a subquery that doesn't restrict the concept_id to a non-null value. COUNT() over an expression ignores NULL values in SQL, so we need to modify the underlying death criteria query to coalesce death_concept_id to 0.
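Since COUNT(expr) skips NULLs, the fix sketched below counts every death row regardless of whether the concept id is populated (a sketch of the intent against a simplified subquery, not the actual criteria SQL):

```sql
-- before: rows with a NULL death_concept_id are not counted
select count(death_concept_id) from death_rows;

-- after: coalesce so 'Any Death' rows always contribute to the count
select count(coalesce(death_concept_id, 0)) from death_rows;
```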

Slow performance of Codeset query on PDW

Due to an issue related to MPP and CTAS with UNIONs, we need to alter the concept set query (used in cohort definitions and feasibility studies) to create the table first, then insert each concept set query's results into this temp table individually.

User Authentication

As I mentioned here, Columbia may have more immediate needs for authentication. Specifically, I foresee us wanting authentication via LDAP to access WebAPI through Atlas. I am limiting the scope of this issue to authentication. If we are to implement a solution, however, we don't want to preclude it from becoming more generally useful and robust. I'm listing here related objectives that have been mentioned so we keep this in sight:

  • Limit deployment overhead (Don't complicate WebAPI/Atlas install)
  • Allow for integration with existing infrastructure (AD/LDAP, OAuth providers)
  • If storing sensitive user information, take special care regarding sql injection
  • Make way for some flavor of authorization
  • Auditing

@fdefalco has mentioned Shiro as a possible framework and it appears that some efforts have been made already.

Add summary service

Enhance WebAPI so that it features a summarization/characterization service. The proposed service would accept requests for and respond with data directly from the Achilles results table, rather than from the Achilles-exported JSON files. The data from the service could then be used to populate the visualizations in Atlas (Datasources).

This issue is potentially part of the Achilles integration effort that may subsume issue #49.

New vocabulary API request: getRelatedConceptNames

We need a WebAPI call that takes @conceptIdList and returns concept_name for all related concepts for a given @relatedVocabularyId and @relatedClassId.

Use case: to be used in PENELOPE on the spontaneous report tab when we want to take a set of SNOMED concept ids and return back a list of MedDRA PT concept names. These concept names will then be sent to OpenFDA to summarize FAERS by outcome and seriousness.

Error from Spring framework when searching for a concept that does not exist

Receiving the following error when going to this WebAPI service:
http://localhost:8084/WebAPI/vocabulary/concept/-1

javax.servlet.ServletException: org.springframework.dao.EmptyResultDataAccessException: Incorrect result size: expected 1, actual 0

Expected result:
No error should be returned; either an empty object or an HTTP 404 should be returned. Not sure how the REST API should report not-found... can it return null?
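One of the suggested behaviors can be sketched in plain Java: map an empty lookup result to a 404 status instead of letting the "expected 1, actual 0" exception escape. All names below are hypothetical, not WebAPI's actual service code:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

public class ConceptLookupSketch {
    // toy in-memory "vocabulary" standing in for the real concept lookup
    static final Map<Integer, String> CONCEPTS = new HashMap<>();
    static { CONCEPTS.put(0, "No matching concept"); }

    static int statusFor(int conceptId) {
        return Optional.ofNullable(CONCEPTS.get(conceptId))
                       .map(c -> 200)   // concept found
                       .orElse(404);    // no row -> report HTTP 404
    }

    public static void main(String[] args) {
        System.out.println(statusFor(0));   // 200
        System.out.println(statusFor(-1));  // 404
    }
}
```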

Redshift native driver

I'm getting errors when attempting to configure a Redshift data source via the "source" table. This is specified as follows:

insert into source values
(
'RedShiftDataSource'
, 'RS'
, 'jdbc:redshift://xyz.redshift.amazonaws.com:5439/database?UID=user&PWD=pwd&ssl=true&sslfactory=com.amazon.redshift.ssl.NonValidatingFactory'
, 'redshift'
)

It would appear that the driver cannot be loaded; perhaps this is just an environment setup issue. Either way, noting it as an issue for investigation.

Error on Oracle when generating cohort

The cohort generation SQL uses a subquery alias of RAW for one of the intermediate results. This is a reserved keyword on Oracle and should be replaced with some other value.

Release v1.1.1 Steps

Release Version V1.1.1

  • Create release branch for v1.1.1
  • Use latest master as candidate V1.1.1 of WebAPI and deploy to development
  • Generate WebAPI Release Notes
  • Complete user acceptance testing
  • Obtain sign off
  • Publish Release
  • Update Jenkins automation to point to "released" branch to obtain latest stable build for OHDSI.org
  • Communicate production change to team
  • Update OHDSI.org
  • Communicate release via OHDSI forums

After crash, WebAPI cannot be re-deployed

After running with no issues for around 10 days, the WebAPI app seems to have crashed. When I try to start it again, it keeps erroring out:


FAIL - Application at context path /WebAPI could not be started
FAIL - Encountered exception org.apache.catalina.LifecycleException: Failed to start component [StandardEngine[Catalina].StandardHost[localhost].StandardContext[/WebAPI]]

When I dig deeper into the Catalina logs I find this as the issue:

09-Jun-2016 14:23:39.722 SEVERE [http-apr-8080-exec-21] org.apache.catalina.core.ContainerBase.addChildInternal ContainerBase.addChild: start:
 org.apache.catalina.LifecycleException: Failed to start component [StandardEngine[Catalina].StandardHost[localhost].StandardContext[/WebAPI]]
        at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:153)
        at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:725)
        at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:701)
        at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:717)
        at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:940)
        at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:467)
        at org.apache.catalina.startup.HostConfig.check(HostConfig.java:1612)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.tomcat.util.modeler.BaseModelMBean.invoke(BaseModelMBean.java:300)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.invoke(DefaultMBeanServerInterceptor.java:819)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.invoke(JmxMBeanServer.java:801)
        at org.apache.catalina.manager.ManagerServlet.check(ManagerServlet.java:1451)
        at org.apache.catalina.manager.HTMLManagerServlet.upload(HTMLManagerServlet.java:285)
        at org.apache.catalina.manager.HTMLManagerServlet.doPost(HTMLManagerServlet.java:206)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:648)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:292)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
        at org.apache.catalina.filters.CsrfPreventionFilter.doFilter(CsrfPreventionFilter.java:136)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
        at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
        at org.apache.catalina.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:108)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:212)
        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
        at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:614)
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:141)
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
        at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:616)
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:528)
        at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1099)
        at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:672)
        at org.apache.tomcat.util.net.AprEndpoint$SocketProcessor.doRun(AprEndpoint.java:2508)
        at org.apache.tomcat.util.net.AprEndpoint$SocketProcessor.run(AprEndpoint.java:2497)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'flyway' defined in class path resource [org/springframework/boot/autoconfigure/flyway/FlywayAutoConfiguration$FlywayConfiguration.class]: Invocation of init method failed; nested exception is org.flywaydb.core.internal.dbsupport.FlywaySqlScriptException:
Migration V1.0.0.1__schema-create_spring_batch.sql failed
---------------------------------------------------------
SQL State  : 42P07
Error Code : 0
Message    : ERROR: relation "batch_job_instance" already exists
Location   : db/migration/postgresql/V1.0.0.1__schema-create_spring_batch.sql (/opt/apache-tomcat/tomcat8/webapps/WebAPI/WEB-INF/classes/db/migration/postgresql/V1.0.0.1__schema-create_spring_batch.sql)
Line       : 1
Statement  : CREATE TABLE webapi.BATCH_JOB_INSTANCE  (
        JOB_INSTANCE_ID BIGINT  NOT NULL PRIMARY KEY ,
        VERSION BIGINT ,
        JOB_NAME VARCHAR(100) NOT NULL,
        JOB_KEY VARCHAR(32) NOT NULL,
        constraint JOB_INST_UN unique (JOB_NAME, JOB_KEY)
)

        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1566)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:539)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:476)
        at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:303)
        at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:230)
        at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:299)
        at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:194)
        at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:762)
        at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:757)
        at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:480)
        at org.springframework.boot.context.embedded.EmbeddedWebApplicationContext.refresh(EmbeddedWebApplicationContext.java:118)
        at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:691)
        at org.springframework.boot.SpringApplication.run(SpringApplication.java:321)
        at org.springframework.boot.context.web.SpringBootServletInitializer.run(SpringBootServletInitializer.java:117)
        at org.springframework.boot.context.web.SpringBootServletInitializer.createRootApplicationContext(SpringBootServletInitializer.java:108)
        at org.springframework.boot.context.web.SpringBootServletInitializer.onStartup(SpringBootServletInitializer.java:68)
        at org.springframework.web.SpringServletContainerInitializer.onStartup(SpringServletContainerInitializer.java:175)
        at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5261)
        at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:147)
        ... 45 more
Caused by: org.flywaydb.core.internal.dbsupport.FlywaySqlScriptException:
Migration V1.0.0.1__schema-create_spring_batch.sql failed
---------------------------------------------------------
SQL State  : 42P07
Error Code : 0
Message    : ERROR: relation "batch_job_instance" already exists
Location   : db/migration/postgresql/V1.0.0.1__schema-create_spring_batch.sql (/opt/apache-tomcat/tomcat8/webapps/WebAPI/WEB-INF/classes/db/migration/postgresql/V1.0.0.1__schema-create_spring_batch.sql)
Line       : 1
Statement  : CREATE TABLE webapi.BATCH_JOB_INSTANCE  (
        JOB_INSTANCE_ID BIGINT  NOT NULL PRIMARY KEY ,
        VERSION BIGINT ,
        JOB_NAME VARCHAR(100) NOT NULL,
        JOB_KEY VARCHAR(32) NOT NULL,
        constraint JOB_INST_UN unique (JOB_NAME, JOB_KEY)
)

        at org.flywaydb.core.internal.dbsupport.SqlScript.execute(SqlScript.java:117)
        at org.flywaydb.core.internal.resolver.sql.SqlMigrationExecutor.execute(SqlMigrationExecutor.java:71)
        at org.flywaydb.core.internal.command.DbMigrate$5.doInTransaction(DbMigrate.java:287)
        at org.flywaydb.core.internal.command.DbMigrate$5.doInTransaction(DbMigrate.java:285)
        at org.flywaydb.core.internal.util.jdbc.TransactionTemplate.execute(TransactionTemplate.java:72)
        at org.flywaydb.core.internal.command.DbMigrate.applyMigration(DbMigrate.java:285)
        at org.flywaydb.core.internal.command.DbMigrate.access$800(DbMigrate.java:46)
        at org.flywaydb.core.internal.command.DbMigrate$2.doInTransaction(DbMigrate.java:207)
        at org.flywaydb.core.internal.command.DbMigrate$2.doInTransaction(DbMigrate.java:156)
        at org.flywaydb.core.internal.util.jdbc.TransactionTemplate.execute(TransactionTemplate.java:72)
        at org.flywaydb.core.internal.command.DbMigrate.migrate(DbMigrate.java:156)
        at org.flywaydb.core.Flyway$1.execute(Flyway.java:972)
        at org.flywaydb.core.Flyway$1.execute(Flyway.java:919)
        at org.flywaydb.core.Flyway.execute(Flyway.java:1320)
        at org.flywaydb.core.Flyway.migrate(Flyway.java:919)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1694)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1633)
        at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1562)
        ... 63 more
Caused by: org.postgresql.util.PSQLException: ERROR: relation "batch_job_instance" already exists
        at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2270)
        at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1998)
        at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:255)
        at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:570)
        at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:406)
        at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:398)
        at org.flywaydb.core.internal.dbsupport.JdbcTemplate.executeStatement(JdbcTemplate.java:238)
        at org.flywaydb.core.internal.dbsupport.SqlScript.execute(SqlScript.java:114)
        ... 84 more

09-Jun-2016 14:23:39.724 SEVERE [http-apr-8080-exec-21] org.apache.catalina.startup.HostConfig.deployWAR Error deploying web application archive /opt/apache-tomcat/tomcat8/webapps/WebAPI.war
 java.lang.IllegalStateException: ContainerBase.addChild: start: org.apache.catalina.LifecycleException: Failed to start component [StandardEngine[Catalina].StandardHost[localhost].StandardContext[/WebAPI]]
        at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:729)
        at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:701)
        at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:717)
        at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:940)
        at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:467)
        at org.apache.catalina.startup.HostConfig.check(HostConfig.java:1612)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.tomcat.util.modeler.BaseModelMBean.invoke(BaseModelMBean.java:300)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.invoke(DefaultMBeanServerInterceptor.java:819)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.invoke(JmxMBeanServer.java:801)
        at org.apache.catalina.manager.ManagerServlet.check(ManagerServlet.java:1451)
        at org.apache.catalina.manager.HTMLManagerServlet.upload(HTMLManagerServlet.java:285)
        at org.apache.catalina.manager.HTMLManagerServlet.doPost(HTMLManagerServlet.java:206)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:648)
        at javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:292)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
        at org.apache.catalina.filters.CsrfPreventionFilter.doFilter(CsrfPreventionFilter.java:136)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
        at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
        at org.apache.catalina.filters.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:108)
        at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
        at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
        at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:212)
        at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
        at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:614)
        at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:141)
        at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
        at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:616)
        at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
        at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:528)
        at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1099)
        at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:672)
        at org.apache.tomcat.util.net.AprEndpoint$SocketProcessor.doRun(AprEndpoint.java:2508)
        at org.apache.tomcat.util.net.AprEndpoint$SocketProcessor.run(AprEndpoint.java:2497)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
        at java.lang.Thread.run(Thread.java:745)

09-Jun-2016 14:23:39.725 INFO [http-apr-8080-exec-21] org.apache.catalina.startup.HostConfig.deployWAR Deployment of web application archive /opt/apache-tomcat/tomcat8/webapps/WebAPI.war has finished in 14,397 ms

I started to remove all the relations and sequences it says already exist, but I am down to the cohort definition tables, which I don't want to remove unless I have to. I have tried re-deploying the WAR file with the same results.

Is there a way to restart/re-deploy with the current config, or should I bite the bullet and copy my current settings before nuking the complete webapi schema?

Calypso, Feasibility tables don't exist

I know this is related to Calypso, but I believe the problem is related to WebAPI.

When running a feasibility study in Calypso, I get this error (actually a 500 server error) on http://127.0.0.1:8080/WebAPI/feasibility/1/info

"ERROR: relation "ohdsi.feas_study_index_stats" does not exist\n Position: 39"

I checked my database, and I believe the error occurs because my CDM data source is in a different database/schema from the WebAPI database.

New function to support updating WebAPI test functions for multi-homed WebAPI

The current WebAPI tests use an endpoint format (e.g. ${baseUri}/vocabulary) that no longer works for multi-homed WebAPI, causing the tests to fail. The Jenkins CI build job is using skipTests as a workaround.

A source/sourceDaimon function is needed that can be called by the WebAPI test functions to consistently resolve to a single source id for testing. The returned source id can then be included in the test endpoint (e.g. ${baseUri} + "/" + sourceID + "/vocabulary").

One approach could be for the new function to return the source id of the daimon with the highest priority for a given daimon type. Note: in the case of two daimons of the same type with the same assigned priority, the single source id returned by this function must be deterministic so that tests are consistently reproducible.

Limit #Codesets temp table to active concept sets

Currently, when generating the SQL statement used to create a cohort, all concept sets in a cohort definition are generated and inserted into the #Codesets table. We could optimize the query and simplify the exported SQL by only inserting records for concept sets actually used in the definition.

Embedded Tomcat

Using an embedded Tomcat in Spring Boot is a very useful feature.
This is my pull request; I added embedded Tomcat support.
We only need to run the command "java -jar WebAPI.jar".
#95

Export breaks with large concept sets

When concept sets have a large number (>10,000) of included concepts, the IN clause in the underlying query fails. We need to add methods to VocabularyService.java that take in the ConceptSetExpression and embed it into the existing queries for finding included/mapped codes, instead of passing a comma-delimited list of concept ids.

An opportunity to increase performance of drugeraprevalence and similar WebAPI calls

The following WebAPI call can take up to 1.5 minutes to complete on the OHDSI cloud.

http://api.ohdsi.org/WebAPI/CS1/cdmresults/715997/drugeraprevalence

I reviewed the SQL code behind this call and experimentally created a new index on the achilles_results table that it uses.

The service code is here:
https://raw.githubusercontent.com/OHDSI/WebAPI/master/src/main/java/org/ohdsi/webapi/service/CDMResultsService.java

The SQL query is here:
https://raw.githubusercontent.com/OHDSI/WebAPI/master/src/main/resources/resources/cdmresults/sql/getDrugEraPrevalenceByGenderAgeYear.sql

The new index:
create index achilles_results_index_1 on achilles_results(analysis_id, stratum_1, stratum_2, stratum_3, stratum_4, stratum_5);

The change is to deploy the new index and switch around the CAST in this and similar joins in the SQL code
from:
ON num_stratum_1 = CAST(c1.concept_id as VARCHAR)
to:
ON CAST(num_stratum_1 as int) = c1.concept_id

After making this change (with the new index) the SQL query execution time went down to under a second when I ran an example query in postgresql.

Ultimately the new index should be created by the Achilles R code that creates the achilles_results table. The same change can also be made to similar WebAPI SQL queries that use the achilles_results table.

Introduction for a Web2.0 neophyte

I want to take an active step in learning how the great tools OHDSI is producing work; WebAPI seems like a very productive place to start. But I have no experience with the technology and am having trouble making heads or tails of the source code.

Might I please ask if someone has a moment to briefly walk me through some basics? For example:

  1. How does one generate documentation that lists all of the services (did I use the right term?) available, like /vocabulary/, /evidence/, etc. ? I feel so dumb because I can't even find a text string "vocabulary" in the source files.
  2. What do the annotations do? A short intro to Spring Boot didn't cover many that I see.
  3. How do I get it running on my friendly MacBook to try things out? mvn clean install generates pages and pages of errors.

Help ... I'm sinking.

NPE in GenerateCohortTasklet

I tried to generate a cohort through CIRCE, and the Spring Batch job threw an NPE at GenerateCohortTasklet.java:120. I fixed it in this commit.

I wonder, though, if I'm doing something wrong, like using an unstable code base. I used the master branch of WebAPI. The commit that introduced the comparison of Integers with == is already pretty old (ea1e671), and what I was doing doesn't seem like marginal functionality.
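For context, the pitfall behind that commit is standard Java behavior: == on boxed Integers compares references, and only values in the -128..127 cache are guaranteed by the language spec to be interned by autoboxing. A minimal illustration (the class name is mine, not WebAPI code):

```java
public class IntegerEquality {

    /** Reference comparison: only reliable for values in the Integer cache. */
    public static boolean sameByOperator(Integer a, Integer b) {
        return a == b;
    }

    /** Value comparison: correct for any pair of non-null Integers. */
    public static boolean sameByEquals(Integer a, Integer b) {
        return a != null && a.equals(b);
    }
}
```

So two logically equal cohort ids outside the cached range can compare unequal with ==, which is exactly the kind of latent bug that surfaces as an NPE or wrong branch in tasklet code.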

sqlProcedureTreemap.sql performance relatively poor

In our environment the Heracles Viewer takes under 6 seconds to generate each of the views we've tested, with the exception of the procedures view, which takes 9.9 minutes. We've isolated the bottleneck to getProcedureTreemap executing the query sqlProcedureTreemap.sql. We're currently attempting to use indexes to make the responsiveness of the procedures view more consistent with the others. Might there be a bug in the query? The OMOP instance in question is on SQL Server 2012.

Heracles: the conversion from int to date is unsupported

We are using SQL Server 2012 and the Microsoft JDBC driver. When we open the Heracles runner UI, we encounter the WebAPI exception "the conversion from int to date is unsupported." It appears that the problem occurs when executing the query getCohortAnalysesForCohort.sql. Specifically, it looks like the null value used here gets treated as int by SQL Server (source). Casting (see below) seems to solve the problem.

SELECT a.*,
       @cohortDefinitionId AS COHORT_DEFINITION_ID,
       0 AS ANALYSIS_COMPLETE,
       CAST(null AS datetime) AS LAST_UPDATE_TIME
FROM @ohdsi_database_schema.HERACLES_ANALYSIS a

Release 1.2.0 Steps

Release Version V1.2.0

  • Use latest master as candidate V1.2.0 of WebAPI and deploy to development
  • Generate WebAPI Release Notes
  • Complete user acceptance testing
  • Obtain sign off
  • Publish Release
  • Communicate production change to team
  • Update OHDSI.org
  • Communicate release via OHDSI forums

There is already an object named '#temp_dates' in the database

In our environment, executing multiple visualization packs in Heracles (e.g. person and observation) results in a batch job failure; WebAPI reports the exception and stack trace below. We are using MSSQL.

2015-10-08 14:32:11.756 ERROR taskExecutor-1 org.springframework.batch.core.step.AbstractStep -  - Encountered an error executing step cohortAnalysisStep in job cohortAnalysisJob
org.springframework.jdbc.UncategorizedSQLException: StatementCallback; uncategorized SQLException for SQL[

DELETE FROM my_omop.dbo.HERACLES_results
WHERE cohort_definition_id IN (12) AND analysis_id IN (116, 117, 0, 1, 2, 3, 4, 5, 200, 201, 202, 203, 204, 206, 211);
DELETE FROM my_omop.dbo.HERACLES_results_dist
WHERE cohort_definition_id IN (12) AND analysis_id IN (116, 117, 0, 1, 2, 3, 4, 5, 200, 201, 202,
                                                       203, 204, 206, 211);
IF OBJECT_ID('tempdb..#HERACLES_cohort', 'U') IS NOT NULL
    DROP TABLE #HERACLES_cohort;

SELECT
  subject_id,
  cohort_definition_id,
  cohort_start_date,
  cohort_end_date
INTO #HERACLES_cohort
FROM my_omop.dbo.cohort
WHERE cohort_definition_id IN (12);
INSERT INTO my_omop.dbo.HERACLES_results (cohort_definition_id, analysis_id, stratum_1, count_value)
  SELECT
    c1.cohort_definition_id,
    0                             AS analysis_id,
    'cumc'                        AS stratum_1,
    COUNT_BIG(DISTINCT person_id) AS count_value
  FROM my_omop.dbo.PERSON p1
    INNER JOIN (SELECT
                  subject_id,
                  cohort_definition_id,
                  cohort_start_date,
                  cohort_end_date
                FROM #HERACLES_cohort) c1
      ON p1.person_id = c1.subject_id
  GROUP BY c1.cohort_definition_id;
INSERT INTO my_omop.dbo.HERACLES_results_dist (cohort_definition_id, analysis_id, stratum_1, count_value)
  SELECT
    c1.cohort_definition_id,
    0                             AS analysis_id,
    'cumc'                        AS stratum_1,
    COUNT_BIG(DISTINCT person_id) AS count_value
  FROM my_omop.dbo.PERSON p1
    INNER JOIN (SELECT
                  subject_id,
                  cohort_definition_id,
                  cohort_start_date,
                  cohort_end_date
                FROM #HERACLES_cohort) c1
      ON p1.person_id = c1.subject_id
  GROUP BY c1.cohort_definition_id;
INSERT INTO my_omop.dbo.HERACLES_results (cohort_definition_id, analysis_id, count_value)
  SELECT
    c1.cohort_definition_id,
    1                             AS analysis_id,
    COUNT_BIG(DISTINCT person_id) AS count_value
  FROM my_omop.dbo.PERSON p1
    INNER JOIN (SELECT
                  subject_id,
                  cohort_definition_id,
                  cohort_start_date,
                  cohort_end_date
                FROM #HERACLES_cohort) c1
      ON p1.person_id = c1.subject_id
  GROUP BY c1.cohort_definition_id;
INSERT INTO my_omop.dbo.HERACLES_results (cohort_definition_id, analysis_id, stratum_1, count_value)
  SELECT
    c1.cohort_definition_id,
    2                             AS analysis_id,
    gender_concept_id             AS stratum_1,
    COUNT_BIG(DISTINCT person_id) AS count_value
  FROM my_omop.dbo.PERSON p1
    INNER JOIN (SELECT
                  subject_id,
                  cohort_definition_id,
                  cohort_start_date,
                  cohort_end_date
                FROM #HERACLES_cohort) c1
      ON p1.person_id = c1.subject_id
  GROUP BY c1.cohort_definition_id, GENDER_CONCEPT_ID;
INSERT INTO my_omop.dbo.HERACLES_results (cohort_definition_id, analysis_id, stratum_1, count_value)
  SELECT
    c1.cohort_definition_id,
    3                             AS analysis_id,
    year_of_birth                 AS stratum_1,
    COUNT_BIG(DISTINCT person_id) AS count_value
  FROM my_omop.dbo.PERSON p1
    INNER JOIN (SELECT
                  subject_id,
                  cohort_definition_id,
                  cohort_start_date,
                  cohort_end_date
                FROM #HERACLES_cohort) c1
      ON p1.person_id = c1.subject_id
  GROUP BY c1.cohort_definition_id, YEAR_OF_BIRTH;
INSERT INTO my_omop.dbo.HERACLES_results (cohort_definition_id, analysis_id, stratum_1, count_value)
  SELECT
    c1.cohort_definition_id,
    4                             AS analysis_id,
    RACE_CONCEPT_ID               AS stratum_1,
    COUNT_BIG(DISTINCT person_id) AS count_value
  FROM my_omop.dbo.PERSON p1
    INNER JOIN (SELECT
                  subject_id,
                  cohort_definition_id,
                  cohort_start_date,
                  cohort_end_date
                FROM #HERACLES_cohort) c1
      ON p1.person_id = c1.subject_id
  GROUP BY c1.cohort_definition_id, RACE_CONCEPT_ID;
INSERT INTO my_omop.dbo.HERACLES_results (cohort_definition_id, analysis_id, stratum_1, count_value)
  SELECT
    c1.cohort_definition_id,
    5                             AS analysis_id,
    ETHNICITY_CONCEPT_ID          AS stratum_1,
    COUNT_BIG(DISTINCT person_id) AS count_value
  FROM my_omop.dbo.PERSON p1
    INNER JOIN (SELECT
                  subject_id,
                  cohort_definition_id,
                  cohort_start_date,
                  cohort_end_date
                FROM #HERACLES_cohort) c1
      ON p1.person_id = c1.subject_id
  GROUP BY c1.cohort_definition_id, ETHNICITY_CONCEPT_ID;
IF OBJECT_ID('temp_dates', 'U') IS NOT NULL --This should only do something in Oracle
    DROP TABLE temp_dates;

SELECT DISTINCT YEAR(observation_period_start_date) AS obs_year
INTO
    #temp_dates
FROM my_omop.dbo.PERSON p1
  INNER JOIN (SELECT
                subject_id,
                cohort_definition_id,
                cohort_start_date,
                cohort_end_date
              FROM #HERACLES_cohort) c1
    ON p1.person_id = c1.subject_id
  INNER JOIN
  my_omop.dbo.observation_period op1
    ON p1.person_id = op1.person_id;
INSERT INTO my_omop.dbo.HERACLES_results (cohort_definition_id, analysis_id, stratum_1, stratum_2, stratum_3, count_value)
  SELECT
    c1.cohort_definition_id,
    116                                          AS analysis_id,
    t1.obs_year                                  AS stratum_1,
    p1.gender_concept_id                         AS stratum_2,
    floor((t1.obs_year - p1.year_of_birth) / 10) AS stratum_3,
    COUNT_BIG(DISTINCT p1.PERSON_ID)             AS count_value
  FROM my_omop.dbo.PERSON p1
    INNER JOIN (SELECT
                  subject_id,
                  cohort_definition_id,
                  cohort_start_date,
                  cohort_end_date
                FROM #HERACLES_cohort) c1
      ON p1.person_id = c1.subject_id
    INNER JOIN
    my_omop.dbo.observation_period op1
      ON p1.person_id = op1.person_id
    ,
    #temp_dates t1
  WHERE year(op1.OBSERVATION_PERIOD_START_DATE) <= t1.obs_year
        AND year(op1.OBSERVATION_PERIOD_END_DATE) >= t1.obs_year
  GROUP BY c1.cohort_definition_id,
    t1.obs_year,
    p1.gender_concept_id,
    floor((t1.obs_year - p1.year_of_birth) / 10);
TRUNCATE TABLE #temp_dates;
DROP TABLE #temp_dates;
IF OBJECT_ID('temp_dates', 'U') IS NOT NULL --This should only do something in Oracle
    DROP TABLE temp_dates;

SELECT DISTINCT YEAR(observation_period_start_date) * 100 + MONTH(observation_period_start_date) AS obs_month
INTO
    #temp_dates
FROM my_omop.dbo.PERSON p1
  INNER JOIN (SELECT
                subject_id,
                cohort_definition_id,
                cohort_start_date,
                cohort_end_date
              FROM #HERACLES_cohort) c1
    ON p1.person_id = c1.subject_id
  INNER JOIN
  my_omop.dbo.observation_period op1
    ON p1.person_id = op1.person_id;
INSERT INTO my_omop.dbo.HERACLES_results (cohort_definition_id, analysis_id, stratum_1, count_value)
  SELECT
    c1.cohort_definition_id,
    117                               AS analysis_id,
    t1.obs_month                      AS stratum_1,
    COUNT_BIG(DISTINCT op1.PERSON_ID) AS count_value
  FROM my_omop.dbo.PERSON p1
    INNER JOIN (SELECT
                  subject_id,
                  cohort_definition_id,
                  cohort_start_date,
                  cohort_end_date
                FROM #HERACLES_cohort) c1
      ON p1.person_id = c1.subject_id
    INNER JOIN
    my_omop.dbo.observation_period op1
      ON p1.person_id = op1.person_id
    ,
    #temp_dates t1
  WHERE YEAR(observation_period_start_date) * 100 + MONTH(observation_period_start_date) <= t1.obs_month
        AND YEAR(observation_period_end_date) * 100 + MONTH(observation_period_end_date) >= t1.obs_month
  GROUP BY c1.cohort_definition_id, t1.obs_month;
TRUNCATE TABLE #temp_dates;
DROP TABLE #temp_dates;
INSERT INTO my_omop.dbo.HERACLES_results (cohort_definition_id, analysis_id, stratum_1, count_value)
  SELECT
    c1.cohort_definition_id,
    200                               AS analysis_id,
    --
    --
    vo1.visit_CONCEPT_ID              AS stratum_1,
    --
    COUNT_BIG(DISTINCT vo1.PERSON_ID) AS count_value
  FROM
    my_omop.dbo.visit_occurrence vo1
    INNER JOIN (SELECT
                  subject_id,
                  cohort_definition_id,
                  cohort_start_date,
                  cohort_end_date
                FROM #HERACLES_cohort) c1
      ON vo1.person_id = c1.subject_id
  --
  GROUP BY c1.cohort_definition_id,
    --
    --
    vo1.visit_CONCEPT_ID
--
;
INSERT INTO my_omop.dbo.HERACLES_results (cohort_definition_id, analysis_id, stratum_1, count_value)
  SELECT
    c1.cohort_definition_id,
    201                      AS analysis_id,
    --
    --
    vo1.visit_CONCEPT_ID     AS stratum_1,
    --
    COUNT_BIG(vo1.PERSON_ID) AS count_value
  FROM
    my_omop.dbo.visit_occurrence vo1
    INNER JOIN (SELECT
                  subject_id,
                  cohort_definition_id,
                  cohort_start_date,
                  cohort_end_date
                FROM #HERACLES_cohort) c1
      ON vo1.person_id = c1.subject_id
  --
  GROUP BY c1.cohort_definition_id,
    --
    --
    vo1.visit_CONCEPT_ID
--
;
INSERT INTO my_omop.dbo.HERACLES_results (cohort_definition_id, analysis_id, stratum_1, stratum_2, count_value)
  SELECT
    c1.cohort_definition_id,
    202                                                    AS analysis_id,
    --
    --
    vo1.visit_CONCEPT_ID                                   AS stratum_1,
    --
    YEAR(visit_start_date) * 100 + month(visit_start_date) AS stratum_2,
    COUNT_BIG(DISTINCT PERSON_ID)                          AS count_value
  FROM
    my_omop.dbo.visit_occurrence vo1
    INNER JOIN (SELECT
                  subject_id,
                  cohort_definition_id,
                  cohort_start_date,
                  cohort_end_date
                FROM #HERACLES_cohort) c1
      ON vo1.person_id = c1.subject_id
  --
  GROUP BY c1.cohort_definition_id,
    --
    --
    vo1.visit_CONCEPT_ID,
    --
    YEAR(visit_start_date) * 100 + month(visit_start_date);
INSERT INTO my_omop.dbo.HERACLES_results_dist (cohort_definition_id, analysis_id, count_value, min_value, max_value, avg_value, stdev_value, median_value, p10_value, p25_value, p75_value, p90_value)
  SELECT
    cohort_definition_id,
    203                    AS analysis_id,
    COUNT_BIG(count_value) AS count_value,
    min(count_value)       AS min_value,
    max(count_value)       AS max_value,
    avg(1.0 * count_value) AS avg_value,
    stdev(count_value)     AS stdev_value,
    max(CASE WHEN p1 <= 0.50
      THEN count_value
        ELSE -9999 END)    AS median_value,
    max(CASE WHEN p1 <= 0.10
      THEN count_value
        ELSE -9999 END)    AS p10_value,
    max(CASE WHEN p1 <= 0.25
      THEN count_value
        ELSE -9999 END)    AS p25_value,
    max(CASE WHEN p1 <= 0.75
      THEN count_value
        ELSE -9999 END)    AS p75_value,
    max(CASE WHEN p1 <= 0.90
      THEN count_value
        ELSE -9999 END)    AS p90_value
  FROM
    (
      SELECT
        cohort_definition_id,
        num_visits                                                               AS count_value,
        1.0 * (row_number()
        OVER (PARTITION BY cohort_definition_id
          ORDER BY num_visits)) / (COUNT_BIG(num_visits)
                                   OVER (PARTITION BY cohort_definition_id) + 1) AS p1
      FROM
        (
          SELECT
            c1.cohort_definition_id,
            vo1.person_id,
            --
            --
            COUNT_BIG(DISTINCT vo1.visit_concept_id) AS num_visits
          --
          FROM
            my_omop.dbo.visit_occurrence vo1
            INNER JOIN (SELECT
                          subject_id,
                          cohort_definition_id,
                          cohort_start_date,
                          cohort_end_date
                        FROM #HERACLES_cohort) c1
              ON vo1.person_id = c1.subject_id
          --
          GROUP BY c1.cohort_definition_id, vo1.person_id
        ) t0
    ) t1
  GROUP BY cohort_definition_id;
INSERT INTO my_omop.dbo.HERACLES_results (cohort_definition_id, analysis_id, stratum_1, stratum_2, stratum_3, stratum_4, count_value)
  SELECT
    c1.cohort_definition_id,
    204                                                     AS analysis_id,
    --
    --
    vo1.visit_CONCEPT_ID                                    AS stratum_1,
    --
    YEAR(visit_start_date)                                  AS stratum_2,
    p1.gender_concept_id                                    AS stratum_3,
    floor((year(visit_start_date) - p1.year_of_birth) / 10) AS stratum_4,
    COUNT_BIG(DISTINCT p1.PERSON_ID)                        AS count_value
  FROM my_omop.dbo.PERSON p1
    INNER JOIN (SELECT
                  subject_id,
                  cohort_definition_id,
                  cohort_start_date,
                  cohort_end_date
                FROM #HERACLES_cohort) c1
      ON p1.person_id = c1.subject_id
    INNER JOIN
    my_omop.dbo.visit_occurrence vo1
      ON p1.person_id = vo1.person_id
  --
  GROUP BY c1.cohort_definition_id,
    --
    --
    vo1.visit_CONCEPT_ID,
    --
    YEAR(visit_start_date),
    p1.gender_concept_id,
    floor((year(visit_start_date) - p1.year_of_birth) / 10);
INSERT INTO my_omop.dbo.HERACLES_results_dist (cohort_definition_id, analysis_id, stratum_1, stratum_2, count_value, min_value, max_value, avg_value, stdev_value, median_value, p10_value, p25_value, p75_value, p90_value)
  SELECT
    cohort_definition_id,
    206                    AS analysis_id,
    visit_CONCEPT_ID       AS stratum_1,
    gender_concept_id      AS stratum_2,
    COUNT_BIG(count_value) AS count_value,
    min(count_value)       AS min_value,
    max(count_value)       AS max_value,
    avg(1.0 * count_value) AS avg_value,
    stdev(count_value)     AS stdev_value,
    max(CASE WHEN p1 <= 0.50
      THEN count_value
        ELSE -9999 END)    AS median_value,
    max(CASE WHEN p1 <= 0.10
      THEN count_value
        ELSE -9999 END)    AS p10_value,
    max(CASE WHEN p1 <= 0.25
      THEN count_value
        ELSE -9999 END)    AS p25_value,
    max(CASE WHEN p1 <= 0.75
      THEN count_value
        ELSE -9999 END)    AS p75_value,
    max(CASE WHEN p1 <= 0.90
      THEN count_value
        ELSE -9999 END)    AS p90_value
  FROM
    (
      SELECT
        c1.cohort_definition_id,
        --
        --
        vo1.visit_CONCEPT_ID,
        --
        p1.gender_concept_id,
        vo1.visit_start_year - p1.year_of_birth                      AS count_value,
        1.0 * (row_number()
        OVER (PARTITION BY c1.cohort_definition_id, vo1.visit_CONCEPT_ID, p1.gender_concept_id
          ORDER BY vo1.visit_start_year - p1.year_of_birth)) / (COUNT_BIG(vo1.visit_start_year - p1.year_of_birth)
                                                                OVER (PARTITION BY c1.cohort_definition_id, vo1.visit_CONCEPT_ID, p1.gender_concept_id)
                                                                + 1) AS p1
      FROM my_omop.dbo.PERSON p1
        INNER JOIN (SELECT
                      subject_id,
                      cohort_definition_id,
                      cohort_start_date,
                      cohort_end_date
                    FROM #HERACLES_cohort) c1
          ON p1.person_id = c1.subject_id
        INNER JOIN
        (SELECT
           vo0.person_id,
           --
           --
           vo0.visit_CONCEPT_ID,
           --
           min(year(vo0.visit_start_date)) AS visit_start_year
         FROM my_omop.dbo.visit_occurrence vo0
           INNER JOIN (SELECT
                         subject_id,
                         cohort_definition_id,
                         cohort_start_date,
                         cohort_end_date
                       FROM #HERACLES_cohort) c1
             ON vo0.person_id = c1.subject_id
         --
         GROUP BY person_id,
           --
           --
           vo0.visit_CONCEPT_ID
          --place_of_service_concept_id
        ) vo1
          ON p1.person_id = vo1.person_id
    ) t1
  GROUP BY cohort_definition_id,
    visit_CONCEPT_ID,
    gender_concept_id;
INSERT INTO my_omop.dbo.HERACLES_results_dist (cohort_definition_id, analysis_id, stratum_1, count_value, min_value, max_value, avg_value, stdev_value, median_value, p10_value, p25_value, p75_value, p90_value)
  SELECT
    cohort_definition_id,
    211                    AS analysis_id,
    --
    --
    visit_CONCEPT_ID
      --
                           AS stratum_1,
    COUNT_BIG(count_value) AS count_value,
    min(count_value)       AS min_value,
    max(count_value)       AS max_value,
    avg(1.0 * count_value) AS avg_value,
    stdev(count_value)     AS stdev_value,
    max(CASE WHEN p1 <= 0.50
      THEN count_value
        ELSE -9999 END)    AS median_value,
    max(CASE WHEN p1 <= 0.10
      THEN count_value
        ELSE -9999 END)    AS p10_value,
    max(CASE WHEN p1 <= 0.25
      THEN count_value
        ELSE -9999 END)    AS p25_value,
    max(CASE WHEN p1 <= 0.75
      THEN count_value
        ELSE -9999 END)    AS p75_value,
    max(CASE WHEN p1 <= 0.90
      THEN count_value
        ELSE -9999 END)    AS p90_value
  FROM
    (
      SELECT
        c1.cohort_definition_id,
        --
        --
        vo1.visit_CONCEPT_ID,
        --
        datediff(DD, visit_start_date, visit_end_date)                          AS count_value,
        1.0 * (row_number()
        OVER (PARTITION BY c1.cohort_definition_id, vo1.visit_CONCEPT_ID
          ORDER BY datediff(DD, visit_start_date, visit_end_date))) /
        (COUNT_BIG(datediff(DD, visit_start_date, visit_end_date))
         OVER (PARTITION BY c1.cohort_definition_id, vo1.visit_CONCEPT_ID) + 1) AS p1
      FROM my_omop.dbo.visit_occurrence vo1
        INNER JOIN (SELECT
                      subject_id,
                      cohort_definition_id,
                      cohort_start_date,
                      cohort_end_date
                    FROM #HERACLES_cohort) c1
          ON vo1.person_id = c1.subject_id
      --
    ) t1
  GROUP BY cohort_definition_id,
    --
    --
    visit_CONCEPT_ID
--
;
TRUNCATE TABLE #HERACLES_cohort;
DROP TABLE #HERACLES_cohort;
DELETE FROM my_omop.dbo.HERACLES_results
WHERE count_value <= 10;
DELETE FROM my_omop.dbo.HERACLES_results_dist
WHERE count_value <= 10

]; SQL state [S0001]; error code [2714]; There is already an object named '#temp_dates' in the database.; nested exception is java.sql.BatchUpdateException: There is already an object named '#temp_dates' in the database.
        at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:84)
        at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:81)
        at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:81)
        at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:416)
        at org.springframework.jdbc.core.JdbcTemplate.batchUpdate(JdbcTemplate.java:611)
        at org.ohdsi.webapi.cohortanalysis.CohortAnalysisTasklet$1.doInTransaction(CohortAnalysisTasklet.java:61)
        at org.ohdsi.webapi.cohortanalysis.CohortAnalysisTasklet$1.doInTransaction(CohortAnalysisTasklet.java:44)
        at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:133)
        at org.ohdsi.webapi.cohortanalysis.CohortAnalysisTasklet.execute(CohortAnalysisTasklet.java:44)
        at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:406)
        at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:330)
        at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:133)
        at org.springframework.batch.core.step.tasklet.TaskletStep$2.doInChunkContext(TaskletStep.java:271)
        at org.springframework.batch.core.scope.context.StepContextRepeatCallback.doInIteration(StepContextRepeatCallback.java:77)
        at org.springframework.batch.repeat.support.RepeatTemplate.getNextResult(RepeatTemplate.java:368)
        at org.springframework.batch.repeat.support.RepeatTemplate.executeInternal(RepeatTemplate.java:215)
        at org.springframework.batch.repeat.support.RepeatTemplate.iterate(RepeatTemplate.java:144)
        at org.springframework.batch.core.step.tasklet.TaskletStep.doExecute(TaskletStep.java:257)
        at org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:198)
        at org.springframework.batch.core.job.SimpleStepHandler.handleStep(SimpleStepHandler.java:148)
        at org.springframework.batch.core.job.AbstractJob.handleStep(AbstractJob.java:386)
        at org.springframework.batch.core.job.SimpleJob.doExecute(SimpleJob.java:135)
        at org.springframework.batch.core.job.AbstractJob.execute(AbstractJob.java:304)
        at org.springframework.batch.core.launch.support.SimpleJobLauncher$1.run(SimpleJobLauncher.java:135)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.sql.BatchUpdateException: There is already an object named '#temp_dates' in the database.
        at net.sourceforge.jtds.jdbc.JtdsStatement.executeBatch(JtdsStatement.java:1069)
        at org.springframework.jdbc.core.JdbcTemplate$1BatchUpdateStatementCallback.doInStatement(JdbcTemplate.java:572)
        at org.springframework.jdbc.core.JdbcTemplate$1BatchUpdateStatementCallback.doInStatement(JdbcTemplate.java:559)
        at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:405)
        ... 23 more

Heracles: Cannot set id to null in table heracles_visualization_data

We are using SQL Server 2012 and the Microsoft JDBC driver. When we open the Heracles runner UI, we encounter a WebAPI exception associated with getCohortSummaryData (sorry, I no longer have the stack trace). The generated SQL was attempting to insert into heracles_visualization_data with id=null. I discovered the problem was due to the annotations on VisualizationData.id being inconsistent with the DDL; the DDL does not set id to autoincrement. Setting heracles_visualization_data.id to autoincrement (see below) appears to have resolved the issue.

ALTER TABLE [HERACLES_VISUALIZATION_DATA] DROP COLUMN id
ALTER TABLE [HERACLES_VISUALIZATION_DATA] ADD id integer IDENTITY(1,1)

TherapyPathResults?

Frank,

What does the therapypathresults route do in WebAPI? We are working on some TxPath services for our new app (PANACEA) and didn't want to step on any txpath stuff already in there.

Thanks

Jon

ERROR: relation "concept" does not exist (when using PostgreSQL)

I get this error

javax.servlet.ServletException: java.lang.RuntimeException: org.postgresql.util.PSQLException: ERROR: relation "concept" does not exist
  Position: 83
    org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:397)
    org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:381)
    org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:344)
    org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:221)
    org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)

when using URL

http://localhost:8080/WebAPI/vocabulary/concept/0

Here's the content of my web.xml:

<?xml version="1.0" encoding="UTF-8"?>

<web-app version="2.5" xmlns="http://java.sun.com/xml/ns/javaee" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_2_5.xsd">
    <context-param>
        <param-name>database.driver</param-name>
        <param-value>org.postgresql.Driver</param-value>
    </context-param>
    <context-param>
        <param-name>database.url</param-name>
        <!-- example for Microsoft SQL Server -->
        <param-value>jdbc:postgresql://localhost/hku?user=vocabuser&amp;password=secret</param-value> 
    </context-param>
    <context-param>
        <param-name>database.dialect</param-name>
        <param-value>postgresql</param-value>
    </context-param>    
    <context-param>
        <param-name>database.cdm.schema</param-name>
        <param-value>vocabulary</param-value>
    </context-param>    
    <context-param>
        <param-name>database.results.schema</param-name>
        <param-value>vocabulary</param-value>
    </context-param>    
    <servlet>
        <servlet-name>Jersey Web Application</servlet-name>
        <servlet-class>org.glassfish.jersey.servlet.ServletContainer</servlet-class>
        <init-param>
            <param-name>jersey.config.server.provider.packages</param-name>
            <param-value>org.ohdsi.webapi</param-value>
        </init-param>
        <load-on-startup>1</load-on-startup>
    </servlet>
    <servlet-mapping>
        <servlet-name>Jersey Web Application</servlet-name>
        <url-pattern>/*</url-pattern>
    </servlet-mapping>
</web-app>

So far I've just uploaded the vocabulary tables (V4) into the vocabulary schema. Using PostgreSQL, Java 1.7, Apache + Tomcat 7.

Connection timeout downloading maven dependencies

I am trying to set up WebAPI through IntelliJ and am getting the error below while compiling.

[ERROR] Failed to execute goal on project WebAPI: Could not resolve dependencies for project org.ohdsi:WebAPI:war:1.0.0-SNAPSHOT: Failed to collect dependencies for [org.springframework.boot:spring-boot-starter:jar:1.2.1.RELEASE (compile), org.springframework.boot:spring-boot-starter-log4j:jar:1.2.1.RELEASE (compile), org.springframework.boot:spring-boot-starter-web:jar:1.2.1.RELEASE (compile), org.springframework.boot:spring-boot-starter-tomcat:jar:1.2.1.RELEASE (provided), org.springframework.batch:spring-batch-core:jar:3.0.3.RELEASE (compile), org.springframework.boot:spring-boot-starter-jdbc:jar:1.2.1.RELEASE (compile), org.springframework.boot:spring-boot-starter-jersey:jar:1.2.1.RELEASE (compile), org.springframework.boot:spring-boot-starter-data-jpa:jar:1.2.1.RELEASE (compile), javax.servlet:javax.servlet-api:jar:3.1.0 (provided), org.ohdsi.sql:SqlRender:jar:1.0.0-SNAPSHOT (compile), commons-dbutils:commons-dbutils:jar:1.6 (compile), commons-io:commons-io:jar:2.4 (compile), org.springframework.boot:spring-boot-starter-test:jar:1.2.1.RELEASE (test), org.apache.commons:commons-lang3:jar:3.3.2 (compile), org.flywaydb:flyway-core:jar:3.1 (compile), org.apache.httpcomponents:httpclient:jar:4.3.6 (compile), commons-httpclient:commons-httpclient:jar:3.1 (compile), org.springframework.batch:spring-batch-admin-manager:jar:2.0.0.M1 (compile), org.postgresql:postgresql:jar:9.4-1201-jdbc41 (compile)]: Failed to read artifact descriptor for org.ohdsi.sql:SqlRender:jar:1.0.0-SNAPSHOT: Could not transfer artifact org.ohdsi.sql:SqlRender:pom:1.0.0-SNAPSHOT from/to ohdsi (http://repo.ohdsi.org:8085/nexus/content/repositories/releases): Connect to repo.ohdsi.org:8085 [repo.ohdsi.org/52.0.148.6] failed: Connection timed out: connect -> [Help 1]

Are there any settings I need to change?

source service refresh route

Add a route to the source service to invalidate the cache and reload the sources from the database without an API restart.
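One possible shape for the backing logic of such a route, with framework details omitted (the class and method names here are hypothetical, not existing WebAPI code): hold the cached sources behind an atomic reference that a refresh endpoint can repopulate on demand.

```java
import java.util.Collections;
import java.util.List;
import java.util.concurrent.atomic.AtomicReference;
import java.util.function.Supplier;

public class SourceCache {

    private final Supplier<List<String>> loader;                 // a DB query in real code
    private final AtomicReference<List<String>> cache = new AtomicReference<>();

    public SourceCache(Supplier<List<String>> loader) {
        this.loader = loader;
        refresh();
    }

    /** What a refresh route's handler would call: reload without an API restart. */
    public final List<String> refresh() {
        List<String> fresh = Collections.unmodifiableList(loader.get());
        cache.set(fresh);                                        // atomic swap; readers never block
        return fresh;
    }

    public List<String> getSources() {
        return cache.get();
    }
}
```

The atomic swap means in-flight requests keep seeing a consistent snapshot while the refresh replaces the list.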

error in caching?

I'm seeing what I think is an error in the way the Heracles visualization data is being cached. I'm seeing this on our SQL Server platform.

Hibernate: insert into heracles_visualization_data (COHORT_DEFINITION_ID, DATA, DRILLDOWN_ID, END_TIME, SOURCE_ID, VISUALIZATION_KEY) values (?, ?, ?, ?, ?, ?)
2015-09-11 23:28:00.011 WARN http-nio-8080-exec-7 org.hibernate.engine.jdbc.spi.SqlExceptionHelper - - SQL Error: 273, SQLState: S0001
2015-09-11 23:28:00.011 ERROR http-nio-8080-exec-7 org.hibernate.engine.jdbc.spi.SqlExceptionHelper - - Cannot insert an explicit value into a timestamp column. Use INSERT with a column list to exclude the timestamp column, or insert a DEFAULT into the timestamp column.
2015-09-11 23:28:00.011 DEBUG http-nio-8080-exec-7 org.springframework.orm.jpa.JpaTransactionManager - - Initiating transaction rollback
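The error message suggests one of the columns in `heracles_visualization_data` was created with SQL Server's `timestamp` type, which is a synonym for `rowversion`: a system-generated value that cannot accept an explicit value on INSERT. A likely fix (assuming the offending column is `END_TIME` from the insert list above, which should hold an actual date/time) is to redefine it as `datetime2`; this is a sketch of the DDL change, not the project's actual migration:

```sql
-- Assumption: END_TIME was created as the SQL Server "timestamp"
-- (rowversion) type, which rejects explicit inserted values.
-- Recreating it as datetime2 lets Hibernate insert the value it binds.
ALTER TABLE heracles_visualization_data DROP COLUMN END_TIME;
ALTER TABLE heracles_visualization_data ADD END_TIME datetime2 NULL;
```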
