Let's Build: FHIR from Jupyter

In this lab, we import the following Jupyter notebooks into IBM Watson Studio on IBM Cloud Pak for Data:

  1. The FHIR API
  2. FHIR in Spark
  3. Predictive Modeling

Notebook 1 is configured to use an instance of the IBM FHIR Server (running on IBM Cloud) which has been loaded with sample data generated from the Synthea™️ Patient Generator.
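
For orientation, the sketch below shows the kind of FHIR R4 search request notebook 1 makes. It is a minimal example, not the notebook's exact code: the base URL and credentials are placeholders for the lab's actual endpoint.

```python
# Minimal sketch of a FHIR R4 Patient search against an IBM FHIR Server.
# FHIR_BASE and AUTH are placeholders; use the endpoint provided for the lab.
import requests

FHIR_BASE = "https://example-fhir-server/fhir-server/api/v4"
AUTH = ("fhiruser", "change-password")

# Search for Patient resources; the server returns a searchset Bundle.
resp = requests.get(
    f"{FHIR_BASE}/Patient",
    params={"_count": 10},
    auth=AUTH,
    headers={"Accept": "application/fhir+json"},
)
resp.raise_for_status()
bundle = resp.json()

for entry in bundle.get("entry", []):
    patient = entry["resource"]
    name = patient.get("name", [{}])[0]
    print(patient["id"], name.get("family"), name.get("given"))
```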

Notebook 2 requires access to a Cloud Object Storage bucket with FHIR R4 resources from SyntheticMass.
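
The sketch below illustrates one way to read such bulk FHIR resources (newline-delimited JSON) from Cloud Object Storage into a Spark DataFrame. The Stocator-style COS settings, the service name "cosservice", the bucket, and the keys are placeholders; how COS access is configured can vary by environment.

```python
# Minimal sketch of reading bulk FHIR R4 resources (NDJSON) from Cloud Object
# Storage into a Spark DataFrame. The COS service name, bucket, and keys are
# placeholders, not the lab's actual values.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("fhir-in-spark").getOrCreate()

hconf = spark.sparkContext._jsc.hadoopConfiguration()
hconf.set("fs.cos.cosservice.endpoint", "https://s3.us.cloud-object-storage.appdomain.cloud")
hconf.set("fs.cos.cosservice.access.key", "YOUR_ACCESS_KEY")
hconf.set("fs.cos.cosservice.secret.key", "YOUR_SECRET_KEY")

# Bulk-export files contain one FHIR resource per line, so spark.read.json
# can load them and infer a schema directly.
patients = spark.read.json("cos://my-bucket.cosservice/Patient.ndjson")
patients.select("id", "gender", "birthDate").show(5, truncate=False)
```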

Notebook 3 builds a predictive model from the Parquet files that notebook 2 produces from features extracted from the FHIR resources.
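
As a rough illustration of that modeling step, the sketch below trains a simple classifier on a Parquet feature table with Spark ML. The path, feature columns, and label column are hypothetical stand-ins for what notebook 2 actually writes.

```python
# Minimal sketch of building a predictive model from a Parquet feature table
# with Spark ML. The path and column names below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("fhir-predictive-model").getOrCreate()

features = spark.read.parquet("cos://my-bucket.cosservice/features.parquet")

assembler = VectorAssembler(
    inputCols=["age", "bmi", "systolic_bp"],  # hypothetical feature columns
    outputCol="features",
)
lr = LogisticRegression(featuresCol="features", labelCol="label")

train, test = features.randomSplit([0.8, 0.2], seed=42)
model = Pipeline(stages=[assembler, lr]).fit(train)
model.transform(test).select("label", "prediction", "probability").show(5)
```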

Notebook 1 should work from almost any Jupyter notebook environment, but notebooks 2 and 3 use Apache Spark to process the bulk FHIR resources into a DataFrame and therefore must run in a Jupyter environment with access to Spark.

Getting started

Download the notebooks

Clone or download the notebooks from this repo to your local system.

Register / log in to IBM Cloud

  1. Navigate to https://dataplatform.cloud.ibm.com/registration/stepone
  2. Choose the Dallas region so that you are closest to the data for this lab

If you already have an IBMid, enter it on the right. If not, enter a valid email address on the left, agree to the terms, click Next, and check your email to complete the registration process.

Once registered and logged in, continue to the Watson Studio dashboard.

Create a project

  1. From the IBM Cloud Pak for Data dashboard, click New project +
  2. Select Create an empty project
  3. Give the project a name (e.g. FHIR from Jupyter) and click Add to define a storage service
  4. Select a plan (the free Lite plan should be fine) and click Create
  5. Click Refresh to see your Cloud Object Storage instance appear and then click Create

Create a notebook

From the Project dashboard, click Add to project + and choose the Notebook asset type.

Select the From file tab, choose the Default Spark 2.4 & Python 3.7 environment, upload one of the notebook files you downloaded in the Download the notebooks section, and click Create.

Repeat this process for the other notebooks you want to load.
