bcgov / ocwa

Output Checking Workflow App

License: Apache License 2.0

Languages: JavaScript 66.05%, Groovy 10.09%, Python 8.61%, HCL 3.72%, Shell 3.21%, Gherkin 2.68%, CSS 1.71%, Smarty 1.15%, Mustache 0.92%, TypeScript 0.70%, Dockerfile 0.68%, Pug 0.19%, HTML 0.17%, SCSS 0.11%
Topics: output-checking, microservices, container, nodejs, reactjs, python, docker, kubernetes, helm, statistical-disclosure-control

ocwa's Introduction

Output Checker Workflow App

OCWA (pronounced "aqua") is a microservice application suite that can store, validate, and enforce file export policies for the purpose of output checking.


Installation

OCWA is written in both Node.js and Python 3. Docker is also strongly recommended on Windows. For each of the OCWA components, refer to the associated README file for specific instructions. If you want to run the integration tests, you will also need Katalon Studio (with language support for Groovy and Gherkin).

Prerequisites

  • Python 3.6 or newer
  • npm 6.4.1 or newer
  • node 10.15.1 LTS or newer
  • MongoDB 4.0 or newer
  • Docker 18.09.1 or newer
  • Katalon Studio 5.10 or newer
  • Minio (Storage API)
  • Tusd (Storage API)

Operating System

OCWA was developed primarily on bare metal on macOS, with a combination of bare metal and Docker on Windows (Docker for the Python APIs), and it has been deployed on Linux using Terraform and on Kubernetes using Helm. Running Terraform from Windows does not currently work; we haven't had enough time to devote to fixing that.

Components

Forum API

The Forum API is a Node.js API providing topics (with subtopics), comments, and the permissions for them. API docs following the OpenAPI v3 specification are available by running the API and visiting /v1/api-docs. The Forum API also provides a websocket interface for being notified when new topics or comments relevant to the user are created.
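For example, once the Forum API is running, the OpenAPI document can be fetched programmatically. A minimal sketch in Python (port 3000 is the default from the port list below; the requests library is an assumption of this sketch, not a project dependency):

import requests

# Fetch the Forum API's OpenAPI v3 document and list its paths.
spec = requests.get("http://localhost:3000/v1/api-docs", timeout=5).json()
print(spec.get("info", {}).get("title"))
for path in sorted(spec.get("paths", {})):
    print(path)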

Policy API

The Policy API is a Python API that provides policies. A policy consists of multiple rules, and each rule defines Python source to execute against a file. Policies and rules are specified in the HCL language. API docs following the OpenAPI v3 specification are available by running the API and visiting /v1/api-docs.
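To illustrate the idea only: a rule's Python source refers to file attributes through placeholders such as ${file.content} (see the example rule quoted in the issues below). A hypothetical sketch of evaluating such a rule against an in-memory file, which is not the actual Policy or Validation API implementation:

import re

# Hypothetical illustration of rule evaluation: substitute the ${file.content}
# placeholder, then execute the rule source against the file bytes.
rule_source = "print(not bool(re.search(r'[\\w]{1}[\\d]{9}', ${file.content}.decode('utf-8'))))"
file_bytes = b"summary table, no study ids here"

evaluated = rule_source.replace("${file.content}", "file_bytes")
exec(evaluated, {"re": re, "file_bytes": file_bytes})  # prints True when no study id is found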

Request API

The Request API is a Node.js API providing the business logic behind OCWA. It uses the Forum API to provide permissions by creating a topic with a 1:1 correlation to each request. API docs following the OpenAPI v3 specification are available by running the API and visiting /v1/api-docs.

Storage API

The Storage API is a combination of existing open source products. Minio is used to treat any underlying storage as though it were S3, so that only one backend needs to be supported whether the actual backend is GCP, Azure, local disk, or S3 itself. Tusd is used to support large file uploads so that they can be resumed if interrupted, for example by a dropped connection.
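As a rough sketch of talking to the Minio side of the Storage API (using the Minio Python SDK; the credentials and bucket name below are placeholders for whatever your storage configuration defines, and port 9000 is the default from the port list below):

from minio import Minio

# Placeholder credentials and bucket name; use the values from your config.
client = Minio("localhost:9000", access_key="minio", secret_key="minio123", secure=False)

bucket = "ocwa-staging"
if not client.bucket_exists(bucket):
    client.make_bucket(bucket)

# Upload a local file. Large, resumable uploads in OCWA itself go through tusd.
client.fput_object(bucket, "output/results.csv", "results.csv")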

Validation API

The Validation API is a Python API that validates files from the Storage API. API docs following the OpenAPI v3 specification are available by running the API and visiting /v1/api-docs. Note that the Validation API is not intended to be public facing and should only be accessed by other APIs using an API key/secret.
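Calls from other services therefore need to carry the key/secret. A minimal sketch (the header names here are assumptions for illustration only; check the Validation API's configuration for the actual scheme, and port 3003 is the default from the port list below):

import requests

# Hypothetical header names -- the real key/secret mechanism is defined by the
# Validation API's configuration, not by this sketch.
headers = {"x-api-key": "my-key", "x-api-secret": "my-secret"}
resp = requests.get("http://localhost:3003/v1/api-docs", headers=headers, timeout=5)
resp.raise_for_status()
print(resp.json().get("info", {}).get("version"))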

Front End

The front end is written in React. It consumes the APIs described above.

Helm

There is a Helm chart at the top level of this repository that deploys all of OCWA in one convenient package. For both of the Helm commands below, make a copy of values.yaml within the helm/ocwa directory (the commands below reference the copy as config.yaml) and modify it to contain the values specific to your deployment.

Helm Install (Kubernetes)

helm dep up ./helm/ocwa
helm install --name ocwa --namespace ocwa ./helm/ocwa -f ./helm/ocwa/config.yaml

Helm Update (Kubernetes)

helm dep up ./helm/ocwa
helm upgrade ocwa ./helm/ocwa  -f ./helm/ocwa/config.yaml

Openshift (OCP)

OpenShift requires a slightly different deployment, as Helm is not supported in the test deployment area. Additionally, because OpenShift runs containers as a random UID, many standard images that work on Kubernetes/Docker do not work on OpenShift. As a result, the following changes are required.

Set the Mongo image (forum-api: mongoImage: repository:) to registry.access.redhat.com/rhscl/mongodb-34-rhel7.

Because the Mongo image is different, the following values must also change:

forum-api:
  dbPod:
    persistence: /var/lib/mongodb/data
    adminEnv: MONGODB_USER
    passEnv: MONGODB_PASSWORD
    dbEnv: MONGODB_DATABASE
    addAdminPassEnv: true
    adminPassEnv: MONGODB_ADMIN_PASSWORD
    initDb: false

Contributing

If you update an API in a way that changes its signature at all, the change must be released under a new version (i.e. /v2 instead of /v1); the APIs are written specifically to make this easy. Such changes should be discussed in an issue before implementation starts. The Travis CI builds must pass before a pull request can be accepted.
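For the Python APIs, keeping old and new versions side by side can be as simple as mounting handlers under a version prefix. A minimal Flask sketch, illustrative only and not the project's actual routing code:

from flask import Flask, Blueprint, jsonify

app = Flask(__name__)

# Each released version gets its own blueprint, so /v1 can stay frozen while
# /v2 introduces signature changes.
v1 = Blueprint("v1", __name__, url_prefix="/v1")
v2 = Blueprint("v2", __name__, url_prefix="/v2")

@v1.route("/status")
def status_v1():
    return jsonify({"state": "ok"})

@v2.route("/status")
def status_v2():
    # Hypothetical signature change: v2 adds a detail field.
    return jsonify({"state": "ok", "detail": "all components reachable"})

app.register_blueprint(v1)
app.register_blueprint(v2)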

Please have a read through our Code of Conduct that we expect all project participants to adhere to. It will explain what actions will and will not be tolerated.

License

OCWA is Apache 2.0 licensed.

Notes

Default Port List

Endpoint              Port
Forum WS              2999
Forum WS (Nginx)      3001
Forum Api             3000
Request Api           3002
Validate Api          3003
Policy Api            3004
Formio                3001
Storage Api (Minio)   9000
Storage Api (Tusd)    1080
Front End             8000

Developer Quick Start Guide

After ensuring the prerequisite libraries are installed and cloning this repo, follow the steps below to get the application up and running:

  1. Configure the frontend, forum api, policy api, project api, request api and validate api by copying the default.json.example or default.json.template file in their respective /config folders, renaming it to default.json, and modifying or adding values where appropriate. For the storage API you will need to sign into Minio's web interface at http://localhost:9000 and create a new bucket matching the storage config options defined in the frontend and request api's default.json.
  2. Create virtual environments named venv for both the policy and validate APIs (by running $ virtualenv venv in each directory)
  3. Run the startAll.py script in this directory $ python startAll.py

The script will terminate all the pieces upon CTRL+C (SIGINT). The Node APIs and frontend will automatically restart upon any changes, but the Python ones need a manual restart. This script has not been tested on Windows and is not expected to work there.
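For context, a heavily simplified sketch of what a launcher like startAll.py might do; this is not the actual script, and the commands below are placeholders:

import signal
import subprocess
import sys

# Placeholder commands; the real startAll.py knows the actual components and
# their working directories.
commands = [
    ["npm", "start"],        # e.g. a node API or the frontend
    ["python", "app.py"],    # e.g. a python API inside its venv
]

procs = [subprocess.Popen(cmd) for cmd in commands]

def shutdown(signum, frame):
    # CTRL+C (SIGINT) terminates every piece.
    for proc in procs:
        proc.terminate()
    sys.exit(0)

signal.signal(signal.SIGINT, shutdown)

for proc in procs:
    proc.wait()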

ocwa's People

Contributors

bdolor, brandonsharratt, dependabot[bot], jonesy, jujaga, pripley123, repo-mountie[bot], snyk-bot


ocwa's Issues

Forum Api - Email Notifications

User Story

As a user,

I want to be emailed when a discussion happens

so that I know I need to log in.

Make a templateable email for the forum api that gets sent out on comment submission to every other user involved in the discussion.

Enhance file validation outcome text to make more readable

User Story

As a requester,

I want to see and easily understand what validation rules passed / failed for each file I uploaded

so that I can quickly see if and what changes I need to make to my uploaded files.

Acceptance criteria:
Validation outcome messages display the new text (as specified below)

ENV

  • DEV
  • TEST

Current text is in screen shot.
Updated text:
Not a warning file type
Not a blocked file type
File size is under 3.5Mb
File size is under 5Mb
No study ids are present in the content


Validation UI - Working Limit

OCWA Issue

User Story

As a user,

I want validation to always run correctly

so that I can trust the software.

Test Case

ENV

  • DEV
  • TEST
  • PROD

TESTCASE

Upload a large number of files with a large total size (the size depends on the environment). If you have N files being validated and M rules, such that the sum of the files being actively worked on is greater than the memory of your container, it will fail.

EXPECTED

eventually my files should be validated with real results

ACTUAL

The API crashes and leads to incorrect failures.

Notes

Support a configuration parameter "workingLimit". All validation jobs in the pending state are loaded into a queue. Whenever the workingLimit has not been hit, the API can pull the next item from the queue if and only if doing so would not exceed the working limit; since each job is one rule applied to one file, this is an easy check against the file's CTL.

Note that if the CTL of a file is itself larger than the workingLimit, the job should fail immediately (when the CTL is checked) with the message: "File too large to run validation on"
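A minimal sketch of that scheduling rule, assuming the CTL is a byte count (an assumption of this sketch; substitute whatever measure the Validation API actually tracks):

from collections import deque

WORKING_LIMIT = 512 * 1024 * 1024  # assumed byte budget for in-flight work

def schedule(pending, working_ctl):
    """Pull jobs (one rule, one file each) while staying under the limit.

    pending     -- deque of (job_id, file_ctl) tuples in pending state
    working_ctl -- sum of CTLs for jobs currently being worked on
    """
    started, failed = [], []
    while pending:
        job_id, ctl = pending[0]
        if ctl > WORKING_LIMIT:
            # A single file bigger than the limit can never run.
            failed.append((job_id, "File too large to run validation on"))
            pending.popleft()
        elif working_ctl + ctl <= WORKING_LIMIT:
            pending.popleft()
            working_ctl += ctl
            started.append(job_id)
        else:
            break  # would exceed the working limit; wait for capacity
    return started, failed, working_ctl

# Example: the second job is far over the limit and fails immediately.
print(schedule(deque([("job-a", 1024), ("job-b", 2**40)]), 0))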

Not for release in MVP

Forum Api - Use NGINX

User Story

As a user,

I want websockets to go through nginx first

so that it is less likely to be blocked by corporate firewalls.


Test Case

ENV

ALL

TESTCASE

Log into OCWA, with your web console open, using a wifi connection that blocks websocket connections.

EXPECTED

The Websocket should function properly

ACTUAL

The Websocket errors out and is blocked on connection upgrade.

Fix this by making all websocket connections go through nginx instead, where it performs an automatic upgrade.

As an output checker, I am not getting notified when there are new comments

OCWA Issue

User Story

As an output checker,

I want to be notified whenever a requester has added a comment to the discussion of a request that I am reviewing

so that I can respond to the comment in a timely manner.

Test Case

ENV

STG

TESTCASE

  • As a requestor, submit a request for review
  • As an output checker, claim the same request
  • As a requestor, write a comment in the request's discussion area

EXPECTED

  • Output checker is notified by email that a new comment has been added to the request discussion

ACTUAL

  • No email

ERROR

Notes

New Request Dialog Enhancements

OCWA Issue

User Story

As an Exporter,

I want to use the new request dialog easily

so that I can submit requests as the system intends.

Test Case

ENV

DEV, TEST, PROD

TESTCASE

  • Use the new request dialog

EXPECTED

  • The cancel button should cancel the request regardless of the step it is on
  • Save/Save and Close buttons should be condensed into a single Save Draft button
  • Spacing of the fields should be made easier to read
  • Look at bypassing the first screen to add files directly
  • Show an error inline that checks the server to see if the request name has been taken already

ACTUAL

  • The buttons aren't intuitive, and in the case of the cancel button, the request is saved when transitioning to step 2, which means it is not actually cancelled.
  • Save and Save and Close are redundant
  • You have to save the request to add files
  • If the name is taken already the whole request fails

ERROR

Notes

API Version

User Story

As a developer,

I want version 1 in the apis

so that we can release mvp.

TimeoutError / EHOSTUNREACH error when uploading multiple files simultaneously

User Story

As a requester,

I want to be able to upload multiple output files simultaneously

so that I can efficiently attach my files to my request.

Test Case

ENV

  • DEV
  • TEST

TESTCASE

Steps to reproduce:

  1. Create a new request - fill out request name and click "Add Files"
  2. Drag and drop 5 files to the upload box (I used: the 3 valid files and 2 supporting files in the test folder)
  3. Wait for the uploading to finish
  4. Click "Save and Close"
  5. Find and view the newly created request

EXPECTED

All files would pass validation checks

ACTUAL

Some files have validation check errors and warnings. These warnings/errors are false positives. Sometimes a "TimeoutError: Request timed out" error displays. Other times a "connect EHOSTUNREACH 172.50.8.230:80" error is displayed when hovering over the validation icon to the left of the file name. Refreshing the request page can sometimes resolve the "EHOST" error, but the validation errors persist.

I should also note that the files the validation fails on, and the types of validation failures, are inconsistent over multiple trials of this issue.

Speculating, it seems like there's a timeout happening before all validation checks are done, and that these unrun or in-progress checks are presented in the UI as failed tests.

Rule "studyid_not_in_content" should gracefully pass when unable to read incoming file

OCWA Issue

User Story

As a researcher,

I want the "studyid_not_in_content" rule to not fail on files which do obviously do not have study ids

so that I can submit files such as images and other binary content.

Test Case

ENV

  • DEV
  • TEST
  • PROD

TESTCASE

  • Add any image file

EXPECTED

  • The studyid_not_in_content rule should pass

ACTUAL

  • The studyid_not_in_content rule fails

Notes

The rule needs to be rewritten to handle a failure to decode gracefully (passing the file) instead of hard failing.
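A minimal sketch of the intended behaviour, assuming the rule receives the raw file bytes (names and structure here are illustrative, not the actual rule source; the regex is borrowed from the rule quoted in a later issue):

import re

STUDY_ID_PATTERN = re.compile(r'[\w]{1}[\d]{9}')

def studyid_not_in_content(content: bytes) -> bool:
    """Return True (pass) when no study id is found in the file content."""
    try:
        text = content.decode('utf-8')
    except UnicodeDecodeError:
        # Binary files (images, etc.) cannot contain a readable study id,
        # so pass gracefully instead of hard failing.
        return True
    return not bool(STUDY_ID_PATTERN.search(text))

print(studyid_not_in_content(b'\x89PNG\r\n\x1a\n\xff\xfe'))  # True for binary content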

Inconsistent mimetype/extension detection

OCWA Issue

User Story

As a researcher,

I want to be able to upload any type of file and have the filetype be parsed and handled correctly

so that the application can properly run validation tests on the files and yield useful output.

Test Case

ENV

  • DEV
  • TEST
  • PROD

TESTCASE

  • Login and create new request
  • Click Add files
  • Drag and drop the LICENSE file from the OCWA repository as a file to submit

EXPECTED

  • The file should successfully submit, with a filetype recognized as text/plain (or similar)
  • Subsequent screens should yield validation rule results that make sense for the uploaded file

ACTUAL

  • The file appears to successfully submit, but no filetype is registered with it.
  • Subsequent validation screens do not yield proper results.

ERROR

From Validation API

2019-02-15 19:04:30,436 - DEBUG - Error 'filetype'
Process Process-54:2:
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/multiprocessing/process.py", line 297, in _bootstrap
    self.run()
  File "/usr/local/lib/python3.7/multiprocessing/process.py", line 99, in run
    self._target(*self._args, **self._kwargs)
  File "/app/validator/validator.py", line 37, in validate
    result, message = read_file_and_evaluate(source, resultObj)
  File "/app/validator/validator.py", line 51, in read_file_and_evaluate
    result['file_id'], '${file.content}' in source)
  File "/app/validator/validator.py", line 171, in read_file
    index = file[ftIndex].find('/')
KeyError: 'filetype'

Notes

  • This issue occurs more frequently on Windows, but can be replicated on macOS as well.
  • Markdown files tend to also exhibit this issue, but not in all cases. We haven't been able to isolate the root cause.
  • So far we've discovered that we simply do not have the ${file.filetype} attribute defined, which is causing an error on the Validation API.

Firefox - Upload Files button does not work

User Story

As an exporter,

I want the Upload Files button to work

so that I can attach a file to the request.

Test Case

ENV

DEV

TESTCASE

  1. Using Mozilla Firefox, log into OCWA as an exporter
  2. Create a New Request, fill in unique request name and click add files
  3. Click Upload Files

EXPECTED

  • A standard operating system file selection window should appear.

ACTUAL

  • Nothing happens.

UI minor fixes

OCWA Issue

This is a catch-all issue for minor bugs and tweaks that can be rolled into one PR with minimal change to the surface area of the frontend codebase.

User Story

As a user or developer,

I want these minor interface and development bugs fixed

so that user/developer experience is smoother.

Test Case

ENV

  • DEV
  • TEST
  • PROD

TESTCASE

EXPECTED

ACTUAL

ERROR

  • Escape special characters in search input
  • Change Exporter to Requester in OC sidebar
  • Implement new loading indicator in requests page.

Notes

"My Requests" in the OC interface isn't properly showing my requests

User Story

As an output checker,

I want to be able to see requests assigned [claimed] by me

so that I can easily find requests I am associated with.

Test Case

ENV

TEST

TESTCASE

  1. Log in as an output checker
  2. Assign an unassigned request to yourself
  3. On the main [Kanban view] page, filter by "My Requests"

EXPECTED

I would see the request I just assigned to myself

ACTUAL

I don't see the request I just assigned to myself. Interestingly, some older requests that have been previously claimed do show.

Unable to upload more than 13 files concurrently

OCWA Issue

User Story

As a researcher,

I want to be able to upload more than 13 files at a time via drag and drop

so that all the files I need to add can be added to the validation queue at once.

Test Case

ENV

  • DEV
  • TEST
  • PROD

TESTCASE

  • Create a new request and click Add Files
  • Drag and drop 14 files into the drop-in window

EXPECTED

  • The files should begin to be processed after a reasonable delay

ACTUAL

  • Nothing happens - frontend appears completely frozen.

ERROR

Notes

Tested on both Firefox and Chrome -> issue is not browser specific.

Allow requesters to download in request app

OCWA Issue

User Story

As a Requester,

I want to be able to download files from the request page

so that I can look at files from other members of my team in the SRE.

Test Case

ENV

DEV, TEST, PROD

TESTCASE

  • View any request with files saved to it

EXPECTED

  • Can download files from the request page

ACTUAL

  • Cannot download any files from the request page

ERROR

Notes

Docker Tagged Images

As a developer,

I want travis to build docker images on git tags

so that we can tag releases like mvp.

Remove Development text from document title and update it as history changes

OCWA Issue

User Story

As a user,

I want to have a document title that reflects the current page I am viewing

so that my browser history/back button is easier to read.

Test Case

ENV

DEV, TEST, PROD

TESTCASE

  • Open any page of the application

EXPECTED

  • The document title is no longer listed as Development Version and changes as the page does, for example OCWA | Request 101 (pending)

ACTUAL

  • The title only reads "OCWA [Development Version]"

ERROR

Notes

Request Api - Submission passing when files are invalid

User Story

As a system administrator,

I want requests with invalid files to not pass submission

so that I have confidence in the system and remove unnecessary work from output checkers.


Test Case

ENV

DEV

TESTCASE

Create a request in DEV and upload a file (DEV is important because it's currently impossible to pass the output rules there). Save, and after validation finishes, submit the request.

EXPECTED

An error is returned because the submission is invalid

ACTUAL

The submission is accepted and approved because the system is set to auto approve currently.

Make Variable and Sub-population fields required

User Story

As a manager of the DI Program,

I want to have variable and sub-population information for each request

so that there is a place to flag sensitive variables and to accommodate future governance changes.

Test Case

ENV

  • DEV
  • TEST

Acceptance criteria:
Variable and Sub-population fields are made to be required
Field labels / helper text updated as per below

Describe the data used, to better understand the population and population attributes

VARIABLES: Provide a description of original and self-constructed variables (include full labeling of all variables and value labels)
SUB-POPULATION: In the case of sub-samples and sub-populations, the selection criteria and size of the sub-samples

INSTRUCTION: Please ensure that you also have the following elements, as appropriate, with your output submission: descriptive labeling (ideally alongside each component), information for specific output types, and log files or annotated steps of analysis.

Validation API - Support Working Limit

OCWA Issue

User Story

As a user,

I want to always have files validated

so that I can trust OCWA.

Test Case

Upload many large files simultaneously; at a certain threshold the Validation API will crash and give inconsistent validation results because the container goes OOM and is killed.

ENV

  • DEV
  • TEST
  • PROD

EXPECTED

I should get all results

ACTUAL

It eventually dies, doesn't return all my results, and fails things it was testing without reason.

Notes

From #93
Support a configuration parameter "workingLimit". All validation jobs in the pending state are loaded into a queue. Whenever the workingLimit has not been hit, the API can pull the next item from the queue if and only if doing so would not exceed the working limit; since each job is one rule applied to one file, this is an easy check against the file's CTL.

Note that if the CTL of a file is itself larger than the workingLimit, the job should fail immediately (when the CTL is checked) with the message: "File too large to run validation on"

This is NOT to be released as a part of MVP

Integration tests are not failing Travis build when they fail

OCWA Issue

User Story

As a developer,

I want the Travis CI Integration Tests to properly return a success or failure

so that we actually know if a commit passes or fails our test suites.

Test Case

ENV

  • DEV
  • TEST
  • PROD

TESTCASE

  • Check recent Pull Request builds in Travis CI for command log details.

EXPECTED

  • A failed integration test should fail the entire Travis build.

ACTUAL

  • A failed integration test does not fail the entire Travis build.

ERROR

  • Check recent Pull Request builds in Travis CI for command log details.

Notes

  • This is a Travis CI/CD issue

Direct people who do not have a security group permitting them access to OCWA to a "Not authorized" page

User Story

As an OCWA administrator,

I want to direct people who do not have a security group permitting them access to OCWA to a "Not authorized" page

so that I can provide messages to users that describe what is preventing them from accessing OCWA.

Test Case

ENV

TEST, DEV

TESTCASE

Login to OCWA with an account that is in KeyCloak but does not have an OCWA security group set.

EXPECTED

The user would be directed to a "Not authorized" page

ACTUAL

The user can see the request list page but cannot do things that involve the APIs.

Add "Affirmation of Confidentiality" section to new request form

Add the following text to the new request form
header text: "Affirmation of Confidentiality"
body text: "By completing this form and submitting the output package for review, I affirm that the requested outputs are safe for release and protect the confidentiality of data, to the best of my knowledge."

Also, add something on submit reminding requesters of their obligations, per Beth's comment:
"In submitting an Output Request – I would like to see a reminder that they are responsible for ensuring Outputs are non-disclosive and uphold the secrecy provisions of the Statistics Act. "

As a manager of the DI Program,

I want requesters to affirm the confidentiality of their output requests

so that I reduce the risk of a requester submitting an output for review that is not safe.

ENV

  • DEV
  • TEST

Acceptance criteria:
Affirmation text is displayed on the new request form
A reminder is displayed upon submit regarding responsibilities under the Statistics Act

Display project title associated with a request for output checkers

User Story

As an output checker,

I want to be able to see which project a request is associated with

so that I know which Data Access Request (DAR) to reference when assessing the output files.

Acceptance criteria:
Given: A request has been submitted for review
When: An output checker views the request
Then: The output checker should see the project title associated with the request

Inform requester of errors in a way that allows requester to easily understand what they need to fix in their request

User Story

As a requester,

I want to be informed of errors in a way that allows me to easily understand what I need to fix in my request

so that I can be aware of and make needed changes.

Test Case

ENV

TEST

TESTCASE

Generate an error, e.g. a duplicate request name:

  1. Create a new request
  2. Fill out a name that is the same as a previous request
  3. Save and close the request

EXPECTED

An error message that describes what the issue is and what I need to do without technical details or jargon. The error message would be displayed long enough for me to write down or take a screenshot of the error message.

ACTUAL

An error message (see screenshot) is displayed only temporarily on the screen.

Policy - Rules containing curly brace regex expressions do not work

User Story

As an administrator of a secure analytics environment,

I want the ability to automatically scan uploaded output files for Study IDs

so that I can improve my confidence that row level data is not leaving the environment.

Test Case

ENV

DEV, TEST

TESTCASE

  • The following Python rule should be executable:
    print(not bool(re.search(r'[\\w]{1}[\\d]{9}', ${file.content}.decode('utf-8'))))

EXPECTED

  • The above rule executes and properly yields a pass or a fail

ACTUAL

  • The above rule fails for all files

Requester issue with uploading files (minio/tusd issue that is resulting in files being put in a "Pending" state)

User Story

As a requester,

I want to be able to upload files to my requests

so that I can eventually take the files out of the SAE.


Test Case

ENV

TEST

TESTCASE

Replicating this bug has been a bit difficult; however, I can consistently generate this error by:

  1. Create a new request
  2. Upload multiple valid files (files that don't trigger other warnings/errors) - I used larger files, ~2 MB
  3. Save and close.
  4. Find and view the request

EXPECTED

The files should be displayed on the request

ACTUAL

The files show as "File not found". See screenshot.

See warnings on files but not errors for uploaded files that should only trigger warnings

User Story

As a requester,

I want to see if my uploaded file generated a warning or was blocked

so that I can know if my files are acceptable to be exported.

Test Case

ENV

Test

TESTCASE

Given: I upload a file that should generate a file size warning (the threshold is configurable).
AND the file is a file I have never uploaded before (due to TusD remembering files it's seen before)
When: I view my uploaded file and the warnings/errors associated with it

EXPECTED

Then: I should see that my file has a warning associated with it but not errors

ACTUAL

My file had both "blocked" and "warning" icons

Saving / submitting files workflow causes files to unexpectedly not be included with a request


Test Case

Steps to replicate:
If I have a request with one file already uploaded and saved, and then I upload a second file, I can Submit for Review even though I haven't saved the 2nd file. After submitting, if I look at my request, I can see that the 2nd file is not associated with the request.

ENV

Test

EXPECTED

one of two things:

  1. I shouldn't be able to Submit my request if a file I've uploaded is not going to be included (i.e., if we follow our current process, I should need to save first before I can submit)
  2. this is the better option: I should not need to save a request for the uploaded files to be included in the request when I submit it. Submitting implies that my files are saved and are part of the request

ACTUAL

Only the first file shows on the request. The 2nd file does not display.

Version Number - All Apis

User Story

As a user,

I want version numbers

so that I can report bugs against a specific version.


Test Case

ENV

DEV, TEST

TESTCASE

Make any API call and try to find version information

EXPECTED

There should be a call that returns version information

ACTUAL

There isn't

Add a /version endpoint (that is not version routed) that returns version information about the different APIs, so that it can be added to a front end version page.
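A minimal sketch of such an endpoint for one of the Python APIs (illustrative only; the framework, service name, and version source below are assumptions, not the project's actual code):

from flask import Flask, jsonify

app = Flask(__name__)

API_VERSION = "1.0.0"  # placeholder; in practice read from package metadata or config

# Mounted outside the /v1 prefix on purpose, so the same endpoint survives
# future /v2, /v3, ... releases.
@app.route("/version")
def version():
    return jsonify({"name": "policy-api", "version": API_VERSION})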

Display the name of requester on the request detail page

As a team member, I want to be able to see which of my other team members created a request, so that I know who to talk to on my team if I have questions about the request.

Acceptance criteria:
The name of the requester should display on the "Details" tab of the request page, above the Output Checker name

Filter for viewing Approved requests for requester

As a requester,

I want to be able to see all my approved requests

so that I can reference information in old requests that may be relevant to new requests.


Test Case

ENV

Test environment

TESTCASE

Given requester has a request of status "Accepted"

EXPECTED

When requester views their requests
Then there should be a filter for Accepted requests

ACTUAL

There is no filter for Accepted requests. Instead, Accepted requests are displayed under "Flagged".

Approve / request revisions link is not displaying when viewing a request (OC interface)

User Story

As an output checker,

I want to approve / request revisions of requests I've claimed

so that I can document and inform the requester of my adjudication decision.


Test Case

ENV

TEST

TESTCASE

  1. Log into OCWA as a requester
  2. Create a new request and upload a "valid" file (one that doesn't trigger warning/blocking rules)
  3. Submit the request
  4. Log into OCWA as an output checker
  5. Claim the previously created request

EXPECTED

The options of approve or request revisions should be present

ACTUAL

No approve or request revisions links are present

Log in landing page

User Story

As a user,

I want to be redirected after logging in to where I was before

so that I don't have to find that page again.

Test Case

TESTCASE

Open a request in a regular browser, then copy and paste that URL into an incognito window and log in

EXPECTED

I land on the request page that I requested

ACTUAL

I land on the request list page

Output Checker Feedback and Adjustments

OCWA Issue

User Story

As an Output Checker,

I want some UI adjustments

so that I can use the application more effectively.

  • Sort the results by
  • Add status Icon to the request page
  • Hide actions when the request has been sent back to the requester
  • Include a back link in the single request page
  • Start with OC's assigned requests only
  • Search filter in the left hand nav bar
  • Switch signed in menu option to Signed In as [...]
  • Scrolling doesn't work in Firefox
  • Unassigned requests are shown as assigned to the user when they shouldn't be

Test Case

ENV

DEV, TEST, PROD

TESTCASE

EXPECTED

ACTUAL

ERROR

Notes

Integration tests need to be refactored

OCWA Issue

User Story

As a developer,

I want our integration tests to be more reusable and modular

so that the code has better quality and value.

Test Case

ENV

  • DEV
  • TEST
  • PROD

Notes

This issue encapsulates the work involved on refactoring and cleaning up the integration tests.

UI - Show validation message if any

OCWA Issue

User Story

As an exporter,

I want messages about validation

so that I know better why something is in the state it is.

Test Case

ENV

  • DEV
  • TEST
  • PROD

TESTCASE

Upload a file larger than 5 MB (the default) using a code base that has the validation queue (feature/working-limit). It will fail almost immediately because it is over the working limit, which gets added to the message of all the results.

The frontend doesn't show this field, so it's hard to know why the specific rule failed.

EXPECTED

ACTUAL

ERROR

Notes

Should not be able to submit a request with a blocked file

ENV

Dev

TESTCASE

  1. Create a request and upload a file that according to policies is blocked
  2. Save and close
  3. On the request page, click Submit

EXPECTED

Should not be allowed to submit (since there's a blocked file attached to the request)

ACTUAL

Submit succeeds

Helm - Add download ui

User Story

As an exporter,

I want to access the download UI

so that I can download approved files.

Exporter UI - Search doesn't search

User Story

As a user,

I want to search for requests

so that I can find my request easily.

Test Case

ENV

DEV, TEST

TESTCASE

  1. Log in to the export interface with more than 1 page of requests (Dev is perfect)
  2. Search for a request that is NOT on page 1 (bTest in Dev)
  3. See either no results or a few results
  4. Go to the next page and perform the same search, note different results

EXPECTED

I see all of the results regardless of page

ACTUAL

I only see the results from the page I am on

Note: I am not opposed to having a page filter, but we should also have a search functionality.

Add "Relationship to previous or future (planned) outputs" field

User Story

As an output checker,

I want to know if an output request has any relationship to previous or future (planned) outputs

so that I can consider this relationship when adjudicating the request.

ENV

  • DEV
  • TEST

Acceptance criteria

A new optional field exists on the request page for this field as per below:

Label: Relationship to previous or future (planned) outputs
Helper text: Describe any relationship to previous outputs. For example, a small adaptation of a previous output, pulled from the same or similar data, poses a risk of disclosure by differencing. This is especially for previously submitted tables within the same project, but could be, for example, other similar studies or projects based on the same sample of the population.

Download file giving a 502 error

OCWA Issue

User Story

As a researcher,

I want to download the files that I have requested from the SRE for export

so that I can do as I wish with them.

Test Case

ENV

STG

TESTCASE

  • Upload a file that has no file type extension
  • Go through the approval
  • From the download application, select the file for download

EXPECTED

  • File is downloaded

ACTUAL

  • 502 error

ERROR

_http_outgoing.js:464
    throw err;
    ^

TypeError [ERR_HTTP_INVALID_HEADER_VALUE]: Invalid value "undefined" for header "Content-Type"
    at ServerResponse.setHeader (_http_outgoing.js:473:3)
    at ServerResponse.setWriteHeadHeaders (/usr/src/app/node_modules/on-headers/index.js:82:19)
    at ServerResponse.writeHead (/usr/src/app/node_modules/on-headers/index.js:41:36)
    at IncomingMessage.stream.on (/usr/src/app/server/routes/files.js:53:9)
    at IncomingMessage.emit (events.js:194:15)
    at endReadableNT (_stream_readable.js:1103:12)
    at process._tickCallback (internal/process/next_tick.js:63:19)

Notes

Request Api - Email Notifications

User Story

As a user,

I want to be emailed when a request is updated

so that I know I have to log into OCWA.


Test Case

ENV

DEV, TEST

TESTCASE

Submit a request, then have an output checker pick it up

EXPECTED

I should receive an email

ACTUAL

No email is sent

Copy Changes

OCWA Issue

There are a few instances where the copy used is confusing, inconsistent, or both.

User Story

As an Exporter,

I want to easily understand the language used to describe components

so that I can use the application intuitively.

Test Case

ENV

DEV, TEST, PROD

TESTCASE

  • Change the request identifier column and request name field to "request title"
  • Add a filter on the dashboard for "My requests"
  • Add a column on the dashboard table for "requester"
  • Change "Exporter" label to be "Requester" on the individual request page

EXPECTED

  • All the above checklist items are corrected

ACTUAL

  • They are not

ERROR

Notes

Integration Tests should test on triggering branch Docker images

OCWA Issue

User Story

As a developer,

I want the integration tests to test the changes done to my work in progress branch

so that I get immediate feedback on whether I still pass the integration tests.

Test Case

ENV

  • DEV
  • TEST
  • PROD

TESTCASE

  • This is a Travis CI/CD issue. We should ideally be testing on the exact same Docker images that the Travis job was instantiated on.

EXPECTED

  • A breaking change to the UI should fail the integration tests

ACTUAL

  • A breaking change to the UI does not fail the integration tests until AFTER it has been merged into develop

ERROR

Notes

Validation error false positive when submitting a request too quickly

Test Case

ENV

DEV

TESTCASE

Steps to replicate:

  1. Create a new request
  2. Upload the "test_valid_file_upload.txt" in the ui_tests/test_files folder
  3. Once the submit button is enabled, wait 3 seconds
  4. Click the submit button

EXPECTED

Successful submission

ACTUAL

A validation error occurs. If you wait a few seconds and try to submit again, submission works.

Redux state.json file uploaded.

Missing filetype/mimetype description

OCWA Issue

User Story

As a researcher,

I want to be able to upload files and see the file/mime type displayed

so that I have feedback and knowledge of the file type that I just uploaded.

Test Case

ENV

  • DEV
  • TEST
  • PROD

TESTCASE

  • Login and create new request
  • Click Add files
  • Drag and drop the LICENSE file from the OCWA repository as a file to submit

EXPECTED

  • The file should successfully submit, with a filetype displayed as text/plain (or similar)
  • Subsequent screens should also display text/plain (or similar) for the uploaded file

ACTUAL

  • The filetype field is not populated.


ERROR

  • The following is a parse of the LICENSE file's info metadata stored by Minio. Note that filetype/mimetype is not present in the data structure.
    {"ID":"1b427c058b82f26c114863e468a0725d+6ed67e07-36ac-47bb-87d4-4a450e0a5dae","Size":11558,"SizeIsDeferred":false,"Offset":0,"MetaData":{"fileName":"LICENSE","jwt":"redacted","lastModified":"1548877418431","size":"11558"},"IsPartial":false,"IsFinal":false,"PartialUploads":null}

Notes

  • This is a followup to Issue #64 and dependent on PR #73

Multiple file drag and drop yields inconsistent results

OCWA Issue

User Story

As a researcher,

I want to be able to drag and drop multiple files at once into the file upload screen

so that I can upload all the selected files at once.

Test Case

ENV

  • DEV
  • TEST
  • PROD

TESTCASE

  • Login and create new request
  • Click Add files
  • Drag and drop multiple files to the drag and drop element

EXPECTED

  • After some reasonable time, all of the files should be successfully submitted and be listed on the window.
  • Furthermore, all files should also show in subsequent request pages.

ACTUAL

  • Only some of the files appear to be successfully submitted and listed on the window.
  • Not all files show in subsequent request pages.
  • During upload, there is a lot of element shuffling and other strange "jittery" behavior exhibited by the UI.

ERROR

Notes

  • This is reproducible on both Windows and macOS on both Firefox and Chrome

Deleting a file from the request does not work correctly

User Story

As an exporter,

I want to delete files from the request

so that they are not a part of the request.

Test Case

ENV

DEV

TESTCASE

  • Click New Request
  • Click Add Files
  • Upload a file (File A)
  • Click Save & Close
  • Navigate to the new request
  • Click Edit Request
  • Click Add Files
  • Remove File A by clicking the x button
  • Upload a different file (File B)

EXPECTED

  • Only File B should show up on the file list.

ACTUAL

  • Both File A and File B show up on the file list.

Refactor Request Sidebar Actions

OCWA Issue

User Story

As an Exporter,

I want to edit the state of a request more easily

so that I can work with the request.

Test Case

ENV

DEV, TEST, PROD

TESTCASE

  • The sidebar actions on the request page each represent a specific API action that a user can make with a request. For example the Withdraw Request button only withdraws a request, but that can be confusing to users who don't know all the possible states a request can have and how they affect its placement in the review process.

EXPECTED

  • Revise the submitted request sidebar to show Edit (which will automatically withdraw) and Cancel
  • Revise the Accepted/Cancelled/Refused request to only show a duplicate button which opens the dialog with the field pre-populated.

ACTUAL

  • Submitted request shows Withdraw and Cancel
  • Accepted/Cancelled/Refused request shows no actions

ERROR

Notes
