
beep's Introduction

Battery Evaluation and Early Prediction (BEEP)


BEEP is a set of tools designed to support Battery Evaluation and Early Prediction of cycle life corresponding to the research of the d3batt program and the Toyota Research Institute.

How to cite

If you use BEEP, please cite this article:

P. Herring, C. Balaji Gopal, M. Aykol, J.H. Montoya, A. Anapolsky, P.M. Attia, W. Gent, J.S. Hummelshøj, L. Hung, H.-K. Kwon, P. Moore, D. Schweigert, K.A. Severson, S. Suram, Z. Yang, R.D. Braatz, B.D. Storey, SoftwareX 11 (2020) 100506. https://doi.org/10.1016/j.softx.2020.100506


beep's Issues

[mat-2264] Write parser to ingest and process Battery Archive files

As an open source developer,

I would like to be able to read and structure data from the Battery Archive,

so that I can collaborate with other battery data projects.

Acceptance Criteria:

Structured data object containing summary and interpolated cycles
Graph showing capacity as a function of cycle number, plotted from the structured file.
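A minimal sketch of what the acceptance criteria could look like, assuming a Battery Archive-style CSV with "Cycle_Index" and "Discharge_Capacity (Ah)" columns (both column names are assumptions, not a confirmed Battery Archive schema):

import pandas as pd
import matplotlib.pyplot as plt

def plot_capacity_vs_cycle(csv_path):
    """Load a Battery Archive-style CSV and plot discharge capacity vs. cycle number."""
    df = pd.read_csv(csv_path)
    # One capacity value per cycle: the maximum discharge capacity reached in that cycle.
    per_cycle = df.groupby("Cycle_Index")["Discharge_Capacity (Ah)"].max()
    plt.plot(per_cycle.index, per_cycle.values)
    plt.xlabel("Cycle number")
    plt.ylabel("Discharge capacity (Ah)")
    plt.show()
    return per_cycle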

[Feature Request] train_test_split dataset based on replicate groups

The BeepDataset class has a method for generating train-test splits based on cell_IDs or unique sequence numbers. However, if there are replicate measurements for the same protocol, I would like to ensure that all replicates fall in the same group (train or test).

As a solution, include a method that first performs unique-parameter groupings based on the protocol parameter file, and then randomizes these groupings into train/test at a prescribed level. Ensure the fraction of train vs. test is maintained (even if some parameter groups have significantly more replicates than others).
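A sketch of the grouping idea using scikit-learn's GroupShuffleSplit, assuming a DataFrame dataset_df with a "protocol" column identifying each cell's parameter group (the column name is an assumption). Note that test_size here applies to the fraction of groups, so the sample-level train/test fraction can still drift if some groups have many more replicates than others:

from sklearn.model_selection import GroupShuffleSplit

def split_by_replicate_group(dataset_df, test_size=0.2, seed=42):
    splitter = GroupShuffleSplit(n_splits=1, test_size=test_size, random_state=seed)
    train_idx, test_idx = next(splitter.split(dataset_df, groups=dataset_df["protocol"]))
    # All rows sharing a protocol (i.e., replicates) land on the same side of the split.
    return dataset_df.iloc[train_idx], dataset_df.iloc[test_idx]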

[mat-2120] Remove need for environment variables

Problem

$ export BEEP_PROCESSING_DIR="/path/to/data-share"
$ collate

# runs ok

$ mv /path/to/data-share /path2/to/data-share
$ cd /path2/to/data-share
$ collate

# throws error because BEEP_PROCESSING_DIR still points to "/path/to/data-share"

As a user, I'd rather the scripts run within the cwd (or a directory that I specify) than rely on a central processing directory.

There is a similar problem for specifying the beep env. No default beep env is specified, so the user has one more step before actually running beep. The config does not seem to differ for most of the environments:

# beep.config file
config = {
    "local": {
        "logging": {"container": "Testing", "streams": ["file"]},
        "kinesis": {"stream": "local/beep/eventstream"},
    },
    "dev": {
        "logging": {"container": "Testing", "streams": ["file"]},
        "kinesis": {"stream": "local/beep/eventstream"},
    },
    "test": {
        "logging": {"container": "Testing", "streams": ["file"]},
        "kinesis": {"stream": "local/beep/eventstream"},
    },
    "stage": {
        "logging": {"container": "BEEP_EP", "streams": ["CloudWatch", "stdout"]},
        "kinesis": {"stream": "stage/beep/eventstream/stage"},
    },
    "prod": {},
}

Local, dev, and test are exactly the same, and it's not super clear which one the user should use. The env-specific configuration in beep.__init__ seems mostly specific to running things on AWS, which I am guessing most users will not use?

Solution

Remove need for BEEP_PROCESSING_DIR.

Make the default environment "dev"(?) without the need to explicitly set it. Have extra scripts and optional env vars to enable AWS-specific staging/production/etc. use of beep.
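A minimal sketch of the proposed behavior (the function name and override argument are assumptions): fall back to the current working directory when BEEP_PROCESSING_DIR is not set, and let an explicit argument override both.

import os

def get_processing_dir(cli_override=None):
    # Priority: explicit CLI/API argument > BEEP_PROCESSING_DIR > current working directory.
    if cli_override:
        return cli_override
    return os.environ.get("BEEP_PROCESSING_DIR", os.getcwd())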

Structuring with large (>~180MB) Arbin CSV files hangs

There are around 16 csv files in the Nature Energy dataset which "validate" but cannot be "structured". The structuring command simply hangs at this step forever:

(screenshot of the structuring step hanging)

This is the code I am running, which works fine on all of the other files from the dataset:

import json

from beep import structure, validate

# file_list: list of Arbin CSV paths from the dataset (defined elsewhere)
for file in file_list:
    print(file)
    mode = 'events_off'
    mapped = {
        "mode": mode,          # mode run|test|events_off
        "file_list": [file],   # list of file paths ['path/test1.csv', 'path/test2.csv']
        "run_list": [0],       # list of run_ids [0, 1]
    }
    mapped = json.dumps(mapped)

    # Validation
    validated = validate.validate_file_list_from_json(mapped)
    validated_output = json.loads(validated)
    validated_output['mode'] = mode  # mode run|test|events_off
    validated_output['run_list'] = list(range(len(validated_output['file_list'])))
    validated = json.dumps(validated_output)
    print('Validated:', validated)

    # Data structuring
    structured = structure.process_file_list_from_json(validated)
    structured_output = json.loads(structured)
    structured_output['mode'] = mode  # mode run|test|events_off
    structured_output['run_list'] = list(range(len(structured_output['file_list'])))
    structured = json.dumps(structured_output)
    print('Structured:', structured)

The files which have this problem are these:
(screenshot listing the affected files)

It appears to only be the large files that are affected.

I may debug this further next week, but wanted to post it here to see if this is a known issue or if there is a simple fix I am overlooking. Thanks!

BEEPDatapath diagnostic can be cleaned up and more tests added

1. Diagnostic info should be available as an instance attribute for easier debugging:

BEEPDatapath.diagnostic_available

It should be set upon structuring, or should throw an error if accessed before ever being set. This can be done with @StructuringDecorators.must_be_structured (see the sketch after this list).

2. The BEEPDatapath test suite does not have a dedicated test for diagnostic interpolation and summary; the only existing test uses a Maccor file and is restricted to BIG_FILE_TESTS=True.
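A sketch of point 1 (not beep's actual implementation): keep the diagnostic info in a private attribute set during structuring, and raise if it is read before structuring.

class BEEPDatapathSketch:
    def __init__(self):
        self._diagnostic_available = None  # populated by structure()/autostructure()

    @property
    def diagnostic_available(self):
        if self._diagnostic_available is None:
            raise RuntimeError(
                "diagnostic_available is only set during structuring; "
                "call structure() or autostructure() first."
            )
        return self._diagnostic_available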

Files generated by tests

The following files are generated by running tests:

beep/procedure_templates/diagnosticV2.json
beep/procedure_templates/diagnosticV3.json
beep/tests/test_files/test_LA4_waveform.MWF
beep/tests/test_files/test_US06_waveform.MWF
beep/validation_schemas/validation_records.json

Should .json and .MWF be added to .gitignore?

[Bug] Arbin _Metadata.csv file required even though no column from metadata is critical

Problem

Instantiating a RawCyclerRun from an Arbin file without the associated _Metadata.csv file (in this specific naming arrangement) results in a FileNotFoundError. There is, seemingly, no reason why this file should be required, as the metadata in this file will never fail anything in the pipeline if it is missing.

Solution

Remove the need for the _Metadata.csv file; this can be incorporated as part of #115.
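A sketch of the proposed fix (the helper name and the basename-replacement convention are assumptions): load the _Metadata.csv if it exists, otherwise fall back to empty metadata instead of raising FileNotFoundError.

import os
import pandas as pd

def load_arbin_metadata(data_csv_path):
    # The Arbin metadata file is assumed to live next to the data file as <name>_Metadata.csv.
    metadata_path = data_csv_path.replace(".csv", "_Metadata.csv")
    if os.path.exists(metadata_path):
        return pd.read_csv(metadata_path).to_dict("records")
    return []  # metadata is optional; nothing downstream strictly requires it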

Many of BEEP's structuring features still require BEEP_PROCESSING_DIR to be correctly set

BEEPDatapath.determine_structuring_parameters and all dependent methods (e.g., autostructure) depend on the parameters lookup being correct.

At minimum, a warning should be thrown if no parameters directory is found. Otherwise there are very uninformative errors like:

Traceback (most recent call last):
  File "/beep/beep/structure/profile_memory.py", line 36, in <module>
    diag_summary = md.summarize_diagnostic(diagnostic_available)
  File "/beep/beep/structure/base.py", line 965, in summarize_diagnostic
    i for i in diagnostic_available["diagnostic_starts_at"] if i <= max_cycle
TypeError: 'bool' object is not subscriptable

These errors are due only to BEEP_PROCESSING_DIR not containing parameters files.

Alternatively, the default can be to use the parameters files already in beep, if they would apply outside of TRI's files. @patrickherring-TRI would the parameters files in test_files apply to other users' data? It seems like they are machine-specific...
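A sketch of the minimum fix suggested above (the directory layout shown is an assumption): warn explicitly when no parameters directory is found rather than silently returning diagnostic_available = False.

import os
import warnings

def check_parameters_dir(processing_dir):
    params_dir = os.path.join(processing_dir, "data-share", "raw", "parameters")
    if not os.path.isdir(params_dir):
        warnings.warn(
            f"No protocol parameters directory found at {params_dir}; "
            "determine_structuring_parameters/autostructure will not be able "
            "to detect diagnostic cycles."
        )
    return params_dir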

Tests need upgrade/completion

  • Tests should be organized on a module-by-module basis; they are currently about halfway there
  • Many test files bloat the repo and are never used
  • Test files are not separated by module
  • Some modules need more testing:
    • structure.validation: tests were disabled to accommodate changes in validation from #107 and no longer apply to the integration of validation with structuring
    • CLI needs an end-to-end test on a minimal set of small files
    • CLI needs a few more test cases for structure, featurize, train, predict
    • CLI inspect test needs to be debugged on CI, although it passes locally
    • auto_load and auto_load_processed need to run through at least one of each file type, plus a legacy file for auto_load_processed
  • Re-enable Windows tests?

Automatically serialize BEEPDatapaths to .json.gz, not json

New BEEPDatapaths are a bit bigger than older ProcessedCyclerRuns since they include complete data, including the raw data.

Running TestMaccorDatapath.test_get_diagnostic and checking the resulting file:

processed_diagnostic.json       249.3MB

Compressing it

processed_diagnostic.json.gz    18.6MB

Compressing and decompressing even large files is computationally trivial compared with interpolation. It seems like compressing these saved files should surely be the default.
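A sketch of the proposed default using only the standard library; the dict being serialized is whatever BEEPDatapath already produces (e.g., via as_dict(), which is assumed here).

import gzip
import json

def dump_json_gz(obj_dict, path):
    # "wt" opens a text-mode gzip stream that json.dump can write to directly.
    with gzip.open(path, "wt", encoding="utf-8") as f:
        json.dump(obj_dict, f)

def load_json_gz(path):
    with gzip.open(path, "rt", encoding="utf-8") as f:
        return json.load(f)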

determine_structuring_parameters/autostructure clunky to use correctly

Written assuming #150 is merged.

determine_structuring_parameters (which also runs autostructure now) should work correctly using only the data given to the class inside __init__/from_file.

i.e.,


# don't need to set env var for it to run correctly
# should just run

datapath = BEEPDatapath.from_file("some_file")
params = datapath.determine_structuring_parameters()
datapath.structure(*params)

Or, identically,

datapath = BEEPDatapath.from_file("some_file")
datapath.autostructure()

Currently, using determine_structuring_parameters requires the raw filename and the filename of the parameters file, which changes based on BEEP_PROCESSING_DIR. Forgetting to set this will cause silent errors where diagnostic_available is mistakenly returned as False, leading to diagnostics not being interpolated or summarized even when the diagnostic actually is available.

MaccorDatapath needs specific logic for eis as/from_dict

Currently, MaccorDatapath (and any other future EIS-capable datapaths) won't serialize EIS data correctly.

BEEPDatapathWithEIS should provide extra serialization capabilities for saving/loading the standard attributes (.paths["eis"], .eis) from all of its child classes.
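A sketch (not beep's actual implementation) of what the extra serialization hooks could look like; EISPlaceholder stands in for whatever class actually holds the spectra, and the host class is assumed to provide .eis and a .paths dict.

class EISPlaceholder:
    """Stand-in for the real EIS container class."""

    def __init__(self, data):
        self.data = data

    def as_dict(self):
        return {"data": self.data}

    @classmethod
    def from_dict(cls, d):
        return cls(d["data"])


class EISSerializationMixin:
    """Round-trips EIS data alongside the base datapath serialization."""

    def eis_to_dict(self):
        return {
            "eis": self.eis.as_dict() if getattr(self, "eis", None) else None,
            "eis_path": self.paths.get("eis"),
        }

    def eis_from_dict(self, d):
        self.eis = EISPlaceholder.from_dict(d["eis"]) if d.get("eis") else None
        self.paths["eis"] = d.get("eis_path")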

autostructure needs dedicated tests

Written assuming #150 is merged.

Currently autostructure has no dedicated unit test; now that it is its own dedicated function (no longer to/from_processed_cycler_run), it needs more tests.

[mat-2116] Rework CLI using Click, remove JSON requirements for use

As a new user, the usage of console scripts is made more difficult by:

  1. Using the console scripts requires going back and forth between the README and the command line; e.g., console scripts do not follow a standard "--help" interface:

cmd:

$: collate --help

out (no cli help shown):

0it [00:00, ?it/s]
{"fid": [], "protocol": [], "channel_no": [], "date": [], "strname": [], "filename": [], "file_list": []}(beepenv) 
  2. Current commands require JSON strings as input rather than canonical CLI arguments.

  3. The default command names may also interfere with other common console scripts (anything named validate in the user's path, for example).

  4. The current command line interface relies on setting various env variables and assuming an underlying directory structure; this may be improved if console scripts also had a --processing-path or similar option.


A framework like Click can help simplify and improve the scripts, for example:

$ beep collate --help
Usage: beep collate [OPTIONS]

  Beep collate documentation blah blah blah...

Options:
  --filename PATH      Path to the input file
  --output-dir PATH    Path to the output/processing directory

I've used Click before and it is a great framework for engineering CLI scripts.
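A sketch of the Click layout suggested above (command and option names are illustrative, not beep's final CLI):

import click

@click.group()
def beep_cli():
    """BEEP command line interface."""

@beep_cli.command()
@click.option("--filename", type=click.Path(exists=True), help="Path to the input file.")
@click.option("--output-dir", type=click.Path(), default=".", help="Output/processing directory.")
def collate(filename, output_dir):
    """Collate raw cycler files."""
    click.echo(f"Collating {filename} into {output_dir}")

if __name__ == "__main__":
    beep_cli()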

Possibly incorrect cell selection to match Nature Energy paper

👋 Hi!

The discarded cells listed here:

discards = {"batches": ["2017-05-12", "2017-06-30", "2018-04-12"],
look like they were designed to match the cells listed here: https://github.com/rdbraatz/data-driven-prediction-of-battery-cycle-life-before-capacity-degradation/blob/master/Load%20Data.ipynb. However, I don't think the cell numbers in the second link refer to channel numbers, as their use in the first link implies. The notes for each of the batches here https://data.matr.io/1/projects/5c48dd2bc625d700019f3204 describe a different set of cells being left out of the Nature Energy paper due to various issues. Table 9 in the supplementary info of the paper has barcodes consistent with the cells referenced in the third link, and is inconsistent with the cells "discarded" in the first link.

Please do let me know if I am misunderstanding something. I'm trying to get the "cleanest" dataset possible from the data collected in the Nature Energy paper. My current understanding is that this requires removing the cells listed in the third link above.

[Feature Request] Documentation on How to Contribute to BEEP Codebase

Is your feature request related to a problem? Please describe.
We currently do not have standard procedures for submitting, reviewing and integrating PRs into the codebase.

Describe the solution you'd like
Add a new document on contributing to the beep codebase that includes:

A checklist that users should go through before submitting PRs for review
A template for PRs that add new features to beep
Training documentation on unit tests: 1) what tests should be included and 2) procedures for adding unit tests in PRs (e.g., uploading test files, etc.)

[Bug] Pandas Indexing Deprecated

There seems to be an issue with the from_run method in the DeltaQFastCharge class in the featurize.py file. I ran into this error when taking the 2017-05-12_5_4C-40per_3_6C_CH18 data, editing it into my own file called "b1_CH18.csv", and running it through the code below.

(screenshots of the code used and the resulting traceback)

I believe that there is more likely than not an issue with my CSV file; however, it does point out an error when handling a newer version of pandas. I am currently running pandas version 1.1.5. It appears that some of the cycle indices get dropped when I call dropna(), but when there are missing cycles, it returns an error about not being able to index into the dataframe with missing cycles. It looks like the current method being used is deprecated and has been replaced by .reindex() or index.intersection().

https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#indexing-deprecate-loc-reindex-listlike
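For reference, a sketch of the non-deprecated pattern the pandas docs recommend when some of the requested cycle indices are missing from the dataframe:

import pandas as pd

df = pd.DataFrame({"capacity": [1.05, 1.04, 1.02]}, index=[1, 2, 4])  # cycle 3 missing
wanted_cycles = [1, 2, 3, 4]

# Old (deprecated): df.loc[wanted_cycles] raises a KeyError for the missing label.
present = df.index.intersection(wanted_cycles)  # keep only cycles that exist
subset = df.loc[present]

# Or keep the full set of labels, filling the gaps with NaN:
subset_with_gaps = df.reindex(wanted_cycles)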

Just wanted to let you all know of this.

Thanks!

[Bug] AWS specific code should be relocated to another repo/package

Problem

Most users will not be using AWS to run beep (or if I am wrong here, please correct me). The AWS code is mixed in with beep: between the various required configs, extra tests, and extra submodules, there is a clear conceptual separation between beep and AWS deployment, but no such separation is reflected in the code; it's all mixed together.

Solution

Move the AWS-specific code to another repo (e.g., a private deployment repo). Make specific beep versions requirements of the deployment repo.

[Feature Request]

As a developer,

I need to know what columns are necessary for subsequent processing steps, especially structuring,

so that I can convert and rename the right columns and determine if I am missing any requirements.

Acceptance Criteria:

List of columns needed for structuring, with their contents, units, data types, and standardized names
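An illustrative (not authoritative) shape for the requested list, expressed as a dict so it can double as documentation and a validation spec; the column names, units, and dtypes below are assumptions for the sketch only.

STRUCTURING_COLUMNS = {
    "cycle_index":        {"dtype": "int32",   "unit": "count", "desc": "Cycle number"},
    "step_index":         {"dtype": "int16",   "unit": "count", "desc": "Step within the cycle"},
    "test_time":          {"dtype": "float64", "unit": "s",     "desc": "Elapsed test time"},
    "voltage":            {"dtype": "float32", "unit": "V",     "desc": "Cell voltage"},
    "current":            {"dtype": "float32", "unit": "A",     "desc": "Cell current"},
    "charge_capacity":    {"dtype": "float64", "unit": "Ah",    "desc": "Cumulative charge capacity"},
    "discharge_capacity": {"dtype": "float64", "unit": "Ah",    "desc": "Cumulative discharge capacity"},
}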

[mat-2110] Unify validate and structuring into from_cycler pipelines

Problem

There is no canonical list of keys required for each step in the pipeline, particularly structuring and validation. This is particularly difficult to do because there are so many different data formats for cyclers. This is a problem because it is unclear what kind of data allows beep to do what.

For example, the best-defined schema is for Arbin files (arbin_conversion.yaml). While #108 somewhat lists the columns required for structuring, when you go to featurize them it's hard to know what you need down the line. This inconsistency is reflected in arguments like eis to common classes like PCR, since (seemingly?) only Maccor files can contain this data. It's also pretty hard to figure out what the validation requirements are. RCR and PCR are treated as a one-size-fits-all solution for all the different cyclers, making downstream debugging more complex.

In short, as a developer, following/debugging the pipeline can be somewhat complex because of the diversity of data + cycler capabilities. The pipeline is also fairly brittle with respect to adding new cyclers and new kinds of data.

Quick n' dirty solution

There must be sets of internal column keys required for

  • all tasks, regardless of cycler or available metadata (e.g., interpolation-required columns seem common among all PCRs)
  • special tasks which work among more than 1 cycler
  • special tasks which only work with one kind of cycler (EIS_spectrum + Maccor seems to be an example of this)

The schema for conversion and validation can be updated to a standard format to clarify this.

More involved solution

The quick n' dirty solution is subject to the same issues as the current implementation, just less so.

One comprehensive fix would be a fairly major refactor (in addition to the quick n' dirty solution).

A base class BeepPipeline can contain methods required for all cyclers for structuring, validating, and featurizing. It can also enforce subclasses to implement required methods via abstractmethods.

Each cycler type would inherit BeepPipeline:

class ArbinPipeline(BeepPipeline):

These subclasses would implement required from_file methods taking care of the specific mappings to go from file columns to internal columns defined in the schema. They can also implement specific cases common to a subset of cyclers:

class IndigoPipeline(BeepPipeline, UsesDateTime):

They can also implement structuring or validation procedures unique to a particular cycler:

class MaccorPipeline(BeepPipeline):
    ...
    def structure_waveform(...):
        ...
    def eis_spectrum_from_file(...):
        ...

Data at each step of the pipeline can be gathered from their attributes:

mp = MaccorPipeline.from_file("some_pipeline.file")            # make a pipeline from a file
mp.structure()                                                 # run structuring
mp.validate()                                                  # validate required data
mp.raw_data                                                    # analogous to RawCyclerRun
mp.structured_data                                             # analogous to ProcessedCyclerRun
mp.structure_data_from_eis_file("eis.file")                    # incorporate EIS data into pipeline, not applicable to other cyclers
mp.featurize(common_features=True, specific_features=True)     # features can include common dq features and others specific to this cycler which may not be available to others

New cyclers or data formats can be added according to the guidelines set up in BeepFeaturizer, giving a clear path to writing custom processing pipelines. This might also solve some maintainability issues

[mat-3071] Fix featurization edge case

From @patrickherring-TRI

2021-10-12 21:06:20 INFO Featurizing 1 files
2021-10-12 21:06:20 INFO Applying 1 featurizers to each of 1 files
2021-10-12 21:06:20 DEBUG Hashing file '/home/ec2-user/SageMaker/efs-readonly/structure/PreDiag_000358_000165_structure.json' to MD5 
2021-10-12 21:06:23 DEBUG File 1 of 1: Loading processed run '/home/ec2-user/SageMaker/efs-readonly/structure/PreDiag_000358_000165_structure.json'. 
2021-10-12 21:06:39 DEBUG File 1 of 1: Loaded processed run '/home/ec2-user/SageMaker/efs-readonly/structure/PreDiag_000358_000165_structure.json' into memory. 
2021-10-12 21:06:39 INFO File 1 of 1: Featurizer IntracellFeatures valid with params {'diagnostic_cycle_type': 'rpt_0.2C', 'step_type': 0, 'anode_file': '/home/ec2-user/anaconda3/envs/beep-newcli/lib/python3.8/site-packages/beep/protocol_parameters/intracell_info/anode_test.csv', 'cathode_file': '/home/ec2-user/anaconda3/envs/beep-newcli/lib/python3.8/site-packages/beep/protocol_parameters/intracell_info/cathode_test.csv'} for '/home/ec2-user/SageMaker/efs-readonly/structure/PreDiag_000358_000165_structure.json' 

/home/ec2-user/anaconda3/envs/beep-newcli/lib/python3.8/site-packages/scipy/optimize/_numdiff.py:519: RuntimeWarning: invalid value encountered in true_divide
  J_transposed[i] = df / dx

2021-10-12 21:06:52 INFO File 1 of 1: Featurizer IntracellFeatures applied with params {'diagnostic_cycle_type': 'rpt_0.2C', 'step_type': 0, 'anode_file': '/home/ec2-user/anaconda3/envs/beep-newcli/lib/python3.8/site-packages/beep/protocol_parameters/intracell_info/anode_test.csv', 'cathode_file': '/home/ec2-user/anaconda3/envs/beep-newcli/lib/python3.8/site-packages/beep/protocol_parameters/intracell_info/cathode_test.csv'} for '/home/ec2-user/SageMaker/efs-readonly/structure/PreDiag_000358_000165_structure.json' 
2021-10-12 21:06:52 INFO File 1 of 1: Featurizer IntracellFeatures features for '/home/ec2-user/SageMaker/efs-readonly/structure/PreDiag_000358_000165_structure.json' written to '/home/ec2-user/SageMaker/notebooks/TRI/temp/IntracellFeatures-PreDiag_000358_000165_structure.json' 
2021-10-12 21:06:52 DEBUG Hashing sub-operation output file '/home/ec2-user/SageMaker/notebooks/TRI/temp/IntracellFeatures-PreDiag_000358_000165_structure.json' to MD5
CRITICAL:root:Feature matrix could not be created: 'BEEPFeatureMatrixError'! 
2021-10-12 21:06:52 INFO Featurization report:
2021-10-12 21:06:52 INFO All 1 featurizers succeeded: 1/1
2021-10-12 21:06:52 INFO - /home/ec2-user/SageMaker/efs-readonly/structure/PreDiag_000358_000165_structure.json
2021-10-12 21:06:52 INFO No featurizers succeeded or file failed: 0/1
2021-10-12 21:06:52 INFO Featurization matrix created: False

@patrickherring-TRI could you attach the status.json from this run, as well as the command used to run it?

I am guessing that because only 1 file was being featurized, something went wrong with the feature matrix creation (it was designed to have multiple featurizers applied)...

auto_load_processed should be able to load a file processed anywhere

Is your feature request related to a problem? Please describe.
When reloading a file using the auto_load_processed function, the function fails if it cannot find the validation schema file. This frequently occurs when the data is processed in the data pipeline and then opened on another machine: the reloading function goes looking for the file location of the validation schema and cannot find it.

Describe the solution you'd like
The auto_load_processed function should work in any environment. One possible solution is to change the schema attribute of the datapath class to be a dictionary containing the actual validation values instead of a file path.

Describe alternatives you've considered
Another option is to force the function to use only the validation schemas in the beep package (located at VALIDATION_SCHEMA_DIR), so that the path to those files is always available.

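A sketch of the proposed fix (the function name is an assumption): resolve the validation schema to a plain dict at construction time, so the serialized datapath never depends on a machine-specific file path.

from monty.serialization import loadfn

def resolve_schema(schema):
    """Accept either a schema dict or a path to a schema .yaml/.json file."""
    if isinstance(schema, dict):
        return schema
    return loadfn(schema)  # loads the schema file into a dict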

[mat-2532] Improve structuring compute speed and memory usage with dask/modin/swifter

Interpolation can be slow if the number of cycles is high, and pandas interpolate only uses 1 core (AFAIK). Dask (or other dataframe parallelization libraries) can improve interpolation speeds (e.g., by 10x or more).

modin: https://modin.readthedocs.io/en/latest/
dask: https://docs.dask.org/en/latest/
swifter: https://github.com/jmcarpenter2/swifter (automatically decides whether to use 1 core via pandas or all of them via dask/modin)

an example: https://stackoverflow.com/questions/45545110/make-pandas-dataframe-apply-use-all-cores
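A sketch of cycle-parallel interpolation with dask.delayed (not beep's current code); interpolate_cycle stands in for whatever per-cycle interpolation beep performs, and "cycle_index" is assumed to be the grouping column.

import dask
import pandas as pd

def interpolate_all_cycles(raw_df, interpolate_cycle):
    # Each cycle is interpolated independently, so the work parallelizes cleanly.
    tasks = [
        dask.delayed(interpolate_cycle)(cycle_df)
        for _, cycle_df in raw_df.groupby("cycle_index")
    ]
    interpolated = dask.compute(*tasks)  # uses all cores with the default scheduler
    return pd.concat(interpolated, ignore_index=True)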

[Feature Request] Is having a central data_share unwieldy?

Problem

Not sure if this is an actual problem, since users of beep may just have one set of files they are working with, but when you are working with multiple input files, keeping track of what is going on with each file with the CLI + data-share directory is complicated...

Solution

Simply read input and write output files in the working directory according to whatever names the user wants (or sensible defaults in the cwd if no args are passed); this should be possible with #107.

Using featurization can be confusing and has a lot of boilerplate

  1. The featurization module has a lot of boilerplate, especially in validate. The amount of code could probably be reduced by 50% or more.
  2. The various from_run/from_processed_cycler_run/features_from_processed_cycler_run methods make it very confusing to figure out how to actually instantiate a featurizer.

I'd naively suggest using matminer.featurizers.base.BaseFeaturizer as a base class to more consistently handle applying featurization.
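A sketch of what that could look like with matminer's BaseFeaturizer; the feature itself and the get_interpolated_dq accessor are placeholders, only the interface is the point.

from matminer.featurizers.base import BaseFeaturizer

class DeltaQVarianceFeaturizer(BaseFeaturizer):
    """Variance of Q(V)_cycle_100 - Q(V)_cycle_10 from a structured datapath (illustrative)."""

    def featurize(self, datapath):
        dq = datapath.get_interpolated_dq(100, 10)  # placeholder accessor
        return [dq.var()]

    def feature_labels(self):
        return ["var_delta_Q_100_10"]

    def citations(self):
        return []

    def implementors(self):
        return ["beep developers"]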

[mat-2111] Move tutorial and add references in the documentation

There is a nice short tutorial that might help new users get up and running, but it's not mentioned anywhere in the documentation?

Todos:

  • Mention (or embed) the tutorial in the documentation based on what @patrickherring-TRI prefers
  • Add test for the notebook/embedded code (for jupyter, this link shows how to auto-test; for embedded, the code should be loaded from the doc src md, then skipped if no docs downloaded (e.g., pip install)).

[mat-2490] Unittests can be organized by module

It is still difficult, even after #150, to find where certain classes are being tested, because structuring tests are not organized by module. They can be separated by module, e.g., test_arbin.

TestStructuringUtils

[Question]

I would like to know the dependent variable (y) used in the ML model. Is it the Cycle_Index for all the cycler data types?

BEEPLinearModelExperiment could be expanded?

Close if no longer relevant or unimportant

  • Model space could include logistic regression: "will cycle life exceed N cycles"? (see the sketch below)
  • 95% confidence intervals were removed with #107 as they were incompatible; they need to be re-added
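A sketch of the "will cycle life exceed N cycles" classification idea with scikit-learn; the feature matrix X and cycle_life vector are assumed to come from an existing BeepDataset.

import numpy as np
from sklearn.linear_model import LogisticRegression

def fit_cycle_life_classifier(X, cycle_life, threshold=550):
    y = (np.asarray(cycle_life) > threshold).astype(int)  # 1 = exceeds N cycles
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X, y)
    return clf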

[Feature Request] Dedicated documentation pages

Problem

Having all the info in the README can be nice for maintainability, but the README is already pretty long and getting longer. Dedicated documentation would be more user-friendly.

Solution

mkdocs is a nice-looking and simple solution.

Test errors: ResourceNotFoundException when calling GetSecretValue

Still getting some failures when trying to run the tests. The full message is long but most of them seem to be something to do with botocore.errorfactory.ResourceNotFoundException: An error occurred (ResourceNotFoundException) when calling the GetSecretValue operation: Secrets Manager can't find the specified secret. I guess this just requires some more setup on my end?
For example:

======================================================================
ERROR: test_validation_from_json (test_validate.SimpleValidatorTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/vsulzer/Documents/Energy_storage/BEEP/beep/tests/test_validate.py", line 310, in test_validation_from_json
    json_output = validate_file_list_from_json(json_string)
  File "/Users/vsulzer/Documents/Energy_storage/BEEP/beep/validate.py", line 525, in validate_file_list_from_json
    events = KinesisEvents(service='DataValidator', mode=file_list_data['mode'])
  File "/Users/vsulzer/Documents/Energy_storage/BEEP/beep/utils/events.py", line 85, in __init__
    self.stream = get_secret(config['test']['kinesis']['stream'])['streamName']
  File "/Users/vsulzer/Documents/Energy_storage/BEEP/beep/utils/secrets_manager.py", line 58, in get_secret
    raise e
  File "/Users/vsulzer/Documents/Energy_storage/BEEP/beep/utils/secrets_manager.py", line 55, in get_secret
    SecretId=secret_name
  File "/Users/vsulzer/Documents/Energy_storage/BEEP/venv/lib/python3.7/site-packages/botocore/client.py", line 316, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/Users/vsulzer/Documents/Energy_storage/BEEP/venv/lib/python3.7/site-packages/botocore/client.py", line 626, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.errorfactory.ResourceNotFoundException: An error occurred (ResourceNotFoundException) when calling the GetSecretValue operation: Secrets Manager can't find the specified secret.
-------------------- >> begin captured logging << --------------------
botocore.hooks: DEBUG: Event choose-service-name: calling handler <function handle_service_name_alias at 0x10cadc830>
botocore.hooks: DEBUG: Event creating-client-class.kinesis: calling handler <function add_generate_presigned_url at 0x10caab440>
botocore.endpoint: DEBUG: Setting kinesis timeout as (60, 60)
botocore.client: DEBUG: Registering retry handlers for service: kinesis
botocore.hooks: DEBUG: Event before-parameter-build.kinesis.ListStreams: calling handler <function generate_idempotent_uuid at 0x10cb0ae60>
botocore.hooks: DEBUG: Event before-call.kinesis.ListStreams: calling handler <function inject_api_version_header_if_needed at 0x10cb14950>
botocore.endpoint: DEBUG: Making request for OperationModel(name=ListStreams) with params: {'url_path': '/', 'query_string': '', 'method': 'POST', 'headers': {'X-Amz-Target': 'Kinesis_20131202.ListStreams', 'Content-Type': 'application/x-amz-json-1.1', 'User-Agent': 'Boto3/1.12.4 Python/3.7.7 Darwin/19.4.0 Botocore/1.15.4'}, 'body': b'{}', 'url': 'https://kinesis.us-east-2.amazonaws.com/', 'context': {'client_region': 'us-east-2', 'client_config': <botocore.config.Config object at 0x1339d0390>, 'has_streaming_input': False, 'auth_type': None}}
botocore.hooks: DEBUG: Event request-created.kinesis.ListStreams: calling handler <bound method RequestSigner.handler of <botocore.signers.RequestSigner object at 0x134fac2d0>>
botocore.hooks: DEBUG: Event choose-signer.kinesis.ListStreams: calling handler <function set_operation_specific_signer at 0x10cb0a3b0>
botocore.auth: DEBUG: Calculating signature using v4 auth.
botocore.auth: DEBUG: CanonicalRequest:
POST
/

content-type:application/x-amz-json-1.1
host:kinesis.us-east-2.amazonaws.com
x-amz-date:20200529T021947Z
x-amz-target:Kinesis_20131202.ListStreams

content-type;host;x-amz-date;x-amz-target
44136fa355b3678a1146ad16f7e8649e94fb4fc21fe77e8310c060f61caaff8a
botocore.auth: DEBUG: StringToSign:
AWS4-HMAC-SHA256
20200529T021947Z
20200529/us-east-2/kinesis/aws4_request
db0c0ae3d6514dac87aea893ff644b5978b783934e34ffc9616249386632476d
botocore.auth: DEBUG: Signature:
51df6effb1693f2ca460689eecec3e466b8728ae8296aeef277ab12248d78a0d
botocore.endpoint: DEBUG: Sending http request: <AWSPreparedRequest stream_output=False, method=POST, url=https://kinesis.us-east-2.amazonaws.com/, headers={'X-Amz-Target': b'Kinesis_20131202.ListStreams', 'Content-Type': b'application/x-amz-json-1.1', 'User-Agent': b'Boto3/1.12.4 Python/3.7.7 Darwin/19.4.0 Botocore/1.15.4', 'X-Amz-Date': b'20200529T021947Z', 'Authorization': b'AWS4-HMAC-SHA256 Credential=AKIARMITENQXCCNA6LEU/20200529/us-east-2/kinesis/aws4_request, SignedHeaders=content-type;host;x-amz-date;x-amz-target, Signature=51df6effb1693f2ca460689eecec3e466b8728ae8296aeef277ab12248d78a0d', 'Content-Length': '2'}>
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): kinesis.us-east-2.amazonaws.com:443
urllib3.connectionpool: DEBUG: https://kinesis.us-east-2.amazonaws.com:443 "POST / HTTP/1.1" 200 41
botocore.parsers: DEBUG: Response headers: {'x-amzn-RequestId': 'd4aa0c7b-1731-2b4b-83e0-7ff85b8b95c3', 'x-amz-id-2': 'jXz88lGS883ZkBfAxFwr9lr1DCM0l+Vot9wO6ySevnlp4+sAYV2zyWMiJoD9SwFjCSWDsZVSKgMeVqoYaruvPA6xiExvZIBf', 'Date': 'Fri, 29 May 2020 02:19:47 GMT', 'Content-Type': 'application/x-amz-json-1.1', 'Content-Length': '41'}
botocore.parsers: DEBUG: Response body:
b'{"HasMoreStreams":false,"StreamNames":[]}'
botocore.hooks: DEBUG: Event needs-retry.kinesis.ListStreams: calling handler <botocore.retryhandler.RetryHandler object at 0x1342d64d0>
botocore.retryhandler: DEBUG: No retry needed.
botocore.hooks: DEBUG: Changing event name from creating-client-class.iot-data to creating-client-class.iot-data-plane
botocore.hooks: DEBUG: Changing event name from before-call.apigateway to before-call.api-gateway
botocore.hooks: DEBUG: Changing event name from request-created.machinelearning.Predict to request-created.machine-learning.Predict
botocore.hooks: DEBUG: Changing event name from before-parameter-build.autoscaling.CreateLaunchConfiguration to before-parameter-build.auto-scaling.CreateLaunchConfiguration
botocore.hooks: DEBUG: Changing event name from before-parameter-build.route53 to before-parameter-build.route-53
botocore.hooks: DEBUG: Changing event name from request-created.cloudsearchdomain.Search to request-created.cloudsearch-domain.Search
botocore.hooks: DEBUG: Changing event name from docs.*.autoscaling.CreateLaunchConfiguration.complete-section to docs.*.auto-scaling.CreateLaunchConfiguration.complete-section
botocore.hooks: DEBUG: Changing event name from before-parameter-build.logs.CreateExportTask to before-parameter-build.cloudwatch-logs.CreateExportTask
botocore.hooks: DEBUG: Changing event name from docs.*.logs.CreateExportTask.complete-section to docs.*.cloudwatch-logs.CreateExportTask.complete-section
botocore.hooks: DEBUG: Changing event name from before-parameter-build.cloudsearchdomain.Search to before-parameter-build.cloudsearch-domain.Search
botocore.hooks: DEBUG: Changing event name from docs.*.cloudsearchdomain.Search.complete-section to docs.*.cloudsearch-domain.Search.complete-section
botocore.credentials: DEBUG: Looking for credentials via: env
botocore.credentials: DEBUG: Looking for credentials via: assume-role
botocore.credentials: DEBUG: Looking for credentials via: assume-role-with-web-identity
botocore.credentials: DEBUG: Looking for credentials via: shared-credentials-file
botocore.credentials: INFO: Found credentials in shared credentials file: ~/.aws/credentials
botocore.loaders: DEBUG: Loading JSON file: /Users/vsulzer/Documents/Energy_storage/BEEP/venv/lib/python3.7/site-packages/botocore/data/endpoints.json
botocore.hooks: DEBUG: Event choose-service-name: calling handler <function handle_service_name_alias at 0x10cadc830>
botocore.loaders: DEBUG: Loading JSON file: /Users/vsulzer/Documents/Energy_storage/BEEP/venv/lib/python3.7/site-packages/botocore/data/secretsmanager/2017-10-17/service-2.json
botocore.loaders: DEBUG: Loading JSON file: /Users/vsulzer/Documents/Energy_storage/BEEP/venv/lib/python3.7/site-packages/botocore/data/secretsmanager/2017-10-17/service-2.sdk-extras.json
botocore.hooks: DEBUG: Event creating-client-class.secrets-manager: calling handler <function add_generate_presigned_url at 0x10caab440>
botocore.endpoint: DEBUG: Setting secretsmanager timeout as (60, 60)
botocore.loaders: DEBUG: Loading JSON file: /Users/vsulzer/Documents/Energy_storage/BEEP/venv/lib/python3.7/site-packages/botocore/data/_retry.json
botocore.client: DEBUG: Registering retry handlers for service: secretsmanager
botocore.hooks: DEBUG: Event before-parameter-build.secrets-manager.GetSecretValue: calling handler <function generate_idempotent_uuid at 0x10cb0ae60>
botocore.hooks: DEBUG: Event before-call.secrets-manager.GetSecretValue: calling handler <function inject_api_version_header_if_needed at 0x10cb14950>
botocore.endpoint: DEBUG: Making request for OperationModel(name=GetSecretValue) with params: {'url_path': '/', 'query_string': '', 'method': 'POST', 'headers': {'X-Amz-Target': 'secretsmanager.GetSecretValue', 'Content-Type': 'application/x-amz-json-1.1', 'User-Agent': 'Boto3/1.12.4 Python/3.7.7 Darwin/19.4.0 Botocore/1.15.4'}, 'body': b'{"SecretId": "local/beep/eventstream"}', 'url': 'https://secretsmanager.us-west-2.amazonaws.com/', 'context': {'client_region': 'us-west-2', 'client_config': <botocore.config.Config object at 0x137d998d0>, 'has_streaming_input': False, 'auth_type': None}}
botocore.hooks: DEBUG: Event request-created.secrets-manager.GetSecretValue: calling handler <bound method RequestSigner.handler of <botocore.signers.RequestSigner object at 0x137d99910>>
botocore.hooks: DEBUG: Event choose-signer.secrets-manager.GetSecretValue: calling handler <function set_operation_specific_signer at 0x10cb0a3b0>
botocore.auth: DEBUG: Calculating signature using v4 auth.
botocore.auth: DEBUG: CanonicalRequest:
POST
/

content-type:application/x-amz-json-1.1
host:secretsmanager.us-west-2.amazonaws.com
x-amz-date:20200529T021947Z
x-amz-target:secretsmanager.GetSecretValue

content-type;host;x-amz-date;x-amz-target
c5763faee03d0d8a3d28e22207a85ed8cd6db3208b6b5007b772485712bb1914
botocore.auth: DEBUG: StringToSign:
AWS4-HMAC-SHA256
20200529T021947Z
20200529/us-west-2/secretsmanager/aws4_request
2e573c8cbe106354edba15afb9f8e6f95c82fd15bcd1de3e1ae7053b715c0ddf
botocore.auth: DEBUG: Signature:
6406017d0901e10497ce8b385a2067d77a12c4cf11650f812fcbeae9a250c6f6
botocore.endpoint: DEBUG: Sending http request: <AWSPreparedRequest stream_output=False, method=POST, url=https://secretsmanager.us-west-2.amazonaws.com/, headers={'X-Amz-Target': b'secretsmanager.GetSecretValue', 'Content-Type': b'application/x-amz-json-1.1', 'User-Agent': b'Boto3/1.12.4 Python/3.7.7 Darwin/19.4.0 Botocore/1.15.4', 'X-Amz-Date': b'20200529T021947Z', 'Authorization': b'AWS4-HMAC-SHA256 Credential=AKIARMITENQXCCNA6LEU/20200529/us-west-2/secretsmanager/aws4_request, SignedHeaders=content-type;host;x-amz-date;x-amz-target, Signature=6406017d0901e10497ce8b385a2067d77a12c4cf11650f812fcbeae9a250c6f6', 'Content-Length': '38'}>
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): secretsmanager.us-west-2.amazonaws.com:443
urllib3.connectionpool: DEBUG: https://secretsmanager.us-west-2.amazonaws.com:443 "POST / HTTP/1.1" 400 99
botocore.parsers: DEBUG: Response headers: {'Date': 'Fri, 29 May 2020 02:19:48 GMT', 'Content-Type': 'application/x-amz-json-1.1', 'Content-Length': '99', 'Connection': 'keep-alive', 'x-amzn-RequestId': '005c351b-a362-45c1-aedf-0da0d24515c7'}
botocore.parsers: DEBUG: Response body:
b'{"__type":"ResourceNotFoundException","Message":"Secrets Manager can\'t find the specified secret."}'
botocore.hooks: DEBUG: Event needs-retry.secrets-manager.GetSecretValue: calling handler <botocore.retryhandler.RetryHandler object at 0x137d99d50>
botocore.retryhandler: DEBUG: No retry needed.
--------------------- >> end captured logging << ---------------------

Error in `test_end_to_end`

Still getting one error when running the tests:

======================================================================
ERROR: Console command for end to end test, run by passing the output of
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/vsulzer/Documents/Energy_storage/BEEP/beep/tests/test_end_to_end.py", line 125, in test_console
    validation_output = json.loads(validation_output)
  File "/usr/local/opt/python/Frameworks/Python.framework/Versions/3.7/lib/python3.7/json/__init__.py", line 348, in loads
    return _default_decoder.decode(s)
  File "/usr/local/opt/python/Frameworks/Python.framework/Versions/3.7/lib/python3.7/json/decoder.py", line 340, in decode
    raise JSONDecodeError("Extra data", s, end)
json.decoder.JSONDecodeError: Extra data: line 2 column 1 (char 157)
-------------------- >> begin captured logging << --------------------
matplotlib: DEBUG: CONFIGDIR=/Users/vsulzer/.matplotlib
matplotlib: DEBUG: (private) matplotlib data path: /Users/vsulzer/Documents/Energy_storage/BEEP/venv/lib/python3.7/site-packages/matplotlib/mpl-data
matplotlib: DEBUG: matplotlib data path: /Users/vsulzer/Documents/Energy_storage/BEEP/venv/lib/python3.7/site-packages/matplotlib/mpl-data
matplotlib: DEBUG: loaded rc file /Users/vsulzer/Documents/Energy_storage/BEEP/venv/lib/python3.7/site-packages/matplotlib/mpl-data/matplotlibrc
matplotlib: DEBUG: matplotlib version 3.2.1
matplotlib: DEBUG: interactive is False
matplotlib: DEBUG: platform is darwin
matplotlib: DEBUG: loaded modules: [... long list of imported module names truncated ...]
'scipy.spatial._plotutils', 'scipy.spatial._procrustes', 'scipy.spatial.distance', 'scipy.spatial._distance_wrap', 'scipy.spatial._hausdorff', 'scipy.spatial.transform', 'scipy.spatial.transform.rotation', 'scipy.spatial.transform._rotation_groups', 'scipy.constants', 'scipy.constants.codata', 'scipy.constants.constants', 'scipy.spatial.transform._rotation_spline', 'scipy.optimize._shgo_lib', 'scipy.optimize._shgo_lib.sobol_seq', 'scipy.optimize._shgo_lib.triangulation', 'scipy.optimize._dual_annealing', 'scipy.integrate._ivp', 'scipy.integrate._ivp.ivp', 'scipy.integrate._ivp.bdf', 'scipy.integrate._ivp.common', 'scipy.integrate._ivp.base', 'scipy.integrate._ivp.radau', 'scipy.integrate._ivp.rk', 'scipy.integrate._ivp.dop853_coefficients', 'scipy.integrate._ivp.lsoda', 'scipy.integrate._quad_vec', 'beep.featurize', 'scipy.stats', 'scipy.stats.stats', 'scipy.ndimage', 'scipy.ndimage.filters', 'scipy.ndimage._ni_support', 'scipy.ndimage._nd_image', 'scipy.ndimage._ni_docstrings', 'scipy._lib.doccer', 'scipy.ndimage.fourier', 'scipy.ndimage.interpolation', 'scipy.ndimage.measurements', 'scipy.ndimage._ni_label', '_ni_label', 'scipy.ndimage.morphology', 'scipy.stats.distributions', 'scipy.stats._distn_infrastructure', 'scipy.stats._distr_params', 'scipy.misc', 'scipy.misc.doccer', 'scipy.misc.common', 'scipy.stats._constants', 'scipy.stats._continuous_distns', 'scipy.interpolate', 'scipy.interpolate.interpolate', 'scipy.interpolate.fitpack', 'scipy.interpolate._fitpack_impl', 'scipy.interpolate._fitpack', 'scipy.interpolate.dfitpack', 'scipy.interpolate._bsplines', 'scipy.interpolate._bspl', 'scipy.interpolate.polyint', 'scipy.interpolate._ppoly', 'scipy.interpolate.fitpack2', 'scipy.interpolate.interpnd', 'scipy.interpolate.rbf', 'scipy.interpolate._cubic', 'scipy.interpolate.ndgriddata', 'scipy.interpolate._pade', 'scipy.stats._stats', 'scipy.stats._tukeylambda_stats', 'scipy.stats._discrete_distns', 'scipy.stats.mstats_basic', 'scipy.stats._stats_mstats_common', 'scipy.stats._rvs_sampling', 'scipy.stats._hypotests', 'scipy.stats.morestats', 'scipy.stats.statlib', 'scipy.stats.contingency', 'scipy.stats._binned_statistic', 'scipy.stats.kde', 'scipy.stats.mvn', 'scipy.stats.mstats', 'scipy.stats.mstats_extras', 'scipy.stats._multivariate', 'beep.helpers', 'beep.helpers.featurizer_helpers', 'matplotlib', 'matplotlib.cbook', 'matplotlib.cbook.deprecation', 'matplotlib.rcsetup', 'matplotlib.fontconfig_pattern', 'pyparsing', 'matplotlib.colors', 'matplotlib.docstring', 'matplotlib._color_data', 'cycler', 'matplotlib._version', 'matplotlib.ft2font', 'kiwisolver']
matplotlib: DEBUG: CACHEDIR=/Users/vsulzer/.matplotlib
matplotlib.font_manager: DEBUG: Using fontManager instance from /Users/vsulzer/.matplotlib/fontlist-v310.json
matplotlib.pyplot: DEBUG: Loaded backend macosx version unknown.
matplotlib.pyplot: DEBUG: Loaded backend MacOSX version unknown.
botocore.hooks: DEBUG: Changing event name from creating-client-class.iot-data to creating-client-class.iot-data-plane
botocore.hooks: DEBUG: Changing event name from before-call.apigateway to before-call.api-gateway
botocore.hooks: DEBUG: Changing event name from request-created.machinelearning.Predict to request-created.machine-learning.Predict
botocore.hooks: DEBUG: Changing event name from before-parameter-build.autoscaling.CreateLaunchConfiguration to before-parameter-build.auto-scaling.CreateLaunchConfiguration
botocore.hooks: DEBUG: Changing event name from before-parameter-build.route53 to before-parameter-build.route-53
botocore.hooks: DEBUG: Changing event name from request-created.cloudsearchdomain.Search to request-created.cloudsearch-domain.Search
botocore.hooks: DEBUG: Changing event name from docs.*.autoscaling.CreateLaunchConfiguration.complete-section to docs.*.auto-scaling.CreateLaunchConfiguration.complete-section
botocore.hooks: DEBUG: Changing event name from before-parameter-build.logs.CreateExportTask to before-parameter-build.cloudwatch-logs.CreateExportTask
botocore.hooks: DEBUG: Changing event name from docs.*.logs.CreateExportTask.complete-section to docs.*.cloudwatch-logs.CreateExportTask.complete-section
botocore.hooks: DEBUG: Changing event name from before-parameter-build.cloudsearchdomain.Search to before-parameter-build.cloudsearch-domain.Search
botocore.hooks: DEBUG: Changing event name from docs.*.cloudsearchdomain.Search.complete-section to docs.*.cloudsearch-domain.Search.complete-section
botocore.credentials: DEBUG: Looking for credentials via: env
botocore.credentials: DEBUG: Looking for credentials via: assume-role
botocore.credentials: DEBUG: Looking for credentials via: assume-role-with-web-identity
botocore.credentials: DEBUG: Looking for credentials via: shared-credentials-file
botocore.credentials: INFO: Found credentials in shared credentials file: ~/.aws/credentials
botocore.loaders: DEBUG: Loading JSON file: /Users/vsulzer/Documents/Energy_storage/BEEP/venv/lib/python3.7/site-packages/botocore/data/endpoints.json
botocore.hooks: DEBUG: Event choose-service-name: calling handler <function handle_service_name_alias at 0x1152c1830>
botocore.loaders: DEBUG: Loading JSON file: /Users/vsulzer/Documents/Energy_storage/BEEP/venv/lib/python3.7/site-packages/botocore/data/secretsmanager/2017-10-17/service-2.json
botocore.loaders: DEBUG: Loading JSON file: /Users/vsulzer/Documents/Energy_storage/BEEP/venv/lib/python3.7/site-packages/botocore/data/secretsmanager/2017-10-17/service-2.sdk-extras.json
botocore.hooks: DEBUG: Event creating-client-class.secrets-manager: calling handler <function add_generate_presigned_url at 0x115290440>
botocore.endpoint: DEBUG: Setting secretsmanager timeout as (60, 60)
botocore.loaders: DEBUG: Loading JSON file: /Users/vsulzer/Documents/Energy_storage/BEEP/venv/lib/python3.7/site-packages/botocore/data/_retry.json
botocore.client: DEBUG: Registering retry handlers for service: secretsmanager
botocore.hooks: DEBUG: Event before-parameter-build.secrets-manager.GetSecretValue: calling handler <function generate_idempotent_uuid at 0x1152efe60>
botocore.hooks: DEBUG: Event before-call.secrets-manager.GetSecretValue: calling handler <function inject_api_version_header_if_needed at 0x1152f9950>
botocore.endpoint: DEBUG: Making request for OperationModel(name=GetSecretValue) with params: {'url_path': '/', 'query_string': '', 'method': 'POST', 'headers': {'X-Amz-Target': 'secretsmanager.GetSecretValue', 'Content-Type': 'application/x-amz-json-1.1', 'User-Agent': 'Boto3/1.12.4 Python/3.7.7 Darwin/19.4.0 Botocore/1.15.4'}, 'body': b'{"SecretId": "local/beep/eventstream"}', 'url': 'https://secretsmanager.us-west-2.amazonaws.com/', 'context': {'client_region': 'us-west-2', 'client_config': <botocore.config.Config object at 0x135458350>, 'has_streaming_input': False, 'auth_type': None}}
botocore.hooks: DEBUG: Event request-created.secrets-manager.GetSecretValue: calling handler <bound method RequestSigner.handler of <botocore.signers.RequestSigner object at 0x135458290>>
botocore.hooks: DEBUG: Event choose-signer.secrets-manager.GetSecretValue: calling handler <function set_operation_specific_signer at 0x1152ef3b0>
botocore.auth: DEBUG: Calculating signature using v4 auth.
botocore.auth: DEBUG: CanonicalRequest:
POST
/

content-type:application/x-amz-json-1.1
host:secretsmanager.us-west-2.amazonaws.com
x-amz-date:20200623T154901Z
x-amz-target:secretsmanager.GetSecretValue

content-type;host;x-amz-date;x-amz-target
c5763faee03d0d8a3d28e22207a85ed8cd6db3208b6b5007b772485712bb1914
botocore.auth: DEBUG: StringToSign:
AWS4-HMAC-SHA256
20200623T154901Z
20200623/us-west-2/secretsmanager/aws4_request
c0607df32cb6f0993f5b159449b41be0ca05cab4bdc6a0e7991661013a182f5c
botocore.auth: DEBUG: Signature:
d76381b43ed46eff828a6725db014b3cf86df319068d260ea55f1d931606bd70
botocore.endpoint: DEBUG: Sending http request: <AWSPreparedRequest stream_output=False, method=POST, url=https://secretsmanager.us-west-2.amazonaws.com/, headers={'X-Amz-Target': b'secretsmanager.GetSecretValue', 'Content-Type': b'application/x-amz-json-1.1', 'User-Agent': b'Boto3/1.12.4 Python/3.7.7 Darwin/19.4.0 Botocore/1.15.4', 'X-Amz-Date': b'20200623T154901Z', 'Authorization': b'AWS4-HMAC-SHA256 Credential=AKIARMITENQXCCNA6LEU/20200623/us-west-2/secretsmanager/aws4_request, SignedHeaders=content-type;host;x-amz-date;x-amz-target, Signature=d76381b43ed46eff828a6725db014b3cf86df319068d260ea55f1d931606bd70', 'Content-Length': '38'}>
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): secretsmanager.us-west-2.amazonaws.com:443
urllib3.connectionpool: DEBUG: https://secretsmanager.us-west-2.amazonaws.com:443 "POST / HTTP/1.1" 400 99
botocore.parsers: DEBUG: Response headers: {'Date': 'Tue, 23 Jun 2020 15:49:01 GMT', 'Content-Type': 'application/x-amz-json-1.1', 'Content-Length': '99', 'Connection': 'keep-alive', 'x-amzn-RequestId': '396b3d66-e033-4877-8801-56431b17b02f'}
botocore.parsers: DEBUG: Response body:
b'{"__type":"ResourceNotFoundException","Message":"Secrets Manager can\'t find the specified secret."}'
botocore.hooks: DEBUG: Event needs-retry.secrets-manager.GetSecretValue: calling handler <botocore.retryhandler.RetryHandler object at 0x135458490>
botocore.retryhandler: DEBUG: No retry needed.
--------------------- >> end captured logging << ---------------------

[Feature Request] Optional loading of `raw_data` attribute

Problem statement
Loading the structured files used to be relatively fast, since the files contained only the interpolated dataframes and were much smaller. Having the raw data available is very convenient, but it slows down loading and dramatically increases the amount of memory required.

Solution suggestion
Making the loading of the raw data optional should solve the problem: unless a flag is set, the attribute would remain empty (when loading via the auto_load_processed method). However, it is also desirable to keep the to/from file methods simple and transparent, so if a lot of additional complexity is required this might not be the best solution.
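
A minimal sketch of how the existing auto_load_processed helper could take such a flag. The include_raw name, and the assumption that the structured file is JSON with a top-level raw_data key, are illustrative only, not the current BEEP API:

import json

def auto_load_processed(path, include_raw=False):
    # Sketch only: read a structured run from disk and, unless the caller
    # opts in, drop the large raw_data payload so the attribute stays empty
    # and memory use stays low.
    with open(path) as f:
        doc = json.load(f)
    if not include_raw:
        doc.pop("raw_data", None)
    return doc

# usage: structured = auto_load_processed("path/to/structure.json")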

Alternative solutions
Another solution could be to remove the raw data attribute, or to turn it into a loading method that looks up the original raw file. This is less desirable, since it reintroduces the problem of locating the raw data that was used to create the structured file.

Warnings about missing files when running tests

I get a lot of warnings about missing files when running tests. For example, see the captured output below from running CollateTest. Feel free to close this if it's expected behavior.
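
If this does turn out to be expected behavior, a possible workaround (purely a sketch, not something the test suite currently does) is to filter these specific UserWarnings before the collation tests run, using only the standard library; the captured output itself follows.

import warnings

# Ignore only the collate warnings about missing *_Metadata.csv companion files;
# the message pattern and module name match the output pasted in this issue.
warnings.filterwarnings(
    "ignore",
    message="Failed to parse protocol for .*",
    category=UserWarning,
    module="beep.collate",
)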

/Users/vsulzer/Documents/Energy_storage/BEEP/beep/collate.py:104: UserWarning: Failed to parse protocol for 2018-04-12_batch8_CH5_Metadata: [Errno 2] File /Users/vsulzer/Documents/Energy_storage/BEEP/beep/tests/test_files/2018-04-12_batch8_CH5_Metadata_Metadata.csv does not exist: '/Users/vsulzer/Documents/Energy_storage/BEEP/beep/tests/test_files/2018-04-12_batch8_CH5_Metadata_Metadata.csv'
  warnings.warn("Failed to parse protocol for {}: {}".format(filename, e))
/Users/vsulzer/Documents/Energy_storage/BEEP/beep/collate.py:104: UserWarning: Failed to parse protocol for 2018-04-12_batch8_CH2_Metadata: [Errno 2] File /Users/vsulzer/Documents/Energy_storage/BEEP/beep/tests/test_files/2018-04-12_batch8_CH2_Metadata_Metadata.csv does not exist: '/Users/vsulzer/Documents/Energy_storage/BEEP/beep/tests/test_files/2018-04-12_batch8_CH2_Metadata_Metadata.csv'
  warnings.warn("Failed to parse protocol for {}: {}".format(filename, e))
/Users/vsulzer/Documents/Energy_storage/BEEP/beep/collate.py:104: UserWarning: Failed to parse protocol for 2018-04-12_batch8_CH38: [Errno 2] File /Users/vsulzer/Documents/Energy_storage/BEEP/beep/tests/test_files/2018-04-12_batch8_CH38_Metadata.csv does not exist: '/Users/vsulzer/Documents/Energy_storage/BEEP/beep/tests/test_files/2018-04-12_batch8_CH38_Metadata.csv'
  warnings.warn("Failed to parse protocol for {}: {}".format(filename, e))
[... similar UserWarnings are repeated for every remaining channel file (CH1 through CH48) and its _Metadata counterpart, for example:]
/Users/vsulzer/Documents/Energy_storage/BEEP/beep/collate.py:104: UserWarning: Failed to parse protocol for 2018-04-12_batch8_CH35: [Errno 2] File /Users/vsulzer/Documents/Energy_storage/BEEP/beep/tests/test_files/2018-04-12_batch8_CH35_Metadata.csv does not exist: '/Users/vsulzer/Documents/Energy_storage/BEEP/beep/tests/test_files/2018-04-12_batch8_CH35_Metadata.csv'
  warnings.warn("Failed to parse protocol for {}: {}".format(filename, e))
  0%|                                                                                                                                     | 0/96 [00:00<?, ?it/s]/Users/vsulzer/Documents/Energy_storage/BEEP/venv/lib/python3.7/site-packages/nose/suite.py:225: PerformanceWarning: indexing past lexsort depth may impact performance.
  test(orig)
100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 96/96 [00:00<00:00, 174.69it/s]
./Users/vsulzer/Documents/Energy_storage/BEEP/beep/collate.py:104: UserWarning: Failed to parse protocol for 2019-03-06-per_CH1234: list index out of range
  warnings.warn("Failed to parse protocol for {}: {}".format(filename, e))

step_is_waveform is not general, only corresponds to maccor

step_is_waveform only works with Maccor files.

It can be implemented on a datapath-by-datapath basis, so that everything specific to Maccor waveforms goes in MaccorDatapath, everything specific to Arbin waveforms goes in ArbinDatapath, etc., e.g. as a step_is_waveform method on each class.

The default in the BEEPDatapath ABC can be to evaluate to False, with a log message stating that waveform detection is not implemented for that specific cycler.

Right now, waveform steps evaluate to False even when a waveform is present (e.g., an Arbin waveform), and there is no logging or other indication that the step actually is a waveform; it just returns False. A minimal sketch of the proposed default-plus-override is shown below.
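
A minimal, standalone sketch of the idea (classes are re-declared here just for illustration, and the Maccor criterion with its step_type column is a placeholder — only the default-to-False-with-logging behavior is the point):

import logging
from abc import ABC

logger = logging.getLogger(__name__)

class BEEPDatapath(ABC):
    def step_is_waveform(self, step_df):
        # Default: waveform detection is not implemented for this cycler.
        # Log it so a waveform step is never silently treated as non-waveform.
        logger.warning(
            "Waveform detection is not implemented for %s; "
            "step treated as non-waveform.",
            type(self).__name__,
        )
        return False

class MaccorDatapath(BEEPDatapath):
    def step_is_waveform(self, step_df):
        # Hypothetical Maccor-specific criterion (placeholder column name).
        return bool((step_df["step_type"] == "waveform").any())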

validation/conversion_schemas consolidated inside BEEPDatapath children, may not need to be separate module?

Written with the intention that #150 is merged.

Currently, validation schemas and cycler raw-file specifications are defined as constants loaded from the files in conversion_schemas. These are only used in validate and structuring, and all of the code from validate can be moved into structuring, since validation is essentially a pre-step to structuring and each cycler type will have its own validation procedure.

The idea is that everything specific to a cycler will go in that cycler's structuring file, e.g., all Maccor config goes in maccor.py.

I am suggesting just moving these files and constants into their respective BEEPDatapath children, along with their validation procedures; all cycler-specific routines would then be contained within that structuring file.

For example, ARBIN_CONFIG can easily be migrated to a class attribute of ArbinDatapath, and ValidatorBeep.validate_arbin_dataframe can easily be migrated to ArbinDatapath.validate. All Arbin-specific structuring info would then be entirely contained within ArbinDatapath, e.g. something like the sketch below.
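
A rough sketch of the shape this could take (the import path, the schema contents, and the validation check are placeholders; it also assumes the datapath exposes its raw dataframe as raw_data):

from beep.structure.base import BEEPDatapath  # import path assumed

class ArbinDatapath(BEEPDatapath):
    # Formerly a module-level constant loaded from conversion_schemas/;
    # kept here as a class attribute so all Arbin-specific configuration
    # lives alongside the Arbin structuring code.
    ARBIN_CONFIG = {
        "data_columns": ["cycle_index", "voltage", "current"],  # placeholder subset
    }

    def validate(self):
        # Formerly ValidatorBeep.validate_arbin_dataframe; placeholder check
        # that the expected columns are present in the raw dataframe.
        return set(self.ARBIN_CONFIG["data_columns"]).issubset(self.raw_data.columns)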

Docs need completion

Places to improve docstrings

  • Protocol CLI
  • Protocol module
  • Featurizer core module
  • Validation routines

Online documentation

  • quickstart needs docs on BEEPFeaturizer/BEEPFeatureMatrix/BEEPLinearModelExperiment
  • python tutorial docs needs in depth sections on BEEPFeaturizer/BEEPFeatureMatrix/BEEPLinearModelExperiment
  • Protocol CLI needs docs

[Bug] Dependencies have no authoritative source

It can be helpful to have one authoritative source for dependencies.

Problem

For example, requirements.txt does not list ruamel.yaml==0.16.5 while setup.py's install_requires does. This is a minor case, but if other requirements differ significantly, users might run into discrepancies between installing via pip install beep (a user who wants to use the package as written) and via pip install -e . -r requirements.txt (a developer, or someone who edits the code). This also enables the situation where CI passes on all platforms but users installing the code fail tests due to different dependency versions.

Also, since other requirements (e.g., extras_require) are not listed anywhere, users wanting those extra features (e.g., running the tests comprehensively) have no single place to find the packages they need.

Solution

Use requirements-*.txt files as the authoritative source for dependencies. setup.py can read these files easily with pip's built-in tools:

import os

from pip._internal.req import parse_requirements
from pip._internal.network.session import PipSession
from setuptools import setup

this_dir = os.path.dirname(os.path.abspath(__file__))
pip_requirements = parse_requirements(os.path.join(this_dir, "requirements.txt"), PipSession())
reqs = [pii.requirement for pii in pip_requirements]

setup(name="beep",
      url="https://github.com/TRI-AMDD/beep",
...
      install_requires=reqs,
...
Dependencies for CI tests or for running the code in production can be similarly enumerated in requirements-ci.txt, requirements-aws.txt, etc.; these can be loaded in a modular fashion into the setup.py script, for example as extras_require (sketched below).
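
Continuing the snippet above, optional dependency sets could be loaded the same way (the "ci"/"aws" extras names and file names are just examples, not existing files):

extras_reqs = {}
for extra in ("ci", "aws"):
    fname = os.path.join(this_dir, "requirements-{}.txt".format(extra))
    if os.path.exists(fname):
        extras_reqs[extra] = [
            pii.requirement for pii in parse_requirements(fname, PipSession())
        ]

# Passed to setup() as extras_require=extras_reqs, so that e.g.
# `pip install beep[ci]` would pull in the CI-only dependencies.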

Logging can be more comprehensive

For small operations, and when individual files fail within a batch operation, very little relevant information is printed to the console or otherwise surfaced.

For example, when batch processing a set of runs where one fails, the only error returned appears in the processed JSON output and simply says "insufficient data". We could instead capture the stack trace and return that in the JSON.

Another example: A warning can be logged if determine_structuring_parameters does not find a diagnostic. If someone is expecting their files to have a diagnostic, and beep isn't structuring them correctly, this is useful information.

Additionally, we can set the logging level for smaller operations (e.g., reporting the structuring parameters once they are determined) to DEBUG or INFO, and set batch-operation logging (whether a file succeeded or failed) to a higher level (WARNING or ERROR); these messages can then be filtered easily by the logging module.

The end result would be logging that is both complete (high resolution) and not overwhelming (easily filtered).
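
A minimal, self-contained sketch of this split (structure_file and the file names are hypothetical; only the level assignment and filtering are the point):

import logging
import traceback

logger = logging.getLogger("beep")
logging.basicConfig(level=logging.WARNING)  # hide DEBUG/INFO chatter, keep batch-level messages

def structure_file(path):
    # Hypothetical per-file routine that fails for one input.
    if "bad" in path:
        raise ValueError("insufficient data")

for path in ["run_ok.json", "run_bad.json"]:
    try:
        logger.debug("Determining structuring parameters for %s", path)  # filtered out
        structure_file(path)
        logger.info("Structured %s successfully", path)                  # filtered out
    except Exception:
        # Batch-level failure: keep the full stack trace, not just "insufficient data".
        logger.error("Structuring failed for %s:\n%s", path, traceback.format_exc())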

Remove dead code, cleanup

Dead code:

  • old validation tests
  • pretty much everything in scripts unless it's critical
  • some files in utils
  • Various mentions of BEEP_PROCESSING_DIR, along with leftover dev environment configuration (including in CI files)

Cleanup

  • a lot of stuff in protocol modules
