
jetstream's Introduction

Jetstream

Jetstream is a wrapper project around Troposphere to create and maintain CloudFormation templates for AWS. It uses the awesome Troposphere project to let us write CloudFormation templates in Python, and adds testing on top.

Features that are available with Jetstream and Troposphere:

  • Uses Python to build CloudFormation templates (Troposphere)
  • Uses Troposphere native Validations (Troposphere)
  • Builds out full tests only on changed templates and dependencies (Jetstream)
  • Is able to publish to S3 and to a Local File System (Jetstream)

Setup

To install Jetstream from git, clone the repository and pip install it:

git clone git@github.com:rackerlabs/jetstream
cd jetstream && pip install -e .

Building

To build templates from a Python package of templates, run jetstream -m <python_package>.

This will put the resulting templates in a directory named artifacts in your CWD.

jetstream -m 'my_templates_package'

If you would like the templates to be tested before being generated, pass the -t flag:

jetstream -t

Templates

Every template starts out as a new object that inherits from the JetstreamTemplate class.

Example S3 Template

import time

from jetstream.template import JetstreamTemplate, TestParameters
from troposphere import Join, Output, Parameter, Ref, Tags, Template
from troposphere.s3 import Bucket

class S3Bucket(JetstreamTemplate):
    '''S3 Bucket Template'''
    def __init__(self):
        self.name = 's3_bucket.template'
        self.template = Template()
        self.test_params = TestParameters()

        self.template.add_version('2010-09-09')

        environment = self.template.add_parameter(Parameter(
            'Environment',
            Default='Development',
            Type='String',
            Description='Application environment',
            AllowedValues=['Development', 'Integration', 'PreProduction',
                           'Production', 'Staging', 'Test']
        ))

        bucket_name = self.template.add_parameter(Parameter(
            'BucketName',
            Type='String',
            Description='S3 Bucket name'
        ))

        s3_bucket = self.template.add_resource(Bucket(
            'S3Bucket',
            Tags=Tags(Environment=Ref(environment)),
            BucketName=Ref(bucket_name)
        ))

        self.template.add_output(Output(
            'Arn',
            Value=Join('', ['arn:aws:s3:::', Ref(s3_bucket)])
        ))

Testing

Templates let you control what gets set during creation of a test stack via the test_params attribute.

For example, to create a unique bucket name for a test of an S3 bucket:

bucket_name = self.template.add_parameter(Parameter(
    'BucketName',
    Type='String',
    Description='S3 Bucket name'
))
test_bucket_name = "test-bucket-{}".format(int(time.time()))
self.test_params.add(TestParameter('BucketName', test_bucket_name))

Test Parameters also allow you to specify an output from a different stack by providing a template as the source.

from s3_bucket import S3Bucket

s3_bucket = self.template.add_parameter(Parameter(
    'S3Bucket',
    Type='String',
    Description='S3 Bucket Arn'
))
self.test_params.add(TestParameter('S3Bucket', 'Arn', S3Bucket()))

Jetstream can handle multi-level template dependencies, so the following is a legal dependency chain. Circular dependencies, however, will not work and will cause the test to fail during CloudFormation creation.

A depends on B, which depends on C, which depends on D; B also depends on D. A sketch of wiring this up follows the diagram.

A --> B --> C
      |     |
      |     v
      +---> D
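
To make the chain concrete, here is a hedged sketch of how B might declare its dependencies on C and D through TestParameter sources. The module paths and class names are hypothetical; C and D are assumed to be JetstreamTemplate subclasses that each expose an Arn output:

from jetstream.template import JetstreamTemplate, TestParameter, TestParameters
from troposphere import Template

# Hypothetical modules containing the C and D templates
from my_templates_package.c_template import C
from my_templates_package.d_template import D

class B(JetstreamTemplate):
    '''B consumes outputs of both C and D, so Jetstream tests those first.'''
    def __init__(self):
        self.name = 'b.template'
        self.template = Template()
        self.test_params = TestParameters()
        # Each TestParameter names a parameter of this template, an output
        # of the source template, and the source template instance itself.
        self.test_params.add(TestParameter('CArn', 'Arn', C()))
        self.test_params.add(TestParameter('DArn', 'Arn', D()))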

If you extend the JetstreamTemplate class, you can implement the following methods to hook into Jetstream's behavior from your subclass (see the sketch after this list):

  • def prepare_document(self): runs before template documentation is generated
  • def prepare_generate(self): runs before the template is generated
  • def prepare_test(self): runs before the template is tested
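
For example, a minimal sketch of a subclass wiring in these hooks (the per-run bucket name is just an illustration):

import time

from jetstream.template import JetstreamTemplate, TestParameter, TestParameters
from troposphere import Template

class HookedTemplate(JetstreamTemplate):
    '''Illustrates the Jetstream lifecycle hooks.'''
    def __init__(self):
        self.name = 'hooked.template'
        self.template = Template()
        self.test_params = TestParameters()

    def prepare_document(self):
        # Runs before documentation is generated
        pass

    def prepare_generate(self):
        # Runs before the template is generated
        pass

    def prepare_test(self):
        # Runs before a test stack is created; a good place for
        # unique, per-run test parameter values
        self.test_params.add(TestParameter(
            'BucketName', 'test-bucket-{}'.format(int(time.time()))))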


jetstream's Issues

InvalidLocationConstraint when testing in us-east-1

Currently, when AWS_DEFAULT_REGION is explicitly set to us-east-1, attempts to run a Jetstream test deployment result in the following error:

botocore.exceptions.ClientError: An error occurred (InvalidLocationConstraint) when calling the CreateBucket operation: The specified location-constraint is not valid

It appears related to boto/boto3#125
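
For reference, a minimal sketch of the usual boto3 workaround, assuming the bucket is created directly with the S3 client (us-east-1 must not be passed as a LocationConstraint):

import boto3

def create_bucket(name, region):
    '''Create an S3 bucket, handling the us-east-1 special case.'''
    client = boto3.client('s3', region_name=region)
    if region == 'us-east-1':
        # us-east-1 rejects an explicit LocationConstraint
        client.create_bucket(Bucket=name)
    else:
        client.create_bucket(
            Bucket=name,
            CreateBucketConfiguration={'LocationConstraint': region})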

Cannot update docs if JSON doesn't update

It seems that, if a template is up to date, there's no way to build the docs using jetstream -d. It refuses to create any markdown files, even though none exist.

Python 3 support

$ jetstream
Traceback (most recent call last):
  File "/Users/nikolay/Projects/test/venv/bin/jetstream", line 7, in <module>
    from jetstream.cli import main
  File "/Users/nikolay/Projects/test/venv/lib/python3.6/site-packages/jetstream/cli.py", line 22, in <module>
    from jetstream import publisher, template, testing
  File "/Users/nikolay/Projects/test/venv/lib/python3.6/site-packages/jetstream/template.py", line 41
    raise RuntimeError, message, tb
                      ^
SyntaxError: invalid syntax

Force Test

Per @martinb3

We probably need to do a full rebuild if we add a template to __init__.py that previously didn't exist. I'm adding templates to the array in the package (via __init__.py) and they're not being tested, probably because they were pre-built in master while not defined in the package as templates.

Alternately, we should skip building templates that aren't in __init__.py, which would also fix the lack of initial test.

Implementing version tags or releases

We ran into an issue when #79 was merged, as troposphere 2.3.1 has a bug in the EC2 CreditSpecification property that caused our templates to fail to build. We were able to pin to the previous commit of jetstream to get back to a lower troposphere version. It did however raise the idea of version tags.

It would be really nice to have versions or releases for this repo, to make pinning to the desired point a little easier and prettier.

Update linting to python 3

Do we need to start doing the style/lint checks against python 3?
Should we use python 3.6 since it's the latest, vs. 3.5?

cfn-flip install order

cfn-flip's requirement is pinned to an explicit version, but troposphere's requirement for it is a fairly greedy >=. Given the current setup, the two cfn-flip version constraints conflict with each other, causing issues.

Option for appending format filename extension

It would be nice if there were a flag to optionally write out the filename with an extension matching the format. If the format is yaml, for example, the filename would be written out as .yaml.

Jetstream should return the exact error that caused CF to fail

Per @horacio3

Currently CircleCI errors out and gives back the error from the main parent stack. It would be beneficial to be able to see the error from the underlying stack that actually failed.

Current error:

ERROR:jetstream.testing:Stack JetstreamTest1474980531 failure occurred: ROLLBACK_IN_PROGRESS
ERROR:jetstream.testing:Reason: {u'StackId': 'arn:aws:cloudformation:us-east-1:891714082543:stack/JetstreamTest1474980531/bf91e920-84b0-11e6-b403-503f23fb5536', u'Tags': [], u'StackStatusReason': 'The following resource(s) failed to create: [Securitygroup, ElasticBeanstalk, Vpn, S3fs, MicrosoftAd]. . Rollback requested by user.', u'CreationTime': datetime.datetime(2016, 9, 27, 12, 48, 54, 546000, tzinfo=tzlocal()), u'Capabilities': ['CAPABILITY_IAM'], u'StackName': 'JetstreamTest1474980531', u'NotificationARNs': [], u'StackStatus': 'ROLLBACK_IN_PROGRESS', u'DisableRollback': False}
ERROR:jetstream.testing:Reason: The following resource(s) failed to create: [Securitygroup, ElasticBeanstalk, Vpn, S3fs, MicrosoftAd]. . Rollback requested by user.

Python 2/3 support migration plan

Given the age of the previous python 3 bug (#46), I'm opening this one up as a more current issue so folks don't miss it due to the age of the old one. Put simply, I've been able to get a working python 3 version of jetstream. The essential work can be found here:

7f8537c

Mostly copying from the commit log:

  • items(), keys(), and many other functions now return specific objects instead of a list, so they need to be explicitly converted in some areas (mostly a NOOP on python 2)
  • cmp is deprecated in python 3 so use == instead (also works on python 2)
  • Raising of exceptions is formatted differently. This code is python 3 specific
  • Usage of sys.stderr.write versus the print function. This code is python 3 specific
  • Modification of dynamic module import code to inject the current directory into sys.path so that module imports work properly when not in the system path. Temporary modifications to the path are reset to the original so as to avoid unnecessary import consequences
  • Fix test scripts to use proper relative imports

In particular:

  • Raising of exceptions is formatted differently - this I can't make python 2 backwards compatible due to the lexer bombing out (see the sketch below)
  • Usage of sys.stderr.write versus the print function - this I can't make python 2 backwards compatible due to future imports needing to be the absolute top-level import (meaning I can't wrap a try/catch or if block around it)
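
For reference, a minimal sketch of the raise difference (the reraise helper is hypothetical):

import sys

def reraise(message):
    tb = sys.exc_info()[2]
    # Python 2 spelling -- a SyntaxError under Python 3:
    #     raise RuntimeError, message, tb
    # Python 3 spelling, preserving the original traceback:
    raise RuntimeError(message).with_traceback(tb)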

Given these changes, I'd like to have the next release be python 3 specific. Given that there are tagged releases now, folks who still need time to migrate from 2 will be able to stay on a 2-compatible release instead. I'll also look into creating a somewhat basic migration guide based on the work I've done internally on some of the template code. Note that there is a "python 3" version of troposphere, but it's somewhat of a 2to3.py on the backend and I'd rather have a solid python 3 master branch.

Remove Trailing Whitespaces

Right now the templates are created with trailing whitespace on most lines; we should update the generate method to remove it.

Support appending/prepending to a test stack name

It would be useful if you could add additional information to a CloudFormation stack name, especially when multiple tests are running.

jetstream --test -m my_package --test-name ${CIRCLE_BRANCH}

Which would result in a test name of "jetstream-test-1475497601-my_feature"

@martinb3 @horacio3 Thoughts?

Markdown not rendering in Github

Some of the documentation being produced doesn't render well in Github, e.g. the 4th level header here:
[screenshot: a 4th-level #### header rendered as literal text]

We should add a space after #### but before any text, so that it's rendered correctly.

Support Testing Dry Run

It would be nice to be able to run --test/-t with a --dry flag so we can get a full dump of the templates that will be used when a test is run. This makes it easier to verify what the templates look like for a test without actually spending the money to run it.

CI Testing Improvements

Currently lint/style testing is done on the repository, which helps catch a number of issues. However, it doesn't validate template generation in any form. As there are currently two test templates, we should build a testing suite around template generation to ensure consistent results. This is even more critical given that metadata generation has a few permutations that need to be validated. It would also help ensure python 2/3 compatibility until python 2 support gets dropped.

Support actual S3 Path instead of just bucket name

Uploading the new templates to S3 currently requires a bucket name for the --path flag, which doesn't make a lot of sense. A path would usually suggest s3://mybucket/templates rather than just the name of an S3 bucket.

Support for a real path would also allow uploading templates to a subdirectory of a bucket. The downside is that this breaks the interface a little, but I think there is a bigger win here.

rack_iam integration

This issue is for integrating rack_iam into jetstream for use in creating IAM-related components. In essence, rack_iam creates a CF-compatible dictionary which can be used as properties for various high-level troposphere IAM objects. It has the following benefits:

  • An extensive test suite
  • API documentation required for all modules, objects, and methods under the rack_iam folder
  • Run against flake8 and flake8 docstring to ensure as much valid code structure as possible
  • Much more relaxed in terms of accepting input (for those who wish more constraints and validations awacs will still be available)
  • Object attributes are standard and this enables easier code completion in IDEs

awacs will still be kept, both as an alternative and as a safeguard in the event anything happens to rack_iam. The reverse holds true as well.

Implement minimum version number for troposphere dependency

As there have been breaking changes in the last two troposphere releases, I propose instead of pinning a particular version, we update the requirement to a minimum version for the troposphere dependency.

This change would allow each project utilizing jetstream to adjust the troposphere version to suit their needs, without requiring changes to this tool. It would allow ample time for each project to vet new versions and determine whether to update or stay on the previous release. Of course, when it is deemed safe or necessary, the minimum version can be increased here, but without the urgency that current releases impose.

Thoughts?

Always Generating Test Master means an exception gets thrown when credentials are missing

Stack Trace:

Traceback (most recent call last):
  File "/Users/jim/.virtualenvs/jetstream/bin/jetstream", line 11, in <module>
    load_entry_point('jetstream', 'console_scripts', 'jetstream')()
  File "/Users/jim/Projects/jetstream/jetstream/cli.py", line 144, in main
    _execute(args)
  File "/Users/jim/Projects/jetstream/jetstream/cli.py", line 52, in _execute
    test = testing.Test(updated_templates)
  File "/Users/jim/Projects/jetstream/jetstream/testing.py", line 43, in __init__
    self._client = boto3.client('cloudformation')
  File "/Users/jim/.virtualenvs/jetstream/lib/python2.7/site-packages/boto3/__init__.py", line 83, in client
    return _get_default_session().client(*args, **kwargs)
  File "/Users/jim/.virtualenvs/jetstream/lib/python2.7/site-packages/boto3/session.py", line 263, in client
    aws_session_token=aws_session_token, config=config)
  File "/Users/jim/.virtualenvs/jetstream/lib/python2.7/site-packages/botocore/session.py", line 824, in create_client
    client_config=config, api_version=api_version)
  File "/Users/jim/.virtualenvs/jetstream/lib/python2.7/site-packages/botocore/client.py", line 68, in create_client
    verify, credentials, scoped_config, client_config, endpoint_bridge)
  File "/Users/jim/.virtualenvs/jetstream/lib/python2.7/site-packages/botocore/client.py", line 130, in _get_client_args
    verify, credentials, scoped_config, client_config, endpoint_bridge)
  File "/Users/jim/.virtualenvs/jetstream/lib/python2.7/site-packages/botocore/args.py", line 45, in get_client_args
    endpoint_url, is_secure, scoped_config)
  File "/Users/jim/.virtualenvs/jetstream/lib/python2.7/site-packages/botocore/args.py", line 103, in compute_client_args
    service_name, region_name, endpoint_url, is_secure)
  File "/Users/jim/.virtualenvs/jetstream/lib/python2.7/site-packages/botocore/client.py", line 203, in resolve
    service_name, region_name)
  File "/Users/jim/.virtualenvs/jetstream/lib/python2.7/site-packages/botocore/regions.py", line 122, in construct_endpoint
    partition, service_name, region_name)
  File "/Users/jim/.virtualenvs/jetstream/lib/python2.7/site-packages/botocore/regions.py", line 135, in _endpoint_for_partition
    raise NoRegionError()
botocore.exceptions.NoRegionError: You must specify a region.

This is caused by the boto3.client being instantiated in Test.__init__: https://github.com/rackerlabs/jetstream/blob/master/jetstream/testing.py#L43

Rather than generating a Test object and master template every time Jetstream is run, Jetstream should only initiate these actions when the user actually wants some kind of test done, whether a full test or perhaps a yet-to-be-added dry run.
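
A minimal sketch of deferring the client creation until a test actually runs (attribute and property names are hypothetical; the real fix would live in testing.py):

import boto3

class Test(object):
    def __init__(self, templates):
        self.templates = templates
        self._client = None  # deferred: no credentials or region needed yet

    @property
    def client(self):
        # boto3 (and thus credential/region resolution) is only touched
        # on first use, i.e. when a test is actually requested
        if self._client is None:
            self._client = boto3.client('cloudformation')
        return self._client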

Show Diffs

It would be helpful if INFO mode showed the diffs between the old and new templates.

Attempt to load markdown as JSON

If you remove all the generated JSON files and keep the markdown files, and then ask jetstream to generate documentation -- it will blow up with an attempt to parse markdown as JSON:

$ jetstream -m my_templates -d
$ rm -rf artifacts/*.template
$ jetstream -m my_templates -d
Traceback (most recent call last):
  File "/Users/martin/.virtualenvs/jetstream/bin/jetstream", line 11, in <module>
    load_entry_point('jetstream', 'console_scripts', 'jetstream')()
  File "/Users/martin/src/jetstream/jetstream/cli.py", line 152, in main
    _execute(args)
  File "/Users/martin/src/jetstream/jetstream/cli.py", line 76, in _execute
    publish.publish_file(tmpl.document_name(), tmpl.document())
  File "/Users/martin/src/jetstream/jetstream/publisher.py", line 130, in publish_file
    if not self.newer(file_path, contents):
  File "/Users/martin/src/jetstream/jetstream/publisher.py", line 117, in newer
    existing = json.load(fil)
  File "/usr/local/Cellar/python/2.7.12/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/__init__.py", line 291, in load
    **kw)
  File "/usr/local/Cellar/python/2.7.12/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/__init__.py", line 339, in loads
    return _default_decoder.decode(s)
  File "/usr/local/Cellar/python/2.7.12/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/decoder.py", line 364, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/local/Cellar/python/2.7.12/Frameworks/Python.framework/Versions/2.7/lib/python2.7/json/decoder.py", line 382, in raw_decode
    raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded

This seems to be because def newer(self, name, latest): in publisher.py assumes it will be called only for JSON files, but def _execute(args): in cli.py actually calls newer() for JSON and Markdown files.

Start releasing to PyPI

We should start doing regular releases to PyPI so we can allow others to depend on this tool and force specific version requirements when they use it to test. This would also allow for others to depend on a specific major version when breaking changes happen in jetstream.

Documentation should include alarms or other resources

We originally took out the list of "all resources" in the markdown / documentation produced by jetstream, but it's recently been suggested that having a list of alarms defined would be useful. Should we (a) add the alarms to the output or (b) add all the resources back, grouped by type?

Better object comparison

Currently, even if nothing in the newer and older objects has changed other than ordering (specifically with dicts), jetstream will still count the template as new and run tests. This creates extra work and costs money to test things that are not actually different.
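
A minimal sketch of an ordering-insensitive comparison, assuming both sides are JSON-serializable:

import json

def templates_equal(old, new):
    '''Compare templates while ignoring dict key ordering.'''
    def canonical(obj):
        # sort_keys normalizes key order before comparison
        return json.dumps(obj, sort_keys=True)
    return canonical(old) == canonical(new)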

Troposphere 2.1.0 may have broken something.

This template compiles fine, but breaks during testing.

'''CodeCommit Template'''

import random
import string

from troposphere import (AWS_NO_VALUE, If, Equals, GetAtt,
                         Not, Output, Ref, codecommit)

from faws_templates.base import FAWSParameter, FAWSTemplate
from faws_templates.common import params


class CodeCommit(FAWSTemplate):
    '''CodeCommit Template'''

    def __init__(self):
        super(CodeCommit, self).__init__('codecommit.template')
        template = self.template
        template.add_description(self.name + " - Creates an AWS CodeCommit repository. Please be aware that this template will create resources for which you will be charged.")

        template.add_parameter(params.environment)

        repo_parameter_group = "Repository Configuration"
        trigger_1_parameter_group = "Trigger 1 Configuration (OPTIONAL)"

        suffix = ''.join(random.SystemRandom().choice(string.ascii_lowercase + string.digits) for _ in range(5))
        test_repo_name = "test-repo-{}".format(suffix)

        repo_name = template.add_parameter(FAWSParameter(
            "RepositoryName",
            AllowedPattern="([A-Za-z0-9._-]+)",
            Default="MyRepo",
            Type="String",
            Description="A name for the AWS CodeCommit repository. Must contain only letters, numbers, periods (.), underscores (_), and dashes (-). Maximum length 100.",
            GroupName=repo_parameter_group,
            TestParameter=(test_repo_name),
        ))

        repo_description = template.add_parameter(FAWSParameter(
            "RepositoryDescription",
            Default="Created with CloudFormation.",
            Type="String",
            Description="A detailed description for the AWS CodeCommit repository.",
            GroupName=repo_parameter_group,
            TestParameter=("Testing"),
        ))

        enable_trigger_1 = template.add_parameter(FAWSParameter(
            "Trigger1Enable",
            Default="False",
            Type="String",
            AllowedValues=["True", "False"],
            Label="Enable Trigger",
            GroupName=trigger_1_parameter_group,
            TestParameter=("True"),
        ))

        template.add_condition("isTrigger1Enable", Equals(Ref(enable_trigger_1), "True"))

        trigger_1_name = template.add_parameter(FAWSParameter(
            "Trigger1Name",
            Default="Trigger1",
            AllowedPattern="([A-Za-z0-9._-]+)",
            Description="The name of the trigger. Can only contain numbers, periods (.), underscores (_), and dashes (-). Maximum length 100.",
            Type="String",
            Label="Trigger Name",
            GroupName=trigger_1_parameter_group,
            TestParameter=("Test_Trigger"),
        ))

        trigger_1_custom_data = template.add_parameter(FAWSParameter(
            "Trigger1CustomData",
            Default="",
            Type="String",
            Description="Any custom data associated with the trigger that will be included in the information sent to the target of the trigger.",
            GroupName=trigger_1_parameter_group,
            TestParameter=("Testing"),
        ))

        template.add_condition("isCustomData1Set", Not(Equals(Ref(trigger_1_custom_data), "")))

        trigger_1_destination_arn = template.add_parameter(FAWSParameter(
            "Trigger1DestinationArn",
            Type="String",
            Description="REQUIRED. The ARN of the SNS Topic, or SQS Queue that is the target for a trigger. Full ARN required.",
            Label="Notification ARN",
            GroupName=trigger_1_parameter_group,
            TestParameter=("arn:aws:sns:us-west-2:891714082543:rackspace-support")
        ))

        trigger_1_branches = template.add_parameter(FAWSParameter(
            "Trigger1Branches",
            Type="CommaDelimitedList",
            Default="Master",
            Description="Comma Delimited List of branch names. Maximum length 256",
            GroupName=trigger_1_parameter_group,
            TestParameter=("Master"),
        ))

        trigger_1_events = template.add_parameter(FAWSParameter(
            "Trigger1Events",
            Default="all",
            Type="CommaDelimitedList",
            Description="Comma delimited list of Valid Values: \"all\", \"updateReference\", \"createReference\", \"deleteReference\". The value \"all\" cannot be used with any other values.",
            GroupName=trigger_1_parameter_group,
            TestParameter=("updateReference", "createReference"),
        ))

        repo_trigger1 = codecommit.Trigger(
            Name=Ref(trigger_1_name),
            CustomData=If("isCustomData1Set", Ref(trigger_1_custom_data), Ref(AWS_NO_VALUE)),
            DestinationArn=Ref(trigger_1_destination_arn),
            Branches=Ref(trigger_1_branches),
            Events=Ref(trigger_1_events),
        )

        repo = self.template.add_resource(codecommit.Repository(
            "Repository",
            RepositoryName=Ref(repo_name),
            RepositoryDescription=Ref(repo_description),
            Triggers=If("isTrigger1Enable", [repo_trigger1], Ref(AWS_NO_VALUE)),
        ))

        template.add_output(Output(
            "RepoName",
            Description="CodeCommit Repository Name",
            Value=GetAtt(repo, "Name")
        ))

        template.add_output(Output(
            "RepoARN",
            Description="CodeCommit Repository ARN",
            Value=GetAtt(repo, "Arn")
        ))

        template.add_output(Output(
            "CloneHTTP",
            Description="The URL to use for cloning the repository over HTTPS",
            Value=GetAtt(repo, "CloneUrlHttp")
        ))

        template.add_output(Output(
            "CloneSSH",
            Description="The URL to use for cloning the repository over SSH",
            Value=GetAtt(repo, "CloneUrlSsh")
        ))

Output without --test

E:\Github\TroposphereTemplates [codecommitfix]> jetstream -m faws_templates
Updated Templates: codecommit.template

Output with --test

E:\Github\TroposphereTemplates [codecommitfix]> jetstream --test -m faws_templates
Updated Templates: codecommit.template
Traceback (most recent call last):
  File "E:\Github\jetstream\venv\Scripts\jetstream-script.py", line 11, in <module>
    load_entry_point('jetstream', 'console_scripts', 'jetstream')()
  File "e:\github\jetstream\jetstream\cli.py", line 170, in main
    _execute(args)
  File "e:\github\jetstream\jetstream\cli.py", line 62, in _execute
    test = testing.Test(updated_templates, dry_run=args.dry_test)
  File "e:\github\jetstream\jetstream\testing.py", line 40, in __init__
    self.templates = _flatten_templates(templates)
  File "e:\github\jetstream\jetstream\testing.py", line 229, in _flatten_templates
    return _recurse_dependencies(templates).values()
  File "e:\github\jetstream\jetstream\testing.py", line 242, in _recurse_dependencies
    if test_param_group.dependencies():
  File "e:\github\jetstream\jetstream\template.py", line 129, in dependencies
    deps[source.name] = source
AttributeError: 'str' object has no attribute 'name'

Let me know if I can provide more information or assist in any way.

Documentation Generation Code Path Issue

Currently the newer method of all the publisher classes is executed when dealing with both JSON templates and markdown generation. Unfortunately the documentation path runs through a logical branch of the JSON parser. Essentially, if the JSON parser can't deal with the content, it just loads it as a string and compares it that way. On Python 3 it appears that the exception path isn't getting hit in the newer method for incorrect JSON encoding, leading to:

json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

This is not what the try/catch is expecting, leading to the markdown logic failing out as a standard JSON error. This will make dual python 2/3 support very difficult. Instead I propose:

  1. Breaking out the generic file loading as load_content which returns a file pointer, usable by both standard file reads and JSON loading
  2. Creation of publish_template and publish_documentation, which utilize load_content and handle the parsing separately (see the sketch below)
  3. Update the CLI code to use these new calls
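
A rough sketch of the proposed split (method names from the list above; write() and the surrounding class are hypothetical, and load_content returns raw text here rather than a file pointer for brevity):

import json

class Publisher(object):
    def load_content(self, path):
        '''Generic file read shared by template and doc publishing.'''
        with open(path) as fil:
            return fil.read()

    def publish_template(self, path, contents):
        # Templates are JSON: compare parsed objects, not raw text
        if json.loads(contents) != json.loads(self.load_content(path)):
            self.write(path, contents)

    def publish_documentation(self, path, contents):
        # Markdown is plain text: a straight string comparison suffices
        if contents != self.load_content(path):
            self.write(path, contents)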

Another possible issue is how to handle the load_template part in dealing with both JSON and YAML at the same time. As our internal use consists of JSON templates, we haven't had much need to worry about the YAML part. Given that AWS has many official YAML templates, however, it would probably be good to make this work a bit more smoothly at some point (as a separate issue).

Versioning Metadata For Template Validation

When jetstream generates templates, it does not include any kind of versioning to make it quickly apparent when an existing template and a generated template differ. To rectify this, the plan is to add a metadata value that can hold a freeform string to be used for versioning (such as a version number, hash value, or timestamp).

  1. Update the comparison code. Currently the code to compare old and new templates uses cmp, which is deprecated in python 3. Our initial research shows that == instead looks promising to handle both python 2 and 3 accordingly, without a lot of effort to update code.
  2. Create code which removes various components from the resulting old/new object content. My initial thought is to use a list of keys in some constant so we can remove more if we need to (see the sketch below).
  3. Create CLI flags in jetstream to both add the date/hash and decide what the format of it should look like. These will set options for now and not update any code just yet. This is to avoid breaking default configurations which may not expect this.
  4. Create the code which produces the string from the CLI options to use for our indicator

I'm thinking we can have a single PR with the above broken down into 4 functional commits to get this working.
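
A minimal sketch of steps 1 and 2 together (IGNORED_KEYS is a hypothetical constant name):

# Keys stripped before comparison; extend this list as needed
IGNORED_KEYS = ['Metadata']

def _strip(template_dict):
    return {k: v for k, v in template_dict.items()
            if k not in IGNORED_KEYS}

def is_unchanged(old, new):
    # == behaves the same on python 2 and 3, unlike the deprecated cmp()
    return _strip(old) == _strip(new)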

troposphere-2.0.0 depend update (review possible breaking change)

From the release notes:

2.0.0 (2017-10-07)

  • Note: the s3.Bucket change (#844) may cause a breaking change for non-named arguments.
  • Add DefinitionBody to serverless API (#822)
  • Adding kinesis stream source to firehose (#823)
  • Add Event::Rule::Target::EcsParameters (#824)
  • Add S3 Transfer Acceleration to AWS::S3::Bucket (#833)
  • Add AvailabilityZone property to TargetDescription (#834)
  • Add Tags to NATGateway (#835)
  • Add ResourceLifecycleConfig to ElasticBeanstalk (#836)
  • Add AWS::Athena::NamedQuery (#837)
  • Added platformArn to Environment and ConfigurationTemplate (#839)
  • Events target (fixes #830) (#840)
  • Refactor s3.Bucket to remove custom init() and add tests (#844)
  • Be more explicit on the use of the Tags object for Tags (#845)

Need to check and see what the first note is about, though.

Test runs fail to clean up RDS stacks.

Jetstream intentionally removes any RetentionPolicy on resources when it generates templates for testing (see https://github.com/rackerlabs/jetstream/blob/master/jetstream/template.py#L248-L256). When the default AWS behavior for resources without a retention policy was to delete them, this was the desired behavior, but AWS has since changed this default for RDS instances and clusters. This invariably results in being unable to successfully delete such stacks, as the created snapshots have dependencies on the created resources. This sometimes cascades into multiple stacks failing to delete because of dependencies on resources in the previous failure.

Instead of removing the policy, we should probably set this property to a value of Delete, explicitly forcing all resources to be deleted.
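
A minimal sketch of that change, assuming the test-generation step has access to the troposphere Template object (force_delete_policy is a hypothetical helper name):

def force_delete_policy(template):
    '''Explicitly mark every resource in a test stack for deletion.'''
    for resource in template.resources.values():
        # Overrides any Retain/Snapshot policy the source template declares,
        # so test stacks (including RDS instances/clusters) clean up fully
        resource.DeletionPolicy = 'Delete'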
