
docs-spring-cloud-dataflow-k8s's Introduction

Working on VMware Spring Cloud Data Flow for Kubernetes Documentation

The VMware Spring Cloud Data Flow for Kubernetes documentation is now published on docs.vmware.com. See below for information about how the documentation is stored and published and about the toolchain we now use.

If You Previously Worked With Bookbinder and Concourse

We no longer use Bookbinder or Concourse pipelines to build and publish this documentation.

Requirements

You will need a local clone of the content repository, the Markdown Local Preview CLI utility, and VMware LDAP credentials for DocWorks and the Docs Dashboard.

Workflow

Complete these steps:

  • Edit content files locally

  • Preview your changes locally

  • Commit and push your changes to GitHub

  • In DocWorks, publish the docs to staging

  • Review your changes on staging

  • In Docs Dashboard, promote your changes to pre-production

  • In Docs Dashboard, sign off on your changes in pre-production

  • In Docs Dashboard, promote your changes to production

For more information, see the following sections.

Editing Content Files

All content files are stored in the content repository on GitHub (https://github.com/pivotal-cf/docs-spring-cloud-dataflow-k8s). Clone this repository.

To edit documentation, check out the relevant branch, make changes locally, preview if desired (see the next section), then commit and push your changes. For example, to edit documentation for Spring Cloud Data Flow version 1.4, check out and commit changes to the v1.4 branch.
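The branch-per-version workflow described above can be sketched as follows. To keep the example runnable anywhere, it uses a throwaway local repository; in practice you would clone https://github.com/pivotal-cf/docs-spring-cloud-dataflow-k8s and push to the version branch (for example, v1.4) instead.

```shell
# Throwaway local repo standing in for the real content repository.
git init -q demo-docs
cd demo-docs
git config user.email "docs@example.com"   # throwaway identity for the demo
git config user.name "Docs Writer"
git commit -q --allow-empty -m "initial commit"
git checkout -q -b v1.4                    # check out the version branch
echo "updated content" > index.md          # edit content files locally
git add index.md
git commit -q -m "Update docs for 1.4"
# In the real repo: git push origin v1.4
```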

Previewing Your Changes Locally

To preview your changes locally, you must install the Markdown Local Preview CLI utility. For more information and the file download link, see Using the Markdown Local Preview CLI Utility on Confluence.

Preparing for Local Previews

After installing the utility, follow these steps to prepare for local previews:

  1. From the utility directory (for example, ~/Downloads/VMware.GTIX.MarkdownCLI-osx-x64-1.0-0.markdown-cli-osx/Markdown-Cli/), run the Markdown-Cli utility:

    $ ./Markdown-Cli
  2. Select Build Project (watches for changes), then press Enter.

  3. Enter the complete path to the docs content repository on your local machine. For example:

    /Users/bobcratchit/Work/docs-spring-cloud-dataflow-k8s
  4. Type n, then press Enter.

  5. Press any key.

  6. Select Back, then press Enter.

  7. Select Exit, then press Enter.

  8. In your IDE or text editor, open the newly generated file _md-cli/dist/config.json.

  9. Set tocLocation to toc.md.

  10. Set variablesLocation to variables.yml.
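After steps 9 and 10, the relevant entries in _md-cli/dist/config.json should look something like the following (other generated keys are omitted here, since they vary by utility version):

```json
{
  "tocLocation": "toc.md",
  "variablesLocation": "variables.yml"
}
```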

Previewing Changes

Now you can preview your changes:

  1. From the utility directory, run the Markdown-Cli utility:

    $ ./Markdown-Cli
  2. Select Build Project (watches for changes), then press Enter.

  3. Enter the complete path to the docs content repository on your local machine. For example:

    /Users/bobcratchit/Work/docs-spring-cloud-dataflow-k8s
  4. Type y, then press Enter.

  5. In a browser, open the DocsPreview.html file that the utility generated under a _md-cli subdirectory of the content repository directory. The file path will be something like the following:

    /Users/bobcratchit/Work/docs-spring-cloud-dataflow-k8s/_md-cli/dist/DocsPreview.html
  6. Navigate to the page that you edited.

  7. If you make additional changes, the utility automatically regenerates the preview. Refresh the preview page in your browser, then navigate to your edited page again.

Publishing to Staging and Promoting to Pre-Production

To publish your changes on the staging website:

  1. Visit DocWorks (https://docworks.vmware.com/).

  2. Log in using your VMware LDAP credentials.

  3. From the main navigation menu, select Markdown.

  4. In the left-hand navigation menu, under Markdown Projects, click All.

  5. Search for the docs set (for example, "Spring Cloud Data Flow").

  6. Locate the "card" for the docs set and click Publish.

    Note
    Depending on the project, you may see Build and Deploy buttons instead of a Publish button. If so, first click Build, then when the build has finished, click Deploy.
  7. If the build fails, click the red bar representing the most recent build and review the logs for any warnings or errors. Resolve any errors and start a new build.

  8. When the build succeeds, visit the Docs Dashboard (https://docsdash.vmware.com/).

  9. Log in using your VMware LDAP credentials.

  10. From the main navigation menu, select Deployment > Stage.

  11. In the list of deployments, locate the project and click its link in the Publication column. This will take you to the staging website, where you can view your changes.

When you are satisfied with your changes on staging:

  1. Return to the Docs Dashboard.

  2. In the list of deployments, select your project (select the checkbox in the Product column).

  3. Click the Deploy Selected to Pre-Prod button.

  4. In the Deployment Status dialog, click Refresh to update the status of the deployment. When the deployment has succeeded, click Hide.

  5. From the main navigation menu, select Deployment > Pre Prod.

  6. In the list of deployments, locate the project and click its link in the Publication column. This will take you to the pre-production website, where you can view your changes.

Promoting to Production

When you are satisfied with your changes in pre-production:

  1. Return to the Docs Dashboard.

  2. In the list of deployments, select your project (select the checkbox in the Product column).

  3. Click the Sign-Off For Release button.

  4. Confirm your sign-off.

  5. In the list of deployments, select your project again.

  6. Click the Deploy Selected to Prod button.

  7. Confirm that you want to deploy to production.

  8. In the Deployment Status dialog, click Refresh to update the status of the deployment. When the deployment has succeeded, click Hide.

  9. From the main navigation menu, select Deployment > Production.

  10. In the list of deployments, locate the project and click its link in the Publication column. This will take you to the production website, where you can view your published changes.

docs-spring-cloud-dataflow-k8s's People

Contributors

anita-flegg, buzzardo, fifthposition, markpollack, mlimonczenko, trisberg


docs-spring-cloud-dataflow-k8s's Issues

Items to fix in docs

Update docs for 1.5.2 release

Notable changes

Library updates

  • Spring Boot 2.7.9
  • Spring Cloud 2021.0.6
  • Spring Framework 5.3.25

Kubernetes

  • Added ability to fully configure pod and container security contexts.
  • Container security context is propagated to init container and additional containers.

Update docs for 1.6.0 Release

Spring Cloud Data Flow Pro 1.6.0 for Kubernetes provides a Carvel package for deployment.

A download is available on Tanzu Net named SCDF Pro Installation files for Kubernetes with version 1.6.0.

This download includes a PDF with instructions for deploying the package. The same instructions can be found in the reference guide under Deployment using Carvel.

The relevant packages are located on Harbor at registry.pivotal.io; customers should already have credentials to access the container registry.

OSS Release 2.11.1 Notes

This version provides support for Spring Boot 3.x.

Configuring GitBot is recommended

Pivotal provides the GitBot service to synchronize pull requests and/or issues made against public GitHub repos with Pivotal Tracker projects. This service does not track individual commits.

If you are a Pivotal employee, you can configure GitBot to sync your GitHub repo to your Pivotal Tracker project with a pull request. If you get a 404 for the config repo, an ask+rd@ ticket is the fastest way to get write access.

If you do not want to have pull requests and/or issues copied from GitHub to Pivotal Tracker, you do not need to take any action.

If there are any questions, please reach out to [email protected].

Update screenshots

In 2.7, we have a new UI, so we would have to replace the old images with new screenshots.

1.3.0 GA release

  • Add release notes
  • Update 'Product Snapshot'
  • Add new branch to staging
  • Publish new branch
  • Switch all the RC references in the version table (1.3.0, 2.7.0, and 2.8.0)

Document how to configure "keystore" for SCDF and Skipper

To enable HTTPS (SSL), one needs to add the keystore that is accessible by the application.

Oded Shopen 2:24 AM
One issue I'm facing is how to transfer the keystore contents to the application.yaml. The security section here: https://docs.pivotal.io/scdf-k8s/0-1/configuring-installation-values.html redirects you to the SCDF documentation, but that expects you to run on your own machine and not as pods on Kubernetes. So, just adding the path to the keystore file will not work because that path doesn't exist inside the pod. I guess I could add it as a volume mount to the pod but I think this should be documented.
Caused by: java.io.FileNotFoundException: /home/odedia/scdf/dataflow.keystore (No such file or directory)

Resolved with this:

kubectl create configmap keystore --from-file=/home/odedia/scdf/dataflow.keystore

Then modified ./apps/data-flow/kustomize/overlays/dev/deployment-patch.yaml as follows:

metadata:
  name: scdf-server
spec:
  template:
    spec:
      containers:
      - name: scdf-server
        volumeMounts:
          - name: database
            mountPath: /etc/secrets/database
            readOnly: true
          - name: keystore 
            mountPath: /etc/secrets/keystore
            readOnly: true
      imagePullSecrets:
      - name: scdf-image-regcred
      volumes:
        - name: database
          secret:
            secretName: postgresql
            items:
            - key: postgresql-password
              path: database-password
        - name: keystore
          configMap:
            name: keystore

And then application.yml defines: key-store: "/etc/secrets/keystore/dataflow.keystore"
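In Spring Boot terms, that key-store path is typically wired up under the server.ssl properties; a minimal sketch follows, with the password and alias values illustrative rather than taken from the original report:

```yaml
server:
  ssl:
    enabled: true
    key-store: "/etc/secrets/keystore/dataflow.keystore"
    key-store-password: "changeit"   # illustrative; source from a Secret in practice
    key-alias: "dataflow"            # illustrative
```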

Can we provide instructions on cleaning up DataFlow pro

For devs who are experimenting with Pro, we probably want to have cleanup instructions for when they're done. For example:

Cleanup

Find all components that we need to clean up

kubectl get all

Clean up the SCDF bits (you will need to update the resource names below; this is just an example)

kubectl delete replicaset.apps/scdf-server-bff8bc77c deployment.apps/scdf-server service/scdf-server

Clean up Skipper

kubectl delete deployment.apps/skipper service/skipper

Clean up RabbitMQ

kubectl delete statefulset.apps/rabbitmq service/rabbitmq-headless service/rabbitmq

Clean up PostgreSQL

kubectl delete statefulset.apps/postgresql service/postgresql-headless service/postgresql

Wrong file reference to deployment spec for monitoring

Next, mount the secret, to make it available for use by the Data Flow server configuration. Add the secret mount settings in the application.yaml file for the Data Flow server application:

This should point to deployment-patch.yaml instead.
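For reference, the kind of deployment-patch.yaml secret mount the issue is about looks roughly like the following (the monitoring secret name and mount path are illustrative, not from the report):

```yaml
spec:
  template:
    spec:
      containers:
      - name: scdf-server
        volumeMounts:
        - name: monitoring            # illustrative secret volume name
          mountPath: /etc/secrets/monitoring
          readOnly: true
      volumes:
      - name: monitoring
        secret:
          secretName: monitoring      # illustrative Kubernetes Secret name
```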
