quintush / helm-unittest
This project forked from helm-unittest/helm-unittest
BDD styled unit test framework for Kubernetes Helm charts as a Helm plugin.
License: MIT License
I have written a tutorial doc on how to write unittests for this helm-unittest framework. It has been useful to both my company and to FOSS contributors. I'd like to contribute it here, but I'm not sure where it would fit in.
Refactoring the plugin to use the packaging structure used in Helm itself.
As a result of the refactoring I will also revise the tests.
Several import statements point to the lrills fork, meaning changes made in this repository are not necessarily reflected in the resulting binary.
Example
"github.com/lrills/helm-unittest/internal/printer"
"github.com/lrills/helm-unittest/pkg/unittest"
"github.com/lrills/helm-unittest/pkg/unittest/formatter"
Say I want to check whether one value equals another value, i.e. compare one path against another path. That doesn't work; the assertion fails with:
- asserts[5] `equal` fail
  Template: testa/templates/deployment.yaml
  DocumentIndex: 0
  Path: spec.template.spec.containers[0].livenessProbe.periodSeconds
  Expected to equal:
    spec.template.spec.containers[0].readinessProbe.periodSeconds
  Actual:
    10
  Diff:
  --- Expected
  +++ Actual
  @@ -1,2 +1,2 @@
  -spec.template.spec.containers[0].readinessProbe.periodSeconds
  +10
Is it possible to do it without hardcoding the 'value' or is there another work around?
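One workaround, sketched here under the assumption that the expected period is known up front: pin both paths to the same literal value, so that drift in either probe fails the test (comparing one path directly against another is not supported by the plugin).

```yaml
# Hypothetical workaround: assert both probes against the same literal
# instead of comparing one path against another.
asserts:
  - equal:
      path: spec.template.spec.containers[0].livenessProbe.periodSeconds
      value: 10
  - equal:
      path: spec.template.spec.containers[0].readinessProbe.periodSeconds
      value: 10
```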
If the template YAML file is empty, the test will always pass.
Below passes as expected.
# templates/deployment.yaml
apiVersion: batch/v1beta1
kind: Deployment
---
# tests/deployment_test.yaml
suite: test deployment
templates:
- deployment.yaml
tests:
- it: should pass
values:
- ../values.yaml
asserts:
- isKind:
of: Deployment
Below errors out with `- asserts[0] isKind fail ... Expected kind: Deployment ... Actual: null`, as expected.
# templates/deployment.yaml
apiVersion: batch/v1beta1
---
# tests/deployment_test.yaml
suite: test deployment
templates:
- deployment.yaml
tests:
- it: should pass
values:
- ../values.yaml
asserts:
- isKind:
of: Deployment
Below passes, but should actually fail:
# templates/deployment.yaml
##### Empty file #####
---
# tests/deployment_test.yaml
suite: test deployment
templates:
- deployment.yaml
tests:
- it: should pass
values:
- ../values.yaml
asserts:
- isKind:
of: Deployment
I am using the https://hub.docker.com/r/aneeshkj/helm-unittest Docker image, which is built from this GitHub fork.
This issue is copied from https://github.com/lrills/helm-unittest/issues/107
Hi,
Really like this project, and it was good to see Helm 3 support being added!
I was stumped by some cryptic error messages I was getting from Helm Unittest today:
- manifest should match snapshot
- asserts[0] `matchSnapshot` fail
Error:
template "debug/templates/b.yaml" not exists or not selected in test suite
I've managed to work out what causes it though: if you have required values inside a Helm template (using the required function, e.g. {{ required ".someParameter is required" $.Values.someParameter }}), then even if you write a test for a different template that doesn't need this value, you get the above error.
To replicate:
helm create debug
cd debug
rm templates/*
rm -rf templates/tests
Create a.yaml
in the templates
directory
apiVersion: v1
kind: ConfigMap
metadata:
name: {{ required ".someParameter is required" $.Values.someParameter }}
namespace: "test"
data:
a: "a"
Create b.yaml
in the templates
directory
apiVersion: v1
kind: ConfigMap
metadata:
name: IDontNeedParameters
namespace: "iWork"
data:
b: "b"
Create a test case in tests/b_test.yaml
templates:
- b.yaml
tests:
- it: manifest should match snapshot
asserts:
- matchSnapshot: {}
Notice that this solely tests b.yaml, which has no required values.
Run Helm Unit Test
helm unittest -3 .
You'll get this error:
### Chart [ debug ] .
FAIL tests/b_test.yaml
- manifest should match snapshot
- asserts[0] `matchSnapshot` fail
Error:
template "debug/templates/b.yaml" not exists or not selected in test suite
Charts: 1 failed, 0 passed, 1 total
Test Suites: 1 failed, 0 passed, 1 total
Tests: 1 failed, 0 passed, 1 total
Snapshot: 0 passed, 0 total
Time: 9.265483ms
Set someParameter
in the test:
templates:
- b.yaml
tests:
- it: manifest should match snapshot
set:
someParameter: "fake"
asserts:
- matchSnapshot: {}
Run Helm Unit test again
helm unittest -3 .
The tests now pass:
### Chart [ debug ] .
PASS tests/b_test.yaml
Charts: 1 passed, 1 total
Test Suites: 1 passed, 1 total
Tests: 1 passed, 1 total
Snapshot: 1 passed, 1 total
Time: 2.310412ms
I have an example from https://github.com/astronomer/astronomer/blob/master/values.yaml#L6. Is it possible to override the global.baseDomain value from the grafana sub-chart/sub-folder (https://github.com/astronomer/astronomer/tree/master/charts/grafana), like this:
---
suite: Test ingress
templates:
- ingress.yaml
tests:
- it: should work
set:
global.baseDomain: test
asserts:
- isKind:
of: Ingress
Hello,
it looks like the following is not possible with the current state of the plugin:
apiVersion: v1
kind: ConfigMap
metadata:
name: {{ template "my-chart.fullname" . }}
data:
my.conf: |
{{- if .Values.expose }}
cacertfile = /etc/cert/cacert.pem
certfile = /etc/cert/tls.crt
keyfile = /etc/cert/tls.key
verify = verify_none
{{- end }}
abc = qqq
qqq = abc
and I'm trying to create a test for it, where the default is expose: false:
- it: should NOT configure ssl params if NOT set to be exposed
asserts:
- notMatchRegex:
path: data.my\.conf
value: cacertfile
which fails with something similar to:
FAIL abc tests tests/abc_test.yaml
- should NOT configure ssl params if NOT set to be exposed
- asserts[0] `notMatchRegex` fail
Template: my-chart/templates/abc-cm.yaml
DocumentIndex: 0
Path: data.my\.conf
Expected NOT to match:
Actual: abc = qqq
qqq = abc
matchRegex and notMatchRegex apply the regex only to a string, but in my case that string is multiline and not YAML, where this would potentially work if the full path were used. This looks like a pretty standard use case. Is it possible to introduce a new set of regex operations that would squash a multiline heredoc into a single-line blob, which could then be tested?
Thanks
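One thing worth checking (a guess, based on the empty "Expected NOT to match:" line in the output above): matchRegex and notMatchRegex take their regex under the pattern key, not value, so the assertion as written may never have read the regex at all. A corrected sketch:

```yaml
asserts:
  - notMatchRegex:
      path: data.my\.conf
      pattern: cacertfile
```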
Hi
I have a test in which I'm trying to verify that the template fails.
This is my _helpers.tpl
:
{{- define "my-chart.checkNameOverrideLength" -}}
{{- if .Values.nameOverride -}}
{{- if gt (len .Values.nameOverride) 54 -}}
{{- fail "nameOverride cannot be longer than 54 characters" -}}
{{- end -}}
{{- end -}}
{{- end -}}
{{- define "my-chart.name" -}}
{{- include "web-api.checkNameOverrideLength" . -}}
{{- .Values.nameOverride -}}
{{- end -}}
This is my test:
suite: test
templates:
- deployment.yaml
tests:
- it: should fail since the nameOverride is too long
values:
- ../../values.yaml
set:
nameOverride: too-long-of-a-name-override-that-should-fail-the-template-immediately
asserts:
- failedTemplate:
errorMessage: nameOverride cannot be longer than 54 characters
As you can see, all I try to check is that I get the error from the template, but I keep getting this error:
FAIL test helm-unittest/common/values_test.yaml
- should fail since the nameOverride is too long
- asserts[0] `failedTemplate` fail
Error:
template "my-chart/templates/deployment.yaml" not exists or not selected in test suite
Charts: 1 failed, 0 passed, 1 total
Test Suites: 1 failed, 0 passed, 1 total
Tests: 1 failed, 0 passed, 1 total
Snapshot: 0 passed, 0 total
Time: 18.780023ms
Error: plugin "unittest" exited with error
Any reason why it could happen and how can I resolve it?
Thanks
The chart that I want to test uses apiVersion: v2
in Chart.yaml
and the error I get is ### Error: apiVersion 'v2' is not valid. The value must be "v1"
plugin version: 0.2.4
Hi, thank you for the useful plugin.
It seems that the contains assertion type doesn't support the metadata.annotations map.
Is this due to any limitations?
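A possible workaround, assuming the annotation keys are known in advance: assert individual map entries with equal instead of contains (the annotation key below is only an example):

```yaml
asserts:
  - equal:
      path: metadata.annotations.prometheus\.io/scrape
      value: "true"
```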
When we use dependencies with conditions, as in the following example, the dependencies which don't meet the conditions are deleted from the Chart object and are not reloaded in the next test case's run.
dependencies:
- name: mysql
repository: https://kubernetes-charts.storage.googleapis.com
version: 1.2.0
condition: mysql.enabled
- name: redis
repository: https://kubernetes-charts.storage.googleapis.com
version: 10.5.7
condition: redis.enabled
Fix: the chart's dependencies should be saved before each test run and restored afterwards, to make sure the next test run has the correct dependencies loaded.
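The proposed fix can be sketched as a save-and-restore wrapper around each run. The snippet below is a minimal stand-in: the Chart type and renderOnce are illustrative only, not the plugin's real types (in the plugin, the list lives on v3chart.Chart and is pruned by condition evaluation).

```go
package main

import "fmt"

// Chart is a stand-in for helm's v3chart.Chart; only the dependency list
// relevant to this issue is modeled here (names instead of real subcharts).
type Chart struct {
	Dependencies []string
}

// renderOnce simulates a test run: condition evaluation prunes every
// dependency whose condition is false (here, everything except "mysql").
func renderOnce(c *Chart) {
	var kept []string
	for _, d := range c.Dependencies {
		if d == "mysql" {
			kept = append(kept, d)
		}
	}
	c.Dependencies = kept
}

// withRestoredDependencies snapshots the dependency list before the run
// and restores it afterwards, so the next test run sees the full set.
func withRestoredDependencies(c *Chart, run func(*Chart)) {
	saved := append([]string(nil), c.Dependencies...)
	run(c)
	c.Dependencies = saved
}

func main() {
	c := &Chart{Dependencies: []string{"mysql", "redis"}}
	withRestoredDependencies(c, renderOnce)
	// Both dependencies survive for the next test case.
	fmt.Println(len(c.Dependencies)) // → 2
}
```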
Thank you for forking and improving this one. Can you publish this to Docker Hub, please? :)
I'm trying to use helm-unittest 0.2.6 on Windows to test my charts, but I noticed something weird:
My Chart.yaml has:
dependencies:
- name: library-chart
import-values:
- child: microservice
parent: microservice
- child: global
parent: global
The difference I see in the VS Code debugger is at:
https://github.com/helm/helm/blob/b1e24764325135b364a7f81fb0aab683d190c546/pkg/chartutil/dependencies.go#L234
For the first 2 test cases, r.ImportValues is of type map[string]interface{}; for the 3rd test case, r.ImportValues is of type map[string]string, so it doesn't match any of the switch cases, and values are not imported!
I'm a newbie at Golang and not sure what the root cause may be...
But it's strange that chart object instances would be constructed differently, given it's the same Chart.yaml and test case config.
Can we get a new version released?
I need that in order to work on the helmfile integration.
Hi Quintus,
Thank you for sharing a really useful library for Helm Unit tests.
Just wondering if it is possible to:
- get a coverage report for the templates
- test the values YAML against a schema
If not in plan/roadmap to implement, please close the issue as this is not a pressing issue.
Again, thanks for the library
Just started with this plugin, and the most immediate thing I've noticed is that the path selections can be a little restrictive, mainly around testing outputs of loops, where you essentially need to test the element before and after to make sure nothing has actually changed. I'm thinking JMESPath or jq would expand the usability a fair bit, for example:
- equal:
path: spec.template.spec.containers[0].env[3].name
value: APPCONFIG__RABBITSETTINGS__HOSTS__0
vs
- equal:
path: spec.template.spec.containers[?name == `default`].env[?name == APPCONFIG__RABBITSETTINGS__HOSTS__0].value
value: value
vs
- equal:
path: spec.template.spec.containers | select(.name == "default").env | select(.name == "APPCONFIG__RABBITSETTINGS__HOSTS__0").value
value: value
Very loose examples, but I think being able to select by container etc. would make some of the testing a lot less verbose at times, although certainly harder to read for those unversed in such things. (I've not tested that these are legitimate expressions.)
I want to write sort of the equivalent of an integration test I guess.
I have a couple different "types" of apps I deploy and they all use mostly the same parameters. I want to write a test that describes a complete profile of parameters to deploy a specific classification of app. This could be "public web app with mysql DB", "internal web app with redis DB", etc.
If I could do this then these tests would also be able to serve as documentation- a complete example with ALL the specific params required to deploy app of type X.
Is this possible? Maybe it already is, but I'm not sure how the test "suites" are organized.
Example might look like this:
suite: test deployment
set:
<all the required values to deploy this type of app>
templates:
- deployment.yaml
- ingress.yaml
- <bunch of other resources>
tests:
- it: A public facing web app the uses a mysql DB
asserts:
# test Deployment resource
- isKind:
of: Deployment
- matchRegex:
path: metadata.name
pattern: -my-chart$
# also test Ingress resource in same test
- isKind:
of: Ingress
- matchRegex:
path: metadata.name
pattern: blahblahblah
Hi,
is it possible to have something like a fail-fast option? If something goes completely wrong, it would be nice to have an option that stops right after the first error. That would make it easier to find the cause, e.g. a template not found, or something else.
The contains assertion expects metadata.annotations to be an array, when it is actually a map.
suite: Test nginx-service
templates:
- nginx-service.yaml
tests:
- it: works with ingressAnnotations
set:
ingressAnnotations:
foo1: foo
foo2: foo
foo3: foo
asserts:
- contains:
path: metadata.annotations
content:
foo1: foo
foo2: foo
foo3: foo
FAIL Test nginx-service charts/nginx/tests/nginx-service_test.yaml
- works with ingressAnnotations
- asserts[0] `contains` fail
Template: astronomer/charts/nginx/templates/nginx-service.yaml
DocumentIndex: 0
Error:
expect 'metadata.annotations' to be an array, got:
foo1: foo
foo2: foo
foo3: foo
prometheus.io/port: "10254"
prometheus.io/scrape: "true"
service.beta.kubernetes.io/aws-load-balancer-type: nlb
Using the contains example as a reference...
contains:
path: spec.ports
content:
name: web
port: 80
targetPort: 80
protocol: TCP
…I expected my test would work. Looking through the code, I don't see a reference to 'annotations' anywhere, so I suspect the behavior is imported from an upstream library; however, the bug manifests itself here, so I filed it here.
I also tried restructuring the test to compare against metadata
and assert content of annotations:
but that didn't work either. However, I was able to get it to work with a different comparison:
asserts:
- equal:
path: metadata.annotations.foo1
value: foo
It's possible I've got an invalid test. Looking around on GitHub, I don't see any other tests that use contains with metadata. It's also possible this is a documentation bug.
How do I search for a value in a yaml list?
For example say I have this:
containers:
- name: mycontainer
image: sdfsdfsd
command:
- "-argone sdfsdf"
- "-argtwo sdfsdf"
- "-argthree sdfsdf"
I just want to see if argthree exists (without having to know the specific index number):
- matchRegex:
path: spec.template.spec.containers[0].command
pattern: .*argthree.*
That returns an error: expect 'spec.template.spec.containers[0].command' to be a string
If I need to know the specific index it makes it very rigid. Thanks!
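One possible workaround, assuming the full list element is known: contains operates on arrays, so the element can be matched without knowing its index:

```yaml
asserts:
  - contains:
      path: spec.template.spec.containers[0].command
      content: "-argthree sdfsdf"
```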
Hello,
I wanted to perform tests on one of my charts that has bitnami/redis as a dependency, but I can't test some of the documents because the generation fails (only when running unittest).
My investigation shows that the issue likely occurs with a lot of other charts.
It fails because the chart uses an include statement (to add a checksum in an annotation, which is something many people do to ensure a pod is recreated if a ConfigMap changes).
To reproduce I patched:
diff --git a/test/data/v3/basic/templates/deployment.yaml b/test/data/v3/basic/templates/deployment.yaml
index 2820188..e72f8f2 100644
--- a/test/data/v3/basic/templates/deployment.yaml
+++ b/test/data/v3/basic/templates/deployment.yaml
@@ -15,6 +15,8 @@ spec:
labels:
app: {{ template "basic.name" . }}
release: {{ .Release.Name }}
+ annotations:
+ checksum/health: {{ include (print $.Template.BasePath "/configmap.yaml") . | sha256sum }}
spec:
containers:
- name: {{ .Chart.Name }}
helm template still works fine, but not unittest:
### Chart [ basic ] .
PASS Configmap mulit line Test tests/configmap_test.yaml
PASS Custom Resource Definition Test tests/crd_test.yaml
FAIL test deployment tests/deployment_test.yaml
- should pass all kinds of assertion
- asserts[0] `equal` fail
Template: basic/templates/deployment.yaml
DocumentIndex: 0
Error:
can't get ["template"] from a non map type:
null
[… removing a lot of errors]
2021/03/17 15:50:38 warning: destination for annotations is a table. Ignoring non-table value <nil>
2021/03/17 15:50:38 warning: destination for annotations is a table. Ignoring non-table value <nil>
2021/03/17 15:50:38 warning: destination for annotations is a table. Ignoring non-table value <nil>
2021/03/17 15:50:38 warning: destination for annotations is a table. Ignoring non-table value <nil>
PASS test ingress tests/ingress_test.yaml
PASS test notes tests/notes_test.yaml
PASS test service tests/service_test.yaml
Snapshot Summary: 1 snapshot failed in 1 test suite. Check changes and use `-u` to update snapshot.
Charts: 1 failed, 0 passed, 1 total
Test Suites: 1 failed, 5 passed, 6 total
Tests: 1 failed, 13 passed, 14 total
Snapshot: 1 failed, 3 passed, 4 total
Time: 8.533923ms
Error: plugin "unittest" exited with error
Helm allows for Capability detection, which is useful when using CRDs.
It would be nice if we could set the capabilities in a test. For this particular use case of CRDs, being able to control what .Capabilities.APIVersions.Has returns would be sufficient.
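A suite-level field for this could look like the sketch below. This is a hypothetical syntax at the time of this issue; the capabilities field and its sub-keys are illustrative, not an existing feature of this version of the plugin:

```yaml
suite: test crd consumers
capabilities:
  apiVersions:
    - monitoring.coreos.com/v1
templates:
  - servicemonitor.yaml
tests:
  - it: renders a ServiceMonitor when the API is available
    asserts:
      - isKind:
          of: ServiceMonitor
```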
When I try to install the plugin on CentOS 7, I get the following error:
Validating Checksum.
sha256sum: invalid option -- 's'
Try 'sha256sum --help' for more information.
Failed to install helm-unittest
It's caused by this part:
curl -s -L "$PROJECT_CHECKSUM" | grep "$DOWNLOAD_FILE" | sha256sum -c -s
While -s is a valid option for shasum:
shasum --help
Usage: shasum [OPTION]... [FILE]...
Print or check SHA checksums.
...
-s, --status don't output anything, status code shows success
It's not valid for sha256sum:
sha256sum --help
Usage: sha256sum [OPTION]... [FILE]...
Print or check SHA256 (256-bit) checksums.
...
The following four options are useful only when verifying checksums:
--status don't output anything, status code shows success
Workaround: install shasum, which will be preferred over sha256sum:
yum install -y perl-Digest-SHA
If count is defined in a contains assertion and the path exists, the test will pass: contains will not check the value (and count can even be a negative or a floating-point number).
diff --git a/test/data/v3/basic/tests/deployment_test.yaml b/test/data/v3/basic/tests/deployment_test.yaml
index 101b770..a67bfce 100644
--- a/test/data/v3/basic/tests/deployment_test.yaml
+++ b/test/data/v3/basic/tests/deployment_test.yaml
@@ -50,3 +50,7 @@ tests:
count: 2
- matchSnapshot:
path: spec
+ - contains:
+ path: spec.template.spec.containers
+ value: 'foo'
+ count: 1234567890 # or 6.8 or -10 or 0
Current result:
### Chart [ basic ] test/data/v3/basic
PASS test service test/data/v3/basic/tests/service_test.yaml
PASS Configmap mulit line Test test/data/v3/basic/tests/configmap_test.yaml
PASS Custom Resource Definition Test test/data/v3/basic/tests/crd_test.yaml
PASS test deployment test/data/v3/basic/tests/deployment_test.yaml
2021/03/18 20:40:38 warning: destination for annotations is a table. Ignoring non-table value <nil>
2021/03/18 20:40:38 warning: destination for annotations is a table. Ignoring non-table value <nil>
2021/03/18 20:40:38 warning: destination for annotations is a table. Ignoring non-table value <nil>
2021/03/18 20:40:38 warning: destination for annotations is a table. Ignoring non-table value <nil>
PASS test ingress test/data/v3/basic/tests/ingress_test.yaml
PASS test notes test/data/v3/basic/tests/notes_test.yaml
Charts: 1 passed, 1 total
Test Suites: 6 passed, 6 total
Tests: 14 passed, 14 total
Snapshot: 4 passed, 4 total
Time: 25.149907ms
Hey @quintush, I've been trying to install your version (until the question of who maintains this is resolved), but the Helm install process is currently failing:
shasum: standard input: no properly formatted SHA1 checksum lines found
Sorry to peg this onto this issue, but your forked repo doesn't support issues :(
โฏ helm plugin install https://github.com/quintush/helm-unittest
+ PROJECT_NAME=helm-unittest
+ PROJECT_GH=quintush/helm-unittest
+ PROJECT_CHECKSUM_FILE=helm-unittest-checksum.sha
+ : /Users/davidy/Library/helm/plugins/helm-unittest
+ type cygpath
+ [[ '' == \1 ]]
+ trap fail_trap EXIT
+ set -e
+ initArch
++ uname -m
+ ARCH=x86_64
+ case $ARCH in
+ ARCH=amd64
+ initOS
++ tr '[:upper:]' '[:lower:]'
+++ uname
++ echo Darwin
+ OS=darwin
+ case "$OS" in
+ OS=macos
+ verifySupported
+ local 'supported=linux-arm64\nlinux-amd64\nmacos-amd64\nwindows-amd64'
+ echo 'linux-arm64\nlinux-amd64\nmacos-amd64\nwindows-amd64'
+ grep -q macos-amd64
+ type curl
+ echo 'Support macos-amd64'
Support macos-amd64
+ getDownloadURL
+ local latest_url=https://api.github.com/repos/quintush/helm-unittest/releases/latest
+ [[ -z '' ]]
++ git describe --tags --exact-match
+ local version=
+ '[' -n '' ']'
+ echo 'Retrieving https://api.github.com/repos/quintush/helm-unittest/releases/latest'
Retrieving https://api.github.com/repos/quintush/helm-unittest/releases/latest
+ type curl
++ curl -s https://api.github.com/repos/quintush/helm-unittest/releases/latest
++ grep 'macos\(-amd64\)\?'
++ awk '/\"browser_download_url\":/{gsub( /[,\"]/,"", $2); print $2}'
+ DOWNLOAD_URL=https://github.com/quintush/helm-unittest/releases/download/v0.2.0/helm-unittest-macos-amd64-0.2.0.tgz
++ curl -s https://api.github.com/repos/quintush/helm-unittest/releases/latest
++ grep checksum
++ awk '/\"browser_download_url\":/{gsub( /[,\"]/,"", $2); print $2}'
+ PROJECT_CHECKSUM=https://github.com/quintush/helm-unittest/releases/download/v0.2.0/helm-unittest-checksum.sha
+ downloadFile
+ PLUGIN_TMP_FOLDER=/tmp/_dist/
+ mkdir -p /tmp/_dist/
+ echo 'Downloading https://github.com/quintush/helm-unittest/releases/download/v0.2.0/helm-unittest-macos-amd64-0.2.0.tgz to location /tmp/_dist/'
Downloading https://github.com/quintush/helm-unittest/releases/download/v0.2.0/helm-unittest-macos-amd64-0.2.0.tgz to location /tmp/_dist/
+ type curl
+ cd /tmp/_dist/
+ curl -LO https://github.com/quintush/helm-unittest/releases/download/v0.2.0/helm-unittest-macos-amd64-0.2.0.tgz
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 648 100 648 0 0 1325 0 --:--:-- --:--:-- --:--:-- 1325
100 16.7M 100 16.7M 0 0 3923k 0 0:00:04 0:00:04 --:--:-- 4795k
+ installFile
+ cd /tmp
++ find ./_dist -name '*.tgz'
+ DOWNLOAD_FILE=./_dist/helm-unittest-macos-amd64-0.2.0.tgz
+ '[' -n https://github.com/quintush/helm-unittest/releases/download/v0.2.0/helm-unittest-checksum.sha ']'
+ echo Validating Checksum.
Validating Checksum.
+ type curl
+ curl -s -L https://github.com/quintush/helm-unittest/releases/download/v0.2.0/helm-unittest-checksum.sha
+ grep ./_dist/helm-unittest-macos-amd64-0.2.0.tgz
+ shasum -a 256 -c -s
shasum: standard input: no properly formatted SHA1 checksum lines found
+ fail_trap
+ result=1
+ '[' 1 '!=' 0 ']'
+ echo 'Failed to install helm-unittest'
Failed to install helm-unittest
+ echo 'For support, go to https://github.com/kubernetes/helm'
For support, go to https://github.com/kubernetes/helm
+ exit 1
Error: plugin install hook for "unittest" exited with error
Originally posted by @funkypenguin in https://github.com/lrills/helm-unittest/issues/102#issuecomment-619643055
Took me a while to track this down. The install command and output look like this:
$โฏ helm plugin install https://github.com/quintush/helm-unittest --version 0.2.5
Support macos-amd64
Retrieving https://api.github.com/repos/quintush/helm-unittest/releases/tags/v0.2.5
Downloading https://github.com/quintush/helm-unittest/releases/download/v0.2.5/helm-unittest-macos-amd64-0.2.5.tgz to location /tmp/_dist/
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 661 100 661 0 0 4105 0 --:--:-- --:--:-- --:--:-- 4105
100 17.2M 100 17.2M 0 0 12.4M 0 0:00:01 0:00:01 --:--:-- 18.5M
Failed to install helm-unittest
For support, go to https://github.com/kubernetes/helm
Error: plugin install hook for "unittest" exited with error
Stepping through install-binary.sh, I noted that the cat command will ultimately fail on macOS:
$โฏ cat /etc/os-release
cat: /etc/os-release: No such file or directory
This causes the whole script to fail.
helm-unittest/install-binary.sh
Line 115 in 778bc25
I am starting to use helm-unittest and I am planning to use a custom values file (generated through helmfile).
Currently, it is required to specify the values on a per-test basis.
Would it be possible, in addition, to have a top-level values entry, as follows:
values:
- ./values/default.yaml
tests:
- it: should pass
values:
- ./values/staging.yaml
asserts:
- equal:
path: metadata
The different values fields could be merged, and if there are any conflicts, the value definition closer to the test case takes priority.
Does that sound reasonable? I could give the implementation a try if there are no objections.
We recently updated our helm charts to apiVersion v2 and are therefore using the helm3 flag (i.e. helm unittest -3 .) to execute our test suites.
Unfortunately, when running multiple test jobs within a test suite, values provided via set
in one test job are still present in every consecutive test job. This causes assertions of the consecutive jobs to fail, due to the set assumptions no longer being valid.
I tried debugging and noticed that the call to v3engine.Render(filteredChart, vals) actually manipulates the filteredChart.Values field on the v3chart.Chart struct. This is really counterintuitive, because one might assume that a chart's values are immutable.
helm-unittest/pkg/unittest/test_job.go
Lines 447 to 450 in c283e70
Since filteredChart.Values seems to point to the same location as targetChart.Values, the next test job run in the suite ends up with values altered by previous test jobs.
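A fix could deep-copy the values before handing them to the render engine, so each test job starts from the chart's pristine values. A minimal sketch of the idea (not the plugin's actual code; deepCopyValues is an illustrative helper):

```go
package main

import "fmt"

// deepCopyValues returns an independent copy of a values map so that
// mutations made during rendering do not leak into the next test job.
func deepCopyValues(in map[string]interface{}) map[string]interface{} {
	out := make(map[string]interface{}, len(in))
	for k, v := range in {
		if nested, ok := v.(map[string]interface{}); ok {
			out[k] = deepCopyValues(nested)
			continue
		}
		out[k] = v
	}
	return out
}

func main() {
	chartValues := map[string]interface{}{
		"image": map[string]interface{}{"tag": "1.0"},
	}
	// Simulate a test job whose `set:` values mutate the rendered copy.
	rendered := deepCopyValues(chartValues)
	rendered["image"].(map[string]interface{})["tag"] = "override"

	// The chart's own values are untouched for the next job.
	fmt.Println(chartValues["image"].(map[string]interface{})["tag"]) // → 1.0
}
```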
Background: I want to verify default labels on Kubernetes resources. They also contain "helm.sh/chart": "{{ .Chart.Name }}-{{ .Chart.Version }}", which contains the name of the chart and its version.
I could write a test like the one below which checks for the current chart version:
suite: test labels
release:
name: my-release
namespace: my-namespace
tests:
- it: deployment labels
set:
Release.
template: jenkins-master-deployment.yaml
asserts:
- equal:
path: metadata.labels
value:
app.kubernetes.io/component: jenkins-master
app.kubernetes.io/instance: my-release
app.kubernetes.io/managed-by: Tiller
app.kubernetes.io/name: jenkins
helm.sh/chart: jenkins-2.8.0
That would not be very practical as the tests would need to be adjusted for every version of the chart.
I could of course check each label individually, but that would be quite some repetition.
Is it somehow possible to override the chart version in a unit test to provide a fixed value or is there a better approach?
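One workaround that avoids hardcoding the version: match only the stable prefix of the label with matchRegex instead of asserting the whole labels map with equal. A sketch, assuming the chart is named jenkins:

```yaml
asserts:
  - matchRegex:
      path: metadata.labels.helm\.sh/chart
      pattern: ^jenkins-
```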
The test data contains an array with only one element. The test case overrides it via set to add more elements. Repeatedly running the same test without changes, the test case occasionally fails. If the test data contains multiple elements, i.e. without overriding via the set command in the test case, the problem does not exist.
ingress:
enable: false
host: test.com
path: /test
port: web
extensions:
ingress:
- paths:
- /api/building-calculator/health
port: web
matchDirective: "PathPrefixStrip"
rewrite: /actuator/health
suite: Test Ingress extensions 2 or more Paths without rewrite
templates:
- deploymentSpec/extensions/ingress.yaml
values:
- ./values/ingress_ex_default.yaml
tests:
- it: ingress documents (one for ingress, one for middleware)
asserts:
- hasDocuments:
count: 4
- it: ingress kind and path
set:
deploymentSpec.app.extensions.ingress[0].matchDirective: PathPrefixStrip
deploymentSpec.app.extensions.ingress[0].paths:
- /welcome/api/wiremock
- /loyalty/api/wiremock
documentIndex: 0
asserts:
- equal:
path: metadata.annotations.kubernetes\.io/ingress\.class
value: traefik
- isKind:
of: Ingress
- equal:
path: spec.rules[0].http.paths[0].path
value: /welcome/api/wiremock
- isNull:
path: spec.rules[0].host
- equal:
path: spec.rules[0].http.paths[0].backend.servicePort
value: 80
- equal:
path: spec.rules[0].http.paths[0].backend.serviceName
value: base-helm-chart-app-svc
PASS Test Ingress Chain tests/app-ingress_chain_test.yaml
PASS Test Ingress default matchDirective value to PathPrefixStrip tests/app-ingress_default_test.yaml
PASS Test Ingress PathPrefix tests/app-ingress_pathprefix_test.yaml
PASS Test Ingress PathPrefixStrip tests/app-ingress_pathprefixstrip_test.yaml
PASS Test Ingress ReplacePathRegex tests/app-ingress_replacepathregex_test.yaml
PASS Test Ingress extensions 2 or more Paths without rewrite tests/ingress_ex_default2_test.yaml
PASS Test Ingress extensions 1 path with rewrite which force matchDirective to PathPrefix tests/ingress_ex_default_test.yaml
Charts: 1 passed, 1 total
Test Suites: 7 passed, 7 total
Tests: 29 passed, 29 total
Snapshot: 0 passed, 0 total
Time: 51.38539ms
PASS Test Ingress Chain tests/app-ingress_chain_test.yaml
PASS Test Ingress default matchDirective value to PathPrefixStrip tests/app-ingress_default_test.yaml
PASS Test Ingress PathPrefix tests/app-ingress_pathprefix_test.yaml
PASS Test Ingress PathPrefixStrip tests/app-ingress_pathprefixstrip_test.yaml
PASS Test Ingress ReplacePathRegex tests/app-ingress_replacepathregex_test.yaml
FAIL Test Ingress extensions 2 or more Paths without rewrite tests/ingress_ex_default2_test.yaml
- ingress extension middleware
- asserts[3] `equal` fail
Template: base-helm-chart/templates/deploymentSpec/extensions/ingress.yaml
DocumentIndex: 0
Path: spec.stripPrefix.prefixes[0]
Expected to equal:
/welcome/api/wiremock
Actual:
null
Diff:
--- Expected
+++ Actual
@@ -1,2 +1,2 @@
-/welcome/api/wiremock
+null
- asserts[4] `equal` fail
Template: base-helm-chart/templates/deploymentSpec/extensions/ingress.yaml
DocumentIndex: 0
Error:
[1] :
- null
PASS Test Ingress extensions 1 path with rewrite which force matchDirective to PathPrefix tests/ingress_ex_default_test.yaml
Charts: 1 failed, 0 passed, 1 total
Test Suites: 1 failed, 6 passed, 7 total
Tests: 1 failed, 28 passed, 29 total
Snapshot: 0 passed, 0 total
Time: 52.417587ms
Curious if it would be difficult to add conditional asserts...?
I was investigating using the unittest plugin to come up with a set of tests that validate some helm chart "rules" to be imposed on multiple charts. Thinking along the lines of a compliance audit test suite that ensures all the charts we develop abide by certain rules or best-practices.
To accomplish this, the asserts would need to be conditional, maybe using another assert as the condition.
Just as an example, if I wanted to check that all container images referenced used an internal registry (foo.com), I could create the following asserts:
tests:
- it: tests statefulsets use local foo.com registry
asserts:
- isNotNull:
path: spec.template.spec.containers[0].image
optional: true # indicating it fails silently if this fails and/or the path does not exist
- matchRegex:
path: spec.template.spec.containers[0].image
pattern: foo.com/*
conditional: 0 # the idx of the assert to condition this assert on.
# If assert[0] is successful, then this assert applies
That is a contrived example, and it could be implemented in a handful of different ways: instead of referencing the other assert by index, an assert could set some "variables", similar to Ansible's register.
I wish I were more familiar with Go, as I could then try to contribute something like this...
Thanks for picking up the unittest plugin; it has proven to be a vital part of our test suite, so it's great to see someone furthering its development.
I can see that you have fixed an issue in #38 which currently affects our CI pipeline:
Retrieving https://api.github.com/repos/quintush/helm-unittest/releases/tags/v0.2.1
Downloading https://github.com/quintush/helm-unittest/releases/download/v0.2.1/helm-unittest-linux-amd64-0.2.1.tgz to location /tmp/_dist/
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 648 100 648 0 0 5890 0 --:--:-- --:--:-- --:--:-- 5890
100 17.1M 100 17.1M 0 0 36.9M 0 --:--:-- --:--:-- --:--:-- 36.9M
Validating Checksum.
sha256sum: unrecognized option: status
Could you create a patch release containing this fix?
Getting this error on most of my tests that work fine in Helm 2.9.1:
assertion.template must be given if testsuite.templates is empty
Helm 3.1.2 and Helm 3.2.3 have the same issue.
So i pulled your repo and ran the same templates and tests in your repo and they pass
Any thoughts would be appreciated
The getDownloadURL() function does not properly parse the URL because of a missing space before the stderr redirect on this line:
helm-unittest/install-binary.sh
Line 88 in c8d16ec
/tmp # f=$(wget -q -O - https://api.github.com/repos/quintush/helm-unittest/releases/latest | grep "linux-amd64" | awk '/\"browser_download_url\":/{gsub(/[,\"]/,"", $2); print $2}'2>/dev/null)
/tmp # echo $f
Adding the space allows the url to be parsed properly:
/tmp # f=$(wget -q -O - https://api.github.com/repos/quintush/helm-unittest/releases/latest | grep "linux-amd64" | awk '/\"browser_download_url\":/{gsub(/[,\"]/,"", $2); print $2}' 2>/dev/null)
/tmp # echo $f
https://github.com/quintush/helm-unittest/releases/download/v0.2.4/helm-unittest-linux-amd64-0.2.4.tgz
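Why the single space matters: without it, the shell glues the 2 onto the end of awk's quoted program text, and >/dev/null then redirects stdout instead of stderr, so the command substitution captures nothing. A minimal reproduction, independent of wget:

```shell
# Broken: the '2' becomes part of the awk program ('{print $1}2'),
# and '>/dev/null' redirects stdout, so nothing is captured.
broken=$(echo hello | awk '{print $1}'2>/dev/null)

# Fixed: with the space, '2>/dev/null' redirects stderr and stdout survives.
fixed=$(echo hello | awk '{print $1}' 2>/dev/null)

echo "broken='$broken' fixed='$fixed'"
```

Running this prints broken='' fixed='hello', matching the empty $f seen above.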
As a workaround, install curl: it is preferred over wget, and its invocation contains a space before the stderr redirect.
I created a chart with Helm v3 and the Chart.yaml contains
apiVersion: v2
However, when I try to run the unit tests I get this error:
$ helm unittest <my chart>
### Error: apiVersion 'v2' is not valid. The value must be "v1"
Charts: 1 failed, 1 errored, 0 passed, 1 total
Test Suites: 0 passed, 0 total
Tests: 0 passed, 0 total
Snapshot: 0 passed, 0 total
Time: 1.804892ms
Error: plugin "unittest" exited with error
For reference, here is my Helm version:
$ helm version
version.BuildInfo{Version:"v3.4.2", GitCommit:"23dd3af5e19a02d4f4baa5b2f242645a1a3af629", GitTreeState:"dirty", GoVersion:"go1.15.5"}
I thought this fork was compatible with Helm v3?
We have some ConfigMaps with complicated data items for configuration files and such. Since these are strings, we end up doing a lot of contains and matchRegex assertions to ensure things are within the file. But since the strings are usually structured, we'd love to be able to do something like:
templates:
- configmap.yaml
[...]
asserts:
- equal:
path: data.big_config_file\.yaml
location: storage.db.username[2]
pattern: "john_smith"
i.e. using JSONPath to locate items within strings.
Is this a feature on the backlog? Or is there a workaround? Thanks!
Hi @quintush, first off, thanks for the continued work on helm-unittest, it's fantastic!
I've been adding a values.schema.json to validate the inputs to the chart that I'm also testing.
I've got a simple validation on my .Values.deployment.replicas: it's a required field and must be of type integer, e.g.:
"deployment": {
"type": "object",
"required": [
"replicas"
],
"properties": {
"replicas": {
"type": "integer"
},
which gives the right validation on use of helm cli;
$ helm lint example --set deployment.replicas="'2'"
==> Linting example
[INFO] Chart.yaml: icon is recommended
[ERROR] values.yaml: - deployment.replicas: Invalid type. Expected: integer, given: string
[ERROR] templates/: values don't meet the specifications of the schema(s) in the following chart(s):
example:
- deployment.replicas: Invalid type. Expected: integer, given: string
Error: 1 chart(s) linted, 1 chart(s) failed
$ helm lint example --set deployment.replicas=2
==> Linting example
[INFO] Chart.yaml: icon is recommended
1 chart(s) linted, 0 chart(s) failed
I've got a helm-unittest case which tries to pass in replicas as a string (and also null). I attempted a few combinations of the multiline string for the errorMessage, but I'm not sure it's being captured, as this is thrown by Helm validation rather than template rendering, I guess?
- it: should throw linting error if the number of initial replicas is passed as string
set:
deployment:
replicas: "6"
asserts:
- failedTemplate:
errorMessage: |
Error: values don't meet the specifications of the schema(s) in the following chart(s):
example:
- deployment.replicas: Invalid type. Expected: integer, given: string
The result looks like this:
$ helm unittest -f tests/deployment_test.yaml example --helm3
### Chart [ example ] example
FAIL test deployment example/tests/deployment_test.yaml
- should throw linting error if the number of initial replicas is passed as string
Error: values don't meet the specifications of the schema(s) in the following chart(s):
example:
- deployment.replicas: Invalid type. Expected: integer, given: string
Any ideas if this is possible to test?
Hi,
Is it possible to write unit tests for a library chart?
I know that a user can't run helm template on a library chart, so I'm not sure whether it can be unit tested.
The motivation is that libraries play a crucial role, as they contain the common _helpers.tpl, and we would therefore like to focus on unit testing them to assure quality and compatibility with the charts that depend on them.
Any advice?
Example of tpl from library templates:
{{- define "common.service.tpl" -}}
apiVersion: v1
kind: Service
metadata:
name: {{ template "common.name" .Root }}
labels:
{{- include "common.labels" .Root | nindent 4 }}
spec:
selector:
{{- include "common.selectorLabels" .Root | nindent 4 }}
type: {{ .Values.type }}
ports:
{{- range .Values.ports }}
- port: {{ .port }}
protocol: {{ .protocol | default "TCP" }}
{{- if .targetPort }}
targetPort: {{ .targetPort }}
{{- end }}
{{- if .name }}
name: {{ .name | toString | quote }}
{{- end }}
{{- end }}
{{- end -}}
and test it with:
suite: test service
templates:
- _service.tpl
tests:
- it: should work
asserts:
- isKind:
of: Service
... more asserts ...
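One possible approach, sketched under assumptions (the harness chart name, the values layout, and the dict keys passed to the helper are all made up here, not a documented plugin feature): create a small harness chart that declares the library as a dependency and adds a thin wrapper template, then point the unit tests at the wrapper:

```yaml
# File: harness-chart/templates/service.yaml -- thin wrapper around the library helper
{{- include "common.service.tpl" (dict "Root" . "Values" .Values.service) -}}

# File: harness-chart/tests/service_test.yaml
suite: test common.service.tpl via harness chart
templates:
  - service.yaml
tests:
  - it: renders a Service from the library helper
    set:
      service:
        type: ClusterIP
        ports:
          - port: 80
    asserts:
      - isKind:
          of: Service
      - equal:
          path: spec.ports[0].port
          value: 80
```

This keeps the library chart itself untouched while giving the helper a concrete rendering context to assert against.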
The plugin does not seem to match the rendered template's output correctly and fails.
We have the following in our deployment.yaml file:
[...]
{{- if $c.livenessProbe }}
livenessProbe:
{{- toYaml $c.livenessProbe | nindent 12 }}
{{- end }}
{{- with $c.readinessProbe }}
readinessProbe:
{{- toYaml . | nindent 12 }}
{{- end }}
[...]
As you can see, we deliberately try different approaches with if and with to cover different scenarios.
When running helm template ... we get the following output, which is correct:
[...]
livenessProbe:
httpGet:
path: /
port: http
readinessProbe:
httpGet:
path: /
port: http
[...]
Now, we have also set a test like this:
- equal:
path: spec.template.spec.containers[0].livenessProbe
value:
httpGet:
path: /
port: http
- equal:
path: spec.template.spec.containers[0].readinessProbe
value:
httpGet:
path: /
port: http
The tests fail with the following output, complaining that the values above are null:
- asserts[10] `equal` fail
Template: templates/deployment.yaml
DocumentIndex: 0
Path: spec.template.spec.containers[0].livenessProbe
Expected to equal:
httpGet:
path: /
port: http
Actual:
null
- asserts[11] `equal` fail
Template: templates/deployment.yaml
DocumentIndex: 0
Path: spec.template.spec.containers[0].readinessProbe
Expected to equal:
httpGet:
path: /
port: http
Actual:
null
We expect the test suite to succeed. Are we missing something, or do you have any clues? We migrated from lrills/helm-unittest to your project to utilize Helm 3, so we thought to ask whether something changed in behavior.
PS: Thanks for your nice work!
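For the probe issue above, one common cause (a hedged guess; the $c variable suggests the containers come from values, and the value names below are assumed): if the suite does not set the probe values, $c.livenessProbe is empty at render time, both branches are skipped, and the asserted paths resolve to null. Supplying the values in the test makes the branches render:

```yaml
- it: renders the probes when they are provided
  set:
    containers:              # assumed values layout feeding $c in the template
      - name: app
        livenessProbe:
          httpGet:
            path: /
            port: http
        readinessProbe:
          httpGet:
            path: /
            port: http
  asserts:
    - equal:
        path: spec.template.spec.containers[0].livenessProbe
        value:
          httpGet:
            path: /
            port: http
```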
I tried to install this plugin on an M1 MacBook and it failed because there is no prebuilt macos-arm64 binary available.
When a chart contains errors, the unittest plugin gives no clear error.
Please make sure there is a clear distinction between an error in the chart and a test failure (e.g. the required/fail functions).
It'd be nice if a comparison between the expected and the actual value were included in test failure messages for all assertions. So far I'm missing this for equal and equalRaw, and I noticed that it's already provided for documentCount; I didn't make a full listing at the time of writing this issue.
There also needs to be a concept/decision about overlong actual and expected string values. I guess the first step could be truncation from the beginning; a follow-up could be an excerpt starting at the first mismatch plus a bit of offset, to make it easier to find.
Hi,
we are using your plugin in some Docker images based on the google/cloud-sdk:alpine image. Sadly it does not contain shasum, so while installing the plugin this occurs:
Downloading https://github.com/quintush/helm-unittest/releases/download/v0.2.0/helm-unittest-linux-amd64-0.2.0.tgz to location /tmp/_dist/
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 648 100 648 0 0 9529 0 --:--:-- --:--:-- --:--:-- 9529
100 16.9M 100 16.9M 0 0 9483k 0 0:00:01 0:00:01 --:--:-- 15.0M
Validating Checksum.
./install-binary.sh: line 120: shasum: command not found
But sha1sum, sha256sum, sha3sum, and sha512sum are available.
So would it be possible to check whether shasum exists and, if not, fall back to one of those?
Cheers,
Sascha
It would be neat if the plugin supported a --values option that could receive an array of paths pointing to values files to be passed down to the chart.
That would make it easier to test against a set of values other than the default values.yaml inside the chart folder, and to integrate with other tools such as helmfile, which we use to dynamically generate the final set of values used to deploy Helm charts.
See the related issue in helmfile for extra info: roboll/helmfile#1713
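Until such a flag exists, the per-test values: list (used elsewhere in this document) covers part of the use case; a sketch with hypothetical file paths, resolved relative to the test file:

```yaml
suite: test deployment with alternate values
templates:
  - deployment.yaml
tests:
  - it: renders with a generated values file
    values:
      - ../ci/staging-values.yaml   # hypothetical path, e.g. written by helmfile
    asserts:
      - isKind:
          of: Deployment
```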
Hi,
I'm trying to test the existence of an item in an array without using a positional index, with the latest version of the plugin.
The YAML to be tested is:
spec:
scopes:
- phony-service.phony:graphql
Please note that phony-service.phony:graphql is not a key/value pair but a whole string.
The following assertion fails:
asserts:
- contains:
path: spec.scopes
content:
phony-service.phony:graphql
count: 1
any: true
with this odd error:
- asserts[0] `contains` fail
Template: anything/templates/apiresource.yaml
DocumentIndex: 0
Path: spec.scopes
Expected to contain:
- phony-service.phony:graphql
Actual:
- phony-service.phony:graphql
Whereas the "positional" assertion works:
asserts:
- equal:
path: spec.scopes[0]
value: "phony-service.phony:graphql"
Any idea what I could have missed? I have tried putting the item content between quotes; same error.
Thanks in advance for any help.
Hello, I have a question: why do I get errors like:
incompatible types for comparison
instead of the full error, like:
template: generic-release/templates/deployment.yaml:106:35: executing "deployment" at <eq "tmpfs">: error calling eq: incompatible types for comparison
The original error would be really helpful to me. I am quite often unable to find out why my test is failing unless I run it in debug mode. I found that a lot of information is trimmed from the error here:
helm-unittest/pkg/unittest/test_job.go
Line 466 in 1693f4f
Why not throw the whole error as-is?
Hi, I have a Helm chart that was created for Helm 2, and the unit tests were working with the lrills/helm-unittest plugin.
But when I try to use this version (installed through the helm plugin install https://github.com/quintush/helm-unittest command), some errors about templates occur.
Helm version: 2.16.10
The tests:
suite: test autoscaling
templates:
- horizontalpodautoscaler.yaml
tests:
- it: should use GLOBAL scaling config when release autoscaling AND Global autoscaling are enabled
set:
infra:
autoScaling:
enabled: true
minReplicas: 100
maxReplicas: 500
releases:
- name: default
environment: nimbus
infra:
autoScaling:
enabled: true
type: "hpa"
asserts:
- isKind:
of: HorizontalPodAutoscaler
- hasDocuments:
count: 1
- equal:
path: spec.minReplicas
value: 100
- equal:
path: spec.maxReplicas
value: 500
- it: should use release hpa config when Global autoscaling is disabled but release scaling is enabled.
set:
infra:
autoScaling:
enabled: false
minReplicas: 5000
maxReplicas: 7000
releases:
- name: default
environment: nimbus
infra:
autoScaling:
enabled: true
minReplicas: 2
maxReplicas: 2
asserts:
- isKind:
of: HorizontalPodAutoscaler
- hasDocuments:
count: 1
- equal:
path: spec.minReplicas
value: 2
- equal:
path: spec.maxReplicas
value: 2
- it: shouldn't use any autoscaling config when release autoscaling is disabled
set:
infra:
autoScaling:
enabled: true
minReplicas: 5000
maxReplicas: 7000
releases:
- name: default
environment: nimbus
infra:
autoScaling:
enabled: false
minReplicas: 2
maxReplicas: 2
asserts:
- hasDocuments:
count: 0
Errors:
- should use GLOBAL scaling config when release autoscaling AND Global autoscaling are enabled
- asserts[0] `isKind` fail
Error:
assertion.template must be given if testsuite.templates is empty
- asserts[1] `hasDocuments` fail
Error:
assertion.template must be given if testsuite.templates is empty
- asserts[2] `equal` fail
Error:
assertion.template must be given if testsuite.templates is empty
- asserts[3] `equal` fail
Error:
assertion.template must be given if testsuite.templates is empty
Can you provide some way to debug these problems?
Thanks for the awesome work!
@quintush, you're doing a fantastic job maintaining and developing this tool, which is so desperately needed in a world with so many charts and so few unit tests for them.
Did you consider taking over the original repo https://github.com/helm-unittest/helm-unittest, which already recommends your fork as more up to date?
⚠️ THIS REPOSITORY IS NOT MAINTAINED ANYMORE. ⚠️
You can consider other working forks like quintush/helm-unittest. If anyone would like to continue maintaining this repo, please mail [email protected]. I'd be glad to transfer ownership and see it live again.
It's a bit of a responsibility that you've basically already taken on, and it would make things much easier and more accessible.
I like the library very much. However, we use Chart.appVersion in our templates, and this cannot be set when running unit tests.
Please add appVersion to the Chart object in the test suite.
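A sketch of what the requested override could look like in a suite file (the chart.appVersion key is the proposed addition, not an existing option; the image name and paths are made up):

```yaml
suite: test deployment
templates:
  - deployment.yaml
chart:
  appVersion: 1.2.3   # proposed: would populate .Chart.AppVersion during rendering
tests:
  - it: uses appVersion as the default image tag
    asserts:
      - equal:
          path: spec.template.spec.containers[0].image
          value: myrepo/myapp:1.2.3
```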