solo-io / gloo
The Feature-rich, Kubernetes-native, Next-Generation API Gateway Built on Envoy
Home Page: https://docs.solo.io/
License: Apache License 2.0
I see " RBAC permissions have not been granted to Gloo to read from the registry, " if my service, that has a /swagger.json returning a valid swagger setup, is not read correctly. How do I ensure the right permissions? I ran "glooctl install kube" to set this up. And I have 2 things to read, including the petstore YAML that I put into a separate project, not default, that is not found with "glooctl get upstreams".
make site works from this repo, but only produces the site locally. We need to push an updated docker image, replace the existing site docs with the newly generated ones, and update the READMEs in this repo to refer to the open source gloo site.
I have a service in the namespace datasvc-contacts and ran the command below, which I think is correct. It is giving me an error saying --kube-service is not a valid flag:
glooctl create upstream kube -n gloo-datasvc-contacts \
--kube-service datasvc-contacts-svc \
--kube-service-namespace datasvc-contacts \
--kube-service-port 8080
Error: unknown flag: --kube-service
Usage:
glooctl create upstream kube [flags]
Flags:
-h, --help help for kube
--kube-service string name of the kubernetes service
--kube-service-labels strings labels to use for customized selection of pods for this upstream. can be used to select subsets of pods for a service e.g. for blue-green deployment
--kube-service-namespace string namespace where the kubernetes service lives (default "defaukt")
--kube-service-port uint32 the port were the service is listening. for services listenin on multiple ports, create an upstream for each port. (default 80)
--service-spec-type string if set, Gloo supports additional routing features to upstreams with a service spec. The service spec defines a set of features
Global Flags:
-i, --interactive use interactive mode
--name string name of the resource to read or write
-n, --namespace string namespace for reading or writing resources (default "gloo-system")
-o, --output string output format: (yaml, json, table)
glooctl version 0.5.0
Running on macOS with the latest OS update (dark mode enabled, if it matters), on a latest-generation 15" MacBook.
Most of the links in the docs on, e.g., https://gloo.solo.io/introduction/concepts point to https://gloo.solo.io/introduction/TODO.
For debugging purposes, it would be good to see both the real envoy config file and the effective config that it sees from xDS (related to envoyproxy/envoy#2421). Maybe we can add a flag for effective config, or options to see effective envoy routes, listeners, or clusters?
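In the meantime, Envoy's admin interface can already dump the effective configuration it received over xDS (a sketch; the admin port below is an assumption, so check the envoy bootstrap for the real value):

# forward the envoy admin port from the ingress pod, then dump the live config
kubectl port-forward --namespace gloo-system <ingress pod> 19000:19000
curl localhost:19000/config_dump

/config_dump is the endpoint discussed in envoyproxy/envoy#2421, and it covers listeners, routes, and clusters.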
In the source_events_from_github example, I found some quite magic stuff:
# Deploy NATS and minio
# --- this command only deploys the NATS-streaming server and minio server
kubectl apply -f \
https://raw.githubusercontent.com/solo-io/gloo/master/example/source_events_from_github/kube-deploy.yaml
# Create a route for nats
# --- Route HTTP requests to NATS
glooctl route create --sort \
--path-exact /github-webhooks \
--upstream default-nats-streaming-4222 \
--function github-webhooks
I can see that the NATS-streaming server is auto-discovered: https://github.com/solo-io/gloo/blob/master/pkg/function-discovery/nats-streaming/discover_nats_test.go
Can you explain where the logic that pushes the event to NATS-streaming lives? I can't see where the logic for github-webhooks is implemented.
Thanks.
Update: I found that gloo provides a TCP filter for the NATS-streaming upstream via https://github.com/solo-io/envoy-nats-streaming/blob/master/source/common/nats/message_builder.cc. I think the NATS-streaming integration deserves better documentation. Thanks again for a great integration.
Hello,
I am looking at this project, and it looks really neat. Thank you for putting the time and energy of supporting it.
I have to support some legacy applications, and need to support a ring hash load balancer (sticky sessions). I know that envoy supports it in the v2 spec, but was wondering if it is possible to configure this in gloo for use with envoy? https://www.envoyproxy.io/docs/envoy/latest/api-v2/api/v2/route/route.proto#route-routeaction-hashpolicy
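For reference, the Envoy-side configuration being asked about looks roughly like this (a sketch based on the linked v2 docs; the cluster name and cookie are hypothetical):

route:
  cluster: legacy-app
  hash_policy:
  - cookie:
      name: session-id
      ttl: 3600s

combined with lb_policy: RING_HASH on the cluster definition.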
Thanks!
I followed all the steps and tried installing; I got the YAML below when executing "glooctl upstream get default-petstore-8080 -o yaml":
metadata:
  annotations:
    generated_by: kubernetes-upstream-discovery
    kubectl.kubernetes.io/last-applied-configuration: |
      {"apiVersion":"v1","kind":"Service","metadata":{"annotations":{},"labels":{"sevice":"petstore"},"name":"petstore","namespace":"default"},"spec":{"ports":[{"port":8080,"protocol":"TCP"}],"selector":{"app":"petstore"}}}
  namespace: gloo-system
  resource_version: "950"
name: default-petstore-8080
spec:
  service_name: petstore
  service_namespace: default
  service_port: 8080
status:
  state: Accepted
type: kubernetes
I see that the feature list mentions the HashiCorp stack (Vault, Consul, Nomad), but I can't find any doc that says that.
Is it possible to use an existing Ingress that's already deployed? I have an nginx-ingress already wired up to a load balancer on my K8s cluster.
When I followed
kubectl apply \
-f https://raw.githubusercontent.com/solo-io/gloo/master/example/petstore/petstore.yaml
metadata:
  annotations:
    kubectl.kubernetes.io/last-applied-configuration: >
      {"apiVersion":"v1","kind":"Service","metadata":{"annotations":{},"labels":{"sevice":"petstore"},"name":"petstore","namespace":"default"},"spec":{"ports":[{"port":8080,"protocol":"TCP"}],"selector":{"app":"petstore"}}}
Today we tried to use gloo on Kubernetes to serve a gRPC service for JSON-compatible clients; we ran into some problems and need some help.
wrk -t4 -c2000 -d30s -T5s --latency http://xxxxxx
Then, when adding a new service, it is not discovered any more. I can see in gloo-system that all the pods are running; the logs just say "attempting to ...", with no more retries.
Could you give me some suggestions on which log files I need for debugging? Thanks in advance.
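A starting point (a sketch, assuming the stock deployment names that appear elsewhere in these issues):

# the discovery logs are usually the relevant ones when new services stop appearing
kubectl logs --namespace gloo-system deployment/function-discovery
kubectl logs --namespace gloo-system deployment/upstream-discovery
kubectl logs --namespace gloo-system deployment/control-plane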
Currently releasing gloo requires:
- updating install/kube.yaml in this repo to use the new image tags
- regenerating the site (make site)
- creating and pushing the docker image

I'm unsure if the version is being reported incorrectly or if it's actually the wrong version:
/tmp# curl -LO https://github.com/solo-io/gloo/releases/download/v0.5.2/glooctl-linux-amd64
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 610 0 610 0 0 2850 0 --:--:-- --:--:-- --:--:-- 2850
100 40.6M 100 40.6M 0 0 2771k 0 0:00:15 0:00:15 --:--:-- 2946k
/tmp# chmod +x glooctl-linux-amd64
/tmp# ./glooctl-linux-amd64 --version
glooctl version 0.5.0
Ingress container keeps failing. Stderr:
[2018-09-25 10:03:16.979][6][info][main] external/envoy/source/server/server.cc:183] initializing epoch 0 (hot restart version=10.200.16384.127.options=capacity=16384, num_slots=8209 hash=228984379728933363 size=2654312)
[2018-09-25 10:03:16.982][6][info][main] external/envoy/source/server/server.cc:185] statically linked extensions:
[2018-09-25 10:03:16.982][6][info][main] external/envoy/source/server/server.cc:187] access_loggers: envoy.file_access_log,envoy.http_grpc_access_log
[2018-09-25 10:03:16.982][6][info][main] external/envoy/source/server/server.cc:190] filters.http: envoy.buffer,envoy.cors,envoy.ext_authz,envoy.fault,envoy.filters.http.rbac,envoy.grpc_http1_bridge,envoy.grpc_json_transcoder,envoy.grpc_web,envoy.gzip,envoy.health_check,envoy.http_dynamo_filter,envoy.ip_tagging,envoy.lua,envoy.rate_limit,envoy.router,envoy.squash,io.solo.azure_functions,io.solo.function_router,io.solo.gcloudfunc,io.solo.lambda,io.solo.nats_streaming,io.solo.transformation
[2018-09-25 10:03:16.982][6][info][main] external/envoy/source/server/server.cc:193] filters.listener: envoy.listener.original_dst,envoy.listener.proxy_protocol,envoy.listener.tls_inspector
[2018-09-25 10:03:16.982][6][info][main] external/envoy/source/server/server.cc:196] filters.network: envoy.client_ssl_auth,envoy.echo,envoy.ext_authz,envoy.http_connection_manager,envoy.mongo_proxy,envoy.ratelimit,envoy.redis_proxy,envoy.tcp_proxy,io.solo.filters.network.consul_connect
[2018-09-25 10:03:16.982][6][info][main] external/envoy/source/server/server.cc:198] stat_sinks: envoy.dog_statsd,envoy.metrics_service,envoy.statsd
[2018-09-25 10:03:16.982][6][info][main] external/envoy/source/server/server.cc:200] tracers: envoy.dynamic.ot,envoy.lightstep,envoy.zipkin
[2018-09-25 10:03:16.982][6][info][main] external/envoy/source/server/server.cc:203] transport_sockets.downstream: envoy.transport_sockets.capture,raw_buffer,tls
[2018-09-25 10:03:16.982][6][info][main] external/envoy/source/server/server.cc:206] transport_sockets.upstream: envoy.transport_sockets.capture,raw_buffer,tls
[2018-09-25 10:03:16.983][6][critical][main] external/envoy/source/server/server.cc:78] error initializing configuration '': Proto constraint validation failed (BootstrapValidationError.Admin: ["value is required"]):
[2018-09-25 10:03:16.983][6][info][main] external/envoy/source/server/server.cc:449] exiting
(the same startup-and-exit sequence repeats at 10:03:38 and 10:04:00; the container is crash-looping on the same BootstrapValidationError)
/Users/rick/code/src/github.com/solo-io/gloo/pkg/cliutil/nsselect/gen_map_test.go:20
Expected
<[]string | len:3, cap:4>: ["ns2, s3", "ns1, s1", "ns1, s2"]
to equal
<[]string | len:3, cap:3>: ["ns1, s1", "ns1, s2", "ns2, s3"]
/Users/rick/code/src/github.com/solo-io/gloo/pkg/cliutil/nsselect/gen_map_test.go:22
------------------------------
Summarizing 1 Failure:
[Fail] Generate Options [It] should create the correct Secret options and map
/Users/rick/code/src/github.com/solo-io/gloo/pkg/cliutil/nsselect/gen_map_test.go:22
Ran 1 of 1 Specs in 0.001 seconds
FAIL! -- 0 Passed | 1 Failed | 0 Pending | 0 Skipped
--- FAIL: TestGit (0.00s)
FAIL
The gateway CRD can get out of sync fairly easily; it's been noticed when virtual services are deleted.
I've followed the installation instructions at https://github.com/solo-io/gloo/blob/master/docs/getting_started/kubernetes/1.md using minikube. For some reason the UI isn't available. As I understand it, the UI is provided by the control-plane service, which I exposed through port-forward:
kubectl port-forward --namespace=gloo-system control-plane-<pod id> 8081:8081
The port is available, but instead of HTML it replies with some binary content (to be precise, the reply is 9 bytes: 00 00 00 04 00 00 00 00 00).
Could you explain how to connect to the gloo UI?
Also, there is a minor inconsistency in your document: the "Petstore" API is served at :8080/api/pets, not at :8080/petstore, and there is no list action, etc.
cat <<EOF | glooctl route update --path-exact /petstore/findPet --upstream default-petstore-8080 --function findPetById --extensions -
parameters:
headers:
x-pet: '{id}'
EOF
Outputs the following:
Using virtual service: default
Unable to get updated route: unable to get old route: a matcher wasn't specified
Did I do something wrong, or has something changed since the tutorial was written?
For folks who wish to deal directly with the Kube YAML, it would be good to have a preview option:
glooctl install kube --preview
which prints the Kube YAML.
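Intended usage would be something like this (a sketch of the proposed flag, which does not exist yet):

# review or version-control the manifest instead of applying it blindly
glooctl install kube --preview > gloo-install.yaml
kubectl apply -f gloo-install.yaml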
Related to #202, we misspell "permissions" in the glooctl help text when running glooctl create upstream kube --help. We also have several typos in the flags.
Any plans to support gateway authentication before proxying to upstream microservices?
Apparently, Kubernetes allows namespaces with names that are not DNS-1035 labels, but will not allow upstreams (or presumably any CRD) to be written with a non-DNS-1035 name. This means that a namespace can start with a number, but an upstream may only begin with a letter. This could be problematic if a user tries to run UDS on a namespace starting with a number, as upstreams are currently named namespace-servicename-port; for example, a service foo:80 in a namespace named 123-apps would yield the invalid upstream name 123-apps-foo-80.
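A quick way to check whether a generated name is a valid DNS-1035 label (a sketch; the regex mirrors the Kubernetes validation rule):

# a DNS-1035 label must start with a letter and contain only lowercase alphanumerics and '-'
echo "123-apps-foo-80" | grep -qE '^[a-z]([-a-z0-9]*[a-z0-9])?$' && echo valid || echo invalid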
I have this service/deployment combination in my clusters. It's a very standard default backend for ingress. Note that the service forwards from port 80 to the named port http on the pod.
kind: Service
apiVersion: v1
metadata:
  name: arch-default-backend
spec:
  ports:
    - port: 80
      targetPort: http
  selector:
    app: arch-default-backend
---
kind: Deployment
apiVersion: extensions/v1beta1
metadata:
  name: arch-default-backend
spec:
  replicas: 1
  template:
    metadata:
      labels:
        app: arch-default-backend
    spec:
      terminationGracePeriodSeconds: 60
      containers:
        - name: default-backend
          image: gcr.io/google_containers/defaultbackend:1.0
          livenessProbe:
            httpGet:
              path: /healthz
              port: 8080
              scheme: HTTP
            initialDelaySeconds: 30
            timeoutSeconds: 5
          resources:
            limits:
              cpu: 10m
              memory: 20Mi
            requests:
              cpu: 10m
              memory: 20Mi
          ports:
            - name: http
              containerPort: 8080
              protocol: TCP
With this setup gloo is logging this every second:
{"level":"warn","ts":1546956254.3821619,"logger":"gloo.v1.event_loop.setup.kubernetes_eds","caller":"kubernetes/eds.go:133","msg":"upstream gloo-system.default-arch-default-backend-80: port 80 not found for service arch-default-backend"}
Currently, on a default install, glooctl binds its gloo-role to the default service account. This is a security issue, as most operators leave their default service account permissionless. Have glooctl create its own service account and use that (a sketch follows the describe output below).
/tools# kubectl describe clusterrolebinding gloo-role-binding
Name: gloo-role-binding
Labels: <none>
Annotations: kubectl.kubernetes.io/last-applied-configuration={"apiVersion":"rbac.authorization.k8s.io/v1","kind":"ClusterRoleBinding","metadata":{"annotations":{},"name":"gloo-role-binding","namespace":""},"roleR...
Role:
Kind: ClusterRole
Name: gloo-role
Subjects:
Kind Name Namespace
---- ---- ---------
ServiceAccount default gloo-system
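A minimal sketch of the proposed fix (the service account name is hypothetical; gloo-role is the existing cluster role):

cat <<EOF | kubectl apply -f -
apiVersion: v1
kind: ServiceAccount
metadata:
  name: gloo
  namespace: gloo-system
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
  name: gloo-role-binding
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: gloo-role
subjects:
- kind: ServiceAccount
  name: gloo
  namespace: gloo-system
EOF

The gloo deployments would then set serviceAccountName: gloo in their pod specs instead of inheriting default.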
Following "running gloo with docker-compose": while the other services are up after running docker-compose up, function discovery is not running. docker ps returns
Hello, I've got a question. I have a new route to my monolith created with gloo ("glooctl route create --path-prefix / --upstream myoldapp-8080"), and I would like to know if I can set any kind of auth (at least basic) on this route, so users have to authenticate when they call http://myserver/. Thanks a lot.
Prior versions of Gloo supported Kubernetes ingress, but it isn't currently supported in Gloo 0.5.0.
ceposta@postamaclab(~) $ glooctl get virtualservice routes
Error: virtualservice id provided was incorrect
Usage:
glooctl get virtualservice route [flags]
Maybe we could update the usage to:
glooctl get virtualservice [virtual service name] routes
Thoughts?
Trying https://github.com/solo-io/gloo/tree/master/example/grpc with the latest versions of VirtualBox, Minikube, kubectl, and glooctl on a blank minikube.
At this step:
[07:01] curl $GRPC_URL/bookstore.Bookstore/ListShelves
curl: (7) Failed to connect to 10.0.2.15 port 30508: Connection timed out
[07:06]
2018/05/30 06:57:59 listening on 8080
2018/05/30 06:58:15 /grpc.reflection.v1alpha.ServerReflection/ServerReflectionInfo
ERROR: logging before flag.Parse: W0530 06:46:01.324885 1 client_config.go:529] Neither --kubeconfig nor --master was specified. Using the inClusterConfig. This might not work.
ERROR: logging before flag.Parse: W0530 06:46:01.326152 1 client_config.go:529] Neither --kubeconfig nor --master was specified. Using the inClusterConfig. This might not work.
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/pkg/storage/crd/crd_storage_client.go:103" registering crd v1.crd{
Plural: "upstreams",
Group: "gloo.solo.io",
Version: "v1",
Kind: "Upstream",
ShortName: "us",
}
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/pkg/storage/crd/crd_storage_client.go:103" registering crd v1.crd{
Plural: "virtualservices",
Group: "gloo.solo.io",
Version: "v1",
Kind: "VirtualService",
ShortName: "vs",
}
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/pkg/storage/crd/crd_storage_client.go:103" registering crd v1.crd{
Plural: "virtualmeshes",
Group: "gloo.solo.io",
Version: "v1",
Kind: "VirtualMesh",
ShortName: "vm",
}
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/cmd/kube-ingress-controller/main.go:105" starting ingress controller
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/vendor/github.com/solo-io/kubecontroller/controller.go:113" Starting "gloo-ingress-syncer" controller
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/vendor/github.com/solo-io/kubecontroller/controller.go:116" Waiting for informer caches to sync
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/cmd/kube-ingress-controller/main.go:81" starting ingress status sync
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/cmd/kube-ingress-controller/main.go:94" starting ingress sync
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/vendor/github.com/solo-io/kubecontroller/controller.go:113" Starting "kube-ingress-controller" controller
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/vendor/github.com/solo-io/kubecontroller/controller.go:116" Waiting for informer caches to sync
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/vendor/github.com/solo-io/kubecontroller/controller.go:121" Starting workers
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/vendor/github.com/solo-io/kubecontroller/controller.go:127" Started workers
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/vendor/github.com/solo-io/kubecontroller/controller.go:121" Starting workers
"Wed, 30 May 2018 06:46:01 UTC: github.com/solo-io/gloo/vendor/github.com/solo-io/kubecontroller/controller.go:127" Started workers
non stop flow of:
ERROR: logging before flag.Parse: E0530 07:11:03.617946 1 reflector.go:205] github.com/solo-io/gloo/pkg/storage/crd/roles.go:84: Failed to list *v1.Role: roles.gloo.solo.io is forbidden: User "system:serviceaccount:gloo-system:default" cannot list roles.gloo.solo.io in the namespace "gloo-system"
ERROR: logging before flag.Parse: E0530 07:11:04.620914 1 reflector.go:205] github.com/solo-io/gloo/pkg/storage/crd/roles.go:84: Failed to list *v1.Role: roles.gloo.solo.io is forbidden: User "system:serviceaccount:gloo-system:default" cannot list roles.gloo.solo.io in the namespace "gloo-system"
ERROR: logging before flag.Parse: E0530 07:11:05.623252 1 reflector.go:205] github.com/solo-io/gloo/pkg/storage/crd/roles.go:84: Failed to list *v1.Role: roles.gloo.solo.io is forbidden: User "system:serviceaccount:gloo-system:default" cannot list roles.gloo.solo.io in the namespace "gloo-system"
ERROR: logging before flag.Parse: E0530 07:11:06.626267 1 reflector.go:205] github.com/solo-io/gloo/pkg/storage/crd/roles.go:84: Failed to list *v1.Role: roles.gloo.solo.io is forbidden: User "system:serviceaccount:gloo-system:default" cannot list roles.gloo.solo.io in the namespace "gloo-system"
ERROR: logging before flag.Parse: E0530 07:11:07.628632 1 reflector.go:205] github.com/solo-io/gloo/pkg/storage/crd/roles.go:84: Failed to list *v1.Role: roles.gloo.solo.io is forbidden: User "system:serviceaccount:gloo-system:default" cannot list roles.gloo.solo.io in the namespace "gloo-system"
ERROR: logging before flag.Parse: E0530 07:11:08.633828 1 reflector.go:205] github.com/solo-io/gloo/pkg/storage/crd/roles.go:84: Failed to list *v1.Role: roles.gloo.solo.io is forbidden: User "system:serviceaccount:gloo-system:default" cannot list roles.gloo.solo.io in the namespace "gloo-system"
ERROR: logging before flag.Parse: E0530 07:11:09.650339 1 reflector.go:205] github.com/solo-io/gloo/pkg/storage/crd/roles.go:84: Failed to list *v1.Role: roles.gloo.solo.io is forbidden: User "system:serviceaccount:gloo-system:default" cannot list roles.gloo.solo.io in the namespace "gloo-system"
ERROR: logging before flag.Parse: E0530 07:11:10.653396 1 reflector.go:205] github.com/solo-io/gloo/pkg/storage/crd/roles.go:84: Failed to list *v1.Role: roles.gloo.solo.io is forbidden: User "system:serviceaccount:gloo-system:default" cannot list roles.gloo.solo.io in the namespace "gloo-system"
ERROR: logging before flag.Parse: E0530 07:11:11.655923 1 reflector.go:205] github.com/solo-io/gloo/pkg/storage/crd/roles.go:84: Failed to list *v1.Role: roles.gloo.solo.io is forbidden: User "system:serviceaccount:gloo-system:default" cannot list roles.gloo.solo.io in the namespace "gloo-system"
[2018-05-30 06:46:50.281][1][info][main] external/envoy/source/server/server.cc:200] initializing epoch 0 (hot restart version=9.200.16384.127.options=capacity=16384, num_slots=8209 hash=228984379728933363)
[2018-05-30 06:46:50.281][1][info][main] external/envoy/source/server/server.cc:202] statically linked extensions:
[2018-05-30 06:46:50.281][1][info][main] external/envoy/source/server/server.cc:204] access_loggers: envoy.file_access_log,envoy.http_grpc_access_log
[2018-05-30 06:46:50.281][1][info][main] external/envoy/source/server/server.cc:207] filters.http: envoy.buffer,envoy.cors,envoy.ext_authz,envoy.fault,envoy.grpc_http1_bridge,envoy.grpc_json_transcoder,envoy.grpc_web,envoy.gzip,envoy.health_check,envoy.http_dynamo_filter,envoy.ip_tagging,envoy.lua,envoy.rate_limit,envoy.router,envoy.squash,io.solo.azure_functions,io.solo.function_router,io.solo.gcloudfunc,io.solo.lambda,io.solo.nats_streaming,io.solo.transformation
[2018-05-30 06:46:50.281][1][info][main] external/envoy/source/server/server.cc:210] filters.listener: envoy.listener.original_dst,envoy.listener.proxy_protocol
[2018-05-30 06:46:50.281][1][info][main] external/envoy/source/server/server.cc:213] filters.network: envoy.client_ssl_auth,envoy.echo,envoy.ext_authz,envoy.http_connection_manager,envoy.mongo_proxy,envoy.ratelimit,envoy.redis_proxy,envoy.tcp_proxy
[2018-05-30 06:46:50.281][1][info][main] external/envoy/source/server/server.cc:215] stat_sinks: envoy.dog_statsd,envoy.metrics_service,envoy.statsd
[2018-05-30 06:46:50.281][1][info][main] external/envoy/source/server/server.cc:217] tracers: envoy.dynamic.ot,envoy.lightstep,envoy.zipkin
[2018-05-30 06:46:50.281][1][info][main] external/envoy/source/server/server.cc:220] transport_sockets.downstream: raw_buffer,ssl
[2018-05-30 06:46:50.281][1][info][main] external/envoy/source/server/server.cc:223] transport_sockets.upstream: raw_buffer,ssl
[2018-05-30 06:46:50.294][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:224] gRPC config stream closed: 1,
[2018-05-30 06:46:50.294][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:39] Unable to establish new stream
[2018-05-30 06:46:50.294][1][info][config] external/envoy/source/server/configuration_impl.cc:52] loading 0 listener(s)
[2018-05-30 06:46:50.294][1][info][config] external/envoy/source/server/configuration_impl.cc:92] loading tracing configuration
[2018-05-30 06:46:50.294][1][info][config] external/envoy/source/server/configuration_impl.cc:114] loading stats sink configuration
[2018-05-30 06:46:50.294][1][info][main] external/envoy/source/server/server.cc:399] starting main dispatch loop
[2018-05-30 06:46:55.295][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:224] gRPC config stream closed: 1,
[2018-05-30 06:46:55.296][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:39] Unable to establish new stream
[2018-05-30 06:47:00.297][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:224] gRPC config stream closed: 1,
[2018-05-30 06:47:00.297][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:39] Unable to establish new stream
[2018-05-30 06:47:05.297][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:224] gRPC config stream closed: 1,
[2018-05-30 06:47:05.297][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:39] Unable to establish new stream
[2018-05-30 06:47:10.298][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:224] gRPC config stream closed: 1,
[2018-05-30 06:47:10.298][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:39] Unable to establish new stream
[2018-05-30 06:47:15.297][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:224] gRPC config stream closed: 1,
[2018-05-30 06:47:15.298][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:39] Unable to establish new stream
[2018-05-30 06:47:20.298][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:224] gRPC config stream closed: 1,
[2018-05-30 06:47:20.298][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:39] Unable to establish new stream
[2018-05-30 06:47:25.297][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:224] gRPC config stream closed: 1,
[2018-05-30 06:47:25.298][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:39] Unable to establish new stream
[2018-05-30 06:47:30.297][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:224] gRPC config stream closed: 1,
[2018-05-30 06:47:30.297][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:39] Unable to establish new stream
[2018-05-30 06:47:35.299][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:224] gRPC config stream closed: 1,
[2018-05-30 06:47:35.299][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:39] Unable to establish new stream
[2018-05-30 06:47:40.298][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:224] gRPC config stream closed: 1,
[2018-05-30 06:47:40.298][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:39] Unable to establish new stream
[2018-05-30 06:47:45.299][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:224] gRPC config stream closed: 1,
[2018-05-30 06:47:45.299][1][warning][upstream] external/envoy/source/common/config/grpc_mux_impl.cc:39] Unable to establish new stream
[2018-05-30 06:47:47.799][1][info][upstream] external/envoy/source/common/upstream/cluster_manager_impl.cc:127] cm init: initializing cds
[2018-05-30 06:47:50.313][1][info][upstream] external/envoy/source/common/upstream/cluster_manager_impl.cc:379] add/update cluster gloo-system-control-plane-8081 during init
[2018-05-30 06:47:50.317][1][info][upstream] external/envoy/source/common/upstream/cluster_manager_impl.cc:379] add/update cluster gloo-system-ingress-8080 during init
[2018-05-30 06:47:50.320][1][info][upstream] external/envoy/source/common/upstream/cluster_manager_impl.cc:379] add/update cluster gloo-system-ingress-8443 during init
[2018-05-30 06:47:50.321][1][info][upstream] external/envoy/source/common/upstream/cluster_manager_impl.cc:108] cm init: initializing secondary clusters
[2018-05-30 06:47:50.324][1][info][upstream] external/envoy/source/common/upstream/cluster_manager_impl.cc:131] cm init: all clusters initialized
[2018-05-30 06:47:50.324][1][info][main] external/envoy/source/server/server.cc:383] all clusters initialized. initializing init manager
[2018-05-30 06:47:50.325][1][info][config] external/envoy/source/server/listener_manager_impl.cc:602] all dependencies initialized. starting workers
[2018-05-30 06:57:32.189][1][info][upstream] external/envoy/source/common/upstream/cluster_manager_impl.cc:385] add/update cluster default-grpc-bookstore-8080 starting warming
[2018-05-30 06:57:32.189][1][info][upstream] external/envoy/source/common/upstream/cluster_manager_impl.cc:392] warming cluster default-grpc-bookstore-8080 complete
[2018-05-30 06:58:15.764][1][info][upstream] external/envoy/source/common/upstream/cluster_manager_impl.cc:385] add/update cluster default-grpc-bookstore-8080 starting warming
[2018-05-30 06:58:15.764][1][info][upstream] external/envoy/source/common/upstream/cluster_manager_impl.cc:392] warming cluster default-grpc-bookstore-8080 complete
[2018-05-30 07:02:50.326][1][info][main] external/envoy/source/server/drain_manager_impl.cc:63] shutting down parent after drain
"Wed, 30 May 2018 06:58:15 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64" no more retries for "gloo-system-ingress-8080"
"Wed, 30 May 2018 06:58:15 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64" no more retries for "gloo-system-ingress-8443"
"Wed, 30 May 2018 06:58:15 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64" no more retries for "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:58:15 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:256" updated upstream &v1.Upstream{
Name: "default-grpc-bookstore-8080",
Type: "kubernetes",
ConnectionTimeout: 0,
Spec: &types.Struct{
Fields: {
"service_port": &types.Value{
Kind: &types.Value_NumberValue{
NumberValue: 8080.000000,
},
},
"service_name": &types.Value{
Kind: &types.Value_StringValue{
StringValue: "grpc-bookstore",
},
},
"service_namespace": &types.Value{
Kind: &types.Value_StringValue{
StringValue: "default",
},
},
},
},
Functions: []*v1.Function{},
ServiceInfo: &v1.ServiceInfo{
Type: "gRPC",
Properties: &types.Struct{
Fields: {
"descriptors_file_ref": &types.Value{
Kind: &types.Value_StringValue{
StringValue: "grpc-discovery:Bookstore.descriptors",
},
},
"service_names": &types.Value{
Kind: &types.Value_ListValue{
ListValue: &types.ListValue{
Values: []*types.Value{
&types.Value{
Kind: &types.Value_StringValue{
StringValue: "Bookstore",
},
},
},
},
},
},
},
},
},
Status: &v1.Status{
State: 1,
Reason: "",
},
Metadata: &v1.Metadata{
ResourceVersion: "1518",
Namespace: "gloo-system",
Annotations: {
"generated_by": "kubernetes-upstream-discovery",
"kubectl.kubernetes.io/last-applied-configuration": "{\"apiVersion\":\"v1\",\"kind\":\"Service\",\"metadata\":{\"annotations\":{},\"labels\":{\"sevice\":\"grpc-bookstore\"},\"name\":\"grpc-bookstore\",\"namespace\":\"default\"},\"spec\":{\"ports\":[{\"port\":8080,\"protocol\":\"TCP\"}],\"selector\":{\"app\":\"grpc-bookstore\"},\"type\":\"LoadBalancer\"}}\n",
"gloo.solo.io/discovery-type": "grpc",
},
},
}
"Wed, 30 May 2018 06:58:15 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98" attempting update for "default-grpc-bookstore-8080"
"Wed, 30 May 2018 06:58:15 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222" attempting to apply update for upstream "default-grpc-bookstore-8080"
"Wed, 30 May 2018 06:58:16 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98" attempting update for "default-grpc-bookstore-8080"
"Wed, 30 May 2018 06:58:16 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222" attempting to apply update for upstream "default-grpc-bookstore-8080"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:112" beginning update for []string{
"gloo-system-control-plane-8081",
"gloo-system-ingress-8080",
"gloo-system-ingress-8443",
"default-grpc-bookstore-8080",
} upstreams: 4
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98" attempting update for "default-grpc-bookstore-8080"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222" attempting to apply update for upstream "default-grpc-bookstore-8080"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98" attempting update for "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222" attempting to apply update for upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:112" beginning update for []string{
"gloo-system-control-plane-8081",
"gloo-system-ingress-8080",
"gloo-system-ingress-8443",
"default-grpc-bookstore-8080",
} upstreams: 4
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98" attempting update for "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222" attempting to apply update for upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:112" beginning update for []string{
"gloo-system-control-plane-8081",
"gloo-system-ingress-8080",
"gloo-system-ingress-8443",
"default-grpc-bookstore-8080",
} upstreams: 4
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98" attempting update for "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222" attempting to apply update for upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:112" beginning update for []string{
"gloo-system-control-plane-8081",
"gloo-system-ingress-8080",
"gloo-system-ingress-8443",
"default-grpc-bookstore-8080",
} upstreams: 4
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:112" beginning update for []string{
"gloo-system-control-plane-8081",
"gloo-system-ingress-8080",
"gloo-system-ingress-8443",
"default-grpc-bookstore-8080",
} upstreams: 4
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64" no more retries for "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64" no more retries for "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64" no more retries for "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98" attempting update for "default-grpc-bookstore-8080"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222" attempting to apply update for upstream "default-grpc-bookstore-8080"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98" attempting update for "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222" attempting to apply update for upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98" attempting update for "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222" attempting to apply update for upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98" attempting update for "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222" attempting to apply update for upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:44 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64" no more retries for "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:45 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64" no more retries for "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:45 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64" no more retries for "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:45 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98" attempting update for "default-grpc-bookstore-8080"
"Wed, 30 May 2018 07:15:45 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222" attempting to apply update for upstream "default-grpc-bookstore-8080"
"Wed, 30 May 2018 07:15:46 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98" attempting update for "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:46 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222" attempting to apply update for upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:46 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98" attempting update for "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:46 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222" attempting to apply update for upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:47 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98" attempting update for "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:47 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222" attempting to apply update for upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:47 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64" no more retries for "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:47 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64" no more retries for "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:47 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64" no more retries for "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:48 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98" attempting update for "default-grpc-bookstore-8080"
"Wed, 30 May 2018 07:15:48 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222" attempting to apply update for upstream "default-grpc-bookstore-8080"
"Wed, 30 May 2018 07:15:48 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98" attempting update for "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:48 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222" attempting to apply update for upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:49 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98" attempting update for "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:49 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222" attempting to apply update for upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:49 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98" attempting update for "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:49 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222" attempting to apply update for upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:49 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64" no more retries for "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:50 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64" no more retries for "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:50 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64" no more retries for "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:50 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98" attempting update for "default-grpc-bookstore-8080"
"Wed, 30 May 2018 07:15:50 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222" attempting to apply update for upstream "default-grpc-bookstore-8080"
"Wed, 30 May 2018 07:15:51 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98" attempting update for "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:51 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222" attempting to apply update for upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:51 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98" attempting update for "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:51 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222" attempting to apply update for upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:51 UTC: github.com/solo-io/gloo/internal/function-discovery/eventloop/event_loop.go:98" attempting update for "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:51 UTC: github.com/solo-io/gloo/internal/function-discovery/updater/updater.go:222" attempting to apply update for upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 07:15:52 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64" no more retries for "gloo-system-control-plane-8081"
"Wed, 30 May 2018 07:15:52 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64" no more retries for "gloo-system-ingress-8080"
"Wed, 30 May 2018 07:15:52 UTC: github.com/solo-io/gloo/internal/function-discovery/detector/detector.go:64" no more retries for "gloo-system-ingress-8443"
ERROR: logging before flag.Parse: W0530 06:46:11.927839 1 client_config.go:529] Neither --kubeconfig nor --master was specified. Using the inClusterConfig. This might not work.
"Wed, 30 May 2018 06:46:11 UTC: github.com/solo-io/gloo/pkg/storage/crd/crd_storage_client.go:103" registering crd v1.crd{
Plural: "upstreams",
Group: "gloo.solo.io",
Version: "v1",
Kind: "Upstream",
ShortName: "us",
}
"Wed, 30 May 2018 06:46:11 UTC: github.com/solo-io/gloo/pkg/storage/crd/crd_storage_client.go:103" registering crd v1.crd{
Plural: "virtualservices",
Group: "gloo.solo.io",
Version: "v1",
Kind: "VirtualService",
ShortName: "vs",
}
"Wed, 30 May 2018 06:46:11 UTC: github.com/solo-io/gloo/pkg/storage/crd/crd_storage_client.go:103" registering crd v1.crd{
Plural: "virtualmeshes",
Group: "gloo.solo.io",
Version: "v1",
Kind: "VirtualMesh",
ShortName: "vm",
}
2018/05/30 06:46:11 starting kubernetes service discovery
"Wed, 30 May 2018 06:46:11 UTC: github.com/solo-io/gloo/vendor/github.com/solo-io/kubecontroller/controller.go:113" Starting "kube-upstream-discovery" controller
"Wed, 30 May 2018 06:46:11 UTC: github.com/solo-io/gloo/vendor/github.com/solo-io/kubecontroller/controller.go:116" Waiting for informer caches to sync
"Wed, 30 May 2018 06:46:12 UTC: github.com/solo-io/gloo/vendor/github.com/solo-io/kubecontroller/controller.go:121" Starting workers
"Wed, 30 May 2018 06:46:12 UTC: github.com/solo-io/gloo/vendor/github.com/solo-io/kubecontroller/controller.go:127" Started workers
"Wed, 30 May 2018 06:46:12 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:46:12 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:47:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:47:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 06:47:12 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 06:48:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:48:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 06:48:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 06:49:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:49:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 06:49:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 06:50:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:50:12 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 06:50:12 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 06:51:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:51:12 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 06:51:12 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 06:52:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:52:12 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 06:52:12 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 06:53:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:53:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 06:53:12 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 06:54:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:54:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 06:54:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 06:55:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:55:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 06:55:12 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 06:56:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-control-plane-8081"
"Wed, 30 May 2018 06:56:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-ingress-8080"
"Wed, 30 May 2018 06:56:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-ingress-8443"
"Wed, 30 May 2018 06:57:11 UTC: github.com/solo-io/gloo/pkg/config/desired_state.go:101" updating upstream "gloo-system-control-plane-8081"
glooctl is returning "unknown flag" for --static-hosts, which appears to be supported by the help output. Am I passing this flag incorrectly?
/tools# glooctl create upstream static --name test --static-hosts example.com:80
Error: unknown flag: --static-hosts
Usage:
glooctl create upstream static [flags]
Flags:
-h, --help help for static
--service-spec-type string if set, Gloo supports additional routing features to upstreams with a service spec. The service spec defines a set of features
--static-hosts strings list of hosts for the static upstream. these are hostnames or ips provided in the format IP:PORT or HOSTNAME:PORT. if :PORT is missing, it will default to :80
--static-outbound-tls connections Gloo manages to this cluster will attempt to use TLS for outbound connections. Gloo will automatically set this to true for port 443
Global Flags:
-i, --interactive use interactive mode
--name string name of the resource to read or write
-n, --namespace string namespace for reading or writing resources (default "gloo-system")
-o, --output string output format: (yaml, json, table)
In order to more easily expose the Settings CRD, CLI commands for:
In https://github.com/solo-io/gloo/blob/master/docs/getting_started/kubernetes/2.md#steps the instructions state:
curl ${GATEWAY_URL}/petstore/findPet
bad request: Did not found json element: id
But it no longer errors like this.
I believe the function definitions have changed. The tutorial expects them to be:
glooctl upstream get default-petstore-8080 -o yaml
functions:
...
- name: findPetById
  spec:
    body: ""
    headers:
      :method: GET
    path: /api/pets/{{id}}
But instead they are now:
functions:
...
- name: findPetById
  spec:
    body: ""
    headers:
      :method: GET
    passthrough_body: false
    path: /api/pets/{{ default(id, "") }}
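Presumably this is also why the tutorial's error no longer reproduces: with the new template, a missing id renders as an empty string instead of failing (my reading of the template above, not confirmed):

curl ${GATEWAY_URL}/petstore/findPet
# old template: fails with "Did not found json element: id"
# new default(id, "") template: the path renders with an empty id and the request is forwarded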
In upgrading to solo-kit 0.2.1, the generate script for this file went away, so changes to kube.yaml don't update the CLI.
The e2e tests take a very long time to run, and often fail because of build errors that would have been caught by a very quick go vet check on the repo. In general we should be following many of these conventions anyway.
Occurred on #182.
This will ensure that master is always consistent with code generated from .proto files, as well as with any updates to solo-kit. The steps should be:
dep ensure
make -B generated-code
git diff | wc -c
If the result is > 0, fail (see the sketch below).
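A minimal CI sketch of that check (assuming it runs from a clean checkout):

#!/usr/bin/env bash
set -euo pipefail
dep ensure
make -B generated-code
# fail the build if regeneration changed anything that was not committed
if [ -n "$(git diff)" ]; then
  echo "generated code is out of date; run 'make -B generated-code' and commit the result" >&2
  exit 1
fi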
Currently, gloo returns an error on the vservice if its domains conflict with any other vservices. What this feature would do is an opinionated merge (with some kind of route sort) of one or more virtual services based on domain: if domains overlap, we will create a merged vhost (it would be good to LOG it for now).
Following the getting started doc and trying to install on a GKE cluster, I'm running into the following issue. Not sure whether I'm doing something wrong here:
kubectl apply \
> --filename https://raw.githubusercontent.com/solo-io/gloo/master/install/kube/install.yaml
namespace "gloo-system" created
customresourcedefinition.apiextensions.k8s.io "upstreams.gloo.solo.io" created
customresourcedefinition.apiextensions.k8s.io "virtualservices.gloo.solo.io" created
customresourcedefinition.apiextensions.k8s.io "roles.gloo.solo.io" created
customresourcedefinition.apiextensions.k8s.io "attributes.gloo.solo.io" created
configmap "ingress-config" created
clusterrolebinding.rbac.authorization.k8s.io "gloo-cluster-admin-binding" created
clusterrolebinding.rbac.authorization.k8s.io "gloo-discovery-cluster-admin-binding" created
clusterrolebinding.rbac.authorization.k8s.io "gloo-knative-upstream-discovery-binding" created
deployment.apps "control-plane" created
service "control-plane" created
deployment.apps "function-discovery" created
deployment.apps "ingress" created
service "ingress" created
deployment.extensions "kube-ingress-controller" created
deployment.extensions "upstream-discovery" created
Error from server (Forbidden): error when creating "https://raw.githubusercontent.com/solo-io/gloo/master/install/kube/install.yaml": clusterroles.rbac.authorization.k8s.io "gloo-role" is forbidden: attempt to grant extra privileges: [PolicyRule{Resources:["pods"], APIGroups:[""], Verbs:["get"]} PolicyRule{Resources:["pods"], APIGroups:[""], Verbs:["watch"]} PolicyRule{Resources:["pods"], APIGroups:[""], Verbs:["list"]} PolicyRule{Resources:["services"], APIGroups:[""], Verbs:["get"]} PolicyRule{Resources:["services"], APIGroups:[""], Verbs:["watch"]} PolicyRule{Resources:["services"], APIGroups:[""], Verbs:["list"]} PolicyRule{Resources:["secrets"], APIGroups:[""], Verbs:["get"]} PolicyRule{Resources:["secrets"], APIGroups:[""], Verbs:["watch"]} PolicyRule{Resources:["secrets"], APIGroups:[""], Verbs:["list"]} PolicyRule{Resources:[
This issue should track progress on deploying Gloo with istio sidecars. The existing Gloo deployment has been tested to work with istio 1.0.3 without any additional configuration.
Gloo performs the following network functions:
I ran glooctl create upstream kube -i and answered the namespace and name questions, and then it errored with:
Error: interactive mode not currently available for type kube
I would think it would report that right away.
#220 addresses the need to add functionality to Gloo without modifying the core Gloo repo. It introduces the ability to add additional plugins to Gloo, as well as to provide callbacks for the Translator via TranslatorSyncerExtensions.
Just to keep track of an alternate solution: it may be prudent to add complexity to the plugins (essentially, offload the work of extending Gloo's functionality entirely to plugins). Just something to track as we continue to build out.
This generate statement assumes the $GOPATH will be a single directory. If you have multiple paths in GOPATH, it fails:
GOPATH=/home/ceposta/gopath:/another/gopath:/foo/path
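A sketch of one possible fix, taking only the first GOPATH entry (standard shell parameter expansion; the variable name is made up):

# strip everything from the first ':' onward so multi-element GOPATHs still work
FIRST_GOPATH="${GOPATH%%:*}"
echo "$FIRST_GOPATH"   # -> /home/ceposta/gopath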
Currently the RequestMatcher for Ingress resources is path_regexp, but the k8s Ingress resource supports only absolute paths.
Feature request: provide a glooctl diagnose install-like feature that would notify you beforehand of possible problems you will encounter during the installation.
Suggested verifications:
version mismatch or incompatibility
permissions
licenses (if applicable)
access to bits (offline installation)