
kaf's Introduction

Kaf

Kafka CLI inspired by kubectl & docker


Demo: asciinema recording of kaf in action

Install

Install via Go from source:

go install github.com/birdayz/kaf/cmd/kaf@latest

Install via install script:

curl https://raw.githubusercontent.com/birdayz/kaf/master/godownloader.sh | BINDIR=$HOME/bin bash

Install on Archlinux via AUR:

yay -S kaf-bin

Install via Homebrew:

brew tap birdayz/kaf
brew install kaf

Usage

Show the tool version

kaf --version

Add a local Kafka with no auth

kaf config add-cluster local -b localhost:9092

Select cluster from dropdown list

kaf config select-cluster

Describe and list nodes

kaf node ls

List topics, partitions and replicas

kaf topics

Describe a given topic called mqtt.messages.incoming

kaf topic describe mqtt.messages.incoming

Group Inspection

List consumer groups

kaf groups

Describe a given consumer group called dispatcher

kaf group describe dispatcher

Write a message into a given topic from stdin

echo test | kaf produce mqtt.messages.incoming
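To read messages back out of the topic, kaf also ships a consume command (it appears again in the issue reports further below); the minimal form is:

kaf consume mqtt.messages.incoming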

Offset Reset

Set offset for consumer group dispatcher consuming from topic mqtt.messages.incoming to latest for all partitions

kaf group commit dispatcher -t mqtt.messages.incoming --offset latest --all-partitions

Set offset to oldest

kaf group commit dispatcher -t mqtt.messages.incoming --offset oldest --all-partitions

Set offset to 1001 for partition 0

kaf group commit dispatcher -t mqtt.messages.incoming --offset 1001 --partition 0
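To confirm that the offsets were actually moved, re-run the describe command shown in the group inspection section above:

kaf group describe dispatcher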

Configuration

See the examples folder for sample configuration files; a minimal config is sketched below.
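As a minimal sketch, a ~/.kaf/config for the single local cluster added above would look roughly like this (the field layout mirrors the config shown in the "kafka in docker" issue further below; treat the exact values as illustrative):

current-cluster: local
clusters:
- name: local
  brokers:
  - localhost:9092
  SASL: null
  TLS: null
  security-protocol: ""
  schema-registry-url: ""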

Shell autocompletion

Install the completion script where your shell will load it from:

Bash Linux:

kaf completion bash > /etc/bash_completion.d/kaf

Bash macOS:

kaf completion bash > /usr/local/etc/bash_completion.d/kaf

Zsh

kaf completion zsh > "${fpath[1]}/_kaf"

Fish

kaf completion fish > ~/.config/fish/completions/kaf.fish

Powershell

Invoke-Expression (@(kaf completion powershell) -replace " ''\)$"," ' ')" -join "`n")

Sponsors

  • The streaming data platform for developers
  • Single binary w/no dependencies
  • Fully Kafka API compatible
  • 10x lower P99 latencies, 6x faster transactions
  • Zero data loss by default

kaf's People

Contributors

adamwasila, almogbaku, ayumukasuga, bdomars, birdayz, chrillux, christian-olsen, dbrown14, dependabot[bot], fabiojmendes, fcce, flyingcougar, hoenirvili, huangnauh, jackcipher, jdowning, joeirimpan, josher19, k-wall, lukasvyhlidka, midn, mxcoder, nishanthvijayan, robsonpeixoto, sdenham, sergiosalvatore, steveruble, teh-cmc, tzetter, vixns


kaf's Issues

Cut a new release?

Hi! Would you be able to generate a new release and build to get the latest changes available? Thanks!

Config options

Currently, settings like the broker list are hardcoded to localhost :)

  • Determine config options
  • YAML format for config file

panic: runtime error: makeslice: cap out of range

I regularly encounter the following error message for groups consuming topics with 360 or more partitions:

$ kaf group describe consumer-group.prod
Group ID: offer-upgrade-processor.prod
State: Stable
Protocol: stream
Protocol Type: consumer
Offsets:
topic1:
Partition Group Offset High Watermark Lag
--------- ------------ -------------- ---
0 48321 48321 0
1 48155 48155 0
2 48113 48113 0
3 48556 48556 0
...
topic2:
Partition Group Offset High Watermark Lag
--------- ------------ -------------- ---
0 1129687383 1129687722 339
1 1129936247 1129936511 264
2 1128154286 1128154637 351
...
Members:
consumer-group.prod-54418c37-2b3e-4c6a-825b-1b935ff775ce-StreamThread-1-consumer:
Host: /10.114.0.12
Assignments:
Topic Partitions
----- ----------
topic1 [36 60 84 108 132 156 180 204 228 252 276 300 324 348]
topic2 [36 60 84 108 132 156 180 204 228 252 276 300 324 348]
Metadata: AAAAAwAAAANUQYw3Kz5MaoJbG5Nf93XOAAAAAAAAAAAAAAAA
consumer-group.prod-54418c37-2b3e-4c6a-825b-1b935ff775ce-StreamThread-3-consumer:
Host: /10.114.0.12
Assignments:
Topic Partitions
----- ----------
topic1 [33 57 81 105 129 153 177 201 225 249 273 297 321 345]
topic2 [33 57 81 105 129 153 177 201 225 249 273 297 321 345]

panic: runtime error: makeslice: cap out of range

goroutine 1 [running]:
github.com/birdayz/kaf.(*SubscriptionInfo).Decode(0xc000511808, 0x16dfe80, 0xc00018e0c0, 0x11, 0x11)
/home/j0e/projects/kaf/subscription_info.go:29 +0x11a
main.tryDecodeUserData(0xc0001d4088, 0x6, 0xc0003a618c, 0x24, 0x5733, 0x0, 0x0, 0x4e, 0x0)
/home/j0e/projects/kaf/cmd/kaf/group.go:309 +0x150
main.glob..func10(0x1a9dd40, 0xc0000a34a0, 0x1, 0x1)
/home/j0e/projects/kaf/cmd/kaf/group.go:213 +0x115f
github.com/spf13/cobra.(*Command).execute(0x1a9dd40, 0xc0000a3460, 0x1, 0x1, 0x1a9dd40, 0xc0000a3460)
/home/j0e/gopath/pkg/mod/github.com/spf13/[email protected]/command.go:830 +0x2ae
github.com/spf13/cobra.(*Command).ExecuteC(0x1a9dfc0, 0xa, 0xc0000bdf88, 0x100552f)
/home/j0e/gopath/pkg/mod/github.com/spf13/[email protected]/command.go:914 +0x2fc
github.com/spf13/cobra.(*Command).Execute(...)
/home/j0e/gopath/pkg/mod/github.com/spf13/[email protected]/command.go:864
main.main()
/home/j0e/projects/kaf/cmd/kaf/kaf.go:63 +0x32

Topic command

  • create
  • ls
  • ls with arg (=filter with glob)
  • describe
  • delete
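A sketch of the command surface this checklist implies; only kaf topics and kaf topic describe appear elsewhere in this document, so treat the remaining subcommands as assumptions derived from the checklist rather than confirmed syntax:

kaf topic create mqtt.messages.incoming
kaf topic ls
kaf topic ls "mqtt.*"          # ls with a glob filter argument
kaf topic describe mqtt.messages.incoming
kaf topic delete mqtt.messages.incoming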

make / go install fails with genproto dependency issue in latest master

Hey folks,

Maybe it's my own fault and we just need to focus on #48, but when running make install, make build, or go install I always get the same error:

go: github.com/jhump/[email protected] requires
        google.golang.org/[email protected]: invalid pseudo-version: does not match version-control timestamp (2017-08-18T01:03:45Z)

I'm using GNU Make 4.2.1, go version go1.13 darwin/amd64 on macOS 10.14.6.

Could someone please help me out here?
Thank you :-)

Support for brokers in a k8s cluster

Use either
a) kubectl
b) client-go

to either
a) port-forward to the pods (needs a port forward for each broker; a sketch follows the bullet list below)
b) run kaf inside a temporary pod (possibly one per command)

  • Static broker config from normal cfg
  • Broker discovery via headless services
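A minimal sketch of approach (a), assuming a single broker exposed through a Kubernetes Service named kafka in a namespace kafka (both names are hypothetical), and assuming the broker advertises an address that resolves from the workstation as localhost:9092:

kubectl -n kafka port-forward svc/kafka 9092:9092 &
kaf config add-cluster k8s-local -b localhost:9092
kaf config select-cluster
kaf topics

With multiple brokers this needs one port-forward per broker, because clients follow the advertised addresses returned in the cluster metadata.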

kafka in docker -- `no such host ...`

With Kafka running in a Docker container, kaf commands give a "no such host" error because they're apparently dialing the hostname advertised by the broker (the container ID) rather than the one provided in the config, e.g.

$ kaf topics
Unable to list topics: dial tcp: lookup c3d0d2984699: no such host

I'm running kafka and zookeeper in docker containers (thru docker-compose) with the broker port exposed to the host machine:

$ docker ps | grep bitnami
c3d0d2984699        bitnami/kafka:latest               "/entrypoint.sh /run…"   18 hours ago        Up 18 hours         0.0.0.0:9092->9092/tcp         proj_kafka_1
23ec9f7aa1e4        bitnami/zookeeper:latest           "/app-entrypoint.sh …"   18 hours ago        Up 18 hours         2181/tcp, 2888/tcp, 3888/tcp   proj_zookeeper_1

~/.kaf/config contents:

current-cluster: localhost:9092
clusters:
- name: localhost:9092
  brokers:
  - localhost:9092
  SASL: null
  TLS: null
  security-protocol: ""
  schema-registry-url: ""
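The hostname in the error (c3d0d2984699, the container ID) is the address the broker advertises in its metadata response; after bootstrapping via localhost:9092, the client dials that advertised address, which the host cannot resolve. A hedged sketch of the usual fix is to make the broker advertise an address the host can reach, i.e. in the broker's server.properties:

advertised.listeners=PLAINTEXT://localhost:9092

With the bitnami/kafka image this is presumably set through an environment variable such as KAFKA_CFG_ADVERTISED_LISTENERS in docker-compose; treat that variable name as an assumption.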

Cannot connect to broker

kaf version: v0.1.20

kafka version: kafka_2.11-0.10.0.0

[jinlei1@tjwq01-ibu-video007106:~] master(+9/-4,10)* 3s ± kaf -b 10.64.2.163:8092 topics -v
[sarama] 2019/07/18 16:56:05 client.go:127: Initializing new client
[sarama] 2019/07/18 16:56:05 config.go:494: ClientID is the default of 'sarama', you should consider setting it to something application-specific.
[sarama] 2019/07/18 16:56:05 config.go:494: ClientID is the default of 'sarama', you should consider setting it to something application-specific.
[sarama] 2019/07/18 16:56:05 client.go:775: client/metadata fetching metadata for all topics from broker 10.64.2.163:8092
[sarama] 2019/07/18 16:56:05 broker.go:214: Connected to broker at 10.64.2.163:8092 (unregistered)
[sarama] 2019/07/18 16:56:05 client.go:813: client/metadata got error from broker -1 while fetching metadata: EOF
[sarama] 2019/07/18 16:56:05 broker.go:253: Closed connection to broker 10.64.2.163:8092
[sarama] 2019/07/18 16:56:05 client.go:824: client/metadata no available broker to send metadata request to
[sarama] 2019/07/18 16:56:05 client.go:565: client/brokers resurrecting 1 dead seed brokers
[sarama] 2019/07/18 16:56:05 client.go:759: client/metadata retrying after 250ms... (3 attempts remaining)
[sarama] 2019/07/18 16:56:05 config.go:494: ClientID is the default of 'sarama', you should consider setting it to something application-specific.
[sarama] 2019/07/18 16:56:05 client.go:775: client/metadata fetching metadata for all topics from broker 10.64.2.163:8092
[sarama] 2019/07/18 16:56:05 broker.go:214: Connected to broker at 10.64.2.163:8092 (unregistered)
[sarama] 2019/07/18 16:56:05 client.go:813: client/metadata got error from broker -1 while fetching metadata: EOF
[sarama] 2019/07/18 16:56:05 broker.go:253: Closed connection to broker 10.64.2.163:8092
[sarama] 2019/07/18 16:56:05 client.go:824: client/metadata no available broker to send metadata request to
[sarama] 2019/07/18 16:56:05 client.go:565: client/brokers resurrecting 1 dead seed brokers
[sarama] 2019/07/18 16:56:05 client.go:759: client/metadata retrying after 250ms... (2 attempts remaining)
[sarama] 2019/07/18 16:56:06 config.go:494: ClientID is the default of 'sarama', you should consider setting it to something application-specific.
[sarama] 2019/07/18 16:56:06 client.go:775: client/metadata fetching metadata for all topics from broker 10.64.2.163:8092
[sarama] 2019/07/18 16:56:06 broker.go:214: Connected to broker at 10.64.2.163:8092 (unregistered)
[sarama] 2019/07/18 16:56:06 client.go:813: client/metadata got error from broker -1 while fetching metadata: EOF
[sarama] 2019/07/18 16:56:06 broker.go:253: Closed connection to broker 10.64.2.163:8092
[sarama] 2019/07/18 16:56:06 client.go:824: client/metadata no available broker to send metadata request to
[sarama] 2019/07/18 16:56:06 client.go:565: client/brokers resurrecting 1 dead seed brokers
[sarama] 2019/07/18 16:56:06 client.go:759: client/metadata retrying after 250ms... (1 attempts remaining)
[sarama] 2019/07/18 16:56:06 config.go:494: ClientID is the default of 'sarama', you should consider setting it to something application-specific.
[sarama] 2019/07/18 16:56:06 client.go:775: client/metadata fetching metadata for all topics from broker 10.64.2.163:8092
[sarama] 2019/07/18 16:56:06 broker.go:214: Connected to broker at 10.64.2.163:8092 (unregistered)
[sarama] 2019/07/18 16:56:06 client.go:813: client/metadata got error from broker -1 while fetching metadata: EOF
[sarama] 2019/07/18 16:56:06 broker.go:253: Closed connection to broker 10.64.2.163:8092
[sarama] 2019/07/18 16:56:06 client.go:824: client/metadata no available broker to send metadata request to
[sarama] 2019/07/18 16:56:06 client.go:565: client/brokers resurrecting 1 dead seed brokers
[sarama] 2019/07/18 16:56:06 client.go:227: Closing Client

Consumer command panics sometimes with broker not connected

I've imported my configuration from my ccloud using this command

kaf config import ccloud

Sometimes when I try to consume I'm receiving this panic. It's unpredictable and I don't know why.

kaf consume <topic_that_I_want_to_consume_from>
panic: kafka: broker not connected

goroutine 57 [running]:
main.glob..func7.1(0x7ffeefbff82c, 0x2b, 0x4642220, 0xc00011c120, 0xc000026f28, 0x463fa80, 0xc0003b57c0, 0xc000278f58, 0xc000278f60, 0x8)
	/Users/nn/Work/Go/src/github.com/infinimesh/kaf/cmd/kaf/consume.go:96 +0xf7c
created by main.glob..func7
	/Users/nn/Work/Go/src/github.com/infinimesh/kaf/cmd/kaf/consume.go:85 +0x2a4

After I retry a couple of times, the previous command magically works and I see my topic data.

Add more options to consume

First, thanks for this project, the topics and groups commands are really impressive!

The consume command is still lacking some features, though. With https://github.com/fgeller/kt/, for instance, you can (a hypothetical kaf equivalent is sketched after this list):

  • Select which partitions to use when consuming
  • Specify an offset
  • Use "relative offsets" (newest-10, oldest+10)
  • Select both a start and end offset to consume from
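A purely hypothetical sketch of what such options could look like on kaf consume; none of these flags are confirmed to exist, they only illustrate the requested features:

kaf consume mqtt.messages.incoming --partition 3 --offset oldest    # hypothetical flags
kaf consume mqtt.messages.incoming --offset newest-10               # hypothetical relative offset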

panics if can't resolve broker host name

The configuration contains a single cluster (currently in use), configured with IP:PORT. But:

$ kaf groups

panic: dial tcp: lookup 2df31c8b74c1: no such host

goroutine 1 [running]:
main.glob..func8(0xdd5c60, 0xe26fb8, 0x0, 0x0)
	/home/sitano/Projects/gocode/src/github.com/infinimesh/kaf/cmd/kaf/group.go:64 +0x65a
github.com/infinimesh/kaf/vendor/github.com/spf13/cobra.(*Command).execute(0xdd5c60, 0xe26fb8, 0x0, 0x0, 0xdd5c60, 0xe26fb8)
	/home/sitano/Projects/gocode/src/github.com/infinimesh/kaf/vendor/github.com/spf13/cobra/command.go:766 +0x2ae
github.com/infinimesh/kaf/vendor/github.com/spf13/cobra.(*Command).ExecuteC(0xdd6120, 0xa, 0xc0000cbf88, 0x410d9f)
	/home/sitano/Projects/gocode/src/github.com/infinimesh/kaf/vendor/github.com/spf13/cobra/command.go:852 +0x2ec
github.com/infinimesh/kaf/vendor/github.com/spf13/cobra.(*Command).Execute(...)
	/home/sitano/Projects/gocode/src/github.com/infinimesh/kaf/vendor/github.com/spf13/cobra/command.go:800
main.main()
	/home/sitano/Projects/gocode/src/github.com/infinimesh/kaf/cmd/kaf/kaf.go:37 +0x32

It would be nice if it gracefully handled the case where it cannot resolve one or more of the broker hostnames.

Add WebUI

Kaf should have a WebUI.

I tend towards the following setup:

  • kaf ui starts a local webserver on port XYZ
  • gRPC & gRPC-Web are used to communicate with a Vue.js (or something more lightweight) frontend
  • since this adds a gRPC API, it can also be used to run kaf inside a k8s cluster and invoke commands from a local machine, without using e.g. telepresence to reach brokers in the cluster.

TODO

  • Implement Subscribe() for each Resource (Topic, Group?), so changes are reflected instantly

json/yaml support

Support automated use of this tool.

  • Output json/yaml
  • Maybe read input for commands as YAML, kind of like kubectl; that would be nice for infrastructure as code.
    k8s-like create/get/describe/apply/delete semantics would then be possible for topic, acl, and group; produce and consume would stay separate (which is ok). The kubectl commands this alludes to are shown below.
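For reference, the kubectl semantics the issue alludes to (these are real kubectl commands, shown only as the model being pointed at, not as existing kaf functionality; topic.yaml is a hypothetical manifest):

kubectl get pods -o json
kubectl apply -f topic.yaml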
