
tlm's Introduction

tlm - Local CLI Copilot, powered by CodeLLaMa. 💻🦙

Tip

Starcoder2 3B model option coming soon to support workstations with limited resources.


tlm is your CLI companion that requires nothing except your workstation. It runs the efficient and powerful CodeLLaMa model entirely in your local environment to provide the best possible command-line suggestions.

Suggest

Explain

Features

  • 💸 No API key (subscription) is required. (ChatGPT, GitHub Copilot, Azure OpenAI, etc.)

  • 📡 No internet connection is required.

  • 💻 Works on macOS, Linux, and Windows.

  • 👩🏻‍💻 Automatic shell detection. (PowerShell, Bash, Zsh)

  • 🚀 One-liner generation and command explanation.

Installation

Installation can be done in two ways:

Prerequisites

Ollama is required to download the necessary models. It can be installed with the following methods on different platforms.

  • On macOS and Windows:

Download instructions are available at the following link: https://ollama.com/download

  • On Linux:
curl -fsSL https://ollama.com/install.sh | sh
  • Or using official Docker images ๐Ÿณ;
# CPU Only
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# With GPU (Nvidia & AMD)
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
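
Before deploying any models, you can confirm the Ollama server is reachable. This is a hedged check, assuming Ollama's default port 11434 and its standard /api/tags endpoint, which lists the models the server has pulled:

```shell
# Hedged check: query the local Ollama server's model list.
models=$(curl -s --max-time 2 http://localhost:11434/api/tags 2>/dev/null || true)
if [ -n "$models" ]; then
  echo "Ollama is up: $models"
else
  echo "Ollama is not reachable on localhost:11434"
fi
```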

Installation Script

The installation script is the recommended way to install tlm. It detects your platform and architecture and runs the appropriate install command for you.

Linux and macOS:

Download and execute the installation script with the following command:

curl -fsSL https://raw.githubusercontent.com/yusufcanb/tlm/1.1/install.sh | sudo bash -E

Windows (PowerShell 5.1 or higher):

Download and execute the installation script with the following command:

Invoke-RestMethod -Uri https://raw.githubusercontent.com/yusufcanb/tlm/1.1/install.ps1 | Invoke-Expression

Go Install

If you have Go 1.21 or higher installed on your system, you can install tlm with the following command:

go install github.com/yusufcanb/tlm@latest

Then, deploy the tlm modelfiles.

๐Ÿ“ Note: If you have Ollama deployed on somewhere else. Please first run tlm config and configure Ollama host.

tlm deploy

Check the installation with the following command:

tlm help
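
If deployment succeeded, the tlm modelfiles should also appear in Ollama's model list. A hedged check (model names per the current convention, suggest:7b and explain:7b; the fallback message is illustrative):

```shell
# Hedged check: look for the deployed tlm modelfiles in Ollama's model list.
tlm_models=$(ollama list 2>/dev/null | grep -E 'suggest|explain' || true)
if [ -n "$tlm_models" ]; then
  echo "$tlm_models"
else
  echo "tlm models not found; run 'tlm deploy' first"
fi
```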

Uninstall

On Linux and macOS;

rm /usr/local/bin/tlm

On Windows;

Remove-Item -Recurse -Force "C:\Users\$env:USERNAME\AppData\Local\Programs\tlm"


tlm's Issues

[Feature Request] Add context of previously executed commands and outputs

Is there a convenient option to read the bash history and provide a solution for any problem I might run into?

e.g. I execute

poetry install       
Installing dependencies from lock file

Package operations: 66 installs, 0 updates, 0 removals

  - Installing pycddlib (2.1.7): Failed

  ChefBuildError

  Backend subprocess exited when trying to invoke build_wheel
  
  running bdist_wheel
  running build
  running build_ext
  Compiling cdd.pyx because it changed.
  [1/1] Cythonizing cdd.pyx
  building 'cdd' extension
  creating build
  creating build/temp.macosx-14.0-arm64-cpython-312
  creating build/temp.macosx-14.0-arm64-cpython-312/cddlib
  creating build/temp.macosx-14.0-arm64-cpython-312/cddlib/lib-src
  clang -fno-strict-overflow -Wsign-compare -Wunreachable-code -fno-common -dynamic -DNDEBUG -g -O3 -Wall -DGMPRATIONAL -Icddlib/lib-src -I/private/var/folders/db/8bjxm83551s6k9y4081jd5sh0000gn/T/tmp56717o84/.venv/include -I/opt/homebrew/opt/[email protected]/Frameworks/Python.framework/Versions/3.12/include/python3.12 -c cdd.c -o build/temp.macosx-14.0-arm64-cpython-312/cdd.o
  In file included from cdd.c:1244:
  In file included from cddlib/lib-src/cdd.h:17:
  cddlib/lib-src/cddmp.h:30:11: fatal error: 'gmp.h' file not found
   #include "gmp.h"
            ^~~~~~~
  1 error generated.
  error: command '/usr/bin/clang' failed with exit code 1
  

  at /opt/homebrew/Cellar/poetry/1.8.2/libexec/lib/python3.12/site-packages/poetry/installation/chef.py:164 in _prepare
      160│ 
      161│                 error = ChefBuildError("\n\n".join(message_parts))
      162│ 
      163│             if error is not None:
    → 164│                 raise error from None
      165│ 
      166│             return path
      167│ 
      168│     def _prepare_sdist(self, archive: Path, destination: Path | None = None) -> Path:

Note: This error originates from the build backend, and is likely not a problem with poetry but with pycddlib (2.1.7) not supporting PEP 517 builds. You can verify this by running 'pip wheel --no-cache-dir --use-pep517 "pycddlib (==2.1.7)"'.

I now would love to do something like tlm fix to prompt the llm to help me to fix my problem and suggest a command to run next.

Is there something like this already? Any ideas on how to implement this?
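
A minimal sketch of such a flow, assuming Bash history in ~/.bash_history and an illustrative prompt format (no tlm fix subcommand exists today, so everything here is hypothetical):

```shell
# Hypothetical sketch of a "tlm fix" flow:
# read recent Bash history and build a prompt for the local model.
histfile="${HISTFILE:-$HOME/.bash_history}"
context=$(tail -n 20 "$histfile" 2>/dev/null || true)
prompt="Recent commands:
$context
Suggest a single command to fix the most recent error."
printf '%s\n' "$prompt"
# A real implementation could then send the prompt to the local model, e.g.:
#   ollama run codellama:7b "$prompt"
```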

Love the project 💙

Problem with explain?

>$ tlm suggest "showing the current time as unix time"
┃ > Thinking... (1.343003842s)
┃ > date +%s
┃ > Explaining...

Error during generation: model 'tlm:-e' not found, try pulling it first

Setup is "bash" even when on mac, because I use bash and not zsh. Could it have to do with that?
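
A quick way to see what automatic shell detection would likely observe on your machine; the exact detection logic tlm uses is an assumption here:

```shell
# Hedged check: compare the login shell with the shell actually running.
login_shell="${SHELL:-unknown}"        # login shell, e.g. /bin/bash even on macOS
running_shell=$(ps -p $$ -o comm= 2>/dev/null || echo unknown)
echo "login shell: $login_shell, currently running: $running_shell"
```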

Change Modelfile naming convention

Right now, tlm uses suggest:7b and explain:7b as its modelfile names. They should be something more convenient.

Suggestions;

  • tlm-[USAGE]:[VERSION]

Work Items for Release 1.1

  • Shell config options. (Automatic, Powershell, Bash)
  • Preference options update; balanced, creative, precise.
  • Better error handling on command execution.

Specify shell support

The readme boasts, "Automatic shell detection," but doesn't say which shells it supports.

E.g. I can be pretty confident that it supports Bash and Powershell, but what about POSIX sh? zsh? fish? ksh? csh?

Please add this information to the readme.

README scripts failing on Mac

For the Ollama installation instructions in the README:

curl -fsSL https://ollama.com/install.sh | sh

Gives this error:
ERROR This script is intended to run on Linux only.

For the installation script for tlm:

curl -fsSL https://raw.githubusercontent.com/yusufcanb/tlm/main/install.sh | sudo bash -E

Gives this error:
bash: line 39: conditional binary operator expected

How to set up the models? I have installed CodeLlama, but it's not working

$ ollama list
NAME                            ID              SIZE    MODIFIED
codellama:7b                    8fdf8f752f6e    3.8 GB  About a minute ago
wizard-vicuna-uncensored:13b    6887722b6618    7.4 GB  2 weeks ago
$ tlm suggest 'echo hello'
[err] error getting suggestion: model 'suggest:7b' not found, try pulling it first
┃ > Thinking... (1.000666ms)
panic: runtime error: index out of range [0] with length 0

goroutine 1 [running]:
github.com/yusufcanb/tlm/suggest.(*Suggest).action(0x140001a85a0, 0x140001e1640)
        /Users/chenggeng/go/pkg/mod/github.com/yusufcanb/[email protected]/suggest/cli.go:51 +0xa78
github.com/urfave/cli/v2.(*Command).Run(0x140001f0580, 0x140001e1640, {0x140001d7a00, 0x2, 0x2})
        /Users/chenggeng/go/pkg/mod/github.com/urfave/cli/[email protected]/command.go:279 +0x7b0
github.com/urfave/cli/v2.(*Command).Run(0x140001f0f20, 0x140001e1500, {0x1400019e180, 0x3, 0x3})
        /Users/chenggeng/go/pkg/mod/github.com/urfave/cli/[email protected]/command.go:272 +0x9d0
github.com/urfave/cli/v2.(*App).RunContext(0x1400029a000, {0x100d72550?, 0x140001ae008}, {0x1400019e180, 0x3, 0x3})
        /Users/chenggeng/go/pkg/mod/github.com/urfave/cli/[email protected]/app.go:337 +0x61c
github.com/urfave/cli/v2.(*App).Run(...)
        /Users/chenggeng/go/pkg/mod/github.com/urfave/cli/[email protected]/app.go:311
main.main()
        /Users/chenggeng/go/pkg/mod/github.com/yusufcanb/[email protected]/main.go:16 +0x5c

Missing newline after explain

> $ tlm explain "date +%s"
The `date` command is used to display the current date and time in seconds since the Epoch (January 1, 1970, 00:00:00 UTC). The `%s` format specifier is used to print the number of seconds since the Epoch.

So, when you run `date +%s`, it will display the current date and time in seconds since the Epoch.> $

Setup is "bash" even when on mac, because I use bash and not zsh. Could it have to do with that?
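
The glued prompt happens because the explanation text is printed without a trailing newline. A small demonstration, not tlm's actual code:

```shell
# Output without a trailing newline leaves the next prompt glued to the text,
# as in the report above; a final '\n' keeps the prompt on a fresh line.
printf 'explanation text' | wc -c     # 16 bytes: no trailing newline
printf 'explanation text\n' | wc -c   # 17 bytes: trailing newline present
```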
