

WAFL 0.0.80

Introduction


WAFL is a framework for home assistants. It is designed to combine Large Language Models with rules to create predictable behavior. Specifically, instead of organising the work of an LLM into a chain of thoughts, WAFL organises its behavior into inference trees.

WAFL is a work in progress. The current version requires the user to specify the rules to follow. While it is ready to play with, it might not be ready for production, depending on your use case.

Installation

In this version, WAFL is a two-part system. Both parts can be installed on the same machine.

The two parts of WAFL

Interface side

The first part is local to your machine and needs to have access to a microphone and speaker. To install it, run the following commands:

$ sudo apt-get install portaudio19-dev ffmpeg
$ pip install wafl

After installing the requirements, you can initialize the interface by running the following command:

$ wafl init

which creates a config.json file that you can edit to change the default settings. A standard rule file is also created as wafl.rules. Please see the examples in the following chapters.
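The generated settings are version-dependent, but a minimal sketch of the kind of contents you can expect in config.json might look like the following (the key names here are illustrative assumptions rather than a definitive schema; the generated file is the authoritative reference):

{
    "waking_up_word": "computer",
    "llm_model": {
        "model_host": "localhost",
        "model_port": 8080
    }
}

The activation word and the address of the LLM side (see below) are the settings you are most likely to change.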

LLM side (needs a GPU)

The second part (LLM side) is a model server for the speech-to-text model, the LLM, the embedding system, and the text-to-speech model.

Installation

In order to quickly run the LLM side, you can use the following installation commands:

$ pip install wafl-llm
$ wafl-llm start

which will use the default models and start the server on port 8080.
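To verify that the server came up, you can check that the port accepts connections. This only tests reachability, since the HTTP routes are not documented here; even an error status on the root path indicates the server is listening:

$ curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8080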

Docker

A Docker image can be used to run the LLM side as follows:

$ docker run -p8080:8080 --env NVIDIA_DISABLE_REQUIRE=1 --gpus all fractalego/wafl-llm:0.80

The interface side has a config.json file that needs to be filled with the IP address of the LLM side. The default is localhost. Alternatively, you can run the LLM side by cloning this repository.
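If the LLM side runs on a separate GPU machine, point the interface at it. Reusing the illustrative key names from the sketch above, with a hypothetical address:

"llm_model": {
    "model_host": "192.168.1.50",
    "model_port": 8080
}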

Running WAFL

This document contains a few examples of how to use the wafl CLI. There are four modes in which to run the system:

$ wafl run-audio

This is the main mode of operation. It will run the system in a loop, waiting for the user to speak a command. The activation word is the name defined in config.json. The default name is "computer", but you can change it to whatever you want.

$ wafl run-server

It runs a local web server that listens for HTTP requests on port 8889. The server will act as a chatbot, executing commands and returning the result as defined in the rules.
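Assuming the default port, you can then open the chatbot in a browser on the same machine (or reach it with any HTTP client):

$ xdg-open http://localhost:8889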

$ wafl run-cli

This command works like the run-server command, but it listens for commands on the command line. It does not run a web server and is useful for testing purposes.

$ wafl run-tests

This command will run all the tests defined in the file testcases.txt.
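The exact test syntax is described in the documentation; as a hedged sketch, assuming each test pairs user utterances with the expected replies, an entry in testcases.txt might look like this (the test name and dialogue are invented for illustration):

test the assistant replies to a greeting
  user: hello
  bot: hello there

wafl run-tests then checks each exchange against the configured rules.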

Documentation

The documentation can be found at wafl.readthedocs.io.


wafl's People

Contributors

fractalego, octyle


wafl's Issues

Question: Entity documents

If you presented YAML entity documents to wafl with, say, something like LangChain, could wafl create control YAML from the input?

If you take the MQTT commands of common open-source devices, such as https://tasmota.github.io/docs/Commands/#control or https://esphome.io/components/light/, then I guess all entities and controls, with the names they have been given, could be exported as documents, so that an LLM would receive something like "[zone] turn the light on".

Likely the YAML representation of that entity would exist in the documents:

zone: lounge
entity: light
name: main light
state: on/off
control: on/off

It's just a general question, and ignore it if I got the control YAML totally wrong; it's just an example. But I have been wondering whether some of the smaller 7B-parameter models, such as Llama with llama.cpp, could be fine-tuned, and whether, given formatted YAML documents, you could train a system to output control code, since the only things outside the LLM's knowledge are the names those items have been given.
So if those entities were presented as documents, could an LLM return the correctly formatted YAML?

If it did, then you would have just solved home-assistant control with the likes of Home Assistant or openHAB.
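As a concrete (and entirely hypothetical) sketch of the pairing described above, using the Tasmota convention from the linked docs, where power is controlled by publishing ON or OFF to a cmnd/<device-topic>/POWER topic: given the entity document

zone: lounge
entity: light
name: main light
state: on/off
control: on/off

and the request "lounge, turn the main light on", the model would be expected to emit something like

cmnd/lounge_main_light/POWER ON

where lounge_main_light is a made-up device topic.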
