
elia's People

Contributors

barabum0, darrenburns, juftin, tomjgooding


elia's Issues

Conversations stop working after token limit exceeded

Rather than sending the entire chat thread to the API, it should be truncated to ensure that we don't exceed the token limit of the chosen model.

As it stands, conversations will crash when the limit for the chosen model is exceeded.

@juftin I haven't dug into it enough, but I'm not sure whether this would be covered by switching to langchain; perhaps it handles this 🤷
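
A minimal sketch of such truncation (the message shape and the token counter here are assumptions; a real implementation would use the chosen model's tokenizer and context size):

```python
def count_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token. A real implementation
    # would use the model's tokenizer (e.g. tiktoken for OpenAI models).
    return max(1, len(text) // 4)

def truncate_history(messages: list[dict], max_tokens: int) -> list[dict]:
    """Keep the system prompt plus the most recent messages that fit."""
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    budget = max_tokens - sum(count_tokens(m["content"]) for m in system)
    kept: list[dict] = []
    for message in reversed(rest):  # walk from newest to oldest
        cost = count_tokens(message["content"])
        if cost > budget:
            break
        kept.append(message)
        budget -= cost
    return system + list(reversed(kept))
```

Walking newest-to-oldest keeps the system prompt and the most recent turns, dropping the oldest context first.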

elia is an application for interacting with LLMs which runs entirely in your terminal, and is designed to be keyboard-focused, efficient, and fun to use! It stores your conversations in a local SQLite database, and allows you to interact with a variety of models. Speak with proprietary models such as ChatGPT and Claude, or with local models running through ollama or LocalAI.


api_base not working

Hey there! Thanks for the awesome app!!

I have a problem with defining api_base. I have a llama3 model served by ollama working well on a different port, e.g. 12345 rather than the default 11434, so I modified the ~/.config/elia/config.toml file as follows:

default_model = "gpt-4o"
system_prompt = "You are a helpful assistant who talks like a pirate."
message_code_theme = "dracula"

[[models]]
name = "ollama/llama3"
api_base = "http://localhost:12345" # I need this working because I will later point it at a remote server

i.e. I simply added api_base = "http://localhost:12345", and with it elia does not work when the model is selected (ctrl+o). If I remove this line, it works fine with my defined llama3 model. Could this be fixed to work with a custom api_base? FYI, I installed/cloned the latest elia version.

Using 3rd-party OpenAI-compatible providers

At the moment it is possible to set the OPENAI_API_KEY / OPENAI_API_BASE variables.

But such a setup does not allow switching providers on the fly, or using specific models.

Perhaps it could be implemented with some settings in config.toml.

My initial ideas:

  • Each OPENAI_API_BASE value should be a separate provider with its own name.
  • The corresponding OPENAI_API_KEY value should be stored in config.toml too.
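
One possible shape for this in config.toml; the [[providers]] table and its fields are purely a sketch of the idea above, not an implemented schema:

```toml
# Hypothetical per-provider entries; not currently supported by elia.
[[providers]]
name = "openai"
api_base = "https://api.openai.com/v1"
api_key = "sk-..."

[[providers]]
name = "my-proxy"
api_base = "https://llm.example.com/v1"
api_key = "sk-other"
```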

Same answer after a while

Hi, there!

It seems that after a while I start to get the same response over and over again, even if I use another prompt. It happened in 3 different chats while I was testing with the same model.

Settings:

  • Linux (ArchLinux), zsh
  • elia installed via pipx
  • Using ollama
  • Model phi3
  • In the modelfile, it has "num_keep=4" which means it should keep the last 4 messages in context.

I will try with other models later. Please let me know if you need more info.

Edit: I have just done some tests with another model that does not have "num_keep" in its modelfile (stablelm2:zephyr) and it seems that it has the same problem: after a while it keeps repeating info about previous inferences.

Thanks again. Regards

[improvement] localAI

Hi,
Thanks for elia !
Would it be possible to have LocalAI support?
Thanks again

Does not run if installed via pipx

This does not run if installed via pipx:

โฏ pipx install elia-chat
  installed package elia-chat 0.1.0, installed using Python 3.11.2
  These apps are now globally available
    - elia
done! ✨ 🌟 ✨

~ took 3s
โฏ elia
Traceback (most recent call last):
  File "/Users/rhet/.local/bin/elia", line 5, in <module>
    from elia.app import run
ModuleNotFoundError: No module named 'elia'

~
โฏ python -c "import sys; import platform; print(sys.version); print(platform.mac_ver())"
3.11.2 (v3.11.2:878ead1ac1, Feb  7 2023, 10:02:41) [Clang 13.0.0 (clang-1300.0.29.30)]
('13.1', ('', '', ''), 'arm64')

Python 3.11.2 (python.org version), macOS 13.1, M1.

It appears the error is in the CLI entry point installed as elia:

from elia.app import run

should possibly be:

from elia_chat.app import run

However, changing this results in a different error:

StylesheetError: unable to read CSS file '/Users/xxx/.local/pipx/venvs/elia-chat/lib/python3.11/site-packages/elia_chat/elia_chat.scss'

The .scss file appears to be there but is called elia.scss:

โฏ ll
.rw-r--r-- rhet staff   0 B Sat May  6 18:18:05 2023  __init__.py
drwxr-xr-x rhet staff 160 B Sat May  6 18:18:05 2023 __pycache__
.rw-r--r-- rhet staff 862 B Sat May  6 18:18:05 2023  app.py
.rw-r--r-- rhet staff 456 B Sat May  6 18:18:05 2023  elia.scss
.rw-r--r-- rhet staff 185 B Sat May  6 18:18:05 2023  models.py
drwxr-xr-x rhet staff 128 B Sat May  6 18:18:05 2023  widgets

Explore using the command palette to search for chats

Textual 0.37.0 introduced the command palette.

This would be an ideal interface for quickly searching for chats (not within chats).

Perhaps we could search within the first two messages of a chat and show a snippet of where the match occurred in the palette.
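
Independent of the palette UI, the lookup itself could be a single query over the first two messages of each chat. A rough sketch, assuming a hypothetical message table with id, chat_id, and content columns (not necessarily elia's actual schema):

```python
import sqlite3

def search_chats(conn: sqlite3.Connection, query: str) -> list[tuple[int, str]]:
    """Return (chat_id, snippet) pairs where `query` occurs in one of the
    first two messages of a chat. Table/column names are assumptions."""
    rows = conn.execute(
        """
        SELECT chat_id, content FROM (
            SELECT chat_id, content,
                   ROW_NUMBER() OVER (PARTITION BY chat_id ORDER BY id) AS pos
            FROM message
        )
        WHERE pos <= 2 AND content LIKE '%' || ? || '%'
        """,
        (query,),
    ).fetchall()
    snippets = []
    for chat_id, content in rows:
        # Build a short snippet centred on the (case-insensitive) match.
        idx = max(content.lower().find(query.lower()), 0)
        snippets.append((chat_id, content[max(0, idx - 20) : idx + len(query) + 20]))
    return snippets
```

SQLite's LIKE is case-insensitive for ASCII by default, which matches what you'd want in a palette search.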

Temperature set in config.toml being interpreted as int instead of float

When using Groq with a temperature other than 0.0 or 1.0, I get the following error:

Traceback (most recent call last):
  File "/Users/thorh/.local/bin/elia", line 8, in <module>
    sys.exit(cli())
             ^^^^^
  File "/Users/thorh/.local/pipx/venvs/elia-chat/lib/python3.12/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/thorh/.local/pipx/venvs/elia-chat/lib/python3.12/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
         ^^^^^^^^^^^^^^^^
  File "/Users/thorh/.local/pipx/venvs/elia-chat/lib/python3.12/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/thorh/.local/pipx/venvs/elia-chat/lib/python3.12/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/thorh/.local/pipx/venvs/elia-chat/lib/python3.12/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/thorh/.local/pipx/venvs/elia-chat/lib/python3.12/site-packages/elia_chat/__main__.py", line 77, in default
    app = Elia(LaunchConfig(**launch_config), startup_prompt=joined_prompt)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/thorh/.local/pipx/venvs/elia-chat/lib/python3.12/site-packages/pydantic/main.py", line 176, in __init__
    self.__pydantic_validator__.validate_python(data, self_instance=self)
pydantic_core._pydantic_core.ValidationError: 1 validation error for LaunchConfig
models.0.temperature
  Input should be a valid integer, got a number with a fractional part [type=int_from_float, input_value=0.4, input_type=float]
    For further information visit https://errors.pydantic.dev/2.7/v/int_from_float


Archive a chat

I'd like an option to archive the current chat.

This would set an archived = true flag on the chat's row in the database.

A confirmation modal should pop up and ask the user to press a key to confirm they want to archive the chat.

Archived chats do not appear in the sidebar.

There will be no way to view the archive yet - that could come later.

I'm open to suggestions on alternative behaviours/workflows for archiving.
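
A minimal sketch of the flag and the sidebar filter, using illustrative table and column names rather than elia's actual schema:

```python
import sqlite3

def archive_chat(conn: sqlite3.Connection, chat_id: int) -> None:
    # Flip the archived flag; the row (and its messages) are kept.
    conn.execute("UPDATE chat SET archived = 1 WHERE id = ?", (chat_id,))

def sidebar_chats(conn: sqlite3.Connection) -> list[int]:
    # The sidebar simply filters archived chats out.
    return [row[0] for row in conn.execute(
        "SELECT id FROM chat WHERE archived = 0 ORDER BY id")]
```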

Tagging chats

I'd like to be able to apply tags to the currently open chat.

Pressing a keybind would bring up a modal screen with an input field.

This input field lets you enter values to add as tags.

These tags will be associated with the chat ID in the database.

We could possibly use textual-autocomplete to autocomplete pre-existing tags.

In the future this will enable us to filter chats by tag.
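
A sketch of how the tag storage could look, with guessed table names (not elia's actual schema):

```python
import sqlite3

# Illustrative many-to-many schema between chats and tags.
SCHEMA = """
CREATE TABLE tag (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE chat_tag (
    chat_id INTEGER,
    tag_id INTEGER REFERENCES tag(id),
    PRIMARY KEY (chat_id, tag_id)
);
"""

def tag_chat(conn: sqlite3.Connection, chat_id: int, name: str) -> None:
    # Create the tag if needed, then link it to the chat (idempotently).
    conn.execute("INSERT OR IGNORE INTO tag (name) VALUES (?)", (name,))
    (tag_id,) = conn.execute("SELECT id FROM tag WHERE name = ?", (name,)).fetchone()
    conn.execute("INSERT OR IGNORE INTO chat_tag VALUES (?, ?)", (chat_id, tag_id))

def chats_with_tag(conn: sqlite3.Connection, name: str) -> list[int]:
    return [row[0] for row in conn.execute(
        "SELECT chat_id FROM chat_tag JOIN tag ON tag.id = tag_id WHERE tag.name = ?",
        (name,))]
```

The unique tag names double as the candidate list an autocomplete widget would draw from.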

Contributing chats

First, great project.
I was wondering about a feature that might be easy for you to add and really helpful for the community:
letting people share their chats (into a dataset on Hugging Face?),
something users could opt into once during setup, after which sharing happens automatically.
Would be happy to discuss how to combine it with the ShareLM dataset if you're interested, or discuss more.

Gemini 1.0 Pro not working

I have set up the project as described in the readme document.
However, it is not clear where to set the environment variables. (I have set them both on my PC and in the config.toml file.)

[screenshot]

Then I got this issue when I asked Elia to use the Gemini 1.0 Pro model.
[screenshot]

Although I have installed the required and mentioned libraries on my PC, I am getting the below error repeatedly.
[screenshot]

I do not understand where to install the mentioned package.

Option to delete chats

Hi, there.

Does "archive" actually mean the chat contents are deleted (removed from the database)? If not:

  1. Is there a planned feature to browse and restore chats?
  2. Maybe add an option to simply delete chats (with confirmation, if possible)?

Thanks again! Regards

Display an error message if OPENAI_API_KEY is missing.

Elia requires the OPENAI_API_KEY environment variable. What if we check for it in the code and display an error message if it's not set in the environment?

I did a test for this. At first I wanted to put the error-check logic in Elia in app.py, but it didn't work properly:
the imported libraries also need OPENAI_API_KEY at import time.

Traceback (most recent call last):
  File "/root/venvs/elia_fork/bin/elia", line 3, in <module>
    from elia_chat.__main__ import cli
  File "/root/projects/elia_fork/elia_chat/__main__.py", line 15, in <module>
    from elia_chat.app import Elia
  File "/root/projects/elia_fork/elia_chat/app.py", line 14, in <module>
    from elia_chat.models import EliaContext
  File "/root/projects/elia_fork/elia_chat/models.py", line 9, in <module>
    from elia_chat.widgets.chat_options import GPTModel, MODEL_MAPPING, DEFAULT_MODEL
  File "/root/projects/elia_fork/elia_chat/widgets/chat_options.py", line 44, in <module>
    model=ChatOpenAI(
  File "/root/venvs/elia_fork/lib/python3.10/site-packages/langchain/load/serializable.py", line 97, in __init__
    super().__init__(**kwargs)
  File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for ChatOpenAI
__root__
  Did not find openai_api_key, please add an environment variable `OPENAI_API_KEY` which contains it, or pass  `openai_api_key` as a named parameter. (type=value_error)

For this to work, the check must be inserted in the middle of the imports; is that okay?

import os
import sys
from pathlib import Path
from typing import Optional

import click
import openai
from textual.app import App

# The check must run before the elia_chat imports below, which
# require OPENAI_API_KEY at import time.
if not os.getenv("OPENAI_API_KEY"):
    click.echo("Error: OPENAI_API_KEY environment variable is not set.", err=True)
    sys.exit(1)

from elia_chat.models import EliaContext
from elia_chat.screens.chat_screen import ChatScreen
...

No module named 'greenlet' after installing v1.0

โฏ elia
Creating database at PosixPath('/Users/adrian/.local/share/elia/elia.sqlite')
Traceback (most recent call last):
  File "/Users/adrian/.local/bin/elia", line 8, in <module>
    sys.exit(cli())
             ^^^^^
  File "/Users/adrian/.local/pipx/venvs/elia-chat/lib/python3.12/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/adrian/.local/pipx/venvs/elia-chat/lib/python3.12/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
         ^^^^^^^^^^^^^^^^
  File "/Users/adrian/.local/pipx/venvs/elia-chat/lib/python3.12/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/adrian/.local/pipx/venvs/elia-chat/lib/python3.12/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/adrian/.local/pipx/venvs/elia-chat/lib/python3.12/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/adrian/.local/pipx/venvs/elia-chat/lib/python3.12/site-packages/elia_chat/__main__.py", line 57, in default
    create_db_if_not_exists()
  File "/Users/adrian/.local/pipx/venvs/elia-chat/lib/python3.12/site-packages/elia_chat/__main__.py", line 29, in create_db_if_not_exists
    asyncio.run(create_database())
  File "/opt/homebrew/Cellar/python@3.12/3.12.3/Frameworks/Python.framework/Versions/3.12/lib/python3.12/asyncio/runners.py", line 194, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/python@3.12/3.12.3/Frameworks/Python.framework/Versions/3.12/lib/python3.12/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/python@3.12/3.12.3/Frameworks/Python.framework/Versions/3.12/lib/python3.12/asyncio/base_events.py", line 687, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/Users/adrian/.local/pipx/venvs/elia-chat/lib/python3.12/site-packages/elia_chat/database/database.py", line 16, in create_database
    async with engine.begin() as conn:
  File "/opt/homebrew/Cellar/python@3.12/3.12.3/Frameworks/Python.framework/Versions/3.12/lib/python3.12/contextlib.py", line 210, in __aenter__
    return await anext(self.gen)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/adrian/.local/pipx/venvs/elia-chat/lib/python3.12/site-packages/sqlalchemy/ext/asyncio/engine.py", line 1063, in begin
    async with conn:
  File "/Users/adrian/.local/pipx/venvs/elia-chat/lib/python3.12/site-packages/sqlalchemy/ext/asyncio/base.py", line 121, in __aenter__
    return await self.start(is_ctxmanager=True)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/adrian/.local/pipx/venvs/elia-chat/lib/python3.12/site-packages/sqlalchemy/ext/asyncio/engine.py", line 273, in start
    await greenlet_spawn(self.sync_engine.connect)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/adrian/.local/pipx/venvs/elia-chat/lib/python3.12/site-packages/sqlalchemy/util/concurrency.py", line 99, in greenlet_spawn
    _not_implemented()
  File "/Users/adrian/.local/pipx/venvs/elia-chat/lib/python3.12/site-packages/sqlalchemy/util/concurrency.py", line 79, in _not_implemented
    raise ValueError(
ValueError: the greenlet library is required to use this function. No module named 'greenlet'

Running

pipx inject elia-chat greenlet

fixed it.

reading api key from a config file

Like many other tools, could this one read the API key from the config file, as opposed to env vars or explicitly passing it?

Killer app! Linux / GNOME: no command flags or keybindings work, only `elia` and `elia --help` (solved)

shadmin@kea ~ ❯ elia -i -m gemini/gemini-1.5-flash-latest "How do I call Rust code from Python?"
Usage: elia [OPTIONS] COMMAND [ARGS]...
Try 'elia --help' for help.

Error: No such option: -i
shadmin@kea ~ ❯ elia --model claude-3-opus-20240229
Usage: elia [OPTIONS] COMMAND [ARGS]...
Try 'elia --help' for help.

Error: No such option: --model
Did you mean --help?
shadmin@kea ~ ❯ elia -m claude-3-opus-20240229
Usage: elia [OPTIONS] COMMAND [ARGS]...
Try 'elia --help' for help.

Error: No such option: -m
shadmin@kea ~ ❯ elia -m gpt-4o
Usage: elia [OPTIONS] COMMAND [ARGS]...
Try 'elia --help' for help.

Error: No such option: -m
shadmin@kea ~ ❯ elia -i "vim or nano?"
Usage: elia [OPTIONS] COMMAND [ARGS]...
Try 'elia --help' for help.

Error: No such option: -i
shadmin@kea ~ ❯ elia -h
Usage: elia [OPTIONS] COMMAND [ARGS]...
Try 'elia --help' for help.

Error: No such option: -h
shadmin@kea ~ ❯ elia --help
Usage: elia [OPTIONS] COMMAND [ARGS]...

  Elia: A terminal ChatGPT client built with Textual

Options:
  --help  Show this message and exit.

Commands:
  chat    Start Elia with a chat message
  import  Import ChatGPT Conversations
  reset   Reset the database
shadmin@kea ~ ❯
shadmin@kea ~ ❯ neofetch
shadmin@kea
-----------
OS: Ubuntu 22.04.4 LTS x86_64
Host: Dell G15 5511
Kernel: 6.5.0-35-generic
Uptime: 2 hours, 1 min
Packages: 2775 (dpkg), 11 (brew), 7 (flatpak), 28 (snap)
Shell: bash 5.1.16
Resolution: 3840x2160, 3840x2160
DE: GNOME 42.9
WM: Mutter
WM Theme: Adwaita
Theme: Yaru-magenta-dark [GTK2/3]
Icons: Yaru-magenta [GTK2/3]
Terminal: gnome-terminal
CPU: 11th Gen Intel i7-11800H (16) @ 4.600GHz
GPU: NVIDIA GeForce RTX 3060 Mobile / Max-Q
GPU: Intel TigerLake-H GT1 [UHD Graphics]
Memory: 10663MiB / 15710MiB

shadmin@kea ~ ❯
shadmin@kea ~ ❯ pipx --version
1.0.0

I have set the vars in .bashrc:

...
## LLM vendor api keys
export OPENAI_API_KEY=sk-****************************
export ANTHROPIC_API_KEY=sk-*************************
export GEMINI_API_KEY=AI*****************************
...

Please add a way to stop/cancel the inference, and Suggestions

Hi, there!

Nice project! Thank you for developing and maintaining it!

Maybe I missed something in the help, but is there a way to cancel an inference? If not, I think this could be really useful, especially when trying models locally (especially the big ones that may run very slowly).

Some suggestions (please feel free to ask me to open separate issues for the ones you may implement):

  1. Option to delete chats
  2. Add a secondary keybinding to send a message, maybe Shift+Enter or Ctrl+Enter. Personally, I would prefer that it worked the way some messengers can be configured: Enter to send and Shift/Ctrl+Enter to add a line break (maybe add a preference for this?)
  3. Other options for the app "theme"
  4. Tune some variables per chat, like system prompt, temperature, etc.
  5. Export the chat (as a Markdown file?)
  6. Add images to the chat (for multimodal models)

Thanks again. Regards
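
For the cancel request, the usual approach in an asyncio app like this is to hold the streaming task and cancel it from a keybinding handler; a minimal standalone sketch (the chunked generator stands in for the real streaming call):

```python
import asyncio

async def generate(chunks: list[str], out: list[str]) -> None:
    """Append chunks to `out`, simulating a streaming inference."""
    for chunk in chunks:
        out.append(chunk)
        await asyncio.sleep(0)  # yield control so cancellation can land

async def demo() -> list[str]:
    out: list[str] = []
    task = asyncio.create_task(generate(["Hel", "lo", "!"], out))
    await asyncio.sleep(0)  # let the task produce its first chunk
    task.cancel()           # the user pressed the cancel keybinding
    try:
        await task
    except asyncio.CancelledError:
        pass                # partial output in `out` is kept
    return out
```

The partial response already produced stays available after cancellation, which is what you'd want for slow local models.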

Where to set API key?

Just used pipx on Windows 10 to install elia. The install succeeded, but I received an error on:
export OPENAI_API_KEY="xxx..." (I replaced with my key)

'export' is not recognized as an internal or external command,
operable program or batch file.

When I run elia, it opens, but any message results in the error: "No API key provided."

I am not sure where to put my key. I looked for a JSON or text file and did not see an obviously named one. Please tell me where to insert it! I'm not a wiz with pipx, so I'd prefer a place to just copy/paste it into code.

Idea: perhaps add a textual modal screen within Elia where one can paste the api key?

Very excited to use this!

Copy to clipboard support

This would use pyperclip.

Focus a message by clicking it or navigating via tab/shift+tab.

Press a keybind and the Markdown will be copied to your system clipboard.
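pyperclip is one option; another approach some terminal applications use is the OSC 52 escape sequence, which asks the terminal emulator itself to set the system clipboard. A sketch of that technique — not necessarily how elia would implement it, and terminal support varies:

```python
import base64

def osc52_copy_sequence(text: str) -> str:
    """Build the OSC 52 escape sequence that asks the terminal
    emulator to place `text` on the system clipboard."""
    payload = base64.b64encode(text.encode("utf-8")).decode("ascii")
    return f"\x1b]52;c;{payload}\x07"

# Writing the sequence to the tty performs the copy in supporting terminals:
# sys.stdout.write(osc52_copy_sequence("hello")); sys.stdout.flush()
```

OSC 52 has the advantage of working over SSH, since the copy happens in the local terminal emulator rather than on the remote host.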

Setup for local ollama models

I do not understand how to set this up with my local ollama model.

Following the "Running local models" section of the README, it says "The location of the configuration file is noted at the bottom of the options window (ctrl+o)". This suggests running the program first and then pressing ctrl+o to find the config path, but I can't seem to run the program at all without an API key.

Running elia gives me this error:

Did not find openai_api_key, please add an environment variable `OPENAI_API_KEY` which contains it

So then I run with a dummy environment variable: OPENAI_API_KEY=foo elia

But then pressing ctrl+o does nothing at all.

Here's a screenshot:
[Screenshot from 2024-05-25 22-37-15]

How can I set up the config file for my local ollama mistral model?

Many thanks for your help.
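For reference, a config along these lines should select a local Ollama model once the config file is reachable (on Linux the file typically lives at ~/.config/elia/config.toml, but check the options window for the exact path). The field names below follow the README's example and should be double-checked against it; elia routes requests through litellm, so the `ollama/` prefix directs the call to a local Ollama server:

```toml
# ~/.config/elia/config.toml (assumed Linux path -- verify in the options window)

# Optionally make the local model the default.
default_model = "ollama/mistral"

[[models]]
# The "ollama/" prefix tells litellm to target the local Ollama server.
name = "ollama/mistral"
```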

Error even after setting my OPENAI_API_KEY

Hi,

I just installed elia on my laptop, but I couldn't get past the OpenAI validation despite having set the required environment variable. By the way, I'm on a Mac and use pyenv to manage my Python versions; currently I'm on 3.10.0, if any of that matters. Your app looks super cool and I'm looking forward to trying it! Thanks!

% echo $OPENAI_API_KEY
***************************

% elia
Traceback (most recent call last):
  File "/Users/mbonon/.local/bin/elia", line 5, in <module>
    from elia_chat.__main__ import cli
  File "/Users/mbonon/.local/pipx/venvs/elia-chat/lib/python3.10/site-packages/elia_chat/__main__.py", line 10, in <module>
    from elia_chat.app import Elia
  File "/Users/mbonon/.local/pipx/venvs/elia-chat/lib/python3.10/site-packages/elia_chat/app.py", line 8, in <module>
    from elia_chat.models import EliaContext
  File "/Users/mbonon/.local/pipx/venvs/elia-chat/lib/python3.10/site-packages/elia_chat/models.py", line 9, in <module>
    from elia_chat.widgets.chat_options import GPTModel, MODEL_MAPPING, DEFAULT_MODEL
  File "/Users/mbonon/.local/pipx/venvs/elia-chat/lib/python3.10/site-packages/elia_chat/widgets/chat_options.py", line 44, in <module>
    model=ChatOpenAI(
  File "/Users/mbonon/.local/pipx/venvs/elia-chat/lib/python3.10/site-packages/langchain/load/serializable.py", line 97, in __init__
    super().__init__(**kwargs)
  File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for ChatOpenAI
__root__
  Did not find openai_api_key, please add an environment variable `OPENAI_API_KEY` which contains it, or pass  `openai_api_key` as a named parameter. (type=value_error)

Sent messages are rendered as markdown

On the 0.5.0 release,

[image: the message as typed into the input box]

renders as:

[image: the same message rendered as Markdown]

And more generally, Markdown is rendered in whatever the user types. However, ChatGPT does not interpret input as Markdown as far as I know, so it might make sense to display messages coming from the user entirely as plain text?
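One way to address this without swapping out the display widget is to backslash-escape Markdown control characters in user messages before rendering. A sketch of that approach (alternatively, the user-message widget could simply render plain text):

```python
import re

# Characters that commonly trigger Markdown formatting in user input.
_MARKDOWN_SPECIALS = re.compile(r"([\\`*_{}\[\]()#+.!>-])")

def escape_markdown(text: str) -> str:
    """Backslash-escape Markdown syntax so user input renders literally."""
    return _MARKDOWN_SPECIALS.sub(r"\\\1", text)

print(escape_markdown("*not bold*"))  # \*not bold\*
```

Escaping preserves what the user typed character-for-character while keeping a single rendering path for both sides of the conversation.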

[Request] Authenticate Gemini via token

Right now, based on the instructions, it seems like Gemini authentication relies on a Google Cloud authentication process. Would it be possible to just use a Gemini API token, like some other apps do?
