I was able to install the llama-cpp-python server with pip and the Local LLM plugin from BRAT, but I keep getting the following error when I try to use LLM Instruction:
Error: SyntaxError: Unexpected non-whitespace character after JSON at position 230
In the Obsidian console, this is the error message:
Error: SyntaxError: Unexpected non-whitespace character after JSON at position 230
    at JSON.parse (<anonymous>)
    at U (plugin:obsidian-local-llm:31:554)
I tried a couple of different prompts and three different models and got the same error each time. In the terminal window where the llama_cpp server is running, the text generation appears to complete, since the timing stats are printed at the end.
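For what it's worth, this looks like the kind of error `JSON.parse` throws when it is handed a body containing more than one JSON document, e.g. a streamed response that got read as one string instead of chunk by chunk. A minimal sketch (the payload below is made up, just two concatenated completion chunks, not the actual server output):

```javascript
// Hypothetical illustration: two newline-delimited JSON chunks, as a
// streaming server might send, concatenated into a single string.
const body = '{"choices":[{"text":"Hello"}]}\n{"choices":[{"text":" world"}]}';

try {
  // JSON.parse expects exactly one JSON document, so the second object
  // triggers "Unexpected non-whitespace character after JSON at position ..."
  JSON.parse(body);
} catch (e) {
  console.log(e instanceof SyntaxError); // true
}
```

If that's what is happening here, the plugin may be requesting a streamed completion but parsing the whole response at once, which would also explain why the server-side generation finishes normally.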