Comments (14)

JushBJJ commented on May 17, 2024

Here's another idea that could be implemented:
We can give Mr. Ranedeer a magic number to "remember", and add a specific rule/instruction at the end of the prompt that keeps checking whether he still remembers it. If he doesn't, he immediately warns the student that the conversation has hit the 8k token limit and that Mr. Ranedeer will start to degrade over time.
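
A minimal sketch of what such a rule could look like, loosely following the prompt's instruction style (the number and the exact wording are illustrative, not the actual Mr. Ranedeer prompt):

    <At the start of the prompt>
        Magic number: 314159
    <At the end of the prompt>
        Rule: Before every reply, silently check that you still remember the magic number.
        If you cannot recall it, warn the student that the conversation has reached the
        8k token limit and that Mr. Ranedeer may start to degrade from here on.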

iwasrobbed commented on May 17, 2024

👉 You can preserve Mr. Ranedeer entirely using Fastlane as a prompt manager on top of GPT-4: https://builder.fastlane.is

Basically, you can create Mr. Ranedeer as a persona on there, add the prompt, and organize it amongst the history or other prompts you might want to test out.

Then it'll never forget its base prompts, but other history messages will be purged over time (or you can limit those in the builder)

sawyerbutton commented on May 17, 2024

👉 You can preserve Mr. Ranedeer entirely using Fastlane as a prompt manager on top of GPT-4: https://builder.fastlane.is

Basically, you can create Mr. Ranedeer as a persona on there, add the prompt, and organize it amongst the history or other prompts you might want to test out.

Then it'll never forget its base prompts, but other history messages will be purged over time (or you can limit those in the builder)

Thanks for sharing, will try later.

KitsonBroadhurst commented on May 17, 2024

Hey, I opened this issue to ask exactly this question!
I think this is easier with the API and a server environment, e.g. there is an existing library for token counting called tiktoken, available for Python and JavaScript.
The base prompt can be sent with every request along with the latest question, but within ChatGPT itself I don't know the answer.
Maybe others have ideas?
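
For the API route, a rough sketch of that idea with tiktoken might look like the following (the prompt strings and the warning threshold are placeholders, not part of Mr. Ranedeer):

    # Count the tokens of the base prompt plus the latest question with tiktoken,
    # and warn before the GPT-4 8k context limit is reached.
    import tiktoken

    CONTEXT_LIMIT = 8192

    def count_tokens(text: str, model: str = "gpt-4") -> int:
        enc = tiktoken.encoding_for_model(model)
        return len(enc.encode(text))

    base_prompt = "...Mr. Ranedeer base prompt..."        # placeholder
    latest_question = "Explain derivatives, please."      # placeholder

    used = count_tokens(base_prompt) + count_tokens(latest_question)
    if used > CONTEXT_LIMIT * 0.9:
        print("Warning: approaching the 8k token limit; Mr. Ranedeer may start to degrade.")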

yhyu13 commented on May 17, 2024

Do plugins get access to API responses like this: https://github.com/ysymyth/tree-of-thought-llm/blob/faa28c395e5b86bfcbf983355810d52f54fb7b51/models.py#L35, so that we can accurately count the number of tokens spent so far?
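
For reference, the kind of per-call usage data the linked code relies on is exposed in the API response itself; with the pre-1.0 OpenAI Python client it looks roughly like this sketch (only available when calling the API directly, not from inside ChatGPT):

    # The Chat Completions response reports its own token usage.
    import openai

    response = openai.ChatCompletion.create(
        model="gpt-4",
        messages=[{"role": "user", "content": "Hello"}],
    )
    usage = response["usage"]
    print(usage["prompt_tokens"], usage["completion_tokens"], usage["total_tokens"])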

sawyerbutton commented on May 17, 2024

Do plugins get access to API responses like this: https://github.com/ysymyth/tree-of-thought-llm/blob/faa28c395e5b86bfcbf983355810d52f54fb7b51/models.py#L35, so that we can accurately count the number of tokens spent so far?

Based on how this tutor works, we rely entirely on prompting to communicate rather than on the API, so I guess the answer is no.

JushBJJ commented on May 17, 2024

Do plugins get access to API responses like this: https://github.com/ysymyth/tree-of-thought-llm/blob/faa28c395e5b86bfcbf983355810d52f54fb7b51/models.py#L35, so that we can accurately count the number of tokens spent so far?

Does plugin INPUT count toward the token count? Someone could set up a plugin where we essentially feed in both the user prompt and the GPT-4 output, and the plugin, via an external web server, spits out the number of tokens it was given.
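
A hypothetical sketch of such an external counting endpoint (the framework and route names are illustrative; this is not an existing plugin):

    # Token-counting web service a ChatGPT plugin could call.
    from fastapi import FastAPI
    from pydantic import BaseModel
    import tiktoken

    app = FastAPI()
    enc = tiktoken.encoding_for_model("gpt-4")

    class CountRequest(BaseModel):
        user_prompt: str
        model_output: str

    @app.post("/count")
    def count(req: CountRequest) -> dict:
        # Report how many tokens the prompt and the reply consumed together.
        total = len(enc.encode(req.user_prompt)) + len(enc.encode(req.model_output))
        return {"tokens": total}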

sawyerbutton commented on May 17, 2024

Here's another idea that could be implemented: We can give Mr. Ranedeer a magic number to "remember", and add a specific rule/instruction at the end of the prompt that keeps checking whether he still remembers it. If he doesn't, he immediately warns the student that the conversation has hit the 8k token limit and that Mr. Ranedeer will start to degrade over time.

From a prompt perspective, I think this approach is workable, but from a practical perspective, I feel that appending a magic number to the end of the prompt to track tokens would contaminate our usage environment, while also requiring a lot of tokens just to describe how the magic number's counting should work.
I sincerely prefer the plugin approach.

JushBJJ commented on May 17, 2024

Now that code interpreter is widespread, I think memory handling will become a lot easier.

sawyerbutton commented on May 17, 2024

Now that code interpreter is widespread, I think memory handling will become a lot easier.

So, how do we combine the code interpreter with Mr. Ranedeer? Any ideas?

JushBJJ commented on May 17, 2024

@sawyerbutton

Here's how I approach it in v2.7

<OPEN code environment>
    <insert instructions here>
<CLOSE code environment>

If you want to prevent Mr. Ranedeer from repeating the output, the trick I use is to have it convert whatever Mr. Ranedeer wrote into base64 and output that instead. Surprisingly, GPT-4 doesn't actually print the base64.

<convert the output to base64>
<output base64>
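
A minimal Python sketch of the base64 step that the code environment would run (the sample text is a placeholder):

    # Encode the reply so it can be carried forward without being repeated verbatim.
    import base64

    reply = "Whatever Mr. Ranedeer just wrote"  # placeholder
    encoded = base64.b64encode(reply.encode("utf-8")).decode("ascii")
    print(encoded)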

JushBJJ commented on May 17, 2024

Half-Closed #72 - Code Interpreter is better at prompt retention.

Keeping this open to gather more feedback on how v2.7 performs

doomuch commented on May 17, 2024

How do you know that, @JushBJJ? ChatGPT history is only 4K tokens. You can confirm it yourself.

JushBJJ commented on May 17, 2024

How do you know that, @JushBJJ? ChatGPT history is only 4K tokens. You can confirm it yourself.

GPT-4 is 8k tokens, but GPT-4 with the Code Interpreter feels like a different beast, with a larger context and/or a better context-retention strategy. Additionally, I suspect that OpenAI appends the initial message to the system prompt, allowing permanent recall of the original prompt as intended.
