Comments (14)
Here's another idea that could be implemented:
We can add a magic number for Mr. Ranedeer to "remember", plus a specific rule/instruction at the end of the prompt telling him to keep checking whether he still remembers this magic number. If he doesn't, he immediately warns the student that the conversation has hit the 8k-token limit and Mr. Ranedeer will start to degrade over time.
from mr.-ranedeer-ai-tutor.
You can preserve Mr. Ranedeer entirely by using Fastlane as a prompt manager on top of GPT-4: https://builder.fastlane.is
Basically, you can create Mr. Ranedeer as a persona there, add the prompt, and organize it among the history or other prompts you might want to test out.
Then it'll never forget its base prompt, but other history messages will be purged over time (or you can limit those in the builder).
Thanks for sharing, I'll try it later.
Hey, I opened this issue to ask that question!
I think this is easier with the API in a server environment: e.g. there is an existing library for token counting called tiktoken, available for Python and JavaScript.
The base prompt can be sent with every request along with the latest question, but for ChatGPT itself I don't know the answer.
Maybe others have ideas?
Do plugins get access to API responses like this: https://github.com/ysymyth/tree-of-thought-llm/blob/faa28c395e5b86bfcbf983355810d52f54fb7b51/models.py#L35, so that we can accurately count the number of tokens spent so far?
Based on how this tutor works, we rely entirely on the prompt rather than the API, so I guess the answer is no.
Does plugin input count toward the token count? Someone could set up a plugin where we essentially input both the user prompt and the GPT-4 output, and the plugin uses an external web server to return the number of tokens it was given.
From a prompt perspective, I think this approach is workable, but from a practical perspective, I feel that appending a magic number to the end of the prompt to try to track tokens would contaminate our usage environment, while also requiring a lot of tokens just to describe how the magic number should be counted.
I sincerely prefer the plugin approach.
Now that code interpreter is widespread, I think memory handling will become a lot easier.
So, how do we combine the code interpreter with Mr. Ranedeer? Any ideas?
Here's how I approach it in v2.7:
<OPEN code environment>
<insert instructions here>
<CLOSE code environment>
If you want to prevent Mr. Ranedeer from repeating the output, the trick I use is to convert whatever Mr. Ranedeer wrote into base64 and output it. Surprisingly, GPT-4 doesn't output the base64.
<convert the output to base64>
<output base64>
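Inside the code environment, the base64 step is plain Python. A minimal sketch (the lesson text is a made-up example, not actual Mr. Ranedeer output):

```python
import base64

# Hypothetical text Mr. Ranedeer produced and should not repeat verbatim
lesson_output = "Lesson plan: 1. Limits  2. Derivatives  3. Integrals"

# Encode it; GPT-4 tends to echo the opaque base64 rather than the plaintext
encoded = base64.b64encode(lesson_output.encode("utf-8")).decode("ascii")
print(encoded)
```

Since base64 is reversible, decoding the string recovers the original text, so nothing is lost by storing the output this way.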
Half-Closed #72 - Code Interpreter is better at prompt retention.
Keeping this open to gather more feedback on how v2.7 performs
How do you know that, @JushBJJ? ChatGPT's history is only 4k tokens. You can confirm it yourself.
GPT-4 is 8k tokens, but GPT-4 with Interpreter feels like a different beast with a larger context or a better context-retention strategy. Additionally, I suspect that OpenAI has appended the initial message to the system prompt, allowing permanent recall of the original prompt as intended.