• 0 Posts
  • 379 Comments
Joined 3 years ago
cake
Cake day: July 2nd, 2023

  • 100% not true if they were using a single session to check multiple grants.

    Every prompt you send includes your entire conversation with the chatbot so far; the history is resent as context on every turn, not stored by the model between turns. Once that history exceeds the chatbot's context window, its answers become less and less relevant.
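    A minimal sketch of that mechanism (hypothetical client, not any real API; the window is counted in messages here, while real models count tokens):

    ```python
    # Sketch: the whole conversation is resent with every prompt, but only
    # the most recent slice that fits the context window reaches the model.

    CONTEXT_WINDOW = 8  # hypothetical limit in messages (real limits are in tokens)

    history = []

    def send(prompt):
        history.append(("user", prompt))
        # Everything is sent, but the model only "sees" the tail that fits.
        visible = history[-CONTEXT_WINDOW:]
        reply = ("assistant", f"reply based on {len(visible)} of {len(history)} messages")
        history.append(reply)
        return reply

    for i in range(6):
        send(f"question {i}")

    # By now the earliest turns have fallen out of the window: the model no
    # longer sees "question 0" even though the user remembers asking it.
    ```

    That silent truncation is why long sessions drift: the model isn't ignoring you, it literally can't see the start of the conversation anymore.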

    You’ll notice this if you’ve ever had a chatbot guide you through something for an hour or more. It eventually gets something wrong, takes you down a rabbit hole, and goes in a big circle. At that point it can be very difficult to get the chatbot to simply respond to your prompt: if you say “you know what, let’s talk about _______ instead,” it will keep talking about whatever you were talking about before, staying in its dumb rabbit-hole loop.

    So if they did this with multiple grants in one session, eventually it would basically infer they’re looking for “yes, that’s DEI” and just respond with different versions of that ad nauseam.