It doesn’t have “memory” of what it has generated previously, other than the current conversation. The answer you get from it won’t be much better than random guessing.
The model is only trained to handle 4k tokens (roughly 2,000 words, depending on complexity). Even if it had a log of everything it had been asked, it wouldn't be able to use any of it.
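To illustrate the point about the context window: anything that doesn't fit inside the token budget simply never reaches the model. The sketch below is a hypothetical illustration, not any vendor's actual tokenizer or API; the ~2-tokens-per-word heuristic and the function names are assumptions for the example.

```python
# Hypothetical sketch of why a fixed context window discards old history.
# The ~2 tokens per word heuristic is an assumption; real tokenizers differ.

def rough_token_count(text: str) -> int:
    """Very rough token estimate: ~2 tokens per word for complex text."""
    return 2 * len(text.split())

def fit_to_context(messages: list[str], budget: int = 4096) -> list[str]:
    """Keep only the most recent messages that fit inside the token budget."""
    kept, used = [], 0
    for msg in reversed(messages):          # walk from newest to oldest
        cost = rough_token_count(msg)
        if used + cost > budget:
            break                           # everything older is dropped
        kept.append(msg)
        used += cost
    return list(reversed(kept))             # restore chronological order

history = ["old question " * 600, "older answer " * 600, "latest question"]
window = fit_to_context(history)
# The oldest entry exceeds the remaining budget and is silently discarded.
```

Even a complete log of every past query would be truncated away like this before generation, which is why the model can't draw on it.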
Ignoring the huge privacy/liability issue… there are other LLMs than ChatGPT.
Llama 2 also exists, and you can't control that.