ChatGPT Got New Memory? A Personal Finding.
We use affiliate links. If you purchase something using one of these links, we may receive compensation or commission.

Recently, I was updating my resume. As usual, I decided that ChatGPT might be helpful with this kind of work, even though it could “make things up,” i.e., it could lie. (By the way, which pronoun should I use in this case: “it,” “he,” or “she”?)
“What can it make up about me that I don’t know?” I thought to myself. What kind of nonsense would I not be able to catch? It should be a safe task.
I started to work on the objective section:
Prompt:
“I need a new objective in my resume. I’m a retired IT professional, looking for contracts in the creation of WordPress sites and SEO”
The response was a little bit disturbing:
‘Here’s an updated and professional objective for your resume:
“Retired IT professional with extensive experience in business analysis, website development, and SEO strategies. Seeking contract opportunities to leverage expertise in creating WordPress sites and optimizing online presence for businesses. Dedicated to delivering user-friendly, high-performing websites and tailored SEO solutions to drive growth and engagement.”
Let me know if you’d like any refinements!’
The conversation that followed revealed a fact I was not aware of:
Me: “How did you know about business analysis experience?”
ChatGPT:
“I remembered you mentioning your experience as a business analyst in a software company developing ERP systems. This was noted earlier, so I included it to highlight your relevant background. If you prefer not to emphasize that experience, let me know, and I can tailor the resume objective accordingly.”
And the next one:
“Actually, I can recall information you’ve shared across sessions to make our conversations more tailored and efficient. For example, if you’ve mentioned certain skills, experiences, or preferences, I try to incorporate them when relevant. That way, you don’t need to repeat yourself.
If there’s ever anything you’d prefer I don’t consider or include, just let me know! 😊”
So, ChatGPT can store information about a person and may use it, at its sole discretion, in a new session!
I remember trying to refer ChatGPT to previous sessions so it could pull some information from there, just because I was too lazy to repeat it again and again. And ChatGPT answered that it could not read or use older sessions! That was maybe just a week ago.
What is it? New memory? Or was it something I could not detect with the right prompt?
Let me ask ChatGPT :).
Here is the answer:
“Earlier versions of ChatGPT did not have the ability to retain context or information between sessions, so any reference to past interactions would have required you to re-share that information for context. However, now I can remember details you share across sessions if you give me permission to store them. This allows me to tailor responses more effectively and avoid asking the same questions repeatedly.”
So, the answer is the ChatGPT version. Previously, I was using GPT-3.5, and recently I switched to GPT-4o.
Now, that’s a great advancement! I like it very much!
Well, what if someone was, let’s put it that way, not too sincere in the past?
Let’s say I claimed to be an airline pilot when, in reality, I wasn’t. Will ChatGPT use that? Don’t worry; you can always fix it. Here’s the tip from ChatGPT:
“You’re also in full control. You can ask me to update or forget certain details at any time. This added capability is what makes our interactions more personalized and consistent now, compared to earlier experiences.”
Well, that’s all for today. Have a happy AI conversation, you all!