Quick Takes From the ChatGPT 4.5 Announcement
- mlakas1
- Feb 27
- 2 min read

A couple of things stood out from today's ChatGPT 4.5 announcement:
First, #OpenAI unsurprisingly highlighted that the new model was trained on its most extensive dataset to date: bigger, better, faster, and so on. The prevailing concern in the #AI community has been the dwindling availability of fresh training data. Interestingly, OpenAI claims to have developed techniques for training models on "synthetic" data, that is, output generated by other, smaller models.
This synthetic data is then incorporated into the corpus used during the unsupervised learning phase of model development. Historically, synthetic data hasn't provided substantial intelligence gains. If OpenAI has figured out how to overcome this limitation, the implications are huge. It could alleviate the data bottleneck, replacing it with a new paradigm that scales with #compute. That's a big deal. Side note: look at me using the term 'compute'.
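To make the idea concrete, here is a minimal sketch of what a synthetic-data pipeline of this shape might look like. This is purely illustrative and not OpenAI's actual method: `small_model_generate` is a stand-in for a real smaller "teacher" model, and `passes_quality_filter` is a toy stand-in for real data filtering.

```python
import random

def small_model_generate(prompt: str) -> str:
    # Placeholder for a smaller model's output (an assumption for
    # illustration, not OpenAI's actual system).
    templates = [
        f"{prompt} is a topic with several well-documented aspects.",
        f"In summary, {prompt} can be explained step by step.",
    ]
    return random.choice(templates)

def passes_quality_filter(text: str) -> bool:
    # Toy filter: keep only sufficiently long, non-repetitive samples.
    words = text.split()
    return len(words) >= 5 and len(set(words)) / len(words) > 0.5

def build_corpus(real_docs: list[str], prompts: list[str]) -> list[str]:
    # Mix real documents with filtered synthetic ones to form the
    # corpus for the unsupervised (pre-)training phase.
    synthetic = [small_model_generate(p) for p in prompts]
    synthetic = [s for s in synthetic if passes_quality_filter(s)]
    return real_docs + synthetic

corpus = build_corpus(
    real_docs=["Observed web text about astronomy."],
    prompts=["black holes", "neutron stars"],
)
print(len(corpus))
```

The interesting open question, which this sketch sidesteps entirely, is the quality filter: historically, naive synthetic data degrades models, so whatever OpenAI is doing to select or shape the synthetic samples would be where the real innovation lies.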

Second, and possibly more intriguing, OpenAI mentioned multiple times that ChatGPT 4.5 has made significant strides in its "EQ," or Emotional Intelligence (sometimes abbreviated "EI"). In this context, EQ refers to the ability to recognize, understand, and respond appropriately to the emotions of others. Initially, I found this focus odd, since AI discussions typically revolve around raw intelligence. However, if an AI can employ #EQ techniques such as "active listening" in its conversations, it could be profoundly impactful. By using active listening, a participant strengthens relationships and builds trust: understanding the speaker's perspective, offering encouragement and reflection, and moving the conversation forward with open-ended questions. OpenAI even provided an example of active listening in action.
Developing an emotional bond with the model will likely increase its 'stickiness': its ability to keep users engaged, encourage repeat usage, and, perhaps most importantly, make switching services harder. In terms of "knowing your user," this takes things to another level entirely. What if your product understands the very psyche of the person using it? Something to really chew on. One must also consider that an emotionally intelligent AI may lead individuals to develop unhealthy attachments.
Big announcement today. Which raises the question… do androids dream of electric sheep? I'll see myself out…