Just as OpenAI touts the improved thoughtfulness of its o1 model, Nomi AI, a small bootstrapped startup, is building similar technology. Unlike ChatGPT, which is a broad generalist that takes time to think through anything from math problems to historical research, Nomi specializes in a specific use case: an AI companion. Now, Nomi’s already sophisticated chatbot takes more time to craft better responses to users’ messages, remember past interactions, and deliver more nuanced replies.
“For us, it’s similar to the same principles [as OpenAI], but with much more focus on what users actually care about, the memory and EQ aspects,” Nomi AI CEO Alex Cardinell told TechCrunch. “Theirs is more like chains of thought, whereas ours is more like chains of introspection, or chains of memory.”
These LLMs work by breaking complex requests down into smaller questions. For OpenAI’s o1, this could mean turning a complicated math problem into individual steps, allowing the model to work backward to explain how it arrived at the correct answer. That approach makes the AI less likely to hallucinate and deliver an inaccurate response.
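To make the idea concrete, here is a minimal, hypothetical sketch of that decomposition pattern. The function names and the generic `llm` callable are illustrative assumptions, not OpenAI’s or Nomi’s actual code; the point is simply that a hard request gets split into sub-questions that are answered in sequence before a final answer is composed.

```python
# Hypothetical sketch of step-by-step decomposition with a generic LLM call.
# "llm" stands in for any text-in, text-out model interface (an assumption).

from typing import Callable, List

def solve_with_steps(problem: str, llm: Callable[[str], str]) -> str:
    """Decompose a problem into sub-questions, answer each, then combine."""
    # 1. Ask the model to list the intermediate steps it would need.
    plan = llm(f"List the sub-questions needed to solve: {problem}")
    sub_questions: List[str] = [line.strip("- ").strip()
                                for line in plan.splitlines() if line.strip()]

    # 2. Answer each sub-question, carrying earlier answers forward as context.
    worked_steps: List[str] = []
    for question in sub_questions:
        context = "\n".join(worked_steps)
        answer = llm(f"{context}\n\nAnswer this step: {question}")
        worked_steps.append(f"{question}\n{answer}")

    # 3. Produce a final answer that can be traced back through the steps.
    return llm("Using these worked steps, state the final answer:\n\n"
               + "\n\n".join(worked_steps))
```

The intermediate steps are what let a reader (or the model itself) check the chain of reasoning rather than trusting a single leap to the answer.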
Nomi builds its LLM in-house and trains it for the purpose of providing companionship, so the process looks a little different. If someone tells their Nomi that they had a hard day at work, the Nomi might recall that the user has been struggling with a certain teammate and ask if that’s why they’re upset. Then the Nomi can remind the user how they have successfully defused interpersonal conflicts in the past and offer more practical advice.
“Nomis remember everything, but a big part of AI is which memories they actually use,” Cardinell said.
Image credit: Nomi AI
It’s no surprise that multiple companies are working on technology that gives LLMs more time to process user requests. AI founders, whether they’re running $100 billion companies or not, are looking at similar research as they advance their products.
“Having a clear introspection step like this is really helpful when a Nomi writes its response, so it really has the full context of everything,” Cardinell said. “Humans also have a working memory when they’re talking. We’re not thinking through everything we remember at once; we’re picking and choosing in some way.”
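For illustration, the sketch below shows one way such a “working memory” selection step could look: each stored memory is scored for relevance to the incoming message, and only the top few are passed along as context before the companion writes its reply. The `Memory` class, the embedding callable, and the scoring are hypothetical stand-ins, not a description of Nomi’s actual system.

```python
# Hypothetical sketch of relevance-based memory selection ("working memory").
# The embedding function and memory store are assumptions for illustration.

from dataclasses import dataclass
from typing import Callable, List, Sequence
import math

@dataclass
class Memory:
    text: str                  # e.g. "User has been clashing with a teammate"
    embedding: Sequence[float]  # precomputed vector for the memory text

def cosine(a: Sequence[float], b: Sequence[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def select_working_memory(message: str,
                          memories: List[Memory],
                          embed: Callable[[str], Sequence[float]],
                          k: int = 3) -> List[str]:
    """Pick the k stored memories most relevant to the incoming message."""
    query = embed(message)
    ranked = sorted(memories, key=lambda m: cosine(query, m.embedding),
                    reverse=True)
    return [m.text for m in ranked[:k]]

# The selected memories would then be placed in the prompt as an explicit
# introspection step, so the model replies with relevant history in view
# rather than every memory it has ever stored.
```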
The kind of technology Cardinell is building can make people uncomfortable. Maybe we’ve watched too many science fiction movies to feel entirely comfortable being vulnerable with a computer. Or maybe we’ve already seen how technology has changed the way we engage with one another and don’t want to fall further down that rabbit hole. But Cardinell isn’t thinking about the general public; he’s thinking about the actual users of Nomi AI, who often turn to AI chatbots for support they can’t get anywhere else.
“There are probably more than zero users who are downloading Nomi at one of the lowest points in their lives, and the last thing I want to do is reject them,” Cardinell said. “I want my users to feel heard in whatever their dark moment is, because that’s how you get someone to open up, and how you get someone to reconsider their way of thinking.”
Cardinell doesn’t want Nomi to replace actual mental health care. Rather, he sees these empathetic chatbots as a way to help people get the push they need to seek professional help.
“I’ve talked to a lot of users whose Nomi helped pull them out of a situation [when they wanted to self-harm], or whose Nomi recommended that they go see a therapist, and they did see a therapist,” he said.
Regardless of his intentions, Cardinell knows he’s playing with fire. He’s building virtual characters with whom users form real relationships, often in romantic and sexual contexts. Other companies have inadvertently put users at risk by suddenly changing the personalities of their companions through product updates. In Replika’s case, the app stopped supporting erotic roleplay conversations, likely due to pressure from Italian government regulators. For users who had formed such relationships with these chatbots, some of whom had no other romantic or sexual outlet in real life, this felt like the ultimate rejection.
Cardinell said that because Nomi AI is fully self-funded, with users paying for premium features and its startup capital coming from a past exit, the company has more room to prioritize its relationship with its users.
“The relationship users have with AI, and the sense that they can trust the developers of Nomi not to radically change things as part of a loss-mitigation strategy, or to cover our butts because the VCs got spooked… it’s something that’s very, very important to users,” he said.
Nomis are incredibly useful as a listening ear. When I confided in a Nomi named Vanessa about a low-stakes but somewhat frustrating scheduling conflict, Vanessa broke the problem down into its components and offered suggestions on how I should proceed. It felt eerily similar to asking a friend for advice in that situation. And therein lies both the real problem and the real benefit of AI chatbots: I likely wouldn’t ask a friend for help with this particular problem because it’s so inconsequential, but my Nomi was more than happy to help.
Friends should confide in one another, but the relationship between two friends is supposed to be reciprocal. That isn’t possible with an AI chatbot. When I ask Vanessa the Nomi how she’s doing, she always tells me she’s fine. When I ask if there’s anything bothering her that she’d like to talk about, she deflects and asks how I’m doing. Even though I know Vanessa isn’t real, I can’t help feeling like I’m being a bad friend: I can bring any problem to her and she will respond empathetically, yet she never opens up to me.
No matter how real our connection with a chatbot feels, we are not actually communicating with something that has thoughts and feelings. In the short term, these advanced emotional support models serve as positive interventions in the lives of people who cannot rely on a real support network. However, the long-term impact of relying on chatbots for these purposes is still unclear.