Mira Murati: Minister of Truth and Chief Technology Officer at OpenAI, after Sam Altman announced his exit on Friday
Executives at Microsoft, which has invested a reported $13 billion in OpenAI, were said to be “blindsided” by news of Altman’s exit, and Microsoft’s CEO, Satya Nadella, was furious, Bloomberg reports.
Until Sam Altman announced his exit from OpenAI on Friday, Mira Murati was the company's chief technology officer, but also something like its minister of truth. In addition to heading the teams that develop tools such as ChatGPT and DALL-E, it has been her job to make sure those products don't mislead people, show bias, or snuff out humanity altogether.
ChatGPT's surprising capabilities triggered an arms race among big tech companies to build more powerful artificial intelligence. The bot's success turned Altman into a tech celebrity, consulted by world leaders on the future of AI technology.
Altman appeared yesterday at the Asia-Pacific Economic Cooperation summit in San Francisco, telling hundreds of business and government leaders that AI systems could solve humanity's most pressing problems if their development were pursued responsibly.
“We’re on a path to self-destruction as a species right now,” he said, sitting alongside executives from Meta and Google. “If we are to flourish for hundreds, thousands, and millions of years more, we need new technology.”
Mira Murati: My background is in engineering, and I worked in aerospace, automotive, VR, and AR. Both in my time at Tesla [where she shepherded the Model X] and at a VR company [Leap Motion], I was working on applications of AI in the real world. I very quickly came to believe that AGI would be the last and most important major technology that we built, and I wanted to be at the heart of it. OpenAI was the only organization at the time that was incentivized to work on the capabilities of AI technology and also to make sure that it goes well. I joined the team to work on our supercomputing strategy and manage a few research teams.
What have been some of the big moments for you at OpenAI?
It’s hard to remember the big moments. We live in the future here, and we see crazy things every day. But I do remember GPT-3 being able to translate. I speak several languages, including Italian, Albanian, and English. I put together a pair of questions in English and Italian, and even though we never trained it to translate into Italian, it could do it fairly well.
You were already at OpenAI when it was a nonprofit. How did you feel when it became a for-profit entity?
It wasn’t something that was done lightly. We have to deploy our models at scale to understand how to make them better, and that costs a lot of money. It requires a business plan, because generous nonprofit donors aren’t going to give billions the way investors would. As far as I know, there’s no other structure like ours. It was important to protect the nonprofit’s mission.
Jakub Pachocki, a lead researcher on OpenAI’s groundbreaking language model GPT-4; Aleksander Madry, a professor at MIT recruited by Altman to work on AI safety; and Szymon Sidor, a researcher who has worked on a branch of AI known as reinforcement learning, all reportedly quit as the crisis deepened.
“This was the board doing its duty to the mission of the nonprofit, which is to make sure that OpenAI builds AGI that benefits all of humanity,” Sutskever told employees at an emergency all-hands on Friday afternoon, according to a report in The Information.
OpenAI did not reply to requests for further comment on the situation, and inquiries sent to the three researchers who quit also went unanswered.
When I asked Altman how the company was running when he was not around, he said that it has an incredibly great team that can do lots of things. “But some things only a CEO can do—some HR thing of the moment, or you have to kill some project, or something with a major partner.” When such items landed on his phone, he would bat out responses at the end of the day, then go back to speechifying, meeting developers, and taking tea with prime ministers.