More and more people are describing AI as a whole new species, one that threatens to supplant humans in the long run.
Former Microsoft Chief Research and Strategy Officer Craig Mundie offers a nuanced perspective on this existential question, outlining three potential outcomes for humanity’s relationship with artificial intelligence. In a recent interview, Mundie distinguished between AI’s current “tool phase” and what he sees as an inevitable progression toward co-evolution, emphasizing that humans still have agency in shaping this trajectory — for now.

“Right now, humans have entered the tool phase vis-à-vis AI. It’s gotten to be powerful enough that it’s clearly a beneficial tool,” Mundie explained. “What people haven’t thought through is this is the first technology in human history that doesn’t stop at tool as you get into this period of time.”
This distinction is crucial to Mundie’s framework. Unlike previous technological advances — from the printing press to the internet — AI represents something fundamentally different: a technology that will eventually transcend its role as a mere instrument. “The real question for me is how do humans take advantage of the partnership that they — I think we can guarantee is still available to us in this coexistence period — in order to decide how do the humans themselves want to prepare for the world of ultimately co-evolution,” he said.
Mundie emphasized that humans retain some control during this critical window, largely because AI didn’t emerge from external forces. “This AI, this super intelligence thing, did not arrive on a spaceship. We made it, we’re still making it. As a result, we have some control over it. We may not know how to extend that control in perpetuity, but at least in this finite period of time, I think we can influence that outcome, whether it’s in health or intellect or input-output systems or whatever it might be, in a partnership with these machines.”
However, this period of human agency is temporary. “Once you get out of that period, you are on a co-evolutionary path,” Mundie warned. “And you have to think, okay, where does that end up? There’s only a few possible outcomes. One is that one becomes irrelevant relative to the other. One is that there’s a long-term symbiotic relationship. And the third is there’s some kind of hybridization between the two evolved environments or species.”
These three scenarios — irrelevance, symbiosis, or hybridization — represent what Mundie calls “unknowable outcomes.” His prescription is pragmatic: “All you can do is try to hedge the bets for humans by making sure that in the early stages we retain a real partnership symbiotic relationship between the evolving machine and the humans.”
Other people in the tech world have made similar comparisons. Microsoft AI CEO Mustafa Suleyman has said that AI will feel like a new digital species. Ray Kurzweil has said that humans will need to merge with AI and become a hybrid species. ‘Sapiens’ author Yuval Noah Harari has said that AI could become a new species that will take over the planet, and Turing Award winner Judea Pearl has even said that AI could one day keep humans as pets.
Mundie’s perspective arrives at a pivotal moment in AI development. Recent advances in large language models, robotics, and AI reasoning capabilities have accelerated discussions about artificial general intelligence (AGI) and its implications. Major tech leaders, from OpenAI’s Sam Altman to Google DeepMind’s Demis Hassabis, have acknowledged that we may be approaching AGI within the decade. Meanwhile, concerns about AI alignment and control have prompted researchers and policymakers alike to call for regulation and safety research. Mundie’s framework suggests that while we debate governance and safety measures, we should also be preparing for a future where the very nature of human-AI interaction fundamentally transforms: moving beyond the current paradigm, in which we simply use AI tools, to one in which we co-evolve with artificial intelligence as a distinct form of intelligence in its own right.