Google: The AI Company
In this episode of Acquired, hosts Ben Gilbert and David Rosenthal explore Google's evolution into an AI powerhouse, examining the challenges and strategic dilemmas the company faces in the curre...
Transcript
spk_0
I went and looked at a studio, well a little office that was going to turn into a studio nearby,
spk_0
but it was not good at all.
spk_0
It had a drop ceiling, so I could hear the guy in the office next to me.
spk_0
You would be able to hear him talking about episodes.
spk_0
Third co-host.
spk_0
Third co-host.
spk_0
Is it Howard?
spk_0
No, it was like a lawyer.
spk_0
It seemed to be like talking through some horrible problem that I didn't want to listen to,
spk_0
but I could hear every word.
spk_0
Does he want millions of people listening to this?
spk_0
Right.
spk_0
Right.
spk_0
All right.
spk_0
All right.
spk_0
Let's do a podcast.
spk_0
Let's do a podcast.
spk_0
Yeah.
spk_0
Who got the truth?
spk_0
Is it you?
spk_0
Is it you?
spk_0
Is it you?
spk_0
Is it you?
spk_0
Who got the truth now?
spk_0
Who?
spk_0
Is it you?
spk_0
Is it you?
spk_0
Is it you?
spk_0
Sit me down.
spk_0
Say it straight.
spk_0
Another story on the way.
spk_0
Who got the truth?
spk_0
Welcome to the Fall 2025 season of Acquired, the podcast about great companies and the stories
spk_0
and playbooks behind them.
spk_0
I'm Ben Gilbert.
spk_0
I'm David Rosenthal.
spk_0
And we are your hosts.
spk_0
Here's a dilemma.
spk_0
Imagine you have a profitable business.
spk_0
You make giant margins on every single unit you sell.
spk_0
And the market you compete in is also giant.
spk_0
One of the largest in the world, you might say.
spk_0
But then on top of that, lucky for you, you also are a monopoly in that giant market, with 90% share and a lot of lock-in.
spk_0
And when you say monopoly, monopoly as defined by the US government.
spk_0
That is correct.
spk_0
But then imagine this.
spk_0
In your research lab, your brilliant scientists come up with an invention.
spk_0
This particular invention, when combined with a whole bunch of your old inventions by all your other brilliant scientists,
spk_0
turns out to create the product that is much better for most purposes than your current product.
spk_0
So you launch the new product based on this new invention, right?
spk_0
Right.
spk_0
I mean, especially because out of pure benevolence, your scientists had published research papers about how awesome the new invention is and lots of the inventions before also.
spk_0
So now there's new startup competitors quickly commercializing that invention.
spk_0
So of course, David, you change your whole product to be based on the new thing, right?
spk_0
This sounds like a movie.
spk_0
Yes, but here is the problem.
spk_0
You haven't figured out how to make this new incredible product anywhere near as profitable as your old giant cash printing business.
spk_0
So maybe you shouldn't launch that new product.
spk_0
David, this sounds like quite the dilemma to me.
spk_0
Of course, listeners, this is Google today, in perhaps the most classic textbook case of the innovator's dilemma ever.
spk_0
The entire AI revolution that we are in right now is predicated on the invention of the transformer, out of the Google Brain team in 2017.
spk_0
So think OpenAI and ChatGPT, and Anthropic, and Nvidia hitting all-time highs. All the craziness right now depends on that one research paper published by Google in 2017.
spk_0
And consider this not only did Google have the densest concentration of AI talent in the world 10 years ago that led to this breakthrough.
spk_0
But today they have just about the best collection of assets that you could possibly ask for.
spk_0
They've got a top tier AI model with Gemini.
spk_0
They don't rely on some public cloud to host their model.
spk_0
They have their own, in Google Cloud, which now does $50 billion in revenue. That is real scale.
spk_0
They're a chip company with their tensor processing units or TPUs, which is the only real scale deployment of AI chips in the world besides Nvidia GPUs.
spk_0
Maybe AMD, maybe, but these are definitely the top two. Somebody put it to me in research that if you don't have a foundational frontier model or you don't have an AI chip, you might just be a commodity in the AI market, and Google is the only company that has both.
spk_0
Google still has a crazy bench of talent, and despite ChatGPT becoming kind of the Kleenex of the era, Google does still own the text box.
spk_0
The single one that is the front door to the internet for the vast majority of people anytime anyone has intent to do anything online.
spk_0
But the question remains what should Google do strategically?
spk_0
Should they risk it all and lean into their birthright to win in artificial intelligence?
spk_0
Or will protecting their gobs of profits from search hamstring them as the AI wave passes them by?
spk_0
But perhaps first we must answer the question, how did Google get here David Rosenthal?
spk_0
So listeners today we tell the story of Google the AI company.
spk_0
Who?
spk_0
You like that David?
spk_0
I love it.
spk_0
Did you hire like a Hollywood scriptwriting consultant without telling me?
spk_0
I wrote that 100% myself with no AI. Thank you very much.
spk_0
No AI.
spk_0
Listeners, if you want to know every time an episode drops, vote on future episode topics or get access to corrections from past episodes, check out our email list.
spk_0
That's acquired.fm slash email.
spk_0
Come talk about this episode with the entire acquired community in Slack after you listen. That's acquired.fm slash Slack.
spk_0
Speaking of the acquired community, we have an anniversary celebration coming up.
spk_0
We do.
spk_0
10 years of the show.
spk_0
We're going to do an open zoom call with everyone to celebrate, kind of like how we used to do our LP calls back in the day with LPs.
spk_0
And we are going to do that on October 20th, 2025 at 4pm Pacific time.
spk_0
Check out the show notes for more details.
spk_0
If you want more acquired, check out our interview show ACQ2.
spk_0
Our last interview was super fun.
spk_0
We sat down with Tobi Lütke, the founder and CEO of Shopify, about how AI has changed his life and where he thinks it will go from here.
spk_0
So search ACQ2 in any podcast player.
spk_0
And before we dive in, we want to briefly thank our presenting partner, JP Morgan Payments.
spk_0
Yes. Just like how we say every company has a story.
spk_0
Every company's story is powered by payments and JP Morgan Payments is a part of so many of their journeys from seed to IPO and beyond.
spk_0
So with that, this show is not investment advice.
spk_0
David and I may have investments in the companies we discuss.
spk_0
And this show is for informational and entertainment purposes only. David, Google: the AI company.
spk_0
So Ben, as you were alluding to in that fantastic intro, really upping the game.
spk_0
If we rewind 10 years ago from today, before the transformer paper comes out, all of the following people, as we've talked about before, were at Google.
spk_0
And then we have our Google employees: Ilya Sutskever, founding chief scientist of OpenAI, who along with Geoff Hinton and Alex Krizhevsky had done the seminal AI work on AlexNet, published just a few years before. All three of them were Google employees.
spk_0
As was Dario Amodei, the founder of Anthropic; Andrej Karpathy, chief scientist at Tesla until recently; Andrew Ng; Sebastian Thrun; Noam Shazeer; all the DeepMind folks, Demis Hassabis, Shane Legg, Mustafa Suleyman. Mustafa now, in addition to in the past having been a founder of DeepMind, runs AI at Microsoft.
spk_0
Basically every single person of note in AI worked at Google, with the one exception of Yann LeCun, who worked at Facebook.
spk_0
Yeah, it's pretty difficult to trace a big AI lab now back and not find Google in its origin story.
spk_0
Yeah, I mean, the analogy here is it's almost as if at the dawn of the computer era itself, a single company like say IBM had hired every single person who knows how to code.
spk_0
So it'd be like, you know, if anybody else wants to write a computer program, oh, sorry, you can't do that.
spk_0
Anybody who knows how to program works at IBM. This is how it was with AI and Google in the mid 2010s.
spk_0
But learning how to program a computer wasn't so hard that people out there couldn't learn how to do it.
spk_0
Learning how to be an AI researcher was significantly more difficult.
spk_0
Right. It was the stuff of very specific PhD programs with a very limited set of advisors, and a lot of infighting about where the direction of the field was going.
spk_0
What was legitimate versus what was crazy, heretical, religious stuff?
spk_0
Yep. So then yes, the question is how do we get to this point?
spk_0
Well, it goes back to the start of the company. Larry Page always thought of Google as an artificial intelligence company.
spk_0
And in fact, Larry Page's dad was a computer science professor and had done his PhD at the University of Michigan in machine learning and artificial intelligence, which was not a popular field in computer science back then.
spk_0
Yeah. In fact, a lot of people thought specializing in AI was a waste of time because so many of the big theories from 30 years prior to that had been kind of disproven at that point, or at least people thought they were disproven.
spk_0
And so it was frankly contrarian for Larry's dad to spend his life and career on research work in AI.
spk_0
And that rubbed off on Larry. I mean, if you squint, PageRank, the PageRank algorithm that Google was founded upon, is a statistical method.
spk_0
You could classify it as part of AI within computer science.
spk_0
And Larry, of course, was always dreaming much, much bigger. There's the quote that we've said before on this show in the year 2000, two years after Google's founding, when Larry says artificial intelligence would be the ultimate version of Google.
spk_0
If we had the ultimate search engine, it would understand everything on the web. It would understand exactly what you wanted. And it would give you the right thing.
spk_0
That's obviously artificial intelligence. We're nowhere near doing that now. However, we can get incrementally closer. And that is basically what we work on here.
spk_0
It's always been an AI company.
spk_0
Yep. And that was in 2000.
spk_0
Well, one day, in either late 2000 or early 2001, the timelines are a bit hazy here.
spk_0
A Google engineer named George Herrick is talking over lunch with Ben Gomes, famous Google engineer, who I think would go on to lead search.
spk_0
And a relatively new engineering hire named Nome Shazir.
spk_0
Now, George was one of Google's first 10 employees, incredible engineer. And just like Larry Page's dad, he had a PhD in machine learning from the University of Michigan.
spk_0
And even when George went there, it was still a relatively rare contrarian subfield within computer science.
spk_0
So the three of them are having lunch. And George says offhandedly to the group that he has a theory from his time as a PhD student that compressing data is technically equivalent to understanding it.
spk_0
And the thought process is: if you can take a given piece of information, make it smaller, store it away, and then later re-instantiate it in its original form, the only way you could possibly do that is if whatever force is acting on the data actually understands what it means, because you're losing information going down to something smaller.
spk_0
And then re-creating the original thing. It's like you're a kid in school: you learn something in school, you read a long textbook, you store the information in your memory, then you take a test to see if you really understood the material.
spk_0
And if you can recreate the concepts, then you really understand it.
spk_0
Which kind of foreshadows today: big LLMs are like compressing the entire world's knowledge into some number of terabytes, this smashed-down little vector set, little at least compared to all the information in the world.
spk_0
But it's kind of that idea, right? You can store all the world's information in an AI model in something that is like kind of incomprehensible and hard to understand. But then if you uncompress it, you can kind of bring knowledge back to its original form.
spk_0
And these models demonstrate understanding, right?
spk_0
Man, do they? That's the question. That's the question. They certainly mimic understanding.
spk_0
So this conversation is happening. This is 25 years ago. And Noam, the new hire, the young buck, sort of stops in his tracks and he's like, wow, if that's true, that's really profound.
spk_0
Is this in one of Google's microkitchens?
spk_0
This is in one of Google's microkitchens. They're having lunch.
spk_0
Where did you find this, by the way? This is 25 years old.
spk_0
This is In the Plex. This is like a small little passage in Steven Levy's great book that's been a source for all of our Google episodes, In the Plex.
spk_0
There's a small little throwaway passage in here about this, because this book came out before ChatGPT and AI and all that.
spk_0
So Noam kind of latches on to George and keeps vibing on this idea. And over the next couple months, the two of them decide, in the most Google-y fashion possible, that they are just going to stop working on everything else.
spk_0
And they're going to go work on this idea, on language models and compressing data, and can they generate machine understanding from data.
spk_0
And if they can do that, that would be good for Google. I think this coincides with that period in 2001 when Larry Page fired all the managers in the engineering organization.
spk_0
So everybody was just doing whatever they wanted to do.
spk_0
Funny.
spk_0
So there's this great quote from George in the book.
spk_0
"A large number of people thought it was a really bad thing for Noam and I to spend our talents on, but Sanjay Ghemawat," Sanjay, of course, being Jeff Dean's famously prolific coding partner, "thought it was cool."
spk_0
So George would posit the following argument to any doubters that they came across.
spk_0
Sanjay thinks it's a good idea, and no one in the world is as smart as Sanjay. So why should Noam and I accept your view that it's a bad idea?
spk_0
So if you beat the best team in football, are you the new best team in football, no matter what?
spk_0
Yeah. So all of this ends up taking Noam and George deep down the rabbit hole of probabilistic models for natural language. Meaning: for any given sequence of words that appears on the internet,
spk_0
What is the probability for another specific sequence of words to follow?
spk_0
So this should sound pretty familiar to anybody who knows how LLMs work today.
spk_0
Oh, kind of like a next word predictor.
spk_0
Yeah, or a next token predictor if you generalize it.
spk_0
Yep.
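That next-word-prediction idea can be sketched in a few lines with a toy bigram model. Everything here, the tiny corpus and the function names, is illustrative, not anything from Google's actual system:

```python
from collections import Counter, defaultdict

def train_bigram_model(corpus):
    """Count, for each word, which words follow it and how often."""
    counts = defaultdict(Counter)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(model, word):
    """Return the most frequent next word after `word`, or None if unseen."""
    followers = model.get(word)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

model = train_bigram_model("the cat sat on the mat and the cat ate")
print(predict_next(model, "the"))  # "cat" follows "the" twice, "mat" once
```

Real language models condition on much longer contexts than one word, but the core move, predicting what comes next from observed frequencies, is the same.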
spk_0
So the first thing that they do with this work is they create the "did you mean" spelling correction in Google Search.
spk_0
Oh, that came out of this.
spk_0
That came out of this. Noam created this.
spk_0
So this is huge for Google, because obviously it's a bad user experience when you mistype a query and then need to type another one. But it's also a tax on Google's infrastructure, because every time one of these mistyped queries comes in, Google's infrastructure goes and serves results for that query that are useless and immediately overwritten by the new one.
spk_0
Right.
spk_0
And they retype it.
spk_0
And if it's really high confidence, then you actually just correct it without even asking them and then ask them if they want to opt out instead of opting in.
spk_0
It's a great feature and it's sort of a great first use case for this in a very narrowly scoped domain.
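A "did you mean" style corrector in this narrowly scoped spirit can be sketched as a lookup over candidate words one edit away, ranked by how common they are. This is a Norvig-style toy, not Google's actual implementation; the vocabulary and the confidence threshold are made up for illustration:

```python
from collections import Counter

LETTERS = "abcdefghijklmnopqrstuvwxyz"

def edits1(word):
    """Every string one edit (delete, transpose, replace, insert) from `word`."""
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = [a + b[1:] for a, b in splits if b]
    transposes = [a + b[1] + b[0] + b[2:] for a, b in splits if len(b) > 1]
    replaces = [a + c + b[1:] for a, b in splits if b for c in LETTERS]
    inserts = [a + c + b for a, b in splits for c in LETTERS]
    return set(deletes + transposes + replaces + inserts)

def did_you_mean(query, vocab, confidence=10):
    """Return (suggestion, auto_corrected). Auto-correct only in the
    high-confidence case: the candidate is far more common than the typo."""
    if query in vocab:
        return query, False
    candidates = [w for w in edits1(query) if w in vocab]
    if not candidates:
        return query, False
    best = max(candidates, key=lambda w: vocab[w])
    auto = vocab[best] >= confidence * vocab.get(query, 1)
    return best, auto

vocab = Counter({"google": 1000, "search": 800, "goggle": 5})
print(did_you_mean("googel", vocab))  # suggests "google" and auto-corrects
```

The `confidence` knob mirrors the opt-out behavior described above: only a clearly dominant candidate gets applied without asking.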
spk_0
Totally. So they keep working on it.
spk_0
Noam and George.
spk_0
And they end up creating a fairly large, and I'm using "large" in quotes here for the time, language model that they affectionately call PHIL: the Probabilistic Hierarchical Inferential Learner.
spk_0
These AI researchers love creating their acronyms.
spk_0
They love their word buttons.
spk_0
Yeah.
spk_0
So fast forward to 2003, and Susan Wojcicki and Jeff Dean are getting ready to launch AdSense.
spk_0
They need a way to understand the content of these third party web pages, the publishers, in order to run the Google ad corpus against them.
spk_0
Well, PHIL is the tool that they use to do it.
spk_0
Huh, I had no idea that language models were involved in this.
spk_0
Yeah. So Jeff Dean borrows PHIL and famously uses it to code up his implementation of AdSense in a week, because he's Jeff Dean.
spk_0
And boom, AdSense. I mean, this is billions of dollars of new revenue to Google overnight, because it's the same corpus of ads, the AdWords search ads, that they're now serving on third-party pages.
spk_0
They just massively expanded the inventory for the ads that they already have in the system. Thanks to PHIL.
spk_0
Thanks to PHIL.
spk_0
All right.
spk_0
This is a moment where we got to stop and just give some Jeff Dean facts.
spk_0
Jeff Dean is going to be the through line of this episode of, wait, how did Google pull that off?
spk_0
How did Jeff Dean just go home and over the weekend rewrite some entire giant distributed system and figure out all of Google's problems?
spk_0
Back when Chuck Norris facts were big, Jeff Dean facts became a thing internally at Google.
spk_0
I just want to give you some of my favorites.
spk_0
The speed of light in a vacuum used to be about 35 miles per hour. Then Jeff Dean spent a weekend optimizing physics.
spk_0
So good.
spk_0
Jeff Dean's PIN is the last four digits of pi.
spk_0
Oh,
spk_0
Only Googlers would come up with these.
spk_0
Yes. To Jeff Dean, NP means "no problemo."
spk_0
Oh, yeah. I've seen that one before. I think that one's my favorite.
spk_0
Yes.
spk_0
Oh, so good.
spk_0
Also a wonderful human being who we spoke to in research and was very, very helpful.
spk_0
Thank you Jeff. Yes.
spk_0
So language models definitely work.
spk_0
Definitely going to drive a lot of value for Google.
spk_0
And they also fit pretty beautifully into Google's mission to organize the world's information and make it universally accessible and useful.
spk_0
If you can understand the world's information and compress it and then recreate it,
spk_0
yeah, that fits the mission, I think. I think that checks the box.
spk_0
Absolutely.
spk_0
So PHIL gets so big that apparently by the mid-2000s, PHIL is using 15% of Google's entire data center infrastructure.
spk_0
And I assume a lot of that is AdSense ad serving, but also did you mean and all the other stuff that they start using it for within Google.
spk_0
So early natural language systems, computationally expensive.
spk_0
Yes. So, okay. Now mid 2000s, fast forward to 2007, which is a very, very big year for the purposes of our story.
spk_0
Google had just recently launched the Google Translate product.
spk_0
This is the era of all the great, great products coming out of Google that we've talked about.
spk_0
Your Maps and Gmail and Docs and all the wonderful things; Chrome and Android are going to come later.
spk_0
They had like a 10 year run where they basically launched everything you know of at Google except for search.
spk_0
Truly in a 10 year run.
spk_0
And then there were about 10 years after that from 2013 on where they basically didn't launch any new products that you've heard about until we get to Gemini, which is this fascinating thing.
spk_0
But this 2003 to 2013 era was just so rich, with hit after hit after hit. Magical.
spk_0
And so one of those products was Google Translate, you know, not the same level of user base or perhaps impact on the world as Gmail or maps or whatnot.
spk_0
But still a magical magical product.
spk_0
And the chief architect for Google Translate was another incredible machine learning PhD named Franz Och.
spk_0
So Franz had a background in natural language processing and machine learning, and that was his PhD. He's German.
spk_0
He got his PhD in Germany.
spk_0
At the time, DARPA, the Defense Advanced Research Projects Agency, a division of the government, had one of their famous challenges going for machine translation.
spk_0
So Google and Franz of course enter this, and Franz builds an even larger language model that blows away the competition in that year's version of the DARPA challenge, this side of 2006 or 2007, and gets an astronomically high BLEU score for the time. BLEU, the bilingual evaluation understudy, is the sort of algorithmic benchmark for judging the quality of translations.
spk_0
At the time higher than anything else possible.
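The BLEU idea can be sketched in miniature. Real BLEU averages over multiple reference translations and n-grams up to length 4; this single-reference, bigram-only version is just illustrative:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-word sequences in a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def modified_precision(candidate, reference, n):
    """Fraction of candidate n-grams found in the reference, clipped so a
    candidate can't earn extra credit by repeating one n-gram."""
    cand = Counter(ngrams(candidate, n))
    ref = Counter(ngrams(reference, n))
    overlap = sum(min(count, ref[g]) for g, count in cand.items())
    total = sum(cand.values())
    return overlap / total if total else 0.0

def bleu(candidate, reference, max_n=2):
    """Toy BLEU: geometric mean of n-gram precisions times a brevity
    penalty that punishes translations shorter than the reference."""
    precisions = [modified_precision(candidate, reference, n)
                  for n in range(1, max_n + 1)]
    if min(precisions) == 0:
        return 0.0
    log_avg = sum(math.log(p) for p in precisions) / max_n
    bp = min(1.0, math.exp(1 - len(reference) / len(candidate)))
    return bp * math.exp(log_avg)

ref = "the cat is on the mat".split()
print(bleu("the cat is on the mat".split(), ref))  # perfect match scores 1.0
```

Scores run from 0 to 1, so "astronomically high for the time" means unusually close to the reference translations on this kind of overlap metric.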
spk_0
So Jeff Dean hears about this and the work that Franz and the Translate team have done, and he's like, this is great.
spk_0
This is amazing.
spk_0
When are you guys going to ship this in production?
spk_0
Oh, I heard this story.
spk_0
So Jeff and Noam talk about this on the Dwarkesh podcast.
spk_0
Yes.
spk_0
That episode is so so good.
spk_0
And Franz is like, no, no, no, no, Jeff, you don't understand.
spk_0
This is research. This isn't for the product.
spk_0
We can't ship this model that we built.
spk_0
This is an N-gram language model.
spk_0
N-grams are like a number of words in a cluster.
spk_0
And we've trained it on a corpus of two trillion words from the Google search index.
spk_0
This thing is so large, it takes 12 hours to translate a sentence.
spk_0
So the way the DARPA challenge worked in this case was you got a set of sentences on Monday.
spk_0
And then you had to submit your machine translation of those set of sentences by Friday.
spk_0
Plenty of time for the servers to run.
spk_0
Yeah, they were like, okay, so we have whatever number of hours it is from Monday to Friday.
spk_0
Let's use as much compute as we can to translate these couple sentences.
spk_0
Hey, learn the rules of the game and use them to your advantage.
spk_0
Exactly.
spk_0
So Jeff Dean being the engineering equivalent of Chuck Norris.
spk_0
He's like, hmm, let me see your code.
spk_0
So Jeff goes and parachutes in and works with the translate team for a few months.
spk_0
And he rearchitects the algorithm to run on the words in the sentences in parallel instead of sequentially.
spk_0
Because when you're translating a set of sentences or a set of words in a sentence,
spk_0
you don't necessarily need to do it in order.
spk_0
You can break up the problem into different pieces, work on it independently.
spk_0
You can parallelize it.
spk_0
And you won't get a perfect translation, but imagine you just translate every single word.
spk_0
You can at least go translate those all at the same time in parallel, reassemble the sentence,
spk_0
and like mostly understand what the initial meaning was.
spk_0
Yeah.
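That "translate every word independently, then reassemble in order" idea can be sketched like this, with a toy word table standing in for the real translation model (all names and the dictionary entries are hypothetical):

```python
from concurrent.futures import ThreadPoolExecutor

# Toy per-word dictionary standing in for the real translation model.
WORD_TABLE = {"hello": "bonjour", "my": "mon", "friend": "ami", "world": "monde"}

def translate_word(word):
    """Translate one word with no context from its neighbours."""
    return WORD_TABLE.get(word, word)

def translate_sentence_parallel(sentence, workers=4):
    """Fan the words out across workers, then reassemble in order.
    Executor.map returns results in input order even though the
    work itself runs concurrently."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return " ".join(pool.map(translate_word, sentence.split()))

print(translate_sentence_parallel("hello my friend"))  # bonjour mon ami
```

Because each word is translated independently, every lookup can run on a different machine at the same time, which is the trade Jeff Dean made: a rougher translation in exchange for a massive speedup.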
spk_0
And as Jeff knows very well, because he and Sanjay basically built it with Urs Hölzle,
spk_0
Google's infrastructure is extremely parallelizable, distributed.
spk_0
You can break up workloads into little chunks, send them all over the various data centers
spk_0
that Google has, reassemble the results, and return that to the user.
spk_0
They are the single best company in the world at parallelizing workloads across CPUs,
spk_0
across multiple data centers.
spk_0
CPUs, we're still talking CPUs here.
spk_0
And Jeff's work with the team gets that average sentence translation time down from 12 hours to 100 milliseconds.
spk_0
And so then they ship it in Google translate and it's amazing.
spk_0
This sounds like a Jeff Dean fact.
spk_0
Well, you know, it used to take 12 hours, and then Jeff Dean took a few months and got it down to 100 milliseconds.
spk_0
Right. Right. Right. Right. Right.
spk_0
So this is the first large, and I'm using "large" in quotes here, language model used in production in a product at Google.
spk_0
They see how well this works.
spk_0
Maybe we could use this for other things like predicting search queries as you type.
spk_0
That might be interesting. You know, and of course, the crown jewel of Google's business, that also might be an interesting application for this.
spk_0
The ad quality score for AdWords is literally the predicted click-through rate on a given set of ad copy.
spk_0
You can see how an LLM that is really good at ingesting information, understanding it, and predicting things based on that might be really useful for calculating this.
spk_0
So that's the way to calculate ad quality for Google.
spk_0
Yep, which is the direct translation to Google's bottom line.
spk_0
Indeed.
spk_0
Okay. So obviously all of that is great on the language model front.
spk_0
I said 2007 was a big year.
spk_0
Also in 2007 begins the sort of momentous intersection of several computer science professors on the Google campus.
spk_0
So in April of 2007, Larry Page hires Sebastian Thrun from Stanford to come to Google and work first part time and then full time on machine learning applications.
spk_0
Sebastian was the head of SAIL, the Stanford Artificial Intelligence Laboratory, a legendary AI laboratory that was big in the sort of first wave of AI back in the 60s and 70s, when Larry's dad was active in the field.
spk_0
It then actually shut down for a while, and then had been restarted and reenergized here in the early 2000s, and Sebastian was the leader, the head of SAIL.
spk_0
Funny story about Sebastian the way that he actually comes to Google.
spk_0
Sebastian was kind of enough to speak with us to prep for this episode.
spk_0
I didn't realize it was basically an acqui-hire.
spk_0
He and some, I think it was grad students, were in the process of starting a company, and had term sheets from Benchmark and Sequoia.
spk_0
Yes.
spk_0
And Larry came over and said, what if we just acquire your company before it's even started in the form of signing bonuses?
spk_0
Yes, probably a very good decision on their part.
spk_0
So SAIL, this group within the CS department at Stanford, not only had some of the most incredible, most accomplished professors and PhD AI researchers in the world.
spk_0
They also had this stream of Stanford undergrads that would come through and work there as researchers while they were working on their CS degrees or symbolic systems degrees or whatever it was that they were doing as Stanford undergrads.
spk_0
One of those people was Chris Cox, who's the chief product officer at Meta.
spk_0
Yeah, that was kind of how he got his start in all of this, in AI, and obviously Facebook and Meta are going to come back into the story here in a little bit.
spk_0
Wow.
spk_0
You really can't make this up. Another undergrad who passed through SAIL while Sebastian was there was a young freshman and sophomore who would later drop out of Stanford to start a company that went through
spk_0
Y Combinator's very first batch in summer 2005.
spk_0
I'm on the edge of my seat. Who is this?
spk_0
Any guesses?
spk_0
Dropbox, Reddit. I'm trying to think who else was in the first batch.
spk_0
Oh, no, no, but way more on the nose for this episode.
spk_0
The company was a failed local mobile social network.
spk_0
Oh, Sam Altman. Loopt. Sam Altman.
spk_0
That's amazing. He was at SAIL at the same time.
spk_0
He was at SAIL.
spk_0
Yep, as an undergrad researcher.
spk_0
Wow.
spk_0
Wow, right?
spk_0
Well, we told you that it's a very small set of people that are all doing all of this.
spk_0
Man, I miss those days.
spk_0
Sam presenting at WWDC with Steve Jobs, on stage with the double popped collar.
spk_0
Right.
spk_0
Different time in tech.
spk_0
The double popped collar. That was amazing.
spk_0
That was a vibe. That was a moment.
spk_0
Oh, man.
spk_0
All right. So April 2007, Sebastian comes over from SAIL into Google. Sebastian
spk_0
Thrun.
spk_0
One of the first things he does over the next set of months is a project called Ground Truth, for Google Maps.
spk_0
Which is essentially Google Maps.
spk_0
It is essentially Google Maps.
spk_0
So before Ground Truth, Google Maps existed as a product,
spk_0
but they had to get all the mapping data from a company called Tele Atlas.
spk_0
I think there were two; they were sort of a duopoly.
spk_0
NavTech was the other one.
spk_0
Yeah, NavTech and Tele Atlas.
spk_0
But it was this like kind of crappy source of truth map data that everyone used
spk_0
and you really couldn't do any better than anyone else because you all just used the same data.
spk_0
Yep. It was not that good.
spk_0
And it cost a lot of money.
spk_0
Tele Atlas and NavTech were multi-billion dollar companies.
spk_0
I think maybe one or both of them were public at some point, then got acquired,
spk_0
but a lot of money, a lot of revenue.
spk_0
Yep. And Sebastian's first thing was Street View, right?
spk_0
So he already had the experience of orchestrating this fleet of all these cars to drive around and take pictures.
spk_0
Yes. So then coming into Google, Ground Truth is this sort of moon shot type project
spk_0
to recreate all the Tele Atlas data.
spk_0
Mostly from their own photographs of streets from Street View.
spk_0
And they incorporated some other data.
spk_0
There was like census data they used.
spk_0
And I think it was 40-something data sources to bring it all together.
spk_0
But Ground Truth was this very ambitious effort to create new maps from whole cloth.
spk_0
Yep. And just like all of the AI and AI-enabled projects at Google that we're talking about here,
spk_0
it works very, very well, very quickly. Huge win.
spk_0
Well, especially when you hire a thousand people in India to help you sift through all the discrepancies and the data
spk_0
and actually hand draw all the maps.
spk_0
Yes. We are not yet in an era of a whole lot of AI automation.
spk_0
So on the back of this win with Ground Truth, Sebastian starts lobbying to Larry and Sergey,
spk_0
hey, we should do this a lot. We should bring in AI professors, academics.
spk_0
I know all these people; bring them into Google part time.
spk_0
They don't have to be full-time employees.
spk_0
Let them keep their posts in academia, but come here and work with us on projects for our products.
spk_0
They'll love it. They get to see their work used by millions and millions of people.
spk_0
We'll pay them. They'll make a lot of money. They'll get Google stock.
spk_0
And they get to stay professors at their academic institutions.
spk_0
Win, win, win. Win, win, win.
spk_0
So as you would expect, Larry and Sergey are like, yeah, yeah, yeah, that's a good idea.
spk_0
Let's do that. More of that.
spk_0
So in December of 2007, Sebastian brings in a relatively little known machine learning professor
spk_0
from the University of Toronto named Geoff Hinton to the Google campus to come and give a tech talk.
spk_0
Not yet hiring him, but come give a tech talk to all the folks at Google.
spk_0
And talk about some of the new work, Geoff, that you and your PhD and postdoc students there
spk_0
at the University of Toronto are doing on blazing new paths with neural networks.
spk_0
And Geoff Hinton, for anybody who doesn't know the name, now very much known as the godfather of neural networks.
spk_0
And really the Godfather of the whole direction that AI went in.
spk_0
Modern AI.
spk_0
He was kind of a fringe academic.
spk_0
Yeah. At this point in history, I mean, neural networks were not a respected subtree of AI.
spk_0
No, totally not.
spk_0
And part of the reason is there had been a lot of hype 30, 40 years before around neural networks
spk_0
that just didn't pan out.
spk_0
So it was effectively everyone thought disproven and certainly backwater.
spk_0
Yep. Then do you remember, from our Nvidia episodes, my favorite piece of trivia about Geoff Hinton?
spk_0
Oh, yes, that his great-great-grandfather was George Boole.
spk_0
Yeah. He is the great-great-grandson of George and Mary Boole, who invented Boolean algebra and Boolean logic.
spk_0
Which is hilarious now that I know more about this because that's the basic building block of symbolic logic,
spk_0
of defined deterministic computer science logic.
spk_0
And the hilarious thing about neural nets is that they're not symbolic AI.
spk_0
It's not "I feed you the specific instructions and you follow a big if-then tree."
spk_0
It is non-deterministic. It is the opposite of that field.
spk_0
Which actually just underscores again how sort of heretical this branch of machine learning and computer science was.
spk_0
Right.
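The contrast can be made concrete: symbolic logic is written down by hand, while a neural approach learns the same behavior from examples. The perceptron below is a generic textbook sketch, not anything from Google or Hinton's lab.

```python
# Symbolic vs. learned logic: a hand-written Boolean AND next to a tiny
# perceptron that learns the same function purely from examples.

def boolean_and(a, b):
    # symbolic: the rule is stated explicitly
    return a and b

def train_perceptron(examples, epochs=20, lr=0.1):
    # learned: a single artificial neuron finds weights that fit the data
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in examples:
            out = 1 if w1 * x1 + w2 * x2 + b > 0 else 0
            err = target - out
            w1 += lr * err * x1
            w2 += lr * err * x2
            b += lr * err
    return lambda x1, x2: 1 if w1 * x1 + w2 * x2 + b > 0 else 0

# the rule is never written down, only shown as input/output examples
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
learned_and = train_perceptron(data)  # now behaves like boolean_and
```

The first function is Boole's world; the second, where behavior emerges from data rather than instructions, is Hinton's.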
spk_0
So Ben, as you were saying earlier, neural networks not a new idea and had all of this great promise in theory.
spk_0
But in practice, just took too much computation to do multiple layers.
spk_0
You could really only have a single, or maybe a small single-digit number of, layers in a neural network up until this time.
spk_0
But Geoff and a former postdoc of his named Yann LeCun started evangelizing within the community.
spk_0
Hey, if we can find a way to have multi-layered, deep layered neural networks, something we call deep learning,
spk_0
we could actually realize the promise here.
spk_0
It's not that the idea is bad. It's the implementation,
spk_0
which would take a ton of compute to do all the multiplication required to propagate through layer after layer after layer of the neural network to detect and understand and store patterns.
spk_0
If we could actually do that, a big multi-layered neural network would be very valuable and possibly could work.
spk_0
Yes.
spk_0
Here we are now in 2007, mid-2000s.
spk_0
Moore's Law has advanced enough that you could actually start to test some of these theories.
spk_0
So Jeff comes and he gives this talk at Google.
spk_0
It's on YouTube. You can go watch it.
spk_0
We'll link to it in the description.
spk_0
This is incredible. This is the artifact of history sitting there on YouTube.
spk_0
And people at Google, Sebastian, Jeff Dean, all the other folks we've been talking about, they get very, very, very excited.
spk_0
Because they've already been doing stuff like this with translate and the language models that they're working with.
spk_0
That's not using the deep neural networks that Geoff's working on.
spk_0
So here's this whole new architectural approach that if they could get it to work,
spk_0
would enable these models that they're building to work way better.
spk_0
Recognize more sophisticated patterns, understand the data better, very, very promising.
spk_0
Again, kind of all in theory at this point.
spk_0
So Sebastian Thrun brings Geoff Hinton into the Google fold after this tech talk.
spk_0
I think first as a consultant over the next couple of years, and then, this is a bit later,
spk_0
Geoff Hinton technically becomes an intern at Google.
spk_0
That's how they get around the part-time, full-time policies here.
spk_0
He was a summer intern sometime around 2011, 2012.
spk_0
And mind you at this point, he's like 60 years old.
spk_0
Yes. So in the next couple of years after 2007 here, Sebastian's concept of bringing these computer science and machine learning academics into Google as contractors or part-time interns,
spk_0
basically letting them keep their academic posts and work on big projects for Google's products internally,
spk_0
goes so well that by late 2009, Sebastian and Larry and Sergey decided,
spk_0
hey, we should just start a whole new division within Google.
spk_0
And it becomes Google X, the moonshot factory. The first project within Google X, Sebastian leads himself.
spk_0
Oh, David, don't say it. Don't say it.
spk_0
I won't say the name of it. We will come back to it later.
spk_0
But for our purposes for now, the second project would be critically important, not only for our story, but to the whole world.
spk_0
Everything in AI, changing the entire world, and that second project is called Google Brain.
spk_0
But before we tell the Google Brain story, now is a great time to thank our friends at JPMorgan Payments.
spk_0
Yes. So today we are going to talk about one of the core components of JPMorgan Payments, their treasury solutions.
spk_0
Now, treasury is something that most listeners probably do not spend a lot of time thinking about, but it's fundamental to every company.
spk_0
Yep. Treasury used to be just a back-office function, but now great companies are using it as a strategic lever.
spk_0
With JPMorgan Payments treasury solutions, you can view and manage all your cash positions in real time,
spk_0
and all of your financial activities across 120 currencies in 200 countries.
spk_0
And the other thing that they acknowledge really in their whole strategy is that every business has its own quirks, so it's not a cookie cutter approach.
spk_0
They work with you to figure out what matters most for you and your business, and then help you gain clarity, control, and confidence.
spk_0
So whether you need advanced automation or just want to cut down on manual processes and approvals, their real-time treasury solutions are designed to keep things running smoothly.
spk_0
Whether your treasury is in the millions or billions, or perhaps like the company we're talking about this episode in the hundreds of billions of dollars.
spk_0
And they have some great strategic offerings like Pay-by-Bank, which lets customers pay you directly from their bank account.
spk_0
It's simple, secure, tokenized, and you get faster access to funds and enhanced data to optimize revenue and reduce fees.
spk_0
This lets you send and receive real-time payments instantly just with a single API connection to JPMorgan.
spk_0
And because JPMorgan's platform is global, that one integration lets you access 45 countries and counting and lets you scale basically infinitely as you expand.
spk_0
As we've said before, JPMorgan payments moves $10 trillion a day, so scale is not an issue for your business.
spk_0
Not at all. If you're wondering how to actually manage all that global cash, JPMorgan again has you covered with their liquidity and account solutions that make sure you have the right amount of cash and the right currencies in the right places for what you need.
spk_0
So whether you're expanding into new markets or just want more control over your funds, JPMorgan payments is the partner you want to optimize liquidity, streamline operations, and transform your treasury.
spk_0
To learn more about how JPMorgan can help you and your company, just go to jpmorgan.com slash acquired and tell them that Ben and David send you.
spk_0
All right, David. So Google Brain.
spk_0
So when Sebastian left Stanford full-time and joined Google full-time, of course somebody else had to take over SAIL, the Stanford AI Lab.
spk_0
And the person who did is another computer science professor named Andrew Ng.
spk_0
This is like all the hits. All the hits. This is all the AI hits on this.
spk_0
Yes. So what does Sebastian do? He recruits Andrew to come part time, to start spending a day a week on the Google campus.
spk_0
And this coincides right with the start of X and Sebastian formalizing this division.
spk_0
So one day in the 2010, 2011 timeframe, Andrew's spending his day a week on the Google campus and he bumps into who else?
spk_0
Jeff Dean. And Jeff Dean is telling Andrew about what he and Franz have done with language models and what Geoff Hinton is doing with deep learning.
spk_0
Of course, Andrew knows all this, and Andrew's talking about what he and SAIL are doing at Stanford.
spk_0
And they decide, you know, the time might finally be right to try and take a real big swing on this within Google.
spk_0
And build a massive, really large deep learning model in the vein of what Geoff Hinton has been talking about, on highly parallelizable Google infrastructure.
spk_0
And when you say the time might be right, Google had tried twice before and neither project really worked.
spk_0
They tried this thing called Brains on Borg. Borg is sort of an internal system that they used to run all of their infrastructure.
spk_0
They tried the Cortex project, and neither of these really worked.
spk_0
So there's a little bit of scar tissue in the research group at Google of: are large-scale neural networks actually going to work for us on Google infrastructure?
spk_0
So the two of them, Andrew and Jeff Dean, pull in Greg Corrado, who is a neuroscience PhD and amazing researcher who was already working at Google.
spk_0
And in 2011, the three of them launch the second official project within X, appropriately enough called Google Brain.
spk_0
And the three of them get to work building a really, really big, deep neural network model.
spk_0
And if they're going to do this, they need a system to run it on.
spk_0
You know, Google is all about taking this sort of frontier research and then doing the architecture and engineering work to make it actually run.
spk_0
Yes.
spk_0
So Jeff Dean gets to work on this system, the infrastructure, and he decides to name it DistBelief, which of course is a pun, both on the distributed nature of the system.
spk_0
And also on, of course, the word disbelief, because nobody thought it was going to work. Most people in the field thought this was not going to work.
spk_0
And most people in Google thought this was not going to work.
spk_0
And here's a little bit on why and it's a little technical, but follow me for a second.
spk_0
All the research from that period of time pointed to the idea that you needed to be synchronous.
spk_0
So all the compute needed to be really dense, happening on a single machine with really high parallelism, kind of like what GPUs do. You really would want it all happening in one place.
spk_0
So it's really easy to kind of go look up and see, hey, what are the computed values for everything else in the system before I take my next move.
spk_0
What Jeff Dean wrote with DistBelief was the opposite.
spk_0
It was distributed across a whole bunch of CPU cores and potentially all over a data center or maybe even in different data centers.
spk_0
So in theory, this is really bad, because it means you would need to be constantly waiting around on any given machine for the other machines to sync their updated parameters before you could proceed.
spk_0
But instead, the system actually worked asynchronously without bothering to go and get the latest parameters from other cores.
spk_0
So you were sort of updating parameters on stale data.
spk_0
You would think that wouldn't work.
spk_0
The crazy thing is it did.
spk_0
Yes.
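A toy sketch of that asynchronous idea: gradients get computed from a stale snapshot of the parameters and applied anyway, without waiting for the other workers. One float stands in for the whole model here, and a fixed delay stands in for network lag; this illustrates the stale-gradient concept, not the actual Google system.

```python
# Minimal stale-gradient SGD in the spirit of DistBelief's asynchronous
# approach. The surprise: updates computed from out-of-date parameters
# still drive the model toward the optimum.
from collections import deque

def grad(w):
    # gradient of a toy loss, (w - 3)^2; the minimum is at w = 3
    return 2.0 * (w - 3.0)

def async_sgd(steps=300, lr=0.05, delay=5):
    shared_w = 0.0       # the "parameter server" copy of the model
    in_flight = deque()  # gradients being computed from stale parameters
    for _ in range(steps):
        # a worker pulls the current parameters and starts computing;
        # its gradient lands `delay` steps later, by which time the
        # shared parameters have already moved on
        in_flight.append(grad(shared_w))
        if len(in_flight) > delay:
            shared_w -= lr * in_flight.popleft()  # stale update, applied anyway
    return shared_w

final_w = async_sgd()  # ends up close to 3 despite the staleness
```

With too much staleness or too large a step size this would oscillate and diverge, which is exactly why the field expected it to fail; at the scales Google ran it, it held together.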
spk_0
Okay. So you've got DistBelief, what do they do with it now?
spk_0
They want to do some research.
spk_0
So they try out, can we do cool neural network stuff?
spk_0
And what they do, in a paper that they submitted right at the end of 2011, is... I'll just give you the name of the paper first.
spk_0
Building high level features using large scale unsupervised learning.
spk_0
But everyone just calls it the cat paper.
spk_0
The cat paper.
spk_0
You talked to anyone at Google, you talked to anyone at AI, they're like, oh, yeah, the cat paper.
spk_0
What they did was they trained a large nine layer neural network to recognize cats from unlabeled frames of YouTube videos using 16,000 CPU cores on a thousand different machines.
spk_0
And listeners, just to underscore how seminal this is, we actually talked with Sundar in prep for the episode.
spk_0
And he cited seeing the cat paper come across his desk as one of the key moments that sticks in his brain in Google's story.
spk_0
Yeah. A little later on they would do a TGIF where they would present the results of the cat paper.
spk_0
And you talk to people at Google, they're like, that TGIF, oh my god, that's when it all changed.
spk_0
Yeah. It proved that large neural networks could actually learn meaningful patterns without supervision and without labeled data.
spk_0
And not only that, it could run on a distributed system that Google built to actually make it work on their infrastructure.
spk_0
And that is a huge unlock of the whole thing.
spk_0
Google's got this big infrastructure asset.
spk_0
Can we take this theoretical computer science idea that the researchers have come up with and use DistBelief to actually run it on our system?
spk_0
Yeah. That is the amazing technical achievement here.
spk_0
And that is almost secondary to the business impact of the cat paper.
spk_0
I think it's not that much of a leap to say that the cat paper led to probably hundreds of billions of dollars of revenue generated by Google and Facebook and ByteDance over the next decade.
spk_0
These models are fundamentally pattern recognizers, and these companies had the data.
spk_0
So YouTube had a big problem at this time, which was that people would upload these videos and there's tons of videos being uploaded to YouTube.
spk_0
But people are really bad at describing what is in the videos that they've uploaded.
spk_0
And YouTube is trying to become more of a destination site, trying to get people to watch more videos, trying to build a feed, increase dwell time, et cetera, et cetera.
spk_0
And the problem is that the recommender is trying to figure out what to feed you, and it's only working off the titles and descriptions that people were writing about their own videos.
spk_0
And whether you're searching for a video or they're trying to figure out what video to recommend next, they need to know what the video is about.
spk_0
Yep. So the cat paper proves that you can use this technology, a deep neural network running on DistBelief, to go inside the videos in the YouTube library,
spk_0
understand what they were about, and use that data to then figure out what videos to serve to people.
spk_0
If you can answer the question cat or not a cat, you can answer a whole lot more questions too. Here's a quote from Jeff Dean about this.
spk_0
We built a system that enabled us to train pretty large neural nets through both model and data parallelism.
spk_0
We had a system for unsupervised learning on 10 million randomly selected YouTube frames, as you were saying.
spk_0
It would build up unsupervised representations based on trying to reconstruct the frame from the high level representations.
spk_0
We got that working and training on 2000 computers using 16,000 cores.
spk_0
After a little while, that model was actually able to build a representation at the highest neural net level where one neuron would get excited by images of cats.
spk_0
It had never been told what a cat was, but it had seen enough examples of them in the training data of head on facial views of cats that that neuron would then turn on for cats and not much else.
spk_0
It's so crazy. I mean, this is the craziest thing about unlabeled data unsupervised learning that a system can learn what a cat is without ever being explicitly told what a cat is.
spk_0
And that there's a cat neuron.
spk_0
Yeah. And so then there's an iPhone neuron and a San Francisco Giants neuron and all the things that YouTube recommends.
spk_0
Not to mention porn filtering, explicit content filtering.
spk_0
Not to mention copyright identification and enabling revenue share with copyright holders.
spk_0
Yeah, this leads to everything in YouTube.
spk_0
Basically puts YouTube on the path to today becoming the single biggest property on the internet and the single biggest media company on the planet.
spk_0
This kicks off a 10-year period from 2012, when this happens, until ChatGPT on November 30th, 2022, during which AI is already shaping human existence for all of us and driving hundreds of billions of dollars of revenue.
spk_0
It's just in the YouTube feed, and then Facebook borrows it and they hire Yann LeCun and they start Facebook AI Research, and then they bring it into Instagram, and then TikTok and ByteDance take it.
spk_0
And then it goes back to Facebook and YouTube with reels and shorts.
spk_0
This is the primary way that humans on the planet spend their leisure time for the next 10 years.
spk_0
This is my favorite David Rosenthal-ism. Everyone talks about 2022 onward as the AI era.
spk_0
And I love this point from you that actually for anyone that could make good use of a recommender system and a classifier system, basically a company with a social feed, the AI era started in 2012.
spk_0
Yes, the AI era started in 2012 and part of it was the cat paper.
spk_0
The other part of it was what Jensen and Nvidia always call the big bang moment for AI, which was AlexNet.
spk_0
Yes.
spk_0
So we talked about Geoff Hinton back at the University of Toronto.
spk_0
He's got two grad students who he's working with in this era.
spk_0
Alex Krizhevsky and Ilya Sutskever.
spk_0
Of course.
spk_0
Future co-founder and chief scientist of OpenAI.
spk_0
And the three of them are working with Geoff's deep neural network ideas and algorithms to create an entry for the famous ImageNet competition in computer science.
spk_0
This is Fei-Fei Li's thing from Stanford.
spk_0
It is an annual machine vision algorithm competition.
spk_0
What it was: Fei-Fei had assembled a database of 14 million images that were hand labeled.
spk_0
Famously, she used Amazon Mechanical Turk, I think, to get them all hand labeled.
spk_0
Yes, that's right.
spk_0
And so then the competition was: what team can write the algorithm that, without looking at the labels, could correctly identify what was in the images?
spk_0
The best algorithms that would win the competitions year-to-year were still getting more than a quarter of the images wrong.
spk_0
So like a 75% success rate. Great.
spk_0
Way worse than a human.
spk_0
Can't use it for much in a production setting when a quarter of the time you're wrong.
spk_0
So then in the 2012 competition, along comes AlexNet.
spk_0
Its error rate was 15%.
spk_0
Still high, but a 10 percentage point leap, from the previous best of a 25% error rate all the way down to 15 in one year.
spk_0
A leap like that had never happened before.
spk_0
It's 40% better than the next best on a relative basis.
spk_0
Yes.
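The arithmetic behind that claim, spelled out using the rates quoted here:

```python
# 25% error down to 15% error: a 10 percentage point drop in absolute
# terms, but a 40% improvement relative to the previous best.
prev_error = 0.25     # best pre-AlexNet ImageNet entries, as quoted
alexnet_error = 0.15  # AlexNet, 2012, as quoted

absolute_drop = prev_error - alexnet_error         # 10 percentage points
relative_improvement = absolute_drop / prev_error  # 0.40, i.e. 40% better
```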
spk_0
And why is it so much better, David?
spk_0
What did they figure out that would create a $4 trillion company in the future?
spk_0
So what Geoff, Alex, and Ilya did: they knew, like we've been talking about all episode, that deep neural networks had all this potential.
spk_0
And Moore's Law had advanced enough that you could use CPUs to create a few layers.
spk_0
They had the aha moment of: what if we re-architected this stuff to run not on CPUs, but on a whole different class of computer chips that were by their very nature highly, highly, highly parallelizable: video game graphics cards, made by the leading company in the space at the time, Nvidia.
spk_0
Not obvious at the time, and especially not obvious that this highly advanced, cutting-edge academic computer science research, which was usually done on supercomputers with incredible CPUs, would use these toy video game cards that retailed for under $1,000.
spk_0
Yeah, less than that at the time, a couple hundred bucks.
spk_0
So the team in Toronto, they go out to, like, the local Best Buy or something, and they buy two Nvidia GeForce GTX 580s, which were Nvidia's top-of-the-line gaming cards at the time.
spk_0
The Toronto team rewrites their neural network algorithms in CUDA, Nvidia's programming language.
spk_0
They train it on these two off-the-shelf GTX 580s, and this is how they train their deep neural network and do 40% better than any other entry in the ImageNet competition.
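Why graphics cards fit this workload: a neural network layer's forward pass is mostly independent multiply-accumulate operations. The plain-Python sketch below computes each output one at a time, but since no output depends on any other, a GPU can compute all of them (and every image in a batch) simultaneously; the numbers and sizes are illustrative only.

```python
# A layer forward pass is "embarrassingly parallel": each output value
# depends only on the inputs and one row of weights, never on another
# output. That independence is exactly what GPU hardware exploits.

def layer_forward(weights, inputs):
    # one output per weight row; no row's result depends on another's
    return [sum(w * x for w, x in zip(row, inputs)) for row in weights]

W = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]  # 3 output neurons, 2 inputs
y = layer_forward(W, [2.0, 3.0])          # [2.0, 3.0, 5.0]
```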
spk_0
So when Jensen says that this was the big bang moment of artificial intelligence: (a) he's right, this shows everybody that, holy crap, if you can do this with two off-the-shelf GTX 580s, imagine what you could do with more of them or with specialized chips, and (b) this event is what sets Nvidia on the path from a somewhat struggling PC gaming accessory maker to the leader of the AI wave and the most valuable company in the world today.
spk_0
And this is how AI research tends to work: there's some breakthrough that gets you this big step-change function.
spk_0
And then there's actually a multi-year process of optimizing from there where you get these kind of diminishing returns curves on breakthroughs where the first half of the advancement happens all at once.
spk_0
And then the second half takes many years after that to figure out, but it's rare and amazing and must be so cool when you have an idea you do it and then you realize, oh my god, I just found the next giant leap in the field.
spk_0
It's like I unlocked the next level to use the video game analogy.
spk_0
Yes. I leveled up.
spk_0
So after AlexNet, the whole computer science world is abuzz. People are starting to stop doubting neural networks at this point.
spk_0
Yes. So after AlexNet, the three of them from Toronto, Geoff Hinton, Alex Krizhevsky, and Ilya Sutskever, do the natural thing.
spk_0
They start a company called DNNresearch. A deep neural network research company that does not have any products.
spk_0
This company has AI researchers who just won a big competition, and predictably, as you might imagine, it gets acquired by Google almost immediately.
spk_0
Oh, are you intentionally shortening this?
spk_0
That's what I thought the story was.
spk_0
Oh, it is not immediately.
spk_0
Oh, okay.
spk_0
There's a whole crazy thing that happens where the first bid is actually from Baidu.
spk_0
Oh, I did not know that.
spk_0
So Baidu offers $12 million.
spk_0
Geoff Hinton doesn't really know how to value the company and doesn't know if that's fair.
spk_0
And so he does what any academic would do to best determine the market value of the company.
spk_0
He says, thank you so much.
spk_0
I'm going to run an auction now.
spk_0
And I'm going to run it in a highly structured manner where every time anybody wants to bid, the clock resets and there's another hour where anybody else can submit another bid.
spk_0
No way.
spk_0
So I didn't know this. This is crazy.
spk_0
He gets in touch with everyone that he knows from the research community who is now working at a big company who he thinks, hey, this would be a good place for us to do our research.
spk_0
That includes Baidu, that includes Google, that includes Microsoft, and there's one other.
spk_0
Facebook, of course.
spk_0
It's a two year old startup.
spk_0
Oh, wait, so it does not include Facebook?
spk_0
It does not include Facebook.
spk_0
Think about the year. This is 2012.
spk_0
So Facebook's not really in the AI game yet.
spk_0
They're still trying to build their own AI lab.
spk_0
Yeah, yeah, because Yann LeCun and FAIR would start in 2013.
spk_0
Is it Instagram?
spk_0
Nope.
spk_0
It is the most important part of the end of this episode.
spk_0
Wait, well, it can't be Tesla because Tesla is older than that.
spk_0
Nope.
spk_0
Well, opening I wouldn't get founded for years.
spk_0
Wow. Okay, you really got me here.
spk_0
What company slightly predated OpenAI, doing effectively the same mission?
spk_0
Oh, of course.
spk_0
Of course.
spk_0
It wasn't even in my sights.
spk_0
DeepMind.
spk_0
Wow.
spk_0
DeepMind, baby.
spk_0
They are the fourth bidder in a four way auction for DNN research.
spk_0
Now, of course, right after the bidding starts, DeepMind has to drop out.
spk_0
They don't actually have the cash to be able to buy the company.
spk_0
Yeah, didn't even cross my mind because my first question was like, where the hell would they get the money?
spk_0
Because they had no money.
spk_0
But Geoff Hinton already knows and respects Demis, even though at the time he's just doing this startup called DeepMind.
spk_0
That's amazing.
spk_0
Wait, how is DeepMind in the auction but Facebook is not?
spk_0
Isn't that wild?
spk_0
That's wild.
spk_0
So the timing of this is concurrent with the NIPS conference, as it was then called.
spk_0
Now it's called NeurIPS.
spk_0
So Geoff Hinton actually runs the auction from his hotel room at the Harrah's casino in Lake Tahoe.
spk_0
Oh my God.
spk_0
Amazing.
spk_0
So the bids all come in and we got to thank Cade Metz, the author of Genius Makers, great book on the whole history of AI that we're actually going to reference a lot in this episode.
spk_0
The bidding goes up and up and up.
spk_0
At some point Microsoft drops out.
spk_0
They come back in. Like I told you, DeepMind drops out.
spk_0
So it's Baidu and Google really going at it at the end.
spk_0
And finally at some point the researchers look at each other and they say, where do we actually want to land?
spk_0
We want to land at Google.
spk_0
And so they stop the bidding at $44 million and just say, Google, this is more than enough money.
spk_0
We're going with you.
spk_0
Wow.
spk_0
I knew it was about $40 million.
spk_0
I did not know that old story.
spk_0
It's almost like Google itself, you know, with the Dutch auction IPO process.
spk_0
Right.
spk_0
How fitting.
spk_0
It's kind of a perfect DNA.
spk_0
Yes.
spk_0
Wow.
spk_0
And the three of them were supposed to split it a third each, and Alex and Ilya go to Geoff and say,
spk_0
we really think you should have a bigger percentage.
spk_0
We think you should have 40% and we should each have 30, and that's how it ends up breaking down.
spk_0
Wow.
spk_0
What a team.
spk_0
Well, that leads to the three of them joining Google Brain directly and turbocharging everything going on there.
spk_0
Spoiler alert, a couple of years later, Astro Teller who would take over running Google X
spk_0
after Sebastian Thrun left.
spk_0
He would get quoted in the New York Times, in a profile of Google X, saying that the gains to Google's core businesses
spk_0
in search and ads and YouTube from Google Brain have way more than funded all of the other bets that they have made within Google X
spk_0
and throughout the company over the years.
spk_0
That's one of these things: if you make something a few percent better that happens to do tens of billions of dollars
spk_0
or hundreds of billions of dollars in revenue, you find quite a bit of loose change in those couch cushions.
spk_0
Yes, quite a bit of loose change.
spk_0
But that's not where the AI history ends within Google.
spk_0
There is another very important piece of the Google AI story that is an acquisition from outside of Google,
spk_0
the AI equivalent of Google's acquisition of YouTube.
spk_0
That's what we talked about a minute ago.
spk_0
DeepMind.
spk_0
But before we tell the DeepMind story, now is a great time to thank a new partner of ours, Sentry.
spk_0
Yes, listeners that is S-E-N-T-R-Y like someone's standing guard.
spk_0
Yes, Sentry helps developers debug everything from errors to latency and performance issues, pretty much any software problem,
spk_0
and fix them before users get mad.
spk_0
As their homepage puts it, they are considered to quote unquote not bad by over four million software developers.
spk_0
And today we're talking about the way that Sentry works with another company in the Acquired universe, Anthropic.
spk_0
Anthropic used to have some older monitoring systems in place.
spk_0
But as they scaled and became more complex, they adopted Sentry to find and fix issues faster.
spk_0
So when you're building AI models like we're talking about all episode here,
spk_0
small issues can ripple out into big ones fast.
spk_0
Let's say you're running a huge compute job, like training a model.
spk_0
If one node fails, it can have massive downstream impacts,
spk_0
costing huge amounts of time and money.
spk_0
Sentry helped Anthropic detect bad hardware early,
spk_0
so they could reject it before it caused a cascading problem, taking debugging down to hours instead of days for them.
spk_0
And one other fun update from Sentry: they now have an AI debugging agent called Seer.
spk_0
Seer uses all the context that Sentry has about your app usage to run root cause analysis as issues are detected.
spk_0
It uses errors, span data, logs, and tracing, and your code to understand the root cause, fix it, and get you back to shipping.
spk_0
It even creates pull requests to merge code fixes in.
spk_0
And on top of that, they also recently launched agent and MCP server monitoring.
spk_0
AI tooling tends to offer limited visibility into what's going on under the hood, shall we say.
spk_0
Sentry's new tools make it easy to understand exactly what's going on.
spk_0
This is everything from actual AI tool calls to performance across different models and interactions between AI and the downstream services.
spk_0
We're pumped to be working with Sentry. We're big fans of the company and of all the great folks we're working with there.
spk_0
They have an incredible customer list, including not only Anthropic, but Cursor, Vercel, Linear, and more.
spk_0
And actually, if you're in San Francisco or the Bay Area, Sentry is hosting a small invite-only event with
spk_0
David and I in San Francisco for product builders on October 23rd. You can register your interest at sentry.io/acquired. That's S-E-N-T-R-Y dot io slash acquired.
spk_0
And just tell them that Ben and David sent you.
spk_0
All right, David, DeepMind.
spk_0
I kind of like your framing, the YouTube of AI.
spk_0
The YouTube of AI for Google.
spk_0
They bought this thing for, we'll talk about the purchase price, but it's worth what, $500 billion today.
spk_0
I mean, this is as good as Instagram or YouTube in terms of greatest acquisitions of all time.
spk_0
100%.
spk_0
So I remember when this deal happened, just like I remember when the Instagram deal happened, because the number was big at the time.
spk_0
It was big, but I remember it for a different reason.
spk_0
It was like when Facebook bought Instagram, like, oh my god, this is, wow, what a tectonic shift in the landscape of tech.
spk_0
In January 2014, I remember reading on TechCrunch this random news.
spk_0
Right.
spk_0
You're like, deep what?
spk_0
That Google is spending a lot of money to buy something in London that I've never heard of.
spk_0
That's working on artificial intelligence.
spk_0
Question mark.
spk_0
Right.
spk_0
This really illustrates how outside of mainstream tech AI was at the time.
spk_0
Yeah.
spk_0
And then you dig in a little further and you're like, this company doesn't seem to have any products.
spk_0
And it also doesn't even really say anything on its website about what DeepMind is.
spk_0
It says it is a, quote unquote, cutting edge artificial intelligence company.
spk_0
Wait, did you look this up on the way back machine?
spk_0
I did, I did.
spk_0
Oh, nice.
spk_0
To build general purpose learning algorithms for simulations, e-commerce, and games.
spk_0
This is 2014.
spk_0
This does not compute, does not register.
spk_0
Simulations, e-commerce, and games.
spk_0
It's kind of a random smattering, exactly.
spk_0
It turns out though, not only was that description of what DeepMind was fairly accurate, this company and this purchase of it by Google
spk_0
was the butterfly-flapping-its-wings moment that directly leads to OpenAI,
spk_0
ChatGPT, Anthropic, and basically everything.
spk_0
Certainly Gemini.
spk_0
Everything that we know, and Gemini directly, in the world of AI today.
spk_0
And probably xAI, given Elon's involvement.
spk_0
Yeah, of course, xAI.
spk_0
In a weird way, it sort of leads to Tesla self-driving too.
spk_0
Karpathy.
spk_0
Yeah, definitely.
spk_0
Okay.
spk_0
So what is the story here?
spk_0
DeepMind was founded in 2010 by a neuroscience PhD named Demis Hassabis, who previously started a video game company.
spk_0
Oh, yeah.
spk_0
And a postdoc named Shane Legg at University College London, and a third co-founder who was one of Demis's friends from growing up.
spk_0
Mustafa Suleyman.
spk_0
This trio was unlikely, to say the least.
spk_0
It would go on to produce a knight and Nobel Prize winner.
spk_0
Yes.
spk_0
So Demis, the CEO, was a childhood chess prodigy turned video game developer who, at age 17 in 1994, had gotten accepted to the University of Cambridge.
spk_0
But he was too young, and the University told him, hey, take a, you know, gap year, come back.
spk_0
He decided that he was going to go work as a video game developer at a studio called Bullfrog Productions for the year.
spk_0
And while he's there, he created the game Theme Park, if you remember that.
spk_0
It was like a theme park version of SimCity.
spk_0
This was a big game.
spk_0
This was very commercially successful.
spk_0
Roller Coaster Tycoon would be sort of a clone of this that would have many, many sequels over the years.
spk_0
Oh, I played a ton of that.
spk_0
Yeah.
spk_0
It sells 15 million copies in the mid 90s.
spk_0
Wow.
spk_0
Wow.
spk_0
Then after this, he goes to Cambridge, studies computer science there.
spk_0
After Cambridge, he gets back into gaming and founds another game studio called Elixir.
spk_0
That would ultimately fail.
spk_0
And then he decides, you know what?
spk_0
I'm going to go get my PhD in neuroscience.
spk_0
And that is how Demis ends up at University College London.
spk_0
There he meets Shane Legg, who's there as a postdoc.
spk_0
Shane is a self-described, at the time, member of the "lunatic fringe" in the AI community.
spk_0
In that he believes, this is 2008, 9, 10, he believes that AI is going to get more and more and more powerful every year.
spk_0
And that it will become so powerful that it will become more intelligent than humans.
spk_0
And Shane is one of the people who actually popularizes the term artificial general intelligence, AGI.
spk_0
Wow.
spk_0
Interesting.
spk_0
Which of course lots of people talk about now, and approximately zero people were thinking about back then.
spk_0
I mean, you had like the Nick Bostrom type folks, but very few people were thinking about super intelligence,
spk_0
or the singularity or anything like that.
spk_0
For what it's worth, not Elon Musk.
spk_0
He's not included in that list because Demis would be the one who tells Elon about this.
spk_0
Yes, we'll get to it.
spk_0
So Demis and Shane hit it off.
spk_0
They pull in Mustafa, Demis's childhood friend who is himself extremely intelligent.
spk_0
He had gone to the University of Oxford and then dropped out, I think at age 19,
spk_0
to do other startup-y type stuff.
spk_0
So the three of them decided to start a company, DeepMind, the name of course being a reference to Deep Learning,
spk_0
Jeff Hinton's work and everything coming out of the University of Toronto.
spk_0
And the goal that the three of these guys have of actually creating an intelligent mind with Deep Learning.
spk_0
Like, Jeff and Ilya and Alex aren't really thinking about this yet.
spk_0
As we said, this is lunatic fringe type stuff.
spk_0
Yes.
spk_0
AlexNet, the cat paper, that whole world is about better classifying data.
spk_0
Can we better sort into patterns?
spk_0
It's a giant leap from there to say, oh, we're going to create intelligence.
spk_0
Yes. I think probably some people, almost certainly at Google, were thinking,
spk_0
oh, we can create narrow intelligence that'll be better than humans at certain tasks.
spk_0
I mean, a calculator is better than humans at certain tasks.
spk_0
Right.
spk_0
But I don't think too many people were thinking, oh, this is going to be general intelligence smarter than humans.
spk_0
Right.
spk_0
So they decide the tagline for the company is going to be: solve intelligence, and use it to solve everything else.
spk_0
Ooh, I like it.
spk_0
I like it.
spk_0
Yeah, yeah, yeah, they're good marketers, these guys.
spk_0
So there's just one problem to do what they want to do.
spk_0
Money.
spk_0
Just say money is the problem.
spk_0
Money is the problem for lots of reasons.
spk_0
But even more so than any other given startup in the 2010 era,
spk_0
it's not like they can just go spin up an AWS instance and build an app and deploy it to the App Store.
spk_0
They want to build really, really, really big deep learning neural networks, and that requires Google-sized levels of compute.
spk_0
Well, it's interesting.
spk_0
And actually, they don't require that much funding yet.
spk_0
The AI of the time was go grab a few GPUs.
spk_0
We're not training giant LLMs.
spk_0
That's the ambition eventually.
spk_0
But right now what they just need to do is raise a few million bucks.
spk_0
But who's going to give you a few million bucks when there's no business plan?
spk_0
When you're just trying to solve intelligence, you need to find some lunatics.
spk_0
It's a tough sell to VCs.
spk_0
Except for exactly the right ones.
spk_0
As I say, they need to find some lunatics.
spk_0
Oh, I chose my words carefully.
spk_0
Yeah, we use the term lunatic in a...
spk_0
It's endearing.
spk_0
Yes.
spk_0
Most endearing possible way here, given that they were all basically right.
spk_0
So in June 2010,
spk_0
Demis and Shane managed to get invited to the Singularity Summit in San Francisco, California.
spk_0
Because they're not raising money for this in London.
spk_0
Yeah, definitely not.
spk_0
I think they tried for a couple months and learned that that was not going to be a viable path.
spk_0
Yes.
spk_0
This summit, the Singularity Summit,
spk_0
organized by Ray Kurzweil,
spk_0
a future Google employee, I think Chief Futurist, and noted futurist,
spk_0
Eliezer Yudkowsky,
spk_0
and Peter Thiel.
spk_0
Yes.
spk_0
So Demis and Shane are excited about getting this invite.
spk_0
Like, this is probably our one chance to get funded.
spk_0
But we probably shouldn't just walk in guns blazing and say,
spk_0
Peter, can we pitch you?
spk_0
Yeah.
spk_0
So they finagle their way into Demis getting to give a talk on stage at the summit.
spk_0
Always the hack.
spk_0
They're like, this is great.
spk_0
This is going to be the hack.
spk_0
The talk is going to be our pitch to Peter.
spk_0
And Founders Fund.
spk_0
Peter has just started Founders Fund at this point.
spk_0
No, obviously.
spk_0
Member of the PayPal Mafia.
spk_0
Very wealthy.
spk_0
I think he had a big Roth IRA at this point.
spk_0
This is the right way to frame this.
spk_0
A big Roth IRA that he had invested in Facebook; first investor in Facebook.
spk_0
He is the perfect target.
spk_0
They architect the presentation at the summit to be a pitch directly to Peter,
spk_0
essentially a thinly veiled pitch.
spk_0
Shane has a quote in Parmy Olson's great book Supremacy, which we used as a source
spk_0
for a lot of this deep mind story.
spk_0
And Shane says, we needed someone crazy enough to fund an AGI company.
spk_0
Somebody who had the resources not to sweat a few million and liked super-ambitious stuff.
spk_0
They also had to be massively contrarian because every professor that he would go talk to
spk_0
would certainly tell him absolutely do not even think about funding this.
spk_0
That Venn diagram sure sounds a lot like Peter Thiel.
spk_0
So they show up at the conference.
spk_0
Demis is going to give the talk; he goes out on stage and looks out into the audience.
spk_0
Peter is not there.
spk_0
Turns out Peter wasn't actually that involved in the conference.
spk_0
He's a busy guy.
spk_0
He's a co-founder, co-organizer, but is a busy guy.
spk_0
Yes. The guys are like, shoot.
spk_0
We missed our chance.
spk_0
What are we going to do?
spk_0
And then Fortune turns in their favor.
spk_0
They find out that Peter is hosting an after party that night at his house in San Francisco.
spk_0
They get into the party.
spk_0
And then Demis, who is very, very smart,
spk_0
as anybody who's ever listened to him talk
spk_0
would immediately know,
spk_0
is like, rather than just pitching Peter head-on,
spk_0
I'm going to come at this obliquely.
spk_0
He starts talking to Peter about chess, because he knows, as everybody does, that Peter Thiel loves chess.
spk_0
And Demis had been the second highest ranked player in the world as a teenager in the under-14 category.
spk_0
Good strategy.
spk_0
The man knows his chess moves.
spk_0
So Peter is like, hmm, I like you.
spk_0
You seem smart.
spk_0
What do you do?
spk_0
And Demis explains he's got this
spk_0
AGI startup.
spk_0
He was actually here;
spk_0
he gave a talk on stage as part of the conference.
spk_0
People are excited about this.
spk_0
And Peter said, uh, okay, all right.
spk_0
Come back to Founders Fund tomorrow and give me the pitch.
spk_0
So they do.
spk_0
They make the pitch.
spk_0
It goes well.
spk_0
Founders Fund leads DeepMind's seed round of about $2 million.
spk_0
My how times have changed for AI company seed rounds these days.
spk_0
Oh, yes.
spk_0
Imagine leading DeepMind's seed round with a less-than-$2-million check.
spk_0
And through Peter and Founders Fund, they get introduced
spk_0
to another member of the PayPal Mafia, Elon Musk.
spk_0
Yes.
spk_0
So it's teed up in a pretty low key way.
spk_0
Hey, Elon, you should meet this guy.
spk_0
He's smart.
spk_0
He's thinking about artificial intelligence.
spk_0
So Elon says, great, come over to SpaceX.
spk_0
I'll give you the tour of the place.
spk_0
So Demis comes over for a launch and a tour of the factory.
spk_0
Of course, Demis thinks it's very cool.
spk_0
But really, he's trying to reorient the conversation over to artificial intelligence.
spk_0
And I'll read this great excerpt from an article in the Guardian.
spk_0
Musk told Hassabis his priority was getting to Mars as a backup planet in case something went wrong here.
spk_0
I don't think he'd thought much about AI at this point.
spk_0
Hassabis pointed out a flaw in his plan.
spk_0
I said, what if AI was the thing that went wrong here?
spk_0
Then being on Mars wouldn't help you because if we got there,
spk_0
then it would obviously be easy for an AI to get there through our communication systems
spk_0
or whatever it was.
spk_0
He hadn't thought about that.
spk_0
So he sat there for a minute without saying anything, just sort of thinking.
spk_0
Hmm, that's probably true.
spk_0
Shortly after, Musk too became an investor in DeepMind.
spk_0
Yes.
spk_0
I think it's crazy that Demis is sort of the one that woke Elon up to this idea of,
spk_0
we might not be safe from the AI on Mars either.
spk_0
Right.
spk_0
I hadn't considered that.
spk_0
So this is the first time the bit flips for Elon of,
spk_0
we really need to figure out a safe, secure AI for the good of the people
spk_0
that's sort of the seed being planted in his head.
spk_0
Which of course is what DeepMind's ambition is.
spk_0
We are here doing research for the good of humanity, like scientists, in a peer-reviewed way.
spk_0
Yep.
spk_0
I think all that is true.
spk_0
Also, in the intervening months to a year after this meeting between Demis and Elon and Elon investing in DeepMind,
spk_0
Elon also starts to get really, really excited and convinced about the capabilities of AI in the near term.
spk_0
And specifically the capabilities of AI for Tesla.
spk_0
Yes.
spk_0
Like with everything else in Elon's world, once the bit flips and he becomes interested,
spk_0
he completely changes the way he views the world, completely sheds all the old ways and actions that he was taking.
spk_0
It's all about, what must I do to embrace this new world view that I have?
spk_0
And something other people had been working on for a while already by this point:
spk_0
AI driving cars.
spk_0
Yep.
spk_0
That sounds like it would be a pretty good idea for Tesla.
spk_0
Does.
spk_0
So Elon starts trying to recruit as many AI researchers as he possibly can and machine vision and machine learning experts into Tesla.
spk_0
And then AlexNet happens.
spk_0
And man, AlexNet's really, really good at identifying and classifying images and cat videos on YouTube and the YouTube recommender feed.
spk_0
Well, is that really that different from a live video feed from a car that's being driven, and understanding what's going on there?
spk_0
Can we process it in real time and look at differences between frames?
spk_0
Perhaps controlling the car?
spk_0
Not all that different.
spk_0
So Elon's excitement, channeled initially through DeepMind and Demis, about AI and AI for Tesla starts ratcheting up big time.
spk_0
Yep.
spk_0
Meanwhile, back in London, DeepMind is getting to work.
spk_0
They're hiring researchers.
spk_0
They're getting to work on models.
spk_0
They're making some vague noises about products to their investors.
spk_0
Maybe we could do something in shopping, maybe something in gaming like the description on the website at the time of acquisition said.
spk_0
But mostly what they really, really want to do is just build these models and work on intelligence.
spk_0
And then one day in late 2013, they get a call from Mark Zuckerberg.
spk_0
He wants to buy the company.
spk_0
Mark has woken up to everything that's going on at Google after AlexNet.
spk_0
And what AI is doing for social media feed recommendations at YouTube.
spk_0
The possibility of what it can do at Facebook and for Instagram.
spk_0
He's gone out and recruited Yann LeCun, Jeff Hinton's old postdoc, who is, together with Jeff, one of the sort of godfathers of AI and deep learning.
spk_0
And who really popularized the idea of convolutional neural networks.
spk_0
The next hot thing in the field of AI at this point in time.
spk_0
And so with Yann, they have created FAIR, Facebook AI Research, which is a Google Brain rival within Facebook.
spk_0
And remember who the first investor in Facebook was who's still on the board.
spk_0
Peter Thiel.
spk_0
And is also the lead investor in DeepMind.
spk_0
Where do you think Mark learned about DeepMind?
spk_0
Peter Thiel.
spk_0
Was it, do you know for sure that it was from Peter?
spk_0
No, I don't know for sure, but how else could Mark have learned about this startup in London?
spk_0
I've got a great story of how Larry Page found out about it.
spk_0
Oh, okay. Well, we'll get to that in one second.
spk_0
So Mark calls and offers to buy the company.
spk_0
And there are various rumors of how much Mark offered.
spk_0
But according to Parmy Olson in her book Supremacy, the reports are that it was up to $800 million.
spk_0
A company with no products, and a long way from AGI.
spk_0
That squares with what Cade Metz has in his book, that the founders would have made about twice as much money
spk_0
from taking Facebook's offer versus taking Google's offer.
spk_0
Yep.
spk_0
So, Demis of course takes this news to the investor group.
spk_0
Which by the way, is kind of against everything the company was founded on.
spk_0
The whole aim of the company, and what he's promised the team, is that DeepMind is going to stay independent,
spk_0
do research, publish in the scientific community.
spk_0
We're not going to be sort of captured and told what to do by the whims of a capitalist institution.
spk_0
Yep.
spk_0
So definitely some deal point negotiating that has to happen with Mark and Facebook if this offer is going to come through.
spk_0
But Mark is so desperate at this point.
spk_0
He is open to these very large deal point negotiations, such as Yann LeCun gets to stay in New York.
spk_0
Yann LeCun gets to keep operating his lab at NYU.
spk_0
Yann LeCun is a professor.
spk_0
He's flexible on some things.
spk_0
Turns out, Mark is not flexible on letting Demis keep control of DeepMind if he buys it.
spk_0
Demis sort of argued for, we need to stay separate and carved out, and we need this independent oversight board with the ability to intervene
spk_0
if the mission of DeepMind is no longer being followed. And Mark's like, no, you'll be a part of Facebook.
spk_0
Yeah.
spk_0
And you'll make a lot of money.
spk_0
So as this negotiation is going on, of course the investors in DeepMind get wind of this. Elon finds out about what's going on.
spk_0
He immediately calls up Demis and says, I will buy the company right now with Tesla stock.
spk_0
This is late 2013, like early 2014, Tesla's market cap is about 20 billion dollars.
spk_0
So Tesla stock from then to today is about a 70X run up.
spk_0
Demis and Shane and Mustafa are like, wow, okay, there's a lot going on right now.
spk_0
But to your point, they have the same issues with Elon and Tesla that they had with Mark.
spk_0
Elon wants them to come in and work on autonomous driving for Tesla.
spk_0
They don't want to work on autonomous driving.
spk_0
Right.
spk_0
Or at least exclusively.
spk_0
At least exclusively.
spk_0
Yep.
spk_0
So then Demis gets a third call from Larry Page.
spk_0
Do you want my story of how Larry knows about the company?
spk_0
I absolutely want your story of how Larry knows about the company.
spk_0
All right.
spk_0
So this is still early in DeepMind's life.
spk_0
We haven't progressed all the way to this acquisition point yet.
spk_0
Apparently Elon Musk is on a private jet with Luke Nosek, who's another member of the PayPal Mafia and an angel investor in DeepMind.
spk_0
And they're reading an email from Demis with an update about a breakthrough that they had, where DeepMind's AI figured out a clever way to win at the Atari game Breakout.
spk_0
Yes.
spk_0
And the strategy it figured out with no human training was that you could bounce the ball up around the edges of the bricks.
spk_0
And then without needing to intervene, it could bounce around along the top and win the game faster without you needing to have a whole bunch of interactions with the paddle down at the bottom.
spk_0
They're watching this video of how clever it is and flying with them on the same private plane is Larry Page.
spk_0
Of course, because Elon and Larry used to be very good friends.
spk_0
Yes. And Larry is like, what are you watching? What company is this? And that's how he finds out.
spk_0
Wow. Yes.
spk_0
Elon must have been so angry about all this.
spk_0
And the crazy thing is this kinship between Larry and Demis is I think the reason why the deal gets done at Google.
spk_0
Once the two of them get together, they are like peas in a pod. Larry has always viewed Google as an AI company.
spk_0
Demis, of course, views DeepMind so much as an AI company that he doesn't even want to make any products until they can get to AGI.
spk_0
And Demis, in fact, we should share with listeners. Demis told us this when we were talking to him to prep for this episode, just felt like Larry got it.
spk_0
Larry was completely on board with the mission of everything that deep mind was doing.
spk_0
And there's something else very convenient about Google. They already have brain.
spk_0
So Larry doesn't need Demis and Shane and Mustafa and DeepMind to come work on products within Google.
spk_0
Right. Brain is already working on products within Google. Demis can really believe Larry when Larry says,
spk_0
No, stay in London. Keep working on intelligence. Do what you're doing. I don't need you to come work on products within Google.
spk_0
Brain is like actively going and engaging with the product groups trying to figure out,
spk_0
Hey, how can we deploy neural nets into your product to make it better? That's like their reason for being.
spk_0
So they're happy to agree to this.
spk_0
And it's working. Brain and neural nets are getting integrated into search into ads into Gmail into everything.
spk_0
It is the perfect home for DeepMind. Home away from home, shall we say?
spk_0
Yes. And there's a third reason why Google is the perfect fit for DeepMind: infrastructure.
spk_0
Google has all the compute infrastructure you could ever want right there on tap.
spk_0
Yes, at least with CPUs so far. Yes. So how's the deal actually happen?
spk_0
Well, after buying DNNresearch, Alan Eustace, who, David, you spoke with, right?
spk_0
Yep. He was Google's head of engineering at the time. He makes up his mind that he wants to hire all the best deep learning research talent that he possibly can, and he has a clear path to do so.
spk_0
A few months earlier, Larry Page held a strategy meeting on an island in the South Pacific. In Cade Metz's book, it's an undisclosed island.
spk_0
Of course he did.
spk_0
Larry thought that deep learning was going to completely change the whole industry. And so he tells his team, this is a quote, let's really go big,
spk_0
which effectively gave Alan a blank check to go secure all the best researchers that he possibly could.
spk_0
So in 2013, he decides, I'm going to get on a plane in December before the holidays and go meet DeepMind.
spk_0
Crazy story about this. Jeff Hinton, who's at Google at the time, had a thing with his back where he couldn't sit down.
spk_0
He either has to stand or lie down. And so a long flight across the ocean is not doable.
spk_0
But he needs to be there as a part of the diligence process. You have Jeff Hinton. You need to use him to figure out if you're going to buy a deep learning company.
spk_0
And so Alan Eustace decides he's going to charter a private jet.
spk_0
And he's going to build this crazy custom harness rig so that Jeff Hinton won't be sliding around when he's laying on the floor during takeoff and landing.
spk_0
Wow. I was thinking the first part of this.
spk_0
I'm pretty sure Google has planes. They could just get a Google plane; for whatever reason, this was a separate charter.
spk_0
But it's not solvable just with a private plane. You need also a harness.
spk_0
Right. And Alan is the guy who set the record for jumping from the world's highest, was it a balloon?
spk_0
I actually don't know. It was the highest free-fall jump that anyone has ever done, even higher than that Red Bull stunt a few years before.
spk_0
So he's like very used to designing these custom rigs for airplanes. He's like, oh, no problem. You just need a bed and some straps.
spk_0
I jumped out of the atmosphere in a scuba suit. I think we'll be fine. That is amazing.
spk_0
So they fly to London. They do the diligence. They make the deal. Demis has true kinship with Larry. And it's done.
spk_0
$550 million. There's an independent oversight board that is set up to make sure that the mission and goals of DeepMind are actually being followed.
spk_0
And this is an asset that Google owns today that, again, I think would be worth half a trillion dollars if it were independent.
spk_0
Do you know what other member of the PayPal Mafia gets put on the ethics board after the acquisition?
spk_0
Reid Hoffman. It has to be Reid Hoffman, given the OpenAI tie later.
spk_0
We are going to come back to Reid in just a little bit here. Yes. So after the acquisition, it goes very well very quickly.
spk_0
Famously, the data center cooling thing happens, where DeepMind carved off some part of the team to go and be an emissary to Google and look for ways to use DeepMind's technology.
spk_0
And one of them is around data center cooling. Very quickly, in July of 2016, Google announces a 40% reduction in the energy required to cool data centers.
spk_0
I mean, Google has a lot of data centers, and that's a 40% energy reduction. I actually talked with Jim Gao, who's a friend of the show and actually led a big part of this project.
spk_0
And I mean, it was just the most obvious application of neural networks inside of Google right away pays for itself.
spk_0
Yeah, imagine that paid for the acquisition pretty quickly there. Yes. David, should we talk about AlphaGo on this episode?
spk_0
Yeah, yeah, yeah.
spk_0
I watched the whole documentary that Google produced about it. It's awesome. This is actually something that you would enjoy watching, even if you're not researching a podcast episode and just looking to pull something up and spend an hour or two.
spk_0
I highly recommend it. It's on YouTube. It's the story of how DeepMind, post-acquisition by Google, trained a model to beat the world Go champion at Go.
spk_0
And I mean, everyone in the whole Go community coming in thought there's no chance. This guy Lee Sedol is so good that there's no way an AI could possibly beat him.
spk_0
It's a five-game match. It just won the first three games straight. I mean, completely cleaned up, and with inventive new creative moves that no human has played before. That's sort of the big crazy takeaway.
spk_0
There's a moment in one of the games where it makes a move and people are like, is that a mistake?
spk_0
Move 37. Yeah, move 37. And then 100 moves later it plays out that it was completely genius. And humans are now learning from DeepMind's strategy of playing the game and discovering new strategies.
spk_0
A fun thing for Acquired listeners who are like, why is it Go? Go is so complicated compared to chess. Chess has 20 moves that you can make at the beginning of the game on any given turn, and then mid-game,
spk_0
there's like 30 to 40 moves that you could make. Go, on any given turn, has about 200. And so if you think combinatorially, the number of possible configurations of the board is more than the number of atoms in the universe.
spk_0
That's a great Demis quote, by the way. And so he says, even if you took all the computers in the world and ran them for a million years, as of 2017,
spk_0
That wouldn't be enough compute power to calculate all the possible variations. So it's cool because it's a problem that you can't brute force.
spk_0
You have to do something like neural networks and there is this white space to be creative and explore and so it served as this amazing breeding ground for watching a neural network be creative against a human.
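For the curious, the arithmetic behind that claim is easy to check with a few lines of Python. This is just a back-of-the-envelope sketch, not anything from the episode: it assumes the standard 19x19 board and the commonly cited ~10^80 estimate for atoms in the observable universe.

```python
# A standard Go board has 19x19 = 361 intersections, each of which is
# empty, black, or white, so 3^361 upper-bounds the number of board
# configurations. (The exact count of *legal* positions, roughly
# 2.1 * 10^170, was computed by John Tromp in 2016.)
GO_POINTS = 19 * 19
board_configs = 3 ** GO_POINTS

ATOMS_IN_UNIVERSE = 10 ** 80  # common rough estimate

print(f"3^361 is roughly 10^{len(str(board_configs)) - 1}")  # ~10^172
print(board_configs > ATOMS_IN_UNIVERSE)       # True
print(board_configs > ATOMS_IN_UNIVERSE ** 2)  # True: more than atoms squared
```

Even this loose upper bound dwarfs the atom count, which is why exhaustive search is hopeless and why AlphaGo needed learned evaluation instead of brute force.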
spk_0
Yeah. And of course it's totally in line with Demis's background and the DNA of the company, playing games. Demis was a chess champion, and then after Go, they play StarCraft, right?
spk_0
Oh really? I actually didn't know that. Yeah, that was the next game that they tackled, StarCraft, a real-time strategy game against an opponent.
spk_0
And that'll come back up in a sec with another opponent here, in OpenAI.
spk_0
Yes, David. But before we talk about the creation of the other opponent, should we thank another one of our friends here at Acquired?
spk_0
Yes we should.
spk_0
All right, listeners, we are here today to tell you about a new friend of the show we are very excited about: WorkOS.
spk_0
Yes, if you're building software that is used in enterprises, you've probably felt the pain of integrating things like SSO, SCIM, permissions, audit logs, and all the other features that are required by big customers.
spk_0
And if you haven't felt this pain yet just wait until you get your first big enterprise customer and trust us you will.
spk_0
Yes, WorkOS turns these potential deal blockers into simple drop-in APIs. And while WorkOS had great product-market fit a few years ago with developers who just want to save on some headache,
spk_0
They really have become essential in the AI era.
spk_0
Yeah, I was shocked when they sent over their latest customer list. Almost all the big AI companies use WorkOS today as the way that they've been able to rapidly scale revenue so fast.
spk_0
Companies like OpenAI, Anthropic, Cursor, Perplexity, Sierra, Replit, Vercel, and hundreds of other AI startups all rely on WorkOS as their auth solution.
spk_0
So I called the founder to ask why, and he said it's basically two things. One, in the AI era these companies scale so much faster that they need things like authorization, authentication, and SSO quickly to become enterprise-ready and keep up with customer demand even early in life,
spk_0
unlike older SaaS companies of yesterday. And two, unlike that world where you could bring your own little SaaS product just for you and your little team, these AI products reach deep into your company's systems and data to become the most effective, so IT departments are scrutinizing heavier than ever to make sure that new products are compliant before they can adopt them.
spk_0
Yeah, it's this kind of second-order effect of the AI era: the days of, oh, just swipe a credit card and bring your own SaaS solution for your product team.
spk_0
You actually need to be enterprise ready a lot sooner than you did before.
spk_0
Yeah, it's not just about picking up that big potential customer for the revenue itself, either; it's about doing it so your competitors don't. Enterprise readiness has become table stakes for companies no matter their stage, and WorkOS is basically the weapon of choice for the best software companies to shortcut this process and get back to focusing on what makes their beer taste better: building the product itself.
spk_0
Amen, amen. So if you're ready to get started with just a few lines of code for SAML, SCIM, RBAC, SSO, authorization, authentication, and everything else to please IT admins and their checklists, check out WorkOS. It's the modern software platform to make all this happen. That's workos.com, and just tell them that Ben and David sent you.
spk_0
All right, David, so what are the second order effects of Google buying DeepMind? Well, there's one person
spk_0
who is really, really, really upset about this, and maybe two people if you include Mark Zuckerberg. But
spk_0
Mark tends to play his cards a little closer to the vest. Of course, Elon Musk is very upset about this acquisition.
spk_0
When Google buys DeepMind out from under him, Elon goes ballistic. As we said, Elon and Larry had always been very close, and now here's Google, who
spk_0
Elon has already started to sour on a little bit as he's now trying to hire AI researchers, and you've got Alan Eustace flying around the world sucking up all of the AI researchers into Google.
spk_0
And Elon had invested in DeepMind, wanted to bring DeepMind into his own AI team at Tesla, and it's gone out from under him. So this leads to one of the most fateful dinners in Silicon Valley's history,
spk_0
organized in the summer of 2015 at the Rosewood Hotel on Sand Hill Road, of course, where else would you do a dinner in Silicon Valley but the Rosewood, by two of the leading figures in the Valley at the time: Elon Musk and Sam Altman, Sam of course being president of Y Combinator at the time.
spk_0
So what is the purpose of this dinner? They are there to make a pitch to all of the AI researchers that Google, and to a certain extent Facebook, have sucked up and basically created this duopoly situation in.
spk_0
Google's business model and Facebook's business model, these feed recommenders or these classifiers, turn out to be unbelievably valuable. So they, it's funny in hindsight saying this, paid tons of money to these people, tons of money, like millions of dollars, took them out of academia and put them into their dirty capitalist research labs inside the companies selling advertising. Yes, how dirty could you be.
spk_0
And the question in the pitch that Elon and Sam have for these researchers gathered at this dinner is: what would it take to get you out of Google, for you to leave? And the answer, going around the table, from almost everybody, is: nothing, you can't. Why would we leave? We're getting paid way more money than we ever imagined, and many of us get to keep our academic positions and affiliations.
spk_0
And we get to hang out here at Google with each other. Iron sharpens iron. It's the best minds in the world getting to do cutting-edge research with an enormous amount of resources and hardware at their disposal.
spk_0
It's amazing. Best infrastructure in the world, we've got Jeff Dean here. There is nothing you could tell us that would cause us to leave Google.
spk_0
Except there's one person who is intrigued. And to quote from an amazing Wired article at the time by Cade Metz, who would later write Genius Makers. Right, yeah, exactly.
spk_0
Quote: the trouble was, so many of the people most qualified to solve these problems were already working for Google, and no one at the dinner was quite sure that these thinkers could be lured into a new startup, even if Musk and Altman were behind it.
spk_0
But one key player was at least open to the idea of jumping ship and there's a quote from that key player.
spk_0
I felt like there were risks involved, but I also felt like it would be a very interesting thing to try. The most Ilya quote of all time. The most Ilya quote of all time, because that person was Ilya Sutskever.
spk_0
Of course, of AlexNet and DNNresearch and Google, and about to become founding chief scientist of OpenAI.
spk_0
So the pitch that Elon and Sam are making to these researchers is: let's start a new nonprofit AI research lab where we can do all this work out in the open. You can publish, free of the forces of Facebook and Google and independent of their control.
spk_0
Yes: you don't have to work on products, you can work only on research, you can publish your work, it will be open, it will be for the good of humanity. All of these incredible advances, this intelligence that we believe is to come, will be for the good of everyone, not just for Google and Facebook.
spk_0
And for one of the researchers it seemed too good to be true so they basically weren't doing it because they didn't think anyone else would do it.
spk_0
It's sort of an activation energy problem. Once Ilya said, okay, I'm in, and once he said I'm in, by the way, Google came back with a big counter, something like double the offer, and I think it was delivered by Jeff Dean personally, and Ilya said, nope, I'm doing this. That was massive for getting the rest of the top researchers to go with him.
spk_0
And it was nowhere near all of the top researchers who left Google to do this, but it was enough. It was a group of seven or so researchers who left Google and joined Elon and Sam, and Greg Brockman from Stripe, who came over to create OpenAI. Because that was the pitch: we're all going to do this in the open. And that's totally what it was.
spk_0
And the stated mission of OpenAI was, to quote, to "advance digital intelligence in the way that is most likely to benefit humanity as a whole, unconstrained by a need to generate financial return."
spk_0
Which is fine as long as the thing that you need to fulfill your mission doesn't take tens of billions of dollars.
spk_0
Yes.
spk_0
So here's how they would fund it. Originally there was a billion dollars pledged. Yes. And that came from, famously, Elon Musk, Sam Altman, Reid Hoffman, Jessica Livingston, who I think most people don't realize was part of that initial tranche, and Peter Thiel.
spk_0
Yep. Founders Fund, of course, would go on to put massive amounts of money into OpenAI itself later as well.
spk_0
The funny thing is, it was later reported that a billion dollars was not actually collected; only about 130 million of it was actually collected to fund this nonprofit. And for the first few years, that was plenty for the type of research they were doing and the type of compute they needed. Most of that money was going to paying salaries to the researchers, not as much as they could make at Google and Facebook, but still millions of dollars.
spk_0
So that really worked, until it really didn't. So David, what were they doing in the early days? Well, in the first days it was all hands on deck recruiting and hiring researchers. There was the initial crew that came over, and then pretty quickly after that, in early 2016, they get a big, big win when Dario Amodei leaves Google, comes over, and joins Ilya and crew.
spk_0
At OpenAI. Dream team, you know, assembling here. And was he on Google Brain before this? He was on Google Brain, yep. And he, along with Ilya, would run large parts of OpenAI for the next couple of years before, of course, leaving to start Anthropic.
spk_0
And we're still a couple of years away from Anthropic, Claude, ChatGPT, Gemini, everything today. For at least the first year or two, basically the plan at OpenAI is: let's look at what's happening at DeepMind, and show the research community that we, as a new lab, can do the same incredible things that they're doing, and maybe even do them better.
spk_0
Is that why it looks so game-like and game-focused? Yes, yes. So they start building models to play games. Famously, the big one that they do is Dota 2, Defense of the Ancients 2, the massively multiplayer online battle arena video game.
spk_0
All right, well, DeepMind, you're playing StarCraft? We'll go play Dota 2. That's even more complex, more real-time. And similar to the emergent properties in Go, the model would devise unique strategies that you wouldn't see humans trying. So it clearly wasn't humans coding their favorite strategies in as rules; it was emergent. Yup.
spk_0
They did other things they had a product called universe which was around training computers to play thousands of games from Atari games to open world games like Grand Theft Auto.
spk_0
They had something where they were teaching a model how to solve a Rubik's Cube. So it was a diverse set of projects that didn't seem to coalesce around 'one of these is going to be the big thing.'
spk_0
Yeah, it was research stuff. It was what DeepMind was doing. Yeah, it was like university research. It was like DeepMind.
spk_0
And if you think back to Elon being an investor in DeepMind and being really upset about Google acquiring it out from under him, it makes sense. And I think Elon deserves a lot of credit for having his name and his time attached to OpenAI at the beginning. A lot of the big heavy-hitter recruiting was Elon throwing his weight behind this, saying I'm willing to take a chance. Absolutely.
spk_0
Okay so that's what's going on over at open AI doing a lot of deep mind like stuff.
spk_0
A bunch of projects, not one single obvious big thing they're coalescing around. It's not ChatGPT time yet, let's put it that way.
spk_0
Let's go back to Google, because last we sort of checked in on them, yeah, they bought DeepMind, but they had their talent raided. And I don't want you to get the wrong impression about where Google is sitting just because people left to go to OpenAI.
spk_0
So back in 2013, when Alex Krizhevsky arrives at Google with Geoff Hinton and Ilya Sutskever, he was shocked to discover that all their existing machine learning models were running on CPUs.
spk_0
People had asked in the past for GPUs, since machine learning workloads were well suited to run in parallel, but Google's infrastructure team had pushed back on the added complexity of expanding and diversifying the fleet: let's keep things simple, that doesn't seem important for us.
spk_0
We're a CPU shop here. Yes. And so, to quote from Genius Makers: in his first days at the company, he, this is Alex, went out and bought a GPU machine from a local electronics store, stuck it in a closet down the hall from his desk, plugged it into the network, and started training his neural networks on this lone piece of hardware.
spk_0
Just like he did in academia except this time Google's paying for the electricity.
spk_0
Obviously one GPU is not sufficient, especially as more Googlers wanted to start using it too. And Jeff Dean and Alan Eustace had also come to the conclusion that DistBelief, while amazing, had to be re-architected to run on GPUs and not CPUs.
spk_0
So spring of 2014 rolls around. Jeff Dean and John Giannandrea, who we haven't talked about yet this episode.
spk_0
Yeah, JG. Yes, you might be wondering, wait, isn't that the Apple guy? Yes, he went on to be Apple's head of AI, but at this point in time he was at Google and oversaw Google Brain. In 2014 they sit down to make a plan for how to actually, formally, put GPUs into the fleet of Google's data centers, which is a big deal, a big change.
spk_0
But they're seeing enough traction with neural networks that they know, after AlexNet, doing this is just a matter of time.
spk_0
Yeah, so they settle on a plan: order 40,000 GPUs from Nvidia. Yeah, of course, who else would you order them from? For a cost of $130 million.
spk_0
That's a big enough price tag that the request gets elevated to Larry Page, who personally approves it, even though finance wanted to kill it, because he goes, look, the future of Google is deep learning.
spk_0
As an aside, let's look at Nvidia at the time this is a giant giant order their total revenue was $4 billion.
spk_0
This is one order for 130 million. I mean, Nvidia is primarily a consumer graphics card company at this point. Yes, and their market cap is $10 billion.
spk_0
It's almost like Google gave Nvidia a secret that hey, not only does this work in research like the ImageNet competition, but neural networks are valuable enough to us as a business to make a hundred plus million dollar investment in right now.
spk_0
No questions asked. We got to ask Jensen about this at some point this had to be a tell.
spk_0
This had to really give Nvidia the confidence of, oh, we should invest way ahead on this being a giant thing in the future.
spk_0
So all of Google wakes up to this idea. They start really putting it into their products: Google Photos happened, Gmail starts offering typing suggestions, and David, as you pointed out earlier, Google's giant AdWords business started finding more ways to make more money with deep learning.
spk_0
In particular when they integrated it they could start predicting what ads people would click in the future.
spk_0
And so Google started spending hundreds of millions more on GPUs on top of that 130 million, but very quickly paying it back from their ad system.
spk_0
So it became more and more of a no brainer to just buy as many GPUs as they possibly could.
spk_0
But once neural nets started to work, anyone using them, especially at Google scale, kind of had this problem.
spk_0
Well, now we need to do giant amounts of matrix multiplications any time anybody wants to use one. The matrix multiplications are effectively how you do the propagation through the layers of the neural network.
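The layer-by-layer propagation described here can be sketched in a few lines. This is a toy illustration, not Google's actual code; the layer sizes are made up:

```python
import numpy as np

# Toy sketch (not Google's code): a neural network forward pass is a chain
# of matrix multiplications plus simple nonlinearities, which is why matmul
# throughput is the thing GPUs and TPUs are built to maximize.
# Layer sizes here are made up for illustration.

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0, x)

W1 = rng.standard_normal((4, 8))   # weights: 4 inputs -> 8 hidden units
W2 = rng.standard_normal((8, 2))   # weights: 8 hidden -> 2 outputs

x = rng.standard_normal((1, 4))    # one input example
h = relu(x @ W1)                   # matmul 1: propagate into the hidden layer
y = h @ W2                         # matmul 2: propagate to the output layer
print(y.shape)                     # (1, 2)
```

Each layer is one matrix multiply, so a big model serving many users is matmuls all the way down.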
spk_0
So you sort of have this problem. Yes, totally there's the inefficiency of it, but then there's also the business problem of.
spk_0
Wait a minute it looks like we're just going to be shipping hundreds of millions soon to be billions of dollars over to Nvidia every year for the foreseeable future.
spk_0
Right. So there's this amazing moment right after Google rolls out speech recognition, their latest use case for neural nets, just on Nexus phones, because again they don't have the infrastructure to support it on all Android phones.
spk_0
It becomes a super popular feature, and Jeff Dean does the math and figures out: if people use this for, call it three minutes a day, and we roll it out to all billion Android phones...
spk_0
We're going to need twice the number of data centers that we currently have across all of Google just to handle it just for this feature.
spk_0
Yeah, there's a great quote where Jeff goes to Urs Hölzle and says, we need another Google.
spk_0
Or David as you were hinting at the other option is we build a new type of chip customized for just our particular use case.
spk_0
Yeah, matrix multiplication tensor multiplication a tensor processing unit you might say.
spk_0
Yes, wouldn't that be nice? So conveniently, Jonathan Ross, who's an engineer at Google, has been spending his 20% time at this point in history working on an effort involving FPGAs, which are essentially expensive but programmable chips that yield really fantastic results.
spk_0
So they decide to create a formal project to take that work, combine it with some other existing work, and build a custom ASIC, an application-specific integrated circuit.
spk_0
So enter, David, as you said, the tensor processing unit: made just for neural networks, far more efficient than GPUs at the time, with the trade-off that you can't really use it for anything else.
spk_0
It's not good for graphics processing. It's not good for lots of other GPU workloads just matrix multiplication and just neural networks.
spk_0
But it would enable Google to scale their data centers without having to double their entire footprint.
spk_0
So the big idea behind the TPU, if you're trying to figure out what the core insight was: they used reduced computational precision.
spk_0
So it would take numbers like 4,586.8272 and round it just to 4,586.8 or maybe even just 4,586 with nothing after the decimal point.
spk_0
And this sounds kind of counterintuitive at first why would you want less precise rounded numbers for this complicated math.
spk_0
The answer is efficiency. If you can do the heavy lifting in your software architecture, what's called quantization, to account for it:
spk_0
You can store information as less precise numbers. Then you can use the same amount of power and the same amount of memory and the same amount of transistors on a chip to do far more calculations per second.
spk_0
So you can either spit out answers faster or use bigger models. The whole thing is quite clever behind the TPU.
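The reduced-precision idea can be sketched like this: store values as 8-bit integers plus one scale factor, a simple form of quantization. The scheme and the numbers here are illustrative; this is not the TPU's actual number format:

```python
import numpy as np

# Toy sketch of reduced precision: represent values as 8-bit integers plus
# a single scale factor. The scheme and numbers are illustrative only; this
# is not the TPU's actual number format.

weights = np.array([4586.8272, -1234.5678, 0.25, 9999.99])

scale = np.abs(weights).max() / 127.0              # map the value range onto int8
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)

dequantized = q.astype(np.float64) * scale         # approximate reconstruction
print(q.nbytes, weights.nbytes)                    # 4 32 -- 4 bytes stored vs 32
```

The reconstructed values are close to, not equal to, the originals; that small loss of precision buys an 8x reduction in storage, and correspondingly more calculations per watt and per transistor.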
spk_0
The other thing that has to happen with the TPU is it needs to happen now, because it's very clear speech-to-text is a thing.
spk_0
It's very clear some of these other use cases at Google.
spk_0
Yeah. Demand for all of this stuff that's coming out of Google brain is through the roof immediately.
spk_0
Right. And we're not even to LLMs yet. It's just, like, everyone sort of expects some of this.
spk_0
Whether it's computer vision and photos or speech recognition like it's just becoming a thing that we expect and it's going to flip Google's economics upside down if they don't have it.
spk_0
So the TPU was designed, verified, built, and deployed into data centers in 15 months.
spk_0
Well, it was not like a research project that could just happen over several years.
spk_0
This was like a hair on fire problem that they launched immediately.
spk_0
One very clever thing that they did was they used the FPGAs as a stopgap.
spk_0
So even though they were, like, too expensive on a unit basis, they could get them out as a test fleet and just make sure all the math worked before they actually had the ASICs printed.
spk_0
I don't know if it was TSMC, but, you know, fabbed and ready.
spk_0
The other thing they did is they fit the TPU into the form factor of a hard drive.
spk_0
So it could actually slot into the existing server racks. You just pop out a hard drive and you pop in a TPU without needing to do any physical rearchitecture.
spk_0
Wow. That's amazing. That's the most Google-y infrastructure story since the cork boards.
spk_0
Exactly. Also, all of this didn't happen in Mountain View. It was at a Google satellite office in Madison, Wisconsin.
spk_0
Whoa. Yes.
spk_0
Why Madison, Wisconsin?
spk_0
There was a particular professor out of the university, and there were a lot of students they could recruit.
spk_0
Wow.
spk_0
Yeah. I mean it was probably them or Epic. Where are you going to go work?
spk_0
Yeah.
spk_0
Wow.
spk_0
They also then just kept this a secret.
spk_0
Right. Why would you tell anybody about this?
spk_0
Because it's not like they're offering these in Google Cloud at least at first.
spk_0
And why would you want to tell the rest of the world what you're doing?
spk_0
So the whole thing was a complete secret for at least a year before they announced it at Google I.O.
spk_0
So really crazy.
spk_0
The other thing to know about the TPUs is they were done in time for the AlphaGo match.
spk_0
So that match ran on a single machine with four TPUs in Google Cloud.
spk_0
And once that worked, obviously that gave Google a little bit of extra confidence to really push TPUs into production.
spk_0
So that's the TPU. V1, by all accounts, was not great.
spk_0
They're on V7 or V8 now. It's gotten much better.
spk_0
TPUs and GPUs look a lot more similar than they used to.
spk_0
They've sort of adopted features from each other.
spk_0
But today Google, it's estimated, has two to three million TPUs. For reference:
spk_0
Nvidia shipped, people don't know for sure,
spk_0
Somewhere around four million GPUs last year.
spk_0
So people talk about AI chips like it's just this one-horse race with Nvidia.
spk_0
Google has like an almost Nvidia-scale internal operation making their own chips.
spk_0
So they're going to have a lot of different types of chips at this point, for their own use and for Google Cloud customers.
spk_0
The TPU is a giant deal in AI in a way that I think a lot of people don't realize.
spk_0
Yep. This is one of the great ironies, and maddening things to OpenAI and Elon Musk: OpenAI gets founded in 2015 with the goal of, hey, let's shake all this talent out of Google and level the playing field.
spk_0
And Google just accelerates.
spk_0
Right. They also build TensorFlow.
spk_0
That's the framework that Google brain built to enable researchers to build and train and deploy machine learning models.
spk_0
And they built it in such a way that it doesn't just have to run on TPUs.
spk_0
It's super portable, without any rewrites, to run on GPUs or even CPUs too.
spk_0
So this would replace the old DistBelief system and kind of be their internal and external framework for enabling ML researchers going forward.
spk_0
So, somewhat paradoxically, during these years after the founding of OpenAI:
spk_0
Yes, some amazing researchers are getting siphoned off from Google and Google brain.
spk_0
But Google brain is also firing on all cylinders during this time frame.
spk_0
Delivering on the business purposes for Google left and right.
spk_0
Yes. And pushing the state of the art forward in so many areas.
spk_0
And then in 2017, a paper gets published from eight researchers on the Google brain team.
spk_0
Kind of quietly.
spk_0
These eight folks were obviously very excited about the paper and what it described and the implications of it.
spk_0
And they thought it would be very big.
spk_0
Google itself, though, is kind of like:
spk_0
Cool. This is like the next iteration of our language model work.
spk_0
Great.
spk_0
Which is important to us.
spk_0
But are we sure this is the next Google?
spk_0
No.
spk_0
No. There are a whole bunch of other things we're working on that seem more likely to be the next Google.
spk_0
But this paper and its publication would actually be what gave OpenAI the opportunity to build the next Google.
spk_0
To grab the ball and run with it and build the next Google.
spk_0
Because this is the transformer paper.
spk_0
Okay. So where did the transformer come from?
spk_0
Like what was the latest thing that language models had been doing at Google?
spk_0
So coming out of the success of Franz Och's work on Google Translate and the improvements that happened there.
spk_0
In like the late 2000s-ish, 2007.
spk_0
Yeah.
spk_0
Mid-to-late 2000s.
spk_0
They keep iterating on Translate.
spk_0
And then once Geoff Hinton comes on board and AlexNet happens, they switch over to a neural network-based language model for Translate.
spk_0
Which was dramatically better and like a big crazy cultural thing.
spk_0
Because you've got these researchers parachuting in again led by Jeff Dean saying,
spk_0
I'm pretty sure our neural networks can do this way better than the classic methods that we've been using for the last 10 years.
spk_0
What if we take the next several months and do a proof of concept?
spk_0
They end up throwing away the entire old code base and just completely wholesale switching to this neural network.
spk_0
There's actually this great New York Times magazine story that ran in 2016 about it.
spk_0
And I remember reading the whole thing with my jaw on the floor, like, wow, neural networks are a big effing deal.
spk_0
And this was the year before the transformer paper would come out. Before the transformer paper.
spk_0
Yes.
spk_0
So they do the rewrite at Google Translate, make it based on recurrent neural networks,
spk_0
which were state of the art at that point in time.
spk_0
And it's a big improvement.
spk_0
But as teams within Google Brain and Google Translate keep working on it, there's some limitations.
spk_0
And in particular, a big problem was that they quote unquote forgot things too quickly.
spk_0
I don't know if it's exactly the right analogy, but you might say in sort of like today's transformer world speak,
spk_0
you might say that the context window was pretty short.
spk_0
As these language models progressed through text, they needed to sort of remember everything they had read
spk_0
so that when they need to change a word later or come up with the next word,
spk_0
they could have a whole memory of the body of text to do that.
spk_0
So one of the ways that Google tries to improve this is to use something called long short term memory networks or LSTMs as the acronym that people use for this.
spk_0
And basically what LSTMs do is create a persistent, well, long short-term memory.
spk_0
You got to use your brain a little bit here for the model so that it can keep context as it's going through a whole bunch of steps.
spk_0
And people were pretty excited about LSTMs at first.
spk_0
People are thinking like, oh, LSTMs are what are going to take language models and large language models mainstream.
spk_0
Right.
spk_0
And indeed in 2016, they incorporated these LSTMs into Google Translate.
spk_0
It reduces the error rate by 60%.
spk_0
Huge jump.
spk_0
Yep.
spk_0
The problem with LSTMs though, they were effective, but they were very computationally intensive.
spk_0
And they didn't parallelize that great.
spk_0
All the efforts that are coming out of AlexNet and the TPU project are about parallelization.
spk_0
This is the future. This is how we're going to make AI really work.
spk_0
LSTMs are a bit of a roadblock here.
spk_0
Yes.
spk_0
So a team within Google Brain starts searching for a better architecture that also has the attractive properties of LSTMs that it doesn't forget context too quickly, but can parallelize and scale better.
spk_0
Take advantage of all these new architectures.
spk_0
Yes.
spk_0
And a researcher named Jakob Uszkoreit had been toying around with the idea of broadening the scope of quote-unquote attention in language processing.
spk_0
What if rather than focusing on the immediate words, instead, what if you told the model, hey, pay attention to the entire corpus of text.
spk_0
Not just the next few words.
spk_0
Look at the whole thing.
spk_0
And then based on that entire context and giving your attention to the entire context, give me a prediction of what the next translated word should be.
spk_0
Now, by the way, this is actually how professional human translators translate text.
spk_0
You don't just go word by word.
spk_0
I actually took a translation class in college, which was really fun.
spk_0
You read the whole thing of the original in the original language.
spk_0
You get and understand the context of what the original work is.
spk_0
And then you go back and you start to translate it with the entire context of the passage in mind.
spk_0
So it would take a lot of computing power for the model to do this, but it is extremely parallelizable.
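That whole-context attention idea can be sketched as scaled dot-product attention, where every position scores every other position in one batch of matrix multiplications, which is exactly what parallel hardware is good at. Shapes and values here are illustrative, not from the paper's experiments:

```python
import numpy as np

# Minimal sketch of attention: each token looks at the entire sequence at
# once, and the whole computation is a few matrix multiplications, so it
# parallelizes well. Shapes and random values are illustrative only.

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

seq_len, d = 5, 8                      # 5 tokens, 8-dimensional representations
rng = np.random.default_rng(0)
Q = rng.standard_normal((seq_len, d))  # queries
K = rng.standard_normal((seq_len, d))  # keys
V = rng.standard_normal((seq_len, d))  # values

scores = Q @ K.T / np.sqrt(d)          # each token scores every token, all at once
weights = softmax(scores, axis=-1)     # attention weights over the whole sequence
out = weights @ V                      # each output is a weighted mix of all positions
print(out.shape)                       # (5, 8)
```

Unlike an LSTM, no step waits on the previous one: the scoring and mixing happen for all positions simultaneously.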
spk_0
So Jacob starts collaborating with a few other people on the brain team.
spk_0
They get excited about this.
spk_0
They decide that they're going to call this new technique the transformer.
spk_0
Because one, that is literally what it's doing.
spk_0
It's taking in a whole chunk of information, processing, understanding it, and then transforming it.
spk_0
And they also loved Transformers as kids.
spk_0
That's not why they named it the transformer.
spk_0
And it's taking in the giant corpus of text and storing it in a compressed format, right?
spk_0
Yeah.
spk_0
I bring this up because that is exactly how you pitched the micro-kitchen conversation with Noam Shazeer in 2000, 2001, 17 years earlier.
spk_0
Who is a co-author on this paper?
spk_0
Yes, well, so speaking of Noam Shazeer, he learns about this project.
spk_0
And he decides, hey, I've got some experience with this.
spk_0
This sounds pretty cool.
spk_0
LSTMs definitely have problems.
spk_0
This could be promising.
spk_0
I'm going to jump in and work on it with these guys.
spk_0
And it's a good thing he did because before Noam joined the project, they had a working implementation of the transformer.
spk_0
But it wasn't actually producing any better results than LSTMs.
spk_0
Noam joins the team.
spk_0
Basically pulls a Jeff Dean, rewrites the entire code base from scratch.
spk_0
And when he's done, the transformer now crushes the LSTM-based Google Translate solution.
spk_0
And it turns out that the bigger they make the model, the better the results get.
spk_0
It seems to scale really, really, really well.
spk_0
Steven Levy wrote a piece in Wired about the history of this.
spk_0
And there are all sorts of quotes from the other members of the team just littered all over this piece with things like Noam is a magician.
spk_0
Noam is a wizard. Noam took the idea and came back and said, it works now.
spk_0
Yeah.
spk_0
And you wonder why Noam and Jeff Dean are the ones together working on the next version of Gemini now.
spk_0
Yes. Noam and Jeff Dean, definitely two peas in a pod here.
spk_0
Yes.
spk_0
So we talked to Greg Corrado from Google Brain, one of the founders of Google Brain.
spk_0
And it was a really interesting conversation because he underscored how elegant the transformer was.
spk_0
And he said it was so elegant that people's response was often this can't work.
spk_0
It's too simple.
spk_0
Transformers are barely a neural network architecture.
spk_0
Right. It was another big change from the AlexNet, Geoff Hinton lineage of neural networks.
spk_0
Yeah. It actually has changed the way that I look at the world because he pointed out that in nature, this is Greg,
spk_0
the way things usually work is the most energy efficient way they could work.
spk_0
Almost from an evolution perspective that the most simple elegant solutions are the ones that survive because they are the most efficient with their resources.
spk_0
And you can kind of port this idea over to computer science too. He said he's developed a pattern recognition inside of the research lab: you're probably on to the right solution when it's really simple and really efficient,
spk_0
versus when it's a complex idea.
spk_0
Very clever. I think it's very true.
spk_0
You know how when you sit with a thorny problem and you debate it and you whiteboard it and you work through all of it, and then you're like, oh my God, oh my God, it's so simple?
spk_0
And that ends up being the right answer.
spk_0
Yeah. There's an elegance to the transformer.
spk_0
Yes. And that other thing that you touched on there, this is the beginning of the modern AI era.
spk_0
Just feed it more data. The famous piece, the bitter lesson by Rich Sutton, wouldn't be published until 2019 for anyone who hasn't read it.
spk_0
It's basically: we always think, as AI researchers, that we're so smart and our job is to come up with another great algorithm, but effectively in every field, from language to computer vision to chess,
spk_0
you just figure out a scalable architecture, and then more data wins.
spk_0
Just these infinitely scaling more data, more compute better results.
spk_0
Yes. And this is really the start of when that starts to be like, oh, we have found the scalable architecture that will go so far for, I don't know, close to a decade of just more data and more energy, more compute better results.
spk_0
So the team and Noam are like, yo, this thing has a lot, a lot of potential.
spk_0
This is more than better translate. We can really apply this.
spk_0
Yeah, this is going to be more than better Google Translate. The rest of Google, though, was definitely slower to wake up to the potential.
spk_0
They build some stuff. Within a year they build BERT, the large language model.
spk_0
Yes, absolutely. It is a false narrative out there that Google did nothing with the transformer after the paper was published. They actually did a lot.
spk_0
In fact, BERT was one of the first LLMs.
spk_0
Yes, they did a lot with transformer based large language models after the paper came out.
spk_0
What they didn't do was treat it as a wholesale technology platform change.
spk_0
Right. They were doing things like BERT and MUM, this other model, you know; they could work it into search results quality.
spk_0
And I think that did meaningfully move the needle even though Google wasn't bragging about it and talking about it.
spk_0
They got better at query comprehension. They were working it into the core business.
spk_0
Just like every other time, Google brain came up with something great.
spk_0
Yep. So in perhaps one of the greatest decisions ever for value to humanity and maybe one of the worst corporate decisions ever for Google.
spk_0
Google allows this group of eight researchers to publish the paper, under the title "Attention Is All You Need."
spk_0
Obviously a nod to the classic Beatles song about love.
spk_0
As of today in 2025, this paper has been cited over 173,000 times in other academic papers, making it currently the seventh most cited paper of the 21st century.
spk_0
And I think all of the other papers above it on the list have been out much longer.
spk_0
And also, of course, within a couple of years, all eight authors of the transformer paper had left Google to either start or join AI startups, including OpenAI.
spk_0
Brutal.
spk_0
And of course, Noam starting Character.AI, which, well, call it an acquisition.
spk_0
He would end up back at Google via some strange licensing and IP and hiring agreement, on the order of a few billion dollars.
spk_0
Very, very expensive mistake on Google's part. Is it fair to say that 2017 begins the five-year period of Google not sufficiently seizing the opportunity that they had created with the transformer?
spk_0
Yes. So speaking of seizing opportunities, what is going on at open AI during this time?
spk_0
And does anyone think the transformers a big deal over there?
spk_0
Yes, yes, they did. But here's where history gets really, really crazy.
spk_0
Right after Google publishes the transformer paper, in September of 2017, Elon gets really, really fed up with what's going on at OpenAI.
spk_0
There's like seven different strategies. Are we doing video games? Are we doing competitions? What's the plan?
spk_0
What is happening here? As best as I can tell, all you're doing is just trying to copy DeepMind.
spk_0
Meanwhile, I'm here building SpaceX and Tesla. Self-driving is becoming more and more clear as critical to the future of Tesla.
spk_0
I need AI researchers here and I need great AI advancements to come out to help what we're doing at Tesla. Open AI isn't cutting it.
spk_0
So he makes an ultimatum to Sam and the rest of the open AI board.
spk_0
He says, I'm happy to take full control of open AI and we can merge this into Tesla.
spk_0
I don't even know how that would be possible to merge a nonprofit into Tesla.
spk_0
But in Elon land, if he takes over as CEO of open AI, it almost doesn't matter.
spk_0
We're just treating it as if it's the same company anyway, just like we do with the deals with all of my companies.
spk_0
Right. Or he's out completely along with all of his funding.
spk_0
Sam and the rest of the board are like, no.
spk_0
And as we know now, they're sort of calling capital into the business. It's not like they actually got all the cash upfront.
spk_0
Right. So they're only 130 million ish into the billion dollars of commitment.
spk_0
They don't reach a resolution, and by early 2018, Elon is out, and along with him, the main source of OpenAI's funding.
spk_0
So either this is just a really, really, really bad misjudgment by Elon or the sort of panic that this throws open AI into is the catalyst that makes them reach for the transformer and say, all right.
spk_0
We got to figure things out. Necessity is the mother of invention. Let's go for it.
spk_0
That's true. I don't know if during this personal tension between Elon and Sam if they had already decided to go all in on transformers or not.
spk_0
Because the thing you very quickly get to if you decide transformers, language models, we're going all in on that.
spk_0
You do quickly realize you need a bunch of data. You need a bunch of compute. You need a bunch of energy. You need a bunch of capital.
spk_0
And so if your biggest backer is walking away, the 3D chess move is, oh, we got to keep him because we're about to pivot the company and we need his capital for this big pivot we're doing.
spk_0
The 4D chess is if he walks away, maybe I can turn it into a for profit company and then raise money into it and eventually generate enough profits to fund this extremely expensive new direction we're going in.
spk_0
I don't know which of those it was. I don't know either. I suspect the truth is it's some of both.
spk_0
Yes. But either way, how nuts is it that a, these things happen at the same time and b, the company wasn't burning that much cash and then they decided to go all in on we need to do something so expensive that we need to be a for profit company in order to actually achieve this mission.
spk_0
Because it's just going to require hundreds of billions of dollars for the foreseeable future.
spk_0
Yep. So in June of 2018, OpenAI releases a paper describing how they have taken the transformer and developed a new approach: pre-training it on very large amounts of general text from the internet, and then fine-tuning that general pre-training to specific use cases.
spk_0
And they also announce that they have trained and run the first proof-of-concept model of this approach, which they are calling GPT-1: generative pre-trained transformer, version one.
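To make that two-phase recipe concrete, here's a toy sketch in Python. This is not what OpenAI actually ran; a bigram counter stands in for the transformer, but the shape of the idea is the same: pre-train one model on broad general text, then keep training the same parameters on a narrow, task-specific corpus.

```python
from collections import Counter, defaultdict

class TinyLM:
    """Toy next-token model standing in for a transformer."""

    def __init__(self):
        # counts[prev][next] = how often `next` followed `prev`
        self.counts = defaultdict(Counter)

    def train(self, text: str) -> None:
        tokens = text.split()
        for prev, nxt in zip(tokens, tokens[1:]):
            self.counts[prev][nxt] += 1

    def predict(self, prev: str):
        nxt = self.counts.get(prev)
        return nxt.most_common(1)[0][0] if nxt else None

lm = TinyLM()
# Phase 1: "pre-training" on broad, unlabeled general text.
lm.train("the cat sat on the mat and the dog sat on the rug")
# Phase 2: "fine-tuning" updates the *same* model on a narrow domain.
lm.train("model weights update update update")

print(lm.predict("sat"))      # learned during pre-training -> "on"
print(lm.predict("weights"))  # learned only during fine-tuning -> "update"
```

The key point the sketch preserves is that fine-tuning is not a separate model; it's continued training of the pre-trained one on new data.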
spk_0
Which we should say is right around the same time as BERT, and right around the same time as another large language model based on the transformer out of here in Seattle, the Allen Institute.
spk_0
Yes. Indeed.
spk_0
So it's not as if this is heretical or a secret; other AI labs, including Google's own, are doing it.
spk_0
But from the very beginning, OpenAI seemed to be taking this more seriously, given that the cost of it would require betting the company if they continued down this path.
spk_0
Yep. We're betting the nonprofit, betting the entity.
spk_0
Yes. We're going to need some new terminology here.
spk_0
Yes. So Elon's just walked out the door. Where are they going to get the money for this?
spk_0
Sam turns to one of the other board members of OpenAI.
spk_0
Reid Hoffman. Reid, just a year or so earlier, had sold LinkedIn to Microsoft, and Reid is now on the board of Microsoft.
spk_0
So Reid says, hey, why don't you come talk to Satya about this?
spk_0
Do you know where he actually talks to Satya?
spk_0
Oh, I do. In July of 2018, they set a meeting for Sam Altman and Satya Nadella to sit down.
spk_0
While they're both at the Allen & Company Sun Valley conference in Sun Valley, Idaho.
spk_0
It's perfect. And while they're there, they hash out a deal for Microsoft to invest $1 billion into OpenAI in a combination of both cash and Azure cloud credits.
spk_0
And in return, Microsoft will get access to OpenAI's technology, an exclusive license to it for use in Microsoft's products.
spk_0
And the way that they will do this is OpenAI, the nonprofit will create a captive for-profit entity called OpenAI LP controlled by the nonprofit OpenAI Inc.
spk_0
And Microsoft will invest into the captive for-profit entity.
spk_0
Reid Hoffman joins the board of this new structure along with Sam, Ilya, Greg Brockman, Adam D'Angelo, and Tasha McCauley.
spk_0
And thus the modern OpenAI for-profit nonprofit question mark is created.
spk_0
The thing that's still being figured out even today, here in 2025.
spk_0
This is like the complete history of AI. This is not just the Google AI episode.
spk_0
Well, these things are totally inextricable. And I was just going to say this is the Google Part III episode.
spk_0
Microsoft, they're back. Microsoft is Google's mortal enemy.
spk_0
Yes.
spk_0
We covered that in our first episode on the founding of Google and Search, and then in the second episode on Alphabet and all the products that they made: the whole strategy at Google was always about Microsoft.
spk_0
They finally beat them on every single front. And here they are showing up again and saying, what was Satya's line? We just want to see them dance.
spk_0
I think the line that would come a couple years later is we want the world to know that we made Google dance.
spk_0
Oh, man.
spk_0
But this is all still pre-ChatGPT. This is just Sam lining up the financing he needs for what appears to be a very expensive scaling exercise
spk_0
they're about to embark on with GPT-2 and onward.
spk_0
Yep. And this is the right time to talk about why from OpenAI's perspective, Microsoft is the absolute perfect partner.
spk_0
It's not just that they have a lot of money.
spk_0
Although that helps.
spk_0
I mean, that helps. That helps a lot. But more important than money, they have a really, really great public cloud.
spk_0
Azure.
spk_0
Yes.
spk_0
OpenAI is not going to go buy a bunch of NVIDIA GPUs and then build their own data center here at this point in 2018.
spk_0
That's not the scale of company that they are. They need a cloud provider in order to actually do all the compute that they want to do.
spk_0
If these researchers were still back at Google doing this, great, they'd have all the infrastructure.
spk_0
But OpenAI needs to tie themselves to someone with the infrastructure.
spk_0
And there's basically only two non-Google options. They're both in Seattle.
spk_0
And hey, one of them, Microsoft, is really interested. Also has a lot of cash.
spk_0
It seems like a great partnership.
spk_0
That's true. I wonder if they did talk to AWS at all about it.
spk_0
Because I think this is a crazy Easter egg. It's a hot take to say it out loud.
spk_0
But I think AWS was actually in the very first investment with Elon in OpenAI.
spk_0
Oh, wow.
spk_0
And I don't know if it was in the form of credits or what the deal was.
spk_0
But I'd seen it reported a couple places that AWS actually was in that nonprofit round.
spk_0
Yeah. In the nonprofit funding, the donations too.
spk_0
Yes. The early OpenAI.
spk_0
Anyway, Microsoft and OpenAI end up tying up, a match made in heaven.
spk_0
Satya and Sam are onstage together talking about how this amazing partnership and marriage has come together.
spk_0
And they're off to model training.
spk_0
Yeah. And this paves the way for the GPT era of OpenAI.
spk_0
But before we tell that story.
spk_0
Yes. Now is a great time to thank one of our favorite companies, Shopify.
spk_0
Yes. And this is really fun because we have been friends and fans of Shopify for years.
spk_0
We just had Toby on ACQ2 to talk about everything going on in AI.
spk_0
And everything that has happened at Shopify in the six years now since we covered the company on acquired.
spk_0
It's been a pretty insane transformation for them.
spk_0
Yeah. So back at their IPO, Shopify was the go-to platform for entrepreneurs and small businesses to get online.
spk_0
What's happened since is that is still true.
spk_0
And Shopify has also become the world's leading commerce platform for enterprises of any size period.
spk_0
Yeah. So what's so cool about the company is how they've managed to scale without losing their soul.
spk_0
Even though companies like Everlane and Vuori, and even older established companies like Mattel, are doing billions of revenue on Shopify,
spk_0
the company's mission is still the same as the day Toby founded it: to create a world where more entrepreneurs exist.
spk_0
Oh, yeah. Ben, you got to tell everyone your favorite enterprise brand that is on Shopify.
spk_0
Oh, I'm saving that for next episode. I have a whole thing planned for episode two of this season.
spk_0
Okay. Okay. Okay. Anyway, the reason enterprises are now also using Shopify is simple.
spk_0
It's because businesses of all sizes just sell more with Shopify.
spk_0
They built this incredible ecosystem where you can sell everywhere.
spk_0
Obviously your own site that's always been true.
spk_0
But now with Shopify, you can easily sell on Instagram, YouTube, TikTok, Roblox, Roku, ChatGPT, Perplexity, anywhere.
spk_0
Plus with Shop Pay, their accelerated checkout,
spk_0
you get amazing conversion, and it has a built-in user base of 200 million people who already have their payment information stored with it.
spk_0
Shopify is the ultimate example of not doing what doesn't make your beer taste better.
spk_0
Even if you're a huge brand, you're not going to build a better e-commerce platform for your product.
spk_0
But that is what Toby and Shopify's entire purpose is so you should use them.
spk_0
Yes. So whether you're just getting started or already at huge scale, head on over to shopify.com slash acquired.
spk_0
That's shopify.com slash acquired and just tell them that Ben and David sent you.
spk_0
All right. So what will be in GPT-2? Is that what's being trained right here?
spk_0
Yes. GPT-2.
spk_0
This was the first time I heard about it. Data scientists around Seattle were talking about this cool new thing.
spk_0
Right. So after the first Microsoft partnership, the first billion-dollar investment, in 2019 OpenAI releases GPT-2,
spk_0
which is still early, but very promising, and can do a lot of things.
spk_0
A lot of things, but it required an enormous amount of creativity on your part.
spk_0
You kind of had to be a developer to use it.
spk_0
And if you were a consumer, there was a very heavy load put on you.
spk_0
You had to go write a few paragraphs and then paste those few paragraphs into the language model.
spk_0
And then it would suggest a way to finish what you were writing based on the source paragraphs, but it wasn't interactive.
spk_0
Yes. It was not a chat interface.
spk_0
Yes. There was no interface essentially for it.
spk_0
It was an API.
spk_0
But it can do things like obviously translate text. I mean Google's been doing that for a long time.
spk_0
But with GPT-2, you could do stuff like make up a fake news headline and give it to GPT-2.
spk_0
And it would write a whole article.
spk_0
You would read it and you'd be like, eh, sounds like it was written by a bot.
spk_0
Yeah.
spk_0
But again, there was no front door to it for normal people.
spk_0
You had to really be willing to wade in the muck to use this thing.
spk_0
So then the next year, in June of 2020, GPT-3 comes out.
spk_0
Still no front door user interface to the model.
spk_0
But it's very good.
spk_0
GPT-2 showed the promise of what was possible.
spk_0
GPT-3, it's starting to be in the conversation of, can this thing pass the Turing test?
spk_0
Oh, yeah.
spk_0
You have a hard time distinguishing between articles that GPT wrote and articles that humans wrote.
spk_0
It's very good.
spk_0
And there starts to be a lot of hype around this thing.
spk_0
And so even though consumers aren't really using it, the broader awareness is that there's something interesting on the horizon.
spk_0
I think the number of AI pitch decks that VCs are seeing is starting to tick up around this time.
spk_0
As is the Nvidia stock price.
spk_0
Yes.
spk_0
So then the next year, in the summer of 2021, Microsoft releases GitHub Copilot using GPT-3.
spk_0
This is not just the first Microsoft product that comes out with GPT baked into it, but the first productization.
spk_0
Product anywhere.
spk_0
Yeah, first productization of GPT.
spk_0
Yeah, of any OpenAI technology.
spk_0
Yeah.
spk_0
It's big.
spk_0
This starts a massive change in how software gets written in the world.
spk_0
Slowly then all at once.
spk_0
It's one of these things where at first it was just a few software engineers, and there were a lot of whispers of, how cool is this?
spk_0
It makes me a little bit more efficient.
spk_0
And now you get all these comments like, 75% of all companies' code is written with AI.
spk_0
Yep.
spk_0
So after that, Microsoft invests another $2 billion in OpenAI, which seemed like a lot of money at the time.
spk_0
So that takes us to the end of 2021.
spk_0
There's an interesting kind of context shift that happens around here.
spk_0
Yeah.
spk_0
The bottom falls out on tech stocks.
spk_0
Crypto, the broader markets really, everyone suddenly goes from risk on to risk off.
spk_0
And part of it was the war in Ukraine, but a lot of it was interest rates going up.
spk_0
And Google gets hit really hard.
spk_0
The high watermark was November 19th of 2021.
spk_0
Google was right at $2 trillion of market cap.
spk_0
About a year after that slide began, they were worth a trillion dollars.
spk_0
Nearly a 50% drawdown.
spk_0
Wow.
spk_0
So towards the end of 2022 leading up to the launch of chat GPT.
spk_0
People, I think are starting to realize Google's slow.
spk_0
They're slow to react to things.
spk_0
It feels like they're an old, crusty company.
spk_0
Are they like the Microsoft of the 2000s where they haven't had a breakthrough product in a while?
spk_0
People are not bright on the future of Google.
spk_0
And then chat GPT comes out.
spk_0
Yeah.
spk_0
Wow.
spk_0
Which means if you were bullish on Google back then, and contrarian,
spk_0
you could have invested at a trillion-dollar market cap.
spk_0
Which is interesting.
spk_0
Like in October of 21, the market was saying that the forthcoming AI wave will not be a strength for Google.
spk_0
Or maybe what it was saying is we don't even know anything about a forthcoming AI wave.
spk_0
Because people are talking about AI, but they've been talking about VR and they've been talking about crypto.
spk_0
And they've been talking about all this frontier tech.
spk_0
And like that's not the future at all.
spk_0
This company just feels slow and unadaptive.
spk_0
And slow and unadaptive at that point in history, I think would have been a fair characterization.
spk_0
They had an internal chatbot, right?
spk_0
Yes, they did.
spk_0
All right.
spk_0
So before we talk about chat GPT, Google had a chatbot.
spk_0
So Noam Shazeer, incredible engineer, re-architected the transformer, made it work,
spk_0
one of the lead authors of the paper,
spk_0
a storied career within Google, has all of this sway, should have all of this sway within the company.
spk_0
After the transformer paper comes out, he and the rest of the team are like,
spk_0
guys, we can use this for a lot more than Google translate.
spk_0
And in fact, the last paragraph of the paper, are you about to read the transformer paper?
spk_0
Yes, I am.
spk_0
We are excited about the future of attention-based models and plan to apply them to other tasks.
spk_0
We plan to extend the transformer to problems involving input and output modalities other than text,
spk_0
and to investigate large inputs and outputs such as images, audio, and video.
spk_0
This is in the paper.
spk_0
Wow.
spk_0
Google obviously does not do any of that for quite a while.
spk_0
Nome, though, immediately starts advocating to Google leadership,
spk_0
hey, I think this is going to be so big, the transformer, that we should actually consider
spk_0
just throwing out the search index and the 10 blue links model and going all in on transforming all of Google into one giant transformer model.
spk_0
And then Noam actually goes ahead and builds a chatbot interface to a large transformer model.
spk_0
Is this LaMDA?
spk_0
This is before LaMDA. Meena is what he calls it.
spk_0
And there is a chatbot in the late-teens-to-2020 timeframe that Noam has built within Google that arguably is pretty close to ChatGPT.
spk_0
Now, it doesn't have any of the post-training safety that ChatGPT does, so it would go off the rails.
spk_0
Yeah, someone told us that you could just ask it who should die, and it would come up with names for you of people that should die.
spk_0
It was not a shipable product.
spk_0
It was a very raw, not safe, not post-trained chatbot and model.
spk_0
Right.
spk_0
But it existed within Google, and they didn't ship it.
spk_0
And technically, not only did it not have post-training, it didn't have RLHF either.
spk_0
This very core component of the models today, reinforcement learning from human feedback, that ChatGPT had. I don't know if GPT-3 had it, but 3.5 did, and it did for the launch of ChatGPT.
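For the curious, the first stage of that RLHF recipe, training a reward model on human preference pairs, can be written down in a few lines. This is a sketch of the Bradley-Terry-style pairwise objective described in the InstructGPT line of work, not anyone's actual training code, and all names here are illustrative:

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def preference_loss(r_chosen: float, r_rejected: float) -> float:
    """Pairwise reward-model loss: -log(sigmoid(r_chosen - r_rejected)).

    r_chosen is the reward model's score for the human-preferred response,
    r_rejected for the dispreferred one. Minimizing this loss pushes the
    model to score preferred responses higher.
    """
    return -math.log(sigmoid(r_chosen - r_rejected))

# When the model already ranks the preferred answer higher, the loss is
# small; when it ranks them backwards, the loss is large, so gradient
# descent pushes r_chosen up and r_rejected down.
loss_good = preference_loss(2.0, -1.0)   # ranked correctly: small loss
loss_bad = preference_loss(-1.0, 2.0)    # ranked backwards: large loss
print(loss_good < loss_bad)
```

The trained reward model is then used as the optimization target for the policy (the chatbot) in a reinforcement learning loop, which is the part that tunes tone, appropriateness, and helpfulness.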
spk_0
Realistically, it wasn't launchable even if it were an OpenAI thing, because it was so bad.
spk_0
But a company of Google's stature certainly could not take the risk.
spk_0
So strategically, they have this working against them.
spk_0
But aside from the strategy thing, there's two business model problems here.
spk_0
One, if you're proposing, drop the 10 blue links and just turn Google.com into a giant AI chatbot.
spk_0
Revenue drops when you provide direct answers to questions versus showing ads and letting people click through to websites.
spk_0
That upsets the whole apple cart.
spk_0
Obviously, they're thinking about it now, but until 2021, that was an absolute non-starter to suggest something like that.
spk_0
Two, there were legal risks of sitting in between publishers and users.
spk_0
I mean, Google at this point had spent decades fighting the public perception and court rulings that they were disintermediating publishers from readers.
spk_0
So there was like a very high bar internally, culturally to clear if you were going to do something like this.
spk_0
Even those info boxes that popped up, which took until the 2010s to make happen, those really were mostly on non-monetizable queries anyway.
spk_0
So any time that you were going to say, hey, Google's going to provide you an answer instead of 10 blue links, you had to have a bulletproof case for it.
spk_0
Yeah. And there was also a brand promise and trust issue too.
spk_0
Consumers trusted Google so much. For us, even today,
spk_0
you know, when I'm doing research for Acquired and we need to make sure we get something right, I'm going to Google.
spk_0
I look something up in Claude.
spk_0
Yeah.
spk_0
It gives me an answer. I'm like, that's a really good answer. And then I verify by searching Google that I can find those facts too if I can't click through the sources on Claude.
spk_0
That's my workflow.
spk_0
Which sort of sounds funny today, but it's important.
spk_0
If you're going to propose replacing the 10 blue links with a chatbot, you need to be really damn sure that it's going to be accurate.
spk_0
Yes.
spk_0
And in 2020, 2021, that was definitely not the case.
spk_0
Arguably still isn't the case today.
spk_0
And there also wasn't a compelling reason to do it because nobody was really asking for this product.
spk_0
Right.
spk_0
Noam and people in Google knew that you could make a chatbot interface to a transformer-based LLM,
spk_0
and that it was a really compelling product. The general public didn't know.
spk_0
OpenAI didn't even really know. I mean, GPT was out there.
spk_0
Do you know the story of the launch of ChatGPT?
spk_0
Well, I think I do. I have it in my notes here.
spk_0
All right. So they've got GPT-3.5. It's becoming very, very useful.
spk_0
Yeah. This is late 2022. They've got 3.5.
spk_0
But there's still this problem of how am I supposed to actually use it?
spk_0
How does it get productized?
spk_0
And Sam just kind of says we should make a chatbot.
spk_0
That seems like a natural interface for this. Can someone just make a chat?
spk_0
And within like a week, internally, someone makes a chat.
spk_0
They just turn calls to the GPT-3.5 API into a product where you're just chatting with it.
spk_0
And every time you kick off a chat message, it just calls GPT-3.5 on the API.
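That "thin wrapper" idea is simple enough to sketch. All names below are hypothetical, and `call_model` is a stub standing in for the hosted GPT-3.5 endpoint; the point is just that the UI keeps the message history and resends it to the model on every turn:

```python
def call_model(history):
    """Stub for the hosted model. A real client would POST `history`
    to the completion API and return the model's text."""
    return f"(model reply to: {history[-1]['content']})"

def chat_turn(history, user_message):
    """One turn of chat: append the user message, send the *entire*
    history to the model, append and return the reply."""
    history.append({"role": "user", "content": user_message})
    reply = call_model(history)
    history.append({"role": "assistant", "content": reply})
    return reply

history = [{"role": "system", "content": "You are a helpful assistant."}]
print(chat_turn(history, "Hello!"))  # -> (model reply to: Hello!)
print(len(history))                  # system + user + assistant = 3
```

Because the model itself is stateless, the "conversation" lives entirely in this resent history, which is part of why the product was so cheap to build quickly on top of an existing API.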
spk_0
And that turns out to be this magic product.
spk_0
I don't think they expected it. I mean, servers are tipping over.
spk_0
They're working with Microsoft to try to get more compute.
spk_0
They're cutting deals with Microsoft in real time to try to get more investment,
spk_0
to get more Azure credits or get advances on their Azure credits in order to handle the incredible load in November of 2022
spk_0
that's coming in of people wanting to use this thing.
spk_0
They also just throw up a paywall randomly because they thought that the business was going to be an API business.
spk_0
They thought that the projections were all about how much revenue they were going to do through B2B licensing deals.
spk_0
And then they just realized, oh, there's all these consumers trying to use this.
spk_0
Put up a paywall to at least dampen the most expensive use of this thing so we can kind of offset the costs or slow the rollout.
spk_0
This isn't Google search, 89% gross margin stuff here.
spk_0
Right. So they end up having an incredibly fast revenue takeoff just from the quick Stripe paywall that they threw up over a weekend to handle all the demand.
spk_0
So to say that OpenAI had any idea what was coming would also be completely false.
spk_0
They did not get that this would be the next big consumer product when they launched it.
spk_0
Ben Thompson loves to call OpenAI the accidental consumer tech company, right?
spk_0
Yes.
spk_0
It was definitely accidental.
spk_0
Now there is actually another slightly different version of the motivation for launching the chat.
spk_0
Is this the Dario?
spk_0
Interface, yeah, the Dario and Anthropic version.
spk_0
Anthropic was working on what would become Claude and rumors were out there and people at open AI got wind of like,
spk_0
oh, hey, Anthropic and Dario are working on a chat interface.
spk_0
We should probably do one too.
spk_0
And if we're going to do one, we should probably launch it before they launch theirs.
spk_0
So I think that had something to do with the timing.
spk_0
But again, I don't think anybody, including OpenAI, realized what was going to happen. Which, Ben, you alluded to it,
spk_0
but to give the actual numbers on November 30th, 2022.
spk_0
Basically Thanksgiving, OpenAI launches a research preview of an interface to the new GPT-3.5, called ChatGPT.
spk_0
That morning, on the 30th, Sam Altman tweets: today we launch ChatGPT.
spk_0
Try talking with it here, and then a link to chat.openai.com. Within a week, less than a week actually, it gets one million users.
spk_0
One month later, December 31st, 2022, it has 30 million users. And by the end of the next month, the end of January 2023, two months after launch,
spk_0
It crosses 100 million registered users, the fastest product in history to hit that milestone.
spk_0
Completely insane.
spk_0
Completely insane.
spk_0
Before we talk about what that unleashes within Google, which is the famous code red, let's rewind a little bit back to Noam and the chatbot within Google.
spk_0
Google does keep working on Meena.
spk_0
They develop it into something called LaMDA, which is also a chatbot, also internal.
spk_0
I think it was a language model. At this point in time, they still differentiated between the underlying model brand name and the application name.
spk_0
Yes, LaMDA was the model, and then there also was a chat interface to LaMDA that was internal, for Google use only.
spk_0
Noam is still advocating to leadership:
spk_0
We got to release this thing.
spk_0
He leaves in 2021 and founds a chatbot company, Character.AI, that still exists to this day, and they raise a lot of money, as you would expect.
spk_0
And then Google, ultimately in 2024, after ChatGPT launches, pays $2.7 billion, I think, to do a licensing deal with Character.AI, the net of which is Noam comes back to Google.
spk_0
Yeah, I think Larry and Sergey were like, if we're going to compete seriously, we kind of need Noam back. Blank check to go get him.
spk_0
Yep. So throughout 2021, 2022, Google's working on the LaMDA model and then the chat interface to it.
spk_0
In May of 2022, they do release something that is available to the public called AI Test Kitchen, which is an AI product test area where people can play around with Google's internal AI products, including the LaMDA chat interface.
spk_0
Yep. And in all fairness, this predates ChatGPT.
spk_0
Do you know what they do to nerf the chat so that it doesn't go too far off the rails? This is amazing.
spk_0
No. For the version of LaMDA chat that is in AI Test Kitchen, they stop all conversations after five turns.
spk_0
So you can only have five turns of conversation with the chatbot and then it's just, and we're done for today. Thank you. Goodbye.
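Mechanically, that kind of nerf is trivial, which is part of what makes it notable as a deliberate safety choice. A sketch, with all names hypothetical:

```python
# Sketch of the five-turn cap described above: the session wrapper counts
# turns and refuses to call the model past the limit.

MAX_TURNS = 5

def limited_chat(session_turns: int, user_message: str) -> str:
    """Return a reply, or end the session once the turn cap is reached."""
    if session_turns >= MAX_TURNS:
        return "We're done for today. Thank you. Goodbye."
    # Placeholder reply; the real system would call the LaMDA model here.
    return f"(LaMDA-style reply to: {user_message})"

for turn in range(7):
    print(turn + 1, limited_chat(turn, "tell me more"))
```

Turns one through five get replies; turns six and seven get the goodbye message, bounding how far any single conversation can drift.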
spk_0
Oh, wow. And the reason they did that was for safety: the more turns you had with it, the more likely it was to start to drift.
spk_0
And then it could go off the rails. And honestly, it was a fair concern. I mean, this thing was not for public consumption. And if you remember, back a few years before, Microsoft released Tay, which was this crazy racist chatbot.
spk_0
Yeah, they launched it as a Twitter bot, right? And it was going off the rails on Twitter. This was in 2016, I think.
spk_0
Right. Maximum impact of badness. Yeah.
spk_0
And so Google, despite Sundar declaring all the way back in 2017 that we are an AI-first company, is being understandably very cautious in real public AI launches, especially on consumer-facing things.
spk_0
Yep. And as far as anyone else is concerned, before ChatGPT, they are an AI-first company, and they're launching all this amazing AI stuff.
spk_0
It's just within the vector of their existing products. Right. So ChatGPT comes out, becomes the fastest product in history to 100 million users.
spk_0
It is immediately obvious to Sundar, Larry, Sergey, all of Google leadership, that this is an existential threat to Google. ChatGPT is a better user experience for doing the same job that Google search does.
spk_0
And to underscore this: if you didn't know it in November of '22, you sure knew it by February of '23, because good ol' Microsoft, our biggest, scariest enemy...
spk_0
Oh, yeah. Announces a new Bing powered by OpenAI, and Satya has a quote: it's a new day for search. The race starts today.
spk_0
There is an announcement of a new AI-powered search page. He says, we want to rethink what search was meant to be in the first place. In fact, Google's success in the initial days came by reimagining what could be done in search,
spk_0
and I think the AI era we're entering gets us to think about it again. This is the worst possible thing that could happen to Google: that now Microsoft can actually challenge Google on their own turf.
spk_0
Intent on the internet, with a legitimately different, better, differentiated product vector. Not the copycat thing Bing was trying to do before; this is the full leapfrog, and they have the technology partnership to do it. Or so everybody thinks at the moment.
spk_0
Oh my God, terrifying.
spk_0
This is when Satya says the quote, in an interview around this launch with Bing: I want people to know that we made Google dance.
spk_0
Oh boy. Well, hey, if you come at the king, you'd best not miss. Right. And this Bing launch kind of misses. Yes. So what happens in Google: December 2022, even before the Bing launch, but after the ChatGPT moment, Sundar issues a code red within the company.
spk_0
And what does that mean? Up until this point, Google and Sundar and Larry and everyone have been thinking about AI as a sustaining innovation, in Clay Christensen's terms.
spk_0
This is great for Google. This is great for our products. Look at all these amazing things that we're doing. It further entrenches incumbents.
spk_0
It's further entrenching our lead in all of our already-leading products. We can deploy more capital in a predictable way to either drive down costs or make our product experiences that much better than any startup could.
spk_0
Monetize that much better, all the things. Once ChatGPT comes out, on a dime, overnight, AI shifts from being a sustaining innovation to a disruptive innovation.
spk_0
It is now an existential threat. And many of Google's strengths from the last 10, 15, 20 years of all the AI work that's happened in the company are now liabilities. They have a lot of existing castles to protect.
spk_0
That's right. They have to run everything through a lot of filters before they can decide if it's a good idea to go try to out-OpenAI OpenAI.
spk_0
So this code red that Sundar issues to the company is actually a huge moment, because what it means, and what he says, is: we need to build and ship real native AI products, ASAP.
spk_0
This is actually the textbook response to a disruptive innovation as the incumbent: you need to not bury your head in the sand, and you need to say, okay, we need to actually go build and ship products that are comparable to these disruptive innovators'.
spk_0
And you need to be laser-focused operationally on all the details, to try and figure out where the new product is actually cannibalizing our old product and where the new product can be complementary, and just lean in, in all the different little scenarios, on all the ways in which you can be complementary.
spk_0
And really, what they've been trying to do, this ballet from 2022 onward, is protect the growth of search while also creating the best AI experiences they can.
spk_0
And so it's very clever, the way that they do AI Overviews for some but not all queries, and they have AI Mode for some but not all users.
spk_0
And then they have Gemini, the full AI app, but they're not redirecting Google.com to Gemini.
spk_0
It's this very delicate dance of protecting the existing franchise while also building a new franchise that is, hopefully, as non-cannibalizing as they can make it.
spk_0
Yeah, and you see them really going hard, and I think building leading products, in non-search-cannibalizing categories like video.
spk_0
Right: Veo 3 or Nano Banana. These are things that don't in any way cannibalize the existing franchise. They in fact use some of Google's strengths, all the YouTube training data and stuff like that.
spk_0
Yeah, so what happens next, as you might expect, is it gets worse before it gets better. Code red goes out December 2022. Bard, baby. Launch Bard.
spk_0
Oh boy, well, even before that: January '23, when OpenAI hits 100 million registered users for ChatGPT, Microsoft announces they are investing another $10 billion in OpenAI, and says that they now own 49% of the for-profit entity.
spk_0
Incredible in and of itself. But then, think about this from the Google lens: Microsoft, of all companies, they now arguably own, well, obviously in retrospect they don't own OpenAI, but it seems at the time like, oh my god, Microsoft might now own OpenAI, which is our first true existential threat in our history as a company. Not great, Bob.
spk_0
So then February 2023, the Bing integration launches. Satya has the quote about wanting to make Google dance. Meanwhile, Google is scrambling internally to launch
spk_0
AI products as fast as possible. So the first thing they do is they take the LaMDA model and the chatbot interface to it,
spk_0
they rebrand it as Bard, they ship that publicly, and they release it immediately, February 2023.
spk_0
Ship it publicly, available, GA, to anyone. Which maybe was the right move, but God, it was a bad product. It was really bad. I didn't know the term at the time, RLHF, but it was clear it was missing a component of some magic that ChatGPT had, this reinforcement learning from human feedback, where you could really tune the appropriateness, the tone, the voice, the sort of correctness of the responses. It just wasn't there.
spk_0
Yep. So, to make matters worse, in the launch video for Bard, a choreographed, pre-recorded video where they're showing conversations with Bard,
spk_0
Bard gives an inaccurate factual response to one of the queries that they include in the video.
spk_0
This is one of the worst keynotes in history. After the Bard launch and this keynote, Google's stock drops 8% on that day. And then, like we were saying, once the actual product comes out, it becomes clear it's just not good.
spk_0
Yep. And it pretty quickly becomes clear it's not just that the chatbot isn't good; the model isn't good.
spk_0
So in May they replace LaMDA with a new model from the Brain team called PaLM. It's a little bit better, but it's still clearly behind, not only GPT-3.5 but, as of March 2023, GPT-4, which OpenAI comes out with, which is even better, and you can access that now through ChatGPT.
spk_0
And here is where Sundar makes two really, really big decisions. Number one, he says: we cannot have two AI labs within Google anymore. We're merging Brain and DeepMind into one entity called Google DeepMind, which is a giant deal. This is in full violation of the original deal terms of bringing DeepMind in.
spk_0
Yep. And the way he makes it work is, he says: Demis, you are now CEO of the AI division of Google, Google DeepMind. This is all hands on deck, and you and DeepMind are going to lead the charge. You're going to integrate with Google Brain, and we need to change all of the past 10 years of culture around building and shipping AI products within Google.
spk_0
So to further illustrate this: when Alphabet became Alphabet, they had all these separate companies, but things that were really core to Google, like YouTube, actually stayed a part of Google. DeepMind was its own company; that's how separate this was. They're working on their own models. In fact, those models are predicated on reinforcement learning; that was the big thing DeepMind had been working on the whole time. And so, reading in between the lines, it's Sundar looking at his two AI labs and going, look,
spk_0
I know you two don't actually get along that well but look I don't care that you had different charters before I am taking the responsibility of Google brain and giving it to deep mind and deep mind as absorbing the Google brain team I think that's what you should sort of read into it because as you look at where the models went from here they kind of came from deep mind.
spk_0
Yep, there's a little bit of interesting backstory to this too. So Mustafa Suleyman, the third co-founder of DeepMind: at some point before this, he became like the head of Google AI policy or something. He had already shifted over to Brain and to Google. He stayed there for a little while, and then he ended up getting close with, who else, Reid Hoffman. Remember, Reid is on the ethics board for DeepMind, and
spk_0
Mustafa and Reid leave and go found Inflection AI. Which, fast-forward now into 2024, after the absolute insanity that goes down at OpenAI over Thanksgiving 2023, when Sam Altman gets fired over the weekend during Thanksgiving and then brought back by Monday when the whole team threatened to quit and go to Microsoft. OpenAI loves Thanksgiving. Can't wait for this year. They love Thanksgiving.
spk_0
Yeah, gosh. After all that, which certainly strains the Microsoft relationship (remember, again, Reid is on the board of Microsoft), Microsoft does one of these acquisition-type deals with Inflection AI and brings Mustafa in as the head of AI for Microsoft. Crazy, wild, right? Just wild, crazy turn of events.
spk_0
Okay, so that first big decision that Sundar makes is unifying DeepMind and Brain. That was huge. Equally big: he says, I want you guys to go make a new model, and we're just going to have one model that is going to be the model for all of Google, internally for all of our AI products and externally. It's going to be called Gemini. No more different models, no more different teams,
spk_0
just one model for everything. This is also a huge deal. It's a giant deal, and it's twofold: it's push and it's pull. One, it's saying, hey, if anyone's got a need for an AI model, you've got to start using Gemini. But two, it's actually kind of the Google Plus thing, where they go to every team and they start saying, Gemini is our future; you need to start looking for ways to integrate Gemini into your product. Yes, I'm so glad you brought up Google Plus. This came up with a few folks I spoke to in the research. Obviously, this is all
spk_0
playing out in real time, but the point a lot of people at Google made is that the Gemini situation is very different than the Google Plus situation. This is a technical thing, (a), which has always been Google's wheelhouse, but (b), even more importantly, this is the rational business thing to do in the age of these huge models, even for a company like Google.
spk_0
There are massive scaling laws to models: the more data you put in, the better it's going to get, the better all the outputs are going to be. And because of scaling laws, you need your models to be as big as possible in order to have the best performance possible. If you're trying to maintain multiple models within a company, you're repeating multiple huge costs to maintain huge models. You definitely don't want to do that; you need to centralize on just one model.
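To make that consolidation logic concrete, here's a toy calculation using a Chinchilla-style scaling-law fit, where predicted loss falls as a power law in parameter count N and training tokens D. The constants are the published Chinchilla estimates (Hoffmann et al., 2022); the one-model-versus-two-models comparison is our own illustrative framing, not anything Google has disclosed.

```python
# Chinchilla-style loss fit: L(N, D) = E + A / N^alpha + B / D^beta
# Constants are the published Chinchilla estimates (Hoffmann et al., 2022).
E, A, B, ALPHA, BETA = 1.69, 406.4, 410.7, 0.34, 0.28

def loss(n_params: float, n_tokens: float) -> float:
    """Predicted training loss for a model with n_params parameters
    trained on n_tokens tokens."""
    return E + A / n_params**ALPHA + B / n_tokens**BETA

# One unified 70B-parameter model trained on 1.4T tokens...
one_model = loss(70e9, 1.4e12)

# ...versus splitting the same budget across two separate 35B models,
# each trained on 700B tokens (two teams, two models).
two_models = loss(35e9, 0.7e12)

print(f"one big model:          loss ~ {one_model:.3f}")
print(f"each of two half-sized: loss ~ {two_models:.3f}")
# Loss is monotonically decreasing in both N and D, so the single
# consolidated model always comes out ahead at the same total budget.
```

The numbers are illustrative, but the shape of the argument is exactly the one made above: splitting compute and data across parallel models buys you two worse models instead of one better one.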
spk_0
Yeah, it's interesting. There's also something to read into where, at first, it was the Gemini model underneath the Bard product; Bard was still the consumer name. Then at some point they said, no, we're just calling it all Gemini, and Gemini became the user-facing name. This also pulls in my quintessence point from the Alphabet episode. I know it's a little bit woo-woo, but with Google saying we're actually going to name the consumer service the name of the
spk_0
AI model, there's sort of an admission to themselves: this product is nothing but technology. There isn't product-ness to do on top of it. It's just like Gmail. Gmail was technology: it was fast search, it was lots of storage, it was on the web. The product-ness wasn't the particular thing, the way that, like, Instagram was all about the product. Gemini the model, Gemini the chatbot says: we're just exposing our amazing breakthrough technology to you all,
spk_0
and you get to interface directly with it. Anthropologically, looking from afar, it kind of feels like it's that principle at work. I totally agree. I think it's actually a really important branding point, and sort of rallying point for Google and Google culture, to do this right.
spk_0
All right, so this is all the stuff going on in Google, 2023-ish, in AI. Before we catch up to the present, I have a whole other branch of Alphabet that has been a real bright spot for AI. Can I go there? Can I take this off-ramp, if you will? Can you take the wheel, so to speak? May I take the wheel? May I investigate an Other Bet? Yeah, please. Tell us the Waymo story. Awesome.
spk_0
So we've got to rewind back all the way to 2004 and the DARPA Grand Challenge, which was created as a way to spur research into autonomous ground robots for military use. And actually, what it did for our purposes here today is create the seed talent for the entire self-driving car revolution 20 years later.
spk_0
So the competition itself is really cool. There's a 132-mile race course (now mind you, this is 2004) in the Mojave Desert that the cars have to race on. It is a dirt road. No humans are allowed to be in or interact with the cars; they are monitored 100% remotely. And the winner gets one million dollars. One million dollars, which was a break from policy; normally these are grants, not prize money.
spk_0
So this needed to be authorized by an act of Congress. The one million dollars eventually felt comical, so the second year they raised the pot to two million dollars. Crazy, thinking about what these researchers are worth today, that that was the prize for the whole thing.
spk_0
So the first year, in 2004, went fine. There were some amazing tech demonstrations on these really tight budgets, but ultimately zero of the 100 registered teams finished the race.
spk_0
But the next year, 2005, was the real special year. The progress that the entire industry made in those first 12 months, from what they learned, is totally insane. Of the 23 finalists that were entering the competition,
spk_0
22 of them made it past the spot where the furthest team the year before had made it. The amount that the field advanced in that one year is insane. Not only that, five of those teams actually finished all 132 miles. Two of them were from Carnegie Mellon, and one was from Stanford,
spk_0
led by a name that all of you will now recognize: Sebastian Thrun.
spk_0
And indeed, this is Sebastian's origin story before Google. Now, as we said, Sebastian was kind enough to help us with prep for this episode, but I actually learned most of this from watching a 20-year-old NOVA documentary that is available on Amazon Prime Video.
spk_0
Thanks to Bret Taylor for giving us the tip on where to find this documentary. Yes, the hot research tip.
spk_0
So what was special about this Stanford team? Well, one, there is a huge problem with noisy data that comes out of all of these sensors. You know, it's in a car in the desert getting rocked around; it's in the heat; it's in the sun. So common wisdom, and what Carnegie Mellon did, was to do as much as you possibly can in the hardware to mitigate that: things like custom rigging and gimbals and giant springs to stabilize the sensors.
spk_0
Carnegie Mellon would essentially buy a Hummer and rip it apart and rebuild it from the wheels up; we're talking welding and real construction on a car. The Stanford team did the exact opposite. They viewed any new piece of hardware as something that could fail, and so, in order to mitigate risk on race day, they used all commodity cameras and sensors that they just mounted on a nearly unmodified Volkswagen.
spk_0
So they only innovated in software, and they figured they would just kind of come up with clever algorithms to help them clean up the messy data later. Very Googly, right? Very Googly. The second thing they did was an early use of machine learning to combine multiple sensors. They mounted laser hardware on the roof, just like what other teams were doing, and this is the way that you can measure texture and depth of what is right in front of you. The data is super precise, but you can't drive very fast, because you don't
spk_0
really know much about what's far away, since it's this fixed field of view; it's very narrow. Essentially, you can't answer that question of how fast can I drive, or is there a turn coming up. So on top of that, the way they solved it was they also mounted a regular video camera. That camera can see a pretty wide field of view, just like the human eye, and it can see all the way to the horizon, just like the human eye, and crucially, it can see color. So what it would do, and this is really clever:
spk_0
they would use a machine learning algorithm in real time. In 2005! This computer is like sitting in the middle of the car. They would overlay the data from the lasers onto the camera feed, and from the lasers you would know if the area right in front of the car was okay to drive or not. Then the algorithm would look up, in the frames coming off the camera, what color that safe area was, and then extrapolate, by looking further ahead at other parts of the video frame, to see where that safe area
spk_0
extended to. You could figure out your safe path through the desert. That's awesome. It's so awesome. I'm imagining like a Dell PC sitting in the middle of this car in 2005. It's not far off; in the email that we send out, I'll share some photos of it. It could then drive faster, with more confidence, and it knew when turns were coming up. Again, this is real time, on board, in 2005, which is wild for that tech. So ultimately, both of these bets worked, and the Stanford team
spk_0
won, in super dramatic fashion. They actually passed one of the Carnegie Mellon teams autonomously in the desert; it's this big dramatic moment in the documentary. So you would kind of think, okay, then Sebastian goes to Google and builds Waymo? No. As we talked about earlier, he joins Google through that crazy "please don't raise money from Benchmark and Sequoia, we'll just hire you instead" deal, and he goes and works on Street View and Project Ground Truth and co-founds Google X, David, as you were alluding to earlier.
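The lidar-plus-camera trick described a moment ago can be sketched in a few lines of NumPy. This is a toy reconstruction of the idea, not Stanley's actual code (the real system used a learned probabilistic color model running on live sensor streams): the lidar certifies a small patch directly ahead as drivable, we sample its color, and then every pixel in the frame is scored by color similarity to extrapolate the safe path toward the horizon.

```python
import numpy as np

def drivable_mask(frame: np.ndarray, near_patch: tuple, threshold: float = 30.0) -> np.ndarray:
    """Toy version of the Stanford terrain classifier.

    frame:      H x W x 3 RGB camera image
    near_patch: (row_slice, col_slice) of pixels the lidar has verified
                as flat, drivable ground right in front of the car
    returns:    boolean H x W mask of pixels that look like that ground
    """
    rows, cols = near_patch
    # 1. Sample the color of terrain the lidar says is safe.
    safe_pixels = frame[rows, cols].reshape(-1, 3).astype(float)
    mean_color = safe_pixels.mean(axis=0)
    # 2. Score every pixel in the frame by distance to that color.
    dist = np.linalg.norm(frame.astype(float) - mean_color, axis=2)
    # 3. Pixels close enough in color are extrapolated as drivable,
    #    all the way out to the horizon.
    return dist < threshold

# Tiny synthetic example: a tan "road" column through green "desert scrub".
frame = np.full((8, 8, 3), (60, 140, 60), dtype=np.uint8)   # green scrub
frame[:, 3:5] = (190, 170, 130)                              # tan dirt road
mask = drivable_mask(frame, (slice(6, 8), slice(3, 5)))      # lidar patch: bottom of road
print(mask[:, 3:5].all(), mask[:, :3].any())  # prints: True False
```

The payoff is exactly what was described: the lidar only vouches for the few meters ahead, but the color extrapolation extends that judgment to the whole visible road, which is what let the car pick a speed and see turns coming.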
spk_0
This Project Chauffeur, which would become Waymo, is the first project inside Google X. And I think the story, right, is that Larry came to Sebastian and was like, hey, you know that self-driving car stuff? Do it.
spk_0
And Sebastian was like, no, come on, that was a DARPA challenge. And Larry's like, no, no, you should do it. He's like, no, that won't be safe; there are people running around cities. I'm not just going to put multi-ton killer robots on roads and go and potentially harm people. And Larry finally comes to him and says, why? What is the technical reason that this is impossible? And Sebastian goes home, sleeps on it, and he comes in the next morning and he goes: I realized what it was. I'm just afraid.
spk_0
So they start. He's like, there's not a technical reason; as long as we can take all the right precautions and hold a very high bar on safety, let's get to work. So Larry then goes, great, I'll give you a benchmark, so that way you know if you're succeeding. He comes up with these 10 stretches of road in California that he thinks will be very difficult to drive. It's about a thousand miles, and the team starts calling it the Larry 1000. And it includes driving to Tahoe, Lombard Street in San Francisco,
spk_0
Highway 1 to Los Angeles, the Bay Bridge. This is the bogey. Yeah, if you can autonomously drive these stretches of road, it's a pretty good indication that you can probably do anything. Yeah. So they start the project in 2009, and within 18 months, this tiny team (I think they hired, I don't know, like a dozen people or something) has driven thousands of miles autonomously, and they managed to succeed at the full Larry 1000 within 18 months.
spk_0
Totally unreal how fast they did it, and then also totally unreal how long it takes after that to productize it and create the Waymo that we know today. Right, it's like the first 99%, and then the second 99% takes 10 years. Yeah, self-driving is one of these really tricky types of problems where it's surprisingly easy to get started, even though it seems like it would be an impossible thing, but then there's edge cases everywhere: weather, road conditions, other drivers, novel road layouts,
spk_0
night driving. So it takes this massive amount of work for a production system to actually happen. So then the question is: what business do we build? What is the product here? There was what Sebastian wanted, which was highway assist, sort of the lowest-stakes, most realistic option: let's make a better cruise control. There's what Eric Schmidt wanted, which is crazy: he proposed, let's just go buy Tesla and that'll be our starting place, and then we'll just put all of our self-driving equipment on the cars. David, do you know what it would have
spk_0
cost to buy Tesla at the time? I think at the time that negotiations were taking place between Elon and Larry and Google, this was in the depths of the Model S production scaling, I think Google could have
spk_0
bought the company for five billion dollars. That's what I remember. It was three billion. Three billion dollars! Oh my goodness. Obviously that didn't happen, but what a crazy
spk_0
alternative history that could have been, right? I mean, I think if that had happened, DeepMind would have gone down in the same way, and probably OpenAI would not have gotten
spk_0
founded. Ooh, that's probably right. I think that is obviously unprovable, right? The counterfactuals that we always come up with on this show, you can't
spk_0
know. Yeah, it seems more likely than not to me that, at a minimum, OpenAI would not exist. Right. So then there was what Larry wanted to do, option
spk_0
three: build robotaxis. Yeah, and ultimately that is, at least right now, what they would end up doing. So we could do a whole
spk_0
episode about this journey, but we will just hit some of the major points for the sake of time. The big thing to keep in
spk_0
mind here: neither Google nor the public really knew if self-driving was something that could happen in the next two years
spk_0
from a given point, or take another 10. And just to illustrate it: for the first five years, Project Chauffeur did
spk_0
not use deep learning at all. They did the Larry 1000 without any deep learning, and then went another three and a half years.
spk_0
Whoa, that's crazy. Yeah, it totally illustrates that you never know how far away the end goal is. And this is a field where
spk_0
the only way progress happens is through these series of breakthroughs, and you don't know, (a), how far away the next breakthrough is,
spk_0
because at any given time there's lots of promising things in the field, most of which don't work out, and then, (b), when there is a
spk_0
breakthrough, actually how much lift that will give you over existing methods. So anytime people are forecasting, oh, in AI we're going to be
spk_0
able to do X, Y, Z in X years, it's a complete fool's errand. Even the experts don't know. Here are the big milestones. 2013: they started using
spk_0
convolutional neural nets. They could identify objects; they got much better perception capabilities. This 2013-2014
spk_0
period is when Google found religion around deep learning, so this is right after the 40,000 GPUs rolled out, so they've
spk_0
actually got some hardware to start doing this on. Now, 2016: they've seen enough technology proof that they think, let's
spk_0
commercialize this, we can actually spin this out into a company. So Waymo becomes its own subsidiary inside of Alphabet; it's no
spk_0
longer a part of Google X anymore. 2017: obviously the Transformer comes out, and they incorporate some learnings from the
spk_0
Transformer, especially around prediction and planning. March of 2020: they raise $3.2 billion from folks like Silver Lake, the Canada
spk_0
Pension Plan Investment Board, and Andreessen Horowitz, and of course the biggest check, I think, is Alphabet. I think they're always the
spk_0
biggest check, because Alphabet is still the majority owner even after a bunch more fundraises. In October of 2020, they launch the
spk_0
first public, commercial, no-human-behind-the-driver's-seat thing in Phoenix. It's the first in the world. This is 11 years after
spk_0
succeeding in the Larry 1000. And this is nuts. I had given up at this point. I was like, that's cute that Waymo and all these other
spk_0
companies are trying to do self-driving; seems like it's never going to happen. And then they actually were doing a large volume
spk_0
of rides, safely, with consumers, and charging money for it, in Phoenix. Then they went to San Francisco, where for me and lots of people in San
spk_0
Francisco it's a huge part of life in the city here now. It's amazing. Yeah, every time I'm down I love taking them. They're launching in Seattle soon; I'm pumped.
spk_0
Interestingly, they don't make the hardware, so they use a Jaguar vehicle. Yep, that from what I can tell is only in Waymos. Like, I don't know
spk_0
if anybody else drives that Jaguar or if you can buy it, but they're working on a sort of van next; they have some next-generation hardware. For anyone who hasn't
spk_0
taken it: it's an Uber, but with no driver. And that launched in June of '24. Along the way there, they raised their quote-unquote Series B, another 2.5 billion. Then, after this
spk_0
San Francisco rollout, they raised their quote-unquote Series C, 5.6 billion. This year, in January, they were reportedly doing more in gross bookings than
spk_0
Lyft in San Francisco. I totally believe it. I mean, it is the number one option in San Francisco that I, and everybody I know, always go to for
spk_0
ride-hailing. It's like, try to get a Waymo, and if there's not a Waymo available anytime soon, then you go down the stack. Like, we're living in the
spk_0
future, and how quickly we fail to appreciate it. Yeah. And what's cool, I think, for people where it hasn't come to their city and it's not part of their lives yet:
spk_0
it's not just that it's a cool experience to not have a driver behind the wheel. Pretty quickly, that just fades. It's actually a different
spk_0
experience. So if I need to go somewhere with my older daughter, I don't mind hailing a Waymo, bringing the car seat, installing the car
spk_0
seat in the Waymo, and driving with my daughter. And she loves it. We call it the robot car, and she's like, a robot car, I'm so excited!
spk_0
Huh, I would never do that with an Uber. That's interesting. Same with my dog: whenever I need to go somewhere with my dog, I get super awkward
spk_0
hailing an Uber. Like, hey, I've got my dog, is the dog permitted? Not a big deal with a Waymo. And then when you're in one,
spk_0
yeah, you can actually have sensitive conversations in the car; you can have phone calls. It really is a different experience.
spk_0
Yeah, that's so true. Yeah, so we may as well catch up to today. They're operating in five cities: Phoenix, San Francisco, LA, Austin, and Atlanta. They have hundreds of
spk_0
thousands of paid rides every week. They've now driven over a hundred million miles with no human behind the wheel,
spk_0
growing at two million every week. There's been over 10 million paid rides, across two thousand vehicles in the fleet.
spk_0
They're going to be opening a bunch more cities in the US next year, and they're launching in Tokyo, their first international city.
spk_0
Slowly, and then all at once. I mean, that's kind of the lesson here. On the technology: they really continued with that
spk_0
multi-sensor approach all the way from the DARPA Grand Challenge. Camera, lidar, they added radar, and actually they use
spk_0
audio sensing as well. And their approach is basically: any data that we can gather is better, because that makes it safer.
spk_0
So they have 13 cameras, 4 lidar, 6 radar, and the array of external microphones. This is obviously a way more
spk_0
expensive solution than what Tesla is doing with just cameras, but Waymo's party line is they believe it is the only
spk_0
path to full autonomy, to hit the safety bar and regulatory bar that they're aiming for. Yeah, it seems like a really big
spk_0
line in the sand for them, anytime you talk to somebody in that organization. Yeah, and look, as a regular user of both products,
spk_0
happy owner and driver of a Model Y in addition to regular Waymo user, at least with the current instantiation of Full Self-Driving on my
spk_0
Tesla, they're vastly different products. Full Self-Driving on my Model Y is great; I use it all the time on the freeway, but I would never not pay
spk_0
attention. Whereas every time I get in a Waymo, it's almost like Google search, right? It's like, I just trust that, oh, this is going
spk_0
to be completely and totally safe, and I'm sitting in the back seat and I can totally tune out. I think I trust my Model Y
spk_0
FSD more than you do, but I get what you're saying. And frankly, regulatorily, you are required to still pay attention in a Tesla, and not in a
spk_0
Waymo. The safety thing is super real, though. I mean, if you look at the numbers, motor vehicle crashes cause over a million fatalities every
spk_0
year worldwide; in the US alone, over 40,000 deaths occur per year. So if we break that down,
spk_0
that's like 110 every day. That's a giant cause of death. Yes. The study that Waymo just released last month showed that they have 91% fewer crashes
spk_0
with serious injuries or worse, compared to the average human driver, even controlling for the fact that Waymos right now are only driving on city surface streets.
spk_0
So they control for that, apples to apples with human driving data, and it's a 91% reduction in those serious-injury-or-fatality crashes.
spk_0
Why aren't we all talking about this all the time, every day? This is going to completely change the world and eliminate a giant cause of death. Yeah.
spk_0
So while we're in Waymo land, what do you think about doing some quick analysis? Great, because I've been scratching my head here about
spk_0
what this business is. Now, I promise we'll go back to the rest of Google AI and catch up to today. It is super expensive to operate, especially at early scale: the training cost is high, the
spk_0
inference cost is high, the hardware cost is high, etc., etc. Also, the operations are expensive. Yes, and in fact they're experimenting: in some cities they actually outsource the operations, so the fleet is
spk_0
managed by, I think, a rental car company in Texas, or they've partnered, I believe, with Lyft and with Uber in different cities. So they're trying all sorts of
spk_0
owned-versus-partnership models to operate it. Yeah, and operations are like: these are electric cars, they need to be charged, they need to be cleaned, they need to be returned to depots, they need to be checked out, they need to have
spk_0
sensors replaced. So the question is, what is the potential market opportunity? How big could this business be? And there's a few different ways you could try to quantify it. One total-market-size thing you could do is try to sum the entire
spk_0
automaker market cap today, and that would be two and a half trillion globally if you include Tesla, or 1.3 trillion without. But Waymo's
spk_0
not really making cars, so that's probably the wrong way to slice it. You could look at all the ride-sharing companies today, which might be a better
spk_0
comp, because that's the business that Waymo is actually in today. That's on the order of 300 billion, most of which is Uber. Yep, that's the addressable market cap today with
spk_0
ride-sharing. Waymo's ambitions, though, are bigger than that. They want to be in the cars that you own; they want to be in long-
spk_0
haul trucking. So they believe they can grow the share of transportation, because there's blind people that could own a car, there's elderly people who could get where they need to go on their
spk_0
own without having a driver, that sort of thing. So the most squishy, but I think the most interesting, way to look at it is: what is the value from all of the reduction in accidents?
spk_0
Because that's really what they're doing; it's a product to replace accidents with non-accidents. I think that's viable, but I guess I would say, as a regular user of the product, it is a
spk_0
different and expanded product versus human ride share. So your argument is: whatever number I come up with for reducing accidents, it's still a bigger market than that, because there's
spk_0
additional value created in the product experience itself. Yeah. Scoping just to ride share: now that we have Waymo in San Francisco, I use Waymo in scenarios where I
spk_0
would never use an Uber or a Lyft. Yeah, makes sense. So here's the data we have. The CDC released a report saying deaths from crashes in 2022 in the US
spk_0
resulted in 470 billion dollars in total costs, including medical costs and the cost estimates for lives lost. Which is crazy, that the CDC has some way of putting a cost on human life, but they do. So if you
spk_0
reduce crashes 10x, which is what Waymo seems to be saying in their data, at least for the serious crashes, that's over 420 billion dollars a year in total costs that we would save as a nation. Now, it's not totally apples to apples, I recognize this, but that cost savings is more
spk_0
than Google does today in revenue in their entire business. You could see a path to a Google-sized opportunity for Waymo as a standalone company just through this analysis, as long as they figure out a way to get
spk_0
costs down to the point where they can run this as a large and profitable business. Yeah, it is an incredible 20-plus-year success story within Google.
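For anyone who wants to check the napkin math: the $470B is the CDC's 2022 crash-cost estimate quoted above, the 91% is Waymo's reported reduction in serious crashes, and the revenue comparison uses Alphabet's roughly $350B of 2024 revenue (a rounded figure we're adding for scale). Applying a fleet-level crash reduction to the entire national cost base is obviously a huge simplification.

```python
# Napkin math from the conversation: apply Waymo's reported serious-crash
# reduction to the CDC's estimate of total US crash costs.
cdc_annual_crash_cost = 470e9         # CDC estimate for 2022, in dollars
waymo_serious_crash_reduction = 0.91  # Waymo-reported vs. average human driver

implied_annual_savings = cdc_annual_crash_cost * waymo_serious_crash_reduction
print(f"Implied annual savings: ${implied_annual_savings / 1e9:.0f}B")  # $428B

# "More than Google does today in revenue": Alphabet's 2024 revenue was
# roughly $350B (rounded, included here only for scale).
alphabet_2024_revenue_approx = 350e9
print(implied_annual_savings > alphabet_2024_revenue_approx)  # True
```

That $428B of implied savings is where the "over 420 billion dollars a year" in the conversation comes from.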
spk_0
The way I want to close it is: the investment so far actually hasn't been that large when you consider this opportunity. They have burned somewhere in the neighborhood of 10 to 15 billion dollars to get to this point. That's sort of why I was listing all the investments. Yeah, yeah. Chump change compared to foundation models. Dude, also, let's just keep it scoped to this sector: that's one year of Uber's profits.
spk_0
Wow. Seems like a good bet. I used to think this was like some wild goose chase; it now looks really, really smart. Yeah, totally agree. Also, that cost, 10 to 15 billion, is the profit that Google made last month.
spk_0
Yeah.
spk_0
Google. Well, speaking of Google, should we catch ourselves up to today with Google AI? Yes. So I think where you were is the Gemini launch. So Sundar
spk_0
makes these two decrees mid-2023: one, we're merging Brain and DeepMind into one team for AI within Google, and two, we're going to standardize on one model going forward, Gemini, and the DeepMind-slash-Brain
spk_0
team, you go build it, and then everybody in Google, you're going to use it. Not to mention, apparently Sergey Brin is, like, now back as an employee working on Gemini. Yes! Employee number... He got his badge back. Yeah, got his badge back.
spk_0
So once Sundar makes these decisions, Jeff Dean and Oriol Vinyals from Brain go over and team up with the DeepMind team, and they start working on Gemini. And now, by the way, you've got Jeff Dean working on it. I'm in. If you've got Jeff Dean on it, it's probably going to work. If you want to believe,
spk_0
oh yeah, wait till what I'm going to tell you next. Once they get Noam back, when they do the deal with Character.AI and bring him back into the fold,
spk_0
Noam joins the Gemini team, and Jeff and Noam are the two co-technical leads for Gemini now. Oh, let's go! Let's go! So they actually announce this very quickly, at the Google I/O keynote in May 2023.
spk_0
They announce Gemini, they announce the plans, and they also launch AI Overviews in search, first as a Labs product, and then later that becomes just standard for everybody using Google search. Which is crazy, by the way. The number of Google searches that happen is unfathomably large. I'm sure there's a number for it, but just think about it: that's about the highest level of computing scale that exists, other than high-bandwidth things like streaming. But just think about the instances of
spk_0
Google searches that happen. They are running an LLM inference on all of those, or at least as many as they're willing to show AI Overviews on, which I'm sure is not every query, but a large subset. Yeah, but still a large, large number of Google searches.
spk_0
I mean, I see them all the time. Yep. This is really Google immediately deciding to operate at AI speed. I mean, ChatGPT happened on November 30th, 2022; we're now in May 2023, all of these decisions have been made, all of these
spk_0
changes have happened, and they're announcing things at I/O. And they're really flexing the infrastructure that they've got. I mean, the fact that they can go, oh yeah, sure, let's do inference on every query, we're Google, we can handle it. So a key part of this new Gemini model that they announce in May 2023 is that it's going to be multimodal. Again, this is one model for everything: text, images, video, audio, one model. They release it for early public access in December
spk_0
2023. So, also crazy: six months. They build it, they train it, they release it. That is amazing. Wild. February 2024, they launch Gemini 1.5 with a 1-million-token context window, a much, much larger context window than any other model on the market, which
spk_0
enables all sorts of new use cases. There's all these people who were like, I tried to use AI before, but it couldn't handle my XYZ use case. Now it can.
spk_0
Yep. The next year, February 2025, they release Gemini 2.0. March of 2025, one month later, they launch Gemini 2.5 Pro in experimental mode, and then that goes GA in June. This is like Nvidia pace, how often they're shipping.
spk_0
Yeah, seriously. And also in March of 2025, they launch AI Mode, so you can now switch over on Google.com to
spk_0
a chatbot mode, and they're split-testing, auto-opting some people into AI Mode to see what the response is. This is the golden goose!
spk_0
Yeah, the elephant is tap dancing here. Yeah. Then there's all the other AI products that they launch. So NotebookLM comes out during this
spk_0
period, with AI-generated podcasts. Which, does that sound like us to you? It feels a little... trained. The number of texts that we got when that came out, saying this must be trained on
spk_0
Acquired! I do know that a bunch of folks on the NotebookLM team are Acquired fans, so I don't know if they trained on us. And then there's the video and the image stuff:
spk_0
Veo 3, Nano Banana, Genie 3, which just came out recently. This is insane; this is a world builder based on prompts and videos. Yeah, you haven't actually used it yet, right? You
spk_0
watched that hype video. Yeah, I watched the video; I haven't actually used it. Yeah, I mean, if it does that, that's unbelievable. It's a real-time
spk_0
generative world builder. World builder, yeah: you look right, and it invents stuff to your right. I mean, you combine that with, like,
spk_0
Vision Pro hardware, and you're just living in a fantasy land. So they announced there are now 450 million monthly users of Gemini. Now, that includes everybody who's accessing Nano Banana. Yeah, I can't believe this stat.
spk_0
This is insane. Even with recently being number one in the App Store, it still feels hard to believe. Google's saying it, so it must be true, but I just wonder, what are they counting as uses of the Gemini app?
spk_0
Right. Certainly everybody who's using Nano Banana is using Gemini, but is it counting AI Overviews, or is it counting AI Mode, or is it counting something where I'm, like, accidentally... Like, Meta cited that crazy
spk_0
high number of people using Meta AI. Right, right, right. That was complete garbage. That was people searching Instagram who accidentally hit a Llama model that made some things
spk_0
happen, and they were like, go away, I actually am just looking for a user. So is it really 450 million, or is it "450 million"? Yeah, good question. Either
spk_0
way, going from zero is crazy impressive in the amount of time that they've done it. And especially given revenues are at an all-time high, they seem, so far, at least in this squishy early phase, to be able to figure out how to keep the
spk_0
core business going while doing well as a competitor at the cutting edge of AI. Yeah, and to foreshadow a little bit too (we're going to do a bull and bear here in a minute), as we talked about in our
spk_0
alphabet episode Google does have a history of navigating platform shifts incredibly well in the transition mobile true definitely a rockier start here in the AI platform shift much
spk_0
rockier. But hey, look, I mean, if you were to lay out a recipe for how to respond given the rocky start, it'd be hard to come up with a much better
spk_0
slate of things than what they've done over the last two years. Yeah. All right, should I give us the snapshot of the business today? Give us the snapshot of the
spk_0
business today. Oh yeah, also, by the way: the federal government decided they were a monopoly and then decided not to do anything about it because of AI. Yeah. So between the time when we
spk_0
shipped our Alphabet episode and here with our Google AI episode, or our part two and part three for those who prefer simpler
spk_0
naming schemes. Yeah. There was a US v. Google antitrust case. The judge first ruled that Google was a monopoly in internet search, and then did not come up with any material remedies. I mean, there are some, but I would call them
spk_0
immaterial. They did not have to spin off Chrome, and they did not have to stop sending tens of billions of dollars to Apple and others. In other words: yes, Google's a monopoly, and the cost
spk_0
of doing anything about that would have too many downstream consequences on the ecosystem, so we're just going to let them keep doing what they're doing. And one of the reasons the judge cited for why they
spk_0
weren't going to really take these actions is the race in AI: because tens of billions of dollars of funding have gone into companies like OpenAI
spk_0
and Anthropic and Perplexity, Google essentially has this new war to fight, and we're going to leave it to the free market to do its thing, where it creates viable competition on its own, and we're not going to hamstring Google.
spk_0
Personally, I think this argument is a little bit silly. I mean, none of these AI companies are generating net income, and just because they've raised a huge amount of money doesn't mean that will last forever. They'll all burn through their existing cash in a pretty short period of time, and if the
spk_0
spigots ever dry up, Google doesn't have any self-sustaining competition right now, whether in their old search business or in AI. It is all dependent on people believing the opportunity is so large that they keep pouring tens of billions of dollars into these
spk_0
competitors. Yeah, plenty of other folks have made the sort of glib comment, but there's merit to it: hey, as flat-footed as Google was when ChatGPT happened, if the outcome of this is that they avoid a
spk_0
Microsoft-level distraction and damage to their business from a US federal court monopoly judgment? Worth it. Well, there's a funny meme you could draw here. You know that meme of someone pushing a domino and it knocking over some big wall later? Yeah, there's the domino of Ilya leaving Google to start OpenAI, and the downstream effect is Google is not broken up. Yeah, right, exactly.
spk_0
It actually saves Google. It actually saves Google. It's totally wild, totally wild. All right, so here's the business today: over the last 12 months, Google has generated $370 billion in revenue. On the earnings side, they've generated $140 billion over the last 12 months, which is more profit than any other tech company; the only company in the world with more earnings is Saudi Aramco.
spk_0
Let's not forget: Google is the best business ever. And we also made the point at the end of the Alphabet episode that even in the midst of all of this AI and everything that's happened over the last 10 years, the last five years,
spk_0
Google's core business has continued to grow: 5x since the end of our Alphabet episode in 2015-2016. Yeah. Market cap: Google surged past their old peak of $2 trillion and just hit the $3 trillion mark earlier this month. They're the fourth most valuable company in the world, behind Nvidia, Microsoft, and Apple.
spk_0
Google is just crazy. On their balance sheet, actually, this is pretty interesting. I normally don't look at the balance sheet as part of this exercise, but it's useful, and here's why in this case: they have $95 billion in cash and marketable securities. I was about to stop there and make the point, wow, look how much cash and resources they have, but I'm actually surprised it's not more. It used to be $140 billion in 2021, and over the last four years they've massively shifted from this mode of accumulating cash to deploying cash, and a huge
spk_0
part of that has been the capex of the AI data center build-out. So they're very much playing offense in the way that Meta, Microsoft, and Amazon are in deploying that capex. But the thing I can't quite figure out: the largest part of that was actually buybacks, and they started paying a dividend.
spk_0
So if you're not a finance person the way to read into that is yes we still need a lot of cash for investing in the future of AI and data centers but we still actually had way more cash than we needed and we decided to distribute that to shareholders.
spk_0
Yeah, that's crazy. Best business of all time, right? That illustrates what a crazy business their core search ads business is: they're saying the most capital-intensive race in business history is happening right now.
spk_0
We intend to win it. Yeah. And we have tons of extra cash lying around on top of what we think we need, plus a safety cushion, for investing in that capex race. Yeah. Yes. Well, so there are two businesses worth looking at here: one is Gemini, to try to figure out what's happening there, and two is a brief history of Google Cloud. I want to tell you the cloud numbers today, but it's probably worth actually understanding how we got here
spk_0
on cloud. Yep. First, on Gemini: because this is Google, they have, I think, the most obfuscated financials of any of the companies we've studied. They may be the best at hiding the ball in their financial statements.
spk_0
Of course, we don't know Gemini-specific revenue. What we do know is there are over 150 million paying subscribers to the Google One bundle.
spk_0
Most of that is on a very low tier, like the $5-a-month, $10-a-month tiers. The AI stuff kicks in on the $20-a-month tier, where you get the premium AI features, but I think that's a very small fraction of the 150 million today. I think that's what I'm on. But two things to know: one, it's growing quickly, that 150 million is growing almost 50% year over year; but two, Google has a subscription bundle that 150 million people are subscribed to. And so I've kind of had it in my head
spk_0
that AI doesn't have a future as a business model that people pay money for, that it has to be ad-supported like search. But hey, that's not nothing; that's almost half of America. I mean, how many subscribers does Netflix have? Netflix is in the hundreds of millions. Yeah, Spotify's now a quarter billion, something like that. Yeah. We now live in a world where there are real scaled consumer subscription services. That's a lot of money.
spk_0
So this insight is due to Shishir Mehrotra. We chatted actually last night, because I name-dropped him in the last episode and then he heard it, so he reached out and we talked, and that's made me do a 180. I used to think if you're going to charge for something, your total addressable market shrinks by 90 to 99%. But he has this point that if you build a really compelling bundle, and Google has the digital assets to build a compelling bundle, oh my goodness: YouTube Premium, NFL Sunday Ticket,
spk_0
yes, stuff in the Play Store, YouTube Music, all the Google One storage stuff. They could put AI in that bundle and figure out, through clever bundle economics, a way to make a paid AI product that actually reaches a huge number of paying subscribers. Totally. So we really can't figure out how much money Gemini makes; it's probably not profitable right now anyway, so what's the point of even analyzing it.
spk_0
Yep. But okay, tell us the cloud story. So we intentionally did not include cloud in our Alphabet episode, effectively Google part two. Yes. Because it is a new product, and now a very successful one, within Google that was started during the same time period as all the other ones we talked about during Google part two.
spk_0
But it's so strategic for AI. Yes, it is a lot more strategic now, in hindsight, than it looked when they launched it. So just quick background: it started as Google App Engine. It was a way, in 2008, for people to quickly spin up a backend for a web (or, soon after, a mobile) app. It was a platform as a service, so you had to do things in this very narrow
spk_0
Google way. It was very opinionated: you had to use this SDK, you had to write it in Python or Java, you had to deploy exactly the way they wanted you to deploy. It was not a thing where they would say, hey, developer,
spk_0
you can do anything you want, just use our infrastructure. It was opinionated, super different from what AWS was doing at the time, and what they're still doing today, which the whole world eventually realized was right: cloud should be infrastructure as a service. Even Microsoft pivoted Azure to this.
spk_0
AWS would just give you the pieces directly: you want some storage, we've got storage for you; you want a VM, we've got a VM for you; you want some compute, you want a database, we've got the fundamental building blocks.
spk_0
So eventually Google launches their own infrastructure as a service in 2012. It took four years. They launched Google Compute Engine, which they would later rebrand Google Cloud Platform; that's the name of the business today.
spk_0
The knock on Google is that they could never figure out how to interface with the enterprise. In their core business, they made really great, polished products that people loved to use, they made it all self-service, and the way they made money was from advertisers. And let's be honest, there was no other choice but to use Google search, right? They didn't necessarily need to have a great enterprise experience for their advertising customers, because they were going to come anyway. Right. So they've got this
spk_0
self-serve experience. Meanwhile, the cloud is a knife fight. These are commodities; it's all about the enterprise, the lowest possible price, and it's all about enterprise relationships, clever ways to bundle, and being able to deliver a full solution. You say solution, I hear gross margin. Yes, exactly. So Google is out of their natural habitat in this domain, and early on they didn't want to give away any crown jewels; they viewed their infrastructure as, this is our secret thing.
spk_0
We don't want to let anybody else use it. And the best software tools that we have on it, that we've written for ourselves, like Bigtable, or Borg (how we run Google), or DistBelief: these were not services being made available on Google Cloud. Yeah, these are competitive advantages. Yes. And then they hired the former president of Oracle, Thomas Kurian. Yes. And everything kind of changed. So in 2017, two years before he comes in, they had four billion dollars in revenue, 10 years into running this business.
spk_0
In 2014 comes their first very clever strategic decision: they launch Kubernetes. The big insight here is, if we make it more portable for developers to move their applications to other clouds... the world kind of wants multi-cloud here, right? And we're the third-place player, we don't have anything to lose. Yes. So we can offer this tool, kind of counter-positioned against AWS and Azure, and shift the developer paradigm to use these containers; they're going to be able to
spk_0
orchestrate them on our platform, and then, you know, we have a great service to manage it for you. It was very smart. So this kind of becomes one of the pillars of their strategy: you want multi-cloud, we're going to make that easy, and sure, you can choose AWS or Azure too, it's going to be great. So David, as you said, the former president of Oracle, Thomas Kurian, is hired in late 2018. You couldn't ask for a better person to understand the needs of the enterprise than the former president of Oracle.
spk_0
This shows up in revenue growth right away: in 2020 they cross $13 billion in revenue, which was nearly tripling in three years. They hired like 10,000 people into the go-to-market organization, and I'm not exaggerating that. That's on a base of 150 people when he came in, most of whom were seated in California, not regionally distributed throughout the world.
spk_0
So the funniest thing is, Google kind of was a cloud company all along. They had the best engineers building this amazing infrastructure. Right, they had the products, they had the infrastructure; they just didn't have the go-to-market organization. Right. And the productization was all, like, Google: it was built for us, for engineers. They didn't really build things that let enterprises build the way they wanted to build. This all changes: in 2022 they hit $26 billion in revenue, and by 2023 they're a real, viable third cloud.
spk_0
They also flipped to profitability in 2023, and today they're over $50 billion in annual revenue run rate, growing 30% year over year; they're the fastest growing of the major cloud providers, 5x in five years. And it's really three things: one, finding religion on how to actually serve the enterprise; two, leaning into this multi-cloud strategy and actually giving enterprise developers what they want; and three, AI has been such a good tailwind for all
spk_0
hyperscalers, because these workloads all need to run in the cloud, because it's giant amounts of data and giant amounts of compute and energy. But in Google Cloud you can use TPUs, which they make a ton of, while everyone else is desperately begging Nvidia for allocations of GPUs. So if you're willing to not use CUDA and build on Google's stack, they have an abundant amount of TPUs for you.
spk_0
So this is why we saved cloud for this episode. There are two aspects of Google Cloud that I don't think they foresaw back when they started the business with App Engine, but that are hugely strategically important to Google today. One is simply that cloud is the distribution mechanism for AI.
spk_0
So if you want to play in AI today, you either need to have a great application, a great model, a great chip, or a great cloud. Google is trying to have all four of those. Yes, and there is no other company that has, I think, more than one. I think that's the right call. Think about the big AI players. Nvidia kind of has a cloud, but not really; they just have chips, and the best chips, the chips everyone wants, but
spk_0
chips. And then you look around the rest of the big tech companies. Meta, right now: only an application. They're completely out of the race for the frontier models at the moment; we'll see what their hiring spree yields. You look at Amazon: infrastructure. Application, maybe; I don't actually know if Amazon.com... I'm sure it benefits from LLMs in a bunch of ways, but mainly it's cloud. Yes, cloud, and a cloud leader. Microsoft: cloud. It's just cloud, right? They make some models, but I mean, they've got applications
spk_0
and a big cloud. Cloud. Apple: nothing. Nothing. AMD: just chips. Yep. OpenAI: model. And Anthropic: model. Yep. These companies don't have their own data centers; they're making noise about making their own chips, but not really, and certainly not at scale. Google has scale data centers, scale chips, scale usage of a model, I mean, even just from Google
spk_0
dot com queries now on AI Overviews, and scale applications. Yes. Yeah, they have all of the pillars of AI, and I don't think any other company has more than one. And they have the most net income dollars to lose. Right. So then there's the chip side, specifically this: if Google didn't have a cloud, it wouldn't have a chip business; it would only have an internal chip business. The only way that external companies, users, developers,
spk_0
model researchers could use TPUs would be if Google had a cloud to deliver them, because there's no way in hell that Amazon or Microsoft are going to put TPUs from Google in their clouds. We'll see, we'll see. I guess... I think within a year it might happen. There are rumors already that some
spk_0
neoclouds in the coming months are going to have TPUs. Interesting. Nothing announced, but TPUs are likely going to be available in neoclouds soon, which is an interesting thing. Why would Google do that? Are they trying to build an
spk_0
Nvidia-type business where they make money selling chips? I don't think so. I think it's more that they're trying to build an ecosystem around their chips the way that CUDA does, and you're only going to credibly be able to do that if your chips are accessible anywhere that someone's running their existing workloads.
spk_0
Yeah, it'll be very interesting if it happens. And you know, look, you may be right, maybe there will be TPUs in AWS or Azure someday, but I don't think they would have been able to start there. If Google didn't have a cloud and there wasn't
spk_0
any way for developers to use TPUs and start wanting TPUs, would Amazon or Microsoft feel like, you know, all right, Google, we'll take some of your TPUs, even though no developer out there uses them?
spk_0
Right. All right, well, with that, let's move into analysis. I think we need to do bull and bear on this one. You have to this time; got to bring that back for these episodes in the present. It seems like we need to paint the possible futures.
spk_0
Yes, bringing back bull and bear, I love it. We'll do playbook, powers, quintessence, bring it home. Perfect. All right, so here's my set of bull cases. Google has distribution to basically all humans as the front door to the internet; they can funnel that however they want. You've seen it:
spk_0
it's essentially at an all-time high, and it's a default behavior. Yeah, powerful. So that is a bet on implementation, that Google figures out how to execute and build a great business out of AI, but it is still theirs to lose,
spk_0
and they've got a viable product. It's not clear to me that Gemini is any worse than OpenAI's or Anthropic's products. No, I completely agree. This is a value creation versus value capture thing: the value creation is there in spades; the value capture mechanism is still TBD.
spk_0
Yeah, Google's old value capture mechanism is one of the best in history, so that's the issue at hand. Let's not get confused: it's not like it's not a good experience. It's a great experience. Yeah, yeah, yeah, yeah.
spk_0
Okay, so we've talked about the fact that Google has all the capabilities to win in AI, and it's not even close: foundational model, chips, hyperscaler, all of this with self-sustaining funding. I mean, that's the other crazy thing. You look at it: the clouds have self-sustaining funding, Nvidia has self-sustaining funding, but none of the model makers have self-sustaining funding, so they're all dependent on external capital. Yeah. Google is the only model maker who has self-sustaining funding. Yes. Isn't that crazy?
spk_0
yeah basically all the other large scale usage foundational model companies are effectively startups yes and Google's is funded by a money funnel so large that they're giving extra dollars back to shareholders for fun yeah
spk_0
Again, we're in the bull case. Well, when you put it that way... Yeah. I think we didn't mention: Google has incredibly fat pipes connecting all of their data centers. After the dot-com crash in 2000, Google bought all that dark fiber for pennies on the dollar, and they've been activating it over the last decade. They now have their own private backhaul network between data centers. No one has infrastructure like this. Yep. Not to mention their fat pipes serve YouTube,
spk_0
which in and of itself is its own bull case for Google in the future. Ah, great point. Yeah, Ben Thompson wrote a big article about this yesterday at the time of recording. Yeah, that was like a mega bull case that Ben Thompson published this week. It made an interesting point: a text-based internet is kind of the old internet, the first instantiation of the internet, because we didn't have much bandwidth. The user experience that is actually compelling is video, high-resolution
spk_0
video, everywhere, all the time. We already live in the YouTube internet, right? And not only can they train models on really the only scaled source of UGC media, across long form and short form, but they also have that as the number two search engine, this massive destination site. So they previewed things like: you'll be able to buy AI-labeled, or AI-determined, things that show up in videos. And if they wanted to, they could just go
spk_0
label every single product in every single video and make it all instantly shoppable. It doesn't require any human work to do it; they could just do it, and then run their standard ads model on it. That was a mind-expanding piece that Ben published yesterday, or I guess, if you're listening to this, a few weeks ago.
spk_0
And then there's also all the video AI applications they've been building, like Flow and Veo. What is that going to do for generating videos for YouTube, which will increase engagement and ad dollars for YouTube? Yeah.
spk_0
It's gonna work real well. Yeah. They still have an insane talent bench; even though, you know, they've bled talent here and there and lost people, they have also shown they're willing to spend billions for the right people and retain them.
spk_0
Unit economics. Let's talk about the unit economics of chips. Everyone is paying Nvidia 75-80% gross margins, implying something like a 4-5x markup on what it costs to make the chips. A lot of people refer to this as the Jensen tax, or the Nvidia tax. You can call
spk_0
it good business, you can call it pricing power, you can call it scarcity of supply, whatever you want, but it is true: anyone who doesn't make their own chips is paying a giant, giant premium to Nvidia. Google still has to pay some margin to their chip hardware partner Broadcom, which handles a lot of the work to actually make the chip
spk_0
and interface with TSMC. I have heard that Broadcom has something like a 50% margin when working with Google on the TPU, versus Nvidia's 80%. That's still a huge difference to play with: a 50% gross margin from your supplier versus an 80% gross margin from your supplier is the difference between a 2x markup and a 5x markup. Yeah, I guess that's right. When you frame it that way, it's actually a giant difference in the impact to your cost. So you might
spk_0
wonder, appropriately: are chips actually the big part of the cost, like the total cost of ownership of running one of these data centers or training one of these models? Chips are the main driver of the cost, and they depreciate very quickly. I mean, this is at best a five-year depreciation, because of how fast we are pushing the limits of what we can do with chips, the needs of next-generation models, how fast TSMC is able to produce. Yeah, I mean, even that is ambitious, right? If you think
spk_0
you're going to get five years of depreciation on AI chips: five years ago, we were still two years away from ChatGPT. Right. Or think about what Jensen said when we were at GTC this year. He was talking about Blackwell, and he said something about Hopper like, yeah, you don't want Hopper. My sales guys are going to hate me, but you really don't want Hopper at this point.
spk_0
And that was the H100, the hot chip just when we were doing our most recent Nvidia episode. Yes, things move quickly. So I've seen estimates that over half the cost of running an AI data center is the chips and the associated depreciation. The human cost, the R&D, is actually a pretty high amount, because hiring these AI researchers and all the software engineering is meaningful; call it 25 to 33%. The power is actually a very small part; it's like two to six
spk_0
percent. So when you're thinking about the economics of doing what Google is doing, it's actually incredibly sensitive to how much margin you're paying your supplier on the chips, because that's the biggest cost driver of the whole thing.
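To make the markup arithmetic here concrete: a supplier gross margin m implies a price of cost / (1 - m), i.e. a markup of 1 / (1 - m) over cost. A minimal sketch follows; note the margin figures (Nvidia ~80%, Broadcom ~50% on TPUs) and the ~50% chip cost share are the hosts' rough estimates, not disclosed numbers.

```python
# Markup implied by a supplier's gross margin: price = cost / (1 - margin),
# so dollars paid per dollar of supplier cost = 1 / (1 - margin).
# The 80% / 50% margins and the ~50% chip cost share are the hosts'
# rough estimates from the discussion, not published figures.

def markup(gross_margin: float) -> float:
    """Dollars paid per dollar of supplier cost at a given gross margin."""
    return 1.0 / (1.0 - gross_margin)

nvidia_markup = markup(0.80)     # 5.0x, the "Nvidia tax"
broadcom_markup = markup(0.50)   # 2.0x, estimated markup on Google's TPUs

# Chips plus their depreciation are estimated at ~50%+ of total
# data-center cost, so a cheaper markup on silicon moves the whole
# cost structure, not a rounding error.
silicon_savings = 1 - broadcom_markup / nvidia_markup  # fraction saved

print(f"Nvidia markup: {nvidia_markup:.1f}x")
print(f"Broadcom/TPU markup: {broadcom_markup:.1f}x")
print(f"Chip-cost savings at equal supplier cost: {silicon_savings:.0%}")
```

Run as-is, this shows the 2x-versus-5x gap the hosts describe: at equal underlying silicon cost, the TPU route spends roughly 60% less on chips.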
spk_0
So I was sanity-checking some of this with Gavin Baker, who's a partner at Atreides Management, to prep for this episode. He's a great public equities investor who's studied the space for a long time; we actually interviewed him at the Nvidia GTC pregame show. And he pointed out that normally, in
spk_0
historical technology eras, it hasn't been that important to be the low-cost producer. Google didn't win because they were the lowest-cost search engine; Apple didn't win because they were the lowest cost. You know, that's not what makes people win. But this era might actually be different, because these AI companies don't have 80% margins the way we're used to in the technology business, or at least in the software business. At best, these AI companies look like 50% gross margins.
spk_0
So Google being definitively the low-cost provider of tokens, because they operate all their own infrastructure and because they have access to low-markup hardware, actually makes a giant difference, and might mean that they are the winner in producing tokens for the world. Very compelling bull case there. That's a weirdly winding analytical
spk_0
bull case, but it's kind of the, if you really want to get down to it: they produce tokens. Yep. I've got one more bullet point to add to the bull case for Google here.
spk_0
Everything that we talked about in part two, the Alphabet episode, all of the other products within Google: Gmail, Maps, Docs, Chrome, Android. That is all personalized data about you that Google
spk_0
holds, that they can use to create personalized AI products for you that nobody else has. Another great point.
spk_0
So really, the question to close out the bull case is: is AI a good business to be in, compared to search? Search is a great business to be in. So far,
spk_0
AI is not. But in the abstract, and again, we're in the bull case, so I'll give you this: it should be. With traditional web search, you type in two to three words.
spk_0
That's the average query length. And I was talking to Bill Gross, and he pointed out that in an AI chat, you're often typing 20-plus words.
spk_0
So there should be an ad model that emerges and ad rates should actually be dramatically higher because you have perfect precision right you have even more intent.
spk_0
Yes, you know the crap out of what that user wants, so you can really decide whether to target them with the ad or not, and AI should be very good at targeting the ad. So it's all about
spk_0
figuring out the user interface, the mix of paid versus not, exactly what this ad model is. But in theory, even though we don't really know what the product looks like now, it should actually lend itself very well to monetization.
spk_0
And since AI is such an amazing, transformative experience, all these interactions that were happening in the real world, or weren't happening at all, like answers to questions, and more time spent, are now happening in these AI chats.
spk_0
So it seems like the pie is actually bigger for digital interactions than it was in the search era, so again, monetization should kind of increase because the pie increases.
spk_0
Yeah.
spk_0
And then you've got the bull case that Waymo could be its own Google-sized business.
spk_0
I was just thinking that. Yeah, and that's scoping all of this to a replacement for the search market; Waymo, and potentially other applications of AI beyond the traditional search market, could
spk_0
add to that. Right. And then there's the, like, galaxy-brain bull case, which is: if Google actually creates AGI, none of this even matters anymore, and of course it's the most valuable thing. That feels out of scope for an Acquired episode. It's disconnected.
spk_0
Yes agree.
spk_0
Bear case: so far, this is all fun to talk about, but the product shape of AI has not lent itself well to ads. So despite more value creation, there's way less value capture.
spk_0
Google makes something like $400-ish per user per year in the US, just based on some napkin math. That's a free service that everyone uses, and they make $400-ish a year per user.
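For what it's worth, the ~$400 figure pencils out with round numbers like these; both inputs below are illustrative assumptions, not figures from the episode.

```python
# Napkin math behind "~$400 per US user per year."
# ASSUMPTIONS (illustrative round numbers, not from the episode):
# roughly $100B/year of US search-ad revenue spread over ~250M US users.
us_search_ad_revenue = 100e9   # dollars per year, assumed
us_users = 250e6               # assumed

revenue_per_user = us_search_ad_revenue / us_users
print(f"${revenue_per_user:.0f} per user per year")
```

Which frames the point that follows: a $20/month subscription is $240 a year, and only a fraction of users will ever pay it.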
spk_0
Who's going to pay $400 a year for access to AI? It's a very thin slice of the population. Some people certainly will, but not every person in America. Some people will pay; 10 million, maybe, but... Right.
spk_0
So if you're only looking at the game on the field today, I don't see the immediate path to value capture. And think about when Google launched in 1998: it was only two years before they had AdWords. They figured out an amazing value capture mechanism very quickly. Yeah. Another bear case: think back to Google's launch in 1998. It was immediately, obviously, the superior product.
spk_0
Yes, definitely not the case today. No, there are four or five great products. Google's dedicated AI offering, the chatbot, was initially the immediately, obviously inferior product, and now it's arguably on par with several others. Right. They own 90% of the search market; I don't know what they own of the AI market. Is it 90%? Is it 25%? I don't know, but at steady state it probably will be something like 25%, maybe up to 50%. But this is going to be a lot of money.
spk_0
So it's going to be a market with several big players in it so even if they monetize each user as great as they monetize it in search they're just going to own way less of them.
spk_0
Yep or at least it certainly seems that way right now yes AI might take away the majority of the use cases of search and even if it doesn't I bet it takes away a lot of the highest value ones.
spk_0
If I'm planning a trip, I'm planning it in an AI; I'm no longer searching on Google for things that are going to land Expedia ads in my face. Or health, another huge vertical.
spk_0
Hey, I think I might have something that reminds me of mesothelioma, is it that or not? Right, so where are you going to put the lawyer ads? Maybe you put them there, maybe it's just an ad product thing, but these are very high-value queries, former searches, that feel like some of the first things getting
spk_0
siphoned off to AI. Yep. Any other bear cases? I think the only other bear case I would add is that they have the added challenge now of being the incumbent this time around, and people and the ecosystem aren't necessarily rooting for them the way people were rooting for Google when they were a startup, and the way people were still rooting for Google in the mobile transition.
spk_0
I think the startups have more of the hearts and minds these days, right? So, I don't think that's quantifiable, but it's just going to make it all a little harder row to hoe this time around.
spk_0
Yep you're right they had this incredible PR and public love tailwind the first time around.
spk_0
Yep, and part of that is systemic to, like, all of tech. All of big tech is just generally more out of favor with the country and the world now than it was 10 or 15 years ago. It's more important, it's just big infrastructure, it's not underdogs anymore. Yeah, and that affects the OpenAIs and the Anthropics and the startups too, but I think to a lesser degree. Yeah, they had to start behaving like big tech companies really early in their life compared to Google. I mean, Google gave a Playboy interview during their
spk_0
quiet period of their IPO. Times have changed. Well, I mean, given all the drama at OpenAI, I don't know that they're managing to act like a mature company. Fair, fair. Company, entity, whatever they are. Yes. Yeah, but point taken. Well, I worked most of my playbook into the story itself, so you want to do power? Yeah, great, let's move on and do power: Hamilton Helmer's Seven Powers analysis of Google here in the AI era.
spk_0
And the seven powers are: scale economies, network economies, counter-positioning, switching costs, branding, cornered resource, and process power. The question is which of these enables a business to achieve persistent differential returns; what entitles them to make greater profits than their nearest competitor, sustainably? Normally we would do this on the business all-up; I think for this episode we should try to scope it to AI products. Yes, agree. Usage of Gemini, AI Mode, and AI
spk_0
Overviews, versus the competitive set of Anthropic, OpenAI, Perplexity, Grok, Meta AI, etc. Scale economies, for sure, even more so in AI than traditionally in tech. Yeah, they're just way better. I mean, look, they're amortizing the cost of model training across every Google search. I'm sure it's some super-distilled-down model that's actually serving AI Overviews, but think about how many
spk_0
inference tokens are generated by the other model companies, and how many inference tokens are generated by Gemini. They just are amortizing that fixed training cost over a giant, giant amount of inference. I saw some crazy chart, we'll send it out to email subscribers: in April of '24, Google was processing 10 trillion tokens across all their surfaces. In April of '25,
spk_0
that was almost 500 trillion. Wow. That's a 50x increase, in one year, in the number of tokens that they're vending out across Google services through inference. And between April of '25 and June of '25, it went from a little under 500 trillion to a little under one quadrillion tokens, technically 980 trillion. But they are now, because it's later in the summer, definitely sending out maybe even
spk_0
multiple quadrillion tokens. Wow. Wow. So among all the other obvious scale economies things, of amortizing all the cost of their hardware, they are
spk_0
amortizing the cost of training runs over a massive amount of value creation. Yeah, scale economies must be the biggest one. I find switching costs to be relatively low.
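The amortization arithmetic being described can be sketched in a few lines. The token volumes are the ones cited in the conversation (approximated); the fixed training cost is a purely hypothetical round number for illustration, not a real Google figure:

```python
# Back-of-the-envelope sketch of the scale-economies point:
# a fixed model-training cost amortized over inference token volume.
# Token figures are the approximate ones cited in the episode.

MONTHLY_TOKENS = {
    "2024-04": 10e12,   # ~10 trillion tokens/month across Google surfaces
    "2025-04": 480e12,  # "almost 500 trillion"
    "2025-06": 980e12,  # "technically 980 trillion"
}

# Year-over-year growth, April '24 -> April '25
yoy_multiple = MONTHLY_TOKENS["2025-04"] / MONTHLY_TOKENS["2024-04"]
print(f"YoY token growth: {yoy_multiple:.0f}x")  # -> 48x, the "50x" cited above

# HYPOTHETICAL fixed training cost, amortized over a year of inference
# at each month's run rate (assumption: $100M, a made-up round number).
HYPOTHETICAL_TRAINING_COST_USD = 100e6

for month, tokens in MONTHLY_TOKENS.items():
    annual_tokens = tokens * 12  # naive annualization of that month's rate
    cost_per_million = HYPOTHETICAL_TRAINING_COST_USD / annual_tokens * 1e6
    print(f"{month}: ${cost_per_million:.6f} amortized training cost per 1M tokens")
```

The point of the sketch: as monthly token volume grows roughly 100x, the amortized fixed cost per million inference tokens falls by about two orders of magnitude, which is exactly the advantage being described.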
spk_0
If I use Gemini for some stuff, then it's really easy to switch away. That probably stops being the case when it's personal AI, to the point that you're talking about, integrating with your calendar and your mail and all that stuff.
spk_0
Yeah, the switching costs have not really come out yet in AI products, although I expect they will. Yes. They have within the enterprise, for sure.
spk_0
Yep. Network economies? I don't think so. If anyone else is a Gemini user, it doesn't make it better for me, because they're sucking up the whole internet whether anyone's participating or not.
spk_0
Yep, I agree. I'm sure AI companies will develop network economies over time, I can think of ways it could work, but yeah, right now, no. And arguably for the foundational model companies, I can't think of obvious reasons right now.
spk_0
Where does Hamilton put distribution? Because that's a thing that they have right now that no one else has. Despite ChatGPT having the Kleenex brand, Google's distribution is still unbelievable.
spk_0
I don't know, is that a cornered resource? Cornered resource, I guess, yeah, definitely that. Yeah, Google Search is a cornered resource for sure. They certainly don't have counter-positioning, they're getting counter-positioned.
spk_0
Yep. I don't think they have process power, unless they were, like, coming up with the next transformer reliably, but I don't think we're necessarily seeing that. There's great research being done at a bunch of different labs. Branding they have.
spk_0
Yeah, branding is a funny one, right? Well, I was going to say, it's a little bit to my bear case point about them being the incumbent. It cuts both ways, but I think it's net positive. Yeah, probably. For most people, they trust Google.
spk_0
Yeah, they probably don't trust these who-knows AI companies, but, I trust Google. I bet that's actually stronger than any downsides, as long as they're willing to still release stuff on the cutting edge. Yeah.
spk_0
So to sum it up: it's scale economies as the biggest one, it's branding, and it's cornered resource, with potential for switching costs in the future. Yep, sounds right to me.
spk_0
But it's telling that it's not all of them, you know? In search, it was, like, very obviously all of them, or most of them. Yep, quite telling. Well, I'll tell you, after hours and hours,
spk_0
spending multiple months learning about this company, my contention when I boil it all down is just that this is the most fascinating example of the innovator's dilemma ever. I mean, Larry and Sergey control the company. They have been quoted repeatedly saying that they would rather go bankrupt than lose at AI.
spk_0
Will they really? If AI isn't as good a business as search, and it kind of feels like, of course it will be, of course it has to be, just because of the sheer amount of value creation. But if it's not, and they're choosing between two outcomes, one is fulfilling our mission of organizing the world's information
spk_0
and making it universally accessible and useful, and the other is having the most profitable tech company in the world, which one wins? Because if it's just the mission, they should be way more aggressive on AI Mode than they are right now, and fully flip over to Gemini.
spk_0
It's a really hard needle to thread. I'm actually very impressed at how they're managing to currently protect the core franchise, but it might be one of these things where it's being eroded away at the foundation in a way that just somehow isn't showing up in the financials yet. I don't know.
spk_0
Of the big tech companies, Google, as unlikely as it seems given how things started, is probably doing the best job of trying to thread the needle with AI right now, and that is incredibly commendable to Sundar and their leadership.
spk_0
They are making hard decisions, like: we're unifying DeepMind and Brain, we're consolidating and standardizing on one model, and we're going to ship this stuff real fast,
spk_0
while at the same time not making rash decisions.
spk_0
It's hard. Rapid but not rash, you know? Yes. And obviously we're still in the early innings of all this going on,
spk_0
and we'll see in 10 years where it all ends up.
spk_0
Yeah, being tasked with being the steward of a mission and the steward of a franchise, with public company shareholders, is a hard dual mission,
spk_0
and Sundar and the company are handling it remarkably well, especially given where they were five years ago.
spk_0
And I think this will be one of the most fascinating examples in history to watch it play out.
spk_0
Totally agree. Well, that concludes our Google series, for now.
spk_0
Yes. All right, let's do some carve outs. All right, let's do some carve outs.
spk_0
Well, first off, we have a very, very fun announcement to share with you all: the NFL called us.
spk_0
We're going to the Super Bowl, baby! Acquired is going to the Super Bowl. This is so cool.
spk_0
It's the craziest thing ever. The NFL is hosting an innovation summit the week of the Super Bowl, the Friday before Super Bowl Sunday.
spk_0
The Super Bowl is going to be in San Francisco this year, in February, and so it's only natural, coming back to San Francisco with the Super Bowl, that the NFL should do an innovation summit.
spk_0
Yep, and we're going to host it. That's right. So the Friday before, there's going to be some great on-stage interviews and programming.
spk_0
For most of you, you know, we can't fit millions of people in a tiny auditorium in San Francisco the week of the Super Bowl, when every other venue has tons of stuff
spk_0
too. So there will be an opportunity to watch it streaming online, and as we get closer to that date in February, we will make sure that you all know a way that you can tune in
spk_0
and watch the MCing and interviewing and festivities at hand Super Bowl week. It's going to be an incredible, incredible day leading up to
spk_0
an incredible Sunday. Yes. Well, speaking of sport, my carve out is: I finally went and saw F1. It is great. I highly recommend anyone go see it, whether you're an F1 fan or not.
spk_0
It is just beautiful cinema. I mean, did you see it in the theater, or... I did see it in the theater, yeah. I unfortunately missed the IMAX window, but it was great.
spk_0
It's my first time being in a movie theater in a while, and whether you watch it at home or whether you watch it in the theater, I recommend the theater,
spk_0
but it's going to be a great surround-sound experience wherever you are. I haven't been to the movie theater since the Eras Tour, which I think is just more about the current state of my family life with two young children.
spk_0
Yes. My second one, so you're going to laugh, is the Travelpro suitcase. This is the brand that pilots and flight attendants use, right? Maybe, I think I've seen some of them use it. Usually they use something higher-end, like a
spk_0
Briggs & Riley or a Tumi, or, you know... Travelpro is not the most high-end suitcase, but I bought two really big ones for some international travel that we were doing with my two-year-old toddler, and I must say,
spk_0
they're robust, the wheels glide really well, they're really smooth, and they have all the features you would want. They're soft-shell, so you can, like, really jam it full of stuff, but there's also a thick
spk_0
amount of protection, so even if you do jam it full of stuff, it's probably not going to break. This is approximately the most budget suitcase you could buy. I mean, I'm looking at the big honkin' international checked bag version,
spk_0
it's four hundred and sixteen dollars on Amazon right now. I've seen it cheaper, they have great sales pretty often. Everything about this suitcase checked lots of boxes for me, and I completely thought I would be the person buying the
spk_0
Rimowa suitcase, or the something-very-high-end, and this is just perfect. So I think I may be investing in more Travelpro suitcases. More Travelpro. Nice, nice. Well, I mean, hey, look, for family travel you don't want nice stuff. Yeah, I mean, I bought it thinking, like, I'll just
spk_0
get something crappy for this trip, but it's been great. I don't understand why I wouldn't have a full lineup of Travelpro gear. So basically this is my, like, budget pick gone right, that I highly recommend for all of you.
spk_0
I love how Acquired is turning into the Wirecutter here. That's it for me today. Great. All right, I have two carve outs. I have one carve out, then I have an update in my ongoing Google carve out saga. But first, my actual carve out: it is the Glue Guys podcast.
spk_0
Oh, it's great, those guys are awesome. So great. Our buddy Ravi Gupta, partner at Sequoia, and his buddies Shane Battier, the former basketball player, and Alex Smith, the former quarterback for the 49ers and the Kansas City Chiefs and the Redskins. Their dynamic is so great, they have so much fun.
spk_0
Half of their episodes, like us, are just them, then half of their episodes are with guests. Ben and I went on it a couple weeks ago, that was really fun.
spk_0
When we were on it, we were talking about this dynamic of some episodes doing better than others, and the pressure for episodes to do well or not, and the guys brought up this interview they did with a guy named Wright Thompson. And they said, like, this is an episode that's got like 5,000
spk_0
listens, nobody's listened to it, and it's so good. And the mentality that we have about it is not that we're embarrassed that nobody listened to it, it's that we feel sorry for the people who have not yet listened to it, because it's so good.
spk_0
Like, that is the way to think about your episodes. That's great. So here you are, you're giving everyone the gift. Giving everyone the gift, because then I was like, all right, well, I've got to go listen to this
spk_0
episode. Wright Thompson, I didn't know anything about him before. I probably read his work in magazines over the years without realizing it. He's the coolest dude. He has the same accent as Bill Gurley, so
spk_0
listening to him sounds like listening to, so good, Bill Gurley if, instead of being a VC, he only wrote about sports and basically dedicated his whole life to understanding the mentality and psychology of athletes and coaches. It's so cool. It's so cool. It's a great,
spk_0
great episode. Highly, highly, highly recommend. All right, legitimately, I'm queuing that up right now. Great. That's my carve out. And then, my ongoing family video gaming saga: in Google Part I, I said I was debating
spk_0
between the Switch 2 and the Steam Deck. That's right, first you got the Steam Deck, because you decided your daughter actually wasn't old enough to play video games with you, so you just got the thing for you. The update was, I went with the Steam Deck for that
spk_0
reason. I thought if it's just for me, it would be more ideal. I have an update. You also got a Switch? No, not yet. Okay, okay. But the most incredible thing happened. My daughter noticed this device that appeared in our house that dad plays
spk_0
every now and then, and we were on vacation and I was playing the Steam Deck, and she was like, what's that? Well, let me tell you. I've been playing this really cool indie, old-school-style RPG called Sea of Stars, it's like a Chrono Trigger-style, Super Nintendo-style RPG. I'm playing it,
spk_0
and my daughter comes to me, she's like, can I watch you play? And I'm like, hell yeah, you can watch me play. I get to play video games and you sit here and snuggle with me, and, like, you know, amazing, I get to play video games and call it parenting. Then it gets even better. Probably
spk_0
like two weeks ago, we're playing, and she's like, hey Dad, can I try? I'm like, absolutely you can try. I hand her the Steam Deck, and it was the most incredible experience, one of the most incredible experiences I've had as a parent, because she doesn't know how to play video games,
spk_0
and I'm watching her learn how to, like, use a joystick, and I'm... supervised learning. Yeah, yeah, supervised. I'm telling her what to do. And then within two or three nights, she got it. She doesn't even know how to read yet, but she figured it out. It's, like, awesome, watching it in real time. And so now, the last week,
spk_0
it's turned to mostly she's playing and I'm, like, helping her, asking questions, like, well, what do you think you should do here? Like, you know, should you go here? I think this is the goal, I think this is where... It's so, so fun. So I think I might actually, pretty soon, her birthday's coming up,
spk_0
end up getting a Switch so that we can play, you know, together on the Switch. Right. But unintentionally, the Steam Deck was the gateway drug for my soon-to-be four-year-old daughter. That's awesome. There you go.
spk_0
Parent of the year right there, getting to play video games. And, honey, I got it, I'll take it. Yeah, I got it, I got it.
spk_0
All right. Well, listeners, we have lots of thank yous to make for this episode. We talked to so many folks who were instrumental in helping put it together. First, thank you to our partners this season:
spk_0
J.P. Morgan Payments, trusted, reliable payments infrastructure for your business, no matter the scale. That's jpmorgan.com/acquired.
spk_0
Sentry, the best way to monitor for issues in your software and fix them before users get mad. That's sentry.io/acquired. WorkOS, the best way to make your app enterprise-ready, starting with single sign-on in just a few lines of code.
spk_0
workos.com. And Shopify, the best place to sell online, whether you're a large enterprise or just a founder with a big idea. shopify.com/acquired. The links are all in the show notes. As always, all of our sources for this
spk_0
episode are linked in the show notes. Yes. First, Steven Levy at Wired and his great classic book on Google, In the Plex, which has been an amazing source for all three of our Google
spk_0
episodes. Definitely go buy the book and read that. Also, Parmy Olson at Bloomberg for her book Supremacy, about DeepMind and OpenAI, which was a main source for this
spk_0
episode. And I guess also Cade Metz, right, for Genius Makers. Yeah, yeah, great book. Our research thank yous: Max Ross, Liz Reid, Josh Woodward, Greg
spk_0
Corrado, Sebastian Thrun, Dave Patterson, Bret Taylor, Clay Bavor, Demis Hassabis, Thomas Kurian, Sundar Pichai. A special thank you to Nick Fox,
spk_0
who is the only person we spoke to for all three episodes for research. We got the hat trick. Yeah. To Arvind Navaratnam at Worldly Partners for his great
spk_0
write-up on Alphabet, linked in the show notes. To Jonathan Ross, original team member on the TPU and today the founder and CEO of
spk_0
Groq, that's Groq with a Q, making chips for inference. To the Waymo folks, Dmitri Dolgov and Suzanne Philion. To Gavin Baker from Atreides
spk_0
Management. To MG Siegler, writer at Spyglass. MG is just one of my favorite technology writers and pundits in all of tech,
spk_0
and the creator. That's right. To Ben Eidelson, for being a great thought partner on this episode, and his excellent recent
spk_0
episode on the Step Change podcast on the history of data centers. I highly recommend it if you haven't listened
spk_0
already. It's only episode three for them of the entire podcast, and they're already getting, I don't know, 30, 40,000
spk_0
listens on it. I mean, this thing has taken off. Amazing, dude, that's way better than we were doing on episode three. It's way
spk_0
better than we were doing. And if you like Acquired, you will love the Step Change podcast, and Ben is a dear friend, so
spk_0
highly recommend checking it out. To Koray Kavukcuoglu from the DeepMind team, building the core Gemini models. To
spk_0
Shishir Mehrotra, the CEO of Grammarly, who formerly ran product at YouTube. To Jim Gao, the CEO of Phaidra and a former DeepMind team member. To
spk_0
Chetan Puttagunta, partner at Benchmark. To Dwarkesh Patel, for helping me think through some of my conclusions to draw. And to Brian Lawrence
spk_0
from Oakcliff Capital, for helping me think about the economics of AI data centers. If you like this episode, go check out our
spk_0
episode on the early history of Google, and the 2010s with our Alphabet episode, and of course our series on Microsoft and Nvidia.
spk_0
After this episode, go check out ACQ2 with Tobi Lütke, the founder and CEO of Shopify, and come talk about it with us in the Slack
spk_0
at acquired.fm/slack. And don't forget our 10th anniversary celebration of Acquired: we are going to do an open Zoom call,
spk_0
an LP call just like the days of yore, with anyone. Listeners, come join us on Zoom. It's going to be on October 20th at 4 p.m. Pacific time. Details are in the show notes.
spk_0
And with that, listeners, we'll see you next time. We'll see you next time.
spk_0
Who got the truth? Is it you? Is it you? Is it you? Who got the truth now?
Topics Covered
Google AI company
innovator's dilemma
artificial intelligence revolution
transformer invention
AI talent concentration
Gemini AI model
Google Cloud revenue
tensor processing units
AI chip market
machine learning history
PageRank algorithm
compressing data technology
language models
AI understanding
Google's AI strategy
acquired podcast