Sept. 12, 2024

The Oracle Of Silicon Valley Shares How to Create More Value Than You Capture | Tim O’Reilly


What is the relationship between technology and society? What happens to idealism over time? I’m fortunate to have discussed these questions and many more with Tim O’Reilly for this episode of The Eric Ries Show.

Tim is the founder of O’Reilly Media, which has provided countless programmers and technologists with foundational information for doing their work well. He’s also been a long-time witness to the changes and growth of tech, and has consistently looked far ahead of other people, perhaps most famously in his book WTF?: What’s the Future and Why It’s Up to Us.

Many call Tim the Oracle of Silicon Valley. He thinks of himself more as “a voice in the wilderness” and someone who brings together people with great ideas to build new things. In our conversation, we covered how to build a business with a real ethos, how to gain competitive advantage by doing the right thing, and why thinking beyond the quarter, or even the year, is crucial for survival.

As he said, “Companies need to think about the long term, which is: who is going to provide what you are the gatekeeper for if you basically have a relentless acquisition of all the value for yourself?” Ultimately, he believes – and I couldn’t agree more – that “we need to build an economy in which the important things are paid for in self-sustaining ways rather than as charities to be funded out of the goodness of our hearts.”

Other topics we touched on include:

  • AI from multiple angles
  • Tech companies as the fulcrum between suppliers and customers
  • The extractive power of ads
  • What it means to see a business as an ecosystem
  • Values as a map of the world
  • And much, much more

 

Brought to you by:

Mercury – The art of simplified finances. Learn more.

DigitalOcean – The cloud loved by developers and founders alike. Sign up.

Neo4j – The graph database and analytics leader. Learn more.

 

Where to find Tim O'Reilly:

• O’Reilly Media: https://www.oreilly.com/ 

• X: https://x.com/timoreilly 

• Facebook: https://www.facebook.com/timoreilly

• LinkedIn: https://www.linkedin.com/in/timo3/

 

Where to find Eric:

• Newsletter: https://ericries.carrd.co/

• Podcast: https://ericriesshow.com/

• X: https://twitter.com/ericries

• LinkedIn: https://www.linkedin.com/in/eries/

• YouTube: https://www.youtube.com/@theericriesshow

 

In This Episode We Cover:

(00:43) Meet Tim O’Reilly

(06:25) How Eric and Tim met

(08:38) On not getting caught up in trends of the moment

(10:44) Tim’s early career and how his thinking evolved

(13:10) From open source to Web 2.0

(14:06) Working to make the world a better place

(16:11) How idealism is subsumed by the system we work in

(18:25) Resisting the lure of profit

(19:53) Thinking about companies as members of an ecosystem

(21:20) Creating versus capturing value

(23:38) Internet aggregators

(26:45) Choosing value creation over extraction as a means of sustainability

(30:00) Ads and the lesson of screen placement

(32:49) AI and Google

(37:13) Values are a map of the world

(39:56) How to build a company ethos

(42:24) How adopting values publicly makes them more powerful

(47:46) Dune

(49:31) Literature’s evolution

(50:21) Anthony Trollope’s proto-feminist novel, Can You Forgive Her?

(50:58) George Eliot’s The Mill on the Floss

(51:22) The Dune movies

(52:39) Turning books into movies

(54:23) Tim’s favorite childhood books

(58:29) Why doing good is the best path to success

(1:02:22) Generative AI, value, and trust

(1:12:25) How idealists talk themselves out of it

(1:15:53) Lightning Round

 


Production and marketing by https://penname.co/.

Eric may be an investor in the companies discussed.

Transcript

Tim O'Reilly (00:00:00):
People are just starved for idealism.

Eric Ries (00:00:02):
Totally.

Tim O'Reilly (00:00:03):
They're starved for values and companies that express values, it's winning. And so I think a lot of it is just a practical side of it. Google, when they were idealistic and expressed their ideals, Amazon when they were idealistic and expressed their ideals, had enormous loyalty from their staff and they had people who felt mission driven. And when you lose that, you lose something. How can you not just become more and more dullified? Idealism is a critical part of the human soul.

Eric Ries (00:00:43):
Welcome to the Eric Ries Show, I'm your host. Today, my guest is one of the most profound thinkers about the relationship of technology and society, someone who has literally been the person selling the picks and shovels to the gold rush that is Silicon Valley: Tim O'Reilly. He's the founder of O'Reilly Media, and believe me, if there is a technologist or a programmer in your life and you go to their bookshelf, you will find many of these books with funny-looking animals on the covers. They are ubiquitous in the technology industry as the reference manuals we all need, myself included, to do any kind of programming or technical work. These books are so famous among technologists they're often referred to by the name of the animal on the cover, like the famous camel book from my shelf. If you want to learn more about Python or machine learning or any technical topic, there is an O'Reilly book with a funny-looking animal on the cover. But Tim is not just a publisher of technical works.

(00:01:34):
That's how he got started. But as you'll see in this conversation, he is someone who has been a witness, but also a participant, in the massive changes that technology has driven through business and through our society in so many different ways. He is the one who popularized many important concepts in technology, from open source software to Web 2.0, the Maker Movement, and government as a platform. He was very early to the importance of deep tech. You may have heard of O'Reilly Alpha Tech, for those who've been around a little while.

(00:02:03):
He wrote a book called WTF, which stands for What's the Future? [inaudible 00:02:08] What did you think? It's terrific, a very important set of ideas about AI and the development of platforms long before that was the topic on everybody's mind. He is someone who has had a hand in seeing and shaping the future. In this conversation, we got into how to build a business with a real ethos, what it means to see a business as an ecosystem, how to gain competitive advantage by doing the right thing, and so many more topics around AI, about regulation and policy, and just the role that technology plays in society and can in the future. It's absolutely my pleasure to present to you this conversation with Tim O'Reilly.

(00:02:45):
I've started a lot of companies and I've helped a lot more people start companies too, and therefore I've had a lot of banks and a lot of bank accounts, and so I'm really delighted that this episode is brought to you by Mercury, the company I trust for startup banking. Every time someone on my team uses their Mercury-linked debit card, I get an email with the details, and just that little bit of financial intelligence always in my inbox gives me a much clearer understanding of what we're spending. That's what Mercury is like through all its financial workflows: they're all powered by the bank account, everything's automatic. And for those of us that remember the recent banking crisis, Mercury was there for a lot of startups who needed them. They've since launched features like Mercury Treasury and Mercury Vault with up to $5 million in FDIC insurance through their partner banks and their sweep networks.

(00:03:33):
Certain conditions must be satisfied for pass-through FDIC insurance to apply. Apply in minutes at Mercury.com and join over 100,000 ambitious startups that trust Mercury to get them performing at their best. Mercury, the art of simplified finances. Mercury is a financial technology company, not a bank. Banking services provided by Choice Financial Group and Evolve Bank & Trust, Members FDIC. This episode is brought to you by DigitalOcean, the cloud loved by developers and founders alike. Developing and deploying applications can be tough, but it doesn't have to be. Scaling a startup can be a painful road, but it doesn't have to be. When you have the right cloud infrastructure, you can skip the complexity and focus on what matters most. DigitalOcean offers virtual machines, managed Kubernetes, plus new solutions like GPU compute. With a renewed focus on ensuring excellent performance for users all over the world, DigitalOcean has the essential tools developers need for today's modern applications with the predictable pricing that startups want.

(00:04:38):
Join the more than 600,000 developers who trust DigitalOcean today with $200 in free credits and even more exclusive offers, just for listeners, at do.co/Eric. Terms and conditions apply.

(00:04:52):
Hey Tim, thanks for coming on.

Tim O'Reilly (00:04:54):
Glad to be here. It's great to see you.

Eric Ries (00:04:56):
Really a pleasure to be here with Tim O'Reilly. Now I have to say, I wanted to give you your appropriate title and I was looking online. People have called you the Oracle of Silicon Valley, the Scribe of Silicon Valley. I stopped looking after a certain point. It was like people felt the need to give you a special title to represent your role in the technology industry. Do you have a favorite of all these things people call you?

Tim O'Reilly (00:05:20):
Not really. I think I like to say my reputation exceeds me. I feel like I am definitely someone who has trod a different path than most people in Silicon Valley, and I tell people about that. I don't think of myself as an Oracle. I think of myself really as probably more of a voice in the wilderness sometimes.

Eric Ries (00:05:50):
That I can definitely relate to. I think what's cool about your perspective, and I know I've known you many years now, is both very focused on ideas, really. If there's anyone who could be called a philosopher in the tech industry, you live up to that billing, but you've also built O'Reilly Media. You've been a founder and a CEO and you've been in the operational trenches of trying to put your ideas into practice and I just feel like that's such an unusual combination. So yeah, thank you for doing that.

Tim O'Reilly (00:06:19):
Yeah, you're welcome.

Eric Ries (00:06:21):
Now, I don't know if you remember this story. So for those that don't know, here's how I met Tim O'Reilly. So when I was a programmer, I had the... every programmer, ask any programmer you know, we all have the O'Reilly Media books on our shelves with the animal cover. And I had tons of them. And when I very, very, very first started writing about Lean Startup, you may not remember this, Tim, I got a summons to come meet you up in Sebastopol where O'Reilly was headquartered, which for me was a very big deal at that time. It was before anyone of any significance on planet Earth had said anything to me about Lean Startup. Somehow Tim knew it was going to be a thing and he asked me to come up there and we had a conversation and I felt so validated and seen and appreciated. Really, it was the first sign to me that I was onto something with Lean Startup. First of all, thank you for doing that.

Tim O'Reilly (00:07:11):
Oh, you're very welcome. That is kind of what I do. I like to find interesting people and connect them with more possibilities than they might find on their own.

Eric Ries (00:07:22):
That was definitely my experience and I wanted to talk to you about it because to me that's just a perfect place to start. You have this uncanny knack for identifying new and interesting things and people, ideas that are going to be important for technology and for the world. You were [inaudible 00:07:39], starting from Unix way, way back, writing manuals for Unix. I always have to think about the alternate history of what would've happened if you'd pursued your Frank Herbert writing career instead, which we can get into. But to open source and government as a platform and O'Reilly Alpha Tech, which people probably don't remember now. The idea that deep tech was going to be a really important part of the industry, and I could go on and on about these ideas that you've been early to and been the voice in the wilderness for.

Tim O'Reilly (00:08:10):
Big Data, Web 2.0, the Maker Movement.

Eric Ries (00:08:12):
Web 2.0. I was like, I'm definitely missing some important ones. Yeah. I want to understand it now from your perspective. That's how I experienced it, as someone who had a new idea that I thought was worthy of talking about. But I want to know from your perspective, how do you cultivate this network? You call it O'Reilly Radar, and it almost felt to me like you actually have an early warning system for when something interesting is coming up through the culture, through technology. How does it work?

Tim O'Reilly (00:08:34):
Well, I think the very first thing that I do is I try not to get caught up in the enthusiasms of the moment. And that was always the case, just even from where I lived. I was up in Sebastopol two hours from Silicon Valley, close enough to see it, but seeing it from the cheap seats and effectively able to see patterns where other people got caught up in all the details. And so that was part of it. Part of it was also some training I had when I was quite young, in general semantics and the whole idea that the map is not the territory. This idea that you can get caught up in the story that people have fed you about the way the world works, when in fact the fundamental entrepreneurial act is to look at the world afresh and draw a new map. There's a wonderful article I read once by Michael Schrage where he basically said, great entrepreneurs don't find their users, they create their users.

(00:09:37):
Google created people who expected to be able to find anything at any time. Steve Jobs created people who expected to be able to carry the internet in their pocket, and those things weren't obvious, but they are obvious if you have a map of the world that you have constructed on your own, that takes into account trends that you see. So a big part of what I do is a kind of scenario planning and you learn these tools. When I discovered scenario planning, I was like, oh, this really fits what I do. Just this idea that you identify trends that you can see and you draw them out. And the trends are vectors, they have both a direction and a magnitude and you can kind of follow them and you can look and you can start to place things along these trend lines and you can watch them start to accumulate or not.

(00:10:39):
And so that's a big part of it. So for example, if you look at my evolution from thinking about open source through Web 2.0 through government as a platform, it's really on one trend line, or clustered around that trend line. And that trend line was the rise of the internet. I started out very early in my career working with Berkeley Unix and becoming part of that early Unix community, which was a collaborative community even in the shadow of the fact that it was under a proprietary license. And it was that experience that made me doubt the Free Software Foundation's narrative, that it was somehow related to software licenses in a particular legal construct. I went, wait, I've been part of a collaborative community that built this most amazing thing, for which the GNU system is actually derivative, which was people just going, okay, we're sharing stuff for the hell of it.

(00:11:44):
And I started thinking about, okay, this is on this trend line of the rise of network collaboration more than it is a particular licensing idea. That was sort of part of it, but you had to draw a bigger map. Even then, even if you were thinking about licensing, I go, Tim Berners-Lee put the web in the public domain, and yet that was outside the purview of the Free Software Foundation's movement because they weren't sort of revolting against copyright. Berkeley Unix and the X Window System, which I was also very involved in early on doing documentation for, were both just give us credit, build on what we did. And so I went, oh, there's something else at work. And then I kept thinking about that trend line of network collaboration, and I started to see web services. I started to see distributed computation. And a lot of what happens is I read something; in your case, I read your stuff about the Lean Startup and went, oh, this totally makes sense. Bing.

(00:12:43):
But in this particular case, it was something written by one of my editors, Andy Oram, and he was talking about MP3.com versus Napster. And that represented a very deep paradigm shift. And I went, oh my God, all these things fit together. And that's when I started talking about the internet operating system; that was in 2001. That later morphed into Web 2.0. But it was also, I was thinking a lot about, again, pattern recognition, looking at the transition between IBM and Microsoft, which was the transition from control of the industry by control over hardware to control of the industry by control over software, and going, oh, open source and the internet are commoditizing software the same way that the PC commoditized hardware. What will become valuable? Thinking that, oh, it's going to be data.

(00:13:35):
And so that was what really led me from open source to Web 2.0. But they're all along this trend line. So that's part of it. And then the other part is just looking for interesting people. I think a lot of people, great investors, they're kind of looking for people who will make them a ton of money. And that's never been my model. I want people who are fundamentally interesting because they're going to do something that will make the world a better place. And also just, I guess, I gave a talk back in 2008, I think it was called Why I Love Hackers. I've always been very associated with that hacker community because it's these idealists who want to make the world a better place. And back in that talk, I had just come back from being in Sicily, and it was just amazing looking at the caves, the quarries in Syracuse, and you think of the early technologists who were kind of inventing this stuff that we now take for granted.

(00:14:48):
It's completely uninteresting now, but structurally, all these interesting things were being invented back then too. And the people who invented them were just trying stuff and a lot of times failing. And I kind of became very interested in failure. In that talk in 2008, I quoted this poem of Rilke's called The Man Watching, which is about Jacob wrestling with the angel and being beaten and knowing he would be beaten, but getting strengthened by the fight. And that's when I really started giving these talks about work on stuff that matters, because I felt like a lot of technologists were working on fundamentally trivial problems, just ones that would make them a ton of money.

(00:15:42):
And again, that was sort of obviously the observation of people like Jeff Hammerbacher, who said the great minds in my generation are working on figuring out how to get people to click on ads. And so I've always been drawn to people who wanted to change the world rather than people who were just going to make the most money. In fact, sometimes I would meet people who would be potentially really good entrepreneurs, but I found them distasteful, so I wouldn't invest.

Eric Ries (00:16:10):
Yes, indeed. Have you been disappointed? I feel like so many people who start with that idealism, the company that they ultimately wind up creating, it's like if you think about the early idealism of Don't Be Evil, or these companies that had a real ethos to them, has it been depressing to see what they've become?

Tim O'Reilly (00:16:30):
No. I mean, yes and no. In one sense, I think that it is inevitable, just like aging is inevitable. I think it's also a broad societal problem, not a problem with the technologists. It's not that Larry and Sergey became evil. It's not that Jeff Bezos became evil, but they were building companies in a system which demands that people think short-term, which demands that you sacrifice your values to profit.

(00:17:13):
And it's a little bit like you're building a machine and it has certain requirements. And if you don't think about that upfront, you're just not going to solve the problem. And while a lot of people have thought about it, they're going, oh, we'll have super voting stock and that will give us control. But that's kind of like saying-

Eric Ries (00:17:32):
[inaudible 00:17:32] find the one weird trick. The one weird trick to avoid the whole system, yeah.

Tim O'Reilly (00:17:35):
Right. And it doesn't work because, in fact, you're paying your employees in stock which must go up. So it's like you bought an SUV and you can't just say, well, I'm going to not use very much gas, because you built one that uses a lot of gas and this is the particular kind of gas that it uses. So it's not that Larry and Sergey or Mark or Jeff... maybe at some point they went, oh, I just want to have a bigger pile than the next guy. But they're already unthinkably wealthy. But it's really that they built a machine where in order to attract the best talent, you have to have a rising stock price, and therefore that starts to guide all your decisions-

Eric Ries (00:18:20):
Yeah, it becomes self-fulfilling. Like [inaudible 00:18:23].

Tim O'Reilly (00:18:22):
Yeah, it becomes a self-fulfilling prophecy, and it takes a lot of character to resist that. And there's a history of companies that have done that. There's a wonderful book, which... Was it you gave it to me? I forget the name of the author. You'll probably remember the guy from Stanford about the capitalists who... The history of all these companies that started with [inaudible 00:18:49].

Eric Ries (00:18:49):
Oh, you're talking about The Enlightened Capitalists by James O'Toole.

Tim O'Reilly (00:18:51):
Yeah.

Eric Ries (00:18:52):
It's a great book, but also a very depressing book about how many times people have tried to get this right and screwed it up.

Tim O'Reilly (00:18:58):
And it lasts for a while. Johnson & Johnson did it for a long time. And how is it that this company that he kind of celebrates as this company that had these values, suddenly they became a public company and suddenly they literally rewrote their history, wiped out all the training that they had given to build this culture that put values before profit, the one where a division manager could decide to have a hundred-million-dollar recall on Tylenol when there was the packaging problem. And yet after a decade or so of being a public company, they're involved in the opioid crisis. Now that's sad, but I do think that you look at bright spots, and I don't think anything is perfectly bright, but I remember connecting with Satya Nadella when my book came out in 2017, and his book Hit Refresh came out about the same time, and I went up to interview him, and we talked a lot about this notion of value creation.

(00:20:08):
And because he talked in his book about when he took over, he realized Microsoft had to go back to its roots. It had gone from being a company, going back to Visual Basic, that had enabled other people to do things, to being an extractive company that was trying to wring out every last bit, and he said, we had to go back to our roots of enabling an ecosystem. And so I think a lot about that ecosystem value idea. And for me, there's sort of a funny personal story about one of our sayings at O'Reilly, which is create more value than you capture. And where did that come from? It was back in 2000. We were having a management retreat, and I happened to tell the story of a couple of internet billionaires who told me that they'd started their company with the help of an O'Reilly book.

(00:21:00):
And I kind of laughed and said, they got billions, we got 35 bucks, or maybe a few hundred bucks if they bought more than one. But that seems like pretty great. We played a role in making these companies, in making this industry. And one of my senior managers, guy named Brian Irwin said, yeah, we create more value than we capture. And we went, damn, that's a great saying. And we've used it ever since, and we've tried to live by it. And so a really great example. We now morphed O'Reilly. I mean, if you look at the history of the company, many of our innovations came from trying to create value for other people. Why did we start our conference business? It was because we had this best-selling book about the Perl programming language. And Sun came out with this big Java conference and went, there's no corporate backer for open source software.

(00:22:03):
Let's make a conference for Perl. Let's tell everybody how important this is. And then it was like, wait, all of our books are about things that are free and open source and they don't have this kind of thing. So we created the open source convention. Of course, the whole industry took off and did that, but we were really about trying to build new communities. When we launched our Web 2 summit, it was like, literally, our strategy the prior year was how do we reignite enthusiasm in the computer industry after the dot-com bust?

Eric Ries (00:22:33):
After dot-com, yeah.

Tim O'Reilly (00:22:34):
Yeah. We were trying to, basically, we said, we still believe in this future. We want to tell a story that gets people excited. And similarly, when we started what was originally a joint venture with Pearson called Safari Books Online, it was because there were a bunch of startups in eBooks. And I said, this is not sustainable for authors. And I did the math for the guy over dinner, just as he was sort of explaining his business model. And I go, this is never going to be more than ancillary. And I think, again, back to the trendline thing, I think eBooks are going to be really important. This is in 2000; we'd done our first ebook in 1987, 20 years before the Kindle. And we'd been exploring this space for a long time. And I said, this has to be mainstream and it has to have an economic model that allows authors and publishers to get paid. So that's when we launched this service with a different business model, which we evolved and we continued to evolve, and we added video. And a really interesting thing happened at the end of 2016. We introduced this live online training, and it was a very, very powerful, successful feature for us. And there were two things that were really important about it. One was, contrary to the normal narrative that you focus exclusively on your users, in a way the innovation here came because the problem we were trying to solve was how did we make more money for our authors? Because we saw ourselves as at the fulcrum of this interdependent system, because as publishing was becoming a smaller and smaller market, we had to find new ways for authors to monetize their expertise.

(00:24:33):
And conferences, the economics were such that we couldn't really afford to pay people. We said, okay, in this model, we can afford to pay people. And so that's, I think, where I really started getting super clear about this. The tech companies in some sense are at a fulcrum between their suppliers and their consumers.

Eric Ries (00:24:59):
We don't normally use that language with tech companies, which I think is interesting, even though the history of business is full of power-dynamic relationships between suppliers and distributors.

Tim O'Reilly (00:25:11):
And if you look at Ben Thompson's work on internet aggregators, his whole point is the aggregators get to commoditize their suppliers. And that's exactly what you're seeing with Google and Amazon. And that sort of really struck me because of my experience at O'Reilly that that's a de facto sign of too much market power. If you can commoditize your suppliers, then you're clearly too big. Yeah, and too powerful.

Eric Ries (00:25:40):
So let me take you back to that ethos, because to me, I remember when you first wrote the essay about creating more value than you capture, and on the one hand, it seemed like something so obvious. It was like, I can't believe somebody had to write this down. But to me, ultimately, that's the greatest sign of a truly great essay: it brings to your mind, it catalyzes for you, something that you thought everybody knew, and it gives it a name.

Eric Ries (00:26:00):
And in the years since, I've reflected on it a lot, because I've met so many companies that clearly didn't get the memo, that are so busy trying to capture value they're actually destroying the engine that generates the value that could create the broadly shared prosperity they could be the epicenter of. You talked a little bit about the distinction between an extractive or exploitative business model or ethos, and then this more creating-value... I don't know what you'd call it, I don't know if generative is the right word, a different ethos. Just walk me through a little bit how you think about the two sides of that coin and how people who have a choice in the matter should be thinking about what they want to do.

Tim O'Reilly (00:26:45):
Well, I think first of all, there's a long-term perspective, which is that if you want to survive over the long-term you have to create a balanced ecosystem, one that leaves enough for everybody to prosper. And I had watched obviously first as Microsoft took all the value. I still remember what Walt Mossberg told me years ago about a conversation he had with Steve Ballmer when he was the CEO of Microsoft. And he said, "Steve, if you would just dial back the greed only 5%, your problems would go away."

(00:27:26):
It's just that last little bit. And I think for a long time, Google really threaded the needle, but I think they've started to lose the plot. Amazon completely lost the plot with the introduction of their ad business. And in the recent research I've been doing at University College London, we discovered that ads now account for a third of the most clicked-on products, because of course they're given top billing where they're most likely to be clicked on. And the products that are advertised are generally 30% worse by Amazon's own metrics. The metric we used was, "Okay, if they've built all these algorithms to figure out what's the best combination of price, user ratings and other factors, the organic rankings represent their best judgment about what's best."

(00:28:23):
And then you can look at the distance between the advertised products' organic ranking and their placement in ads.

Eric Ries (00:28:32):
Yeah. That's a perfect illustration because you're literally ... When people talk about extractive practices or extracting a rent, that's literally you are ... I wouldn't say stealing, maybe too strong a word. How would you describe it? You're taking value away from your own customer for yourself, for the benefit of the end product.

Tim O'Reilly (00:28:51):
That's right. And these products are objectively worse by Amazon's metrics, and 17% more expensive on average. Amazon is really betraying their customers. And so in that case ... I think that's a pretty clear case. And it's interesting because my colleagues on the papers that I was writing were all like, "We got to get this before regulators."

(00:29:13):
And I'm like, "No, I got to get this in front of Andy Jassy."

(00:29:17):
And I sent it to Andy and he said he would read it but has not got back to me.

Eric Ries (00:29:21):
Yeah, that's ...

Tim O'Reilly (00:29:26):
But I do know that sometimes the things that I have done in that regard have made a difference. And I do see companies that are still trying to fight the good fight. But I think even apart from the purely extractive, I think that we have to come to grips with the fact that internet platforms, knowledge platforms, and looking forward to the age of generative AI, these are platforms with enormous power that must be dealt with responsibly. A good example is... First of all, I consider Google's original design to be really a masterpiece of market design, where you had organic search plus advertising, and they were complementary. And that's classic economics, literally an invisible hand running what was an unpriced market, which was an incredible breakthrough in market coordination. And that's the organic search results. And they go, "But still, people might want to raise the visibility of something because of an economic incentive. And we'll give them a space to do that."

(00:30:36):
But then they start gradually to overlay the priced market, which is, I think, less efficient and less user-centric, and replace the organic results. And I tried to show them, they're like, "Well, we're still trying to evaluate the quality and we're trying to ... "

(00:30:57):
They're not as bad as Amazon. But I showed Danny Sullivan, who replaced Matt Cutts as the Google search spam spokesperson, a search for buy tires. And I said, "Look at the market-shaping power of how you arrange the results on the screen."

(00:31:20):
A lot of times all I have to do is just ... You use the product and you notice things if you have this point of view, you can't unsee them. I go, "I'm looking for a local tire place."

(00:31:31):
And I go, "Bang."

(00:31:32):
And I get Pep Boys and I get all these people who are big national chains who have lots of money to advertise and they fill the first screen and now there's no organic results at all. You have to scroll down three or four screenfuls on my laptop and further on my phone, to get to the map that shows the local tire merchants. And I go, "I don't know how influential that is, haven't measured it."

(00:31:59):
We measured it for Amazon and we see that there is really an impact to that screen placement, but go, "Wow, we care about small businesses in this country, I thought. And here's Google with this enormous power to shape the attention of their users away from them, towards national brands who are willing to pay or willing and able to pay."

(00:32:20):
And I consider that a less efficient, less competitive market. And Google should really care about that in the long-term because eventually, if ...

Eric Ries (00:32:30):
Well, yeah, it's a short-term, long-term thing. It's ultimately going to destroy their own business.

Tim O'Reilly (00:32:35):
Yeah.

Eric Ries (00:32:35):
It actually undermines their own ... Even if you just think about it in purely self-interested terms, you're undermining your own market power. If you drive all the local competitors out of business, then you just hand power to a small number of monopoly national chains who now you have to negotiate with on that basis.

Tim O'Reilly (00:32:50):
Yeah. And this is I think, very clear in the case of generative AI. Google's been moving in the direction of just saying, "Well, it's better for the user to just get the answer and not to have to go to a web page."

(00:33:04):
And the thing I think about that is that's absolutely true, but if you have the ecosystem perspective, you go, "It's better for the user to do this, so how do we still make sure that the supplier of the information gets paid?"

(00:33:21):
And Google's done that some. You look at song lyrics, they made a five-year license with one company. I go, "Oh, that's a retreat from the efficiency of their old model."

(00:33:31):
What they should have done is say, "We're going to take the lyrics from whoever has the top search ranking."

(00:33:37):
... which would've preserved the competitive interests of everybody, or we're going to round robin them, or whatever.

(00:33:43):
There is a variety of ways that they could have paid people on the basis of this incredible collective intelligence engine that they built, but instead they went back to the old easy way of, "Well, we'll just make a licensing deal with one company. We'll probably make one where it's the best deal for us and not necessarily the best for the user, and certainly not the best for the supplier, and take away those incentives."

(00:34:09):
And now you look at what's happening with generative AI and they're hoovering up all kinds of content. And while they're using RAG, retrieval augmented generation, to say, "Well, we got this information from these sites."

(00:34:24):
With summaries, there's not really an incentive for somebody to click through often. And they're not saying, "Well, we ought to pay people."

(00:34:30):
Whereas at O'Reilly, we're doing that too with our content on the O'Reilly platform. We're making this huge investment to be able to actually tie it into our compensation system, saying, "Yeah, we gave a summary and it came from information in these three books. And here's how we're going to allocate the payment."

(00:34:48):
Because we're going to continue to pay people when we create these derivative works, not just go, "Well, we sucked it in and now we can make a product that substitutes for what those authors have done, because we know that if those authors stop producing, eventually we will have to do it all. And we don't have that capability."

(00:35:09):
And so that's where I think that companies need to think about the long term, which is: who is going to provide what you are the gatekeeper for if you basically have a relentless acquisition of all the value for yourself? Now, in the case of Amazon, they're going, "Well, prices will go up and there'll still be people who will be willing to provide."

(00:35:35):
But yeah, anyway. [inaudible 00:35:39].

Eric Ries (00:35:40):
It's certainly a far cry from "your margin is my opportunity," and the idea that the way you keep your company strong is to always be on the side of your key customers. And I might even go so far as to say stakeholders of... I don't know that they would use that language. And I think what's interesting about what you're saying, the pattern that I've seen in these examples that you're giving, is that there's this short-term benefit to being extractive that gives you this short-term juice boost, but because you neglect the ecosystem perspective, you neglect the long-term consequences, you actually hollow out your business, you make it more vulnerable to competition. And you look at ...

(00:36:15):
Google is a perfect example. I can remember that feeling of Google being invincible, that they had such a perfect business no one could ever assail them, because customer loyalty was so high and they had the best results. But now, not only do they not have that loyalty, the product is just manifestly not good, for all the reasons you're talking about. And it's not like... You don't need some esoteric academic study to determine this. Anyone can run any search for any topic you want and you get all sponsored results. And I am old enough to remember when Larry and Sergey used to write those rant posts about how Yahoo and Excite and other search engines were all sponsored results and there was no real... The people that work there, have they not read those essays? To me, it's such a powerful example of how these incentives that we're running companies under inevitably seem to drive them into this mediocre state, which then makes them more vulnerable and therefore less valuable. It's actually an error in thinking.

Tim O'Reilly (00:37:09):
Yeah, and in some ways ... I was just thinking as you're talking, values are a map of the world, a map of what you think matters. And if your value is just making money for yourself, that shows in your product ultimately. And if your map of the world says, "We have to create value for everybody."

(00:37:33):
You're going to make different product decisions. And I think that's the central challenge. And you asked me earlier am I depressed, and I'll go back to that Rilke poem, "What we fight with is so small, and when we win it makes us small."

(00:37:57):
It's a big thing to try to get people to change their values. We have this extractive culture. You look at everything from ... Big tech is only one example. Look at climate change. Our whole society has to change. And so I'd rather fight that big thing and fight it both through what I write about and what I talk about out in the world, but also just through the way I run my own business.

Eric Ries (00:38:25):
One thing that I remember from the debate, even when you wrote work on stuff that matters, was people who I think deliberately misunderstood ideas like that, saying, "Oh, you're saying everyone should run a nonprofit or be an activist or go into politics."

Tim O'Reilly (00:38:38):
No.

Eric Ries (00:38:40):
One thing that's so striking to me about the stories that you're telling is that in every one of these cases, there's this nexus: the thing which is the most long-term thinking is also the thing which is actually the most favorable to customers and the other parts of your ecosystem, more favorable to others. And also, on the surface, it's the harder solution. Making a great product is actually really, really hard. And when given the opportunity, I think human nature is to take shortcuts and to be lazy about it. And I think there's this interesting thing where people don't appreciate, until you really get in the trenches trying to build a company, if you have the ambition to build a great company, how much you have to provide that support for people in the organization to make the more difficult choice to do the right thing, not just because it's the right thing to do for moral reasons but because that ultimately is the source of greatness. That is your ultimate competitive advantage, to be a more trusted counterparty, to be someone who actually has a reputation for building things that people care about.

(00:39:41):
Going back to the idea that you should try to create more value than you capture, to me, what's really interesting about that is not only is it good advice for founders, but it's also really interesting to me that you were able to build a company with that as its ethos. And I wonder if you'd talk a little bit about, from an operational point of view, from the point of view of managing people and driving a company, how do you instill an ethos like that and have it stick?

Tim O'Reilly (00:40:05):
I think we were fortunate in a certain sense, in not being in the hurly-burly of the worst competitive rush that a lot of companies feel, where you have to win or you die. And so we could choose... It was just easier and more natural. We were not in the mainstream. This used to be the basic business pattern. People did a thing, they did it well. And I think it's really the financialization of our industry over the last few decades that has totally corrupted companies. I don't know. I think being a private company really helps. People don't come expecting to get super rich, and so therefore they don't make the decisions that you make when that's your primary overriding goal. And I think people came because they were drawn by the values.

(00:41:27):
And a lot of the public work that we did beyond our actual business told this story, which we then felt we had to live up to. I've always been a fan of the end of Kurt Vonnegut's Mother Night, where the character who was a propagandist for the Nazis while secretly being an agent for the West, was judged afterwards. And Vonnegut says, "You are what you pretend to be, so you better watch what you pretend."

(00:42:05):
And I always thought, "Well, that works both directions. If you pretend to be better than you are, that's a really good thing."

(00:42:12):
The fact that Google said, "Don't be evil."

(00:42:14):
... until 2019 when they took it off the name plate ...

Eric Ries (00:42:17):
They took it off, I know. It's so sad.

Tim O'Reilly (00:42:22):
That matters. If you espouse your values publicly, that helps a lot because it's so easy to talk yourself out of them.

Eric Ries (00:42:32):
And I've actually seen ... One of the tools that I have found really effective is getting people to do the right thing for the wrong reasons. And one of the things that has surprised me is how often I'll be in a meeting working with a company, talking to a product manager in the trenches somewhere with a company, debating whether to do the right thing or not, and anything that gives ... There's usually one person, the torchbearer for the company's soul, there's one person there at least in the meeting who's like, "Look, we've really got to do the right thing. It's going to be better for the company in the long run. We're all shareholders here."

(00:43:00):
And you want to give that person as many tools as possible. I've been amazed how astonishingly powerful it is that they can say, "Look, if it ever got out that we didn't do this, we'd be pilloried in the press because we've said that we have this ethos. If we fail to live up to what we said, then that's going to cause a problem."

(00:43:17):
And then people are like, "Oh yeah, good point. We don't want to do it because it's the right thing, but because it might be potentially embarrassing to not do the right thing. Okay, we'll go along with it."

(00:43:27):
I'm curious, what are other tools like that that you've found effective either in your own company or the companies you've been around? What empowers people to actually make the right call even when it's difficult?

Tim O'Reilly (00:43:37):
Well, I think when people care themselves about the values, it's not something you can impose. I think back to my early... The way I used to hire people in the early days of the company, I would just talk about my values and what I hoped to do and why. And I could watch if people were lit up by that; that gave them a big mark in hiring, whereas people who I could tell didn't really get it and didn't respond to it weren't a fit. You look for people who have those kinds of values, I think, is part of it. But a big part of it is our self-image as individuals and our self-image as a company is something of a habit. It's created through lots of small decisions that people make, and again, it can lead you astray. I've had many battles inside the company with people who have their own idea about, "Well, these are our values."

(00:45:02):
It might be something as little as, "Well, what are our products ... How are our products designed?"

(00:45:07):
I remember I used to have these battles with our production department because they were very concerned about little niceties of punctuation, and we're like, "But you're breaking the code in the examples."

(00:45:25):
That's more important. And we'd have that battle because their value was around the thing that they added.

Eric Ries (00:45:32):
The typesetting, yeah.

Tim O'Reilly (00:45:35):
Yeah. And just little cases like that where you have these collisions and you have to work that through. But I guess, a lot of it is just you have to just continue to express your values in a way that inspires people. And I don't really have any other magic toolkit. I don't think it's that you put in place incentives because when you put in place incentives, they're almost always wrong.

Eric Ries (00:46:14):
Yeah, they're fundamentally distortionary.

Tim O'Reilly (00:46:17):
Yeah because you think it's one thing and then people start chasing it, and then the world changes around you and you're still pursuing the thing that you got told will make your bonus. You want people who intelligently grasp the values and apply them in new contexts, and you give those people a lot of autonomy.

Eric Ries (00:46:39):
This episode is brought to you by my longtime friends at Neo4j, the graph database and analytics leader. Graph databases are based on the idea that relationships are everywhere. They help us understand the world around us. It's how our brains work, how the world works, and how data should work as a digital expression of our world. Graph is different. Relationships are the heart of graph databases; they find hidden patterns across billions of data connections deeply, easily and quickly. It's why Neo4j is used by more than 75 of the Fortune 100 to solve problems such as curing cancer, going to Mars, investigative journalism, and hundreds of other use cases. Knowledge graphs have emerged as a central part of the generative AI stack; Gartner calls them the missing link between data and AI. Adding a knowledge graph to your AI applications leads to more accurate and complete results, accelerated development and explainable AI decisions, all on the foundation of an enterprise-strength cloud database. Go to Neo4j.com/Eric to learn more.

(00:47:46):
I have to ask you, with Dune thrust back into the popular culture, what's that been like for you? Any thoughts about the new movies and about the fact that at this moment in time, Dune is right back front and center?

Tim O'Reilly (00:48:00):
It's funny because of course, I have not read Dune for probably 30 years. I reread it once I think, after doing the book, but I have not ... And I went back and I reread the chunk of my book that was about Dune when Jason interviewed me for the Dune Pod a couple of years ago when the first movie came out. And it was amazing how much I'd forgotten. I was like, "Whoa. I interviewed Albert Lord, I'd totally forgotten that. How cool is that?" And there were parts of it I remembered. There's been a couple of other things people have asked me to write, introductions to books and so on, and try to get access to reprint some of my essays because I did two books. Not only did I do the original book, which is called Frank Herbert, but I also did a book called The Maker of Dune, which was a collection of Frank's essays, so we're listed as co-authors because it included a lot of interviews that I did which I had the rights to and ...

Eric Ries (00:49:04):
How fascinating.

Tim O'Reilly (00:49:08):
It's been interesting to come back and watch the critiques of Frank, and it's also just... They certainly don't get where he was coming from in the way that it seemed very clear to me at the time. Again, I should go back and read the book and see if I just missed some of the subtext. There was just a whole set of essays about how he's in some sense praising this colonialist narrative. And I'm like, "Dude, where do you get that?"

Eric Ries (00:49:43):
That's not the book I read. Oh, yeah. Definitely, definitely not.

Tim O'Reilly (00:49:47):
But anyway, it does remind us how each generation reads a thing through the lens of its own culture.

Eric Ries (00:49:53):
Totally. Oh, my God.

Tim O'Reilly (00:49:53):
[inaudible 00:49:54].

Eric Ries (00:49:55):
The idea of colonialism is so different now than it would've been in his time and his more subtle subversion of it. I could see how that might actually go over people's heads now.

Tim O'Reilly (00:50:05):
Yeah, I think about that a lot with literature. I read a lot of books that were once incredibly popular, and later are not. For example, in Victorian literature, one of Anthony Trollope's books is called Can You Forgive Her? It's really a proto-feminist novel. It's basically posing the question: here's this woman, and instead of marrying the guy who will raise her place in society, she marries an up-and-coming politician who's from the whatever, and she's making the choice for very different reasons than the traditional choices. And the title is Can You Forgive Her?, for making this choice. And you go, "That was pretty subversive at the time."

(00:50:54):
And you look back on it and go, "Well, no. Oh, my God, it's so supportive."

(00:50:58):
Or The Mill on the Floss, where there's this class-violating love affair and at the end, George Eliot just has to kill them off in a flood, because you can't actually... You watch people struggling with the bounds of their current milieu. And I give him a lot of credit for that. But anyway, back to the movie, I thought Jessica was the big fly in the ointment for the first movie. I loved the character of Jessica, she was a very strong character. She's also a central mover in the narrative because she violates the... She stands against the Bene Gesserit in having Paul in the first place as a son.

Eric Ries (00:51:44):
In some ways it's her act that sets the entire plot in motion.

Tim O'Reilly (00:51:48):
That's right. But also, she's just such a badass.

Eric Ries (00:51:51):
Totally, such a badass.

Tim O'Reilly (00:51:53):
And then in the movie, when they have her ... She's just sniveling in fear outside when they're doing the Gom Jabbar, and then she's in the ornithopter ...

Tim O'Reilly (00:52:03):
... in the sandstorm. She's reciting the litany against fear like she's petrified. And I go, "God. That's as if you had Obi-Wan Kenobi sniveling with fear when they entered the..."

Eric Ries (00:52:16):
Well, hopefully you haven't seen the Obi-Wan Kenobi show then.

Tim O'Reilly (00:52:22):
No, I have not.

Eric Ries (00:52:23):
Don't. If you have that concern definitely don't.

Tim O'Reilly (00:52:26):
Yeah. But... So anyway. And then in the second movie, they had her go all in on being the Bene Gesserit schemer. So-

Eric Ries (00:52:32):
Yeah. That's a shame.

Tim O'Reilly (00:52:34):
So I guess I just felt like that was my one beef. I thought... And I understood why they made the changes that they made with making the... Externalizing. The stuff in the book is often Paul's inner monologue where he's struggling with things [inaudible 00:52:54]. So they externalize it with Chani and sex and different factions in the Fremen. And I go, "That was clever."

(00:53:03):
So I guess when I watch movies that are based on books I love, particularly books that I loved as a kid, I don't mind that they change them as long as they change them in the direction that they were going. A good example of that, two good examples of that, one was when Brad Bird made John Carter of Mars and he turned John Carter, who was this sort of very principled Southern gentleman, this very 1911 romantic hero, into this world-weary cynic. I go, "You just totally changed the character of this character that I grew up with. And [inaudible 00:53:52] no, you can't do that." Whereas when they made Captain America the movie from the comic books, they made him much faster and stronger than he was in the comic book, but the character was fundamentally the same. So it was just Captain America only more so.

Eric Ries (00:54:06):
Which is great inflation for a superhero's... Yeah.

Tim O'Reilly (00:54:10):
Now it's okay. So it's just interesting, particularly when you think about the things that are your childhood influences.

Eric Ries (00:54:19):
What did you like about it as a kid?

Tim O'Reilly (00:54:22):
Well, in general, for me, there was a sense in the science fiction that I loved that it was saying something about how you could matter. And this was clearest in books like Heinlein's Have Space Suit-Will Travel, and all of his juveniles in fact, but also very clearly in books like Dune, or Andre Norton's The Time Traders. You have a teenager who suddenly gets swept up in these events and discovers depths of potential and character. And that's what I thought George Lucas did so well in the original Star Wars. He really captured that exact kind of narrative arc with Luke Skywalker. Here's this kid. He's stuck somewhere. And then suddenly he's dragged into this much wider world where he has a chance of influence.

(00:55:26):
And I think of that as a sort of... And I actually wrote about that in my book about Frank Herbert, which is that this is a kind of... It's a version of the hero's journey, and it's a version of what I always loved about science fiction growing up, which was that it said... And comic books too, particularly the Marvel comic books. Like Peter Parker gets bitten by a spider and suddenly he has these new powers and he has to come to grips with them. And you're a nerdy kid.

Eric Ries (00:55:58):
Don't I know it? Yeah.

Tim O'Reilly (00:55:59):
And this is like meat and drink to you. It's like [inaudible 00:56:03].

Eric Ries (00:56:02):
Well, I'll never forget discovering the internet and feeling like I got bit by the spider. I mean, that's really what it felt like to me. All of a sudden, I have literally been given magic powers, the depths of which I can't possibly understand, but it felt so real, and obviously very much primed by a whole childhood of those stories to see it that way. And I can't help but think of it, because we've been talking about these founders building these companies. They also have that experience. You've been around this long enough. You've seen multiple people go from just a random kid with an idea to a master of the universe and a shaper of society, wealthy and powerful beyond their wildest imaginings. You've seen that-

Tim O'Reilly (00:56:47):
Yeah. And it's so sad how many of them, though, have not taken the "With great power comes great responsibility" message from that.

Eric Ries (00:56:56):
Well, or the message of Dune. It's funny we were talking about that because the people who read those books non-ironically and didn't pick up on the danger of it, I feel like are actually living the hell of it right now.

Tim O'Reilly (00:57:08):
That's right. Yeah. They basically think, "Oh, man. I'm so rich and successful. I must be right."

Eric Ries (00:57:15):
Oops. Yeah. Yeah. I always try to remind people that the ancients warned us about demagoguery, not because it doesn't work. It's dangerous because it does work. So when people say like, "Oh, well, you got to hand it to him. Yeah. Sure. He's evil or bad or whatever, but it sure is successful," it's like, "No. That's not a compliment. That's the danger they've been trying to warn us about for thousands of years. Here we are. We reversed it and turned them into celebrities."

Tim O'Reilly (00:57:37):
And that's exactly what Frank was trying to do.

Eric Ries (00:57:43):
Yeah. Yeah. No. If we're talking about the lessons of that fiction for our time, that's got to be one of them: somehow we as a society have lost this concept. We've missed the subtlety here. That's very telling.

Tim O'Reilly (00:57:54):
That's a really good observation. Yeah.

Eric Ries (00:57:57):
Okay. So taking us back to... We were talking about working on stuff that matters and creating more value than you capture. And those were such formative essays and concepts for me. So first of all, thank you for being such a consistent voice for that. Can I read you something that you wrote?

Tim O'Reilly (00:58:13):
Sure.

Eric Ries (00:58:14):
Okay. You said that, "I want to make clear that work on stuff that matters does not mean focusing on nonprofit work, causes, or other forms of do-gooderism. Nonprofit projects often do matter a great deal and people with tech skills can make important contributions, but it's essential to get beyond that narrow box." And then you said, "We need to build an economy in which the important things are paid for in self-sustaining ways rather than as charities to be funded out of the goodness of our hearts." And that really stuck with me. And I just wonder if you would talk a little bit about what was your motivation in saying that. I think a lot of people would find it confusing to consider for-profit companies as vehicles of social change. And we so much associate do-gooders with the nonprofit sector... Tell me about that.

Tim O'Reilly (00:59:02):
That's one of the reasons why I loved things like the open source movement and the early web. Larry and Sergey really weren't like today's entrepreneurs like, "Jesus, we can get filthy rich," and that's the primary overlay. They were like, "Hey. We have this new way to do search and we think we can create access to all the world's information. And isn't that amazing?" So they were trying to do work that mattered. And even Mark when he started Facebook, certainly the very early stages, it was a dating app, whatever, but then you develop a somewhat idealistic narrative that, "We're going to try to connect the world and make people communicate with each other better." And of course, that's really the issue that I think we have to struggle with is why do these idealistic dreams go sideways? And I do think that that's a whole other conversation, but you have to at least start with the stuff that you are doing will have a positive impact.

(01:00:13):
And as I think I wrote in one of my pieces, you want to be doing things that would be worth doing even if you don't win. And that's a critical concept. It goes back to that Rilke poem that we talked about, but it's also just that the world moves forward failure by failure, not just success by success. You look at people who tried things and they didn't work, but then the next person who came along learned from it and built on it, and then you go forward. At the time when I wrote that piece, it was the period when people were making all these really stupid social media apps, throwing sheep, whatever. It was the equivalent of the Bored Ape era of social media.

Eric Ries (01:01:07):
Oh, yeah. Yeah. Totally.

Tim O'Reilly (01:01:09):
And there were some people who were working on crypto with the idea of making serious change in the world. A bunch of people were making totally trivial, useless stuff just because they thought they could make money at it. In fact, then they did make a bunch of money at it. And that's shameful. It's a kind of grift. And-

Eric Ries (01:01:27):
I remember a financial reporter called me about the GameStop phenomenon during the pandemic and they wanted my comment. And they said, "What's the problem with this?" And I said, "Oh, the problem is actually very simple. The problem is you have a lot of people who are making a lot of money without doing anything. They're not making anything. They're just making money." And the reporter was like, "No, that's not what I'm calling for comment about." He was like, "Wouldn't that describe half of our financialized economy?" And I'm like-

Tim O'Reilly (01:01:57):
And the answer is yes. [inaudible 01:01:59]-

Eric Ries (01:01:58):
Yes. That's why people... Because they were like, "Why are people upset about this?" This was a real deep finance reporter going, "This is not news. People do this." It's like, people might not have realized it before, but they're upset, and they have a right to be upset, and their intuition is right that people should make money because they created value, not just because they were able to capture value that somebody else created. And the reporter hung up on me. They couldn't handle it.

Tim O'Reilly (01:02:22):
And of course the other side of capturing value that you didn't create is that you took it from someone who did create it directly or indirectly. Maybe [inaudible 01:02:30].

Eric Ries (01:02:31):
[inaudible 01:02:32] zero-sum thinking.

Tim O'Reilly (01:02:33):
But ultimately it goes back to, well, where'd it come from? And all of this is very, very relevant today in this generative AI question, where the companies are being very evasive about what they're training on. I just watched an interview with Mira Murati, and the interviewer was asking her, "Did you train on YouTube content?" She's like, "We trained on publicly available content, [inaudible 01:03:00] content."

Eric Ries (01:03:01):
" Including YouTube or not?" Yeah. Exactly.

Tim O'Reilly (01:03:03):
Yeah. And you go, "Well, yeah." They were talking specifically about Sora. And so you go, "Well, they did in some sense appropriate all this raw material that's just lying around, but in the course of that, they're sweeping up potentially a lot of people's livelihoods." And similarly, I think about the increasing use in search of AI-generated summaries of news articles. And that's very different from the original web, where it was, "We read your document in order to create the search index, and the output is a product that helps people find you and go to your site." Now they're creating a product that makes it so that people don't have to go to your site.

Eric Ries (01:03:55):
Totally.

Tim O'Reilly (01:03:55):
And so right there, you go, "Okay. That's appropriation now." Once you've created... That's just how it works in copyright law: of course you can read anything; it's the outputs that you have to question. And if the outputs are a substitute, then that's infringing. But they don't want to have to pay for this. And the question is, "Well, if they don't want to have to pay for it..." And they'll say, "Oh, well, it would just be too expensive." I go, "Well, that means you have a shitty business model. If you can't afford to pay for your raw materials and you can only succeed if you appropriate [inaudible 01:04:34]," the world is worse off.

Eric Ries (01:04:36):
Totally incompatible with the idea that AGI will be the most valuable technology in history and that this company is going to be worth $10 trillion. You can probably afford it. Well, and what's interesting to me-

Tim O'Reilly (01:04:45):
Well, again, that's a different kind of grift, because it's like, "Can we get our 10 trillion in the derivatives market, effectively, of stock price, rather than in the fundamental market of the actual economics of the business?" Which is why two-thirds of businesses going public today are losing money, but they can get money from dumb investors. Yeah.

Eric Ries (01:05:11):
It's literally time travel, financial value transported from the future into the present. It's a really cool magic trick.

Tim O'Reilly (01:05:17):
Yeah. Exactly.

Eric Ries (01:05:18):
But I think your point is not just about generative AI, but really about extractive, exploitative versus generative... Generative. Maybe I have to use a different word now, but value-creating business models, because there really is a difference. And what's interesting to me, I've been part of a lot of these policy conversations about generative AI, and I have my own personal view, which is we should have a universal licensing regime paired with a sovereign wealth fund and a UBI, because it really required the collective output of all humanity. Our collective creative output literally is what is necessary to create these technologies, and everyone should have a stake in their financial outcome. And I'm sure we could riff... I'm sure you have your own solution. We could riff on -

Tim O'Reilly (01:05:58):
I have a bit more market-focused view, which is that there are technologies that allow you to allocate value.

Eric Ries (01:06:03):
Oh, yeah. Totally. You could do it that way too.

Tim O'Reilly (01:06:05):
I mean, one of the things that's so amazing is these systems are really good at algorithmic allocation. They're a great advance in that. And for example, in the search example I'm giving you, Google knows exactly where the content for the summaries is coming from. They're using RAG, literally retrieval-augmented generation. They're basically doing traditional search and then having the model summarize the top results. So they know exactly where their content came from and [inaudible 01:06:33]-

Eric Ries (01:06:32):
Yeah. And if they were motivated to share the prosperity, they could.

Tim O'Reilly (01:06:35):
They could. They could say, "Oh, we're not sending you traffic. And if we're not sending you traffic because it's unmonetized search, maybe that's okay, because your knowledge is out there and you just want credit, and we've given you credit. But if you were counting on monetizing it and we're monetizing it instead, we need to share that monetization with you." And there's no reason why they could not do an algorithmic round robin that says, "Okay, we got it from these sources." That's what we're doing at O'Reilly: we're building summaries and other kinds of derivative works, translation-

Eric Ries (01:07:16):
But always with an attribution and the sharing.

Tim O'Reilly (01:07:18):
Yeah. We're basically still saying, "Oh, we're able to trace it back to an author's work, or a set of different products, books or videos, whatever, and then allocate proportionally the kind of payments that we would've made for the original work."
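(For readers who want to see what that kind of attribution-based allocation could look like, here is a minimal sketch in Python. It assumes a RAG-style pipeline where the retrieval step already yields source identifiers and relevance weights; the index, publisher names, and revenue figure are purely illustrative assumptions, not anything Google or O'Reilly has published.)

```python
# Minimal sketch (illustrative only): split revenue from an AI-generated summary
# back across the sources a retrieval step actually drew on, in proportion to
# how heavily each source was used.

from collections import defaultdict

def retrieve_top_sources(query, index, k=5):
    """Stand-in for the traditional search step: return (source_id, weight) pairs."""
    scored = sorted(index.get(query, []), key=lambda item: item[1], reverse=True)
    return scored[:k]

def allocate_revenue(sources, revenue):
    """Divide revenue across sources proportionally to their relevance weights."""
    total = sum(weight for _, weight in sources)
    payouts = defaultdict(float)
    for source_id, weight in sources:
        payouts[source_id] += revenue * weight / total if total else 0.0
    return dict(payouts)

# Hypothetical example: a query answered by summarizing three publishers' pages.
index = {
    "best hiking boots": [("publisherA", 0.5), ("publisherB", 0.3), ("publisherC", 0.2)],
}
sources = retrieve_top_sources("best hiking boots", index)
print(allocate_revenue(sources, revenue=1.00))
# roughly {'publisherA': 0.5, 'publisherB': 0.3, 'publisherC': 0.2}, modulo float rounding
```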

Eric Ries (01:07:33):
I remember we did that at IMVU too, for virtual goods. We would calculate the electron cost of the thing and allocate it among all the many creators who were there. So this is a really interesting conversation, because the policy side of this is fascinating. What should the regulation be? Obviously, the press is totally dominated by this conversation about regulating AI or not. But to me, we're skipping over a much more interesting question. When I talk to founders especially, executives at companies, board members, they're so focused, naturally, on what they're allowed to do. The presumption is that if something is legal, it's worth doing, and otherwise not. And we're skipping over what I think is the far more fundamental question of what ought we to want to do as a company.

Tim O'Reilly (01:08:18):
That's true.

Eric Ries (01:08:19):
Yes, maybe it's true that you can get away with being exploitative, but should you want to do that? And I'm curious, when people ask you for advice on that point, how do you help them understand that not only is it morally right... I mean, it's basically immoral to run an extractive business model. I wouldn't sleep well at night if that was my business. But in addition to having these ethical virtues to it, there's also a competitive advantage to it. Ultimately, companies that destroy their own source of trustworthiness collapse.

Tim O'Reilly (01:08:49):
Yeah. It's not just a source of trustworthiness. We've already seen worries about model collapse, because you're getting AI-generated content ingested and you get this copy machine effect where it all just gets degraded. So you go, "Wow. You need people to continue to create content."

Eric Ries (01:09:06):
You need humans, yeah, to make the content.

Tim O'Reilly (01:09:08):
And so if you're not going to reward humans and you're going to incentivize them to create a bunch of AI-generated dreck... Again, we already see it. I mean, Google is having to do search updates because people are doing effectively new kinds of SEO attacks, just generating endless amounts of content. It's what Cory Doctorow calls enshittification.

Eric Ries (01:09:31):
The more [inaudible 01:09:33].

Tim O'Reilly (01:09:33):
You read these articles and they're so clearly AI-generated, and it's just like, clearly they're just trying to make a bunch of pages that they can put ads on. And Google quite rightly is going, "Well, this is bad not just for the user. It's bad for us." And I think the "It's bad for us" is fairly critical. I think we talked earlier about my conversations with Amazon about this. When you recommend-

Eric Ries (01:10:00):
Yeah. Totally. You have data. You have them dead to rights.

Tim O'Reilly (01:10:02):
Right. Yeah. "When you recommend worse products that cost more, you think your users won't eventually notice? And if you basically start taxing your vendors more heavily, don't you think that they will notice and look for alternatives and the first crack in your market power, they'll desert you?" And that's where it makes it very easy if regulators go, "Oh, yeah. We're going to basically say we're going to outlaw Amazon's most favored nation clause," which is one of the things they've gotten away with, "That you have to give us the best price you give anywhere," what will happen is people will go, "Well, Amazon makes us pay more for visibility, so our prices are going to be higher there and we'll give lower prices somewhere else where we don't have to pay for visibility." And all of a sudden the whole basically badly designed broken machine starts to creak.

(01:10:57):
And so I think that there are a lot of business models that are just bad, and eventually it catches up with you. And the problem with our economy is that in so many cases, you can... I used the analogy in one piece I wrote: it's a little bit like being able to take home your winnings when your horse is ahead at the first turn, without having to wait until the race is over. It used to be that you had to wait for an IPO. So I think things like allowing employees and founders to cash out prematurely certainly feel great for the founders, but I'm not sure it's really good for the long-term health of businesses, because you get these inflated valuations and this really strong incentive to inflate the valuation by whatever means possible and get out before people figure it out.

Eric Ries (01:12:01):
We certainly have seen a lot of that lately. And it was interesting, because when you were telling the story about that research you did on Amazon, the rest of the team that worked with you on the research wanted to go straight to the regulators, and you wanted to go straight to Amazon. To me, that really encapsulates a lot of the difference in the way you've approached trying to solve these problems. [inaudible 01:12:21] could be a good faith interlocutor of these powerful people [inaudible 01:12:24] on your own terms.

Tim O'Reilly (01:12:25):
Well, I would say that I also believe that while there are cynical, evil people in business, there are a lot of people who are fundamentally idealistic but are very able to talk themselves out of their idealism. I always think... I forget which of the Jane Austen books it's in, but at the beginning of one of her novels there's a scene where this guy... Now I forget exactly what the setup is, but the father has remarried, and the son, or maybe it's the nephew, who inherits because of the inheritance rules is told, "You have to actually take care of my wife and daughters." And then there's this whole scene where the guy's wife and family talk him out of his intention to do well by them, and he whittles down what he's really going to give them. [inaudible 01:13:36]-

Eric Ries (01:13:35):
And they're very subtle about it. They're not like, "Oh, yeah, be mean." They're just like, "What about this? What about this? What about that?" Yeah. It's very subtle.

Tim O'Reilly (01:13:47):
And it's that kind of... It's not small-mindedness. It's small moralness, and expediency. Anyway, I think we both believe very strongly in the long term. And there are a lot of other areas in which this is also relevant. If you look at government services, there are a bunch of well-meaning people, but they're not thinking enough about the long term. There are a lot of people who, for example... This is particularly extreme in California, I think, where you get this performative legislation. Actually, I guess it's true all over. It's just like, "This legislation is not designed to do anything except signal something that will help me get reelected." And they don't even think about... This is the subject of my wife's work. They don't even think about, "Well, how would this thing be implemented? And if it were implemented, would it work?"

Eric Ries (01:14:47):
Yeah. No. The institutional crisis of these kinds of flabby institutions lacking moral character, addicted to their own short-term self-perpetuation, is really not limited to the for-profit domain at all. I mean, we've seen the simultaneous collapse of government institutions, of unions, of hotels, of... Newspaper journalism had this terrible moral collapse. You see political parties becoming completely ineffective. And so you have a wide range of institutional types, for-profit, nonprofit, government, public sector, private sector, all going through a similar kind of existential crisis at the same time. You've got to say, "Well, this is not caused by ill-meaning people or some incompetent executive." We tend to focus our fire on this particular person or this sector, but what I've appreciated about your work over all these years is your ability to zoom out and see the society-wide issue, the deep rot at the core, the moral principle which is being neglected, which explains so many things that we're seeing at the surface.

(01:15:52):
I want to do a quick lightning round if you're willing, because as I was prepping... We've had so many great conversations over the years, and I've read so much of your work, and I have so many O'Reilly books all over my house. So I was like, "I know Tim's work pretty well," but as I was preparing, it really struck me, going through essay after essay over all these years: you have an incredible knack for these aphorisms that encapsulate a concept so succinctly. And we've already talked about so many. You've got your buzzwords and keywords that you're really good at, like Web 2.0 and government as a platform, and of course, "Create more value than you capture." We talked about "Work on stuff that matters," but I had a list of them.

(01:16:33):
Can I just ask you about them real quick? And then I want to ask you about how do you come up with these things that are so catchy? Can I just do a quick lightning round of a couple of them?

Tim O'Reilly (01:16:41):
Sure.

Eric Ries (01:16:41):
Just because... There's no way we can get into each of these topics in the depth they deserve.

Tim O'Reilly (01:16:45):
Sure.

Eric Ries (01:16:45):
They really struck me. Okay. All right. So a recent one which I think is so important is that, "You can't regulate what you can't understand."

Tim O'Reilly (01:16:51):
Yeah.

Eric Ries (01:16:52):
I use this one all the time. What does that mean?

Tim O'Reilly (01:16:54):
Well, I think it was... I first started talking about it as a result of this work I was doing looking at the enshittification of internet services. And we all remember what they looked like back when we were first delighted by them.

Eric Ries (01:17:12):
It wasn't that long ago. Yeah.

Tim O'Reilly (01:17:13):
It wasn't that long ago. And then we're aware of what they're like now, but we can't actually point to anything. When did it change? Like, "When did that suddenly happen?" And I found various kinds of breadcrumb trails over the years. Some SEO guy has, "Here's 10 years of the evolution of Google's ad display," or whatever. But that got me on this whole path towards thinking about disclosure as something that ought to be there. I can look back and look at Google's financials from 2004, when they went public, and Google's financials today, and there is a through line, but you can't look at, "Well, what was Google's ad load?" Or even, "How many users [inaudible 01:17:59]?"

Eric Ries (01:18:00):
How did the algorithm change? Yeah.

Tim O'Reilly (01:18:00):
How did the algorithm change?

Eric Ries (01:18:01):
How did their [inaudible 01:18:02]?

Tim O'Reilly (01:18:02):
And so there's a... How [inaudible 01:18:01]? There's a sense I came to that if we had a magic wand, exposing the internals of these companies would really help. Of course, it's very hard to do that retroactively, but part of what I got excited about is looking forward, with AI regulation front and center for everybody and all these companies new. They're also at the stage when they're trying to attract users by doing good work, so this is a great time to get a baseline of what good looks like. Is there a way to regulate?

(01:18:47):
But how did I actually write that piece? You can't regulate what you don't understand. It's funny because what happened was, as is so often the case, some external stimulus like somebody asked me to give a talk, or in this case our marketing department said, "Can you write something about AI?" I said, "I'm not sure that I have anything to say that I haven't already said." They were like, "Oh, please. We'd love it if you'd write something new." So I sit down and somehow it just popped into my head the analogy to accounting. The fact that here's this-

Eric Ries (01:19:22):
That's such a key analogy.

Tim O'Reilly (01:19:23):
... This set of practices that was originally developed as double-entry accounting 700 or 800 years ago, and it became the fabric of how we manage businesses. For all the ways it doesn't work entirely, there's a lot of obfuscation still, but accounting standards and auditing are based on a system that companies actually use themselves to manage and regulate themselves. I said, "Oh, that's what we need for AI. We need disclosure of what you are actually doing to manage this system." You read their account and it's like, "We do red teaming." Well, what does red teaming look like?

Eric Ries (01:20:11):
Yeah, what does that mean? Can you imagine if that was your financial report? Just said, "We do auditing."

Tim O'Reilly (01:20:13):
Exactly. "We stress test our financials." Red teaming typically consists of people coming up with a list. It's like early search: Sergey had a list of searches that he ran and said, "This one sucks. Fix it."

Eric Ries (01:20:30):
I wish we could run that. I bet if you ran that list against Google today, it would do horribly, sadly.

Tim O'Reilly (01:20:36):
But red teaming today is just some list of things that they're worried about. Of course, they worry about the wrong things, and we haven't really built the practice, but over time they will actually get pretty good at figuring it out, and we ought to know what that is.

Eric Ries (01:20:48):
We ought to know, yeah. I think before you can make any kind of intelligent bans or prescriptive coercive regulations, you need more state capacity to understand what you're talking about. I think that's critically important.

(01:21:01):
All right, next one. I love this one. You talked about how we're suffering from a deficit of idealism. I love that turn of phrase.

Tim O'Reilly (01:21:09):
I hadn't thought of that as a bumper sticker, but I could see that. Yeah, all I can say is I grew up in a household where my father used to borrow money so he could meet his charitable obligations. He believed he had to tithe, and sometimes he couldn't do it, so he would go to the debt market to give money away. That's just a strong set of values and a kind of idealism about what our role is in this world that I imbibed from an early age. But I've also seen it since I first gave that talk, I think in 2008, which morphed into... The original talk was called Wile of Hackers. But it's the one where I recited that Rilke poem, and then Jen came up to me afterwards and said, "I want a talk like that, but for entrepreneurs," and that became work on stuff that matters. I really saw, as I was giving that talk, how people are just starved for idealism.

Eric Ries (01:22:29):
Totally.

Tim O'Reilly (01:22:30):
They're starved for values. Companies that express values, that's winning. I think a lot of it is just the practical side of it. Google, when they were idealistic and expressed their ideals, and Amazon, when they were idealistic and expressed their ideals, had enormous loyalty from their staff, and they had people who felt mission-driven.

Eric Ries (01:22:56):
It was totally different.

Tim O'Reilly (01:22:57):
When you lose that, you lose something. How can you not just become more and more dullified? Idealism is a critical part of the human soul.

Eric Ries (01:23:13):
Yeah, it's very strange I feel like to talk about these spiritual topics as sources of business advantage, but that's what we're talking about. I really believe that. These companies are alive. They're super organisms with their own soul and their own ethos and their own moral compass. People feel something deep when they connect with them as customers, as investors, as employees, as whatever, any relationship we have with them, it's incredibly powerful and it requires that authentic commitment to something idealistic. The second you lose it, it's like the flow state in a human being, extremely difficult to get into, very easy to lose.

Tim O'Reilly (01:23:49):
Yeah. There's another element, a related concept that I love. I got this from a book called Air Guitar by an art critic named Dave Hickey. He wrote an essay that was very influential for me, called "The Birth of the Big Beautiful Art Market." It was about how Harley Earl, who was a VP at General Motors, turned the automobile market into what he called an art market, which is a market in which things are sold on the basis of what they mean rather than what they do.

(01:24:23):
His description was of the whole product ladder that GM introduced, where different-

Eric Ries (01:24:27):
Sure, a brilliant innovation.

Tim O'Reilly (01:24:29):
Different cars meant something different about your status and whatever. You had this whole product ladder where the product wasn't that different, but it meant something different. As soon as I read that, I went, "Oh, that was what Steve Jobs did to the computer industry." We all remember, if you were around in the PC era, when it was all about how many megabytes of memory you had, or kilobytes originally. How big was your disk? How fast was your processor? That was the marketing, and then Steve comes along and says, "Think different."

Eric Ries (01:25:01):
What color is your iMac? Is the color different [inaudible 01:25:03]?

Tim O'Reilly (01:25:02):
He did the 1984 ad; having a Mac means something, and Apple has ridden that to enormous heights. It's not exactly the same as idealism.

Eric Ries (01:25:14):
You're making meaning for people.

Tim O'Reilly (01:25:16):
But meaning and ideals are deeply intertwined.

Eric Ries (01:25:22):
AI is a mirror, not a master.

Tim O'Reilly (01:25:26):
Well, with that one I was really just trying to describe what I see out there. People have this notion that AI controls us. Just think about bias in AI. Let's say you have an AI hiring engine and it's going to make all these bad decisions, and it's like, "Bad AI, we have to fix it." Really, what is the AI doing? It's trained on actual practices. If you went and looked in the mirror and you said, "Wow, these clothes look terrible on me"-

Eric Ries (01:26:12):
Bad clothes.

Tim O'Reilly (01:26:12):
You wouldn't say "Fix the mirror."

Eric Ries (01:26:14):
Bad mirror.

Tim O'Reilly (01:26:15):
You'd say, "The clothes look terrible." I go we should be going and looking at all these AI algorithmic results that we don't like and go, "Wow, those values look really shitty on us. How are we going to change ourselves?"

Eric Ries (01:26:30):
What was it trained on? It just goes back to your algorithmic allocation thing. It could actually tell us exactly where the mistakes are in our own [inaudible 01:26:38].

Tim O'Reilly (01:26:38):
Exactly. People go, "Yeah, some sentencing algorithm is bad." I go, "Great, but let's not just fix the freaking algorithm. Let's fix the actual sentencing practices that we trained that algorithm on." I just don't see people taking that next step.

Eric Ries (01:26:58):
No, no, it reminds me actually-

Tim O'Reilly (01:26:59):
This is just bizarre to me.

Eric Ries (01:27:01):
See, if you go by this analogy, it reminds me of the Innocence Project: realizing that there would be this brief window of time, technologically, where you could use DNA evidence to prove that there were systematic injustices in the criminal justice system. And people have taken that as a thing about DNA evidence and how important it is, rather than as an indicator that there was only this brief window of time to get this perfect, beautiful x-ray of exactly what's going on and find all the practices that would have to be fixed. It seems like the same error.

Tim O'Reilly (01:27:32):
That one, I think, is an important concept, but I don't think it was terribly successful, because I don't think people have ever really gotten the implications.

Eric Ries (01:27:42):
Yet.

Tim O'Reilly (01:27:44):
Yeah, that's fair.

Eric Ries (01:27:46):
Okay, you got time for a couple more?

Tim O'Reilly (01:27:48):
Sure.

Eric Ries (01:27:48):
Okay. Generosity is at the beginning of prosperity.

Tim O'Reilly (01:27:53):
Well, I've been a big believer in the value, the innovation, you get from other people. Some of this maybe came from open source software, but I probably had that value before. You just think about everything you read that became part of who you are. As Elizabeth Barrett Browning said, "What I do and what I dream include thee, as the wine must taste of its own grapes." She was talking about her husband, but she could have been talking about all the books she'd read. But it really came from my interactions, I think, with people who were mentors to me in open source early on.

(01:28:41):
Bob Scheifler at the X Window System. We were basically taking their documentation and enhancing it and then reselling it. A lot of people were like, "You have to be giving it away." I go, "Well, then we don't have a business model to do that, so we won't actually create the value." I went to Bob and I said, "Gee, how do you feel about this?" He says, "No, that's what I want you to do. We gave out reference software and reference documentation, and we want everybody to build on it. We just gave it away." The same when I talked to Kirk McKusick at UC Berkeley, and he's like, "Yeah, we're just asking for credit, but we want people to build on our stuff." That was a very different-

Eric Ries (01:29:21):
This is what success looks like.

Tim O'Reilly (01:29:24):
... Then Tim Berners-Lee putting the web into the public domain. I just imbibed that early in my computer career. Then you look back and of course you read how the original von Neumann architecture was put into the public domain, and you go, this whole industry, and the internet protocols, all these things were this incredible underpinning. Just the other day, there was a piece in Bloomberg by Tyler Cowen reflecting on a recent Harvard study that came up with a calculation that the economic value of open source software is about $8.8 trillion. I haven't read the methodology.

Eric Ries (01:30:15):
It's certainly a lot.

Tim O'Reilly (01:30:21):
Literature is a big part of my life and how I think. There's a wonderful passage in Les Misérables, which of course is an incredibly humane novel about inequality and the human condition and the need to lift people up. After Valjean has escaped from prison and stolen the candlesticks, the old abbé says, "No, no, I gave them to him." Then he goes and becomes this industrialist, and there's this wonderful passage about how, in the places where he worked, everyone became more prosperous. There was not a single house that was not better off.

(01:31:02):
That's how people used to think about business. It was a force, at least ideally. There were always the [inaudible 01:31:10], but there was this notion that it was an engine of prosperity. We still have that, even in 20th- or 21st-century neoliberal economics; it's still positioned as an engine of prosperity. It's just the difference between... There's this great quote from Usenet so many years ago. I don't know who said it originally, but it was one of those Usenet sayings: "The difference between theory and practice is always greater in practice than it is in theory."

Eric Ries (01:31:41):
I remember that one very well.

(01:31:46):
The Metaverse is not a place.

Tim O'Reilly (01:31:48):
Well, yeah, that was an observation that I made because it always struck me, even back when the internet was first happening. I talked about, for example... This was still when people didn't, even back in the early '90s, people didn't really grok what was different about the internet.

Eric Ries (01:32:10):
The information [inaudible 01:32:11].

Tim O'Reilly (01:32:11):
I said, "Google doesn't happen." I literally had I forget what year it was, but a debate with Richard Stallman about how I said, "Look, even if Amazon gave away their source code and you had it, you wouldn't have Amazon because it's not actually." He's like, "Well, it doesn't run on my machine, so it doesn't matter morally."

Eric Ries (01:32:32):
I know, I know. I had [inaudible 01:32:36].

Tim O'Reilly (01:32:36):
I was like, "You're missing the whole point." There's a new paradigm happening where these things are happening in the space between. The internet happens between my browser and someone else's server, and it's this shared space. Then if you just look at the way to think properly, think about the Metaverse, the Metaverse is just maybe a more vivid version of the internet, but I always thought that it actually there was a way that they were trying to imagine out of gaming when I thought it was actually-

Eric Ries (01:33:17):
Yeah, as a destination, a place you would go.

Tim O'Reilly (01:33:19):
Right, that it should be imagined out of communication. The example I think I gave in that piece I wrote was that my wife and I would basically do morning exercises with a friend who had a Peloton subscription. She would play these exercise videos, like 20-minute abs with Robin, or strength training with Rad. We would do it every morning, a half hour with Rad or Robin, and so we have these virtual people, who are just video, not avatars, leading us, in two different locations, through an exercise routine.

(01:34:08):
I go, "Why is that not the Metaverse?"

Eric Ries (01:34:10):
Oh, it is. Yeah.

Tim O'Reilly (01:34:12):
I thought, well, that's more Metaverse than going into some dumb place where people are cartoon avatars. I still think that, and I love what Phil Libin was doing where he was saying, "How do we get more aliveness into Zoom and more interactivity?" I think that there's still some potential there that got lost. And I think it got lost because I don't think the Metaverse was ever actually a thing. It was a ploy to convince people that there was some future value that Facebook should be given now. It was effectively a mind hack on the financial system rather than a real technology that they were trying to drive forward.

Eric Ries (01:35:08):
Here's one that's very near and dear to my heart. Focusing less on shareholders might just save the world.

Tim O'Reilly (01:35:17):
Well, the original notion of shareholder value goes back to Milton Friedman's piece. I think, what was it, '81? The piece where he said, "Well, you know, the only responsibility of a business is to make money for shareholders." He was speaking out against corporate responsibility. One of the things a lot of people don't know is that it was an essay in the same magazine, I think it was TIME, where they were making a big deal about Ralph Nader's Unsafe at Any Speed. He was literally arguing against ...

Eric Ries (01:35:56):
Oh, Unsafe At Any Speed.

Tim O'Reilly (01:36:02):
Yeah, it was a big Ralph Nader thing. Leaving aside that side of it, the fundamental notion that if you give more money to shareholders, they will pass it on... maybe in theory Friedman might've been right, but in practice, he wasn't.

Eric Ries (01:36:35):
Back to the difference between theory and practice. It had some very unintended consequences.

Tim O'Reilly (01:36:39):
I don't fault him for saying, "Hey," because they were worried that the managers would appropriate the money that should be going to the shareholders and spend it on the things that they cared about rather than the things that the owners would care about, and so it ought to be given to the owners. I go, "Well, but in practice, it didn't work out that way." The real problem of course was when Jensen and Meckling said, "Well, the way to get shareholder value is to align executive compensation with shareholders by giving them stock grants." At that point, you built a machine in which everybody on the inside profited by doing the wrong thing.

Eric Ries (01:37:24):
Yeah, it has a gravitational pull all of its own. People can't believe me when I tell them that stock buybacks were illegal until 1983, or... I can't remember the exact year now. A very recent innovation.

Tim O'Reilly (01:37:37):
Yeah, it was considered stock manipulation. That whole ... anyway.

Eric Ries (01:37:45):
All right, last one. A marketplace is a lot like an ecosystem. It has to be circular.

Tim O'Reilly (01:37:51):
Well, maybe this goes back to Dune. It influenced my thinking, this idea that ecosystems matter. That's probably where I first encountered it; I was 14 years old. Then of course, reading Stewart Brand later and really getting into that sense that we live in an ecosystem. But it's obvious that businesses also have ecosystems. They have competitors, they have commensals, they have various dependencies, they have parasites.

Eric Ries (01:38:32):
And symbionts, yeah.

Tim O'Reilly (01:38:36):
Yeah. Thinking about what happens in ecosystems, for example, when an apex predator becomes too powerful: it takes out everything else and then it dies off. Whenever it gets out of balance, everyone suffers. I would just say that I always think that way about business as well. You have to think about not just your users. You have to think about your suppliers, you have to think about your employees, you have to think about your society, because we are enmeshed in an ecosystem.

Eric Ries (01:39:22):
It's a very holistic way of looking at business. I've always appreciated it.

(01:39:26):
Tim, thank you so much for doing this. It was really a pleasure to have this conversation. Always amazing to talk to you. I want to say thank you, obviously, for your leadership and so much that you've given to the community over so many years, but also just for your humanism. You may be the Oracle of Silicon Valley, but to me, you're the humanist of Silicon Valley. You stand for very human values in our whole industry. It's had a huge impact on me and on so many people. So I just wanted to say thank you.

Tim O'Reilly (01:39:50):
Thank you so much.

Eric Ries (01:39:52):
Really a pleasure to have you on.

(01:39:55):
You've been listening to the Eric Ries Show. Special thanks to the sponsors for this episode, DigitalOcean, Mercury, and Neo4j. The Eric Ries Show is produced by Jordan Bornstein and Kiki Garthwaite, researched by Tom White and Melanie Rehaq. Visual design by Reform Collective. Title theme by DP Music. I'm your host, Eric Ries. Thanks for listening and watching. See you next time.