Sal Khan arrived at the idea for Khan Academy truly organically. While working at a hedge fund, he took time in the evenings to tutor a younger cousin in math over the phone. Soon, a family tutoring network was in place, and from there, it was only a few years before Sal realized that the kind of help he was giving his family could – and should – be made available to everyone for free, across disciplines and geographic locations. Today, Khan Academy has over 170 million users and is available in 50 languages and 190 countries.
To fully commit to his vision, he founded Khan Academy as a non-profit, providing advantages that companies focused on making money by any means necessary will never have. As you’ll hear, avoiding what he calls the “very strange” market forces around education has been one of the keys to Khan Academy’s ability to build deep trust and loyalty.
It has also ensured a rare level of adaptability that has been especially important for Khan Academy’s role as a major early adopter of AI. Sal’s experiences with AI and education are widely applicable, as is his belief that we all have a duty to take a hand in shaping AI’s place in our world. Being open to new technology instead of fearing it can help us avoid the dystopian nightmares so many people have predicted are imminent. Equally important is his commitment to balance in all things, including salary and work-life choices not just for himself, but for all Khan Academy employees. As he puts it, “Having a life can make you a better leader, thinker, and decision-maker.” His story, which has many chapters yet to come, shows that being a mission-driven organization is no barrier to success in every sense.
Other topics we touched on include:
Having a long-term vision for education
Creating value as a non-profit
His skepticism about non-profits
Why he believes in remote work
The early results of Khan Academy’s AI integration
The power of knowing what you want your life to stand for
My main takeaways from the episode are:
—
Brought to you by:
Mercury – The art of simplified finances. Learn more.
DigitalOcean – The cloud loved by developers and founders alike. Sign up.
Neo4j – The graph database and analytics leader. Learn more.
—
Where to find Sal Khan:
• X: @salkhanacademy
• LinkedIn: https://www.linkedin.com/in/khanacademy/
Where to find Eric:
• Newsletter: https://ericries.carrd.co/
• Podcast: https://ericriesshow.com/
• YouTube: https://www.youtube.com/@theericriesshow
—
In This Episode We Cover:
(00:37) Meet Sal Khan
(04:24) Why Sal founded Khan Academy as a non-profit
(06:10) How his day job as a hedge fund analyst made him think longer-term
(09:26) How turning down venture capital has put Khan Academy in a better position for growth
(11:24) Creating value as a non-profit
(12:54) How nonprofits can fill in for government in education and healthcare
(13:30) Sal’s skepticism about non-profits
(16:01) The social return on investment framework
(17:49) The value of transparency
(18:22) Khan Academy by the numbers
(21:19) On making enough money and taking a risk to pursue a dream
(22:17) The counter-intuitive hiring benefits of being a non-profit
(27:46) Khan Academy as a leader in AI and education
(28:06) Turning fear into features
(30:05) Khan Academy’s top fears around adopting AI
(32:36) How being trustworthy led to early GPT-4 access
(34:04) Khan Academy’s AI experiments and results so far with Khanmigo
(36:55) Sal’s hopes for AI and special needs education
(38:52) Sal’s new book, Brave New Words
(41:51) AI as an amplifier of human intent
(43:38) The necessity of using technologies and tools we’re afraid of
(44:45) Balancing material needs and self-fulfillment
(48:20) Why Khan Academy has gone to fully remote work
(50:52) How the humanity of Sal’s hedge fund boss eventually led to Khan Academy
(53:18) Lightning round!
—
Referenced:
The Foundation Series, Isaac Asimov
Khan Academy 2018 (HBS case study)
Khanmigo: Revolutionizing Learning with GenAI (HBS case study)
Robin Hood Foundation Cost-Benefit Analysis Framework
Brave New Words: How AI Will Revolutionize Education (And Why That’s A Good Thing)
The One World Schoolhouse: Education Reimagined
—
Production and marketing by https://penname.co/.
Eric may be an investor in the companies discussed.
Sal Khan (00:00:00):
All technologies humans have ever developed amplify human intent. A knife can kill, a knife can save your life, it can cook food, it can keep you alive. But with AI, the public narrative is almost completely dominated by the negative. Whether this is a net positive or a net negative for humanity is not a flip of a coin. It is based on what we do. And if more positive intent is put behind AI, is amplified with it, then you're going to have a net positive. If the good folks just wring their hands and say, "We don't want to have anything to do with this," the bad folks are going to do whatever they're going to do, regulation or not. They don't follow rules. Then we're going to be in a bad place.
Eric Ries (00:00:37):
Welcome to the Eric Ries Show. My guest today is Sal Khan, founder of Khan Academy. What started as one man tutoring his cousin in math over the phone after his hedge fund day was over quickly became a family tutoring network. From there, it just kept growing. A few years later, Khan Academy was launched and today it has over 170 million users and is available in 50 languages and 190 countries.
(00:01:01):
Sal's a true contrarian. He chose to found Khan Academy as a non-profit. This has provided advantages that other companies merely focused on making money by any means necessary will never have. As you'll hear, avoiding what he calls the very strange market forces around education has been one of the keys to Khan Academy's ability to build deep trust and loyalty.
(00:01:23):
Khan Academy has also been a major early adopter of the new AI. Sal has learned more than almost anyone about how to apply this new wonder to the field of education, but I think you'll find his lessons are widely applicable. As he explains, it's our duty to take a hand in shaping AI's place in our world. If we're open to new technology instead of fearing it, we can avoid the dystopian nightmares so many people have predicted are imminent. For Sal, living a life that reflects everlasting values is truly compatible with embracing the new, just one of the many lessons he imparted in our conversation. Up next, Sal Khan.
(00:02:02):
This episode is brought to you by DigitalOcean, the cloud loved by developers and founders alike. Developing and deploying applications can be tough, but it doesn't have to be. Scaling a startup can be a painful road, but it doesn't have to be. When you have the right cloud infrastructure, you can skip the complexity and focus on what matters most.
(00:02:23):
DigitalOcean offers virtual machines, managed Kubernetes, plus new solutions like GPU compute. With a renewed focus on ensuring excellent performance for users all over the world, DigitalOcean has the essential tools developers need for today's modern applications, with the predictable pricing that startups want. Join the more than 600,000 developers who trust DigitalOcean today with $200 in free credits and even more exclusive offers just for listeners at do.co/eric. Terms and conditions apply.
(00:02:55):
The Eric Ries Show is brought to you by Mercury, the bank account I actually use for my startup. I've been around a lot of startups and a lot of FinTech products over the years. People often think the way to simplify the complexity of finance is to add layer upon layer upon layer of software and automations and workflows, and all you wind up with is a really complicated mess. Mercury's figured out the thing that really matters: the bank account. If all of your workflows, all of your automations are driven from the place where the data and the money already are, life gets a lot simpler. Mercury simplifies your financial operations with powerful banking, giving you greater control, precision, and speed so you can operate at your best. We all know speed is the ultimate advantage that startups possess. Your bank account needs to speed you up, not slow you down. Apply in minutes at mercury.com and join over 200,000 ambitious startups that trust Mercury to simplify their finances and perform at their best. Mercury is a financial technology company, not a bank. Banking services provided by Choice Financial Group and Evolve Bank & Trust members FDIC.
(00:03:57):
Okay, first of all, thank you for making time. Thanks for coming on the show.
Sal Khan (00:04:04):
Thanks for having me.
Eric Ries (00:04:05):
I want to talk first off about why you decided to make Khan Academy a non-profit. Because we're so used to the idea that entrepreneurs are trying to make money, they're building a for-profit company, they're going to take it public one day. That's what it means to be a valuable company to so many people. When you were doing your tutoring, you famously were tutoring your cousin and then so many other people, you were becoming a YouTube influencer. You easily could have taken this in a for-profit direction. Lots of people have made a lot of money on ed tech companies. You felt it was important to be a non-profit. Why?
Sal Khan (00:04:39):
Yeah, good question. If you go back to 2007, 2008, at that point, I had already been working on this project of sorts for three, four years. It had already gotten some traction on the order of 100,000 folks were using it every month. And when I say it, not only the videos I was creating on YouTube, but I had a software platform that would give users practice and feedback, and there were teacher tools. And that was actually the original Khan Academy, and it still is where we put most of our resources, that whole practice feedback, teacher tool aspect of it.
(00:05:15):
I remember there were folks who reached out. At that point, I lived in Silicon Valley, I still do, and they said, "Hey, I'll write the check right now. You can quit your day job, do this full time." And it was tempting, and usually that first conversation went really well. And I have nothing against for-profit companies. My day job at the time was working at a hedge fund, which is about as for-profit as it gets. But there was something about the second or third conversations. And all of these were good people, but it was all around what do you start putting behind a paywall, or where could you put ads that won't completely disrupt the education process?
(00:05:55):
I started to think about all of the users who were already benefiting and had the potential to benefit from this, but would not if you started increasing the frictions, if you started putting it behind pay walls.
(00:06:08):
I also thought a lot, my day job, I was a hedge fund analyst. I was talking to a lot of public companies, and I thought a lot about how annoying people like myself would make these leaders at these public companies think very short term, really about the next quarter. There's actually very little incentive in public companies to think more than a few years out. If someone's thinking five years out, they're considered a big thinker in a lot of places.
(00:06:38):
And I also saw, working at a hedge fund, how much the capital structure influences the decisions you have to make. A lot of for-profit companies are started with a mission in mind, oftentimes with a very mission-oriented founder. But by the time you grow and you have a lot of other stakeholders, that mission by definition is not the bottom line. The bottom line is the fiduciary duty to shareholder value, etc.
(00:07:06):
And so when Khan Academy started to become a thing, a little delusional part of me said, "Well, what if instead of Khan Academy," and both of these are delusional thoughts. There's the for-profit delusional dream that every entrepreneur has. It's like, "What if this is the next Google? What if this is the next Meta?" Whatever. Which is great, make a lot of money, have a lot of impact. But then even more delusional part of my brain said, "Well, what if this is the next Oxford or what if this is the next Smithsonian?" And that started to really appeal to me because those institutions last hundreds of years, in some cases last thousands of years.
(00:07:45):
And a lot of times when I'm faced with a decision, I'm like, "What would a protagonist in some of my favorite books do?" And I said, "Well, if I talked to Hari Seldon in the Foundation series," he would've started this as a nonprofit and tried to make an institution that exists for generations to come. And only a nonprofit, if it can last, can stay true to that mission.
(00:08:15):
Now nonprofits have their own set of rough spots that we could talk about, but that was the thinking. What if Khan Academy could be an institution that could reach billions of people over maybe not just decades, but hundreds of years?
Eric Ries (00:08:30):
Well, I admire and love the long-term vision of that, and the chutzpah really to go live the dream. Why do you think being a nonprofit was the key to being able to think long-term like that? I think most people would find that counterintuitive.
Sal Khan (00:08:46):
Yeah. There's several Harvard Business School cases now on Khan Academy, all kind of linked to this idea. The first one was, should Khan Academy essentially be a nonprofit or not? And then some of the cases since then have been around some of our growing pains that we've had and then our pivot now in the AI world.
(00:09:04):
And I even look at some of the, let's call it peer ed tech companies that started, most of them started a few years after Khan Academy. A few started before. And I know a lot of the founders and they had similar missions. They wanted to democratize say, higher education or something else.
(00:09:24):
But what happens is as soon as you start taking some of the venture money, this isn't a bad thing. I think this could be a good thing in many cases. The attitude is go big or go home. A lot of money. Grow fast, even if that creates some cultural growing pains, even if that... And then after about two or three years, maybe four or five years, if you have generous investors, they start turning the screws a little bit on how exactly you're going to be monetizing this.
(00:09:54):
And at that point, it usually involves narrowing the ambitions or narrowing the mission. So some of the peer groups that wanted to democratize higher education or think about free college, they've gone to where the money is, and I don't criticize them for it. They had to as a business. But it's okay, we're going to turn into certifications for the most part, people who already have college degrees but want a specialization in say, data science or something like that.
(00:10:20):
That's great. That's creating value for some people, but they weren't able to stay true to that big mission of what if higher education were to look different. And ironically, we are slow, I wouldn't say we're medium speed or fast, but steady, and even though it didn't happen in three or four years, we're actually maybe in a better position now to tackle some of these issues around credentialing, not just at a high school level, but also at a career and college level.
Eric Ries (00:10:52):
I think it's a really remarkable thing that you've been able to build. So first of all, I'm full of admiration for it. But I also think it's really interesting that you talked about the inspiration for something like the Smithsonian. The Smithsonian is a nonprofit, but it has incredible revenues every year, creates an incredible amount of value in the world, and so does Khan Academy. So I think it's really interesting, there's this kind of paradox that being a nonprofit has allowed you to create more value. And I kind of feel like the implication of that is there's something wrong with the way we're doing for-profit companies. If a for-profit company supposedly is supposed to be all about maximizing value creation, all about making the most money, what do you make of the fact that you've been able to create more value as a nonprofit than you would have been able to as a for-profit?
Sal Khan (00:11:37):
Yeah, it's a debate. I have friends who will give me a good argument for why Khan Academy maybe would've created even more impact or value as a for-profit. I'm skeptical of it, because a lot of our journey has actually been people joining the effort and that energy amplifying what we're doing, which I don't think would've happened otherwise. People want to be part of it in a lot of ways, not just from a resourcing point of view, but also from a talent point of view. One of the questions is, could you even attract talent as a nonprofit, especially in this environment? We've been able to.
(00:12:10):
I think for me, if you were to want to start a new car company, if you want to start, let's call it general productivity software. I'm in general favor of the for-profit route, and I'm generally in favor of the rapid creative destruction process. I think that for the most part, it's good for society.
(00:12:34):
I think nonprofits are interesting where, if you had a super innovative and creative government, it's the types of things that super innovative and creative government would do, because it's a public good. Market forces either don't lead to outcomes that are consistent with our values or they just don't work. I think education and healthcare are two areas where the market forces are very strange. The beneficiary, the payer, and the decision maker are different people, which is not a good setup. And we have values around both of those areas that aren't necessarily consistent with, if you can afford it, you should get it. It should be like, no, everyone who wants to learn should learn. Anyone who's about to die, someone should save their life.
(00:13:17):
So I think those areas where in theory it should be government, but the government is not likely to be nimble enough or innovative enough, that's where the nonprofit sector could be interesting. But look, I'm also very skeptical of nonprofits. When I was in business school, they didn't fail anyone, but they tell you internally if you essentially would've failed a class. When I was at HBS, they gave you either a one, a two, or a three. A three meant you were in the bottom 10% of a class. And the one course that I got a three in was social entrepreneurship.
Eric Ries (00:13:52):
What do you know about that? Oh my goodness. That's such a great example of how we try to credential and gatekeep these things, but actually the people, it's really the misfits and the radicals who have the good ideas. And yeah, I hope you wear that as a badge of honor now.
Sal Khan (00:14:10):
Yeah. The reason why I got a low grade in that class is I was skeptical of a lot of nonprofit efforts. I remember the final exam was about some bicycle ride that purportedly was going to help cure disease, and I wrote on the paper, I'm like, "How does this bicycle ride cure that disease?"
(00:14:27):
Now, in hindsight, I didn't realize that the professor is the person who actually created that nonprofit. So that was not a strategic move. But now I'm not as cynical about those types of things. I actually think, yes, the bicycle ride itself, and maybe the money it raises isn't going to necessarily be the defining thing that cures the disease, but it builds awareness and it gets people bought into a mission.
(00:14:51):
But I think there could be some very negative aspects too. Just as we were talking about in the for-profit world, people can be very short-term, go big or go home, sometimes not aligned with our values. In the nonprofit world, sometimes you have the opposite problem, where things are not nimble. The nonprofit exists just for the sake of existing. You don't necessarily have as much creative destruction as you should in the nonprofit world.
Eric Ries (00:15:13):
Yeah, we've all encountered those nonprofits, and it is kind of sad when that happens. What do you view as the mark of a successful... I don't want to call it a nonprofit. I actually think the way our society thinks about profit is ridiculous. The idea that you and the Smithsonian are not-for-profit entities, it seems really backwards to me. This seems like a lot of value is being created here, so I wish we had different terminology for it.
(00:15:34):
But this kind of mission-driven enterprise, what I'm going to call it, what do you view as the hallmarks of a success there? People must pitch you now all the time that they want to create the Khan Academy for something else. How do you differentiate between something that's likely to become a movement like you've built versus something that is more likely to be one of these sad, self-serving, bureaucratic nonprofits?
Sal Khan (00:15:59):
Yeah, I think you have to make a strong argument on social return on investment. And there are some frameworks out there. The Robin Hood Foundation, a large foundation out of New York, has a framework. And by their framework, it's sometimes hard to quantify the value that you create. Even for-profit companies usually only capture a small amount of the value that they're creating. But they quantify it in the education world based on some correlations that have been seen between learning outcome improvements and lifetime earnings. And that doesn't even touch on things like lowering your chance of getting incarcerated or having to go on social services, etc., etc.
(00:16:38):
But with their measure, a good nonprofit would have a 2X or 3X social return on investment. Now, social returns on investment are typically higher than regular returns on investment, because everyone's trying to get a regular return on investment, so that's much more competitive. So if I told you I had a 30% return on investment as a hedge fund, you're like, "Okay, how do I sign up?", if you're doing that in a low-risk way.
(00:17:03):
But Khan Academy, if you say a nonprofit is at a 3X, a typical decent one, a good one is at 10X, Khan Academy with very conservative assumptions about our reach and our engagement, our impact is about a 500X. So I think being able to quantify aspects like that, being able to measure what you're actually doing.
(00:17:25):
I also think one of our benefits is, a lot of times my wife and I were thinking, "We should donate to something in some part of the world. We really care about that." But then we were like, "Do we really know what they're doing? Do we really know? Is it really going to buy a mosquito net or is it going to line some bureaucrat's pocket or who knows what's going to happen with that?"
(00:17:48):
And I think one of the properties that Khan Academy has is that we're hyper-transparent. No one needs to question what's going on with Khan Academy. They can see it very clearly. In fact, most of our donors are users themselves. Their children use it. So it's not hard for them to imagine, "If Khan Academy creates this new course or creates these new features, not only will my kids benefit. In fact, I've already seen my kids benefit. But I can definitely see how these other kids who otherwise would not have access are going to benefit from it."
Eric Ries (00:18:18):
Give us some vanity metrics. Just brag for a second. How big is Khan Academy now? Give us a sense of the scale of the impact. How big is the team? What's the budget like? How many kids use it? Just whatever you feel comfortable sharing.
Sal Khan (00:18:30):
Yeah, I'll share whatever y'all want to know. I guess the most vanity metric is registered users, which is I think approaching 170 million. Maybe it's a little over 170 million registered users at this point. It's in almost every country of the world. Almost every major language. There are 50-plus translation efforts of Khan Academy. That's another thing that would've never happened if we were for-profit. You would not have seen a Swahili version of Khan Academy. You would not have seen a Bengali version of Khan Academy. But those exist. And you would not have seen people giving their time and donating to make those types of things happen. So we're proud of that.
(00:19:08):
As a team, we're now approaching 300 folks. I don't view that as a vanity metric. That gives me a little bit of stress because we have to think about where the resources are going to come from. So we're still predominantly philanthropically supported. We do have revenue streams: school districts that are looking for more support and training, integration with their rostering systems, district dashboards, and now AI tools on top of that, pay us a little bit of extra revenue. But that's kind of the high-level Khan Academy by the numbers.
Eric Ries (00:19:40):
It's been a remarkable run. It's been really fun to watch you do it, and listen on behalf of my kids. It's been fun to watch them use the product and get value out of it.
(00:19:48):
You've mentioned the recruiting thing. When I hear people talk about for-profit versus non-profit, about what kind of company to build, "I want to build a mission-driven company," there's a very common set of beliefs out there: that if you're that mission-driven, if you go all the way to being a non-profit, you're not going to be able to recruit talent. You're not going to be able to raise money. You'll be at a competitive disadvantage because you won't have the discipline of the markets and all this other stuff. But you seem to have judo-moved all of those disadvantages into advantages. You were saying you were able to hire incredible people, recruit people into this mission, enlist non-employee folks to come and do this work with you. Talk about some of those counterintuitive benefits of being a mission-driven organization.
Sal Khan (00:20:29):
Yeah. I remember when I was at a hedge fund, someone forwarded me some research paper that said beyond a certain threshold, money doesn't really matter to folks. What really matters to folks is, they need some money. They need to feel like they have, let's call it, an upper middle-class lifestyle. They'd be able to go to a restaurant every now and then, have two cars in the driveway, maybe, let's call it, a 2,000 square foot house in a nice neighborhood. Go on vacation every now and then and pay for your kid's college. That's kind of the American dream, so to speak.
(00:21:01):
But beyond that, what people really want is a sense of mission, intellectually challenging work. And they want to work with other really interesting, aligned people. And when I read that when I was at the hedge fund, I thought, that sounds good, but that doesn't seem to be the way the world works.
(00:21:18):
But when Khan Academy started, first of all, I made the decision myself. I said, "You know what? If I could go back in 2008 or 2009," 2009 is when I quit my day job at the hedge fund. And I was making good money then. More money than I needed, frankly. But we weren't independently wealthy. We were saving money for a house. So that was my incentive to stay at the hedge fund. Our first child had just been born. But my wife and I sat down and we said, "You know what? If we can make enough money," I kind of called it if I could make a kind of professor salary, and she was in fellowship at the time, but she wanted to work for the city hospital. She still does work for the city hospital, so she also wasn't pursuing the most high paying way to be a doctor. Together, we can support an upper middle class lifestyle or even middle class lifestyle in Silicon Valley. That's all we need. Everything else is gravy to some degree. It just kind of helps build our security and things like that.
(00:22:13):
And that was a bet I personally felt like taking. And then in those early days when Khan Academy started to get resources and we were out to hire, we were able to find some really incredible people, as good as anyone, to come join. Now, even then, it was like, "Well, maybe these are just a few people who came out of the woodwork."
(00:22:33):
But now that we've hired over the years, that paper that came out that I was skeptical of back when I was a hedge fund analyst was 100% right. Today, Khan Academy, we do pay way better than most nonprofits. Our board, to their credit, they do not think that working in a nonprofit should be somehow a vow of poverty, that somehow you should get paid less if you're doing more important work. So we pay. There are folks at Khan Academy making many hundreds of thousands of dollars a year. They're not making millions of dollars a year, which many of these same people could make other places. But they are able to get that hopefully upper middle-class lifestyle.
(00:23:13):
But we're giving them a mission. We're giving them really intellectually challenging work, things that they feel proud of. And we're getting a great yield. I was just talking to our HR team a couple of weeks ago, and right now we have about an 88% yield on our offers. Almost everyone that we give an offer to has another offer from a Google, from a Microsoft, from a Meta, and we have an 88% yield.
(00:23:36):
I remember back in the day, in the early days, Eric Schmidt of Google was on our board, and this was when I had never run an organization much less a non-profit. I was like, "Eric, any advice you have on recruiting?" And he said, "Well, what's your yield?" And after I told him, he's like, "We should be taking advice from you because we're also throwing stock and we're trying to create all these golden handcuffs, and we can't retain and attract people at the same level that you are."
Eric Ries (00:24:04):
[inaudible 00:24:04] of things come for Google, sadly.
Sal Khan (00:24:05):
Well, yeah. Well, everyone has had to deal with, when you feel like you're less connected to a mission where you're like-
Eric Ries (00:24:16):
No, that's exactly it. And I don't mean to pick on Google in particular. You already said the fundamental truth, which is that as these companies tend to grow, they tend to lose that connection to the mission, and that puts them at a competitive disadvantage. So it's funny to me that the things they do supposedly in service of getting more efficient and being more profitable actually wound up allowing them to lose this ability to create value, to retain talent, to inspire customers, and the other things that being highly trustworthy gives you.
(00:24:46):
And I feel like you've always felt this special responsibility being in education to be an incredibly trustworthy, talk about a transparent counterparty, someone you can entrust your kids and their education to. Talk about the challenge of that.
(00:25:00):
How do you ensure that the whole company, the whole organization, embodies that ethos? Especially as you get 300 people now, you're at a stage where you can't personally oversee what every single person's doing in every single team. Talk about building that internal culture where you treat the welfare of students as sacred and really put that at the center of everything that you do.
Sal Khan (00:25:22):
Yeah. The good news is, I think just by definition of who we are and our mission, the people who are drawn to Khan Academy care deeply about those things. The first thing they care about is making sure that we're doing the right thing by users, by learners, by teachers, by families, etc. I would say that as a not-for-profit, it's almost a reverse: it's not so much that I have to put a ton of energy into making sure that our team cares deeply about these issues. It's more that I sometimes have to put energy into making sure that our team doesn't get so fixated on an edge case that they're afraid to move forward. A lot of what we've done with artificial intelligence, initially everyone's like, "This is amazing, what's here," but what about student privacy? What about the fact that these AIs can hallucinate? It's not particularly good at math. What if students have shady conversations with the AI? How do we keep them safe?
(00:26:22):
And I think left to maybe consensus, if we were a consensus-driven organization, we might've said, "Oh, let's just not do it. It's still not necessarily ready for prime time." But I had to do a lot of evangelizing the idea of if we don't do it, it's not that no one's going to do it. It's just that someone who cares less is going to do it. They're going to fill that vacuum. And is the world really better off? So instead of using these as reasons to shy away, or not move, or make the perfect the enemy of the good, let's turn all of these fears into features and put something out there that we can really stand behind.
(00:26:58):
So a lot of what I do is just making sure we have that tension between some of the stereotypes of a nonprofit and some of the stereotypes of a for-profit. Every day, I do have a little bit of a chip on my shoulder when people are skeptical. They're less so now, but especially in the early years of Khan Academy being a nonprofit: will we be nimble enough? Will we be innovative enough? I like to show people, no, not only can we be nimble enough, but we're more innovative, more nimble. A lot of our partners are some of the big AI providers, and if you ask them who's really done a strong pivot, an unusually strong pivot into AI, most of them are saying Khan Academy, which is not what you would expect to hear from a nonprofit.
Eric Ries (00:27:37):
No, it's been really remarkable actually. I want you to dive into that turning fear into features, and use AI as a case study, because Khan Academy has been a really interesting leader in AI and education. And you've been everybody's favorite demo partner, launch partner. You were part of the infamous Scarlett Johansson Sam Altman video that got him in trouble. Khan Academy's been central to so many of these AI providers' stories. You've had to lean into that. Walk us through what that was like. How do you turn fear into features?
Sal Khan (00:28:07):
Yeah. Well, there's certain things where when you see it, it just kind of slaps you in the face that this is a big deal. When Sam and Greg, Sam Altman and Greg Brockman from OpenAI, reached out to me in the summer of 2022, this was months before ChatGPT existed, and they showed me GPT-4. As many folks know, the original version of ChatGPT was not even built on GPT-4. It was GPT-3.5.
(00:28:33):
So when I saw a more advanced model well before the world had seen it, it slapped me in the face that this is a very, very big deal. And even though it had imperfections, the rate at which it was improving was striking. I now realize that back in 2022, I underestimated the rate of improvement, even though I was probably one of the more bullish people on how fast it would improve. I said, "This is going to change everything. This is going to create huge opportunities for what we try to do." As a nonprofit, our mission is a free, world-class education for anyone, anywhere. When we say world-class, it means: how do you emulate what a great personal tutor, what a great teaching assistant, would do?
(00:29:06):
I always point out that Alexander the Great had Aristotle. When we had mass public education, we could not afford that, so we had this factory model of education. But Khan Academy over the last 15 or so years has tried to approximate what a great tutor would do with things like video, with personalized software, and by aiding teachers so that they could personalize for their classrooms more. So it was obvious AI could maybe get that much closer to approximating it.
(00:29:30):
So I just really encouraged the team: "All right, yes, what you're saying is a real risk. But don't just keep fixating on it. How do we mitigate that risk? Come up with something, and don't over-design it. If you had to ship something by next month, what would it look like? And then how do we weigh whether it's good enough or not?" And so we just started to build, and we're still doing that. We still have these debates, but that's, I think, helped us-
Eric Ries (00:29:57):
Give some examples from those early AI MVPs that you built. What were the fears and what ultimately did you do to mitigate those fears?
Sal Khan (00:30:04):
Yeah. Some of the most obvious things, as soon as anyone looks at any of these AIs, is that they could be used for cheating. That's the number one risk. What happens if a student wants to have, for lack of a better word, a shady conversation? It could be a shady cheating conversation, or it could be shady something else. They're trying to build a bomb, or do something hateful, or talk about something hateful, whatever it might be, or harm themselves. And so we said, "Okay, well, these things are pretty easy to prompt." Actually, the original ChatGPT, or GPT-3.5, was not that steerable. It was not so easy to prompt to be safe. But GPT-4 and onwards, and not just from OpenAI but other frontier models of that same kind, you can prompt them to be much safer so that they act as an ethical Socratic tutor, etc. So that's one layer.
(00:30:51):
The next layer is, especially for under-18 users, how do you provide transparency to teachers and parents and potentially the school system itself? Transparency in general, so that you can report back saying, "Hey, this is what the kids have been up to. This might be useful to think about as you plan your next lesson." But also, if a student is saying, "I want to harm myself," or, "I want to build a bomb," can the AI proactively notify someone?
(00:31:17):
And look, this is what a great tutor would do. If I'm tutoring a young kid and they want to cheat, I'd say, "Hey, that's not what I'm here for, but how would you approach the question? I'll remind you of this formula, but you have to do the thing." And if that student said, "Hey Sal, can you help me build a bomb?" I'd be like, "No." And I would send an email to their mom or to their teacher saying, "I don't know if they're just trying to play around or not, but you should look into this. Why are they asking me to build a bomb?" So Khanmigo does that.
(00:31:46):
Then there's obviously a whole bunch of data privacy things that a lot of users don't pay attention to on a daily basis, but we don't want important or sensitive student information to leak into the general models. So we've been very careful that personally identifiable information isn't used to train the general models. We haven't done it yet, but we might use it for fine-tune training of something that stays within our Khan Academy nonprofit sandbox.
(00:32:19):
But those are the key guardrails that we've had. And on hallucinations and math errors, we've done a ton of work above and beyond the base models, anchoring it on Khan Academy's content and doing a bunch of double checks, triple checks to just make it a lot better.
Eric Ries (00:32:35):
It just occurred to me that the fact that Sam and Greg reached out to you to show you GPT-4, that's also a really interesting consequence of being the trusted party that you were. If you were just another ed tech company, maybe they wouldn't do that.
Sal Khan (00:32:50):
Oh, I think that's right. They told me that before they even showed me that GPT-4 demo in the summer of 2022. They said, "Look, we think this is going to be the model that really wakes people up to what generative AI can do." As a side note, no one thought GPT-3.5 was worth anyone's time. They made ChatGPT on a whim, and it kind of surprised everyone, including them, that people paid attention to it. Everyone thought GPT-4 was going to be the model that woke everyone up. But they said, "Look, it might also be a little bit unnerving and scary." So they wanted to lead, or launch, with trusted partners in a trusted use case that could make use of this technology. And yes, they immediately thought of Khan Academy. And I think the trusted side is that we're a nonprofit and what we represent, not just as a nonprofit, but hopefully people-
Eric Ries (00:33:40):
Sure, it's about being mission-driven, really. Your tax-exempt status is not the thing that inspires anybody.
Sal Khan (00:33:46):
Exactly. It's being mission-driven and the trust we've been able to accrue, but also being capable of making use of it. They didn't go to Harvard or Stanford. Those are also nonprofits, but they're like, "Okay, those are large nonprofits."
Eric Ries (00:33:59):
Yeah, it would take forever.
Sal Khan (00:34:00):
That aren't necessarily going to move fast.
Eric Ries (00:34:04):
Talk a little bit about what results you're seeing from the AI. You've had more experience now than anybody using AI in an educational context. Some people were very bullish about that possibility. Some people were quite frightened about it. What have you seen?
Sal Khan (00:34:17):
Yeah, it's still very early days. We launched in the spring of 2023. And even that summer, we did some very light internal studies to see what it's doing. At minimum, is it doing no harm? And we got a clear signal that it was not doing harm, and a slight signal that it was driving a little bit more engagement. We're tooling up for some more detailed studies over the next year or two.
(00:34:41):
My intuition is you're not going to see AI dramatically accelerate... Khan Academy pre-AI has 50-plus efficacy studies on it showing that when students do this personalized practice for, let's call it, 60 minutes a week, they're accelerating by a pretty large amount: 30, 40, 50% in many cases.
(00:35:01):
I think with the AI, you'll see a marginal to neutral impact on the actual efficacy. But what you will see, hopefully, is that the AI is driving more engagement. One, as a student is stuck on a question, or they're watching a video and something was said, they can have a conversation about it. They can dig a little bit deeper. But also by helping the teacher engage the student. Teachers are the biggest engagement mechanic. You can make all the game mechanics you want, but at the end of the day, it's the teacher telling the student to do it that's going to make them do it.
(00:35:33):
And there, using the AI to support the teacher with lesson plans, progress reports, creating assignments, having a narrative read of what the students are up to, I think will also drive engagement in classrooms, which will then drive those efficacy results. But it's still early.
(00:35:49):
There are a lot of interesting user interface things we've discovered. A lot of kids run with it, I would say about 10 to 15% of them. As soon as they see these tools, they just figure it out and they're off to the races. But I was surprised that there are a lot of other students, including students in high school, who are just not used to being able to have a free-form academic conversation. And I remember, this was a younger group, a second or third grade class that we were trying this out with, and a lot of the kids were telling Khanmigo weird things. It would be a math problem, and they would say, "Huh," or, "IDK," or, "Can you say this word?" At first, we were like, "Okay, I'm so sorry. This isn't appropriate for this age group." And the teacher's like, "No, no. You don't understand how valuable this is. They do that to me. They raise their hand and they forget what they were going to say, or they can't articulate their question, and now they're getting practice with it."
(00:36:42):
And you could imagine, especially if you're a high school student who hasn't developed that skill of being able to articulate what you need: who cares if you were able to learn the algebra or not? Articulating what you need is a more important skill.
(00:36:55):
This is anecdotal, but an educator in Florida recently told me how their special needs kids, especially kids on the Asperger's or autism spectrum, are using Khanmigo more than their peers. Now, that could be a double-edged sword. You could say, well, does that isolate them more-
Eric Ries (00:37:13):
Or is it a chance to practice those skills?
Sal Khan (00:37:14):
Exactly. And their evidence is that it's actually more of the latter. These kids are engaging with Khanmigo more than other kids, but based on observation, they also seem to be getting more confident communicating with other people because of it. So that's a hope that hopefully we can build on.
Eric Ries (00:37:29):
This episode is brought to you by my longtime friends at Neo4j, the graph database and analytics leader. Graph databases are based on the idea that relationships are everywhere. They help us understand the world around us. It's how our brains work, how the world works, and how data should work as a digital expression of our world.
(00:37:49):
Graph is different. Relationships are the heart of graph databases. They find hidden patterns across billions of data connections deeply, easily, and quickly. It's why Neo4j is used by more than 75 of the Fortune 100 to solve problems such as curing cancer, going to Mars, investigative journalism, and hundreds of other use cases.
(00:38:10):
Knowledge graphs have emerged as a central part of the generative AI stack. Gartner calls them the missing link between data and AI. Adding a knowledge graph to your AI applications leads to more accurate and complete results, accelerated development, and explainable AI decisions, all on the foundation of an enterprise-strength cloud database. Go to neo4j.com/eric to learn more.
(00:38:37):
I love your optimism even in the face of all the uncertainty that has faced us, and I guess you're used to it now because of course, having done the impossible, it gives you a certain sense of confidence that what people say can and can't be done is not very likely to be right.
(00:38:51):
I want to ask you about the new book. I see it right there behind you, Brave New Words, on your shelf. It's only just come out, so people should definitely check it out. We'll make sure we have a link in the show notes. But to me, what strikes me about the thesis is its overriding optimism about what's possible in the century to come. Talk about why you wrote the book and what you want people to take away from it.
Sal Khan (00:39:14):
Yeah. I wrote the book when we were under a non-disclosure agreement with OpenAI about two years ago, and as I said, it slapped me in the face that this is going to change not just what I do, it's going to change the world. I was like, "Someone's got to write a book about this." And it took me a while to get through the OpenAI lawyers to even let me pitch the book to other folks. They eventually relented, but it was very cloak-and-dagger, top-secret stuff, as you can imagine.
Eric Ries (00:39:40):
Oh, sure, sure. That's gotten them in trouble too recently. A little too much on the secrecy.
Sal Khan (00:39:44):
And I'm used to literally being an open book, no pun intended. So it was a new muscle for me. But it wasn't just that someone should write a book. It was also clear that the education establishment was going to struggle with this. And then obviously when ChatGPT came out, with cheating, they did struggle. It is, today, an emergency.
(00:40:07):
And at the same time I said, "Well look, all technology amplifies human intent. There's going to be negative intent. There's going to be lazy intent on the part of some students, but there could be some very positive intent here. And so let's hope that the baby doesn't get thrown out with the bathwater. So hopefully I, or we can show that there's another way to do it."
(00:40:28):
And also, everything is moving so fast. I wanted to get my own head around it. I wanted to structure my own thinking. This is the second book I've written. The first one was The One World Schoolhouse, which I wrote back in 2011 when Khan Academy was really picking up, and I was skeptical of writing a book back then. I was like, "Anything I have to say, I'll just put on YouTube." But the publishers convinced me, and it was a really good exercise in framing my own thinking: what is Khan Academy? Where does it fit into the world? What is personalized learning? What should the education of the future look like? What is competency-based learning?
(00:41:00):
And so this time I was less skeptical of writing a book. I was like, "This will help me frame my own thoughts." Another big fear when you're writing a book about AI is that by the time the book's out, it's outdated. But I tried to write it in a way that's pretty evergreen. And so far, I'm pretty confident that what's written is going to stand at least a reasonable test of time.
Eric Ries (00:41:23):
So what's the thesis? Give us the main takeaway. When people read the book, what do you want them to take away from it?
Sal Khan (00:41:30):
It's very fashionable these days for folks and me to sit at dinner parties and have conversations about what's going to happen with AI. And it's very easy for a lot of folks to index on the negative. "Oh, deepfakes. Oh, fraud. Oh, authoritarian governments." And look, those are all real risks. But the overarching theme of the book, which in some ways transcends education, is that all technology humans have ever developed amplifies human intent. A knife can kill; a knife can save your life. It can cook food. It can keep you alive. The same thing is true of fire. The same thing is true of wheels. The same thing is true of the steam engine. This is going to be true of AI.
(00:42:13):
And what's interesting about AI is I've never seen so much hand-wringing over a new technology before. With every other technology that's come about, take the steam engine: immediately people said, "Oh, this could help us with transportation, it could help us with this." And then there were probably some people in the military like, "Oh yeah, we can make tanks and we can make submarines," and all that kind of stuff. But with AI, the public narrative is almost completely dominated by the negative.
(00:42:40):
And so the overarching theme is, look, whether this is a net positive or a net negative for humanity is not a flip of a coin. It is based on what we do. If more positive intent is put behind it, is amplified with AI, then you're going to have a net positive. If the good folks just wring their hands and say, "We don't want to have anything to do with this," while the bad folks do whatever they're going to do, regulation or not, they don't follow rules, then we're going to be in a bad place. We're going to go into a dystopian world. And there are a lot of really good things that AI should be able to do, especially in education.
(00:43:16):
In education, the most powerful thing I could imagine doing with AI is if it can improve what I call HI: human intelligence, human potential. If it can give us all more meaning and purpose. And I don't know the odds of that happening, but I think we have to at least try. If we don't try, we're pretty much-
Eric Ries (00:43:35):
So it's definitely not going to happen for sure.
Sal Khan (00:43:36):
It's definitely not going to happen.
Eric Ries (00:43:37):
There must be lots of entrepreneurs, would-be social entrepreneurs, educators watching this. What would you like them to do? If you could wave a magic wand and say, "Look, get off your couch. Stop hand-wringing. Instead, do this." What do you want?
Sal Khan (00:43:50):
Well, I encourage everyone to start using the things that they're afraid of, whatever the thing is, whether it's AI or not. But since we're talking about AI: just start using it. Use it in creative ways, try out new tools. I'm encouraging everyone on the Khan Academy team. A lot of our team has embraced using AI, even for their day-to-day jobs. But some people don't, or haven't thought about how to use it. And I'm like, "Look, it's going to be good for you and good for the organization if you take a little bit of time. It might slow you down a little bit initially to learn the new tools, but move ahead, because if you don't, especially with what we're in, the world might pass you by." And there's a saying that's gone around the internet: you won't be replaced by an AI, but you could be replaced by someone using an AI. That's what I would encourage folks to keep in mind.
(00:44:35):
And if we think broader than AI, I encourage a lot of folks, and I don't want to be preachy. It's not like I've figured out how to live the best life, etc. But just always take stock of what you really want and what you really want your life to stand for. Don't be ashamed if you have certain material needs, but if they get beyond a normal level, question whether you really need those things, whether you're chasing things that you think will fill your soul but won't, and whether you're ignoring the things that actually will fill your soul.
Eric Ries (00:45:10):
I've wondered about that, because I've looked at those same studies you were talking about before: the marginal returns to extra income cap out pretty fast. And yet I still meet people who are making far more money than those thresholds, doing something either that they hate, or that they know or suspect is really bad for the world, or that is kind of contrary to their values. And yet they get stuck. They can't get off that treadmill.
(00:45:35):
I've often wondered if we should have a philosophical concept of a maximum ethical salary. You're not required to observe it, no law about it, but just saying: look, if you're making money to feed your family and you're doing something you hate, it's okay. We like to moralize so much about people who have less money, less power. But the people I'm thinking about, I know so many people who have way more money than they need. And if they were building something that was really mission-driven, if they're making the money because profitability is really lined up with making the world a better place, then I think, go make as much money as you want.
(00:46:08):
But if you're not able to do that, we need to have some way to say, "You know what? I think you should make less, take some time off or only work three days a week, or find a way to go work at Khan Academy." Find a way to be doing the things you actually care about that are reflective of your values instead of chasing the absolute most money that you can make. Does that resonate with you?
Sal Khan (00:46:32):
I wouldn't call it a maximum ethical salary. I'm not against anyone making whatever the market's willing to pay them for their time. And many of those people then go on to be donors to Khan Academy. So to some degree, the only reason philanthropy and Khan Academy can exist is because some people are making much more than they need, and they're able to give some of that back to other folks.
Eric Ries (00:46:54):
That's a lot better than buying a yacht, that's for sure.
Sal Khan (00:46:57):
No, that's right. That's right. I think if someone gets paid a billion dollars, $10 billion, whatever, to do something, God bless them. That's awesome. It's much more that I wish there were ways in our culture of allowing people to reflect better on that balance, and then giving them permission.
(00:47:19):
I have a very close friend who's a very senior person at a tech company out here in Silicon Valley, making... He lives the way I live. He would be very happy making probably a third of what he makes, but he's in meetings from 7:00 AM to sometimes midnight, not seeing his family as much as he'd like. And I sometimes talk to him. I'm like, "Look, you're a senior person. Why don't you just cut some of those meetings out?" But he can't, or he doesn't think he can, because he thinks it would reflect badly on him, or if he said he wanted to work four days a week, he just can't. In theory, from the company's point of view, it would be a good deal if they could keep him at his salary and he could work three days a week, because it probably still-
Eric Ries (00:48:09):
Could probably get as much done. Yeah.
Sal Khan (00:48:13):
Probably get 100% of the output, and probably he'll stay at the company longer. So a lot of times at Khan Academy, and I don't want to claim that we have it all figured out, but when we think about what we're doing for our team, I literally tell our team, "Unless it's an emergency, you really shouldn't be working in the evenings. And I don't say that just to be a nice guy. I say it because I genuinely think you're not going to be as productive. You're not going to be as creative, and then it's going to water down what you do during the day, or you're going to burn out and you're not going to last here as long."
(00:48:47):
So we've gone full remote, which is a very non-standard thing to do as an organization. But every time I go hang out with, well, anyone, 80% of the people are complaining that they have to do face time in the office two or three days a week. And I say, "Why?" And they're like, "Yeah, I don't get it, because most of my team is in another city anyway."
Eric Ries (00:49:09):
Right? Because my boss told me I have to do it.
Sal Khan (00:49:10):
My boss, or my boss's boss, or their boss's boss gets some type of psychic reward from seeing people every day in the parking lot and seeing the traffic come into the headquarters, whatever it might be. But I'm like, "Well, if we can give people a little bit more of their lives back and a little bit more of their time, they're going to be more energized."
(00:49:28):
And this might sound like touchy-feely nonprofit stuff, but I would take our team head to head with any team in Silicon Valley and compare the output, compare the productivity, the agility of it. And so I think too many times, especially in Silicon Valley or Wall Street, people think it's a tension between your life and work, but it doesn't have to be.
Eric Ries (00:49:48):
Yeah. People think the way you maximize productivity is you squeeze.
Sal Khan (00:49:51):
Exactly.
Eric Ries (00:49:52):
You exploit to the maximum degree-
Sal Khan (00:49:54):
And then you chain them to golden handcuffs, and it's so psychologically damaging for you too.
Eric Ries (00:50:02):
We talk about this a lot in some of these interviews that every time you act in that way, some startup somewhere is high-fiving. You just created a liability, a competitive disadvantage that they're going to be able to poach your talent. They're going to be able to take advantage. Whatever you did that you're ashamed of, you might be able to hide it for a certain amount of time. But the damage to your trustworthiness, the damage to your reputation, that's creating competitive openings for somebody else.
Sal Khan (00:50:26):
And that's true. Although, sometimes startups need to be in this "we're going to work 24/7" type of mindset. But sometimes they don't, and just out of pure fear, they act that way.
Eric Ries (00:50:40):
Oh, I've been there. Listen, the code you write at 4:00 AM is not very good. You're going to spend four hours the next day taking it out, or someone else is going to two months from now. It's very nerve-racking.
Sal Khan (00:50:52):
I give a lot of credit to my former boss at the hedge fund, which is not where most people would think this philosophy came from. But I remember when I started working at Wohl Capital Management. My boss's name was Dan Wohl. It was just me and him; he was the portfolio manager. I was right out of business school, this was 2004, and I remember that first week, it was like 4:30 in the afternoon. We were in Boston, and he's like, "Sal, you should go home." I'm like, "Really? It's 4:30." And he's like, "The markets have closed." And I'm like, "Okay Dan, I'll go work from home." He's like, "No, no, no. I don't want you to work in the evenings. I don't want you to work on weekends." I'm like, "Really?" Because it completely goes against the stereotype of finance or hedge funds.
Eric Ries (00:51:28):
Yeah, for sure.
Sal Khan (00:51:29):
And he's like, "Sal, our job as investors is to make a few good decisions every year and to avoid making a bunch of bad ones. And the worst decisions happen when you overanalyze things, when you do it in your unproductive time, when you get groupthink, when you don't have another life. I want you to go do other things."
(00:51:47):
And that, frankly, was what gave me permission to tutor my cousins back in the day, which led to Khan Academy. But I believe we were actually better investors because of that. Dan's retired now, but if you look at his returns, they would rival Warren Buffett's, anybody's, in terms of how good an investor he was. And he taught me that you can have a life. In fact, having a life can make you a better leader, thinker, and decision-maker.
Eric Ries (00:52:17):
Yeah, I love that. It's such a critical thing to understand that getting the few things right is so much more important than how many hours you log. The decisions you make under duress, when you're tired on your 90th hour of the forced death march, those are not good decisions.
Sal Khan (00:52:32):
And you stress other people out. I have never-
Eric Ries (00:52:36):
You're putting trauma in the world instead of healing.
Sal Khan (00:52:37):
I have never written a good email at 10:00 PM. I now reflect: I have never once written a good email at that hour. Every one, in hindsight, I'm like, "Oh, I wish I'd said that a little differently." I usually wrote it from a point of stress, stressed out a bunch of other people, worked them up, and wasted a lot of time. Never a good email. Now, if I do want to write an email when I'm worked up about something, I use delayed send, and nine times out of 10, I wake up in the morning and delete it before it gets sent.
Eric Ries (00:53:03):
I really remember that feeling of inbox anxiety when it's like, "God, I'm afraid even to check my email." I don't know what kind of crazy missive my boss will have sent me in the middle of the night when they were stressed out about something else, taking it out on me. So yeah, I think that's good advice. All right, you got time for a lightning round?
Sal Khan (00:53:19):
Sure.
Eric Ries (00:53:20):
Okay. Just a couple quick ones, because you've given so many great quotes over the years and put so many great ideas into our collective consciousness as a society. One is, you said you teach the way that you wish you were taught. What does that mean?
Sal Khan (00:53:36):
One, there's a certain notion of respect. I've always enjoyed teachers who didn't treat me as inferior to them. They treated me as their equal, just someone who doesn't understand it yet. So that tone is very, very important.
(00:53:54):
I have also always responded to folks who don't say, "Look, just take my word for it that A leads to B," who are open to questioning, who can give you the intuition. I've always responded to people who are willing to be very vulnerable and transparent with their thinking: "You know what? I had trouble learning why A leads to B. These are the questions that I had. And honestly, I'm still a little unsure whether it's always B or not." That opens things up.
(00:54:20):
And then the last thing I would say is if the instructor or the tutor is not truly passionate and excited about what they're doing, there's no way that the student is going to be. And if they're not relaxed and having fun, then there's no way that the student is going to be relaxed and having fun. So yeah, that's it. Focus on the intuition, be approachable, have fun, and speak as an equal.
Eric Ries (00:54:47):
You also, I think in the book, talk a little bit about an idea, I think again attributed to Plato, that learning under compulsion is wrong. You shouldn't try to force people to learn things. They should want to bring their minds to it on their own. We have a very bizarre inheritance in our civilization of hierarchical learning methods where the student is passive, and we have this factory metaphor of cramming them full of knowledge and stamping them with a certification out the door. What does it mean to you in a deeper way? Why should people not be forced to learn? What's the alternative?
Sal Khan (00:55:23):
Yeah. Well pragmatically, the ideal is someone who is hungry for knowledge and you're feeding that hunger. But I do think that some compulsion might be necessary, just as a parent. And also at a school I helped start, which I'm now the chairman of, you definitely need to create some guardrails and some nudges for students. And a lot of what we're doing, even on the AI work, we realized that a great tutor doesn't just wait to be asked. A great tutor nudges students forward and is proactive. That's what I was doing with my cousins 20 years ago. I would call them up, and I was like, "Look, I'm taking out the time. You need to take the time." Or I would sometimes call their parents. It's like, "They can't go to this party unless they do this with me." So that was a little bit of compulsion.
(00:56:08):
But if it's nothing but compulsion, then I think something is broken. As I point out, I have never met a four-year-old who is not curious. If you think about the reason why most four-year-olds will ever throw a tantrum, it's because they aren't allowed to explore something. So when people say, "Oh, that's good for curious or motivated kids," I'm like, "Well, all kids are in that category." So there's something about these classrooms that are based on this factory model, where you have to be passive, finger on lips, and you're not allowed to interact. Most people learn by talking and doing, and you're literally forbidding that in a classroom. That's where I think you really stamp out a lot of people's curiosity. Compulsion is okay in moderation. I think the most important thing is that you should be trying to do active learning as much as possible, not passive learning.
Eric Ries (00:57:00):
Yeah, and nurture people's curiosity rather than stamp it out of them, even if it's slightly inconvenient to you, the adult authority figure.
Sal Khan (00:57:07):
That's right.
Eric Ries (00:57:08):
Yeah.
Sal Khan (00:57:08):
Well, it's the whole "don't ask me a question" attitude. I mean, what a horrible signal. And the best is, "You should ask me a question, and I should be very honest with you if I don't know the answer."
Eric Ries (00:57:18):
Right, right. Teachers should be modeling that that's actually a great thing. One of the most important skills is to be able to admit when you don't know something.
Sal Khan (00:57:24):
Exactly.
Eric Ries (00:57:24):
What an opportunity to model what real humility and lifelong learning look like, instead of this more authoritarian model. All right. I'm going to ask you this last question. You talked about how, when you were making the transition from the hedge fund into Khan Academy, you thought to yourself, what would great characters in literature do? What would they do in a situation like this? That was a way of thinking about it that I thought was really interesting and really literary. You mentioned Hari Seldon, but of course my mind immediately went to Dune and Frank Herbert. I guess because Dune's been having a bit of a cultural renaissance these last couple years, which has been great. Obviously, a book that meant a lot to a lot of us growing up. But I really like the idea of incredibly epic actors and institutions created on a very long-term horizon, which is so much a part of that story. I'm curious, I know you've talked about that being a book you really liked. Tell me a little bit about the characters that you wanted to be like when you were thinking about trying to create something really long-term. And did I get it right, is Dune an example of the kind of thing you had in mind?
Sal Khan (00:58:23):
Yeah. Well, I'm afraid to say that I'm inspired by Dune because I don't want to claim that I'm a messiah of any kind.
Eric Ries (00:58:28):
Well, and the dangers of that are also really interesting. Yeah, I think that's right.
Sal Khan (00:58:34):
But in any story that we all love and that we identify with, the characters are surrendering themselves to a mission much larger than themselves. And there is something very profound that happens. And look, I spend a lot of time still thinking about, okay, my kids' college, and the number that I need in my bank account for me to feel secure, which keeps going up, probably irrationally. But I just keep reminding myself that life is short. We're all going to die, probably sooner than we expect, or in ways that we don't expect.
(00:59:21):
And who cares about all this other stuff that we spend most of our time fixating on? We're all protagonists in our own story, and what do we want that story to be? It would've been a pretty boring story if Bilbo Baggins only thought about his 401(k) the entire time. He went out of his comfort zone, and adventures are uncomfortable. That's what I remind myself when I have moments of stress. I say, "Look, this is what makes it an adventure. Enjoy it." It has to be within reason, and don't tie your identity to the outcome. Just try to do what you can and let the chips fall where they may. But I encourage people to think that way, because we all love those movies, but then sometimes we won't live that way.
Eric Ries (01:00:11):
That's a great note to end on. Thank you for making the really epic choice, because think how different the world would be now if you had decided to stay at the hedge fund. I'm really glad you chose to do it the way that you did. And yeah, on behalf of, what is it, 170 million users, thanks for doing what you're doing.
Sal Khan (01:00:28):
Oh, no. Thanks for having me, Eric.
Eric Ries (01:00:30):
You've been listening to the Eric Ries Show. Special thanks to the sponsors for this episode, DigitalOcean, Mercury, and Neo4j. The Eric Ries Show is produced by Jordan Bornstein and Kiki Garthwaite, researched by Tom White and Melanie Rieback, visual design by Reform Collective, title theme by DB Music. I'm your host, Eric Ries. Thanks for listening and watching. See you next time.