53: Inadequate Equilibria with Quinn Lewandowski

Hosted by
Will Jarvis

How do you find secrets in the world? I got to talk with Quinn about what an inadequate equilibrium is, how the EMH is a poor proxy for thinking about the world in general, and how to think about low-hanging fruit. We discuss the book Inadequate Equilibria by Eliezer Yudkowsky.


Will Jarvis 0:04
Hey folks, I'm William Jarvis, and along with my dad, Dr. David Jarvis, I host the podcast Narratives. Narratives is a project exploring the ways in which the world is better than it has been, the ways it is worse, and the paths toward a better, more definite future. I hope you enjoy it.

If you enjoyed this episode, please subscribe. You can get on our mailing list, and find show notes, transcripts, as well as videos, at narrativespodcast.com. Thanks. How are you doing?

Quinn Lewandowski 0:43
Doing pretty good, you?

Will Jarvis 0:44
Doing quite good.

Unknown Speaker 0:46
Can’t complain.

Will Jarvis 0:49
Wanted to get together and pick your brain today a little bit about a book and a concept I've found quite interesting, especially in the past couple of weeks, as I've been thinking about it. And that's the concept of inadequate equilibria.

Will Jarvis 1:11
So, to get started: what is an inadequate equilibrium?

Quinn Lewandowski 1:16
I think it's Yudkowsky's term, "inadequate equilibria"; neither of those words means much by itself here. In that context, equilibria are adequate to some particular task. An adequate equilibrium is one in which low-hanging fruit, up to a certain level, is being picked. So in the stock market, if you are aware that a stock is underpriced, if you have information, you can make money off of that in a way that corrects the price, because you can buy a bunch of shares of the underpriced stock, and that will cause the price to go up until it hits the... sorry, let me reorient. In general, certain structures have a property wherein there is no free energy. So in the stock market, if you are aware a stock is underpriced, you can buy the stock, and that will cause the price of the stock to go up, and so you can make money off of it. And so the stock market sort of resembles a busy sidewalk, in that you wouldn't expect to see $20 bills lying on it. I feel like I'm not explaining this very well.

Will Jarvis 3:09
That's right. Yeah. So let's start with the sidewalk example, the classic econ example, I guess. So let's say you're in Penn Station. There are tons of people, you know, thousands of people walking through every day. And you see a $20 bill on the ground. You'd think, even though not a lot of people are looking down, generally somebody will pick up that $20 bill pretty quickly.

Will Jarvis 3:36
It would be really weird if it stayed there for very long. Yeah. Because, you know, it's kind of like public information. Everyone can see the same sidewalk. Everybody knows it's a $20 bill, and if they pick it up, you know, they'd benefit. So, the stock market example you're giving. The interesting thing about the stock market is, you know, there's a lot of public information, and there are a lot of actors competing. And so, like you said, if we see Microsoft stock at $10, and we look at how profitable they've been and determine the stock should really be $20, we can buy Microsoft stock and help correct that price. Right? Because, you know, there are a lot of sophisticated people in the stock market trying to do this, trying to make money.

Quinn Lewandowski 4:33
Yes, all

Will Jarvis 4:34
the information is relatively publicly available. You know, there are special cases, but the SEC, the Securities and Exchange Commission, really tries to make sure people don't, you know, make money with private information and things like that. So, you know, it would be difficult to make a profit, or make a lot of money, in the stock market by correcting prices.

Will Jarvis 4:59
Because all the profits would get competed away.

Quinn Lewandowski 5:00
Yes. Because somebody else who knows what you knew would have already factored that in. Yeah, that's right. Given the number of people involved, and the fact that the incentives are aligned so that you can actually make money by correcting the prices. That's right. So an adequate equilibrium is a situation where the system is adequate to correcting the prices, to consuming a certain amount of, Yudkowsky uses the phrase "free energy," which is an abstraction I have trouble putting into other words. Yeah. But generally, the idea is: if there is something that a lot of people want, and they are all in a position to get it, then you should expect them to have already gotten it, the same way you don't see $20 bills lying on the sidewalk when you take a walk.
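
The dynamic Quinn is describing, where any known mispricing attracts trades until the predictable profit is gone, can be sketched as a toy loop. The numbers and the update rule here are invented for illustration; they are not from the episode or the book:

```python
# Toy model of market self-correction: once traders know a stock is
# underpriced, their buying pushes the price toward fair value, and the
# predictable profit (the "free energy") shrinks toward nothing.
fair_value = 20.0   # what the stock is "really" worth
price = 10.0        # the current, underpriced level

trades = 0
while fair_value - price > 0.01:          # an exploitable gap remains
    price += 0.25 * (fair_value - price)  # each trade closes part of the gap
    trades += 1

print(f"price ~= {price:.2f} after {trades} trades; remaining edge under $0.01")
```

After a couple dozen simulated trades the gap is under a cent, which is the sense in which the "free energy" gets consumed.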

Will Jarvis 6:03
So like, if we're walking down the sidewalk, and a lot of people are walking on the sidewalk, and everyone can bend over and pick it up, you know, we expect it will get picked up. It'd be weird if it didn't get picked up.

Quinn Lewandowski 6:13
Yes. Similarly, at the supermarket, you usually can't do very much better by going to try to find a new lane. Sometimes you find one that's just opening. But in general, if the lanes are open, then other people will have been looking to find the shortest line, and so you can't reliably jump ahead by switching lines at the supermarket.

Will Jarvis 6:41
So, you know, you're at the grocery store, and you want to check out faster. You obviously want to get home; everyone's trying to get home as fast as they can to do whatever their higher-value activity is. Yeah. Everyone can see how many people are in each lane, so it's all publicly available information. So unless you have some advantage, it's difficult to do better. Yeah, than,

Quinn Lewandowski 7:04
you know, because there are a lot of people involved, and they know what you know, so you don't have a structural advantage. And I remember reading an argument that the returns to looking for a faster lane should always be just enough to compensate the very busiest people, who have the most incentive to look for a shorter line. Gotcha.

Will Jarvis 7:30
So this is just like in the stock market. Yeah, there’s enough incentive to pay the professionals. Yeah, more than their next best opportunity.

Quinn Lewandowski 7:38
Yes. So in the stock market, you end up with stocks that are not predictably underpriced or overpriced, provided you can short sell, right? With sidewalks, the resource is money; but, you know, it could also be collectible baseball cards, right? With the supermarket example, it's time. You can't get more time by looking, except by squeaking out some of the margin. And an inadequate equilibrium is an equilibrium that fails to achieve a particular level of this. So Yudkowsky uses the example of a treatment he looked into for his wife. She has very bad seasonal affective disorder, and he did some research and thought he had come up with a new therapy that would help her, since the existing therapies were not working. Right. And he's not a doctor. So, from a certain perspective, you wouldn't expect him to be able to do that. You would think that any idea an amateur doing research could find, doctors in the system would find, because there are a lot of doctors,

Will Jarvis 8:58
right. So I love this example. I really love this example. It was in Yudkowsky's book, Inadequate Equilibria. So the example is: Yudkowsky, the author's, wife has really bad seasonal affective disorder, which is where you get depressed because you're not exposed to enough sunlight. And so he started looking into this, and one of the options was to send her to South America, and she flies down and spends the winter there. That's really expensive. So he started thinking, you know, what else could I do? And he looks into it, and he looks at traditional light boxes. That's the traditional therapy: you put out a lot of light and look at it, and that helps. It mimics the sun; that's what you're doing, getting more sunlight. But he noticed, you know, maybe this is not bright enough. And so he goes out and he gets these super-powerful LEDs, like 10 of them or something, just incredibly bright lights, strings them up all throughout the apartment, spends about 1,000 bucks, and suddenly, you know, his wife feels much better all the time. Yes, no more suicidal ideation; doing a lot better. Which is really weird, because there are physicians, researchers, MDs, PhDs, who are selected for being the smartest, very conscientious people, who work on seasonal affective disorder exclusively. You know, why was Yudkowsky able to beat the researchers at their stated aim of making a treatment for seasonal affective disorder that worked, that actually worked?

Quinn Lewandowski 10:45
Because the system was not adequate to that level, which is kind of just rephrasing the fact that he could. And the reason is going to come down to misaligned incentives. Because you have a lot of people in the area who are smart, and this is not a very obscure idea, in that, you know, if some light helps, you would expect more light to help more. This is fairly, fairly common sense. And yet it looks like nobody thought of it. And this tells us that no one was incentivized to think of it. Right. Because of the number of people involved: if you have a committee at your workplace, and they are incentivized to think of good ideas, maybe those four or five or twelve people just won't think of the idea. If you have hundreds or thousands of people in the system, then you can be fairly sure they'll think of an obvious idea.

Will Jarvis 11:50
And these are smart people too. Yeah. I think that's what's good about this example. I mean, these are experts in this particular field. And how many years of education: four years undergrad, four years of medical school, four years of residency, you know, exams, research fellowships, whatever.

Quinn Lewandowski 12:09
And if he had had a dream where a leprechaun told him to mix together certain specific chemicals, and he had done it, and it had worked by coincidence, that would be one thing, right? But he tried more of the thing that we know helps: bright light.

Will Jarvis 12:30
Just turned it up.

Quinn Lewandowski 12:33
They are smart people, and they are conscientious people, but we don't even need them to be smart or conscientious. So it's just very clear that this system is not properly incentivizing people to attain that level of rigor. Because it isn't that it takes a creative leap; it's that they're not looking carefully enough to find that treatment. And it's a really obvious treatment. So

Will Jarvis 13:09
I want to interject. So, Quinn, is this what's going on: are SAD researchers, seasonal affective disorder researchers, all these MDs, PhDs, are they optimizing on something other than curing seasonal affective disorder? That is the stated aim, but their actual aim is not exactly the stated aim; maybe they're optimizing for publications in prestigious journals.

Quinn Lewandowski 13:37
Yes, or

Will Jarvis 13:39
something else?

Quinn Lewandowski 13:40
Yep, institutional power, job security,

Will Jarvis 13:43
that’s great. Ah,

Quinn Lewandowski 13:46
Then sometimes you get evolutionary dynamics, wherein the system itself selects people. So Pournelle's iron law of bureaucracy, I've heard it phrased a couple of different ways, but it says that each organization is made up of some people who pursue the stated goal of the organization and some people who try to get more power within the organization, right? And if you wait long enough, the people who are trying to get power will end up having more power. And so the organization becomes increasingly dominated by people who are trying to do things that are not the stated aim of the organization.

Will Jarvis 14:23
So the bad money crowds out the good, or something.

Quinn Lewandowski 14:28
Yes. Well, think of it like an evolutionary dynamic, in that there is something selecting the people who are going to have power: their desire for power. Gotcha. And because that's being selected for, it eventually crowds out the other thing, even if that other thing correlates with it. When you read AI people, they talk about programming in a goal that correlates with the intended goal very well, but not perfectly. And it turns out that optimizing hard enough for the proxy decouples the proxy from the original goal. Goodhart-style.

Will Jarvis 15:15
Can you say that one more time. So

Quinn Lewandowski 15:18
So, I've seen this worked through with a bunch of different examples, and it's hard to distill all of them down, because you keep feeling like you have to add more details, and it keeps not working. But say you think humans being happy is good. I think that's not too controversial; hopefully most of our listeners would agree. So you program a powerful AI to make people happy. If it's fairly weak, it might try, I don't know, giving them compliments, connecting them with potential friends or romantic partners. But once it becomes strong enough, the way to make the humans happiest is to administer them all heroin.

Will Jarvis 16:02
So the robot just runs around, like, stabbing people, trying to put in IV fentanyl or something. Like, this is the best thing.

Quinn Lewandowski 16:11
Yeah. Well, you see, happiness correlates with what we want, but it's not all of what we want, because we don't want to be on heroin for the rest of our lives. Right, right. And it turns out that if it pursues our happiness, which is usually very good... happiness is a proxy for whatever we really want, and pursuing that hard enough, optimizing hard enough on that metric, decouples the proxy. The tails come apart, right? And this is significant because a market, or a structure like a market, which involves a lot of smart people pursuing the proxy, will reliably at least start to decouple the proxy. With sufficiently strong optimization it will definitely decouple the proxy, but humans aren't AIs.
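
Quinn's point, that optimizing hard on a correlated proxy makes the proxy and the true goal come apart at the tails, can be sketched in a few lines of Python. This is a toy illustration of my own, not anything from the episode or the book: we score candidate actions on a noisy proxy of their true value, then compare mild selection with extreme selection.

```python
import random

random.seed(42)  # deterministic toy data

# Each candidate action has a true value (what we actually want) and a
# proxy score (what we measure): correlated with the true value, but noisy.
actions = []
for _ in range(100_000):
    true_value = random.gauss(0, 1)
    proxy = true_value + random.gauss(0, 1)
    actions.append((proxy, true_value))

actions.sort(reverse=True)  # "optimize" by ranking on the proxy

# Mild optimization: the top 10% by proxy really are well above average.
top_decile = actions[: len(actions) // 10]
avg_true = sum(v for _, v in top_decile) / len(top_decile)

# Extreme optimization: the single highest-proxy action owes much of its
# score to noise, so its proxy badly overstates its true value.
best_proxy, best_true = actions[0]

print(f"top 10% by proxy: mean true value {avg_true:.2f}")
print(f"best by proxy: proxy {best_proxy:.2f} vs true value {best_true:.2f}")
```

Mild selection on the proxy still buys real value; it's the single most extreme pick where the decoupling shows up.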

Will Jarvis 17:06
Right. This is very interesting. It's really important. And I feel like part of this is why, conceptually, startups can even exist at all. Yes. Like, it doesn't even make sense. Microsoft, you know, makes billions and billions of dollars in profit. Why don't they just do anything that's software related? Why don't they just make these things happen?

Quinn Lewandowski 17:34
Well, they're moral mazes. I mean, they have lots and lots of levels of middle management, which means they have lots of people optimizing for proxies that are decoupled from the things they're actually supposed to optimize for. Right. And, you know, there are also communication difficulties, coordination difficulties, right? Startups are more nimble, is the conventional wisdom, and I think that's valid: you have a handful of people who really care about

Will Jarvis 18:04
Is this also something, is there something else going on? This is maybe orthogonal and less important, but where startups, you know, in the beginning, the founders, even small teams, it could be more than one person, but I want to think of them as monarchies in some sense. So they can, like, get rid of anyone that is optimizing on power, not on the stated mission.

Will Jarvis 18:30
And then you slowly have this degradation into oligarchy over time, where, you know, let's say the founders leave, and then suddenly it's people that are optimizing on power, not on the stated mission, and it just goes down the drain over time. Yeah, entropy.

Quinn Lewandowski 18:48
You start with a benevolent dictator, right, and you end up with an oligarchy, or a bureaucracy run by bureaucrats. And, you know, bureaucracy has a negative connotation, and it's worth noting, I think, that this is where that comes from. I mean, if bureaucracies were perfectly aligned to incentives, they would get things done very efficiently. Right. But as a general rule, they're not; we suck at that.

Will Jarvis 19:18
Well, they're perfectly aligned to the wrong incentives. Yes.

They are perfectly aligned to the wrong incentives and not the incentives we want them to be aligned to.

Unknown Speaker 19:30
So I think this is really important. Because, even, like, I remember

Will Jarvis 19:38
in econ 101, you know, the first thing you model is perfect competition. Yeah. Probably because it's really easy to model, and so we just hammer perfect competition. You know, there are monopolies too, and oligopolies, and all these things in between. But they talk a lot about perfect competition, and this is where all the profits are competed away.

Will Jarvis 20:06
But that's not exactly how the world works everywhere. And maybe that's orthogonal, I'm not exactly sure where I'm going with that. But just the sense that when we think of things that work well, like the stock market, we think that transfers to everything, when oftentimes, very oftentimes, it doesn't. I think the seasonal affective disorder example is really powerful, because huge gains came from relatively little: $1,000, and someone sitting there thinking, what if we just had more light? It wasn't hard to think of. I don't know, just think about why people miss things, or are incentivized to miss things. And also the difference between stated goals and actual goals, and the stories we tell ourselves about that, I think are important to keep in mind. I know it's a web of ideas. There's a lot going on

Quinn Lewandowski 21:05
there. Yeah. It's a good web of ideas; there are a bunch of relevant ideas in it. And I think part of what I was saying about the iron law of bureaucracy was using it to introduce the idea that sometimes nobody in the system has made the discovery, like picking up a $20 bill. And sometimes people may have made the discovery, but there are selection effects in place such that it doesn't reach the public, doesn't become part of the public record. So it wouldn't surprise me if some doctor somewhere had made the same discovery Yudkowsky did. Because, as you know, "more light" is pretty obvious, right? So there's some ambiguity as to whether the system didn't let anyone discover it, or didn't propagate the knowledge once they had. Because now that Yudkowsky has, I don't think the health care system is going to adopt it, because he's not properly credentialed. Right. And it's worth noting, it's not obvious in advance that when someone discovers a medical treatment that works really well and helps people, the health care system won't adopt it. Right.

Will Jarvis 22:27
It's quite worrisome. Yeah. On many levels. I mean, it's quite worrisome on just so many levels. I think it's probably the case, right, if you explicitly asked a couple of SAD researchers, I don't know, it'd be really interesting to call them, like, if I called one up and said, hey, what if you just cranked the light up? And they might be like, yeah, that could work, but I can't get published on that axis, or there's something else going on, which is really bizarre.

Quinn Lewandowski 23:01
My suspicion, which is not enough to explain it, but I think it's probably involved, is lawsuits and risk. Because it is pretty obvious: it's more of the thing that works. I know I keep saying that. But the fact that that discovery hasn't been put forward implies to me that there's some significant cost. Like, I have a difficult time believing that fewer than 1% of researchers working in this field would think of it, right? I respect Yudkowsky; Yudkowsky has loads of non-obvious ideas, but this is a really pretty obvious idea.

Will Jarvis 23:48
It's almost something where, if you asked, like, a kindergartener, what do we do? It's like with the pandemic: all these people coming in, what should we do? Well, maybe they should quarantine for 14 days before they come into the country. I don't know. Like, it's not a high bar. It's not a high bar. It's "more light." Right.

Quinn Lewandowski 24:08
Yeah, it is. Which to me suggests they're not just unincentivized to discover it; there's some punitive factor on the table. Right. And, you know, I think if they tried it and something went wrong, they would probably not be legally liable. But if you're a doctor and you only do things you will probably not be legally liable for, eventually you'll get one of those wrong, right? It will be bad for you.

Will Jarvis 24:41
You wouldn't play Russian roulette with that kind of thing. No.

Quinn Lewandowski 24:44
Not with your career, yeah. I think one of the things underlying the apparent stagnation is risk aversion and lawsuits. I don't have the exact relationship between the lawsuits and the risk aversion figured out; it seems a little bit like a chicken-and-egg thing, where they cause it, but they're also caused by it.

Will Jarvis 25:06
Right? Yeah. So, I don't know, here's one thing: when people talk about risk aversion and stagnation, I do wonder, are people really more risk averse than they were in the past, at some attitudinal level? Like, that's my question, and that's weird to me. But then I wonder if it's almost a compounding thing, and I don't know how it gets started. But maybe, you know, if the world is more static, you become more risk averse. And then, say the consequences for taking risks are increased, you have fewer chances. I don't know, does that make sense at all?

Quinn Lewandowski 25:56
Yeah, it does. Partly, if the world is more static, then it's easier to feel like you don't have as much to gain, and it's easier to sort of internally stagnate, to get into a rut, psychologically. I think Zvi Mowshowitz has an essay on asymmetric justice, and I'm sure that's a piece of it. That's where you are to blame for unforeseen bad consequences of your actions, but you are not symmetrically credited with unforeseen good consequences. And the result of that, over time, is to disincentivize all action.

Will Jarvis 26:46
Is that something where, like, you are to blame if you throw the switch in the trolley problem, but you're not really to blame if you let it happen? Yeah, so more people die, but, you know, you didn't cause someone to die.

Quinn Lewandowski 26:58
Some of that. But also, you know, if you throw the switch in the trolley problem and you save more lives, and that's allowed to count against the people you kill, then you'll be incentivized to throw the switch anytime you're saving more people than would otherwise be killed. So, I think of it like this: we just had some people develop vaccines for the pandemic, right? And I don't think they're going to capture a tenth of the excess value from that, of what people would be willing to pay. And, you know, it makes sense that you're going to get people who are much more reluctant to take risks if they are responsible for the risks but only get to reap part of the gain. Right? Well,

Will Jarvis 27:43
that would actually be super interesting. I don't know, I should look this up, but I would be really curious to know how much the scientist who came up with the first mRNA vaccine, how much that woman got paid. Yeah. I guarantee it wasn't 10% of the net value. But if it had given everyone Guillain-Barré, you know, some QAnon supporters would be hunting her down with rifles, which is, like, severe punishment compared to the upside, and there's just a little upside. Yeah, I don't know. But I don't know how to fix, you know, scientists having upside and things like that.

Quinn Lewandowski 28:33
No. I think sometimes it's really hard, and other times it's a little easier. Other times people are punishing them for things they really couldn't have foreseen, and it does seem to me it would be good for us to stop doing that. Even if we're not going to reward them symmetrically, which, to the extent the problem is psychological, won't fix it, it might help. Because once you start looking for this asymmetric justice stuff, you see it just about everywhere. People are to blame for bad things that happen, and they are not symmetrically credited with good things that happen. And so if you're a SAD researcher, and it occurs to you that this treatment might work, you know that if you give someone skin cancer, you have a very good chance of being sued and having your insurance go up. Right. And if the treatment succeeds and it helps, you aren't going to get symmetric gains from that. Right. And so anyone who cares about their career will get out of the habit of doing that sort of experiment, and to some extent even looking for that sort of experiment, right? So I think that's enabling some degree of the psychological effect, because we're selecting very heavily for people who don't take unnecessary risks. Right. But I also think there might be a thing where, you know, humans use role models. So if all of the people who succeed in the field don't take unnecessary risks, where "unnecessary" means unnecessary to them, but sometimes very, very good for people as a whole, yeah, then, you know, a lot of people are going to internalize that that's how you do the job.

Will Jarvis 30:34
Right. Yeah. And that actually reminds me of, this is a very light example of this, but I think it's quite illustrative. Do you remember, it was some years ago, there was an econ blogger who couldn't get tenure, or really had trouble getting placed in an academic job, because they had a blog?

Quinn Lewandowski 31:01
Yeah, Scott wrote about that.

Will Jarvis 31:03
Yeah. I just want to read that, you know. And that's not even... yes, it's writing a blog, right? You know, it's like that tiny norm violation is enough to

Quinn Lewandowski 31:18
Right. And it's interesting, looking at the pattern of norms. Because I'm not sure it always would have been a norm violation. Or, I'm thinking more of decades past, and, you know, in decades past you couldn't have done a blog, right? But I think there is an asymmetric incentive in terms of mobs, crowds. A mob can take things from a minority; I'm thinking of, you know, the nightmarish pogroms and stuff, but also much more mundane stuff. But a mob can't profit by coming together to bestow gifts. So you have hate mobs, but not love mobs. And so in a certain kind of environment, you're not going to be rewarded for being singled out, but you have a strong incentive to be illegible and invisible. Benjamin Hoffman has an essay, Scapegoat Games, where he's talking about how in our current society being singled out is presumptively bad most of the time. And so having a blog makes you stand out. Right. And if I cross that with Zvi Mowshowitz, which I think he would be okay with, because Zvi Mowshowitz thinks that, to a large extent, people are habituated to moral mazes, to a system in which standing out is bad, to the point that writing a blog is bad, because it signals that you're the kind of person who would write a blog, which means that you're not being prudent.

Will Jarvis 33:18
Right, right.

Quinn Lewandowski 33:21
And I do see this mentality in some places, yeah. So to some extent it's a norm violation, because it's not smart, or not prudent, not appropriate. And it's not prudent because getting singled out is bad rather than good. Right. And people have adapted to that system.

Unknown Speaker 33:52
I still am like,

Will Jarvis 33:56
There's still something missing in my mind about how, and this is kind of, it's not entirely related, but I think it is related in an important way. It's still unclear to me why it feels like in the past we were more okay with people having different ideas than we are now. In the sense, and we've talked about this before, you and I, Peter Thiel talks about most of the founders of the big tech companies having some mild form of Asperger's or something, where they can just sort of ignore social pressures to a certain extent. But I'm not clear why. Like, I don't feel like it just changes because people decide, we're going to start beating down any nail that sticks up. So do you have any thoughts on why, on how that ever gets determined?

Quinn Lewandowski 35:03
Yeah, I'm in one of those places where I know the model that we've worked out so far is incomplete, because it's changing, and I don't really know why it's changing. Yeah, I can point to things that might be helping it to change, but I don't feel like I have enough of those to actually explain it. Society is getting richer, and so adapting to that, yeah, and having more to lose, and being more insulated from disaster. But I don't think that's all of it.

Will Jarvis 35:37
Right. And I want to talk more about that, but I think it was in Scott's review of Zero to One. And I get the sense, in a world like this, where nails are beaten down, if you can keep your weirdness to yourself in some sense and not get beaten down, being weird is almost an inexhaustible resource. Like, it's a really good thing. Yeah, because there are all these asymmetric profit opportunities.

Quinn Lewandowski 36:11
Yeah, there are. And, you know, if you're neurologically or psychologically weird, a lot of the time some kinds of things are less work for you than they are for other people, and other things are more work for you than they are for other people. But you can exploit the ones you're good at, and you can delegate the ones you're not good at. Yeah. So to some extent, I think of the market like a race, kind of. I'm visualizing a race from above, so it's an unusually simple race. And being weird lets you come at it from a slant, right? Having odd goals will do that a lot of the time, if what you want is something that not a lot of other people want. Yeah. So both having a different tool set, and having goals such that you're not directly competing with people for

Will Jarvis 37:14
right. Well, yes, exactly. I think it's really important, having goals that you're not directly competing for with other people, and the more you can get away from competing, the better. But it seems like, in some instances, it's gotten easier to make real progress. Because if you're competing with a SAD researcher to create an actual cure, or, you know, a real viable treatment for severe seasonal affective disorder, you've got this immense advantage, because the SAD researchers are not actually competing with you on creating a treatment for SAD. Yes. Which is very weird.

Quinn Lewandowski 37:52
Yes. Like,

Will Jarvis 37:53
Really — when I let my mind think about this, it’s just so bizarre. I think you and I should try to get a SAD researcher on the podcast, because I want to understand: what’s the explicit story that you tell yourself about this?

Quinn Lewandowski 38:09
We might have more luck with an ex-SAD researcher — someone who’s left the academy, or at least has less incentive to play politics around the problem. Scott Alexander has an essay where he mentioned that in 2014, if you and your friends were beating the experts, that was awesome — you’d done something good. And in 2021, if you and your friends are beating the experts — well, it’s just getting kind of sad now. I think he was talking about COVID specifically, but it generalizes. So I guess that’s me returning to what I said before: it’s changing. Some parts of it are getting worse, and I don’t totally understand why.

Will Jarvis 39:06
Hmm. In some sense, then, is expertise — are experts — somewhat overrated at this point? In fields that have been around for a while? Is that fair to say?

Quinn Lewandowski 39:31
Yeah, I think so. I think human psychology alone would be enough to make it likely that experts would be overrated, even outside all this — because humans have had tribal chieftains forever, but science is relatively new, and so it seems counterintuitive to people that you should trust the physics experts about physics but not about things that aren’t physics. But I think they’re further overrated because of everything we’ve been talking about.

Will Jarvis 40:03
Right. There can be really weird incentives — like, really weird incentives. One other example: do you remember the infant formula example?

Quinn Lewandowski 40:19
Yes. Parenteral nutrition?

Will Jarvis 40:23
Can you talk about that at all? I can’t remember the exact details.

Quinn Lewandowski 40:28
I’m not sure I have the exact details either — I do have it on my iPad. So, as I recall it — and it’s another example of what we’ve been talking about — it’s a treatment for babies who are born with malformed digestive tracts, short bowel syndrome. They need parenteral nutrition, special nutrition. And we are using the wrong fat in that nutrition in the United States. This causes the babies to get brain damage and get sick, and sometimes die. And we have excellent reason to think we’re using the wrong fats: the formulation was approved by the FDA back when we didn’t have a good breakdown of the different kinds of fats — we just kind of had ‘fat.’

Will Jarvis 41:26
And the stuff we currently use, and have been using — it’s soybean-oil based; that’s what’s approved, soybean oil. And it has something to do with the omega-6 to omega-3 breakdown — and if that’s wrong, you get all these terrible developmental problems, don’t you?

Quinn Lewandowski 41:52
So, yes — I found it. It’s basically what we’ve been talking about.

So in 2012, there was a single hospital in the United States that could provide correctly formulated parenteral nutrition — Boston Children’s Hospital — and the formulation was illegal to sell across state lines. It looks like this one is on the FDA, which is the body that had to approve it. And — partly because the passage goes on for a while — I’m trying to pick out which part to read.

A doctor who gives a baby a nutrition formula that isn’t FDA-approved will lose their job. A hospital that doesn’t fire that kind of doctor will be sued. A scientist who writes proposals for a big, expensive, definitive study won’t get the grant — and while they were busy wrangling failed grant proposals, they’ll have lost their momentum toward tenure. So we have, you know, almost dominoes in reverse — everything blocked.

I think he’s using the example mostly to demonstrate that this dynamic continues to be a concern even when the stakes are life and death, and the people being killed aren’t anybody’s outgroup. Nobody wants to kill babies.

Will Jarvis 43:58
Right? No one explicitly wants this to happen.

Quinn Lewandowski 44:01
Yeah — and yet.

Will Jarvis 44:05
Well, and it’s because, you know, if you give formula that’s not approved and you’re a physician — something happens, you’re fired from your job, you’ve got to drive Uber. It’s a big status downgrade for a physician. And it keeps going up the chain, right? The FDA — for whatever reason, they’ve got these weird incentives, they can’t do it. And then the manufacturers don’t really have enough incentive to go out and do the study, which would cost God knows how much to get through the FDA. And all these circular, crazy incentives end up with us killing infants.

Quinn Lewandowski 44:49
Yeah — all the way up the chain. And at the FDA, I think asymmetric justice is at least part of it. FDA people have an incentive not to approve potentially dangerous medications, and I think they also have an incentive not to acknowledge that they’ve made a mistake in something like this, even as the evidence piles up mountain-high. Partly — I was listening, on a podcast I used to listen to, to a read-through of The Gulag Archipelago. So I haven’t actually read it, but there’s a scene in the read-aloud that stuck with me: the applause scene, where they have a moment of applause for Comrade Stalin, and it goes on, and on, and on, because you don’t want to be the first person to stop. Similarly, at the FDA, you don’t want to acknowledge that you made a mistake. The first person to acknowledge the mistake is going to get scapegoated. So even if evidence is mounting from outside, they’re not going to get rewarded for acknowledging it.

Will Jarvis 46:20
Yeah, and this reminds me — there’s a great political science paper I read, probably a year ago now, so I can’t remember all the details, but it’s about politicians apologizing. And apologizing categorically just does not work. So your incentive is actually to deny, deny, deny — just keep denying. Because, like you said, if the FDA said, ‘God dang, we screwed up, we can’t do this anymore, we need to fix it’ — it’s the front page of The New York Times, and everyone says, ‘We’ve got to fire these people.’ And the hospitals: ‘God, we’re screwing up, this is bad.’ There’s a great documentary about this — there was a fire in Romania, at a nightclub, and a lot of people died. The state health service kept all the patients in a burn unit, but there was all this crazy corruption and they weren’t taken care of properly. They should have sent them to Germany, where they have the proper facilities, but as a matter of pride they didn’t want to — ‘we can take care of them here.’ So they end up firing the health minister, and they bring in someone whose explicit goal — at least, you get that sense in the documentary — is to clean up the hospitals and admit what’s wrong. So he goes up there and says: the state hospitals are terrible, they don’t work right now, there’s all this corruption, people skimming money off the top. The cleaning agents they were using had been diluted to the point where they couldn’t prevent MRSA — they couldn’t clean the surfaces enough. And he tries to go in and clean this up, and he’s just unable to do it. I’m not exactly sure where I was going with this.
But it’s great — I want to include it in the show notes — just to see what happens when an actor who understands all these things tries to go in and fix them, and is just completely unable to. He’s a patient-rights advocate; he understands all the underlying issues. So: they’re using this cleaner, and the cleaner is not working, because the alcohol level is way too low to actually destroy any bacteria. He says, well, can we just go in and remove it from all the hospitals? And his advisor goes: no, I’m sorry, that’s illegal.

Will Jarvis 48:50
You know — ‘we’ll take it to court; can we get a court order?’ ‘No, can’t really do that.’ So they just have to keep it. And at some level, you’d think, if you’re that guy, don’t you just walk down to the hospital, get your staff, and say, we’re throwing all this out — I’ll walk down there myself if I have to? And I guess the incentive is: you just can’t get hired again if you ever do that. Is that what’s going on? Because there are points where — Fauci, you know — I’m sure there are things like this, small things we could do that would really help, but: ‘if I go against it and actually make it happen, I just get fired, and I can’t get hired again.’ You’re optimizing for staying in the job. You still want to do the best you can, but you’re optimizing toward keeping your job.

Quinn Lewandowski 49:43
Yeah. And the jobs are populated by people who’ve made that decision. You know, if you lose your job over it — you only really get to make that call once. Then you get replaced by someone who will make many, many decisions. So there’s almost an evolutionary dynamic in play, where making what we think of as the right decision means you get selected out — and you only got to make that one decision.

Will Jarvis 50:13
And so I’m sure the calculus you make in your mind is: well, maybe I can make changes at the margin. I know this is the right decision, but maybe going forward I can make better decisions than the next person would. And also, of course, what’s really front of mind is: I want to keep my career going.

Quinn Lewandowski 50:29
Yeah. I’m not sure that’s something you could think — in my model of humanity there would be people thinking that. But my read on the people who’ve ended up in charge is that their mentality is really strongly adapted to what you need to do to be in charge. And so they’re unlikely to be trying to do the right thing — unlikely to be burning moral energy on it.

Will Jarvis 51:00
So — okay, that makes sense. If it’s a very competitive environment, you don’t have time to even think about doing the right thing. It’s completely secondary. It gets competed out completely, if it’s competitive enough.

Quinn Lewandowski 51:23
Um — John Nerst, who we talked to, has an essay where he speculates that a lot of the Asperger’s stuff is one side of a bell curve, and the other side is people who are really skewed toward being social — I’d like a word with fewer bad connotations than ‘manipulative,’ but oriented toward dealing with other people. And we don’t have a label for those people, because if you’re very, very good at dealing with other people, you’re running the culture in the first place. I feel like the people who are highly placed in the current system, in at least the most competitive positions, are likely to be way out at the edge of that bell curve — and not just that bell curve; in general they’re likely to be very extreme. And so I’m not sure — I’m not sure this is productive to say — but I’m not sure it works to model them as if they’re trying in good faith. Because it can simultaneously be true that 95% of humanity is trying in good faith — the remaining 5% just isn’t trying, so they’re acting in bad faith — and, at the same time, that these people are unusually likely to be drawn from that remaining 5%. It’s complicated, because I don’t think they’re just psychopaths or whatever. But you could build a simplified model where making decisions purely for power makes you more likely to end up in control: make 1% of the population psychopaths, and you’re going to see leadership positions controlled almost exclusively by that 1%.

Will Jarvis 53:25
Right. Well, we can even think about this in terms of competitive sprinting. This has been big in the news recently with the drug-testing stuff — we’ve got the Olympic Trials going on right now. If you’re an Olympian, you need the genetic talent, the predisposition: you’ve got to have the fast-twitch muscle fibers, your pelvis has to be a certain way, all this genetic stuff. And then you have to spend essentially every waking hour optimizing on this one thing, because it’s so competitive. You have to train right, eat right, have a good plan, be smart about it — and you’ve got to spend a lot of time. And in the end, because it’s so competitive — the more competitive it is, the less time you have for anything else. There’s no slack. There’s just no slack to do anything else.

Quinn Lewandowski 54:23
Yeah, very much like that. Or — I suppose this came up when we were talking about Fields Medal winners. I was saying it’s neat that they persevere through all that boring math, because, you know, most people find math kind of boring. But my guess is that very, very few Fields Medal winners find math even a little bit boring. And politicians — our national-level politicians, and the people who end up running things at the FDA and CDC — are similarly heavily selected. It’s harder to say for what, because I don’t think they’re just psychopaths. But I believe they have a mentality that is entirely geared toward winning zero-sum status games in the symbolic, social realm. So, you know, the head of the FDA does not know that babies are dying. You can’t tell them that babies are dying — they’ll process it as a social signal, or as part of a coalition of power. And it sounds like I’m dehumanizing them, but I’m not doing it gratuitously. I think that if we expect that they’re trying their best, we will be badly wrong again and again and again, and we’ll make bad predictions. And I think that’s consistent with a world where almost everyone is trying their best.

Will Jarvis 55:56
Right. It is not the case — and maybe I’m misreading, but I don’t think you’re saying — that they’re even, in some moral sense, trying to be a bad person. They’re literally just optimizing: they want to win. They really want to win, and they’re optimizing on whatever it takes to win.

Will Jarvis 56:19
Whatever game they’ve selected,

Quinn Lewandowski 56:21
I’m thinking — I have direct evidence for this. I play video games, and there’s a funny thing where, when you start a video game, you’re thinking about how to play it, and then that sort of fades into the background. You know what the jump button is — you can get to a level where you couldn’t tell someone what the jump button is, but your hand just does it. I think a lot of people who’ve risen to the top of moral mazes are similar: their brains are just executing the right social move, instinctively, reflexively, in sort of the same way. And I don’t know how to communicate that — the way your brain can run almost a different program, a different game, in a totally compartmentalized way. I wish I had a non-video-game example, since not everyone plays video games. Driving, maybe? I think there’s something like that going on. I think there would almost have to be.

Will Jarvis 57:36
Right. Yeah, I think you’re absolutely right. It speaks to — on some level, the advice I draw from this, like life advice, is that you should always be really concerned about hyper-competitive environments.

Will Jarvis 57:52
And you should always, as best you can — and this scares me to death, literally, which I think is good, because if you’re scared about things there’s a chance you’ll genuinely work to remedy them — you’ve got to make sure you’re optimizing on the right axis.

Will Jarvis 58:14
If you do — you need to think about this a lot. And, yeah, I don’t know. It’s really quite scary, right? No one’s setting out to make this happen. No one’s setting out to have the infants die, you know what I mean? And yet here we end up. It’s quite horrifying, right?

Quinn Lewandowski 58:35
Yeah. I think it was Marshall who said that if you were picking something to keep you up at night, this would be a good candidate. I agree with that.

Will Jarvis 58:47
It is much scarier than explicitly bad people — villains, like movie villains. Because with a villain, at least, in some sense you can get rid of the person, and then it’s a much more solvable problem.

Quinn Lewandowski 59:00
You know, our mentalities — human mentality — are so well adapted to dealing with villains that we read stories about them for fun. We really want to do that, the same way we want to eat lots of sugar and fat. And, you know, sometimes eating sugar and fat is the right thing to do — if you never ate any sugar or fat, I think it would not be good for you. But this is scarier, because it’s something we’re not adapted to dealing with. We’re trying to do it manually. Backing up a little bit — yeah.

Will Jarvis 59:40
Remember the post on conflict versus mistake theory?

Will Jarvis 59:43
So — we’ve been talking a lot about it; try to summarize as quickly as you can. Mistake theory is kind of what we’ve been talking about: everyone’s trying to optimize on something, they’re not explicitly trying to do something bad, but they end up doing something bad. Conflict theory is: well, sometimes people actually do things to try and win a conflict, and they’re actually trying to do something bad. Do you think it’s a bias that we have — we’re high-systematizing people, so I would definitely be biased toward mistake theory. Do you think we’re biased toward mistake theory too much, and we should think more about conflict theory? Or do you think we have it properly weighted?

Quinn Lewandowski 1:00:28
Scott’s essay ties together a bunch of different qualities, and he shows how they correlate — each of them is explained as a cause of the previous quality. But they’re not logically linked; you just have things that correlate with each other. So I think of them as a cluster rather than a single thing. And I think some aspects of that cluster we’re doing the right amount of, some — for all I know — we’re not doing enough of, and some we probably are doing too much of. The basic one-sentence definition: conflict theorists see the problems of society as primarily being about conflicts — evil people doing evil things — while mistake theorists think people are making mistakes. And when he draws out the correlations, you have conflict theorists, to a large extent, eschewing discourse — eschewing truth-seeking, eschewing Voltaire-style liberalism and debate. I think that part is definitely a mistake. But trying to understand clearly both the existence of conflict theorists, and the existence of people whose goals are not consistent with our goals — it’s important not to be blind to that. There’s a party game called Werewolf, or Mafia. I’ve never played it; I’ve read essays about it. It’s played with a group of people, some of whom are secretly werewolves and most of whom are villagers, and the villagers have to try to detect the werewolves. I wish I’d played it — I’d have a better idea of how it works. The remarkable thing about it is how well it aligns a set of incentives. If you’re a villager, you want clarity. You are in conflict with the werewolves, but the optimal strategy for you looks remarkably like the sort of liberalism that Voltaire or Bertrand Russell bring to mind: you want clarity, you want coordination, you want communication.
And if you’re a werewolf, you want discord and confusion. You want to gum up the works of coordination as well as you can without tipping your hand that that’s what you’re trying to do — and the reason it’s bad to tip your hand is that it identifies you as a werewolf, which makes you a target for coordination, and ends up incentivizing coordination. So I see that framework as something I add to Scott’s conflict/mistake theory. And for my own part — in the service of my own goals, my own ethics — I’m okay being a conflict theorist. I don’t act as a werewolf.

Although I do think very few people are trying to do the wrong thing terminally. I think some of them are trying to do the wrong thing because it’s the wrong thing, instrumentally — this is the signaling logic: doing the right thing creates ambiguity about why you did it, because anyone might have done it, since it was the right thing. If you’re trying to signal, say, loyalty to the group, you need to take actions that are the wrong thing. I also think a lot of people are applying high-level heuristics rather than reasoning about the physical world.

Will Jarvis 1:04:32
Right. Yeah, I think you’re right. And just to mention an example of doing the wrong thing just to do the wrong thing — and I should get an example from the left too, because this sounds like I’m picking on somebody — anti-masking. You know, masking is a low-cost intervention. Even though the data is not super clear, it’s common sense — it reduces some viral load, because you’ve got a physical barrier. Like I said: low cost, easy to implement, probably helps. So it would be rational for people to do it. But not wearing a mask is signaling affiliation with the in-group. That is not costless, and it’s probably not the right thing to do. And so you see that a lot, I think. I don’t know.

Quinn Lewandowski 1:05:28
I think that’s one of the things wrapped up in that.

Will Jarvis 1:05:30
Yeah, it was part of it. And we’ve actually talked about some of the reasons recently — but, perhaps less related, I want to talk about one more thing before I let you go. When we think of these kinds of collective action problems — where the incentives are just not strong enough — like the baby formula, or a cure for SAD... perhaps the baby formula example is better for what I’m trying to get at here. And I haven’t thought about this too much, so we’ll see where this goes. But what do you think can be done? Take the baby formula example: does it just take someone who’s relatively rational and has a lot of money, who can make this their one issue and go solve it? Is that what it takes? How would you even go about fixing a problem like this? Do you see what I’m saying?

Quinn Lewandowski 1:06:39
I don’t know. It’s an important question.

Will Jarvis 1:06:46
Like — so, for example, we had Alexey Guzey on the podcast recently. He started a nonprofit called New Science — check it out. His idea is to create a kind of new, exit institution for science: an institution with different incentives than scientists currently have. Could it be something like making a cure for SAD actually the thing you’re optimizing for? Maybe you put up just a straight-up bounty, and you say: we’ll give you $50,000 for a cure for SAD.

Quinn Lewandowski 1:07:29
I bet that would help. Yeah.

Will Jarvis 1:07:31
Would that help?

Quinn Lewandowski 1:07:32
I don’t know — yeah, I think that would help. It’d get expensive fast.

Will Jarvis 1:07:37
It would get very expensive, fast. That’s a real problem. Yeah.

Quinn Lewandowski 1:07:41
There’s — I mean, it’s kind of a cop-out, but there’s what we’re doing now. There are lots of people involved in the system who are taking actions that are net bad, and the more clarity we have about that, the better. Ideally we would stop the asymmetric justice, or cut it down — so you reward people for doing good things, and punish them less. I mean, I think —

Instrumentally, for political change, you probably need the populace to have some idea — and for the populace to have some idea, you need some degree of clarity: that, say, the FDA knew this thing was killing people and let it happen. And I find that a little distasteful, because I’m adding more blame to a system that’s already really, really full of blame. But it might help, at least a little bit. I do think a lot of this relies on some degree of ambiguity. But the public isn’t my weapon of choice, and I don’t really know how I’d talk to them. I have a very strong instinct that creating clarity is good — that it has both locally beneficial effects, and unforeseen effects that skew positive.

Will Jarvis 1:09:29
So is it something like making it more explicit — talking about what people are actually optimizing for?

Quinn Lewandowski 1:09:41
Yeah — what they’re optimizing for, what we can conclude, what we have good evidence for. I really wish I had something better than that.

Will Jarvis 1:09:59
Yeah. It’s very difficult, and I put you on the spot there. But I do think it’s also an argument for things happening at the edges — like, that’s probably easier. It’s probably easier.

Quinn Lewandowski 1:10:13
Yeah. There’s also creating a whisper network — people who have problems with seasonal affective disorder can read about Yudkowsky’s stuff even if they can’t get it through the mainstream medical system. And, you know, to some extent, theoretically, over the long term you could have an increasingly developed informal communication system until it supplants the formal one. I think something like that has either happened, or is in the process of happening, with Wikipedia. When it started, nobody took it seriously; now I think most people take it seriously. I’m not saying they’re necessarily right to do so, but —

Will Jarvis 1:11:01
Definitely, yeah. Especially pages with a lot of people editing and working on them — you can get pretty close, because you have the knock-down, drag-out fight about any given topic.

Quinn Lewandowski 1:11:19
So I guess that’s another strategic avenue: create the better incentives informally, and position them to take over formally once it becomes obvious to enough people that they’re working better than the institutional setups. And that part — I don’t know how to do.

Will Jarvis 1:11:42
Right. Yeah. That makes sense. Any other concluding remarks? I know we’ve talked about a ton today.

Quinn Lewandowski 1:11:53
I can’t think of anything.

Will Jarvis 1:11:55
Cool. So I’ll put the link to the book, Inadequate Equilibria, in the show notes. Highly recommended — it’s not a long read, either, and it’s a good book. There’s also a good book review if even that’s too long — Scott did a good review, and I’ll link that too. And we didn’t even get through all the stuff about the outside view — well, I guess in a roundabout way we’ve kind of talked about it. Perhaps a topic for another day. Well, thanks, Quinn, for coming on.

Quinn Lewandowski 1:12:23
Thank you for having me. These are always really fun. Yeah,

Will Jarvis 1:12:26
It’s a good time. I learned a lot. Well, that’s our show for today. I’m Will Jarvis — join us next week for more Narratives.

Transcribed by https://otter.ai
