Last month, former Facebook and Pinterest executive Tim Kendall told Congress during a House hearing on the dangers of social media that Facebook made its products so addictive because its ad-driven business model relies on people paying attention to its products for longer every day. He said much the same in the Netflix documentary "The Social Dilemma," in which Kendall -- along with numerous other prominent early employees of big tech companies -- warns of the threat that Facebook and others pose to modern society.
Kendall -- who today runs Moment, an app that helps users monitor device habits and reinforces positive screen-time behavior -- isn't done campaigning against his former employer yet. On Friday morning, we talked with him about the FTC inching closer to filing an antitrust lawsuit against Facebook for its market power in social networking; what he thinks of the DOJ's separate new antitrust lawsuit against Google; and how venture capital contributed to the "unnatural" ways the companies have commanded our attention -- and advertisers' dollars along with it.
Our conversation has been excerpted. You can hear the full conversation here.
TC: Like everyone else, you wrestle with addiction to the apps on your phone. At what point did you decide that you wanted to take a more public role in helping to identify the problem and potentially help solve it?
TK: I've always been interested in willpower, and the various things that weaken it. I have addiction in various parts of my family and extended family, and I've seen up close substance abuse, drug abuse. And as I started to look at this problem, it felt really similar. It's the same shape and size as being addicted to drugs or having a behavioral addiction to food or shopping. But it didn't seem like anyone was treating this with the same gravity.
TC: What has been the reaction of your colleagues to you turning the tables on this industry?
TK: It has evolved in the sense that, at the beginning of this, I was kinder to Facebook. When I started talking publicly about my work with Moment, I said, 'Look, I think that those folks are focused on the right issues. And I think they're going to solve the problem.' And I was out there throughout 2018, saying that. Now I've gotten a lot more vocal [about the fact that] I don't think they're doing enough. And I don't think it's happening quickly enough. I think they're absolutely negligent. And I think the negligence is really about not fully and accurately understanding what their platforms are doing to individuals and what their platforms are doing to society. I just do not think they have their arms around it in a complete way.
Is that deliberate? Is that because they're delusional? I don't know. But I know that the impact is very serious. And they are not aligned with the rest of us in terms of how severe and significant that impact is.
I think everyone within Facebook has confirmation bias, probably in the same way that I have confirmation bias. I am picking out the family at the restaurant that's not looking at each other and staring at their phones and thinking, 'Look at Facebook, it's ruining families.' That's my confirmation bias. I think their confirmation bias is 'There's so much good that Facebook has done and is doing for the world.' I can't dispute that, and I suspect that the leaders there are looking to those cases more often and dismissing the severity of the cases that we talk about, [including] arguably tipping the election in 2016, propagating conspiracy theories, propagating misinformation.
TC: Do you think that Facebook has to be regulated by the FTC?
TK: I think that something has to change. What I would really like to see is the leaders of government all over the world, the consumers who really care about this issue, and then the leaders of the company get together -- and maybe at the start it's just a discussion about where we are. But if we could just agree on a common set of facts about the situation that we're in, and the impact that these platforms are having on our world, if we could just get some alignment in a non-adversarial dynamic, I believe there is a path whereby [all three can] come together and say, 'Look, this doesn't work. The business model is incongruent with the long-term well-being of society, and therefore -- not unlike how fossil fuels are incongruent with the long-term prospects for Earth -- we need to have a reckoning and then create a path out of it.'
Strict regulation that's adversarial -- I'm not sure that's going to solve the problem. It's just going to be a drawn-out battle whereby more individuals get sick [from addiction to their phones], and [companies like Facebook are] going to continue to wreak havoc on society.
TC: If this antitrust action is not necessarily the answer, what potentially could be on the regulatory front, assuming these three are not going to come together on their own?
TK: Both the House and the Senate are looking really closely at Section 230 of the Communications Decency Act, which allows -- and has allowed since it was put in place in 1996 -- platforms like Google and Facebook to operate very differently from a traditional media company, in that they're not liable for the content that shows up on their networks.
That seemed like a great idea in 1996. And it did foster a lot of innovation, because those bulletin-board and portal-like services were able to grow unabated; they didn't have to deal with liability issues on every piece of content that got posted on their platforms. But fast-forward to today, and revisiting that liability shield sure seems like one of the ways we could address misinformation and conspiracy theories and this tribalism that seems to take root by virtue of the social networks.
If you rewind five or 10 years, the issue that really plagued Facebook and, to a lesser extent, Google was privacy. The government threatened Facebook again and again and again, and never did anything about it. Finally, in 2019, the FTC assessed a $5 billion fine, with ongoing penalties beyond that, for issues around privacy. And it's interesting: it's been a year since those were put in place, and we haven't had any issues around privacy with Facebook.
TC: You were tasked with developing Facebook's ad-driven business and coming up with a way for Pinterest to monetize its users. As someone who understands advertising as well as you do, what do you think about this case that the DOJ has brought against Google?
TK: If you're trying to start an online business, and you want to monetize that business through advertising, it's not impossible, but it is an incredibly steep uphill battle.
Pinterest ultimately broke through when I was president of Pinterest and working on their revenue business. But the dominance of both Google and Facebook within advertising makes it really difficult for new entrants. The advertisers don't want to buy from you because they basically can get to anyone they want in a very effective way through Google and Facebook. And so what do they need Pinterest for? What do they need Snap for? Why do they need (XYZ) startup tomorrow?
That's on the advertising side. On the search side, Google has been stifling competition for years, and I mean that less in terms of keeping new entrants out of search -- although the government may be asserting that. I actually mean it in terms of content providers and publishers. They've been stifling Yelp for years. They've basically been trying to create these universal search boxes that provide the same local information that Yelp does. [Yelp] shows up organically when I search for sandwich shops in downtown San Mateo, but then [Google puts] its own stuff above it and pushes it down to create a wedge to hurt Yelp's business so that [Google] can support and build up its own local business. That's anti-competitive.
TC: Along with running Moment, you've been talking with startups that are addressing some of the issues we're seeing right now, including startups that alert readers if a news outlet is left- or right-leaning so they're aware of any biases ahead of time. Would you ever take outside money to invest? We're starting to see these solo GPs raise pretty enormous first-time funds.
TK: I think traditional venture capital, with traditional limited partners and the typical time frame of seven years from when the money goes in to when it needs to come out, created some of the problems that we have today. I think that companies are put in a position, once they take traditional venture capital, to do unnatural things and grow in unnatural ways. Absolutely the social networks that took venture capital felt pressure at the board level from traditional venture capitalists to grow the user base faster and monetize it more quickly. And all those things led to this extractive business model that we're looking at today with a critical eye and saying, 'Oh, whoops, maybe this business model is creating an outcome that we don't really like.'
If I ever took outside money to do more serious professional-grade investing, I would only take it from wealthy individuals, and there would be an explicit term that basically said, 'There's no time horizon. You don't necessarily get your money back in seven to 10 years.' I think that's the criterion you need to have if you're really going to invest in a way that doesn't contribute to the problems and misaligned incentives that we're dealing with today.