Anusha Alikhan Transcript

[00:00:00] Welcome to Let’s Hear It. Let’s Hear It is a podcast for and about the field of foundation and nonprofit communications, produced by its two co-hosts, Eric Brown and Kirk Brown. No relation.

Kirk: Well said, Eric. And I’m Kirk.

And I’m Eric. The podcast is sponsored by the College Futures Foundation, which envisions a California where postsecondary education advances equity and unlocks upward mobility now and for generations to come. To learn more, visit collegefutures.org.

Kirk: You can find Let’s Hear It on any podcast subscription platform.

You can find us online at letshearitcast.com. You can find us on LinkedIn

Kirk: and

yes,

Kirk: even on Instagram. And if you like the show, please, please, please rate us on Apple Podcasts so that more people can find us.

Let’s get onto the show.

Kirk: So I think that there’s a chance that as a result of this episode, we’re gonna have our first, mm, not 15 billion listeners. It’s 50 [00:01:00]

50 billion. 50 billion monthly views on Wikipedia. Oh, by the way, we started, right? You didn’t welcome anybody in. This is the new Kirk.

Kirk: It’s retired. This is the second retirement.

Yeah. We don’t do the welcome in anymore.

This is the less welcoming Kirk,

Kirk: But you’re gonna keep bringing it up. So we’re gonna end up accidentally welcoming people because we keep talking about the welcome. It’ll no longer be part of our situation.

Now you see my evil ways.

Kirk: I know, you’re never gonna let us live it down.

But yeah. Okay, let’s see. So what percentage of 50 billion are we gonna hit as a result of this episode?

1.0. Whoa, whoa, whoa, whoa. One. Hey, what could possibly happen is maybe somebody will make a Wikipedia page for Let’s Hear It.

Kirk: Oh, well, more importantly for Eric Brown, don’t you think?

Don’t you think? Do you already have a Wikipedia page or no?

We will talk about this in the blah-blah after the show. Let’s just talk about what we’re gonna listen to.

Kirk: Let’s set it up, because this is [00:02:00] awesome. You artfully got through some really good and important stuff for our entire field in this discussion. But let’s set this up.

I really, really, really loved this conversation with Anusha Alikhan, who’s the Chief Communications Officer at the Wikimedia Foundation, the organization that runs Wikipedia, which is a daily, if not hourly, if not minute-by-minute part of so many people’s lives. Wikipedia turned 24 this month, in January. It is available in 334 languages.

There are 61 million articles and 50 billion monthly views. How about them apples?

Kirk: Unbelievable. Incredible conversation. Anusha, thank you so much for coming on Let’s Hear It. So let’s go in and listen. There’s so much to talk about. As always, this is Anusha Alikhan from the Wikimedia Foundation on Let’s Hear It.

Welcome to Let’s Hear It. My guest today is Anusha Alikhan, the Chief Communications Officer for the [00:03:00] Wikimedia Foundation and the newly elected board chair of the Communications Network. Anusha has a law degree and a journalism degree, by the way, people, so do not argue with her, because she will clean your clock.

Anusha, thank you so much for coming on Let’s Hear It.

Anusha: So great to be here with you, Eric.

You have had such an interesting career. Before you were at Wikimedia, you were at the Knight Foundation, there in warm and wonderful Miami. What brought you to Wikimedia? How did you end up in this place?

Anusha: So I discovered the Wikimedia Foundation through my work at Knight Foundation initially, and what I loved about it was in fact that it blended this world of the nonprofit with the technology focus, and I’ve been very interested in technology throughout my career.

As you mentioned, I started my career in law, but my first communications job ended up being with the UN. Specifically, when I left the UN I was working for the [00:04:00] Department of Field Support and the Information and Communications Technology Division. And then I found my way to Knight, and I have always been just deeply interested in the ways in which

information and journalism and technology can actually help to improve communities. And I think we can really lean on each other when we’re working from the same script. What really intrigued me about the Wikimedia Foundation, as well as the Wikimedia model, is that it is in fact built on consensus. The model itself is designed to bring a lot of people together from a lot of different backgrounds, who may have different political beliefs, different beliefs about a lot of things, but they have to be able to find common ground working through the facts as they stand. And that was intriguing to me, this blend of, you know, being a nonprofit as well as working on a technology product. [00:05:00]

And if you care about technology, communications, and nonprofits, frankly, perhaps Wikipedia may be the apotheosis of all of those things. Which is, you know, incredible to me anyway, that it has become such a central part of almost everybody’s information lives. And now Wikipedia’s gonna be 23 in January, I believe.

24, actually. You know, it can drink, it can drive a car. Okay. Pretty soon it’s, you know, working its way to middle age, if it lasts that long. But I think a lot of this conversation will be about the future of this. But how did it get to this thing? I remember when, if somebody said something was in Wikipedia, it was like, eh, you know, it didn’t quite have the currency that it now has.

Anusha: That’s exactly right. When Wikipedia started in 2001, it was treated with a lot of skepticism, even as an internet joke. A lot of people just simply didn’t think that an online encyclopedia that [00:06:00] anyone could edit would work. I think at the time a reporter called it something between a fluke and a quirky outlier.

And of course, you know, we’ve had teachers warn us not to use it as a source. But over time Wikipedia has really exceeded everyone’s expectations. It’s now the world’s largest online encyclopedia and one of the top 10 most visited websites. It has, in fact, more than 15 billion views on a monthly basis.

And so it’s overcome the reputation that it had in its youth, so to speak. Recent media stories have called it the last best place on the internet, and even the last bastion of shared truth. And we’re facing this moment where polarization is rampant, and it plays out most loudly in the information space.

These battles over truth are everywhere. This is a model that promotes finding common ground based on the [00:07:00] facts, as they’ve been researched and reported by journalists and academics and others.

And it was founded by Jimmy Wales, famously, a fascinating, almost iconic figure, I think, in a lot of people’s minds.

What was it about? What did he see, do you think, in this model? Which was, at the time, I mean, when you talked about a wiki, it was often a little way for a team to come together and write something; it hadn’t really been used in this way. What did he see in people’s abilities to share,

to partner, to do whatever, that has created this thing that is now in 334 languages and has over 60 million articles?

Anusha: Ultimately, I think this stemmed from a deep trust in people. So, you know, as you said, this is all about the folks that edit Wikipedia, and there are over [00:08:00] 260,000 volunteers who add content to it every day.

That’s why we’re now at 60 million articles. And I often get the question, as does Jimmy Wales: can’t anyone just edit Wikipedia? And it is true that everyone is invited to edit it. That could be something as simple as adding a citation or as complicated as starting a breaking news story from scratch.

But at the same time, it’s not as straightforward as simply adding any content. What Jimmy Wales envisioned, alongside this community of people that was adding content to Wikipedia, was a set of principles that every volunteer has to take on if they want their edits to stick. And that was built in collaboration with volunteers.

So, you know, all the information on Wikipedia has to have a source, and that could come from a reliable news outlet or an academic journal or a book. The [00:09:00] information has to be delivered from a neutral point of view, and that’s really important in this environment. I think it means without opinion.

So if we consider what isn’t trusted in our media ecosystem today, it’s stories with a bent or an angle that they’re trying to push. And I would also say that transparency reigns on Wikipedia. You can see how a story evolves on its history page, from the day it begins to the last edits. You can see editors debate thorny topics on the article talk page, and that can be as mundane as who holds the seat of power, humans or cats, in the pet-owner relationship, to topics as serious as the origins of COVID.

Yeah, obviously it’s cats, right?

Anusha: Clearly cats. Yeah. So I think the principles are really what makes it unique, along with the people that add the content.
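[A side note for the technically curious: the history and talk pages Anusha describes are publicly queryable. Here is a minimal Python sketch using the standard MediaWiki Action API; the article title, revision limit, and User-Agent string are illustrative placeholders, not anything the show or the Foundation prescribes.]

```python
# Minimal sketch: fetch the most recent revisions of a Wikipedia article,
# the same history anyone can browse on the article's history page.
import requests

API = "https://en.wikipedia.org/w/api.php"

params = {
    "action": "query",
    "prop": "revisions",
    "titles": "Cat",                     # placeholder article title
    "rvlimit": 5,                        # how many recent revisions to pull
    "rvprop": "timestamp|user|comment",  # fields to return per revision
    "format": "json",
}
# Wikimedia asks API clients to identify themselves via a User-Agent.
headers = {"User-Agent": "history-demo/0.1 (contact@example.org)"}

resp = requests.get(API, params=params, headers=headers, timeout=10)
resp.raise_for_status()

for page in resp.json()["query"]["pages"].values():
    for rev in page.get("revisions", []):
        print(rev["timestamp"], rev["user"], "-", rev.get("comment", ""))
```

[The talk-page debates she mentions are ordinary wiki pages too: querying the same endpoint with the title "Talk:Cat" returns that page’s revision history.]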

Given [00:10:00] that, as you said, it is meant to be as objective an entity as you can get in terms of information, it’s not supposed to have a point of view.

It’s not supposed to have an angle or a slant, let’s just say. Given that we are now in this culture in which, I mean, I don’t have to spell it out too far, but nobody seems to agree on anything, we have differing versions of truth, of fact, of science, hell, maybe even of gravity. We may find gravity made illegal pretty soon.

What do you think the things are about Wikipedia that will allow it to endure? Because it seems to be flying in the face of our kind of trends or culture or whatever. And why would people be interested, frankly, in objective things anymore, when we are just now

almost trained [00:11:00] to be combative, pushing against what was considered to be fact?

Anusha: You know, neutrality sits beside certain other characteristics of the site itself. So there are no ads on Wikipedia, and we also don’t track users. So there’s a very high bar for privacy on the site, and there’s essentially no incentive for content to go viral.

Nor do we use algorithms to personalize information feeds. So what you see is what you get on Wikipedia, whether you’re in New York or in Nairobi. And to your question, neutrality on Wikipedia is all about providing information without taking sides. There was a study by Harvard Business School, published in Nature, and it found that the more people engage with perspectives

different from their own on Wikipedia, and the more volunteers who edit a Wikipedia article, the better and more balanced it becomes. And [00:12:00] I really think that’s what makes Wikipedia different from social media. It’s not a place to share your opinions, but rather to document facts collaboratively.

I wanna go back to those early days.

And, to the extent you weren’t there yet, I’m sure you have lots and lots of good Wikipedia history to talk about. What were those early days like? How do you create something that ends up being this powerful when your idea is so oddball, frankly, to so many folks?

Anusha: I think it goes back again to the reason that Wikipedia exists, which is its collaborative model.

You know, we’re not trying to keep you scrolling for hours or track everything you do online, and we also don’t optimize for time spent on the site. The goal of Wikipedia, as it was envisioned, was to spark curiosity. It still does that now. We get people learning and, hopefully, you know, clicking [00:13:00] on those sources and citations that really lead them to other places,

and down those Wikipedia rabbit holes, which I think a lot of folks know. So, you know, unlike social media, we’re not part of what some people are calling this attention economy. There’s a scholar, Shoshana Zuboff, who talks about how platforms built on targeted ads try to extract as much data as possible about their users.

That’s been called surveillance capitalism, and Wikipedia is the opposite of that. We have never, from the beginning, monetized people’s behavior, and we’ve always prioritized privacy. So the early building of Wikipedia was really, you know, grounded in those types of principles, and that is very much why it

became successful and is so popular today.

And that’s based on trust, this trust that it will continue to not monetize. Wikipedia [00:14:00] probably knows a lot more about more people than many of these great big tech titans. If you wanted to monetize it, I assume you could sell it in a minute, and it would be worth a fortune, would it not?

Anusha: That’s right. So, you know, we are in this moment actually where we’ve seen a lot of new tools, including ChatGPT and other generative AI tools, really lean on Wikipedia’s data to feed some of the answers that you get. And we’re grappling with some of the challenges that that actually poses to our model.

Because if more people are looking at that content on other sites, that means fewer people are going to our site. That means fewer people give, and that means fewer people are motivated to join the Wikimedia movement and become editors. So you’re exactly right that this [00:15:00] information has now become extremely

essential to the truth backbone of the internet, while Wikipedia itself is becoming less visible. So platforms like ChatGPT that use Wikipedia as a source but don’t acknowledge it really are posing an existential crisis to some of the ways that we have worked in the past. And we have to rethink, and work side by side with some of these companies, to tackle some of the challenges that we’re seeing.

Well, okay, so let’s go there. Let’s get into this AI question. There was a really interesting piece in the Times Magazine in July called Wikipedia’s Moment of Truth that explores Wikipedia’s future in the face of AI. First of all, as you say, the AI chatbots have just kind of munched their way through the entire Wikipedia encyclopedia,

have they not? They’ve just kind of eaten it all up, and now they’re doing what they will with it, based on their own algorithms and their [00:16:00] own biases, frankly. How does Wikipedia survive that?

Anusha: It’s a great question. Lately, a lot of folks have been asking whether AI will in fact replace human editors on Wikipedia.

And the short answer, in our world, is no. We don’t believe that AI is going to replace human editors. So even as AI gets more advanced and widely used, there’s always going to be a critical need for humans to do that challenging and nuanced work of giving people real knowledge. If you think about what makes Wikipedia special, it’s not just

any content platform. The value of Wikipedia really lies in its volunteers, those people who are doing that research and discussing and working together to decide what’s accurate and reliable. So it’s not just about creating text, it’s about a process. It’s about finding sources [00:17:00] and critiquing them and building consensus.

And it’s not work, in our view, that can be automated, because it’s so deeply human. That isn’t to say that AI can’t help. We do see opportunities for tools like large language models to make editing on Wikipedia itself more efficient, and even more fun. But at the end of the day, there’s not a world where we don’t need humans to tackle complex issues, whether that’s

understanding political conflicts or figuring out something quirky like, you know, the history of model trains.

Well, that’s a huge relief. We’re gonna take a very short break. We’ll be back with Anusha Alikhan after this. You are listening to Let’s Hear It, a podcast about foundation and nonprofit communications hosted by Eric Brown and Kirk Brown.

If you’re enjoying this episode, you may just be a rule breaker. Tune in to Break Fake Rules, a new [00:18:00] limited-series podcast with Glen Galaich, CEO of the Stupski Foundation. Hear from leaders in philanthropy, nonprofits, government, media, and more to learn about challenges they’ve overcome by breaking fake rules, and which rules we should commit to breaking together.

We are also sponsored by the Conrad Prebys Foundation. Check out their amazingly good podcast (and we’re not just saying that), Stop & Talk, hosted by Prebys Foundation CEO Grant Oliphant. You can find it at stopandtalkpodcast.com. And now back to the show. Welcome back to Let’s Hear It. My guest today is Anusha Alikhan, the Chief Communications Officer for the Wikimedia Foundation.

And congratulations, newly elected board chair of the Communications Network. We’re just talking about Wikipedia in the face of AI, and thank you for reminding us that humans will always matter, which is really helpful. How do you... is there a Wikipedia 2.0 or 10.0 that manages, that negotiates, navigates this future [00:19:00] technology world that we’re all really just, I think, I don’t know, fumbling our way through in many ways?

Anusha: So I think what’s important to keep in mind is that Wikipedia has actually been using AI for quite some time. So while people really know Wikipedia for its editors, the humans who add the content, AI has helped us with all sorts of things, from translations to detecting vandalism on the site. And there are bots that essentially aid Wikipedia editors in reverting edits if bad behavior is detected on the site itself.

The main thing to remember in those circumstances, though, is that there’s always human oversight. One of the biggest challenges with AI right now, I believe, is trust. You know, how do we know where the information comes from? How do we ensure it’s accurate and [00:20:00] unbiased? And Wikipedia offers a model for how that can be done.
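[Another aside: the "bots plus human oversight" pattern is concrete enough to sketch. The Python below shows how a patrolling tool might ask Wikimedia’s ORES machine-learning service, which has been giving way to the newer Lift Wing platform, how likely an edit is to be damaging. The endpoint path, model name, and response shape follow our reading of the public ORES v3 API and may differ; the revision ID and threshold are made-up placeholders.]

```python
# Sketch: score one revision with the "damaging" model and triage it.
# The model only flags; a human patroller makes the actual call.
import requests

ORES = "https://ores.wikimedia.org/v3/scores/enwiki/"

def damaging_probability(rev_id: int) -> float:
    """Return the model's estimated probability that a revision is damaging."""
    params = {"models": "damaging", "revids": rev_id}
    headers = {"User-Agent": "patrol-demo/0.1 (contact@example.org)"}
    resp = requests.get(ORES, params=params, headers=headers, timeout=10)
    resp.raise_for_status()
    score = resp.json()["enwiki"]["scores"][str(rev_id)]["damaging"]["score"]
    return score["probability"]["true"]

p = damaging_probability(1143374607)  # placeholder revision ID
if p > 0.8:                           # placeholder triage threshold
    print(f"queue for human review (p_damaging = {p:.2f})")
else:
    print(f"no action (p_damaging = {p:.2f})")
```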

You know, OpenAI and other models are suffering from hallucination problems, where the model may respond confidently with facts that are entirely made up, and without clear attribution and real links, the things that Wikipedia provides, to where information was collected from. Users just aren’t able to easily see and understand whether it is misinformation unless they’re doing a separate search. And with the scale and speed at which these large language models are being adopted,

this runs the risk of introducing vast amounts of misinformation to the internet, and places like Wikipedia are gonna be more important in light of that. The other thing I would add is that Wikipedia, again, prioritizes [00:21:00] transparency. Our head of machine learning regularly livestreams his work, which is, you know,

pretty unheard of, even in the technology world. He publishes links to his internal chat, his ticketing system for errors, and you can watch him problem-solve in real time. All of our AI models are open source, so we are already doing work in this space. That kind of transparency is really rare, but I think large language models have a lot to learn from the fact that it helps to build trust when you are that transparent.

That’s like the TV show Big Brother for the nerdiest nerds on Earth. Like, hey, let’s watch his chat, you know, his ticketing system. Woo-hoo. You talked a little bit about bias, and I’d like to go a little deeper on that, because all information is the product of some kind of bias. And I think you’ve even said that, what is it, one in five biographies on [00:22:00] Wikipedia are of women? So how do you take on this almost double whammy: the bias that is built into many of the chatbots, with the bias that is inherent in human beings, many of whom are Wikipedia editors?

Anusha: Bias in AI is a huge challenge. The problem is that most AI tools are trained on existing internet content, which is heavily skewed toward English, and that is a huge issue because the next billion people coming online are gonna be from Africa and Asia, and many of them speak languages that are barely represented online.

So, you know, if we think about it, less than 1% of the internet itself is in Arabic, even though millions of people speak it. AI translation tools right now struggle with these languages because they’re mostly built [00:23:00] for English, and they miss out on properly representing the diversity of our world.

You’re also exactly right that the places that AI is training on, including Wikipedia, reflect their own biases. So only 19% of biographies on Wikipedia are about women. We are putting a deliberate focus on changing that. One of the two pillars of our strategic direction is about knowledge equity, and that means representing in our projects the missing knowledge of people who have been left out of the historical record.

But there are a lot of AI tools right now that are not paying attention to filling those gaps. And again, it’s that human touch that ensures that knowledge is truly reflective of our world itself. That’s really something that no algorithm, as it stands, can fully replicate.

Let’s kind of flip this around a little bit, because [00:24:00] I assume that there are good possibilities for how AI and Wikipedia

relate to each other, or how Wikipedia can improve how AI is used, particularly the large language models. What do you see as the kind of happy outcome of this potentially fraught relationship?

Anusha: I think that AI has a lot to learn from Wikipedia, in that volunteers on Wikipedia show up with a lot of humility, and AI could do with some humility.

You hear that, AI? You need to be more humble. I love that.

Anusha: Exactly. Fundamentally, you know, in order to preserve and encourage more people to contribute knowledge to the commons, the people that are developing these technologies need to look to augmenting and supporting human [00:25:00] participation in creating knowledge.

They shouldn’t be impeding or replacing the human creation of knowledge, because humans are needed, essentially, to feed them. We wanna be able to prevent this idea called model collapse, where AI essentially starts producing bad answers because it’s left to train on AI-generated rather than human-generated content. So, you know, they shouldn’t cannibalize the sources that they rely on.
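[One more aside: model collapse can be made concrete with a toy simulation. This is only an illustration of the intuition, not the published research setup: each generation fits a distribution to the previous generation’s output and, standing in for a model’s preference for its most probable outputs, keeps only the "typical" samples. The tails erode, and diversity quietly shrinks.]

```python
# Toy model-collapse simulation: train on your own output, favor the
# typical, and watch the spread of the data decay generation by generation.
import numpy as np

rng = np.random.default_rng(42)
data = rng.normal(0.0, 1.0, size=10_000)  # generation 0: "human" data

for gen in range(1, 9):
    mu, sigma = data.mean(), data.std()           # "train" on current data
    samples = rng.normal(mu, sigma, size=20_000)  # "generate" new content
    # keep only high-probability outputs (drop the tails beyond 2 sigma)
    data = samples[np.abs(samples - mu) < 2 * sigma][:10_000]
    print(f"generation {gen}: std = {data.std():.3f}")

# The printed std decays from ~1.0 toward ~0.35: the rare, tail-of-the-
# distribution knowledge is what disappears first.
```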

To the point about bias, they also need to pay attention to equity. Large language models should be building in checks and balances that don’t push information biases or widen what are already big knowledge gaps on the internet.

At the Wikimedia Foundation, we’re working directly with volunteers so that we can add traditionally excluded histories and perspectives, and that needs to continue. [00:26:00] Also, again, they really need to embrace more transparency. Just as an example: several years ago, folks may remember, Amazon developed an algorithm that screened job applications using AI, and they ended up pulling it off the market because they found out it was biased against women.

How did they find out it was biased? And in what way was it biased? For other companies exploring that same thing, if Amazon were to share what they learned about that algorithm, it could help the entire industry move forward. And that’s the standard that we hope generative AI tools will set.

Are you, you know... do you have the hot phone, the red phone, to Sam Altman and the other poobahs of AI, in which you’re lobbying them, encouraging them, entreating them to ensure that their models don’t [00:27:00] discriminate, that they don’t reveal a bias, that they are helpful, that they are helping us build toward a better world?

Anusha: Yeah, we are trying to have these conversations right now. We’re also leaning on a lot of coalition building, as well as research, in order to inform more people that are working in social good.

Philanthropy, I think, also has a very big role to play here. A lot of philanthropists and foundations and nonprofits right now are looking at what feeds polarization, which is people working from different sides of the information spectrum. And I believe that we can build better collaboration among people from civil society, philanthropy, even government, in order to solve some of these big challenges that we are seeing, together.

I wanna walk daintily on

this next little piece of ground, but, you know, what role does government have to play? Is there a regulation framework that you think would improve how people get information, [00:28:00] reduce bias, make us all more productive and happy, and lead to a better future?

Anusha: Any regulation... you know, we work on a global stage, so we work with a lot of different policymakers and decision makers at the Wikimedia Foundation. We have a terrific global advocacy team that has these conversations all the time. What we see fundamentally is that there is often a lack of understanding of Wikimedia’s model and collaborative content building.

What often happens is that governments in this space are building laws to prevent harms on the internet, and they’re specifically targeting large technology companies that have the means and the resources to take into account what could be, say, large fines or other consequences when harmful content shows up on these platforms.

At the same time, these laws [00:29:00] aren’t necessarily contemplating nonprofit organizations with human rights policies in place that exist in the public interest. And so the main thing that we’re trying to do with regulators is encourage them to understand the Wikimedia model better, so that they don’t cut off what is, in many societies, a lifeline to a product that provides both

education and heritage preservation, as well as just a space where you can debate with friends and show up and ask wild and quirky questions.

Okay. What is the wildest and quirkiest question that you have encountered? What is your favorite cocktail-party example of how fun it is to work at the Wikimedia Foundation?

Anusha: I think it’s the topics themselves. So I never realized that ironing was a sport until Wikipedia, you know? [00:30:00] It is things like that that blow my mind on a regular basis. I learned the term cute aggression from Wikipedia, and I feel like I practice cute aggression all the time, but I didn’t know that there was a word for it.

I’ve never been accused of that, I have to tell you. My aggression is usually very un-cute. Oh, and then one other question. It seems to me that the Wikipedia model ought to be applicable elsewhere, but we really haven’t seen it in quite the same way. Are there other places where you think this could eventually take off? Or, I mean, why haven’t

other forms of how we build and share and improve information taken off the way it has here with Wikipedia?

Anusha: So Wikipedia itself, like, the basis of Wikipedia, is software that we call MediaWiki. And that is [00:31:00] actually used in a variety of different settings, including, in fact, by NASA, for example.

So, well, you’re right that there is nothing quite like Wikipedia when it comes to the scale at which people visit it and have adopted it as a primary place to get information. But it has been used by organizations and smaller communities as a model for building together and building collaboratively. What we would love to see, in fact, going back to our conversation about generative AI tools, is that at least some of the potential for open source

collaboration, as well as shared lessons, is eventually reflected on a lot of different technology platforms. So the model itself is definitely scalable and applicable to settings very different from what we use it for, which is [00:32:00] the context of an encyclopedia.

Well, Sam Altman, if you’re listening, and I’m sure you are, just listen to Anusha and take her advice.

I really appreciate your time. I love this conversation. I am both absolutely petrified and yet somehow still hopeful about the future. What about you?

Anusha: I’m really hopeful about the future, and Wikimedia has always been a technology optimist. You know, my hope right now is that the next step of technology itself caters to the best of humanity, and that means our curiosity and our creativity and

this idea that we can collaborate and find common ground. I think that Wikipedia shows us what’s possible when people work together to build something bigger than themselves, and it really tells us a lot about human motivation as well. So I hope that spirit carries.

Well, it certainly carries [00:33:00] with me, and you’ve given me something to smile about for the rest of the day, if not the week, maybe the year.

Thank you so much. Anusha Alikhan, Chief Communications Officer for the Wikimedia Foundation. I’m gonna go right onto Wikipedia right now and learn about professional ironing, ’cause you know, if the podcasting thing or the communications stuff doesn’t work out, maybe I can find a second career.

Good luck, Eric. Thank you. It was great. Thank you so much. I really, really appreciated it.

Kirk: And we’re back. So, yeah, I got it wrong. I heard 15. It’s 50 billion views per month.

What’s the difference?

Kirk: These numbers are off the charts.

They are a ridiculous number of views, such an important resource in our lives, not least because it is the thing that so many of these, you know, AI bot thingies are built on.

They’re using all that stuff to drive their engines, which is the big challenge, and one of the things that [00:34:00] obviously came up a lot in our conversation.

Kirk: It made me feel like the entire conversation about Wikipedia is the promise and the tragedy of the internet all rolled up into one really perfectly neat

bundle. Thinking through this work, this collaborative work built on transparency: it’s volunteer, it’s public interest, and then it’s being harvested by large language models so that it can be monetized and fed into the engine of surveillance capitalism. As you were going through this conversation, I thought Anusha was incredibly gracious talking through the nuance of all that, and actually very, very Wikipedia-like: brave, right? Fact-based, not a lot of emotion around it. It blended the lawyer, the technology, the information piece.

Man, doesn’t it just feel like it’s the microcosm for what we wanted the internet to be, and then the microcosm for the challenges of what the internet is being driven to become? Doesn’t it feel like it’s [00:35:00] all right there in that Wikipedia conversation?

You got it. But this concept of surveillance capitalism? Oh, geez.

It should fill every single person on the planet with absolute bone-shaking dread, because we are simply the product. Everything we do, everything we want, everything we buy, almost everything we think is being surveilled, and then someone is taking that information and turning it into money.

Usually it’s to sell us some crap that we probably don’t want or need, but it also may be to sell us to somebody else, which is worse. And sometimes it’s more nefarious even than that, which is to sell somebody our personal information that they can then use to steal from us. So, have a nice day, Mr. Brown.

Kirk: So Jimmy Wales founded Wikipedia 24 years ago, and even these dates and ages, right? 24 years. That is such an interesting number, because it’s nearly a quarter of a century. So it’s actually kind of a long time, and yet at the same time it’s [00:36:00] such a, just a brief, snap-your-fingers

and it’s all gone. So Jimmy Wales founded Wikipedia. Have you ever tried to figure out what Jimmy Wales’s net worth is? No idea. I don’t think you can figure it out, but the only estimate that I could find was $1 million. So here’s my question for you, Mr. Brown. What’s a more fair representation of, air quotes,

the value of the internet: Jimmy Wales and his $1 million, or Mark Zuckerberg? And I can’t... what’s the number? Is it hundreds of billions? I don’t know. What’s the more accurate representation of what the true value of the internet should be worth? Because I feel like your whole conversation with Anusha was actually jumping around the edges of that topic, with that question of where is value?

Where does it get stored? And who holds onto the value of all this information, all this conversation that’s being created? That’s kind of what’s in imbalance and what’s in question here, I feel like.

Well, think about the value of the contribution, I suspect,

Kirk: Ugh.

[00:37:00] versus the

Kirk: Yeah.

recompense. The value of the contribution of Wikipedia... I can’t imagine. It’d be very, very hard to put a number on.

Kirk: Well, what we love about Wikipedia, and what the internet was first presented to us as, was a home for collaboration. You know, the way that Wikipedia builds that collaboration, and the principles that Anusha talked about: you have a source, a neutral point of view,

you deliver things with no opinion, but there’s this key thing around transparency, and the fact that Wikipedia in all of its works says, hey, we’re gonna bring you in. And so this question of value... it’s funny, this is a great communications moment, because these words mean different things in different contexts.

But I think we would agree that Wikipedia has generated tremendous value for all of us because of the way it’s been constructed. What it has not done, to all of our benefit, is concentrate that financial or market value in the hands of just a couple of people, because that transparency leaves that value [00:38:00] open to all of us.

It doesn’t just collapse it into a couple of hands. So it’s funny, this whole conversation about AI, what’s a bot, what’s not, how are these things being constructed: it all feels like it sits on this edifice of transparency. It’s such a nuanced thing, we can’t even quite wrap our hands around transparency.

And yet it’s so obvious when you have it versus when you don’t.

Yeah, for sure. And the thing about Wikipedia is that it works. It really, really, really works. And when you go to a page, you can have some confidence, reasonable confidence, that this is the truth. As long as, you know, it hasn’t been edited in the last eight seconds and the community hasn’t yet surrounded a page and fixed an inaccuracy.

But the fact is that, you know, we all remember back in the day, and I talked about this with Anusha, that there was a time when, if it was on Wikipedia, people would roll their eyes, like, yeah, well, Wikipedia, how can you trust that? And now that misgiving is gone, and you can trust it. [00:39:00]

Whereas if you go to Twitter or Facebook or any of these other places, you have absolutely no confidence whatsoever that what you’re reading is true. And that’s the difference. The big question that comes to me, and maybe you have some thoughts about this, Kirk, ’cause you have big thoughts about big things, is: how do we take this model somehow

and attach it to our political conversation? How do we attach it to our democratic institutions? How do we truly open up so that people have confidence that what they’re getting is real? And how do we build a culture of confidence in, like, a need to learn what and how we know that things are real?

Even if you look at the fires in Los Angeles: Henry Winkler, the Fonz, is tweeting out that, you know, it was arson, like he knows. Where is that critical [00:40:00] thinking that has to animate important conversations? That’s what I think Wikipedia is producing for us.

But the counterbalance to that is this crazy, nutty world in which any accusation all of a sudden finds fertile ground.

Kirk: I’d love to see us do more content around this, because, you know, we’re in this together. This is the central question of our time, and I feel like it’s twofold. It’s both what’s true versus not, but also who the messenger is.

Is the messenger real versus not? And we don’t have transparency into any of it. We’ve talked about this before, but I feel like, for decades, what social media and the internet really have been is like we’re all drinking a cup of coffee and it’s completely opaque. We’re just trusting that what’s in the coffee is gonna be good for us.

In no other world would you allow someone to hand you something that you are gonna ingest without knowing what’s in it and what the sources are. And there’s something about, because [00:41:00] we can read it, because it’s written or audio or video information, we think we can discern what’s true or not.

We clearly can’t. You know, the mechanisms of our brains are clearly not well equipped to navigate this stuff. And I think the piece that Wikimedia’s been working with for all these years is this emphasis on human actors, known people doing this work in collaboration with each other.

Right. And that’s been a missing piece. And I think there’s an undercurrent to this too, which is, like, when we talk about the people with the many billions of dollars versus the million dollars: when you can say that your engagement is any number you choose to pick, because nobody actually knows how many of the nodes in the network are actually real versus mechanically produced, guess what?

You’re off to the races in terms of the claims you can make around the value. Which is why, again, I think this thing about Jimmy Wales and his net worth is actually really important, because it’s not just a nonprofit model that’s governing there, it’s actually how the information [00:42:00] is being shared.

It’s a more accurate representation of where real value lives. But, so, you guys got into something, and I just wanna say massive thanks to Anusha and the entire team at Wikimedia for thinking about knowledge equity. And this is the kind of stuff, right, that just becomes an absolute target for

whoever wants to politicize or jump on all these conversations. Which, again... I’m not even sure I like polarization as a concept, because we don’t know how much of that, air quotes, polarization is being driven by bots and propaganda campaigns and things intended to pull us apart from each other. But to thoughtfully step back and say, guess what?

With all these pages of information, one thing we know for sure is that we’re missing knowledge, and we need to have an intentional effort to actually fill in these pieces. This is the hard work, though, right? You have to stake a claim and say something like: we know what knowledge is there, we know what knowledge is missing, and we wanna be intentional about building that out.

Yeah, I think you’re right. You’re [00:43:00] absolutely right. And it’s not enough to say that the community will do what it wants to do. You do have to have a point of view around equity: that we believe certain voices are not being captured and we want that. And so it is the community plus, frankly, some, I don’t know, intelligence, caring, dare I say virtue, that you have to attach to it as well, in order to really get something that fully represents what the community, however you define it,

thinks, feels, cares about, and knows.

Kirk: Yeah. I love the notion, too, of the Wikimedia folks having open conversations with OpenAI and those folks about, hey, how do we protect what we’re creating here without you guys just grabbing all of our content and then spitting it out and monetizing it? I know.

But I mean, what do you think? How do we bolster that work? Anusha talked about the role for philanthropy there, but again, this feels like some of the real cornerstone work that has to happen. And I was thinking, Eric, is it possible that we’ve lived through these different eras of the internet, and we [00:44:00] kind of grew up around this notion of the friendly internet?

It was inviting. And again, the Wikipedia conversations are not, they’re not friendly, like those debates that are happening on the editing and talk pages, but they’re real people with real points of view, you know, transparently providing sources, at least on these key questions, like who holds the seat of power, humans or cats.

Humans are cats. But it’s like we, we, we, we grew up around this friendly notion in the internet, and now we’re just. It just feels like we are on the cusp. Maybe, maybe descending deeply into just this ugly internet era, you know, where, where, you know, where’s, where’s truth, where’s value in this stuff that we’re seeing?

Yeah,

I can’t answer your big question, which is how do you make that happen elsewhere, and what philanthropy could do to make sure that it happens. I think that philanthropy can do a lot of things to encourage it to happen, which is to support organizations to be transparent and to use facts, and to make sure that the mechanisms that you are using for communication are good. Because if an organization is kind of [00:45:00] fighting the same kind of ad hominem, us-versus-them, black-versus-white battles that folks who don’t have all of our best interests at heart are fighting,

then you’re kind of doing your thing on somebody else’s turf, and they get to say how that conversation occurs. And I think it can be expensive, you know, to step out of those arenas and into better ones. And, to a very, very small degree, platforms like Bluesky appear to be

less politicized right now, but that could change in a minute.

Kirk: Well, and clearly one answer to this question is you hire people like Anusha to do these key roles, right? And the amount of discipline, knowledge, and expertise that Anusha’s bringing to this work, the capacity to talk about it, the thoughtfulness...

Man. I mean, it feels to me like we need more Anushas doing this work, engaged in it, and really providing us leadership in how we sort this through. Because, [00:46:00] as transparent as Wikipedia is trying to be, and as the Wikimedia Foundation is trying to be, around all of this,

there’s a level of technical expertise required to navigate this conversation that clearly we need folks like her to help us with.

I totally agree. And by the way, I wanted to go back to the question you asked me in the intro. There did used to be an Eric Brown Wikipedia page. Somebody created it.

It was up for at least a decade, and the community seemed to think that I was worthy of a Wikipedia page. And then it changed its mind. So unless the community were to change its mind again and decide that I was worthy of a Wikipedia page, I am now relegated to nothingness status. And that’s the way it is.

Kirk: You achieved significance and then you lost it.

It’s a fable for our times.

That is correct.

Kirk: Oh man.

Maybe someday the community will change its mind, you know, and I’ll exist again. But until that time, you’ll just have to go to my IMDb, which is [00:47:00] also full of falsehoods. So...

Kirk: What can I say? That’s great. Well, Eric, that was incredible.

And Anusha, thank you so much for coming on Let’s Hear It. And can we please have more of these conversations about this whole topic around transparency, internet transparency, polarization, the role of bots, how we sort through all this misinformation and disinformation? It just feels like it’s a very sad time.

It’s funny, I almost feel like this notion that we had about the internet as a place you could go to get questions answered, to find random people out there... but they were always people. They were always people, you know? We weren’t in this kind of crazy world of we-don’t-know-what’s-what. That era, it feels like we’re on the other side of it, and we look back at it with fondness, and we’ve gotta find a way to get there

again, I feel like.

Well, I will do my level best to try and ferret out some of those conversations, Kirk.

Kirk: Since you asked. Appreciate it. Well, Anusha, thank you so much for joining us on Let’s Hear It. Eric, that was fabulous. Well done, as always. Thank you, sir. It’s just been an A-list set of guests this year, [00:48:00] and we’re gonna keep rolling.

We’re like, wow. Our guests this year have been absolutely perfect.

Kirk: That’s great. Well, okay, everybody, we’ll see you next time on Let’s Hear It. Okay, everybody, that’s it for this episode. Please let us know if you have any thoughts about what you heard today, or people we should have on this show, and that definitely includes yourself.

And we’d like to thank John Ali, the tuneful and inspiring composer of our theme music,

our sponsor,

Kirk: the Lumina Foundation. And please check out Lumina’s terrific podcast, Today’s Students, Tomorrow’s Talent. You can find that at luminafoundation.org.

Certainly thank today’s guest, and of course, all of you,

Kirk: and most importantly.

Thank you, Mr. Brown.

Oh, no, no, no, no. Thank you Mr. Brown.

Kirk: Okay, everybody, till next [00:49:00] time.