
DEF006 - Brittany Kaiser Interview Transcription

FACEBOOK & CAMBRIDGE ANALYTICA WHISTLEBLOWER

Interview date: Sunday 22nd September 2019
Interview location: Wyoming, USA

Note: the following is a transcription of my interview with Brittany Kaiser, the Facebook and Cambridge Analytica whistleblower. I have reviewed the transcription but if you find any mistakes, please feel free to email us. You can listen to the original recording here.

In this interview, we discuss how she came to work at Cambridge Analytica, the Brexit and Trump campaigns and whether she is a whistleblower.


Interview transcription

Peter McCormack: 00:01:42       
Nice to see you.

Brittany Kaiser: 00:01:43       
Yeah, you too. Thank you so much for having me.

Peter McCormack: 00:01:45       
No, not a problem. I'm very interested in this story. Firstly, I was fully aware of the story, but not of yourself, until probably a few months ago. When was that? It was when Caitlin had emailed me. She said, "She's coming to Wyoming. You've got to meet Brittany."

Brittany Kaiser: 00:02:00       
Wonderful.

Peter McCormack: 00:02:01       
And so then I went back and looked at-

Brittany Kaiser: 00:02:02       
I'm humbled.

Peter McCormack: 00:02:03       
Well, then I went back and looked at the story, and then the film was coming out, but I wasn't aware of any of the players at the time. So I'm really interested in knowing both sides of this story. When I say both sides, your life appears to be in two acts at the moment, up until Christopher Wylie mentioned your name in-

Brittany Kaiser: 00:02:20       
In parliament?

Peter McCormack: 00:02:21       
In parliament. Well, I think it was more of a hearing, wasn't it?

Peter McCormack: 00:02:23      
And then everything since.

Peter McCormack: 00:02:26       
So I'm really interested in both sides. So what I want to do is, I want to go through the background, talk about what happened, and then talk about what you're doing now.

Brittany Kaiser: 00:02:32      
Of course.

Peter McCormack: 00:02:33       
Cool. So I'm most interested in when you'd gone for the Obama campaign. So can you talk to me about what happened, because that sounds, like obviously, a very exciting thing to be involved in. You worked on the social campaign. Can you talk to me about kind of your schooling and what led to that?

Brittany Kaiser: 00:02:46       
Yeah, of course. So I had first actually met Barack Obama in 2004. It was the Democratic National Convention in Boston, and I was going to boarding school at the time outside of Boston, and I had been invited to participate in a youth leadership summit that actually taught you how to run political campaigns. So they took a bunch of young students and we ran a mock political campaign during the Democratic National Convention, and then they would take us to the different caucuses and as many of the speeches as possible, and get us involved in all of the issues, advocacy, and events, for us to fully understand the process.

Brittany Kaiser: 00:03:28       
And I went to a small environmental rally, where then state Senator Barack Obama was speaking about how he and Senator Dick Durbin were doing work in the state of Illinois to stop British Petroleum from dumping into the streams and waterways that feed into Lake Michigan. It was actually a loophole that George Bush had created.

Peter McCormack: 00:03:50       
Wow.

Brittany Kaiser: 00:03:50       
You used to not be able to dump into any of the waterways in America, and then instead, all of a sudden, there's this loophole where you can't dump directly into the Great Lakes, but you can dump into the streams and waterways that feed into them.

Peter McCormack: 00:04:02       
All right.

Brittany Kaiser: 00:04:02       
So I was absolutely floored by how eloquent he was, how brilliant he was, and there were only 30 people there, about, to listen to him speak. So I stayed afterwards and asked him what I could do to help. I mean, I was totally in awe of him.

Peter McCormack: 00:04:19       
How old were you?

Brittany Kaiser: 00:04:19       
I was 15, 16.

Peter McCormack: 00:04:19       
Wow, okay.

Brittany Kaiser: 00:04:21       
15 years old, I think.

Peter McCormack: 00:04:24       
Okay.

Brittany Kaiser: 00:04:24       
And I was lucky that he had the time to speak to me, and he said, "You know what? I'm going to run for U.S. Senate. You should volunteer for my campaign." And he even invited me to breakfast the next morning. I had breakfast with him and he gave me a ticket to see his famous speech about how there's not the red states and blue states, there's the United States, where the whole world fell in love with him and decided he would, one day, become president.

Brittany Kaiser: 00:04:51       
I agreed, and I quit Edinburgh University briefly in order to go work on his campaign in 2007. So I was a 19-year-old intern, unpaid, slaving away 24/7, sometimes sleeping in the office. They had blankets and pillows, and-

Peter McCormack: 00:05:08       
Is this in D.C.?

Brittany Kaiser: 00:05:09       
This was in Chicago.

Peter McCormack: 00:05:10       
In Chicago.

Brittany Kaiser: 00:05:11       
So, yeah, he's an Illinois Senator, so usually wherever you're from ends up being where your headquarters is.

Peter McCormack: 00:05:18       
Okay, that's where your base is.

Peter McCormack: 00:05:18       
Okay.

Brittany Kaiser: 00:05:19       
And so I had, in high school, started doing digital design in order to run part of the school newspaper. I was head of photography, so I had learned a lot about digital editing and a lot about content creation. And so I got a place on the New Media team at the time. New Media meant data and digital design and creative and kind of everything put together in social media, because it was so new.

Brittany Kaiser: 00:05:47       
So, the New Media team, my first assignment was to create Barack Obama's Facebook page, and I was sat right next to Chris Hughes, the Facebook co-founder, who I had actually gone to high school with, and on a day to day basis, we were creating the first ever social media communications campaigns. No one had ever used social media for any reason, besides to talk to their friends, or to find a new girlfriend or boyfriend, or date, or what have you.

Brittany Kaiser: 00:06:14       
So, every day, we're trying new things. We started to collect data in rudimentary ways, the most basic ways. We were looking on Facebook at what people cared about and what people were commenting on, and then we would start to categorise them, "Okay, this person cares about the environment. This person cares about healthcare." And that data would end up feeding into all the emails that we sent them, the text campaigns that they would get, the ways that we would respond to them, which at the time, was one-to-one.

Brittany Kaiser: 00:06:43       
I would literally be writing emails or messages to these people and saying, "Thank you for contacting Barack. He's on the campaign trail. So, my name's Brittany, and we're really excited that you support us, and here's everything else you should know about healthcare, because you care about healthcare policy." And so that was the beginning of, what you could call, micro-targeting, which was manual.

Peter McCormack: 00:07:07      
Well, I mean, so I've worked in the industry before this. I did 20 years in advertising, and we were a digital agency, so email campaigns, social media campaigns, so I'm fully aware of it. And one of the reasons I want to go through the backstories, I kind of want to piece together the series of events that led to you working for Cambridge Analytica, and also try and identify where something goes from being the use of technology in an appropriate way, to something a little bit more sinister.

Peter McCormack: 00:07:34       
Because I think it's very easy to just come in and say, "Right, I'm going to have a go at Brittany. I'm going to really tear her down about what happened at Cambridge Analytica." But what I believe is, there's just a series of events that takes you on a journey, so that's what I'm trying to learn.

Brittany Kaiser: 00:07:47      
Thank you.

Peter McCormack: 00:07:47       
So that stuff with the Barack campaign, how much of that was almost the kind of building blocks of the work that became Cambridge Analytica?

Brittany Kaiser: 00:07:55       
It was the very beginning of data collection on social media and using people's data in order to target them with political messaging meant specifically for them. So everything that started to come out of that was advanced data analytics and more sophisticated ways of collecting more data about citizens. So, obviously, the more you know about someone, the easier it is to engage them and to get them interested in what you had to say.

Brittany Kaiser: 00:08:21       
So, at the time, everything seemed completely above board to me. I was getting people to register to vote that had never voted before. I was getting people to care about voting again that hadn't voted in 20 years. I was inspiring people to care about issues that were really important to me as well, and I saw that as a wonderful thing.

Brittany Kaiser: 00:08:40       
And throughout the many years after that, I worked for charities, nonprofits, and United Nations agencies doing fundraising and marketing and data-driven communications, and so my entire experience of data was that, it's only used for good. The more data that we have on people, the better we can actually be at achieving important goals. Right? And so I never really saw the dark side of it. I never thought, "How can this be abused?" It took me a really long time to get there.

Peter McCormack: 00:09:13       
How do you even identify what is appropriate use and what is abuse? The only thing, from my side, is that I like positive messages and positive campaigns. So if you find out somebody's on Facebook, you find they're interested in the environment, then you can push Obama's environmental message to them.

Peter McCormack: 00:09:31       
The stuff I don't like is that, perhaps, they potentially want to vote for Hillary. We're going to push negative imagery of Hillary and paint a picture of her as a crooked person, almost like the defensive kind of use. That's the stuff I don't like, but-

Brittany Kaiser: 00:09:43       
I'm with you 100%.

Peter McCormack: 00:09:44       
But the problem, Brittany, is I saw a bunch of things that Cambridge Analytica did that I thought were wrong, but with all this research, because we've been talking for a couple of months now, I've never been able to draw the line and say, "There's the problem." Have you?

Brittany Kaiser: 00:10:00      
So it took me a really long time to fully grasp what was going on and what negative use cases of data fully looked like and what their impacts were. So, I suppose, in order to... I'll answer the question, but there's more of a backstory to that.

Peter McCormack: 00:10:17       
Okay.

Brittany Kaiser: 00:10:18       
But I mean, today, I would say there's a number of ways of measuring what is positive and negative use. One is, if you are using data to commit crimes. There are very specific things that data has been used for in elections all over the world that are not legal, like voter suppression, and some negative campaigning can be considered that.

Peter McCormack: 00:10:40       
Trinidad, was that voter suppression?

Brittany Kaiser: 00:10:42       
Yes.

Peter McCormack: 00:10:42   
So, interestingly, I wrote down here, "That was the most concerning part of the whole documentary for me." Whilst I live in Britain and Brexit was a big issue, and obviously what happened with Trump is a big issue, both campaigns, that was the most concerning part of the whole film for me. In that, a bunch of voters were convinced not to vote to swing an election, and that, for me, went to... People were being manipulated to drive an election to a specific result without even realising it. I found that was very different from propaganda of sending them messages about why you should vote for this person versus that person.

Brittany Kaiser: 00:11:17       
Exactly.

Peter McCormack: 00:11:17       
This one, you had to stop them voting.

Brittany Kaiser: 00:11:20       
So what was really interesting, when I was on the Obama campaign, they decided to make a massive statement by only doing positive campaigning. So negative campaigning was completely ruled out, and any negative commentary about Democrats, or Republicans even, was to be deleted.

Peter McCormack: 00:11:38       
Yeah, right.

Brittany Kaiser: 00:11:39      
So if people would write on Barack Obama's Facebook wall, for instance, something negative about John McCain or something negative about Hillary, we deleted it, and we didn't consider it censorship. We just had a rule. We're only doing positive messaging, and a response to someone criticising Obama would also be positive and highlight what his policies were that were actually going to solve whatever problem that he was being questioned about, so that was completely amazing. That was my first experience with political campaigning with something that was only positive.

Peter McCormack: 00:12:11       
But even in that, you've got the challenge of what is a negative message that you want to delete, and what is fine, because Hillary Clinton is crooked, that's a negative message. You could probably find some grey areas where it may be a criticism of her policy, but maybe comes across as aggressive. Did you have those challenges?

Brittany Kaiser: 00:12:32       
Yeah. It was one of the hardest conversations we ever had inside Obama HQ, which was, "Okay, we're using social media for the first time to reach out to people politically," and some people are writing back some very nasty things. Some of them are lighter, and then some of them are like really incitement of racial hatred.

Peter McCormack: 00:12:53       
Right.

Brittany Kaiser: 00:12:54       
And where are we going to draw the line, number one, and are we going to either support all free speech, and say, "Everybody can say whatever they want and we're going to let that happen." Or are we going to censor things that we think are so negative that it's bad for it to be on our wall. We refuse to host that content. Right?

Peter McCormack: 00:13:17       
Okay.

Brittany Kaiser: 00:13:18      
And so, we ended up having an army of hundreds of volunteers that were vetting these things, and so, sometimes, that would get passed to a supervisor, but sometimes, these are positive, excited teenagers that are making a decision about what is too negative to be hosted on Barack's page. And, obviously, as you could imagine, with a minority running for the presidential seat, people had so many really terrible things to say that we actually had an entire list of people that we would report to the FBI on a daily basis.

Peter McCormack: 00:13:55       
Wow, okay.

Brittany Kaiser: 00:13:55       
So, that was part of my job if things got that negative. So that was where I saw the problems of social media in the beginning, that people can hide behind a digital mask, and therefore, it makes people more aggressive, and it really brings out the worst in some people, unfortunately.

Peter McCormack: 00:14:13       
Well, I've lived that doing this podcast. I've had numerous arguments and offensive comments online, and then I meet the people in person and we get on great. I just did an interview with this guy, Ragnar. We've clashed. We've even blocked each other a couple of times, but we met up, we talked through it, and we recorded one of my best interviews. Okay, so moving forward anyway. So you win the election, Obama wins the election.

Brittany Kaiser: 00:14:35       
Yeah.

Peter McCormack: 00:14:35       
Everyone's happy.

Brittany Kaiser: 00:14:35    
Everyone's happy.

Peter McCormack: 00:14:37       
You must be feeling like, "This is great. What a career I've got here ahead of me." You've worked on a really positive campaign. Everyone loved Obama.

Peter McCormack: 00:14:43      
I mean, I know he has had his critics since. I really liked the guy. I could have voted for him. Right?

Brittany Kaiser: 00:14:48      
Yeah.

Peter McCormack: 00:14:49      
Okay, but after that, you went to work in human rights.

Brittany Kaiser: 00:14:52      
Yeah. I actually left the Obama campaign to move to Hong Kong, which is very topical these days, and I very quickly made friends with a lot of human rights activists there, and I was very inspired by the work that they were doing. There were a few different things that they were working on. One was helping North Korean refugees get somewhere where they could seek asylum and not be sent back. The second was opposing something called Article 8, which the Chinese Communist Party was trying to put through the Hong Kong parliament, which would mean that they would have a final veto over anything that happened.

Peter McCormack: 00:15:31      
Right.

Brittany Kaiser: 00:15:31      
So, similar to what's recently been going on in Hong Kong, the Chinese Communist Party was trying their best to get Article 8 passed, and I got involved in all of these massive protests in Hong Kong, where hundreds of thousands of people would flock out into the streets, and the Hong Kong Parliament would end up either rejecting or dropping the bill, or what have you.

Brittany Kaiser: 00:15:53      
So I got very into that, and I realised political campaigning is great, but human rights activism is where I see my future, and that's what I ended up doing for so many years. I ended up doing four law degrees in human rights law and international relations and diplomacy in order to learn how to be the most effective human rights campaigner that I could, and that's how, strangely, I ended up at Cambridge Analytica.

Peter McCormack: 00:16:19      
But just before that, didn't you find frustrations in working in human rights and actually seeing the impact of the work?

Brittany Kaiser: 00:16:26     
Yes. So I found that, when I was doing high level lobbying at the United Nations, at the European Parliament, you would see, sometimes, politicians taking up your cause, maybe deciding to pass a new law, or bring about a national initiative in their country, and that's great. But a lot of times, you can't see, once a law has been passed, how quickly it's ratified and how quickly that actually turns into positive action.

Brittany Kaiser: 00:16:52     
And then I did a lot of on the ground work, so working on sustainable development programs, where I'm working in a rural village doing something like helping NGOs be more effective in areas where they don't have that much knowledge or they're new. And you could see, on a day to day basis, all of the amazing work that you're doing, but you know that you're affecting a very small group of people, and if you could get these programs throughout an entire country or a whole region, that you would have a much bigger effect. But once you start doing things at that level, it's very hard to measure if those people on the ground are actually seeing the benefit of your day to day work.

Brittany Kaiser: 00:17:32      
So that is how I got to a place where I really wanted to learn about data analytics. So I was writing my PhD in genocide prevention, so I'm an expert in crimes against humanity, and I got to this third chapter, where I was writing all about big data, and how, if you could figure out how to harness big data and model it properly, that you could build early warning systems and stop wars before they start, as opposed to spending six years at the International Criminal Court in the Hague and the war criminal dies before you even get to prosecute him.

Peter McCormack: 00:18:08      
Or most of it.

Brittany Kaiser: 00:18:09      
Yeah. So there's a lot of examples of that, and I thought I would work in international criminal justice, and when I saw how ineffective some of that work actually is, I thought preventive diplomacy is much better. And so I'm trying to write my third chapter about predictive analytics and early warning systems. No one at my law school could help me with that, so I got introduced through a friend to the CEO of Cambridge Analytica, Alexander Nix.

Peter McCormack: 00:18:38     
But that's a hell of an intro. It's not like an intro to a data scientist in a company that can help you. It's an intro to the CEO. So I'm guessing there was more to it than just, "Hey, can you help me out with this data I want to produce for my work in human rights?"

Brittany Kaiser: 00:18:51      
Yeah. I mean, you can read all about that first meeting in my book, Targeted. It's where the book opens, actually, when I first meet Alexander Nix.

Peter McCormack: 00:19:01      
Was that in London?

Brittany Kaiser: 00:19:01     
It was in London. It was in Mayfair at a sushi restaurant.

Peter McCormack: 00:19:04      
Best city in the world?

Brittany Kaiser: 00:19:05      
Yeah. And it was funny. I wasn't actually invited to the lunch to meet him at first. I was invited to the lunch to meet some other gentlemen that were there, that had an election going on in their country, and my friend knew that I had done work for the Democrats for a decade, and that maybe I could help them out with their social media strategy.

Brittany Kaiser: 00:19:25      
So I showed up, and Alexander was also invited to the lunch to pitch as well. So we were actually pitching against each other in our first meeting, which was kind of hilarious. He obviously was much more experienced than me and I found that out very quickly. And the friend that had invited us both thought it would be hilarious that his Republican and Democrat consultant friends would actually meet, and perhaps spend time together in the future.

Brittany Kaiser: 00:19:54      
And I learned a lot about what he did at that lunch, and I thought, "Hey, here's a guy that, maybe in the U.S., he's consulting to Republicans, but in the rest of the world, he is running defence contracts for NATO, helping young people not get recruited into ISIS online, and teaching all allied militaries how to keep kids at home with their families, and not sneaking themselves into Syria."

Peter McCormack: 00:20:20      
Great. So this is why Brittany didn't just jump from being a popular supporter of the Obama campaign, a Democrat pushing positive messaging, straight into negative Trump campaigns. You identified the work they were doing, and there was a lot of alignment with your interests at the time.

Peter McCormack: 00:20:36      
Okay, so that makes sense.

Brittany Kaiser: 00:20:37      
Yeah.

Peter McCormack: 00:20:37      
So...

Brittany Kaiser: 00:20:39      
So, I ended up eventually becoming a consultant to the company, specifically, originally on social programs. So they had done a lot of work for UNESCO and UNDP and a lot of ministries of health around the world, and I was trying to win them contracts in the social sphere and also in international elections. So I was pitching green parties and liberal parties, etcetera, and so forth, in different places.

Brittany Kaiser: 00:21:07      
Eventually, I ended up in the United States, mostly actually working on commercial applications. So I would have been doing the same thing you did when you were in advertising, going and meeting with the Unilevers and the Coca-Colas of the world to explain to them how to use data better. And that was when I started to really learn the extent of the lack of data protection in the United States. So I had no idea that you could actually collect that much data as a commercial company, that you could just purchase and license the amount of data that the NSA holds. Right?

Brittany Kaiser: 00:21:41      
I didn't think that was possible. I obviously was a very avid follower of Snowden's revelations when Edward first came out, and I wasn't totally surprised that the government's collecting all this data, but I was surprised when I learned how much data Cambridge Analytica was able to buy and hold about Americans without even being an American company originally. Right?

Peter McCormack: 00:22:04      
Yeah.

Brittany Kaiser: 00:22:04      
So, foreign actors can just purchase and license insane amounts of data about American citizens, and that was kind of surprising. But at first, I thought, "Okay, this is cool. I want to learn how to use these tools," because everything that was done at Cambridge Analytica came with something called measurements of effectiveness, MOE. It's usually used by the United Nations and NGOs to describe projects, but a lot of times, the impact of those projects is very hard to measure.

Brittany Kaiser: 00:22:34      
So I thought, "Okay. Hey, I'm going to learn how to measure the effectiveness of everything that I do, so I know, if I'm campaigning, and I'm putting my blood, sweat, and tears into something that it's actually working, and this is cool, and I'm going to stay here until I really learn how to do this."

Peter McCormack: 00:22:50      
Okay.

Brittany Kaiser: 00:22:51      
Yeah.

Peter McCormack: 00:22:52      
Then?

Brittany Kaiser: 00:22:53      
So some of my-

Peter McCormack: 00:22:56      
You're obviously impressing them at this point. They've obviously taken note of your work.

Brittany Kaiser: 00:23:00      
Yeah. We were getting along very well. I developed a very close personal friendship, I thought, at the time, with Alexander, and he took me under his wing as his protege, so to say, and I traveled with him around the world, watched him pitch, learned all of the different ways the data could be used in different countries.

Peter McCormack: 00:23:19     
Living a good life alongside it though?

Brittany Kaiser: 00:23:21      
Yeah.

Peter McCormack: 00:23:21      
I mean, were-

Brittany Kaiser: 00:23:22      
I mean, Cambridge Analytica, at the time, you wouldn't imagine, but we actually scrimped on the budget.

Peter McCormack: 00:23:28      
Oh, were you?

Brittany Kaiser: 00:23:28      
So, obviously, Alexander is wealthy himself, so he would treat me to nice meals, and we would go out and have fun, but we were staying in budget hotels. We flew economy, even all the way across the world. Even he flew economy, and would stay at a Holiday Inn. Right? So, I mean, he was using Rebekah Mercer's money, so he had to be accountable.

Peter McCormack: 00:23:51      
Okay, okay.

Brittany Kaiser: 00:23:52      
When he was using his own, he would treat me, which was really lovely. But-

Peter McCormack: 00:23:56      
How old were you at this time?

Brittany Kaiser: 00:23:58      
I was 26.

Peter McCormack: 00:23:59      
So it's pretty cool at 26 to be doing all this, right?

Brittany Kaiser: 00:24:01      
Yeah, I thought so.

Peter McCormack: 00:24:02      
Yeah.

Brittany Kaiser: 00:24:03      
So I wanted to learn as much as I could learn from him, and for the first year I was at the company, I was still writing my PhD. So, all of a sudden, instead of one chapter on data, I am spending every single day learning an entire doctoral thesis worth of information about how data can be used and can be effective. Right?

Brittany Kaiser: 00:24:24      
And it wasn't really until I saw some of the results of the Brexit campaign, and how data was used there, and when I was first presented the results of the Trump campaign. Those were two totally different situations, where Cambridge actually was involved in the Brexit campaign, but didn't run it. So Cambridge didn't put together the messaging.

Peter McCormack: 00:24:47      
Why was there so much denial about being involved?

Brittany Kaiser: 00:24:51      
Well, I think one of the biggest problems was that it's more, I don't know what the saying is, but people say, "It's the coverup, not the crime, that ends up being worse."

Peter McCormack: 00:25:02      
Yeah.

Brittany Kaiser: 00:25:02      
So there was a bit of work that was done for the Leave.EU campaign. I was one of the people doing that work, and that work was never paid for by those Brexiteers, which is why Cambridge didn't end up actually running the full campaign. We did the first phase of work, and they didn't pay us, so we didn't end up coming on to run the whole campaign.

Peter McCormack: 00:25:24      
Why didn't they pay you?

Brittany Kaiser: 00:25:26      
Who knows. I mean, ask Arron Banks, not that he tells the truth, but-

Peter McCormack: 00:25:32      
Okay.

Brittany Kaiser: 00:25:33      
He ended up never paying our invoice, and I think it's possibly related to the fact that they never received the designation. Vote Leave became the official campaign for Brexit, and what that means is, the designated campaign gets 7 million pounds from the government, and then the other campaign, they don't get any TV slots, they don't get any funding, and you can continue, or not, to actually run your campaign, but you don't have any official support.

Brittany Kaiser: 00:26:00      
So it would have been out of the pockets of the people from the campaign, not from donors or the government. So that's possibly one of the reasons, but I'm not going to make excuses for people that seem to not tell the truth.

Peter McCormack: 00:26:15      
But is that the depth of how much you were involved in Brexit? Or was it more?

Brittany Kaiser: 00:26:19      
Well, there's a lot more that I'm learning every few weeks about how some of the data that Cambridge Analytica came to possess from the UK Independence Party and from Leave.EU may have actually been used more than we thought. So you'll find out, don't worry, very soon. But I think one of the interesting things is that Cambridge's partner company, which at the time was running all digital campaigns with Cambridge Analytica, called AIQ, where Christopher Wylie was actually working, ran the Vote Leave campaign, the official designated campaign. Damian Collins published some of the plans, and what they did, it was a Cambridge Analytica campaign.

Brittany Kaiser: 00:27:05      
I mean, it was really, data was proposed to be used in the exact same way that Cambridge proposes it. So Christopher had worked at Cambridge Analytica before, and then he switched over to AIQ, which was a Cambridge partner company. We actually called it SCL Canada. SCL Group owned Cambridge Analytica, and they ran the Vote Leave campaign.

Peter McCormack: 00:27:28      
He's an unusual character, Christopher Wylie.

Brittany Kaiser: 00:27:32      
I've never met him in person, so we don't actually know each other.

Peter McCormack: 00:27:34      
Right. So the read I got from him was that he was almost just competitive.

Brittany Kaiser: 00:27:40      
Well, he did set up a competing company to Cambridge Analytica, called Eunoia Technologies, with apparently a larger Facebook dataset than Cambridge had, and pitched the Brexit campaign and pitched the Trump campaign.

Peter McCormack: 00:27:55      
Yeah.

Brittany Kaiser: 00:27:55      
Lost the Trump pitch to Cambridge.

Peter McCormack: 00:27:57      
Yeah.

Brittany Kaiser: 00:27:58      
And he was actually in active litigation with Alexander for a long time, and he had to shut down his company because of those lawsuits, for breaking all of his contractual obligations.

Peter McCormack: 00:28:10      
So we can take a lot from that. Okay. So, Britain votes for Brexit.

Brittany Kaiser: 00:28:13      
Britain votes to leave, and you start to see all of the accountability only come out afterwards. Okay, here's all the claims that these campaigns were made off of, and here's the messages that people were targeted with, and some of this is blatant lies and misinformation, not just miscalculation of numbers. And a lot of it was fear mongering, some of it was incitement of racial hatred, and you start to see that data was used to target people in order to manipulate them, and not using truthful information, and not using data for good. It wasn't a campaign built on positivity, unfortunately.

Peter McCormack: 00:28:54      
Do you think the course of history was changed?

Brittany Kaiser: 00:28:57      
Yes. I mean, especially, I lived in the U.K. for almost 14 years. I'm a permanent resident. I consider Britain my home. At the time, I really thought, "I'm going to marry an Englishman. I'm going to live in the U.K. my whole life, and I'm going to have British children." So I want to see a great future for the U.K., and for the past three years, all I've seen is turmoil. I've seen society be ripped apart in the United Kingdom, and it breaks my heart.

Peter McCormack: 00:29:25      
Yeah, it isn't great. It's very interesting to see how it's going to play out actually. I was very anti a second vote. I was like, "We had a vote." I actually voted to remain, and as a country we voted to leave. But now I feel there's enough in this to be deserving of a second vote.

Brittany Kaiser: 00:29:42      
Yeah, I agree with you actually.

Peter McCormack: 00:29:45      
Because I felt like-

Brittany Kaiser: 00:29:45      
This is the first time I've said that out loud, but yes.

Peter McCormack: 00:29:47      
Wow, okay, yeah. No, I do, because I feel like we don't really know. We don't know if democracy was skewed here.

Brittany Kaiser: 00:29:54      
Yeah.

Peter McCormack: 00:29:54      
We can't tell, and actually, there was something Christopher Wylie said in the film, where he talked about, if you take drugs and cheat in the Olympics, we don't look at how much you took. You cheat and you're out.

Brittany Kaiser: 00:30:06      
Yeah, I know.

Peter McCormack: 00:30:06      
I thought that was a really well put point.

Brittany Kaiser: 00:30:08     
I love that quote. I think it really illustrates to people what this means, because when I look at what's happened in the repercussions of Brexit, I see that I was able to give evidence that led to the biggest ever fine given to a private company for data abuse.

Brittany Kaiser: 00:30:30      
It was unfortunately only £500,000 to Facebook, which they make apparently every seven seconds or something like that, but still it was a fine. And then there were multiple fines given to the Leave.EU campaign and Eldon Insurance for data abuse as well. And there were also violations on campaign finance and reporting.

Brittany Kaiser: 00:30:50      
So if you tally up all of the fines that were given, it was £500,000 to Facebook and £135,000 for Arron Banks' two entities. So I really think that British democracy and the integrity of the European Union was sold for £635,000.

Peter McCormack: 00:31:12      
It's not much.

Brittany Kaiser: 00:31:14      
No, it isn't.

Peter McCormack: 00:31:15      
Jeez. Okay. Okay. Well, that's something I'm probably going to look into again in the future. I do want to now focus on the bridge into the Trump campaign. When did that become something where you had to make a decision to work on it?

Brittany Kaiser: 00:31:31      
So I never worked on it. I was actually brought to New York to do the first pitch. So I did the first pitch to Corey Lewandowski in Trump Tower, which was-

Peter McCormack: 00:31:43     
In the room?

Brittany Kaiser: 00:31:44      
... an insane experience coming into Trump Tower and being told by Corey that I was on the set of The Apprentice. So I'm on a reality TV set, pitching a campaign that I was told was not going to likely be a presidential campaign, but probably more a commercial marketing exercise.

Brittany Kaiser: 00:32:07      
And that was kind of mind blowing, but it also reassured me that this man's never going to be president, so this isn't a big deal.

Peter McCormack: 00:32:17      
Nobody thought he would.

Brittany Kaiser: 00:32:17      
I'm in here pitching a contract for Trump Corporation. Okay, just like any other company in New York that does business. Fine. Sure. And many months later when Cambridge actually started working on the campaign, it wasn't really like everybody in the company started working on it.

Brittany Kaiser: 00:32:41      
At the time the company had grown really large. We were about 120 people, and there was at least a seven person team that started on the Trump campaign and I was not one of those people. I'm not an ad tech specialist or a data scientist, so that was really the people that were sent to San Antonio and worked in Trump Tower.

Brittany Kaiser: 00:33:04      
If you were research, data science or digital strategy, that's what you did. I was business development, so I kept on going out and pitching new contracts around the world while my colleagues were working on the Trump campaign.

Peter McCormack: 00:33:19      
So you only ever pitched, you never did any work supporting any-

Brittany Kaiser: 00:33:23      
Not at all.

Peter McCormack: 00:33:24      
Okay.

Brittany Kaiser: 00:33:25      
Not at all.

Peter McCormack: 00:33:25      
That's not explained in the film.

Brittany Kaiser: 00:33:27      
No, it's not. But you only have under two hours to explain a very complex topic.

Peter McCormack: 00:33:33      
Yeah, of course. Of course.

Brittany Kaiser: 00:33:33      
So you can't go into all of it, but as my colleagues were working on the Trump campaign, there is a rule from the Federal Election Commission that if you are working on a campaign, you have to do a training and sign a lot of paperwork to enter into what they call a firewall.

Brittany Kaiser: 00:33:49      
So it means nobody outside of the firewall is allowed to know what you're doing. Especially me, where I was communicating with high level people from Super PACs and other presidential campaigns, I wasn't to know what they were doing on a day to day basis at all. I couldn't be copied into emails, I couldn't be a part of meetings, I wasn't supposed to see any content, nothing.

Brittany Kaiser: 00:34:11      
So most of what was going on in the Trump campaign, unless I saw something sitting on a colleague's desk or happened to walk by one of the creative's computers as they were designing an ad, I didn't know what they were doing.

Peter McCormack: 00:34:22      
When were you first aware that they were manipulating people, misusing data?

Brittany Kaiser: 00:34:29      
When they gave what they called the Trump campaign debriefing. We had two days in December, a month after election day, where we were given an eight-hour presentation each day of what was done in both the Trump campaign and for the Trump Super PAC, which was called Make America Number One. It was the Defeat Crooked Hillary campaign, if you recall that.

Peter McCormack: 00:34:54     
Yeah, I remember that.

Brittany Kaiser: 00:34:55      
So it's December 20th-

Peter McCormack: 00:34:57      
The handcuffs, and the hook or crooked.

Brittany Kaiser: 00:34:59      
Yeah. Designed by Cambridge creatives. And we had 16 hours of being shown every single thing that was done in the campaign.

Peter McCormack: 00:35:13      
Okay. So that was a revelation. Who's in the room? Is Alexander in there?

Brittany Kaiser: 00:35:16      
So we had every Cambridge office around the world dialled into a webcast. So at the time we had an office in New York, in D.C., Mexico City, London, the Balkans, Hong Kong. It was really starting to proliferate.

Peter McCormack: 00:35:35      
But what was the atmosphere on that call? Was it people going, "What the fuck is going on here?" Or was everyone just enamored with it? I mean I'm trying to imagine.

Brittany Kaiser: 00:35:46      
So it was a really awkward time for me and the other people that were in New York headquarters. So New York was mostly people who were just working on commercial advertising campaigns.

Brittany Kaiser: 00:35:58      
They were selling cars and toothpaste, and this is the first time that we're receiving information about what exactly was done for the Trump campaign in the Super PAC, what the creative pieces looked like, what were the advertising, what was the strategy, and what we were hoping to take away from this is "Okay, here are all of the numbers and how impactful all of your strategies were so that we can convert that to pitches for commercial companies."

Brittany Kaiser: 00:36:28      
So all of us are in New York, waiting to just see exciting numbers, like, "Hey, 20% increase in voter registration, 35% increase in intent to vote for Donald Trump," whatever it is. And instead of getting all of these really useful numbers, we are shown some of the most disturbing pieces of advertising you've ever seen in your life.

Peter McCormack: 00:36:51      
Propaganda?

Brittany Kaiser: 00:36:53      
Definitely.

Peter McCormack: 00:36:53      
Okay.

Brittany Kaiser: 00:36:55      
100%. And things that were put together in such a way specifically to incite fear and to weaponise racism and sexism and so many different divides in American society.

Peter McCormack: 00:37:11      
Let me ask you something. The Trump campaign will have a team outside of your team, of the Cambridge Analytica team, right?

Brittany Kaiser: 00:37:17      
Yes.

Peter McCormack: 00:37:18
How much of this negative propaganda will have come from the Cambridge Analytica team, and how much would have just been part of the Trump team also having that strategy?

Brittany Kaiser: 00:37:30      
Well, the scary thing is that it wasn't really a Cambridge Analytica or Trump campaign decision. It's decisions that come out of the data.

Brittany Kaiser: 00:37:38      
So if you see that there's a large group of people that are not really likely to vote and they are neurotic, introverted and they care about national security, you would end up seeing an entire messaging campaign, inciting fear to people that are easily scared and that make their decisions based on fear.

Brittany Kaiser: 00:38:06      
And talking about how terrorists are going to come through our porous border, and if you vote for Hillary, we're all of a sudden going to be living under Sharia law. And you see these things, which is complete fear mongering and misinformation, and it's the data that's telling them to make these decisions because here are people that are easily scared.

Brittany Kaiser: 00:38:32      
Here are people that think that national security is our biggest issue, and they can easily be persuaded either not to vote or to definitely vote for Trump instead of Hillary if they're shown this type of, as you said, propaganda. I would definitely call it that. Yes.

Peter McCormack: 00:38:48      
So the Cambridge Analytica data was essentially the bedrock of the whole campaign that would have driven everything. But I was publicly aware of very negative Trump campaigning. I mean it was out there. It was obvious to see. I could see the stuff they were saying about Hillary. How had you not connected that to the Cambridge Analytica work?

Brittany Kaiser: 00:39:09      
Well, I mean, again, Trump isn't controlled by anybody.

Peter McCormack: 00:39:12      
Yeah, of course.

Brittany Kaiser: 00:39:12      
He makes his own decisions about-

Peter McCormack: 00:39:14      
No, I mean the campaign.

Brittany Kaiser: 00:39:15      
Of course. Yeah.

Peter McCormack: 00:39:17     
Like for me, I just see very negative campaigning. At no point had you thought, "I wonder if this negative campaigning is the result of our work?"

Brittany Kaiser: 00:39:25      
I mean definitely at the time I wasn't very excited about this person running around talking about building a wall. I was actually building an office in Mexico City and I had moved to Mexico and a lot of my best friends were Mexican and I was actually horrified when I would hear Donald Trump speak on television.

Brittany Kaiser: 00:39:43      
But I thought, "Hey, this guy's going to lose and he's going to be discredited and this is just going to be an embarrassment for him for the rest of his life and it's going to motivate people to squash out this type of rhetoric in politics." I really thought that it was a total joke that he was running.

Peter McCormack: 00:40:00      
And after the meeting where you're all shown the work, are you doing soul searching at this point? What's going on then?

Brittany Kaiser: 00:40:06      
So it was a really interesting time in my life where I didn't have any money in savings. My father had just had brain surgery and we had learned that he would never work again, and my grandfather was about to pass away. And it was a time where I thought, "Hey, this would be a great time to leave this company, but I don't know where another pay check like this is going to come from."

Brittany Kaiser: 00:40:33      
I had actually spent most of my life working for free, doing pro bono work or earning next to nothing, sometimes even fundraising through family and friends to cover the costs of going to do a voluntary project somewhere. So being on a salary from a for-profit company, this was the first time I had ever had that. This was my first permanent job ever.

Brittany Kaiser: 00:40:54      
So I thought what I'm going to do is get on the right side of the conceptual wall and move to Mexico and have nothing to do with what's going on in America anymore.

Peter McCormack: 00:41:05      
Right. Okay. But Christopher Wylie calls you out. So was that the trigger to have to leave that moment?

Brittany Kaiser: 00:41:15      
So what had been happening in my life for a while was that I had started to become really grossed out by the data industry and really shocked at how easy it was to abuse data and how easy it was to acquire massive amounts of data on people without their consent. And I had known about blockchain technology for a long time.

Brittany Kaiser: 00:41:39      
I had my first Bitcoins in 2011, and I had been following the development in the industry and started to get very involved. And I was spending all of my personal time learning more about technology, and I thought, "Hey, okay, well there might be a way to solve all of these problems and actually turn the power of Cambridge Analytica into something that can be used for good."

Brittany Kaiser: 00:42:05      
And from Mexico, I was working with Cambridge Analytica to build a blockchain-based system where users would control their own data. They would decide what they wanted to share or not; they could keep their own data hidden or share it. And if they shared it, they would be rewarded in tokens for that.

Brittany Kaiser: 00:42:25      
And still, all the blockchain companies I advise right now are doing exactly that. And so I was really excited that the most famous data science company in the world could possibly start to build a data ownership solution. Then I built within Cambridge Analytica a blockchain advertising business. So we had a handful of blockchain companies that we were helping to promote.

Brittany Kaiser: 00:42:49      
I mean at the time it was promoting ICOs mostly, but that's what it was like in 2017. For some companies it was also ecosystem development, so finding users for their apps. I started to get really excited about that, and I thought it would be silly to leave now, when my boss tells me every other day that this is going to be a $1 billion company, and I have the opportunity to be a part of building a massive blockchain ecosystem that'll totally change the data industry, and we can also promote blockchain companies and make them into really popular companies with mass adoption because of how powerful our ecosystem is. So I was pretty excited about that opportunity, and I had no idea that my company had Facebook data they weren't supposed to have. So when Chris came out with that I thought, "Huh, what else was going on that I either didn't know about or hadn't thought twice about?" If I were to reflect and start to look at all of my past communications and documents, it might actually be a lot worse than what I thought it was when I was looking around at all the shiny objects and not thinking really hard about what the company was possibly doing in places around the world.

Brittany Kaiser: 00:44:17      
So I was in Puerto Rico at Restart Week speaking about data ownership and speaking about the new laws that had been passed in Wyoming, because I basically quit Cambridge Analytica to come support Caitlin here in Wyoming. That was one of the reasons why Alexander and I could not see eye to eye near the end. I said, "I'm going to go to Wyoming for a week or two and I'm not going to be doing any work for you and you're just going to have to live with that. Fire me if you want."

Brittany Kaiser: 00:44:46      
And I go to Restart Week, I'm speaking about the new Wyoming laws, I'm speaking about data ownership, and the news drops about the Facebook dataset. So I thought, "Hey, either I'm going to just continue on with my blockchain work and solve these problems quietly, or I'm going to be a lot louder about what just happened than even Chris is, because I was there for everything."

Peter McCormack: 00:45:25      
But if you're there and you'd be loud about it, it feels like you were aware of things that probably were wrong and perhaps the situation, the life, the job, the money allowed you to turn a blind eye to things. Is that fair?

Brittany Kaiser: 00:45:41      
In a way, yeah. And that's the difficulty with being a whistleblower, because when you're a whistleblower it means that you are involved in things that eventually you decide are unsavoury, but you were involved and you were there.

Peter McCormack: 00:45:55      
Yeah. See, one of the interesting things is when I first watched the documentary and I was aware of Brittany Kaiser, the whistleblower, I watched it and I didn't think of you as a whistleblower afterwards.

Brittany Kaiser: 00:46:08      
Some people feel that way.

Peter McCormack: 00:46:09     
I felt more like, I kind of tend to feel like whistleblowers perhaps are the first to jump ship when they realize there's a problem, and they are compelled by their own moral conscience at the time that this needs exposing. I felt like potentially you had two choices.

Peter McCormack: 00:46:29      
One, you are wrapped up in this and you are potentially part of the negative side of the story, or you speak up and you control your narrative a bit better. And I don't think you came out of it perfect and clean, but you look at how Alexander has come out of it and it's very different.

Brittany Kaiser: 00:46:49      
I'm the only person from Cambridge Analytica to actually come out and speak and expose everything, the only person that gave my entire work computer hard drive to the government, and the only person that has gone around and done testimonies all around the world to help governments.

Peter McCormack: 00:47:08      
And I follow you and I think that's brilliant, but what I'm trying to get at, I'm trying to be as fair as I can, is that, and we will never know, but would you have done this if the news hadn't broken? And it sounds to me like you probably wouldn't have.

Brittany Kaiser: 00:47:22      
Well, I didn't know that as many things were going wrong as actually were going wrong. That's the thing. I was never an executive. I wasn't in management meetings.

Peter McCormack: 00:47:31      
But you knew enough to be a whistleblower. Do you see what I mean? You either knew enough or you didn't, and it didn't happen until the news broke. So I didn't see you as a whistleblower. But I don't hold that against you. I saw you faced a tough choice.

Peter McCormack: 00:47:44      
In prep for the interview, I've talked to people about whether they've been conflicted at times in life and career. And I think we are. I have, in all parts of my life. When I worked in advertising... So in fairness to you, when I quit the industry, I wrote something called Online Advertising Does Not Work, and I should share it with you sometime.

Peter McCormack: 00:48:02      
And what happened was we got to a point where I realised we were essentially reporting whichever stats were best to clients, to justify our retainers. Because if we didn't have our retainers we'd lose our clients, and really what we were doing was wrong. And I had to quit the industry because I knew we were lying, I knew we were getting paid for work that didn't work. I was conflicted. Do you see what I mean?

Brittany Kaiser: 00:48:22      
Totally.

Peter McCormack: 00:48:22      
So I empathise with your situation, but I never saw it as a whistleblower.

Brittany Kaiser: 00:48:27      
Well, I mean technically a whistleblower is anyone that gives original information to either press or authorities. And I've given over 100,000 documents and been a pro bono expert witness for a year and a half, unable to actually work.

Brittany Kaiser: 00:48:44      
So it was a decision where I could have either stayed under the radar and just tried to make a living in the blockchain industry, which was what I was trying to do at the time. Or I could decide to put myself in a lot of danger, having no idea what was going to happen. Yes, in order to control the narrative.

Brittany Kaiser: 00:49:02      
But when I saw Chris Wylie's story come out, I saw that a lot of the stuff that was in those articles was second- and third-hand information. It wasn't information that he had experienced himself. So I thought if this is going to be one of the biggest topics in the world, the conversation needs to be added to by people that actually were there and can answer everybody's questions and can provide as much helpful information as possible at a time where everyone's grasping for answers and trying to figure out what the hell happened.

Peter McCormack: 00:49:36  
Okay. And when you make the decision to become a whistleblower, to expose this, what are the steps you go through? So I'm imagining you had to consider your family, you had to consider your personal safety, where you were going to be and who was going to help you? Did you contact Paul or did he contact you?

Brittany Kaiser: 00:49:54      
Paul contacted me.

Peter McCormack: 00:49:55      
So he contacted you. Was he the trigger?

Brittany Kaiser: 00:49:57      
Yes.

Peter McCormack: 00:49:58    
I liked that guy.

Brittany Kaiser: 00:49:59      
Yeah. He's a wonderful person. He is a very professional political activist, human rights activist and social activist. He's helped found things like change.org and Crowdpac. I mean he really, really knows what he's doing.

Peter McCormack: 00:50:16      
He felt more than just a journalist following the story though. I think he felt like he was helping you through a very tough time.

Brittany Kaiser: 00:50:22      
He was, yeah. So we had actually connected about a year before that. He wrote to me on LinkedIn actually, and I looked at his LinkedIn profile and I thought, "Hey, someone that has played a big role in all of these amazing organisations that I really respect wants to have a phone call with me. Wow." Working at Cambridge Analytica, I thought I was persona non grata to liberal people right now. So great.

Brittany Kaiser: 00:50:50      
This is the type of person that I'm normally working with and friends with, so I'll get on the phone with him. And we ended up striking up a bit of a friendship, started hanging out in London, and he kept on picking my brain for what was going on at the company. He was explaining to me all of the exciting things he was involved with, and saying that whenever I was ready to quit Cambridge Analytica, there were a lot of progressive causes waiting for my help.

Brittany Kaiser: 00:51:21      
And that was very interesting for a while, but when the first whistleblowing story came out in the news, he called me and asked me if I was going to be okay with how this narrative was going to roll out, not written by people that were actually at Cambridge Analytica when all of this happened. And I said, "No, I'm not going to be okay with a story that is so intimately linked with me and my life being told by people that don't know what they're talking about."

Brittany Kaiser: 00:51:54      
So I said, "What do I need to do?" And he sent me a few links from Paul Lewis, The Guardian's bureau chief in San Francisco at the time. I read them, and he really understood what he was talking about. Paul wrote about how companies like Cambridge Analytica were kind of scapegoats and Facebook was the problem. And then he talked about the lack of data legislation and what that meant. I really, really believed in a lot of the work that he had done.

Peter McCormack: 00:52:28      
Well, because one of the interesting things that I've also wrestled with is we've always been lied to in political campaigns. We've always had propaganda put at us. When politicians debate on TV they're more often than not lying or manipulating things.

Peter McCormack: 00:52:42      
One of the things I try to be fair with and try and give Cambridge Analytica the benefit of the doubt, was everyone's probably doing this. A bit like I compared it to Lance Armstrong. Lance Armstrong, part of me doesn't care that he took drugs because they were all taking drugs. He always said if he didn't take drugs he couldn't compete.

Peter McCormack: 00:53:02      
And then it turns out he was a piece of shit in the end. But in the end I thought about that situation. I assumed all companies are using data, they're all manipulating us, they're all sending us messages. This was just a bit more sinister. I try to wrestle with that.

Brittany Kaiser: 00:53:16      
Yeah. I mean for me, what really bothered me about what was done is that it verges into areas that I will work the rest of my life to stamp out. So when you see messaging that verges on incitement of racial hatred and voter suppression and weaponising sexism against women, these are all things that I used to fight against. And then you see the tools that you helped build and sell around the world being used to do exactly that.

Peter McCormack: 00:53:51      
Do you think Cambridge Analytica changed the course of history with the Trump campaign?

Brittany Kaiser: 00:53:56      
I think that Cambridge Analytica changed the course of history by allowing a campaign to become so incredibly negative and so viral that once again people around the world are remembering that they can't ignore politics and they can't be complacent, and that being an armchair activist doesn't mean anything. And that if you actually want to see the world be a better place and a good place, you actually have to stand up and do something about it.

Brittany Kaiser: 00:54:35      
And that if you're complacent like I was, sitting in that company being okay with what they were doing and not voting in the general election, then guess what happens?

Peter McCormack: 00:54:47      
And you didn't vote?

Brittany Kaiser: 00:54:48      
I didn't. No.

Peter McCormack: 00:54:49      
So the ads didn't work on you?

Brittany Kaiser: 00:54:52      
Yeah. I voted for Bernie Sanders in the primary.

Peter McCormack: 00:54:55      
You weren't a persuadable.

Brittany Kaiser: 00:54:57      
I had somehow been persuaded to not fly back to Chicago and cast my vote for Hillary Clinton. So I would say I was pretty persuaded in a way. I mean I doubt I will ever vote for a Republican in my life, because their ideals don't appeal to me. But I never could have been persuaded to vote for Donald Trump, but I was persuaded not to vote for Hillary, which I naturally would have done automatically without thinking about it.

Peter McCormack: 00:55:23      
It's funny. I can't vote because I'm British, so I couldn't vote in the election. But being up here in Wyoming, I kind of like the Republican ideals up here.

Brittany Kaiser: 00:55:32      
Libertarian ideals.

Peter McCormack: 00:55:33      
Well, so yeah, because Tyler says he's a Libertarian Republican, but he's still a Republican. He still represents the Republican party. There's an overlap between Libertarianism and Republicans and I kind of like some of that. I kind of like the lifestyle up here.

Brittany Kaiser: 00:55:49     
I love the lifestyle up here. I've moved here.

Peter McCormack: 00:55:52      
Yeah.

Brittany Kaiser: 00:55:52      
Absolutely. And so there are certain ideals that appeal to me in terms of promoting entrepreneurship and protecting people's freedoms and allowing people to be empowered. But there are a lot of things in progressive ideals that do the same thing in different ways. So it's kind of which way you look at it. Both sides of my family are Republican. I have no problem with the party or people that choose to vote conservative.

Brittany Kaiser: 00:56:23     
There are just some ideals that I will never be able to agree with, and therefore I've tended for most of my life to vote blue up and down the ticket. It's just how it is. But now I'm in Wyoming, and I love all of the Republicans that I work with here. And I did use to love a lot of the Republicans that I worked with at Cambridge too, people who were really fantastic, amazing thinkers that I would work with again.

Brittany Kaiser: 00:56:48      
And if given the chance to vote for those people, I guess I probably would, but I doubt I'll ever vote for a Republican president. That would be something interesting if that ever happened.

Peter McCormack: 00:56:58      
Tyler Lindholm has blown my mind as a person.

Brittany Kaiser: 00:57:01      
He's incredible.

Peter McCormack: 00:57:02      
Like fucking incredible.

Brittany Kaiser: 00:57:04      
He's brilliant.

Peter McCormack: 00:57:04      
Like literally he's brilliant and he's so engaging, he's so charismatic, he's just great. That's the kind of politician I can get behind.

Brittany Kaiser: 00:57:13      
Yeah, 100%. If Tyler wanted to run for higher office, I would be behind him 100%. But he doesn't represent anything at all that I find offensive. There are a lot of candidates that Cambridge Analytica worked with that I actually found offensive, like our current president. And I find him incredibly offensive nearly every day and every moment that I'm awake and when I'm sleeping.

Peter McCormack: 00:57:38      
But there's lots of people who don't, lots of people are perfectly happy with him. I've heard a lot of defences on-

Brittany Kaiser: 00:57:42      
And that's their right.

Peter McCormack: 00:57:43      
Yeah. But I've heard a lot of defences of him, maybe some of his policies. I'm like, "Okay, maybe his policies are okay." But I also find people seem to brush over some very clear, obvious personality flaws.

Brittany Kaiser: 00:57:56      
I mean that's the thing, you can have all the best policies in the world, but if you incite racial hatred, and if you hate women then you know what? You do not belong in the highest seat of power in the nation, if not the world at all.

Peter McCormack: 00:58:12      
And he might get another four years. It'd be interesting to see how the campaign plays out if they follow the same trajectory.

Brittany Kaiser: 00:58:19      
Well, I mean what I am very afraid of is that it's very possible that the president has committed high crimes and misdemeanors, and if he does not win, then he might have a sealed indictment and go to jail, if you actually read the Mueller report or watched his congressional testimony.

Brittany Kaiser: 00:58:37      
I've done human rights interviews around the world with individuals who have escaped nations of dictators who will do anything to stay in power, because they know if they're not in power they might be imprisoned or worse. So I would be really, really afraid of what's about to happen in the 2020 campaign, because he knows he has to stay in the White House in order to perhaps remain a free man.

Peter McCormack: 00:59:05      
Okay. So we've gone on a tangent here, but I want to explore this. So I follow some of the Mueller stuff. I didn't read the report, but I followed some of the testimony. This is where US politics is very different from UK politics, and I struggled to follow or naturally understand what was going on. Can you explain it to me?

Brittany Kaiser: 00:59:25      
Well, there are 448 pages of descriptions of nothing that has been defined as collusion, but more what has been defined as obstruction of justice, possibly. And so obstruction of justice is something that the Department of Justice can investigate, but they can't indict a sitting president. So when Mueller explains the conclusions of the report, and he is asked, "Has the president committed a crime?" He has to say-

Brittany Kaiser: 01:00:03      
He can't say the president committed a crime; he has to say, "If the president had not committed a crime, I could tell you." All right.

Peter McCormack: 01:00:11      
Okay.

Brittany Kaiser: 01:00:12      
Fantastic. So you haven't told us that he didn't commit a crime. Therefore, it's a roundabout way of saying that he did. Okay, thank you. And then when he's asked about rumours of a sealed indictment, which basically means that if the president was no longer sitting, he could then be served an indictment, he also cannot confirm that there is not a sealed indictment. So wouldn't you think that if obstruction of justice hadn't taken place and there wasn't a sealed indictment, he would just say there was no obstruction of justice and there's no sealed indictment?

Peter McCormack: 01:00:43      
Yes. Okay.

Brittany Kaiser: 01:00:45      
Exactly. It's not really that hard to read between the lines there.

Peter McCormack: 01:00:48      
Okay. How serious a crime is obstruction of justice and ...?

Brittany Kaiser: 01:00:52      
It's impeachable.

Peter McCormack: 01:00:53      
Okay.

Brittany Kaiser: 01:00:54      
Of course.

Peter McCormack: 01:00:55      
Okay. This is what you have to help me understand because why can't they then impeach him for this?

Brittany Kaiser: 01:01:00      
Well, it's in process.

Peter McCormack: 01:01:02      
Okay.

Brittany Kaiser: 01:01:02      
The US Congress has the House Judiciary Committee, which is currently working on an investigation, which is like a pre-impeachment trial.

Peter McCormack: 01:01:14      
Okay.

Brittany Kaiser: 01:01:15      
I'm one of the 81 witnesses to it. That's public information. So very proud to serve my country. It's going slowly because-

Peter McCormack: 01:01:23     
Okay.

Brittany Kaiser: 01:01:23      
... Nancy Pelosi, who is the leader of the party and the person who would normally be in charge of gathering everybody together behind a certain vote, acting as a whip, she does not believe that impeachment right now is the right political step, that it might give him more power. Because obviously he has a very strong base. She has her reasons, but I don't think as an American, as a politician, as a citizen, as someone that cares about other human beings, that you could say that a president that incites violence, racial hatred, sexism is somebody that should go one more day without an impeachment trial. Not one more day.

Peter McCormack: 01:02:11      
Okay. I'm going to go and read that after this. I think that's something I'd like to know more about.

Brittany Kaiser: 01:02:15      
It's interesting stuff.

Peter McCormack: 01:02:16      
Okay. We went off track there a little bit. So you've made a decision, you're going to blow the whistle. What's the next step? Is that where you have to disappear?

Brittany Kaiser: 01:02:28      
Well, when I realised the gravity of what I was about to do, which is say that possibly both the Brexit campaign and the Trump campaign were conducted illegally, I was not sure how that was going to go. I mean, I've been an avid follower of Julian Assange and Edward Snowden and Chelsea Manning, and that didn't work out very well for them, becoming whistleblowers or encouraging other whistleblowers. And I didn't know what was going to happen. I mean, I'm talking about the UK and the United States, my two homes. Those are the two places that I would normally feel safe and welcome, and I am about to tell everybody in power that I completely disagree and perhaps I'm completely undermining their legitimacy. So what's going to happen next? I didn't know.

Peter McCormack: 01:03:24      
Okay.

Brittany Kaiser: 01:03:25      
I decided I would go somewhere where nobody could get me and wait to see what happened.

Peter McCormack: 01:03:32      
Thailand is a beautiful place.

Brittany Kaiser: 01:03:33     
It is a beautiful place. It's a wonderful country. I suggest it to anyone who hasn't been.

Peter McCormack: 01:03:37     
Been there three times, I think. Love that.

Brittany Kaiser: 01:03:39      
It's beautiful.

Peter McCormack: 01:03:42      
Okay. You're out there, basically then everything kicks off or goes crazy.

Brittany Kaiser: 01:03:46      
Yeah. Actually, all of my whistleblowing articles came out when I was on my way to the airport.

Peter McCormack: 01:03:51      
Okay.

Brittany Kaiser: 01:03:52      
And when I landed, the whole world had seen it. Honestly, I thought it was going to be way smaller than what it ended up being. I originally told Paul, "Hey, in order to make sure that everyone has their facts straight, I think I've got enough opinions and evidence that I could write a good op-ed." I suggested to Paul that I would write an op-ed, and he said he was thinking of something a little bit different when he introduced me to Paul Lewis. So after spending days and days going through some of my evidence and my interviews with them, we created I think four or five articles and one video, and I thought, hey, everyone will talk about it for a couple of days and then it will blow over.

Peter McCormack: 01:04:43      
Nope.

Brittany Kaiser: 01:04:44      
I never thought that everyone in the world would know what Cambridge Analytica is and I never thought that everyone in the world would care about data rights all of a sudden, and that this wouldn't be five articles, this would end up being 5 million articles.

Peter McCormack: 01:05:03      
Was Alexander aware before the articles broke or was he just hit with-

Brittany Kaiser: 01:05:07      
No.

Peter McCormack: 01:05:07      
Okay.

Brittany Kaiser: 01:05:09      
He was aware that something was coming out in The Guardian because they wrote to him for comment.

Peter McCormack: 01:05:15      
Okay.

Brittany Kaiser: 01:05:15      
Not my articles, but I believe for Chris Wylie's articles and some of the other things that Carole Cadwalladr wrote.

Peter McCormack: 01:05:21      
Okay.

Brittany Kaiser: 01:05:22      
They had written to him with questions, so he knew that obviously something was coming out, but I don't think they reached out to Alexander for comment on mine.

Peter McCormack: 01:05:31       
You've made it clear. You're obviously fond of him as a colleague. He took you under his wing, you said it was fun.

Brittany Kaiser: 01:05:38      
For a while.

Peter McCormack: 01:05:39      
Yeah, but you have to sacrifice those relationships in those situations now.

Brittany Kaiser: 01:05:43      
Well, I like to think that we were friends for a while, but looking back on it, it does seem like he did anything that he could to manipulate me into doing what he wanted me to do.

Peter McCormack: 01:05:55      
I think he believes what he's doing is right. The thing that I found very concerning with him was the presentation where he talked about how "we're experts in behaviour change." And I don't think he even realises what he's saying at that point. I'm like, "Basically you're saying you're experts in manipulating people to do what you want." That to me goes beyond the here's-a-message, here's-a-campaign approach, like when I worked in advertising. This is manipulating people, and I don't think he realised how sinister that is.

Brittany Kaiser: 01:06:27      
And to us, it didn't sound sinister. For me, I had come from volunteering for United Nations agencies and NGOs and CBOs, and those organisations call them behaviour change campaigns when you are getting people to use a modern hospital as opposed to a witch doctor, when you're getting people to use condoms as opposed to spreading HIV, when you're getting people to drink clean water-

Peter McCormack: 01:06:53      
Yeah, I guess.

Brittany Kaiser: 01:06:54     
... instead of dirty water. So for me, behaviour change campaigns have always meant something positive and that's why I was never creeped out by it. A lot of people, when they hear it for the first time, like how could you just listen to this and not freak out? But I had come to it from a totally different context so I never-

Peter McCormack: 01:07:13      
The film doesn't do this, the film doesn't give you this bit.

Brittany Kaiser: 01:07:15      
Right.

Peter McCormack: 01:07:16      
I think the film is interesting. I think too much was spent on production, not enough was spent on telling the story in the right way. I still think it's good. I don't know. I mean, what's your view on it? Do you have an opinion on it?

Brittany Kaiser: 01:07:33      
I mean, I think it was a massive opportunity for my story to have a global impact that I never would've probably had that platform otherwise.

Peter McCormack: 01:07:44      
Okay.

Brittany Kaiser: 01:07:45      
And I think they were able to tell the story in an engaging way, in a way where people actually take notice and where they care. And when the movie ends, they think, what can I do now?

Peter McCormack: 01:07:56      
Okay.

Brittany Kaiser: 01:07:57      
I think it's really useful for people to-

Peter McCormack: 01:07:59      
That's the Michael Moore school.

Brittany Kaiser: 01:08:01      
Yeah, to spur people to action. And I'm an avid fan-

Peter McCormack: 01:08:03      
It's not a documentary, it's a movie.

Brittany Kaiser: 01:08:04      
... of Michael Moore.

Peter McCormack: 01:08:06      
Well, he says that you don't make a documentary, you make a film.

Brittany Kaiser: 01:08:08      
Exactly.

Peter McCormack: 01:08:09      
And you've got to drive people for action.

Brittany Kaiser: 01:08:12      
Well, that was the thing. I mean, the directors who are two absolutely brilliant human beings had been trying to make a story about the data crisis since the 2014 Sony hack.

Peter McCormack: 01:08:21      
Yes.

Brittany Kaiser: 01:08:22      
And had been struggling to find characters that they could actually tell an understandable story through because they're trying to tell a story about something you can't see. So they knew from the beginning that they're going to have to work with a lot of animators in order to start to visualise what that actually means. You produce data every day and it's vacuumed off of you by all of these kleptocratic companies, and because you can't see it, most people don't take any notice. You click a terms and conditions box and you never look back.

Peter McCormack: 01:08:52      
But even now, even post Cambridge Analytica where everybody knows the sinister use of data, I still don't think people care. I still think there's apathy. Still people think-

Brittany Kaiser: 01:09:01      
Definite apathy.

Peter McCormack: 01:09:03      
Still think people are like, "Oh, fuck it."

Brittany Kaiser: 01:09:04      
Definite apathy, but well, you have more momentum now than we've ever had, I would say, since-

Peter McCormack: 01:09:10      
But this is where I think it needs to be driven by regulation-

Brittany Kaiser: 01:09:13      
Exactly.

Peter McCormack: 01:09:14     
... rather than expectation of the individual. This is why I think privacy will be driven by it being a commercial tool rather than the individual seeking out their own privacy.

Brittany Kaiser: 01:09:22      
I totally agree with you, which is why I really wish I could say I expect so many companies to make the ethical choice, but they're going to have to be forced to by the law. That's why I work on law and regulation as much as possible. Just like what we did here in Wyoming, it's common sense legislation that still allows companies to build; it allows entrepreneurs to do what they need to do, but it protects people. Individuals still have their rights, and they have consent and transparency and the ability to actually own their own value. Here in Wyoming, your digital assets are your intangible personal property, so your data and your blockchain tokens are no different than your house.

Peter McCormack: 01:10:03      
Okay. Couple of questions I want to focus on before we come to a close. I think I've probably talked to you about this for hours. We'll have to do a follow up at some point.

Brittany Kaiser: 01:10:12      
For sure, I'd love to.

Peter McCormack: 01:10:12      
It's fascinating. Reflecting a little, how much soul searching have you had to do, and are there any areas where you've had to be personally critical, where you've looked at yourself and you're like, "I was wrong here?"

Brittany Kaiser: 01:10:25      
Yeah, definitely. I mean, I can't wait for you to read the book or listen to the audio book. I recorded it in my own voice, so it's actually-

Peter McCormack: 01:10:34      
Okay.

Brittany Kaiser: 01:10:35     
It's very emotional.

Peter McCormack: 01:10:36      
I tend to do the audio books, and I'll tell you why: I can do an audio book in a weekend, but to read a book takes me probably a month.

Brittany Kaiser: 01:10:43     
No, no. I mean same with me, just because I always keep putting it down, picking up something else. I-

Peter McCormack: 01:10:48      
HarperCollins, right? I read the preface. That's when I got to your LinkedIn, that's when I added you. That was the journey I went on over the last few days. So I've read the preface.

Brittany Kaiser: 01:10:57      
It's not a light title. Targeted: The Cambridge Analytica Whistleblower's Inside Story of How Big Data, Trump, and Facebook Broke Democracy.

Peter McCormack: 01:11:05      
And the keyboard as a hand grenade. I like that. It's good creative.

Brittany Kaiser: 01:11:10     
They came up with that, it's mind-blowing. That is better than anything I could have hoped for you to propose to me. So yes, we're going with this and I want to plaster it everywhere.

Peter McCormack: 01:11:22      
Okay, but the self-reflection.

Brittany Kaiser: 01:11:23      
The self-reflection.

Peter McCormack: 01:11:24      
Self-criticism.

Brittany Kaiser: 01:11:27      
There's so much of that. Oh, God. When you look back and you realise what you turned a blind eye to because there were shiny objects all around you, it's sad. I mean, it's really crazy. I never earned more than $100,000 a year at that company. I wasn't even paid well.

Peter McCormack: 01:11:43     
Okay.

Brittany Kaiser: 01:11:44      
I had a couple of grand a month in my pocket.

Peter McCormack: 01:11:46      
But then the film paints a picture of you're in high power meetings, going to important places, dealing with important people, parties, champagne. I can see how you get swept up into all that.

Brittany Kaiser: 01:11:58      
Right, yeah, and I did. I told myself that I was living a great life and I was doing something important. And then I was building a unicorn. I was going to own equity in a billion dollar company and I was going to be able to quit, and go use those tools for all the things that I wanted to see in the world. Now that I know how to use it, I can go and run all of my human rights campaigns and progressive political campaigns around the world for the rest of my life effectively. And that is obviously what I still intend to do, but only with data that people have consensually and transparently given to a cause that they believe in too.

Peter McCormack: 01:12:39      
You told yourself a story. You convinced yourself.

Brittany Kaiser: 01:12:42      
Yeah.

Peter McCormack: 01:12:43      
Coming out of that must be quite emotional.

Brittany Kaiser: 01:12:45      
Very, but there's nothing more ... I don't know. There's nothing better for self-reflection than just brutal honesty. Right? I call it radical transparency. Just being totally honest about exactly what happened and making yourself live through that. Every time I have to say it, you re-punish yourself for the decisions you didn't make back then.

Peter McCormack: 01:13:14      
Do you worry about the impact that's had potentially on other people's lives?

Brittany Kaiser: 01:13:19      
Of course, every day, which is why I spend every waking moment cleaning up my mess. It's not just my mess, it's a mess of an entire industry that has been going on for 10 to 15 years this way. So I am doing my own part and no small part to change that.

Peter McCormack: 01:13:39      
You're all right?

Brittany Kaiser: 01:13:40      
Yeah.

Peter McCormack: 01:13:41      
You're going to trigger me. I cry very easily. If you go, I'll go. I promise you. Are you okay?

Brittany Kaiser: 01:13:46      
Yeah.

Peter McCormack: 01:13:47      
All right.

Brittany Kaiser: 01:13:47      
Thank you.

Peter McCormack: 01:13:48      
Okay. The last thing I want to ask you about is ... Actually it's going to be two questions, but the first one is about Facebook and big data. How big a problem is this still? What should people genuinely be scared about? And what needs to change?

Brittany Kaiser: 01:14:13      
The gravity of the problem is that if you had a Facebook account before April 2015, your data is out there and you are never getting your privacy back.

Peter McCormack: 01:14:23      
Okay.

Brittany Kaiser: 01:14:23      
It's absolutely impossible for the likely millions of databases around the world that have your personal data from the friends API for that to ever be deleted. So is Facebook data the only problem? No. I mean most individuals' personal data that has been collected by companies since those companies existed is all over the world and it's available for purchase, not just to the highest bidder but to anybody that wants to license these things. So it's a pervasive problem throughout the whole industry. But what Facebook refuses to recognise is that it's not just that they continue to show a complete lack of regard for privacy or consent or the well-being of their users, it's that they're not making important changes that they could be making and that they could be investing in.

Brittany Kaiser: 01:15:22      
Instead, they're making very small cosmetic fixes. For instance, labeling if a photo or video has been manipulated. Okay, great, but you're not suppressing or stopping fake news. You are just saying, "Hey, by the way, this video has been edited," and you're still going to allow everyone that watches that video to see that, whatever it happens to be. Sometimes I find that to be okay, sometimes not. They've also made small fixes like, okay, this is a political ad, so that weird manipulated content is no longer hidden where you think it could just be a post from your friend or what have you, which was a big problem in the last election. But they could be investing in systemic change. They could be changing the way their algorithms work to push up positive content instead of negative content, but right now their algorithms favour hate. They make negative content viral and that hasn't been fixed.

Brittany Kaiser: 01:16:27      
Another thing that should scare everybody that hears this comes from an amazing woman called Yael Eisenstat. She had worked her entire life at the CIA in counter-terrorism and cybersecurity, and she was recruited by Facebook to become the head of elections integrity.

Peter McCormack: 01:16:48      
Okay.

Brittany Kaiser: 01:16:48      
She served in this role for six months before she quit because Mark and Sheryl said no to every single thing that she told them they needed to fix in order to protect people ahead of the elections. She gave back all of her salary and her stock and said, "I don't want anything from you people."

Peter McCormack: 01:17:08      
Okay, that's interesting.

Brittany Kaiser: 01:17:10
And she has now joined the Center for Humane Technology.

Peter McCormack: 01:17:13
Wow.

Brittany Kaiser: 01:17:14     
Founded by Tristan Harris. Both of them are brilliant human beings and all the work they're doing is incredible. But to have a cybersecurity and counter-terrorism expert come into Facebook to try to fix things and be told no to everything, and for her to literally quit and give everything back, we should all be terrified-

Peter McCormack: 01:17:34      
Yes.

Brittany Kaiser: 01:17:34      
... at what Facebook refuses to invest in and how they refuse to protect people when they could. Why not allocate a small part of that $500 billion to actually doing something good for a change? It's disgusting.

Peter McCormack: 01:17:51      
Yeah. There must be something, there must be a risk there. But we both still use Facebook.

Brittany Kaiser: 01:17:57      
I do, yeah, because I don't want Facebook to go away. I just want-

Peter McCormack: 01:18:01      
Want it to be better.

Brittany Kaiser: 01:18:02      
I just want them to stop abusing people. That's all. I like Facebook. I like being connected to people. I like being able to communicate with people from all over the world with ease and I like having a network that I've built over the what, 13 years I've been on Facebook. So that's not the problem. Facebook as a platform is not the problem. The fact that the people who make decisions in the company refuse to make ethical decisions even when they're forced to is the problem. Right?

Peter McCormack: 01:18:31      
We've got a share price to keep up. All right. Well, look, this has been fantastic. Like I said, we could've gone for probably three, four hours like a Rogan-length show. I think a fair way to end this is just for you to tell people what's coming up for you, and then I think at some point in the next few months we need to do a follow up and go ... I've covered your story now. Now I really think we should go into the problems with data and the future of data. I think you and I could do another show based on that. Hopefully one day.

Brittany Kaiser: 01:18:58      
Definitely. I mean, that's why I know I'm spending the rest of my life dedicated to this. Okay, we've done the film, I've put out this book that actually starts from the moment I meet the CEO of Cambridge Analytica to when I decided to become a whistleblower. Once that story is totally out into the world and I can answer as many questions as are helpful for people, my entire life is dedicated to the solutions. So there are two things that I spend my time doing right now. One is campaigning for legislative and regulatory change because of the work that I got involved with here in Wyoming. I co-founded DATA, the Digital Asset Trade Association. We're a 501(c)(6), so a non-profit lobbying firm. We help legislators and regulators understand what common sense legislation is so that both individuals and companies can continue to lead a happy, productive, and free life. Right?

Brittany Kaiser: 01:19:54     
I believe that entrepreneurs and regular citizens both should have rights to do what they need to do. So helping make balanced legislation means that companies are not going to lose their business model. They just need to make some changes in order to protect people. That's what I work on there. And then I recently co-founded with my sister The Own Your Data Foundation, which seeks to improve the problem of a lack of digital literacy around the world. So we are putting together curricula that are meant not only for children, so K-12 first, but also implementing digital literacy into universities and into companies, through HR training. I don't see a future where a kid should go and learn a typing class and learn how to send an email to their parents without being told, "Hey, every word that you type, that data is being collected by this email client and the email client of your parents, and a lot of companies in between. Are you comfortable with that? And do you want to tell your friend who you have a crush on, or do you want to tell your friend, meet me at recess and I want to tell you something in person?"

Peter McCormack: 01:21:08      
Okay.

Brittany Kaiser: 01:21:09      
You start to think in a little bit of a different way. There's an amazing man called Jim Steyer who started Common Sense, amazing organisation that protects children online. He's done a lot of incredible work and we want to support that and follow in his footsteps, and proliferate that work around the world. Also, in addition to just the curriculum and implementing that into the education system and into companies, we will be running global digital literacy campaigns.

Peter McCormack: 01:21:39      
Wow.

Brittany Kaiser: 01:21:39      
To get people to care and to realise, and to learn how to protect themselves. Because you can have all the best laws in the world, but until people care, we're not going to see systemic change.

Peter McCormack: 01:21:53      
Wow. Okay. Well, this has been great. I really enjoyed this. I definitely came in with some opinions about you or some thoughts, some areas I wanted to explore. I wasn't sure. This has changed my mind on a few things.

Brittany Kaiser: 01:22:05      
Thank you.

Peter McCormack: 01:22:05      
There are some things I didn't expect, some parts of the story I didn't know. Just really glad you came on and you wanted to do this, and I hope we can hang out again in the future and do another show sometime.

Brittany Kaiser: 01:22:14      
Definitely. No, thank you so much. Absolutely brilliant.

Peter McCormack: 01:22:17      
Good luck.

Brittany Kaiser: 01:22:18      
You too. See you soon.

Peter McCormack: 01:22:20      
Do you mind if I asked your sister a couple of questions? Do you mind being asked a couple of questions?

Brittany Kaiser: 01:22:25      
Go for it.

Natalie Kaiser: 01:22:25
Sure.

Brittany Kaiser: 01:22:26      
Switch seats.

Peter McCormack: 01:22:27      
Okay. The main question I really want to ask you, what's it like watching it from the outside as her sister? What's this last what, two years, really? Two years been like?

Natalie Kaiser: 01:22:39
I mean it's interesting because a lot now ... I mean obviously it's been very publicised and people are very interested in the story. But her personal life, her personal story has become a grounding in that through the film and now the book and everything. So a lot of people are very interested in her personal journey. It was really interesting knowing her fully and seeing her go through that. And it's hard to see people just look at her time at Cambridge. I appreciate you exploring the whole journey because it's not just one day you wake up and you say, "I'm going to work for this company as a Republican consultant, and run Brexit and the Trump campaign." That doesn't just happen one day when you wake up. It's a journey.

Peter McCormack: 01:23:28      
Of course.

Natalie Kaiser: 01:23:28
There are a lot of decisions made. So I think a lot of people miss her whole story and who she is, and just jump to a conclusion about certain things. Yeah, there's fault, there's wrong, but there's also a lot of explanation and there are a lot of choices that a lot of people would've made, could've made. But some people choose to ignore the fact that human fallibility is real and people make choices, people make hard choices. Sometimes they don't pay enough attention. It's a very personal story.

Natalie Kaiser: 01:24:00
At certain points when she worked for Cambridge, you could see her drinking the Kool-Aid a bit, and I felt her pulling away from herself. Then, at the end, she decided to come out as a whistleblower and put herself in the position to be criticised, not knowing what was going to happen. She risked everything to put it out there because she was just like, "I have to do something. I have to speak out about this." That for me was her coming back to herself.

Peter McCormack: 01:24:36      
Right.

Natalie Kaiser: 01:24:36
That was more of who she's always been, with strong opinions, pushing them forward, not just speaking out about them but acting on them.

Peter McCormack: 01:24:47      
So you felt for a period of time, your sister was just changing and you were like, "What's going on here?"

Natalie Kaiser: 01:24:51
A little bit, yeah.

Peter McCormack: 01:24:52      
Any difficult conversations?

Natalie Kaiser: 01:24:57
Plenty.

Peter McCormack: 01:24:57      
Yeah? Okay.

Natalie Kaiser: 01:24:58
Plenty. Obviously, when she started working for them, I come from more of a psychology background, so the OCEAN modelling, all the psychology behind it, is very interesting, and they're obviously not the only people who use that. They're not the only data analytics company. She used to tell me about some very interesting social impact campaigns they were working on that I genuinely saw the value in. So I understood how she was getting into that work, and then fast forward two years, when you're really in the thick of it and, she admits, you get swept up. She's a wildly intelligent person who's drawn to intriguing new concepts and technology, and I think in some ways that blinds you. You're so interested in the possibilities and the technology that sometimes you lose sight of ...

Peter McCormack: 01:25:56      
I think Paul in the documentary said a very pointed thing when he said everyone deserves their redemption, and I think ... Do you feel like she's getting it, she's got it? What do you think?

Natalie Kaiser: 01:26:07
It's an ongoing journey. I've seen her also, not just through Cambridge but in the post, and she's done a lot of soul searching. It's been a very emotional journey for her. Anybody who goes through something like that and then comes up the other side, reconciling with yourself is a long process. It's not a week long process or a designated timeframe. In some ways she's very much still going through it. It's driving a lot of her decisions now. If that would've never happened, I doubt she'd be campaigning for data ownership and data rights. I doubt that would be her life goal. But everything happens as it does and you take on causes you believe in because of certain things that you've been through. Most of it is emotional.

Peter McCormack: 01:26:57      
Well, I wish you both good luck with what you do.

Natalie Kaiser: 01:26:58
Thank you.

Peter McCormack: 01:27:00      
If you ever need anything, just reach out to me if I can help in any way.

Natalie Kaiser: 01:27:02
Well, thank you very much. We appreciate that.