FWDThinking Episode 25: The future of tech… isn’t tech

Alistair Croll in conversation with The Tech That Comes Next co-authors, Afua Bruce & Amy Sample Ward.

All opinions expressed in these episodes are personal and do not reflect the opinions of the organizations for which our guests work.

Alistair Croll: Hi, and welcome to another amazing episode of FWDThinking. Today, we’re going to be talking to Afua Bruce and Amy Sample Ward about their forthcoming book. This is a bit of a teaser for a book club that we’re about to embark on in a few weeks. And I wanted to invite them both here today to talk about their book and some of the thoughts that went into it and why they felt the need to talk about what comes next in the technology world that we’re all building. So please join me in welcoming Afua and Amy. Hello you two! 

Amy Sample Ward: Hello!

Afua Bruce: Hello. So good to be here. 

Alistair Croll: So Afua, you’re a fairly familiar face to our audience. I remember seeing you on the big round stage in Ottawa and on our screens as well. And you’ve been doing lots of things with lots of different organizations.

Amy, you’re a new face to the FWD50 community. Nice to meet [00:01:00] you. Why don’t you start Amy, tell us a little bit about yourself and maybe how you got to meet Afua. 

Amy Sample Ward: Wonderful. Hello everyone. I feel like after Alistair’s intro, we virtually walked onto a stage to a big “everyone, welcome!” Hello everyone. I’m Amy Sample Ward. I use they / them. I’m the CEO of NTEN in most of my waking hours, and I’m based in Portland, Oregon. Afua and I came to know each other through the NTEN community, through our conference that we have every year, the Nonprofit Technology Conference. And I feel like there have been opportunities where we’ve collaborated or connected or worked on things outside of that, but that was our first place of connection.

And a really important part of something we collaborated on, which led in a sideways way to the same type of collaboration on the book, was working on the [00:02:00] equity guide for nonprofit tech that NTEN put out: publications around how folks who are really trying to do great work and trying to change the world can use technology to do that.

And so I think we identified years ago that we both had a few things to say on that topic and looked for ways to do that together. 

Alistair Croll: Awesome. And Afua, why don’t you just remind those people who may not have seen you what your background is, too?

Afua Bruce: Yeah, thanks so much. We both had a lot of things to say.

I think that’s a great description of Amy and myself and of our collaboration here. My name is Afua Bruce. I have spent my career really working at the intersection of technology, policy, and communities. I’ve done that within the federal government in the US, at the White House and at the FBI; within nonprofits, working at DataKind as well as New America and a number of nonprofits around the world, actually; and within the academic space, in some work I’ve done with [00:03:00] Carnegie Mellon University and some other institutions as well. I now do consulting for nonprofits and companies and foundations who are looking to invest in the public interest space, broadly speaking. And I’m really enjoying that.

Alistair Croll: Yeah, your career is as varied as any I’ve seen, with these sorts of jumps in and out of government and civic tech. So it’s really nice to see those different perspectives. I think Dan Hon, who I’m sure you both know, loves to describe what he does as “dumb questions as a service,” right?

Just going in and sort of stating the obvious to people who may not have those fresh eyes to see the problems. In reading the background for the book, it certainly seems like you two are encouraging people to look at tech with that kind of child’s mind, rather than assuming that it is on its way and we just have to jump on board; that we should ask better questions of the technologies that come next, because they are so intertwined with [00:04:00] equity, community, and really with the asymmetric power of technology. So I noticed early on that you talk about technology as something that should be built on community-centered values.

Now I’m up here in Canada and we’ve obviously had some fairly newsworthy events happening recently where there’s lots of people who are questioning the need for centralized, organized government. And it does seem like society as a whole is having a renegotiation of the value of sort of community and collectivism versus individualism and independence.

One of the things that fascinated me about the events of the last few weeks in Ottawa was the use of technology to provide an asymmetric advantage to a group. And it seems to me that one of the core aspects of technology is that it allows you to find asymmetric advantage, and I’m sure that’s true whether you’re civic tech, or government, or nonprofit, or [00:05:00] protesters, or whatever.

Can you talk a little bit about how you think that community values need to govern technology rather than, you know, startup investors? And I’m going to pick a name so that you don’t kind of both wait to talk. So, Amy, do you want to field that?

Tell me about why communities, not tech founders should be making decisions about the tech that comes next. 

Amy Sample Ward: I mean, there is a long list of reasons why, but I guess I will start with a bit of reflection on what you were just sharing about recent events, and, you know, not even specific to Ottawa, but I think we’ve had multiple years of recent events to look at in all different ways.

And I think the spirit from which we wrote the book is that we are better and stronger when we are together. And technology is a way that we are able to be together, whether that’s just emotionally being together, [00:06:00] you know, and staying in communication, but also organizing, communicating, doing work together through technology. But if we are reliant on tools to be the answer, to be the solution, you know, we’re not going to get very far.

And if we’re relying on technologies to help us in that pursuit and they aren’t built for those purposes, you know, we’re going to spend a lot of our effort just making the tools work for us and never getting to do the thing we want to do. So I think the first piece of the answer, and then Afua can share more, is that technologies should be accountable to communities, because communities are using them and needing them, and investors are not the sole users.

So they shouldn’t be the sole accountability person in that equation. Afua, what would you add? 

Afua Bruce: Yeah, Amy, I think you described it so well. I would [00:07:00] add that we need to center communities because, as you said, communities are the ones who are using technology. And I think that without centering communities in this, and without really being focused on that, we can fall into the trap of thinking that technology will save us.

And that with just the right tech (if only we could find the right technology, if we could write the right program or design the right app or implement the right technology solution), the world will magically be a better place. And that is the stuff of fairytales. So we really need to put in the hard work to understand some of the cultural context when we deploy technology, to understand some of the local values of the folks who will be using the technologies, and to understand the motivations and what it means to use technology, really, to be more inclusive and to connect. That is really important, and I think we’ve got a lot of opportunity there to continue to work on that.

Alistair Croll: So one of the [00:08:00] biggest trends of the last century, if historians look back, is the industrial nature of it. And really, the industrial era was kind of “you can have any color you want, as long as it’s black”: Henry Ford rolling out the assembly line, right? It seems like with digital, we have the ability to get away from that.

We can very much personalize and customize and tailor things, so you can consume information in the manner that you want; it can be tailored for you. You know, we used to have three broadsheet newspapers per city. Now we have one personal newsfeed per user. And that personalization can be incredibly enabling: someone watching this who maybe has difficulty hearing can view the captions.

Someone could be just listening to the audio, and so on. This could be a transcript you can read. So technology makes it very easy to personalize and tailor things like that, and that kind of reinforces the individual nature of technology. At the same time, we have [00:09:00] created a world where we have many different versions of the truth, and one of the side effects of communities is the sort of filters they live within.

I know this sounds a little off topic from the questions that I sort of went through initially, but can you talk a little bit about how we get technologies to bring us together rather than to put us into this kind of quilt of filter bubbles? What do we need to think about so the technologies don’t further divide us into little groups, because they have such power to personalize things?

And Afua I’m going to throw that one to you first. 

Afua Bruce: Great, thanks so much. So in thinking about this, I think we have to recognize that many things can be true at the same time. And I don’t necessarily mean that in the sense of “there are many versions of the truth,” but really that many things can be true at the same time.

So it is possible to design for people to have individual [00:10:00] ways to engage with technologies, as you said: being able to just read the transcript from this, or to listen to the audio only, or to read the captions. But it is also an option for people who are developing technology, and who are funding that technology, to think about how we make technologies inclusive, so that they really do include more people. As we’re thinking about ways that we’re developing technology: how do we design for all of those things at the same time? How are we thinking more systematically in the design process about accessibility? How are we thinking more about how we’ll distribute the technology so people can actually use it, and use it in a way that actually serves them? So I think part of it is recognizing that both things can be true.

And we can make it so that individuals can feel valued and seen, but [00:11:00] also recognize that we can effect overall system change that creates that space for individuals to have ways to engage. Amy, what would you add?

Amy Sample Ward: Yeah. I want to get out a highlighter and underline (or whatever your preferred style of marking up documents is) what Afua was saying about accessibility.

Like, making sure that this is being recorded in a tool that allows for transcripts and captions and, you know, even a translation plug-in or whatever else, is not creating multiple versions of the truth. That’s creating a way for as many people as possible to engage, in a way that is available to them and in the way that they want to engage.

And I think that’s really important, as Afua was saying, for how we build technologies. But also: say one organization uses Zoom as their phone line, right? [00:12:00] Here are all of their phone numbers. Another uses Zoom for their program delivery. One of those is not “the right way” to use Zoom, right?

And so, as we’re trying to bring a critical lens (and I love Alistair bringing a critical lens), remember that what we’re trying to solve for here is that the most members of our community can use technology to actually get where they want to go, rather than thinking of it as, oh, it’s dividing us because someone is reviewing this as a transcript and someone watched it as a video. That didn’t divide us. We still got to the end, which was getting to engage with the content. And changing our frame from “there’s one right way to do things” to “the right way is whatever is accessible to the community” helps us, I think, just as with the first question, stay accountable to the community instead of to ourselves as the creator.

Alistair Croll: Yeah, I think I’m referring more to the side effects of technology: the [00:13:00] fact that technology, for example, has the ability to customize or tailor something to just you. So when I go on Twitter, I see a different timeline from someone else, because it’s tailoring it to me.

Which can be very nice: you know, I can mute people in my feed I don’t want to hear, who are abusive or obnoxious, and I can look at certain hashtags that I care about. But at the same time, that level of ease of use can have a negative effect when it leads people to not realize that there are others who aren’t like them. That word “community” is doing a lot of lifting there, in that sometimes community can be insular as opposed to inclusive. And I think at a philosophical level, technology is obviously a two-edged sword: it can do wonderful things and make things accessible at an individual level, but it can also inadvertently convince us that ours is the only version of the truth. And [00:14:00] that matters especially when you talk about policymakers (you mentioned these five stakeholders in your book, and I’ll come to that in a minute): policymakers making policy about how technology works, not just in the “it should be available in all formats” sense, but in more of the Marshall McLuhan, “the medium is the message” kind of thing. We’ve gone from a medium where there was one version of the truth, and it may well have marginalized people and been extremely privileged, right? Like the broadsheet newspapers were definitely not representing the interests of First Nations in residential schools. That was not getting coverage in Canada.

At the same time, now we have a thousand news channels, and many of them don’t actually have truth, just opinion or whatever, and technology has made it so much easier to spin up parallel universes. So I guess, on a philosophical level about technology as a new medium: should there be a mandate to use technology in a way that brings us together as a community, or is technology impartial, and policymakers shouldn’t meddle in it?

Amy Sample Ward: I think it’s interesting. I love this [00:15:00] conversation; thank you for asking these questions. I am finding it interesting hearing you talk about this, and my answer here is only buying time for Afua’s more brilliant response to come next.

But what’s most interesting to me, and I’m not trying to say, Alistair, that this is what you meant, but one way of hearing what you’re saying is that technology has created that. And that is not the case. Twitter is not... well, there are many things I’d love to change, you know, and criticisms of platforms, Twitter included. But Twitter isn’t saying, “hey Alistair, you only get to see these five people,” right? Most Twitter accounts are public, and you could look at a different hashtag than the one you’re currently following, right? So yes, I think there’s well-documented research around the echo [00:16:00] chambers that we create for ourselves across social media.

Do I think that those platforms have mandated echo chambers? No. And so this is an interesting entry point to something that we do talk a lot about in the book, and that is the relationship that we, as people, have with technology. We say in the book that part of what needs to shift, in order for us to build a better world and have better technology, is that we have to change our relationship to technology, right?

It cannot be something that is constantly venerated, as in, oh, you know, let us bow down to technology. No: we humans created those tools, so let’s make them better, you know? And not expecting, like, oh my gosh, I’m only hearing the same thing on Twitter, this is Twitter’s fault. It is your responsibility, then, as that person to, you know, find that different hashtag, or follow other [00:17:00] folks, or whatever it might be. Not that I’m now attacking your Twitter following list. But I think it’s an interesting place to examine.

What are the assumptions we’re putting on technology, like big technology or platforms or solutions? What is the role we’re giving that technology, even in the way we’re framing these questions to ourselves? Because I think within that, we are exposing what we are valuing and who we are valuing. And we are valuing technology more than community. And that’s an opportunity to change.

Alistair Croll: Yeah, Jana Eggers, she’s an amazing CEO of a company called Nara Logics and a pioneer in AI, and she once gave a talk and said, “We’ve got to stop thinking of technology as a devil come to kill us or an angel come to save us, and start thinking of it as a child that we should try and raise, because we gave birth to it,” which I thought was a very interesting middle ground to think about.

Like it’s neither a demon nor an angel. It’s this thing that we built.

Afua, [00:18:00] feel free to chime in on that one. But I would like you to also talk about something: you mentioned technologists as one of your five stakeholders, but I tend to think there’s a big difference between the people who build something, the people who deploy something, and the people who maintain something, and those are generally very different roles, right? And we often talk about technologists as the first: build it, throw it over the wall, and something will happen. Whereas almost all the work is involved in getting adoption, rolling it out, and then just maintaining it and end-of-lifing it.

So can you talk a little bit about how technology needs to change? And, you know, for the tech that comes next, maybe you can talk more about the rollout and the maintenance that comes next, because that’s often overlooked.

Afua Bruce: Oh man, rollout, maintenance: terms that maybe don’t get a lot of other people excited, but that do get me really excited.

And that’s actually part of what we talk about when we define technologists in that section. So yes, as we think about changing our relationship with technology, I think one part is recognizing that everyone, all of the different stakeholders we identify, has a part: policymakers, big tech and small tech, and everything in between.

But really, specifically with technologists, it’s important. I think traditionally, as you said, Alistair, when people think technologists, they think of the people who are going to create something. So someone has gotten a great idea to build the next great app for delivering happiness at the touch of a button, anytime you want.

And that’s sometimes where it stops. But in reality, we really need to think through what the whole life cycle is, because as important as it is to have that great idea, we then need to think about what it means to maintain that and to successfully deploy that. With deployment, for example, there’s a lot of change management that goes in there.

There’s a lot of making sure you’ve designed for the appropriate situation. And then, of course, that maintenance piece is [00:20:00] key. In the book, in our chapter on technologists, actually, we dive into an example with DataKind, where I used to work, where DataKind partnered with John Jay College in order to help increase the number of people who graduated from John Jay College. John Jay College had noticed that a number of people got to completing three quarters of the credits they needed to graduate, and then didn’t graduate. So they worked in close partnership with DataKind to develop some data science algorithms and models that would help identify those students who were at risk of that.

And those would recommend suggestions that the career counselors would then use to inform the actual recommendations and interventions that were done. And what was important about that was really designing for the systems that John Jay College had, not necessarily the systems that DataKind had. DataKind, by the way, is an organization that does data science and AI in service of humanity with [00:21:00] nonprofits and government agencies around the world.

But it meant making sure that the DataKind volunteers were designing for the appropriate situation there, and then really thinking about how that tool would be deployed and maintained. So it meant spending a lot of time with the John Jay College staff to explain how the technology was working, to explain how to get the information out, and to explain how to run the models as John Jay College got new rolls of students, so they would be able to run these models repeatedly and could continue to use the tool over time. And it worked: after two years, John Jay College reported, I think, about 900 additional students graduating because of it.

But more importantly, they described being confident and able to understand how the technology was working, and how to maintain and continue to use the technology, even after the DataKind team left. And they had worked out an arrangement, in this case with DataKind, to [00:22:00] provide some basic system updates and maintenance over the subsequent two years to ensure that it would continue to work.

Without that, you don’t have a system that can adapt and respond to changing information, changing circumstances, and changing goals. And so, as we think about what our relationship with technology is, we have to recognize that few things are a one-and-done situation, right? Few things are a one-and-done solution.

If we want to have real, sustainable impact, we need to think about what it means to maintain it. And so it is great to be that idea generator and that initial team that hits the ground, that sort of parachutes in and says, “I’m here to help,” and then actually helps. But it is also great to sort of, you know, take your time in arriving.

You maybe took the longer route down a beautiful coastline, but you got there in time, and you’re there for the long haul and are setting up shop to maintain solutions.

Amy Sample Ward: And I think what’s really important about that case study that [00:23:00] we talk about in the book, and why it matters that that’s the example Afua offered in response to your question, is that even though it’s featured in the technologists’ chapter, the technologists are not centered; they are not the heroes of that story, right? The technology that was developed is only as good as the human counselors who are going to put it to work. And they are the ones, you know, providing interventions or supports or resources to students, right? It wasn’t like, let’s build a bot and it will send an email to the students and now they magically graduate, right? Like, technology saves the day. No, it is still a tool in its place, right? For use by counselors and school administration and students and the like. And the heroes of our story arc, or whatever my narrative references are, are the students who then got to graduate and move on, right?

It’s still not about the technology [00:24:00] or the technologists. Like, thank you for your service, that is great, thank you for doing a great job, but the whole point of it wasn’t to thank them for doing that job. It was so that folks could graduate and have a degree. So I just wanted to name that, because I feel like part of this conversation, you know, our subtext, is constantly about accountability and relationship, which is very true to what Afua and I talked about constantly in writing the book: how do we take these words or values that are always, or often, the subtext but not the explicit words, and make them explicit? How do we, through this book, talk about accountability and relationship and engagement as operating norms? And so I just wanted to name all that, because the way that Afua put that, as the answer to your question about technologists and maintenance, helps us reframe again on community.

Alistair Croll: Well, and the irony of this is that your book’s called The Tech That Comes [00:25:00] Next, but the tech that comes next isn’t a technology. It’s the other stuff around it. One of the things I’ve been fascinated by as I’ve come to learn about digital government, and government in general, over the last few years: I was familiar with the concept of a nudge.

In technology, you know, it’s a little encouragement; in behavioral economics, it’s a way of encouraging someone to do something. Government has this dark, evil twin to the nudge called sludge. It’s been fairly well documented, and there are people who’ve literally coined this term.

If you’re a government that doesn’t really want to do something... for example, I live in Quebec, and one out of three people in Montreal doesn’t have a general practitioner. They can’t get a doctor, even though we have socialized medicine, which would be lovely. We pay lots of taxes, and have no GP. The government of Quebec says, oh no, you have to go to a webpage to schedule an appointment to get a healthcare card, but there are no appointments.

So you wake up every morning at 6:00 AM and you go look. Now, governments are stretched thin; we don’t have enough money for healthcare. So a government could say, look, this is our new process for enlisting and enrolling in healthcare, and [00:26:00] intentionally slow down enrollments by putting sludge in the way of people: taking a process and making it harder to achieve.

It feels like technology offers a whole bunch of new types of sludge, digital sludge, right? And it especially targets the most marginalized groups, who don’t have access to technology, who don’t have, you know, tech-literate offspring who can do it for them or help them along. How do we make sure that as we move to tech, it’s a rising tide that lifts all boats, and not just a mechanism for governments to deliver more sludge to those who don’t have the privilege that many of us do?

Afua Bruce: Yeah, as you said, there are so many opportunities to introduce sludge. I think one of my favorite examples in the US is a state that required you to report, on a website, only on a website, that you were looking for work or had some work, in order to maintain benefits. There was no way to call. There was [00:27:00] no way to go into the office. And this was in one of the states with one of the lowest broadband rates in the country.

And then, on top of all of that, the website was down for maintenance, I think, 12 hours a day, and of course overnight. So if you were working and didn’t have access in your home, just how are you going to be able to document that you are working in order to maintain your benefits, right?

So many opportunities for sludge. But I think what’s important here is just to recognize that, again, it comes back to what our relationship with technology is, and recognizing that technology isn’t neutral, right? It is built on the systems and the ideas that we are already putting into place. And so yes, technology does create opportunities to introduce sludge, but it also can be used to help strengthen advocacy efforts.

It also can be used to help increase accountability. It can be used to help increase information. And [00:28:00] it can be used to help better inform the policymaking process. And so I think working on ways that we can continue to empower community advocates, civic advocates, and the like to use technology, to help advance and extend their missions, to engage with policymaking processes, to hold (in this case) government agencies accountable, and to use some of these systems to shed light on the impacts on disparate communities: this is one of the things that is still an opportunity for technology as we think about what comes next.

Alistair Croll: Amy, I’m going to change topics here a little bit. We’ve all heard by now that freedom of speech is not freedom of reach: the idea that while citizens have freedom of speech, and I can stand on a soapbox and yell as loudly as my vocal cords will let me, online I am paying for the privilege to get myself heard, whether [00:29:00] that means that I have to watch ads on Twitter and as such I’m subject to their terms of service.

The terms that I have to comply with are not those guaranteed or enshrined in a bill of rights or a constitution; they’re those of a private company. And there’s really no way for a private citizen to exercise freedom of speech in technology, because technology is commercial. It costs money. I have to pay for electricity. I have to pay for an internet connection. Do you think that governments are going to have to create their own stack for citizens to have digital freedom of speech? And I mean like OAuth, you know, a right to a blog post (nobody has to read it, but I have the right to post), a government Twitter.

Like at some point we either have to say your freedom of speech doesn’t exist in the digital realm because those channels are commercial, or we’re going to give you freedom of speech that aligns with our founding documents, for whatever country you might be in. Is government going [00:30:00] to wind up building tech platforms?

Amy Sample Ward: It’s interesting. I mean, government already does build tech platforms. You know, and I think it’s important to acknowledge that. You know, it’s really interesting- 

Alistair Croll: we don’t build WordPress as far as I know. 

Amy Sample Ward: I also am not aware of an instance in which a government has built a WordPress alternative. But, you know, what’s really interesting, and I don’t think this is a new conversation (not that I think you’re presenting it as new), but I’m remembering 15 years ago, when I was based in the UK and working with a number of councils. At that time, data was a big conversation, and they were like: well, we have all this data, and technically it is the public’s data. You know, we’re a council; you’re telling me people are interested in that? Yeah, definitely [00:31:00] interested in that. Okay, you know? And that was when a lot of places, governments included, were asking: how do we make data available to the public? Is it in completely obscured, you know, hard-to-use single-file downloads that leave you wondering, what am I doing with this?

Like, what do we do with this data? How do we provide it to them? And I feel like those conversations, you know, maybe have evolved, but it’s the same conversation, right? Like, what is it that we have, or that we know how to navigate, that we have an obligation to support the community with? You know, even just earlier this week I was submitting written testimony to a local government here, because as a resident with an opinion, I wanted it to be shared. And I could have joined the meeting by Zoom. I could have submitted testimony live. I [00:32:00] could have submitted it in writing; you could submit it digitally in writing, which goes back to our accessibility conversation before. But I also think it’s important that that was a platform the government was obligated to provide for me to share my opinion, right?

And I think there are a lot of examples that are interesting to look at. Even, you know, with the last US census, government worked with trusted community organizations, nonprofits, to ensure that lots of folks who are not online could still have an opinion and an experience shared, documented, you know, put into the record, so that it’s part of policy-making and decision-making. And what I think that means is it doesn’t have to be the government saying, we’re building the platform, or we’re technologists now, or whatever, but to say: [00:33:00] hey, in this instance, we’re not the trusted partner. Who is the trusted partner, and what is our obligation and relationship with that trusted partner to enable them to be successful? How are governments really serving community through partnerships with nonprofits, to make sure that folks are getting online, that we continue to invest in making sure everyone is able to be online at home, skilled, with a device that works for them?

And that they’re engaged in the ways they want to be engaged, through those community partners that they’re already working with and trust.

Alistair Croll: Yeah, none of these are easy questions. None of them are new questions, either. I remember when governments were first discovering what “API” meant, and it was like, oh wow, we can offer this stuff. And then of course, two years later, they’re like, take it down, take it down.

Another question for you. There tend to be two ways of thinking about the future. One is to assume that, you know, the future is like the past, only more so.

Amy Sample Ward: I hope there’s more than [00:34:00] two ways to think about the future just for the record. 

Alistair Croll: Sure. My point is more that one way assumes that the future is like the past: you know, that you can forecast the future based on past trends and human behaviors and so on. And the other says there is a singularity, something beyond which we really can’t tell what’s going to happen. All bets are off.

If you were a feudal lord, you probably didn’t see the arrival of, you know, Martin Luther hammering things on the door, and then, downstream, mass production and the printing press. It was very hard to understand. I remember reading this amazing interview with one of the folks from Mercedes-Benz, who said that we’ll never have more than a million cars because we can’t train that many chauffeurs. That just shows an incredible lack of imagination.

And one of my favorite examples: I worked with a sales guy, an amazing sales guy, and he sent me one of his PowerPoint decks, and I opened it up and there were 700 objects on the page. Because what he was doing was, whenever he wanted to change the text, he’d draw a big white rectangle and then type some new text on top of it. [00:35:00]

And I said, how do you not understand this? But he’d only ever used MS Paint, and in MS Paint you just fill in the space and type over it, right? So my understanding of, you know, the Z layer in editing documents was opaque to him. He didn’t have a mental model of a third dimension on screens. Mercedes didn’t have a mental model of self-driving cars.

Part of me thinks that we don’t have a mental model of what it will be like to be a digital-first citizen, and that many of the policies and decisions we’re making about digital today are based on an assumption that the future is like the past, only more so, as opposed to recognizing that the change we are going through now changes the sort of laws of physics of information.

You know, we grew up in a world where things were atomic and therefore scarce, and there was only one version of them. In the digital world, things are abundant and copies are indistinguishable from [00:36:00] the original. So this is a very philosophical soapbox to stand on, but I would love to hear from both of you: is the technology that comes next something that we can apply past rules and thinking to, and just say, let’s make sure we continue to do what we’ve done? Or is it something that we’re going to have to work out from first principles as we discover these new laws of physics?

And Afua, I’ll start with you. I know that’s one heck of an intro, but I’m genuinely curious whether we can rely on what’s worked in the past or whether we need to work things out from first principles, because the laws of information physics are new.

Afua Bruce: Yeah, thanks for this easy and very straightforward question. I really appreciate that. So I think what was fun about writing this book is recognizing that we’re at a point where we really can dream about what we want to see next. We can take a look around, see some of the inequities that are in society, [00:37:00] some of which have been hastened by technology, and say: we don’t want to do this.

We designed the technology that got us here; we don’t want that anymore. So what would it look like if we imagined something different? What would it look like if we imagined something more equitable? So in the book, you know, we end each chapter with a list of questions to help people start thinking about that.

But I think there are lots of opportunities for people in general (hopefully you read the book, but then more broadly) to think about what you would want to see differently, what you would want to change, and how you might want to envision a new type of technology. So will the tech tools of the future be different? Definitely; technology continues to evolve, and we get new systems every day. We are, you know, just 20 years away from quantum computing; we’ve been 20 years away for the past 40 or so years, but like, really, this time. And so I think there are lots of ways in which technology will advance, and we’ll have new technology tools.

At the same time, humans are going to human, right? We talk about ways that tech systems [00:38:00] have been designed in ways that ended up excluding rather than including, or ways we see sexism or racism appear in algorithms. The ideas of excluding versus including, sexism, racism: these are not new concepts. Back in the Martin Luther days, from your example before, people built ways and built laws and had conversations about excluding rather than including; there were things that were sexist and racist in those too. And so we have fast-forwarded a few hundred years and are facing the same things with different tools.

And fast-forward a hundred years from now: we’ll be facing the same bits of human nature. So I think we are at a time where we can imagine something different. We can look at these major roles that we have, whether you’re a funder, a technologist, a member of a community, or a policymaker, and really think, from that accountability standpoint: how might you do your job differently?

How might [00:39:00] you exist in the world differently? How might you better partner with others to create that more equitable world? Amy, over to you.

Amy Sample Ward: Yes, oh my gosh. I wish I could say I was spending all that time thinking of my answer, but I was just listening to you and nodding: yes, agree with Afua. You know, going back to even the way we started this conversation, there’s a difference between thinking about, okay, literally what technology comes next, like what’s the next thing that we could build, and what I think we’re saying in the book and in this conversation, which is: what’s the world that we want? Let’s be clear, as clear as we can be, about a super big idea that’s hopefully far enough forward, and so much more equitable than today, that it’s maybe even hard to articulate, because it is, oh my gosh, so much better than today.

And once we have that picture, then we say: great, so where is [00:40:00] technology in service to that world, and to us, in this picture? Not: oh my gosh, what world could technology create? Oh my God, I am off the train if that’s where we’re going, because it is definitely not going somewhere good. You know, I do not want to be on the train that takes us to whatever world technology is building.

I want us to intentionally and collectively build the world that we want, and have technology stay in service to us and accountable to us, not something that, you know, is forcing us into certain directions. So I guess the theme, the broken record, is: technology is a tool, humans rule. I don’t know, I was trying to make something that felt catchy there.

And what we really want is for people in communities to be centered and to be prioritized in all of those decisions, across the board. 

Alistair Croll: Well, I have dozens more questions and this is [00:41:00] fascinating, but I think we’re going to have to wrap this one up. When does the book come out? 

Afua Bruce: The book comes out on March 22nd of this year, but it is available for pre-order now.

Alistair Croll: Awesome. We’ll put a link in there, and when we publish this, we are going to get to have a conversation with other folks in the public service about this, who can ask you those questions firsthand, which will be amazing.

Amy, Afua, thank you so much. Hopefully my questions didn’t take us too far afield, but I tried to make them a little different from the usual questions.

On March 31st, we’re going to get everyone together and have a book club conversation. They’re always fun, and hopefully you can just field questions from interested public servants.

Rebecca and I recently did a LinkedIn Live session and found that it works really well. So we actually just broadcast to LinkedIn Live, which is a lot of fun.

Anyone can just join right there, and if they’re in your network, they’ll be told that you’re about to go live. So we’ll see you then, with some new technology that hopefully can bring us all together. [00:42:00] But I really liked this conversation, because, you know, what I got from it was that the tech that comes next is not tech.

The tech that comes next is humans. And once we figure out what we want as humans, then maybe technology is one of the tools that can get us there. And when we start thinking like that, with technology not as an end, not even as a means to an end, but as a possible tool in our tool belt, then we create a much more human future.

And I think that’s something that is all too often not part of the conversation. So I really appreciate you both joining us today, and I’m looking forward to March 31st, when we get to dig into this in more detail.

Amy Sample Ward: Thanks so much. 

Afua Bruce: Thanks so much.

Afua Bruce and Amy Sample Ward are no strangers to the blurry line of civic tech. Between them, they’ve chalked up over 30 years working within and alongside government. Much of this work has been with communities: Amy chairs an elections commission and contributes to public radio; Afua worked with the FBI, the White House, and New America. And they both have a ton of experience building and deploying technology at scale.

So it’s a bit of a surprise to learn that their new book, The Tech That Comes Next, isn’t really about tech. Or rather, it is, but it’s about technology as a means to an end, to be embraced only once we’ve decided what ends we want. The full title of the book continues, “how changemakers, technologists, and philanthropists can build an equitable world,” and this is where much of our conversation centred when I interviewed them in preparation for our FWD50 Book Club on March 31.

Early proponents of the Internet saw it as a sort of borderless utopia. “To traditional corporations, networked conversations may appear confused, may sound confusing. But we are organizing faster than they are,” wrote Rick Levine in The Cluetrain Manifesto, published at the dawn of the new millennium. “We have better tools, more new ideas, no rules to slow us down.”

A decade later, Big Tech implored us to move fast and break things. And two decades later, governments reeled from bots that would divide us, and algorithms that would marginalize us. “What is the role we’re giving technology?” asked Amy when we spoke, “because I think within that, we are exposing what we are valuing and who we are valuing. And I think we are valuing technology more than community—and that’s an opportunity to change.”

As we moved on to the topic of nudges and sludges—incentives and disincentives built into processes—Afua returned to this sentiment. “What’s important here is to … recognize that technology isn’t neutral. It’s built on the systems and the ideas that we are already putting into place.”

Humans seldom notice or question the technology that has, by the inexorable hand of progress, become part of our every waking moment. The Tech That Comes Next is a reminder that behind that tech are human lives, systems, ideas, and communities. Amy and Afua want to show us technology for what it is: Not a fact of life we must accept as delivered unto us, but a tool with which to build the world we want.

To find out more about this great book and to order your copy, please visit: https://thetechthatcomesnext.com