Rob Katz is the VP of Product Management for our Office of Ethical and Humane Use of Technology here at Salesforce. He got into product management as he was finishing up his MBA and went on to work at big companies like Amazon, including on Alexa.

In this episode, Rob and I are talking all about ethics. We discuss Ethics by Design, data ethics when it comes to building applications, and even more. Tune in to get a deeper understanding of the ethics of technology.

Show Highlights:

  • Rob’s winding career journey.
  • What he worked on at Alexa as the dedicated privacy product manager.
  • What he does in his current role at The Office of Ethical and Humane Use of Technology.
  • The lifecycle of Ethics by Design.
  • Examples of well-intended designs that ended up having unintended consequences.
  • What developers should think about in regards to ethics when building a new application.
  • What consequence scanning workshops are.

Links:

Episode Transcript

Rob Katz:
I was always interested in how we can try and make the world a slightly better place than how we found it.

Josh Birk:
That is Rob Katz, VP of Product Management for our Office of Ethical and Humane Use of Technology here at Salesforce. I’m Josh Birk, your host of the Salesforce Developer Podcast. And here on the podcast you’ll hear stories and insights from developers, for developers. Today, we sit down and talk with Rob about a lot of important topics, including Ethics By Design and data ethics when it comes to building applications. But we will begin, as we often do, with his early years.

Rob Katz:
And.

Josh Birk:
Okay.

Rob Katz:
I was one of those nerdy kids in high school who was reading things like The Economist Magazine and Foreign Policy Magazine.

Josh Birk:
Wow.

Rob Katz:
And I went to Georgetown and hung out with a bunch of other people who did things like that. And I thought that I wanted to go and try and bend the arc of the universe a little bit more towards justice and positivity through government and business and the intersection of government and business. And so that’s why.

Josh Birk:
Okay.

Rob Katz:
That’s where I started. And yeah, I’ll leave it. I’ll tell you how I ended up here too.

Josh Birk:
Well, yeah. I was going to say this, so I can kind of see the curvature, the gravitational pull from that into product management, but how did you go from that into product manager?

Rob Katz:
Well, to square the circle took a while.

Josh Birk:
Okay.

Rob Katz:
I started out doing pro-social impact investing, so I worked for a venture fund that invests in entrepreneurs, mostly in emerging markets, who are trying to build businesses serving middle- and low-income people in those markets. That venture fund is called Acumen. They’re a great organization.
It took me all over the world, living and working. And I eventually realized I wanted to go back to graduate school and earn an MBA because many of the people in that sector who I admired had earned an MBA and they were using it to run their businesses or organizations more effectively. So I managed to get into Stanford for my business degree, and off the wait list I’ll have you know.

Josh Birk:
Nice.

Rob Katz:
And I grasped at that golden ring. And while I was at Stanford, it was in the sort of techno optimist peak.

Josh Birk:
Okay.

Rob Katz:
It was 2012, 2013.

Josh Birk:
Oh yeah.

Rob Katz:
And tech was great. We were building services that were displacing old legacy systems with better, more effective, more efficient, more open mechanisms.

Josh Birk:
Right.

Rob Katz:
And tech could do no wrong. And I was living in Silicon Valley.

Josh Birk:
Right. Oh.

Rob Katz:
And two things happened. One, I started, I’m a skeptic naturally, and so I started poking at that a little bit.

Josh Birk:
Yeah.

Rob Katz:
And two, more importantly, I met the person who would later become, and still is, my wife. Clara graduated from the business school before I did, and she moved home to Seattle. And so I needed to find a way to get to Seattle after I graduated. And…

Josh Birk:
Nice.

Rob Katz:
I was running on this hypothesis that, well, finance and investing are really interesting, but I don’t know that they are the next major vector for how we’re going to affect change in the world. But these technologies and the companies.

Josh Birk:
Yeah.

Rob Katz:
That build them, man, they’re permeating my whole life. And whether those consequences are positive consequences or potentially unintended negative consequences, I need to learn how these technologies, systems, and companies work and are built and…

Josh Birk:
Gotcha.

Rob Katz:
The second thing I managed to squeeze in was getting a job at Amazon as a product manager, despite not having a computer science degree or a product management background. So I took that opening, that door opened.

Josh Birk:
Wow.

Rob Katz:
And I said, I’m going to run towards that as an opportunity to learn.

Josh Birk:
Kind of as a side question, do you have any-. I mean, that’s actually kind of spectacular. Is it? I mean, how? Do you have any… how do I phrase this question? What was the twist, you think, that got you into product management despite not having a computer science degree or a background in product management?

Rob Katz:
Very plainly.

Josh Birk:
Yeah.

Rob Katz:
Amazon came to recruit people who were about to graduate with MBAs.

Josh Birk:
Oh.

Rob Katz:
And they were willing to take a bet on me as…

Josh Birk:
Nice.

Rob Katz:
A moldable person. And side note.

Josh Birk:
Nice. Yeah.

Rob Katz:
I try to pay that forward by volunteering and serving on the advisory board for an initiative called the Inclusive Product Management Accelerator that’s hosted.

Josh Birk:
Nice.

Rob Katz:
Here at the University of Washington in Seattle. But it’s a global program that.

Josh Birk:
Yeah.

Rob Katz:
Really is trying to make product management more inclusive by helping people from historically marginalized backgrounds get their first job in product management. Because…

Josh Birk:
Right.

Rob Katz:
I am living, breathing proof that you can be a good product manager.

Josh Birk:
Yeah.

Rob Katz:
Without necessarily fitting the official box of what product management has been in the past.

Josh Birk:
Right.

Rob Katz:
And anyway, that’s a subject.

Josh Birk:
Yeah.

Rob Katz:
For a different conversation.

Josh Birk:
I love it though. And I have to say, doing this show has actually given me a whole new level of respect for the concept of an MBA. Previously, before talking to people like architects and yourself, my concept was that you just get an MBA because you eventually want to be a CEO, and you sort of have to get an MBA before you can be a CEO.
But it’s more than that. It’s the organic nature of it, of understanding people and the systems and the structures and how things get put together. It’s just sort of fascinating how useful it is from a utilitarian point of view across multiple disciplines. Now I do want to specifically ask, what was it like being a product manager for Alexa itself?

Rob Katz:
So that was, again, serendipity.

Josh Birk:
Okay.

Rob Katz:
It wasn’t my first product management job. I ended up at Amazon and they put me where they needed me. And I worked on a business for about a year that was directly competing with [inaudible 00:06:27] in the payments acceptance space.

Josh Birk:
Oh, okay.

Rob Katz:
Doesn’t sound like ethical tech yet, does it?

Josh Birk:
Not yet. You’re getting there.

Rob Katz:
Really interesting. I learned a ton.

Josh Birk:
Yeah.

Rob Katz:
I was on my honeymoon and we decided to shut the business down.

Josh Birk:
Okay.

Rob Katz:
And so I got one of these calls from my boss, “Hey, have a good honeymoon, don’t worry. But you’ll have about 90 days to find another role inside of the company.” And at the time, Amazon was still growing and growing and growing.
And so I took those 90 days to go and meet a bunch of different people who I had met through this and that and asked for some advice. And there was this really powerful and inspiring VP who said, “Hey, I’m working on a secret project and I want you to be one of the first product managers on it.”

Josh Birk:
Nice.

Rob Katz:
I was like, “Oh, okay. Well, what is it?”

Josh Birk:
Tell me more.

Rob Katz:
He’s like, “I can’t tell you. It’s a secret.” I’m like, “God, you’re playing with my brain here.” “But it’s going to be awesome.” And he sort of sketched out a little bit of what it might be working on.

Josh Birk:
Yeah.

Rob Katz:
And it turned out to be Alexa.

Josh Birk:
Nice.

Rob Katz:
And sorry for anybody whose device just activated in the background, happy to talk about data privacy, because I worked on that. And I joined that team when it was just a Pringles can you can talk to that would tell you the weather and play music.

Josh Birk:
Right.

Rob Katz:
And it was awesome because I was there during that kind of accelerated growth.

Josh Birk:
Yeah.

Rob Katz:
And I had an opportunity to get exposed to the business side of it. But also it really helped me come back to my roots of tech and business for driving change in the world because I was exposed to some of the ways in which building technologies might have unintended consequences. So for example.

Josh Birk:
Right.

Rob Katz:
I was on our international expansion Tiger team in 2015 when we were launching Alexa in the UK, in the United Kingdom and in Germany. And we had a problem and the problem was that the device didn’t work very well for people in a certain part of the United Kingdom, namely Scotland. And we dug deep on this and figured out that it was a very plain explanation. There weren’t enough people with Scottish accents in our beta testing pool.

Josh Birk:
Gotcha.

Rob Katz:
And as a result, the training data was not available for the speech recognition processing algorithm to recognize the unique speech patterns of people with Scottish accents. And then it raised the second order question to me, which is, this product is for sale in North America and people across North America have all kinds of different speech patterns and accents.

Josh Birk:
Yeah.

Rob Katz:
And it’s a catch 22 because when you sell something like that and you go to general availability, you are able to generate lots and lots of training data by putting it out into the real world. And it was getting smarter and better every single day.

Josh Birk:
Yeah.

Rob Katz:
Because people with accents from around North America were speaking to it. But on the other hand, if you originally bought it and you didn’t have a speech pattern like mine… because it was such a secretive project, the beta pool was all Amazon employees, who tended to be people like me.

Josh Birk:
Yeah.

Rob Katz:
And so it worked great for me. But what about other people? So that was the first.

Josh Birk:
Yeah.

Rob Katz:
Kind of, huh, question.

Josh Birk:
Right. Nice. Okay. We’re kind of getting towards the topic, but since you’re kind of uniquely qualified to answer this question, I have to preface it by saying I’m a huge Alexa fan, I was in the first round of the open beta.
I instantly started tinkering with it to get it to talk to Salesforce and all this kind of stuff. There’s a picture of me somewhere at the first TrailheaDX where I’m just swarmed by people because I’m showing them different little tricks you can do with Alexa. But there’s one thing that almost kind of drives me a little crazy. What do you think when you get the question about Alexa potentially being a privacy concern?

Rob Katz:
So you asked the right person or one of the right people.

Josh Birk:
Okay.

Rob Katz:
I was, again, I managed to stay on that Alexa team for a couple of different rotations, including becoming the first product manager dedicated to privacy.

Josh Birk:
Nice

Rob Katz:
Alexa’s technology, the Amazon devices that have been built with Alexa on them, are all built with an incredibly high attention to data privacy.

Josh Birk:
Okay.

Rob Katz:
And this information is dated because I haven’t worked at Amazon in a couple years now.

Josh Birk:
Yeah.

Rob Katz:
But back when I worked there, the actual hardware has a microphone off button that physically disconnects the circuits such that the microphone is not at all connected. That’s the first.

Josh Birk:
Okay.

Rob Katz:
Protection. The second thing is that how the wake word technology works is very thoughtfully designed.

Josh Birk:
Yeah.

Rob Katz:
Because the wake word is either Alexa or Echo or computer. We did the Star Trek fun.

Josh Birk:
Of course.

Rob Katz:
And that wake word is the only thing the microphone on the device is listening for; everything else is just static to it. It’s an if-then statement: if wake word, then capture audio; otherwise, no. And…

Josh Birk:
Got it.

Rob Katz:
Also, because the devices have very minimal memory on them, they couldn’t actually keep more than a minute or so of audio. So they’re always…

Josh Birk:
Got it.

Rob Katz:
It’s like a CCTV in a 7-Eleven. They’re recording and deleting, recording and deleting, recording and deleting.

Josh Birk:
Got it.

Rob Katz:
And only when the wake word is detected does it capture the next content, say, “Play Christmas music.” We’re recording this episode December 19th and it’s snowing outside my window, so you got Christmas music.

Josh Birk:
Got you.

Rob Katz:
Or Hanukkah music or holiday music. That is then recorded.

Josh Birk:
Yeah.

Rob Katz:
It’s transcribed using speech-to-text technology. That speech-to-text is sent to a server. It’s routed appropriately by the software that I was part of the team that helped build. And then it would go to the music service that’s hooked up to your account, and it would come back and start playing Mariah Carey, “All I Want for Christmas Is You” or whatever. And that is…

Josh Birk:
Right. Almost undoubtedly.

Rob Katz:
Yeah.

Josh Birk:
It will be that song. Yeah.

Rob Katz:
You can choose worse.
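The if-then gating and the record-and-delete buffer Rob describes can be sketched in a few lines. This is a toy model: the class, buffer size, and wake-word check are illustrative assumptions, not Amazon’s actual implementation.

```python
from collections import deque

# Toy model of on-device wake-word gating: a small rolling buffer is
# continuously overwritten ("recording and deleting"), and audio is only
# retained for upload once a wake word is detected.
BUFFER_FRAMES = 50                      # illustrative buffer size
WAKE_WORDS = {"alexa", "echo", "computer"}

class WakeWordGate:
    def __init__(self):
        # Fixed-size buffer: old frames fall off as new ones arrive.
        self.buffer = deque(maxlen=BUFFER_FRAMES)
        self.capturing = False
        self.captured = []

    def on_frame(self, frame, detected_word=None):
        self.buffer.append(frame)
        if detected_word in WAKE_WORDS:
            # If wake word, then capture audio; otherwise, no.
            self.capturing = True
            self.captured = []
        elif self.capturing:
            self.captured.append(frame)

    def flush_to_server(self):
        # Only wake-word-gated audio ever leaves the device.
        out, self.captured, self.capturing = self.captured, [], False
        return out

gate = WakeWordGate()
gate.on_frame("...static...")           # ignored: no wake word yet
gate.on_frame("alexa", detected_word="alexa")
gate.on_frame("play holiday music")
print(gate.flush_to_server())           # ['play holiday music']
```

The physical microphone-off button Rob mentions sits below even this layer: with the circuit disconnected, no frames arrive at all.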

Josh Birk:
Fair. So okay, so I’m trying to think of how to frame this next question because it sounds like this was sort of a stopping point, but it was, you were already on this journey of trying to make the world a better place. Alexa has a lot of interesting ethical angles to it. Did this kind of drive you forward into your current position? Was it just sort of scratching that itch of technology and ethics and privacy all kind of bundled into one?

Rob Katz:
Well, great question. Yes. So…

Josh Birk:
Okay.

Rob Katz:
I was working on questions around representative training data. I was working on…

Josh Birk:
Yeah.

Rob Katz:
Questions around data privacy.

Josh Birk:
Yeah.

Rob Katz:
I was working on questions around sensitive topics.

Josh Birk:
Yeah.

Rob Katz:
So people say to play holiday music, but they also ask geopolitically sensitive things. They ask personally sensitive things.

Josh Birk:
Right.

Rob Katz:
They ask all kinds of sensitive things. And those questions are shaped in lots of different ways. And so you have to be able to respond and you can’t just say, “Sorry, I don’t know that one.”

Josh Birk:
Right.

Rob Katz:
And so how do you build a long tail response system that is creating a safe, inclusive answer set for everybody in different languages, different cultures? It’s a really hard problem.

Josh Birk:
Yeah.

Rob Katz:
And at the same time, Clara and I had started a family, we have two kids. My son’s almost six, and my daughter’s almost four. And I was beginning to think about the role of me as a parent raising a.

Josh Birk:
Yeah.

Rob Katz:
Morally responsible family and being a moral actor in the world. And I was thinking to myself at the same time, we conceived of Alexa as a member of the family. And so what is our responsibility as the quote unquote “Parent,” as the builder.

Josh Birk:
Yeah.

Rob Katz:
Of the technology.

Josh Birk:
Yeah.

Rob Katz:
To ensure that we’re putting out an ethically responsible, morally responsible actor into people’s living rooms, bedrooms, hospital rooms, conference rooms, classrooms. And at that very time, it was 2018, a lot of questions were coming up around bias in advanced technologies, particularly around facial recognition and how facial recognition algorithms did not work very well for people who have high melanin in their skin, Black and brown folks and women in particular.
The Algorithmic Justice League is a great organization to check out to learn more about this particular problem. And I got a call from someone who I had worked with in the past saying, “Hey, I see that you have ended up in a technology company. Do you know anybody who could help me translate ethics and humane technology concepts into a product, engineering, and design context?”

Josh Birk:
Got you.

Rob Katz:
And my honest answer was, “I don’t know anybody who could maybe do that, but would you be willing to give me a shot?”

Josh Birk:
Right.

Rob Katz:
And that person who called me is my current boss, our chief ethical and humane use officer at Salesforce. Her name is Paula Goldman. And that is how in 2019 I became our first VP of product management focused on ensuring that Salesforce’s technologies are designed, developed, and shipped with ethical and humane use principles baked in from the ground up.

Josh Birk:
I love it. I love it. So tell me a little bit about your current job and about the Office of Ethical and Humane Use of Technology.

Rob Katz:
So the Office of Ethical and Humane Use of Technology was started almost five years ago. And there are three primary components of it. One is around ethical use policy: what are the policies that govern the use of our services, and how do they ensure that we’re holding a high bar for things like human rights, privacy, safety, honesty, and inclusion.
Those five are our public ethical use principles, just as Salesforce has core values: trust, customer success, innovation, equality, and sustainability. So we have these component principles that nest on top of that foundation for ethical use of technology.
So our policy team works really closely with a number of different stakeholders in the Salesforce ecosystem to ensure that the tech used across our customer base is in alignment with those ethical use principles. It’s really ambiguous, interesting work. It’s hard, and I would encourage you to learn more about it. We’ll put some links in the show notes.

Josh Birk:
Got it.

Rob Katz:
But that’s not what I do. The other thing that I don’t do is our product accessibility and inclusive design work, which is super important. It’s all the things that you and your audience would be familiar with: non-text contrast, 508 compliance, closed captions. It’s all of the ways that our technology is made more accessible to people who have stated and also unstated abilities and disabilities.
And so it’s a great piece of work that is very complementary to the work that I co-lead with my colleague Kathy Baxter, our principal architect of Ethical AI, which is what we call Ethics By Design. And it’s exactly what I just said: it’s how we ensure that the products are designed, built, shipped, and implemented from a foundation of ethical and humane use of technology principles.

Josh Birk:
What does that look like? So Ethics By Design, break that down for me a little bit. What does the lifecycle look like at a company the size and scope of Salesforce? And especially, I mean, we do three releases a year. We’re not a product, we’re a constantly shifting platform of features. And oh, by the way, I think Marc just bought another company.

Rob Katz:
I’ll have to check on that last point. It’s never a dull moment in the world of Salesforce.

Josh Birk:
Right.

Rob Katz:
So the answer is that we really focus on a couple of areas. One is education.

Josh Birk:
Okay.

Rob Katz:
So our product teams and designers and developers, we ensure that they have a base understanding of what ethics and technology means broadly and also specifically in Salesforce.

Josh Birk:
Okay.

Rob Katz:
We also work with certain teams that are building products that may have higher opportunity or higher risks such as our AI products and our new [inaudible 00:19:20] and customer data platform products, so that those teams are more acutely aware of the opportunities and risks associated with some of these really powerful pieces of software.

Josh Birk:
Got it.

Rob Katz:
Then we work on guardrails and guidance. And so it sounds mundane, but hey, we have a point of view about how it is that you as a developer on the Salesforce platform or you as an admin setting up the platform, ought to think about setting up your org or building your application or implementing this particular piece of the platform so that it by default holds a high bar for something like data privacy or inclusion.

Josh Birk:
Gotcha. So it’s not just, “Hey, Salesforce, make sure your features are ethical.” It’s also how are platforms being used so that the reach of it, the implementation of it is also ethical by design.

Rob Katz:
So you could argue that ethics work falls into Salesforce’s core value of trust.

Josh Birk:
Okay.

Rob Katz:
Which is true.

Josh Birk:
Yeah.

Rob Katz:
I would also posit that Salesforce’s work on Ethics By Design is a component of our commitment to both customer success, which is to say: you, the customer or developer organization, as long as it’s within the bounds of what is written in your legal agreement, go, good luck.

Josh Birk:
Right.

Rob Katz:
And we say, “And here is a point of view about how we intentionally designed this feature or enhancement or product or platform, and we think you will be more successful in your digital transformation if you consider this or that.” And I’ll give some examples of that.

Josh Birk:
Right.

Rob Katz:
In a second.

Josh Birk:
Okay.

Rob Katz:
And it’s a component of innovation, because what other SaaS platform that you can build on has a point of view about that? And it’s a differentiator for people who are building on the Salesforce platform, because it’s more robust.

Josh Birk:
Yeah.

Rob Katz:
It’s safer, it’s more… It’s hard to quantify this, but because there’s so much intentionality in it, you can focus your energies as a developer or as an admin or as a customer on doing what you are expert in. And you can know that behind the scenes we’re taking care of some of these hard ambiguous questions.

Josh Birk:
Yeah.

Rob Katz:
Because of the way that our team is integrated into the product.

Josh Birk:
Gotcha. Well, and this is not just well intended, good natured, we’re doing this because it’s the right thing. It’s also a bit of a cautionary tale, right? When it comes to data ethics, can you give me an example of something that might have been just a perfectly well-intended design but ended up having unintended consequences?

Rob Katz:
Sure. So data ethics is the practice of creating guideposts around the use of personal data in technology applications. And I mentioned before how I was living in Silicon Valley in sort of the height of the techno optimist curve.

Josh Birk:
Yes.

Rob Katz:
Where the exchange of personal data for value was implicit, and the value that I at least perceived at the time was very high. And so I think that over the last 10 years or so, we’ve come to a new, uneasy détente between personalization on the one hand, which is that we expect our products and services to be personalized.

Josh Birk:
Yeah.

Rob Katz:
And on the other, we are uneasy about the quantity and the accuracy of the information that is used in our day-to-day services to provide that personalization. And this is borne out over and over again in research, where customers and end users are telling us, in things like the State of the Connected Customer report, which I can link to in the show notes, that they want personalized services, and only if the data that are powering that personalization are transparently and clearly explained to them.

Josh Birk:
Okay. So don’t record anything that I don’t want you to.

Rob Katz:
Tell me what you want it for. Just…

Josh Birk:
Gotcha.

Rob Katz:
Level with me is effectively what I read out of it.

Josh Birk:
Yeah.

Rob Katz:
And so I’ll give you an example.

Josh Birk:
Yeah.

Rob Katz:
We want to generate really personalized, let’s say really personalized customer service. So a well-intended organization, and this is hypothetical.

Josh Birk:
Right.

Rob Katz:
Could scrape the emails and the chat transcripts that are coming into their organization and apply an analysis tool to identify whether someone has preferred pronouns of he, him, his; she, her, hers; or they, them, theirs. Okay? Then they could mark that person’s gender, or their inferred gender, in their database. And then they could route someone who has, let’s say, they, them pronouns to an LGBTQ-friendly customer service agent.

Josh Birk:
Got it.

Rob Katz:
Good intentions.

Josh Birk:
Right.

Rob Katz:
Super good intentions. You’re trying to personalize my experience as someone who identifies as non-binary or LGBTQ or whatever.

Josh Birk:
Yeah.

Rob Katz:
And it’s discriminatory.

Josh Birk:
Right.

Rob Katz:
I just want customer service.

Josh Birk:
Right.

Rob Katz:
And I appreciate that you’re trying to meet me where I am.

Josh Birk:
Yeah.

Rob Katz:
But I just need you to reset my password, or I need you to change my flight. So it’s that kind of thing where just because you can use the technology to do something like that doesn’t mean you should.

Josh Birk:
Yeah. How wrong can this go? Is there an example that you can think of hypothetical or otherwise where there was serious financial or legal outcomes?

Rob Katz:
So the canonical example of personalization and privacy gone wrong is when a person started getting ads for baby products from a particular retailer. They were a minor at the time, under the age of 18.

Josh Birk:
Oh.

Rob Katz:
And their parents contacted the retailer very angry.

Josh Birk:
Yeah.

Rob Katz:
“How dare you?” And it turned out that the minor was in fact expecting a baby.

Josh Birk:
Okay.

Rob Katz:
And the personalization algorithms that this retailer had built were so powerful that they were able to know, before the minor’s parents did, that the minor was expecting a child. It was a reputational hit on the one hand, and it was a real wake-up call as well. Now you would think, “Oh hey, that’s a long time ago.” I mean, people who are familiar with personalization in marketing especially are going to know exactly what I’m talking about.

Josh Birk:
Right.

Rob Katz:
And you would think, “No, that can’t still be happening. We’ve got to have better data quality than that, and our algorithms have improved…” So recently in the news was a really terrible story about how a fast food chain in Germany used their marketing automation system to align a calendar of holidays and observances with a series of push notifications.

Josh Birk:
Right.

Rob Katz:
And it would push a notification out based on the holiday to encourage folks to come in and get a promotion. Sounds…

Josh Birk:
Yeah.

Rob Katz:
Pretty straightforward.

Josh Birk:
Pretty straightforward, yeah.

Rob Katz:
Except for the fact that in Germany, the anniversary of Kristallnacht appears on that calendar as a day of remembrance. Now if you’re not familiar, Kristallnacht was a pogrom perpetrated by the Nazis against Jewish people and other minorities in 1938, at the beginning of what became the genocide against Jewish people in Germany.

Josh Birk:
Yeah.

Rob Katz:
And so this fast food chain did a push notification campaign out to everybody who had downloaded their app saying, “Come on in for Kristallnacht.” And that’s just, it’s like, you don’t do that.

Josh Birk:
Right. Right.

Rob Katz:
And it’s we need to include the human component in the work that we do. And I recognize how hard it is.

Josh Birk:
Right.

Rob Katz:
Because we’re trying to automate, we’re trying to do more with less. And we get that. At Salesforce, we get that.

Josh Birk:
Yeah.

Rob Katz:
We know. We’re here to help with your digital transformation.

Josh Birk:
Right.

Rob Katz:
And so if there’s a piece of guidance in the setup that says, “Be careful when…”

Josh Birk:
Right.

Rob Katz:
Or if there’s an in-app modal that pops up.

Josh Birk:
Yeah.

Rob Katz:
“Have you thought about…”

Josh Birk:
Yeah.

Rob Katz:
That’s a moment of friction that is intentional, purposeful friction in the software that we hope will help an admin or a developer or a marketer or a customer service leader from…

Josh Birk:
Right.

Rob Katz:
From stepping on their own toes, basically.

Josh Birk:
Right. Yeah. Kathy brought up that same example, and it’s like the machine only knows the now, and you need the human to understand the consequences of that now. There’s no way for the machine to learn that. It’s not a Wikipedia of what these holidays actually mean. It’s just an array list that it’s…

Rob Katz:
Sure.

Josh Birk:
Pulling things from. Yeah.
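The intentional, purposeful friction Rob describes, a human checkpoint before the automation fires, can be sketched roughly like this. The holiday data, the `sensitive` flag, and the review queue are all hypothetical:

```python
# Hypothetical guardrail: automated promotional pushes are generated only for
# holidays not flagged as sensitive; flagged observances go to a human review
# queue instead of straight out the door.
HOLIDAYS = [
    {"name": "New Year's Day", "sensitive": False},
    {"name": "Day of Remembrance", "sensitive": True},
]

def plan_push_campaigns(holidays):
    pushes, needs_review = [], []
    for holiday in holidays:
        if holiday["sensitive"]:
            # Intentional friction: a person decides, not the system.
            needs_review.append(holiday["name"])
        else:
            pushes.append(f"Come celebrate {holiday['name']} with our promotion!")
    return pushes, needs_review

pushes, review = plan_push_campaigns(HOLIDAYS)
print(pushes)   # ["Come celebrate New Year's Day with our promotion!"]
print(review)   # ['Day of Remembrance']
```

The flag itself still has to be set by a human who knows what the observance means, which is exactly Rob’s point about keeping the human component in the loop.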

Rob Katz:
And then there’s another really mundane one.

Josh Birk:
Yeah.

Rob Katz:
I am not going to get in trouble when I say this, I think.

Josh Birk:
Okay.

Rob Katz:
I share my Disney Plus password with my mom and I share my Netflix password with my in-laws.

Josh Birk:
Okay.

Rob Katz:
So if I get an email campaign from Netflix or from Disney Plus.

Josh Birk:
Yeah.

Rob Katz:
It will say my first name, but it might go to my mom or.

Josh Birk:
Right.

Rob Katz:
To my in-laws.

Josh Birk:
Right.

Rob Katz:
And that just looks bad.

Josh Birk:
Yeah.

Rob Katz:
And it’s personalization that feels machine built and not personal.

Josh Birk:
Right.

Rob Katz:
And when people say, “Hey, I want personalized experiences,” just tell me what you need. There’s got to be a way, instead of just filling first name comma last name into the template…

Josh Birk:
Yeah.

Rob Katz:
To consider other information.

Josh Birk:
Right.

Rob Katz:
And those are easy things to prevent if you’re willing to think about it.
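The mismatch Rob describes, a merge field addressing the wrong person on a shared account, suggests a fallback check before personalizing. A minimal sketch with made-up field names, not any vendor’s schema:

```python
# Sketch: fall back to a generic greeting when the account shows signs of
# being shared (multiple profiles, a shared-login flag) or when the name
# field is missing, rather than blindly filling in the merge field.
def greeting(account):
    likely_shared = account.get("profiles", 1) > 1 or account.get("shared_login", False)
    if likely_shared or not account.get("first_name"):
        return "Hi there,"
    return f"Hi {account['first_name']},"

print(greeting({"first_name": "Rob", "profiles": 1}))   # Hi Rob,
print(greeting({"first_name": "Rob", "profiles": 4}))   # Hi there,
```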

Josh Birk:
Nice. Well, let’s take this up another level. And I think some of this gets into regulated industries versus unregulated industries. But say I’m going to build something like a mortgage application on top of Salesforce, I’m listening to this episode for the first time, and I’ve never even really considered this. What are some considerations for the developer or the data designer? What should they be thinking about?

Rob Katz:
Well, first of all, we’re here to help so you can get in touch.

Josh Birk:
Yeah.

Rob Katz:
And my team and I would be glad to talk with you for real.

Josh Birk:
Nice.

Rob Katz:
And with your customer, with your partner. So let’s say you’re going to build an app that helps a bank originate mortgages on top of Salesforce data. So you want to be very aware of all of the regulations that govern how personal information can and can’t be used in the mortgage origination process.

Josh Birk:
Okay.

Rob Katz:
And that’s a, “Hey, you need to work really closely with your legal teams,” but it’s also: how long do you have to retain that data? What systems, processes, and mechanisms have you built in to automatically remove data that you don’t need anymore? Let’s say someone started but didn’t finish the application. What’s the statutory requirement for you to hold onto that, or to automatically delete it after a certain period of time?

Josh Birk:
Right.
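The retention mechanism Rob raises could look like a scheduled job along these lines. The 90-day window and the record shape are illustrative only; the real window comes from your legal team, not from code:

```python
from datetime import datetime, timedelta

# Sketch of automated retention enforcement: incomplete applications older
# than the retention window are purged so personal data isn't kept longer
# than needed. The window here is an assumption for illustration.
RETENTION = timedelta(days=90)

def purge_abandoned(applications, now=None):
    now = now or datetime.utcnow()
    kept = []
    for app in applications:
        abandoned = app["status"] == "incomplete"
        expired = (now - app["last_touched"]) > RETENTION
        if abandoned and expired:
            continue  # drop: retention window elapsed, data no longer needed
        kept.append(app)
    return kept
```

In practice this would run on a schedule and log what it deleted, so you can demonstrate compliance rather than just assert it.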

Rob Katz:
And of course, you want to make sure that the system isn’t unintentionally discriminating against people based on their protected class. And so…

Josh Birk:
Yeah.

Rob Katz:
In the United States, zip code, because of a history of racial discrimination in housing.

Josh Birk:
Yeah.

Rob Katz:
Is a proxy for race.

Josh Birk:
Yeah.

Rob Katz:
And so you can’t discriminate on race because it’s illegal and it’s also wrong.

Josh Birk:
Right.

Rob Katz:
But if you use zip code in any sort of machine learning or any sort of algorithmic decision-making.

Josh Birk:
Yeah.

Rob Katz:
It will introduce race as a proxy. And so you need to be aware of how that might unintentionally bias the outcomes of the application.
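
The proxy problem Rob describes can be sketched roughly in code. This is a minimal, illustrative screening pass, not a production fairness audit: all names, the toy data, and the 0.9 threshold are made up for the example, and real feature auditing uses far more rigorous statistical tests.

```python
# Illustrative sketch: screen candidate model features for proxies of a
# protected attribute (e.g. zip code standing in for race) before training.
from collections import defaultdict

def proxy_strength(records, feature, protected):
    """Rough proxy check: for each feature value, how concentrated is the
    protected attribute among records with that value? Returns the average
    majority share; values near 1.0 mean the feature predicts the attribute."""
    groups = defaultdict(list)
    for r in records:
        groups[r[feature]].append(r[protected])
    shares = []
    for values in groups.values():
        top = max(values.count(v) for v in set(values))
        shares.append(top / len(values))
    return sum(shares) / len(shares)

def screen_features(records, features, protected, threshold=0.9):
    """Drop any feature whose values almost perfectly partition the
    protected attribute."""
    kept, dropped = [], []
    for f in features:
        (dropped if proxy_strength(records, f, protected) >= threshold
         else kept).append(f)
    return kept, dropped

# Toy applicant data where zip code perfectly separates the protected class.
applicants = [
    {"zip_code": "60601", "income_band": "mid",  "race": "A"},
    {"zip_code": "60601", "income_band": "high", "race": "A"},
    {"zip_code": "60629", "income_band": "mid",  "race": "B"},
    {"zip_code": "60629", "income_band": "low",  "race": "B"},
]
kept, dropped = screen_features(applicants, ["zip_code", "income_band"], "race")
print(kept, dropped)  # → ['income_band'] ['zip_code']
```

In this toy data, zip code perfectly partitions the protected attribute, so it gets flagged and dropped, while income band does not.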

Josh Birk:
Right. And if people are listening to this and thinking, well, that feels like it’s overthinking the situation, I just want to remind them of what you previously said about a machine figuring out that somebody was pregnant before their parents knew.

Rob Katz:
Sure. I mean, here’s another one.

Josh Birk:
We’re talking about powerful stuff.

Rob Katz:
I loved my time when I was working really closely with software engineers and I still get the opportunity to work with software engineering teams.

Josh Birk:
Yeah.

Rob Katz:
But to tell you the truth, I kind of miss sometimes being in the office with the scrum teams that I was working with and the designers that I was working with. And I’ll tell you another default that really is an interesting one to me, which is, if you’re building an application on top of Salesforce, you may need the device ID or the IP address in order to debug it. Let’s say you get a ticket coming in.

Josh Birk:
Right.

Rob Katz:
Or a customer complaint, and you need to understand is it something, is it behaving poorly with Android version X or iOS version X, or is it only on Safari or is it only on Chrome?

Josh Birk:
Right.

Rob Katz:
And I had a funny moment the other day where I was hanging out with some people from my team in real life, which was cool.

Josh Birk:
Yeah.

Rob Katz:
And they were using a product that my wife’s company has built, and they ran into a bug about how the credit cards are filled into their form when they auto fill out of the web browser on a phone.

Josh Birk:
Okay.

Rob Katz:
But it worked everywhere else.

Josh Birk:
Yeah.

Rob Katz:
And I was like, “Okay, great. Let’s take some screenshots. Let’s send in the latest, what version of iOS are you using?” And my wife was like, “Thank you for submitting a bug report.” I’m like, “Once a product manager, always a product manager.”

Josh Birk:
Always a product manager. Love it.

Rob Katz:
So device ID and IP address.

Josh Birk:
Yeah.

Rob Katz:
Innocent.

Josh Birk:
Oh.

Rob Katz:
Innocent, but not.

Josh Birk:
But not, right?

Rob Katz:
So if you stop listening now, hear this, please delete them on a regular basis.

Josh Birk:
Gotcha.

Rob Katz:
And the reason I say that is that you can infer someone’s location to a very, very precise degree of accuracy based on.

Josh Birk:
Yeah.

Rob Katz:
Just things like IP address and device ID.

Josh Birk:
Yeah.

Rob Katz:
And we don’t need that data hanging around.

Josh Birk:
Right.

Rob Katz:
So how would you as a developer choose to delete these data? Well, don’t delete it right away, but you could build a rolling delete, push or pull depending on how you’re building it, that automatically deletes it every X.

Josh Birk:
Yeah.

Rob Katz:
30 days, 60 days, 90 days, because you’re not going to get a ticket from someone who was interacting with your app 90 days ago that you need to go debug now.
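
The rolling delete Rob describes could be sketched like this. It’s a hedged illustration, not a prescribed implementation: in a real app this would be a scheduled job against your datastore, and the record shape, field names, and 90-day window here are all assumptions made for the example.

```python
# Illustrative sketch of a rolling delete for debug telemetry
# (device ID, IP address). A plain list stands in for the datastore.
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 90  # illustrative; pick what your debugging actually needs

def purge_old_telemetry(records, now=None, retention_days=RETENTION_DAYS):
    """Keep only records newer than the retention window; everything
    older is dropped so identifiable data doesn't linger."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=retention_days)
    return [r for r in records if r["logged_at"] >= cutoff]

now = datetime(2023, 1, 1, tzinfo=timezone.utc)
records = [
    {"ip": "203.0.113.7",  "device_id": "abc", "logged_at": now - timedelta(days=10)},
    {"ip": "198.51.100.4", "device_id": "def", "logged_at": now - timedelta(days=120)},
]
fresh = purge_old_telemetry(records, now=now)
print(len(fresh))  # → 1: the 120-day-old record is gone
```

Run on a schedule (a cron job, a platform batch job, or similar), this keeps the debugging window you actually need while the older identifiable data goes away on its own.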

Josh Birk:
Right. Right.

Rob Katz:
And if you delete it, number one, it is good hygiene.

Josh Birk:
Yeah.

Rob Katz:
Number two, it’s good from a data privacy and ethics perspective.

Josh Birk:
Yeah.

Rob Katz:
Number three, it lowers your costs because you’re not storing it. And number four, if God forbid, you have some kind of breach, suddenly these data that are personally identifiable aren’t out in the world somewhere.

Josh Birk:
Right. Right.

Rob Katz:
And yet, as a product manager, no one ever told me to make sure that my requirements included deleting old data.

Josh Birk:
Right. Right.

Rob Katz:
And data is doubling again by 2026, which is something that we learned in the Dreamforce keynote. We need more than ever the right data, not just the most data. And so when I talk about data ethics, it’s not just, “Hey, this is the ethics guy, please do the right thing. Halo, moral responsibility.” It’s also like.

Josh Birk:
Right.

Rob Katz:
“Hey, this is good for your business.”

Josh Birk:
Right.

Rob Katz:
“And it will help you get the right signal to noise ratio so that you can build.”

Josh Birk:
Yeah.

Rob Katz:
“Those personalized experiences on the latest and most up-to-date data. And you can debug tickets just fine.”

Josh Birk:
Yeah.

Rob Katz:
Anyway.

Josh Birk:
Yeah. Well, and I think that’s an excellent example, because IP addresses are all in the news now thanks to Mr. Musk. That’s how people get doxed. When an IP address slips to the wrong person, that’s how they find your address and then send you a SWAT team. So really good example. I want to talk about consequence scanning workshops. What are those and who should be running them?

Rob Katz:
Thanks. I’m really glad you brought up the doxing example by the way.

Josh Birk:
Thanks.

Rob Katz:
So consequence scanning, it’s an agile process that was originally created by an organization in the UK called doteveryone. And so we stand on their shoulders when it comes to this. And we can put a consequence scanning link in the show notes.

Josh Birk:
Yeah.

Rob Katz:
Consequence scanning is effectively, “Hey, what are you trying to do with this new feature, product enhancement? And how might there be unintended negative consequences?” So what we do is we take a scrum team, that’s the software engineers and their managers and their TPM, their product managers, their UX designers, anybody with whom they’re working to build this, and we bring them into a room, virtual or otherwise for an hour.
And we ask them, we facilitate a process of considering what are the first order, second order, third order intended positive consequences as well as unintended negative consequences. And are there any intended negative consequences? We know that this could be bad. What are we doing to mitigate that?

Josh Birk:
Right.

Rob Katz:
Or are there any unintended positive consequences? And the process of just considering what are the first, second, and third order effects of this thing.

Josh Birk:
Yeah.

Rob Katz:
Is a way for us to ensure that what we’re building is more robust by default, and it helps us consider all of the ways that it could be misused and therefore how we can try and put some guardrails or some guidance around that misuse, hypothetical misuse, and help our customers and developers and admins and ecosystem avoid that.
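
The output of a session like this can be captured in a simple structure the team reviews afterward. This is only a sketch of one possible format; the categories follow the framing described in the episode, and every field name and entry here is illustrative.

```python
# Illustrative record of a consequence-scanning session. The categories
# (intended/unintended x positive/negative, plus mitigations) follow the
# framing described in the episode; contents are made-up examples.
workshop = {
    "feature": "rolling delete of debug telemetry",
    "intended_positive": ["less identifiable data at rest", "lower storage cost"],
    "unintended_negative": ["can't debug tickets older than the window"],
    "intended_negative": [],   # known downsides the team accepts
    "unintended_positive": ["smaller blast radius in a breach"],
    "mitigations": {
        "can't debug tickets older than the window":
            "pick a window longer than realistic ticket latency",
    },
}

def unmitigated(w):
    """Flag negative consequences recorded without a mitigation."""
    negatives = w["unintended_negative"] + w["intended_negative"]
    return [c for c in negatives if c not in w["mitigations"]]

print(unmitigated(workshop))  # → []: every negative has a mitigation
```

The point isn’t the code; it’s that writing the consequences down forces the team to notice any negative that has no mitigation attached.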

Josh Birk:
And that’s our show. Now definitely check out the show notes for this one as we are going to put a lot of links out there for action items of learning more and doing more about ethics and data and application design. Now before we go, I did ask [inaudible 00:38:26] Rob’s favorite non-technical hobby.

Rob Katz:
My favorite non-technical hobby is playing Ultimate Frisbee. And the reason I love Ultimate is you don’t need refs.

Josh Birk:
Okay.

Rob Katz:
It is governed by a concept called Spirit of The Game, which put on your hippy hat for a minute and join me on this adventure. But Spirit of The Game, and anybody who plays Ultimate, who’s listening to this will be smiling. Hopefully. It’s a set of, it’s the rules of the game.

Josh Birk:
Yeah.

Rob Katz:
But it’s the intention that you don’t intentionally break the rules and.

Josh Birk:
Yeah.

Rob Katz:
Ultimate’s a non-contact sport.

Josh Birk:
Yeah.

Rob Katz:
And you call your own fouls and you call your own inbounds and out of bounds.

Josh Birk:
Yeah.

Rob Katz:
And it creates an environment of mutual responsibility for the game. And I don’t say, “Hey, I love Ultimate because it aligns nicely with my job.”

Josh Birk:
Right.

Rob Katz:
But I’m living my best life, Josh. I’ve got a great job where I get to work on these hard, interesting problems around making sure.

Josh Birk:
Yes.

Rob Katz:
That the technologies that we all use are.

Josh Birk:
Yeah.

Rob Katz:
Built responsibly and ethically. But the reason I love what I do is that it aligns with my own personal value system so well.

Josh Birk:
Right.

Rob Katz:
I love playing a sport in which you’re not trying to get away with something or to take a dive in front of the ref. I mean, I watched the World Cup final yesterday morning and it was awesome.

Josh Birk:
Gotcha.

Rob Katz:
And the Men’s World Cup final yesterday morning was super compelling. And there were two different penalty kicks called in that game.

Josh Birk:
Nice.

Rob Katz:
And in both of them, I looked at it and I’m like, “Was he really fouled? Was that player really fouled?” And it struck me as so distinct from the way in which an Ultimate game is played, in which if I foul you, I might call the foul on myself. Or you might say, “Hey, you fouled me.” And I was like, “You know what? You’re right.”

Josh Birk:
Yeah.

Rob Katz:
Or, “Actually, no, I disagree with you. That contact was…” And so that’s what I love to do outside of work.

Josh Birk:
I want to thank Rob for the great conversation and information. And as always, I want to thank you for listening. Now if you want to learn more about this show, head on over to developer.salesforce.com/podcast where you can join our community, hear old episodes, see the show notes, and find links to your favorite podcast service. Thanks again everybody, and I’ll talk to you next week.

Get notified of new episodes with the new Salesforce Developers Slack app.