Most companies think cybersecurity is about firewalls and IT policies. But the real risk often starts with culture.
In this episode, Dutch Schwartz, VP of Cloud Services at SideChannel and former security leader at AWS and Raytheon, shares why companies with poor culture are three times more likely to face a breach. We talk about how leadership, disengaged employees, and outdated mindsets quietly open the door to cyber threats.
You'll also hear:
The $25 million AI voice scam that could have been prevented
How to build a culture of security that actually sticks
What CFOs need to understand about working with CISOs
Why cybersecurity is no longer just a tech problem
This conversation is a must-listen for finance and business leaders navigating today's digital risks.
Key Takeaways:
Strategic security starts with understanding business goals, not just technology.
Finance leaders need to speak the language of cybersecurity to drive risk-aware decision-making.
Influence and communication matter more than technical brilliance when building trust across departments.
Military leadership principles (clarity, discipline, and adaptability) translate directly into the business world.
Great cybersecurity is proactive, not reactive. It’s about anticipating threats before they materialize.
Noteworthy Quotes:
“Cybersecurity is not just an IT problem. It’s a business risk—and finance has a seat at that table.” – Dutch Schwartz
“Leadership is about aligning people around a purpose. That’s true in the military, and it’s true in the boardroom.” – Dutch Schwartz
“If you don’t understand how your data flows, you can’t protect it.” – Dutch Schwartz
“Finance and security teams should stop working in silos. The future is in collaboration.” – Dutch Schwartz
“The best strategy is the one that people can execute, not the one that looks best in PowerPoint.” – Dutch Schwartz
Key Timestamps:
00:00 – Trailer & Introduction to Dutch Schwartz
03:42 – Dutch’s military background and transition into cybersecurity
07:30 – Building strategic security at AWS and influencing at scale
11:50 – Why CFOs should understand cybersecurity
16:45 – The connection between financial data and security vulnerabilities
21:00 – Translating technical risk into business language
25:12 – Military lessons applied to corporate leadership
29:38 – Why proactive cybersecurity is a competitive advantage
34:20 – Cybersecurity, AI, and emerging trends
40:00 – Advice for finance leaders: Partner with your CISOs
43:10 – Final reflections and leadership philosophy
📬 Get Involved
Subscribe to the Off the Record Newsletter:
Join the conversation on Substack: https://wassiakamon.substack.com/
Have a question or topic suggestion?
Email: Ask@thediaryofacfo.com
🔗 Connect with Guest Dutch Schwartz on
LinkedIn: https://www.linkedin.com/in/dutchschwartz/
Website: https://sidechannel.com/
🔗 Connect with Host Wassia Kamon on
LinkedIn: https://www.linkedin.com/in/wassiakamon/
Instagram: https://www.instagram.com/wassiakamon/
🔗 Connect with The Diary of a CFO Podcast on
LinkedIn: https://www.linkedin.com/company/the-diary-of-a-cfo-podcast/
YouTube: https://www.youtube.com/@Thediaryofacfopodcast/featured
Website: https://www.thediaryofacfo.com
X (Twitter): https://x.com/thediaryofacfo?t=yjtBalOAshtyxWctnMliA&s=09
TikTok: https://www.tiktok.com/@csuitestories?is_from_webapp=1&sender_device=pc
00:00:00 --> 00:00:03 I'm an unhappy, disengaged employee. I'm unhappy.
00:00:03 --> 00:00:05 I'm not engaged. I'm not really dialed into our
00:00:05 --> 00:00:07 mission. It's much more likely then that I'm
00:00:07 --> 00:00:09 not going to follow any of the process or procedure.
00:00:09 --> 00:00:11 Companies with poor culture are three times more
00:00:11 --> 00:00:14 likely to have a breach. Today's guest is Dutch
00:00:14 --> 00:00:17 Schwartz, vice president of cloud services at
00:00:17 --> 00:00:19 SideChannel, a leading cybersecurity provider.
00:00:20 --> 00:00:22 He previously served as a senior security advisor
00:00:22 --> 00:00:26 at Raytheon and AWS. Dutch has led security strategy
00:00:26 --> 00:00:29 and customer engagement for 50 global enterprises
00:00:29 --> 00:00:32 across industries like aviation, fintech, media,
00:00:33 --> 00:00:36 software, and gaming. Today, he partners with
00:00:36 --> 00:00:39 Fortune 100 CSOs to build cybersecurity strategies
00:00:39 --> 00:00:42 that deliver real business value. What would
00:00:42 --> 00:00:45 you say is one cybersecurity mistake you see
00:00:45 --> 00:00:48 big companies, small companies make over and
00:00:48 --> 00:00:50 over again? There have been really tragic examples
00:00:50 --> 00:00:54 of where somebody received an AI-simulated voice
00:00:54 --> 00:00:56 memo. People have transferred money and not insignificant,
00:00:56 --> 00:00:59 like $25 million in one instance where they transferred
00:00:59 --> 00:01:02 money because they believed it was their CEO.
00:01:02 --> 00:01:04 That one voice memo, even if it appears to be
00:01:04 --> 00:01:07 your CFO or CEO, that is not enough. There has
00:01:07 --> 00:01:10 to be a process of checks, right? If cybersecurity
00:01:10 --> 00:01:12 is only thought of as this group of two people,
00:01:12 --> 00:01:15 20 people, 200 people over here, then the implication
00:01:15 --> 00:01:18 is there's a bias that happens unintentionally
00:01:18 --> 00:01:19 in people's brain, where they go, oh, well, that's
00:01:19 --> 00:01:21 not my job. I don't have a piece of this. I promise
00:01:21 --> 00:01:23 you you do, because that's where most of the
00:01:23 --> 00:01:27 risk lies. It's an unintended risk, because the
00:01:27 --> 00:01:29 criminals know this. They understand human behavior
00:01:29 --> 00:01:32 very well. You need to build a culture of security.
00:01:32 --> 00:01:35 What should any CFO at any size company know about
00:01:35 --> 00:01:38 cybersecurity? Yeah, there's probably two things.
00:01:40 --> 00:01:43 Welcome back to the Diary of a CFO podcast, the
00:01:43 --> 00:01:45 podcast where finance leaders share the lessons,
00:01:45 --> 00:01:47 wins, and challenges that shape their careers,
00:01:47 --> 00:01:50 as well as their organizations. I'm your host,
00:01:50 --> 00:01:53 Wassia Kamon, and today I'm super delighted to
00:01:53 --> 00:01:56 have with me, Dutch Schwartz. Welcome to the
00:01:56 --> 00:01:58 show, Dutch. Wassia, thank you so much for having
00:01:58 --> 00:02:00 me. I know we're recording it today, so you might
00:02:00 --> 00:02:02 be listening to us at a later date, but it's
00:02:02 --> 00:02:05 Juneteenth, so I just wanted to... note that
00:02:05 --> 00:02:08 and reflect on what a cool day this is to be
00:02:08 --> 00:02:09 here and sharing it with you. So thanks for the
00:02:09 --> 00:02:12 invitation. I appreciate it. Of course, the pleasure
00:02:12 --> 00:02:15 is mine. I really wanted to start with your story
00:02:15 --> 00:02:18 and how did you end up in the cybersecurity space
00:02:18 --> 00:02:21 in the first place and any unexpected twist along
00:02:21 --> 00:02:24 the way? Yeah, no, it's almost all twists. So
00:02:24 --> 00:02:26 I started my career as a military officer. So
00:02:26 --> 00:02:29 I was an infantry officer in the army. It doesn't
00:02:29 --> 00:02:31 seem to have any kind of obvious connection to
00:02:31 --> 00:02:34 what I do today. So at one point in my career,
00:02:34 --> 00:02:36 I actually put up a poll right on LinkedIn and
00:02:36 --> 00:02:38 said, hey, how did you get into cybersecurity?
00:02:39 --> 00:02:41 It used to be called information security. And
00:02:41 --> 00:02:46 the favorite answer was "pull up a chair." So
00:02:46 --> 00:02:48 it's actually very common. Yeah, it was like
00:02:48 --> 00:02:52 46%. It's really common, probably more so it's
00:02:52 --> 00:02:54 a little bit of a timeline discussion, right?
00:02:54 --> 00:02:56 So probably maybe today, if you're coming straight
00:02:56 --> 00:02:59 out of a certification program or a military
00:02:59 --> 00:03:02 program, a transition program, or a university,
00:03:03 --> 00:03:05 that's probably a little bit different today.
00:03:05 --> 00:03:08 But if you go back 25, 30, certainly more than
00:03:08 --> 00:03:10 30 years ago, you didn't have these kinds of
00:03:10 --> 00:03:12 programs, right? So really most people who are
00:03:12 --> 00:03:15 above a certain age, let's just pick 40, let's
00:03:15 --> 00:03:17 say arbitrarily. Probably there wasn't a path.
00:03:18 --> 00:03:20 So they probably found their way somehow into
00:03:20 --> 00:03:23 what was then information security. In my case
00:03:23 --> 00:03:25 it was very similar. So I had transitioned from
00:03:25 --> 00:03:28 the active duty into the reserves. And I had
00:03:28 --> 00:03:30 a friend, we were both interviewing, right, for
00:03:30 --> 00:03:33 new jobs, right? And she found a job in like
00:03:33 --> 00:03:34 three or four days. I'm like, oh my gosh, Ann
00:03:34 --> 00:03:37 Jeanette, that's amazing, right? But her, both
00:03:37 --> 00:03:38 her parents, she said, oh no, both my parents
00:03:38 --> 00:03:41 are in the tech industry. And so they had set
00:03:41 --> 00:03:42 her up with a bunch of interviews with people
00:03:42 --> 00:03:44 they knew. And she said, you know, I interviewed
00:03:44 --> 00:03:46 with this little company and I actually really
00:03:46 --> 00:03:49 think that based on your personality and like,
00:03:49 --> 00:03:51 you would be a really good fit. And so that's
00:03:51 --> 00:03:54 how I got in, really. Accidentally. And that's the case
00:03:54 --> 00:03:56 with a lot of people. They find their way in
00:03:56 --> 00:03:59 from something. And maybe it's just they really
00:03:59 --> 00:04:02 love puzzles, problem-solving. They love tech,
00:04:02 --> 00:04:05 maybe. They're in some adjacent role. And then
00:04:05 --> 00:04:07 somebody's like, hey, we need somebody to figure
00:04:07 --> 00:04:10 out security for us. And they foolishly maybe
00:04:10 --> 00:04:13 raise their hands and go, I'll do it. But a lot
00:04:13 --> 00:04:15 of people came through. So that's a really interesting
00:04:16 --> 00:04:19 reflection on how the Chief Information Security
00:04:19 --> 00:04:21 Officer role, sometimes it's pronounced "see-so," sometimes
00:04:21 --> 00:04:24 "siss-oh," but I'll just call it CISO for the
00:04:24 --> 00:04:27 rest of the episode. So a lot of CISOs again
00:04:27 --> 00:04:30 came up experientially, right? So like hands
00:04:30 --> 00:04:33 on keyboard, maybe to do an audit or investigation.
00:04:33 --> 00:04:36 So there's pros and cons like anything like that,
00:04:36 --> 00:04:38 right? So a lot of times they have a lot of great
00:04:38 --> 00:04:41 empathy and understanding for what their employees
00:04:41 --> 00:04:43 are doing, but they may have not had that structured
00:04:43 --> 00:04:46 path, right, that some other, more mature roles have,
00:04:46 --> 00:04:48 certainly the chief financial officer role, which
00:04:48 --> 00:04:51 has been around for hundreds of years in a sense,
00:04:51 --> 00:04:55 right? So there's a nascency, right, to the CISO
00:04:55 --> 00:04:57 role. And that will play out as we have our conversation.
00:04:57 --> 00:05:00 I can kind of point out how that plays out, right,
00:05:00 --> 00:05:02 when you're working with the rest of the C-suite
00:05:02 --> 00:05:03 and the board. Because there are some nuances
00:05:03 --> 00:05:05 there, as you'd expect, right, being a newer
00:05:05 --> 00:05:09 role. Yes. And so what would you describe as,
00:05:09 --> 00:05:11 what would be your definition of cybersecurity?
00:05:12 --> 00:05:15 And what do people do in cybersecurity? Like,
00:05:15 --> 00:05:19 grade level five. Yeah, that's a great question.
00:05:19 --> 00:05:21 Because we even internally have these crazy debates
00:05:21 --> 00:05:24 on why did cybersecurity become the term, right?
00:05:24 --> 00:05:27 Because it used to be information security, sometimes
00:05:27 --> 00:05:29 data security. That's a little more understandable
00:05:29 --> 00:05:32 to the typical person. Cyber security really
00:05:32 --> 00:05:34 the intent there was to talk about as we move
00:05:34 --> 00:05:36 from analog to digital, right? So as we move
00:05:36 --> 00:05:40 from a physical file folder cabinet to an online
00:05:40 --> 00:05:42 version of that to a digital version of that,
00:05:42 --> 00:05:44 we were, the essence of what we're trying to
00:05:44 --> 00:05:48 capture is, how do we now talk about the value
00:05:48 --> 00:05:52 that seems kind of ephemeral? Those zeros
00:05:52 --> 00:05:54 and ones that are digital, how do you think about
00:05:54 --> 00:05:57 that digital value? And then how do you protect
00:05:57 --> 00:06:00 it? Because again, it feels a little more obvious
00:06:00 --> 00:06:03 if I can walk over to my file cabinet and say,
00:06:03 --> 00:06:04 I'm unlocking it, right? Like we used to do,
00:06:04 --> 00:06:08 had been doing for years. It's a little less clear
00:06:08 --> 00:06:10 to, certainly to a non-technical user of, well,
00:06:10 --> 00:06:12 what do I need to do when it's in the digital
00:06:12 --> 00:06:15 realm? And so that's really, cybersecurity is
00:06:15 --> 00:06:19 really about how do I, first and foremost, in
00:06:19 --> 00:06:21 the 2020s and beyond, right? In this modern,
00:06:21 --> 00:06:23 what I would consider this current modern era,
00:06:24 --> 00:06:25 we have to talk about how do I love the business,
00:06:26 --> 00:06:29 right? How do I deliver business outcomes, right?
00:06:30 --> 00:06:33 For many, many years, cybersecurity was a subset,
00:06:33 --> 00:06:35 unintentionally probably, kind of underneath
00:06:35 --> 00:06:38 information technology. It just sort of grew
00:06:38 --> 00:06:40 out of that, right? And unfortunately, there's
00:06:40 --> 00:06:44 some challenges with that. One of which is,
00:06:44 --> 00:06:47 then it feels like it's very removed from the
00:06:47 --> 00:06:49 business, right? So maybe you're used to working
00:06:49 --> 00:06:51 with your tech support team, and then there's
00:06:51 --> 00:06:52 a whole other team and you don't even really
00:06:52 --> 00:06:54 interface with that team on a regular basis.
00:06:55 --> 00:06:56 And then it sort of didn't have a voice at the
00:06:56 --> 00:06:59 table because they were newer. It was sort of
00:06:59 --> 00:07:01 like going up. So for many, many years, it was
00:07:01 --> 00:07:04 very common for a CISO to report to the CIO.
00:07:05 --> 00:07:07 Not that that structure is wrong. It's not wrong,
00:07:07 --> 00:07:11 but it's maybe on average, it's probably not
00:07:11 --> 00:07:13 the best structure, right? So that was one of
00:07:13 --> 00:07:15 the things that CISOs are like, hey, I probably should
00:07:15 --> 00:07:18 report to, again, it's gonna depend on the size
00:07:18 --> 00:07:20 of your company, the chief risk officer, right?
00:07:21 --> 00:07:24 Perhaps chief legal counsel, the GC, perhaps
00:07:24 --> 00:07:25 there's a subcommittee, right, where maybe
00:07:25 --> 00:07:27 there's four or five people that look at risk.
00:07:27 --> 00:07:30 Maybe it's directly to the CEO, right? Depending
00:07:30 --> 00:07:33 on if you're a technology company, Amazon would
00:07:33 --> 00:07:35 be a good example that Amazon and Amazon Web
00:07:35 --> 00:07:38 Services, both of those companies. In that case,
00:07:38 --> 00:07:42 it's so critical, right? To the safety
00:07:42 --> 00:07:45 and security of the customers, that in that sense,
00:07:45 --> 00:07:47 they report directly to the CEO. So again, I
00:07:47 --> 00:07:49 don't think there's a one size fits all, but
00:07:49 --> 00:07:51 I mean, you can see this movement of, oh, we're
00:07:51 --> 00:07:54 just sort of a cost center that's nested underneath
00:07:54 --> 00:07:58 of IT to, oh gosh, this is becoming more intrinsic
00:07:58 --> 00:08:01 to our business, right? How do we make sure that
00:08:01 --> 00:08:03 we have a seat at that table, right? And make
00:08:03 --> 00:08:05 sure that we're, and it's incumbent on CISOs
00:08:05 --> 00:08:08 as well then, to understand the business, right?
00:08:08 --> 00:08:10 And then to not just be a technologist, but to
00:08:10 --> 00:08:12 be a business executive. And that's a little
00:08:12 --> 00:08:14 bit of that shift that I've mentioned. And I'd
00:08:14 --> 00:08:17 say over the last four or five years, particularly
00:08:17 --> 00:08:19 COVID accelerated that, right? As we sort of
00:08:19 --> 00:08:22 did the world's largest social science experiment,
00:08:22 --> 00:08:24 we all went home. People, it became real apparent,
00:08:24 --> 00:08:27 oh gosh, we really are a digital company. Even
00:08:27 --> 00:08:30 if we manufacture something, even if we're in
00:08:30 --> 00:08:32 pharmaceuticals or, you know, to take your pick,
00:08:32 --> 00:08:34 people suddenly went, oh wow, a big part of what
00:08:34 --> 00:08:37 we do and how we deliver, either internally or
00:08:37 --> 00:08:40 to our customers or clients or partners is digital.
00:08:40 --> 00:08:43 That actually catapulted some CISOs to that.
00:08:43 --> 00:08:45 They went from the kids table, as we might say,
00:08:45 --> 00:08:48 where I grew up, to the adult table. That's that
00:08:48 --> 00:08:50 transition. People are still finding their way.
00:08:51 --> 00:08:53 How do I make sure I'm speaking the language
00:08:53 --> 00:08:56 that's used in the boardroom and for my C-suite?
00:08:57 --> 00:08:58 And so when I'm sort of coaching and mentoring,
00:08:58 --> 00:09:00 that's a big piece of what I talk about. I like
00:09:00 --> 00:09:03 the analogy of from the kids table to the adult
00:09:03 --> 00:09:06 table, because I'm also curious when you said
00:09:06 --> 00:09:08 a way to deliver business outcomes, what are
00:09:08 --> 00:09:10 some of the business outcomes that people in
00:09:10 --> 00:09:13 cybersecurity, the CISOs, can actually bring?
00:09:13 --> 00:09:16 It will vary, of course. So I start with like,
00:09:16 --> 00:09:18 first of all, if we're a public company or for
00:09:18 --> 00:09:20 a large private company, how do we think about
00:09:20 --> 00:09:22 risk? That's an easy, obvious one, right? So
00:09:22 --> 00:09:24 if you're a public company in the U.S., you
00:09:24 --> 00:09:27 look at Section 1A. Right. And let's look at
00:09:27 --> 00:09:28 our risk. And of course, your audience will
00:09:28 --> 00:09:30 know this very well. Right. There's let's say
00:09:30 --> 00:09:33 we might have risk of what if there is a natural
00:09:33 --> 00:09:36 event that happens and we have a facility somewhere
00:09:36 --> 00:09:38 near a coast. OK, that kind of risk would be
00:09:38 --> 00:09:40 there. You might have the risk of currency exchange
00:09:40 --> 00:09:43 rates changing. But there will also be some risks
00:09:43 --> 00:09:45 that will either implicitly or explicitly be
00:09:45 --> 00:09:47 related to this digital value. So I always would
00:09:47 --> 00:09:49 start with if it's available, I would go to Section
00:09:49 --> 00:09:53 1A. What is the board, the very learned folks,
00:09:53 --> 00:09:55 what are they thinking about in terms of risk?
00:09:56 --> 00:09:59 Then I can put a lens on it and say, what's a
00:09:59 --> 00:10:02 more cybersecurity focused way of thinking about
00:10:02 --> 00:10:04 that? But I would go through each of the functions.
00:10:04 --> 00:10:07 I would talk with my CRO, my CMO, if there's
00:10:07 --> 00:10:10 a chief product officer and understand. What
00:10:10 --> 00:10:11 are the three or four things that they're really
00:10:11 --> 00:10:13 trying to do in terms of outcomes? Let's just
00:10:13 --> 00:10:16 take so customer experience is another one. So
00:10:16 --> 00:10:18 there's sort of five that I look at and I just
00:10:18 --> 00:10:19 sort of start with those on the board and we
00:10:19 --> 00:10:22 kind of usually pick a couple. Innovation, flexibility,
00:10:22 --> 00:10:26 speed to market, customer experience, scale.
00:10:26 --> 00:10:28 These are things that cybersecurity can help
00:10:28 --> 00:10:30 you with and it's going to play out very differently
00:10:30 --> 00:10:32 if you're a knowledge based company versus a
00:10:32 --> 00:10:34 manufacturing company. But there are absolutely
00:10:34 --> 00:10:37 things that you should be tying to. And when
00:10:37 --> 00:10:39 I'm stuck, when I try to ground myself, right,
00:10:39 --> 00:10:42 just based on experiences that I have. And I
00:10:42 --> 00:10:44 think about that in cybersecurity because that's
00:10:44 --> 00:10:46 a good sort of analogy of how you think about
00:10:46 --> 00:10:49 it. Right. So if I talk to you about things like
00:10:49 --> 00:10:52 customer experience and resiliency, customer
00:10:52 --> 00:10:56 retention, then now we're having a business conversation.
00:10:56 --> 00:10:58 Right. It's not about tech. The tech is behind
00:10:58 --> 00:11:00 the scenes in the same way that I went to, like,
00:11:00 --> 00:11:03 you know, a week-long course on Sarbanes-Oxley.
00:11:03 --> 00:11:05 And I never, ever really want to talk about it
00:11:05 --> 00:11:08 again. Okay, it's not my area, it's not my forte,
00:11:08 --> 00:11:11 but I've had great CFOs and finance officers
00:11:11 --> 00:11:13 whose forte that is, right? So in the same
00:11:13 --> 00:11:15 way that I don't necessarily want to get into
00:11:15 --> 00:11:18 the nitty gritty of that, I don't need to, right?
00:11:18 --> 00:11:20 As long as you can explain to me why it's important
00:11:20 --> 00:11:22 and what my role in that is and how I help you.
00:11:22 --> 00:11:24 It's the same thing. It's really incumbent
00:11:24 --> 00:11:27 on the CISOs, right, and the senior people in
00:11:27 --> 00:11:30 security to not drag you into the, if you want
00:11:30 --> 00:11:33 to, cool, we can vibe out about the tech or whatever,
00:11:33 --> 00:11:36 but I need to keep it up here, right? If I can't
00:11:36 --> 00:11:39 explain it to you, you know, like to your auntie,
00:11:39 --> 00:11:41 you know, or to teenagers, then I don't really
00:11:41 --> 00:11:43 understand it, right? I've got to be able to
00:11:43 --> 00:11:45 make it resonate with you. And I may have to
00:11:45 --> 00:11:47 try two or three different stories and find a
00:11:47 --> 00:11:51 way to help you understand. And it's the CISO's
00:11:51 --> 00:11:54 role to be a business executive first, who's
00:11:54 --> 00:11:57 got domain expertise in digital and cybersecurity,
00:11:58 --> 00:12:01 and I have to be able to help you deliver a business
00:12:01 --> 00:12:04 outcome. If I can't do that, then you should
00:12:04 --> 00:12:07 keep me at the kids table. I get it. I mean,
00:12:07 --> 00:12:09 it's still important, but you're like, yes. And
00:12:09 --> 00:12:12 then this tech nerdy guy Dutch shows up and he
00:12:12 --> 00:12:13 talks about stuff, and I don't really know what he's talking
00:12:13 --> 00:12:15 about. Like, that's not helpful. I've got to
00:12:15 --> 00:12:17 be able to. So on the CISO side of the table,
00:12:17 --> 00:12:19 when I'm coaching people, I say, look, if you
00:12:19 --> 00:12:21 can't explain, and it varies, right, based
00:12:21 --> 00:12:24 on the company, if you can't explain to me how what
00:12:24 --> 00:12:27 you do impacts net margin. Or you can't explain
00:12:27 --> 00:12:29 to me how this impacts your free cash flow, or
00:12:29 --> 00:12:31 return on working capital or go down the list,
00:12:32 --> 00:12:34 right, then you need to go talk to your finance
00:12:34 --> 00:12:38 people and have them help you. You probably do
00:12:38 --> 00:12:39 know it. You just don't know how to articulate
00:12:39 --> 00:12:42 that today. And you need to be able to do that.
00:12:42 --> 00:12:44 I need to be. And same thing, same thing for
00:12:44 --> 00:12:47 Chief Human Resource Officer, CRO. Yeah, I got
00:12:47 --> 00:12:49 to have that connection, right? And then the
00:12:49 --> 00:12:51 same thing upwards if you have a board, of course,
00:12:51 --> 00:12:54 right? Those are often former CEOs, CFOs, COOs,
00:12:55 --> 00:12:59 right? Certainly understand risk. Yes. And that's
00:12:59 --> 00:13:01 really, that's the starting point. Like I said,
00:13:02 --> 00:13:04 risk is the starting point. And then how do I
00:13:04 --> 00:13:07 help enable things so that customers, clients,
00:13:07 --> 00:13:10 partners, whomever that ecosystem is, that they
00:13:10 --> 00:13:12 have a heck of a great experience, right? That's
00:13:12 --> 00:13:14 part of my role to help you with that. Thank
00:13:14 --> 00:13:17 you. And speaking of risk, what would you say
00:13:17 --> 00:13:22 is one cybersecurity mistake you see big companies,
00:13:22 --> 00:13:25 small companies make over and over again? So
00:13:25 --> 00:13:30 it's probably the fact that to do this well,
00:13:31 --> 00:13:34 it takes a village. And what I mean is everyone
00:13:34 --> 00:13:38 has to be involved. So here's my analogy. If
00:13:38 --> 00:13:40 you look at the history of quality in the U.S.
00:13:40 --> 00:13:43 specifically, the history of quality before you
00:13:43 --> 00:13:46 had systems of systems. It was really your craftsmanship,
00:13:46 --> 00:13:49 your artisanship, right? So you make shoes and
00:13:49 --> 00:13:51 I do paintings, right? And I'm ours. You would
00:13:51 --> 00:13:53 just look at, well, how great, well, look at
00:13:53 --> 00:13:55 how great those are. But it's really your individual
00:13:55 --> 00:13:57 craftsmanship or craftspersonship, right? But
00:13:57 --> 00:13:59 when you start to have manufacturing and systems
00:13:59 --> 00:14:02 of systems, you have to have quality. And the
00:14:02 --> 00:14:04 quality revolution was very slow in America,
00:14:04 --> 00:14:07 quite frankly. And initially what we did is we
00:14:07 --> 00:14:10 created a quality control department. Well, if
00:14:10 --> 00:14:12 there's a quality control department, I grew
00:14:12 --> 00:14:14 up working in a factory. If there's a quality
00:14:14 --> 00:14:16 control department, there's a bias that happens
00:14:16 --> 00:14:18 unintentionally in people's brain, where they
00:14:18 --> 00:14:21 go, Oh, well, that's not my job. Wow. Right.
00:14:21 --> 00:14:24 That's Jesse's job. That's Sanjay's job. That's
00:14:24 --> 00:14:27 Tim's job. You know, it's not me. And I understand
00:14:27 --> 00:14:28 it as somebody who studies history, I understand
00:14:28 --> 00:14:31 how that happens. But eventually, I can tell
00:14:31 --> 00:14:33 you working in corporate America, we all had
00:14:33 --> 00:14:35 to go back to school and learn about total quality
00:14:35 --> 00:14:37 management or whatever acronym it was, wherever
00:14:37 --> 00:14:40 you grew up and whatever. But now when you talk
00:14:40 --> 00:14:42 about it, you don't even have to actually call out
00:14:42 --> 00:14:45 quality; everybody understands quality, how doing
00:14:45 --> 00:14:48 things in a quality process that's repeatable,
00:14:49 --> 00:14:50 you know, everybody does that. And so you can
00:14:50 --> 00:14:52 talk to somebody and doesn't matter if it's human
00:14:52 --> 00:14:54 resources or logistics or finance, it's baked
00:14:54 --> 00:14:57 in. It's the same sort of thing. If cybersecurity
00:14:57 --> 00:15:00 is only thought of as this group of two people,
00:15:01 --> 00:15:04 20 people, 200 people over here, then the implication
00:15:04 --> 00:15:06 is, oh, well, that's not my job. I don't have
00:15:06 --> 00:15:08 a piece of this. I don't have a role in it. I
00:15:08 --> 00:15:11 promise you, you do because that's where most
00:15:11 --> 00:15:14 of the risk lies. It's an unintended risk. Because,
00:15:15 --> 00:15:17 of course, the criminals, because in most cases,
00:15:17 --> 00:15:19 they're actually real criminals, right? The
00:15:19 --> 00:15:22 criminals know this, they understand human behavior
00:15:22 --> 00:15:25 very well. And so will they come after
00:15:25 --> 00:15:28 me directly? Maybe. But they're more likely to
00:15:28 --> 00:15:30 do social engineering and find somebody who's
00:15:30 --> 00:15:33 not aware, right? And they get their credentials.
00:15:33 --> 00:15:36 And then they get inside your system. And now
00:15:36 --> 00:15:38 we have a conversation about resilience, right?
00:15:38 --> 00:15:40 Like how resilient are we if this happens to
00:15:40 --> 00:15:45 us, right? So you need to build a culture of
00:15:45 --> 00:15:47 security, not a security culture. When we use
00:15:47 --> 00:15:48 the phrase that way, then everybody goes, oh,
00:15:48 --> 00:15:50 well, yeah, that's the culture of Dutch's team.
00:15:50 --> 00:15:52 That's true. But we need a culture of security,
00:15:53 --> 00:15:56 right? A culture of security does three things.
00:15:56 --> 00:15:58 Number one, it attracts talent, right? We have
00:15:58 --> 00:15:59 a lot of evidence that that's the case, right?
00:15:59 --> 00:16:02 If you have a great culture. Right. And that
00:16:02 --> 00:16:04 includes being safe and secure. That attracts
00:16:04 --> 00:16:06 talent. The board level, it also retains talent.
00:16:07 --> 00:16:09 Right. So that's really important. Number three,
00:16:09 --> 00:16:13 we know that companies with great culture period
00:16:13 --> 00:16:16 outperform their peer group. There's a tremendous
00:16:16 --> 00:16:18 amount of evidence of that. Right. And you can
00:16:18 --> 00:16:20 look at any multiple metrics on that. And then
00:16:20 --> 00:16:23 the fourth thing is companies with poor culture
00:16:23 --> 00:16:26 are three times more likely to have a breach.
00:16:27 --> 00:16:31 Wow. Not poor security. Poor culture. Because
00:16:31 --> 00:16:33 think about this, pretend let's just do a thought
00:16:33 --> 00:16:36 exercise. I'm an unhappy, disengaged employee
00:16:36 --> 00:16:38 for whatever reason, maybe it's me, right? But
00:16:38 --> 00:16:40 for whatever reason, I'm unhappy, I'm not engaged,
00:16:41 --> 00:16:43 I'm not really dialed into our mission, whatever
00:16:43 --> 00:16:46 it is that we do. It's much more likely then
00:16:46 --> 00:16:48 that I'm not going to follow any of the process
00:16:48 --> 00:16:50 or procedure. I'm not going to log out of all
00:16:50 --> 00:16:52 the systems, I'm not going to make sure I change
00:16:52 --> 00:16:54 all these, I'm just not going to be engaged,
00:16:54 --> 00:16:56 right? I might just hurry, I might just rush
00:16:56 --> 00:16:59 through things. And I'd be like, I know the corporate
00:16:59 --> 00:17:01 standard is I'm supposed to use this software.
00:17:01 --> 00:17:03 But I'm just going to email this out, you know,
00:17:03 --> 00:17:05 to the other person, even though I've been told
00:17:05 --> 00:17:07 not to do that. So when you have a poor culture,
00:17:08 --> 00:17:10 this comes from a book called Well Aware by George
00:17:10 --> 00:17:12 Finney. If you have poor culture, you're three
00:17:12 --> 00:17:14 times more likely to have a breach. So that's
00:17:14 --> 00:17:16 really a way to bring that home.
00:17:16 --> 00:17:19 So it really starts with leadership, culture
00:17:19 --> 00:17:21 and strategy. Right. And again, those are not
00:17:21 --> 00:17:23 technical. But those three things are what helps
00:17:23 --> 00:17:27 you unlock the technology so that together we
00:17:27 --> 00:17:28 have the greatest chance of winning, whatever
00:17:28 --> 00:17:31 that means, whether that's serving our constituents
00:17:31 --> 00:17:34 or for charity or for public enterprise, it doesn't
00:17:34 --> 00:17:36 matter. But whatever it is that we would define
00:17:36 --> 00:17:38 as, hey, we're high-fiving, you know, at the
00:17:38 --> 00:17:40 end of the year, that's what you have to focus
00:17:40 --> 00:17:42 on. So it starts with culture. And that's the
00:17:42 --> 00:17:44 biggest miss. It's unintended, of course, but
00:17:44 --> 00:17:47 that's the biggest miss. Small, large, doesn't
00:17:47 --> 00:17:49 matter. That's the biggest miss. Wow. That's
00:17:49 --> 00:17:52 very insightful. And it totally makes sense that
00:17:52 --> 00:17:54 if I'm not engaged, I'm not going to follow
00:17:54 --> 00:17:58 protocol and then, you know, be an open door
00:17:58 --> 00:18:00 to the criminals. Now, especially with AI,
00:18:00 --> 00:18:03 more companies are afraid of AI attacks because
00:18:03 --> 00:18:06 they look so slick. Like, you can make
00:18:06 --> 00:18:10 a fake receipt and it can go through the software.
00:18:10 --> 00:18:13 Do you think companies should be more
00:18:13 --> 00:18:15 afraid of AI attacks than they currently are
00:18:15 --> 00:18:18 or are they not? Or, you know, do we think that
00:18:18 --> 00:18:21 maybe the risk is not as big? The short answer
00:18:21 --> 00:18:24 is that AI-influenced attacks, certainly to your
00:18:24 --> 00:18:26 point, are very slick and dangerous. So what
00:18:26 --> 00:18:30 do we see? We see a couple of things. One, what
00:18:30 --> 00:18:34 feels like to us a very human approach is very
00:18:34 --> 00:18:36 doable today, right? To your point, right? And
00:18:36 --> 00:18:39 so there's been really tragic examples of, and
00:18:39 --> 00:18:41 I don't want to call out any company, but where
00:18:41 --> 00:18:45 somebody received an AI-simulated voice memo.
00:18:45 --> 00:18:48 or an email or a video or what have you. Right.
00:18:48 --> 00:18:49 And so those have already taken place. Right.
00:18:50 --> 00:18:52 So you do need to be aware of those. So what
00:18:52 --> 00:18:54 can you do about that? Right. That's the first
00:18:54 --> 00:18:57 question. The first my first answer is, well,
00:18:57 --> 00:18:59 so it starts with what do I care about the most?
00:18:59 --> 00:19:02 Right. You've got to start. It's an inside-out conversation.
00:19:03 --> 00:19:06 What's the data? What's the process?
00:19:06 --> 00:19:07 What's the information? Do we have intellectual
00:19:07 --> 00:19:11 property? Is it client or customer facing? You
00:19:11 --> 00:19:13 start with that. Right. Because that's the easiest
00:19:13 --> 00:19:15 way. Look, what are the things that, if something
00:19:15 --> 00:19:18 were to go wrong with them, would cause a
00:19:18 --> 00:19:21 really bad day. And sometimes it's not intuitive.
00:19:22 --> 00:19:23 So you really need to say, what do we really,
00:19:23 --> 00:19:26 really care about? And then you say to yourself,
00:19:26 --> 00:19:30 what needs to go right every time for that to
00:19:30 --> 00:19:33 work successfully? OK, cool. How do we make sure
00:19:33 --> 00:19:35 to the highest degree we can that we make it
00:19:35 --> 00:19:38 easy to do the right thing and really, really
00:19:38 --> 00:19:40 hard to do the wrong thing, right? And you reinforce
00:19:40 --> 00:19:43 that. So you should be concerned. I think there's
00:19:43 --> 00:19:46 a healthy, that's the long answer. The short
00:19:46 --> 00:19:48 answer is you should have a healthy amount of
00:19:48 --> 00:19:50 concern. The way that you would protect data
00:19:50 --> 00:19:53 and information doesn't change. This is just
00:19:53 --> 00:19:55 another way that this plays out, right? The same
00:19:55 --> 00:19:59 way that people have been robbing banks or playing
00:19:59 --> 00:20:01 three-card monte and trying to misdirect you,
00:20:01 --> 00:20:03 those haven't changed because they're really
00:20:03 --> 00:20:05 all triggered on human behavior. Right. True.
00:20:06 --> 00:20:08 And you mentioned the voiceover and I don't want
00:20:08 --> 00:20:10 to name any company, but can you provide other
00:20:10 --> 00:20:14 examples of deep fakes and synthetic data? First,
00:20:14 --> 00:20:16 what's the difference between deep fakes and
00:20:16 --> 00:20:19 synthetic data? But what are some of the examples
00:20:19 --> 00:20:21 that you've seen that happened at companies?
00:20:22 --> 00:20:24 Yeah. So I'll take them in reverse for us. So
00:20:24 --> 00:20:27 synthetic data, synthetic data is legitimately
00:20:27 --> 00:20:31 used. by data scientists and people in AI research
00:20:31 --> 00:20:33 and other fields that are related. But synthetic
00:20:33 --> 00:20:36 data is if you and I are building, let's say
00:20:36 --> 00:20:38 we're building a customer-facing chat, an application
00:20:38 --> 00:20:43 of some sort, rather than use actual sensitive
00:20:43 --> 00:20:45 client data or customer data or maybe patient
00:20:45 --> 00:20:48 data in healthcare, we want to create synthetic
00:20:48 --> 00:20:52 data. We would use that as we build our large
00:20:52 --> 00:20:56 language model or AI model. There's more than
00:20:56 --> 00:20:57 one kind, by the way, but we'll just call them large
00:20:57 --> 00:20:59 language models, as that's what everybody started
00:20:59 --> 00:21:01 from. From there, as we're building that model, we
00:21:01 --> 00:21:04 go through epochs or iterations, right? So that's
00:21:04 --> 00:21:07 the way that you train models. And so we'd want
00:21:07 --> 00:21:09 we need a lot of data to do that successfully.
00:21:10 --> 00:21:13 So synthetic data in and of itself isn't bad, right?
00:21:13 --> 00:21:16 It has legitimate uses. But where you can get really
00:21:16 --> 00:21:21 bad outcomes is if you can alter my synthetic
00:21:21 --> 00:21:23 data in a way that I'm not aware of or I don't
00:21:23 --> 00:21:25 realize. And then what it's going to do is it's
00:21:25 --> 00:21:28 going to create outputs when I'm doing that prompt,
00:21:28 --> 00:21:31 let's say it's a chatbot, that I wasn't anticipating,
00:21:31 --> 00:21:33 right? And so you really have to look at that.
00:21:33 --> 00:21:35 So a lot of that's around what we would call
00:21:35 --> 00:21:38 data provenance, right? Where did the data come
00:21:38 --> 00:21:40 from? And can I show, sort of think of like a
00:21:40 --> 00:21:43 crime show on TV, right? What's the chain of
00:21:43 --> 00:21:45 custody, right? So the provenance, where did
00:21:45 --> 00:21:48 this data come from? And can I prove that that's
00:21:48 --> 00:21:50 actually where that data came from? And can I
00:21:50 --> 00:21:52 prove that it has not been altered, right? So
00:21:52 --> 00:21:55 that's pretty much in the data scientists and
00:21:55 --> 00:21:59 security realm to look at that. So how does it
00:21:59 --> 00:22:02 play out kind of in a more typical example? It's
00:22:02 --> 00:22:05 very easy to have unintended bias then creep
00:22:05 --> 00:22:09 in. If I can change the data, if I can poison
00:22:09 --> 00:22:11 the model, you might hear data poisoning or model
00:22:11 --> 00:22:14 poisoning, then I can trigger an outcome that
00:22:14 --> 00:22:16 you don't want, right? So you have to look
00:22:16 --> 00:22:18 at that. So this is really more around sort
00:22:18 --> 00:22:20 of technical controls, right? And you want to
00:22:20 --> 00:22:22 have a human in the loop, right? So you want
00:22:22 --> 00:22:24 to have a human make that last decision. So that's
00:22:24 --> 00:22:26 all about synthetic data. But what could the outcome
00:22:26 --> 00:22:30 be? I could skew a radiology model so that it
00:22:30 --> 00:22:32 gives wrong information. I don't know what my
00:22:32 --> 00:22:33 intention would be there, but I could do that,
00:22:33 --> 00:22:36 right? More realistically, I could skew
00:22:36 --> 00:22:41 models so that I'm unfairly doing financing
00:22:41 --> 00:22:44 for a customer set, right? So there's a lot of
00:22:44 --> 00:22:46 ways that synthetic data could be messed with.
00:22:46 --> 00:22:48 So you need to look at, we're back to, how do I
00:22:48 --> 00:22:51 secure my model, right? Or if I'm using someone
00:22:51 --> 00:22:53 else's model and then tuning it, I want to ask some
00:22:53 --> 00:22:55 really good questions about how did you guys
00:22:55 --> 00:22:58 build this model? And do I have visibility right
00:22:58 --> 00:23:01 into how you built that, and what are your guardrails,
00:23:01 --> 00:23:03 right, that you put around that so that we don't
00:23:03 --> 00:23:05 have explicit bias, that's kind of obvious, but
00:23:05 --> 00:23:08 also, again, implicit bias, right? We've all seen
00:23:08 --> 00:23:11 this play out: gender bias, racial bias. There's
00:23:11 --> 00:23:15 a lot of ways that it's maybe unintended, but
00:23:15 --> 00:23:18 doesn't matter, it's still a bias, right? So if
00:23:18 --> 00:23:21 I feed you only resumes for a role that historically
00:23:21 --> 00:23:26 was mostly gender-normative male, and I give you
00:23:26 --> 00:23:29 those, what a model is going to do is go, oh,
00:23:29 --> 00:23:32 well, then most of these names are the names
00:23:32 --> 00:23:35 that I see. And it's going to make an association.
00:23:35 --> 00:23:37 Wow. Maybe you didn't even intend to, right?
00:23:37 --> 00:23:40 Because that's what models do, right? Think about,
00:23:40 --> 00:23:42 let's say, the algorithm for one of your favorite
00:23:42 --> 00:23:45 streaming platforms, right? What it does there
00:23:45 --> 00:23:48 is it makes some inferences. It may not be obvious
00:23:48 --> 00:23:50 to you. So you just click it on. Like the other
00:23:50 --> 00:23:52 day, we were changing. And my youngest daughter
00:23:52 --> 00:23:54 was like, no, no, don't do this on my profile.
00:23:54 --> 00:23:56 it's gonna it's gonna screw up my algorithm,
00:23:56 --> 00:23:59 right? Which is hilarious. But I mean, at 1415,
00:23:59 --> 00:24:01 she's about to turn 15. She already knows this.
00:24:01 --> 00:24:03 She's like, No, no, don't watch that sports show
00:24:03 --> 00:24:06 on my algorithm. Right? Well, because the algorithm
00:24:06 --> 00:24:09 makes some subtle inferences. And for whatever
00:24:09 --> 00:24:11 reason, it goes, I know that people who like
00:24:11 --> 00:24:14 this show, and this show and watch TV at this
00:24:14 --> 00:24:16 time of day, I'm going to present them these
00:24:16 --> 00:24:19 five other shows. And it can't even explain to
00:24:19 --> 00:24:22 you why it's presenting it. It just knows based
00:24:22 --> 00:24:25 on a lot of data. that it's likely that you're
00:24:25 --> 00:24:28 going to like this other show. And so the model makes
00:24:28 --> 00:24:30 inferences, and that's what happens with unintended
00:24:30 --> 00:24:34 bias, is it makes an inference. It tries to say
00:24:34 --> 00:24:36 that there's a correlation or causality to these
00:24:36 --> 00:24:39 two things, a name, as an example, that shouldn't
00:24:39 --> 00:24:42 exist. It shouldn't matter, right? My name shouldn't
00:24:42 --> 00:24:43 even be on there, right, when it's looking at
00:24:43 --> 00:24:45 that resume. So that's how it can play out. And
00:24:45 --> 00:24:48 in terms of the deep fakes... Yeah, the ones
00:24:48 --> 00:24:51 that are really obvious are people have transferred
00:24:51 --> 00:24:53 money and not insignificant, like $25 million
00:24:53 --> 00:24:56 in one instance where they transferred money.
00:24:57 --> 00:25:00 Yeah, they transferred money because they believed,
00:25:00 --> 00:25:03 but fairly so. They believed it was their, I
00:25:03 --> 00:25:04 don't remember if it was the CEO or the CFO.
00:25:04 --> 00:25:07 I can't remember who they mimicked.
00:25:07 --> 00:25:09 And think about it from a voice standpoint or
00:25:09 --> 00:25:12 an email standpoint. It's fairly easy to do that.
00:25:13 --> 00:25:15 The criminals have... the tools to be able to do that.
00:25:15 --> 00:25:17 Remember what the difference is? Remember back,
00:25:17 --> 00:25:19 you know, like, five years ago, you would get
00:25:19 --> 00:25:21 an email, it'd be really obvious, like, wow,
00:25:21 --> 00:25:24 okay, grammatically, it's just not quite right.
00:25:24 --> 00:25:26 And maybe there's a misspelling, it was sort
00:25:26 --> 00:25:28 of obvious, right, that it wasn't really true.
00:25:28 --> 00:25:32 But now, one of the things that large language
00:25:32 --> 00:25:34 models do really well is language, right? So
00:25:34 --> 00:25:37 they're trained on natural language processing,
00:25:37 --> 00:25:39 NLP is what an AI person would call that. So
00:25:39 --> 00:25:42 they're very good at that. That's why it feels
00:25:42 --> 00:25:44 like we're talking to a human. We're not. It's
00:25:44 --> 00:25:47 not actually thinking, but it presents itself
00:25:47 --> 00:25:50 in a way. So now what a criminal can do is create
00:25:50 --> 00:25:52 a very smooth or slick, as you said before, email
00:25:52 --> 00:25:56 and also change it really fast, right? You're
00:25:56 --> 00:25:58 doing it at automation speed. You're not doing
00:25:58 --> 00:26:00 it human speed anymore, right? If you think about
00:26:00 --> 00:26:03 if you prompt them, do you use AI like in your
00:26:03 --> 00:26:05 day to day? But yeah, think about how fast you
00:26:05 --> 00:26:09 can prompt and reprompt and do. It's that fast,
00:26:09 --> 00:26:12 right? So you can create an agent, right, to
00:26:12 --> 00:26:14 do that prompting. And now I'm just
00:26:14 --> 00:26:16 walking over here watching, you know,
00:26:16 --> 00:26:18 my favorite show, because now the
00:26:18 --> 00:26:20 model is doing it, right? And so think about how
00:26:20 --> 00:26:22 quickly that is; that speed of change
00:26:22 --> 00:26:26 is very fast. So on the other side, we have to
00:26:26 --> 00:26:29 have automation to combat that. Yeah, I'm still
00:26:29 --> 00:26:32 stuck on the voiceover. Like, how does it work?
00:26:32 --> 00:26:34 Like, how do you get caught into this? There's
00:26:34 --> 00:26:38 a fun example of recently where a gentleman named
00:26:38 --> 00:26:40 Perry Carpenter, who's renowned in the cyber
00:26:40 --> 00:26:42 security industry, I believe he did it with a
00:26:42 --> 00:26:44 reporter, from one of the news channels,
00:26:45 --> 00:26:48 let's just say. Right. And all it took is a handful
00:26:48 --> 00:26:50 of clips of her because, of course, she's a public
00:26:50 --> 00:26:53 figure and he has that. And he was able to create
00:26:53 --> 00:26:57 a very, very, very slick video of her saying things
00:26:57 --> 00:26:59 that she had never said, because the whole point
00:26:59 --> 00:27:01 of natural language processing is I can break
00:27:02 --> 00:27:05 words into smaller pieces, okay, into tokens.
00:27:05 --> 00:27:07 And I can reassemble those in a way that is so
00:27:07 --> 00:27:11 smooth, that to the average person, even to yourself,
00:27:11 --> 00:27:14 it's eerie. If you do it to yourself, it's very,
00:27:14 --> 00:27:16 very hard to tell. So what do you do? Well, like, let
00:27:16 --> 00:27:18 me bring this home kind of at a personal level,
00:27:18 --> 00:27:20 right? So you have to bring it home to people
00:27:20 --> 00:27:22 in a way that they can connect with, right? Tell
00:27:22 --> 00:27:25 them a story that they understand, right? And
00:27:25 --> 00:27:28 again, there's a piece of this of I have a role
00:27:28 --> 00:27:31 explicitly, right, to secure things, but you
00:27:31 --> 00:27:33 also have a role in that. Like if we think about
00:27:33 --> 00:27:35 our neighborhood, right? Should I lock my door?
00:27:35 --> 00:27:37 But let's just say I have a security, you know,
00:27:37 --> 00:27:41 I'm in a gated community and then something happened.
00:27:41 --> 00:27:43 Well, the gated community has a role, right?
00:27:43 --> 00:27:45 Because I'm paying, right? Explicitly paying
00:27:45 --> 00:27:47 for a security team. But what if I didn't lock
00:27:47 --> 00:27:50 my door? Okay, that's my part too, right? So
00:27:50 --> 00:27:54 there's security, and securing, and secure behavior.
00:27:55 --> 00:27:57 Everybody has a role in that, right? And if
00:27:57 --> 00:28:00 we don't kind of coach it and teach it and
00:28:00 --> 00:28:03 emphasize it that way, then it's hard for people.
00:28:03 --> 00:28:05 Because people are like, I can't... That
00:28:05 --> 00:28:06 was the great moment, by the way; that's how
00:28:06 --> 00:28:09 I can convince everybody. I'm like, right, I
00:28:09 --> 00:28:11 know you can't remember that. Please, I'm begging
00:28:11 --> 00:28:14 you, use a password manager. And here's why,
00:28:14 --> 00:28:16 right? That's a great example. Did we solve it?
00:28:16 --> 00:28:19 We did, but it was a pain to do all that
00:28:19 --> 00:28:22 and undo that and it's all that lost time from,
00:28:22 --> 00:28:23 you know, and the stress and oh my gosh, I'm
00:28:23 --> 00:28:25 gonna get, you know, and generally today, companies
00:28:25 --> 00:28:28 are very good ultimately about getting your money
00:28:28 --> 00:28:30 back to you, but in the meantime, and sometimes
00:28:30 --> 00:28:32 that might go unnoticed for a long time, right?
00:28:32 --> 00:28:36 People might, quote, steal your tax return, right,
00:28:36 --> 00:28:39 before you even file it. Again, they think in
00:28:39 --> 00:28:42 longer terms of time than we might in our day
00:28:42 --> 00:28:45 to day, right? They might steal your child's
00:28:45 --> 00:28:47 social security number here in the U.S. when
00:28:47 --> 00:28:50 they're 13. And they'll just wait. They'll just
00:28:50 --> 00:28:54 wait, right? And they'll just sell that, right?
00:28:54 --> 00:28:56 With a whole bunch of other social security numbers.
00:28:56 --> 00:28:58 And then when that person turns 18,
00:28:58 --> 00:29:01 they'll go open credit cards in their name. Wow.
00:29:01 --> 00:29:04 They'll play a very long game, right? Because
00:29:04 --> 00:29:07 it's organized crime. It's not somebody on the
00:29:07 --> 00:29:11 corner hustling you. It's not that movie that
00:29:11 --> 00:29:13 you saw. I mean, those are funny. And those do
00:29:13 --> 00:29:15 happen, by the way. I mean, that also happens.
00:29:15 --> 00:29:18 But I mean, if I look at the preponderance of
00:29:18 --> 00:29:21 breaches that I can identify, the majority, let's
00:29:21 --> 00:29:25 just call it 55% or 60%, are clearly the criminal
00:29:25 --> 00:29:29 activity. So this is a nation state, or I'll
00:29:29 --> 00:29:31 call it a criminal syndicate or a criminal actor.
00:29:32 --> 00:29:35 This is their job. They do this the same way
00:29:35 --> 00:29:37 that we do and candidly they collaborate very
00:29:37 --> 00:29:40 well. You're an expert in one kind of ransomware
00:29:40 --> 00:29:43 and I'm an expert in something else. We have
00:29:43 --> 00:29:46 a trading network that goes on and I'll trade
00:29:46 --> 00:29:51 your expertise for mine. So I can go hire ransomware
00:29:51 --> 00:29:55 as a service. By the hour. That's a real thing.
00:29:56 --> 00:29:58 So think about that. I mean, that is how, that
00:29:58 --> 00:30:00 is how effective it is, right? So you have to
00:30:00 --> 00:30:02 think about it differently. It's not, they weren't
00:30:02 --> 00:30:05 targeting my aunt specifically. They're targeting
00:30:05 --> 00:30:08 everybody. They're just trying to get information
00:30:08 --> 00:30:10 and then figure out what's the information that's
00:30:10 --> 00:30:12 most valuable and how did I do something? And
00:30:12 --> 00:30:14 that plays out the exact same way in the corporate
00:30:14 --> 00:30:16 world. Is there corporate espionage? Sure, there
00:30:16 --> 00:30:17 always has been, right? So there's some that
00:30:17 --> 00:30:19 are obvious, right? If you're in an environment,
00:30:20 --> 00:30:21 but there's the more subtle thing where
00:30:21 --> 00:30:24 they're just always trying to steal information
00:30:24 --> 00:30:26 and data. You have to presume that's what they're
00:30:26 --> 00:30:28 doing, right? So then you play that out. Well,
00:30:28 --> 00:30:31 how would that impact me? How would that impact
00:30:31 --> 00:30:34 our reputation? Our brand? Yeah, operational
00:30:34 --> 00:30:37 stuff is usually really obvious. Like, we're
00:30:37 --> 00:30:39 a manufacturing company, and we can't have the
00:30:39 --> 00:30:40 production line stop. Okay, that's pretty
00:30:40 --> 00:30:43 obvious, right? That's bad. But the more subtle
00:30:43 --> 00:30:46 things are the damage to your brand, the damage
00:30:46 --> 00:30:49 to your reputation. Do you recover? Probably
00:30:49 --> 00:30:52 if you have a really strong brand, but nevertheless,
00:30:52 --> 00:30:54 in the meantime... Because an argument is like,
00:30:54 --> 00:30:57 well, yeah, but pick a public breach that we
00:30:57 --> 00:30:59 all know about. And then you look X number of
00:30:59 --> 00:31:01 months or quarters later, like, see, their stock
00:31:01 --> 00:31:04 price came back. That's true. But what would
00:31:04 --> 00:31:07 it have been? What would
00:31:07 --> 00:31:10 it have been if it didn't take a 14% dip or
00:31:10 --> 00:31:13 20%, which is often the case right after
00:31:13 --> 00:31:15 an announcement, right? And again, the subtlety
00:31:15 --> 00:31:18 of, I lost trust with you, if I'm a customer or
00:31:18 --> 00:31:22 client. Now we're back to churn, right? It's real
00:31:22 --> 00:31:24 hard to put your finger on that, but if your churn
00:31:24 --> 00:31:28 rate changes, then the correlation is likely that
00:31:28 --> 00:31:29 that event had some impact, because there's some
00:31:29 --> 00:31:32 people like, I don't feel comfortable with the
00:31:32 --> 00:31:35 way, you know, XYZ company handled my information,
00:31:36 --> 00:31:38 right? Yes, and they vote with their feet, they're
00:31:38 --> 00:31:40 gonna go somewhere else, right? And you may just
00:31:40 --> 00:31:42 attribute that to normal churn, but maybe not.
00:31:42 --> 00:31:45 Maybe it's not normal churn. Maybe it's, I'm actually
00:31:45 --> 00:31:47 concerned. I don't know if I trust you. I don't
00:31:47 --> 00:31:49 know if you have the level of trust that I want
00:31:49 --> 00:31:52 you to have with my information, about my child's
00:31:52 --> 00:31:55 health care information, as an example. And so
00:31:55 --> 00:31:57 when we look in the corporate world, you know,
00:31:58 --> 00:32:00 people always go to accounting and finance. What
00:32:00 --> 00:32:04 would you say would be one thing you wish CFOs
00:32:04 --> 00:32:08 knew about cybersecurity, or
00:32:08 --> 00:32:11 did better working with CISOs, especially in
00:32:11 --> 00:32:13 small companies, where they're usually not working
00:32:13 --> 00:32:17 there. But what should any CFO at any size company
00:32:17 --> 00:32:20 know about cybersecurity? And again, if there's
00:32:20 --> 00:32:23 not a CISO, whomever that most senior person
00:32:23 --> 00:32:25 is, right, who does technology and operations,
00:32:25 --> 00:32:26 depending on how it's set up in your company,
00:32:27 --> 00:32:30 where we could benefit from the experience and
00:32:30 --> 00:32:33 knowledge that CFOs in your community have is
00:32:33 --> 00:32:36 how do we quantify the risk, right? There's a
00:32:36 --> 00:32:37 lot of discussion in the CISO community, and
00:32:37 --> 00:32:39 there has been for a number of years. And I think
00:32:39 --> 00:32:42 we would benefit from more discussions around
00:32:42 --> 00:32:45 What's an acceptable amount of risk? Right. And
00:32:45 --> 00:32:48 so when I think about it, I think about risk
00:32:48 --> 00:32:51 appetite and risk tolerance. OK. So let's say
00:32:51 --> 00:32:54 we're in a public company for a second. My expectation
00:32:54 --> 00:32:58 would be that the board sets the risk appetite.
00:32:58 --> 00:33:01 In other words, how much risk are we knowingly
00:33:01 --> 00:33:03 OK taking? Because we might go through periods
00:33:03 --> 00:33:05 where we want to. We're OK accepting more risk.
00:33:05 --> 00:33:08 Maybe we're going through a heavy period intentionally
00:33:08 --> 00:33:10 through M &A. And so we know that while we're
00:33:10 --> 00:33:12 doing that, or maybe we're opening up in a new
00:33:12 --> 00:33:15 geography or you pick a risk. Right. But there's
00:33:15 --> 00:33:17 valid reasons, business reasons. But I need some
00:33:17 --> 00:33:20 guidance. Right. What is our appetite? And if
00:33:20 --> 00:33:23 it changes, please tell me. Help us quantify
00:33:23 --> 00:33:27 what's the acceptable amount. And then let's
00:33:27 --> 00:33:29 talk through. OK. And then how much then are
00:33:29 --> 00:33:32 we going to use cyber insurance for? And then
00:33:32 --> 00:33:35 how much are we going to essentially self insure?
00:33:35 --> 00:33:37 Right. And we're going to accept that risk. right?
00:33:38 --> 00:33:40 Because CISOs want to know that, right? But in
00:33:40 --> 00:33:43 some cases, there hasn't been a lot of history
00:33:43 --> 00:33:46 of that conversation happening. But there's five
00:33:46 --> 00:33:48 things there. So as somebody who looks at risk
00:33:48 --> 00:33:51 from a digital standpoint, I want to understand
00:33:51 --> 00:33:53 which of those kinds of risk have the most impact.
00:33:53 --> 00:33:56 Help me help you, like Jerry Maguire, right?
00:33:56 --> 00:33:59 Except instead of help me help you, it's help us, right? So bring
00:33:59 --> 00:34:01 me into the conversation. And please bring us
00:34:01 --> 00:34:05 in soon. I can't tell you how many times a CISO
00:34:05 --> 00:34:08 or you know, a technology or security person
00:34:08 --> 00:34:12 found out about an acquisition after it had happened.
00:34:13 --> 00:34:15 Oh, yeah. Same thing with accounting. I feel
00:34:15 --> 00:34:17 like some positions... See, look, we're twinning,
00:34:17 --> 00:34:20 right? With the last year... Yeah, and we really
00:34:20 --> 00:34:22 do. And even if we, like, as soon as we can,
00:34:22 --> 00:34:24 I obviously understand there's a measured process
00:34:24 --> 00:34:26 we have to go through, but as soon as you can,
00:34:27 --> 00:34:28 please bring us... So I would just say bring
00:34:28 --> 00:34:32 us to the table, extend us some... you know,
00:34:32 --> 00:34:34 some trust, you know, initially, but we need
00:34:34 --> 00:34:37 to be at the table, right? Because, and again,
00:34:37 --> 00:34:40 some of this is on our side as well, right? Because
00:34:40 --> 00:34:44 we were sort of viewed as technologists solely
00:34:44 --> 00:34:46 or first and foremost, we weren't in the business
00:34:46 --> 00:34:49 conversations. So when you're doing big moves,
00:34:50 --> 00:34:52 we're going to go into new geography, we're building
00:34:52 --> 00:34:54 a new factory, we're coming up with a new pharmaceutical
00:34:54 --> 00:34:57 offering, we're coming up with a new service,
00:34:57 --> 00:35:00 whatever that is. Whoever is the
00:35:00 --> 00:35:02 senior person for you, they should be in that
00:35:02 --> 00:35:05 conversation. Because they might say, oh, well,
00:35:05 --> 00:35:07 wait a minute. Well, where are we hosting the
00:35:07 --> 00:35:11 data for that new software platform? Because
00:35:11 --> 00:35:15 those customers are in Europe. So this applies
00:35:15 --> 00:35:16 differently. And the rest of the team might not
00:35:16 --> 00:35:18 know that. How do I help you move fast? I want
00:35:18 --> 00:35:22 you to move fast, but safe. Yes. Right? How do
00:35:22 --> 00:35:25 I help you do that? And you get it. So bring
00:35:25 --> 00:35:28 us in and help us quantify risk. Help us build
00:35:28 --> 00:35:31 the story around how we talk to them. Maybe you
00:35:31 --> 00:35:33 and I have got, okay, cool. Hey, how do we go
00:35:33 --> 00:35:35 talk to the CRO about this? How do we talk with
00:35:35 --> 00:35:37 the CMO and make sure? Because again, my job
00:35:37 --> 00:35:41 is to enable you to move quickly, to delight
00:35:41 --> 00:35:44 our customers, but to do it safely. How safely?
00:35:45 --> 00:35:48 That's a business choice. And that's okay, right?
00:35:48 --> 00:35:49 And that's the thing, you get it. You have to help
00:35:49 --> 00:35:51 me understand like, hey, how much are we okay
00:35:51 --> 00:35:54 taking, you know, in terms of risk? I had a great
00:35:54 --> 00:35:57 CFO at one point. I was a young business leader.
00:35:57 --> 00:36:01 And I was debating making an acquisition within,
00:36:01 --> 00:36:04 you know, within my realm. And I was sort
00:36:04 --> 00:36:06 of angsty, and like, I ran the numbers, and
00:36:06 --> 00:36:08 my CFO came down and said, Listen, I need you
00:36:08 --> 00:36:11 to be able to take risk. And I was like,
00:36:11 --> 00:36:13 wait, wait, what? And he was like, no, I need
00:36:13 --> 00:36:15 you to be able to take risk. And the company I worked
00:36:15 --> 00:36:17 at was a very old company, so they'd
00:36:17 --> 00:36:18 been doing this a long time, right? He goes,
00:36:18 --> 00:36:20 No, no, no. We're gonna talk, you know, we've
00:36:20 --> 00:36:22 got metrics on loss and all these other things.
00:36:22 --> 00:36:24 And we talked about that for a while. I need
00:36:24 --> 00:36:26 to, I need you to, you know, like, emotionally
00:36:26 --> 00:36:29 understand. I want, I need you to be able to
00:36:29 --> 00:36:32 take risk. If you don't take any risks, we won't
00:36:32 --> 00:36:35 grow. Yes. I just need you to, we need to have
00:36:35 --> 00:36:36 a conversation about it. And, trust me, we
00:36:36 --> 00:36:38 ran the models, you know, red, orange. And
00:36:38 --> 00:36:40 after we ran the models, he said,
00:36:40 --> 00:36:42 at the end of the day,
00:36:43 --> 00:36:45 you're going to look at me and go, let's do this
00:36:45 --> 00:36:47 one. It's going to be a judgment call. We know
00:36:47 --> 00:36:50 what the model says, right? And that's okay.
00:36:50 --> 00:36:52 And he goes, and occasionally we'll be wrong.
00:36:52 --> 00:36:54 And it can't be wrong more than x percent of
00:36:54 --> 00:36:56 the time each year. But I mean, that helped
00:36:56 --> 00:36:58 me a lot. He's like, No, we need to take risk.
00:36:58 --> 00:37:01 If we don't take risk, we don't grow. And so
00:37:01 --> 00:37:03 we need help with the language of that, and
00:37:03 --> 00:37:06 how to quantify it, because in the
00:37:06 --> 00:37:08 world of cybersecurity, risk is just, like, it's
00:37:08 --> 00:37:10 RISK in big giant capital letters, like, oh my
00:37:10 --> 00:37:13 god, like you're always crime-fighting risk.
00:37:14 --> 00:37:16 And so there's a visceral reaction to anybody
00:37:16 --> 00:37:19 who's in technology or cybersecurity around risk,
00:37:19 --> 00:37:21 right? And we need to understand that you have
00:37:21 --> 00:37:25 to take risk. Yes. Or you can't grow. I just
00:37:25 --> 00:37:27 need your help in saying, how much is acceptable?
00:37:28 --> 00:37:30 What kinds of risks are acceptable? When do I
00:37:30 --> 00:37:33 need to pull you in? You know, if I'm concerned
00:37:33 --> 00:37:35 about something, I'm getting a signal that I'm
00:37:35 --> 00:37:36 not sure that we're going in the right direction.
00:37:37 --> 00:37:39 Help me, right? Do that. Because then together,
00:37:39 --> 00:37:42 we can help have that conversation with the CRO,
00:37:42 --> 00:37:44 the CMO, the product team that, you know, whomever
00:37:44 --> 00:37:46 that is. Does that, does that resonate with you?
00:37:46 --> 00:37:49 Oh yeah, absolutely. Because sometimes, and I
00:37:49 --> 00:37:51 feel the same, especially because I came from
00:37:51 --> 00:37:53 an accounting background where people go make
00:37:53 --> 00:37:55 all kinds of transactions, and when they need payment,
00:37:55 --> 00:37:57 they'll say, oh, accounting, we need this. And
00:37:57 --> 00:38:02 we're like, where are you? Like literally, where
00:38:02 --> 00:38:05 are you? Yeah. Last question, what do you like
00:38:05 --> 00:38:07 to do outside of work? Because I know you're
00:38:07 --> 00:38:09 very passionate about security. And it shows
00:38:09 --> 00:38:12 naturally, as a veteran. So I'm always curious,
00:38:12 --> 00:38:15 what do you like to do outside of work? So we
00:38:15 --> 00:38:18 have six kids. So we spend a ton of time at sporting
00:38:18 --> 00:38:21 events and music events. So that takes up big
00:38:21 --> 00:38:23 chunks of time. And then my wife and I personally,
00:38:23 --> 00:38:25 we really like to travel. And we really love
00:38:25 --> 00:38:28 live music. So we went to Coachella earlier this
00:38:28 --> 00:38:32 year. Oh, nice. And camped out. And, you know,
00:38:32 --> 00:38:35 we're both big readers. And then just
00:38:35 --> 00:38:37 outdoor activities are kind of implied in the
00:38:37 --> 00:38:40 kids' sports thing. But like, yeah, so go for
00:38:40 --> 00:38:44 a hike, go to the beach, read a book, you know,
00:38:44 --> 00:38:47 that kind of thing. Love going to museums, you
00:38:47 --> 00:38:49 know, finding out, you know, wherever you travel,
00:38:49 --> 00:38:51 you know, just trying to get out and not go to
00:38:51 --> 00:38:53 the chain restaurant and not stay, you know,
00:38:53 --> 00:38:57 I'm the not tour person. So I want to go out
00:38:57 --> 00:38:59 and strike out and be like, hey, and then you
00:38:59 --> 00:39:00 go find somebody, you know, and then you ask
00:39:00 --> 00:39:03 the person at your cafe, like, hey, where should
00:39:03 --> 00:39:05 we go? We do that all the time, right? I like to
00:39:05 --> 00:39:07 understand. So if I came to visit, right, and
00:39:07 --> 00:39:09 I've been to Atlanta, but I would be calling you,
00:39:09 --> 00:39:13 going, hey, I'm coming in to Hotlanta, you
00:39:13 --> 00:39:15 know, next week. Yeah, what's happening? Is there a cultural
00:39:15 --> 00:39:18 thing going on, or is there music going on?
00:39:18 --> 00:39:20 Tell me the great rib place that
00:39:20 --> 00:39:22 I don't know about, that's not the touristy
00:39:22 --> 00:39:24 one. So that's what we love to do. So, you know,
00:39:24 --> 00:39:27 food, music, travel, all that kind of stuff, and
00:39:27 --> 00:39:30 the kids. Yeah. Okay. Now I'm curious to see,
00:39:30 --> 00:39:33 especially when you say I like going to the non-chain
00:39:33 --> 00:39:36 places and being outdoors, how you manage cybersecurity
00:39:36 --> 00:39:38 and all that, because it must be, it
00:39:38 --> 00:39:40 must be following you. Like I'm a CFO. So when
00:39:40 --> 00:39:43 we went to Disney, I was calculating in my head
00:39:43 --> 00:39:46 how much money they were making at the cotton
00:39:46 --> 00:39:49 candy stand. Like literally it feels like work
00:39:49 --> 00:39:52 follows you even outside. So I'm curious, there
00:39:52 --> 00:39:55 is a fun way that you do that too, when you're
00:39:55 --> 00:39:58 off in some of your fun activities. Yeah, there's
00:39:58 --> 00:40:01 probably two things that are, you know,
00:40:01 --> 00:40:04 tongue-in-cheek, but occupational hazards, right?
00:40:04 --> 00:40:06 And that's, yeah, so you're always thinking about,
00:40:06 --> 00:40:10 oh, this process is not secure. Like, every time
00:40:10 --> 00:40:13 I come in, I'm like, how would I game this process?
00:40:13 --> 00:40:17 How would I get in and not actually be the person
00:40:17 --> 00:40:19 that's on this picture? Right, to your
00:40:20 --> 00:40:22 Disney point, right? I mean, so
00:40:22 --> 00:40:24 you're always thinking about both sides, right?
00:40:24 --> 00:40:25 Usually, what you're thinking about is the
00:40:25 --> 00:40:27 fun side, like, how would I break this, right?
00:40:27 --> 00:40:29 But then how would I defend against it?
00:40:29 --> 00:40:31 That's one. And then
00:40:31 --> 00:40:32 because of my background in the military, then
00:40:32 --> 00:40:35 it's also, like, where's my exit?
00:40:35 --> 00:40:37 Like, where would I go? How do I
00:40:37 --> 00:40:39 do this? What would I do in the scenario, right?
00:40:39 --> 00:40:42 And honestly, it's sort of unconscious, right?
00:40:42 --> 00:40:45 Because it's just sort of how you how you think
00:40:45 --> 00:40:47 about things. Yeah, all the time. But I
00:40:47 --> 00:40:50 would imagine it's the same thing. I know for
00:40:50 --> 00:40:52 certain, you know, with military movies, you
00:40:52 --> 00:40:53 know, we watch and we're like, oh, it would never
00:40:53 --> 00:40:55 happen that way. And when there's a good one,
00:40:55 --> 00:40:57 we'll all talk about it. Like, oh, that opening
00:40:57 --> 00:41:01 scene in this movie was great. Or that scene.
00:41:01 --> 00:41:02 Weirdly, like there's a scene in Forrest Gump,
00:41:02 --> 00:41:05 you're like, yeah, that's actually accurate. So then,
00:41:05 --> 00:41:07 you know, you're like, who is the analyst?
00:41:07 --> 00:41:08 That's what we call it. Who is the analyst who
00:41:08 --> 00:41:11 informed the director? Because they were very
00:41:11 --> 00:41:13 good. Like, it was spot on. I would imagine that
00:41:13 --> 00:41:18 for doctors, lawyers, when they watch those shows,
00:41:19 --> 00:41:22 they're like, that's not how that happens. You
00:41:22 --> 00:41:24 know, like, I love A Few Good Men. And I'm well
00:41:24 --> 00:41:25 aware that's not how that court case would
00:41:25 --> 00:41:27 actually play out, right. But I mean, I'm sure,
00:41:27 --> 00:41:29 you know, I'm sure that attorneys are
00:41:29 --> 00:41:32 like, ridiculous. It never happens that way.
00:41:32 --> 00:41:34 But it's good theater, right? It's the same thing,
00:41:34 --> 00:41:36 right? I mean, again, there are some
00:41:36 --> 00:41:38 movies where you're like,
00:41:38 --> 00:41:40 yeah, that's a pretty accurate scene right there,
00:41:40 --> 00:41:41 what they're doing in the movie. But again, some
00:41:41 --> 00:41:44 of it is just great. You know, it's fictionalized.
00:41:44 --> 00:41:47 And it's just great fun, right? So yeah, I think
00:41:47 --> 00:41:48 everywhere you're going, you're always thinking about
00:41:48 --> 00:41:50 the process. And also, like, I think about customer
00:41:50 --> 00:41:53 experience a lot, like, this process is terrible.
00:41:53 --> 00:41:56 It drives me, you know, so I do think about that
00:41:56 --> 00:41:57 a lot. Like you said, when you're going into
00:41:57 --> 00:41:58 places, and when you see a great one, you're like,
00:41:58 --> 00:42:01 ah, you get the Disney experience, right, where
00:42:01 --> 00:42:03 everybody comes to help you. They just stop what
00:42:03 --> 00:42:05 they're doing and go, oh, how can I? Then you
00:42:05 --> 00:42:09 go, see, that's good culture. That's good culture,
00:42:09 --> 00:42:11 because you didn't, you know, there's that story,
00:42:11 --> 00:42:13 right, that old story, right, where, you know,
00:42:13 --> 00:42:16 JFK is at the Space Center, right, and he's asking
00:42:16 --> 00:42:19 somebody at the facility, like, hey, what
00:42:19 --> 00:42:22 do you do? And my recollection, anyway, is the
00:42:22 --> 00:42:24 gentleman's response was, well, I'm helping put
00:42:24 --> 00:42:27 a person on the moon. Wow. That's the culture, right?
00:42:27 --> 00:42:30 So now you go, ah. You guys got it. That's the
00:42:30 --> 00:42:32 culture because that is it. Everybody's contribution
00:42:32 --> 00:42:36 matters, right? No more, no less. Everybody's
00:42:36 --> 00:42:40 got things to contribute that make it a better
00:42:40 --> 00:42:43 team. Right. So it has to be a team. It can't
00:42:43 --> 00:42:46 just be me with a superhero cape. Yes. You know,
00:42:47 --> 00:42:50 fighting crime. It has to be all of us together
00:42:50 --> 00:42:53 because that's the only way we succeed. Right.
00:42:53 --> 00:42:56 The automation, the AI, the sophistication of
00:42:56 --> 00:42:59 criminals is too great for any one team, right?
00:42:59 --> 00:43:02 So we all have to be like, okay, I
00:43:02 --> 00:43:04 hate the password policy, but I know, I get
00:43:04 --> 00:43:06 it. But I've got to explain it to you in a way that
00:43:06 --> 00:43:08 makes sense to you, right? And not just to you,
00:43:08 --> 00:43:10 but to each person in a way that they understand,
00:43:10 --> 00:43:13 okay, hey, this is important. Doing things safely
00:43:13 --> 00:43:16 is just how we do things here. Like that would
00:43:16 --> 00:43:18 be what I would leave you with. Doing things
00:43:18 --> 00:43:20 safely. That's just how we do stuff here. So
00:43:20 --> 00:43:22 when you're hiring, when you're onboarding, when
00:43:22 --> 00:43:24 you're training new leaders, you explicitly train
00:43:24 --> 00:43:26 them. And then over time, again, remember I said,
00:43:26 --> 00:43:29 it's pervasive and it's enduring. So once you
00:43:29 --> 00:43:33 get that culture going in that direction, all the,
00:43:33 --> 00:43:35 what an organizational psychologist would call the
00:43:35 --> 00:43:39 social cues, will just align people. Because then
00:43:39 --> 00:43:41 they'll just be like, if I see this, hey, that's
00:43:41 --> 00:43:43 not how you do that. Come on. We've got to do it.
00:43:43 --> 00:43:45 You know, we got to do that. Schwartz is going
00:43:45 --> 00:43:48 to say something if I don't do this the right
00:43:48 --> 00:43:50 way. Like, you're griping, but you know that there's
00:43:50 --> 00:43:52 a real reason that you're doing it. And you got
00:43:52 --> 00:43:53 to know there's a real reason you're doing it.
00:43:53 --> 00:43:55 Yes. Not because I said so. It's not because
00:43:55 --> 00:43:58 I set a policy that's annoying. Right. Everybody.
00:43:58 --> 00:44:01 We all hate that. Right. The mandatory training.
00:44:01 --> 00:44:05 Boo. What's mandatory training? That is it. Right.
00:44:05 --> 00:44:08 Just the word itself is terrible. Right. Yeah.
00:44:08 --> 00:44:12 Mandatory training. Yikes. Okay. No. Doing things
00:44:12 --> 00:44:15 safely is just how we do things here. I think
00:44:15 --> 00:44:17 that would be what I would do. Love it. Thank
00:44:17 --> 00:44:19 you so, so much for being here. You're welcome.
00:44:19 --> 00:44:22 Thanks for having me. I appreciate it. It's great
00:44:22 --> 00:44:23 to meet you. Same here.