“Who’s looking out for you?” – A Conversation with Robert Field, Pete Cafarchio and Rick Diana

SINC’s Director of Partnerships Sean Navarro spoke with Robert Field, Pete Cafarchio and Rick Diana to find out: “Who’s looking out for you?”

You’re a hard-working leader responsible for protecting and improving the backbone of your organization. It’s a never-ending job that takes energy, vigilance, constant improvement and keeping up with all the latest industry trends. But as you’re giving your best to your organization who’s looking out for you? What about your own personal development? Who can you trust to have no agenda other than your own success?

This episode will be unlike any other. Using an interactive, fireside chat format you’ll hear about how one tech leader took his life and career to the next level by partnering with both an agent and a coach. He’ll tell you about his challenges, what the engagements were like, and the results that he got.

Watch for Robert Field, Pete Cafarchio and Rick Diana in the speaker lineup at our Northeast 2022 Forum!

‘Data Where It Counts’: Why Nonprofit & Industry Need Data Consortium – A Conversation with Eugene Kogan, The Population Council


SINC’s Director of Content Annie Liljegren spoke with Eugene Kogan in September 2021, and has edited this interview for length and clarity.

My guest today is Eugene Kogan, Global IT Director, Head of IT & Security at The Population Council. Eugene and his team received a 2021 FutureEdge 50 award for their Guatemalan data warehouse design project entitled: “Bringing Big Data to People Where It Counts.”
Presented by IDG’s CIO, the FutureEdge 50 awards honor technology advancements and the innovative cultures enabling them.

Thanks for speaking with me, Eugene. You bring an incredibly strong background in public sector/nonprofit: Thirteen years at UNICEF as Director of IT, Database and Applications Development; five years as CTO of the Rainforest Alliance; and for the last three years you’ve served as Global IT Director, Head of IT & Security at The Population Council. Seems like there’s definitely a theme there.

Eugene Kogan: Right, I’ve dedicated most of my career to the public sector: UNICEF USA, or helping farmers around the world dealing with climate change, or now working to empower girls and women in developing countries—all that is drawing from my background to help in these countries.

Once you start working within the not-for-profit space it’s easy to move in the same sector because from an IT perspective there are similarities. I used to work with brokerage companies, I used to work in the banking sector, so there it’s a little bit different—mostly hunting for money, let’s call it [chuckles]. Here, there are more of the social or human-driven elements.

How does that distinction impact the technical side of things?

Eugene Kogan: We need to deliver maybe twice as much for the same buck. The public and not-for-profit sector is always lagging and struggling with funding.

And of course working with lower and middle-income countries or poor communities with overlapping inequities, sometimes you cannot deliver the best technology, but you can make use of and innovate with what’s given.

From a technical standpoint, is there anything whose worth you saw absolutely proved during recent events, or conversely, something you were relieved to have done away with or to have moved past before the rain started to fall?

Eugene Kogan: Yes, just before Covid we completed a migration to the cloud and, as a proof of concept, went to a complete remote environment. We have 13 physical offices, and suddenly it became like 600+ offices because everyone was at home.

We were so lucky and very happy that we completed this migration and cloud implementation just before Covid started—just one day before. The day we officially announced the full migration was the day the US and foreign offices went into lockdown. Well, that was incredible. We were probably a mile ahead of many nonprofits and even some for-profit companies.

The Population Council was actually nominated for an IDG CIO award for that work.

Yes, indeed—that touches on my next question. In April you won a 2021 FutureEdge 50 award for a data warehouse project in Guatemala; the project entry was titled “Bringing Big Data to People Where It Counts.” Here’s what I wanted to ask: Why did you win, do you think?

Eugene Kogan: Why? Because not to say we are the best, but we are the best [laughs].

No, it was the incredible teamwork among my colleagues. And it was a concept project—we’re still working on this one. It was the involvement of the Guatemala team, and while it was mostly a design originating in Guatemala, of course we can scale it up for the entirety of Latin America. There were a lot of data-driven initiatives, and questions of how to secure the data and how to work with security on this one, because we were working with a lot of confidential data about pregnancy or about HIV.

That kind of data is always highly confidential, but especially in certain countries it’s super confidential. People can lose their lives, be put in jail or into slavery—there could be very serious consequences.

So all these projects, whatever we try to build and are still building, this will help young kids and young girls in those countries to fight those difficulties.

This project was put into a paper with the help of the Guatemala team of course, and there were a lot of scientists behind this as well, so it was an achievement for the Population Council staff.

“Data is always highly confidential, but especially in certain countries it’s super confidential. People can lose their lives or be put into slavery.”

What’s something you’re thinking about that your peers and the rest of the industry should be thinking about?

Eugene Kogan: My reasoning, of course, again comes from the not-for-profit world, because we do not have the investment to compare with the for-profit world. But we’re working with data, we’re working with people, and we must deal and compete with for-profit companies as well.

Very often we need to be aligned with GDPR or local compliance issues. Regardless of where the data are—whether it’s in Guatemala, Bangladesh, Pakistan, or let’s say the Ghana office—wherever we are working we need to protect the data. And as we are working towards a data-driven analytics platform—artificial intelligence—again we need the resources and to form a kind of consortium. We need to not just be operating alone in this very dangerous world.

So it would be exciting to work with similar companies, and companies who are sometimes not thinking about all the ways data could be shared.

Yes, we all have some confidential data, related maybe to our sector. But working with boots on the ground, let’s say, we have a lot of data collectors and we have a lot of scientists in those countries. So we can bring information together and work across sectors and build these data artifacts to benefit many researchers and many companies.

Let’s say we’re working with maybe 10 segments, 10 factors, but at the same time we can supplement with some other company who’s working with another 10. So in sum, it might not even be like 20 factors we can publish together, but at least 15: enriched data that will be much more available.
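The enrichment idea Eugene describes can be sketched in a few lines: two organizations each hold a different set of factors keyed by the same (anonymized) region, and joining them yields records richer than either source alone. The region names and factor names below are invented for illustration only.

```python
# Illustrative sketch of cross-organization data enrichment.
# Each party contributes different factors for the same region key;
# merging on shared keys produces a richer combined record.
# All names and values here are hypothetical.

ours   = {"region-01": {"hiv_awareness": 0.62, "school_enrollment": 0.81}}
theirs = {"region-01": {"clinic_density": 3.4, "mobile_coverage": 0.77}}

def enrich(a: dict, b: dict) -> dict:
    """Merge factor dicts, keeping only regions present in both sources."""
    return {k: {**a[k], **b[k]} for k in a.keys() & b.keys()}

combined = enrich(ours, theirs)
print(sorted(combined["region-01"]))  # four factors instead of two
```

In practice a consortium would of course need anonymization, access controls, and a shared schema before any such join, but the payoff is the same: each record carries more factors than either contributor could publish alone.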

With open access to data, we can even give a hand to institutions within those countries, and to sectors, companies, and governments.

This is not only my vision, it’s shared by a lot of people working in not-for-profit or government or public sectors. We need one voice to reach out to potential customers and partners and donors, because for some issues, working together is the only way we can solve them.

“In the form of an open-source, open concept, and with open access to the data, we can even give a hand to institutions within those countries…even (their) governments.”

So, benefiting governments in developing countries, then the individual citizens who are therefore impacted, as well as business and nonprofit at the same time. A rising tide lifts all boats?

Eugene Kogan: Absolutely. We are not only fighting for ourselves.

And of course there are commercial interests that benefit in this way. Of course we are looking for partners and maybe revenue in grants, but at the same time, whatever we do sooner or later will be beneficial for many companies around the world.

Where data are available, it’s good to make use of that and not always to start from scratch.

Thank you so much for your time. We’ll look forward to seeing you live in 2022.

Eugene Kogan: Yes, thank you.

Watch for Eugene Kogan in the speaker lineup at our 2022 live Forums!


View 2022 Events Schedule

Vendor Vision: Insights from the Solutions Side with Jason Coari, Lakeside Software

SINC’s Director of Content Annie Liljegren spoke with Jason Coari in November 2021, and has edited this interview for length and clarity.

I’m speaking with Jason Coari, Senior Director Product Marketing at Lakeside Software. Based in London, Jason will be joining us in Scottsdale, AZ for our 2021 West IT & Security Leaders Forum, where he’ll deliver a keynote presentation: “Empower Your Digital Workforce.”


Jason, you had a deep background in tech long before joining Lakeside towards the end of the pandemic year. What can you share about that move?

Jason Coari: Yes, I’ve been in technology for my entire career, and most recently I’ve focused on end-user computing: both at Lakeside and with my previous company.

Coming to Lakeside when I did was really exciting because I knew the relevance of the kind of tech Lakeside provides, and that it is becoming even more relevant after the pandemic.

I’m looking at it from two angles: trying to provide the employees’ personal perspective, and ensure that perspective is being collected and is guiding decisions. And on the other side, making sure IT has the tools to act on that guidance in an intuitive way that helps teams work better, again ultimately to help out a business’s employees.

When you poll employees as well as C-level executives, both say they’re actually more productive while remote working. But what’s quite compelling, from my perspective, is that when you actually double-click on that productivity, what’s under the covers suggests there are problems within those productivity increases, and that some of it may not be sustainable.

Employees may be more productive because they’re simply working longer hours, or because they’re not commuting. Employees suggested they now have problems collaborating, more than they did prior to the pandemic, because all that collaboration is now being implemented through technology. So if the tech isn’t working, collaboration isn’t happening, and collaboration is obviously a major mechanism driving productivity.

There’s all sorts of elements related to the pandemic and how it’s affecting employee experience, and I’m personally very grateful to be right in the middle of it.

We often hear about a possible disconnect between the employer’s perception of their employees’ satisfaction level with the digital experience, and then the actual degree of satisfaction.
From the vendor perspective, what do you see when coming to that problem?

JC: Yes, that’s a good question, Annie. The analogy I most often apply to an instance like this is the watermelon effect. That’s where IT is basically seeing everything look green, you know: all of our SLAs, we’re meeting our targets when it comes to mean time to resolution, we’ve got our ticket volumes and we know where those are relative to our overall goals or KPIs. But inside it’s red: employees are frustrated in some way.

And so the reality is that while those indicators may be green, they’re green because of the exact things happening on the employee side that you don’t want to be happening, and it’s actually causing productivity losses.

An example of this might be employees who have lost faith in the service desk because they believe incidents are taking too long to respond to, or who are worried about downtime because, of course, they’re using their computers more than ever and can’t afford to turn them off to get them serviced. And so what’s happening is they’re living with their problems.

But those problems are having a direct impact on their engagement and their overall productivity. So to IT everything looks rosy, everything looks great, but the employees are suffering.

What we enable organizations to adopt, both as a posture and as a best practice in general, is giving IT organizations better tools to monitor what’s going on within the device from the employee perspective—how they’re actually using the technology that’s being deployed, and a mechanism to engage employees when they need to be engaged.

So, a mechanism that can alert when employees may be working too many hours, or when they’re suffering a certain issue, and that automatically triggers a survey or some other sort of engagement. That helps IT take a pulse of what’s going on with employees, or even gives the employee the ability to self-heal the problem they’re seeing and solve it themselves.

We find that all employees, across the board, value a sense of autonomy. We like being able to control our own environment to the best of our ability, and so giving IT tools to allow employees to do that is good for the employee and it’s good for the business.

Give us an example of what that self-healing functionality might look like, and also what Lakeside brings to the table—what sets you apart?

JC: Sure, let’s say an employee is having a connectivity issue; connectivity is one of the most important aspects of having a good digital employee experience.

With a piece of software like Lakeside’s Digital Experience Cloud, we can actually detect wifi signal strength, and we can help IT generate a popup on an employee’s laptop that says “Oh, looks like your wifi signal strength is low,” and then provide a course of action the employee can take to remedy it.

And that sort of problem where the employee is experiencing a connectivity issue—some employees wouldn’t necessarily realize it’s a wifi strength issue, they may think it’s a website that’s not performing, they may think it’s something related to memory or hardware or their device in general. Giving the employees that bit of information they can act on is again really good for the employees, as well as for the business.
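The check-then-prompt flow Jason describes can be sketched in a few lines. This is not Lakeside’s actual agent or API (neither is public here); it is a minimal illustration using the Linux `/proc/net/wireless` format for link quality, with an invented threshold and message.

```python
# Hypothetical sketch of the kind of wifi check an endpoint agent might run.
# Parses /proc/net/wireless-style text for link quality, then decides
# whether to surface a remediation prompt. Threshold and wording invented.

SAMPLE = """Inter-| sta-|   Quality        |   Discarded packets               | Missed | WE
 face | tus | link level noise |  nwid  crypt   frag  retry   misc | beacon | 22
wlan0: 0000   38.  -72.  -256        0      0      0      0      0        0
"""

def parse_link_quality(proc_text: str) -> dict:
    """Return {interface: link_quality} from /proc/net/wireless-style text."""
    out = {}
    for line in proc_text.splitlines()[2:]:        # skip the two header lines
        parts = line.split()
        if not parts or not parts[0].endswith(":"):
            continue
        iface = parts[0].rstrip(":")
        out[iface] = float(parts[2].rstrip("."))   # link quality column
    return out

def advice(quality: float, threshold: float = 40.0) -> str:
    """Decide whether to prompt the employee with a remediation step."""
    if quality < threshold:
        return "Your wifi signal looks weak - try moving closer to your router."
    return "ok"

if __name__ == "__main__":
    for iface, q in parse_link_quality(SAMPLE).items():
        print(iface, advice(q))
```

The design point is the one Jason makes: the employee never has to diagnose the root cause themselves; the agent names the likely culprit and offers a concrete next step.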

As far as what Lakeside uniquely brings to the category, we have an ability to collect a degree of telemetry off of the device, application, and network that is truly superior within our category.

I say that because we collect across more metrics—we collect across 10,000 metrics—and we also do it more frequently. We collect data every 15 seconds, while there are others in the category that collect either every hour, or sometimes just every day.

But an employee might be having an issue that occurs intermittently or occasionally. Well, if you’re only collecting data once every hour, what happens if you’re collecting the status of that device when the incident is not manifesting itself? You’re never going to see it in the first place; you’re not getting accurate insight. Data by itself is just data, but having better data can drive better insights and those better insights can drive better actionability. So by putting all that together we give organizations the most opportunity to impact and improve digital employee experience within the category.
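The sampling-interval argument above is easy to demonstrate with a toy model (my illustration, not Lakeside’s methodology): suppose an issue flares for two minutes once per hour, and compare how often a 15-second poller versus an hourly poller actually lands inside the flare-up window. All numbers are invented for the illustration.

```python
# Toy model: how polling interval affects the odds of ever observing
# an intermittent incident. The incident flares for 2 minutes starting
# 10 minutes into each hour; we count samples that catch it.

INCIDENT_START = 10 * 60      # seconds into each hour the flare-up begins
INCIDENT_LEN   = 2 * 60       # flare-up lasts 2 minutes

def incident_active(t: int) -> bool:
    """True while the intermittent issue is manifesting at second t."""
    return INCIDENT_START <= (t % 3600) < INCIDENT_START + INCIDENT_LEN

def observations(interval: int, horizon: int = 8 * 3600) -> int:
    """Count samples over `horizon` seconds that catch the issue."""
    return sum(incident_active(t) for t in range(0, horizon, interval))

fine   = observations(15)     # 15-second polling
coarse = observations(3600)   # hourly polling, sampling at the top of the hour

print(fine, coarse)           # the hourly poller never lands in the window
```

Over an eight-hour workday the fine-grained poller records the incident dozens of times, while the hourly poller, sampling at the top of each hour, records it zero times: exactly the “you’re never going to see it in the first place” failure mode described above.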

I’ll also share several great case studies in my presentation, specifically, one that has to do with asset rationalization. Lakeside services some very large American enterprises, and what we’ve found is that often hardware refresh cycles are conducted in ways that cost businesses a tremendous amount of capital. When it comes to refreshing the devices of thousands of employees, you think about the aggregate amount, the overall cost of that kind of investment the company’s making.

The reality we often see is that most devices don’t actually need replacing, they just need upgrading: they need upgraded memory, they need upgraded hardware. In some cases they need perhaps upgraded processors, but there’s also an opportunity to move a lot of employees from physical infrastructure to thin clients.

Giving enterprises that kind of flexibility and the ability to be more prescriptive in how they deploy hardware to their technology ends up saving organizations millions of dollars, and so I’ll be sharing more about that in the presentation.

Great to hear you’ll have some exciting case studies for us. Your presentation is titled “Empower Your Digital Workforce,” and as we’re thinking about the future of work—that’s become something of a buzz phrase, but let’s dig into it from the vendor perspective.
I don’t want to use the word ‘assume,’ but what are you anticipating about what that workforce is going to look like? Coming out of our recent global event, there must be some things you’re presuming about the workplace of the near future or of the future, and what’s going to stick.

JC: That’s a terrific question. There’s a few observations I would make based on existing trends and I suppose a short-term view out, for the next one, two, maybe three years. The world landscape changes rapidly, the technology landscape changes rapidly, and who knows what kind of environmental factors may come into play that we can’t predict at this time.

But one thing that’s happening in front of us, and that I think is perhaps one positive consequence of the pandemic, is this concept of organizations and enterprises ensuring their employees can be productive anywhere.

That freedom in giving employees more of a choice on where they are productive will, I believe, ultimately benefit employees and benefit their organizations. But the only reason organizations are able to adopt this commercial posture has to do with the fact that technology is such a critical element of employee productivity. That’s an area I don’t expect to change, and an area I think will only grow over the coming years.

And as organizations adopt the posture of supporting worker productivity, no matter where they work, I think that takes IT from a kind of back office within a commercial organization, and puts them right into the driving seat—into the front office of executive conversations.

For the Forum attendees, I think the importance of our position relative to overall company performance has improved in a step change to another level over the last year or two. Think of IT’s ability to help organizations weather this storm of the Great Resignation, for example, and to help deal with labor shortages by ensuring that employees are happy. A lot of that happiness has to do with not being frustrated, and so much of that frustration can stem from technology. Ensuring that IT is equipped to deliver the best possible digital experience, and putting IT in the driver’s seat, I think is really important, and is going to continue.

The other thing I would predict is that all of us in IT are becoming more cross-functional in nature: more and more business functions are reliant on IT for the ultimate output of the employees within their group. IT is becoming more of a business partner with those groups, and therefore needs solutions that give more insight into how employees are using the technology that’s being deployed.

So again, I think the kind of perspectives provided by that user-centric approach, which digital experience management solutions offer, will be critical in supporting those other business functions.

Indeed. We hear so often from our community that folks feel they’re not taking full advantage of the solutions or products they have deployed. It sounds like you’re speaking to that need to take stock, to assess how we’re making use of what we have.
And to your point about enabling people to be productive anywhere, if we remove some of the generational challenges that we sometimes see, you can make that a lot easier for folks. So something as simple as an employee not realizing they have a connectivity issue—a system that anticipates that might somewhat level the playing field between generations that maybe aren’t as familiar with technology as others. Would that be fair to say?

JC: Annie, what you just brought up is actually something I often saw in the technology conferences I presented at this year: the idea of digital friction in the business and adoption of digital tools. It’s definitely not one-size-fits-all. There really are different tolerances among employees, and so a company really needs to not have a one-size-fits-all IT department. You need the ability to personalize IT service delivery and the ability to group employees, and that’s something a solution like ours can do.

Yes, you’re sharing a point more true than ever, which is that an organization’s digital culture and whether employees embrace the digital tools provided by the business is more critical than it’s ever been.

It’s critical for companies to treat that as something that needs investment and which in some ways needs digital caretakers. It needs measurement, it needs metrics behind it, and the ability to benchmark, because without that kind of insight how do you actually improve upon it? And it needs resourcing, so, yes, 100%.

Such a relevant area for discussion—we hear that theme of adoption frustration all the time.
This is your first SINC event; what are you excited to share with the West audience, or to get the audience talking about in the discussion? What are you looking forward to about your presentation and the event itself?

JC: Well, this is my first in-person event in some time, so I’m pretty jazzed [laughs].

I know the term “digital employee experience” is still fairly new to a lot of organizations out there. “Employee experience” is a term that’s getting much more attention these days, and of course the idea of the customer experience has had a tremendous amount of attention for the past few years.

So I’d love it if the attendees took away a greater sense of understanding as to why I’m paying such attention to digital employee experience, why technology matters so greatly to employees in today’s world, and why it’s so important. I hope to provide a deeper understanding that, with investment and with resourcing, it’s an area of the business where you really can make genuine improvements through solutions within our category.

And then I’m also thinking about all the people in that room, how we’re all IT professionals and we’re all part of this industry, collectively. It’s a rare opportunity to have a general sharing of experience on this kind of scale, so it would be fantastic to have a spirited discussion around the challenges people are seeing and how others are resolving those challenges.

Having a discussion where all of us, including myself, can walk away having learned something—that would be really tremendous.

Sounds great, Jason. That’s exactly what we are about: curating environments and events where that level of quality conversation can take place. Thanks for your time; we’ll see you in Scottsdale.

JC: Thank you so much, it’s been delightful talking to you.

Catch Jason’s keynote presentation and engage with Lakeside Software at SINC’s West Leaders Forum


Apply to Attend

Death Rides a Unicorn – Bryson Bort on Attack Emulation, the Human Element, and Why More Tech Isn’t the Answer


SINC’s Director of Content Annie Liljegren spoke with Bryson Bort in October 2021 for this interview, which has been edited for length and clarity.

Bryson Bort is founder and CEO of the adversary emulation platform SCYTHE, and co-founder of ICS Village, a nonprofit advancing awareness of industrial control system security.
A former U.S. Army captain, Bryson currently serves as Advisor to the Army Cyber Institute at West Point and DHS/CISA, and is a Senior Fellow for Cybersecurity and National Security at R Street Institute.


Bryson is a featured speaker at our 2021 West IT & Security Leaders Forum, Dec. 5-7 in Scottsdale, AZ, where he’ll be presenting “Attack. Detect. Respond: Know Where You Stand to Prevent the Next Attack,” and leading an audience-interactive incident response exercise, “Blue Team: Choose Your Own Adventure.”

Thanks for being here, Bryson. First, I’d like to ask about your definitional difference between “attack simulation” versus “attack emulation.” You’re pretty firm about the need to demonstrate exactly how a specific org would be affected rather than relying on a linear checklist. What’s informing that distinction?

Bryson Bort: When we had the idea from a Fortune 50 consulting engagement to do this, we had no idea there was even the space or anybody else doing anything like this, which is why we went ahead and built it. We’re a cross between a red team and the traditional breach-and-attack simulation vendors, which is where the simulation part comes in.

And then all of a sudden, Gartner and Forrester come out saying, this is the breach-and-attack simulation space, and we’re thinking Well, who are these other folks?

And as we went through the marketing language repeatedly over time to figure out what exactly they did, what we realized is they were technical solutions to technical problems. You mentioned the checklist approach—that’s what they do: they look at the complexity of an attack and think, we can boil it down into this checklist, and think that with this checklist you’re going to be able to measure technical controls, and since it’s repeatable you’ll be able to identify configuration drift.

Well, that’s not security. Security is understanding the largest risk surface area that’s a part of every enterprise, which is the employees, your own staff—how did they actually respond? It’s not just looking at the tool side: what was stopped or what did we see, because it’s more complex than that. Attacks are more complex than a simple checklist.

One of the easiest things I always like to point out: my ability to do something on a computer changes based on where I am or what privileges I have. That’s the state basis of an attack, and those are the kinds of things an attacker works through as part of their attack chain; they affect what they might do, what happens next, and how it would happen.

So that’s simulation versus emulation: actually doing the thing to see what would happen, versus a simple view of it.
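Bryson’s simulation-versus-emulation distinction can be made concrete with a toy contrast (my illustration, not SCYTHE’s engine; the technique names are invented): a checklist-style “simulation” scores each technique in isolation, while an “emulation” threads state through the chain, so later steps depend on how earlier ones actually resolved.

```python
# Toy contrast between a checklist "simulation" and a stateful "emulation".
# Technique names are invented for illustration.

CHAIN = ["phish_user", "dump_creds", "lateral_move", "exfiltrate"]

def simulate(defenses: set) -> list:
    """Checklist view: score each technique independently of the others."""
    return [step for step in CHAIN if step not in defenses]

def emulate(defenses: set) -> list:
    """Stateful view: the chain ends at the first blocked step."""
    achieved = []
    for step in CHAIN:
        if step in defenses:
            break                 # attacker never reaches later steps
        achieved.append(step)
    return achieved

defenses = {"dump_creds"}         # e.g. credential dumping is blocked
print(simulate(defenses))         # three techniques "succeed" in isolation
print(emulate(defenses))          # but the real chain stops after step one
```

The checklist overstates exposure in one direction and understates it in the other: it cannot tell you that blocking one mid-chain control neutralized everything downstream, nor how your people responded while the chain was running, which is the part the checklist cannot measure at all.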

Your session abstract for your presentation at our West Forum declares “the solution is not more technology.” I’m curious to what degree your military background informs that emphasis on people and process: you’re a West Point graduate and a former U.S. Army Captain.
I might venture to say, it seems clear even from a civilian perspective, that although there’s countless cutting-edge technologies developed for and by the military, the mentality is people-first, and almost the opposite of a product-first, bigger, better, faster product type of solution—that in the military, personnel training is absolutely foundational to everything else.

Bryson Bort:  Yes. The talk I’m giving is specific to a purple team, which is the collaborative approach of bringing people and process into those assessments. So, Red and Blue work together for immediate goals and improvement, and then there’s all these additional benefits that come out of that process.

The reason I went into the Army was that focus on people. In the Navy, it’s ships; in the Air Force, it’s planes; in the Army, it’s people. Without a person holding something, nothing happens—it’s why we call it boots on the ground.

So throughout my life and career I’ve always had that people focus, which is what I bring to the cybersecurity realm. I understand the technical side, and I can dial it in and speak to it easily, but I think that understanding, tying into the people element and the people aspect, is one of my personal differentiators. Being in the Army was a reflection of who I am.

Tying it back on the assessment side, one of the things we talked about in the military: the more you sweat in peace, the less you bleed in war.

And while that’s a stark aphorism, what it really means for us is the more you prepare your defenses correctly, with realistic training, the better you’re going to be when something does happen, because the reality is everybody is hackable. You’re going to get breached; it’s just a question of how quickly you can minimize the impact.

“The more you prepare with realistic training, the better you’re going to be when something happens. (E)verybody is hackable…it’s a question of how quickly you can minimize the impact.”

Right, when the technology fails—how good are your people, and how well-rehearsed is your process.
I’ve watched your interview from the RSA conference discussing the Florida water plant breach. I’m currently in Kansas, where they recently sentenced a young man who in 2019 shut down the entire rural water district in Ellsworth, Kansas. He was a former employee who had not worked there for almost three months but was able to access and shut down the system from a cell phone using a shared password, and states he was intoxicated at the time. Clearly a failure of people, processes, and tech.
We’ve been speaking about security as far as risks to companies and consumers, and breaches of public utilities are another matter altogether. You’ve actually started a nonprofit focused on industrial control system security.

Bryson Bort: Yes, I co-founded a nonprofit 501(c)(3), the ICS Village. We go around the country, and sometimes the world, with critical infrastructure exhibits to teach industrial control system security. The starting point for everyone is education.

Going back to the people aspect, I made a comment to the Washington Post recently about how, in a time when the phone and the ability to download apps is the limit of the average citizen’s knowledge, a knowledgeable citizenry is an armed citizenry in this case.

And in the case you mention, it’s not that I expect the government to go in and defend water utilities; it’s all about priorities and state budgets, and at the community level, which is where most of this stuff happens. If our citizens are more aware of these problems, they’re going to be more likely to have governments at the local level fund and do these things.

One of the things I’m doing is raising awareness at the national level so there’s more federal resources deployed appropriately down to those communities. But these are neighborhood problems; the water utility is not something the federal government is part of at all. That’s a local problem, and so far, the cavalry ain’t coming.

To that point, I’d love to get your response to something Ken McGee, Research Fellow at Info-Tech, said when I interviewed him about the digital economy. I’ll give you the full quote:
“It is time to call out the cybersecurity world as a profound, abject and complete failure and to arrive at a better solution than having individual companies, individual agencies, and individuals use their limited resources to combat what is very obviously state-sponsored cyberterrorism…Let’s not ask companies to be the sole source of funding for security, let’s take a look at what it is: it is war.”

Bryson Bort: He sounds like me—I’ve given that talk. The answer is yes.
First of all, the industry is an abject failure on the tech side because there is no formal definition of security and there is no approach to that answer. The entire attack path—going back to that emulation versus simulation—is infinite, because it’s not just technical. There are all sorts of physical parts to this, and social parts of this, and people parts. It’s not as if we’ll just come up with a better mousetrap; no, the system is inherently flawed that way.

Now, to Ken’s comment, I just gave a talk at the Department of Defense last week, and I’ll quote Dmitri Alperovitch on this: we don’t have a cyber problem, we have a Russian, Iranian, Chinese, and North Korean problem.

What’s happening is happening because, geopolitically speaking, it’s the best move on the chessboard, and the US government made a strategic decision years ago that private industry was on its own. It has only recently, in about the last three years, started to change on that, but this is a generational gap we’re trying to overcome now.

“The entire attack path—going back to that emulation versus simulation—is infinite, because it’s not just technical. There are all sorts of physical parts to this, and social parts of this, and people parts.”

The point you made a moment ago, that the federal government is not going to come rescue rural water districts, is well-taken. But on the other extreme, and I’m being a bit facetious here, it’s almost as if this won’t be solved by teaching people not to click malicious links.

Bryson Bort: Oh yeah, that doesn't do a damn thing. It's one of those things everybody thinks is intuitive: if we just train the users. Nope, that's not going to do it. I'm an anti-hygiene person; there's no point. Build a better system and stop depending on your users to be better at it.

Going back to the water plant example and Ken's quote from earlier… The thing I've been advocating for the last three years is that each water plant (so to speak) is facing the same problem, so why are we depending on each of them to solve it separately?

For one, that's inefficient, and two, with the talent they have available the results are going to be a mixed bag. So that's an example where the federal government shouldn't come in and regulate you; they should come in and say, Here's a shared catalog of things we've already figured out for the 115,000 of you (which is how many different water plants there are in the United States). They should say, come pick from this bounty that we've centrally curated.

What's top-of-mind for you, as far as something that either you wish got more play and more attention, or gets plenty of buzz but where you feel the conversation needs to be different, to be deeper, or to take another approach? If you could get everybody to focus in on something, what would it be?

Bryson Bort: I'll give you an apocalyptic one… Because of asymmetric cyber capabilities, I think the United States in the next 10 to 15 years is going to be humbled on the world stage in such a way that we are no longer seen as the superpower.
I think it’s going to be the Chinese that do it, and I think how they do it is they’re going to take Taiwan, and how they take Taiwan will be by causing just enough disruption to fix US forces in a response in place.

The loss of Taiwan in that manner is going to be a smack across the face heard around the world, and American hegemony will be on a permanent slide.

And I think cyber is going to be how it happens, because nobody wants to fight an M1 Abrams on the battlefield, but it’s fine when it stays in the US.

Is that dark enough?

“Cyber is going to be how it happens because nobody wants to fight an M1 on the battlefield, but it’s fine when it stays in the U.S.”

I’m so glad my last question is about unicorns: What’s with the unicorns?

Bryson Bort: (laughs) At the consultancy that I founded, GRIMM, we would do an annual t-shirt contest for DEF CON, and I came up with the idea of the Grim riding a unicorn. I thought the juxtaposition was really funny, and our brand style is more like Disney villain: it’s not too dark, it’s not too fluffy.

It's a nice mix that matched the juxtaposition, and the design just blew up—everybody wanted it. Shortly after that, when we spun out SCYTHE, we needed something and didn't want brand confusion with GRIMM, and people seemed to be really into this unicorn thing, and the design was born.

Now, we have unicorn hoodies in the five colors representing black and white hats, and red, blue, and purple teams. I’ll bring the blue one with me out to Arizona to wear when I lead the Choose Your Own Adventure incident response interactive.

Excellent—we’re looking forward to having you out at West, and certainly these sessions. Appreciate your time, Bryson.

Bryson Bort: Certainly; thank you.

Engage with Bryson and catch his sessions at SINC’s 2021 West Forum


Apply to Attend

Moral Courage and the Human Factor: SINC Sits Down with Les Correia, Estée Lauder


SINC’s Director of Content Annie Liljegren spoke with Les Correia in September 2021, and has edited this interview for length and clarity.

My guest today is Les Correia, Executive Director, Global Head of Application Security and Special Projects at Estée Lauder.
Les will be featured at our 2021 West Forum December 5-7 in Scottsdale, Arizona, delivering a session entitled "Key Considerations in Building a DevSecOps Program."


Great to have you here, Les. I'd like to start with a particularly prescient statement you made at the Cyber Security Exchange North America in 2018.
You said: “I don’t think we’ll ever get to this utopian perfection, so we have to instead be good at crisis management.”
Certainly no one's going to argue with that post-Covid.

Les Correia: My thinking has always been consistent in that space because, in general terms, a human being always finds a way around barriers. That's why we call them hackers, depending on which side you are on. We'll always find a way around, even as people, right? If you, say, create a law, people chip away at the law until they can get through, and then there's a new law, and so on.

You’re always going to have somebody breaking through, because if you’re breaking through you only have to be right once, and the person who is building those defenses has to be right all the time, so you have to think that way.

And then talking about the last 18-24 months: basically, you always have to work under a method that assumes you will be broken into, and start thinking ahead of time: prioritizing what's important, and also what isn't important, and focusing on those areas, so that before anything happens you've already planned.

You need to know how you'll deal with the crisis if something happens now, so you're not panicking and trying to scramble.

So instead of prioritizing a schedule for when you're going to do things, ask: what is the priority itself? Then you know what's important, what you need to lose sleep over, and what you shouldn't.

And how did that play out over the last 18 to 24 months, steering through a crisis where it seems everything is a priority?

LC: Yes, it sounds simplistic, but it isn't: business continuity as opposed to disaster recovery. Disaster recovery is very physical: something happens, and here is what you do. But continuity is making sure your business continues regardless of whatever the disaster is.

So one of the tenets of building that continuity program is understanding the risk and the business impact of different processes within the company. If you know how all the key processes, departments, locations, and business units interact, you'll know what's important and what isn't.
For example, if you're in manufacturing, you know what to protect and what happens if something goes wrong, and you've already prioritized: Okay, is that more important than selling? Sales is another process; is payroll more important by region? And so forth. If you have those things in place, you can actually focus on the priority of certain things.

It's a hard thing to do for a company; it requires that mindset and commitment, and most companies struggle with it. But if you're already on that level—focusing on what's important—there's less of a headache, because you've already got that priority in place. If you don't know what's important, you have to weigh everything on the spur of the moment.

Now with Covid, the main difference was reacting to access management, and making sure you’ve got good endpoint security, whatever that endpoint is.
Also the other thing was the belief in zero trust and identity, where you marry the two so you have an identity, and you have a zero trust associated with that identity. Zero trust is not a new concept, and actually may be a misnomer–more accurately ‘allow access’ or ‘trust those that require.’

So it means identifying each area that needs more access, not just coming into the environment, and then a free-for-all, but allowing identity to govern. It requires work, and that’s where the maturity, I think, is going to get better.

So if we were headed towards identity-based approaches before, WFH obviously accelerated that, along with a massive shift in endpoint context. How does that impact an identity-based methodology?

LC: You have to wrap it all up together, because all these second factors and multi-factors are still associated with an identity.

The big thing in the past used to be that whole networks were segmented, but it was not based on identity. It was based on what's important and what's not important. But you link that up with an identity that's always stuck to you, and add those levels, whether it's biometrics or behaviors associated with, say, Annie. That's my kind of thinking anyway.

And the more factors you have, the better. Now, you might use biometrics or whatever, but the point is that the identity is linked really tightly, so it follows you wherever you are. Of course, that means you need a good identity system too, which is also maturing.

I’d like to take you back to something you said at the first Global Security Observatory. Here’s the full quote:
“(S)ocial engineering…gaps continue to be our bane. There has been some improvement in tools to address some behavioral issues using AI. Studies are ongoing, taking advantage of quantum computing to support AI utilizing neural networks to match human intelligence.”
The way I read this, it suggests you don’t necessarily think we can teach ourselves to behave differently or correct our own behavioral issues, and that AI will have to address, or at least support that. Would that be fair to say?
Are users simply unable to behave as needed for optimal security without the help of AI, or are there ways we can meaningfully socially engineer our behavior? I realize that’s several questions in one…

LC: No, that's fine; actually this sort of thing has come up before in my other talks…

When computers were being introduced, they said, Oh, it's going to replace human beings and we'll all be out of a job, and so on. And it didn't—there's even more work, and there will be even more work. So first of all, that's a myth.

And then, even with all the focus on phishing, people are still falling for it, people are still doing the clicking. The degree to which certain behaviors are still happening—it's unbelievable what people are willing to believe.

And that's why you need these other kinds of input, and artificial intelligence is that. We use that word so loosely, and I get annoyed sometimes—machine learning is what it actually is.

But this stuff is only as good as the data that is collected and curated—that's the fundamental thing. I can have the best algorithms and it means nothing if I don't have the right data to curate, and therefore to learn from and move forward.

Today, information is coming at speed, which is a huge difference: information that used to take years now arrives at speed. And you want something to help you pick up the nuggets of what's important and what isn't, at least so you don't sweat the normalized stuff.

So why not use tools? Today you have tools that can detect what's right or wrong, to an extent—but you still need a human being to validate that.

You had generations who never touched a computer and then generations now who’ve been operating computers since they were babies, but you’ll never replace the human element because of the human factor.

There will always be new vulnerabilities and people will always fall for them, regardless of which generation you come from.

“You’ll never replace the human element, because of the human factor.”

Seems like that touches back to the point you made earlier: that you have to assume breach, and it’s futile to think you can eradicate foolish human behavior, so you assume people are going to act carelessly sometimes and work instead to mitigate the effects?

LC: Absolutely. If it's not you, there's going to be someone, and so you're only as good as your weakest link. You're always going to have a breach somewhere or other.

And then there are two other key factors: first, selectivity—human beings are so selective. What we tend to do is protect our bank account like crazy and leave our social networks and retail history out there, and you're exposing yourself, because you've taken a picture on your vacation to Italy and then someone knows they can come burgle your house. So with the way that selectivity works today, I absolutely think privacy is a joke. We have these regulations and all that bull, but really we give it away for free.

And not only that, but there's also the correlation concept, because now you can correlate the information. I can find out something about a person here, and then, using something on social media or elsewhere, I can correlate that information.

In your public remarks and interviews and articles you consistently refer to this idea of ‘moral courage.’ Can you unpack what that idea means to you, whether in the context of your position or your industry or however you want to take it?

LC: Yes, I feel so strongly about this. Let me start with why I think that:

I think it's more important to have good relationships than a super IQ. We always judge someone as being such a great person because she's got a crazy IQ and all that, and yet she might be the worst at relationships.

It’s important to have empathy, it’s important to understand cultures, to understand relationships. That’s where you can get things done, it’s a fundamental.
Now, moral courage, the way I'm thinking about it, is somebody having the strength to speak up—because not everybody has the oomph to say something is right or wrong without consequences. But I feel that if you're a leader, and especially if you've got a following, it's extremely important. If you hear someone being disparaged, don't laugh it off—say something and have the strength to speak out, even for the sake of your own integrity.
Because those little moments inform larger things, moral courage has to happen individually—with each of us, one at a time.

“Have the strength to speak out, even for the sake of your own integrity…those moments inform larger things.”

It sounds like you see part of the value of that mindset is that it’s not siloed off–it’s interwoven with and informing your strategy and your other relationships and the business itself. It’s not as though you have what’s morally correct over here, and then what’s effective over here.

LC: All of those absolutely, it’s intermingled. And it’s not enough to scramble to react when it’s suddenly fashionable to improve diversity or to hire women—these issues have been going on forever—it’s hard for women in tech. And we need all kinds of diverse thinking to make our world effective.

And most of the time we're just followers, so I appreciate folks who can speak their minds, challenge things, and have that moral courage. And I say moral because that's what it is: morally correct. Otherwise it's the opposite.

People say I'm very outspoken—sometimes they've said it to me directly—but I feel that the truth is okay. It's okay to say.

What’s top of mind for you right now, as far as something you wish the industry was paying more attention to, or something you wish your peers were thinking about in a different way?

LC: A lot to say, but I think the hardest thing is that whole social behavior, social engineering element of security awareness. It's almost left at the back end of things, and then you say, Oh, did you hear so-and-so got hit with ransomware…

And yet, this whole social engineering concept is really about thinking about how we think, and behaviors, and that education is so necessary. But it’s almost looked at as, you know, a session you have to attend for compliance reasons.

But if you do that, if you're just doing it to pass, you're never going to learn anything.

So I think the big one is still social engineering and having a proper education. There’s a lot of talk and a lot written on that, but it’s a real concern.

Engage with Les and catch his session at our West IT & Security Leaders Forum, December 5-7 at the Omni Scottsdale Resort & Spa at Montelucia.

West IT & Security Leaders Forum

Apply to Attend