Last year I was asked to provide a short video clip for a security product vendor's Sales Enablement offsite, with the goal of delivering a professional and inspiring message about empathy: specifically, how a vendor can raise the bar with its field sales team in understanding the nature of the challenges facing CISOs, and how to be a partner, not just a vendor. In other words, how to be authentic and genuine when speaking with CISOs about security.
I was reminded of a good definition of empathy as compared with sympathy, and made that one of my points in the recording. It goes something like this: sympathy is when you share somebody's feelings, so you can only be sympathetic with someone if you've been in their position or done their particular job. If you've never been a security officer or an individual contributor on a security team, then sympathy is not an option. This is where empathy comes in. Empathy is the act of working to understand another person's feelings: to feel "for" someone (empathy) rather than to feel "with" someone (sympathy). In this sense, empathy is wrapped up in a larger concept often referred to as emotional intelligence, or EQ (by analogy with IQ, and championed by Daniel Goleman in his first book on the subject, Emotional Intelligence: Why It Can Matter More Than IQ, published in 1995).
And while I'm referencing some of the sources for this view on digital empathy, I would be remiss if I didn't mention the writings of Kelly Shortridge. Kelly is a former SecurityScorecard Product Manager, and while she and I did not overlap at the company, I've found her work to be an inspiration. Her grounding in quantifying risk in a scalable and repeatable manner has played a part in our approach to advising executives and boards on risk management; it has also helped make the world a safer place through data-driven analysis and reflection on the nature of human behavior.
Security Awareness Training
When someone fails one of my monthly phishing tests, I am adamant that a carrot, rather than a stick, is the best approach. I intentionally ask my team to inject small flags and imperfections into every single phishing test that we create, in order to train a mental muscle of skepticism rather than simply trick everyone with "a perfect phish." The ability to notice a typo or grammatical error is exactly what we should cultivate in security awareness campaigns, because such tells also occur in real-life campaigns and attacks. We should not seek to punish someone for clicking on a link. A momentary lapse in judgment could be the result of a busy day of meetings and urgent emails that generates a degree of fatigue. It could be that a third-party vendor or service provider has suffered a Business Email Compromise (BEC) event, and the bad actors are harvesting the address book and sent-mail folder of that (formerly) trusted user account.
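The practice described above, planting a deliberate "tell" in every simulated phish, can be sketched in a few lines of code. This is a hypothetical illustration, not any particular awareness platform's API: the `TELLS` catalog, the `inject_tell` helper, and the sample template are all invented for the example.

```python
import random

# Hypothetical catalog of deliberate imperfections to plant in each simulated
# phish, so employees practice spotting flaws rather than facing a perfect lure.
TELLS = [
    ("your account", "you're account"),  # grammatical error
    ("IT Department", "IT Departmant"),  # typo
    ("https://portal.example.com", "https://portal.examp1e.com"),  # look-alike domain
]

def inject_tell(template: str, rng: random.Random) -> str:
    """Replace one clean phrase with a flawed counterpart, if any applies."""
    candidates = [(good, bad) for good, bad in TELLS if good in template]
    if not candidates:
        return template  # no applicable tell; leave the template unchanged
    good, bad = rng.choice(candidates)
    return template.replace(good, bad, 1)

template = (
    "Hello, the IT Department has detected unusual activity on your account. "
    "Please verify at https://portal.example.com within 24 hours."
)
# Seeding makes each campaign's injected flaw reproducible for later review.
flawed = inject_tell(template, random.Random(42))
print(flawed)
```

The design choice mirrors the point of the paragraph: every outgoing test message is guaranteed to contain at least one detectable imperfection, so the campaign measures attentiveness rather than rewarding a trick.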
We are human, and humans are imperfect; this is not a flaw that will ever be remediated. There is no “wetware patch” that will stop a person from being curious or anxious about an email that mentions upcoming layoffs, for example. Any program of security awareness that has harsh consequences, such as termination after three failures (I’m told this is a real policy in some companies), is working against instead of with these powerful forces of human nature.
This is why we must shift our thinking from a fear of failure to the embrace of it. We can only be truly resilient if we accept the fact that we will continue to be attacked by organized crime, nation states, and other bad actors who aim to do us harm. Yes, we can and should try to prevent bad things from happening to the extent possible. But I believe that a prevention bias has emerged in infosec which robs us of attention (and potentially budget) for "right of boom" capabilities and readiness. The truth is, any company can be breached by an attacker with sufficient time and resources. You may only be collateral damage in an outbreak of a strain of malware or ransomware, but you will be hit; it's only a matter of time. Or your infrastructure might just be a stepping stone toward another target, as was the case with the Twilio breach. When "boom" happens and a breach occurs, or a ransomware event finds its "patient zero," it is our ability to detect and respond to the attack that determines just how extensive the damage will be. No amount of "left of boom" spending will be considered sufficient after the security event occurs.
The Limits of Compassion
There may well be limits to the compassion that we can sustainably offer to our colleagues, our community, and beyond. A person with low empathy has trouble connecting to other people's circumstances. They believe that an adverse event would never happen to them, or that they would handle such a situation "much better." I cannot win a philosophical argument about such a lack of empathy, but I do know that empathy, like many other traits and skills, must be practiced in order to persist as part of our worldview. We must nurture the good behaviors that we believe enable a fair and just society: where trust is earned, compassion is rewarded, and the calculus of the creation and distribution of value is transparent and open. And at the end of the day, when it comes to cybersecurity, trust must be nurtured. The SecurityScorecard ratings platform is, in this context, an exercise in supporting the idea that we must share our understanding of risk, our observations of dangerously misconfigured infrastructure, and the belief that the world can benefit from such sustained efforts. Our collective resilience is built upon trust, transparency, and digital empathy.
The 2-minute clip I recorded on the subject of digital empathy, and on being authentic in our interactions with each other as vendors, partners, and customers, is available here: https://youtu.be/-83Fli4O2ZY
References:
https://en.wikipedia.org/wiki/Digital_empathy
https://en.wikipedia.org/wiki/Emotional_Intelligence