The Big Hack – NY Magazine

NY Magazine has an interesting (fictional) story about a 2017 cyber attack. Almost all of the elements seem plausible: a mix of hardware-based and people-based attack vectors, failures of coordination, confusion, and the financial consequences.

It’s worth a read.


Poets & Quants MBAs To Watch: Anthony Harbour

Congratulations to Rotman School of Management graduate Anthony Harbour for being selected by Poets & Quants as one of their MBAs To Watch this year. Anthony was a student in my Catastrophic Failure in Organizations course as part of a great cohort in 2016 (thanks for the shoutout to the course, Anthony!). A Los Angeles native, Anthony came to Rotman with prior experience at the U.S. Securities and Exchange Commission and left a lasting mark on the Rotman School. You can read about his many great contributions to our community on his Poets & Quants profile. Congratulations, Anthony!


Reg AT – Don’t Go There

Craig Pirrong, the Streetwise Professor, recently wrote about his skeptical take on the CFTC’s desire to examine the source code of trading algorithms. The proposed Regulation AT (Automated Trading) has many issues, and Pirrong calls out two of them:

I seriously doubt that the CFTC can attract people with the coding skill necessary to track down errors in trading algorithms, or can devote the time necessary… for a truly effective review.

This is a great point. If the CFTC is so burdened with what’s on their regulatory plate already, how can they possibly add this? And how can the CFTC hope to compete with trading firms for the technical talent required to effectively review such code?

Second, and more substantively, reviewing individual trading algorithms in isolation is of limited value in determining their potentially disruptive effects…

This is because in complex systems, attempts to improve the safety of individual components of the system can actually increase the probability of system failure.

Pirrong is a scholar after our own hearts, and he hits on so many important points here. The theory of complex systems tells us that non-holistic safety mechanisms often make things worse.

For example, after the 2010 Flash Crash, the SEC implemented single-stock circuit breakers. Such measures seem like a good idea, and the circuit breakers often help minimize disruptions. But on August 24, 2015, these single-stock circuit breakers halted trading in 471 different ETFs and stocks. This in turn led to further dislocation as many key ETF liquidity providers simply stopped trading because they could no longer model the baskets of securities that underlie many ETFs.
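The core mechanism is simple enough to sketch. The band width and reference-price logic below are illustrative assumptions, not the actual Limit Up-Limit Down rules, but they show how a price printing outside a band around a rolling reference triggers a halt, and why stale reference prices on a chaotic open can trip hundreds of symbols at once:

```python
# Illustrative sketch of a single-stock circuit breaker. The 5% band and
# the reference-price handling are simplified assumptions, not the actual
# SEC Limit Up-Limit Down parameters.

def should_halt(trade_price: float, reference_price: float,
                band_pct: float = 0.05) -> bool:
    """Halt trading if a trade prints outside the allowed band
    around the rolling reference price."""
    lower = reference_price * (1 - band_pct)
    upper = reference_price * (1 + band_pct)
    return not (lower <= trade_price <= upper)

# On a volatile open like August 24, 2015, a stale reference price
# makes ordinary-looking trades appear far out of band.
print(should_halt(trade_price=38.00, reference_price=42.00))  # True: >5% below
print(should_halt(trade_price=41.50, reference_price=42.00))  # False: within band
```

Each halt, locally sensible, removed an input that ETF market makers needed to price their baskets, which is exactly the kind of component-level "safety" interaction that degrades the system as a whole.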

Worse, if the intent is to prevent Knight-like fiascos, the CFTC should look elsewhere. Knight’s problem wasn’t even a coding error. Knight’s code worked—it was just deployed incorrectly. If that sounds like splitting hairs, that’s precisely the point. These systems are so complicated that code divorced from configuration files and deployment procedures is essentially meaningless.

I understand where the desire for a Reg AT-type solution comes from. The complexity of the financial markets is increasing, and we’ve seen over and over that regulators are struggling to get a handle on things. But if the CFTC really wants a window into the risk of automated trading, they should take a page from the Federal Aviation Administration’s playbook (as we’ve argued before). The FAA supports the airline industry’s quest for safety by cooperatively interfacing with airline-run Safety Management Systems. These systems specify a structure for reporting, discussing, and correcting errors, and for auditing those corrections—largely without the fear of regulatory reprisal.

The CFTC should drop the costly, draconian, ultimately counterproductive Reg AT proposal. Instead, they should consider “Reg SMS,” in which they work with the industry to set up standard error capture, discussion, and QA processes—modeled after the airlines’ Safety Management Systems—so we can all get a handle on this complexity together.

Just as there are best practices for coding, there are best practices for managing complexity. The CFTC needs to look for them.

Complexity Strikes T. Rowe Price

T. Rowe Price: Invest with Confidence… but vote with skepticism?

When we think about complexity, we naturally think about systems that seem risky, like nuclear power, aviation, space flight, the power grid, or high frequency trading.  But a recent, and costly, proxy voting mistake shows that even systems that seem really boring can have big consequences when they fail.

T. Rowe Price, the asset manager, announced this week that it was paying almost two hundred million dollars to clients for mishandling a proxy vote related to the 2013 leveraged buyout of Dell Inc. At the time of the buyout, T. Rowe Price held the computer maker’s shares in a variety of its mutual funds and client accounts.

Even as T. Rowe Price actively opposed the buyout and advocated for a higher price for Dell shares, their proxy voting system mistakenly voted “for” the merger. Like almost all complexity-driven errors, this was a combination of human error (T. Rowe Price employees failed to check that the voting record matched what they expected), external factors (the shareholder vote was postponed several times, which overwrote the “Against” vote that T. Rowe Price recorded), and seemingly benign design decisions with unintended consequences: in this case, that T. Rowe Price’s default vote on a management-supported merger was “For” the proposal.
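The interaction of those three factors can be sketched in a few lines. This is a hypothetical data model, not T. Rowe Price’s actual system; it only illustrates how a benign default plus a postponement that re-issues the ballot silently discards the intended instruction:

```python
# Minimal sketch (hypothetical model) of the failure mode: the default
# vote for a management-supported proposal is "For", and a meeting
# postponement resets any previously recorded instruction to that default.

DEFAULT_VOTE_FOR_MGMT_PROPOSAL = "For"

class ProxyBallot:
    def __init__(self):
        self.vote = DEFAULT_VOTE_FOR_MGMT_PROPOSAL  # benign-seeming default

    def record(self, vote: str):
        self.vote = vote

    def postpone_meeting(self):
        # External event: postponement re-issues the ballot,
        # silently discarding the earlier instruction.
        self.vote = DEFAULT_VOTE_FOR_MGMT_PROPOSAL

ballot = ProxyBallot()
ballot.record("Against")    # the intended vote
ballot.postpone_meeting()   # vote postponed: instruction overwritten
print(ballot.vote)          # "For" -- the default, not the intent
```

No single step here is obviously wrong; the loss only emerges from the combination, which is why a check that the final recorded vote matches the intended one matters so much.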

On May 31, 2016, the court ruled that Dell’s fair value per share was $17.62 and not $13.75. Because of their mistaken vote for the merger, T. Rowe Price’s shareholders were denied the additional $3.87 per share.

In a press release, T. Rowe Price pointed out that the $3.87 difference in share value “[validated] the firm’s original investment thesis.” A validation that’s now resulting in a $200 million loss for the firm. Seems like a Pyrrhic victory.

The challenge for any firm with complex technology like this is that it’s hard to tell where such errors might be lurking. The vast majority of the time, T. Rowe Price’s system recorded the intended vote. The problem is that this mistake came with a large price tag. And more likely than not, the next costly and unexpected error (at T. Rowe Price or another firm) won’t have anything to do with proxy voting. Instead, it will be a mishandled options conversion, dividend election, or something outside of the corporate actions space entirely.

So how can firms protect themselves against the spectrum of possible errors? First, they should treat complexity itself as a risk factor. One way to make that risk explicit is to track near misses: instances where a vote or other corporate action was almost recorded incorrectly but was caught in time. Sensitivity to near misses allows firms to correct deep and systematic errors before they become costly.

Second, recognize that organizational (and technical) boundaries can obscure what’s going on. In this case, interactions between T. Rowe Price’s fund managers and corporate actions group diffused responsibility. And their technology platform, integrated with an external processing agent, didn’t always give the full picture. Boundaries like this create risk.

Finally, design systems defensively with the assumption that individuals are fallible. T. Rowe Price’s corporate action voting system had sensible defaults recorded for the majority of votes. And though there was a process to change the vote, the Dell leveraged buyout was a clear special case, especially as its multiple postponements required multiple votes. Just as happened at Knight Capital (where a technologist failed to roll out new code on all eight of Knight’s servers), humans struggle to accomplish tasks that require exceptional precision with little differentiation. Designing and using checklists can help, but only when supported by an organizational culture of dissent and healthy checks and balances.
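One concrete defensive check implied by the Knight example is easy to automate: before enabling trading, verify that every server reports the same build. The hostnames and version strings below are hypothetical; the point is that the system, not a human eyeballing eight boxes, does the comparison:

```python
# Hedged sketch of a pre-launch consistency check. Server names and
# version strings are hypothetical, not Knight's actual environment.

def all_versions_match(versions: dict[str, str]) -> bool:
    """Return True only if every server reports an identical build."""
    return len(set(versions.values())) == 1

deployed = {
    "server-1": "v1.15.2", "server-2": "v1.15.2",
    "server-3": "v1.15.2", "server-4": "v1.15.2",
    "server-5": "v1.15.2", "server-6": "v1.15.2",
    "server-7": "v1.15.2",
    "server-8": "v1.14.9",  # the one box the rollout missed
}

print(all_versions_match(deployed))  # False: halt the launch, don't trade
```

A check like this turns "did everyone remember server eight?" from a precision-dependent human task into a mechanical gate, which is exactly what defensive design asks for.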

Hat tip to Steve Lofchie at The Cadwalader Cabinet for the story.
