Spreadsheet Screw-ups (or bad choice of Excel for serious purposes)

Not a lot of detail here, but:

CHICAGO (CBS) — A paperwork mistake caused delays for travelers on Southwest Airlines Wednesday.

Southwest halted flights at Midway International Airport while the issue was fixed. The airline blames a spreadsheet error that occurred during a system update.

It caused problems with aircraft weight and balance data.

Southwest said it was only a paperwork issue, and not a flight safety issue.

Why are spreadsheets involved here? This sounds like a bad idea.

Error in Canadian spreadsheets… but only the French-language ones.

To qualify for the CEWS (the Canada Emergency Wage Subsidy, "SSUC" in French) – which has so far helped nearly 400,000 companies with subsidies totalling $59 billion – employers must fill out "spreadsheets": Excel spreadsheets that the CRA has been offering on its website since March 15.

However, the spreadsheet available to companies for the claim periods from July 5 to August 29 contained several major errors until last Friday, nine days before the deadline for subsidy applications.

One of them had been in the document for nearly four months. The most recent "was only present from November 23 to November 30, though it may have appeared a little earlier," the CRA admitted in an email sent to Le Devoir.

However, the problem the CRA fixed at that point had nothing to do with the language of the spreadsheet … nor with the bug that Mr. Reed discovered. Instead, what was fixed last Friday was the bug affecting business owners. According to Le Devoir's review, that issue had been present in this spreadsheet since at least October 5.

This mistake had the effect of drastically reducing the calculated support for businesses, but only for certain owners and their family members. It existed only in the French version of the spreadsheet.

Did you get to calculate A(double dot) X? :grinning:

Via Matt Levine’s newsletter (Bloomberg)

Morningstar

Yesterday the U.S. Securities and Exchange Commission sued Morningstar Credit Ratings LLC for being too generous when it rated commercial mortgage-backed securities, and it is another software-design horror story, though of an opposite kind.

In commercial mortgage-backed securities, CMBS, banks package a bunch of commercial mortgages into a pool and then issue securities—“certificates”—backed by the pool. If mortgages in the pool default, some of the certificates—the junior tranches—will lose money first, while other certificates—the senior tranches—won’t lose anything until the junior tranches are completely wiped out. This means that the senior tranches are very safe and can get AAA ratings from ratings firms like Morningstar, which means very conservative and regulated investors can buy them.
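
To put a toy number on that waterfall, here is a minimal sketch in Python; the tranche names and sizes are my own invented illustration, not anything from the article or from any actual deal.

```python
# Toy CMBS loss waterfall: losses hit the most junior certificates first.
# Tranche names and sizes are invented for illustration only.

def allocate_loss(tranches, pool_loss):
    """tranches: list of (name, size) ordered senior -> junior.
    Returns remaining principal per tranche after `pool_loss`
    is absorbed from the bottom up."""
    remaining = {}
    for name, size in reversed(tranches):   # junior tranches absorb losses first
        hit = min(size, pool_loss)
        remaining[name] = size - hit
        pool_loss -= hit
    return {name: remaining[name] for name, _ in tranches}

deal = [("AAA senior", 77.0), ("BBB mezzanine", 15.0), ("B junior", 8.0)]
print(allocate_loss(deal, pool_loss=10.0))
# The junior tranche (8.0) is wiped out, the mezzanine absorbs the remaining 2.0,
# and the AAA senior tranche is untouched.
```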

Much of the game of issuing CMBS, then, is about maximizing the portion of the securities that get high (ideally AAA) ratings. If you pool a bunch of conservative low-loan-to-value loans against a diversified pool of good properties into a CMBS, a lot of the certificates will be AAA; if you pool a bunch of risky loans against garbage, more of the certificates will be BBB or worse.

Ratings firms have a natural and well understood conflict of interest. On the one hand, they should try to give things the correct ratings; if you hand them a pool of risky loans against garbage, they should give the securities relatively low ratings (not much AAA), to reflect the actual probability of default and uphold their standards of intellectual honesty and protect innocent investors and so forth. On the other hand, the banks that issue CMBS want good ratings (lots of AAA), and if a ratings firm doesn’t give out lots of AAA ratings the banks (who build the CMBS, pick the ratings agencies and pay them) will go somewhere else. Also to be fair investors want lots of AAA ratings; most (not all!) of the time, everyone is happiest if they just all pretend that everything is AAA and the ratings agencies back them up.

But this conflict is, again, very well understood, and politicians and regulators hate it, and the SEC monitors the ratings firms to make sure they’re not just giving everything AAA ratings. One way to do that is by regulating their ratings models, or at least their disclosure of their models. The ratings firms have to have quantitative models that take inputs—about the cash flows of the buildings in the CMBS pool, etc.—and apply some predictable process to them to decide how much of the pool gets rated AAA, etc. And they have to disclose how those models work.

In theory the purpose of this is so that investors can understand what the ratings mean. In practice the purpose is to let the SEC sue ratings firms if they nudge ratings up on an ad hoc basis. The investors don’t actually care how the model works or what the ratings mean; they just want to buy AAA-rated stuff with high yields. But once the ratings firm writes down how its model works, if it deviates from the model to please a bank, the SEC can sue it. “You said that you rated CMBS in a principled way based on standard criteria, and instead you just rated everything AAA, so, fraud.” From the SEC’s press release yesterday:

“To increase transparency and guard against conflicts of interest, the federal securities laws require credit rating agencies to disclose how ratings are determined and to have effective internal controls to ensure they adhere to their ratings methodologies,” said Daniel Michael, Chief of the SEC Enforcement Division’s Complex Financial Instruments Unit. “In this action, the complaint alleges that Morningstar failed on both counts by permitting analysts to make undisclosed adjustments over which Morningstar had no effective internal controls.”

Fine. Morningstar had a model for rating CMBS, which it disclosed in documents on its website called “CMBS New-Issue Ratings Opinions” and “CMBS Subordination Model.” Here’s how the SEC describes it:

The first step of Morningstar’s rating process was to underwrite a representative sample of the pool of commercial real estate loans that collateralized each CMBS transaction. … Through this underwriting process, Morningstar calculated the expected net cash flow that each commercial property would generate over the life of the loan, along with the value of each property, using a capitalization rate that Morningstar determined. As a result, the key outputs of the underwriting process were the net cash flow and capitalization rate for each loan.

The next step of Morningstar’s disclosed methodology, as explained in the publicly available CMBS Subordination Model document, was to input each loan’s net cash flow and capitalization rate from the underwriting process into Morningstar’s Subordination Model, an Excel spreadsheet. The Subordination Model then subjected these values to “defined sets of stresses” to assess the likelihood of loans to default at each rating category. The model’s outputs showed the loans’ losses under various economic scenarios, expressed as a percentage of the total value of the CMBS certificates being issued. Those percentages were the model-generated subordination levels, or credit enhancement, for the various rating categories.

Intuitively, if severe stress would lead to 23% defaults, then at most 77% of the pool could be AAA-rated, etc.
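
Here is that arithmetic as a tiny sketch (my own toy numbers, not Morningstar's actual model): the stressed loss sets the subordination level, and at most whatever sits above it can be rated AAA.

```python
# Toy illustration of subordination levels: NOT Morningstar's model.
# Assumption: the stressed pool loss at a rating level sets the credit
# enhancement (subordination) required below that level.

def max_senior_share(stressed_loss_pct: float) -> float:
    """If severe stress wipes out `stressed_loss_pct` of the pool, at most
    (100 - stressed_loss_pct)% of the certificates can sit above that loss
    and still come out whole in that scenario."""
    return 100.0 - stressed_loss_pct

aaa_stress_loss = 23.0  # the 23% example from the text
print(f"AAA subordination level: {aaa_stress_loss:.1f}%")
print(f"Max AAA-rated share of the deal: {max_senior_share(aaa_stress_loss):.1f}%")
```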

Did you catch the bad words in that passage? The bad words were “an Excel spreadsheet.” Morningstar had a ratings model that was subject to careful regulatory scrutiny; Morningstar disclosed how it worked, and investors supposedly relied on that disclosure when they bought CMBS. But the model lived in Microsoft Excel. If you were the Morningstar analyst working on rating a new deal, you copied your last deal into New_Deal.xlsx, and you started typing. You could type lots of things! It’s Excel, why not. The SEC complains:

Morningstar failed to disclose that a central feature of its Subordination Model allowed analysts to make “loan-specific” adjustments to the disclosed net cash flow and capitalization rate stresses.

The adjustments were made in two columns of cells in the Excel spreadsheet … labelled “LOAN SPECIFIC ADJUSTMENTS TO BASE N[ET] C[ASH] F[LOW] STRESS,” and the other was labelled “LOAN SPECIFIC ADJUSTMENTS TO BASE CAP RATE STRESS.” …

Other than the labels, Morningstar provided its analysts with no criteria or guidance for when or how to employ these adjustments. … Nor did the Subordination Model constrain how large the analyst-employed stress adjustments could be.

Even the column labels in the Subordination Model’s Excel spreadsheet failed to constrain the use of the adjustments. Morningstar’s corporate representative said that analysts could use the adjustments for reasons having nothing to do with a specific loan, such as to nudge a rating produced by the model to align with expectations. Specifically, Morningstar’s analysts could use the “loan-specific” stress adjustments when “the aggregate levels that are spit out by the model are either too high or too low relative to other similar transactions we’ve looked at.”

“Even the column labels in the Subordination Model’s Excel spreadsheet failed to constrain the use of the adjustments”! Even the column labels! The analysts looked at the columns labeled “do some fudging here, but not in a bad way,” and they did some fudging in a bad way, and now the SEC is mad. Even the column labels! Citi’s problem is that it had an opaque high-strung bit of software that, if you didn’t check exactly the right boxes, would wire hundreds of millions of dollars out of the bank for its own perverse amusement. Morningstar’s problem is that it put a highly regulated model in a regular old Excel spreadsheet where analysts could type whatever they wanted, and did.

I wonder how many Highly Regulated Excel Spreadsheets there are in the financial industry. Thousands, surely. There you are, doing your job, in your Highly Regulated Excel Spreadsheet. And you get some result you don’t like and you say, well, I dunno, I’ll just multiply everything by 1.02, that seems fine. And then years later regulators are like, no no no, that was a Highly Regulated Excel Spreadsheet, the column labels were sacrosanct, you can’t just type whatever you want there. But of course you could just type whatever you wanted there, because it was in Excel and that’s how Excel works.
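
To make the fudging concrete, here is a deliberately crude caricature of what an unconstrained "loan-specific adjustment" cell does to a stressed-loss calculation. The formula, the function name and every number are my own inventions; the complaint describes the real Subordination Model but does not reproduce it.

```python
# Caricature of the "loan-specific adjustment" columns described in the
# SEC complaint. Invented formula and numbers, not the actual model.

def stressed_loss(net_cash_flow, cap_rate, loan_balance,
                  ncf_stress, cap_rate_stress,
                  ncf_adj=0.0, cap_adj=0.0):
    """Loss on a loan when its collateral is revalued under stress.
    `ncf_adj` and `cap_adj` play the role of the two unconstrained
    'LOAN SPECIFIC ADJUSTMENTS' columns: nothing limits their size."""
    stressed_ncf = net_cash_flow * (1 - ncf_stress + ncf_adj)
    stressed_value = stressed_ncf / (cap_rate + cap_rate_stress + cap_adj)
    return max(0.0, loan_balance - stressed_value) / loan_balance

# Base case: the disclosed stresses only
base = stressed_loss(net_cash_flow=6.0, cap_rate=0.07, loan_balance=100.0,
                     ncf_stress=0.25, cap_rate_stress=0.02)

# Same loan after an analyst types a favourable adjustment into the cells
fudged = stressed_loss(net_cash_flow=6.0, cap_rate=0.07, loan_balance=100.0,
                       ncf_stress=0.25, cap_rate_stress=0.02,
                       ncf_adj=0.10, cap_adj=-0.01)

print(f"stressed loss, disclosed stresses only: {base:.1%}")
print(f"stressed loss, with 'loan-specific' fudge: {fudged:.1%}")
```

Lower stressed losses mean lower required subordination, which means a bigger AAA slice, and nothing in the spreadsheet stops the adjustment from being whatever gets you there.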

Here are key links –

the SEC press release about the lawsuit:
https://www.sec.gov/news/press-release/2021-29

The complaint:
https://www.sec.gov/litigation/complaints/2021/comp-pr2021-29.pdf

I’m trying to figure out how to do this for https://www.actuarial.news/

Hmmm.

tl;dr:

The SEC is pissed that Morningstar used a spreadsheet model with no effective controls to do its CMBS ratings.

Various analysts could (and maybe did) adjust the models to hit whatever target they wanted.

As Matt Levine says, “…it was in Excel and that’s how Excel works.”

Financial Times Article

Another article about the COVID screw-up in which roughly 16,000 test records were lost because Excel ran out of rows. Estimated lives lost: 1,500 :grimacing:

from the article:

The story about Excel running out of numbers just seemed so bizarre at first. That’s why we were sharing Clippy memes, and why I took pleasure in teasing Gates about it. But his response, which seemed po-faced at the time, was right. He wasn’t laughing, because he understood that this wasn’t a comedy; it was a tragedy.

Fetzer and Graeber have calculated a conservative estimate of the number of people who died, unknown victims of the spreadsheet error. They think the death toll is at least 1,500 people.

So the next time there’s a pandemic, let’s make sure we have our spreadsheets in order. After all, as Luca Pacioli, the father of accounting, warned us more than five hundred years ago, without a good spreadsheet, you will grope your way forward, “and you may meet great losses”.

One thousand five hundred deaths. Relative to the scale of the whole pandemic, this is just a sliver of the total tragedy. But as the needless price of bad data management, they are great losses indeed.

2 Likes

When the Defense Information Systems Agency sought a new satellite services acquisition on behalf of the Navy, it included a spreadsheet so bidders could fill in their prices. But the spreadsheet included the prices from the current contract, which were supposed to be inaccessible.

Joe Petrillo: Sure. This is another Excel spreadsheet disaster, and we talked about one a few weeks ago. It involved an acquisition of commercial satellite telecommunications services for the Navy's Military Sealift Command, divided into both bandwidth and non-bandwidth services, and the contract would be able to run for up to 10 years in duration. Part of the contract, as you said, was an Excel spreadsheet of the various line items with blanks for offerors to fill in their prices. Unfortunately, this spreadsheet had hidden tabs, 19 of them, and those included, among other things, historical pricing information from the current contract. So Inmarsat, which was the incumbent contractor holding that contract, notified the government and said, look, you've disclosed our pricing information, do something about it. So the government deleted the offending spreadsheet from the SAM.gov website. But, as they understood and as was in fact the case, third-party aggregators had already downloaded it, and it was out there; it was available.
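
Incidentally, checking for non-visible tabs before a workbook goes out the door takes only a few lines. A rough sketch with openpyxl, where the file name is made up:

```python
# Flag hidden and "very hidden" sheets before publishing a workbook.
# Minimal sketch; "solicitation_pricing.xlsx" is a made-up file name.
from openpyxl import load_workbook

wb = load_workbook("solicitation_pricing.xlsx")

hidden = [ws.title for ws in wb.worksheets if ws.sheet_state != "visible"]
if hidden:
    print(f"WARNING: {len(hidden)} non-visible sheet(s): {hidden}")
else:
    print("All sheets are visible.")
```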

https://www.nature.com/articles/d41586-021-02211-4

Autocorrect errors in Excel still creating genomics headache

Despite geneticists being warned about spreadsheet problems, 30% of published papers contain mangled gene names in supplementary data.

Am I the only one who thinks this is on the geneticists for naming genes using month abbreviations? Like they couldn’t come up with a slightly different acronym. It’s also hilarious that they don’t proofread specifically for these errors.

2 Likes

The point is, unfortunately, that most people don’t “proofread” their spreadsheets.

They don’t even do the most basic input data validation. That you’re getting the same number of records out as you put in, for example. That the gene name results you see are what you expect. Etc.
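
Both of those checks are only a few lines. A rough sketch in pandas, where the file names and the "gene" column name are hypothetical: compare record counts in and out, and flag gene-name cells that Excel has silently turned into dates.

```python
# Basic sanity checks of the kind described above. The file names and the
# "gene" column name are hypothetical.
import datetime as dt
import re
import pandas as pd

source = pd.read_csv("genes_in.csv")       # the list that went in
result = pd.read_excel("genes_out.xlsx")   # what came back out of Excel

# 1. Same number of records out as went in
assert len(result) == len(source), (
    f"record count changed: {len(source)} in, {len(result)} out")

# 2. Gene names that Excel autocorrected into dates (e.g. SEPT2 -> 2-Sep)
DATE_LIKE = re.compile(r"^\d{1,2}-[A-Za-z]{3}$|^[A-Za-z]{3}-\d{2,4}$")

def looks_like_date(value) -> bool:
    if isinstance(value, (dt.date, dt.datetime, pd.Timestamp)):
        return True
    return isinstance(value, str) and bool(DATE_LIKE.match(value.strip()))

mangled = result[result["gene"].map(looks_like_date)]
if not mangled.empty:
    print(f"{len(mangled)} gene name(s) look like dates:")
    print(mangled["gene"].to_string())
```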

1 Like

https://www.techrepublic.com/article/excel-is-still-a-security-headache-after-30-years-because-of-this-one-feature/

Microsoft released Excel 4.0 for Windows 3.0 and 3.1 in 1992, and many companies still use its macro functionality in legacy operations. The problem is that bad actors have started using Excel sheets and macros as a new way to deliver malware.

Tal Leibovich, head of threat research at Deep Instinct, explained at a presentation during DEFCON 29 why this legacy scripting language has been the vehicle for a recent rise in malware delivery. Leibovich presented “Identifying Excel 4.0 Macro strains using Anomaly Detection” with Elad Ciuraru last week. Deep Instinct is a cybersecurity company specializing in endpoint protection and using deep learning to stop cyberattacks.

Security organizations first noticed a spike in this kind of attack in March 2020. Microsoft released a new runtime defense against Excel 4.0 macro malware in March. Leibovich said that he has seen a substantial increase over the last two years in hackers using Excel 4.0 macros in attacks.

1 Like

Not a very serious error:

https://www.cbssports.com/mlb/news/will-the-yankees-bats-cool-down-at-the-oakland-coliseum-plus-other-best-bets-for-thursday/

I’d like to start today’s newsletter off with a correction because this is a newsletter that takes accuracy seriously. In yesterday’s edition, I wrote that the Tampa Bay Rays were 45-40 this season as underdogs. That’s not true. I accidentally looked at the wrong column of one of my spreadsheets – “one of my spreadsheets” is the coolest thing anybody has ever written – and I’m embarrassed that I didn’t realize it right away.

The truth is that the Rays were 26-23 as underdogs heading into the game. They’re now 27-23 after beating the Phillies, so hopefully, the win helps you forgive me for the mistake easier.

1 Like

:woman_facepalming:

2 Likes

More on the item in Nature

This is an argument that sometimes it’s a good choice:

A Senate committee has heard that Services Australia staff were forced to revert back to Excel spreadsheets to rectify some of the errors made by Centrelink’s automated income compliance program.

The remarks were made on Friday during a hearing held by the Community Affairs References Committee as part of its ongoing probe of Centrelink’s compliance program.

Representatives from the agency responsible for the program that colloquially became known as robo-debt were asked if Excel spreadsheets, rather than computer systems, were being used to manually view debt files to determine and calculate refunds.

“No, and yes – if I can put it in those terms,” Services Australia integrity strategy, engagement, and policy branch national manager Robert McKellar said.

“The use of Excel spreadsheets was a part of the process of disaggregating what we would call mismatched records and multiple debts where there was a necessity to conduct some detailed analysis, where a customer had multiple assessments against their record and we needed to identify the correct one to apply back into the customer’s record on our main system in order to correctly assess whether a debt included averaging income information or not.”

He said that was not a functionality available on the main system, but explained the Excel spreadsheet process was needed to assess the correct information in order to record that back onto customers’ records.

It's basically saying that their system is screwed up, and Excel is being used as a kludge to make up for that deficiency.

Excel: the duct tape of finance systems.

That said, I have some sympathy for this department, because they were being made to deal with a government-created problem (and yes, they’re part of the government, too)…

What a mess.

3 Likes

“No and yes…” and the run-on sentence that followed reminds me of “Yes, Prime Minister”. :grinning_face_with_smiling_eyes:

1 Like

The guy in the photo failed upwards.

Sure… Excel is superior to any system that you can’t readily edit/see/adapt/upgrade/fix/comprehend.