As I waited for the results of the Iowa caucuses Monday evening, I was reminded of something I know only too well as a technologist: technology is greatly misunderstood and often overrated. This is particularly true of software, which is often a black box, where even the developer may not understand everything they've created.

I have developed many products, mostly hardware but an increasing amount of software, and the process is always the same: build it, test it to find the bugs and defects, fix them, and test it again. It's an iterative cycle, repeated over and over. Even at the end, just before shipment, there are often known bugs or defects that go unfixed due to indifference, cost, or the need to ship. And because the engineers cannot, or don't bother to, imagine all the ways their product will be used, failures often occur that were never anticipated.

I sometimes think I know too much about this process, and because of it I imagine what could go wrong. I think of an airplane with thousands of components and subsystems and millions of lines of software code, all of which need to be tested in various sequences and in different situations. Taken together, the odds of a failure can be high, which is why airplanes have redundant systems to minimize the possibility of a catastrophic failure.

Fortunately, most products we buy do not cause a death when they fail. But fail they do. It's the nature of how they are made.

The app used for the Iowa caucuses appears, based on many accounts, to have been developed over a period of just two months with little testing. This is a product that should never have been used. The build, test, and fix cycle appears never to have even begun with this product.

But errors such as this one occur because those buying or using the product just assume that high-tech stuff works. More often than not, it doesn't work exactly as intended. Products combining software and hardware can be so complex that errors are difficult to find. They may occur only after a specific sequence of events or a particular pattern of use, and they may be random and hard to reproduce. The only way to know is to do what the customer will do: test it to death under all sorts of conditions.

The takeaway is that we should not assume a product will always work as intended. We see this every day: our computers freeze, our phones reboot, and software programs crash.

In fact, that's the reason I avoid using tech for mundane things such as turning on lights or remotely controlling my thermostat. Why use tech to do something that is simpler to do without it? That adage also applies to using paper ballots instead of electronic tabulation.

In reaction to the Iowa app, several software engineers, people who know software firsthand, posted their own reactions on Twitter:

People who come to my house are shocked by my non smart home. I know too much about what can go wrong!

21 years, have worked on lots of high availability stuff, can confirm and will add that firmware and hardware are also not to be trusted.

I’ve only got like 15 years or so in Ops but I can say with certainty that you should not only not use software for anything, but that you should also never believe a software developer about anything unless they say that their software can’t do something.

Those in the know are far more cautious than those who are not, who simply assume tech stuff works.

In the specific case of this app, the ignorance of those in charge was apparently bliss. They missed many warning signs because they were technically inept or just assumed tech stuff works. As reported by The Verge:

  • The consulting group that made the app, Shadow, was paid just over $60,000 to develop the app, far less than it should actually cost to develop.
  • The party insisted on “security through obscurity,” arguing that talking about the app too much would give hackers a heads-up to attack it.
  • The app was rushed out so quickly it had to be distributed via two platforms mobile developers use to beta-test their software, TestFlight and TestFairy.
  • Installing an app via these methods is not simple even for savvy users, and having voters install it on their personal phones has security implications that I can’t even begin to enumerate.
  • The app required two-factor authentication (good!), but the instructions were apparently not sufficient, leading to yet more frustration. The New York Times reports that “some [precinct chairs] also took pictures of the worksheets they had been given — the PINs fully visible — and tweeted them out in frustration. Had the app worked, the information might have given trolls or hackers a chance to download the program and tamper with it.” Yikes.
  • The app was distributed using the TestFairy platform’s free tier and not its enterprise one. That means Shadow didn’t even pony up for the TestFairy plan that comes with single sign-on authentication, unlimited data retention, and end-to-end encryption.
  • Shadow itself reportedly didn’t have the coding chops to pull off the app in the first place, especially on such a tight timeline. How carefully was this outfit vetted?
  • The whole ecosystem of tech for campaigns is dysfunctional in the first place, with the wheel getting reinvented every four years and widespread distrust in publicly available tools. Here’s an eye-opening Twitter thread about the issue.
  • Shadow’s previous products caused alarm among security experts. That includes members of the Biden campaign, which ended its partnership with Shadow after its texting platform “did not pass our cybersecurity checklist,” a Biden staffer tells the NYT.
  • The app itself wasn’t stress-tested, and the errors it showed on election night were singularly unhelpful, as Motherboard reported.
  • Local volunteers realized the fiasco was coming and the party didn’t adjust to open up more phone lines to accommodate the change.

What's so distressing is that the Iowa app created havoc from within. We have yet to see the impact of those who purposely create havoc.