
Electronic voting machines are broken—and here’s the code to prove it

‘To [senators], “certified” is like putting “organic” on a yogurt cup.’


John-Michael Bond


As allegations of vote rigging and manipulation continue to plague the 2016 election, one developer is heading straight to the source to investigate—the source code, that is.


Regardless of what state you live in, one common denominator is direct recording electronic (DRE) machines. While some states use paper ballots and only five states (Delaware, Georgia, Louisiana, South Carolina, and New Jersey) use DRE machines exclusively, many use a combination of both, which means the machines are used in some capacity all around the country. While it’s understandable to expect slight variations from state to state, you would expect some controls to be in place to ensure the code running voting machines is secure. But the reality makes dimpled chads look like a walk in the park.

Following the last election, Emily Gorcenski, a developer and writer in Virginia, was curious about those standards. What security measures are in place to ensure the software code regulating our elections doesn’t accidentally switch or erase a vote?

After a viral tweetstorm about how one machine’s source code guidelines focused on the code’s style rather than how it functioned, Gorcenski took to the software collaboration site GitHub to report her findings in further detail.


Voting machines have a lot of components that need to be tested. Electrical systems, physical cases, security locks, scanners, speakers, and all that stuff has to meet certain engineering quality standards. Certification bodies exist to test these machines. However, software is a more mysterious being. Software verification is very hard; in fact, it’s one of the hardest problems out there.

Software can introduce lots of failures: it can change a vote, it can count a vote twice or not at all, it can lose votes. So one would expect that voting machine software is thoroughly checked and ensured that the answer it outputs is always the correct answer. My findings indicate that this is not the case.

Furthermore, Gorcenski says, “The software standards for voting machines mostly govern code style … [which] is form, not function.” And “most of the style review was performed by automated tools, not humans. This means that backdoors or other attempts at deliberate malfeasance can be easy to sneak into the software.”

In one example, Gorcenski cites how a piece of code could allow one candidate to get their votes double-counted, but only after the machine has recorded 1,000 votes. So if the machine was tested for just 100 votes, the anomaly could slip by undetected; the automated checkers would miss it because it is stylistically correct. 
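A trigger like the one Gorcenski describes can be written in perfectly tidy style. The Python sketch below is a hypothetical illustration (all names are invented; the article does not publish any actual machine code): a style checker sees clean, idiomatic code, but the tally only misbehaves after the 1,000th vote, so a 100-vote certification test never reaches the defect.

```python
# Hypothetical illustration of a style-clean backdoor, NOT actual
# voting machine code. A linter sees idiomatic Python; only a
# functional test that passes 1,000 votes reveals the defect.

class Tally:
    """Accumulates votes per candidate."""

    def __init__(self):
        self.counts = {}
        self.total = 0

    def record_vote(self, candidate):
        weight = 1
        # Backdoor: after 1,000 total votes, one candidate's ballots count twice.
        if self.total >= 1000 and candidate == "Candidate A":
            weight = 2
        self.counts[candidate] = self.counts.get(candidate, 0) + weight
        self.total += 1


# A 100-vote certification test never reaches the trigger:
tally = Tally()
for _ in range(100):
    tally.record_vote("Candidate A")
print(tally.counts["Candidate A"])  # 100 -- looks correct
```

Run the same loop 1,100 times and the tally reports 1,200 votes for Candidate A, even though only 1,100 ballots were cast.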

In a situation like this, where the standard simply checks that the style is correct rather than ensuring the code is secure, there are some gaping opportunities for misdeeds—especially when you consider that some of these machines can be updated via USB. So while these machines are checked for obvious security problems, like not leaving in default passwords and ensuring physical locks work, the checks don’t extend to evaluating the machine’s source code for security flaws.


According to Gorcenski, some security issues—such as weak encryption, deliberate backdoors, and vote manipulation—are only visible through examining the source code itself.

We spoke to Emily Gorcenski about the country’s biggest blind spots in our electronic voting machine software, what inspired her investigation, and what America can do better in the future to keep our voting machines more secure.

What inspired you to start looking at the code in the first place?

A few days before the viral thread that led me to look at the code and set up a GitHub repository, I had another thread that got traction where I compared voting machines to medical devices: 


I’ve worked in medical device R&D and aerospace R&D, so I have a lot of first-hand knowledge of developing software in regulated spaces. From that thread, I was pointed to the fact that federal certifications do exist, but they only apply if a company wants to sell a voting machine in a state that requires adherence to those certifications. [Editor’s note: Voting system regulations are determined at the state level, and according to the Election Assistance Commission (EAC), 16 states have no requirement that their voting machines be tested by a federally accredited laboratory or meet the requirements of federal certification.]


“States need to strengthen their local regulations to require testing and certification of voting devices.”

What was your first reaction when you saw there was no codified set of federal regulations for electronic voting machines?

It didn’t really shock me at all. The federal government has little power over managing elections, and generally speaking, regulations that apply to devices tend to apply to broad industries: medical devices, vehicles, aircraft. Electronic voting is still a fairly niche market. President and vice-president are the only federal elected positions, and those elections only happen once every four years; twice, if we count primaries. Everything else is state or local level, so it makes sense to expect state-by-state variation in how voting is handled.

What changes do you believe need to be put in place to help fix the problems outlined in your report?


States need to strengthen their local regulations to require testing and certification of voting devices. Random audits should be a part of the voting process, and paper copies of ballots should be mandatory. At the very least, the certification process would need to include an integrity analysis of all vote-handling code pathways and a complete cybersecurity analysis, which would have an expiry date, as threats evolve rapidly. 
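The random audits Gorcenski recommends can be as simple as pulling a random sample of paper ballots and comparing them against the machine’s electronic records. The sketch below uses invented data and is not a statistically rigorous risk-limiting audit; it only illustrates the basic spot-check idea.

```python
import random

# Minimal sketch of a spot audit: draw a random sample of paper ballots
# and compare each against the corresponding electronic record.
# Invented data; a real risk-limiting audit sets the sample size and
# escalation rules with rigorous statistics.

paper_ballots = ["A", "B", "A", "A", "B"] * 200    # 1,000 paper records
machine_records = list(paper_ballots)              # what the machine logged

sample_ids = random.sample(range(len(paper_ballots)), k=50)
mismatches = sum(1 for i in sample_ids
                 if paper_ballots[i] != machine_records[i])

print(f"{mismatches} mismatches in 50 sampled ballots")
# Any mismatch would trigger a wider hand count.
```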

To make manufacturers’ lives easier, the certification effort needs to have a modification pathway to reduce certification overhead costs when making minor changes to the code, e.g. to update code in response to a new virus. Right now, engineers at these companies are letting bugs live in the wild because fixing them would trigger a recertification effort. This needs to be fixed.

I also feel like a push toward open-source development is critical to ensure that the voting systems can be trusted. Open-source software and hardware could, with proper support, create a very trustworthy electronic voting system. The federal government could help fund this through grants.

There are still senators who don’t use email. Do you think these problems are due to a lack of understanding of how software works?


I think that at that level senators don’t care about the technical details; to them, “certified” is like putting “organic” on a yogurt cup. We don’t care about the details; we just want to know that it means something, even though it often doesn’t. Broadly speaking, regulators cannot move at the pace of the software industry. This is why we need folks like USDS and 18F to help keep things adapting and to be a bridge to work with industry.

What sort of codified federal regulations would you like to see for electronic voting machines? 

I’m not a lawyer, so I don’t even know if voting machines could be federally regulated. But the government has in the past induced states to adopt certain laws: Look at the minimum drinking age. First, we’d need the EAC and NIST to be much more transparent and flexible in developing software standards for these machines. The hardware validation, for what it’s worth, is quite good, but software is always where the risk profile resides.

One thing that is desperately needed, though, is regulations that include civil and criminal penalties for willfully failing to address vulnerabilities. If you knowingly leave a critical bug in a piece of medical device software, you are civilly—and possibly criminally—liable for any harm that occurs. This needs to happen for voting machines. It’s simply unfathomable that manufacturers knowingly leave these defect-ridden devices in the field. And they do it because there is no feedback pathway that makes them pay if something anomalous happens.


“Version control should be mandatory: All code should be visible to all developers, and they should be able to see when and who made a change.” 

Other than manually going through each line of code, what options do we have for improving the review process of our voting machine code?

The software industry has developed more tools than you can imagine to handle some of these things: unit tests, integration tests, and behavior-driven development models would all catch this if implemented properly. Version control should be mandatory: All code should be visible to all developers, and they should be able to see when and who made a change. Without it, it’s possible for a rogue employee to sneak some code in that alters voting records without anyone noticing. This is much more difficult, though not impossible, with the version control systems that are now standard in the tech industry.
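A functional unit test, unlike a style check, exercises what the code actually does. The hypothetical sketch below (names invented; not from any real certification suite) shows the pattern that would catch the 1,000-vote anomaly described earlier: test the tally well past realistic volumes and check it against a simple one-ballot-one-vote reference.

```python
# Hypothetical sketch of a functional unit test for vote-counting logic.
# The key idea: verify behavior at several scales, including well past
# common trigger thresholds, instead of stopping at 100 votes.

def count_votes(ballots):
    """Reference tally: one ballot, one vote. Any machine
    implementation must agree with this for every input."""
    counts = {}
    for candidate in ballots:
        counts[candidate] = counts.get(candidate, 0) + 1
    return counts


def test_tally_matches_ballot_count():
    # Exercise the tally at 10, 100, 1,000, and 10,000 ballots per candidate.
    for n in (10, 100, 1_000, 10_000):
        ballots = ["A", "B"] * n
        counts = count_votes(ballots)
        assert counts["A"] == n and counts["B"] == n
        # Total recorded votes must equal total ballots cast.
        assert sum(counts.values()) == len(ballots)


test_tally_matches_ballot_count()
print("tally checks passed")
```

A backdoored tally that double-counts past 1,000 votes would fail the `n = 1_000` and `n = 10_000` cases immediately.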

What reasons can you think of for the lack of standardized regulation on how we handle voting machines and their code across the country? 


The software industry is generally averse to regulation. It’s expensive to comply, it’s burdensome, and it makes it difficult to switch to new tools, new languages, etc. Regulations and standards do make it challenging to work in software: Companies leave bugs in software because the rules aren’t written in a way that lets them fix bugs cheaply and easily. And there aren’t really any model industries out there that have proven that agile regulations are possible. Combined with the constitutional issues that delegate control over voting procedures to the states, I don’t know that this will change anytime in the near future. But the existing certification guidelines can be updated, and this should be a core focus of the EAC.

The Daily Dot