The FBI's failure to unlock the cellphone of the Texas church shooter is reigniting the debate over encryption and government access to secured communications.
Earlier this week, FBI Special Agent Christopher Combs blamed industry-standard encryption for blocking investigators' ability to crack the PIN code on the gunman's device.
Software exists to bypass a passcode, but forcing it carries the risk of erasing all of the phone's data. The FBI sent the Texas gunman's phone to its lab in Quantico, Va., to find another way in.
"With the advance of the technology and the phones and the encryptions, law enforcement, whether that's at the state, local or federal level, is increasingly not able to get into these phones," Combs said.
But some experts say this is less about encryption and more about the FBI's failure to act swiftly to access information on tech devices. Many argue investigators lack the technical expertise and budget to properly examine these devices.
Peter Swire, a professor of law and ethics at the Georgia Institute of Technology, tells Here & Now's Robin Young the FBI lost critical time when it failed to reach out to Apple for help within the first 48 hours of obtaining the phone, which is reportedly an iPhone.
In a statement, Apple said it offered to assist law enforcement on the day of the shooting. Apple said it could have advised the FBI to use the dead shooter's fingerprint to unlock the phone, but after 48 hours that option is no longer available.
"People lose phones, and then they can take measures to try to get in. If it's in the first 48 hours, you assume it's the real person doing it," Swire says. "To leave it open forever to that fingerprint, for instance, means that you could go after somebody whose phone you've stolen, get a fingerprint off their glasses weeks later ... break into their house and get their fingerprint. And that would be a huge security vulnerability."
Swire says police officers need better training in what technical options are available for accessing information.
"The police sometimes could go to apps, the police can go to the cloud, the police can go to a lot of places for evidence," he says. "It's not as though the phone is the only source. And if we had better training on what is available, it might not feel so painful for what isn't available."
Swire explains that if the shooter uploaded information from his phone to iCloud, the FBI could serve a warrant to obtain the information.
As NPR's Alina Selyukh reported, encryption doesn't completely stop surveillance "but pushes it further out from the networks that deliver the communications — where they are scrambled — to the devices where it gets unscrambled for the user."
The debate over encryption began decades ago, but it most recently entered the spotlight in the aftermath of the December 2015 mass shooting in San Bernardino, Calif. The fight erupted after Apple refused a court order to write special software that would have created a back door into the shooter's iPhone.
The FBI and Apple argued the case in court and in Congress, but in the end, the FBI paid an unknown third party more than $1 million to unlock the phone without Apple's help.
At the time, Apple said that creating a back door would "undermine the very freedoms and liberty our government is meant to protect." The government argued that stronger encryption mechanisms were making devices "warrantproof."
"I don't see a situation where the government ... is going to force Apple to roll back encryption of the iPhone. I think that ship's sailed," Christopher Soghoian, former principal technologist at the American Civil Liberties Union, told NPR in 2016. "Law enforcement has to deal with the fact that we live in the world of encryption. And the way the feds are dealing with it is embracing the hacking."
Ethicists argue that if Apple wrote software for one terrorist's cellphone, it would create a precedent for foreign governments, such as China and Russia, to make similar demands for devices in their countries.
"We have to have a free society, and when government has the right reason they get to go look at stuff," Swire says. "The police have access to lots and lots and lots of things, but that doesn't mean they should break everybody's computers and cellphones."