Does the NSA Tap That? What We Still Don’t Know About the Agency’s Internet Surveillance
Among the snooping revelations of recent weeks, there have been tantalizing bits of evidence that the NSA is tapping fiber-optic cables that carry nearly all international phone and Internet data.
The idea that the NSA is sweeping up vast data streams via cables and other infrastructure — often described as the “backbone of the Internet” — is not new. In late 2005, the New York Times first described the tapping, which began after the Sept. 11, 2001 attacks. More details emerged in early 2006 when an AT&T whistleblower came forward.
But as with other aspects of NSA surveillance, virtually everything about this program is highly secret, leaving us far from a full picture.
Is the NSA really sucking up everything?
It’s not clear.
The most detailed, though now dated, information on the topic comes from Mark Klein. He’s the former AT&T technician who went public in 2006 describing the installation in 2002-03 of a secret room in an AT&T building in San Francisco. The equipment, detailed in technical documents, allowed the NSA to conduct what Klein described as “vacuum-cleaner surveillance of all the data crossing the internet -- whether that be peoples' e-mail, web surfing or any other data.”
Klein said he was told there was similar equipment installed at AT&T facilities in San Diego, Seattle, and San Jose.
There is also evidence that the vacuuming has continued in some form right up to the present.
A draft NSA inspector general's report from 2009, recently published by the Washington Post, refers to access via two companies “to large volumes of foreign-to-foreign communications transiting the United States through fiber-optic cables, gateway switches, and data networks.”
Recent stories by the Associated Press and the Washington Post also described the NSA’s cable-tapping, but neither included details on the scope of this surveillance.
A recently published NSA slide, dated April 2013, refers to so-called “upstream” collection of “communications on fiber cables and infrastructure as data flows past.”
These cables carry vast quantities of information, including 99 percent of international phone and Internet data, according to research firm TeleGeography.
This upstream surveillance is in contrast to another method of NSA snooping, Prism, in which the NSA isn’t tapping anything. Instead, the agency gets users’ data with the cooperation of tech companies like Facebook and Google.
Other documents leaked by Edward Snowden to the Guardian provide much more detail about the upstream surveillance by the British Government Communications Headquarters (GCHQ), the NSA’s U.K. counterpart.
GCHQ taps cables carrying Internet and phone data where they land in the United Kingdom. According to the Guardian, unnamed companies serve as “intercept partners” in the effort.
The NSA is listening in on those taps too. By May 2012, 250 NSA analysts along with 300 GCHQ analysts were sifting through the data from the British taps.
Is purely domestic communication being swept up in the NSA’s upstream surveillance?
It’s not at all clear.
Going back to the revelations of former AT&T technician Mark Klein — which, again, date back a decade — a detailed expert analysis concluded that the secret NSA equipment installed at an AT&T building was capable of collecting information “not only for communications to overseas locations, but for purely domestic communications as well.”
On the other hand, the 2009 NSA inspector general report refers specifically to collecting “foreign-to-foreign communications” that are “transiting the United States through fiber-optic cables, gateway switches, and data networks.”
But even if the NSA is tapping only international fiber optic cables, it could still pick up communications between Americans in the U.S.
That’s because data flowing over the Internet does not always take the most efficient geographic route to its destination.
Instead, says Tim Stronge of the telecom consulting firm TeleGeography, data takes “the least congested route that is available to their providers.”
“If you’re sending an email from New York to Washington, it could go over international links,” Stronge says, “but it’s pretty unlikely.”
That’s because the United States has a robust domestic network. (That’s not true for some other areas of the world, which can have their in-country Internet traffic routed through another country's more robust network.)
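The routing behavior Stronge describes can be illustrated with a toy model: pick the path with the lowest total link cost (here, congestion) rather than the shortest geographic route. This is a simplified sketch, not how any real carrier's routers are configured, and all the city names and link weights are hypothetical.

```python
import heapq

def least_congested_path(graph, src, dst):
    """Dijkstra's algorithm over link-congestion weights rather than
    geographic distance: traffic takes the 'cheapest' available path."""
    dist = {src: 0}
    prev = {}
    pq = [(0, src)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == dst:
            break
        if d > dist.get(node, float("inf")):
            continue
        for nbr, congestion in graph[node].items():
            nd = d + congestion
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(pq, (nd, nbr))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1]

# Hypothetical weights: the direct New York -> Washington link is heavily
# congested, so traffic detours through a third city even though the
# direct route is geographically shorter.
graph = {
    "new_york":   {"washington": 10, "toronto": 2},
    "toronto":    {"new_york": 2, "washington": 3},
    "washington": {"new_york": 10, "toronto": 3},
}
print(least_congested_path(graph, "new_york", "washington"))
# -> ['new_york', 'toronto', 'washington']
```

With different weights on an international link, the same logic could in principle send purely domestic traffic outside the country, which is Stronge's point about less robust national networks.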
But there are other scenarios under which Americans’ purely domestic communication might pass over the international cables. Google, for example, maintains a network of data centers around the world.
Google spokeswoman Nadja Blagojevic told ProPublica that, “Rather than storing each user's data on a single machine or set of machines, we distribute all data — including our own — across many computers in different locations.”
We asked Blagojevic whether Google stores copies of Americans’ data abroad, for example users’ Gmail accounts. She declined to answer.
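The kind of distribution Blagojevic describes, spreading each user's data across machines in different locations, is commonly implemented by hashing an identifier to pick storage sites. The sketch below is a generic illustration of that idea, not Google's actual scheme; the location names and replica count are hypothetical.

```python
import hashlib

LOCATIONS = ["us-east", "us-west", "europe", "asia"]  # hypothetical sites

def shard_for(user_id, n_replicas=2):
    """Choose n_replicas storage locations for a user's data by hashing
    the user id. Deterministic: the same user always maps to the same
    sites, but the sites can easily span national borders."""
    h = int(hashlib.sha256(user_id.encode()).hexdigest(), 16)
    start = h % len(LOCATIONS)
    return [LOCATIONS[(start + i) % len(LOCATIONS)] for i in range(n_replicas)]

print(shard_for("user@example.com"))
```

Under a scheme like this, whether a given American's data crosses an international cable depends simply on which sites the hash selects, which is why the question of where copies are stored matters.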
Are companies still cooperating with the NSA’s Internet tapping?
We don’t know.
The Washington Post had a story earlier this month about agreements the government has struck with telecoms, but lots of details are still unclear, including what the government is getting, and how many companies are cooperating.
The Post pointed to a 2003 “Network Security Agreement” between the U.S. government and the fiber optic network operator Global Crossing, which at the time was being sold to a foreign firm.
That agreement, which the Post says became a model for similar deals with other companies, did not authorize surveillance. Rather, the newspaper reported, citing unnamed sources, it ensured “that when U.S. government agencies seek access to the massive amounts of data flowing through their networks, the companies have systems in place to provide it securely.”
Global Crossing was later sold to Colorado-based Level 3 Communications, which owns many international fiber optic cables, and the 2003 agreement was replaced in 2011.
Level 3 released a statement in response to the Post story saying that neither agreement requires Level 3 “to cooperate in unauthorized surveillance on U.S. or foreign soil.”
The agreement does, however, explicitly require the company to cooperate with “lawful” surveillance.
More evidence, though somewhat dated, of corporate cooperation with NSA upstream surveillance comes from the 2009 inspector general report.
“Two of the most productive [signals intelligence] collection partnerships that NSA has with the private sector are with COMPANY A and COMPANY B,” the report says. “These two relationships enable NSA to access large volumes of foreign-to-foreign communications transiting the United States through fiber-optic cables, gateway switches, and data networks.”
There’s circumstantial evidence that those companies may be AT&T and Verizon.
It’s also worth noting that the NSA might not need corporate cooperation in all cases. In 2005, the AP reported on the outfitting of the submarine Jimmy Carter to place taps on undersea fiber-optic cables in case “stations that receive and transmit the communications along the lines are on foreign soil or otherwise inaccessible.”
What legal authority is the NSA using for upstream surveillance?
It’s unclear, though it may be a 2008 law that expanded the government’s surveillance powers.
The only evidence that speaks directly to this issue is the leaked slide on upstream surveillance, and in particular the document’s heading: “FAA702 Operations.” That’s a reference to Section 702 of the 2008 FISA Amendments Act. That legislation amended the Foreign Intelligence Surveillance Act, the 1970s law that governs government surveillance in the United States.
Under Section 702, the attorney general and director of national intelligence issue one-year blanket authorizations for surveillance of non-citizens who are “reasonably believed” to be outside the U.S. These authorizations don’t have to name individuals, but rather allow for targeting of broad categories of people.
The government has so-called minimization procedures that are supposed to limit the surveillance of American citizens or people in the U.S. Those procedures are subject to review by the FISA court.
Despite the procedures, there is evidence that in practice American communications are swept up by surveillance under this section.
In the case of Prism, for example, which is authorized under the same part of the law, the Washington Post reported that the NSA uses a standard of “51 percent confidence” in a target’s foreignness.
And according to minimization procedures dating from 2009 published by the Guardian, there are also exceptions when it comes to holding on to American communications. For example, encrypted communications — which, given the routine use of digital encryption, might include vast amounts of material — can be kept indefinitely.
The government also has the authority to order communications companies to assist in the surveillance, and to do so in secret.
How much Internet traffic is the NSA storing?
We don’t know, but experts speculate it’s a lot.
“I think that there’s evidence that they’re starting to move toward a model where they just store everything,” says Dan Auerbach, a staff technologist at the Electronic Frontier Foundation. “The Utah data center is a big indicator of this because the sheer storage capacity has just rocketed up.”
We know more details about how the GCHQ operates in Britain, again thanks to the Guardian’s reporting. A breakthrough in 2011 allowed GCHQ to store metadata from its cable taps for 30 days and content for three days. The paper reported on how the spy agency — with some input from the NSA — then filters what it’s getting:
The processing centres apply a series of sophisticated computer programmes in order to filter the material through what is known as MVR – massive volume reduction. The first filter immediately rejects high-volume, low-value traffic, such as peer-to-peer downloads, which reduces the volume by about 30%. Others pull out packets of information relating to "selectors" – search terms including subjects, phone numbers and email addresses of interest. Some 40,000 of these were chosen by GCHQ and 31,000 by the NSA.
How does the NSA filter the data it gets off cables in the United States?
“I think that’s the trillion dollar question that I’m sure the NSA is working really hard at all the time,” says Auerbach, the EFF expert. “I think it’s an incredibly difficult problem.”
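The two-stage pipeline the Guardian describes for GCHQ can be sketched as a toy filter: first discard high-volume, low-value traffic categories wholesale, then keep only packets matching a selector. Everything here, the category labels, the selectors, the packet format, is hypothetical and purely illustrative.

```python
# Toy sketch of "massive volume reduction" as reported by the Guardian.
# Stage 1 rejects whole traffic categories; stage 2 keeps only packets
# matching a selector. All names below are invented for illustration.
HIGH_VOLUME_LOW_VALUE = {"p2p"}                    # e.g. peer-to-peer downloads
SELECTORS = {"alice@example.com", "+1-555-0100"}   # hypothetical terms of interest

def mvr_filter(packets):
    kept = []
    for pkt in packets:
        if pkt["category"] in HIGH_VOLUME_LOW_VALUE:
            continue                               # stage 1: bulk rejection
        if SELECTORS & set(pkt["terms"]):
            kept.append(pkt)                       # stage 2: selector match
    return kept

traffic = [
    {"category": "p2p",  "terms": ["alice@example.com"]},  # dropped at stage 1
    {"category": "mail", "terms": ["bob@example.org"]},    # dropped at stage 2
    {"category": "mail", "terms": ["alice@example.com"]},  # kept
]
print(len(mvr_filter(traffic)))  # -> 1
```

The hard part Auerbach alludes to is everything this sketch waves away: classifying traffic correctly at line rate, and matching selectors inside reassembled, possibly encrypted streams.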
Introducing the Voices of Patient Harm
Over the last year, ProPublica has been investigating patient harm, one of the leading causes of death in America, and inviting providers and patients to share their experiences with our reporters. We’ve received hundreds of responses, and continue to report based on those tips as well as insights shared in our growing Facebook community.
Today, we’re expanding our call for stories with the Voices of Patient Harm, a new Tumblr featuring a mosaic of personal stories from people affected by patient safety issues. There are those who've been harmed directly, like Carla Muss-Jacobs, who describes the consequences of a surgical injury:
The injuries I sustained were NOT the standard “known risks." It was later discovered by my highly experienced medical expert that I did not need a total knee athroplasty, there was no medical evidence for the procedure. I was butchered and used as a medical guinea pig.
And there are others who have watched as loved ones suffered, like Veronica James, who recounts the injury and death of her mother, Vera Eliscu, in an acute-care hospital in New Jersey:
After her injury, Mom never spoke again nor barely moved. I learned coma-stimulation, and became her therapist, as [physical therapists] refused to exercise her. From Jan. 09 until her Wrongful Death in Aug 09, I spent 7 days a week advocating in facilities on Mom’s behalf, documenting her med’s and [hospital-acquired illnesses] ... bringing in acupuncture & nutrition, and fighting with staff, firing 3 doctors in the process.
Robin Karr underwent a total hysterectomy without her consent. She concludes her tragic story with a call for change based on her own experience:
I want to see a better system put in place for checking for correct surgery and proper consent.
Have you been affected by patient harm? Help us capture the stories behind the statistics by sharing your story and photo with us here.
The Voices of Patient Harm is part of ProPublica’s ongoing investigation into patient safety. Have a private tip? Share it with our reporters here.
Do You Know About Problems With FEMA’s Flood Mapping?
Homeowners across the country are facing headaches over flooding, but not because of water-damaged property or lack of insurance. They are being asked to buy insurance they don’t need for houses built on high ground.
That’s because the Federal Emergency Management Agency has used outdated data in a number of its new flood maps and mistakenly mapped homeowners into high-risk flood areas. Homeowners in such areas who have federally backed mortgages (as most mortgages are) are required to buy insurance through the National Flood Insurance Program.
We’ve heard from homeowners such as Donna Edgar, whose home in Texas was mapped into a high-risk flood area last year by mistake. But we don’t know exactly how widespread the problem is. That’s where you can help. If you think FEMA has mapped your home into a high-risk flood area by mistake, please let us know by filling out the form below.
Using Outdated Data, FEMA Is Wrongly Placing Homeowners in Flood Zones
When Donna Edgar found out that new flood maps from the Federal Emergency Management Agency would place her house in a high-risk flood zone, she couldn’t believe it.
Her home, on the ranch she and her husband own in Texas hill country about 60 miles north of Austin, sits well back from the nearby Lampasas River.
“Her house is on a hill,” said Herb Darling, the director of environmental services for Burnet County, where Edgar lives. “There’s no way it’s going to flood.”
Yet the maps, released last year, placed the Edgars in what FEMA calls a “special flood hazard area.” Homeowners in such areas are often required, and always encouraged, to buy federal flood insurance, which the Edgars did.
FEMA eventually admitted the maps were wrong. But it took Edgar half a dozen engineers (many of whom volunteered their time), almost $1,000 of her own money and what she called an “ungodly number of hours” of research and phone calls over the course of a year to prove it.
Edgar is far from alone.
From Maine to Oregon, local floodplain managers say FEMA’s recent flood maps — which dictate the premiums that 5.5 million Americans pay for flood insurance — have often been built using outdated, inaccurate data. Homeowners, in turn, have to bear the cost of fixing FEMA’s mistakes.
“It’s been a mess,” Darling said. “It’s been a headache for a lot of people.”
Joseph Young, Maine’s floodplain mapping coordinator, said his office gets calls “almost on a daily basis” from homeowners who say they’ve been mapped in high-risk flood areas in error. More often than not, he said, their complaints have merit. “There’s a lot of people who have a new map that’s unreliable,” he said.
Maps built with out-of-date data can also result in homeowners at risk of flooding not knowing the threat they face.
FEMA is currently finalizing new maps for Fargo, N.D., yet the maps don’t include any recent flood data, said April Walker, the city engineer, not even from when the Red River overran its banks in 1997, 2009 and 2011. Those floods were the worst in Fargo’s history.
Fargo has more recent data, Walker said, but FEMA hasn’t incorporated it.
It’s unclear exactly how many of the new maps FEMA has issued in recent years are at least partly based on older data. While FEMA’s website allows anybody to look up flood maps for their areas, the agency’s maps don’t show the age of the underlying data.
FEMA’s director of risk analysis, Doug Bellomo, said it was “very rare” for the agency to digitize the old paper flood maps without updating some of the data. “We really don’t go down the road” of simply digitizing old maps, he said.
FEMA did not respond to questions about the maps for Fargo or other specific areas.
State and local floodplain officials pointed to examples where FEMA had issued new maps based at least in part on outdated data. The reason, they said, wasn’t complicated.
“Not enough funding, pure and simple,” Young said.
Using new technology, FEMA today is able to gather far more accurate elevation data than it could in the 1970s and 1980s, when most of the old flood maps were made. Lidar, in which airplanes map terrain by firing laser pulses at the ground, can provide data that’s 10 times more accurate than the old methods.
Lidar is also expensive. Yet as we’ve reported, Congress, with the support of the White House, has actually cut map funding by more than half since 2010, from $221 million down to $100 million this year.
With limited funding, FEMA has concentrated on updating maps for the populated areas along the coasts. In rural areas, “it’s sort of a necessary evil to reissue maps with older data on them,” said Sally McConkey, an engineer with the Illinois State Water Survey at the University of Illinois at Urbana-Champaign, which has a contract with FEMA to produce flood maps in the state.
When old maps are digitized, mapmakers try to match up road intersections visible on them with the ones seen in modern satellite imagery (similar to what you can see using Google Earth). But the old maps and the new imagery don’t always line up correctly, leading to what Alan R. Lulloff, the science services program director with the Association of State Floodplain Managers, called a “warping” effect.
“It can show areas that are actually on high ground as being in the flood hazard area when they’re not,” he said. “That’s the biggest problem.”
When FEMA issued new maps last year for Livingston Parish in Louisiana, near Baton Rouge, they included new elevation data. But the flood studies, said Eddie Aydell III, the chief engineer with Alvin Fairburn in Denham Springs, La., who examined the maps, were “a conglomeration of many different ancient engineering studies” dating from the 1980s to 2001. The mapmakers did not match up the new elevation data with the older data correctly, he said, making structures in the parish seem lower than they really are.
“It’s going to be a nightmare for the residents of our parish,” he said.
Bonnie Marston’s parents, Jim and Glynda Childs, moved to Andover, Maine, where Marston lives with her husband, in 2010 with the intention of building a house. But when they applied for a loan the bank told them that FEMA’s new flood maps for the county, issued the year before, had placed the land on which they planned to build in a special flood hazard area. The cost: a $3,200 annual flood insurance bill, which the Childs had to pay upfront.
Marston spent about $1,400 to hire a surveyor, who concluded her parents did not belong in a special flood hazard area. FEMA eventually removed the requirement for them to buy flood insurance — though it didn’t actually update the map. The bank refunded the flood insurance premium, but Marston said FEMA wouldn’t refund the cost of the survey.
“In my mind it’s a huge rip-off,” Marston said.
Edgar, 68, a retired IBM software developer, said she couldn’t understand why FEMA thought her house was suddenly at risk of flooding. When she called FEMA and asked, she said the agency couldn’t tell her.
“They just said, ‘You need to buy flood insurance,’” she said, and told her she could apply for what’s known as a letter of map amendment if she thought she’d been mapped into a special flood hazard area in error. She worried that being in a high-risk flood area would diminish the value of her home.
Her husband, Thomas, a professor of chemical engineering at the University of Texas at Austin, knew David R. Maidment, a civil engineering professor there who is an expert on flood insurance mapping. While she hired a surveyor and wrangled with FEMA, Maidment and several of his Ph.D. students drove up to the ranch to study it as a class project.
The experience, Maidment said, showed him “in a very small microcosm” the importance of using up-to-date elevation data in new maps. The Texas state government paid to map Burnet County, where the Edgars’ ranch is located, in 2011 using lidar. But FEMA’s new maps for the county don’t include the lidar data.
FEMA removed the Edgars from the special flood hazard area in March, but again it hasn’t actually changed the maps. Letters of map amendment acknowledge that FEMA’s maps were incorrect without actually changing them. While the Edgars don’t have to buy flood insurance, the new, inaccurate maps remain.
Darling, the county’s director of environmental services, said he had gotten calls from dozens of homeowners with similar complaints about the new flood maps.
“We’ve still got ‘em coming in,” he said.
The contractor that created the new maps appeared to have taken shortcuts in drawing them, Darling said. Without new lidar data, he added, issuing a new map is “just a waste of money.”
The experience, Edgar said, had left her feeling deeply frustrated, as both a homeowner and a taxpayer. FEMA hasn’t reimbursed her for the surveying costs or for the flood insurance premium she and her husband paid. “It falls to the homeowner to hire a professional engineer and pay” hundreds, even thousands, “to disprove what I would call their shoddy work,” she said. “I don’t think that’s fair.”
Have you experienced problems with FEMA's flood maps firsthand? Let us know.