
Snowden Used Common Web Crawler Tool to Collect NSA Files

  • February 9, 2014 - 16:30
  • Other Media

TEHRAN (Tasnim) - Whistleblower Edward Snowden used “inexpensive” and “widely available” software to gain access to at least 1.7 million secret files, The New York Times reported, quoting senior intelligence officials investigating the breach.


The collection process was “quite automated,” a senior intelligence official revealed. Snowden used “web crawler” software to “search, index and back up” files. The program simply kept running as Snowden went about his daily routine.

“We do not believe this was an individual sitting at a machine and downloading this much material in sequence,” the official said.

Investigators concluded that Snowden’s attack was not highly sophisticated and should have been easily detected by special monitors. The web crawler can be programmed to go from website to website, via embedded links in each document, copying everything it comes across.

The whistleblower configured the crawler correctly, specifying which subjects to search for and how far to follow the links, according to the report. In all, Snowden was able to access 1.7 million files, including documents on internal NSA networks and internal “wiki” materials used by analysts to share information across the world.
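The Times’ description matches the standard pattern of a breadth-first web crawler: fetch a page, back up its contents, extract the embedded links, and repeat up to a set depth. The Python sketch below is a minimal illustration of that general technique only, not the actual software Snowden used, whose details were not disclosed; the seed address, the max_depth parameter and the optional keyword filter are all illustrative assumptions.

import urllib.request
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href targets of <a> tags found in an HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, max_depth=2, keyword=None):
    """Breadth-first crawl (illustrative sketch, not Snowden's tool):
    fetch a page, back up its contents, then follow every embedded
    link up to max_depth hops. `keyword`, if given, is a hypothetical
    subject filter that skips pages not mentioning that topic."""
    seen = set()
    archive = {}                       # url -> page contents (the "back up" step)
    queue = deque([(seed_url, 0)])
    while queue:
        url, depth = queue.popleft()
        if url in seen or depth > max_depth:
            continue
        seen.add(url)
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except Exception:
            continue                   # unreachable page: skip it
        if keyword and keyword.lower() not in html.lower():
            continue                   # outside the requested subject
        archive[url] = html            # copy everything it comes across
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            queue.append((urljoin(url, link), depth + 1))
    return archive

# Illustrative use: mirror everything within two links of a starting page.
# pages = crawl("https://intranet.example/wiki/start", max_depth=2)

The queue with a per-page depth counter is what makes “how far to follow the links” a single tunable parameter, and because the loop needs no operator input, such a program can indeed keep running unattended, as the officials described.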

Reportedly, Snowden had full access to the NSA’s files as part of his job as a technology contractor in Hawaii, managing computer systems at a remote outpost focused on China and North Korea.

Officials added that the files were accessible because the Hawaii outpost was not upgraded with the latest security measures.

The web crawler Snowden used was similar to, but not as advanced as, Googlebot, the crawler Google’s search engine uses to visit billions of websites and download their contents for fast search results.

The whistleblower did raise some red flags while working in Hawaii, prompting questions about his work, but he was able to deflect the scrutiny successfully.

Snowden admitted in June to taking an undisclosed number of documents, which international media have drawn on regularly over the past six months for a number of high-profile reports about the US National Security Agency and its British counterpart, GCHQ. He was subsequently granted political asylum by Russia and now resides in Moscow.

The leaks have unveiled a number of previously unreported NSA operations, including those involving dragnet surveillance programs that put the digital lives of millions, if not billions, of individuals across the world into the possession of the US government.
