Sean Lahman




How data and documents helped find bin Laden

A couple of interesting bits of information caught my eye in the flurry of news reports on the raid that killed Osama bin Laden…

Some of the earliest reports said that bin Laden was located after the US identified a courier who was in constant contact with the world’s most wanted terrorist. Interestingly, this information was contained in the thousands of pages of documents on Guantanamo detainees that were released by WikiLeaks just a week ago. Here’s a link to the relevant intelligence report, which mentions not only the courier but also the spot where bin Laden was eventually found: Abbottabad, Pakistan.

A report in the National Journal this morning also reveals the extent to which data mining and predictive analysis helped in the task of finding bin Laden.

One way they did this was to create forward-deployed fusion cells, where JSOC units were paired with intelligence analysts from the NSA and the NGA. Such analysis helped the CIA to establish, with a high degree of probability, that Osama bin Laden and his family were hiding in that particular compound.

These technicians could “exploit and analyze” data obtained from the battlefield instantly, using their access to the government’s various biometric, facial-recognition, and voice-print databases. These cells also used highly advanced surveillance technology and computer-based pattern analysis to layer predictive models of insurgent behavior onto real-time observations.

Posted on May 2, 2011.

Categories: data viz, journalism


About Sean Lahman

Sean Lahman is an award-winning database journalist and author. He develops interactive databases and data-driven stories for the Rochester Democrat and Chronicle and other Gannett newspapers and websites. He also writes a weekly column on emerging technology and innovation. Prior to joining the Democrat and Chronicle, he was a reporter and columnist with the […]
