Tuesday, February 18, 2014

University-run database sheds light on terrorism. Open-source platform synthesizes world events.


Olympic luge doubles training at the Sanki Sliding Centre near Sochi. Credit: The U.S. Army
The Olympics could provide a ripe target for terrorists, and security is especially tight this year in Sochi, the coastal city that borders Russia’s volatile Caucasus region. But analysts working with the Global Terrorism Database (GTD) at the University of Maryland have tracked trends spanning nearly four decades that might provide some comfort — or not.
“There’s no consistent pattern that indicates terrorism goes up or down during Olympics,” said Erin Miller, program manager for the database, which is part of the federally funded National Consortium for the Study of Terrorism and Responses to Terrorism.
The findings suggest that efforts to reinforce security at the Olympics are generally effective, she said. But, taken as a whole, the open-source database paints a picture of an insecure world, as it includes more than 113,000 identified cases of terrorism around the globe since 1970.
The GTD is the largest database of its kind in the world and is used by hundreds of government agencies, researchers and the public to identify and examine patterns of terrorism worldwide, Miller said.
It illustrates both the potential and the limitations of computer-driven data collection, as well as the role universities play in compiling massive amounts of data. The University of Pisa in Italy, for instance, recently began storing large MRI files for a medical research company.

Database started with index cards
When data collection went digital in 1997, University of Maryland researchers inherited 60,000 hand-written index cards that had to be keyed into the database.
Now a program scans 1.3 million articles from media sites worldwide each day, identifying key facets of the text and eliminating duplicates to winnow the stream down to between 12,000 and 16,000 potentially relevant incidents each month, Miller explained.
Those candidates are then culled by human readers, who judge whether the information is credible and whether it belongs in the database under an agreed-upon, and fairly broad, definition of terrorism, she said.
“It’s an interesting process of balancing the capabilities of technology with the capabilities of the human coders,” Miller said.
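The GTD’s in-house scanning software is not public, but a minimal Python sketch can illustrate the kind of keyword screening and de-duplication step Miller describes; the keywords, function names and logic below are hypothetical stand-ins, not the project’s actual code.

import hashlib
import re

# Hypothetical keyword list; the real system uses far richer text classification.
ATTACK_KEYWORDS = {"bombing", "explosion", "assassination", "kidnapping",
                   "hostage", "hijacking", "armed attack"}

def looks_relevant(article_text):
    """Crude keyword screen standing in for the automated relevance filter."""
    text = article_text.lower()
    return any(keyword in text for keyword in ATTACK_KEYWORDS)

def dedupe_key(article_text):
    """Normalize whitespace and case, then hash, so near-identical wire copies collapse."""
    normalized = re.sub(r"\s+", " ", article_text.lower()).strip()
    return hashlib.sha1(normalized.encode("utf-8")).hexdigest()

def winnow(articles):
    """Reduce a day's scrape to unique, potentially relevant candidates for human coders."""
    seen = set()
    candidates = []
    for text in articles:
        if not looks_relevant(text):
            continue
        key = dedupe_key(text)
        if key in seen:
            continue
        seen.add(key)
        candidates.append(text)
    return candidates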
Software developed in-house helps organize the material. For each GTD incident, information is available on the date and location, weapons used, nature of the target, number of casualties and — when it is known — the group or individual responsible. The database also includes scores of social, economic and security variables for each incident, according to the university.
All told, the database contains details related to more than 52,000 bombings, 14,400 assassinations and 5,600 kidnappings.
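To make that record structure concrete, here is a minimal Python sketch of what a single incident entry and a tally by attack type might look like; the field names and the example record are simplified, hypothetical stand-ins rather than the GTD’s actual schema.

from collections import Counter
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Incident:
    # Simplified, hypothetical stand-in for a GTD record.
    incident_date: date
    country: str
    city: str
    attack_type: str          # e.g. "bombing", "assassination", "kidnapping"
    weapon_type: str
    target_type: str
    casualties: int
    perpetrator_group: Optional[str] = None   # None when responsibility is unknown

def tally_by_attack_type(incidents):
    """Count incidents per attack type, the kind of aggregate quoted above."""
    return Counter(incident.attack_type for incident in incidents)

# Example usage with a single made-up record:
example = Incident(date(1970, 1, 1), "Exampleland", "Sample City",
                   "bombing", "explosives", "government building", 0)
print(tally_by_attack_type([example]))   # Counter({'bombing': 1})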
Collecting and analyzing systematic information on worldwide terrorism is challenging, since there is no uniform system of crime reports for such incidents. The GTD contains no classified information.

Open-source material can lack credibility
Using open-source material presents its own challenges. News reports may contain inaccuracies, inconsistencies and misinformation. Reports can be subject to censorship, depending on the location, and to misinterpretation, depending on translation programs. And debate persists over the very definition of terrorism.
Still, experts say, the data is useful for studying — and hopefully fighting — terrorism. The State Department uses information from the GTD in its annual report to Congress, Miller said.
Analysis of the data aims to detect patterns and trends in seemingly random acts of terrorism.
“What tends to be surprising is that nothing in particular is straightforward,” Miller said. “It’s been interesting how complex these patterns can be.  The challenge is in interpreting the complexities of what the data show.”

Universities as data collectors
Universities in the United States have long been repositories for large data collections since they are hubs of research activity, often in collaboration with government and industry, said James Cortada, a historian who specializes in the business and economic history of information and information technologies.
Government research funding often provides the infrastructure for building and maintaining large databases at universities, said Cortada, who is a senior research fellow at the University of Minnesota.
“University researchers have gotten into the habit of building these mass databases” in every discipline imaginable, said Cortada.
Unlike industry, academia generally has an ideal of “maintaining public-domain access to data and a strong belief that society benefits from research findings,” according to John Bagby, co-director of the Institute for Information Policy at Penn State.
“The basic academic value has been more toward open and less toward proprietary,” he said.
