Imagine trying to organize the electronic data equivalent of 50 million four-drawer file cabinets of text documents. That is the dilemma now facing the U.S. government, according to a story in The Boston Phoenix (Dec. 29, 2010). Data storage—supposedly the eco-friendly, efficient road into the future—and the organization of that data have proven to be a burden on federal and state governments, already taxed by budget shortfalls and small staffs. “The amount of information that government agencies may be required to keep—from tweets and e-mails to tax histories,” writes Chris Faraone, “is growing faster than the capacity for storage.”
So what’s the answer? In terms of organizing the data, it depends on whom you ask. One side claims that there is just “a shortage of good old-fashioned manpower.” Meanwhile, the other side argues that if transparency and easy access were the default position for government information—rather than making parties request it—there would be less need for additional staff.
For the storage side of things, one answer has come in the form of outsourcing the work to private companies. Microsoft and Google are “running neck-and-neck . . . to win federal contracts for cloud computing,” a method of storage that would solve the current limitations of brick-and-mortar storage spaces. This solution, though, brings its own baggage in the form of security concerns. Another approach is a tactic called virtualization, which is essentially a fancy word for doing away with redundancy. Championed by Vivek Kundra, the first chief information officer to serve in a presidential administration, virtualization has led to the consolidation of many data centers. “It’s a slow start,” Faraone writes. “Only an estimated 20 percent of federal databases have been virtualized so far. But it’s a start.”
This article first appeared in the May-June 2011 issue of Utne Reader.