Federal IT’s push towards the cloud seems out of step with the slow, plodding approach to projects and technology that the Federal government is known for. It has been a refreshing development. Unfortunately, a new report on the Federal government’s adoption of Big Data and data analytics draws a profile more consistent with that traditional slow-moving image. The research was conducted by MeriTalk and sponsored by NetApp.
Federal agencies are currently spending most of their IT dollars on collecting and capturing data rather than on analyzing or applying it. In total, Federal agencies have collected nearly 1.61 petabytes of data, a figure expected to double over the next two years. Roughly a third of the data currently being collected is unstructured. Yet when it comes to Big Data, Federal agencies report that, on average, they are at least three years away from being able to start taking advantage of it.
Federal agencies identified the following ways that Big Data could improve how they operate:
- 59 percent — Overall agency efficiency would improve
- 51 percent — Speed and accuracy of decisions would improve
- 30 percent — Ability to forecast would improve
- 25 percent — Easier to identify cost savings
- 23 percent — Greater understanding of citizen needs
- 21 percent — Greater understanding of the agency
- 21 percent — Increased transparency
Mark Weber, president of U.S. Public Sector for NetApp, said that “Government has a gold mine of data at its fingertips. The key is turning that data into high-quality information that can increase efficiencies and inform decisions. Agencies need to look at big data solutions that can help them efficiently process, analyze, manage, and access data, enabling them to more effectively execute their missions.”
But there are problems today. Ninety percent of agencies report barriers to successfully utilizing their data, the biggest being a lack of resources: those surveyed cited a shortage of trained personnel, insufficient storage space, and slow computers. Agencies are overwhelmed by the data they must manage. Fifty-seven percent of agencies say they have at least one data set that has grown so large it is effectively unmanageable.
The government is trying to turn the problem around. President Obama recently announced a $200 million R&D initiative to investigate and prototype the use of Big Data within the Federal government.
Subra Suresh, director of the National Science Foundation, said that “American scientists must rise to the challenges and seize the opportunities afforded by this new, data-driven revolution. The work we do today will lay the groundwork for new enterprises and fortify the foundations for U.S. competitiveness for decades to come.”