NIST Final ‘Big Data’ Framework Will Help Make Sense of Our Data-Drenched Age
To improve approaches for analyzing very large quantities of data, computer scientists at the National Institute of Standards and Technology (NIST) have released broad specifications for how to build more widely useful technical tools for the job.
Following a multiyear effort, the agency has published the final version of the NIST Big Data Interoperability Framework (NIST Special Publication Series 1500), a collaboration between NIST and more than 800 experts from industry, academia and government. Filling nine volumes, the framework is intended to guide developers on how to deploy software tools that can analyze data using any type of computing platform, from a single laptop to the most powerful cloud-based environment. Just as important, it allows analysts to move their work from one platform to another and to substitute a more advanced algorithm without retooling the computing environment.