In 1998, Microsoft boasted about the capabilities of its new TerraServer database of geographic imagery, calling it the “world’s largest atlas” with five terabytes of data. At the time, that seemed like an unimaginably huge amount of data; twenty years later, a terabyte is becoming uncomfortably small.
Today, storing huge amounts of data is the norm. Whether it is a new home PC with a standard one-terabyte drive or an Internet provider’s home data plan capped at a terabyte of usage, a terabyte of data no longer seems that big.
With all the new technology being put into cars, carmakers predict that the cars of the future will hold massive amounts of data. Mike Demler, a senior analyst at The Linley Group, said that today “in a luxury car like a Tesla, with the big flat screen display, the storage requirements start to look like a standard tablet (10s – 100s of GB).” But the next generation of cars will handle significantly more data.
For example, a new Intel ad campaign describes driving technology that will process four terabytes of data every day for an average driver. Intel CEO Brian Krzanich goes further, estimating that the smart cars of the future will generate and consume about 40 terabytes of data for every eight hours they are driven; that is the volume autonomous vehicles are expected to handle. Imagine how much additional data will be processed once passengers, freed from driving, pass the time watching high-definition streaming video.
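As a rough sanity check, the two Intel figures line up with each other. The short Python sketch below (the variable names and the derivation are ours, not Intel’s) works out the hourly rate implied by Krzanich’s estimate and the daily driving time implied by the ad campaign’s figure:

    # Back-of-the-envelope check using only the figures quoted above.
    krzanich_tb = 40.0           # TB generated/consumed per driving session
    krzanich_hours = 8.0         # hours per session in Krzanich's estimate
    avg_driver_tb_per_day = 4.0  # Intel ad campaign figure for an average driver

    tb_per_hour = krzanich_tb / krzanich_hours
    implied_minutes_per_day = avg_driver_tb_per_day / tb_per_hour * 60

    print(f"Implied rate: {tb_per_hour:.1f} TB per hour of driving")
    print(f"Implied driving time: {implied_minutes_per_day:.0f} minutes per day")
    # -> Implied rate: 5.0 TB per hour of driving
    # -> Implied driving time: 48 minutes per day

At 5 TB per hour, the campaign’s 4 TB per day corresponds to roughly 48 minutes of driving per day, about what a typical commuter spends behind the wheel.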
Kathy Winter, vice president and general manager of the Automated Driving Solutions Division at Intel Corporation, said that “the exponentially growing size of the data sets necessitates an enormous amount of compute capacity to organize, process, analyze, understand, share and store. Think data center server compute power, not PC power.”