Whether in technology, marketing, or even politics, “Big Data” is one of the most commonly used terms today. Have you ever wondered how data was collected and used 50 or 100 years ago? Is Big Data purely a product of the cyber era?

The 1880 US Census took eight years to tabulate, and by one estimate the 1890 census would have taken more than ten years using the methods then available. The invention of the Hollerith tabulating machine, based on punch cards, enabled government officials to complete the job in about a year. By the 1930s, the US population boom, the introduction of Social Security numbers, and the growth of knowledge and research had produced a continuous information overload. Libraries were among the first institutions to be affected directly and had to adapt their storage methods.

In the latter half of the 20th century, data storage changed forever. The concept of virtual memory, developed by German physicist Fritz-Rudolf Güntsch, treated finite storage as if it were infinite: storage managed by integrated hardware and software allowed data to be processed without hardware memory constraints. By the 1960s, the influx of information led most organizations to design, develop, and implement centralized computing systems. In 1970, Edgar F. Codd, an Oxford-educated mathematician, published the paper that first explained the concept of relational databases. Today, routine transactions such as banking, stock trading, and online shopping all rely on structures built on relational database theory.

“Data expands to fill the space available for storage” – Parkinson’s Law of Data

In 1997, NASA researchers Michael Cox and David Ellsworth used the term “big data” for the first time in a paper arguing that the growth of data was becoming a problem for the computer systems of the day. A study published in 2000 estimated that the world had produced about 1.5 exabytes of information in 1999 (1 exabyte = 1 billion gigabytes); later estimates put the figure at 161 exabytes for 2006 and 2,837 exabytes for 2012.

Internet traffic in the US alone is forecast to reach one zettabyte by the end of the current year (1 zettabyte = 1,000 exabytes), and this is just the beginning of the Big Data explosion. In the near future, data will be generated both big and fast: experts forecast that annual data production will reach about 35 zettabytes by 2020, roughly 44 times the 2009 level.
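
To make the units and growth figures above concrete, here is a small illustrative calculation. This is only a sketch: the 2009 baseline below is simply back-computed from the 35-zettabyte / 44x forecast, not taken from a separate source.

```python
# Decimal storage units as used in this article.
GB_PER_EXABYTE = 1_000_000_000      # 1 exabyte  = 1 billion gigabytes
EB_PER_ZETTABYTE = 1_000            # 1 zettabyte = 1,000 exabytes

# World information production estimates quoted above (in exabytes).
produced_1999_eb = 1.5
produced_2012_eb = 2837

# Forecast: ~35 zettabytes per year by 2020, roughly 44x the 2009 level.
forecast_2020_eb = 35 * EB_PER_ZETTABYTE
implied_2009_eb = forecast_2020_eb / 44      # back-computed, ~800 exabytes

print(f"1999 output in gigabytes: {produced_1999_eb * GB_PER_EXABYTE:,.0f}")
print(f"Growth 1999 -> 2012: {produced_2012_eb / produced_1999_eb:,.0f}x")
print(f"Implied 2009 baseline: {implied_2009_eb:,.0f} exabytes")
```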

Big Data is set to become the new real estate, redefining the dollar value of information. Beyond simply storing data, accurate and meaningful mining of that data will play an even more important role in an organization's success.



Source:

  1. http://www.winshuttle.com/big-data-timeline/
  2. http://www.csc.com/big_data/flxwd/83638-big_data_just_beginning_to_explode_interactive_infographic
  3. http://www2.sims.berkeley.edu/research/projects/how-much-info/