
Big Data

Big data is a term that has become increasingly prevalent in today’s society, but its roots can be traced back to the early days of computing. The history of big data can be divided into two main eras: the past, stretching from the first large-scale data processing efforts through the early 2000s, and the present, which began with the emergence of social media and the Internet of Things (IoT).

In the past, the term “big data” was not commonly used. Instead, people spoke of “large-scale data processing” or “high-performance computing.” One of the earliest examples of large-scale data processing was the US Census Bureau’s use of Herman Hollerith’s punch card tabulating machines in the 1890 census (Hollerith’s company later became part of IBM). These machines could process data at a rate of about 10 cards per second, a significant improvement over manual tabulation.

Fast forward to the 1960s, and the era of mainframe computers had begun. IBM’s System/360 mainframe, introduced in 1964, could process up to 16.6 million characters per second, a massive improvement over earlier computers. As more companies and organizations began to use computers for data processing, the need for efficient storage and retrieval of large amounts of data became more pressing. The relational model, proposed by E. F. Codd in 1970, gave rise to relational database systems such as Oracle and IBM’s DB2 and paved the way for modern data storage and retrieval.

In the 1990s, the internet began to take shape, and with it came an explosion of online data. In the mid-1990s, Larry Page and Sergey Brin began working on a new search engine, which would eventually become Google. Their algorithm ranked pages by analyzing the links between websites, a form of big data analysis that was revolutionary at the time. Other companies, such as Amazon and Yahoo!, also began to leverage big data in their operations.
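The core idea behind that link analysis can be sketched in a few lines. The following is a minimal, illustrative version of PageRank-style iterative ranking (the production algorithm is far more involved; the three-page web and parameter values here are hypothetical):

```python
# Minimal sketch of link-based ranking in the spirit of PageRank.
# Illustrative only: real search engines handle billions of pages,
# dangling nodes, and personalization, none of which appear here.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with uniform rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:
                continue
            # Each page shares its rank equally among its outgoing links.
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical three-page web: A and C link to B, B links to C.
ranks = pagerank({"A": ["B"], "B": ["C"], "C": ["B"]})
```

In this toy example, page B ends up with the highest rank because two pages link to it, which is exactly the intuition: a page is important if important pages point to it.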

In the early 2000s, the term “big data” began to be used more frequently. In 2001, Doug Laney, then an analyst at META Group (later acquired by Gartner), described big data in terms of the “3Vs”: volume, velocity, and variety. Volume refers to the amount of data being generated, velocity to the speed at which data is generated and must be processed, and variety to the different types of data involved.

Today, the era of big data is in full swing. The explosion of social media has generated massive amounts of data, which can be analyzed to gain insights into consumer behavior and preferences. The Internet of Things (IoT), which refers to the growing network of internet-connected devices, is also generating vast amounts of data. Machine learning and artificial intelligence algorithms are being used to analyze this data and generate insights that can be used to make informed decisions.
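Much of this analysis is organized around distributed patterns such as map/reduce, popularized by frameworks like Hadoop. The sketch below shrinks the pattern to in-memory Python on a hypothetical two-document corpus; it mirrors the structure of a distributed job without using any framework’s actual API:

```python
from collections import Counter
from itertools import chain

# Toy map/reduce-style word count: the pattern behind many big data
# frameworks, reduced to plain in-memory Python for illustration.

def map_phase(document):
    # Map step: emit a (word, 1) pair for every word in one document.
    return [(word.lower(), 1) for word in document.split()]

def reduce_phase(pairs):
    # Reduce step: sum the counts for each word across all documents.
    counts = Counter()
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# Hypothetical corpus; in a real system each document would be a
# shard processed on a different machine.
documents = ["big data big insights", "data drives decisions"]
mapped = chain.from_iterable(map_phase(d) for d in documents)
word_counts = reduce_phase(mapped)
```

The appeal of the pattern is that the map step is independent per document and the reduce step only needs grouped pairs, so both parallelize naturally across machines.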

In conclusion, the history of big data is a relatively short one, but it has had a profound impact on the way we process and analyze data. From the early days of punch card machines to the explosion of social media and IoT, big data has evolved into a critical tool for businesses and organizations of all kinds. As we continue to generate more data, it is likely that big data will only become more important in the years to come.