Understand The Background Of Data Analysis Now

Research Optimus
3 min read · Jul 8, 2022


Analytics, the science of analysis, has a long history, with instances of data analysis stretching back to antiquity. Artifacts and monuments from ancient Egypt to the Sumerian kingdom record early forms of data collection and analysis.

Analytics has been increasingly crucial for businesses of all sizes over the years. In addition, data analytics has grown and diversified throughout time, bringing with it a slew of advantages.

Data analytics is now used in practically every part of our lives, and it encompasses a wide range of approaches used in business, science, and a variety of other fields.

Before we delve deeper into the history of data analysis, we must first define what data analysis is.

“Data analytics is the science of combining heterogeneous data from various sources, establishing inferences, and making predictions in order to enable innovation, obtain a competitive corporate edge, and assist strategic decision-making.”

To grasp what data analysis can and cannot achieve today, it is worth looking at how the field got to where it is now.

Key instances in the history of Data Analysis

  • In the 1930s, R.A. Fisher proposed the design of experiments, along with the statistical test ANOVA and Fisher’s exact test. He is also credited with the maxim ‘correlation does not imply causation.’
  • In the late 1930s, W.E. Deming proposed the idea of quality control using statistical sampling.
  • In the late 1950s, Hans Peter Luhn proposed using indexing and information retrieval methods on text and data for the purposes of business intelligence.
  • Edgar F. Codd pioneered relational databases in the 1970s, and they became quite popular in the 1980s.
  • John W. Tukey wrote the book ‘Exploratory Data Analysis’ in 1977, which influenced the development of the S and S-PLUS languages, and in turn the language ‘R’ that you might have heard of.
  • The design of data warehouses was established in the late 1980s to aid in the transformation of data from operational systems into decision-making support systems.
  • The usage of data mining arose directly from the development of database and data warehouse technologies in the 1990s.
  • In 1997, Tom Mitchell wrote the textbook ‘Machine Learning,’ which is still widely used today.
  • Professor Ramnath Chellappa of Emory University defined cloud computing as a new “computing paradigm where the limitations of computing will be decided by economic rationale rather than technical limits alone” in 1997.
  • In 1996 two Stanford University graduate students, Larry Page and Sergey Brin, wrote a prototype search engine that ultimately led to the development of Google.
  • Roger Magoulas coined the term “big data” in 2005. Hadoop, a framework for distributed processing of large datasets, emerged around the same time.
  • In 2009, Microsoft Research released an e-book on data-driven science titled ‘The Fourth Paradigm,’ building on Jim Gray’s vision of data-intensive scientific discovery.
  • In 2009, Peter Norvig co-authored the paper ‘The Unreasonable Effectiveness of Data.’ The idea is that simple models trained on lots of data are often more effective than very complex models trained on less.
  • In 2010, The Economist published an issue titled ‘The Data Deluge’ about the exponential growth in data volume.


Written by Research Optimus


Research Optimus is a premier provider of business research and financial research services. Visit https://www.researchoptimus.com/
