How Business Intelligence Has Evolved Since 1865

Apr 11, 2017 | Business Intelligence (BI), General BI and Data Management


Business Intelligence in 2017 is the vehicle for analyzing a company’s “Big Data” to gain a competitive advantage. It gives companies a look at the efficacy of past actions, which they can strategically use as the foundation for plotting the path forward.

Business Intelligence, as a term, has its origin in 1865. In this post, we’ll look at its journey from 19th Century “seed” to 2017 “cloud” application. Here are the greatest BI hits:

 

1865: Business Intelligence is Born

The term “Business Intelligence” was first used to describe the business antics of politician and financier Sir Henry Furnese, in the book “Cyclopedia of Commercial and Business Anecdotes” by Richard Miller Devens. The term referred to Furnese’s ability to gather information from across Europe regarding financial markets and political situations, to be ahead of his competition every step of the way.

“The news… was thus received first by him,” reports Devens. He spoke of Furnese’s process as a “…complete and perfect train of business intelligence.” In this first incarnation, BI is recognized as the process of gathering information for succeeding in business. The BI seed is firmly planted.

 

1958: BI Gets Storage Capabilities

Following a similar logic, in the early part of the 20th century, companies gathered information about their business environment, markets, and competitors. They carefully stored this material… in filing cabinets. On paper.

Fast forward to the late 1940s, when the first computers came into existence and could store information. Things really ramped up in 1956, when IBM invented the hard disk drive. However, these disks were difficult to manage and seemed a risky storage solution, so they were relegated to the protected confines of data centers.

Enter the “Father of Business Intelligence,” Hans Peter Luhn, a computer scientist at IBM. In 1958, Luhn penned an article entitled “A Business Intelligence System,” in which he outlined the potential of BI: “An automatic system… developed to disseminate information to the various sections of any industrial, scientific, or government organization.”

He was way ahead of his time in describing aspects of BI’s potential: the importance of search and query metadata, security and data ownership, and the notion of “information on demand.” He outlined the idea of expert users carrying out difficult queries and the need for system auditing to optimize performance. You’d think Luhn had taken a trip to the future, the way he spoke of features like automatically learning the interests of users based on the information they use (in 1958!).

 

1970: Databases Evolved for Widespread Use

When Edgar Codd was first transferred to the IBM Research Laboratory in San Jose, CA in 1968, database products already existed – but only those with highly specialized skills could really use the data they provided. Multi-source information tended to sit in silos, and reports were one-dimensional. No single, consistent view of the data was possible; it came out fragmented.

Recognizing this flaw, Edgar Codd published an important paper in 1970 that revolutionized the way people thought about databases. His proposal of a “relational database model” was adopted globally, and in 2017 this great technical achievement is still very much in practice. According to the ACM, “It is no exaggeration to say that essentially all databases in use or under development today are based on Codd’s ideas. Whenever anyone uses an ATM machine, or purchases an airline ticket, or uses a credit card, he or she is effectively relying on Codd’s invention.”
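
To make Codd’s idea concrete, here is a minimal sketch of the relational approach using Python’s built-in sqlite3 module. The table names and sample rows are hypothetical, chosen only to show how one declarative query joins formerly siloed records into a single, consistent view.

```python
# A minimal sketch of the relational idea using Python's built-in sqlite3.
# The schema and rows are hypothetical, for illustration only.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Two former "silos" become two related tables that share a key.
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, region TEXT)")
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                [(1, "Acme Ltd", "EU"), (2, "Globex", "US")])
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 1, 1200.0), (2, 1, 450.0), (3, 2, 980.0)])

# One declarative query joins the silos into a single, consistent view.
cur.execute("""
    SELECT c.region, SUM(o.amount) AS total_sales
    FROM customers c JOIN orders o ON o.customer_id = c.id
    GROUP BY c.region
""")
print(cur.fetchall())  # [('EU', 1650.0), ('US', 980.0)]
conn.close()
```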

The first Decision Support Systems (DSS) grew out of these early database management systems. Many BI vendors jumped in, creating a variety of tools that made it possible to access and organize data in a simpler way.

 

1980s: Streamlined and Simplified BI

The ‘80s heralded the arrival of data warehouses. The “Fathers of Data Warehousing” were Bill Inmon and Ralph Kimball (one of the original architects). Most businesses ran in-house data analysis solutions during office hours, or in “batch mode” at the end of the work day and on weekends; these covered run-the-business functions like order entry, MRP, and accounting. Inmon and Kimball’s breakthroughs and methodologies brought structure and order to BI.

Data warehousing effectively shortened the time it took to access an organization’s data, which was now housed in a single location. The staples of BI today, Extract, Transform, Load (ETL) and OLAP, were developed at that time.
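
As a rough illustration of what ETL involves, the sketch below reads rows from a hypothetical sales.csv export, reshapes them, and loads them into a local SQLite file standing in for the warehouse. Real warehouse pipelines run on dedicated platforms with scheduling and monitoring, but the three steps are the same.

```python
# A minimal ETL sketch, assuming a hypothetical sales.csv with columns
# order_date (ISO format), region, and amount, and a local SQLite file
# standing in for the data warehouse.
import csv
import sqlite3

def extract(path):
    # Extract: read raw rows from the source system's CSV export.
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    # Transform: cast types and derive a reporting-friendly month field.
    out = []
    for r in rows:
        month = r["order_date"][:7]          # "YYYY-MM" from an ISO date
        out.append((month, r["region"], float(r["amount"])))
    return out

def load(rows, db_path="warehouse.db"):
    # Load: append the conformed rows into the warehouse fact table.
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS sales_fact (month TEXT, region TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales_fact VALUES (?, ?, ?)", rows)
    conn.commit()
    conn.close()

if __name__ == "__main__":
    load(transform(extract("sales.csv")))
```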

The Multiway Data Analysis Consortium, a conference held in 1988, led to further simplification of BI analysis. After this conference, the phrase “Business Intelligence” began gathering momentum, and Howard Dresner, an analyst at Gartner, brought the term into everyday use.

There were many advances in BI during this time, but users still wanted more from their data than the ability to analyze it in Excel spreadsheets. They wanted one version of the truth.

 

1990s: BI Reports and Visualization

Business Intelligence in the ‘90s functioned as a way to produce, organize and visualize data in presentable reports. This was BI 1.0.

The challenges were the complexity of the process and the time it took to extract data. In this decade, businesses became highly reliant on their IT departments, as managers could not execute BI projects on their own. Only those with extensive analytics training could gain insights, as the tools tended to be unintuitive, and there was a long delay in producing reports for senior management.

There was a need for BI that could be used by non-technical users. In 1997, our company, PARIS Technologies, launched PowerOLAP in response to that call. This product – currently available in version 16 – provided a fast method to integrate relational data into an optimized OLAP calculation engine. It has been called the most “Excel-user friendly” OLAP offering on the market, and it led the self-service BI trend.
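
As a loose illustration of the OLAP idea (not PowerOLAP’s actual engine or API), the snippet below uses pandas to roll relational-style fact rows up into a small “cube” that can be sliced by dimension. The data and column names are invented for the example.

```python
# A rough sketch of OLAP-style aggregation with pandas; the fact rows and
# dimension names are hypothetical and stand in for relational source data.
import pandas as pd

facts = pd.DataFrame({
    "year":    [2016, 2016, 2017, 2017, 2017],
    "region":  ["EU", "US", "EU", "US", "US"],
    "product": ["A", "A", "B", "A", "B"],
    "sales":   [100.0, 250.0, 175.0, 300.0, 125.0],
})

# Build a "cube": sales aggregated over the region x year dimensions.
cube = facts.pivot_table(values="sales", index="region", columns="year",
                         aggfunc="sum", fill_value=0)
print(cube)

# Slice the cube: 2017 sales by region.
print(cube[2017])
```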

 

21st Century: BI Vendors Mushroom

At the start of the 2000s, companies began to realize the exceptional value of BI capabilities. They wanted their IT departments to deliver reports daily, and then hourly. BI vendors met the challenge by creating tools with self-service options, so that non-technical users could gather and analyze the specific data they needed for their jobs. Welcome to BI 2.0!

Technologies like real-time processing allowed organizations to make decisions on the latest information available. Social media platforms generated mammoth amounts of usable data for companies, as people (fans and followers) had more conversations and revealed their preferences, interests, and pain points.

A company’s IT department did not have to take on every data analysis project, as everyone finally had full access to data that was easy to view, manipulate, and draw insights from.

From 2005 onward, organizations learned that, to stay competitive, BI was a fundamental requirement. The challenge was to ensure that the data was governed, secure, and trustworthy.

 

2005 and Beyond

The size and growth of the Internet mean that an extraordinary amount of data is generated every day. (Did you know that 204 million emails are sent every minute?) The vast quantities of data being amassed create a need for more BI visualization tools to make sense of this ocean of information.

The arrival and widespread adoption of cloud-based solutions expanded the reach of BI platforms. Today, issues of complexity and speed have largely been addressed. The Internet’s exponential growth assisted with these developments: the cloud reduces the storage costs associated with Big Data, and companies can now access insights quickly and conveniently.

So, what comes next? In the coming years, expect the next phase of BI to include contextual insights (embedded, suggestive BI and cognitive computing), actionable insights, and a continuous feedback loop for improvement. It all comes down to “insights.”

 

Challenges Still to be Addressed

A timeline of OLAP technology shows that the tools and language have helped users become more adept at manipulating and analyzing data. However, the software businesses often use today remains clunky and inflexible. Most companies still rely on two-dimensional spreadsheets to represent multidimensional businesses, yet Excel was never designed to handle millions of rows of data. Few products, including our own Olation, work to seamlessly connect back-end systems and front-end visualization tools in a way that is lightweight and agile.

Despite BI’s improvement of white-collar productivity and collaboration, only the most forward-thinking companies are pushing to change their culture and do things more effectively. There remains an unaddressed need to turn BI information into business analytics (BA): planning and forecasting. Most business people turn to Excel to make that happen, and they do so in a way that is dynamically connected neither to their data systems nor to the spreadsheet plans that others in their company may be making.

In the digital, data-driven age, one must embrace BI to stay in business. But BI still has a way to go when it comes to revolutionizing team-wide usability for guidance in historic reporting and future planning. In 2017, PARIS Tech is proud to be on the frontier of combining relational and multidimensional database technologies into a single, streamlined solution!