Overview: how data technologies are affecting business intelligence
Every time data hops, intelligence drops. The plethora of data technologies has turned each business use case into a sequence of hops tailored to that case alone. While this sequential pipeline delivers the required outcome, often at great cost, the inability to reuse the same pipeline, the same technologies, and in most cases even the same methods means that every new use case spawns a new pipeline with a new series of hops. This has created two problems in the world of business intelligence: (i) too many data pipelines and technologies to manage, and (ii) too many hops before data reaches its final destination and purpose, decision enablement.
While data is still hopping, it grows stale for decision-making
A simple business intelligence use case requires data to hop seven times on average before it reaches its destination. Why does data have to hop so much? First, data must be transformed repeatedly: from its unstructured form to a structured form, then to a relational form, then to an ingestible form for computation, then through queuing into storage for in-memory compute, and finally into analytical storage for querying and presentation to the end-user. Second, at each hop a different data management technology processes the data, each limited to a specific purpose that ends once it hands its output to the next technology. The handover itself requires generating files in specific formats that must be stored in the interim and re-ingested by the next tool. These repeated hops and writes increase latency to the point where the data is stale by the time it can inform a decision. The common workaround is to shrink the quantum of data being processed and to buy expensive cloud services that speed up each hop, rather than to reduce the number of hops.
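The hop-and-handover pattern described above can be sketched in a few lines. This is an illustrative toy, not any specific product's pipeline: each stage name is hypothetical, and the point is that every handover serializes the data to an interim format that the next stage must re-ingest.

```python
import csv
import io
import json

def hop_extract(raw_log: str) -> list[dict]:
    # Hop 1: parse unstructured log lines into structured records.
    records = []
    for line in raw_log.strip().splitlines():
        ts, level, amount = line.split("|")
        records.append({"ts": ts, "level": level, "amount": float(amount)})
    return records

def hop_to_interim_csv(records: list[dict]) -> str:
    # Hop 2: serialize to a relational/interim format for the next tool.
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["ts", "level", "amount"])
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

def hop_reingest(interim_csv: str) -> list[dict]:
    # Hop 3: the downstream tool re-ingests the interim file it was handed.
    return list(csv.DictReader(io.StringIO(interim_csv)))

def hop_compute(records: list[dict]) -> dict:
    # Hop 4: in-memory aggregation for the analytical layer.
    total = sum(float(r["amount"]) for r in records)
    return {"row_count": len(records), "total_amount": total}

def hop_to_analytical_store(summary: dict) -> str:
    # Hop 5: persist to analytical storage for final querying.
    return json.dumps(summary)

raw = "2024-01-01|INFO|10.5\n2024-01-01|INFO|4.5"
result = hop_to_analytical_store(
    hop_compute(hop_reingest(hop_to_interim_csv(hop_extract(raw))))
)
```

Note that the CSV round trip (hops 2 and 3) does no analytical work at all; it exists only so that one tool can hand data to the next, which is exactly the latency the article is describing.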
Too many disparate data technologies, with virtually no architecture standards
There is far too much data technology available, each piece seeking to find its purpose. Often, that purpose is assigned by solution architects who build without scalability in mind. Data technology therefore gets consumed anywhere and everywhere in a form that does not scale beyond the immediate problem at hand. The absence of clear data architecture standards, combined with quasi-knowledge in a fast-evolving field, leads to data fatigue.
Pre-architected data technology and the end of the ETL era
The only way to stop the hopping and start focusing on outcomes is pre-architected data technology integrated with compute and visualization. The old way, processing data from multiple sources through an ETL pipeline that hands the data to the next processor, then to the next ETL pipeline, and so on, is simply not conducive to the needs of today's business. ETL technologies, along with today's disparate data technologies, are set to be replaced by pre-architected business intelligence platforms with a strong data management layer that handles ingestion, transformation, compute, storage, and processing in a virtually single-click format. More importantly, the future of data management is a single architecture that meets virtually any use case.
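By contrast, the pre-architected idea can be sketched as one composed pipeline where every stage passes data onward in memory, with no interim files between tools. The stage names and data shapes here are hypothetical, chosen only to mirror the hop example above.

```python
from functools import reduce

def parse(raw: str) -> list[dict]:
    # Ingest: unstructured text becomes structured rows, in memory.
    return [{"amount": float(line.split("|")[1])}
            for line in raw.strip().splitlines()]

def transform(rows: list[dict]) -> list[dict]:
    # Transform: drop invalid rows without serializing to an interim file.
    return [r for r in rows if r["amount"] >= 0]

def aggregate(rows: list[dict]) -> dict:
    # Compute: produce the analytics-ready result directly.
    return {"total": sum(r["amount"] for r in rows)}

def run_pipeline(raw: str, stages=(parse, transform, aggregate)) -> dict:
    # Single architecture: one composed pipeline, no hand-offs via storage.
    return reduce(lambda data, stage: stage(data), stages, raw)

result = run_pipeline("a|2.0\nb|3.5\nc|-1.0")
```

The design point is that `stages` is a single ordered composition rather than a chain of separate tools, so there is nothing to serialize, store, and re-ingest between steps.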
The future of data management is no-code, no stress, and no latency
Managing data technologies today means writing scripts at every stage of the process, in formats and languages that change with each technology. The future of data management is visual drag-and-drop of data management components on a single platform. It will make many of today's engagements of people and processes unnecessary and dramatically reduce stress for the end-user, who will be able to demand data for any use case at a click rather than through lengthy processes that prepare and transform data before it can be accessed.
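One way to picture the no-code model: a drag-and-drop interface would emit a declarative specification of the pipeline, and a generic runner would execute it, so the end-user never writes per-technology scripts. The spec schema and step names below are entirely hypothetical, a minimal sketch of the idea.

```python
# Registry of reusable pipeline steps; in a no-code tool these would be the
# blocks the user drags onto the canvas. Names here are illustrative only.
STEPS = {
    "filter": lambda rows, field, minimum: [r for r in rows if r[field] >= minimum],
    "sum":    lambda rows, field: sum(r[field] for r in rows),
}

def run(spec: list[dict], rows: list[dict]):
    # Interpret the declarative spec step by step; the user edits the spec,
    # not the code behind each step.
    data = rows
    for step in spec:
        params = {k: v for k, v in step.items() if k != "op"}
        data = STEPS[step["op"]](data, **params)
    return data

# What a drag-and-drop UI might emit: keep rows with revenue >= 100, then total.
spec = [
    {"op": "filter", "field": "revenue", "minimum": 100},
    {"op": "sum", "field": "revenue"},
]
rows = [{"revenue": 50}, {"revenue": 120}, {"revenue": 200}]
total = run(spec, rows)
```

The user-facing artifact is the `spec` list alone; swapping or reordering blocks changes the pipeline without touching any stage's implementation.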
The promise of self-service analytics has largely fallen short because of the complexity and disparity of data technologies. The future of business intelligence is zero-hop, zero-latency, on-demand analytics. Once the data hopping stops, analytics will become a servant to the user, and not the other way around.