Every generation laughs at the old fashions, but religiously follows the new. – Henry David Thoreau
One must be absolutely modern. – Arthur Rimbaud
That which is truly modern will always be modern. – Arnold Schoenberg
Modern Analytical Platforms
Few business functions are more susceptible to the whims of fashion than analytics. While the underlying purposes, techniques, and business applications have been remarkably stable over time, the quickly shifting sands of technological gloss have serious implications for how major organizations should approach building an analytical platform for a support function or line of business.
While there have been technological advances over the past two decades, most relate to the volume of data that can be handled at speed and to functionality within the visualization layer. These advances have done astonishingly little to increase the ability of most organizations to implement the primary use cases typically used to justify an analytical platform in the first place. By way of a brag disguised as a data point: we were successfully running genetic algorithms in Excel Solver to optimize airline schedules for maximum profitability in 2001. Neither data volume nor visualization capability was a limitation on a problem of that size 20 years ago.
Platform Components – Building the Car
A car is a reasonably apt metaphor for understanding these analytical components, so let’s unify our acronyms and call our modern platform a Consolidated Analytics Regime (CAR). First of all, let us consider that the price of entry into a CAR is modernized operations. This means that the organization in question has implemented systems to automate and record the interactions that constitute its primary operations. For example, when someone buys a concert ticket, the attributes of the sale (time, person purchasing, venue, act) are recorded electronically, and the exchange of money involved is completely automated.
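To make "recorded electronically" concrete, here is a minimal sketch of the concert-ticket example as a structured transaction record. The field names and values are illustrative, not taken from any particular ticketing system:

```python
from dataclasses import dataclass, asdict
from datetime import datetime
import json

@dataclass
class TicketSale:
    # The attributes of the sale, captured at transaction time
    sold_at: str
    purchaser: str
    venue: str
    act: str
    price: float

sale = TicketSale(
    sold_at=datetime(2024, 5, 1, 19, 30).isoformat(),
    purchaser="customer-4821",
    venue="Red Rocks",
    act="Willie Nelson",
    price=89.50,
)

# The operational system persists the event as structured data,
# ready for downstream extraction by the CAR.
record = json.dumps(asdict(sale))
print(record)
```

The point is simply that every interaction leaves a machine-readable trail; without records like this, there is nothing for the rest of the CAR to run on.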
These applications are the road that the CAR navigates. If the road is a superhighway paved with well-designed transactional systems, the only constraint on speed is the performance of the CAR itself. If transactions are not automated or stored primarily in spreadsheets, you’re driving a Ferrari on a dirt road.
Now let’s consider the tires. All of that data from your operational/transactional systems (OTS) needs to get into the CAR. This is the ETL engine. It can be as primitive as a manual .csv upload or as sophisticated as a highly orchestrated, parallel, incremental ingestion. The CAR’s interaction with the OTS is ongoing, and that contact can be continual or intermittent. When ETL is functioning well, it is reliable and near enough to real time that the impact on the analytical use cases is unnoticeable. Like your tires, if you’re thinking about them, they’re a problem. That said, this is the stickiest problem for most organizations.
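The difference between the primitive and the orchestrated end of that spectrum mostly comes down to pulling only what has changed. Here is a toy sketch of incremental ingestion, assuming the source rows carry a monotonically increasing modification timestamp (the `updated_at` watermark column is an invented example):

```python
# Toy incremental-ingestion sketch: each ETL run pulls only rows
# modified since the watermark recorded by the previous run.
source_rows = [
    {"id": 1, "updated_at": "2024-05-01T10:00", "amount": 40.0},
    {"id": 2, "updated_at": "2024-05-01T11:00", "amount": 55.0},
    {"id": 3, "updated_at": "2024-05-02T09:00", "amount": 89.5},
]

def extract_incremental(rows, last_watermark):
    """Return rows modified after the previous run's watermark,
    plus the new watermark to persist for the next run."""
    fresh = [r for r in rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in fresh), default=last_watermark)
    return fresh, new_watermark

# First run: everything after the initial watermark comes across.
batch, watermark = extract_incremental(source_rows, "2024-05-01T00:00")
print(len(batch), watermark)

# Second run: nothing new has arrived, so the batch is empty.
batch2, _ = extract_incremental(source_rows, watermark)
print(len(batch2))
```

Real ETL tools add parallelism, retries, and schema handling on top, but the watermark loop above is the heart of why well-built ETL stays close to real time without reprocessing everything.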
The real engine of the CAR is the calculation layer. This is where all of the calculations and translations reside. This is also where the data association across data sources occurs. Potentially included here is a multi-dimensional engine, or a big-data solution like Hadoop, BigQuery, Azure Data Lake, etc. This layer is processing intensive and typically requires some grown-up hardware. That shouldn’t be a problem in this day and age. Contact your local cloud provider; tell ’em Northcraft sent ya.
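The "data association across data sources" the engine performs is, at bottom, a join on shared keys. A minimal sketch, joining a hypothetical ticketing extract onto a hypothetical CRM extract by customer key (all names and values invented for illustration):

```python
# Two extracts from different operational systems, associated on a
# shared customer key -- the core job of the calculation layer.
ticket_sales = [
    {"customer_id": "c1", "act": "Willie Nelson", "price": 89.5},
    {"customer_id": "c2", "act": "Tribute Night", "price": 25.0},
]
crm = {
    "c1": {"segment": "season-ticket"},
    "c2": {"segment": "first-timer"},
}

def associate(sales, customers):
    """Left-join sales facts onto CRM attributes by customer_id."""
    return [
        {**sale, **customers.get(sale["customer_id"], {"segment": "unknown"})}
        for sale in sales
    ]

enriched = associate(ticket_sales, crm)
print(enriched[0]["segment"])
```

At production scale this same association runs over billions of rows, which is where the multi-dimensional engines and big-data platforms (and the grown-up hardware) earn their keep.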
The presentation layer is the paint on the CAR. That may sound demeaning, and maybe I was born a data guy, but I’ve had to become a message guy. If management isn’t compelled by the way you present the data, you won’t get your message across, and the decisions the data is meant to support will continue to be made by intuition. If someone wants a black CAR, don’t give them a white one.
Visualization tools utilized as presentation layers have modernized over the past twenty years, but I would argue that there are only three substantial improvements over Excel 2000:
- Interactivity across visualizations
- Presentation in a web layer – and all the mobile/collaborative capabilities that implies
- Crowd sourcing visualizations
- Fake: Direct interaction with the underlying database(s) constituting the calculation layer. (You could/can do this with Excel at least since Y2K).
I would argue that the first big leap forward in visualization tools was Tableau, which achieved a modern look with same-page interactive visualizations. At the time, I was 100% certain that Microsoft would buy Tableau because they were so far behind. Much to my delight, Microsoft did the hardest thing an organization can do – they threw away an existing, successful(-ish) product, SharePoint’s PerformancePoint, and invested heavily in its complete replacement: PowerBI. For those of you keeping score at home, my favorite example of this was Apple’s decision to replace the click-wheel on the iPod (the coolest thing I had seen at the time) with the gesture-based touch screen (the two-decade reigning champion of coolest thing).
PowerBI accomplished all three/four of the modernized capabilities, and Tableau went on to be purchased by Salesforce, which, as an application manufacturer, is a strange home. It’s certainly not too late for those guys, but there have been some recent stumbles that suggest underinvestment or a lack of prioritization. With Satya’s Microsoft, I see laser focus on the analytics consumer, and when MS gets beaten on this function or that, I see the commitment to catch up (not a sponsor). The point being: when choosing the presentation layer, you want a chief who will back you up.
The final component that we’ll discuss is that of analytical products. As a general category, these are automated, decision-oriented digests of the calculation layer. In practical terms, these are the AI/ML functionality. They typically let your organization make thousands of small decisions more quickly or more accurately and juice the bottom line.
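A sketch of what "thousands of small decisions" looks like in practice: an automated rule that decides, for every open incident, whether it should be escalated. The thresholds and field names are invented for illustration; a real analytical product would learn or tune these from the calculation layer:

```python
# Illustrative analytical product: an automated escalation decision
# applied to every open incident, instead of a human triaging each one.
def should_escalate(incident):
    """Escalate when an incident has outlived its priority's age limit."""
    age_limit_hours = {"high": 4, "medium": 24, "low": 72}  # illustrative
    return incident["age_hours"] > age_limit_hours[incident["priority"]]

incidents = [
    {"id": "INC-1", "priority": "high", "age_hours": 6},
    {"id": "INC-2", "priority": "low", "age_hours": 6},
]
escalations = [i["id"] for i in incidents if should_escalate(i)]
print(escalations)
```

Replace the hand-set thresholds with a trained model and run it across the whole ticket queue, and you have the stereo: many small decisions, made faster and more consistently than any human queue manager.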
To summarize, this is the CAR that you’re attempting to build, and it’s important to get as much right as you can.
- Operational/Transactional Systems – Road
- ETL – Tires
- Calculation Layer – Engine
- Presentation Layer – Paint Job
- Analytical Products – Stereo
There are three basic approaches to acquiring a CAR:
- In-tool reporting
- Best of Breed
- Partner Provider (Pure/Modified)
In-Tool Reporting
In-tool reporting simply means handling the analytical needs of the organization with the reporting tool that comes out of the box with each application. For example: Remedy Smart Reporting for Remedy, Performance Analytics for ServiceNow, SolarWinds whatever-it’s-called for SolarWinds.
The benefits and drawbacks of this approach are pretty apparent. The real benefit is that the need for ETL is eliminated and the tool is already built. The downsides vary by tool, but typically include:
- Inability to combine data sets from other applications
- Can only handle limited volumes of data
- Custom interface that demands non-reusable training
- No advanced analytical capabilities (stereo has been ripped out of the car)
In all truth, this isn’t a CAR. It’s an alternative to a CAR. Not everyone needs a Ferrari . . . but if you have the means, I would highly recommend it.
Best of Breed
The second, and very highly recommended option (though not by me), is to look at each component of the CAR and select the provider that is best of breed for that function. For the tires, select Informatica. For the engine, select a combination of OracleDB, MicroStrategy, and BigQuery. For the paint, slather on some Qlik. For the stereo, crank up Grafana.
I make no significant recommendation on these individual components, but at the time of writing, they are each current contenders for the crown in their individual functions. If it seems I’m being flippant by trotting out defunct software products – that’s only because you’re reading this article six or more months later. One does not need predictive analytics to know how this will read in the astonishingly near future.
That is the crux of the issue: there is a huge risk of vendor obsolescence. What does your CAR look like with the stereo ripped out, the paint scratched, or the tires flat? In a practical sense, any interoperability that your organization has fought to achieve may suddenly go away without an immediately viable alternative. You have a closet full of parachute pants.
Partner Provider
The final, and Northcraft recommended (and followed), approach is to identify a provider that has a viable solution for each component of the CAR. Each component may not be the best in its class, and if vendor obsolescence does come to bear, its impact is greater.
However, what you may lose in individual component functionality, you typically gain in interoperability. Vendor obsolescence risk can be mitigated by the size of the partner – there are only a handful of potential partners that have an end-to-end product offering. Even so, you have to keep your eye on the industry to judge continued commitment. For the most part, software providers don’t go out of business, they just stop investing in the product. If you’re not paying attention, that shirt you feel like you just bought might not look so great out on the street.
Conclusion/3rd Act Reveal
If you want to build a CAR that will continue to provide cutting edge capabilities to your organization, you have to select a provider that has a vision for and commitment to each critical component. Northcraft Analytics has a prebuilt CAR on a Microsoft backbone that fits most large IT organizations, saving you years of platform mistakes and development. We can get you up and running with the latest capabilities and keep you modern as analytical fashions change. Schedule some time with us by clicking any of the graphics in this article. We would love to help.
There is one other reason for dressing well, namely that dogs respect it, and will not attack you in good clothes – Ralph Waldo Emerson
P.S. Yes, that is Willie Nelson helping Crockett rough up Steve Buscemi