By Stephen Swoyer, March 26, 2013
BI is increasingly taking on the metaphor of the scientific method: a model in which hypotheses can be tested, and sometimes disproven.
Bob Eve, chief marketing officer with data virtualization (DV) specialist Composite Software Inc., isn’t a big fan of the “3Vs.” This would be the “volume/velocity/variety” triplet often used to make the case for big data. A veteran technologist with an academic pedigree that includes work at both Cal Berkeley and MIT, Eve knows tech — and with three decades of experience, he knows business and economic disruption, too.
Nevertheless, Eve does make the case for a fourth, overlooked “v” — value — that he first alluded to in an interview with BI This Week last year. At TDWI’s recent World Conference in Las Vegas, Eve returned to this theme, flagging what he described as an “encouraging” development in business intelligence (BI).
“The ‘V’ in value is starting to take precedence over the other so-called V’s of big data. Everyone’s trying to do more and more on the analytics side,” he said. “It’s interesting to see the evolution from that more ‘deterministic’ business intelligence approach — for example, ‘I need a report that’s like X’ — to ‘I have this idea I was wondering about, or this hypothesis that I’d like to test’. It’s more of a scientific-method-like approach to analysis.”
That’s what’s missing, says Eve. “I think there’s this link between the scientific method and discovery. For too long, [BI] hasn’t done anything about addressing this [link].”
According to Eve, the biggest barrier to discovery, now as ever, is data preparation and connectivity. “I think there’s a real opportunity about bringing some agility into the first half of this process. Access and [data] preparation account for over half the time that a person might spend just preparing to test a hypothesis,” he observes.
“A lot of the times, some of that data is in spreadsheets. These spreadmarts are part of the analyst’s history. They’re going to need them [in any analysis]. Some of it’s in external data sources. Cloud data. The data’s all over [the place]; they may try to pull [it all] together in their analysis. This idea of what-can-we-do-to-pull-the-data-sets-together-for-analysis, that’s something that needs to be addressed.”
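The pull-the-data-sets-together problem Eve describes can be illustrated with a minimal sketch. This is not Composite’s tooling; the field names and sample rows below are hypothetical, standing in for a local spreadsheet export on one side and an external (say, cloud) source on the other:

```python
# Hypothetical rows as an analyst might load them from a local
# spreadsheet (CSV) export.
spreadmart_rows = [
    {"customer_id": "C1", "region": "West", "forecast": 120},
    {"customer_id": "C2", "region": "East", "forecast": 95},
]

# Hypothetical rows pulled from an external (e.g., cloud) source.
cloud_rows = [
    {"customer_id": "C1", "actual": 110},
    {"customer_id": "C2", "actual": 130},
]

def blend(left, right, key):
    """Inner-join two lists of dicts on a shared key column."""
    index = {row[key]: row for row in right}
    return [
        {**row, **index[row[key]]}
        for row in left
        if row[key] in index
    ]

blended = blend(spreadmart_rows, cloud_rows, "customer_id")
for row in blended:
    print(row["customer_id"], row["forecast"], row["actual"])
```

Even in this toy form, the join logic (matching keys, reconciling columns) is exactly the preparation work that, per Eve, eats over half of an analyst’s time before a hypothesis can be tested.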
This is an area Eve says Composite expects to address in 2013, although he declines to go into specifics. He concedes, however, that even in the most fully realized of DV scenarios, at least circa 2012, connectivity isn’t exactly self-service-able. In other words, the “first” part of discovery — i.e., data access and preparation — is a lot less turnkey than it should be.
At last summer’s Pacific Northwest Summit, for example, Eve acknowledged that the DV status quo was less than ideal. “The instantiation of this [abstracted view] basically is [that] they [i.e., business users] see the views [i.e., materialized representations of source data] in our Studio, which is a development environment. That’s not what an end user is going to want to go to; it’s not friendly enough, it’s not easy enough,” Eve said.
“What’s needed, from the customer point of view … is a catalog. … [There] needs to be a catalog, there needs to be attributes about it; [there] needs to be collaboration stuff: use it for this, don’t use it for this.”
Eve declined to confirm for the record whether Composite is working on such a catalog. Instead, he reiterated his view that the DV status quo — which he likewise stressed is a marked improvement over its non-DV counterparts — could be improved.
“Business people aren’t really comfortable operating at a SQL level,” he acknowledged. Composite is working on a response to this problem, Eve indicated; a catalog-like scheme, he noted, “could make it a little easier for the business guy to find [data sources] and understand them.”
That said, Eve continued, “What do you do next? These are the data sets you want, but you still need to kind of isolate them so that you can build your model. How do you do this? One answer is that you pull them all together on your desktop.” This — what might be called “data blending” at the desktop level — is another area on which Composite is focusing, Eve acknowledged.
The most important new development, he says, is that the expectations of the business — to say nothing of the underlying assumptions of business intelligence — are changing.
“We’re seeing a kind of relaxing of the assumption that we’re going to productionize everything. Some of your analysis may result in failed hypotheses. That’s part of the scientific method, and I think [the business] is increasingly okay with that. It’s kind of like, ‘I haven’t failed, I found a thousand ways that didn’t work.'”
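The failed-hypothesis mindset Eve describes can be sketched in a few lines. The data and the materiality threshold here are invented for illustration; the point is only that the analysis is framed as a test that is allowed to come back negative:

```python
# Toy illustration of treating an analysis as a testable hypothesis
# rather than a fixed report. All figures are invented.
from statistics import mean

# Hypothesis: a promotion lifted average order value.
control = [52.0, 48.5, 50.1, 49.7, 51.2]
promo = [50.3, 49.9, 51.0, 48.8, 50.6]

lift = mean(promo) - mean(control)
MIN_MEANINGFUL_LIFT = 2.0  # hypothetical materiality threshold

if lift >= MIN_MEANINGFUL_LIFT:
    print(f"Hypothesis supported: lift of {lift:.2f}")
else:
    # A failed hypothesis is still a result worth recording.
    print(f"Hypothesis not supported: lift of {lift:.2f}")
```

Under the old “deterministic” BI model, a negative result like this would look like wasted work; under the scientific-method framing Eve describes, it is simply one of the “thousand ways that didn’t work.”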