A capability model is a structure that represents the core abilities and
competencies of an entity (a department, organization, person, system, or
technology) in achieving its objectives, especially in relation to its overall
mission and functions.
The Big Data Capability Model (BDCM) defines the key functionalities required
to deal with Big Data problems and challenges.
It describes the major features, behaviors, practices, and processes in an
organization that can reliably and sustainably produce the outcomes Big Data
demands. The BDCM consists of the following elements:
Collection: collect raw data (sources, formats, discovery, protocols, staging)
ELT: extract, load, and transform data
Store: NoSQL repository (key-value, column-based, document-oriented, graph), Hadoop, MPP, in-memory, cache
Integration: data move, messaging, consumption, access, connector
Processing... (more)
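The ELT element above can be sketched in a few lines: extract raw records, load them untouched into a staging table, then transform inside the store. This is a hypothetical toy (the table names, column names, and SQLite backend are illustrative stand-ins, not part of the BDCM itself):

```python
import sqlite3

# Hypothetical ELT sketch: extract raw records, load them as-is into a
# staging table, then transform inside the store (SQLite stands in here
# for whatever repository the BDCM "Store" element selects).

def extract():
    # Extract: raw data arrives in mixed quality from various sources.
    return [("2015-01-01", "42"), ("2015-01-02", "17"), ("2015-01-03", "n/a")]

def load(conn, rows):
    # Load: stage the raw rows without cleansing them first.
    conn.execute("CREATE TABLE staging (day TEXT, reading TEXT)")
    conn.executemany("INSERT INTO staging VALUES (?, ?)", rows)

def transform(conn):
    # Transform: cleanse and type-cast inside the store, after loading.
    conn.execute(
        "CREATE TABLE readings AS "
        "SELECT day, CAST(reading AS INTEGER) AS reading "
        "FROM staging WHERE reading GLOB '[0-9]*'"
    )
    return conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0]

conn = sqlite3.connect(":memory:")
load(conn, extract())
print(transform(conn))  # 2 rows survive the cleanse
```

The point of ELT (as opposed to ETL) is visible in the ordering: the dirty "n/a" row is loaded first and filtered only in the transform step, inside the store.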
IDG Enterprise's 2015 Big Data and Analytics survey shows that the number of
organizations with deployed/implemented data-driven projects has increased by
125% over the past year. The momentum continues to build.
Big Data as a concept is characterized by 3Vs: Volume, Velocity, and Variety.
Big Data implies a huge amount of data, and due to its sheer size it tends
to be unwieldy. The dominating implementation solution is Hadoop, which is
batch-based. More than a handful of companies in the market merely collect
lots of noisy data blindly but don't know how to cleanse it,... (more)
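The batch paradigm that Hadoop popularized can be sketched loosely as a map phase followed by a shuffle-and-reduce phase. This is plain in-process Python, not the actual Hadoop API; real Hadoop distributes these same phases across a cluster:

```python
from collections import defaultdict

# A loose, single-process sketch of the batch MapReduce pattern:
# the whole input batch is consumed before any result is available,
# which is why Hadoop-style processing is batch-based, not streaming.

def map_phase(lines):
    # Map: emit a (key, 1) pair for every word in the batch.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # Shuffle + reduce: group pairs by key, then sum the counts per key.
    groups = defaultdict(int)
    for key, count in pairs:
        groups[key] += count
    return dict(groups)

batch = ["big data big analytics", "big data"]
counts = reduce_phase(map_phase(batch))
print(counts["big"])  # 3
```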
The IoT market is on track to hit $7.1 trillion in 2020, according to an
IDC study. Are we ready for this massive demand? How can we deal with it?
Some firms take no action, claiming they are too busy with what they are
already doing. Some organizations jump in blindly, without thinking or
planning. Some companies opt to take a bold stance and bet on something
immature. Needless to say, all of these approaches are risky and naive.
What is mandatory is an overarching and adaptive approach to effectively
handle the rapid changes and exponential growth.
An I... (more)
Every day 2.5 quintillion (10^18) bytes of data are created, and 90% of the
data in the world today has been generated in the last couple of years alone.
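To put that daily figure in more familiar units (simple arithmetic only, no external data):

```python
# 2.5 quintillion bytes = 2.5 * 10^18 bytes produced per day.
daily_bytes = 2.5e18
exabytes_per_day = daily_bytes / 1e18    # decimal exabytes (EB)
petabytes_per_day = daily_bytes / 1e15   # decimal petabytes (PB)
print(exabytes_per_day, petabytes_per_day)  # 2.5 EB/day, 2500 PB/day
```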
Big data is a general term used to describe the voluminous amount of
unstructured and semi-structured data, which takes too much time and costs
too much money to load into a traditional data store for analysis. The impact
of big data is significantly cross-cutting, affecting both business and
technology management on the provider and consumer sides.
2012 promises to be a big year for big data and big analytics. To effectively
explore ... (more)
Due to the unprecedented volume, variety, and velocity of Big Data, it is
neither trivial nor straightforward to find a clear path to jumpstart the Big
Data journey. The space is overwhelmingly crowded with immature options and
evolving solutions, and it can be confusing and daunting. Where can you find
an entry point? What is the most effective way to get on board? Which aspects
should you be mindful of? How can you avoid missing what matters most?
Why do you need to begin with the basics?
Here are five areas of consideration for Big Data on-ramp: Structur... (more)