A capability model is a structure that represents the core abilities and
competencies of an entity (a department, organization, person, system, or
technology) in achieving its objectives, especially in relation to its overall
mission and functions.
The Big Data Capability Model (BDCM) defines the key functionalities for
dealing with Big Data problems and challenges.
It describes the major features, behaviors, practices, and processes in an
organization that can reliably and sustainably produce the outcomes required
for Big Data demands. BDCM consists of the following elements:
Collection: collect raw data (sources, formats, discovery, protocols, staging)
ELT: extract, load, and transform data
Store: NoSQL repository (key-value, column-based, document-oriented, graph), Hadoop, MPP, in-memory, cache
Integration: data movement, messaging, consumption, access, connectors
Processing...
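The elements above can be sketched as a simple enumeration. The element names come from the post; the encoding and the coverage check are illustrative assumptions, not part of BDCM itself:

```python
from enum import Enum

class BDCMElement(Enum):
    """Elements of the Big Data Capability Model (illustrative encoding)."""
    COLLECTION = "collect raw data: sources, formats, discovery, protocols, staging"
    ELT = "extract, load, and transform data"
    STORE = "NoSQL repository: key-value, column, document, graph; Hadoop, MPP, in-memory, cache"
    INTEGRATION = "data movement, messaging, consumption, access, connectors"

# A hypothetical use: track which capabilities a platform currently covers
# and report the gaps.
covered = {BDCMElement.COLLECTION, BDCMElement.STORE}
gaps = [e.name for e in BDCMElement if e not in covered]
print(gaps)  # ['ELT', 'INTEGRATION']
```

Keeping the model as an explicit structure like this makes a capability-gap assessment a mechanical check rather than an ad-hoc exercise.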
I will present a tutorial on service-oriented, model-driven architecture
design for cloud solutions at the upcoming International Conference on Web
Services (ICWS 2009). Please join the session to explore a state-of-the-art
approach to effectively developing cloud services in a systematic fashion.
Contact Tony Shan (email@example.com) for more info.
Due to the unprecedented volume, variety, and velocity of Big Data, finding a
clear path to jumpstart the Big Data journey is neither trivial nor
straightforward. The space is overwhelmingly crowded with immature options
and evolving solutions, which makes it confusing and daunting. Where can you
find an entry point? What is the most effective way to get on board? Which
aspects should you be mindful of? How can you avoid missing the paramount
things?
Why do you need to begin with the basics?
Here are five areas of consideration for the Big Data on-ramp: Structur...
The Big Data Maturity Model (BDMM) is a qualitative method for showing the
growth and increasing impact of big data capabilities in an IT environment
from both business and technology perspectives. It comprises a set of
criteria, parameters, and factors that can be used to describe and measure
the effectiveness of big data adoption and implementation.
Five levels of maturity are defined: Advanced, Dynamic, Optimized, Primitive,
and Tentative (ADOPT). The definitions of these levels are listed below:
Primitive: initial stage of disconnected activities in an unorganized fashion
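As a rough sketch, the levels could be encoded as an ordered enumeration. Only Primitive's position as the initial stage is stated in the post; the relative ordering of the other four levels is an assumption here:

```python
from enum import IntEnum

class BDMMLevel(IntEnum):
    """BDMM maturity levels; ordering beyond Primitive is an assumption."""
    PRIMITIVE = 1  # initial stage: disconnected, unorganized activities
    TENTATIVE = 2
    ADVANCED = 3
    DYNAMIC = 4
    OPTIMIZED = 5

def has_reached(current: BDMMLevel, target: BDMMLevel) -> bool:
    """True if an assessed environment is at or beyond the target level."""
    return current >= target

print(has_reached(BDMMLevel.PRIMITIVE, BDMMLevel.ADVANCED))  # False
```

An ordered encoding like this lets an assessment compare environments or set target levels numerically rather than by name.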
Every day 2.5 quintillion (10^18) bytes of data are created, and 90% of the
data in the world today has been generated in the last couple of years alone.
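For a sense of scale, the daily figure quoted above works out to a few exabytes per day and close to a zettabyte per year:

```python
# 2.5 quintillion bytes per day, as quoted above
bytes_per_day = 2.5e18

exabytes_per_day = bytes_per_day / 1e18            # 1 EB = 10^18 bytes
zettabytes_per_year = bytes_per_day * 365 / 1e21   # 1 ZB = 10^21 bytes

print(f"{exabytes_per_day:.1f} EB/day")            # 2.5 EB/day
print(f"{zettabytes_per_year:.2f} ZB/year")        # 0.91 ZB/year
```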
Big data is a general term used to describe voluminous amounts of
unstructured and semi-structured data that take too much time and cost too
much money to load into a traditional data store for analysis. The impact of
big data cuts across both business and technology management on the provider
and consumer sides.
2012 is shaping up to be a big year for big data and big analytics. To
effectively explore ...