A capability model is a structure that represents the core abilities and
competencies of an entity (a department, organization, person, system, or
technology) to achieve its objectives, especially in relation to its overall
mission and functions.
The Big Data Capability Model (BDCM) defines the key functionalities for
dealing with Big Data problems and challenges.
It describes the major features, behaviors, practices, and processes in an
organization that can reliably and sustainably produce the outcomes required
to meet Big Data demands. BDCM consists of the following elements:
Collection: collect raw data, sources, formats, discovery, protocols, staging
ELT: extract, load, and transform data
Store: NoSQL repository, key-value, column-based, document-oriented, graph, Hadoop, MPP, in-memory, cache
Integration: data move, messaging, consumption, access, connector
Processing: ...
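To make the decomposition above concrete, here is a minimal sketch in Python of the listed BDCM elements as a simple capability registry. The Capability class and its field names are illustrative assumptions, not part of the model, and the elements truncated above are omitted rather than guessed.

from dataclasses import dataclass, field

@dataclass
class Capability:
    """One BDCM element and the concerns it covers."""
    name: str
    concerns: list[str] = field(default_factory=list)

# The BDCM elements named in the text above.
BDCM = [
    Capability("Collection", ["raw data", "sources", "formats",
                              "discovery", "protocols", "staging"]),
    Capability("ELT", ["extract", "load", "transform"]),
    Capability("Store", ["NoSQL repository", "key-value", "column-based",
                         "document-oriented", "graph", "Hadoop",
                         "MPP", "in-memory", "cache"]),
    Capability("Integration", ["data move", "messaging",
                               "consumption", "access", "connector"]),
]

for cap in BDCM:
    print(f"{cap.name}: {', '.join(cap.concerns)}")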
Due to the unprecedented volume, variety, and velocity of Big Data, finding a
clear path to jumpstart the Big Data journey is neither trivial nor
straightforward. The space is overwhelmingly crowded with immature options
and evolving solutions, which makes it confusing and daunting. Where can you
find an entry point? What is the most effective way to get on board? Which
aspects should you be mindful of? How can you avoid missing the paramount
things?
Why do you need to begin with the basics?
Here are five areas of consideration for the Big Data on-ramp: Structur...
An architecture framework establishes a common practice for creating,
interpreting, analyzing, and using architecture descriptions within a
particular domain of application or stakeholder community (ISO/IEC/IEEE
42010). The Big Data Architecture Framework (BDAF) is an architecture
framework for Big Data solutions, aimed at helping manage a set of discrete
artifacts and implement a collection of specific design elements. BDAF
enforces adherence to a consistent design approach, reduces system
complexity, enhances loose coupling, maximizes reuse, decreases
dependencies, ...
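To illustrate the loose coupling and reuse that BDAF promotes, here is a minimal Python sketch of a pluggable pipeline-stage interface. The Stage protocol and the concrete stages are hypothetical examples, not components defined by BDAF itself.

from typing import Iterable, Protocol

class Stage(Protocol):
    # Each stage depends only on this interface, not on other stages,
    # which keeps the design loosely coupled and the stages reusable.
    def run(self, records: Iterable[dict]) -> Iterable[dict]: ...

class Collector:
    def run(self, records: Iterable[dict]) -> Iterable[dict]:
        # Illustrative: tag each record with its source.
        return ({**r, "source": "ingest"} for r in records)

class Transformer:
    def run(self, records: Iterable[dict]) -> Iterable[dict]:
        # Illustrative: derive a new field from an existing one.
        return ({**r, "value_squared": r["value"] ** 2} for r in records)

def run_pipeline(stages: list[Stage], records: Iterable[dict]) -> list[dict]:
    # Stages are composed through the common interface, so swapping
    # one out does not ripple through the rest of the system.
    for stage in stages:
        records = stage.run(records)
    return list(records)

print(run_pipeline([Collector(), Transformer()], [{"value": 4}]))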
As Cloud Computing matures, it becomes important to establish a
knowledgebase: a written guide to the collection of cloud engineering
lifecycle information that reflects best practices and lessons learned, and
provides a framework describing the areas of knowledge with the associated
competence, activities, tasks, and skills required. Here is a jump-start with
a cloud metamodel for the Cloud Computing Body of Knowledge (CCBOK):
[Figure: cloud metamodel for CCBOK]
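Since the metamodel diagram itself is not reproduced here, the sketch below shows one possible way to encode the structure the text describes, a knowledge area with its associated competence, activities, tasks, and skills, in Python. All names and the example entry are illustrative assumptions, not CCBOK content.

from typing import TypedDict

class KnowledgeArea(TypedDict):
    # Fields mirror the attributes named in the text above.
    competence: list[str]
    activities: list[str]
    tasks: list[str]
    skills: list[str]

# Hypothetical entry; the actual knowledge areas are defined by the
# CCBOK metamodel diagram.
ccbok: dict[str, KnowledgeArea] = {
    "Provisioning": {
        "competence": ["capacity planning"],
        "activities": ["request resources", "configure images"],
        "tasks": ["size the environment"],
        "skills": ["IaaS tooling"],
    },
}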
Contact Tony Shan (firstname.lastname@example.org) for more info.
Though most big data projects are data-intensive, a purely data-driven
approach is a big risk. These initiatives actually require multiple
disciplines to implement a viable solution to complex business problems, so a
comprehensive method is necessary to tackle big data issues and challenges
systematically. The Big Data Pyramid, illustrated in the diagram below,
includes the key constructs for capitalizing on big data potential along four
aspects: Technology, Insights, People, and Process (TIPP).
Technology: Enabling Platform - a high-performance model of loosely-coupled...