A capability model is a structure that represents the core abilities and
competencies of an entity (e.g., a department, organization, person, system,
or technology) needed to achieve its objectives, especially in relation to its
overall mission and functions.
The Big Data Capability Model (BDCM) defines the key functionalities for
dealing with Big Data problems and challenges.
It describes the major features, behaviors, practices, and processes in an
organization that can reliably and sustainably produce the outcomes required
by Big Data demands. BDCM consists of the following elements:
Collection: collect raw data, sources, formats, discovery, protocols, staging
ELT: extract, load, and transform data
Store: NoSQL repository (key-value, column-based, document-oriented, graph), Hadoop, MPP, in-memory, cache
Integration: data movement, messaging, consumption, access, connectors
Processing... (more)
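To make the model concrete, here is a minimal sketch of how the BDCM elements could be represented in code. The element names and concerns come from the list above; the `Capability` class and the `find_capability` helper are illustrative assumptions, not part of any standard BDCM implementation.

```python
from dataclasses import dataclass


@dataclass
class Capability:
    """One element of the Big Data Capability Model (illustrative sketch)."""
    name: str
    concerns: list


# Elements taken from the BDCM list above (the full model has more).
BDCM = [
    Capability("Collection", ["raw data", "sources", "formats",
                              "discovery", "protocols", "staging"]),
    Capability("ELT", ["extract", "load", "transform"]),
    Capability("Store", ["key-value", "column-based", "document-oriented",
                         "graph", "Hadoop", "MPP", "in-memory", "cache"]),
    Capability("Integration", ["data movement", "messaging",
                               "consumption", "access", "connectors"]),
]


def find_capability(model, name):
    """Return the capability whose name matches, or None if absent."""
    return next((c for c in model if c.name == name), None)


print(find_capability(BDCM, "Store").concerns[0])  # prints "key-value"
```

An inventory like this can serve as a checklist when assessing which capabilities an organization already has in place and which gaps remain.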
Due to the unprecedented volume, variety, and velocity of Big Data, it is
neither trivial nor straightforward to find a clear path to jumpstart the Big
Data journey. The space is overwhelmingly crowded with immature options and
evolving solutions, which makes it confusing and daunting. Where can you find
an entry point? What is the most effective way to get on board? Which aspects
should you be mindful of? How do you make sure you don't miss the most
important things?
Why do you need to begin with the basics?
Here are five areas of consideration for Big Data on-ramp: Structur... (more)
The Internet of Things (IoT) is booming. The "Software for the Internet of
Things (IoT) Developer Survey" report, published by Embarcadero
Technologies last month, shows that 77% of development teams will have IoT
solutions in active development in 2015, with almost half (49%) of IoT
developers anticipating that their solutions will generate business impact by
the end of this year.
The IoT Maturity Model (IoTMM) is a qualitative method for gauging the growth
and increasing impact of IoT capabilities in an IT environment from both
business and technology perspectives. It comprises a set of crite... (more)
IDG Enterprise's 2015 Big Data and Analytics survey shows that the number of
organizations with deployed/implemented data-driven projects has increased by
125% over the past year. The momentum continues to build.
Big Data as a concept is characterized by the 3Vs: Volume, Velocity, and
Variety. Big Data implies a huge amount of data, and due to that sheer size it
tends to be clumsy to handle. The dominant implementation solution is Hadoop,
which is batch based. More than a handful of companies in the market blindly
collect lots of noisy data, but they don't know how to cleanse it,... (more)
I will present a tutorial on service-oriented, model-driven architecture
design for cloud solutions at the upcoming International Conference on Web
Services (ICWS 2009). Please join the session to explore a state-of-the-art
approach to developing cloud services effectively and systematically.
Contact Tony Shan (email@example.com) for more info.