The Internet of Things (IoT) is booming. The “Software for the Internet of
Things (IoT) Developer Survey” report, published by Embarcadero
Technologies last month, shows that 77% of development teams will have IoT
solutions in active development in 2015, with almost half (49%) of IoT
developers anticipating that their solutions will generate business impact by
the end of this year.
The IoT Maturity Model (IoTMM) is a qualitative method for gauging the growth
and increasing impact of IoT capabilities in an IT environment from both
business and technology perspectives. It comprises a set of criteria,
parameters, and factors that can be used to describe and measure the
effectiveness of IoT adoption and implementation.
Five levels of maturity are defined: Advanced, Dynamic, Optimized, Primitive,
and Tentative (ADOPT). The definitions of these five levels are specified below:
Level Desc... (more)
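The five ADOPT levels form an ordered scale, which can be sketched as an ordered enum. This is an illustrative sketch only: the source lists the levels alphabetically, so the ranking below, from least mature (Tentative) to most mature (Advanced), is an assumption, as is the class name.

```python
from enum import IntEnum

class IoTMaturityLevel(IntEnum):
    # Hypothetical ordering assumption: Tentative is the least mature level
    # and Advanced the most mature; the source lists the levels alphabetically.
    TENTATIVE = 1
    PRIMITIVE = 2
    DYNAMIC = 3
    OPTIMIZED = 4
    ADVANCED = 5

# Because the levels are ordered, maturity assessments can be compared directly.
assert IoTMaturityLevel.DYNAMIC > IoTMaturityLevel.PRIMITIVE
```

Modeling the levels as an `IntEnum` rather than plain strings makes "is environment A more mature than environment B?" a direct comparison.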
Every day 2.5 quintillion (10^18) bytes of data are created, and 90% of the
data in the world today has been generated in the last couple of years alone.
Big data is a general term used to describe the voluminous amount of
unstructured and semi-structured data, which takes too much time and costs too
much money to load into a traditional data store for analysis. The impact of
big data is significantly cross-cutting, for both business and technology
management on the provider and consumer sides.
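The daily volume quoted above can be sanity-checked by converting it into decimal storage units; the figures below follow directly from 2.5 quintillion bytes per day.

```python
# 2.5 quintillion (10^18) bytes created per day, expressed in decimal units.
bytes_per_day = 2.5e18

exabytes_per_day = bytes_per_day / 1e18    # 1 EB = 10^18 bytes
petabytes_per_day = bytes_per_day / 1e15   # 1 PB = 10^15 bytes

print(exabytes_per_day)   # 2.5 exabytes per day
print(petabytes_per_day)  # 2500 petabytes per day
```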
2012 is shaping up to be a big year for big data and big analytics. To
effectively explore ... (more)
Due to the unprecedented volume, variety, and velocity of Big Data, finding a
clear path to jumpstart the Big Data journey is neither trivial nor
straightforward. The space is overwhelmingly crowded with immature options and
evolving solutions, which can be confusing and daunting. Where can you find an
entry point? What is the most effective way to get on board? Which aspects
should you be mindful of? How can you make sure you do not miss the most
important things?
Why do you need to begin with the basics?
Here are five areas of consideration for Big Data on-ramp: Structur... (more)
A capability model is a structure that represents the core abilities and
competencies of an entity (department, organization, person, system, or
technology) to achieve its objectives, especially in relation to its overall
mission and functions.
The Big Data Capability Model (BDCM) is defined as the set of key
functionalities for dealing with Big Data problems and challenges.
It describes the major features, behaviors, practices, and processes in an
organization that can reliably and sustainably produce the required outcomes
for Big Data demands. BDCM consists of the following elements:
NoHadoop means “not only Hadoop.” Why?
According to the 2014 Big Data & Advanced Analytics Survey conducted by the
market research firm Evans Data, only 16% of over 400 developers surveyed
worldwide indicated that Hadoop batch processing was satisfactory in all use
cases. 71% of developers also expressed a need for real-time complex event
processing more than half the time in their applications, and 27% said they
use it all the time.
Hadoop has evolved from just MapReduce and HDFS in the very beginning to a set
of technologies including Hive, HBase, Sqoop, Flume, Pig, Mahout, etc. Though, ... (more)
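The batch-processing model at Hadoop's core can be illustrated with the classic word count expressed as map and reduce phases. This is a minimal in-memory sketch, not a real Hadoop job: the function names and the toy driver that simulates the shuffle-and-sort step are illustrative, whereas a production job would run over HDFS via Java MapReduce or Hadoop Streaming.

```python
from itertools import groupby
from operator import itemgetter

def map_phase(line):
    # Emit a (word, 1) pair for every word in the input line.
    for word in line.split():
        yield (word.lower(), 1)

def reduce_phase(word, counts):
    # Sum the counts emitted for a single word.
    return (word, sum(counts))

def run_job(lines):
    # Simulate Hadoop's shuffle-and-sort by sorting mapped pairs by key,
    # then hand each key group to the reducer.
    mapped = sorted(pair for line in lines for pair in map_phase(line))
    return [reduce_phase(word, (count for _, count in group))
            for word, group in groupby(mapped, key=itemgetter(0))]

print(run_job(["big data is big", "data is data"]))
# → [('big', 2), ('data', 3), ('is', 2)]
```

The key property the survey respondents push back on is visible even here: results only appear after the entire input has been mapped, shuffled, and reduced, which is why batch MapReduce alone does not cover real-time complex event processing.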