NoHadoop stands for "not only Hadoop." Why?
According to the 2014 Big Data & Advanced Analytics Survey conducted by the
market research firm Evans Data, only 16% of the more than 400 developers
surveyed worldwide indicated that Hadoop batch processing was satisfactory in
all use cases. Meanwhile, 71% of developers said they need real-time complex
event processing in more than half of their applications, and 27% said they
need it all the time.
Hadoop has evolved from just MapReduce and HDFS at the very beginning into a
family of technologies, including Hive, HBase, Sqoop, Flume, Pig, Mahout, and
more. Hadoop, however, was originally designed for batch processing. It is
built on the map-and-reduce programming model, which is a stretch for
real-time transactions. Various efforts have been made to enhance Hadoop. For
example, YARN was designed to decouple the resource management in the underlying... (more)
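To make the batch-oriented model concrete, here is a minimal sketch of the map-and-reduce style in plain Python. This is a hypothetical in-memory word count, not actual Hadoop code; the function names and the toy input are illustrative assumptions.

```python
from collections import defaultdict

# Map phase: emit (word, 1) pairs from every input line.
def map_phase(lines):
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

# Shuffle phase: group emitted values by key, as the framework
# would do between the map and reduce stages.
def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

# Reduce phase: aggregate the grouped counts per word.
def reduce_phase(groups):
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["Hadoop batch processing", "batch jobs run on Hadoop"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["hadoop"])  # 2
print(counts["batch"])   # 2
```

Note how the reduce phase cannot start until the entire input has been mapped and shuffled; the whole dataset is processed in bulk, which is exactly why this model fits batch workloads but strains under event-by-event real-time processing.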
Traditional analytics is usually a process of generating reports from
structured data stored in an old-fashioned data warehouse. Big Data drives
business value by creating competitive advantage and relevance from diverse
sources of structured, semi-structured, and unstructured data. Analytical
processes that used to run for hours or days are now shortened by an order
of magnitude.
Real-time Big Data represents a convergence of science, engineering, and
technology disciplines to collect, transform, store, process, analyze, and
search massive data in real time. The trend is to... (more)
I will present a tutorial on service-oriented model-driven architecture
design for cloud solutions at the upcoming International Conference on Web
Services (ICWS 2009). Please join the session to explore a state-of-the-art
approach to developing cloud services effectively and systematically.
Contact Tony Shan (email@example.com) for more info.
Emerging cloud computing is gaining attention nowadays and is expected to
mature over the next few years. An intriguing question arises: how can we
advance cloud computing more effectively by leveraging engineering practices
in this field? More importantly, is the engineering of cloud computing a
disruptive innovation?
With regard to the concept of cloud engineering, here is a formal definition
of the term: cloud engineering is the application of a systematic,
disciplined, quantifiable approach to the ideation, conceptualization,
development, operation, and... (more)
Though most big data projects are data-intensive, a purely data-driven
approach is a big risk. These initiatives actually require multiple
disciplines to implement a viable solution to complex business problems, so
a comprehensive method is necessary to tackle big data issues and challenges
systematically. The Big Data Pyramid is illustrated in the diagram below,
which includes the key constructs for capitalizing on big data potential
along four aspects: Technology, Insights, People, and Process (TIPP).
Technology: Enabling Platform - a high-performance model of loosely-coupled... (more)