
Leading by Game-Changing Cloud, Big Data and IoT Innovations

Tony Shan



Top Stories by Tony Shan

Applied Cloud Engineering (ACE) is a holistic engineering approach to the pragmatic development and integration of real-life cloud solutions. It is the field concerned with the converged and codified application of Cloud Engineering and Cloud Metaengineering practices to the practical design, operation, and consumption of Cloud services from a womb-to-tomb perspective, encompassing concepts, principles, methods, frameworks, techniques, and patterns. ACE equips practitioners with the best industry practices, management schemes, and technical skills to accelerate the adoption, buildout, and execution of Cloud products in a systematic fashion. A layered structure is designed for the applied cloud engineering discipline, consisting of Foundation, Lifecycle, Implementation, and Pragmatism (FLIP). The detailed elements are subsequently defined for each layer, such as p... (more)
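As a rough illustration of how the FLIP layers might be organized programmatically, here is a minimal Python sketch that models each layer with a few representative elements. The element names are placeholders for illustration only, not the ones defined in the full article.

```python
from dataclasses import dataclass, field

@dataclass
class Layer:
    """One layer of the ACE discipline with its constituent elements."""
    name: str
    elements: list = field(default_factory=list)

# Hypothetical elements per layer -- illustrative placeholders only.
flip_stack = [
    Layer("Foundation", ["concepts", "principles"]),
    Layer("Lifecycle", ["design", "operation", "consumption"]),
    Layer("Implementation", ["methods", "frameworks", "techniques", "patterns"]),
    Layer("Pragmatism", ["best practices", "management schemes"]),
]

for layer in flip_stack:
    print(f"{layer.name}: {', '.join(layer.elements)}")
```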

What We Learned in Decision 2012

Now that the US election is over, what differs most from the last presidential election in 2008 is the impact of new technologies such as blogs and social media. Interestingly, Nate Silver made a surprisingly accurate prediction of the election results on his famous FiveThirtyEight blog. His near-perfect forecast solidifies the relevance and significance of big data solutions. In my opinion, four key factors are the critical enablers that unlock the value of big data: Modeling, Algorithm, Statistics, and Semantics (MASS). Modeling: First and foremost, a good model must be establish... (more)
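To make the Modeling and Statistics factors concrete, below is a toy poll-aggregation sketch in Python, assuming a crude scheme that weights each poll by sample size and recency. This is only a stand-in; FiveThirtyEight's actual model is far more sophisticated.

```python
# Toy poll aggregation: weight each poll by recency and sample size,
# then take the weighted average. A crude stand-in for the
# Modeling/Statistics factors, not Silver's actual method.
polls = [
    # (candidate_share_pct, sample_size, days_before_election)
    (51.0, 1200, 2),
    (49.5,  800, 5),
    (52.0, 1500, 1),
]

def weight(sample_size, days_old, half_life=3.0):
    """Larger, fresher polls get more weight (exponential recency decay)."""
    return sample_size * 0.5 ** (days_old / half_life)

total_w = sum(weight(n, d) for _, n, d in polls)
estimate = sum(share * weight(n, d) for share, n, d in polls) / total_w
print(f"Weighted estimate: {estimate:.1f}%")
```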

Big Data Capability Model | @ThingsExpo #BigData #IoT #InternetOfThings

A capability model is a structure that represents the core abilities and competencies of an entity (a department, organization, person, system, or technology) in achieving its objectives, especially in relation to its overall mission and functions. The Big Data Capability Model (BDCM) is defined as the set of key functionalities for dealing with Big Data problems and challenges. It describes the major features, behaviors, practices, and processes in an organization that can reliably and sustainably produce the required outcomes for Big Data demands. BDCM consists of the following elements: Coll... (more)
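One way a capability model like BDCM could be used in practice is as a maturity assessment. The Python sketch below rates hypothetical capabilities on an illustrative five-level scale; since the article's element list is truncated above, the capability names here are placeholders, not BDCM's actual elements.

```python
from enum import IntEnum

class Maturity(IntEnum):
    """Illustrative maturity scale for rating each capability."""
    INITIAL = 1
    MANAGED = 2
    DEFINED = 3
    MEASURED = 4
    OPTIMIZED = 5

# Hypothetical capability names -- placeholders for illustration only.
bdcm_assessment = {
    "Collection": Maturity.DEFINED,
    "Storage": Maturity.MANAGED,
    "Processing": Maturity.DEFINED,
    "Analysis": Maturity.INITIAL,
}

gaps = [name for name, level in bdcm_assessment.items()
        if level < Maturity.DEFINED]
print("Capabilities below target maturity:", gaps)
```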

Big Data Redefined By @TonyShan | @CloudExpo [#BigData]

Big Data is a loose term for the collection, storage, processing, and sophisticated analysis of massive amounts of data, far larger and from many more kinds of sources than ever before. The definition of Big Data can be traced back to the 3Vs model defined by Doug Laney in 2001: Volume, Velocity, and Variety. A fourth V was later added in different fashions, such as “Value” or “Veracity”. Interestingly, this conceptualization of Big Data from the beginning of the century seems to be gaining wider use now, after nearly 14 years. This sounds a little strange, as the present dynamic world ha... (more)
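One way to make the Vs concrete is to profile a data source along each dimension. The sketch below does this in Python with arbitrary illustrative thresholds; there is no standard numeric cutoff for what counts as "big".

```python
from dataclasses import dataclass

@dataclass
class DataProfile:
    """Profiles a data source along the 3Vs, plus Veracity as a fourth V."""
    volume_tb: float        # Volume: total size in terabytes
    velocity_evt_s: float   # Velocity: arrival rate, events per second
    variety: int            # Variety: number of distinct source formats
    veracity: float         # Veracity: estimated fraction of trustworthy records

    def is_big_data(self) -> bool:
        # Arbitrary illustrative thresholds, not a standard definition.
        return (self.volume_tb > 10
                or self.velocity_evt_s > 1e4
                or self.variety > 5)

clickstream = DataProfile(volume_tb=40, velocity_evt_s=25_000,
                          variety=8, veracity=0.9)
print(clickstream.is_big_data())  # True
```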

Big Data Is Really Dead | @ThingsExpo #BigData #IoT #InternetOfThings

IDG Enterprise's 2015 Big Data and Analytics survey shows that the number of organizations with deployed or implemented data-driven projects has increased by 125% over the past year. The momentum continues to build. Big Data as a concept is characterized by the 3Vs: Volume, Velocity, and Variety. Big Data implies a huge amount of data; due to its sheer size, Big Data tends to be clumsy. The dominant implementation solution is Hadoop, which is batch based. More than a handful of companies in the market blindly collect lots of noisy data but don't know how to cleanse it... (more)
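The batch orientation mentioned above can be illustrated with a single-process, MapReduce-style word count in Python. Real Hadoop jobs distribute the map and reduce phases across a cluster; this sketch only shows the shape of the batch model, where nothing is produced until the whole dataset has been read and processed.

```python
from collections import defaultdict
from itertools import chain

# A single-process sketch of the MapReduce pattern behind Hadoop's
# batch model: the entire dataset is read, mapped, and reduced in one
# pass -- no results emerge until the batch completes.
documents = ["big data is big", "data at rest"]

def map_phase(doc):
    """Emit (word, 1) pairs for every word in a document."""
    return [(word, 1) for word in doc.split()]

def reduce_phase(pairs):
    """Sum the counts for each word across all emitted pairs."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

word_counts = reduce_phase(chain.from_iterable(map_phase(d) for d in documents))
print(word_counts)  # {'big': 2, 'data': 2, 'is': 1, 'at': 1, 'rest': 1}
```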