Welcome!

Leading by Game-Changing Cloud, Big Data and IoT Innovations

Tony Shan



Top Stories by Tony Shan

I will present a tutorial on service-oriented, model-driven architecture design for cloud solutions at the upcoming International Conference on Web Services (ICWS 2009). Please join the session to explore a state-of-the-art approach to developing cloud services effectively and systematically. Contact Tony Shan (tonycshan@gmail.com) for more info. ... (more)

Big Data Maturity Model

Big Data Maturity Model (BDMM) is a qualitative method for describing the growth and increasing impact of big data capabilities in an IT environment from both business and technology perspectives. It comprises a set of criteria, parameters, and factors that can be used to describe and measure the effectiveness of big data adoption and implementation. Five levels of maturity are defined: Advanced, Dynamic, Optimized, Primitive, and Tentative (ADOPT). The definitions of these levels are listed below. Primitive: initial stage of disconnected activities in an unorganized fashion. Tentativ... (more)
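For illustration only, here is a minimal Python sketch of the ADOPT levels. The level names come from the model above; the ordering of the levels beyond Primitive being the initial stage, and the scoring thresholds, are assumptions made for this example rather than part of BDMM.

    # Illustrative sketch: level names from the BDMM post; the ordering above
    # Primitive and the 0-100 scoring thresholds are assumptions for this example.
    from enum import IntEnum

    class BDMMLevel(IntEnum):
        PRIMITIVE = 1   # initial stage of disconnected, unorganized activities
        TENTATIVE = 2
        OPTIMIZED = 3
        DYNAMIC = 4
        ADVANCED = 5

    def assess_level(score: float) -> BDMMLevel:
        """Map a hypothetical 0-100 capability score to a maturity level."""
        thresholds = [(80, BDMMLevel.ADVANCED), (60, BDMMLevel.DYNAMIC),
                      (40, BDMMLevel.OPTIMIZED), (20, BDMMLevel.TENTATIVE)]
        for cutoff, level in thresholds:
            if score >= cutoff:
                return level
        return BDMMLevel.PRIMITIVE

    print(assess_level(65).name)  # DYNAMIC under these assumed thresholds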

Big Data Sciengineer

Big Data has been growing fast. Qualified resources with the right skill sets are heavily sought after to tackle Big Data challenges. Depending on the scope of the Big Data initiative, roles on a Big Data team vary considerably compared with traditional IT roles in project execution. One key area is Big Data Science. Essentially this is the role of data scientists. Their major tasks include Discover, Analyze, and Distill (DAD). Discover: find the sources of good data, and identify the metrics. Analyze: turn data into information, with statistical analysis and mining. Distill: turn... (more)
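As a rough sketch of how the DAD tasks might chain together, the snippet below walks a toy data set through Discover, Analyze, and Distill. The data sources, metric, threshold, and function names are hypothetical illustrations, not part of the article.

    # A minimal sketch of the Discover/Analyze/Distill (DAD) tasks described above.
    # The sources, metric, and threshold here are hypothetical illustrations.
    import statistics

    def discover(sources):
        """Discover: keep only the sources that actually carry the metric of interest."""
        return {name: rows for name, rows in sources.items() if rows}

    def analyze(rows):
        """Analyze: turn raw values into summary statistics."""
        return {"mean": statistics.mean(rows), "stdev": statistics.pstdev(rows)}

    def distill(stats, threshold=100.0):
        """Distill: turn the analysis into an actionable insight."""
        return "investigate spike" if stats["mean"] > threshold else "within normal range"

    sources = {"web_logs": [120.0, 95.0, 130.0], "empty_feed": []}
    for name, rows in discover(sources).items():
        stats = analyze(rows)
        print(name, stats, "->", distill(stats))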

Big Data Redefined By @TonyShan | @CloudExpo [#BigData]

Big Data is a loose term for the collection, storage, processing, and sophisticated analysis of massive amounts of data, far larger and from many more kinds of sources than ever before. The definition of Big Data can be traced back to the 3Vs model defined by Doug Laney in 2001: Volume, Velocity, and Variety. A fourth V was later added in different fashions, such as “Value” or “Veracity”. Interestingly, the conceptualization of Big Data from the beginning of this century seems to be gaining wider use now, after nearly 14 years. This sounds a little strange, as the present dynamic world ha... (more)

Big Data Is Really Dead | @ThingsExpo #BigData #IoT #InternetOfThings

IDG Enterprise's 2015 Big Data and Analytics survey shows that the number of organizations with deployed/implemented data-driven projects has increased by 125% over the past year. The momentum continues to build. Big Data as a concept is characterized by the 3Vs: Volume, Velocity, and Variety. Big Data implies a huge amount of data. Due to the sheer size, Big Data tends to be clumsy. The dominant implementation solution is Hadoop, which is batch based. More than just a handful of companies in the market merely collect lots of noisy data blindly, yet they don't know how to cleanse it,... (more)
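To make the "batch based" point concrete, here is a toy, in-memory imitation of the map/shuffle/reduce style that Hadoop popularized. It is not Hadoop code; it simply processes a whole batch of records at once, which is the processing model the article refers to.

    # A toy imitation of the batch map/shuffle/reduce model popularized by Hadoop.
    # Not Hadoop code: the whole batch is processed in one pass, in memory.
    from collections import defaultdict

    def map_phase(records):
        for line in records:
            for word in line.split():
                yield word.lower(), 1

    def shuffle(pairs):
        grouped = defaultdict(list)
        for key, value in pairs:
            grouped[key].append(value)
        return grouped

    def reduce_phase(grouped):
        return {key: sum(values) for key, values in grouped.items()}

    batch = ["Big Data is big", "data at rest is processed in batches"]
    print(reduce_phase(shuffle(map_phase(batch))))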