Big Data Needs Big Brains

Two of the hottest buzzwords in classical computing today, after Cloud Computing, are Big Data and the Internet of Things (IoT). The premise is that by collecting large amounts of data on all sorts of things, you can control many activities and processes more intelligently, making them more efficient and effective. GE, for example, points to jet engines loaded with an array of sensors that can produce about a terabyte of data on every airplane flight. GE uses this data to improve maintenance and to identify problem areas before they become serious.
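The predictive-maintenance idea above, catching a problem in sensor data before it becomes serious, can be sketched in a few lines. This is a toy illustration only (not GE's actual pipeline): it flags readings that deviate sharply from a rolling baseline using a simple z-score test, with the function name, window size, and threshold all chosen for the example.

```python
# Toy sketch of predictive maintenance: flag sensor readings that
# deviate sharply from the mean of the preceding window.
import statistics

def flag_anomalies(readings, window=5, threshold=3.0):
    """Return indices of readings whose z-score against the
    preceding window exceeds the threshold."""
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.stdev(baseline)
        if stdev > 0 and abs(readings[i] - mean) / stdev > threshold:
            anomalies.append(i)
    return anomalies

# Steady turbine temperatures, then a sudden spike at index 8.
temps = [610, 612, 611, 609, 613, 610, 612, 611, 680]
print(flag_anomalies(temps))  # the spike at index 8 is flagged
```

Real systems of course use far more sophisticated models, but the principle is the same: the value is not in the terabyte of raw data, it is in the few readings that predict a failure.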

The challenge for computing lies in creating actionable information without drowning in this vast sea of data. The computer programs that sift through all the data and find something meaningful in it require lots of computing power. One of the potential applications of quantum computing is in the area of machine learning. In certain situations, quantum computers may be able to significantly outperform classical computers and find insights that were not previously known. This is apparently one of the reasons that Google is so interested in quantum computing. Activities such as recognizing photos, identifying spoken words, and understanding natural language are all of high interest to Google, but they are also extremely difficult and require storing a lot of data. For more details on Google's interest in quantum computing, you can see an article from Wired magazine here.

So by using quantum computing as the brains to analyze the brawn of large quantities of big data, you can create a world that is more intelligent and efficient.
