Tuesday, November 22, 2011

What Will The Average Consumer Do With A Super Computer?

Why did the wealthiest investor in U.S. history commit over $10 billion to a single tech firm during the last quarter of this year? The average American consumer reading this might guess the firm was Apple, or perhaps Google, or some other mobile tech or Internet firm. None of these guesses would be on target.

Many have speculated that Warren Buffett's recent investment in IBM was due to some sort of "inside information" he was made privy to during a red-carpet tour of one of IBM's headquarters. But the fact is, everyone was told of the leap in technology IBM has been hinting at, in a multimedia press release much earlier this year.

To explain the significance of IBM's claims, one must first consider the typical benchmarks associated with processor design and their pace of advancement. Moore's Law has long predicted an impressive benchmark: our computing power, or processor designs, maintain a pace of roughly doubling every 18 months. IBM and 3M have recently described a partnership (illustrated in the link and video below) which leverages a breakthrough heat-dissipating adhesive, along with a new type of CPU architecture, to accomplish a 1,000x leap in processing capability, rather than the 2x step Moore's Law predicts.
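To put that contrast in rough perspective, here is a minimal back-of-the-envelope sketch (a Python snippet written for this post, assuming the commonly cited 18-month doubling period) of how long a 1,000x gain would take at Moore's Law's ordinary pace:

    import math

    # Assumed Moore's Law pace: computing power doubles every 18 months (1.5 years).
    DOUBLING_PERIOD_YEARS = 1.5

    # Doublings needed to reach a 1,000x improvement: log2(1000) ~= 9.97
    doublings_needed = math.log2(1000)

    # Years required at the ordinary doubling pace.
    years_needed = doublings_needed * DOUBLING_PERIOD_YEARS

    print(f"Doublings needed for 1,000x: {doublings_needed:.1f}")
    print(f"Years at Moore's Law pace:  {years_needed:.0f}")  # roughly 15 years

In other words, a 1,000x leap amounts to roughly fifteen years of ordinary progress arriving all at once, which is what makes the IBM and 3M claim so extraordinary.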
[Embedded video: IBM and 3M announce their chip-packaging partnership]
If you want to make processors 1,000 times faster, you're going to need some serious technology. At least that's the conventional wisdom. But through this partnership, 3M and IBM claim to have unlocked a secret shortcut.

If all indicators are correct, this potential advancement appears to be the most significant announcement in the history of computer science: an opportunity for processing power to advance by 1,000x in as little as two years! And we get a hint of just how legitimate this claim may be by witnessing the wealthiest investor in the Western world committing over $10 billion to the holder of this advancement the same year it was announced.

So, assuming these coincidental, telling moments in 2011's history are indeed prophetic of a computing leap, the question must then be asked: what will the average consumer do with a "supercomputer"? The answer is quite simply, "nothing"! Unless... Unless software development is approached in a way that puts this new era of consumer-level supercomputing to work on the tasks and data crunching of new categories of applications never before demanded by the average individual.

As for the applications poised to fully exploit a leap of 1,000x in computing power: we can certainly expect the world of genomics to advance by several orders of magnitude, given that it is one of the most processing-bottlenecked areas of medical and biological science. Additionally, you might actually start seeing your local weatherman able to predict the weather accurately beyond a 24-hour window, thanks to worldwide views of climatology being easily computed and disseminated.

In an ever-connected world, poised to exploit cloud computing and its computational results, which sciences and applications are sure to most immediately impact the daily lives of the masses? What if, in addition to health sciences and climatology, other social and economic sciences less associated with data crunching were also soon to become simple, common areas of computational comparison and analysis? In other words, what if areas of life we typically consider matters of "opinion" become such competent, undisputed areas of science, built on complex analyses of multi-spectral data, that they become indisputable fact: political science, macroeconomics, real-time analysis of broadcast media against actual public opinion, the ever-elusive stock market? The sky is really the limit. And then of course there are all the obvious areas of penetration within the spectrum of entertainment: movies, gaming, simulations, training, visualization, architecture, land planning, transportation projection and analysis, urban planning, "smart growth," and so on.

Once again, the field of software development will be depended upon to leverage this new level of hardware potential, to create the benchmarks for exploiting it, and to learn and adapt its opportunities into real uses for, and by, the end user.

"Futurist" are needed at times like these more than ever. Predictions, redefined opportunities and speculation become paramount in creating advances and whole new markets.  Necessity is typically the mother of invention, but in a rare case where capability out paces many understood necessities; and when the market is filled with projects and goals which were limited by the researchers and public's previous understanding of what is feasible; visionary's must offer, encourage and evangelize future needs, uses and methods of exploiting such enormous leaps in potential.  Their predictions and hypothetical story telling becomes the catalyst which is then be utilized as the inspirational spirit required to take advantage and necessitate such rare leaps in opportunity.

At the higher level of software development thinking, it must be asked: will yesterday's office suite run on tomorrow's supercomputer, or is a revolution required in every area and paradigm of use and user experience? At a lower level of consideration, the jury is still out on which instruction set will be the primary dependency for the new languages and applications of the future capable of taking advantage of such leaps in processing and processor architecture. And will a whole new paradigm of processor advancement appear in the generations beyond this leap, given the exponential quality of chip design, which relies on today's chips to analyze and perfect all subsequent generations?
