To explain the significance of IBM's claims, one must first consider the typical benchmarks associated with processor design and their pace of advancement. Moore's Law has long predicted an impressive benchmark: our computing power, and our processor designs, maintain a pace of doubling every 18 months. IBM and 3M have recently described a partnership (illustrated in the link and video below) which leverages a breakthrough heat-dissipating adhesive along with a new type of CPU architecture in order to accomplish a 1,000x leap in processing capability, rather than Moore's Law's prediction of simply 2x.
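To put that claim in context, a quick back-of-the-envelope calculation (a sketch, assuming the 18-month doubling period cited above) shows how many years of ordinary Moore's Law progress a single 1,000x leap would compress:

```python
import math

DOUBLING_PERIOD_YEARS = 1.5  # Moore's Law pace cited above: 2x every 18 months
TARGET_SPEEDUP = 1000        # the 1,000x leap IBM and 3M are claiming

# Number of doublings needed: 2**n >= 1000  =>  n = log2(1000)
doublings = math.log2(TARGET_SPEEDUP)      # just under 10 doublings
years = doublings * DOUBLING_PERIOD_YEARS  # roughly 15 years

print(f"{doublings:.2f} doublings, or roughly {years:.1f} years of Moore's Law")
```

In other words, a 1,000x gain amounts to about ten doublings, which at the conventional 18-month cadence would otherwise take roughly fifteen years to arrive.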
If you want to make processors 1,000 times faster, you're going to need some serious technology. At least that's the conventional wisdom. But 3M and IBM are claiming, through this partnership, to have unlocked a secret shortcut.
So, assuming these coincidental, telling moments in 2011's history are indeed prophetic about a computing leap, the question must then be asked: what will the average consumer do with a "supercomputer"? The answer is quite simply, "nothing"! Unless... unless software development adequately rises to introduce this new era of consumer-level supercomputing, putting it to work on the task and data crunching of new categories of applications never before demanded by the average individual.
As for the worlds of application poised to fully exploit a 1,000x leap in computing power: we can certainly expect genomics to progress by several orders of magnitude, given that it is one of the most processing-bottlenecked areas of medical and biological science.
In an ever-connected world, poised to exploit cloud computing and its computational results, which sciences and applications are sure to most immediately impact the daily lives of the masses? What if, in addition to the health sciences and climatology, other social and economic sciences less associated with data crunching were also soon to become simple, common subjects of computational comparison and analysis? In other words, what if areas of life we typically consider matters of "opinion" become such competent, undisputed areas of science, based on complex analyses of multi-spectral data, that they yield indisputable facts? Political science, macroeconomics, real-time analysis of the gap between broadcast media and actual public opinion, the ever-elusive stock market: the sky is really the limit. And then of course there are all the obvious areas of penetration within the spectrum of entertainment: movies, gaming, simulations, training, visualization, architecture, land planning, transportation projection and analysis, urban planning, "smart growth," and so on.
Once again, the field of software development will be depended on to ultimately leverage and create the benchmarks for exploiting this new level of hardware potential, and to learn and adapt to its opportunities and uses for and by the end user.
"Futurists" are needed at times like these more than ever. Predictions, redefined opportunities, and speculation become paramount in creating advances and whole new markets. Necessity is typically the mother of invention, but in the rare case where capability outpaces every understood necessity, and when the market is filled with projects and goals limited by researchers' and the public's previous understanding of what is feasible, visionaries must offer, encourage, and evangelize future needs, uses, and methods of exploiting such enormous leaps in potential. Their predictions and hypothetical storytelling become the catalyst, the inspirational spirit required to take advantage of and necessitate such rare leaps in opportunity.
At the higher level of software development thinking, it must be asked: will yesterday's office suite run on tomorrow's supercomputer? Or is a revolution required in all areas and paradigms of use and user experience? At a lower level of consideration, the jury is still out on which instruction set the new languages and applications of the future will depend on to take advantage of such leaps in processing and processor architecture. And will a whole new paradigm of processor advances emerge in the generations beyond this leap, thanks to the exponential quality of chip design, which relies on chips themselves to analyze and perfect each subsequent generation?