Analytics and Simulation: Powerful Tools in Business

Robert Roser, CIO, Fermilab

Game Changing Trends in the Industry

The most significant change ahead will be in the area of high-performance computing. The United States will soon enter the era of Exascale computing. This new era will be marked by an ecosystem of very powerful, highly parallel processors supported by networking, caching and data storage technologies. These high-performance computing platforms will be a game changer, not just for science but also for corporate America.

I’ll start with science, since that’s my first love—and the mission of my organization. Exascale machines will be so powerful that they will enable scientists and engineers to simulate processes at a level of detail never before possible. Simulation is the engine that drives the type of physics done at Fermilab. It guides the development of the hardware we use to detect the universe’s building blocks. And it is a critical component of the discovery of new particles and forces of nature. Exascale computing will greatly increase the efficiency and power of simulated prototypes of our advanced scientific tools. Better simulated prototypes will reduce the time, cost, and risks of developing, producing, and deploying complex, innovative technical and scientific products. In short, it will significantly shorten the “time to market”.
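To make the idea of a simulated prototype concrete, here is a minimal sketch of a Monte Carlo simulation. Everything in it is invented for illustration (the function name, the flat scattering distribution, and the square detector geometry are assumptions, not Fermilab code): it estimates what fraction of randomly scattered particles a simple detector would accept, the kind of question one would otherwise answer by building and testing physical hardware.

```python
import random

def simulate_detector_acceptance(n_events, detector_half_width=0.5, seed=42):
    """Toy Monte Carlo: estimate the fraction of scattered particles
    that land on a square detector centered on the beam axis.

    Assumes a flat (uniform) scattering distribution in x and y over
    [-1, 1] -- a deliberately simplified, illustrative model.
    """
    rng = random.Random(seed)  # fixed seed for a reproducible estimate
    hits = 0
    for _ in range(n_events):
        x = rng.uniform(-1.0, 1.0)
        y = rng.uniform(-1.0, 1.0)
        if abs(x) <= detector_half_width and abs(y) <= detector_half_width:
            hits += 1
    return hits / n_events

acceptance = simulate_detector_acceptance(1_000_000)
# Analytically, acceptance should be near (1.0 / 2.0) ** 2 = 0.25
```

The value of Exascale hardware is that the same pattern scales from this toy (a million events in seconds) to full detector geometries with realistic physics, where each simulated event is itself expensive.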

But simulation is not only a powerful tool for science. Successful CIOs at corporations are now demonstrating that analytics and simulation can also have a direct impact on the bottom line. CIOs are starting to build IT teams that can take advantage of these technologies. And the technologies are becoming affordable: large commercial cloud-based computational engines are now available at a reasonable cost. Corporations no longer have to own these technologies to play in this space, significantly reducing the barrier to entry. Software applications pose the most difficult remaining challenges to broad use of Exascale computing. Each type of problem typically requires a unique custom software solution, which can be costly to develop (tens of experts working for several years). That cost, however, should be weighed against the time to market and effort required when using traditional physical prototypes.
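The "affordable cloud compute" point rests on a property many simulation and analytics workloads share: independent runs can be fanned out across rented cores or nodes with no coordination between them. The sketch below is purely illustrative (the trial function and pool size are invented for this example); it uses Python's standard thread pool as a stand-in for a cloud job scheduler dispatching independent Monte Carlo trials.

```python
from concurrent.futures import ThreadPoolExecutor
import random

def run_trial(seed, n_samples=100_000):
    """One independent simulation run: a Monte Carlo estimate of pi.
    Each trial depends only on its seed, so trials can run anywhere."""
    rng = random.Random(seed)
    inside = sum(
        1 for _ in range(n_samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4.0 * inside / n_samples

# Fan out independent trials; on a commercial cloud engine each call
# could just as easily land on a separate rented machine.
with ThreadPoolExecutor(max_workers=4) as pool:
    estimates = list(pool.map(run_trial, range(8)))

# Combining the independent results is a trivial reduction step.
mean_estimate = sum(estimates) / len(estimates)
```

Because the trials never communicate, doubling the rented capacity roughly halves the wall-clock time, which is the economic argument for renting rather than owning the hardware.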

In a globally competitive economy, successful companies will need to learn to out-compute if they are to out-compete.

Ongoing Impediments in Today’s Marketplace

As the CIO of a DOE National Laboratory, I am responsible for both our IT portfolio and our scientific computing program. This is a tricky enterprise to manage, and there are three areas that keep me up at night.

The first is security: protecting the digital data I am entrusted with. The cybersecurity landscape has changed dramatically in the last few years. The significant security breach at the United States Office of Personnel Management is just one example that highlights how the nature of the game has changed. We are no longer just protecting systems from individuals looking for a credit card or password. Rather, we are now dealing with government organizations out to garner intellectual property and build knowledge. These new players are well funded and patient. How best to protect all of the applications, laboratory resources, devices, and connections that our employees and scientists need to do their jobs is a challenging puzzle. Keeping up with patching, proper segmentation, and secure logins are all steps one can take. Finding the right balance of investment to risk is a challenge.

The second topic has to do with making a successful transition from a myriad of legacy, custom-built applications to a modern, cloud-based software-as-a-service environment. The first challenge is to make the developer and software procurement investments. The second, and perhaps more difficult, challenge is change management. We need to answer the question: Why is IT “messing with” what already works?

The final topic has to do with budgets. In the National Laboratory system, there is tremendous pressure for every new incremental dollar to go toward the mission (in our case, science research). Within such a mindset, how does a CIO investigate emerging technologies, upgrade systems with the times, and ensure quality of service within a fixed funding profile?

My wish would be for leaders in business and science to recognize that the nature of IT and computing has changed dramatically in the last few years, and that it will continue to change. The blurring between the physical business and the digital business is real and thus computing’s importance to an organization is greater now than ever before. IT is not a cost center to be minimized but rather a place to invest. If used correctly, IT will be the competitive advantage in the future of science and business.
