Today's supercomputers not only perform calculation after calculation at blazing speed; they also process vast amounts of data in parallel by distributing computing chores across thousands of CPUs. Supercomputers are found at work in research facilities, government agencies and businesses, performing mathematical calculations as well as collecting, collating, categorizing and analyzing data.
Your local weatherman bases his forecasts on data supplied by supercomputers run by NOAA, the National Oceanic and Atmospheric Administration. NOAA's systems perform database operations and mathematical and statistical calculations on huge amounts of data gathered from across the nation and around the world. The processing power of supercomputers helps climatologists predict not only the likelihood of rain in your neighborhood but also the paths of hurricanes and the probability of tornado strikes.
Like weather forecasting, scientific research depends on the number-crunching ability of supercomputers. For example, astronomers at NASA analyze data streaming from satellites orbiting Earth, from ground-based optical and radio telescopes, and from probes exploring the solar system. Researchers at the European Organization for Nuclear Research, or CERN, found the Higgs boson by analyzing the massive amounts of data generated by the Large Hadron Collider.
The National Security Agency and similar government intelligence agencies around the world use supercomputers to monitor communications among private citizens as well as traffic from suspected terrorist organizations and potentially hostile governments. The NSA needs the numerical processing power of supercomputers to keep ahead of increasingly sophisticated encryption of Internet, cell phone, email and satellite transmissions, as well as old-fashioned radio communications. In addition, the NSA uses supercomputers to find patterns in both written and spoken communication that might alert officials to potential threats or suspicious activity.
Some supercomputers extract information from raw data gathered in data centers on the ground or in the cloud. For example, businesses can analyze data collected from their cash registers to help control inventory or spot market trends. Life insurance companies use supercomputers to minimize their actuarial risks. Likewise, companies that provide health insurance reduce costs and customer premiums by using supercomputers to statistically analyze the benefits of different treatment options.
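Spotting a market trend in register data often comes down to simple statistics applied at scale. As a minimal sketch, assuming a made-up list of daily sales totals (the numbers below are purely illustrative), a moving average smooths day-to-day noise so a trend becomes visible:

```python
# Hypothetical daily sales totals pulled from point-of-sale records.
daily_sales = [120, 135, 128, 150, 162, 158, 171, 180, 175, 190]

def moving_average(values, window):
    # Average each run of `window` consecutive days to smooth out noise.
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]

trend = moving_average(daily_sales, window=3)

# If every smoothed value is at least as large as the one before it,
# flag a potential upward sales trend worth a closer look.
is_upward = all(a <= b for a, b in zip(trend, trend[1:]))
print(is_upward)  # → True for this sample data
```

A supercomputer applies the same kind of calculation, along with far more sophisticated statistical models, across millions of transactions at once rather than ten data points.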