Fluid Dynamics
Doghouse – Weather Forecast
A recent rocket launch has maddog thinking about high performance computing and accurate weather forecasts.
Recently SpaceX, working with NASA, sent a capsule carrying four astronauts to the International Space Station. The launch and flight were flawless.
I have been watching space flights since 1961. I still remember all of the students of my elementary school packing into the school's auditorium, where the school had wheeled in its large black-and-white TV on a cart (one of the few "portable TVs" in those days) so we could watch the launch.
That is assuming that the rocket did take off, because in those days it was very likely that a storm would move in and delay, or even cancel, the countdown. The work of fueling the rocket and putting the astronaut (first one, later several) into the capsule had to be planned and executed far in advance of liftoff, and Cape Canaveral (as it was called in those days) was very likely to have bad weather.
For a long time we have known how to predict the weather 24 hours in advance with 100-percent accuracy. The problem was that it took 48 hours to gather and process the data, so we could tell you precisely what the weather had been 24 hours in the past.
Weather forecasting is a problem in "fluid dynamics," and fluid dynamics problems are common in many of the things we need to compute. Heat flow, turbine efficiency, virus spread, airplane and car design, you name it … they are all fluid dynamics. Even objects we consider "solid" have aspects that fluid dynamics techniques can compute.
Before 1994 these types of problems were tackled by "supercomputers," machines and operating systems designed by companies like Cray, ECL, CDC, IBM, and others. These companies would spend many millions of dollars developing these supercomputers, using state-of-the-art technology, and then produce a limited number of machines that might hold the title of "world's fastest" for a couple of years until the next one came along. Often the development costs were not covered by the profits from the sales of machines and service, and some of these companies (or at least their supercomputer divisions) were going out of business.
Then two people at NASA, Dr. Thomas Sterling and Donald Becker, developed a concept that allowed commodity computer components to solve these compute-intensive problems by dividing the problems into highly parallel tasks; they called these "Beowulf" systems. Roughly speaking, you could get about the same computing power from a system like this as you would from a "supercomputer" that cost 40 times more. In addition, since these commodity-based systems used an Open Source operating system (typically GNU/Linux-based), the programmers who worked on fluid dynamics problems already knew how to program them with the addition of a few libraries – Parallel Virtual Machine (PVM), Message Passing Interface (MPI), and OpenMP for shared-memory parallelism – as well as standard methods of breaking apart large programs (decomposition, thread programming, and parallelism).
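To give a flavor of that message-passing style, here is a minimal MPI sketch in C. It is not taken from any particular Beowulf application; the problem size and the trivial summation kernel are placeholders chosen purely for illustration. Each process claims a slice of the problem, computes its share, and the partial results are combined at the end:

/* Minimal sketch of message-passing decomposition with MPI.
 * The grid size and summation kernel are placeholders, not a
 * real fluid-dynamics computation.
 * Compile with: mpicc -o sketch sketch.c
 * Run with:     mpirun -np 4 ./sketch
 */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank, size;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* which process am I?  */
    MPI_Comm_size(MPI_COMM_WORLD, &size);  /* how many processes?  */

    /* Decomposition: split 1,000,000 "grid cells" evenly across ranks. */
    const long n = 1000000;
    long chunk = n / size;
    long start = rank * chunk;
    long end   = (rank == size - 1) ? n : start + chunk;

    /* Each rank does its share of the work (a trivial sum standing
     * in for a real computational kernel). */
    double local = 0.0;
    for (long i = start; i < end; i++)
        local += (double)i;

    /* Combine the partial results on rank 0. */
    double total = 0.0;
    MPI_Reduce(&local, &total, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("Sum computed by %d processes: %.0f\n", size, total);

    MPI_Finalize();
    return 0;
}

Real applications replace the summation loop with their own kernel and exchange boundary data between neighboring ranks, but the pattern of "divide, compute locally, combine" is the same.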
The Beowulf concept – now called High Performance Computing (HPC) and used on the world's 500 fastest computers – also allowed people to experiment with better compilers and systems on relatively inexpensive hardware. Systems purchased for some large problem could be disassembled and repurposed for other tasks when the large problem was finished.
Early systems included Oak Ridge National Laboratory's "Stone Soupercomputer" (named after the "stone soup" fable), made of 133 cast-off desktop systems connected by simple Ethernet. Implemented by William Hargrove and Forrest Hoffman after they were denied funding for a traditional supercomputer, the Stone Soupercomputer helped to solve several real-world problems and served as a system on which to develop new HPC applications.
Over time, distributions specializing in this type of computing (Rocks and OSCAR were two) came out of the National Laboratories and made it easier to set up your own high-performance cluster, even using Raspberry Pis as the hardware.
Over the years, the time needed to gather and process the data to forecast the weather 24 hours in advance, with 100-percent accuracy, dropped from 48 hours to 24 hours (stick your hand out the window) to 12 hours, and after that we never had to suspend a NASA launch because of weather. On a broader scale, these calculations also helped predict the weather for sporting events, weddings, agriculture, and other activities.
Today the same technology is being used to calculate the damage of a hurricane versus what would have happened if the environment had been one or two degrees cooler, important in the age of climate change. Weather forecasters are showing that a temperature difference of a few degrees can mean a rainfall difference of 10 or 15 percent. In a rainfall of 12 inches during a hurricane, that can mean one to two inches of additional water, enough to make the difference in whether your home or business is flooded.