Even though almost 70% of the Earth’s surface is covered with water, only about 3% of it is drinkable, and most of that is frozen in polar ice caps and glaciers. When we consider that almost a billion people on our planet lack access to clean water, the oceans of the world seem much smaller. Despite being one of the most abundant molecules on Earth, water has become a serious problem for many civilizations.
A Worldwide Quest
One of the most ambitious goals for our society is making ocean water drinkable. Accomplishing it, however, will take a great deal of innovation and a ton of processing power. Let us examine exactly how future supercomputers could help solve our water problems and more. Researchers at the Lawrence Livermore National Laboratory believe the answer lies in carbon nanotubes.
These microscopic cylinders could serve as an excellent desalination filter. Their radius is wide enough to let water molecules pass through, yet narrow enough to block the larger salt particles. The scale involved is truly incredible: a single nanotube is more than 10,000 times narrower than a human hair.
Fasten a few billion of these carbon nanotubes together, and we would have a very effective apparatus for turning seawater into drinking water. As with many things, this configuration is easier said than done, which is why scientists are hoping to use supercomputers to work out the optimal design.
If scientists could efficiently simulate many variations of carbon nanotube filters, selecting parameters such as filtering time, tube width, and water salinity, the field would advance dramatically. One can only imagine the alternative: manually running experiments by manipulating materials under an electron microscope to determine each next move. That approach would simply be too complex, inefficient, and tedious.
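To give a flavor of what such a simulated search might look like, here is a minimal sketch of a parameter sweep. Everything in it is hypothetical: the parameter names, ranges, and the toy `simulated_flux` formula are invented for illustration and bear no relation to the laboratory's actual molecular-dynamics models.

```python
from itertools import product

def simulated_flux(radius_nm, length_nm, salinity_gL):
    """Toy stand-in for a full simulation run: returns an arbitrary
    figure of merit, not real physics."""
    # Pretend tubes narrower than 0.4 nm reject salt completely.
    salt_rejection = 1.0 if radius_nm < 0.4 else 0.5
    water_flux = radius_nm ** 2 / length_nm
    return water_flux * salt_rejection / (1 + 0.01 * salinity_gL)

radii = [0.3, 0.35, 0.4]   # nanotube radius, nm (hypothetical range)
lengths = [10, 50]         # tube length, nm
salinities = [35]          # typical seawater salinity, g/L

# Exhaustively try every combination and keep the best-scoring one.
best = max(product(radii, lengths, salinities),
           key=lambda p: simulated_flux(*p))
print("best parameters:", best)
```

The real appeal of exascale machines is that each "run" in such a sweep would be a full molecular-dynamics simulation rather than a one-line formula, multiplied across millions of parameter combinations.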
This is where exascale computers come in: the next step in scalable data processing.
Supercomputers have become far more powerful in recent years. The last benchmark, the petascale, was reached in 2008, when the world’s fastest supercomputer surpassed a quadrillion calculations per second. The next step, the exascale, would be equivalent to chaining together 50 million desktop workstations: a machine able to perform a quintillion calculations per second. As a reference point, it would outperform today’s fastest supercomputer by a factor of 8 to 10.
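The arithmetic behind these scales is easy to check. The sketch below assumes roughly 20 gigaFLOPs per desktop, a figure chosen here purely so the 50-million-machine comparison works out, not a measured benchmark.

```python
# Back-of-the-envelope arithmetic for the scales mentioned above.
petascale = 1e15   # calculations per second (petaFLOP threshold, 2008)
exascale = 1e18    # calculations per second (one quintillion)

# Exascale is a thousand times the petascale threshold.
assert exascale / petascale == 1000

# Assumed per-desktop throughput: ~20 gigaFLOPs.
desktop = 20e9
print(f"desktops needed: {exascale / desktop:,.0f}")  # → 50,000,000
```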
Seeing the massive impact an exascale computer could have on projects such as desalination is extremely exciting, and engineers are working tirelessly to build the first machine of its kind. In fact, the United States Department of Energy has commissioned an exascale computing project of its own and has awarded grants to six companies, including IBM, HP, Intel, AMD, and NVIDIA, to support exascale research and development.
Scientists have placed overwhelming faith in this technology. Progress in fields such as wind energy, quantum mechanics, weather forecasting, and global seawater desalination has slowed to a standstill for lack of computing power. Even the fastest processors on the planet choke when working through petabytes of data.