Computing power’s decades-long rise has been both steady and astounding, but for this rise to continue, researchers must grapple with some fundamental challenges.
Among these is heat. As devices become more compact and powerful, they generate more heat, confined to an ever-smaller area. Today’s cooling technologies are simply not up to the job of removing the heat that tomorrow’s devices will generate.
A paper co-authored by Dong Liu, assistant professor of mechanical engineering at the University of Houston Cullen College of Engineering, presents a new technology that could help cool these future electronics. The article, published in Nano Letters and written in collaboration with researchers from the University of Colorado Boulder (led by mechanical engineering professor Ronggui Yang) and the Georgia Institute of Technology (led by professor and university president G.P. Peterson), involves a cooling process known as flow boiling.
In such systems, the heat generated by an electronic device is transferred to water that boils as it moves through a channel. This water then contacts a heat exchanger, allowing heat to be removed from the device. Notably, as these channels shrink, their ability to remove heat increases on a per-unit-volume basis. Device designers, then, often utilize a series of small, connected, parallel channels, termed "microchannels," as an efficient means of heat removal.
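The per-unit-volume advantage of small channels is geometric: shrinking a channel packs more wetted (heat-transfer) surface into the same volume. A minimal sketch, using a square channel as an assumed simplification (not a calculation from the paper):

```python
# Sketch: why smaller channels remove more heat per unit volume.
# For a square channel of side d and length L, the wetted surface
# area is 4*d*L and the enclosed volume is d*d*L, so the ratio of
# heat-transfer area to volume is 4/d -- it grows as d shrinks.

def area_per_volume(d_cm: float) -> float:
    """Wetted surface area per unit volume (1/cm) for a square channel."""
    return 4.0 / d_cm

for d in (0.1, 0.05, 0.01):  # channel widths in cm (1 mm down to 100 um)
    print(f"d = {d * 10:.1f} mm -> area/volume = {area_per_volume(d):.0f} cm^-1")
```

Halving the channel width doubles the heat-transfer area available in each cubic centimeter, which is why microchannel arrays are attractive for dense electronics.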
This design has at least one serious drawback, though. Vapor can easily build up in the channels, causing serious problems. "On the one hand, we want boiling to be as intense as possible in order to transfer as much heat as possible," said Liu. "But if the boiling process is too violent, the channel is so small that it will become blocked by vapor."
When this occurs, the vapor insulates the water from the heat it is supposed to transfer. What’s more, if a channel becomes completely blocked by vapor, it can cause a backup that affects the other channels in the system, resulting in significant pressure and temperature instabilities and causing a large decrease in cooling ability.
To combat this, Liu and his collaborators employed a monolithic approach, growing silicon nanowires directly in the channels (as opposed to fabricating them separately and then inserting them). The nanowires stand perpendicular to the channel surfaces and, through a process known as capillary wetting, draw liquid along their length back to the channel surface. This re-wetting of the surface prevents the insulation and blockage caused by vapor.
While this modification alone is not enough to meet the cooling needs of future electronic devices, it could easily be integrated into other systems to make them more efficient, Liu stated. While most research currently targets technologies that can dissipate heat generated by 100 watts of energy per square centimeter, Liu believes this technology can help achieve a far more aggressive goal.
"A couple of weeks ago, DARPA (the Defense Advanced Research Projects Agency, which focuses on funding cutting-edge research) released a call for proposals for a microchannel-based two-phase cooling system that can dissipate 1,000 watts per square centimeter, which addresses the heat flux (heat removal per unit area), and 1,000 watts per cubic centimeter, which addresses the power density (heat removal per unit volume). We want to integrate this into a proposal that can reach that double-thousand goal," he said.