According to Network World, decision makers and end users alike might be seeing cold-running computers in the near future.
This is due to a longstanding problem: much of the electrical power a computer or other machine draws is lost as heat while it runs. That loss costs decision makers and end users twice, first as wasted energy and again as the expense of keeping running equipment cool.
Researchers have now announced three separate breakthroughs that may help future computer technology shed much of that heat loss.
It’s like a superconductor, but isn’t:
The first development is a new “exotic, ultrathin material” that can serve as a topological transistor, Network World says. The material, sodium bismuthide (Na3Bi), has “unique tunable properties”: it behaves like a superconductor but doesn’t need to be chilled. That matters for a “cold computer,” because conventional superconductivity, in which electrical resistance is eliminated, generally requires extreme cooling.
Another group of researchers is working on electron transport without heat production, “and is approaching it through a form of superconductivity,” Network World says. The approach, known as spintronics, exploits the electrons’ spin and is also related to magnetism. By combining magnetic materials and superconductors, the researchers have been binding electrons with parallel spins into pairs that can carry a supercurrent over long distances through magnets.
In other words, “inherently non-heating superconducting spintronics might now be able to replace fundamentally hot semiconductor technology,” the researchers say, according to Network World.
Cooling on the chip:
Other researchers are tackling the computer chip directly, redesigning it for better cooling. Some designs embed channels through which coolant travels, cooling the chip from within rather than relying on an attached heatsink. Heatsinks are inefficient, the researchers say, partly because of the thermal interface material they require. With on-chip cooling, “electronics could be kept cooler by 18 degrees F, and power use in data centers could be reduced by 5 percent,” according to Network World.
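To put that 5 percent figure in perspective, here is a back-of-the-envelope sketch in Python. Only the 5 percent reduction comes from the article; the baseline facility size, electricity rate, and hours of operation are hypothetical numbers chosen purely to illustrate the arithmetic.

```python
# Back-of-the-envelope estimate of what a 5 percent cut in data-center
# power use (the figure Network World cites for on-chip cooling) could
# mean in energy and cost. All baseline inputs are hypothetical.

def annual_savings(baseline_kw, hours=8760, rate_per_kwh=0.10, reduction=0.05):
    """Return (kWh saved, dollars saved) per year for a facility
    drawing `baseline_kw` kilowatts around the clock."""
    kwh_saved = baseline_kw * hours * reduction
    return kwh_saved, kwh_saved * rate_per_kwh

# Example: a hypothetical 1 MW (1,000 kW) data center at $0.10/kWh.
kwh, dollars = annual_savings(1000)
print(f"{kwh:,.0f} kWh saved, about ${dollars:,.0f} per year")
```

For a hypothetical 1 MW facility, the cited 5 percent reduction works out to roughly 438,000 kWh and about $43,800 per year at $0.10 per kWh, which is why cooling efficiency draws so much attention from data-center operators.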