Room Temperature Infrared Sensors with Nanotubes
Category: Science & Technology
Posted: May 23, 2012 01:32PM
For any kind of sensing technology, a large signal-to-noise ratio is important so that interference is kept to a minimum. Generally this means minimizing how much of whatever you are sensing comes from anywhere other than the targeted source. Shielding helps most of the time, but for some sensors, such as infrared sensors, the detector itself heats up and distorts its own observations. As reported by the Optical Society of America, researchers at Peking University, the Chinese Academy of Sciences, and Duke University have developed a new kind of sensor that practically eliminates this problem.
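The article does not give numbers, but as a quick illustration of why detector self-heating matters, the signal-to-noise ratio is conventionally expressed in decibels; the values below are made-up powers chosen only to show how added thermal noise shrinks the ratio:

```python
import math

def snr_db(signal_power: float, noise_power: float) -> float:
    """Signal-to-noise ratio in decibels: 10 * log10(P_signal / P_noise)."""
    return 10 * math.log10(signal_power / noise_power)

# Hypothetical powers in arbitrary units.
clean = snr_db(1.0, 0.01)        # cooled detector, low noise floor
self_heated = snr_db(1.0, 0.1)   # detector heat raises the noise floor

print(clean)        # 20.0 dB
print(self_heated)  # 10.0 dB
```

The tenfold increase in noise power costs 10 dB of SNR, which is why conventional detectors go to the trouble of liquid-nitrogen or electric cooling.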
Current infrared sensors are based on a semiconducting alloy of mercury, cadmium, and tellurium (mercury cadmium telluride), and rely on liquid nitrogen or electric cooling to keep the alloy's own heat from interfering with the sensor. What the researchers developed instead is a carbon nanotube infrared detector. Carbon nanotubes respond very strongly to infrared radiation, making them well suited to infrared sensing, but they have another characteristic that makes their use truly beneficial. Nanotubes also conduct heat very well, so where the detector would normally heat up during use, the heat dissipates quickly and the signal remains clear.
This discovery could advance devices used in the military, optical communications, and science, to name a few areas. There are likely other uses not yet thought of as well, given how limiting the need for extra cooling equipment has been in the past.