If robots are ever to enter the household, they will have to be able to recognize the objects around them. While the human brain can do this almost instantly, computers have a much harder time determining what they are looking at and how it is oriented. Researchers at MIT, however, have developed a new algorithm that may greatly improve a robot's ability to do this.
Central to the new algorithm is a statistical tool called the Bingham distribution, a probability distribution that the researchers realized could be applied to aligning a real object with a geometric model. To determine an object's orientation, a robot attempts to rotate a geometric model of the object until it lines up with what it sees. The catch is that a rotation that brings some points of the model and the object into alignment may push others out of alignment. It turns out that the probability that a given rotation aligns the model with the object can be described by a Bingham distribution, which can therefore be used to quickly determine the object's orientation.
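To make the idea concrete, here is a minimal sketch (not the researchers' actual implementation) of how a Bingham distribution can score candidate rotations. Rotations are represented as unit quaternions, and the unnormalized Bingham density exp(qᵀ M Z Mᵀ q) is evaluated for each candidate; the orientation matrix M, concentrations Z, and the candidate list are all illustrative assumptions.

```python
import numpy as np

def bingham_unnorm(q, M, Z):
    """Unnormalized Bingham density exp(q^T M diag(Z) M^T q) for a unit quaternion q.
    Note the antipodal symmetry: q and -q score identically, which suits
    quaternions, since both represent the same rotation."""
    q = q / np.linalg.norm(q)
    return float(np.exp(q @ M @ np.diag(Z) @ M.T @ q))

# Illustrative parameters: the distribution peaks at the identity rotation.
M = np.eye(4)                               # columns = orthogonal directions in quaternion space
Z = np.array([0.0, -10.0, -10.0, -10.0])    # concentrations; the 0 entry marks the mode

# Hypothetical candidate rotations (as unit quaternions w, x, y, z)
candidates = {
    "identity":  np.array([1.0, 0.0, 0.0, 0.0]),
    "90deg_z":   np.array([np.cos(np.pi/4), 0.0, 0.0, np.sin(np.pi/4)]),
    "180deg_z":  np.array([0.0, 0.0, 0.0, 1.0]),
}

# Score each candidate and keep the most probable alignment
scores = {name: bingham_unnorm(q, M, Z) for name, q in candidates.items()}
best = max(scores, key=scores.get)
print(best)  # the identity rotation scores highest under this distribution
```

In a real pose-estimation pipeline the parameters M and Z would be fitted from point correspondences between the sensed object and the model, rather than fixed by hand as here.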
When tested, the new algorithm produced about as many false positives as existing algorithms but correctly identified 73% of the objects in a scene, compared with 64% for its predecessors. The researchers expect performance to improve as more information is provided to the system, such as the likelihood of certain objects being found at unusual angles.