In 1948 Claude Shannon, then a researcher at Bell Telephone Laboratories and later a professor at MIT, published the paper that created information theory. Since then its principles have underpinned digital communication, data compression, and encryption. Recently, however, researchers at MIT and the National University of Ireland reexamined the original theory and found a flaw in one of its assumptions as it applies to encryption.
A central concept in information theory is entropy: a measure of the uncertainty of a random variable. There are several ways to define entropy, and the most widely used is Shannon entropy, named after Claude Shannon; for communication it is the right one to use. For cryptography, however, the researchers found that it does not suffice. Shannon entropy measures the average uncertainty of a random variable. Long streams of communicated data converge to that average, but encrypted data may not, so a portion of the encrypted data can retain a correlation with its unencrypted form. Once one correlation is found, it becomes easier to find others and eventually to decrypt the information completely. In practice, this means someone could guess a password more quickly than previously thought, simply by exploiting the frequencies with which letters occur in English words.
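The gap between average-case and worst-case uncertainty can be made concrete. The sketch below (a minimal illustration, not the researchers' own analysis, using a made-up four-symbol distribution) computes Shannon entropy, which averages uncertainty over all outcomes, alongside min-entropy, which reflects a guesser who always tries the most likely outcome first. For a skewed distribution like English letter frequencies, min-entropy is much lower, which is exactly why average-case measures can overstate how hard guessing is.

```python
import math

def shannon_entropy(p):
    """Average uncertainty in bits: H = -sum(p_i * log2(p_i))."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def min_entropy(p):
    """Worst-case uncertainty in bits: H_min = -log2(max(p_i))."""
    return -math.log2(max(p))

# A uniform 4-symbol source vs a skewed one (hypothetical probabilities).
uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.7, 0.1, 0.1, 0.1]

print(shannon_entropy(uniform), min_entropy(uniform))  # both are 2.0 bits
print(shannon_entropy(skewed), min_entropy(skewed))
```

For the uniform source the two measures agree at 2.0 bits, but for the skewed source min-entropy falls to about 0.51 bits while Shannon entropy stays near 1.36 bits: a guesser who always tries the most likely symbol first succeeds 70% of the time on the very first try.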
While this may sound like troubling news for modern encryption schemes, there are two reassuring points. First, although this research implies that decrypting information is easier than once believed, it remains a difficult task, so our data is not suddenly insecure. Second, the fresh scrutiny of modern cryptography that this research will prompt is likely to lead to improved encryption methods that avoid the issue.