The latest technology buzzword is 'cloud computing,' which can have drastic effects on the performance and cost of some systems. In some cases those effects are positive, but in others the cloud can actually be slower, despite its better hardware. Researchers at MIT are working to change that with two new tools aimed at optimizing performance.
Many companies rely on large databases, and on a single computer, working with them is not terribly difficult. If that database lives in the cloud, though, performance can suffer because the data may be spread across multiple servers, so viewing and modifying it takes longer and consumes more resources than necessary. A related problem is provisioning: virtual machines on a server are allotted only so much of the system's resources, based on an estimate of their peak need, and predicting that peak is difficult. To address this, the researchers developed DBSeer, which uses machine learning to correlate resource use with user activity and more accurately predict how many resources a virtual machine should get.
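To make the idea concrete, here is a minimal sketch of that kind of workload-to-resource correlation, using a plain least-squares fit. The feature choice (transaction counts by type), the sample numbers, and the two-feature model are all illustrative assumptions, not details of DBSeer itself.

```python
# Hedged sketch: predict a VM's resource need from observed user activity.
# Assumption (not from DBSeer): CPU use is roughly a linear mix of
# transaction counts per type; the data below is made up for illustration.

def fit_linear(xs, ys):
    """Least-squares fit of y = a*x1 + b*x2 via the normal equations."""
    s11 = sum(x[0] * x[0] for x in xs)
    s12 = sum(x[0] * x[1] for x in xs)
    s22 = sum(x[1] * x[1] for x in xs)
    t1 = sum(x[0] * y for x, y in zip(xs, ys))
    t2 = sum(x[1] * y for x, y in zip(xs, ys))
    det = s11 * s22 - s12 * s12
    a = (t1 * s22 - t2 * s12) / det
    b = (s11 * t2 - s12 * t1) / det
    return a, b

# Observed per-minute workload: (read txns, write txns) -> measured CPU %
workload = [(100, 10), (200, 20), (150, 40), (300, 30)]
cpu = [12.0, 24.0, 23.0, 36.0]

a, b = fit_linear(workload, cpu)

# Predict CPU need for an anticipated peak of 400 reads and 50 writes,
# rather than provisioning from a guess.
predicted_peak = a * 400 + b * 50
```

With a model like this, the resource allotment can track what the workload actually demands instead of a static worst-case guess.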
The researchers are also teaching DBSeer to deal with the fact that some operations scale much better than others as user requests increase. To that end, they are developing new models of MySQL and other database systems to better predict when poorly scaling operations will occur.
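The scaling question can be framed as estimating how an operation's cost grows with request rate. Below is a small sketch of one way to flag poorly scaling operations: fit a power law latency ≈ c·rateᵏ in log-log space and check the exponent k. The data, the latency numbers, and the 1.5 threshold are illustrative assumptions, not part of the MySQL models described in the article.

```python
import math

# Hedged sketch: flag operations whose latency grows faster than linearly
# with request rate. Data and threshold are made up for illustration.

def growth_exponent(rates, latencies):
    """Fit latency ~ c * rate**k by least squares in log-log space; return k."""
    lx = [math.log(r) for r in rates]
    ly = [math.log(t) for t in latencies]
    n = len(lx)
    mx = sum(lx) / n
    my = sum(ly) / n
    return (sum((x - mx) * (y - my) for x, y in zip(lx, ly))
            / sum((x - mx) ** 2 for x in lx))

rates = [10, 20, 40, 80]
# A well-behaved read scales linearly; a lock-heavy update grows quadratically
read_latency = [1.0, 2.0, 4.0, 8.0]      # latency proportional to rate
update_latency = [1.0, 4.0, 16.0, 64.0]  # latency proportional to rate**2

# Exponent well above 1 marks an operation that will degrade under load
update_scales_badly = growth_exponent(rates, update_latency) > 1.5
```

An exponent near 1 means the operation scales gracefully; a larger exponent warns that it will dominate resource use as traffic grows, which is exactly the kind of behavior the models aim to predict in advance.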