Science & Technology News (545)
Posted: September 1, 2014 07:01AM
To know what is trending, Twitter scans its library of tweets and counts up the number of times certain hashtags are used. While this definitely succeeds in identifying what is being talked about the most, it does not tell us what is actually being said. Researchers at the University of Missouri-Columbia decided to change that by developing a system that actually analyzes the words being used, to extract meaning.
To start off, the researchers used tweets associated with the Super Bowl and World Series and provided their system with a small library of words they expected to be present. The system then looked for those words as well as their placement within the tweets. The result is an analysis of what is happening at a micro or local level, beneath the clutter of hashtags.
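To make the difference concrete, here is a minimal Python sketch of the two approaches, counting hashtags versus looking for expected words and their placement in the tweet (the seed words and tweets are invented for illustration; the Missouri system is of course far more sophisticated):

```python
import re
from collections import Counter

# A small seed library of words we might expect around a Super Bowl game
SEED_WORDS = {"touchdown", "interception", "halftime", "fumble"}

def hashtag_counts(tweets):
    """Trend detection the usual way: just count hashtag frequency."""
    tags = Counter()
    for tweet in tweets:
        tags.update(t.lower() for t in re.findall(r"#(\w+)", tweet))
    return tags

def seed_word_contexts(tweets):
    """Look for expected words and keep their neighbors, i.e. placement."""
    contexts = []
    for tweet in tweets:
        words = re.findall(r"[a-z']+", tweet.lower())
        for i, w in enumerate(words):
            if w in SEED_WORDS:
                contexts.append((w, words[max(0, i - 2):i + 3]))
    return contexts

tweets = [
    "What a touchdown by the rookie! #SuperBowl",
    "Brutal interception right before halftime #SuperBowl #NFL",
]
print(hashtag_counts(tweets).most_common(1))  # what is trending
print(seed_word_contexts(tweets))             # what is being said
```

The hashtag counter only reveals that the Super Bowl is being discussed; the second function surfaces the actual events people are reacting to.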
The researchers see this system potentially being used to help provide information to people about what is happening in any given situation, such as after a disaster. It could even be used to predict future events.
Source: University of Missouri-Columbia
Posted: August 29, 2014 02:11PM
When it comes to understanding the existence of the Universe, there can be some curious theories. Naturally, testing these theories can be an equally interesting endeavor, as they push against almost unimaginable boundaries. One theory holds that what we experience and observe is actually a 2D hologram, and now researchers at the Fermi National Accelerator Laboratory have an experiment to test this idea.
If the Universe is actually a 2D hologram, then the 3D world we seem to live in is just an illusion, akin to how characters we view on 2D screens believe themselves to be in their own three-dimensional worlds. To test this idea, the researchers are exploiting one of the fundamental aspects of quantum mechanics: uncertainty. If the Universe is a 2D hologram, then there must exist 2D bits of information, akin to pixels, and the size of these pixels would be on the Planck scale. These bits describing spacetime, if they exist, would obey quantum mechanics like all other particles, which means there is an uncertainty when measuring their position and velocity. To make those measurements, the researchers have built a Holometer, which is comprised of two closely placed interferometers that are sensitive enough to potentially detect the small jitters of the bits.
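For a sense of just how small these hypothetical pixels would be, the Planck length follows from three fundamental constants, l_P = sqrt(ħG/c³). A quick check in Python:

```python
import math

hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light in vacuum, m/s

planck_length = math.sqrt(hbar * G / c**3)
print(f"Planck length: {planck_length:.3e} m")  # roughly 1.6e-35 m
```

That is about 20 orders of magnitude smaller than a proton, which is why the Holometer has to be sensitive enough to pick out such jitters from every other source of noise.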
For the running of the Holometer experiment, the researchers will be working to identify and isolate sources of noise in the measurements, which could be coming from a variety of sources, including nearby electronics. Eventually though, if the holographic theory is correct, there will be a noise that cannot be removed, because it is an aspect of spacetime itself.
Posted: August 29, 2014 11:30AM
For years I was told how important a strong vocabulary is, and it took me years to realize just how true that is. Of course you can grow your vocabulary by reading books, newspapers, and hardware review websites. Now researchers at the University of Gothenburg and Karlstad University have proven that playing video games can also strengthen your vocabulary.
For their study, the researchers had 76 children, aged 10 to 11, fill out questionnaires and keep a language diary to track their encounters with English outside of school. The boys reported spending an average of 11.5 hours a week playing video games, compared to the girls' 5.1 hour average. The girls, however, did spend an average of 11.5 hours a week on online language-related activities, primarily Facebook, while the boys only spent 8 hours a week, on average. It is important to note that for many of these children, their first language was Swedish and they were learning English as a second language. The gamers of the group demonstrated a significantly better English vocabulary than the others, which may be the result of English being commonly used in the games they played, while on Facebook the children could use Swedish instead.
Along with finding the positive impact of playing games, the researchers also identified that MMORPGs were the most effective at developing English vocabulary, thanks to the number of players and the interaction between them.
Posted: August 29, 2014 06:45AM
Whenever you take a picture, the photons that are captured were actually reflected or scattered by the objects in the image. This is how cameras have always worked, but could it be possible to take a picture with photons that never interacted with the imaged object? It must be possible, as researchers at the University of Vienna, the Institute for Quantum Optics and Quantum Information, and the Vienna Center for Quantum Science and Technology have done just that.
To achieve this seemingly impossible task, the researchers turned to quantum mechanics, and the phenomenon of entanglement in particular. Entanglement is when particles are so tightly bound together that the properties of one can affect and determine the properties of the other, even when separated over great distances. To create the entangled particles, the researchers fired a laser into a pair of nonlinear crystals, with the imaged object in the middle. Both crystals created pairs of entangled photons, but one would be an infrared photon while the other would just be red. The optics of the experiment made sure all of the infrared photons from the first crystal took the path that would have them interact with the target, a sketch of a cat, while the red photons would remain clear of it. The infrared photons would then enter the second crystal, combining with any infrared photons made there. All of these infrared photons were then discarded, while the red photons were captured by a camera that could not even see the infrared photons.
Despite the red photons never interacting with the sketch of the cat, the beams still recreated it in bright and dark patterns, because the information of the infrared photons was preserved by the red photons. This research could see some very interesting applications in the future, including in medical and biological imaging where low light imaging is important.
Source: University of Vienna
Posted: August 28, 2014 02:07PM
Over 13,000 years ago, many large animals lived in North America, including mastodons, saber-toothed cats, and even American horses, but they all disappeared at the end of the Pleistocene epoch. The exact cause of this mass extinction has been a matter of debate for some time now. Researchers at the University of California, Santa Barbara, however, have new evidence that may solve the mystery.
Studies of prehistoric times often require looking at the many layers of materials beneath us. Each layer was deposited at a certain time and the materials of each layer can tell us about the Earth's situation at the time, and about certain events. Across what is called the Younger Dryas boundary, the researchers have found many nanodiamonds. These small carbon crystals are found in their cubic forms, like those we use for jewelry, and as hexagonal crystals. What makes these particular diamonds important though is that they could only have been formed by a massive event, such as a cosmic impact.
The only other boundary that has had more than one identification of nanodiamonds is the Cretaceous-Tertiary boundary. The nanodiamonds in that boundary are the result of the cosmic impact that killed off the dinosaurs and many other species on Earth, 65 million years ago.
Posted: August 28, 2014 11:49AM
Quartz glass is not an unfamiliar material for most of us, though we likely do not consider its electrical properties. Under normal circumstances, it is an insulator, but researchers at the Vienna University of Technology and Tsukuba University have found that it can be made into a conductor. The catch is that a femtosecond laser pulse is required.
For years now, femtosecond laser pulses have been used to measure quantum effects in small particles, because at such a small time scale of 10^-15 seconds, even quantum mechanical effects can be caught. What this new research demonstrates is that these pulses can also be used to trigger significant changes in a material, and thanks to computer simulations, we now know how. When the pulse strikes the quartz, it pumps electrons bound to the oxygen atoms over to other atoms. This allows them to behave like free electrons, and the electric field of the light then drives them in one direction, creating a current. This current only exists for a very short amount of time, but does persist a little after the pulse has faded.
This process is among the fastest in solid state physics, even beating the speed of transistors, which operate on picoseconds, a thousand times slower. Next the researchers want to test other materials and potentially find one that allows a more efficient use of the effect.
Source: Vienna University of Technology
Posted: August 28, 2014 07:58AM
By combining nanoplasmonics and optical resonators, researchers at the University of Illinois have developed a new, microscopic optical amplifier. The amplifier may find some interesting uses in medicine, as a means for implanted sensors and devices to communicate with networks outside of a patient's body.
When the researchers started this work, they knew it would be difficult because of the diffraction limit of light, which puts a lower limit on the size of a device. To get around this limit, the researchers turned to plasmonics, which links photons and electrons in such a way that metal objects can break the limit. The amplifier consists of a nano-structured surface with microspheres made of polystyrene or glass on top. When a beam of light strikes a microsphere, a narrowband optical signal is created within it, and molecules on the outside of the sphere amplify it. Because of how the spheres interact with the plasmonic nanostructures on the surface, a red or green light is created with a bandwidth matching the internal signal.
Among the potential applications for this technology are power-on-a-chip systems, as it could be used to route power on a chip. Also, as the initial light signal is of a frequency that can pass through skin, these amplifiers could be used for communication between devices inside and outside of a patient.
Source: University of Illinois
Posted: August 27, 2014 02:13PM
Practically since it was first discovered, graphene has been considered a wonder material for its many special properties. For perhaps not as long, researchers have been looking for other materials that replicate some of graphene's properties, but add some other, valuable properties into the mix. Researchers at Berkeley Lab have recently discovered a new contender for graphene that could see use in photonic and optoelectronic technologies.
Called MX2 materials, these two-dimensional semiconductors are made of a layer of a transition metal, like tungsten or molybdenum, sandwiched by a chalcogen, like sulfur. The result is a structure with the same hexagonal design as graphene and its high-speed electrical conductance, but also a band gap. Band gaps are the energy difference between conducting bands and non-conducting, valence bands in a material. As graphene lacks a band gap, it lacks a means to switch its conductivity on and off, like a semiconductor. By combining layers of different MX2 materials, it is possible to control their properties, and now the researchers have found that they can have very short charge transfer times of under 50 femtoseconds.
A short charge transfer time, and thereby efficient charge transfer, impacts the ability to separate charges in a material. This is important in materials for photodetectors and solar cells, because if the charges recombine too early, what energy they had would be lost.
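A band gap also determines which photons a material can absorb or emit, via λ = hc/E, which is part of why it matters for photodetectors and solar cells. As a rough illustration (the 1.8 eV gap is an assumed, typical value for a monolayer MX2 material, not a figure from this research):

```python
h = 6.62607015e-34    # Planck constant, J*s
c = 2.99792458e8      # speed of light, m/s
eV = 1.602176634e-19  # joules per electronvolt

def gap_to_wavelength_nm(gap_ev):
    """Photon wavelength (nm) corresponding to a band gap energy (eV)."""
    return h * c / (gap_ev * eV) * 1e9

# An assumed ~1.8 eV monolayer gap lands in the visible red
print(f"{gap_to_wavelength_nm(1.8):.0f} nm")
```

A gap near 1.8 eV would put the material's optical response around 690 nm, in the visible range, which is convenient for optoelectronics.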
Source: Berkeley Lab
Posted: August 27, 2014 12:01PM
More and more, it looks as though silicon will eventually be replaced in electronics by flexible and transparent materials. Researchers at the University of Washington have recently created a heterojunction between a pair of two-dimensional semiconductors that could find use in future technologies.
Heterojunctions are where two different materials meet and combine, and because of the combination, the junction itself possesses different properties than the two materials. In this case the two materials were molybdenum diselenide and tungsten diselenide, which are both monolayer materials and have similar structures. The similar structures greatly helped in forming the junctions without any distortions or discontinuities. To create the composite material, the researchers put a powder mixture of both materials in a chamber heated to 900 °C and passed hydrogen gas through it. This caused some of the evaporated atoms to move over to a cooler part of the chamber, where they could form single-layer crystals. Because the materials have different properties, they evaporate at different times, and so reach the cooler region at different times as well. After the first material cooled and formed triangular crystals, the second came over and attached to the edges, forming the heterojunction.
While this experiment only used molybdenum diselenide and tungsten diselenide, the process could be used to combine other two-dimensional materials to achieve a variety of properties for use as LEDs, photovoltaics, in-line quantum wells, and more. No matter the use though, the process should be fairly easy to scale up for mass production, by using a large furnace.
Source: University of Washington
Posted: August 27, 2014 06:22AM
One day we may use electrical wires and cables capable of transmitting currents without resistance. The key to this future is understanding superconductivity, the phenomenon that enables it. Researchers at Oak Ridge National Laboratory and Vanderbilt University have recently made an interesting discovery concerning iron-based superconductors that challenges some long held beliefs.
Superconductivity arises in certain materials when they are brought down below some critical temperature. For the earliest superconductors, this temperature was just above absolute zero, but since then we have discovered high-temperature superconductors with critical temperatures significantly higher, though still far from room temperature. Some of these superconductors are iron-based materials, which was unexpected initially as these materials also have magnetic properties. Long-range magnetism is known to suppress superconductivity, but now it has been discovered that local magnetic moments do not disrupt it. In fact, these isolated areas of magnetism within the material may assist superconductivity, as they are at their maximum when superconductivity is. The researchers also discovered that the number of electrons in the moments was the same for different kinds of iron-based superconductors, though their distributions differed.
Beyond the potential for understanding superconductivity better, this research could also have an impact on other technological materials and devices. To do the study, the researchers had to develop a way to measure the local moments, which had not been done before as previous research always looked at the bulk average.
Source: Oak Ridge National Laboratory
Posted: August 26, 2014 03:40PM
Just about every cancer can be a big problem, and all are dangerous when they spread, which makes it vitally important to know whether one has. Catching cancer cells in a patient's blood is very difficult though, because of how few cells there can be, and because many methods for sorting cells are complicated or can damage the cells. Researchers at MIT, Pennsylvania State University, and Carnegie Mellon University, however, have developed a device for sorting cells with great accuracy and relative ease.
Instead of relying on chemical tags or strong mechanical forces, this method utilizes sound waves to gently guide cells. By using two acoustic transducers on either side of a microchannel, a standing wave can be made with a pressure node parallel to the flow. This much has been accomplished before and did demonstrate that cells of different size, compressibility, and other properties, would be pushed around differently. What has been added now is a tilt, putting the pressure node at an angle, relative to the flow. This causes the cells to pass through multiple nodes, and be slightly pushed to one side, and cells of different physical properties are still affected differently.
The researchers have already tested it with plastic beads 9.9 and 7.3 microns in size, demonstrating 97% accuracy, and were able to recover 71% of cancer cells in a sample that included those and white blood cells. They also created a computer model that can predict how cells will be affected based on their properties and the angle of the sound waves, which opens up the possibility of device customization.
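For a flavor of the physics behind such a model, standard acoustophoresis theory uses an acoustic contrast factor that depends on a particle's density and compressibility relative to the fluid; particles with different factors are pushed toward the pressure node with different strength. A simplified sketch (the material values are rough, assumed literature figures, and this is not the MIT team's actual model):

```python
def acoustic_contrast(rho_p, beta_p, rho_f=998.0, beta_f=4.48e-10):
    """Acoustic contrast factor Phi for a particle in a fluid.

    rho_p/beta_p: particle density (kg/m^3) and compressibility (1/Pa);
    defaults are for water. Positive Phi -> pushed toward the pressure node.
    """
    rho_t = rho_p / rho_f
    beta_t = beta_p / beta_f
    return (5 * rho_t - 2) / (2 * rho_t + 1) - beta_t

# Rough figures: a polystyrene bead vs. a generic cell (assumed values)
print(acoustic_contrast(1050.0, 2.16e-10))  # polystyrene bead
print(acoustic_contrast(1060.0, 4.0e-10))   # generic cell
```

Both factors come out positive, so both objects drift toward the node, but at different rates; that difference in response is what lets the tilted nodes gradually separate one population from the other.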
Posted: August 26, 2014 11:05AM
Color blindness affects only about nine percent of people, but just about every digital camera is unable to discern color with just its detector. Color images are the result of filters external to the photodetector that separate incoming light into the red, green, and blue we are familiar with. Researchers at Rice University, however, have developed a photodetector with color sensitivity while studying cephalopods.
How are cephalopods, like squid, related to this? These species have very odd skin, which the Office of Naval Research wants studied. Cephalopods are color blind, but it is suspected that they can still detect color with their skin, and it was from this hypothesis that the Rice researchers developed the new photodetector. On top of a typical silicon photodetector, the researchers put a layer of aluminum that was etched using a process common in CMOS manufacturing. The thickness of an oxide layer was also manipulated to create a plasmonic grating on the detector's surface. With such control, the grating could be tuned to only allow certain frequencies of light through, and to focus that light onto the detector.
Unlike the filters traditionally used in digital cameras, this plasmonic grating can be built directly onto silicon photodetectors using standard CMOS techniques. Beyond the advantages of being part of the chip, this grating is also smaller and simpler than the filters, all while mimicking how organisms detect colors.
Source: Rice University
Posted: August 26, 2014 07:22AM
As my mother and father will both exclaim at times, computers and machines can be quite dumb. For now, that is just annoying, but if we ever start relying on robots to help in our homes, it could be a real problem. To head off potential problems, researchers at Cornell University have developed Robo Brain and set it the task of processing content on the Internet, in order to learn and teach other machines.
Though our computers and other devices are currently able to find us the answers to almost any of our questions, that information is not in a form that the computers themselves can understand. This is what Robo Brain addresses by associating text with images and videos, to recognize objects and learn human language and behavior. It stores this information with a Markov model, which consists of nodes representing objects, actions, or parts of an image, and edges connecting them. When another robot searches for knowledge, it starts with its own chain of nodes, and will look for a similar one that Robo Brain has built. As there may be differences between what Robo Brain and the robot built, probabilities have been assigned to the nodes of Robo Brain that represent the allowed variance.
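A toy sketch of that kind of lookup, with nodes for objects and actions, weighted edges between them, and a threshold standing in for the allowed variance (everything here is invented for illustration; Robo Brain's real representation is far richer):

```python
# Knowledge stored as weighted edges between concept nodes
knowledge = {
    ("mug", "grasp"): 0.9,        # mugs are usually grasped...
    ("grasp", "by-handle"): 0.8,  # ...by the handle
    ("mug", "contains-liquid"): 0.7,
}

def match_chain(chain, graph, min_prob=0.5):
    """Walk a robot's chain of nodes and score it against stored edges."""
    score = 1.0
    for a, b in zip(chain, chain[1:]):
        p = graph.get((a, b), 0.0)
        if p < min_prob:
            return 0.0  # too different from anything the graph knows
        score *= p
    return score

print(match_chain(["mug", "grasp", "by-handle"], knowledge))
```

A robot asking how to pick up a mug gets a confident match, while a chain the graph has never seen scores zero and would trigger a broader search.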
As Robo Brain may get some things wrong, or just need some help, you can visit its website to provide corrections and additional information: RoboBrain.me.
Source: Cornell University
Posted: August 25, 2014 02:13PM
Okay, this is probably something that will not surprise many people. A recent study from the University of California, Los Angeles has shown that the use of digital media can impair a child's social skills. More specifically, by spending time looking at a screen instead of interacting with someone face-to-face, the children are not learning how to read emotional cues.
For the study, the researchers used two groups of sixth graders. Both groups were given a test at the beginning of the study to determine their ability to recognize emotions, and the test was repeated at the end. In the interim, one group of 51 students went on a five-day trip to a nature and science camp that does not allow the use of electronics. Before going to camp, the group made an average of 14.02 errors on the test, but by the end of the study that number had dropped to 9.41. The 54 students who did not go to the camp at that time saw little or no change in the number of errors made. The students reported that they spend, on average, 4.5 hours each school day with digital media, in one form or another.
While the study does rather strongly indicate one negative with digital media, at least it also shows it can be addressed, to a point. Make sure the kids get some face-to-face time, away from devices and they should be in better shape.
Posted: August 25, 2014 08:53AM
Some say that information is currently the most valuable commodity in the world, so naturally it has to be protected. This can be challenging though, especially with complex software systems that handle information of multiple safety levels. Researchers at Karlsruhe Institute of Technology have recently created a new tool that checks for possible security leaks, helping keep data safe.
The tool is called JOANA and works by checking the data channels a piece of software will run data through. It will identify channels that are publicly visible and those that are secured, and find where they may cross. As you can guess, secure data is most likely to be exposed at these crossings of secret and public information. There the data can get out by explicit leaks, by implicit leaks that expose patterns in the encryption, and by probabilistic leaks that could allow data to be reconstructed. Even though that last type of leak is particularly hard to identify, JOANA is able to catch it and even has a low false alarm rate for it.
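To make the first two leak categories concrete, here are toy sketches of an explicit and an implicit leak (JOANA itself analyzes Java programs; these Python fragments, with an invented secret, just show the flavor of each channel):

```python
SECRET_PIN = 4821  # high-security data

def explicit_leak():
    # The secret flows directly into publicly visible output
    return f"debug: pin={SECRET_PIN}"

def implicit_leak(public_guess):
    # No direct copy of the secret, but the branch taken depends on it,
    # so the public result still reveals information about the secret
    if public_guess == SECRET_PIN:
        return "welcome"
    return "denied"
```

An information-flow tool flags the first function because secret data reaches a public channel directly, and the second because a public output is control-dependent on a secret value.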
As it stands now, JOANA is the only software analysis tool for finding all three kinds of security gaps without having a high false alarm rate. Low false alarm rates are very important, as we do not want resources to be wasted hunting a nonexistent issue or for real issues to be erroneously dismissed.
Posted: August 25, 2014 05:46AM
As technology improves, it is understandable that it has a growing presence in completing tedious tasks. According to a variety of science fiction stories though, many of us have been taught that it is not necessarily a good thing to let robots have too great a role in the work. Researchers at MIT decided to put that to the test, and initially hypothesized that a sweet-spot of shared human and robot control would exist.
The study used groups consisting of two humans and one robot, in one of three configurations. In one, a human worker allocated all of the work; in another, the robot allocated all of the work; and in the last, one human allocated their own tasks while the robot set tasks for the other. As expected, the configuration in which the robot allocated all of the tasks was the most efficient, but it also had the most satisfied human workers. This surprised the researchers, as the workers reported feeling that the robot "better understood them" and "improved the efficiency of the team."
Though a physical robot was involved in the study, this result does not suggest that they should be involved in every task. What it does show is that giving control of scheduling, delegating, and coordinating tasks to algorithms instead of people may be a better idea than anticipated.
Posted: August 22, 2014 02:00PM
Take a material to an extreme, and you can expect to see some weird things happen. One example is superconductivity, and another is superfluidity, in which a liquid will flow with zero viscosity. Helium, when cooled enough, can become a superfluid, and researchers at Berkeley Lab and the SLAC National Accelerator Laboratory have recently answered a question about how nanodroplets of helium behave.
Superfluidity is the result of the atoms in the liquid coupling together to act as though they are a single particle. This removal of viscosity can lead to some interesting behaviors, such as the liquid climbing up walls. For superfluid helium, we have known for decades that rotating it will cause quantum vortices to form, regularly spaced throughout the liquid. Whether this behavior also occurs in an isolated nanoscale droplet has been an open question for some time now, and if the vortices are present, that would mean the entire droplet acts like a single quantum object, instead of a mixture of particles.
To create the nanodroplets, helium is passed through a nozzle that has been cooled to below 10 K, and shot into a vacuum chamber at almost 200 m/s. To image the droplets before they had a chance to vanish, the researchers had to use SLAC's Linac Coherent Light Source to create very short pulses of high energy X-rays. Xenon atoms were also added to the helium, to make the vortices visible, and indeed they were. In fact the vortices behaved like those in larger samples of superfluid helium, though the nanodroplets were also clocked spinning 100,000 times faster than any previous rotating superfluid helium sample. Despite that high speed though, the droplets did not deform, like normal liquid drops would. Potentially, studying these drops could lead to a better understanding of superfluidity, thanks to how isolated the property is.
Source: Berkeley Lab
Posted: August 22, 2014 10:16AM
With how much energy comes to Earth as sunlight, it is not surprising that we are working on ways to capture and harness it for our own uses. While some means for doing this do exist, we are still searching for better methods, such as replicating Nature's photosynthesis, which is a rather efficient process. Researchers at Ruhr-University Bochum have recently succeeded in creating a semi-artificial leaf that could help bring about cheap and flexible solar cells in the future.
Semi-artificial leaves that use photosystem 1 (PS1) for absorbing light, instead of a semiconductor, have been worked on before, but PS1 has a special complicating factor. It possesses both hydrophilic and hydrophobic domains, which makes it difficult to keep immobile on electrodes. To solve the problem, the researchers created a hydrogel matrix, which can have its own hydrophobic/hydrophilic properties shifted by adjusting pH levels.
The resulting semi-artificial leaf outperformed other semi-artificial bio-photoelectrodes, but also bested Nature with an electron transfer rate ten times greater. Though modern, semiconductor photovoltaics are still superior, that may change in the future, and for now the bio-photovoltaics may find use in micro-scale medical devices.
Posted: August 22, 2014 05:59AM
The droughts that have been hitting the western US in recent years are having more of an effect than most people would probably expect. Beyond the water restrictions and some dying plant life, geological data suggests the droughts have caused the entire west coast to rise a little, according to researchers at the University of California, San Diego.
Across the country there are networks of highly precise GPS stations, and those along the west coast have seen an apparent uplift in recent years. More specifically the uplift has coincided with the droughts the states in the area have been suffering. Analysis of the data indicates that some of California's mountains have risen 15 mm (over half an inch) with an average of 4 mm across the west.
Crunching the numbers reveals that approximately 240 gigatons of water, or 62 trillion gallons, would have to be missing to explain the vertical shift. This opens up a new use for GPS networks as a potential means to measure water stocks across the world.
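The unit conversion is easy to verify: a gigaton of water is 10^9 tonnes, and a tonne of water occupies roughly 1,000 liters.

```python
gigatons = 240
liters = gigatons * 1e9 * 1000  # 1 tonne of water ~= 1,000 L
gallons = liters / 3.78541      # liters per US gallon
print(f"{gallons:.2e} US gallons")
```

That works out to about 6.3e13 gallons, in line with the roughly 62 trillion figure quoted.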
Posted: August 21, 2014 12:57PM
As we use our mobile devices more and more, the importance of securing them becomes greater and greater. Researchers at the University of California, Riverside have recently discovered a security flaw in the Android OS that can be exploited with an almost perfect success rate, and almost all popular operating systems may share the vulnerability.
The vulnerability has to do with the shared-memory side channel, which contains shared-memory statistics about processes and can be accessed without permissions. From this information, it is possible to infer what an app is doing, such as logging in or receiving information for a purchase. On its own, this vulnerability is not serious, but the researchers found it could be used to time an attack that exploits a feature of many modern GUIs, which is why more than just Android may be at risk. The feature allows the screen to be preempted, such as to show an alarm. In this case though, what comes up is a false version of the expected window. By timing the attack with the shared-memory data, the attacker can make the switch without the user noticing.
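On Linux, which underlies Android, the statistics in question live in world-readable files such as /proc/<pid>/statm, whose third field is the process's shared pages (per the proc(5) man page). A minimal parser for that format, using an invented sample line:

```python
def parse_statm(statm_line):
    """Split a /proc/<pid>/statm line into named fields (units: pages)."""
    names = ["size", "resident", "shared", "text", "lib", "data", "dt"]
    return dict(zip(names, (int(x) for x in statm_line.split())))

# Sample line in statm format (values invented for illustration);
# a live attack would instead poll open(f"/proc/{pid}/statm") repeatedly
sample = "2498 1328 428 9 0 1580 0"
print(parse_statm(sample)["shared"])
```

Changes in that shared value over time are the kind of signal an attacker can watch to infer when a target app is about to draw, say, its login screen.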
The researchers tested the attack on seven apps and here are their success rates (higher is worse): Gmail at 92%, H&R Block at 92%, Newegg at 86%, WebMD at 85%, CHASE Bank at 83%, Hotels.com at 83%, and Amazon at 48%. The reason Amazon has the lowest success rate is that it is harder to infer the state of the app, as it can transition from almost any activity to almost any other.
Posted: August 21, 2014 10:42AM
Lasers are a cool technology, but as science fiction and some science fact tell us, the beams themselves are not cool and can be used to heat and melt objects. As some other science facts tell us though, lasers can also be used to cool objects, by exploiting our great control over the properties of the beam. Researchers at the Australian National University have recently used this control to cool the tip of an atomic force microscope, making it much more accurate.
Atomic force microscopes (AFMs) are among the more advanced measurement devices we have, and operate by moving a cantilever with a very sharp tip over a sample. At its point, the tip can be just nanometers wide and is very sensitive to forces, such as a surface pushing against it, but also forces between molecules. It is so sensitive though that heat will cause vibrations that introduce noise to the measurements. The Australian researchers decided to tackle that noise by aiming a laser at the probe. By precisely tuning the laser, the vibrations of the probe can be cancelled out, cooling the probe to -265 °C. This increases the sensitivity enough to detect the weight of a large virus.
As the laser beam's effect overwhelms the probe, the AFM cannot be run while the laser is turned on, restricting the researchers to making measurements in millisecond-long heating and cooling cycles. With additional study and data processing though, we may one day see the same sensitivity achieved without a cooling laser, thanks to our understanding of the cooling effect.
Source: Australian National University
Posted: August 21, 2014 06:24AM
In many cases, electricity is generated by driving turbines with one fluid or another, such as steam or water. What can really set power plants apart is what puts the energy into the fluid that the turbines extract. One new method may use salt for that purpose, and researchers at MIT have found that such a system is not as simple as believed.
If you have two fluids with different solute concentrations separated by a semi-permeable membrane, such as having saltwater on one side and fresh water on the other, the fluids will move to try to equalize the concentrations on both sides of the membrane. The motion of the fluids is called osmosis, and pressure retarded osmosis (PRO) is a process some have been investigating for producing electricity. The idea would be to put pressurized salt water on one side of a membrane and fresh water on the other, and use the movement of the fresh water through the membrane to turn a turbine. What the MIT researchers have discovered is that the efficiency of a PRO system is more complicated than previously thought. According to their new model, the optimal membrane size is not the maximum membrane size, as a membrane half the area could produce 95% of the maximum output power.
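The pressure available to such a system comes from the osmotic pressure difference across the membrane, which the van 't Hoff relation π = icRT approximates. A rough estimate for seawater against fresh water (the 0.6 M salt concentration is an assumed, typical seawater value, not a figure from the MIT model):

```python
R = 8.314   # gas constant, J/(mol*K)
T = 298.0   # temperature, K
i = 2       # van 't Hoff factor for NaCl (dissociates into 2 ions)
c = 600.0   # ~0.6 mol/L of salt in seawater, expressed in mol/m^3

osmotic_pressure_pa = i * c * R * T
print(f"{osmotic_pressure_pa / 1e5:.1f} bar")  # roughly 30 bar
```

Around 30 bar of driving pressure is considerable, which is why the engineering questions center on membrane area and configuration rather than on whether there is enough pressure to turn a turbine.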
Potentially PRO systems could be used to power desalination plants and water treatment plants, by putting saltwater or brine on one side of the membrane and fresh or waste water on the other. Completely powering some treatment plants may require some of the largest membranes in the world, but new configurations are being developed to pack millions of square meters of membrane into relatively small packages.
Posted: August 20, 2014 02:06PM
Though we may not think about it much, we are all aware of Earth's magnetic field. The most obvious use of it is to orient compasses, but it has other uses too, as it is used for probing in geology and archaeology. Thanks to researchers at Berkeley Lab, it may soon also find use for analyzing chemical compositions of fluids, without removing them from their native environments.
Nuclear magnetic resonance (NMR) is a phenomenon that can be used to determine the materials in a sample, and possibly its most common use is in medicine's MRI machines. It works by measuring how atoms behave when the angle of their spins is manipulated. Normally strong and uniform magnetic fields are used, but these are not always available. What is always available on Earth, though, is the planet's magnetic field. Attempts have been made before to use the Earth's magnetic field for NMR, but they failed because the field is so weak and the equipment was not sensitive enough. The Berkeley researchers have discovered that it now appears to be possible, by using highly sensitive optical magnetometers and by looking at how the spins of molecules relax and diffuse.
Potentially this technique could be used to characterize the contents of solids underground, such as in oil wells, and actually measure hydrocarbons and water within rock, as well as to inspect the curing process of polymers and cement. The researchers next want to increase the depth their method can reach inside a material, possibly penetrating a meter or more, instead of the inches possible with current technologies.
Source: Berkeley Lab
Posted: August 20, 2014 09:39AM
Many modern solar cells are made of materials like silicon and are expensive to produce. In the future though, new photovoltaics based on polymers could replace them by being cheaper and more resilient. Finding the right polymers is tricky, but researchers at the University of Tsukuba and the National Institute for Materials Science have found a way to speed up the search, as published by the American Institute of Physics.
Materials science can be an exhaustive field, as materials must first be produced for testing, and only then can it be determined whether a material is of much use. By better understanding the behaviors of a material, it becomes easier to predict its properties and thereby speed up the process. This is what the Japanese researchers have accomplished for organic photovoltaic candidates by combining two kinds of photo-induced spectroscopy. The two processes important for these materials are their charge formation and charge transport efficiencies, and it was believed that the charge formation efficiency is complicated and actually dependent on a thermal activation process. What the researchers discovered is that temperature actually does not matter, as samples demonstrated the same efficiency at 80 K and 300 K.
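To see why equal efficiencies at 80 K and 300 K rule out thermal activation, consider a simple Arrhenius estimate: a thermally activated rate scales as exp(-E_a/k_B·T), so dropping the temperature from 300 K to 80 K should slash it dramatically. A quick sketch, where the activation energy is an assumed, illustrative value:

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def arrhenius(ea_ev: float, t_k: float) -> float:
    """Relative rate of a thermally activated process, exp(-Ea / kB T)."""
    return math.exp(-ea_ev / (K_B * t_k))

ea = 0.1  # eV -- an assumed, modest activation energy for illustration
ratio = arrhenius(ea, 300.0) / arrhenius(ea, 80.0)
print(ratio)  # tens of thousands: even a small barrier would collapse at 80 K
```

Even a modest 0.1 eV barrier would make the process tens of thousands of times slower at 80 K than at 300 K, so observing identical efficiencies at both temperatures is strong evidence that no thermal barrier is involved.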
This discovery indicates that the charge formation efficiency for organic photovoltaics is only quantum mechanical, which actually makes it simpler than expected. The result is that it should also be easier to quickly screen materials by this property, and in turn speed up searches for new organic photovoltaic materials.
Posted: August 20, 2014 05:57AM
More and more, fiber optic cables are being installed to carry information across networks and across the Internet, because they offer great speed and capacity. Many would like to see fiber optics enter our computers as well, but shrinking the cables has proven difficult. Researchers at the University of Alberta, though, have managed to create nano-optical cables that could enter our computer chips.
Presently copper wires are used within computer chips as interconnects, because the metal does a decent job. Optical fibers could do better, but their diameter has been limited to the micrometer range, which is too large. By turning to metamaterials, however, the Alberta researchers were able to go an order of magnitude smaller, without losing data, slowing the signal, or creating heat. As you can no doubt guess, bringing fiber optics into chips would also bring significantly greater speeds and efficiencies than what we see now.
Source: University of Alberta
Posted: August 19, 2014 02:09PM
One of the biggest challenges with quantum computers is finding a way to store quantum information for extended periods of time. There are many different approaches being studied right now for preserving the information, and each has its own advantages and disadvantages. Now researchers at the Vienna University of Technology have combined two of these techniques and managed to extend the stability of the information.
One of the techniques being developed encodes quantum information onto nitrogen atoms inside diamonds, which protects it from external forces. Another technique encodes information onto photons trapped in a resonator. The researchers combined these two concepts by using a microwave resonator to encode information onto multiple nitrogen atoms. Coupling all of the nitrogen atoms to the resonator at once prevents them from losing coherence, keeping the quantum information accessible for longer than it normally would be.
By opening the door to hybrid quantum technologies this way, it is hard to predict what new technologies may be created in the future. Of course quantum computers will see a benefit, but the potential of this research could be greater than longer memory storage.
Source: Vienna University of Technology
Posted: August 19, 2014 11:40AM
Unless you are exceptionally careful about what information is on the Internet, there is a good chance you have been presented with a recommendation based on your online activities. Of course we all know that services and websites collect information from emails, video views, and product views, but how exactly do they generate the recommendations? That is what researchers at Columbia University want to know, and so they have developed XRay to provide greater transparency on the Internet.
Approaching the problem of how our information is used is tricky, because much of the Internet operates like a black box. Without the ability to view the processes involved in generating the recommendations, XRay has to rely on black-box correlations between inputs and outputs. At first the researchers worked with theoretical results, which were encouraging but only theoretical, so they soon started running experiments on Gmail, Amazon, and YouTube and refining the design. Eventually XRay achieved complete success in each experiment, matching theoretical predictions in complex cases, which suggests it can scale up well.
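The black-box correlation idea can be sketched simply: seed several accounts with different subsets of inputs, observe which outputs each account receives, and score each input by how strongly its presence predicts an output. This is a minimal toy illustration in that spirit, not XRay's actual algorithm; the account names, keywords, and ad labels are all hypothetical:

```python
# Hypothetical data: the inputs (e.g. email keywords) each shadow
# account was seeded with, and the outputs (e.g. ads) it then saw.
accounts = {
    "acct1": {"inputs": {"travel", "loans"}, "outputs": {"ad_flights"}},
    "acct2": {"inputs": {"travel", "music"}, "outputs": {"ad_flights"}},
    "acct3": {"inputs": {"loans", "music"}, "outputs": set()},
}

def likely_trigger(output: str, accounts: dict) -> str:
    """Score each input by the fraction of accounts containing it that
    also saw the output, and return the strongest correlate."""
    scores = {}
    all_inputs = set().union(*(a["inputs"] for a in accounts.values()))
    for inp in all_inputs:
        with_inp = [a for a in accounts.values() if inp in a["inputs"]]
        hits = sum(output in a["outputs"] for a in with_inp)
        scores[inp] = hits / len(with_inp)
    return max(scores, key=scores.get)

print(likely_trigger("ad_flights", accounts))  # -> travel
```

Here every account seeded with "travel" saw the flight ad while the others only sometimes did, so "travel" scores highest. Real services are noisy and adaptive, which is why XRay needed careful experiment design and statistical refinement on top of this basic idea.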
Though the current system has only been run on Gmail, Amazon, and YouTube, it should be service-agnostic, so any site that tracks you could be studied with XRay. Thus far it has revealed that it is possible to target sensitive topics and that there does appear to be abuse of the recommendation systems. You can see example results at XRay's website: XRay: Transparency for the Web.
Source: Columbia University
Posted: August 19, 2014 06:36AM
Lead has quite a history, as the soft metal once saw many uses, from water pipes to additives for gasoline and paint, but it is now restricted to just a few, due to its potential health hazards. That is also why we see so many recycling programs specifically for lead: to keep it from getting somewhere it should not, and to reduce the amount that needs to be acquired. Now researchers at MIT have devised another recycling program that could see lead repurposed for use in solar cells.
Perovskites are a family of compounds that share similar structures, and organolead halide perovskite is being looked at for use in solar cells. Some of these cells have already exceeded 19% efficiency, which makes them almost competitive with the silicon-based solar cells you can find today. The catch is the use of lead, especially as the solar cells would require more lead to be mined. The MIT researchers, however, have developed a process to take the lead from car batteries and use it in solar cells. As the lead compound would actually be a thin film, they predict that the lead from a single car battery would be enough to make solar panels to power 30 homes.
Along with providing another reason to recycle car batteries, this could also help bring down the cost of solar cells. The process is low temperature and requires fewer steps than conventional solar cell production, making it potentially easier to scale up cheaply.
Posted: August 18, 2014 01:57PM
Many people likely associate the image of a table covered with beakers, flasks, burners, and at least one microscope with a chemistry laboratory, but that is likely to change in the future. Many groups around the world are working on creating lab-on-a-chip systems that will condense chemical testing equipment onto something the size of a computer chip. Researchers at the University of California, Santa Cruz have recently created a chip with the ability to identify single molecules by combining electrical and optical measurement techniques.
The chip utilizes a nanopore that acts as a smart gate to control the flow of molecules into a channel. The nanopore also allows the researchers to make electrical measurements as molecules cross it. For DNA passing through the nanopore, the electrical measurements would actually be able to determine the genetic sequence of the DNA, through fluctuations of the current. Once in the channel, the molecule is also exposed to a beam of light, and changes to the light's intensity indicate the size and optical properties of the molecule, as well as the flow speed through the channel.
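The idea of reading a sequence from current fluctuations can be illustrated with a toy base-caller: if each DNA base blocked the pore's current by a characteristically different amount, each current sample could be assigned to the nearest known level. This is purely an illustration of the principle; the current levels below are made-up values, not measured data:

```python
# Hypothetical blockade-current levels (arbitrary units) for each base.
# Purely illustrative -- real nanopore signals are far noisier.
levels = {"A": 0.9, "C": 0.7, "G": 0.5, "T": 0.3}

def call_bases(trace: list[float], levels: dict[str, float]) -> str:
    """Assign each current sample in the trace to the base whose
    characteristic level it is closest to."""
    return "".join(
        min(levels, key=lambda base: abs(levels[base] - sample))
        for sample in trace
    )

print(call_bases([0.88, 0.31, 0.52, 0.69], levels))  # -> ATGC
```

Real devices must also handle noise, the speed of the molecule through the pore, and multiple bases influencing the current at once, which is part of why combining the electrical readout with the optical channel measurement is attractive.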
When the researchers tested their chip with a mixture of influenza viruses and nanobeads of similar diameter, tagged with fluorescent labels, they found that they were able to distinguish between the two particle types using their electrical and flow properties with perfect accuracy. They were even able to count the number of virus particles, which would be very useful for analyzing samples.
Posted: August 18, 2014 08:40AM
Some people may not always think about it, but impurities are important to our way of life. The changes impurities can cause in many materials have enabled and improved many technologies we have come to rely on. Now researchers at Rice and Osaka universities have found just how much impurities can disrupt graphene, a material many hope will be central to future technologies.
Graphene is an atom-thick sheet of carbon with special electrical properties, including great electron mobility. Of course, for those properties to be useful in future technologies they must persist, but the researchers have found that they can be greatly affected by impurities from the environment. For this research, the researchers grew a sheet of graphene and transferred it to an indium phosphide substrate. When femtosecond laser pulses struck the graphene, the indium phosphide reacted by emitting terahertz radiation that passed through the graphene. Using a spectrometer, the researchers were able to detect imperfections as small as an oxygen molecule, because they affect the electric field of the graphene and disrupt the terahertz radiation.
Of course the knowledge of how much graphene can be affected by imperfections is going to impact the development of technologies that may use it. One potential technology may be to actually adapt this experiment's design as a highly sensitive gas sensor.
Source: Rice University