Science & Technology News (948)
Posted: July 1, 2015 07:00AM
Someday we may have to leave electronics behind and move to a new technology, possibly photonics. There are still many hurdles to overcome before that can happen, and among them is packing photonic waveguides close together. Researchers at Berkeley Lab and the University of California, Berkeley have achieved that by applying a process called adiabatic elimination.
If you place two nanowaveguides too close together, crosstalk between them will destroy any useful information. This is a significant problem if we want to achieve chip-scale quantum computing with photons and high-performance optical communications. The solution the Berkeley researchers found is to add a third waveguide between the two that would normally interfere with each other. This middle waveguide mediates the light passing between the two outer guides, preventing crosstalk, but does so without collecting any light itself, making it effectively dark. This allowed the waveguides to be placed just 200 nm apart, well within the range where crosstalk would normally be destructive.
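The "dark" middle guide can be illustrated with a toy coupled-mode model: three waveguides in which the outer two couple only through a strongly detuned middle one. The coupling and detuning values below are illustrative, not taken from the Berkeley work, but they show the hallmark of adiabatic elimination: light hops between the outer guides while the middle guide stays nearly empty.

```python
import numpy as np
from scipy.linalg import expm

# Toy coupled-mode model: guides 1 and 3 couple only through guide 2,
# which is strongly detuned (delta >> k). All values are illustrative.
k = 1.0        # coupling between each outer guide and the middle guide
delta = 20.0   # propagation-constant detuning of the middle guide

H = np.array([[0.0,   k,   0.0],
              [k,   delta, k  ],
              [0.0,   k,   0.0]])

a0 = np.array([1.0, 0.0, 0.0], dtype=complex)   # light launched into guide 1

zs = np.linspace(0.0, np.pi * delta, 600)       # propagation distance
pops = np.array([np.abs(expm(1j * H * z) @ a0) ** 2 for z in zs])

print("peak power in the middle guide:", pops[:, 1].max())  # stays 'dark'
print("peak power in the far guide:   ", pops[:, 2].max())  # near-complete transfer
```

Even though all the light flows "through" the middle guide, its population never builds up; the transfer behaves like a direct, slow coupling between the outer guides.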
The control of light this discovery provides will enable much denser designs of nanowaveguides, which are analogous to the circuitry in current electronics, and denser designs in turn allow for more advanced and powerful devices. It may still be a long time before we see devices that take advantage of this research, but what comes of it should be very interesting to see.
Source: Berkeley Lab
Posted: June 30, 2015 07:40AM
Some people believe that augmented reality systems are going to come to define how we interact with the world, by providing information on top of the world around us. One of the more obvious applications for such systems is vehicle windshields, by having warnings, directions, cellphone activity, and more information displayed in the driver's view. While some see this as a way to improve safety, as the driver will not have to look away from the road, researchers at the University of Toronto have found in a new study that AR may make driving less safe.
What the study effectively boils down to is that the information thrown up on the windshield splits the driver's focus, because it takes some effort to recognize the information, identify it, and process it. To perform this study, the researchers used computer-based trials that would display randomly placed spots on a windshield. The participants had to report when they saw the spots and how many there were, but in some cases an additional shape would appear, which also had to be reported. Whether the shape was just a black-outlined square or had to be identified as a square, triangle, or diamond, the participants were much slower to report both the shapes and the number of spots. In some cases they would completely miss the shapes, and this grew worse when there were more spots on the screen.
While AR could be useful for driving, this study indicates that the information and warnings provided that way could actually make things worse, as that information competes for the driver's attention. Furthermore, the researchers point out that it is very possible that the times when the most information appears would also be when the driver needs to be most focused on the environment.
Source: University of Toronto
Posted: June 29, 2015 02:02PM
Camouflage is something some creatures do better than others, with the masters definitely being those that can actively change the color of their skin. This ability has inspired many to try to recreate it technologically, and researchers at the University of Central Florida have done just that. This new display checks the boxes of being flexible, full-color, skin-like, and thin, while also being relatively easy to make.
Unlike traditional displays, like the one you are most likely reading this on, this new display does not have a light source, and instead reflects ambient light for illumination. It is made of a liquid crystal layer on top of a metallic nanostructure shaped like an egg carton, and it all comes to just a few microns thick. Changing the color just requires changing the voltage applied to the pixel, as the electricity alters how the liquid crystal molecules interact with the plasmon waves of the metal, causing different colored light to be reflected. This allows the display to be full-color, while other, similar displays already developed offer only a limited color palette.
To create the display, the researchers used an inexpensive nano-imprinting method that can make the structure over a large area. This opens up many applications for the technology, such as clothing that can have color and pattern changed at will, depending on the occasion or the environment.
Source: University of Central Florida
Posted: June 29, 2015 05:31AM
Fiber optics comprises the backbone of the Internet, carrying all of the information from point to point, so it can ultimately reach our devices. Despite this though, there are still some significant limitations on the technology, including a limit on the amount of power a signal can have. Thanks to researchers at the University of California, San Diego though, that limitation may soon be removed.
Normally one would expect that to make a signal go farther, it should be made more powerful, like yelling to be heard over a greater distance. This is not the case with fiber optics though, because increasing the power of the optical signals also increases the crosstalk between the channels, which makes recovering the data more difficult. The solution developed at UCSD, and first proposed last year, is to take advantage of the physics of crosstalk with a frequency comb. Because crosstalk is governed by specific physics, it is possible to tune the signals with the frequency comb so that what crosstalk occurs can be easily filtered out. This allowed the researchers to send optical signals at some 20 times normal power, without losing data.
Applying this, the researchers were able to send and successfully decipher signals that travelled through 12,000 km of fiber optic cables without repeaters, which are otherwise needed. By removing the need for repeaters, it should be possible to significantly improve the speed and range of fiber optic networks, while also reducing their cost.
Posted: June 26, 2015 02:03PM
There are a host of technologies being investigated for replacing current electronics, which are rapidly approaching their theoretical limits. One strong contender is spintronics, which uses the spin of particles to store and process information, instead of charge like current devices. Like other technologies though, bringing spintronics to a usable scale is far from easy, but researchers at the University of Chicago have made a significant step in that direction.
Spin is a fundamental property of many particles, including electrons and atomic nuclei, and is what leads to magnetism. Because it is a characteristic of particles, spintronic devices would require very little power to operate and none to store data, unlike current technologies. Working with spin can be difficult though, especially the spin of atomic nuclei, because they are very sensitive to their environment. Normally aligning nuclei requires cooling the atoms to ultracold temperatures, but the Chicago researchers have successfully aligned nuclei in silicon carbide (SiC) at room temperature. This was achieved by optically cooling the atoms and manipulating imperfections in the SiC crystals called color-centers. The nuclei will not directly interact with light, but the electrons in these color-centers do, and their alignment can be transferred to the nuclei.
This new method has managed to align better than 99% of the spins of certain atoms in the SiC, which is far better than the one to ten in a million that MRI machines can align with their powerful magnetic fields. Making this technique even more interesting is that SiC is already used extensively in the electronics and opto-electronics industries, making it easily accessible for producing advanced prototypes.
Source: University of Chicago
Posted: June 26, 2015 05:45AM
Lithium-ion batteries have, beyond any doubt, affected how we live today thanks to the wealth of devices they have made mobile. Since they were invented though, the basic process of manufacturing them has remained largely unchanged. Researchers at MIT are trying to change that though, and have formed the spinoff company 24M to bring the new process to the world.
The lithium-ion batteries we know and love use solid electrodes, but five years ago an alternative design was described, in which the electrodes are actually suspensions of particles. Such flow batteries would be more complicated and expensive to produce, however, so the researchers decided to hybridize the two designs, developing a semisolid battery. In this design, the electrodes do not flow but consist of a colloidal suspension of particles. This shortens the path the ions have to take through the material, which in turn allows for fewer, thicker electrodes to be used, reducing the amount of unused material in the battery by as much as 80%. It also allows a drying step to be skipped that is otherwise necessary in current manufacturing techniques.
In addition to simplifying the manufacturing process, the semisolid batteries these researchers have created are also flexible and more resilient than modern Li-ion batteries. Further, the manufacturing process can be scaled up by adding units, unlike current methods that require the plants be built at large scale from the beginning. Currently the researchers are looking to deploy these batteries for grid-scale applications, but could see them being used in electric vehicles, where weight and volume are crucial.
Posted: June 19, 2015 02:09PM
With electronics rapidly approaching the limit of traditional materials and transistors, many are looking for new ways to store and process data using physical phenomena. One contender is skyrmions, which are small islands of magnetism found in some materials, but making them is hard to do even in laboratories. Now researchers at Argonne National Laboratory have discovered a new and simple way to make them at room temperature.
Skyrmions were only discovered a few years ago, and producing bubbles of them normally requires temperatures approaching 5 K and expensive tools like spin-polarized scanning tunneling microscopes. Obviously this is a problem if they are to be put to any practical uses, which there is interest in, since researchers realized skyrmion bubbles tend not to unravel and could be moved using electric currents. The solution the Argonne researchers discovered is a constricted wire consisting of a thin layer of a magnetic material sandwiched between two conductive layers. Stripes of magnetic domains form in the material on one side, and when an electric current is applied these stripes stretch out and break at the constricted channel separating the two halves. The skyrmion bubbles then form on the other side.
The hope is that from this discovery it may be possible to create a memory system based on reading the presence of skyrmions. Such a device could be built very small and require less current than other memory systems, such as racetrack memory.
Source: Argonne National Laboratory
Posted: June 19, 2015 04:34AM
When talking about computer chips, many people will think about the transistors within and how they function, forgetting about the wires that connect them all. Currently these wires are made of copper and wrapped in tantalum nitride, but that may be changing soon, thanks to the work of researchers at Stanford University and elsewhere.
First it is worth noting that the wrap around the copper wires is not to keep the current within the wires, like that in our houses, but to prevent copper atoms from entering the transistors. If this happens, the atoms will destroy the transistors. Tantalum nitride does this job well, but it appears that graphene can also do it, and bring with it superior performance. For one thing, a graphene wrapper could be eight times thinner than the smallest tantalum nitride layer, and because graphene is also a conductor, it could help carry the current between transistors.
The researchers estimate that for modern chips, this would amount to just a 4-17% improvement to wire speeds, but in two generations, as chip components shrink, the boost could be 30%. Of course more work needs to be done to take this from proof-of-concept to real-world use, but now that we know what kind of impact there can be, more researchers will likely start investigating the wires as well.
Source: Stanford University
Posted: June 18, 2015 02:23PM
Carbon is essential for life, but has been largely missing from electronics for a number of reasons. This could change in the foreseeable future though, as we learn how to make carbon compounds with special and desirable properties. Now researchers at Goethe University have discovered that doping boron atoms into graphene can create an organic LED that emits an intense blue light.
Many forms of carbon are of interest for future technologies, and graphene, an atom-thick sheet of carbon, is one of the better known forms. Typically it is by manipulating the edges of graphene that researchers alter its properties, but more recently we have been able to work within the sheet itself. That is what is happening here, as the Goethe researchers have replaced carbon atoms with boron within graphene nanosheets. The result is an OLED luminophore that produces light in the blue range, with improved electron transport as well.
Naturally this discovery could be used for displays, and because the films involved are flexible, the displays could be rolled up at will.
Source: Goethe University
Posted: June 18, 2015 05:27AM
When Edison was working on the electric light, he experimented with a variety of filaments, and eventually settled on using carbon, though later other materials were selected. Now we may return to carbon in the form of graphene to make nanoscale lightbulbs so bright you can see them with the naked eye. Researchers at Columbia University, Seoul National University, and the Korea Research Institute of Standards and Science have created a potential on-chip light source using graphene, and it has some useful properties.
Graphene is an atom-thick sheet of carbon many people expect will be used in future devices, because of its numerous desirable properties. I am not sure if anyone expected it to be used as a nanoscale lightbulb filament though, but it actually does the job very well. To help push technology to smaller scales and faster speeds, there is a drive to use optics in computers, but for that to happen we need reliable light sources that can be integrated into computer chips. Until now, no one has been able to replicate the incandescent lightbulb on this scale, partly because metal filaments would have to get so hot to emit visible light that they would damage nearby circuitry. Graphene does not have this issue because as it heats up, its heat conductance goes down, causing temperatures as high as 2500 °C to stay confined to small spots. It also helps that the filaments are suspended above the silicon substrate, instead of being in direct contact. This improves the efficiency and, it turns out, allows the light emitted to be tuned by manipulating the distance between the filaments and the substrate, thanks to graphene being transparent.
Even though the graphene filaments are only an atom thick, the light they emit is so bright it is visible to the naked eye. That definitely makes the discovery interesting for many applications, and now the researchers are working to better characterize it, such as determining how quickly it can be switched on and off, for digital communications.
Source: Columbia University
Posted: June 17, 2015 05:40AM
One of the stereotypes of gamers is that we are out of shape, because we spend our time in front of screens instead of doing various physical activities. Active video games challenge that perception though, as they can have the player moving their entire body quite a lot. To see how much this movement may amount to, researchers at the University of Tennessee, Knoxville have studied the energy expenditure of active video game play and unstructured outdoor play, to see how they compare.
Studies have looked at active video games before, but have not compared their energy expenditure to that of outdoor play. For this study, the Tennessee researchers placed accelerometers on the hips and both wrists of children from five to eight years old. Over a three-week period, the children were monitored during 20-minute sessions of outdoor play or Kinect Adventures River Rush, which was selected because it uses the entire body, requires no special skills, and has an E rating from the ESRB.
The researchers found that active video games could actually be a good source of physical activity, with the hip sensors actually registering more activity for the video game than for outdoor play. Of course this does not mean active video games should become a primary source of physical activity, but that the right game could support a child's physical health.
Posted: June 15, 2015 04:18AM
We know most materials by their surfaces, whether they are bumpy or smooth, slick or rough, and these characteristics can be very important for a variety of applications. For that reason, the ability to alter a material's surface features could be a game-changer in a number of ways, and now researchers at MIT have found a way to do just that. Making the discovery even better is that it should be possible to scale it up as needed.
The researchers started with computer simulations of a material comprised of two polymers, one flexible and the other rigid. By strategically placing particles of the rigid polymer within the matrix of the flexible one, it is possible to control the material's behavior when compressed. The buckling the material undergoes when being squeezed follows a pattern set by the placement of the rigid polymer particles, so if the material is normally smooth, compressing it would cause ridges to form. The researchers have already made the material with a 3D printer, and because the method is completely based on geometry, it should be possible to scale it to any and all sizes.
There are several applications for materials with dynamic surfaces, such as camouflage and altering fluid turbulence. With further study it may be possible to make these materials even more powerful, by using elongated particles to produce asymmetrical surface patterns, and by finding ways to achieve the results with electricity instead of mechanical compression.
Posted: June 12, 2015 01:57PM
With the constant push for faster and faster electronics, it is becoming necessary to look to new materials and technologies to continue progressing. For this to happen though, we have to understand the physics involved, which can be difficult to do in some circumstances, such as at very high frequencies. Researchers at Rice University though, have recently found a way to measure the conductivity of nanowires at optical frequencies using plasmons.
Plasmons are electrons and photons coupled together and are made by firing photons at a metal. Many are interested in using them in future technologies because they possess useful properties of both electrons and photons. In this study, the researchers worked with a kind of light-activated capacitor made of two nearby plasmonic nanodisks. When these disks are connected by a wire, the charge of the disks will actually flow back and forth along the wire at optical frequencies. Changes to the wire's conductivity, even small changes, would alter the optical signature.
This ability to measure changes in conductance at optical frequencies could prove invaluable for testing nanowires for use in future devices. Current methods are not capable of making such high frequency measurements at the nanoscale, so this research could be key in letting technology reach these extreme scales.
Source: Rice University
Posted: June 12, 2015 07:25AM
Friction is everywhere, and while it can be frustrating, it is also very useful and impossible to live without. There are instances though when it seems to disappear, as the phenomenon of superlubricity appears. Now researchers at MIT have studied friction at the nanoscale, including the emergence of superlubricity, which could have a significant impact on nanomachines.
Typically whenever two surfaces contact each other, there is friction between them, but there are times friction vanishes. To study this, the researchers created two nanoscale 'surfaces.' One is an optical lattice created by two lasers that interfere with each other to create a sinusoidal periodic pattern (a bunch of peaks and troughs). The other 'surface' is an ion crystal of sorts, produced by optically trapping a number of ions that repel each other, creating a kind of crystal structure. When the researchers ran the ions over the optical lattice, they found maximum friction when the distance between the ions matched the wavelength of the lattice. If the distance was mismatched though, superlubricity appeared. Instead of jarring, sudden movements like before, with superlubricity the ions slid up and down the optical lattice smoothly, like how a caterpillar moves.
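A toy calculation captures the matched-versus-mismatched result. Below, a rigid chain of ions slides over a sinusoidal potential standing in for the optical lattice; the numbers are illustrative, and the chain is kept rigid for simplicity (the real ions also deform, which is what produces the caterpillar-like motion). When the ion spacing matches the lattice wavelength, the energy barriers from each ion add up; when the spacings are mismatched, they nearly cancel and the chain sees an almost flat landscape, the essence of superlubricity.

```python
import numpy as np

def corrugation(spacing, wavelength=1.0, n_ions=10):
    """Peak-to-peak energy barrier felt by a rigid chain of ions as it
    slides over a sinusoidal optical-lattice potential."""
    positions = np.arange(n_ions) * spacing
    shifts = np.linspace(0.0, wavelength, 400)
    energy = [np.cos(2 * np.pi * (positions + s) / wavelength).sum()
              for s in shifts]
    return max(energy) - min(energy)

matched = corrugation(1.0)       # ion spacing equals the lattice wavelength
mismatched = corrugation(1.618)  # incommensurate (golden-ratio) spacing
print(matched, mismatched)       # the mismatched chain feels a far smaller barrier
```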
Potentially this research could be used to design nanomachines that last longer, by reducing the friction they suffer. It could also be extended up to the macroscale, to help provide a better understanding of friction on every level.
Posted: June 11, 2015 02:27PM
One way to think of magnets is as a bunch of smaller magnets, which is accurate, as magnetic materials do consist of small structural domains. When the magnetic fields of these domains line up, the material as a whole is magnetic. The boundaries of these domains can have interesting properties, which researchers at ETH Zurich have been studying, and they have recently made some interesting discoveries.
More specifically, the researchers were investigating oxides with multiferroic properties, which means the materials are both magnetically and electrically ordered; they have north and south magnetic poles as well as positive and negative electric poles. Having both orderings present indicates it is possible for cross-coupling to occur, so that an electrical voltage could change the magnetic state. When studying strontium manganite, the researchers discovered the domain walls within it are non-conductive, even though the material itself is conductive. This could potentially be exploited to turn the nanoscale domains into capacitors. These capacitors could then be used to store data as charge, but with only a voltage instead of a current, making them more efficient and preventing the generation of waste heat.
When studying a different material, terbium manganite, the researchers also found that it is possible to change the magnetism of the domains using only a voltage. It will be a long time before these discoveries could be put to use in any application, as much more needs to be done, but it still shows what should be possible in the future.
Source: ETH Zurich
Posted: June 11, 2015 07:02AM
While many people envision a future where every home has a 3D printer to build whatever small object is needed or wanted, the technology also has uses for large-scale manufacturing. Currently though, additive manufacturing is not fast enough to fully compete with traditional production methods, but that may change soon. Researchers at the University of Sheffield are working on a new process that could match injection molding on speed, while bringing with it other advantages.
This new process is called high speed sintering (HSS) and works similarly to some other additive manufacturing processes. It fuses layers of polymer powder together, but where other methods use lasers to provide the heat, this one uses a special ink and infrared light. The ink absorbs the light, heating up enough to fuse the powder, and this can be done much faster and at a larger scale. The machine the researchers are working on will be able to make parts a cubic meter in size, which is three times that of existing machines. The smaller the part though, the faster the process, and some parts could take less than a second to make, which could let HSS compete with injection molding for scale.
Even if the speed were lower than that of injection molding, additive manufacturing like HSS has special advantages, such as the ability to make more complex and unique parts. It can also be set up anywhere very quickly, while injection molding requires special tools that have to be made in advance.
Source: University of Sheffield
Posted: June 10, 2015 08:13AM
The idea of implants being placed into our bodies is hardly new, including devices meant to be placed into the brain. Typically these devices will cause inflammation and may need to be repositioned occasionally, and neither possibility is desirable. Now researchers at Harvard University have created a device that could be injected into the brain, but is flexible and small enough to not cause problems.
This new implant is actually the electronics scaffold the researchers were working with, which they discovered was flexible and almost invisible on its own. It is so flexible that it could be pulled into a glass needle or pipette, and from there be injected by a syringe into a subject's brain. Once there, the mesh could be used to monitor signals from neurons, as similar technologies have done. The mesh could even be used to stimulate neurons and promote regeneration, when connected to electronic devices, thanks to its subcellular features.
In addition to the great potential for applications, the mesh is also very easy to produce, as it is compatible with conventional methods for manufacturing microchips. The researchers are now looking for commercialization opportunities and expect it could have a major impact on neuroscience.
Source: Harvard University
Posted: June 9, 2015 05:01PM
If you give a person a task, they will be able to quickly determine the best means of achieving the goal. Give a robot a task though, and it may take a long time for it to decide what to do, because it will evaluate every possibility available to it, even if it does not contribute to the goal. To reduce this problem, researchers at Brown University brought their AI into Minecraft, where the number of possibilities can be controlled.
The real world is filled with possibilities, and while humans can ignore those that are unnecessary for a given task, robots will consider them and suffer a state-space explosion. This is when the array of choices is so large the robot can struggle to work through it. The solution the Brown researchers have developed is to teach the robots goal-based action priors, which means the robots learn what is most likely needed to complete a task. To test this approach, the researchers brought the AI into Minecraft, because they could completely control the space and possibilities the AI would be working with. The AI agent was then allowed to learn how to achieve goals by trial and error, and as it learned the researchers increased the size of the space it was working in.
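A minimal sketch of the idea, with hypothetical actions and prior values (not the actual Brown domain or algorithm): a planner searches a small grid world whose action set includes irrelevant actions that blow up the state space, and a learned prior over action usefulness lets it prune them.

```python
from collections import deque

# Toy planning domain: reach a goal cell in a grid. Besides movement, the
# agent has irrelevant actions (toggling a torch, switching hands) that
# multiply the state space -- a miniature state-space explosion.
def successors(state, actions, size):
    x, y, torch, hand = state
    for a in actions:
        if a == "torch":
            yield (x, y, not torch, hand)
        elif a == "hand":
            yield (x, y, torch, not hand)
        else:
            dx, dy = {"up": (0, -1), "down": (0, 1),
                      "left": (-1, 0), "right": (1, 0)}[a]
            if 0 <= x + dx < size and 0 <= y + dy < size:
                yield (x + dx, y + dy, torch, hand)

def plan(start, goal, actions, size=10):
    """Breadth-first search; returns the number of states expanded."""
    frontier, seen, expanded = deque([start]), {start}, 0
    while frontier:
        state = frontier.popleft()
        expanded += 1
        if state[:2] == goal:
            return expanded
        for nxt in successors(state, actions, size):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return expanded

start = (0, 0, False, False)
all_actions = ["up", "down", "left", "right", "torch", "hand"]
# Hypothetical learned priors: how often each action helped on past
# navigation goals; the planner keeps only the promising ones.
priors = {"up": .9, "down": .9, "left": .9, "right": .9,
          "torch": .01, "hand": .01}
useful = [a for a in all_actions if priors[a] > 0.1]

full = plan(start, (9, 9), all_actions)   # explores torch/hand variants too
pruned = plan(start, (9, 9), useful)      # same goal, far fewer expansions
print(full, pruned)
```

Both planners find the goal; the pruned one simply never wastes effort on states that only differ in irrelevant details, which is the effect the priors are meant to produce.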
When tested in larger and unfamiliar Minecraft spaces, the researchers found their AI, with priors, was able to complete tasks faster than conventional planning algorithms. The researchers then brought the algorithm out of Minecraft and into a physical robot, to have it help with baking brownies. If you are interested in the mod the researchers made for Minecraft, you can find links to it in the source link. The researchers want others to be able to experiment with it, including other researchers and players. Perhaps one day it will be possible for the algorithm to operate on the whole of a Minecraft world.
Source: Brown University
Posted: June 8, 2015 05:56AM
Even some of the most advanced power plants in the world still rely on transferring heat with steam to produce power, like old steam engines. By finding ways to improve this heat transfer, the efficiency of power plants can be increased, potentially to very significant effect. Now researchers at MIT have found a way to increase heat transfer by a factor of four, by applying graphene to condensers.
The heat from many power plants, whether they are coal, oil, or natural gas fired, or even nuclear, is used to generate steam for turning turbines, which then create electricity. After spinning the turbines, the steam will enter condensers to convert it back into water, and restart the process. By improving the speed at which the steam condenses into water, the efficiency of the power plant itself can be improved. To that end, many have been investigating ways to apply superhydrophobic materials to the condensers, so that once droplets form on them, the droplets fall off sooner, making room for more. Typically these are polymer coatings, which can completely fail in just hours, so the MIT researchers decided to try out graphene, which does have some hydrophobic properties. The result was a four-fold increase in heat transfer in the condenser for two full weeks, without any degradation.
With further development, the researchers feel they could reach 5 to 7 times improved heat transfer. Even at just four times better transfer, a power plant's efficiency could increase by 2-3%, which may not seem like much, but could still amount to millions of dollars, per plant, per year.
Posted: June 5, 2015 06:41AM
An integral part of flat screens is the transparent conductor indium tin oxide (ITO), but it brings with it special challenges. For one thing, indium is a rare and expensive metal, and some estimate that in twenty years we will run out of it. Many new technologies are being investigated for replacing ITO, but now researchers have also found a way to potentially recover and recycle the element, as reported in the journal ACS Sustainable Chemistry & Engineering.
According to the report, it is possible to recover indium by grinding up LCD glass into particles less than 75 micrometers in size and submerging them in sulfuric acid heated to 122 °F (50 °C), along with other parameters. By using this method, the hope is that the display industry will be able to recycle the material instead of having to draw more from reserves. As it is expected that China alone will throw out 100 million displays between 2014 and 2020, the impact of recovering indium could be dramatic.
Source: American Chemical Society
Posted: June 4, 2015 02:12PM
With every new step in sensing, science has been able to discover new phenomena and typically improve the performance of technology. We may be seeing this happen again soon, thanks to researchers at the University of Bristol and the Center for Quantum Technologies.
Currently, one way to study quantum processes is quantum process tomography. This works by containing the quantum system in a box, shooting quantum states through the box, and measuring them when they come out. The problem with this method is that the precision is usually limited by shot noise. By borrowing techniques from quantum metrology, which has to do with engineering and controlling quantum systems, the Bristol researchers were able to solve this problem. The new method uses the generation of multiple entangled photons to study quantum processes.
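Shot noise is simply the statistical noise of counting independent photons: the uncertainty of an estimate shrinks only as 1/√N. The Monte Carlo sketch below (illustrative numbers, classical photons only) shows that scaling; entangled probes of the kind used here can in principle beat it, approaching a 1/N scaling instead.

```python
import numpy as np

rng = np.random.default_rng(0)
p_true = 0.3   # unknown probability that a probe photon exits a given port

def estimate_std(n_photons, trials=4000):
    """Spread of the estimate of p_true made from n_photons independent photons."""
    counts = rng.binomial(n_photons, p_true, size=trials)
    return (counts / n_photons).std()

ratio = estimate_std(100) / estimate_std(10_000)
print(ratio)   # close to 10: a 100x photon budget only buys 10x precision
```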
The researchers have already used the new method to study an optical process that may be used to manipulate quantum bits in future quantum computers. There are other potential uses though, as increasing the precision of optical measurements can allow for measurements to be made with less light, and that is important in medical research, where light can damage samples.
Source: University of Bristol
Posted: June 4, 2015 05:43AM
Power consumption is a big deal in electronics, and, believe it or not, a lot of energy is used just transmitting data along the interconnects between chips. To cut power usage, some have been looking for ways to replace the electrical interconnects with optical ones. Now researchers at Stanford University have made this idea more practical with an inverse design algorithm.
Most likely, the optical interconnects used in computers would be made of silicon, which is transparent to infrared light as glass is to visible light. What has been holding back the deployment of these interconnects is the need to design each connection individually. By developing an inverse design algorithm, the Stanford researchers may have solved that problem. The algorithm allows someone to plug in the properties they want and get a manufacturable design out.
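Inverse design, in general, means specifying the desired response and letting an optimizer search for a structure that produces it. Here is a minimal sketch of that idea, with an invented toy forward model (the `transmission` function and the segment widths are made-up placeholders standing in for a real electromagnetic simulation; this is not the Stanford algorithm itself):

```python
def transmission(widths):
    """Hypothetical forward model: transmission of a two-segment guide
    as a smooth function of the segment widths (in arbitrary units)."""
    a, b = widths
    return 1.0 / (1.0 + (a - 0.45) ** 2 + (b - 0.30) ** 2)

def inverse_design(target, steps=500, lr=0.05, h=1e-6):
    """Finite-difference gradient descent: adjust the design until the
    forward model's transmission matches the requested target."""
    w = [0.2, 0.2]  # initial guess for the two segment widths
    for _ in range(steps):
        loss = (transmission(w) - target) ** 2
        grad = []
        for i in range(len(w)):
            w2 = list(w)
            w2[i] += h
            grad.append(((transmission(w2) - target) ** 2 - loss) / h)
        w = [wi - lr * g for wi, g in zip(w, grad)]
    return w

design = inverse_design(target=0.9)
print(design)  # a set of widths whose modeled transmission hits the target
```

Real photonic inverse design replaces the toy model with a full electromagnetic solver and uses far more efficient gradient methods, but the workflow is the same: state what you want, and let the optimizer find a producible shape.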
The researchers have already demonstrated the algorithm by designing optical circuits and building them in their lab, which bodes well for adoption by commercial fabricators. From this tool we could see an interesting new generation of electronics, far more efficient than modern computers and potentially transmitting more information at the same time, as chip-scale optical links can carry 20 times the data of electrical ones.
Source: Stanford University
Posted: June 3, 2015 04:44PM
Some believe people will be regularly using data-glasses in the future, to seamlessly provide information to us about our environment. While the technology exists already, it is not catching on just yet. One reason for this may be the size of the devices, but researchers at Fraunhofer-Gesellschaft have done something about that by shrinking the technology.
The data glasses in question are see-through, so the information they display appears directly on top of what the user is looking at. Most current systems that do this consist of a micro-display that generates the image and optics that project it into the correct place, with both parts typically resting on the temple of the glasses. While the new micro-display is similar in size to others, the new optics system is just five millimeters long, one fifth the length of comparable systems. This was achieved by using an array of smaller optics placed side by side, instead of a single, long optical system.
The new design is also capable of correcting the display for vision issues, such as farsightedness. This will allow users with some vision problems to see the information sharply.
Posted: May 29, 2015 02:26PM
Sometimes science can involve shooting something, but the process does not end there, as one then has to determine why whatever happened, happened. For decades it has been known that bombarding an iron-based superconductor with ions can positively affect its properties, but the exact mechanics involved have been somewhat mysterious. That is, until now, thanks to researchers at Brookhaven National Laboratory and their spectroscopic-imaging scanning tunneling microscope (SI-STM).
Superconductors are materials capable of carrying electrical currents without resistance, but they require very low temperatures to do so, and can also require that the area be free of strong magnetic fields. This is the case with iron-based superconductors, as magnetic fields can cause the formation of vortices within the material, and these vortices inhibit the free flow of electrons. In the 1970s, though, it was discovered that shooting high-energy ions at the superconductor can sometimes prevent these vortices from moving, and thus mitigate their impact. Now we know the reason: the damage from the ions leaves either holes just a few nanometers wide in the crystal structure, or elongated streaks, and these trap the vortices.
The hope is to take advantage of this damage to control the vortices, potentially allowing for control over currents at different temperatures or varying magnetic field strengths. Performing this study required the SI-STM, because only it could measure, atom by atom, the damage the ions do to the structure and how that damage impacts superconductivity within the material.
Source: Brookhaven National Laboratory
Posted: May 29, 2015 05:44AM
According to the theory of relativity, the speed of light is the Universe's speed limit, and that rule governs our day-to-day lives. Quantum mechanics, however, likes getting around such rules and has been caught doing so, which raises the question: is it violating relativity or just sidestepping it? Researchers at the Australian National University believe they have the answer, at least with regard to tunneling.
Quantum tunneling is an interesting phenomenon that is important in many places, including the nuclear fusion at the core of stars, scanning tunneling microscopes, and flash memory. It involves particles acting like waves and skipping over barriers that would otherwise block them, thanks to their position not being well-defined. The question has been whether tunneling happens at a speed that surpasses the speed of light, and according to the researchers it does not, because that rule does not exactly apply. The math works out so that the time it takes to tunnel across a barrier is a complex value with an imaginary part, so the tunneling velocity must also be imaginary. What this translates to is that the tunneling must occur instantaneously, as imaginary values do not correspond to anything measurable in our real Universe.
This discovery should have a number of impacts, allowing technology to reach faster speeds and smaller sizes, where tunneling is an important factor in leakage. It also solves problems with some attosecond-scale (10⁻¹⁸ s) observations, such as a delay between a photon striking an atom and an electron being ejected from it. Based on the researchers' calculations, this delay is caused by the nucleus trying to pull the electron back in, and not by tunneling.
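The leakage concern can be made concrete with a textbook estimate. This is the standard opaque-barrier approximation for a rectangular barrier, not the researchers' complex-time calculation, and the electron energy, barrier height, and widths are illustrative numbers: the tunneling probability falls off exponentially with barrier width, so every fraction of a nanometer a transistor's barriers shrink, leakage grows by orders of magnitude.

```python
import math

# Physical constants (SI)
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # joules per electronvolt

def tunnel_probability(E_eV, V_eV, width_nm):
    """Approximate transmission of an electron through a rectangular
    barrier, T ~ exp(-2*kappa*L), valid for an opaque barrier (E < V)."""
    kappa = math.sqrt(2 * M_E * (V_eV - E_eV) * EV) / HBAR  # decay constant, 1/m
    return math.exp(-2 * kappa * width_nm * 1e-9)

# A 1 eV electron meeting a 5 eV barrier: each halving of the barrier
# width boosts tunneling by several orders of magnitude.
for w in (2.0, 1.0, 0.5):
    print(f"{w} nm barrier: T ~ {tunnel_probability(1.0, 5.0, w):.2e}")
```

The exponential sensitivity shown here is exactly why a better understanding of tunneling matters as devices keep shrinking.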
Posted: May 28, 2015 04:26PM
Correcting errors is important for the reliable operation of any computing device. For modern computers, we have it down pretty well, but for future quantum computers the challenge is completely different. Researchers at MIT, though, have managed to overcome one significant aspect of the challenge by breaking a limit that constrains other approaches.
Quantum computers get their name and extraordinary power from quantum mechanics, which presents a special challenge. In classical mechanics, measuring a system tends not to change it, but measuring a quantum mechanical system can change it, as illustrated by Schrödinger's Cat. Because of this, there was a time when researchers believed it would be impossible to correct errors in a quantum computer, because measuring the values of qubits would cause them to collapse, defeating the purpose. Despite those concerns, error correction systems have been developed that do not actually measure the qubits' values, but rather their relationships to other qubits. All of these systems are limited to working on only a square root of the total number of qubits (so you could only correct eight qubits in a 64-qubit computer). The new MIT method breaks that limit by using special banks of qubits and time. As the qubits are manipulated in the computer and take on new states, a bank of qubits is assigned to each state. By analyzing relationships within the banks, it is possible to determine where an error occurred and to fix it.
Ironically, this approach does not prevent errors and could even introduce them, but what errors there are must obey certain rules, which is why they can be corrected later. This approach allows for an arbitrary fraction of the qubits in a quantum computer to be checked for errors, and thus breaks the square root limit. Now the question is how much redundancy is needed amongst the qubit banks, and if fewer can be used, simplifying the system.
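The trick of measuring relationships instead of values has a simple classical analogue worth sketching. This is an ordinary three-bit repetition code, not MIT's quantum scheme: the parity checks locate a flipped bit without ever reading the encoded value, which is the same principle that lets quantum codes find errors without collapsing the qubits.

```python
def encode(bit):
    """Three-bit repetition code: store one bit as three copies."""
    return [bit, bit, bit]

def syndrome(codeword):
    """Two parity checks on neighboring bits. Their pattern pinpoints a
    single flipped bit, yet says nothing about whether the encoded
    value is 0 or 1 -- only the relationships between bits are read."""
    return (codeword[0] ^ codeword[1], codeword[1] ^ codeword[2])

def correct(codeword):
    """Use the syndrome to locate and undo a single bit-flip error."""
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(codeword))
    if flip is not None:
        codeword[flip] ^= 1
    return codeword

word = encode(1)
word[2] ^= 1                     # introduce a single bit-flip error
assert syndrome(word) == (0, 1)  # the checks localize the error...
print(correct(word))             # ...and fix it: back to [1, 1, 1]
```

Quantum stabilizer codes measure analogous multi-qubit parities, and the square-root limit the MIT team broke concerns how many qubits such checks can protect at once.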
Posted: May 28, 2015 06:42AM
For humans, when something is too difficult to do, we can enlist the help of others to get the job done, but for robots it is not as simple. For multiple robots to work together, their actions may need to be coordinated, and in a controlled environment the results can be very efficient. In other settings, that efficiency can vanish because of how complex the computational work becomes, but researchers at MIT have found a solution to this problem.
Instead of requiring the robots to build a comprehensive plan, this new method breaks the plan up into single actions. First these actions are evaluated to determine whether the robot involved will be able to complete the task successfully, and then they are checked for compatibility with the steps before and after, and with the other robots involved in those steps. If the robots cannot determine a viable solution for a step, they simply skip it and move on, coming back to it later. This deferment is actually critical to the method, as it allows the actions to be interrupted at any point, and it also allows the parts the robots are carrying to be freely dropped and regrasped as needed.
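The skip-and-defer loop described above can be sketched in a few lines. The task names and feasibility test below are invented placeholders, not MIT's planner; the sketch only shows the general strategy of attempting each step, deferring the ones that cannot yet be solved, and retrying them later.

```python
def plan(steps, feasible):
    """Greedy ordering with deferment: attempt each step in turn, skip
    any step that is not currently solvable, and revisit skipped steps
    on later passes until everything is placed."""
    order, pending = [], list(steps)
    while pending:
        progress = False
        remaining = []
        for step in pending:
            if feasible(step, order):   # solvable given what is already done?
                order.append(step)
                progress = True
            else:
                remaining.append(step)  # defer and come back later
        if not progress:
            raise RuntimeError("no viable ordering found")
        pending = remaining
    return order

# Toy example: a step becomes feasible once all of its prerequisites are done.
# Note "roof" is listed first and must be deferred until "wall" is placed.
deps = {"roof": ["wall"], "base": [], "wall": ["base"], "door": ["wall"]}
order = plan(list(deps), lambda s, done: all(d in done for d in deps[s]))
print(order)
```

The result is not guaranteed to be optimal, but as the article notes, a viable ordering found quickly beats an optimal one that takes ages to compute.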
The resulting sequence of actions may not be the optimal solution, but it works and can be found significantly faster. In fact some optimal solutions would take thousands of years of computation to discover, so by just finding a viable solution, this method should prove very powerful for robots in the real-world, outside of factories and such.
Posted: May 27, 2015 03:05PM
Augmented reality systems are technologies that somehow alter our view of the world for some purpose. Examples include smartphone keyboard apps that use the camera to display the ground you cannot see while walking and typing, and glasses that layer information onto our environment. Recently at Marine Corps Base Quantico, the Augmented Immersive Team Trainer (AITT) was tested with such glasses, rendering a non-existent battleground onto a golf course.
The glasses, or optical see-through components, were only recently finished, and this marks the first time they were integrated into the AITT system. While there are comparable see-through systems available commercially, this system offers a larger field of view and has been likened to having an HD screen in front of your eyes. The Marines already use HUDs, but this system can integrate more complex information into the scene than that technology.
The AITT program has been running for five years now, and is set to end in the Fall, when it will become part of the Marine Corps Program Manager for Training Systems. At this point, it will continue to be tested for improving situational awareness in training and operations.
Source: Office of Naval Research
Posted: May 27, 2015 05:39AM
If there is one thing you do not expect to turn into fertilizer, it is our computers and other electronics, but that may be changing, thanks to researchers at the University of Wisconsin-Madison. These researchers have worked to replace the commonly used plastic substrate material with a biological material that has suitable properties.
The bulk of a computer chip is not the circuitry but the substrate that supports it all. What the researchers have done is replace the usual substrate with one made of cellulose nanofibril (CNF), which is flexible and biodegradable. Like paper, it is made from wood, but where the fibers in paper are on the micrometer scale, they are on the nanoscale in CNF. One of the issues the researchers had to overcome was preventing the material from absorbing moisture from the air. This was achieved by applying an epoxy coating, which also improved its surface smoothness.
A chip made from this material could, when it needs to be disposed of, be put somewhere that fungi can find it and grow on it, enjoying their cellulose dinner. Before that happens though, the chip would be able to perform comparably to modern chips.
Source: University of Wisconsin-Madison
Posted: May 26, 2015 03:13PM
As amazing as a material may be, it is not until it is easily accessible that it can really shine. Graphene was discovered years ago and researchers have been discovering possible uses for it since then, while also searching for new ways to make it. Researchers at MIT have recently developed a roll-to-roll means to create large graphene sheets, only limited in size by the size of the foil substrate and deposition chambers used.
Graphene is an atom-thick sheet of carbon atoms that has many extraordinary properties, but producing it in large quantities is very difficult. Typically it requires either pulling it off of pieces of graphite or using chemical vapor deposition (CVD) furnaces that only put out stamp-sized pieces. This new method is an adaptation of CVD that uses two concentric tubes for the vacuum chamber and allows a foil substrate to be run through it. The foil winds around the inner tube as it moves through the system, and holes in the inner tube allow the necessary mixture of vapors to reach the substrate, first to prepare it for growth and then to actually grow the graphene. Throughout the process, the whole chamber is heated to 1000 ºC.
Thus far the system has only been built on a laboratory scale and the foil had to move through at just one inch per minute to create high-quality graphene. It is possible to go faster, but that degrades the quality of the graphene, and while lower-quality graphene may still have applications, the researchers want to see if they can push the speed, while also scaling the system up. Though the focus was on graphene, this process could be adapted for growing other 2D materials and even carbon nanotubes.