Science & Technology News (294)
Posted: March 7, 2014 08:37AM
Some projects require specific vibrational properties, and achieving them means carefully selecting materials and engineering geometries. This can be limiting, especially if the vibrational properties ever need to change over time. Researchers at Empa and ETH Zurich have recently created a prototype material whose vibrational properties can be programmed electronically.
The prototype consists of a long strip of sheet metal, one meter long, one centimeter wide, and one millimeter thick, with ten aluminum cylinders attached to it. Between the sheet metal and the cylinders are piezo discs. These discs can convert electrical energy into mechanical energy, so when they are electronically stimulated, they change in thickness. By manipulating the discs, the researchers were able to make the entire material behave as an adaptive phononic crystal, with the ability to change how vibrational frequencies are dampened on it.
As the material is currently only one-dimensional, this research is just a step toward programmable materials. It does still show that it may be possible to build electronic circuits onto a material, so that it can react to vibrations and change its properties as needed.
Posted: March 6, 2014 02:07PM
There is a decent chance we all know what it is like to go through a power outage, when we cannot use our electronics to get to our favorite websites and such. When power is restored, we can go back to our lives before, but what if a datacenter were taken offline, and the cause was a greater disaster than just a power outage? Researchers are wondering that as well, and there will be multiple presentations on the topic at the OFC Conference and Exposition, as reported by the Optical Society.
One of the works to be presented is an algorithm that studies the risk and usage factors of data centers. Based on that information, the algorithm can determine where best to copy or move data, so that if an at-risk data center is taken down, data will not be lost and potentially users will not lose access.
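The actual algorithm to be presented is not detailed here, but the basic idea, identifying data housed only at high-risk sites and copying it to the safest site with spare capacity, can be sketched as a greedy placement routine. The data structures, risk scores, and threshold below are all illustrative assumptions:

```python
# Toy sketch of risk-aware replica placement (illustrative only; the
# conference work's actual algorithm is not reproduced here).

def place_replicas(datacenters, risk_threshold=0.5):
    """For each data item housed in a high-risk datacenter, pick the
    lowest-risk datacenter with spare capacity as a copy target.
    `datacenters` maps name -> {"risk": float, "free": int, "items": set}.
    Returns a list of (item, source, target) copy actions."""
    actions = []
    for name, dc in datacenters.items():
        if dc["risk"] < risk_threshold:
            continue  # site considered safe; nothing to do
        for item in sorted(dc["items"]):
            # Candidate targets: safer sites with capacity that lack the item.
            candidates = [
                (other["risk"], other_name)
                for other_name, other in datacenters.items()
                if other_name != name
                and other["risk"] < risk_threshold
                and other["free"] > 0
                and item not in other["items"]
            ]
            if not candidates:
                continue
            _, target = min(candidates)  # safest site wins
            datacenters[target]["items"].add(item)
            datacenters[target]["free"] -= 1
            actions.append((item, name, target))
    return actions
```

With a risky coastal site and two safer inland sites, copies would flow from the coastal site to the safest site with room.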
Another presentation will focus on integrating wireless technology with optical fiber cables. With integrated wireless technology, if a cable is damaged, a temporary emergency wireless network can be created until more permanent repairs are made. A new antenna architecture featuring two differently polarized and isolated pairs of antennas, capable of a 146 Gb/s data transmission rate, will also be presented.
Source: The Optical Society
Posted: March 6, 2014 07:57AM
To put it simply, the only safe electronic device is the one that has never been turned on, as malware writers continually find new ways to attack technology. Android devices are no exception, so many defenses have been developed, but some are not always that accurate. Researchers at North Carolina State University have recently developed a more refined version of anomaly detection for finding and stopping root exploits.
Root exploits are a class of malware attacks that take control of an operating system's administration functions, granting unlimited control of the system. One method for detecting and stopping malware that has infected applications is anomaly detection, which works by comparing the behavior of the application on a device against a record of how it should behave. The problem is that this can produce false positives, but the North Carolina researchers are taking advantage of an interesting pattern to better protect the Android OS. As it turns out, most Android root exploits are written in C, even though most Android apps are written in Java. The researchers' Practical Root Exploit Containment (PREC) system has been designed to scrutinize C code specifically, and has significantly reduced the number of false positives.
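PREC's internals are not reproduced here, but the underlying idea, comparing an app's observed behavior against a recorded baseline and treating deviations that originate in native C code as far more suspicious than deviations from Java code, might be sketched as follows. The event format, weights, and threshold are assumptions for illustration:

```python
# Toy sketch of origin-aware anomaly detection in the spirit of PREC.
# Events are (syscall_name, origin) pairs, origin being "java" or "native".

def detect_root_exploit(observed, baseline, native_weight=5.0, threshold=5.0):
    """Score how far `observed` behavior strays from the `baseline` set of
    allowed (syscall, origin) pairs. Anomalies from native code are weighted
    heavily, reducing false positives from benign Java-side variation."""
    score = 0.0
    for syscall, origin in observed:
        if (syscall, origin) in baseline:
            continue  # behavior matches the recorded profile
        score += native_weight if origin == "native" else 1.0
    return score >= threshold
```

Under this weighting, a couple of unexpected Java-side calls stay below the alarm threshold, while unexpected native calls trip it immediately.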
The researchers hope to take advantage of the methods app vendors use to protect their products from malware to build the standard-behaviors database. One way to do this would be for the vendor to add PREC to their assessment processes, allowing it to record behavior data and build the database itself.
Source: North Carolina State University
Posted: March 5, 2014 01:51PM
Glasses can be rather odd materials, and part of the reason has to do with how they transition from a solid to a liquid. There are temperatures at which they are clearly a solid and clearly a liquid, but in between it can be hard to tell. Researchers at the University of Waterloo have recently settled a twenty-year-old debate concerning the surfaces of glassy materials.
About twenty years ago, it was discovered that at temperatures at which a glass should be a solid, the surfaces of some were flowing like liquids. Since then researchers have been working to find how this is possible. Though most people associate glasses with windows, many materials, including metals and polymers, can be glasses, as the term refers to a disordered molecular structure. For this work, the researchers used a glass of polystyrene. Thin slices of the glass were stacked to form staircase-like steps just 100 nm high. The researchers then observed and measured how the steps changed shape over time, and thanks to the simple 2D profile of the surface, they were able to numerically model it.
With this understanding of glasses, researchers can consider how nanostructures made of a glass will change over time. This could impact the use of thin film coatings and how small some nanoscale devices and circuitry can be made.
Source: University of Waterloo
Posted: March 5, 2014 08:43AM
Typically crystals are thought of as being beautiful stones used in jewelry, but many more materials are crystals, including metals. As molten metal cools, its atoms will align into a structure that gives it certain properties, but it is possible to force the atoms to not align properly, producing a metallic glass. Due to the amorphous structure, metallic glasses can have special properties, including great ductility, and researchers at Los Alamos National Laboratory, the University of Wisconsin, Madison, Universitat de Barcelona, and Tohoku University are working to improve those properties.
Unlike the glass of a window pane, metallic glasses can be quite flexible, but they will still reach a point where they fracture. Instead of fracturing, the researchers want the metallic glass to fail like other, more ductile materials. To learn how to do that the researchers examined shear bands, which form in a material to relieve the stress applied to it, using nanoindentation. The idea behind nanoindentation is simple enough, as it involves pressing a needle into the material, with all of the force concentrated at the tiny tip. The researchers did this repeatedly to find where and when the shear bands formed.
By controlling the formation of shear bands, it should be possible to control how a metallic glass will fail, and thus a way to shape the material's properties. The researchers have already found that there are multiple types of initiation sites for shear bands, which is counter to the assumption that only one exists.
Source: Los Alamos National Laboratory
Posted: March 4, 2014 02:10PM
For many people, the thought of solar power brings to mind arrays of silicon panels or massive reflectors aiming light at a point. These are not the only examples of solar power technology though, and one of them may soon see a drop in cost. Researchers at North Carolina State University recently found a way to optimize the design of thin film solar cells that could reduce the amount of amorphous silicon used by an order of magnitude.
The best thin film solar cells today use a layer of amorphous silicon roughly 100 nm thick, which captures most of the solar energy that falls on it. The researchers' design instead uses a layer just 10 nm thick and absorbs 90% of the available solar energy. This was achieved by examining the absorption of semiconductor materials in light-trapping structures, like those of a thin film solar cell. The researchers found that absorption is maximized when the light-trapping efficiency and absorption efficiency match. From this they developed a layered, rectangular design that could potentially be used to improve the performance of thin film solar cells based on materials other than silicon.
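The matching condition echoes the classic critical-coupling result from resonator physics: in a simple two-rate model (not the paper's actual model), the absorbed fraction A = 4ab/(a + b)² of light entering a resonance peaks exactly when the absorption rate a equals the trapping rate b:

```python
# Illustrative two-rate absorption model: the absorbed fraction
# A = 4ab / (a + b)^2 is maximized, reaching 1.0, when the absorption
# rate a matches the light-trapping (coupling) rate b.

def absorbed_fraction(a, b):
    """Absorbed fraction of a critically-coupled resonance for
    absorption rate a and trapping rate b (arbitrary equal units)."""
    return 4 * a * b / (a + b) ** 2
```

Matched rates give complete absorption, while a tenfold mismatch absorbs only about a third of the light, regardless of overall scale.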
One of the challenges to manufacturing and cutting the cost of thin film solar cells is the deposition of the semiconductor material. By reducing the amount needed by an order of magnitude, the cells could be produced faster and more cheaply.
Source: North Carolina State University
Posted: March 4, 2014 08:26AM
Though flash-based SSDs are growing in popularity, a great number of systems still rely on magnetic hard drives to store data. So many, in fact, that processing and storing the data on all of those drives now accounts for a significant fraction of the world's energy consumption. Researchers at the University of York, though, have recently made a discovery that could reduce energy costs without increasing monetary costs.
Writing data to a magnetic drive usually requires applying a magnetic field to it, which uses a fair amount of energy. Some time ago, all-optical thermally induced magnetic switching (TIMS) was discovered in some rare-earth-transition-metal alloys called ferrimagnets. As the name suggests, it is possible to switch the magnetic state of those alloys using a laser pulse. As the laser pulse uses less energy than the magnetic field, the result would be a more efficient drive. The catch is that the alloys are prohibitively expensive. What the York researchers have discovered is a synthetic ferrimagnet composed of two ferromagnetic materials, with a non-magnetic spacer sandwiched between them.
Without the reliance on rare metals, this discovery could help bring all-optical TIMS to storage technology, cutting energy costs.
Source: University of York
Posted: March 3, 2014 01:17PM
The march of progress for electronics has been following Moore's Law for some time now, faithfully doubling the number of transistors on a chip every two years. That march may be halted in the future though, as power consumption and other factors limit our ability to shrink the components further. One alternative to electronics that could pick up the march is photonics, and researchers have recently made a great step toward that end, as reported by the Optical Society.
Researchers have been building photonic devices for some time, but these devices are often custom built, using processes and techniques that prevent mass production. Now researchers have managed to build a modulator and tunable filter using IBM's advanced CMOS process. This same process has been used to create silicon electronics for years, and this is the first time, as far as the researchers are aware, that it has been used to create silicon photonics that rival electronics in energy efficiency. The modulator can be used to translate an electrical signal into an optical one, while the tunable filter can select a single frequency from among many.
The researchers believe these devices could be used for chip-to-chip communications, where the chips could take advantage of the 10 times higher bandwidth density of the optical devices. As one optical link can carry multiple frequencies, it can carry multiple, discrete channels, whereas electrical signals each require their own wire. The work is to be presented at the OFC Conference and Exposition in San Francisco from March 9-13.
Source: The Optical Society
Posted: March 3, 2014 09:03AM
Perhaps I am just an old soul, but I tend to get my news and information by more traditional means, while many people around me are kept informed by social media. It is definitely true that social media has the ability to quickly disseminate information, but what happens once we have that information? According to researchers at the University of Copenhagen, we cease to act rationally due to social proof.
Humans are social beings, so we are constantly looking for social cues of one form or another, to inform our actions and opinions. It used to be that you would have to look at people or a crowd of people to collect social information. Now with Twitter, Facebook, and other social media systems, we can rapidly acquire large amounts of social information in an instant. When combined with social proof, this can lead to some awkward situations. Social proof is the phenomenon where someone trusts those around him or her, assuming that these other people possess greater knowledge. Of course that may not be the case, but with the power to see up-votes, retweets, etc. immediately, social proof can have a significant impact on us, as we trust the crowd instead of verifying the information.
Last year a rumor spread so rapidly on Twitter that it caused the value of the Euro to drop a small, but meaningful amount, even though that rumor was false. What might be the impact next time a rumor spreads, and no one bothers to check it?
Source: University of Copenhagen
Posted: February 28, 2014 08:23AM
Quasiparticles are interesting things in physics, as they are actually small groups of individual particles, which for one reason or another have come together to behave as one. One example is an exciton, which is an electron and the positively charged hole it leaves behind. Researchers at NIST and the University of Marburg recently created a quantum droplet containing about five electrons and holes.
In semiconductors, it is not unusual for electron-hole droplets to form, but they typically contain thousands to millions of pairs. The smaller droplets were made by firing an ultrafast red laser, pulsing about 100 million times a second, at gallium arsenide. Initially the pulses just formed excitons, but as the intensity grew, the pairing within the excitons fell apart and the electrons and holes collected together, forming the new quasiparticle.
With a lifespan of only about 25 picoseconds (0.000000000025 seconds), we will likely not be seeing any technology built around these. However that is long enough to perform optical studies, to learn how light interacts with highly correlated states of matter, and that may influence future optoelectronics.
Posted: February 27, 2014 05:14PM
In one form or another, we are all likely familiar with liquid crystals, and whether we like it or not, we are also familiar with bacteria. Liquid crystals and bacteria, however, were not familiar with each other, but that has now changed. Researchers at the Argonne National Laboratory and Kent State University have placed bacteria in a liquid crystal medium, creating living liquid crystals, and made some interesting discoveries.
Typically liquid crystals will align into long, rod-like structures along a single dimension, called a director. The researchers took a colony of Bacillus subtilis and transferred it to the liquid crystal solution, where at first the bacteria moved along with the director. As more bacteria were added however, they started to affect the liquid crystals. Like many other bacteria B. subtilis uses a long tail called a flagellum to propel itself, by spinning the tail like a corkscrew. This motion disrupted the director and started creating a wave-like pattern in the liquid crystal. As even more bacteria were added, the bacteria started forming stripes through the director, at different orientations, and by depriving the bacteria of oxygen, the stripes could be erased.
The hope for this study is that it will lead to a better understanding of active materials, which could potentially consume energy from the environment. Also this research could impact how bacteria are studied because while flagella are nanometers wide, requiring an electron microscope to see, the wake left in the liquid crystals is visible under an optical microscope.
Source: Argonne National Laboratory
Posted: February 27, 2014 08:04AM
Though I would never describe myself as a programmer, I do have some experience and know how irritating it can be to find I missed one section of code. To help with that, researchers at MIT developed Sketch, which has the ability to fill in missing code in some situations. Now researchers at MIT have improved Sketch's efficiency, making tasks that were previously impossible for it take just milliseconds.
Sketch operates by treating the program it is analyzing as a search problem. When it comes to a spot missing code, it considers and tests the possible variations of what could fill the void. When the program is too complex though, the number of variations can grow too large for Sketch to handle. In these cases Sketch will eventually time out or give up on finding the solution. What the recent work adds is the ability for a developer to describe the criteria of the missing code. This way Sketch only has to find code that satisfies the criteria, and then move on without checking every possible variation.
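Sketch's real engine is far more sophisticated, but the loop described above, enumerating candidate completions for a hole, checking each against the specification, and letting developer-supplied criteria prune the space, can be caricatured in a few lines. The template, candidate grammar, and spec format here are all invented for illustration:

```python
# Caricature of hole-filling synthesis: complete `lambda x: x * A + B`
# by brute-force search over small integer constants A and B.
from itertools import product

def synthesize(spec, criteria=None, max_const=5):
    """Find constants (a, b) so that x * a + b matches every
    (input, output) pair in `spec`. `criteria`, if given, plays the
    role of the developer-supplied description: it prunes candidates
    before the expensive check, shrinking the search space."""
    for a, b in product(range(-max_const, max_const + 1), repeat=2):
        if criteria and not criteria(a, b):
            continue  # pruned without testing against the spec
        candidate = lambda x, a=a, b=b: x * a + b
        if all(candidate(i) == o for i, o in spec):
            return a, b
    return None  # search space exhausted, like Sketch giving up
```

For the spec [(0, 1), (1, 3), (2, 5)] the search recovers x * 2 + 1; adding a criterion such as "the slope is positive" finds the same answer while skipping over half the candidates.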
By shrinking the number of variations to test, the researchers found Sketch was able to complete previously impossible analyses in milliseconds. As impressive as that sounds, it will be a long time before Sketch would be of great use to professional developers, but for some very specialized tasks, it could be very useful.
Posted: February 26, 2014 02:02PM
Many people will associate optical data transmission with the backbone of the Internet, but it is actually used in many, less grand situations, such as connecting servers and supercomputers within a single building. Like the Internet though, speed is very important and the equipment is approaching the limits of the optical technology. As reported by The Optical Society, IBM researchers have recently set a new data-rate record for multimode optical fiber of 64 Gb/s.
Multimode optical fiber is typically used in those smaller settings mentioned above, and in this case was only 57 meters long. To achieve the record, the researchers employed a technique commonly used in electrical communication, but not yet in optical communication. The technique, called transmit equalization, works to widen the bandwidth of an optical link. Interestingly, the researchers also used non-return-to-zero (NRZ) modulation, which is a standard in optical communication. What makes this interesting is that few believed such modulation could achieve transfer rates faster than 32 Gb/s. Obviously that is not the case.
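For readers unfamiliar with it, non-return-to-zero is the simplest of line codes: each bit maps to a high or low signal level held for the entire bit period, with no return to a rest level between bits. A minimal encoder shows the idea:

```python
def nrz_encode(bits, samples_per_bit=4):
    """Map each bit to a constant level (+1 for 1, -1 for 0) held for the
    whole bit period -- the signal never returns to zero between bits."""
    signal = []
    for bit in bits:
        level = 1 if bit else -1
        signal.extend([level] * samples_per_bit)
    return signal
```

One symbol per bit is what makes NRZ simple, and also why its achievable data rate is tied so directly to the link's bandwidth, which is exactly what transmit equalization works to widen.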
While the record-setting speed was achieved with a multimode fiber only 57 meters long, that should suffice for many data centers, where 80% of cables are less than 50 meters long; supercomputers likewise do not require longer cables. The new technology is also ready for commercialization right now.
Source: The Optical Society
Posted: February 26, 2014 09:03AM
For at least some people, we may never look at chickens the same way again, as it turns out their eyes are very different from our own and those of many other animals. Within eyes, human and avian, are cells called cones that are sensitive to different colors of light, and the layout of those cones impacts the creature's vision. Researchers at Princeton University have found that the layout of cones in a chicken's eyes follows a special pattern, which could lead to advanced optical devices.
When first looking at the placement of the different kinds of cones in a chicken's eyes, researchers thought they were placed randomly. That seemed unlikely, as randomness would impair vision, and birds are known to have good vision. The researchers got to work attempting to model the layout and found that there is indeed a pattern, called disordered hyperuniformity. What this means is that over large distances, variations in the density of cones are suppressed, as in a crystal, but over short distances there is disorder, as in a fluid. The reason for this is the presence of exclusion zones around each cone, which another cone, especially one of the same type, should not enter, as the two would interfere with each other. Therefore, to optimally pack the cones, the avian eye evolved this disordered hyperuniform design.
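The connection between exclusion zones and suppressed density fluctuations can be seen even in a one-dimensional toy model: points placed with a minimum-separation constraint show smaller window-to-window count fluctuations than purely random points. This sketch only illustrates the principle; it is not the Princeton analysis:

```python
import random

def drop_points(n, length, min_sep, seed=0):
    """Sequentially place up to n points on [0, length), rejecting any
    point closer than min_sep to an existing one (min_sep=0 is pure
    randomness; min_sep > 0 mimics an exclusion zone around each cone)."""
    rng = random.Random(seed)
    points = []
    attempts = 0
    while len(points) < n and attempts < 100 * n:
        attempts += 1
        x = rng.uniform(0, length)
        if all(abs(x - p) >= min_sep for p in points):
            points.append(x)
    return points

def count_variance(points, length, window):
    """Variance of point counts over non-overlapping windows: a rough
    measure of large-scale density fluctuations."""
    counts = [sum(window * i <= p < window * (i + 1) for p in points)
              for i in range(int(length // window))]
    mean = sum(counts) / len(counts)
    return sum((c - mean) ** 2 for c in counts) / len(counts)
```

With the same number of points, the constrained arrangement spreads them far more evenly across windows, which is the hyperuniform flavor of the avian cone mosaic.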
Potentially this information could be applied to create self-organizing optics that can transmit light as efficiently as a crystal, while being flexible like a fluid.
Source: Princeton University
Posted: February 25, 2014 08:52AM
For many technologies and devices a lot of energy is lost as heat, which is why we have been working to somehow capture that heat and make use of it. Thermoelectric materials, which can convert between electrical and thermal energy, could be especially useful here, but materials with optimal properties have proven hard to find and hard to produce. Researchers at the University of Colorado, Boulder have taken a different approach to the problem and have found a solution that could significantly improve the situation.
Thermoelectric materials produce an electrical current when there is a heat differential across them, and the stronger the differential, the stronger the current. The problem is that electrical and thermal conductivity typically go together, so a material that transmits an electrical current well will also dissipate a temperature differential quickly. What the researchers have suggested is adding nanoscale pillars to a sheet of a thermoelectric material. Heat is carried through a material as phonons, the quanta of vibrational energy, and these pillars interfere with those vibrations, slowing their movement.
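The tension between the two conductivities is captured by the standard thermoelectric figure of merit, ZT = S²σT/κ, where S is the Seebeck coefficient, σ the electrical conductivity, κ the thermal conductivity, and T the temperature. A quick calculation with illustrative (not measured) values shows why slowing heat flow matters: halving κ doubles ZT.

```python
def figure_of_merit(seebeck, elec_cond, thermal_cond, temperature):
    """Standard thermoelectric figure of merit ZT = S^2 * sigma * T / kappa."""
    return seebeck**2 * elec_cond * temperature / thermal_cond

# Illustrative values for a generic room-temperature thermoelectric:
S, sigma, kappa, T = 200e-6, 1e5, 1.5, 300  # V/K, S/m, W/(m*K), K

baseline = figure_of_merit(S, sigma, kappa, T)
with_pillars = figure_of_merit(S, sigma, kappa / 2, T)  # heat flow halved
```

Because κ sits alone in the denominator, any reduction in phonon transport that leaves the electrical properties untouched translates directly into a proportionally higher ZT.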
According to the researchers' estimates, the pillars could cut heat flow by as much as half, which is quite significant. The actual effect may be even greater, as the researchers describe their calculations as very conservative.
Source: University of Colorado, Boulder
Posted: February 25, 2014 06:22AM
Heat is a problem for our computers as it can degrade our components, but while we are typically concerned with our processors overheating, they are not the only components that can suffer. Hard drives and eventually magnetic-RAM will suffer from heat issues as an electric current is required to flip the magnetic fields within them, and that current will generate heat. Researchers at the University of Miami however, have found a way to change the direction of a magnetic field using an electric field, which can be produced without heat.
It has been known for some time that an electric field can affect a magnetic field, but this new work adds to our understanding. It takes advantage of a phenomenon called Rashba spin-orbit coupling, which comes from the interaction of an electric field and an electron's spin, the source of magnetism. What the researchers discovered is that by creating a capacitor in which one element is magnetic, the orientation of the magnetic field can be changed by charging the capacitor.
Potentially this could lead to denser magnetic memory, as the reduction in heat will allow individual elements to be placed closer together. It may also impact electromechanical devices including actuators, which convert an electrical signal into mechanical motion.
Source: University of Miami
Posted: February 24, 2014 01:07PM
Muscles are a critical part of our lives, as they provide the force and power to move about our world. Recreating that ability has proven difficult though, as motors and engines can be too large, and artificial muscles can be expensive to produce. Researchers across the world, including those at the University of British Columbia, have recently developed an artificial muscle made of materials commonly found in fishing lines and thread.
Artificial muscles must be able to expand and contract, producing force in the process, much like our natural muscles. Metal wires and carbon nanotubes have been used before to make artificial muscles, but both materials can be difficult to work with and expensive to produce, making them less than ideal. What the researchers developed is instead based on fibers of polyethylene and nylon, which have been twisted into tight coils. To make the coils expand and contract, the researchers had to change their temperature, which can be accomplished with an electrical heating element. Compared to a human muscle of equal size, the artificial muscle can lift objects 100 times heavier.
Because its raw materials are so common, an artificial muscle of this kind could be cheap and easy to deploy. Potential applications include robots as well as low-cost devices to assist those with reduced mobility.
Source: University of British Columbia
Posted: February 24, 2014 09:44AM
The backbone of the Internet is composed in large part of lasers and optical fiber, as optical communication can transmit prodigious amounts of data rapidly and efficiently. As the world demands greater bandwidth though, the technology is approaching its limits. Researchers at the California Institute of Technology, however, have developed a new laser that could potentially increase optical data transmission rates by orders of magnitude.
For forty years, distributed-feedback semiconductor (S-DFB) lasers have been used to send optical signals along the cables. The reason the design was adopted and continues to be used is its spectral purity: the light it produces is very near a single frequency, and the purer the light, the faster digital bits can be transmitted. Unfortunately, part of the laser's design limits its spectral purity, and while that was satisfactory in the past, it may not be in the future. The problem is that the III-V semiconductors used to produce the light also absorb some of it, which degrades the spectral purity. The Caltech researchers have found that this absorption can be prevented by adding a layer of silicon to act as a concentrator. The silicon pulls the light away from the III-V semiconductor, but does not absorb the light itself.
This new laser could make the spread of frequencies 20 times narrower than that of the S-DFB lasers in use today. Such an improvement could greatly help the Internet keep pace with our demands for higher speed.
Posted: February 21, 2014 09:02AM
There are multiple physical theories and mechanics that influence the world around us, and some of them even compete with each other in certain scenarios. For decades now, physicists have been questioning which theories form the basis of the Universe, but finally we may be drawing close to an answer. Researchers at MIT have recently devised a way to test and potentially close a loophole in Bell's inequality, which could indicate whether classical or quantum mechanics is the foundation of the Universe.
Bell's inequality was developed fifty years ago to address the disparity between classical and quantum mechanics as it pertains to certain phenomena, such as entanglement. Entanglement is when two particles become so strongly coupled that measuring one will immediately affect the other, no matter how separated they are. Bell's inequality would limit entanglement, if classical mechanics is at the heart of the Universe. Since it was developed, researchers have been testing it, and finding loopholes. One major loophole still remains, the so-called 'free will' loophole. According to this loophole, the particle detectors measuring the particles may be biased towards certain measurements, due to some shared causal past. Therefore the detectors, and the scientists operating them, do not truly have free will, as far as the particles are concerned.
What the MIT researchers have proposed is taking advantage of distant quasars to make the measurements. By selecting two quasars, which can be nearly the age of the Universe, on opposite sides of the visible Universe, one should be able to assume that they and the light they emit have not shared any information since the Universe was born. The light from these quasars would be used to determine the settings of the particle detectors, and after enough measurements it should be possible to say if the particles are more strongly entangled than classical mechanics allows, thus indicating the Universe is based on quantum mechanics. Usefully, the proposed experiment can be completed with current technology, so it could happen in a few years.
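The gap such an experiment would probe can be computed directly. In the CHSH form of Bell's inequality, any local hidden variable (classical) theory bounds the correlation combination S at 2, while quantum mechanics, with correlations of the form E(x, y) = cos(x − y) at optimally chosen detector angles, reaches 2√2:

```python
import math

def chsh(E, a, a2, b, b2):
    """CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b').
    Local hidden variable theories require |S| <= 2."""
    return E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

# Quantum-mechanical correlation for an entangled pair (up to sign convention)
quantum = lambda x, y: math.cos(x - y)

# Optimal detector settings for the two observers
S = chsh(quantum, 0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)
# S equals 2*sqrt(2), violating the classical bound of 2
```

In the proposed experiment, the quasar light would choose which of these detector angles to use for each measurement, so that observing S above 2 could not be blamed on a shared causal past.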
Posted: February 20, 2014 08:19AM
When a new material is discovered, researchers work to determine its many properties, hoping to find something special. One such special property would be the ability to be made into a superconductor and transmit an electrical current without resistance. Researchers at the University of Vienna and other institutions have recently come up with a plan that may make graphene into a superconductor.
Graphene is an atom-thick sheet of carbon and was first discovered in 2004. Since then researchers have been studying it, trying to uncover its many characteristics and find ways to apply them. It is very strong and light, and has extraordinary conductive properties, but it is not a superconductor. Other forms of carbon have been made into superconductors though, so the researchers worked at it and found that it should be possible to dope graphene with enough calcium atoms to induce superconductivity at about 1.5 K.
At such a low temperature, a graphene-based superconductor is not going to have much practical use. However, because of graphene's two-dimensional structure, it can be easily manipulated. As such, a graphene-based superconductor could be used as a tool to better study superconductivity, and perhaps help reproduce it in other materials.
Source: University of Vienna
Posted: February 19, 2014 05:22PM
A fast processor is useless if it cannot quickly access data, which is why CPUs have local caches built in, so the cores have the data they want right there. Unfortunately, while the speed of processors has increased, their ability to communicate with their caches has not kept pace. Researchers at MIT, though, have developed two means to use the caches more intelligently, and thus improve performance.
Multi-core chips have caches at multiple levels: one level sits with each core, while another holds data for the entire processor to access. The protocol for storing data follows a principle of spatiotemporal locality, which assumes that if a core accesses a piece of data once, it will likely request it again, and will likely request data near it in the main memory as well. While this has been working well for some time, there are cases where it fails, such as when the data a core is working on exceeds its local storage. One of the MIT proposals would have that data shared between the local cache and the last-level cache (LLC), which serves the entire chip. The data will also not be unnecessarily swapped around: if multiple cores need access to the same data, that data will be stored in the LLC instead of the local caches, which would require constant updating.
The other caching method proposed would change how the LLC is treated. Instead of using it as a single memory bank, with data only stored in it once and spread out, it would be copied to blocks near the cores that need it. This will increase performance for those situations when multiple cores need access to the same data infrequently.
Though these are separate methods, the researchers are working on how to integrate them both into the same chip. As both require actively monitoring a processor's operation though, more circuitry will have to be added, about five-percent the size of the LLC. As transistors keep shrinking though, and communication becomes more important, chip-space is less crucial a concern.
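The spatiotemporal-locality principle both proposals refine is the same bet behind a basic least-recently-used cache: keep whatever was touched most recently, on the assumption it will be touched again soon. A minimal LRU cache makes the policy concrete:

```python
from collections import OrderedDict

class LRUCache:
    """Minimal least-recently-used cache: recently accessed entries stay,
    and the least recently used entry is evicted when capacity is exceeded."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None  # a cache miss; hardware would fetch from memory
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least recently used
```

The MIT work addresses exactly the cases where this bet goes wrong, such as a working set larger than the cache, or data shared across cores, by choosing where a line lives rather than changing the eviction rule itself.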
Posted: February 19, 2014 07:42AM
Graphene is an atom-thick sheet of carbon with extraordinary electrical characteristics. Many would like to see it used in electronics, but it lacks one important feature: a band gap. Researchers at the University of Luxembourg and other institutions have produced, for the first time, artificial graphene, which could potentially be engineered with whatever properties are desired.
While some of graphene's properties come from the carbon it is made of, others originate from its honeycomb structure. The researchers have recreated that structure, but made of nanometer-sized semiconducting crystals instead of carbon atoms. By changing the structure and chemistry of those crystals, it should be possible to tune the properties of the larger material.
With the ability to tune artificial graphene's properties, this discovery could enable many new systems and technologies in the future that take advantage of them.
Source: University of Luxembourg
Posted: February 18, 2014 01:15PM
Though we may not see them, we all rely on copper or aluminum wires to provide power to our homes. While these cables work well, people have searched for better solutions. Researchers at Rice University have recently found that fibers made of carbon nanotubes could be a viable replacement.
Carbon nanotubes are known for their conductivity, so this discovery may not seem too surprising, but it is when considering how many kinds of nanotubes there are. The best conductors are the armchair variety, but these are difficult to produce in bulk, so the researchers instead worked with fibers spun from less conductive nanotubes. To measure the fibers' current-carrying capacity, the researchers built a custom rig that charges a cable in a controlled environment until the heat causes it to break. They found that their nanotube fibers performed the best of any reported carbon-based fibers, though their resistivity was still worse than copper's by an order of magnitude. However, because the nanotubes are so light, a nanotube cable of equal mass to a copper cable could carry four times as much current.
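The equal-mass comparison comes down to simple geometry: a lighter material yields a larger cross-section for the same mass of cable. A rough sketch of that arithmetic follows, using illustrative density and current-density figures that are assumptions for the sake of the example, not numbers from the Rice study:

```python
# Back-of-the-envelope comparison of the current carried by a copper
# cable and a carbon nanotube fiber cable of EQUAL MASS and length
# (not equal cross-section). All figures are rough illustrative
# values, not measurements from the Rice study.

copper_density = 8.96   # g/cm^3
fiber_density = 1.3     # g/cm^3, plausible for a spun nanotube fiber

# For equal mass and length, cross-sectional area scales inversely
# with density, so the lighter fiber gets a much larger cross-section:
area_ratio = copper_density / fiber_density   # fiber area / copper area

# Assume the fiber sustains a somewhat lower current density than
# copper (purely illustrative factor):
current_density_ratio = 0.6   # fiber / copper

# Net current ratio for the equal-mass cables:
current_ratio = area_ratio * current_density_ratio
print(round(current_ratio, 2))
```

The point of the sketch is that even a material with a worse current density per unit area can come out ahead per unit mass, because the mass budget buys it several times the cross-section.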
One of the researchers has suggested that because of how light the nanotube fibers are, one could potentially use them to power an aerial device from the ground. The fibers would be like a kite string, connecting the airborne device to a terrestrial power source.
Source: Rice University
Posted: February 18, 2014 10:04AM
Graphene is a somewhat popular material in research labs because of its many wonderful properties, including high strength and conductivity. The atom-thick sheet of carbon also has some weird properties, such as being hydrophobic, except when used to create narrow capillaries. Two years ago, researchers at the University of Manchester tested how well graphene oxide filters water vapor, and now they have tested it for filtering liquid water.
In the previous experiment, the researchers found that laminates of graphene oxide were impermeable to all gases and vapors besides water vapor, which passed through as though nothing were there. Even helium, which is notoriously difficult to contain, could not escape. Graphene oxide was used because stacking its layers on top of each other to form the laminate creates atom-wide capillaries. When immersed in water, the laminates did swell as they absorbed water, but water was still able to flow through them very quickly.
Some ions were also able to pass through the membrane, but only those smaller than nine angstroms, or 0.9 nanometers. However, the researchers believe that with more work they will be able to reduce that cutoff, allowing for an ultrafast filtration system that could remove even some of the smallest ions found in seawater.
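The sieving behavior amounts to size exclusion: anything wider than the capillary is held back. A minimal sketch of that idea, using the 0.9 nm cutoff from above but with hydrated ion diameters that are rough illustrative values, not measurements from the Manchester study:

```python
# Toy size-exclusion filter: ions whose hydrated diameter exceeds the
# channel width are rejected. The 0.9 nm cutoff matches the figure
# reported above; the ion diameters are illustrative assumptions.

CHANNEL_WIDTH_NM = 0.9

hydrated_diameters_nm = {
    "K+": 0.6,
    "Na+": 0.7,
    "Cl-": 0.7,
    "Mg2+": 0.8,
    "large organic ion": 1.2,
}

def passes(ion):
    """An ion permeates only if its hydrated size fits the channel."""
    return hydrated_diameters_nm[ion] <= CHANNEL_WIDTH_NM

permeating = [ion for ion in hydrated_diameters_nm if passes(ion)]
print(permeating)  # everything except the large organic ion
```

Shrinking `CHANNEL_WIDTH_NM`, as the researchers hope to do physically, would start rejecting the smaller ions as well, which is what desalination requires.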
Source: University of Manchester
Posted: February 17, 2014 08:28AM
Superconductivity is an interesting phenomenon in which some materials conduct electricity without resistance, but only under certain conditions. Because resistance-free transmission is so valuable, researchers have been working to understand what causes superconductivity to emerge in these materials. Researchers at Brookhaven National Laboratory have recently studied how the electron orbitals in iron-based superconductors change as a result of dopants.
Iron-based superconductors are one kind of high-temperature superconductor, becoming superconducting at temperatures significantly above absolute zero, though still far below room temperature. They are special compounds containing multiple elements, and the Brookhaven researchers examined how the outermost electrons in one material are affected by the addition of a dopant, which makes the material a superconductor. The base material was barium iron arsenide, and when analyzed, the electron orbitals of the iron were shown to be sandwiched between arsenic atoms. Once some cobalt atoms were doped in, the orbitals changed, creating a stronger quadrupole around the iron atoms and strongly polarizing the arsenic atoms.
This altered electronic structure helps electrons couple into pairs, which is essential for superconductivity to occur. As this research was done under static conditions and at room temperature, the researchers plan to repeat their work with super-cold samples of other materials, which may help reveal what creates superconductivity.
Source: Brookhaven National Laboratory
Posted: February 14, 2014 09:04AM
Anyone who plays video games is familiar with using an avatar to affect a virtual environment, and as we play we can become immersed in both the game and the avatar. Now researchers at the University of Illinois at Urbana-Champaign, as reported by the Association for Psychological Science, have found an interesting link between our behavior and our virtual avatars.
The researchers had 194 undergraduates participate in what they thought were two unrelated studies. The first randomly assigned them to fight enemies in a video game for five minutes as Superman, a heroic avatar; as Voldemort, a villainous one; or as a neutral circle. The participants then took part in a blind taste test involving chocolate and chili sauce, with an intriguing twist: each could choose one of the foods to pour into a plastic dish a future participant would have to consume. Those who played as Superman poured out almost twice as much chocolate as chili sauce, and significantly more chocolate than anyone else. Those who played as Voldemort behaved similarly, but with the chili sauce instead of the chocolate. Interestingly, how strongly participants identified with their avatar did not seem to affect their behavior.
Currently these findings are preliminary, but they could have some interesting implications for social behavior. After all, in video games and other virtual environments one can do almost anything, and whatever we do is done from behind a virtual mask.
Posted: February 13, 2014 04:19PM
Many recognize nuclear fusion as the ultimate power source, as it could one day deliver prodigious amounts of power from relatively cheap hydrogen isotopes. While many facilities around the world have been working on a variety of fusion projects, none have yet achieved ignition, the point at which more energy is released than was used to confine the fuel. Researchers at Lawrence Livermore National Laboratory using the National Ignition Facility (NIF), however, have achieved 'fuel gain' for the first time, meaning more energy was released than was deposited in the fuel.
Instrumental in achieving this was triggering a bootstrapping process with alpha particles. Alpha particles consist of two protons and two neutrons (a helium nucleus) and are released by the fusion process. Bootstrapping occurs when those particles deposit their energy back into the deuterium-tritium fuel, heating it further and increasing the rate of fusion, which in turn releases more alpha particles. The researchers also modified the laser pulse that compresses the fuel to prevent the breakup of the polymer shell around it, helping the bootstrapping occur. This led to more than an order-of-magnitude increase in performance.
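The bootstrapping feedback can be caricatured with a toy model in which a fraction of each step's fusion energy is redeposited in the fuel, raising its temperature and thus the reaction rate. Every constant below is an arbitrary illustrative value; this is a sketch of the feedback loop, not a plasma simulation:

```python
# Toy model of alpha-particle bootstrapping: redeposited energy raises
# the fuel temperature, the reaction rate rises with temperature, and
# the loop compounds. All constants are arbitrary illustrative values.

def run(alpha_coupling, steps=50):
    temperature = 1.0   # dimensionless fuel temperature
    total_yield = 0.0
    for _ in range(steps):
        rate = temperature ** 2      # reaction rate grows with temperature
        energy = rate * 0.01         # energy released this step
        total_yield += energy
        # Alpha particles deposit a fraction of that energy in the fuel:
        temperature += alpha_coupling * energy
        temperature *= 0.99          # losses (radiation, expansion)
    return total_yield

# With weak coupling the fuel cools faster than it self-heats; with
# stronger coupling the feedback compounds and yield grows sharply.
print(run(alpha_coupling=0.1))
print(run(alpha_coupling=2.0))
```

The nonlinearity is the point: a modest change in how much alpha energy stays in the fuel (here, the `alpha_coupling` knob) produces a disproportionate change in total yield, which is why keeping the shell intact mattered so much.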
Along with achieving fuel gains, the researchers also found that their results matched computer simulations better than ever before, which is very important for NIF. The primary purpose of NIF is to provide experimental insight for the Stockpile Stewardship Program, which works to keep the nation's nuclear weapon stockpile safe, secure, and reliable without directly testing them.
Posted: February 13, 2014 06:44AM
A problem faced by most if not all tech companies is how to make their products cool, to improve sales. After all, the cooler the product, the more it is talked about, and name recognition is important. Researchers at Penn State University have recently completed a series of studies into what makes something cool, and found there is more to it than previously thought.
Many people would say that a product is cool if it looks good, is original, and is edgy, but according to this research there is more to it than that. The product also has to appeal to the appropriate subculture (such as overclockers). Typically a subculture is the first group to adopt a product, and the product does not reach mainstream adoption if the subculture does not like it. Assuming that happens, if the product becomes uncool to the subculture, it can lose ground in the mainstream. Fortunately, such products can become cool again by regaining popularity within a subculture. To keep a product cool, though, the company behind it has to keep innovating.
Altogether, the researchers found that four elements contribute to coolness: subculture appeal, attractiveness, usefulness, and originality. Interestingly, usefulness is not as important as initially thought, as participants rated some products as both cool and low in utility. Hopefully companies will not decide to produce products of low utility, hoping the other elements will carry them to success.
Source: Penn State University
Posted: February 12, 2014 02:17PM
When one works with a living cell, one generally must do so from outside its membrane. This is understandable considering the size of a cell, but if researchers could reach inside, they could do some interesting things. Researchers at Penn State University have recently achieved that by using ultrasound and magnetic fields to control nanomotors placed within living cells.
Nanomotors have been created before, but never used in living human cells, in part because chemically fueled designs rely on a toxic fuel. This new design, however, can be powered remotely with ultrasound, which does not affect the cell on its own, nor the motors at low frequencies. Turn up the frequency, however, and the motors can start spinning like an egg beater, or move forward like a battering ram and actually puncture the cell membrane. If they are not pointed in the correct direction, a magnetic field can steer them where needed. This could include pointing them at an organelle, to discover the unseen reactions a cell has to such events.
Another important feature of these nanomotors is that they can move independently of each other. This will be exceedingly useful if they are ever sent to hunt cancer cells or deliver medicine, as you would not want them all moving in one direction.
Source: Penn State University
Posted: February 12, 2014 06:20AM
Anyone who has driven home during rush hour knows that uneven, stop-and-go movement slows things down. According to researchers at Northwestern University, however, the opposite may be true for water moving through nanochannels: the interiors of nanotubes.
For some time it has been believed that water travels through the center of a nanotube in an even, continuous flow. When this assumption was tested, though, the water molecules were found to be traveling some ten thousand times faster than predicted. This was explained away as a result of the smoothness of the nanotube's interior, but the new observations challenge that. Rather than moving in an even, constant flow, the Northwestern researchers suggest the molecules move intermittently, a consequence of the mismatch between the size of water molecules and the spacing of carbon atoms in the nanotube. This causes regions to form where the water molecules are unstable and can then move through the channel very easily.
Many applications could benefit from this improved understanding of fluid dynamics, including water desalination systems, carbon nanotube-powered batteries, and more. It is also worth noting that while cell membranes are not made of carbon nanotubes, they do possess nanochannels that regulate fluid flow between a cell and its environment.
Source: Northwestern University