Recent Articles:

A Journey Into the Surreal Realm Where David Lynch Meets Quantum Physics

September 21, 2014 Tech

Just yesterday we were talking about the concept of “time dilation,” in which the fabric of space-time warps under various circumstances due to the universal constancy of the speed of light (the speed of light stays the same no matter how fast you’re moving). As an explainer, it turns out that I have nothing on Andrzej Dragan, most certainly “the only quantum physics PhD who’s also photographed David Lynch holding a chicken,” in the words of Motherboard’s sister site The Creators Project, which recently published a Q&A with the Polish artist and scientist that’s well worth reading.

His film below, Physics #1: Time Dilation, the first in a series exploring quantum mechanics, is as strange (and, indeed, as Lynchian) as it is illustrative, or perhaps even a bit more so:

If that wasn’t quite enough, in the second installment Dragan takes on the highly counterintuitive nature of subatomic particles.

As for the dark, otherworldly tone of the pieces, Dragan told TCP, “Most of the elements of the truth that we, as scientists, discovered, are not hard to understand—but they are hard to believe.” Physics itself is surreal.

Motherboard RSS Feed

Solar-cell efficiency improved with new polymer devices

September 21, 2014 Singularity

New light has been shed on solar power generation using devices made with polymers, thanks to a collaboration between scientists in the University of Chicago’s chemistry department, the Institute for Molecular Engineering, and Argonne National Laboratory.

Researchers identified a new polymer — a type of large molecule that forms plastics and other familiar materials — which improved the efficiency of solar cells. The group also determined the method by which the polymer improved the cells’ efficiency. The polymer allowed electrical charges to move more easily throughout the cell, boosting the production of electricity — a mechanism never before demonstrated in such devices.

“Polymer solar cells have great potential to provide low-cost, lightweight and flexible electronic devices to harvest solar energy,” said Luyao Lu, graduate student in chemistry and lead author of a paper describing the result, published online last month in the journal Nature Photonics.

Solar cells made from polymers are a popular topic of research due to their appealing properties. But researchers are still struggling to efficiently generate electrical power with these materials.

“The field is rather immature — it’s in the infancy stage,” said Luping Yu, professor in chemistry and fellow in the Institute for Molecular Engineering, who led the UChicago group carrying out the research.

The active regions of such solar cells are composed of a mixture of polymers that give and receive electrons to generate electrical current when exposed to light. The new polymer developed by Yu’s group, called PID2, improves the efficiency of electrical power generation by 15 percent when added to a standard polymer-fullerene mixture.

“Fullerene, a small carbon molecule, is one of the standard materials used in polymer solar cells,” Lu said. “Basically, in polymer solar cells we have a polymer as electron donor and fullerene as electron acceptor to allow charge separation.” In their work, the UChicago-Argonne researchers added another polymer into the device, resulting in solar cells with two polymers and one fullerene.

8.2 percent efficiency

The group achieved an efficiency of 8.2 percent when an optimal amount of PID2 was added — the highest ever for solar cells made up of two types of polymers with fullerene — and the result implies that even higher efficiencies could be possible with further work. The group is now working to push efficiencies toward 10 percent, a benchmark necessary for polymer solar cells to be viable for commercial application.
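A quick back-of-the-envelope check squares the two figures quoted here, assuming the 15 percent improvement mentioned above is a relative gain over the binary polymer-fullerene blend (the article does not state this explicitly):

```python
# Sanity check: relate the reported 15 percent improvement from adding PID2
# to the reported 8.2 percent overall cell efficiency.
# Assumption: the 15 percent is a relative gain over the binary blend.

ternary_efficiency = 8.2            # percent, cell with PID2 added
relative_gain = 0.15                # 15 percent relative improvement

implied_baseline = ternary_efficiency / (1 + relative_gain)
print(f"Implied binary-blend baseline: {implied_baseline:.1f} percent")        # ~7.1
print(f"Distance to the 10 percent benchmark: {10 - ternary_efficiency:.1f} points")
```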

The result was remarkable not only because of the advance in technical capabilities, Yu noted, but also because PID2 enhanced the efficiency via a new method. The standard mechanism for improving efficiency with a third polymer is by increasing the absorption of light in the device. But in addition to that effect, the team found that when PID2 was added, charges were transported more easily between polymers and throughout the cell.

In order for a current to be generated by the solar cell, electrons must be transferred from polymer to fullerene within the device. But the difference between electron energy levels for the standard polymer-fullerene is large enough that electron transfer between them is difficult. PID2 has energy levels in between the other two, and acts as an intermediary in the process.

“It’s like a step,” Yu said. “When it’s too high, it’s hard to climb up, but if you put in the middle another step then you can easily walk up.”

Thanks to a collaboration with Argonne, Yu and his group were also able to study the changes in structure of the polymer blend when PID2 was added, and show that these changes likewise improved the ability of charges to move throughout the cell, further improving the efficiency. The addition of PID2 caused the polymer blend to form fibers, which improve the mobility of electrons throughout the material. The fibers serve as a pathway to allow electrons to travel to the electrodes on the sides of the solar cell.

“It’s like you’re generating a street and somebody that’s traveling along the street can find a way to go from this end to another,” Yu said.

To reveal this structure, Wei Chen of the Materials Science Division at Argonne National Laboratory and the Institute for Molecular Engineering performed X-ray scattering studies using the Advanced Photon Source at Argonne and the Advanced Light Source at Lawrence Berkeley National Laboratory.

“Without that it’s hard to get insight about the structure,” Yu said, calling the collaboration with Argonne “crucial” to the work. “That benefits us tremendously,” he said.

“Working together, these groups represent a confluence of the best materials and the best expertise and tools to study them to achieve progress beyond what could be achieved with independent efforts,” Chen noted.

“This knowledge will serve as a foundation from which to develop high-efficiency organic photovoltaic devices to meet the nation’s future energy needs,” Chen said. — By Emily Conover


Latest Science News — ScienceDaily

New Hints of Dark Matter from the International Space Station

September 21, 2014 Tech

Earth’s atmosphere protects us from the continuous barrage of high-energy cosmic rays that reach our planet from space. It keeps us from living in a giant microwave, in a sense. While these rays have many sources—black holes, supernovae, quasars, gamma-ray bursts, the Big Bang itself—one is of particular interest: dark matter. And hanging off the side of the International Space Station one finds the Alpha Magnetic Spectrometer, part of an experiment designed to hunt for dark matter by observing the production of antimatter in our atmosphere.

Indeed, newly crunched results from the AMS detector are now showing even more promising hints of dark matter from these showers of cosmic rays.

In the dark matter hunt, physicists are particularly interested in the pairs of matter and antimatter particles that are created when superfast-moving particles collide with bits of matter or energy in the atmosphere. Antimatter is a natural result of the particle creation that occurs when two massless particles (like photons) collide to form two massive particles (an electron and positron).
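For a sense of the energy scale of that pair creation: two photons can only produce an electron-positron pair if together they carry at least the pair’s rest energy, 2·mₑc². A quick check using standard physical constants (textbook physics, not a figure from the AMS analysis):

```python
# Minimum combined photon energy for electron-positron pair production:
# E_threshold = 2 * m_e * c**2 (standard constants, not AMS data).

m_e = 9.1093837015e-31    # electron rest mass, kg
c = 2.99792458e8          # speed of light, m/s
eV = 1.602176634e-19      # joules per electronvolt

threshold_MeV = 2 * m_e * c**2 / eV / 1e6
print(f"Pair-production threshold: {threshold_MeV:.3f} MeV")   # ~1.022 MeV
```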

Normally, the antimatter part of the equation is annihilated immediately when it meets some bit of normal matter, resulting in nature’s most efficient, near-perfect release of energy. But up in orbit, at the very edge of the atmosphere, it’s possible to count antimatter particles before they blow up. Crucially, it’s possible to determine if there are extra antimatter particles beyond what should be expected from detectable cosmic rays. This would indicate that there is some additional source of rays so far unaccounted for.

In 2013, physicists made the discovery that above a very high-energy threshold (8 billion electronvolts), the numbers of positrons (the antimatter version of electrons) shot way up. This was a sure sign that something else was indeed bombarding our atmosphere, possibly dark matter—but by no means definitively dark matter.

The new AMS results suggest the energy level at which this promising positron spike ceases and things go back to normal: 275 billion electronvolts. (For reference, the Large Hadron Collider in its most recent setup was able to smash particles at 2.36 trillion electronvolts.) So, between 8 billion and 275 billion eV is where we find this unexplained extra contribution to the cosmic ray blast zone.

“Scientists have been measuring this ratio [between antimatter and matter] since 1964,” Jim Siegrist, an associate director at the US Department of Energy’s Office of High-Energy Physics, told Fermilab’s Symmetry Breaking. “This is the first time anyone has observed this turning point.” What’s more, the positron-overpopulated region shows a smooth distribution of those positrons across the entire energy gap, rather than sudden jolts, eliminating some non-dark matter possibilities.

Energy is but the flipside of mass, so by knowing this range of energies we can put some limits on the potential masses of our new dark matter candidates. By projecting a curve of possible masses from the observed collision energies of these excess positrons, it should be possible to rule out some potential non-dark matter causes, such as pulsars or other sources that don’t involve a new massive particle. A massive particle, like dark matter, would necessarily show a very steep drop-off at either energy limit because of this mass constraint; a photon sailing across the cosmos from some distant black hole is subject to no such mass-based cutoff.
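To make that flipside concrete, the 8-to-275-billion-electronvolt window can be restated as an equivalent mass via E = mc². A minimal sketch in Python (this is only a unit conversion, not the collaboration’s actual mass analysis):

```python
# Restate the 8-275 GeV positron-excess window as an equivalent mass via E = m c^2.
# This is only a unit conversion, not the AMS collaboration's mass analysis.

GEV_PER_C2_IN_KG = 1.78266192e-27   # 1 GeV/c^2 expressed in kilograms
PROTON_MASS_GEV = 0.93827           # proton rest mass in GeV/c^2

for energy_gev in (8, 275):
    kilograms = energy_gev * GEV_PER_C2_IN_KG
    proton_masses = energy_gev / PROTON_MASS_GEV
    print(f"{energy_gev:>3} GeV  ->  {kilograms:.2e} kg  (~{proton_masses:.0f} proton masses)")
```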

Describing this curve will take some more time. For one thing, researchers still have about 10 billion collision events to analyze out of the total 54 billion collected. In a few years’ time, it’s hoped that we’ll have the statistics to narrow things down further. Fortunately, the ISS experiment is just one of many dark matter probes currently underway, so it’s likely we won’t have to wait quite that long for our next bit of dark matter news.

Motherboard RSS Feed

Soft robotics ‘toolkit’ features everything a robot-maker needs

September 21, 2014 Robots

A new resource unveiled today by researchers from several Harvard University labs in collaboration with Trinity College Dublin provides both experienced and aspiring researchers with the intellectual raw materials needed to design, build, and operate robots made from soft, flexible materials.

With the advent of low-cost 3D printing, laser cutters, and other advances in manufacturing technology, soft robotics is emerging as an increasingly important field. Using principles drawn from conventional rigid robot design, but working with pliable materials, engineers are pioneering the use of soft robotics for assisting in a wide variety of tasks such as physical therapy, minimally invasive surgery, and search-and-rescue operations in dangerous environments.

The Soft Robotics Toolkit is an online treasure trove of downloadable, open-source plans, how-to videos, and case studies to assist users in the design, fabrication, modeling, characterization, and control of soft robotic devices. It will provide researchers with a level of detail not typically found in academic research papers, including 3D models, bills of materials, raw experimental data, multimedia step-by-step tutorials, and case studies of various soft robot designs.

“The goal of the toolkit is to advance the field of soft robotics by allowing designers and researchers to build upon each other’s work,” says Conor Walsh, Assistant Professor of Mechanical and Biomedical Engineering at the Harvard School of Engineering and Applied Sciences (SEAS) and a Core Faculty Member at the Wyss Institute for Biologically Inspired Engineering at Harvard University.

By creating a common resource for sharing design approaches, prototyping and fabrication techniques, and technical knowledge, the toolkit’s developers hope to stimulate the creation of new kinds of soft devices, tools, and methods.

According to Walsh, who teaches a popular course in medical device design at SEAS and is founder of the Harvard Biodesign Lab, soft robotics is especially well suited to shared design tools because many of the required components, such as regulators, valves, and microcontrollers, are largely interchangeable between systems.

Dónal Holland, a visiting lecturer in engineering sciences at SEAS and graduate student at Trinity College Dublin, is one of the lead developers of the toolkit and is especially interested in the toolkit’s potential as an educational resource.

“One thing we’ve seen in design courses is that students greatly benefit from access to more experienced peers — say, postdocs in a research lab — who can guide them through their work,” Holland says. “But scaling that up is difficult; you quickly run out of time and people. The toolkit is designed to capture the expertise and make it easily accessible to students.”

Just as open-source software has spurred far-flung innovation in computing, “open design” hardware platforms — coupled with advances in computer-aided engineering and more accessible prototyping capabilities — have the potential to foster remote collaboration on common mechanical engineering projects, unleashing crowdsourced creativity in robotics and other fields.

“Open design can have as disruptive an influence on technology development in this century as open source did in the last,” says Gareth J. Bennett, assistant professor of mechanical and manufacturing engineering at Trinity College Dublin and a coauthor of a paper in Soft Robotics that describes the toolkit development. Additional coauthors are Evelyn J. Park ’13, a SEAS research fellow in materials science and engineering, and Panagiotis Polygerinos, a postdoctoral fellow in the Harvard Biodesign Lab at SEAS and the Wyss Institute.

Much of the material included in the toolkit sprang from the labs of Robert J. Wood, Charles River Professor of Engineering and Applied Sciences at SEAS, and chemist George M. Whitesides, Woodford L. and Ann A. Flowers University Professor, two researchers whose work has helped establish Harvard as a leader in soft robotics. Wood and Whitesides are also core faculty members of the Wyss Institute.

VIDEO: https://www.youtube.com/watch?v=9EYFlJhga24


Robotics Research News — ScienceDaily

Harvard Makes Soft Robotics Open-Source

September 20, 2014 Tech

Thanks in no small part to the advent of supercheap Arduino microcontrollers and their even cheaper off-brand kin, making robots has become populist in a way that would have baffled engineers just a couple of decades ago. Robots, the classic symbol of the techno-future, are now bopping around in the suburban garages of most anyone with $40 or so to spend on parts and with a bit of programming acumen, or at least the desire/ability to learn a bit of code.

Thanks to a new toolkit released by researchers at Harvard University, those garage robot tinkerers can now expand into the realm of “soft” robots, i.e., robots made to squish and deform like mechanical slugs or eels. It’s hoped that these sorts of bots might be able to do things like assist in surgical operations or search and rescue missions involving tight, confined spaces (like a collapsed mine). But the whole point of releasing something like the Soft Robotics Toolkit is to put a technology’s future into the hands of really anyone with a new idea. That’s the power of open-sourcing.

Image: Eliza Grinnell, Harvard SEAS

“The toolkit includes an open source fluidic control board, detailed design documentation describing a wide range of soft robotic components (including actuators and sensors), and related files that can be downloaded and used in the design, manufacture, and operation of soft robots,” the toolkit’s website explains. “In combination with low material costs and increasingly accessible rapid prototyping technologies such as 3D printers, laser cutters, and CNC mills, the toolkit enables soft robotic components to be produced easily and affordably.”
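A fluidic control board of this kind typically amounts to a microcontroller reading pressure sensors and switching valves. As a purely illustrative sketch (the function names, thresholds, and overall structure below are hypothetical, not the toolkit’s actual firmware or API), a minimal pressure-regulation loop for one pneumatic actuator might look like this:

```python
# Hypothetical bang-bang pressure regulator for a single pneumatic soft actuator.
# read_pressure() and set_valve() are invented placeholders, not the toolkit's API.
import time

TARGET_KPA = 35.0       # desired actuator pressure (illustrative value)
DEADBAND_KPA = 2.0      # hysteresis band to avoid rapid valve chatter

def read_pressure() -> float:
    """Placeholder: return the actuator pressure in kPa from a sensor."""
    raise NotImplementedError("connect to real hardware")

def set_valve(inflate: bool) -> None:
    """Placeholder: open the inflation valve (True) or vent the actuator (False)."""
    raise NotImplementedError("connect to real hardware")

def control_loop() -> None:
    """Keep the actuator within DEADBAND_KPA of TARGET_KPA."""
    while True:
        pressure = read_pressure()
        if pressure < TARGET_KPA - DEADBAND_KPA:
            set_valve(True)     # below the band: inflate
        elif pressure > TARGET_KPA + DEADBAND_KPA:
            set_valve(False)    # above the band: vent
        time.sleep(0.01)        # roughly 100 Hz control loop
```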

In return, the Harvard researchers get access to new data sources, experimental setups, tutorials, and yet more case studies, all thanks to you, the garage robot-builder and now research collaborator. “The goal of the toolkit is to advance the field of soft robotics by allowing designers and researchers to build upon each other’s work,” said Conor Walsh, an assistant professor of engineering at Harvard, in a statement.

“Soft robotics is a growing field that takes inspiration from biological systems to combine classical principles of robot design with the study of soft, flexible materials,” the kit’s website explains. “Many animals and plants are composed primarily of soft, elastic structures which are capable of complex movement as well as adaptation to their environment. These natural systems have inspired the development of soft robotic systems, in which the careful design of component geometry allows complex motions to be pre-programmed into flexible and elastomeric materials.”

The toolkit is described in a paper by Walsh et al in the current edition of the journal Soft Robotics (which is somewhat ironically behind a paywall).

“One thing we’ve seen in design courses is that students greatly benefit from access to more experienced peers—say, postdocs in a research lab—who can guide them through their work,” Dónal Holland, a visiting lecturer at Harvard, said. “But scaling that up is difficult; you quickly run out of time and people. The toolkit is designed to capture the expertise and make it easily accessible to students.”

So: get to work.

Motherboard RSS Feed

Fingertip sensor gives robot unprecedented dexterity

September 20, 2014 Robots

Researchers at MIT and Northeastern University have equipped a robot with a novel tactile sensor that lets it grasp a USB cable draped freely over a hook and insert it into a USB port.

The sensor is an adaptation of a technology called GelSight, which was developed by the lab of Edward Adelson, the John and Dorothy Wilson Professor of Vision Science at MIT, and first described in 2009. The new sensor isn’t as sensitive as the original GelSight sensor, which could resolve details on the micrometer scale. But it’s smaller — small enough to fit on a robot’s gripper — and its processing algorithm is faster, so it can give the robot feedback in real time.

Industrial robots are capable of remarkable precision when the objects they’re manipulating are perfectly positioned in advance. But according to Robert Platt, an assistant professor of computer science at Northeastern and the research team’s robotics expert, for a robot taking its bearings as it goes, this type of fine-grained manipulation is unprecedented.

“People have been trying to do this for a long time,” Platt says, “and they haven’t succeeded because the sensors they’re using aren’t accurate enough and don’t have enough information to localize the pose of the object that they’re holding.”

The researchers presented their results at the International Conference on Intelligent Robots and Systems this week. The MIT team — which consists of Adelson; first author Rui Li, a PhD student; Wenzhen Yuan, a master’s student; and Mandayam Srinivasan, a senior research scientist in the Department of Mechanical Engineering — designed and built the sensor. Platt’s team at Northeastern, which included Andreas ten Pas and Nathan Roscup, developed the robotic controller and conducted the experiments.

Synesthesia

Whereas most tactile sensors use mechanical measurements to gauge mechanical forces, GelSight uses optics and computer-vision algorithms.

“I got interested in touch because I had children,” Adelson says. “I expected to be fascinated by watching how they used their visual systems, but I was actually more fascinated by how they used their fingers. But since I’m a vision guy, the most sensible thing, if you wanted to look at the signals coming into the finger, was to figure out a way to transform the mechanical, tactile signal into a visual signal — because if it’s an image, I know what to do with it.”

A GelSight sensor — both the original and the new, robot-mounted version — consists of a slab of transparent, synthetic rubber coated on one side with a metallic paint. The rubber conforms to any object it’s pressed against, and the metallic paint evens out the light-reflective properties of diverse materials, making it much easier to make precise optical measurements.

In the new device, the gel is mounted in a cubic plastic housing, with just the paint-covered face exposed. The four walls of the cube adjacent to the sensor face are translucent, and each conducts a different color of light — red, green, blue, or white — emitted by light-emitting diodes at the opposite end of the cube. When the gel is deformed, light bounces off of the metallic paint and is captured by a camera mounted on the same cube face as the diodes.

From the different intensities of the different-colored light, the algorithms developed by Adelson’s team can infer the three-dimensional structure of ridges or depressions of the surface against which the sensor is pressed.
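That inference is essentially photometric stereo: with the light directions known and the painted surface reflecting roughly like a matte (Lambertian) material, the per-channel intensities at each pixel can be inverted to recover a surface normal. Here is a minimal sketch of that general idea (illustrative light directions and a simple least-squares model, not the team’s actual algorithm):

```python
# Minimal photometric-stereo sketch: recover per-pixel surface normals from
# intensities seen under several known light directions, assuming a roughly
# Lambertian (matte) surface so that intensity = L @ n. Illustrative only.
import numpy as np

# Assumed unit light directions for three color channels
# (illustrative values, not the actual GelSight geometry).
L = np.array([
    [ 0.7071,  0.0000, 0.7071],
    [-0.3536,  0.6124, 0.7071],
    [-0.3536, -0.6124, 0.7071],
])

def normals_from_intensities(intensities):
    """intensities: (H, W, 3) array, one channel per light. Returns (H, W, 3) unit normals."""
    h, w, _ = intensities.shape
    flat = intensities.reshape(-1, 3).T                 # shape (3, H*W)
    n, *_ = np.linalg.lstsq(L, flat, rcond=None)        # solve L @ n = intensity per pixel
    n = n.T.reshape(h, w, 3)
    norms = np.linalg.norm(n, axis=2, keepdims=True)
    return n / np.clip(norms, 1e-8, None)

# A uniformly lit flat patch yields normals pointing straight out of the surface (+z).
flat_patch = np.full((4, 4, 3), 0.7)
print(normals_from_intensities(flat_patch)[0, 0])       # approximately [0, 0, 1]
```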

Although there was a 3-millimeter variation in where the robot grasped the plug, it was still able to measure its position accurately enough to insert it into a USB port that tolerated only about a millimeter’s error. By that measure, even the lower-resolution, robot-mounted version of the GelSight sensor is about 100 times more sensitive than a human finger.

Plug ‘n play

In Platt’s experiments, a Baxter robot from MIT spinout Rethink Robotics was equipped with a two-pincer gripper, one of whose pincers had a GelSight sensor on its tip. Using conventional computer-vision algorithms, the robot identified the dangling USB plug and attempted to grasp it. It then determined the position of the USB plug relative to its gripper from an embossed USB symbol. Although there was a 3-millimeter variation, in each of two dimensions, in where the robot grasped the plug, it was still able to insert it into a USB port that tolerated only about a millimeter’s error.

“Having a fast optical sensor to do this kind of touch sensing is a novel idea,” says Daniel Lee, a professor of electrical and systems engineering at the University of Pennsylvania and director of the GRASP robotics lab, “and I think the way that they’re doing it with such low-cost components — using just basically colored LEDs and a standard camera — is quite interesting.”

How GelSight fares against other approaches to tactile sensing will depend on “the application domain and what the price points are,” Lee says. “What Rui’s device has going for it is that it has very good spatial resolution. It’s able to see heights on the level of tens of microns. Compared to other devices in the domain that use things like barometers, the spatial resolution is very good.”

“As roboticists, we are always looking for new sensors,” Lee adds. “This is a promising prototype. It could be developed into a practical device.”

Video: http://www.youtube.com/watch?v=w1EBdbe4Nes


Robotics Research News — ScienceDaily
