A self-ventilating suit to keep you dry and cool while you exercise

Images of garment prototype before exercise with flat ventilation flaps (F) and after exercise with curved ventilation flaps (G). Credit: Science Advances (2017). advances.sciencemag.org/content/3/5/e1601984
           
A team of MIT researchers has designed a breathable workout suit with ventilating flaps that open and close in response to an athlete's body heat and sweat. These flaps, which range from thumbnail- to finger-sized, are lined with live microbial cells that shrink and expand in response to changes in humidity. The cells act as tiny sensors and actuators, driving the flaps to open when an athlete works up a sweat, and pulling them closed when the body has cooled off.

The researchers have also fashioned a running shoe with an inner layer of similar cell-lined flaps to air out and wick away moisture. Details of both designs are published today in Science Advances.
Why use living cells in responsive fabrics? The researchers say that moisture-sensitive cells require no additional elements to sense and respond to humidity. The microbial cells they have used are also proven to be safe to touch and even consume. What's more, with new genetic engineering tools available today, cells can be prepared quickly and in vast quantities, to express multiple functionalities in addition to moisture response.
To demonstrate this last point, the researchers engineered moisture-sensitive cells to not only pull flaps open but also light up in response to humid conditions.
"We can combine our cells with genetic tools to introduce other functionalities into these living cells," says Wen Wang, the paper's lead author and a former research scientist in MIT's Media Lab and Department of Chemical Engineering. "We use fluorescence as an example, and this can let people know you are running in the dark. In the future we can combine odor-releasing functionalities through genetic engineering. So maybe after going to the gym, the shirt can release a nice-smelling odor."
Wang's co-authors include 14 researchers from MIT, specializing in fields including mechanical engineering, chemical engineering, architecture, biological engineering, and fashion design, as well as researchers from New Balance Athletics. Wang co-led the project, dubbed bioLogic, with former graduate student Lining Yao as part of MIT's Tangible Media group, led by Hiroshi Ishii, the Jerome B. Wiesner Professor of Media Arts and Sciences.
Shape-shifting cells
In nature, biologists have observed that living things and their components, from pine cone scales to microbial cells and even specific proteins, can change their structures or volumes when there is a change in humidity. The MIT team hypothesized that natural shape-shifters such as yeast, bacteria, and other microbial cells might be used as building blocks to construct moisture-responsive fabrics.

"These cells are so strong that they can induce bending of the substrate they are coated on," Wang says.
The researchers first worked with the most common nonpathogenic strain of E. coli, which was found to swell and shrink in response to changing humidity. They further engineered the cells to express green fluorescent protein, enabling them to glow when they sense humid conditions.
They then used a cell-printing method they had previously developed to print E. coli onto sheets of rough, natural latex.
The team printed parallel lines of E. coli cells onto sheets of latex, creating two-layer structures, and exposed the fabric to changing moisture conditions. When the fabric was placed on a hot plate to dry, the cells began to shrink, causing the overlying latex layer to curl up. When the fabric was then exposed to steam, the cells began to glow and expand, causing the latex to flatten out. After undergoing 100 such dry/wet cycles, Wang says the fabric experienced "no dramatic degradation" in either its cell layer or its overall performance.
No sweat
The researchers worked the biofabric into a wearable garment, designing a running suit with cell-lined latex flaps patterned across the suit's back. They tailored the size of each flap, as well as the degree to which they open, based on previously published maps of where the body produces heat and sweat.
"People may think heat and sweat are the same, but in fact, some areas like the lower spine produce lots of sweat but not much heat," Yao says. "We redesigned the garment using a fusion of heat and sweat maps to, for example, make flaps bigger where the body generates more heat."
Support frames underneath each flap keep the fabric's inner cell layer from directly touching the skin, while at the same time the cells are able to sense and react to humidity changes in the air lying just over the skin. In trials to test the running suit, study participants donned the garment and worked out on exercise treadmills and bicycles while researchers monitored their temperature and humidity using small sensors positioned across their backs.
After five minutes of exercise, the suit's flaps started opening up, right around the time when participants reported feeling warm and sweaty. According to sensor readings, the flaps effectively removed sweat from the body and lowered skin temperature, more so than when participants wore a similar running suit with nonfunctional flaps.
When Wang tried on the suit herself, she found that the flaps created a welcome sensation. After pedaling hard for a few minutes, Wang recalls that "it felt like I was wearing an air conditioner on my back."
Ventilated running shoes
The team also integrated the moisture-responsive fabric into a rough prototype of a running shoe. Where the bottom of the foot touches the sole of the shoe, the researchers sewed multiple flaps, curved downward, with the cell-lined layer facing toward—though not touching—a runner's foot. They again designed the size and position of the flaps based on heat and sweat maps of the foot.
"In the beginning, we thought of making the flaps on top of the shoe, but we found people don't normally sweat on top of their feet," Wang says. "But they sweat a lot on the bottom of their feet, which can lead to diseases like warts. So we thought, is it possible to keep your feet dry and avoid those diseases?"
As with the workout suit, the flaps on the running shoe opened and lit up when researchers increased the surrounding humidity; in dry conditions the flaps faded and closed.
Going forward, the team is looking to collaborate with sportswear companies to commercialize their designs, and is also exploring other uses, including moisture-responsive curtains, lampshades, and bedsheets.
"We are also interested in rethinking packaging," Wang says. "The concept of a second skin would suggest a new genre for responsive packaging."
"This work is an example of harnessing the power of biology to design new materials and devices and achieve new functions," says Xuanhe Zhao, the Robert N. Noyce Career Development Associate Professor in the Department of Mechanical Engineering and a co-author on the paper. "We believe this new field of 'living' materials and devices will find important applications at the interface between engineering and biological systems."
Violating the law of energy conservation in the early universe may explain dark energy

This is the "South Pillar" region of the star-forming region called the Carina Nebula. Like cracking open a watermelon and finding its seeds, the infrared telescope "busted open" this murky cloud to reveal star embryos tucked inside finger-like pillars of thick dust. Credit: NASA
Physicists have proposed that the violations of energy conservation in the early universe, as predicted by certain modified theories in quantum mechanics and quantum gravity, may explain the cosmological constant problem, which is sometimes referred to as "the worst theoretical prediction in the history of physics."
The physicists, Thibaut Josset and Alejandro Perez at the University of Aix-Marseille, France, and Daniel Sudarsky at the National Autonomous University of Mexico, have published a paper on their proposal in a recent issue of Physical Review Letters.
"The main achievement of the work was the unexpected relation between two apparently very distinct issues, namely the accelerated expansion of the universe and microscopic physics," Josset told Phys.org. "This offers a fresh look at the cosmological constant problem, which is still far from being solved."
Einstein originally proposed the concept of the cosmological constant in 1917 to modify his theory of general relativity in order to prevent the universe from expanding, since at the time the universe was considered to be static.
Now that modern observations show that the universe is expanding at an accelerating rate, the cosmological constant today can be thought of as the simplest form of dark energy, offering a way to account for current observations.
However, there is a huge discrepancy—up to 120 orders of magnitude—between the large theoretically predicted value of the cosmological constant and the tiny observed value. To explain this disagreement, some research has suggested that the cosmological constant may be an entirely new constant of nature that must be measured more precisely, while another possibility is that the underlying mechanism assumed by theory is incorrect. The new study falls into the second line of thought, suggesting that scientists still do not fully understand the root causes of the cosmological constant.
The basic idea of the new paper is that violations of energy conservation in the early universe could have been so small that they would have negligible effects at local scales and remain inaccessible to modern experiments, yet at the same time these violations could have made significant contributions to the present value of the cosmological constant.
To most people, the idea that conservation of energy is violated goes against everything they learned about the most fundamental laws of physics. But on the cosmological scale, conservation of energy is not as steadfast a law as it is on smaller scales. In this study, the physicists specifically investigated two theories in which violations of energy conservation naturally arise.
The first scenario of violations involves modifications to quantum theory that have previously been proposed to investigate phenomena such as the creation and evaporation of black holes, and which also appear in interpretations of quantum mechanics in which the wavefunction undergoes spontaneous collapse. In these cases, energy is created in an amount that is proportional to the mass of the collapsing object.
Violations of energy conservation also arise in some approaches to quantum gravity in which spacetime is considered to be granular due to the fundamental limit of length (the Planck length, which is on the order of 10⁻³⁵ m). This spacetime discreteness could have led to either an increase or decrease in energy that may have begun contributing to the cosmological constant starting when photons decoupled from electrons in the early universe, during the period known as recombination.
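For reference, the Planck length quoted above is the unique length scale built from the constants of quantum mechanics and gravity:

$$\ell_P = \sqrt{\frac{\hbar G}{c^3}} \approx 1.6 \times 10^{-35}\ \text{m}.$$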
As the researchers explain, their proposal relies on a modification to general relativity called unimodular gravity, first proposed by Einstein in 1919.
"Energy from matter components can be ceded to the gravitational field, and this 'loss of energy' will behave as a cosmological constant—it will not be diluted by later expansion of the universe," Josset said. "Therefore a tiny loss or creation of energy in the remote past may have significant consequences today on large scale."
Whatever the source of the energy conservation violation, the important result is that the energy that was created or lost affected the cosmological constant to a greater and greater extent as time went by, while the effects on matter decreased over time due to the expansion of the universe.
Another way to put it, as the physicists explain in their paper, is that the cosmological constant can be thought of as a record of the energy non-conservation during the history of the universe.
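Schematically, the unimodular-gravity bookkeeping behind this statement can be written (a sketch in units with c = 1, not the authors' exact expression) as

$$\Lambda_{\rm eff} \;=\; \Lambda_0 \;+\; 8\pi G \int_{\ell} J_\mu \, dx^\mu, \qquad J_\mu \equiv \nabla^\nu T_{\mu\nu},$$

where J is the energy-violation current, i.e., the failure of the matter stress-energy tensor to be conserved, integrated along the cosmic history. Any past creation or loss of energy therefore leaves a permanent additive imprint on the effective cosmological constant.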
Currently there is no way to tell whether the violations of energy conservation investigated here truly did affect the cosmological constant, but the physicists plan to further investigate the possibility in the future.
"Our proposal is very general and any violation of energy conservation is expected to contribute to an effective cosmological constant," Josset said. "This could allow to set new constraints on phenomenological models beyond standard .
"On the other hand, direct evidence that dark energy is sourced by energy non-conservation seems largely out-of-reach, as we have access to the value of lambda [the ] today and constraints on its evolution at late time only."

Credit: Lisa Zyga  
 

Energy scenarios that actually provide useful decision-support tools for policymakers and investors



Fulfilling the promise of the 2015 Paris Agreement on climate change—most notably the goal of limiting the rise in mean global surface temperature since preindustrial times to 2 degrees Celsius—will require a dramatic transition away from fossil fuels and toward low-carbon energy sources. To map out that transition, decision-makers routinely turn to energy scenarios, which use computational models to project changes to the energy mix that will be needed to meet climate and environmental targets. These models account for not only technological, economic, demographic, political, and institutional developments, but also the scope, timing, and stringency of policies to reduce greenhouse gas emissions and air pollution.
Credit: David Pilbrow/Flickr
Model-driven scenarios provide policymakers and investors with a powerful decision-support tool but should not be used as a decision-making tool due to several limitations. So argues a new study in the journal Energy and Environment by Sergey Paltsev, deputy director of the MIT Joint Program on the Science and Policy of Global Change and a senior research scientist for both the Joint Program and the MIT Energy Initiative. The study shows that overall, energy scenarios are useful for assessing policymaking and investment risks associated with different emissions reduction pathways, but tend to overestimate the degree to which future energy demand will resemble the past.
"Energy scenarios may not provide exact projections, but they are the best available tool to assess the magnitude of challenges that lie ahead," Paltsev observes in the study, a unique review of the value and limits of widely used energy scenarios that range from the International Energy Agency (IEA) World Energy Outlook, to the Joint Program's own annual Food, Water, Energy and Climate Outlook (which uses the MIT Economic Projection and Policy Analysis model), to a recent Intergovernmental Panel on Climate Change (IPCC) assessment report (AR5) presenting 392 energy scenarios aligned with the 2 C climate stabilization goal.
The study points out that because energy scenarios tend to vary widely in terms of the projections they produce for a given policy and the degree of uncertainty associated with those projections, it's not advisable to base an energy policy or investment decision on a single energy scenario. Taken collectively, however, energy scenarios can help bring into sharp focus a range of plausible futures—information decision-makers can use to assess the scale and cost of the technological changes needed to effect significant transformations in energy production and consumption. A careful review of multiple energy scenarios associated with a particular emissions pathway can provide a qualitative analysis of what's driving the results and the potential risks and benefits of a proposed policy or investment.
That said, projections in energy scenarios can sometimes be highly inaccurate due to factors that are difficult to anticipate.
For example, according to the study, which compared several energy scenario projections to historical observations, most energy scenarios do not account for sudden changes to the status quo. One of the greatest contributors to uncertainty in energy scenarios is the demand for low-emitting energy technologies, whose timing and scale of deployment—dependent on several economic and political factors—are highly unpredictable. Paltsev notes that the IEA consistently underestimates renewable energy deployment; in its 2006 World Energy Outlook, the agency projected for 2020 a level of wind power generation that the world exceeded as early as 2013.
In addition, while energy scenarios have been largely successful in projecting the quantity of total energy demand (e.g., the 1994 IEA World Energy Outlook's projection for 2010 was off by only 10 percent, despite highly disruptive developments such as the breakup of the Soviet Union, the world recession in 2008, and the emergence of the shale gas industry), most have been considerably off the mark when it comes to projecting energy prices (e.g., in 1993 dollars, the 1994 IEA WEO projected $28/barrel in 2010, but the actual price was $53/barrel).
Recognizing the steep challenge in projecting demand and prices for different energy sources in the midst of a dramatic energy transition, Paltsev emphasizes that governments should not try to pick a "winner"—a single energy technology that seems poised to reduce emissions singlehandedly—but rather adopt a strategy that targets emissions reductions from any energy source.
"Governments shouldn't pick the winners, because most likely that choice will be wrong," he says. "They should instead design policies such as carbon-pricing and emissions trading systems that are designed to achieve emissions reduction targets at the least cost."

Credit: Mark Dwortzan
The strength of real hair inspires new materials for body armor

Researchers at the University of California San Diego investigate why hair is incredibly strong and resistant to breaking. Credit: iStock.com/natevplas
In a new study, researchers at the University of California San Diego investigate why hair is incredibly strong and resistant to breaking. The findings could lead to the development of new materials for body armor and help cosmetic manufacturers create better hair care products.
Hair has a strength to weight ratio comparable to steel. It can be stretched up to one and a half times its original length before breaking. "We wanted to understand the mechanism behind this extraordinary property," said Yang (Daniel) Yu, a nano-engineering Ph.D. student at UC San Diego and the first author of the study.
"Nature creates a variety of interesting materials and architectures in very ingenious ways. We're interested in understanding the correlation between the structure and the properties of biological materials to develop synthetic materials and designs—based on nature—that have better performance than existing ones," said Marc Meyers, a professor of mechanical engineering at the UC San Diego Jacobs School of Engineering and the lead author of the study.
In a study published online in December in the journal Materials Science and Engineering C, researchers examined at the nano-scale level how a strand of human hair behaves when it is deformed, or stretched. The team found that hair behaves differently depending on how fast or slow it is stretched. The faster hair is stretched, the stronger it is. "Think of a highly viscous substance like honey," Meyers explained. "If you deform it fast it becomes stiff, but if you deform it slowly it readily pours."
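The rate dependence Meyers describes is the signature of a viscoelastic material. As a rough illustration only (a single Maxwell element, i.e., a spring in series with a dashpot, with arbitrary parameter values; this is not the study's model of hair), the stress reached at a fixed strain grows with the strain rate:

```python
import math

# Toy Maxwell model (spring of modulus E in series with a dashpot of
# viscosity eta). Pulled at a constant strain rate r, the stress at strain
# eps is sigma = E * r * tau * (1 - exp(-eps / (r * tau))), with tau = eta / E.
# All parameter values are arbitrary and purely illustrative.
E = 2.0e9      # elastic modulus, Pa
eta = 1.0e9    # viscosity, Pa*s
tau = eta / E  # relaxation time, s

def stress_at_strain(eps, strain_rate):
    """Stress of a Maxwell element stretched at a constant strain rate."""
    return E * strain_rate * tau * (1.0 - math.exp(-eps / (strain_rate * tau)))

eps = 0.2  # 20% strain
for rate in (1e-3, 1e-1, 10.0):  # slow, medium and fast stretching, in 1/s
    print(f"strain rate {rate:g} /s -> stress {stress_at_strain(eps, rate):.3e} Pa")
# Faster stretching gives higher stress at the same strain, mirroring the
# honey analogy: deform it fast and it behaves stiff; deform it slowly and it flows.
```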
Hair consists of two main parts—the cortex, which is made up of parallel fibrils, and the matrix, which has an amorphous (random) structure. The matrix is sensitive to the speed at which hair is deformed, while the cortex is not. The combination of these two components, Yu explained, is what gives hair the ability to withstand high stress and strain.
And as hair is stretched, its structure changes in a particular way. At the nano-scale, the cortex fibrils in hair are each made up of thousands of coiled spiral-shaped chains of molecules called alpha helix chains. As hair is deformed, the alpha helix chains uncoil and become pleated sheet structures known as beta sheets. This structural change allows hair to handle a large amount of deformation without breaking.
This structural transformation is partially reversible. When hair is stretched under a small amount of strain, it can recover its original shape. Stretched further, the structural transformation becomes irreversible. "This is the first time evidence for this transformation has been discovered," Yu said.
"Hair is such a common material with many fascinating properties," said Bin Wang, a UC San Diego PhD alumna and co-author on the paper. Wang is now at the Shenzhen Institutes of Advanced Technology in China continuing research on hair.
The team also conducted stretching tests on hair at different humidity levels and temperatures. At higher humidity levels, hair can withstand up to 70 to 80 percent deformation before breaking. Water essentially "softens" hair—it enters the matrix and breaks the sulfur bonds connecting the filaments inside a strand of hair. Researchers also found that hair starts to undergo permanent damage at 60 degrees Celsius (140 degrees Fahrenheit). Beyond this temperature, hair breaks faster at lower stress and strain.
"Since I was a child I always wondered why hair is so strong. Now I know why," said Wen Yang, a former postdoctoral researcher in Meyers' research group and co-author on the paper.
The team is currently conducting further studies on the effects of water on the properties of hair. Moving forward, the team is investigating the detailed mechanism of how washing hair causes it to return to its original shape.
Second-generation stars identified, giving clues about their predecessors

The figure shows a sub-population of ancient stars, called Carbon-Enhanced Metal-Poor (CEMP) stars. These stars contain 100 to 1,000,000 times LESS iron (and other heavy elements) than the Sun, but 10 to 10,000 times MORE carbon, relative to iron. The unusual chemical compositions of these stars provide clues to their birth environments, and the nature of the stars in which the carbon formed. In the figure, A(C) is the absolute amount of carbon, while the horizontal axis represents the ratio of iron, relative to hydrogen, compared with the same ratio in the Sun. Credit: University of Notre Dame
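For readers unfamiliar with the axis labels, the standard abundance notation used in the figure is

$$\mathrm{[Fe/H]} \equiv \log_{10}\!\left(\frac{N_{\rm Fe}}{N_{\rm H}}\right)_{\!\star} - \log_{10}\!\left(\frac{N_{\rm Fe}}{N_{\rm H}}\right)_{\!\odot}, \qquad A(\mathrm{C}) \equiv \log_{10}\!\left(\frac{N_{\rm C}}{N_{\rm H}}\right) + 12,$$

so, for example, [Fe/H] = -3 means the star's iron-to-hydrogen ratio is one thousandth of the Sun's.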
University of Notre Dame astronomers have identified what they believe to be the second generation of stars, shedding light on the nature of the universe's first stars.
A subclass of carbon-enhanced metal-poor (CEMP) stars, the so-called CEMP-no stars, are ancient stars that contain large amounts of carbon but little of the heavy metals (such as iron) common to later-generation stars. Massive first-generation stars made up of pure hydrogen and helium produced the first heavier elements, which were ejected by stellar winds during their lifetimes or when they exploded as supernovae. Those metals—anything heavier than helium, in astronomical parlance—polluted the nearby gas clouds from which new stars formed.
Jinmi Yoon, a postdoctoral research associate in the Department of Physics; Timothy Beers, the Notre Dame Chair in Astrophysics; and Vinicius Placco, a research professor at Notre Dame, along with their collaborators, show in findings published in the Astrophysical Journal this week that the lowest metallicity stars, the most chemically primitive, include large fractions of CEMP stars. The CEMP-no stars, which are also rich in nitrogen and oxygen, are likely the stars born out of hydrogen and helium gas clouds that were polluted by the elements produced by the universe's first stars.
"The CEMP-no stars we see today, at least many of them, were born shortly after the Big Bang, 13.5 billion years ago, out of almost completely unpolluted material," Yoon says. "These stars, located in the halo system of our galaxy, are true second-generation stars—born out of the nucleosynthesis products of the very first stars."
Beers says it's unlikely that any of the universe's first stars still exist, but much can be learned about them from detailed studies of the next generation of stars.
"We're analyzing the chemical products of the very first stars by looking at what was locked up by the second-generation stars," Beers says. "We can use this information to tell the story of how the first elements were formed, and determine the distribution of the masses of those first stars. If we know how their masses were distributed, we can model the process of how the first stars formed and evolved from the very beginning."
The authors used high-resolution spectroscopic data gathered by many astronomers to measure the chemical compositions of about 300 stars in the halo of the Milky Way. More and heavier elements form as later generations of stars continue to contribute additional metals, they say. As new generations of stars are born, they incorporate the metals produced by prior generations. Hence, the more heavy metals a star contains, the more recently it was born. Our sun, for example, is relatively young, with an age of only 4.5 billion years.
A companion paper, titled "Observational constraints on first-star nucleosynthesis. II. Spectroscopy of an ultra metal-poor CEMP-no star," of which Placco was the lead author, was also published in the same issue of the journal this week. The paper compares theoretical predictions for the chemical composition of zero-metallicity supernova models with a newly discovered CEMP-no star in the Milky Way galaxy.

Credit: Brian Wallheimer
A Swiss firm acquires Mars One private project

Mars One consists of two entities: the Dutch not-for-profit Mars One Foundation and a British public limited company Mars One Ventures
A British-Dutch project aiming to send an unmanned mission to Mars by 2018 announced Friday that the shareholders of a Swiss financial services company have agreed a takeover bid.
"The acquisition is now only pending approval by the board of Mars One Ventures," the company said in a joint statement with InFin Innovative Finance AG, adding approval from the Mars board would come "as soon as possible."
"The takeover provides a solid path to funding the next steps of Mars One's mission to establish a permanent human settlement on Mars," the statement added.
Mars One consists of two entities: the Dutch not-for-profit Mars One Foundation and a British public limited company Mars One Ventures.
Mars One aims to establish a permanent human settlement on the Red Planet, and is currently "in the early mission concept phase," the company says, adding securing funding is one of its major challenges.
Some 200,000 hopefuls from 140 countries initially signed up for the Mars One project, which is to be partly funded by a television reality show about the endeavour.
Those hopefuls have now been whittled down to just 100, out of whom 24 will be selected for one-way trips to Mars due to start in 2026, after several unmanned missions have been completed.
"Once this deal is completed, we'll be in a much stronger financial position as we begin the next phase of our mission. Very exciting times," said Mars One chief executive Bas Lansdorp.
NASA is currently working on three Mars missions with the European Space Agency and plans to send another rover to Mars in 2020.
But NASA has no plans for a manned mission to Mars until the 2030s.
First signs of a weird quantum property of empty space?

This artist's view shows how the light coming from the surface of a strongly magnetic neutron star (left) becomes linearly polarised as it travels through the vacuum of space close to the star on its way to the observer on Earth (right).
By studying the light emitted from an extraordinarily dense and strongly magnetized neutron star using ESO's Very Large Telescope, astronomers may have found the first observational indications of a strange quantum effect, first predicted in the 1930s. The polarization of the observed light suggests that the empty space around the neutron star is subject to a quantum effect known as vacuum birefringence.
A team led by Roberto Mignani from INAF Milan (Italy) and from the University of Zielona Gora (Poland), used ESO's Very Large Telescope (VLT) at the Paranal Observatory in Chile to observe the neutron star RX J1856.5-3754, about 400 light-years from Earth.
Despite being amongst the closest neutron stars, its extreme dimness meant the astronomers could only observe the star with visible light using the FORS2 instrument on the VLT, at the limits of current telescope technology.
Neutron stars are the very dense remnant cores of massive stars—at least 10 times more massive than our Sun—that have exploded as supernovae at the ends of their lives. They also have extreme magnetic fields, billions of times stronger than that of the Sun, that permeate their outer surface and surroundings.
These fields are so strong that they even affect the properties of the empty space around the star. Normally a vacuum is thought of as completely empty, and light can travel through it without being changed. But in quantum electrodynamics (QED), the quantum theory describing the interaction between photons and charged particles such as electrons, space is full of virtual particles that appear and vanish all the time. Very strong magnetic fields can modify this space so that it affects the polarisation of light passing through it.
Mignani explains: "According to QED, a highly magnetised vacuum behaves as a prism for the propagation of light, an effect known as vacuum birefringence."
Among the many predictions of QED, however, vacuum birefringence so far lacked a direct experimental demonstration. Attempts to detect it in the laboratory have not yet succeeded in the 80 years since it was predicted in a paper by Werner Heisenberg (of uncertainty principle fame) and Hans Heinrich Euler.
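For orientation, the weak-field result usually quoted from the Euler-Heisenberg Lagrangian (a standard textbook sketch, not taken from the paper itself) gives the difference between the refractive indices seen by the two polarisation states of light crossing a magnetised vacuum as roughly

$$\Delta n \;\simeq\; \frac{\alpha}{30\pi}\left(\frac{B}{B_{\rm cr}}\right)^{2}\sin^{2}\theta, \qquad B_{\rm cr} = \frac{m_e^{2}c^{3}}{e\hbar} \approx 4.4\times10^{13}\ {\rm G},$$

where α is the fine-structure constant and θ is the angle between the light's path and the magnetic field. Only fields approaching B_cr, such as those of strongly magnetised neutron stars, make Δn appreciable, which is why laboratory fields have so far been far too weak to reveal the effect.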
This wide field image shows the sky around the very faint neutron star RX J1856.5-3754 in the southern constellation of Corona Australis. This part of the sky also contains interesting regions of dark and bright nebulosity surrounding the …
"This effect can be detected only in the presence of enormously strong magnetic fields, such as those around neutron stars. This shows, once more, that neutron stars are invaluable laboratories in which to study the fundamental laws of nature." says Roberto Turolla (University of Padua, Italy).
After careful analysis of the VLT data, Mignani and his team detected linear polarisation—at a significant degree of around 16%—that they say is likely due to the boosting effect of vacuum birefringence occurring in the region of empty space surrounding RX J1856.5-3754.
Vincenzo Testa (INAF, Rome, Italy) comments: "This is the faintest object for which polarisation has ever been measured. It required one of the largest and most efficient telescopes in the world, the VLT, and accurate data analysis techniques to enhance the signal from such a faint star."
"The high linear polarisation that we measured with the VLT can't be easily explained by our models unless the vacuum birefringence effects predicted by QED are included," adds Mignani.
"This VLT study is the very first observational support for predictions of these kinds of QED effects arising in extremely strong magnetic fields," remarks Silvia Zane (UCL/MSSL, UK).
Mignani is excited about further improvements to this area of study that could come about with more advanced telescopes: "Polarisation measurements with the next generation of telescopes, such as ESO's European Extremely Large Telescope, could play a crucial role in testing QED predictions of vacuum birefringence effects around many more neutron stars."
"This measurement, made for the first time now in visible light, also paves the way to similar measurements to be carried out at X-ray wavelengths," adds Kinwah Wu (UCL/MSSL, UK).
This research was presented in the paper entitled "Evidence for vacuum birefringence from the first optical polarimetry measurement of the isolated neutron star RX J1856.5−3754", by R. Mignani et al., to appear in Monthly Notices of the Royal Astronomical Society.
Combining quantum physics and photosynthesis to make a discovery that could lead to highly efficient solar cells

In a light harvesting quantum photocell, particles of light (photons) can efficiently generate electrons. When two absorbing channels are used, solar power entering the system through the two absorbers (a and b) efficiently generates power …
A University of California, Riverside assistant professor has combined photosynthesis and physics to make a key discovery that could help make solar cells more efficient. The findings were recently published in the journal Nano Letters.
Nathan Gabor is focused on experimental condensed matter physics, and uses light to probe the fundamental laws of quantum mechanics. But, he got interested in photosynthesis when a question popped into his head in 2010: Why are plants green? He soon discovered that no one really knows.
During the past six years, he sought to help change that by combining his background in physics with a deep dive into biology.
He set out to re-think solar cell design by asking the question: can we make materials for solar cells that more efficiently absorb the fluctuating amount of energy from the sun? Plants have evolved to do this, but current affordable solar cells - which are at best 20 percent efficient - do not control these sudden changes in solar power, Gabor said. That results in a lot of wasted energy and helps prevent wide-scale adoption of solar cells as an energy source.
Gabor, and several other UC Riverside physicists, addressed the problem by designing a new type of quantum photocell, which helps manipulate the flow of energy in solar cells. The design incorporates a heat engine photocell that absorbs photons from the sun and converts the photon energy into electricity.
Surprisingly, the researchers found that the quantum heat engine photocell could regulate solar power conversion without requiring active feedback or adaptive control mechanisms. In conventional photovoltaic technology, which is used on rooftops and solar farms today, fluctuations in solar power must be suppressed by voltage converters and feedback controllers, which dramatically reduce the overall efficiency.
Nathan Gabor's Laboratory of Quantum Materials Optoelectronics utilizes infrared laser spectroscopy techniques to explore natural regulation in quantum photocells composed of two-dimensional semiconductors. Credit: Max Grossnickle and QMO Lab
The goal of the UC Riverside teams was to design the simplest photocell that matches the amount of solar power from the sun as close as possible to the average power demand and to suppress energy fluctuations to avoid the accumulation of excess energy.
The researchers compared the two simplest quantum mechanical photocell systems: one in which the photocell absorbed only a single color of light, and the other in which the photocell absorbed two colors. They found that by simply incorporating two photon-absorbing channels, rather than only one, the regulation of energy flow emerges naturally within the photocell.
The basic operating principle is that one channel absorbs at a wavelength for which the average input power is high, while the other absorbs at low power. The photocell switches between high and low power to convert varying levels of solar power into a steady-state output.
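A toy classical illustration of that switching idea (all numbers, thresholds and the selection rule below are invented for the sketch; this is not the authors' quantum heat engine model) might look like this in Python:

```python
import random

random.seed(0)
TARGET = 10.0  # desired steady output (arbitrary units)

# Instantaneous power arriving in two spectral bands: band a is usually strong
# but dips, band b is usually weak but spikes (all numbers invented for the toy).
band_a = [max(0.0, random.gauss(12, 4)) for _ in range(1000)]
band_b = [max(0.0, random.gauss(6, 4)) for _ in range(1000)]

def single_channel(a, b):
    # One absorber: it only sees band a, so its output follows every fluctuation.
    return a

def two_channel(a, b):
    # Two absorbers: at each instant draw from whichever band is currently
    # closer to the target, which keeps the delivered power steadier.
    return a if abs(a - TARGET) < abs(b - TARGET) else b

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

for name, cell in (("single channel", single_channel), ("two channel", two_channel)):
    out = [cell(a, b) for a, b in zip(band_a, band_b)]
    print(f"{name}: mean {sum(out)/len(out):.2f}, variance {variance(out):.2f}")
```

In this crude sketch the two-channel cell's output clusters near the target and its variance comes out well below that of the single absorber, which is the qualitative point of the regulation argument.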
When Gabor's team applied these simple models to the measured solar spectrum on Earth's surface, they discovered that the absorption of green light, the most radiant portion of the spectrum per unit wavelength, provides no regulatory benefit and should therefore be avoided. They systematically optimized the photocell parameters to reduce solar energy fluctuations, and found that the absorption spectrum looks nearly identical to the absorption spectrum observed in photosynthetic green plants.
The findings led the researchers to propose that natural regulation of energy they found in the quantum heat engine photocell may play a critical role in the photosynthesis in plants, perhaps explaining the predominance of green plants on Earth.
Other researchers have recently found that several molecular structures in plants, including chlorophyll a and b molecules, could be critical in preventing the accumulation of excess energy in plants, which could kill them. The UC Riverside researchers found that the molecular structure of the quantum heat engine photocell they studied is very similar to the structure of photosynthetic molecules that incorporate pairs of chlorophyll.
The hypothesis set out by Gabor and his team is the first to connect quantum mechanical structure to the greenness of plants, and provides a clear set of tests for researchers aiming to verify natural regulation. Equally important, their design allows regulation without active input, a process made possible by the photocell's quantum mechanical structure.

The paper is called "Natural Regulation of Energy Flow in a Green Quantum Photocell."

Credit: Sean Nealon
Scientists find that solar cells can be made with tin instead of lead


Solar power could become cheaper and more widespread
Credit: University of Warwick
A breakthrough in solar power could make it cheaper and more commercially viable, thanks to research at the University of Warwick.
In a paper published in Nature Energy, Dr Ross Hatton, Professor Richard Walton and colleagues, explain how solar cells could be produced with tin, making them more adaptable and simpler to produce than their current counterparts.
Solar cells based on a class of semiconductors known as lead perovskites are rapidly emerging as an efficient way to convert sunlight directly into electricity. However, the reliance on lead is a serious barrier to commercialisation, due to the well-known toxicity of lead.
Dr Ross Hatton and colleagues show that perovskites using tin in place of lead are much more stable than previously thought, and so could prove to be a viable alternative to lead perovskites for solar cells.
Lead-free cells could make solar power cheaper, safer and more commercially attractive - leading to it becoming a more prevalent source of energy in everyday life.
This could lead to a more widespread use of solar power, with potential uses in products such as laptop computers, mobile phones and cars.
The team have also shown how the device structure can be greatly simplified without compromising performance, which offers the important advantage of reduced fabrication cost.
Dr Hatton comments that there is an ever-pressing need to develop renewable sources of energy:
"It is hoped that this work will help to stimulate an intensive international research effort into lead-free perovskite solar cells, like that which has resulted in the astonishingly rapid advancement of perovskite solar cells.
"There is now an urgent need to tackle the threat of climate change resulting from humanity's over reliance on fossil fuel, and the rapid development of new solar technologies must be part of the plan."
Perovskite solar cells are lightweight and compatible with flexible substrates, so could be applied more widely than the rigid flat-plate silicon panels that currently dominate the photovoltaics market, particularly in consumer electronics and transportation applications.
The paper, 'Enhanced Stability and Efficiency in Hole-Transport Layer Free CsSnI3 Perovskite Photovoltaics', is published in Nature Energy, and is authored by Dr Ross Hatton, Professor Richard Walton and PhD student Kenny Marshall in the Department of Chemistry, along with Dr Marc Walker in the Department of Physics.

Best weather satellite ever built is launched into space

This photo provided by United Launch Alliance shows a United Launch Alliance (ULA) Atlas V rocket carrying GOES-R spacecraft for NASA and NOAA lifting off from Space Launch Complex-41 at 6:42 p.m. EST at Cape Canaveral Air Force Station, Fla., Saturday, Nov. 19, 2016. The most advanced weather satellite ever built rocketed into space Saturday night, part of an $11 billion effort to revolutionize forecasting and save lives. (United Launch Alliance via AP)  
The most advanced weather satellite ever built rocketed into space Saturday night, part of an $11 billion effort to revolutionize forecasting and save lives.
This new GOES-R spacecraft will track U.S. weather as never before: hurricanes, tornadoes, flooding, wildfires, lightning storms, even solar flares. Indeed, about 50 TV meteorologists from around the country converged on the launch site—including NBC's Al Roker—along with 8,000 space program workers and guests.
"What's so exciting is that we're going to be getting more data, more often, much more detailed, higher resolution," Roker said. In the case of tornadoes, "if we can give people another 10, 15, 20 minutes, we're talking about lives being saved."
Think superhero speed and accuracy for forecasting. Super high-definition TV, versus black-and-white.
"Really a quantum leap above any NOAA has ever flown," said Stephen Volz, the National Oceanic and Atmospheric Administration's director of satellites.
"For the American public, that will mean faster, more accurate weather forecasts and warnings," Volz said earlier in the week. "That also will mean more lives saved and better environmental intelligence" for government officials responsible for hurricane and other evacuations.
Cell phones light up the beaches of Cape Canaveral and Cocoa Beach, Fla., north of the Cocoa Beach Pier as spectators watch the launch of the NOAA GOES-R weather satellite, Saturday, Nov. 19, 2016. It was launched from Launch Complex 41 at Cape Canaveral Air Force Station on a ULA Atlas V rocket. (Malcolm Denemark/Florida Today via AP)
Airline passengers also stand to benefit, as do rocket launch teams. Improved forecasting will help pilots avoid bad weather and help rocket scientists know when to call off a launch.
NASA declared success 3 1/2 hours after liftoff, following separation from the upper stage.
The first in a series of four high-tech satellites, GOES-R hitched a ride on an unmanned Atlas V rocket, delayed an hour by rocket and other problems. NOAA teamed up with NASA for the mission.
The satellite—valued by NOAA at $1 billion—is aiming for a 22,300-mile-high equatorial orbit. There, it will join three aging spacecraft with 40-year-old technology, and become known as GOES-16. After months of testing, this newest satellite will take over for one of the older ones. The second satellite in the series will follow in 2018. All told, the series should stretch to 2036.
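The "22,300-mile-high" figure is simply the geostationary altitude, which follows from requiring the orbital period to equal one sidereal day; a quick check with standard constants:

```python
import math

# Geostationary orbit: orbital period equals one sidereal day, so
# r = (GM * T^2 / (4 * pi^2))^(1/3), measured from Earth's centre.
GM = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
T = 86164.1              # sidereal day, s
R_EARTH = 6378.1e3       # equatorial radius, m

r = (GM * T**2 / (4 * math.pi**2)) ** (1.0 / 3.0)
altitude_km = (r - R_EARTH) / 1000.0
altitude_miles = altitude_km * 0.621371

print(f"altitude ~ {altitude_km:,.0f} km ~ {altitude_miles:,.0f} miles")
# ~35,786 km, i.e. roughly the 22,300 miles quoted for GOES-R's target orbit.
```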
An Atlas V rocket lifts off from Complex 41 at Cape Canaveral Air Force Station, Fla., Saturday evening, Nov. 19, 2016. The rocket is carrying the GOES-R weather satellite. The most advanced weather satellite ever built rocketed into space Saturday night, part of an $11 billion effort to revolutionize forecasting and save lives. (Craig Bailey/Florida Today via AP)
GOES stands for Geostationary Operational Environmental Satellite. The first was launched in 1975.
GOES-R's premier imager—one of six science instruments—will offer three times as many channels as the existing system, four times the resolution and five times the scan speed, said NOAA program director Greg Mandt. A similar imager is also flying on a Japanese weather satellite.
Typically, it will churn out full images of the Western Hemisphere every 15 minutes and the continental United States every five minutes. Specific storm regions will be updated every 30 seconds.
Forecasters will get pictures "like they've never seen before," Mandt promised.
A first-of-its-kind lightning mapper, meanwhile, will take 500 snapshots a second.
This next-generation GOES program—$11 billion in all—includes four satellites, an extensive land system of satellite dishes and other equipment, and new methods for crunching the massive, nonstop stream of expected data.
Hurricane Matthew, interestingly enough, delayed the launch by a couple weeks. As the hurricane bore down on Florida in early October, launch preps were put on hold. Matthew stayed far enough offshore to cause minimal damage to Cape Canaveral, despite some early forecasts that suggested a direct strike.
Credit: Marcia Dunn
A suit-X trio designed to support workers: Meet MAX

(Tech Xplore)—Not all of us park our bodies in a chair in the morning and cross our legs to do our work. In fact, just think of vast numbers of workers doing physically demanding or just physically repetitive tasks including bending and lifting.
Workers on construction sites, in factories and in warehouses often cope with aches and pains brought on by their work. Hopefully, the future will provide an easy way for such workers to suit up and avoid those aches and pains.
There is a new kid on the block aiming to provide such a solution, and a number of tech watchers have put it in the news this month. A California-based group aptly called suitX announced its MAX, which stands for Modular Agile Exoskeleton. The company designs and makes exoskeletons.
"MAX is designed to support workers during the repetitive tasks that most frequently cause injury," said a company release.
Will Knight in MIT Technology Review said that this essentially is "a trio of devices that use robotic technologies to enhance the abilities of able-bodied workers and prevent common workplace injuries."
Target users, for example, could include those who carry out ceiling inspections, welding, installations and repairs. "It's not only lifting 75 pounds that can hurt your back; it is also lifting 20 pounds repeatedly throughout the day that will lead to injury," said Dr. Homayoon Kazerooni, founder and CEO of suitX. "The MAX solution is designed for unstructured workplaces where no robot can work as efficiently as a human worker. Our goal is to augment and support workers who perform demanding and repetitive tasks in unstructured workplaces in order to prevent and reduce injuries."
Seeker referred to the MAX system as an exoskeleton device that could potentially change the way millions of people work.
Seeker noted the system's advantages as a workplace exoskeleton, including that it is lightweight enough for the user to walk around unimpeded. "The exoskeleton units kick in only when you need them, and they don't require any external power source."
MAX is a product with three modules. You use them independently or in combination, depending on work needs. The three modules are backX, shoulderX, and legX.
According to the company, "All modules intelligently engage when you need them, and don't impede you otherwise."
The backX (lower back) reduces forces and torques.
The shoulderX reduces forces; it "enables the wearer to perform chest-to-ceiling level tasks for longer periods of time." In a video the company defines shoulderX as "an industrial arm exoskeleton that augments its wearer by reducing gravity-induced forces at the shoulder complex."
The legX was designed to support the knee joint and quadriceps. It incorporates microcomputers in each leg that communicate with each other to determine if the person is walking, bending, or taking the stairs. Seeker said these communicate via Bluetooth, monitoring spacing and position.
Credit: suitX
Kazerooni spoke about his company and its mission to Seeker. "My job is easy. I sit in front of a computer. But these guys work all day long, put their bodies through abuse. We can use bionics to help them." He also said he and his team did not create this "because of science fiction movies. We were responding to numbers from the Department of Labor, which said that back, knee and shoulder injuries are the most common form of injuries among workers."
Will Knight meanwhile has reflected on the bigger picture in developments. Can they help in preventing injury on the job and help prolong workers' careers? "New materials, novel mechanical designs, and cheaper actuators and motors have enabled a new generation of cheaper, more lightweight exoskeletons to emerge in recent years," he wrote. "For instance, research groups at Harvard and SRI are developing systems that are passive and use soft, lightweight materials."
Some companies, such as BMW, said Knight, have been experimenting with exoskeletons. "The MAX is another (bionic) step toward an augmented future of work."

Credit: Nancy Owano
New 'smart metal' technology to keep bridge operational in next big quake

A bridge that bends in a strong earthquake and not only remains standing but remains usable is making its debut in its first real-world application, as part of a new exit ramp on a busy downtown Seattle highway.
"We've tested new materials, memory retaining metal rods and flexible concrete composites, in a number of bridge model studies in our large-scale shake table lab, it's gratifying to see the applied for the first time in an important setting in a seismically active area with heavy traffic loads," Saiid Saiidi, civil engineering professor and researcher at the University of Nevada, Reno, said. "Using these materials substantially reduces damage and allows the bridge to remain open even after a strong earthquake."
Saiidi, who pioneered this technology, has built and destroyed, in the lab, several large-scale 200-ton bridges, single bridge columns and concrete abutments using various combinations of innovative materials, replacements for the standard steel rebar and concrete materials and design in his quest for a safer, more resilient infrastructure.
"We have solved the problem of survivability, we can keep a bridge usable after a ," Saiidi said. "With these techniques and materials, we will usher in a new era of super earthquake-resilient structures."
The University partnered with the Washington Department of Transportation and the Federal Highway Administration to implement this new technology on their massive Alaska Way Viaduct Replacement Program, the centerpiece of which is a two-mile long tunnel, but includes 31 separate projects that began in 2007 along the State Route 99 corridor through downtown Seattle.
"This is potentially a giant leap forward," Tom Baker, bridge and structures engineer for the Washington State Department of Transportation, said. "We design for no-collapse, but in the future, we could be designing for no-damage and be able to keep bridges open to emergency vehicles, commerce and the public after a strong quake."
Modern bridges are designed to not collapse during an earthquake, and this new technology takes that design a step further. In the earthquake lab tests, bridge columns built using memory-retaining nickel/titanium rods and a flexible concrete composite returned to their original shape after an earthquake as strong as a magnitude 7.5.
"The tests we've conducted on 4-span bridges leading to this point aren't possible anywhere else in the world than our large-scale structures and earthquake engineering lab," Saiidi said. "We've had great support along the way from many state highway departments and funding agencies like the National Science Foundation, the Federal Highway Administration and the U.S. Department of Transportation. Washington DOT recognized the potential of this technology and understands the need to keep infrastructure operating following a large earthquake."
In an experiment in 2015, featured in a video, one of Saiidi's bridge columns moved more than six inches off center at the base and returned to its original position, as designed, in an upright and stable position. Using the computer-controlled hydraulics, the earthquake engineering lab can increase the intensity of the recorded earthquake. Saiidi turned the dial up to 250 percent of the design parameters and still had excellent results.
"It had an incredible 9 percent drift with little damage," Saiidi said.
The Seattle off-ramp with the innovative columns is currently under construction and scheduled for completion in spring 2017. After the new SR 99 tunnel opens, this ramp, just south of the tunnel entrance, will take northbound drivers from SR 99 to Seattle's SODO neighborhood.
A new WSDOT video describes how this innovative technology works.
"Dr. Saiidi sets the mark for the level of excellence to which the College of Engineering aspires," Manos Maragakis, dean of the University's College of Engineering, said. "His research is original and innovative and has made a seminal contribution to seismic safety around the globe."
Using drones and insect biobots to map disaster areas

Credit: North Carolina State University  
Researchers at North Carolina State University have developed a combination of software and hardware that will allow them to use unmanned aerial vehicles (UAVs) and insect cyborgs, or biobots, to map large, unfamiliar areas – such as collapsed buildings after a disaster.
"The idea would be to release a swarm of sensor-equipped biobots – such as remotely controlled cockroaches – into a collapsed building or other dangerous, unmapped area," says Edgar Lobaton, an assistant professor of electrical and computer engineering at NC State and co-author of two papers describing the work.
"Using remote-control technology, we would restrict the movement of the biobots to a defined area," Lobaton says. "That area would be defined by proximity to a beacon on a UAV. For example, the biobots may be prevented from going more than 20 meters from the UAV."
The biobots would be allowed to move freely within a defined area and would signal researchers via radio waves whenever they got close to each other. Custom software would then use an algorithm to translate the biobot sensor data into a rough map of the unknown environment.
Once the program receives enough data to map the defined area, the UAV moves forward to hover over an adjacent, unexplored section. The biobots move with it, and the mapping process is repeated. The software program then stitches the new map to the previous one. This can be repeated until the entire region or structure has been mapped; that map could then be used by first responders or other authorities.
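The published papers describe the actual mapping algorithms; purely to give a flavour of the encounter-based idea, the sketch below (with invented parameters and a deliberately crude map-building rule, not the NC State method) simulates random-walking agents confined near a beacon, logs which pairs come within signalling range, and marks the grid cells where encounters accumulate as traversable space:

```python
import random
from collections import defaultdict

# Toy sketch: agents random-walk inside a beacon-limited disc; whenever two
# agents come within "radio" range of each other, an encounter is logged at
# their midpoint. Grid cells that accumulate encounters are treated as
# traversable space. All parameters are invented for illustration.
random.seed(1)
BEACON_RADIUS = 20.0   # metres agents may stray from the UAV beacon
ENCOUNTER_RANGE = 1.0  # metres at which two biobots "hear" each other
GRID = 2.0             # metres per map cell
N_AGENTS, N_STEPS = 30, 2000

agents = [(0.0, 0.0) for _ in range(N_AGENTS)]
encounter_counts = defaultdict(int)

def step(pos):
    x, y = pos
    nx, ny = x + random.uniform(-0.5, 0.5), y + random.uniform(-0.5, 0.5)
    # The beacon constraint: agents that would wander too far stay put.
    if (nx * nx + ny * ny) ** 0.5 > BEACON_RADIUS:
        return pos
    return (nx, ny)

for _ in range(N_STEPS):
    agents = [step(p) for p in agents]
    for i in range(N_AGENTS):
        for j in range(i + 1, N_AGENTS):
            (x1, y1), (x2, y2) = agents[i], agents[j]
            if ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5 < ENCOUNTER_RANGE:
                cell = (round((x1 + x2) / 2 / GRID), round((y1 + y2) / 2 / GRID))
                encounter_counts[cell] += 1

mapped_cells = {c for c, n in encounter_counts.items() if n >= 3}
print(f"coarse map contains {len(mapped_cells)} cells marked traversable")
```

Moving the beacon and repeating the process would produce overlapping local maps that can then be stitched together, as the researchers describe.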
"This has utility for areas – like collapsed buildings – where GPS can't be used," Lobaton says. "A strong radio signal from the UAV could penetrate to a certain extent into a collapsed building, keeping the biobot swarm contained. And as long as we can get a signal from any part of the swarm, we are able to retrieve data on what the rest of the swarm is doing. Based on our experimental data, we know you're going to lose track of a few individuals, but that shouldn't prevent you from collecting enough data for mapping."
Co-lead author Alper Bozkurt, an associate professor of electrical and computer engineering at NC State, has previously developed functional cockroach biobots. However, to test their new mapping technology, the research team relied on inch-and-a-half-long robots that simulate cockroach behavior.
In their experiment, researchers released these robots into a maze-like space, with the effect of the UAV beacon emulated using an overhead camera and a physical boundary attached to a moving cart. The cart was moved as the robots mapped the area.
"We had previously developed proof-of-concept software that allowed us to map small areas with biobots, but this work allows us to map much larger areas and to stitch those maps together into a comprehensive overview," Lobaton says. "It would be of much more practical use for helping to locate survivors after a disaster, finding a safe way to reach survivors, or for helping responders determine how structurally safe a building may be.
"The next step is to replicate these experiments using biobots, which we're excited about."
An article on the framework for developing local maps and stitching them together, "A Framework for Mapping with Biobotic Insect Networks: From Local to Global Maps," is published in Robotics and Autonomous Systems. An article on the theory of mapping based on the proximity of mobile sensors to each other, "Geometric Learning and Topological Inference with Biobotic Networks," is published in IEEE Transactions on Signal and Information Processing over Networks.


Credit: Matt Shipman
How machine learning advances artificial intelligence

How machine learning advances artificial intelligence


Computers that learn for themselves are with us now. As they become more common in 'high-stakes' applications like robotic surgery, terrorism detection and driverless cars, researchers ask what can be done to make sure we can trust them.
There was always going to be a first death in a driverless car, and it happened in May 2016. Joshua Brown had engaged the autopilot system in his Tesla when a tractor-trailer drove across the road in front of him. It seems that neither he nor the autopilot's sensors noticed the white-sided truck against a brightly lit sky, with tragic results.
Of course many people die in car crashes every day – in the USA there is one fatality every 94 million miles, and according to Tesla this was the first known fatality in over 130 million miles of driving with activated autopilot. In fact, given that most road fatalities are the result of human error, it has been said that autonomous cars should make travelling safer.
Even so, the tragedy raised a pertinent question: how much do we understand – and trust – the computers in an autonomous vehicle? Or, in fact, in any machine that has been taught to carry out an activity that a human would do?
We are now in the era of machine learning. Machines can be trained to recognise certain patterns in their environment and to respond appropriately. It happens every time your digital camera detects a face and throws a box around it to focus, or the personal assistant on your smartphone answers a question, or the adverts match your interests when you search online.
Machine learning is a way to program computers to learn from experience and improve their performance in a way that resembles how humans and animals learn tasks. As machine learning techniques become more common in everything from finance to healthcare, the issue of trust is becoming increasingly important, says Zoubin Ghahramani, Professor of Information Engineering in Cambridge's Department of Engineering.
Faced with a life or death decision, would a driverless car decide to hit pedestrians, or avoid them and risk the lives of its occupants? Providing a medical diagnosis, could a machine be wildly inaccurate because it has based its opinion on a too-small sample size? In making financial transactions, should a computer explain how robust its assessment of stock market volatility is?
"Machines can now achieve near-human abilities at many cognitive tasks even if confronted with a situation they have never seen before, or an incomplete set of data," says Ghahramani. "But what is going on inside the 'black box'? If the processes by which decisions were being made were more transparent, then trust would be less of an issue."
His team builds the algorithms that lie at the heart of these technologies (the "invisible bit" as he refers to it). Trust and transparency are important themes in their work: "We really view the whole mathematics of machine learning as sitting inside a framework of understanding uncertainty. Before you see data – whether you are a baby learning a language or a scientist analysing some data – you start with a lot of uncertainty and then as you have more and more data you have more and more certainty.
"When machines make decisions, we want them to be clear on what stage they have reached in this process. And when they are unsure, we want them to tell us."
One method is to build in an internal self-evaluation or calibration stage so that the machine can test its own certainty, and report back.
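As a concrete illustration of that calibrate-and-report pattern, here is a minimal, generic sketch using scikit-learn. It is not the Cambridge group's code, and the 0.9 confidence threshold is an arbitrary assumption; the point is only that a calibrated model can state how sure it is and defer when it is not sure enough.

```python
# A minimal sketch of the "report back when unsure" idea: a classifier is
# calibrated on held-out data, and predictions below a confidence threshold
# are flagged for a human instead of being acted on. Generic illustration only.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.calibration import CalibratedClassifierCV

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Calibration step: make the reported probabilities match observed frequencies.
model = CalibratedClassifierCV(LogisticRegression(max_iter=1000), method="isotonic")
model.fit(X_train, y_train)

CONFIDENCE_THRESHOLD = 0.9   # assumed cut-off for acting without a human
for probs in model.predict_proba(X_test[:5]):
    confidence = probs.max()
    if confidence >= CONFIDENCE_THRESHOLD:
        print(f"predict class {probs.argmax()} (confidence {confidence:.2f})")
    else:
        print(f"unsure (confidence {confidence:.2f}) - defer to a human")
```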
Two years ago, Ghahramani's group launched the Automatic Statistician with funding from Google. The tool helps scientists analyse datasets for statistically significant patterns and, crucially, it also provides a report to explain how sure it is about its predictions.
"The difficulty with machine learning systems is you don't really know what's going on inside – and the answers they provide are not contextualised, like a human would do. The Automatic Statistician explains what it's doing, in a human-understandable form."
Transparency becomes especially relevant in applications like medical diagnosis, where understanding how a decision was reached is necessary to trust it.
Dr Adrian Weller, who works with Ghahramani, highlights the difficulty: "A particular issue with new (AI) systems that learn or evolve is that their processes do not clearly map to rational decision-making pathways that are easy for humans to understand." His research aims both at making these pathways more transparent, sometimes through visualisation, and at looking at what happens when systems are used in real-world scenarios that extend beyond their training environments – an increasingly common occurrence.
"We would like AI systems to monitor their situation dynamically, detect whether there has been a change in their environment and – if they can no longer work reliably – then provide an alert and perhaps shift to a safety mode." A , for instance, might decide that a foggy night in heavy traffic requires a human driver to take control.
Weller's theme of trust and transparency forms just one of the projects at the newly launched £10 million Leverhulme Centre for the Future of Intelligence (CFI). Ghahramani, who is Deputy Director of the Centre, explains: "It's important to understand how developing technologies can help rather than replace humans. Over the coming years, philosophers, social scientists, cognitive scientists and computer scientists will help guide the future of the technology and study its implications – both the concerns and the benefits to society."
CFI brings together four of the world's leading universities (Cambridge, Oxford, Berkeley and Imperial College, London) to explore the implications of AI for human civilisation. Together, an interdisciplinary community of researchers will work closely with policy-makers and industry investigating topics such as the regulation of autonomous weaponry, and the implications of AI for democracy.
Ghahramani describes the excitement felt across the field: "It's exploding in importance. It used to be an area of research that was very academic – but in the past five years people have realised these methods are incredibly useful across a wide range of societally important areas.
"We are awash with data, we have increasing computing power and we will see more and more applications that make predictions in real time. And as we see an escalation in what machines can do, they will challenge our notions of intelligence and make it all the more important that we have the means to trust what they tell us."
Artificial intelligence has the power to eradicate poverty and disease or hasten the end of human civilisation as we know it – according to a speech delivered by Professor Stephen Hawking on 19 October 2016 at the launch of the Centre for the Future of Intelligence.
