Nanolaser operation at room temperature enabled by a single molecular layer and a thin silicon beam
For the first time, researchers have built a nanolaser that uses a single molecular layer as its gain material. The layer is placed on a thin silicon beam, and the device operates at room temperature. The new device, developed by a team of researchers from Arizona State University and Tsinghua University in Beijing, China, could potentially be used to send information between different points on a single computer chip. The laser may also be useful for other sensing applications in a compact, integrated format.
"This is the first demonstration of room-temperature operation of a nanolaser made of the single-layer material," said Cun-Zheng Ning, an ASU electrical engineering professor who led the research team. Details of the new laser are published in the July online edition of Nature Nanotechnology.
In addition to Ning, key authors of the article, "Room-temperature Continuous-wave Lasing from Monolayer Molybdenum Ditelluride Integrated with a Silicon Nanobeam Cavity," include Yongzhuo Li, Jianxing Zhang, and Dandan Huang of Tsinghua University.
Ning said pivotal to the new development is the use of materials that can be laid down in single layers and that efficiently amplify light (lasing action). Single-layer nanolasers have been developed before, but they all had to be cooled to low temperatures using a cryogen such as liquid nitrogen or liquid helium. "Being able to operate at room temperature (~77 F) opens up many possibilities for uses of these new lasers," Ning said.
The joint ASU-Tsinghua research team used a monolayer of molybdenum ditelluride integrated with a silicon nanobeam cavity for their device. By combining molybdenum ditelluride with silicon, which is the bedrock of semiconductor manufacturing and one of the best waveguide materials, the researchers were able to achieve lasing action without cooling, Ning said.
A laser needs two key pieces – a gain medium that produces and amplifies photons, and a cavity that confines or traps them. While such material choices are easy for large lasers, they become more difficult at the nanometer scale. Nanolasers, which are less than one-hundredth the thickness of a human hair, are expected to play important roles in future computer chips and in a variety of light-detection and sensing devices.
The choice of a two-dimensional material and the silicon waveguide enabled the researchers to achieve room-temperature operation. Excitons in molybdenum ditelluride emit at a wavelength to which silicon is transparent, which makes silicon usable as a waveguide or cavity material. Precise fabrication of the nanobeam cavity, with its etched array of holes, and the integration of the two-dimensional monolayer material were also key to the project. Excitons in such monolayer materials are roughly 100 times stronger than those in conventional semiconductors, allowing efficient light emission at room temperature.
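As a rough back-of-envelope check (my own illustration, not a calculation from the paper, using approximate literature values for the two bandgaps), one can verify that monolayer MoTe2 emission falls just beyond silicon's absorption edge, so silicon passes the light rather than absorbing it:

```python
# Back-of-envelope check (not from the paper): is silicon transparent to
# monolayer MoTe2 emission? Bandgap values are approximate literature numbers.
H_C_EV_NM = 1239.84  # h*c expressed in eV*nm

mote2_exciton_ev = 1.10  # approx. exciton emission energy of monolayer MoTe2
silicon_gap_ev = 1.12    # silicon bandgap; photons below this energy pass through

emission_nm = H_C_EV_NM / mote2_exciton_ev
si_edge_nm = H_C_EV_NM / silicon_gap_ev

print(f"MoTe2 emission wavelength: ~{emission_nm:.0f} nm")
print(f"Silicon absorption edge:   ~{si_edge_nm:.0f} nm")
print("Silicon transparent to the emission:", mote2_exciton_ev < silicon_gap_ev)
```

Because the exciton emission energy sits just below silicon's bandgap, the emitted photons cannot excite electrons across silicon's gap and travel through the nanobeam unabsorbed.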
Because silicon is already ubiquitous in electronics, especially in computer chips, being able to use it in this way is significant for future applications.
"A laser technology that can also be made on Silicon has been a dream for researchers for decades," said Ning. "This technology will eventually allow people to put both electronics and photonics on the same silicon platform, greatly simplifying manufacture."
Silicon does not emit light efficiently and therefore must be combined with other light-emitting materials. Currently, other semiconductors, such as indium phosphide or indium gallium arsenide, which are hundreds of times thicker, are bonded with silicon for such applications.
The new monolayer materials combined with silicon eliminate challenges encountered when combining silicon with thicker, dissimilar materials. And because the non-silicon material is only a single layer thick, it is flexible and less likely to crack under stress, according to Ning.
Looking forward, the team is working on driving the laser with an electrical voltage to make the system more compact and easier to use, especially for its intended use on computer chips.
Juno to remain in current orbit at Jupiter
NASA's Juno spacecraft soared directly over Jupiter's south pole when JunoCam acquired this image on February 2, 2017 at 6:06 a.m. PT (9:06 a.m. ET), from an altitude of about 62,800 miles (101,000 kilometers) above the cloud tops. Credit: NASA
NASA's Juno mission to Jupiter, which has been in orbit around the gas giant since July 4, 2016, will remain in its current 53-day orbit for the remainder of the mission. This will allow Juno to accomplish its science goals, while avoiding the risk of a previously planned engine firing that would have reduced the spacecraft's orbital period to 14 days.
"Juno is healthy, its instruments are fully operational, and the data and images we've received are nothing short of amazing," said Thomas Zurbuchen, associate administrator for NASA's Science Mission Directorate in Washington. "The decision to forego the burn is the right thing to do—preserving a valuable asset so that Juno can continue its exciting journey of discovery."
Juno has successfully orbited Jupiter four times since arriving at the giant planet, with the most recent orbit completed on Feb. 2. Its next close flyby of Jupiter will be March 27.
The orbital period does not affect the quality of the science collected by Juno on each flyby, since the altitude over Jupiter will be the same at the time of closest approach. In fact, the longer orbit provides new opportunities that allow further exploration of the far reaches of space dominated by Jupiter's magnetic field, increasing the value of Juno's research.
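Kepler's third law gives a feel for how much roomier the 53-day orbit is than the originally planned 14-day one. The sketch below is my own back-of-envelope calculation, not mission data; it uses Jupiter's standard gravitational parameter to convert each period into a semi-major axis:

```python
import math

# Back-of-envelope Kepler's-third-law comparison (my own numbers, not mission
# data): how much larger is the 53-day orbit than the planned 14-day orbit?
GM_JUPITER = 1.26687e17  # Jupiter's gravitational parameter, m^3/s^2

def semi_major_axis_km(period_days: float) -> float:
    """Solve T^2 = 4*pi^2*a^3/GM for the semi-major axis a."""
    t = period_days * 86400.0  # period in seconds
    return (GM_JUPITER * t**2 / (4.0 * math.pi**2)) ** (1.0 / 3.0) / 1000.0

for days in (53.0, 14.0):
    print(f"{days:4.0f}-day orbit: a ≈ {semi_major_axis_km(days):,.0f} km")
```

The 53-day orbit works out to a semi-major axis roughly 2.4 times that of the 14-day orbit, which is what carries Juno out into the distant magnetosphere on each pass.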
During each orbit, Juno soars low over Jupiter's cloud tops—as close as about 2,600 miles (4,100 kilometers). During these flybys, Juno probes beneath the obscuring cloud cover and studies Jupiter's auroras to learn more about the planet's origins, structure, atmosphere and magnetosphere.
The original Juno flight plan envisioned the spacecraft looping around Jupiter twice in 53-day orbits, then reducing its orbital period to 14 days for the remainder of the mission. However, two helium check valves that are part of the plumbing for the spacecraft's main engine did not operate as expected when the propulsion system was pressurized in October. Telemetry from the spacecraft indicated that it took several minutes for the valves to open, while it took only a few seconds during past main engine firings.
"During a thorough review, we looked at multiple scenarios that would place Juno in a shorter-period orbit, but there was concern that another main engine burn could result in a less-than-desirable orbit," said Rick Nybakken, Juno project manager at NASA's Jet Propulsion Laboratory in Pasadena, California. "The bottom line is a burn represented a risk to completion of Juno's science objectives."
Juno's larger 53-day orbit allows for "bonus science" that wasn't part of the original mission design. Juno will further explore the far reaches of the Jovian magnetosphere—the region of space dominated by Jupiter's magnetic field—including the far magnetotail, the southern magnetosphere, and the magnetospheric boundary region called the magnetopause. Understanding magnetospheres and how they interact with the solar wind are key science goals of NASA's Heliophysics Science Division.
"Another key advantage of the longer orbit is that Juno will spend less time within the strong radiation belts on each ," said Scott Bolton, Juno principal investigator from Southwest Research Institute in San Antonio. "This is significant because radiation has been the main life-limiting factor for Juno."
Juno will continue to operate within the current budget plan through July 2018, for a total of 12 science orbits. The team can then propose to extend the mission during the next science review cycle. The review process evaluates proposed mission extensions on the merit and value of previous and anticipated science returns.
The Juno science team continues to analyze returns from previous flybys. Revelations include that Jupiter's magnetic fields and aurora are bigger and more powerful than originally thought, and that the belts and zones that give the gas giant's cloud tops their distinctive look extend deep into the planet's interior. Peer-reviewed papers with more in-depth science results from Juno's first three flybys are expected to be published within the next few months. In addition, the mission's JunoCam—the first interplanetary outreach camera—is now being guided with assistance from the public. People can participate by voting on which features on Jupiter should be imaged during each flyby.
"Juno is providing spectacular results, and we are rewriting our ideas of how giant planets work," said Bolton. "The science will be just as spectacular as with our original plan."
Violating law of energy conservation in the early universe may explain dark energy
This is the "South Pillar" region of the star-forming region called the Carina Nebula. Like cracking open a watermelon and finding its seeds, the infrared telescope "busted open" this murky cloud to reveal star embryos tucked inside finger-like pillars of thick dust. Credit: NASA
Physicists have proposed that the violations of energy conservation in the early universe, as predicted by certain modified theories in quantum mechanics and quantum gravity, may explain the cosmological constant problem, which is sometimes referred to as "the worst theoretical prediction in the history of physics."
The physicists, Thibaut Josset and Alejandro Perez at the University of Aix-Marseille, France, and Daniel Sudarsky at the National Autonomous University of Mexico, have published a paper on their proposal in a recent issue of Physical Review Letters.
"The main achievement of the work was the unexpected relation between two apparently very distinct issues, namely the accelerated expansion of the universe and microscopic physics," Josset told Phys.org. "This offers a fresh look at the cosmological constant problem, which is still far from being solved."
Einstein originally proposed the concept of the cosmological constant in 1917 to modify his theory of general relativity in order to prevent the universe from expanding, since at the time the universe was considered to be static.
Now that modern observations show that the universe is expanding at an accelerating rate, the cosmological constant today can be thought of as the simplest form of dark energy, offering a way to account for current observations.
However, there is a huge discrepancy—up to 120 orders of magnitude—between the large value of the cosmological constant predicted by theory and the tiny value that is actually observed. To explain this disagreement, some research has suggested that the cosmological constant may be an entirely new constant of nature that must be measured more precisely, while another possibility is that the underlying mechanism assumed by theory is incorrect. The new study falls into the second line of thought, suggesting that scientists still do not fully understand the root causes of the cosmological constant.
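The size of that discrepancy can be illustrated with a textbook order-of-magnitude estimate (my own illustration, not a calculation from the paper): compare the naive vacuum energy density built from the fundamental constants with the observed dark-energy density. The exact count of orders of magnitude depends on the assumed cutoff, which is why figures near 120 are usually quoted.

```python
import math

# Textbook order-of-magnitude illustration (not a calculation from the paper):
# naive Planck-scale vacuum energy density vs. the observed dark-energy density.
hbar = 1.055e-34  # reduced Planck constant, J*s
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8       # speed of light, m/s

rho_planck = c**7 / (hbar * G**2)  # Planck-scale energy density, J/m^3
rho_observed = 6e-10               # approx. observed dark-energy density, J/m^3

print(f"Naive theoretical estimate: {rho_planck:.1e} J/m^3")
print(f"Observed value:             {rho_observed:.1e} J/m^3")
print(f"Mismatch: about 10^{math.log10(rho_planck / rho_observed):.0f}")
```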
The basic idea of the new paper is that violations of energy conservation in the early universe could have been so small that they would have negligible effects at local scales and remain inaccessible to modern experiments, yet at the same time these violations could have made significant contributions to the present value of the cosmological constant.
To most people, the idea that conservation of energy is violated goes against everything they learned about the most fundamental laws of physics. But on the cosmological scale, conservation of energy is not as steadfast a law as it is on smaller scales. In this study, the physicists specifically investigated two theories in which violations of energy conservation naturally arise.
The first scenario of violations involves modifications to quantum theory that have previously been proposed to investigate phenomena such as the creation and evaporation of black holes, and which also appear in interpretations of quantum mechanics in which the wavefunction undergoes spontaneous collapse. In these cases, energy is created in an amount that is proportional to the mass of the collapsing object.
Violations of energy conservation also arise in some approaches to quantum gravity in which spacetime is considered to be granular due to the fundamental limit of length (the Planck length, which is on the order of 10⁻³⁵ m). This spacetime discreteness could have led to either an increase or decrease in energy that may have begun contributing to the cosmological constant starting when photons decoupled from electrons in the early universe, during the period known as recombination.
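For reference, the Planck length quoted above follows from combining the fundamental constants (a standard relation, not specific to this paper):

\[
\ell_P = \sqrt{\frac{\hbar G}{c^{3}}} \approx 1.6 \times 10^{-35}\ \mathrm{m}.
\]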
As the researchers explain, their proposal relies on a modification to general relativity called unimodular gravity, first proposed by Einstein in 1919.
"Energy from matter components can be ceded to the gravitational field, and this 'loss of energy' will behave as a cosmological constant—it will not be diluted by later expansion of the universe," Josset said. "Therefore a tiny loss or creation of energy in the remote past may have significant consequences today on large scale."
Whatever the source of the energy conservation violation, the important result is that the energy that was created or lost affected the cosmological constant to a greater and greater extent as time went by, while the effects on matter decreased over time due to the expansion of the universe.
Another way to put it, as the physicists explain in their paper, is that the cosmological constant can be thought of as a record of the energy non-conservation during the history of the universe.
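Schematically, the relation can be written as follows. This is a sketch of the unimodular-gravity mechanism described above rather than the authors' exact expression, and the units and integration contour depend on conventions; see the Physical Review Letters paper for the precise statement. The failure of energy-momentum conservation defines a current, whose integral over cosmic history shifts the effective cosmological constant:

\[
J_\mu \equiv \nabla^{\nu} T_{\nu\mu}, \qquad
\Lambda \;=\; \Lambda_0 + \frac{8\pi G}{c^{4}} \int J,
\]

so energy created or lost in the past accumulates in \(\Lambda\) instead of being diluted by the expansion.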
Currently there is no way to tell whether the violations of energy conservation investigated here truly did affect the cosmological constant, but the physicists plan to further investigate the possibility in the future.
"Our proposal is very general and any violation of energy conservation is expected to contribute to an effective cosmological constant," Josset said. "This could allow to set new constraints on phenomenological models beyond standard .
"On the other hand, direct evidence that dark energy is sourced by energy non-conservation seems largely out-of-reach, as we have access to the value of lambda [the ] today and constraints on its evolution at late time only."

Credit: Lisa Zyga
Energy scenarios provide useful decision-support tools for policymakers and investors

Credit: David Pilbrow/Flickr
Fulfilling the promise of the 2015 Paris Agreement on climate change—most notably the goal of limiting the rise in mean global surface temperature since preindustrial times to 2 degrees Celsius—will require a dramatic transition away from fossil fuels and toward low-carbon energy sources. To map out that transition, decision-makers routinely turn to energy scenarios, which use computational models to project changes to the energy mix that will be needed to meet climate and environmental targets. These models account for not only technological, economic, demographic, political, and institutional developments, but also the scope, timing, and stringency of policies to reduce greenhouse gas emissions and air pollution.
Model-driven scenarios provide policymakers and investors with a powerful decision-support tool, but due to several limitations they should not be used as a decision-making tool. So argues a new study in the journal Energy and Environment by Sergey Paltsev, deputy director of the MIT Joint Program on the Science and Policy of Global Change and a senior research scientist for both the Joint Program and the MIT Energy Initiative. The study shows that, overall, energy scenarios are useful for assessing the policymaking and investment risks associated with different emissions-reduction pathways, but tend to overestimate the degree to which future energy demand will resemble the past.
"Energy scenarios may not provide exact projections, but they are the best available tool to assess the magnitude of challenges that lie ahead," Paltsev observes in the study, a unique review of the value and limits of widely used energy scenarios that range from the International Energy Agency (IEA) World Energy Outlook, to the Joint Program's own annual Food, Water, Energy and Climate Outlook (which uses the MIT Economic Projection and Policy Analysis model), to a recent Intergovernmental Panel on Climate Change (IPCC) assessment report (AR5) presenting 392 energy scenarios aligned with the 2 C climate stabilization goal.
The study points out that because energy scenarios tend to vary widely in terms of the projections they produce for a given policy and the degree of uncertainty associated with those projections, it's not advisable to base an energy policy or investment decision on a single energy scenario. Taken collectively, however, energy scenarios can help bring into sharp focus a range of plausible futures—information decision-makers can use to assess the scale and cost of the technological changes needed to effect significant transformations in energy production and consumption. A careful review of multiple energy scenarios associated with a particular emissions pathway can provide a qualitative analysis of what's driving the results and the potential risks and benefits of a proposed policy or investment.
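As a toy illustration of that practice (the numbers below are hypothetical, not taken from any of the outlooks cited here), one might aggregate several scenarios' projections for the same quantity and reason about the resulting range rather than any single value:

```python
# Toy illustration of reading scenarios as a range (all numbers hypothetical,
# not taken from any cited outlook).
projections_2040_gw = {
    "scenario_a": 310,  # hypothetical projected wind capacity in 2040, GW
    "scenario_b": 450,
    "scenario_c": 520,
    "scenario_d": 390,
}

values = sorted(projections_2040_gw.values())
low, high = values[0], values[-1]
spread = (high - low) / low

print(f"Plausible range: {low}-{high} GW")
print(f"Spread across scenarios: {spread:.0%} of the low end")
```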
That said, projections in energy scenarios can sometimes be highly inaccurate due to factors that are difficult to anticipate.
For example, according to the study, which compared several energy scenario projections to historical observations, most energy scenarios do not account for sudden changes to the status quo. One of the greatest contributors to uncertainty in energy scenarios is the demand for low-emitting energy technologies, whose timing and scale of deployment—dependent on several economic and political factors—are highly unpredictable. Paltsev notes that the IEA has consistently underestimated wind power; in its 2006 World Energy Outlook, the agency projected for 2020 a level of wind power generation that the world had already exceeded by 2013.
In addition, while energy scenarios have been largely successful in projecting total energy demand (e.g., the 1994 IEA World Energy Outlook's projection for 2010 was off by only 10 percent, despite highly disruptive developments such as the breakup of the Soviet Union, the 2008 world recession, and the emergence of the shale gas industry), most have been considerably off the mark when it comes to projecting energy prices (e.g., in 1993 dollars, the 1994 IEA WEO projected $28/barrel in 2010, but the actual price was $53/barrel).
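Worked out explicitly (a quick check of the figures quoted above, nothing more), that price miss amounts to an underestimate of almost half:

```python
# Quick check of the oil-price miss quoted above (both figures in 1993 dollars).
projected_2010 = 28.0  # 1994 IEA WEO projection, $/barrel
actual_2010 = 53.0     # actual 2010 price, $/barrel

shortfall = (actual_2010 - projected_2010) / actual_2010
print(f"Projection fell short of the actual price by {shortfall:.0%}")  # ~47%
```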
Recognizing the steep challenge in projecting demand and prices for different energy sources in the midst of a dramatic energy transition, Paltsev emphasizes that governments should not try to pick a "winner"—a single energy technology that seems poised to reduce emissions singlehandedly—but rather adopt a strategy that targets emissions reductions from any energy source.
"Governments shouldn't pick the winners, because most likely that choice will be wrong," he says. "They should instead design policies such as carbon-pricing and emissions trading systems that are designed to achieve emissions reduction targets at the least cost."

Credit: Mark Dwortzan
