Physicists retrieve 'lost' information from quantum measurements



Typically, when scientists make a measurement, they know exactly what kind of measurement they're making, and their purpose is to obtain a measurement outcome. But in an "unrecorded measurement," both the type of measurement and the measurement outcome are unknown. Even though scientists do not know this information, experiments clearly show that unrecorded measurements unavoidably disturb the state of a quantum system being measured; in classical systems, unrecorded measurements have no effect.
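One way to see why an unrecorded measurement disturbs a quantum state is that, once the outcome is discarded, a projective measurement replaces the density matrix ρ with the sum of P_i ρ P_i over the measurement projectors, which changes ρ unless the state already commutes with the measurement basis. The snippet below is a textbook single-qubit illustration of that effect, not the authors' retrieval protocol.

```python
import numpy as np

# Textbook illustration (not the authors' protocol): an unrecorded projective
# measurement maps rho -> sum_i P_i rho P_i. A qubit prepared in |+> and then
# measured in the Z basis, with the outcome thrown away, goes from a pure
# state to the maximally mixed state.

plus = np.array([1.0, 1.0]) / np.sqrt(2)      # |+> state
rho = np.outer(plus, plus)                    # density matrix of |+>

P0 = np.array([[1.0, 0.0], [0.0, 0.0]])       # projector onto |0>
P1 = np.array([[0.0, 0.0], [0.0, 1.0]])       # projector onto |1>

rho_after = P0 @ rho @ P0 + P1 @ rho @ P1     # unrecorded Z measurement

print(rho)        # off-diagonal terms 0.5: coherence present
print(rho_after)  # off-diagonal terms 0.0: the state has been disturbed
```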
Although the information in unrecorded measurements appears to be completely lost, in a paper published recently in EPL, Michael Revzen and Ady Mann, both Professors Emeriti at the Technion-Israel Institute of Technology, have described a protocol that can retrieve some of the lost information.
The fact that it is possible to retrieve this lost information reveals new insight into the fundamental nature of quantum measurements, mainly by supporting the idea that quantum measurements contain both quantum and classical components.
Previously, analysis of quantum measurement theory has suggested that, while a quantum measurement starts out purely quantum, it becomes somewhat classical when the quantum state of the system being measured is reduced to a "classical-like" probability distribution. At this point, it is possible to predict the probability of the result of a quantum measurement.
As the physicists explain in the new paper, this step when a quantum state is reduced to a classical-like distribution is the traceable part of an unrecorded measurement—or in other words, it is the "lost" information that the new protocol retrieves. So the retrieval of the lost information provides evidence of the quantum-to-classical transition in a quantum measurement.
"We have demonstrated that analysis of quantum measurement is facilitated by viewing it as being made of two parts," Revzen told Phys.org. "The first, a pure quantum one, pertains to the non-commutativity of measurements' bases. The second relates to classical-like probabilities.
"This partitioning circumvents the ever-present polemic surrounding the whole issue of measurements and allowed us, on the basis of the accepted wisdom pertaining to classical measurements, to suggest and demonstrate that the non-commutative measurement basis may be retrieved by measuring an unrecorded measurement."
As the physicists explain, the key to retrieving the lost information is to use quantum entanglement to entangle the system being measured by an unrecorded measurement with a second system. Since the two systems are entangled, the unrecorded measurement affects both systems. Then a control measurement made on the entangled system can extract some of the lost information. The scientists explain that the essential role of entanglement in retrieving the lost information affirms the intimate connection between entanglement and measurements, as well as the uncertainty principle, which limits the precision with which certain measurements can be made. The scientists also note that the entire concept of retrieval has connections to quantum cryptography.
"Posing the problem of retrieval of unrecorded measurement is, we believe, new," Mann said. "The whole issue, however, is closely related to the problem of the combatting eavesdropper in quantum cryptography which aims, in effect, at detection of the existence of 'unrecorded measurement' (our aim is their identification). The issue of eavesdropper detection has been under active study for some time."
The scientists are continuing to build on the new results by showing that some of the lost information can never be retrieved, and that in other cases, it's impossible to determine whether certain information can be retrieved.
"At present, we are trying to find a comprehensive proof that the retrieval of the measurement basis is indeed the maximal possible retrieval, as well as to pin down the precise meaning of the ubiquitous 'undetermined' case," Revzen said. "This is, within our general study of quantum measurement, arguably the most obscure subject of the foundation of quantum mechanics."
Black hole hidden within its own exhaust

Artist's impression of the heart of galaxy NGC 1068, which harbors an actively feeding supermassive black hole. ALMA discovered clouds of cold molecular gas and dust arising from the black hole's outer accretion disk.

Supermassive black holes, millions to billions of times the mass of our Sun, are found at the centers of galaxies. Many of these galactic behemoths are hidden within a thick, doughnut-shaped ring of dust and gas known as a torus. Previous observations suggest these cloaking, tire-like structures are formed from the native material found near the center of a galaxy.
New data from the Atacama Large Millimeter/submillimeter Array (ALMA), however, reveal that the black hole at the center of a galaxy named NGC 1068 is actually the source of its own obscuring torus of dust and gas, forged from material flung out of the black hole's accretion disk.
This newly discovered cosmic fountain of cold gas and dust could reshape our understanding of how black holes impact their host galaxies and potentially the intergalactic medium.
"Think of a black hole as an engine. It's fueled by material falling in on it from a flattened disk of dust and gas," said Jack Gallimore, an astronomer at Bucknell University in Lewisburg, Pennsylvania, and lead author on a paper published in Astrophysical Journal Letters. "But like any engine, a black hole can also emit exhaust." That exhaust, astronomers discovered, is the likely source of the torus of material that effectively obscures the region around the galaxy's super-massive black hole from optical telescopes.
NGC 1068 (also known as Messier 77) is a barred spiral galaxy approximately 47 million light-years from Earth in the direction of the constellation Cetus. At its center is an active galactic nucleus, a supermassive black hole that is being fed by a thin, rotating disk of gas and dust known as an accretion disk. As material in the disk spirals toward the central black hole, it becomes superheated and blazes bright with ultraviolet radiation. The outer reaches of the disk, however, are considerably cooler and glow more appreciably in infrared light and the millimeter-wavelength light that ALMA can detect.
ALMA image of the central region of galaxy NGC 1068. The torus of material harboring the supermassive black hole is highlighted in the pullout box. This region, which is approximately 40 light-years across, is the result of material flung out of the black hole's accretion disk.
Using ALMA, an international team of astronomers peered deep into this region and discovered a sprinkling of cool clouds of carbon monoxide lifting off the outer portion of the accretion disk. The energy from the hot inner disk partially ionizes these clouds, enabling them to adhere to powerful magnetic field lines that wrap around the disk.
Like water being flung out of a rapidly rotating garden sprinkler, the clouds rising above the accretion disk are accelerated centrifugally along these magnetic field lines to very high speeds of approximately 400 to 800 kilometers per second (nearly 2 million miles per hour). This is up to nearly three times faster than the rotational speed of the outer accretion disk, fast enough to send the clouds hurtling further out into the galaxy.
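As a quick sanity check on the quoted figures, the arithmetic below (using only the numbers given in the article) confirms that 800 kilometers per second is indeed just under 2 million miles per hour.

```python
# Convert the quoted cloud speeds of 400-800 km/s to miles per hour.
# Uses only the figures from the article; no new data is introduced.

KM_PER_MILE = 1.609344

for v_km_s in (400.0, 800.0):
    v_mph = v_km_s * 3600.0 / KM_PER_MILE
    print(f"{v_km_s:.0f} km/s ≈ {v_mph:,.0f} mph")
# 400 km/s ≈ 894,775 mph; 800 km/s ≈ 1,789,549 mph -- "nearly 2 million mph"
```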
"These clouds are traveling so fast that they reach 'escape velocity' and are jettisoned in a cone-like spray from both sides of the disk," said Gallimore. "With ALMA, we can for the first time see that it is the gas that is thrown out that hides the black hole, not the gas falling in." This suggests that the general theory of an active black hole is oversimplified, he concludes.
With future ALMA observations, the astronomers hope to work out a fuel budget for this black hole engine: how much mass per year goes into the black hole and how much is ejected as exhaust.
"These are fundamental quantities for understanding black holes that we really don't have a good handle on at this time," concludes Gallimore.
This research is presented in the paper titled "High-velocity bipolar molecular emission from an AGN torus," by J. Gallimore et al., published in Astrophysical Journal Letters on 15 September 2016. [Preprint: arxiv.org/pdf/1608.02210v1.pdf ]

In test, Microsoft researchers achieve impressively low error rate for conversational speech recognition system
The languages that we speak: how pervasive will they be in the computing of tomorrow? We are often told that we are getting closer and closer to computers understanding our words as easily as a human beside us.
Now Microsoft researchers have every reason to feel especially proud. According to reports, Microsoft has pulled ahead in the race for supremacy in speech recognition.
The company has claimed a significant test result in its quest for machines to understand speech. The study describing the work has been posted on the arXiv server under the title "The Microsoft 2016 Conversational Speech Recognition System." Its eight authors are W. Xiong, J. Droppo, X. Huang, F. Seide, M. Seltzer, A. Stolcke, D. Yu, and G. Zweig.
Wall Street Pit had a report about their work, one of a number of sites paying attention to what Microsoft researchers achieved. The Microsoft team turned to "a conversational telephone speech recognition test used as an industry standard," said Wall Street Pit. That test is the "US National Institute of Standards and Technology (NIST) 2000 Switchboard speech recognition task."
Chief speech scientist for Microsoft, Xuedong Huang, said their researchers achieved a word error rate (WER) of 6.3%, considered the lowest in the industry.
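For context, word error rate is the minimum number of word substitutions, deletions and insertions needed to turn the recognizer's output into a reference transcript, divided by the number of reference words. The sketch below shows that calculation on invented sentences; it illustrates the metric only, not Microsoft's evaluation pipeline.

```python
# Minimal illustration of word error rate (WER):
# WER = (substitutions + deletions + insertions) / number of reference words.
# The example sentences are invented, not taken from the Switchboard test set.

def word_error_rate(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming table for word-level edit distance.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i                          # deletions
    for j in range(len(hyp) + 1):
        d[0][j] = j                          # insertions
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1   # substitution cost
            d[i][j] = min(d[i - 1][j] + 1,   # deletion
                          d[i][j - 1] + 1,   # insertion
                          d[i - 1][j - 1] + cost)
    return d[len(ref)][len(hyp)] / len(ref)

print(word_error_rate("the cat sat on the mat", "the cat sat on a mat"))  # 1/6 ≈ 0.167
```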
Richard Eckel posted a piece about it, too, on the Microsoft site. The posting noted some features of their efforts. Earlier this year, Microsoft researchers won a computer vision challenge by using "a deep residual neural net system that utilized a new kind of cross-layer network connection."
It also said that "Another critical component to Microsoft researchers' recent success is the Computational Network Toolkit. CNTK implements sophisticated optimizations that enable deep learning algorithms to run an order of magnitude faster than before. A key step forward was a breakthrough for parallel training on graphics processing units, or GPUs."
(GPUs are known for computer graphics, but researchers find they are also very good for processing complex algorithms such as the ones used to understand speech, the posting said.)
As for the significance of the error rate, "Last weekend, the international conference on speech communication and technology called 'Interspeech' was held in San Francisco," said Wall Street Pit. "During the event, IBM proudly announced that it was able to reach a WER of only 6.6%. Over two decades ago, the top error rate of the best published research system for computer speech recognition was at 43%."
The authors stated, "Our best single system achieves an error rate of 6.9% on the NIST 2000 Switchboard set. We believe this is the best performance reported to date for a recognition system not based on system combination."
Liam Tung in ZDNet noted progress in this field. Tung wrote that "20 years ago the lowest error rate in speech recognition was 43 percent and that was achieved by IBM in 1995. By 2004, IBM had cut its error rate to 15.2 percent."
Tung noted that "However, these days with more research funds being funnelled into deep neural networks, tech giants are boasting error rates of well below 10 percent, but not quite at a level that exceeds human-level accuracy, which IBM estimates to be at about four percent."
In describing the system, the authors said, "Inspired by machine learning ensemble techniques, the system uses a range of convolutional and recurrent neural networks."
What distinguishes their work from previous work was explained in the paper: "Compared to earlier applications of CNNs to speech recognition, our networks are much deeper, and use linear bypass connections across convolutional layers."
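A "linear bypass connection" is the residual-network idea: the output of a stack of convolutional layers is added back to its unmodified input, which helps very deep stacks remain trainable. The sketch below is a generic NumPy illustration of that pattern with invented toy shapes; it is not the CNTK model described in the paper.

```python
import numpy as np

# Generic illustration of a "linear bypass" (residual) connection:
# the block's output is F(x) + x, so the input can flow straight through.
# Not the paper's CNTK model; shapes, kernels and weights are invented.

def conv1d(x, w):
    """'Same'-padded 1-D convolution of a feature vector x with kernel w."""
    return np.convolve(x, w, mode="same")

def bypass_block(x, w1, w2):
    h = np.maximum(0.0, conv1d(x, w1))   # first conv + ReLU
    h = conv1d(h, w2)                    # second conv, no nonlinearity yet
    return np.maximum(0.0, h + x)        # add the linear bypass, then ReLU

rng = np.random.default_rng(0)
x = rng.standard_normal(16)              # a toy 16-dimensional feature vector
y = bypass_block(x, rng.standard_normal(3), rng.standard_normal(3))
print(y.shape)                           # (16,)
```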
Tung remarked that "Like its rivals, Microsoft has made artificial intelligence a key plank in its strategy for human-computer interaction with voice-based platforms such as Cortana set to play a key role in enabling computing in wearables, mobile, the home, vehicles, and the enterprise."
Transmitting one terabit per second via optical fiber

TUM researchers (l-r) Fabian Steiner, Georg Böcherer, and Patrick Schulte with the statue of Claude Shannon, father of information theory – Image: Denise Panyik-Dale/Alcatel-Lucent

Nokia Bell Labs, Deutsche Telekom T-Labs and the Technical University of Munich (TUM) have achieved unprecedented transmission capacity and spectral efficiency in an optical communications field trial with a new modulation technique. The breakthrough research could extend the capability of optical networks to meet surging data traffic demands.
In an optical communications field trial, Nokia Bell Labs, Deutsche Telekom T-Labs and TUM showed that the flexibility and performance of optical networks can be maximized when adjustable transmission rates are dynamically adapted to channel conditions and traffic demands. As part of the Safe and Secure European Routing (SASER) project, the experiment over a deployed optical fiber network of Deutsche Telekom achieved a net transmission rate of one terabit per second.
This is close to the theoretical maximum information transfer rate of that channel, and thus approaches the Shannon Limit of the fiber link. The Shannon Limit was discovered in 1948 by Claude Shannon, Bell Labs pioneer and the "father of information theory."
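For reference, Shannon's result bounds the error-free rate of a noisy channel by C = B · log2(1 + SNR), where B is the bandwidth and SNR the signal-to-noise ratio. The figures in the snippet below are assumed values chosen only to illustrate the formula, not parameters of the Deutsche Telekom link.

```python
import math

# Shannon capacity of an additive-white-Gaussian-noise channel:
#   C = B * log2(1 + SNR)
# The bandwidth and SNR below are illustrative, not the field-trial values.

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    return bandwidth_hz * math.log2(1.0 + snr_linear)

snr_db = 20.0                              # assumed 20 dB signal-to-noise ratio
snr = 10 ** (snr_db / 10)
bandwidth = 50e9                           # assumed 50 GHz optical channel
print(f"{shannon_capacity(bandwidth, snr) / 1e9:.0f} Gbit/s")  # ≈ 333 Gbit/s
```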
Novel modulation approach
The trial of the novel modulation approach, known as Probabilistic Constellation Shaping (PCS), uses quadrature amplitude modulation (QAM) formats to achieve higher transmission capacity over a given channel, significantly improving the spectral efficiency of optical communications.
PCS modifies the probability with which constellation points – the alphabet of the transmission – are used. Traditionally, all constellation points are used with the same frequency. PCS cleverly uses high-amplitude constellation points less frequently than low-amplitude ones to transmit signals that are, on average, more resilient to noise and other impairments. This allows the transmission rate to be tailored to ideally fit the transmission channel, delivering up to 30 percent greater reach.
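As a rough illustration of how this works, probabilistic shaping typically draws the constellation's amplitude levels from a Maxwell-Boltzmann-like distribution, so high-energy points occur less often than low-energy ones. The toy snippet below demonstrates the idea for one axis of a 64-QAM constellation; the alphabet and shaping parameter are arbitrary choices for the example, not values from the field trial.

```python
import numpy as np

# Toy illustration of probabilistic constellation shaping: draw amplitudes
# from a Maxwell-Boltzmann-like distribution exp(-nu * |a|^2), so points with
# high amplitude (energy) are used less often than low-amplitude points.
# The alphabet and shaping parameter nu are arbitrary, not the trial's values.

amplitudes = np.array([1, 3, 5, 7, -1, -3, -5, -7], dtype=float)  # one 64-QAM axis
nu = 0.05                                   # assumed shaping parameter
p = np.exp(-nu * amplitudes**2)
p /= p.sum()                                # normalize to a probability distribution

for a, prob in sorted(zip(amplitudes, p)):
    print(f"amplitude {a:+.0f}: probability {prob:.3f}")
# Uniform signaling would use every amplitude with probability 0.125;
# shaping trades peak rate for, on average, greater resilience to noise.
```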
Maximal transmission capacity
Optical fiber was introduced 50 years ago. With the promise of 5G wireless technology on the horizon, optical transport systems today continue to evolve to help telecommunications operators and enterprises meet network data traffic growing at a cumulative annual rate of up to 100 percent.
PCS is now part of this evolution by enabling increases in optical fiber flexibility and performance that can move data traffic faster and over greater distances without increasing the optical network complexity.
The research is a key milestone in proving PCS could be used in the future to extend optical communication technologies. The results of this joint experiment will be presented at the European Conference on Optical Communication (ECOC) 2016 in Düsseldorf, Germany on September 19.

Transmitting data faster, further, and with unparalleled flexibility
"Increased capacities, reach and flexibility over deployed fiber infrastructures," said Bruno Jacobfeuerborn, Director Technology Telekom Deutschland and CTO Deutsche Telekom. "Deutsche Telekom provides a unique network infrastructure to evaluate and demonstrate such highly innovative transmission technologies for example. Furthermore, it also supports higher layer test scenarios and technologies."
"Information theory is the mathematics of digital technology, and during the Claude E. Shannon centenary year 2016 it is thrilling to see his ideas continue to transform industries and society," said Professor Gerhard Kramer, Head of the Institute for Communications Engineering at Technical University of Munich.
"Probabilistic constellation shaping, an idea that won a Bell Labs Prize, directly applies Shannon's principles and lets fiber optic systems transmit data faster, further, and with unparalleled flexibility," added Prof. Kramer. "The success of the close collaboration with Nokia Bell Labs, who further developed the technology, and Deutsche Telekom T-Labs, who tested it under real conditions, is satisfying confirmation that TUM Engineering is a label of outstanding quality, and that TUM teaching gives our students the intellectual tools to compete, succeed and lead globally."
Marcus Weldon, president Nokia Bell Labs & Nokia CTO, said: "Future optical networks not only need to support orders of magnitude higher capacity, but also the ability to dynamically adapt to channel conditions and traffic demand. Probabilistic Constellation Shaping offers great benefits to service providers and enterprises by enabling optical networks to operate closer to the Shannon Limit to support massive datacenter interconnectivity and provide the flexibility and performance required for modern networking in the digital era."
