SpaceX at it again: reusing a Dragon to resupply the ISS

SpaceX's Dragon.

Remember the last post about SpaceX? Well, they are at it again!

This time, SpaceX launched supplies to the International Space Station on Saturday. More notable still, they used a vessel that has flown before.

The refurbished Dragon cargo capsule lifted off atop a Falcon 9 rocket at 5:07 pm (2107 GMT) from Cape Canaveral, Florida.

With NASA spokesman Mike Curie calling the countdown, the rocket blazed a steady vertical path into the clouds.
The last time this particular spacecraft (Dragon) flew to space was in 2014.
The Dragon on the present mission is packed with almost 6,000 pounds (2,700 kilograms) of science research, crew supplies and hardware, and should arrive at the ISS on Monday.

The supplies for special experiments include live mice to study the effects of osteoporosis and fruit flies for research on microgravity's impact on the heart.
The spacecraft is also loaded with solar panels and equipment to study neutron stars.

About 10 minutes after launch, SpaceX successfully returned the first stage of the Falcon 9 rocket to a controlled landing at Cape Canaveral.

The rocket powered its engines and guided itself down to Landing Zone One, not far from the launch site.
"The first stage is back," Curie said in a NASA live webcast, as video images showed the tall, narrow portion of the rocket touch down steadily in a cloud of smoke.
SpaceX said it marked the company's fifth successful landing on solid ground. Several of its Falcon 9 rockets have returned upright to platforms floating in the ocean.

The effort is part of SpaceX's push to make spaceflight cheaper by re-using costly rocket and spaceship components after each launch, rather than ditching them in the ocean.
The launch was the 100th from NASA's historic launch pad 39A, the starting point for the Apollo missions to the Moon in the 1960s and 1970s, as well as a total of 82 shuttle flights.

A self-ventilating suit to keep you dry and cool while you exercise


Self-ventilating workout suit keeps athletes cool and dry
Images of garment prototype before exercise with flat ventilation flaps (F) and after exercise with curved ventilation flaps (G). Credit: Science Advances (2017).
A team of MIT researchers has designed a breathable workout suit with ventilating flaps that open and close in response to an athlete's body heat and sweat. These flaps, which range from thumbnail- to finger-sized, are lined with live microbial cells that shrink and expand in response to changes in humidity. The cells act as tiny sensors and actuators, driving the flaps to open when an athlete works up a sweat, and pulling them closed when the body has cooled off.

The researchers have also fashioned a running shoe with an inner layer of similar cell-lined flaps to air out and wick away moisture. Details of both designs are published today in Science Advances.
Why use cells in responsive fabrics? The researchers say that moisture-sensitive cells require no additional elements to sense and respond to humidity. The cells they have used are also proven to be safe to touch and even consume. What's more, with new genetic engineering tools available today, cells can be prepared quickly and in vast quantities to express multiple functionalities in addition to moisture response.
To demonstrate this last point, the researchers engineered moisture-sensitive cells to not only pull flaps open but also light up in response to humid conditions.
"We can combine our cells with genetic tools to introduce other functionalities into these living cells," says Wen Wang, the paper's lead author and a former research scientist in MIT's Media Lab and Department of Chemical Engineering. "We use fluorescence as an example, and this can let people know you are running in the dark. In the future we can combine odor-releasing functionalities through genetic engineering. So maybe after going to the gym, the shirt can release a nice-smelling odor."
Wang's co-authors include 14 researchers from MIT, specializing in fields including mechanical engineering, chemical engineering, architecture, biological engineering, and fashion design, as well as researchers from New Balance Athletics. Wang co-led the project, dubbed bioLogic, with former graduate student Lining Yao as part of MIT's Tangible Media group, led by Hiroshi Ishii, the Jerome B. Wiesner Professor of Media Arts and Sciences.
Shape-shifting cells
In nature, biologists have observed that living things and their components, from pine cone scales to microbial cells and even specific proteins, can change their structures or volumes when there is a change in humidity. The MIT team hypothesized that natural shape-shifters such as yeast, bacteria, and other microbial cells might be used as building blocks to construct moisture-responsive fabrics.

"These cells are so strong that they can induce bending of the substrate they are coated on," Wang says.
The researchers first worked with the most common nonpathogenic strain of E. coli, which was found to swell and shrink in response to changing humidity. They further engineered the cells to express green fluorescent protein, enabling the cells to glow when they sense humid conditions.
They then used a cell-printing method they had previously developed to print E. coli onto sheets of rough, natural latex.
The team printed parallel lines of E. coli cells onto sheets of latex, creating two-layer structures, and exposed the fabric to changing moisture conditions. When the fabric was placed on a hot plate to dry, the cells began to shrink, causing the overlying latex layer to curl up. When the fabric was then exposed to steam, the cells began to glow and expand, causing the latex to flatten out. After undergoing 100 such dry/wet cycles, Wang says the fabric experienced "no dramatic degradation" in either its cell layer or its overall performance.
No sweat
The researchers worked the biofabric into a wearable garment, designing a running suit with cell-lined latex flaps patterned across the suit's back. They tailored the size of each flap, as well as the degree to which they open, based on previously published maps of where the body produces heat and sweat.
"People may think heat and sweat are the same, but in fact, some areas like the lower spine produce lots of sweat but not much heat," Yao says. "We redesigned the garment using a fusion of heat and sweat maps to, for example, make flaps bigger where the body generates more heat."
Support frames underneath each flap keep the fabric's inner cell layer from directly touching the skin, while at the same time, the cells are able to sense and react to humidity changes in the air lying just over the skin. In trials to test the running suit, study participants donned the garment and worked out on exercise treadmills and bicycles while researchers monitored their temperature and humidity using small sensors positioned across their backs.
After five minutes of exercise, the suit's flaps started opening up, right around the time when participants reported feeling warm and sweaty. According to sensor readings, the flaps effectively removed sweat from the body and lowered skin temperature, more so than when participants wore a similar running suit with nonfunctional flaps.
When Wang tried on the suit herself, she found that the flaps created a welcome sensation. After pedaling hard for a few minutes, Wang recalls that "it felt like I was wearing an air conditioner on my back."
Ventilated running shoes
The team also integrated the moisture-responsive fabric into a rough prototype of a running shoe. Where the bottom of the foot touches the sole of the shoe, the researchers sewed multiple flaps, curved downward, with the cell-lined layer facing toward—though not touching—a runner's foot. They again designed the size and position of the flaps based on heat and sweat maps of the foot.
"In the beginning, we thought of making the flaps on top of the shoe, but we found people don't normally sweat on top of their feet," Wang says. "But they sweat a lot on the bottom of their feet, which can lead to diseases like warts. So we thought, is it possible to keep your feet dry and avoid those diseases?"
As with the workout suit, the flaps on the running shoe opened and lit up when researchers increased the surrounding humidity; in dry conditions the flaps faded and closed.
Going forward, the team is looking to collaborate with sportswear companies to commercialize their designs, and is also exploring other uses, including moisture-responsive curtains, lampshades, and bedsheets.
"We are also interested in rethinking packaging," Wang says. "The concept of a second skin would suggest a new genre for responsive packaging."
"This work is an example of harnessing the power of biology to design new materials and devices and achieve new functions," says Xuanhe Zhao, the Robert N. Noyce Career Development Associate Professor in the Department of Mechanical Engineering and a co-author on the paper. "We believe this new field of 'living' materials and devices will find important applications at the interface between engineering and biological systems."
New theory on how Earth's crust was created

A composite image of the Western hemisphere of the Earth. Credit: NASA
More than 90% of Earth's continental crust is made up of silica-rich minerals, such as feldspar and quartz. But where did this silica-enriched material come from? And could it provide a clue in the search for life on other planets?
Conventional theory holds that all of the early Earth's crustal ingredients were formed by volcanic activity. Now, however, McGill University scientists Don Baker and Kassandra Sofonio have published a theory with a novel twist: some of the chemical components of this material settled onto Earth's early surface from the steamy atmosphere that prevailed at the time.
First, a bit of ancient geochemical history: Scientists believe that a Mars-sized planetoid plowed into the proto-Earth around 4.5 billion years ago, melting the Earth and turning it into an ocean of magma. In the wake of that impact—which also created enough debris to form the moon—the Earth's surface gradually cooled until it was more or less solid. Baker's new theory, like the conventional one, is based on that premise.
The atmosphere following that collision, however, consisted of high-temperature steam that dissolved rocks on the Earth's immediate surface—"much like how sugar is dissolved in coffee," Baker explains. This is where the new wrinkle comes in. "These dissolved minerals rose to the upper atmosphere and cooled off, and then these silicates that were dissolved at the surface would start to separate out and fall back to Earth in what we call a silicate rain."
To test this theory, Baker and co-author Kassandra Sofonio, a McGill undergraduate research assistant, spent months developing a series of laboratory experiments designed to mimic the steamy conditions on early Earth. A mixture of bulk silicate earth materials and water was melted in air at 1,550 degrees Celsius, then ground to a powder. Small amounts of the powder, along with water, were then enclosed in gold palladium capsules, placed in a pressure vessel and heated to about 727 degrees Celsius and 100 times Earth's surface pressure to simulate conditions in the Earth's atmosphere about 1 million years after the moon-forming impact. After each experiment, samples were rapidly quenched and the material that had been dissolved in the high temperature steam analyzed.
The experiments were guided by other scientists' previous experiments on rock-water interactions at high pressures, and by the McGill team's own preliminary calculations, Baker notes. Even so, "we were surprised by the similarity of the dissolved silicate material produced by the experiments" to that found in the Earth's crust.
Their resulting paper, published in the journal Earth and Planetary Science Letters, posits a new theory of "aerial metasomatism"—a term coined by Sofonio to describe the process by which silica minerals condensed and fell back to earth over about a million years, producing some of the earliest rock specimens known today.
"Our experiment shows the chemistry of this process," and could provide scientists with important clues as to which exoplanets might have the capacity to harbor life Baker says.
"This time in early Earth's history is still really exciting," he adds. "A lot of people think that life started very soon after these events that we're talking about. This is setting up the stages for the Earth being ready to support life."
Juno to remain in current orbit at Jupiter

NASA's Juno spacecraft soared directly over Jupiter's south pole when JunoCam acquired this image on February 2, 2017 at 6:06 a.m. PT (9:06 a.m. ET), from an altitude of about 62,800 miles (101,000 kilometers) above the cloud tops. Credit: NASA
NASA's Juno mission to Jupiter, which has been in orbit around the gas giant since July 4, 2016, will remain in its current 53-day orbit for the remainder of the mission. This will allow Juno to accomplish its science goals, while avoiding the risk of a previously-planned engine firing that would have reduced the spacecraft's orbital period to 14 days.
"Juno is healthy, its instruments are fully operational, and the data and images we've received are nothing short of amazing," said Thomas Zurbuchen, associate administrator for NASA's Science Mission Directorate in Washington. "The decision to forego the burn is the right thing to do—preserving a valuable asset so that Juno can continue its exciting journey of discovery."
Juno has successfully orbited Jupiter four times since arriving at the giant planet, with the most recent orbit completed on Feb. 2. Its next close flyby of Jupiter will be March 27.
The orbital period does not affect the quality of the science collected by Juno on each flyby, since the altitude over Jupiter will be the same at the time of closest approach. In fact, the longer orbit provides new opportunities that allow further exploration of the far reaches of space dominated by Jupiter's magnetic field, increasing the value of Juno's research.
During each orbit, Juno soars low over Jupiter's cloud tops—as close as about 2,600 miles (4,100 kilometers). During these flybys, Juno probes beneath the obscuring cloud cover and studies Jupiter's auroras to learn more about the planet's origins, structure, atmosphere and magnetosphere.
The original Juno flight plan envisioned the spacecraft looping around Jupiter twice in 53-day orbits, then reducing its orbital period to 14 days for the remainder of the mission. However, two helium check valves that are part of the plumbing for the spacecraft's main engine did not operate as expected when the propulsion system was pressurized in October. Telemetry from the spacecraft indicated that it took several minutes for the valves to open, while it took only a few seconds during past main engine firings.
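For a rough sense of the difference between the two orbits, Kepler's third law relates an orbit's period to its semi-major axis. The sketch below is illustrative only: it assumes Jupiter's published gravitational parameter and idealized Keplerian orbits, and the resulting figures are approximations, not mission data:

```python
import math

# Jupiter's standard gravitational parameter GM, in m^3/s^2 (published value)
GM_JUPITER = 1.26687e17

def semi_major_axis(period_days: float) -> float:
    """Semi-major axis in meters from Kepler's third law:
    a^3 = GM * T^2 / (4 * pi^2)."""
    t = period_days * 86400.0  # period in seconds
    return (GM_JUPITER * t ** 2 / (4 * math.pi ** 2)) ** (1.0 / 3.0)

a_current = semi_major_axis(53)  # the orbit Juno is staying in
a_planned = semi_major_axis(14)  # the orbit the forgone burn would have produced

print(f"53-day orbit: a = {a_current / 1e9:.1f} million km")
print(f"14-day orbit: a = {a_planned / 1e9:.1f} million km")
```

The 53-day ellipse averages well over twice as far from Jupiter as the planned 14-day one would have, which is why it sweeps through the outer magnetosphere targeted by the "bonus science" described below, while the closest-approach altitude stays the same.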
"During a thorough review, we looked at multiple scenarios that would place Juno in a shorter-period orbit, but there was concern that another main engine burn could result in a less-than-desirable orbit," said Rick Nybakken, Juno project manager at NASA's Jet Propulsion Laboratory in Pasadena, California. "The bottom line is a burn represented a risk to completion of Juno's science objectives."
Juno's larger 53-day orbit allows for "bonus science" that wasn't part of the original mission design. Juno will further explore the far reaches of the Jovian magnetosphere—the region of space dominated by Jupiter's magnetic field—including the far magnetotail, the southern magnetosphere, and the magnetospheric boundary region called the magnetopause. Understanding magnetospheres and how they interact with the solar wind are key science goals of NASA's Heliophysics Science Division.
"Another key advantage of the longer orbit is that Juno will spend less time within the strong radiation belts on each ," said Scott Bolton, Juno principal investigator from Southwest Research Institute in San Antonio. "This is significant because radiation has been the main life-limiting factor for Juno."
Juno will continue to operate within the current budget plan through July 2018, for a total of 12 science orbits. The team can then propose to extend the mission during the next science review cycle. The review process evaluates proposed mission extensions on the merit and value of previous and anticipated science returns.
The Juno science team continues to analyze returns from previous flybys. Revelations include that Jupiter's magnetic fields and aurora are bigger and more powerful than originally thought and that the belts and zones that give the gas giant's cloud top its distinctive look extend deep into the planet's interior. Peer-reviewed papers with more in-depth science results from Juno's first three flybys are expected to be published within the next few months. In addition, the mission's JunoCam—the first interplanetary outreach camera—is now being guided with assistance from the public. People can participate by voting on which features on Jupiter should be imaged during each flyby.
"Juno is providing spectacular results, and we are rewriting our ideas of how giant planets work," said Bolton. "The science will be just as spectacular as with our original plan."
Violating law of energy conservation in the early universe may explain dark energy

This is the "South Pillar" region of the star-forming region called the Carina Nebula. Like cracking open a watermelon and finding its seeds, the infrared telescope "busted open" this murky cloud to reveal star embryos tucked inside finger-like pillars of thick dust. Credit: NASA
Physicists have proposed that the violations of energy conservation in the early universe, as predicted by certain modified theories in quantum mechanics and quantum gravity, may explain the cosmological constant problem, which is sometimes referred to as "the worst theoretical prediction in the history of physics."
The physicists, Thibaut Josset and Alejandro Perez at the University of Aix-Marseille, France, and Daniel Sudarsky at the National Autonomous University of Mexico, have published a paper on their proposal in a recent issue of Physical Review Letters.
"The main achievement of the work was the unexpected relation between two apparently very distinct issues, namely the accelerated expansion of the universe and microscopic physics," Josset told "This offers a fresh look at the cosmological constant problem, which is still far from being solved."
Einstein originally proposed the concept of the cosmological constant in 1917 as a modification to his theory of general relativity, in order to prevent the universe from expanding, since at the time the universe was considered to be static.
Now that modern observations show that the universe is expanding at an accelerating rate, the cosmological constant today can be thought of as the simplest form of dark energy, offering a way to account for current observations.
However, there is a huge discrepancy—up to 120 orders of magnitude—between the large theoretical predicted value of the cosmological constant and the tiny observed value. To explain this disagreement, some research has suggested that the cosmological constant may be an entirely new constant of nature that must be measured more precisely, while another possibility is that the underlying mechanism assumed by theory is incorrect. The new study falls into the second line of thought, suggesting that scientists still do not fully understand the root causes of the cosmological constant.
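To make the size of that mismatch concrete, here is a minimal sketch using commonly quoted order-of-magnitude figures (the naive quantum field theory estimate with a Planck-scale cutoff versus the observed vacuum energy density; these round numbers are illustrative assumptions, not values from the paper):

```python
import math

# Illustrative order-of-magnitude values (assumptions, not from the paper):
predicted = 1e113  # J/m^3, naive QFT vacuum energy with a Planck-scale cutoff
observed = 1e-9    # J/m^3, vacuum energy density inferred from observations

gap = math.log10(predicted / observed)
print(f"Mismatch: about {gap:.0f} orders of magnitude")
```

Depending on the cutoff assumed, published estimates of the gap range from tens up to roughly 120 orders of magnitude, which is why the problem is called "the worst theoretical prediction in the history of physics."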
The basic idea of the new paper is that violations of energy conservation in the early universe could have been so small that they would have negligible effects at local scales and remain inaccessible to modern experiments, yet at the same time these violations could have made significant contributions to the present value of the cosmological constant.
To most people, the idea that conservation of energy is violated goes against everything they learned about the most fundamental laws of physics. But on the cosmological scale, conservation of energy is not as steadfast a law as it is on smaller scales. In this study, the physicists specifically investigated two theories in which violations of energy conservation naturally arise.
The first scenario of violations involves modifications to quantum theory that have previously been proposed to investigate phenomena such as the creation and evaporation of black holes, and which also appear in interpretations of quantum mechanics in which the wavefunction undergoes spontaneous collapse. In these cases, energy is created in an amount that is proportional to the mass of the collapsing object.
Violations of energy conservation also arise in some approaches to quantum gravity in which spacetime is considered to be granular due to the fundamental limit of length (the Planck length, which is on the order of 10^-35 m). This spacetime discreteness could have led to either an increase or decrease in energy that may have begun contributing to the cosmological constant starting when photons decoupled from electrons in the early universe, during the period known as recombination.
As the researchers explain, their proposal relies on a modification to general relativity called unimodular gravity, first proposed by Einstein in 1919.
"Energy from matter components can be ceded to the gravitational field, and this 'loss of energy' will behave as a cosmological constant—it will not be diluted by later expansion of the universe," Josset said. "Therefore a tiny loss or creation of energy in the remote past may have significant consequences today on large scale."
Whatever the source of the energy conservation violation, the important result is that the energy that was created or lost affected the cosmological constant to a greater and greater extent as time went by, while the effects on matter decreased over time due to the expansion of the universe.
Another way to put it, as the physicists explain in their paper, is that the cosmological constant can be thought of as a record of the energy non-conservation during the history of the universe.
Currently there is no way to tell whether the violations of energy conservation investigated here truly did affect the cosmological constant, but the physicists plan to further investigate the possibility in the future.
"Our proposal is very general and any violation of energy conservation is expected to contribute to an effective cosmological constant," Josset said. "This could allow to set new constraints on phenomenological models beyond standard .
"On the other hand, direct evidence that dark energy is sourced by energy non-conservation seems largely out-of-reach, as we have access to the value of lambda [the ] today and constraints on its evolution at late time only."

Credit: Lisa Zyga  

Energy scenarios that actually provide useful decision-support tools for policymakers and investors

Fulfilling the promise of the 2015 Paris Agreement on climate change—most notably the goal of limiting the rise in mean global surface temperature since preindustrial times to 2 degrees Celsius—will require a dramatic transition away from fossil fuels and toward low-carbon energy sources. To map out that transition, decision-makers routinely turn to energy scenarios, which use computational models to project changes to the energy mix that will be needed to meet climate and environmental targets. These models account for not only technological, economic, demographic, political, and institutional developments, but also the scope, timing, and stringency of policies to reduce greenhouse gas emissions and air pollution.
Credit: David Pilbrow/Flickr
Model-driven scenarios provide policymakers and investors with a powerful decision-support tool but should not be used as a decision-making tool due to several limitations. So argues a new study in the journal Energy and Environment by Sergey Paltsev, deputy director of the MIT Joint Program on the Science and Policy of Global Change and a senior research scientist for both the Joint Program and the MIT Energy Initiative. The study shows that overall, energy scenarios are useful for assessing policymaking and investment risks associated with different emissions reduction pathways, but tend to overestimate the degree to which future energy demand will resemble the past.
"Energy scenarios may not provide exact projections, but they are the best available tool to assess the magnitude of challenges that lie ahead," Paltsev observes in the study, a unique review of the value and limits of widely used energy scenarios that range from the International Energy Agency (IEA) World Energy Outlook, to the Joint Program's own annual Food, Water, Energy and Climate Outlook (which uses the MIT Economic Projection and Policy Analysis model), to a recent Intergovernmental Panel on Climate Change (IPCC) assessment report (AR5) presenting 392 energy scenarios aligned with the 2 C climate stabilization goal.
The study points out that because energy scenarios tend to vary widely in terms of the projections they produce for a given policy and the degree of uncertainty associated with those projections, it's not advisable to base an energy policy or investment decision on a single energy scenario. Taken collectively, however, energy scenarios can help bring into sharp focus a range of plausible futures—information decision-makers can use to assess the scale and cost of the technological changes needed to effect significant transformations in energy production and consumption. A careful review of multiple energy scenarios associated with a particular emissions pathway can provide a qualitative analysis of what's driving the results and the potential risks and benefits of a proposed policy or investment.
That said, projections in energy scenarios can sometimes be highly inaccurate due to factors that are difficult to anticipate.
For example, according to the study, which compared several energy scenario projections to historical observations, most energy scenarios do not account for sudden changes to the status quo. One of the greatest contributors to uncertainty in energy scenarios is the demand for low-emitting energy technologies, whose timing and scale of deployment—dependent on several economic and political factors—is highly unpredictable. Paltsev notes that the IEA constantly underestimates renewable energy deployment; in its 2006 World Energy Outlook, the agency projected for 2020 a level of wind power generation that the world exceeded as early as 2013.
In addition, while energy scenarios have been largely successful in projecting the quantity of global energy demand (e.g., the 1994 IEA World Energy Outlook's projection for 2010 was off by only 10 percent, despite highly disruptive developments such as the breakup of the Soviet Union, the world recession in 2008, and the emergence of the shale gas industry), most have been considerably off the mark when it comes to projecting energy prices (e.g., in 1993 dollars, the 1994 IEA WEO projected $28/barrel in 2010, but the actual price was $53/barrel).
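The size of that price miss is easy to quantify with a quick percent-error calculation (figures as quoted in the study):

```python
def percent_error(projected: float, actual: float) -> float:
    """Signed percent error of a projection relative to the observed value."""
    return 100.0 * (projected - actual) / actual

# 1994 IEA World Energy Outlook oil-price projection for 2010, in 1993 dollars
price_error = percent_error(projected=28.0, actual=53.0)
print(f"Oil price projection error: {price_error:.0f}%")
```

By this measure the price projection missed by nearly half, while the 10 percent miss on demand looks remarkably good given the disruptions listed above.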
Recognizing the steep challenge in projecting demand and prices for different energy sources in the midst of a dramatic energy transition, Paltsev emphasizes that governments should not try to pick a "winner"—a single energy technology that seems poised to reduce emissions singlehandedly—but rather adopt a strategy that targets emissions reductions from any energy source.
"Governments shouldn't pick the winners, because most likely that choice will be wrong," he says. "They should instead design policies such as carbon-pricing and emissions trading systems that are designed to achieve emissions reduction targets at the least cost."

Credit: Mark Dwortzan

Tumor suppressor key in maintaining stem cell status in muscle

A gene known to suppress tumor formation in a broad range of tissues plays a key role in keeping stem cells in muscles dormant until needed, a finding that may have implications for both human health and animal production, according to a Purdue University study.
Shihuan Kuang, professor of animal sciences, and Feng Yue, a postdoctoral researcher in Kuang's lab, reported their findings in two papers published in the journals Cell Reports and Nature Communications. The results suggest modifying expression of the PTEN gene could one day play a role in increasing muscle mass in agricultural animals and improving therapies for muscle injuries in humans.
Muscle stem cells, called satellite cells, normally sit in a quiescent, or dormant, state until called upon to build muscle or repair a damaged muscle. Inability to maintain the quiescence would lead to a loss of satellite cells. As humans age, the number of satellite cells gradually declines and the remaining cells become less effective in regenerating muscles, resulting in muscle loss – a condition called sarcopenia.
Kuang and Yue, in the Nature Communications paper, explored the role the tumor-suppressor gene PTEN plays in satellite cells. The PTEN gene encodes a protein that suppresses growth signaling, thereby limiting the growth of fast-growing tumor cells. Mutation of the PTEN gene is associated with many types of cancers, but how the gene functions in muscle stem cells was unknown.
To understand the function of a gene, the authors first wanted to know how the gene is expressed.
"This gene is highly expressed in the satellite cells when the cells are in the quiescent state. When they become differentiated, the PTEN level reduces," Yue said.
By knocking out the PTEN gene in resting satellite cells, the researchers found that satellite cells quickly differentiate and become muscle cells. So PTEN plays an essential role in keeping satellite cells in their quiescent state.
"You no longer have the stem cells once you knock out the gene," Kuang said.
In their Cell Reports paper, Kuang and Yue took a step further to examine PTEN function in proliferating stem cells. This time, they knocked out PTEN in embryonic progenitor cells, those that will later become muscle in the mouse. They found that as the mouse grew, muscle mass increased significantly—by as much as 40 percent in some muscles—over that of a normal mouse.
"That would be significant in an animal production point of view," Kuang said.
The increased muscle came with a cost, however. Besides creating muscle, those embryonic progenitor cells also create satellite cells. Without PTEN, not only were fewer satellite cells created, but the resulting satellite cells could not maintain dormancy, leading to an accelerated rate of depletion during aging.
The faster depletion of satellite cells during aging wouldn't matter much in an animal production scenario, Kuang said. Beef cattle, for example, are harvested before they age. The increase in muscle mass, however, would be a significant advantage in production efficiency.
The findings may lead to improvement in human health, the authors said. The ability to control the expression of PTEN could lead to therapies for quicker healing of muscle injuries.
"If you want to quickly boost up the stem cells to repair something, you need to suppress PTEN," Kuang said. "After that, you'd need to increase PTEN to return the cells back to quiescent state. If we could do that, you would suspect that the muscle would repair more quickly."
Knowing that PTEN also suppresses tumors in many types of tissues, the authors noted that the elimination of the gene did not cause tumor formation in the cells they studied. That suggests regulation of PTEN could be a feasible method for improving human health and animal agriculture.

Credit: Brian Wallheimer
Raspberry Pi brings out shiny Compute Module 3

Compute Module 3
Another Raspberry Pi launch announcement—and another burst of news items explaining what's new, at what price.
This time it is about the Raspberry Pi Compute Module 3 (CM3). Trusted Reviews said it comes with 64-bit and multi-core functionality.
"The new Compute Module is based on the BCM2837 processor – the same as found in the Raspberry Pi 3 – running at 1.2 GHz with 1 gigabyte of RAM," said Hackaday.
The Raspberry Pi blog provided the CM3 launch announcement:
"Way back in April of 2014 we launched the original Compute Module (CM1), which was based around the BCM2835 processor of the original Raspberry Pi. CM1 was a great success and we've seen a lot of uptake from various markets, particularly in IoT and home and factory automation."
Now there is a new CM3 based on the Raspberry Pi 3 hardware, "providing twice the RAM and roughly 10x the CPU performance of the original Module," according to the blog.
Ars Technica noted that it was the first big upgrade since 2014. That year, said Trusted Reviews, the original module "combined the guts of a first-generation Pi with a small SODIMM-layout module."
The new version, said Joe Roberts in Trusted Reviews, "which uses the same BCM2837, a quad-core 64-bit ARMv8 part, as the Pi 3, brings the Compute Module fully up to date."
There will be two flavors: CM3 and CM3L (Lite). The 'L' version is a CM3 without eMMC Flash; that is, as described by RS Components, it is "not fitted with eMMC Flash and the SD/eMMC interface. But pins are available for the designer to connect their own SD/eMMC device."
According to the blog, the Lite version "brings the SD card interface to the Module pins so a user can wire this up to an eMMC or SD card of their choice."
Jon Brodkin in Ars Technica said that the Compute Module's stripped-down form factor makes it more suitable for embedded computing, as it fits into a standard SODIMM connector. The new Compute Module can run Windows IoT Core and supports Linux.
The latest version is being used by NEC, said Brodkin, in displays intended for digital signs, streaming, and presentations. The Raspberry Pi blog, meanwhile, said that "we're already excited to see NEC displays, an early adopter, launching their CM3-enabled display solution."
The blog stated pricing for the two flavors. The CM3 and CM3L are priced at $30 and $25, respectively (excluding tax and shipping), and this price applies to any size order. The original Compute Module has also been reduced to $25. The blog said one can "Head on over to our partners element14 (or Farnell UK) and RS Components" to buy them.
What about backwards compatibility? According to the blog "The CM3 is largely backwards-compatible with CM1 designs which have followed our design guidelines."
The blog presented the caveats: The module is 1mm taller than the original module; "the processor core supply (VBAT) can draw significantly more current. Consequently, the processor itself will run much hotter under heavy CPU load, so designers need to consider thermals based on expected use cases."

Credit: Nancy Owano
The strength of real hair inspires new materials for body armor

Researchers at the University of California San Diego investigate why hair is incredibly strong and resistant to breaking.
In a new study, researchers at the University of California San Diego investigate why hair is incredibly strong and resistant to breaking. The findings could lead to the development of new materials for body armor and help cosmetic manufacturers create better hair care products.
Hair has a strength to weight ratio comparable to steel. It can be stretched up to one and a half times its original length before breaking. "We wanted to understand the mechanism behind this extraordinary property," said Yang (Daniel) Yu, a nano-engineering Ph.D. student at UC San Diego and the first author of the study.
"Nature creates a variety of interesting materials and architectures in very ingenious ways. We're interested in understanding the correlation between the structure and the properties of biological materials to develop synthetic materials and designs—based on nature—that have better performance than existing ones," said Marc Meyers, a professor of mechanical engineering at the UC San Diego Jacobs School of Engineering and the lead author of the study.
In a study published online in December in the journal Materials Science and Engineering C, researchers examined at the nano-scale level how a strand of human hair behaves when it is deformed, or stretched. The team found that hair behaves differently depending on how fast or slow it is stretched. The faster hair is stretched, the stronger it is. "Think of a highly viscous substance like honey," Meyers explained. "If you deform it fast it becomes stiff, but if you deform it slowly it readily pours."
Hair consists of two main parts—the cortex, which is made up of parallel fibrils, and the matrix, which has an amorphous (random) structure. The matrix is sensitive to the speed at which hair is deformed, while the cortex is not. The combination of these two components, Yu explained, is what gives hair the ability to withstand high stress and strain.
And as hair is stretched, its structure changes in a particular way. At the nano-scale, the cortex fibrils in hair are each made up of thousands of coiled spiral-shaped chains of molecules called alpha helix chains. As hair is deformed, the alpha helix chains uncoil and become pleated sheet structures known as beta sheets. This structural change allows hair to withstand a large amount of deformation without breaking.
This structural transformation is partially reversible. When hair is stretched under a small amount of strain, it can recover its original shape. Stretched further, the structural transformation becomes irreversible. "This is the first time evidence for this transformation has been discovered," Yu said.
"Hair is such a common material with many fascinating properties," said Bin Wang, a UC San Diego PhD alumna and co-author on the paper. Wang is now at the Shenzhen Institutes of Advanced Technology in China continuing research on hair.
The team also conducted stretching tests on hair at different humidity levels and temperatures. At higher humidity levels, hair can withstand up to 70 to 80 percent deformation before breaking. Water essentially "softens" hair—it enters the matrix and breaks the sulfur bonds connecting the filaments inside a strand of hair. Researchers also found that hair starts to undergo permanent damage at 60 degrees Celsius (140 degrees Fahrenheit). Beyond this temperature, hair breaks faster at lower stress and strain.
"Since I was a child I always wondered why hair is so strong. Now I know why," said Wen Yang, a former postdoctoral researcher in Meyers' research group and co-author on the paper.
The team is currently conducting further studies on the effects of water on the properties of hair. Moving forward, the team is investigating the detailed mechanism of how washing hair causes it to return to its original shape.

How To Rename Multiple Files at One Time in Windows 10

In the Windows 10 File Explorer, renaming files in large batches is simple, but for many users, myself included, the feature is not well known.
In this Quick Tip article I want to share with you how easy it is to use this capability of File Explorer.

 Process :-  

Step 1: Select the images you want to rename
In Windows 10 there is always more than one way to accomplish most tasks, so once you have File Explorer open to the directory of files you want to rename, you can use the keyboard shortcut CTRL + A to select all of the files, or use the Select All button on the Home view of File Explorer. Alternatively, select only those images you want to rename at once.

Once you have selected the images/files that you want to rename as a group, move on to Step 2.
Step 2: Rename the files
Renaming files in a batch is done the same way as renaming a single file.
Once all of the images/files you want to rename are selected, right click on the first image/file and select Rename from the context menu.

You will then have an editable name field for the first image/file in the sequence - just give it whatever name you choose for the group of images/files. Hit the Enter key once you have the new name typed in.

Now you will see all the files with the new name followed by a sequential number in parentheses. You have now successfully renamed your files in one batch.

Here is one last interesting thing about this feature: if you click on any image/file other than the first one in the collection, it will give that file the first sequential number and then continue from that image/file in sequential order until it reaches the end of the list. At that point it will wrap around to the beginning and continue renaming until it reaches the file just before the one you started with.
So a key aspect of this process is to make sure the files are in the order you want them numbered, and to start with the first image/file in the directory.

Screenshot :-

Blitab Technology develops tablet for the blind and visually impaired
Blitab, a tablet with a Braille interface, looks like a promising step up for blind and low vision people who want to be part of the educational, working and entertainment worlds of digital life.
In a video, Blitab Technology founder Kristina Tsvetanova said the idea for such a tablet came to her during her studies as an industrial engineer. At the time, a blind colleague of hers asked her to sign him up for an online course, and a question nagged her: How could technology help him better?
Worldwide, she said, there are more than 285 million blind and visually impaired people.
She was aware that in general blind and low vision people were coping with old, bulky technology, contributing to low literacy rates among blind children. She and her team have been wanting to change that.
There was ample room for improvements. The conventional interfaces for the blind, she said, have been slow and expensive. She said the keyboard can range from about $5000 to $8000. Also, she said, they are limited to what the blind person can read, just a few words at a time. Imagine, she said, reading Moby Dick, five words at a time.
They have engineered a tablet with a 14-line Braille display on the top and a touch screen on the bottom.

Part of their technology involves a high performance membrane, and their press statement said the tablet uses smart micro fluids to develop small physical bubbles instead of a screen display.
They have produced a tactile tablet, she said, where people with sight loss can learn, work and play using that device.
The user can control the tablet with voice-over if the person wants to listen to an ebook or by pressing one button, dots will be activated on the screen and the surface of the screen will change.
Romain Dillet, in TechCrunch: "The magic happens when you press the button on the side of the device. The top half of the device turns into a Braille reader. You can load a document, a web page—anything really—and then read the content using Braille."
Tsvetanova told Dillet, "We're not excluding voice over; we combine both of these things." She said they offer both "the tactile experience and the voice over experience."
Rachel Metz reported in MIT Technology Review: "The Blitab's Braille display includes 14 rows, each made up of 23 cells with six dots per cell. Every cell can present one letter of the Braille alphabet. Underneath the grid are numerous layers of fluids and a special kind of membrane," she wrote.

Blitab Technology develops tablet for the blind and visually impaired
Credit: Blitab
At heart, it's an Android tablet, Dillet said, "so it has Wi-Fi and Bluetooth and can run all sorts of Android apps."
Metz said that with eight hours of use per day, it's estimated to last for five days on one battery charge.
The team has set the price of the device at $500.
How they will proceed: first, she said, they will sell directly from their website, then scale through global distributors, and then distribute to the less developed world.
What's next? Dillet said in the Jan. 6 article that "the team of 10 plans to ship the [tablet] in six months with pre-orders starting later this month."
Blitab Technology recently took first place in the Digital Wellbeing category of the 2016 EIT Digital Challenge. EIT Digital is described as a European open innovation organization. They seek to foster digital innovation and entrepreneurial talent.

Credit: Nancy Owano
SpaceX set to launch for first time since Sept blast

Falcon 9
The picture above shows the Falcon 9 rocket.
SpaceX is poised to blast off a Falcon 9 rocket on Saturday, marking its first return to flight since a costly and complicated launchpad explosion in September.
The launch of 10 satellites for Iridium, a mobile and data communications company, is scheduled from Vandenberg Air Force Base in California at 9:54 am (1754 GMT).
The launch window is "instantaneous," meaning that any technical glitch or poor weather—the current forecast is just 60 percent favorable—would push the launch back to the next opportunity, on Sunday at 1749 GMT.
The stakes for SpaceX are high after a pair of accidents.
September's blast destroyed a $200 million satellite Facebook had planned to use to beam high-speed internet to Africa. Another explosion in June 2015 two minutes after liftoff obliterated a Dragon packed with goods bound for the astronauts at the International Space Station.
The incidents cost SpaceX dearly, possibly pushing the privately owned company into the red, the Wall Street Journal reported this week.
"That June 2015 disaster, followed by months of launch delays, contributed to a quarter-billion dollar annual loss and a six percent drop in revenue, after two years of surging sales and small profits," the paper said after a review of internal financial documents from 2011 to 2015, forecasts for the next decade and interviews with former SpaceX employees.
Three weeks after last September's accident, the company removed a long-standing phrase from its website saying it was "profitable and cash-flow positive."
That "suggest(ed) both profit and cash flow had moved into the red for 2016," the Journal said, noting that it found an operating loss for every quarter in 2016 and negative cash flow of roughly $15 million.
SpaceX, headed by billionaire entrepreneur Elon Musk, declined to comment on the findings and is not obligated to release its financial figures because it is a private company, the report said.
"The company is in a financially strong position and is well positioned for future growth," with $1 billion in cash and no debt, SpaceX chief financial officer Bret Johnson was quoted as saying.

Problems fixed
The June 2015 accident—in which the unmanned Dragon cargo ship exploded in a massive fireball two minutes after launch—was caused by a faulty strut that allowed a helium tank to snap loose, SpaceX said.
Last September's explosion, during a test a day prior to a scheduled launch, was traced to a problem with a pressure vessel in the second-stage liquid oxygen tank.
SpaceX said it will change the way it fuels for now and redesign its pressure vessels in the future.
Musk, who cofounded PayPal and also owns Tesla Motors, has lofty goals, including colonizing Mars and revolutionizing the launch industry by making rocket components reusable.
Founded in 2002, SpaceX logged 18 successful launches of the Falcon 9 before the 2015 accident.
The company has a $1.6 billion contract with NASA to supply the International Space Station using its Dragon space capsule, which is the only cargo ship that can return to the Earth intact.
SpaceX had hoped to resume Falcon 9 flights as early as November, then in mid-December, before pushing the date to January.
Nokia sues Apple for patent infringement


Nokia announced Wednesday it is suing Apple in German and US courts for patent infringement, claiming the US tech giant was using Nokia technology in "many" products without paying for it.
Finnish Nokia, once the world's top mobile phone maker, said the two companies had signed a licensing agreement in 2011, and since then "Apple has declined subsequent offers made by Nokia to license other of its patented inventions which are used by many of Apple's products."
"After several years of negotiations trying to reach agreement to cover Apple's use of these patents, we are now taking action to defend our rights," Ilkka Rahnasto, head of Nokia's patent business, said in a statement.
The complaints, filed in three German cities and a district court in Texas, concern 32 patents for innovations related to displays, user interface, software, antennae, chipsets and video coding. Nokia said it was preparing further legal action elsewhere.
Nokia was the world's leading mobile phone maker from 1998 until 2011 when it bet on Microsoft's Windows mobile platform, which proved to be a flop. Analysts say the company failed to grasp the growing importance of smartphone apps compared to hardware.
It sold its unprofitable handset unit in 2014 for some $7.2 billion to Microsoft, which dropped the Nokia name from its Lumia smartphone handsets.
Meanwhile Nokia has concentrated on developing its mobile network equipment business by acquiring its French-American rival Alcatel-Lucent.
Including its 2013 full acquisition of joint venture Nokia Siemens Networks, Nokia said the three companies united represent more than 115 billion euros of R&D investment, with a massive portfolio of tens of thousands of patents.
The 2011 licensing deal followed years of clashes with Apple, which has also sparred with main rival Samsung over patent claims.
At the time, Apple cut the deal to settle 46 separate complaints Nokia had lodged against it for violation of intellectual property.
Second-generation stars identified, giving clues about their predecessors

The figure shows a sub-population of ancient stars, called Carbon-Enhanced Metal-Poor (CEMP) stars. These stars contain 100 to 1,000,000 times LESS iron (and other heavy elements) than the Sun, but 10 to 10,000 times MORE carbon, relative to iron. The unusual chemical compositions of these stars provide clues to their birth environments, and the nature of the stars in which the carbon formed. In the figure, A(C) is the absolute amount of carbon, while the horizontal axis represents the ratio of iron, relative to hydrogen, compared with the same ratio in the Sun. Credit: University of Notre Dame
University of Notre Dame astronomers have identified what they believe to be the second generation of stars, shedding light on the nature of the universe's first stars.
A subclass of carbon-enhanced metal-poor (CEMP) stars, the so-called CEMP-no stars, are ancient stars that have large amounts of carbon but little of the heavy elements (such as iron) common to later-generation stars. Massive first-generation stars made up of pure hydrogen and helium produced metals that were ejected by stellar winds during their lifetimes or when they exploded as supernovae. Those metals—anything heavier than helium, in astronomical parlance—polluted the nearby gas clouds from which new stars formed.
Jinmi Yoon, a postdoctoral research associate in the Department of Physics; Timothy Beers, the Notre Dame Chair in Astrophysics; and Vinicius Placco, a research professor at Notre Dame, along with their collaborators, show in findings published in the Astrophysical Journal this week that the lowest-metallicity stars, the most chemically primitive, include large fractions of CEMP stars. The CEMP-no stars, which are also rich in nitrogen and oxygen, are likely the stars born out of hydrogen and helium gas clouds that were polluted by the elements produced by the universe's first stars.
"The CEMP-no stars we see today, at least many of them, were born shortly after the Big Bang, 13.5 billion years ago, out of almost completely unpolluted material," Yoon says. "These stars, located in the halo system of our galaxy, are true second-generation stars—born out of the nucleosynthesis products of the very first stars."
Beers says it's unlikely that any of the universe's first stars still exist, but much can be learned about them from detailed studies of the next generation of stars.
"We're analyzing the chemical products of the very first stars by looking at what was locked up by the second-generation stars," Beers says. "We can use this information to tell the story of how the first elements were formed, and determine the distribution of the masses of those first stars. If we know how their masses were distributed, we can model the process of how the first stars formed and evolved from the very beginning."
The authors used high-resolution spectroscopic data gathered by many astronomers to measure the chemical compositions of about 300 stars in the halo of the Milky Way. More and heavier elements form as later generations of stars continue to contribute additional metals, they say. As new generations of stars are born, they incorporate the metals produced by prior generations. Hence, the more heavy metals a star contains, the more recently it was born. Our sun, for example, is relatively young, with an age of only 4.5 billion years.
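The iron-to-hydrogen ratio on the figure's horizontal axis is conventionally written [Fe/H]: the logarithm of a star's iron abundance relative to the Sun's. As a rough illustration of the notation only (the number densities below are hypothetical values, not data from the study):

```python
import math

def fe_h(n_fe, n_h, n_fe_sun=2.82e-5, n_h_sun=1.0):
    """[Fe/H] = log10(N_Fe / N_H) - log10(N_Fe / N_H)_Sun.
    Negative values mean less iron than the Sun; -3 means one-thousandth."""
    return math.log10(n_fe / n_h) - math.log10(n_fe_sun / n_h_sun)

# A star with one-thousandth of the solar iron abundance:
print(fe_h(2.82e-8, 1.0))  # ≈ -3.0
```

On this scale, stars with 100 to 1,000,000 times less iron than the Sun, as described in the figure caption, span roughly [Fe/H] = -2 down to [Fe/H] = -6.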
A companion paper, titled "Observational constraints on first-star nucleosynthesis. II. Spectroscopy of an ultra metal-poor CEMP-no star," of which Placco was the lead author, was also published in the same issue of the journal this week. The paper compares theoretical predictions for the chemical composition of zero-metallicity supernova models with a newly discovered CEMP-no star in the Milky Way galaxy.

Credit: Brian Wallheimer
A Swiss firm acquires Mars One private project


Mars One consists of two entities: the Dutch not-for-profit Mars One Foundation and a British public limited company Mars One Ventures
A British-Dutch project aiming to send an unmanned mission to Mars by 2018 announced Friday that the shareholders of a Swiss financial services company have agreed a takeover bid.
"The acquisition is now only pending approval by the board of Mars One Ventures," the company said in a joint statement with InFin Innovative Finance AG, adding approval from the Mars board would come "as soon as possible."
"The takeover provides a solid path to funding the next steps of Mars One's mission to establish a permanent human settlement on Mars," the statement added.
Mars One consists of two entities: the Dutch not-for-profit Mars One Foundation and a British public limited company Mars One Ventures.
Mars One aims to establish a permanent human settlement on the Red Planet, and is currently "in the early mission concept phase," the company says, adding securing funding is one of its major challenges.
Some 200,000 hopefuls from 140 countries initially signed up for the Mars One project, which is to be partly funded by a television reality show about the endeavour.
Those candidates have now been whittled down to just 100, out of whom 24 will be selected for one-way trips to Mars, due to start in 2026 after several unmanned missions have been completed.
"Once this deal is completed, we'll be in a much stronger financial position as we begin the next phase of our mission. Very exciting times," said Mars One chief executive Bas Lansdorp.
NASA is currently working on three Mars missions with the European Space Agency and plans to send another rover to Mars in 2020.
But NASA has no plans for a manned mission to Mars until the 2030s.