Tuesday 23 December 2014

The Milky Way Has A New Neighbour

The Milky Way, the galaxy we live in, is part of a cluster of more than 50 galaxies that make up the ‘Local Group’, a collection that includes the famous Andromeda galaxy and many other far smaller objects. Now a Russian-American team have added to the canon, finding a tiny and isolated dwarf galaxy almost 7 million light years away. Their results appear in Monthly Notices of the Royal Astronomical Society.


A negative image of KKs 3, made using the Advanced Camera for Surveys on the Hubble Space Telescope. The core of the galaxy is the right hand dark object at the top centre of the image, with its stars spreading out over a large section around it. (The left hand of the two dark objects is a much nearer globular star cluster.) Credit: D. Makarov


The team, led by Prof Igor Karachentsev of the Special Astrophysical Observatory in Karachai-Cherkessia, Russia, found the new galaxy, named KKs3, using the Hubble Space Telescope Advanced Camera for Surveys (ACS) in August 2014. KKs3 is located in the southern sky in the direction of the constellation of Hydrus and its stars have only one ten-thousandth of the mass of the Milky Way.


KKs3 is a ‘dwarf spheroidal’ or dSph galaxy, lacking features like the spiral arms found in our own galaxy. These systems also lack the raw materials (gas and dust) needed for new generations of stars to form, leaving behind older and fainter relics. In almost every case, this raw material seems to have been stripped out by nearby massive galaxies like Andromeda, so the vast majority of dSph objects are found near much bigger companions.


Isolated objects must have formed in a different way, with one possibility being that they had an early burst of star formation that used up the available gas resources. Astronomers are particularly interested in finding dSph objects to understand galaxy formation in the universe in general, as even HST struggles to see them beyond the Local Group. The absence of clouds of hydrogen gas in nebulae also makes them harder to pick out in surveys, so scientists instead try to find them by picking out individual stars.


For that reason, only one other isolated dwarf spheroidal, KKR 25, has been found in the Local Group, a discovery made by the same group back in 1999.


Team member Prof Dimitry Makarov, also of the Special Astrophysical Observatory, commented: “Finding objects like KKs3 is painstaking work, even with observatories like the Hubble Space Telescope. But with persistence, we’re slowly building up a map of our local neighbourhood, which turns out to be less empty than we thought. It may be that there are a huge number of dwarf spheroidal galaxies out there, something that would have profound consequences for our ideas about the evolution of the cosmos.”


The team will continue to look for more dSph galaxies, a task that will become a little easier in the next few years, once instruments like the James Webb Space Telescope and the European Extremely Large Telescope begin service.


– Credit and Resource –


More information: “A new isolated dSph galaxy near the Local Group.” MNRAS Vol. 447, L85-L89 (February 11, 2015). DOI: 10.1093/mnrasl/slu181. First published online December 21, 2014.


Provided by Royal Astronomical Society


Graphene offers X-ray photoelectron spectroscopy a window of opportunity

X-ray photoelectron spectroscopy (XPS) is one of the most sensitive and informative surface analysis techniques available. However, XPS requires a high vacuum to operate, which makes analyzing materials in liquid and gaseous environments difficult.


Drawing shows the set-up for an X-ray photoelectron spectroscopy instrument incorporating suspended, electron-transparent graphene membranes—or windows—that separate the sample from the high-vacuum detection system. Credit: NIST


Now, researchers from the National Institute of Standards and Technology (NIST), ELETTRA (Italy) and Technical University of Munich (Germany) have found that graphene—a single-atom-thick sheet of carbon—could make using XPS to study materials in these environments much less expensive and complicated than the conventional approach. Their results were published in the journal Nanoscale.


Researchers have analyzed cells and microorganisms using visible light, which, while informative and gentle, cannot be used to probe objects much smaller than about 500 nanometers. But many of life’s most important processes and interactions take place at much smaller length scales. The same is true with batteries: everything that can go wrong with them takes place at the tiny interfaces between the electrodes and the electrolyte—far beyond the reach of optical microscopes.


Many researchers would like to use X rays or electrons to look deeper into these materials, but few labs have the sophisticated equipment necessary to do so, and those labs that are so outfitted are often too pricey for today’s budget-conscious scientists.


XPS works by bombarding the surface under study with X rays. The atoms on the surface of the material absorb the X-ray energy and re-emit that energy as photoelectrons. Scientists study the kinetic energy and number of the emitted electrons for clues about the sample’s composition and electronic state.
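
In energy terms, the technique rests on the photoelectric relation (standard physics for XPS, though not spelled out in this article): the kinetic energy of an emitted photoelectron is

    E_kinetic = hν − E_binding − φ

where hν is the X-ray photon energy, E_binding the electron’s binding energy in the atom, and φ the spectrometer’s work function. Because binding energies are characteristic of each element and its chemical state, the measured kinetic energies reveal what is on the surface.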


Because X rays and photoelectrons interact with the air, XPS has to be performed under high vacuum, which makes it hard to study materials that have to be in a pressurized environment. What researchers needed was a window material that was nearly transparent to X rays and photoelectrons, but impermeable to gases and liquids and strong enough to withstand the mechanical stress of one atmosphere’s worth of pressure.


Knowing that graphene, the wonder material of the 21st century, has these properties, the group explored using it as a window to separate their sample stage’s atmospheric pressure liquid compartment from the high-vacuum conditions inside the electron spectrometer.


According to NIST researcher Andrei Kolmakov, their results demonstrate that more than enough X rays—and resultant photoelectrons—are able to pass through the graphene window to produce good quality XPS data from liquids and gases.


As an added bonus, the group was also able to measure the intensity of radiation needed to create bubbles in water, an often unwanted effect that occurs when the X rays split water into oxygen and hydrogen. Knowing the point at which bubbles form, they were able to define an upper limit on the intensities of the X rays (or electrons) that can be used in this approach.


“We think our work could fill a much-needed gap,” says Kolmakov. “There are many scientists whose work would benefit from using XPS at ambient pressure, but there are not enough instruments that are equipped to analyze the samples under these conditions, and the ones out there are often too costly to use. Our design is far simpler and has the potential to reduce costs to the level that this type of measurement could be afforded by many more labs. With this imaging capability, other researchers could, for example, learn much more about how to create longer-lasting batteries and develop safer and more effective drugs.”


Of course, as often happens with new technologies, the approach has a few challenges and limitations. Kolmakov says that the adhesion of the graphene to the surface surrounding the opening needs to be improved. Moreover, the barrage of X rays degrades atomically thin graphene over time, so the team is planning to look for ways to mitigate that, if possible.


– Credit and Resources –


More information: J. Kraus, R. Reichelt, S. Günther, L. Gregoratti, M. Amati, M. Kiskinova, A. Yulaev, I. Vlassiouk, and A. Kolmakov. “Photoelectron spectroscopy of wet and gaseous samples through graphene membranes.” Nanoscale. Published online Sept. 22, 2014. DOI: 10.1039/C4NR03561E


Provided by National Institute of Standards and Technology


Sun sizzles in high-energy X-rays

For the first time, a mission designed to set its eyes on black holes and other objects far from our solar system has turned its gaze back closer to home, capturing images of our sun. NASA’s Nuclear Spectroscopic Telescope Array, or NuSTAR, has taken its first picture of the sun, producing the most sensitive solar portrait ever taken in high-energy X-rays.


X-rays stream off the sun in this image showing observations from NASA’s Nuclear Spectroscopic Telescope Array, or NuSTAR, overlaid on a picture taken by NASA’s Solar Dynamics Observatory (SDO). Credit: NASA/JPL-Caltech/GSFC


“NuSTAR will give us a unique look at the sun, from the deepest to the highest parts of its atmosphere,” said David Smith, a solar physicist and member of the NuSTAR team at University of California, Santa Cruz.


Solar scientists first thought of using NuSTAR to study the sun about seven years ago, after the space telescope’s design and construction were already underway (the telescope launched into space in 2012). Smith had contacted the principal investigator, Fiona Harrison of the California Institute of Technology in Pasadena, who mulled it over and became excited by the idea.


“At first I thought the whole idea was crazy,” says Harrison. “Why would we have the most sensitive high energy X-ray telescope ever built, designed to peer deep into the universe, look at something in our own back yard?” Smith eventually convinced Harrison, explaining that faint X-ray flashes predicted by theorists could only be seen by NuSTAR.


While the sun is too bright for other telescopes such as NASA’s Chandra X-ray Observatory, NuSTAR can safely look at it without the risk of damaging its detectors. The sun is not as bright in the higher-energy X-rays detected by NuSTAR, a factor that depends on the temperature of the sun’s atmosphere.


This first solar image from NuSTAR demonstrates that the telescope can in fact gather data about the sun. And it gives insight into questions about the remarkably high temperatures that are found above sunspots—cool, dark patches on the sun. Future images will provide even better data as the sun winds down in its solar cycle.


“We will come into our own when the sun gets quiet,” said Smith, explaining that the sun’s activity will dwindle over the next few years.


With NuSTAR’s high-energy views, it has the potential to capture hypothesized nanoflares—smaller versions of the sun’s giant flares that erupt with charged particles and high-energy radiation. Nanoflares, should they exist, may explain why the sun’s outer atmosphere, called the corona, is sizzling hot, a mystery called the “coronal heating problem.” The corona is, on average, 1.8 million degrees Fahrenheit (1 million degrees Celsius), while the surface of the sun is relatively cooler at 10,800 Fahrenheit (6,000 degrees Celsius). It is like a flame coming out of an ice cube. Nanoflares, in combination with flares, may be sources of the intense heat.


If NuSTAR can catch nanoflares in action, it may help solve this decades-old puzzle.


“NuSTAR will be exquisitely sensitive to the faintest X-ray activity happening in the solar atmosphere, and that includes possible nanoflares,” said Smith.


What’s more, the X-ray observatory can search for hypothesized dark matter particles called axions. Dark matter is five times more abundant than regular matter in the universe. Everyday matter familiar to us, for example in tables and chairs, planets and stars, is only a sliver of what’s out there. While dark matter has been indirectly detected through its gravitational pull, its composition remains unknown.


It’s a long shot, say scientists, but NuSTAR may be able to spot axions, one of the leading candidates for dark matter, should they exist. The axions would appear as a spot of X-rays in the center of the sun.


Meanwhile, as the sun awaits future NuSTAR observations, the telescope is continuing with its galactic pursuits, probing black holes, supernova remnants and other extreme objects beyond our solar system.


– Credit and Resource –


NASA/JPL


One giant leap for mankind, one teeny step for molecular robots

A walking molecule, so small that it cannot be observed directly with a microscope, has been recorded taking its first nanometre-sized steps.


It’s the first time that anyone has shown in real time that such a tiny object – termed a ‘small molecule walker’ – has taken a series of steps. The breakthrough, made by Oxford University chemists, is a significant milestone on the long road towards developing ‘nanorobots’.


‘In the future we can imagine tiny machines that could fetch and carry cargo the size of individual molecules, which can be used as building blocks of more complicated molecular machines; imagine tiny tweezers operating inside cells,’ said Dr Gokce Su Pulcu of Oxford University’s Department of Chemistry. ‘The ultimate goal is to use molecular walkers to form nanotransport networks,’ she says.


However, before nanorobots can run they first have to walk. As Su explains, proving this is no easy task.


For years now researchers have shown that moving machines and walkers can be built out of DNA. But, relatively speaking, DNA is much larger than small molecule walkers and DNA machines only work in water.


The big problem is that microscopes can only detect moving objects down to the level of 10–20 nanometres. This means that small molecule walkers, whose strides are 1 nanometre long, can only be detected after taking around 10 or 15 steps. It would therefore be impossible to tell with a microscope whether a walker had ‘jumped’ or ‘floated’ to a new location rather than taken all the intermediate steps.



As they report in this week’s Nature Nanotechnology, Su and her colleagues at Oxford’s Bayley Group took a new approach to detecting a walker’s every step in real time. Their solution? To build a walker from an arsenic-containing molecule and detect its motion on a track built inside a nanopore.


Nanopores are already the foundation of pioneering DNA sequencing technology developed by the Bayley Group and spinout company Oxford Nanopore Technologies. Here, tiny protein pores detect molecules passing through them. Each base disrupts an electric current passed through the nanopore by a different amount so that the DNA base ‘letters’ (A, C, G or T) can be read.


In this new research, they used a nanopore containing a track formed of five ‘footholds’ to detect how a walker was moving across it.


‘We can’t ‘see’ the walker moving, but by mapping changes in the ionic current flowing through the pore as the molecule moves from foothold to foothold we are able to chart how it is stepping from one to the other and back again,’ Su explains.
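
As a rough illustration of that read-out principle, a step sequence can be recovered by matching each measured current level to the nearest calibrated foothold level. This is a toy sketch in Python, not the group’s analysis code, and the current values are invented:

    # Toy decoder: map an ionic-current trace (pA) to the sequence of
    # footholds visited. The five calibration levels are invented numbers.
    FOOTHOLD_LEVELS = {1: 52.0, 2: 48.5, 3: 45.0, 4: 41.5, 5: 38.0}

    def nearest_foothold(current_pa):
        """Return the foothold whose calibrated level is closest to the reading."""
        return min(FOOTHOLD_LEVELS, key=lambda f: abs(FOOTHOLD_LEVELS[f] - current_pa))

    def decode_steps(trace):
        """Collapse a raw trace into the distinct footholds visited, in order."""
        path = []
        for reading in trace:
            foothold = nearest_foothold(reading)
            if not path or path[-1] != foothold:
                path.append(foothold)
        return path

    # A walker stepping 1 -> 2 -> 3 and back to 2 might give:
    print(decode_steps([52.1, 48.4, 48.6, 45.2, 48.3]))  # [1, 2, 3, 2]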


To ensure that the walker doesn’t float away, they designed it to have ‘feet’ that stick to the track by making and breaking chemical bonds. Su says: ‘It’s a bit like stepping on a carpet with glue under your shoes: with each step the walker’s foot sticks and then unsticks so that it can move to the next foothold.’ This approach could make it possible to design a machine that can walk on a variety of surfaces.


It’s quite an achievement for such a tiny machine but, as Su is the first to admit, there are many more challenges to be overcome before programmable nanorobots are a reality.


‘At the moment we don’t have much control over which direction the walker moves in; it moves pretty randomly,’ Su tells me. ‘The protein track is a bit like a mountain slope; there’s a direction that’s easier to walk in so walkers will tend to go this way. We hope to be able to harness this preference to build tracks that direct a walker where we want it to go.’


The next challenge after that will be for a walker to make itself useful by, for instance, carrying a cargo: there’s already space for it to carry a molecule on its ‘head’ that it could then take to a desired location to accomplish a task.


Su comments: ‘We should be able to engineer a surface where we can control the movement of these walkers and observe them under a microscope through the way they interact with a very thin fluorescent layer. This would make it possible to design chips with different stations with walkers ferrying cargo between these stations; so the beginnings of a nanotransport system.’


These are the first tentative baby steps of a new technology, but they promise that there could be much bigger strides to come.


– Credit and Resource –


More information: “Continuous observation of the stochastic motion of an individual small-molecule walker.” Nature Nanotechnology (2014) DOI: 10.1038/nnano.2014.264


Provided by Oxford University


Students aim to put cyanobacteria on Mars to generate oxygen

Mars is a harsh, hostile environment for future human explorers, and like every other known planet it has no breathable air. That could change someday, perhaps soon enough for our generation to witness it: a student team from Germany has a bold vision to take a first step toward terraforming the Red Planet, making it more Earth-like. The plan is to send cyanobacteria to Mars to generate oxygen from carbon dioxide, the main component of the Martian atmosphere (nearly 96%). “Cyanobacteria do live in conditions on Earth where no life would be expected. You find them everywhere on our planet!” team leader Robert P. Schröder told astrowatch.net. “It is the first step on Mars to test microorganisms.” The project is competing in the Mars One University Competition, and if it wins it will be sent to Mars as a payload on the Dutch company’s mission to the Red Planet. Everyone can vote to help make that happen by visiting the CyanoKnights.bio webpage.


Cyanobacteria Spirulina. Credit: cyanoknights.bio


The team behind the initiative is composed of a voluntary and interdisciplinary group of students and scientists, from the University of Applied Science and Technical University, both located in Darmstadt, Germany. They call their project ‘Cyano Knights’ as Schröder explains: “Because of the long history of those Cyanos and their will to survive I named it Cyano Knights and sent the payload proposal instantly to Mars One.”


The students work on their project in the laboratory of Cell Culture Technology of the University of Applied Science and they are also in contact with different institutes, not only from Germany.


Cyanobacteria deliver oxygen through photosynthesis, consuming carbon dioxide and producing an environment fit for living organisms. Furthermore, they can supply food and important vitamins for healthy nutrition. The team is already testing cyanobacteria under different environmental conditions in quarantined photobioreactors and monitoring their activities to determine the best working solution for Mars.
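
For reference, the overall stoichiometry of oxygenic photosynthesis (standard textbook chemistry, not a figure from the team) is:

    6 CO2 + 6 H2O + light → C6H12O6 + 6 O2

so every six molecules of Martian carbon dioxide fixed by the bacteria release six molecules of breathable oxygen.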


Schröder reveals that the idea was born in August this year. But why cyanobacteria? “Initial ideas were of a technical nature, but that was too boring for me. In school I liked biotechnology and that have not changed very much. Once I heard of cyanobacteria and how they can survive in harsh conditions on earth and at this special night I had a flashback which grabbed me and convinced me completely,” he explains.


The team ensures that at the end of the mission those very well quarantined microorganisms will be terminated. “We will destroy our microorganisms because we don’t want to harm Mars,” Schröder adds.


So what amount of this bacteria will be needed to fully terraform Mars? “As for now we don’t know that really, because we need to find out the best habitable conditions for each strain to cultivate them and then we have references and can calculate it,” Schröder says. “We need to test Mars-like conditions and analyze how much energy we have to put into the photobioreactor. So it’s a lot work to do.”


Mars One will take one project to Mars along with its unmanned lander mission in 2018. Voting submissions will be accepted until Dec. 31, 2014. The winning university payload will be announced on Jan. 5, 2015.


Schröder, also a hopeful Mars One colonist, is convinced that this is a once-in-a-lifetime Mars shot: “I am proud to be a Mars One astronaut candidate of the current round and don’t plan to participate in other missions to get to Mars.” So it’s more like an ultimate battle for King Robert and his Cyano Knights, with Mars at stake.


– Credit and Resource –


Astrowatch.net


Activating hair growth with a little help from the skin

Restoring hair loss is a task undertaken not only by beauty practitioners. Previous studies have identified signals from the skin that help prompt new phases of hair growth. However, how different types of cells that reside in the skin communicate to activate hair growth has continued to puzzle biologists. An exciting study publishing on December 23 in the open access journal PLOS Biology reveals a new way to spur hair growth.


Skin whole mount section showing hair follicles (blue) surrounded by clusters of skin resident macrophages (red). The molecular communication between macrophages and hair follicle stem cells regulates the initiation of hair follicle growth. Credit: Donatello Castellana, CNIO


A group from the Spanish National Cancer Research Centre (CNIO) has discovered an unexpected connection—a link between the body’s defense system and skin regeneration. It turns out that macrophages are involved. These are cells from the immune system that are in charge of devouring invading pathogens, a process called phagocytosis. The authors report that macrophages induce hair growth by surrounding and activating cells in the skin that have regenerative capacity, called stem cells. The discovery that macrophages activate skin stem cells could influence technologies with potential applications in tissue regeneration, aging, and cancer.


The authors of the study are Mirna Perez-Moreno and Donatello Castellana, from the Epithelial Cell Biology Group of the BBVA Foundation-CNIO Cancer Cell Biology Programme, along with Ralf Paus, a hair immunobiology expert from the Universities of Manchester and Münster. “We have discovered that macrophages, cells whose main function is traditionally attributed to fight infections and wound repair, are also involved in the activation of hair follicle stem cells in non-inflamed skin,” says Perez-Moreno.


These findings emerged from an observation by Perez-Moreno while she was working on another research project. Intriguingly, the mice she was working with at that time started to regrow hair when they were given anti-inflammatory drugs. Curious as to whether close communication between stem cells and immune cells could explain this observation, the Perez-Moreno lab began to test different types of cells involved in the body’s defense system for a role in hair growth. They observed that when skin cells are dormant, a fraction of macrophages die naturally due to a normal process called apoptosis. Surprisingly, the dying and surviving cells activated nearby stem cells and hair began to grow again.


Macrophages secrete a number of factors including a class of signaling molecules called Wnts. Importantly, when the researchers treated macrophages with a Wnt inhibitor drug, the activation of hair growth was delayed—demonstrating a role for Wnt from macrophages in promoting hair growth. Although this study was carried out in mice, the researchers believe their discovery “may facilitate the development of novel treatment strategies” for hair growth in humans.


The researchers used tiny droplets, or liposomes, to carry the drug used in the study. The future use of liposomes as a way to deliver a drug to specific cells is promising and may have additional implications for the study of several pathologies, says Donatello Castellana.


From a more fundamental perspective, this research is an effort to understand how modifying the environment that surrounds adult skin stem cells can regulate their regenerative capabilities. “One of the current challenges in the stem cell field is to regulate the activation of endogenous stem cell pools in adult tissues—to promote regeneration without the need of transplantation,” says Perez-Moreno.


Because of this study, it is now known that macrophages play a key role in the environment surrounding stem cells. “Our study underlines the importance of macrophages as modulators in skin regenerative processes, going beyond their primary function as phagocytic immune cells,” say the authors in PLOS Biology.


– Credit and Resource –


More information: Castellana D, Paus R, Perez-Moreno M (2014) Macrophages Contribute to the Cyclic Activation of Adult Hair Follicle Stem Cells. PLoS Biol 12(12): e1002002. DOI: 10.1371/journal.pbio.1002002


Provided by Public Library of Science


New programming language automatically coordinates interactions between Web page components

A Web page today is the result of a number of interacting components—like cascading style sheets, XML code, ad hoc database queries, and JavaScript functions. For all but the most rudimentary sites, keeping track of how these different elements interact, refer to each other, and pass data back and forth can be a time-consuming chore.


In a paper being presented at the Association for Computing Machinery’s Symposium on Principles of Programming Languages, Adam Chlipala, the Douglas Ross Career Development Professor of Software Technology, describes a new programming language, called Ur/Web, that lets developers write Web applications as self-contained programs. The language’s compiler—the program that turns high-level instructions into machine-executable code—then automatically generates the corresponding XML code and style-sheet specifications and embeds the JavaScript and database code in the right places.


In addition to making Web applications easier to write, Ur/Web also makes them more secure. “Let’s say you want to have a calendar widget on your Web page, and you’re going to use a library that provides the calendar widget, and on the same page there’s also an advertisement box that’s based on code that’s provided by the ad network,” Chlipala says. “What you don’t want is for the ad network to be able to change how the calendar works or the author of the calendar code to be able to interfere with delivering the ads.” Ur/Web automatically prohibits that kind of unauthorized access between page elements.


Typing, scoping

Ur/Web’s ability to both provide security protection and coordinate disparate Web technologies stems from two properties it shares with most full-blown programming languages, like C++ or Java. One is that it is “strongly typed.” That means that any new variable that a programmer defines in Ur/Web is constrained to a particular data type. Similarly, any specification of a new function has to include the type of data the function acts on and the type of data it returns.


In computing the value to return, the function may need to create new variables. (A function that returned an average of values in a database, for instance, would first need to calculate their sum.) But those variables are inaccessible to the rest of the program. This is the second property, known as “variable scoping,” because it limits the scope—the breadth of accessibility—of variables defined within functions.


“You might want to write a library that has inside of it as private state the database table that records usernames and passwords,” Chlipala says. “You don’t want any other part of your application to be able to just read and overwrite passwords. Most Web frameworks don’t support that style. They assume that every part of your program has complete access to the database.”


Typing helps with security, too. Many Web development frameworks generate database queries in such a way that someone ostensibly logging into a website can type code into the username field that in fact overwrites data in the database. With Ur/Web, usernames would constitute their own data type, which would be handled much differently than database queries.
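
The attack described here is SQL injection. A minimal sketch of the contrast in Python (not Ur/Web; the toy table and helper functions are invented for illustration) shows why splicing a username into the query text is dangerous, while passing it as typed data is safe:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, pw TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 'secret')")

    def login_unsafe(username):
        # Unsafe: the username becomes part of the SQL text, so input like
        # "' OR '1'='1" rewrites the structure of the query itself.
        query = "SELECT pw FROM users WHERE name = '%s'" % username
        return conn.execute(query).fetchall()

    def login_safe(username):
        # Safe: the username travels as data, never as SQL syntax -- the
        # parameterized analogue of Ur/Web giving usernames their own type.
        return conn.execute("SELECT pw FROM users WHERE name = ?", (username,)).fetchall()

    print(login_unsafe("' OR '1'='1"))  # leaks every password row
    print(login_safe("' OR '1'='1"))    # returns nothing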


Meeting expectations

Typing is also what enables coordination across Web technologies. Suppose that a bit of JavaScript code is supposed to act on data fetched from a database and that the result is supposed to be displayed on a Web page at a location determined by some XML code. If an Ur/Web programmer wrote a database query that extracted data of a type the JavaScript wasn’t expecting, or if the JavaScript generated an output of a type that the XML page wasn’t expecting, the compiler would register the discrepancy and flag the code as containing an error.


Often, code that isn’t explicitly typed still has implicit consistency rules. For instance, if you write a query in the SQL database language that asks for the average numerical value of a bunch of text fields, the database server will tell you that it can’t process your request. To enable Ur/Web to coordinate the flow of data between Web technologies, Chlipala had to create libraries of new data types for SQL, XML, and cascading style sheets (CSS) that embody these rules.


While the Ur/Web compiler does generate XML, JavaScript, and SQL code in its current version, it doesn’t produce style sheets automatically. But, Chlipala says, “One thing the compiler can do is analyze your full program and say, ‘Here is an exhaustive list of all the CSS classes that might be mentioned, and here is a description of the context in which each class might be used, which tells you what properties might be worth setting.’ So, for instance, some particular class might never be used in a position where table properties would have any meaning, so you don’t have to bother setting those.”


– Credit and Resources –


MIT


Smartphone thumb skills alter our brains

When people spend time interacting with their smartphones via touchscreen, it actually changes the way their thumbs and brains work together, according to a report in the Cell Press journal Current Biology on December 23. More touchscreen use in the recent past translates directly into greater brain activity when the thumbs and other fingertips are touched, the study shows.


“I was really surprised by the scale of the changes introduced by the use of smartphones,” says Arko Ghosh of the University of Zurich and ETH Zurich in Switzerland. “I was also struck by how much of the inter-individual variations in the fingertip-associated brain signals could be simply explained by evaluating the smartphone logs.”


It all started when Ghosh and his colleagues realized that our newfound obsession with smartphones could be a grand opportunity to explore the everyday plasticity of the human brain. Not only are people suddenly using their fingertips, and especially their thumbs, in a new way, but many of us are also doing it an awful lot, day after day. Not only that, but our phones are also keeping track of our digital histories to provide a readymade source of data on those behaviors.


Ghosh explains it this way: “I think first we must appreciate how common personal digital devices are and how densely people use them. What this means for us neuroscientists is that the digital history we carry in our pockets has an enormous amount of information on how we use our fingertips (and more).”


While neuroscientists have long studied brain plasticity in expert groups—musicians or video gamers, for instance—smartphones present an opportunity to understand how regular life shapes the brains of regular people.


To link digital footprints to brain activity in the new study, Ghosh and his team used electroencephalography (EEG) to record the brain response to mechanical touch on the thumb, index, and middle fingertips of touchscreen phone users in comparison to people who still haven’t given up their old-school mobile phones.


The researchers found that the electrical activity in the brains of smartphone users was enhanced when all three fingertips were touched. In fact, the amount of activity in the cortex of the brain associated with the thumb and index fingertips was directly proportional to the intensity of phone use, as quantified by built-in battery logs. The thumb tip was even sensitive to day-to-day fluctuations: the shorter the time elapsed from an episode of intense phone use, the researchers report, the larger was the cortical potential associated with it.


The results suggest to the researchers that repetitive movements over the smooth touchscreen surface reshape sensory processing from the hand, with daily updates in the brain’s representation of the fingertips. And that leads to a pretty remarkable idea: “We propose that cortical sensory processing in the contemporary brain is continuously shaped by personal digital technology,” Ghosh and his colleagues write.


What exactly this influence of digital technology means for us in other areas of our lives is a question for another day. The news might not be so good, Ghosh and colleagues say, noting evidence linking excessive phone use with motor dysfunctions and pain.


– Credit and Resource –


More information: Anne-Dominique Gindrat, Magali Chytiris, Myriam Balerna, Eric Rouiller, Arko Ghosh, “Use-dependent cortical processing from fingertips in touchscreen phone users,” Current Biology (2014). www.cell.com/current-biology/a… 0960-9822(14)01487-0


Provided by Cell Press


Monday 22 December 2014

Holiday lights on the Sun: SDO imagery of a significant solar flare

The sun emitted a significant solar flare, peaking at 7:28 p.m. EST on Dec. 19, 2014.


NASA’s Solar Dynamics Observatory captured this image of a significant solar flare — as seen in the bright flash on the right — on Dec. 19, 2014. The image shows a subset of extreme ultraviolet light that highlights the extremely hot material in flares. Credit: NASA/SDO


NASA’s Solar Dynamics Observatory, which watches the sun constantly, captured an image of the event. Solar flares are powerful bursts of radiation. Harmful radiation from a flare cannot pass through Earth’s atmosphere to physically affect humans on the ground; however, when intense enough, a flare can disturb the atmosphere in the layer where GPS and communications signals travel.


To see how this event may affect Earth, please visit NOAA’s Space Weather Prediction Center at http://spaceweather.gov, the U.S. government’s official source for space weather forecasts, alerts, watches and warnings.


This flare is classified as an X1.8-class flare. X-class denotes the most intense flares, while the number provides more information about its strength. An X2 is twice as intense as an X1, an X3 is three times as intense, etc.
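
The lettered scale maps directly onto peak soft X-ray flux as measured by the GOES satellites: each letter step A, B, C, M, X is a factor of ten, and the number is a linear multiplier. A small sketch of the conversion (the helper function is illustrative, not a NASA tool):

    # Peak soft X-ray flux (W/m^2) at the base of each GOES flare class.
    CLASS_BASE = {"A": 1e-8, "B": 1e-7, "C": 1e-6, "M": 1e-5, "X": 1e-4}

    def flare_peak_flux(label):
        """Convert a flare label like 'X1.8' to peak flux in W/m^2."""
        return CLASS_BASE[label[0].upper()] * float(label[1:])

    print(flare_peak_flux("X1.8"))  # ~1.8e-4 W/m^2, the Dec. 19 flare
    print(flare_peak_flux("X3.6") / flare_peak_flux("X1.8"))  # ~2: twice as intense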



These images from NASA’s Solar Dynamics Observatory show an X-class solar flare erupting on the sun on Dec. 19, 2014. Credit: NASA/SDO/Duberstein


– Credit and Resource –


NASA


Astrophysicists offer new research, tool for identifying habitable zones

Research by a University of Texas at Arlington astrophysicist sheds greater light on S-type and P-type binary stars and forms the basis for BinHab, a new online tool that can be used to calculate the regions of binary systems favorable for life, commonly known as habitable zones.


Cuntz’s Binary Star Habitable Zone Calculator, or BinHab, allows for the calculation of S-type and P-type habitable regions in stellar binary systems.


In P-type systems the planet orbits both binary stars. In S-type systems, the planet orbits only one of the stellar components, with the second component perturbing the planet’s environment.


Astrophysicists have been tackling the problem of identifying habitable zones for many years. However, the method developed by UT Arlington Physics Professor Manfred Cuntz provides a comprehensive mathematical approach for both types of habitability.


“The challenge is to properly consider two separate criteria consisting in the amounts of stellar radiation, which provides a favorable planetary climate for life, and the gravitational influence of both stars on an existing planet,” Cuntz said.
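
As a toy version of the radiative criterion alone, one can add up the flux each star delivers at the planet and test the total against a habitable window. This sketch ignores orbital stability, which BinHab also treats, and its flux bounds are assumed round numbers rather than Cuntz’s values:

    # Rough radiative habitability test for a planet in a binary system.
    # Flux window in units of the solar constant at Earth; the bounds are
    # illustrative assumptions, not the values used by BinHab.
    FLUX_INNER, FLUX_OUTER = 1.1, 0.36

    def combined_flux(lum1, lum2, d1_au, d2_au):
        """Total flux at the planet (Earth units) from both stars.

        lum1, lum2: luminosities in solar units; d1_au, d2_au: planet-to-star
        distances in AU. Flux from each star falls off as 1/d^2.
        """
        return lum1 / d1_au**2 + lum2 / d2_au**2

    def radiatively_habitable(lum1, lum2, d1_au, d2_au):
        return FLUX_OUTER <= combined_flux(lum1, lum2, d1_au, d2_au) <= FLUX_INNER

    # P-type-like example: a planet about 1 AU from each of two 0.5 L_sun
    # stars receives roughly Earth-like total flux, so the test passes.
    print(radiatively_habitable(0.5, 0.5, 1.0, 1.0))  # True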


Cuntz presents his work in a paper to be published in January 2015 in the Astrophysical Journal, a leading publication in this field of study. The first paper in the series was published in the same journal in January 2014. Some of his results will also be presented at the 225th Meeting of the American Astronomical Society in January in Seattle.


Cuntz’s research is based on a pure theoretical approach but it is directly relevant to the interpretation of observational data, including those by NASA’s Kepler mission. Previous work at UT Arlington includes studies of Kepler-16, a binary system discovered in 2011, which is known to host a Saturn-type planet in a P-type system, implying that it orbits both stellar components, Cuntz said. Billy Quarles, a former UT Arlington graduate student, currently working at NASA’s Ames Research Center, led that work.


James Grover, interim dean of the UT Arlington College of Science, said this latest work holds enormous potential for those who study space in the search for life.


“Dr. Cuntz’s work holds a wide range of applications, including the assessments of observational data for extrasolar planets,” Grover said. “Additionally, the work has ramifications toward the field of astrobiology. UT Arlington students and the astrophysics and astrobiology community at large will benefit from the work and new BinHab tool.”


BinHab is explained in part in a paper that Cuntz presented last summer at the 18th Cambridge Workshop on “Cool Stars, Stellar Systems, and the Sun.” The Lowell Observatory in Flagstaff, Ariz., hosted that meeting.


– Credit and Resource –


Provided by University of Texas at Arlington


New law for superconductors

Mathematical description of relationship between thickness, temperature, and resistivity could spur advances.

MIT researchers have discovered a new mathematical relationship — between material thickness, temperature, and electrical resistance — that appears to hold in all superconductors. They describe their findings in the latest issue of Physical Review B.


Atoms of niobium and nitrogen in an ultrathin superconducting film that helped MIT researchers discover a universal law of superconductivity.


The result could shed light on the nature of superconductivity and could also lead to better-engineered superconducting circuits for applications like quantum computing and ultralow-power computing.


“We were able to use this knowledge to make larger-area devices, which were not really possible to do previously, and the yield of the devices increased significantly,” says Yachin Ivry, a postdoc in MIT’s Research Laboratory of Electronics, and the first author on the paper.


Ivry works in the Quantum Nanostructures and Nanofabrication Group, which is led by Karl Berggren, a professor of electrical engineering and one of Ivry’s co-authors on the paper. Among other things, the group studies thin films of superconductors.


Superconductors are materials that, at temperatures near absolute zero, exhibit no electrical resistance; this means that it takes very little energy to induce an electrical current in them. A single photon will do the trick, which is why they’re useful as quantum photodetectors. And a computer chip built from superconducting circuits would, in principle, consume about one-hundredth as much energy as a conventional chip.


“Thin films are interesting scientifically because they allow you to get closer to what we call the superconducting-to-insulating transition,” Ivry says. “Superconductivity is a phenomenon that relies on the collective behavior of the electrons. So if you go to smaller and smaller dimensions, you get to the onset of the collective behavior.”


Vexing variation

Specifically, Ivry studied niobium nitride, a material favored by researchers because, in its bulk form, it has a relatively high “critical temperature” — the temperature at which it switches from an ordinary metal to a superconductor. But like most superconductors, it has a lower critical temperature when it’s deposited in the thin films on which nanodevices rely.


Previous theoretical work had characterized niobium nitride’s critical temperature as a function of either the thickness of the film or its measured resistivity at room temperature. But neither theory seemed to explain the results Ivry was getting. “We saw large scatter and no clear trend,” he says. “It made no sense, because we grew them in the lab under the same conditions.”


So the researchers conducted a series of experiments in which they held constant either thickness or “sheet resistance,” the material’s resistance per unit area, while varying the other parameter; they then measured the ensuing changes in critical temperature. A clear pattern emerged: Thickness times critical temperature equaled a constant — call it A — divided by sheet resistance raised to a particular power — call it B.
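
Written out, the relation is

    d × Tc = A / Rs^B

where d is the film thickness, Tc the critical temperature, Rs the sheet resistance, and A and B are the two constants (the notation here follows the description above; the paper’s own symbols may differ).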


After deriving that formula, Ivry checked it against other results reported in the superconductor literature. His initial excitement evaporated, however, with the first outside paper he consulted. Though most of the results it reported fit his formula perfectly, two of them were dramatically awry. Then a colleague who was familiar with the paper pointed out that its authors had acknowledged in a footnote that those two measurements might reflect experimental error: When building their test device, the researchers had forgotten to turn on one of the gases they used to deposit their films.


Broadening the scope

The other niobium nitride papers Ivry consulted bore out his predictions, so he began to expand to other superconductors. Each new material he investigated required him to adjust the formula’s constants — A and B. But the general form of the equation held across results reported for roughly three dozen different superconductors.


It wasn’t necessarily surprising that each superconductor should have its own associated constant, but Ivry and Berggren weren’t happy that their equation required two of them. When Ivry graphed A against B for all the materials he’d investigated, however, the results fell on a straight line.


Finding a direct relationship between the constants allowed him to rely on only one of them in the general form of his equation. But perhaps more interestingly, the materials at either end of the line had distinct physical properties. Those at the top had highly disordered — or, technically, “amorphous” — crystalline structures; those at the bottom were more orderly, or “granular.” So Ivry’s initial attempt to banish an inelegance in his equation may already provide some insight into the physics of superconductors at small scales.


“None of the admitted theory up to now explains with such a broad class of materials the relation of critical temperature with sheet resistance and thickness,” says Claude Chapelier, a superconductivity researcher at France’s Alternative Energies and Atomic Energy Commission. “There are several models that do not predict the same things.”


Chapelier says he would like to see a theoretical explanation for that relationship. But in the meantime, “this is very convenient for technical applications,” he says, “because there is a lot of spreading of the results, and nobody knows whether they will get good films for superconducting devices. By putting a material into this law, you know already whether it’s a good superconducting film or not.”


– Credit and Resource –


Larry Hardesty | MIT News Office


Could there be life on an aquaplanet?

MIT study finds an exoplanet, tilted on its side, could still be habitable if covered in ocean.

Nearly 2,000 planets beyond our solar system have been identified to date. Whether any of these exoplanets are hospitable to life depends on a number of criteria. Among these, scientists have thought, is a planet’s obliquity — the angle of its axis relative to its orbit around a star.


Images from the European Southern Observatory and Google Earth in a photo-illustration by Christine Daniloff/MIT.


Earth, for instance, has a relatively low obliquity, rotating around an axis that is nearly perpendicular to the plane of its orbit around the sun. Scientists suspect, however, that exoplanets may exhibit a host of obliquities, resembling anything from a vertical spinning top to a horizontal rotisserie. The more extreme the tilt, the less habitable a planet may be — or so the thinking has gone.


Now scientists at MIT have found that even a high-obliquity planet, with a nearly horizontal axis, could potentially support life, so long as the planet were completely covered by an ocean. In fact, even a shallow ocean, about 50 meters deep, would be enough to keep such a planet at relatively comfortable temperatures, averaging around 60 degrees Fahrenheit year-round.


David Ferreira, a former research scientist in MIT’s Department of Earth, Atmospheric and Planetary Sciences (EAPS), says that on the face of it, a planet with high obliquity would appear rather extreme: Tilted on its side, its north pole would experience daylight continuously for six months, and then darkness for six months, as the planet revolves around its star.


“The expectation was that such a planet would not be habitable: It would basically boil, and freeze, which would be really tough for life,” says Ferreira, who is now a lecturer at the University of Reading, in the United Kingdom. “We found that the ocean stores heat during summer and gives it back in winter, so the climate is still pretty mild, even in the heart of the cold polar night. So in the search for habitable exoplanets, we’re saying, don’t discount high-obliquity ones as unsuitable for life.”


Details of the group’s analysis are published in the journal Icarus. The paper’s co-authors are Ferreira; Sara Seager, the Class of 1941 Professor in EAPS and MIT’s Department of Physics; John Marshall, the Cecil and Ida Green Professor in Earth and Planetary Sciences; and Paul O’Gorman, an associate professor in EAPS.


Tilting toward a habitable exoplanet

Ferreira and his colleagues used a model developed at MIT to simulate a high-obliquity “aquaplanet” — an Earth-sized planet, at a similar distance from its sun, covered entirely in water. The three-dimensional model is designed to simulate circulations among the atmosphere, ocean, and sea ice, taking into account the effects of winds and heat in driving a 3,000-meter-deep ocean. For comparison, the researchers also coupled the atmospheric model with simplified, motionless “swamp” oceans of various depths: 200 meters, 50 meters, and 10 meters.


The researchers used the detailed model to simulate a planet at three obliquities: 23 degrees (representing an Earth-like tilt), 54 degrees, and 90 degrees.


For a planet with an extreme, 90-degree tilt, they found that a global ocean — even one as shallow as 50 meters — would absorb enough solar energy throughout the polar summer and release it back into the atmosphere in winter to maintain a rather mild climate. As a result, the planet as a whole would experience spring-like temperatures year round.
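
The moderating effect is ultimately a matter of heat capacity: a water layer d meters deep banks the polar summer’s energy surplus and pays it back through the dark winter. A back-of-the-envelope sketch in Python (the sinusoidal forcing and its 200 W/m² amplitude are illustrative assumptions, not the MIT model, which simulates full three-dimensional circulation):

    import math

    RHO_WATER = 1000.0   # density of water, kg/m^3
    CP_WATER = 4200.0    # specific heat of water, J/(kg K)
    YEAR = 3.15e7        # length of a year, seconds

    def seasonal_swing(depth_m, forcing_amp=200.0):
        """Peak-to-trough temperature swing (K) of a well-mixed water layer.

        Integrating a sinusoidal heating anomaly of amplitude forcing_amp
        (W/m^2) over the warming half-year gives the energy banked per m^2;
        dividing by the layer's heat capacity converts that to degrees.
        """
        banked = forcing_amp * YEAR / math.pi  # J/m^2 over half a year
        return banked / (RHO_WATER * CP_WATER * depth_m)

    for depth in (10, 50, 200):
        print(depth, "m ocean ->", round(seasonal_swing(depth), 1), "K swing")

On these rough numbers a 10-meter layer swings by tens of degrees while a 50-meter layer stays within about ten, consistent with the paper’s finding that 50 meters suffices where 10 meters does not.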


“We were expecting that if you put an ocean on the planet, it might be a bit more habitable, but not to this point,” Ferreira says. “It’s really surprising that the temperatures at the poles are still habitable.”


A runaway “snowball Earth”

In general, the team observed that life could thrive on a highly tilted aquaplanet, but only to a point. In simulations with a shallower ocean, Ferreira found that waters 10 meters deep would not be sufficient to regulate a high-obliquity planet’s climate. Instead, the planet would experience a runaway effect: As soon as a bit of ice forms, it would quickly spread across the dark side of the planet. Even when this side turns toward the sun, according to Ferreira, it would be too late: Massive ice sheets would reflect the sun’s rays, allowing the ice to spread further into the newly darkened side, and eventually encase the planet.


“Some people have thought that a planet with a very large obliquity could have ice just around the equator, and the poles would be warm,” Ferreira says. “But we find that there is no intermediate state. If there’s too little ocean, the planet may collapse into a snowball. Then it wouldn’t be habitable, obviously.”


Darren Williams, a professor of physics and astronomy at Pennsylvania State University, says past climate modeling has shown that a wide range of climate scenarios are possible on extremely tilted planets, depending on the sizes of their oceans and landmasses. Ferreira’s results, he says, reach similar conclusions, but with more detail.


“There are one or two terrestrial-sized exoplanets out of a thousand that appear to have densities comparable to water, so the probability of an all-water planet is at least 0.1 percent,” Williams says. “The upshot of all this is that exoplanets at high obliquity are not necessarily devoid of life, and are therefore just as interesting and important to the astrobiology community.”


– Credit and Resource –


Jennifer Chu | MIT News Office


The tilt of Mars’ axis varies over a 124,000-year cycle. A steeper tilt means a generally warmer climate; a slight tilt a colder one.
Courtesy of NASA


New understanding of how to halt photons could lead to miniature particle accelerators

Trapping light with a twister


New understanding of how to halt photons could lead to miniature particle accelerators, improved data transmission.

Researchers at MIT who succeeded last year in creating a material that could trap light and stop it in its tracks have now developed a more fundamental understanding of the process. The new work — which could help explain some basic physical mechanisms — reveals that this behavior is connected to a wide range of other seemingly unrelated phenomena.


Plot of radiative quality factor as a function of wave vector for a photonic crystal slab. At five positions, this factor diverges to infinity, corresponding to special solutions of Maxwell’s equations called bound states in the continuum. These states have enough energy to escape to infinity but remain spatially localized.


The findings are reported in a paper in the journal Physical Review Letters, co-authored by MIT physics professor Marin Soljačić; postdocs Bo Zhen, Chia Wei Hsu, and Ling Lu; and Douglas Stone, a professor of applied physics at Yale University.


Light can usually be confined only with mirrors, or with specialized materials such as photonic crystals. Both of these approaches block light beams; last year’s finding demonstrated a new method in which the waves cancel out their own radiation fields. The new work shows that this light-trapping process, which involves twisting the polarization direction of the light, is based on a kind of vortex — the same phenomenon behind everything from tornadoes to water swirling down a drain.


In addition to revealing the mechanism responsible for trapping the light, the new analysis shows that this trapped state is much more stable than had been thought, making it easier to produce and harder to disturb.


“People think of this [trapped state] as very delicate,” Zhen says, “and almost impossible to realize. But it turns out it can exist in a robust way.”


In most natural light, the direction of polarization — which can be thought of as the direction in which the light waves vibrate — remains fixed. That’s the principle that allows polarizing sunglasses to work: Light reflected from a surface is selectively polarized in one direction; that reflected light can then be blocked by polarizing filters oriented at right angles to it.


But in the case of these light-trapping crystals, light that enters the material becomes polarized in a way that forms a vortex, Zhen says, with the direction of polarization changing depending on the beam’s direction.


Vortices of bound states in the continuum. The left panel shows five bound states in the continuum in a photonic crystal slab as bright spots. The right panel shows the polarization vector field in the same region as the left panel, revealing five vortices at the locations of the bound states in the continuum. These vortices are characterized by topological charges +1 or -1.
Courtesy of the researchers


Because the polarization is different at every point in this vortex, it produces a singularity — also called a topological defect, Zhen says — at its center, trapping the light at that point.
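
That twist can be made quantitative as a winding number (a standard topological definition, not spelled out in the article): if φ is the local polarization angle, then around any closed loop enclosing the singularity

    q = (1/2π) ∮ ∇φ · dl

must be an integer, and it is exactly the topological charge of +1 or -1 mentioned in the figure caption. Because q cannot change continuously, the trapped state is robust against small perturbations.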


Hsu says the phenomenon makes it possible to produce something called a vector beam, a special kind of laser beam that could potentially create small-scale particle accelerators. Such devices could use these vector beams to accelerate particles and smash them into each other — perhaps allowing future tabletop devices to carry out the kinds of high-energy experiments that today require miles-wide circular tunnels.


The finding, Soljačić says, could also enable easy implementation of super-resolution imaging (using a method called stimulated emission depletion microscopy) and could allow the sending of far more channels of data through a single optical fiber.


“This work is a great example of how supposedly well-studied physical systems can contain rich and undiscovered phenomena, which can be unearthed if you dig in the right spot,” says Yidong Chong, an assistant professor of physics and applied physics at Nanyang Technological University in Singapore who was not involved in this research.


Chong says it is remarkable that such surprising findings have come from relatively well-studied materials. “It deals with photonic crystal slabs of the sort that have been extensively analyzed, both theoretically and experimentally, since the 1990s,” he says. “The fact that the system is so unexotic, together with the robustness associated with topological phenomena, should give us confidence that these modes will not simply be theoretical curiosities, but can be exploited in technologies such as microlasers.”


The research was partly supported by the U.S. Army Research Office through MIT’s Institute for Soldier Nanotechnologies, and by the Department of Energy and the National Science Foundation.









– Credit and Resource –


David L. Chandler | MIT News Office



New understanding of how to halt photons could lead to miniature particle accelerators

Friday 19 December 2014

Pinpoint laser heating creates a maelstrom of magnetic nanotextures

A simulation study by researchers from the RIKEN Center for Emergent Matter Science has demonstrated the feasibility of using lasers to create and manipulate nanoscale magnetic vortices. The ability to create and control these ‘skyrmions’ could lead to the development of skyrmion-based information storage devices.


Freedawn Scientia - Pinpoint laser heating creates a maelstrom of magnetic nanotextures Figure 1: Schematic representation of skyrmion creation by local heating using a laser. Credit: Mari Ishida, RIKEN Center for Emergent Matter Science (lower part); modified with permission from ref. 1 © 2014 W. Koshibae & N. Nagaosa (insets)


The information we consume and work with is encoded in binary form (as ‘1’s or ‘0’s) by switching the characteristics of memory media between two states. As we approach the performance and capacity limits of conventional memory media, researchers are looking toward exotic physics to develop the next generation of magnetic memories.









One such exotic phenomenon is the skyrmion: a stable, nanoscale whirlpool-like magnetic feature characterized by a constantly rotating magnetic moment. Theoretically, the presence or absence of a skyrmion at any location in a magnetic medium could be used to represent the binary states needed for information storage. However, researchers have found it challenging to reliably create and annihilate skyrmions experimentally because the mechanics of these processes are difficult to probe in any detail. The challenge lies in the incredibly short timescale of these processes, which at just a tenth of a nanosecond is up to a billion times shorter than the timescales observable with the Lorentz microscopy used to measure magnetic properties.


The study authors, Wataru Koshibae and Naoto Nagaosa, sought a solution to this problem by constructing a computational model that simulates the heating of a ferromagnetic material with pinpoint lasers (Fig. 1). This localized heating creates both skyrmions and ‘antiskyrmions’. The simulations, based on known physics for these systems, showed that the characteristics of skyrmions are heavily dependent on the intensity and spot size of the laser. Further, by manipulating these two parameters, it is possible to control skyrmion characteristics such as creation time and size.
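The whirlpool character of a skyrmion can be made quantitative through its topological charge, which counts how many times the local magnetization wraps around the sphere of possible directions. Below is a minimal, illustrative Python sketch (not the authors’ simulation, which evolves full spin dynamics under laser heating): it builds a synthetic Néel-type skyrmion and evaluates the standard charge integral on a grid; the grid size, radius, and spin profile are all assumptions made for illustration.

```python
import numpy as np

def skyrmion_number(n):
    """Topological charge Q = (1/4pi) * integral of n . (dn/dx x dn/dy)
    for a unit-vector field n of shape (3, Nx, Ny)."""
    dx = np.gradient(n, axis=1)                       # d n / dx
    dy = np.gradient(n, axis=2)                       # d n / dy
    density = np.einsum('ixy,ixy->xy', n, np.cross(dx, dy, axis=0))
    return density.sum() / (4 * np.pi)

# Synthetic Neel-type skyrmion on a square grid (illustrative parameters)
L, R = 64, 20.0
x, y = np.meshgrid(np.arange(L) - L / 2, np.arange(L) - L / 2, indexing='ij')
r = np.hypot(x, y)
phi = np.arctan2(y, x)
theta = np.where(r < R, np.pi * (1 - r / R), 0.0)  # core flipped, edge aligned

n = np.stack([np.sin(theta) * np.cos(phi),
              np.sin(theta) * np.sin(phi),
              np.cos(theta)])

print(round(skyrmion_number(n), 2))   # -> about -1.0: one skyrmion
```

An antiskyrmion carries the opposite charge, so a composite of the two sums to zero; that is what makes their joint creation by random heating so counterintuitive.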


“Heat leads to random motion of magnetic spins,” explains Nagaosa. “We therefore found it surprising that local heating created a topologically nontrivial ordered object, let alone composite structures of skyrmions and antiskyrmions.” The issue of control is what differentiates these structures.


Nagaosa believes that as skyrmions are quite stable, these nanoscale features could conceivably be used as an information carrier if a reliable means of creating them at will can be achieved. Koshibae and Nagaosa’s work could therefore form the basis of the development of state-of-the-art memory devices. The work also provides valuable information on the creation of topological particles, which is crucial for advancing knowledge in many other areas of physics.









– Credit and Resource –


More information: Koshibae, W. & Nagaosa, N. “Creation of skyrmions and antiskyrmions by local heating.” Nature Communications 5, 5148 (2014). DOI: 10.1038/ncomms6148


Provided by RIKEN



Pinpoint laser heating creates a maelstrom of magnetic nanotextures

Thermoelectric power plants could offer economically competitive renewable energy

A new study predicts that large-scale power plants based on thermoelectric effects, such as small temperature differences in ocean water, could generate electricity at a lower cost than photovoltaic power plants.


Freedawn Scientia - Thermoelectric power plants could offer economically competitive renewable energy A thermoelectric power plant might use energy harvested from ocean waves to pump cold water up through a heat exchanger/generator near the surface. The heat exchanger is made of thermoelectric materials which can use the temperature gradient between the warm and cold water to generate electricity. Credit: Liu. (CC BY 3.0)


Liping Liu, Associate Professor at Rutgers University, envisions that thermoelectric power plants would look like giant barges sitting in the tropical ocean, where electricity is generated by heating cold, deep water with warm, shallow water heated by the sun. Liu has published a paper in the New Journal of Physics in which he analyzes the feasibility of such power plants.









“This work is about the new idea of large-scale green power plants that make economic use of the largest accessible and sustainable energy reservoir on the earth,” Liu told Phys.org, speaking of the oceans. This is because the sun heats the surface water to a temperature that, in tropical regions, is about 20 K higher than water 600 m deep. Essentially, the surface water acts as a giant storage tank of solar energy.


As Liu explains, thermoelectric power plants would work by harvesting the energy of ocean waves to pump cold water from a few hundred meters deep up through a long channel. As the cold water nears the surface, it enters a heat exchanger where it is heated by surface water on the outside. The heat exchanger acts as an electric generator, as its tubes are made of thermoelectric materials that can transfer heat through their walls and directly convert temperature differences into electricity.
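To get a rough, hedged sense of scale for a single thermoelectric couple (the Seebeck coefficient and internal resistance below are generic illustrative values, not parameters from Liu’s paper): a couple with Seebeck coefficient S spanning a temperature difference ΔT develops an open-circuit voltage V = SΔT and delivers at most V²/4R into a matched load.

```latex
V = S\,\Delta T = (200\ \mu\mathrm{V/K})\times(10\ \mathrm{K}) = 2\ \mathrm{mV},
\qquad
P_{\max} = \frac{V^{2}}{4R} = \frac{(2\ \mathrm{mV})^{2}}{4\times 1\ \mathrm{m}\Omega} = 1\ \mathrm{mW}.
```

Useful power therefore means wiring up enormous numbers of couples over a large area, which is why the mass-producibility of the thermoelectric tubes matters so much in this design.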


Large-scale, ocean-based thermoelectric power plants would have many advantages. For one, the “fuel” or temperature differences are free, unlimited, and easily accessible. Also, the plants do not take up space on land. Because they have no moving solid parts, they would have low maintenance costs. In addition, the power output does not depend on the time of day or season. And finally, the method is green, as it does not release emissions.


Freedawn Scientia - Thermoelectric power plants could offer economically competitive renewable energy A thermoelectric power plant can also use geothermal sources to produce the temperature gradient. Here, hot water is pumped up to the heat exchanger/generator, where it is cooled by air. Credit: Liu. (CC BY 3.0)


Small-scale thermoelectric generators are already used commercially in applications such as microelectronics, automobiles, and power generation in remote areas. In these designs, the conversion efficiency is the most important factor because the fuel accounts for the largest portion of the cost. Most commercial devices have a conversion efficiency of around 5% to 10% of the ideal Carnot efficiency, with state-of-the-art devices achieving efficiencies of up to 20%. Although research is currently being done to further improve the efficiency, there are still limits to how high it can go.
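To put those percentages in context, here is a worked estimate assuming tropical surface water near 300 K (consistent with the 20 K figure quoted above; the numbers are illustrative):

```latex
\eta_{\mathrm{Carnot}} = 1 - \frac{T_c}{T_h} \approx \frac{\Delta T}{T_h}
  = \frac{10\ \mathrm{K}}{300\ \mathrm{K}} \approx 3.3\%,
\qquad
\eta_{\mathrm{device}} \approx (5\%\ \text{to}\ 10\%) \times 3.3\%
  \approx 0.2\%\ \text{to}\ 0.3\%.
```

In other words, an ocean-water plant would convert only a fraction of a percent of the heat flowing through it, which is exactly why Liu’s argument turns on capacity and cost rather than on efficiency.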









In the new paper, Liu shows that large-scale thermoelectric power plants wouldn’t need to operate at extremely high efficiencies to be economically competitive; instead, the key would lie in engineering simple structures such as laminated composites in order to support mass production. These improvements focus on the conversion capacity, which, unlike efficiency, can be improved by orders of magnitude. In other words, because the fuel is free and in limitless supply, large-scale thermoelectric power plants could make up with their sheer size what they lack in efficiency.


The cost of generating electricity varies by source. According to the US Department of Energy, the estimated cost of supplying one megawatt of electricity for a year in 2016 is about $0.83 million for conventional coal plants, compared to $1.84 million for photovoltaic power plants. Liu’s analysis estimates that a thermoelectric power plant could generate electricity for less than the photovoltaic figure of $1.84 million, although an exact estimate is difficult at this stage. This estimate is for a thermoelectric generator that lasts for 20 years and uses ocean water with a 10 K temperature difference as fuel. If water from geothermal sources is used instead, the temperature difference could be 50 K or more, resulting in an even higher power output and lower cost per watt.
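Converted into more familiar units (assuming, as an interpretation, that the quoted figures mean one megawatt supplied continuously for a year), a short Python sketch:

```python
# Convert the article's "$ per megawatt-year" figures into cents per kWh.
# Plant labels and dollar figures come from the article; the continuous-
# supply interpretation is an assumption.
HOURS_PER_YEAR = 8760  # 24 h x 365 days

costs_per_mw_year = {      # US dollars per megawatt of supply per year
    "conventional coal": 0.83e6,
    "photovoltaic":      1.84e6,
}

for plant, dollars in costs_per_mw_year.items():
    kwh_per_year = 1000 * HOURS_PER_YEAR      # 1 MW for a year, in kWh
    print(f"{plant}: {100 * dollars / kwh_per_year:.1f} cents/kWh")

# conventional coal: 9.5 cents/kWh
# photovoltaic:      21.0 cents/kWh
```

On that reading, beating the photovoltaic figure means generating for less than roughly 21 cents per kilowatt-hour.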


Overall, the analysis shows that thermoelectric power plants look very promising and could contribute to solving the world’s energy problems. Liu plans to work toward this goal in future research.


“We are currently working on experimentally validating the predicted power factor of the thermoelectric composites,” Liu said. “Once this is validated, we will seek to fabricate a table-top prototype of the generator that uses ice water and hot water as ‘fuel’.”









– Credit and Resource –


More information: Liping Liu. “Feasibility of large-scale power plants based on thermoelectric effects.” New Journal of Physics. DOI: 10.1088/1367-2630/16/12/123019


Journal reference: New Journal of Physics



Thermoelectric power plants could offer economically competitive renewable energy

Attack on classical cryptography system raises security questions

How secure is completely secure? In the world of secure communication, a scheme may be completely secure until it’s not—that is, until an attack is proposed that reveals a weak spot in the scheme. This is what’s currently going on for Kish key distribution (KKD), which claims to derive total and unconditional security using classical rather than quantum techniques, thus avoiding the complexity and expense of quantum cryptographic schemes. But now a new paper has uncovered a vulnerability in KKD that enables an eavesdropper to correctly determine more than 99.9% of the transmitted bits. Fortunately, countermeasures may exist to protect against this attack and regain the system’s security.


Freedawn Scientia - Attack on classical cryptography system raises security questions In the Kish key distribution (KKD) system, the two resistance values represent the states of an information bit. A cryptographic key is transmitted along the wire by randomly switching between the two resistance values, which can be detected by the sender and receiver via their thermal noise on the line. Since no net power flows through the line, the only way that an eavesdropper can measure the resistance values is by injecting current into the wire and measuring the voltage and current changes in each direction, but the extra current would be quickly noticed. Credit: Gunn, et al. ©2014 Nature Scientific Reports


“The worthiness of a cryptographic scheme is measured by the number of papers that try to attack it,” Derek Abbott, Professor at The University of Adelaide in Australia and coauthor of the new paper, told Phys.org. Abbott and coauthors Lachlan J. Gunn and Andrew Allison have published their paper in a recent issue of Nature’s Scientific Reports.









By Abbott’s measure, KKD has proven to be very appealing (as many people have tried to attack it) since it was first proposed in 2005. Notably, KKD has stood up to attacks from Amnon Yariv (2009 winner of the National Medal of Science) from Caltech, as well as Charles H. Bennett of IBM. Bennett co-developed the first ever quantum cryptography protocol in 1984 (he is the first “B” in the so-called BB84 protocol).


Security from thermal noise

In the 2005 paper that first introduced KKD, Laszlo B. Kish, Professor at Texas A&M University, described a system that promises unconditional security from the second law of thermodynamics. The scheme transmits a cryptographic key along a wire by randomly switching between two resistor values, which represent the states of an information bit, at the two ends of the line. The sender and receiver passively detect each other’s resistance values via the thermal noise on the line. Each time the two parties determine each other’s resistance values, they secretly share one bit of information.


Because the second law prohibits net power from flowing from one resistor to another when the system is at equilibrium, a potential eavesdropper cannot determine the resistance values. The only way an eavesdropper could intercept the bits is by injecting current into the wire and measuring the voltage and current changes in each direction to determine the resistance values, but the extra current would be quickly noticed.
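As a toy numerical sketch of the exchange (with illustrative resistor values and arbitrary noise units; this is not the authors’ apparatus): the thermal noise variance on the line is set by the parallel combination of the two chosen resistors, each party solves for the other’s resistor using knowledge of its own, and only “mixed” choices, where an eavesdropper sees the same parallel value either way, yield secret bits.

```python
import numpy as np

rng = np.random.default_rng(0)
R0, R1 = 1e3, 10e3      # ohms: 'low' and 'high' resistors (illustrative)
SAMPLES = 200_000       # noise samples per bit exchange

def line_noise_std(ra, rb):
    # Johnson noise voltage on the line scales as sqrt(R_parallel);
    # the physical constants (4kT x bandwidth) are absorbed into the units.
    return np.sqrt(ra * rb / (ra + rb))

def infer_partner(r_mine, measured_var):
    # Solve var = R_mine*R_other/(R_mine+R_other) for R_other, pick nearest.
    r_other = measured_var * r_mine / (r_mine - measured_var)
    return min((R0, R1), key=lambda r: abs(r - r_other))

key = []
for _ in range(20):
    ra, rb = rng.choice([R0, R1]), rng.choice([R0, R1])
    noise = rng.normal(0.0, line_noise_std(ra, rb), SAMPLES)
    var = noise.var()
    # Each party recovers the other's resistor from the shared line noise
    assert infer_partner(ra, var) == rb and infer_partner(rb, var) == ra
    if ra != rb:               # only mixed states yield secure bits
        key.append(0 if ra == R0 else 1)   # convention: sender's resistor

print("shared secret bits:", key)
```

The point of the attack described next is that this picture treats the line as a lumped circuit; once the signals are recognised as waves travelling in two directions, the symmetry protecting the mixed states breaks down.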


The design of the KKD system relies on a thorough understanding of the physics of waves traveling through a transmission line. One debatable requirement for unconditional security in KKD is that no waves propagate along the transmission line below a certain cutoff frequency, v/(2L), where L is the transmission line length and v the signal propagation velocity. Below this cutoff, the signals on the line are claimed to behave as non-propagating fluctuations rather than as traveling waves.
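As an illustrative example (the cable length and propagation velocity here are assumed values, not figures from the paper), for a 1 km line carrying signals at roughly two-thirds the speed of light:

```latex
f_c \;=\; \frac{v}{2L}
\;=\; \frac{2\times10^{8}\ \mathrm{m/s}}{2\times 1000\ \mathrm{m}}
\;=\; 100\ \mathrm{kHz},
```

so the security argument would require everything on such a line to behave in a non-wave-like way below roughly 100 kHz.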


In the new paper, the researchers show in simulations and experiments that waves with frequencies below this critical value do actually propagate along the transmission line. The reason, they explain, is that at low frequencies a coaxial cable supports TEM (Transverse Electromagnetic) modes, which have no low frequency cutoff.


The researchers detected the existence of propagating TEM waves on a coaxial cable by constructing a directional wave measurement device, which they then used to successfully eavesdrop. They showed that, merely by measuring the TEM waves traveling along the transmission line, an eavesdropper can determine both resistor values, allowing them to correctly intercept more than 99.9% of the bits without being caught.









Attacks, counterattacks: the cycle continues

Although this sounds bad for KKD, it in no way spells the end for the cryptographic scheme.


Already, several critiques of the new attack have been proposed in response to a preprint of the paper. Several critiques argue that the signals in the transmission line are not actually waves for a variety of reasons, such as flaws in the model, insufficient power on the line to excite wave modes, and not meeting the definition of a wave. However, the Adelaide team refutes each of these arguments, in the latter case by pointing out that there is no definitive definition of a wave.


Returning to defend his system, Kish, along with coauthor Claes-Goran Granqvist, has proposed a countermeasure to the attack. They suggest that increasing the noise temperature on the side with the smaller resistance compared to the side with the greater resistance can remove an eavesdropper’s information. The Adelaide researchers agree that this countermeasure may be effective as long as the sender and receiver are able to account for the noise difference in their measurements.


“The countermeasure has yet to be demonstrated in practice, and this is the next step,” Abbott said. “It will be promising if it can be shown to work for practical bandwidths and cable lengths; this remains to be seen.”


Looking at the bigger picture of secure communication, Abbott explains that even a system that is completely secure in theory is vulnerable to attack. In other words, there is no such thing as a completely secure scheme in the first place.


“The concept of ‘unconditional security’ is a Platonic idealization that is true only for the mathematical description of a cryptographic scheme,” Abbott said. “Any physical realization can never be fully captured by such idealizations, and therefore in my opinion any cryptographic scheme is open to attack. Math is watertight, but the physical world leaks. At the end of the day, what matters is how many decades a given scheme can hold off an attack for. But this is notoriously difficult to predict.”









– Credit and Resource –


More information: Lachlan J. Gunn, et al. “A directional wave measurement attack against the Kish key distribution system.” Scientific Reports. DOI: 10.1038/srep06461


Journal reference: Scientific Reports



Attack on classical cryptography system raises security questions

Quantum Physics Has Just Got Less Complicated

Here’s a nice surprise: quantum physics is less complicated than we thought. An international team of researchers has proved that two peculiar features of the quantum world previously considered distinct are different manifestations of the same thing. The result is published 19 December in Nature Communications.


Freedawn Scientia - Quantum physics just got less complicated Quantum physics says that particles can behave like waves, and vice versa. Researchers have now shown that this ‘wave-particle duality’ is simply the quantum uncertainty principle in disguise. Credit: Timothy Yeo / CQT, National University of Singapore


Patrick Coles, Jedrzej Kaniewski, and Stephanie Wehner made the breakthrough while at the Centre for Quantum Technologies at the National University of Singapore. They found that ‘wave-particle duality’ is simply the quantum ‘uncertainty principle’ in disguise, reducing two mysteries to one.









“The connection between uncertainty and wave-particle duality comes out very naturally when you consider them as questions about what information you can gain about a system. Our result highlights the power of thinking about physics from the perspective of information,” says Wehner, who is now an Associate Professor at QuTech at the Delft University of Technology in the Netherlands.


The discovery deepens our understanding of quantum physics and could prompt ideas for new applications of wave-particle duality.


Wave-particle duality is the idea that a quantum object can behave like a wave, but that the wave behaviour disappears if you try to locate the object. It’s most simply seen in a double slit experiment, where single particles, electrons, say, are fired one by one at a screen containing two narrow slits. The particles pile up behind the slits not in two heaps as classical objects would, but in a stripy pattern like you’d expect for waves interfering. At least this is what happens until you sneak a look at which slit a particle goes through – do that and the interference pattern vanishes.


The quantum uncertainty principle is the idea that it’s impossible to know certain pairs of things about a quantum particle at once. For example, the more precisely you know the position of an atom, the less precisely you can know the speed with which it’s moving. It’s a limit on the fundamental knowability of nature, not a statement on measurement skill. The new work shows that how much you can learn about the wave versus the particle behaviour of a system is constrained in exactly the same way.


Wave-particle duality and uncertainty have been fundamental concepts in quantum physics since the early 1900s. “We were guided by a gut feeling, and only a gut feeling, that there should be a connection,” says Coles, who is now a Postdoctoral Fellow at the Institute for Quantum Computing in Waterloo, Canada.









It’s possible to write equations that capture how much can be learned about pairs of properties that are affected by the uncertainty principle. Coles, Kaniewski and Wehner are experts in a form of such equations known as ‘entropic uncertainty relations’, and they discovered that all the maths previously used to describe wave-particle duality could be reformulated in terms of these relations.
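The press release does not spell out which relations were used, but a standard example of an entropic uncertainty relation, due to Maassen and Uffink, gives the flavour (shown here as a representative illustration, not necessarily the exact form in the paper):

```latex
H(X) + H(Z) \;\ge\; \log_2 \frac{1}{c},
\qquad
c \;=\; \max_{x,z} \bigl|\langle x \mid z \rangle\bigr|^{2},
```

where H(X) and H(Z) are the Shannon entropies of the outcomes when measuring two observables X and Z, and c is the maximal overlap between their measurement bases. For a qubit measured in two mutually unbiased bases, c = 1/2, so the combined uncertainty can never fall below one full bit.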


“It was like we had discovered the ‘Rosetta Stone’ that connected two different languages,” says Coles. “The literature on wave-particle duality was like hieroglyphics that we could now translate into our native tongue. We had several eureka moments when we finally understood what people had done,” he says.


Because the entropic uncertainty relations used in their translation have also been used in proving the security of quantum cryptography – schemes for secure communication using quantum particles – the researchers suggest the work could help inspire new cryptography protocols.


In earlier papers, Wehner and collaborators found connections between the uncertainty principle and other physics, namely quantum ‘non-locality’ and the second law of thermodynamics. The tantalising next goal for the researchers is to think about how these pieces fit together and what bigger picture that paints of how nature is constructed.









– Credit and Resource –


More information: “Equivalence of wave-particle duality to entropic uncertainty.” Nature Communications DOI: 10.1038/ncomms6814 (2014). Preprint available at http://arxiv.org/abs/1403.4687


Provided by National University of Singapore



Quantum Physics Has Just Got Less Complicated

Japanese mission aims for an asteroid in search of origins of Earth's water

The European Space Agency’s Rosetta mission to land on comet 67P was one of the most audacious in space history. The idea of landing on a small chunk of icy rock 300 million kilometres away from Earth and hurtling towards the sun at speeds approaching 135,000 km/h is incredible, made more so by the fact that they actually achieved it.


Freedawn Scientia - After Rosetta, Japanese mission aims for an asteroid in search of origins of Earth's water In space, no one can appreciate your artistic spacecraft rendering. JAXA


What scientists have learned from the data returned by Rosetta supports the need for another ambitious space mission that has just begun: the Japanese Aerospace Exploration Agency (JAXA) Hayabusa2 mission will intercept not a comet, but an asteroid, landing on its surface no fewer than three times.


Data returned by the Rosetta mission has already provided us with many surprises, including the results now published in the journal Science, which reveal that the nature of the water found on comet 67P does not match that found on Earth.









Examining the vaporous cloud that encloses the comet nucleus, Rosetta measured the ratio of deuterium (the heavier form of hydrogen) to ordinary hydrogen and found it was about three times higher than that of water on Earth. This is an important discovery, since while water is vital to our existence on Earth, it is not at all obvious where it came from.
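For reference, the published numbers behind the “three times” figure (the Rosetta team’s measured value for 67P against the standard value for Earth’s oceans) are approximately:

```latex
\frac{(D/H)_{\mathrm{67P}}}{(D/H)_{\mathrm{Earth}}}
\;\approx\; \frac{5.3\times10^{-4}}{1.56\times10^{-4}}
\;\approx\; 3.4 .
```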


In the beginning

The Earth was formed from small rocky planetesimals that circled the young sun, coalescing into a planet that was most likely born a dry world. Ices could only condense at the lower temperatures found much further out in the solar system, beyond the reach of the forming Earth. This means that the Earth must have received its water at a later time.


One hypothesis is that water came via comet impacts. Comets are formed in the chilly reaches around the giant planets of Jupiter, Saturn, Uranus and Neptune and are heavy in ice. During the end of our solar system’s formation, a large number of these were scattered towards the inner planets via gravitational kicks from their mammoth planetary neighbours. Striking our dry world, their icy contents could have begun the formation of our oceans.


But Rosetta’s analysis of comet 67P suggests that our oceans are not filled with fresh comet water. What we need is an alternative source, which leads us to Hayabusa2’s mission to the asteroids.


Freedawn Scientia - After Rosetta, Japanese mission aims for an asteroid in search of origins of Earth's water Asteroid 1999 JU3’s position relative to the Earth. Credit: JAXA


Answers from asteroids

The JAXA Hayabusa2 mission, which launched in early December, aims to intercept asteroid 1999 JU3, touch down on its surface three times, deploy a lander and a trio of rovers, and return to Earth with asteroid samples in 2020. In short, it is a worthy successor to Rosetta.


Both comets and asteroids are left-over rocky parts from the planet formation process, but asteroids sit much closer to the Earth. The majority form a band orbiting the sun beyond Mars, known as the asteroid belt, but Hayabusa2’s target is far closer, currently orbiting the sun between the Earth and Mars.


Asteroids come in different flavours. The S-type group have been heated during their lifetime in processes that alter their original composition, while C-type asteroids – the target of Hayabusa2 – are thought to have changed very little since their original formation.


As its name implies, Hayabusa2 has a predecessor, which visited the S-type asteroid Itokawa and found evidence that it had been heated to 800°C during its lifetime. While that exploration illuminated much about the evolution of such space rocks, it held no answers as to the arrival of water on Earth.


Answers in clay


Freedawn Scientia - After Rosetta, Japanese mission aims for an asteroid in search of origins of Earth's water Hayabusa 2 under construction ahead of its epic journey. Credit: JAXA









At only around 1km across, 1999 JU3 has insufficient gravity to hold liquid water, but observations suggest it contains clays, which require water to form. This, and its current unstable orbit, implies that it was once part of a larger object that broke apart.
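To get a hedged sense of just how weak that gravity is, assume a radius of about 450 m and a typical C-type density of about 1,300 kg/m³ (both illustrative assumptions rather than measured values):

```latex
M = \tfrac{4}{3}\pi r^{3}\rho
  \approx \tfrac{4}{3}\pi\,(450\ \mathrm{m})^{3}\times 1300\ \mathrm{kg/m^{3}}
  \approx 5\times10^{11}\ \mathrm{kg},
\qquad
v_{\mathrm{esc}} = \sqrt{\frac{2GM}{r}} \approx 0.4\ \mathrm{m/s}.
```

Anything moving faster than about half a metre per second would simply drift away, which helps explain why sampling is done during brief touchdowns rather than by resting on the surface.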


After completing an initial analysis, Hayabusa2’s first touchdown will be at the site of the discovered clays. While Rosetta deployed a lander to reach the comet surface, Hayabusa2 will itself make contact with the asteroid, firing a bullet as it descends to break up surface material that it can gather. It will do this twice more at different locations; the third descent will be preceded by the firing of a larger projectile to bring up rocky debris from beneath the surface of the asteroid. While making a direct landing is risky, the advantage is that these samples can be brought back to Earth for thorough analysis.


Despite touching down itself, Hayabusa2 will also deploy a lander. Developed by the same German and French teams that built the Rosetta lander, Philae, Hayabusa2’s MASCOT (Mobile Asteroid Surface SCout) will run on a 15-hour battery; the spacecraft will also dispatch three small rovers to explore the surface.


Life’s building blocks in space

However, water may be only one part of the secrets to be discovered on 1999 JU3. Previous research has suggested that reactions with water on asteroids are linked to the production of amino acids: the organic building blocks of life. Not only this, but these amino acids seem to be predominantly left-handed, a distinctive feature of the amino acids found in life on Earth.


While amino acids created in the laboratory appear in equal numbers as left- and right-handed mirror images, biology strongly favours the left-handed version. We don’t know the reason for this preference, which makes the suggestion that such selectivity could have begun in space extremely exciting. If this turns out to be true, then scientists opening Hayabusa2’s sample jar in six years’ time may not only find the source of our water, but perhaps also the very beginnings of life.










Japanese mission aims for an asteroid in search of origins of Earth's water