Nearly one trillion species could be living on Earth, yet 99.999 per cent of them remain undiscovered, according to the largest-ever analysis of microbial data.

Researchers combined microbial, plant and animal community datasets from government, academic and citizen science sources, resulting in the largest compilation of its kind.

These data represent over 5.6 million microscopic and non-microscopic species from 35,000 locations across all the world's oceans and continents, except Antarctica.

"Estimating the number of species on Earth is among the great challenges in biology," said Kenneth J Locey, a postdoctoral fellow at Indiana University in the US.

"Our study combines the largest available datasets with ecological models and new ecological rules for how biodiversity relates to abundance. This gave us a new and rigorous estimate for the number of microbial species on Earth," said Locey.

"Until recently, we've lacked the tools to truly estimate the number of microbial species in the natural environment," he said.

"Many earlier attempts to estimate the number of species on Earth simply ignored microorganisms or were informed by older datasets that were based on biased techniques or questionable extrapolations," said Jay T Lennon, associate professor at the IU Bloomington College of Arts and Sciences' Department of Biology.

"Until now, we haven't known whether aspects of biodiversity scale with something as simple as the abundance of organisms," Locey added.

"As it turns out, the relationships are not only simple but powerful, resulting in the estimate of upwards of one trillion species," he said.

The results also suggest that identifying every microbial species on Earth is an almost unimaginably huge challenge.

The Earth Microbiome Project - a global multidisciplinary project to identify microscopic organisms - has so far catalogued fewer than 10 million species.

"Of those catalogued species, only about 10,000 have ever been grown in a lab, and fewer than 100,000 have classified sequences," Lennon said.

"Our results show that this leaves 100,000 times more microorganisms awaiting discovery - and 100 million to be fully explored. Microbial biodiversity, it appears, is greater than ever imagined," said Lennon.

The research was published in the journal Proceedings of the National Academy of Sciences.

Courtesy – Deccan Herald

 

 India is home to four of the five cities in the world with the worst air pollution, the World Health Organization said on Thursday.

But while WHO experts acknowledge India faces a "huge challenge", many countries have such severe pollution that they lack any air quality monitoring system and cannot be included in its ranking.

The dirtiest air was recorded at Zabol in Iran, which suffers from months of dust storms in the summer, and which clocked a so-called PM2.5 measure of 217. The next four were all Indian: Gwalior, Allahabad, Patna and Raipur.

India's capital New Delhi was the survey's ninth worst city, ranked by the concentration of particulate matter smaller than 2.5 micrometres (PM2.5) in the air, with an annual average reading of 122 micrograms per cubic metre.

Tiny particulate matter can cause lung cancer, strokes and heart disease over the long term, as well as triggering acute events such as heart attacks that kill more rapidly. The WHO says more than 7 million premature deaths occur every year due to air pollution, 3 million of them due to outdoor air pollution.

New Delhi was ranked worst in 2014 with a PM2.5 reading of 153. It has since tried to tackle its toxic air by limiting the use of private cars on the road for short periods.

Maria Neira, head of public health, environmental and social determinants of health at the WHO, praised India's government for developing a national plan to deal with the problem when others have been unable to.

"Probably some of the worst cities that are the most polluted ones in the world are not included in our list, just because they are so bad that they do not even have a good system of monitoring of air quality, so it's unfair to compare or give a rank," she said.

Common causes of air pollution include too many cars, especially diesel-fuelled vehicles, the heating and cooling of big buildings, waste management, agriculture and the use of coal or diesel generators for power.

On average, pollution levels worsened by 8 per cent between 2008 and 2013, although most cities in rich countries improved the state of their air over the same period.

The WHO data, a survey of 3,000 urban areas, shows only 2 per cent of cities in poorer countries have air quality that meets WHO standards, while 44 per cent of richer cities do.

The WHO database has almost doubled in size since 2014, and the trend towards more transparency translated into more action to deal with the problem, Neira said.

However, there was still very sparse data on Africa, she said.

Courtesy – Deccan Herald

Scientists have made a 3D map of 3,000 galaxies 13 billion light years from Earth, and found that Einstein's general theory of relativity is valid even far into the universe.

Since it was discovered in the late 1990s that the universe is expanding at an accelerated rate, scientists have been trying to explain why.

A mysterious dark energy could be driving the acceleration, or Einstein's theory of general relativity, which says gravity warps space and time, could be breaking down.

To test Einstein's theory, researchers including those from the Kavli Institute for the Physics and Mathematics of the Universe (Kavli IPMU) and the University of Tokyo in Japan used data on more than 3,000 distant galaxies to analyse their velocities and clustering.

Their results indicate that even far into the universe, general relativity is valid, giving further support that the expansion of the universe could be explained by a cosmological constant, as proposed by Einstein in his theory of general relativity.

"We tested the theory of general relativity further than anyone else ever has. It's a privilege to be able to publish our results 100 years after Einstein proposed his theory," said Teppei Okumura, Project Researcher at Kavli IPMU.

"Having started this project 12 years ago it gives me great pleasure to finally see this result come out," said Karl Glazebrook, Professor at Swinburne University of Technology.

No one had previously been able to analyse galaxies more than 10 billion light years away, but the team managed to break this barrier thanks to the FMOS (Fibre Multi-Object Spectrograph) on the Subaru Telescope, which can analyse galaxies 12.4 to 14.7 billion light years away.

Courtesy - Deccan Herald

 In the largest finding of planets to date, NASA has announced the discovery of 1,284 new planets outside our solar system, more than doubling the number of exoplanets found by the Kepler space telescope.

Nine of the newly found planets may be potentially habitable, NASA said.

"This gives us hope that somewhere out there, around a star much like ours, we can eventually discover another Earth," said Ellen Stofan, chief scientist at NASA Headquarters in Washington.

"This announcement more than doubles the number of confirmed planets from Kepler," said Stofan.

Analysis was performed on the Kepler space telescope's July 2015 planet candidate catalogue, which identified 4,302 potential planets.

For 1,284 of the candidates, the probability of being a planet is greater than 99 per cent - the minimum required to earn the status of "planet".

An additional 1,327 candidates are more likely than not to be actual planets, but they do not meet the 99 per cent threshold and will require additional study.

The remaining 707 are more likely to be some other astrophysical phenomena. The analysis also validated 984 candidates previously verified by other techniques, accounting for all 4,302 candidates in the catalogue.

"Before the Kepler space telescope launched, we did not know whether exoplanets were rare or common in the galaxy," said Paul Hertz, Astrophysics Division director at NASA.

"Thanks to Kepler and the research community, we now know there could be more planets than stars," said Hertz.

"This knowledge informs the future missions that are needed to take us ever-closer to finding out whether we are alone in the universe," he said.

Kepler captures the discrete signals of distant planets - decreases in brightness that occur when planets pass in front of, or transit, their stars.
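As a rough illustration of the transit method (not an analysis from the Kepler catalogue), the fractional dip in brightness is approximately the square of the ratio of the planet's radius to the star's radius. The short Python sketch below uses textbook radii for the Sun, Earth and Jupiter to show why Earth-size transits are much harder to detect than Jupiter-size ones.

```python
# A minimal sketch of the transit geometry described above: the fractional dip
# in a star's brightness is roughly the ratio of the planet's disc area to the
# star's. Radii below are textbook values, not numbers from the Kepler catalogue.

R_SUN_KM = 696_000.0
R_EARTH_KM = 6_371.0
R_JUPITER_KM = 69_911.0

def transit_depth(planet_radius_km, star_radius_km=R_SUN_KM):
    """Fractional decrease in brightness while the planet crosses the star."""
    return (planet_radius_km / star_radius_km) ** 2

print(f"Earth-size planet:   {transit_depth(R_EARTH_KM):.4%}")    # ~0.0084 per cent
print(f"Jupiter-size planet: {transit_depth(R_JUPITER_KM):.4%}")  # ~1 per cent
```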

Since the discovery of the first planets outside our solar system more than two decades ago, researchers have resorted to a one-by-one process of verifying suspected planets.

The latest findings are based on a new method that can be applied to many planet candidates simultaneously.

In the newly-validated batch of planets, nearly 550 could be rocky planets like Earth, based on their size.

Nine of these orbit in their sun's habitable zone, which is the distance from a star where orbiting planets can have surface temperatures that allow liquid water to pool.

With the addition of these nine, 21 exoplanets now are known to be members of this exclusive group.

"This work will help Kepler reach its full potential by yielding a deeper understanding of the number of stars that harbour potentially habitable, Earth-size planets - a number that's needed to design future missions to search for habitable environments and living worlds," said Natalie Batalha, Kepler mission scientist at NASA.

Of the nearly 5,000 total planet candidates found to date, more than 3,200 now have been verified, and 2,325 of these were discovered by Kepler.

Courtesy – Deccan Herald

 

Scientists have developed a new non-invasive, personalised 3D virtual heart assessment tool to help doctors determine whether a patient faces a risk of life-threatening arrhythmia.

When electrical waves in the heart run amok in a condition called arrhythmia, sudden death can occur, researchers said.

To save the life of a patient at risk, doctors currently implant a small defibrillator to sense the onset of arrhythmia and jolt the heart back to a normal rhythm.

However, it is difficult to decide which patients truly need the invasive, costly electrical implant.

"Our virtual heart test significantly outperformed several existing clinical metrics in predicting future arrhythmic events," said Natalia Trayanova from Johns Hopkins University in the US.

"This non-invasive and personalised virtual heart-risk assessment could help prevent sudden cardiac deaths and allow patients who are not at risk to avoid unnecessary defibrillator implantations," said Trayanova.

The researchers formed their predictions using the distinctive magnetic resonance imaging (MRI) records of patients who had survived a heart attack but were left with damaged cardiac tissue that predisposes the heart to deadly arrhythmias.

The study involved data from 41 patients who had survived a heart attack and had an ejection fraction - a measure of how much blood is being pumped out of the heart - of less than 35 per cent.

Researchers used pre-implant MRI scans of the recipients' hearts to build patient-specific digital replicas of the organs.

Using computer-modeling techniques, the geometrical replica of each patient's heart was brought to life by incorporating representations of the electrical processes in the cardiac cells and the communication among cells.

In some cases, the virtual heart developed an arrhythmia, and in others it did not. The result, a non-invasive way to gauge the risk of sudden cardiac death due to arrhythmia, was dubbed VARP, short for virtual-heart arrhythmia risk predictor, researchers said.

The method allowed the researchers to factor in the geometry of the patient's heart, the way electrical waves move through it and the impact of scar tissue left by the earlier heart attack.

"We demonstrated that VARP is better than any other arrhythmia prediction method that is out there," said Trayanova.

"By accurately predicting which patients are at risk of sudden cardiac death, the VARP approach will provide the doctors with a tool to identify those patients who truly need the costly implantable device, and those for whom the device would not provide any life-saving benefits," she said.

The findings were published in the journal Nature Communications.   

Courtesy – Deccan Herald

 

Air bubbles trapped in 2.7 billion-year-old rocks suggest that early Earth's air weighed less than half of today's atmosphere, researchers including one of Indian-origin have found.

The research from the University of Washington reverses the commonly accepted idea that the early Earth had a thicker atmosphere to compensate for weaker sunlight.

The finding also has implications for which gases were in that atmosphere, and how biology and climate worked on the early planet, researchers said.

"For the longest time, people have been thinking the atmospheric pressure might have been higher back then, because the sun was fainter," said Sanjoy Som, who did the work as part of his UW doctorate in Earth and space sciences.

"Our result is the opposite of what we were expecting," said Som.

Researchers used bubbles trapped in cooling lava as a "paleobarometer" to determine the weight of air in our planet's youth.

To measure air pressure farther back in time, researchers needed a site where truly ancient lava had undisputedly formed at sea level.

At the field site in Western Australia, discovered by Tim Blake of the University of Western Australia, the Beasley River has exposed 2.7 billion-year-old basalt lava.

The lowest lava flow has "lava toes" that burrow into glassy shards, proving that molten lava plunged into seawater. The team drilled into the overlying lava flows to examine the size of the bubbles.

A stream of molten rock that forms a lava flow quickly cools from the top and bottom, and bubbles trapped at the bottom are smaller than those at the top. The size difference records the air pressure pushing down on the lava as it cooled, 2.7 billion years ago.
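As a back-of-the-envelope illustration of that principle - assuming ideal-gas (Boyle's law) behaviour and using example numbers that are not taken from the study - the bubble-size ratio and the thickness of the flow are enough to estimate the ancient air pressure:

```python
# A back-of-the-envelope sketch of the paleobarometer idea described above,
# assuming ideal-gas (Boyle's law) behaviour. The density, flow thickness and
# bubble-size ratio below are illustrative values, not numbers from the study.

RHO_LAVA = 2700.0   # typical basalt density, kg/m^3 (assumed)
G = 9.81            # gravitational acceleration, m/s^2

def atmospheric_pressure_pa(flow_thickness_m, vol_ratio_bottom_to_top):
    """Estimate atmospheric pressure from the ratio of bubble volumes at the
    bottom and top of a lava flow.

    At solidification the gas in the top bubbles is at atmospheric pressure,
    while gas at the bottom also bears the weight of the overlying lava:
        P_top * V_top = P_bottom * V_bottom      (Boyle's law)
        P_bottom      = P_atm + rho * g * h
    Solving for P_atm gives the expression returned below.
    """
    r = vol_ratio_bottom_to_top            # V_bottom / V_top, less than 1
    overburden = RHO_LAVA * G * flow_thickness_m
    return overburden * r / (1.0 - r)

# Hypothetical example: a 3 m thick flow whose bottom bubbles are 40 per cent
# of the volume of the top bubbles implies roughly half of today's ~101 kPa.
print(atmospheric_pressure_pa(3.0, 0.4) / 1000.0, "kPa")   # ~53 kPa
```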

Rough measurements in the field suggested a surprisingly lightweight atmosphere. More rigorous X-ray scans from several lava flows confirmed the result: The bubbles indicate that the atmospheric pressure at that time was less than half of today's.

Earth 2.7 billion years ago was home only to single-celled microbes, sunlight was about one-fifth weaker, and the atmosphere contained no oxygen.

But this finding points to conditions being even more different than previously thought, researchers said.

A lighter atmosphere could affect wind strength and other climate patterns, and would even alter the boiling point of liquids, they said.

The study was published in the journal Nature Geoscience.

 

Courtesy – Deccan Herald

Scientists, including one of Indian-origin, have engineered a strain of bacteria that enables a "one-pot" method for producing advanced biofuels from a slurry of pre-treated plant material.

The engineered Escherichia coli (E coli) strain is able to tolerate the liquid salt used to break apart plant biomass into sugary polymers, researchers said.

Since the salt solvent, known as an ionic liquid, interferes with later stages in biofuel production, it needs to be removed before proceeding, a process that takes time and money. Developing ionic-liquid-tolerant bacteria eliminates the need to wash away the residual ionic liquid.

The achievement is a critical step in making biofuels a viable competitor to fossil fuels because it helps streamline the production process, researchers said.

"Being able to put everything together at one point, walk away, come back, and then get your fuel, is a necessary step in moving forward with a biofuel economy," said Aindrila Mukhopadhyay from the US Department of Energy's Lawrence Berkeley National Laboratory.

"The E coli we have developed gets us closer to that goal. It is like a chassis that we build other things onto, like the chassis of a car," said said Mukhopadhyay.

"It can be used to integrate multiple recent technologies to convert a renewable carbon source like switchgrass to an advanced jet fuel," she said.

The basic steps of biofuel production start with deconstructing the cellulose, hemicellulose and lignin that are bound together in the complex plant structure.

Enzymes are then added to release the sugars from that gooey mixture of cellulose and hemicellulose, a step called saccharification.

Bacteria can then take that sugar and churn out the desired biofuel. The multiple steps are all done in separate pots.

Researchers pioneered the use of ionic liquids, salts that are liquid at room temperature, to tackle the deconstruction of plant material because of the efficiency with which the solvent works.

However, what makes ionic liquids great for deconstruction also makes them harmful to the downstream enzymes and bacteria used in biofuel production.

They established that a mutation altering a single amino acid in the gene rcdA, which helps regulate various other genes, leads to an E coli strain that is highly tolerant to ionic liquids.

They used this strain as the foundation to build on earlier work - including ionic-liquid-tolerant enzymes - and carry the process through to the one-pot biofuel finishing line.

The findings were published in the journal Green Chemistry.

Courtesy – Deccan Herald

Repeatedly playing violent video games reduces emotional responses like guilt, scientists have found for the first time.

Rapidly advancing technology has created more realistic video games. Images are sharp, settings have depth and detail, and the audio is crisp and authentic.

At a glance, the games appear so real that research has found gamers feel guilty about committing unjustified acts of violence within them.

Researchers from University at Buffalo, Michigan State University and University of California Santa Barbara in the US found that the moral response produced by the initial exposure to a video game decreases as experience with the game develops.

Why this is happening remains a mystery, according to Matthew Grizzard, from University at Buffalo.

Gamers often claim their actions in a video game are as meaningless to the real world as players capturing pawns on a chess board.

However, previous research shows that immoral virtual actions can elicit higher levels of guilt than moral virtual actions. This finding would seem to contradict claims that virtual actions are completely divorced from the real world.

Researchers wanted to replicate their earlier research and determine whether gamers' claims that their virtual actions are meaningless actually reflect desensitisation processes.

Although the findings suggest that desensitisation occurs, mechanisms underlying these findings are not entirely clear.

There are two arguments for the desensitisation effect, Grizzard said.

"One is that people are deadened because they've played these games over and over again. This makes the gamers less sensitive to all guilt-inducing stimuli," he said.

The second argument is a matter of tunnel vision.

"Gamers see video games differently than non-gamers, and this differential perception develops with repeated play," he said.

"Non-gamers look at a particular game and process all that's happening. For the non-gamer, the intensity of the scene trumps the strategies required to succeed," he said.

But gamers ignore much of the visual information in a scene as this information can be meaningless to their success in a game, according to Grizzard.

"This second argument says the desensitisation we're observing is not due to being numb to violence because of repeated play, but rather because the gamers' perception has adapted and started to see the game's violence differently," he said.

"Through repeated play, gamers may come to understand the artificiality of the environment and disregard the apparent reality provided by the game's graphics," Grizzard said.

The study was published in the journal Media Psychology.

Courtesy – Deccan Herald

Scientists have for the first time decoded how deep sleep - also called slow-wave sleep - may be promoting the consolidation of recent memories in our brain.

Research strongly suggests that sleep, which constitutes about a third of our lives, is crucial for learning and forming long-term memories.

But exactly how such memory is formed is not well understood and remains, despite considerable research, a central question of enquiry in neuroscience.

The study by researchers at the University of California, Riverside provides for the first time a mechanistic explanation for how deep sleep, also called slow-wave sleep, may be promoting the consolidation of recent memories.

During sleep, human and animal brains are primarily decoupled from sensory input.

The brain remains highly active, showing electrical activity in the form of sharp-wave ripples in the hippocampus (a small region of the brain that forms part of the limbic system) and large-amplitude slow oscillations in the cortex (the outer layer of the cerebrum), reflecting alternating periods of active and silent states of cortical neurons during deep sleep.

Traces of episodic memory acquired during wakefulness and initially stored in the hippocampus are progressively transferred to the cortex as long-term memory during sleep.

Using a computational model, the researchers provide a link between electrical activity in the brain during deep sleep and synaptic connections between neurons.

They show that patterns of slow oscillations in the cortex, which their model spontaneously generates, are influenced by the hippocampal sharp-wave ripples and that these patterns of slow oscillations determine synaptic changes in the cortex.

The model shows that the synaptic changes, in turn, affect the patterns of slow oscillations, promoting a kind of reinforcement and replay of specific firing sequences of the cortical neurons - representing a replay of specific memory.
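The toy sketch below is not the researchers' biophysical network model; it is a minimal illustration, with hypothetical parameters, of the general idea that repeatedly replaying a firing sequence under a simple Hebbian learning rule strengthens exactly the synapses that encode that sequence.

```python
# A toy illustration (not the authors' model) of replay-driven reinforcement:
# a simple Hebbian rule potentiates the connection from each neuron to the one
# that fires just after it. All names and parameters here are hypothetical.

import numpy as np

n_neurons = 5
weights = np.zeros((n_neurons, n_neurons))   # weights[i, j]: synapse i -> j
sequence = [0, 1, 2, 3, 4]                   # the "memory": neurons firing in order
learning_rate = 0.1

for replay in range(10):                     # each replay event during deep sleep
    for pre, post in zip(sequence[:-1], sequence[1:]):
        # potentiate the synapse from the earlier-firing to the later-firing cell
        weights[pre, post] += learning_rate * (1.0 - weights[pre, post])

# After repeated replay, the synapses along the sequence (0->1, 1->2, ...) are
# strong while all others remain at zero, so the network can regenerate the
# sequence on its own - loosely mirroring independence from the hippocampus.
print(np.round(weights, 2))
```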

"These patterns of slow oscillations remain even without further input from the hippocampus," said Yina Wei, a postdoctoral researcher.

"We interpret these results as a mechanistic explanation for the consolidation of specific memories during deep sleep, whereby the memory traces are formed in the cortex and become independent of the hippocampus," said Wei.

Wei explained that according to the biologically realistic network model the researchers used, input from the hippocampus reaches the cortex during deep sleep and influences how the slow oscillations are initiated and propagated in the cortical network.

The study appears in the Journal of Neuroscience.

Courtesy – Deccan Herald

 

Office-goers, take note! Reducing sitting time at workplace by 71 minutes per day may lower the risk of heart diseases, diabetes and all-cause mortality, a new study has claimed.

Researchers conducted a multicomponent work-based intervention to reduce sitting time and prolonged sitting periods.

Participants were followed up at one month and three months. The results showed a reduction of 0.61 percentage points in body fat percentage, attributed to sitting time being 71 minutes shorter during working hours after one month.

"A reduction in sitting time by 71 minutes per day and increases in interruptions could have positive effects and, in the long run, could be associated with reduced risk of heart diseases, diabetes and all-cause mortality, especially among those who are inactive in their leisure time," said Janne Tolstrup from University of Southern Denmark.

As many as 317 office workers in 19 offices across Denmark and Greenland were randomly put into the intervention or control groups. The intervention included environmental office changes and a lecture and workshop, where workers were encouraged to use their sit-stand desks.

Participants wore an accelerometer, allowing researchers to measure results across a five-day working week.

After one month, participants in the intervention group sat down for 71 minutes less in an 8 hour work day than the control group. This reduced to 48 minutes after three months.

The number of steps per workday hour was seven per cent higher at one month and eight per cent higher at three months, researchers said.

Relatively few people complained of any pain as a result of standing more, with fewer than six per cent reporting negative consequences, they said. The findings were published in the International Journal of Epidemiology.

Courtesy – Deccan Herald
