Health
Stanford University scientists think a newly improved drug might help fight off the viruses that cause Ebola, dengue and Zika, among others. Attempts to destroy viruses directly, including common cold viruses, have failed up until now. Scientists at Stanford approached the problem from a different angle: boosting the human body’s ability to resist a virus rather than attacking the virus itself. The work was published in ‘Nature Chemical Biology’. The approach has worked, in a lab dish at least, with a drug that fights two disease-causing viruses and potentially many more.

Chaitan Khosla, a professor of chemistry and of chemical engineering and one of the senior authors on the paper, said the drug could be effective against viruses that use RNA instead of DNA as their genetic material. “Most of the really nasty viruses use RNA,” Khosla said, including Ebola, dengue, Zika and Venezuelan equine encephalitis virus (VEEV), a mosquito-borne virus that infects horses but can also kill people. The team is now testing the drug in animals to check its safety and to see which viral diseases it can fight off.

A drug based on a similar concept was initially developed by GlaxoSmithKline. However, after a few initial publications it was found that over time the drug prevented cells from dividing, and it was shelved. Khosla and his team studied the drug and decided to resurrect it by improving its mechanism of action. Their solution was to feed the cells a slightly different building block that is used only to make DNA, not RNA. With this approach, the cells successfully fought off dengue and VEEV while continuing to divide, so the drug should be less toxic to animals and ultimately to people. Khosla said that if the drug combination works in animals, he hopes it might be among the first antiviral approaches for human disease.

Environment
A new study by Stanford University could improve future seismic hazard predictions. The research reveals how the rupture of multiple faults can lead to stronger earthquakes. Based on the new findings, the 1812 earthquake in Southern California was likely caused by slippage on one fault triggering the rupture of a second. Scientists had previously blamed the San Andreas Fault alone for the magnitude 7.5 quake, but the study reveals the nearby San Jacinto Fault to be an accomplice. The San Jacinto Fault’s capacity to cause serious quakes in tandem with the San Andreas Fault has been underestimated.

“This study shows that the San Jacinto Fault is an important player that can influence what the San Andreas Fault is doing and lead to major earthquakes,” said Julian Lozos, the author of the study published in ‘Science Advances’ and currently an assistant professor of geological sciences at California State University. “It’s important that we learn more about how activity on any single fault can affect other faults.”

According to evidence found in the study, the San Jacinto Fault slipped first, between the cities of San Jacinto and Moreno Valley. The rupture then travelled north and crossed over to the San Andreas Fault near Cajon Pass, where the two faults run as close as 1.5 kilometres apart. Together, the two ruptured faults caused the Southern California earthquake on that ill-fated December morning. “Understanding this earthquake in particular, and complex earthquakes in general, is important to quantifying seismic hazard,” said geophysicist Greg Beroza, the Wayne Loel Professor at Stanford’s School of Earth, Energy & Environmental Sciences.

Lozos’ research could help the Uniform California Earthquake Rupture Forecast (UCERF) prepare for future earthquakes. Earlier UCERF reports put the chance of a magnitude 8 or larger earthquake striking California in the next 30 years at about 4.7 percent; the latest report, released in 2015, raises that estimate to about 7 percent after taking into account the effect of multi-fault ruptures. Lozos also hopes his research raises earthquake awareness among the general public, especially the millions of Californians living in the Inland Empire, which is undercut by both the San Andreas and San Jacinto faults. “People shouldn’t just be thinking about the San Andreas Fault,” Lozos said. “There are lots of other faults, so it’s important for everyone in the regions at risk to have their earthquake kits ready.”

Environment
An expert on nuclear materials at Stanford University says we need to reassess natural disaster risks, acknowledge the links between nuclear energy and renewables, and rethink the language used when referring to these disasters. According to Rodney Ewing, the Frank Stanton Professor in Nuclear Security and senior fellow at the Center for International Security and Cooperation in the Freeman Spogli Institute, the nuclear meltdown at Fukushima was not an accident, as it has been described in the media and in various scientific papers, but rather a failure of safety analysis.

When a powerful earthquake strikes, power plants automatically shut down their reactors, and generators start immediately to keep coolant circulating over the nuclear fuel, preventing overheating and possible meltdown. At Fukushima, however, the tsunami flooded the diesel generators, which were installed low and close to the coast, cutting off power to the cooling system. The poorly placed generators led to the catastrophe. “This is why, when I refer to the tragedy at Fukushima, it was not an accident,” said Ewing. “The Japanese people and government were certainly well acquainted with the possibility of tsunamis.”

His second lesson is to rethink the meaning of ‘risk’. Referring to an earthquake or tsunami as a rare event, when the geological record shows it has happened and will happen again, reduces the sense of urgency to prepare in advance. “It can be that the risk analysis works against safety, in the sense that if the risk analysis tells us that something’s safe, then you don’t take the necessary precautions,” he said. The assumption that the reactors were safe during an earthquake discouraged further preparation for a tsunami. In the case of the Fukushima power plant, Ewing said, one should not assess the risk of a single reactor being hit by a tsunami, but rather the risk of a tsunami hitting any one of its reactors over the plant’s lifetime. Framed that way, the probability of a strike is considerably higher, especially once the geological record of past tsunamis is also taken into account.

The third lesson, according to Ewing, is the strong link between nuclear energy and the future of renewables. Since the tragedy, Ewing has watched its effects ripple through the nuclear industry, and he believes this impact will in turn greatly affect the future of renewable energy resources. The U.S. Nuclear Regulatory Commission has required all reactor sites to reassess their risk from natural disasters, especially in the central United States. However, this reaction was not shared globally. “In countries like Germany and Switzerland, the Fukushima tragedy was the last straw,” Ewing said. “This was particularly true in Germany, where there has always been a strong public position against nuclear power and against geologic waste disposal. Politically, Germany announced that it will shut down its nuclear power plants.” Ewing says Germany is a telling example: a technologically advanced country trying to avoid the use of nuclear energy while also trying to reduce its carbon emissions. The move towards renewable energy sources will be costly, but it’s a price many Germans are willing to pay.
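Ewing’s point about lifetime risk comes down to simple probability arithmetic: a hazard that looks negligible in any single year compounds over decades of operation. The sketch below is purely illustrative; the annual probability and plant lifetime are hypothetical numbers chosen for the example, not figures from Ewing’s analysis or the article.

```python
# Illustrative sketch of lifetime vs. single-year risk (hypothetical numbers).
# If a damaging tsunami has annual probability p at a site, the chance that
# at least one strikes during a T-year operating lifetime is 1 - (1 - p)**T,
# which is far larger than the single-year figure p.

p = 0.005   # assumed annual probability of a damaging tsunami (hypothetical)
T = 40      # assumed operating lifetime in years (hypothetical)

lifetime_risk = 1 - (1 - p) ** T
print(f"Annual risk:   {p:.1%}")              # 0.5%
print(f"Lifetime risk: {lifetime_risk:.1%}")  # about 18.2%
```

Whatever numbers one plugs in, the conclusion is the same: small per-year risks accumulate over a plant’s lifetime, which is why Ewing argues that lifetime exposure, not single-event probability, is the right quantity to assess.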

Environment, Politics, News
According to a Stanford University study, collaborative efforts to reduce deforestation are more than twice as effective as “confrontational” programs implemented by either nongovernmental organizations or industry. Various eco-certifications inform consumers about the impact of their purchases on deforestation, but until now there has been little research on their effectiveness. The study, “Impacts of Nonstate, Market-Driven Governance on Chilean Forests”, published in ‘Proceedings of the National Academy of Sciences’, finds that these certifications have substantially improved forest product sustainability: market-driven efforts have reduced deforestation, with multi-party collaborations having the greatest impact. “Our research shows that these market-based conservation efforts have reduced deforestation in Chile,” said lead author Robert Heilmayr, a recent graduate of Stanford’s Emmett Interdisciplinary Program in Environment and Resources.

The study compared the conservation outcomes of three programs: CERTFOR, a largely industry-developed certification program; the Joint Solutions Project (JSP), an NGO-instigated deforestation moratorium; and the Forest Stewardship Council (FSC), a collaboration between industry and nongovernmental organizations. CERTFOR participants reduced deforestation by 16 percent on average and JSP-only participants by 20 percent, while FSC achieved the greatest success with a 43 percent reduction. According to Heilmayr and co-author Eric Lambin, the George and Setsuko Ishiyama Provostial Professor in the School of Earth, Energy & Environmental Sciences, FSC’s leading success came from balancing strict environmental requirements with cost-effective solutions. This balance persuades participants that their interests are protected, so they follow through on the requirements.

The analysis also suggests that, in contrast to government policies, private and market-driven programs are better at lowering deforestation rates in places where deforestation pressure is high. “Traditional conservation policies like national parks often protect remote, pristine locations,” Heilmayr said. “Agreements between companies and environmentalists can reduce deforestation in more threatened forests.” “In the globalization era, deforestation is increasingly associated with consumption in distant, international markets,” said Lambin. “We need new approaches to environmental governance that regulate the impact of international actors.”