Below are the articles published in this section, in chronological order.
Battery technology hasn't kept pace with advances in portable electronics, but the race is on to fix this. One revolutionary concept being pursued by a team of researchers in New Zealand involves creating "wearable energy harvesters" capable of converting movement, whether from humans or from nature, into battery power.
This image shows a hand-pumped soft generator the researchers use to demonstrate the concept.
A class of variable capacitor generators known as "dielectric elastomer generators" (DEGs) shows great potential for wearable energy harvesting. In fact, researchers at the Auckland Bioengineering Institute's Biomimetics Lab believe DEGs may enable light, soft, form-fitting, silent energy harvesters with excellent mechanical properties that match human muscle. They describe their findings in the American Institute of Physics' journal Applied Physics Letters.
"Imagine soft generators that produce energy by flexing and stretching as they ride ocean waves or sway in the breeze like a tree," says Thomas McKay, a Ph.D. candidate working on soft generator research at the Biomimetics Lab. "We've developed a low-cost power generator with an unprecedented combination of softness, flexibility, and low mass. These characteristics provide an opportunity to harvest energy from environmental sources with much greater simplicity than previously possible."
Dielectric elastomers, often referred to as artificial muscles, are stretchy materials that are capable of producing energy when deformed. In the past, artificial muscle generators required bulky, rigid, and expensive external electronics.
"Our team eliminated the need for this external circuitry by integrating flexible electronics—dielectric elastomer switches—directly onto the artificial muscles themselves. One of the most exciting features of the generator is that it's so simple; it simply consists of rubber membranes and carbon grease mounted in a frame," McKay explains.
McKay and his colleagues at the Biomimetics Lab are working to create soft, dexterous machines that interface comfortably with living creatures and nature in general. The soft generator is another step toward fully soft devices; it could potentially be incorporated unnoticeably into clothing and harvest electricity from human movement. If that happens, worrying about the battery in your cell phone or other portable electronics dying on you could become a thing of the past. And as an added bonus, this should help keep batteries out of landfills.
Even as the veggie blame game is now under way across the EU, where a super-resistant strain of E. coli is sickening patients and filling hospitals in Germany, virtually no one is talking about how E. coli could have magically become resistant to eight different classes of antibiotic drugs and then suddenly appeared in the food supply.
This particular E. coli variant is a member of the O104 strain, and O104 strains are normally almost never resistant to antibiotics. In order to acquire this resistance, they must be repeatedly exposed to antibiotics, providing the "mutation pressure" that nudges them toward complete drug immunity.
So if you're curious about the origins of such a strain, you can essentially reverse engineer the genetic code of the e.coli and determine fairly accurately which antibiotics it was exposed to during its development. This step has now been done (see below), and when you look at the genetic decoding of this O104 strain now threatening food consumers across the EU, a fascinating picture emerges of how it must have come into existence.
The genetic code reveals the history
When scientists at Germany's Robert Koch Institute decoded the genetic makeup of the O104 strain, they found it to be resistant to all the following classes and combinations of antibiotics:
• nalidixic acid
• amoxicillin / clavulanic acid
In addition, this O104 strain possesses an ability to produce special enzymes that give it what might be called "bacteria superpowers," known technically as ESBLs:
"Extended-Spectrum Beta-Lactamases (ESBLs) are enzymes that can be produced by bacteria making them resistant to cephalosporins e.g. cefuroxime, cefotaxime and ceftazidime - which are the most widely used antibiotics in many hospitals," explains the Health Protection Agency in the UK (http://www.hpa.org.uk/Topics/InfectiousDiseases/InfectionsAZ/ESBLs/).
On top of that, this O104 strain possesses two genes -- TEM-1 and CTX-M-15 -- that "have been making doctors shudder since the 1990s," reports The Guardian (http://www.guardian.co.uk/commentisfree/2011/jun/05/deadly-ecoli-resistance-antibiotic-misuse). And why do they make doctors shudder? Because they're so deadly that many people infected with such bacteria experience critical organ failure and simply die.
Bioengineering a deadly superbug
So how, exactly, does a bacterial strain come into existence that's resistant to over a dozen antibiotics in eight different drug classes and features two deadly gene mutations plus ESBL enzyme capabilities?
There's really only one way this happens: you have to expose this strain of E. coli to all eight classes of antibiotic drugs. Usually this isn't done at the same time, of course: You first expose it to penicillin and find the surviving colonies which are resistant to penicillin. You then take those surviving colonies and expose them to tetracycline. The surviving colonies are now resistant to both penicillin and tetracycline. You then expose them to a sulfa drug and collect the surviving colonies from that, and so on. It is a process of genetic selection done in a laboratory with a desired outcome. This is essentially how some bioweapons are engineered by the U.S. Army in its laboratory facility in Ft. Detrick, Maryland (http://en.wikipedia.org/wiki/National_Biodefense_Analysis_and_Countermeasures_Center).
Although the actual process is more complicated than this, the upshot is that creating a strain of E. coli that's resistant to eight classes of antibiotics requires repeated, sustained exposure to those antibiotics. It is virtually impossible to imagine how this could happen all by itself in the natural world. For example, if these bacteria originated in the food (as we've been told), then where did they acquire all this antibiotic resistance, given that antibiotics are not used in growing vegetables?
When considering the genetic evidence that now confronts us, it is difficult to imagine how this could happen "in the wild." While resistance to a single antibiotic is common, the creation of a strain of e.coli that's resistant to eight different classes of antibiotics -- in combination -- simply defies the laws of genetic permutation and combination in the wild. Simply put, this superbug e.coli strain could not have been created in the wild. And that leaves only one explanation for where it really came from: the lab.
Engineered and then released into the wild
The evidence now points to this deadly strain of e.coli being engineered and then either being released into the food supply or somehow escaping from a lab and entering the food supply inadvertently. If you disagree with that conclusion -- and you're certainly welcome to -- then you are forced to conclude that this octobiotic superbug (immune to eight classes of antibiotics) developed randomly on its own... and that conclusion is far scarier than the "bioengineered" explanation because it means octobiotic superbugs can simply appear anywhere at any time without cause. That would be quite an exotic theory indeed.
My conclusion actually makes more sense: This strain of e.coli was almost certainly engineered and then released into the food supply for a specific purpose. What would that purpose be? It's obvious, I hope.
It's all problem, reaction, solution at work here. First cause a PROBLEM (a deadly strain of e.coli in the food supply). Then wait for the public REACTION (huge outcry as the population is terrorized by e.coli). In response to that, enact your desired SOLUTION (total control over the global food supply and the outlawing of raw sprouts, raw milk and raw vegetables).
That's what this is all about, of course. The FDA relied on the same phenomenon in the USA when pushing for its recent "Food Safety Modernization Act" which essentially outlaws small family organic farms unless they lick the boots of FDA regulators. The FDA was able to crush farm freedom in America by piggybacking on the widespread fear that followed e.coli outbreaks in the U.S. food supply. When people are afraid, remember, it's not difficult to get them to agree to almost any level of regulatory tyranny. And making people afraid of their food is a simple matter... a few government press releases emailed to the mainstream media news affiliates is all it takes.
First ban the natural medicine, then attack the food supply
Now, remember: All this is happening on the heels of the EU ban on medicinal herbs and nutritional supplements -- a ban that blatantly outlaws nutritional therapies that help keep people healthy and free from disease. Now that all these herbs and supplements are outlawed, the next step is to make people afraid of fresh food, too. That's because fresh vegetables are medicinal, and as long as the public has the right to buy fresh vegetables, they can always prevent disease.
But if you can make people AFRAID of fresh vegetables -- or even outlaw them altogether -- then you can force the entire population onto a diet of dead foods and processed foods that promote degenerative disease and bolster the profits of the powerful drug companies.
It's all part of the same agenda, you see: Keep people sick, deny them access to healing herbs and supplements, then profit from their suffering at the hands of the global drug cartels.
GMOs play a similar role in all this, of course: They're designed to contaminate the food supply with genetic code that causes widespread infertility among human beings. And those who are somehow able to reproduce after exposure to GMOs still suffer from degenerative disease that enriches the drug companies from "treatment."
Do you recall which country was targeted in this recent e.coli scare? Spain. Why Spain? You may recall that leaked cables from Wikileaks revealed that Spain resisted the introduction of GMOs into its agricultural system, even as the U.S. government covertly threatened political retaliation for its resistance. This false blaming of Spain for the e.coli deaths is probably retaliation for Spain's unwillingness to jump on the GMO bandwagon. (http://www.naturalnews.com/030828_GMOs_Wikileaks.html)
That's the real story behind the economic devastation of Spain's vegetable farmers. It's one of the subplots being pursued alongside this e.coli superbug scheme.
Food as weapons of war - created by Big Pharma?
By the way, the most likely explanation for where this strain of E. coli was bioengineered is that the drug giants came up with it in their own labs. Who else has access to all the antibiotics and equipment needed to manage the targeted mutations of potentially thousands of E. coli colonies? The drug companies are uniquely positioned to both carry out this plot and profit from it. In other words, they have the means and the motive to engage in precisely such actions.
Aside from the drug companies, perhaps only the infectious disease regulators themselves have this kind of laboratory capacity. The CDC, for example, could probably pull this off if they really wanted to.
The proof that somebody bioengineered this e.coli strain is written right in the DNA of the bacteria. That's forensic evidence, and what it reveals cannot be denied. This strain underwent repeated and prolonged exposure to eight different classes of antibiotics, and then it somehow managed to appear in the food supply. How do you get to that if not through a well-planned scheme carried out by rogue scientists? There is no such thing as "spontaneous mutation" into a strain that is resistant to the top eight classes of brand-name antibiotic drugs being sold by Big Pharma today. Such mutations have to be deliberate.
Once again, if you disagree with this assessment, then what you're saying is that NO, it wasn't done deliberately... it happened accidentally! And again, I'm saying that's even scarier! Because that means the antibiotic contamination of our world is now at such an extreme level of overkill that a strain of e.coli in the wild can be saturated with eight different classes of antibiotics to the point where it naturally develops into its own deadly superbug. If that's what people believe, then that's almost a scarier theory than the bioengineering explanation!
A new era has begun: Bioweapons in your food
But in either case -- no matter what you believe -- the simple truth is that the world is now facing a new era of global superbug strains of bacteria that can't be treated with any known pharmaceutical. They can all, of course, be readily killed with colloidal silver, which is exactly why the FDA and world health regulators have viciously attacked colloidal silver companies all these years: They can't have the public getting its hands on natural antibiotics that really work, you see. That would defeat the whole purpose of making everybody sick in the first place.
In fact, these strains of e.coli superbugs can be quite readily treated with a combination of natural full-spectrum antibiotics from plants such as garlic, ginger, onions and medicinal herbs. On top of that, probiotics can help balance the flora of the digestive tract and "crowd out" the deadly e.coli that might happen by. A healthy immune system and well-functioning digestive tract can fight off an e.coli superbug infection, but that's yet another fact the medical community doesn't want you to know. They much prefer you to remain a helpless victim lying in the hospital, waiting to die, with no options available to you. That's "modern medicine" for ya. They cause the problems that they claim to treat, and then they won't even treat you with anything that works in the first place.
Nearly all the deaths now attributable to this e.coli outbreak are easily and readily avoidable. These are deaths of ignorance. But even more, they may also be deaths from a new era of food-based bioweapons unleashed by either a group of mad scientists or an agenda-driven institution that has declared war on the human population.
Additional developments on this e.coli outbreak
• 22 fatalities have so far been reported, with 2,153 people now sickened and possibly facing kidney failure.
• Germany's agriculture ministry said that even though the source of the outbreak is now known to be a German sprout farm, it is still not lifting its warnings for people to avoid eating tomatoes and lettuce. In other words, keep the people afraid!
• "The German variant of E coli, known as O104, is a hybrid of the strains that can cause bloody diarrhoea and kidney damage called 'hemolytic uremic syndrome'." (http://www.independent.ie/world-news/europe/german-beansprouts-to-blame-as-e-coli-death-toll-reaches-22-2667140.html)
• A total of ten European nations have reported outbreaks of this e.coli strain, mostly from people who had visited northern Germany.
• The following story is in German, and it hints that the e.coli outbreak might have been a terrorist attack (http://www.aerztezeitung.de/medizin/krankheiten/infektionskrankheiten/magen-darminfekte/article/657699/ehec-rki-behoerde-kritik.html). Yeah, a terrorist attack by the drug companies upon innocent people, as usual…
Author: Mike Adams; Source: NaturalNews.com
Automotive engineering has brought advanced computing concepts into today's auto industry. Cars now ship with 8-, 16-, and 32-bit processors, and state-of-the-art processor technology is expected to follow in the very near future.
The proven technology of dual-core processors delivers faster computing at a lower clock speed, and hence lower power consumption and less heat generation. Dual-core processors have already made their mark in information technology systems, and multi-core processors are expected to reach the market soon.
Auto manufacturers have used dual-core processors in the past, but applications are now taking a new turn. It is understood that some auto manufacturers are contemplating triple-core processors in vehicles and are working with partners to implement quad-core processor systems.
Many areas of automotive engineering require the performance of dual-core processors. The most important applications are fuel saving and emission reduction, along with safety-management diagnostics and the migration of hardware-based functions to software-based systems.
The transformation from single-core to dual-core units would also be simpler than a change from a 16-bit system to a 32-bit processor. The latter would have required wide-ranging software changes to make code suitable for a 32-bit system; the move from single-core to dual-core involves simpler changes, without any major rewrites.
Cars are now fitted with distributed systems, one of them being distributed control of the car's suspension, which provides a much smoother ride along with superior road holding. Today, the modern car is equipped with dozens of computers that balance the performance of its engine, control fuel consumption, and reduce emissions.
Some cars carry shark-fin-shaped antennas on the roof, which provide entertainment and connectivity in the form of satellite radio, GPS, and cellular networks.
Automotive engineering has also brought general-purpose computers into vehicles, though the on-board computers in most cars today have the disadvantage of not being re-programmable. It is often asked why cars really need general-purpose computer systems.
To put it simply, many existing functions in vehicles are being extended through on-board general-purpose computers, which have brought large benefits to many of the functions we usually find in a car.
Moreover, computers today make available features in a car that could not have been imagined in the past. During the 1980s there were strong feelings about having a PC at home, even though people did not know why they needed one. Today, nearly every home has a computer, and rightly so.
General-purpose computers in cars bring modern conveniences: a Wi-Fi-enabled audio system, downloads of mp3 music and e-books, or email that is read aloud to you as you drive. One day you may be able to share your radio system with other drivers, keep a full GPS log of your travels, and perhaps record public broadcasts of interest.
More applications are still to come, with the concept of general-purpose computing in the car taking shape every day.
As many students of history know, Galileo Galilei, the famed mathematician and astronomer known today as the “father of modern science”, was forced in the 17th century by the Catholic Church, under threat of torture, to recant his “heretical” view that the earth revolved around the sun and not vice versa. This scientifically valid idea contradicted long-held religious dogma and hence challenged the Church's very integrity.
In a letter from 1634, René Descartes, one of the world's most noted thinkers and philosophers, stated: “Doubtless you know that Galileo was recently censored by the Inquisitors of the Faith, and that his views about the movement of the earth were condemned as heretical. I must tell you that all the things I explained in my treatise, which included the doctrine of the movement of the earth, were so interdependent that it is enough to discover that one of them is false to know that all the arguments I was using are unsound. Though I thought they were based on very certain and evident proofs, I would not wish, for anything in the world, to maintain them against the authority of the church.... I desire to live in peace and to continue the life I have begun under the motto to 'live well you must live unseen'.”
If we step back and consider the challenges that faced this small progressive and scientific community in 17th-century Europe, and compare the fear and patterns of suppression coming from the established orthodoxy of that time to those of the modern day, we find only minor variation. Descartes' revelation and retreat from exposure, expressed in the motto 'to live well you must live unseen', is a disheartening disposition that speaks volumes and sadly carries on to this day across the world. Fear, intimidation and other time-tested variations of oppression continue to persist as the dominant institutions of our society work to protect their established orders regardless of social validity. Even more, the overall culture itself, which invariably tends to support the accepted beliefs put forward by those who define the “power” of a period, also tends to condemn those who pose a challenge, since such a challenge threatens the mass-accepted identity itself.
The result is that many simply are not willing to risk their lives, occupations and reputations to challenge the orthodoxy of the time.
In late May 2011 news reports were generated that detailed how the Federal Bureau of Investigation in the United States was actively targeting “Political Activists” under the pretense of “Terrorism”.
Just as people like John Lennon and Martin Luther King Jr. were watched and harassed by the FBI for their activism decades ago, it appears modern, so-called “Anti-Terrorism” resources are being used to target environmentalists, peace, animal and political activists.
Just like the accusations of “Communism” against people like MLK Jr. in the mid 20th century, this newer, more generalized device of the 21st century called “Terrorism” is no less a “heretical”, accusatory tool than what was employed by the Inquisition centuries ago to maintain the politico-religious social system.
So, we can sympathize with Descartes' notion, as to move against the Zeitgeist is to position yourself against the odds, regardless of how empirical, necessary or obvious the truth you wish to convey and act upon is.
Unfortunately, Descartes' position is unacceptable in the modern world. The risks that now exist within our current order are beginning to far outweigh the temporal personal risks generated by the act of activist objection itself.
These are no longer just issues of accurate data, “rights” and “freedoms”. Today our very stability as a civilization is in question and, if left unaddressed, the danger threatens us all, regardless of one's position in the modern feudal hierarchy.
So, we can sit in confusion and watch as global unemployment rises due to technological unemployment and the resulting regional instability that is sure to grow. We can stare blankly at the systematic debt collapse of the world economy, country by country, like dominos, as self-appointed global banking institutions that derive money out of nothing impose austerity measures against the poor and middle class of each country to help support the wealthy, furthering the income divide.
We can twiddle our thumbs as what we have called “democracy” turns inexplicably into global plutocracy and the world economy becomes measured by how much money the rich move around amongst themselves. We can distract ourselves with our little gadgets as the rain-forests – considered by many to be the “lungs” of this planet – are destroyed at faster and faster rates, reducing our ability to absorb the growing CO2 in the atmosphere. We can keep the TV on as the clean water and food shortages that currently affect over 1 billion people continue to grow to 2 billion... 3 billion. We can scan the tabloids at the grocery store news stands as the very basis of industrial civilization, the Hydrocarbon Economy, inches towards crisis scarcity with virtually no active initiative taken to change course.
We can continue to pretend that our “leaders” are anything but “mis-leaders”, set in motion by monetary commercial interests that follow the rules of the free-market with all legislation and offices going to the highest bidder, one way or another... and we can stand amused as a new global arms race gains speed as each country comes to terms with the very real reality that wars for resources are upon us in a way unlike any period in history.
This is what separates our world from the one Descartes hid from.
The fact is, the fear tactics of the Orthodoxy - in this context the FBI or any such “Intelligence Agency” - are no longer worthy of viable concern or even acknowledgment. At no time in history has any true social change come in a manner that was not opposed with hostility by the dominant orders of the time. If you choose fear, then fear exists and those little lists and tactics held by the intelligence and police agencies have merit. If you choose love, pride and self-respect, then no accusations, lists or threats can ever stop you. The trick now is in numbers: if we can gain critical mass and override the “divide and conquer” techniques used to keep the orthodoxy in place, the game is over.
The Zeitgeist Movement is a global sustainability activist group working to bring the world together for the common goal of species sustainability before it is too late. It is a social movement, not a political one, with over 1100 chapters across nearly all countries. Divisionary notions such as nations, governments, races, political parties, religions, creeds or class are non-operational distinctions in the view of The Movement. Rather, we recognize the world as one system and the human species as a singular unit, sharing a common habitat. Our overarching intent could be summarized as “the application of the scientific method for social concern.”
To learn more about our work, please visit http://www.thezeitgeistmovement.com/
Telematics, a mash-up of telecommunications and informatics, is the science of scanning the world with wireless devices to extract data, sending this data to a computer network, and using the information to do anything from tracking packages to monitoring the highway speed of grocery trucks. UPS relies heavily on telematics, as does GM with its OnStar navigation system. The federal government could do a better job of capitalizing on the science, according to Michael J. Ravnitzky. So he started thinking about one of the largest mobile networks on Earth: the post office.
Ravnitzky is a chief counsel at the Postal Regulatory Commission, the government agency that oversees the U.S. Postal Service. The post office is in bad shape. From 2006 to 2009, mail volume dropped by 17 percent and officials have threatened to cut Saturday service. But where others see an inefficient and increasingly outdated system, Ravnitzky sees opportunity.
With its 218,684 vehicles stopping at more than 150 million delivery points along some 232,000 routes every day, the postal-delivery fleet could be reconceived as a vast data-gathering network. “If you were designing a data collection system from scratch, it would look a lot like the postal service,” Ravnitzky says. As he reasoned in a New York Times op-ed last December, the postal network could be used to measure air pollution and ozone levels while aiding Homeland Security operations by scanning for biological or chemical agents. Or it might detect and report WiFi and cellular dead zones. Using telematics, the postal service could evolve into an entirely new kind of public utility. It could also provide a new source of revenue. Private companies or other government agencies could buy space for their sensors on mail trucks.
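To make the idea concrete, here is a hypothetical sketch of the kind of aggregation such a network might perform: GPS-tagged signal-strength readings from mail trucks are binned into a coarse geographic grid, and cells whose average signal is weak are flagged as cellular dead zones. All names and thresholds below are invented for illustration; no actual USPS system is described.

```python
from collections import defaultdict

def dead_zones(readings, cell_size=0.01, threshold_dbm=-110):
    """Flag grid cells with weak average cellular signal.

    readings: iterable of (lat, lon, signal_dbm) tuples, e.g. one reading
    per delivery stop. cell_size is the grid pitch in degrees. Returns the
    set of grid cells whose mean signal falls below threshold_dbm.
    """
    cells = defaultdict(list)
    for lat, lon, dbm in readings:
        # Quantize coordinates onto a coarse grid.
        key = (round(lat / cell_size), round(lon / cell_size))
        cells[key].append(dbm)
    return {key for key, vals in cells.items()
            if sum(vals) / len(vals) < threshold_dbm}
```

With one reading per delivery point, a fleet like this would accumulate dense daily coverage of a city without any extra driving.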
Although Ravnitzky’s idea is just that—an idea—there’s precedent: Two years ago, 32 Greyhound buses rigged with sensors set off across the country to gather atmospheric and environmental data for the National Weather Service; 2,000 more such buses will roll out soon.
There’s already real interest in Ravnitzky’s plan. Marc Chapman, a compliance director for Atmos Energy, the largest natural-gas distributor in the country, says he is looking into whether sensors could be attached to postal-service trucks to detect gas leaks. Telematics might just save Saturday delivery.
Boris Babenko believes there are huge opportunities for integrating computer science, and in particular computer vision, into health care and medical research, making life easier for researchers, physicians and ultimately patients.
Babenko’s strong conviction is leading to the development of powerful tools to aid in bioengineering research. In particular, Babenko, a UC San Diego computer science grad student, is working with a team of researchers to develop technology that will automate the arduous process of analyzing the vast amount of data necessary for tissue engineering. In their research, Babenko and his partner, bioengineering grad student Jessica DeQuach, seek to automate blood vessel counting in images, and to make the distinction between data collection and analysis more clear. They will present their work April 14, 2011 during Research Expo, whose theme is “Innovation for Life."
Tissue engineering is an interdisciplinary field that offers the promise of improving, repairing and/or replacing damaged tissue in the human body. Research in this area involves the development of various biomaterials and processes that facilitate the fabrication of such tissue. In their project, the engineering students are focusing on quantifying arteriole formation. An arteriole is one of the small terminal branches of an artery, especially one that connects with a capillary.
“Arteriole formation is critically important for biomaterial remodeling to help bring blood flow to the damaged area, which is why it is an analysis tool often used in tissue engineering,” DeQuach said.
Collecting this vast amount of data is currently done manually and requires an intensive amount of time and meticulous effort.
“In this project we aim to ease the burden of doing such analysis via modern computer vision techniques,” Babenko said. “While the state of the art in computer vision still requires expert oversight, the long-term goal of our work is to automate this process as much as possible.”
The students are working under computer science professor Serge Belongie and bioengineering professor Karen Christman. Their poster, titled “Towards automated quantification of arteriole formation via computer vision,” will be one of 250 research posters that engineering graduate students will present at Research Expo.
Babenko explained that analyzing images is a central issue in biology – either to study things that are too small for the naked eye, or to study something from the inside in a non-invasive manner. With current imaging and computer technologies researchers can gather huge amounts of data. Much of this analysis, however, is still done manually.
“I think that computer vision technology could make a big impact in this area by offering powerful tools to aid in bioengineering research,” Babenko said. “Counting blood vessels is a perfect example of the tedious task bioengineers have to perform. For example, in Prof. Christman’s lab a typical experiment requires a scientist to spend up to 80 hours to annotate images, and yet we believe it is within the reach of current computer vision technology.
“The short term goal is to significantly reduce the amount of time this analysis takes, while the long term goal is to open up new possibilities for experiments that simply could not have been done due to this annotation bottleneck,” he added. “Computer vision is a fast growing field, and this progress is ready to be applied in the real world.”
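The team's actual pipeline is not detailed here, but the core counting step can be illustrated with a toy connected-components pass over a binary segmentation mask (1 = pixel classified as vessel): each 4-connected blob is counted once. Real vessel quantification adds segmentation, filtering and expert oversight on top of a step like this.

```python
# Toy sketch of automated structure counting: count 4-connected components
# in a binary mask. This is only the counting stage; producing the mask
# (segmentation) is the hard part in practice.
def count_components(mask):
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for i in range(rows):
        for j in range(cols):
            if mask[i][j] and not seen[i][j]:
                count += 1          # found a new, unvisited blob
                stack = [(i, j)]
                while stack:        # flood-fill the whole blob
                    r, c = stack.pop()
                    if (0 <= r < rows and 0 <= c < cols
                            and mask[r][c] and not seen[r][c]):
                        seen[r][c] = True
                        stack.extend([(r + 1, c), (r - 1, c),
                                      (r, c + 1), (r, c - 1)])
    return count
```

On a real slide image the mask would come from staining-based thresholding or a learned classifier, which is where the computer-vision research effort lies.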
Provided by Jacobs School of Engineering for ZeitNews.org
Researchers at the University of California, Berkeley, are launching a powerful new tool for sorting through and mapping all of California’s fatal and serious traffic collisions.
Starting today (Wednesday, April 6), anyone with access to the Internet can register for a free account to access the Transportation Injury Mapping System, or TIMS, to perform customized searches of 130,000 serious and fatal crashes in the state. Users can view the history of crashes from 2000 to 2008, the most recent year data are available, by county, city, neighborhood or along specific routes. Additional years of collision data will be incorporated into TIMS as they become available.
“This tool is meant to provide professionals and the general public with data to identify traffic safety problems and potential solutions,” said John Bigham, lead researcher for the TIMS project and the Geographic Information Systems program manager at UC Berkeley’s Safe Transportation Research and Education Center (SafeTREC). The center is based at the School of Public Health and the Institute of Transportation Studies.
The UC Berkeley researchers had a wealth of data from which to work. The California Highway Patrol (CHP) collects data on all reported crashes – including collisions on local roads as well as on state highways – and records the information in the California Statewide Integrated Traffic Records System. This database formed the foundation for the records in the TIMS project.
“I don’t know of any other state that allows the general public to access crash data in this way,” said Bigham.
Funding for this program was provided by a grant from the California Office of Traffic Safety (OTS) through the National Highway Traffic Safety Administration (NHTSA), whose Fatality Analysis Reporting System database was also used for TIMS.
While state and local records of crash data are publicly available through the CHP, there had previously been no easy way for users to interactively sort through the information and map it by such factors as location, time and date of the collision; weather and road conditions; the influence of alcohol; and whether pedestrians or bicyclists were involved.
SafeTREC researchers set out to “geocode” the crash data by developing a process to add coordinates to each collision that involved a fatality or an injury requiring hospitalization. They then created a web-based data query system that allows users not only to conduct searches and download the results, but also to visualize the results using Google Maps and ArcGIS Server, mapping software from the Environmental Systems Research Institute. When users click on a location, pop-up boxes provide information about the collision, and the street view feature gives users a realistic picture of the collision site.
Other common data sources, such as census tract information, school locations and zip code boundaries, can be displayed on the maps. Users also have the ability to select collisions on the map and download the associated data files. Video tutorials and FAQs are available to help guide new users through the site.
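At its core, the query system described above filters geocoded collision records by attributes and returns the matching coordinates for mapping. The sketch below is purely illustrative — the field names and records are hypothetical, not TIMS's actual schema or API:

```python
# Hypothetical geocoded crash records; field names are illustrative,
# not drawn from the real TIMS database schema.
crashes = [
    {"year": 2007, "lat": 37.87, "lon": -122.27, "alcohol": True,  "pedestrian": False},
    {"year": 2008, "lat": 34.05, "lon": -118.24, "alcohol": False, "pedestrian": True},
    {"year": 2008, "lat": 37.77, "lon": -122.42, "alcohol": True,  "pedestrian": True},
]

def query(records, **filters):
    """Return the records matching every given field=value filter."""
    return [r for r in records
            if all(r.get(k) == v for k, v in filters.items())]

# e.g. all 2008 collisions involving a pedestrian
hits = query(crashes, year=2008, pedestrian=True)
print(len(hits))  # -> 2
```

The real system layers a map view on top of exactly this kind of filtered result set: each matching record's coordinates become a clickable marker.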
“Actually seeing the crashes on a map is a tool in its own right,” said Bigham. “If a picture is worth a thousand words, a map is worth a thousand rows in a spreadsheet.”
According to NHTSA, motor vehicle collisions are the leading cause of death for people ages 3 to 34 in the United States. In California, records show that in 2008, more than 3,400 people were killed and more than 240,000 were injured in motor vehicle crashes. Those statistics include collisions with pedestrians and bicyclists. The California Department of Transportation estimates that traffic collisions cost $25 billion in economic damages to the state in a single year.
TIMS, which has been used by members of state agencies for beta testing over the past few months, will be of particular use to Caltrans, CHP and other agencies to better analyze crash data and focus their efforts to reduce traffic fatalities and injuries, the researchers said.
“TIMS represents a major breakthrough in California’s already strong record in road safety,” said David Ragland, director of SafeTREC and adjunct professor of epidemiology at UC Berkeley’s School of Public Health. “In the past several years, due to the effort and resources expended by several state agencies, deaths and injuries from traffic crashes have shown significant declines. TIMS will help build on this success by making it easier for our public agencies -- and anyone with access to the Internet -- to actually see the locations and types of crashes occurring on our state’s roads, to identify clusters of crashes and to help determine causal factors that can be modified.”
The researchers eventually plan to improve upon the TIMS website with the addition of data on minor injury collisions and more options for queries and spatial analysis of collisions. They emphasized that to truly evaluate the safety of a site, factors such as the relative volume of vehicle, pedestrian and bicycle traffic must be considered.
“The tools on this website allow you to simplify the first steps in exploring crash data, but these other elements need to be considered to get the full picture,” said Bigham. “In the future, we hope to provide more focused applications that incorporate many of these elements to provide an even stronger tool for crash analysis.”
More information: www.tims.berkeley.edu/
Provided by University of California - Berkeley for ZeitNews.org
On the screen behind Boagiu, slides showed a succession of maps with yet-to-be-built motorways, ring roads and bypasses. There were ambitious new national roads, upgraded railway lines, train stations, ports and airports. The conference room's 21st-floor windows gave onto a breathtaking view of the capital. Block after block, mile after mile was clogged with traffic.
"My task," she told the investors, eyeing them from behind rimless eyeglasses, "is to recover delays in infrastructure."
The gaps Boagiu must fill are huge -- and common in scores of developing countries from eastern Europe to Africa to Asia. But Romania's story also exposes another issue, one which goes to the heart of the European project.
The country, which shook off communism in 1989 and joined the European Union in 2007, has a potential 4.6 billion euros (4 billion pounds) in EU funding for transport infrastructure, available until 2013. By the end of last year, Bucharest had managed to use just 47 million euros of that. If Boagiu can't find a way to speed up projects and use the funds, the country will lose them.
Like countries suddenly enriched by the discovery of oil, former communist states that have access to billions in European Union development funds can find them both blessing and curse. The funds -- some 160 billion euros between 2007 and 2013 across the former eastern bloc -- are meant to help new members catch up with the rest of the EU.
But what if a country like Romania simply can't absorb that cash? Should it concentrate on fixing its government services and institutions -- its software, as it were -- before it can move to fancy new hardware like motorways? Is it possible to graft developed-world standards onto states whose institutions are running years behind those of the donors?
"Public investment spending is not small," Romania's central bank Governor Mugur Isarescu told a news conference last October. "But there are 42,000 unfinished investment projects in Romania. This is not efficient. We are the country of unfinished projects."
Romania's massive infrastructure deficit dates back more than 20 years, to when the country was in the grip of Nicolae Ceausescu, one of Communism's most repressive dictators. In the 1980s, Ceausescu backed an export-led drive to clear Romania of billions of dollars in foreign debt, slashing investment to pay off creditors. That left infrastructure lagging behind even Romania's Balkan neighbours -- countries which historically had been much poorer.
According to a global competitiveness report by the World Economic Forum, Romania ranks just 134th out of 139 countries by the quality of its roads. The WEF says transport infrastructure is still one of the chief factors hampering investment. The country is the EU's ninth-largest member by land area, but has only 331 km (211 miles) of motorway, less than half that of neighbouring Hungary (925 km) and not even three percent of Germany's 12,813 km.
Go for a drive in Romania (population 22 million) and you can bump for hours over gravel country roads to reach villages -- some without electricity, indoor plumbing or running water -- whose schools have closed because the young have moved away for a better life. Dusty national roads lead past lush farmland which is failing to achieve its potential because machinery is outdated and land ownership fragmented. Cities are choked with traffic because there's no way to drive round them. The rail system is no better. Outdated trains travel at an average of 45 km per hour, while elsewhere in Europe top speeds can hit 320 km per hour.
When mobile phone maker Nokia announced it was moving a production plant to Romania from Germany in 2008, horses and carts still travelled the road to the new site. That same year Daimler chose Hungary over Romania or Poland as the site of a new 800 million-euro car factory with about 2,500 jobs. Hungary, which has higher labour costs and tax rates than Romania, credited the win to its dense network of motorways.
With a cheap and skilled labour force and attractive flat tax on income and profit, Romania has attracted investment by carmakers Renault and Ford. But even they have complained about the roads. "When it bought the plant, Ford wanted to build 1,000 cars a day and ... that would bring a lot of money and jobs to Romania," U.S. ambassador Mark Gitenstein told a Romanian television station in April. "But unless there is a motorway ... it will not make 1,000 cars a day or hire so many people. You need motorways."
None of Romania's existing motorways connect the country with its neighbours. It's a closed system. Even ambitious projects like the Transylvania Motorway have so far failed to live up to their promise.
U.S. construction group Bechtel broke official ground on Romania's biggest motorway at a site near the 15th-century Transylvanian village of Valisoara on a mild summer day in 2004. Then prime minister Adrian Nastase cut ceremonial ribbons and excavators bit into the ground to the soundtrack of Vivaldi's "Four Seasons."
On that day, the future seemed almost tangible: there would be a smooth, spacious four-lane motorway, 415 km long with more than 300 bridges, 70 overpasses and 19 interchanges, connecting the central Romanian region of Transylvania to Hungary. The road would bring jobs, tourists and foreign investors.
"A motorway is forever," Michael Mix, Bechtel's then project manager said in a 2007 company brief. "It is a legacy."
Seven years after the project began, little more than 10 percent of the road has been delivered. The state has paid Bechtel more than 1 billion euros of public money, and analysts say the project will end up costing at least double the initial estimate of 2.2 billion euros. The deadline has been pushed back a year to 2013, but the work could end up taking years longer.
Rather than being forever, "it feels as if this motorway may take forever," quipped Ana Otilia Nutu, an infrastructure expert at Romanian Academic Society, a think tank.
Infrastructure projects and overruns go hand in hand the world over. But a 2010 study by JASPERS, a European Union agency that helps eastern European states prepare projects eligible for EU cash, found cost overruns were more likely in Romania than in eight other central and eastern European states included in the study, largely due to weak public administration.
Even by Romanian standards, the Bechtel example is extreme. In the years since the groundbreaking, government inquiries have found the deal disadvantaged Bucharest from the start. The project was granted to Bechtel without a public tender, despite clear legislation demanding transparency. This angered international bodies including the European Union, which said it wouldn't support it, leaving the financing burden to the state.
At the time of the initial deal, Nastase said Romania could not afford to navigate a lengthy tender process if it wanted to catch up with affluent western European states. He lost power in late 2004, and a new centre-right coalition government put motorway works on hold while it renegotiated.
Those talks, which lasted for eight months, showed how the initial contract was bad for Romania. The deal committed the country to giving Bechtel an interest-free loan of 250 million euros, on top of monthly payments for works. It made it virtually impossible under Romanian law to pursue compensation if Bechtel failed to meet its obligations. It left Bechtel in charge of controlling costs, giving it a free hand to decide the route. It even contained translation errors unfavourable to Romania, the transport ministry said in 2005.
A revised contract cut 126 million euros off the overall price. It scrapped the interest-free loan, and the government took over road design -- which gave it more control over costs. At the same time, most of the terms were made public.
Researchers at Wake Forest University have developed a new type of polymer solar-thermal device that combines photovoltaics with a system that captures the Sun's infrared radiation to generate heating. By taking advantage of both heat and light, researchers say the device could deliver up to 40 percent savings on the cost of heating, as well as helping reduce power bills by producing electricity.
The hybrid cell is designed with an integrated array of clear tubes, five millimeters (approx 1/4 inch) in diameter. When the array lies flat, visible sunlight shines into the clear tubes, which are filled with an oil blended with a proprietary dye. The sunlight heats the oil, which then flows into a heat pump that transfers the warmth inside a home.
Electrical current is produced via a polymer photovoltaic sprayed onto the back of the tubes.
The result is a solar-thermal device with an impressive 30 percent conversion efficiency.
In comparison to flat solar cells, the tube design also has the advantage of being able to capture light at oblique angles, so it can accumulate power for a much longer stretch of the day and be more readily integrated into building materials -- it could be produced to resemble a roofing tile, for example.
The research team aims to produce a three-foot-square solar-thermal cell over the coming months, a key step in bringing the technology closer to market.
"It's a systems approach to making your home ultra-efficient because the device collects both solar energy and heat," said David Carroll, Ph.D., director of the Center for Nanotechnology and Molecular Materials at Wake Forest University. "Our solar-thermal device takes better advantage of the broad range of power delivered from the sun each day."
Economic problems may be fuelling a rise in depression in England, it has been suggested. Prescriptions for anti-depressant drugs such as Prozac rose by more than 40% over the past four years, data obtained by the BBC shows. GPs and charities said they were being contacted increasingly by people struggling with debt and job worries. They said financial woe could often act as a "trigger", but added other factors may also be playing a role in the rise.
The rise has happened at a time when the government has been increasing access to talking therapies, which should in theory curb the demand for anti-depressants. In the last year alone referrals for talking therapies rose four-fold to nearly 600,000, Department of Health figures showed.
Dr Clare Gerada, head of the Royal College of GPs, said some of the rise in prescribing was also likely to be due to increased awareness about the condition and doctors getting better at diagnosis.
But she added: "Of course, in times of economic problems we would expect mental health problems to worsen - and GPs are seeing more people coming in with debts racking up, or who have lost their job and are cancelling their holidays.
"They feel guilty that they can't provide for their family and these things can often act as a trigger for depression."
Mental health charity Sane also said it had seen more people contacting its e-mail and phone advice lines with money worries. Its chief executive, Marjorie Wallace, said: "It is impossible to say for sure that economic problems are leading to a rise in depression. But we are certainly hearing more from people who are worried where the next meal is coming from, job security and cuts in benefits - many who are getting in touch with us for the first time.
"It is a toxic combination, especially for those who already have darker thoughts and other problems."
Emer O'Neill, chief executive of the charity Depression Alliance UK, told BBC Breakfast: "There is certainly an increase in the number of people suffering from depression, and the economic downturn has had an impact on that. But I think what's happened is that a lot of the stigma around depression has lifted. It's OK to say you have depression now - and people in general are getting much better information about what it is, and they are coming forward and talking to GPs more about it."
Staying on drugs
The figures, obtained from NHS Prescription Services under the Freedom of Information Act, cover anti-depressant prescribing from 2006 to 2010, during which time the country had to cope with the banking crisis, recession and the start of the spending cuts.
They showed the number of prescriptions for selective serotonin re-uptake inhibitors, the most commonly prescribed group of anti-depressants, rose by 43% to nearly 23 million a year.
The data also showed increases in other types of anti-depressants, including drugs such as Duloxetine which tends to be used for more serious cases.
As well as increasing demand for help, the rise could also be related to patients staying on the drugs for longer.
Care services minister Paul Burstow said: "The last recession has left many people facing tough times. If people do experience mental health problems, the NHS is well placed to help.
"We're boosting funding for talking therapies by £400m over the next four years. This will ensure that modern, evidence-based therapies are available to all who need them, whether their depression or anxiety are caused by economic worries or anything else."