
The Economic Cost of Increased Temperatures

August 10, 2012

MIT Study: Warming Episodes Hurt Poor Countries and Limit Long-Term Growth

Even temporary rises in local temperatures significantly damage long-term economic growth in the world’s developing nations, according to a new study co-authored by an MIT economist.

Looking at weather data over the last half-century, the study finds that every 1-degree-Celsius rise in temperature in a poor country, over the course of a given year, reduces its economic growth by about 1.3 percentage points. However, the effect is confined to the world’s developing nations; wealthier countries do not appear to be affected by such temperature variations.
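
In regression terms, the authors’ approach amounts to a country-year panel of growth rates on temperature, with the temperature effect allowed to differ for poor countries. Below is a minimal sketch of that kind of specification; the file and column names are hypothetical, and this is not the authors’ actual code.

```python
# Minimal sketch of a country-year panel regression in the spirit of
# Dell, Jones, and Olken (2012). File and column names are hypothetical:
# growth in percentage points, temp_c in degrees Celsius, poor a 0/1 flag.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("temperature_growth_panel.csv")

# Country fixed effects absorb time-invariant differences (including the
# poor indicator itself); year fixed effects absorb global shocks.
model = smf.ols(
    "growth ~ temp_c + temp_c:poor + C(country) + C(year)", data=df
).fit(cov_type="cluster", cov_kwds={"groups": df["country"]})

b = model.params
# In the paper, the poor-country temperature effect is roughly -1.3
# percentage points of growth per degree Celsius.
print(b["temp_c"] + b["temp_c:poor"])
```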

“Higher temperatures lead to substantially lower economic growth in poor countries,” says Ben Olken, a professor of economics at MIT, who helped conduct the research. And while it’s relatively straightforward to see how droughts and hot weather might hurt agriculture, the study indicates that hot spells have much wider economic effects.

“What we’re suggesting is that it’s much broader than [agriculture],” Olken adds. “It affects investment, political stability and industrial output.”

Varied effects on economies

The paper, “Temperature Shocks and Economic Growth: Evidence from the Last Half Century,” was published this summer in the American Economic Journal: Macroeconomics. Along with Olken, the authors are Melissa Dell PhD ’12, of Harvard University, who was a PhD candidate in MIT’s Department of Economics when the paper was produced, and Ben Jones PhD ’03, an economist at Northwestern University.

The study first gained public attention as a working paper in 2008. It collects temperature and economic-output data for each country in the world, in every year from 1950 through 2003, and analyzes the relationship between them. “We couldn’t believe no one had done it before, but we weren’t really sure we’d find anything at all,” Olken says.

By looking at economic data by type of activity, not just aggregate output, the researchers concluded there are a variety of “channels” through which weather shocks hurt economic production — by slowing down workers, commerce, and perhaps even capital investment.

“If you think about people working in factories on a 105-degree day with no air conditioning, you can see how it makes a difference,” Olken says.

One consequence of this, borne out in the data, is that higher temperatures in a given year affect not only a country’s economic activity at the time, but also its growth prospects far into the future; by the numbers, growth lagged following hot years.

To see why, Olken suggests, first think of a dry year for vegetables in your backyard garden: The bad weather would hurt the plants, but if the weather is reasonable the following year, the backyard crop would return to its normal level. Now contrast that with problems that affect, say, industrial and technological development, and capital investment; temperature shocks limiting those activities can compound over time.

“If you think about economic growth, you build on where you were last year,” Olken explains. For longer-term industrial or technological projects, he adds, “If it’s that kind of activity that’s lost, then it affects the country’s long-run growth rate, [and it’s] not a one-off hit.”
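
The distinction matters quantitatively. Here is a small illustrative calculation (the numbers are invented for illustration, not taken from the paper) contrasting a one-off level effect with a hit to the growth rate itself:

```python
# Illustrative numbers only (not from the paper): contrast a level effect,
# where output rejoins its old trend after one bad year, with a growth
# effect, where the lost growth compounds into a permanently lower path.
YEARS, TREND = 10, 0.03

trend = [100 * (1 + TREND) ** t for t in range(YEARS + 1)]

# Level effect: year-1 output dips 1.3% below trend, then catches back up.
level = [trend[t] * (0.987 if t == 1 else 1.0) for t in range(YEARS + 1)]

# Growth effect: growth itself is 1.3 points lower in year 1, and every
# later year compounds from that lower base.
growth = [100.0]
for t in range(1, YEARS + 1):
    growth.append(growth[-1] * (1 + TREND - (0.013 if t == 1 else 0.0)))

print(f"year {YEARS} shortfall vs trend: "
      f"level effect {trend[-1] - level[-1]:.2f}, "
      f"growth effect {trend[-1] - growth[-1]:.2f}")
```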

Political change in the weather

Olken, Dell and Jones also integrated data about forms of government into the study, and found that temperature shocks are associated with an increase in political instability. A 1-degree-Celsius rise in a given year, they found, raises the probability of “irregular leader transitions,” such as coups, by 3.1 percentage points in poor countries. In turn, the authors write, “poor economic performance and political instability are likely mutually reinforcing.”

Olivier Deschenes, an economist at the University of California at Santa Barbara, calls the study “an important finding because most of the prior research on the economic impacts of climate change have focused on a few sectors of the economy, predominantly the agricultural sector.” By contrast, he notes, the broader finding of the current paper matters “because the growth rate is a key measure of the economic success of a nation and the standard of living of its population.”

Deschenes, who also conducts research on the economic and health effects of temperature changes, suggests that the “next step” for scholars “is to identify adaptation strategies that can moderate the negative impacts of global climate change in the coming decades.”

As Olken observes, the study does not try to account for all the possible problems that could be generated by long-term climate change, such as rising oceans, floods or increased storms. Still, he adds, the paper does suggest some general points about the economic impact of a warming atmosphere. It is vital, he says, to “think about the heterogeneity of the impact between the poor and rich countries” when leaders and policymakers map out the problems the world may confront in the future.

“The impacts of these things are going to be worse for the countries that have the least ability to adapt to it,” he adds. “[We] want to think that through for the implications for future inequality. It’s a double whammy.”

[Source: Peter Dizikes, MIT News Office]

Scientists Urge New Approaches to Plant Research

July 19, 2012

You’d be amazed at how much you can learn from a plant.

In a paper published this week in the journal Science, a Michigan State University professor and a colleague argue that if humans are to survive as a species, we must turn more to plants for any number of valuable lessons.

In a recently published paper in the journal Science, MSU professor Robert Last discusses the importance of plant research. Photo by G.L. Kohuth.

“Metabolism of plants provides humans with fiber, fuel, food and therapeutics,” said Robert Last, an MSU professor of biochemistry and molecular biology. “As the human population grows and nonrenewable energy sources diminish, we need to rely increasingly on plants and to increase the sustainability of agriculture.”

However, Last and co-author Ron Milo of the Weizmann Institute of Science point out that despite decades of plant genetic engineering, there are relatively few types of commercial products originating from this body of work.

“This is in part because we do not understand enough about the vastly complex set of metabolic reactions that plants employ,” Last said. “It’s like designing and building a bridge armed only with satellite images of existing bridges.”

The authors say that perhaps the best approach is to bring together a variety of disciplines – not just plant scientists – to study how plants operate.

They also suggest looking hard at what brought plants to the place they are today – evolution.

“We think that understanding design principles of plant metabolism will be aided by considering how hundreds of millions of years of evolution has led to well-conserved examples of metabolic pathways,” Last said.

One of the amazing aspects of plant metabolism is this: It must continuously strike a balance between evolving to meet an ever-changing environment and maintaining the internal stability needed to carry on life as it knows it.

In addition, the authors point out that plants experiment with specialized (also called secondary) metabolism, which can produce novel chemicals used to defend against pathogens and herbivores.

“Humans benefit from this ‘arms race’ because some of these compounds have important therapeutic properties,” Last said. “Unfortunately, design principles are not so well studied in these rapidly evolving metabolic processes. Using new approaches, including considering optimality principles, will lead to advances in medicinal chemistry as well as creating more and healthier food.”

Last is Barnett Rosenberg chair of Biochemistry and Molecular Biology and Plant Biology. Co-author Milo is a professor of plant sciences at Israel’s Weizmann Institute of Science.

Last’s research also is supported by MSU AgBioResearch.

[Source: MSU news release]

New Biofuel Process Dramatically Improves Energy Recovery

July 19, 2012

A new biofuel production process created by Michigan State University researchers recovers more than 20 times as much energy as existing methods.

The results, published in the current issue of Environmental Science and Technology, showcase a novel way to use microbes to produce biofuel and hydrogen, all while consuming agricultural wastes.

Gemma Reguera, an MSU microbiologist, has developed bioelectrochemical systems known as microbial electrolysis cells, or MECs, which use bacteria to break down and ferment agricultural waste into ethanol. Reguera’s platform is unique because it employs a second bacterium which, when added to the mix, removes all the waste fermentation byproducts, or nonethanol materials, while generating electricity.

Similar microbial fuel cells have been investigated before. However, maximum energy recoveries from corn stover, a common feedstock for biofuels, hover around 3.5 percent. Reguera’s platform, despite the energy invested in chemical pretreatment of the corn stover, averaged 35 to 40 percent energy recovery just from the fermentation process, said Reguera, an AgBioResearch scientist who co-authored the paper with Allison Spears, MSU graduate student.

“This is because the fermentative bacterium was carefully selected to degrade and ferment agricultural wastes into ethanol efficiently and to produce byproducts that could be metabolized by the electricity-producing bacterium,” Reguera said. “By removing the waste products of fermentation, the growth and metabolism of the fermentative bacterium also was stimulated. Basically, each step we take is custom-designed to be optimal.”

The second bacterium, Geobacter sulfurreducens, generates electricity. The electricity, however, isn’t harvested as an output. It is used to generate hydrogen in the MEC to increase the energy recovery process even more, Reguera said.

“When the MEC generates hydrogen, it actually doubles the energy recoveries,” she said. “We increased energy recovery to 73 percent. So the potential is definitely there to make this platform attractive for processing agricultural wastes.”
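
The headline figure of "more than 20 times" follows from the recovery rates quoted above; here is a back-of-the-envelope check (using the article’s percentages, not a calculation from the paper itself):

```python
# Back-of-the-envelope check of the recovery figures quoted above.
prior_fuel_cells = 0.035   # ~3.5% energy recovery from corn stover
fermentation_only = 0.375  # 35-40% from the fermentation step (midpoint)
with_mec_hydrogen = 0.73   # 73% once the MEC's hydrogen is included

print(with_mec_hydrogen / prior_fuel_cells)  # ~20.9, i.e. "more than 20 times"
```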

Reguera’s fuel cells use corn stover treated by the ammonia fiber expansion process, an advanced pretreatment technology pioneered at MSU. AFEX is an already proven method that was developed by Bruce Dale, MSU professor of chemical engineering and materials science.

Dale is currently working to make AFEX viable on a commercial scale.

In a similar vein, Reguera is continuing to optimize her MECs so they, too, can be scaled up commercially. Her goal is to develop decentralized systems that can help process agricultural wastes. Decentralized systems could be customized at small to medium scales (compost bins and small silages, for example) to provide an attractive way to recycle the wastes while generating fuel for farms.

[Source: MSU news release]

All Models Are Wrong…

July 18, 2012

By Matt Artz

The world around us is a complex place, and one way we manage that complexity is through a process of abstraction. In its purest sense, abstraction is a reduction of detail down to the bare essentials we still need in order to understand.

Maps are a fascinating example of abstraction.  Maps are abstractions of landscapes and geography, and have proven to be a particularly useful aspect of human technology throughout our history.  Until relatively recently, maps were predominantly two-dimensional: paper maps with complex geography abstracted onto a flat surface.  New methods of presentation were created in an attempt to relay geographic information beyond two dimensions, but these methods, while useful, often fell short of conveying the complex and dynamic nature of geography.

Enter computers.  The move from paper-based abstractions towards computer-based abstractions of geographic space has given us a powerful new context for understanding—and not just for two-dimensional landscapes, but for geography spanning the third and fourth dimensions as well.

It’s that fourth dimension—trying to understand what happened in the past or what might happen in the future—where things can get really complicated.

Geospatial professionals are often faced with tasks that involve modeling—the attempt to simulate what might (or did) happen in a particular system through a process of abstraction and simplification.  But the process of reducing a complex system down to its essence without losing important details is fraught with uncertainty and peril.

It’s important to grasp that more than just individual models are at play; complex systems such as earth’s climate require multiple models of component systems.  We’ve reached a high level of sophistication with many individual models, and progress is being made at integration between models.  But we need a more comprehensive and open method of consolidating and relating inputs and outputs from all models.

While much progress has been made in recent years to develop models that help us better understand our world, there is still much more to be done at the macro scale—especially in the area of integration.  As we gain more detailed understanding of different granular systems and their components, the challenge in addressing complex issues such as global climate change is coupling these models together to gain a more complete picture.  Powerful hardware, sophisticated software, and increased human knowledge have all contributed to better models and more accurate simulations, but a geographic information system (GIS)-based framework for integrating these disparate representations of past, present, and future states is key to understanding the whole earth.

GIS itself is an incredibly valuable tool for spatial analysis and modeling, but there are many standalone models designed for highly specialized, domain-specific modeling, analysis, and problem solving.  Most domain-specific models are not yet, and probably never will be, fully implemented in a GIS framework; however, the spatial display, analysis, and data management capabilities of GIS can still be used to greatly streamline almost any modeling workflow.

GIS enables a comprehensive modeling framework where the software is used for workflow management and post-modeling support for multiple domain-specific models; in addition, outputs from multiple models can be compared, analyzed, and modeled within the GIS itself.  Such a GIS-based framework offers a comprehensive environment for modeling across complex earth systems.
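
As a concrete illustration of that pattern, consider combining gridded outputs from several domain models once a GIS has resampled them onto a common grid.  The sketch below uses plain NumPy arrays as stand-ins; the layers, weights, and combination rule are all hypothetical.

```python
# Hypothetical layers, weights, and combination rule; NumPy arrays stand in
# for model outputs a GIS has already reprojected onto a shared global grid.
import numpy as np

rng = np.random.default_rng(0)
rows, cols = 180, 360  # a one-degree global grid

# Stand-ins for outputs from three separate domain-specific models.
temperature_anomaly = rng.normal(1.0, 0.5, (rows, cols))   # degrees C
precipitation_change = rng.normal(0.0, 0.2, (rows, cols))  # fraction
land_use_pressure = rng.uniform(0.0, 1.0, (rows, cols))    # index

def rescale(grid):
    """Map a grid onto [0, 1] so layers with different units can be mixed."""
    return (grid - grid.min()) / (grid.max() - grid.min())

# Weighted overlay: a classic GIS pattern for relating model outputs.
composite = (0.5 * rescale(temperature_anomaly)
             + 0.3 * rescale(precipitation_change)
             + 0.2 * land_use_pressure)

print(composite.shape, float(composite.min()), float(composite.max()))
```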

Creating a framework that successfully brings together and manages a plethora of data sources and modeling systems to tackle the most pressing environmental issues of our time is surely a monumental challenge, but it is a challenge for which GIS is well suited.  Once the data and technology framework is in place and a clear workflow is established, the challenge then becomes organizing a large group of people to do the work of modeling multiple complex scenarios in order to identify the best of possible design futures for the planet.

“Essentially, all models are wrong, but some are useful.”
George E. P. Box, Statistician

Prediction is a tricky business.  It’s a mixture of science and art.

“All models are wrong…”

From the perspective of science, we’re looking for absolutes, and the world of absolutes is where modeling often falls short: models rarely predict things with 100% accuracy.  If you evaluate the success of a model in such absolute terms, you will almost certainly be unhappy with the results.

“…but some are useful.”

This is where the “art” comes into play.  Art is about creation.  And this is where modeling has the greatest promise: by helping us to understand what may happen in the future, we can make better decisions today; we can plan for, design, and actually create a better future.

Rethinking Hospitals From the Ground Up

July 18, 2012

You won’t find Sanford Smith’s name on the roster of doctors at Hoag Hospital Presbyterian in Irvine. He doesn’t dispense medication at the pharmacy, and he isn’t trained to check your blood pressure. But make no mistake: He plays a significant role in improving the lives of everyone who comes through the hospital’s doors.

Smith, an alumnus and an architect by training, is senior vice president of real estate and facilities at Hoag, and it is his job to ensure that patients and caregivers alike who traverse its halls see the hospital as an ally rather than an obstacle. He recalls what the hospital’s CEO said during an orientation for new employees:

“Many of the people you will encounter in the hallways of this institution today are having the worst day of their life. They’re coming to this place and they’re under tremendous stress and emotional discord, and our job is to try to make them feel comfortable and at ease.”

That observation applies not only to the quality of care but to the quality of the place where the care is offered. For Smith, that means assessing the hospital’s design. “Are we confusing them and adding to their anxiety or are we doing things to help them feel more comfortable? It’s an important question,” he says.

It’s a multitrillion-dollar question.

Health care, the largest segment of the U.S. economy, generates more than $2 trillion in activity annually. Significant developments, including passage of the federal Patient Protection and Affordable Care Act, the graying of the Baby Boom generation and the need to meet new seismic standards in California by 2020, will dramatically alter the way patients, doctors, administrators and a host of others involved in the medical field approach sickness and wellness.

Bob Kain, an alumnus and principal at HMC Architects in Ontario, Calif., says architects are like detectives. They discover bits and pieces of evidence, make sense of it, suggest scenarios and find answers. Meeting the demands of the health care revolution will be their biggest case in decades, and he and Smith want to help ensure that those entering the profession are up to the challenge.

Their answer is the Healthcare Architecture Initiative, which sprang from a casual discussion over lunch last year after an event honoring Smith as the College of Environmental Design’s distinguished alumnus. Smith suggested to Kain and Professor Judith Sheine, chair of the architecture department, that the college consider entering a design competition sponsored by Kaiser Permanente.

“It turns out that the competition had already closed,” Smith recalls, “but it gave light to a sense that it would still be a great thing to encourage a new generation of thinkers to start addressing the complex problem of health care and health care delivery.”

“Version 1.0” of the fledgling initiative, as Smith puts it, involves fundraising, forming an advisory board from the region’s health care professionals and architects, and assessing the curriculum. The goal is to create a graduate specialty that would be the first of its kind on the West Coast.

Sheine says the pieces are in place to move forward.

“To get any program going, you have to make sure there’s faculty support and faculty participation or, in general, it doesn’t go anywhere,” she says. “To get traction outside the school, there has to be interest and need in the profession. This [initiative] is a combination of both.”

Kain and Smith both attest to the College of Environmental Design’s importance in their career success, which explains why they jump-started the fundraising effort with $10,000 donations — one from the HMC Designing Futures Foundation and one from Smith himself.

For years, regulations and society’s perception of health care impeded architectural innovation. Hospitals were staid objects of function but not necessarily form. “Patient-friendly” was not part of the lexicon. However, Kain says a new mind-set has evolved, one that views a medical facility as a vehicle for healing.

“It’s not so much bricks and mortar — it’s that people entering that facility need to be made whole,” he says. “It’s the use of natural light. It’s smell. It’s colors. It’s something that will calm people. … It’s that warmer environment, softer colors, less noise — not so much a home environment, but a soothing environment.”

It’s also about what Kain calls way-finding. How easy is it to get from the freeway to the medical facility? How easy is it to park? Is the signage clear? Can patients get where they need to go easily or must they navigate a labyrinth of hallways? Every detail matters because the public’s perceptions of health care are different today.

“People are going to act more and more like consumers. … They’re making a holistic decision about an experience,” Kain says. “Health care is going to become more community based. While I’m waiting for the results of some test, can I go to a fitness center and do some exercises? Is there a coffee shop nearby with Wi-Fi? What is my health care experience going to be?”

Health care professionals and architects at Hoag got a glimpse of the future when a group of ENV students, under the tutelage of Professor Hofu Wu, shared their ideas in a series of presentations at the hospital.

The goal of the 10-week studio project was to design a medical office building that would not only complement the hospital but also incorporate the adjacent surroundings, including a nature trail and a river walk.

“They absorbed so much knowledge in such a short period and put their innovative ideas into the projects,” says Wu, who has taught sustainable design for the past 20 years and health care design for the past seven. “They addressed some very complex problems.”

The two- and three-student teams incorporated natural sunlight, provided easy access to gardens, made the facility intuitively navigable and even reduced the number of steps nurses would have to take in the course of a day. Many of the proposals emphasized energy efficiency and sustainability.

“Good design can be complementary to good health,” Wu says.

Kain, Smith and Sheine agree. And the way to create positive experiences — experiences that make physical and fiscal sense — starts with a solid foundation.

“The key is teaching,” Kain says. “The hospital of the future will continue to evolve, and the well-trained architect will be able to adapt to change because of a clear understanding of the goals of all parties in the health care process. That’s what the Healthcare Architecture Initiative will try to achieve.”

[Source: Cal Poly Pomona news release]

With Maps, Researcher Aims to Bridge Gap between Scientists, Indigenous Experts

July 11, 2012

As humankind wrestles with the growing repercussions of a changing climate, the transfer of knowledge between scientists and local environmental experts becomes ever more crucial to human adaptation.

Yet, due to historical and cultural factors, dialogue about environmental change between two crucial groups — scientists from the developed world and experts from indigenous populations — remains largely ineffective.

“There are indigenous ways of knowing and strategizing about environmental change,” said Margaret Pearce, assistant professor of geography at the University of Kansas. “Those are different from nonindigenous people. They’re different because they’re based on disparate worldviews. It can be as basic as the separation between science and religion, or perceptions of time and ways to measure distance. These kinds of differences then influence the failure of dialogue between indigenous and nonindigenous experts regarding environmental change. It’s a very entrenched historical and cultural lack of communication.”

Now, using cartography as a tool, Pearce is set to help blend local, indigenous knowledge and outside, scientific understanding of environmental adaptation into a visual entity clear to both groups. Her work is based on recent research conducted in the North Pare Mountains of Tanzania as part of a collaborative study titled “Linking Local Knowledge” funded by the National Science Foundation.

By interpreting the conclusions of previous NSF researchers and communicating with people in the Tanzanian villages of Kirya, Lambo, and Mangio — communities that have suffered a crisis of livelihood loss as the result of environmental change — Pearce will use her training in cartography to translate these findings into maps that visually connect the adaptive strategies of the culturally disparate groups.

“There’s a need for a synthesizing dialogue between local and outside experts,” she said. “That’s been hard in words — whether the words are spoken or written — for a variety of historical reasons. I see cartography as a very powerful tool because it can visually represent spaces of difference and spaces of agreement. You can point and say, ‘Here are the places where we agree and see things similarly, and here are the places where we diverge.’”

Through better cross-cultural communication, Pearce aims to help solve the problem of livelihood loss in the region. Further, she hopes to create a model that will encourage both indigenous and nonindigenous people around the world to share strategies for effective adaptation in the face of severe environmental change.

“My expertise is to listen to what people are telling me about geography in a certain place and represent those geographies graphically in the map,” Pearce said. “What’s wonderful about cartographic language is that it can show specific information like, ‘This is how far we used to have to walk to water, and this is how far we walk now.’ But maps can also show differences in how we think in general, like the way we think about distance. A Western scientist might use kilometers to map how far away something is, whereas local people might map the same distance in terms of the time on a watch, because for them the distance is inextricable from time. Those differences can be revealed clearly on the map.”

Pearce will devote a full year to this project as an American Council of Learned Societies Fellow and as the Anne Ray Fellow at the School for Advanced Research in Santa Fe, N.M., where she will be based. Pearce’s collaborators in the NSF project include researchers at Ohio University, Michigan State University, University of Florida and Sokoine University.

[Source: University of Kansas news release]

New Chip Captures Power from Light, Heat, and Vibration

July 11, 2012

Researchers at MIT have taken a significant step toward battery-free monitoring systems — which could ultimately be used in biomedical devices, environmental sensors in remote locations and gauges in hard-to-reach spots, among other applications.

Previous work from the lab of MIT professor Anantha Chandrakasan has focused on the development of computer and wireless-communication chips that can operate at extremely low power levels, and on a variety of devices that can harness power from natural light, heat and vibrations in the environment. The latest development, carried out with doctoral student Saurav Bandyopadhyay, is a chip that could harness all three of these ambient power sources at once, optimizing power delivery.

Graphic: Christine Daniloff

The energy-combining circuit is described in a paper being published this summer in the IEEE Journal of Solid-State Circuits.

“Energy harvesting is becoming a reality,” says Chandrakasan, the Keithley Professor of Electrical Engineering and head of MIT’s Department of Electrical Engineering and Computer Science. Low-power chips that can collect data and relay it to a central facility are under development, as are systems to harness power from environmental sources. But the new design achieves efficient use of multiple power sources in a single device, a big advantage since many of these sources are intermittent and unpredictable.

“The key here is the circuit that efficiently combines many sources of energy into one,” Chandrakasan says. The individual devices needed to harness these tiny sources of energy — such as the difference between body temperature and outside air, or the motions and vibrations of anything from a person walking to a bridge vibrating as traffic passes over it — have already been developed, many of them in Chandrakasan’s lab.

Combining the power from these variable sources requires a sophisticated control system, Bandyopadhyay explains: Typically each energy source requires its own control circuit to meet its specific requirements. For example, circuits to harvest thermal differences typically produce only 0.02 to 0.15 volts, while low-power photovoltaic cells can generate 0.2 to 0.7 volts and vibration-harvesting systems can produce up to 5 volts. Coordinating these disparate sources of energy in real time to produce a constant output is a tricky process.

So far, most efforts to harness multiple energy sources have simply switched among them, taking advantage of whichever one is generating the most energy at a given moment, Bandyopadhyay says, but that can waste the energy being delivered by the other sources. “Instead of that, we extract power from all the sources,” he says. The approach combines energy from multiple sources by switching rapidly between them.
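
A toy simulation makes that advantage concrete. The sketch below uses invented power levels and availability probabilities, and is not a model of the MIT circuit; it simply compares harvesting only the strongest source at each instant against time-sharing a single extraction path across every source that is producing.

```python
# Toy comparison (invented power levels and availabilities; not the MIT
# circuit): harvesting only the strongest source at each instant versus
# time-sharing one extraction path across all currently active sources.
import random

random.seed(42)
STEPS = 10_000  # fine-grained time slices

def ambient_power():
    """Instantaneous harvestable power (microwatts) from three intermittent
    sources; each may be dead at any given moment."""
    thermal = random.uniform(0, 5) if random.random() < 0.6 else 0.0
    solar = random.uniform(0, 20) if random.random() < 0.4 else 0.0
    vibration = random.uniform(0, 10) if random.random() < 0.5 else 0.0
    return (thermal, solar, vibration)

best_only = time_shared = 0.0
for _ in range(STEPS):
    p = ambient_power()
    best_only += max(p)           # lock onto the single strongest source
    time_shared += 0.9 * sum(p)   # share the path; assume 90% capture per slice

print(f"time-shared vs best-only energy: {time_shared / best_only:.2f}x")
```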

Another challenge for the researchers was to minimize the power consumed by the control circuit itself, to leave as much as possible for the actual devices it’s powering — such as sensors to monitor heartbeat, blood sugar, or the stresses on a bridge or a pipeline. The control circuits optimize the amount of energy extracted from each source.

The system uses an innovative dual-path architecture. Typically, power sources would be used to charge up a storage device, such as a battery or a supercapacitor, which would then power an actual sensor or other circuit. But in this control system, the sensor can either be powered from a storage device or directly from the source, bypassing the storage system altogether. “That makes it more efficient,” Bandyopadhyay says. The chip uses a single time-shared inductor, a crucial component to support the multiple converters needed in this design, rather than separate ones for each source.

PhD student Saurav Bandyopadhyay, left, and Professor Anantha Chandrakasan showing the designed system. Photo: Justin Knight/MIT Energy Initiative

David Freeman, chief technologist for power-supply solutions at Texas Instruments, who was not involved in this work, says, “The work being done at MIT is very important to enabling energy harvesting in various environments. The ability to extract energy from multiple different sources helps maximize the power for more functionality from systems like wireless sensor nodes.”

Only recently, Freeman says, have companies such as Texas Instruments developed very low-power microcontrollers and wireless transceivers that could be powered by such sources. “With innovations like these that combine multiple sources of energy, these systems can now start to increase functionality,” he says. “The benefits from operating from multiple sources not only include maximizing peak energy, but also help when only one source of energy may be available.”

The work has been funded by the Interconnect Focus Center, a combined program of the Defense Advanced Research Projects Agency and companies in the defense and semiconductor industries.

[Source: MIT news release]