Commentary

Natural Solutions to Climate Change

OXFORD – When it comes to climate change, land is key. Today, agriculture, forestry, and other land uses account for roughly a quarter of global greenhouse-gas emissions. But adopting sustainable land-management strategies could provide more than one-third of the near-term emission reductions needed to keep warming well below the target – 2°C above pre-industrial levels – set by the Paris climate agreement.

Conservation organizations like mine have long worked to balance the needs of people and nature. But only recently have we fully grasped just how important land-use management is in addressing climate change. With the development of remote sensing, artificial intelligence, and biogeochemical modeling, we can better forecast outcomes and develop strategies to manage and minimize adverse consequences.

Some of the most promising ways to mitigate climate change are what we call “natural climate solutions”: the conservation, restoration, and improved management of land, in order to increase carbon storage or avoid greenhouse-gas emissions in landscapes worldwide. The full potential of these solutions is detailed in a new study produced by my organization, the Nature Conservancy, and 15 other leading institutions.

Among the most important natural climate solutions is protecting “frontier forests” – pristine woodlands that serve as natural carbon sinks. Intact tropical and northern forests, as well as savannas and coastal ecosystems, store huge amounts of carbon accumulated over centuries. When these areas are disturbed, carbon is released. Preservation of frontier habitats also helps regulate water flows, reduces the risk of flooding, and maintains biodiversity.

Reforestation is another important natural solution. Globally, an estimated two billion hectares (4.9 billion acres) of land have been deforested or degraded. Because trees are the best carbon-capture-and-storage technology the world has, reversing this degradation would bring a significant reduction in global carbon levels. We estimate that the world could capture three gigatons of CO2 annually – equivalent to taking more than 600 million cars off the roads – simply by planting more trees.

A third category of natural solution is agricultural reform. From field to fork, the food sector is a major contributor to climate change through direct and indirect emissions, and by its often-negative effects on soil health and deforestation. Recognizing these risks, 23 global companies – including Nestlé, McDonald’s, Tesco, and Unilever – recently signed a commitment to halt deforestation in Brazil’s Cerrado savanna. The region, which covers a quarter of the country, has come under growing pressure from production of beef, soy, and other commodities, together with the associated infrastructure.

As the Cerrado pledge demonstrates, when governments and businesses come together to address land-use challenges, the impact is potent. Natural climate solutions have the potential to reduce CO2 emissions by an estimated 11.3 billion tons a year – equal to a complete halt in burning oil, according to our study. One recent study calculated that if Brazil reached zero deforestation by 2030, it would add 0.6% of GDP, or about $15 billion, to its economy. Communities also reap secondary benefits – such as rural regeneration, improved food and water security, and coastal resilience – when natural climate solutions are implemented.

Yet, despite the data supporting better land-use decision-making, something isn’t adding up. In 2016, the world witnessed a dramatic 51% increase in forest loss, equivalent to an area about the size of New Zealand. We need to buck this trend now, and help the world realize that land-use planning is not simply a conservation story.

Some countries are moving in the right direction. The Indian government, for example, has set aside $6 billion for states to invest in forest restoration. In Indonesia, the government created a dedicated agency to protect and restore peatlands – bogs and swamp-like ecosystems with immense CO2 storage capabilities.

But they are the exceptions. Of the 160 countries that committed to implementing the Paris climate agreement, only 36 have specified land-use management in their emissions-reduction strategies.

Overcoming inertia will not be easy. Forests, farms, and coasts vary in size, type, and accessibility. Moreover, the lives of hundreds of millions of people are tied to these ecosystems, and projects that restore forest cover or improve soil health require focused planning, a massive undertaking for many governments.

One way to get things moving, especially in the agricultural sector, would be to remove or redirect subsidies that encourage excessive consumption of fertilizers, water, or energy in food production. As Indian government officials reminded their peers during a World Trade Organization meeting earlier this year, meaningful agricultural reforms can begin only when rich countries reduce the “disproportionately large” subsidies they give their own farmers.

Supporting innovation and entrepreneurship can also help power change. New processes and technologies in landscape planning, soil analysis, irrigation, and even alternative proteins such as plant-based meat are making agriculture and land use more sustainable. Similarly, changes in the construction industry, which is turning to more efficiently produced products like cross-laminated timber (CLT), can help reduce carbon pollution.

Finally, financing options for natural climate solutions must be dramatically increased. While payments to conserve forests are starting to flow under the UN’s REDD+ program, and the Green Climate Fund has committed $500 million for forest-protection payments, total public investment in sustainable land use remains inadequate. According to the Climate Policy Initiative, mitigation efforts in agriculture, forestry, and land use attracted just $3 billion in public financing in 2014, compared with $49 billion for renewable energy generation and $26 billion for energy efficiency.

At the UN climate change meeting that just concluded in Bonn, Germany, global leaders reaffirmed that the world cannot respond adequately to rising temperatures if governments continue ignoring how forests, farms, and coasts are managed. Now that there is a firm consensus, governments must act on it.

Justin Adams is Global Managing Director for Lands at the Nature Conservancy.

By Justin Adams

The Eternal Return of the Plague

NORMAN, OKLAHOMA – “Fearsome Plague Epidemic Strikes Madagascar.” That recent New York Times headline might sound like the synopsis of a horror movie. But the epidemic gripping Madagascar is not just any plague, and it certainly isn’t some Hollywood apocalypse. It’s the plague, caused by the bacterium Yersinia pestis, the agent of the notorious bubonic plague.

For most people, “the plague” conjures up images of the medieval Black Death, and perhaps a vaguely reassuring sense that, in the developed world, such ancient dangers are long past. But in recent years, thanks to the work of geneticists, archaeologists, and historians, we now know that human civilization and the plague have a much deeper and more intimate association than previously assumed. Lessons learned from studying this historic interaction could reshape how we think about global public health today.

All infectious diseases are caused by pathogens – bacteria, viruses, protozoa, and parasites – that are capable of subverting our immune systems long enough to make us sick. These organisms are the product of their own biological evolution, and the history of the plague’s development is perhaps (along with HIV’s) the most detailed biography of any pathogen known to science.

The plague bacterium, in its most destructive form, is about 3,000 years old. It evolved in Central Asia as a rodent disease; humans were accidental victims. From the germ’s point of view, people make poor hosts, because we die quickly and are usually a terminus, not a transmitter. The plague is spread principally by the bite of fleas, and a few thousand years ago, the bacterium acquired a genetic mutation that made it ferociously effective at spreading. This adaptation improved the plague’s biological fitness, which, for rodents – and the humans who live near them – has proven to be a nightmare.

Thanks to new genomic evidence, we can say with greater confidence how long this nightmare has been recurring. One of the most surprising and solidly confirmed findings in recent years has been the prevalence of plague in samples from Stone Age and Bronze Age societies in Europe and Central Asia. While it remains unclear what role plague played in the failure of those societies, it is reasonable to assume that the disease has long influenced human history.

What is now beyond question is that Yersinia pestis was indeed the pathogen responsible for two of the most destructive pandemics ever. The Black Death, which lives on in popular imagination to this day, arrived from Central Asia in the 1340s, and in the space of a few years, wiped out roughly half of the population in the regions it struck. The disease then lingered for a few more centuries, killing many more.

But this entire episode is properly known as the “second pandemic.” The first pandemic began in AD 541, during the reign of the Roman Emperor Justinian. The outbreak is known as the Justinianic plague, and, like the Black Death, it cut a swath of destruction from inner Asia to the shores of the Atlantic in the space of a few years. Total mortality was in the tens of millions, and stupefied contemporaries were certain they were living on the verge of the last judgment.

As with the Black Death, later historians questioned whether a rodent disease could cause destruction on such a scale. But in recent years, the pathogen’s genetic traces have been found in sixth-century graves, and the DNA evidence convicts Yersinia pestis of this ancient mass murder as definitively as it would in a modern courtroom. The plague triggered a demographic crisis that helped to topple the Romans’ “eternal empire.”

Plague pandemics were events of mind-boggling ecological intricacy. They involved a minimum of five species, in perilous alignment: the bacterium itself; the reservoir host, such as marmots or gerbils; the flea vector; the rodent species in close quarters with humans; and the human victims.

The germ first had to leave its native Central Asia. In the case of the Justinianic plague, it seems to have done so by exploiting the shipping networks in the Indian Ocean. Once within the Roman Empire, it found an environment transformed by human civilization, along with massive colonies of rodents fattened on the ancient world’s ubiquitous granaries. Human expansion helped rodents prosper, and rat infestations, in turn, intensified and prolonged the plague’s outbreak.

There is tantalizing evidence that climate change also played a role in triggering the first pandemic. Just a few years before the appearance of the plague on Roman shores, the planet experienced one of the most abrupt incidents of climate change in the last few thousand years. A spasm of volcanic explosions – in AD 536, when historians reported a year without summer, and again in AD 539-540 – upset the global climate system. The precise mechanisms by which climate events fueled plague remain contested, but the link is unmistakable, and the lesson is worth underscoring: the complex relationship between climate and ecosystems impacts human health in unexpected ways.

The plague in Madagascar today is an offshoot of what is known as the “third plague pandemic,” a global dispersion of Yersinia pestis that radiated from China in the late nineteenth century. There still is no vaccine; while antibiotics are effective if administered early, the threat of antimicrobial resistance is real.

That may be the deepest lesson from the long history of this scourge. Biological evolution is cunning and dangerous. Small mutations can alter a pathogen’s virulence or its efficiency of transmission, and evolution is relentless. We may have the upper hand over plague today, despite the headlines in East Africa. But our long history with the disease demonstrates that our control over it is tenuous, and likely to be transient – and that threats to public health anywhere are threats to public health everywhere.

Kyle Harper, a professor of classics and letters at the University of Oklahoma, is author of The Fate of Rome: Climate, Disease, and the End of an Empire.

By Kyle Harper

Saving Somalia Through Debt Relief

LONDON – Julius Nyerere, the first president of Tanzania, once asked his country’s creditors a blunt question: “Must we starve our children to pay our debts?” That was in 1986, before the public campaigns and initiatives that removed much of Africa’s crushing and unpayable debt burden. But Nyerere’s question still hangs like a dark cloud over Somalia.

Over the last year, an unprecedented humanitarian effort has pulled Somalia back from the brink of famine. As the worst drought in living memory destroyed harvests and decimated livestock, almost $1 billion was mobilized in emergency aid for nutrition, health, and clean water provision. That aid saved many lives and prevented a slow-motion replay of the 2011 drought, when delayed international action resulted in nearly 260,000 deaths.

Yet, even after these recent efforts, Somalia’s fate hangs in the balance. Early warning systems are pointing to a prospective famine in 2018. Poor and erratic rains have left 2.5 million people facing an ongoing food crisis; some 400,000 children live with acute malnutrition; food prices are rising; and dry wells have left communities dependent on expensive trucked water.

Humanitarian aid remains essential. Almost half of Somalia’s 14 million people need support, according to UN agencies. But humanitarian aid, which is often volatile and overwhelmingly short-term, will not break the deadly cycles of drought, hunger, and poverty. If Somalia is to develop its health and education systems, economic infrastructure, and the social protection programs needed to build a more resilient future, it needs predictable, long-term development finance.

Debt represents a barrier to that finance. Somalia’s external debt is running at $5 billion. Creditors range from rich countries like the United States, France, and Italy, to regional governments and financial institutions, including the Arab Monetary Fund.

But Somalia’s debt also includes $325 million in arrears owed to the International Monetary Fund. And there’s the rub: countries in arrears to the IMF are ineligible to receive long-term financing from other sources, including the World Bank’s $75 billion concessional International Development Association (IDA) facility.

Much of the country’s current debt dates to the Cold War, when the world’s superpower rivalry played out in the Horn of Africa. Over 90% of Somalia’s debt burden is accounted for by arrears on credit advanced in the early 1980s, well before two-thirds of today’s Somali population was born.

Most of the lending then was directed to President Siad Barre as a reward for his abandonment of the Soviet Union and embrace of the West. Military credits figured prominently: over half of the $973 million that Somalia owes the US is owed to the Department of Defense. Somalia got state-of-the-art weaponry, liberally financed by loans. The IMF was nudged into guaranteeing repayment through a structural adjustment program. Repaying the debt today would cost every Somali man, woman, and child $361.

None of this would matter if Somalia had qualified for debt reduction. The Heavily Indebted Poor Countries Initiative (HIPC), created in response to the great debt-relief campaigns of the 1990s, has written off around $77 billion in debt for 36 countries. Somalia is one of just three countries that have yet to qualify. The reason: the arrears owed to the IMF. (Eritrea and Sudan have also not qualified, for similar reasons.)

The IMF view is that Somalia, like earlier HIPC beneficiaries, should establish a track record of economic reform. This will delay a full debt write-off for up to three years, exclude Somalia from long-term development finance, and reinforce its dependence on emergency aid. Other creditors have endorsed this approach through silent consent.

Somalia deserves better. President Mohamed Abdullahi Mohamed’s government has demonstrated a commitment to economic reform, improved accountability, and transparency. For two years, it has adhered to an IMF program, achieving targets for improving public finance and the banking sector. More needs to be done, especially in terms of domestic resource mobilization. But this is the first Somali government to provide the international community with a window of opportunity to support recovery. We must capitalize on it.

Waiting three more years as Somalia ticks the IMF’s internal accounting boxes would be a triumph of bureaucratic complacency over human needs. Without international support, Somalia’s government lacks the resources needed to break the deadly cycle of drought, hunger, and poverty.

Somalia’s children need investment in health, nutrition, and schools now, not at some point in the indefinite future. Investing in irrigation and water management would boost productivity. With drought-related livestock and crop losses estimated at around $1.5 billion, government-supported cash payment programs would help aid recovery, strengthen resilience, and build trust.

The benefits of these investments would extend to security. Providing the hope that comes with education, health care, and the prospect of a job is a far more effective weapon than a drone against an insurgency that feeds on despair, poverty, joblessness, and the absence of basic services.

There is an alternative to IMF-sponsored inertia on debt relief. The World Bank and major creditors could convene a creditor summit to agree to terms for a prompt debt write-off. More immediately, the World Bank could seek its shareholders’ approval for a special mechanism – a “pre-arrears clearance grant” – that would enable Somalia to receive IDA financing. There is a precedent for this: In 2005, the US championed World Bank financing for Liberia, which at the time had significant IMF debt after emerging from civil war.

The technicalities can be discussed and the complexities resolved. But we should not lose sight of what is at stake. It is indefensible for the IMF and other creditors to obstruct Somalia’s access to financing because of arrears on a debt incurred three decades ago as much through reckless lending as through irresponsible borrowing.

Somalia’s children played no part in creating that debt. They should not have to pay for it with their futures.

Kevin Watkins is CEO of Save the Children UK.

By Kevin Watkins


Banking on African Infrastructure

JOHANNESBURG – As the US Federal Reserve embarks on the “great unwinding” of the stimulus program it began nearly a decade ago, emerging economies are growing anxious that a stronger dollar will adversely affect their ability to service dollar-denominated debt. This is a particular concern for Africa, where, since the Seychelles issued its debut Eurobond in 2006, the total value of outstanding African Eurobonds has grown to nearly $35 billion.

But if the Fed’s ongoing withdrawal of stimulus has frayed African nerves, it has also spurred recognition that there are smarter ways to finance development than borrowing in dollars. Of the available options, one specific asset class stands out: infrastructure.

Africa, which by 2050 will be home to an estimated 2.6 billion people, is in dire need of funds to build and maintain roads, ports, power grids, and so on. According to the World Bank, Africa must spend a staggering $93 billion annually to upgrade its current infrastructure; the vast majority of these funds – some 87% – are needed for improvements to basic services like energy, water, sanitation, and transportation.

Yet, if the recent past is any guide, the capital needed will be difficult to secure. Between 2004 and 2013, African states closed just 158 financing deals for infrastructure or industrial projects, valued at $59 billion – a mere 5% of the total needed. Given this track record, how will Africa fund even a fraction of the World Bank’s projected requirements?

The obvious source is institutional and foreign investment. But, to date, many factors, including poor profit projections and political uncertainty, have limited such financing for infrastructure projects on the continent. Investment in African infrastructure is perceived as simply being too risky.

Fortunately, with work, this perception can be overcome, as some investors – such as the African Development Bank, the Development Bank of Southern Africa, and the Trade & Development Bank – have already demonstrated. Companies from the private sector are also profitably financing projects on the continent. For example, Black Rhino, a fund set up by Blackstone, one of the world’s largest multinational private equity firms, focuses on the development and acquisition of energy projects, such as fuel storage, pipelines, and transmission networks.

But these are the exceptions, not the rule. Fully funding Africa’s infrastructure shortfall will require attracting many more investors – and swiftly.

To succeed, Africa must develop a more coherent and coordinated approach to courting capital, while at the same time working to mitigate investors’ risk exposure. Public-private collaborations are one possibility. For example, in the energy sector, independent power producers are working with governments to provide electricity to the 620 million Africans living off the grid. Privately funded but government-regulated, these producers operate through power purchase agreements, whereby public utilities and regulators agree to purchase electricity at a predetermined price. There are approximately 130 such producers in Sub-Saharan Africa, valued at more than $8 billion. In South Africa alone, 47 projects are underway, accounting for 7,000 megawatts of additional power production.

Similar public-private partnerships are emerging in other sectors, too, such as transportation. Among the most promising are toll roads built with private money, a model that began in South Africa. Not only are these projects, which are slowly appearing elsewhere on the continent, more profitable than most financial-market investments; they are also literally paving the way for future growth.

Clearly, Africa needs more of these ventures to overcome its infrastructure challenges. That is why I, along with other African business leaders and policymakers, have called on Africa’s institutional investors to commit 5% of their funds to local infrastructure. We believe that with the right incentives, infrastructure can be an innovative and attractive asset class for those with long-term liabilities. One sector that could lead the way on this commitment is the continent’s pension funds, which, together, possess a balance sheet of about $3 trillion.

The 5% Agenda campaign, launched in New York last month, underscores the belief that only a collaborative public-private approach can redress Africa’s infrastructure shortfall. For years, a lack of bankable projects deterred international financing. But in 2012, the African Union adopted the Program for Infrastructure Development in Africa, which kick-started more than 400 energy, transportation, water, and communications projects. It was a solid start – one that the 5% Agenda seeks to build upon.

But some key reforms will be needed. A high priority of the 5% Agenda is to assist in updating the national and regional regulatory frameworks that guide institutional investment in Africa. Similarly, new financial products must be developed to give asset owners the ability to allocate capital directly to infrastructure projects.

Unlocking new pools of capital will help create jobs, encourage regional integration, and ensure that Africa has the facilities to accommodate the needs of future generations. But all of this depends on persuading investors to put their money into African projects. As business leaders and policymakers, we must ensure that the conditions for profitability and social impact are not mutually exclusive. When development goals and profits align, everyone wins.

Ibrahim Assane Mayaki, a former Prime Minister of Niger, is CEO of the New Partnership for Africa’s Development (NEPAD) Planning and Coordinating Agency.

By Ibrahim Assane Mayaki

The Greening of the Miners

LONDON – Donald Trump’s presidency in the United States has turned mining – and the coal industry in particular – into a political cause célèbre over the last year. In June, during his first White House cabinet meeting, Trump suggested that his energy policies were putting miners back to work and transforming a troubled sector of the economy.

But Trump is mistaken to think that championing the cause of miners and paying respect to a difficult profession will be sufficient to make mining sustainable. To achieve that, a far more complex set of interdependencies must be navigated.

Debates about mining and the environment are often framed in terms of a “nexus” between the extraction of one resource and the other resources that the extraction process consumes. The forthcoming Routledge Handbook of the Resource Nexus, which I co-edited, defines the term as the relationship between two or more naturally occurring materials that are used as inputs in a system that provides services to humans. In the case of coal, the “nexus” is between the rock and the huge amounts of water and energy needed to mine it.

For decision-makers, understanding this linkage is critical to effective resource and land-use management. According to research from 2014, there is an inverse relationship between the grade of ore and the amount of water and energy used to extract it: as ore grades decline, each ton of output requires ever more water and energy. In other words, misreading how inputs and outputs interact could have profound environmental consequences.

Moreover, because many renewable energy technologies are built with mined metals and minerals, the global mining industry will play a key role in the transition to a low-carbon future. Photovoltaic cells may draw energy from the sun, but they are manufactured from cadmium, selenium, and tellurium. The same goes for wind turbines, which are fashioned from copious amounts of cobalt, copper, and rare-earth oxides.

Navigating the mining industry’s resource nexus will require new governance models that can balance extraction practices with emerging energy needs – like those envisioned by the UN’s Sustainable Development Goals (SDGs). Value creation, profit maximization, and competitiveness must also be measured against the greater public good.

Some within the global mining industry have recognized that the winds are changing. According to a recent survey of industry practices by CDP, a non-profit energy and environmental consultancy, mining companies from Australia to Brazil are beginning to extract resources while reducing their environmental footprint.

Nonetheless, if the interests of the public, and the planet, are to be protected, the world cannot rely on the business decisions of mining companies alone. Four key changes are needed to ensure that the industry’s greening trend continues.

First, mining needs an innovation overhaul. Declining ore grades require the industry to become more energy- and resource-efficient to remain profitable. And, because water scarcity is among the top challenges facing the industry, eco-friendly solutions are often more viable than conventional ones. In Chile, for example, copper mines have been forced to start using desalinated water for extraction, while Sweden’s Boliden sources up to 42% of its energy needs from renewables. Mining companies elsewhere can learn from these examples.

Second, product diversification must start now. With the Paris climate agreement a year old, the transformation of global fossil-fuel markets is only a matter of time. Companies with a large portfolio of fossil fuels, like coal, will soon face severe uncertainty related to stranded assets, and investors may change their risk assessments accordingly.

Large mining companies can prepare for this shift by moving from fossil fuels to other materials, such as iron ore, copper, bauxite, cobalt, rare-earth elements, and lithium, as well as mineral fertilizers, which will be needed in large quantities to meet the SDGs’ targets for eradicating global hunger. Phasing out coal at a time of chronic overproduction might even be done at a profit.

Third, the world needs a better means of assessing mining’s ecological risks. Although the industry’s environmental footprint is smaller than that of agriculture and urbanization, extracting materials from the ground can still permanently harm ecosystems and lead to biodiversity loss. To protect sensitive areas, greater global coordination is needed in the selection of suitable mining sites. Integrated assessments of subsoil assets, groundwater, and biosphere integrity would also help, as would guidelines for sustainable resource consumption.

Finally, the mining sector must better integrate its value chains to create more economic opportunities downstream. Establishing models of material flows – such as the ones existing for aluminum and steel – and linking them with “circular economy” strategies, such as waste reduction and reuse, would be a good start. A more radical change could come from a serious engagement in markets for secondary materials. “Urban mining” – the salvaging, processing, and delivery of reusable materials from demolition sites – could also be better integrated into current core activities.

The global mining industry is on the verge of transforming itself from fossil-fuel extraction to supplying materials for a greener energy future. But this “greening” is the result of hard work, innovation, and a complex understanding of the resource nexus. Whatever America’s coal-happy president may believe, it is not the result of political platitudes.

Raimund Bleischwitz is Deputy Director of the University College London Institute for Sustainable Resources.

By Raimund Bleischwitz

Sounding the Alarm on Biodiversity Loss

NORWICH – With the United Nations’ climate change conference underway in Bonn, Germany, rising global temperatures are once again at the top of the world’s agenda. But why care about the increase in temperature, if not because of its impact on life on Earth, including human life?

That is an important question to consider, in view of the relative lack of attention devoted to a closely related and equally important threat to human survival: the startling pace of global biodiversity loss.

The availability of food, water, and energy – fundamental building blocks of every country’s security – depends on healthy, robust, and diverse ecosystems, and on the life that inhabits them. But, as a result of human activities, planetary biodiversity is now declining faster than at any point in history. Many policymakers, however, have yet to recognize that biodiversity loss is just as serious a threat as rising sea levels and increasingly frequent extreme weather events.

This lack of sufficient attention comes despite international commitments to protect biodiversity. In October 2010, global leaders met in Aichi, Japan, where they produced the Strategic Plan for Biodiversity 2011-2020, which included 20 ambitious targets – such as halving global habitat loss and ending overfishing – that signatories agreed to meet by 2020. Safeguarding biodiversity is also specifically included in the UN’s Sustainable Development Goals. Yet progress toward these global biodiversity goals is likely to fall dangerously short of what is needed to ensure an acceptable future for all.

Policymakers have largely agreed on the importance of holding the increase in global temperature to less than 2°C above pre-industrial levels – the goal of the Paris climate agreement. But too few leaders have shown any sense of urgency about stemming biodiversity losses. The sustainable future we want depends on ending this indifference.

Toward that end, the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES), which I chair, will release a series of landmark reports next March on the implications of biodiversity decline. Prepared over three years by more than 550 experts from some 100 countries, these expert assessments will cover four world regions: the Americas, Asia and the Pacific, Africa, and Europe and Central Asia. A fifth report will address the state of land degradation and restoration at regional and global levels.

The reports will highlight trends and plausible futures, outlining the best policy options available to slow the degradation of ecosystems, from coral reefs to rainforests. Taken together, the IPBES assessments will represent the global scientific community’s consensus view on the state of biodiversity and ecosystem services.

Moreover, the reports will highlight the close links between biodiversity loss and climate change, which should be addressed simultaneously. The world will not be able to meet the goals of the Paris agreement – or many of the SDGs, for that matter – unless it takes into account the state of biodiversity and ecosystem services.

Today, most governments separate their environmental authorities from those focusing on energy, agriculture, and planning. This makes it difficult to address climate change or biodiversity losses in a holistic way. Innovative new governance structures are needed to bridge these policy silos.

After the release of IPBES regional reports next year, a global assessment building on them will be published in 2019. This will be the first global overview of biodiversity and ecosystem services since the authoritative Millennium Ecosystem Assessment of 2005. It will examine the health of terrestrial, freshwater, and marine ecosystems, and the impact of factors including acidification, rising sea surface temperatures, trade, invasive species, overfishing, pollution, and land use changes.

The success of efforts to reverse unsustainable uses of the world’s natural assets will require policymakers to reconsider the value of biodiversity for their people, environments, and economies. But the first step is ensuring that we have the best peer-reviewed knowledge available to make sound decisions; the forthcoming IPBES assessments will move us in that direction.

If the full consequences of climate change are to be addressed in our lifetime, we must recognize that human activity is doing more than just adding a few degrees of temperature to the annual forecast. By early next year, we will have the data on biodiversity and ecosystem services to prove it, and the policy options to change course.

Robert Watson, Strategic Director of the Tyndall Center for Climate Change Research at the University of East Anglia, is Chair of the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES).

By Robert Watson

How to Boost Access to Essential Medicines

DÜSSELDORF – Around the world, health security is increasingly being recognized as the foundation of economic growth. Healthy populations are better able to produce, trade, and innovate, while unhealthy populations strain public budgets and create risks that discourage economic exchange. This logic is written into countless European Union reports, and is even gaining traction in the United States, despite the “America First” approach to international affairs embraced by President Donald Trump’s administration.

Against this backdrop, the World Health Organization (WHO), under its new Director-General, Tedros Adhanom Ghebreyesus, has a unique opportunity to pursue urgently needed reforms. The WHO’s response to the 2014-2016 Ebola outbreak in West Africa was roundly judged a failure. And with the emergence of new diseases such as Zika – and the revival of old foes like bubonic plague – there is no question that much of humanity remains at the mercy of biology. Moreover, globalization has compounded the danger by facilitating the spread of communicable diseases. A flu outbreak like that of 1918-1920, which killed between 50 and 100 million people, would be even more devastating today.

To prevent such catastrophic outcomes, we need a comprehensive approach for strengthening health-care delivery in low- and middle-income countries. In particular, these countries need help improving drug delivery and managing chronic diseases such as cancer and diabetes, which impose an immense burden on their economies.

Unfortunately, the WHO’s leadership, like much of the West, has not pursued this course of action, because it has been distracted by an ideological obsession with drug prices. But drug prices are a vanishingly small part of the problem in countries struggling to build healthy, productive societies. Of all the drugs on the WHO’s “Essential Medicines” list, 95% are already off-patent, meaning that cheaper generic versions are available worldwide.

In cases where drugs aren’t reaching people who need them, the reason is not high prices, but rather dysfunctional health systems. Fortunately, public-health analysts have identified a handful of structural reforms that would largely eliminate existing bottlenecks that are hindering the distribution of essential medicines.

The first trouble spot is infrastructure. More than half of the world’s rural population lacks access to basic health care, compared to about a fifth of the urban population. Inadequate and unreliable transportation networks make accessing health-care services costly and time-consuming, and impede drug deliveries from supply centers. With better roads and more fully developed transportation systems, emerging economies could boost not just health outcomes, but also economic and educational opportunities.

A second problem, even in areas with adequate infrastructure, is the prevalence of bureaucratic and economic barriers that limit access to essential medicines. According to a 2008 study of 36 developing countries, tortuous registration and approval processes create frequent shortages of 15 of the most commonly used generic medicines. For example, in South Africa it can take up to five years for new medicines to reach the market, owing to that country’s drug-registration and labeling regulations. By streamlining drug-approval processes, removing tariffs, and simplifying customs procedures, many countries could immediately increase the availability of dozens of essential medicines.

A third problem is that there are too few health-care workers. In many low- and middle-income countries, patients cannot get the drugs they need simply because there are no doctors or nurses to prescribe them, nor pharmacists to dispense them. According to the WHO, the world suffers from a deficit of some seven million health-care professionals; by 2035, that number is expected to reach 13 million. Making matters worse, there is an even greater shortfall of specialists equipped to treat chronic diseases such as diabetes, which is spreading rapidly through the developing world, owing to changing diets and habits.

Flawed – or nonexistent – health-finance schemes are the fourth, and perhaps the largest, barrier to drug delivery in many countries. Even when generic-brand essential medicines are available, they often are unaffordable for low-income patients in countries with scant state subsidies and no risk-pooling insurance mechanisms. By one estimate, almost 90% of people in low- and middle-income countries will face impoverishment if they have to pay out of pocket for a single commonly used generic drug.

Some of the countries that are most eager to strip patent protections are also notorious for skimping on health-care expenditures. The Indian government, for example, spends just around 1% of GDP on health care, well below the 5% needed to move toward universal health coverage. But expropriating drug makers’ intellectual property will do nothing to improve outcomes where crucial safety nets are missing.

Increasing the availability of essential medicines is imperative for improving health-care outcomes for hundreds of millions of people around the world. When societies are not burdened by disease, they can focus on boosting productivity, consumption, and trade. At the same time, neglecting the threat posed by communicable diseases in the developing world invites catastrophe not just in those countries, but in developed economies, too. We can do much to close the global health-security gap; but undermining patent protections for new drugs will do precisely the opposite.

Justus Haucap is a professor of economics at Heinrich-Heine University.

By Justus Haucap

A Formula for Health Equity

KIGALI – Imagine a country where some 90% of the population is covered by health insurance, more than 90% of those with HIV are on a consistent drug regime, and 93% of children are vaccinated against common communicable diseases including HPV. Where would you guess this enchanted land of medical equity is? Scandinavia? Costa Rica? Narnia?

Try Africa – Rwanda, to be precise. In my native country, health care is a right guaranteed for all, not a privilege reserved for the rich and powerful. Rwanda remains poor, but, over the past 15 years, its health care advances have gained global attention, for good reason. In 2000, life expectancy at birth was just 48 years; today, it’s 67. International aid has helped, but our achievements have come primarily from other, non-financial innovations.

For starters, Rwanda has established a collaborative, cluster approach to governance that allows us to achieve more with the same amount of funding. Moreover, our civil servants embrace problem solving, demonstrating a level of resourcefulness that has produced many localized solutions to human development challenges such as ensuring food security and adequate supplies of clean water and housing.

But perhaps the most important factor behind our dramatic health-care gains has been the national equity agenda, which sets targets for supporting the needy and tracks progress toward meeting them. Since implementing this approach, Rwanda has managed to decrease the percentage of people living in extreme poverty from 40% in 2000 to 16.3% in 2015.

Aside from the obvious benefits, these gains matter because, as UNICEF recently noted, a country’s potential return on investment in social services for vulnerable children is two times greater when the benefits reach the most vulnerable. In other words, Rwanda has achieved so much so fast because we are enjoying higher rates of return by investing in the poorest.

In working toward health equity, Rwanda has made accessibility a top priority. As of 2016, nine out of ten Rwandans were enrolled in one of the country’s health insurance programs. The majority of the population is enrolled in the Community-Based Health Insurance (CBHI) scheme, which has increased access to health care for Rwanda’s most vulnerable citizens by waiving fees.

As a result, the reach of health-care coverage in Rwanda is high by global standards – all the more remarkable for a country that suffered the horrors of genocide a generation ago. Consider the situation in the US: while the rate of uninsured Americans has dropped precipitously under the 2010 Affordable Care Act, the insured face rapid increases in premiums and out-of-pocket expenses. Perhaps the US should consider adopting a CBHI-type program, to reduce further the number of Americans facing financial barriers to medical care.

Rwanda has crafted health-care delivery with access in mind as well, by deploying community health workers (CHWs) to the country’s 15,000 villages. These local practitioners serve as the gatekeepers to a system that has reduced waiting times and financial burdens by treating patients directly – often in patients’ homes.

The US could also benefit from a CHW program. The US is brimming with educated young people who, as CHWs, could bridge the gap between medical facilities and patients, thereby improving American social capital and health outcomes. As Rwanda’s experience has demonstrated, such programs not only broaden access to health care; they also lower overall costs by reducing unnecessary hospitalizations.

Such programs have been shown to be transferable. Starting in 1997, Brigham and Women’s Hospital supported the HIV+ community of Boston through the Prevention and Access to Care and Treatment (PACT) program. That initiative was based on the CHW model implemented in rural Haiti by Partners In Health – a non-profit health-care organization that integrates CHWs into primary care and mental health.

As a result of that initiative, the government insurer Medicaid spent less money on hospital stays, and inpatient expenditures fell by 62%. Other US communities could, and should, incorporate similar models into their treatment programs for chronic conditions.

Innovation is what kick-started Rwanda’s health-care revival, and progressive thinking is what drives it forward today. For example, health centers established throughout the country provide vaccinations and treat illnesses that village-level CHWs cannot, and have extended obstetrics services to the majority of Rwandan women.

Broadening access further, each district in Rwanda has one hospital, and each region in the country has a referral or teaching hospital with specialists to handle more difficult cases. While some hospitals still suffer from staff shortages, the government has sought to patch these holes through an initiative that employs faculty from over 20 US institutions to assist in training our clinical specialists.

In just over two decades, thanks to homegrown solutions and international collaboration, Rwanda has dramatically reduced the burden of disease on its people and economy. As we look forward, our goal is to educate tomorrow’s leaders to build on the equitable health-care system that we have created. This is the mission of the University of Global Health Equity, a new university based in rural Rwanda that has made fairness, collaboration, and innovation its guiding principles.

As a Rwandan doctor who contributed to building my country’s health-care system from its infancy, I am proud of what we have accomplished in so short a time. It wasn’t magic; it was a formula. Through continued global cooperation, other countries, including developed ones, can learn to apply it.

Agnes Binagwaho, a former minister of health of Rwanda, is Vice Chancellor of the University of Global Health Equity. She is a 2017 inductee into the US National Academy of Medicine.

By Agnes Binagwaho

When Climate Leaders Protect Dirty Investments

GENEVA – Solutions to the climate crisis are often associated with big conferences, and the next two weeks will no doubt bring many “answers.” Some 20,000 delegates have now descended on Bonn, Germany, for the latest round of United Nations climate change talks.

The talks in Bonn should focus on the implementation of the Paris climate agreement. And the path forward is clear. The only way to keep the rise in global temperatures within the limit set in Paris – “well below 2°C” higher than pre-industrial levels – is to shift capital away from fossil fuels and toward zero-carbon projects. To do that, we must change how global energy investments are governed.

At the moment, the very governments leading the fight against climate change continue to support and protect investment in fossil-fuel exploration, extraction, and transportation. Rather than investing in efficient housing, zero-carbon mobility, renewable energy, and better land-use systems, these governments say one thing but still do another.

According to the most recent World Energy Investment report from the International Energy Agency, global expenditure in the oil and gas sector totaled $649 billion in 2016. That was more than double the $297 billion invested in renewable electricity generation, even though achieving the Paris agreement’s target implies leaving at least three quarters of known fossil-fuel reserves in the ground. As these numbers suggest, institutional inertia and entrenched industry interests continue to stand in the way of shifting investment into sustainable energy.

Much of the problem can be traced to bilateral investment treaties and investment rules embedded within broader trade pacts, such as the North American Free Trade Agreement (NAFTA), the Energy Charter Treaty, and the EU-Canada Comprehensive Economic and Trade Agreement (CETA). Because these treaties were designed to shield foreign investors from expropriation, they include investor-state dispute settlement (ISDS) mechanisms that allow investors to seek compensation from governments, via international arbitration tribunals, if policy changes affect their business.

This has handcuffed governments seeking to limit fossil-fuel extraction. Compensation from ISDS cases can be staggering. In 2012, an American investor used NAFTA to challenge the Quebec government’s decision to deny a permit for hydraulic fracturing under the Saint Lawrence River. Arguing that the denial was “arbitrary, capricious, and illegal,” the Delaware-based energy firm sought $250 million in damages.

In January 2016, the TransCanada energy company used NAFTA to sue the United States, claiming $15 billion in losses after President Barack Obama denied a permit for the Keystone XL oil pipeline. (The company suspended its suit after President Donald Trump approved the project in January 2017.)

And in July 2017, Quebec agreed to pay nearly $50 million in compensation to companies after canceling oil and gas exploration contracts on Anticosti Island in the Gulf of Saint Lawrence. These and other payments are in addition to the hundreds of billions of dollars in subsidies that continue to flow to the fossil-fuel industry.

Big payouts do more than drain public coffers; the mere threat of them discourages governments from pursuing more ambitious climate policies, owing to fear that carbon-dependent industries could challenge them in international tribunals.

Fortunately, this state of affairs is not set in stone. Many governments now see reform of the investment regime not just as a possibility, but as a necessity. Last month, the UN Conference on Trade and Development convened a high-level meeting in Geneva, with the goal of developing options for comprehensive reform of the investment regime, including the renegotiation or termination of some 3,000 outdated treaties.

Governments should start by overhauling or exiting the Energy Charter Treaty, the world’s only energy-specific investment pact. The ECT’s investment protections and lack of climate provisions are no longer appropriate. Since its inception, the ECT has served as the basis for more than 100 claims by energy firms against host countries, with some challenging national environmental policies, such as the nuclear phase-out in Germany. Russia and Italy have already withdrawn from the ECT; other countries should do the same or commit to renegotiating it.

Moreover, countries should put climate concerns at the center of their trade and investment negotiations, such as by carving out fossil-fuel projects from investment clauses. That is essentially what France recently proposed, when ecology minister Nicolas Hulot announced his country’s intention to enact a “climate veto” to CETA. Hulot said France would ratify the treaty only if it contained assurances that its climate commitments could not be challenged before arbitration tribunals. Fossil-fuel projects could also be exempted from investment protection in new environmental treaties, such as the Global Pact for the Environment presented by French President Emmanuel Macron to the UN General Assembly in September.

Rebalancing the global investment regime is only the first step toward a zero-carbon economy. To shift capital from fossil-fuel-heavy initiatives to green energy projects, countries will need new legal and policy frameworks at the regional, national, and international levels. These agreements should promote and facilitate zero-carbon investments. Big meetings like the one getting underway this week and the Paris Climate Summit next month can kick-start these conversations.

Nathalie Bernasconi-Osterwalder is director of the Economic Law and Policy Program at the International Institute for Sustainable Development (IISD). Jörg Haas is Department Head: International Politics at the Heinrich Böll Foundation.

By Nathalie Bernasconi-Osterwalder and Jörg Haas

Freeing Africa’s Internet

WASHINGTON, DC – Much to the dismay of the government in Addis Ababa, “Zone 9” has become a household name in Ethiopia. Since 2012, this small group of journalists-turned-online activists has used social media to campaign for political freedoms and civil liberties in their country. The group’s success – measured, for example, by the flood of likes and comments on its Facebook page – has come in spite of government efforts to silence the writers, including the arrest of six members in 2014 on trumped-up terrorism charges.

Ethiopia’s government is not alone in seeking to consolidate political power by restricting what citizens say online. Across Africa, governments are enacting legislation to restrict Internet access and outlaw criticism of elected officials. Digital campaigners face myriad censorship tactics, including “Border Gateway Protocol” attacks, “HTTP throttling,” and “deep packet inspections.”

The irony, of course, is that censorship rarely quiets the disaffected. Rather than quelling dissent, government intervention only inspires more people to take their grievances to WhatsApp, Facebook, Twitter, and other social media platforms, where Africans are increasingly challenging corrupt governments, exposing rigged elections, and demanding to be heard.

At the moment, however, few of Africa’s leaders are listening. Leaders in nine of the 18 African countries that held elections in 2016 placed some level of restriction on the Internet to limit dissent. Four days prior to Uganda’s presidential vote in February, President Yoweri Museveni cut access to mobile payment services and social media sites. In August and September, Gabon’s president, Ali Bongo, seeking to project an atmosphere of calm to the international community, shut down Internet access overnight. Then in December, officials in the Democratic Republic of the Congo ordered an Internet shutdown the day before President Joseph Kabila was scheduled to leave office, thereby quashing online dissent when he refused to step down.

Internet blackouts like these violate people’s human rights and undermine democratic processes. Last year, the United Nations Human Rights Council approved a resolution affirming that “rights that people have offline must also be protected online, in particular freedom of expression.”

Most African governments try to justify Internet embargoes by arguing that the restrictions are necessary to ensure public safety and security. Museveni, for example, claimed that blocking Internet access was the only way to protect visiting heads of state during his swearing-in ceremony. But he presented no evidence linking social media accessibility and security in Uganda, or anywhere else. According to Access Now, an international advocacy group for digital rights, people typically feel less secure without the Internet, because they cannot access information or connect with friends and family in times of uncertainty.

With several key African elections coming up, Internet shutdowns are again on the horizon. In Zimbabwe, where President Robert Mugabe, who is 93, is expected to run for his eighth term in mid-2018, a government-led crackdown appears inevitable. For decades, Mugabe has relied on intimidation and violence to stifle political dissent. It is not surprising, then, that he has already begun taking a hostile approach to online activism. Last year, his government shut down the Internet in the middle of political protests, and vowed to arrest anyone caught generating or sharing “abusive or subversive material on social media.”

But citizens are not helpless. While governments issue orders to cut off Internet access, only telecommunications companies have the ability to hit the “kill switch.” That is why Africa’s bloggers and online activists must work more closely with investors and shareholders of communications firms to convince them to stand up for democracy and human rights by resisting illiberal government directives.

Moreover, civil-society groups, the African Union, and the UN should do more to condemn national legislation that aims to normalize restrictive Internet policies. Just as it launched a model law on access to information in 2013, the African Union should provide new guidance to states on how to safeguard the right to assemble and express views online.

Finally, new continent-wide measures are needed to ensure that Africans’ online rights are recognized and respected by their governments. Although the UN Human Rights Council’s resolution to protect online freedoms is not binding, it offers a starting point for ensuring that governments allow citizens to use the Internet as a tool for maximizing political participation.

Such interventions are needed now more than ever. The Kenyan, Zimbabwean, and Ethiopian legislatures are currently considering laws that would permit significantly greater government control over Internet access. Last year, Tanzania adopted legislation that has already been used to bring criminal charges against individuals who criticized President John Magufuli on social media.

Whether governments bar citizens from gathering in public, signing petitions, or accessing the Internet and posting on social media makes no difference. All such measures are designed to strip citizens of their rights. The battle for freedom, as Zone 9 has shown, is no less real when the public square is the digital domain.

Kizito Byenkya is a senior program specialist at the Open Society Human Rights Initiative. Alex Humphrey is a policy associate at the Open Society Foundations.

By Kizito Byenkya and Alex Humphrey
