The China Factor in Kenya and Zimbabwe

BEIJING – Ask anyone with a basic knowledge of Africa which country is more poised for success – Zimbabwe or Kenya – and he or she will undoubtedly answer “Kenya.” Events of the last week would seem to confirm that verdict.

On Monday, after Kenya’s Supreme Court upheld the reelection of President Uhuru Kenyatta in the country’s contested presidential election, the rule of law seemed to trump political violence for the first time in years. Zimbabwe, on the other hand, is without President Robert Mugabe for the first time in 37 years. And, although the country may be ecstatic now, its political future is far from certain.

But as a Kenyan living in China, one of the African continent’s most important development partners, I see one metric that tips the scale in Zimbabwe’s favor: its relationship with my adopted home. In fact, Zimbabwe’s economic and political ties to China could prove decisive for Africa’s perpetual underdog.

On paper, Kenya clearly has the edge. Although Zimbabwe has more natural resources and mineral wealth, it has far less land, and extreme poverty is much more widespread. More than 70% of the country’s 16 million people live on less than $1.90 a day, compared to 46% of Kenya’s 48 million people. Moreover, as many as 90% of Zimbabweans are unemployed or underemployed, compared to 39% of Kenyans.

Even Kenya’s economic links to China might seem more impressive at first glance. Kenya and China have long cooperated on large infrastructure projects. A Chinese-funded railway between Nairobi and Mombasa, which opened earlier this year, is the latest example. Since 2000, China has offered Kenya $6.8 billion in loans for infrastructure projects, compared to $1.7 billion for Zimbabwe. And, because loan conditions often include a requirement to hire Chinese employees, Kenya had more than 7,400 Chinese workers at the end of 2015, while Zimbabwe had just over 950.

But in the competition for Chinese largesse, Kenya’s advantage over Zimbabwe ends there. Cumulative Chinese foreign direct investment since 2003 has reached nearly $7 billion in Zimbabwe, compared to $3.9 billion for Kenya. Year on year, more Chinese money is flowing to Zimbabwe as well.

Moreover, Zimbabwe’s trade balance with China is far superior to Kenya’s. In 2015, Kenya’s exports to China totaled $99 million, while it imported from China a staggering 60 times that amount. Even taking into account imports of materials tied to Chinese-built infrastructure, this is an exceptionally wide bilateral deficit.

Zimbabwe, on the other hand, despite its slow growth rate, exported $766 million worth of goods to China in 2015, and imported $546 million. Most surprisingly, Zimbabwe’s exports were not restricted to minerals and metals, as one might assume, but also included tobacco and cotton, products that are relatively more labor-intensive, meaning more job creation at home. And, while Zimbabwe has around 50 fewer registered Chinese companies than Kenya, Kenya’s economy is around 4.5 times the size of Zimbabwe’s, clearly implying that the Chinese firms operating in Zimbabwe contribute relatively more to that country’s economy.

How has Zimbabwe achieved what looks like, at least from a numerical perspective, a more productive relationship with China than Kenya has?

Few beyond Mugabe and his close colleagues, including the country’s new president, Emmerson Mnangagwa, know for sure. But one way to make an educated guess is to compare both countries’ history of bilateral engagement with China.

Both Kenya and Zimbabwe have had two visits from Chinese heads of state during their post-colonial histories. Chinese President Jiang Zemin visited each country in 1996, while President Hu Jintao visited Kenya in 2006. China’s current president, Xi Jinping, visited Zimbabwe in 2015.

State visits in the other direction have been more uneven. Mugabe’s first visit to China was in 1980, just six months after independence; he made 13 more during his tenure, and high-level visits by other Zimbabwean officials were even more frequent, occurring roughly once every two years during Mugabe’s reign. Kenyan presidents, by contrast, traveled to China just six times during the same period, most recently in May 2017.

Zimbabwe’s leaders made the most of their visits to press for trade and military cooperation, and likely engaged directly with private Chinese companies. This has nurtured a culture of reciprocity. Just a few months ago, for example, a Chinese company approached my firm asking for advice about how to enter Zimbabwe’s health-care market. I have not yet fielded similar questions about gaining access to markets in Kenya.

China’s role in African economies has been criticized; but, as I have argued before, Chinese investment has also been a lifeline to many on the continent. From creating employment opportunities to providing direct investment in infrastructure, China has been a partner to Africa when many Western investors preferred to stay away.

How Kenya and Zimbabwe navigate their future relationships with China remains to be seen. Both countries have supported Xi’s signature Belt and Road Initiative, which, in theory, should increase their strategic value to China. Kenya’s return to political stability should also sustain, if not deepen, the country’s economic engagement with China.

Zimbabwe’s historic ties to China will be no less important. Following Mugabe’s resignation, China’s foreign ministry went out of its way to praise the “friendship between China and Zimbabwe,” and Mnangagwa can be expected to continue that relationship. The new president received military training in China, and paid an official visit as speaker of the parliament in 2001. There is even speculation that China was warned of the looming coup in Zimbabwe, if not consulted beforehand.

As Kenya and Zimbabwe navigate their political futures, much in both countries will no doubt change – one hopes for the better. Their ties with China will be a key metric in assessing their trajectory.

Hannah Ryder, a former head of policy and partnerships for the United Nations Development Program in China, is founder and CEO of Development Reimagined.

By Hannah Ryder

Capitalizing on Climate Unity

BONN – When Donald Trump was elected US president a year ago, some said the end of the Paris climate agreement was nigh. Yet, as the latest round of global climate talks in Bonn, Germany, has shown, the world’s political leaders are more committed to the deal than ever. This is good news, but the fact remains that countries’ commitments do not yet add up to enough to turn the tide – and our window of opportunity to act effectively on climate change is rapidly closing.

Trump’s decision to withdraw the United States – the world’s largest historical carbon dioxide emitter – from the Paris agreement dealt the accord a major blow. Many of America’s closest allies – including both of our countries, the Marshall Islands and Australia – were deeply disappointed by the move, which was shortsighted, for both America and the world.

But it is hard not to take heart from the fresh wave of resolve that Trump’s decision has unleashed, both globally and within the US itself. Almost every major US state, city, and company has now pledged to do more to ensure that the US can meet its commitments, despite the Trump administration’s opposition.

The fact that climate action is now the world’s biggest economic opportunity has certainly helped. According to the Trump administration’s own analysis, more than twice as many Americans now work in the solar industry as in coal, oil, and gas combined. And earlier this year, the OECD indicated that we could boost global growth by 5% per year by 2050, simply by linking the climate and growth agendas.

There is no time to waste; climate change has already arrived. This year’s record-breaking drought in the Marshall Islands, apocalyptic storms in the Caribbean, and devastating floods in Bangladesh and the US demonstrate this.

As the United Nations Environment Programme recently reminded us, even if every country hits its existing 2030 emissions-reduction targets, we will be unable to limit warming to below 1.5° Celsius above preindustrial levels – the threshold, recognized in the Paris agreement, beyond which the impact of climate change becomes far greater. Even our chances of staying within the less ambitious – and more dangerous – 2° Celsius limit will be slim.

To ignore this reality is to gamble with the existential future of many island countries, not to mention the prosperity of the global economy. Without a sharp rise in global ambition for emissions reductions by 2020, we will be unable to save the world’s most vulnerable countries. And if runaway climate change takes hold, no country will be immune to its effects.

Unfortunately, things will get a lot worse before they get better. That is why we must step up our efforts to boost our resilience to the climate effects we won’t be able to avoid, and address the associated security consequences.

In the meantime, we must urgently increase the ambition of our climate commitments. Fortunately, several upcoming events offer an opportunity to do just that. We need to seize that opportunity with both hands.

Next month, French President Emmanuel Macron will host a conference to mark the two-year anniversary of the Paris agreement. And next September, California Governor Jerry Brown will host his own summit to galvanize greater action by cities, companies, and other non-state actors. The biggest opportunity, however, will come in 2019, when UN Secretary-General António Guterres convenes world leaders in New York for the biggest climate gathering since the Paris talks.

We need to build an arc of ambition across these events that can, in the words of our friend Tony de Brum, the late Marshallese foreign minister and untiring climate warrior at the Paris conference, deliver a pathway to survival for the most vulnerable.

Some significant players are already going above and beyond their pledges. A number of others, including the Marshall Islands, are set to bring forward new targets by 2020, to augment their current targets, which reach only as far as 2025. Still others – including France, India, and New Zealand – have said informally that they are eager to do more.

The truth is that almost all countries have the capacity to do more, especially if the support is there and the opportunities are identified. The imperative now is to create the right political conditions both to motivate and facilitate action. As more countries signal their ability to increase the ambition of their commitments, still more will follow.

At the same time, we must ensure that every sector, as well as every country, does its fair share. This includes, for example, international shipping, which, if it were a country, would be the world’s sixth-largest emitter.

Next year’s “Talanoa Dialogue” – to be convened by Fiji, which last week became the first island state to chair UN climate talks – will help countries identify exactly how they can achieve the goals set in the Paris agreement. That dialogue, which countries should approach in good faith, must be a springboard for further action. To that end, the recent Intergovernmental Panel on Climate Change report laying out pathways for keeping the temperature rise below the 1.5°C threshold will be crucial. The science remains key.

The Paris talks proved that political success is possible, if leaders are given the right platform, if civil society mobilizes behind them, and if the world acts in unison. To get the rest of the way to a sustainable future, we must apply this lesson again. The catchphrase at the Bonn conference was “further, faster, and together.” Our collective challenge is to translate a nice-sounding slogan into reality.

Hilda Heine is President of the Republic of the Marshall Islands. Kevin Rudd, the 26th prime minister of Australia, is currently President of the Asia Society Policy Institute.

By Hilda Heine and Kevin Rudd

Natural Solutions to Climate Change

OXFORD – In response to climate change, land is key. Today, agriculture, forestry, and other land uses account for roughly a quarter of global greenhouse-gas emissions. But adopting sustainable land management strategies could provide more than one-third of the near-term emission reductions needed to keep warming well below the target – 2°C above pre-industrial levels – set by the Paris climate agreement.

Conservation organizations like mine have long been working to balance the interaction between people and nature. But only recently have we fully grasped just how important land-use management is in addressing climate change. With the development of remote sensing, artificial intelligence, and biogeochemical modeling, we can better forecast outcomes, and develop strategies to manage and minimize adverse consequences.

Some of the most promising ways to mitigate climate change are what we call “natural climate solutions”: the conservation, restoration, and improved management of land, in order to increase carbon storage or avoid greenhouse-gas emissions in landscapes worldwide. The full potential of these solutions is detailed in a new study produced by my organization, the Nature Conservancy, and 15 other leading institutions.

Among the most important natural climate solutions is protecting “frontier forests” – pristine woodlands that serve as natural carbon sinks. Intact tropical and northern forests, as well as savannas and coastal ecosystems, store huge amounts of carbon accumulated over centuries. When these areas are disturbed, carbon is released. Preservation of frontier habitats also helps regulate water flows, reduces the risk of flooding, and maintains biodiversity.

Reforestation is another important natural solution. Globally, an estimated two billion hectares (4.9 billion acres) of land have been deforested or degraded. Because trees are the best carbon-capture-and-storage technology the world has, reversing this degradation would bring a significant reduction in global carbon levels. We estimate that the world could capture three gigatons of CO2 annually – equivalent to taking more than 600 million cars off the roads – simply by planting more trees.

A third category of natural solution is agricultural reform. From field to fork, the food sector is a major contributor to climate change through direct and indirect emissions, and by its often-negative effects on soil health and deforestation. Recognizing these risks, 23 global companies – including Nestlé, McDonald’s, Tesco, and Unilever – recently signed a commitment to halt deforestation in Brazil’s Cerrado savanna. The region, which covers a quarter of the country, has come under growing pressure from production of beef, soy, and other commodities, together with the associated infrastructure.

As the Cerrado pledge demonstrates, when governments and businesses come together to address land-use challenges, the impact is potent. Natural climate solutions have the potential to reduce CO2 emissions by an estimated 11.3 billion tons a year – equal to a complete halt in burning oil, according to our study. One recent study calculated that if Brazil reached zero deforestation by 2030, it would add 0.6% of GDP, or about $15 billion, to its economy. Communities also reap secondary benefits – such as rural regeneration, improved food and water security, and coastal resilience – when natural climate solutions are implemented.

Yet, despite the data supporting better land-use decision-making, something isn’t adding up. In 2016, the world witnessed a dramatic 51% increase in forest loss, equivalent to an area about the size of New Zealand. We need to buck this trend now, and help the world realize that land-use planning is not simply a conservation story.

Some countries are moving in the right direction. The Indian government, for example, has set aside $6 billion for states to invest in forest restoration. In Indonesia, the government created a dedicated agency to protect and restore peatlands – bogs and swamp-like ecosystems with immense CO2 storage capabilities.

But they are the exceptions. Of the 160 countries that committed to implementing the Paris climate agreement, only 36 have specified land-use management in their emissions-reduction strategies.

Overcoming inertia will not be easy. Forests, farms, and coasts vary in size, type, and accessibility. Moreover, the lives of hundreds of millions of people are tied to these ecosystems, and projects that restore forest cover or improve soil health require focused planning, a massive undertaking for many governments.

One way to get things moving, especially in the agricultural sector, would be to remove or redirect subsidies that encourage excessive consumption of fertilizers, water, or energy in food production. As Indian government officials reminded their peers during a World Trade Organization meeting earlier this year, meaningful agricultural reforms can begin only when rich countries reduce the “disproportionately large” subsidies they give their own farmers.

Supporting innovation and entrepreneurship can also help power change. New processes and technologies in landscape planning, soil analysis, irrigation, and even alternative proteins such as plant-based meat are making agriculture and land use more sustainable. Similarly, changes in the construction industry, which is turning to more efficiently produced products like cross-laminated timber (CLT), can help reduce carbon pollution.

Finally, financing options for natural climate solutions must be dramatically increased. While payments to conserve forests are starting to flow under the UN’s REDD+ program, and the Green Climate Fund has committed $500 million for forest protection payments, total public investment in sustainable land use remains inadequate. According to the Climate Policy Initiative, public financing for agriculture, forestry, and land-use mitigation attracted just $3 billion in 2014, compared to $49 billion for renewable energy generation and $26 billion for energy efficiency.

At the UN climate change meeting that just concluded in Bonn, Germany, global leaders reaffirmed that the world cannot respond adequately to rising temperatures if governments continue ignoring how forests, farms, and coasts are managed. Now that there is a firm consensus, governments must act on it.

Justin Adams is Global Managing Director for Lands at the Nature Conservancy.

By Justin Adams

The Eternal Return of the Plague

NORMAN, OKLAHOMA – “Fearsome Plague Epidemic Strikes Madagascar.” That recent New York Times headline might sound like the synopsis of a horror movie. But the epidemic gripping Madagascar is not some Hollywood apocalypse, and it is not just any plague: it is the plague, caused by the bacterium Yersinia pestis, agent of the notorious bubonic plague.

For most people, “the plague” conjures up images of the medieval Black Death, and perhaps a vaguely reassuring sense that, in the developed world, such ancient dangers are long past. But in recent years, thanks to the work of geneticists, archaeologists, and historians, we now know that human civilization and the plague have a much deeper and more intimate association than previously assumed. Lessons learned from studying this historic interaction could reshape how we think about global public health today.

All infectious diseases are caused by pathogens – bacteria, viruses, protozoa, and parasites – that are capable of subverting our immune systems long enough to make us sick. These organisms are the product of their own biological evolution, and the history of the plague’s development is perhaps (along with that of HIV) the most detailed biography of any pathogen known to science.

The plague bacterium, in its most destructive form, is about 3,000 years old. It evolved in Central Asia as a rodent disease; humans were accidental victims. From the germ’s point of view, people make poor hosts, because we die quickly and are usually a terminus, not a transmitter. The plague is spread principally by the bite of fleas, and a few thousand years ago, the bacterium acquired a genetic mutation that made it ferociously effective at spreading. This adaptation improved the plague’s biological fitness, which, for rodents – and the humans who live near them – has proven to be a nightmare.

Thanks to new genomic evidence, we can say with greater confidence how long this nightmare has been recurring. One of the most surprising and solidly confirmed findings in recent years has been the prevalence of plague in samples from Stone Age and Bronze Age societies in Europe and Central Asia. While it remains unclear what role plague played in the failure of those societies, it is reasonable to assume that the disease has long influenced human history.

What is now beyond question is that Yersinia pestis was indeed the pathogen responsible for two of the most destructive pandemics ever. The Black Death, which lives on in popular imagination to this day, arrived from Central Asia in the 1340s, and in the space of a few years, wiped out roughly half of the population in the regions it struck. The disease then lingered for a few more centuries, killing many more.

But this entire episode is properly known as the “second pandemic.” The first pandemic began in AD 541, during the reign of the Roman Emperor Justinian. The outbreak is known as the Justinianic plague, and, like the Black Death, it cut a swath of destruction from inner Asia to the shores of the Atlantic in the space of a few years. Total mortality was in the tens of millions, and stupefied contemporaries were certain they were living on the verge of the last judgment.

As with the Black Death, later historians questioned whether a rodent disease could cause destruction on such a scale. But in recent years, the pathogen’s genetic traces have been found in sixth-century graves, and the DNA evidence convicts Yersinia pestis of this ancient mass murder as definitively as it would in a modern courtroom. The plague triggered a demographic crisis that helped to topple the Romans’ “eternal empire.”

Plague pandemics were events of mind-boggling ecological intricacy. They involved a minimum of five species, in perilous alignment: the bacterium itself, the reservoir host such as marmots or gerbils, the flea vector, the rodent species in close quarters with humans, and the human victims.

The germ first had to leave its native Central Asia. In the case of the Justinianic plague, it seems to have done so by exploiting the shipping networks in the Indian Ocean. Once within the Roman Empire, it found an environment transformed by human civilization, along with massive colonies of rodents fattened on the ancient world’s ubiquitous granaries. Human expansion helped rodents prosper, and rat infestations, in turn, intensified and prolonged the plague’s outbreak.

There is tantalizing evidence that climate change also played a role in triggering the first pandemic. Just a few years before the appearance of the plague on Roman shores, the planet experienced one of the most abrupt incidents of climate change in the last few thousand years. A spasm of volcanic explosions – in AD 536, when historians reported a year without summer, and again in AD 539-540 – upset the global climate system. The precise mechanisms by which climate events fueled plague remain contested, but the link is unmistakable, and the lesson is worth underscoring: the complex relationship between climate and ecosystems impacts human health in unexpected ways.

The plague in Madagascar today is an offshoot of what is known as the “third plague pandemic,” a global dispersion of Yersinia pestis that radiated from China in the late nineteenth century. There still is no vaccine; while antibiotics are effective if administered early, the threat of antimicrobial resistance is real.

That may be the deepest lesson from the long history of this scourge. Biological evolution is cunning and dangerous. Small mutations can alter a pathogen’s virulence or its efficiency of transmission, and evolution is relentless. We may have the upper hand over plague today, despite the headlines in East Africa. But our long history with the disease demonstrates that our control over it is tenuous, and likely to be transient – and that threats to public health anywhere are threats to public health everywhere.

Kyle Harper, a professor of classics and letters at the University of Oklahoma, is author of The Fate of Rome: Climate, Disease, and the End of an Empire.

By Kyle Harper

Saving Somalia Through Debt Relief

LONDON – Julius Nyerere, the first president of Tanzania, once asked his country’s creditors a blunt question: “Must we starve our children to pay our debts?” That was in 1986, before the public campaigns and initiatives that removed much of Africa’s crushing and unpayable debt burden. But Nyerere’s question still hangs like a dark cloud over Somalia.

Over the last year, an unprecedented humanitarian effort has pulled Somalia back from the brink of famine. As the worst drought in living memory destroyed harvests and decimated livestock, almost $1 billion was mobilized in emergency aid for nutrition, health, and clean water provision. That aid saved many lives and prevented a slow-motion replay of the 2011 drought, when delayed international action resulted in nearly 260,000 deaths.

Yet, even after these recent efforts, Somalia’s fate hangs in the balance. Early warning systems are pointing to a prospective famine in 2018. Poor and erratic rains have left 2.5 million people facing an ongoing food crisis; some 400,000 children live with acute malnutrition; food prices are rising; and dry wells have left communities dependent on expensive trucked water.

Humanitarian aid remains essential. Almost half of Somalia’s 14 million people need support, according to UN agencies. But humanitarian aid, which is often volatile and overwhelmingly short-term, will not break the deadly cycles of drought, hunger, and poverty. If Somalia is to develop its health and education systems, economic infrastructure, and the social protection programs needed to build a more resilient future, it needs predictable, long-term development finance.

Debt represents a barrier to that finance. Somalia’s external debt is running at $5 billion. Creditors range from rich countries like the United States, France, and Italy, to regional governments and financial institutions, including the Arab Monetary Fund.

But Somalia’s debt also includes $325 million in arrears owed to the International Monetary Fund. And there’s the rub: countries in arrears to the IMF are ineligible to receive long-term financing from other sources, including the World Bank’s $75 billion concessional International Development Association (IDA) facility.

Much of the country’s current debt dates to the Cold War, when the world’s superpower rivalry played out in the Horn of Africa. Over 90% of Somalia’s debt burden is accounted for by arrears on credit advanced in the early 1980s, well before two-thirds of today’s Somali population was born.

Most of the lending then was directed to President Siad Barre as a reward for his abandonment of the Soviet Union and embrace of the West. Military credits figured prominently: over half of the $973 million in US debt is owed to the Department of Defense. Somalia got state-of-the-art weaponry, liberally financed by loans. The IMF was nudged into guaranteeing repayment through a structural adjustment program. Repaying the debt today would cost every Somali man, woman, and child $361.

None of this would matter if Somalia had qualified for debt reduction. The Heavily Indebted Poor Countries Initiative (HIPC), created in response to the great debt relief campaigns of the 1990s, has written off around $77 billion in debt for 36 countries. Somalia is one of just three countries that have yet to qualify. The reason: the arrears owed to the IMF. (Eritrea and Sudan have also not qualified, for similar reasons.)

The IMF view is that Somalia, like earlier HIPC beneficiaries, should establish a track record of economic reform. This will delay a full debt write-off for up to three years, exclude Somalia from long-term development finance, and reinforce its dependence on emergency aid. Other creditors have endorsed this approach through silent consent.

Somalia deserves better. President Mohamed Abdullahi Mohamed’s government has demonstrated a commitment to economic reform, improved accountability, and transparency. For two years, it has adhered to an IMF program, achieving targets for improving public finance and the banking sector. More needs to be done, especially in terms of domestic resource mobilization. But this is the first Somali government to provide the international community with a window of opportunity to support recovery. We must capitalize on it.

Waiting three more years as Somalia ticks the IMF’s internal accounting boxes would be a triumph of bureaucratic complacency over human needs. Without international support, Somalia’s government lacks the resources needed to break the deadly cycle of drought, hunger, and poverty.

Somalia’s children need investment in health, nutrition, and schools now, not at some point in the indefinite future. Investing in irrigation and water management would boost productivity. With drought-related livestock and crop losses estimated at around $1.5 billion, government-supported cash payment programs would help aid recovery, strengthen resilience, and build trust.

The benefits of these investments would extend to security. Providing the hope that comes with education, health care, and the prospect of a job is a far more effective weapon than a drone against an insurgency that feeds on despair, poverty, joblessness, and the absence of basic services.

There is an alternative to IMF-sponsored inertia on debt relief. The World Bank and major creditors could convene a creditor summit to agree to terms for a prompt debt write-off. More immediately, the World Bank could seek its shareholders’ approval for a special mechanism – a “pre-arrears clearance grant” – that would enable Somalia to receive IDA financing. There is a precedent for this: In 2005, the US championed World Bank financing for Liberia, which at the time had significant IMF debt after emerging from civil war.

The technicalities can be discussed and the complexities resolved. But we should not lose sight of what is at stake. It is indefensible for the IMF and other creditors to obstruct Somalia’s access to financing because of arrears on a debt incurred three decades ago as much through reckless lending as through irresponsible borrowing.

Somalia’s children played no part in creating that debt. They should not have to pay for it with their futures.

Kevin Watkins is CEO of Save the Children UK.

By Kevin Watkins


Banking on African Infrastructure

JOHANNESBURG – As the US Federal Reserve embarks on the “great unwinding” of the stimulus program it began nearly a decade ago, emerging economies are growing anxious that a stronger dollar will adversely affect their ability to service dollar-denominated debt. This is a particular concern for Africa, where, since the Seychelles issued its debut Eurobond in 2006, the total value of outstanding Eurobonds has grown to nearly $35 billion.

But if the Fed’s ongoing withdrawal of stimulus has frayed African nerves, it has also spurred recognition that there are smarter ways to finance development than borrowing in dollars. Of the available options, one specific asset class stands out: infrastructure.

Africa, which by 2050 will be home to an estimated 2.6 billion people, is in dire need of funds to build and maintain roads, ports, power grids, and so on. According to the World Bank, Africa must spend a staggering $93 billion annually to upgrade its current infrastructure; the vast majority of these funds – some 87% – are needed for improvements to basic services like energy, water, sanitation, and transportation.

Yet, if the recent past is any guide, the capital needed will be difficult to secure. Between 2004 and 2013, African states closed just 158 financing deals for infrastructure or industrial projects, valued at $59 billion – just 5% of the total needed. Given this track record, how will Africa fund even a fraction of the World Bank’s projected requirements?
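As a rough sanity check on these figures, the gap can be sketched in a few lines, assuming the World Bank's $93 billion is an annual requirement applied over the same ten-year window (the exact basis for the "5%" figure is not stated, but this assumption lands close to it):

```python
# Back-of-the-envelope check of Africa's infrastructure funding gap,
# using the figures cited above (amounts in billions of US dollars).
annual_requirement = 93      # World Bank estimate, per year
years = 10                   # the 2004-2013 window
deals_closed_value = 59      # total value of the 158 deals closed

total_needed = annual_requirement * years
share_funded = deals_closed_value / total_needed

print(f"Needed over {years} years: ${total_needed} billion")
print(f"Actually financed: ${deals_closed_value} billion "
      f"({share_funded:.0%} of the requirement)")
```

Under this assumption the financed share comes out in the mid-single digits, consistent with the roughly 5% cited above.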

The obvious source is institutional and foreign investment. But, to date, many factors, including poor profit projections and political uncertainty, have limited such financing for infrastructure projects on the continent. Investment in African infrastructure is perceived as simply being too risky.

Fortunately, with work, this perception can be overcome, as some investors – such as the African Development Bank, the Development Bank of Southern Africa, and the Trade & Development Bank – have already demonstrated. Companies from the private sector are also profitably financing projects on the continent. For example, Black Rhino, a fund set up by Blackstone, one of the world’s largest multinational private equity firms, focuses on the development and acquisition of energy projects, such as fuel storage, pipelines, and transmission networks.

But these are the exceptions, not the rule. Fully funding Africa’s infrastructure shortfall will require attracting many more investors – and swiftly.

To succeed, Africa must develop a more coherent and coordinated approach to courting capital, while at the same time working to mitigate investors’ risk exposure. Public-private sector collaborations are one possibility. For example, in the energy sector, independent power producers are working with governments to provide electricity to 620 million Africans living off the grid. Privately funded but government regulated, these producers operate through power purchase agreements, whereby public utilities and regulators agree to purchase electricity at a predetermined price. There are approximately 130 such producers in Sub-Saharan Africa, valued at more than $8 billion. In South Africa alone, 47 projects are underway, accounting for 7,000 megawatts of additional power production.

Similar public-private partnerships are emerging in other sectors, too, such as transportation. Among the most promising are toll roads built with private money, a model that began in South Africa. Not only are these projects, which are slowly appearing elsewhere on the continent, more profitable than most financial market investments; they are also literally paving the way for future growth.

Clearly, Africa needs more of these ventures to overcome its infrastructure challenges. That is why I, along with other African business leaders and policymakers, have called on Africa’s institutional investors to commit 5% of their funds to local infrastructure. We believe that with the right incentives, infrastructure can be an innovative and attractive asset class for those with long-term liabilities. One sector that could lead the way on this commitment is the continent’s pension funds, which, together, possess a balance sheet of about $3 trillion.
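The scale of that commitment is easy to work out from the figures above (a minimal sketch, assuming the roughly $3 trillion pension balance sheet cited):

```python
# Rough scale of the 5% Agenda commitment, using the figures above
# (amounts in US dollars).
pension_assets = 3_000_000_000_000   # ~$3 trillion across African pension funds
target_share = 0.05                  # the 5% Agenda commitment

potential_capital = pension_assets * target_share
print(f"Potential infrastructure capital: ${potential_capital / 1e9:.0f} billion")
```

That is, a 5% allocation from pension funds alone would mobilize on the order of $150 billion, more than the total value of all infrastructure deals closed between 2004 and 2013.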

The 5% Agenda campaign, launched in New York last month, underscores the belief that only a collaborative public-private approach can redress Africa’s infrastructure shortfall. For years, a lack of bankable projects deterred international financing. But in 2012, the African Union adopted the Program for Infrastructure Development in Africa, which kick-started more than 400 energy, transportation, water, and communications projects. It was a solid start – one that the 5% Agenda seeks to build upon.

But some key reforms will be needed. A high priority of the 5% Agenda is to assist in updating the national and regional regulatory frameworks that guide institutional investment in Africa. Similarly, new financial products must be developed to give asset owners the ability to allocate capital directly to infrastructure projects.

Unlocking new pools of capital will help create jobs, encourage regional integration, and ensure that Africa has the facilities to accommodate the needs of future generations. But all of this depends on persuading investors to put their money into African projects. As business leaders and policymakers, we must ensure that the conditions for profitability and social impact are not mutually exclusive. When development goals and profits align, everyone wins.

Ibrahim Assane Mayaki, a former Prime Minister of Niger, is CEO of the New Partnership for Africa’s Development (NEPAD) Planning and Coordinating Agency.


The Greening of the Miners

LONDON – Donald Trump’s presidency in the United States has turned mining – and the coal industry in particular – into a political cause célèbre over the last year. In June, during his first White House cabinet meeting, Trump suggested that his energy policies were putting miners back to work and transforming a troubled sector of the economy.

But Trump is mistaken to think that championing the cause of miners and paying respect to a difficult profession will be sufficient to make mining sustainable. To achieve that, a far more complex set of interdependencies must be navigated.

Debates about mining and the environment are often framed in terms of a “nexus” between extraction of a resource and the introduction of other resources into the extraction process. The forthcoming Routledge Handbook of the Resource Nexus, which I co-edited, defines the term as the relationship between two or more naturally occurring materials that are used as inputs in a system that provides services to humans. In the case of coal, the “nexus” is between the rock and the huge amounts of water and energy needed to mine it.

For decision-makers, understanding this linkage is critical to effective resource and land-use management. According to research from 2014, there is an inverse relationship between the grade of ore and the amount of water and energy used to extract it. In other words, misreading how inputs and outputs interact could have profound environmental consequences.

Moreover, because many renewable energy technologies are built with mined metals and minerals, the global mining industry will play a key role in the transition to a low-carbon future. Photovoltaic cells may draw energy from the sun, but they are manufactured from cadmium, selenium, and tellurium. The same goes for wind turbines, which are fashioned from copious amounts of cobalt, copper, and rare-earth oxides.

Navigating the mining industry’s resource nexus will require new governance models that can balance extraction practices with emerging energy needs – like those envisioned by the UN’s Sustainable Development Goals (SDGs). Value creation, profit maximization, and competitiveness must also be measured against the greater public good.

Some within the global mining industry have recognized that the winds are changing. According to a recent survey of industry practices by CDP, a non-profit energy and environmental consultancy, mining companies from Australia to Brazil are beginning to extract resources while reducing their environmental footprint.

Nonetheless, if the interests of the public, and the planet, are to be protected, the world cannot rely on the business decisions of mining companies alone. Four key changes are needed to ensure that the industry’s greening trend continues.

First, mining needs an innovation overhaul. Declining ore grades require the industry to become more energy- and resource-efficient to remain profitable. And, because water scarcity is among the top challenges facing the industry, eco-friendly solutions are often more viable than conventional ones. In Chile, for example, copper mines have been forced to start using desalinated water for extraction, while Sweden’s Boliden sources up to 42% of its energy needs from renewables. Mining companies elsewhere can learn from these examples.

Second, product diversification must start now. With the Paris climate agreement a year old, the transformation of global fossil-fuel markets is only a matter of time. Companies with a large portfolio of fossil fuels, like coal, will soon face severe uncertainty related to stranded assets, and investors may change their risk assessments accordingly.

Large mining companies can prepare for this shift by moving from fossil fuels to other materials, such as iron ore, copper, bauxite, cobalt, rare earth elements, and lithium, as well as mineral fertilizers, which will be needed in large quantities to meet the SDGs’ targets for global hunger eradication. Phasing out coal during times of latent overproduction might even be done at a profit.

Third, the world needs a better means of assessing mining’s ecological risks. Although the industry’s environmental footprint is smaller than that of agriculture and urbanization, extracting materials from the ground can still permanently harm ecosystems and lead to biodiversity loss. To protect sensitive areas, greater global coordination is needed in the selection of suitable mining sites. Integrated assessments of subsoil assets, groundwater, and biosphere integrity would also help, as would guidelines for sustainable resource consumption.

Finally, the mining sector must better integrate its value chains to create more economic opportunities downstream. Establishing models of material flows – such as the ones existing for aluminum and steel – and linking them with “circular economy” strategies, such as waste reduction and reuse, would be a good start. A more radical change could come from a serious engagement in markets for secondary materials. “Urban mining” – the salvaging, processing, and delivery of reusable materials from demolition sites – could also be better integrated into current core activities.

The global mining industry is on the verge of transforming itself from fossil-fuel extraction to supplying materials for a greener energy future. But this “greening” is the result of hard work, innovation, and a complex understanding of the resource nexus. Whatever America’s coal-happy president may believe, it is not the result of political platitudes.

Raimund Bleischwitz is Deputy Director of the University College London Institute for Sustainable Resources.


Sounding the Alarm on Biodiversity Loss

NORWICH – With the United Nations’ climate change conference underway in Bonn, Germany, rising global temperatures are once again at the top of the world’s agenda. But why care about the increase in temperature, if not because of its impact on life on Earth, including human life?

That is an important question to consider, in view of the relative lack of attention devoted to a closely related and equally important threat to human survival: the startling pace of global biodiversity loss.

The availability of food, water, and energy – fundamental building blocks of every country’s security – depends on healthy, robust, and diverse ecosystems, and on the life that inhabits them. But, as a result of human activities, planetary biodiversity is now declining faster than at any point in history. Many policymakers, however, have yet to recognize that biodiversity loss is just as serious a threat as rising sea levels and increasingly frequent extreme weather events.

This lack of sufficient attention comes despite international commitments to protect biodiversity. In October 2010, global leaders met in Aichi, Japan, where they produced the Strategic Plan for Biodiversity 2011-2020, which included 20 ambitious targets – such as halving global habitat loss and ending overfishing – that signatories agreed to meet by 2020. Safeguarding biodiversity is also specifically included in the UN’s Sustainable Development Goals. Yet progress toward these global biodiversity goals is likely to fall dangerously short of what is needed to ensure an acceptable future for all.

Policymakers have largely agreed on the importance of holding the increase in global temperature to less than 2°C above pre-industrial levels – the goal of the Paris climate agreement. But too few leaders have shown any sense of urgency about stemming biodiversity losses. The sustainable future we want depends on ending this indifference.

Toward that end, the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES), which I chair, will release a series of landmark reports next March on the implications of biodiversity decline. Prepared over three years by more than 550 experts from some 100 countries, these expert assessments will cover four world regions: the Americas, Asia and the Pacific, Africa, and Europe and Central Asia. A fifth report will address the state of land degradation and restoration at regional and global levels.

The reports will highlight trends and plausible futures, outlining the best policy options available to slow the degradation of ecosystems, from coral reefs to rainforests. Taken together, the IPBES assessments will represent the global scientific community’s consensus view on the state of biodiversity and ecosystem services.

Moreover, the reports will highlight the close links between biodiversity loss and climate change, which should be addressed simultaneously. The world will not be able to meet the goals of the Paris agreement – or many of the SDGs, for that matter – unless it takes into account the state of biodiversity and ecosystem services.

Today, most governments separate their environmental authorities from those focusing on energy, agriculture, and planning. This makes it difficult to address climate change or biodiversity losses in a holistic way. New, innovative governance structures are needed to bridge these policy silos.

After the release of IPBES regional reports next year, a global assessment building on them will be published in 2019. This will be the first global overview of biodiversity and ecosystem services since the authoritative Millennium Ecosystem Assessment of 2005. It will examine the health of terrestrial, freshwater, and marine ecosystems, and the impact of factors including acidification, rising sea surface temperatures, trade, invasive species, overfishing, pollution, and land use changes.

The success of efforts to reverse unsustainable uses of the world’s natural assets will require policymakers to reconsider the value of biodiversity for their people, environments, and economies. But the first step is ensuring that we have the best peer-reviewed knowledge available to make sound decisions; the forthcoming IPBES assessments will move us in that direction.

If the full consequences of climate change are to be addressed in our lifetime, we must recognize that human activity is doing more than just adding a few degrees of temperature to the annual forecast. By early next year, we will have the data on biodiversity and ecosystem services to prove it, and the policy options to change course.

Robert Watson, Strategic Director of the Tyndall Center for Climate Change Research at the University of East Anglia, is Chair of the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES).


How to Boost Access to Essential Medicines

DÜSSELDORF – Around the world, health security is increasingly being recognized as the foundation of economic growth. Healthy populations are better able to produce, trade, and innovate, while unhealthy populations strain public budgets and create risks that discourage economic exchange. This logic is written into countless European Union reports, and is even gaining traction in the United States, despite the “America First” approach to international affairs embraced by President Donald Trump’s administration.

Against this backdrop, the World Health Organization (WHO), under its new Director-General, Tedros Ghebreyesus, has a unique opportunity to pursue urgently needed reforms. The WHO’s response to the 2014-2016 Ebola outbreak in West Africa was roundly judged a failure. And with the emergence of new diseases such as Zika – and the revival of old foes like bubonic plague – there is no question that much of humanity remains at the mercy of biology. Moreover, globalization has compounded the danger by facilitating the spread of communicable diseases. A flu outbreak like that of 1918-1920, which killed between 50 and 100 million people, would be even more devastating today.

To prevent such catastrophic outcomes, we need a comprehensive approach for strengthening health-care delivery in low- and middle-income countries. In particular, these countries need help improving drug delivery and managing chronic diseases such as cancer and diabetes, which impose an immense burden on their economies.

Unfortunately, the WHO’s leadership, like much of the West, has not pursued this course of action, because it has been distracted by an ideological obsession with drug prices. But drug prices are a vanishingly small part of the problem in countries struggling to build healthy, productive societies. Of all the drugs on the WHO’s “Essential Medicines” list, 95% are already off-patent, meaning that cheaper generic versions are available worldwide.

In cases where drugs aren’t reaching people who need them, the reason is not high prices, but rather dysfunctional health systems. Fortunately, public-health analysts have identified a handful of structural reforms that would largely eliminate existing bottlenecks that are hindering the distribution of essential medicines.

The first trouble spot is infrastructure. More than half of the world’s rural population lacks access to basic health care, compared to about a fifth of the urban population. Inadequate and unreliable transportation networks make accessing health-care services costly and time-consuming, and impede drug deliveries from supply centers. With better roads and more fully developed transportation systems, emerging economies could boost not just health outcomes, but also economic and educational opportunities.

A second problem, even in areas with adequate infrastructure, is the prevalence of bureaucratic and economic barriers that limit access to essential medicines. According to a 2008 study of 36 developing countries, tortuous registration and approval processes create frequent shortages of 15 of the most commonly used generic medicines. For example, in South Africa it can take up to five years for new medicines to reach the market, owing to that country’s drug-registration and labeling regulations. By streamlining drug-approval processes, removing tariffs, and simplifying customs procedures, many countries could immediately increase the availability of dozens of essential medicines.

A third problem is that there are too few health-care workers. In many low- and middle-income countries, patients cannot get the drugs they need simply because there are no doctors or nurses to prescribe them, nor pharmacists to dispense them. According to the WHO, the world suffers from a deficit of some seven million health-care professionals; by 2035, that number is expected to reach 13 million. Making matters worse, there is an even greater shortfall of specialists equipped to treat chronic diseases such as diabetes, which is spreading rapidly through the developing world, owing to changing diets and habits.

Flawed – or nonexistent – health-finance schemes are the fourth, and perhaps the largest, barrier to drug delivery in many countries. Even when generic-brand essential medicines are available, they often are unaffordable for low-income patients in countries with scant state subsidies and no risk-pooling insurance mechanisms. By one estimate, almost 90% of people in low- and middle-income countries will face impoverishment if they have to pay out of pocket for a single commonly used generic drug.

Some of the countries that are most eager to strip patent protections are also notorious for skimping on health-care expenditures. The Indian government, for example, spends just around 1% of GDP on health care, well below the 5% needed to move toward universal health coverage. But expropriating drug makers’ intellectual property will do nothing to improve outcomes where crucial safety nets are missing.

Increasing the availability of essential medicines is imperative for improving health-care outcomes for hundreds of millions of people around the world. When societies are not burdened by disease, they can focus on boosting productivity, consumption, and trade. At the same time, neglecting the threat posed by communicable diseases in the developing world invites catastrophe not just in those countries, but in developed economies, too. We can do much to close the global health-security gap; but undermining patent protections for new drugs will do precisely the opposite.

Justus Haucap is a professor of economics at Heinrich-Heine University.


A Formula for Health Equity

KIGALI – Imagine a country where some 90% of the population is covered by health insurance, more than 90% of those with HIV are on a consistent drug regime, and 93% of children are vaccinated against common communicable diseases including HPV. Where would you guess this enchanted land of medical equity is? Scandinavia? Costa Rica? Narnia?

Try Africa – Rwanda, to be precise. In my native country, health care is a right guaranteed for all, not a privilege reserved for the rich and powerful. Rwanda remains poor, but, over the past 15 years, its health-care advances have gained global attention, for good reason. In 2000, life expectancy at birth was just 48 years; today, it’s 67. International aid has helped, but our achievements have come primarily from other, non-financial innovations.

For starters, Rwanda has established a collaborative, cluster approach to governance that allows us to achieve more with the same amount of funding. Moreover, our civil servants embrace problem solving, demonstrating a level of resourcefulness that has produced many localized solutions to human development challenges such as ensuring food security and adequate supplies of clean water and housing.

But perhaps the most important factor behind our dramatic health-care gains has been the national equity agenda, which sets targets for supporting the needy and tracks progress toward meeting them. Since implementing this approach, Rwanda has managed to decrease the percentage of people living in extreme poverty from 40% of the population in 2000 to 16.3% in 2015.

Aside from the obvious benefits, these gains matter because, as UNICEF recently noted, a country’s potential return on investment in social services for vulnerable children is two times greater when the benefits reach the most vulnerable. In other words, Rwanda has achieved so much so fast because we are enjoying higher rates of return by investing in the poorest.

In working toward health equity, Rwanda has made accessibility a top priority. As of 2016, nine out of ten Rwandans were enrolled in one of the country’s health insurance programs. The majority of the population is enrolled in the Community-Based Health Insurance (CBHI) scheme, which has increased access to health care for Rwanda’s most vulnerable citizens by waiving fees.

As a result, the reach of health-care coverage in Rwanda is high by global standards – all the more remarkable for a country that suffered the horrors of genocide a generation ago. Consider the situation in the US: while the rate of uninsured Americans has dropped precipitously under the 2010 Affordable Care Act, the insured face rapid increases in premiums and out-of-pocket expenses. Perhaps the US should consider adopting a CBHI-type program, to reduce further the number of Americans facing financial barriers to medical care.

Rwanda has crafted health care delivery with access in mind as well, by deploying community health workers (CHWs) to the country’s 15,000 villages. These local practitioners serve as the gatekeepers to a system that has reduced waiting times and financial burdens by treating patients directly – often at patients’ homes.

The US could also benefit from a CHW program. The US is brimming with educated young people who, as CHWs, could bridge the gap between medical facilities and patients, thereby improving American social capital and health outcomes. As Rwanda’s experience has demonstrated, such programs not only broaden access to health care; they also lower overall costs by reducing unnecessary hospitalizations.

Such programs have been shown to be transferable. Starting in 1997, Brigham and Women’s Hospital supported the HIV+ community of Boston through the Prevention and Access to Care and Treatment (PACT) program. That initiative was based on the CHW model implemented in rural Haiti by Partners In Health – a non-profit health-care organization that integrates CHWs into primary care and mental health.

As a result of that initiative, the government insurer Medicaid spent less money on hospital stays, and inpatient expenditures fell by 62%. Other US communities could, and should, incorporate similar models into their treatment programs for chronic conditions.

Innovation is what kick-started Rwanda’s health-care revival, and progressive thinking is what drives it forward today. For example, health centers established throughout the country provide vaccinations and treat illnesses that village-level CHWs cannot, and have extended obstetrics services to the majority of Rwandan women.

Broadening access further, each district in Rwanda has one hospital, and each region in the country has a referral or teaching hospital with specialists to handle more difficult cases. While some hospitals still suffer from staff shortages, the government has sought to patch these holes through an initiative that employs faculty from over 20 US institutions to assist in training our clinical specialists.

In just over two decades, thanks to homegrown solutions and international collaboration, Rwanda has dramatically reduced the burden of disease on its people and economy. As we look forward, our goal is to educate tomorrow’s leaders to build on the equitable health-care system that we have created. This is the mission of the University of Global Health Equity, a new university based in rural Rwanda that has made fairness, collaboration, and innovation its guiding principles.

As a Rwandan doctor who contributed to building my country’s health-care system from its infancy, I am proud of what we have accomplished in so short a time. It wasn’t magic; it was a formula. Through continued global cooperation, other countries, including developed ones, can learn to apply it.

Agnes Binagwaho, a former minister of health of Rwanda, is Vice Chancellor of the University of Global Health Equity. She is a 2017 inductee into the US National Academy of Medicine.

