Refugee Doctors for Refugee Health

TORONTO – Syrian refugees are often portrayed as an unwelcome drain on the communities to which they relocate, especially with regard to health care. But, for those escaping Syria’s civil war, ignorance of their plight is overshadowed only by the reality of their needs – and the diversity of their expertise. Although refugees do bring with them extensive health-care issues, they also bring years of experience in the medical profession that, if put to proper use, could be a boon to the communities that receive them, not to mention to other refugees.

One of the biggest challenges for refugees anywhere is finding a doctor. In many host countries, inadequate treatment is the result of xenophobia, language barriers, or insufficient supply of medical staff. This is especially true for Syrians, who are scattered across the Middle East, North Africa, Europe, and North America.

But many Syrian refugees are also highly educated. As they settle in places far from the hospitals and clinics in which they once practiced, Syria’s doctors simply want to get back to work. Isn’t it time that they did?

In the United Kingdom, efforts are underway to make that happen. The National Health Service and the British Medical Association have begun retraining refugee doctors, including many from Syria and Afghanistan, to fill the ranks of depleted clinics in the UK. Through English-language training, postgraduate study, and professional registration, programs in London, Lincolnshire, and Scotland aim to reintegrate refugee doctors into the medical profession. These efforts should be lauded.

Retraining refugee doctors is not only a moral exercise; it also makes practical sense. Displaced doctors are better able to treat refugee patients’ ailments. Refugee doctors can also help ensure that the flood of new patients does not overwhelm host countries’ health-care systems. And retraining a refugee doctor is cheaper and faster than educating a new medical student. With approximately 600 refugee doctors living in Britain, the well of untapped talent in the UK is deep.

Moreover, refugee patients benefit when treated by doctors who understand their circumstances, including the enormous psychosocial stress that displacement causes. Translators can help, but they are not always available in crisis settings. Doctors who understand refugees emotionally and culturally are better equipped to put patients at ease.

Britain is not alone in recognizing the potential of refugee doctors. In Turkey, Syrian doctors and nurses have received training to help them become familiar with the Turkish health-care system. The goal is to enable qualified Syrian professionals to treat refugee patients, thus mitigating the language and logistical barriers to effective, accessible, and dignified care.

But other host countries have not been as forward thinking. In Lebanon and Jordan, for example, where more than 1.6 million registered Syrian refugees currently live, efforts to allow Syrian doctors to care for refugee patients have been criminalized. Doctors ignoring the law face arrest and possible deportation. Even Canada, a country that generally welcomes diversity and values human rights, is behind the curve on innovative approaches to refugee health. Syrian doctors face “many, many years” of retraining in Canada, and often struggle to fund the high cost of recertification.

Amid this resistance, refugee health care should be viewed not merely as a set of logistical and operational challenges, but also as an inherently political process. Two dimensions of the issue must be addressed if refugee patients are ever to be properly cared for, and refugee doctors properly deployed.

For starters, refugee doctors may struggle to be accepted by their local colleagues, owing to political or personal bias. Recognizing the potential for local resistance to integration programs for refugee doctors is essential to developing proactive policies that ensure success.

Moreover, refugee doctors must be trained to address the diversity of medical needs they will face in their adoptive homes. For example, in many countries where refugees originate, lesbian, gay, bisexual, transgender, and intersex (LGBTI) health concerns remain taboo, even among medical professionals. For refugee doctors relocating to countries where LGBTI health and rights are recognized, integration curricula should include training on LGBTI health, particularly the rights of exceptionally vulnerable LGBTI refugees. Improving health for LGBTI refugees can serve as a foundation for a more open society.

The refugee crisis that has engulfed Syria is just one ripple in a tidal wave of global displacement. Around the world, some 22.5 million people are officially registered as refugees, and nearly 66 million have been forced from their homes. These numbers are unlikely to fall in the near term, as calamities caused by climate change, and by human and natural disasters, continue to push even more people from their communities.

Every one of these future refugees will need access, at some point, to medical professionals trained in refugee health, diversity, and inclusion. Empowering refugee doctors to become part of the solution will help overcome entrenched dogmas toward refugee diversity and social identities. But, just as important, it will mark a crucial step forward in ensuring more inclusive refugee health.

Vural Özdemir is a medical doctor, independent writer, and adviser on technology, society, and democracy.

By Vural Özdemir

China, the Digital Giant

SHANGHAI – China has firmly established itself as a global leader in consumer-oriented digital technologies. It is the world’s largest e-commerce market, accounting for more than 40% of global transactions, and ranks among the top three countries for venture capital investment in autonomous vehicles, 3D printing, robotics, drones, and artificial intelligence (AI). One in three of the world’s unicorns (start-ups valued at more than $1 billion) is Chinese, and the country’s cloud providers hold the world record for computing efficiency. While China runs a trade deficit in services overall, it has lately been running a trade surplus in digital services of up to $15 billion per year.

Powering China’s impressive progress in the digital economy are Internet giants like Alibaba, Baidu, and Tencent, which are commercializing their services on a massive scale, and bringing new business models to the world. Together, these three companies have 500-900 million active monthly users in their respective sectors. Their rise has been facilitated by light – or, perhaps more accurately, late – regulation. For example, regulators put a cap on the value of online money transfers a full 11 years after Alipay introduced the service.

Now, these Internet firms are using their positions to invest in China’s digital ecosystem – and in the emerging cadre of tenacious entrepreneurs that increasingly define it. Alibaba, Baidu, and Tencent together fund 30% of China’s top start-ups, such as Didi Chuxing ($50 billion), Meituan-Dianping ($30 billion), and another valued at $56 billion.

With the world’s largest domestic market and plentiful venture capital, China’s old “copy-cat” entrepreneurs have transformed themselves into innovation powerhouses. They fought like gladiators in the world’s most competitive market, learned to develop sophisticated business models (such as Taobao’s freemium model), and built impregnable moats to protect their businesses (for example, Meituan-Dianping created an end-to-end food app, including delivery).

As a result, the valuation of Chinese innovators is many times higher than that of their Western counterparts. Moreover, China leads the world in some sectors, from livestreaming (one example is a lip-syncing and video-sharing app) to bicycle sharing (Mobike and Ofo exceed 50 million rides per day in China, and are now expanding abroad).

Most important, China is at the frontier of mobile payments, with more than 600 million Chinese mobile users able to conduct peer-to-peer transactions with nearly no fees. China’s mobile-payment infrastructure – which already handles far more transactions than the third-party mobile-payment market in the United States – will become a platform for many more innovations.

As Chinese firms become increasingly technically capable, the country’s market advantage is turning into a data advantage – critical to support the development of AI. The Chinese firm Face++ recently raised $460 million, the largest amount ever for an AI company. DJI (a $14 billion consumer drone company), iFlyTek (a $14 billion voice recognition company), and Hikvision (a $50 billion video-surveillance company) are the world’s most valuable firms in their respective domains.

Another important developing trend in China is “online merging with offline” (OMO) – a trend that, along with AI, Sinovation Ventures is betting on. The physical world becomes digitized, with companies detecting a person’s location, movements, and identity, and then transmitting the data so that it can help shape online experiences.

For example, OMO stores will be equipped with sensors that can identify customers and discern their likely behavior as seamlessly as e-commerce websites do now. Similarly, OMO language learning will combine native teachers lecturing remotely, local assistants keeping the atmosphere fun, autonomous software correcting pronunciation, and autonomous hardware grading homework and tests. With China in a position to rebuild its offline infrastructure, it can secure a leading position in OMO.

Yet, even as China leads the way in digitizing consumer industries, business adoption of digital technologies has lagged. This may be about to change. New McKinsey Global Institute research finds that three digital forces – disintermediation (cutting out the middle man), disaggregation (separating processes into component parts), and dematerialization (shifting from physical to electronic form) – could account for (or create) 10-45% of the industry revenue pool by 2030.

Those actors that successfully capitalize on this shift are likely to be large enough to influence the global digital landscape, inspiring digital entrepreneurs far beyond China’s borders. Value will shift from slow-moving incumbents to nimble digital attackers, armed with new business models, and from one part of the value chain to another. Large-scale creative destruction will root out inefficiencies and vault China to a new echelon of global competitiveness.

China’s government has grand plans for the country’s future as a digital world power. The State Council-led Mass Entrepreneurship and Innovation Program has resulted in more than 8,000 incubators and accelerators. The government’s Guiding Fund program has provided a total of $27.4 billion to venture capital and private equity investors – a passive investment, but with special redemption incentives. The authorities are now mobilizing resources to invest $180 billion in building China’s 5G mobile network over the next seven years, and are supporting the development of quantum technology.

The State Council has also issued guidelines for developing AI technologies, with the goal of making China a global AI innovation center by 2030. Xiongan, now under construction, may be the first “smart city” designed for autonomous vehicles. In Guangdong Province, the government has set an ambitious target of 80% automation by 2020.

Such aspirations will inevitably disrupt the labor market, beginning with routine white-collar jobs (such as customer service and telemarketing), followed by routine blue-collar jobs (such as assembly line work), and finally affecting some non-routine jobs (such as driving or even radiology). Recent MGI research found that in a rapid-automation scenario, some 82-102 million Chinese workers would need to switch jobs.

Retraining the displaced will be a major challenge for China’s government, as will preventing the major digital players from securing innovation-stifling monopolies. But the government’s readiness to embrace the emerging digital age, pursuing supportive policies and avoiding excessive regulation, has already placed the country at a significant advantage.

Kai-Fu Lee is a co-founder and CEO of Sinovation Ventures, a leading venture capital firm investing in China and North America. Jonathan Woetzel is a Shanghai-based senior partner of McKinsey & Company and a director of the McKinsey Global Institute.

By Kai-Fu Lee and Jonathan Woetzel

Four Ways to Beat HIV/AIDS

PITTSBURGH – In the fight against HIV/AIDS, some stories illuminate the long road to global eradication more than others. In 2009, I heard one such story in Tanzania.

I was visiting a remote village when I spoke to a woman who knew that she was HIV-positive. She told me that the established health guidelines at the time indicated that she could not receive treatment until her count of CD4 T-helper cells, a type of white blood cell used by the immune system, had dropped below a certain threshold.

After walking several miles to get her count checked, she arrived at the clinic only to find its testing machine broken. The machine was still inoperative the second time she made the long journey. Only months later, after her third trip to the clinic on foot, did she receive her cell count: her levels were far below the necessary threshold. Her treatment should have begun months before.

Since HIV/AIDS was first identified in 1984, it has killed more than 35 million people. Although the number of AIDS-related deaths has fallen by almost half since peaking in 2005, there are still far too many people dying from this preventable condition. In 2016 alone, one million people around the world died from HIV-related causes, while 1.8 million more became infected. Contrary to popular myth, we have not turned the corner on AIDS – not by a long shot.

World AIDS Day, on December 1, is an occasion to honor the millions of victims, and to recommit to ending this devastating disease. According to UNAIDS, just 54% of HIV-positive adults, and only 43% of HIV-positive children, are currently receiving the antiretroviral therapies that save lives and prevent new infections. With so many untreated patients, the virus will continue to spread.

As CEO of a global pharmaceutical company, I’m proud of the work we have done to fight HIV/AIDS around the world. Today, more than eight million people – nearly half of all patients receiving treatment for HIV in developing countries – depend on the antiretroviral treatments that we produce.

But for those of us on the front lines of this struggle, our work is far from over. The pharmaceutical industry has a responsibility to expand access to testing and treatment, and to help stop the spread of HIV once and for all. Fulfilling four key commitments will make this goal achievable.

For starters, pharmaceutical companies should do more to increase the availability of low-cost, generic medicines. My company, Mylan, introduced the first generic once-daily pill for developing countries in 2009, and we have continually reduced its price to make it more accessible to more people. With this treatment alone, Mylan and other generic manufacturers save the US government, international donors, and national health programs more than $4.5 billion a year.

Still, treatment options could be expanded further. In September, Mylan announced a collaboration with UNAIDS, the Bill & Melinda Gates Foundation, the Clinton Health Access Initiative, and other partners to provide the next-generation single-pill HIV regimen to patients in more than 90 low- and middle-income countries for less than $75 per year. These drugs are widely used in high-income countries because they produce fewer side effects. Affordability initiatives like this one should be replicated.

Next, drug makers must continue investing in capacity and supply-chain reliability. Since 2005, the number of people on antiretroviral therapies worldwide has grown by a factor of ten, to 21 million. But roughly twice as many people are currently infected with HIV. Over the last decade, Mylan has invested more than $250 million in expanding production capacity, and we now produce four billion tablets and capsules each year. But further investments are needed if we are to provide access to the other 21 million people still not on treatment.

A third urgently needed commitment is to increase support for research that accelerates the development of new innovations in effective and efficient treatment delivery. For example, Mylan provides study medications to research trials, like the MaxART trial in Swaziland, which demonstrated that providing treatment to all HIV-positive people is the best way to slow the disease’s spread. We also supported the Kirby Institute’s ENCORE1 trial, to develop a reduced-dose version of the most commonly used HIV treatment regimen. And we are currently working with the US Agency for International Development as part of a partnership called OPTIMIZE, which aims to accelerate access to new therapies.

We do not support trials like these because we hope to gain any marketable intellectual property – we won’t. Rather, we support them because it is the right way to advance science and improve treatment.

Finally, real gains in the fight against HIV/AIDS will require drug makers to account for the limitations of health-care systems and distribution networks in the developing countries they serve.

Antiretroviral therapies for children are a good example of these challenges. Drugs for young people produced in the West are often liquids that require refrigeration. But developing countries often have limited cold-storage capacity and lack the means to transport liquids in bulk. That’s why Mylan has developed heat-stable, taste-masked, dispersible tablets that can easily be incorporated into food. Our scientists are now working on the next-generation formula, which comes in the equivalent of a sugar packet that even newborns can take. More innovations like these will be needed to solve the country-specific issues that patients face.

The global health community has made remarkable progress in turning the tide on HIV/AIDS, introducing new products and advocating for earlier treatment. But when I think back to the woman I met in Tanzania, I am reminded of how much work remains to be done. Makers of generic medicines have an important role to play in this fight, and we will not stop working until treatment is available to every patient in the world who needs it.

Heather Bresch is CEO of Mylan, a global pharmaceutical company that specializes in prescription generic and brand-name medicines, and over-the-counter (OTC) offerings.

By Heather Bresch

Evidence-Based Policy Mistakes

TURIN – After years of stressing the importance of evidence-based policymaking, economists have clearly had some influence on politicians. What economists now need to do is to impress upon those same politicians that citing any evidence before adopting any policy is not evidence-based policymaking at all.

Turkish President Recep Tayyip Erdoğan has thrown around numbers to defend his decision to flood the Turkish economy with state-guaranteed credit. But the truth is that the policy was a politically motivated effort to win public support by engineering short-term growth (at the cost of driving inflation to a nine-year high of 12%).

Likewise, US President Donald Trump cites simplistic trade-deficit figures to justify protectionist policies that win him support among a certain segment of the US population. In reality, the evidence suggests that such policies will hurt the very people Trump claims to be protecting.

Now, the chair of Trump’s Council of Economic Advisers, Kevin Hassett, is attempting to defend Congressional Republicans’ effort to slash corporate taxes by claiming that, when developed countries have done so in the past, workers gained “well north of” $4,000 per year. Yet there is ample evidence that the benefits of such tax cuts accrue disproportionately to the rich, largely via companies buying back stock and shareholders earning higher dividends.

It is not clear whence Hassett is getting his data. But chances are that, at the very least, he is misinterpreting it. And he is far from alone in failing to reach accurate conclusions when assessing a given set of data.

Consider the oft-repeated refrain that, because there is evidence that virtually all jobs over the last decade were created by the private sector, the private sector must be the most effective job creator. At first glance, the logic might seem sound. But, on closer examination, the statement begs the question. Imagine a Soviet economist claiming that, because the government created virtually all jobs in the Soviet Union, the government must be the most effective job creator. To find the truth, one would need, at a minimum, data on who else tried to create jobs, and how.

Moreover, it is important to recognize that data alone are not enough to determine future expectations or policies. While there is certainly value in collecting data (via, for example, randomized controlled trials), there is also a need for deductive and inductive reasoning, guided by common sense – and not just on the part of experts. By dismissing the views and opinions of ordinary people, economists may miss out on crucial insights.

People’s everyday experiences provide huge amounts of potentially useful information. While a common-sense approach based on individual experience is not the most “scientific,” it should not be dismissed out of hand. A meteorologist might detect a coming storm by plugging data from myriad sources – atmospheric sensors, weather balloons, radar, and satellites – into complex computer models. But that doesn’t mean that the sight of gathering clouds in the sky is not also a legitimate sign that one might need an umbrella – even if the weather forecast promises sunshine.

Intuition and common sense have been critical to our evolution. After all, had humans not been able to draw reasonably accurate conclusions about the world through experience or observation, we wouldn’t have survived as a species.

The development of more systematic approaches to scientific inquiry has not diminished the need for such intuitive reasoning. In fact, there are important and not obvious truths that are best deduced using pure reason.

Consider the Pythagorean Theorem, which establishes the relation among the three sides of a right triangle. If all conclusions had to be reached by combing through large data sets, Pythagoras, who is believed to have devised the theorem’s first proof, would have had to measure a huge number of right triangles. In any case, critics would likely argue that he had looked at a biased sample, because all of the triangles examined were collected from the Mediterranean region.

Inductive reasoning, too, is vital to reach certain kinds of knowledge. We “know” that an apple will not remain suspended in mid-air, because we have seen so many objects fall. But such reasoning is not foolproof. As Bertrand Russell pointed out, “The man who has fed the chicken every day throughout its life at last wrings its neck instead, showing that more refined views as to the uniformity of nature would have been useful to the chicken.”

Of course, many policymakers – not just the likes of Erdoğan and Trump – make bad decisions not because of a misunderstanding of the evidence, but because they prefer to pursue politically expedient measures that benefit their benefactors or themselves. In such cases, exposing the inappropriateness of their supposed evidence may be the only option.

But, for the rest, the imperative must be to advocate for a more comprehensive approach, in which leaders use “reasoned intuition” to draw effective conclusions based on hard data. Only then will the age of effective evidence-based policymaking really begin.

Kaushik Basu, former Chief Economist of the World Bank, is Professor of Economics at Cornell University and Nonresident Senior Fellow at the Brookings Institution.

By Kaushik Basu

The China Factor in Kenya and Zimbabwe

BEIJING – Ask anyone with a basic knowledge of Africa which country is better poised for success – Zimbabwe or Kenya – and he or she will undoubtedly answer “Kenya.” Events of the last week would seem to confirm that verdict.

On Monday, after Kenya’s Supreme Court upheld the reelection of President Uhuru Kenyatta in the country’s contested presidential election, the rule of law seemed to trump political violence for the first time in years. Zimbabwe, on the other hand, is without President Robert Mugabe for the first time in 37 years. And, although the country may be ecstatic now, its political future is far from certain.

But as a Kenyan living in China, one of the African continent’s most important development partners, I see one metric that tips the scale in Zimbabwe’s favor: its relationship with my adopted home. In fact, Zimbabwe’s economic and political ties to China could prove decisive for Africa’s perpetual underdog.

On paper, Kenya clearly has the edge. Although Zimbabwe has more natural resources and mineral wealth, it has far less land, and extreme poverty there is much more widespread. More than 70% of the country’s 16 million people live on less than $1.90 a day, compared to 46% of Kenya’s 48 million people. Moreover, as many as 90% of Zimbabweans are unemployed or underemployed, compared to 39% of Kenyans.

Even Kenya’s economic links to China might seem more impressive at first glance. Kenya and China have long cooperated on large infrastructure projects. A Chinese-funded railway between Nairobi and Mombasa, which opened earlier this year, is the latest example. Since 2000, China has offered Kenya $6.8 billion in loans for infrastructure projects, compared to $1.7 billion for Zimbabwe. And, because loan conditions often include a requirement to hire Chinese employees, Kenya had more than 7,400 at the end of 2015, while Zimbabwe had just over 950.

But in the competition for Chinese largesse, Kenya’s advantage over Zimbabwe ends there. Cumulative Chinese foreign direct investment since 2003 has reached nearly $7 billion in Zimbabwe, compared to $3.9 billion for Kenya. Year on year, more Chinese money is flowing to Zimbabwe as well.

Moreover, Zimbabwe’s trade balance with China is far superior to Kenya’s. In 2015, Kenya’s exports to China totaled $99 million, while it imported from China a staggering 60 times that amount. Even taking into account imports of materials tied to Chinese-built infrastructure, this is an exceptionally wide bilateral deficit.

Zimbabwe, on the other hand, despite its slow growth rate, exported $766 million worth of goods to China in 2015, and imported $546 million. Most surprisingly, Zimbabwe’s exports were not restricted to minerals and metals, as one might assume, but also included tobacco and cotton, products that are relatively more labor-intensive, meaning more job creation at home. And, while Zimbabwe has around 50 fewer registered Chinese companies than Kenya, Kenya’s economy is around 4.5 times the size of Zimbabwe’s – implying that the Chinese firms operating in Zimbabwe contribute relatively more to the country’s economy.

How has Zimbabwe achieved what looks like, at least from a numerical perspective, a more productive relationship with China than Kenya has?

Few beyond Mugabe and his close colleagues, including the country’s new president, Emmerson Mnangagwa, know for sure. But one way to make an educated guess is to compare both countries’ history of bilateral engagement with China.

Both Kenya and Zimbabwe have had two visits from Chinese heads of state during their post-colonial histories. Chinese President Jiang Zemin visited each country in 1996, while President Hu Jintao visited Kenya in 2006. China’s current president, Xi Jinping, visited Zimbabwe in 2015.

State visits in the other direction have been more uneven. Mugabe’s first visit to China was in 1980, just six months after independence; he made 13 more during his tenure, and high-level visits by other Zimbabwean officials were even more frequent, occurring roughly once every two years during Mugabe’s reign. Kenyan presidents, by contrast, traveled to China just six times during the same period, most recently in May 2017.

Zimbabwe’s leaders made the most of their visits to press for trade and military cooperation, and likely engaged directly with private Chinese companies. This has nurtured a culture of reciprocity. Just a few months ago, for example, a Chinese company approached my firm asking for advice about how to enter Zimbabwe’s health-care market. I have not yet fielded similar questions about gaining access to markets in Kenya.

China’s role in African economies has been criticized; but, as I have argued before, Chinese investment has also been a lifeline to many on the continent. From creating employment opportunities to providing direct investment in infrastructure, China has been a partner to Africa when many Western investors preferred to stay away.

How Kenya and Zimbabwe navigate their future relationships with China remains to be seen. Both countries have supported Xi’s signature Belt and Road Initiative, which, in theory, should increase their strategic value to China. Kenya’s return to political stability should also sustain, if not deepen, the country’s economic engagement with China.

Zimbabwe’s historic ties to China will be no less important. Following Mugabe’s resignation, China’s foreign ministry went out of its way to praise the “friendship between China and Zimbabwe,” and Mnangagwa can be expected to continue that relationship. The new president received military training in China, and paid an official visit as speaker of the parliament in 2001. There is even speculation that China was warned of the looming coup in Zimbabwe, if not consulted beforehand.

As Kenya and Zimbabwe navigate their political futures, much in both countries will no doubt change – one hopes for the better. Their ties with China will be a key metric in assessing their trajectory.

Hannah Ryder, a former head of policy and partnerships for the United Nations Development Program in China, is founder and CEO of Development Reimagined.

By Hannah Ryder

Capitalizing on Climate Unity

BONN – When Donald Trump was elected US president a year ago, some said the end of the Paris climate agreement was nigh. Yet, as the latest round of global climate talks in Bonn, Germany, has shown, the world’s political leaders are more committed to the deal than ever. This is good news, but the fact remains that countries’ commitments do not yet add up to enough to turn the tide – and our window of opportunity to act effectively on climate change is rapidly closing.

Trump’s decision to withdraw the United States – the world’s largest historical carbon dioxide emitter – from the Paris agreement dealt the accord a major blow. Many of America’s closest allies – including both of our countries, the Marshall Islands and Australia – were deeply disappointed by the move, which was shortsighted, for both America and the world.

But it is hard not to take heart from the fresh wave of global resolve Trump’s decision has unleashed, both globally and within the US itself. Almost every major US state, city, and company has now pledged to do more to ensure that their country can meet its commitments, despite the Trump administration’s opposition.

The fact that climate action is now the world’s biggest economic opportunity has certainly helped. According to the Trump administration’s own analysis, more than twice as many Americans are now working in the solar industry as in coal, oil, and gas combined. And earlier this year, the OECD indicated that we could boost global growth by 5% per year by 2050, simply by linking the climate and growth agendas.

There is no time to waste; climate change has already arrived. This year’s record-breaking drought in the Marshall Islands, apocalyptic storms in the Caribbean, and devastating floods in Bangladesh and the US demonstrate this.

As the United Nations Environment Programme recently reminded us, even if every country hits its existing 2030 emissions-reduction targets, we will be unable to limit warming to below 1.5° Celsius above preindustrial levels – the threshold, recognized in the Paris agreement, beyond which the impact of climate change becomes far greater. Even our chances of staying within the less ambitious – and more dangerous – 2°C limit will be slim.

To ignore this reality is to gamble with the existential future of many island countries, not to mention the prosperity of the global economy. Without a sharp rise in global ambition for emissions reductions by 2020, we will be unable to save the world’s most vulnerable countries. And if runaway climate change takes hold, no country will be immune to its effects.

Unfortunately, things will get a lot worse before they get better. That is why we must step up our efforts to boost our resilience to the climate effects we won’t be able to avoid, and address the associated security consequences.

In the meantime, we must urgently increase the ambition of our climate commitments. Fortunately, several upcoming events offer an opportunity to do just that. We need to seize that opportunity with both hands.

Next month, French President Emmanuel Macron will host a conference to mark the second anniversary of the Paris agreement. And next September, California Governor Jerry Brown will host his own summit to galvanize greater action by cities, companies, and other non-state actors. The biggest opportunity, however, will come in 2019, when UN Secretary-General António Guterres convenes world leaders in New York for the biggest climate gathering since the Paris talks.

We need to build an arc of ambition across these events that can, in the words of our friend Tony de Brum, the late Marshallese foreign minister and untiring climate warrior at the Paris conference, deliver a pathway to survival for the most vulnerable.

Some significant players are already going above and beyond their pledges. A number of others, including the Marshall Islands, are set to bring forward new targets by 2020, to augment their current targets, which reach only as far as 2025. Still others – including France, India, and New Zealand – have said informally that they are eager to do more.

The truth is that almost all countries have the capacity to do more, especially if the support is there and the opportunities are identified. The imperative now is to create the right political conditions both to motivate and facilitate action. As more countries signal their ability to increase the ambition of their commitments, still more will follow.

At the same time, we must ensure that every sector, as well as every country, does its fair share. This includes, for example, international shipping, which, if it were a country, would be the world’s sixth-largest emitter.

Next year’s “Talanoa Dialogue” – to be convened by Fiji, which last week became the first island state to chair UN climate talks – will help countries identify exactly how they can achieve the goals set in the Paris agreement. That dialogue, which countries should approach in good faith, must be a springboard for further action. To that end, the forthcoming Intergovernmental Panel on Climate Change report laying out pathways for keeping the temperature rise below the 1.5°C threshold will be crucial. The science remains key.

The Paris talks proved that political success is possible, if leaders are given the right platform, if civil society mobilizes behind them, and if the world acts in unison. To get the rest of the way to a sustainable future, we must apply this lesson again. The catchphrase at the Bonn conference was “further, faster, and together.” Our collective challenge is to translate a nice-sounding slogan into reality.

Hilda Heine is President of the Republic of the Marshall Islands. Kevin Rudd, the 26th prime minister of Australia, is currently President of the Asia Society Policy Institute.

By Hilda Heine and Kevin Rudd

Natural Solutions to Climate Change

OXFORD – In response to climate change, land is key. Today, agriculture, forestry, and other land uses account for roughly a quarter of global greenhouse-gas emissions. But adopting sustainable land management strategies could provide more than one-third of the near-term emission reductions needed to keep warming well below the target – 2°C above pre-industrial levels – set by the Paris climate agreement.

Conservation organizations like mine have long been working to balance the interaction between people and nature. But only recently have we fully grasped just how important land-use management is in addressing climate change. With the development of remote sensing, artificial intelligence, and biogeochemical modeling, we can better forecast outcomes, and develop strategies to manage and minimize adverse consequences.

Some of the most promising ways to mitigate climate change are what we call “natural climate solutions”: the conservation, restoration, and improved management of land, in order to increase carbon storage or avoid greenhouse-gas emissions in landscapes worldwide. The full potential of these solutions is detailed in a new study produced by my organization, the Nature Conservancy, and 15 other leading institutions.

Among the most important natural climate solutions is protecting “frontier forests” – pristine woodlands that serve as natural carbon sinks. Intact tropical and northern forests, as well as savannas and coastal ecosystems, store huge amounts of carbon accumulated over centuries. When these areas are disturbed, carbon is released. Preservation of frontier habitats also helps regulate water flows, reduces the risk of flooding, and maintains biodiversity.

Reforestation is another important natural solution. Globally, an estimated two billion hectares (4.9 billion acres) of land have been deforested or degraded. Because trees are the best carbon-capture-and-storage technology the world has, reversing these numbers would bring a significant reduction in global carbon levels. We estimate that the world could capture three gigatons of CO2 annually – equivalent to taking more than 600 million cars off the roads – simply by planting more trees.

A third category of natural solution is agricultural reform. From field to fork, the food sector is a major contributor to climate change through direct and indirect emissions, and by its often-negative effects on soil health and deforestation. Recognizing these risks, 23 global companies – including Nestlé, McDonald’s, Tesco, and Unilever – recently signed a commitment to halt deforestation in Brazil’s Cerrado savanna. The region, which covers a quarter of the country, has come under growing pressure from production of beef, soy, and other commodities, together with the associated infrastructure.

As the Cerrado pledge demonstrates, when governments and businesses come together to address land-use challenges, the impact is potent. Natural climate solutions have the potential to reduce CO2 emissions by an estimated 11.3 billion tons a year – equal to a complete halt in burning oil, according to our study. One recent study calculated that if Brazil reached zero deforestation by 2030, it would add 0.6% of GDP, or about $15 billion, to its economy. Communities also reap secondary benefits – such as rural regeneration, improved food and water security, and coastal resilience – when natural climate solutions are implemented.

Yet, despite the data supporting better land-use decision-making, something isn’t adding up. In 2016, the world witnessed a dramatic 51% increase in forest loss, equivalent to an area about the size of New Zealand. We need to buck this trend now, and help the world realize that land-use planning is not simply a conservation story.

Some countries are moving in the right direction. The Indian government, for example, has set aside $6 billion for states to invest in forest restoration. In Indonesia, the government created a dedicated agency to protect and restore peatlands – bogs and swamp-like ecosystems with immense CO2 storage capacity.

But they are the exceptions. Of the 160 countries that committed to implementing the Paris climate agreement, only 36 have specified land-use management in their emissions-reduction strategies.

Overcoming inertia will not be easy. Forests, farms, and coasts vary in size, type, and accessibility. Moreover, the lives of hundreds of millions of people are tied to these ecosystems, and projects that restore forest cover or improve soil health require focused planning, a massive undertaking for many governments.

One way to get things moving, especially in the agricultural sector, would be to remove or redirect subsidies that encourage excessive consumption of fertilizers, water, or energy in food production. As Indian government officials reminded their peers during a World Trade Organization meeting earlier this year, meaningful agricultural reforms can begin only when rich countries reduce the “disproportionately large” subsidies they give their own farmers.

Supporting innovation and entrepreneurship can also help power change. New processes and technologies in landscape planning, soil analysis, irrigation, and even alternative proteins such as plant-based meat are making agriculture and land use more sustainable. Similarly, changes in the construction industry, which is turning to more efficiently produced products like cross-laminated timber (CLT), can help reduce carbon pollution.

Finally, financing options for natural climate solutions must be dramatically increased. While payments to conserve forests are starting to flow under the UN’s REDD+ program, and the Green Climate Fund has committed $500 million for forest protection payments, total public investment in sustainable land use remains inadequate. According to the Climate Policy Initiative, public financing for agriculture, forestry, and land-use mitigation attracted just $3 billion in 2014, compared to $49 billion for renewable energy generation and $26 billion for energy efficiency.

At the UN climate change meeting that just concluded in Bonn, Germany, global leaders reaffirmed that the world cannot respond adequately to rising temperatures if governments continue ignoring how forests, farms, and coasts are managed. Now that there is a firm consensus, governments must act on it.

Justin Adams is Global Managing Director for Lands at the Nature Conservancy.

By Justin Adams

The Eternal Return of the Plague

NORMAN, OKLAHOMA – “Fearsome Plague Epidemic Strikes Madagascar.” That recent New York Times headline might sound like the synopsis of a horror movie. But the epidemic gripping Madagascar is not just any plague, and it certainly isn’t some Hollywood apocalypse. It’s the plague, caused by the bacterium Yersinia pestis, the agent of the notorious bubonic plague.

For most people, “the plague” conjures up images of the medieval Black Death, and perhaps a vaguely reassuring sense that, in the developed world, such ancient dangers are long past. But in recent years, thanks to the work of geneticists, archaeologists, and historians, we now know that human civilization and the plague have a much deeper and more intimate association than previously assumed. Lessons learned from studying this historic interaction could reshape how we think about global public health today.

All infectious diseases are caused by pathogens – bacteria, viruses, protozoa, and parasites – that are capable of subverting our immune systems long enough to make us sick. These organisms are the product of their own biological evolution, and the history of the plague’s development is (along with HIV’s, perhaps) the most detailed biography of any pathogen known to science.

The plague bacterium, in its most destructive form, is about 3,000 years old. It evolved in Central Asia as a rodent disease; humans were accidental victims. From the germ’s point of view, people make poor hosts, because we die quickly and are usually a terminus, not a transmitter. The plague is spread principally by the bite of fleas, and a few thousand years ago, the bacterium acquired a genetic mutation that made it ferociously effective at spreading. This adaptation improved the plague’s biological fitness, which, for rodents – and the humans who live near them – has proven to be a nightmare.

Thanks to new genomic evidence, we can say with greater confidence how long this nightmare has been recurring. One of the most surprising and solidly confirmed findings in recent years has been the prevalence of plague in samples from Stone Age and Bronze Age societies in Europe and Central Asia. While it remains unclear what role plague played in the failure of those societies, it is reasonable to assume that the disease has long influenced human history.

What is now beyond question is that Yersinia pestis was indeed the pathogen responsible for two of the most destructive pandemics ever. The Black Death, which lives on in popular imagination to this day, arrived from Central Asia in the 1340s, and in the space of a few years, wiped out roughly half of the population in the regions it struck. The disease then lingered for a few more centuries, killing many more.

But this entire episode is properly known as the “second pandemic.” The first pandemic began in AD 541, during the reign of the Roman Emperor Justinian. The outbreak is known as the Justinianic plague, and, like the Black Death, it cut a swath of destruction from inner Asia to the shores of the Atlantic in the space of a few years. Total mortality was in the tens of millions, and stupefied contemporaries were certain they were living on the verge of the last judgment.

As with the Black Death, later historians questioned whether a rodent disease could cause destruction on such a scale. But in recent years, the pathogen’s genetic traces have been found in sixth-century graves, and the DNA evidence convicts Yersinia pestis of this ancient mass murder as definitively as it would in a modern courtroom. The plague triggered a demographic crisis that helped to topple the Romans’ “eternal empire.”

Plague pandemics were events of mind-boggling ecological intricacy. They involved a minimum of five species in perilous alignment: the bacterium itself, a reservoir host such as marmots or gerbils, the flea vector, the rodent species living in close quarters with humans, and the human victims.

The germ first had to leave its native Central Asia. In the case of the Justinianic plague, it seems to have done so by exploiting the shipping networks in the Indian Ocean. Once within the Roman Empire, it found an environment transformed by human civilization, along with massive colonies of rodents fattened on the ancient world’s ubiquitous granaries. Human expansion helped rodents prosper, and rat infestations, in turn, intensified and prolonged the plague’s outbreak.

There is tantalizing evidence that climate change also played a role in triggering the first pandemic. Just a few years before the appearance of the plague on Roman shores, the planet experienced one of the most abrupt incidents of climate change in the last few thousand years. A spasm of volcanic explosions – in AD 536, when historians reported a year without summer, and again in AD 539-540 – upset the global climate system. The precise mechanisms by which climate events fueled plague remain contested, but the link is unmistakable, and the lesson is worth underscoring: the complex relationship between climate and ecosystems impacts human health in unexpected ways.

The plague in Madagascar today is an offshoot of what is known as the “third plague pandemic,” a global dispersion of Yersinia pestis that radiated from China in the late nineteenth century. There still is no vaccine; while antibiotics are effective if administered early, the threat of antimicrobial resistance is real.

That may be the deepest lesson from the long history of this scourge. Biological evolution is cunning and dangerous. Small mutations can alter a pathogen’s virulence or its efficiency of transmission, and evolution is relentless. We may have the upper hand over plague today, despite the headlines in East Africa. But our long history with the disease demonstrates that our control over it is tenuous, and likely to be transient – and that threats to public health anywhere are threats to public health everywhere.

Kyle Harper, a professor of classics and letters at the University of Oklahoma, is author of The Fate of Rome: Climate, Disease, and the End of an Empire.

By Kyle Harper

Saving Somalia Through Debt Relief

LONDON – Julius Nyerere, the first president of Tanzania, once asked his country’s creditors a blunt question: “Must we starve our children to pay our debts?” That was in 1986, before the public campaigns and initiatives that removed much of Africa’s crushing and unpayable debt burden. But Nyerere’s question still hangs like a dark cloud over Somalia.

Over the last year, an unprecedented humanitarian effort has pulled Somalia back from the brink of famine. As the worst drought in living memory destroyed harvests and decimated livestock, almost $1 billion was mobilized in emergency aid for nutrition, health, and clean water provision. That aid saved many lives and prevented a slow-motion replay of the 2011 drought, when delayed international action resulted in nearly 260,000 deaths.

Yet, even after these recent efforts, Somalia’s fate hangs in the balance. Early warning systems are pointing to a prospective famine in 2018. Poor and erratic rains have left 2.5 million people facing an ongoing food crisis; some 400,000 children live with acute malnutrition; food prices are rising; and dry wells have left communities dependent on expensive trucked water.

Humanitarian aid remains essential. Almost half of Somalia’s 14 million people need support, according to UN agencies. But humanitarian aid, which is often volatile and overwhelmingly short-term, will not break the deadly cycles of drought, hunger, and poverty. If Somalia is to develop its health and education systems, economic infrastructure, and the social protection programs needed to build a more resilient future, it needs predictable, long-term development finance.

Debt represents a barrier to that finance. Somalia’s external debt is running at $5 billion. Creditors range from rich countries like the United States, France, and Italy, to regional governments and financial institutions, including the Arab Monetary Fund.

But Somalia’s debt also includes $325 million in arrears owed to the International Monetary Fund. And there’s the rub: countries in arrears to the IMF are ineligible to receive long-term financing from other sources, including the World Bank’s $75 billion concessional International Development Association (IDA) facility.

Much of the country’s current debt dates to the Cold War, when the world’s superpower rivalry played out in the Horn of Africa. Over 90% of Somalia’s debt burden is accounted for by arrears on credit advanced in the early 1980s, well before two-thirds of today’s Somali population was born.

Most of the lending then was directed to President Siad Barre as a reward for his abandonment of the Soviet Union and embrace of the West. Military credits figured prominently: over half of the $973 million in US debt is owed to the Department of Defense. Somalia got state-of-the-art weaponry, liberally financed by loans. The IMF was nudged into guaranteeing repayment through a structural adjustment program. Repaying the debt today would cost every Somali man, woman, and child $361.

None of this would matter if Somalia had qualified for debt reduction. The Heavily Indebted Poor Countries Initiative (HIPC), created in response to the great debt relief campaigns of the 1990s, has written off around $77 billion in debt for 36 countries. Somalia is one of just three countries that have yet to qualify. The reason: the arrears owed to the IMF. (Eritrea and Sudan have also not qualified, for similar reasons.)

The IMF view is that Somalia, like earlier HIPC beneficiaries, should establish a track record of economic reform. This will delay a full debt write-off for up to three years, exclude Somalia from long-term development finance, and reinforce its dependence on emergency aid. Other creditors have tacitly endorsed this approach.

Somalia deserves better. President Mohamed Abdullahi Mohamed’s government has demonstrated a commitment to economic reform, improved accountability, and transparency. For two years, it has adhered to an IMF program, achieving targets for improving public finance and the banking sector. More needs to be done, especially in terms of domestic resource mobilization. But this is the first Somali government to provide the international community with a window of opportunity to support recovery. We must capitalize on it.

Waiting three more years as Somalia ticks the IMF’s internal accounting boxes would be a triumph of bureaucratic complacency over human needs. Without international support, Somalia’s government lacks the resources needed to break the deadly cycle of drought, hunger, and poverty.

Somalia’s children need investment in health, nutrition, and schools now, not at some point in the indefinite future. Investing in irrigation and water management would boost productivity. With drought-related livestock and crop losses estimated at around $1.5 billion, government-supported cash payment programs would help aid recovery, strengthen resilience, and build trust.

The benefits of these investments would extend to security. Providing the hope that comes with education, health care, and the prospect of a job is a far more effective weapon than any drone against an insurgency that feeds on despair, poverty, joblessness, and the absence of basic services.

There is an alternative to IMF-sponsored inertia on debt relief. The World Bank and major creditors could convene a creditor summit to agree to terms for a prompt debt write-off. More immediately, the World Bank could seek its shareholders’ approval for a special mechanism – a “pre-arrears clearance grant” – that would enable Somalia to receive IDA financing. There is a precedent for this: In 2005, the US championed World Bank financing for Liberia, which at the time had significant IMF debt after emerging from civil war.

The technicalities can be discussed and the complexities resolved. But we should not lose sight of what is at stake. It is indefensible for the IMF and other creditors to obstruct Somalia’s access to financing because of arrears on a debt incurred three decades ago as much through reckless lending as through irresponsible borrowing.

Somalia’s children played no part in creating that debt. They should not have to pay for it with their futures.

Kevin Watkins is CEO of Save the Children UK.

By Kevin Watkins


Banking on African Infrastructure

JOHANNESBURG – As the US Federal Reserve embarks on the “great unwinding” of the stimulus program it began nearly a decade ago, emerging economies are growing anxious that a stronger dollar will adversely affect their ability to service dollar-denominated debt. This is a particular concern for Africa, where, since the Seychelles issued its debut Eurobond in 2006, the total value of outstanding Eurobonds has grown to nearly $35 billion.

But if the Fed’s ongoing withdrawal of stimulus has frayed African nerves, it has also spurred recognition that there are smarter ways to finance development than borrowing in dollars. Of the available options, one specific asset class stands out: infrastructure.

Africa, which by 2050 will be home to an estimated 2.6 billion people, is in dire need of funds to build and maintain roads, ports, power grids, and so on. According to the World Bank, Africa must spend a staggering $93 billion annually to upgrade its current infrastructure; the vast majority of these funds – some 87% – are needed for improvements to basic services like energy, water, sanitation, and transportation.

Yet, if the recent past is any guide, the capital needed will be difficult to secure. Between 2004 and 2013, African states closed just 158 financing deals for infrastructure or industrial projects, valued at $59 billion – just 5% of the total needed. Given this track record, how will Africa fund even a fraction of the World Bank’s projected requirements?

The obvious source is institutional and foreign investment. But, to date, many factors, including poor profit projections and political uncertainty, have limited such financing for infrastructure projects on the continent. Investment in African infrastructure is perceived as simply being too risky.

Fortunately, with work, this perception can be overcome, as some investors – such as the African Development Bank, the Development Bank of Southern Africa, and the Trade & Development Bank – have already demonstrated. Companies from the private sector are also profitably financing projects on the continent. For example, Black Rhino, a fund set up by Blackstone, one of the world’s largest multinational private equity firms, focuses on the development and acquisition of energy projects, such as fuel storage, pipelines, and transmission networks.

But these are the exceptions, not the rule. Fully funding Africa’s infrastructure shortfall will require attracting many more investors – and swiftly.

To succeed, Africa must develop a more coherent and coordinated approach to courting capital, while at the same time working to mitigate investors’ risk exposure. Public-private sector collaborations are one possibility. For example, in the energy sector, independent power producers are working with governments to provide electricity to 620 million Africans living off the grid. Privately funded but government regulated, these producers operate through power purchase agreements, whereby public utilities and regulators agree to purchase electricity at a predetermined price. There are approximately 130 such producers in Sub-Saharan Africa, valued at more than $8 billion. In South Africa alone, 47 projects are underway, accounting for 7,000 megawatts of additional power production.

Similar public-private partnerships are emerging in other sectors, too, such as transportation. Among the most promising are toll roads built with private money, a model that began in South Africa. Not only are these projects, which are slowly appearing elsewhere on the continent, more profitable than most financial market investments; they are also literally paving the way for future growth.

Clearly, Africa needs more of these ventures to overcome its infrastructure challenges. That is why I, along with other African business leaders and policymakers, have called on Africa’s institutional investors to commit 5% of their funds to local infrastructure. We believe that with the right incentives, infrastructure can be an innovative and attractive asset class for those with long-term liabilities. One sector that could lead the way on this commitment is the continent’s pension funds, which, together, possess a balance sheet of about $3 trillion.

The 5% Agenda campaign, launched in New York last month, underscores the belief that only a collaborative public-private approach can redress Africa’s infrastructure shortfall. For years, a lack of bankable projects deterred international financing. But in 2012, the African Union adopted the Program for Infrastructure Development in Africa, which kick-started more than 400 energy, transportation, water, and communications projects. It was a solid start – one that the 5% Agenda seeks to build upon.

But some key reforms will be needed. A high priority of the 5% Agenda is to assist in updating the national and regional regulatory frameworks that guide institutional investment in Africa. Similarly, new financial products must be developed to give asset owners the ability to allocate capital directly to infrastructure projects.

Unlocking new pools of capital will help create jobs, encourage regional integration, and ensure that Africa has the facilities to accommodate the needs of future generations. But all of this depends on persuading investors to put their money into African projects. As business leaders and policymakers, we must ensure that the conditions for profitability and social impact are not mutually exclusive. When development goals and profits align, everyone wins.

Ibrahim Assane Mayaki, a former Prime Minister of Niger, is CEO of the New Partnership for Africa’s Development (NEPAD) Planning and Coordinating Agency.

By Ibrahim Assane Mayaki
