Jobs in the Age of Artificial Intelligence

WASHINGTON, DC – The world has no shortage of pressing issues. There are 1.6 billion people living in acute poverty; an estimated 780 million adults are illiterate. Serious problems are not confined to the developing world: “deaths of despair,” for example, are raising mortality among white males in the United States. Even when advanced economies grow, they are not lifting all boats. Higher-income groups thrive while lower-income households and minority groups are consistently left behind.

And now some analysts suggest that new forms of computer programming will compound these developments, as algorithms, robots, and self-driving cars destroy middle-class jobs and worsen inequality. Even the summary term for this technology, Artificial Intelligence, sounds ominous. The human brain may be the “most complex object in the known universe,” but, as a species, we are not always collectively very smart. Best-selling science fiction writers have long predicted that we will one day invent the machines that destroy us.

The technology needed to create this dystopian future is not even on the horizon. But recent breakthroughs in AI-related technologies do offer enormous potential for positive advances in a range of applications from transportation to education and drug discovery. Used wisely, this boost in our computational abilities can help the planet and some of its most vulnerable citizens.

We can now find new patterns that are not readily evident to the human observer – and this already suggests ways to lower energy consumption and carbon dioxide emissions. We can increase productivity in our factories and reduce food waste. More broadly, we can improve prediction far beyond the ability of conventional computers. Think of the myriad activities in which a one-second warning would be useful or even lifesaving.

And yet the fear remains: Won’t these same improvements entail giving up all of our jobs – or most of our good jobs? In fact, there are three reasons why the jobs apocalypse is on hold.

First, Moravec’s paradox applies. Hans Moravec and other computer scientists pointed out in the 1980s that what is simple for us is hard for even the most sophisticated AI; conversely, AI often can do easily what we regard as difficult. Most humans can walk, manipulate objects, and understand complex language from an early age, never paying much attention to the amount of computation and energy needed to perform these tasks. Smart machines can perform mathematical calculations far exceeding a human’s capabilities, but they cannot easily climb stairs, open a door, and turn a valve. Or kick a soccer ball.

Second, today’s algorithms are becoming very good at pattern recognition when they are provided with large data sets – finding objects in YouTube videos or detecting credit card fraud – but they are much less effective with unusual circumstances that do not fit the usual pattern, or simply when the data are scarce or a bit “noisy.” To handle such cases, you need a skilled person, with his or her experience, intuition, and social awareness.

Third, the latest systems cannot explain what they have done or why they are recommending a particular course of action. In these “black boxes,” you cannot simply read the code to analyze what is happening or to check if there is a hidden bias. When interpretability is important – for example, in many medical applications – you need a trained human in the decision-making loop.

Of course, this is just the state of technology today – and high rates of investment may quickly change what is possible. But the nature of work will also change. Jobs today look very different from jobs 50 or even 20 years ago.

And new computer algorithms will take time to penetrate the economy fully. Data-rich sectors such as digital media and e-commerce have just begun to unleash the capabilities AI has created. The multitude of narrow AI applications that could affect jobs in sectors such as health care, education, and construction will take much longer to spread. In fact, this may come just in time – an aging population in developed economies implies a smaller workforce – and greater need for personal care services – in the coming decades.

Public policy decisions will shape the AI era. We need opportunity and competition, not the growth of powerful monopolies, in order to promote technological progress in a way that does not leave a large number of people behind. This requires improving access to all forms of education – and at low or zero cost.

With developed economies’ competitors, including China, investing heavily in AI, policymakers should be increasing support for basic research and ensuring that their countries have the physical and human resources they need to invent and manufacture everything connected with this major new general purpose technology.

We should not underestimate humans’ abilities to inflict damage on their community, their environment, and even the entire planet. Apocalyptic fiction writers may one day be proved correct. But, for now, we have a powerful new tool for enabling all people to live better lives. We should use it wisely.

Simon Johnson and Jonathan Ruane teach the Global Business of Artificial Intelligence and Robotics at MIT’s Sloan School of Management.

By Simon Johnson and Jonathan Ruane

We Need an International Environmental Criminal Court

NAIROBI – The announcement of the winners of this year’s Goldman Environmental Prize is an opportunity to celebrate activist leaders. But it is also a moment to recognize just how much courage their efforts (and those of a great many others) can demand.

When my dear friend Berta Cáceres and I won the prize in 2015, Berta said in her acceptance speech, “I have given my life for the service of mother earth.” Not long after, Berta was assassinated in Honduras. Her story is tragic, but not unique. Indeed, just months later, Isidro Baldenegro López, another Goldman Environmental Prize recipient, was shot dead.

There has never been a more dangerous time to be an environmental activist. Consider the violence unleashed against the environmental defenders protesting the Dakota Access Pipeline in the United States. Police were accused of using excessive force to try to disperse members of the Standing Rock Sioux Tribe and their supporters, who argued that the project would contaminate water and damage sacred burial sites.

Fortunately, no one was killed during those protests. But elsewhere, in more fragile democracies, environmental campaigners who stand up to polluters are paying with their lives. A Global Witness report documented 185 killings across 16 countries in 2015 alone. That is almost double the number of journalists killed that year.

My own experience highlights the dangers facing environmental crusaders. For eight years, my community in rural Kenya, Owino Uhuru, has been exposed to toxic lead contamination caused by the operations of a state-licensed smelter. The World Health Organization’s threshold for lead poisoning is five micrograms per deciliter of blood. The highest lead level recorded in Owino Uhuru was 420 micrograms per deciliter. In the highly publicized contamination case in Flint, Michigan, the readings were 35 micrograms per deciliter.

When my community found out that we were being poisoned, we fought back. We wrote letters to the government and organized peaceful protests. With the support of my community, I founded the Center for Justice, Governance, and Environmental Action (CJGEA) to hold the state and corporations accountable for ensuring a clean and healthy environment.

In February 2016, the CJGEA went to court against six state agencies and two corporate entities. Nothing happened. One year later, when we published public notices in local newspapers of our intention to sue the two corporations, all hell broke loose.

Despite the murders of Berta and Isidro and so many others, I did not fully recognize the danger of challenging a powerful government-backed operation. Soon, I received a chilling phone call warning me to watch over my son carefully. Environmental activists within the community were attacked, their houses surrounded by thugs wielding machetes. The son of a close ally was abducted – and, fortunately, later released – by unidentified men.

You might expect that the state would protect its citizens from such tactics, if not from being poisoned in the first place. We broke no laws; on the contrary, we have been upholding Kenya’s constitution, which guarantees citizens’ rights to a safe and healthy environment. But perhaps we should not be surprised by the state’s behavior. After all, in 2015, Kenya’s government voted, along with just 13 other countries, against a UN General Assembly resolution calling for the protection of human-rights defenders.

Nature provides enough for everyone’s needs, but not for everyone’s greed. As natural resources become scarcer, Africa’s lush, mineral-rich lands are becoming more lucrative for investors seeking to maximize profits. But, while governments should welcome opportunities for economic growth and job creation, they should not allow companies to damage the environment and threaten residents’ health and livelihoods.

As stories like Berta’s, Isidro’s, and mine demonstrate, we can no longer rely on state bodies, such as national law enforcement, to ensure this outcome, much less to investigate and prosecute crimes against the planet and those who fight for it. That is why the world needs an independent, internationally recognized legal body to which communities and activists can turn to address environmental crimes.

The appointment in March 2012 of the first-ever UN special rapporteur on human rights and the environment was a positive step. But we need a system with teeth. Nearly two decades ago, the International Criminal Court was established to prosecute war crimes and crimes against humanity. A similar court should do the same for crimes against the environment and its defenders.

Silencing the voices fighting to uphold environmental laws and regulations is self-defeating. People and the planet are dying. Those who are fighting to prevent those deaths deserve protection, not to become further casualties.

Phyllis Omido, a Kenyan environmental activist and a winner of the 2015 Goldman Environmental Prize, is a 2017 Aspen New Voices Fellow.

By Phyllis Omido

Too Many Health Clinics Hurt Developing Countries

FREETOWN, SIERRA LEONE – Donors like the World Bank and the World Health Organization often urge developing countries to invest in national health systems. But while rushing to construct clinics and other medical facilities in even the remotest regions may seem like a straightforward approach to ensuring universal health coverage, it has not turned out that way.

The recent Ebola epidemic in West Africa highlighted the urgent need for stronger, more efficient, and more resilient health-care systems in developing countries. But when countries rush to build more clinics, the resulting facilities tend to be hastily constructed and lacking in the equipment, supplies, and staff needed to deliver vital health services effectively.

In my frequent visits to rural areas of my native Sierra Leone, I have seen more than a few health facilities that communities could do without. A newly refurbished facility in Masunthu, for example, had scant equipment and no water in the taps. The facilities in nearby Maselleh and Katherie had cracked walls, leaky roofs, and so few cupboards that supplies like syringes and medical registers had to be stacked on the floor.

This situation is the direct result of a piecemeal and hurried approach to investment in health-care infrastructure. At the end of the civil war in 2002, Sierra Leone had fewer than 700 health facilities, according to the 2004 Primary Health Care Handbook. In 2003, the cash-strapped government decided to “decentralize” various public services to the district level, fueling fierce competition for limited resources.

Local councils, seeking to grab the biggest possible slice of the pie, began to push forward new projects, leading to rapid and uncontrolled expansion of the health system. Today, Sierra Leone – with a population of just seven million – has nearly 1,300 health facilities. The Ministry of Health has been unable to equip all of these new facilities and cover staff and operational costs, as its budget has not risen to match the system’s expansion. In fact, very few (if any) of the African countries that signed the 2001 Abuja Declaration to allocate 15% of their budget to health have been able to do so.

Last September, Sierra Leone conducted an assessment of the distribution of public-health facilities and health workers in the country, in order to guide discussions on the Human Resources for Health Strategy 2017-2021. The results were stark: only 47% of the country’s health facilities employed more than two health workers, including unsalaried workers and volunteers. Seven percent of health facilities had no health workers assigned to them at all – an empty promise in physical form.

This situation is not unique to Sierra Leone – or to Africa. In Indonesia, the government invested oil revenue in the massive and rapid expansion of basic social services, including health care. But today an insufficient number of doctors plagues many of these facilities, particularly in remote areas, where absenteeism also is high. There are many nurses, but most are inadequately trained. Still, they are left to run remote facilities on their own.

Beyond personnel, remote health facilities in Indonesia lack adequate supporting infrastructure: clean water, sanitation, reliable electricity, and basic medicine and equipment. Decentralized local governments, which have little authority over remote clinics, cannot supervise their activities. Small wonder that Indonesia has one of the highest rates of maternal mortality in East Asia.

An excess of poorly equipped health facilities is not only ineffective; it can actually make matters worse, owing to factors like poor sanitation and weak emergency referral systems. During the recent Ebola crisis, underequipped facilities caused even more deaths, not just among patients, but also among the health workers committed to helping them.

Rather than continuing to pursue the uncontrolled proliferation of poorly equipped and operated health-care facilities, policymakers should consider a more measured approach. Of course, people living in remote areas need access to quality health care, without having to navigate rough and dangerous roads that can become virtually inaccessible during some periods of the year. But outreach services and community health workers could cover these areas much more effectively. The value of such an approach has recently been demonstrated in Ethiopia, where health outcomes have improved.

While most of the Sierra Leone facilities were built with donor funds, the government has gone along with plans to accelerate the construction drive. The government and donors have a joint responsibility to pursue a more cautious approach that guarantees quality service delivery.

At the WHO’s World Health Assembly this month, participants should shine a spotlight on this responsibility and begin to rethink current strategies for achieving universal health coverage. With a more measured approach, it will take longer to build the same number of clinics. But more lives will be saved. And that’s the only indicator that should count.

Samuel Kargbo is Director of Policy and Planning in Sierra Leone, a member of the UHC2030 Steering Committee, and a 2016 Aspen Institute New Voices fellow.

By Samuel Kargbo

Manchester’s Bright Future

MANCHESTER – I am a proud Mancunian (as the people of Manchester are known), despite the fact I haven’t lived there permanently since I left school for university when I was 18. I was born in St. Mary’s hospital near the city center, was raised in a pleasant suburb in South Manchester, and attended a normal primary and junior school in a nearby, tougher neighborhood, before attending Burnage for secondary school. Thirty-eight years after I attended Burnage, so, apparently, did Salman Abedi, the suspected Manchester Arena bomber.

The atrocity carried out by Abedi, for which the Islamic State has claimed credit, is probably worse than the dreadful bombing by the Irish Republican Army that destroyed parts of the city center 21 years ago, an event that many believe played a key role in Manchester’s renaissance. At least in that case, the bombers gave a 90-minute warning that helped avoid loss of life. Abedi’s barbaric act, by contrast, killed at least 22 people, many of them children.

In recent years, I have been heavily involved in the policy aspects of this great city’s economic revival. I chaired an economic advisory group to the Greater Manchester Council, and then served as Chair of the Cities Growth Commission, which advocated for the “Northern Powerhouse,” a program to link the cities of the British north into a cohesive economic unit. Subsequently, I briefly joined David Cameron’s government, to help implement the early stages of the Northern Powerhouse.

I have never attended a concert at the Manchester Arena, but it appears to be a great venue for the city. Just as Manchester Airport has emerged as a transport hub serving the Northern Powerhouse, the arena plays a similar role in terms of live entertainment. As the sad reports about those affected indicate, attendees came from many parts of northern England (and beyond).

In the past couple of years, Manchester has received much praise for its economic revival, including its position at the geographic heart of the Northern Powerhouse, and I am sure this will continue. Employment levels and the regional PMI business surveys indicate that, for most of the past two years, economic momentum has been stronger in North West England than in the country as a whole, including London. Whether this is because of the Northern Powerhouse policy is difficult to infer; whatever the reason, it is hugely welcome and important to sustain.

To my occasional irritation, many people still wonder what exactly the Northern Powerhouse is. At its core, it represents the economic geography that lies within Liverpool to the west, Sheffield to the east, and Leeds to the northeast, with Manchester in the middle. The distance from Manchester to the center of any of those other cities is less than 40 miles (64 kilometers), which is shorter than the London Underground’s Central, Piccadilly, or District lines. If the 7-8 million people who live in those cities – and in the numerous towns, villages, and other areas between them – can be connected via infrastructure, they can act as a single unit in terms of their roles as consumers and producers.

The Northern Powerhouse would then be a genuine structural game changer for Britain’s economy. Indeed, along with London, it would be a second dynamic economic zone that registers on a global scale. It is this simple premise that led the previous government to place my ideas at the core of its economic policies, and why the Northern Powerhouse has become so attractive to business here in the United Kingdom and overseas.

It is a thrilling prospect, and, despite being less than three years old, it is showing signs of progress. In fact, given the broader economic benefits of agglomeration, the Northern Powerhouse mantra can be extended to the whole of the North of England, not least to include Hull and the North East. But it is what I often inelegantly call “Man-Sheff-Leeds-Pool” that distinguishes the Northern Powerhouse, and Manchester, which sits at the heart of it, is certainly among the early beneficiaries.

Despite this, I have frequently said to local policy leaders, business people, those from the philanthropic world, and others that unless the areas lying outside the immediate vicinity of central Manchester benefit from regional dynamism, Greater Manchester’s success will be far from complete. Anyone who looks little more than a mile north, south, east, or west of Manchester’s Albert Square – never mind slightly less adjacent parts such as Oldham and Rochdale – can see that much needs to be improved, including education, skills training, and inclusiveness, in order to ensure long-term success.

Whatever the warped motive of the 22-year-old Abedi, who evidently blew himself up along with the innocent victims, his reprehensible act will not tarnish Manchester’s bright, hopeful future. I do not claim to understand the world of terrorism, but I do know that those who live in and around Manchester and other cities need to feel part of their community and share its aspirations. Residents who identify with their community are less likely to harm it – and more likely to contribute actively to its renewed vigor.

Now more than ever, Manchester needs the vision that the Northern Powerhouse provides. It is a vision that other cities and regions would do well to emulate.

Jim O’Neill, a former chairman of Goldman Sachs Asset Management and a former UK Treasury Minister, is Honorary Professor of Economics at Manchester University and former Chairman of the British government's Review on Antimicrobial Resistance.

By Jim O’Neill

The Six-Day War at 50

NEW YORK – The world is about to mark the 50th anniversary of the June 1967 war between Israel and Egypt, Jordan, and Syria – a conflict that continues to stand out in a region with a modern history largely defined by violence. The war lasted less than a week, but its legacy remains pronounced a half-century later.

The war itself was triggered by an Israeli preemptive strike on the Egyptian air force, in response to Egypt’s decision to expel a United Nations peacekeeping force from Gaza and the Sinai Peninsula and to close the Straits of Tiran to Israeli shipping. Israel struck first, but most observers regarded what it did as a legitimate act of self-defense against an imminent threat.

Israel did not intend to fight on more than one front, but the war quickly expanded when both Jordan and Syria entered the conflict on Egypt’s side. It was a costly decision for the Arab countries. After just six days of fighting, Israel controlled the Sinai Peninsula and the Gaza Strip, the Golan Heights, the West Bank, and all of Jerusalem. The new Israel was more than three times larger than the old one. It was oddly reminiscent of Genesis: six days of intense effort followed by a day of rest, in this case the signing of a cease-fire.

The one-sided battle and its outcome put an end to the notion (for some, a dream) that Israel could be eliminated. The 1967 victory made Israel permanent in ways that the wars of 1948 and 1956 did not. The new state finally acquired a degree of strategic depth. Most Arab leaders came to shift their strategic goal from Israel’s disappearance to its return to the pre-1967 war borders.

The Six-Day War did not, however, lead to peace, even a partial one. That would have to wait until the October 1973 war, which set the stage for what became the Camp David Accords and the Israel-Egypt peace treaty. The Arab side emerged from this subsequent conflict with its honor restored; Israelis for their part emerged chastened. There is a valuable lesson here: decisive military outcomes do not necessarily lead to decisive political results, much less peace.

The 1967 war did, however, lead to diplomacy, in this case UN Security Council Resolution 242. Approved in November 1967, the resolution called for Israel to withdraw from territories occupied in the recent conflict – but also upheld Israel’s right to live within secure and recognized boundaries. The resolution was a classic case of creative ambiguity. Different people read it to mean different things. That can make a resolution easier to adopt, but more difficult to act on.

It thus comes as little surprise that there is still no peace between Israelis and Palestinians, despite countless diplomatic undertakings by the United States, the European Union and its members, the UN, and the parties themselves. To be fair, Resolution 242 cannot be blamed for this state of affairs. Peace comes only when a conflict becomes ripe for resolution, which happens when the leaders of the principal protagonists are both willing and able to embrace compromise. Absent that, no amount of well-intentioned diplomatic effort by outsiders can compensate.

But the 1967 war has had an enormous impact all the same. Palestinians acquired an identity and international prominence that had largely eluded them when most were living under Egyptian or Jordanian rule. What Palestinians could not generate was a consensus among themselves regarding whether to accept Israel and, if so, what to give up in order to have a state of their own.

Israelis could agree on some things. A majority supported returning the Sinai to Egypt. Various governments were prepared to return the Golan Heights to Syria under terms that were never met. Israel unilaterally withdrew from Gaza and signed a peace treaty with Jordan. There was also broad agreement that Jerusalem should remain unified and in Israeli hands.

But agreement stopped when it came to the West Bank. For some Israelis, this territory was a means to an end, to be exchanged for a secure peace with a responsible Palestinian state. For others, it was an end in itself, to be settled and retained.

This is not to suggest a total absence of diplomatic progress since 1967. Many Israelis and Palestinians have come to recognize the reality of one another’s existence and the need for some sort of partition of the land into two states. But for now the two sides are not prepared to resolve what separates them. Both sides have paid and are paying a price for this standoff.

Beyond the physical and economic toll, Palestinians continue to lack a state of their own and control over their own lives. Israel’s objective of being a permanent Jewish, democratic, secure, and prosperous country is threatened by open-ended occupation and evolving demographic realities.

Meanwhile, the region and the world have mostly moved on, concerned more about Russia or China or North Korea. And even if there were peace between Israelis and Palestinians, it would not bring peace to Syria, Iraq, Yemen, or Libya. Fifty years after six days of war, the absence of peace between Israelis and Palestinians is part of an imperfect status quo that many have come to accept and expect.

Richard N. Haass is the president of the Council on Foreign Relations and the author, most recently, of A World in Disarray: American Foreign Policy and the Crisis of the Old Order.

By Richard N. Haass

The Promise of Digital Health

BASEL – Africa has changed remarkably, and for the better, since I first worked as a young doctor in Angola some 20 years ago. But no change has been more obvious than the way the continent has adopted mobile technology. People in Africa – and, indeed, throughout low- and middle-income countries – are seizing the opportunities that technology provides, using mobile phones for everything from making payments to issuing birth certificates, to gaining access to health care.

The benefit of mobile technologies lies in access. Barriers like geographical distance and low resources, which have long prevented billions of people from getting the care they need, are much easier to overcome in the digital age. And, indeed, there are countless ways in which technology can be deployed to improve health-care access and delivery.

Of course, this is not new information, and a growing number of technology-based health initiatives have taken shape in recent years. But only a few have reached scale, and achieved long-term sustainability; the majority of projects have not made it past the pilot phase. The result is a highly fragmented landscape of digital solutions – one that, in some cases, can add extra strain to existing health systems.

The first step to addressing this problem is to identify which factors breed success – and which impede it. Here, perhaps the most important observation relates to how the solution is linked to the reality on the ground. After all, technology is an enabler for the innovation of health-care delivery, not an end in itself.

Solutions that focus on end-users, whether health practitioners or patients, have the best chance of succeeding. Fundamental to this approach is the recognition that what users need are not necessarily the most advanced technologies, but rather solutions that are easy to use and implement. In fact, seemingly outdated technologies like voice and text messages can be far more useful tools for the intended users than the latest apps or cutting-edge innovations in, say, nanotechnology.

Consider the Community-based Hypertension Improvement Project in Ghana, run by the Novartis Foundation, which I lead, and FHI 360. The project supports patients in self-managing their condition through regular mobile medication reminders, as well as advice on necessary lifestyle changes. This approach is successful because it is patient-centered and leverages information and communication technology (ICT) tools that are readily available and commonly used. In a country where mobile penetration exceeds 80% but only a few people have smartphones, such simple solutions can have the greatest impact.

For health practitioners, digital solutions must be perceived as boosting efficiency, rather than adding to their already-heavy workload. Co-creating solutions with people experienced in delivering health care in low-resource settings can help to ensure that the solutions are adopted at scale.

For example, the telemedicine network that the Novartis Foundation and its partners rolled out with the Ghana Health Service was a direct response to the need, expressed by health-care practitioners on the ground, to expand the reach of medical expertise. The network connects frontline health workers with a simple phone call to consultation centers in referral hospitals several hours away, where doctors and specialists are available around-the-clock. From the outset, the project was a response to an expressed need to expand the reach of medical expertise, and was fully operated on the ground by Ghana Health Service staff, which made this model sustainable at scale.

To realize the full potential of digital health, solutions need to be integrated into national health systems. Only then can digital technology accelerate progress toward universal health coverage and address countries’ priority health needs.

Collaboration across the health and ICT sectors, both public and private, is essential. Multidisciplinary partnerships driven by the sustained leadership of senior government officials must guide progress, beginning at the planning stage. Intra-governmental collaboration, dedicated financing for digital health solutions, and effective governance mechanisms will also be vital to successful strategies.

Digital technologies offer huge opportunities to improve the way health care is delivered. If we are to seize them, we must learn from past experience. By remaining focused on the reality of end-users and on priority health needs, rather than being dazzled by the latest technology, we can fulfil the promise of digital health.

Ann Aerts is Head of the Novartis Foundation and Chair of the Broadband Commission for Sustainable Development Working Group on Digital Health.

By Ann Aerts

Energy, Economics, and the Environment

LONDON – To secure a low-carbon future and begin to address the challenge of climate change, the world needs more investment in renewable energy. So how do we get there? No system of power production is perfect, and even “green” power projects, given their geographic footprint, must be managed carefully to mitigate “energy sprawl” and the associated effects on landscapes, rivers, and oceans.

Hydropower offers one of the clearest examples of how the location of renewable energy infrastructure can have unintended consequences. Dam-generated electricity is currently the planet’s largest source of renewable energy, delivering about twice as much power as all other renewables combined. Even with massive expansion in solar and wind power projects, most forecasts assume that meeting global climate mitigation goals will require at least a 50% increase in hydropower capacity by 2040.

Despite hydropower’s promise, however, there are significant economic and ecological consequences to consider whenever dams are installed. Barriers that restrict the flow of water are particularly disruptive to inland fisheries, for example. More than six million tons of fish are harvested annually from river basins with projected hydropower development. Without proper planning, these projects could jeopardize a key source of food and income generation for more than 100 million people.

Consequences like these are not always apparent when countries plan dams in isolation. In many parts of Asia, Latin America, and Sub-Saharan Africa, hydropower is an important source of energy and economic development. But free-flowing rivers are also essential to the health of communities, local economies, and ecosystems. By some estimates, if the world completes all of the dam projects currently underway or planned without mitigation measures, the resulting infrastructure would disrupt 300,000 kilometers (186,411 miles) of free-flowing rivers – a length equivalent to seven trips around the planet.

There is a better way. By taking a system-scale approach – looking at dams in the context of an entire river basin, rather than on a project-by-project basis – we can better anticipate and balance the environmental, social, and economic effects of any single project, while at the same time ensuring that a community’s energy needs are met. The Nature Conservancy has pioneered such a planning approach – what we call “Hydropower by Design” – to help countries realize the full value within their river basins.

Even one dam changes the physical attributes of a river basin. Multiplied through an entire watershed, the impact is magnified. Hydropower projects planned in isolation not only often cause more environmental damage than necessary; they often fail to achieve their maximum strategic potential and may even constrain future economic opportunities.

As a result, even dams that meet their power-generation goals may fail to maximize the long-term value of other water-management services such as flood control, navigation, and water storage. Our research shows that these services add an estimated $770 billion annually to the global economy. Failure to design dams to their fullest potential, therefore, carries a significant cost.

In the past, some developers have been resistant to this sort of strategic planning, believing that it would cause delays and be expensive to implement. But, as the Conservancy’s latest report – The Power of Rivers: A Business Case – demonstrates, accounting for environmental, social, and economic risks up front can minimize delays and budget overruns while reducing the possibility of lawsuits. More important, for developers and investors, employing a holistic or system-wide approach leverages economies of scale in dam construction.

The financial and development benefits of such planning enable the process to pay for itself. Our projections show that projects sited using a Hydropower by Design approach can meet their energy objectives, achieve a higher average rate of return, and reduce adverse effects on environmental resources. With nearly $2 trillion of investment in hydropower anticipated between now and 2040, the benefits of smarter planning represent significant value.

System-scale hydropower planning does not require builders to embrace an entirely new process. Instead, governments and developers can integrate principles and tools into existing planning and regulatory processes. Similar principles are being applied to wind, solar, and other energy sources with large geographic footprints.

Completing the transition to a low-carbon future is perhaps the preeminent challenge of our time, and we won’t succeed without expanding renewable-energy production. In the case of hydropower, if we plan carefully using a more holistic approach, we can meet global goals for clean energy while protecting some 100,000 kilometers of river that would otherwise be disrupted. But if we don’t step back and see the whole picture, we will simply be trading one problem for another.

Giulio Boccaletti is Chief Strategy Officer and Global Managing Director for Water at The Nature Conservancy.

By Giulio Boccaletti

Germany Will Lose if Macron Fails

FRANKFURT – When Emmanuel Macron won the French presidential election, many Germans breathed a loud sigh of relief. A pro-European centrist had soundly defeated a far-right populist, the National Front’s Marine Le Pen. But if the nationalist threat to Europe is truly to be contained, Germany will have to work with Macron to address the economic challenges that have driven so many voters to reject the European Union.

This will not be easy. In fact, within a couple of days of the election, core planks of Macron’s economic platform were already under attack in Germany. For starters, his proposed reforms of eurozone governance have been met with substantial criticism.

Macron’s campaign manifesto embraced the idea of more eurozone federalism, characterized by a shared budget for eurozone public goods, administered by a eurozone economics and finance minister and accountable to a eurozone parliament. It also called for greater coordination of tax regimes and border controls, stronger protection of the integrity of the internal market, and, in view of the rising threat of protectionism in the United States, a “made in Europe” procurement policy.

An attempt at re-opening the debate about Eurobonds, or the partial mutualization of eurozone public-sector liabilities, was viewed as a pie-in-the-sky suggestion, mostly just a distraction. And, incidentally, it appears nowhere in Macron’s platform. Far more disturbing to German pundits and policymakers is Macron’s desire for Germany to make use of its fiscal capacity to boost domestic demand, thereby reducing its massive current-account surplus.

These are not new ideas: the European Commission, the International Monetary Fund, Macron’s predecessors, and economists throughout Europe have advanced them often. And, just as predictably, Germany’s government has roundly rejected them, relying on reasoning that, like its counter-arguments, is well rehearsed.

For the most part, German economists and officials believe that economic policy should focus almost exclusively on the supply side, diagnosing and addressing structural problems. And German officials also regularly suggest that their economy is already performing at close to its supply-determined limits.

In fact, far from viewing the current-account surplus as a policy problem, the German government sees it as a reflection of the underlying competitiveness of German firms. It is the benign upshot of responsible labor unions, which allow for appropriate firm-level wage flexibility.

The accumulation of foreign assets is a logical corollary of these surpluses, not to mention an imperative for an aging society. Indeed, German policymakers view as essential a reduction of Germany’s debt-to-GDP ratio toward the 60% ceiling set by European rules. When, if not in good times, does one have the chance to save?

This stance does not align particularly smoothly with Macron’s economic program. While Macron’s program includes significant proposals for addressing supply-side issues with the French economy, it also favors output stabilization and, more important, increased spending in areas like public infrastructure, digitization, and clean energy to boost potential growth.

Despite Macron’s decisive victory, he faces an uphill battle implementing his economic agenda. Even if the National Assembly, to be elected in June, endorses his reform program, street-level resistance will be no less fierce than it has been over the last few years.

Germany, however, has good reason to support Macron’s supply- and demand-side reforms. After all, France and Germany are deeply interdependent, meaning that Germany has a stake in Macron’s fate.

While it is true that the German government cannot (fortunately) fine-tune wages, it could, out of sheer self-interest, provide for its future by investing more in its human and social capital – including schools, from kindergartens to universities, and infrastructure like roads, bridges, and bandwidth. This approach would reduce the private user cost of capital, thereby making private investment more attractive. It would also create domestic real assets, reducing Germany’s exposure to foreign credit risk. A lower current-account surplus implies a more sustainable net-financial-liability position for Germany’s partners.

If Germany and Macron don’t find common ground, the costs to both will be massive. No malicious external actor is imposing populism on Europe; it has emerged organically, fueled by real and widespread grievances. While those grievances are not exclusively economic, the geography of populism does fit that of the EU’s economic malaise: too many Europeans have been losing out for too long. So, if Macron fails to deliver on his promises, a Euroskeptic like Le Pen could well win France’s next election.

To avoid this outcome, Macron must be firmer than his predecessors in pursuing difficult but ultimately beneficial policies. He might take a page from former German Chancellor Gerhard Schröder’s playbook. In 2003, Schröder prioritized reforms over rigorous obedience to the EU’s Stability and Growth Pact. Additional fiscal leeway was needed to smooth the economy’s adjustment to the bold labor-market reforms that he was introducing. The decision to prioritize reforms over obstinate rule-following proved to be a good one.

Now is Macron’s Schröderian moment. He, too, appears to have opted for reasoned pragmatism over the blind implementation of rigid rules (which cannot make sense in all circumstances). Fortunately, policy principles are not written in stone, not even in Germany. Recall that the German government adamantly rejected the eurozone banking union and the European Stability Mechanism, both of which were ultimately launched (though some say it was too little, too late).

Europe is experiencing a seismic shift, with its political system being undermined from within (and becoming vulnerable to Russian pressure from without). Fear of the “other” and perceptions of trade as a zero-sum game are taking hold. These circumstances call for bold and committed action, not only by France, but also by Germany, which, ultimately, has the most to lose.

Hans-Helmut Kotz, Program Director of the SAFE Policy Center at the Goethe University in Frankfurt, is a visiting professor of economics and a resident fellow at the Center for European Studies at Harvard University.

By Hans-Helmut Kotz

Toward a Global Treaty on Plastic Waste

BERLIN – If there are any geologists in millions of years, they will easily be able to pinpoint the start of the so-called Anthropocene – the geological age during which humans became the dominant influence on our planet’s environment. Wherever they look, they will find clear evidence of its onset, in the form of plastic waste.

Plastic is a key material in the world economy, found in cars, mobile phones, toys, clothes, packaging, medical devices, and much more. Worldwide, 322 million metric tons of plastic were produced in 2015. And the figure keeps growing; by 2050, it could be four times higher.

But plastic already is creating massive global environmental, economic, and social problems. Despite requiring resources to produce, plastic is so cheap that it often is used for disposable – often single-use – products. As a result, a huge amount of it ends up polluting the earth.

Plastic clogs cities’ sewer systems and increases the risk of flooding. Larger pieces can fill with rainwater, providing a breeding ground for disease-spreading mosquitoes. Up to 13 million tons of plastic waste end up in the ocean each year; by 2050, the ocean could contain more plastic than fish. The plastic that washes up on shores costs the tourism industry hundreds of millions of dollars every year.

Moreover, all that plastic poses a serious threat to wildlife. Beyond the dead or dying seals, penguins, and turtles that had the bad fortune of becoming entangled in plastic rings or nets, biologists are finding dead whales and birds with stomachs stuffed with plastic debris.

Plastic products may not be all that good for humans, either. While the plastics used, say, to package our foods are usually nontoxic, most plastics are laden with chemicals, from softeners (which can act as endocrine disruptors) to flame retardants (which can be carcinogenic or toxic in higher concentrations). These chemicals can make it through the ocean and its food chain – and onto our plates.

Addressing the problem will not be easy; no single country or company, however determined, can do so on its own. Many actors – including the biggest plastic producers and polluters, zero-waste initiatives, research labs, and waste-picker cooperatives – will have to tackle the problem head-on.

The first step is to create a high-level forum to facilitate discussion among such stakeholders, with the goal of developing a cooperative strategy for reducing plastic pollution. Such a strategy should go beyond voluntary action plans and partnerships to focus on developing a legally binding international agreement, underpinned by a commitment from all governments to eliminate plastic pollution. Negotiations on such a treaty could be launched this year, at the United Nations Environment Assembly in Nairobi in December.

Scientists have already advanced concrete proposals for a plastic-pollution treaty. One of the authors of this article proposed a convention modeled after the Paris climate agreement: a binding overarching goal combined with voluntary national action plans and flexible measures to achieve them. A research team from the University of Wollongong in Australia, taking inspiration from the Montreal Protocol, the treaty that safeguards the ozone layer, has suggested caps and bans on new plastic production.

Some might ask whether we should embark on yet another journey down the long, winding, and tiresome road of global treaty negotiations. Can’t we engineer our way out of our plastic problem?

The short answer is, probably not. Biodegradable plastics, for example, make sense only if they decompose quickly enough to avoid harming wildlife. Even promising discoveries like bacteria or moths that can dissolve or digest plastics can provide only auxiliary support.

The only way truly to address the problem is to slash our plastic waste. Technology might be able to help, offering more options for substitution and recycling; but, as the many zero-waste communities and cities around the world have shown, it is not necessary.

For example, Capannori, a town of 46,700 inhabitants near Lucca in Tuscany, adopted a zero-waste strategy in 2007. A decade later, it has reduced its waste by 40%. With 82% of municipal waste now separated at source, just 18% of residual waste ends up in landfills. Such experiences should inform and guide the national action plans that would form part of the treaty on plastics.

The European Commission’s “circular economy package” may provide another example worth emulating. Though it has not yet been implemented, its waste targets have the potential to save the European Union 190 million tons of CO2 emissions per year. That is the equivalent of annual emissions in the Netherlands.

Of course, the transition to zero waste will require some investment. Any international treaty on plastic must therefore include a funding mechanism, and the “polluter pays” principle is the right place to start. The global plastic industry, with annual revenues of about $750 billion, surely could find a few hundred million dollars to help clean up the mess it created.

A comprehensive, binding, and forward-looking global plastics treaty will not be easy to achieve. It will take time and cost money, and it will inevitably include loopholes and have shortcomings. It certainly will not solve the plastic pollution problem on its own. But it is a prerequisite for success.

Plastic pollution is a defining problem of the Anthropocene. It is, after all, a global scourge that is entirely of our making – and entirely within our power to solve as well.

Nils Simon is a political scientist and Senior Project Manager at adelphi research. Lili Fuhr heads the Ecology and Sustainable Development Department at the Heinrich Böll Foundation.

By Nils Simon and Lili Fuhr

How Culture Shapes Human Evolution

ST. ANDREWS – Is there an evolutionary explanation for humanity’s greatest successes – technology, science, and the arts – with roots that can be traced back to animal behavior? I first asked this question 30 years ago, and have been working to answer it ever since.

Plenty of animals use tools, emit signals, imitate one another, and possess memories of past events. Some even develop learned traditions that entail consuming particular foods or singing a particular kind of song – acts that, to some extent, resemble human culture.

But human mental ability stands far apart. We live in complex societies organized around linguistically coded rules, morals, and social institutions, with a massive reliance on technology. We have devised machines that fly, microchips, and vaccines. We have written stories, songs, and sonnets. We have danced in Swan Lake.

Developmental psychologists have established that when it comes to dealing with the physical world (for example, spatial memory and tool use), human toddlers’ cognitive skills are already comparable to those of adult chimpanzees and orangutans. In terms of social cognition (such as imitating others or understanding intentions), toddlers’ minds are far more sophisticated.

The same gap is observed in both communication and cooperation. Vaunted claims that apes produce language do not stand up to scrutiny: animals can learn the meanings of signs and string together simple word combinations, but they cannot master syntax. And experiments show that apes cooperate far less readily than humans.

Thanks to advances in comparative cognition, scientists are now confident that other animals do not possess hidden reasoning powers and cognitive complexity, and that the gap between human and animal intelligence is genuine. So how could something as extraordinary and unique as the human mind evolve?

A major interdisciplinary effort has recently solved this longstanding evolutionary puzzle. The answer is surprising. It turns out that our species’ most extraordinary characteristics – our intelligence, language, cooperation, and technology – did not evolve as adaptive responses to external conditions. Rather, humans are creatures of their own making, with minds that were built not just for culture, but by culture. In other words, culture transformed the evolutionary process.

Key insights came from studies on animal behavior, which showed that, although social learning (copying) is widespread in nature, animals are highly selective about what and whom they copy. Copying confers an evolutionary advantage only when it is accurate and efficient. Natural selection should therefore favor structures and capabilities in the brain that enhance the accuracy and efficiency of social learning.

Consistent with this prediction, research reveals strong associations between behavioral complexity and brain size. Big-brained primates invent new behaviors, copy the innovations of others, and use tools more than small-brained primates do. Selection for high intelligence almost certainly derives from multiple sources, but recent studies imply that selection for the intelligence to cope with complex social environments in monkeys and apes was followed by more restricted selection for cultural intelligence in the great apes, capuchins, and macaques.

Why, then, haven’t gorillas invented Facebook, or capuchins built spacecraft? To achieve such high levels of cognitive functioning requires not just cultural intelligence, but also cumulative culture, in which modifications accumulate over time. That demands transmission of information with a degree of accuracy of which only humans are capable. Indeed, small increases in the accuracy of social transmission lead to big increases in the diversity and longevity of culture, as well as to fads, fashions, and conformity.

Our ancestors were able to achieve such high-fidelity information transmission not just because of language, but also because of teaching – a practice that is rare in nature, but universal among humans (once the subtle forms it takes are recognized). Mathematical analyses reveal that, while it is generally difficult for teaching to evolve, cumulative culture promotes teaching. This implies that teaching and cumulative culture co-evolved, producing a species that taught relatives across a broad range of circumstances.

It is in this context that language appeared. Evidence suggests that language originally evolved to reduce the costs, increase the accuracy, and expand the domains of teaching. That explanation accounts for many properties of language, including its uniqueness, power of generalization, and the fact that it is learned.

All of the elements that have underpinned the development of human cognitive abilities – encephalization (the evolutionary increase in the size of the brain), tool use, teaching, and language – have one key characteristic in common: the conditions that favored their evolution were created by cultural activities, through selective feedback. As theoretical, anthropological, and genetic studies all attest, a co-evolutionary dynamic – in which socially transmitted skills guided the natural selection that shaped human anatomy and cognition – has underpinned our evolution for at least 2.5 million years.

Our potent capacity for imitation, teaching, and language also encouraged unprecedented levels of cooperation among individuals, creating conditions that not only promoted longstanding cooperative mechanisms such as reciprocity and mutualism, but also generated new mechanisms. In the process, gene-culture co-evolution created a psychology – a motivation to teach, speak, imitate, emulate, and connect – that is entirely different from that of other animals.

Evolutionary analysis has shed light on the rise of the arts, too. Recent studies of the development of dance, for example, explain how humans move in time to music, synchronize their actions with others, and learn long sequences of movements.

Human culture sets us apart from the rest of the animal kingdom. Grasping its scientific basis enriches our understanding of our history – and why we became the species we are.

Kevin Laland is Professor of Behavioral and Evolutionary Biology at the University of St Andrews, and the author of Darwin’s Unfinished Symphony: How Culture Made the Human Mind.

By Kevin Laland
