Opinion

Keeping US Policymaking Honest

BERKELEY – In a recent appearance here at the University of California, Berkeley, Alice Rivlin expressed optimism about the future of economic policymaking in the United States. What Rivlin – who served as Vice Chair of the US Federal Reserve, Director of the White House Office of Management and Budget (OMB) under President Bill Clinton, and founding Director of the Congressional Budget Office (CBO) – thinks about that topic matters a great deal. Indeed, America owes its current system of “technocracy” – which ensures that policymaking follows sound analysis and empirical evidence – more to Rivlin than to any other living human.


When she was younger, however, Rivlin was denied admission to the graduate program at Harvard University’s Littauer Center of Public Administration. Her application was rejected, she was told, because of “unfortunate experiences” with previous admissions of “women of marriageable age.”

In those phrases, you can almost hear the New England Puritans’ unctuous sermonizing about the seduction of Eve by the serpent, and her subsequent temptation of Adam. Of course, when Rivlin helped found the CBO in 1974 she was essentially eating from the Tree of Knowledge, and she was making the rest of us eat from it, too. We are all better for it.

In her recent talk, Rivlin expressed confidence that, despite today’s populist attacks on expertise, high-quality policy analysis will continue to flourish in the twenty-first-century public sphere. And she predicted that empirical evidence and expert knowledge will still carry substantial – if not full – weight in decision-making by legislators, presidents, and their advisers.

To be sure, the CBO has never been more influential than it is this year. Its influence has been felt not merely because of its role in congressional proceedings, but also because it offers assessments that are widely respected across government, the media, and civil society. Its estimates of how congressional Republicans’ legislative proposals will affect the country are deeply informed, nonpartisan, and made in good faith. So far, at least, it seems that Rivlin is right to be optimistic.

Still, I have my doubts about the future. Rivlin believes that there is a general consensus within policymaking circles about basic economic principles, and that those principles will underpin the assessments, estimates, and models used in public-policy debates. She pointed out that no reputable economists today regard a simple monetary-policy rule as a magic bullet for avoiding depressions and inflationary spirals, whereas many once did.

That is true, as far as it goes. And yet, until the announcement that Jerome Powell had been selected as the next Fed Chair, Stanford University economist John Taylor was a leading contender. Taylor is known for having developed his own guideline (the “Taylor rule”) for how central banks should set interest rates. And he has long clung to this rule, despite a lack of evidence that it would have delivered better results than the Fed’s actual policy decisions since the 1970s.

Moreover, when US President Donald Trump appointed former American Enterprise Institute economist Kevin Hassett to lead the White House Council of Economic Advisers, many expected that Hassett would be a “normal” CEA chairman. Hassett, we were told, would safeguard the CEA’s credibility, by ensuring that its estimates remained in line with those of the larger policy-analysis community. And he would understand that agencies and organizations such as the CBO, OMB, Joint Committee on Taxation, Tax Policy Center (TPC), and Center on Budget and Policy Priorities have a principal allegiance to facts, not to some donor or political master.

Yet Hassett has so far spent his time at the CEA tearing down TPC estimates, even though the organization will undoubtedly issue assessments in the future that are as inconvenient for his political adversaries as they are for him today.

According to the near-consensus among policy analysts, the share of corporate taxes borne by labor, and the share of lost revenues from a cut in corporate income tax that will be recouped through increased investment, are both 25%. Yet the CEA, under Hassett, now assumes that both are 82%. That claim, as well as Hassett’s recent attacks on the TPC, made former US Treasury Secretary Larry Summers angrier than I can ever recall having seen him with respect to a public-policy issue. According to Summers, Hassett’s analysis is “some combination of dishonest, incompetent, and absurd.”

Benjamin Franklin famously told the American people that the US Constitution would provide them with “a republic, if you can keep it.” Over her long, distinguished career, Rivlin, along with others like her, has provided us with a rational policymaking process – if we can keep it.

J. Bradford DeLong, a former deputy assistant US Treasury secretary, is Professor of Economics at the University of California at Berkeley and a research associate at the National Bureau of Economic Research.

By J. Bradford DeLong

Climate Leadership Means Ending Fossil-Fuel Production

VANCOUVER/BERLIN – The end of the fossil-fuel era is on the horizon. With renewables like solar and wind consistently outperforming expectations, growth in electric vehicles far exceeding projections, and governments worldwide acknowledging the urgency of tackling climate change, the writing is on the wall.


And yet somehow, the question central to it all is not being seriously addressed: what is the plan for weaning ourselves off oil, coal, and gas?

That question is becoming increasingly urgent, because governments around the world, from Argentina to India to Norway, are supporting plans to continue producing fossil fuels and explore for more. These governments claim that new fossil-fuel projects are consistent with their commitments under the Paris climate agreement, despite the fact that burning even the fossil fuels in already-existing reserves would push global temperatures higher than 2°C above pre-industrial levels – and thus far beyond the threshold established in that accord. It is a startling display of cognitive dissonance.

The reality is that limiting fossil-fuel production today is essential to avoid continued entrenchment of energy infrastructure and political dynamics that will make shifting away from fossil fuels later more difficult and expensive. Important questions about equity will arise: Who gets to sell the last barrel of oil? Who pays for the transition to renewables? And who compensates affected communities and workers? But, ultimately, these questions must be addressed, within a broader context of climate justice.

Climate change has been called the moral challenge of our age. This year alone, the world has faced unprecedented floods, hurricanes, wildfires, and droughts on virtually every continent. But the real storm is yet to come. If we are to avoid its most devastating impacts, phasing out coal – climate killer number one – will not be enough. A safe climate future requires ending the age of Big Oil.

The good news is that social change is not a gradual, linear process. Rather, it often happens in waves, characterized by “tipping point” moments brought on by the confluence of technological progress, financial incentives, political leadership, policy change, and, most important, social mobilization. We seem to be closing in on just such a moment.

For starters, technology is advancing faster than anyone thought possible. Twenty years ago, when we started working on climate issues, we sent faxes, made phone calls from landlines, and developed photos taken on 35mm film in darkrooms. Another 20 years from now, we will be living in a world that is powered by the sun, the waves, and the wind.

Moreover, popular opposition to fossil-fuel development is mounting, generating political pressure and financial and legal risks. Ordinary people everywhere have been working hard to halt projects inconsistent with a climate-safe future, whether by protesting against the Dakota Access Pipeline in the United States or the Kinder Morgan Trans Mountain Pipeline System in Canada; by joining the blockade by “kayactivists” of drilling rigs in the Arctic; or by using local referenda to stop oil and mining projects in Colombia.

Recently, over 450 organizations from more than 70 countries signed the Lofoten Declaration, which explicitly calls for the managed decline of the fossil-fuel sector. The declaration demands leadership from those who can afford it, a just transition for those affected, and support for countries that face the most significant challenges.

Wealthy countries should lead the way. Norway, for example, is not just one of the world’s richest countries; it is also the seventh-largest exporter of carbon dioxide emissions, and it continues to permit exploration and development of new oil and gas fields. Proposed and prospective new projects could increase the amount of emissions Norway enables by 150%.

If Norway is to fulfill its proclaimed role as a leader in international climate discussions, its government must work actively to reduce production, while supporting affected workers and communities during the transition. Canada, another wealthy country that considers itself a climate leader yet continues to pursue new oil and gas projects, should do the same.

Some countries are already moving in the right direction. French President Emmanuel Macron has introduced a bill to phase out all oil and gas exploration and production in France and its overseas territories by 2040; the Scottish government has banned fracking altogether; and Costa Rica now produces the vast majority of its electricity without oil. But the real work is yet to come, with countries not only canceling plans for new fossil-fuel infrastructure, but also winding down existing systems.

A fossil-free economy can happen by design or by default. If we build it purposefully, we can address issues of equity and human rights, ensuring that the transition is fair and smooth, and that new energy infrastructure is ecologically sound and democratically controlled. If we allow it simply to happen on its own, many jurisdictions will be stuck with pipelines to nowhere, half-built mega-mines, and stranded assets that weaken the economy and contribute to political polarization and social unrest. There is only one sensible option.

Citizens around the world are championing a vision of a better future – a future in which communities, not corporations, manage their natural resources and ecosystems as commons, and people consume less, create less toxic plastic waste, and enjoy a generally healthier environment. It is up to our political leaders to deliver that vision. They should be working actively to engineer a just and smart shift to a future free of fossil fuels, not making that future harder and more expensive to achieve.

Tzeporah Berman, former Co-Director of Greenpeace International’s Climate Program and co-founder of ForestEthics, is a strategic adviser to a number of First Nations, environmental organizations, and philanthropic foundations and an adjunct professor at York University. She is the author of This Crazy Time: Living Our Environmental Challenge. Lili Fuhr heads the Ecology and Sustainable Development Department at the Heinrich Böll Foundation.

By Tzeporah Berman and Lili Fuhr

Learning from Martin Luther About Technological Disruption

GENEVA – Five hundred years ago this week, a little-known priest and university lecturer in theology did something unremarkable for his time: he nailed a petition to a door, demanding an academic debate on the Catholic Church’s practice of selling “indulgences” – promises that the buyer or a relative would spend less time in purgatory after they died.


Today, Martin Luther’s “95 Theses,” posted at the Castle Church in Wittenberg, Germany (he simultaneously sent a copy to his boss, Cardinal Albrecht von Brandenburg), are widely recognized as the spark that started the Protestant Reformation. Within a year, Luther had become one of Europe’s most famous people, and his ideas – which challenged not only Church practice and the Pope’s authority, but ultimately man’s relationship with God – had begun to reconfigure systems of power and identity in ways that are still felt today.

What made Luther’s actions so momentous? After all, calls for reforming the Church had been occurring regularly for centuries. As the historian Diarmaid MacCulloch writes in A History of Christianity: The First Three Thousand Years, the two centuries before Luther featured near-constant challenges to papal supremacy on issues of philosophy, theology, and politics. How did the concerns of a minor theologian in Saxony lead to widespread religious and political upheaval?

A central piece of the puzzle is the role of emerging technology. A few decades before Luther developed his argument, a German goldsmith named Johannes Gutenberg had invented a new system of movable-type printing, allowing the reproduction of the written word at greater speeds and lower costs than the laborious and less-durable woodblock approach.

The printing press was a revolutionary – and exponential – technology for the dissemination of ideas. In 1455, the “Gutenberg Bible” was printed at a rate of roughly 200 pages per day, significantly more than the 30 pages per day that a well-trained scribe could produce. By Luther’s time, the daily printing rate of a single press had increased to roughly 1,500 single-sided sheets. Improved printing efficiency, combined with steep declines in cost, led to a dramatic increase in access to the written word between 1450 and 1500, even though only an estimated 6% of the population was literate.

Luther quickly grasped the potential of the printing press to spread his message, effectively inventing new forms of publishing that were short, clear, and written in German, the language of the people. Perhaps Luther’s most enduring personal contribution came via his translation of the Bible from Greek and Hebrew into German. He was determined to “speak as men do in the marketplace,” and more than 100,000 copies of the “Luther Bible” were printed in Wittenberg over the following decades, compared to just 180 copies of the Latin Gutenberg Bible.

This new use of printing technology to produce short, punchy pamphlets in the vernacular transformed the industry itself. In the decade before Luther’s theses, Wittenberg printers published, on average, just eight books annually, all in Latin and aimed at local university audiences. But, according to the British historian Andrew Pettegree, between 1517 and Luther’s death in 1546, local publishers “turned out at least 2,721 works” – an average of “91 books per year,” representing some three million individual copies.

Pettegree calculates that a third of all books published during this period were written by Luther himself, and that the pace of publishing continued to increase after his death. Luther effectively published a piece of writing every two weeks – for 25 years.

The printing press greatly expanded the accessibility of the religious controversy that Luther helped fuel, galvanizing the revolt against the Church. Research by the economic historian Jared Rubin indicates that the mere presence of a printing press in a city before 1500 greatly increased the likelihood that the city would become Protestant by 1530. In other words, the closer you lived to a printing press, the more likely you were to change the way you viewed your relationship with the Church, the most powerful institution of the time, and with God.

There are at least two contemporary lessons to be drawn from this technological disruption. For starters, in the context of the modern era’s “Fourth Industrial Revolution” – which Klaus Schwab of the World Economic Forum defines as a fusion of technologies blending the physical, digital, and biological spheres – it is tempting to assess which technologies could be the next printing press. Those who stand to lose from them might even move to defend the status quo, as the Council of Trent did in 1546, when it banned the printing and sale, without Church approval, of any Bible versions other than the official Latin Vulgate.

But perhaps the most enduring lesson of Luther’s call for a scholarly debate – and his use of technology to deliver his views – is that it failed. Instead of a series of public discussions about the Church’s evolving authority, the Protestant Reformation became a bitter battle played out via mass communication, splitting not just a religious institution but also an entire region. Worse, it became a means to justify centuries of atrocities, and triggered the Thirty Years’ War, the deadliest religious conflict in European history.

The question today is how we can ensure that new technologies support constructive debate. The world remains full of heresies that threaten our identities and cherished institutions; the difficulty is to view them not as ideas that must be violently suppressed, but as opportunities to understand where and how current institutions are excluding people or failing to deliver promised benefits.

Calls for more constructive engagement may sound facile, naive, or even morally precarious. But the alternative is not just the hardening of divisions and estrangement of communities; it is widespread dehumanization, a tendency that current technologies seem to encourage.

Today’s Fourth Industrial Revolution could be an opportunity to reform our relationship with technology, amplifying the best of human nature. To grasp it, however, societies will need a subtler understanding of the interplay of identity, power, and technology than they managed during Luther’s time.

Nicholas Davis is Head of Society and Innovation at the World Economic Forum.

By Nicholas Davis

Banking on African Infrastructure

JOHANNESBURG – As the US Federal Reserve embarks on the “great unwinding” of the stimulus program it began nearly a decade ago, emerging economies are growing anxious that a stronger dollar will adversely affect their ability to service dollar-denominated debt. This is a particular concern for Africa, where, since the Seychelles issued its debut Eurobond in 2006, the total value of outstanding Eurobonds has grown to nearly $35 billion.


But if the Fed’s ongoing withdrawal of stimulus has frayed African nerves, it has also spurred recognition that there are smarter ways to finance development than borrowing in dollars. Of the available options, one specific asset class stands out: infrastructure.

Africa, which by 2050 will be home to an estimated 2.6 billion people, is in dire need of funds to build and maintain roads, ports, power grids, and so on. According to the World Bank, Africa must spend a staggering $93 billion annually to upgrade its current infrastructure; the vast majority of these funds – some 87% – are needed for improvements to basic services like energy, water, sanitation, and transportation.

Yet, if the recent past is any guide, the capital needed will be difficult to secure. Between 2004 and 2013, African states closed just 158 financing deals for infrastructure or industrial projects, valued at $59 billion – only about 5% of the total needed. Given this track record, how will Africa fund even a fraction of the World Bank’s projected requirements?

The obvious source is institutional and foreign investment. But, to date, many factors, including poor profit projections and political uncertainty, have limited such financing for infrastructure projects on the continent. Investment in African infrastructure is perceived as simply being too risky.

Fortunately, with work, this perception can be overcome, as some investors – such as the African Development Bank, the Development Bank of Southern Africa, and the Trade & Development Bank – have already demonstrated. Companies from the private sector are also profitably financing projects on the continent. For example, Black Rhino, a fund set up by Blackstone, one of the world’s largest multinational private equity firms, focuses on the development and acquisition of energy projects, such as fuel storage, pipelines, and transmission networks.

But these are the exceptions, not the rule. Fully funding Africa’s infrastructure shortfall will require attracting many more investors – and swiftly.

To succeed, Africa must develop a more coherent and coordinated approach to courting capital, while at the same time working to mitigate investors’ risk exposure. Public-private sector collaborations are one possibility. For example, in the energy sector, independent power producers are working with governments to provide electricity to 620 million Africans living off the grid. Privately funded but government regulated, these producers operate through power purchase agreements, whereby public utilities and regulators agree to purchase electricity at a predetermined price. There are approximately 130 such producers in Sub-Saharan Africa, valued at more than $8 billion. In South Africa alone, 47 projects are underway, accounting for 7,000 megawatts of additional power production.

Similar public-private partnerships are emerging in other sectors, too, such as transportation. Among the most promising are toll roads built with private money, a model that began in South Africa. Not only are these projects, which are slowly appearing elsewhere on the continent, more profitable than most financial market investments; they are also literally paving the way for future growth.

Clearly, Africa needs more of these ventures to overcome its infrastructure challenges. That is why I, along with other African business leaders and policymakers, have called on Africa’s institutional investors to commit 5% of their funds to local infrastructure. We believe that with the right incentives, infrastructure can be an innovative and attractive asset class for those with long-term liabilities. One sector that could lead the way on this commitment is the continent’s pension funds, which, together, possess a balance sheet of about $3 trillion.
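
For a rough sense of scale, a back-of-the-envelope calculation using only the figures cited in this article suggests what such a commitment could unlock. This is a sketch, not a forecast; it ignores timing, liquidity, and risk constraints on how pension assets can actually be deployed.

```python
# Back-of-the-envelope illustration using the figures cited in this article.
pension_assets_bn = 3_000   # African pension funds' combined balance sheet, ~$3 trillion
commitment_share = 0.05     # the 5% allocation urged by the 5% Agenda
annual_need_bn = 93         # World Bank estimate of Africa's annual infrastructure need

unlocked_bn = pension_assets_bn * commitment_share
print(f"5% of ~${pension_assets_bn}bn in pension assets is about ${unlocked_bn:.0f}bn,")
print(f"or roughly {unlocked_bn / annual_need_bn:.1f} times the ${annual_need_bn}bn annual need.")
```

Even if such allocations were phased in over many years, the order of magnitude helps explain why pension funds are seen as a sector that could lead the way on this commitment.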

The 5% Agenda campaign, launched in New York last month, underscores the belief that only a collaborative public-private approach can redress Africa’s infrastructure shortfall. For years, a lack of bankable projects deterred international financing. But in 2012, the African Union adopted the Program for Infrastructure Development in Africa, which kick-started more than 400 energy, transportation, water, and communications projects. It was a solid start – one that the 5% Agenda seeks to build upon.

But some key reforms will be needed. A high priority of the 5% Agenda is to assist in updating the national and regional regulatory frameworks that guide institutional investment in Africa. Similarly, new financial products must be developed to give asset owners the ability to allocate capital directly to infrastructure projects.

Unlocking new pools of capital will help create jobs, encourage regional integration, and ensure that Africa has the facilities to accommodate the needs of future generations. But all of this depends on persuading investors to put their money into African projects. As business leaders and policymakers, we must ensure that the conditions for profitability and social impact are not mutually exclusive. When development goals and profits align, everyone wins.

Ibrahim Assane Mayaki, a former Prime Minister of Niger, is CEO of the New Partnership for Africa’s Development (NEPAD) Planning and Coordinating Agency.

By Ibrahim Assane Mayaki

Preempting the Next Pandemic

SYRACUSE – Recent disease outbreaks, like Ebola and Zika, have demonstrated the need to anticipate pandemics and contain them before they emerge. But the sheer diversity, resilience, and transmissibility of deadly diseases have also highlighted, in the starkest of terms, just how difficult containment and prevention can be.


One threat to our preparedness is our connectedness. It was thanks to easy international travel that in recent years the dengue, chikungunya, and Zika viruses were all able to hitch a ride from east to west, causing massive outbreaks in the Americas and Caribbean. Another threat is more mundane: failing to agree about money. Whatever the reason, the fact is that as long as humans fail to organize a collective and comprehensive defense, infectious diseases will continue to wreak havoc – with disastrous consequences.

Building an effective prevention and containment strategy – being bio-prepared – is the best way to reduce the threat of a global contagion. Preparedness requires coordination among agencies and funders to build networks that enable quick deployment of and access to vaccines, drugs, and protocols that limit a disease’s transmission. Simply stated, preparing for the next pandemic means not only building global capacity, but also paying for it.

That’s the idea, at least. The reality of bio-preparedness is far more complicated. For starters, the absence of dedicated funding is impeding implementation of long-term prevention strategies in many countries; a new World Bank report finds that only six countries, including the United States, have taken the threat seriously. Meanwhile, public health officials in many parts of the world struggle to respond to disease outbreaks, owing to a dearth of labs and clinics. And many funding agencies, including governments and NGOs, typically offer only one-year commitments, which rules out long-term planning.

For years, scientists, physicians, and civil-society actors have voiced concern over the lack of reliable, meaningful, and institutionalized investment in pandemic preparedness. These pleas have come, frustratingly, as military funding to thwart bio-attacks, consciously mounted by human actors, has remained robust. But while purposeful and nefarious infectious-disease outbreaks could do massive damage, they remain relatively unlikely. Naturally occurring outbreaks, by contrast, occur regularly and are far more costly, even if they lack the sensational “fear factor” of bioterrorism.

Not that long ago, those of us engaged in the prevention of infectious-disease outbreaks felt more secure about the availability of the resources required to prepare. But in many places, budgets are stagnating, or even declining. This is astonishingly shortsighted, given the relative costs of prevention versus response. For example, what would it have cost to build the clinical and laboratory infrastructure and provide the training needed to identify and prevent the recent Ebola outbreak in West Africa? Precise figures are elusive, but I have no doubt it would have been less than the billions of dollars spent on containment. Preparedness pays.

It is not only the lack of funding that is raising alarms; so are restrictions on how available funds can be used. It is not uncommon for a grant to be restricted to specific activities, leaving major gaps in a program’s capacity to meet its objectives. A funder may, for example, allow the renovation of an existing lab but not the construction of a new one; or funds may support the purchase of a diagnostic machine but not the training of those required to operate it. In many developing countries, communities do not even have the physical buildings in which to test, monitor, or store dangerous pathogens. Myopic funding that overlooks key elements of the big picture is money poorly spent.

Add to these challenges the difficulty of paying staff or ensuring reliable electricity and other essential services, and it becomes clear that preparing for disease outbreaks requires broad engagement with the international aid community. But at the moment, onerous spending rules and weak financial commitments are tying the hands of those working to prevent the next serious disease outbreak.

The number of obstacles faced by scientists and public health experts in the race to contain deadly infectious diseases is staggering. To overcome them, we need to redefine how we think about preparedness, moving from a reactive position to a more proactive approach. Money earmarked for preparedness must be allocated at levels sufficient to have the required impact. Limitations on how it can be spent should be loosened. Funding sources must be opened to allow for multi-year commitments. Health-care providers and first responders must receive proper training. And long-term solutions such as establishing and connecting bio-surveillance systems should be expanded and strengthened, to enable public-health professionals around the world to track and report human and animal diseases and plan defenses together.

Public health is an essential element of global security. Failing to invest appropriately in prevention of infectious-disease outbreaks puts all of us at risk, whenever or wherever the next one occurs.

Stephen J. Thomas, an infectious diseases physician, is Professor of Medicine and Chief of the Division of Infectious Diseases at the State University of New York, Upstate Medical University.

By Stephen J. Thomas

Crypto-Fool’s Gold?

CAMBRIDGE – Is the cryptocurrency Bitcoin the biggest bubble in the world today, or a great investment bet on the cutting edge of new-age financial technology? My best guess is that in the long run, the technology will thrive, but that the price of Bitcoin will collapse.


If you haven’t been following the Bitcoin story, its price is up 600% over the past 12 months, and 1,600% in the past 24 months. At over $4,200 (as of October 5), a single unit of the virtual currency is now worth more than three times an ounce of gold. Some Bitcoin evangelists see it going far higher in the next few years.

What happens from here will depend a lot on how governments react. Will they tolerate anonymous payment systems that facilitate tax evasion and crime? Will they create digital currencies of their own? Another key question is how successfully Bitcoin’s numerous “alt-coin” competitors can penetrate the market.

In principle, it is supremely easy to clone or improve on Bitcoin’s technology. What is not so easy is to duplicate Bitcoin’s established lead in credibility and the large ecosystem of applications that have built up around it.

For now, the regulatory environment remains a free-for-all. China’s government, concerned about the use of Bitcoin in capital flight and tax evasion, has recently banned Bitcoin exchanges. Japan, on the other hand, has enshrined Bitcoin as legal tender, in an apparent bid to become the global center of fintech.

The United States is taking tentative steps to follow Japan in regulating fintech, though the endgame is far from clear. Importantly, Bitcoin does not need to win every battle to justify a sky-high price. Japan, the world’s third largest economy, has an extraordinarily high currency-to-income ratio (roughly 20%), so Bitcoin’s success there is a major triumph.

In Silicon Valley, drooling executives are both investing in Bitcoin and pouring money into competitors. After Bitcoin, the most important is Ethereum. The sweeping, Amazon-like ambition of Ethereum is to allow its users to employ the same general technology to negotiate and write “smart contracts” for just about anything.

As of early October, Ethereum’s market capitalization stood at $28 billion, versus $72 billion for Bitcoin. Ripple, a platform championed by the banking sector to slash transaction costs for interbank and overseas transfers, is a distant third at $9 billion. Behind the top three are dozens of fledgling competitors.

Most experts agree that the ingenious technology behind virtual currencies may have broad applications for cyber security, which currently poses one of the biggest challenges to the stability of the global financial system. For many developers, the goal of achieving a cheaper, more secure payments mechanism has supplanted Bitcoin’s ambition of replacing dollars.

But it is folly to think that Bitcoin will ever be allowed to supplant central-bank-issued money. It is one thing for governments to allow small anonymous transactions with virtual currencies; indeed, this would be desirable. But it is an entirely different matter for governments to allow large-scale anonymous payments, which would make it extremely difficult to collect taxes or counter criminal activity. Of course, as I note in my recent book on past, present, and future currencies, governments that issue large-denomination bills also risk aiding tax evasion and crime. But cash at least has bulk, unlike virtual currency.

It will be interesting to see how the Japanese experiment evolves. The government has indicated that it will force Bitcoin exchanges to be on the lookout for criminal activity and to collect information on deposit holders. Still, one can be sure that global tax evaders will seek ways to acquire Bitcoin anonymously abroad and then launder their money through Japanese accounts. Carrying paper currency in and out of a country is a major cost for tax evaders and criminals; by embracing virtual currencies, Japan risks becoming a Switzerland-like tax haven – with the bank secrecy laws baked into the technology.

Were Bitcoin stripped of its near-anonymity, it would be hard to justify its current price. Perhaps Bitcoin speculators are betting that there will always be a consortium of rogue states allowing anonymous Bitcoin usage, or even state actors such as North Korea that will exploit it.

Would the price of Bitcoin drop to zero if governments could perfectly observe transactions? Perhaps not. Even though Bitcoin transactions require an exorbitant amount of electricity, with some improvements, Bitcoin might still beat the 2% fees the big banks charge on credit and debit cards.

Finally, it is hard to see what would stop central banks from creating their own digital currencies and using regulation to tilt the playing field until they win. The long history of currency tells us that what the private sector innovates, the state eventually regulates and appropriates. I have no idea where Bitcoin’s price will go over the next couple of years, but there is no reason to expect virtual currency to avoid a similar fate.

Kenneth Rogoff, a former chief economist of the IMF, is Professor of Economics and Public Policy at Harvard University.

By Kenneth Rogoff

The Not-So-Dire Future of Work

WASHINGTON, DC – The future of work is a hot topic nowadays. It has inspired a seemingly endless train of analyses, commentaries, and conferences, and it featured prominently in last week’s annual meetings of the International Monetary Fund and the World Bank. For good reason: new technologies – namely, digitization, robotics, and artificial intelligence – have far-reaching implications for employment. But, contrary to how the story is often framed, a happy ending is possible.


The current debate often skews toward the melodramatic, foretelling a future in which machines drive humans out of work. According to some bleak estimates, 47% of jobs are at risk in the United States; 57% in the OECD countries; two-thirds in developing economies; and half of all jobs globally (around two billion).

But similarly dire predictions of large-scale job destruction and high technology-driven structural unemployment, including from renowned economists, accompanied previous major episodes of automation. John Maynard Keynes offered one; Wassily Leontief provided another. Neither materialized. Instead, technological change acted as a powerful driver of productivity and employment growth.

One key reason is that the technological innovations that destroy some existing jobs also create new ones. While new technologies reduce demand for low- to middle-skill workers in routine jobs, such as clerical work and repetitive production, they also raise demand for higher-skill workers in technical, creative, and managerial fields. A recent analysis estimates that new tasks and job titles explain about half of the recent employment growth in the US.

Given this, the evolution of work should be viewed as a process of dynamic adjustment, not as a fundamentally destructive process that we should seek to slow. To erect barriers to innovation, such as taxes on robots, which some have proposed as a way to ease the pressure on workers, would be counterproductive. Instead, measures should focus on equipping workers with the higher-level skills that a changing labor market demands, and supporting workers during the adjustment process.

So far, education and training have been losing the race with technology. Shortages of the technical and higher-level skills demanded by new technologies are partly responsible for the paradox of booming technology and slowing productivity growth in advanced economies: skills shortages have constrained the diffusion of innovations. Imbalances between supply and demand have also fueled income inequality, by increasing the wage premia that those with the right skills can command.

To address these shortcomings, education and training programs must be revamped and expanded. As the old career path of “learn, work, retire” gives way to one of continuous learning – a process reinforced by the aging of many economies’ workforces – options for reskilling and lifelong education must be scaled up.

This will demand innovations in the content, delivery, and financing of training, as well as new models for public-private partnerships. The potential of technology-enabled solutions must be harnessed, supported by a stronger foundation of digital literacy. At a time of rising inequality – in the US, for example, gaps in higher education attainment by family income level have widened – a strong commitment to improving access for the economically disadvantaged is also vital.

At the same time, countries must facilitate workers’ ability to change jobs through reforms to their labor markets and social safety nets. This means shifting the focus from backward-looking labor-market policies, which seek to protect workers in existing jobs, to future-oriented measures, such as innovative insurance mechanisms and active labor-market policies.

Moreover, social contracts based on formal long-term employer-employee relationships will need to be overhauled, with benefits such as retirement and health care made more portable and adapted to evolving work arrangements, including the expanding “gig” economy. Here, several proposals have already been put forward, including a universal basic income, currently being piloted in Finland and some sub-national jurisdictions such as Ontario, Canada; a negative income tax; and various types of portable social security accounts that pool workers’ benefits.

On both of these fronts, France is setting a positive example. Early this year, the country launched a portable “personal activity account,” which enables workers to accrue rights to training across multiple jobs, rather than accumulating such rights only within a specific position or company. President Emmanuel Macron’s administration is now undertaking reforms to France’s stringent job protections, in order to boost labor-market flexibility. Pursuing such initiatives simultaneously will enable France to capture reform synergies and ease the adjustment for workers.

Technological change will continue to pose momentous challenges to labor markets across economies, just as it has in the past. But, with smart, forward-looking policies, we can meet those challenges head on – and ensure that the future of work is a better job.

Zia Qureshi, a former director of development economics at the World Bank, is a non-resident senior fellow at the Brookings Institution.

By Zia Qureshi

Educating for Myanmar’s Future


GENEVA – The violence that has ravaged Myanmar’s Rakhine State underscores the challenges the country faces on its bumpy road from military rule to democracy. Myanmar is confronting a deep crisis, and urgent action is needed to prevent further violence and to assist the huge numbers of refugees and internally displaced people. To address the political, socioeconomic, and humanitarian challenges fueled by the crisis, the Advisory Commission on Rakhine State, chaired by Kofi Annan, recommends sustained action on a number of fronts to prevent violence, maintain peace, and foster reconciliation.


While global attention has rightly focused on how to end the attacks on Muslim Rohingya, many other, more systemic fixes are critical to Myanmar’s long-term stability. Education reform is one of the most important.

In late August, I was in Naypyidaw, Myanmar’s new capital, with the International Commission on Financing Global Education Opportunity. The Education Commission, as we are known, was there to present findings from our latest report, The Learning Generation, and to share ideas with the country’s leadership on paying for education and improving outcomes. We met with Aung San Suu Kyi, the government’s de facto leader, and Myo Thein Gyi, the education minister.

Our conversations were cordial and productive. By the end, we agreed on this much: sustaining Myanmar’s political transition hinges on improving its education sector.

To many of Myanmar’s leaders, their country is an economic-power-in-waiting. Home to some 53 million people, it is rich in minerals, natural gas, and fertile farmland, and it occupies a strategic location between India and China. Most important, Myanmar is rich in human potential, with a diverse and youthful workforce – the median age is just 28 – ready to take their country forward. What Myanmar lacks are the schools needed to train them.

Before military rule was imposed in 1962, Myanmar’s education system was among the best in Asia. For the next half-century, schools were neglected and underfunded. Starved of resources and teachers, the system atrophied. Rote learning replaced critical thinking, undermining creativity. Today, while some children have returned to the classroom, attendance in many parts of the country remains low, and teaching standards poor, contributing to high dropout rates.

In addition to these shortcomings, Myanmar faces severe human challenges, including endemic poverty, poor health indicators, and a lack of basic infrastructure. Among ASEAN countries, Myanmar has the lowest life expectancy and the second-highest rate of infant and child mortality.

Improving Myanmar’s education system, while tackling its other problems, will not be easy. But it can be done. Vietnam and South Korea offer inspiring examples of countries that transformed their education systems within a generation. As former South Korean education minister and commission member Lee Ju-ho noted during our visit, teaching young people to think critically takes time, but the results can have powerful knock-on effects for a country’s knowledge economy.

Aware of these benefits, Myanmar has put education at the heart of its reform agenda. Work on one priority – improving inclusivity – is already underway. For example, the government is currently working to encourage instruction in more local languages – more than 100 are spoken in Myanmar – in rural areas. Moreover, the government has increased its education budget, from just 0.7% of GDP in 2011 to 2.1% of GDP in 2014. While spending remains far below the regional average of 3.6% of GDP, funding is moving in the right direction.

To be sure, much work remains to be done. The government’s recently completed National Education Strategic Plan sets out an ambitious five-year timeline to improve “the knowledge, skills, and competencies” of all its students. The Advisory Commission on Rakhine State recommends that all communities should have equal access to education. The Education Commission supports these recommendations. As Suu Kyi noted during our conversation, education will play an increasingly important role in reducing poverty and promoting peace. If members of the current generation are to become productive members of society, she noted, they must be trained in cultural and ethical understanding.

During this fraught period of political transition, inclusive education can help promote a peaceful consolidation of democracy. As the crisis in Rakhine State powerfully illustrates, ethnic and ideological rifts run deep in Myanmar, and accessible, quality education may be the only means by which a common sense of shared identity can be cultivated. And, of course, better training in basic skills can also ultimately boost economic growth and increase social welfare.

The list of challenges facing Myanmar’s leaders is long, and overcoming most of them will be neither quick nor easy. But ensuring that no child loses the opportunity to learn must rank near the top of the country’s agenda.

Caroline Kende-Robb is chief adviser to the International Commission on Financing Global Education Opportunity.

By Caroline Kende-Robb

Central Banks Must Work Together – or Suffer Alone

NEW YORK – Global growth seems to be moving, slowly but surely, along the path to recovery. The International Monetary Fund’s latest World Economic Outlook predicts 3.5% global growth this year, up from 3.2% last year. But there’s a hitch: the easy monetary policies that have largely enabled economies to return to growth are reaching their limits, and now threaten to disrupt the recovery by creating the conditions for another financial crisis.


In recent years, the world’s major central banks have pursued unprecedentedly easy monetary policies, including what a recent Deutsche Bank report calls “multi-century all-time lows in interest rates.” That, together with large-scale quantitative easing, has injected a massive $32 trillion into the global economy over the last nine years. But these unconventional policies are turning out to be a classic game-theoretic bad equilibrium: each central bank stands to gain by keeping interest rates low, but, collectively, their approach constitutes a trap.

In today’s globalized world, a slight reduction in interest rates by an individual central bank can bring some benefits, beginning with weakening the currency and thus boosting exports. But the more countries employ this strategy, the greater the strain on the banking sector. This is already apparent in Europe, where bank equity prices have dropped steadily in recent months.

Moreover, low and especially negative interest rates make holding cash costly, prompting investors to seek riskier investments with higher potential returns. As a result, collateralized loan obligations (CLOs) have more than doubled this year, reaching an overall market value of $460 billion. That looks a lot like the surge in collateralized debt obligations (CDOs) that helped to drive the 2008 financial crisis. While the world has implemented more checks and balances for CLOs than it did for CDOs before the crisis, the trend remains deeply worrying.

Finally, persistently low interest rates can cause people to worry about their retirement funds, spurring them to save more. Far from boosting consumption, as intended, monetary stimulus may create an environment that dampens demand, weakening prospects for economic growth.

Today, no single country can steer the world away from this trap. The United States, which might have taken the lead in the past, has ceded its global leadership position in recent years – a process that has been greatly accelerated during the first year of Donald Trump’s presidency. Moreover, the G20 has lately lost steam in supporting closer coordination of monetary and fiscal policies among the world’s major advanced and emerging economies.

Perhaps a new grouping of the major players – the GMajor? – needs to step up, before it is too late. To gain the needed motivation, monetary policymakers should recall the “traveler’s dilemma,” a game theory parable that highlights the pitfalls of individual rationality.

The parable features a group of travelers, returning home with identical pottery purchased on a remote island. Finding that the pottery has been damaged in transit, they demand compensation from the airline. Because the airline manager – known as the “financial wizard” – has no idea what the price of the pottery is, a creative solution is needed to determine the appropriate amount of compensation.

The manager decides that each traveler should write down the price – any integer from $2 to $100 – without conferring with the others. If all write the same number, that figure will be understood as the price, and thus the amount of compensation each traveler receives. If they write different numbers, the lowest number will be taken as the correct price. Whoever wrote the lowest number will receive an additional $2, as a reward for honesty, while anyone who wrote a higher number will receive $2 less, as a penalty for cheating. So if some write $80 and some $90, they will receive $82 and $78, respectively, in compensation.

At first blush, the travelers are thrilled. The pottery has no actual monetary value, but if they each write $100, all can receive $100 in compensation. One traveler, however, quickly realizes that writing $99 would be a better option, because it would garner that extra $2 reward, and thus a total of $101. That traveler quickly realizes, however, that others must have had the same idea, and so decides to put down $98 instead. But what if the others had the same thought? Better make it $97.

In the end, trapped by this inexorable logic, all travelers end up writing and receiving $2. The outcome may seem a disaster, but it is also the most rational choice – the “Nash equilibrium” of the traveler’s dilemma game. It is clear how the financial wizard came by his moniker.
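
To make the unraveling concrete, here is a minimal sketch in Python of the payoff rule described above, together with a simple best-response iteration that traces the second-guessing down to the floor. The iteration is only an illustrative device for the logic of the parable, not a claim about how the travelers, or central bankers, actually deliberate.

```python
# A minimal sketch of the traveler's dilemma payoff rule described above.
# Claims range from $2 to $100; the lower claim earns a $2 reward,
# the higher claim pays a $2 penalty, and equal claims are paid at face value.

REWARD = 2
FLOOR, CEILING = 2, 100

def payoff(mine: int, other: int) -> int:
    """Compensation for claiming `mine` when the other traveler claims `other`."""
    if mine == other:
        return mine
    low = min(mine, other)
    return low + REWARD if mine < other else low - REWARD

# Trace one traveler's best response to an opponent expected to claim `belief`,
# starting from the cooperative claim of $100.
belief = CEILING
while True:
    best = max(range(FLOOR, CEILING + 1), key=lambda c: payoff(c, belief))
    if best == belief:
        break
    belief = best

print(f"Claims unravel to ${belief}")  # -> Claims unravel to $2
```

Starting from the cooperative claim of $100, each round of second-guessing shaves a dollar off the claim, until the process bottoms out at the $2 Nash equilibrium, which is exactly the broken-pottery outcome the parable warns about.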

The moral of the story is simple. The invisible hand of the market does not always lead individually self-interested agents to a collectively desirable outcome. Altruism and regard for others must play a role. If they are missing, the players at least need to coordinate their decisions. Unless central bankers take that message to heart, they will find themselves sweeping up a lot of broken pottery.

Kaushik Basu, former Chief Economist of the World Bank, is Professor of Economics at Cornell University and Nonresident Senior Fellow at the Brookings Institution.

 
