CHAPTER IV - What Role for Government in Overseeing Digital Assets?

In light of the ferocious pace of technological innovation, an unavoidable question is what role government should play, and how policymaking at different levels—national, state and international—might be coordinated.

Bill Coleman, the long-time entrepreneur and Partner of Alsop Louie Partners, envisions government as “a platform for the digital economy” whose first principle ought to be “Do no harm!” By that, he means that government should not be in the business of choosing winners and losers in the marketplace.

In effect, Coleman regards ARPANET—the network that the Defense Department’s Advanced Research Projects Agency incubated in the late 1960s and early 1970s, and which grew into the Internet—as the model for how government should behave in developing new systems for managing digital assets. Government should seed, nurture and facilitate the new system in all its dimensions. Other possible models of “government as platform,” said Coleman, include the government’s role in developing and spinning off G.P.S.; in collecting and disseminating weather information; and arguably its role in building the interstate highway system and the Federal Aviation Administration network.

Coleman believes that government should adhere to the following basic principles: Build open infrastructures that allow others to add value; do not constrain innovation; allow social norms to emerge before codifying them through regulatory regimes; do not charge fees to raise transaction revenue except in rare cases necessary to enable the platform, and then only when a government agency is the only logical mechanism (as the Obama administration did in proposing that the Federal Aviation Administration charge a $100 per-flight fee on every takeoff and landing by general aviation aircraft).

The key issue for government to address, said Coleman, is how to unlock the value of digital assets. This requires that mechanisms be found to enable the protection of digital assets, whether through technological, legal, regulatory or social means. Coleman believes that securing personal information and intangible assets requires some means to tie them to one’s personal identity, and then to ensure that the owner of that digital identity be able to assert effective control over the asset (e.g., using cryptocurrency).
So the question may reduce, in the first instance, to how to secure a person’s digital identity. For this, Coleman believes that parties to transactions involving intangible assets must be able to authenticate identity and claims without having to disclose personal identifying information (PII). The privacy of the owner/authorizer must be protected throughout all counterparty transactions. Privacy cannot really be achieved, Coleman argued, without cleanly separating authentication of a person’s identity from authorization to engage in a transaction. When the two are integrated, he said, there is a much greater risk of mass disclosures of PII.

The best and perhaps only solution, he suggested, is to avoid centralized control of authorization and to empower individual owners of digital assets to control identity-authorization themselves. “The problem today is that every time I want to do business with someone, the authorizer is also the authenticator, which means that I am giving everybody my personal identifying information. So the more things you do on the Internet, the weaker your security is. This is the paradigm that we have to break.”
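The separation Coleman describes can be made concrete in code. The following minimal Python sketch is purely illustrative (the names, key and HMAC scheme are assumptions, not any real NSTIC protocol): an authenticator signs a token attesting a single claim, and the authorizing party verifies the token and decides on the claim alone, without ever receiving PII.

```python
import hashlib
import hmac
import json

# Illustrative sketch: the authenticator attests a claim without
# embedding any PII; the authorizer learns only the claim and the
# fact that the authenticator vouches for it.

AUTHENTICATOR_KEY = b"demo-secret"  # held only by the authenticator

def issue_token(claims: dict) -> dict:
    """Authenticator: sign a payload containing claims, not identity."""
    payload = json.dumps({"claims": claims}, sort_keys=True)
    sig = hmac.new(AUTHENTICATOR_KEY, payload.encode(),
                   hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}

def token_is_genuine(token: dict) -> bool:
    """Check that the token was really issued by the authenticator."""
    expected = hmac.new(AUTHENTICATOR_KEY, token["payload"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, token["sig"])

def authorize(token: dict) -> bool:
    """Authorizer: decide on the claim alone -- no name, no address."""
    if not token_is_genuine(token):
        return False
    return json.loads(token["payload"])["claims"].get("over_18", False)

token = issue_token({"over_18": True})
print(authorize(token))  # True
```

Because the authorizer only ever sees `{"over_18": true}` and a signature, a breach of the authorizer discloses no identifying information—the risk Coleman associates with integrating the two roles.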

Centralized storage of identity credentials has not worked, said Coleman, because the data is never truly up-to-date; because it is seen as the most attractive place for hackers to attack; and because “not every party can ever agree who is going to be the neutral disintermediator for all of my transactions.”

To help build out the digital economy, therefore, government must show leadership in building “a platform service to enable trust based on the vast amounts of identifying information on citizens, organizations and businesses which it maintains across all departments and agencies. This platform must be open and extensible to include other trust authorities and services,” said Coleman.

Coleman recommended that the government platform should be an extension of the program run by the government’s NSTIC Program—the National Strategy for Trusted Identities in Cyberspace—and ultimately governed by a neutral, non-governmental organization. The platform should provide “authentication services and secure protocols which provide privacy-enhancing authentication in accordance with the NSTIC principles, thus separating authentication from the authorization.” Coleman envisions a system in which the storage of personal identifying information for authentication would be decentralized, and which would use open APIs and protocols to allow third-party authenticating services to join the network.

Besides assuring privacy and fostering digital commerce, a government trust platform is needed to help manage the identity credentials for the emerging Internet of Things. The proliferation of sensors and devices with Uniform Resource Identifiers (URIs) means that the growing universe of networked computing tools must also be subject to reliable authentication and authorization protocols.

Here, too, it makes the most sense for individual owners and transactors to set the appropriate credentialing policies for access, use and exchange. They should be able to choose from any combination of authentication services that they deem trustworthy in order to gain the confidence of the counterparties to authorize a transaction.

Fortunately, there are now many sorts of authentication systems (such as homomorphic encryption, tokens, cryptographic hashes, etc.) that can achieve reliable identity and claims authentications without requiring the sharing of actual personal identifying information. John Clippinger added that software systems such as OAuth, an open standard for authorization; the decentralized blockchain ledger technology at the heart of Bitcoin; and ID3’s Open Mustard Seed software all offer authentication without PII disclosures. These systems also move beyond password-based systems of authentication, introducing combinations of biometric and behavioral metrics that are not easily spoofed by third parties.
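One simple member of this family is challenge-response authentication, in which a prover demonstrates possession of an enrolled credential without ever transmitting it. A minimal sketch using only Python’s standard library (the credential and names are illustrative; a real deployment would use asymmetric keys so the verifier need not hold the secret):

```python
import hashlib
import hmac
import os

# Illustrative challenge-response: the verifier learns that the prover
# holds the enrolled secret, but the secret itself never travels over
# the wire and no PII is exchanged.

secret = hashlib.sha256(b"enrolled-credential").digest()  # set up once

def respond(challenge: bytes) -> str:
    """Prover: answer a fresh challenge using the enrolled secret."""
    return hmac.new(secret, challenge, hashlib.sha256).hexdigest()

def verify(challenge: bytes, response: str) -> bool:
    """Verifier: recompute the expected answer and compare."""
    expected = hmac.new(secret, challenge, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)  # a fresh nonce per authentication attempt
print(verify(challenge, respond(challenge)))  # True
```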

Trust in transactions of digital assets can be further enhanced, said Coleman, by the routine creation of a secure, encrypted audit trail, which is accessible only to the owner of the resources or by duly authorized authorities acting under due process (e.g., court order). This scenario would allow individuals to engage in anonymous transactions and yet, with sufficient probable cause for suspecting illegal activity, authorities could identify individual suspects.
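The tamper-evident property of such an audit trail can be illustrated with a hash chain, in which every entry commits to its predecessor, so any later alteration breaks the chain. This sketch shows only the integrity mechanism; the encryption and due-process access controls Coleman describes are assumed and omitted:

```python
import hashlib
import json

# Minimal tamper-evident audit trail: each entry stores the hash of
# the previous entry inside its own hashed body, so modifying any
# past entry invalidates every hash that follows it.

def append(log: list, event: dict) -> None:
    prev = log[-1]["hash"] if log else "0" * 64
    body = json.dumps({"event": event, "prev": prev}, sort_keys=True)
    log.append({"body": body, "prev": prev,
                "hash": hashlib.sha256(body.encode()).hexdigest()})

def chain_is_valid(log: list) -> bool:
    prev = "0" * 64
    for entry in log:
        if entry["prev"] != prev:
            return False
        if hashlib.sha256(entry["body"].encode()).hexdigest() != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

log = []
append(log, {"actor": "alice", "action": "transfer"})
append(log, {"actor": "bob", "action": "receive"})
print(chain_is_valid(log))  # True
log[0]["body"] = log[0]["body"].replace("alice", "eve")
print(chain_is_valid(log))  # False
```

Because validity can be checked by anyone holding the log, the owner can prove the record was not rewritten, while the entries themselves would remain encrypted to everyone but the owner or a duly authorized authority.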

Moving Forward on an Alternative Regime for Managing Digital Assets

A new regulatory and legal regime is essential, said John Clippinger of ID3, because the existing apparatus of law and regulation presumes that data can be adequately sequestered from unauthorized parties—a presumption that is manifestly untrue. “We have to start to recognize that data has got to flow. You cannot just say, ‘Do not collect PII,’ as if data can be quarantined. The presumption that you can cordon off certain kinds of data and control them by limiting the flow is just not credible. Our lives are immersed in data. We require data.” A more realistic, effective legal and regulatory system would recognize that “with the use of data comes obligations. With potential harms come duties.”

If such a system were in place, it could help make available all sorts of valuable datasets held by private corporations that could help inform public policy. Allen Blue of LinkedIn pointed out that its user data could be of immense help to the U.S. Department of Labor in developing policies to “close the skills gap.” But of course, LinkedIn users currently have no way to authorize government use of PII that they give to LinkedIn, nor much confidence that the government would in fact respect their privacy.

This problem prompted Michael Fertik to suggest that new systems be established for “the provenancing of data” so that one might establish responsibility for data as it is shared over time. “If you know from whence the data come, you can know whom to blame for a violation and whom to credit for creativity or innovation.”
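One way to implement such provenancing is to have every derived dataset carry cryptographic fingerprints of its sources, forming a traceable lineage. A minimal illustration (the record format and names are hypothetical, not a proposal from the discussion):

```python
import hashlib

# Hypothetical provenance records: each dataset carries the hashes of
# the datasets it was derived from, so responsibility and credit can
# be traced back through the chain of custody.

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def derive(data: bytes, sources: list) -> dict:
    """Register a dataset along with the fingerprints of its sources."""
    return {"hash": fingerprint(data),
            "derived_from": [s["hash"] for s in sources]}

raw = derive(b"survey responses", sources=[])
cleaned = derive(b"cleaned responses", sources=[raw])
report = derive(b"aggregate report", sources=[cleaned])
print(report["derived_from"] == [cleaned["hash"]])  # True
```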

Perhaps a more basic meta-problem is how any new initiative for a government trust platform would be initiated. Fertik noted that the problem with the government trust framework idea is that “it more or less relies upon everyone in the room to hold hands at the same time and leap together to a future that is uncertain and undefined. That is not going to happen.” Based on his experience, Fertik said that “these issues are normally solved from the edges in, and very incrementally,” usually after private business has taken the first steps or after a social norm has started to crystallize.

So what may instigate a new approach? There may be some seeds germinating right now. A company in Estonia has supposedly established a “keyless signature infrastructure” using a timestamp that has attracted the interest of telecom carriers. The U.S. Department of Health and Human Services is reportedly building a new system premised on data having provenance as it moves through the system. Government’s role as a major purchaser of new technology gives it the capacity to drive adoption of new standards and protocols. Another constructive step: leadership among government and business leaders to stir the civic imagination about the actual societal benefits of using data in creative ways to serve public health and civic needs.
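The “keyless signature” idea rests on Merkle-tree timestamping: the hashes of many documents are aggregated into a single root value that is published once per time period, so each document can later prove it existed by that time without any signing keys. A sketch of the aggregation step (the documents are illustrative):

```python
import hashlib

# Sketch of Merkle-tree aggregation, the core of keyless-signature
# timestamping: pairs of hashes are combined level by level until a
# single root remains; publishing that root timestamps every leaf.

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def merkle_root(leaves: list) -> bytes:
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:            # duplicate the last node if odd
            level.append(level[-1])
        level = [h(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

docs = [b"doc-a", b"doc-b", b"doc-c", b"doc-d"]
root = merkle_root(docs)
# Publishing `root` in a widely witnessed medium (a ledger, a
# newspaper) fixes all four documents in time with no private keys.
print(len(root) == 32)  # True
```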

In a short presentation, Kim Taipale of the Stilwell Center offered a conceptual framework for thinking about where strategic interventions might be made to achieve new policy approaches for digital assets. He referenced four de facto types of “law” that Professor Lawrence Lessig invokes in his 1999 book Code and Other Laws of Cyberspace: conventional law, technological architectures, markets and social norms. Each in its own way constitutes a kind of “law” that regulates people’s behaviors.

Taipale noted the limitations and failures of these different forms of law, as now constituted, in managing digital assets. For conventional law, there is a distinct sense that legal and regulatory systems are too cumbersome, slow and antiquated to deal with the pace of technological innovation. For network architecture, there is a problem with its propensity to produce a “power law distribution” of highly unequal outcomes—a “winner takes most” scenario that aggravates inequality and hollows out the middle class. For markets, the routine production of hidden externalities is a major problem. And for social norms, it is clear that general social awareness and support for different ways of managing digital assets have not yet emerged, which makes it difficult to imagine how new legal regimes could be formulated or enforced via social norms.

What might be the points of intervention in these different types of law? Taipale identified the following possibilities:

Conventional law: One could legislate certain minimal standards for technological or social performance using a command-and-control approach. The Consumer Privacy Bill of Rights, currently a set of guidelines, could be enacted into law, for example. It might also be attractive for legislators to consider creating an entirely new class of property rights called “data rights,” which might borrow elements of intellectual property law (which is ill-suited for managing data) while adding new twists that take account of prevailing use-cases of data. Because of its immense versatility, contract law may also be a fruitful body of law that could be adapted to deal with digital assets.

Technological architecture: One could “build in” certain discontinuities and friction back into the system as a way to advance certain values and social goals. For example, the problems caused by high-speed stock trading could be addressed by building in a uniform time delay for all transactions. Or platforms could be regulated to ensure that they provide open APIs. The Open Mustard Seed software proposed by ID3 is another way that the technology design itself could embody certain policy priorities.
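The uniform time delay mentioned above amounts to batching: orders arriving within the same interval are collected and executed together, neutralizing pure speed advantages. A toy sketch of the idea (the interval length and data are illustrative):

```python
# Toy sketch of a uniform time delay for trading: orders that arrive
# within the same interval are grouped into one batch and executed
# together, so shaving microseconds off arrival time gains nothing.

def batch_orders(orders: list, interval: float = 0.1) -> list:
    """Group (timestamp, order) pairs into per-interval batches."""
    buckets: dict = {}
    for t, order in orders:
        buckets.setdefault(int(t // interval), []).append(order)
    # Orders within a batch are processed together, e.g. in a uniform
    # auction; here we just sort them to show arrival order is erased.
    return [sorted(b) for _, b in sorted(buckets.items())]

orders = [(0.01, "A"), (0.09, "B"), (0.15, "C")]
print(batch_orders(orders))  # [['A', 'B'], ['C']]
```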

Markets: The power of markets can be leveraged by assigning liability to key actors in the digital asset ecosystem, and exposing the actual costs of certain behaviors. For example, the Securities and Exchange Commission has been moving to force corporations to disclose their exposure to cyberattack risks. Perhaps the benefits and liabilities of digital assets more generally ought to be made visible on corporate balance sheets or in the Generally Accepted Accounting Principles (GAAP). Another market-based intervention could be the development of new types of insurance for the stewardship (and misuse) of digital assets.

Social norms: Ways must be found for people’s behaviors to be expressible and made consequential in self-regulating the behavior of others. The idea is that in today’s world, social interactions are highly distributed and self-organizing, and routinely give rise to new kinds of social norms. Why not use technologies to help make these social norms more visible as a kind of “digital common law,” in the spirit of Oliver Wendell Holmes, and even codify the more powerful, consensual norms? This is one goal of the Open Mustard Seed platform. One could also leverage social norms as a kind of law by empowering people to make their own choices about levels of identity-authentication and privacy protection.

Whatever new regime for managing digital assets may be initiated, John Clippinger stressed that “policy must be informed by technology.” Right now, he complained, the two are often treated as quasi-independent realms. Bringing the two together means that the law must formally sanction experiments and dynamic innovation as part of the search for solutions. It cannot simply look to familiar principles of law and try to impose them (futilely) on the boisterous, unpredictable and globally distributed world of tech innovation.
