
CHAPTER V - Solutions for Smart Regulation

New Government Agencies
The group identified seven recommendations. They center on reforming government to be better equipped for future challenges, addressing the responsibilities of technology companies, providing technology-focused job training and programs to advance digital literacy skills, and acting to close the digital divide. The first two recommendations focus on reforming government to be better equipped to address the challenges of an online world. There is a mismatch between the potential harms that need to be addressed and the responsibilities of existing federal government institutions. For example, no federal agency is currently responsible for the collection, use and flow of information. Participants therefore suggested it is time to start talking about the creation of a Communications and Data Strategy Act of 2021, an updated iteration of the Telecommunications Act of 1996, to better regulate the current digital ecosystem.

Conferees agreed that federal agencies will need more tools to combat today’s technological challenges. As Reed Hundt, former Chairman of the FCC, put it, the structure of government today reflects the industrial economy of the 1930s through the 1950s. Neither the FCC nor the FTC is well equipped to address today’s problems without additional resources. The FCC’s organizational structure—with separate media, wireline and wireless bureaus—may be at odds with the increasingly diversified business interests of today’s communications companies. The platform companies that represent the largest portion of the internet economy appear beyond the FCC’s jurisdictional reach, and the FTC largely constrains activities through enforcement action, not regulation.

Additionally, efficiency and implementation issues arise when there is overlap of responsibilities among independent agencies and the executive branch. For example, some attendees noted that U.S. spectrum management strategy may be hampered by both the National Telecommunications and Information Administration (NTIA) and the FCC exerting authority over federal and private spectrum use, respectively. Likewise, attendees were critical of the division of responsibility for antitrust enforcement between the U.S. Department of Justice (DOJ) and the FTC. Addressing these structural inefficiencies will help streamline regulation of the communications sector at the federal level.

More critically, some attendees questioned the continued need for federal independent agencies to regulate the communications sector. Congress needs to periodically reevaluate the authority of independent agencies given the rapid pace of technological change. Moreover, the notion that independent agencies regulate without political influence is naive, at best. In recent years, under both political parties, the chairs of the FCC have advanced priorities advocated by the executive branch.

Recommendation 1: Government should create a new federal agency to address data and technology policy.
Given the structural issues noted above, the group agreed that new government institutions are needed to address today’s problems, but participants had different opinions on the scope and responsibilities of any such newly created bodies. Some suggested that Congress has delegated too much authority to regulatory agencies.

Today, individual federal agencies and departments are addressing AI in the context of their particular sphere of responsibility. Similarly, multiple congressional committees are looking at AI in the context of issues under their jurisdiction, but no one in Congress is in charge of thinking about AI from a multi-sector perspective.

Some attendees initially suggested the creation of a task force, special commission or federal expert agency to study algorithmic decision-making. Such a body could compile information on existing best practices for algorithmic decision-making, both in the U.S. and abroad. It could convene multi-stakeholder gatherings to identify safe harbors for industry players. It could also provide a regulatory sandbox for anti-bias experimentation and collaborate with industry to develop a framework for self-certification. Ideally, Congress would appropriate funding for research in this area.

Others pushed back at the idea of a special agency dedicated to AI. They suggested that the better course is to make sure that each agency has institutional AI expertise that can be integrated into agency policymaking, with some level of coordination at the federal level to ensure the government is looking at issues consistently.

Johanna Shelton, Director of Government Affairs & Public Policy at Google, pointed out that no company would share its algorithmic trade secrets with the government. If such an effort is to be successful, it must build on an industry collaboration model. Nicol Turner-Lee suggested there is a need for interdisciplinary research to help develop norms and an ecology of best practices.

Marc Rotenberg, President and Executive Director of the Electronic Privacy Information Center (EPIC), argued for the creation of a Data Protection Agency. He noted that other democratic governments around the world have such an agency, and the failure of the U.S. to establish such an agency has come at a cost. Under his proposal, the new Data Protection Agency would focus on issues relating to privacy and data protection, but it would not assume other regulatory functions that currently reside in agencies such as the FCC. Moreover, it would be an independent agency, not part of a larger communications and technology department. The FCC and FTC would remain in place, in his view.

Others were receptive to this idea, agreeing that a new data protection agency should have comprehensive responsibility for all data, not just that generated by the communications sector. While the group agreed on the need for federal privacy legislation, they fundamentally disagreed about whether such federal legislation should represent a floor, with states free to take additional action, or if federal legislation should preempt state authority.

The boldest proposal was for all regulatory authority over communications and technology issues to be consolidated at the federal level in a new agency, with existing agencies abolished. Proponents of this approach argue that it no longer makes sense to have the communications sector regulated by an independent agency, such as the FCC, while important issues of national and international scope remain under the domain of the NTIA, with input from the White House Office of Science and Technology Policy and the National Economic Council. Issues relating to networks and data need to be examined comprehensively and together. Conference moderator Charlie Firestone, Executive Director of the Aspen Institute Communications and Society Program, called for a new Department of Networks and Data, and others offered similar approaches.

David Redl, former Administrator at the NTIA and Assistant Secretary for Communications and Information, argued that the new agency should be a Cabinet-level organization, rather than located within an existing department in the executive branch. Only then would the U.S. regulator have equivalent rank and stature to counterparts abroad. Moreover, this proposed executive branch agency would be more directly accountable than an independent regulator.

The focus of such a new Cabinet-level department would be to develop and streamline rules and policies in an effort to establish a comprehensive communications and technology strategy. Some of the focus areas include universal access to broadband; comprehensive spectrum management for both the public and private sector; a national data policy regarding the collection, manipulation, use and dissemination of data by both governmental actors and the private sector; action to protect people from invasions of privacy, discrimination, fraud and misrepresentation; and the ethical use of AI and related issues in the communications and information arena.

Participants recognized that it is difficult to create a new agency unless there is a national crisis. The Department of Homeland Security was created in the wake of the September 11, 2001 attacks and the Consumer Financial Protection Bureau after the Great Recession of 2008, but in ordinary times Congress is loath to create a new agency, particularly one that upends existing jurisdictional assignments of congressional committees. It is not clear whether battles over spectrum management and recurring data breaches will create enough impetus for Congress to reorganize the federal government.

“We need to recognize that there are good reasons why we have deliberative government,” noted Larry Downes, Senior Fellow at Accenture Research. “We have a Constitution that slows down the pace of legal change because, in most situations, in most circumstances, that’s the right thing,” he added. The institutional roadblocks to rapid change in government may protect us from our own worst possible behavior.

The group agreed nonetheless that existing agencies are not well equipped to tackle the important issues that arise from the use of data in today’s world. Each has its own institutional culture and is subject to varying degrees to regulatory capture. Too often, regulators are focused on solving yesterday’s problems. To establish meaningful change, it is important to start anew.

One of the first jobs of such a new agency would be to develop a National Advanced Technology Strategy. Such a strategy starts with the premise that the private sector should be free to innovate and develop new business models, without restrictive regulation by the government. An equally critical imperative is to focus U.S. tax, research and development (R&D), and other policies to ensure U.S. technological leadership. At a minimum, the U.S. needs to have a national plan for technology policy with goals and dates for key actions to focus attention and galvanize action.

Within this context, the rise of China was top of mind for many conference attendees. China’s dominance stems in large part from its economic model and the concerted effort on the part of the Chinese government to implement its “Made in China 2025” strategy, a plan to become the world leader in advanced manufacturing of high-tech goods. China is using government subsidies, state-owned firms, and aggressive pursuit of Western intellectual property to achieve dominance. Many see China’s ambition as an existential threat to U.S. technological leadership. The consensus of the group is that the U.S. cannot “out-China” China; that is antithetical to the U.S. free market system.

Nonetheless, it is critical to realign U.S. national priorities to promote the American economy, security and core values. The U.S. needs a concerted effort to bring the benefits of technological innovation and jobs to a greater number of its cities. To help accomplish this, federal funding for R&D at U.S. universities should be carefully calibrated to promote U.S. strategic goals. It is also imperative to reduce the influence of foreign investment in university R&D. Intellectual property developed in the U.S. should remain American intellectual property.

Another critical imperative is for the U.S. to win in standards battles vis-à-vis non-market-based competitors. The U.S. government should promote U.S. participation in international standards setting and other fora via private sector groups such as the 3rd Generation Partnership Project (3GPP) and the Institute of Electrical and Electronics Engineers (IEEE), treaty-based entities like the International Telecommunication Union (ITU), or non-treaty fora such as the Organization for Economic Co-operation and Development (OECD). Antitrust law should not thwart U.S. companies from working together in the national interest at these standards bodies. The R&D tax credit should be extended to cover standards work.

The group generally agreed that consumer protection policies for the online world are inherently national, and those policies should be developed at the federal level. David Redl said that providing localities power over local issues is fine, but data policy is not local. Data transits through a particular place to anywhere in the world, so local decisions inherently create externalities beyond the local borders. Kevin O’Scannlain, Special Assistant to the President for Economic Policy at the White House National Economic Council, agreed, observing that in the case of data privacy, preemption makes sense because technology is inherently cross-border. State attorneys general should be charged with enforcing federal laws and policies, where appropriate.

Recommendation 2: Government should create a Federal Innovation Center to implement a digital-first strategy for delivery of government benefits and services.
A second, more narrowly focused recommendation is to create a Federal Innovation Center, similar to the California Office of Digital Innovation. Such an office could build on the successful efforts of the existing U.S. Digital Service, which was created in 2014 to bring talented engineers and others into the federal government for short-term tours of duty to advise existing departments on the delivery of public-facing government services and the use of digital platforms. The creation of this type of center could help federal agencies change the way they approach service delivery and technology investments.

One goal of the Federal Innovation Center should be to implement a comprehensive digital-first strategy for the dissemination of governmental benefits and services. That, in turn, would create more incentives to adopt broadband because people would be able to easily access government benefits and services online, saving both time and expense. It is critical that the government do more to show non-adopters the relevance of broadband to their lives.

In addition, the federal government should lead by example in integrating AI into the delivery of government services. For instance, the federal government is both the largest single provider of health care and the largest health care insurer in the U.S. Significant opportunities exist to use AI in both the delivery of care and in the payment process. The federal government could advance the ball by focusing on its own role in the healthcare ecosystem.

Efforts to bring tech experts into government to work on specific problems also should be expanded beyond the federal government to state governments and to cities. A Federal Innovation Center could serve as a resource to state and local government officials that are similarly working to transform how they deliver services at the local level.

Big Technology Companies and Competition Policy

A recurring topic of discussion among participants was the role of the largest technology companies in today’s marketplace. Two decades ago, Google and Facebook did not exist; today, they are dominant forces. But, as Johanna Shelton was quick to point out, in 2014 only four of the top twenty internet and technology companies were Chinese; now, nine of them are.

At the time of this writing, there are multiple active antitrust inquiries into the major online platforms, at both the federal and state level. Antitrust authorities are investigating whether these companies are engaged in practices that have reduced competition, stifled innovation or otherwise harmed consumers.

Recommendation 3: Ensure antitrust policy evolves to include greater consideration of the impact of market structure on innovation and consumer choice.
Given this context, the group recommended that antitrust issues should be addressed at the federal level by a single agency. It is not efficient to have antitrust responsibility split between the DOJ and the FTC.

The consumer welfare standard should be retained, but with a more robust focus on the impact on innovation. Antitrust policy should recognize the role of consumer choice (or lack thereof) in competition analysis. Without choice, people feel disempowered. Remedies should be tailored to address identified consumer harms. To do so, antitrust authorities must become nimble and address their own gaps in expertise to better understand the business models of technology companies.

“Big” is not per se “bad.” Returns to scale are inherent in a digital economy, and big data can lead to big results in key verticals like healthcare. Instead, the focus should be on bad behavior, not sheer size. As Robert Atkinson, Founder and President of the Information Technology and Innovation Foundation (ITIF), pointed out, most R&D is funded by the big companies, and the U.S. does not want to lose that. China would have an enormous advantage if the U.S. were to break up the U.S. technology companies, noted Kevin O’Scannlain of the White House National Economic Council. Big technology companies are significant drivers of innovation, which is critical to maintaining relevance in a rapidly changing world economy. Large U.S. companies have both the resources and incentives to compete with international competitors, which ultimately benefits consumers.

At the same time, the potential for harm exists, and it is important to inquire whether large companies are abusing their market power in a way that is harmful to competition or consumers. “We do not know where we’re going,” lamented Jeff Smulyan, Founder and Chairman of the Board of Emmis Communications. Moreover, the largest companies have the ability to overwhelm both Congress and regulators.

Eli Noam, Professor at Columbia University and the Director of the Columbia Institute for Tele-Information, offered a counter proposal. In his view, it is necessary to create and protect consumer choice in near-monopoly situations. He proposed that when there is convincing evidence that a multi-product vertically integrated firm has significant market power in a service that is important to many people or for services that exhibit strong economies of scale, the company should be required to offer that service unbundled from its other services and products, without preferential treatment to itself. In cases where individualization of service exists, companies should be required to provide consumers with the ability to customize such individualization. This includes the option of delegating customization to independent nonprofit entities and commercial providers selected by the consumer.

In response, Paula Boyd, Senior Director of Government and Regulatory Affairs at Microsoft, asked whether those trusted third parties would perform those functions for free or at some cost. If consumers must pay to get the customized experience, then that would exacerbate a world of haves and have-nots.

Marc Rotenberg of EPIC reacted by noting that models that give users the ability to change preferences regarding data collection and use do not work because users have given up on exercising their option to make choices. The problem is information asymmetry: people have no idea what factors are causing an algorithm to recommend another site or product. What is necessary is greater transparency, which promotes accountability and greater trust. Eli Noam disagreed, arguing that transparency does not solve anything.

Opportunities for Greater Self-Regulation

Not all potential harms associated with the digital transformation warrant proactive government intervention. But there were varying views on the need to reform Section 230 of the Communications Decency Act, which shields “interactive computer services” from legal liability for the content published on their websites, with limited exceptions. When it was enacted in 1996, the intent was that liability protection would foster content moderation by websites and nurture the growth of the internet. In recent times, however, some have questioned whether it needs to be re-examined given the growth of the darker side of the internet, such as the use of online platforms by bad actors to facilitate international terrorism and sexual exploitation of children. Indeed, subsequent to the conference, both the U.S. Senate and the U.S. House of Representatives held hearings to explore the effectiveness of industry online content moderation practices.

Conferees debated what responsibility online platforms should have to address harmful and extreme speech. The group agreed that the answer is not heavy-handed government regulation; not all problems relating to online content can or should be solved by government. Indeed, government intervention on content moderation could do more harm than good. Many content issues are best left to the industry to solve, at least for a time, with the possibility of a government backstop to industry efforts to moderate content.

Recommendation 4: Industry should be transparent regarding actions to address hate speech and content that incites violence.
Industry needs to exercise due diligence and be transparent about its content decisions to gain public trust in its implementation of content moderation standards. Koy Miller, Head of North America Connectivity and Access Policy at Facebook, suggested that the key is to manage user expectations through transparency about content moderation policies, adopt appropriate oversight and appeals mechanisms, and have industry come together to develop substantive content standards that increase predictability for users. This has been accomplished in other contexts, such as movie and television ratings, where industry develops standards and then self-enforces those standards.

Some of the large technology companies are taking new steps to address these issues. In September of 2019, Facebook released its charter for an independent oversight board that will review the company’s decisions about what posts, photos and videos are removed or left up on its platform under Facebook’s community standards. Among other things, the board will issue written explanations of its decisions, which will be available in a public archive, and its decisions will bind the company.

In addition, AI is helping to facilitate content identification and removal. AI is being used today to flag extreme content online. For instance, nearly 90 percent of the 9 million videos that YouTube removed from its platform in the second quarter of 2019 were flagged by automated tools, and more than 80 percent of those auto-flagged videos were removed before they received a single view. Through the use of AI, Facebook has reduced the average time it takes to identify a violation of its community standards to 12 seconds.

Blair Levin, Senior Fellow with the Metropolitan Policy Program at the Brookings Institution, suggested that the three large online content platforms—Facebook, YouTube and Twitter—are effectively running individual experiments on content moderation. Several participants suggested that it is critical that the industry at least agree on the key terms—such as what constitutes hate speech—so that the public can have a consistent understanding, even if the individual companies decide independently of one another how to implement any content moderation policies. Moreover, it would be beneficial for an outside independent entity, such as a university, to evaluate the effectiveness of the different approaches against defined public policy outcomes.

Another area where self-regulation may be desirable is cybersecurity. Robert Atkinson, Founder and President of ITIF, argued that the private sector has strong financial and reputational incentives to protect itself and its customers against cybercrime. He sees little upside to government dictating specific steps regarding what industry should do. Moreover, limiting the proper role of the U.S. government in cybersecurity provides a principled basis for industry to argue for similar limitations on the role of the Chinese government and other foreign nations in cybersecurity abroad. While there remains an important role for government, industry should have flexibility to develop standards for how to protect against cyber threats on emerging technologies. As a practical matter, industry is more likely to be nimble in this area than government.

Improving Digital Competence

Another set of recommendations centers on the need for technology-focused job training and programs to advance digital literacy skills. Not all Americans are benefiting from new technologies, and more needs to be done to give individuals the tools to succeed in a digital society. Critical focus areas include training the workers of tomorrow, providing broadband users the skills to evaluate online content and embedding ethics into algorithmic decision-making.

The American Library Association defines digital literacy as “the ability to use information and communication technologies to find, evaluate, create and communicate information, requiring both cognitive and technical skills.” Individuals must know how to access and create content on digital platforms, assess the validity of third-party content and take appropriate action to disseminate such content. As one report explains, “Part of digital literacy is not just understanding how a tool works but also why it is useful in the real world and when to use it. This concept can be described as digital citizenship—the responsible and appropriate use of technology, underscoring areas including digital communication, digital etiquette, digital health and wellness, and digital rights and responsibilities.”

To participate fully in a digital world, both in the workplace and on a personal level, individuals need to be digitally literate. According to one estimate, 90 percent of the workforce will require basic digital skills to be able to function effectively in the workplace. More than half of workers will need to be able to use, configure and build digital systems. As AI increasingly is used to perform routine tasks requiring relatively low levels of judgment, those workers will be displaced. If American workers are not properly equipped to perform the jobs of tomorrow, then our country will face social and economic instability.

Unfortunately, there is reason to worry the U.S. will be ill equipped to compete effectively in the digital world in the years ahead. According to the OECD, millennials in the U.S.—who now comprise one-third of the American workforce—placed nearly last in digital skills (defined as literacy, numeracy and problem-solving) as compared to the same age group in other developed nations.

Remediating this gap will require a concerted effort by both government and the private sector, which leads to the next recommendation.

Recommendation 5: Government should do more to advance digital literacy and job skills.
The U.S. government should focus on training the next generation for future employment. It is not sufficient to simply ensure all individuals have the basic skills to get online; government and anchor institutions need to help prepare people for the changes in the workforce stemming from the integration of AI into key verticals. Every major market sector of the economy will be using AI as a force multiplier to perform tasks more quickly and effectively, including inventory management, human resources, logistics and procurement.

Anchor institutions can play an important role in delivering programs for training and broadband access. As trusted members of the community, anchor institutions meet significant needs for a population that may not be aware of more sophisticated platforms for career development and job skills training. Community colleges and technical programs will be the entry point for many to retrain for the jobs of tomorrow. Moreover, as Francella Ochillo, Executive Director of Next Century Cities, pointed out, local community colleges and community centers can provide technical training to people seeking the necessary skillset to build and operate communications networks.

Singapore’s MySkillsFuture platform, with its focus on lifelong learning and job skills training, could be a scalable model for U.S. action. This portal is a one-stop shop that includes online aptitude tests to identify compatible industries or occupations; information on different industries and what types of skills those industries require now and in the future; an online job bank to help job seekers find and apply for employment; a tool to identify skills shortfalls between an individual’s profile and a selected occupation as well as resources to fill those gaps; and an education management system to let individuals and employers track certifications and accreditations obtained over the course of one’s career. The U.S. Department of Labor already sponsors a version of this platform, MySkillsMyFuture, in partnership with the American Job Center Network.

There is also recent legislation sponsored by U.S. Senator Patty Murray of Washington state to establish a State Digital Equity Capacity Grant program within the NTIA to provide funding for state-driven digital inclusion initiatives. Funding would be available for targeted digital inclusion efforts, skills training and other workforce development programs, the construction and operation of public computing centers, and making technology available to covered populations at low or no cost. It is currently pending in the Senate Committee on Commerce, Science, and Transportation.

With digital competence becoming a necessity for many career paths, a critical U.S. goal should be to promote lifelong learning and specialized training so individuals of all ages can become digitally literate. One idea is to make changes to traditional 529 accounts so they can be used for various types of courses and not just post-secondary education. Another idea is to dedicate one percent of unemployment taxes to retraining workers displaced by the advent of AI in the workplace. A third idea is to establish rural innovation centers to educate populations less familiar with technology due to availability and adoption shortfalls.

In order to promote U.S. leadership in computer sciences, government and industry should focus on strengthening public education. A focus on college and graduate-level education is not enough; K-12 education is important, as well. Young people need not only specific job training, but also training in AI creation.

The deep challenges of AI will not be solved without a new generation of technologists who can bring to bear their unique insights into the problems at hand. Shireen Santosham, Chief Innovation Officer of the San Jose Mayor’s Office of Technology & Innovation, pointed out that the U.S. cannot graduate more engineers to catch up with China because China is graduating half of all the engineers in the world. But the U.S. can do a better job of integrating macroeconomic policy concerns into engineering curricula so those who come to the U.S. from other nations for an advanced degree are grounded in core American values.

Google’s Johanna Shelton said engineers need to be thinking about ethics at a much earlier stage in their education journey. Computer science education needs to include ethical considerations as a core component of the curriculum. It is too late to be talking about ethics on the back-end after a new platform or application is created. Product designers need to be thinking about how to give users meaningful choices regarding what they want and do not want rather than designing a product that can only be used in one way. Moreover, as Brookings Institution Fellow Nicol Turner-Lee suggested, social scientists should be working with computer scientists; not all decisions should be made by engineers. Potential bias can be reduced by increasing diversity in those who are creating AI.

Closing the Digital Divide

The final set of recommendations centers on addressing the digital divide, which includes issues of digital inclusion. The critical issue is that people who are not online—for whatever reason—cannot take advantage of a wide range of applications and services that most people take for granted. Diane Griffin Holland, Senior Advisor for Tech and Telecom at the National Urban League, identified three dimensions of the digital divide—relevance, affordability and availability. Each dimension warrants a different menu of potential solutions. The ultimate objective is to ensure every person has meaningful access to broadband connectivity, can pay for such connectivity and understands the value of fully participating in the digital economy. It is critical that all people can benefit from advances in technology and connectivity and that no one is disproportionately harmed or excluded from these benefits.

Participants expressed concern that the rise of emerging technologies, like AI, 5G and IoT, could have a disproportionately negative impact on certain communities. Digital data deserts will lead to algorithmic bias, which in turn will exacerbate the exclusion of marginalized communities. Redlined communities will not have access to the same information as the rest of the world. Where 5G connectivity is lacking, communities cannot realize the benefits of IoT or AI.

In the years to come, the internet will be used for more than just connecting people; the promise of IoT is the ability to connect millions upon millions of devices for specialized purposes. Real-time processing of information by AI achieves better outcomes. There is growing recognition across the federal government of the importance of connectivity for such use cases as precision agriculture and connected care.i The danger is that the benefits of IoT for these applications may not be fully realized in discrete areas of the country.

A complicating factor in addressing the digital divide are the respective roles of localities, states and the federal government. Preemption of local processes for 5G deployment can achieve a uniform national approach, but it removes local decision-making and power. Blair Levin, Brookings Institution Fellow, argued that Google Fiber changed this dynamic by creating a competition among cities, thereby motivating cities to devise ways to make their localities more hospitable for investment in next-generation networks. He declared that the FCC has preempted cities without obtaining anything in return from industry.

Enormous investment is being made by the industry, noted Len Cali, Senior Vice President of Global Public Policy at AT&T. He said cities should reduce barriers to deployment and not view 5G as a near-term revenue opportunity, but rather as a growth opportunity. Shireen Santosham of the San Jose Mayor’s Office of Technology & Innovation offered a different perspective: local governments should not be viewed simply as another source of regulation. Local governments also are a vehicle for feedback and community data input. Those voices need to bubble up to the federal level. Christopher Lewis, President and Chief Executive Officer of Public Knowledge, emphasized two sets of values that must be balanced: rapid 5G deployment and accessibility for all Americans versus local control and historic preservation. It is important to recognize the tension of values in the conversation about solutions.

The respective role of various levels of government is also relevant to determining who should provide broadband. Nearly half of the states in the country have laws in place that serve to preclude municipalities or electric cooperatives from providing retail broadband to end users. While some believe government should not be in the business of competing with the private sector, municipalities are now stepping in to fill the void because in some instances the private sector has failed to serve the community. Often, local efforts to promote broadband are galvanized by dissatisfaction from key constituencies regarding the level of service provided by the current incumbent, whether telecommunications carrier or cable operator. As GLIA Foundation’s Richard Whitt put it, if people want to tax themselves to do municipal broadband, they should be allowed to do so.

However, others argue that municipalities are legal creations of states, and states should have the power to determine what local governments can and cannot do. If local constituencies exert enough pressure at the state level, then state laws limiting broadband deployment can be changed. And even if localities are barred by state law from directly providing broadband, they can take other important steps to galvanize local action.

Conference attendees voiced differing opinions on where policymakers should focus their attention in addressing the digital divide. Blair Levin believes government efforts have been too focused on making broadband available, to the detriment of focusing on strategies to increase utilization. Digital inclusion is more than just an infrastructure issue. He argued that several developments on the horizon—including the launch of low-earth orbiting satellites, resolution of the C-band proceeding, and merger commitments made in the T-Mobile-Sprint transaction—suggest that market forces will, in the next few years, substantially improve broadband availability. In that light, given the number of affected individuals, there should be a greater focus on barriers to adoption, such as affordability, relevance and digital literacy.

Jonathan Chaplin, Managing Partner of New Street Research, cautioned that people should not conflate lack of access with affordability. In his view, mobile is available to almost the entire country and areas served by mobile should not be viewed as unserved. Others disagreed, pointing out that usage caps and the higher pricing of mobile make it unusable for home broadband needs. Access to video is essential to ensure equitable access to education today and will increasingly become important for health care in the future. But Chaplin countered that not every single use case needs to be solved with the same kind of network. Moreover, efforts should be focused on how to make mobile an affordable substitute. While today the focus may be on wired and fixed wireless in-home broadband solutions, there needs to be further exploration of the role of mobility in solving the divide over the longer-term.

Recommendation 6: Ensure all can pay for an in-home broadband connection/device and appreciate the value of broadband.
Martha Guzman Aceves, Commissioner of the California Public Utilities Commission, noted that affordability is the elephant in the room: broadband simply is too expensive, even in areas where there is competitive choice. To remedy this problem, the government should increase funding for low-cost broadband offerings to make broadband affordable for vulnerable populations. In 2016, the FCC modernized the Lifeline program to provide a $9.25 discount on qualifying broadband service, which typically provides access to a relatively restricted level of service. It is a problem when low-income households are running out of data halfway through the month. One innovative solution, offered by Brookings Institution Fellow Nicol Turner-Lee, would be to exempt access to government websites from monthly usage caps for Lifeline subscribers. That would at least enable those households to access workforce development training materials and educational resources without restriction. This is something that might be pursued through sponsored data programs.

In addition, the group suggested that the FCC’s existing Lifeline program is not appropriately structured to make broadband affordable for low-income households. One way to improve Lifeline is to reduce the administrative burdens of participation so that more service providers are willing to participate. A more radical idea, possibly requiring legislative action, is to provide vouchers and subsidies directly to the consumer rather than routing funds through a service provider.

Second, carriers could adopt common eligibility requirements for their low-cost broadband offerings. For instance, Comcast’s Internet Essentials program was originally available for low-income households with school-aged children but has expanded to include a much broader population including low-income veterans, residents in public housing and people with disabilities. Similar programs of other companies are more narrowly focused.

Third, the homework gap can be solved by allowing communities to gain broadband connectivity through E-rate-funded Wi-Fi. One way to do this is to permit E-rate funds to be used to support Wi-Fi-enabled school buses.

A fourth effort is to encourage public-private partnerships to ensure students have access to the devices used for broadband connectivity. The FCC’s universal service programs pay for broadband service, not end user devices. Comcast’s Internet Essentials program provides access to subsidized computers as well as digital training.

Affordability is not the only problem, however. Other reasons for non-adoption include lack of need, lack of interest and the related lack of digital literacy skills. Efforts to close the digital divide must address the full panoply of reasons why people may be unwilling or unable to adopt broadband. As discussed elsewhere in this report, a digital-first strategy for delivery of government benefits and services will encourage broadband adoption. Likewise, concrete actions to promote digital literacy through education and workforce training should promote greater usage of broadband.

Recommendation 7: Government should impose a moratorium on federal and state funding for broadband deployment until improved broadband maps are available.
Turning to the issue of broadband access, the most striking recommendation was to slow down current efforts to award new funding for broadband deployment to address availability. Government funding to supplement private capital is critical to closing the digital divide in infrastructure availability. But there is a growing fear among experts that the problem of availability is much bigger than it currently appears.

The FCC initiated a rulemaking in August 2019 to determine the rules that it will use to award up to $20.4 billion in funding from the newly renamed Rural Digital Opportunity Fund (RDOF). This fund carries forward the FCC’s vision of awarding funding to support the deployment of broadband in rural areas through a competitive bidding process rather than automatically providing such funding to incumbent telecommunications carriers. Subsequent to the Aspen conference, in January 2020, the FCC adopted rules for the first phase of funding ($16 billion over ten years) and decided it would use the current FCC Form 477 data to identify areas eligible for funding. Funding will be available in areas lacking 25/3 Mbps fixed broadband service. The FCC expects to award the second phase of funding (at least $4.4 billion over ten years) using data from its new and improved, more granular data collection.ii

Congress and other interested stakeholders have voiced a widespread, bipartisan dissatisfaction with the current data utilized by the FCC and other agencies to target funding to support the deployment of broadband in rural areas. The methodology used by the FCC to determine how many people lack fixed broadband—treating all residents in a given census block as served even if only one resident has access to service—masks the true extent of the broadband availability gap. Moreover, the current Form 477 collects data regarding advertised service availability, which some parties argue may be overstated. Finally, the FCC has no process in place to independently validate the Form 477 data submitted by service providers.

The FCC needs access to better broadband availability data before it sends billions and billions of dollars out the door, said FCC Commissioner Geoffrey Starks. Moreover, good data about broadband availability is critical to detecting digital redlining.

In July 2019, shortly before the Aspen event, the FCC adopted an order mandating a new data collection process—the Digital Opportunity Data Collection—which collects more granular information regarding broadband service availability. This will enable the government to target funding more effectively. Under this new data collection, all broadband providers will be required to submit geospatial maps showing the precise boundaries of where they offer service.iii Moreover, the FCC has decided to establish a more robust process to gather public input on where coverage is overstated. Local leaders will play a valuable role in this process as they are better situated to know conditions in their own jurisdictions.

The new mapping requirements are expected to take some time to implement, so new broadband maps will not be publicly available before the FCC plans to hold the first Rural Digital Opportunity Fund auction in October 2020. The FCC has decided to proceed with an auction to award most of its RDOF budget of funding while work continues to improve the maps to pinpoint where service is lacking. The open question is whether the FCC will raise the minimum performance requirement—now 25/3 Mbps for fixed broadband—when it is ready to proceed with a second round of RDOF funding. With an evolving standard for universal service, it does not make sense to lock up all available FCC funding for the next decade based on a 2020 view of the world. An understanding of what constitutes universal availability undoubtedly will change as consumers increasingly subscribe to more robust services over time. Today’s definition of broadband will become tomorrow’s old news.

While conferees disagreed about the extent of the broadband availability problem in this country, many believed more money is necessary to fully address broadband availability and affordability. Today, funding for the FCC’s current universal service programs is largely derived from an assessment on a percentage of interstate or international telecommunications service retail end user revenues. While the group did not offer a specific proposal for contributions reform, it agreed almost any alternative methodology would be an improvement over the current system. It is time for the FCC to increase and sustain the federal universal service fund.


i In the 2018 Farm Bill, Congress directed the FCC to form a Task Force to develop policy recommendations, among other things, to promote broadband availability on unserved agricultural lands. While the FCC’s current broadband deployment funding program, the Connect America Fund, is not designed to target funding to agricultural lands, the Task Force will examine specific steps that the FCC might consider in future funding programs dedicated to the deployment of broadband infrastructure. FCC Announces and Solicits Nominations for Working Groups of the Task Force for Reviewing the Connectivity and Technology Needs of Precision Agriculture in the United States, Public Notice, GN Docket No. 19-329, DA 19-1188 (rel. Nov. 19, 2019).
ii Rural Digital Opportunity Fund, Report and Order, WC Docket No. 19-126, FCC 20-5 (rel. Feb. 7, 2020).
iii Establishing the Digital Opportunity Data Collection, Report and Order, Second Further Notice of Proposed Rulemaking, 34 FCC Rcd 7505 (2019).
 
 