Tuesday, March 24, 2015

Data-driven policy visualizations from Central and Eastern Europe

Last year the Open Society Foundations' Think Tank Fund published an online portfolio of a set of projects they supported from 2010 to 2013. The projects are all applications of data-driven policy in Central and Eastern Europe. The portfolio includes:

View the accompanying report here.

Thursday, March 19, 2015

Oxford Internet Institute study: Data for Policy

A group of researchers and professors at the Oxford Internet Institute, along with Technopolis Group and the Centre for European Policy Studies, have been commissioned by the European Commission to "conduct an international study on innovative data-driven approaches to inform policymaking". They're accepting contributions from the public until March 25. Their call for submissions is below, and the study website is here. They intend to publish a draft report in April of this year.

Between 5 and 25 March 2015, the study team welcomes suggestions for existing data for policy initiatives. We are preparing an inventory of big data for policy and other innovative data-driven approaches for evidence-informed policymaking.

Our inventory of relevant initiatives will focus on:
a) operational pilots, demonstrators and implementations,
b) that are supported, on a structural basis, by policymakers,
c) that are at the national level, in European Union Member States, Canada, India, Singapore, South Korea and US, or
d) at the international level, initiated or supported by EC, OECD, WHO, WTO, Worldbank, UN, etc.

As such, the inventory does not cover research projects that do not prepare for implementation (these are addressed in our interviews and literature review) or initiatives at sub-national level (e.g. smart cities).

Open data initiatives are within the scope of the study if there are clear elements of data analytics and use of the data in one or several steps of the policy cycle of agenda and priority setting, policy options generation, policy design, ex ante impact assessment, monitoring and ex post evaluation and impact assessment.

Please send your suggestions to martijn.poel(at)technopolis-group.com

Thursday, March 12, 2015

Data-driven government at Brookings

At an event hosted by the Center for Effective Management at The Brookings Institution this week, Martin O'Malley, the Governor of Maryland and former Mayor of Baltimore, gave his perspective on data-driven approaches to government, providing examples of how the city of Baltimore and the state of Maryland have used data to make decisions about infrastructure and public services. They've made the video available, which we've embedded below.

Friday, March 6, 2015

How are Internet start-ups affected by liability for user content?

David Jevons is a Partner at Oxera

Internet intermediaries facilitate the free flow of information online by assisting users to find, share and access content. However, users may sometimes share copyright-protected or illegal content; ‘internet intermediary liability’ (IIL) laws define the extent to which the intermediaries are liable for this. Holding internet intermediary start-ups accountable for user content will reduce the costs of enforcement but may also harm the incentive for entrepreneurs to develop new intermediary business models. To help inform this debate, Google asked our team at Oxera to examine how different IIL laws affect the success rates and profitability of Internet start-ups, including a detailed examination of four countries: Germany, Chile, Thailand and India.

The effects on start-ups of clear and cost-efficient requirements

Ambiguity in IIL laws can lead to over-enforcement, which can alienate users. SoundCloud, a streaming service based in Berlin, suffered a user backlash resulting from issues in its takedown policy, including petitions and threats to open a competing platform. A related issue is over-compliance, which can be costly for the start-up. MThai, a web portal in Thailand, employs more than 20 people to check content before uploading, and prevents uploading during the night in order to limit its costs. In extreme cases, ambiguity in legislation can lead to inadvertent violations of the law. The executives of Guruji, an Indian search engine, were arrested in 2010 following claims of copyright infringement, which eventually led to the shutdown of its music search site.

In line with these examples, we find that intermediary start-ups could benefit considerably from a modified IIL regime with legislation that is clearer and sanctions that are focussed on cases where it is socially efficient to hold intermediaries liable. This is reflected in the quantitative results of our study, with the largest effects found in markets (such as India and Thailand) where current legislation is most ambiguous. Our analysis indicates that an improved IIL regime could increase start-up success rates for intermediaries in our focus countries by between 4% (Chile) and 24% (Thailand) and raise their expected profit by between 1% (Chile) and 5% (India).

[Charts: Estimated impact on start-up success rates (%); estimated impact on the expected profits of successful start-ups (%)]

Implications for the design of future IIL regimes

The IIL regime is one of several levers available to policymakers wishing to encourage more start-up activity, and it may be one of the easier ones to pull for those seeking to stimulate growth in this sector.

Our study highlighted the following implications for the design of future IIL regimes:

  • Find the right balance between cost-effective enforcement of copyright and allowing innovation in intermediary start-ups.
  • Costs matter when designing safe harbours. The costs of compliance are likely to have a considerable impact on intermediaries, particularly on start-ups.
  • Legal uncertainty increases the costs of compliance. Intermediaries will find it difficult to ascertain the required level of compliance and may ‘over-comply’.
  • Start-ups comply with takedown requests because they do not have the resources to engage in legal action. Legitimate user content may be removed as a precaution.
  • Start-up vibrancy can be lost, as high risks and compliance costs increase the likelihood that a start-up with a commercially sound, legitimate business model fails.

If you are interested in finding out more about our study and the economic issues surrounding IIL, please read our full study on the Oxera website.

Wednesday, February 25, 2015

Global Broadband Pricing Study: Updated Dataset and Call to Action

Vincent Chiu is a Technical Program Manager at Google

Since 2012, Google has supported the study and publication of broadband pricing for researchers, policymakers and the private sector to better understand the landscape and to help consumers make smarter choices about broadband access. We released the first dataset in August 2012, then refreshed the data in May 2013 and March 2014. Today we’re releasing the latest dataset, a result of our fourth iteration of the study. The mobile dataset covers 3,305 plans in 112 countries. The fixed dataset covers 1,983 plans in 105 countries.

December 2014 data links:

  • Price observations for fixed broadband plans can be found here.
  • Mobile broadband prices can be found here.
  • Explanatory notes here and ancillary data is here.

We believe in the power of open data. We want this study to be useful for regulators, academics and policymakers to understand the state of internet access and make data-driven decisions. We plan to continue to sponsor this work in the future so we can identify trends in the space and we’d love to collaborate with other organizations who are interested in getting involved in this study. We would also like to hear how you are using this data. If you are interested, please contact us at broadband-study@google.com.
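For readers who want to work with the released price observations, a minimal sketch of the kind of summary analysis the dataset supports is below. The column names here are hypothetical, invented for illustration — consult the dataset's explanatory notes for the actual schema — and the sample rows are made up rather than drawn from the real data.

```python
# Sketch: summarizing broadband-plan price observations with pandas.
# The column names ("country", "plan_type", "price_usd_per_month") are
# hypothetical; see the dataset's explanatory notes for the real schema.
import pandas as pd

# A tiny made-up sample standing in for the downloaded CSV.
sample = pd.DataFrame({
    "country": ["DE", "DE", "KE", "KE", "US"],
    "plan_type": ["fixed", "mobile", "mobile", "mobile", "fixed"],
    "price_usd_per_month": [30.0, 15.0, 8.0, 12.0, 45.0],
})

# Median monthly price by country and plan type — the sort of
# cross-country comparison the study is designed to enable.
summary = (
    sample.groupby(["country", "plan_type"])["price_usd_per_month"]
    .median()
    .reset_index()
)
print(summary)
```

With the real fixed and mobile CSVs, the same groupby would be applied after `pd.read_csv`, letting researchers track price trends across the study's annual releases.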

Thursday, February 19, 2015

Why aren't there more Monkeybrains?

Derek Slater is a Senior Policy Manager at Google

Monkeybrains is a wireless Internet access provider in San Francisco. Created in 1998, it’s not a big company—it’s a small business founded by two friendly engineers and run by a handful of people. Until recently, they only offered broadband speeds similar to old copper telephone networks. But they’ve started rolling out an ultrafast 300 Mbps service, buoyed by a successful crowdfunding campaign that raised nearly half a million dollars and the declining cost of antenna equipment.


In the U.S. broadband market, this sort of development is particularly intriguing. Today, nearly 75% of the country has no choice of broadband provider; at best, most consumers have only two options.

Even if the market of available service providers does not grow significantly, the market becoming more contestable—meaning an increased potential for new entrants—could be a significant development. That would make existing Internet service providers much more concerned about customer satisfaction—and that’s good for consumers.

More competition, especially at high speeds, isn’t inevitable though. Even as the technology matures and costs continue to come down, there are a number of barriers to entry that policymakers can help address, including:

  • Spectrum reform: ultrafast wireless services would be much easier to deploy if there was more spectrum available for both exclusive and shared use.
  • Access to video programming: most consumers expect to buy a TV package with their Internet access, but it is difficult for new entrants to license broadcast and cable TV channels at competitive prices.
  • Rights of way, siting for the equipment, and permitting processes: cities could make it easier to put up wireless equipment in public rights of way. This of course needs to be balanced against a range of factors—including a desire to not have a city cluttered with lots of wireless equipment.

Friday, February 13, 2015

New research on security and cloud computing

Ross Schulman is a Policy Manager at Google

Earlier this week, Leviathan Security Group released three white papers that explore cybersecurity in cloud computing versus local storage. Each paper examines a different aspect of security and storage, including availability, cost, and talent acquisition. Overall, Leviathan finds that cloud solutions are generally more secure, resilient, and redundant than local equivalents.

A few data highlights:

  • Cloud services provide much better resiliency and redundancy than local services in the face of disasters of all sizes—from small transformer explosions that affect 30,000 users to superstorms the size of Typhoon Haiyan. This means quicker data recovery and the ability to keep communications infrastructure like email up and running, which is essential in a post-disaster environment.

  • Even with increasing emphasis on STEM education and growth of computer science programs, organizations—private and public—will not be able to acquire all of the talent necessary to satisfy the demands of local storage infrastructure. For example, there are currently over a million open security positions worldwide, but beginning in 2017, all of the GCHQ-led cybersecurity programs together will graduate just 66 PhDs per year.

  • Of note for numbers lovers, the paper titled “Value of Cloud Security: Vulnerability” lays out a thorough analysis of storage needs for companies of different sizes and compares the cost of cloud versus local storage solutions. They find that cloud solutions are cheaper for small organizations in the near term and provide better security, because security expertise is concentrated in large organizations.

So what do these findings mean from a public policy point of view? Many countries, including Brazil and Russia, have proposed laws requiring that companies keep the data of that country’s users within national borders. This idea, known as “data localization,” purports to keep citizen users safer and out of the hands of spying governments and hackers.

However, forced data localization prevents companies, governments, and organizations from realizing many of the benefits afforded by cloud services. For example, if a local data center is impacted by a natural disaster, that data is not replicated elsewhere and thus is lost. And given the shortage of security expertise, there’s simply no way that every organization’s security infrastructure for locally stored data can keep up with the state of the art. Finally, preventing small enterprises like startups from using cloud services means that they must take on additional costs in terms of talent and infrastructure, and will likely end up with systems that are less secure than what cloud infrastructure would provide. In the end, data localization reduces opportunities, results in weaker security, and, in some instances, compromises the availability of data.

To learn more about data localization proposals around the world, check out Anupam Chander's paper, "Breaking the Web: Data Localization vs. the Global Internet."