With everyone doing their 2012 Best-ofs and highlights, we thought we'd hop on that bandwagon and repost the most popular Policy By the Numbers posts of 2012.
Wishing a very Happy New Year to all!
Back in October, we posted about a competition Google co-sponsored with The Guardian's Datastore challenging entrants to create compelling visualizations using world aid data. Today we announced the winner in The Guardian: Aidan Berensten's visualization of global aid spending by agency. Berensten used data from the International Aid Transparency Initiative to create an interactive visualization so robust that it comes with its own tutorial.
Check out Berensten's winning work at iatid.com, and read more over at The Guardian: "See how the world's development aid flows with our competition winner."
Naureen Kabir is Director of the New Cities Foundation Urban Lab.
Urban traffic and the difficulties of commuting are challenging issues for cities globally, especially as the already rapid pace of urbanization continues to accelerate. Delays in the U.S. alone cost an average of 34 hours a year per commuter. Moreover, wasted fuel, carbon emissions and opportunity costs mean that in the U.S. traffic congestion costs over $100 billion annually. Globally, these costs multiply: workers and students from Stockholm to Seoul cite daily commutes as a key cause of stress and missed time at work.
Governments and private companies are investigating how to address the challenges of urban mobility, emphasizing improvements in infrastructure. But one year ago, the New Cities Foundation set out to understand the potential of making commuters part of the solution. In partnership with Ericsson and the University of California, Berkeley, the Foundation set up a Task Force on Connected Commuting to study the impact of connecting travelers who take the same daily commute route via smartphone apps that allow them to share relevant, useful information with each other. Two pioneering commuting applications were used for the study: Waze (for car commuters) and Roadify (for public transport users). We piloted the study in San Jose, CA, which ranks 22nd among large American cities in person-hours of delay (42 million annually) and 25th in congestion cost ($842 million).
The Task Force wanted to know: can a new level of networking between commuters enhance the overall commuting experience? Is the connected commute "better" than the unconnected commute? From a city perspective, is it more resource efficient?
The findings of this year-long study present an opportunity for transport agencies, local governments and app developers to identify alternative ways to effectively improve the commute experience. From a policy standpoint, the Task Force study's bottom line is this: while innovative long-term solutions such as road space rationing (Brazil), license plate quotas (China), and congestion pricing (Britain) should continue to be implemented, in the short-term, encouraging and utilizing crowd-sourced information sharing among commuters—especially if done in a safe manner—can be an efficient, cost-effective way to build a community of commuters who themselves provide solutions to the woes of commuting.
Released last week, the study revealed that information-sharing among commuters has benefits both for individual commuters and for organizations, public or private, working on transportation and mobility.
Francisco Ruiz Anton is a Policy Manager for Google in Spain.
At the end of the third quarter in 2012, roughly 25% of adults in Spain were out of work. More than half of adults under 24 years old are unemployed. Recent graduates and young adults preparing to enter the workforce face the toughest job market in decades.
The Internet presents an opportunity for growth and economic development. According to recent research, more than 100,000 jobs in Spain originate from the Internet, which directly contributes 26.7 billion euros (2.5%) to GDP, an impact that could triple by 2015 under the right conditions.
One of those conditions is making high-quality education accessible, a point echoed by a recent OECD report on the youth labor market in Spain. This is no easy task. University degrees are in high demand, straining the reach of our existing institutions.
The web has become a way for learners to develop new skills when traditional institutions aren’t an option. Recent courses on platforms like Udacity, Coursera and edX have seen hundreds of thousands of students enroll and participate in courses taught by prestigious professors and lecturers.
Google is partnering with numerous organizations and universities in Spain to organize UniMOOC, an online course intended to educate citizens in Spain and the rest of the Spanish-speaking world about entrepreneurship. It was built with Course Builder, Google’s new open source toolkit for constructing online courses.
To date nearly 10,000 students have registered for the course; over two-thirds of them are from Spain, with the remainder spread across 93 other countries. It recently won the award for "Most innovative project" of 2012 from the newspaper El Mundo.
Spain’s situation is not entirely unique in Europe. Policymakers across the continent are asking themselves how best to create economic opportunity for their citizens, and how to ensure that their best and brightest students are on a path toward financial success. Our hope is that the people taking this course will be more empowered with the right skills and tools to start their own businesses that can create jobs. They will push not only Spain, but Europe and the rest of the world towards economic recovery and growth.
The course is still running, and you can join today.
Today we hung out with Alex Howard to talk about the big stories and trends in data from 2012 and get his outlook for 2013. Alex touches on some important policy issues relating to data, including privacy and security, identity, and ownership.
Hasan Bakhshi is Director, Creative Economy, and Juan Mateos-Garcia is a Research Fellow in Nesta's Policy & Research Unit.
Boston Consulting Group estimates that the UK's Internet economy is the largest in the G20 as a share of GDP: 8.3%, compared to a G20 average of 4.1%. The UK is a nation of e-shoppers, with almost two-thirds of consumers reporting having purchased goods or services online in the previous three months. That's almost twice the average for the countries in the Eurozone. The UK was also the first country in the world where online advertising spend overtook TV advertising, in 2009.
Yet a recent study of digital readiness from Booz & Co. puts the UK in an unimpressive twelfth place and ranks the nation eighteenth on average broadband connection speeds. They find that two-thirds of UK SMEs have "little or no presence online." Only 14% of UK SMEs sell online, compared to 30% in Norway. Eurostat data confirm that UK businesses are not among the leading pack of countries in e-commerce markets.
Our new research on how UK businesses collate and use their online customer data adds to this picture of lagging engagement with digital. Our findings suggest that collection of data is patchy, and that four out of five businesses with active online operations are not making full use of their data for decision making.
Even in our Internet-active sample, only 38% of businesses collect comprehensive transactions data. In the majority of businesses, the analysis of online data is only basic and descriptive. For example, only 27% run A/B experiments and other controlled trials and an even lower 13% use statistical techniques such as regression analysis.
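As a minimal, hypothetical illustration of the kind of controlled trial the survey asked about, the sketch below compares conversion rates for two page variants with a pooled two-proportion z-test. All the visitor and conversion figures are invented for illustration; they come from no real experiment.

```python
import math

# Hypothetical A/B test counts (illustrative numbers only, not from the survey).
visitors_a, conversions_a = 5000, 400   # control page
visitors_b, conversions_b = 5000, 460   # variant page

p_a = conversions_a / visitors_a
p_b = conversions_b / visitors_b

# Pooled two-proportion z-test: is the observed difference in conversion
# rates larger than sampling noise alone would plausibly produce?
p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
z = (p_b - p_a) / se

print(f"conversion A: {p_a:.1%}, conversion B: {p_b:.1%}, z = {z:.2f}")
# A |z| above roughly 1.96 corresponds to significance at the 5% level.
```

Even this small amount of statistics goes beyond the purely descriptive reporting that the survey found at most businesses.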
Only 41% of businesses in our sample use online data to inform their business strategy, and fewer use it to optimise prices. Even among the sub-sample of firms for whom e-commerce makes up more than half of overall revenues, less than four in ten use their online customer data to set prices.
However, 18% of businesses in our sample—the 'datavores'—are showing the way. They are likely to collect, analyse, and, above all, act on their online customer data. They appear to be investing more aggressively in data capabilities than other firms, suggesting that companies who don’t learn to use data will be left even further behind.
Datavores are four times more likely than intuition-driven companies to report a positive contribution from their online data and are even more likely to be product innovators. The implication is that there may be an immediate benefit to the UK economy if more businesses made use of online data.
What might all this mean for policy?
Policymakers need to think about how to create a regulatory environment that strikes the right balance for consumers between data privacy and the potential benefits to be gleaned from data use, like more efficient pricing of products and rapid product innovation. Concerns about data privacy and security ranked highly as a barrier to greater use of online data by the datavores in our survey.
Policymakers should also heed the importance of sound analytical and management skills if they wish to encourage data-driven business. They should ask whether the education system attaches enough importance to such skills, and whether the system is prepared to cope with increasing demands as more businesses begin to unlock the value of data.
Fred Von Lohmann is Legal Director at Google.
We believe that data should play an important role in figuring out how to make copyright work better online. Six months ago, to help inform ongoing policy conversations, we launched a feature in our Transparency Report that discloses how many requests we receive to remove Google Search results for copyright reasons.
Starting today, anyone interested in studying the data can download all the data shown for copyright removals in the Transparency Report. The data will be updated every day.
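To show what working with a download like this might look like, here is a hedged Python sketch that tallies removal rates per specified website. The column names and sample rows below are invented assumptions for illustration, not the actual schema or contents of the Transparency Report files.

```python
import csv
import io
from collections import Counter

# Illustrative sample in the spirit of the downloadable removal-request data.
# These column names and values are hypothetical, not the real file format.
sample = """\
request_id,reporting_org,specified_domain,urls_specified,urls_removed
1001,ExampleRights Ltd,example-files.com,120,118
1002,ExampleRights Ltd,example-mirror.net,45,40
1003,Acme Media,example-files.com,300,295
"""

specified_by_domain = Counter()
removed_by_domain = Counter()
for row in csv.DictReader(io.StringIO(sample)):
    specified_by_domain[row["specified_domain"]] += int(row["urls_specified"])
    removed_by_domain[row["specified_domain"]] += int(row["urls_removed"])

for domain, specified in specified_by_domain.items():
    rate = removed_by_domain[domain] / specified
    print(f"{domain}: {removed_by_domain[domain]}/{specified} removed ({rate:.1%})")
```

Since the data are refreshed daily, a script along these lines could be re-run on each new download to track removal rates over time.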
We are also providing information about how often we remove search results that link to allegedly infringing material. Specifically, we are disclosing how many URLs we removed for each request and specified website, the overall removal rate for each request and the specific URLs we did not act on. Between December 2011 and November 2012, we removed 97.5% of all URLs specified in copyright removal requests.
As policymakers evaluate how effective copyright laws are, they need to consider the collateral impact copyright regulation has on the flow of information online. When we launched the copyright removals feature, we received more than 250,000 requests per week. That number has increased tenfold in just six months to more than 2.5 million requests per week today. While we’re now receiving and processing more requests more quickly than ever (on average, within approximately six hours), we still do our best to catch errors or abuse so we don’t mistakenly disable access to non-infringing material.
We’ll continue to fine tune our removals process to fight online piracy while providing information that gives everyone a better picture of how it works. By making our copyright data available in detail, we hope policymakers will be able to see whether or not laws are serving their intended purpose and being enforced in the public interest.
Ian Hathaway is a Research Manager at the Bay Area Council Economic Institute in San Francisco.
Yesterday my colleagues and I at the Bay Area Council Economic Institute published a report detailing the importance of high-tech employment to the U.S. economy. It is the first of two studies aimed at providing a better understanding of job creation and business formation in America today—in particular the role that high-tech plays in the process. Part two will focus on the job creation dynamics of high-tech startups and will be released in a few months. Both reports are made possible by a generous contribution from our friends at Engine Advocacy.
Below is the executive summary of the report as well as some key graphics. We hope you’ll find it as interesting as we do. Enjoy.
This report analyzes patterns of high-technology employment and wages in the United States. It finds not only that high-tech jobs are a critical source of employment and income in the U.S. economy, but that growth in the high-tech sector has increasingly been occurring in regions that are economically and geographically diverse. This report also finds that the high-tech sector—defined here as the group of industries with very high shares of workers in the STEM fields of science, technology, engineering and math—is an important source of secondary job creation and local economic development. The key findings are as follows:
Since the dot-com bust reached bottom in early 2004, employment growth in the high-tech sector has outpaced growth in the private sector as a whole by a ratio of three-to-one. High-tech sector employment has also been more resilient in the recent recession-and-recovery period and in the last year. The unemployment rate for the high-tech sector workforce has consistently been far below the rate for the nation as a whole, and recent wage growth has been stronger.
Employment growth in STEM occupations has consistently been robust throughout the last decade, outpacing job gains across all occupations by a ratio of 27 to 1 between 2002 and 2011. When combined with very low unemployment and strong wage growth, this reflects the high demand for workers in these fields.
Employment projections indicate that demand for high-tech workers will be stronger than for workers outside of high-tech at least through 2020. Employment in high-tech industries is projected to grow 16.2 percent between 2011 and 2020 and employment in STEM occupations is expected to increase by 13.9 percent. Employment growth for the nation as a whole is expected to be 13.3 percent during the same period.
Workers in high-tech industries and STEM occupations earn a substantial wage premium of between 17 and 27 percent relative to workers in other fields, even after adjusting for factors outside of industry or occupation that affect wages (such as educational attainment, citizenship status, age, ethnicity and geography, among others).
The growing income generated by the high-tech sector and the strong employment growth that supports it are important contributors to regional economic development. This is illustrated by the local multiplier, which estimates that the creation of one job in the high-tech sector of a region is associated with the creation of 4.3 additional jobs in the local goods and services economy of the same region in the long run. That is more than three times the local multiplier for manufacturing, which at 1.4, is still quite high.
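The multiplier arithmetic above can be made concrete with a back-of-the-envelope sketch. The multipliers (4.3 for high-tech, 1.4 for manufacturing) come from the report; the job counts below are invented purely for illustration.

```python
# Local multipliers reported in the study.
HIGH_TECH_MULTIPLIER = 4.3
MANUFACTURING_MULTIPLIER = 1.4

def additional_local_jobs(direct_jobs: int, multiplier: float) -> float:
    """Long-run additional local goods-and-services jobs associated
    with a given number of newly created direct jobs."""
    return direct_jobs * multiplier

# Hypothetical scenario: a region gains 1,000 new jobs in each sector.
new_jobs = 1000
print(additional_local_jobs(new_jobs, HIGH_TECH_MULTIPLIER))
print(additional_local_jobs(new_jobs, MANUFACTURING_MULTIPLIER))
```

Under these assumed figures, the same number of direct jobs yields roughly three times as many additional local jobs when created in high-tech as in manufacturing, which is the report's point.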
— Tech hiring creates other jobs (San Francisco Chronicle)
— Technology Works for Startups and Our Economy (Engine Advocacy blog)
TechNet released a report on broadband in the United States, and today's visualization is an infographic that summarizes the report findings. TechNet's State Broadband Index "rates the states on indicators of broadband adoption, network quality, and economic structure as a way of taking stock of where states stand."
Andrew Adams is a Professor of Information Ethics in the Graduate School of Business Administration and Deputy Director of the Centre for Business Information Ethics at Meiji University in Tokyo. This is the first of two Policy By the Numbers posts by Andrew on open access publishing.
Scholarly academic communication began with letters between early scientists. Groundbreaking work was sent to multiple correspondents, and as the early scientific and scholarly societies grew they began distributing their proceedings to members and making them available in libraries. As academia expanded, and in particular as science and engineering grew into industrial-scale enterprises in which the unknowing duplication of experiments was seen as a waste of money (knowing duplication, as an independent check on results, remains a valid exercise, though one rarely given the importance it deserves), the need for wide dissemination of experimental results led to the emergence of three types of scientific publisher: scholarly and scientific societies with a specific disciplinary focus; university presses, with a mission driven partly by disseminating their own researchers' work and partly by a general academic desire to improve communication; and commercial publishers, either founded specifically to meet an apparently unmet need in a particular area, or general publishers using their expertise and economies of scale in typesetting, printing, binding and distribution to provide a valuable service to academia as a conduit for scientists to talk to each other.
The "gold standard" of academic publishing gradually emerged as peer review: consideration of the merits of a piece by other scholars and scientists with appropriate knowledge of the relevant field. Over the course of the twentieth century a large publishing industry emerged, with a set of common practices followed by most publications and most publishers. (There are exceptions to these practices, but there is no space to present them here.) Academics (and a small proportion of researchers in independent research institutions and commercial labs) constitute the main authors, reviewers, academic editors and academic editorial boards of the journals (and, in Computer Science, fully reviewed conference proceedings) in which most academic work is published. In the twentieth century, publishers provided typesetting, copyediting, printing and distribution services. They generally made a reasonable profit margin on these activities, whatever type of publisher they were. In return, authors (mostly academics) transferred the copyright in their work to the publishers without any financial recompense. Likewise, reviewers and editorial boards are generally unpaid. Editors may receive a small stipend and/or a small contribution to administrative support costs (though by no means always).
Since about 1990, the system has come under multiple pressures. Commercial publishers have merged or been taken over, leaving a small number of large players, sometimes part of larger multimedia conglomerates and sometimes simply large, primarily academic presses (though these usually also publish textbooks and monographs, and often other education- and research-related materials). Scholarly societies and universities have come to expect profits from their publishing arms to support their other activities. Commercial publishers have consistently increased their subscription prices well above both inflation and the funding available to most universities, even as their costs should have been falling once an initial investment in digital reproduction technologies was covered. Meanwhile, in a few fields such as Computer Science and Physics (most notably High Energy Physics), the new technology of the Internet provided a parallel route for academics to disseminate their peer-reviewed articles: first via ftp sites, then the Web, and finally through databases with web interfaces providing both machine- and human-readable metadata alongside the PostScript, PDF, Word, HTML and other versions of the text.
In addition to this, a relatively small number of journals have been either founded with no subscription costs for electronic access, or moved to such a model. The funding to support these journals comes from a variety of sources including fees from authors who publish (occasionally fees from authors who submit for possible publication), academic society, university, research funder, government or charitable donation support and others.
While some methodological difficulties and disputes remain, most of the studies carried out in the last ten years have shown that articles available without payment barriers (whether an institutional or individual subscription to the journal, or a fee for the individual article) are cited more often. Since the primary purpose of research writing is communication with other researchers and impact on the field, one crude measure of which is the number of times an article is cited, it is clearly in the interests of research authors, their institutions and the funders of their research that their articles be made freely available. This is, in effect, a prerequisite for a free market in scholarly and scientific ideas, in which the best ideas, rather than those easiest to access, receive the greatest attention.