Medtronic says federal rules prohibit giving Ms. Hubbard's data to anyone but her doctor and hospital. "Our customers are physicians and hospitals," said Elizabeth Hoff, general manager of Medtronic's data business. Medtronic would need regulatory approval to give patients the data, she said. It hasn't sought approval because "we don't have this massive demand."
[Cross-posted from Homeroom, the Department of Education's blog]
Joe Barison is the Director of Communications and Outreach for ED's San Francisco Regional Office.
With a putting green, 18 cafeterias, gardens and even a giant statue of a dinosaur, Google's massive headquarters in Mountain View, Calif., may not be the first place one would associate with Department of Education officials, educators and business leaders coming together to discuss career pathways for community college students.
Yet during a recent stop by U.S. Under Secretary of Education Martha Kanter, the novelty of Google's complex was the last thing on anyone's mind as leaders from Google, Cisco Systems, Lockheed Martin—Space Systems and the Silicon Valley Leadership Group came together for a panel discussion on innovative strategies to improve career pathways for community college students in science, technology, engineering and mathematics (STEM).
The 90-minute discussion—in front of a large audience of regional industry leaders, community college presidents, K-12 educators, local policymakers and students—included Kanter's detailed description of the Obama Administration's support for STEM education at community colleges. She highlighted the $2 billion Trade Adjustment Assistance Community College Career Training grant program, which funds pairings of community colleges with workforce partners to ensure that graduates are career-ready with the knowledge and skills that employers need.
After the formal program and a Q-and-A session ended, many of the participants stayed to continue the discussion. Dennis Cima, senior vice president of the Silicon Valley Leadership Group, an association of 380 Silicon Valley employers, talked about the value of linking K-12 schools, community colleges and businesses.
"We know how important it is to create connections between education and industry," Cima said. "Because once those connections are made, then industry has the ability to really help education fill its own needs. This was an opportunity to open people’s eyes about how important those public-private partnerships are."
Google’s director of education and university relations, Maggie Johnson, who was a panelist, said that she found the session to be a good start because it brought the right people together.
"I really liked the part that came out around how there are very many different sectors that need to come together and coordinate in order to really make something happen," Johnson said. "We got the community colleges in the room; we have industry; we have government. So at least we got everybody in the room. Where it goes from here, we’ll have to see."
Kanter's assessment of the value was similar to Johnson's. "I think it was the beginning of what I hope will become a call to action, so the different sectors of education, business, philanthropy, government, labor, and community partners can come together to say, 'How can these stakeholders—working together across sectors—architect a plan for this region to lead the way, through innovation, to make sure that every student gets the best possible education and is prepared, college-and-career ready, for the jobs now and for the future?'"
Kate Harrison is a graduate student at UC Berkeley and Anant Sahai is an Associate Professor at UC Berkeley.
Using TV white spaces means allowing wireless devices (e.g. wireless routers) to transmit on frequencies previously exclusive to over-the-air TV. The goal is not to eliminate over-the-air TV but to increase efficiency by making fuller use of existing spectrum. A useful analogy is pouring sand into a jar of large rocks, where the rocks in the jar naturally leave gaps for sand to fill in. We can think of the signals for TVs, called primaries, as the rocks, which leave room for signals from new devices, the secondaries, our sand.
The principal concern is preventing harm to primaries. Secondaries must be "quiet" enough that TV sets can still "hear" TV signals (in communications lingo, the signal-to-noise ratio must not drop too much). Consequently, we must enforce a limit on the collective "volume" of secondaries.
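To make the "volume" constraint concrete, here is a minimal sketch (all numbers purely illustrative, not drawn from any actual regulation or from the paper) of how one might check that the aggregate interference from secondaries keeps a TV receiver's signal-to-noise ratio above a decoding threshold:

```python
import math

def sinr_db(tv_signal_dbm, noise_dbm, secondary_powers_dbm):
    """Signal-to-(noise+interference) ratio at a TV receiver, in dB.

    All inputs are received power levels in dBm; interference from
    secondaries adds in linear (milliwatt) units, not in dB.
    """
    mw = lambda dbm: 10 ** (dbm / 10.0)  # dBm -> milliwatts
    noise_plus_interf = mw(noise_dbm) + sum(mw(p) for p in secondary_powers_dbm)
    return 10 * math.log10(mw(tv_signal_dbm) / noise_plus_interf)

# A receiver seeing a -60 dBm TV signal over a -100 dBm noise floor:
clean = sinr_db(-60, -100, [])                 # no secondaries nearby
crowded = sinr_db(-60, -100, [-95, -95, -98])  # three nearby secondaries

PROTECTION_THRESHOLD_DB = 15.0  # illustrative decoding threshold
assert clean > PROTECTION_THRESHOLD_DB
print(round(clean, 1), round(crowded, 1))  # → 40.0 30.5
```

Each added secondary eats into the interference budget, which is why any limit must apply to their collective, not individual, power.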
The standard approach is to hard-code the per-device limit on transmit power ("volume"). This works where devices have roughly the same requirements regardless of location. However, as the map below shows, white space availability varies greatly.
The natural response to a variable environment is to adapt to it. To be legal, white space devices must contact servers to register and get permission to transmit, which ensures they don’t get too close to protected TV signals (in this sense, the policy is already data-driven). With this setup, it’s easy to simultaneously assign a custom transmit power. We showed with data-driven simulations that there is a power limit function which allows significantly higher mobile data rates without hurting TV coverage:
These maps were created in Matlab using US 2010 Census data by tract, the ITU propagation model, and a list of the 8,186 US TV towers, assuming white space ISP towers are placed to serve 2,000 people each. Find the code here.
Notice that data rates are much higher and more uniform in the variable-power map than in the single-power map. But an infinite number of functions satisfy the (linear) constraints of the problem (i.e. preserving TV reception). How should we pick one in a principled way?
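To see why the constraints alone don't determine an answer, consider a toy feasibility check (the gain matrix and interference budgets below are invented for illustration): very different power allocations can satisfy the same linear interference limits.

```python
import numpy as np

# Illustrative linear constraint: each TV receiver j tolerates at most
# I_max[j] of aggregate interference, and G[j, i] is a made-up path
# gain from secondary transmitter i to receiver j.
G = np.array([[1e-9, 5e-10, 1e-10],
              [2e-10, 1e-9, 4e-10]])
I_max = np.array([1e-6, 1e-6])

def feasible(p):
    """True if power vector p (mW) respects every receiver's budget."""
    return bool(np.all(G @ p <= I_max))

# Two very different allocations, both feasible -- the constraints alone
# don't tell us which one to prefer.
uniform = np.array([300.0, 300.0, 300.0])
skewed = np.array([900.0, 10.0, 10.0])
assert feasible(uniform) and feasible(skewed)
```

Any rule for choosing among the feasible allocations is an extra policy decision layered on top of the protection constraints.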
The traditional economics approach is to assign prices (using real or pretend money) to transmit power and allow people to trade freely until everyone is satisfied. Given the quantity of wireless devices, this is practically infeasible: imagine asking people to manually adjust the power of their wireless routers or even determine their valuation for a unit of power.
However, if we make the simple assumption that all devices (users) crave data rate, we can actually simulate their actions in a hypothetical market. This lets us approximate the optimal outcome easily without requiring any human interactions.
In our award-winning paper, "Seeing the bigger picture: context-aware regulations," we created a proof-of-concept “market” under the additional assumption that fair access to white space services is important to society. For example, San Franciscans will need more per-channel power than Montanans because they have fewer available white space channels. This hypothetical market is just a min/max convex optimization problem which can be solved quickly using today’s data centers and scales well even with thousands of constraints.
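The paper's actual market formulation is richer, but the max-min flavor of the optimization can be sketched as follows: bisect on a common target data rate, compute the power each user would need to reach it (via Shannon's formula), and check the linear interference constraints. All gains, noise levels, and budgets below are invented for illustration.

```python
import numpy as np

def max_min_rate(g, noise, G, I_max, bandwidth=1.0, iters=60):
    """Bisect on a common target rate t. Each user i needs power
    p_i = noise_i * (2**(t / bandwidth) - 1) / g_i to reach rate t
    (Shannon formula); the allocation is feasible iff those powers
    respect every TV receiver's interference budget (G @ p <= I_max).
    Required power grows monotonically with t, so bisection works.
    """
    lo, hi = 0.0, 50.0
    for _ in range(iters):
        t = (lo + hi) / 2
        p = noise * (2 ** (t / bandwidth) - 1) / g
        if np.all(G @ p <= I_max):
            lo = t  # target achievable: try a higher common rate
        else:
            hi = t
    return lo, noise * (2 ** (lo / bandwidth) - 1) / g

# Toy numbers: two users with different channel gains to their own towers.
g = np.array([1e-6, 5e-7])
noise = np.array([1e-10, 1e-10])
G = np.array([[1e-9, 2e-9]])  # gains toward one protected TV receiver
I_max = np.array([1e-6])

rate, powers = max_min_rate(g, noise, G, I_max)
```

Note how fairness falls out of the objective: the user with the weaker channel (smaller gain) is assigned more power so that both achieve the same rate, echoing the San Francisco vs. Montana example above.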
Since white space access already requires communication with a data center, we can easily apply changes there without deploying new white space devices. This lets us refine algorithms over time—including testing them in small regions before deploying them to the entire nation—in order to improve data rates for users. Through this, the white spaces could open up an exciting new realm of real-time data-driven policy.
Today we hung out with Professor Rob Gould (UCLA) and Chris Franklin (University of Georgia), who teach statistics to both college students and teachers. Rob published a great piece on stats education titled "Statistics and the Modern Student" and Chris was recently featured in UGA Research. They reflect on the value of statistical literacy, the challenges and rewards of data science education, and the future of the field. Really interesting conversation! For additional context, here's a January New York Times piece on the growing popularity of statistics education.
Jed Christiansen is Head of Channel Sales for Emerging Markets at Google and founder of Seed-DB.
In the summer of 2005, Paul Graham and three partners kicked off the “Summer Founders Program.” They aimed to combine seed-stage investing in startups with mentorship for young entrepreneurs. What made this program unique was that investments occurred synchronously via “classes” of startups funded at the same time and advised together. Their program became Y Combinator.
Their success kicked off a global movement of institutions that combined funding and education for entrepreneurs, commonly known now as seed accelerators. Seed accelerators are invaluable to generating economic growth and fostering entrepreneurial culture in communities. To demonstrate their importance, I created Seed-DB to track programs and the companies that have graduated from them. As of today, we're tracking 134 seed accelerators in 33 countries. These accelerators have funded over 2,000 startups, and those companies have raised over $1.6 billion in funding. One hundred of these companies have already sold for an estimated total of over $1 billion.
In a thesis I wrote on seed accelerators in 2009, I surveyed entrepreneurs that had gone through seed accelerators. One of the more interesting results was that the funding they received was the least important aspect of the program. What mattered more was the community they were accepted into via the accelerator. As more and more startups go through each program, the alumni network becomes bigger, more diverse, and stronger. Thus, each class of startups receives more value than the class before it, building a virtuous circle.
Seed accelerators also affect their local economies through job creation. New jobs in the SMB sector are very important to society: Between 1980 and 2005 all net job growth came from firms less than five years old. And in times of economic downturns, small businesses can buck overall trends by creating new opportunities instead of scaling back.
The startups that have gone through seed accelerators have created over 4,800 jobs. These are brand-new, high-value knowledge-worker jobs. And even if an individual startup fails, the founders' experience will be very valuable to traditional corporate employers.
Though the data in Seed-DB isn't 100% complete (it relies on startups self-reporting), some important conclusions can be drawn. Seed accelerators are a new kind of institution that promotes and celebrates a culture of entrepreneurship. Most programs fund 10-20 startups per year, educating between 20 and 50 entrepreneurs. Programs and entrepreneurs support each other through challenges, make key introductions for each other, and build communities that give startups the best chance of success. Were all accelerators to self-report, I estimate the number of jobs created would be more than 7,000.
From just one accelerator in 2005, to a handful in 2007, to over 130 around the world today, seed accelerators—and the jobs they create—are a positive change in the economic infrastructure of the technology industry.
Get in touch with Jed with any questions or suggestions for Seed-DB via Google+ or by e-mail jed.christiansen[at]seed-db.com.
Martine Durand is Chief Statistician at OECD.
On October 16, 2012 almost 400,000 babies were born in the world. On that same day, approximately 1000 people from around the world, including economists, statisticians, policy-makers and representatives from business and civil society, met to talk about the future lives of these babies. The 4th OECD World Forum on Measuring Well-Being for Development and Policy Making was held in New Delhi, featuring around 70 presentations, four roundtables, and several keynote lectures. The Forum provided a great opportunity for sharing knowledge and networking on Well-Being and Development.
Issues discussed by participants included: factors shaping trends in poverty and inequalities; business models and practices holding greater promise to improve well-being at work and beyond; links between effective and responsive institutions and people’s well-being; obstacles to gender equality and the type of environment needed for the start-up and success of women-owned businesses; policies helping children and at-risk youth to move into adulthood; preventing environmental degradation; improving the capacity of people, business and policy-makers to manage the consequences of disasters and conflicts; how to strengthen social cohesion.
The OECD World Fora on ‘Statistics, Knowledge and Policies’ have become one of the most important rendez-vous of the global community working on Well-Being. The 4th OECD Forum followed those held in Palermo (2004), Istanbul (2007) and Busan (2009). However, this forum marked a shift in the international well-being agenda. While previous Fora focused mainly on the “why” and the “how” of measuring well-being, the 4th OECD Forum looked at how well-being can be made actionable. The OECD Better Life Initiative, launched in 2011 on the occasion of the Organisation’s 50th Anniversary—under the motto Better Policies for Better Lives—lies at the heart of this attempt to use improved well-being metrics to influence policy making. The OECD Better Life Initiative combines advanced statistical tools for measuring well-being with information on people’s aspirations and needs, as collected through the Better Life Index, an innovative interactive platform.
But knowing what matters to citizens and where societies want to go is not enough to ensure that we will get there; this is one of the main messages coming out of the discussions held in New Delhi. We need to build our knowledge regarding what works or does not work to achieve better lives. We need new evidence and models to understand how people think and behave, and how policies can raise well-being given our new understanding. Part of the evidence is already there, though, and models are being developed. But the journey is long and will require the involvement of all actors—researchers coming from a range of disciplines, decision-makers, business, ordinary citizens.
Four additional key messages came out of New Delhi, and you can read the summary of conclusions here. The first is that the well-being agenda has made giant steps all over the world and that it is based on a common understanding of the issues. The second is that progress in measuring well-being has been uneven, with great advancements in areas such as subjective well-being but much more modest ones in measuring sustainability, for example. The third is that more research is needed on the determinants of well-being, particularly on the role of policies. The fourth is that the well-being agenda is relevant for both developed and developing countries, although priorities may differ. The next OECD World Forum will take place in 2015 and be aligned with discussions on the outcomes of Rio+20 and the post-2015 agenda. The 5th Forum will thus be an important landmark for judging whether Development Goals will have become, indeed, Well-Being Goals for all.
Highlights of the week's stories about all things data.
What is to be done? Campaigns should make public every outreach message so we at least know what they are saying. These messages can be placed in a public database, like campaign contributions, so the other side can be aware of, and have the right to respond to, false claims. Political access to proprietary databases should be regulated to provide a level playing field.
“Big data” is an impressive buzzword, but don’t get caught up in that. Let’s leave it to the journalists, researchers and marketing departments to tell us what the big data market is. Instead, focus on understanding your data. Your data is much more powerful than big data; in fact, it’s the only data you truly care about.
Ever wonder what the diffusion of ideas and information on Twitter might look like? Whisper, a project shown at IEEE's VisWeek 2012 back in October, "is an online visualization tool for tracing the process of information diffusion on Twitter in real time." Their live demo isn't online yet, but they've posted some images of the "diffusion story" of last year's 6.8 magnitude earthquake and the following tsunami in Japan here. This promises to be a very interesting tool for understanding communication on Twitter.
Highlights from the week in data!
This tool traced the path of Hurricane Sandy starting on October 22. As predictions about the storm's path changed, data on potential impacts were automatically updated. This innovative tool provides information about the potentially affected population, the kinds of businesses impacted by a natural disaster, and the number and characteristics of workers, as well as where they live. You can use it to look at statistics for declared disaster areas and learn about things like the impacted industries, the ages of workers and workers' earnings. These statistics let communities know not only how many people live in a disaster area but also how many people work there.
Obama won the US presidential election, but political analyst Nate Silver was also a big winner. Today's visualization is the side-by-side comparison of Silver's US presidential election forecast and the actual election results, pulled from a Salon post titled "Nate Silver nails it."
ReadWriteWeb has also published a fantastic breakdown of forecast versus reality: "Nate Silver's Model Proves To Be Stunning Portrait Of Logic Over Punditry." Time also has a great piece on the data analysts who helped optimize resources and build a winning strategy for the Obama campaign.
With many voters displaced in New York and New Jersey because of Sandy, others waiting in huge lines to cast votes, and robocalls telling Floridians that Election Day is Wednesday, we may not see full, accurate returns tonight. Regardless of the outcome, it's a red-letter day for data visualization in the form of real-time results interfaces. It seems like every political blog and news site has its own "election central" chock full of interactive maps and dynamic charts, and here's just a brief survey of some of the resources available this year.
What are your favorites? Let us know in the comments!
Happy Friday! A few highlights this week.