Predicting Your Future Location Years in Advance
University of Rochester computer scientist Adam Sadilek has developed a system that can, under certain conditions, predict the location of an individual years in advance. The breakthrough is possible, in part, because of the wide and growing adoption of GPS-enabled smartphones. Almost 50% of the U.S. population now carries a GPS device of some sort, according to the Pew Internet and American Life Project.
Sadilek evaluated a large dataset of 703 subjects carrying GPS devices over varying time periods, comprising more than 30,000 daily location samples.
“While your location in the distant future is in general highly independent of your recent location, as we will see, it is likely to be a good predictor of your location exactly one week from now. Therefore, we view long-term prediction as a process that identifies strong motifs and regularities in subjects’ historical data, models their evolution over time, and estimates future locations by projecting the patterns into the future,” Sadilek writes.
The system, called Far Out, could be used to map future traffic congestion, disease spread, and electricity demand. Sadilek will present his paper at the Association for the Advancement of Artificial Intelligence 2012 conference in July.
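The approach Sadilek describes, finding periodic motifs in historical location data and projecting them forward, can be illustrated with a drastically simplified sketch. The cell IDs and function names below are hypothetical, and the real Far Out system models how patterns evolve over time rather than assuming they repeat unchanged; this only captures the weekly-periodicity intuition in the quote above:

```python
from collections import Counter, defaultdict
from datetime import datetime

def train(samples):
    """samples: list of (datetime, cell_id) GPS observations.
    Build a histogram of visited location cells per (weekday, hour) slot,
    exploiting the weekly regularity of human mobility."""
    slots = defaultdict(Counter)
    for t, cell in samples:
        slots[(t.weekday(), t.hour)][cell] += 1
    return slots

def predict(slots, when):
    """Predict the most frequently visited cell for the (weekday, hour)
    slot containing the future timestamp `when` -- even years ahead."""
    hist = slots.get((when.weekday(), when.hour))
    return hist.most_common(1)[0][0] if hist else None
```

Given a year of observations, a query such as `predict(slots, datetime(2013, 1, 7, 8))` returns the cell the subject most often occupied on Monday mornings at 8 A.M., regardless of how far in the future the query date lies.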
What Does Climate Change Have to Do with the Price of Corn in Iowa?
Check off another box on the list of climate change’s impacts: commodity crop price volatility. “Even one or two degrees of global warming is likely to substantially increase heat waves that lead to low-yield years and more price volatility,” says Noah Diffenbaugh, a researcher at the Stanford Woods Institute for the Environment and one of the authors of a recent Nature Climate Change paper on how climate change will impact commodity markets.
Government will also affect price volatility through such policies as promoting the use of corn as a renewable fuel source. The corn market will be less resilient if bound by the biofuels mandate, so any yield fluctuations will drive up prices even more.
Nudging the U.S. corn belt northward, to a little below the Canadian border, could help crops avoid the excessive heat that would devastate yields and destabilize market prices. Alternatively, the corn could stay in place—as long as new varieties are bred with increased heat tolerance of at least 6 degrees Fahrenheit.
The government’s attempts to curb the causes of climate change might have costly unintended effects, although less costly than the impact of climate change itself. In a recent survey by Jon Krosnick at the Stanford Woods Institute for the Environment, most respondents favored federal tax breaks for companies producing alternative energies (wind, water, and solar). But with the politically charged topic on the table during the presidential race, only 62% of Americans say they support government action to address climate change—a drop from 72% in the survey’s 2010 iteration.
This reversal in public sentiment is out of step with growing consensus on global climate change among scientists. According to a recent Yale survey, more than 90% of climate scientists now agree on the existence of man-made global warming.
Machines That Can Read Between the Lines
In the military, lives depend on human leaders getting messages and understanding them in full. Unfortunately, defense operators and analysts receive huge volumes of data from many sources, and written texts’ meanings are not always obvious. Important information may not be explicitly stated, key details may be unclear, and there may only be indirect references to important activities and objects.
Under deadline pressure, the readers may miss important points. Now, a new “natural-language processing” computer program could help human leaders to cut through the haze.
The U.S. Defense Advanced Research Projects Agency (DARPA) is creating a Deep Exploration and Filtering of Text (DEFT) program that will read documents and infer their implicit meaning far more quickly than a human reader could.
Defense analysts who use this program will be able to investigate and process far more documents in less time; they’ll also be able to discover important but implicitly expressed information in the documents. The system will identify connections among documents, filter redundancies, and infer implicit information, all of which will ease planning and decision making.
U.S. Census Improves in Accuracy
In its post-enumeration surveys, the U.S. Census Bureau reports that it achieved near-zero overcounting of the nation’s population in the 2010 Census. The net overcount of 0.01% (representing 36,000 people) improves upon the 2000 Census overcount of 0.49% and the 1990 Census undercount of 1.61%.
The survey sampled the 300.7 million Americans living in housing units (excluding nursing homes, college dorms, and other group quarters) and matched responses to the Census in order to estimate errors. These errors may include omissions, duplications, imputable demographic characteristics, and fictitious responses.
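The matching step described above underlies dual-system (capture-recapture) estimation, the standard statistical technique behind post-enumeration surveys: comparing who was counted in the census, who was counted in the independent survey, and who appears in both yields an estimate of the true population. A minimal sketch with purely illustrative numbers (the Census Bureau's actual methodology involves many additional adjustments):

```python
def dual_system_estimate(census_count, survey_count, matched):
    """Lincoln-Petersen (dual-system) population estimate.

    If the census counted `census_count` people in an area, an independent
    post-enumeration survey counted `survey_count`, and `matched` people
    appear in both, the estimated true population is:
        N ~= census_count * survey_count / matched
    """
    return census_count * survey_count / matched

def net_coverage_error(census_count, estimated_true):
    """Net undercount rate: positive means the census missed people,
    negative means it overcounted."""
    return (estimated_true - census_count) / estimated_true
```

With invented figures—a census count of 950, a survey count of 900, and 880 matches—the estimated true population is about 971.6, implying a net undercount of roughly 2.2%; the 2010 Census's near-zero net error means the estimate and the count almost coincided.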
Post-enumeration surveys are part of the Census Bureau’s strategy to improve its data collection. Other efforts involve evaluating Census operations and data-collection processes and comparing other methods for estimating population size. The goal is to improve Census processes (and the resulting data) for the 2020 Census.
“On this one evaluation—the net undercount of the total population—this was an outstanding census,” says Census Bureau Director Robert Groves. “When this fact is added to prior positive evaluations, the American public can be proud of the 2010 Census their participation made possible.”
A remaining challenge to Census accuracy is reaching the nation’s harder-to-count renters and minority populations, says Groves.
Source: U.S. Bureau of the Census