Blog

October 29, 2013

From Patrick Meier's iRevolution blog: There is so much attention (and hype) around the use of social media for emergency management (SMEM) that we often forget about mainstream media when it comes to next-generation humanitarian technologies. The news media across the globe has become increasingly digital in recent years—and thus analyzable in real time. Twitter added little value during the recent Pakistan earthquake, for example. Instead, it was the Pakistani mainstream media that provided the immediate situational awareness necessary for a preliminary damage and needs assessment. This means that our humanitarian technologies need to ingest both social media and mainstream media feeds: http://irevolution.net/2013/10/29/mainstream-media-for-emergency-managem...
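
The post's takeaway, that crisis pipelines should ingest both mainstream and social media, can be sketched in a few lines. The snippet below is purely illustrative and not from the post: it pulls a placeholder news RSS feed with the feedparser library and runs the same keyword filter over a list of tweet dicts, so both sources end up in one stream. The feed URL and keyword list are assumptions, not anything the post specifies.

```python
# Illustrative sketch only: merge a mainstream-media RSS feed with a social
# media feed into one stream of crisis-relevant items. The feed URL and the
# keyword list are placeholders, not from the original post.
import feedparser  # pip install feedparser

CRISIS_KEYWORDS = {"earthquake", "flood", "cyclone", "damage", "casualties"}

def crisis_items_from_rss(feed_url):
    """Yield (source, title, link) for RSS entries mentioning a crisis keyword."""
    feed = feedparser.parse(feed_url)
    for entry in feed.entries:
        text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
        if any(word in text for word in CRISIS_KEYWORDS):
            yield ("mainstream", entry.get("title", ""), entry.get("link", ""))

def crisis_items_from_tweets(tweets):
    """Apply the same keyword filter to tweet dicts (e.g. from a streaming API)."""
    for tweet in tweets:
        if any(word in tweet.get("text", "").lower() for word in CRISIS_KEYWORDS):
            yield ("social", tweet.get("text", ""), tweet.get("id_str", ""))

if __name__ == "__main__":
    # Hypothetical feed URL; any news outlet's RSS feed would work here.
    for item in crisis_items_from_rss("http://example.com/news/rss"):
        print(item)
```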

October 29, 2013

The historic cyclone that made landfall on this date last year was so powerful and devastating that it was designated a "superstorm," had its name retired, and entered the tropical storm hall of fame. But hurricane experts fear that something far worse than Sandy, blamed for $50 billion in damage, is brewing. In the next two decades, the nation could experience a $500 billion storm. The sea level is rising, and global warming might affect future storms. But even if the world's temperature stops rising before you finish this paragraph, hurricanes far more damaging than Sandy are all but a certainty, they say. Despite unprecedented forecasting, monitoring, and warning abilities, and a record period of hurricanes avoiding landfall, disaster remains one of the nation's most robust growth industries, with almost unlimited potential. "Quite simply, there are more people and more infrastructure in harm's way," said Margaret Davidson, the National Oceanic and Atmospheric Administration's acting director of coastal resource management: http://www.emergencymgmt.com/disaster/Experts-Use-Lessons-Sandy.html

October 28, 2013

A team of scientists, led by the University of Southampton, has developed a new method to help the world’s coasts adapt to global sea-level rises over the next 100 years. Future sea-level rise seems inevitable, although the rates and geographical patterns of change remain uncertain. Given the large and growing populations and economic activity in coastal zones, as well as the importance of coastal ecosystems, the potential impacts of sea-level change are far-reaching. A University of Southampton release reports that current methods to assess the potential impact of sea-level rise have varied significantly, hindering the development of useful scenarios and, in turn, suitable adaptation policies and planning. A new study led by Professor Robert Nicholls of the University of Southampton has combined the available data on a number of climate and non-climate mechanisms (such as uplift, subsidence, and natural phenomena like earthquakes) that contribute to sea-level change, to create appropriate scenarios of sea-level rise at any location for policy-makers considering impacts and adaptation. Nicholls says: “The goal here is not to ‘scare people’ but rather to encourage policy makers to think across the full range of possibilities. Hence, the problem can be addressed in a progressive and adaptive manner where sea-level rise is planned for now, and that plan includes monitoring and learning about sea-level change over the coming decades. This means that sea-...
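
To make the scenario-combination idea concrete, here is a minimal sketch, not the study's actual method: relative sea-level rise at a site is treated as the sum of a global-mean climate scenario, a regional ocean deviation, and local vertical land motion such as subsidence or uplift. All rates in the example are invented for illustration.

```python
# Minimal illustrative sketch, not the study's method: relative sea-level
# change at a site is the sum of a global-mean scenario, a regional ocean
# deviation, and local vertical land motion (subsidence counts as additional
# relative rise). All numbers below are made up for illustration.

def relative_sea_level_rise_mm(years, global_rate_mm_yr,
                               regional_offset_mm_yr, land_motion_mm_yr):
    """Total relative sea-level change (mm) at a location after `years`."""
    climate_component = (global_rate_mm_yr + regional_offset_mm_yr) * years
    land_component = land_motion_mm_yr * years  # positive = land sinking
    return climate_component + land_component

# Example: a subsiding coastal city under a hypothetical mid-range scenario.
print(relative_sea_level_rise_mm(years=100,
                                 global_rate_mm_yr=5.0,      # hypothetical global-mean rate
                                 regional_offset_mm_yr=1.0,  # hypothetical regional deviation
                                 land_motion_mm_yr=3.0))     # hypothetical subsidence
# -> 900.0 mm of relative rise over a century
```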

October 28, 2013

From Patrick Meier's iRevolution blog: Can we use advanced computing to rapidly identify Twitter users who are reporting from ground zero? The answer is yes. An important indicator of whether or not a Twitter user is reporting from the scene of a crisis is the number of times they are retweeted. During the Egyptian revolution in early 2011, “nearly 30% of highly retweeted Twitter users were physically present at those protest events.” Starbird et al. drew on this insight to study tweets posted during the Occupy Wall Street (OWS) protests in September 2011. In their report, "Learning from the Crowd: Collaborative Filtering Techniques for Identifying On-the-Ground Twitterers during Mass Disruptions," the authors manually analyzed a sample of more than 2,300 Twitter users to determine which were tweeting from the protests. They found that 4.5% of the Twitter users in their sample were actually onsite. Using this dataset as training data, the authors were able to develop a classifier that automatically identifies Twitter users reporting from the protests with an accuracy just shy of 70%. More training data could very well help increase this accuracy score:
http://irevolution.net/2013/10/28/automatically-identifying-eyewitness-r...
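
As a rough illustration of the training step described above, the sketch below fits a generic supervised classifier (logistic regression) to a synthetic stand-in for the labeled sample. It is not the paper's collaborative-filtering method; the features, data, and class balance are placeholders chosen only to mirror the setup of roughly 2,300 labeled users with a small on-the-ground minority.

```python
# Generic supervised-learning sketch, not the paper's collaborative-filtering
# method: given labeled Twitter users (on the ground or not) and a few simple
# per-user features, train a classifier and score it. All data is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-in for a manually labeled sample of users.
# Columns: times retweeted, event-related tweets, account age (days).
n_users = 2300
X = np.column_stack([
    rng.poisson(5, n_users),
    rng.poisson(3, n_users),
    rng.integers(30, 3000, n_users),
]).astype(float)
# A small minority of positives (the paper found ~4.5% onsite),
# loosely tied to retweet count purely for illustration.
y = (rng.random(n_users) < 0.04 + 0.05 * (X[:, 0] > 8)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(class_weight="balanced", max_iter=1000).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```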

October 28, 2013

When tropical storms hit New York City, internet connectivity is often the first thing to go down. The next time it happens in the low-lying coastal community of Red Hook, Brooklyn, it will be a group of teenagers running something called a Wi-Fi mesh network that will come to the rescue--providing a model for a low-cost, community-built solution to the so-called Last Mile gaps that the massive telcos can’t (or won’t) bridge. A year ago, a community organization called Red Hook Initiative (RHI) had just started a pilot program for a Red Hook Wi-Fi mesh. A “mesh network” is a system of inexpensive router nodes that beam Wi-Fi above the streets for everyone to use; even if the internet connection goes down, the mesh still allows communication within its bounds. So while you can’t watch Netflix over a closed mesh network, you can still communicate with people in your vicinity--which is obviously crucial in emergency scenarios. In the hurricane’s aftermath, RHI handed the reins of the Red Hook Wi-Fi mesh project to a new group calling themselves The Digital Stewards, made up of eight 19- to 25-year-olds: http://www.fastcolabs.com/3020680/how-to-build-a-low-cost-wifi-mesh-netw...
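
The resilience property described above, that a mesh keeps carrying local traffic even when its internet uplink fails, can be shown with a toy reachability check. The topology and node names below are hypothetical, not RHI's actual deployment.

```python
# Toy illustration, not RHI's deployment: in a mesh, nodes relay traffic for
# each other, so losing the single uplink to the internet does not break
# communication between nodes inside the mesh.
from collections import deque

def reachable(graph, start):
    """Return the set of nodes reachable from `start` via breadth-first search."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in graph.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen

# Hypothetical rooftop-router topology with one internet uplink.
mesh = {
    "node_a": {"node_b", "node_c", "uplink"},
    "node_b": {"node_a", "node_c"},
    "node_c": {"node_a", "node_b", "node_d"},
    "node_d": {"node_c"},
    "uplink": {"node_a", "internet"},
    "internet": {"uplink"},
}

# Storm knocks out the uplink: drop it from the graph and every adjacency list.
downed = {n: {m for m in peers if m != "uplink"}
          for n, peers in mesh.items() if n != "uplink"}

print("internet" in reachable(mesh, "node_d"))    # True before the outage
print("internet" in reachable(downed, "node_d"))  # False afterwards
print("node_a" in reachable(downed, "node_d"))    # True: the mesh still carries local traffic
```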