Showing posts with label mapping.

Friday, September 10, 2010

What's your excuse?

Seriously, what reason do you have for not deploying a map server? Take your pick of the open source map servers, or even a commercial one. On Amazon Web Services, a micro instance running at 100% usage costs $5.12/month, plus a one-time charge of $54 for a 1-year (or $82 for a 3-year) reserved instance on an EBS boot (meaning you won't lose your work if you terminate the instance). If you want to host a low-bandwidth map server for testing, learning, or just because you have cool data to share via maps, that's the going rate, as the Amazon Simple Monthly Calculator shows:
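For the curious, the arithmetic behind that figure is simple. Here is a sketch; the $0.007/hr reserved-instance rate and the 732-hour billing month are my assumptions about the pricing behind the calculator's number, not official AWS figures:

```python
# Sketch of the monthly cost math behind the Amazon Simple Monthly
# Calculator figure quoted above. The $0.007/hr reserved-instance rate
# and the 732-hour billing month are assumptions, not AWS documentation.
HOURLY_RATE = 0.007        # assumed micro reserved-instance rate, USD/hr
HOURS_PER_MONTH = 732      # assumed average hours in a billing month

def monthly_cost(utilization=1.0):
    """Instance-hours cost for a month at the given utilization (0..1)."""
    return round(HOURLY_RATE * HOURS_PER_MONTH * utilization, 2)

def amortized_reservation(upfront, years):
    """Spread the one-time reservation fee over its term, per month."""
    return round(upfront / (years * 12), 2)

if __name__ == "__main__":
    print(monthly_cost())                 # 5.12 at 100% usage
    print(amortized_reservation(54, 1))   # 1-year, $54 upfront
    print(amortized_reservation(82, 3))   # 3-year, $82 upfront
```

Even adding the amortized reservation fee, a full-time map server comes in under $10/month.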





Sunday, June 6, 2010

The map fades

At Wherecamp 2010, I twittered, "as we extend our senses with sensors the map fades." The idea that maps will eventually become obsolete, mere affectations for future descendants of steampunk aficionados, has been difficult to let go of. Mike Liebhold started the idea by asking the question, "is 1:1 scale mapping a reasonable idea," on the geowanking list. This touched off a thread that included David Asbury's response containing a link to quotes that touch on the various facets of the 1:1 map. Quotes from the comedian Steven Wright and the writers Jorge Luis Borges and Lewis Carroll highlight the absurdity of such a map, and respondents to the question argued that information overload would make such a map impractical.


The answer is simple: there is no map. It's gone. Moving freely through the world without resorting to abstractions of location imposes a large cognitive burden. Studies of societies steeped in wayfinding have shown that their members develop acute abilities to distinguish environmental cues and extract meaningful information from them. What if you no longer needed years of experience to develop those wayfinding cognitive abilities?



Wikitude Drive - Test Drivers Wanted from Wikitude on Vimeo.

Sunday, June 14, 2009

CrisisCamp

CrisisCamp: June 12-14, 2009
Ignite
On Friday night, Ignite was held at the World Bank. The talks were excellent and although I didn't take notes, people twittered where to find their presentations.
@ajturner noted that it was "perhaps the swankiest Ignite ever, hosted at the World Bank." Credit for the photo also goes to @ajturner. Also, the World Bank asked that we drink all the booze.
CrisisCamp Day 1
Crisis Mapping in Sudan
Led by Patrick Meier via Skype from Khartoum, the discussion was about a UNDP project, Threat and Risk Mapping Analysis (TRMA). TRMA maps microlevel problems and threat and risk indicators to provide better information when implementing development programs and to avoid causing or exacerbating problems. The data is collected by focus groups, which capture rich local knowledge by recording information on printed maps. The data is then entered into a GIS; so far the project has collected 6,000+ data points and mapped over 700 new villages. Patrick also talked about 4W (who, what, when, where), a crisis mapping tool based on open source software. 4W shows emerging trends, situational pressures, market routes and critical fault lines throughout a region. On the interoperability front, the project has been part of establishing an information sharing group and a data sharing protocol among UN agencies for baseline data. Data sharing has been problematic and has led to duplication of effort; the desire not to share data was described as "DHD, data hugging disorder." Mesh4X was also discussed as a means for data sharing across multiple platforms and high latency networks (read: disconnected clients).
The discussion turned to data collection, especially using SMS and occasionally MMS. Andrew Turner mentioned the Youth Assets project, where SMS is used by children to perform emotional mapping. It was generally agreed that SMS is the lowest common denominator in terms of a protocol and platform for data collection. However, SMS is not always ideal, and there was some difficulty in getting structured data into Ushahidi. FrontlineSMS was also mentioned as a means for coordinating receipt and delivery of SMS messages when Internet access is not available.
The last part of the discussion tied the technology back with the practice of data collection in the field. IFAD, the International Fund for Agricultural Development published Good Practices in Participatory Mapping, which is a practical guide for data collection in the field.
UX, Usability, Visualization
This session was supposed to be about user experience, usability, and visualization, but it focused more on the reliability of crowdsourced data and verification of that data to provide reliable and actionable information. Swift, a framework for verifying crowdsourced information, was discussed.
Part of this discussion covered dissemination mechanisms and delivering crisis information on the most common platforms, which are radio and TV.
Again, SMS was noted as being one of the last communication systems to go down during the communication surges that frequently occur during a crisis. If that is the case, the question was asked, "How do you text a 911 call?" The point was made that SMS should not be used as a push mechanism for delivering alerts; rather, SMS should be used for listening to requests, which is a more efficient use of the technology.
There was also a discussion of authoritative or official sources and their role in a crisis. There are two national systems for delivering alerts during an emergency: IPAWS and CMAS.
Crowd Sourcing Situational Awareness
This session was led by David Stephenson. (Unfortunately, I walked in late, missing about half the session.)
"How do you get institutional buy-in from the government?" was the question posed during the session. Buy-in may be de facto because government cannot keep up with crowdsourcing. Stephenson noted that David Robinson of Princeton's Center for Information Technology Policy advocates that the government should publish data and leave the interpretations to the consumers of the data. It was noted that data.gov will go from 80 feeds to 100,000 feeds in a month.
With the imminent flood of data, data analysis and visualization tools such as Swivel and Many Eyes were mentioned.
Swine Flu Response
Andrew Wilson of the Department of Health and Human Services led a discussion on his work using social media in response to the swine flu pandemic. In addition to the official channel, pandemicflu.gov, he employed Facebook, Twitter and podcasts to provide and collect information about H1N1. One of his strategies was to have tweeters with large numbers of followers retweet information to create an amplifying effect. One area where they did relatively poorly was mapping, and he pointed to the Ushahidi Swine Flu map as a better implementation.
Lazy Web Disaster
At WhereCamp2009, Mikel Maron led a session where participants could yell out ideas or projects that they would like to see but would never implement. The Lazy Web Disaster session collected these ideas on Twitter - the results.
CrisisCamp Day 2
Recap of Day 1
The second day started with a long recap of the session from the first day.
It was evident in the previous sessions that there is a tension between domestic and international responses to crisis. Despite this tension, there are commonalities and possibly low hanging fruit between the two. Lack of a common vocabulary was identified as one of the obstacles that prevent organizations from working together. For example, there are 60,000 organizations nationally that provide crisis management and relief services, but they don't share a common vocabulary. As noted in the Crisis Mapping in Sudan session, the UN has developed a wiki to build a common terminology.
Noel Dickover asked, "What is the coalescing function? Was it crisis response? Why are you here?"
Greg Elin responded that crisis cuts across multiple communities, opens doors and gets people at the table. Crisis activates the bureaucracy, creates opportunities, and opens doors. Bureaucracy may not move or change, but there is still a desire to tap into opportunities, some of which are generated by social media.
The discussion turned to preparedness. Successful crisis management is based on agility, reacting to the unexpected. So, what are the tools available to handle the situation? Furthermore, is there a way to incentivize being prepared? Ready.gov is an official channel for disaster preparedness information, but why don't people know about it, and if they did, would they care?
It was noted that ready.gov is hamstrung because it's in the government space, and that being a government organization hampers its effectiveness in disseminating the message. Preparedness will require a cultural change, and current fear-based communications do not work.
The discussion turned to what the participants could do as a result of CrisisCamp, and how this dialogue and these ideas could become actionable. Concrete steps to continue the work of CrisisCamp were:
  1. develop a common language
  2. concentrate on the "what" first by identifying the gaps, and not jump to the "how," i.e. technology
  3. engage more people to provide the force multiplier, grow the 80 participants to a network of 8000 people
  4. provide tools that are simple to use. Google tools don't require training, so that is the bar to shoot for
  5. create interoperable applications as a demonstration
  6. create a demo scenario and build apps around the scenario
  7. creation of a wiki: crisiscommons.org
  8. creation of crisiscampdc.ning.com - common community platform
  9. create a common design document/template for crisis apps
  10. crisiscamp messaging matrix
The remaining sessions were devoted to discussion on implementation of these ideas.

Sunday, May 31, 2009

Ranty, rant, rant: There ought to be a law ...

Correct me if I'm wrong, but we are near the end of the first decade of the 21st century, right?  We've had more than a decade of federal initiatives to make geospatial data more available and interoperable.  So why do I get handed a largish data set (4 to 7GB depending on the format), produced by a federal agency, in a proprietary format that requires me to have a license to ESRI products?
My task is to move this data set into PostgreSQL/PostGIS, but since I don't have the necessary SDE libraries on the system, I can't even compile ogr2ogr to accomplish this.  This really is a vendor problem, specifically an ESRI problem.  If I were handed an Oracle export file, I could easily move the data by compiling ogr2ogr, because Oracle provides the necessary libraries in its client SDK, which is available for download.
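For contrast, here's what the job looks like when the source is in an open, OGR-readable format. This is a sketch; the file name, layer name, and connection parameters are placeholders, not my actual setup:

```shell
# Load an OGR-readable source (here a hypothetical shapefile) into
# PostGIS with ogr2ogr. All names and connection parameters below
# are placeholders.
ogr2ogr -f PostgreSQL \
    PG:"host=localhost dbname=gisdb user=gis" \
    roads.shp \
    -nln roads -lco GEOMETRY_NAME=geom
```

One command, no proprietary libraries required. That's the bar the proprietary format fails to clear.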
Ironically, the purpose of this exercise is to make this data more easily available.  However, I can't make it past first base because I'm hamstrung by the data format.  This is such a '90s problem.  Although shape files are the de facto lingua franca for geospatial data, the format has its limitations, namely a 2GB limit for dbf files.  It was unusual for a data set to be larger than 1GB in the '90s; but 20 years later, data sets over 2GB are not uncommon.  The current federal procurement language should be amended to mandate that geospatial software produce entire data sets in an open and accessible format.

Wednesday, April 15, 2009

Maps for Advocacy

From the WhereIdeas Wiki:
Tactical Technology Collective has put together a great book Maps for Advocacy: An Introduction to Geographical Mapping Techniques (http://www.tacticaltech.org/mapsforadvocacy). Use of maps in the political and activism space is growing and very successful. —GregElin, Sunlight Foundation.
Summary blurb from the Tactical Technology Collective website:
"The booklet is an effective guide to using maps in advocacy. The mapping process for advocacy is explained vividly through case studies, descriptions of procedures and methods, a review of data sources as well as a glossary of mapping terminology. Scattered through the booklet are links to websites which afford a glance at a few prolific mapping efforts. "
Indeed, in 44 pages the booklet covers the majority of the web-based mapping technologies available today that don't require purchasing a license.  One of the best parts of the book is the roadmap to the technologies based upon what a user wants to do.
If you provide a service or software, where do you think your technology fits?

Monday, March 2, 2009

Transparency Camp09: Mapping Session

Transparency Camp is a bar camp for open government advocates (government representatives, technologists, developers, NGOs, wonks and activists) to share knowledge on how to use new technologies to make our government transparent and meaningfully accessible to the public. Andrew Turner of GeoCommons and I ran a session on mapping, public participation and open data. Andrew avoided the "powerpoint" approach, did a wonderful job of moderating the session and made it very interactive. He captured the discussion by dividing it into two sections: Mapping and GeoData, and Solutions. Mapping and GeoData was further divided into two categories: Difficulties and Goals. Solutions was the beginning of a mind map. I've summarized the session roughly according to Andrew's categories of Difficulties and Goals.
Data input and output
The conversation started out with common frustrations, which can be divided into two problems: getting data in and getting maps out. Getting data into a mapping interface remains problematic. The session members commonly used csv, shape files and geotiffs to overlay on a base map source such as Google Maps. While data is available from web sites and from government agencies, the problem is that the data is poorly described. Data sources from different government offices are unique but contain common data. In addition, the usefulness of a particular data set is not known until the entire process of building a map is complete. The primary output problem was printing maps. On-line mapping applications do not provide an easy way to create printed maps, especially large format (D or E size) maps. There was also a concern about licensing and copyright when printing maps from an on-line source such as Google Maps. 'Easy' was the word of the day and the goal with regards to data input and output. The ability to preview the data, or even have meaningful metadata (i.e. fitness of use ranking, popularity, etc.), was needed.
Also, the problem of having multiple schemas of similar data could be addressed with a common community schema. To make it easy, there could be a graphical tool that lets one map a data set to the community schema by simply drawing lines between two fields.
Motivations
Session members said that their motivation for mapping was to tell a story and that maps were a way to tie data to communities. Maps make data real and concrete and inherently provoke a visceral reaction. Maps are used as an exploratory interface so that patterns in data can be revealed. Maps are used as a way to plan, coordinate and share information in a variety of contexts. A distinction was made about different contexts; for example, a map showing the locations of services did not work because it didn't readily answer the question of "how close is the service." In that case proximity was more important than actual location. Andrew summed it up nicely: "Geographic doesn't always mean cartographic." Although most members agreed that visualization of location was the primary use of maps, it was obvious that they also used maps in a more sophisticated way to communicate the implications of data. Current web mapping platforms remain focused on the "where is" aspect of maps. The next generation of web mapping platforms should implement the basic cartographic thematic forms of isopleths, choropleths, dot density, and proportional symbols.
Open data and fear
A good part of the discussion was about open data and why organizations did not want to release it.  Fear was the main reason cited for not releasing open data; below is a list of fears:
  • liability
  • may expose problems with the data
  • data may be used against an organization
  • some organizations fear that it will reveal patterns of behavior that will be criticized
  • concerns about real time location/tracking of archeological sites or endangered species
  • security and privacy are of concern
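The common community schema mentioned above could start as something as simple as a field-name mapping, one "line drawn" per field. A minimal sketch, where the schema, field names, and sample record are all invented for illustration:

```python
# Minimal sketch of mapping a source data set onto a shared community
# schema. The schema, field names, and sample record are invented for
# illustration; a real tool would let users draw these mappings in a GUI.
COMMUNITY_SCHEMA = ["name", "latitude", "longitude", "category"]

# One "line drawn" per field: community field -> source field
field_mapping = {
    "name": "SITE_NM",
    "latitude": "LAT_DD",
    "longitude": "LON_DD",
    "category": "TYPE",
}

def normalize(record, mapping):
    """Rename a source record's fields to the community schema."""
    return {community: record[source] for community, source in mapping.items()}

source_record = {"SITE_NM": "City Hall", "LAT_DD": 38.9,
                 "LON_DD": -77.0, "TYPE": "government"}
print(normalize(source_record, field_mapping))
```

With every data set normalized this way, the "common way to view data from different sources" below becomes a matter of writing one mapping per source.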
What they really want
To summarize, here is what they wanted:
easy 
  • data preview 
  • a common way to view data from different sources (a normalization tool)
  • making sophisticated maps guided by wizards
  • printed map output, or press ready map output for brochures
cheap 
  • the applications should be low cost or free
  • require minimal staff training
  • the amount of time spent on maintenance should be low
  • use a minimal amount of bandwidth (interesting comment that hints at an interest in hosted solutions)