Friday, December 31, 2010

Install script for Geoserver on Ubuntu EC2 instance

Continuing with the theme of running stuff on the free tier of Amazon Web Services, here's a script to install Geoserver proxied through Apache for when you want to throw up a quick map server. The script installs Geoserver 2.0.2 on Ubuntu Maverick 10.10 using ami-cef405a7.

The script is available here.

# install Geoserver on Ubuntu Maverick 10.10
# note: Geoserver is proxied through apache so port 8080 is not used
# @spara 11/15/10

# setup sources 
sudo sh -c "echo ' ' >> /etc/apt/sources.list"
sudo sh -c "echo 'deb maverick multiverse' >> /etc/apt/sources.list"
sudo sh -c "echo 'deb-src maverick multiverse' >> /etc/apt/sources.list"
sudo sh -c "echo 'deb maverick-updates multiverse' >> /etc/apt/sources.list"
sudo sh -c "echo 'deb-src maverick-updates multiverse' >> /etc/apt/sources.list"
sudo sh -c "echo 'deb maverick partner' >> /etc/apt/sources.list"
sudo apt-get update

# magic! (installs java without physically accepting license)
echo "sun-java6-jdk shared/accepted-sun-dlj-v1-1 boolean true" | sudo -E debconf-set-selections

# setup prerequisites 
sudo apt-get -y install sun-java6-bin
export JAVA_HOME=/usr/lib/jvm/java-6-sun
sudo apt-get -y install unzip

# set java paths
sudo touch /etc/profile.d/
sudo sh -c "echo 'export JAVA_HOME=/usr/lib/jvm/java-6-sun' >> /etc/profile.d/"
sudo sh -c "echo 'export PATH=$PATH;$JAVA_HOME/bin' >> /etc/profile.d/"
sudo source /etc/profile.d/

#install tomcat6
sudo apt-get install -y tomcat6
sudo chgrp -R tomcat6 /etc/tomcat6
sudo chmod -R g+w /etc/tomcat6

# install and config apache
sudo apt-get install -y apache2
sudo ln -s /etc/apache2/mods-available/proxy.conf /etc/apache2/mods-enabled/proxy.conf
sudo ln -s /etc/apache2/mods-available/proxy.load /etc/apache2/mods-enabled/proxy.load
sudo ln -s /etc/apache2/mods-available/proxy_http.load /etc/apache2/mods-enabled/proxy_http.load

# add tomcat proxy
sudo chmod 666 /etc/apache2/sites-available/default
sudo sed -i '$d'  /etc/apache2/sites-available/default
sudo sh -c "echo ' ' >> /etc/apache2/sites-available/default"
sudo sh -c "echo 'ProxyRequests Off' >> /etc/apache2/sites-available/default"
sudo sh -c "echo '# Remember to turn the next line off if you are proxying to a NameVirtualHost' >> /etc/apache2/sites-available/default"
sudo sh -c "echo 'ProxyPreserveHost On' >> /etc/apache2/sites-available/default"
sudo sh -c "echo ' ' >> /etc/apache2/sites-available/default"
sudo sh -c "echo '<Proxy *>' >> /etc/apache2/sites-available/default"
sudo sh -c "echo '    Order deny,allow' >> /etc/apache2/sites-available/default"
sudo sh -c "echo '    Allow from all' >> /etc/apache2/sites-available/default"
sudo sh -c "echo '</Proxy>' >> /etc/apache2/sites-available/default"
sudo sh -c "echo ' ' >> /etc/apache2/sites-available/default"
sudo sh -c "echo 'ProxyPass /geoserver http://localhost:8080/geoserver' >> /etc/apache2/sites-available/default"
sudo sh -c "echo 'ProxyPassReverse /geoserver http://localhost:8080/geoserver' >> /etc/apache2/sites-available/default"
sudo sh -c "echo ' ' >> /etc/apache2/sites-available/default"
sudo sh -c "echo '</VirtualHost>' >> /etc/apache2/sites-available/default"
sudo chmod 644 /etc/apache2/sites-available/default
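Taken together, the lines appended above amount to a single reverse-proxy stanza. Here is a small Python sketch that renders the equivalent block as a string — the `proxy_stanza` helper is illustrative only, not part of the install script:

```python
# Render the Apache reverse-proxy stanza that the echo calls above build up,
# parameterized on the application path and backend URL. Illustrative helper,
# not part of the original script.
def proxy_stanza(app_path="/geoserver", backend="http://localhost:8080/geoserver"):
    lines = [
        "ProxyRequests Off",
        "# Remember to turn the next line off if you are proxying to a NameVirtualHost",
        "ProxyPreserveHost On",
        "<Proxy *>",
        "    Order deny,allow",
        "    Allow from all",
        "</Proxy>",
        f"ProxyPass {app_path} {backend}",
        f"ProxyPassReverse {app_path} {backend}",
    ]
    return "\n".join(lines)

if __name__ == "__main__":
    print(proxy_stanza())
```

Rendering the stanza programmatically makes it easy to reuse the same pattern for other Tomcat webapps behind the same Apache front end.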

# get geoserver, change to version you want
sudo service tomcat6 stop
sudo unzip -d /var/lib/tomcat6/webapps/
sudo chown -R tomcat6 /var/lib/tomcat6/webapps/geoserver.war
sudo chgrp tomcat6 /var/lib/tomcat6/webapps/geoserver.war
sudo chmod g+w /var/lib/tomcat6/webapps/geoserver.war

# restart
sudo service tomcat6 restart
sudo service apache2 restart

# echo message
echo " "
echo "Geoserver is available at: http://$addy/geoserver"

# additional tweaks for production instances
# add the following options to
# JAVA_OPTS="-Djava.awt.headless=true -Xms256m -Xmx768m -Xrs -XX:PerfDataSamplingInterval=500 -XX:MaxPermSize=128m -DGEOSERVER_DATA_DIR=/var/lib/tomcat6/webapps/geoserver/data"

Thursday, December 30, 2010

Install script for ThinkUp 0.7 on Ubuntu EC2 instance

ThinkUp is a nifty web app for managing social media; from the site:
ThinkUp captures your posts, replies, retweets, friends, followers, and links on social networks like Twitter and Facebook. We'll be adding more networks in the future. ThinkUp stores your social data in a database you control, and makes it easy to search, sort, filter, export, and visualize in useful ways.
ThinkUp requires the LAMP stack, a number of PHP packages, and sendmail. Installing these individually can be daunting, so I wrote a script that takes care of all the prerequisites and installs ThinkUp on an Ubuntu EC2 instance. I commented out the phpMyAdmin installation because it isn't necessary, but it is nice to have if you need to make changes to the database.

For testing, I used my Ubuntu 10.10 AMI that complies with AWS free tier requirements: ami-8548bfec. One caveat: I set my ThinkUp account email to Gmail, which seems to mark the autoregistration notification email as spam, so check your spam folder first.

UPDATE 12/30/10: Canonical released refreshed UEC images for 10.10 (Maverick Meerkat) with 8GB root EBS volumes that will run on the AWS free tier. The list of Amazon published AMIs is available here.

UPDATE 1/5/11: Andy Baio updated the script and wrote a tutorial to perform the whole install in the browser. The tutorial is on the ThinkUp wiki.

# install ThinkUp on EC2 Ubuntu instance:
# @spara 12/23/10

echo "Installing required packages, follow the prompts"
sleep 2

# install required packages
sudo apt-get update
sudo tasksel install lamp-server
sudo apt-get -y install unzip
sudo apt-get -y install curl libcurl3 libcurl3-dev php5-curl php5-mcrypt php5-gd --fix-missing
sudo apt-get -y install sendmail

# restart apache to init php packages
sudo service apache2 restart

# not necessary but nice to have
#sudo apt-get -y install phpmyadmin

wget --no-check-certificate
sudo unzip -d /var/www/

# config thinkup installer
sudo ln -s /usr/sbin/sendmail /usr/bin/sendmail
sudo chown -R www-data /var/www/thinkup/_lib/view/compiled_view/
sudo touch /var/www/thinkup/
sudo chown www-data /var/www/thinkup/

# create database
echo -n "Enter the MySQL admin password: "
read -e pword
mysqladmin -h localhost -u root -p$pword create thinkup

# echo message
echo "Copy the URL below to install and configure Thinkup"
echo "http://$addy/thinkup/install/"

Wednesday, December 29, 2010

Build-out Script for Postgres/PostGIS with RAID 10 on Amazon EBS volumes

My iteration on Simon Tokumine's script to install Postgres on Amazon Web Services. This version is based on Ubuntu 10.04 and builds out a RAID 10 drive, installs GEOS, Proj4, osm2pgsql, and PostGIS from source, and creates a database ready for loading OSM data.

# Amazon EC2 PostGIS 1.5 on RAID10,f2 EBS Array Build Script
# Complete Rip off of:
# Additional glue by Simon Tokumine, 15/11/09
# Additions by Sophia Parafina, 10/08/10
#        added additional repos to sources.list
#        custom postgis, proj4, geos build
#        added packages for building postgis, proj4, geos
#        configured to build RAID10
#        customized for Canonical Ubuntu AMIs
# I ORIGINALLY USED THE 32-bit AMI: ami-ccf615a5 (jaunty)

#Please complete the parts that are in []'s (overwriting the []'s)
#then just run the script on the server

# change this to your keypair and cert
export EC2_PRIVATE_KEY=~[key.pem]
export EC2_CERT=~[cert.pem]
# change this to your instance
instanceid=[i-xxxxxxxx]
# change to the instance's availability zone
availability_zone=[us-east-1a]
# builds out RAID10, so size of RAID=volumes*size/2,
#   change this to your needs
volumes=[4]
size=[10]
raid_array_location=[/dev/md0]
raid_chunk=[256]
raid_level=10
raid_layout=f2
# change to your mount point
mountpoint=[/mnt/data]
# change to a device
device=[/dev/sdh]
# change to your password
postgres_password=[password]
# create a postgis template
db_name=template_postgis
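To make the sizing comment above concrete: RAID10 keeps two copies of every block, so usable capacity is half the raw total. A quick check (the volume counts and sizes are made-up examples):

```python
# RAID10 stores two copies of every block, so usable space = volumes * size / 2.
def raid10_usable_gb(volumes, size_gb, copies=2):
    """Usable capacity of a RAID10 array built from `volumes` disks of `size_gb` each."""
    return volumes * size_gb / copies

# e.g. four 10 GB EBS volumes -> 4 * 10 / 2 = 20 GB usable
print(raid10_usable_gb(4, 10))
```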


sudo sh -c "echo ' ' >> /etc/apt/sources.list"
sudo sh -c "echo 'deb lucid multiverse' >> /etc/apt/sources.list"
sudo sh -c "echo 'deb-src lucid multiverse' >> /etc/apt/sources.list"
sudo sh -c "echo 'deb lucid-updates multiverse' >> /etc/apt/sources.list"
sudo sh -c "echo 'deb-src lucid-updates multiverse' >> /etc/apt/sources.list"
# this is specific for lucid only
sudo sh -c "echo 'deb lucid partner' >> /etc/apt/sources.list"
sudo apt-get update
sudo apt-get -y install ec2-api-tools
sudo apt-get -y install sun-java6-bin
export JAVA_HOME=/usr/lib/jvm/java-6-sun

devices=$(perl -e 'for$i("h".."k"){for$j("",1..15){print"/dev/sd$i$j\n"}}'|
head -$volumes)
i=1
while [ $i -le $volumes ]; do
   volumeid=$(ec2-create-volume -z $availability_zone --size $size | cut -f2)
   echo "$i: created $volumeid"
   # pick the i-th device name from the list generated above
   device=$(echo "$devices" | sed -n "${i}p")
   ec2-attach-volume $volumeid -i $instanceid -d $device
   volumeids="$volumeids $volumeid"
   let i=i+1
done
echo "volumeids='$volumeids'"

sudo apt-get update &&
sudo apt-get install -y mdadm xfsprogs


#builds out RAID10
yes | sudo mdadm \
--create $raid_array_location \
--chunk=$raid_chunk \
--level=$raid_level \
--layout=$raid_layout \
--metadata=1.1 \
--raid-devices $volumes $devices

echo DEVICE $devices | sudo tee /etc/mdadm.conf
sudo mdadm --detail --scan | sudo tee -a /etc/mdadm.conf

sudo mkfs.xfs $raid_array_location

echo "$raid_array_location $mountpoint xfs noatime 0 0" | sudo tee -a /etc/fstab
sudo mkdir $mountpoint
sudo mount $mountpoint

#echo " " >> /etc/apt/sources.list
#echo "deb jaunty main" >> /etc/apt/sources.list
#echo "deb-src jaunty main" >> /etc/apt/sources.list
#sudo apt-key adv --keyserver --recv-keys 8683D8A2
#sudo apt-get update
sudo apt-get install libxml2-dev

sudo apt-get -y install postgresql-8.4 postgresql-server-dev-8.4 postgresql-contrib-8.4 libpq-dev
sudo /etc/init.d/postgresql-8.4 stop

sudo mkdir $mountpoint/data
sudo chmod -R 700 $mountpoint/data
sudo chown -R postgres.postgres $mountpoint/data
sudo -u postgres /usr/lib/postgresql/8.4/bin/initdb -D $mountpoint/data
sudo sed -i.bak -e 's/port = 5433/port = 5432/' /etc/postgresql/8.4/main/postgresql.conf
sudo sed -i.bak -e "s@\/var\/lib\/postgresql\/8.4\/main@$mountpoint\/data@" /etc/postgresql/8.4/main/postgresql.conf
sudo sed -i.bak -e 's/ssl = true/#ssl = true/' /etc/postgresql/8.4/main/postgresql.conf
sudo /etc/init.d/postgresql-8.4 start
cd /tmp
sudo apt-get -y install bzip2
sudo apt-get -y install g++
sudo apt-get -y install checkinstall

# install geos
bunzip2 geos-3.2.2.tar.bz2
tar xvf geos-3.2.2.tar
cd geos-3.2.2
./configure
make && sudo checkinstall --pkgname geos --pkgversion 3.2.2-src --default

# install proj
cd ../
tar xvfz proj-4.7.0.tar.gz
cd proj-4.7.0
./configure
make && sudo checkinstall --pkgname proj4 --pkgversion 4.7.0-src --default
cd ../

# install postgis as a package for easier removal if needed
tar xvfz postgis-1.5.2.tar.gz
cd postgis-1.5.2
./configure
make && sudo checkinstall --pkgname postgis --pkgversion 1.5.2-src --default # remove with dpkg -r postgis
sudo /sbin/ldconfig

# config template_postgis
sudo -u postgres psql -c"ALTER user postgres WITH PASSWORD '$postgres_password'"
sudo -u postgres createdb $db_name
sudo -u postgres createlang -d$db_name plpgsql
sudo -u postgres psql -d$db_name -f /usr/share/postgresql/8.4/contrib/postgis-1.5/postgis.sql
sudo -u postgres psql -d$db_name -f /usr/share/postgresql/8.4/contrib/postgis-1.5/spatial_ref_sys.sql
sudo -u postgres psql -d$db_name -c"select postgis_lib_version();"

# osm
# install osm2pgsql
cd /tmp
sudo apt-get -y install subversion
sudo apt-get -y install autoconf
sudo apt-get -y install libbz2-dev
svn export
cd osm2pgsql
sed -i 's/-g -O2/-O2 -march=native -fomit-frame-pointer/' Makefile
sudo make install

# create osm database
# sudo -u postgres createdb -T template_postgis osm

Recipe: Ubuntu Maverick 10.10 on the AWS Free Tier

I love free stuff, who doesn't? In grad school, I once found a fancy cake in a shopping cart in the grocery store parking lot. I brought it to our graduate seminar to share with my fellow grad students, who enjoyed it until I mentioned where the cake came from.

I pretend to read the fine print, but in my excitement over free I usually fail to comprehend it. So it was for the AWS free tier announcement. I excitedly spun up an Official Ubuntu AMI and went on my merry way setting up apps, thinking all the time how wonderful it was to have my very own server on the interwebs humming away for gratis. At the end of the month, I got a bill for $0.50. Huh, not free, so I read the fine print (again):
10 GB of Amazon Elastic Block Storage, plus 1 million I/Os, 1 GB of snapshot storage, 10,000 snapshot Get Requests and 1,000 snapshot Put Requests*
Oops, the Official Ubuntu images have a 15GB root file system.

Scott Moser provided a recipe for making an AWS Ubuntu 10.10 AMI with a 10GB root file system. So I made a public AMI based on his recipe that y'all can use: ami-8548bfec

UPDATE 12/30/10: Canonical released refreshed UEC images for 10.10 (Maverick Meerkat) with 8GB root EBS volumes that will run on the AWS free tier. The list of Amazon published AMIs is available here.

Trigonometric and Spatial functions for MySQL

MySQL function snippets for the azimuth, the perpendicular of the azimuth, the coordinates of a point given a bearing and a distance, and the Vincenty distance. These functions are for use at large scales, i.e., short distances.

Calculate the azimuth between two points (from Aviation Formulary V1.45):

   DECLARE lat_rad1 DOUBLE;
   DECLARE lat_rad2 DOUBLE;

   SET var = lng1;

   SET lat_rad1 := RADIANS(lat1);
   SET lat_rad2 := RADIANS(lat2);
   SET dlon := RADIANS(lng2-lng1);
   SET az := atan2( sin(dlon)*cos(lat_rad2),
                 (cos(lat_rad1)*sin(lat_rad2)) - (sin(lat_rad1)*cos(lat_rad2)*cos(dlon)) );
   SET az := (DEGREES(az) + 360) % 360;
   RETURN az;
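The same calculation in Python — a sketch of the Aviation Formulary initial-bearing formula, handy for checking the MySQL function against known cases:

```python
import math

def azimuth(lat1, lng1, lat2, lng2):
    """Initial bearing from point 1 to point 2, in degrees clockwise from north."""
    lat_rad1, lat_rad2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lng2 - lng1)
    az = math.atan2(math.sin(dlon) * math.cos(lat_rad2),
                    math.cos(lat_rad1) * math.sin(lat_rad2)
                    - math.sin(lat_rad1) * math.cos(lat_rad2) * math.cos(dlon))
    # normalize from (-180, 180] to [0, 360)
    return (math.degrees(az) + 360) % 360
```

Due east from the equator should come out as 90, due north as 0, due west as 270.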

Calculate perpendicular bearing of an azimuth (note: MySQL was not correctly calculating values for 90° and 270°, so I hard coded the correct values):
DROP FUNCTION IF EXISTS perpendicularBearing//
CREATE FUNCTION perpendicularBearing(azimuth DOUBLE, side VARCHAR(5))
   DECLARE perp_az DOUBLE;

   SET perp_az = DEGREES(atan(1/tan(RADIANS(azimuth))*-1));
   IF side = "left" THEN
      SET perp_az = 180 + (180 -(180 - perp_az));
   END IF;
   IF azimuth = 90 THEN
      SET perp_az = 0;
   END IF;
   IF azimuth = 270 THEN
      SET perp_az = 0;
   END IF;
   IF azimuth = -90 THEN
      SET perp_az = 0;
   END IF;
   IF azimuth = -270 THEN
      SET perp_az = 0;
   END IF;
   RETURN perp_az;
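The trig in the function above blows up where tan is undefined, which is why the 90°/270° special cases exist. Modular arithmetic sidesteps that entirely; here is a Python sketch of that alternative formulation (my own, not a transcription of the SQL — it assumes "left" means 90° counter-clockwise and "right" 90° clockwise):

```python
def perpendicular_bearing(azimuth, side="right"):
    """Bearing 90 degrees off the given azimuth; no trig, so no blow-up at 90/270."""
    offset = -90 if side == "left" else 90
    return (azimuth + offset) % 360
```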

Calculate the coordinates of a point given a start point, an azimuth, and a distance in feet (from Aviation Formulary V1.45):
DROP FUNCTION IF EXISTS destinationPoint//
CREATE FUNCTION destinationPoint(azimuth DOUBLE, distance DOUBLE, lat DOUBLE, lng DOUBLE)
   DECLARE lat_rad DOUBLE;
   DECLARE lng_rad DOUBLE;
   DECLARE dist_km DOUBLE;
   DECLARE dist_rad DOUBLE;
   DECLARE lat_dest DOUBLE;
   DECLARE lng_dest DOUBLE;
   DECLARE az_rad DOUBLE;
   DECLARE dest_point Point;
   SET dist_km = distance * 0.0003048;
   SET dist_rad = dist_km/6371;
   SET lat_rad = RADIANS(lat);
   SET lng_rad = RADIANS(lng);
   SET az_rad = RADIANS(azimuth);
   SET lat_dest = asin(sin(lat_rad)*cos(dist_rad) + 
                      cos(lat_rad)*sin(dist_rad)*cos(az_rad) );
   SET lng_dest = lng_rad + atan2(sin(az_rad)*sin(dist_rad)*cos(lat_rad), 
                                  cos(dist_rad)-sin(lat_rad)*sin(lat_dest));
   SET lng_dest = (lng_dest+3*pi())%(2*pi()) - pi();
   SET dest_point = Point(DEGREES(lng_dest),DEGREES(lat_dest));
   RETURN dest_point;
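The same destination-point math in Python — a sketch that returns a (lng, lat) tuple rather than a MySQL Point, with the distance in feet as above:

```python
import math

EARTH_RADIUS_KM = 6371.0
FEET_TO_KM = 0.0003048

def destination_point(azimuth, distance_ft, lat, lng):
    """Point reached from (lat, lng) on the given bearing after distance_ft feet."""
    dist_rad = distance_ft * FEET_TO_KM / EARTH_RADIUS_KM   # angular distance
    lat_rad, lng_rad, az_rad = map(math.radians, (lat, lng, azimuth))
    lat_dest = math.asin(math.sin(lat_rad) * math.cos(dist_rad)
                         + math.cos(lat_rad) * math.sin(dist_rad) * math.cos(az_rad))
    lng_dest = lng_rad + math.atan2(
        math.sin(az_rad) * math.sin(dist_rad) * math.cos(lat_rad),
        math.cos(dist_rad) - math.sin(lat_rad) * math.sin(lat_dest))
    # normalize longitude to [-180, 180)
    lng_dest = (lng_dest + 3 * math.pi) % (2 * math.pi) - math.pi
    return math.degrees(lng_dest), math.degrees(lat_dest)
```

Heading due north from (0, 0) for one degree's worth of arc should land at latitude 1; due east should land at longitude 1.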

Finally, Vincenty Distance calculations from bramsi at

    COMMENT 'Vincenty Distance WGS-84'

DECLARE wgs84_major DOUBLE;
DECLARE wgs84_minor DOUBLE;
DECLARE wgs84_flattening DOUBLE;

DECLARE reduced_lat1 DOUBLE;
DECLARE reduced_lat2 DOUBLE;
DECLARE sin_reduced1 DOUBLE;
DECLARE cos_reduced1 DOUBLE;
DECLARE sin_reduced2 DOUBLE;
DECLARE cos_reduced2 DOUBLE;

DECLARE lambda_lng DOUBLE;
DECLARE lambda_prime DOUBLE;

DECLARE iter_limit INT;

DECLARE sin_lambda_lng DOUBLE;
DECLARE cos_lambda_lng DOUBLE;
DECLARE cos_sq_alpha DOUBLE;
DECLARE cos2_sigma_m DOUBLE;
DECLARE delta_sigma DOUBLE;

SET lng_rad1 := RADIANS(lng1);
SET lat_rad1 := RADIANS(lat1);
SET lng_rad2 := RADIANS(lng2);
SET lat_rad2 := RADIANS(lat2);

SET wgs84_major := 6378.137;
SET wgs84_minor := 6356.7523142;
SET wgs84_flattening := 1 / 298.257223563;

SET delta_lng := lng_rad2 - lng_rad1;

SET reduced_lat1 := atan((1 - wgs84_flattening) * tan(lat_rad1));
SET reduced_lat2 := atan((1 - wgs84_flattening) * tan(lat_rad2));

SET sin_reduced1 := sin(reduced_lat1);
SET cos_reduced1 := cos(reduced_lat1);
SET sin_reduced2 := sin(reduced_lat2);
SET cos_reduced2 := cos(reduced_lat2);

SET lambda_lng := delta_lng;
SET lambda_prime := 2 * pi();

SET iter_limit = 20;

WHILE abs(lambda_lng - lambda_prime) > pow(10, -11) and iter_limit > 0 DO
     SET sin_lambda_lng := sin(lambda_lng);
     SET cos_lambda_lng := cos(lambda_lng);

     SET sin_sigma := sqrt(pow((cos_reduced2 * sin_lambda_lng), 2) +
                      pow((cos_reduced1 * sin_reduced2 - sin_reduced1 *
                       cos_reduced2 * cos_lambda_lng), 2));

     IF sin_sigma = 0 THEN
        RETURN 0;
     END IF;

     SET cos_sigma := (sin_reduced1 * sin_reduced2 +
                         cos_reduced1 * cos_reduced2 * cos_lambda_lng);

     SET sigma := atan2(sin_sigma, cos_sigma);

     SET sin_alpha := cos_reduced1 * cos_reduced2 * sin_lambda_lng / sin_sigma;
     SET cos_sq_alpha := 1 - pow(sin_alpha, 2);

     IF cos_sq_alpha != 0 THEN
         SET cos2_sigma_m := cos_sigma - 2 * (sin_reduced1 * sin_reduced2 / cos_sq_alpha);
     ELSE
         SET cos2_sigma_m := 0.0;
     END IF;

     SET C := wgs84_flattening / 16.0 * cos_sq_alpha * (4 + wgs84_flattening * (4 - 3 * cos_sq_alpha));

     SET lambda_prime := lambda_lng;
     SET lambda_lng := (delta_lng + (1 - C) * wgs84_flattening * sin_alpha *
                   (sigma + C * sin_sigma *
                    (cos2_sigma_m + C * cos_sigma *
                     (-1 + 2 * pow(cos2_sigma_m, 2)))));
     SET iter_limit := iter_limit - 1;
END WHILE;

IF iter_limit = 0 THEN
   RETURN NULL; -- did not converge
END IF;
SET u_sq := cos_sq_alpha * (pow(wgs84_major, 2) - pow(wgs84_minor, 2)) / pow(wgs84_minor, 2);

SET A := 1 + u_sq / 16384.0 * (4096 + u_sq * (-768 + u_sq *
                                        (320 - 175 * u_sq)));

SET B := u_sq / 1024.0 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)));

SET delta_sigma := (B * sin_sigma *
               (cos2_sigma_m + B / 4. *
                (cos_sigma * (-1 + 2 * pow(cos2_sigma_m, 2)) -
                 B / 6. * cos2_sigma_m * (-3 + 4 * pow(sin_sigma, 2)) *
                 (-3 + 4 * pow(cos2_sigma_m, 2)))));

SET gcdx := wgs84_minor * A * (sigma - delta_sigma);

IF metric = 'km' THEN
 RETURN gcdx;
ELSEIF metric = 'mi' THEN
 RETURN gcdx * 0.621371192;
ELSEIF metric = 'nm' THEN
 RETURN gcdx / 1.852;
ELSE
 RETURN gcdx;
END IF;
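For reference, a Python transcription of the same Vincenty iteration — a sketch based on the constants and loop above, returning kilometres and leaving the unit conversions to the caller:

```python
import math

def vincenty_km(lat1, lng1, lat2, lng2, max_iter=20, tol=1e-11):
    """Vincenty inverse distance on the WGS-84 ellipsoid, in kilometres."""
    a, b = 6378.137, 6356.7523142          # WGS-84 major/minor axes, km
    f = 1 / 298.257223563                  # flattening
    lat_r1, lat_r2 = math.radians(lat1), math.radians(lat2)
    delta_lng = math.radians(lng2 - lng1)
    u1 = math.atan((1 - f) * math.tan(lat_r1))   # reduced latitudes
    u2 = math.atan((1 - f) * math.tan(lat_r2))
    sin_u1, cos_u1 = math.sin(u1), math.cos(u1)
    sin_u2, cos_u2 = math.sin(u2), math.cos(u2)

    lam = delta_lng
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cos_u2 * sin_lam) ** 2
                              + (cos_u1 * sin_u2 - sin_u1 * cos_u2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0                     # coincident points
        cos_sigma = sin_u1 * sin_u2 + cos_u1 * cos_u2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cos_u1 * cos_u2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2_sigma_m = (cos_sigma - 2 * sin_u1 * sin_u2 / cos_sq_alpha
                        if cos_sq_alpha != 0 else 0.0)   # equatorial lines
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev, lam = lam, (delta_lng + (1 - C) * f * sin_alpha
                              * (sigma + C * sin_sigma
                                 * (cos2_sigma_m + C * cos_sigma
                                    * (-1 + 2 * cos2_sigma_m ** 2))))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = (B * sin_sigma
                   * (cos2_sigma_m + B / 4
                      * (cos_sigma * (-1 + 2 * cos2_sigma_m ** 2)
                         - B / 6 * cos2_sigma_m * (-3 + 4 * sin_sigma ** 2)
                         * (-3 + 4 * cos2_sigma_m ** 2))))
    return b * A * (sigma - delta_sigma)
```

One degree of longitude along the equator is about 111.32 km, which makes a convenient sanity check.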


Sunday, November 21, 2010

OSX Boonana Trojan

SecureMac has discovered a new trojan horse in the wild that affects Mac OS X, including Snow Leopard (OS X 10.6), the latest version of OS X. The trojan horse, trojan.osx.boonana.a, is spreading through social networking sites, including Facebook, disguised as a video. The trojan is currently appearing as a link in messages on social networking sites with the subject "Is this you in this video?" ... more

Saturday, November 13, 2010

Wave Protocol Summit Videos and Documents

Pamela Fox uploaded and posted the videos from the Wave Protocol Summit in a wave. I've put the links here for convenience.

Wave Summit Talks

Links to Youtube videos and slides.

Day 1:
Day 2:
Day 3:

Thursday, November 11, 2010

Movie looping in the iPad

None of the movie players currently available for the iPad (the iPad Videos app, VLC, CineXPlayer) will loop a movie continuously. A quick search suggested playing the movie in the browser using a bit of HTML5.
<html xmlns="">
   <head>
      <title>Parrot AR Drone</title>
   </head>
   <body>
      <video src="" width="576" height="576" controls="controls" autoplay="autoplay" loop="loop" qtnext1="goto0" controller="false" kioskmode="true"></video>
   </body>
</html>
Note that this won't work for Flash movies.

Saturday, October 30, 2010

Texas GIS Forum 2010

I've never been fond of regional GIS conferences because they seemed set up to sing the praises of the sales team of that GIS vendor whose name I dare not speak because of the Voldemort rule.  In full disclosure, I haven't been to a regional GIS conference in six to eight years — when I was howling in the wilderness about "web services" and hearing crickets in the background. 

I attended the Texas GIS Forum and had my rather jaded view of regional GIS conferences rearranged. I've always loved presentations about how people are using GIS and spatial technologies to solve problems. Listening to people talk about how they apply their domain knowledge and using or creating tools is the most enjoyable part of conferences for me. However, I've always preferred presentations where people create tools or creatively use tools with a bit of side-ways thinking, over presentations where they use only the vendor provided toolset.  It's not that there isn't a lot of creative problem solving going on with single vendor solutions, but when people start using an assortment of tools I usually learn something new to add to my own toolbox.  

A summary bullet point from a presentation seemed to me the main lesson of the conference:
"Don't be afraid to mix technology in your GIS/Web stack"
When I saw that, I knew I was at the right place.

Story Musgrave gave a fascinating keynote that wove together elements of his life starting from growing up on the farm, working as an aircraft mechanic during the Korean War, life as an astronaut at NASA, the 18 years he spent managing the Hubble Space Telescope project, and his current activities in the rotting business and teaching design at Art Center College of Design in Pasadena, CA. Throughout the keynote he referenced the principles of simplicity and reliability in design, while gorgeous photos of his life were shown on the screen as examples. He graciously made his slides available to the audience. They are available for download.

I did not attend as many sessions as I wanted (one day I would like to be at a conference with zero telecon responsibilities), but I made a few notes on some of the presentations that I did see.

  • Get the Results You Want, Mapping with KML: Michael Chamberlain of TxDOT TPP demoed an application that combined TxDOT's Linear Referencing System (LRS), JavaScript, and KML to produce an online Statewide Planning Map.
  • Visualizing Recovery Act Funding: Lessons Learned from Development to Deployment by Jeremiah Akin of the Texas State Comptroller's Office demonstrated a mobile app that shows TARP funds in Texas, suitable for a number of mobile clients such as the iPhone and iPad.
  • Microsoft demoed the Bing Interactive SDK, where you can change code and see the change in the browser.
  • Introduction to SQL Server Spatial and Capabilities by RanJan Muttiah of iHydro Engineering was a great overview of SQL Server Spatial. I was struck by the adherence to OGC standards and how all the SQL shown would run in PostGIS unmodified. Also, major kudos to Microsoft for adhering to EPSG codes instead of publishing their own version of WKT (Well Known Text), unlike other vendors (cough, Oracle; cough, that other GIS vendor).
  • Bing Maps and SQL Server - Adding Data Awareness to GEMSS by Richard Wade and Chris Williams of TNRIS demoed GEMSS (Geographic Emergency Management Support System), a home-grown SDI for Texas built by TNRIS (Texas Natural Resource Information System). It currently acts as a searchable data archive, but TNRIS is adding uploading of user data. Wade said that this is the future direction for data dissemination by TNRIS.

It was a great conference and I hope to be back next year.


Jared Cohen at the World Affairs Council, October 4th, Houston, TX

Audio of Jared Cohen's talk at the World Affairs Council on October 4th, 2010 in Houston, Texas.

Jared Cohen is the Director of Google Ideas, a think/do tank.

* Recorded using Evernote, which only records 20 minute segments. Audio was spliced at ~18 minutes.

Wednesday, October 6, 2010

Obligatory xkcd Map of Online Communities in OpenLayers

@godwinsgo egged me on, so here it is:

Thanks to Bernie Connors for alerting me to problems with the iFrame in Windows.

Tuesday, October 5, 2010

Hey Google, events have a time and a location

Usually I whine, then post a solution; this one is all whine. Consider yourself warned. I use Google Calendar in my personal life because it's convenient and at work because we've bought into Google Apps. I travel frequently, which means I switch time zones. Life and work go on wherever I might be at the time.

The problem is that when I add events in Central Standard Time while I'm at work in Eastern Standard Time, Google Calendar assumes that the event is in EST. When I'm at home, Google notes that I've changed time zones and moves the event. This works if I do this only once, but since I make appointments months in advance and I jump across multiple time zones before the appointment, the time for the event shifts constantly.  Apparently, I'm not the only one with this problem.

The press have picked up on this issue. In a PCWorld article:
I talked to GCal product manager Grace Kwak, who acknowledged users' need to be able to set time zones for individual events, and said that her team will soon add that capability to the calendar. “We are working on it right now, and it’s something we think is a great feature addition.”
However, I found Kwak's response as to why Google Calendar doesn't handle time zones less than satisfying:
“Time zones are very complicated,” Kwak told me. “Google has one way of depicting time: We use a universal clock, so there’s like a universal time, and all the lines on the globe are in relation to that universal time.”
Really?  I wonder if she had to Google that?

I wonder how Microsoft Outlook, Palm Calendar, and all those other calendar apps figured out this tricky time zone stuff years ago.

Sunday, September 26, 2010

Overcoming tl;dr on the iPad

tl;dr is defined by Encyclopedia Dramatica as too long; didn't read; if the meaning isn't obvious, just read the article. Shockingly, there are a number of books and articles that I have either lied about reading or given a vague nod to indicate that I have read them repeatedly and written copious notes in the gloss, while the reality is that I tossed the book out of a moving car.
I find many things on the web and in my twitter stream that I would like to read, but I just don't have time at the moment, or I find that the writing style detracts so much from the content that I need more time to focus on the material. Bookmarking has never worked for me because I lack the discipline that it takes to maintain a tidy and organized bookmarking schema. Furthermore, bookmarks are merely pointers to the article and not the text itself (which is why I think geospatial data catalogs based on metadata are stupid, but I digress). 

Enter Instapaper, which describes itself as "a simple tool to save web pages for reading later." Instapaper works in the browser, but it also works on the iPad and the Kindle, which means that I always have access to the articles (if I remember to sync Instapaper on the iPad). This ensures that I have plenty of reading material even at 30,000 feet. Instapaper is easy to use and you can install a scriptlet called the Read Later bookmarklet on your browser that saves the web page. Installing the bookmarklet on Safari on the iPad is a little more involved, but the process is documented here.  The other really great thing about Instapaper is that the bookmarklet function (i.e. Read Later) is integrated in many other applications such as RSS readers and twitter clients. That means you never have to leave your current stream of work to throw another article on the pile.

The one shortcoming is that Instapaper does not support pdfs, which academic journals and research white papers seem to favor. Fortunately, iBooks recently added pdf support. I add pdfs to my book archive by downloading them to the Automatically Add to iTunes directory (/Users//Music/iTunes/iTunes Media/Automatically Add to iTunes/), which, as the folder name says, adds the pdfs to iTunes. When I sync the iPad with iTunes, all the pdfs are transferred to the iPad.

RSS feeds, like bookmarking, do not work for me because they are essentially all-you-can-eat buffets of links that lack curation. I can't be bothered to sift through all of that. This is especially true of podcasts, where they encourage users to subscribe to a stream. Typically, I just want to listen to a podcast or an audio file. Huffduffer is like Instapaper, but it bookmarks audio files using a bookmarklet called Huffduff it. Huffduffer creates an RSS feed of the audio which can be added to iTunes, so you have your own curated podcast channel. 

Finally, Youtube also has lots of content where the video isn't terribly important and an mp3 works just as well. Download Helper is a Firefox extension that can download video and convert it to other formats such as mp3. Using Download Helper's preferences menu, I set the download directory to the Automatically Add to iTunes directory so it's added to iTunes and loaded when the iPad is synced.

Thanks to Joel Ludwig (@joeludwig) for the twitter exchange that prompted this post. 

Friday, September 10, 2010

What's your excuse?

Seriously, what reason do you have for not deploying a map server? Take your pick of the open source map servers or even a commercial one. The cost of deploying a map server on the Internet is $5.12/month for 100% usage on Amazon Web Services. There's a one-time charge for a 1-year ($54) or 3-year ($82) reserved instance on an EBS boot (read: you won't lose your work if you terminate the instance). If you want to host a low-bandwidth map server for testing, learning, or just because you have cool data to share via maps, a micro instance costs $5.12, as the Amazon Simple Monthly Calculator shows:

Wednesday, August 11, 2010

Your information is somebody's asset

The Electronic Frontier Foundation published an article about the disposition of assets belonging to the defunct magazine that included the personal information of its 1,000,000+ former subscribers. The parties in the case reached an agreement to destroy all personally identifiable information, as suggested by the FCC. In this instance, things ended well for the former customers. However, the law is unclear on how to handle corporate assets such as customer lists.

I recently received an email from Derek Sivers, the founder and former CEO of CDBaby. I've bought CDs and mp3s through CDBaby in the past, so my information was part of CDBaby's corporate assets. What is interesting is that Derek Sivers contacted me a couple of years after he sold CDBaby. Sivers's email was a "Hi, I wanted to tell you about my new projects." Definitely not creepy, and interesting enough for me to click through a couple of links. However, using a contact list from his former company seemed incongruous. I wasn't the only one:

@sivers What's up with this email you sent out to old CDBaby members? Did you take a list of emails/purchases/credit cards when you left?  (from Jon Ursenbach)
Derek Sivers replied:
@jonursenbach I got to keep my database of clients and customers for two years, yes. Many are good friends. 
His email also mentioned the last album I bought from CDBaby, so I'm fairly sure he has access to the complete database. Thanks to the Internet Archive Wayback Machine I can check on the privacy policy around the time I bought my last album from CDBaby.

So Derek Sivers is not another company, but when he sold the company he became an external party with access to CDBaby's customer database. Not being a lawyer, my best guess is that Sivers's access to the customer database falls within the letter of CDBaby's privacy policy. Having sold a company, I'm well aware of the details and negotiations that go on during a sale. But having access to the customer database as a condition of the sale of a company strikes me as out of the ordinary. 

I do believe that the email Sivers sent was genuinely part of a friendly hello, get-in-touch, soft marketing campaign. However, it was unexpected and makes me just that much more leery of blanket privacy policies. CDBaby continues to have a similar privacy policy:

We don’t give or sell your personal info to any other company - EVER! (Not even your email address!)
Only the musicians whose music you purchase will know who you are. If you don't even want the musician to know who you are, you can easily change your customer account settings to remain anonymous.
In contrast, Bandcamp (my current favorite place to buy music from independent musicians) has a very detailed privacy policy that specifies the conditions that determine how your personal information is shared, including the transfer of assets in the event of the sale of the company. The Bandcamp privacy policy is very clear that any information I provide becomes the property of Bandcamp. This isn't much comfort, but at least I know what I'm giving up.

Tuesday, August 10, 2010

Updating a T-Mobile myTouch to Froyo (Cyanogenmod-5)

During last week's geekgasm, I upgraded my T-Mobile myTouch to Froyo because I've lost all hope of T-Mobile providing an OTA Froyo upgrade. There are many sites that describe how to root an Android phone, and the Cyanogen site has minimalist install instructions, so these notes cover the process I muddled through. They are as much for me as for other folks.

A couple of things before starting. 
  • Back up your SD card or use another one; this process will wipe out all your data, photos, etc. 
  • All the download links here are for an old-style myTouch, not the one with the earphone jack on the top, or the myTouch Slide. The process is the same, but you will have to find images for your phone.
  • There are points during the install where it seems nothing is happening or the install has locked up. My advice is to wait; don't get impatient and reboot. The boot cycle after loading a ROM can take quite a while.
  • As a convention, green text summarizes what you are going to do and red text marks steps I skipped.

Updating the myTouch to Cyanogenmod-5
  • Downgrade your myTouch to 1.5 to install a custom recovery image that can load a custom ROM.
I. Downgrade to Cupcake

1. Download the Original SAPPIMG.nbh:
Original SAPPIMG.nbh
2. Plug the phone into your computer via USB. Select Mount by pulling down on the notification bar at the top of the phone’s screen and selecting the USB notification. You should now be able to access the sd card in your phone on your computer.
3. Now, put the .nbh file that you just downloaded on the root of the SD card (NOT in any folder, just on the sdcard itself).
4. Unplug the phone and turn it off.
5. Turn on the phone by holding the Volume Down button and the End key until the bootloader screen comes up.
6. Hit the End key to start the update. DO NOT INTERRUPT THIS PROCESS.
7. Once it is done, hit the trackball to restart the phone. You now are on the stock Cupcake firmware.
II. Flash a Custom Recovery Image
1. On your phone, go to Settings > Applications and make sure Unknown sources is checked ON.
2. On your phone, go to Settings > SD card and phone storage, and click Unmount SD card.
3. Then click Format SD card (it should automatically remount after this).
4. Plug the phone into the computer via USB, then pull down the notification bar and click on the SD card notification. Then click Mount.
5. Download this APK and the recovery image and save it to your computer:
Amon Ra’s Recovery Image
6. Once they are saved to your computer, copy them both to your SD card (do NOT put them inside any folders on the SD card, just put them on the SD card itself).
7. Unplug the phone from the computer once they are downloaded to the SD card.
8. Go to the Market and download Linda File Manager or any file manager if you do not already have one.
9. Open the file manager, go to the SD card, find the FlashRec apk file, and click it. If asked, tell it to use Package Installer to open it. It should automatically install the apk.
10. Open the FlashRec program and click on Backup Recovery Image and wait for it to finish.
11. Once done, click on the empty text box in the FlashRec program and type:
Then click on the Flash Custom RecoveryImage button and wait for it to finish.
12. Turn off the phone, then turn it on into recovery mode by holding down Home and Power (keep holding until the recovery screen comes up; it has a bunch of text on a black background). So long as that screen comes up, LEAVE IT ON THAT SCREEN; you have done it correctly and can now go on to loading a ROM.
III. Partition Your Memory Card for Hero ROMs, Swap, and Apps2SD 
1. With your phone STILL in recovery mode from the How To Root procedure, click on Partition SD Card > Partition SD (this will erase everything off of your memory card).
2. When it asks you, select 96 MB for swap, 512 MB for ext2, and fat32 for the remainder.
3. Once it is done partitioning the memory card, click on Partition SD Card > SD: ext2 to ext3 
  • At this point you can load Cyanogenmod-5 (Android 2.1)
Once you have the custom recovery image loaded, you can load a basic rooted ROM such as the Generic MyTouch ROM w/ Root – Here 
Instructions from
1. Root your device and install Clockwork Recovery (ROM Manager on the market) or Amon_RA's recovery (Dream / Magic)
2. Do a Nandroid backup!
3. Install the DangerSPL if you don't already have it
* NOTE: I skipped the steps in red because the myTouch was already wiped from the downgrade to Cupcake 1.5, so I didn't see much point in doing a backup. Using the custom recovery image, I loaded the Cyanogenmod-5 ROM directly.
  • If you want the Google apps (who doesn't?), download the Google apps image (don't unzip it)
  • copy the files to the root of the sdcard by plugging the myTouch into the USB port and selecting USB-MS Toggle in the Recovery mode menu 
4. select WIPE in the Recovery mode menu
5. Install the ROM from the zip file
6. Optionally install the Google Addon if you want Google Applications like Gmail and Market
  • Feeling more adventurous?
Try a nightly Cyanogen build, I'm running one and it's been very stable. You can also install Clockwork Recovery, which is a recovery image with a user friendly front end. Makes loading ROMs much easier.
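One step not in the instructions above that's worth adding: before flashing any ROM zip, verify its MD5 checksum against the one posted on the download page, since a corrupted download can brick the phone mid-flash. A quick sketch in Python (the filename and checksum below are placeholders, not real values):

```python
import hashlib

def file_md5(path, chunk_size=65536):
    """Stream the file in chunks so even a large ROM zip doesn't fill memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare against the MD5 posted on the download page
# (filename and checksum here are placeholders):
# if file_md5("update-cm-5.x-example.zip") != "expected_md5_here":
#     raise SystemExit("Checksum mismatch -- do not flash this file!")
```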

Monday, July 26, 2010

Why location privacy matters

I attended The Next HOPE, the 8th installment of the Hackers on Planet Earth conference, last weekend. I've attended "hacker" conferences over the past decade and have rarely seen location or geo content, with the exception of the war-driving contests in the early '00s. I was excited that HOPE had a number of geo-related sessions, ranging from location privacy to hacking your GPS.

Ben Jackson from Mayhemic Labs presented Locational Privacy and Wholesale Surveillance via Photo Services. Jackson sampled 2.5 million photo links posted to Twitter, Twitpic, YFrog and Sexypeek and retrieved latitude and longitude from the EXIF metadata of 65,000 photos. His message was that users are leaking location information, often without knowing it. To publicize his findings, Jackson established a site to let users know that they could be easily located. 
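The harvesting Jackson describes boils down to reading a photo's GPSLatitude/GPSLongitude EXIF tags (pulling them out of the JPEG would typically use a library such as exifread or Pillow, not shown here) and converting the degree/minute/second values they contain into decimal coordinates. A minimal sketch of that conversion; the sample coordinates are hypothetical, not from Jackson's data set:

```python
# Convert EXIF GPS values (degrees, minutes, seconds) plus a
# hemisphere reference letter into a signed decimal coordinate.
def dms_to_decimal(degrees, minutes, seconds, ref):
    decimal = degrees + minutes / 60.0 + seconds / 3600.0
    # Southern and western hemispheres are negative
    return -decimal if ref in ("S", "W") else decimal

# Example values as a tool would read them from a photo's
# GPSLatitude / GPSLongitude tags (hypothetical coordinates):
lat = dms_to_decimal(40, 44, 54.36, "N")
lon = dms_to_decimal(73, 59, 8.4, "W")
print(round(lat, 4), round(lon, 4))
```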

In a similar vein, Paul Vet presented Geotagging: Opting-in to Total Surveillance (video available). His tag line, "One geotag is anecdote, many geotags are data," summarizes his position that information about a person's locations (home, work, entertainment) and habits (timing) can be derived from mining their Twitter stream. Like Jackson, Vet showed that extracting tweets with keywords such as bed, home, and TV, along with their location data, could be used to build a profile of a person's home, place of work, and habits. Even tweets from friends, such as "Playing XBox with @username," add information. Vet used a clustering algorithm to further refine individual coordinates into probable locations.
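Vet didn't publish his algorithm, but the clustering step can be sketched with a naive greedy approach: fold each geotagged tweet into the first cluster whose centroid lies within a small radius, then rank clusters by size so the densest one suggests a probable home or workplace. This is an illustrative sketch (with hypothetical coordinates), not Vet's actual code:

```python
import math

def cluster_points(points, radius_km=0.5):
    """Greedy clustering: assign each (lat, lon) point to the first
    cluster whose centroid is within radius_km, else start a new one."""
    clusters = []  # each cluster is a list of (lat, lon) points
    for lat, lon in points:
        for cluster in clusters:
            clat = sum(p[0] for p in cluster) / len(cluster)
            clon = sum(p[1] for p in cluster) / len(cluster)
            # rough equirectangular distance in km, adequate at city scale
            dx = (lon - clon) * 111.32 * math.cos(math.radians(clat))
            dy = (lat - clat) * 111.32
            if math.hypot(dx, dy) <= radius_km:
                cluster.append((lat, lon))
                break
        else:
            clusters.append([(lat, lon)])
    # return (lat, lon, count) centroids, biggest cluster first
    centroids = [
        (sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c), len(c))
        for c in clusters
    ]
    return sorted(centroids, key=lambda c: -c[2])

# Three "home"-tagged tweets plus one outlier (hypothetical coordinates):
points = [(40.7484, -73.9857), (40.7485, -73.9856),
          (40.7483, -73.9858), (41.0, -74.5)]
print(cluster_points(points)[0])  # densest cluster: probable home
```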

I also followed the GeoLoco conference via Twitter. Panelists in the Future of Geo-Location Panel responded to predictions collected by Dr. Phil Hendrix, the moderator. Here are two predictions that impact location privacy:

2. Location-awareness will be integral to any mobile app.
The panelists mainly agreed with this statement, with the observation that not all mobile apps will need LBS.
“For me, this is obvious,” Eisnor said. “With increase in precision, we’re moving towards an ecosystem of location-aware devices.”
“We’re going to have way too many devices in 2014; we will need to know where they are,” said GigaOM’s Liz Gannes.
4. Virtually all user-generated content will be geo-tagged.
In Ron’s words, “That’s already happening today,” but some of the panelists had reservations about a totally geo-tagged world.
“We’re going to find situations where location-sharing can be very weird,” Scoble said, noting that a recent deal between Rackspace and NASA could have been discovered before it was announced if observers had been tracking both organizations’ locations.
“We’re getting to the point where journalists could know what the intelligence community does,” Liebhold said. 
Judging from the conference attendees' tweets, location privacy seemed to take a back seat to the business of monetizing location, despite the possibility that location privacy issues could make or break a company.

Another week, another hacker conference. This time Thomas Ryan will be presenting Getting in Bed with Robin Sage, which describes his exercise of creating a fictitious twenty-something woman who worked for the Naval Network Warfare Command. Robin Sage collected 300 connections on LinkedIn, 110 Facebook friends, and 141 Twitter followers, and was able to view photos with location information from Afghanistan and Iraq on Facebook and Twitter. Sage even received job offers and dinner invitations. More information about Robin Sage is available from

While the US ponders the release of (months- to years-old) information from WikiLeaks, it is worth noting that we might want to be looking at social media when it comes to releases of information that endanger operational security in the present day.