Friday, September 25, 2009

Flying with a cello, part 2

On my return flight, I arrived at the airport three hours early expecting more ticketing fun and games.  Sure enough, even though everything (according to the Southwest ticket agents) was done correctly, the ticket agents ended up calling Southwest's Dallas headquarters to get a special dispensation to release my tickets.  Everyone at Southwest was very helpful; at one point a ticket agent was holding a phone to each ear at the same time.  The whole process of issuing a boarding pass took about 30 minutes.  Southwest has an open seating policy, so I was told to talk to the gate agents when boarding.


Going through security was uneventful.  The case fit through the x-ray machine with a little room to spare.  I was not asked to remove the end pin.  Walking through the airport was kind of fun; people smiled and parents would tell their children to look at the cello.  I might as well have been leading a llama through the concourse, because as everyone knows people love llamas.


At the gate, I asked the boarding agent about when to board and what Southwest's regulations were concerning the transport of cellos.  She didn't know if there were any regulations, but she let me board with the families with small children so I could find a pair of seats more easily.  Once on board, the stewards and stewardesses knew that I was flying with a cello and led me to the last row of seats.  I also asked for a seat belt extender to secure the cello case.  The space between the seats was not large enough to fit the cello case, so I had to pull up the seat cushion.


I've always been hesitant about pulling up seat cushions on a plane ever since my cell phone slipped into the crack between the seats on a flight soon after 9/11.  When I pulled up the seat cushion that time, I found a number of bullets in the seat, presumably left by an air marshal.  Long story short, the flight was delayed and I received the evil eye from the other passengers.



Happily, this time there were only stale peanuts, and I secured the cello by sliding the seat belt through the handle.  The flight was uneventful and the cello arrived safe and sound.


I would say that the important things to know about transporting a cello on a plane are:

  1. Give yourself lots of time to deal with ticketing problems and to avoid the stress of trying to make a flight when there are long security lines.
  2. Talk to the ticket agents and gate agents; they will help expedite the process.
  3. Take a direct flight if you can.
  4. If you have to take a connecting flight, give yourself plenty of time between connections.
  5. Smile a lot; it really does help make things go smoothly.


Saturday, September 19, 2009

Flying with a cello, part 1

I moved to Texas in June but I left my cello in Maryland because I thought I would be spending more time in Maryland.  However, it didn't work out that way, so I'm flying it back with me.  I researched a couple of options, one of which was to purchase a flight case for my Bobelock 2000 cello case with the intent of checking the cello in as luggage.  Apparently there are travel cases for Bobelock cases, but they are not the same as flight cases.  The only flight cases I could find were for Bam cases, and they cost around $700.  Since I wasn't sure that my Bobelock case would fit, and since the return flight was less than $100, I decided to buy my cello a seat.


I usually fly Southwest between San Antonio and Baltimore (BWI) because they have a direct flight and they're convenient.  I first booked my round trip ticket online and then called reservations to book a seat for the cello.  The agent was very helpful; she reserved a seat for the cello and tied that ticket to my round trip ticket.  When buying a seat for an inanimate object, they issue the ticket in your name with an IXS suffix (Inanimate Xtra Seat, I think).  It took more than one try and they issued a couple of confirmation emails, but it all seemed to work - until I tried to check in.


When I tried to check in at the kiosk, I was told to see a gate agent.  Apparently, Southwest's reservation system was unhappy that my return trip included a second seat under my name and it wouldn't let me check in.  Southwest's solution was to break my reservation into individual legs, i.e. three one-way tickets.  They issued new confirmations for the return flight, but the cello's ticket lacks the IXS suffix, so we shall see.

Monday, September 7, 2009

Augmented Reality and Android: This is not the droid I'm looking for

Brother Cavil: In all your travels, have you ever seen a star go supernova?
Ellen Tigh: No.
Brother Cavil: No? Well, I have. I saw a star explode and send out the building blocks of the Universe. Other stars, other planets and eventually other life. A supernova! Creation itself! I was there. I wanted to see it and be part of the moment. And you know how I perceived one of the most glorious events in the universe? With these ridiculous gelatinous orbs in my skull! With eyes designed to perceive only a tiny fraction of the EM spectrum. With ears designed only to hear vibrations in the air.
Battlestar Galactica [No Exit]
My Blackberry died of a broken USB connector a few weeks back and I replaced it with a myTouch, or as it is more geekily known, the second Android phone (this distinction in branding is important and I'll get back to it later). As with any new toy/technology, I can't resist loading the latest geek ware in an effort to break it. August was the Augmented Reality (AR) news month, with Bruce Sterling playing the beaming father to the fledgling industry; so what better way to try out new hardware than loading up a bunch of apps that make full use of the platform?
Android-based phones seem to be the platform du jour for AR because they give hardware access to the GPS, accelerometer, compass, and 3G (most of the time). Augmented reality apps for the iPhone have recently been released with the advent of the iPhone 3GS, and since the iPhone app market is estimated to be worth 2.4 billion USD compared to Android's minuscule app market, expect AR apps to migrate to the iPhone real soon now. Returning to the branding issue, Joe Lamantia raises the geek-to-chic issue that will impact the future development of AR platforms.
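As a rough illustration of what that hardware access looks like, here is a minimal sketch, assuming the Android 1.x/2.x Java APIs of that era, of an activity registering for compass readings through SensorManager. The class name and the way the azimuth value is stored are purely illustrative, not taken from any of the apps discussed here.

    // Minimal sketch of reading the compass on a 2009-era Android phone.
    import android.app.Activity;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.os.Bundle;

    public class SensorPeek extends Activity implements SensorEventListener {
        private SensorManager sensors;
        private float azimuth; // degrees clockwise from north, used to aim the overlay

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            sensors = (SensorManager) getSystemService(SENSOR_SERVICE);
            // The orientation sensor was the 2009-era way to read the compass heading.
            Sensor orientation = sensors.getDefaultSensor(Sensor.TYPE_ORIENTATION);
            sensors.registerListener(this, orientation, SensorManager.SENSOR_DELAY_UI);
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            azimuth = event.values[0]; // values[0] is the azimuth for TYPE_ORIENTATION
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { /* ignore */ }
    }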
I loaded Layar, Wikitude, and Gamaray and used them for a couple of weeks across different environments ranging from the suburbs and urban areas in Texas, Maryland, and New York City, inside and outside buildings, and while stationary and moving (car, train, plane). Comparisons are not terribly useful since they are very similar in function. In general, Layar currently has the most data layers but searching seems limited; for example, a search for "sushi" returned no results, while "food" returned results that weren't always relevant. Wikitude imprints the closest street address if you take a photo, but I don't know if it writes the location data to the EXIF. Gamaray allows you to create content (including the insertion of 3D objects) without a developer key, à la KML.
There are annoyances related to hardware. Getting a position fix on the myTouch's GPS can take over 2 minutes, and sometimes a reboot is necessary to kick the GPS out of its locational funk. Apparently, the iPhone also suffers from this problem. Layar and Wikitude handle this by using the last known position, while Gamaray just refuses to start until it has a current position fix. AR apps are frequently rendered useless inside a building or in a moving vehicle. New York City's buildings create an urban canyon effect that hinders reception of GPS signals. This is unfortunate because AR apps seem to be targeted at the built environment.
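The last-known-position trick is easy to picture; below is a sketch of how an app could do it with the standard Android LocationManager calls. The provider choices and update intervals are my own assumptions, not how Layar or Wikitude actually behave.

    // Sketch of falling back to the last known fix while waiting for fresh GPS updates.
    import android.content.Context;
    import android.location.Location;
    import android.location.LocationListener;
    import android.location.LocationManager;

    public class PositionFallback {
        public static Location bestGuess(Context ctx, LocationListener listener) {
            LocationManager lm =
                (LocationManager) ctx.getSystemService(Context.LOCATION_SERVICE);

            // Ask for fresh GPS fixes, which can take minutes to arrive...
            lm.requestLocationUpdates(LocationManager.GPS_PROVIDER, 5000L, 10f, listener);

            // ...but start drawing the overlay with whatever fix we already have.
            Location last = lm.getLastKnownLocation(LocationManager.GPS_PROVIDER);
            if (last == null) {
                last = lm.getLastKnownLocation(LocationManager.NETWORK_PROVIDER);
            }
            return last; // may still be null; the caller has to handle a cold start
        }
    }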
For me, the biggest shortcomings of the current crop of AR apps have been the user experience, the lack of interesting content, and the use of only locational sensors. I'll address them individually within the context of Joe Lamantia's article detailing four common AR user interaction patterns:
  • Heads Up Display - "information about the real objects are added to a fixed point of view, typically the focus of the user’s visual field."
  • Tricorder - "add pieces of information to an existing real-world experience, representing them directly within the combined, augmented-reality, or mixed-reality experience."
  • Holochess - "adds new and wholly virtual objects directly into the augmented experience, combining them with existing, real objects. The virtual items in Holochess interaction patterns often interact with one another—and sometimes with the real elements of the mixed-reality experience."
  • X-ray vision - "simulates seeing beneath the surface of objects, people, or places, showing their internal structure or contents. AR experiences using the X-ray Vision pattern often use a combination of projection and rendering—frequently, a schematic or abstracted rendering—of the object of interest"
UX sucks
Layar, Wikitude and Gamaray use a hybrid of the Heads Up Display and the Tricorder. My main pet peeve is the use of the Heads Up Display interaction pattern, which is represented as a radar-like scope with a wedge representing the field of view. Don't get me wrong, I spent many happy hours playing Battle Zone, but when I have to physically rotate the device (and myself) to see objects to my side or behind me, a simple 2D map is a more practical interface.
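For the curious, that wedge boils down to a bearing calculation: compute the bearing from the viewer to each point of interest and test whether it lies within half the camera's field of view of the compass azimuth. The plain-Java sketch below shows the idea; the 50-degree field of view and the sample coordinates are assumptions, not measured values for the myTouch.

    // Sketch of the radar-wedge visibility test used by HUD-style AR displays.
    public class FovWedge {
        /** Initial great-circle bearing, in degrees clockwise from north. */
        static double bearing(double lat1, double lon1, double lat2, double lon2) {
            double phi1 = Math.toRadians(lat1), phi2 = Math.toRadians(lat2);
            double dLon = Math.toRadians(lon2 - lon1);
            double y = Math.sin(dLon) * Math.cos(phi2);
            double x = Math.cos(phi1) * Math.sin(phi2)
                     - Math.sin(phi1) * Math.cos(phi2) * Math.cos(dLon);
            return (Math.toDegrees(Math.atan2(y, x)) + 360.0) % 360.0;
        }

        /** True if the POI lies inside the wedge centered on the compass azimuth. */
        static boolean inView(double azimuthDeg, double bearingDeg, double fovDeg) {
            double diff = Math.abs(azimuthDeg - bearingDeg) % 360.0;
            if (diff > 180.0) diff = 360.0 - diff;
            return diff <= fovDeg / 2.0;
        }

        public static void main(String[] args) {
            // Viewer near Herald Square facing roughly east (90 degrees), POI a block away.
            double b = bearing(40.7496, -73.9876, 40.7503, -73.9855);
            System.out.println("bearing=" + b + " visible=" + inView(90.0, b, 50.0));
        }
    }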
The Tricorder pattern is equally annoying, since it involves holding the phone up in front of you while scanning the horizon for alien life forms. Doing this made me feel like an alien in Herald Square. While I don't have a problem being an alien, it also made me feel like one of those self-important dorks who wore a Bluetooth headset constantly when they first came out.
Oh the promise!
Gamaray differs from Layar and Wikitude in that it implements the Holochess pattern by allowing users to place 3D objects in the field of view. This is where things start getting interesting, because users can interact with the application and reality instead of passively viewing info bubbles. It's not a far jump to the locative art installations described in William Gibson's book Spook Country. The Mannahatta Project reconstructs the changing ecology of Manhattan back to 1609, and an AR app could overlay the historical view on the current view. AR applications are a natural fit for these visualizations. Temporal overlays could also be used to show houses, tax parcels, and demographic data in areas hit by tornadoes or hurricanes, for use by emergency workers and insurance adjusters.
Where's the content?
Granted, AR is in its infancy, but the currently available content detracts from the idea of AR as a useful tool. A number of apps use Wikipedia as a starting point for content, but the entries in Wikipedia are scale-inappropriate and too spatially coarse to be of interest. Tobler's law and its corollaries land with a heavy thunk. Wikitude, Layar, and Gamaray all support user-created content, but there doesn't appear to be standardization around any particular content model. Crowdsourcing geospatial content worked well for Google, but Google had the advantage of a single data schema in KML as well as a single API.
AR is a natural fit for built environments, but geospatial data is rarely collected at the resolution of individual man-made structures, and data collection at this scale often means 3D. CAD data does exist for many urban areas, but it is frequently locked up in the hard drives of various planning agencies and engineering companies; it's difficult to imagine that any of that data will be readily available. An alternative is SketchUp, and apps are starting to emerge, but none on a mobile platform so far.
Not significantly advanced technology
Considering that 18 years ago I carried a GPS that was larger and heavier than my current notebook, and 7 years ago my Garmin Etrex was bigger than my circa-2000 Motorola Startac cell phone, you would think that I would be pretty happy with a device that combines GPS, compass, cell phone, camera, internet terminal, and music player into a single package. Actually, I am happy that I own this technological wonder that can begin to implement AR, but where AR apps fail to be indistinguishable from magic is that they don't use all the sensors. Amidst all the trumpeting that the compass was the killer feature on Android, features such as Bluetooth, wifi, and of course the camera seem to have been forgotten.
Makers have been busily hacking at phone cameras to make scanners, and there are accessories to turn the phone camera into a microscope. An application that uses image recognition to identify objects in the field of view, instead of relying only on proximity and bearing from the phone's location, would be infinitely more magical than the current applications. An AR app could conceivably take a picture, send it to an image recognition service, and then provide additional information based on the picture, just like the folks at astronomy.net. That would be magic indeed.
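A crude sketch of that round trip might look like the following, which just POSTs a JPEG to a recognition endpoint and prints whatever comes back. The URL is entirely hypothetical; no such public service is implied.

    // Sketch: send a snapshot to a (hypothetical) image recognition service.
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class RecognizeSnapshot {
        public static void main(String[] args) throws Exception {
            byte[] jpeg = Files.readAllBytes(Paths.get(args[0])); // photo from the phone camera

            URL url = new URL("https://example.org/recognize"); // hypothetical endpoint
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setDoOutput(true);
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "image/jpeg");

            try (OutputStream out = conn.getOutputStream()) {
                out.write(jpeg); // upload the picture
            }

            try (InputStream in = conn.getInputStream()) {
                // Imagined response: labels for recognized objects, which the AR layer
                // would render as annotations over the camera view.
                byte[] buf = new byte[4096];
                int n;
                while ((n = in.read(buf)) != -1) {
                    System.out.write(buf, 0, n);
                }
            }
        }
    }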