Brother Cavil: In all your travels, have you ever seen a star go supernova?
Ellen Tigh: No.
Brother Cavil: No? Well, I have. I saw a star explode and send out the building blocks of the Universe. Other stars, other planets and eventually other life. A supernova! Creation itself! I was there. I wanted to see it and be part of the moment. And you know how I perceived one of the most glorious events in the universe? With these ridiculous gelatinous orbs in my skull! With eyes designed to perceive only a tiny fraction of the EM spectrum. With ears designed only to hear vibrations in the air.
Battlestar Galactica, "No Exit"
Android-based phones seem to be the platform du jour for AR because of hardware access to the GPS, accelerometer, compass, and (most of the time) 3G. Augmented reality apps for the iPhone have recently been released with the advent of the iPhone 3GS, and since the iPhone app market is estimated to be worth 2.4 billion USD, compared to Android's minuscule app market, expect AR apps to migrate to the iPhone real soon now. Returning to the branding issue, Joe Lamantia raises the geek-to-chic issue that will impact the future development of AR platforms.
I loaded Layar, Wikitude, and Gamaray and used them for a couple of weeks across different environments: suburbs and urban areas in Texas, Maryland, and New York City, inside and outside buildings, and while stationary and moving (car, train, plane). Comparisons are not terribly useful since they are very similar in function. In general, Layar currently has the most data layers, but searching seems limited; for example, a search for "sushi" returned no results while "food" did, though not always relevant ones. Wikitude imprints the closest street address if you take a photo, but I don't know whether it writes the location data to the EXIF. Gamaray allows you to create content (including the insertion of 3D objects) without a developer key, à la KML.
There are annoyances related to hardware. Getting a position fix on the myTouch's GPS can take over two minutes, and sometimes a reboot is necessary to kick the GPS out of its locational funk. Apparently, the iPhone also suffers from this problem. Layar and Wikitude handle this by using the last known position; Gamaray just refuses to start without a current position fix. AR apps are frequently rendered useless inside a building or in a moving vehicle. New York City's buildings create an urban canyon effect that hinders reception of GPS signals, which is unfortunate because AR apps seem to be targeted at the built environment.
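For what it's worth, the last-known-position trick is cheap to implement on Android. A minimal sketch, assuming the stock LocationManager API (the class name and update intervals below are mine, not lifted from any of these apps):

```java
import android.content.Context;
import android.location.Location;
import android.location.LocationListener;
import android.location.LocationManager;
import android.os.Bundle;

// Hypothetical helper: fall back to the last known fix (Layar/Wikitude style)
// instead of blocking until the GPS wakes up from its locational funk.
public class StaleFixFallback implements LocationListener {

    private final LocationManager locationManager;
    private Location bestGuess; // stale but usable position

    public StaleFixFallback(Context context) {
        locationManager =
                (LocationManager) context.getSystemService(Context.LOCATION_SERVICE);
        // Seed with whatever the GPS remembers from its last session.
        bestGuess = locationManager.getLastKnownLocation(LocationManager.GPS_PROVIDER);
        // Ask for fresh fixes; replace the stale guess as soon as one arrives.
        locationManager.requestLocationUpdates(
                LocationManager.GPS_PROVIDER, 5000L, 10f, this);
    }

    public Location currentBestGuess() {
        return bestGuess; // may be null if the device has never had a fix
    }

    @Override
    public void onLocationChanged(Location fresh) {
        bestGuess = fresh;
    }

    @Override public void onProviderDisabled(String provider) {}
    @Override public void onProviderEnabled(String provider) {}
    @Override public void onStatusChanged(String provider, int status, Bundle extras) {}
}
```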
For me, the biggest shortcomings of the current crop of AR apps have been the user experience, the lack of interesting content, and the use of only locational sensors. I'll address them individually within the context of Joe Lamantia's article detailing four common AR user interaction patterns:
- Heads Up Display - "information about the real objects are added to a fixed point of view, typically the focus of the user’s visual field."
- Tricorder - "add pieces of information to an existing real-world experience, representing them directly within the combined, augmented-reality, or mixed-reality experience."
- Holochess - "adds new and wholly virtual objects directly into the augmented experience, combining them with existing, real objects. The virtual items in Holochess interaction patterns often interact with one another—and sometimes with the real elements of the mixed-reality experience."
- X-ray vision - "simulates seeing beneath the surface of objects, people, or places, showing their internal structure or contents. AR experiences using the X-ray Vision pattern often use a combination of projection and rendering—frequently, a schematic or abstracted rendering—of the object of interest"
UX sucks
Layar, Wikitude, and Gamaray use a hybrid of the Heads Up Display and the Tricorder. My main pet peeve is the Heads Up Display interaction pattern, which is represented as a radar-like scope with a wedge representing the field of view. Don't get me wrong, I spent many happy hours playing Battlezone, but when I have to physically rotate the device (and myself) to see objects to my side or behind me, a simple 2D map is a more practical interface.
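To be fair, the wedge itself is simple math; the annoyance is having to swing the phone around to satisfy it. A rough sketch of the bearing-versus-azimuth test behind that radar scope (the 50-degree field of view and the coordinates are my assumptions, not values from Layar, Wikitude, or Gamaray):

```java
// Rough sketch of the "radar wedge" test: is the bearing from the user to a
// point of interest inside the camera's horizontal field of view, given the
// compass azimuth? The 50-degree FOV is an assumption.
public final class RadarWedge {

    static final double CAMERA_FOV_DEGREES = 50.0; // assumed horizontal FOV

    // Initial great-circle bearing from (lat1, lon1) to (lat2, lon2), in degrees.
    static double bearingTo(double lat1, double lon1, double lat2, double lon2) {
        double p1 = Math.toRadians(lat1), p2 = Math.toRadians(lat2);
        double dl = Math.toRadians(lon2 - lon1);
        double y = Math.sin(dl) * Math.cos(p2);
        double x = Math.cos(p1) * Math.sin(p2)
                 - Math.sin(p1) * Math.cos(p2) * Math.cos(dl);
        return (Math.toDegrees(Math.atan2(y, x)) + 360.0) % 360.0;
    }

    // True if the POI would appear on screen when the device points at `azimuth`.
    static boolean inWedge(double azimuthDegrees, double bearingDegrees) {
        double diff = Math.abs(azimuthDegrees - bearingDegrees) % 360.0;
        if (diff > 180.0) diff = 360.0 - diff;
        return diff <= CAMERA_FOV_DEGREES / 2.0;
    }

    public static void main(String[] args) {
        // Standing in Herald Square (approx.), facing due north: is the Empire
        // State Building inside the wedge? It isn't, which is exactly the
        // complaint -- you have to turn yourself and the phone to find it.
        double bearing = bearingTo(40.7497, -73.9878, 40.7484, -73.9857);
        System.out.println("bearing: " + bearing + ", visible: " + inWedge(0.0, bearing));
    }
}
```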
The Tricorder pattern is equally annoying, since it involves holding the phone up in front of you while scanning the horizon for alien life forms. Doing this made me feel like an alien in Herald Square. While I don't have a problem being an alien, it also made me feel like one of those self-important dorks who wore a Bluetooth headset constantly when they first came out.
Oh the promise!
Gamaray differs from Layar and Wikitude in that it implements the Holochess pattern by allowing users to place 3D objects in the field of view. This is where things start getting interesting, because users can interact with the application and reality instead of passively viewing info bubbles. It's not a far jump to the locative art installations described in William Gibson's book Spook Country. The Mannahatta Project reconstructs the changing ecology of Manhattan back to 1609, and an AR app could overlay that historical view on the current one; AR applications are a natural fit for these visualizations. Temporal overlays could also be used to display houses, tax parcels, and demographic data in areas hit by tornadoes or hurricanes for use by emergency workers and insurance adjusters.
Where's the content?
Granted, AR is in its infancy, but the currently available content detracts from the idea of AR as a useful tool. A number of apps use Wikipedia as a starting point for content, but Wikipedia entries are inappropriate in scale and too spatially coarse to be of interest. Tobler's law and its corollaries land with a heavy thunk. Wikitude, Layar, and Gamaray all support user-created content, but there doesn't appear to be standardization around any particular content model. Crowdsourcing geospatial content worked well for Google, but Google had the advantage of a single data schema in KML as well as a single API.
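For the sake of argument, here is the kind of minimal shared record a KML-style standard could give AR publishers. Every field name below is invented and none of the three apps actually uses this, but something this small would already cover info bubbles, Gamaray-style 3D objects, and the temporal overlays mentioned above:

```java
import java.net.URI;

// Purely illustrative content model; all field names are hypothetical.
public final class ArPlacemark {
    public final String id;
    public final String title;
    public final String description;
    public final double latitudeDegrees;
    public final double longitudeDegrees;
    public final double altitudeMeters;   // needed once content goes 3D
    public final URI modelUri;            // optional link to a 3D model, or null
    public final long validFromMillis;    // optional time span, for historical
    public final long validToMillis;      //   or "temporal overlay" layers

    public ArPlacemark(String id, String title, String description,
                       double latitudeDegrees, double longitudeDegrees,
                       double altitudeMeters, URI modelUri,
                       long validFromMillis, long validToMillis) {
        this.id = id;
        this.title = title;
        this.description = description;
        this.latitudeDegrees = latitudeDegrees;
        this.longitudeDegrees = longitudeDegrees;
        this.altitudeMeters = altitudeMeters;
        this.modelUri = modelUri;
        this.validFromMillis = validFromMillis;
        this.validToMillis = validToMillis;
    }
}
```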
AR is a natural fit for built environments, but geospatial data is rarely collected at the resolution of individual man-made structures, and data collection at this scale often means 3D. CAD data does exist for many urban areas, but it is frequently locked up on the hard drives of various planning agencies and engineering companies; it's difficult to imagine that any of that data will be readily available. An alternative is SketchUp, and apps are starting to emerge, but none on a mobile platform so far.
Not significantly advanced technology
Considering that 18 years ago I carried a GPS that was larger and heavier than my current notebook, and 7 years ago my Garmin eTrex was bigger than my circa-2000 Motorola StarTAC cell phone, you would think I would be pretty happy with a device that combines GPS, compass, cell phone, camera, internet terminal, and music player into a single package. Actually, I am happy that I own this technological wonder that can begin to implement AR, but where AR apps fail to be indistinguishable from magic is that they don't use all the sensors. Amid all the trumpeting that the compass was the killer feature on Android, features such as Bluetooth, Wi-Fi, and of course the camera seem to have been forgotten.
Makers have been busily hacking phone cameras into scanners, and there are accessories to turn the phone camera into a microscope. An application that uses image recognition to identify objects in the field of view, instead of relying on proximity and bearing from the phone's location, would be infinitely more magical than the current applications. An AR app could conceivably take a picture, send it to an image recognition service, and then provide additional information based on the picture, just like the folks at astronomy.net. That would be magic indeed.
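A minimal sketch of that round trip, assuming a hypothetical HTTP recognition service (the URL and response format are placeholders, and a real client would also send the phone's location and bearing along with the frame):

```java
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

// Sketch of the "send the picture, get back an identification" loop.
// The endpoint is hypothetical; no real public AR recognition service is implied.
public class RecognizeSnapshot {
    public static void main(String[] args) throws Exception {
        byte[] jpeg = Files.readAllBytes(Paths.get("snapshot.jpg"));

        URL service = new URL("https://example.org/recognize"); // placeholder URL
        HttpURLConnection conn = (HttpURLConnection) service.openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "image/jpeg");

        try (OutputStream out = conn.getOutputStream()) {
            out.write(jpeg); // upload the camera frame
        }

        try (InputStream in = conn.getInputStream()) {
            // Whatever the service returns -- labels, links, a Wikipedia page --
            // becomes the annotation drawn over the live camera view.
            String response = new String(in.readAllBytes(), StandardCharsets.UTF_8);
            System.out.println(response);
        }
    }
}
```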