The iPhone - A Pointing Device for the Real World?
Will Google use the snapshots you take with the iPhone to automatically add 3D building detail and increase the resolution of Google Earth imagery?
- What on Earth makes you think that?
I started thinking about virtual annotation and information overlays on the real world while writing my previous post "Pulling Answers from Thin Air".
I wrote about sending geotagged images from a phone for automatic image recognition and data lookup services, a technology that the company Neven Vision has been developing. They were acquired by Google in 2006, which I find interesting.
You can use the geographic position of an image to help identify known buildings depicted in it. If you also know the focal length and sensor specifications of the camera, you can then calculate the exact position and orientation of the phone in relation to the real world.
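Just to make that concrete, here is a rough sketch of how such a pose calculation could be done with the OpenCV library, assuming you have already matched a few known building corners (with known map coordinates) to pixels in the snapshot. All of the coordinates and camera numbers below are invented for the example, and nothing here is based on what the iPhone actually exposes.

```python
import numpy as np
import cv2

# Hypothetical example: 3D corners of a recognized building (meters,
# in a local map frame) matched to where they show up in the snapshot.
object_points = np.array([
    [0.0,  0.0,  0.0],
    [12.0, 0.0,  0.0],
    [12.0, 0.0, 30.0],
    [0.0,  0.0, 30.0],
], dtype=np.float64)
image_points = np.array([
    [410.0, 920.0],
    [980.0, 905.0],
    [955.0, 180.0],
    [430.0, 160.0],
], dtype=np.float64)

# Camera intrinsics, derived from focal length and sensor specs
# (numbers invented for the example).
fx = fy = 1200.0           # focal length expressed in pixels
cx, cy = 640.0, 480.0      # principal point (image center)
K = np.array([[fx, 0, cx],
              [0, fy, cy],
              [0,  0,  1]], dtype=np.float64)

# Solve for the phone's orientation (rvec) and translation (tvec)
# relative to the building, then convert to a position in the map frame.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
R, _ = cv2.Rodrigues(rvec)
camera_position = (-R.T @ tvec).ravel()
print("Phone position in map frame (m):", camera_position)
print("Phone orientation (rotation matrix):\n", R)
```

With four or more well-matched points like this, the same resection math that photogrammetrists have used for decades gives you the camera pose; the GPS fix only needs to be good enough to pick the right candidate buildings.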
You will have the exact orientation at that instant, at least...
But doesn't the iPhone have an orientation sensor implemented using accelerometers? That means the iPhone could keep track of its position and orientation in relation to the real world even after the image was snapped, as you move around!
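What the accelerometer alone can give you, when the phone is held reasonably still, is the direction of gravity, which pins down pitch and roll but not the compass heading. A minimal sketch of that calculation (the axis convention is an assumption, not Apple's actual sensor API):

```python
import math

def pitch_roll_from_gravity(ax, ay, az):
    """Estimate pitch and roll (degrees) from a static accelerometer
    reading, using an aerospace-style axis convention (x forward,
    y right, z down). A real device's axes would need remapping."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Phone held level and still: gravity is measured straight down (+z),
# so both angles come out as zero.
print(pitch_roll_from_gravity(0.0, 0.0, 9.81))   # -> (0.0, 0.0)
```

Heading is the missing piece, which is where image recognition of the scene, or a compass, would have to come in.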
Maybe my mind is easily boggled, but this makes a lot of interesting things possible. You could envisage:
- Extreme resolution enhancement for Google Earth - Use the images snapped by iPhone users all over the world to automatically get high-resolution, up-to-date map imagery and extremely good 3D building detail.
- Information overlays on top of a live image of the real world on your iPhone.
- Accessing mobile services and data by using the phone as a pointing device for the real world.
- Orientation-tagged images. Generate virtual tours where you seamlessly zoom into an image, blending it with the virtual landscape. Build QuickTime VR panoramas of the globe. Orientation-tagged movies and webcams. Moving images directly in the 3D view of Google Earth.
- Historical overlays. Scrub a slider on the iPhone screen to change the point in time and see the scene in front of you change accordingly.
- Scrub the same time slider through the accumulated digital images that map an area, and you have instant animated history.
I just wish this kind of device had already been around for about four billion years. I would love to be able to see the same kind of animation for an area on a geological time scale.
Highly accurate GPS and orientation data are needed to make something like this work. To make it useful, I guess you would roughly need a position accurate to within a few meters and an angular error of not much more than one degree.
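To get a feel for what one degree means in practice, a quick back-of-the-envelope calculation: the sideways error on a facade grows with distance, roughly as the distance times the tangent of the pointing error.

```python
import math

# Back-of-the-envelope: how far off you point on a target at a given
# distance, for a given angular (pointing) error.
def lateral_error_m(distance_m, angle_deg):
    return distance_m * math.tan(math.radians(angle_deg))

for d in (50, 100, 500):
    print(f"{d} m away, 1 degree off -> {lateral_error_m(d, 1.0):.1f} m")
# 50 m away, 1 degree off -> 0.9 m
# 100 m away, 1 degree off -> 1.7 m
# 500 m away, 1 degree off -> 8.7 m
```

So for buildings within a city block or two, one degree keeps you on the right facade. Is that possible?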
That would depend on the accuracy of the accelerometer in the iPhone. It could be the same type as the accelerometer in the MacBook, which is pretty accurate, but I have no idea whether that is accurate enough for this kind of application. An electronic compass combined with accelerometers and image recognition of known landmarks sounds like a workable solution.
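For what it is worth, here is a rough sketch of how the compass and accelerometer part of that combination is usually fused: the gravity vector from the accelerometer is used to level the magnetometer reading, giving a tilt-compensated heading. The axis convention and numbers are assumptions for illustration, and the landmark-recognition correction is left out.

```python
import math

def tilt_compensated_heading(accel, mag):
    """Fuse a gravity vector (accelerometer) and a magnetic field vector
    (magnetometer) into a compass heading in degrees. Axis convention:
    x forward, y right, z down; a real device's axes would need remapping."""
    ax, ay, az = accel
    mx, my, mz = mag
    pitch = math.atan2(-ax, math.hypot(ay, az))
    roll = math.atan2(ay, az)
    # Rotate the magnetic vector back into the horizontal plane.
    mxh = (mx * math.cos(pitch)
           + my * math.sin(pitch) * math.sin(roll)
           + mz * math.sin(pitch) * math.cos(roll))
    myh = my * math.cos(roll) - mz * math.sin(roll)
    return math.degrees(math.atan2(-myh, mxh)) % 360.0

# Level phone with the magnetic field pointing straight ahead (north):
print(tilt_compensated_heading((0.0, 0.0, 9.81), (0.2, 0.0, 0.4)))  # -> 0.0
```

In practice, nearby steel and electronics can throw a magnetic compass off by far more than one degree, which is exactly why matching against known landmarks in the image is attractive as the final correction.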
I'll admit that this is a little speculative. It relies on the premise that the iPhone has, or will have, a GPS chip and a way to acquire accurate 3D orientation data. But is it difficult to put together such a contraption?
No. It has already been done (pdf).
What will Google have to gain from something like this? Besides getting ahead of competitors in the ongoing battle over resolution and 3D detail, the possibilities for very effective advertising cannot be ignored. I would not pass on something like this. Would you, Google?