Incredible augmented reality technology with real-world, everyday use: using cameras and computers to make visible what can't be seen by the naked eye. In this case it's welding that benefits, but there is an obvious extension into many other fields.
As computers become even more powerful, it will be possible to do similar real-time HDR image processing with little more than a smartphone. There will be a point down the road where we can manufacture cheap, lightweight glasses capable of providing better imaging than our own eyes can manage. That will be one of several points where true augmentation of human capabilities begins — the very definition of a cyborg. It's not science fiction; it's the future.
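To give a sense of how simple the core of HDR blending can be, here's a minimal sketch of exposure-fusion-style merging in NumPy. The function name `fuse_exposures` and the Gaussian "well-exposedness" weighting are illustrative assumptions, not any particular product's pipeline; real-time systems do far more (alignment, multi-scale blending, tone mapping).

```python
import numpy as np

def fuse_exposures(stack, sigma=0.2):
    """Naive exposure fusion: weight each pixel by how well-exposed
    it is (closeness to mid-gray), then blend the bracketed stack.
    `stack` is a list of float images with values in [0, 1]."""
    stack = np.stack(stack)                              # (N, H, W)
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True)        # normalize per pixel
    return (weights * stack).sum(axis=0)                 # weighted blend

# Synthetic bracket: an under-, mid-, and over-exposed gradient.
ramp = np.linspace(0, 1, 256).reshape(1, -1)
bracket = [np.clip(ramp * gain, 0, 1) for gain in (0.5, 1.0, 2.0)]
fused = fuse_exposures(bracket)
```

Even this toy version recovers detail in regions where one exposure is blown out and another isn't — which is exactly what a welding helmet needs when the arc strikes.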
“Morphologically, we’ve built a jellyfish. Functionally, we’ve built a jellyfish. Genetically, this thing is a rat,” says Kit Parker, a biophysicist at Harvard University in Cambridge, Massachusetts, who led the work.
I’ve said it before, but the rise of the cheap sensor, combined with ubiquitous connectivity, is going to do more to change the way we interact with our world than you can imagine.
The coolest thing at Google I/O this year isn’t a cheap tablet or a pair of overpriced glasses or even a killer keyboard. It is, believe it or not, an alarm clock. But not just any alarm clock — this is an alarm clock with potential. What you see above, and demonstrated in the video after the break, is the gadget that was handed out to attendees who went to learn about the Android Accessory Development Kit.
Google’s Project Glass product lead Steve Lee walks us through his experience with the development of the company’s sci-fi-inspired eyewear, from his team’s “hundreds of variations and dozens of early prototypes” to his vision of the future.
I’ve been talking for a while now in my presentations and writing about how interfaces are changing, and how the way we design for information retrieval is going to evolve over the next few years as they do. Here’s a prime example of the sort of thing that will soon be trivial to do: gestures as a UI for computing.
Remember all those talks I gave over the last few months about a data explosion driven by sensors getting so cheap that they’ll soon be ubiquitous, letting us measure anything and everything?