Google’s Project Glass product lead Steve Lee walks us through his experience with the development of the company’s sci-fi-inspired eyewear, from his team’s “hundreds of variations and dozens of early prototypes” to his vision of the future.
I was totally into this stuff in the ’90s, reading Wired and Mondo 2000 whenever I could find them, and devouring Gibson and Sterling like candy. I’m a product of the cyberpunk age.
I’ve been talking for a while now, in my presentations and writing, about how interfaces are changing, and how the way we design for information retrieval will evolve over the next few years as they do. Here’s a prime example of the sort of thing that will be trivial to do in a very short amount of time: gestures as a UI for computing.
You know that Internet of Things I’ve been talking about? Here’s an example, from two of the hottest products to come out of Kickstarter:
Remember all those talks I gave over the last few months about a data explosion, driven by sensors getting so cheap that they will soon be ubiquitous and let us measure anything and everything?
Yeah. So that’s happening.
I know how dangerous that laser is, and I still want one.
By “wearable computing” I mean mobile computing where both computer-generated graphics and the real world are seamlessly overlaid in your view; there is no separate display that you hold in your hands. Think Terminator vision. The underlying trend as we’ve gone from desktops through laptops and notebooks to tablets is one of having computing available in more places, more of the time. The logical endpoint is computing everywhere, all the time – that is, wearable computing – and I have no doubt that 20 years from now that will be standard, probably through glasses or contacts, but for all I know through some kind of more direct neural connection. And I’m pretty confident that platform shift will happen a lot sooner than 20 years – almost certainly within 10, but quite likely in as little as 3-5 – because the key areas that need to evolve to enable wearable computing – input, processing/power/size, and output – are shaping up nicely, although there’s a lot still to be figured out.
Imagine that you have a big box of sand in which you bury a tiny model of a footstool. A few seconds later, you reach into the box and pull out a full-size footstool: The sand has assembled itself into a large-scale replica of the model.
Will your next home be a printed home?

Along with this new technology will come a number of labor-reducing and cost-saving features. The number of people needed to build a home will drop by a factor of ten, maybe more.

Over time, we may see old houses torn down with Pac-Man-like recycling machines, where the material is ground up, reformulated, and an entirely new house is printed in its place – all in less than one day.
We knew Google was working on this, but here’s the first real look at Project Glass, their wearable display/augmented reality hardware.