Disney Research has been on a serious roll with its 3D printing innovations and 3D printing patents. From high-res 3D printing processes, to replicating reflective properties onto 3D printed surfaces, to 3D printed wall-climbing robots, it seems as though Disney is looking to redefine how movie merchandise is made using 3D printing technology. But their latest study shows that they are also keen to bring 3D printing principles to other industries, for they have developed a new compiler that lets knitting machines behave like 3D printers and easily produce customized objects.
Sony, Samsung, and Alphabet want to get right in your eyeballs. All three firms have made public moves suggesting they see contact lenses as a future interaction medium. For taking pictures, streaming video, and measuring vital signs, there may be solid reasoning behind sticking a computer in your eye.
I’ve been looking for a good answer for lifelogging for a while, and have been anticipating the point at which the technology drops in price enough to make this possible for the average consumer. Google Glass is the high-end answer, and this may just be the sort of thing that emerges at the low end.
The Memoto camera is a tiny camera and GPS that you clip on and wear. It’s an entirely new kind of digital camera with no controls. Instead, it automatically takes photos as you go. The Memoto app then seamlessly and effortlessly organizes them for you.
Incredible augmented reality technology that has real-world, everyday use: using cameras and computers to make visible what can’t be seen by the naked eye. In this case, it’s welding that benefits, but there is an obvious extension into many other fields.
As computers become even more powerful, it will be possible to do similar real-time HDR image processing with little more than a smartphone. There will be a point down the road where we can manufacture cheap, lightweight glasses capable of better imaging than our own eyes can manage. This will be one of several points where true augmentation of human capabilities begins: the very definition of a cyborg. It’s not science fiction; it’s the future.
Google’s Project Glass product lead Steve Lee walks us through his experience with the development of the company’s sci-fi-inspired eyewear, from his team’s “hundreds of variations and dozens of early prototypes” to his vision of the future.
I was totally into this stuff in the ’90s, reading Wired and Mondo 2000 whenever I could find them, and devouring Gibson and Sterling like candy. I’m a product of the cyberpunk age.
Sony is going to be releasing heads-up display glasses to theaters for accessibility (read: subtitle overlays for deaf and hard-of-hearing moviegoers).
By “wearable computing” I mean mobile computing where both computer-generated graphics and the real world are seamlessly overlaid in your view; there is no separate display that you hold in your hands. Think Terminator vision. The underlying trend as we’ve gone from desktops through laptops and notebooks to tablets is one of having computing available in more places, more of the time. The logical endpoint is computing everywhere, all the time — that is, wearable computing — and I have no doubt that 20 years from now that will be standard, probably through glasses or contacts, but for all I know through some kind of more direct neural connection. And I’m pretty confident that platform shift will happen a lot sooner than 20 years — almost certainly within 10, but quite likely in as little as 3–5 — because the key areas that need to evolve to enable wearable computing — input, processing/power/size, and output — are shaping up nicely, although there’s a lot still to be figured out.