Be My Eyes, the accessibility app for mobile devices that puts blind and low-vision people on a live video call with a sighted guide, will help Microsoft train its AI. Be My Eyes will provide anonymized video data to improve scene understanding in Microsoft’s accessibility-focused AI models.
The data sets Be My Eyes gives Microsoft will include “unique objects, lighting and framing that realistically represents the lived experience of the blind and low vision community.” The goal is to make Microsoft’s AI more inclusive for people with vision disabilities.
The companies say all personal info has been scrubbed from the metadata. The provided data won’t be used for advertising or any purpose other than training Microsoft’s AI models.
Although this is Be My Eyes’ first such data partnership, it’s worked with Microsoft before by incorporating its Be My AI tool into Microsoft’s Disability Answer Desk. As its name suggests, Be My AI is the company’s GPT-4-powered spin on an assistance product. In that case, it helps people with vision disabilities navigate Office, Windows and Xbox.
Be My Eyes also struck a deal with Hilton earlier this month. In that case, dedicated hotel staff help blind and low-vision lodgers do things like adjust their thermostats, make coffee and raise or lower their blinds. A previous 2023 partnership between the two companies helped train the Be My AI model.
This article originally appeared on Engadget at https://www.engadget.com/ai/microsoft-recruits-accessibility-app-to-make-its-ai-more-useful-to-blind-and-low-vision-users-130006439.html?src=rss
Neuralink says the Food and Drug Administration has designated its experimental Blindsight implant as a "breakthrough device." The company is developing the technology in an attempt to restore blind people's sight.
Manufacturers who apply to the FDA's voluntary breakthrough devices program and receive the designation from the agency are granted "an opportunity to interact with FDA experts through several different program options to efficiently address topics as they arise during the premarket review phase.” The FDA also prioritizes breakthrough devices for review. Ultimately, a breakthrough device designation can accelerate development of a technology. Last year, the FDA gave the designation to 145 medical devices.
Blindsight is separate from Telepathy, its implant that enables patients with spinal cord injuries to control computers using their thoughts, allowing them to play video games and design 3D objects. Neuralink owner and founder Elon Musk said in August that the company had implanted the chip into a second human patient.
Musk claimed back in March that Blindsight "is already working in monkeys. Resolution will be low at first, like early Nintendo graphics, but ultimately may exceed normal human vision." (Federal investigators have reportedly looked into Neuralink's animal testing practices but Musk said in March that "no monkey has died or been seriously injured by a Neuralink device.")
Blindsight "will enable even those who have lost both eyes and their optic nerve to see," Musk said following the FDA's designation. "Provided the visual cortex is intact, it will even enable those who have been blind from birth to see for the first time." He added that while the resolution of Blindsight is low to begin with, "eventually it has the potential [to] be better than natural vision and enable you to see in infrared, ultraviolet or even radar wavelengths."
Those are lofty claims and Neuralink is some way off from being able to fully restore sight to someone who has lost it, if it’s ever actually able to do that. It's not the first company or research team to work on vision-restoring implants either. Meanwhile, as TechCrunch points out, it's unlikely that Blindsight or similar tech can help people who have been blind since birth, given that such people have not "developed the biological capacity for seeing through their eyes."
This article originally appeared on Engadget at https://www.engadget.com/science/neuralink-says-the-fda-designated-its-blindsight-implant-as-a-breakthrough-device-182343456.html?src=rss
The Apple Vision Pro has made my eyes work harder than ever before. It's not just because I have ultra-sharp, incredibly bright micro-OLED screens right on my face: The Vision Pro relies on eye tracking for navigating visionOS, its new "spatial computing" operating system. If you want to launch an app, visit a link or do just about anything that would typically require a mouse or a touchscreen input, your eyes have to look directly at the target.
It's magical, almost telepathic. The Vision Pro's eye tracking makes it feel like you're discovering the power of the Force, a sensation that's buoyed by the intuitive hand gestures used to interact with whatever your eyes are focused on. But by relying so much on your gaze, the Vision Pro can quickly become exhausting when you run into issues, like trying to hit a tiny button on YouTube's visually overstuffed website. The eye strain is real.
That's pretty much the Vision Pro experience in a nutshell. Wonder and frustration. A peek into the future that's limited by the hardware that exists today — even if that hardware is among the best we've ever seen.
Before you ask, no, you probably shouldn't even think about buying the Apple Vision Pro. It's purely meant for developers, wealthy Apple fanatics who won't think twice about jumping on a $3,500 curiosity, and, of course, clout-chasing influencers. But you also shouldn’t dismiss it entirely. This is just the beginning of Apple's spatial computing journey. Like the iPod and iPhone before it, the Vision Pro has the potential to fundamentally reshape the way we live with technology.
That future is likely years away, assuming Apple manages to deliver a cheaper and lighter headset. But it's fascinating to see the company set off an entirely new direction of computing, without knowing exactly where it's headed.
What is the Apple Vision Pro?
Despite Apple’s refusal to say the words virtual reality, or even the letters V and R in that order, the Vision Pro is a virtual reality headset. What sets it apart from almost every other VR headset we've seen over the past decade (save for the Meta Quest Pro and Quest 3) is that the Vision Pro delivers a video feed of the real world to its micro-OLED screens. It's a far diminished view of reality — colors are muted, dark scenes look fuzzy and objects warp as you move around — but it's the best attempt we've seen at augmented reality (AR) from a VR headset.
The Vision Pro isn't a purely AR gadget like Microsoft's HoloLens and Magic Leap's headsets, both of which display digital overlays atop transparent lenses. Those devices deliver far more realistic AR experiences, since they don't have to recreate the real world via cameras. But they can never achieve the sense of immersion you feel from a VR headset, where your entire field of view can be taken over by digital environments. (The Magic Leap 2's dimming displays are one attempt to address that.)
In an ideal world, Apple wouldn't have to do the work of capturing reality through cameras and feeding it onto a headset display. VR aficionados call this "passthrough," but it’s just a brute force method for mimicking augmented reality. It's far easier to make the world digital than to deal with complex new display technology that paints the real world with virtual objects.
In typical Apple fashion, the Vision Pro looks far more handsome than any VR headset I've seen. That mostly comes down to materials: Whereas the competition's cases are almost entirely plastic, Apple's device is built out of smooth glass, polished metal and designer fabrics. Bulbous glass protects the cameras, sensors and "EyeSight" display (which shows off virtual reconstructions of your eyes) up front. A curvy metal body, which resembles the Apple Watch and previous iPhone models, leads to a soft fabric eye cushion.
The Vision Pro ships with a flexible single-band headstrap, which looks like a long-lost Lululemon accessory, and there's an optional dual-loop band that adds a top strap for additional security. While I loved how luxurious the single-loop band felt (especially the additional cushioning behind my head), it grew uncomfortable over longer sessions as it strained to keep the 1.3-pound headset on my face. I stuck with the dual-loop band — which is conspicuously absent from Apple's press images — to spread that weight out more evenly.
And then there's the external battery, which almost single-handedly kills Apple's ultra-polished aesthetic. It looks like a typical USB battery pack (albeit with a nicer metal finish than most), and it attaches to the Vision Pro via a five-foot cable. Its very existence seems like everything Apple is against — even the Meta Quest line has a seamlessly integrated battery. While I dreaded having to juggle a cable around, I'll admit it wasn't too annoying while I was sitting down. But this isn't the sort of compromise I'd expect from a $3,500 device.
The Vision Pro's hardware
Given the amount of technology stuffed into the Vision Pro, though, it may be understandable why Apple didn't want to shove in a heavy battery. The headset features an M2 chip with an eight-core CPU, 10-core GPU and a 16-core Neural Engine for AI processing. There's also 16GB of RAM onboard, along with Apple's R1 chip for processing all of the cameras and sensors needed to capture a feed of the real world. Those include a LiDAR scanner, TrueDepth 3D camera, six world tracking cameras, two high-resolution main cameras, four internal eye-tracking cameras, a flicker sensor, ambient light sensor and four inertial measurement units to track how your head and body are moving in 3D space. Whew.
Perhaps most controversially, the Vision Pro ships with a mere 256GB of storage. That's awfully low for a $3,500 device, especially since Apple is positioning it as a full-fledged computer, which means you'll be installing plenty of apps and games. And while you'll likely be streaming video to it most of the time, such a low amount of storage doesn't leave much room for offline media (there's no SD card expansion either, typical for a mobile Apple product). If you're buying the Vision Pro, we'd recommend the $3,699 512GB model for a bit more breathing room, or you can top it out with 1TB of storage for $3,899.
Apple might have played it safe with the Vision Pro's battery and storage to make room for its display, which is one of the most glorious screens to ever befall my eyes. It's a Micro-OLED panel (a technology that's so new, I've only seen it on one other shipping product: the BigScreen Beyond) sporting 23 million pixels, or nearly three times the resolution of a single 4K screen.
After trying almost every major VR headset since the Oculus Rift DK2 prototype, and testing a wide variety of increasingly large gaming monitors, the Vision Pro's screen is a revelation. It's wonderfully sharp and crisp, making text easy to read (which wasn't always possible on early VR headsets), and packing enough pixels to scale 4K movies to theater-sized screens. The Vision Pro supports refresh rates up to 100Hz, which makes scrolling through websites feel silky smooth, and it can play movies in multiples of 24fps and 30fps for judder-free playback. (That's the slight stuttering you see on TVs during action scenes and camera pans.)
The Vision Pro also supports HDR, which produces brighter highlights in photos and videos, as well as more nuanced black levels. Dark scenes look particularly spooky, since the Micro-OLED display can achieve pure black, unlike headsets with LCD panels. When I first enabled Apple's immersive environment on the Moon, I spent a while just staring off into the darkness of space, with only the rocky lunar surface and the distant Sun keeping me company.
I'd like to say I flipped on some classical music as I contemplated the meaning of existence, but in truth I just started playing "OK Computer" for the millionth time and wallowed in nostalgic angst and my fears of where technology is taking us. The Vision Pro, for better or worse, is the ideal device for escaping the troubles of the world. (I realized after a few minutes that I was basically recreating the infamous Watchmen scene with Doctor Manhattan sitting alone on the desolate surface of Mars, eager to leave humanity behind. Perhaps I need a vacation.)
The headset's built-in Spatial Audio speakers are good enough for unplanned bouts of nostalgia, YouTube binge sessions and general computing. They're far better than typical laptop speakers, with enough depth and nuance to capture Thom Yorke's ennui. There's not much low-end though, so you'll have to stick with headphones if you want to enjoy thumping bass for music and movies. (At the moment, the Vision Pro supports Apple's AirPods and wireless Beats headphones, and I haven't had luck connecting it to any other wireless cans, like Sony's excellent WH-1000XM5.) The Spatial Audio speakers are also open, so anyone nearby will get an earful of whatever you're hearing.
Listening to music on the Vision Pro sounds like having an actual speaker in your room, and true to its name, audio also follows apps around your virtual space. Apple earned a patent for audio ray tracing, and it's clear after just a few minutes that the company has thought deeply about how sound works throughout visionOS. (We sure have come a long way from highly compressed iTunes music and tinny white earbuds.) The Vision Pro also features a six-mic array to capture your voice while shouting at Siri, dictating text and hopping on video chats.
Setting up the Vision Pro
Getting started with the Apple Vision Pro wasn't very different from my hands-on experience last year. Once pre-orders were available, I briefly scanned my face on the Apple Store app (which determines the size of your light seal cushion and headband) and selected the 512GB storage option. That's pretty much it for choices, unless you're adding accessories like the $200 travel case.
Since the Vision Pro doesn't support glasses, I also had to plug in my eye prescription to configure the $149 Zeiss lenses (which snap onto the headset magnetically). My prescription had expired, it turns out, which led to a mad scramble for a quick eye exam. Thankfully, it was easy to upload a new prescription via the Apple Store app — unlike with the Quest 3, you're not forced to visit another retailer for custom lenses.
Once my Vision Pro arrived on launch day, I tore open the package and was surprised to find it stuffed with accessories. In addition to the headset, battery and Solo Knit Band, you also get a soft cover, an additional Light Seal Cushion (which may be thicker or thinner than the one installed on your headset), and a Dual-Loop band for additional security. There's also a USB-C charger, 1.5 meter USB-C cable, and a polishing cloth (which definitely came in handy.)
The first time I put on the Vision Pro my wife looked at me with an expression somewhere between bewilderment and disgust. She's been forced to live through my VR adventures over the past decade, but this was clearly not another plastic helmet meant for a gamer cave. The Vision Pro looks sleek and stylish, but it's still undeniably dorky once you put it on, like an over-engineered attempt at Robocop cosplay.
Apple has gotten its onboarding experiences down to a science at this point. Once I had the Vision Pro on, I stared ahead to create an Optic ID, biometric authentication based on your iris, as well as a six-digit security PIN. Next up, I had to create a Persona, a creepy digital reconstruction of my face that shows up in FaceTime calls and powers the digital eyes on the front EyeSight display. Building your Persona involves taking off the headset, following its instructions to look in every direction, and then forcing a closed-mouth smile, an open-mouthed smile and a view of your eyes closed.
I probably should have warned my wife about this whole process — she practically screamed when she saw my digital eyes peering out at her. Perhaps the world isn't ready for us to see our loved ones rebuilt as soulless digital avatars.
Hello, Spatial Computing
With the busy work done, I was confronted with a common sight: A home screen. But this one was floating above a pile of toys in my family room, not confined to a screen like my iPhone or iPad. The first time I tried the Vision Pro, I was sitting in a boring Apple meeting room that was hastily constructed to show off the headset. But now I was home and I was looking at the visionOS interface hovering right where my kids play. It felt like using an iPhone for the first time. I knew computing would never be the same.
As my eyes darted around familiar app icons, like Safari, TV and Photos, they each came to life, ready for me to launch them by touching my thumb and index finger together. I had been thinking for weeks about the first thing I would do with the Vision Pro, and it ended up being a perfect inaugural moment for Apple's spatial computing vision.
I opened Photos, headed to a recent Spatial Video from a trip to Zoo Atlanta, and my eyes welled up a bit with tears. My wife and kids were sitting on a small train ride, eager to make a loop around the zoo, and I was rewatching (practically reliving) that moment in 3D. It's as if it was caught in amber. Sure, the resolution and frame rate could be better (iPhone 15 Pro Spatial Videos are only captured in 1080p at 30fps), but it's still astonishing how immersive it feels.
I was also surprised at how well the Vision Pro handles panoramic photos: Hit a button and you can make them large enough to fill your walls. While there's no true sense of depth, the mere act of seeing a high-resolution, ultra-wide picture blown up to an enormous size is enough to help you relive memories. I found myself revisiting tons of photos on the Vision Pro, simply because they look great on its Micro-OLED screen. Apple may have created the world's best nostalgia machine.
It could just be that I was enamored with seeing my photos in a new light — the Vision Pro even made the mundane act of web browsing seem exciting, since Safari windows can easily scale to towering heights. You can scroll through pages by pinching your fingers together and moving them horizontally or vertically, like the world's nerdiest orchestra conductor.
Within five minutes of testing the Vision Pro, I already felt like a spatial computing expert. That's a testament to how intuitive the entire interface is, from the finger gestures for selection and scrolling, to the eye tracking used to navigate the interface. The only major knock I have against visionOS is its virtual keyboard, which is really only suited to slow, two-fingered hunting and pecking. Alternatively, you can also use Siri to plug in text or tackle basic tasks, like launching an app or rebooting the headset.
That, by the way, is something I've ended up doing a few times a day to deal with a variety of bugs. Sometimes apps don't respond when I click on them from the home screen. Sometimes windows disappear entirely and I can't do anything with the Vision Pro, except beg Siri to help me out.
As impressive as the headset is, it's clear that it's also reaching the public without extensive testing, as if early adopters are paying $3,500 to be beta testers. That's not exactly new for Apple — the original iPhone and Macintosh were both expensive and lacked crucial features — but it makes it hard to stomach the company's aggressive marketing campaign. The more I use the Vision Pro, the more obvious it becomes that it’s a developer kit. That's a reality Apple seems unable to accept, as if everything it touches needs to be a must-have product.
When everything is working smoothly, though, it's easy to buy into Apple's dream of a spatial computing future. Painting your world with virtual windows takes just a few seconds and it never gets old. During my testing, I typically had an enormous web browser opened in front of me, a YouTube window floating above my couch on the right, the App Store hovering above where my cats sleep on the left, and a small FaceTime window floating around to field calls from family.
In my kitchen, I set up floating timers for a few dishes and a Freeform window on my fridge for jotting down notes. Everything stayed in place when I moved between rooms, though it would all disappear if I had to reset or unplug the headset. We'll have to come up with a term that's more expansive than multitasking; being inside the Vision Pro feels like megatasking. Maybe we should just call it living? (Do we call reading a newspaper with a TV on in the background multitasking?)
Immersive experiences (just don't call it VR)
In addition to showing you a view of the real world, the Vision Pro lets you rotate its Digital Crown to gradually immerse yourself in one of Apple's Environments, digital recreations of locations like Mt. Hood, Yosemite and the aforementioned lunar surface. These locations are all gorgeously rendered, and they also have adjustable sound effects to help sell the illusion of being there. While they feel like baby steps into the world of VR, they're also a sign that Apple actually understands essential elements of immersion: Depth, scale and fidelity.
You can only walk around three feet of an Environment before the Vision Pro breaks you out of it, but like its virtual windows, the immersive space persists in a specific location. If you visit the Moon in your living room, then head to the kitchen and grab a drink, you'll find yourself right back on the Moon when you return to your seat.
Apple's boldest attempt at delivering full immersion in the Vision Pro is Encounter Dinosaurs, the same demo I previewed last year (and also the one that caused Engadget's Cherlynn Low to freak out when a butterfly landed on her finger). It turns a wall of your home into a portal into a prehistoric world, where you'll see a few small dinosaurs running around, followed by larger dinos that break out of the portal and appear to enter your home. The dinosaurs all look incredibly sharp and believable, and they even react a bit to your hands if you move close.
While my kids couldn't see Encounter Dinosaurs in the Vision Pro themselves, I mirrored the headset's view to the Apple TV and they were amazed to see enormous beasts invading their playroom. I hope this isn't a one-off demo for Apple.
Apple's new Immersive Videos — 180-degree 3D content shot in 8K with Spatial Audio — are similarly ambitious. They're all about placing you in a specific location with incredible fidelity and life-like depth, from watching a highliner walking atop a thin cable 3,000 feet above a Norwegian cliffside in Adventure, to a fly-on-the-wall jam session in Alicia Keys: Rehearsal Room.
I've seen plenty of VR video in my time, and Apple's format by far delivers the greatest sense of "presence," the idea that you're physically transported to a virtual scene. I also didn't miss having full 360-degree video, a format that allows for a great amount of viewer freedom, but also makes it difficult to focus on key moments.
Adventure opens with an ultra high-resolution close-up of free solo highliner Faith Dickey, something that would have been less effective in 360-degree video. You can see every pore on her face, the brilliant color of her eyes and every strand of hair as if she was standing right in front of you. It's a jarring shot, but an effective one at conveying what's possible with Immersive Video.
It also helps to ground the scope of her highlining feats, which look absolutely stunning. It's thrilling enough to see Dickey walk across a thin rope over an impossibly high cliffside, but it also feels more meaningful because she was just looking right at you, almost within reach.
A personal cinema (and the return of 3D video)
Like many VR headsets, the Vision Pro excels at being a cinema for one. But it's more compelling than the likes of the Quest 3 and Vive by giving you a ton of flexibility around how you can enjoy shows and movies. Any video can be expanded to a cinema-sized screen, and no matter how large you scale it, everything looks sharp and clear. (Remember, you've got more pixels than a 4K TV jammed right up against each of your eyeballs.)
The Vision Pro ended up being a wonderful way to revisit some of my favorite recent films, like Dune and Mad Max: Fury Road. Most surprising of all? You can actually watch Avatar: The Way of Water the way James Cameron intended: In 4K 3D with high refresh rates and immersive spatial sound. Now that TVs and projectors aren't regularly offering 3D, I figured I would never be able to see that film in its full glory again. But the Vision Pro looked even better than in the theater, since I didn't have any clunky 3D glasses darkening the screen.
While you can watch videos floating in the real world, you can also view them in virtual theaters via the Apple TV app, or in the Avengers headquarters on Disney+. That's an easy way to replicate the titanic scale of theater screens, and you also have the ability to choose your seating location. (I usually opt for the front row, though I also love the perspective from the front balcony view.)
Among the early Vision Pro entertainment apps, which includes Max, Prime Video, Crunchyroll and major sports leagues, I was most surprised to see an app from IMAX. It's hard to capture the feeling of being in front of a giant IMAX screen at home, even if you're sitting in front of a projector screen. But the Vision Pro managed to replicate the experience of watching A Beautiful Planet in a full-sized IMAX theater. The sense of depth and scale was so convincing, at times I felt like I could fall into the screen. It's thrilling and overpowering, the way IMAX was meant to be.
As great as it was watching movies on the Vision Pro, though, the headset's speakers can't compete with a decent home theater setup or a Dolby Atmos soundbar. I found myself throwing on AirPods Pro just to get a decent bit of bass in Mad Max: Fury Road and Dune. (Watching those films on the Vision Pro was also the first time I wished I had a pair of AirPods Max around for more dynamic sound. This was clearly Apple's plan.)
A Mac superpower
While it's a capable computer in its own right, the Vision Pro's most compelling use case for me is its ability to take a modern Mac, even a 13-inch MacBook Air, and transform its screen into an enormous virtual window. All it takes is a glance at your Mac's monitor, a tap of the "Connect" button and boom, you've got a Mac in spatial computing.
Many aspects of the Vision Pro feel magical, for lack of a better word. But I'm genuinely dumbfounded by how well the Mac integration works. Connectivity is seamless, your Mac screen looks sharp, and there's very little latency when it comes to typing or mousing around. And to make the experience even more compelling, your keyboard and mouse/trackpad also work on native Vision Pro apps. (It just works, seriously.)
Again, while you can technically use virtual desktops on other VR headsets, they don't look nearly as sharp, latency can be messy depending on your network, and you're typically trapped within the confines of virtual space. On the Vision Pro, I could be working on a 100-inch virtual Mac window in my kitchen, while also keeping an eye on my kids. It's empowering and effortless, the way all great technology should be.
Sure, I'd love to see more native apps on the Vision Pro, as well as a deeper exploration of immersive content. But if Apple sold a $1,000 headset that virtualized your Mac's screen this well, I'd imagine creative professionals and power users would be all over it. The Vision Pro can both enhance your existing workflow and give you super-powered multitasking capabilities when you're away from your main workspace.
The film director Jon M. Chu recently revealed he was able to edit his upcoming Wicked film in real time on an enormous screen using the Vision Pro, after being stuck at home due to flooding in LA. For many professionals, the headset could unlock entirely new ways to work.
The Persona problem
There's no way to spin it: Apple's digital Personas look awful. Cold, dead, inhuman — take your pick. Apple is clearly aware of this though, as the Personas already look dramatically better in the visionOS 1.1 developer beta, with more detail and better support for people with facial hair. My initial Persona looked tragically sad, with a misshapen skull and messy hair. The visionOS 1.1 version looks noticeably better, mostly because my skull no longer looks like a wax figure melting in the sun. I think it may still be aging me up a bit, but maybe that’s just wishful thinking.
Thankfully, you don't have to view your Persona too often, but it'll pop up for anyone you FaceTime or call over Zoom or Microsoft Teams. During a few FaceTime chats with non-Vision Pro users (including Engadget Senior Editor Pranav Dixit), their initial reaction was usually laughter and confusion. But the longer we chatted, the more normal it seemed (though nobody ever forgot they were talking to my digital doppelganger). It's less awkward when talking to another Vision Pro user; at that point you're just sharing the shame.
For all of the Persona's faults, though, I'm still amazed that Apple is able to create a semi-accurate avatar in just a few seconds. But the company clearly has plenty of work ahead before it escapes the uncanny valley.
A weak start on gaming
Another sign that the Vision Pro isn't fully cooked is its anemic gaming lineup at launch. What the Golf? and Super Fruit Ninja are both available on Apple Arcade, but they're basically the same games we've been playing on other devices, except now they've got a bit of spatial flair. Synth Riders is the closest thing the Vision Pro has to Beat Saber, but it's not nearly as dynamic and I genuinely hate most of the music.
There are a few standalone titles like Black Box ($20), which deploys a variety of puzzles around your space, and Wisp ($20), a game that seems to involve a virtual desk ornament and an AI companion. But even those aren't as immersive or exciting as the VR titles we've been playing for a decade (like Superhot and Space Pirate Trainer). Partially, it's due to the Vision Pro platform being young, but Apple has also limited developers by not releasing a VR gaming controller.
I didn't miss having to find and charge a pair of joysticks just to use the Vision Pro, but I did miss the precision and feedback of the Quest 3's controllers when I was actually ready to play some games. Apple could eventually allow third-party VR controllers to work on the Vision Pro, but that also passes on the responsibility of hardware development to other companies. If Apple really wants developers to take games seriously, it'll need to step up and design its own VR controller solution.
You can, at least, pair a PlayStation or Xbox controller to the Vision Pro for supported games, like Sonic Dream Team and TMNT Splintered Fate. Those controllers also come in handy for streaming titles from the cloud and gaming PCs. The iPad Steam Link app works just fine on the Vision Pro, and you can also easily access Xbox cloud streaming and NVIDIA's GeForce Now via the Nexus app.
Given Microsoft's cloud streaming deal with Meta and Apple's new rules for game streaming apps, I wouldn't be surprised if we see an official Vision Pro solution from Microsoft eventually. (You can also use the official Xbox app to stream games from an Xbox console in your home.)
It's easy to sound overly positive about the Vision Pro while discussing the sheer beauty of its screens, and the wonder of seeing Apple step into a bold new era of computing. So here's some cold water:
At 1.3 pounds, it was too heavy to wear for more than an hour at a time, even with the Dual Loop band. (I recall that Apple also had a version of the Solo Knit band with a makeshift top strap during my original demo. It'd be interesting to see that make a comeback, Apple!)
Even during short sessions, it requires a significant amount of eye movement, and you're looking directly at very bright screens. This isn't something you'd want to use as you're trying to wind down before bed.
The front EyeSight display isn't nearly bright enough, and it's certainly far from what Apple's promo pictures show. You really have to lean in close to see it, which defeats the purpose of having it.
Apple's hand tracking recognition is a bit too sensitive. At times when I petted my nearby cat, the Vision Pro would freak out, thinking I was trying to click things around the screen.
It would be nice to see more battery life. The Vision Pro currently gets between two and two and a half hours on a charge, but somehow I've been cursed to see the low battery warning almost every time I get into a good creative flow.
While Apple has announced that there are 600 apps available for the Vision Pro, they're not all worthwhile (and some, like Lowe's Style Studio, are surprisingly buggy).
The Vision Pro isn't great when you’re moving a lot. All those videos you see of people walking down the street while wearing it are mostly made for social media clout. Its cameras are good, but your eyes are far better.
Its cameras can't capture fine detail very well. Cooking with the Vision Pro is a non-starter, since it's tough to see how much seasoning you're applying, and the battery cord could easily get wet, cut or burned.
Its $3,500 cost is tough to swallow, full stop. (We've already established this thing isn't meant for regular users, but it's wild that it costs about the same as a MacBook Pro, iPhone Pro and iPad Pro combined.)
A fascinating start, and an uncertain future
This may be the longest review I've ever written, because I really can't stop thinking about the Vision Pro. What does it mean for computing? What does it mean for Apple? Are we even ready for people to be wearing these things in public? And do I need to reassess it once Apple squashes bugs, adds new features and courts more developers?
The Vision Pro is a flawed product, but it's certainly not empty. It's as if Apple has compiled everything it's learned from building the Mac, iPhone, Apple Watch and AirPods into a single device, all in a bid to avoid the Innovator's Dilemma. It would be easy for the company to coast by slowly iterating its current products, making minor tweaks to appease investors and slight hardware hops to excite an already devoted fanbase. True vision takes risk, and I can't help but admire that.
This article originally appeared on Engadget at https://www.engadget.com/apple-vision-pro-review-beta-testing-the-future-201430272.html?src=rss
Those who've been yearning for a chance to try the Apple Vision Pro headset and have the cash to spare won't need to wait much longer to snap one up. The company says the hotly anticipated device will arrive in the US on February 2. Pre-orders for the mixed reality headset, which starts at $3,499 for 256GB of storage, will open on January 19. The device will be available at all US Apple Store locations as well as through the company's web store.
Those who require vision correction will need to snap up Zeiss optical inserts and attach them to the headset magnetically (Vision Pro doesn't work with glasses). Readers will cost $99, while prescription lenses will set you back $149. The inserts will only be available for purchase online, so don't expect to be able to wander into an Apple Store to pick them up. Naturally, you'll need a prescription for the prescription lenses. However, Apple says that "not all prescriptions are supported."
The era of spatial computing has arrived! Apple Vision Pro is available in the US on February 2. pic.twitter.com/5BK1jyEnZN
This is Apple's first new major product line since it introduced the Apple Watch back in 2014. Apple revealed the Vision Pro release date just as CES 2024 is kicking off, likely to steal some thunder away from the show's exhibitors without needing to actually show up in Las Vegas itself.
The Vision Pro, which Apple announced at WWDC last year, marks the company's initial foray into spatial computing. You'll primarily control it with your hands, eyes and voice, though you can pair a Magic Keyboard and Trackpad for productivity needs or a controller when it's time to kick back and play games.
Apple says a brand new App Store will support more than a million apps from the iOS and iPadOS ecosystems. Of course, there will be apps that are unique to the headset's visionOS. You'll interact with apps by just looking at them, tapping your fingers (à la Apple Watch's new Double Tap feature), flicking your wrist to scroll and using dictation or a virtual keyboard for typing. Siri will enable you to control media playback, open and close apps and much more, Apple says.
Users can place apps anywhere in a 3D virtual environment, which could be a boon for multitasking. You'll be able to access your Mac through your Vision Pro as well, so you'll have access to a giant 4K canvas for your desktop or laptop to help you get things done.
On the entertainment front, you'll be able to stream shows and movies from the likes of Apple TV+, Disney+ and Max on a virtual screen that appears to be 100 feet wide. There's HDR support and, through the Apple TV app, you'll be able to check out more than 150 titles in 3D. Vision Pro also supports Apple's new Immersive Video format, through which you can check out 180-degree, 3D experiences in high resolution.
As for games, Vision Pro will support more than 250 Apple Arcade titles as well as others from the App Store. Players will be able to check out "spatial games," such as Game Room, What the Golf? and Super Fruit Ninja. In those cases, Apple says the headset will transform the space around you, likely leading to more immersive gaming experiences. It's possible that you'll be able to use PlayStation and Xbox remote play features using Vision Pro too.
Speaking of immersion, you'll be able to virtually relocate to more peaceful environments, such as a national park or the surface of the Moon, if you don't feel like looking at your office or home in mixed reality. By turning the Digital Crown, you can adjust the level of immersion in these environments.
The iPhone 15 Pro or iPhone 15 Pro Max can now capture spatial photos and videos, and you'll be able to view those in "life-size scale" through Vision Pro. Panoramas, for instance, will wrap around you.
FaceTime and other types of calls are getting an intriguing upgrade through Vision Pro. Headset users will appear as a Persona, a virtual representation of them that shows their hand movements and facial expressions (Personas are also supported on the likes of Zoom, Webex and Microsoft Teams). Those taking part in a call on a Mac, iPad or iPhone will appear in a tile, while spatial audio will make it seem as though each person's voice comes from the location of their tile in the space.
Oftentimes, wearers of virtual reality or mixed reality headsets seem disconnected from others in the same physical space as they can't make eye contact with those around them. To mitigate that, Apple has developed technology called EyeSight. This makes it appear as though the Vision Pro is transparent, allowing others to see a wearer's eyes.
Elsewhere, Apple has developed a new authentication system called Optic ID to unlock the device, as well as for password autofill and Apple Pay payment approval. The company says that eye-tracking information remains private — neither Apple nor the makers of third-party apps or websites can access that data. It also notes that Vision Pro has a number of accessibility-minded features, such as the ability to enable eye tracking for one dominant eye (which may be helpful for those who have severe vision loss in one eye or a misalignment).
Given the price of Apple's headset, it's highly unlikely that it will see wide adoption, at least in its first iteration. This is one for developers, early adopters and Apple enthusiasts. It may be the case that Apple eventually becomes the company to make mixed reality mainstream. In the meantime, at least we now know when eager beavers will be able to buy a Vision Pro if they have a spare few thousand dollars burning a hole in their pockets.
This article originally appeared on Engadget at https://www.engadget.com/the-apple-vision-pro-goes-on-sale-in-the-us-on-february-2-for-3499-142006153.html?src=rss
At least 15 visitors at Yuga Labs’ ApeFest, a celebration of the marvels of Bored Ape Yacht Club NFTs, may have experienced serious eye injuries. Bloomberg reports that multiple people attending the NFT event in Hong Kong last weekend say they experienced vision problems, which they suspect stem from the event’s stage lighting. Some of the attendees claim doctors subsequently diagnosed them with photokeratitis (aka “welder’s eye”), caused by exposure to ultraviolet rays.
“Woke up in the middle of the night after ApeFest with so much pain in my eyes that I had to go to the hospital,” the user Crypto June posted on X (via Coin Telegraph). “Doctor told me it was due to the UV from stage lights.” User @docwagmi suspected that the “ape friends” reporting problems appeared to have been “up close with us front stage.”
Meanwhile, Adrian Zduńczyk wrote on X, “To all my friends who suffer now: go get your eyes checked. You’ve likely most literally got your eyes burnt with UV like I did, which requires medications, eye drops, eye protection, antibiotics and specialist care. Don’t ignore this health hazard. Without proper treatment, it may cause long lasting vision impairment and other serious damage.” Zduńczyk wrote that seeking medical attention quickly appears to have spared him long-term damage. “My vision was tested as close to perfect with no serious cornea damage, luckily.”
Yuga Labs briefly addressed the issue on X, saying it’s “aware of the eye-related issues that affected some of the attendees of ApeFest,” while claiming it’s “proactively reaching out to individuals since yesterday to try and find the potential root causes.” The company downplayed the number of people reporting issues, adding, “Based on our estimates, we believe that much less than 1% of those attending and working the event had these symptoms.” The NFT company advised attendees experiencing symptoms to “seek medical attention just in case.”
X users seemed none too pleased with Yuga Labs’ PR response:
From the PR team:
- Guys completely downplay it, make it look like a small number, eg... less than 1%
- Make it look like you're actively helping and solving
- Try to 'Find' the 'Potential' root causes, even though we know exactly what it was
The potentially dangerous incident echoes one in 2017 when attendees of a HypeBeast party reported eye damage. The event’s DJ later reported that the lighting contractor used Philips bulbs that emit UV-C, often used as a disinfectant.
This article originally appeared on Engadget at https://www.engadget.com/bored-ape-nft-event-at-least-15-attendees-reporting-severe-eye-burn-welders-eye-173746237.html?src=rss
A patient has been fitted with a highly realistic 3D printed prosthetic eye for the first time ever, Fraunhofer Technology has announced. Patient Steve Verze received the high-tech version as a permanent replacement for his traditional prosthetic eye. "It makes me feel more and more confident," he told On Demand News. "If I can't spot the difference, I know other people won't spot the difference."
Fraunhofer worked with British company Ocupeye Ltd on a new process that's faster and far less invasive. Previously, doctors would need to make a mold of the eye socket, something that's so difficult for kids that they need to go under a general anesthetic.
Now, the team can do a non-invasive 2.4-second scan using a specially modified ophthalmic scanner that delivers a precise measurement of the eye socket. That data is combined with a color-calibrated image of the healthy eye and fed to Fraunhofer's "Cuttlefish:Eye" system, which rapidly creates a 3D print model. The software is particularly adept at producing a "realistic representation of even transparent materials," according to Fraunhofer.
The model is then printed out by a company called Fit AG which has experience in additive manufacturing for medical technology. From there, the prostheses are inspected and given a final polish and touchup by ocularists. "With a single 3D printer, Ocupeye can potentially fulfil the annual requirement of around 10,000 prostheses required for the UK market," according to the press release.
Verze's prosthetic is a precursor to a forthcoming clinical trial that will evaluate the effectiveness of 3D printed eyes vs. traditional, hand-made eyes, according to University College London. Around 40 patients will be recruited to assess the prostheses for motility (movement), cosmesis (look), fit, comfort, mucous discharge and more. "This new eye looks fantastic and, being based on 3D digital printing technology, it’s only going to be better and better," Verze said in a statement.
You might not have to visit an optometrist just to get a basic prescription update for your glasses. Warby Parker is trotting out Virtual Vision Test, a revamp of its Prescription Check app that lets you renew your glasses or contact lens prescription using only an iPhone and your existing eyewear. Prop up your phone, stand 10 feet away and you can walk through a familiar "can you read this" test that will gauge whether or not your glasses or contacts need updating.
The update isn't automatic. An eye doctor will review the results and give you a verdict within two days. If you're seeing well enough, you'll just need to pay $15 to renew your prescription. If you're either struggling or just aren't eligible to use the app (see below), Warby Parker will recommend that you get a thorough eye exam.
The company is clear that this isn't a replacement for your eye doctor, and that you'll have to meet certain criteria beyond what we just mentioned. You have to be between 18 and 65 years old, with no existing eye health concerns and a single-vision distance prescription.
The approach relies on iOS' Vision Framework to measure your distance from your iPhone and ensure that you're far enough away for a proper test. We've asked Warby Parker about an Android version, although that might take a while given that it would need a rough equivalent to Vision Framework.
The motivations behind Virtual Vision Test are clear — on top of the renewal fee, this could lead to more people buying Warby Parker glasses and visiting the company's stores. Regardless, it could be very useful if you either can't make time to update your prescription or are still skittish about a visit while COVID-19 remains a concern.
Have you ever had your life flash before your eyes when you're spending the night away from home without your contact solution? You Google how to keep your contacts safe because you need them the next morning, and you end up putting them in saltwater, praying they don't dry up or give you an eye infection. Obviously, this isn't safe and can be extremely harmful – even contact solution alone isn't enough to kill all the microbes on your lenses, and if you wear mascara, there may even be specks of it mixed in. So how do we ensure our lenses are always clean and avoid serious infections? We use Q Egg and a reminder on our phones to pack the solution!
Q Egg is a smart contact lens case that gives you triple protection against bacteria, combining your contact solution with DNA-smashing UVC light to kill infection-spreading microorganisms. The product layers three lines of defense to make the process super efficient while delivering the highest standards of cleanliness: your contact solution, UVC light and a vibrational motor. The UVC light serves as a second layer of defense, destroying the DNA of infection-spreading microorganisms, while the vibrational motor recirculates the solution to rinse off any remaining particles and natural eye secretions. Together, these kill 99.999% of the toughest and most stubborn pathogens commonly responsible for contact-related eye infections. In a pandemic where the virus can spread via contact with the eyes, it's important to keep our contact lenses as clean as possible.
The case’s sleek design lets you disinfect your lenses on the go, and it was made to fit in with the rest of your cosmetic tools. The product has controlled wavelengths and doses, so it won’t fog or change the color of your contact lenses. “Q Egg needs just 30 minutes to do what your contact lens solution alone needs 4 hours to achieve,” claims the team, explaining how the product was designed to reduce medical risks and costs. One charge cycle will keep it running for two weeks, and you won't need to keep buying new plastic cases each month for your contacts. It is compatible with all types of contact lenses.