The Apple Vision Pro is the best headset demo I have ever seen


Software Highlights from WWDC 2023: FaceTime on Apple TV, watchOS 10, macOS Sonoma, iPadOS 17, and iOS 17

Apple will soon let you use FaceTime on Apple TV. The new capability leverages the iPhone’s Continuity Camera, letting your iPhone stand in for a webcam so you can see and chat with the people you’re meeting with from your TV. Center Stage will keep you in the frame.

Widgets were a big theme at this year’s WWDC. In watchOS 10, you can turn the Digital Crown to scroll through a stack of widgets and glance at information quickly. Apple is also adding several new watch faces, a way to measure how much time you spend in daylight, new cycling features, and trailhead information for hikers.

A new Game Mode for macOS will give top priority to the CPU and GPU for games and also lower audio latency with AirPods. Death Stranding and some of Hideo Kojima’s other games will come to macOS as part of Apple’s push into gaming.

Apple announced some visual upgrades to macOS Sonoma, which will now support widgets you can add to your desktop, along with new animated screensavers that can double as your wallpaper. There are new features for Safari that let you pin web apps to your Dock and create profiles for different browsing sessions.

On the iPad, iPadOS 17 will add the Health app as well as a personalized lock screen similar to the one on the iPhone.

Apple is also introducing interactive widgets in iPadOS 17 that let you take quick actions right from the home screen. The Notes app is getting updates too: it can now detect the fields in a PDF, and it will let you work with others on a document in real time.

On the iPhone, iOS 17 will add a new safety-focused Check In feature, as well as NameDrop, which lets you share your email address or phone number with another iPhone user just by bringing your phones close together. Apple is also dropping the “Hey” from “Hey Siri,” so you can summon the assistant by saying “Siri” alone.

Apple revealed a number of other new features coming to iOS 17. Among them is StandBy, which turns your iPhone into a small display for essential information, such as the time and date, when it is placed horizontally while charging.

There’s also a new iPhone app called Journal. As its name implies, Journal will encourage you to log your thoughts about recent activities or trips. The app is secured by end-to-end encryption, and entries are stored locally on your device, according to Apple. It will be available later this year.

Source: https://www.theverge.com/2023/6/5/23749243/apple-wwdc-2023-biggest-announcements-vision-pro-macbook-air-15-inch-ios-17

Apple Silicon Upgrades for the Mac Studio and Mac Pro, a 15-inch MacBook Air, and a First Look at the Vision Pro Headset

Both the Mac Studio and Mac Pro are getting Apple Silicon-powered upgrades. The latest versions of the two desktops run on the new M2 Ultra chip, which offers up to a 24-core CPU and a 76-core GPU.

Apple also announced the 15-inch MacBook Air, which it describes as the world’s thinnest 15-inch laptop, weighing just over three pounds. The device promises 18 hours of battery life and a 500-nit display. It costs $1,299, and you can order it today, with availability starting next week.

A few days after Meta announced a new headset of its own, Apple introduced its Vision Pro mixed-reality headset during the WWDC keynote.

While the “pro” label on iPhones has come to mean a better camera and screen, Apple hasn’t announced a regular Apple Vision headset without the “pro,” so this definition doesn’t apply here (yet). The Apple Vision Pro isn’t going after high-level creative professionals like the MacBook Pro and iMac Pro have done in the past. In fact, Apple didn’t really show much content creation at all for the Vision Pro — it was mostly focused on content consumption, even in the work parts of its demos.

The device is powered by Apple’s M2 chip, while a new R1 chip handles sensor processing. The Vision Pro has a curved piece of glass on the front, along with a Digital Crown that lets you dial between augmented reality and full virtual reality. It has an external battery pack good for up to two hours of use and built-in speakers that support spatial audio.

Disney Plus will add interactive virtual experiences to its platform that can be accessed via the Vision Pro headset, including the ability to travel through a National Geographic adventure from your couch, according to a teaser shown during the keynote. A sea turtle swam across the scene in that teaser; if the experience is anything like the similar one on the Meta Quest, it could be a genuinely interactive experience rather than just a video.

Mark Gurman reported in his Power On newsletter in April that Disney Plus users could use the Vision Pro headset to watch sports in virtual reality. The first example is a 2D football game surrounded by boxes of useful information, such as the score and win probability. Another example involves a 3D, top-down view of a basketball game projected onto a coffee table in the user’s living room, allowing them to see a replay of the game from every angle.

But every successful Apple product of the past two decades has disappeared into our lives in some way—the iPhone into our pockets, the iPad into our purses, the Apple Watch living on our wrists and the AirPods resting in our ears. Wearing the Vision Pro for hours on end will call into question what it means to compute, but also, what it means to live in the real world. My forehead felt cool when I took the Vision Pro off after around 30 minutes, a testament to Apple’s considerate design. But my face also breathed with relief, the way it has after using other heads up displays. The air feels more real out here.

Apple showed a 180-degree 3D video format with spatial audio that it calls the Apple Immersive Video Format, which may or may not get a wide release. (The clips looked like the 3D videos we’ve been seeing in VR demos forever.) I looked at a 3D photo of some cute kids shot by the headset’s cameras and watched a 3D video of those kids blowing out a birthday candle. (Same.) I did a one-minute Mindfulness meditation in which a voice commanded me to be grateful while the room darkened and a sphere of colorful triangles expanded all around me. (Supernatural, which has millions of users on the Quest, has offered guided meditation since 2020.) And I sat in a virtual movie theater that looked just like a movie theater, one of the oldest virtual reality demos there is.

I did get to see a quick FaceTime call with someone else in a Vision Pro using an AI-generated 3D “persona” (Apple does not like it when you call them “avatars”), which was both impressive and deeply odd. It was obvious that I wasn’t looking at real video of a person, particularly around the eyes and mouth. But even that much was convincing after a while, and certainly much nicer than your average Zoom call. You set up a persona by holding the headset in front of you and letting it take a picture of your face, but I didn’t have time to do that myself, so a fuller judgment will have to wait.

The video passthrough was similarly impressive. There was no perceptible latency, and the image was sharp, crisp, and clear. I happily talked to others, walked around the room, and even took notes on my phone while wearing the headset — something I would never be able to do with something like the Meta Quest Pro. Still, it is video passthrough: I could see pretty intense compression at times, and a loss of detail when people’s faces moved into shadow. I could see the IR light on the front of my iPhone blink as it tried, and failed, to unlock with Face ID. And the headset’s view of the room was dimmer than the room itself; I had to take the headset off to appreciate how bright the room actually was.

The displays offer more pixels than a 4K TV for each eye, with pixels just 23 microns in size. In the short time I tried it, that was totally workable for reading text in Safari (I loaded The Verge, of course), looking at photos, and watching movies. I have never seen a headset display like it before. There was some green and purple fringing around the edges of the lenses, but I can’t say for certain whether that was down to the quick fitting, the early demo nature of the device, or something else entirely. We still don’t know exactly when it will ship beyond Apple’s promise of early next year.

We also couldn’t see the outward-facing EyeSight view, in which the wearer’s eyes are projected through that front screen, or the indicator that shows someone is in the headset but can’t see out. That view will either be innovative or horrifying. We’ll see.

I never got the chance to try the shutter button on the Vision Pro, which is used to take 3D videos and photos. The Digital Crown is on the right; pressing it brings up the home screen of app icons, while turning it changes the level of virtual reality in certain modes. When I asked why anyone would want to change that setting, it appeared Apple was thinking of the middle settings as a way to keep the sides open for work.

Around the headset itself you’ll count 12 cameras, a lidar sensor, and a TrueDepth camera, as well as IR flood illuminators to make sure the cameras can see your hands in dark environments for control purposes. The combination of the Apple M2 and the new R1 processor generates a good amount of heat, which the Vision Pro vents by pulling air up through the bottom of the device and out the top.

Hands-On at the Fieldhouse: Design, EyeSight, and the Question of Who This Headset Is For

The design language is all brushed aluminum, shiny glass, and soft fabrics; the vibe is closer to iPhone 6 than iPhone 14. The curved glass on the front of the headset serves as a lens for the cameras and for the EyeSight screen that shows your eyes, a complex piece of optical engineering. I didn’t get to see EyeSight in action during the demo.

The headset itself weighs a little less than a pound and is connected by a braided white power cable to a silver battery pack that offers about two hours of use. The cable detaches from the headset with a mechanical latch, but it’s permanently connected to the battery pack. If you want to plug into the wall, you do it through the battery pack.

Apple built the Fieldhouse for the Worldwide Developers Conference and held Vision Pro demos there. Upon entry, I was handed an iPhone for a quick setup process: a turn-your-face-in-a-circle scan (very much like the Face ID setup) that determined which size light seal to use, and then another side-to-side face scan that looked at my ears to calibrate spatial audio. After that, Apple had me visit a “vision specialist” who asked if I wore glasses — I was wearing my contacts, but glasses-wearers got a quick prescription check so Apple could fit the Vision Pro with the appropriate lenses. (The lenses are made by Zeiss; Apple needed a partner that can legally sell prescription lenses. They snap in magnetically and will be sold separately at launch.)

From what we’ve seen, it is a dramatically better-looking device than any other headset out there. The headset is thin and has a large, plushy band on the back; much of its apparent size comes from the fabric shield around it. The goggles are slightly curved and should wrap around most faces fairly nicely. The whole thing comes in a silvery color, with a cable coming out of the left side that runs to a battery pack, providing about two hours of battery life.

There is a real debate about what this is for. Unlike when the company launched the Apple Watch or even AirPods, there is very little existing market for mixed-reality devices like the Meta Quest and the Magic Leap. Most people have little or no idea how these headsets work, and little about how they should work seems to have been settled.

Apple’s First Leap into Spatial Computing: Sales Expectations, Price, and the Metaverse Question

Apple sells more than 200 million of its marquee phones each year. But the iPhone wasn’t an immediate sensation, with sales of fewer than 12 million units in its first full year on the market.

Wedbush Securities analyst Dan Ives estimated Apple will sell just 150,000 of the headsets during its first year on the market before escalating to 1 million headsets sold during the second year — a volume that would make the goggles a mere speck in the company’s portfolio.

Magic Leap, a startup that caused a stir with previews of mixed-reality technology that could make a whale appear in the middle of a gymnasium floor, has shifted its focus to industrial, health care and emergency uses since it had trouble marketing its first headset to consumers.

The metaverse mostly remains a digital ghost town although Meta’s virtual reality headset, the Quest, remains the top selling device in its category. Cook and other Apple executives avoided referring to the metaverse in their presentations, describing the Vision Pro as the company’s first leap into “spatial computing” instead.

Facebook founder Mark Zuckerberg has been describing these alternate three-dimensional realities as the “metaverse.” It’s a geeky concept that he tried to push into the mainstream by changing the name of his social networking company to Meta Platforms in 2021 and then pouring billions of dollars into improving the virtual technology.

Even so, analysts are not expecting the Vision Pro to be a big hit right away, partly because of its hefty price and partly because most people can’t yet see a reason to wear something wrapped around their face for more than a short period of time.

And although the Vision Pro won’t require the physical controllers that can make other headsets clunky to use, the goggles will have to either be plugged into a power outlet or run off a portable battery tethered to the headset — a factor that could make them less attractive to some users.

The headset will include a variety of sensors and cameras which will allow users to control it using their eyes and hand gestures. Apple said the experience won’t cause nausea and headaches like similar devices have. The technology was developed to create a three-dimensional version of each user for video conferencing.

The company emphasized that it drew upon its past decades of product design during the years it spent working on the Vision Pro, which Apple said involved more than 5,000 different patents.

Apple has a tradition of releasing breakthrough products, beginning in 1984 with Steve Jobs’ first Macintosh and continuing with the iPod in 2001, the iPhone in 2007, the iPad in 2010, the Apple Watch in 2014, and the AirPods in 2016.

Despite such skepticism, the headset could become another milestone in Apple’s lore of releasing game-changing technology, even though the company hasn’t always been the first to try its hand at making a particular device.

I doubt Apple has fully nailed the legibility of text here, but at less than $4,000 for a giant virtual workspace and TV screens, it could be a good option for a mobile workplace.

“It’s an impressive piece of technology, but it was almost like a tease,” said Gartner analyst Tuong Nguyen. “It looked like the beginning of a very long journey.”

Vision Pro Price, Launch Timing, and a Tough Moment for Virtual Reality

Although Apple executives provided an extensive preview of the headset’s capabilities during the final half hour of Monday’s event, consumers will have to wait before they can get their hands on the device, and they should prepare to pay a hefty price to boot. The Vision Pro will retail for $3,500 when it is released.

The initial reviews were mixed, and skeptics questioned whether even Apple could make virtual reality anything more than a niche technology. Proponents say Apple has the ability to make it mainstream because of its installed base of some two billion active devices.

“At the end of the demo, I took off the headset and felt two things: 1) Wow. Very cool,” The Wall Street Journal’s reviewer wrote.

These are tough times for virtual reality. Enthusiasm for virtual worlds has waned as pandemic lockdowns ended. According to PitchBook, metaverse-related startups have raised about $664 million over the past five months, down 77 percent from a year ago.

What “Pro” Means: From the MacBook Pro and iPad Pro to the Vision Pro

We’ve seen Apple struggle to adapt the iPad for creation over the years, even after the company blurred the lines with the iPad Pro — a hybrid device, much like the Surface Pro, that blends laptop and tablet. At the iPad Pro’s 2015 announcement, Apple focused on professionals getting work done with productivity apps like Office and Photoshop. Almost 10 years later, I still grab a laptop when I want to get work done, because iPad apps and the OS still haven’t quite caught up to macOS or Windows for multitasking and creation.

A few demonstrations did go beyond consumption. The new DJay app for Apple Vision Pro is likely to provide some impressive interaction that’s unlike anything else Apple demonstrated.

Apple showed the ability to drag and drop 3D content from Messages, but that content wasn’t created within the headset. There was a brief demo of using a virtual keyboard to send a message, but not the complex kind of “pro” interactions for manipulating text, documents, and images with just your voice, hands, and eyes that we’ve come to expect from pro devices with a traditional mouse and keyboard attached.

The “pro” label has lost some of its meaning since the early days of the MacBook Pro. Phones from OnePlus, Huawei, and others were using the word “pro” before Apple adopted it with the iPhone 11 Pro. At the time, former Verge senior reporter Chaim Gartenberg (damn, I miss that nerd) asked what it even means for a phone to be “pro,” and here we are nearly four years later asking the same about a new headset.

The MacBook Pro was one of the first Macintosh computers to use Intel processors, announced alongside an Intel-based iMac that included a built-in iSight camera, DVD-burning capabilities, and a bundle of digital lifestyle apps. The switch to Intel was driven primarily by performance, and Steve Jobs showed off SPECint benchmark numbers during the announcement to make the case. Apple didn’t use any benchmarks to justify the “pro” label on the Vision Pro.

Using Apps with Your Eyes and Hands: The Vision Pro Home Screen, Comfort, and Open Questions

The more fascinating part was how I interacted with the apps. I opened Photos by pinching my forefinger and thumb together, scrolled through photos by “grabbing” each image and swiping to the left, and expanded panoramic photos by staring at the “Expand” option and tapping my fingers. I scrolled 2D web pages in Safari using my eyes and a couple of fingers. I opened Messages, too, though audio interactions apparently aren’t ready yet, and I wasn’t able to record or send a message. I wasn’t able to scale the apps up or bring myself into them, as most of the content wasn’t fully volumetric, though an Apple representative said app makers can build those kinds of experiences in the future.
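That look-and-pinch model maps onto ordinary tap-style interactions, and Apple says developers reach it through familiar frameworks like SwiftUI. The snippet below is only an illustrative sketch, not anything Apple showed: the GalleryApp and PhotoGridView names and the photo list are invented, and the point is simply that a standard SwiftUI button would be targeted by looking at it and activated by pinching, with no gesture code of its own.

    import SwiftUI

    // A minimal, hypothetical sketch (names invented): a plain SwiftUI window.
    // On visionOS, standard controls such as Button are targeted by looking at
    // them and activated with the pinch gesture, with no extra gesture code.
    struct PhotoGridView: View {
        // Placeholder photo titles, invented for illustration.
        let photos = ["Beach Panorama", "Birthday", "Hiking Trip"]

        var body: some View {
            ScrollView {
                VStack(spacing: 16) {
                    ForEach(photos, id: \.self) { name in
                        // Look at a button to highlight it; pinch to "tap" it.
                        Button(name) {
                            print("Opened \(name)")
                        }
                    }
                }
                .padding()
            }
        }
    }

    @main
    struct GalleryApp: App {
        var body: some Scene {
            // An ordinary 2D window that the system places in the user's space.
            WindowGroup {
                PhotoGridView()
            }
        }
    }

That squares with what the demo suggested: the apps behave like familiar iPad-style windows, and the novelty is in the input method rather than in the app code.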

A dock of Apple apps appeared in front of me in home mode, and I could still see the real-life living room around it: essentially an augmented reality home screen for Apple apps. The app containers themselves were certainly not reinvented, and their icons were not little grabbable globules or anything else that conveyed volume. They were just there.

I had assumed the headset would feel light, but it still felt heavy. Once I adjusted the back strap, I went through another calibration process, which ended in an audible chime of approval. A light orb appeared in the middle of my demo.

“People’s tolerance for wearing something on their head for an extended period of time is limited,” says Leo Gebbie, a VR analyst at CCS Insight. A headset needs to be slim, light, and comfortable enough to wear all day, he says, and no one in the virtual reality world has managed that yet.

Also, the screens we already use every day aren’t totally reliable. You’ve probably had the experience where you want to snag a photo or video of something, so you launch your phone’s camera app, only to see the image stutter or the app crash. Now imagine that happening with your entire field of vision.