You will pay more than $3,000 for a virtual reality headset from Apple

Introducing New Widgets on Apple TV: Highlights from WWDC and Apple’s Other OS 17 Announcements

Soon, you will be able to use FaceTime on Apple TV. The new capability lets you use your iPhone as a webcam to chat with the people you’re meeting with from your TV, and it will even ensure you’re in the frame using Center Stage.

As you can see from the above announcements, widgets were pretty big at this year’s WWDC. They’re even coming to watchOS 10, where you can turn your watch’s Digital Crown to browse through them for an at-a-glance look at various information. Apple is also adding several new watch faces, a way to measure how much time you spend in the daylight, cycling features, and trailhead information for hikers.

Apple is also introducing a new “game mode” for macOS that will prioritize the GPU and CPU while gaming on a Mac and offer lower audio latency on AirPods. As part of Apple’s push into gaming, developer Hideo Kojima also announced that Death Stranding (and some of his other games) will arrive on macOS.

New screensavers that you can use as your wallpaper and widgets that you can add to your desktop are just two of the new features announced by Apple. There are some new features for Safari as well, which let you create and pin web apps to your Dock, as well as make profiles for different browsing sessions.

Those aren’t the only updates coming with iPadOS 17. It will have a personalized lock screen similar to the one on the iPhone and will come with the Health app.

With iPadOS 17, Apple is adding new interactive widgets that let you quickly access apps and features from the homescreen. The Notes app will be able to detect the fields in a PDF with the new updates. It will also let you work with others in real time to organize and annotate PDFs.

You can easily share your email address or phone number with another iPhone user, there is a new Check In feature, and voice messages will support transcription. Oh, and Apple’s dropping the “Hey” portion of its “Hey, Siri” trigger phrase.

A number of new features were shown off by Apple. StandBy is a new feature that turns your phone’s screen into a smart home-style display when it’s tilted horizontally while charging, allowing it to display important information, like the time and date.

Journal is a new app for the Apple mobile operating system. Journal is meant to encourage you to log your thoughts about recent activities. Apple says the app is secured with end-to-end encryption and that your logs are stored locally on your device. The app will be available later this year.

Source: https://www.theverge.com/2023/6/5/23749243/apple-wwdc-2023-biggest-announcements-vision-pro-macbook-air-15-inch-ios-17

How much can you do with the Mac Pro? Apple and Disney are talking about the future of VR and AR

Apple has an option for the M2 Max or M2 Ultra chip in the Mac Studio. The Mac Pro, on the other hand, features only the M2 Ultra chip, as well as the option for PCIe expansion. While the Mac Studio starts at $1,999, the Mac Pro starts at $6,999.

Apple is marketing the new 15-inch MacBook Air as the “world’s thinnest” 15-inch laptop and says it weighs just a little over three pounds. The device comes with up to 18 hours of battery life, 500 nits of brightness, and a 1080p webcam. You can order it today, and it will be available next week.

That’s a pretty big upgrade to the MacBook Air lineup, which has only featured 13-inch displays up until now. The Vision Pro headset was the other big thing Apple showed off at the event.

As shown off during WWDC, you’ll be able to use Apple’s Vision Pro headset to interact with the company’s native apps, such as Safari, FaceTime, Photos, Music, and more. You can even use the device as an external display for your MacBook, play Apple Arcade games, and watch movies.

The device is powered by two chips: the M2 and a new R1 chip for real-time sensor processing. According to Apple, the Vision Pro features a single strip of glass on the front of the device, along with a digital crown that lets you switch in and out of AR and VR. It also comes with support for spatial audio through built-in speakers and an external battery pack that’s capable of lasting two hours with a single charge.

A partnership with Disney is huge for a new device like this. The virtual reality experiences teased in Iger’s Disney Plus showcase are exactly the kind of premium content Meta hasn’t been able to get for its own Quest VR headsets, and its Metaverse project isn’t exactly looking very hot these days. We don’t have a way of knowing if Disney’s demonstration will reflect the Vision Pro’s capabilities when it arrives next year.

Mark Gurman, an Apple leaker, mentioned in his Power On newsletter in April that Disney Plus users will be able to use the Vision Pro headset to watch sports games in virtual reality. This was demonstrated in a few ways: the first features a regular 2D football game surrounded by useful information in widget-like boxes, such as the score, win probability, and player stats. Another example shows a 3D top-down view of a basketball game projected onto a coffee table in the user’s lounge, allowing them to see a replay from every angle.

Tuong Nguyen, a director analyst at the tech analysis firm Gartner, says it leads to the “head-in-a-box” problem. Something like Google Glass or Meta’s Facebook Ray-Bans may not be as feature-rich as Apple’s Vision Pro, but at least you can see around their frames. The physical knob on Apple’s headset can be used to adjust how much of the screen is taken up by digital elements, but you’re still relying on a screen to pump real world visuals in.

Yet what Apple demonstrated on Monday were mostly immersive versions of apps like FaceTime and Safari, as well as 3D photos and video, rather than a wholly VR experience. Here’s what one early tester had to say.

I did get to see a quick FaceTime call with someone else in a Vision Pro using an AI-generated 3D “persona” (Apple does not like it when you call them “avatars”), which was both impressive and deeply odd. I could tell right away that I was speaking to a digital recreation of a person, one that sat squarely in the uncanny valley, especially as most of the person’s face was frozen. Even so, it was convincing enough, and nicer than your average call. You set up a persona by holding the headset in front of you and letting it scan your face, but I wasn’t able to set one up myself, and there’s clearly a lot of refinement yet to come, so I’ll withhold judgment until later.

The video passthrough was similarly impressive. It appeared with no perceptible latency and was sharp, crisp, and clear. I was happily talking to others, walking around the room, and taking notes while wearing a headset, things that would never be possible with a device like the Meta Quest Pro. That said, it’s still video passthrough. When people’s faces moved into shadow, I could see intense compression and loss of detail. I could see the IR light on the front of my phone flashing as it tried, and failed, to unlock with Face ID. The room also looked dimmer through the headset than it really was, so when I took the headset off, I had to adjust to how much brighter the room actually was.

The displays are absolutely incredible: 4K for each eye, with pixels just 23 microns in size. In the short time I tried it, it was totally workable for reading text in Safari (I loaded The Verge, of course), looking at photos, and watching movies. It’s the highest-resolution headset display I have ever seen. There was some green and purple fringing around the edges of the lenses, but I can’t say for certain whether that was down to the quick fitment, the early demo nature of the device, or something else entirely. We’ll see when it ships.

When you put on the headset, there’s a quick automatic eye adjustment that’s much quicker and more seamless than on something like the Quest Pro; there are no manual dials or sliders for eye settings at all. Apple wouldn’t say anything specific about its field of view this long before launch, but I definitely saw black in my peripheral vision, despite the complete immersion the marketing videos would have you believe in.

The top of the Vision Pro has a button on the left that serves as a shutter button for taking 3D videos and photos, which I didn’t get to try. The Digital Crown is on the right; clicking it brings up the home screen of app icons, while turning it changes the level of VR immersion in certain modes. I wondered why anyone would want an immersion level other than all-on or all-off, and Apple appears to be thinking of the middle level as a sort of workspace for apps while still allowing you to talk to your colleagues.

Around the headset itself you’ll count 12 cameras, a LIDAR sensor, and a TrueDepth camera, as well as IR flood illuminators to make sure the cameras can see your hands in dark environments for control purposes. The whole thing runs on a combination of Apple’s M2 and new R1 processors, which unsurprisingly generate a fair amount of heat. The Vision Pro vents that heat by pulling air up through the bottom of the device, and venting it out the top.

Inside the Vision Pro’s Hardware: Prescription Lenses, a Tethered Battery, and a Better-Looking AR/VR Headset

The design language is all brushed aluminum, shiny glass, and soft fabrics; the vibe is closer to iPhone 6 than iPhone 14. The front glass is curved and houses a display that shows your eyes to the people around you. (This feature is called EyeSight; I didn’t get to try it in any way.)

The headset itself weighs a little less than a pound; it’s connected by a braided white power cable to a silver battery pack that offers about two hours of use. The cable detaches from the headset but is permanently attached to the battery pack, which can itself be plugged into the wall.

Apple held Vision Pro demos in a large white cube-shaped building it built for WWDC called the Fieldhouse. I was handed an iPhone for a quick setup process: a turn-your-face-in-a-circle scan, and another side-to-side scan that looked at my ears. After that, Apple had me visit a “vision specialist” who asked if I wore glasses; I was wearing my contacts, but glasses-wearers had a quick prescription check so Apple could fit the Vision Pros with the appropriate lenses. The prescription lenses are made by Zeiss, since Apple needed a partner that could legally sell prescription lenses, and they will be sold separately at launch.

Based on the little bit we’ve seen, it’s a dramatically better-looking device than any other AR or VR headset we’ve seen. The headset itself is thin, surrounded by not much else besides a fabric light shield and a plush band. The goggles are slightly curved and should wrap around most faces fairly nicely. A cable runs from the left side of the silvery headset down to the battery pack, which provides two hours of battery life.

The response to virtual, augmented and mixed reality has been decidedly ho-hum so far. Some of the gadgets that use this technology have been ridiculed, with the most notable example being the internet-connected glasses that Google released more than a decade ago.

Apple’s Vision Pro: Can It Turn the Metaverse from a Digital Ghost Town into a Three-Dimensional Reality?

By comparison, Apple sells more than 200 million of its marquee iPhones a year. But the iPhone was not an immediate sensation either; it took a full year to sell 12 million units.

Wedbush Securities analyst Dan Ives estimated Apple will sell just 150,000 of the headsets during its first year on the market before escalating to 1 million headsets sold during the second year — a volume that would make the goggles a mere speck in the company’s portfolio.

Magic Leap, a startup whose mixed-reality technology could conjure the spectacle of a whale breaching through a gymnasium floor, had so much trouble marketing its first headset to consumers that it has since shifted its focus to industrial, health care and emergency uses.

So far, the metaverse remains largely a digital ghost town, though Meta’s virtual reality headset is the top-selling device in the category. Cook and other Apple executives avoided referring to the metaverse in their presentations, describing the Vision Pro as the company’s first leap into “spatial computing” instead.

Facebook founder Mark Zuckerberg has been describing these alternate three-dimensional realities as the “metaverse.” He changed his social networking company’s name to Meta Platforms, and poured billions of dollars into the virtual technology in order to push the concept into the mainstream.

Analysts don’t think the Vision Pro will be a big hit. That’s largely because of the hefty price, but also because most people still can’t see a compelling reason to wear something wrapped around their face for an extended period of time.

Although Vision Pro won’t require physical controllers that can be clunky to use, the goggles will have to either be plugged into a power outlet or a portable battery tethered to the headset — a factor that could make it less attractive for some users.

Users will be able to control the headset and navigate its various apps with just their eyes and hands, thanks to its array of 12 cameras and six microphones. Apple said the experience won’t cause the recurring nausea and headaches that similar devices have in the past. The company also creates a three-dimensional digital version of each user.

The company emphasized that it drew upon its past decades of product design during the years it spent working on the Vision Pro, which Apple said involved more than 5,000 different patents.

Apple’s lineage of breakthroughs dates back to a bow-tied Jobs peddling the first Mac in 1984, a tradition that continued with the iPod in 2001, the iPhone in 2007, the iPad in 2010, the Apple Watch in 2014 and its AirPods in 2016.

Despite such skepticism, the headset could become another milestone in Apple’s lore of releasing game-changing technology, even though the company hasn’t always been the first to try its hand at making a particular device.

I don’t doubt that Apple has probably nailed text legibility here and made this immersive environment more compelling to use as a mobile workstation, but at $3,499, it’s a lot compared to the many VR headsets that can also create virtual giant workspaces and TV screens for you.

“It’s an impressive piece of technology, but it was almost like a tease,” said Gartner analyst Tuong Nguyen. “It looked like the beginning of a very long journey.”

Apple’s Vision Pro: Where Is the Tech Going, and Can a $3,500 Headset Go Mainstream?

Although executives from Apple showed off a lot of the headset’s capabilities during the last 30 minutes of the event, consumers will have to wait before they can get their hands on the device. Vision Pro will sell for $3,500 once it’s released in stores early next year.

The initial reviews were mixed, and skeptics questioned whether even Apple could make virtual reality anything more than a niche technology. Boosters, meanwhile, argue that Apple, with its base of two billion devices, is the only company that could make it mainstream.

“At the end of the demo, I took off the headset and felt two things: 1) Wow. Very cool. 2) Did I just do drugs?” wrote Joanna Stern of The Wall Street Journal.

These are tough times for virtual reality. Enthusiasm for virtual worlds, often called the metaverse, rose during the pandemic, but waned as lockdowns eased. Investors also appear to have moved onto shinier new technologies like artificial intelligence: Metaverse-related start-ups raised about $664 million in the first five months of 2023, down 77 percent year on year, according to PitchBook.

Nearly two decades after Apple introduced the word “pro”: what does the label mean on a headset with only a virtual keyboard?

Apple has struggled to adapt the iPad for creation over the years, even after the company blurred the lines with the iPad Pro, a hybrid device similar to the Surface Pro. Apple spent most of its time during the iPad Pro announcement in 2015 demonstrating productivity apps like Office and Photoshop, with a focus on professionals getting work done. Yet for over a decade I’ve still picked up a laptop when I wanted to get work done, because the OS hasn’t caught up to Windows or macOS for multitasking and creation.

Some demonstrations went beyond consumption. The Djay app for Apple Vision Pro looks to be different and will offer more interaction than anything else Apple demonstrated.

3D content was dragged in from Messages, but no one created it in the headset. There was a brief demonstration of sending a message with a virtual keyboard, though that isn’t the “pro” interaction we’ve come to expect from pro devices with a traditional mouse and keyboard attached.

That’s probably because the “pro” label long ago lost its meaning across the industry since those early MacBook Pro days. Apple only adopted the term for its phones after many other phone manufacturers had started using it. Nearly four years after Chaim asked what it meant for a phone to be “pro,” here he is asking the same about a new headset.

The iMac and MacBook Pro were the first Macs to switch to Intel and also included a camera, DVD burner, and a bundle of digital lifestyle apps. The main goal of the MacBook Pro was to justify the switch to Intel for power and performance per watt; during the announcement, Steve Jobs showed off SPECint benchmarks for the processor. Apple, this time, felt no need to justify the “pro” label on the Vision Pro.

Hands-On with the Vision Pro: Eyes, Pinches, and a Dock of Apple Apps

The more interesting thing was how I interacted with them. I opened Photos by pinching my forefinger and thumb together, scrolled through photos with the same movement, and then used my fingers to expand the panoramic photos in front of me. I scrolled 2D web pages in Safari using my eyes and a couple of fingers. I opened Messages, too, though audio interactions apparently aren’t ready yet, and I wasn’t able to record or send a message. Most of the content I saw wasn’t fully volumetric, nor could I pinch the apps to scale them up or bring myself into them. An Apple representative said that app makers will be able to build these experiences in the future.

The dock of Apple apps was in front of me, and I could still see my real-life living room surroundings. An AR home screen of Apple apps is about as vanilla as it sounds. The app containers weren’t reinvented, and their icons weren’t grabbable blobs of generated volume. They were just there.

I assumed the headset would be light, but it still felt heavy. After adjusting both the big back strap and the soft strap, I went through a second setup process, during which a light orb appeared in the middle of my view.

“People’s tolerance for wearing something on their head for an extended period of time is limited,” says Leo Gebbie, a VR analyst at CCS Insights. “If it’s something that people are going to wear all day, it needs to be slim and light and comfortable. No one has really achieved that just yet in the VR world.”

The screens we use every day aren’t completely reliable. You’ve probably had the experience where you want to snag a photo or video of something, so you launch your phone’s camera app, only to see the image stutter or the app crash. Now imagine that happening with your entire field of vision.
