Cook Calls It Revolutionary, and It Can Capture a Photo in the Blink of an Eye: Apple Releases Its First Headset, Priced Around 25,000 Yuan
Apple has released new hardware, the Apple Vision Pro, priced at $3,499.
Apple positions Vision Pro as the pioneer of the next generation of spatial computing.
AI is woven throughout Apple's system applications, including emoji, Messages, Notes, and more.
Apple has updated features such as AirDrop to improve interconnection between devices.
Apple's entire Mac product line has now bid farewell to Intel chips entirely, switching to Apple's self-developed M-series chips.
While "AI" was the keyword at every other tech company's developer conference, Apple returned to "hardware."
At the Apple developer conference in the early morning of June 6 Beijing time, after a new MacBook Air, the new M2 Ultra chip, and a series of operating-system updates, Apple CEO Tim Cook spent 40 minutes, nearly half the keynote, unveiling the revolutionary new hardware: Apple Vision Pro.
"Just as the Mac brought us into the era of personal computing and the iPhone led us into the era of mobile computing, Apple Vision Pro will guide us into the era of spatial computing," Cook announced from the stage of the Steve Jobs Theater at Apple Park.
The theater commemorates Apple's founder Steve Jobs, who was renowned for innovation. Cook, personally chosen by Jobs as his successor, has mostly left the outside world with an impression of "supply chain and cost control." If consumers embrace Apple Vision Pro, Cook will have an effective answer both to those who call him "short on innovation" and to those disappointed by Apple's apparent "inaction" in the new AI era ushered in by ChatGPT.
Companies across VR, AR, and even the metaverse are also waiting for Apple's rescue. In 2021, Facebook renamed itself Meta in anticipation of virtual reality, but headset sales have stayed low since then, applications remain scarce, and the expected VR boom never arrived. Microsoft's HoloLens, which displays computer images through expensive transparent lenses, also failed. Hopes are now pinned on Apple, the company with the strongest capabilities in product definition and market investment.
At the end of the keynote, Cook unveiled the product as "One More Thing," a phrase of great significance both to him personally and to the industry. Apple Vision Pro matters so much that the right way to watch this developer conference is arguably in reverse, starting from the end.
![Apple Vision Pro](https://a5qu.com/upload/images/32d171dcd0ed48c2826befe2283266b3.jpg)
Apple Vision Pro is not just a product,
but the next-generation hardware platform
Scenes that once belonged to science fiction movies all appear in the Apple Vision Pro demo: web pages displayed in midair, with fingers swiping through the air to turn pages; phone calls that feel like face-to-face chats, with the other person's image appearing in front of you the moment the call connects.
Apps once confined to iPhone, iPad, and Mac screens appear in the physical space in front of the headset wearer, against the backdrop of the real physical space around them. Users open, close, enlarge, shrink, or move apps through eye movements, gestures, and voice.
The images are not actually projected into the air; they are shown on displays in front of the eyes. The headset carries an eye-tracking system: focus on an app and pinch your fingers, and the app you have "looked at" opens. Look at the microphone icon in a search bar and dictation turns on; as soon as you speak, the search engine converts your voice to text and begins the search.
"Navigating the system requires only your eyes, and every element responds to your gaze. It feels like controlling everything with your mind," said Alan Dye, Apple's vice president of Human Interface Design, in the product introduction.
Lens: Switch between real and virtual worlds at any time
Unlike common VR headsets, Apple Vision Pro does not cover the eyes and cut users off from the outside world. It is an AR device that can switch between virtual and real at any time through its display technology.
Apple's designers describe a technology called EyeSight, which makes Apple Vision Pro appear "transparent" when other people come near, displaying the wearer's eyes. When the wearer is fully immersed in a virtual experience, watching a movie, playing a game, or focusing on a mindfulness session, the front of Apple Vision Pro is obscured, letting people nearby know the wearer is not open to interaction.
The transparency is not physical. Apple achieves the effect with a combination of inner and outer cameras and displays: the wearer's eyes that bystanders see "through" the lens are actually captured by an internal camera and played on an external screen, while the outside world the wearer sees is likewise captured by cameras and played back in real time on the internal displays.
Users set the depth of immersion with a knob on the headset. Anyone who has used an Apple Watch will find it familiar, since it closely resembles the Watch's Digital Crown. In the demo video, as the user turns the crown, the real-world background gradually gives way to a virtual one, walking out of an office full of data and into a lush Nordic forest, a distinctly sci-fi effect. In Apple's framing, the knob adjusts the user's level of immersion: VR and AR are no longer binary opposites here but points on a smooth continuum.
![Apple Vision Pro](https://a5qu.com/upload/images/7aea9c35f41f8c852501684110abbdaf.jpg)
Moreover, EyeSight is not only for viewing; the device can also shoot. Users can capture a moment in the blink of an eye. "Apple Vision Pro is Apple's first 3D camera," the introduction said.
Though only about the size of a pair of ski goggles, Apple Vision Pro's displays pack 23 million pixels, so each eye effectively faces more than a 4K screen. And because content is displayed in 3D, users can stack it in front of them at will, like stacking clothes in a wardrobe without walls. Compared with the tangible screens of a Mac, MacBook, iPad, or iPhone, the virtual screen of Apple Vision Pro is the entire physical space: even when the view ahead is full, turning around opens up another space.
"Your entire world is a canvas for apps; you can arrange them anywhere, no matter what you're doing," said Allessandra McGinnis, product manager for Apple Vision Pro. "You can place this screen anywhere you want and expand it. It's a huge, portable 4K monitor."
Apple Vision Pro supports all-day use, provided it is plugged in; on the external battery it lasts only about two hours. To reduce the weight on the head, Apple designed the battery as an external pack that slips into a pocket and connects to the headset by cable. For glasses wearers, Apple offers prescription lens inserts that attach to Apple Vision Pro's lenses without affecting eye-tracking accuracy.
Apple Vision Pro will go on sale early next year on Apple's U.S. website and in U.S. Apple retail stores, starting at $3,499, and will reach more international markets later next year.
There are some application scenarios, but not many yet
Apple envisions the device being used across personal work, entertainment, and collaboration. To immerse yourself in a photo taken deep in a forest, you can enlarge it inside Apple Vision Pro to the size of a room; to go deeper, the device can automatically dim the surrounding light. Movies and games gain a portable 3D experience. And with the Persona feature, Apple Vision Pro can create a digital likeness of you, using AI to match your facial and hand movements in real time, so remote chats with friends or meetings with colleagues feel almost in person.
In another demo, Apple showed several real-world applications developers could build on Apple Vision Pro. One developer designed a beating 3D heart that can be taken apart and reassembled by anatomical part, letting students examine a three-dimensional virtual heart; another provided a 3D model of an F1 race car; a third recreated a factory production line.
Disney CEO Bob Iger personally vouched for Apple Vision Pro's prospects in entertainment. "One of the world's greatest storytelling companies and one of the world's most innovative technology companies, capable of bringing you real-life magic," said Iger, who also announced that the Disney+ service would be ready for Apple Vision Pro on launch day. When the headset arrives early next year, buyers will be able to watch 3D versions of Disney animation.
Unlike the Apple Watch, Apple Vision Pro works independently, with no need to pair with the wearer's iPhone or Mac. And although under two hours of battery life does not look good enough for now, the biggest challenge for AR, VR, or MR alike is not hardware performance but the content ecosystem. Apple's platforms currently have 34 million developers; if Apple Vision Pro cannot succeed, the entire XR field of AR, VR, and MR will have to rethink the value of such products.
The threshold for competitors to catch up with Apple Vision Pro is not low
![Apple Vision Pro](https://a5qu.com/upload/images/f0fd17cc03da8175a40b9fb0b3dcecd1.jpg)
Every Apple executive who introduced Apple Vision Pro on stage found a way to elevate the hardware. Richard Howarth, vice president of industrial design, called it the "first wearable spatial computer." Mike Rockwell, vice president of Apple's Technology Development Group, called it "the most advanced personal electronics device ever" and "not just a new product, but the beginning of a new platform." Cook has long anticipated that such a headset could become, like the iPhone, a device everyone owns.
Cook has been betting on AR for years. As early as 2016, during a technology tour in Utah, he said that "augmented reality allows people to better integrate into the real world, rather than closing their eyes and entering a purely virtual one." That distinction between AR and VR is why Cook has insisted since 2016 that the two will diverge: the former will be owned by everyone, while the latter will end up a small vertical market.
Around 2016, Apple launched a string of acquisitions in 3D vision and AR. On its acquisition list, SensoMotoric Instruments, acquired in 2017, brought eye-tracking technology, and Akonia Holographics, acquired in 2018, developed transparent lenses for AR glasses. Both technologies appear in the Apple Vision Pro.
Beyond its capital moves, Apple's groundwork at the chip and operating-system level took even longer. Like the Mac, Apple Vision Pro runs Apple's second-generation self-developed chip, the M2, released only in 2022; on Macs it has already shown it runs cool and silent, qualities that matter greatly for a device worn on the head. Apple Vision Pro also carries a second chip, the R1, dedicated to processing the input from 12 cameras, 5 sensors, and 6 microphones. Apple says the two chips together stream images to Apple Vision Pro's displays within 12 milliseconds, eight times faster than the blink of a human eye.
From the A-series chips in iPhones to the M-series in Macs, Apple's chip effort dates back to 2008, and its operating-system groundwork is just as long-standing. At the developer conference in the early morning of June 6, after the latest versions of iOS, macOS, and iPadOS, Apple finally released visionOS for Apple Vision Pro, its first operating system built for spatial computing. In this first release, the 3D engine tracks where the eyes are looking in every frame and delivers its best image quality there in real time.
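Concentrating rendering quality where the eyes are looking is generally known as foveated rendering. Below is a minimal illustrative sketch in Python; the tile coordinates, radius, and falloff curve are invented for illustration and say nothing about Apple's actual implementation:

```python
# Illustrative sketch of foveated rendering as a general technique:
# quality is allocated by distance from the gaze point, so the region
# under the eyes renders at full quality and the periphery more cheaply.
import math

def quality_for_tile(tile_center, gaze_point, full_radius=0.2):
    """Return a rendering-quality factor in (0, 1] for a screen tile.

    Coordinates are normalized to [0, 1]. Tiles within `full_radius`
    of the gaze point render at full quality; beyond that, quality
    halves for every additional `full_radius` of distance.
    """
    d = math.dist(tile_center, gaze_point)
    if d <= full_radius:
        return 1.0
    return 0.5 ** ((d - full_radius) / full_radius)

# The tile under the gaze renders at full quality; a far corner does not.
print(quality_for_tile((0.5, 0.5), (0.5, 0.5)))         # 1.0
print(quality_for_tile((0.0, 0.0), (0.5, 0.5)) < 0.5)   # True
```

Because gaze can move every frame, a scheme like this must be re-evaluated per frame, which is why the eye tracker and renderer have to run with very low latency.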
Privacy will become a new challenge
Using the eyes as an interaction channel may bring a series of conveniences. Apple says it has developed a new form of identification called Optic ID, which unlocks the device by scanning the iris. Like Face ID, Optic ID will also be used for data encryption, device activation, password autofill, and payment confirmation.
But this makes privacy even more important, especially for a device that can track gaze position in real time. Apple once again stressed the privacy of headset devices at the event: Apple Vision Pro processes eye-movement data at the system level, and no individual app can access it. Only when the user looks at an element and pinches at the same time does the system tell the app or website that it has been selected.
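The design described above can be modeled as a simple mediation layer. The sketch below is a hypothetical toy, not the visionOS API: raw gaze data stays inside the "system" object, and an app's callback fires only on a confirmed look-plus-pinch, learning nothing more than that it was selected.

```python
# Toy model of system-level gaze mediation (hypothetical names; not
# Apple's API). Apps never see the gaze stream, only selection events.

class GazeSystem:
    def __init__(self):
        self._gaze_target = None   # raw gaze state, private to the system
        self._handlers = {}        # element name -> app callback

    def register(self, element, callback):
        self._handlers[element] = callback

    def update_gaze(self, element):
        # Gaze data is recorded here and exposed nowhere else.
        self._gaze_target = element

    def pinch(self):
        # Only a gaze-plus-pinch produces an event the app can observe.
        handler = self._handlers.get(self._gaze_target)
        if handler:
            handler(self._gaze_target)

events = []
system = GazeSystem()
system.register("search_field", events.append)

system.update_gaze("search_field")   # the app learns nothing yet
system.pinch()                       # now the app is told it was selected
print(events)  # ['search_field']
```

The point of the structure is that an app could stare at `events` forever and still never reconstruct where the user's eyes wandered in between selections.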
Whether privacy can still be protected once more manufacturers launch similar products, however, will be a big question. After all, when headsets become everyday electronics, the true attention economy will have arrived, and not every company can resist that temptation.
Operating system updates
make devices more interesting and communication between them simpler
![Apple WWDC 2023 keynote](https://a5qu.com/upload/images/1282b19de8fe97c07587d4b4beb1cea9.jpg)
Apple spent considerable time on a series of updates spanning iOS, macOS, and iPadOS, which can be summed up in two goals: first, make applications more interesting or useful through AI; second, make communication between devices simpler, whether between your own devices or with your friends' devices.
One hallmark of this Apple event is giving users intermediate states to choose from. Beyond the seamless transition between VR and AR in Vision Pro, AirPods gain a new "Adaptive" mode alongside "Noise Cancellation" and "Transparency"; AirDrop no longer requires you to stay put, automatically switching to network transfer when you walk away; and for calls from unfamiliar numbers, you can read an automatic live transcription of the caller's voice before deciding whether to answer or hang up. All of this rests on Apple's powerful combination of software and hardware, which lets you use these features without "feeling the presence of the technology."
How should AI be used? This is Apple's answer
One application Apple is trying to make more fun is stickers and emoji. Thanks to increasingly capable AI image processing, the new operating system encourages users to use them more freely, including turning photos into stickers. These can be sent not only in Messages but throughout the system, including in other apps.
A useful example is the Notes app. After the system update, users can read PDF files directly in Notes, and with AI image recognition, a user can photograph a form that needs filling out and convert it into a PDF; Apple then generates fillable fields in the blanks for the user.
With system-level AI, Apple has also launched a new journaling app called Journal. Using AI similar to a recommendation algorithm, combined with the user's daily phone activity, it summarizes the moments of the day worth recording. For example, if you attended a class reunion and took photos that day, Apple pulls relevant information from Calendar, Photos, Messages, and other apps as material for your entry. According to Apple, all of this processing happens locally on the device.
Although the theme of this Apple event was not large models, "Transformer" was mentioned at least three times. In the input method, for example, Apple's new keyboard predicts in real time the next word the user may type; the more it is used, the better the underlying model reflects the user's personal language habits.
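To make "personalized next-word prediction" concrete, here is a deliberately simple bigram frequency model. Apple says its keyboard uses a Transformer, which is far more capable; this sketch only illustrates the idea that a model trained on your own typing gradually reflects your habits.

```python
# Minimal next-word predictor: counts word pairs (bigrams) seen in the
# user's own text and predicts the most frequent follower. A stand-in
# for the idea of on-device personalization, not Apple's actual model.
from collections import Counter, defaultdict

class NextWordModel:
    def __init__(self):
        self.bigrams = defaultdict(Counter)  # prev word -> follower counts

    def learn(self, text):
        words = text.lower().split()
        for prev, nxt in zip(words, words[1:]):
            self.bigrams[prev][nxt] += 1

    def predict(self, prev_word):
        counts = self.bigrams.get(prev_word.lower())
        if not counts:
            return None
        return counts.most_common(1)[0][0]

model = NextWordModel()
model.learn("see you tomorrow")
model.learn("see you soon")
model.learn("see you soon")
print(model.predict("you"))  # 'soon', the habit the model has seen most
```

A Transformer replaces the raw counts with learned representations of much longer contexts, but the personalization loop, learn from the user's text, predict, repeat, is the same shape.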
Making devices work as one
Over the years, Apple has kept exploring how to make phones, watches, tablets, computers, and earbuds work together seamlessly; AirDrop was born in that context. At this WWDC, these features became more powerful and less noticeable.
Communication between devices has also gotten simpler. With NameDrop, friends can share personal details such as phone numbers and email addresses just by bringing their iPhones together. When a large AirDrop transfer is in progress and you have to leave, the remaining files continue to the other device over the internet. Widget functions are also interconnected across iPhone, iPad, Apple Watch, and Mac: if an app on the iPhone provides a widget, that widget can be used seamlessly on a Mac even when the app is not installed there.
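The AirDrop fallback described above amounts to switching channels mid-transfer instead of failing. This sketch models only that control logic, with invented names; Apple's actual transport is of course far more involved:

```python
# Illustrative fallback logic: a transfer starts peer-to-peer and, if
# the devices move out of range partway through, the remaining chunks
# continue over the internet instead of aborting. (Hypothetical model,
# not Apple's AirDrop implementation.)

def transfer(chunks, in_range):
    """Yield (chunk, channel) pairs. `in_range(i)` reports whether the
    devices are still near each other when chunk i is sent."""
    for i, chunk in enumerate(chunks):
        channel = "peer-to-peer" if in_range(i) else "internet"
        yield chunk, channel

# The devices walk apart after the second chunk is sent.
log = list(transfer(["a", "b", "c"], in_range=lambda i: i < 2))
print(log)  # [('a', 'peer-to-peer'), ('b', 'peer-to-peer'), ('c', 'internet')]
```

The user-visible effect is exactly what the keynote promised: leaving the room no longer cancels the send.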
Beyond the improved continuity between one user's Apple devices, real-time interaction between different users' Apple devices is also growing. Users can now watch movies and enjoy music together in real time, and collaborate in apps such as Notes and Freeform.
![Apple WWDC 2023 keynote](https://a5qu.com/upload/images/9846d93417a262b2db920927ee05fba5.jpg)
By making the hardware invisible and the software prominent, Apple strengthens the stickiness of its ecosystem. The moment you copy text on an iPhone and find nothing to paste on a Windows computer, you will surely feel the pull of Apple's full suite of devices.
More powerful chips
support local processing of Transformer models
Apart from Apple Vision Pro, Cook released few hardware updates at the developer conference, refreshing only the Mac line. The 15-inch MacBook Air, Mac Studio, and Mac Pro all gained versions with Apple's latest chips, marking the complete departure of the Mac product line from Intel. Since the first M1 shipped in 2020, Apple has finished the move away from Intel in under three years, a period in which Intel's share price has roughly halved.
First up was the MacBook Air, starting at $1,299. The new machine grows to 15 inches, the largest MacBook Air ever; it weighs only 1.5 kilograms and offers 18 hours of battery life, a 50% increase over the previous version.
Mac Studio, aimed at professionals, gets its first update, with M2 Max and M2 Ultra versions starting at $1,999. Like the M1 Ultra, the M2 Ultra fuses two M2 Max dies together, doubling the M2 Max's performance. The M2 Ultra's unified memory reaches 192GB, and Apple said at the event that such a chip can support processing Transformer models.
Apple's highest-end product, the Mac Pro, was also updated, with a starting price of $6,999.