Unlike other companies that have pursued sweeping transformations with artificial intelligence, Apple is using the emerging technology to improve basic functions in its new gadgets.
Without using the words “artificial intelligence” to describe the technology, Apple unveiled a new line of iPhones and a new watch with improved semiconductor designs that power new AI features.
The features mainly improve basic functions such as answering calls and capturing better images.
Artificial intelligence didn’t even make an appearance at Apple’s developer conference in June, though it has been quietly shaping the company’s core software products for months.
In contrast, Microsoft and Google have set ambitious goals for how much their AI efforts will transform their products and businesses.
Industry leaders have warned of the potential pitfalls of unregulated development of new tools like generative AI.
Apple built the Apple Watch Series 9 with a new chip that has improved data-crunching capabilities, notably the addition of a four-core “neural engine” that can process machine learning tasks up to twice as fast as its predecessor.
Neural engines are what Apple calls the building blocks in its chips that accelerate AI functions.
The watch chip’s AI components make Siri, Apple’s voice assistant, 25 percent more accurate.
But the chip’s machine learning components have enabled Apple to introduce a new way to interact with the device: People can “double tap” by twice pinching together the thumb and index finger of their watch hand to answer or end a phone call, pause or play music, or pull up information such as the weather.
The idea is to give people a way to control the Apple Watch when their other hand is busy holding a cup of coffee or a dog’s leash.
The feature works by using a new chip and machine learning to detect subtle movements and changes in blood flow when users tap their fingers together.
The iPhone maker also showed off improved image capture for its lineup of phones. The company has long offered a “portrait mode” that uses computing power to blur the background, simulating the effect of a large camera lens.
But users have to remember to turn on the feature.
Now, the camera automatically detects when a person is in the frame and then collects the necessary data to blur the background.
Apple is not the only smartphone maker to add AI to its hardware. Google’s Pixel phones, for example, allow users to erase unwanted people or objects from images.