Apple Inc.’s silicon design group is working on new chips that will serve as the brains for future devices, including its first smart glasses, more powerful Macs and artificial intelligence servers.
The company has made progress on the chip that it’s developing for smart glasses, according to people with knowledge of the matter. The move indicates that Apple is ramping up work on such a device, which would compete with the popular Ray-Ban spectacles offered by Meta Platforms Inc.
The silicon team has become a critical piece of Apple’s product development engine in recent years, especially after it began replacing Intel Corp. processors with homegrown Mac chips in 2020. Other semiconductors in development will enable future Macs as well as AI servers that can power the Apple Intelligence platform, said the people, who asked not to be identified because the plans are private.
A representative for Cupertino, California-based Apple declined to comment.
The glasses processor is based on chips used in the Apple Watch that require less energy than the components in products like the iPhone, iPad and Mac. The chip has been customized to remove some parts in order to further improve power efficiency. The processor is also being designed to control the multiple cameras that are planned for the glasses.
The company aims to begin mass production of the processor by the end of next year or in 2027, indicating that the glasses — if successful — are likely to come to market in roughly the next two years. As with Apple’s other major chips, partner Taiwan Semiconductor Manufacturing Co. will handle production.
Apple has spent years trying to develop smart glasses — something lightweight that consumers can wear all day. The original idea was to use augmented reality, which superimposes media, notifications and apps over real-world views. But AR remains years away from being practical.
In the meantime, Meta and others have had success with non-AR smart glasses, which can take pictures, play audio, make phone calls and let users talk to a voice assistant. Apple now looks to jump into that market as well — even while it continues to pursue the AR concept. The company held user studies with employees on the concept last year.
Apple is working on both options under the code name N401, a recent shift from the prior internal nomenclature of N50. Tim Cook, the company’s chief executive officer, is determined to beat Meta in the glasses market, Bloomberg News has reported. But Meta is moving aggressively itself. The social networking giant is rolling out a premium model with a display later this year and plans its first true AR spectacles for 2027.
Apple is currently exploring non-AR glasses that use cameras to scan the surrounding environment and rely on AI to assist users. That would make the device similar to the Meta product, though Apple is still figuring out the exact approach it wants to take. The iPhone maker also needs its own artificial intelligence technology to vastly improve before the company can roll out a compelling AI-centric device.
The company also is spreading its bets. Apple has been working on adding cameras to its AirPods and smartwatches, aiming to turn those products into AI devices as well, Bloomberg News has reported. The company is developing a chip called Nevis for the camera-equipped Apple Watch and a component named Glennie for the similarly outfitted AirPods. Apple is aiming to have those chips ready by around 2027.
Already, the iPhone has a feature called Visual Intelligence that can provide context for photos. For instance, customers can scan a music poster and have the event details added to their calendar.
Beyond the semiconductors for smaller devices, Apple is working on several new Mac chips, including processors that will likely be known as the M6 (Komodo) and M7 (Borneo). There’s also another, more advanced Mac chip in development dubbed Sotra. The company is planning to bring the M5 processor to the iPad Pro and MacBook Pro as early as the end of this year.
The AI server chips, meanwhile, would be the company’s first processors expressly made for that purpose. They will help process Apple Intelligence requests remotely and feed information to consumers’ devices. Today, Apple manages this task with the same chips it puts in high-end Macs, including the M2 Ultra. The Information reported that the AI server project would use a component developed with Broadcom Inc.
The project, dubbed Baltra, is planned to be completed by 2027. As part of the effort, Apple is considering different types of chips, including ones that have double, quadruple or eight times the number of main processing and graphics cores as today’s M3 Ultra. The semiconductors would make Apple’s AI services faster and more powerful, potentially helping it catch up in an area where it’s struggled.
The new semiconductors in development join a series of other initiatives in the works within Apple’s hardware technology groups, run by executive Johny Srouji. Following the release of the company’s first C1 modem chip in the iPhone 16e earlier this year, Apple is planning a pro-level C2 modem for next year’s high-end iPhones and an even higher-end C3 version for the year after, Bloomberg News has reported.
The group is also in charge of underlying components for initiatives planned for even further in the future, including a sensor and chip system that can noninvasively measure a person’s glucose levels. The company aims to include the technology in a future version of the Apple Watch.
This story was originally featured on Fortune.com
Mark Gurman, Bloomberg