The Download: Apple’s headset challenges, and what AI can learn from nuclear safety

This is today’s edition of The Download, our weekday newsletter that provides a daily dose of what’s going on in the world of technology.

Apple will face an uphill battle convincing developers to build apps for its headset

The ‘one more thing’ announced by Apple at its Worldwide Developers Conference (WWDC) this year was the industry’s worst-kept secret. The Apple Vision Pro, the tech giant’s gamble on making mixed reality headsets a thing, has received a mixed reception. Most of the concern has centered on the eye-watering $3,499 cost.

But there’s a bigger problem: whether there will be enough apps available to make the device worth its cost. Redesigning apps for an entirely new interface is a real challenge, and developers are concerned. Read the full story.

—Chris Stokel-Walker

To avoid AI doom, learn from nuclear safety

For the past few weeks, the AI discourse has been dominated by those who believe we could develop an artificial-intelligence system that one day becomes so powerful it could wipe out humanity.

So how do companies themselves propose we avoid AI ruin? One suggestion comes from a new paper by DeepMind and its collaborators, which argues that AI developers should evaluate a model’s potential to cause “extreme” risks before even starting any training.

The process could help developers decide whether it’s too risky to proceed. But the AI sector might benefit even more from drawing lessons from a field that knows a thing or two about very real existential threats: safety research and risk mitigation around nuclear weapons.

—Melissa Heikkilä

Melissa’s story is from The Algorithm, her weekly newsletter giving you the inside track on all things AI. Sign up to receive it in your inbox every Monday.