The problem with “next-gen” gadgets


Gadgets, since time immemorial, have worked a certain way.

You, a company, release one. It’s good, but it’s not perfect. No gadget is perfect! So you do market research and focus groups. You figure out who’s buying. You figure out what they like and what they don’t like. You refine. You fix problems.

The next year, you release a version of that device that is objectively, concretely better. This is the next-gen device, the Device 2.0. You call this device an “upgrade.” You tell your customers to recycle Device 1.0 and replace it with Device 2.0. Some of them do. “Should you upgrade?” the tech bloggers write, calculating the pros and cons of doing so.

I know, I know, this is a vast oversimplification of how consumer tech actually works. I merely mean to illustrate that many of us who follow the gadget space share an assumption about the way products work: that products improve as the years go on. That next-gen gadgets are better than the gadgets they’re replacing.

But not all technology works that way anymore. And it’s time for all of us — companies and consumers alike — to stop acting like it does.

The “upgrade” mentality made a lot of sense for new categories of products that were still figuring out what customers wanted. The smart home space in the mid-2010s was a good example — it wasn’t clear how exactly people would use Alexa, Google Assistant, and the various hardware that included them, and as the market learned more, the software and speakers and such were refined to better suit those use cases. The Google Homes got louder and gained functionality without losing much in exchange.

But many prominent gadget categories — notably smartphones, laptops, and TVs — are now firmly out of that space. These are mature markets full of established players and products that work very, very well already. And that makes an “upgrade,” in the traditional sense, a tricky task.

One need only look at this year’s laptop market to see how that’s playing out. There were very, very few laptop releases that were strictly better than the predecessors they replaced. The examples I can think of are all in gaming, where some rigs did see a meaningful jump in graphics quality from both hardware and software improvements.

But almost every “next-gen” device I reviewed from the consumer computing space was not what I would call an “upgrade” from previous generations. They were upgrades in some ways and downgrades in others. Across the board, they were just different.

Some were radically different, in both design and function. Take Dell’s XPS 13 2-in-1, for example. Since 2017, this device has been a very standard convertible — that is, a regular-looking laptop that happens to be able to fold back 360 degrees. This year, however, Dell eschewed that design for a Surface Pro-esque form factor instead. This year’s 2-in-1, while still marketed as the XPS 13 2-in-1 and replacing the old one on Dell’s store, is essentially a Windows tablet with a magnetic keyboard case. That form factor isn’t necessarily better or worse, but it’s difficult to conceptualize as an “upgrade” from the previous form factor. It’s ideal for different use cases, and it’s targeting a different customer. It’s just different.

But there are also legions of next-gen laptop models that didn’t see many (if any) design updates but still ended up targeting a new customer entirely. That has to do with the choices Intel made about its 12th Gen processor lineup. Intel has long been the world’s largest semiconductor manufacturer and has operated without much meaningful competition for much of the past few decades. Only in recent years have AMD and Apple burst onto the scene with threatening, core-crammed competitors.

Where Intel could once get away with incremental performance bumps each year, it’s recently had to make bigger and riskier moves. The company made big strides in raw power this year, and its Alder Lake chips rivaled (and even surpassed) Apple’s Arm chips by many metrics. But those chips were also more power-hungry than the 11th Gen series was, and the battery life of many Intel-powered 2022 laptops suffered as a result.

And so we had, across the board, a year full of Windows laptops that were more powerful than their identical-looking predecessors but did not last nearly as long on a charge. Seriously, you can click on any review of a next-gen laptop that I wrote this year. I can almost guarantee you that I praised the performance but complained about the battery life. These were not upgrades, even though parts of them had improved. They were different devices, targeting users for whom power was a priority and battery life was not. They were not — even if there was overlap — strictly targeting shoppers who owned previous versions of those devices.

This isn’t exclusive to the laptop market, though. Look at the iPhone 14. It’s the iPhone 13, but there’s, like, a new camera sensor? I know very few people who have actually bought this new iPhone — I do know several people who have chosen to buy the 13 instead because they feel that it’s better value for their money.

[Photo: The Acer Chromebook Spin 714 was a markedly different device from the Spin 713, mainly because of its battery life. Photo by Becca Farsace / The Verge]

I want to be clear that I don’t mean to knock next-gen gadgets or argue that they should go away. They clearly serve an important purpose in the tech landscape. But if they’re not upgrades, then what are they? Hear me out: they’re sequels.

Entertainment has been doing this a different way for decades. When a sequel to a movie is released, we don’t assume that sequel will be an improvement on that movie. This is true of remakes as well. I think we can all be thankful that the 2004 Nicole Kidman version of The Stepford Wives didn’t erase the 1975 Katharine Ross original — the two are different movies with different tones and target audiences, despite having an entire premise and plot in common. A sequel is sometimes (often, in fact) worse than its predecessor, and that’s okay, not a massive failure or a sign that the studio is doomed.

Obviously, there are countless differences between consumer technology’s and Hollywood’s business models. Movies can’t break and don’t degrade (though elements of them — their special effects, their costumes and hairstyles, elements of their settings and storylines — do date them as time goes on). Gadgets need to be replaced in a way that movies don’t.

Still, I think parts of the entertainment business’s model could provide an alternative way for both shoppers and manufacturers to think about consumer technology. (There are, of course, tech products outside the gadget space that are already widely viewed this way — cars are one example.)

Some categories are as good as they’re going to get

I’m imagining a world where if my XPS 13 breaks, I can easily replace it with another 10th Gen XPS 13 — even if a 12th Gen model is on shelves. In this world, chipmakers don’t necessarily release new generations every year; they update when they have something groundbreaking to share. Companies don’t replace their gadgets with new versions of those gadgets, but sell both side by side, with clear descriptions of who each one is and isn’t for. And reviewers evaluate new units on their own, unique merits, rather than comparing them spec-for-spec to their predecessors.

I’m not suggesting that this world is even possible. We’re talking about companies that have a profit incentive to keep us buying new things and about consumers who love shiny new toys. I’m just saying it’s a world I’d vibe with.


