I stay pretty well read up on technology, and on Apple in particular, and have for the last fifteen years, give or take. So when this question came up recently, I set out to jot down a short, concise reply to convey my general positivity and my reasons for it.

I failed.

I wound up writing a much more complete answer than I anticipated, one that grew into a general observation on how the advancement of technology compares to the problems it solves. You have been warned.

Here it is:

In short, I’m generally quite positive about the 7. It’s a significant improvement along all of the vectors people use smartphones for: faster, more memory, more responsive, more durable, and so on. And all by meaningful margins, even over last year’s model. Any iPhone owner at the 2-3 year point of their upgrade cycle should get one without hesitation.

For me personally, it’s a greyer choice. I had my 5 for three years, so when I got a 6S+ last year, it was a HUGE leap. A year later, it’s still a spectacular device. I took it out camping this weekend and abused the camera doing time-lapse and long-exposure tricks, and it just performed.

The one feature I was hoping for in the 7 was Pencil support. Had it had that, I’d already be in line somewhere to get one. Without it, the 7 is a good upgrade from the 6S+, but not enough to make me rush out. I might become interested just on the basis of the improved camera and the fact that I’m financing through AT&T anyway, so my monthly payment would be the same no matter what.

As to the lack of the headphone jack, I think this is MOSTLY a non-issue. The 7 comes with an adapter, and they’re selling extras for $9 (super cheap for Apple), so there’s a solution for retaining current headphone investments. People talk a lot about Apple pushing new Lightning headphones, but I think this is incorrect. Apple’s push is to wireless, period. And like their abrupt tech pivots in the past — the shift to USB in the original iMac, the removal of floppy drives, OS 9 to OS X, the shift to Intel chips in Macs, 30-pin to Lightning — broader support for this will ramp up over the next 6 months and be a complete change in about a year. The whole industry will catch up in that time and the later adopters will have an easy time of it. Expect a lot of new wireless headphone solutions to hit the market, in other words.

I also think that the fact that the exterior design is similar to the 6 series isn’t much of an issue. This is what happens in a matured, post-Cambrian Explosion ecosystem, and this is what’s happened in every technology sector that’s ever existed. Forms stabilize and slower iteration sets in, as technology allows. Eventually this results in a small handful of dominant forms, and a whole ecosystem of more specialized forms each seeking to serve smaller and smaller markets.

Looking at the computer industry as another example: the basic form was set by the 90s, and since then it’s been slow year-over-year iteration, with small-scale experiments pushing change.

What most people use computers for is a relatively small set of activities: internet, email, social media, photos, video, and so forth. Most of that is just communication, and once a machine gets fast enough to keep up with typing, any additional improvements are aesthetic, or ergonomic, or comfort features. So you go lighter and smaller, which led to the laptop explosion of the 2000s. The moment laptops could do the work of desktops, people went portable in droves. I did this with design and art, first with the iBook and PowerBook, and later with MacBook Pros.

The point is that many of these processes don’t change much over time, where humans are concerned. When I started doing graphic design on computers, back in the mid-1990s, the machines struggled to keep up with even basic tasks. The power needed to set type, to show color, to build vector graphics, etc., was demanding on that hardware. By 2000 or so, monitors had become good enough to show accurate color on flat screens (expensive screens, mind - my 22-inch LaCie flat CRT cost me about $1,500 at the time; it weighed about 70 lbs and was a beast, but the screen was glorious for its day). But building publications and doing photographic compositing at print resolutions (300 pixels per inch, in CMYK colorspace) still needed as much power as you could throw at it, and would still take 10 minutes to save a 500MB file. Printing to a color laser might take 20 minutes. Go make a sandwich. I had two full workstations in my office so I could work on one while the other was rendering or saving or printing. 300 MHz monsters, the fastest consumer machines you could buy at the time.
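To put rough numbers on why those files got so big: even a single flattened page at print resolution is tens of megabytes uncompressed, and layered composites multiply that quickly. A back-of-the-envelope sketch (illustrative assumptions on my part — US Letter page, 8 bits per channel — not figures from any specific job):

```python
# Back-of-the-envelope size of print-resolution raster data.
# Assumptions (illustrative): US Letter trim size, 300 ppi,
# CMYK (4 channels), 8 bits (1 byte) per channel, no compression.

def flat_page_bytes(width_in, height_in, ppi=300, channels=4, bytes_per_channel=1):
    """Uncompressed size of one flattened raster page, in bytes."""
    pixels = (width_in * ppi) * (height_in * ppi)
    return int(pixels * channels * bytes_per_channel)

one_page = flat_page_bytes(8.5, 11)
print(f"Flattened CMYK letter page: {one_page / 1e6:.0f} MB")   # ~34 MB

# A working composite with many full-size layers scales linearly,
# which is how files pushed toward 500 MB on late-90s hardware.
print(f"With 15 full layers: {one_page * 15 / 1e6:.0f} MB")     # ~505 MB
```

Real working files varied, of course — layer masks, channels, and compression all move the number — but the arithmetic shows why saving one of these took minutes on a 300 MHz machine.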

By 2004 I could do all of that on a 12-inch PowerBook, so I went portable full time. And that was my primary workstation for years, until I upgraded to the shiny new Unibody MacBook Pro in 2009. It was like a piece of science fiction by comparison. I’ve upgraded a few times since then, and currently use a Unibody MBP that’s going to be four years old in February. This one has a massive SSD in it, though, and is still superfast.

Now, during this whole twenty-year span, the fundamentals of printing technology haven’t changed much at all. I’m still creating 300ppi files to print in CMYK at the same paper sizes as I was in 1996. The difference is that everything now happens effectively instantaneously. Photoshop launches in 5 seconds instead of 2 minutes, 500MB files save in seconds instead of minutes, I can no longer overload the RAM and send the machine crawling through commands on the scratch disk, and I have more cheap storage space than I can fill.

So the ability of technology to deliver has blown past the demands placed on it, even for someone like me who used to be a genuine power user. These days the only envelope-pushing demands on computers are in content creation, 3D rendering, editing hi-res video, high-end gaming, and so forth. Large markets, but smaller and smaller as a percentage of the overall market.

As the tech catches up with the uses, and flattens out the friction points of solving the actual problems to which it is being applied, upgrade times lengthen. People wait longer to abandon a solution that works to move to one that merely works better. The “better” has to be demonstrable and clear, and convincing people gets harder and harder as time goes on.

Computing has gone through this curve, and is now trying gimmick after gimmick to find the next mainstream success. 4K and VR are the current edge cases, and there’s a lot of hype around both right now. Partly because they’re new (they’re actually not new, but the new promise is that affordable computers may finally be ready to deliver them to the masses). And partly because other industries that have themselves plateaued are cross-promoting new tech, searching for the next revolution to sell more of their own stuff – 4K TVs, for example, just a few years after they finally sold everyone a 1080p TV; or virtual reality solutions of one kind or another. (I’m not sure facehugging headsets are going to catch on any more than the Kinect did: an initial explosion of interest, followed by a collapse as the limitations and inherent inconvenience set in.)

This will happen with phones as well, and I suspect we’re already seeing the beginning of the transition. The long-term consistency we’re starting to see in exterior designs is a good indication, I think. Of course, as I say this, Apple is probably preparing to throw it all out the window next year, so I should enjoy this feeling of insightfulness while I can.