For the first time in several years, I watched the full Apple product event last night. I was impressed by the technological achievements, but the show didn't make me "dream" about the new iPhone 7, Apple Watch 2, etc.
Let me start with a review of the event. Before Tim Cook, the Apple CEO, arrived, we saw a funny exchange he had with a taxi driver. It looked like a video filmed just before Cook entered the hall in San Francisco.
Cook and various other people gave their talks. A billion iPhones have been sold – that's why they are everywhere. Super Mario (Run) has finally been brought to the iPhone. The game's father, my uncle Šigeru Mija-Motl, presented the new game in half-English, half-Japanese.
Other people told us about projects to teach kids how to code (in Swift) using an iPad app or something like that. Another woman showed that Apple's counterpart of Microsoft Office will now enable real-time collaboration among many users creating a presentation or another file. Apple Watch 2 is waterproof and GPS-equipped, and a hike enhanced by the product was sketched. The Apple Watch Nike Plus could be useful for runners, a Nike guy argued.
This watch will tease its owner – it's sunny outside, or it's Sunday and you really need to go running, Joe is ahead of you, etc. – a reason why I surely won't buy such an annoying, arrogant watch. ;-)
Apple Watch 2, iPhone 7, and iPhone 7 Plus are supposed to be water (and dust) resistant. The speakers push the water out of the new iPhones. The iPhones come in an impressive new super-polished Jet Black color and finish (which, Apple admits, easily attracts scratches), ceramic was added to the materials, etc. The iPhones' performance is now 120-240 times higher than that of the first iPhone.
The brightness of the displays has been almost doubled on all these products, battery life was claimed to go up, the 16 GB flash memory option was dropped, and 32 GB is the minimum now. The iPhone camera was revolutionized, with lots of stabilization and other features. The iPhone 7 Plus has two rear cameras – inequivalent to one another, a wide-angle and a telephoto – which is good for zooming or for blurring the background behind people (enhanced by lots of fancy software); many professional-looking photographs may be made this way, and some were shown. Aside from the two cameras, the iPhone 7 Plus differs from the iPhone 7 only in its larger size.
iOS 10 is the new operating system. Almost nothing was shown about it. In fact, the front side of the iPhones was remarkably absent from all the photos. I do care what's happening on the display, because the display is probably the most looked-at portion of a smartphone! ;-) Just look at what photos of, say, the Lumia 640 typically look like. Almost all of them boast some arrangement of the live tiles, etc. But we mostly saw the iPhone 7 from the back side!
It seems that they're ashamed of the display layout. Visually, and judging by the self-evident UI features, iOS hasn't really made substantial progress since 2007.
However, it seems clear that the technological progress hasn't stopped at Apple. It may be faster than what almost anyone really needs or wants – and that will probably be the main reason for Peak iPhone, if it arrives (or has already arrived).
Those who always use the newest iPhone probably can't imagine how perfectly satisfactory my Lumia 520 (which you may buy for $50 these days) still is, even after those 2+ years. It is fast for all purposes, including browsing, and hasn't slowed down despite the 100 or so apps I have installed on its 512 MB of RAM. I am actually increasingly worried that once I buy a phone 3 or 15 times more expensive, many things will actually be worse. Not just worse per invested dollar but worse in the absolute sense.
The connector controversies
The connectors – and the communication with external gadgets in general – turned out to be the most controversial aspects of the new products. First of all, all iPhone 7 devices completely abandoned the 3.5-millimeter audio jack. The Apple guy said this was because they have a vision and are brave enough to enforce it. Murray Gell-Mann's Enron ad about jettisoning unnecessary things would be appropriate here.
What is the vision? Well, there are no "mechanical" old-fashioned connectors or ports on the devices and everything is wireless, etc. Now, there is a part of me that completely sympathizes with this vision. Getting rid of all the connectors and cables represents a kind of "perfection". But do I, as a whole, sympathize with this vision? ;-)
Well, I don't. I am too practical for that.
First, the 3.5 mm audio jack was abandoned. You may still use earphones with a cable. However, the connector on that cable isn't the old universal, trans-corporate audio jack. Instead, it's the "Lightning" port, Apple's preferred cousin of micro-USB – which they want to be the only port. Apple will be including adapters that connect your old earphones with the 3.5 mm audio jack to the Lightning port. Note that there's only one Lightning port on the iPhone 7, so you can't listen through earphones and charge at the same time.
But the recommended new step forward is the AirPods, wireless earbuds; see the picture above. They contain a new wireless chip. The signal goes through the air and the connection is said to work well enough. You may listen for 5 hours before the battery runs out of juice. Then you need to return both AirPods to the small AirPod case, which recharges them. The case itself must be charged and, if I remember well, it allows the music to play for 25 hours.
You may buy these tiny wireless earphones for $159.
What do I think about them? I have highly mixed feelings. On one hand, I do hate cables going to earphones. What is annoying is that the cable is either too long, in which case it gets entangled; or not long enough, in which case some motion of the smartphone in your pocket, combined with gravity acting on the cable, tends to tear the earphones out of your ears. Wireless buds may be better.
But are they? Wouldn't a new cable geometry that starts on the top of your head be a better solution for the earbuds?
They have three obvious features that reduce their practicality:
- The wireless earphones – and their box – need to be recharged rather frequently. So you have additional "pets" that need to be fed and you need to think of them. It's some extra work that occupies your biological CPU.
- The wireless signal may be shielded in certain configurations or disrupted by interference – so you may probably sometimes lose the signal.
- When you lose them, e.g. while running in Nature, you lose $159. In fact, it's plausible that you lose $159 as soon as you lose one of the two AirPods.
Analogously, you are not allowed to worry about having just one Lightning port – or to complain about the additional disadvantages of the evolution away from audio jacks and other classic connectors. That's too bad, because free practical people – and I guess that an overwhelming majority of those live outside the San Francisco area – will care about these matters. Fortunately for Apple, there are probably hundreds of millions of obedient members of the "San Francisco sect" all over the world who will shut up, buy the new products, and avoid any imaginable blasphemous criticism.
There is a way to fight against the "lost wireless earphones" problem. Glue a thread or a wire to the smartphone as well as the AirPods so that they are physically connected. When the earphones fall out of your ears, they won't get lost. You may trace them by following the wire. The iPhone 8 may have this additional feature – the wires connecting the wireless earphones back to the smartphone.
In fact, using the same cable, one may even remarkably solve the other problems. The cable protecting the wireless earphones from loss could also conduct electricity, which could power the earphones – so that you wouldn't need to charge them manually! :-) And the cable could even carry the audio signal to the earphones, thus solving the lost-wireless-signal problem. I am describing these questions satirically, but my broader point is clear: it is damn questionable whether the transition to wireless earphones represents positive or negative progress. It's a controversial change that goes sideways.
Some rooms and offices with lots of computers, monitors, etc. contain lots of cables. Sometimes these get entangled into knots. The knots look ugly, and you may worry that you will spend an hour disentangling them when you need to, and so on. These are the feelings that drive our – and Apple's – desire to make everything "perfectly wireless".
On the other hand, the cables – a heritage of the 19th century – are wonderful. They work. Cables are the most practical tools for propagating electrical energy, and even information, over short enough distances. In particular, when you have just several cables around your laptop, they are just perfect. They are a memory of the good old times when technologies weren't wireless and people were satisfied. The cables and the 3D knots on them beautifully exploit the possibilities that Nature gave us when it chose the dimensionality of spacetime to be \(D=3+1\).
Similarly, the Apple guy said that they don't want (numerous) connectors in general because "the real estate on the iPhone is expensive", or however he exactly put it. Except that when you only have one Lightning connector, one might argue that almost the whole circumference is unused. How can the real estate be expensive if the utilization rate is nearly zero? ;-) An alternative viewpoint would be that it's a good idea – or at least justifiable and acceptable – to have a connector, a button, or a slot at every millimeter of the smartphone's circumference; you may call it the "principle of maximal utilization". For example, three Lightning connectors could be better than one. It's just a different philosophy. One can't really say that Apple's wireless ideal is better; it's just different. You may worship the clean and holy connector-free design – but you may also change your mind once you actually need to connect something that you could connect previously.
Just a year or so ago, I finally switched my laptop at home to Wi-Fi 24 hours a day. Before that, the router was close enough to the laptop, so I preferred the ethernet cable. Now I have UPC – which sends the TV/Internet/telephone signal through coaxial "TV" cables – so the router is in a different room. But even the Wi-Fi connection of your main computer is clearly not something that is "absolutely needed". Even if an ethernet cable had to go to another room, it wouldn't be a catastrophe. Many people have had one.
And I have one more negative experience with a wireless replacement: the computer mouse. I actually did buy (for about $40?) and own a wireless mouse some 14 years ago. It just didn't work too well for me. The signal was probably weak, so it didn't reach the laptop from the other side. And the recharging was annoying. One faces the problem of short-term battery life – the need to recharge the battery X times a month or so; I forget what X was. But one also faces long-term battery problems. After some time, the battery's capacity goes down or the battery dies, you're never sure whether and where you will be able to buy a new battery, and so on.
Problems qualitatively similar to those of the wireless mouse probably plague AirPods as well, hopefully to a lesser extent.
Parameters that consumers care about but producers don't
Battery life – and, more generally, the minimum care that the consumer has to devote to batteries and recharging – is among the most important parameters that consumers would like to see improved. Yes, chemistry and physics make the progress hard. But even software optimization should focus on things like that. That's one of the many reasons why I consider Windows Phones more optimized for practical life. Their battery life is generally better. And when it comes to the battery drain and slowdown caused by updates of background processes, Microsoft appreciated that this is a serious issue. The amount of data transmitted to update the live tiles is stringently regulated by the OS and really tiny – especially relative to the amount of beauty and practicality that the live tiles bring to the user.
Many parameters of computers and cell phones have improved brutally in recent decades and years. The new iPhones' CPU performance is 120 or 240 times better than that of the oldest iPhone. The RAM has grown by orders of magnitude, as have the disk space and many other things.
But what about things like... the boot time? Look at how the Commodore 64 booted more than 30 years ago. You press the power switch and the computer is up and running in less than a second, with Bill Gates' Commodore 64 BASIC eagerly waiting for your commands or code.
It's remarkable if you realize that, if I remember well, the frequency of the 6510 microprocessor was just 0.000985320 GHz, i.e. below one megahertz. It was several thousand times slower than the clock of today's microprocessors. But booting took less than a second. These days, you need minutes to turn on a computer. Even rather new computers may need up to 5 minutes from the true beginning to the moment when you may really do everything at the normal speed.
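To put the "several thousand times slower" claim in perspective, here is a back-of-the-envelope check; the 3 GHz figure for a modern CPU is my own assumed representative value, not something from the keynote:

```python
# Rough comparison of the Commodore 64's clock with a modern CPU's.
# The 3 GHz value is an assumption for a typical contemporary desktop core.
c64_hz = 0.985320e6    # the 6510 clock quoted above, roughly 0.985 MHz
modern_hz = 3.0e9      # assumed modern clock frequency

ratio = modern_hz / c64_hz
print(f"A modern core's clock is about {ratio:.0f} times faster.")  # ~3045
```

Yet despite a clock roughly three thousand times slower, the old machine booted in under a second – which is exactly the point.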
Clearly, the boot times haven't been a priority; making the computer ready for many kinds of tasks right away (even if much of it could be considered "bloatware") was more important. But the boot times could have been a priority.
Are the boot times important? We often use the sleep or hibernation modes and it's faster to wake up a computer from those. But I still think it's reasonable to say that a computer user may need to turn the computer on – in the old-fashioned sense – approximately once a day. Moreover, various updates often need to restart the computer. Shouldn't those things be faster?
If you have owned a computer such as the Commodore 64 and have tried an emulator on a PC, you must have noticed that you may run the app at the maximum, unrealistic speed and achieve some 50-fold acceleration or much more. It took several minutes for a Commodore 64 game to load from a tape (even with a turbo). Those things have surely sped up. But when you run these old computers and programs on the newest hardware, the speedup is just amazing.
In my opinion, there are actually numerous applications or activities in which the old Commodore 64, simulated on the newest hardware, seems more practical than the corresponding contemporary applications and programs. Well, I could go further. Some old digital or even analog stopwatches could be more practical than the stopwatch app on the newest smartphones – at least in the most ordinary circumstances, when the old stopwatches just worked. Have you ever tried to fairly compare them?
Needless to say, speedy boot times aren't considered the most important quantity defining technological progress, so little attention is given to this variable in the research labs of computer and smartphone manufacturers. But people do care. These things affect their lives, much like the battery life.
The same comment applies to the time a computer needs to update itself. I do think that updates should be made "virtually instantaneous". They should proceed in the background, creating a whole new virtual copy of the soon-to-be-changed files of the operating system. The upgrade would then be completed by (verifying integrity and) simply switching to the new copy.
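The scheme described above – slowly prepare a complete copy in the background, then switch over in one atomic step – can be sketched in a few lines. This is a minimal illustration of the idea, not any vendor's actual update mechanism; the function names and the symlink-based switch are my own illustrative choices:

```python
# Sketch of a "background copy + atomic switch" update.
# The slow work happens in stage_update(); activate() is the
# near-instantaneous part the user actually experiences.
import os
import shutil
import tempfile

def stage_update(current_dir: str, new_files: dict) -> str:
    """Slowly build a complete updated copy next to the live directory."""
    staged = tempfile.mkdtemp(prefix="staged-",
                              dir=os.path.dirname(current_dir) or ".")
    shutil.copytree(current_dir, staged, dirs_exist_ok=True)
    for name, data in new_files.items():          # overwrite changed files
        with open(os.path.join(staged, name), "wb") as f:
            f.write(data)
    return staged

def activate(link: str, staged_dir: str) -> None:
    """The 'instantaneous' step: atomically repoint a symlink at the copy."""
    tmp = link + ".new"
    os.symlink(staged_dir, tmp)
    os.replace(tmp, link)   # a single atomic rename on POSIX systems
```

The system always reads its files through the symlink, so the visible "update" is one rename, however large the staged copy was.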
And then you have the problems with troubleshooting. Certain problems that arise may devour huge amounts of the user's time. Progress is being made here, with the automatic detection of problems, the automatic installation of the right drivers in Windows 10, etc. The amount of time I wasted installing new drivers before Windows 10 was immense; those annoyances have basically disappeared, and we are in a new era in this sense – just as we entered a new era when the widespread adoption of Unicode cured the problems with 20 encoding standards for Czech letters with diacritics, something that wasted tens of hours in the mid-1990s. But I feel that lots of users are still facing rather rudimentary problems that the computer or phone could easily "know how to fix".
It seems to me that companies sometimes force "unnecessarily high-tech solutions" upon the users – such as wireless earphones – while failing to fully exploit the advantages of existing technologies that could actually be useful. Microsoft's Continuum – more generally, smartphones broadcasting their screen to TVs as if they were computers – is clearly a feature that may be very useful. When you're giving a PowerPoint-like presentation, it should be normal for the smartphone to be the only device you need on your trip.
Microsoft's Continuum may be said to be "too hard". But I think that many capabilities of smartphones aren't being exploited. For example, lots of people and families own several iPhones that could communicate and co-operate in many ways. But even trivial things such as "chess for two iPhones in the same living room" seem to be almost non-existent.
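To illustrate how little plumbing such a living-room feature actually needs, here is a minimal sketch in Python – a toy, hypothetical example, not an iOS API – of two devices exchanging chess moves as plain text lines over a local connection (the names host_game, send_move, and recv_move are my own inventions):

```python
# Toy sketch: two players on the same network exchanging chess moves.
# Loopback is used here for simplicity; a real app would advertise
# itself on the LAN and use the platform's networking frameworks.
import socket

def host_game() -> socket.socket:
    """One phone listens for the other player."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))   # port 0 = let the OS pick a free port
    srv.listen(1)
    return srv

def send_move(conn: socket.socket, move: str) -> None:
    """Ship one move, e.g. 'e2e4', as a newline-terminated line."""
    conn.sendall((move + "\n").encode())

def recv_move(conn: socket.socket) -> str:
    """Read one newline-terminated move from the other player."""
    buf = b""
    while not buf.endswith(b"\n"):
        chunk = conn.recv(64)
        if not chunk:
            break
        buf += chunk
    return buf.decode().strip()
```

That's essentially the entire transport layer for a two-phone board game – which is why its near-absence from the app stores seems like a missed opportunity.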
However, to a large extent, Apple is "improving" things that are already good enough. These improvements impress theorists like me – who like technological progress even for its own sake – but they're not too positively correlated with the practical needs of the users. Also, many parts of the technology are becoming complicated. I've been playing with gadgets for decades and have done numerous things with them, so I cannot be a complete idiot relative to others. But I do worry that with every new smartphone or similar technology, I may become unable to perform even the most rudimentary tasks.
The speed, ease, and practicality of the simplest and most ordinary, frequent tasks should be a higher priority for producers of new hardware and software.