Image: "My iPhone family pile" by Blake Patterson, under a CC license.
With time, the internet gained ground and developers started using it as an alternative vehicle for distributing updates. This has been a very welcome development indeed: the process is normally easy and allows for much more frequent updates.
As the number of our digital devices grows, software updates have become an increasingly important part of our (digital) lives. Smartphones, tablets, routers and smart devices (thermostats, light bulbs, cameras, even camera lenses) all allow their software to be updated.
The frequency of updates depends on the product and its developers, but for "small" applications and apps it can be very high. I have come across Android apps that receive 2-3 updates per week. And this is exactly where I believe I can spot a problem: the update process is beginning to take more time (as well as bandwidth and data volume) than perhaps it should. Taking one's smartphone offline for a day most probably means being prompted for a few dozen app updates when it comes back online.
Has the ease of deploying updates made developers sloppier? Has it increased the pressure on them to release software as soon as possible, even when not all features are there and the software has undergone only minimal testing? Or is it simply adding value for users, offering them access to new functionality, design enhancements and innovative features as these are created? As a software user/consumer, I'd very much like to believe the latter, though I suspect that we are mostly victims of the former. To be fair, though, for software and apps I really value, hitting the "update" button often comes with great expectations :-)
There is nothing wrong with improving users' experience through well-planned software updates. Needless to say, providing updates that close security holes or fix critical bugs is a must, too. However, offering updates too frequently can hurt users' perception of software quality and comes at a cost (users' time and productivity, network bandwidth, etc.). Is it perhaps time for software developers to rediscover quality practices? Or is constant updating something that we, the software users, will need to get used to (and perhaps even be taught to like)?