Monday 28 December 2009

Crave-Buy-Use-Discard: The life-cycle of consumer-grade technology

[Photo: sunflower life-cycle]
We must be living in an era of wonders. Product-wise, at least. In many parts of the world, technology has found its way into our everyday lives. From smart textiles to intelligent food-packaging materials, to fast, affordable multi-core processors, to the highly advanced army of mobile devices that increasingly fill up our pockets, bags and carrying-cases, there is no doubt that technology is having an impact on the shape and capabilities of the products around us.

But does this also mean that people have a better grasp of the underlying technology behind such advancements? Are they in a position to get the most out of the modern technological wonders?

Don't get me wrong, I do not mean to criticise in a dismissive way. In fact, those are rhetorical questions of a sort, since the answer - I believe - is simply "no, not necessarily". Of course, education has improved. The amount of knowledge pumped into young people through the educational system has also increased. Plus, young people are incredibly good at familiarising themselves with gadgets of all sorts. But at the same time, the technological sophistication required to produce much of the modern stuff around us has been rising at a faster pace, making catching up a challenging task.

It is no surprise that when things go wrong with modern devices, "fixing them" usually means either "open the box, replace a part, close the box" or - more often - "discard and replace". I admit that, given the level of integration of the various components, doing otherwise would be highly impractical.

What bugs me, however, is that we don't get the time to make the most out of the gadgets we buy. The actual depreciation rate is a tad too fast. Is it because of the poor usability of the devices we buy? Is it because of the market-disruptive advantages the newer devices bring? Is it because of consumers' poor understanding of the true capabilities of the things they already own? Are the price tags low enough to justify a non-conservative approach in our acquisition of technology-oriented goods? Is it simply a question of fashion?

Possibly the reason is a combination of things. If one looks at computer software, say at the office, it is clear that the features and capabilities offered today far exceed what the average user will ever use. Interestingly, for word processing software in particular, not that many features have been added since the old WordPerfect 5.0. If one focuses on hardware, things are even worse: CPU cores, in general, spend the majority of their time idle, waiting for user input; GPUs, too, have achieved insane processing power that, unless one immerses oneself in the realms of gaming, goes unused. Games aside, all that power mostly stays clear of the average consumer.
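
That idle time is easy to observe for oneself. Here is a quick back-of-the-envelope sketch in Python (my own illustration, nothing rigorous; it assumes a Linux machine, since it reads the kernel's /proc/stat counters) that estimates what fraction of the past second the CPU spent idle:

    import time

    def cpu_times():
        # First line of /proc/stat: "cpu  user nice system idle iowait irq softirq ..."
        with open("/proc/stat") as f:
            fields = [int(x) for x in f.readline().split()[1:]]
        idle = fields[3] + fields[4]  # idle + iowait jiffies
        return idle, sum(fields)

    # Sample the aggregate counters twice, one second apart, and compare the deltas.
    idle_a, total_a = cpu_times()
    time.sleep(1)
    idle_b, total_b = cpu_times()

    idle_fraction = (idle_b - idle_a) / (total_b - total_a)
    print(f"CPU idle over the last second: {idle_fraction:.0%}")

Run it on a desktop that is doing nothing heavier than word processing and the figure tends to sit well above 90%.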


What needs to be done, IMHO, is to give technology more time to mature. I certainly don't expect manufacturers to go down that route any time soon. I do, however, believe that consumers would benefit a lot from adopting technology with proven relevance to their needs and a reasonable future ahead of it.

Would that kill innovation? Hardly. It would, however, affect what innovative technology looks like.

Take the small, pocket-sized, mobile 10-lumen projectors (the idea in 2008 and the user experience in 2009): The idea is cool; the niche markets are there; but - at least in my region - those products came to the market, stayed for a few weeks and then practically disappeared. Why? Was the market saturated? Were people indifferent to the new product class? I think it was because that new class appeared too early. People are still buying bulky, bright projectors for their offices, and the average living room still lacks a projector; clearly, the market is not mature.

With consumers demanding a longer cycle for the technology they use, new products would have to focus on relevance, quality (including adherence to standards) and usability, and would have to come with a serious product support scheme. Software developers (including firmware developers) would have more time to optimise code and performance and thus allow for much better utilisation of present technology.

(Photo: "Sunflower life-cycle", by me)

1 comment:

Andrea Isabel said...

I've been feeling for three years that I know very little of what my laptop can offer me. I mean, all the functions it has are surely many more than the ones I use. But, in my case, my excuse is: I don't have enough time to explore all the features. Although the time invested in this exploration would probably save more time in the tasks I commonly do with my laptop, I guess the human factor stands between me and digging deeply into all the innovative stuff my laptop can do.
I absolutely agree with the focuses you mention in your last paragraph, especially relevance & usability.

PS: what is IMHO? (In my H... Opinion?) (It is just a guess, sorry if it sounds ridiculous)

Your photo is amazing! Where was that whole-cycle show happening?