Tuesday 28 October 2014

What is the must-have knowledge and how does one protect it?

'Book stack' by ginny
under a CC license
There was that Slashdot article on survival knowledge that got me thinking again. The question is: (i) what is the minimum knowledge that human civilisation should have in order to kick-start itself after a catastrophic event of some sort and (ii) how does one effectively preserve that knowledge?

There is even a book on 'rebooting civilisation' and I'm sure there are plenty more works on that question, which - by the way - is not at all uncommon.

The second part of the question seems simpler. There is no ultimate backup medium and we already know that the internet is no safe bet. Modern technology is good and sleek but it can fail, too (plenty of personal experience on that front). So, really, modern non-magnetic storage media, such as DVDs, seem like a decent backup solution, but they haven't been tested against time yet. Magnetic media are now reliable for operation on the scale of 5-10 years, but one shouldn't expect miracles. The 'cloud' could do better, since the storage equipment is maintained, but then access to the stored data can't be guaranteed. For a digital storage medium and format there's the additional challenge of compatibility with future (or past) equipment.

I hate to admit it, but as a storage medium, paper has served us reasonably well, despite being fragile, compostable, flammable, etc. Amazing, isn't it? And with some tricks we could store even more per page, even though that would make pages illegible to humans (for example, the QR code below contains the first two paragraphs of this blog entry - and, yes, it can be printed smaller and still be readable by a smartphone).
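To get a feel for the numbers, here is a rough back-of-the-envelope sketch. The per-code capacity is the published maximum for the largest QR code (version 40) at the lowest error-correction level; the characters-per-page figure is my own assumption, not a measured value.

```python
import math

# Maximum capacity of a single QR code (version 40, error correction
# level L): up to 2,953 bytes in byte mode.
QR_CAPACITY_BYTES = 2953

# Assume a dense printed page holds roughly 3,000 characters of plain
# text - an assumption for illustration, not a measured figure.
CHARS_PER_PAGE = 3000

def qr_codes_needed(num_pages: int) -> int:
    """Estimate how many maximum-size QR codes would hold the given pages."""
    total_bytes = num_pages * CHARS_PER_PAGE  # assuming 1 byte per character
    return math.ceil(total_bytes / QR_CAPACITY_BYTES)

print(qr_codes_needed(300))  # a 300-page book -> 305 codes
```

In other words, at maximum size one QR code holds about one printed page of plain text, so the real space savings come from printing the codes much smaller than a page.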


To settle the argument, let's say that we use a combination of media and storage methods to be on the safe side. What should we put on those? The Survivor Library that the article mentioned has an interesting selection of topics that range from 'hat making' and 'food' to 'anesthesia' and 'lithography'. Several state-of-the-art areas are missing (but that may be the point), as are some well-established disciplines such as mathematics and physics, while some topics we could probably do without.

Some have proposed keeping a copy of Wikipedia in a safe place. Yes, Wikipedia can be downloaded (its database dump, at least) and the size - so far - is said to be about 0.5 TB (i.e., 512 GB, or roughly 110 standard DVDs) with all the media files included.
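As a quick sanity check on that DVD count (assuming standard 4.7 GB single-layer discs):

```python
import math

dump_size_gb = 512     # reported size of the full dump, media files included
dvd_capacity_gb = 4.7  # standard single-layer DVD

dvds_needed = math.ceil(dump_size_gb / dvd_capacity_gb)
print(dvds_needed)  # 109 discs - close to the 'about 110 DVDs' figure
```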

There are also several physical and digital archives, some specialised, others more general. The Internet Archive is an interesting approach, as it keeps snapshots of various websites at various times. Not necessarily useful for the survival of mankind, but interesting anyway.

Another tricky bit that may not be apparent is that knowledge can only be effectively used by skilled people. So, not only do we need the knowledge but also a group of people with sufficient expertise to put that knowledge to good use. And then we need materials, resources and tools to allow that knowledge to be put into practice...

Hmmm.... Saving the civilisation seems to need a lot of thinking, after all :)

Sunday 26 October 2014

Art as a commodity

'Street art @ London' by
Alex Abian under a  CC license
I am not an art expert and I'm sure that I don't have the right 'eye' for art. However, there are pieces of art - paintings, buildings, graffiti, etc. - that I find 'active', in the sense that they seem able to evoke an emotion in me.

At the very basic level, art is an object of some sort. For instance, a painting is a surface with drawings and, possibly, colours on it, arranged in a particular way. They may or may not resemble a physical object or setting. The result may or may not look nice. But it will evoke an emotion in some people, perhaps under certain circumstances.

To my understanding, regardless of their other functions, works of art are a kind of commodity. A very special one, certainly different from other goods. So my question is, how does one put a price tag on a work of art?

Yes, I assume supply and demand is involved. But that shouldn't be the only factor. The reputation of the artist? Yes, that too should play a role. The views of the critics? That, too. Other factors such as the 'collectible' value also play a role. The effect the work of art has on people? Hmmm.... I'm not sure that this really counts.

Well, it is clear that I won't be reaching a conclusion here. To me, handling art as a commodity feels a bit strange. The only thing I'm certain about is that art, in all possible forms and price tags, can be a very welcome addition to our everyday life!


Sunday 19 October 2014

If we want innovation we may need to re-think the right to fail

'Failure' by Beat Küng
under a CC license
Success and failure are two terms that we encounter very early in life. The paradox is that, while we learn and develop through failure, ultimately reaching success, later in life we tend to look down on those who do not succeed.

Certainly, there must be an element of evolution involved in that attitude of ours. Clearly, success is the desirable outcome. When it comes to making breakthroughs, though, regardless of whether those are disruptive innovations or smaller forward-leap ideas, trial-and-error - or, in plain English, failure - is part of the process. Our stance on somebody's failure normally includes elements of criticism (constructive or not-so-much) and sympathy, at ratios that vary according to our ties with the individual in question and the impact of that individual's failure.

At any rate, despite the fact we know that failure is part of life, which may even lead to success, we often 'forget' that people have the 'right to fail', at least to some extent.

Interestingly, our legal and business norms seem better prepared to handle failure than our social instincts. Entrepreneurs can go bankrupt, for instance, and start over after a while. First offenders get a 'lighter' treatment in the justice system. In each case, of course, the impact of failure on the individual does vary - and there is always some negative impact and maybe even some longer lasting effects.

So the question is, how do we shape things in such a way that the fear of failure does not hinder innovation, including innovative thinking, innovative design, innovative practices, etc., while the impact of a likely failure is contained reasonably well?

I'm not sure I have the answer to that. But there are things, related both to the effort towards success and to the (potential) failure, most of them already tested and proven, that may help:
  • Make advice easily available to innovators. That may be through free research or business development services, through subsidies available for consultants, etc.
  • Develop a network of mentors available to support innovators. Having a mentor solves the problems of 'what is the right question to ask a consultant?' and 'how do I prioritise tasks?'. Such schemes - to my knowledge - have mostly been limited to academic and large corporate environments. Maybe it is worth considering how to deploy such a scheme for emerging innovative entrepreneurs.
  • Encourage step-wise development. Such steps would limit the cost of failure at each step with the added bonus of better awareness of all opportunities as the 'product' matures.
  • Encourage pooling of resources and diversify investment. Now that is a tricky one. It can apply to both enterprises and investors, including financial institutions. The former may not have the capacity to adopt such an approach, but the latter most likely have something like that already in place. The problem is how to correctly estimate the risk of each investment, so as to allocate reasonable funds in a reasonable way. There, both underestimating and overestimating the risk lead to serious problems for the innovation system.
  • Provide guidance after (potential) failure. Yes, seriously. Failure doesn't always have to be an abrupt halt, but innovators should have the means to assess what went wrong and if/how it can be fixed. And yes, the next step is to provide resources after (potential) failure, should things prove to be fixable.
  • Promote success stories.
  • Encourage the innovative thinking of students within the education system. That should be a no-brainer, yet in practice we choose to be on the conservative side. There are many ways to do that; gamification of the challenge could be one of the alternatives. To be fair, however, it is no easy task - especially if the education system runs on limited resources. In any case, it should include advice on how to deal with failure at the factual and - possibly - the emotional level.
The list above is only indicative. The bad thing is that all these measures come at a cost and that the potential benefit is linked to the (perhaps risky) innovation at the end of the chain. The good thing is that such measures can be applied within different environments and at suitable intensities, minimising risk while still being able to reach (and study) results.

And, for the end, a couple of relevant TED talks. As usual, inspiring to watch :)




Sunday 12 October 2014

Could we increase ideas' diversity by simply switching languages?

'Language' by
<leonie di vienna> under
a CC license
The idea that language and thought are interconnected is not new. Discussion is very much ongoing on whether it is the language which is shaped by perception or vice-versa. A number of interesting examples surface from time to time, demonstrating the link among language, perception and - possibly - thinking.

For instance, the Pormpuraawans, an aboriginal community in Australia, use cardinal directions (north, south, east, west) in their speech instead of relative ones (left, right). This seems to be associated with a very high awareness of orientation, even indoors. As described by L. Boroditsky, when given a series of cards representing images of a temporal nature (such as an ageing man), those people used the east-west orientation to put them in order, while English speakers would use a left-to-right order and Hebrew speakers a right-to-left order for the same task.

While all those are, indeed, interesting, I find intriguing the idea that by simply switching languages our perception of reality may shift. It sounds like being able to change one's viewpoint on a problem, or to think out-of-the-box, by taking that one easy step (if one is bilingual or multilingual, of course).

I do remember one of my teachers maintained that, in order to learn and then master a language, one should stop merely translating from one's mother tongue to the new language and, instead, think what one needs to say in the new language altogether. I know, it sounds confusing, but there may be some truth in that advice. In my limited experience, different population groups think differently, their language often reflects that, and using that language helps a foreigner understand that different way of thinking, at least if he/she has been exposed to the corresponding culture.

Should the facts be right and the hypothesis of a two-way link between language and thinking be valid, there is certainly considerable potential here. Imagine that one could instantly enhance opinion diversity simply by having a group discuss a topic in a different language, by keeping notes - and later reviewing them - in a different language or, even, by producing - at a later stage - a summary of thoughts and decisions in a different language. Alternatively, in a more traditional approach, one could try mixing people with different mother tongues in the same working group, although that may not always be feasible or practical. Some thoughts on activities, actions or interventions that sound unlikely to succeed or too unconventional in one language may sound perfectly reasonable or manageable in another, simply because the two languages may be linked to societal perceptions of different dynamism.

Of course, all that assumes that people are well immersed in the second language they use, which typically happens when they have a very good level in that language. Such people, however, are increasingly common today. There is, unfortunately, the catch that, regardless of how good or bad something sounds in a discussion, implementing a decision will have its own effect (good or bad) independently of the discussion that preceded it. Still, diversity of views should be a plus for identifying problems, solutions, risks and opportunities.

At any rate, while not the only such approach, this is a route that should be easy to explore, since there is no extra cost involved (most people tend to know a second language anyway). Maybe it will prove too good to be true, maybe not. Having said that, it will feel rather awkward and unconventional, at least at the beginning, but - hey - there is no real harm in trying it once or twice :-). After all, because of the internet, international collaboration, globalisation, world politics, etc., using a language different from one's mother tongue is not that rare any more...

Wednesday 8 October 2014

Simplicity; the all-too-common target we normally miss

'Beauty in Simplicity' by Clay Carson
under a CC license
Can you recall the safety demonstration that is performed just before take-off on every flight? It covers basically just four things (seat belts, oxygen masks, life jackets, and the emergency exits and routes to them). That simple. The bare minimum information that can save lives in case of an emergency within a plane which, by the way, is a very complex machine.

I like simplicity. Most people do, I believe. But I'm used to things around me being complex and requiring handling of a certain complexity.

In some cases, simplicity may be a matter of taste. For instance, minimalist architecture, minimalist design and minimalism, in general. Then, it may be a matter of function or usability. For example, the one-button mouse that Apple introduced or the bare interface of GNOME or Xfce, the operation of Microsoft Kinect and so on. And, of course, we have simplicity in processes and procedures (administrative procedures included), with the one-stop-shops and lean manufacturing or lean management concepts as examples.

To me, the latter is of utmost importance. Simplicity is the approach that saves resources, helps transparency, facilitates participation, minimises mistakes, encourages standardisation, etc. For instance, could you imagine referendums with complex what-if sorts of questions? I hope not. That level of simplicity should be a target for most processes and procedures around us: tax forms, the procedures for establishing businesses, the formalities of communication across public or private organisations, the procedures for public consultation, etc.

Of course, many will argue that a one-size-fits-all approach doesn't really work in all aspects of life. True. But I believe that the challenge is to apply simple models to small groups of applications in a coherent way, rather than trying to use a single process for all applications. However, that is no small feat. Mistakes will be made, corrective actions will need to be taken and a new 'simplifying' cycle will need to start. And there lies the hidden challenge: frequent changes cause confusion, regardless of whether each new approach is a simple one.

Simplicity (and clarity) is a thing that we could certainly use more of. At the collective level, it could allow things to function better and at a lower cost. It would cut down red tape and limit confusion. At a more personal level, simplicity has the potential to make our lives better and give us the chance to focus more on things that matter, undistracted from clutter, regardless of those 'things' being people, causes or creations of any kind.

So, once more, is there a limit to simplicity? Most likely yes. But we still have plenty of room before we hit it.

Sunday 5 October 2014

Do we make the most out of (computing) technology?

'Typewriter' by Reavenshoe Group
under a CC license
Sadly, the brief answer is no. Most of us have in our hands, at home or at work, computing or other electronic hardware that would have been considered pure fiction 20-30 years ago. Although we have changed the way we live and work due to technology, the steps forward we have made don't necessarily go hand in hand with the leaps in technology we have witnessed.

Of course, there are exceptions to the observation above, but let me mention a couple of examples; you can tell me whether they sound familiar or not.

At the place where I work, all employees have PCs. Their (the PCs') primary tasks are e-mail, word processing and printing, and web browsing (not necessarily in that order). Yes, sure, some people do some statistical analysis, some DTP, some database design and some feed input to a number of databases but, still, the majority of PC time is devoted to the three things I mentioned before. You may think that the volume of work or the quality of the output has increased. Indeed, it may have. But there is still a small number of regular PC users who treat word-processing software more like a typewriter than a modern PC. OK, I'm exaggerating here, but I believe you can see my point.

The other major change has been in the field of mobile devices. Each smartphone is practically a small computer, powerful enough to handle not only calls and messages but also browsing, VoIP, video chat and practically most of the stuff that would run on a desktop computer. Do people use those features? Yes, some people use some of them. But others seem to have problems with the new technology. The following infographic shows an approximate breakdown of the various uses of smartphones.



According to the infographic above, new stuff (web, search, social media, news, other) accounts for a moderate-to-low 24% of smartphone use time. An interesting question would be whether the total time interacting with smartphones is higher than before, when we had plain mobile phones. I suspect it is.

So why can't we make more and different things now that we have such computing power in our hands?

I don't really know (I'll be doing some guessing here) but here are some possible reasons:
  • Bad user interface design. Yes, all manufacturers and software designers call their interfaces intuitive, but that is not always the case. To make things worse, I don't believe that the perfect user-friendly, intuitive interface exists. Using an interface successfully will always take persistence, imagination and luck. But there are design basics that can help. Below there is an early (very) critical review of Windows 8 (which, by the way, I rather like as an OS).


  • Crappy or buggy software; software incompatibilities; software complexity; inconsistency across platforms and devices; lack of decent manuals or efficient tutorials; lack of user training (it sounds old-fashioned but in some cases it could help).
  • Software cost and/or poor use of open source software. This particular point always bugs me. It's fine to pay for software that enhances productivity. But why do businesses avoid investing in open source software in a coherent way? Especially in cases where the open source alternative proves better in usability, compatibility and, well, cost.
  • Hardware restrictions. Yes, you read correctly. We have plenty of processing power, but we may have other limitations that hinder full use of that power. For instance, smartphones can do a lot, but they need to be reliably connected to a fast network, and that comes at a cost that in many cases is undesirable or even excessive. Another example is modern PCs that are powerful but often come with the minimum possible screen real estate. Just adding a second monitor would boost productivity (and save on printer paper), but the majority of workplaces I know of stick to small single monitors (often badly positioned in front of the user). Another all-too-common thing is policy restrictions on the use of PCs, some of which severely impact usability, especially when paired with an IT department that refuses to listen to the users' needs.
  • IT departments that are overloaded with typical tasks and don't have the resources to add new capabilities to their systems (an extra programmer could work miracles in many circumstances).
  • No reliable communication between (casual) users and developers to assist new product development or product improvement (yes, there are beta testers, and developers can gather telemetry data, but this is not even close in magnitude to what I refer to).
The disappointing thing is that most of the problems above are not so hard to address. Maybe the entire product-market-user model needs some rethinking. Maybe developers and, possibly, manufacturers need to put more effort into durable platforms and commit to their support for longer periods. And, finally, maybe we, the users, need to be more conscious of our options/choices and voice our thoughts/wishes/concerns when needed. Just saying...


Thursday 2 October 2014

What does it take to make an active citizen?

'Smile! It's Contagious' by
Daniel Go under a CC license
An active citizen is a citizen with the proper sense of responsibility towards society. That is, indeed, a vague and ambiguous statement that can hardly serve as a definition. Just to contribute more to the confusion, active citizens are not necessarily activists - at least not under the negative light that has occasionally been shed on that term. The problem is that there is no formal definition of active citizens, just examples placing them as the good guys of society, the ones doing the right thing, from respecting the environment to voting, and from properly voicing their opinion to volunteering for a good cause.

Does a society need active citizens? Certainly yes! Could a society do without those? Maybe. But it would need to heavily rely on other mechanisms to ensure its proper function, should the majority of its members choose not to fulfill their responsibilities. Imagine, for instance, a society where people would neglect the environment and, instead, only pollute.

Active citizens can drive societies further ahead of what laws and established norms could, on their own, achieve. They could do that possibly at a lower total cost, mobilising more diverse resources and, most likely, managing them more effectively.

The question is, how does a society (or a state) encourage active citizenship? Especially in times of a weak economy and overall uncertainty. What do people need in order to grow from plain individuals into active citizens within a dynamic society?

Inevitably, I'll be doing some not-properly-documented brainstorming here but feel free to correct me:
  • People need to be inspired by someone or something. This could be done via a role model, a motivational speech, work experience, culture, personal interactions with others, etc.
  • People need the time to process ideas, to reflect, plan, discuss and reach decisions.
  • People need the space and the means (or resources or support) to implement their decisions, to setup, run and monitor their plan.
  • People need to have margin for failure.
  • Should success come, people need to be able to benefit, at least morally and emotionally so as to, in turn, inspire others.
That's a small collection of five rather naive and quite idealistic points. In practice, people won't have the privilege of all of those - at least not at the same time. However, there are feasible steps that societies/states can take to make the environment friendlier to active citizens. For instance:
  • Setting up/improving a clear and easy-to-comprehend legal framework for citizen welfare (health, education and further development, employment). Just for the sake of the argument, having healthy work environments with adequate paid leave and decent minimum wages would allow people to think beyond work as a means of survival. Adding incentives could help a lot (e.g., leave for charity work).
  • Encouraging corporate social responsibility in both the private and the public sector, so as to benefit society directly but also to further expose people to (some) values relevant to active citizenship.
  • Establishing policies that provide a framework for citizen initiatives and (some) access to resources, e.g., simple processes for establishing non-profit CSOs, providing access to data, allowing access to and use of public spaces, providing public funding for certain citizen initiatives, providing legal advice and business plan support, etc.
  • Promoting the culture of active citizenship, e.g., via education or via promotion of successful initiatives.
  • Interacting with citizens - active and not-so-active - in motivational ways (inviting input, listening, discussing, providing feedback).
  • Adopting good practices, investing in results and working towards causes highlighted by active citizens, etc., thus demonstrating that getting actively involved leads to positive change and benefits society.
I'm sure that there is more to add to the list. I'm also well aware that measures in that direction do exist in nearly all EU countries. However, there is always more that can be done, even with few resources available. At challenging times, this is a promising path worth investigating...

[Of course, many, many others have voiced (a variety of) thoughts on this topic in the past. The video below is from the TEDx talk by Dave Meslin. Some of his points have only local relevance, but most apply rather widely.]