Sunday, 12 December 2010

Food Designer: Where innovation becomes edible

A drop of inspiration
by Tammy Bogestrand
under BY-NC-ND
In most productive sectors, product innovation is a major driver of the market. Take the automotive industry, for instance: innovative designs and new technologies that affect safety, performance or cost are all evaluated by consumers and inform their choices.

As with any rule, exceptions exist. Take the tourism industry: there, the consumer often wants to re-live an experience; at least, a considerable part of the market revolves around experiencing tradition and cultural heritage.

The food industry lies somewhere in between. Policy makers tend to treat it as a traditional sector, although that trend is not entirely consistent, and so do most consumers. "Food", as a word, doesn't automatically link to "novelty" or "innovation". That doesn't mean there is no innovation in the sector; on the contrary. While technologies that ensure safety and quality have been in place for a very long time (thermal processing, the use of salt or smoke for preservation, and food fermentation are all ancient breakthroughs), further progress is ongoing.

Much of the innovation in the food sector is under the hood, transparent to the consumer. Take high-temperature short-time (HTST) pasteurisation, for instance. While not a recent piece of innovation, it is commonly applied to liquid products such as milk or juices and, together with aseptic packaging, can yield products with remarkable shelf life without sacrificing the nutritional characteristics of the raw ingredients (well, the latter is not 100% accurate, but the losses are minor). Depending on the local labelling legislation and its implementation, the consumer may be unaware of the pasteurisation technology employed.
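To give a feel for the arithmetic behind such thermal processes, here is a minimal sketch of the classic D-value/z-value calculation. The numbers below (D = 3 s at 72 °C, z = 7 °C, a 15-second treatment) are illustrative placeholders for a hypothetical organism, not data for any particular product:

```python
def log_reductions(d_ref, z, temp_ref, temp, time_s):
    """Decimal (log10) reductions achieved by holding at `temp` for
    `time_s` seconds, given a D-value `d_ref` (seconds) at reference
    temperature `temp_ref` and a z-value `z` (degrees C)."""
    # The D-value shrinks tenfold for every z degrees above the reference.
    d_at_temp = d_ref * 10 ** ((temp_ref - temp) / z)
    return time_s / d_at_temp

# HTST-style treatment: 15 s at 72 C vs. the same 15 s at 63 C.
print(log_reductions(3.0, 7.0, 72.0, 72.0, 15.0))  # 5.0 log reductions
print(round(log_reductions(3.0, 7.0, 72.0, 63.0, 15.0), 2))
```

The point of the sketch: a few seconds at the higher temperature achieves in microbial terms what a far longer treatment would at the lower one, which is exactly why HTST can preserve nutritional quality.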
It's only when a food product is marketed as innovative that consumers will establish it as such and associate it with the brand. Energy drinks are a good example: consumers are likely to be informed of the innovation involved and aware of the various differences across the products currently available. Products containing stanol or sterol esters (which can lower cholesterol levels) are another such example.

Unlike other industries, the food industry hosts very diverse views when it comes to innovating. Few would object to employing innovation for the benefit of enhanced safety. However, even there barriers exist. For instance, food irradiation has never gained wide acceptance - at least not in Europe. Also, technologies that affect any sensory property, making the product diverge from the established norm, are likely to be met with skepticism. That has been one of the hurdles for high-pressure pasteurisation, which in some cases affects the colour of the treated foodstuffs. Innovation in the food sector is also a question of ethics, as well as being subject to the specific food law.

Interestingly, however, a wave of industry professionals is working towards innovation that will be clearly visible to the consumer. So-called "functional foods" are one such example. "Minimally processed" food is another, where the innovation lies in the method of safe delivery rather than in the formulation. The whole food experience is studied by an emerging class of "food designers". The modern way of living, at least in the big, busy cities of the world, poses several challenges to food producers and gives ground for further thinking. Effective and handy food packaging that is kind to the environment, portions that are "right", variety in flavours and nutritional balance, the food experience at a catering venue - all these are examples of the challenges on the table.

Food designers certainly have a lot to deal with, not only from the scientific or technological point of view but also from the social one. Food has always been a social element, and that isn't going to change much any time soon. Reconciling innovation with societal perceptions of food and food preparation is, for sure, challenging. A good side effect is that any innovation that finally reaches the consumer is likely to be a more "mature" one, which is a good thing when it comes to playing with nutrition and food.

One thing is certain, though: food attracts attention. Or at least gastronomy does. The weekly TV schedule in Greece includes at least six gastronomy-related shows, which collectively manage to get a fair share of the viewing audience. Of course, unlike technological innovation, gastronomic innovation is more familiar to the consumer. It is a kind of creativity within the reach of every one of us. Messing with flavours, recipes and dishes can be part of the social game, too. Can food designers do something similar with the other aspects of food innovation? Is there a way for technological innovation to gain social consensus (ethics included) early in the product development process?

Food innovation doesn't mean that we should forget about the traditional qualities of food, both raw and processed. It means that we, consumers, should learn about what we eat, both the good and the not-so-good sides, and learn how to make informed choices. To that end, relying solely on the industry to provide such education to the extent needed doesn't make sense. Legislation and formal education can help but, again, they are no panacea. When it comes to such knowledge, consumers should take the initiative themselves.

Sunday, 24 October 2010

The apple from the edge of the world

Eating an apple by Sean
under BY-NC-SA

On 1 October a warning was posted on the site of the Food Standards Agency (UK) advising food operators that shipments of apples imported from Chile had been found to contain morpholine (at about 2 ppm) and that those apples should not be sold on the UK market. Similar advice was given by food safety authorities across the EU and - where necessary - recalls were initiated.

In various places around Europe, the news got attention, as happens with any news item related to food safety. In this particular case, however, the warning/recall drill from the food safety authorities was due to a legal reason rather than the appearance of a major threat to consumer safety. Morpholine (1-oxa-4-azacyclohexane) is a chemical used in a variety of applications, the major one being corrosion protection in water tanks and steam production systems. The morpholine molecule is amphiphilic, which also makes it a good emulsifier. As such, morpholine can be added to the wax mixtures used by the produce industry for coating fruits like apples, citrus fruits, etc., making those mixtures easier to spread on the fruit.

Applying coating mixtures to fruits post-harvest is a common practice. Such coatings hinder water evaporation and protect the fruit from various environmental factors, thus prolonging shelf life. Fruits do have a natural water-repelling coating; often, though, this gets damaged or removed by post-harvest handling. It's good to keep in mind that such coatings cannot make a fruit fresh; they just protect the freshness of the fruit.

The use of morpholine in wax coatings is not allowed in the EU, but it is permitted elsewhere in the world (e.g., the USA, Canada, Chile). The EU has chosen the safe route here. The concern is not so much about morpholine itself, for which studies (examples here and here) do not indicate significant toxic, mutagenic or teratogenic action, but rather about its nitrosated derivative, which has been shown to be mutagenic and carcinogenic in lab animals. The conversion of morpholine to its nitrosated derivative can happen in the body in the presence of nitrites. Even then, current risk analysis scenarios, as hinted in the FSA statement, suggest a very (very) small risk for the consumer. Peeling the apples would remove the risk factor altogether but - to be fair - plenty of consumers prefer eating them unpeeled; a normal rinse would not affect the coating and would not remove the morpholine it contains.
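To put the 2 ppm figure from the FSA note in perspective, here is a back-of-the-envelope exposure estimate. The apple mass and body weight are illustrative assumptions, and the worst case (all of the coating ingested) is assumed:

```python
def daily_intake_mg_per_kg(conc_ppm, food_g_per_day, body_kg):
    """Crude exposure estimate: ppm here means mg of substance per kg of food,
    so intake = concentration * food mass, scaled by body weight."""
    return conc_ppm * (food_g_per_day / 1000.0) / body_kg

# ~2 ppm morpholine, one 150 g apple a day, 70 kg adult, nothing rinsed off.
exposure = daily_intake_mg_per_kg(2.0, 150.0, 70.0)
print(round(exposure, 4))  # mg per kg body weight per day
```

Even in this deliberately pessimistic scenario the figure is on the order of a few thousandths of a milligram per kilogram of body weight per day, which is why the authorities framed the issue as legal compliance rather than acute danger.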

As in every food safety discussion, voices calling for a review of the handling of morpholine-containing, wax-coated fruits have surfaced. After all, food law is not necessarily a reflection of scientific findings. It's politics, too. It's very much a balancing game, with risk on one side and benefit on the other (and I'm talking about all kinds of risks and benefits, not just those concerning human health). Often, law makers take sides based on (their) common sense.

IMHO, not using a chemical that may pose a risk when ingested - no matter how inconceivably small - makes sense, especially when there is a list of down-to-earth, food-grade compounds that could achieve the same technological purpose (e.g., lecithin, fatty acid esters). Edible coatings may still be a rather active scientific niche, but there are already proven formulations that could be used. Normal packaging materials could also be sufficient.

Going beyond the scientific/technological discussion, I believe that for the food operator things are rather easy. Knowing what the law dictates is the way to keep their business from being disrupted; simply put, that's the way to compete at the local level. I'm not saying that monitoring food law across the globe is an easy task, although things have been slowly improving in that field: the EU and several other countries offer online access to legal documentation. Beyond that, there are officers who can be asked and professionals who can help. In any case, such challenges are part of the global commerce game.

Sunday, 17 October 2010

The taste of silence

savory silence
Savory Silence by Josh Liba
under BY-NC-SA
(Alternative title: "Tasteless food? Quick! Get those earplugs on!")

Recently, BBC News ran an article on the work of Woods et al. titled "Effect of background noise on food perception" (published in 'Food Quality and Preference').

The study received particular attention from the press, both at home and abroad. While the idea that the senses are inter-correlated is part of popular belief (e.g., the link between impaired vision and sharpened auditory perception), the study points to everyday effects that popular wisdom did not normally attribute to an interaction between the senses.

The scientific paper demonstrated that the presence of background sound affects the perceived sensory properties of food: gustatory properties (taste, e.g., saltiness, sweetness) were diminished, while auditory properties (e.g., crunchiness) were enhanced. The press extrapolated to the example of in-flight meals, which are commonly described as 'tasteless'. However, if the observations of the study hold, the everyday effects could be of much greater importance.

Although tempted, I'll skip the case of the restaurant environment (but I do wonder: could a quieter eating environment make a chef's creations tastier?) and, instead, share a few thoughts on the office environment.

The modern, urban environment most of us live and work in tends to be noisy. I don't know whether the effect of background sound is a function of its intensity (I would assume so, possibly with a cut-off level below which no significant effect on taste perception would be observed) but, please, think about it for a second: typical office chatter can reach 65 dBA, a properly maintained PC is at about 45 dBA, a ringing phone can be at about 75 dBA, and a printer can be between 60 and 75 dBA. For comparison, a quiet room is at about 35 dBA, a lawn mower at about 90 dBA, and a crying baby can reach 110 dBA. In-flight cabin noise levels are between 70 and 85 dBA, depending on the type of aircraft, flight phase, cruising speed, location of the measurement point, etc. Thus, while not directly threatening to the human auditory system, the office environment is certainly not quiet.
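A detail worth noting about the dBA figures above: decibel levels of independent sources do not add arithmetically but logarithmically, so a chatter-filled office with a printer running is only a few dB louder than the chatter alone. A quick sketch of the standard combination formula:

```python
import math

def combine_dba(levels):
    """Total level of several incoherent noise sources:
    L_total = 10 * log10(sum of 10^(L_i / 10))."""
    return 10 * math.log10(sum(10 ** (level / 10) for level in levels))

# Office chatter (65 dBA) + PC (45 dBA) + printer (70 dBA):
print(round(combine_dba([65, 45, 70]), 1))  # ~71.2 dBA, dominated by the printer
# Two equal sources only add 3 dB:
print(round(combine_dba([60, 60]), 1))      # ~63.0 dBA
```

This is also why removing the single loudest source (here, the printer) is usually the most effective first step in noise control.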

Attempting to extrapolate the study to the practical effects on food consumption in an office environment now becomes interesting: existing noise levels may be pushing employees to use more salt or sugar to reach the taste intensity they are used to experiencing at home. In an era where both salt use and sugar consumption are under fire for their contributions to high blood pressure and obesity, respectively, the auditory environment around us may be pushing in the wrong direction. Although it would be rather hasty to urge action based on limited evidence, the link between the sound environment and nutrition-related choices is something that should be looked into. In any case, if one takes into account the other health risks of office noise exposure, it becomes evident that noise control may be of higher priority than commonly thought.

In the majority of cases, reducing background noise levels is neither costly nor technically challenging. Simple measures, like relocating noisy equipment, encouraging people to use earphones (instead of loudspeakers), or using sound-damping/diffusing office space dividers, may be a good start. However, in cases where space is precious and the convenience of private offices cannot be afforded, help from an expert should be sought. After all, it is a question of both health and productivity!

(BTW, what about air quality and food sensory perception???)

Sunday, 6 June 2010

Language twists: the case of N-(L-α-Aspartyl)-L-phenylalanine, 1-methyl ester

A few days ago, a case involving Asda, a UK supermarket chain, drew my attention. As mentioned in the FoodNavigator, Asda and Ajinomoto, a producer of food ingredients, entered a court battle on the use of the word "nasty" for, amongst others, aspartame (actually, aspartame was indirectly referred to as one of the "hidden nasties"; Asda produces private label products claiming to contain "no nasties").

It seems that the first court ruling allowed Asda to use a generic term such as "nasty" for foodstuffs, holding that it did not constitute a specific "malicious falsehood". However, after Ajinomoto's appeal, a second ruling held that the word "nasty" carries multiple meanings, one of which is damaging to aspartame products.

Although the case seems to be ongoing, it is certainly interesting to think about the possible outcomes. Asda claims to be using terms that echo consumers' concerns. So, the "no nasties" claim actually translates to the fact that a product labelled as such contains none of the ingredients consumers tend to think of as "nasty". Interestingly, that approach requires little or no scientific backing. On the other hand, "nasty" carries a negative meaning, which might affect the choices of consumers who may think that the term is based on scientific evidence.

The food sector is, unfortunately, not free of controversies, and the case of aspartame is recorded as one of them. Having said that, aspartame is an approved sweetener (also known as E951 in Europe), which means that there is sufficient scientific proof that it is safe to use within the limits and for the uses specified. By the way, Asda is not disputing aspartame's safety.

What worries me is that the case is, in a way, a question of whether free speech can apply in the market environment. Indeed, consumers have preferences, which are not always based on scientific facts and, historically, have not always been right. So, is it OK to mirror those beliefs on products?

Well, in a way, that has been done before. Think of the various certification signs that appear on food products. Some of them certify qualities that have little to do with the actual safety or nutritional content of the food. For instance, think of the PDO labels, which indicate that a product was made in a specific geographical area, or the TSG label, which guarantees the "traditional character" of the product. Those labels often attract the consumer, to the benefit of products that carry them. The difference with the "no nasties" label is that those signs are awarded after a certification process.

So the question becomes: could there be a certification process for a "no nasties" (or equivalent) sign? In theory, why not? National legislation across Europe allows for claims such as "no preservatives", "no artificial colourings", etc. A similar trend applies to foodstuffs with natural flavouring. However, it would be safe to assume that the industry behind the ingredients concerned would react, possibly on the basis of established safety or on the basis of offering choices and health benefits to specific consumer groups.

The key here, as always in the food world, is for the consumer to be in a position to understand what a claim on the packaging means. And that key principle applies - IMHO - to all claims, legally established or not. It is no coincidence that Regulation (EC) 1924/2006, regarding nutrition and health claims on foodstuffs, requires food manufacturers to ensure that their claim is understood by the average consumer and, also, to provide additional relevant advice together with the claim.

Some say that the food labels of the future will have a lot of things to read. But then again, making usable information available to the consumer could be a way to improve the food choice mechanism and the food-associated wellbeing. Time will tell, I guess....

Tuesday, 4 May 2010

Real challenges; virtual worlds; working solutions

We spend a fair amount of our daily lives in problem solving. From child-raising tasks like figuring out why our babies cry or devising reliable ways to communicate with adolescents, to working environment challenges such as re-assigning resources to meet product demand or identifying alliances to go after a tender, it seems that every step in our life is a step into an infinite tangled web of missions, target objectives, challenges, obstacles and desires.

Please do not misunderstand me. This is no cry of despair. It is just the basis of the idea that, since problem solving is the way to push one's life forward, investing in better problem-solving methods - at any level and for any type of problem - is a very useful thing to do.

I'm hardly the first one to express such an opinion. I'm not even close (chronologically) to the group of avant-garde thinkers who first started playing with the concept of problem solving. Still, though, the idea of transforming problems (in the realm of mathematics - e.g., the Laplace or Fourier transform), converting them from incomprehensible challenges into manageable bits, has intrigued me since my student years (at least).

Talking about transformations in mathematics, what makes things even more interesting is that it is exactly those transformations that have enabled us to achieve plenty of technological wonders, many of which tend to be taken for granted today (think of communications and anything that makes use of noise filtering or signal processing, for instance).
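As a small illustration of how such a transform turns a messy problem into a manageable one, here is a minimal, pure-Python sketch of noise filtering in the frequency domain. The signal, the cutoff and the naive O(n^2) DFT are all illustrative choices; real applications would use an FFT library:

```python
import cmath
import math

def dft(x):
    """Naive discrete Fourier transform (O(n^2); fine for a demo)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(spec):
    """Inverse DFT, returning the real part of each reconstructed sample."""
    n = len(spec)
    return [sum(spec[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)).real / n
            for t in range(n)]

# A slow sine (1 cycle over the window) buried under a faster one (12 cycles).
n = 64
signal = [math.sin(2 * math.pi * t / n) + 0.5 * math.sin(2 * math.pi * 12 * t / n)
          for t in range(n)]

# Low-pass filter: transform, zero every bin above the cutoff, transform back.
cutoff = 4
spectrum = dft(signal)
filtered_spectrum = [c if (k <= cutoff or k >= n - cutoff) else 0
                     for k, c in enumerate(spectrum)]
filtered = idft(filtered_spectrum)  # now close to the slow sine alone
```

In the time domain, separating the two sines is awkward; in the frequency domain, the "noise" occupies its own bin and deleting it is trivial - which is the whole point of working in the transformed space.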

Interestingly, opportunities for creative problem solving also exist beyond the world of mathematics. Playing games, for instance, is nature's way of teaching young individuals the skills/tricks of life that couldn't be hard-written into the genetic code. The idea of gaming as a means of teaching and skills improvement has been explored via the serious game concept, which - in the modern computer era - is supported by the Serious Games Initiative and embraced by numerous developers and users throughout the world.

However, gaming may have a much greater potential than simply helping with skill development. It has been suggested that gaming environments presenting real-life challenges to players can help people devise and try out solutions that might have looked crazy in the real world, yet may turn out to be feasible there. Dr. Jane McGonigal, in a very inspiring TED speech of hers (see the video below), suggested that encouraging people to spend time in collaborative online gaming may be a good way of pooling the problem-solving capacities of individuals across the globe, for the benefit of reaching more effective solutions in shorter times.

Are we in a position to fix reality with online games? I believe that we are actively exploring those options. Simulation, after all, is a very valid scientific approach. Today, simulation is used for everything from training to the design of buildings to the assessment of the impact of pharmaceutical compounds; such tools can considerably speed up practical innovation against actual challenges and limitations.

Having Ender's Game in mind (and having enjoyed the Matrix trilogy :-), I do believe that virtual worlds have a lot to offer to real people, BUT I also wonder how much longer the barriers between the real and virtual worlds will stay thick enough for us to validly tell the difference.

Friday, 26 March 2010

"And finally, Monsieur, a wafer-thin mint." *

Update (28/03/2010): I've just read on BBC News (again) that the Food Standards Agency in the UK is asking food manufacturers to start making smaller packages of unhealthy snacks available, as well as to cut saturated fat in foods like biscuits and cakes.

I was really intrigued by the BBC News article regarding the change in portion sizes depicted in the Last Supper through time. The article said that, over time, the amount of food present on the Last Supper table has increased and the depicted portion sizes have grown larger!

I always suspected that we eat more than people once used to. And I could imagine that everyday visual stimuli (read: advertisements) would point in that direction. But I wouldn't have guessed that even classical artistic themes, such as the Last Supper, would be affected by what we tend to do in our everyday lives.

The evolution of portion size is coming more and more into focus. People seem less skilled at properly controlling the amount of nutrients they take in. And that is understandable. Food was - and still is - a human need. But it's also an element of social life. Not to mention that a range of disorders can affect food intake in terms of both quality and quantity.

One of the trends at the interface between the food industry and consumers is the increase in the amount of information food producers make available. Food labelling regulations in Europe do not require the provision of nutritional information unless a nutrition or health claim is made on the food label. However, more and more often, such information is featured on packaged food, while catering businesses have begun adding it next to their dishes on the menu or on the packages of take-away food. According to the Examiner, the new health bill in the United States may make displaying calorie content compulsory for restaurants, which is certainly an interesting development.

Consumers have - at times - expressed confusion over the various food labelling schemes. In Europe, the GDA nutrition labelling scheme is gaining popularity amongst food producers. The GDA scheme informs the consumer about one or more nutritional attributes of a food product, referring to the absolute and relative content of a portion of the said foodstuff, taking into consideration the Guideline Daily Amount for that nutrient. Most implementations of the GDA scheme include references to calories, sugars, fat, saturates and sodium. However, the list is being enriched with additional elements.
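The arithmetic behind a GDA panel is straightforward: the absolute content of a portion is divided by the guideline daily amount for each nutrient. A minimal sketch, where both the reference values and the crisps figures are illustrative assumptions (they are close to, but not quoted from, any official table):

```python
# Illustrative adult GDA reference values; treat the exact figures as assumptions.
GDA = {"energy_kcal": 2000, "sugars_g": 90, "fat_g": 70,
       "saturates_g": 20, "sodium_g": 2.4}

def percent_gda(per_portion):
    """Absolute nutrient content per portion -> percentage of the GDA."""
    return {nutrient: round(100 * amount / GDA[nutrient], 1)
            for nutrient, amount in per_portion.items()}

# A hypothetical 40 g portion of crisps:
print(percent_gda({"energy_kcal": 212, "fat_g": 13.6, "sodium_g": 0.3}))
```

The catch, as discussed below, is that the "per portion" figures are only as meaningful as the portion size the producer chose to print.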

Part of the debate at the food industry stakeholder level is what to do with portion size. There is no strict, formal definition of "portion size": every food producer can choose what they judge appropriate for their product. But critics argue that setting the portion size on the label of a 100 g pack of crisps to 40 g is simply not realistic. I couldn't agree more. Yes, it might be prudent to eat only 40 g of crisps but, come on.... No, I'm not trying to hold food companies responsible for my extra kilos of body weight. But I, too, believe that "portion size" in everyday life is defined more in terms of convenience than of nutritional content. In other words, the portion is the bag of crisps, not the amount that I should consume based on the nutritional profile of the said crisps.

It is a fact that people can make considerable errors when fixing a portion for themselves. Even the size of the plate the food is put onto may have an effect on how much people consume; the Small Plate Movement is encouraging people who want to lose weight to use smaller plates.

The debate on portion sizes is certainly no mere philosophical discussion. There are implications at many levels. Risk assessment, for instance, is done on the assumption of a certain portion size (although, depending on the case, a "large portion" may be taken into consideration) - see here for an example of how risk assessment is made. Assuming a smaller portion size may lead to a lower projected risk.

Indications that consumers sometimes adopt a one-package = one-portion approach raise nutritional concerns, especially when the foods in question can contribute to a higher calorie intake or feature a high glycaemic index. In the case of people with existing disorders, such as diabetes or cardiovascular problems, the issue escalates into a health concern.

And then there is the possible impact on food prices and the environment because of the need for additional packaging material (plus a higher transport cost per food unit, etc.).

However, I admit, portion size alone is no panacea for the nutritional problems of - mostly - the western world. Surely, in the good nutrition game, consumer education, ideally from a young age, cannot be ignored. After all, eating well is much easier when it becomes a widely practiced habit.

* A line from a rather disturbing scene featuring the dialogue between the Maître D' and Mr. Creosote in Monty Python's The Meaning of Life.

Monday, 15 March 2010

The science in the (kitchen) cupboard

Photo of flying pop-corn

It is interesting that when people want to describe something complex they refer to it as "rocket science". On those grounds, I guess introducing oneself as a "rocket scientist" (or as an aerospace engineer) on most social occasions would cause plenty of heads to turn. Now, I wonder if introducing oneself as a "food scientist" would have the same effect.....

I admit - I've never tried it. But my gut feeling is that "food science" scores really low on the coolness scale most people maintain. And that is totally unfair!

Food science is not an isolated island in the sea of knowledge. Instead, it would be better described as a large group of islands: food science is chemistry, physics and biology applied to systems of considerable complexity. I know, it doesn't sound convincing. So let me give you a couple of simple examples:

Example 1: Pop-corn physics. You know the story: you take dry corn seeds; you throw them into a saucepan with a bit of oil; you warm the saucepan; after a while the corn seeds violently explode into yummy white-ish flakes. So what has happened? Well, basically, two things: firstly, the water inside the seed turned into steam which, when heated up to about 170-200 °C, raises the pressure inside the corn seed to very high levels, until the seed hull finally breaks with an explosion. Secondly, the starch inside the corn seed changes its structure to a higher-volume "jelly" form. When the seeds explode, the steam-starch mixture breaks loose and rapidly expands; at the same time, the steam escapes to the atmosphere, leaving behind the familiar, starch-made, foamy structure that we call pop-corn.
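Just how high does the pressure inside the kernel get? Assuming the trapped water stays at its saturation pressure, the Antoine equation gives a rough estimate. The constants below are the commonly tabulated values for water in the roughly 100-374 °C range; the calculation is a sketch, since a real kernel is not a sealed vessel of pure water:

```python
# Antoine equation for water vapour pressure: log10(P_mmHg) = A - B / (C + T_celsius).
# Constants from standard tables for the ~100-374 C range.
A, B, C = 8.14019, 1810.94, 244.485

def steam_pressure_atm(temp_c):
    """Approximate saturation pressure of water at temp_c, in atmospheres."""
    p_mmhg = 10 ** (A - B / (C + temp_c))
    return p_mmhg / 760.0  # 760 mmHg = 1 atm

# Around the typical popping temperature range:
for t in (170, 180, 200):
    print(t, round(steam_pressure_atm(t), 1))
```

The result is on the order of 8-15 atmospheres, which is consistent with the violence of the pop: the hull acts as a tiny pressure vessel until it fails.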

Example 2: Corn flour physics. Corn flour is a rather popular ingredient. In cooking, water-dispersed corn flour often functions as a thickening agent which is thin-flowing when warm but assumes a jelly-like behaviour as it cools. Kids are a bit more familiar with corn flour slime, which is unarguably considerably more fun: just slowly start adding corn flour to a bowl with a bit of water, while stirring to ensure homogeneity; once you reach equal amounts of water and corn flour, the mix will become thicker; keep adding corn flour slowly and you will reach a point (which depends on the type of corn flour you use and the temperature) where the mix changes to a "solid" form under rapid stirring and melts back to liquid form when the stirring stops.

Why that strange behaviour? Well, corn flour contains a lot of corn starch. Starch is a carbohydrate comprised of many glucose molecules linked together, forming a long chain. Starch chains also feature smaller glucose chains (branches) attached to the main chain. When in solution, neighbouring starch chains can interact with each other. At a certain starch concentration, stirring or agitating the solution makes the individual starch chains collide and - briefly - stick to a high number of nearby chains, thus creating solid-looking blobs of starch. When the agitation stops, the chains go back to their original, "untangled" state, giving the solution its liquid-like look.

Corn-starch suspensions are non-Newtonian fluids, which at certain starch concentrations behave as dilatant (shear-thickening) ones. As you may suspect, depending on the corn-starch concentration and the agitation frequency, a number of cool effects can be seen. Check out the video below:
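The shear-thickening behaviour can be sketched with the power-law (Ostwald-de Waele) model, where the apparent viscosity depends on the shear rate raised to the power n - 1. The consistency index K and the exponent n below are illustrative placeholders, not measured values for any real suspension:

```python
def apparent_viscosity(k, n, shear_rate):
    """Power-law (Ostwald-de Waele) model: eta = K * (shear rate)**(n - 1).
    n > 1: shear-thickening; n < 1: shear-thinning; n = 1: Newtonian."""
    return k * shear_rate ** (n - 1)

# Illustrative parameters for a thick corn-starch suspension (note n > 1):
k, n = 0.5, 1.8
for rate in (1, 10, 100):  # shear rate in 1/s
    print(rate, round(apparent_viscosity(k, n, rate), 3))
```

With n > 1, the apparent viscosity grows as stirring gets faster, which is exactly the "solid under rapid stirring, liquid at rest" behaviour described above.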

Beyond plain fun, fluids with such properties have a wide range of practical applications, from power transmission in mechanical systems to enhanced performance in bulletproof vests (which traditionally employ polymer fibres).

I guess that the humble corn flour doesn't look that naive anymore, does it?


Sunday, 28 February 2010

Food in times of crisis

airdrop of humanitarian aid

Natural disasters do strike. And when they do, man-made infrastructures are not guaranteed to survive. While people can - in some cases - be protected by early warning systems and effective evacuation plans, it is the magnitude of disruption to utilities (water, electricity), transportation and communication infrastructure that will affect the final size of the disaster's cost (in lives and in money).

With two such cases very recent - the earthquakes in Haiti and in Chile - a number of questions come to mind: how can one effectively get food and water to people in need? What kind of food? Should people rely on airdrops, or should people/local authorities maintain a small stockpile of supplies for such cases? One could argue that food (and water) are perishable goods, which makes keeping stocks a - possibly - expensive exercise. But, on the other hand, the availability and quality of those supplies are essential for the health, welfare and morale of people affected by the disaster.

Interestingly, there is an ongoing project on "Food For Crises: Developing an option for humanitarian aid"; the project is the objective of a joint thesis within the European Masters Degree in Food Studies. The students assigned to the project are trying to design food "products" suitable for distribution in such crisis situations. They are taking into consideration nutritional value and health benefits, health risks (e.g., allergies) and labelling, cultural parameters (e.g., regarding the composition of food), shelf life and ease of use, suitable packaging to allow for rough handling, etc.

When it comes to such purpose-designed food, cost is an important parameter. The technology to achieve all the objectives mentioned above in a single, tailor-made foodstuff does exist, but it may drive the cost to prohibitive levels. It is essential to employ technologies that are either easy and cheap to implement or able - in parallel - to find wide commercial application, which can help drive their cost down quickly. Fermented foodstuffs using select microorganism strains are an example of a technology that is easy to implement (although many of the fermented foods we consume on a daily basis are far from ideal for distribution in times of crisis). Packaging technologies belong to the second category.

Then, there is the question of nutritional content. Ideally, each consumer group would have its own emergency food. But in practical terms, I feel that this would be unrealistic. It is not so much a problem of cost but rather of how to manage such supplies, both when designing the shipment of humanitarian aid and when receiving it. Especially in harsh conditions, where the recipients are the people directly affected, an organised, individualised, nutrition-based distribution is next to impossible.

I am no expert on the issue and, I admit, I have little knowledge of how humanitarian aid missions are designed and implemented. But I believe that revisiting the topic - at least the foodstuff part of the equation - may be a productive exercise, possibly indicating cost-effective ways in which the foods we have now can be improved with regard to any of the parameters that would matter in a disaster situation (shelf-life, nutritional content, cost, ...). In a somewhat similar direction, the EC recently had a call on "Health-value-added food products for population groups at risk of poverty" (KBBE-2010 general call). It will certainly be interesting to see the directions that the people who get the project on that topic will follow!

Saturday, 27 February 2010

Sunday, 21 February 2010

Food choice - a reading game

Fondant & Ice cream
Nutritious food; gourmet food; fast food; healthy food; baby food; convenience food; organic food... Food is a human need tightly integrated with most sides of our social existence. In several places around the globe (but not everywhere), people have access to a considerable variety of foodstuffs, while new products pop up on a daily basis, often dynamically co-existing with traditional ones on nearby supermarket shelves.

There, consumers have the chance to choose. A number of factors are known to come into play, including biological, economic and social ones. Understanding the process of making a food choice is certainly a hot topic for the sector these days. And it's not only about marketing pressure, as you may think. Surely, the food industry would love to make products that are (or can become) more appealing to consumers. But since food is closely associated with other things, like health, it would be really useful if the choices people go for were also "healthy" ones.

But there is a thin line somewhere there! Yes, food does affect the functions of the human body. Although research is still ongoing, there is clear evidence that food and the function of the nervous system, of the immune system and of the metabolism - to name a few of the systems/ processes of the human body - are related. But to what extent can food, on its own, prevent or cure diseases? If a food-health link is substantiated for a specific foodstuff, could food producers go ahead and inform the consumer on the health benefit of that food?

In Europe, nutrition and health claims are governed by Regulation (EC) 1924/2006. That Regulation places restrictions on what can be claimed on a food label and provides templates for a number of claims. Any health claim made on a food label must be true, not misleading and clearly understood by the average consumer; the claimed benefit should be achievable by reasonable consumption (specified by the producer); the claim must not imply that by not consuming the food in question the consumer's health will be negatively affected; and it should be accompanied by notes on the importance of a healthy, varied diet and a healthy lifestyle, and warn consumers of potential hazards associated with excessive consumption.

Regarding health claims, the Regulation distinguishes several categories:
  • Health claims that have to do with the general function of the organism
  • Health claims that refer to psychological or behavioural function
  • Health claims regarding slimming, satiety control, etc.
  • Health claims on the reduction of the risk of a disease or the health or development of children
The authorisation of each new claim depends on the category it falls under. However, in any case, claims examined by the European Food Safety Authority (EFSA) need to be sufficiently substantiated by scientific evidence, strong enough to demonstrate a cause and effect relationship between the nutrient or food that carries the claim and the claimed benefit. Don't be mistaken on that; that is no trivial task (e.g., check out the EFSA panel's recent opinion on an application for a health claim of a product containing cranberry extract, or for the function of phospholipids).

Clearly, the law offers - in a controlled way - opportunities for food producers to advertise to consumers the health benefits that foodstuffs may contribute to. Critics exist in both camps: for and against health claims. However, few can ignore the fact that consumers today have access to ever more information on what they eat. All one needs to do is take the time to read a label. Although - as some fear - we may be facing increasingly long food labels in the years to come!

Sunday, 31 January 2010

Lights, camera, gourmet shopping...

Sunset cow
Slashdot editors post on food research again! This time their post links to a story covering the work of researchers at the Gifu Prefectural Livestock Research Institute, who managed to rate high-quality Japanese beef (Hida-gyu) using an infrared camera. Well, to be fair, this is work in progress, since the success rate of this optical evaluation method is not too high yet (about 60%), but the aim is too ambitious to ignore: after refinement, it could allow consumers to use cheap cameras, perhaps the cameras that mobile phones are already equipped with, to "measure" the sensory profile of beef cuts at the supermarket before buying.

The article does not provide an extensive scientific background; it seems that the infrared image can reveal information on the oleic acid content of the meat, which can be associated with desirable sensory parameters such as tenderness, flavour and overall taste.

I admit that I'm totally fascinated by the idea. Smart tags have been a focus of food researchers for a long time. At the beginning, the objective was traceability. In that sense, RFID tags would be able to hold all the necessary information to identify the origin of the product. But as their data-holding capacity improved and the cost began to drop (although it is still prohibitive for most general uses), new applications came up. Time-temperature integrators, for instance, which can easily accompany RFID circuits, can feed already developed mathematical models that estimate the microbiological or quality status of certain foodstuffs. Such systems allow the packaging to provide feedback to the consumer.
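The kind of model a time-temperature integrator can feed is easy to sketch. Below is a toy Arrhenius-type shelf-life calculation in Python; all constants (activation energy, reference spoilage rate) are invented for illustration, not taken from any real product study:

```python
import math

# Illustrative Arrhenius-type shelf-life model: the spoilage rate k(T)
# grows exponentially with temperature. All constants are assumptions
# made for this sketch.
EA = 80_000.0   # activation energy, J/mol (assumed)
R = 8.314       # gas constant, J/(mol*K)
K_REF = 0.05    # spoilage rate (1/day) at the reference temperature (assumed)
T_REF = 277.15  # reference temperature, K (4 degrees C)

def rate(temp_c: float) -> float:
    """Spoilage rate (1/day) at a given temperature in Celsius."""
    t = temp_c + 273.15
    return K_REF * math.exp(-EA / R * (1.0 / t - 1.0 / T_REF))

def fraction_used(history) -> float:
    """history: list of (hours, temp_c) segments from the tag's log.
    Returns the fraction of the shelf-life 'budget' consumed, where
    1.0 (here, 20 days at 4 degrees C) means end of shelf-life."""
    return sum(rate(t) * hours / 24.0 for hours, t in history)

# A cold-chain log with a warm transport segment in the middle:
log = [(48, 4.0), (6, 20.0), (24, 4.0)]
print(f"{fraction_used(log):.0%} of shelf-life used")
```

A short warm excursion dominates the result, which is exactly why such tags are more informative than a simple "days since packaging" date.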

So far, however, such systems have required investment on behalf of the industry, the cost of which could have an impact on product prices and - at a second step - on consumer preferences. This time, the researchers are working on a scenario that uses (mostly) already available technology. I assume that an additional light source and IR filters may need to be employed in order to get readings from a cellphone camera. On top of that, one would also need specific software and - possibly - some means of calibration. Still, none of those are unrealistic given the computing power of modern mobile phones.
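The calibration step I mention would presumably map a camera reading to a lab-measured oleic acid value. A minimal sketch of that idea, using a plain least-squares line fit; the data values and the assumption of a linear relationship are entirely invented for illustration:

```python
# Toy calibration: fit y = a*x + b mapping mean IR pixel intensity (x)
# to lab-measured oleic acid content in percent (y). All numbers below
# are invented reference data, not real measurements.
def fit_line(xs, ys):
    """Ordinary least-squares fit, returning slope a and intercept b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Hypothetical reference cuts: IR reading vs. lab-measured oleic acid (%).
ir_readings = [0.42, 0.51, 0.63, 0.70, 0.81]
oleic_pct = [44.0, 46.5, 49.8, 51.6, 54.9]
a, b = fit_line(ir_readings, oleic_pct)

# Estimate the oleic acid content of a new cut from its camera reading:
print(f"estimated oleic acid: {a * 0.58 + b:.1f}%")
```

A real system would of course work on spectral bands and many pixels per cut, but the principle - a calibration curve built from lab references - stays the same.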

I am really curious to see what will happen when that technology "hits the market". I would expect many more applications to follow that path of development. Also, I wouldn't be too surprised if meat producers started pre-marinating their cuts in olive oil solutions :-)

Interestingly, I have just realised that until now I considered infrared photography to be just another beautiful - yet geeky - kind of art. Well, that is about to change!

Sunday, 17 January 2010

The knowledge in the closet

What's in your closet - hanging by their hooks...
14/01 was a big day for the agro-, food- and bio- people around the world going after European research grants (it was the deadline for the 2010 KBBE call of FP7 - a public research funding instrument in Europe, giving about 53 billion euros to research in the period from 2007 to 2013).

The persistently pending question, however, is: what happens to the results of all that research? Understandably, not all research efforts are successful; and even when they are, they don't necessarily lead to tangible results. There's really nothing wrong with that; research is a venture into the unknown (well, actually a venture into the not fully known, but let's not dwell on that for the time being) and thus has associated risks, mostly of a financial nature. It is also understandable that some of the research carried out will end up calling for further research in order to reach a ready-to-exploit stage.

But what is the amount of that ready-to-exploit scientific knowledge? Over the last few years (or decades) the need for knowledge exploitation has become a policy priority. I can't judge whether that has led to substantial results (I have no means to measure it in an objective way) but at least I feel that more people in universities and companies are now aware that there are ways to protect, trade and - in general - exploit new knowledge.

The current system for intellectual property protection has been widely promoted as a helpful tool in the quest for knowledge utilisation. While I can see the pluses, I can't help but wonder what players with limited access to the resources needed for such a game could do. And what about knowledge that is already available in an "unprotected" form, that is, either published or unpublished - kept in a closet full of paper, data CDs and other archiving means?

Surely, even more of that knowledge could be exploited; if not at a big scale, at least at a micro-scale, through cooperation between scientists and small companies under short-term projects of low, affordable budget; something like the sales that shops have, only for science :-)

As an example, think about the valorisation of the waste from fish processing factories. The large production plants often produce fishmeal or fish oil out of that waste, using available equipment suitable for their volumes of production. At low production volumes, however, although the principles remain the same, it is likely that no optimised processes are commercially available - although designing one would be a relatively easy task for an engineer. Interestingly, the original research on the subject must have worked with laboratory-scale volumes and, thus, is likely to be closer to the desired application.

The same applies to most of the waste outputs of farming, where biotechnology could provide solutions, sometimes with no further research being necessary. One could argue that the driving force of additional income from such an effort is simply non-existent in those cases; the value of the products derived from those exploitation processes is only achievable if one has a distribution and sales network reaching the right market. And then, there is the risk of producing surpluses of secondary products, thus leading to a drop in their market value. True and true. The right answer depends on the actual case but, in general, it takes an "unbalanced" action to break a vicious circle.

Cooperation frameworks around that idea have been tried in a number of countries with encouraging results (e.g. the innovation voucher schemes tried in many places, including Ireland, the UK, the Netherlands, Greece, etc.). However, with big grants around asking for ambitious research, the priorities of most researchers are not shaped towards "low-tech" cooperation with small enterprises. While I admit it would be stupid to suggest throttling the funding for innovative research, I believe that a stronger mandate for exploitation through small companies should begin to form.

Yes, there will be implementation problems (e.g., how many small food companies would make innovation a priority over growing production volumes or sales figures? how many of those companies would be willing to participate in such schemes?). And yes, the existing legal framework may not be very flexible around food innovation (e.g., putting a health-related claim on a foodstuff is not a trivial process - and there is a pretty good reason behind that, I might add). But the potential benefits are many-fold:
  • "Older" knowledge or published knowledge could find application in a way that could further benefit the original researcher or research group
  • "Older" knowledge or published knowledge could be transformed to practical innovation at a higher pace than entirely new, breakthrough knowledge
  • Small companies would get exposed to working with scientists and vice versa; possibly a beneficial exercise for both groups
  • The public profile of food research would improve
  • The mobilisation of private funds for research could be encouraged (many small sums of money instead of a few larger ones)
  • The competition between food producers would benefit - even at the regional level
It might be worth considering this more thoroughly, especially now that the global financial crisis has reminded us that "big" doesn't necessarily mean "stable".

Sunday, 10 January 2010

The nut factor

roasted peanuts
I was idly going through Slashdot on Friday, where I came across a link to a story in the Globe and Mail. Apparently, the Canadian Transportation Agency told Air Canada to provide a nut-free buffer zone in their airplanes. Air Canada had already stopped serving peanuts on board, but they were still serving cashew nuts. The move was prompted by the complaint of a passenger who is severely allergic to nuts.

Interestingly, the Agency considered a severe food allergy to be equivalent to a disability that the air carrier should take into account in the service they offer.

I imagine the easiest way to comply with that decision is to stop serving nuts altogether. Isolating a part of the plane and making it allergen-free is technically possible but it won't come cheap (think of air filters or at least air curtains, special floor mats, regular surface cleaning, control of who and what enters that zone, zone autonomy - e.g., restrooms - and a procedure to control all those things).

Allergens in food have been receiving increasing attention lately. In the EU there is specific legislation in place that requires foodstuffs to identify on the label any allergens contained as ingredients (either as main ingredients or as carry-over substances from one of the main ingredients or from one of the treatments the foodstuff or one of its ingredients was subjected to) as well as any allergens likely to be present in traces (e.g., due to contamination during the production process). The list of allergens is regularly reviewed and revised. When needed, EFSA also looks into cases of allergens used as technological aids in food processing and tries to assess the levels that can survive into the final product and estimate the corresponding risk.

However, in my understanding, so far, food allergens have not been treated as a hazard with an environmental dimension. Personally, I am unaware of the potential effect of food allergens on people with severe allergy, when the contact is made through the skin (or the lungs, in case a food allergen can somehow be found on fine dust). I would imagine that brief skin contact could not easily lead to adverse reactions; but, then again, I am no expert on allergies and, thus, I might be very wrong on my hypothesis.

While I acknowledge the risks, the inconvenience and all the negative effects that allergies have on the quality of life of the people affected, I tend to believe that in severe cases, the person affected should also take his/ her condition into consideration. What exactly that may mean, really depends on the case and their doctor's advice. It could mean avoiding certain places or wearing long sleeves and gloves or a mask (i.e., means to avoid coming in contact with the allergen), carrying cortisone or antihistamine medication or even shots of epinephrine (i.e., means to fight the allergic reaction), etc.

After all, it makes sense to me that one should take reasonable care of oneself!

Thursday, 7 January 2010


big bug
A few days ago, just before the start of 2010, I came across an article on the BBC News website on the undesirable effect that disinfectants may have on bacteria. The article suggested that the incorrect use of disinfectants (e.g., incorrect dilution) could allow bacteria to develop resistance to antibiotics.

I am by no means an expert in microbiology, or even plain-vanilla biology; however, I was aware that overuse or misuse of antibiotics could lead to increased resistance to those antibiotics - a trait which, once acquired by a group of bacteria, can be passed on to others under certain conditions. In the hospital world, where patients often have a weakened immune system, MRSA is a considerable threat, while an increasing number of other pathogens (or potential pathogens) are beginning to exhibit threatening tolerance to the available antibiotics, turning from simple "bugs" into "superbugs".

To my understanding, antibiotic-induced antibiotic resistance can be mitigated by a tight antibiotic-use regime. Sweden has had considerable success in tackling the MRSA problem by forcing the health care system to resort to antibiotics only when absolutely necessary. The transition period might have taken considerable time but the gain is substantial: they can still make good use of antibiotics that in many places of this world are now considered ineffective.

The BBC article I mentioned earlier, however, is alarming in the sense that not only antibiotics but also disinfectants (and possibly other means of bacterial control?) can somehow lead to antibiotic resistance. Clearly, improper use of disinfectants, which allows a select portion of the microbial population to survive, favours that surviving population in the sense that it eliminates the competition. I would assume that this process effectively ensures that the descendant bacteria will have those gene combinations that allowed their ancestors to survive the disinfectant. It seems, if my understanding of the article is correct, that those "disinfectant-survival" gene combinations can also be effective against antibiotics.
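That selection logic is easy to illustrate with a toy simulation: if a diluted disinfectant kills nearly all sensitive cells but only half of the cells carrying a protective gene combination, repeated sub-lethal exposure rapidly enriches the tolerant sub-population. The kill rates below are invented for the sketch, not taken from the BBC article:

```python
# Toy selection model: each cycle is a sub-lethal disinfection followed
# by regrowth to a fixed population size. The kill rates are assumptions
# chosen purely to illustrate the enrichment effect.
def resistant_fraction(sensitive, resistant, cycles,
                       kill_sensitive=0.99, kill_resistant=0.50,
                       regrow_to=100.0):
    for _ in range(cycles):
        sensitive *= (1 - kill_sensitive)   # sub-lethal disinfection
        resistant *= (1 - kill_resistant)
        scale = regrow_to / (sensitive + resistant)  # regrowth
        sensitive *= scale
        resistant *= scale
    return resistant / (sensitive + resistant)

# Start with one tolerant cell per million sensitive ones:
print(f"after 5 cycles: {resistant_fraction(1e6, 1.0, 5):.1%} tolerant")
```

Each cycle multiplies the tolerant-to-sensitive ratio by fifty under these assumed rates, so even a one-in-a-million trait dominates after a handful of exposures - which is the worrying part of the story.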

The alarming bit is that the use of disinfectants is much, much wider than the use of antibiotics. They are used not only by hospitals but by a very high number of businesses, including the food industry, and they are also at hand in the typical household. I admit it would be inconsiderate to extrapolate that all disinfectants, if misused, could lead to superbugs. The fact that the number of known superbugs is still rather small, while the use of disinfectants has been more-or-less systematic over the last decades, would rather suggest that the risk is minor.

I mentioned the food industry before. Interestingly, the agro-food industry was alarmed, in the past, by the antibiotic resistance problem but managed to sort it out by adopting good livestock practices and by considerably limiting the use of antibiotics. In Europe, there is legislation in place to ensure that things stay this way. But what about the use of disinfectants? The manufacturers of such products do include instructions for use, which normally are followed. I wonder, though: with modern foodstuffs enjoying increasingly longer shelf-lives, is there any significant chance that microorganisms which find their way into foods can turn into threatening superbugs?

In "live" foodstuffs (i.e., foods that contain a flora of living microorganisms, such as yogurt, fermented sausages, various cheeses, tea, etc.), which carry a small ecosystem of their own, it is likely to be much easier to keep things under control. An undesirable contamination would be worse in the case of previously sterilised (or poorly sterilised) products packaged under conditions that lack barriers to bacterial growth.

In any case, and for any of the existing reasons (ranging from health and safety concerns to competitiveness and sustainability issues), it may be worth revisiting some of the practices in our every day "war" with bacteria.

The use of good practices when it comes to cleaning surfaces, or when actually using antibiotics, has proven effective. The use of phages to fight off antibiotic-resistant bacteria has also been tested - successfully, I believe - although I'm not sure if it can find wide-scale application beyond the health sector. The manipulation of microbial ecology could be another promising direction, which has recently re-attracted research interest; after all, it may be time to remember that microorganisms are our valuable friends far more often than otherwise (and not only when it comes to the function of the human body).

(Photo "big bug", CC by G J Hutton)