Sunday 31 January 2010

Lights, camera, gourmet shopping...

Sunset cow
Slashdot editors post on food research again! This time their post links to a story covering the work of researchers at the Gifu Prefectural Livestock Research Institute, who managed to rate high-quality Japanese beef (Hida-gyu) using an infrared camera. To be fair, this is work in progress, since the success rate of the optical evaluation method is not very high yet (about 60%), but the aim is too ambitious to ignore: after refinement, it could allow consumers to use cheap cameras, perhaps the ones mobile phones are already equipped with, to "measure" the sensory profile of beef cuts at the supermarket before buying.

The article does not provide an extensive scientific background; it seems that the infrared image can reveal information on the oleic acid content of the meat, which can be associated with desirable sensory parameters such as tenderness, flavour and overall taste.

I admit that I'm totally fascinated by the idea. Smart tags have been a focus of food researchers for a long time. At the beginning, the objective was traceability: RFID tags would be able to hold all the information necessary to identify the origin of the product. But as their data-holding capacity improved and the cost began to drop (although it is still prohibitive for most general uses), new applications came up. Time-temperature integrators, for instance, which can easily accompany RFID circuits, can feed data into existing mathematical models that estimate the microbiological or quality status of certain foodstuffs. Such systems allow the packaging to provide feedback to the consumer.
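To make that idea concrete, here is a minimal sketch of the kind of kinetic model a time-temperature integrator could feed: first-order quality loss with Arrhenius temperature dependence. All the rate parameters below are illustrative, not values for any real product:

```python
# Toy estimate of quality loss from a time-temperature log, assuming
# first-order kinetics with Arrhenius temperature dependence. The rate
# parameters are illustrative, not values for any real foodstuff.
import math

R = 8.314        # universal gas constant, J/(mol*K)
K_REF = 0.010    # illustrative rate constant at T_REF, per hour
T_REF = 277.15   # reference temperature, K (4 degrees C, chilled storage)
EA = 80_000.0    # illustrative activation energy, J/mol

def rate(temp_c: float) -> float:
    """First-order rate constant at temp_c, via the Arrhenius equation."""
    temp_k = temp_c + 273.15
    return K_REF * math.exp(-(EA / R) * (1.0 / temp_k - 1.0 / T_REF))

def remaining_quality(log: list[tuple[float, float]]) -> float:
    """Fraction of initial quality left after (hours, temp_c) intervals."""
    integral = sum(hours * rate(temp_c) for hours, temp_c in log)
    return math.exp(-integral)  # first-order decay: Q/Q0 = exp(-sum(k*t))

# Example: 48 h chilled, a 2 h break in the cold chain, then 24 h chilled.
history = [(48.0, 4.0), (2.0, 20.0), (24.0, 4.0)]
print(f"Estimated remaining quality: {remaining_quality(history):.1%}")
```

A tag that logged its own temperature history could run exactly this kind of calculation and report the estimate on (or next to) the package.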

So far, however, such systems have required investment on the part of the industry, the cost of which could have an impact on product prices and - at a second step - on consumer preferences. This time, the researchers are working on a scenario that uses (mostly) already available technology. I assume that an additional light source and IR filters may need to be employed in order to get readings from a cellphone camera. On top of that, one would also need specific software and - possibly - some means of calibration. Still, none of those requirements is unrealistic given the computing power of modern mobile phones.
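As a thought experiment, the calibration step might look something like the sketch below: fit a simple model mapping the camera's mean near-IR reading to the oleic acid content measured by a reference laboratory method, then grade new readings against a quality threshold. All the numbers are made up for illustration, and the real relationship need not be linear:

```python
# Hypothetical calibration of an IR reading against oleic acid content.
# The data pairs and the quality threshold are invented for illustration.
from statistics import linear_regression  # Python 3.10+

# (mean IR pixel intensity, % oleic acid) pairs from reference cuts
# measured with a laboratory method - made-up numbers.
ir_readings   = [112.0, 125.0, 139.0, 150.0, 163.0]
oleic_percent = [42.0, 45.5, 49.0, 51.5, 55.0]

slope, intercept = linear_regression(ir_readings, oleic_percent)

def estimate_oleic(ir_value: float) -> float:
    """Predict % oleic acid from a mean IR intensity reading."""
    return slope * ir_value + intercept

def grade(ir_value: float, threshold: float = 50.0) -> str:
    """Crude pass/fail grade against an assumed quality threshold."""
    return "premium" if estimate_oleic(ir_value) >= threshold else "standard"

print(grade(155.0))  # 'premium' with these illustrative numbers
```

In practice one would probably need a multivariate model over several wavelengths rather than a single mean intensity, but the calibration workflow would look similar.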

I am really curious to see what will happen when that technology "hits the market". I would expect many more applications to follow that path of development. Also, I wouldn't be too surprised if meat producers started pre-marinating their cuts in olive oil solutions :-)

Interestingly, I have just realised that until now I considered infrared photography to be just another beautiful - yet geeky - kind of art. Well, that is about to change!


Sunday 17 January 2010

The knowledge in the closet

What's in your closet - hanging by their hooks...
14 January was a big day for the agro-, food- and bio-people around the world going after European research grants: it was the deadline for the 2010 KBBE call of FP7, a public research funding instrument in Europe, giving about 53 billion euros to research over the period from 2007 to 2013.

The persistently pending question, however, is: what happens to the results of all that research? Understandably, not all research efforts are successful; and even when they are, they don't necessarily lead to tangible results. There's really nothing wrong with that; research is a venture into the unknown (well, actually a venture into the not fully known, but let's not dwell on that for the time being) and thus carries risks, mostly of a financial nature. It is also understandable that some of the research carried out will end up calling for further research before it reaches a ready-to-exploit stage.

But how much of that scientific knowledge is actually ready to exploit? Over the last few years (or decades), knowledge exploitation has become a policy priority. I can't judge whether that has led to substantial results (I have no objective way to measure it), but I do feel that more people in universities and companies are now aware that there are ways to protect, trade and - in general - exploit new knowledge.

The current system for intellectual property protection has been widely promoted as a helpful tool in the quest for knowledge utilisation. While I can see the pluses, I can't help but wonder what other players could do - those with limited access to the resources needed for such a game. And what about knowledge that is already available in an "unprotected" form, that is, either published or unpublished, kept in a closet full of paper, data CDs and other archiving means?

Surely, even more of that knowledge could be exploited; if not at a big scale, at least at a micro-scale, through cooperation between scientists and small companies under short-term projects with low, affordable budgets; something like the sales that shops have, only for science :-)

As an example, think about the valorisation of the waste from fish-processing factories. The large production plants often turn that waste into fishmeal or fish oil, using equipment suitable for their production volumes. At low production volumes, however, although the principles remain the same, it is likely that no optimised processes are commercially available - yet designing one would be a relatively easy task for an engineer. Interestingly, the original research on the subject must have worked with laboratory-scale volumes and is thus likely to be closer to the desired application.

The same applies to most of the waste outputs of farming, where biotechnology could provide solutions, sometimes with no further research being necessary. One could argue that the driving force of additional income from such an effort is simply non-existent in those cases: the value of the products derived from those exploitation processes is only achievable if one has a distribution and sales network reaching the right market. And then there is the risk of producing surpluses of secondary products, leading to a drop in their market value. True and true. The right answer depends on the actual case but, in general, it takes an "unbalanced" action to break a vicious circle.

Cooperation frameworks around that idea have been tried in a number of countries with encouraging results (e.g., the innovation voucher schemes tried in many places, including Ireland, the UK, the Netherlands, Greece, etc.). However, with big grants available for ambitious research, the priorities of most researchers are not shaped towards "low-tech" cooperation with small enterprises. While I admit it would be stupid to suggest throttling the funding for innovative research, I believe that a stronger mandate for exploitation through small companies should begin to form.

Yes, there will be implementation problems (e.g., how many small food companies would make innovation a priority over growing production volumes or sales figures? how many of those companies would be willing to participate in such schemes?). And yes, the existing legal framework may not be very flexible around food innovation (e.g., putting a health-related claim on a foodstuff is not a trivial process - and there is a pretty good reason behind that, I might add). But the potential benefits are many-fold:
  • "Older" knowledge or published knowledge could find application in a way that could further benefit the original researcher or research group
  • "Older" knowledge or published knowledge could be transformed to practical innovation at a higher pace than entirely new, breakthrough knowledge
  • Small companies would get exposed to working together with scientists and vice versa; possibly a beneficial exercise for both groups
  • The public profile of food research would improve
  • The mobilisation of private funds for research could be encouraged (many small sums of money instead of a few larger ones)
  • The competition between food producers would benefit - even at the regional level
It might be worth considering in a more thorough way, especially now that the global financial crisis has reminded us that "big" doesn't necessarily mean "stable".


Sunday 10 January 2010

The nut factor

roasted peanuts
I was idly going through Slashdot on Friday, where I came across a link to a story in the Globe and Mail. Apparently, the Canadian Transportation Agency told Air Canada to provide a nut-free buffer zone in their airplanes. Air Canada had already stopped serving peanuts on board, but they were still serving cashew nuts. The move was prompted by the complaint of a passenger who is severely allergic to nuts.

Interestingly, the Agency considered a severe food allergy to be equivalent to a disability that the air carrier should take into account in the service they offer.

I imagine the easiest way to comply with that decision is to stop serving nuts altogether. Isolating a part of the plane and making it allergen-free is technically possible, but it won't come cheap (think of air filters or at least air curtains, special floor mats, regular surface cleaning, control over who and what enters that zone, zone autonomy - e.g., restrooms - and a procedure to control all those things).

Allergens in food have been receiving increasing attention lately. In the EU there is specific legislation in place requiring foodstuffs to identify on the label any allergens they contain as ingredients (either as main ingredients, or as carry-over substances from one of the main ingredients, or from one of the treatments the foodstuff or one of its ingredients was subjected to), as well as any allergens likely to be present in traces (e.g., due to contamination during the production process). The list of allergens is regularly reviewed and revised. When needed, EFSA also looks into cases of allergens used as technological aids in food processing and tries to assess the levels that can survive into the final product and estimate the corresponding risk.

However, to my understanding, food allergens have so far not been treated as a hazard with an environmental dimension. Personally, I am unaware of the potential effect of food allergens on people with severe allergy when contact is made through the skin (or the lungs, in case a food allergen can somehow end up in fine dust). I would imagine that brief skin contact could not easily lead to adverse reactions; but, then again, I am no expert on allergies and thus might be very wrong in my hypothesis.

While I acknowledge the risks, the inconvenience and all the negative effects that allergies have on the quality of life of the people affected, I tend to believe that in severe cases the person affected should also take his/her condition into consideration. What exactly that means really depends on the case and the doctor's advice. It could mean avoiding certain places, or wearing long sleeves, gloves or a mask (i.e., means to avoid coming into contact with the allergen), or carrying cortisone or antihistamine medication or even shots of epinephrine (i.e., means to fight the allergic reaction), etc.

After all, it makes sense to me that one should take reasonable care of oneself!


Thursday 7 January 2010

Superbugs

big bug
A few days ago, just before the start of 2010, I came across an article on the BBC News website about the undesirable effect that disinfectants may have on bacteria. The article suggested that the incorrect use of disinfectants (e.g., incorrect dilution) could allow bacteria to develop resistance to antibiotics.

I am by no means an expert in microbiology, or even plain-vanilla biology; however, I was aware that overuse or misuse of antibiotics could lead to increased resistance to those antibiotics - a trait which, once acquired by a group of bacteria, can be passed on to others under certain conditions. In the hospital world, where people (patients) often have a weakened immune system, MRSA is a considerable threat, while an increasing number of other pathogens (or potential pathogens) begin to exhibit threatening tolerance to the available antibiotics, turning from simple "bugs" into "superbugs".

To my understanding, antibiotic-induced antibiotic resistance can be mitigated by a tight antibiotic-use regime. Sweden has had considerable success in tackling the MRSA problem by forcing the health care system to resort to antibiotics only when absolutely necessary. The transition may have taken considerable time, but the gain sounds substantial: they can still make good use of antibiotics that in many places of the world are now considered ineffective.

The BBC article I mentioned earlier, however, is alarming in the sense that not only antibiotics but also disinfectants (and possibly other means of bacterial control?) can somehow lead to antibiotic resistance. Clearly, improper use of disinfectants, which allows a select portion of the microbial population to survive, favours that surviving population in the sense that it eliminates the competition. I would assume that this process effectively ensures that the descendant bacteria will carry the gene combinations that allowed their ancestors to survive the disinfectant. It seems, if my understanding of the article is correct, that those "disinfectant-survival" gene combinations can also be effective against antibiotics.
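To illustrate the selection effect with some toy numbers: suppose an over-diluted disinfectant kills 90% of ordinary cells per cleaning cycle but only 40% of a tolerant minority, and the survivors regrow between cleanings. The sketch below (with all rates invented for illustration) shows how quickly the tolerant fraction comes to dominate:

```python
# Toy simulation of selection by a sub-lethal (over-diluted) disinfectant:
# most susceptible cells die, a tolerant minority survives and regrows.
# All kill rates and starting numbers are invented for illustration.
def treatment_cycle(susceptible: float, tolerant: float,
                    kill_susceptible: float, kill_tolerant: float
                    ) -> tuple[float, float]:
    """One round of disinfection followed by regrowth to the same total."""
    s = susceptible * (1.0 - kill_susceptible)
    t = tolerant * (1.0 - kill_tolerant)
    total_before = susceptible + tolerant
    scale = total_before / (s + t)  # survivors regrow to the original size
    return s * scale, t * scale

s, t = 0.999e6, 0.001e6  # tolerant cells start as 0.1% of the population
for cycle in range(1, 6):
    # Proper dilution would kill nearly everything; this diluted product
    # kills 90% of susceptible cells but only 40% of tolerant ones.
    s, t = treatment_cycle(s, t, kill_susceptible=0.90, kill_tolerant=0.40)
    print(f"cycle {cycle}: tolerant fraction = {t / (s + t):.1%}")
```

The toy model ignores gene transfer and the details of regrowth, but it shows why leaving survivors behind is the dangerous part of improper disinfection.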

The alarming bit is that the use of disinfectants is much, much wider than the use of antibiotics. They're not only used by hospitals but by a very large number of businesses, including the food industry, and they are also at hand in the typical household. I admit it would be inconsiderate to extrapolate that all disinfectants, if misused, could lead to superbugs. The fact that the number of known superbugs is still rather small, while the use of disinfectants has been more or less systematic over the last decades, would rather suggest that the risk is minor.

I mentioned the food industry before. Interestingly, the agro-food industry was alarmed, in the past, by the antibiotic resistance problem but managed to sort it out by adopting good livestock practices and by considerably limiting the use of antibiotics. In Europe, there is legislation in place to ensure that things stay this way. But what about the use of disinfectants? The manufacturers of such products do include instructions for use, which are normally followed. I wonder, though: with modern foodstuffs enjoying increasingly longer shelf-lives, is there any significant chance that microorganisms which find their way into foods can turn into threatening superbugs?

In "live" foodstuffs (i.e., foods that contain a flora consisting of living microorganisms, such as yogurt, fermented sausages, various cheeses, tea, etc.), which contain a small eco-system, it is likely to be much easier to keep things under control. An undesirable contamination would be worse in the case of previously sterilised (or poorly sterilised) products under packaging conditions that lack bacteria growth barriers.

In any case, and for any of the existing reasons (ranging from health and safety concerns to competitiveness and sustainability issues), it may be worth revisiting some of the practices in our everyday "war" with bacteria.

The use of good practices when cleaning surfaces, or when actually using antibiotics, has proven effective. The use of phages to fight off antibiotic-resistant bacteria has also been tested - successfully, I believe - although I'm not sure it can find wide-scale application beyond the health sector. The manipulation of microbial ecology could be another promising direction, which has recently re-attracted research interest; after all, it may be time to remember that microorganisms are our valuable friends far more often than otherwise (and not only when it comes to the functioning of the human body).

(Photo "big bug", CC by G J Hutton)