A friend taught me a trick to tell if pineapples are ready to be eaten. Tug gently at one of the spiky leaves on top and, if it comes away easily, the fruit has reached the right level of ripeness. I find this works pretty well, but for the most part, telling when food is at its peak is an inexact science of sniffing and prodding.
A huge amount of food is wasted. Roughly a third of all the food produced globally — about 1.3bn tonnes — is lost each year, according to the UN Food and Agriculture Organisation.
Waste happens at all levels of the process — from harvesting to transport to consumers throwing away perfectly good food because it has passed the use-by date.
But what if there was a smartphone app that could tell you whether food was still safe to eat?
This is not completely in the realm of science fiction. Abi Ramanan, co-founder of ImpactVision, is developing hyperspectral imaging software that could tell you how fresh a piece of beef is, or how ripe an avocado, by analysing how they reflect light across the electromagnetic spectrum, beyond the range of visible light.
This is the kind of technology that Nasa uses to study planets and monitor the surface of the earth. Deeper spectroscopic analysis reveals the chemical make-up of the materials. The concept applies as much to what is on a plate as what is on a planet.
“Every object — an apple, bread, meat — absorbs and reflects light in a unique way,” says Ms Ramanan. Even different types of meat, from beef to horse meat, have different spectral fingerprints, so you would be able to verify the bovine origin — or otherwise — of your favoured supermarket meatball. At the moment, ImpactVision is working on how to apply hyperspectral imaging to meat production and plans to install cameras on conveyor belts in packing plants.
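The fingerprint idea can be sketched as a simple nearest-neighbour match: compare a measured reflectance spectrum against a library of reference spectra and pick the closest. This is only an illustration of the principle, not ImpactVision's method, and the reference numbers below are invented for the sketch.

```python
import math

# Illustrative reference "fingerprints": reflectance sampled at a few
# wavelengths. These values are made up purely for demonstration.
REFERENCE_SPECTRA = {
    "beef":  [0.42, 0.35, 0.28, 0.51],
    "horse": [0.39, 0.41, 0.22, 0.47],
    "pork":  [0.48, 0.30, 0.33, 0.44],
}

def distance(a, b):
    """Euclidean distance between two spectra sampled at the same wavelengths."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(sample):
    """Return the reference label whose spectrum lies closest to the sample."""
    return min(REFERENCE_SPECTRA,
               key=lambda label: distance(sample, REFERENCE_SPECTRA[label]))

measured = [0.41, 0.36, 0.27, 0.50]  # a spectrum read from the camera
print(identify(measured))            # prints "beef"
```

Real systems compare hundreds of spectral bands and use statistical models rather than a plain distance, but the matching principle is the same.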
Ms Ramanan says that analysing the infrared spectrum will even indicate the beef’s tenderness and acidity level, both indicators of its eating quality. “Up to now the food industry has relied on visual inspections, destructive tests and pulling random samples off for testing,” says Ms Ramanan. “It is inexact. Our technology can tell you the pH [acidity] of every piece of meat and its tenderness without it having to be touched.”

Anything that does not match the desired pH level of 5.5 required for steak, for example, can be pulled off the production line before it is packed and shipped. Trials have shown that the technology could greatly reduce waste in the supply chain, as it helps meat producers detect problems earlier, avoiding costly recalls of food that might spoil if shipped.
If meat can be classified more easily, it can be used more efficiently. Meat with a pH above 5.5 can still be used for mince, for example, but should not be sold as steak.
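The grading rule described above amounts to a simple threshold check. The 5.5 figure comes from the article; the grade labels and function name are assumptions for this sketch.

```python
def grade_beef(ph):
    """Route a cut by measured pH: at or below 5.5 it qualifies as steak;
    above that it is downgraded to mince rather than discarded."""
    if ph <= 5.5:
        return "steak"
    return "mince"

print(grade_beef(5.4))  # steak
print(grade_beef(5.8))  # mince
```

In a packing plant this decision would run per item on the conveyor belt, using the pH value inferred from the hyperspectral camera rather than a destructive probe.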
Bananas, whose exact state of ripeness is crucial when they are transported, could be another product to benefit from the technology, Ms Ramanan says.
It could be used to detect contamination — such as in the case of melamine-laced baby-milk formula in China in 2008.
The technology offers many non-food uses. Optina Diagnostics, based in Quebec, Canada, is using similar technology to scan people’s retinas to detect Alzheimer’s disease, while Texas-based Rebellion Photonics uses it to scan oil rigs for dangerous gases.
One day, Ms Ramanan says, the sensors will be small and cheap enough to put in a smartphone. So far, hyperspectral imaging cameras are too big and expensive for this. BaySpec, based in San Jose, California, sells handheld hyperspectral cameras for several thousand dollars apiece.
But prices could fall as the market grows. Israel-based Unispectral has raised $7.5m to develop a hyperspectral digital camera that could eventually fit into a phone.
“People in Silicon Valley say this will happen in a year. Others say it will take 10 years, or never happen. I think it is somewhere in the middle,” says Ms Ramanan.
At that point my days of tugging pineapple leaves may be over. We might be able to say goodbye to those inexact and wasteful use-by dates on food labels. One thing is certain — it would make photographing and posting pictures of food online more interesting.