OMG: The Glasses That Can Detect Parkinson’s (and Film Your Lunch)

In the near future, our glasses might advise us on our diet, warn us that we could be in the early stages of Parkinson’s, and let us know if we should seek help for our mental health. Yes, by glasses I mean spectacles. Or rather “eyewear”, for the modern reader.

Years ago, I laughed when my husband said that soon we would have technology built into clothes, monitoring our physiological processes. I laughed when he said that there would be cars that drove themselves, and that if we could figure out how to buy even just 1,000 bitcoins we might make lots of money one day (hollow laugh).

It turns out he was right. Wearable tech is coming on fast, self-driving cars are already on the roads, and those 1,000 bitcoins would now be worth about £80,000,000.

Queues around the block at the Brighton Dome

Last Friday (4th July 2025), I found myself at a highly techie conference called Evolve 25 at the Brighton Dome. Not being very techie myself, I made a beeline for the medically-based talks, where I lucked out with a fantastic presentation by Charles Nduka, a Consultant Plastic Surgeon from the esteemed Queen Victoria Hospital in East Grinstead, who specialises in facial reanimation and skin lesions. I had to have some minor facial plastic surgery there myself, but looking at the slides of a 24-year-old car crash victim was sobering. (Charles is quite a fan of the self-driving car, by the way. He says that 90% of car crashes are due to driver error – namely, emotional or distracted driving. He believes there will be a dramatic reduction in deaths and injuries when the cars drive themselves.)

Charles Nduka on the right. We all wore headphones as it was a noisy room – the talk came straight to our ears

Glasses that work like Google Maps

At this point Charles launched into an interesting analogy. Google Maps, he said, relies on a network of users, much like Waze. It uses data from the GPS on people’s phones to build a real-time map. This donated data forms a kind of collaborative network, creating not only the map but also information about what he called the “emotional landscape” of the system: road closures, traffic lights, bottlenecks, jams. It’s a sensitive, responsive system which, by using information received from all its users, helps to improve the quality of everyone’s life. In other words, it’s a bit like a collective nervous system. Just by going about our day, we’re sending it feedback. That feedback, in turn, helps the system guide us along the smoothest path. 1

Can we prevent our body crashing?

Charles pointed out that we also have crashes within our body. A brain crash = a stroke. A sugar crash = hypoglycaemia. (Hospitals are crash repair shops; surgeons are mechanics.) However, we don’t have our own dashboard giving us information about fatigue, emotional stress or facial flexibility. We are driving blind. With wearable tech, though, we can track the drift into physical problems by monitoring aspects of our physiology and getting real-time feedback. 70% of premature deaths, he told us, were down to behavioural issues. Maybe if we knew we were heading for disaster earlier, we could take preventative action.

It is true we are not totally driving blind. We know our age, and we already have heart rate and blood pressure monitors, glucose sensors and thermometers. Following ablation therapy, one of my patients gets a recall when his heart skips a beat, picked up by some kind of sensor. But what additional information can we get from the face – the “naked, hairless, flexible”, communicative part of our body?

OCOSENSE GLASSES: These are somewhat in the style of Eric Morecambe, but more streamlined versions are on the way

Optomyography

Optomyography, for this is its name – OMG for short! – is a method of tracking tiny facial muscle movements. It takes readings from your face up to 6,000 times a second, then Bluetooths them to your phone. It also measures activation of the sympathetic nervous system. I was a little hazy about exactly how it gets the information, but cameras play a role, along with sensors that pick up skin movement. Basically, the glasses will detect emotional states and muscle activity, and through these, even very early signs of some conditions (for example, facial stiffening – the mask-like effect of Parkinson’s – typically begins around ten years before the tremors appear). They will be able to tell if you are smiling or frowning. Charles himself works extensively with facial palsy patients, so it is easy to see the use here.
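
For the more technically minded reader, here is a toy sketch of what that pipeline might look like in code. To be clear, this has nothing to do with Emteq’s actual software – every sensor name, number and threshold below is invented purely for illustration. It just shows the general idea: smooth a very fast stream of facial readings and make a crude smile-versus-frown guess.

```python
# Toy illustration only (not Emteq's real method): samples arrive very
# rapidly, we smooth them over a short window and guess an expression.
# All sensor names, values and thresholds here are hypothetical.

from collections import deque
from statistics import mean
import random

SAMPLE_RATE_HZ = 6000          # the talk mentioned ~6,000 readings per second
WINDOW_SECONDS = 0.5           # smooth over half a second of data
WINDOW_SIZE = int(SAMPLE_RATE_HZ * WINDOW_SECONDS)

def fake_sensor_stream(n_samples):
    """Stand-in for the Bluetooth feed from the glasses: each sample is a
    dict of made-up activation levels (0-1) for a few facial muscle groups."""
    for _ in range(n_samples):
        yield {
            "cheek_raise": random.uniform(0.5, 0.9),   # e.g. zygomaticus (smiling)
            "brow_lower": random.uniform(0.0, 0.3),    # e.g. corrugator (frowning)
        }

def classify_expression(stream):
    """Very crude smile/frown guess from smoothed activation levels."""
    cheek = deque(maxlen=WINDOW_SIZE)
    brow = deque(maxlen=WINDOW_SIZE)
    for sample in stream:
        cheek.append(sample["cheek_raise"])
        brow.append(sample["brow_lower"])
    if mean(cheek) > 0.6 and mean(cheek) > mean(brow):
        return "smiling"
    if mean(brow) > 0.6:
        return "frowning"
    return "neutral"

if __name__ == "__main__":
    one_second_of_data = fake_sensor_stream(SAMPLE_RATE_HZ)
    print(classify_expression(one_second_of_data))
```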

The OCOsense glasses – developed by Emteq Labs – also have an outward-looking camera. They will be able to film what you eat. Hmm. Is this good? Diet, he stated, is the foundation of all health, and we are currently in a global epidemic of obesity and metabolic disease. No arguing with that. And 90% of type 2 diabetes cases, he said, are curable with diet and exercise.

What Conditions Could be Helped?

You might not be able to read the above slide, but it is a long list of conditions. I didn’t get them all down, but here are the few I did –

  • TMJ
  • Bruxism
  • Migraine
  • Depression
  • Bipolar
  • Alzheimer’s
  • Eating disorders
  • Epilepsy
  • Parkinson’s
  • Stroke recovery
  • Facial palsies

Certainly, the application is broad.

Where is this heading?

The eyewear at the moment looks fairly large and thick-rimmed. Think Austin Powers, or Eric Morecambe. However, they are looking to miniaturise the sensors and use them in normal eyewear. One day, he says, the phone will be only a secondary device. Your face, and your glasses, will be the primary interface.

Concerns and Ethics

I was left with a lot of questions.

What about poker-faced people? Will they be treated as emotionally repressed?
What about people-pleasing over-smilers—those who are crying inside while their zygomaticus major is hard at work? Will the glasses detect genuine feeling or mistake masking for happiness? (Cue Tears of a Clown, by Smokey Robinson.)

Will it capture the full emoji range of emotions—disgust, surprise, doubt, rage, mild contempt—or just smiling and frowning?

And if these glasses can film what you eat, what else might they record? What you drink? Who you see? How much time you spend watching Love Island?

If I want to see if I am still grinding my teeth, do I have to wear them to sleep?

We’d like to see a wonderful new world where early signs of illness are caught and treated with meditation, vegetable soup, and long walks in nature. But, big pharma being what it is, it’s just as easy to picture a future in which we’re constantly nudged—perhaps even pressured—toward buying more of their lovely drugs. Today it’s pre-diabetes; tomorrow it could be pre-hypertension, pre-dementia, pre-depression—each with its own pharmacological “solution”.

When does a prevention tool become a surveillance device, and just where is Aldous Huxley when you need him?

Final Thoughts from the tech world

I came away from Evolve 25 with a few takeaways:

  • AI should more accurately be called “machine learning”
  • There is a lot of hype around it, and “overhyped” was a word I kept hearing
  • In practice, implementing it is quite difficult
  • It’s (apparently) still not too late to buy bitcoin
  • Maybe it’s time to rise above the current divisive Twitter vs Bluesky spats and all move to Web3 – decentralised, truly democratic apps such as Farcaster. Not a single tech billionaire in sight.

But wearable tech? Although it’s clearly fraught with danger, the potential upsides seem simply too huge to dismiss, and the momentum seems unstoppable. Right now, the hopes outweigh the fears.

You can find out more about this fascinating innovation at Emteqlabs.com.

  1. Similarly, we have the Human Genome Project: a vast, collaborative effort in which pieces of DNA data have been combined into a usable map of our body’s instruction manual. ↩︎

Now it’s over to you – please have your say.