

UMaine turning water into wine?

This article was originally written by Steve Collins for the Sun Journal.
ORONO — Imagine it’s New Year’s Eve in, say, 2050.

You’re ready to attend a glamorous, glitzy, high-class party to usher in the new year.

So you put on your grimy old sweats, pour a pitcher of water, pile some soy patties on a plate, plop down in your favorite chair and slip on your Google Glasses, version 23.0.

Next thing you know, you’re standing in a penthouse five miles up, wearing a tuxedo or flowing gown, sipping the most magnificent champagne on the planet, eating the rarest caviar and maybe watching a virtual Elvis Presley counting down the seconds until midnight.

Or maybe you choose to sidle up to a street vendor in ancient Rome and order some spiced wine, gulping it down as you head to the Colosseum to watch gladiators duel.

Or perhaps you prefer to drink some mead in an old Viking hall or sip sake with Japanese sumo wrestlers or maybe just compare vintage Coca-Cola to the version delivered by Amazon drones with the rest of your groceries.

Nimesha Ranasinghe, an assistant professor in the School of Computing and Information Science at the University of Maine, is busy working out the science that might just make that happen — at least the beverage part of that future.

Not long ago, I visited his laboratory at the far end of a tangle of hallways in Boardman Hall on the Orono campus to find a glowing plastic wine cup attached to a long USB cable, with a sort of low rumble emanating from its solid white base.

Ranasinghe plugged the cup into a slot on his laptop, opened a bottle of Dasani water and filled about a third of the container.

On the rim of the cup were two little metallic electrodes, unavoidable for anyone taking a sip, and an attachment with LED lights that could shift the color of the water instantly.

Ranasinghe, a 36-year-old Sri Lankan native with a Ph.D. from the National University of Singapore and a can-do spirit, told me to take a sip.

I resisted the urge to say “you first” because I was reasonably confident that zapping a reporter would not help his chances of getting tenure. Still, I sipped cautiously.

Next thing I knew I was feeling a little buzzing sensation at the tip of my tongue, pretty much the same feeling you get if you lick a 9-volt battery. Not surprisingly, both Ranasinghe and I knew exactly what that felt like because as children we were adventuresome, or maybe stupid.

As he talked about his research, I tasted something lemony, then another sip delivered something like a gin and tonic, and then, finally, to my utter astonishment, it seemed like chocolate.

But all along it was just plain old water.

The bottom line is that while Ranasinghe can’t turn water into wine, he can fool someone into thinking he did.


Back in his student days, Ranasinghe got to thinking about virtual reality and the way it has focused almost entirely on just two senses — sight and sound — to provide the illusion of entering something distant from our humdrum daily existence.

“Why can’t we take that approach with our other senses?” he wondered.

For instance, he said, maybe there is a way to add a flavor perception to the mix so that someone wandering in an alternate reality could eat broccoli in real life but perceive it as, say, cheesecake.

It would be a dieter’s dream come true.

Ranasinghe said that as humanity moves from the information era into “the age of experience,” creating a multi-sensory environment that includes taste and smell is going to prove crucial.

So he started looking into the idea.

He said he would have to figure out a way to simulate the taste, smell and color of a beverage using digital means to pull it off.

But what he didn’t know was whether that could be done.

“There’s no fundamental understanding” of taste “even in the medical domain,” he said.

Ranasinghe said he realized that he needed to find a way to fool the tongue into sensing that something is salty, bitter or sweet. But, he said, he had “no idea” how.

Then he found a research paper almost half a century old describing how scientists used electricity to send low-voltage pulses to subjects’ tongues as part of an effort to understand the mechanics behind taste.

Some of the people involved noted that electrical pulses stimulated salty or sour tastes, an avenue that Ranasinghe recognized as a starting point for a more comprehensive effort to create more complex sensations.

Tiny electrical pulses, he soon discovered, offered a good start for the elusive goals he had in mind.

He said he found through trial and error that rapid heating and cooling created by changing electrical currents enabled him to mimic different taste sensations, from the coolness of a mint to some sense of spiciness.

That led to experiments with an electronic lollipop, electronic chopsticks and other gizmos aimed at trying to find a path to virtual reality for the stuff we shovel or pour into our mouths.

For now, the state-of-the-art is something akin to an especially ugly wine glass.


The virtual cocktail, or “vocktail,” system that Ranasinghe concocted relies on a cup “that is seamlessly fused into a 3D printed structure” that includes the electronic module on the lip, three scent cartridges inside the base, a trio of tiny air pumps and the LED lighting.

It’s operated by a custom-designed app that can adjust the lighting, electronic stimulation and scent of the drink.

The lighting is reasonably straightforward. Ranasinghe uses the changing colors to make people think of a drink they know, establishing a kind of bias in their brain that associates certain colors with particular tastes.

So if he wants to have a lime taste, for example, it may be green.

The scent is a little less obvious.

In the base of the cup are places where Ranasinghe can slip in small bottles of Aftelier Chef’s Essence Drops.

Suppose he wants the drinker to think chocolate, for instance. He can slip a Chocolate Absolute flavor into the base and then use the tiny air pumps to make the drink smell like chocolate.

If something smells like chocolate, it’s halfway there.

Then Ranasinghe adjusts the electrical pulses to trick the tongue into sensing bitterness, sourness and other experiences that mix together to simulate a taste.
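The three channels described above — cup color, pumped scent and electrode pulses — could be coordinated by something like the following sketch. All of the names, values and command formats here are illustrative assumptions for the sake of the explanation, not the actual vocktail app or firmware.

```python
from dataclasses import dataclass

# A hypothetical flavor profile tying together the three stimulus
# channels the article describes: LED color, scent cartridge, and
# electrical pulses at the cup's rim. Values are made up.
@dataclass
class VocktailProfile:
    name: str
    led_rgb: tuple          # cup color shown to the drinker
    scent_cartridge: int    # which of the three cartridges to pump (0-2)
    pulse_current_ua: int   # electrode current, in microamps
    pulse_freq_hz: int      # pulse frequency at the rim

def build_commands(profile: VocktailProfile) -> list:
    """Translate a flavor profile into simple device command strings."""
    r, g, b = profile.led_rgb
    return [
        f"LED {r} {g} {b}",
        f"PUMP {profile.scent_cartridge} ON",
        f"STIM {profile.pulse_current_ua}uA {profile.pulse_freq_hz}Hz",
    ]

# Example: a hypothetical "lime" profile pairs a green glow with a
# citrus cartridge and a sour-leaning pulse pattern.
lime = VocktailProfile("lime", (0, 255, 60), 1, 120, 60)
for cmd in build_commands(lime):
    print(cmd)
```

The point of the design is that taste here is not one signal but a composition: the same water reads as lime, chocolate or gin and tonic depending on how the three channels are tuned together.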

Admittedly, it’s still a work in progress. I tasted the chocolate-scented water, after all, and for a few seconds wasn’t quite sure what I was supposed to imagine it was.

But, hey, I’m just a reporter, not some kind of tasting whiz. Mostly, I simply eat and drink what’s in front of me with a little too much gusto.

Yet I still felt a bit of wonder at the notion that water could be tricked out to seem like something else.


It’s hard to know where Ranasinghe’s bid to “fool the brain with these technologies” may lead.

He said that someday, we might watch our favorite chefs on television and then, when the show is over, download an electronic code and in some way or another print out the meal and eat it.

It’s easier to imagine that the technology might help people with medical conditions rediscover taste, improving their daily lives.

Or possibly it could lend a hand to astronauts on a long space voyage with otherwise limited options.

If they can taste a wide array of flavors, “they can have a sense of home,” Ranasinghe said. It has the potential to be “really powerful.”

Or perhaps those of us with a penchant for downing too many calories might find some help by settling for fake food and drink that offers some of the pleasures of the real thing without contributing to our weight gain.

After all, it’s tough to argue with Ranasinghe’s observation that many people “are eating and consuming way more than we need.”

At one point, some of his students sought to tie the virtual tasting experience in with music, combining a virtual menu with Michael Jackson’s “Earth Song” as it swings through emotions such as happiness, anger and curiosity.

That tying of emotion to flavors, Ranasinghe said, is one way to enhance the virtual experience he’s exploring.

Virtual reality may offer more ways to shift the way we taste food and drink as well.

At Cornell University this fall, for example, panelists wearing virtual reality headsets ate three identical samples of blue cheese, each in a different electronically induced setting.

In one, they sat on a park bench, in another they sat in an empty room and in the other, they sat in the cow barn owned by the Ivy League college.

The 50 participants generally found the cheese more pungent when they ate it surrounded by images of cows, even though all the samples were the same.

“When we eat, we perceive not only just the taste and aroma of foods, we get sensory input from our surroundings – our eyes, ears, even our memories about surroundings,” Robin Dando, associate professor of food science and senior author of the study, explained in a release by the university.

Ranasinghe said the memory of experiences we’ve already had plays into the way we experience tastes.

For Ranasinghe, there are a lot of questions left to answer, a lot of discoveries yet to come.

He said he feels as if he’s working on little black-and-white televisions back in the early 1950s, carving out a path toward a brighter, bigger and somewhat unimaginable future.

Ranasinghe is excited about the possibilities and convinced that the future is going to provide the solutions to many of those questions.

He said he would enjoy bringing his vocktail device to a real-life restaurant or bar to see how it goes over.

More than that, though, he’d love to convince some entity with deep pockets to fund a whole lot more research to figure out just how to turn his idea into a commercial product.


Ranasinghe said he’s passionate about pursuing ever more complex virtual creations, hoping someday to replicate both texture and taste, to make it possible to eat a sliver of soy and imagine it’s lobster or steak or a Hershey bar or whatever.

Technology, he said, offers so much already.

He said he’s glad he can interact with his sister in North Carolina and his parents in Sri Lanka, that a world once impossibly large is now something smaller.

For the moment, Ranasinghe said, his family is limited to joint chats with video and audio.

But, he asked, what if they want to share a meal?

“We cannot do that on the internet,” Ranasinghe said.

Someday, he’s confident, it will be possible. And we might even be able to eat whatever we like without putting on pounds.

If Ranasinghe is right, the future may prove endlessly delicious.