Happy New Year!
I hope you all had a wonderful Christmas and a spectacular start to the New Year! Yes, 2017 should have received a speeding ticket, since it went by too fast! And, I’m sure, 2018 will similarly speed along at an unsuitable pace – apparently time flies when you’re having fun, or is it just old age? Never mind.
What do we do now?
So, it’s 2018, and I thought in this month’s column I would steer away from the traditional diet of “Here are my predictions for 2018.” To be honest, I simply couldn’t stomach writing a piece of that ilk, as I’m sure plenty of others will offer their own unique perspectives. Does anyone actually look back at the predictions that were made at the start of the year? Anyway, it’s January, and we typically kick off with CES, soon followed by the ever-popular Mobile World Congress – they have become, if you like, the tech world’s default factory reset for the New Year – we just couldn’t be without them!
The idea for this month’s column was inspired during the “What do we do now?” period over the holidays – you know, that void between Christmas Day and New Year. I decided to fill my time by gorging on movies and finally caught up with Blade Runner 2049, the sequel to Blade Runner. In particular, I loved the interaction between Officer K, the character portrayed by Ryan Gosling, and Joi, played by Ana de Armas.
A natural evolution for our VAs
Officer K is an artificial human, called a replicant, who cohabits with his artificially intelligent (AI) hologram companion, Joi. I’m not going to unravel the love story, perhaps infused with K’s fantasies or delusions; instead, I want to focus on the “What if?” In the film, Wallace Corp manufactured the hologram Joi as a perfect companion, with the tagline, “Everything you want to see. Everything you want to hear.”
Well, that’s exactly it! For me, this was the motivation I needed to write this column, and it created that “What if?” moment: I realized that the interaction between K and Joi could well be a natural evolution for the many virtual assistants (VAs) out there, such as Cortana, Siri, Alexa and, of course, Bixby. Yes, Bixby is Samsung’s answer to Cortana, Alexa and Siri, making an appearance in the company’s newly launched talking fridge (don’t get me started!).
It’s not quite perfect, yet!
I’ll take this opportunity to clarify my thoughts further: I’m not talking about an augmented reality (AR) or virtual reality (VR) experience; rather, for me, the next evolutionary step is a fully interactive, personable holographic experience, very much aligned with how the characters are portrayed in the movie.
I talked about my smart home in last month’s column, “What’s in your smart home this Christmas?” and, whilst the general experience is wonderful, there are a few niggles that let the overall experience down. Most notably, it’s us, we humans! We fumble, stumble and often think randomly, and our VAs don’t quite get it. For example, I’ll ask Alexa, “Turn on ‘Clear’ in the sitting room lights,” but the request has to be exact and perfect – Alexa doesn’t accommodate any pause while you mentally look up the mode of lighting you want in the room.
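To make that niggle a little more concrete, here’s a minimal sketch – purely hypothetical, with scene names and a matching threshold I’ve invented for illustration, not any real assistant’s API – of how an assistant could fuzzily resolve a half-remembered lighting mode rather than rejecting anything that isn’t word-perfect:

```python
# A minimal, hypothetical sketch: fuzzily resolve a half-remembered
# lighting scene name instead of demanding a word-perfect request.
from difflib import get_close_matches

# Invented scene names a smart-lighting hub might expose
LIGHT_SCENES = ["Clear", "Relax", "Reading", "Movie Night", "Sunset"]

def resolve_scene(spoken: str, cutoff: float = 0.6) -> str | None:
    """Return the closest scene to what the user actually said, or None."""
    lookup = {name.lower(): name for name in LIGHT_SCENES}
    matches = get_close_matches(spoken.lower(), lookup.keys(), n=1, cutoff=cutoff)
    return lookup[matches[0]] if matches else None

if __name__ == "__main__":
    # Slightly fluffed requests that a stricter assistant would simply reject
    for utterance in ["clear", "Cleer", "movie nite", "reading light"]:
        print(f"{utterance!r} -> {resolve_scene(utterance)!r}")
```

A tolerant match like this wouldn’t solve the pausing problem, of course, but it would at least forgive the fumbles.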
Realizing our internal fantasies
You see, Cortana, Siri, Alexa and Google, for that matter, are all guilty – they are not yet brilliant at accommodating the nuances of our language. However, with a holographic VA comes an ability to ‘see’ what you say, what you do and how you express yourself. It’s a holistic and immersive experience, one that will feel far more natural when you’re fully interacting with, if you like, a real assistant that’s seemingly standing before you. And this is where I think the next step should be, since if a holographic VA can see and experience you, she will know that you are pausing, perhaps to look up the mode of lighting, or that you’ve fluffed your words. This would, in turn, almost incidentally fulfil the tagline promised by Wallace Corp.
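As a back-of-the-envelope sketch of that idea – entirely hypothetical, with cues and thresholds of my own invention – imagine the assistant combining what it hears with what a ‘seeing’ hVA could also observe, so a pause keeps the dialogue open instead of ending it:

```python
# A purely illustrative sketch of the idea: combine what the assistant
# hears with what a holographic assistant could also see, so a pause or
# a fluffed word keeps the dialogue open rather than timing it out.
from dataclasses import dataclass
from enum import Enum, auto

class Action(Enum):
    EXECUTE = auto()         # the request is clear enough to act on
    KEEP_LISTENING = auto()  # the user is still thinking; don't time out
    ASK_TO_CLARIFY = auto()  # the words were garbled; gently prompt again

@dataclass
class Cues:
    speech_confidence: float    # how sure we are about the transcribed words
    silence_seconds: float      # how long the user has been quiet
    looking_at_assistant: bool  # visual cue only a "seeing" hVA would have
    mid_gesture: bool           # e.g. pointing at the lights

def decide(cues: Cues) -> Action:
    # A voice-only assistant typically gives up after a short silence.
    # With visual cues, silence plus engaged body language means "wait".
    if cues.silence_seconds > 1.5 and (cues.looking_at_assistant or cues.mid_gesture):
        return Action.KEEP_LISTENING
    if cues.speech_confidence < 0.5:
        return Action.ASK_TO_CLARIFY
    return Action.EXECUTE

# The "mentally looking up the lighting mode" moment from earlier:
print(decide(Cues(0.9, 2.0, looking_at_assistant=True, mid_gesture=False)))
# -> Action.KEEP_LISTENING
```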
Moreover, my vision fully incorporates an Alexa who, when called, appears. As such, may I introduce the holographic virtual assistant (hVA)? She is able to see me; greet or acknowledge me, “Hello Dean”; read the micro-expressions on my face and, more importantly, my body movements. To take the holographic experience further, you can, of course, dress her the way you wish and probably fulfil some (awkward) fantasies, but I’m not going to cover that here. Essentially, K realized Joi in his mind and, with this evolutionary expansion of existing VAs, we can bring our own assistants – or, ultimately, companions – to virtual life.
Until next time …
“What if?” We already have robots that make an attempt at reading our facial expressions – whether we’re happy or sad, for example. I’m also abundantly aware that this is an opportunity that doesn’t quite exist yet, but the possibilities are mind-blowing. I’m sure it won’t be too long before we’re able to holographically interact with our VAs.
So, this is where a “forever a daydreamer” Dr. G signs off.