North Ridge Partners

The Precogs predicted CES2020

We welcome a guest post from Laura Ashton, company director, advisor, thought-leader and friend of North Ridge Partners. She reflects on 5G, future Smart Cities and other fascinating technologies from CES 2020. A highly recommended read.

-----

Having recently returned from the planet’s largest consumer tech show in Las Vegas, I had a strong compulsion to rewatch Steven Spielberg’s 2002 cyberpunk action thriller, Minority Report.

Gravity-defying grids of self-driving cars. Tom Cruise navigating vast amounts of data, as though conducting an orchestra, with touchless hand gestures. “Hi, John Anderton…” hyper-personalised ads. It was all there - either in-market now or market-ready soon.

There is no doubt – Agatha and the precog twins predicted CES 2020. Love it or hate it, the future is here.

FLYING CARS and SMART CITIES

5G, the next generation of wireless infrastructure, was one of the key technologies I was investigating at CES – after years of talk, where are we up to?

A few cheaper, 5G, foldable, camera-rich mobile phones were on display, although typically the big reveals are kept under wraps until Mobile World Congress, scheduled for Barcelona later this month, Coronavirus permitting.

With blistering data speeds and low latency (network delay), 5G will enable the connected Internet of Everything: not just telecommunications and gaming at 10 gigabits per second, but AI and machine learning-enabled autonomous vehicles, edge computing, smart cities, smart homes, connected healthcare, remote surgery and much more.
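To put those figures in perspective, here is a rough back-of-envelope comparison. The 10 Gbps number is the peak rate quoted above; the 50 Mbps 4G figure is an illustrative round number for a decent connection, not a measured rate.

```python
# Back-of-envelope comparison of nominal 4G vs 5G link speeds.
# All figures are illustrative round numbers, not measurements.

GBIT = 1e9  # bits per gigabit


def transfer_seconds(size_gigabytes: float, rate_bps: float) -> float:
    """Time to move a payload of the given size at the given line rate."""
    bits = size_gigabytes * 8 * GBIT
    return bits / rate_bps


payload_gb = 2.0   # e.g. an HD movie download
rate_4g = 50e6     # ~50 Mbps, a decent 4G connection (assumed)
rate_5g = 10e9     # 10 Gbps, the peak figure quoted above

print(f"4G: {transfer_seconds(payload_gb, rate_4g):.0f} s")   # ~320 s
print(f"5G: {transfer_seconds(payload_gb, rate_5g):.2f} s")   # ~1.6 s
```

Even allowing for real-world rates well below the peak, the gap is two orders of magnitude, which is what makes the vehicle-fleet and remote-surgery use cases plausible.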

In addition to prototype driverless vehicles on display as well as legions of underpinning technologies, two Utopian visions of connected, smart cities with 5G infrastructure and autonomous vehicles at their hearts stood out.

Hyundai and Uber announced their aerial ride-sharing deal, Uber Elevate, which they plan to demo live this year and commercialise within three years (initially human-piloted, eventually autonomous).

An exciting, be-goggled VR “flight” took me from San Francisco to Oakland on a flying taxi with redundant engines that can take off and land vertically. Passengers transferred through a hub building to autonomous vehicles that raced through the streets, filtering through traffic, passing cargo vehicles and being automatically recharged on the fly.

The elegant ballet of vehicles (and its distinct absence of IT-glitch traffic jams and accidents, despite sharing the cityscape with fallible human drivers) is thanks to a bewildering array of 5G-enabled onboard sensors, cameras, radar and LIDAR, processing terabytes of data through onboard edge computing. As Wired put it, “the faster you can get data into and out of a rolling robot, the better the experience”. Data is indexed and shared, in real time, across fleets of aerial and earthbound vehicles and other devices, the entire system continuously learning and improving.

Similarly, at the base of Mount Fuji, Toyota’s 175-acre, hydrogen fuel cell-powered, sustainable Woven City will be “the world's first programmable city”. Construction is expected to begin in 2021, with move-ins by 2025. A testbed for global corporate and research cooperation, this “living laboratory” will be supported by AI, machine learning, robotics, and flying and land-based driverless vehicles for riders, cargo, convenience shops and deliveries.

It is "a unique opportunity to develop future technologies, including a digital operating system for the city's infrastructure," Akio Toyoda, president of Toyota Motor Corp., said at CES 2020.

5G ubiquity, however, is still a future dream. As the millimetre waves have an awful time getting through walls, there will need to be as many as 20 access points per square kilometre.
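That density figure implies surprisingly small cells. A quick sketch, assuming idealised non-overlapping circular coverage areas (a simplification; real cell planning is far messier):

```python
import math

# Rough coverage implied by "20 access points per square kilometre".
# Assumes idealised, non-overlapping circular cells -- a simplification.

aps_per_km2 = 20
area_per_ap_m2 = 1_000_000 / aps_per_km2               # 50,000 m^2 each
cell_radius_m = math.sqrt(area_per_ap_m2 / math.pi)    # ~126 m

print(f"Area per access point: {area_per_ap_m2:,.0f} m^2")
print(f"Equivalent cell radius: {cell_radius_m:.0f} m")
```

A radius of roughly 126 metres per access point, versus kilometres for a 4G macro cell, is why blanketing whole cities in millimetre-wave 5G is such a slow, capital-intensive build-out.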

In controlled environments like open sports stadiums where 5G can today be fully deployed, bandwidth, low latency and edge computing power are already beginning to transform the fan experience with instant replays, augmented reality, superstar perspectives and interactions and so on.

Seoul Smart City expects to be real by 2022, although it has 4G at its heart and no images (yet) of autonomous vehicles racing vertically up canyons of steel. For visions as grand as the Oakland demo and the Woven City to scale beyond a single urban test bed will take colossal investments and time.

So 5G is fast, but for some it will be faster than for others.

TOM CRUISE’S AMAZING TOUCHLESS SCREEN

Steven Spielberg wanted Minority Report to create a plausible "future reality" for the year 2054. He convened a scientific thinktank before production began to ensure that the futuristic technology in the dystopian tale of future data privacy was grounded in reality.

Tom Cruise, as Pre-Crime Detective John Anderton, leaps into action when the precogs (three drug-damaged people, cursed with the ability to foresee murders) announce that a crime will take place. Using illuminated three-fingered gloves, he interacts touchlessly with a vast transparent screen. On it, the precogs’ dreamy neural “movies” are combined with computer data and visual files: images of people and places all over the city, 3D models of every building and more.

Unspoken but evident is instant access to colossal databases, massive processing power and AI, natural language processing and machine learning at an extraordinary speed. You can be sure he’s not running this on home broadband speeds or over a 4G phone.

In a quiet suite, away from the crush of the main CES show and thanks to some wonderful visit curation from Story-Tech.com, I got to experience a Minority Report-like interface. Bristol- and Silicon Valley-based UltraLeap offers technology that uses ultrasound, tracking and haptics to interact with real and virtual screens, without touch and without Detective Anderton’s gloves.

I stood in front of an advertising screen, presented with several interactive options. At about waist height was a smooth black box. As I moved my hands a few inches above the box, I felt the slight sensation of a fan blowing – but there was no fan. When I turned my wrists and tapped my fingers together (touching nothing), I was able to control and scroll the advertising screen and make choices from its menus. The magic of the black box was ultrasound waves.

Some of UltraLeap’s use cases required me to don VR headsets. In fact, many companies at CES 2020 showcased consumer headsets for immersive gaming and top-of-the-line enterprise models for design (e.g., cars) and training (e.g., air traffic controllers and complex machine operators). Brain-sensing neurotech wearables from NextMind caught the Innovation Award judges’ attention. Others offered chunky “realistic touch feedback” haptic gloves (HaptX) for teleoperation of robots and full-body haptic suits (bHaptics) for Three Body Problem-like gaming.

With just a VR goggle set and a similar ultrasound UltraLeap black panel in front of me, I entered a simple virtual environment. I could “see” my gloveless hands (although they had become green and vaguely robotic looking in VR) and my every finger movement corresponded to what I saw. 3D cubes and polyhedrons appeared before my eyes. I could “catch” these, stretch their dimensions, even juggle them a little. Amazing; and it took little imagination to see the applications for training, design and digital twin interaction.

Finally, I climbed onboard a self-driving car mock-up, with black ultrasound panels conveniently close to my hands. With no driving to do, I had plenty of time to read or watch news, select entertainment, video-chat with a friend, choose a restaurant for lunch and so on. With reassuring, gentle breeze-like sensations on my palms, I touchlessly flicked through various colourful windows and interacted with controls to amuse myself as we raced through the virtual city.

In this case, a VR headset was needed, but in 4-5 years when the 5G-enabled autonomous vehicles that will sport this technology are ready to go to market, UltraLeap’s technology won’t require the headsets and I suspect will feel a lot like Detective Anderton’s heads-up display interactions. Minus the murder.

“HI, JOHN ANDERTON” PERSONALISED ADS

Based on omnipresent retinal scanners, Tom Cruise’s character is never out of view of authorities… and advertisers. Outdoor video ads (almost certainly not GDPR compliant!) for Lexus, Guinness and American Express speak directly to him, detecting his mood and hyper-personalising messages.

At CES, Samsung-backed Star Labs showcased its Neon “artificial humans”: computationally created, life-size, multilingual, 2D digital avatars that, as the company says, look and behave like real humans, conveying emotions and intelligence and learning and adapting. Based on some of the questions the founders were fielding, concierge services as well as advertising are in Neon’s near future.

Weirdly flirty, holographic anime companions from Gatebox are still small scale, but their personalisation makes them great candidates for brand-building.

But perhaps the closest analogue to the very private, very public communication that John Anderton was subjected to came from the most unexpected quarter, Delta Air Lines.

CEO Ed Bastian introduced the airline’s Parallel Reality collaboration with Misapplied Sciences. Thanks to multi-view pixels sending different colours of light in thousands of directions and ceiling sensors identifying opted-in travellers, passengers see large screens in the airport with personalised information, in their preferred language, about their upcoming flight that no one else can see.

No special glasses needed. Magic.

I scanned a bar-coded boarding pass to Paris while another person scanned one to London. In the mocked-up airport space, on a 2-metre-square screen, I saw only my information and my neighbour saw only his. I had to invade his personal space (almost putting my chin on his shoulder… sorry!) to see just a little of what he was seeing. The trial was set up for 4 people at a time at CES, but this will be in live pilot for up to 100 people at a time this summer in Detroit Airport.

There were a few other trends at CES, particularly smart home and wearables, on which I’ll comment in another post.

Science fiction aside, the point is that there are rapidly commercialising technologies that will transform businesses, marketplaces and business models. Company Boards and Executive teams must take the time to learn and reflect on how the opportunities and threats these technologies represent will feature in their risk management and their proactive and defensive plans. If topics like these are not on the agenda of your next strategy session or digital transformation discussion, perhaps it will be your competitors, rather than the precogs, predicting a nasty outcome.

If you’d like some help thinking about these topics, at Board or Executive level, please contact me.

- Laura Ashton