Monday, December 19, 2016

From Concept to Reality: BlackBerry QNX's Groundbreaking CES Tradition

Thomas Bloor
Business Development Manager, BlackBerry QNX

The annual Consumer Electronics Show in Las Vegas has been growing in importance for the automotive industry for years. You can hardly fail to notice that this year, as in previous years, the big automakers are vying for floor space and attention with the glut of big-screen TVs and other consumer goods. As always, BlackBerry QNX will be in the North Hall, proudly in the middle of the big automotive OEMs.

At CES, BlackBerry QNX has an enviable history of bringing concept cars that rival anything on the show floor, with one important difference: ours are not pure flights of fancy. We show technologies that will become realities in the near future.

We started this trend back in 2010 with an LTE-connected Toyota Prius, 18 months before the first commercial LTE deployment in mid-2011. Working with Alcatel-Lucent, which provided the experimental network, we demonstrated Google Maps functionality with local search and an embedded Pandora radio app in a car for the first time. Connectivity is standard in many cars today, but in 2010 we demonstrated the future.
2012 brought us a CNET "Best of CES" award for demonstrating cloud-based natural language voice recognition, text-to-speech, and NFC-based one-touch Bluetooth pairing. Simply touching your phone to an NFC reader in the center console automatically paired the phone and car.
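
To make the "one touch" concrete, here is a minimal sketch of the tap-to-pair flow: the NFC tap delivers a Bluetooth out-of-band (OOB) handover record, which the Bluetooth stack then uses to pair with no PIN dialog. The two stub functions simulate hardware for illustration; nothing here is an actual QNX API.

```c
/* Minimal sketch of NFC "tap to pair": the tap yields a Bluetooth
 * out-of-band (OOB) handover record (peer address plus Secure Simple
 * Pairing hash/randomizer), which lets the stack pair without a PIN.
 * Both *_stub functions are simulations, not real QNX APIs. */
#include <stdio.h>
#include <stdint.h>
#include <string.h>

typedef struct {
    uint8_t bd_addr[6];      /* peer Bluetooth device address        */
    uint8_t ssp_hash[16];    /* Secure Simple Pairing OOB hash       */
    uint8_t ssp_rand[16];    /* Secure Simple Pairing OOB randomizer */
} bt_oob_record;

/* Stub: a real driver would block until a phone is tapped on the
 * console reader, then parse the NDEF handover message it presents. */
static int nfc_read_handover_stub(bt_oob_record *rec)
{
    static const uint8_t addr[6] = { 0xAA, 0xBB, 0xCC, 0x11, 0x22, 0x33 };
    memcpy(rec->bd_addr, addr, sizeof addr);
    memset(rec->ssp_hash, 0x5A, sizeof rec->ssp_hash);
    memset(rec->ssp_rand, 0xA5, sizeof rec->ssp_rand);
    return 0;
}

/* Stub: a real stack would authenticate the link with the OOB
 * hash/randomizer, so the user never sees a PIN prompt. */
static int bt_pair_oob_stub(const bt_oob_record *rec)
{
    printf("pairing with %02X:%02X:%02X:%02X:%02X:%02X via OOB data\n",
           rec->bd_addr[0], rec->bd_addr[1], rec->bd_addr[2],
           rec->bd_addr[3], rec->bd_addr[4], rec->bd_addr[5]);
    return 0;
}

int main(void)
{
    bt_oob_record rec;

    if (nfc_read_handover_stub(&rec) != 0 || bt_pair_oob_stub(&rec) != 0) {
        fprintf(stderr, "one-touch pairing failed\n");
        return 1;
    }
    puts("phone paired: one tap, no PIN");
    return 0;
}
```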
In 2013 we got ahead of the trend toward ever-larger center stack displays, with detailed 3D maps and keyword spotting for voice recognition, common today in smartphones but a first in a car. Simply saying "Hello Bentley" let you start interacting with the natural language, cloud-based voice recognition powered by AT&T's Watson.
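
Conceptually, keyword spotting is a gate: a cheap, always-on local detector scores every audio frame, and only a confident match for the trigger phrase starts the expensive cloud recognition session. The sketch below illustrates that pattern with simulated stand-ins; none of these functions are actual QNX or AT&T Watson APIs.

```c
/* Sketch of the keyword-spotting pattern behind "Hello Bentley":
 * score every frame locally, stream to the cloud recognizer only
 * after a confident hit. All functions are simulated stand-ins. */
#include <stdio.h>

#define FRAME_SAMPLES 160     /* 10 ms of 16 kHz audio */
#define KWS_THRESHOLD 0.85f   /* detector confidence needed to wake */

/* Stub: a real detector would run a small acoustic model on-device. */
static float kws_score_stub(const short frame[FRAME_SAMPLES], int frame_no)
{
    (void)frame;
    return (frame_no == 5) ? 0.92f : 0.10f; /* pretend frame 5 matches */
}

/* Stub: a real client would stream microphone audio to the cloud
 * recognizer and return the transcribed utterance. */
static const char *cloud_asr_stub(void)
{
    return "navigate to the nearest charging station";
}

int main(void)
{
    short frame[FRAME_SAMPLES] = { 0 };

    for (int frame_no = 0; frame_no < 10; frame_no++) {
        /* Local scoring runs on every frame at negligible cost... */
        if (kws_score_stub(frame, frame_no) >= KWS_THRESHOLD) {
            /* ...the cloud round-trip happens only after the wake word. */
            printf("keyword heard, streaming to cloud ASR\n");
            printf("recognized: %s\n", cloud_asr_stub());
        }
    }
    return 0;
}
```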
2014 literally took us in a different direction. A 21-inch, horizontally oriented center stack display extended across the dash, naturally extending interaction and functionality toward the passenger. Behind the screens, the instrument cluster was integrated with the center stack, running both driver information and IVI functions. Multi-modal input was highlighted across all available functionality, with seamless control via the touch screen, physical buttons, and jog wheel.

Not content with that, we foreshadowed greater integration of ADAS warnings for the driver. In 2014 we warned the driver when local speed limits were exceeded, both in the cluster and verbally through text-to-speech, and we followed this up in 2015 with a system that recommends an appropriate speed for upcoming curves based upon driving conditions and the radius of the bend.
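
The physics behind such a recommendation is straightforward circular motion: for a curve of radius r, keeping lateral acceleration under a comfort limit a_lat means keeping speed under v = sqrt(a_lat × r). Here is a minimal sketch; the per-condition lateral-acceleration budgets are assumed values for illustration, not figures from the actual 2015 system.

```c
/* Minimal sketch of a curve-speed advisor: the maximum comfortable
 * speed for a curve of radius r follows from v^2 / r <= a_lat, i.e.
 * v = sqrt(a_lat * r). The lateral-acceleration budgets per road
 * condition below are assumed values, not production figures. */
#include <math.h>
#include <stdio.h>

typedef enum { ROAD_DRY, ROAD_WET, ROAD_ICY } road_condition;

/* Assumed comfortable lateral acceleration in m/s^2 per condition. */
static double lateral_budget(road_condition c)
{
    switch (c) {
    case ROAD_DRY: return 3.0;
    case ROAD_WET: return 2.0;
    case ROAD_ICY: return 0.8;
    }
    return 0.8; /* conservative fallback */
}

/* Recommended speed in km/h for a curve of the given radius (m). */
static double recommended_speed_kmh(double radius_m, road_condition c)
{
    double v_ms = sqrt(lateral_budget(c) * radius_m);
    return v_ms * 3.6; /* m/s -> km/h */
}

int main(void)
{
    /* A 150 m radius bend: roughly 76 km/h dry, 39 km/h on ice. */
    printf("dry: %.0f km/h\n", recommended_speed_kmh(150.0, ROAD_DRY));
    printf("icy: %.0f km/h\n", recommended_speed_kmh(150.0, ROAD_ICY));
    return 0;
}
```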

So, what innovations will we be showing in 2017? I'm not allowed to tell you just yet, but in a first for us, we'll be showing both future and current production technologies and innovations.

Building on our products, which range from in-car acoustics through our comprehensive QNX CAR application platform to next-generation driver assistance and autonomous drive, we will demonstrate how technology can enhance the user experience and increase safety for drivers and passengers.

And while these cars demonstrate technologies that will come to future production vehicles, they are not just "show floor wonders": our automotive knowledge enables us to build demonstrators for the real world that can actually be driven, allowing the technologies to be experienced first-hand.
