Report from Barcelona: first meeting of the W3C automotive business group

Last week, I had the privilege of attending the first face-to-face meeting of the W3C automotive business group and the honor of being nominated group co-chair. (The other co-chair is Adam Abramski, an open source project manager for Intel.) With more than 70 members, the group has already become the eighth-largest group in the W3C, even though it is barely two months old. Clearly, it’s generating a lot of interest.

The meeting included three presentations and two contributions. I presented on the lessons we’ve learned with the QNX CAR platform, how we think the market is changing, and how these changes should drive HTML5 standardization efforts.

I presented my three “musts” for standardizing HTML5 in the car:
  1. Must create something designed to run apps, not HMIs (unless HMIs come along for free)
  2. Must focus on mobile developers as the target development audience
  3. Must support integration of HTML5 environments with native environments like EB Guide and Qt
I described some of the changes that have resulted from the alignment of the QNX CAR platform with the Apache Cordova framework, and why they are crucial to our HTML5 work. Unfortunately, we didn't have our W3C contribution ready due to these changes, but members generally agreed that having a standard consistent with mobile development was an appropriate course change.

Tizen and GenIVI gave presentations about their vehicle APIs. Tizen has contributed its APIs, but GenIVI hasn't yet — still waiting on final approvals. Webinos contributed its APIs as well but didn’t present on them, since members had already reviewed the Webinos work before the meeting.

The meeting was a great chance to sit down with people I don’t normally meet. Overall, the group is moving in the right direction, creating a standard that can help automakers bring the goodness of HTML5 into the car.

Autonomous, not driverless

Paul Leroux
I don't know about you, but I'm looking forward to the era of self-driving cars. After all, why spend countless hours negotiating rush-hour traffic when the car could do all the work? Just think of all the things you could do instead: read a novel, Facebook with friends, or even watch Babylon 5 re-runs.

Unlike Babylon 5, this scenario is no longer a page out of science fiction. It’s coming soon, faster than many imagine. That said, the story of the self-driving car still has a few unfinished chapters — chapters in which the human driver still has an important role to play. Yes, that means you.

As I’ve discussed in previous posts, the fully autonomous car is a work in progress. In fact, some of the technologies that will enable cars to drive themselves (adaptive cruise control, forward collision avoidance, etc.) are already in place. Moreover, research suggests that these technologies can, among other things, improve traffic flow and reduce accidents. But does that mean you will soon be able to sit back, close your eyes, and let the car do everything? Not quite.

Evolution, not revolution
If you ask me, Thilo Koslowski of Gartner hit the bull's eye when he said that self-driving cars will go through three evolutionary phases: from automated to autonomous to unmanned. Until we reach the endpoint, we should pay heed to the words of Toyota's Jim Pisz: autonomous does not mean driverless.

If planes can do it…
Some folks hear this and are disappointed. They point to auto-pilot technology in planes and ask why we can’t have driverless cars sooner rather than later. The argument goes something like this: “It’s much harder to fly a plane, yet we have no problem with a computer handling such a complex task. So why not let a computer drive your car?”

If only life were so simple. For one thing, automakers will have to make autonomous cars affordable — doable but not easy. They’ll also have to negotiate a variety of legal hurdles. And in any case, driving and flying have less in common than you might think.

When you drive, you must remain alert on a continuous basis. Let your attention lapse for a second, and you stand a good chance of hitting something or somebody. The same doesn't always hold true in flight. When a plane is cruising at 30,000 feet along a prescribed flight path, the pilot can divert his or her attention for 5 seconds and incur little chance of hitting anything. By comparison, a driver who becomes distracted for 5 seconds is hell on wheels.

And, of course, auto-pilot doesn’t mean pilot-less. As Ricky Hudi of Audi points out, pilots may rely on autopilot, but they still retain full responsibility for flying the plane. So just because your car is on auto-pilot doesn’t mean you can watch YouTube on your tablet. Bummer, I know.

An alarming solution
Source: Modern Mechanix blog (and yes, that should read Frankfurt)

All of which to say, the driver of an autonomous car will have to remain alert most or all of the time — until, of course, autonomous vehicles become better than humans at handling every potential scenario. Now that could happen, but it will take a while.

It seems that someone anticipated this problem in the early 50s when they invented “alarming glasses” — take a gander at the accompanying photo from the August 1951 issue of Modern Mechanix.

Scoff if you will, but a kinder and gentler form of this technology is exactly what autonomous cars need. No, I'm not suggesting that scientists find a better way to glue wires to eyelids. But I am saying that, until cars become fully and safely autonomous, drivers will need to pay attention — after all, it’s tempting to drift off when the car is doing all the work. And, indeed, technologies to keep drivers alert are already being developed.

Pre-warned means prepared
Mind you, it isn’t enough to keep the driver alert; the car may also need to issue “pre-warnings” for when the driver needs to take over. For instance, let’s say driving conditions become too challenging for the car’s autonomous mode to handle — these could include heavy rain, a street filled with pedestrians, or an area where lane markers are obscured by snow. In that case, the car can’t wait until it can no longer drive itself before alerting the driver, for the simple reason that the driver may take too long to assess the situation. The car will need to provide ample warning ahead of time.

The more, the better
That cars will become autonomous is inevitable. In fact, the more autonomous, the better, as far as I'm concerned. Research already suggests that technologies for enabling autonomous driving can, in many cases, do a better job of avoiding accidents and improving traffic flow than human drivers. They also seem to do better at things like parallel parking — a task that has caused more than one student driver to fail a driving test.

But does this all mean that, as a driver, I can stop paying attention? Not in the near future. But someday.

Goodbye passwords, hello biometrics

Let's face it — passwords suck.

Every day we have to recall all manner of alphanumeric combinations for bank PINs, network log-ons, corporate email, social networking, and e-commerce. According to Microsoft Research, the average user types eight passwords per day.

During a talk at last year's SAE Convergence, Joseph Carra from the US Department of Transportation said, “Passwords have to go” ... a breath of fresh air for those of us who rely heavily on the "forgot password" option. The stage is set, according to Carra, for biometrics to replace passwords in the vehicle.

Using biometrics for driver preferences is nothing new — my favorite example is a car seat that can identify you by the shape of your butt — but using them to replace passwords makes perfect sense.

Ultrasound fingerprinting, iris scans, facial recognition, signature dynamics, voice recognition, keystroke dynamics, hand geometry, skin patterns, and foot dynamics are already being used in enterprise security, law enforcement, border control, ATM transactions, and so on. And second-gen biometrics promise to pump up the Sci-Fi factor with neural wave analysis, electro-physiological biometrics, skin luminescence, body odor, and so on.

Many technologies eventually find their way into the car after becoming popular elsewhere — mobile telephony, media players, GPS navigation, etc. I can’t think of too many world-changing technologies that got their start inside the car. But given the innovative trajectory of today’s auto industry, that may be about to change.

A matter of context: How digital instrument clusters can enhance the driving experience

I always drive a manual, so checking the tachometer in my car’s instrument cluster has become second nature to me. But while I have a personal interest in what my cluster displays, why would a software company like QNX be interested in instrument clusters? After all, most clusters use physical gauges and relatively little software.

The answer, of course, is that automakers are starting to migrate to digital instrument clusters, which replace mechanical gauges with virtual instruments rendered on an LCD display. In fact, Jaguar and Land Rover, who are pioneers in this market, have been shipping QNX-based digital clusters since about 2010. Here, for instance, is a photo of the digital cluster and dash in the latest Range Rover:



So why use a large LCD display instead of mechanical gauges? For one thing, you can attract early adopters who always want the latest tech and who see large 3D displays as cool. But more importantly, a digital cluster can provide an experience that is both personal and adaptive — personal because consumers today want to control the UX (just as they customize their smartphones) and adaptive to help the driver in a variety of traffic situations.

Context matters
In the latest QNX technology concept car, for instance, the digital cluster can reconfigure itself to display a 3D rear-view camera to help with parking. Saab pursued similar ideas a few years ago with a context-based cluster that avoids loading the driver with too much information during night-time driving.

It will be interesting to see who takes this to the next level with an adaptive HMI that takes speed, location, and driving conditions into account. For instance, driving at high speed on a German Autobahn differs immensely from driving at low speed on a busy downtown street with lots of pedestrians and intersections. These two scenarios place different demands on the driver, and a digital cluster could adapt accordingly.

On the autobahn, the cluster could increase the size of the speedometer and tachometer to make them easier to see, while hiding other information that isn’t currently needed. (The cluster would, of course, still display any necessary warnings, such as high oil temperature.) In the city, meanwhile, the cluster could replace the tachometer with pedestrian warnings to improve the driver's situational awareness.

Also, think of a car that supports both automatic and manual gear-shifting. A driver who prefers automatic might not be interested in a tachometer, whereas a driver who shifts manually will want to see an RPM readout to optimize gear shifting. A digital cluster could accommodate both preferences.
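To make the idea concrete, here's a minimal sketch of context-based widget selection. The function name, widget names, and speed threshold below are all illustrative assumptions on my part, not taken from any production cluster:

```python
def select_widgets(speed_kmh, pedestrian_zone, transmission):
    """Choose which virtual instruments a digital cluster shows for the
    current driving context. All names and thresholds are illustrative."""
    widgets = ["speedometer", "warnings"]  # always shown
    if pedestrian_zone:
        # City driving: swap the tachometer for pedestrian alerts
        widgets.append("pedestrian_warnings")
    elif transmission == "manual":
        # Manual drivers want an RPM readout for gear shifting
        widgets.append("tachometer")
    if speed_kmh > 130:
        # High-speed cruising: enlarge the primary gauges
        widgets.append("enlarged_gauges")
    return widgets

# Autobahn, manual gearbox: big gauges plus a tachometer
print(select_widgets(160, False, "manual"))
# Downtown, automatic: pedestrian warnings instead of a tachometer
print(select_widgets(40, True, "automatic"))
```

A real cluster would of course drive this from live vehicle and map data rather than three hand-fed arguments, but the shape of the decision is the same.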

For safety’s sake
What does it mean from a safety perspective to include a large display and its attendant electronics in the car? A malfunctioning digital cluster can’t directly kill or injure, but it could give false indications that may lead to an accident. That is why automakers will likely have to address ISO 26262 requirements for their digital clusters.

So what is ISO 26262? It’s a standard that focuses on functional safety in cars and other types of passenger vehicles, with the goal of avoiding or controlling system failures. It is similar in content and purpose to the IEC 61508 functional safety standard, to which two QNX OS products have already been certified. Read our previous posts (here and here) for more information on ISO 26262.

Massive arrays
When it comes to digital clusters, I’ve only scratched the surface. For instance, cars are becoming massive sensor arrays that generate tons of data. By leveraging this data, reconfigurable clusters could display contextually relevant information, such as highlighting a person in your path, an accident up ahead, or the current speed limit.

And from the automaker’s perspective, a digital cluster could help reduce costs by allowing the same hardware to be used across multiple vehicle lines; in many cases, only the graphics would need to be “reskinned.”


Emil Dautovic is an automotive business development manager at QNX Software Systems, where he is responsible for the European automotive market.

Finally, I can throw away my 8 tracks

Okay, maybe I'm not old enough to have owned 8-track tapes. But I do remember that my uncle had an 8-track player in the dash of his station wagon when I was a kid, and I am old enough to have owned a car with a cassette player.

Music has been fundamental to the driving experience for about as long as cars have been on the road. Terrestrial radio dominated forever, supplemented by tape and then CD. XM radio came along in 2001, and iPod connectivity started to show up in the late 2000s. That's 5 formats since the Model T was introduced in 1908 (okay, so it didn't have a radio) and 3 formats in the first 90 years.

Now, with connected cars becoming a reality, the rate of change is shifting into overdrive. Want Pandora – check. Want to listen to the top alternative radio station in Dublin while driving in California – check. Want to keep listening to your Songza programming as you move from the house to the car – check.

Today's announcement from QNX and 7Digital adds a whole new dimension. Having 7Digital in the car will unify the music ownership experience across the big three: car, pocket, and home. Want to listen to your own music programming in the car – check. Want to buy a song you just heard on that Dublin radio station – check.

Read the press release for details. And when you're done, check out the 7digital blog.

Enabling drivers to interact safely with applications and services

Since February 2011, QNX Software Systems has been leading an international standards effort to help drivers interact safely with applications and services. And not just apps on phones, but apps running in the cloud, in roadside infrastructure systems, in the car itself, and other locations.

If you jump to the end of this post, you’ll find a list of use cases being targeted by this effort. For now, let’s look at Use Case 2, Scenario A (arbitration of external message), which illustrates how we are working towards a comprehensive framework for managing distraction and workload.

Keeping priorities straight
In this user scenario, a navigation maneuver is given priority over a social media status update message. The blue call-out boxes indicate where the ITU-T recommendations under development can enable safe interaction between the driver and applications. For instance, ITU-T recommendation G.SAM will define mechanisms for prioritizing navigation, while G.V2A will define the communications interface between the app and the driver-vehicle interface (DVI), and P.UIA will recommend characteristics of the auditory social media message.

Remember that the focus here isn't on how to implement social media in the car, but rather, on how best to manage workload and distraction.
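To make the arbitration idea concrete, here is a toy priority-based arbiter in the spirit of Use Case 2, Scenario A. The priority values and the `MessageArbiter` class are my own illustrative assumptions; none of this comes from the G.SAM draft:

```python
import heapq

# Lower number = higher priority; values are illustrative, not from G.SAM
PRIORITY = {"navigation": 0, "hazard": 0, "phone": 1, "social": 2}

class MessageArbiter:
    """Queue driver-facing messages and release them highest-priority first."""

    def __init__(self):
        self._queue = []
        self._seq = 0  # tie-breaker preserves arrival order within a priority

    def submit(self, category, text):
        heapq.heappush(self._queue, (PRIORITY[category], self._seq, text))
        self._seq += 1

    def next_message(self):
        """Return the most urgent pending message, or None if the queue is empty."""
        return heapq.heappop(self._queue)[2] if self._queue else None

arbiter = MessageArbiter()
arbiter.submit("social", "Alice posted a status update")
arbiter.submit("navigation", "Turn left in 200 m")
print(arbiter.next_message())  # the navigation maneuver is delivered first
```

The social media update isn't dropped; it is simply held back until the navigation prompt has been delivered, which is exactly the behavior the scenario describes.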



Giving a navigation maneuver priority over a social media status update message


Often, I am asked how this effort differs from the MirrorLink standard being developed by the Car Connectivity Consortium. The simple answer is that MirrorLink addresses only some of the use cases listed below. For instance, the scope of MirrorLink is limited to applications and services running on nomadic devices. Furthermore, adaptation of the driver-vehicle interface and external applications and services in the current MirrorLink solution uses a simple two-state approach, driving or not driving, which limits the ability of the vehicle to control the timing and modality of communications with the driver. Also, MirrorLink doesn’t adequately address arbitration or integration of communications with all external applications and services.

In for the long haul
At QNX Software Systems, our aim is to:
  1. Work with the relevant parties to identify solutions to the problem of technology-related driver distraction and workload. These parties include automotive, telecommunications, and consumer electronics organizations; standards development groups; academia; and government agencies.
  2. Determine which aspects of the solution should be standardized, then help drive this standardization.
  3. Align QNX product roadmaps as solutions develop.
To be clear, this is a longer term strategy that will take years to realize. Both the standardization process and the time it takes to deploy technology in vehicles must be factored in. Therefore, we are also pursuing shorter term solutions, some of which I hope to cover in future posts.

The end of the beginning
The first major milestone in this effort was achieved at the closing plenary of the ITU-T Study Group 12 meeting, held on March 28 in Geneva. Here, the final report and 4 deliverables of the ITU-T Focus Group on Driver Distraction were approved. There was also approval of a liaison statement communicating these results to a large list of organizations working on this topic.

This marks the end of the focus group, but is really just the beginning for QNX and ITU-T efforts in this area. In future posts, I will explore various aspects of this comprehensive strategy.



Use cases and user scenarios targeted by ITU-T recommendations

Use Case 1: Interaction with external application/service
   a) Application on nomadic device
   b) Application on cloud-based server
   c) Downloaded Application
   d) Broadcast of roadway information
   e) Tethering
Use Case 2: Arbitration and integration of external message
   a) Arbitration of messages
   b) Integration of messages
   c) Both arbitration and integration of messages
   d) E-call
Use Case 3: Negotiation of network Quality of Service (QoS)
   a) Application selects network
   b) Application suspends interaction
   c) Application availability due to roaming
Use Case 4: Management of multiple dialogues
   a) Opening/closing an application
   b) Switching between applications
   c) Interaction with background application
Use Case 5: Adaptation of DVI (driver-vehicle interface) and external applications/services to driver abilities
   a) Driver with disability
   b) Dynamically changing driver capabilities
   c) Detection of impaired driver state
Use Case 6: Adaptation of DVI and external applications/services to roadway situation
   a) Driver busy notification
   b) Delay of message delivery in demanding driving situation
   c) Change message format based on road conditions
   d) Interruption of driver interaction
Use Case 7: Adaptation of DVI and external applications/services to vehicle status
   a) Vehicle enters safe operating condition (e.g., park gear, < 5 m.p.h., etc.)
   b) Driver adjusts vehicle controls (e.g., climate control, etc.)
   c) Suppression of hazard alert due to safe speed
Use Case 8: Adaptation of DVI and external applications/services to local regulations
   a) Application blocked
   b) Application suspended
   c) Interface modality disabled
   d) Age restriction
   e) Content restriction

For details on these use cases, download the FG Distraction Use Cases report.

Using smartphones to prevent traffic jams

Paul Leroux
Smartphones and driving don’t mix, right? Normally, you would get no argument from me. Driving is the one activity where a half-second lapse in attention can translate into a lifetime of regret.

But you know, there’s more than one way to use a smartphone in a car. Take Honda, for example. They’ve been experimenting with an approach that may help prevent accidents, rather than cause them.

Let’s rewind a bit. A few months ago, I wrote a post on the potential benefits of adaptive cruise control. These benefits include a dramatic reduction of traffic congestion and safer distances between cars.

Well, guess what: it seems that a smartphone app can have much the same effect. Recently, Honda equipped a number of drivers with an app that monitors acceleration and deceleration. When a subject drives in a way that avoids causing a traffic jam, the app screen turns green; otherwise, it shows blue.

Simple enough, right? And yet, the results were dramatic: formation of traffic jams was delayed by up to 6 minutes and fuel efficiency shot up by 22%. Not bad for a smartphone app.
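In spirit, the feedback logic is simple. A toy version might flag smooth driving by thresholding the changes between successive speed samples; the threshold, units, and function below are illustrative guesses on my part, not Honda's actual algorithm:

```python
def feedback_color(speed_samples, max_delta=3.0):
    """Return 'green' if changes between successive speed samples stay
    gentle (jam-suppressing driving), else 'blue'. The 3.0 km/h threshold
    is an illustrative guess, not taken from the Honda study."""
    deltas = [abs(b - a) for a, b in zip(speed_samples, speed_samples[1:])]
    return "green" if all(d <= max_delta for d in deltas) else "blue"

print(feedback_color([60, 61, 62, 61]))   # smooth driving: green
print(feedback_color([60, 50, 65, 40]))   # stop-and-go driving: blue
```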

I am, of course, skipping a few details. Read more about the study in Tech-On!, an outlet of Nikkei Business Publications.

Jamming on a theme of connectivity at Automotive Megatrends 2013

Justin Moon
“The only time I really don’t feel connected is when I am driving in my car.”

You can always tell you’ve had a successful conference if you come away with a few “aha” or “oh no” moments. The above quote, which I cannot take credit for, was my “aha” moment at Automotive Megatrends 2013. The conference saw leaders and forward thinkers in the automotive industry come together to discuss (or debate) three technology streams: powertrain, safety, and the stream I participated in, connectivity.

The day began with a panel discussing the Big Picture of in-vehicle connectivity. Five panelists presented their views on business model pros and cons, where connectivity is headed, how it could change consumer expectations, and steps for ensuring success. Following the panelists’ presentations, the floor was opened for questions and dialogue with the audience. It was a great introduction and it set the stage for the rest of the day.

After a brief “networking” break, a panel discussing hardware and the effects associated with connectivity began. The format continued as before: each panelist spoke on their area of interest or expertise, then the audience joined the conversation. I was intrigued by the state of persistent storage in infotainment systems, including the problems and potential solutions for maintaining performance and reliability.

Lunch was just as engaging as the panels and discussions. I participated in a lively table-wide discussion with several analysts and industry thinkers on how the connection in my vehicle needs to become a seamless part of my lifestyle, just like my smartphone or the connected entertainment equipment in my home. This discussion was a great lead-in to the panel I had the pleasure of participating in — Software and Apps.

Clearing a path
The panel followed the same format as the others. Panelists discussed the role of the software platform and where application models fit into the future of the connected vehicle. One engaging panelist, after a rousing lunch discussion, explored how ubiquitous connectivity will clear a path for bigger possibilities with ecological initiatives, safety strategies, and making the vehicle a part of the connected lifestyle consumers already relate to and expect.

The final panel of the day was about wireless networks and had some industry big thinkers jam about infrastructure requirements, futures, and business models.

All in all, my experience at Automotive Megatrends was very positive and I look forward to doing it again next year.

Meet the QNX concept team: Alex James, software engineer

We continue our spotlight on the QNX concept development team with Alex James, who gives us his impressions of the Bentley and the buzz at 2013 CES.

Besides attending CES, Alex worked on the latest QNX technology concept car from conception to launch — an amazing experience for any software engineer.

Working with bleeding-edge technologies sometimes brings unexpected challenges, along with interesting opportunities, as Alex discovered.

If you haven't had a chance to meet the other team members, you can read their stories here.



The challenge of creating an (auto)mobile user experience

On March 12, I had the honor of joining a distinguished group of panelists at a luncheon for the Los Angeles Motor Press Guild. The panelists included:


The purpose of the panel was to share information on trends in the connected car space and in the automotive application ecosystem. The panel was well attended, with journalists from publications like the New York Times, and with representatives from companies like Alpine, Beats by Dr. Dre, Hyundai, and Toyota.

Two things stood out for me. First, the press really picked up on the need for solutions that can offer ease of use, upgradeability, and reliability while also reducing distraction and liability. Second, an expert witness hired by car companies to testify in Lemon Law suits told the panel that he was already being hired to provide testimony in cases involving in-vehicle electronics. He speculated that the technology described on the panel was going to “make him rich.”

His comments help illustrate a point. A car isn’t a mobile phone. OEMs and end-users may want the same kind of fresh and updateable experience that a phone can provide, but unlike a phone, an in-car infotainment system must be simple to use even while you’re driving down the highway. Such systems offer the ideal environment for a hard real-time OS that can also enable the latest consumer technologies and applications in a reliable and easy-to-use way.

Jim Pisz mentioned a sign he saw at the Geneva Motor Show. The sign said “Don’t Worry, Be Appy.” That sign makes me realize that the industry is at a crossroads. OEMs want access to consumer app developers and, in some cases, the apps themselves. At the same time, they want a reliable solution that they won’t have to “worry” about. With QNX’s pedigree of reliability and amazing app ecosystem, we are uniquely positioned to help OEMs build “appy” cars, without the worry.

Traveling on reserve power

Or how a new kind of electro-mobility can be fun. A guest post from Thomas Fleischmann of Elektrobit Automotive.

Thomas Fleischmann
Imagine you are always driving on reserve power. You find this difficult? Get used to it — and welcome to the new era of electro-mobility! Vehicles like the Chevy Volt, with its secondary combustion engine, are already addressing this problem — but having to support two types of driving technology can be complex and costly.

So how can electro-mobility, which is supported by software, be implemented meaningfully? And how do we get drivers to accept it? Certainly not by abandoning the driver with nothing but a nicely animated display of the car’s battery condition.

Let's assume you live in a big city. You don’t even own a car. Instead, you subscribe to a certain number of hours of travel time with your favorite car brand. In addition to your S or XL subscription for four weeks a year, you get access to a fossil-fuel engine for your vacation in another state or country twice a year.

In the morning you find and reserve a car with your BlackBerry phone and get into it at a nearby charge-point. The HMI adapts to your profile settings automatically — your friends, contacts, addresses, and music are already there. The navigation system is your energy consultant; it tells you, based on traffic conditions and topography, how far you can drive with this car and, at the appropriate time, suggests an available electric socket within easy reach. Or the system warns you to turn back soon if you want to arrive home safely. After parking the car successfully, your smartphone guides you for the last few kilometers by bus or subway to your destination — it knows the way, having picked up the route data from your navigation system.

Using software solutions like EB GUIDE or EB street director and frameworks like the QNX CAR application platform you can concentrate on creating the end-user experience and transform the journey on reserve power into something fun and convenient — suddenly an electric vehicle becomes a smart mobility concept.

Check this out; I tried it last week. The yellow center represents the area in which you can drive and still get back home. The white area represents the range you can drive, depending on traffic or topography:





Thomas Fleischmann is Senior Product Manager at Elektrobit Automotive responsible for the HMI solution EB GUIDE. Contact him at Thomas.Fleischmann@elektrobit.com.


Fortune profiles QNX Software Systems

In the latest issue of Fortune magazine, Kurt Wagner takes a look at the history of QNX and specifically highlights our leadership in the automotive industry. In the article, Kurt mentions the automotive companies we are working with — including Audi, Toyota, BMW, Porsche, Honda and more — and also provides a look at the technology concept car we unveiled at CES 2013, based on a Bentley Continental GTC.

To quote Kurt, “…the $191,000 luxury vehicle quickly became a must-see attraction, not for its curvaceous sheet metal but for its futuristic dashboard inside, featuring a massive 17-inch touchscreen, 3-D maps, even videoconferencing.”

You can read the full article in the April 8 issue of Fortune or on the web here (requires a subscription). Want to see more of this gorgeous technology concept car? Go behind the scenes and take a full tour.



The isolation imperative: protecting software components in an ISO 26262 system

Software components can be impolite, if not downright delinquent. For instance, a component might:

  • rob other components of CPU time
  • rob other components of file descriptors and other system resources
  • access the private memory of other components
  • corrupt data shared with other components
  • create a deadlock or livelock situation with other components

Shameful, I know. But in all seriousness, this sort of behavior can wreak havoc in a safety-critical system. For instance, let's say that a component starts to perform a CPU-intensive calculation just as the system enters a failure condition. Will that component hog the CPU and prevent an alarm process from running?

The answer, of course, is that it damn well better not.

It becomes important, then, to prevent components from interfering with one another. In fact, this principle is baked into the ISO 26262 functional safety standard for road vehicles, which defines interference as:

    “...the presence of cascading failures from a sub-element with no ASIL [Automotive Safety Integrity Level] assigned, or a lower ASIL assigned, to a sub-element with a higher ASIL assigned leading to the violation of a safety requirement of the element”

To put it crudely, less important stuff can't stop more important stuff from happening.

So how do you prevent interference? One approach is through isolation. For instance, a system may implement spatial isolation between application processes. This would include mechanisms for interprocess communication and interprocess locking that prevent one process from inadvertently affecting another.
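The payoff of spatial isolation is easy to demonstrate with ordinary OS processes, each of which gets a private address space. Here's a small, generic sketch (plain Python, not QNX-specific code):

```python
from multiprocessing import Process

counter = 0  # lives in the parent process's private memory

def misbehave():
    # Runs in a *separate* process with its own address space, so this
    # assignment changes the child's copy, never the parent's variable.
    global counter
    counter = 999

def run_demo():
    p = Process(target=misbehave)
    p.start()
    p.join()
    return counter  # unchanged: the child could not corrupt our state

if __name__ == "__main__":
    print(run_demo())  # prints 0
```

Threads sharing one address space would offer no such protection; the same assignment would silently clobber the parent's state. That difference is the essence of spatial isolation: any sharing between isolated components must go through explicit, controlled channels.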

Mind you, there are multiple types of interference, so you need to implement multiple forms, or axes, of isolation. Time for a picture:




In general, you need to determine what does, and what doesn't, need to be isolated. You also need to identify which components are apt to be delinquent and build a cage around them to protect more critical components. Which brings me to a recent paper by my inestimable colleagues Chris Hobbs and Yi Zheng. It's titled "Protecting Software Components from Interference in an ISO 26262 System," and it explores techniques that can help you:

  • implement the component isolation required by ISO 26262
  • demonstrate that such isolation has been implemented

And while you're at it, check out the other titles in our "safe" whitepaper series. These include "The Dangers of Over-Engineering a Safe System" and "Ten Truths about Building Safe Embedded Software Systems."

And don't worry: there's nothing delinquent about downloading all of them.

Hello, Bentley: Using Sensory speech technology to create a natural user experience

By Bernard Brafman, Sensory Inc., and Justin Moon, QNX Software Systems

In-vehicle infotainment systems are becoming more complex and more integral to the overall driving experience. As this trend continues, it will become increasingly important to create systems that support multiple forms of user interaction. If you’re driving, the last thing you want to do is enter a destination manually, or search for your favorite artist in a playlist by using a touch screen, jog wheel, or other manual input method. Drivers want a user experience that is both simple and natural; integration of speech recognition technology goes a long way toward achieving that goal.

In fact, speech recognition is a key component of the latest QNX technology concept car, a modified Bentley Continental GT. The speech rec system lets you plot a route or select your favorite artist using natural speech, but it goes even further by letting you simply ask the car to perform an action. Leveraging Sensory’s FluentSoft SDK, more specifically the TrulyHandsfree™ Voice Control technology, the QNX concept development team implemented keyword spotting techniques to interact with the vehicle.

So how does this work? Well, let’s say you’re in Vegas and need directions to the Wynn Casino. To engage the cloud-based Watson speech system, you simply say “Hello Bentley” — no need to push a button. You then complete the request by saying “Take me to the Wynn Casino.” FluentSoft, along with the architecture of the advanced speech recognition system included in the QNX CAR platform, allowed the team to create this seamless, easily implemented, and well-executed voice interaction experience.


When you say “Hello Bentley,” the QNX concept car displays a visual prompt at the top of the screen, indicating that the speech rec system is now listening for natural speech or directed commands.

Multiple triggers
The team plans to further utilize Sensory technology in future concept car releases. The current implementation has the single “Hello Bentley” trigger, which engages the speech system. But TrulyHandsfree Voice Control supports multiple active triggers as well as a robust recognition vocabulary to create a rich command-and-control user experience that doesn’t require prompts or pauses. Thus, it’s possible to create a hybrid system that is seamless and transparent to the user. For instance, “Hello Bentley air 68 degrees” or “Hello Bentley what time is it in Tokyo?” can both be executed flawlessly, regardless of which speech rec system is engaged.

A matter of choice
For an even more personalized experience, this technology lets drivers create their own custom trigger through a simple one-time enrollment process, which either verifies the driver’s identity against a voice password or identifies the speaker as one of several previously enrolled users. This creates a custom experience not only by letting you choose your own trigger phrases (come on now, who hasn’t named their car at some point?), but also by recalling individual preferences such as seat position, steering wheel position, and multimedia presets.

Look for these enhanced features in concept cars to come!



Bernard Brafman is vice president of business development for Sensory, Inc., responsible for strategic business partnerships. He received his MSEE from Stanford University. Contact Bernard at bbrafman@sensoryinc.com

Justin Moon is a global technical evangelist for the automotive business development team at QNX Software Systems.


Pandora interview: Using HTML5 to deliver content to the car

At CES this year, our own Andy Gryc had a chance to sit down with Tom Conrad, CTO at Pandora, a long-time QNX CAR platform partner. Pandora is already in 85 vehicle models today and continues to grow its footprint, not only in automotive but in consumer electronics as well.

Take a couple of minutes to hear Tom's perspective on standardizing on HTML5 across markets and to get a glimpse of the future of Internet radio in automotive. And make sure you watch the whole thing — there are some fun outtakes at the end.



Making the growing number of connected cars continuously better

Guest post by Yoram Berholtz, Director of Market Adoption, Red Bend Software

More and more car manufacturers are implementing over-the-air software updates as a way to improve functionality, fix software defects, and deliver a user experience that keeps getting better. Car manufacturers GM (OnStar) and Daimler (MBRACE 2) have been leaders in recognizing the value of over-the-air updates for improving their infotainment systems. For example, GM recently updated the Bluetooth technology in OnStar to support late-model smartphones.

Even the ability to update the infotainment system manually is an improvement over requiring car owners to visit the dealership every time a new software update is available. For example, Ford recently launched a program that lets consumers update their own MyFord Touch system, mailing owners a USB drive loaded with the appropriate software updates. However, many consumers view manual updates as bothersome and complicated, which means some systems simply don’t get updated. Today’s car owners expect their infotainment systems to have the same user experience as their mobile devices, and that means performing software updates over the air.

Scope and scale
According to ABI Research, there will be 210 million connected cars by 2016. Add the ability to tether a smartphone to the infotainment system, and the main enabler of over-the-air updates is already in place: connectivity.

The updating solution must have both scope and scale. Scope is the flexibility to update all of the system’s memory, including user and system space, with either full images or discrete components. Scale means managing millions of updates without failure and with the highest possible security. This, for example, would enable users of the QNX CAR application platform to update not only the QNX CAR software but also individual applications such as Pandora or The Weather Channel.

In the mobile industry, where over-the-air software updating is a well-established practice, manufacturers and service providers realize many benefits:
 
  • Cost reduction — Over-the-air software updates have reduced warranty costs
     
  • Update success rate — Over-the-air software updates deliver the highest success rate
     
  • Faster updates — Sending only the code that is different between the original software and the update (often called the delta) is faster and uses less bandwidth
     
  • Customer satisfaction — A fast and automatic over-the-air process eliminates the need for the consumer to go to the dealer
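The "delta" idea in the list above can be sketched in a few lines of Python. This is a toy of my own for illustration (it assumes equal-size images); production FOTA engines such as Red Bend's use far more sophisticated differencing and compression:

```python
def make_delta(old: bytes, new: bytes):
    # Toy delta: record only the byte positions that changed.
    # Assumes equal-size images; real differencing also handles
    # insertions, deletions, and compresses the result.
    return [(i, b) for i, (a, b) in enumerate(zip(old, new)) if a != b]

def apply_delta(old: bytes, delta):
    # Patch the old image to reproduce the new one on the device.
    image = bytearray(old)
    for i, b in delta:
        image[i] = b
    return bytes(image)

# A 1 KB "firmware image" with a single one-byte fix:
old = bytes(1024)
patched = bytearray(old)
patched[42] = 0x07
new = bytes(patched)

delta = make_delta(old, new)
assert apply_delta(old, delta) == new
print(len(delta))  # 1 entry shipped instead of the full 1024 bytes
```

Even in this naive form, the bandwidth win is obvious: the device downloads only what changed, then rebuilds the new image locally.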

A holistic solution
The mobile industry has enjoyed these benefits for some time. The automotive industry needs over-the-air updating even more: an infotainment system includes millions of lines of code, and updating this software requires a holistic solution that can manage the whole software life-cycle.

Red Bend Software has integrated its vRapid Mobile® update technology, which runs on more than 1.6 billion devices, into the QNX CAR platform. This gives car manufacturers and Tier 1 providers the flexibility to create an over-the-air update strategy optimized for infotainment systems as well as for other embedded systems in the car. Today, infotainment systems are central to the car cockpit experience. These systems contain not only the QNX CAR 2 platform but also a variety of applications. Automotive applications are not like their mobile counterparts: they are modified to suit the car environment, with more voice activation and larger buttons so the driver isn’t distracted.

Car manufacturers are looking at their infotainment systems as product differentiators and as a valuable asset to generate revenues after the sale. The automobile industry doesn’t want Over-the-Top companies controlling the delivery channel to the infotainment system and weakening automotive brands. With a holistic Firmware Over-the-Air (FOTA) solution, car manufacturers can guarantee ownership of the infotainment firmware and applications, increasing the consumers’ perceived value through a much stronger brand.

Not if, but when
No longer is the auto industry asking whether to perform over-the-air updates. Now car manufacturers and tier one suppliers are asking how often, and when, updates should be provided during the life-cycle of the infotainment system.



Yoram Berholtz is the Director of Market Adoption at Red Bend Software, the market leader in Mobile Software Management. Mr. Berholtz is responsible for working with mobile operators and device manufacturers to help them increase and improve their usage of over-the-air software updating. In addition, he has responsibility for developing partnerships and go-to-market strategies in the Automotive and Connected Device markets, and oversees the Red Bend Certified™ Interoperability program. Mr. Berholtz has experience in engineering, product management and partner management with an emphasis on mobile communications technologies, having worked at Motorola, Pelephone, ECI Telecom, Schema, Intel and Marvell.

Creating HTML5 apps for the car

Adding a downloadable app capability to the car isn't just a matter of bolting consumer-grade technology onto an automotive hardware platform, dusting your hands, and calling it a day.

Apps should be integrated into the vehicle experience, which means they need access to vehicle resources. But you must carefully control that access: the apps should be isolated in their own environment to protect the rest of the car software. Most of all, the apps need to conform to safe driving practices, which typically entails a re-write of the user interface.

Still, we should leverage as much as possible from the mobile world. That’s where the real innovation happens; the mobile community provides a much bigger pool of fresh ideas than automakers could ever build by themselves. And the best tools and libraries are focused on mobile development.

That’s why QNX Software Systems is building the best of both: an application tool that draws heavily from mobile, but is adapted to the car. It's provisionally named the HTML5 SDK for the QNX CAR application platform and, while it isn't yet available to the public, beta versions are now available for QNX CAR platform customers.

For a preview of what we’ll be rolling out, check out this video:




Making your car a first-class citizen of the Web

Tina Jeffrey
Anyone who follows the latest goings-on at the World Wide Web Consortium (W3C) may have heard today’s news: the launch of the Automotive and Web Platform Business Group. We live in a connected world, and let’s face it, many of us expect access to our favorite applications and services while on the road. I see the formation of this W3C group as a huge step in the pursuit of marrying web technology and the automobile.

The business group will bring together developers, OEMs, and automotive technology vendors — many of whom, like QNX, were part of the Web and Automotive Workshop held last November. The group allows us to continue the discussion and to define a vehicle data API standard for enabling automotive services via the Web. And this is just the start of greater things to come: standards for OTA (over-the-air) software updates, driver safety, security, and seamless integration of smartphones and tablets.

As a member of the QNX automotive team, I second the enthusiasm my colleague Andy expressed in the announcement: we’re extremely excited to be part of this group and of the process of helping to define these standards for the industry.

Check out the W3C press release.



Tina is an automotive product marketing manager at QNX Software Systems.

An (info)graphic look at self-driving cars

If I were in the insurance industry, I'd be following the development of autonomous cars with keen interest. Think about it: all those cars will have to be insured, but they will probably get into fewer accidents (and incur fewer insurance settlements) than conventional vehicles. That could be good for business as well as for safety.

So why am I bringing this up? Because InsuranceQuotes.com has come up with an infographic on autonomous cars, and it's a doozy. (Trivia dep't: Some believe that the expression "it's a doozy" was coined by the legendary automaker Duesenberg, as part of a campaign to promote its vehicles. Others disagree. I thought you'd want to know.)

Kidding aside, the infographic does a nice job of summarizing the potential benefits of self-driving vehicles, including greater safety, faster traffic flow, reduced fuel wastage, and increased mobility for people with physical disabilities.

Of course, if these benefits are borne out, we will all have to come to terms with the inevitable conclusion: computers do a better job of driving than humans. If you can get comfortable with that, you should survive the year 2040 with a minimum of future shock.

Self-driving cars

Infographic from Bankrate Insurance’s InsuranceQuotes.com

First look: HTML5 SDK for the QNX CAR platform

Whenever I hear the word “ripple,” I think of ice cream: butterscotch ripple, chocolate ripple, lemon ripple, and (yum) strawberry ripple. Well, the video I'm about to show you isn’t about ice cream, but it is about something that’s just as cool and just as sweet: the Ripple mobile environment emulator.

Ripple already supports multiple platforms, such as BlackBerry 10 and Apache Cordova, allowing developers to preview how their apps will look and function on a variety of mobile devices. And now, thanks to extensions provided by the QNX CAR development team, it will also emulate how an app looks and performs in a vehicle infotainment system.

Simply put, the same tool that helps app developers target mobile platforms will also help them target the car.

QNX Software Systems will offer this modded version of Ripple as part of the HTML5 SDK for the QNX CAR platform. The goal of the SDK is simple: to help mobile developers and automakers work together on creating apps for in-vehicle infotainment systems.

If you’re a developer, you’ll be able to use the Ripple emulator and its associated Web Inspector to perform JavaScript debugging, HTML DOM inspection, automated testing, and screen-resolution emulation, all from the convenience of a web browser. You’ll also be able to modify your apps and view the results without having to recompile — simply edit your source code and hit refresh in the browser. You’ll even be able to perform remote debugging on the evaluation board or final hardware used by the automaker, again from the same browser environment.

Enough from me. Let’s get the complete scoop from my colleague Andy Gryc:



Designing interfaces from the outside in

User interfaces are a pet peeve of mine.

I’m one of those people whose VCR always blinked 12:00. Not because I couldn’t figure it out but because I resented that I had to.

Basically, I have neither the time nor the inclination to read manuals. If I’m paying good money for a consumer-facing product then it better not require an engineering degree to use it.

Not surprisingly, then, I think UI design is every bit as important as the product itself; maybe even more so. Because if your user experience sucks, make no mistake: I will be walking, and talking, to your competitors.

It wasn’t until I entered the glamorous world of software development that I came to the following conclusion: Interfaces are complicated because development tools require an engineer (or similarly brilliant individual) to use them.

Of course, this is a sweeping statement, and I’ll gladly debate it, but the point is this: someone with unique skills and knowledge about user-centric design should be creating interfaces. Not someone who knows the product from the inside out.

I know that, in a traditional model, this can create a lot of churn, but companies like Crank Software have come up with a way to decouple the roles of embedded engineer and UI designer, allowing them to work in parallel while focusing on their individual core competencies.

I spoke to several members of the QNX concept development team when they were deeply immersed in creating the latest technology concept car. It was obvious when talking to the engineers and the UI designers that Crank’s Storyboard made both jobs that much easier and the process a whole lot quicker. The end result, achieved in a very short time frame, speaks for itself.



This is great news for people like me who curse like sailors whenever using a remote, microwave, GPS, treadmill, camera, and so on. Indeed, I'm counting on teams like QNX and Crank to ensure the digital car is an enjoyable and intuitive experience. If not, I do know who I'm gonna call.

The 10 qualities of highly effective hands-free systems

The first time I saw — and heard — a hands-free kit in action was in 1988. (Or was it 1989? Meh, same difference.) At the time, I was pretty impressed with the sound quality. Heck, I was impressed that hands-free conversations were even possible. You have to remember that mobile phones were still an expensive novelty — about $4000 in today’s US dollars. And good grief, they looked like this:



It’s almost a shock to see how far we’ve come since 1988. We’ve become conditioned to devices that cost far less, do far more, and fit into much smaller pockets. (Though, admittedly, the size trend for smartphones has shifted into reverse.) Likewise, we’ve become conditioned to hands-free systems whose sound quality would put that 1988 kit to shame. The sound might have been okay at the time, but because of the contrast effect, it wouldn’t pass muster today. Our ears have become too discerning.

Which brings me to a new white paper from Phil Hetherington and Andrew Mohan of the acoustics team at QNX Software Systems. Evaluating hands-free solutions from various suppliers can be a complex endeavor, for the simple reason that hands-free systems have become so sophisticated. To help simplify the decision process, Phil and Andrew have boiled the problem down to 10 key factors:

  • Acoustic echo cancellation
  • Noise reduction and speech reconstruction
  • Multi-channel support
  • Automatic gain control
  • Equalization
  • Wind buffet suppression
  • Intelligibility enhancement
  • Noise dependent receive gain
  • Bandwidth extension
  • Wideband support

Ultimately, you must judge a hands-free solution by the quality of the useful sound it delivers. By focusing on these 10 essentials, you can make a much sounder judgment (pun fully intended).
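To give a flavor of just one of these factors, here is a toy automatic gain control in Python, my own illustrative sketch rather than anything from the paper: it scales a block of samples toward a target RMS level. A real AGC would also smooth the gain across blocks to avoid audible pumping.

```python
import math

def agc_block(samples, target_rms=0.1, max_gain=10.0):
    # Measure the block's RMS (root-mean-square) level.
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    if rms == 0:
        return list(samples)  # silence: nothing to normalize
    # Cap the gain so we don't amplify background noise without bound.
    gain = min(target_rms / rms, max_gain)
    return [s * gain for s in samples]

quiet = [0.01, -0.01, 0.02, -0.02]   # a too-quiet far-end talker
boosted = agc_block(quiet)
rms = math.sqrt(sum(s * s for s in boosted) / len(boosted))
print(round(rms, 3))  # ≈ 0.1, the target level
```

The capped gain illustrates why AGC is a design trade-off: boost too aggressively and you raise the noise floor along with the voice.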

Recently, Electronic Design published a version of this paper on its website. For a longer version, which includes a decision checklist, visit the QNX download center.

Meet the QNX concept team: True Nguyen, UX designer

We continue our spotlight on the QNX concept development team with True Nguyen, the team's user experience designer.

We interviewed True just prior to CES 2013, and she was hoping that people's impressions of the latest QNX technology concept car would be as fantastic as hers. True's love of cars stems from her childhood, and that really comes out in the interview.

If you haven't had a chance to meet the other team members, you can read their stories here.

Next up, we'll interview Alexandre James to get his impressions of the Bentley and the buzz from CES 2013.



Our best CES yet

Anecdotes and observations from the QNX booth at 2013 CES

As a wrap-up to last week’s Consumer Electronics Show, I would love to regale you with all the cool technologies and nifty gadgets that I saw. But over the course of the entire four days, I rarely left the 20’x40’ patch of white carpet that was the QNX booth — with brief exceptions, of course, for bodily maintenance. The booth was just too busy for me to get away. If you checked out the QNX booth webcam, you know what I'm talking about.

Paul Leroux and Nancy Young have already posted a lot of information and photos about the show and the new QNX concept car, which is based on a Bentley Continental GT. So let me provide my personal view of CES through assorted anecdotes or observations collected at the booth.

  • As you’d expect, the Bentley got a lot of attention. But our reference vehicle, based on a Jeep Wrangler, got more attention than I thought it would, even though this is the third time we’ve shown it in public. Many of the people interested in the Jeep just wanted to see what our QNX CAR application platform looked like “out of the box” without customization. And some were confessed Jeep or truck aficionados, without the “luxury brand lust” experienced by most.
     
  • People in the auto industry knew who we were without introduction. Non-automotive people didn’t know who we were until I mentioned that “we are a wholly owned subsidiary of Research In Motion,” at which point most of them said “Oh, you’re that QNX.” Seems that your average person has heard quite a bit about QNX in the context of BlackBerry, but has no idea that the same company is doing things in automotive — or in anything else, for that matter. I usually then spoke about our 30+ year legacy in life- and mission-critical systems. When people learned that an OS used for mission-critical systems could also power their next phone, their reaction was “wow—that’s really cool.”
     
  • Tanner Foust is a really nice young kid. (Actually, he’s not that much younger than me, but he sure looks young!) I didn’t know who he was when he was being filmed in the booth, surrounded by a throng of admirers. But since then, I’ve watched a lot of his YouTube videos and boy, can he drive! He's an accomplished race car driver, TV personality, and stuntman for lots of famous movies, but it’s nice to see he hasn’t let it go to his head.
     
  • We wanted to make sure that our concept car respected the Bentley brand. To do that, we ran our design sketches by the folks at Bentley and they occasionally suggested some tweaks. It was all our own work, however, and the Bentley folks never saw it before it hit the show floor. When they came to the booth, they were very happy with what they saw — enough so that they said “it looked like we did it.” That, to me, was the ultimate compliment.
     
  • Most frequent question: “Are you giving this away?” As it turns out, it’s something that people have said for every concept car we’ve done to date. Second most frequent question: “Can I drive it?” Unfortunately, but unsurprisingly, the answer to both is “No.”
     
  • I was a little surprised by the enthusiastic response to the car's video conferencing. Of course, it works only while the car is parked, and you only get audio while the car is in Drive. But the part that seemed to impress people the most is the audio: two-channel stereo and a full 20 Hz to 22 kHz frequency range mean that the call sounds much better than your typical hands-free call. You could see the reaction when our director of acoustics, Phil Hetherington, started talking — you don’t know what you’ve been missing until you hear it.
     
  • Bentley wanted us to add our video conferencing solution to the technology concept car. Because many Bentley vehicle owners aren’t necessarily the drivers, this feature makes a whole lot more sense for rear-seat systems than you might initially imagine.
     
  • I was really impressed by two members of the media: Brian Cooley of CNET and Craig Peterson of Clear Channel. Both could receive a five minute technology core dump, quickly digest it, and talk intelligently about it on video or live radio (respectively) with no stumbles, questions, or missteps. I’ve had the pleasure of seeing both in action before, but their consummate professionalism is really quite amazing.
     
  • Like every other QNX’er, I was delighted that we didn’t win the CNET Best of CES award! Instead, our customer, Chevrolet, won it for their MyLink system, and we couldn’t have been happier. Two out of the three nominees were QNX-based systems (the Garmin K2 was the other), so our odds were good. I’d rather never win another Best of CES award if it meant that one of our customers could always win instead.
     
  • A number of people asked about the RIM booth and its absence. I explained that RIM was focusing on their launch at the end of January, and that since they wouldn’t have a new product to show the public, it didn’t make sense to be there. (It’s notable that Microsoft wasn’t there either, and Apple never is.) RIM was in Las Vegas in a hotel outside the convention center, giving media private previews of the upcoming phones that seemed to be extremely well received. And we had a few of our RIM compatriots helping us out at the QNX booth as well.
     
That’s all I’ve got to say about CES 2013 — our best show yet. See you next year!

Okay, time to get technical

Have glossy photos of the QNX concept car left you hungry for more? Dig into a technical whitepaper with our friends from Texas Instruments.

By now, many of you have seen photos and videos of the new QNX technology concept car, a specially modded Bentley Continental GT. Now, I'd like to say that the car was completed in record time by a small team of highly creative QNX engineers. And in many ways, that's absolutely true. But it's just as true that the work started more than 10 years ago, when QNX Software Systems started to build deep partnerships with leading players in the auto industry.

Because the truth is, you don't create this kind of magic overnight. And you don't do it on your lonesome. QNX has become successful in automotive for many reasons, but one of the most important is our ability to work closely, and productively, with A-list partners like Texas Instruments.
Inside the concept car
Take a look at the amazing displays in the Bentley, and the speed at which the screens redraw, and you get a taste of just how well QNX software and TI silicon work together under the covers.

Which brings me to a new white paper co-authored by Andy Gryc of QNX, and Matt Watson and Scott Linke of TI. It's titled "In-Vehicle Connectivity is So Retro," and among other things, it tells the story of how technologies from QNX and TI have co-evolved to help automotive developers build high-performance systems in less time and at less cost.

If your working vocabulary includes terms like OMAP 5, 1080p video decode/encode, floating-point DSP, MOST MLB, Ethernet AVB, PCIe, SATA, WiLink, Bluetooth, GPS, and NFC, this paper is for you.