Sarah Tatsis is the Vice President of Advanced Technology Development Labs at BlackBerry.
BlackBerry already secures more than 500M endpoints, including 150M cars on the road. BlackBerry is leading the way with a single platform for securing, managing and optimizing how intelligent endpoints are deployed in the enterprise, enabling customers to stay ahead of the technology curve that will reshape every industry.
BlackBerry launched the Advanced Technology Development Lab (BlackBerry Labs) in late 2019. What was the strategic importance of creating an entirely new business division for BlackBerry?
As an innovation accelerator, BlackBerry Advanced Technology Development Labs is an intentional investment of 120 team members into the future of the company. The rise of the Internet of Things (IoT) alongside a dynamic threat landscape has fostered a climate where organizations have to guard against new threats and breaches at all times. We’ve handpicked the team to include experts in the embedded IoT space with diverse capabilities, including strong data science expertise, whose innovation funnel investigates, incubates and develops technologies to keep BlackBerry at the forefront of security innovation. ATD Labs works in strong partnership with the other BlackBerry business units, such as QNX, to further the company’s commitment to safety, security and data privacy for its customers. BlackBerry Labs is also partnering with universities on active research and development. We’re quite proud of these initiatives and think they will greatly benefit our future roadmap.
Last year, BlackBerry Labs successfully integrated Cylance’s machine learning technology into BlackBerry’s product pipeline. BlackBerry Labs is currently focused on incubating and developing new concepts to accelerate the innovation roadmaps for our Spark and IoT business units. My role is primarily helping to drive the innovation funnel and partner with our business units to deliver valuable solutions for our customers.
What type of products are being developed at BlackBerry Labs?
BlackBerry Labs is facilitating applied research and using insights gained to innovate in the lines of business where we’re already developing market-leading solutions. For instance, we’re applying machine learning and data science to our existing areas of application, such as automotive and mobile security. This is possible in large part due to the influx of BlackBerry Cylance technology and expertise, which allows us to combine our ML pipeline and market knowledge to create solutions that are securing information and devices in a really comprehensive way. As new technologies and threats emerge, BlackBerry Labs will allow us to take a proactive approach to cybersecurity, not only updating our existing solutions, but evaluating how we can branch out and provide a more comprehensive, data-based, and diverse portfolio to secure the Internet of Things.
At CES, for instance, we unveiled an AI-based transportation solution geared towards OEMs and commercial fleets. This solution provides a holistic view of the security and health of a vehicle and provides control over that security for a manufacturer or fleet manager. It also uses machine learning-based continuous authentication to identify the driver of a vehicle based on past driving behavior. Born in BlackBerry Labs, this concept marked the first time BlackBerry Cylance’s AI and ML technologies have been integrated with BlackBerry QNX solutions, which are currently powering upwards of 150 million vehicles on the road today.
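To make the continuous-authentication idea concrete, here is a minimal, hypothetical sketch (not BlackBerry's actual model): a driver is enrolled as a statistical profile over behavioral features such as braking force and steering rate, and the system keeps accepting the driver only while recent behavior stays close to that profile. All names and thresholds here are illustrative assumptions.

```python
def driver_score(profile_mean, profile_std, sample):
    # Average z-score distance of the current driving features
    # (e.g. braking force, steering rate) from the enrolled
    # driver's behavioral profile.
    z = [abs(s - m) / sd for s, m, sd in zip(sample, profile_mean, profile_std)]
    return sum(z) / len(z)

def is_same_driver(profile_mean, profile_std, sample, threshold=2.0):
    # Continuous authentication: accept the driver while recent
    # behavior stays within the threshold of the learned profile.
    return driver_score(profile_mean, profile_std, sample) < threshold
```

In a real system the profile would be learned by an ML model over time rather than fixed means and deviations, but the decision loop is the same: score fresh behavior against the enrolled identity, continuously.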
For additional insights into how we envision AI and ML shaping the world of mobility in the years to come, I would encourage you to read ‘Security Confidence Through Artificial Intelligence and Machine Learning for Smart Mobility’ from our recently released ‘Road to Mobility’ guide. Also released at this year’s CES, The Road to Mobility: The 2020 Guide to Trends and Technology for Smart Cities and Transportation is a comprehensive resource that government regulators, automotive executives and technology innovators can turn to for forward-thinking considerations for making safe and secure autonomous and connected vehicles a reality, delivering a transportation future that drivers, passengers and pedestrians alike can trust.
Featuring a mix of insights from both our own internal experts and recognized voices from across the transportation industry, the guide provides practical strategies for anyone who’s interested in playing a vital role in shaping what the vehicles and infrastructure of our shared autonomous future will look like.
How important is artificial intelligence to the future of BlackBerry?
As both the IoT and cybersecurity risk explode, traditional methods of keeping organizations, things, and people safe and secure are becoming unscalable and ineffective. Preventing, detecting, and responding to potential threats needs to account for large amounts of data and intelligent automation of appropriate responses. AI and data science include tools that address these challenges and are therefore critical to the roadmap of BlackBerry. These tools allow BlackBerry to provide even greater value to our customers by reducing risk in efficient ways. BlackBerry leverages AI to deliver innovative solutions in the areas of cybersecurity, safety and data privacy as part of our strategy to connect, secure, and manage every endpoint in the Internet of Things.
For instance, BlackBerry trains our endpoint protection AI model against billions of files, good and bad, so that it learns to autonomously convict or clear files pre-execution. The result of this massive, ongoing training effort is a proven track record of blocking payloads attempting to exploit zero-days for up to two years into the future.
The ability to protect organizations from zero-day payloads, well before they are developed and deployed, means that when other IT teams are scrambling to recover from the next major outbreak, it will be business as usual for BlackBerry customers. For example, WannaCry, which rendered millions of computers across the globe useless, was prevented by a BlackBerry (Cylance) machine learning model developed, trained, and deployed 24 months before the malware was first reported.
BlackBerry’s QNX software is embedded in more than 150 million cars. Can you discuss what this software does?
Our software provides the safe and secure software foundation for many of the systems within the vehicle. We have a broad portfolio of functional safety-certified software including our QNX operating system, development tools and middleware for autonomous and connected vehicles. In the automotive segment, the company’s software is deployed across the vehicle in systems such as ADAS and Safety Systems, Digital Cockpits, Digital Instrument Clusters, Infotainment, Telematics, Gateways, V2X and increasingly is being selected for chassis control and battery management systems that are advancing in complexity.
QNX software includes cybersecurity which protects autonomous vehicles from various cyber-attacks. Can you discuss some of the potential vulnerabilities that autonomous vehicles have to cyberattacks?
I think there is still a misconception out there that when you get into your car to drive home from work later today you might fall prey to a massive, coordinated vehicle cyberattack in which a rogue state threatens to hold you and your vehicle ransom unless you meet their demands. Hollywood movies are good at exaggerating what is possible, for example, the instant and complete compromise of entire fleets that undermines every safety system in the cars. While there are, and always will be, vulnerabilities within any system, exploiting a vulnerability at scale and with unprecedented reliability presents all kinds of hurdles that must be overcome, and would require a significant investment of time, energy and resources. I think the general public needs to be reminded of this, and of the fact that hacks, if and when they do occur, are undesirable but not as dramatic as movies would have you believe.
With a modern connected vehicle now containing well over 100 million lines of code and some of the most complex software ever deployed by automakers, the need for robust security has never been greater. As the software in a car grows, so does the attack surface, which makes it more vulnerable to cyberattacks. Each poorly constructed piece of software represents a potential vulnerability that can be exploited by attackers.
BlackBerry is perfectly positioned to address these challenges as we have the solutions, the expertise and pedigree to be the safety certified and secure foundational software for autonomous and connected vehicles.
How does QNX software protect vehicles from these potential cyberattacks?
BlackBerry has a broad portfolio of products and services to protect vehicles against cybersecurity attacks. Our software has been deployed in critical embedded systems for over three decades and, it’s worth pointing out, has been certified to the highest level of automotive functional safety certification, ISO 26262 ASIL D. As a company, we are investing significantly to broaden our safety and security product and services portfolio. Simply put, this is what our customers demand and rely on from us – a safe, secure and reliable software platform.
As it pertains to security, we firmly believe that security cannot be an afterthought. For automakers and the entire automotive supply chain, security should be inherent in the entire product lifecycle. As part of our ongoing commitment to security, we published a 7-Pillar Cybersecurity Recommendation to share our insight and expertise on this topic. In addition to our safety-certified and secure operating system and hypervisor, BlackBerry provides a host of security products – such as managed PKI, FIPS 140-2 certified toolkits, key inject tools, binary code static analysis tools, security credential management systems (SCMS), and secure Over-The-Air (OTA) software update technology. The world’s leading automakers, tier ones, and chip manufacturers continue to seek out BlackBerry’s safety-certified and highly-secure software for their next-generation vehicles. Together with our customers we will help to ensure that the future of mobility is safe, secure and built on trust.
Can you elaborate on what is the QNX Hypervisor?
The QNX® Hypervisor enables developers to partition, separate, and isolate safety-critical environments from non-safety critical environments reliably and securely; and to do so with the precision needed in an embedded production system. The QNX Hypervisor is also the world’s first ASIL D safety-certified commercial hypervisor.
What are some of the auto manufacturers using QNX software?
BlackBerry’s pedigree in safety, security, and continued innovation has led to its QNX technology being embedded in more than 150 million vehicles on the road today. It is used by the top seven automotive Tier 1s, and by 45+ OEMs including Audi, BMW, Ford, GM, Honda, Hyundai, Jaguar Land Rover, KIA, Maserati, Mercedes-Benz, Porsche, Toyota, and Volkswagen.
Is there anything else that you would like to share about Blackberry Labs?
BlackBerry is committed to constant and consistent innovation – it’s at the forefront of everything we do – but we also have a unique legacy as one of the pioneers of mobile-based security and, beyond that, of truly secure devices, endpoints, and communications. The lessons we learned over the past decades, as well as the technology we developed, will be instrumental in helping us create a new standard for privacy and security as the tsunami of connected devices enters the IoT. Much of what BlackBerry has done in the past is re-emerging in front of us, and we’re one of the only companies prioritizing a fundamental belief that all users deserve solutions that allow them to own their data and secure their communications – it’s baked into our entire development pipeline and is one of our key differentiators. BlackBerry Labs is combining this history with new technology innovations to address the rapidly expanding landscape of mobile and connected endpoints, including vehicles, and increased security threats. Through our strong partnerships with BlackBerry business units we are creating new features, products, and services to deliver value to both new and existing customers.
Thank you for the wonderful interview and for your extensive responses. It’s clear to me that BlackBerry is at the forefront of technology and its best days are still ahead. Readers who wish to learn more should visit the BlackBerry website.
Phil Duffy, VP of Product, Program & UX Design at Brain Corp – Interview Series
Phil Duffy is the VP of Product, Program & UX Design at Brain Corp, a San Diego-based technology company specializing in the development of intelligent, autonomous navigation systems for everyday machines.
The company was co-founded in 2009 by world-renowned computational neuroscientist, Dr. Eugene Izhikevich, and serial tech entrepreneur, Dr. Allen Gruber. Brain Corp’s initial work involved advanced R&D for Qualcomm Inc. and DARPA. The company is now focused on developing advanced machine learning and computer vision systems for the next generation of self-driving robots.
Brain Corp powers the largest fleet of autonomous mobile robots (AMRs) with over 10,000 robots deployed or enabled worldwide and works with several Fortune 500 customers like Walmart and Kroger.
What attracted you initially to the field of robotics?
My personal interest in developing robots over the last two decades stems from the fact that intelligent robots are one of the two major unfulfilled dreams of the last century—the other dream being flying cars.
Scientists, science-fiction writers, and filmmakers all predicted we would have intelligent robots doing our bidding and helping us in our daily lives a long time ago. As part of fulfilling that vision, I am passionate about developing robots that tackle the repetitive, dull, dirty, and dangerous tasks that robots excel at, but also building solutions that highlight the unique advantages of humans performing creative, complex tasks that robots struggle with. Developing robots that work alongside humans, both empowering each other, ensures we build advanced tools that help us become more efficient and productive.
I am also driven by being part of a fledgling industry that is building the initial stages of the robotics ecosystem. The robotics industry of the future, like the PC or smartphone industry today, will include a wide array of technical and non-technical staff, developing, selling, deploying, monitoring, servicing, and operating robots. I’m excited to see how that industry grows and how decisions we make today impact the industry’s future direction.
In 2014, Brain Corp pivoted from performing research and development for Qualcomm, to the development of machine learning and computer-vision systems for autonomous robots. What caused this change?
It was really about seeing a need and opportunity in the robotics space and seizing it. Brain Corp’s founder, Dr. Eugene Izhikevich, was approached by Qualcomm in 2008 to build a computer based on the human nervous system to investigate how mammalian brains process information and how biological architecture could potentially form the building blocks to a new wave of neuromorphic computing. After completing the project, Eugene and a close-knit team of scientists and engineers decided to apply their computational neuroscience and machine learning approaches to autonomy for robots.
While exploring different product directions, the team realized that the robotics industry of the day looked just like the computer industry before Microsoft—dozens of small companies all adding custom software to a recipe of parts from the same hardware manufacturer. Back then, lots of different types of computers existed, but they were all very expensive and did not work well with each other. Two leaders in operating systems emerged, Microsoft and Apple, with two different approaches: while Apple focused on building a self-contained ecosystem of products and services, Microsoft built an operating system that could work with almost any type of computer.
The Brain Corp team saw the value in creating a “Microsoft of robotics” that would unite all of the disparate robot solutions under one cloud-based software platform. Their goal became to help build out the emerging category of autonomous mobile robots (AMRs) by providing autonomy software that others could use to build their robots. The Brain Corp team decided to focus on making a hardware-agnostic operating system for AMRs. The idea was simple: to enable builders of robots, not build the robot intelligence themselves.
What was the inspiration for designing an autonomous scrubber versus other autonomous technologies?
Industrial robotic cleaners were the perfect way to enter the market with our technology. The commercial floor cleaning industry was in the midst of a labor shortage when we started out—constant turnover meant many jobs were simply not getting done. Autonomous mobile cleaning robots would not only help fill the labor gap in an essential industry, they would also be scalable—every environment has a floor and that floor probably needs cleaning. Floorcare was therefore a good opportunity for a first application.
Beyond that, retail companies spend about $13B on floorcare labor annually. Most employ cleaning staff who use large machines to scrub store floors, which is rote, boring work. Workers drive around bulky machines for hours when their time could be better spent on tasks that require acuity. An automated floor cleaning solution would fill in for missing workers while optimizing the efficiency and flow of store operations. By automating the mundane, boring task of scrubbing store floors, retail employees would be able to spend more time with customers and have a greater impact on business, ultimately leading to greater job satisfaction.
Can you discuss the challenge of designing robots in an environment that often involves tight spaces and humans who may not be paying attention to their surroundings?
It’s an exciting challenge! Retail was the perfect first implementation environment for Brain Corp’s system because stores are complex environments that pose a real autonomy challenge, and they are ripe with edge cases that allow Brain Corp to collect data that refines the BrainOS navigation platform.
We addressed these challenges of busy and crowded retail environments by building an intelligent system, BrainOS, that uses cameras and advanced LIDAR sensors to map the robot’s environment and navigate routes. The same technology combination also allows the robots to avoid people and obstacles, and find alternate routes if needed. If the robot encounters a problem it cannot resolve, it will call its human operator for help via text message.
The robots learn how to navigate their surroundings through Brain Corp’s proprietary “teach and repeat” methodology. A human first drives the robot along the route manually to teach it the right path, and then the robot is able to repeat that route autonomously moving forward. This means BrainOS-powered robots can navigate complex environments without major infrastructure modifications or relying on GPS.
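The teach-and-repeat loop described above can be sketched in a few lines. This is an illustrative toy, not BrainOS code: the teach phase simply logs poses while a human drives, and the repeat phase steers toward the waypoint just past the closest recorded one.

```python
class TeachAndRepeat:
    """Minimal sketch of a teach-and-repeat navigator: record poses
    while a human drives the route, then follow the recorded path."""

    def __init__(self):
        self.path = []  # (x, y) waypoints captured during the teach run

    def teach(self, x, y):
        # Teach phase: the operator drives; each pose is logged.
        self.path.append((x, y))

    def next_waypoint(self, x, y):
        # Repeat phase: find the closest recorded waypoint, then
        # target the one just after it along the taught route.
        dists = [(px - x) ** 2 + (py - y) ** 2 for px, py in self.path]
        i = dists.index(min(dists))
        return self.path[min(i + 1, len(self.path) - 1)]
```

A real robot fuses lidar and camera localization to know where it is on the taught path and layers obstacle avoidance on top, but the core contract is the same: one human demonstration, then autonomous repetition.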
How has the COVID-19 pandemic accelerated the adoption of Autonomous Mobile Robots (AMRs) in public spaces?
We have seen a significant uptick in autonomous usage across the BrainOS-powered fleet as grocers and retailers look to enhance cleaning efficiency and support workers during the health crisis.
During the first four months of the year, usage of BrainOS-powered robotic floor scrubbers in U.S. retail locations rose 18% compared to the same period last year, including a 24% y-o-y increase in April. Of that 18% increase, more than two-thirds (68%) occurred during the daytime, between 6 a.m. and 5:59 p.m. This means we’re seeing retailers expand usage of the robots to daytime hours when customers are in the stores, in addition to evening or night shifts. We expect this increase to continue as the value of automation comes sharply into focus.
What are some of the businesses or government entities that are using Brain Corp robots?
Our customers include top Fortune 500 retail companies including Walmart, Kroger, and Simon Property Group. BrainOS-powered robots are also used at several airports, malls, commercial buildings, and other public indoor environments.
Do you feel that this will increase the overall comfort of the public around robots in general?
Yes, people’s perception of robots and automation in general is changing as a result of the pandemic. More people (and businesses) realize how robots can support human workers in meaningful ways. As more businesses reopen, cleanliness will need to be an integral part of their brand and image. As people start to leave their homes to shop, work, or travel, they will look to see how businesses maintain cleanliness. Exceptionally good or poor cleanliness may have the power to sway consumer behavior and attitudes.
As we’ve seen in recent months, retailers are already using BrainOS-powered cleaning robots more often during daytime hours, showing their commitment and investment in cleaning to consumers. Now more than ever, businesses need to prove that they’re providing a safe and clean environment for customers and workers. Robots can help them deliver that next level of clean—a consistent, measurable clean that people can count on and trust.
Another application by Brain Corp is the autonomous delivery tug. Could you tell us more about what this is and the use cases for it?
The autonomous delivery tug, powered by BrainOS, enables autonomous delivery of stock carts and loose-pack inventory for any indoor point-to-point delivery needs, enhancing efficiency and productivity. The autonomous delivery tug eliminates inefficient back and forth material delivery and works seamlessly alongside human workers while safely navigating complex, dynamic environments such as retail stores, airports, warehouses, and factories.
A major ongoing challenge for retailers—one that has been exacerbated by the COVID-19 health crisis—is maintaining adequate stock levels in the face of soaring demand from consumers, particularly in grocery. Additionally, the process of moving inventory and goods from the back of a truck, to the stockroom, and then out to store shelves, is a laborious and time-consuming process requiring employees to haul heavy, stock-laden carts back and forth multiple times. The autonomous delivery tug aims to help retailers address these restocking challenges, taking the burden off store workers and providing safe and efficient point-to-point delivery of stock without the need for costly or complicated facility retrofitting.
The autonomous delivery application combines sophisticated AI technology with proven manufacturing equipment to create intelligent machines that can support workers by moving up to 1,000 pounds of stock at a time. Based on an in-field pilot program, the autonomous delivery tug will save retail employees 33 miles of back-and-forth travel per week, potentially increasing their productivity by 67%.
Is there anything else that you would like to share about Brain Corp?
Brain Corp powers the largest fleet of AMRs operating in dynamic public indoor spaces with over 10,000 floor care robots deployed or enabled worldwide. According to internal network data, AMRs powered by BrainOS are currently collectively providing over 10,000 hours of daily work, freeing up workers so they can focus on other high value tasks during this health crisis, such as disinfecting high-contact surfaces, re-stocking, or supporting customers.
In the long term, robots give businesses the flexibility to address labor challenges, absenteeism, rising costs, and more. From a societal standpoint, we believe robots will gain consumer favor as they’re seen more frequently operating in stores, hospitals, and health care facilities, or in warehouses providing essential support for workers.
We’re also excited about what the future holds for Brain Corp. Because BrainOS is a cloud-based platform that can essentially turn any mobile vehicle built by any manufacturer into an autonomous mobile robot, there are countless other applications for the technology beyond commercial floor cleaning, shelf scanning, and material delivery. Brain Corp is committed to continuously improving and building out our AI platform for powering advanced robotic equipment. We look forward to further exploring new markets and applications.
Thank you for the amazing interview. Readers who wish to learn more should visit Brain Corp.
Safety of Self-Driving Cars Improved With New Training Method
One of the most important tasks for a self-driving car when it comes to safety is tracking pedestrians, objects, and other vehicles or bicycles. In order to do this, self-driving cars rely on tracking systems. These systems could become even more effective with a new method developed by researchers at Carnegie Mellon University (CMU).
The new method has unlocked a lot more autonomous driving data compared to before, such as road and traffic data that is crucial for training tracking systems. The more data there is, the more successful the self-driving car can be.
Himangi Mittal is a research intern who works alongside David Held, an assistant professor in CMU’s Robotics Institute.
“Our method is much more robust than previous methods because we can train on much larger datasets,” Mittal said.
Lidar and Scene Flow
Most of today’s autonomous vehicles rely on lidar as their main system for navigation. Lidar is a laser device that looks at what is surrounding the vehicle and generates 3D information out of it.
The 3D information comes in the form of a cloud of points, and the vehicle uses a technique called scene flow in order to process the data. Scene flow involves the speed and trajectory of each 3D point being calculated. So, whenever there are other vehicles, pedestrians, or moving objects, they are portrayed to the system as a group of points moving together.
Traditional methods for training these systems usually require labeled datasets, meaning sensor data that has been annotated to track the 3D points over time. Because these datasets must be manually labeled, they are expensive to produce, and very few exist. To get around this, simulated data is used for scene flow training; while less effective than training on real data, it can be improved with a small amount of real-world data.
Mittal and Held, along with Ph.D. student Brian Okorn, developed the new method by using unlabeled data in scene flow training. This type of data is much easier to gather and only requires placing a lidar on top of a car as it drives around.
In order for this to work, the researchers had to find a way for the system to detect its own errors in scene flow. The new system tries to make predictions about where each 3D point will end up and how fast it is traveling, and it then measures the distance between the predicted location and the actual location of the point. This is what forms one type of error to be minimized.
After that process, the system reverses and works backward from the predicted point location to map where the point originated. Measuring the distance between this back-projected position and the point’s original location yields the second type of error.
After detecting these errors, the system then works to correct them.
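The two self-supervised errors described above can be written down directly. The sketch below is a simplified reconstruction of the idea, not the CMU authors' code: the forward error uses nearest-neighbor distance to the next lidar sweep as a stand-in for labels, and the cycle error checks that flowing predicted points backward returns them to where they started.

```python
import numpy as np

def nn_distance(a, b):
    # Mean distance from each point in cloud a to its nearest
    # neighbor in cloud b.
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return d.min(axis=1).mean()

def self_supervised_losses(p0, p1, flow_fwd, flow_bwd):
    """Compute both training errors on unlabeled sweeps p0 -> p1,
    where flow_fwd / flow_bwd are per-point flow predictions."""
    # Error 1: points advected by the predicted flow should land
    # on the next sweep.
    p0_pred = p0 + flow_fwd
    forward_err = nn_distance(p0_pred, p1)
    # Error 2 (cycle consistency): flowing the predicted points
    # backward should return each one to its starting position.
    p0_cycle = p0_pred + flow_bwd
    cycle_err = np.linalg.norm(p0_cycle - p0, axis=-1).mean()
    return forward_err, cycle_err
```

Minimizing both errors forces the network toward the true flow without any manual labels, which is exactly why unlabeled driving data becomes usable for training.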
“It turns out that to eliminate both of those errors, the system actually needs to learn to do the right thing, without ever being told what the right thing is,” Held said.
The results demonstrated scene flow accuracy at 25% when using a training set of synthetic data, and when it was improved with a small amount of real-world data, that number increased to 31%. The number improved even more to 46% when a large amount of unlabeled data was added to train the system.
Bob Leigh, Market Development for Autonomous Systems, Real-Time Innovations (RTI) – Interview Series
Bob Leigh is the Senior Market Development Director of Autonomous Systems at Real-Time Innovations (RTI).
RTI is the largest software framework provider for smart machines and real-world systems. The company’s RTI Connext® product enables intelligent architecture by sharing information in real time, making large applications work together as one.
With over 1,500 deployments, RTI software runs the largest power plants in North America, connects perception to control in vehicles, coordinates combat management on US Navy ships, drives a new generation of medical robotics, controls hyperloop and flying cars, and provides 24/7 medical intelligence for hospital patients and emergency victims.
Could you define what IIoT refers to, and which types of devices it is applicable to?
We define the Industrial Internet of Things (IIoT) as a network of devices, machines and computers. These components are connected into highly intelligent systems that combine industrial operations with advanced data analytics to transform business outcomes. Unlike connecting consumer devices, the IIoT involves large, complex, mission-critical systems. It is ushering in new infrastructure for the most critical societal systems, such as the smart grid, public transportation, connected healthcare and autonomous vehicles.
With the IIoT, a standard hospital bed transforms into an intelligent, connected medical system that provides real-time patient data to medical providers, allowing them to draw insights and deliver a higher level of care. These systems must also scale to encompass many thousands or millions of interconnected points, distributed across many different networks and locations. The complexity of these systems and lack of connectivity is often a key obstacle for companies working in these markets. To address these challenges, RTI delivers the software framework that can be used to accelerate development, reduce risk and costs and deliver the resilience, security, performance and scalability necessary to build intelligent architecture, smart machines and real-world systems.
What was it that attracted you to working with IIoT?
I was drawn to work in the IIoT space because there is an opportunity to solve complex technology challenges and to facilitate innovation that will truly help people. From energy to healthcare, from automotive to defense, RTI’s technology enables the sharing of mission critical information in real time, making large applications work together as one. We’ve found that data-centric connectivity, like that provided by RTI, is critical to the success of these high performance, intelligent systems.
How would you describe what Real-Time Innovations (RTI) does in as few words as possible?
Real-Time Innovations (RTI) is the largest software framework provider for smart machines and real-world systems. The company’s RTI Connext® product enables intelligent architecture by sharing information in real time, making large applications work together as one, integrated system. The intelligent, distributed systems that RTI connects help improve medical care, make our roads safer, improve energy use, and protect our freedom. With over 1,500 deployments, RTI software runs the largest power plants in North America, connects perception to control in vehicles, coordinates combat management on US Navy ships, drives a new generation of medical robotics, controls hyperloop and flying cars, and provides 24/7 medical intelligence for hospital patients and emergency victims.
RTI is the leading vendor of products compliant with the Object Management Group® (OMG) Data Distribution Service™ (DDS) standard. RTI is privately held and headquartered in Sunnyvale, California with regional headquarters in Spain and Singapore.
RTI Connext Drive is the first complete connectivity solution for autonomous vehicle development. Could you share some details regarding this technology?
RTI Connext Drive is the first standards-based connectivity framework to handle the complex integration and data distribution challenges of sensor fusion applications in autonomous systems. From research to production, Connext Drive offers automakers and developers the software they need to operate in diverse real-time environments, interoperate with other systems within the vehicle, connect to off-vehicle systems and build in automotive-grade security. From edge to cloud and across remote environments, it distributes and manages real-time data to keep critical systems running. Connext Drive uses DDS, a standard I detail below, and delivers:
- Efficient High-Bandwidth Data Distribution. Communicate rapidly with throughput of millions of messages per second using a data-centric databus, which allows data to flow when and where it’s needed: securely, at scale and with ultra-low latency.
- Enhanced Performance. With support for the latest Object Management Group® (OMG®) DDS Extensible Types standard, applications benefit from network bandwidth savings, enabling flexibility for multiple Quality of Service (QoS) strategies. An optimized Dynamic Data implementation delivers improved serialization performance.
- Full Redundancy. Any sensor, data source, algorithm, compute platform or even network can be easily duplicated to provide higher reliability. The data-centric design allows the system to resolve this redundancy naturally. Connext Drive supports shared memory, LAN, WAN and Internet transports.
- Broad support for embedded systems and platforms. Connext Drive is integrated with technology from many of the leading automotive technology providers. This interoperability provides automotive customers with the ability to use the software and platforms of choice.
- Safety Certification Pathway. This feature option meets the stringent requirements of ISO 26262 ASIL-D, reducing risk, time and project costs.
- Updated DDS Security. Connext Drive is compliant with the latest OMG DDS Security specification v1.1 and supports the latest OpenSSL v1.1.1. The latest updates to the RTI Security Plugins also support loading keys from an SSL engine to more easily integrate best practice key storage.
- Integration with Standardized Frameworks and Platforms. Through its standards-based architecture, Connext Drive eases integration between OEMs and suppliers, from research through production. It provides interoperability across programming languages, operating systems and CPU families, plus a Gateway Toolkit and adapters for integrating non-DDS components.
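The "data-centric databus" idea behind these features can be illustrated with a short sketch. In this pattern, writers publish typed samples to named topics and the middleware delivers them only to readers that declared interest, applying QoS settings such as history depth. The `Databus` class and topic names below are invented for illustration; real Connext Drive applications use the standard DDS API and wire protocol, not this code.

```python
# Minimal in-process sketch of data-centric publish/subscribe, the pattern
# behind a DDS databus. Illustrative only; not the Connext API.
from collections import deque

class Databus:
    def __init__(self):
        self._subscribers = {}  # topic name -> list of (callback, cache)

    def subscribe(self, topic, callback, history_depth=1):
        # QoS-like setting: the reader keeps only the last
        # `history_depth` samples, like a DDS HISTORY policy.
        cache = deque(maxlen=history_depth)
        self._subscribers.setdefault(topic, []).append((callback, cache))

    def publish(self, topic, sample):
        # Data flows only to readers that declared interest in this topic;
        # samples on topics with no reader are simply dropped.
        for callback, cache in self._subscribers.get(topic, []):
            cache.append(sample)
            callback(sample)

bus = Databus()
received = []
bus.subscribe("VehiclePose", received.append, history_depth=5)
bus.publish("VehiclePose", {"x": 12.0, "y": 3.5, "heading": 0.8})
bus.publish("BrakeStatus", {"pressure": 0.0})  # no reader: dropped
print(received)  # only the VehiclePose sample was delivered
```

Because publishers and subscribers only share topic names and data types, either side can be duplicated or replaced without the other noticing, which is how the redundancy and interoperability properties above fall out of the design.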
Could you share with our readers what DDS is exactly and what makes it so important?
Data Distribution Service™ (DDS) is a standard that aims to enable dependable, high-performance, interoperable, real-time, and scalable data exchanges. DDS is the only open standard for messaging that supports the unique needs of both enterprise and real-time systems. Its open interfaces and advanced integration capabilities slash costs across a system’s life cycle, from initial development and integration through ongoing maintenance and upgrades.
DDS is composed of two primary specifications – one for the application layer interfaces, and another that assures wire-level interoperability between vendor implementations. These layers not only ensure that a different vendor’s DDS implementation can be swapped in without impacting application code, but also that systems built using different implementations of DDS will interoperate.
This standards-based approach delivers enhanced performance and massive scalability while lowering risk. Connext Drive is the first – and only – software that can integrate DDS, ROS 2, AUTOSAR Classic and AUTOSAR Adaptive. This allows automotive companies to work with the standard or standards that best meet their needs at different points in the development cycle.
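What makes DDS "data-centric" is that applications declare typed topics, and a designated key field identifies distinct instances within a topic (for example, one instance per sensor), with the middleware tracking the latest state of each. The sketch below is a simplified illustration of that keyed-topic model under those assumptions; the `WheelSpeed` type and `KeyedTopic` class are invented for this example and are not the DDS API.

```python
# Sketch of DDS-style typed, keyed topic data. A key field distinguishes
# instances, and the topic keeps a last-value cache per instance.
from dataclasses import dataclass

@dataclass
class WheelSpeed:
    wheel_id: str  # key field: identifies the instance
    rpm: float

class KeyedTopic:
    """Tracks the latest sample per instance, keyed like a DDS topic."""
    def __init__(self, key_field):
        self.key_field = key_field
        self.instances = {}

    def write(self, sample):
        key = getattr(sample, self.key_field)
        self.instances[key] = sample  # last-value cache per instance

    def read(self, key):
        return self.instances.get(key)

topic = KeyedTopic(key_field="wheel_id")
topic.write(WheelSpeed("front_left", 912.0))
topic.write(WheelSpeed("front_left", 915.5))  # updates the same instance
topic.write(WheelSpeed("rear_right", 908.2))
print(len(topic.instances))          # 2 distinct instances
print(topic.read("front_left").rpm)  # 915.5, the latest sample
```

Because the contract between applications is the typed topic rather than any vendor's API, the same data model works whichever compliant DDS implementation sits underneath, which is the interoperability property the layered specifications guarantee.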
Could you detail in what capacity Baidu uses RTI technology in autonomous vehicles?
Baidu is developing solutions for autonomous valet parking, fully autonomous mini-buses and more, based on Apollo, a leading global autonomous vehicle technology platform. Baidu uses RTI’s Connext Drive as the connectivity framework for superior reliability, a critical factor in the development of autonomous driving. With Connext Drive, Baidu can manage bandwidth utilization over TCP and UDP, ensure flexibility through multiple QoS strategies and apply standards-based security and safety.
Can you share with us why building autonomous vehicle connectivity software is so challenging?
Autonomous vehicles are some of the most complex systems ever conceived and hold the promise of profoundly changing daily life. These unmanned machines require real-time, human-level decision-making capabilities with fail-proof protections for safety and security. Auto manufacturers also need to ensure their systems can operate in diverse real-time environments, scale and interoperate.
Autonomous vehicle development is challenging and risky because AVs are built from many subsystems and require full interoperability between components. To provide consistently optimal performance, data must flow correctly, reliably and with extremely low latency. Barriers to entry in this industry are high, with in-house connectivity solutions taking extensive time to develop and requiring deep technical expertise. Moreover, autonomous vehicle designs must last for years, requiring automakers to ensure that their systems not only address today’s connectivity challenges, but also anticipate future ones. They must also track constantly evolving technology and security requirements.
Reliable connectivity is essential to enable the next generation of vehicles and to achieve higher levels of autonomy.
Is there anything else that you would like to share about RTI or autonomous vehicles?
In the past few years, we’ve seen radically different timelines for when autonomous vehicles will become part of our everyday lives. Through the first half of 2020, we’ve seen auto manufacturers and companies increase their investments in the underlying technologies that will be the critical foundation for these autonomous systems – machine learning algorithms, sensors and the underlying connectivity software. However, 2020 likely won’t be the year cars reach full Level 4 or 5 autonomy, as we still have major hurdles to overcome in terms of safety regulations, security concerns and weather constraints.
This has been a fantastic interview, thank you for taking the time to explain all of these technologies. Readers who wish to learn more should visit the RTI website.