
Vincent Scesa is the Autonomous Vehicle Program Manager at EasyMile.

EasyMile is a pioneer in driverless technology and smart mobility solutions. The fast-growing start-up develops software to automate transportation platforms without the need for dedicated infrastructure. EasyMile’s cutting-edge technology is revolutionizing passenger and goods transportation, offering completely new mobility options. It has already deployed over 210 driverless projects with more than 320,000 people transported over 250,000 km.

What was it that initially attracted you to AI and robotics?

I’ve always been passionate about all forms of intelligence. I was always curious as a child and still am. My father is an engineer and my mother a psychosociologist, and that mix got me interested. I realised that human intelligence is still much more advanced than artificial intelligence, so managing the people who create the AI is also a great challenge. It’s the combination that gets me: managing the people and teams who create and manage the AI.

For as long as I can remember, I have always been fascinated by intelligence, whatever its form: computational, gestural, mechanical, emotional, collective, strategic, human or artificial. Robotics is a field that brings together so much of this. It attracted me at a very early age and I quickly oriented my path in that direction. I love that these machines have computing capacities that enable them to analyse complex situations and, combined with adapted mechanical assemblies, provide answers and act with impressive behaviour.

This is what led me to couple my engineering background with a PhD in Robotics and Artificial Intelligence. I was able to work on decision making in complex articulated machines (bipedal robots), inspired by algorithms reproducing the processes found in the brains of living beings.

I then wanted to continue my professional career in this field, trying to find concrete applications of these technologies that would solve problems. But at the time there were still relatively few applications for robotics, so I threw myself into setting up companies three times. The first focused on AI for video games and robots, the second on robotics for monitoring industrial sites, and the last on cleaning robots for professionals.

You have been working with autonomous vehicles since 2015. What drew you to the space?

The more I learned about robots and machines, what became even more interesting to me than AI and robotics… was human intelligence! It is still far beyond what we can imagine doing with computers, and I am fascinated by the combination.

So, I was drawn to working with engineers and PhDs who are experts in robotics and AI.

This is what pushed me to join EasyMile in 2015, to manage human experts in the creation of artificial intelligence and robots in order to create, organize and monitor autonomous robotics and vehicle projects that solve everyday problems.

You’re the Autonomous Vehicle Program Manager at EasyMile. What does your average day look like?

My days are usually pretty busy 😉

My work is based on four different aspects:

  • Management of my team (the technical team who are the interface with our vehicle manufacturer partners): discussions on load plans, daily management, resolution of technical situations, facilitation of the work, training new staff, reviewing and optimizing our processes.
  • Relationship management with other teams: we interact with all the other teams, from safety and fleet management to navigation, perception and AI algorithms. I work hand in hand with other managers to ensure that exchanges between teams are as efficient and optimal as possible, so that we can incorporate into the vehicles the solutions best able to respond to each platform while maintaining overall consistency.
  • Responsibility for the program to create new platforms (monitoring and reporting): I am in charge of making sure with the project managers that the various projects we have are in line with schedules and budgets and that our partners are satisfied with our work. I then report back to management on progress and status, and I make sure to present any blockages so that strategic decisions can be taken to resolve them.
  • Pre-sales for future projects: more recently, I regularly present our solutions to future partners, imagining new opportunities and building the project plans that allow us to help them meet their needs.

Can you discuss the sensor set and the computer vision technology that is used in EasyMile autonomous vehicles?

Taking a conservative approach to the sensor suite, EasyMile uses devices from a number of market-leading suppliers but is not committed to any particular technology or supplier, and regularly implements updates every four to six months, which can involve sensor changes. The current set of LiDARs integrated into our EZ10 autonomous passenger shuttle, for example, comes from Velodyne, Valeo and SICK; in fact, the entire sensor set and compute suite are new in the vehicle. The purpose of this change was to be able to see further and in greater detail.

For example, the move to our next generation of vehicles included a change in the model of Velodyne LiDAR from the Puck VLP-16 to the Ultra Puck VLP-32, and its position shifted from just below the headlights to the roof, expanding the envelope of protection that it provides. The Ultra Puck offers a 120 m range, a 360 degree horizontal and 40 degree vertical field of view, a 0.33 degree vertical resolution and advanced features designed to minimise false positives. Another addition is a set of Scala LiDARs from Valeo mounted on the corners of the vehicle and at the front, down at valance level.

Our new sensor suite also features stereo instead of mono cameras, adding passive 3D depth perception through binocular vision. We have also integrated IMUs from a variety of sources, including Continental and XSens.
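To illustrate the kind of passive depth perception a calibrated stereo pair provides, here is a minimal sketch using OpenCV's block matcher. The function name, parameter values and the assumption of rectified grayscale input are illustrative only, not EasyMile's actual perception pipeline.

```python
# Minimal sketch of passive stereo depth estimation, assuming rectified
# 8-bit grayscale images from a calibrated stereo pair. Parameter values
# and the helper name are illustrative, not EasyMile's actual pipeline.
import cv2
import numpy as np

def estimate_depth(left_gray: np.ndarray, right_gray: np.ndarray,
                   focal_px: float, baseline_m: float) -> np.ndarray:
    """Return a per-pixel depth map in metres (np.inf where disparity is invalid)."""
    # Block-matching stereo correspondence; numDisparities must be a multiple of 16.
    matcher = cv2.StereoBM_create(numDisparities=96, blockSize=15)
    # OpenCV returns fixed-point disparities scaled by 16.
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    depth = np.full(disparity.shape, np.inf, dtype=np.float32)
    valid = disparity > 0
    # Standard pinhole relation: depth = focal length * baseline / disparity.
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth
```

The further an object is, the smaller its disparity between the two views, which is why the binocular baseline and focal length set the useful depth range.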

We are testing several sensor sets and the market is evolving fast. All our vehicles are based on the same kind of sensors, but depending on the size, the dynamics of the platform and the use cases addressed, we make some small adjustments.

For now we are using what we think gives us the best information on every part of our environment, both close to the vehicle and at longer ranges.
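As a rough illustration of how one software stack can carry per-platform sensor adjustments, the configuration below is a hypothetical sketch; the platform names, mounting points, sensor counts and ranges are assumptions for illustration, not EasyMile's real configuration data.

```python
# Hypothetical per-platform sensor configuration; names, models and values
# are illustrative assumptions, not EasyMile's real configuration files.
from dataclasses import dataclass

@dataclass(frozen=True)
class LidarSpec:
    model: str
    mount: str          # where the unit sits on the vehicle
    max_range_m: float  # nominal detection range

@dataclass(frozen=True)
class PlatformSensors:
    platform: str
    lidars: tuple[LidarSpec, ...]
    stereo_cameras: int
    imus: int

PASSENGER_SHUTTLE = PlatformSensors(
    platform="passenger_shuttle",
    lidars=(
        LidarSpec("long_range_360", "roof", 120.0),
        LidarSpec("corner_scanner", "front_left_corner", 30.0),
        LidarSpec("corner_scanner", "front_right_corner", 30.0),
    ),
    stereo_cameras=2,
    imus=2,
)

# A heavier tow tractor could reuse the same sensor families, with mounting
# points and ranges tuned to its size, dynamics and use case.
```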

Complementing the LiDARs, the stereo cameras provide input to EasyMile’s deep learning effort, which is centred at its Singapore office, where a separate team adds another layer of redundancy in terms of software development. EasyMile’s own programmers write the algorithms that interpret sensor data and apply deep learning techniques to them.

The EasyMile autonomous vehicles are equipped with cybersecurity software. How important an issue is cybersecurity?

Thinking of the vehicles we work with as equivalent to small, mobile enterprise computing systems, or even as data centres on wheels, makes the importance of cybersecurity obvious. With the main vehicle computer running the autonomous systems, the sensor suite, and the communication and navigation systems, for example, there can be 20 or so computing instances connected over an Ethernet bus. Then there are the automotive components such as batteries, inverters and motors, controllers to open the doors and so on, all of which run software, and each vehicle is connected to the cloud. This makes for a potential “attack surface” that must be protected.

There are many different components on different networks – Ethernet, CAN bus etc – and some off-the-shelf components come with wifi capability.

You have to make sure that the network traffic looks normal, with no strange messages. For example, if your LiDAR is supposed to send messages at a frequency of 50 Hz or so, but you start to receive messages at 100 Hz, there is something fishy going on.

What’s more, sensors are not allowed to talk to each other; they are only permitted to communicate with the main computer.

Only the central computer has the right to speak to everyone, which means that you have to secure this device very thoroughly. It is the brain of the vehicle and is where we put most security. It’s what we call a minimised attack surface because we close every possible service that is not useful. We deactivate USB ports and wifi routers, for example. We make sure it is very, very hard for someone to connect to our computer.
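The two checks described above, watching for abnormal message rates and for devices talking to peers they should never address, can be sketched as below. The device names, expected rates and allowed flows are assumptions for illustration, not EasyMile's actual monitoring policy.

```python
# Minimal sketch of on-vehicle network monitoring: flag unexpected message
# rates and traffic between devices that should not talk to each other.
# Device names, nominal rates and allowed flows are illustrative assumptions.
import time
from collections import defaultdict, deque

EXPECTED_RATE_HZ = {"lidar_roof": 50.0, "camera_front": 30.0}
RATE_TOLERANCE = 1.5                      # allow up to 1.5x the nominal rate
ALLOWED_FLOWS = {("lidar_roof", "main_computer"),
                 ("camera_front", "main_computer"),
                 ("main_computer", "fleet_gateway")}

_timestamps = defaultdict(lambda: deque(maxlen=200))

def observe(source: str, destination: str, now: float | None = None) -> list[str]:
    """Record one message seen on the bus and return any anomaly descriptions."""
    now = time.monotonic() if now is None else now
    alerts = []
    # Sensors may only talk to the main computer; anything else is suspicious.
    if (source, destination) not in ALLOWED_FLOWS:
        alerts.append(f"unexpected flow {source} -> {destination}")
    ts = _timestamps[source]
    ts.append(now)
    expected = EXPECTED_RATE_HZ.get(source)
    if expected and len(ts) > 10 and ts[-1] > ts[0]:
        observed = (len(ts) - 1) / (ts[-1] - ts[0])
        # A sensor suddenly sending twice as fast as it should is "fishy".
        if observed > expected * RATE_TOLERANCE:
            alerts.append(f"{source} rate {observed:.0f} Hz exceeds ~{expected:.0f} Hz")
    return alerts
```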

Passwords and penetration tests

With many computer run devices on every vehicle and a growing fleet of vehicles that have to be maintained by engineers and technicians, there are a lot of passwords that have to be managed securely and applied in conjunction with other means of authenticating people who need physical access to vehicles deployed around the world.

Security is the main reason why EasyMile does not yet install software upgrades to its vehicles over the internet; for the moment it sends one of its technicians with a secure computer to load the new software at the operator’s facility instead.

It’s like upgrading your brain. It has to be very, very secure, and we prefer to approach it step by step. First you have to prove that the code you want to run on your vehicle is the same code that was written by EasyMile developers, then you have to prove that this code was compiled by EasyMile on our servers, and so on, so you have electronic signatures and certificates. You need this layer of assurance so that when you inject a new system, you are 100% sure it is the right one.
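A minimal sketch of that kind of signature check is shown below, assuming an RSA release key and a detached PKCS#1 v1.5 signature over SHA-256; the file paths, key format and function name are assumptions for illustration, not EasyMile's actual signing scheme.

```python
# Minimal sketch: verify a software image against the vendor's release key
# before installing it. Key format and paths are illustrative assumptions.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

def is_release_authentic(image_path: str, signature_path: str,
                         vendor_pubkey_path: str) -> bool:
    """Return True only if the image matches the detached vendor signature."""
    with open(vendor_pubkey_path, "rb") as f:
        public_key = serialization.load_pem_public_key(f.read())
    with open(image_path, "rb") as f:
        image = f.read()
    with open(signature_path, "rb") as f:
        signature = f.read()
    try:
        # Raises InvalidSignature if the image was not signed by this key.
        public_key.verify(signature, image, padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False

# Only proceed with the upgrade when the check passes; otherwise refuse it.
```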

To make sure that all these measures actually result in a secure vehicle and ecosystem, EasyMile regularly employs white hat hackers to conduct penetration tests.

Environment hacks

In addition to the familiar cyber threats and countermeasures, new ones are emerging that target services and sensors. The availability of GPS jamming and spoofing devices is well known, and attacks on these systems are increasing, but hackers are also targeting sensors such as cameras and, through them, the AI and machine learning algorithms, by subtly altering some aspects of the environment.

Last year, for example, a team from McAfee Advanced Threat Research managed to trick two Teslas equipped with Mobileye camera systems by altering a speed limit sign using electrical tape so that it appeared to read 85 mph instead of 35 mph. Tested in an off-road environment and with Traffic Aware Cruise Control engaged, both cars accelerated automatically in response to the sign before the drivers applied the brakes.

Some white hat hacking research teams are also looking into how to attack LiDARs.

The answer to this type of threat is never to rely on just a single sensor or subsystem for safety-critical functions. EasyMile takes this further by mixing LiDARs from different suppliers. With three different brands on the vehicle, an attacker would have to be able to hack LiDARs that don’t work in the same way; they use different wavelengths, for example. So redundancy is part of the security.
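One simple way to picture how sensor diversity blunts an attack is a cross-check that only trusts obstacles confirmed by more than one independent LiDAR. The sketch below is illustrative only; the agreement threshold and data layout are assumptions, and a real safety system would also handle unconfirmed detections conservatively rather than simply discarding them.

```python
# Minimal sketch of cross-checking obstacle detections from LiDARs of
# different brands: an obstacle is treated as confirmed only if at least
# two independent sensors report roughly the same position. Thresholds and
# structures are illustrative assumptions, not EasyMile's perception logic.
import math

def cross_check(detections_by_sensor: dict[str, list[tuple[float, float]]],
                agreement_radius_m: float = 0.5,
                min_sensors: int = 2) -> list[tuple[float, float]]:
    """Return (x, y) detections seen by at least `min_sensors` different sensors."""
    confirmed = []
    sensors = list(detections_by_sensor)
    for sensor in sensors:
        for (x, y) in detections_by_sensor[sensor]:
            agreeing = 1  # the sensor that produced this detection
            for other in sensors:
                if other == sensor:
                    continue
                # Does any detection from the other sensor land close enough?
                if any(math.hypot(x - ox, y - oy) <= agreement_radius_m
                       for (ox, oy) in detections_by_sensor[other]):
                    agreeing += 1
            if agreeing >= min_sensors and (x, y) not in confirmed:
                confirmed.append((x, y))
    return confirmed

# A spoofed return injected into one LiDAR would not be corroborated by the
# other brands, while a real obstacle would appear in two or three of them.
```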

Can you also discuss the monitoring and blackbox technology?

Originally, the satellite navigation portion of our navigation and localisation system used only GPS, but the latest iteration is a multi-GNSS system that also processes GLONASS and will soon add Galileo and Beidou. The system’s precision is enhanced by Real Time Kinematic (RTK) processing, with the GNSS position used in conjunction with correction information delivered over the 3G or 4G network, which we rely on heavily.

The overall navigation and positioning system is accurate to a few centimetres, enabled by the combination of GNSS, the LiDARs, cameras, inertial system and odometry, which also provide redundancy and graceful degradation in case the system loses the GNSS signal.
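A minimal way to picture that graceful degradation is an inverse-variance weighted fusion over whichever position sources are currently healthy. The source names, variances and data structures below are assumptions for illustration, not EasyMile's actual localisation architecture.

```python
# Minimal sketch of graceful degradation in localisation: fuse whichever
# position sources are healthy, weighting each by its reported uncertainty.
# Source names and values are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class PositionEstimate:
    x_m: float
    y_m: float
    variance_m2: float   # smaller means more trusted (e.g. RTK GNSS ~ cm-level)
    healthy: bool

def fuse(sources: dict[str, PositionEstimate]) -> tuple[float, float] | None:
    """Inverse-variance weighted average over healthy sources; None if all are lost."""
    usable = [s for s in sources.values() if s.healthy and s.variance_m2 > 0]
    if not usable:
        return None  # caller must fall back to a safe-stop strategy
    weights = [1.0 / s.variance_m2 for s in usable]
    total = sum(weights)
    x = sum(w * s.x_m for w, s in zip(weights, usable)) / total
    y = sum(w * s.y_m for w, s in zip(weights, usable)) / total
    return (x, y)

# If the GNSS source drops out, the LiDAR, inertial and odometry estimates
# still contribute, so the fused position degrades gradually rather than
# disappearing outright.
```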

Our vehicles communicate with EasyMile’s cloud-based supervision centre via the 3G/4G network. With a view to implementing 5G, the company is working with a number of providers around the world including SFR in France, Verizon in the US, Ericsson in Scandinavia and Saudi Telecom. In the short to medium term, 5G promises faster feedback from deployed EZ10 fleets, boosting both machine learning and R&D, the ability to update vehicles faster with large data sets, enhanced video surveillance through simultaneous streaming of multiple high-quality video feeds, and infotainment for passengers.

To communicate with the road infrastructure, our vehicles can exploit technologies provided by V2X suppliers, through an onboard unit that talks to roadside units, providing information on the state of traffic lights, for example.

If communications with the supervision centre are lost for longer than 3 to 5 seconds, the vehicle will continue to the next planned stop and wait for communication with the EasyMile server to be restored so it can receive its next set of instructions.
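That behaviour can be sketched as a small watchdog state machine, shown below. The timeout constant, state names and class structure are assumptions chosen to mirror the description above, not EasyMile's actual supervision code.

```python
# Minimal sketch of the connection-loss behaviour described above: if no
# heartbeat from the supervision centre arrives within a timeout, finish the
# leg to the next planned stop and hold there until the link is restored.
# Timeout value and state names are illustrative assumptions.
import time
from enum import Enum, auto

LINK_TIMEOUT_S = 5.0   # the interview mentions a 3 to 5 second window

class Mode(Enum):
    NORMAL = auto()
    PROCEED_TO_NEXT_STOP = auto()
    HOLD_AT_STOP = auto()

class LinkWatchdog:
    def __init__(self) -> None:
        self.last_heartbeat = time.monotonic()
        self.mode = Mode.NORMAL

    def on_heartbeat(self) -> None:
        """Called whenever a supervision-centre heartbeat is received."""
        self.last_heartbeat = time.monotonic()
        self.mode = Mode.NORMAL          # link restored, resume normal service

    def tick(self, at_planned_stop: bool) -> Mode:
        """Periodic check: decide the vehicle's mode based on link freshness."""
        if time.monotonic() - self.last_heartbeat > LINK_TIMEOUT_S:
            # Link lost: continue to the next stop, then wait for instructions.
            self.mode = Mode.HOLD_AT_STOP if at_planned_stop else Mode.PROCEED_TO_NEXT_STOP
        return self.mode
```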

EasyMile has multiple autonomous vehicles on the road. Could you give us some details on these?

Fully fabless, EasyMile licenses its software technology and sells/rents fully equipped driverless vehicles. It outsources production to blue-chip manufacturers.

EasyMile has developed a complete technology stack for autonomous vehicles that can be used for each of its use cases. The technology is vehicle/platform-agnostic.

The flagship vehicle this is found in is the EZ10, the most deployed autonomous passenger shuttle in the world. It carries passengers at speeds of up to 15 miles per hour and operates on a specified route. EZ10s are used around the world to show how cutting-edge technology can deliver huge benefits for communities. They improve public transport by connecting hubs and, in many areas, provide a shared transport service where there otherwise wasn’t one. They also come with a powerful fleet management and supervision system, one of the first to be deployed with real-world autonomous vehicles.

Its rising star is the TractEasy, a fully electric autonomous tow tractor. It allows 24/7 ground transportation of goods on industrial sites and in logistics centres, and it optimizes supply chains with the new, highly automated ability to cross from indoor to complex outdoor environments, unlike existing automated guided vehicles (AGVs).

EasyMile is also working on other heavy-duty vehicle applications including buses, trams and trucks. My team is in charge of this program, and I would say that working on EasyMile’s future autonomous vehicles is really challenging and motivating!

With more than 250 deployments in over 30 countries, EasyMile’s technology has powered 600,000km of autonomous driving to date.

What are some of the different cities or municipalities that you are currently working with?

In the USA, our EZ10s are involved in demonstration projects in 16 American cities, carrying tens of thousands of passengers. Most of these are run by organizations such as Departments of Transportation, airports, universities and transit agencies, in collaboration with US-based EasyMile Inc.

We have a very strong presence in Germany and France, as well as other projects around Europe.

These include business parks, hospitals, universities, cities and towns, and communities.

In Australia, the focus is on shared mobility, with recent projects including a retirement village and connecting a ferry service to the centre of a small island.

We are also working on a number of projects in Asia.

Is there anything else that you would like to share about EasyMile?

It was such a wonderful opportunity for me because, at the time I was looking to work in this area, the industry was still in its infancy.

I love that EasyMile is serious – we are really industrialising our products and services, and this is still quite unique in this space. We’re not just playing with robots in a garage; we’re delivering real, measurable services with tangible benefits and outcomes for our clients.

Thank you for the fantastic interview. I really enjoyed learning more about EasyMile, easily one of the most underrated startups in the autonomous vehicle space. Readers who wish to learn more should visit EasyMile.
