Oops, You Just Collided With An AI Self-Driving Car, What Happens Next?


We all dread getting into a car crash.

I’m hoping that you’ve never had such an occurrence.

You might find it of interest that there are about 6.7 million car crashes in the United States each year (see my stats coverage at this link here). Sadly, these produce about 2.3 million injuries and approximately 40,000 fatalities. All told, a car crash or collision is something we anxiously hope to avoid.

Even if you have nerves of steel, the moment that you realize that a car collision is imminent, you are emotionally bracing for the worst. The physical jarring is one aspect of a car crash, while the mental and emotional toll is another substantive factor. It is hard to keep your mind focused on what to do once the collision has happened.

The odds are that you have been taught or been told about the types of things you should do when a car crash occurs. Perhaps your car insurance company has provided you with a handy list of steps that you keep inside your vehicle at all times. Rather than trying to remember off the top of your head what to do, it can be crucial to have a written list. Your mind isn’t going to be as clear as a bell in the aftermath of a collision and so any notes or prompts could be essential.

Some people use an app to provide guidance during the post-crash effort. That can be helpful, though the crux is that your smartphone needs to still be in working order. If the phone got cracked or damaged, you might be out of luck in terms of trying to use any salient apps. For that reason, people oftentimes keep a paper-printed version in their cars or in their wallet or purse. Yes, something adverse could happen to those lists too, so the idea is to have multiple sources on hand so that one or another will still be accessible.

Probably the most important first step entails making sure that everyone is safe.

Let’s assume that you are driving your car and get into a collision with another vehicle.

You would want to see how you are doing and whether you are hurt from the collision. If you have any passengers in your car, you would want to check on them too. In terms of the vehicle you collided with, the question arises about whether the driver is okay and also whether the passengers of that vehicle are okay.

Safety might encompass getting out of the now-collided vehicles. Heaven forbid that the fuel tanks are leaking or a fire has started. The bashed-up cars are prone to potential hazards that can inflict further harm, beyond the initial impact of the collision.

Another consideration is whether the vehicles are sitting in an active roadway and might imperil additional vehicles by causing them to crash into you all. Traffic zipping past the scene could inadvertently get carried into the mess. This endangers those drivers and their passengers. It obviously further endangers you, your passengers, the other driver involved in the collision, and the passengers in that vehicle.

After trying to achieve some appropriate level of safety, the next step usually involves reporting the incident, such as by calling 911, if applicable. There might also be a need to request that medical assistance arrive. If the vehicles are in bad shape after the collision, perhaps a towing service will be required.

Finally, after all that, you are likely to begin exchanging pertinent info with the other parties involved.

This usually consists of exchanging data such as driver’s licensing info, car registrations, and car insurance particulars. You would be wise to try and document the accident while still at the scene, assuming you are able to safely do so. In some instances, you might use your smartphone to take pictures, or possibly utilize an app provided by your car insurance company to document what happened.

That covers some of the top points, though there are certainly lots of additional steps that can be taken.

It is easy beforehand to think that you will calmly be able to perform those steps. In the midst of experiencing a car crash, those seemingly simple and obvious steps do not necessarily fill your mind. You are probably in a partial state of shock and not able to think clearly. This is especially the case if the collision is the first time you’ve ever been in such an incident. People who have experienced car crashes previously have formed a type of mental muscle memory, as it were, recalling what took place and what they need to do this time around.

Note that the foregoing description about what to do in a car crash or collision is predicated on the notion that the other car had a driver at the wheel. That seems absurdly obvious, perhaps. We would certainly expect that the other car had a human driving the vehicle, else it would not seemingly be in traffic and on an active roadway (for the moment, I’m excluding parked cars from this discussion, and likewise a car that was perhaps left in neutral and is rolling down a hill on its own; those kinds of special cases are a different matter).

Why do I bring up the blatantly obvious point that there is a human driver in the car that you’ve collided with?

Because you might soon enough collide with a moving car that has no human driver at the wheel.

Say what?

Consider that the future of cars consists of AI-based true self-driving cars.

There isn’t a human driver involved in a true self-driving car. Keep in mind that true self-driving cars are driven via an AI driving system. There isn’t a need for a human driver at the wheel, nor is there a provision for a human to drive the vehicle. For my extensive and ongoing coverage of Autonomous Vehicles (AVs) and especially self-driving cars, see the link here.

Here’s an intriguing question that is worth pondering: What should you do if you collide with an AI self-driving car?

Before jumping into the details, I’d like to further clarify what is meant when I refer to true self-driving cars.

Understanding The Levels Of Self-Driving Cars

As a clarification, true self-driving cars are ones in which the AI drives the car entirely on its own and there isn’t any human assistance during the driving task.

These driverless vehicles are considered Level 4 and Level 5 (see my explanation at this link here), while a car that requires a human driver to co-share the driving effort is usually considered at Level 2 or Level 3. The cars that co-share the driving task are described as being semi-autonomous, and typically contain a variety of automated add-ons that are referred to as ADAS (Advanced Driver-Assistance Systems).
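To make the taxonomy concrete, here is a minimal sketch in Python of how these levels might be represented in code, purely for illustration (the enum names and the helper function are my own invention, not drawn from any automaker’s software):

```python
from enum import IntEnum

class AutonomyLevel(IntEnum):
    """Illustrative SAE-style driving automation levels."""
    NO_AUTOMATION = 0
    DRIVER_ASSISTANCE = 1       # a single ADAS feature, human drives
    PARTIAL_AUTOMATION = 2      # semi-autonomous, human co-shares the driving
    CONDITIONAL_AUTOMATION = 3  # semi-autonomous, human must remain ready
    HIGH_AUTOMATION = 4         # true self-driving within a limited domain
    FULL_AUTOMATION = 5         # true self-driving everywhere, not yet achieved

def is_true_self_driving(level: AutonomyLevel) -> bool:
    """Levels 4 and 5 have no human driver; Levels 2 and 3 still do."""
    return level >= AutonomyLevel.HIGH_AUTOMATION

print(is_true_self_driving(AutonomyLevel.PARTIAL_AUTOMATION))  # False
print(is_true_self_driving(AutonomyLevel.HIGH_AUTOMATION))     # True
```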

There is not yet a true self-driving car at Level 5; we don’t yet know whether it will even be possible to achieve, nor how long it will take to get there.

Meanwhile, the Level 4 efforts are gradually trying to get some traction by undergoing very narrow and selective public roadway trials, though there is controversy over whether this testing should be allowed per se (we are all life-or-death guinea pigs in an experiment taking place on our highways and byways, some contend, see my coverage at this link here).

Since semi-autonomous cars require a human driver, the adoption of those types of cars won’t be markedly different from driving conventional vehicles, so there’s not much new per se to cover about them on this topic (though, as you’ll see in a moment, the points made next are generally applicable).

For semi-autonomous cars, it is important that the public be forewarned about a disturbing aspect that’s been arising lately, namely that despite those human drivers that keep posting videos of themselves falling asleep at the wheel of a Level 2 or Level 3 car, we all need to avoid being misled into believing that the driver can take their attention away from the driving task while driving a semi-autonomous car.

You are the responsible party for the driving actions of the vehicle, regardless of how much automation might be tossed into a Level 2 or Level 3 vehicle.

Self-Driving Cars And Collisions

For Level 4 and Level 5 true self-driving vehicles, there won’t be a human driver involved in the driving task.

All occupants will be passengers.

The AI is doing the driving.

One aspect to immediately discuss entails the fact that the AI involved in today’s AI driving systems is not sentient. In other words, the AI is altogether a collective of computer-based programming and algorithms, and most assuredly not able to reason in the same manner that humans can.

Why this added emphasis about the AI not being sentient?

Because I want to underscore that when discussing the role of the AI driving system, I am not ascribing human qualities to the AI. Please be aware that there is an ongoing and dangerous tendency these days to anthropomorphize AI. In essence, people are assigning human-like sentience to today’s AI, despite the undeniable and inarguable fact that no such AI exists as yet.

With that clarification, you can envision that the AI driving system won’t natively somehow “know” about the facets of driving. Driving and all that it entails will need to be programmed as part of the hardware and software of the self-driving car.

Let’s dive into the myriad of aspects that come into play on this topic.

First, consider whether you would even realize that the car you’ve collided with is a self-driving car.

That might be harder to discern than you would otherwise assume.

Most self-driving cars will be equipped with externally mounted sensors such as video cameras, radar, LIDAR, ultrasonic units, thermal imaging, and the like. This does tend to make self-driving cars stand out in a crowd. You can usually see that these various contraptions are protruding and are not normally seen on a conventional human-driven car.
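As a purely hypothetical illustration of that point, you could think of the sensor suite as a simple inventory, with several protruding units usually being the giveaway (the sensor list and the rule of thumb below are illustrative assumptions, not a real detection method):

```python
# Hypothetical inventory of externally mounted sensors that tend to mark
# a vehicle as a self-driving car (purely illustrative).
TYPICAL_AV_SENSORS = {
    "video_cameras": "visual detection of lanes, signs, and objects",
    "radar": "range and closing-speed measurement",
    "lidar": "3D point-cloud mapping of the surroundings",
    "ultrasonic": "short-range proximity sensing",
    "thermal_imaging": "spotting pedestrians and animals in low light",
}

def looks_like_self_driving_car(visible_sensors: set) -> bool:
    """Rough rule of thumb: several protruding sensor types suggests an AV."""
    return len(visible_sensors & TYPICAL_AV_SENSORS.keys()) >= 3

print(looks_like_self_driving_car({"video_cameras", "lidar", "radar"}))  # True
```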

Well, it could be that it is a self-driving car but that it is being overseen by an onboard human driver who is serving as the backup or safety monitor for the AI driving system. This is being done in many of the existing public roadway tryouts of self-driving cars. The idea is that a trained and versed human driver is sitting at the wheel, ready to take over the controls if needed. For more about the role of the safety driver, see my discussion at this link here.

I bring this up because the self-driving car might have been in the self-driving mode or it might have been disengaged by the human driver and thus the human driver was directly controlling the autonomous vehicle at the time of the collision. In that sense, the self-driving car really isn’t any longer a self-driving car and merely another variant of a human-driven car.

If that perchance is the case, namely a human driver was driving the presumed self-driving car, everything that you would do as a result of having crashed with a human-driven car is still entirely applicable. Essentially, there is no notable difference at that juncture.

For sake of discussion, let’s assume that the self-driving car does not have a backup or safety driver present and therefore the AI driving system is considered the driver of the vehicle.

You have genuinely collided with a true self-driving car.

Some of the more outrageous vendors and even some of the pundits would insist that this could never ever happen. In other words, they claim that self-driving cars are going to be uncrashable. Sorry to say, that is unmitigated balderdash and completely misleading. My detailed explanation of why no one should be referring to self-driving cars as being uncrashable can be found at this link here.

Okay, so I am readily and unreservedly asserting that you could indeed have collided with a self-driving car.

Note that at this juncture of the discussion we aren’t concerned with the responsible cause or party behind the car crash incident and are instead focused on what steps to take while at the scene of the collision. It could be that you made a booboo and rammed into the self-driving car; I’ve previously reported on people that have rear-ended self-driving cars (see my column coverage). It could be that the self-driving car rammed into your vehicle.

The key is that a collision has occurred and the question is what you should be doing in the aftermath. Eventually, yes, there will be a need to discover who was at fault. This is part-and-parcel of any car crash, including crashes that involve self-driving cars.

As an aside, you might have heard some pundits exclaiming that we are going to be in some nebulous, mushy area of the law regarding fault, since the question will be whether the human driver (you, in this case) or the AI driving system was responsible for the collision. This kind of logic goes overboard and tries to suggest that the AI driving system is sentient and will need to be held accountable as though it were a thinking being equivalent to a human driver.

I emphasized earlier herein that today’s AI is not sentient. Not even close. You can’t even try to contend that the AI is somehow in the same ballpark. It is not.

The key that I am driving toward (a pun!) is that the responsible party for the self-driving car is the automaker or self-driving tech firm that crafted and fielded the AI driving system within the self-driving car, plus the fleet operator that is running and overseeing the self-driving car whilst on our roadways. Those are humans that opted to put the self-driving car onto our public streets and byways. They bear the responsibility for what the self-driving car does.

You cannot let them skip out of this and point their fingers at the AI. It might be a clever way to attempt to dodge a big financial fine or try to skirt their legal responsibilities, but it doesn’t cut the mustard. The self-driving car and the AI driving system are human-made products that will legally be held to the same repercussions as other products that we use or interact with daily (see my columns for further elaboration).

Now, let’s get back to the scene of the crash.

You have just collided with a self-driving car. There isn’t a human driver in the self-driving car. That is our starting point of figuring out what to do next.

The same rule-of-thumb about achieving safety is still a tried-and-true mantra in this use case.

Are you injured? Are any of your passengers injured? You want to right away assess the status of the people involved.

This includes the aspect that the self-driving car might contain passengers. Though there wasn’t a human driver at the wheel, there might have been people riding in the self-driving car. They could be injured and require immediate medical attention.

Again, similar to the earlier look at what should happen in the case of a human driver that collides with another human-driven car, consider whether anyone is in imminent danger. The danger might be from the vehicles themselves. For example, the expectation is that self-driving cars are likely going to be EVs (electric vehicles), which makes sense partially because of the electrical power needed to run the onboard processors for the AI driving system. As we’ve all seen in the news, sometimes the EV batteries can be especially endangering when a car crash occurs.

I’m not suggesting that only self-driving cars will be EVs, so don’t take things that way. There will be lots of human-driven cars that are going to be EVs. Regardless of whether the vehicle is a human-driven vehicle that is an EV, or a self-driving car that is an EV, the concern is that the batteries could entail special risks as a result of a car crash.

Continuing our steps, a bit of a twist arises about the aspect of possibly being in the way of other ongoing traffic. If you have conventional cars involved in a collision, you can sometimes safely move the cars out of the way of the active setting. This has to be assessed cautiously. Do you try to physically handle the cars and push them out of the way? Do you attempt to start the cars and drive them out of the way? This depends on a plethora of factors. The vehicles might not be drivable. They might not be safe to be moved. Etc.

To some degree, this also arises with a self-driving car.

Suppose the self-driving car is considered in a viable drivable state and could be driven out of the way of traffic. The problem is that there is typically no provision for a human to get into the autonomous vehicle and drive it. You need to somehow convince the AI driving system to make this driving effort. This is problematic because most of the existing AI driving systems are not programmed to handle this kind of extraordinary driving situation.

By and large, the AI driving systems are being programmed to drive from point A to point B, such as from someone’s house to the grocery store. You indicate the starting point of the driving journey and the ending point, and the AI driving system attempts to safely navigate there, using digital maps and other means of devising the path.
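As a minimal sketch of that point-A-to-point-B idea, here is a toy route planner using a Dijkstra-style shortest-path search over a made-up road graph (the map data, place names, and function are invented for illustration and are not from any actual AI driving system):

```python
import heapq

# Toy digital map: intersections and road-segment distances (illustrative only).
ROAD_GRAPH = {
    "home":          {"elm_st": 2, "oak_ave": 5},
    "elm_st":        {"main_st": 3},
    "oak_ave":       {"main_st": 1},
    "main_st":       {"grocery_store": 4},
    "grocery_store": {},
}

def plan_route(start, goal):
    """Dijkstra-style search: find a shortest path from point A to point B."""
    frontier = [(0, start, [start])]
    visited = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, dist in ROAD_GRAPH[node].items():
            heapq.heappush(frontier, (cost + dist, neighbor, path + [neighbor]))
    return []  # no route found

print(plan_route("home", "grocery_store"))
# ['home', 'elm_st', 'main_st', 'grocery_store']
```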

Imagine that your collision has occurred in the slow lane of the freeway. Both vehicles are now straddling the emergency lane and the slow lane. You want to try and move both vehicles out of the slow lane and entirely into the emergency lane, or perhaps onto some flat space that is adjacent to the roadway and looks safe to be parked in.

The odds of your being able to merely tell any of today’s AI driving systems to move over like this are ostensibly close to nil. In practical terms, you would more likely need to confer with a fleet-approved remote agent that could either direct the AI driving system as to what to do or possibly take over the driving controls remotely (the latter is something that I have argued vehemently is going to be highly suspect, see my columns).

This does bring up a related aspect.

When a self-driving car gets into a car crash, most of the automakers and self-driving tech firms have the autonomous vehicle send an alert to the fleet operator. At that juncture, presumably, the fleet operator is aware that the self-driving car is in trouble and assistance will be needed. Usually, the fleet operator will dispatch a prearranged team to quickly get to the scene of the incident.
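To give a feel for what such an alert might contain, here is a hypothetical sketch of a crash-alert payload that a self-driving car could transmit to its fleet operator (all field names and values are invented for illustration; real fleet telemetry formats are proprietary and vary by company):

```python
import json
from datetime import datetime, timezone

def build_crash_alert(vehicle_id, latitude, longitude, airbag_deployed, occupants):
    """Assemble a hypothetical crash-alert message for the fleet operator."""
    alert = {
        "event": "collision_detected",
        "vehicle_id": vehicle_id,
        "timestamp_utc": datetime.now(timezone.utc).isoformat(),
        "location": {"lat": latitude, "lon": longitude},
        "airbag_deployed": airbag_deployed,
        "occupant_count": occupants,
        "requested_action": "dispatch_response_team",
    }
    return json.dumps(alert)

# Example: the vehicle reports a collision with one passenger aboard.
print(build_crash_alert("AV-042", 34.0522, -118.2437, True, 1))
```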

All of that is a bit iffy. The self-driving car might not have been able to send an alert or might be so damaged that it cannot send an alert. The fleet operator might or might not receive an alert. The fleet operator might or might not have a team readied to head to the location. The location might be far away from the nearest team and therefore it will take time to have them arrive. And so on.

I mention this because some self-driving car companies would have you believe that a top-notch crew will be at the scene instantaneously. That’s just not realistic in all cases. Unless the self-driving car is being closely trailed by such a team, which some self-driving tryouts are in fact doing, there is going to be some semblance of delay between when the crash occurred and when humans pertinent to the self-driving car arrive.

If the self-driving car is still in viable shape, you might be able to talk with the fleet operator via the self-driving car’s in-car telephonic system. That’s one means to discuss the situation with a remote human. I have also urged that the self-driving car companies ought to post a glaringly obvious phone number on the outside of the autonomous vehicle, kind of like those bumper stickers that urge you to call a number about the driving of a vehicle.

In states that allow self-driving cars to operate, there is usually a provision that the autonomous vehicle must have the car registration and car insurance information readily available somewhere inside. In theory, you would be able to find that, though this could be difficult in terms of having to rummage around inside the self-driving car, something that could be a danger due to the effects of the collision and the uncertain status of the autonomous vehicle.

This brings up another consideration, namely that the self-driving car could attempt to continue driving, despite having gotten into a collision. In theory, the AI driving system should pull over and come to a proper stop, entering into what is known as an MRC (minimal risk condition). You have no guarantee that this will happen, nor that it will happen flawlessly.
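As a rough sketch of what entering an MRC might look like in software, here is a simplified, hypothetical state machine (real AI driving systems are vastly more elaborate, and, as noted, there is no guarantee this logic executes flawlessly after a crash):

```python
from enum import Enum, auto

class DrivingState(Enum):
    NORMAL_DRIVING = auto()
    COLLISION_DETECTED = auto()
    PULLING_OVER = auto()
    MINIMAL_RISK_CONDITION = auto()  # stopped, hazards on, fleet notified

def next_state(state, safe_stop_available):
    """Simplified transition logic toward a minimal risk condition (MRC)."""
    if state is DrivingState.COLLISION_DETECTED:
        # Prefer pulling over to a safe spot; otherwise stop in place.
        return (DrivingState.PULLING_OVER if safe_stop_available
                else DrivingState.MINIMAL_RISK_CONDITION)
    if state is DrivingState.PULLING_OVER:
        return DrivingState.MINIMAL_RISK_CONDITION
    return state

state = DrivingState.COLLISION_DETECTED
state = next_state(state, safe_stop_available=True)  # -> PULLING_OVER
state = next_state(state, safe_stop_available=True)  # -> MINIMAL_RISK_CONDITION
print(state)
```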

When a human driver is at the wheel, you can usually plainly see if they are trying to drive the vehicle and perhaps seeking to drive away or maneuver for some reason. With a self-driving car, it will be quite hard to know what is going on. The inner workings of the AI driving system are not readily apparent to outside observers.

Overall, you ought to approach the self-driving car in an extremely cautious manner. The AI driving system might be confounded and make other potential driving moves that are dangerous. Another possibility is that the AI driving system has been damaged and is therefore of little use for the moment. I’ve discussed in my postings that this also presents challenges to first responders.

Conclusion

I trust that you can see that coping with a collision involving a self-driving car has its own quirky elements that are a variation of what happens when crashing with another human-driven car.

The overarching steps are still pretty much the same.

It can be unnerving though to be standing outside of a self-driving car and not immediately have at hand the other driver with whom you can discuss the situation. Some lawyers might suggest this is handy because people tend to say things that perhaps they ought not to be saying while still in the shock of the collision.

The possibility of course exists that you will have encountered a collision for which there isn’t any other human immediately involved. That is actually somewhat reassuring. You would presumably be always worried about the other driver and whether they are okay. If the self-driving car doesn’t have any passengers, it also means that there aren’t any other humans inside of it.

In a manner of thinking, it is as though you collided with a light pole or any other kind of object that one might ram into. There is going to be damage to the vehicles, but other than to you, no harm to any other humans (assuming no passengers in the self-driving car, and none in your car).

Meanwhile, imagine the outsized story that you’ll have to tell your friends and even strangers. Hey, I got into a minor fender bender the other day, you might indicate. I’m okay, wasn’t hurt at all.

An attentive listener might quickly ask, was the other driver okay?

Sure, it was just an AI system. Replace a few circuit boards and it will be ready to hit the road again. No emotional toll. No broken bones.

At least until the day that we have those walking and talking robots driving cars (see my prediction at this link here), at which point you might want to be careful approaching the “driver” since they might be really irked and have that vaunted The Terminator glare in their menacing robotic eyes.

AI can be one tough android to deal with after a car crash.


